Advice for Graduate Students in Statistics by Michael Steel. I like this guy! (:

## Reinforcement Learning blog

Satinder Singh started a new blog named Reinforcement Learning blog. Now, I can see that Michael Littman is one of the authors of the blog – though he hasn’t published any post yet. This is good news for RL. Good luck to them!

## Tumblr and I

I have used the Tumblr microblogging service to organize my machine learning-related stuff for a while. You can see it here.

Although I really like the ease of tumbling, there is a big problem: it doesn’t come with a built-in search tool or commenting system.

Recently I tried to find some papers I had posted a few weeks earlier, but I couldn’t locate them without going through the entire archive. So, I think I need to come back to this blog again.

The problem is that I may just post a link to a paper without any discussion. This may distract you, my readers. What do you think? Will you be unhappy if I do so?

## Life Goes on in Tehran

Take a look at this photoblog: Life Goes on in Tehran.

I like it because it is close to what I’ve seen there (and it is not the same as what I’ve seen on CNN!).

## Talking Robots Podcasts

You may like to listen to the Talking Robots podcasts. These are interviews with well-known robotics researchers (such as Rodney Brooks, Marco Dorigo, Dario Floreano, Owen Holland, Francesco Mondada, Ron Pfeifer, and Luc Steels, whom I more or less know) on fascinating topics like Evolution of Communication, Human-Robot Teams, Robot Consciousness, Swarm Robotics, Robot Cognition, Behavior-based Robotics, Evolutionary Robotics, and Embodied Cognition.

Enjoy the show!

## Note of the Day: Dirichlet Divisor Problem

In this post, I write about the **divisor number** and the **Dirichlet Divisor Problem**.

First, a short introduction:

Consider a positive integer k. Its divisors are easy to define: they are the numbers that divide k evenly. For instance, if k=6, its divisors are 1, 2, 3, and 6.

It is well-known that k has a unique prime factorization of the form (p_1^r_1)*(p_2^r_2)*…*(p_w^r_w). Given this factorization, every divisor can be written as (p_1^a_1)*(p_2^a_2)*…*(p_w^a_w) with 0<=a_i<=r_i. It is easy to see that k therefore has (r_1 + 1)(r_2 + 1)...(r_w + 1) divisors. Let's call this number d(k).
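To make this concrete, here is a quick Python sketch (my own illustration, nothing official): factor k by trial division, collect the exponents r_i, and multiply the (r_i + 1) terms.

```python
from collections import Counter

def divisor_count(k):
    """Count the divisors of k via its prime factorization:
    d(k) = (r_1 + 1)(r_2 + 1)...(r_w + 1)."""
    exponents = Counter()
    p = 2
    while p * p <= k:
        while k % p == 0:   # strip out all factors of p
            exponents[p] += 1
            k //= p
        p += 1
    if k > 1:               # leftover prime factor > sqrt(original k)
        exponents[k] += 1
    result = 1
    for r in exponents.values():
        result *= r + 1
    return result

print(divisor_count(6))     # 4, namely 1, 2, 3, 6
print(divisor_count(12))    # 6: 12 = 2^2 * 3, so (2+1)(1+1) = 6
```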
Well! All these things are high-school number theory. Let's go to something more exciting.
Tonight, while I was struggling to solve my algorithms assignment (which was about string matching and related things), the following question came to my mind:
*Can I find a bound on the number of divisors of k without knowing its exact factorization?* That is, I do not want to factor k into primes first and then solve the problem. I want something in closed form that depends only on k, and not on the {p_i}s or anything else.
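One elementary partial answer (my own addition, not something from the assignment): divisors pair up as (i, k/i) with min(i, k/i) <= sqrt(k), so d(k) <= 2*sqrt(k), and the same pairing lets you count divisors in O(sqrt(k)) time without any factorization. A rough sketch:

```python
def divisor_count_direct(k):
    """Count the divisors of k in O(sqrt(k)) time, no factorization:
    divisors come in pairs (i, k // i) with i <= sqrt(k)."""
    count = 0
    i = 1
    while i * i <= k:
        if k % i == 0:
            # a perfect square contributes i == k // i only once
            count += 1 if i * i == k else 2
        i += 1
    return count

print(divisor_count_direct(6))    # 4
print(divisor_count_direct(36))   # 9 (36 is a perfect square)
```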

Or what about this question: *Can I find a bound on the number of divisors of all numbers between 1 and n, i.e. a bound for d(1) + d(2) + … + d(n)?* This is especially important if my goal is to calculate the computational complexity of an algorithm whose i-th step has an inner loop of d(i) steps (this is exactly what I wanted to do).

I could not solve this problem after trying a bit (the reasons: my assignment was already late, I was too tired to think about anything else, I am not a number theorist in any case, and, more importantly, this problem may not be so easy). Hence, I searched for a few minutes and suddenly found the following result, known as the **Dirichlet Divisor Problem**:

**d(1) + … + d(n) = n\*ln(n) + (2\gamma – 1)\*n + O(n^\theta)**

in which \gamma is the Euler–Mascheroni constant. The best known bound on \theta is 7/22, as stated here!

I enjoyed this result. It helped me replace an O(n^3) bound with O(n^2\*log(n)). 😛 By the way, I suggest you read about the Dirichlet Divisor Problem and the Divisor Function.
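If you want to sanity-check the asymptotic numerically, there is a handy identity: each i <= n divides exactly floor(n/i) of the numbers 1..n, so d(1) + … + d(n) = sum over i of floor(n/i). A small Python sketch (my own, with the constant hard-coded):

```python
import math

def divisor_sum(n):
    """Exact value of d(1) + ... + d(n), via
    sum_{i=1}^{n} floor(n/i): each i divides
    exactly floor(n/i) of the numbers 1..n."""
    return sum(n // i for i in range(1, n + 1))

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def dirichlet_estimate(n):
    """Leading terms n*ln(n) + (2*gamma - 1)*n of the asymptotic."""
    return n * math.log(n) + (2 * EULER_GAMMA - 1) * n

n = 10000
print(divisor_sum(n), dirichlet_estimate(n))  # the two agree closely
```

The identity also gives an O(n) way to compute the exact sum, which is much cheaper than computing each d(i) separately.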

## Understanding Mathematics

*In mathematics you don’t understand things. You just get used to them.* (John von Neumann)

I heard this quote some time ago, but Csaba Szepesvari mentioned it again in his class a few days ago.

This quote is very interesting. Do I believe it? Let me explain:

One may compare mathematics with a natural language. Both have similar constructions: grammar, vocabulary, phrases, and so on.

When I am trying to prove something in mathematics, I am manipulating assumptions and other theorems in order to establish (prove) something.

What about when I am speaking or writing in a natural language? Is it anything other than manipulating words in order to persuade others that something is true?

One may ask what the differences between these two languages are, or whether our brain performs these two tasks differently.

I am not sure, but I guess they are to some extent similar. The brain probably manipulates abstract things in the same way (the grammar is the same).

However, the main difference may be that in natural languages we often have an external object that words refer to and that defines their meaning, whereas in mathematics this is sometimes not true – especially in those fields of mathematics where a mathematician cannot find a helpful physical counterpart.

For instance, when I am thinking about derivatives, I can relate them to the speed of an object, or I can imagine a plot and relate the derivative to its slope. Or when I am thinking about partial differential equations with boundary conditions, I can relate them to an actual physical phenomenon, like water in a kettle heated from below.

It may mean that I am actually trying to use natural-language tools for understanding mathematical phenomena. If this is true, I can say that we get used to mathematics in almost the same way we get used to a natural language – especially when we can find a corresponding physical, real-world thing. Well, maybe not!

By the way, you may like to see another quotation from von Neumann:

*You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.* (Suggesting to Claude Shannon a name for his new uncertainty function.)

## Commenting System Malfunction

Hi everybody!

I just want to let you know that the commenting system does not work right now. I will try to fix it.

## Intro: Simulation and Game Development weblog

If you are interested in computer games, you should check out this Simulation and Game Development weblog. It is run by Farhad Javidi, Nic Colley, and Bahram Dahi.