You may like to listen to the Talking Robots podcasts. They are interviews with well-known researchers (such as Rodney Brooks, Marco Dorigo, Dario Floreano, Owen Holland, Francesco Mondada, Rolf Pfeifer, and Luc Steels, whom I more or less know) working in robotics, on fascinating topics like the Evolution of Communication, Human-Robot Teams, Robot Consciousness, Swarm Robotics, Robot Cognition, Behavior-based Robotics, Evolutionary Robotics, Embodied Cognition, and so on.
Enjoy the show!
Control Theory Course Notes
Need some lecture notes on robust control or nonlinear control? You can find them here, on the homepage of Michael Lemmon.
He is a professor of electrical engineering at the University of Notre Dame, working on decentralized control over embedded networks, flow control, and real-time control systems.
[Book] The Cauchy-Schwarz Master Class (Michael Steele)
“The Cauchy-Schwarz Master Class: An Introduction to the Art of Mathematical Inequalities” is a book by Michael Steele. I do not have this book myself yet, but judging by its table of contents and the three chapters available online, I guess I would like it if I did!
You can find three online chapters of this book at this address + something extra here. (;
I’m alive!
It’s a shame that I haven’t written anything here for such a long time. I have done many things since my previous post: I learned many things, worked on quite a few subjects, went to the NIPS conference and attended a few workshops there (the first day was completely devoted to New Trends in Reinforcement Learning; I sampled sessions on the second day), went to Ottawa over the holidays, and so on.
So, this post is just to say I’m still alive!
That’s it!
Approximation Algorithms for NP-Hard (Complete) Problems: A Reference
Have you ever wondered if a specific NP-hard problem has an approximation algorithm or not? If yes, you may like to visit this site: A Compendium of NP optimization problems.
Computational Learning and Motor Control Lab@USC
You may like to visit Stefan Schaal’s Computational Learning and Motor Control Lab at the University of Southern California (USC). Besides many papers on the application of RL, supervised ML, nonlinear control, and so on in the robotics domain, you can find some nice movies too. For instance, I really like the one in which the robot learns pole-balancing by imitation (here!).
I should read a few of those papers soon. I guess I read a paper by this group (Peters et al.) about the natural policy gradient about a year ago.
Johnson-Lindenstrauss Lemma
Johnson-Lindenstrauss Lemma:
For any 0 < eps < 1 and any integer n, let k be a positive integer such that k >= 4*ln(n) / (eps^2/2 - eps^3/3). Then for any set V of n points in R^d, there is a map f: R^d -> R^k such that for all u, v in V,
(1-eps)||u-v||^2 <= ||f(u) - f(v)||^2 <= (1+eps)||u-v||^2.
Moreover, this map can be found in randomized polynomial time. In other words, this theorem (or lemma!) states that we can always embed n data points into a space of dimension k = O(ln(n)/eps^2) while distorting pairwise distances by a factor of at most (1 ± eps). Isn’t that a useful result for dimension reduction and manifold learning? Especially when we do not want to assume that the samples lie strictly on the manifold, but only close to it (actually, it seems that Sanjoy Dasgupta and Anupam Gupta, whose paper I am referring to, considered the same type of application). For more, see Sanjoy Dasgupta and Anupam Gupta, “An elementary proof of the Johnson-Lindenstrauss Lemma,” Random Structures and Algorithms, 22(1):60-65, 2003.
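Just for intuition, here is a quick numerical sketch of the lemma using a random Gaussian projection (the values of n, d, and eps are made up for illustration, and k is chosen from the bound above):

```python
import numpy as np

# Toy check of the Johnson-Lindenstrauss lemma via a random Gaussian projection.
rng = np.random.default_rng(0)
n, d, eps = 100, 1000, 0.5
k = int(np.ceil(4 * np.log(n) / (eps**2 / 2 - eps**3 / 3)))  # target dimension from the lemma

X = rng.normal(size=(n, d))               # n points in R^d (rows)
P = rng.normal(size=(d, k)) / np.sqrt(k)  # random projection, scaled to preserve norms in expectation
Y = X @ P                                 # projected points in R^k

# Compare squared pairwise distances before and after the projection.
ratios = []
for i in range(n):
    for j in range(i + 1, n):
        orig = np.sum((X[i] - X[j]) ** 2)
        proj = np.sum((Y[i] - Y[j]) ** 2)
        ratios.append(proj / orig)

# With high probability, all ratios should lie within [1 - eps, 1 + eps].
print(min(ratios), max(ratios))
```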
Reading David Corfield
A note to myself:
Take a look at David Corfield’s webpage. He is working on the philosophy of statistical machine learning, which seems like an interesting topic.
Note of the Day: Dirichlet Divisor Problem
In this post, I write about the divisor number and the Dirichlet Divisor Problem.
First, a short introduction:
Consider an integer k. You can define the divisors of k easily: those numbers that divide k. For instance, if k = 6, its divisors are 1, 2, 3, and 6.
It is well known that k has a unique prime factorization of the form (p_1^r_1)*(p_2^r_2)*…*(p_w^r_w). Given this factorization, you can write every divisor as (p_1^a_1)*(p_2^a_2)*…*(p_w^a_w) in which 0 <= a_i <= r_i. It is easy to see that there are (r_1 + 1)(r_2 + 1)...(r_w + 1) divisors of k. Let's call this number d(k).
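To make this concrete, here is a tiny Python sketch (my own toy code, nothing more) that counts the divisors of k both by brute force and via the (r_1 + 1)...(r_w + 1) product from the factorization:

```python
def d_bruteforce(k):
    """Count the divisors of k by trial division."""
    return sum(1 for i in range(1, k + 1) if k % i == 0)

def d_from_factorization(k):
    """Count the divisors of k as the product of (r_i + 1) over its prime factorization."""
    count, p = 1, 2
    while p * p <= k:
        r = 0
        while k % p == 0:
            k //= p
            r += 1
        count *= r + 1
        p += 1
    if k > 1:  # one prime factor larger than sqrt of the original k may remain
        count *= 2
    return count

print(d_bruteforce(6), d_from_factorization(6))    # both print 4 (divisors 1, 2, 3, 6)
print(d_bruteforce(60), d_from_factorization(60))  # both print 12
```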
Well! All these things are high-school number theory. Let's go to something more exciting.
Tonight, while I was struggling to solve my algorithms assignment (which was about string matching and those things), the following question came to my mind:
Can I find a bound on the number of divisors of k without knowing its exact factorization? I mean, I do not want to factor k into primes and then solve the problem. I want something in closed form that depends only on k, and not on the {p_i}s or anything else.
Or what about this question: can I find a bound on the total number of divisors of all numbers between 1 and n, i.e. a bound for d(1) + d(2) + … + d(n)? This is especially important if my intention is to calculate the computational complexity of an algorithm whose i-th step has an inner loop with d(i) steps (this is exactly what I wanted to do).
I could not solve this problem after trying for a bit (the reasons: my assignment was already late, I was too tired to think about anything else, I am not a number theorist in any case, and, more importantly, this problem may not be so easy). So I searched for a few minutes and suddenly found the following result, known as the Dirichlet Divisor Problem:
d(1) + … + d(n) = n*ln(n) + (2\gamma - 1)*n + O(n^\theta)
in which \gamma is the Euler-Mascheroni constant. The best known bound on \theta is 7/22, as stated here!
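As a quick sanity check of this formula, here is a small Python sketch I put together (it uses the identity d(1) + … + d(n) = sum over j = 1..n of floor(n/j), since each j <= n divides exactly floor(n/j) of the numbers up to n):

```python
import math

def divisor_sum(n):
    """Exact value of d(1) + d(2) + ... + d(n)."""
    # Each j <= n divides exactly floor(n/j) of the numbers 1..n.
    return sum(n // j for j in range(1, n + 1))

gamma = 0.5772156649015329  # Euler-Mascheroni constant
for n in (10**3, 10**4, 10**5):
    exact = divisor_sum(n)
    approx = n * math.log(n) + (2 * gamma - 1) * n
    print(n, exact, round(approx), exact - round(approx))  # the gap is the O(n^theta) error term
```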
I enjoyed this result. It helped me replace an O(n^3) with an O(n^2*log(n)). 😛 By the way, I suggest you read about the Dirichlet Divisor Problem and the Divisor Function.
Anousheh Ansari
Anousheh Ansari, an Iranian woman, is going to space in a few hours. She intends to write in her blog during her trip. You may enjoy reading it.