

A Short Account of Wrapping C++ in Python

For the AI-challenge competition I am participating in, I found that path planning (using A*) was (obviously!) the slowest part of my Python code. It turned out that 80% of the computation time was spent planning!

So I decided to find some good C++ code, wrap it in Python and give it a go. The code I decided to use is MicroPather, which is incredibly easy to use and seems to be pretty fast. One only has to cope with passing void* around, and everything else is easy.

I had heard a lot of good things about Cython, so I decided to give it a try. Alas, although the new support for classes seems promising, wrapping an already existing class and playing with polymorphism proved very hard.

To cut a long story short, I reverted to good old Boost.Python. It took me no time at all! Another proof that sticking to old, proven approaches is often still the best solution.

BTW, it’s incredible how an old and seldom-updated library like boost::python still holds up so well against today’s standards and ever-changing APIs!

Next plan: include Python hooks in the C++ A* code, for maximum versatility!


Posted on November 9, 2011 in Programming

 


Quantifying the wow effect of a robot — Part I

[This is a two-part post, as I realised while writing it that it was growing out of control.]

As a roboticist I spend most of my time dealing with robots. I program them, debug them, watch them perform tasks. I might be surprised and thrilled by things people would judge boring, or I might not get excited at all by robotic performances that others hail as giant leaps towards true AI. And, as I am a scientist too, I have a defect: sooner or later I will want to quantify my data. This includes the wow factor of a robot.

Imagine you are observing a robot performing a dance. At the beginning you will be caught by curiosity and will not take your eyes off the robot. But after some time, you will notice that the robot is following a pattern. The dance movements will always be the same, no matter how naturally random the programmer has tried to make them appear. After enough time spent watching it, a robot will be no more interesting than a washing machine. Can we quantify this effect?

Computability theory comes to our aid. In the sixties a quite clever Russian guy published a few papers about a theory which is as simple and elegant as it is powerful. After its author, it is called Kolmogorov complexity. Without digging into details, according to this theory the complexity of a string is the length of its shortest description in a given programming language. If no description is shorter than the string itself, we’d better give up and use the string as its own description.
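More formally, fixing a universal computer U, the Kolmogorov complexity of a string s is the length of the shortest program that prints it:

  K_U(s) = \min \{\, |p| : U(p) = s \,\}

Changing the universal computer U shifts K_U(s) by at most an additive constant, which is why one can talk about "the" complexity of a string.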

For example, a huge pile of dishes might be a complex task for someone not lucky enough to own a dishwasher (the best robot ever!), but from a computer’s point of view it is simply described by "2^326 dishes stacked in an uncomfortable equilibrium". On the other hand, the outcome of the World Cup matches is hardly described by an algorithm, and the list of the results is its own best description. This might explain why all the Bayesian models and accurate simulations failed to predict the outcome of the World Cup, in spite of the efforts of the brightest minds in the world.

Obviously there is bad news: Kolmogorov complexity is not computable. That is, there will never be a program that takes an arbitrary string as input and outputs the complexity of that string. Before you leave the room shaking your head in despair, you might want to know that there is a trick: Kolmogorov complexity is strongly related to entropy, which in turn is related to the compressibility of a string. I can see everybody rushing back to their seats with a light bulb shining over their heads: the Lempel-Ziv algorithm is the Swiss army knife of complexity.
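(A classical result makes this link precise: roughly speaking, for strings drawn from a stationary ergodic source, the expected complexity per symbol converges to the entropy rate of the source,

  \lim_{n \to \infty} \frac{E[K(X_1 \cdots X_n)]}{n} = H,

and the Lempel-Ziv compressor asymptotically attains that same rate, which is what justifies using a compressor as a stand-in for K.)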

To cut a long story short, the following magic lines in Python will produce a reasonable approximation to the Kolmogorov Complexity of a string s:

import zlib

def kolmogorov(s):
    # Complexity per symbol: the length of the zlib-compressed string
    # divided by the original length. Note that zlib.compress expects
    # a byte string.
    l = float(len(s))
    compr = zlib.compress(s)
    c = float(len(compr))
    return c / l

This is called complexity per symbol, and it lies roughly between 0 and 1, where 1 means that the string is incompressible, or very complex. Being an approximation, it works well only for very long strings. Let’s see some examples.

The complexity of a string of several random numbers is high (the examples below assume a pylab-style session, so numpy’s rand, randn, zeros and hstack are already in the namespace):

arr = rand(1000)
kolmogorov(arr.tostring())
  Out: 0.94303749999999997  

While it comes as no surprise that the complexity of a string of all zeros is low:

arr = zeros(1000)
kolmogorov(arr.tostring())
  Out: 0.00125  

An interesting result comes when we mix the two strings:

arr = hstack((rand(1000),zeros(1000)))
kolmogorov(arr.tostring())
  Out: 0.47612500000000002 

What about Gaussian numbers? If we use a high variance, the result is no different from that of uniform numbers:

arr = randn(1000)
kolmogorov(arr.tostring())
  Out: 0.96350000000000002 

But we can see the complexity becoming lower when we shrink the Gaussian width:

arr = 0.5 + randn(1000)*0.0001
kolmogorov(arr.tostring())
  Out: 0.80937499999999996 

These are still random numbers, but the much smaller range makes the resulting string far less complex: with a tiny standard deviation, most of the bytes of each float (the sign, exponent and leading mantissa bits) are identical from one number to the next, so the string compresses very well. Below is a plot of the complexity as a function of the standard deviation. It can be seen that the smaller the width of the Gaussian, the lower the complexity of the string.
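For reference, here is a sketch of the kind of code that can generate such a plot (assuming matplotlib and the kolmogorov function defined above; tobytes() is the modern numpy name for tostring()):

import numpy as np
import matplotlib.pyplot as plt

# Complexity per symbol of 1000 Gaussian numbers, for a range of widths
sigmas = np.logspace(-6, 0, 20)
comps = [kolmogorov((0.5 + np.random.randn(1000) * s).tobytes())
         for s in sigmas]

plt.semilogx(sigmas, comps, 'o-')
plt.xlabel('standard deviation of the Gaussian')
plt.ylabel('complexity per symbol')
plt.show()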

In the next post I will describe how to use the Kolmogorov Complexity to measure the wow effect of a robot behaviour.



 

Posted on July 9, 2010 in Chaos, Research, Tutorial

 
