Every programmer likes a good self-reference, a recursion, a bootstrap - but this one is mind-boggling. We have an implementation of Conway's Game of Life in Conway's Game of Life. Or, put more simply, Life in Life.
I don't know how we missed this one, but it doesn't matter that the video was released a few months ago - it is still worth viewing.
It has long been known that Conway's Life is Turing complete - that is, you can use it to compute anything a Turing machine can compute - but actually doing it is another matter. The rules of Life are so simple that it is hard work to create something that performs a particular computation, but many interesting structures have been found - gliders, flip-flops and so on. People have built huge arrangements of cells that do all sorts of interesting tasks, and the only thing you can say when you see them in action is "How did they ever think this up?" Life at this level of sophistication is very impressive - but you ain't seen nothing yet.
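Just how simple the rules are is easy to see in code: a complete generation step fits in a few lines. The sketch below is a minimal Python version, assuming the grid is represented as a set of live (x, y) cells on an unbounded plane; the function name and representation are my own choices, not anything from the projects discussed here:

```python
# A minimal sketch of one Life generation, assuming the grid is a set
# of live (x, y) cells on an unbounded plane.
from collections import Counter

def step(live):
    # Count how many live neighbours each cell has.
    counts = Counter((x + dx, y + dy)
                     for x, y in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A cell is alive next generation if it has exactly 3 live
    # neighbours, or 2 live neighbours and it is already alive.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A glider - the simplest pattern that travels across the grid.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
```

Running `step` four times on the glider reproduces its shape shifted one cell diagonally, which is exactly the behaviour the larger constructions exploit.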
Some years ago, around 2006, Brice Due created a metapixel - a unit cell that can be customized to behave like any cell in a Life-like cellular automaton. The metapixel needs 2048x2048 Life cells, and yes, it is that big. It has two states, just like a basic Life cell, corresponding to the majority of the cells in its middle being on or off. It takes an amazing 35,328 generations to go from on to off. The rule that the cell implements is encoded in two columns of nine "eaters", which correspond to the survival and birth rules.
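The two columns of nine eaters are, in effect, lookup tables: one entry per possible neighbour count (0 to 8), one column for birth and one for survival. The same table is usually written as a "B/S" rulestring, such as B3/S23 for Life itself. A hedged sketch of that encoding, using hypothetical helper names of my own (this is not the metapixel's machinery, just the rule table it implements):

```python
# Sketch of a B/S rule table, assuming Golly-style notation like "B3/S23".
# parse_rule and next_state are illustrative names, not a real API.
def parse_rule(rulestring):
    birth, survival = rulestring.upper().split("/")
    return ({int(d) for d in birth[1:]},      # neighbour counts that create a cell
            {int(d) for d in survival[1:]})   # neighbour counts that keep it alive

def next_state(alive, neighbours, rule):
    # Mirrors the metapixel's two eater columns: which table is consulted
    # depends on whether the cell is currently on or off.
    birth, survival = rule
    return neighbours in (survival if alive else birth)

life = parse_rule("B3/S23")   # Conway's Life itself
```

Customizing a metapixel amounts to changing which entries of these two nine-element tables are active - in the Life construction, by placing or removing the corresponding eaters.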
If you would like to see the changes involved in going from on to off watch the following video:
The metapixel is generally run in Golly, a very fast Life engine. The video was created using the standard Golly implementation of the metapixel and some custom Python code to capture the frames from Golly. The zoom factor is changed to produce a smoothly changing scale, similar to that used in the famous film "Powers of Ten".
As you watch the video, you start out looking at the cells of the Life simulation provided by Golly and slowly zoom out to see the workings of the huge machine that is simulating Life in Life:
The most complex things come from the simplest, but in this case the most complex things return to the simple and close the loop.