|Hutter Prize Now 500,000 Euros|
|Written by Mike James|
|Thursday, 27 February 2020|
The Hutter Prize for Lossless Compression of Human Knowledge was launched in 2006. The challenge then was to find a better compression for a 100MB sample of Wikipedia. Now Marcus Hutter has increased both the size of the task and the reward by a factor of ten.
I have to admit that I had never previously heard of the Hutter Prize. It is an ongoing competition which rewards incremental improvements in data compression of an excerpt from Wikipedia. The stated intention of the prize is:
to encourage the development of intelligent compressors/ programs as a path to AGI [Artificial General Intelligence].
Answering the question "Why did you go BIG in 2020?" the Hutter Prize site states:
The contest started 14+ years ago. Since then, processor speed, memory, disk space, internet speed have all increased sufficiently, that many of the issues [it originally addressed] became less critical. >10GB RAM and >100GB free HDD space is standard by now. Moore's law for PCs has been disappointing in the last decade, so we decided to also be more generous with time. Finally I can afford to pay out larger prizes, so I decided to go 10x on all fronts: 10x Size, 10x Money, 10x Time, 10x RAM, 10x HDD.
The prize money comes from Marcus Hutter, who is currently on leave from his professorship at the Research School of Computer Science at the Australian National University while acting as a Senior Researcher at Google DeepMind in London. His previous research centered on Universal Artificial Intelligence, the topic of his 2005 book.
According to Hutter's HomePage UAI is:
a mathematical top-down approach to AI, based on Kolmogorov complexity, algorithmic probability, universal Solomonoff induction, Occam's razor, Levin search, sequential decision theory, dynamic programming, reinforcement learning, and rational agents.
So this sets the background for a data compression contest that furthers research into AGI. As Marcus Hutter puts it:
This compression contest is motivated by the fact that being able to compress well is closely related to acting intelligently, thus reducing the slippery concept of intelligence to hard file size.
Originally the task was to losslessly compress the 100MB file enwik8, with a baseline of 18,324,887 bytes. The new challenge is to compress the 1GB file enwik9 to less than 116MB.
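To put those numbers in perspective, here is a minimal sketch of the compression ratios they imply, assuming MB and GB are decimal units (10^6 and 10^9 bytes), which is how the prize sizes are usually quoted:

```python
# Compression ratios implied by the contest figures quoted above.
# Sizes are assumed to be decimal (1 MB = 10**6 bytes, 1 GB = 10**9 bytes).
ENWIK8_SIZE = 100_000_000        # 100 MB Wikipedia excerpt (enwik8)
ENWIK8_BASELINE = 18_324_887     # original baseline, in bytes

ENWIK9_SIZE = 1_000_000_000      # 1 GB Wikipedia excerpt (enwik9)
ENWIK9_TARGET = 116_000_000      # new challenge: compress below this

ratio8 = ENWIK8_BASELINE / ENWIK8_SIZE
ratio9 = ENWIK9_TARGET / ENWIK9_SIZE

print(f"enwik8 baseline ratio: {ratio8:.3f}")  # about 0.183
print(f"enwik9 target ratio:   {ratio9:.3f}")  # 0.116
```

So the new target is noticeably tighter than the original baseline: entrants must get the 1GB file below 11.6% of its size, versus the roughly 18.3% baseline set for enwik8.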
In this interview with Lex Fridman, Marcus Hutter answers questions about his research in the field of artificial general intelligence, including his development of the AIXI model, a mathematical approach to AGI that incorporates ideas from Kolmogorov complexity, Solomonoff induction, and reinforcement learning.
Yes, it's a long interview but it covers a lot of interesting ground.