User-defined functions for performing data analysis on Hadoop using Apache Pig have been put together in an open source library called DataFu, courtesy of LinkedIn’s engineering group.
In a blog post announcing the availability of DataFu, Senior Software Engineer Matthew Hayes explains that LinkedIn makes extensive use of Apache Pig for performing data analysis on Hadoop.
Pig is a simple, high-level programming language that consists of just a few dozen operators and makes it easy to write MapReduce jobs. If nothing else, it deserves to be more popular for the fact that you enter commands at the Grunt> prompt.
Pig has been designed so that programs written in it are structured for large-scale parallel processing, which lets them handle very large data sets.
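As an illustration of what Pig code looks like, here is the classic word count entered at the Grunt prompt. This is a minimal sketch, and the file name input.txt is just a placeholder:

    grunt> lines = LOAD 'input.txt' AS (line:chararray);
    grunt> -- split each line into a bag of words, one word per output row
    grunt> words = FOREACH lines GENERATE FLATTEN(TOKENIZE(line)) AS word;
    grunt> grouped = GROUP words BY word;
    grunt> counts = FOREACH grouped GENERATE group, COUNT(words) AS total;
    grunt> DUMP counts;

Each statement describes a step in a data flow, and Pig compiles the whole pipeline into MapReduce jobs that run in parallel across the cluster, which is where the scalability comes from.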
According to the blog, as the team at LinkedIn worked on data-intensive products such as “People You May Know” and “Skills”, its programmers developed a large number of UDFs. These have now been consolidated into a single, general-purpose library called DataFu, which LinkedIn has released as open source.
DataFu includes UDFs for common statistics tasks, PageRank, set operations and bag operations, together with a suite of tests. A Pig bag is a collection of tuples (ordered sets of fields). Pig differs from a conventional relational database in that instead of tables you have Pig relations, with tuples corresponding to the rows of a table. However, a Pig relation doesn't require every tuple to contain the same number of fields, or fields in the same position to have the same type. The UDFs in the library let you perform operations on bags such as appending a tuple, prepending a tuple, concatenating bags and generating unordered pairs, as in the sketch below.
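To give an idea of how the bag UDFs are used, here is a rough sketch. The class names datafu.pig.bags.AppendToBag and datafu.pig.bags.BagConcat come from the library's package layout; the relation names, field names and input file are made up for illustration:

    REGISTER datafu.jar;
    DEFINE AppendToBag datafu.pig.bags.AppendToBag();
    DEFINE BagConcat   datafu.pig.bags.BagConcat();

    -- each row carries two bags of integer tuples (hypothetical sample data)
    data = LOAD 'bags.txt' AS (owner:chararray,
                               b1:bag{t1:tuple(x:int)},
                               b2:bag{t2:tuple(x:int)});

    -- append a tuple to a bag, and concatenate two bags into one
    appended = FOREACH data GENERATE owner, AppendToBag(b1, TOTUPLE(99)) AS b_plus;
    combined = FOREACH data GENERATE owner, BagConcat(b1, b2) AS b_all;

Because the UDFs operate on whole bags within a single FOREACH, they slot into an ordinary Pig pipeline without any extra plumbing.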
Other UDFs give you the means to run PageRank on a collection of independent graphs, perform set operations such as intersect and union, and compute the haversine distance between two points on the globe.
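Again as a sketch, the set and geo UDFs follow the same pattern. The class names datafu.pig.sets.SetIntersect and datafu.pig.geo.HaversineDistInMiles are taken from the library's packages, and everything else here is illustrative:

    DEFINE SetIntersect datafu.pig.sets.SetIntersect();
    DEFINE Haversine    datafu.pig.geo.HaversineDistInMiles();

    -- intersect two bags (SetIntersect expects its input bags to be sorted)
    pairs = LOAD 'sets.txt' AS (b1:bag{t1:tuple(v:int)}, b2:bag{t2:tuple(v:int)});
    common = FOREACH pairs GENERATE SetIntersect(b1, b2) AS both;

    -- great-circle distance in miles between two (lat, lng) points
    legs = LOAD 'trips.txt' AS (lat1:double, lng1:double, lat2:double, lng2:double);
    dists = FOREACH legs GENERATE Haversine(lat1, lng1, lat2, lng2) AS miles;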