Facebook Shares Deep Learning Tools
Written by Alex Armstrong   
Thursday, 22 January 2015

Facebook AI Research has announced that it is open sourcing deep-learning modules that let it train larger neural nets in less time than publicly available alternatives.


Since Yann LeCun was recruited in 2013 to head Facebook's newly founded AI research group, FAIR, it has grown into a team of 36 people and made great strides forward.

Fortunately for the area of deep learning and convolutional nets, FAIR believes that:

Progress in science and technology accelerates when scientists share not just their results, but also their tools and methods.

Hence its decision to do just that with a set of tools that give a 23.5x speed-up over publicly available convolutional layer code, along with the ability to parallelize neural network training over multiple GPU cards.

The tools are being made available for Torch, an open source development environment for numerics, machine learning, and computer vision that is widely used at academic labs as well as at Google/DeepMind, Twitter, NVIDIA, AMD, Intel, and many other companies.

The following fast nn modules for ConvNets, and neural networks in general, are provided as a plug-in to the Torch-7 framework:

  • Fast spatial convolution modules that use FFT to accelerate convolutions. The accompanying conference paper has details.

  • Fast temporal convolutions that are 1.5x to 10x faster than Torch's cunn implementations.

  • nn.DataParallel and nn.ModelParallel containers. Plug your model into them and see it accelerate over multiple GPUs.

  • Wrappers to use FFT/IFFT as nn modules.

  • Fast LookupTable for neural language models and word embeddings, much faster than the one in torch/nn.

  • Hierarchical SoftMax module that makes classifying 1 million classes a practically viable strategy.

  • LP and Max Pooling over feature maps (usable for MaxOut).
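The FFT-based convolution modules above exploit the convolution theorem: a convolution in the signal domain becomes a cheap pointwise multiplication in the frequency domain. The sketch below demonstrates that identity in plain Python, not fbcunn's CUDA code; it uses a naive O(n²) DFT as a stand-in for a real FFT, and all function names here are illustrative, not part of any library.

```python
import cmath

def dft(x, inverse=False):
    # Naive O(n^2) discrete Fourier transform, standing in for a real FFT.
    n = len(x)
    s = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(2j * s * cmath.pi * k * t / n)
               for k in range(n)) for t in range(n)]
    return [v / n for v in out] if inverse else out

def conv_direct(a, b):
    # Direct circular convolution: O(n^2) multiply-adds.
    n = len(a)
    return [sum(a[k] * b[(j - k) % n] for k in range(n)) for j in range(n)]

def conv_fft(a, b):
    # Convolution theorem: conv(a, b) = IDFT(DFT(a) * DFT(b)).
    fa, fb = dft(a), dft(b)
    return [v.real for v in dft([x * y for x, y in zip(fa, fb)], inverse=True)]

a = [1.0, 2.0, 3.0, 0.0]
b = [0.5, -1.0, 0.0, 0.0]
print(conv_direct(a, b))  # agrees with conv_fft(a, b) up to rounding
```

With a true FFT each transform costs O(n log n), which is why large convolution kernels benefit so much from this route.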

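The hierarchical SoftMax trick that makes million-class outputs viable is to factor the output distribution, e.g. P(class) = P(group) × P(class | group), so that only two small softmaxes are evaluated instead of one over every class. Here is a minimal two-level illustration in Python; the function names and toy vocabulary are hypothetical, and this is a sketch of the idea rather than the fbcunn module's implementation.

```python
import math

def softmax(scores):
    # Standard softmax over a small score vector.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def two_level_prob(group_scores, within_scores, group, idx):
    # P(class) = P(group) * P(class | group): two small softmaxes
    # instead of one softmax over the full class set.
    p_group = softmax(group_scores)[group]
    p_within = softmax(within_scores[group])[idx]
    return p_group * p_within

# Toy vocabulary of 4 classes split into 2 groups of 2.
group_scores = [0.1, 2.0]
within_scores = [[1.0, 0.5], [0.0, 3.0]]
total = sum(two_level_prob(group_scores, within_scores, g, i)
            for g in range(2) for i in range(2))
print(total)  # the factored probabilities still form a valid distribution
```

For V classes split into roughly √V groups of √V, each probability evaluation touches about 2√V scores instead of V, which is the source of the speed-up.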
To use these packages, visit the fbcunn page, which has installation instructions, documentation and examples for training classifiers over ImageNet.

Facebook has also recently released iTorch, an interface for Torch using IPython with visualization and plotting. It has previously made available fbnn, extensions to torch/nn; fbcuda, extensions to CUDA; and fblualib, libraries and utilities for Lua.

Concluding the announcement on the FAIR Blog, Soumith Chintala notes:

We hope that these high-quality code releases will be a catalyst to the research community and we will continue to update them from time to time.


Last Updated ( Thursday, 04 October 2018 )