TensorFlow - Google's Open Source AI And Computation Engine
Written by Mike James
Wednesday, 11 November 2015

Google has open sourced a tool that can be used for a wide range of parallel computations, including implementing neural networks and other AI learning methods.

 


 

Part of Google's ongoing research into AI has been finding ways of making use of its powerful computational facilities to implement demanding AI techniques. For example, neural networks may be fairly simple, but given the number of layers and neurons in a typical network, the result is a lot of number crunching even to make use of a trained network. When it comes to training, it can take days or weeks, and if we are going to find better ways of working we need to be able to try out different things a lot faster. 

Google already had a distributed training system, called DistBelief, that made use of lots of GPUs and CPUs. This was used to train all of the Google neural networks that have recently made headline news - see Related Articles. 

This worked, but it was targeted exclusively at neural networks and had a reputation for being difficult to configure. Now Google has created and open sourced an alternative - TensorFlow.

While this is being reported as an AI implementation tool, it is in fact a general-purpose tool for distributed computation - more like a Hadoop for really difficult problems than just map-reduce. It is twice as fast as DistBelief and is in regular use at Google. 

 

  

The key to TensorFlow, and the reason for its name, is that it takes a flow graph where the arcs correspond to tensors and the nodes are tensor operators.

If the upmarket term "tensor" is confusing you, don't worry. In this context a tensor is just a multidimensional array. In TensorFlow all data is a multidimensional array and all operators combine tensors to produce new tensors. 
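
For example, a scalar is a tensor of rank 0, a vector a tensor of rank 1 and a matrix a tensor of rank 2. As a rough sketch in the Python API - the variable names here are just for illustration:

import tensorflow as tf

# In TensorFlow every piece of data is a tensor,
# i.e. a multidimensional array with a rank and a shape.
scalar = tf.constant(3.0)                # rank 0, shape ()
vector = tf.constant([1.0, 2.0, 3.0])    # rank 1, shape (3,)
matrix = tf.constant([[1.0, 2.0],
                      [3.0, 4.0]])       # rank 2, shape (2, 2)

# Operators combine tensors to produce new tensors.
total = tf.add(vector, vector)           # still a rank 1 tensor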

 


A familiar tensor operation is matrix multiplication, and this is how you perform it using TensorFlow in Python: 

import tensorflow as tf

# Create a Constant op that produces a 1x2 matrix.
# The op is added as a node to the default graph.
#
# The value returned by the constructor represents
# the output of the Constant op.
matrix1 = tf.constant([[3., 3.]])

# Create another Constant that produces a 2x1 matrix.
matrix2 = tf.constant([[2.], [2.]])

# Create a Matmul op that takes 'matrix1' and 'matrix2'
# as inputs. The returned value, 'product', represents
# the result of the matrix multiplication.
product = tf.matmul(matrix1, matrix2)
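
Building the graph only describes the computation - nothing is actually calculated until the graph is launched in a session. A minimal continuation of the example, using the session API as it stands in the initial release:

# Launch the default graph in a session and evaluate 'product'.
sess = tf.Session()
result = sess.run(product)
print(result)
# ==> [[ 12.]]
# Close the session when finished to release its resources.
sess.close()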


If you can represent your calculation as a flow graph then TensorFlow will implement it for you. What this means is that as well as neural networks you can use it to do things like solve partial differential equations and compute the Mandelbrot set. 

The native language of TensorFlow is C++ and this is what you have to use if you want to implement a new tensor operator. However, if you just want to use the available operators then Python is a better choice, because there are lots of helper functions and you can work in an interactive IPython (Jupyter) notebook. The hope is that the open source community will contribute additional and improved language bindings and environments. 

The version of TensorFlow that has been released isn't quite the full system that Google has. It only works on a single machine, not on the sort of server farm that Google uses to do its AI. The full version is promised for the future but, at the moment, TensorFlow will make use of the additional CPUs and GPUs available on a single machine. All that is needed to use your machine's GPU is a CUDA installation, which means it will work with most recent NVIDIA hardware.

At first look the management of computing resources doesn't seem very sophisticated. If there is a GPU it will be used automatically, but if you have multiple GPUs then you have to allocate them manually to particular nodes. Even so, this makes parallel computing more easily available to the non-specialist, and it means that you can train large neural networks using a laptop or a desktop machine.
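
As a rough sketch of what that manual allocation looks like - assuming a machine with two GPUs and the device naming convention used in the initial release - nodes can be pinned to a device with tf.device:

import tensorflow as tf

# Pin parts of the graph to particular devices by hand.
with tf.device("/gpu:0"):
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    b = tf.constant([[1.0, 1.0], [0.0, 1.0]])
    c = tf.matmul(a, b)

with tf.device("/gpu:1"):   # only valid if a second GPU is present
    d = tf.matmul(c, c)

# allow_soft_placement lets TensorFlow fall back to another
# device if the requested one isn't available.
sess = tf.Session(config=tf.ConfigProto(allow_soft_placement=True))
print(sess.run(d))
sess.close()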

It is worth mentioning that, while it is a general purpose number crunching system, it does have features specifically targeting neural networks. For example, it has an auto-differentiation facility that will find the gradients for a neural network model. Several standard models are provided, although not the model that achieved the amazing ImageNet result - this too is promised for the future. 
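
As a rough illustration of what auto-differentiation gives you, tf.gradients adds nodes to the graph that compute the gradient of one tensor with respect to another - here for a toy function rather than a full network:

import tensorflow as tf

# A toy "model": y = x^2 + 3x, so dy/dx = 2x + 3.
x = tf.constant(4.0)
y = x * x + 3.0 * x

# tf.gradients returns a list with one gradient tensor per input.
grad = tf.gradients(y, x)

sess = tf.Session()
print(sess.run(grad))
# ==> [11.0]  (2*4 + 3)
sess.close()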

One of the notable things about the TensorFlow project is that the documentation is exceptional - lots of easy-to-understand examples and lots of well-presented explanations. This increases the chances that TensorFlow will be taken up by the programming or AI novice. 

The fact that Google has made TensorFlow open source under the Apache 2.0 license is really good news - even if Google is keeping control of the project. There are alternatives - Torch, Caffe and Theano - but TensorFlow has the flexibility to cover the range of hardware needed for both training and use. Arguably, however, it is having Google behind the project that is likely to be its single biggest attraction. Given that Google's AI team uses TensorFlow and achieves useful and impressive results with it, it comes with a recommendation you really can't ignore. 

This could be the start of the wider spread of sophisticated AI techniques into "everyday" programming. 


More Information

http://tensorflow.org/

Related Articles

RankBrain - AI Comes To Google Search

The Flaw In Every Neural Network Just Got A Little Worse

Google Files AI Patents

Inceptionism: How Neural Networks See

Google's DeepMind Learns To Play Arcade Games

Neural Networks Beat Humans

Facebook Shares Deep Learning Tools

Neural Networks Describe What They See

Neural Turing Machines Learn Their Algorithms

Google's Neural Networks See Even Better

 


Last Updated ( Friday, 08 March 2019 )