Hands-On Machine Learning with JavaScript
Author: Burak Kanber

The first chapter is a lightning introduction to JavaScript, and to Node.js in particular, which is used for all of the examples. You almost certainly need to know more JavaScript than this, but the account of server-side JavaScript might be useful if you have only used JavaScript in a browser.

Chapter 2 is titled "Data Exploration", but it starts off with a short introduction to some ML techniques and statistics. There is an unfortunate mix-up on pages 45/46 where the same chart is used to illustrate both low and high correlation. If you are new to this game you might be confused and wonder what is going on. The chapter then goes on to discuss data, explaining concepts such as outliers, missing data and standard deviation. This too is almost certainly insufficient if you have not encountered these ideas before.

Chapter 3 is an overview of machine learning. It goes over the basics of unsupervised and supervised learning, and touches briefly on reinforcement learning. The final part of the chapter lists and describes some of the standard techniques, clustering, classification and so on, and areas of application such as image processing and natural language.

Chapter 4 is where the book starts on its core subject matter. This chapter deals with a simple clustering method, k-means. After showing how to write a program to compute the mean, the chapter moves on to a full k-means clustering program, which is applied to some easy examples. The final part of the chapter deals with how to investigate the value of k to get a good estimate of the number of groups. In places, splitting the program across a page break and wrapping long lines make the code harder to read than it could be.

Moving on from clustering, Chapter 5 looks at classification, first with the k-nearest neighbour classifier and then with a Bayes classifier. At the end of the chapter the more complex SVM and Random Forest classifiers are introduced, but using code that you have to download.
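To give a flavour of the sort of program Chapter 4 builds up, here is a minimal sketch of one k-means pass in plain JavaScript. This is our illustration, not the book's code: assign one-dimensional points to the nearest centroid, then recompute each centroid as the mean of its assigned points.

```javascript
// Compute the mean of an array of numbers.
const mean = xs => xs.reduce((a, b) => a + b, 0) / xs.length;

function kMeansStep(points, centroids) {
  // Assignment step: group each point with its closest centroid.
  const clusters = centroids.map(() => []);
  for (const p of points) {
    let best = 0;
    centroids.forEach((c, i) => {
      if (Math.abs(p - c) < Math.abs(p - centroids[best])) best = i;
    });
    clusters[best].push(p);
  }
  // Update step: each new centroid is the mean of its cluster
  // (an empty cluster keeps its old centroid).
  return clusters.map((cl, i) => (cl.length ? mean(cl) : centroids[i]));
}

const points = [1, 2, 3, 10, 11, 12];
let centroids = [0, 5];
for (let i = 0; i < 10; i++) centroids = kMeansStep(points, centroids);
console.log(centroids); // converges to [2, 11]
```

Repeating the assign-and-update step until the centroids stop moving is the whole algorithm; the book's version extends this to multiple dimensions.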
From this point on the book doesn't develop the code to do the core computations; it simply demonstrates how to use pre-written code.

Chapter 6 is devoted to the association rule algorithm, which attempts to find relationships between, say, the things a person has bought. Again we use pre-written code and simply import the libraries that do the work.

Chapter 7 is on prediction, mostly regression, though some time series techniques are introduced towards the end. The regression is restricted to classical linear and polynomial fits, and the time series coverage is limited to filtering and seasonality, with a mention of Fourier analysis. On the whole this chapter could probably have been left out: regression is hardly a machine learning technique, and without some of its more advanced forms it is even less so.

The next three chapters are, for most readers, probably the star of the book, as they are on neural networks using TensorFlow.js. TensorFlow is one of the standard ways of working with neural networks, and TensorFlow.js lets you do machine learning in the browser. Chapter 8 introduces the basics of neural networks and shows how to use TensorFlow.js to solve the XOR problem. Chapter 9 goes deeper and explains convolutional and recurrent networks, using the classic MNIST handwriting data to train a convolutional network.

Chapter 10 is a brief introduction to natural language processing. It covers many of the standard techniques, such as stemming and text distance, and closes with a quick look at neural networks in language processing, but no programs are presented.

Chapter 11 is about real-time machine learning, covering streaming, data pipelines and cloud services. It is more a general awareness-raising discussion than a practical guide. The final chapter is about choosing algorithms for your application.

Hands-On Machine Learning with JavaScript is a book of two halves.
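As an aside, the classical linear regression that Chapter 7 restricts itself to has a short closed-form solution. A sketch in plain JavaScript (again ours, not the book's): find the slope m and intercept b minimising the squared error of y = m*x + b.

```javascript
// Ordinary least-squares fit of a straight line to paired data.
function linearFit(xs, ys) {
  const n = xs.length;
  const mx = xs.reduce((a, b) => a + b, 0) / n;  // mean of x
  const my = ys.reduce((a, b) => a + b, 0) / n;  // mean of y
  let num = 0, den = 0;
  for (let i = 0; i < n; i++) {
    num += (xs[i] - mx) * (ys[i] - my);  // covariance term
    den += (xs[i] - mx) ** 2;            // variance term
  }
  const m = num / den;
  return { m, b: my - m * mx };
}

// Data generated by y = 2x + 1, so the fit recovers it exactly.
const { m, b } = linearFit([1, 2, 3, 4], [3, 5, 7, 9]);
console.log(m, b); // 2 1
```

A dozen lines like these are exactly the sort of thing the book delegates to a library from Chapter 6 onwards.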
The first half shows you how to write simple JavaScript programs to implement algorithms. The second half admits defeat and uses library code to implement the difficult parts. This isn't unreasonable: the idea of filling a book with, say, the code needed to compute a regression, or, even more unthinkable, a convolutional neural network, is not a good one. In practice you are more likely to use off-the-shelf code than to write things from scratch. So if you were expecting the book to show you how to code these things at this level, you will be disappointed.

The level of coverage is very shallow and very little math is used to make things exact. This is a good thing if all you want is a practical introduction to the basics of machine learning. Notice, however, that there are lots of omissions: only one clustering algorithm is given, there is nothing on more advanced regression techniques, nothing on reinforcement learning, and so on. Again, this isn't unreasonable given the size of the book, and making a selection is a good idea. However, much of the most exciting work going on in machine learning involves neural networks and reinforcement learning, and this isn't covered.

If you want a book that will get you started with some aspects of practical machine learning using JavaScript, then this is a good and easy-to-read choice.


Last Updated ( Tuesday, 18 September 2018 ) 