Grokking Machine Learning

Author: Luis G. Serrano
Publisher: Manning
Date: December 2021
Pages: 512
ISBN: 978-1617295911
Print: 1617295914
Kindle: B09LK7KBSL
Audience: Python developers interested in machine learning
Rating: 5
Reviewer: Mike James
Another book on machine learning - surely we have enough by now?

Well perhaps not - this one is actually quite good. It has a few problems and it isn't for everyone, but if you are at the right level this could be what you are looking for.

First - the problems. Machine learning has come to mean the broader set of techniques used in AI, not just the dominant paradigm of the moment - neural networks. Some books with ML in the title barely touch on neural networks at all - this one does cover them, although only in a single chapter, and for the first part of the book you are essentially being prepared to encounter them. However, if your focus is on learning neural networks you will find a lot of apparently irrelevant material.

The book claims to be more or less math-free. I have to say that it is a very insightful explanation of ML and well worth reading, but if you find math difficult you aren't going to grok it. You need to not be thrown by the equation of a straight line, trig functions and the like. There isn't any calculus, but there is math. All the examples are coded in Python.


The book starts off with a look at what machine learning is and what sorts of things it can be used for. The actual subject of the book gets started in Chapter 3, where we meet linear regression. I'm not at all sure I class regression as ML - it is more classical statistics. What is interesting about this presentation is that the regression problem is solved using gradient descent, i.e. an iterative algorithm like many others in machine learning. This is a good idea in that it is a simple way into the idea of iterative learning. However, linear regression has a closed-form solution - you don't need iterative gradient descent to solve it. I'm sure the author must know this, but there is no mention of it in the chapter and this might leave the reader at a disadvantage if they meet a more classically trained AI person.
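To make the point concrete - and this is my own sketch, not anything from the book - here is a toy comparison in NumPy of the two routes to the same fitted line. The data, learning rate and iteration count are made up for illustration; the closed-form answer comes from the normal equations via np.linalg.lstsq.

    # Illustrative sketch (not from the book): fitting y = w*x + b two ways.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, 50)
    y = 3.0 * x + 2.0 + rng.normal(0, 1, 50)   # noisy line, true w=3, b=2

    # 1. Gradient descent - the iterative approach the book takes.
    w, b, lr = 0.0, 0.0, 0.01
    for _ in range(5000):
        err = (w * x + b) - y
        w -= lr * 2 * (err * x).mean()   # dMSE/dw
        b -= lr * 2 * err.mean()         # dMSE/db

    # 2. Closed form (normal equations) - no iteration needed.
    X = np.column_stack([x, np.ones_like(x)])
    w_cf, b_cf = np.linalg.lstsq(X, y, rcond=None)[0]

    print(f"gradient descent: w={w:.3f}, b={b:.3f}")
    print(f"closed form:      w={w_cf:.3f}, b={b_cf:.3f}")

Both print essentially the same w and b, close to the true values of 3 and 2 - which is exactly why a more classically trained reader would reach for the closed form first.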

Chapter 4 moves on to consider overfitting and underfitting. The explanation of regularization is particularly insightful and worth reading. Why we then move on to the perceptron algorithm in Chapter 5 is a puzzle. It is historically important, but if you have just covered regression as an iterative learning algorithm I'm not sure it is a good example to come next. Logistic regression, the subject of Chapter 6, is. Next we learn how to gauge the accuracy of a model and again this is well explained.

At this point things change a little because we look at the way probability works in Bayesian approaches to AI. Then decision trees take up a chapter before we reach neural networks. This is a brief introduction, but it will probably have you wondering why we need any of the other techniques.

After neural networks we go back to other forms of ML. In Chapter 11 we learn about support vector machines (SVMs) - the great hope before neural networks revolutionized everything. That said, it is probably the clearest explanation of SVMs I've read.

The penultimate chapter introduces the ideas of ensemble learning - how to make a group of bad predictors into something better. The final chapter is a big example.

Verdict:

If you want a book that tells you what ML was like before neural networks became the main deal then this is a great book. It gives you an overview of a wide range of techniques that come under the heading of machine learning - how important these ideas will be for you depends very much on what sort of work you are planning to do. Missing from the account is anything about clustering, dimensionality reduction and, my personal favourite topic, discriminant analysis. What the book covers it does very well and it offers lots of insights into how things work - but you do need some math and you need to want to know about these historically important ideas.

For more recommendations of books on Deep Learning see AI Books To Inspire You in our Programmer's Bookshelf section.



