Machine Learning with Python for Everyone

Author: Mark Fenner
Publisher: Addison-Wesley
Pages: 592
ISBN: 978-0134845623
Print: 0134845625
Kindle: B07VRSJ1GB
Audience: AI Beginners
Rating: 3
Reviewer: Mike James
A book that claims "for everyone" is promising a lot.

I am very firmly of the opinion that without a good math background you cannot master the new AI. This isn't to say that it is impossible to learn some of the general ideas, and even some of the finer detail, but to truly become competent and do new things in the correct way you need enough math to back up your thought processes. What this means is that a book claiming to teach AI to any reader, irrespective of how much math they know, is unlikely to fully succeed. The question, then, is how far it does succeed, and that is a difficult one to answer. Even the requirement of being able to write some Python code probably means "for everyone" is promising too much.


The first thing we have to establish is what parts of modern machine learning the book covers. Arguably the most exciting aspects of machine learning are neural networks in all their variety and sophistication. They are the single big reason that machine learning is an important topic, and they are more or less ignored in this account. This isn't unreasonable, as covering them in detail would require more math than the book assumes, but if you want to know about the exciting parts of modern machine learning you might well be disappointed that they are relegated to four pages near the end of the book.

What the book does cover, in great detail, are simpler models of machine learning - regression, nearest neighbour classification, Bayes models and so on. Most of these methods, regression in particular, could just as well be called "classical statistics", but then they would not sound quite as exciting.

 

Chapter 1 starts off with a very chatty introduction to what machine learning might be. There is nothing technical here and this really is for everyone. Chapter 2 introduces some basic mathematical ideas - essentially summing series and the inner product. These are unavoidable if you are going to deal with classical terms such as sums of squares as a way of characterizing error. Chapter 3 introduces the first models - nearest neighbour classification and naive Bayes. You could argue that no "learning" is going on in either of these. The next chapter introduces regression, which I still find hard to include in the category of "learning" methods; here it gets a very gentle and mostly geometric treatment. This brings the first part of the book to a close, and so far we have made the acquaintance of only a very limited range of machine learning models - some of which not everyone would even agree are machine learning models.

Part II is a deep and extensive investigation of the evaluation of the performance of classifiers and predictors. This is a fairly modern concern and it is described in fine detail - so much so that it is difficult to see the overall principles. The explanations are long and in plain English - something that quickly makes you understand why the compressed code of mathematics makes an idea easier to express and work with. After confusion matrices, ROC curves and R-squared I was almost begging for the torture to end - oh no, not another variation on a theme! I do know the math and you might disagree with my verdict, but I really think any reader is going to have trouble absorbing so much uncompressed data. The discussion of R-squared was particularly confusing - correct, but very difficult to understand. Where is the harm in stating that R-squared is the ratio of what your model has explained to the total variation in the data and moving on?
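To make that one-line description concrete, here is a minimal sketch of my own - it is not taken from the book, and the toy data is invented purely for illustration. It computes R-squared both directly, as one minus the ratio of unexplained to total variation, and via scikit-learn's r2_score:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Toy data: a noisy linear relationship (invented for illustration)
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))
y = 3.0 * X[:, 0] + rng.normal(0, 2.0, size=50)

model = LinearRegression().fit(X, y)
pred = model.predict(X)

# R-squared as "variation explained by the model / total variation",
# i.e. 1 - (residual sum of squares / total sum of squares)
ss_res = np.sum((y - pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print(1 - ss_res / ss_tot)   # computed from the definition
print(r2_score(y, pred))     # the library metric gives the same value
```

The two printed values agree, which is the point - the "variation explained" framing and the library metric are the same quantity.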

Part III is about other methods of machine learning. Here we meet logistic regression, Support Vector Machines, decision trees and so on. We meet regression models extended in every possible way except the classical general linear model.

Part IV is about combining methods - bagging, boosting, random forests etc. We also learn about feature selection - arguably the big problem with all non-neural network methods. This section comes to a close with a look at neural networks as if they were a variation on regression - which they are, but this misses so much.
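The "variation on regression" remark can be made concrete with a minimal sketch of my own - not from the book, with toy data and a learning rate chosen just for illustration. A single "neuron" with an identity activation, trained by gradient descent on squared error, simply recovers the coefficients of linear regression:

```python
import numpy as np

# Toy data: y = 2.5*x + 0.7 plus a little noise (invented for illustration)
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=200)
y = 2.5 * x + 0.7 + rng.normal(0, 0.1, size=200)

# One "neuron": prediction = w*x + b, identity activation,
# trained by gradient descent on mean squared error.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(2000):
    err = (w * x + b) - y
    w -= lr * 2 * np.mean(err * x)   # gradient of the loss w.r.t. w
    b -= lr * 2 * np.mean(err)       # gradient of the loss w.r.t. b

print(w, b)  # close to the slope 2.5 and intercept 0.7 used above
```

Add hidden layers and non-linear activations and you get the extra expressive power that, as noted above, this framing leaves out.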

Overall this is a well-written book and any one small part of it is easy to read. It does attempt to give you the ideas behind the methods in non-mathematical terms - or, more accurately, using as little math as possible - and a lot of stories and analogies. This works, but there is so much of it that you have to be a quick study to get through it all and come out the other side with a general picture of what is going on. In a sense, at the end of the book, you will have understood what the author has explained, but you will not have the knowledge that was in his head that allowed him to write the book.

The biggest drawback of the book is the way that it covers many modern methods, but from a very traditional point of view. It would be unfair to characterize this book as "regression made modern", but the phrase did cross my mind as I was reading it. A second problem is that neural networks are the elephant in the room, and when they do finally make their appearance it is as a mouse.

This is a well-intentioned, well-written book, but I can't recommend it as a way of avoiding the math. However, I can recommend reading bits of it to make sure you have fully understood the math.

 

