By Lucas | May 16, 2016
I’ve spent the last couple of months working through the third course in the University of Washington’s Machine Learning Specialization on Coursera. Course two covered regression (review); the topic of the third course is classification. As with the previous courses, the specialization is taught by Carlos Guestrin and Emily Fox; for the classification course, Dr. Guestrin took the lead.
The time requirements did increase a bit with this third course; not excessively, but it felt like an extra hour or so a week. Unfortunately, that came at a bad time for me personally, as home repairs, a broken-down car, and illness conspired to put me a couple of weeks behind in a MOOC I had every intention of completing. I caught up and completed the class, but not before learning that in this situation Coursera will do everything in its power to convince you to move your progress (completed assignments) to a future session, including repeated emails and warning messages when you log into the web site. I appreciate having the option, but the number of emails Coursera sent seemed excessive.
Course Corrections
It seems that Guestrin and Fox have made some minor but appreciated adjustments based on student feedback from earlier courses. In most cases the assessments now show you which answer you selected incorrectly, reducing the need to write down all of your answers ahead of time if you want to improve your quiz score on subsequent attempts. In some situations, feedback is even offered on the incorrect answer. As with the long gap between previous courses, there is another long wait before the next course, but this time the start date has already been announced (June 15), which makes it easier to plan other continuing-education opportunities in the meantime.
Classification Curriculum Content
Of course, what is of greatest interest is what material is covered in the class, and what is omitted. Overall, I was satisfied with the list of topics covered, though there were a few notable omissions. Guestrin emphasized logistic regression, both regularized and unregularized, through the first couple of weeks of the course. Assignments covered both how to work through a data science problem using logistic regression and how to implement logistic regression from scratch. Non-parametric methods such as decision trees were also covered, along with boosting; AdaBoost was the specific boosting method taught. Guestrin also gave students the opportunity to learn about stochastic gradient descent and online learning. Throughout the course, general data science topics relevant to classification were covered as well, such as overfitting, imputation of missing data, and precision/recall.
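To give a flavor of what the “from scratch” work looks like, here is a minimal sketch of unregularized logistic regression trained by gradient ascent on the log-likelihood. This is my own illustration using NumPy, not the course’s actual assignment code; the function names and toy data are assumptions.

```python
import numpy as np

def sigmoid(z):
    """Logistic function mapping scores to probabilities."""
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, n_iters=1000):
    """Fit weights by gradient ascent on the log-likelihood; y in {0, 1}.

    The gradient for each weight is sum_i x_i * (y_i - P(y=1 | x_i)).
    """
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        preds = sigmoid(X @ w)          # predicted P(y=1) for each row
        gradient = X.T @ (y - preds)    # errors weighted by features
        w += lr * gradient              # ascend the log-likelihood
    return w

# Toy, linearly separable data; first column is an intercept feature.
X = np.array([[1.0, -2.0], [1.0, -1.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0, 0, 1, 1])

w = fit_logistic(X, y)
preds = (sigmoid(X @ w) >= 0.5).astype(int)
```

The course assignments follow the same shape: compute predictions, compute the likelihood gradient, and update the coefficients, with the regularized version simply subtracting a penalty term from the gradient.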
There were some techniques that were, perhaps surprisingly, not covered in this class. Fellow students on the forums complained that support vector machines were not part of the curriculum, and I was also surprised that random forests got only a passing mention. It is understandable that not every topic fits into a 6-week curriculum, but these felt like significant omissions. They are techniques I’m already familiar with, but I’ve come away from every topic Fox and Guestrin have covered with a much deeper understanding than I started with, so I would have loved to hear their take on these machine learning options as well.
Classification Bottom Line Review
Three courses into the specialization, I have a pretty good sense of what I like about it and what I’m getting less value from. The instructional videos from Fox and Guestrin continue to be some of the best I’ve seen in an online course and are worth watching even if you don’t have time to do the assignments. The concept-focused quizzes are a perfect complement to those videos, doing an excellent job of reinforcing the instruction. The application assignments are also very good: they offer bite-size versions of the data science problems I regularly encounter and cause me to reexamine my thinking in my own work. I’m getting less value from the assignments that require implementing algorithms from scratch. Too often I’m dropped into the middle of an implementation that is 90% complete; I can finish the remaining 10% successfully, but the material doesn’t really “soak in” for me. I’m sure there are other students for whom this approach works better than it does for me.
That’s a minor complaint, and this continues to be an easy specialization to recommend. I’ve dabbled in a couple of other Coursera courses lately, and they were a good reminder that while Coursera hosts many excellent classes, its offerings are not universally excellent. When you find a specialization that works for you as well as this one is working for me, it’s worth the time, money, and effort to see it through to the end.