I found it really interesting when David Labinson started talking about Occam's Razor, specifically the idea that machine learning is a departure from it. Honestly, I've always thought of simplicity as a necessary requirement for a model, and I've understood machine learning on its own terms, yet I had never put it together that machine learning is an almost unique exception. So I thought that was a very neat observation.
Hi Zev, I am not so sure I understand what you mean by having "understood machine learning." I assume you are referring to the idea that machine learning is an exception in that it is understandable while also being highly complex. I am not so sure I agree: while machine learning is indeed highly complex, the common "black-box" methods are hardly understandable. Why did my model assign a weight of 0.8 to one variable over another? I cannot answer such a question. In fact, I believe that returning to Occam's Razor and testing the efficacy of several simple models through machine learning's "white-box" methodology might make it the exception you speak of.
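To illustrate the white-box point with a hypothetical sketch (the data and weights here are invented for the example, not from either post): with a simple linear model fit by ordinary least squares, the learned coefficients are directly readable, so the question "why did this variable get a weight of 0.8?" has a concrete answer, in a way it usually does not for a black-box learner.

```python
import numpy as np

# Synthetic data with assumed "true" weights of 0.8 and 0.2,
# chosen purely for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
true_w = np.array([0.8, 0.2])
y = X @ true_w + rng.normal(scale=0.05, size=200)

# Ordinary least squares: the fitted coefficients ARE the explanation.
# Each entry of w_hat says exactly how much its variable moves the prediction.
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w_hat)  # recovers roughly [0.8, 0.2]
```

A deep network or gradient-boosted ensemble fit to the same data might predict just as well, but it exposes no comparable set of weights you can point to and interpret.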