During the interview, I was surprised when Dan Kammen mentioned that even though we have more complex, more evolved, less uncertain, and higher-fidelity climate models, we usually forgo them in favor of models built on historical data and records, since those are far more practical in terms of computation and time. And even though the climate is changing so quickly that the records have to keep pace for these models to stay accurate, the historical data series is still a very powerful way to tell us how the climate is changing.
This has some interesting parallels in machine learning. Although it often isn't viable, the gold standard for a machine learning model is having lots and lots of historical data about what has happened before. For instance, to recommend TV shows, Netflix simply looks at what people with similar watching habits have liked in the past and suggests those shows to you.
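To make that concrete, here is a minimal sketch of the idea, user-based collaborative filtering on a toy ratings matrix. This is not how Netflix actually implements its recommender; the show names and the matrix are made up purely for illustration, and the similarity measure (cosine) is just one common choice.

```python
import numpy as np

# Toy user x show matrix of historical signals (rows: users, cols: shows).
# A 1 means the user watched and liked the show, 0 means no signal.
ratings = np.array([
    [1, 1, 0, 0, 1],
    [1, 0, 1, 0, 1],
    [0, 1, 0, 1, 0],
    [1, 1, 1, 0, 0],
], dtype=float)

shows = ["Show A", "Show B", "Show C", "Show D", "Show E"]

def recommend(user_idx, ratings, shows, top_n=2):
    """Recommend shows the target user hasn't seen, weighted by how
    similar other users' historical watching habits are to theirs."""
    target = ratings[user_idx]
    # Cosine similarity between the target user and every user.
    norms = np.linalg.norm(ratings, axis=1) * np.linalg.norm(target)
    sims = ratings @ target / np.where(norms == 0, 1, norms)
    sims[user_idx] = 0.0  # ignore self-similarity
    # Score each show by similarity-weighted historical likes.
    scores = sims @ ratings
    scores[target > 0] = -np.inf  # don't re-recommend watched shows
    ranked = np.argsort(scores)[::-1][:top_n]
    return [shows[i] for i in ranked if scores[i] > -np.inf]

print(recommend(0, ratings, shows))  # ['Show C', 'Show D']
```

The point is that nothing here models *why* anyone likes a show; the prediction comes entirely from the historical record, which mirrors the appeal of data-driven climate models.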