Prof. Laibson mentioned that if "big data" were "big" enough, an ML model could predict the stock market 20 years from now. How "big" would this data need to be? Would it have to be on the scale of Laplace's demon, where every factor in the world would need to be known, or would a smaller subset of information be enough for this prediction? I'm curious about this because everything is theoretically predictable by Laplace's demon, so where do we draw the line between predictions that require all the information held by Laplace's demon and predictions that only need a piece of it?