I found the interview with Professor Dan Gilbert very interesting for how it evaluates the way humans make predictions, and how we can in turn make predictions about human behavior itself. Early in the interview, I was already noticing ideas that interested me, some of which we have discussed in this class and others that we have yet to really dive into. One of the first interesting points Professor Gilbert raised is that humans don’t always predict with accuracy in mind; instead, they often predict the things they hope will happen, using prediction as a sort of manifestation tool to almost will an outcome into being. He discusses some of the reasons people do this, highlighting social rituals as the main one, but also noting that it is often simply a fun activity to dream and enjoy.
Another really interesting thought Professor Gilbert brought up is how humans react to the outcomes of their predictions. When we get a prediction wrong, or a different outcome occurs, we are often upset not so much at ourselves as at the prediction and the outcome themselves. These outcomes set a tone and mindset toward future predictions, and they often cloud our predictive judgment when we want a prediction to come out the way we desire.
Aurora, you’ve highlighted an intriguing connection between prediction and the human desire for control. I think predicting something you want to happen often gives you an “illusion” of control that helps assuage worries about the future. It is an illusion because the future remains uncertain until it occurs, despite your best estimations. In many cases, predicting what you want to occur may even decrease your “control,” since you hold a biased view of the future. I wonder whether predictions made more according to future hopes than accuracy should still be called “predictions,” or rather “wishes.” It would be an interesting exercise to examine modern prediction systems and see where this hope introduces inaccuracy and bias. For example, if people don’t want to hear a gloomy prediction about climate change, can the predictor frame it in a way that gets people to listen and take action? Moreover, could telling people what they want to hear become a self-fulfilling prophecy, as you pointed out?