I greatly enjoyed the conversation between Professor Goodman and Professor Gilbert in this episode of Prediction and Psychology. One of the more surprising things I learned from this video was that, for the most part, people care a great deal about accuracy when making predictions about their futures. I was particularly interested in this because it ties in with the moment when Professor Goodman was asked why she is so adamant about accuracy, so it was interesting to hear a psychologist's perspective confirming that most people really do care about the accuracy of their so-called predictions. As a side note, another surprising bit of information from this interview was that there has been extensive research on whether species other than humans might also be capable of predicting their futures and amending their actions accordingly.
One question I wanted to ask Professor Gilbert: if most people really do care about (and even obsess over) the accuracy of their predictions, in other words, if people try to avoid bad outcomes and reach good ones when making predictions about their futures, are there fundamental biases (a kind of selection effect) in the questions they choose to ask and the actions they choose to take in order to maximize their chances of reaching that better outcome?
Your point about how biases may affect the self-assessments people make to maximize their chances of a better outcome made me think about how the same concept could apply to people's perception of the events they assess, not just the questions they ask! In other words, I think it's likewise plausible that biases (or external factors like hope, goodwill, etc.) may shape one's perception of an event so that it aligns with a preconceived opinion, too. The first thing that came to mind in support of both of our points is the availability heuristic, which refers to people's tendency to rely on readily available, typically recent, information when making decisions (something covered in Behavioural Economics, which was my subject area!). In the context of predicting the future, this bias may lead people to focus on recent events or vivid experiences when making predictions, creating the bias you described and further altering what their reasoning looks like, rather than considering the full scope of information available at the time. For example, someone who has recently experienced a setback in their career may be more likely to predict further setbacks in the future, even when that is not actually likely.
Tying this back to the idea of subconsciously chasing accurate predictions, I find the existence of cognitive biases pretty interesting: it means that when reasoning is done in a largely qualitative way, it's hard to have a standardised process that truly plays out the same way twice, in the way we usually imagine structured processes like the Padua Rainbow working!