In his interview, Stuart Firestein talks about his course at Columbia called Ignorance. I thought this was an interesting title because it emphasizes how little we as humans know about the world. The more we learn, the more we realize there is to learn. This attitude of humility leads us to question past scientific findings and authorities in pursuit of more accurate statements about the world. It is also interesting that Firestein critiques science education in schools, which often focuses on reading and memorizing facts rather than on developing a questioning mindset and an awareness of how much we don’t know.
My question for Firestein is: how do we account for intrinsic human ignorance when estimating uncertainty? Do humans tend to be overconfident in their estimates of uncertainty? Or, once they realize how much they don’t actually know, will they overcorrect? Perhaps a collection of crowd-sourced estimates of uncertainty could reduce the uncertainty in these estimates of uncertainty.
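The statistical intuition behind that last thought can be made concrete. Below is a toy simulation (my own sketch, not anything from the interview) in which many people each report a noisy but unbiased estimate of some true quantity; averaging the crowd shrinks the error roughly as the individual noise divided by the square root of the crowd size. All the names and numbers here are hypothetical illustrations.

```python
import random
import statistics

def crowd_estimate(true_value, n_people, individual_noise, seed=0):
    """Simulate n_people each reporting a noisy, unbiased estimate of
    true_value; return the crowd mean and the spread (stdev) of the
    individual estimates."""
    rng = random.Random(seed)
    estimates = [true_value + rng.gauss(0, individual_noise)
                 for _ in range(n_people)]
    return statistics.fmean(estimates), statistics.stdev(estimates)

# A hypothetical "true" uncertainty of 0.3, estimated by 1000 people,
# each off by a random error with standard deviation 0.1.
mean, spread = crowd_estimate(true_value=0.3, n_people=1000,
                              individual_noise=0.1)

# The crowd mean's error shrinks roughly as individual_noise / sqrt(n),
# so it lands far closer to the true value than a typical individual does.
print(f"crowd mean: {mean:.4f}, individual spread: {spread:.4f}")
```

One caveat worth noting, which connects back to Firestein's point: averaging only helps when the individual errors are independent and unbiased. If everyone shares the same blind spot, i.e., a systematic overconfidence, the whole crowd is shifted together and averaging cannot correct for it.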