After watching the Artificial Intelligence interview with Prof. Shneiderman (and part of his lecture at the Radcliffe Institute), I was struck by his emphasis on visual presentation methods as a factor in the success of various kinds of technology. Although it seems somewhat obvious in retrospect, I had never quite appreciated the extent to which computers and smartphones present massive amounts of information through simple features of their GUIs. This became even clearer as I listened to Prof. Shneiderman recount his accomplishments during the Radcliffe talk; it amazes me how much planning, study, and design go into features like blue hyperlinks and touchscreen keyboards. This poses a unique challenge to AI designers, as it is not clear how best to present AI visually, since AI currently lacks self-explanatory power.
One part of the talk that made me extremely excited, and that I wish had lasted longer, was Prof. Shneiderman's reference to Technics and Civilization by Lewis Mumford. I binge-read this book the summer before my freshman year, and I place it in a very small canon of works I consider quintessential masterpieces. Although I find myself interested in economics and try to approach issues like technological progress within a strictly social-scientific framework, Mumford forces us to realize, on a fundamental level, that our interactions with technology are determined by the sociocultural dynamics and myths we adopt. Whilst I think Prof. Shneiderman's use of the book to highlight the obstacle of animism was effective, I believe there is much more in it that offers significant insight for the discussion of AI. The reason language is so important [as Prof. Shneiderman carefully highlights] is the need to manage (and in some cases combat) the myths that grow up around technology. For instance, I've found this point particularly important when discussing the labor-market doomsaying claim that automation and AI will displace the majority of human work. We as a species have made variations of this exact claim, like clockwork, at the advent of every new technology, and on an empirical level the probability that AI will displace a majority of the labor market is so low we may as well treat it as zero. And yet many [intelligent] people continue to buy into the narrative. Why? It's a complicated question, but if Mumford is to be believed, it is a narrative subconsciously derived from the framework our socioeconomic institutions create. Perhaps it is a manifestation of the fear of displacement felt by the managerial class in outmoded sectors, one that is then translated into a language more relatable to laborers in general? In any case, one could spend a long time discussing Mumford's work, and I would have enjoyed more references to it.
Your point about technology and sociocultural factors influencing each other bidirectionally is very interesting. I wonder whether this also bears on your opening thoughts about how best to present AI to users in terms of design: it seems that not only AI developers but also designers must take into account sociocultural narratives and the importance of language in order to fully meet their aims.