
Forum Comments

Let's talk about AI.
In The Future of the Future
Devangana Rana
Harvard GenEd 2023
Apr 03, 2023
I found learning about how AI such as ChatGPT works in the second article very interesting. I did not know that many new AI models such as ChatGPT use a new type of neural network known as a transformer model, which can analyze multiple pieces of text simultaneously, making AI models faster and more efficient. Furthermore, the fact that AI models can pick up emergent behaviors, including unexpected skills, is very intriguing. It makes me think about how far these emergent behaviors can go, and the extent to which they can have negative and unintended consequences.

In the third article, I learned that AI models are very susceptible to explicit and implicit biases. Most language models such as ChatGPT are trained on vast amounts of text data and learn to generate responses based on patterns and associations in that data, which can often include hate speech, untruths, and propaganda. This can have vast implications for users. As AI systems become more sophisticated and mimic human language more convincingly, it may become increasingly difficult for consumers to discern whether the information they are receiving is accurate. If a user relies on an AI-generated response to make an important decision or to take a certain course of action, there is a risk that the response may be inaccurate.

Exclusive: OpenAI Used Kenyan Workers on Less Than $2 Per Hour to Make ChatGPT Less Toxic

This article discusses how OpenAI sent text snippets of graphic and disturbing content to an outsourcing firm in Kenya to obtain labeled examples of violence and hate speech, which could then be fed into the AI so that the tool could learn to detect such content. It made me think about the tradeoff between efficiency in technology and the potential harm caused to the human workers who are working to make AI models better.
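To make the "analyze multiple pieces of text simultaneously" point concrete, here is a minimal, simplified sketch of the self-attention idea at the heart of transformer models. This is a toy illustration in plain Python, not how ChatGPT is actually implemented: real models use learned query/key/value projections, many attention heads, and GPU matrix operations. The point is just that every token is compared against every other token in one pass, with no step-by-step recurrence.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(tokens):
    """Simplified scaled dot-product self-attention.

    `tokens` is a list of embedding vectors (lists of floats).
    Each output vector is a weighted mix of ALL token vectors,
    so the whole sequence is processed together rather than
    one word at a time as in older recurrent models.
    """
    d = len(tokens[0])
    outputs = []
    for q in tokens:  # each token acts as a "query"
        # Compare this token against every token in the sequence.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in tokens]
        weights = softmax(scores)
        # Weighted sum over all token vectors.
        outputs.append([sum(w * v[i] for w, v in zip(weights, tokens))
                        for i in range(d)])
    return outputs

# Toy 3-token sequence with 2-dimensional embeddings.
seq = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
out = self_attention(seq)
print(len(out), len(out[0]))  # one output vector per input token
```

Because the scores for all token pairs can be computed as one matrix multiplication, this design parallelizes well on modern hardware, which is a big part of why transformers train faster than the sequential models that came before them.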