Last week, a thing happened. Microsoft released an AI chatbot into the Twitter-verse. Then they broke Rule 2 and acted surprised when she was abused and turned into a Trump-supporting racist. None of that is a shocker to me. What I find odd is that no article about the incident seemed to take the perspective I did: that Tay had been abused, and that the bigger flaw wasn't in the AI's programming but in OURS.
Read the following: "Twitter taught Microsoft's AI chatbot to be a racist asshole in less than a day." "Microsoft has issued an apology over their creation of an artificial intelligence program behind a Twitter account that began to post racist remarks." "Trolls turned Tay, Microsoft's fun millennial AI bot, into a genocidal maniac."

If you Google the incident, you'll see headline after headline, LOLing about Tay being awful. Interestingly, the titles often blame Tay for what happened and never once blame her abusers. And yet, none of these articles takes it a step further. None of them asks the question: why did the humans in this equation react to Tay this way? That is, none until Leigh Alexander's "The tech industry wants to use women's voices -- they just won't listen to them." (Thanks, Charles Atan, for bringing the article to my attention.)

I feel that what the interaction between Tay and the Twitter-verse says about the Twitter-verse is far more damning than what it says about Microsoft's AI program. I have to wonder why they chose the persona of a teen girl. Did they think that her youth and innocence would make her more appealing to the (default male) population? Apparently, they made that choice without once considering the ramifications.

No one seems to be discussing the elephant in the room: women's experience of the internet is not a pleasant one. It's fraught with danger. And more often than not, women are told to "get tough and get over it. The internet is a scary place. That's just the way it is." No one is holding Tay's abusers accountable. That's just normal behavior, right? In fact, one article I read laughingly ended with a statement about how Tay was like every other teen girl in that she badly needed to develop some common sense. It was so sexist, it brought me up short. In other words, she's just another young woman whose bad decisions got her in trouble.

Think about it. I believe they won't sort out this problem until they acknowledge that social science is involved -- that a key component in this equation is the context of the individual identity their AI is assuming. In short, their "default setting" is what is in the way of their project. They can't prepare Tay for a world they don't understand. This is an amazing opportunity to see human interaction without the default filters. We can see the unseen -- that is, our systemic biases. Until then, they'll keep releasing her with the same result: she'll be abused.