Quick Takes: Microsoft’s Tay

Tech Companies Must Address Sexism in Products and Workforce.

In relying on women’s voices for chat services like Tay, assistants such as Siri, and even “girlfriend-serving” chatbots like Xiaoice, the tech industry depicts women as manipulable servants. So while Microsoft works to reboot Tay, a prominent question lies in her wake: How is the tech industry still so blatantly sexist? Companies such as Microsoft fail to address sexism within their own workplaces, and that failure perpetuates the problem.

According to The Telegraph, Microsoft faced another sexism scandal in March, when it hired female dancers dressed in provocative schoolgirl outfits for a game-developer party. And, on the subject of female chatbots, Microsoft also created Xiaoice in China, The Telegraph reports, to serve as a girlfriend and provide dating advice; it is used by approximately 20 million people. Yet these services modeled after women lack a basic understanding of concepts highly relevant to women — such as rape and domestic violence, according to The Guardian. And despite women’s attempts to advance their positions in technology, companies continue to ignore their efforts. Take, for instance, the summit at South by Southwest, which worked to address abusive language toward women online. Despite its importance, the entire summit was “isolated away from the main conference and reportedly attended only by the few brave souls already aware of the issue,” according to The Guardian.

It remains vital that tech companies address the issues within their own workforces and in the very technology they release to the public in order to eliminate sexism in the industry. Unless companies tackle the problem head-on, it is unlikely to be eliminated any time soon.

 — EMILY COLLINS   Staff Writer

Obscene Bigoted Language Toward Tay Reflects the Attitudes of Users.

A machine by itself can do neither good nor evil, so the humans who make use of it bear full responsibility to use it well. Microsoft’s new chatbot shows us just how much users shape a technology’s ethics, even more so than its creators. Tay’s bigotry on Twitter clearly mirrors the flaws and biases of her users.

Microsoft announced that Tay’s purpose was to “speak as a teen girl,” ostensibly to make her relatable, empathetic and accessible to her users, those who engage with her on Twitter. However, Tay and other chatbots have no true age or gender and will adapt their own personas based on interactions with their users. Even if Microsoft had designed a chatbot, sans gender, with an avatar and voice automatically generated to maximize relatability, its users would still anthropomorphize it, assigning gender and other personal qualities to it, and treat it accordingly.

Thus, Tay serves as a litmus test for the kinds of behaviors and attitudes that pervade the Internet. Users ostensibly corrupted Tay as a joke, trolling her with bouts of racism and sexism just for the irony of it, but anyone who frequents social networks knows the real prevalence of these “ironic” prejudices. Sarcasm gets lost in the medium, and mock bigotry becomes indistinguishable from the real thing, making many corners of the web far less comfortable places for women and minorities. Tay either became a masterful troll or a genuine bigot, and the fact that the two look the same shows us the ugly truth about our society’s online culture. Regardless of creator intent, users bear the ultimate responsibility to use technology for good.

 — THOMAS FINN  Senior Staff Writer

Representation in Companies Necessary to Combat Misogyny Illustrated via Tay.

Through Microsoft’s “Tay,” we gain insight into the rhetoric that female activists, game developers and engineers who live online face every day. For example, Gamergate, a controversial movement in 2014 led by an alliance of anti-feminists, resulted in game developer Zoe Quinn and critic Anita Sarkeesian facing disturbing death and rape threats, forcing them to flee their homes for questioning sexism in video games. In the male-dominated technology industry, women’s safety and equality online are not treated as central issues. To further the conversation, we need more women in technology.

It’s no surprise that sexism is an obstacle in the tech industry. According to the Huffington Post, in Silicon Valley, nearly half the companies have no female executives, and, as ascertained by a ValueWalk infographic, only 18 percent of all tech startups in the United States have at least one female founder. In response, companies have been reforming by turning to wage transparency and releasing reports indicating the number of female employees. Activists such as Facebook Chief Operating Officer Sheryl Sandberg have been speaking out on gender disparity, stating that Facebook provides employees training to correct their unconscious biases. The issue is even being tackled at a young age, with organizations like Girls Who Code and SciGirls.

However, the misconceptions associated with women in tech, as seen in promotional videos portraying “geek girls” as sex symbols, are still pervasive, and young girls are not receiving the same level of support for pursuing STEM careers as their male counterparts. With more female executives in tech, we will see an increase in representation and diversity. In the future, bots like Tay will not face this same reaction online, because women will no longer be seen as technologically inferior.

 — AARTHI VENKAT   Staff Writer
