How is technology reinforcing gender inequalities?
According to Tech Nation, women currently make up just 19% of the tech workforce. Far from bucking this trend, the tech giants mirror it: in 2020, only 23% of employees at Google, Apple and Facebook were women.
Yet, despite this pronounced gender gap, 74% of school-aged girls show an interest in a career in a science, technology, engineering, and maths (STEM) field.
The issues don’t stop there. According to a recent study from WeAreTechWomen, 75% of women working in tech say that they don't feel like they receive adequate support and respect from male colleagues, and two thirds of the respondents feel unheard in meetings.
But why exactly does this need to change? Beyond the ethical importance of giving more women the opportunity to develop in the field, why isn’t a leadership made up largely of talented white men sufficient?
The case of female mobile voice assistants
The value of including more female perspectives within technology companies – not just for women in the field, but for all users and the sector at large – is illustrated perfectly by the case of female voice assistants.
We all know what it’s like to get frustrated with Siri. After repeatedly failing to get the device to play a specific song, we might give up and say “forget it”, or even snap, telling it to “shut up”. Your first reaction to this might well be, “what’s the harm? It’s not like we’re losing our temper with a real, human woman”. Yet specialists have suggested that the tone, phrasing and set responses of female mobile phone voice assistants are inherently problematic. Their passivity, complicity and politeness – it is argued – further gender stereotypes and differ markedly from what we see in male voice assistants.
One of the first major reports to raise this issue was published by UNESCO, in collaboration with Germany and the EQUALS Skills Coalition, in 2019.
The report heavily criticised the submissiveness of Siri, particularly in regards to its responses to sexual harassment. It stated that such a response, and the characteristic servility of female voice assistants, “provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education”.
“The thing about technology is that it seems harmless. We think of tech as neutral, right? But, what that does when the voice of an assistant – whether it be Alexa, Google Home or Siri – is a woman, is perpetuate the norm of a woman as secretary, as servant, as assistant,” explains Samantha Karlin, CEO of Empower Global.
This idea is reinforced further by the key differences between how male and female voice assistants are deployed. It has been widely flagged that male voices are typically used for warnings and alarms – messages that instruct you to do something – while female voices are largely reserved for assistive tools. In short, female voice assistants follow your requests, while male voice assistants tell you what to do. Problematic, right?
Treating female voice assistants with derision is common. What’s worse, these assistants are designed to be subservient and provide humble, obedient responses – even if the user is being aggressive or making sexist remarks – a design choice many critics have challenged.
“I don't know about you, but I've actually gotten visibly upset when I've heard some of my male friends yelling at Alexa when she doesn't do what she's supposed to do, because it normalises abuse,” Samantha explains.
“So why is this the case? It is an example of a lack of inclusion in product design.”
The absence of diverse perspectives – what do we stand to lose?
For Samantha, this example is deeply indicative of a much wider industry trend.
“When it comes to technology, we have a responsibility to design products that are inclusive and don't perpetuate harmful norms, like the woman as the assistant. But unfortunately, that's not what we see happening.”
In the field of AI specifically, only around one in four people working in the sphere is a woman.
“So, that means that the people designing these products are men. I'm not saying men can't be empathetic and appreciative. But, it means that there's not enough people with differing lived experiences that are designing these products, who can say, ‘Hey, how about we don't design this for the default setting (white, western and male)’,” Samantha outlines.
Figures show a marked imbalance between men and women when it comes to promotions into senior roles. This trend has intensified since the start of the pandemic, during which 34% of men versus 9% of women received a promotion, and 26% of men versus 13% of women got a pay rise.
The gap is evident, and trends like these only threaten to widen it. But, unless greater support is given to women looking to pursue a career in tech, there will be an almost complete uniformity of perspectives among those involved in high-level discussions.
“It's not that men are not thinking about these things, but they don't have the lived experience necessary to draw those correlations,” Samantha explains.
“We need to understand that, even if technology can be a force for good, it can also easily be weaponised. And something that is normalised within our society is the abuse of women.”
Samantha urges that, when designing and developing technical products, worst case scenarios have to be considered. There are those who will seek to weaponise technologies, and we need to create measures that will protect vulnerable users against risks. One of the most concerning and widely cited examples of a failure to do so is that of social media.
“If you haven't seen the rates of teen suicide and depression in correlation to Instagram, it's shocking. In one study, it was revealed that, for girls who spend more than three hours on social media a day, 37% of them are likely to self-harm,” Samantha outlines.
In this case, empathy is clearly at odds with the financial model of the platform. The use of infinite-scroll UX design, coupled with exceptionally sophisticated engagement algorithms, ensures users remain on the app for as long as possible. In this way, as Samantha asserts, feelings of helplessness, hopelessness, poor self-esteem, depression and anxiety are all amplified by Instagram.
“When our society becomes a place where technology is used to destroy young women’s self-esteem, to cause suicide, and to perpetuate harmful norms, we are failing as a technical community.”
An ‘ethical revolution’, through feminist leadership
By assessing and improving the interrelationship between ethics and technology, the industry can work towards a new ethical framework – something, Samantha states, that is starkly absent from the industry today.
Firstly, according to Samantha, part of this is about asking ‘better questions’. Rather than giving voice solely to profit-led questions, a feminist approach to leadership requires leaders to evaluate the ethical implications of a product before it is brought to market.
“The purpose of capitalism is profit. They're not taking a pledge to say we want to do public good,” Samantha asserts. “So, we need people within the companies to ask these questions, because among the VCs and the board members – who all come from the private sector, including the female board members – there are very few social impact leaders.”
For Samantha, feminist leadership can be defined by a humanitarian, people-led approach to technology, which she believes will bring about a much-needed ‘ethical revolution’ within the industry.
“We need ethical leadership and we need feminist leadership. And when I talk about feminist leadership, it’s not leadership predicated on hating men. Feminist leadership is about valuing people over profit. It's about protecting people and the planet,” Samantha outlines.