During my career as a graphic designer working in digital design, app development, and all manner of design for the tech world, I have worked with programmers, front-end and back-end developers, data analysts, and software engineers. I have worked in-house, contracted in agencies, freelanced, and rented co-working spaces. Through all those years, my techie colleagues have had one thing in common: they’re mostly guys.

Gender Bias in Digital

And it’s not just my impression; the stats back up my experience:

  • Facebook has a technical workforce that is 15% female.
  • Google has an engineering workforce that is only 17% female.
  • Apple’s global engineering workforce is 20% female.
  • Pinterest’s technical team is 21% female.


It’s not just web design and software development; these numbers are reflected in wider areas of study and professional life.

Women in STEM

It’s a well-known fact that women make up a smaller percentage of students studying STEM (science, technology, engineering, and mathematics) subjects. According to recent UK statistics from STEM Women, the number of women graduating in core STEM subjects grew from 20,020 in 2015/16 to 22,340 in 2016/17. These stats might sound encouraging, but because the number of men graduating in these subject areas grew even faster, the percentage of women in STEM dropped from 25% to 24%. This shrinking share of women graduating in subjects associated with software engineering and programming points to declining female representation within the digital sector workforce.

There isn’t space in this article to explore the many factors that may contribute to women’s lower participation in tech and digital, but I will highlight why it is an issue for our industry, why it can negatively affect the QA process, and what we can do about it.

Why does this matter?

One might argue that as long as innovative technology and digital solutions are being developed, and our lives are being enriched by the intelligent and creative people who make them, regardless of their gender, then what’s the problem?

If we surround ourselves with similar people, who look, act, and think just like us, there’s an inherent risk of groupthink and of failing to address the needs of people who “aren’t like us.” This can lead us to accidentally create systems, processes, and algorithms that exclude, frustrate, or even endanger people.


A relatively harmless example of gender bias in digital is the humble emoji. When texting and posting on social media, it’s commonplace to use emojis and icons as shorthand and to add expression to our messages. The options available to us expand our visual vocabulary and quietly set our expectations of what is normal. For example, when typing the word “doctor,” we may be presented with the option to include a little graphical representation of a doctor. If the only option on offer is a smiling white male with a stethoscope, this tells us, however subtly, that being a doctor is a man’s job and not one for women.

Another example of accidental sexism in digital was the automated recruitment tool that Amazon commissioned to filter the thousands of job applications received daily by the online retail giant. The system never made it past the testing phase: having trained itself on ten years of data submitted by applicants, much of which came from men, the AI concluded that male candidates were preferable. According to a report by Reuters, the system started to penalize CVs that included the word “women” and downgraded those that referenced women’s colleges. The program was updated to make it neutral to the term, but it became clear that the system could not be relied upon to be unbiased, so it was scrapped.
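To see how this kind of bias creeps in, here is a minimal, hypothetical sketch in Python using scikit-learn (emphatically not Amazon’s actual system). It trains a toy model on historically skewed hire/reject decisions, where the labels and CV snippets are invented for illustration, and then inspects the weight the model has learned for the token “women”:

```python
# Hypothetical sketch: a model trained on a biased hiring history learns
# to penalize a gendered word. The toy CVs and labels are invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Past decisions (1 = hired, 0 = rejected) skew against CVs that mention
# "women", even though the word says nothing about a candidate's skill.
cvs = [
    ("captain of the chess club", 1),
    ("led the robotics team", 1),
    ("graduate of a state college", 1),
    ("captain of the women's chess club", 0),
    ("led the women's robotics team", 0),
    ("graduate of a women's college", 0),
]
texts, labels = zip(*cvs)

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(texts)
model = LogisticRegression().fit(X, labels)

# The learned weight for the token "women" comes out strongly negative:
# the bias in the training data has become a rule inside the model.
idx = vectorizer.vocabulary_["women"]
print(f"learned weight for 'women': {model.coef_[0][idx]:.2f}")
```

The point is not the model but the data: when the labels encode a biased history, a gendered token can be the cheapest signal for the model to learn, which mirrors what Reuters reported at a much larger scale.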

A mindset that treats male as the default setting doesn’t just cause frustration, accidental sexism, and embarrassment; it can also have very real and life-threatening consequences.

It’s not just emojis and CVs: accidental gender bias can cost lives. A report in the American Journal of Public Health found that female drivers are much more likely than male drivers to be seriously injured in a car crash, even when both are wearing a seat belt. Researchers compared information on over 45,000 crash victims and found that women were 47% more likely to suffer serious injuries than men. Female drivers were more susceptible to injury because of differences in neck strength and musculature, the positioning of head restraints, and, on average, their shorter stature and preferred seating posture. Car safety devices had been designed largely for men: the seats, seat belts, and other safety features were built around a “default human” with the height and weight of an average male. This kind of thinking throughout the development and QA testing of cars and driver safety features has inadvertently led to the death and injury of thousands of women over the years.

What can we do about it?

Hire more women

The obvious step is to get more women into our development and testing teams. A team that is representative of the modern world we live in will, by default, create systems and products that are designed with more people in mind and that address their needs.

Include more women

If your team is mostly male and you have no immediate plans or resources to expand it, why not invite women and people of different backgrounds and experiences to become part of the QA process? Organize an open product testing session with people outside your demographic to test what you have created. They will provide valuable feedback and insights that your team may not have considered.

Design with women in mind

A good technique for improving the QA testing process is to devise personas: fictional character profiles that help you define your ideal customer or product user and understand their needs better. It can be particularly useful to create female personas and then work through your product QA testing process with each persona in mind, to uncover whether any part of your design, processes, or devices might accidentally exclude or frustrate people.
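As a sketch of how this can be made concrete, the Python below runs the same QA check against several personas; the Persona fields, the example personas, and the product rule are all hypothetical, invented for illustration:

```python
# Hypothetical sketch of persona-driven QA: define personas once, then run
# every check against all of them, not just the "default" user.
from dataclasses import dataclass

@dataclass
class Persona:
    name: str
    gender: str
    height_cm: int
    notes: str

PERSONAS = [
    Persona("Priya", "female", 158, "shorter than the 'average male' design target"),
    Persona("Tom", "male", 178, "close to the default design target"),
    Persona("Ana", "female", 170, "prefers sitting closer to the controls"),
]

def head_restraint_fits(height_cm: int) -> bool:
    # Hypothetical product rule: the restraint only adjusts for 165-190 cm.
    return 165 <= height_cm <= 190

for p in PERSONAS:
    result = "PASS" if head_restraint_fits(p.height_cm) else "FAIL"
    print(f"{result}: head restraint fit for {p.name} ({p.height_cm} cm), {p.notes}")
```

The same pattern applies to purely digital checks (form fields, default avatars, emoji sets): a check that only ever runs against one persona is really just testing the “default human.”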

Get involved with women-in-tech groups

Reach out to organizations that specifically include, educate, and empower women to become involved in digital and tech. It can be beneficial to both parties to share information, experiences, job opportunities, and speaking events.

Plenty of organizations are doing great work to address gender bias in tech.

Search for local groups in your area and ask if there’s an opportunity to collaborate; you may be surprised at the positive outcome.

Conclusion

It’s not enough to make sure your products “work”; they have to work for as many people as possible. Bake an equality checklist into your QA procedure and ask yourself, “Have I considered people of different backgrounds, genders, and physical abilities when designing and testing these products and experiences?” By doing this, you will help bridge the gap between genders and produce better digital experiences for everyone.
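As a starting point, here is a minimal sketch of what baking such a checklist into a QA gate could look like in Python; the questions and the gate logic are illustrative suggestions, not an authoritative standard:

```python
# Hypothetical sketch of an equality checklist wired into a QA gate; the
# checklist items below are illustrative examples, not a complete standard.
EQUALITY_CHECKLIST = [
    "Do default avatars, emojis, and example users represent more than one gender?",
    "Has the product been tested with personas outside the team's own demographic?",
    "Do physical fit and safety assumptions hold for shorter or lighter users?",
    "Could any automated filter or model penalize a gendered term?",
]

def equality_gate(answers: dict) -> bool:
    """Fail the QA gate if any checklist item is unanswered or answered 'no'."""
    passed = True
    for item in EQUALITY_CHECKLIST:
        if not answers.get(item, False):
            print(f"UNRESOLVED: {item}")
            passed = False
    return passed

# Example: two items confirmed, two still open, so the gate fails.
answers = {EQUALITY_CHECKLIST[0]: True, EQUALITY_CHECKLIST[1]: True}
print("QA gate passed" if equality_gate(answers) else "QA gate failed")
```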