Microsoft Says Building AI Systems Without Women Leads To Gender Bias
Dhir Acharya - Mar 11, 2019
With the widespread adoption of AI, AI-based solutions developed only by men are likely to result in inherently biased technology.
With the widespread adoption of AI, AI-based solutions developed only by men are likely to result in inherently biased technology, a top Microsoft executive said on Friday.
The “World Economic Report 2018” indicates that only 22 percent of AI professionals are female, while 32 percent consider gender bias a major hurdle in the industry’s recruitment process.
Mythreyee Ganapathy, Director of Program Management, Cloud and Enterprise at Microsoft, said that AI systems are more likely to be biased if they are built by a single homogeneous group, such as an all-Asian or all-male team.

It is necessary that the data sets used for AI training be assembled by a diverse group of engineers.
Ganapathy cited the example of data sets used for training speech AI models: because they focus mainly on adult speech, they inadvertently leave out children. As a result, the models could not recognize children’s voices.
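The kind of dataset skew Ganapathy describes can be caught before training with a simple representation audit. The sketch below is a minimal, hypothetical illustration (the field name, threshold, and toy data are assumptions, not Microsoft's actual pipeline): it counts how much of a dataset each speaker group contributes and flags groups that fall below a minimum share.

```python
# Hypothetical sketch: auditing a speech training set for age-group coverage.
# Field names, the 10% threshold, and the toy data are illustrative assumptions.
from collections import Counter

def audit_representation(samples, group_key, min_share=0.10):
    """Return groups whose share of the dataset falls below min_share."""
    counts = Counter(s[group_key] for s in samples)
    total = sum(counts.values())
    return {g: n / total for g, n in counts.items() if n / total < min_share}

# Toy dataset skewed toward adult speakers, mirroring the example above.
dataset = ([{"speaker_age_group": "adult"}] * 95
           + [{"speaker_age_group": "child"}] * 5)

print(audit_representation(dataset, "speaker_age_group"))  # {'child': 0.05}
```

A check like this only surfaces the imbalance; fixing it still requires collecting more data from the underrepresented group, which is where a diverse engineering team is most likely to notice the gap in the first place.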
The report also shows that India ranks 108th in the gender gap index, and that women make up only 27 percent of its labor force, among the lowest rates in the world.
The report says that AI teams need to include more diverse groups, since 52 percent of women worldwide believe that technology is a field for men.
Microsoft encourages traditionally female colleges, as well as other universities, to include computer science in their curricula, with the aim of narrowing the gender gap.

The executive noted:

Businesses and academic AI teams have accidentally transferred gender bias to AI systems.
For instance, machine learning experts at Amazon scrapped an AI recruiting tool in October last year after finding that it penalized women. The team said the tool had taught itself to favor male candidates.
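The mechanism behind that failure is worth spelling out: a model trained on historical hiring decisions learns whatever patterns those decisions contain, including ones that merely correlate with gender. The toy sketch below is a hypothetical illustration, not Amazon's system; it scores resume words by their smoothed log-odds of appearing in hired versus rejected resumes, using made-up data where past hires skew male.

```python
# Hypothetical sketch of how a model absorbs bias from historical hiring data.
# The toy "resumes," labels, and scoring scheme are illustrative assumptions.
import math
from collections import defaultdict

def train_word_scores(resumes, labels):
    """Score each word by its Laplace-smoothed log-odds of being in a hired resume."""
    hired, rejected = defaultdict(int), defaultdict(int)
    for words, label in zip(resumes, labels):
        for w in set(words):
            (hired if label else rejected)[w] += 1
    n_hired = sum(labels)
    n_rejected = len(labels) - n_hired
    vocab = set(hired) | set(rejected)
    return {w: math.log((hired[w] + 1) / (n_hired + 2))
               - math.log((rejected[w] + 1) / (n_rejected + 2))
            for w in vocab}

# Historical data where hires skew male: "womens" (as in "women's chess club")
# appears only in rejected resumes, so it earns a negative score even though
# it says nothing about the candidate's skill.
resumes = [["captain", "chess"], ["lead", "chess"],
           ["womens", "chess"], ["womens", "captain"]]
labels = [1, 1, 0, 0]  # 1 = hired

scores = train_word_scores(resumes, labels)
print(scores["womens"] < 0 < scores["lead"])  # True
```

Nothing in the code mentions gender explicitly; the bias rides in on a correlated word, which is why such tools can look neutral on paper while still reproducing the imbalances in their training data.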