Microsoft Says Building AI Systems Without Women Leads To Gender Bias

Dhir Acharya - Mar 11, 2019



With the widespread adoption of AI, AI-based solutions developed only by men are likely to result in inherently biased technology.

With the widespread adoption of AI, AI-based solutions developed only by men are likely to result in inherently biased technology, a top Microsoft executive said on Friday.

The World Economic Forum's 2018 report indicates that only 22 percent of AI professionals are female, while 32 percent consider gender bias a major hurdle in the industry's recruitment process.

Mythreyee Ganapathy, Director of Program Management for Cloud and Enterprise at Microsoft, said that AI systems are more likely to be biased if they are built by a single homogeneous group, such as an all-Asian or all-male team.


Data sets used for AI training need to be assembled by a diverse group of engineers.

Ganapathy gave the example of data sets used for training speech AI models: because they focus mainly on adult speech, they accidentally leave out children. As a result, the models could not recognize children's voices.

The report also shows that India ranks 108th on the gender gap index, and that women make up only 27 percent of its labor force, one of the lowest rates in the world.

The report also states that AI teams need to bring in under-represented groups to increase their diversity, since 52 percent of women worldwide believe that technology is a field for men.

Microsoft encourages traditionally women's colleges as well as other universities to offer computer science programs, with the aim of narrowing the gender gap.


The executive noted that businesses and academic AI teams have accidentally transferred gender bias into AI systems.

For instance, machine learning specialists at Amazon scrapped an AI recruiting tool in October last year after finding that it was biased against women. The team said the tool had taught itself to favor male candidates.
