AI and gender bias – who watches the watchers?

This is a contributed article by Chris Baker, UK Managing Director at Concur


Artificial intelligence (AI) and machine learning are causing excitement all over the world. Recent reports, such as one from Accenture, claim they have the potential to revolutionise business operations. For instance, research tasks that take hundreds of hours, such as candidate profiling, can now be performed by an AI within seconds. It’s no wonder that many businesses are tapping into this trend – the potential savings, in both time and money, are extraordinary.

However, what are the consequences of programming AI in today’s environment? Bias in the workplace persists and, whether it’s conscious or unconscious, research shows it is still prevalent. It also seems to be filtering through to how AI is programmed. Take Professor Vicente Ordonez of the University of Virginia, who found he had accidentally trained his machines to be sexist, associating women with images of kitchens.


Gender bias in the workplace

It is clear that we still need to actively campaign for change and watch out for our unconscious biases. You don’t have to look far to see that gender inequality in the UK still exists. Recent news revealed that Britain registered the largest increase in the European Union’s gender pay gap in 2015, with the difference between male and female hourly wages jumping from 19.7 per cent in 2014 to 20.8 per cent in 2015. In addition to the pay gap widening, women also face unconscious biases on a daily basis. For example, a woman in a group meeting with male colleagues is 33 per cent more likely to be interrupted.

This isn’t to say businesses aren’t taking steps to tackle this. Internally, we have trained staff to look out for, and call out, all kinds of bias. We’ve seen a distinct positive change: within 12 months, the number of manager- and director-level positions held by women increased by 50 per cent.

So a conscious effort to correct workplace inequality shows that change can be made. In 20 years’ time, the UK will have made great leaps forward concerning women in business. The issue is that if AI algorithms are being written with today’s gender bias in place, the AI of our future will be 20 years behind these positive changes and may lead businesses to make inherently biased decisions. Something needs to be done to regulate this.


Accidentally translating bias into AI

AI observes the data that is out in the real world right now. According to research by Unesco, women account for only 29 per cent of those working in scientific research worldwide, and, even more shocking, 96 per cent of inventors in the UK are male. So AI that is programmed today will learn that certain professional fields, such as IT and engineering, are dominated by men. And with AI, what you put in is what you get out.

AI will take these statistics and translate them into biases within its algorithms. When such an AI makes a decision in the future, it may be inherently non-inclusive, even if bias in humans has been eradicated. For example, an AI used to find an outstanding candidate for a position at an engineering company may dismiss female candidates because it has learned that the sector is dominated by men, and therefore concludes that a male candidate is the best fit.
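The "what you put in is what you get out" dynamic can be illustrated with a toy sketch (the hiring records below are invented for illustration, not taken from any real recruitment system): a naive model that learns nothing but the historical hire rate per gender will faithfully reproduce the skew in its training data when scoring two equally qualified candidates.

```python
from collections import Counter

# Invented historical hiring records that encode a biased past:
# outcomes correlate with gender, not with candidate ability.
records = (
    [("male", "hired")] * 80 + [("male", "rejected")] * 20
    + [("female", "hired")] * 20 + [("female", "rejected")] * 80
)

counts = Counter(records)

def hire_rate(gender):
    """A naive 'model' that simply learns the historical hire rate."""
    hired = counts[(gender, "hired")]
    total = hired + counts[(gender, "rejected")]
    return hired / total

# Two equally qualified candidates are scored purely on learned history,
# so the model reproduces the bias it was trained on.
print(hire_rate("male"))    # 0.8
print(hire_rate("female"))  # 0.2
```

Real machine-learning models are far more sophisticated than this frequency table, but the underlying failure mode is the same: if the training data reflects historical inequality, the model treats that inequality as signal.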

There have even been instances of current AI displaying sexist behaviour in basic tools that we use every day. You may have seen the tweet, which went viral recently, highlighting gender bias in Google Translate. When the statement ‘he is a babysitter, she is a doctor’ was translated into Turkish, a gender-neutral language, and then back into English, Google switched the gender associations to ‘she is a babysitter, he is a doctor’.

So, to solve this problem and stop it from getting worse, we need to be able to rely on and trust those in charge of creating and programming these systems. Unfortunately, as we have seen, the technology sector in general, and the specific area of AI programming and development even more so, is dominated by a predominantly young, white and male workforce who are often unaware of their unconscious biases.

Microsoft looked into this, interviewing 11,500 girls and young women across Europe about their interest in STEM subjects. It found that by the age of 15, girls’ interest in STEM begins to decrease; the reasons cited included a lack of female role models and of hands-on experience in the field. Some 60 per cent admitted they would be more willing to pursue a STEM-related career if they knew men and women were equally employed in these professions.


We need to watch the watchers

The outdated stereotype that men are better at STEM subjects than women is still doing serious damage to gender diversity within these industries. In 2016, research from the University of Washington showed that belief in this stereotype is already prevalent among six-year-olds. The scary fact is that if this notion remains ingrained in our society, inherited not only by our children but by our AI too, it has the potential to cause serious damage to the future of workplace equality.

AI is developing at a rapid pace and its potential is exciting, but if it is programmed with any form of bias, the benefits of using AI in business can be completely undone. We don’t want to find ourselves, 20 years down the line, in a situation where human biases are a thing of the past but our machines have become the main perpetrators of inequality.

Governments need to take control sooner rather than later and implement policies that provide a framework to follow when programming AI. My advice to business owners is to tackle the issue at the source: ensure that staff receive conscious and unconscious bias training. If all businesses do this, we may, hopefully, reach a point where there are no biases left to programme into our machines. We must all make sure that we aren’t taking the unfairness in the world and programming it into our future.
