Facebook is accused of discrimination over male-targeted job adverts for mechanics and pilots

Facebook has been accused of discrimination after job adverts for male-dominated roles such as pilots and mechanics were found to be targeted at men on its platform.

Campaign group Global Witness said one of the platform’s algorithms, designed to show vacancies to the most interested candidates, was favouring a certain gender based on stereotypes about the profession.

For example, 96 per cent of people who viewed an advert for a mechanic were men, while 95 per cent of those who saw a nursery nurse posting were women. 

Adverts for psychologists were also far more likely to be shown to women but pilot jobs were mainly targeted at men.

Facebook said it was reviewing the findings and preparing to update its job advert system within weeks.

In the UK, around 99 per cent of mechanics and 91 per cent of pilots are estimated to be male, according to research by the independent careers website Careersmart.

Around 97 per cent of nursery nurses and 87 per cent of psychologists are women.

Key findings 

Global Witness revealed that:

  • 96% of the people shown the ad for mechanic jobs were men
  • 95% of those shown the ad for nursery nurse jobs were women
  • 75% of those shown the ad for pilot jobs were men
  • 77% of those shown the ad for psychologist jobs were women 

However, London-based Global Witness claims Facebook may have breached discrimination laws, and has filed complaints with the Equality and Human Rights Commission (EHRC) and the Information Commissioner.

This is the first time UK authorities have been alerted to problems with the social network’s algorithms, but previous studies in the US have also suggested the AI technology can be discriminatory.  

Critics argue that one of the problems is that tech firms, particularly in Silicon Valley, have workforces dominated by male employees.

‘Big tech workers are mainly young nerdy males with little life experience,’ Noel Sharkey, emeritus professor of artificial intelligence and robotics at the University of Sheffield, told the Telegraph.

‘Many errors of judgement could be avoided with a more diverse tech population.’

Jake Moore, a cyber security specialist at ESET, told MailOnline: ‘Facebook has a business model which is purely focused on maximising its revenue stream from adverts placed in feeds. 

‘There are only so many adverts one Facebook feed can view a day, so they tend to tailor those requested by the client and chosen for a specific user. 

‘However, this latest research clearly demonstrates that there are biases, unconscious or otherwise, involved in the decision making even when it is assumed not to be. 

‘Such errors highlighted in the report show us that there is such bias involved all around us, including in future technologies such as artificial intelligence, which is meant to be completely fair.’ 

Global Witness said that as well as finding that the Facebook algorithm gender-stereotyped jobs, the system also failed to stop campaigners posting deliberately discriminatory job adverts.

AI EXPERT WARNS AGAINST ‘RACIST AND MISOGYNIST ALGORITHMS’ 

A leading expert in artificial intelligence has issued a stark warning against the use of race- and gender-biased algorithms for making critical decisions.

Across the globe, algorithms are beginning to oversee various processes from job applications and immigration requests to bail terms and welfare applications.

Military researchers are even exploring whether facial-recognition technology could enable autonomous drones to identify targets. 

University of Sheffield computer expert Noel Sharkey told The Guardian, however, that such algorithms are ‘infected with biases’ and cannot be trusted.

‘There are so many biases happening now, from job interviews to welfare to determining who should get bail and who should go to jail. It is quite clear that we really have to stop using decision algorithms, and I am someone who has always been very light on regulation and always believed that it stifles innovation,’ Sharkey told the paper in 2019.

‘But then I realised eventually that some innovations are well worth stifling, or at least holding back a bit. So I have come down on the side of strict regulation of all decision algorithms, which should stop immediately.’ 

Calling for a halt on all AI with the potential to change people’s lives, Sharkey advocated vigorous testing before such systems are used in public.

‘There should be a moratorium on all algorithms that impact on people’s lives. Why? Because they are not working and have been shown to be biased across the board.’ 

Facebook’s system approved adverts that discriminated against women and those aged over 55, after requiring only that a box be ticked to confirm compliance with its non-discrimination policy.

In a blog post, Global Witness said: ‘The policy is evidently self-regulatory and self-certified, and it seems from our test that Facebook will accept patently discriminatory targeting for job adverts. 

‘In order to avoid advertising in a discriminatory way, we pulled the ads from Facebook before their scheduled publication date.’

If Facebook is found to have breached the Equality Act, the EHRC can demand that it changes its practices, and potentially take it to court to enforce an order. 

Campaigners have also reported the social network to the UK’s Information Commissioner to investigate whether its ad delivery practices breach data protection laws.

These state that the processing of personal information must not result in discriminatory outcomes. 

A Facebook spokesperson said: ‘Our system takes into account different kinds of information to try and serve people ads they will be most interested in, and we are reviewing the findings within this report. 

‘We’ve been exploring expanding limitations on targeting options for job, housing and credit ads to other regions beyond the US and Canada, and plan to have an update in the coming weeks.’

The report comes just days after Facebook issued a public apology to DailyMail.com and MailOnline for adding an AI-generated label of ‘primates’ to a news video from the website that featured black men.

A Facebook spokesman admitted the error was ‘unacceptable’, telling DailyMail.com: ‘We apologise to anyone who may have seen these offensive recommendations and to the Daily Mail for its content being subject to it.

‘This was an algorithmic error on Facebook and did not reflect the content of the Daily Mail’s post,’ the company said.

Other major technology companies have also faced criticism over racial, gender or age-biased algorithms.

Research found that Twitter’s automated photo-cropping algorithm favours young, feminine and light-skinned faces, while Instagram has previously been accused of discriminating against women based on how the platform promotes its users.
