Study Finds Racial Bias in Amazon’s Facial Recognition Tech
‘In light of this research, it is irresponsible for the company to continue selling this technology to law enforcement or government agencies,’ wrote one of the study’s researchers. However, Amazon disputes the results and claims the study is misleading the media.
Amazon’s facial recognition product, which the company has been marketing to police departments, may have serious bias against women of color.
A new study published on Thursday looked at whether the company’s Rekognition product can accurately identify a person’s gender. On photos of men, Amazon’s system was nearly flawless. But not so much when it came to women with dark skin tones: The Rekognition product misclassified their gender 31 percent of the time.
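That 31 percent figure is a per-subgroup misclassification rate: the share of photos in each demographic group whose predicted gender disagrees with the label on file. A minimal sketch of that calculation (the subgroup names and rows below are illustrative placeholders, not the study’s actual data):

```python
from collections import defaultdict

# Hypothetical audit records: (subgroup, labeled gender, predicted gender).
# The study grouped subjects by skin type and gender; these rows are
# placeholders standing in for one row per test photo.
predictions = [
    ("darker-skinned women", "female", "male"),
    ("darker-skinned women", "female", "female"),
    ("lighter-skinned men", "male", "male"),
]

counts = defaultdict(lambda: [0, 0])  # subgroup -> [errors, total]
for subgroup, truth, predicted in predictions:
    counts[subgroup][1] += 1
    if predicted != truth:
        counts[subgroup][0] += 1

for subgroup, (errors, total) in counts.items():
    print(f"{subgroup}: {errors / total:.0%} misclassified ({errors}/{total})")
```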
The study came from Joy Buolamwini of the MIT Media Lab and Deborah Raji of the University of Toronto. A year ago, the two researchers tested the facial recognition products from Microsoft and IBM, and found the systems also struggled to accurately identify the gender of darker-skinned women. In response, both Microsoft and IBM updated their facial recognition technology to reduce the error rates.
Buolamwini is now calling on Amazon to address the bias of the company’s Rekognition product, amid growing worries the technology is both error-prone and ripe for abuse. "In light of this research, it is irresponsible for the company to continue selling this technology to law enforcement or government agencies," Buolamwini wrote in a separate blog post.
However, Amazon is dismissing the study, calling its results inaccurate: the researchers used the Rekognition product to conduct "facial analysis" rather than true "facial recognition," according to Matt Wood, the general manager of artificial intelligence at Amazon Web Services.
In facial analysis, the computer system is trying to assign generic attributes to a picture, such as whether the person shown is wearing glasses, has a mustache, or may be female. Recognition is different; it focuses on trying to find matching photos of a particular face, like scouring through a large collection of mugshots and plucking out the ones that look like you.
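In API terms, the two operations Wood distinguishes correspond to different Rekognition calls: DetectFaces performs facial analysis (returning attributes such as an estimated gender), while CompareFaces performs recognition (matching a specific face against another image). A minimal boto3 sketch of the distinction, assuming configured AWS credentials; the image filenames are placeholders:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("portrait.jpg", "rb") as f:  # placeholder image
    portrait = f.read()

# Facial *analysis*: assign generic attributes (gender, glasses,
# mustache, ...) to whatever faces appear in a single image.
analysis = rekognition.detect_faces(Image={"Bytes": portrait},
                                    Attributes=["ALL"])
for face in analysis["FaceDetails"]:
    print(face["Gender"]["Value"], face["Gender"]["Confidence"])

# Facial *recognition*: look for matches to a particular face, here
# against a second image (or, via SearchFacesByImage, an indexed
# collection such as a set of mugshots).
with open("candidate.jpg", "rb") as f:  # placeholder image
    candidate = f.read()

matches = rekognition.compare_faces(
    SourceImage={"Bytes": portrait},
    TargetImage={"Bytes": candidate},
    SimilarityThreshold=80,
)
for match in matches["FaceMatches"]:
    print("match similarity:", match["Similarity"])
```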
" Facial analysis and facial recognition are completely different in terms of the underlying technology and the data used to train them," Wood said in a 1500-word blog post (Opens in a new window) on Saturday addressing the study. "Trying to use facial analysis to gauge the accuracy of facial recognition is ill-advised, as it’s not the intended algorithm for that purpose."
Wood also takes issue with the study’s timing: it was conducted in August on an older version of Rekognition, which the company updated in November to more accurately perform both facial analysis and facial recognition. In an internal test, Amazon found "no significant difference in accuracy with respect to gender classification," he said.
"The research papers implies that Amazon Rekognition is not improving, and that AWS is not interested in discussing issues around facial recognition. This is false," he said. " We are acutely aware of the concerns around facial recognition, and remain highly motivated and committed to continuous improvement."
But Buolamwini is pushing back against Amazon’s claims. "If you sell one system that has been shown to have bias on human faces, it is doubtful your other face-based products are also completely bias free," she wrote in her own 3000-word blog post, which responds to the company’s criticism of her study.
The co-author of the study, Deborah Raji, also told PCMag that their tests of the Rekognition system occurred under favorable conditions, with pictures of the subjects that were clear and easy to view. "This is a demonstration of how badly the technology fails in even incredibly easy cases," she said in an email.