Image recognition is a set of technologies that has seen great progress recently. Some applications improve efficiency, for example identifying items that may be debris so they can be removed from an agricultural field. Others improve accuracy, for example identifying tumors in cancer screens more reliably than human experts. And some offer convenience, for example letting users unlock their devices with their faces rather than passwords.
These applications can lead to good outcomes, but they can also have unintended consequences.
On that note, at the start of the recent wave of protests in Hong Kong, a journalist opened an article with a striking summary of how important facial recognition had become:
“The police officers wrestled with Colin Cheung in an unmarked car. They needed his face.
“They grabbed his jaw to force his head in front of his iPhone. They slapped his face. They shouted, ‘Wake up!’ They pried open his eyes. It all failed: Mr. Cheung had disabled his phone’s facial-recognition login with a quick button mash as soon as they grabbed him.”
Our fear of facial-recognition misuse seems legitimate. The technology suddenly makes possible, at scale, things that would previously have been difficult or costly.
But what about subtler abuses?
That brings me to a new report, titled “Facial recognition technology can expose political orientation from naturalistic facial images.”
From the report:

Continue reading “The Shape of Faces to Come (Facial Recognition and Political Orientation)”