Illustration by Alex Castro / The Verge
It’s easy to be alarmist about facial recognition, but at the end of the day, it’s a simple matching system. The algorithm establishes a set of features and looks for matches, just like a search engine or a speech-to-text system. The problems come with how people use that capability — whether it’s commercial marketing or, more controversially, police surveillance.
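To make the "simple matching system" idea concrete, here is a minimal, illustrative sketch: a face is reduced to a feature vector (an embedding), and recognition is just a nearest-neighbor search over known vectors. The names, vectors, and threshold below are invented for illustration; a real system would use learned embeddings and a much larger gallery.

```python
# Illustrative sketch only: facial recognition as nearest-neighbor
# matching over feature vectors. All data here is made up.
import math

def cosine_similarity(a, b):
    """Similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, gallery, threshold=0.9):
    """Return the gallery identity most similar to the probe, or None
    if nothing clears the (arbitrary, hypothetical) threshold."""
    name, score = max(
        ((n, cosine_similarity(probe, v)) for n, v in gallery.items()),
        key=lambda t: t[1],
    )
    return name if score >= threshold else None

# Hypothetical enrolled identities and their feature vectors.
gallery = {"alice": [0.9, 0.1, 0.3], "bob": [0.2, 0.8, 0.5]}
print(best_match([0.88, 0.12, 0.31], gallery))  # prints "alice"
```

The threshold is the policy lever: lower it and the system returns more (and weaker) matches, which is exactly where misuse like the photo manipulation described below becomes consequential.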
Those problems get particularly bad when the system isn’t used the way it was designed. A recent Georgetown study found that NYPD officers deliberately manipulate images before feeding them into the system, often pasting in stock eyes or mouths to bring a photo in line with the system’s standards. In one particularly egregious case, officers actually used a picture of…
from The Verge – All Posts https://ift.tt/2JWl9tk