Artificially intelligent selfie editor FaceApp is once again taking heat from critics after an app update let users apply a filter that changes their apparent race. Following the backlash, FaceApp removed the ethnicity filters on Wednesday, August 9, the same day the update launched. This is not the first time the company has released, then removed, a controversial feature.
FaceApp uses AI and facial recognition to change faces, including aging a face, switching genders, changing expressions, or adding a goatee. The latest “ethnicity change filters” allowed users to upload a photo, of themselves or someone else, and use the software to morph the face into a different ethnicity. The update included 10 face-editing filters, four of which were designed to alter race.
Wow… FaceApp really setting the bar for racist AR with its awful new update that includes Black, Indian and Asian “race filters” pic.twitter.com/Lo5kmLvoI9
— Lucas Matney (@lucasmtny) August 9, 2017
The company’s CEO, Yaroslav Goncharov, defended the update, saying the beauty-enhancing filters leave ethnicity intact, while the ethnicity filters were designed to change faces equally and were even listed in random order. FaceApp users, however, thought the app took things too far with the ethnic filters, and the company removed them in response.
FaceApp took heat earlier this year for its “hotness” filter, an effect that favored, among other things, lighter skin tones. The developer initially renamed the filter before releasing a full fix, apologizing for what the company said resulted from a training bias in the AI program.
The practice of altering faces with software may be inherently prone to controversy, raising questions about who sets beauty standards, which naturally vary among cultures. And FaceApp isn’t the only firm to come under fire for its choice of augmented reality face edits; Snapchat was ridiculed last year for a “yellowface” filter that created Asian caricatures.
Developers are often quick to place the blame on the AI itself. Machine learning software is trained on thousands of example images, but if those examples skew heavily toward, say, white males, the resulting model will be biased toward those faces. Discrimination in AI is the subject of several research projects, and the White House has itself identified such discrimination as a challenge.
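To see how that kind of skew creeps in, here is a minimal numerical sketch, using entirely made-up data and a deliberately naive “model” rather than anything resembling FaceApp’s actual software. It shows how a norm learned from an imbalanced training set ends up reflecting the majority group, so any filter that nudges faces toward that norm alters minority faces far more.

```python
# Hypothetical sketch of training bias (not FaceApp's model or data):
# a "typical face" statistic learned from an imbalanced sample
# inherits the majority group's characteristics.
import numpy as np

rng = np.random.default_rng(0)

# Imagine 90% of training faces come from group A and 10% from group B,
# and the groups differ along some image feature (e.g. average brightness).
group_a = rng.normal(loc=0.8, scale=0.05, size=900)  # majority group
group_b = rng.normal(loc=0.4, scale=0.05, size=100)  # minority group
training_data = np.concatenate([group_a, group_b])

# A naive model that learns "typical" as the mean of its training data
learned_norm = training_data.mean()
print(f"Learned 'typical' feature value: {learned_norm:.2f}")  # ~0.76, close to group A

# A filter that pushes every face toward the learned norm barely moves
# group A faces but shifts group B faces substantially.
print(f"Shift applied to group A: {abs(learned_norm - 0.8):.2f}")
print(f"Shift applied to group B: {abs(learned_norm - 0.4):.2f}")
```

The same dynamic is what FaceApp pointed to when it blamed its “hotness” filter on an unrepresentative training set: the software faithfully learned whatever the data over-represented.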
For now, face-altering apps are probably better off sticking to turning selfies into cats and adding glasses and hats.