Flickr's new auto-tagging feature is generating some pretty offensive results.
The photo-sharing site, which is owned by Yahoo (YHOO), launched image-recognition software two weeks ago that automatically creates tags for photos. The goal was to make images easier to search on the site.
For example, a black and white photo of London's Tower Bridge was automatically labeled "architecture," "outdoor," "city," "monochrome" and "skyline."
But computer algorithms aren't perfect, and when they identify images incorrectly, the results can be disastrous.
Some concentration camp photos received inappropriate tags, including "sport" and "jungle gym."
Flickr had also been tagging some images of people as "ape" and "animal," including a photo of a black man named William taken by photographer Corey Deshon, according to the Guardian.
The photo service had also labeled a white woman wearing face paint as "ape" and "animal," which suggests Flickr's algorithm is not taking a person's skin color into account when auto-tagging them.
Flickr has since corrected both of those mistakes, but the concentration camp errors remain.
"We are aware of issues with inaccurate auto-tags on Flickr and are working on a fix," a spokesman for Flickr said in a statement. "While we are very proud of this advanced image-recognition technology, we're the first to admit there will be mistakes and we are constantly working to improve the experience."
Flickr noted that deleting incorrect tags helps the new algorithm learn from its mistakes and improve its results in the future. The company also said that Flickr staff does not personally tag photos; the process is entirely automated.