All Jacky Alciné wanted from Google Photos was a space to upload snapshots from his life. But a few weeks ago, what he got from the photo-sharing service was racist insults and a smattering of stories in the national press after photos of him and a female friend were labeled “gorillas” instead of humans.

“That has to be wrong,” thought Alciné, who is black, as he shuffled through his library of photos, only to discover that dozens of other pictures were also labeled “gorilla.” “It’s unexcusable,” said the 22-year-old Brooklyn Web developer. “There’s no reason why this should happen.”

Photo-sharing services Google Photos and Flickr have come under fire recently for software that tags photos of black people as gorillas or apes, dredging up racist attitudes from the colonial era. But in these cases, it isn’t humans seeing with racist eyes; it’s their software.

“At high levels, what’s really going on is it’s just a kid that’s been raised in a particular neighborhood and doesn’t understand things from outside of its world that well,” said Vivienne Ming, a machine learning expert and co-founder of education technology firm Socos.

The problem is likely twofold, experts say. Not enough photos of African Americans were fed into the program for it to learn to reliably recognize black faces. And there probably weren’t enough black people involved in testing the program to flag the issue before it launched.
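
In principle, the first half of that diagnosis is straightforward to check. The Python sketch below audits how different groups are represented in a labeled training set; the records and group labels are hypothetical stand-ins, not Google’s data.

```python
# A minimal sketch of auditing group representation in a training set.
# The records and group labels are invented for illustration; a real
# pipeline would read annotations from the actual dataset.

from collections import Counter

# Each record: (image_path, annotated_group) -- hypothetical examples.
training_records = [
    ("img_0001.jpg", "white"),
    ("img_0002.jpg", "white"),
    ("img_0003.jpg", "black"),
    ("img_0004.jpg", "asian"),
    ("img_0005.jpg", "white"),
]

counts = Counter(group for _, group in training_records)
total = sum(counts.values())
for group, n in counts.most_common():
    print(f"{group:>6}: {n:3d} images ({n / total:.0%})")
# Any group far below its share of real-world users is a warning sign
# that the model may misclassify that group more often.
```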

“We’re appalled and genuinely sorry that this happened,” Google said in a statement about the incident with Alciné. “There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”

Black staffers rare

Machine-learning experts say the issue — along with a surprisingly similar incident at Flickr — highlights a larger problem with diversity in Silicon Valley, where black people are dramatically underrepresented at big companies. Among Google’s U.S. employees, just 1 percent of the people who hold technical jobs are black. To be sure, the problem with Google Photos might have happened no matter who designed it; but theoretically, if more black staffers were plugging their own pictures into the service, someone would have caught the mistake.

“If you have a diverse workforce, then you have a much better chance of picking up on things that a lack of diversity would hide from them,” Ming said.

Both Flickr and Google declined to answer questions about the racial makeup of the engineering teams behind their photo services. But the numbers in the companies’ broader diversity reports provide some clues.

This year, Yahoo (which owns Flickr) said just 1 percent of its U.S. employees with technical jobs are black. In a category called “professionals,” which includes software engineers, black people held 2 percent of the jobs, or 85 of 4,073 employees, according to a 2013 report filed with the U.S. Equal Employment Opportunity Commission. Google showed similar representation, with about 1.7 percent, or 369 black people, out of 22,130 employees in its “professionals” category, according to a 2014 report to the EEOC.

Outed in tweets

On June 28, the day Alciné noticed the “gorilla” tag, a frantic conversation about the error played out with Google engineer Yonatan Zunger in front of a Twitter audience of at least 3,300 people.

“Google Photos, y’all f— up. My friend’s not a gorilla,” Alciné tweeted, adding a screenshot of his Google Photos page with snapshots of his life, including airplane rides and the ill-fated label of his black friend.

“Like I understand HOW this happens; the problem is more so on the WHY. This is how you determine someone’s target market,” Alciné tweeted.

“Holy F—. G+ CA here,” came the Twitter reply from Zunger, Google’s chief architect of social, an hour and a half later. “No, this is not how you determine someone’s target market. This is 100% Not OK.”

Later, Zunger said he didn’t think this particular bug was caused by a lack of diversity on the Google Photos team.

“Team diversity helps: you can catch more of these things,” Zunger tweeted. “But I can tell you we tested this a lot and never saw ‘gorilla’ until (Alciné) spotted it in the wild.”

Zunger said part of the issue is how machines recognize faces in photos. Often, facial recognition starts by identifying around 50 key points, such as cheekbones, eyes and mouths, he tweeted. When images are “blurry or partially obscured,” the software identifies fewer points, leading to mistakes such as identifying humans as dogs or seals.
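
A toy sketch of that failure mode, in Python: a labeler whose confidence depends on how many key points the detector finds. The threshold, point counts and fallback labels here are invented for illustration and are not Google’s actual pipeline.

```python
# A toy illustration of the failure mode Zunger describes: a labeler
# that depends on how many facial key points the detector finds. The
# threshold and fallback labels are invented for illustration only.

import random

TOTAL_KEY_POINTS = 50  # roughly the number Zunger cited

def label_face(points_found: int) -> str:
    """Return a label; with too few key points, the guess gets coarse."""
    if points_found >= 40:
        return "person"  # enough facial structure to be confident
    # Blurry or partially obscured faces yield few points, so the model
    # is effectively guessing among visually similar categories.
    return random.choice(["person", "dog", "seal"])

for n in (50, 45, 20, 8):  # clear photo -> heavily obscured photo
    print(f"{n:2d}/{TOTAL_KEY_POINTS} key points -> labeled {label_face(n)!r}")
```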

How algorithm learns

Flickr’s photo tagging errors happened when its algorithm was fooled by some aspect of an image’s patterns. But when the tag isn’t correct, according to Flickr, users delete inaccurate labels, helping the algorithm “learn from the feedback and the technology becomes smarter and more accurate over time.”
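
A rough Python sketch of that feedback loop: each deleted tag becomes a negative example that drags the tag’s score down until it is no longer suggested. The scoring scheme is invented for illustration; Flickr has not published how its system actually learns.

```python
# A toy sketch of the feedback loop Flickr describes: deleted tags act
# as negative examples. The per-feature scoring scheme is invented for
# illustration and is not Flickr's actual system.

from collections import defaultdict

class TagModel:
    """Toy per-feature tag scores standing in for a learned model."""

    def __init__(self) -> None:
        self.score = defaultdict(lambda: defaultdict(float))

    def suggest(self, feature: str) -> str | None:
        """Offer the best-scoring tag, or nothing if none scores above zero."""
        tags = self.score[feature]
        if not tags:
            return None
        best = max(tags, key=tags.get)
        return best if tags[best] > 0 else None

    def confirm(self, feature: str, tag: str) -> None:
        self.score[feature][tag] += 1.0   # kept tags reinforce the mapping

    def delete(self, feature: str, tag: str) -> None:
        self.score[feature][tag] -= 2.0   # deletions penalize it harder

model = TagModel()
model.confirm("dark_fur_texture", "ape")   # a bad association learned earlier
print(model.suggest("dark_fur_texture"))   # -> 'ape': still being offered
model.delete("dark_fur_texture", "ape")    # the user removes the offensive tag
print(model.suggest("dark_fur_texture"))   # -> None: no longer suggested
```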

“The thing to realize is that these kinds of errors happen all the time. But most aren’t offensive,” wrote Zunger on Twitter, adding that if Google Photos started calling white people lemurs, it wouldn’t prompt the same outrage.

“The only real ‘fix’ is defense in depth, and the attention to jump on issues and fix them when they happen,” Zunger wrote.

In a way, said Hao Li, an assistant professor in computer science at the University of Southern California, it’s like starting out with a baby who has never seen anyone.

“If you provide an immense number of images, it actually achieves a higher level of accuracy,” Li said.

Google fixed the problem with Alciné’s account in about 14 hours, and Alciné, who interned at Google in 2010, still uses Google Photos. But the incident still bothers him, and he commented further on the problem in a blog post published Thursday.

“It’s confusing to me,” Alciné said. “If it was intended to use dark-toned people, maybe they would have had more consideration to add them to their database.”

Photographer Corey Deshon of North Hollywood also got a firsthand glimpse of a photo algorithm gone terribly wrong. In May, a photo he took of a black man was tagged by Flickr as an “ape.” The tag has since been removed.

“It’s unfortunate that this happened, but that’s about as far as I think it goes,” Deshon said via e-mail. “Whenever there’s automation there’s a large margin for error, and this is just another example of that.”

Technology’s downside

Alciné, who is looking for full-time work as a Web developer, noted that technology has made big leaps in recent years, creating tools that make life easier through apps like Instacart (which handles grocery delivery) or Uber. But if, as Alciné said, Google Photos were to label his mother an orangutan, that would be a problem.

“Technology is what we think of the world, and if that’s what it’s showing, we need to sit back and see what we’re thinking about,” Alciné said.

After what happened, Google’s Zunger said he can definitely guarantee one thing.

“Something of this sort will happen again,” he wrote on Twitter in July. “I have no idea what it will be in particular.”

Wendy Lee is a San Francisco Chronicle staff writer. E-mail: wlee@sfchronicle.com Twitter: @thewendylee
