Image of portraits posted to Twitter
Twitter appears, more often than not, to be selecting the faces of white people for thumbnails on the site

Twitter algorithm appears to prefer white people for thumbnails

In a long thread on Twitter, dozens of stunned users are posting strips of photos of people of multiple races - usually several portraits of a Black man and a white man in various arrangements - and in most cases, Twitter seems to pick out the white person to show in the thumbnail.

In attempts to reverse-engineer the algorithm Twitter uses to choose where to focus, users are taking stock photos of people and combining them into a single image arranged as a vertical strip. A strip of nine images of a Black man with a single white man as the second-to-last image? The thumbnail shows the white man’s face.
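As a rough idea of how such a test strip could be put together, here is a minimal sketch using the Pillow library. The file names are hypothetical, and the arrangement mirrors the experiment described above: nine portraits stacked vertically, with a second person inserted second-to-last.

```python
from PIL import Image

def make_strip(paths, width=600):
    """Stack portraits vertically into one tall composite image."""
    images = [Image.open(p).convert("RGB") for p in paths]
    # Resize each portrait to a common width, keeping its aspect ratio.
    resized = [im.resize((width, int(im.height * width / im.width))) for im in images]
    strip = Image.new("RGB", (width, sum(im.height for im in resized)), "white")
    y = 0
    for im in resized:
        strip.paste(im, (0, y))
        y += im.height
    return strip

# Hypothetical file names: nine portraits in total, with "portrait_b.jpg"
# placed second-to-last, as in the experiment described above.
paths = ["portrait_a.jpg"] * 7 + ["portrait_b.jpg", "portrait_a.jpg"]
make_strip(paths).save("strip.jpg")
```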

While not every combination of portraits has the same effect, the results seem to show a significant, although likely unintended, preference. A response from Twitter Comms claims that Twitter performed bias testing before shipping the software and found no evidence of racial or gender bias, but in the same tweet acknowledges that the results seem to indicate a problem exists.

An example of the photos in question - expanding the images reveals the faces not chosen for the thumbnail.

The algorithm doesn’t appear to be position-based - if an image has a Black man at the top and a white man at the bottom, the white man is picked. If the positions are switched so the white man is at the top, he’s still the focus of the thumbnail.

What if things are mixed up a bit? Another experiment with an Asian man, a Black man and a white man has the same result. A Black woman and a white woman? The white woman smiles from the thumbnail.

Swapping positions or adding more instances of people of one race or another doesn’t appear to change the result, although masking the face of the white person does, as does using a photo of a white person with extremely poor lighting and contrast - in those cases the Black person appears in the thumbnail.

Composite images of women of multiple races usually show the same result.

So what could be going on here?

It’s theorised that Twitter attempts to calculate an area of interest for the image based on more than position alone, and the experiments above seem to bear that out. Users in the thread guess that local contrast has an effect, as might the presence of text or any blurriness in the image.
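To make that guess concrete, here is a minimal, purely illustrative heuristic that scores windows of an image by local contrast and picks the highest-scoring one as a crop centre. This is not Twitter’s actual algorithm - only a sketch of the kind of “area of interest” calculation the thread is speculating about, which would also explain why a very low-contrast portrait loses out.

```python
import numpy as np
from PIL import Image

def contrast_crop_center(path, window=64):
    """Pick a crop centre by local contrast - a rough stand-in for a saliency model.

    Each window is scored by the standard deviation of its pixel values,
    and the thumbnail is centred on the highest-scoring window.
    """
    grey = np.asarray(Image.open(path).convert("L"), dtype=np.float32)
    best_score, best_centre = -1.0, (0, 0)
    for y in range(0, grey.shape[0] - window + 1, window):
        for x in range(0, grey.shape[1] - window + 1, window):
            patch = grey[y:y + window, x:x + window]
            score = patch.std()  # local contrast of this window
            if score > best_score:
                best_score, best_centre = score, (y + window // 2, x + window // 2)
    return best_centre

# Example with the hypothetical composite strip from earlier:
# print(contrast_crop_center("strip.jpg"))
```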

The results seem to be, if not a deliberate result of racial bias, another example of artificial intelligence exhibiting bias against minorities as an unintended consequence of training machine learning algorithms on biased data - that is, data made up mostly of white people.

It seems likely that in a photo with a lot going on, the area of interest would be a face - humans are usually more interested in people than in what’s going on behind them. But if those algorithms are fed far more examples of white people when learning what constitutes a face, the generalised concept learnt by the software may be whiter than it should be.
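One simple way to see what “mostly white training data” means in practice is to audit a labelled dataset’s composition. The sketch below uses entirely made-up labels and counts - it only shows what a heavy skew looks like, the kind of imbalance that can leave a learnt face or saliency model performing worse on under-represented groups.

```python
from collections import Counter

def audit_composition(labels):
    """Report how a labelled training set is split across groups."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {group: round(n / total, 3) for group, n in counts.items()}

# Hypothetical annotations for a face-detection training set.
labels = ["white"] * 820 + ["black"] * 90 + ["asian"] * 90
print(audit_composition(labels))  # {'white': 0.82, 'black': 0.09, 'asian': 0.09}
```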
