Twitter user @ellieeewbu has uncovered a quirk in Apple’s categorisation choices for the iPhone’s Photos app, revealing that it has been automatically grouping pictures of bras.

A number of women have since responded to the original tweet, showing results for the term “brassiere” that include a range of images that aren’t entirely safe for work.
It looks like the pictures, all of which are in the users’ own camera rolls, have been catalogued by the app’s image recognition software, in much the same way it groups pictures of, say, beaches or cats.
But having pretty scenes or animals recognised by an AI is one thing; having intimate photos analysed is another. People are understandably unnerved that their phones can categorise images of their own bodies without their knowledge.
In fact, your iPhone has been doing this for over a year. Image recognition was announced in June 2016 and shipped with iOS 10 later that year, with “Brassiere” and “Bra” amongst the 4,432 keywords that Photos is capable of identifying.
Apple has been keen to stress that all of this object identification happens locally, on your device. As The Verge notes, the same can’t be said for Google, whose similar image recognition system for Google Photos runs on the company’s cloud servers.
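To give a rough sense of what on-device classification looks like in practice, here is a minimal Swift sketch using Apple’s public Vision framework (VNClassifyImageRequest, available since iOS 13 and macOS 10.15). It is not the private model the Photos app uses and its label taxonomy is different, but the principle is the same: the image is analysed entirely on the device, and only text labels come out.

```swift
import Foundation
import Vision

// Minimal sketch: classify a local image file entirely on-device.
// Assumes iOS 13+ / macOS 10.15+. This is NOT the Photos app's private
// scene-classification model, just the public Vision equivalent.
func classifyImage(at url: URL) throws -> [(label: String, confidence: Float)] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: url, options: [:])
    try handler.perform([request])

    // Each observation is a label (e.g. "beach", "cat") with a confidence score.
    guard let observations = request.results as? [VNClassificationObservation] else {
        return []
    }
    return observations
        .filter { $0.confidence > 0.3 }          // keep reasonably confident labels
        .map { ($0.identifier, $0.confidence) }
}

// Example usage:
// let labels = try classifyImage(at: URL(fileURLWithPath: "/path/to/photo.jpg"))
// labels.forEach { print("\($0.label): \($0.confidence)") }
```

Nothing in that flow requires a network connection, which is the distinction Apple is drawing against cloud-based systems.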
There’s also the question of why “bra” is on the list while there is no sign of male underwear such as “boxers” or “briefs”. The list of recognised images is vast, but there are nevertheless authored decisions about what does or doesn’t get included. On some level, this is an example of a particular worldview being baked into supposedly neutral software, which can ultimately lead to unintended consequences.