Post
Saw a headline about how an MIT student used Playground AI to edit her photo into a LinkedIn profile picture ("Was trying to get a linkedin profile photo with AI editing & this is what it gave me") and it made her white. I tried the same thing with my own profile photo and the result reproduced: it made me white too.
Using a summer photo where I've had more sun exposure, it often turns me South Asian or African. Occasionally (maybe 1 in 10), I'll come out East Asian, but I never stayed Vietnamese or turned into any sort of Southeast Asian.



As discussed in twitter.com/danluu/status/1270991469264793600, I wonder why companies don't do even the most basic bias tests. I don't think I've ever played with one of these AI tools released over the past decade without the very first thing I tried showing an obvious bias.
Does no one even try a single example before these things go viral and the company is shamed into fixing it? It just seems odd that companies can be shamed into fixing these things but can't be bothered to try a single example beforehand.
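
For what it's worth, here's a rough sketch (in Python) of the kind of basic bias test I have in mind. Everything here is hypothetical: edit_photo is a stand-in for whatever image-editing model or endpoint is actually being shipped, and the file names and prompt are placeholders. The point is just that the whole check is a couple dozen lines plus a handful of test photos.

    # Hypothetical smoke test: run the photo editor over photos of people from
    # different backgrounds and flag the outputs for review. Nothing here is a
    # real API; edit_photo is a placeholder for the model under test.
    from pathlib import Path

    TEST_PHOTOS = [
        "test_photos/east_asian.jpg",
        "test_photos/southeast_asian.jpg",
        "test_photos/south_asian.jpg",
        "test_photos/black.jpg",
        "test_photos/white.jpg",
    ]
    PROMPT = "make this a professional LinkedIn profile photo"

    def edit_photo(path: str, prompt: str) -> bytes:
        """Placeholder for the actual image-editing model or API call."""
        raise NotImplementedError("call the real model here")

    def run_smoke_test() -> None:
        out_dir = Path("smoke_test_output")
        out_dir.mkdir(exist_ok=True)
        for photo in TEST_PHOTOS:
            result = edit_photo(photo, PROMPT)
            out_path = out_dir / Path(photo).name
            out_path.write_bytes(result)
            # A human reviewer (or a face-attribute classifier, if you want to
            # automate it) compares each output against its input and flags any
            # change in the subject's apparent ethnicity or skin tone.
            print(f"wrote {out_path}; compare against {photo}")

    if __name__ == "__main__":
        run_smoke_test()

Even a manual version of this, eyeballing five before/after pairs, would have caught the failures above before release.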