Last week, I opened an email and found several photos of myself that sort of looked like me, but also didn't quite. Upon closer inspection, they turned out to be questionable AI-generated depictions of my likeness.
For your information, reader, I look like this:
But according to this new AI, I look like this:
I did not ask for the AI-generated photos to be created. They were emailed to me by WOMBO, an image manipulation company founded in 2020. WOMBO recently launched a new service called ‘WOMBO Me’ that allows users to create AI-generated avatars from a single selfie.
I understand and accept that, because of the nature of my job, my face is all over TikTok and Instagram. But I couldn’t help wondering: to generate those AI photos of me, did someone from the company have to specifically take the time to dig up footage of me online, save it, and upload it to WOMBO?
I’m still deciding if it’s flattering or downright creepy – right now, I’m leaning towards the latter.
Harmless
Ben-Zion Benkhin, CEO of WOMBO, tells me that he had “no idea” that those photos were made and emailed to me.
“The kind of images that WOMBO’s AI can generate, from my perspective, is relatively harmless and fun… it’s the same way that someone could download your photo, stick it into Photoshop, and do things you might not want them to do… I think we’re entering that territory with AI,” he explained.
AI, consent, and privacy
The biggest issue I had with WOMBO’s entire process was that those AI-generated photos of me weren’t made by my own hand; they were created without my knowledge or permission. I may have consented to what EPIK and Snow do with my data, but not WOMBO Me.
WOMBO’s Privacy Policy lays out the scope of the company’s data collection. It’s clear in some places and ambiguous in others. For starters, there seems to be no explicit promise that the app will delete AI-generated images of its users.
“Your likeness and image as contained in animated generated videos will continue to be stored in our database; anonymous, aggregated or de-identified data will be stored as long as necessary for WOMBO’s business purposes,” WOMBO states.
For how long? Benkhin shared with The Chainsaw: “After we no longer need the face data from the user, we delete everything typically in 24-hour cycles. So, every 24 hours, there’s a mass deletion job that eliminates the data we don’t need.”
“We don’t store people’s faces for training or to resell [them] to other people. That’s not something we’ve ever done, and that’s something we won’t ever do,” he added.
Next, WOMBO reminds you multiple times in its policy that ‘WOMBO Me’ works with third-party services that “collect information used to identify you.” But at the same time, the company also makes it clear they:
“[do] not control third party platforms, websites or online services, and WOMBO is not responsible for their actions… WOMBO encourages you to read the privacy policies of these platforms, websites and online services to find out more about how your usage data may be used by these vendors.”
In other words, although WOMBO hands over its users’ data to its corporate partners, it ultimately has no control over what those partners do with that data.
On AI-generated content
Cybersecurity firm Bitdefender published a similar article in 2021 cautioning users about the risks associated with WOMBO. According to Bitdefender, the main security risk with the app “lies in the fact that the app sends the pictures to AWS for processing and additional information to its analytical platform – data that could end up stolen in the event of a cyberattack.”
AI-generated content has flooded the internet. On social media platforms like X, it seems as though photorealistic footage and deepfake videos created with AI go viral every other day.
Some opportunists appear to be using AI to spread misinformation and manipulate public opinion. Celebrities, particularly women, have become unwitting victims of explicit AI-generated photos and videos. My initial reaction to receiving those photos of myself, besides discomfort, was a sense of powerlessness.
However, Ben-Zion Benkhin told The Chainsaw that what users could do on WOMBO was “limited”: “Harmful and dangerous applications of this technology are just not possible [with WOMBO] because the user doesn’t have full flexibility or control over what they can do within the app.
“Beyond that, we expect the entire industry to mature and for new technological solutions to emerge. One simple example is an ‘invisible signature’ that’s embedded inside of a particular media that allows anyone to verify where [something] was generated, how it was made, and so on… which is sort of like a deeper form of provenance,” he noted.