Have you ever wondered what you’d look like as a peasant from the 1500s? What about a fairy princess living deep in the forest? Or even a cyborg from the future?
If your answer is ‘yes,’ then keep reading.
The latest internet trend has people turning their self-portraits into historical figures, avatars, and hyper-stylized images of themselves with the help of AI-powered image-generation apps. As the trend spreads across TikTok, Twitter, and even Instagram (that's when you know it's exploding!), the ethics of using these programs have come under increasing scrutiny.
The two main apps driving this digital transformation are Lensa and AI Time Machine. For as little as $7, each app takes real photos uploaded by the user and transforms them into hyper-realistic images featuring the user's face in a variety of themes from across the globe. Beyond different time periods, the apps can also transform individuals into Disney characters, anime stars, fairy princesses, warriors, or cyborgs.
A quick TikTok search reveals that the hashtag #AITimeMachine has amassed over 29 million views on the app.
Though results tend to vary from one avatar set to the next, the trend seems to have had a positive impact on the Queer community, with members sharing their gratitude on Twitter.
One user wrote, “I think one of my favourite parts is how AI is reading me as I am, and not as any specific gender. It’s affirming for me.”
Another shared, “These #lensa AI photos have really given me so much euphoria, they really capture and enhance your energy.”
While the trend has brought many people joy, it has also highlighted some of AI's blind spots when it comes to diverse and minority communities. Users have noted that the apps slim down larger-bodied people, and some are warning those who haven't yet taken part to save their time and money rather than pay to be misrepresented.
Others have commented that the AI art generators lack racial nuance, producing portraits that either reflect an overarching “whiteness” or feature exaggerated racialized phenotypes.
Most recently, the online community has raised concerns about the ethics behind using AI art generators like Lensa.
Artist Sarah Hester Ross posted on TikTok to warn users not to use “those AI generators” as they “are taking real artists’ art and they’re replacing our faces with them… with no credit, they’re stealing.”
According to TechCrunch, Lensa works by utilising a Stable Diffusion model, a text-to-image system that generates pictures from written descriptions. The model is trained on publicly available material scraped from the internet, which is technically legal, though some professional artists call these programs a grey area that borders on stealing.
Stable Diffusion is an AI that was "trained" by scouring over 2.3 billion images online, some of which may have been copyrighted, watermarked, or otherwise privately owned. The model then draws on those examples to produce the avatars now all over the FYP. The issue is that artists were never given the chance to opt in or out of having their posted work used to train the AI.
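Mechanically, a diffusion model starts from pure random noise and repeatedly subtracts the noise a trained neural network predicts, with the text prompt steering each step. The toy Python sketch below is purely illustrative: it is not Lensa's or Stable Diffusion's actual code, and the "noise predictor" and "prompt embedding" here are simple stand-ins for what, in the real system, are a large neural network and a text encoder.

```python
import numpy as np

def toy_noise_predictor(x, guidance):
    # Stand-in for the learned denoising network; the real model
    # conditions this prediction on an embedding of the text prompt.
    return 0.1 * x - 0.05 * guidance

def toy_diffusion(prompt_embedding, steps=50, size=(8, 8)):
    rng = np.random.default_rng(0)
    x = rng.standard_normal(size)        # start from pure noise
    for _ in range(steps):
        # Each step removes a little of the predicted noise,
        # gradually "revealing" an image that matches the prompt.
        x = x - toy_noise_predictor(x, prompt_embedding)
    return x

# Hypothetical scalar "prompt embedding"; real models use a text encoder.
img = toy_diffusion(prompt_embedding=1.0)
print(img.shape)  # (8, 8)
```

The key point for the copyright debate is that the network's noise predictions are shaped entirely by the billions of training images, which is why artists argue their work is embedded, uncredited, in every output.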
While this method is legal, artists don’t technically have a say in how their artwork is used. Artists on Twitter like Meg Rae have described this grey area as “a legal loophole to squeeze out artists from the process” to avoid paying licensing fees.
Other concerns raised about this trend include what these apps do with the data they have collected from scanning users’ faces.
Though Lensa claims that it doesn't retain face data or sell uploaded images to third parties (besides company "affiliates," which is questionable in and of itself), it still reserves the right to do whatever it wants with the "outputs it produces from your likeness." Essentially, users are paying to give up the rights to their own likeness.
This is especially concerning for influencers, cosplayers, streamers, and anyone whose image is part of their professional brand. Unfortunately, once a user opts in by using these apps, there is likely little they can do if their AI-generated portrait ends up in digital advertisements or in more sinister content like pornography and deepfakes.
So while these AI generators may be fun for the moment, the long-term implications of the trend may do more harm than good.