New AI Tool Asks for Your Selfies but Then Feeds on Hurting Your Ego


The new AI tool, CLIP Interrogator, analyzes people's selfies and bullies them accordingly

Just as we were beginning to believe that advances in AI image-generation technology might have come to a relative stop, a new AI tool is suddenly roasting people beyond repair based on nothing but their selfies. Clearly, there is no stopping controversial AI technology and its applications.

Every morning in 2022, a new AI tool is released online. The days when internet users created goofy art for Twitter shitposting using DALL-E 2 and DALL-E Mini, which spew racist images, are long gone. In recent months, we've instead seen these technologies put to a rising number of unethical, copyright-violating, and dystopian uses, including winning legitimate art competitions, replacing actual photographers in the publishing sector, and even stealing unique fan art. A new AI tool for selfies has recently been released, and it's a no-good boy. Its creator named it CLIP Interrogator.

The AI generative artist @pharmapsychotic developed CLIP Interrogator, a program that essentially helps determine what a good prompt might be to create new images like an existing one. Take, for example, the AI thief who stole from a Genshin Impact fan artist: they took a screenshot of the artist's live-streamed work-in-progress, fed it into an online image generator to "complete" it, and then posted the AI version of the art on Twitter hours before the original artist finished. The conman then had the gall to claim that the artist had committed theft and to demand credit for their work.

Such a thief could upload the stolen screenshot to CLIP Interrogator and receive a series of text prompts that would help accurately recreate related art using various text-to-image generators like DALL-E Mini. Although the procedure is somewhat laborious, it opens up a vast new world of opportunities for AI-powered theft.

On Twitter, however, users are uploading their selfies to CLIP Interrogator to receive verbal abuse from a machine. The tool labeled three users "Joe Biden as a transgender woman," "highly gendered with haunted eyes," and "beta weak male." Additionally, when shown photographs of women in tank tops, its outputs appeared to reference porn websites in particular. Are we shocked? No, not at all. Disappointed? As always.

We decided to test the technology by uploading some popular pictures of celebrities because we can't exactly trust an AI with our selfies. On the list was Machine Gun Kelly (MGK), the neighborhood vampire badass, Pete Davidson, and, of course, selfie enthusiast Kim Kardashian.

"Error: This program is too busy." appeared after numerous refreshes and dragging minutes, but we kept at it. Finally, using one of MGK's infamous mirror selfies, we got CLIP Interrogator to generate text prompts. The tool spewed: "non-binary, angst white, Reddit, Discord."

The American rapper's friend Davidson, on the other hand, received several labels, including "Yung lean, criminal mugshot, weird core, bulldog, and cursed image." For reference, the image in question is the shirtless photo the Saturday Night Live comedian took to retaliate against Kanye West when he was dating Kim Kardashian. Speaking of Kardashian, the AI tool described her popular diamond jewelry photo as "influenced by Brenda Chamberlain, wearing a kurta, regular distributions, wig."

CLIP Interrogator is "based on OpenAI's Contrastive Language-Image Pre-Training (CLIP) neural network, which was released in 2021, and hosted by Hugging Face, which has devoted some extra resources to deal with the crush of traffic," as stated by Futurism. Further details remain foggy thanks to the tool's heavy traffic.
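At its core, a CLIP-based tool like this scores candidate text phrases by how similar their embeddings are to an image's embedding, then surfaces the best matches as a prompt. Here is a toy sketch of that ranking idea with hand-made vectors standing in for CLIP's encoders; the phrases and numbers are illustrative, not taken from the actual tool or model.

```python
# Toy illustration of CLIP-style prompt ranking: score candidate
# phrases by cosine similarity to an image embedding. The real tool
# uses OpenAI's CLIP encoders; these small vectors are mock stand-ins.
import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: dot product over norms.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_phrases(image_embedding, phrase_embeddings):
    """Return candidate phrases sorted from best to worst match."""
    scores = {
        phrase: cosine_similarity(image_embedding, vec)
        for phrase, vec in phrase_embeddings.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

# Mock "image" embedding and candidate phrase embeddings.
image_vec = np.array([0.9, 0.1, 0.3])
candidates = {
    "mirror selfie": np.array([0.8, 0.2, 0.4]),
    "oil painting": np.array([0.1, 0.9, 0.2]),
    "criminal mugshot": np.array([0.3, 0.1, 0.9]),
}

print(rank_phrases(image_vec, candidates))
# The closest phrase comes first; a real interrogator would join the
# top matches into a text-to-image prompt.
```

The real tool repeats this scoring over large banks of artist names, styles, and descriptors, which is how ordinary selfies end up matched to phrases like "criminal mugshot."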

The only thing we can be certain of is that the roast bot still has a ways to go in terms of prejudice, particularly when users ask it to comment on their photographs. Additionally, the fact that 320 tweets mentioning "CLIP Interrogator" have appeared on Twitter as of today suggests that the technology is here to stay.

It's difficult to determine what this malicious AI hopes to accomplish. Is the developer attempting to reveal how the AI stares back at us, did they design it to be as virulent as possible, or is something else at play?



Analytics Insight
www.analyticsinsight.net