TechUpdate: ConsenSys Remark Added to Deepnude Website Article

A website that “nudifies” photographs for a price, discovered through WIRED reporting, also publishes a feed that appears to show user uploads. The uploads included pictures of young girls and photos that appeared to have been taken by strangers.

Collage showing a cursor and an image of teenagers sitting near a pond and one of them has skin colored pixels over her…

In Short

  • Update on ConsenSys’ remark in relation to the deepnude website.
  • Discussion of AI-generated nude photos and cryptocurrency wallet usage.
  • Impact on online privacy and security.
  • Implications for internet users and communities.

TFD – Discover the recent update regarding ConsenSys’ involvement with the deepnude website. Explore the implications of AI-generated nude photos, cryptocurrency wallets, and online privacy on TheFoxDaily.

As AI-powered image generators have become more accessible, so have websites that digitally remove the clothes of people in photos. One of these sites has an unsettling feature that provides a glimpse of how these apps are used: two feeds of what appear to be photos uploaded by users who want to “nudify” the subjects.

The photo feeds present a startling exhibition of the site's targets. WIRED saw images of girls who were clearly children, as well as pictures of adults with captions identifying them as female acquaintances or strangers. Visitors who are not logged in see no AI-generated nude photographs on the website's homepage.

To generate and store deepfake nude photos, users must sign in to the website with a cryptocurrency wallet. Pricing isn't currently stated on the website, but in a 2022 video uploaded to an associated YouTube page, viewers could purchase credits to generate deepfake nude photos, with the first credit costing $5. WIRED discovered the website through a post on a subreddit that linked to the YouTube page and to a listing on the NFT marketplace OpenSea. Reddit told WIRED that the user had been banned, and YouTube confirmed that it had closed the channel after being contacted by WIRED.

To protect the women and girls who remain visible in its feeds, WIRED is not naming the website, which is still accessible online. The site, which went live in February 2022, is hosted by Cloudflare, an infrastructure and internet security provider. Asked about the company's role, spokesperson Jackie Dutton said Cloudflare does not host a website's content; it only provides the site's IP address.

WIRED informed the National Center for Missing & Exploited Children, which helps report cases of child exploitation to law enforcement, about the site's existence.

AI developers OpenAI and Stability AI say their image generators are for commercial and artistic uses and have safeguards in place to block inappropriate content. But open source image-generation models have grown increasingly capable, and creating pornography is one of their most common applications. As image generation has become more accessible, the problem of nonconsensual deepfake nudes, which overwhelmingly target women, has grown worse and more pervasive. In what appears to be the first case of its kind, two Florida teenagers were arrested earlier this month for allegedly creating and sharing AI-generated nude photos of their middle school classmates without consent, WIRED reported.

The deepnude website exposes a sobering reality, says Mary Anne Franks, a professor at the George Washington University School of Law who has studied nonconsensual explicit imagery: far more incidents involving AI-generated nude photos of minors and nonconsenting women exist than the public ever learns of. The few public examples came to light only because the photos were shared within a community and someone heard about it and raised the alarm.

According to Franks, “there will be a plethora of sites like this that are impossible to track down, and most victims are unaware that they have been harmed until someone happens to flag it for them.”

Feeds of what appear to be user-submitted photos occupy two pages of the website, labeled “Home” and “Explore.” From several of the images, it was evident that the girls were not older than 18.

One picture showed a young child leaning against a tree, a flower in her hair. Another showed a girl in what looks like a high school or middle school classroom; the image, which appears to have been taken covertly by a classmate, is captioned “PORN.”

Another photo on the website showed two girls and a boy taking a selfie in what appeared to be a school gymnasium. The girls smiled and posed for the photo; the trio of young teens appeared to be in middle school. The boy's features were obscured by a Snapchat lens that enlarged his eyes until they covered his face.

Captions on the apparently uploaded photos indicated they showed romantic partners, friends, and classmates. One caption, on an image of a young woman taking a selfie in front of a mirror, reads, “My gf.”

Many of the pictures showed influencers who are well known on TikTok, Instagram, and other social media platforms. Some appeared to be screenshots of Instagram posts in which people shared moments from their daily lives. In one, a smiling young woman held a dessert topped with a celebratory candle.

A few photos seemed to show subjects who were complete strangers to whoever took the picture. One, shot from behind, showed a girl or woman who was not posing for a photo, standing near what appeared to be a tourist attraction.

A few of the photos in the feeds WIRED saw were cropped to remove the subjects' faces, leaving only the chest or crotch visible.

WIRED monitored the website for eight days, during which three new photographs appeared on the Explore page and five new images of women appeared on the Home feed. According to statistics displayed on the website, most of these photos received hundreds of “views.” It's unclear how views are tallied, or whether every photo uploaded to the website appears in the Explore or Home feeds. Every post on the Home feed has several dozen views.

The top entries on the website's “Most Viewed” list are photos of celebrities and of people with large Instagram followings. The site's most viewed individuals of all time are actress Jenna Ortega (more than 66,000 views), singer-songwriter Taylor Swift (more than 27,000 views), and a Malaysian influencer and DJ (more than 26,000 views).

Ortega and Swift have already been targeted with deepfake nudes. The release of fake Swift nudes on X in January reignited conversation about the effects of deepfakes and the need for stronger legal protections for victims. This month, NBC reported that Meta had hosted ads for a deepnude app for seven months; the app boasted of its ability to “undress” people, using a picture of Jenna Ortega taken when she was 16 years old.

No federal law in the United States prohibits the distribution of fake, nonconsensual nude photos, though a handful of states have passed their own legislation. Still, Jennifer Newman, executive director of NCMEC's Exploited Children's Division, says that AI-generated nude photos of children fall into the same category as other child sexual abuse material, or CSAM.

“We consider it to be child sexual abuse material if it is identical to an image of a real child, a live victim,” Newman says. “And we will treat it as such as we process our reports and forward them to law enforcement.”

According to Newman, NCMEC had 4,700 reports in 2023 that “somehow connect to generative AI technology.”

To create and save deepfake nude photos, the website requires users to sign in with a Coinbase, Metamask, or WalletConnect cryptocurrency wallet. Coinbase spokesperson McKenna Otterstedt said the company is conducting an internal investigation into the website's connection with its wallet. Metamask is owned by ConsenSys; the company says it was unaware of the site prior to WIRED's reporting but has now launched an investigation: “We will need to determine how our Terms of Use are implicated and what steps would be appropriate to ensure the safety of our users and the broader web3 ecosystem.”
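For readers unfamiliar with wallet-based logins: rather than a username and password, a site asks the wallet to cryptographically sign a message, and the server checks the signature to confirm the visitor controls that wallet address. The site's own code is not public; the sketch below is a generic illustration of this pattern using the ethers.js library, and every identifier in it (the message text, the function names) is an assumption for illustration only.

```ts
// Generic sketch of message-signature wallet login (ethers.js v6) —
// not the deepnude site's actual code; all identifiers are illustrative.
import { ethers } from "ethers";

// A server would normally embed a one-time nonce in this message.
const SIGN_IN_MESSAGE = "Sign in to example.app. Nonce: 8f3a91";

// Browser side: ask an injected wallet (e.g., MetaMask) for a signature.
async function requestSignature(): Promise<{ address: string; signature: string }> {
  const provider = new ethers.BrowserProvider((window as any).ethereum);
  const signer = await provider.getSigner();
  const address = await signer.getAddress();
  const signature = await signer.signMessage(SIGN_IN_MESSAGE);
  return { address, signature };
}

// Server side: recover the address that signed the message and
// compare it to the address the client claims to control.
function verifyLogin(address: string, signature: string): boolean {
  const recovered = ethers.verifyMessage(SIGN_IN_MESSAGE, signature);
  return recovered.toLowerCase() === address.toLowerCase();
}
```

Because only a wallet address changes hands, a login like this collects no email or other identity information, which may be part of the appeal for a site courting anonymity.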

WalletConnect did not respond to a request for comment.

A November 2022 video on the deepnude website's YouTube channel claimed that viewers could “buy credit” with a Visa or Mastercard. WIRED reached out to both card networks, but neither responded.

Thirty NFTs featuring unedited, non-deepfaked photos of various female Instagram and TikTok influencers were listed on OpenSea, a marketplace for NFTs. According to a web archive, in the website's early stages owners could gain access by purchasing one of the NFTs with the cryptocurrency ether, in an amount currently valued at $280. The listings stated that for the site's users, “privacy is the ultimate priority.”

The NFTs were categorized using tags describing the women's perceived attributes. The categories included Boob Size, Country (most of the women were listed as being from Malaysia or Taiwan), and Traits, with labels like “cute,” “innocent,” and “motherly.”

None of the account's listed NFTs ever sold. Within 90 minutes of WIRED contacting OpenSea, the listings and the account were removed. None of the women featured in the NFTs responded to requests for comment.

It is unclear who, or how many people, created and owns the deepnude website. The profile picture of the now-deleted OpenSea account was identical to the third Google Images result for “nerd.” According to the account bio, the creator's credo is to “reveal the shitty thing in this world” and then share it with “all douche and pathetic bros.”

An X account linked from the OpenSea account used the same bio, and it pointed to a now-defunct blog about “Whitehat, Blackhat Hacking” and “Scamming and Money Making.” The account's owner, who went by the handle 69 Fucker, appears to have been one of three bloggers for the site.

Only one person, whose profile photo showed an East Asian man who appeared to be under 50, promoted the website on Reddit. A March 2022 archive of the website, meanwhile, states that it “was created by 9 horny skill-full people.” Most of the profile photos appeared to be stock images, and the job titles were all jokes, among them Scary Stalker, Booty Director, and Horny Director.

An email address associated with the website did not respond to requests for comment.

Update, 3/25/2024, 3:05 PM EST: The ConsenSys comment has been added to this article.

Conclusion

The use of MetaMask, a ConsenSys-owned wallet, to sign in to the deepnude website raises concerns about AI-generated nude photos and online privacy. Stay vigilant and informed about emerging technologies and their implications on TheFoxDaily.

— ENDS —

Connect with us for the latest breaking news updates and videos from thefoxdaily.com. For the most recent news in the United States and around the world, in business, opinion, technology, politics, and sports, follow TheFoxDaily on X, Facebook, and Instagram.
