AI Tools Breathe Real-Life Magic Into Deepfake Porn; Netizens Worried


Updated: April 18, 2023 13:31

If you thought Artificial Intelligence (AI) tools were limited to writing and content creation, think again. In what is turning out to be a man-made menace, AI is now being used to violate an individual's dignity with images and clips of non-consensual deepfake pornography.

Deepfakes are videos and images that have been digitally created or altered with AI or machine learning. Porn created using the technology first began spreading across the internet several years ago, when a Reddit user shared clips that placed the faces of female celebrities onto the bodies of porn performers.

Since then, deepfake creators have disseminated similar videos and images targeting online influencers, journalists and others with a public profile. Thousands of such videos exist across a plethora of websites. Some sites even let users create their own images, essentially allowing anyone to turn whoever they wish into a sexual fantasy without consent, or to use the technology to harm former partners.

The problem, experts say, grew as it became easier to make sophisticated deepfakes. And they say it could get worse with the development of generative AI tools that are trained on billions of images from the internet and spit out novel content using existing data.

“The reality is that technology will continue to develop and will continue to become as easy as pushing a button,” said Adam Dodge, the founder of EndTAB, a group that provides training on technology-enabled abuse. “And as long as that happens, people will continue to misuse the technology to harm others, primarily through online sexual violence, deepfake porn and fake nude images.”

Noelle Martin, of Perth, Australia, has experienced that reality. The 28-year-old found deepfake porn of herself 10 years ago when, out of curiosity, she searched Google for an image of herself. To this day, Martin says she doesn't know who created the fake images, or the videos depicting her engaging in sexual intercourse. Horrified, she spent years contacting different websites in an effort to get the material taken down. Some didn't respond. Others removed it, only for her to find it posted again. “You can't win,” she said. Eventually, Martin turned her attention towards legislation, advocating for a national law in Australia that would fine companies 555,000 Australian dollars ($370,706) if they fail to comply with removal notices for such content issued by online safety regulators.

But governing the internet is next to impossible when countries have their own laws and the content is sometimes made halfway around the world. In the meantime, some AI developers say they are curbing access to explicit images.

Some social media companies have also been tightening their rules to better protect their platforms against harmful material. TikTok said last month that all deepfakes or manipulated content showing realistic scenes must be labelled to indicate they are fake or altered in some way, and that deepfakes of private figures and young people are no longer allowed.

The gaming platform Twitch also recently updated its policies around explicit deepfake images after a popular streamer named Atrioc was discovered to have a deepfake porn website open on his browser during a livestream in late January. The site featured phony images of fellow Twitch streamers.
