Fake nude photography

From Wikipedia, the free encyclopedia

Fake nude photography is the creation of nude photographs designed to appear as genuine nudes of an individual.[1][2] The motivations for the creation of these modified photographs include curiosity, sexual gratification, the stigmatization or embarrassment of the subject, and commercial gain, such as through the sale of the photographs via pornographic websites.[1][3][4][5][6] Fakes can be created using image editing software or through machine learning. Fake images created using the latter method are called deepfakes.

History

Magazines such as Celebrity Skin published non-fake paparazzi shots and illicitly obtained nude photos, showing there was a market for such images.[7] Subsequently, some websites hosted fake nude or pornographic photos of celebrities, which are sometimes referred to as celebrity fakes. In the 1990s and 2000s, fake nude images of celebrities proliferated on Usenet and on websites, leading to campaigns to take legal action against the creators of the images[8][9] and to websites dedicated to determining the veracity of nude photos.[10] "Deepfakes", which use artificial neural networks to superimpose one person's face onto an image or video of someone else, were popularized in the late 2010s, leading to concerns about the technology's use in fake news and revenge porn.[11][12]

Fake nude photography is sometimes confused with deepfake pornography, but the two are largely distinct. Fake nude photography typically starts with human-made non-sexual images and merely makes it appear that the people in them are nude (but not having sex). Deepfake pornography typically starts with human-made sexual (pornographic) images or videos and alters the performers' facial features so that the participants in the sexual act appear to be someone else.

DeepNude

In June 2019, a downloadable Windows and Linux application called DeepNude was released which used a generative adversarial network to remove clothing from images of women. The images it produced were typically not pornographic, merely nude. Because more images of nude women than of nude men were available to its creator, the bodies it produced were all female, even when the original subject was male. The app was available in free and paid versions.[13] A few days later, on June 27, the creators removed the application and refunded customers, although various copies of the app, both free and paid, continue to circulate.[14] An open-source version of the program called "open-deepnude" was deleted from GitHub.[15] The open-source version had the advantage that it could be trained on a larger dataset of nude images to increase the accuracy of the resulting images.[16] A free-software successor, Dreamtime, was later released, and some copies of it remain available, though others have been suppressed.

Deepfake Telegram Bot

In July 2019, a deepfake bot service was launched on the messaging app Telegram that used AI technology to create nude images of women. The service was free and enabled users to submit photos and receive manipulated nude images within minutes. It was connected to seven Telegram channels, including the main channel hosting the bot, technical support channels, and image-sharing channels. While the total number of users was unknown, the main channel had over 45,000 members. As of July 2020, an estimated 24,000 manipulated images had been shared across the image-sharing channels.[17]

Nudify websites

By late 2024, most ways to produce nude images from photographs of clothed people were accessible at websites rather than in apps, and required payment.[18]

Purposes

"They are fake nudes, altered in Photoshop, and it is one of many tactics that has been used to silence me."
— Diane Rwigara, speaking to CNN[6]

The reasons for creating fake nude photos range from a desire to discredit the target publicly, to personal hatred of the target, to the promise of financial gain for the creator of such photos.[1][3][4]

Fake nude photos often target prominent figures such as businesspeople and politicians.[6] This was the case for Diane Rwigara, a candidate in Rwanda's 2017 presidential election.[6]

Notable cases

In 2010, 97 people were arrested in South Korea for spreading fake nude pictures of the group Girls' Generation on the internet.[19] In 2011, a 53-year-old man from Incheon was arrested for spreading further fake pictures of the same group.[20][21][22]

In 2012, South Korean police identified 157 Korean celebrities of whom fake nude images were circulating.[23]

In 2012, after fake nude photos of Liu Yifei spread online, her management company, Red Star Land, announced that it would pursue legal action to find out who had created and released the photos.[24][25]

In the same year, nude photos of Chinese actor Huang Xiaoming were released and sparked public controversy, but they ultimately proved to be real pictures.[26]

In 2014, supermodel Kate Upton threatened to sue a website for posting fake nude photos of her.[27] Taylor Swift had previously threatened the same website in 2011.[28]

In November 2014, singer Bi Rain reacted angrily to a fake nude photo that spread across the internet alongside the claim that it had been "released from Kim Tae-hee's lost phone". His label, Cube Entertainment, stated that the person in the photo was not Rain and that it would take strict legal action against those posting the photo together with false comments.[29][30]

In July 2018, Seoul police launched an investigation after a fake nude photo of President Moon Jae-in was posted on the website of the Korean radical feminist group WOMAD.[31]

In early 2019, Alexandria Ocasio-Cortez, a Democratic politician, was targeted by political opponents with a fake nude photo purportedly showing her in a bathroom. The picture generated a wave of media controversy in the United States.[32][33][34][35]

Methods

Fake nude images can be created using image editing software or neural network applications.[12][36] There are two basic methods:[37]

  • Combine and superimpose existing images onto source images, adding the face of the subject onto a nude model's body.[19]
  • Remove clothes from the source image to make it look like a nude photo.[1][3][38]

Impact

Images of this type may have a negative psychological impact on the victims and may be used for extortion purposes.[1][39][40]


References

  1. ^ a b c d e "Phát hoảng với nạn "fake sĩ"" [Terrified with "fake photo maker"]. An ninh thế giới (in Vietnamese). August 27, 2012. Retrieved June 30, 2019.
  2. ^ "Phát hoảng vì trò fake ảnh tục". Người Đưa Tin (in Vietnamese). December 27, 2012. Retrieved June 30, 2019.
  3. ^ a b c "Phát hoảng vì trò fake ảnh tục". baodatviet.vn (in Vietnamese). Archived from the original on June 30, 2019. Retrieved June 30, 2019.
  4. ^ a b "Kerala woman wins battle against fake nude pictures". The Times of India. November 26, 2018. Retrieved June 30, 2019.
  5. ^ Ohlheiser, Abby (January 11, 2019). "A nude-photo hoax was supposed to silence Alexandria Ocasio-Cortez. Instead, she turned up the volume". The Washington Post. Retrieved June 30, 2019.
  6. ^ a b c d Busari, Stephanie; Idowu, Torera (August 5, 2017). "Fake nude photos were used to 'silence me', disqualified Rwandan candidate says". CNN. Retrieved June 30, 2019.
  7. ^ P. David Marshall; Sean Redmond (14 October 2015). "Exposure: The Public Self Explored". A Companion to Celebrity. Wiley. pp. 510–12. ISBN 978-1-118-47492-1.
  8. ^ Richard A. Spinello; Herman T. Tavani (2004). Readings in Cyberethics. Jones & Bartlett Learning. p. 209. ISBN 978-0-7637-2410-8.
  9. ^ Jeff Walls (21 Aug 1999). "Why every star is naked on the Net: It may be Sandra Bullock's face, but everything else below her neck belongs to someone else". National Post – via ProQuest. And taking her anti-faking crusade to the artists' virtual turf, the mother of actress Alyssa Milano has launched a counter-site, www.cyber-tracker.com, to "empower celebrities to take back" their images. Lin Milano contends she combines "sensitivity to the celebrity with the toughness required to make a serious impact on the Web pornographers".
  10. ^ Kushner, David (November 2003). "These Are Definitely Not Scully's Breasts". Wired. Vol. 11, no. 11. Retrieved 2009-05-19.
  11. ^ Cole, Samantha (11 Dec 2017). "AI-Assisted Fake Porn Is Here and We're All Fucked". Motherboard. Retrieved 27 November 2019.
  12. ^ a b Cole, Samantha (24 Jan 2018). "We Are Truly Fucked: Everyone Is Making AI-Generated Fake Porn Now - VICE". Motherboard. Retrieved 27 November 2019.
  13. ^ Cole, Samantha; Maiberg, Emanuel; Koebler, Jason (26 June 2019). "This Horrifying App Undresses a Photo of Any Woman with a Single Click". Vice. Archived from the original on 2 July 2019. Retrieved 2 July 2019.
  14. ^ Vincent, James (3 July 2019). "DeepNude AI copies easily accessible online". The Verge. Archived from the original on 8 February 2021. Retrieved 11 August 2023.
  15. ^ Cox, Joseph (July 9, 2019). "GitHub Removed Open Source Versions of DeepNude". Vice Media. Archived from the original on September 24, 2020. Retrieved December 15, 2019.
  16. ^ Redmon, Jennifer (July 7, 2019). "DeepNude- the AI that 'Undresses' Women- is Back. What Now?". Cisco. Archived from the original on March 1, 2023. Retrieved March 11, 2023.
  17. ^ Hao, Karen (2020-10-20). "A deepfake bot is being used to "undress" underage girls". MIT Technology Review. Archived from the original on 2023-04-20. Retrieved 2023-04-20.
  18. ^ Rody (2024-08-23). "24 DeepNude Alternatives (Tested Free & Paid)". Retrieved 2024-08-31.
  19. ^ a b "Nhóm nhạc SNSD dính nghi án ảnh khỏa thân". Nld.com.vn (in Vietnamese). Retrieved June 30, 2019.
  20. ^ Phi Yến (February 22, 2012). "Bắt được nghi can tung ảnh nude giả của SNSD". Thanh Niên (in Vietnamese). Retrieved June 30, 2019.
  21. ^ Thảo Linh (February 22, 2012). "Bắt nghi can tung ảnh nude giả của Girls Generation". Dân Việt. Retrieved June 30, 2019.
  22. ^ "Bắt nghi can tung ảnh nude giả của sao Hàn". Nld.com.vn (in Vietnamese). Retrieved June 30, 2019.
  23. ^ Quỳnh An (April 3, 2012). "157 sao nữ bị phát tán ảnh "đen"". ngoisao.vn (in Vietnamese). Retrieved June 30, 2019.
  24. ^ Hải Lan (December 13, 2012). "Lưu Diệc Phi bị hãm hại bằng ảnh nóng - VnExpress Giải Trí". vnexpress.net. Retrieved June 30, 2019.
  25. ^ Hàn Giang (December 12, 2012). "Ảnh Lưu Diệc Phi khoả thân tràn ngập web đen". Dân Việt (in Vietnamese). Retrieved June 30, 2019.
  26. ^ Duy Tại (June 9, 2012). "Huỳnh Hiểu Minh bị phát tán ảnh nude 'dỏm mà như thật'". IONE.VNEXPRESS.NET (in Vietnamese). Retrieved June 30, 2019.
  27. ^ Nguyen Thuy (September 2, 2014). "Kate Upton quyết truy đuổi bất kỳ ai tung ảnh khỏa thân của mình". thanhnien.vn (in Vietnamese). Retrieved June 30, 2019.
  28. ^ M. Khuê (March 10, 2014). "Bị tung ảnh khỏa thân giả, Kate Upton dọa kiện". Người lao động (in Vietnamese). Retrieved June 30, 2019.
  29. ^ "K-pop star Rain angered by his fake nude photo". The Korea Herald. November 15, 2014. Retrieved December 10, 2022.
  30. ^ "Rain sues over allegedly fake nude pics". The Korea Times. 2014-11-14. Archived from the original on December 28, 2023.
  31. ^ The Korea Times, Police probe fake nude photo of President Moon on the radical feminist website
  32. ^ Gold, Michael (January 10, 2019). "The Latest Smear Against Ocasio-Cortez: A Fake Nude Photo". The New York Times. Retrieved August 11, 2023.
  33. ^ Pilkington, Ed (January 10, 2023). "Alexandria Ocasio-Cortez hits out at 'disgusting' media publishing fake nude image". The Guardian. Retrieved August 11, 2023.
  34. ^ Alptraum, Lux (January 14, 2019). "Opinion | The Real Naked Selfies Are Coming". The New York Times. Retrieved June 30, 2019.
  35. ^ Daniel Moritz-Rabson (January 7, 2019). "Fake nude photo of Alexandria Ocasio-Cortez debunked by foot fetishist". Newsweek. Retrieved June 30, 2019.
  36. ^ P. David Marshall (31 October 2016). The Celebrity Persona Pandemic. University of Minnesota Press. pp. 31–. ISBN 978-1-4529-5226-0.
  37. ^ Verma, Pranshu (2023-11-04). "AI fake nudes are booming. It's ruining real teens' lives". Washington Post. ISSN 0190-8286. Retrieved 2023-12-06.
  38. ^ "Phương Trinh: 'Tôi mà dám khỏa thân ư?'". doisongtieudung.vn (in Vietnamese). Retrieved June 30, 2019.
  39. ^ "Cube khởi kiện vì HuynA bị ghép ảnh nude". Phụ nữ online (in Vietnamese). June 10, 2014. Retrieved June 30, 2019.
  40. ^ "Mỹ nhân Trouble Maker khốn khổ vì ảnh lõa lồ". Tin tức 24h (in Vietnamese). Retrieved June 30, 2019.

Further reading

  • Forbes, vol. 169, no. 1–6, p. 84, Forbes Incorporated, 2002.
  • American Journalism Review: AJR, vol. 18, no. 1–5, p. 29, College of Journalism of the University of Maryland at College Park, 1996.
  • Hana S. Noor Al-Deen, John Allen Hendricks, Social Media: Usage and Impact, p. 248, Lexington Books, 2012.
  • Janet Staiger, Media Reception Studies, p. 124, NYU Press, 1 July 2005
  • Kola Boof, Diary of a Lost Girl: The Autobiography of Kola Boof, p. 305, Door of Kush, 2006.
  • Laurence O'Toole, Pornocopia: porn, sex, technology and desire, p. 279, Serpent's Tail, 1999