
New Bill Would Make Nudification Technology Illegal In Minnesota
Technology can be fantastic, with many AI advances providing real benefits for people in all kinds of professions.
Unfortunately, that same technology that is used to make life easier and more efficient is also being used to hurt and embarrass people. Countless recent scams use AI technology to rip people off. One of these scams places calls to people, emulating the voice of a loved one to say they are in trouble and need money.
Another way technology is being used against people is AI nudification. This is when someone has a perfectly innocent photo or video of another person and then uses an app or other AI platform to make them appear nude and often in compromising positions.
To the person in the photo or video, this can be an extremely violating experience, especially since the altered images appear real and are often shared on social media. The whole thing seems wrong, and now Minnesota is looking to make it illegal.
Minnesota Bill Introduced That Would Ban Nudification Technology
Senator Erin Maye Quade, DFL-Apple Valley, has introduced MN SF1119, a bill that would ban 'nudification technology', which is defined as 'digitally altering images or videos to reveal intimate body parts that were not originally present, or creating such realistic depictions that a reasonable person would believe an identifiable individual's intimate parts are exposed.'

The state already has a relatively new law that bars nonconsensual sexual deepfakes, and Maye Quade says this bill is meant as an extension of that law.
KSTP-TV reports that the Senate Judiciary and Public Safety Committee heard testimony on February 19 from people warning of the technology’s wide availability and ease of use.
Megan Hurley, a victim of this technology, spoke out about someone she knew who made an explicit image of her from a photo she had posted to a private Facebook page. The same person did this to about 80 other women.
She added that she cannot overstate the damage the technology has done to her, as those fake and humiliating images can live online forever.
The bill would require owners or controllers of nudification websites, applications, software, or programs to prevent users from accessing, downloading, or using the technology to nudify images or videos.
Under the bill, owners of AI websites or applications that fail to remove the feature in Minnesota would face a minimum fine of $500,000 for each unlawful access or download of these nude AI images. The bill would also make it easier for people to sue.
At the February 19 hearing, the committee postponed a vote on the bill to allow time for an amendment directing money from the civil fines to survivors of deepfake offenses.
If ultimately passed as currently written, the law would go into effect on August 1, 2025.