Minnesota’s new nudification law applies to any company that makes AI tools capable of generating fake intimate images — and it targets the moment of creation, not distribution.
That’s the shift. Existing laws like revenge porn statutes require proving the creator intended harm. The federal Take It Down Act only kicks in after images are shared. Minnesota’s law doesn’t wait for either. If your app can generate a fake nude, it’s now illegal in Minnesota, full stop.
Erin Maye Quade, the Minnesota state senator who led the push, credited victim-survivors who testified in committee and spoke publicly. One of the women targeted, Molly Kelley, spent two years pushing for the bill and framed the problem correctly: “These images don’t exist without a third-party involvement and some sort of machine learning model.” The law is aimed at the supply chain, not the end user.
For founders building image generation tools, the read is this: creation-point liability is a new category. You can’t rely on “we don’t control what users do.” If the tool can do it, you’re exposed.
Enforcement has a gap. DeepSwap, the service used against the Minnesota women, operates from overseas, claiming bases in Hong Kong and Dublin. A state law has little reach over a company in Hong Kong, which is why advocates say a federal ban would be the better fix.
Governor Tim Walz still needs to sign the bill. Once he does, expect other states to copy the template.
Nathan Zakhary