Must be clearly legal, not just in a grey area that hasn't been tested in the courts (i.e., a status similar to that of adult pornography).
It's probably already legal to possess, per the Supreme Court's precedent in Ashcroft, but that's not entirely clear; there's some disagreement among the courts.
https://en.wikipedia.org/wiki/Legal_status_of_fictional_pornography_depicting_minors#2008%E2%80%93present
How does it resolve if Ashcroft remains in place and the Supreme Court doesn't clarify it before 2030?
@IsaacKing I don't think that would be the correct resolution. Wikipedia lists all four cases concerning it, and none of them involved someone being convicted solely for possession of simulated CP. #1 and #4 involved a bunch of other crimes. In #2 it was only the anime, so no charges were filed. In #3, someone was intimidated into pleading guilty and thereby gave up their right to appellate review of the original charges, whose constitutionality is very dubious after Ashcroft.
I created another market to operationalize this differently:
The Child Pornography Prevention Act added two categories of speech to the definition of child pornography. The first prohibited "any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture" that "is, or appears to be, of a minor engaging in sexually explicit conduct." In the Ashcroft case, the Court observed that this provision "captures a range of depictions, sometimes called 'virtual child pornography,' which include computer-generated images, as well as images produced by more traditional means."
It appears that AI-generated child pornography would only be illegal if the AI generates images of a minor engaging in sexually explicit conduct. Would, for instance (and dear God I hope this never comes to pass), AI images of minors posing nude in the same fashion as Playboy models satisfy your criteria? In my understanding, such images are not illegal under current law; they would only be illegal if the minors were engaged in sexually explicit conduct.
Ashcroft v. Free Speech Coalition - Wikipedia
" The U.S. Supreme Court established the test that judges and juries use to determine whether matter is obscene in three major cases: Miller v. California, 413 U.S. 15, 24-25 (1973); Smith v. United States, 431 U.S. 291, 300-02, 309 (1977); and Pope v. Illinois, 481 U.S. 497, 500-01 (1987). The three-pronged Miller test is as follows:
Whether the average person, applying contemporary adult community standards, finds that the matter, taken as a whole, appeals to prurient interests (i.e., an erotic, lascivious, abnormal, unhealthy, degrading, shameful, or morbid interest in nudity, sex, or excretion);
Whether the average person, applying contemporary adult community standards, finds that the matter depicts or describes sexual conduct in a patently offensive way (i.e., ultimate sexual acts, normal or perverted, actual or simulated, masturbation, excretory functions, lewd exhibition of the genitals, or sado-masochistic sexual abuse); and
Whether a reasonable person finds that the matter, taken as a whole, lacks serious literary, artistic, political, or scientific value.
Any material that satisfies this three-pronged test may be found obscene."
Citizen's Guide To U.S. Federal Law On Obscenity (justice.gov)
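To spell out the structure of the quoted test: it is a pure conjunction, so material may be found obscene only if all three prongs are satisfied, and serious literary, artistic, political, or scientific value defeats a finding of obscenity on its own. Here is a minimal sketch of that logical structure only (Python; the boolean inputs are hypothetical stand-ins for what are in reality subjective judgments by a jury, not computable predicates):

```python
# Sketch of the Miller test's logical structure: all three prongs must hold
# for material to be found obscene. Each input is a hypothetical boolean
# standing in for a subjective jury judgment made under "contemporary
# community standards"; nothing here is mechanically decidable in practice.
def may_be_found_obscene(appeals_to_prurient_interest: bool,
                         patently_offensive_sexual_conduct: bool,
                         lacks_serious_value: bool) -> bool:
    """Return True only if all three Miller prongs are satisfied."""
    return (appeals_to_prurient_interest
            and patently_offensive_sexual_conduct
            and lacks_serious_value)

# Example: material that is prurient and patently offensive but has serious
# artistic value fails the third prong, so it may not be found obscene.
print(may_be_found_obscene(True, True, False))  # False
```

The point of writing it this way is just to highlight that the third prong acts as an absolute defense; the replies below are about how ill-defined the individual prongs are, which this sketch deliberately does not try to model.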
As has been pointed out many times, the Miller test is basically confused and meaningless. Is a nude image of a child obscene according to Miller? Not if it's in a medical textbook. Not, apparently, if it's in a Superman movie. Yes... when? When the "average person" feels like it.
@ForrestTaylor I.e., under the "community standards" prong of the Miller test, what might be considered "obscene" in Texas might not be considered "obscene" in California, and so on.
@ForrestTaylor Half the content of Pornhub would fail the Miller test, but de facto nobody gets prosecuted for obscenity for posting it.
The questions are "What is pornography?" and "What is obscenity?" I don't like to talk about this, but pictures of nude children are legal under U.S. law. If the criterion is simply "nude photorealistic images of children," well, I'm sorry, but Superman (1978) includes a scene with a nude child.
I understand the question to be about whether currently illegal material will be legalized.
Nude photorealistic images of children are not currently illegal. What is illegal is 1. Child Sexual Abuse Material and 2. pornography of children, which the DOJ describes as follows: "A picture of a naked child may constitute illegal child pornography if it is sufficiently sexually suggestive."
What does "sufficiently sexually suggestive" mean? I have no idea, and I don't think the courts do, either.
Citizen's Guide To U.S. Federal Law On Child Pornography (justice.gov)
I think this needs more detailed resolution criteria. As I understand it, drawn/fictional pornography depicting minors is protected expression under the First Amendment, as long as it is not considered obscene, with some other caveats (Wikipedia). I can imagine a court ruling that certain things that could fall under "AI-generated child pornography" (depending on your intended definition) are protected under current law.