An interesting problem with that question is "identifiable by whom?"
Suppose you took steps to make the person "unidentifiable" - common tricks like blurring out the face, or cropping the image to only the parts you...erm...want to see. You, obviously, can still "identify" the person, but I'd bet many of their friends could name them as well. Even stock photos contain real people who could identify themselves, and whom anyone who knows them could recognize.
I think the real point of this policy is just to give enforcement some teeth when these things inevitably go public. People might think they'll just generate it for themselves and keep it private. Some probably do manage that, and we will likely never know about them. The policy exists to bring real heat on people who "slip up" and "accidentally" leak the images (or who do so maliciously as harassment), and on companies that specifically advertise nudification as a feature of their model. But open models, finetunes, and the like mean that anyone sufficiently determined (and with sufficient opsec) will continue to "get away" with it regardless.
That doesn't make the law pointless. It makes it a deterrent - which is exactly what laws are supposed to do. Will some people successfully "get around" the law, generate nudes anyway, and keep them to themselves? Sure. But people also fantasized about others long before AI-generated images existed. If you're able to generate nudes on local, offline models and you actually manage to keep those files private, then...I guess, more power to you? The point of these policies should be to protect the victims - the people who are harmed when the images "leak" or when companies provide unfiltered, unscrupulous access to the tools that make this possible. If the law succeeds in making it a really bad day for anyone who does this kind of thing carelessly, that's still a win.