So regulation should be avoided even for the following?

Developing LLMs artificially biased toward "engagement"? Left unregulated, this aspect of LLMs will lead to mental health issues, radicalization, suicide attempts, and more in otherwise healthy people. See recent articles on Ars Technica on...