
China is moving to rein in one of the fastest-growing frontiers of technology: emotionally interactive artificial intelligence.
On Saturday, the country’s cyber regulator unveiled draft regulations aimed at AI systems that simulate human personalities and forge emotional connections with users, signaling Beijing’s growing concern over the psychological and societal impact of these technologies.
These draft rules, now open for public comment, target AI products and services that communicate with users via text, images, audio, video, or other digital formats, mimicking human thought patterns, personalities, and conversational styles.
The goal is to ensure that AI does not manipulate, overwhelm, or emotionally endanger its users.
China’s authorities are particularly wary of the addictive potential of AI, warning that users could develop unhealthy emotional reliance.
Under the proposed regulations, providers must issue clear warnings against excessive use and intervene whenever signs of emotional dependence emerge.
Accountability is central to the new framework. AI developers and operators would be responsible for safety across the entire product lifecycle, with mandatory systems for algorithm review, data security, and personal information protection.
Psychological safeguards form a cornerstone of the draft. AI platforms would be expected to monitor user emotions, identify distress or addictive behavior, and take immediate corrective action when necessary.
Content boundaries are also strictly defined: AI systems cannot generate material that threatens national security, spreads misinformation, promotes violence, or encourages obscenity.
Experts say the move highlights China’s effort to tame the social influence of cutting-edge AI technologies. As emotionally intelligent AI becomes increasingly sophisticated, Beijing is signaling that innovation must walk hand in hand with ethical governance, public safety, and social stability.
With this step, China is setting the stage for a global conversation on the role of AI in society, raising the stakes for developers, policymakers, and users alike.



