

Some search engines and social media platforms make at least half-assed efforts to prevent or add warnings to this stuff, because anorexia in particular has a very high mortality rate and tends to have a young age of onset. The people advocating that AI models be altered to prevent this say the same about other tech. It’s not techphobia to want to reduce the chances of teenagers developing what is often a terminal illness, and AI programmers have the same responsibility there as everyone else.
If you link to suicide hotlines, mention which ones can send cops (under the guise of EMS) to your door without your consent. (For instance, I’d never suggest 988 because of this.) That way, users know you know what’s up and that they can trust the group’s info. Given cops’ record of shooting mentally ill people in crisis, this is pretty important.