Nina is the name I’ll use for the girl who loved my 16-year-old son. Until recently, when he wanted to think about her, he could still find her videos on TikTok, including the ones she posted under a false name while she was a patient in a psychiatric hospital. He could see her smeared makeup and tears. He could watch her zoom in on the locked hospital doors, the plastic windows that don’t open, the empty smoking room. He could hear her joke about having been abandoned.
My son barely survived the last couple of school years, during which 9% of high schoolers surveyed by the U.S. Centers for Disease Control and Prevention reported attempting suicide. Nina did not survive.
TikTok knew more about Nina’s mental state than any of the adults around her, reflecting her fear and pain back to her. No one can know to what extent TikTok’s algorithm contributed to her death. What I do know for sure is that, more than six months later, my son and his classmates could still go to the app whenever they wanted to be retraumatized by images of Nina just before her suicide. That is not just irresponsible on TikTok’s part. That is reckless.