His Job Was to Make Instagram Safe for Teens. His 14-Year-Old Showed Him What the App Was Really Like.

“Please don’t talk about my underage tits,” Bejar’s daughter shot back before reporting his comment to Instagram. A few days later, the platform got back to her: The insult didn’t violate its community guidelines. Bejar was floored—all the more so when he learned that virtually all of his daughter’s friends had been subjected to similar harassment.

“DTF?” a user they’d never met would ask, using shorthand for a vulgar proposition. Instagram acted so rarely on reports of such behavior that the girls no longer bothered reporting them.

Bejar began peppering his former colleagues at Facebook with questions about what they were doing to address such misbehavior. The company responded by offering him a two-year consulting gig.

That was how Bejar ended up back on Meta’s campus in the fall of 2019, working with Instagram’s Well-Being Team. Though not high in the chain of command, he had unusual access to top executives—people remembered him and his work.

Bejar shared his findings in an email to Meta CEO Mark Zuckerberg and other top executives.
From the beginning, there was a hurdle facing any effort to address widespread problems experienced by Instagram users: Meta’s own statistics suggested that big problems didn’t exist. 

During the four years Bejar had spent away from the company, Meta had come to approach governing user behavior as an overwhelmingly automated process. Engineers would compile data sets of unacceptable content—things like terrorism, pornography, bullying or “excessive gore”—and then train machine-learning models to screen future content for similar material.

According to the company’s own metrics, the approach was tremendously effective. Within a few years, the company boasted that 99% of the terrorism content that it took down had been removed without a user having reported it. While users could still flag things that upset them, Meta shifted resources away from reviewing them. To discourage users from filing reports, internal documents from 2019 show, Meta added steps to the reporting process. Meta said the changes were meant to discourage frivolous reports and educate users about platform rules.
