Twitter in May said it would begin prompting users who are about to tweet something nasty to either revise or delete the message before sending. The decision, the company said at the time, was based on a successful test of the messages in the run-up to the 2020 election.

Now, a new study, this one from researchers at New York University, adds to the evidence that warning users about hate speech can cut their use of it by 10-20%. And those warnings can change users' behavior even when they aren't in the heat of the moment, about to tweet something regrettable.

The researchers at NYU's Center for Social Media and Politics developed their experiment last summer, in response to what was beginning to look like a mass migration of Twitter users to more extreme platforms like Parler. So, last July, as racial justice protests were swelling, anti-Asian sentiment was filling social media and conservatives like Sen. Ted Cruz were threatening to abandon Twitter, the NYU researchers began monitoring a subset of 600,000 tweets, scanning for users who they thought might soon be suspended for hate speech.

"We wanted to find a way that would basically prevent them from migrating to these platforms, but at the same time, that would result in the reduction of hate speech," said Mustafa Mikdat Yildirim, a PhD candidate in NYU's department of politics and the lead researcher on the report.