Twitter is partnering with two groups of academic researchers to figure out how to measure the health of conversations happening on the platform.
It’s all part of the company’s continuing long game for social relevance. Twitter lost 1 million users, yet posted $100 million in profit in last week’s earnings report. Neither of these numbers seems to change the fact that Twitter needs to make some adjustments to the way people use the social network.
In March, Twitter called for proposals from researchers on how they might measure the types and tenor of conversations on Twitter. The proposals were reviewed by Twitter employees from a variety of departments, including Engineering, Product, Machine Learning, Data Science, Trust and Safety, Legal, and Research. Twitter also says the review committee was organized to include representatives from diverse groups across the company. (Reminder: fewer than 10 percent of Twitter employees come from underrepresented groups, so it was likely a busy few months for those people.)
Well, the review process is now over, and Twitter has settled on two research teams, each focused on a different issue.
The first team, led by scholars from Leiden University, will look at how echo chambers form and what effects they have, as well as the difference between incivility and intolerance within Twitter conversations. The team — including Dr. Rebekah Tromble, Assistant Professor of Political Science at Leiden University; Dr. Michael Meffert, also at Leiden; Dr. Patricia Rossini and Dr. Jennifer Stromer-Galley at Syracuse University; Dr. Nava Tintarev at Delft University of Technology; and Dr. Dirk Hovy at Bocconi University — has found in past research that echo chambers can breed hostility and promote resentment toward those outside the conversation.
The team’s first set of metrics will measure the extent to which people acknowledge and engage with diverse viewpoints on Twitter. The second set will distinguish incivility from intolerance. The group’s past research shows that incivility can serve important functions in political dialogue, though not without spurring problems of its own. Intolerant speech (hate speech, racism, xenophobia), on the other hand, threatens democratic discourse. The team plans to develop algorithms that distinguish between the occasionally useful incivility and the decidedly useless intolerance we encounter daily on Twitter.
The second research project will be led by Professor Miles Hewstone and John Gallacher at the University of Oxford, in partnership with Dr. Marc Heerdink at the University of Amsterdam. The work extends Prof. Hewstone’s long-standing research on intergroup conflict. The study’s current findings show that when a conversation contains more positive sentiment, more cooperative emotion, and more complex thinking and reasoning from multiple perspectives, prejudice goes down and the quality of relationships goes up.
“As part of the project, text classifiers for language commonly associated with positive sentiment, cooperative emotionality, and integrative complexity will be adapted to the structure of communication on Twitter,” the Twitter blog says.
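Twitter and the researchers haven’t published those classifiers, but to give a rough sense of the idea, here is a toy sketch of keyword-based text classification. The category lexicons, scoring, and function names below are entirely invented for illustration and are not the researchers’ actual method, which would almost certainly use trained machine-learning models rather than word lists.

```python
# Toy illustration of lexicon-based text classification, NOT the
# researchers' real classifiers. The word lists below are invented
# stand-ins for the three categories named in the Twitter blog post.
import re
from collections import Counter

LEXICONS = {
    "positive_sentiment": {"great", "thanks", "agree", "appreciate", "helpful"},
    "cooperative_emotion": {"we", "together", "share", "collaborate", "compromise"},
    "integrative_complexity": {"however", "although", "whereas", "nevertheless"},
}

def classify(tweet: str) -> dict:
    """Count lexicon hits per category in a tweet (case-insensitive)."""
    words = re.findall(r"[a-z']+", tweet.lower())
    counts = Counter(words)
    return {cat: sum(counts[w] for w in lex) for cat, lex in LEXICONS.items()}

scores = classify("We can collaborate; however, I agree a compromise is helpful.")
# -> {'positive_sentiment': 2, 'cooperative_emotion': 3, 'integrative_complexity': 1}
```

A real system would also need to handle sarcasm, slang, and the quote/retweet structure of Twitter conversations, which is presumably what “adapted to the structure of communication on Twitter” refers to.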
Like any social network, Twitter provides the scaffolding; users construct the buildings of dialogue. Exactly how Twitter will adjust that scaffolding to produce more useful, empathetic conversations is still a mystery. But bringing in the academic community to help is an excellent next step.