Thursday, July 5, 2018

WhatsApp rewards programme


WhatsApp announces generous reward programme to curb spread of misinformation in India: Will it work?

WhatsApp has launched a research programme in India to stop the circulation of fake news, violence-inducing material and other malicious content on its mobile application. The Research Awards for Social Science and Misinformation carry a generous cash prize of $50,000 (approx. Rs 34,37,500).
"WhatsApp cares deeply about the safety of our users. Through this new project, we look forward to working with leading academic experts in India to learn more about how online platforms are used to spread misinformation. This local research will help us build upon recent changes we have made within WhatsApp and support broad education to help people spot false news and hoaxes," WhatsApp Spokesperson said in a statement.
The announcement comes in the wake of a series of lynching incidents triggered by fake messages claiming the presence of child traffickers.

Here's what WhatsApp wants the research in India to explore:

Information processing of problematic content: WhatsApp wants to understand how the local population decides whether what they are reading is trustworthy or plain propaganda. It is seeking proposals that explore the social, cognitive and information-processing variables involved in consuming content received on WhatsApp, how those relate to judgements about the content's credibility, and the decision to share that content with others.
It also wants to understand social cues and relationships, personal value systems, features of the content, the content's source and so on, and is interested in which aspects of the experience might help individuals engage more critically with potentially problematic content.
Election-related information: WhatsApp also wants participants to examine how political players are making use of its messenger app to organise and potentially influence elections in their constituencies. While political leaders can use the app to maintain a direct connection with voters in their local constituencies, WhatsApp fears it can also be misused to share inaccurate or inflammatory political content. It wants to know the unique characteristics of WhatsApp for political activity and its place in the ecosystem of social media and messaging platforms, the distribution channels for political content, targeting strategies, and so on.
Network effects and virality: This area aims to build checks on the spread of viral posts and videos, which may contain false information that leads to communal violence, a problem that has been recurring of late, particularly in rural parts of India. So WhatsApp wants to understand the characteristics of networks and content. WhatsApp is designed to be a private, personal communication space and does not facilitate trends or virality through algorithms or feedback; however, these behaviours do occur organically along social dimensions. The company is interested in projects that inform its understanding of how information spreads through WhatsApp networks.
Digital literacy and misinformation: WhatsApp also expects researchers to explore the relationship between digital literacy and vulnerability to misinformation on WhatsApp. The app is very popular in some emerging markets, especially among people who are new to the Internet and populations with lower exposure to technology. The company is interested in research that informs its efforts to bring technology safely and effectively to underserved geographical regions. This includes studies of individuals, families and communities, but also wider inquiries into the factors that shape the context for the user experience online.
Detection of problematic behaviour within encrypted systems: WhatsApp welcomes proposals that examine technical solutions for detecting problematic behaviour within the restrictions of, and in keeping with the principles of, encryption. Its end-to-end encrypted system provides privacy and security for all WhatsApp users, including people who might be using the platform for illegal activities. How might such activity be detected without monitoring the content of every user's messages? The company is particularly interested in understanding and deterring activities that facilitate the distribution of verifiably false information.
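To make that last point a little more concrete, here is a minimal, hypothetical sketch in Python of what "detecting behaviour without reading content" could look like: a client reports only non-content metadata about a message (forward count, fan-out, forwarding depth), and a simple heuristic flags abnormally viral forwarding patterns. The field names, thresholds and the whole approach are illustrative assumptions on my part, not WhatsApp's actual system.

from dataclasses import dataclass

# Hypothetical, content-free metadata a client could report about a message.
# These fields and thresholds are illustrative assumptions, not WhatsApp's
# actual telemetry or API.
@dataclass
class MessageMetadata:
    message_id: str        # opaque identifier, never the message text
    forward_count: int     # how many times this message has been forwarded
    distinct_chats: int    # number of chats it has been forwarded into
    hops_from_origin: int  # forwarding depth from the original sender

def looks_viral(meta: MessageMetadata,
                max_forwards: int = 25,
                max_chats: int = 5,
                max_hops: int = 10) -> bool:
    """Flag a message purely from its forwarding behaviour, without ever
    inspecting the encrypted content."""
    return (meta.forward_count > max_forwards
            or meta.distinct_chats > max_chats
            or meta.hops_from_origin > max_hops)

if __name__ == "__main__":
    sample = MessageMetadata("msg-42", forward_count=120,
                             distinct_chats=18, hops_from_origin=14)
    if looks_viral(sample):
        print(sample.message_id, ": candidate for rate-limiting or a 'forwarded many times' label")

In practice a signal like this could only ever be one input into rate-limiting or labelling decisions, since it says nothing about whether the content itself is actually false.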

What are the chances of the WhatsApp Research Awards for Social Science and Misinformation programme succeeding in India?

Before I comment on that, I would like to commend WhatsApp for its efforts to control the spread of misinformation. In the digital world, technology is like fire: it can bring light to our lives, but if we go too near (read: misuse it), it will burn us.
WhatsApp and other social media platforms have this kind of effect too: they have the power to bring people closer, but if misused, they can create hatred and eventually lead to violence.
Coming back to the topic at hand, it is going to be a hard task for social science researchers, given that WhatsApp is end-to-end encrypted and they will not be given any inside information, not even a small sample of data from WhatsApp users communicating on the messenger app.
Adding to the woes, India is a very diverse country, with cultures changing almost every 100 km or so, and researchers will have to spread out across its vast geography to build a complete picture. But with big cash rewards and scholarships on offer, researchers should be motivated enough to finish the job and submit their findings to WhatsApp.
Though it will certainly take time to cover the whole of India, they are likely to complete the task. With that information, WhatsApp will be able to bring out a set of pointers for its users to create awareness and help them recognise whether the information they are reading is fake news or propaganda.
Interested individuals or institutions can apply for the WhatsApp Research Awards for Social Science and Misinformation programme (Here). It can be noted that the applicant has to be a PhD scholar.
