
Preventing Online Abuse Without Upsetting Customers: Mission Impossible?

Matilda Dorotic

Can social activism jeopardize your brand relationships?

As business and social interactions move online, cybercrime soars too. Particularly alarming are EU and US reports indicating an increase in online child sexual exploitation and abuse during the pandemic.

The Norwegian government’s own statistics show that 1 in 10 young people aged 9-17 have created or shared online material that could be classified as sexual exploitation. Both Norway and the European Council have therefore made the detection and prevention of internet-related child abuse one of their top priorities.

Firms operating online are increasingly expected to take social responsibility actions to detect, report and remove abuse material and to prevent cybercrime. In July, the European Parliament adopted a derogation from personal privacy rules that allows providers of electronic communication services to scan people’s private messages and report material containing child sexual abuse or attempts at child grooming.

But what should firms do when supporting social activism may directly hurt the customer experience by breaching customers’ privacy and trust? How can they help detect and prevent cybercrime without jeopardizing brand trust?

Slow tech giants

The role of corporations in promoting social activism has been controversial. Some firms create and sell automated detection technologies to law enforcement (with their share of scandals), while online service providers such as social media and gaming firms have been reluctant to implement automated prevention technologies that may affect their users.

Facebook, Twitter and Google have so far been slow and unwilling to identify and ban users for malicious actions on social media, such as spreading misinformation. An analysis of 812,000 anti-vaccine posts by the Center for Countering Digital Hate showed that 65 percent of the content was attributable to just 12 people. Currently, only three of them have been banned from all three platforms.

Business leaders are generally divided on whether brands should promote social activism, given the adverse effect it may have on customers and stakeholders, according to a recent survey from the Chief Marketing Officer Council.

CEOs who engage in public discussions on corporate social activism (CSA) topics, such as racism, elicit an adverse reaction from investors, who are reluctant to risk a public backlash. On average, investors evaluate CSA activities as a signal that the firm may be moving away from profit-oriented objectives and toward riskier, more uncertain outcomes.

The dilemmas become particularly tricky when artificial intelligence (AI) is employed to automatically detect potential offenders based on data provided by users (images and text). While powerful, the technology is not infallible.

Users may be falsely accused (flagged by an algorithm as potential offenders based on race, gender, age or other characteristics), which can have detrimental consequences for the brand.
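To see why false flags are such a pressing concern, consider a minimal back-of-the-envelope sketch. All figures below (prevalence, detection rate, false-positive rate) are hypothetical assumptions chosen for illustration, not numbers from any real detection system: when offending content is rare, even a seemingly accurate classifier can flag far more innocent users than actual offenders.

    # Illustrative calculation of the false-positive problem in automated
    # abuse detection. All numbers are hypothetical assumptions.
    total_users = 1_000_000       # users whose content is scanned
    prevalence = 0.001            # assume 0.1% of users share abusive material
    true_positive_rate = 0.95     # assume the classifier catches 95% of offenders
    false_positive_rate = 0.01    # assume it wrongly flags 1% of innocent users

    offenders = total_users * prevalence
    innocent = total_users - offenders

    flagged_offenders = offenders * true_positive_rate   # correctly flagged
    flagged_innocent = innocent * false_positive_rate    # falsely accused

    precision = flagged_offenders / (flagged_offenders + flagged_innocent)

    print(f"Correctly flagged offenders: {flagged_offenders:,.0f}")
    print(f"Falsely flagged innocent users: {flagged_innocent:,.0f}")
    print(f"Share of flags that are actual offenders: {precision:.1%}")
    # Under these assumptions, roughly 9 in 10 flagged users would be innocent,
    # which is why human review and conservative thresholds matter for brand trust.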

Consumer resistance to data sharing

The challenge for firms is exacerbated by customers’ general lack of trust when it comes to sharing personal information.

More than one in five Europeans do not want to share any type of personal data with the authorities, and 41% are unwilling to share any with a private company, according to a survey conducted by the EU. The same survey found that only about one in 20 is willing to share facial images (6%), political views (5%) or fingerprint scans (4%) with private companies.

Not your average CSR dilemma

Traditional corporate social responsibility (CSR) activities have proven to enhance brand recognition, firm reputation and customer satisfaction. However, firms typically engage only in CSR activities that support their brand. The dilemmas discussed in this article differ significantly, since they have the potential both to strengthen and to sever stakeholders’ relationships with the firm.

For the fight against cybercrime to be successful, firms must cooperate closely with police and public institutions. At the same time, firms depend on the trust of customers, who expect firms to protect their privacy and prevent algorithmic bias against them. Trust, satisfaction and the ability to innovate drive investors’ decisions.

In view of all this, it is important to align multiple stakeholders’ interests and understand the drivers and obstacles facing managers and policy makers responsible for inducing social change.

Reference:

This article was written for the upcoming edition of the BI Marketing Magazine.

Matilda Dorotic and BI Norwegian Business School are currently involved in an ongoing research project, led by NTNU, where researchers and the Norwegian police will collaborate to develop new knowledge about sexual abuse of children on the internet, including insights into arenas used by perpetrators, as well as technological trends that may help in the detection and prevention.

Published 7 September 2021
