Opinion: All Businesses Must Ask Themselves Questions About Their Role in Social Media Malpractice

Written by Professor Christopher Bones Tuesday 04 December 2018
Digital ethics are an issue for managers in all industries
In the last few months, business practices in ‘big tech’ have faced serious scrutiny. In May 2018, perhaps reflecting the position it finds itself in, Google relegated its pledge ‘don’t be evil’ from the beginning of its code of conduct to a final aside.

Watching the fate of counterparts such as Facebook, which have been hauled before regulators and forced to manage nervy investors as a result, business leaders outside the industry could be excused for feeling lucky not to be involved.

But it would be a mistake to think they do not share responsibility for social media malpractice. After all, the business practices of technology giants are part-funded by adverts for other businesses. To be blunt, probably your business. As companies take the opportunity to get to know customers better, and to target them online with laser-like precision, we all need to accept our responsibility for engaging with, and benefiting from, the digital advertising industry.

Admittedly, there has been a growing chorus of discontent about fake views, inappropriate advert placement and other examples of digital advertising representing poor value for money. But business leaders must not limit themselves to an economic debate. We need to accept that investment in digital technology has created an ecosystem that exploits individuals’ data. This data has usually been given up unwittingly, and sometimes even unwillingly, through having to comply with ‘all-or-nothing’ terms and conditions.

You May Be Involved in Controversial Online Activity

How much information a business seeks, what it does with that information, and how it manages the manipulation of people’s emotions and opinions are all important ethical considerations.

This isn’t a new problem in advertising, but the stakes have never been higher. The ability to reach large audiences using a plethora of messaging that doesn’t need to pass through any regulatory control invites risk-taking that can quickly destroy reputations.

While GDPR may restrict direct access to unconsenting consumers, practices that remain concerning include: online polls that deliberately skew sentiment; the use of social influencers to promote content without a clear indication that they are accepting payment; and the proliferation of fake reviews.

As a result, businesses should incorporate ethical standards into their online activity.

Questions to Ask About Digital Ethics

Here are five simple questions that business leaders can ask their digital teams in order to improve their ethics:

  • How much human judgement is involved in the decisions about what you do with customers’ data?
  • Beyond basic legal standards, what additional controls do you have in place to manage segmented, or even personalised, marketing messages that are sent out in the company’s name?
  • What are the commercial and reputational risks if your use of data were to be exposed to customers?
  • If your customers could see all your different forms of digital messaging at once would that concern you?
  • How much control do you hand over to your customers to shape their engagement with you?

These are not easy questions. But unless you seek to answer them, and recognise the role your business plays in the larger ecosystem of digital marketing, you are accepting a high probability that someone will accuse your business of ‘doing evil’.

Professor Christopher Bones

Professor Christopher Bones is a Companion of the Chartered Management Institute and co-author of Optimizing Digital Strategy. He is Dean Emeritus of Henley Business School, Emeritus Professor of Creativity and Leadership at Alliance Manchester Business School, and Chairman of Good Growth, an international digital strategy, insight and innovation consultancy.