AI, data security and digital panopticon: which challenges for communication professionals?
by Chiara Valentini
Professor of Corporate Communication
JSBE, University of Jyväskylä, Finland
Undoubtedly, we live today in an interesting technological era, one with great potential for human development but also many challenges. Some challenges are technological: for instance, how to improve driverless cars so that they can handle a vehicle’s performance like a human driver in all possible conditions (Himmelreich, 2018, April 25). Others are legal: for example, how to prevent big data from turning into a discriminatory tool or producing transformative effects that undermine social and individual autonomy (Pagallo, 2017). But there are also communication challenges.
A communication challenge could be, for instance, promoting artificial intelligence (AI) in societies by explaining why we need to invest more in it and how such technologies can improve our lives. This is a challenge because many societies, as a recent study by the Pew Research Center shows, are neither ready for nor enthusiastic about the idea of robots taking over human activities (Wike & Stokes, 2018, September 13).
Another communication challenge, and a particularly difficult one, is assuring our stakeholders that we take great care in how we use, store and secure data about them. In Europe, the new General Data Protection Regulation (GDPR) addresses these concerns in part (see European Commission, 2018), but it is not sufficient to prevent the misuses and abuses by organizations that we have experienced and will continue to experience.
We periodically hear about scandals related to the use of people’s digital data, that is, the footprints we leave every time we enter the digital environment. From Edward Snowden’s infamous revelations about the US National Security Agency’s systematic and massive interception and collection of information on US and non-US citizens and organizations alike, to the Cambridge Analytica case, in which private information was harvested from the Facebook profiles of more than 50 million American users without their permission to build a system that could target them with personalized political advertisements based on their psychological profiles (Greenfield, 2018, March 28), people are becoming more and more concerned about their own privacy and security, and rightly so.
This is, indeed, relevant for communication professionals, since much of today’s communication happens in the digital environment. Moreover, we scan, monitor and track our organizations, clients and brands, collecting information through different software tools to gain important intelligence. We use this intelligence to strategically plan, develop, execute and measure our communication activities, offline and online. There is no doubt about this.
A few years back I wrote a provocative academic piece in Public Relations Review entitled “Is using social media ‘good’ for the public relations profession? A critical reflection” (Valentini, 2015), in which I reflected on a couple of important implications of the vast and massive use of digital media by public relations professionals to communicate with and to their publics. One of these concerned the impact of such communications on human relations.
Almost four years later, I still think that the implications for human relations are too often underestimated, or rather, quietly forgotten for the sake of organizations’ strategic goals. Paraphrasing media sociologist Fuchs (2015), we are exploiting people and their online communication behaviours for organizational interests.
How? Through digital stakeholder engagement initiatives. We ask organizations’ followers to collaborate, to participate, to get involved in organizational initiatives, to share organizational content, or even to create content for them. All these actions are typically voluntary. We do not pay them; we may offer a raffle prize or a small discount in company stores, but we do not pay them. These activities produce economic value for the organizations we work for without returning any economic benefit to those who actually create that value. We should not forget that such digital engagement is, in reality, a form of “digital labour” (emphasis mine) that creates surplus for an organization (Fuchs & Sandoval, 2014), if not in direct sales, then in reputation.
So, how do we explain this to our stakeholders? Should we explain it at all? And how will it look for a company that boasts of being responsible towards its stakeholders?
Even when digital engagement does not happen, we encounter a number of challenges related to security and data protection that call for an ethical approach beyond sales figures and marketing purposes. Attention to the use of digital data has increased, and we can expect that more and more people will become aware of how organizations use their digital data. To this we should add our responsibility, if we perceive ourselves as ethical communication professionals, to limit, whenever possible, the “digital panopticon”, to use Han’s (2015) terminology. A digital panopticon is a digital environment in which everyone can be observed and controlled everywhere and by anyone.
What are the implications for communication professionals?
At best, our publics become inoculated against the messages they receive; at worst, they hijack our communications and turn towards negative engagement (Lievonen et al., 2018).
What are the implications for human relations?
At best, we engineer sociality (van Dijck, 2013) – in my view an inferior form of sociality, unless we think that parasocial relations equate to face-to-face relations – and increase “networked individualism” (Wellman et al., 2003) at the expense of human-to-human interactions; at worst, we facilitate an increased sense of alienation, a loss of sociality and an overall distrust of others.
There is a bright light at the end of the tunnel. Bringing the ‘human’ back to the centre of our thinking, theorizing and practising of communication will help us address these challenges, if not completely then at least in part, through ethical choices and stakeholder-centred communications.
Fuchs, C. (2015). Culture and economy in the age of social media. London, UK: Routledge.
Fuchs, C. & Sandoval, M. (2014). Digital workers of the world unite! A framework for critically theorising and analysing digital labour. tripleC: Open Access Journal for a Global Sustainable Information Society, 12(2), 486-563.
European Commission (2018). Data protection in the EU. European Commission. URL: https://ec.europa.eu/info/law/law-topic/data-protection/data-protection-eu_en, accessed on 9 November 2018.
Greenfield, P. (2018, March 28). The Cambridge Analytica files: the story so far. The Guardian. URL: https://www.theguardian.com/news/2018/mar/26/the-cambridge-analytica-files-the-story-so-far, accessed on 9 November 2018.
Han, B.-C. (2015). The transparency society. Stanford, CA: Stanford University Press.
Himmelreich, J. (2018, April 25). The everyday ethical challenges of self-driving cars. The Independent, Indy/life. URL: https://www.independent.co.uk/life-style/gadgets-and-tech/ethical-challenges-self-driving-cars-driverless-vehicles-tempe-arizona-crash-a8287776.html, accessed on 9 November 2018.
Lievonen, M., Luoma-aho, V. & Bowden, J. (2018). Negative engagement. In K. A. Johnston & M. Taylor (Eds.), The handbook of communication engagement. Wiley-Blackwell.
Pagallo, U. (2017). The legal challenges of big data: Putting secondary rules first in the field of EU data protection. European Data Protection Law Review, 3(1), 36-46.
Valentini, C. (2015). Is using social media “good” for the public relations profession? A critical reflection. Public Relations Review, 41(2), 170-171.
van Dijck, J. (2013). The culture of connectivity: A critical history of social media. Oxford, UK: Oxford University Press.
Wellman, B., Quan-Haase, A., Boase, J., Chen, W., Hampton, K., de Diaz, I. I., et al. (2003). The social affordances of the Internet for networked individualism. Journal of Computer-Mediated Communication, 8(3), 1-16. http://dx.doi.org/10.1111/j.1083-6101.2003.tb00216.x
Wike, R. & Stokes, B. (2018, September 13). In advanced and emerging economies alike, worries about job automation. Pew Research Center, Global Attitudes & Trends. URL: http://www.pewglobal.org/2018/09/13/in-advanced-and-emerging-economies-alike-worries-about-job-automation/, accessed on 9 November 2018.