
Thursday, November 10

09:00 – 10:30

PP 008

Putting the Canaries in the Data Mine. Some Suggestions for the Practical, Ethical, and Legal Challenges of Researching the ‘Black Box’

B. Bodó¹, J. Möller², K. Irion¹, F. Zuiderveen Borgesius¹, N. Helberger¹, C. de Vreese²

¹ University of Amsterdam, Institute for Information Law, Amsterdam, Netherlands
² University of Amsterdam, Amsterdam School of Communications Research, Amsterdam, Netherlands

Algorithmic agents (AA) permeate every instant of our online existence. Based on digital profiles built from the massive surveillance of our online activity, they rank search results, filter our emails, hide and show news items in social network feeds, and try to guess what products we might buy next for ourselves and for others, what movies we want to watch, and when we might be pregnant. They select, filter, and recommend products, information, and people; they increasingly customize our physical environments as well (including the temperature and the mood). Increasingly, algorithmic agents do not just select from a range of human-created alternatives, they also create: they are ever more capable of providing us with content made just for us and of engaging with us through one-of-a-kind, personalized interactions. To understand the implications of algorithms for users and society, and the possible threats or opportunities for the realization of fundamental rights and values, we need to be able to better understand the workings and effects of algorithms. As we describe in this paper, there is more than one way of approaching this. Seeing that some of the most powerful and influential algorithms are among the best-kept business secrets in the world, asking the developers and owners of algorithms to let us study the code is not a very likely route to success. Another possibility is to reverse engineer the algorithms, observing how the inputs to the black box determine its outputs, but many factors limit the successful application of reverse engineering. Instead, our team at the University of Amsterdam decided to pursue a third approach, which enables us to observe the space in which algorithmic agents interact with us, humans, and to see how agents and people shape each other's behavior. The objectives of our paper are threefold. The first is to describe our approach to researching the 'Black Box' and to share our experiences with the academic community. The second is to initiate a more fundamental discussion about the ethical and legal issues of tracking the trackers, as well as the costs and trade-offs involved, e.g. for the privacy of the users we are observing. The third is to contribute to developing a vision on algorithms and transparency in general. Our paper will contribute to the discussion on the relative merits, costs, and benefits of different approaches towards transparency (such as bottom-up reverse engineering versus statutory transparency obligations). We will argue that besides shedding light on the internal workings of specific algorithms, we also need to understand how different methods of cracking the black boxes open compare to each other in terms of costs and benefits, so that we do not end up sacrificing a golden goose to warn us of the dangers in the data mine.
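
The input–output auditing approach mentioned in the abstract can be illustrated with a minimal sketch. The following Python example is purely illustrative and is not the authors' actual instrumentation: it stands in a toy "black box" ranker (hypothetical; every name and value is invented), probes it with two controlled user profiles, and measures the overlap of the returned recommendation lists to detect whether the output depends on the profile.

import random
from typing import List

def black_box_ranker(profile: dict, pool: List[str], k: int = 5) -> List[str]:
    # Stand-in for an opaque personalization system. In a real audit this
    # would be an external service we can only query, never inspect; here
    # we fake one whose ordering depends on the input profile.
    seed = hash(frozenset(profile.items())) & 0xFFFFFFFF
    rng = random.Random(seed)
    shuffled = sorted(pool, key=lambda _: rng.random())
    return shuffled[:k]

def overlap(a: List[str], b: List[str]) -> float:
    # Jaccard similarity of two recommendation lists (1.0 = identical sets).
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb)

if __name__ == "__main__":
    pool = ["article_%d" % i for i in range(50)]
    profile_a = {"interest": "politics", "location": "NL"}
    profile_b = {"interest": "sports", "location": "NL"}

    out_a = black_box_ranker(profile_a, pool)
    out_b = black_box_ranker(profile_b, pool)

    # Low overlap between controlled probe profiles is evidence that the
    # black box conditions its output on the input profile.
    print("profile A:", out_a)
    print("profile B:", out_b)
    print("overlap:", overlap(out_a, out_b))

The same probe-and-compare logic underlies larger audits: hold every input constant except one attribute, query repeatedly, and treat systematic output differences as evidence of personalization.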

PP 009

Towards a Public Service Algorithm That Promotes News Diversity

P. Verdegem¹, E. Lievens²

¹ Ghent University, Communication Sciences, Ghent, Belgium
² Ghent University, Interdisciplinary Study of Law – Private Law and Business Law, Ghent, Belgium

Digitalization processes have profoundly changed the news ecology. As a consequence, the very definition of news itself is under pressure. Whilst traditional news production typically revolves around news values in determining what is newsworthy, datafication principles are increasingly being used to determine what is newsworthy and what news offerings should consist of (Hammond, 2015). This impacts the news ecology on different levels: decisions on the editorial floor about what content needs to be produced are increasingly based on what generates 'engagement' (e.g., internet traffic) (Lee, Lewis & Powers, 2014), while various news outlets experiment with algorithms to offer personalized news (Carlson, 2015). To facilitate the filtering of news and information, news recommender systems have been developed. They are powerful and popular tools that help audiences cope with information overload and assist in decision-making, based on the user's news preferences. As such, these systems are clear examples of the algorithmic culture at work in the big data era. The increasing importance of algorithms and datafication brings about new opportunities, e.g. offering a customized news experience and facilitating innovative journalistic practices, but might also entail less positive consequences. Hyper-personalized news selection may endanger the basic function of news, since it may result in a 'filter bubble' (Pariser, 2011): a world created by the shift from 'human gatekeepers' to 'algorithmic gatekeepers' employed by Facebook and Google, which present the content they believe a user is most likely to click. Against this background, this paper aims to conceptually explore an innovative and societally relevant use of algorithmic power, i.e. 'public service algorithms' that make recommendations which help open our horizons and offer something 'new and different'. First, the concept of 'news diversity' will be analyzed from both a communication science and a legal perspective (e.g., source diversity versus exposure diversity; Burri, 2015; Helberger, 2015). In a second step, based on the foregoing analysis on the one hand and a literature study on the functioning of algorithms on the other, the paper will identify and examine essential principles that a public service algorithm must embody (e.g., transparency, user control and data subject rights, accountability).

References:
Burri, M. (2015). Contemplating a 'Public Service Navigator': In search of new (and better) functioning public service media. International Journal of Communication, 9, 1341–1359.
Carlson, M. (2015). The robotic reporter: Automated journalism and the redefinition of labor, compositional forms, and journalistic authority. Digital Journalism, 3(3), 416–431.
Hammond, P. (2015). From computer-assisted to data-driven: Journalism and big data. Journalism. doi: 10.1177/1464884915620205
Helberger, N. (2015). Merely facilitating or actively stimulating diverse media choices? Public service media at the crossroad. International Journal of Communication, 9, 1324–1340.
Lee, A. M., Lewis, S. C., & Powers, M. (2014). Audience clicks and news placement: A study of time-lagged influence in online journalism. Communication Research, 41(4), 505–530.
Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin UK.
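
To make the exposure-diversity principle discussed in this abstract more concrete, here is a minimal, hypothetical Python sketch (not taken from the paper; all names, topics, and scores are invented): a greedy re-ranker that trades predicted relevance against topical diversity, so that a recommendation slate is not dominated by a single topic.

from typing import Dict, List, Tuple

def diversity_rerank(
    candidates: List[Tuple[str, str, float]],  # (article_id, topic, relevance)
    k: int = 5,
    diversity_weight: float = 0.5,
) -> List[str]:
    # Greedy selection that penalizes each candidate by how often its
    # topic is already represented in the slate, nudging the output
    # towards exposure diversity rather than pure predicted relevance.
    selected: List[Tuple[str, str, float]] = []
    topic_counts: Dict[str, int] = {}
    remaining = list(candidates)

    while remaining and len(selected) < k:
        def score(item: Tuple[str, str, float]) -> float:
            _, topic, relevance = item
            return relevance - diversity_weight * topic_counts.get(topic, 0)

        best = max(remaining, key=score)
        remaining.remove(best)
        selected.append(best)
        topic_counts[best[1]] = topic_counts.get(best[1], 0) + 1

    return [article_id for article_id, _, _ in selected]

if __name__ == "__main__":
    pool = [
        ("a1", "politics", 0.95), ("a2", "politics", 0.93),
        ("a3", "politics", 0.90), ("a4", "culture", 0.70),
        ("a5", "science", 0.65), ("a6", "sports", 0.60),
    ]
    # Pure relevance ranking would pick three politics items in a row;
    # the diversity penalty pulls other topics into the slate instead.
    print(diversity_rerank(pool, k=4))  # -> ['a1', 'a4', 'a5', 'a6']

The diversity_weight parameter is one simple way of operationalizing the user-control principle the abstract mentions: exposing it to readers would let them tune how strongly the recommender pushes beyond their usual preferences.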