

Thursday, November 10
16:30–18:00
CLP04
Rethinking Data Protection and Privacy
PP 214
Data, Prediction and Automatic Law?
A. Kenyon¹, J. Goldenfein²
¹ University of Melbourne, Law, Melbourne, Australia
² Swinburne, Law, Melbourne, Australia
Big data and predictive analytics could be said to strive for a perfectly calculable future, in which virtualized possibilities are used to analyze the present. In that respect, they might be said to carry risks of authoritarianism. Here, we explore the challenges of using law to regulate big data programs and the knowledge they generate. Legal limitations on big data could be based on limiting: 1) the data that is available for analysis; 2) the types of computational processes that can be used for making decisions (that is, regulating how decisions are made); or 3) the resulting effects on individuals – be it discrimination, stigmatization or profiling. If effectively implemented, legal systems capable of modulating these processes and outcomes could profoundly redefine the relationship between society and automated systems. However, legal systems capable of constraining big data in this way do not yet exist. Data protection regimes rarely limit data analytics or automated decision-making programs in a substantial way, and the regulations that do exist are of questionable efficacy overall. It has recently been argued that law, as a technology of the script, may be losing its protective power, and that any normative constraint on predictive analytics and automated systems may require more than providing textual instruction to those who control such systems. In other words, effective regulation may require an articulation of legal constraint into the telecommunications infrastructure itself: automated, self-executing legal systems operating at the technical level. This paper analyses the possibility of implementing legal constraints on predictive analytics through technical systems, exploring the possibility of hardcoding legal limitations within big data systems. Legal expert systems that assist administrative governance have been in use for some time, but the use of systems through which legal norms are translated into programming code is rarely compulsory or applied with the force of law. Rethinking the mode and materiality of legal transfer (in a way that includes the computer, the network and programming code) may offer a fruitful path. However, it is also fraught with jurisprudential quagmires and practical barriers. Accordingly, we provide an analysis of recent exercises that have attempted to translate and replicate legal norms through programming languages to constrain data mining and analytics. We focus on the possibilities and consequences of ‘automatic’ law as a vehicle for reshaping the relationship between individuals, automated systems and their effects.
PP 215
Tracing Audience Behaviour for a Reason – A Matter of Paternalism?
E. Appelgren¹
¹ Södertörn University, Journalism, Huddinge, Sweden
This paper analyses, in a news context, the reasons for using audience data that are often provided in privacy agreement texts and cookie consent information. Due to current data protection legislation, media companies and other website owners must obtain informed consent from their audience
in order to use cookies to measure web traffic. New EU data protection regulations from January 2016 promise modernized, unified rules that benefit
businesses while also giving people better control over their personal data. Currently, informed consent is often obtained by asking for permission to collect
audience data through “cookie consent” or acceptance of user terms. Some media companies also communicate their reasons for collecting the data in
separate texts, stating reasons such as enhancing the user’s experience and personalizing content tailored to individual audience members. User experience
and personalization are frequent research topics related to ubiquitous computing. Here, researchers strive to make computers invisible, having them “stay out of the way”, even though they may be everywhere (Weiser and Brown, 1996). As a consequence, computers are being entrusted to make decisions for people and to improve their everyday lives without the technology creating a disturbance. This process can be described as an act of technological
paternalism (Hofmann, 2003), i.e. machines make decisions for individuals using behavioral data and pre-programmed rules that go into action without
the conscious and active consent of the users. Spiekermann and Pallas (2006) account for how paternalism today is accepted in many contexts and societies,
as it is claimed mainly to be in the interest of the user. Paternalism also involves the ethics of technology (Hofmann, 2003). At times, paternalism has been considered positive, such as in medicine, where physicians in some circumstances are able to diagnose patients without talking to them, for example on the basis of biological statistics (Hofmann, 2003). Today, however, the term has predominantly negative connotations, and engineers, scientists and experts are often
“accused” of paternalism when technological solutions compromise the autonomy of individuals (Hofmann, 2003, p. 323). Furthermore, systems that are
paternalistic are described as able to “punish” humans even though the punishment may be in their own interest. To examine these reasons in a news context, privacy agreement texts and cookie consent information collected from 60 news sites, more specifically ten national and ten regional news outlets in each of three countries (the US, the UK, and Sweden), are analyzed within the context of paternalism. Preliminary results
indicate that the provided reasons may not be beneficial for the audience, and therefore in the long term not viable for the media companies themselves.
There are many implications for media companies, since a lack of transparency or of justified reasons may compromise the trust of the audience. Given the access media companies currently have to audience data in its richest form, is it possible to actually fulfil the noble reasons for collecting audience data as stated in the privacy texts?