Content

Maximilian von Grafenstein

The Principle of Purpose Limitation in Data Protection Laws

The Risk-based Approach, Principles, and Private Standards as Elements for Regulating Innovation

1. Edition 2018, ISBN print: 978-3-8487-4897-6, ISBN online: 978-3-8452-9084-3, https://doi.org/10.5771/9783845290843

Series: Schriften zur rechtswissenschaftlichen Innovationsforschung, vol. 12

CC-BY-NC-ND

Bibliographic information
The Principle of Purpose Limitation in Data Protection Laws Maximilian von Grafenstein The Risk-based Approach, Principles, and Private Standards as Elements for Regulating Innovation Schriften zur rechtswissenschaftlichen Innovationsforschung 12 Nomos Schriften zur rechtswissenschaftlichen Innovationsforschung Herausgeber: Professor Dr. Wolfgang Hoffmann-Riem Professor Dr. Karl-Heinz Ladeur Professor Dr. Hans-Heinrich Trute Band 12 BUT_Grafenstein_4897-6.indd 2 20.03.18 11:42 Maximilian von Grafenstein The Principle of Purpose Limitation in Data Protection Laws The Risk-based Approach, Principles, and Private Standards as Elements for Regulating Innovation Nomos BUT_Grafenstein_4897-6.indd 3 20.03.18 11:42 The Deutsche Nationalbibliothek lists this publication in the Deutsche Nationalbibliografie; detailed bibliographic data are available on the Internet at http://dnb.d-nb.de a.t.: Hamburg, Univ., Diss., 2017 ISBN 978-3-8487-4897-6 (Print) 978-3-8452-9084-3 (ePDF) British Library Cataloguing-in-Publication Data A catalogue record for this book is available from the British Library. ISBN 978-3-8487-4897-6 (Print) 978-3-8452-9084-3 (ePDF) Library of Congress Cataloging-in-Publication Data Grafenstein, Maximilian von The Principle of Purpose Limitation in Data Protection Laws The Risk-based Approach, Principles, and Private Standards as Elements for Regulating Innovation Maximilian von Grafenstein 675 p. Includes bibliographic references and index. ISBN 978-3-8487-4897-6 (Print) 978-3-8452-9084-3 (ePDF) 1st Edition 2018 © Nomos Verlagsgesellschaft, Baden-Baden, Germany 2018. Printed and bound in Germany. This work is subject to copyright. All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or any information storage or retrieval system, without prior permission in writing from the publishers. Under § 54 of the German Copyright Law where copies are made for other than private use a fee is payable to “Verwertungs gesellschaft Wort”, Munich. No responsibility for loss caused to any individual or organization acting on or refraining from action as a result of the material in this publication can be accepted by Nomos or the author. This work is licensed under the Creative Commons Attribution-NonCommercial- NoDerivatives 4.0 International License. To view a copy of this license, visit http:// creativecommons.org/licenses/by-nc-nd/4.0/ or send a letter to Creative Commons, PO Box 1866, Mountain View, CA 94042, USA. BUT_Grafenstein_4897-6.indd 4 20.03.18 11:42 To my father The principle of purpose limitation in data protection law is usually considered as a barrier to data-driven innovation. According to this principle, data controllers must specify the purpose of the collection at the latest when collecting personal data and must not process the data in any way that does not comply with the original purpose. Whether the principle of purpose limitation conflicts with data-driven innovation, however, depends on two sub-questions: On the one hand, one has to know how precisely a data controller must specify the purpose and under which conditions the subsequent processing is fully compatible or incompatible with that purpose. On the other hand, one has to understand the effects of a legal principle such as the principle of purpose limitation on innovation processes. 
Surprisingly, despite the long-standing and ongoing debate, there is little research that thoroughly examines the regulatory concept of the principle of purpose limitation, and even less its actual impact on innovation. To close this gap, was the aim of this dissertation, which reflects the debate until January 2017. This dissertation evolved in the context of the interdisciplinary research project “Innovation and Entrepreneurship” at the Alexander von Humboldt Institute for Internet and Society. The main research question of this thesis was the result of hands-on observations in our Startup Clinics that we created and carried out for more than four years in order to empirically research the disabling and facilitating factors of internet-enabled innovation. In the Startup Law Clinic, where I helped more than 100 startups to cope with the legal challenges they faced during their innovation processes, I realised quite early that most of the startup founders were able to do a great variety of things in a very efficient and creative way, except one: Reliably expect what will happen next month, next week, or even the next day. Under these circumstances of knowledge uncertainty, I wondered how these founders should be able to reliably assess what their future data processing purposes would look like. This hands-on observation served as an inspiring research question and pushed me throughout the four years of its production. The result of this research process was in some way even puzzling to me: As a legal principle, the principle of purpose limitation is not only a highly efficient instrument to protect individuals against the 5 risks caused by data-driven innovation but it can even enhance innovation processes of data controllers, when combined with co-regulation instruments. For the inspiring tour de force of these four years, I would like to thank, first and foremost, Prof. Dr. Wolfgang Schulz who not only aroused my interest in regulation as a research discipline but also always immediately and constructively helped me with his oversight, precision in the details and humour. I would also like to especially thank Prof. Dr. Dr. Thomas Schildhauer who has given me the economic perspective on innovation and who in turn has always been pro-actively open to my regulatory viewpoints and ideas. Furthermore, I would like to thank Prof. Dr. Marion Albers, without whose contributions to informational self-determination and data protection my own work would not have been possible, and who compiled the second vote very quickly. Furthermore, I am very thankful and honoured to be included in Prof. Dr. Wolfgang Hoffmann-Riem’s, Prof. Dr. Dr. h.c. Karl-Heinz Ladeur’s and Prof. Dr. Hans-Heinrich Trute’s publication series Legal Research on Innovation (“Rechtswissenschaftliche Innovationsforschung”) on that my dissertation is based on. I would also like to thank the German Ministry of the Interior for the financial support of the publication of my thesis. Finally, I want to thank my colleagues: Elissa Jelowicki, who helped me to revise my thesis throughout the creation process, Jörg Pohle, the “walking library” (I think I do not have to explain that) and all my other colleagues for the endless and inspiring discussions. Last but not least, I am grateful to my wonderful fiancée Eva Schneider, who in countless evenings of discussions helped me to structure my ideas, and above all motivated me to keep on going. To my father 6 Content Overview IntroductionA. 
31 Problem: Conflict between innovation and risk protectionI. 32 Innovation as an economic driver for public welfare1. 32 Protection against the risks of innovation2. 33 Uncertainty about the meaning and extent of the principle of purpose limitation 3. 34 Practical examples referring to two typical scenarios4. 35 Interim conclusion: Uncertainty about the concept of protection and its legal effects 5. 45 Research questions and approachII. 48 Legal research about innovation1. 48 The regulator’s perspective2. 49 Possible pitfalls taking the effects of regulation instruments into account 3. 54 Course of examinationIII. 55 Conceptual definitions as a link for regulationB. 61 Innovation and entrepreneurshipI. 61 Process of innovative entrepreneurship1. 63 Regulation of innovative entrepreneurship2. 71 Data protection as a risk regulationII. 79 Risk terminology oscillating between “prevention” and “precaution” 1. 79 Sociological approaches defining “dangers” and “risks”2. 82 German legal perspectives: Different protection instruments for different types of threat 3. 84 Searching for a scale in order to determine the potential impact of data protection risks 4. 89 7 Theories about the value of privacy and data protectionIII. 91 The individual’s autonomy and the private/public dichotomy 1. 91 Criticism: From factual to conceptual changes2. 94 Nissenbaum’s framework of “contextual integrity”3. 96 Clarifying the relationship between “context” and “purpose” 4. 99 Values as a normative scale in order to determine the “contexts” and “purposes” 5. 105 The function of the principle of purpose limitation in light of Article 8 ECFR and further fundamental rights C. 109 Constitutional frameworkI. 109 Interplay and effects of fundamental rights regimes1. 110 The object and concept of protection of the German right to informational self-determination 2. 144 Different approach of Article 7 and 8 ECFR with respect to Article 8 ECHR 3. 174 The requirement of purpose specification and its legal scaleII. 231 Main problem: Precision of purpose specification1. 231 Criticism: Stricter effects on the private than the public sector 2. 295 Solution approach: Purpose specification as a riskdiscovery process 3. 325 Requirement of purpose limitation in light of the range of protection III. 424 Different models of purpose limitation and change of purpose 1. 425 Solution approach: Controlling risks that add to those specified previously 2. 483 Data protection instruments in non-linear environmentsIV. 513 Scope of application and responsibility (Article 8 sect. 1 ECFR) 1. 514 Content Overview 8 Legitimacy of processing of personal data (Article 8 sect. 2 ECFR) 2. 547 The individual’s “decision-making process” (in light of the GDPR) 3. 563 Empirical approach in order to assist answering open legal questions D. 597 Clarifying different risk assessment methodologiesI. 598 Different objects of risk assessments1. 598 Different assessment methods2. 603 Interim conclusion: Unfolding complexity3. 608 Multiple-case-studies: Combining research on risks with research on innovation processes II. 611 Reason for the case study approach1. 611 Generalizing the non-representative cases2. 613 Designing the case studies3. 614 Researching the effects of data protection instruments in regards to innovation processes III. 616 Enabling innovation: Contexts, purposes, and specifying standards 1. 616 Demonstration on the basis of the examples provided for in the introduction 2. 624 Summary: Standardizing “purposes” of data processing5. 
644 Final conclusion: The principle of purpose limitation can not only be open towards but also enhancing innovation E. 649 Bibliography 655 Content Overview 9 Table of Content IntroductionA. 31 Problem: Conflict between innovation and risk protectionI. 32 Innovation as an economic driver for public welfare1. 32 Protection against the risks of innovation2. 33 Uncertainty about the meaning and extent of the principle of purpose limitation 3. 34 Practical examples referring to two typical scenarios4. 35 Coming from a practical observation: Startups and nonlinear innovation processes a) 36 First scenario: Purpose specification by the controller concerning the use of data of its users b) 37 The unpredictable outcome of entrepreneurial processes aa) 37 Excursus: In which circumstances do data controllers actually need “old” data? bb) 39 Second scenario: The limitation of the later use of data collected by third parties c) 40 No foreseeable negative impact on individualsaa) 40 Negative impact foreseeable on the individualsbb) 42 Interim conclusion: Uncertainty about the concept of protection and its legal effects 5. 45 Research questions and approachII. 48 Legal research about innovation1. 48 The regulator’s perspective2. 49 Possible pitfalls taking the effects of regulation instruments into account 3. 54 Course of examinationIII. 55 Conceptual definitions as a link for regulationB. 61 Innovation and entrepreneurshipI. 61 Process of innovative entrepreneurship1. 63 Key Elements for the entrepreneurial processa) 63 Business Opportunities: Discovery and creationb) 66 11 Strategic management: Causation and effectuationc) 69 Entrepreneurial contexts: The Law as one influencing factor in innovation processes amongst others d) 70 Regulation of innovative entrepreneurship2. 71 Do laws simply shift societal costs either protecting against or being open to innovation? a) 72 Principles between openness toward innovation and legal uncertainty b) 73 Legal (un)certainty as a factor that mediates the regulatory burden aa) 74 Conditioning further legal certainty as a promoting factor for entrepreneurial activity bb) 76 Interim conclusion with respect to the principle of purpose limitation c) 77 Data protection as a risk regulationII. 79 Risk terminology oscillating between “prevention” and “precaution” 1. 79 Sociological approaches defining “dangers” and “risks”2. 82 German legal perspectives: Different protection instruments for different types of threat 3. 84 Protection pursuant to the degree of probabilitya) 85 Protection pursuant to the available knowledge in linear-causal and non-linear environments b) 87 Interim conclusion: Fundamental rights determining the appropriateness of protection c) 88 Searching for a scale in order to determine the potential impact of data protection risks 4. 89 Theories about the value of privacy and data protectionIII. 91 The individual’s autonomy and the private/public dichotomy 1. 91 Criticism: From factual to conceptual changes2. 94 Nissenbaum’s framework of “contextual integrity”3. 96 Clarifying the relationship between “context” and “purpose” 4. 99 Values as a normative scale in order to determine the “contexts” and “purposes” 5. 105 Table of Content 12 The function of the principle of purpose limitation in light of Article 8 ECFR and further fundamental rights C. 109 Constitutional frameworkI. 109 Interplay and effects of fundamental rights regimes1. 
110 The interplay between European Convention for Human Rights, European Charter of Fundamental Rights and German Basic Rights a) 111 The effects of fundamental rights on the private sectorb) 113 Third-party effect, protection and defensive function aa) 114 European Convention on Human Rights(1) 115 Positive obligations with respect to Article 8 ECHR (a) 116 Right to respect for private life under Article 8 ECHR (b) 117 European Charter of Fundamental Rights(2) 118 Market freedoms and fundamental rights(a) 118 The right to data protection under Article 8 ECFR and/or the right to private life under Article 7 ECFR (b) 120 German Basic Rights(3) 125 Protection function of the right to informational self-determination (a) 126 Priority of contractual agreements and the imbalance of powers (b) 129 Balancing the colliding constitutional positions (c) 130 Balance between defensive and protection functionbb) 132 The 3-Step-Test: Assessing the defensive and protection function (1) 133 A first review: decomposing the object and concept of protection (2) 136 Which instruments actually protect which object of protection? (a) 136 Example: “Commercialized” consent threatening the object of protection including… (b) 137 … individuality?(c) 138 Table of Content 13 … solidarity?(d) 139 … democracy?(e) 140 Equal or equivalent level of protection compared to state data processing? cc) 141 Interim conclusion: Interdisciplinary research on the precise object and concept of protection c) 142 The object and concept of protection of the German right to informational self-determination 2. 144 Genesis and interplay with co-related basic rightsa) 145 Autonomous substantial guaranteeb) 148 Right to control disclosure and usage of personal data as protection instrument? c) 152 Infringement by ‘insight into personality’ and ‘particularity of state interest’ d) 158 Purpose specification as the essential link for legal evaluation e) 164 In the public sector: Interplay between the three principles clarity of law, proportionality, and purpose limitation aa) 164 Principles of clarity of law and purpose limitation referring to the moment when data is collected (1) 164 The proportionality test also takes the use of data at a later stage into account (2) 167 In the private sector: The contract as an essential link for legal evaluation bb) 171 Interim conclusion: Conceptual link between ‘privacy’ and ‘data processing’ f) 172 Different approach of Article 7 and 8 ECFR with respect to Article 8 ECHR 3. 174 Genesis and interplay of both rightsa) 175 Concept of Article 8 ECHR: Purpose specification as a mechanism for determining the scope of application (i.e. the individual’s ‘reasonable expectation’) b) 178 Substantial guarantee of “private life”: Trust in confidentiality and unbiased behavior aa) 178 Criteria established for certain cases: Context of collection, nature of data, way of usage, and results obtained bb) 180 Table of Content 14 Particular reference to the individual’s “reasonable expectations” cc) 182 ‘Intrusion into privacy’(1) 183 Public situations: ‘Systematic or permanent storage’ vs. ‘passer-by situations’ (2) 184 ‘Data relating to private or public matters’, ‘limited use’ and/or ‘made available to the general public’ (3) 186 ‘Unexpected use’ pursuant to the purpose perceptible by the individual concerned (4) 188 Consent: Are individuals given a choice to avoid the processing altogether? 
dd) 192 Conclusion: Assessment of ‘reasonable expectations’ on a case-by-case basis ee) 194 Concept of Articles 7 and 8 ECFR: Ambiguous interplay of scopes going beyond Article 8 ECHR c) 195 Comparing the decisions of the European Court of Justice with the principles developed by the European Court of Human Rights aa) 195 General definition of the term ‘personal data’ under Article 7 and 8 ECFR instead of case-bycase approach (1) 195 Differences between private life and data protection under Articles 7 and 8 ECFR (2) 198 Protection against first publication and profiles based on public data (a) 198 Protection against collection, storage, and subsequent risk of abuse (b) 201 Reference to further fundamental rights under Article 7 and/or 8 ECFR (3) 205 Which right is used to discuss other fundamental rights? (a) 206 The answer depends on the type of threat posed (b) 207 Protection in (semi)-public spheres irrespective of ‘reasonable expectations’? (4) 211 Going beyond the requirement of consent provided for under Article 8 ECHR (5) 214 Table of Content 15 Interim conclusion: Article 8 ECFR as a regulation instrument? bb) 217 Location of protection instruments under Article 8 ECFR (1) 217 Protection going beyond Article 8 ECHR(2) 218 Remaining uncertainty about the interplay between Article 7 and 8 ECFR (3) 220 Referring to substantial guarantees as method of interpreting fundamental rights in order to avoid a scope of protection that is too broad and/or too vague cc) 222 The reason for why the scope is too vague: Difference between data and information (1) 223 The reason for why the scope is too broad: Increasing digitization in society (2) 225 Advantages and challenges: ‘Personal data’ as legal link for a subjective right (3) 226 Possible consequence: A legal scale provided for by all fundamental rights which determine the regulation instruments under Art. 8 ECFR (4) 229 The requirement of purpose specification and its legal scaleII. 231 Main problem: Precision of purpose specification1. 231 ECtHR and ECJ: Almost no criteriaa) 232 Requirements provided for by European secondary lawb) 234 Central role of purpose specification within the legal system aa) 235 Scope of protection: ‘Personal data’(1) 236 ‘All the means reasonably likely to be used’ (a) 236 Example: IP addresses as ‘personal data’?(b) 236 The case of “Breyer vs. 
Germany”(c) 238 Liability for ‘data processing’: ‘Controller’ and ‘processor’ (2) 240 Further legal provisions referring to the purpose (3) 241 Criteria discussed for purpose specificationbb) 244 Preliminary note: Clarifying conceptual (mis)understandings (1) 245 Table of Content 16 Legal opinion on the function of the specification of a purpose (2) 247 Legal opinion on the function of ‘making a specified purpose explicit’ (3) 249 Legal opinion on the reconstruction of a purpose and its legitimacy (4) 250 Purposes of processing specified when consent is given cc) 251 Purposes of data processing authorized by legal provisions dd) 252 ePrivacy Directive(1) 252 Data Protection Directive and General Data Protection Regulation (2) 254 Preliminary note: Clarifying conceptual (mis)understandings (a) 255 Legal opinion on ‘performance of a contract’ (b) 257 Legal opinion on ‘legal obligation’, ‘vital interests’, and ‘public task’ (c) 258 Legal opinion on ‘legitimate interests’(d) 259 Transposition of the requirement of purpose specification into German law c) 262 Purposes of processing authorized by the Telecommunication Law aa) 264 Purposes of processing authorized by the Telemedia Law bb) 266 Purposes of processing authorized by the Federal Data Protection Law cc) 269 Three basic legitimate grounds(1) 269 ‘Performance of a contract’, Article 28 sect. 1 sent. 1 no. 1 BDSG (2) 270 ‘Justified interests of the controller’, Art. 28 sect. 1 sent. 1 no. 2 BDSG (3) 271 ‘Generally accessible data’, Art. 28 sect. 1 sent. 1 no. 3 BDSG (4) 272 Privileges and restrictions pursuant to the purpose (5) 273 Table of Content 17 Purposes of processing specified when consent is given dd) 275 Not a waiver but execution of right to informational self-determination (1) 276 Requirements for consent and consequences of its failure (2) 277 Discussion on the degree of precision of a specified purpose (3) 278 Comparison with principles developed by the German Constitutional Court ee) 281 Public sector: Purpose specification as a result of the principle of clarity of law (1) 281 Function of purpose specification (basic conditions) (a) 281 Examples for specific purposes: Certain areas of life or explicitly listed crimes (b) 284 Examples for unspecific purposes: Abstract dangers or unknown purposes (c) 286 Liberalization of the strict requirement by referring to the object of protection (d) 290 Private sector: ‘Self-control of legitimacy’(2) 293 Criticism: Stricter effects on the private than the public sector 2. 295 Difference in precision of purposes specified by legislator and data controllers a) 296 Data processing for undisputed ‘marketing purposes’ authorized by law aa) 297 Disputed ‘marketing purposes’ specified by data controllers bb) 298 Further examples for different scales applied in order to specify the purpose cc) 299 Can the context help interpret a specified purpose?dd) 300 A different scale for ‘purpose specification’ pursuant to the German concept of protection ee) 301 Interim conclusion: Do regulation instruments dictate the scale for ‘purpose specification’? 
ff) 303 Table of Content 18 Further ambiguities and possible reasons behind the same b) 304 Common understanding about the function of ‘purpose specification’ aa) 305 Ambiguous understanding regarding the functions of ‘making specified purpose explicit’ bb) 306 Arguable focus on data collection for legal evaluation in the private sector cc) 307 Arguable legal consequences surrounding the validity of the consent dd) 310 The lack of a legal scale for ‘purpose specification’ in the private sector c) 312 No legal system providing for ‘objectives’ of data processing in the private sector aa) 313 Differentiating between the terms ‘purpose’, ‘means’ and ‘interest’ bb) 315 ‘Interests’ protected by the controller’s fundamental rights (1) 316 Is the ‘purpose’ determined by the individual’s fundamental rights? (2) 318 Inclusion or exclusion of future ‘purposes’ and ‘interests’ bb) 320 Present interests vs. future interests(1) 321 Purpose specification pursuant to the type of threat? (2) 323 Summary of conceptual ambiguitiesd) 324 Solution approach: Purpose specification as a riskdiscovery process 3. 325 Regulative aim: Data protection for the individual’s autonomy a) 327 Intermediate function of data protectionaa) 328 Different functions of rights (opacity and transparency) (1) 329 Disconnecting the exclusive link between data protection to privacy (2) 331 Data protection for all rights to privacy, freedom, and equality (3) 334 Table of Content 19 Purpose specification as a risk regulation instrument bb) 336 ‘A risk to a right’: Quantitative vs. qualitative evaluation? (1) 337 Challenges of bridging risks to rights(a) 338 Example: German White Paper on DPIA(b) 339 Criticism: Incoherence of current risk criteria (c) 341 Purpose specification discovering risks posed to all fundamental rights (2) 343 Pooling different actions together in order to create meaning (a) 343 Separating unspecific from specific risks (first reason why data protection is indispensable) (b) 345 Central function with respect to all fundamental rights (second reason why data protection is indispensable of data protection) (c) 348 Function of making specified purposes explicit(3) 350 Interim conclusion: Refining the concept of protection cc) 353 Tying into the Courts’ decisions and European legislation (1) 353 Advantages compared to existing (unclear) concepts of protection (2) 356 Effectiveness and efficiency of protection instruments (a) 356 Appropriate concept for innovation processes (b) 357 Excursus: Objective vs. 
subjective risks(c) 359 Fundamental rights which determine purpose requirements b) 361 Right to privacy (aka ‘being left alone’)aa) 361 Unfolding specific guarantees of privacy(1) 362 At home: Protection of ‘haven of retreat’(a) 363 Using communications: Protection against ‘filtering opinions’ (b) 365 Table of Content 20 “Privacy in (semi)-public spheres”: Protection against the risks of later usage of data (c) 366 Necessity requirement, irrespective of inconvenience (2) 370 ‘Framing’ privacy expectations(3) 371 Research on the individual’s decision making process (consent) (a) 372 First example: The legislature’s considerations on the use of ‘cookies’ (b) 374 Second example: Considerations surrounding ‘unsolicited communications’ (c) 375 Right to self-determination in publicbb) 377 Clarification of substantial guarantees(1) 377 First publication: Strict requirements(2) 378 Necessity of publication(a) 379 Strict requirements for consent(b) 380 Re-publication: Weighing ‘interests’ against ‘old and new purposes’ (3) 382 Misconceptions in the decision of “Mr. González vs. Google Spain” (a) 383 Excursus: Case law provided for by the German Constitutional Court (b) 385 Conclusion in regards to the decision of “Mr. González vs. Google Spain” (c) 387 Internal freedom of developmentcc) 389 Does the German right to informational selfdetermination provide for such a guarantee? (1) 389 Discussion on such a substantial guarantee(2) 392 Articles 7 and/or 8 ECFR: Information pursuant to insights into personality and possibilities of manipulation (3) 394 Table of Content 21 Specific rights to freedomdd) 397 Focus on the collection of data: Omission by the individual of exercising their rights out of fear (1) 398 Considerations of the Courts with respect to the freedom of expression and the individuals risk of being unreasonably suspected by the State (a) 398 Considerations on further rights of freedom(b) 400 Focus on the later usage of data or information: Restriction or hindrance of exercise of rights of freedom through usage of data or information (2) 403 Interim conclusion: How “privacy in public” can be further determined (3) 404 Specific contexts of collection of personal data (a) 405 Later use of personal data in the same context (b) 407 Protection instruments enabling the individual to adapt to or protect him or herself against the informational measure (c) 411 Rights to equality and non-discriminationee) 417 In the public sector: Criteria for intensity of infringement (1) 417 In the private sector: ‘Tool of opacity’ vs. private autonomy? (2) 418 Interim conclusion: Additional legitimacy requirement for the data-based decision-making process (3) 420 Conclusion: Purpose specification during innovation processes c) 422 Table of Content 22 Requirement of purpose limitation in light of the range of protection III. 424 Different models of purpose limitation and change of purpose 1. 425 European models: ‘Reasonable expectations’ and purpose compatibility a) 425 Change of purpose pursuant to ECtHR and ECJaa) 426 ECtHR: ‘Reasonable expectations’ as a main criteria (1) 426 ECJ: Reference to data protection instruments instead of ‘reasonable expectations’ (2) 428 Are the terms ‘necessity’, ‘adequacy’ and ‘relevance’ used as objective criteria for the compatibility assessment? (a) 429 Purpose identity for the consent(b) 430 Compatibility assessment required by the Data Protection Directive with respect to the opinion of the Art. 
29 Data Protection Working Party bb) 431 Preliminary analysis: Pre-conditions and consequences (1) 432 Example: The expectations of a customer purchasing a vegetable box online (2) 435 Criteria for the substantive compatibility assessment (3) 436 First criteria: ‘Distance between purposes’(a) 436 Second criteria: ‘Context and reasonable expectations’ (b) 437 Third criteria: ‘Nature of data and impact on data subjects’ (c) 439 Fourth criteria: ‘Safeguards ensuring fairness and preventing undue impact’ (d) 441 Excursus: Compatibility of ‘historical, statistical or scientific purposes’ (4) 444 Specification of the compatibility assessment (even prohibiting positive effects) (a) 444 Safeguards corresponding to the characteristics of the purposes (b) 445 Table of Content 23 Hierarchy of safeguards: From anonymization to functional separation (c) 446 Purpose identity required by the ePrivacy Directivecc) 447 Strict purpose identity for the processing of ‘communication data’, ‘traffic data’ and ‘location data other than traffic data’ (1) 447 The individual’s consent as an exclusive legal basis for a change of purpose (2) 448 Interim conclusion: A lack in the legal scale for compatibility assessment dd) 449 German model: Purpose identity and proportionate change of purpose b) 452 Change of purpose in the private sector pursuant to ordinary law aa) 453 Strict purpose identity required by Telemedia Law and Telecommunication Law (1) 453 The more nuanced approach established by the Federal Data Protection Law (2) 454 Comparison with the principles developed by the German Constitutional Court for the public sector bb) 457 Strict requirement of purpose identity limiting the intensity of the infringement (1) 458 Proportionate change of purpose(2) 461 Identification marks as a control-enhancing mechanism (3) 466 Alternative concepts provided for in German legal literature cc) 467 Purpose identity and informational separation of powers (1) 468 Purpose specification by the individual instead of the controller (a) 469 Principle of purpose limitation and informational separation of powers (b) 470 Example of re-registration: Collection and transfer of data on the citizen’s request (c) 472 Compatibility of purposes(2) 473 Criticism of the “subjective” purpose approach (a) 473 Table of Content 24 Compatibility instead of identity of purposes (b) 474 Supplementing protection instruments(c) 475 Purpose identity and change of purpose as ‘a threshold for duty of control‘ (3) 476 Criticism of purpose compatibility(a) 477 Specification, identity and change of purpose as equivalent regulation instruments (b) 477 The opposing fundamental rights providing for the objective legal scale (c) 478 Interim conclusion: Right to control data causing a ‘flood of regulation’ dd) 479 Solution approach: Controlling risks that add to those specified previously 2. 
483 Conceptual shift: From the exclusion of unspecific risks to the control of specific risks a) 483 Different types of changes of purpose in light of different types of risks aa) 484 Purpose compatibility as an “umbrella assessment” (1) 484 Custer’s and Ursic’s taxonomy: “Data recycling, repurposing, and recontextualization” (2) 486 Clarification of an objective scale: “Same risk, higher risk, and another risk” (3) 489 Refinement of current concepts of protectionbb) 490 Article 8 ECFR and European secondary law(1) 490 “Purpose identity” forbidding additional risks (than specified before) (a) 491 Further protection instruments that can avoid purpose incompatibility (b) 491 Systemizing the criteria for the compatibility assessment (c) 493 Right to private life under Article 8 ECHR and the right to informational self-determination (2) 496 Applying a ‘non-linear perspective’cc) 497 Table of Content 25 Substantial guarantees: Providing criteria for a compatibility assessment b) 499 Right of ‘being left alone’: ‘Reasonable expectations’ determined by risks aa) 500 Self-representation in the public: A balancing exercise instead of purpose determination bb) 503 Internal freedom of development: Specific instead of preliminary information cc) 505 External freedoms of behavior: Purpose identity as one potential element amongst several protection instruments dd) 507 Equality and non-discrimination: Specifying incompatible purposes in the course of social life ee) 508 Conclusion: Purpose limitation in decentralized data networks c) 510 Data protection instruments in non-linear environmentsIV. 513 Scope of application and responsibility (Article 8 sect. 1 ECFR) 1. 514 Problems in practice: A balance between too much and too little protection a) 515 How data may be related to an individualaa) 515 Anonymization of personal databb) 518 Again: The problem of a “yes-or-no-protection” solution cc) 521 Alternative solution: Scope(s) pursuant to the type of risk b) 522 Theoretical starting point: Different levels of protection aa) 523 Pro and cons for precautionary protection against abstract dangers (1) 524 Abstract precautionary protection only in cases of special danger (2) 525 Advantages of a nuanced approach(3) 527 Differentiating between the general scope of protection and the application of specific protection instruments bb) 530 General scope of protection enabling specification of purpose (aka risk) (1) 531 Table of Content 26 Application of protection instruments determined by specific risks (2) 532 Rights to privacy(a) 533 Right of self-representation in the public(b) 534 Internal freedom of behavior(c) 535 Rights to freedom and non-discrimination(d) 538 Again: General scope of protection requiring data security (against unspecific risks) (3) 539 Excursus: Responsibility (“controller” and “processor”)c) 542 Cumulative responsibility for precautionary protection (1) 544 Cooperative responsibility for preventative protection (2) 545 Legitimacy of processing of personal data (Article 8 sect. 2 ECFR) 2. 
547 Same measures but differently applied in the public and private sector a) 548 Different risks in the public and private sectoraa) 549 Example: Requirements to specify the purpose and limit the processing at a later stage bb) 552 Legal-technical constraints surrounding the prohibition rule cc) 553 Possible approaches of regulation in the private sectorb) 554 Classic instruments: Specific legal provisions, broad legal provisions, and/or consent aa) 555 Conceptual shift: From a legal basis to ‘legitimacy assessment’ bb) 556 Side note: State regulated self-regulation increasing legal certainty cc) 558 Interplay of consent and legal provisionsdd) 560 Interim conclusion: Balancing the colliding fundamental rights c) 562 The individual’s “decision-making process” (in light of the GDPR) 3. 563 Static perspective: Opt-in or opt-out procedure for consent? a) 565 Classic discussion regarding current data protection laws aa) 565 Table of Content 27 Further approaches considered by the legislator and Constitutional Courts bb) 567 Requirements illustrated so far, with respect to different guarantees cc) 569 Dynamic perspective: Interplay of several protection instruments b) 570 Consent: “Later processing covered by specified purpose?” aa) 570 Risks as object of consent (not data)(1) 572 Extent of consent limiting the later use of data (instead of being illegal as a whole) (2) 574 Change of purpose: Opt-out procedures for higher and opt-in procedures for other risk (3) 577 Clarifying recital 50 GDPR: “Separate legal basis if purpose not compatible” bb) 579 Arg. ex contrario: Is an incompatible purpose legal on a separate legal basis? (1) 580 Differentiating between “not compatible” and “incompatible” purposes (2) 581 Assessment of safeguards that ensure that purposes do not (definitely) become incompatible (3) 581 Legal basis and opt-out: Change of purposecc) 582 Opt-out: A risk-reducing protection instrument(1) 583 Examples: New risks not covered by consent (in light of the specified purpose) (2) 584 Examples: New risks not covered by a former applicable provision (3) 585 Information duties and further participation rightsdd) 586 Controller’s duties of information(1) 587 Data collection: Customizing information in relation to daily decision-making processes (a) 588 Change of purpose: Interpreting information duties regarding specific risks (b) 589 Profiling and automated decision-making(c) 589 Individual’s right to rectification(2) 592 Conclusion: Specifying the decision-making process (Art. 24 and 25 GDPR) c) 592 Table of Content 28 Empirical approach in order to assist answering open legal questions D. 597 Clarifying different risk assessment methodologiesI. 598 Different objects of risk assessments1. 598 Risk-based approach of purpose specification and limitation (Art. 5 sect. 1 lit. b GDPR) a) 598 Data Protection Impact Assessment (Art. 35 GDPR)b) 599 Further methodologies (technology assessment and surveillance impact assessment) c) 601 Different assessment methods2. 603 Examining abstract constitutional positions from a social science perspective a) 604 Pre-structuring interests through multiple-stakeholder and expert participation b) 605 Specifying ‘decision-making process’ by user-centered development of data protection-by-design c) 605 Interim conclusion: Unfolding complexity3. 608 Multiple-case-studies: Combining research on risks with research on innovation processes II. 611 Reason for the case study approach1. 611 Generalizing the non-representative cases2. 
613 Designing the case studies3. 614 Researching the effects of data protection instruments in regards to innovation processes III. 616 Enabling innovation: Contexts, purposes, and specifying standards 1. 616 Enabling data controllers to increase legal certaintya) 617 Enhancing competition on the “data protection” marketb) 617 Remaining questions in relation to the effects of legal standards c) 620 Demonstration on the basis of the examples provided for in the introduction 2. 624 Example of “personalized advertising”a) 624 Preliminary legal analysisaa) 624 Initial product and business model: Internal freedom of development (1) 625 Change of product and business model: No substantive change of purpose (2) 626 Table of Content 29 Open legal questions (‘propositions’)bb) 627 Standardization of “personalized marketing” purpose (1) 628 Competitive advantage(2) 629 Example of “anonymized data for statistic/research purposes” b) 630 Preliminary legal analysisaa) 630 Processing of public personal data: Selfdetermination in public (1) 630 The taxi driver: Attributing anonymized data to passengers (2) 631 Open legal questions (‘propositions’)bb) 633 Standardization of “statistical” or “scientific” purposes (1) 633 Competitive advantage(2) 635 Example of “scoring in the employment context”c) 636 Preliminary legal analysisaa) 636 Re-publication of personal data: fair balance instead of a priority rule (1) 637 Freedom to find an occupation: Participation instruments (2) 639 Open legal questions (‘propositions’)bb) 642 Standardization of “profiling potential employees” (1) 642 Signaling legal certainty (to the “workers’ council”) (2) 643 Summary: Standardizing “purposes” of data processing5. 644 Final conclusion: The principle of purpose limitation can not only be open towards but also enhancing innovation E. 649 Bibliography 655 Table of Content 30 Introduction Dating back to the early discussions regarding the concept of data protection, the so-called “principle of purpose limitation” is one of the fundamental principles of data protection law.1 The principle essentially requires that personal data may only be processed for the original purpose of collection of the data,2 or in the words of the OECD Privacy Guidelines, at least, so long as it is not incompatible with the original purpose.3 In light of our ever increasing digitization of society, the principle of purpose limitation is more and more debated amongst legal scholars.4 The most recent motivations behind these discussions arose because the European Council’s draft of the General Data Protection Regulation was leaked in the beginning of 2015 by the non-profit association European Digital Rights (EDRi).5 Article 6 sect. 4 of the European Council’s draft widely abandoned the principle of purpose limitation by stating that personal data can be used, even if it is incompatible with its original purpose, so long as it can be based on a legal provision in accordance with Article 6 sec 1 lit a-e. An exception to this rule is Article 6 sect. 1 lit. f of the draft, which provides that the collection of data is legal if it is “necessary for the purposes of the legitimate interests pursued by the controller (underlining by the author)”. Only if the collection of personal data is based on this provision, A. 1 See Article 29 Data Protection Working Group, Opinion 03/0213 on purpose limitation, pp. 4 and 6 ff.; Handbook on European data protection law, p. 
68; De Hert and Gutwirth, Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action, p. 4; Bygrave, Data Privacy Law, p. 153; v. Zezschwitz, Concept of Normative Purpose Limitation, cip. 1; Pohle, Purpose limitation revisited, p. 141; contrary, Härting, Purpose limitation and change of purpose in data protection law, who affirms the requirement of purpose limitation only applicable to the legislator but not to the data controller. 2 Cf. v. Zezschwitz, Concept of Normative Purpose Limitation, cip. 14. 3 See no. 9 of part two of the OECD Privacy Framework, p. 14. 4 See, instead of many, Cate/Cullon/Viktor Mayer-Schönberger, Data Protection Principles for the 21st Century, p. 11. 5 See the documents linked by Naranjo, Leaked documents: European data protection reform is badly broken, retrieved on the 2nd of February 2016 from https://edri.org/ broken_badly/. 31 then the principle of purpose limitation should apply.6 European Digital Rights particularly criticized this extensive abandonment of the principle of purpose limitation because it would undermine “control and predictability” as “the core of data protection”.7 In essence, this doctoral thesis addresses the question of whether this consideration is true or not, or from a more academic point of view, what the function of the principle of purpose limitation actually is. Problem: Conflict between innovation and risk protection From an academic perspective, there are two main aspects of the principle of purpose limitation that are particularly interesting: Firstly, the principle of purpose limitation appears to conflict with the societal needs for innovation and is the perfect example of a more general conflict for the regulators: How can the legislator enable or enhance innovation and, simultaneously, protect against its risks? The second aspect refers to the uncertainty of how to apply the principle of purpose limitation in general. Only if the principle of purpose limitation was clear and we knew what is actually meant, would it be possible to answer the preceding question. Innovation as an economic driver for public welfare A multitude of international studies and policy recommendations brings the importance of innovation for the public welfare more and more into public debate. For instance, the OECD Science, Technology and Industry Outlook 2014 considers: “Innovation is a major driver of productivity and economic growth and is seen as a key way to create new business values.”8 Another OECD report focusing on data-driven innovation considers its positive effects as “significantly accelerating research and the development of new products, processes, organisational methods and markets”.9 I. 1. 6 See Grafenstein, The Principle of Purpose Limitation between Openness toward Innovation and the Rule of Law, DuD 2015 (12), p.789. 7 See EDRi / access / Privacy International / Fundacja Panoptykon: Data Protection Broken Badly. 8 See OECD Science, Technology and Industry Outlook 2014, p. 21. 9 See OECD: Data-Driven Innovation for Growth and Well-Being. A. Introduction 32 The World Economic Forum draws, in its 2014 report on how to enhance Europe’s competitiveness, the attention to entrepreneurship as the key source of innovation.10 From an entrepreneurial perspective, however, the law is usually not perceived as a driver of but rather barrier for innovation. The Eurobarometer on “Entrepreneurship in the EU and beyond” surveyed that a “large majority of respondents (..) 
agreed that business start-ups were difficult due to complex administrative procedures: 71%, in total agreed and 29% strongly agreed.”11 Similarly, the Global Entrepreneurship Monitor 2014 surveyed, amongst others, “the lowest evaluation corresponded to government policies toward regulation”.12 Protection against the risks of innovation This perception corresponds to the general view amongst innovation researchers who consider that the law actually acts as a barrier rather than as a pro-active instrument which would influence and develop, besides other factors, the process of innovation. The reason for this perception might be that the term “innovation” usually refers to something unexpected and new, while the law seeks to guarantee a certain and expected outcome.13 The principle of purpose limitation restraining the later use of personal data to the original purpose of collection indeed appears to be diametrically opposed to such unexpected outcomes of innovation. However, the public discussion also recognizes the risks caused by innovation. The abovementioned OECD report not only considers the positive effects of datadriven innovation but also its risks, in particular, for privacy and security.14 Having applied a “bottom-up cultural analysis of historical, philosophical, political, sociological, and legal sources”, Solove elaborated in his book Understanding Privacy on a taxonomy of 16 privacy risks and/or harms, from the collection of information to its processing and distribution as well as invasion.15 In this regard, two terminological issues shall briefly 2. 10 See World Economic Forum: Insight Report: Enhancing Europe's Competitiveness – Fostering Innovation-Driven Entrepreneurship in Europe. 11 See Eurobarometer: Entrepreneurship in the EU and beyond, p. 75. 12 See Singer et al., Global Entrepreneurship Monitor – 2014 Global Report, p. 14. 13 See Eifert, Innovation-enhancing Regulation, p. 11 and 12; cf. also Lipshaw, Why the Law of Entrepreneurship Barely Matters. 14 See OECD: Data-Driven Innovation for Growth and Well-Being. 15 Solove, Understanding Privacy, pp. 101 ff. as well as 171 ff. I. Problem: Conflict between innovation and risk protection 33 be clarified: so far, this thesis does not (yet) differentiate between the terms data and information;16 second, except of this differentiation, this doctoral thesis does not make a difference between the terms “processing”, “treatment”, “use” and “usage” of data and/or information. In any case, the study “Commercial Digital Surveillance in Daily Life” summarizes the most common or, at least, commonly known cases of data mining techniques (for example, predictive analytics about one’s pregnancy, status of relationship or emotional state of mind based on purchase behavior, Facebook likes or keyboard usage patterns) and its commercial exploitation in the insurance, finance or HR industry.17 Boyd and Crawford stress in particular the high subjectivity and potential inaccuracy of those data mining techniques.18 The regulator must thus not only seek to enable and enhance innovation but also to protect against the risks caused by innovation.19 In conclusion, the question therefore is which role the principle of purpose limitation plays within this regulatory conflict between enhancing innovation and protecting individuals against its risks. 
Uncertainty about the meaning and extent of the principle of purpose limitation This leads to the second reason that makes an academic examination of the principle of purpose limitation interesting: the uncertainty about its precise meaning and extent. In order to apply the principle of purpose limitation, it is necessary to determine the original purpose of collection. The main question hence is how precisely the original purpose must or, vice versa, how broadly it can be specified: the wider that the original purpose is specified, for example, the purpose of money making, the broader the scope of action will be for the controller and/or others to be able to use that data for the same purpose.20 However, the question how precisiely a 3. 16 See the differentiation below under point C. I. 3. c) cc) (1) “The reason for why the scope is too vague: Difference between data and information”. 17 See Christl, Commercial Digital Surveillance in Daily Life. 18 Boyd and Crawford, Critical Questions for Big Data, pp. 666 ff. 19 See Hoffmann-Riem, Innovation Responsibility, p. 16. 20 See Forgó et al., Purpose Specification and Informational Separation of Powers, p. 34; Mehde, Handbook of European Fundamental Rights, cip. 24; in contrast, see Bygrave, Data Privacy Law, p. 155, who considers this first component of the principle of purpose limitation “relatively free of ambiguity”. A. Introduction 34 processing purpose must be specified is an open question. Comparably, regarding the second component of the principle of purpose limitation, i.e. the question of under which conditions another (later) purpose is compatible with the original purpose, there are only few reliable criteria, if at all, that help really answer this question. The Article 29 Data Protection Working Party refers in its “Opinion 03/2013 on purpose limitation” to a bundle of criteria (see, now, also Art. 6 sect. 4 GDPR) such as the relationship between the original purpose and the further processing, the context of collection, the nature of the data and the impact caused by the later use on the individual, as well as the safeguards applied in order to prevent any undue impact.21 However, these criteria also pose two problems: First, each criteria lacks an objective scale which would help to determine, for instance, the “relationship” between the purposes; and second, the fact that all criteria together can be used as an entire basis to reach a decision, produces different results amongst decision makers who weigh the criteria against each other. Interestingly, there is little academic literature on the precise meaning and extent of the principle purpose limitation that allows one, in light of the fundamental rights concerned, to determine reliable criteria.22 This is particularly the case since most of the publications refer to the processing of personal data by the State, and not in the private sector, which is what this thesis focuses on. Practical examples referring to two typical scenarios Both aspects, i.e. the appearing conflict of the principle of purpose limitation together with the openness of innovation processes, and the ever increasing uncertainty about how to apply this principle within our current technological environment, result from the ambiguity of the current legal concept of protection. The following examples shall give the reader of this thesis an impression of the effects of this ambiguity in today’s business world. 4. 21 See Opinion 03/2013 on purpose limitation, pp. 23 to 27. 
22 See only Hofmann, Purpose Limitation as Anchor Point for a Procedural Approach in Data Protection; Forgó et al., Purpose Specification and Informational Separation of Powers; Eifert, Purpose Compatibility instead of Purpose Limitation; Albers, Treatment of Personal Information and Data, cip. 123. I. Problem: Conflict between innovation and risk protection 35 Coming from a practical observation: Startups and non-linear innovation processes Practically, for the past three years, I have often discussed this issue with founders of Internet-enabled startups in the Startup Law Clinic of the Alexander von Humboldt Institute for Internet and Society (HIIG), and the specific legal challenges they face in trying to develop and implement their business model in today’s society.23 The Startup Law Clinic is part of the interdisciplinary research project Innovation and Entrepreneurship.24 Based on empirical data gathered in these Startup Clinics, the research project aims to understand, on a more efficient level, Internet-enabled entrepreneurship. In doing so, the project focuses on Internet-enabled startups that are, pursuant to some business observers “turning the conventional wisdom about entrepreneurship on its head.”25 For instance, Blank observes that startups differ to traditional larger companies, amongst other aspects, in how they react or adapt to uncertainties: While traditional companies create long-term business plans based on the “assumption (..) that it’s possible to figure out most of the unknowns of a business in advance” and then execute such plans, step-by-step, according to the so-called waterfall principle, “lean” startups search for a business model going “quickly from failure to failure, all the while adapting, iterating on, and improving their initial ideas as they continually learn from customers.”26 Such a methodological difference does not mean that traditional larger companies are not able to apply the same methods as startups do. In contrast, authors like Blank, as well as Ries, argue that traditional companies more and more apply this methodology.27 However, startups are known to apply this methodology most rigorously in light of the particular uncertainty they face. Ries, at least, defines a startup, amongst others, as being “designed to confront situations of extreme uncertainty.”28 Unlike a “clone of an existing business”, an innovative startup is always looking for “novel scientific a) 23 See the preliminary findings in the Working Paper by Dopfer et al., Supporting and Hindering Factors for Internet-Enabled Startups, pp. 23. 24 See the description of the research project retrieved on the 4th of February 2016 from: http://www.hiig.de/en/project/innovation-and-entrepreneurship/ 25 See Blank, Why the Lean Start-Up Changes Everything; cf. also Blank, Four Steps to the Epiphany, as well as Ries, The Lean Startup. 26 See Blank, ibid. 27 See Blank, ibid.; Ries, ibid, pp. 36 and 37. 28 See Ries, ibid., p. 38. A. 
Introduction 36 discoveries, repurposing an existing technology for a new use, devising a new business model that unlocks value that was hidden, or simply bringing a product or service to a new location or a previously underserved set of customers” and, thus, confronted with constant change.29 Indeed, this phenomenon also became apparent in the Startup Law Clinic.30 Therefore, with respect to startups developing their business models based on the processing of personal data, it was interesting to figure out how far they were, in effect, able to apply the principle of purpose limitation. Not surprisingly, there essentially were two types of cases particularly relevant when seeking to find an answer to this question: The first case refers to situations where startups want to process data of its own users but cannot yet specify the purpose of the later processing; the second case concerns situations where startups want to process personal data that was originally collected by a third party. In this second case, the problem for the startups was not only their own inability to specify the new purposes, but also the high uncertainty about the precise meaning and extent of the legal requirement to restrict their processing to the purposes initially specified by the third party when the data was first collected. First scenario: Purpose specification by the controller concerning the use of data of its users In the first case, the main problem exists in the controller’s limitations to specify the purpose of collection. The main reason for this limitation is the openness of its entrepreneurial process. The following example shall illustrate this process and the resulting problem with respect to the requirement of purpose specification. The unpredictable outcome of entrepreneurial processes One startup, which exemplifies this conceptual issue, was started in early 2014 with the idea to develop a wallpaper app for smartphones with android operating systems. Android operating systems allow the user (and their apps) to interact on the home screen of the smartphone with the unb) aa) 29 See Ries, ibid., p. 38. 30 Cf. Dopfer et al., Supporting and hindering factors for internet-enabled startups. I. Problem: Conflict between innovation and risk protection 37 derlying interface. In essence, the mobile app enabled its user to choose different background pictures (via a double tap on the home screen), to zoom into certain parts of the picture, to fade out to full screen, to like and to share it. The pictures were tagged with certain categories such as “red” for the main colour or “car” for the theme so that they could be matched with profiles of the users. The startup wanted to create these profiles in order to deliver image advertising pursuant to the users’ usage behavior. The startup’s business model consisted in the revenues received from its advertising partners paying for the personalized advertising space. So far, this purpose, the collection of personal data for advertising as explained before, and the way of how this data was processed, could easily be specified before the start of the closed beta test using 20 users. Indeed, as a result of the closed beta test, the startup decided in the middle of 2014, to broaden its concept: Instead of a pure wallpaper app, the app should become a new media format enabling its users to explore different kinds of media. 
The wallpaper picture should serve as the visual entry point for the user to follow, still via the double tap on the home screen, a link to the actual media format such as the new album of a music band, a newspaper article or, still, the image advertising. Even if this concept was still based on the profiles of the app users, the business model had now changed. Now, not only advertisers were supposed to pay for advertising space, but additional business partners, such as newspapers and music publishers, were also supposed to pay the startup a percentage of the price received for selling their online offers to the app users. Hence, the question was whether or not the original purpose still covered the new purpose and the processing operations. Taking possible later changes into account, the startup had, in the first Startup Law Clinic session before the closed beta test, used an umbrella term: Before the startup specifically described the concrete purpose, data and means of the processing, it had clarified that the whole processing pursues the purpose of "personalized marketing". However, the Article 29 Data Protection Working Party stated in its "Opinion 03/2013 on the principle of purpose limitation" that the term "marketing purposes" would be too broad.31

31 See Article 29 Data Protection Working Party, Opinion 03/2013 on purpose limitation, p. 16.

In the course of the following months, the startup started an open beta test for its app, which quickly reached 30.000 users, and therefore looked for further private investors. However, the search for a working business model remained very difficult. In April 2015, the startup, now having around 100.000 users, joined a round table discussion with finance experts organized by the HIIG Business Model Innovation Clinic. On this occasion, one founder of the startup gave a short presentation, in particular, about the success regarding user growth and the ongoing struggle to find a functioning business model. After a brief discussion, one finance expert offered a solution to the problem: Why spend so much effort on finding the business model if user growth was still exploding? The expert's advice was simply to focus, for the time being, on user growth. The expert went on to advise that only once the number of users was large enough should the startup find out which revenue model would work later on. Equipped with such advice, the startup was indeed not able to definitely specify the purpose of its later use of the collected data. Even the broad purpose of "personalized marketing" was just a guess. At the beginning of 2016, the startup had 180.000 users and was still looking for its business model.

bb) Excursus: In which circumstances do data controllers actually need "old" data?

This example of an iterative development process for a mobile app illustrates how difficult, if not impossible, it may become to specify the purpose of all later processing operations when the data is collected. However, data-driven innovation does not require, in general, that the entrepreneur be able to use all personal data that has ever been collected. On the contrary, for many innovations it may be sufficient to use data that was only recently collected: If the quality of the data gathered by the startup is just good enough or the user base just large enough, the startup might be able to find its business model or even deliver personalized marketing on an "almost-real time" basis.
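To make this excursus more concrete, the following minimal sketch is purely illustrative and not part of the original legal analysis; the event structure, the 30-day retention window and all names are hypothetical assumptions. It merely shows, in Python, how a controller could confine "almost-real time" personalized marketing to recently collected usage data instead of relying on all data ever collected.

from datetime import datetime, timedelta

# Hypothetical usage events: (user_id, tag, timestamp); the tags mirror the
# picture categories mentioned above, e.g. "red" or "car".
RETENTION_WINDOW = timedelta(days=30)  # assumed window, not a legal requirement

def recent_interest_profile(events, user_id, now=None):
    """Build an interest profile from recently collected events only.

    Events older than the retention window are ignored, illustrating that
    'almost-real time' personalization may not require keeping 'old' data.
    """
    now = now or datetime.now()
    cutoff = now - RETENTION_WINDOW
    profile = {}
    for uid, tag, timestamp in events:
        if uid == user_id and timestamp >= cutoff:
            profile[tag] = profile.get(tag, 0) + 1
    return profile

# Usage example: only the event within the window contributes to the profile.
events = [
    ("u1", "car", datetime.now() - timedelta(days=400)),  # 'old' data, ignored
    ("u1", "red", datetime.now() - timedelta(days=2)),    # recent data, counted
]
print(recent_interest_profile(events, "u1"))  # {'red': 1}

Whether such a retention window would satisfy the requirement of purpose specification in a particular case is, of course, exactly the legal question discussed in this chapter; the sketch only illustrates the technical point that recent data may suffice.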
In conclusion, even if an iterative process principally hinders entrepreneurs from specifying the purpose of a later processing, this need not necessarily be so in each particular case.

c) Second scenario: The limitation of the later use of data collected by third parties

As mentioned previously, the second constellation refers to controllers processing personal data that another entity originally collected. In these cases, the problem is not only the iterative entrepreneurial process itself, which hinders the controller from specifying the purpose of the later processing. Rather, the purpose originally specified by another entity might hinder the controller in its entrepreneurial process. Indeed, it is characteristic for a law to hinder someone's action in order to protect someone else.32 However, the essential point here is to illustrate the uncertainty accompanying entrepreneurial activity when controllers seek to apply the principle of purpose limitation. Two further examples shall illustrate this conceptual uncertainty.

32 Cf. Hoffmann-Riem, Openness toward Innovation and Responsibility for Innovation by means of Law, p. 258.

aa) No foreseeable negative impact on individuals

The first example concerns a startup that retrieved personal data from social network communities such as Facebook and Twitter via a public API in order to create so-called social heat maps. The social heat map was designed to predict not only the places, but also how many people would be at a certain establishment, at what time, and for what reason. One business idea of the startup was to sell this information to taxi drivers, enabling them to plan their driving routes more efficiently. In order to achieve this objective, the startup transferred data from the social networks' servers to its own servers. The transferred datasets contained data relating to the geo-locations of events organized by users via the networks, as well as of users themselves sending a signal from where they were (so-called check-ins). The moment the data was transferred to the startup's own servers, a self-learning algorithm sorted out the specific data that was useful in order to create the social heat map. So far, the participants of the Startup Law Clinic sessions could not see a negative impact on the users concerned. Indeed, it was the opposite. The participants could actually only see a positive effect in that the users would possibly be more likely to find a taxi, for example, when they come out of a concert or a restaurant. The participants came to this conclusion in particular because the startup anonymized the data the moment it had retrieved it from the social networks (via the public API). However, with respect to the current data protection framework, the problem was that the data was not made anonymous before its retrieval. This led to Directive 95/46/EC (Data Protection Directive) being applicable, in principle.33 As a consequence, two legal issues arose. The first issue concerned the legitimate basis of the data processing intended by the startup. Social networks usually base their processing of data on their users' consent. However, the consent given produced two problems. On the one hand, the consent may not cover the later use of the data intended by the startup, because the social network could not foresee the later usage.
On the other hand, the purpose may be specified so broadly that it runs the risk of not being sufficiently precise (e.g. the purpose of 'transfer to third parties'). Therefore, the startup had to base its data processing either on an additional consent given by the users concerned, or on another legitimate basis provided for by law (as stipulated in Art. 7 of the Data Protection Directive, as well as in Article 6 of the General Data Protection Regulation). Since the startup would have had, in light of the amount of data concerned, practical difficulties in obtaining the consent of all users concerned, it focused on another legitimate basis provided for by law. Indeed, whether this 'secondary option', i.e. referring to a legal provision when the individual's consent does not cover the intended processing, would have been legal was also questionable because it might be seen as a circumvention of the original consent.34 In any event, even if this had been possible, it was unclear whether or not the startup could base the data processing on, in particular, the general clause of Article 7 lit. f of the Data Protection Directive (correspondingly, Article 6 sect. 1 lit. f of the General Data Protection Regulation). This Article allows the data processing if it "is necessary for the purposes of the legitimate interests pursued by the controller (…), except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject". Whether or not this provision covered the intended data processing was doubtful because the balancing exercise based on the bundle of criteria again produces legal uncertainty.

33 The directive itself was, indeed, not directly applicable since it must be transposed into national law in order to directly bind the data controller; for the sake of simplicity, however, this thesis does refer, so far, to the directive and not to national law; with respect to the transposition into national German law, see, in more detail, point C. II. 1. c) "Transposition of the requirement of purpose specification into German law".
34 Cf. Gola/Schomerus, Federal Data Protection Law, § 4 cip. 16; in contrast, see Article 17 sect. 1 lit. b GDPR, which excludes the individual's right to require the controller, based on an objection to his or her consent, to delete the personal data if the controller can base the processing on another legitimate ground foreseen by law.

In the Law Clinic session the participants examined the users' consent, and it became apparent that the original purpose either was not identical with the later use intended by the startup or was not sufficiently precise. Therefore, the second question became additionally relevant: whether or not the later processing intended by the startup was in accordance with the compatibility assessment proposed by the Article 29 Data Protection Working Party with respect to Article 6 sect. 1 lit. b of the Data Protection Directive (correspondingly, Article 5 sect. 1 lit. b of the General Data Protection Regulation).35 On the one hand, there was no negative impact on the individuals concerned; it seemed to be the same context (communicating with friends and going to social events = private/leisure life?); and the data was, once retrieved by the startup, immediately anonymized.
On the other hand, the relationship between the original purpose of collection (connecting friends) and the later processing by the startup (creating social heat maps) was loose; the data was sensitive (geo-location data)36; and the users of the social networks probably did not expect this kind of usage. Hence, even if there was no intended negative impact on the users of the social networks concerned and the data was immediately anonymized, there were enough criteria to support the finding that the later use was incompatible with the original purpose of collection.

35 See the Art. 29 Data Protection Working Party, Opinion 03/2013 on the principle of purpose limitation, pp. 20 ff.
36 Cf. the Article 29 Data Protection Working Party, Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC, p. 38.

bb) Foreseeable negative impact on the individuals

In the second example, the participants of the Startup Law Clinic session could clearly identify a possible impact of the data processing on the individuals concerned. A startup retrieved generally accessible personal data from professional networks. The startup created, in a first version, profiles based on the data that users of the professional networks had made publicly available. The profiles contained predictions about three characteristics of the users of the professional networks that could potentially interest future employers: first, the probability that the user would change his or her current employment; second, the probability that the user would also change city for a new employment; and third, the degree of expertise in a certain professional domain or area. The startup sought to sell access to these profiles to the human resources departments of companies on the private market, as access to the profiles would enable the human resources departments to make better decisions when finding and/or considering the right candidate for a certain job. Since the employer was meant to connect the profile with the candidate, the data could not be considered anonymous. Additionally, in light of the fact that the focus was on selling the product to employers only, the potential employees (i.e. the users of the professional networks) would not be able to gain access to the database as a whole or to their specific profiles. Similar to the preceding example, two main questions arose. First, whether or not the later use of the personal data could be based on the users' consent or another legitimate basis provided for by law. Here again, the consent sought by the professional networks from their users either did not cover the later use or was too broad in its purpose. Hence, the startup had to base its data processing either on Article 7 lit. b or f of the Data Protection Directive (correspondingly, Article 6 sect. 1 lit. b or f of the General Data Protection Regulation). The first provision allows the processing if it "is necessary (…) in order to take steps at the request of the data subject prior to entering into a contract (underlining by the author)". In the example, the creation of the profiles and the access to them could hence only be necessary for the potential employer if the employee takes the initiative of actually applying for a job. For other cases, where the employer searches for new potential employees on its own initiative, only the general clause under Article 7 lit. f of the Data Protection Directive (and Article 6 sect. 1 lit. f of the General Data Protection Regulation, correspondingly) came into question. Insofar, the participants of the Startup Law Clinic considered the search (and help in searching) for potential employees indeed to be a legitimate interest. However, it was arguable whether or not the potential employee had an overriding interest, for example, in his or her freedom to choose an occupation protected under Article 15 ECFR. This interest might have overridden the potential employer's (and the startup's) interest for one particular reason: there was no reason why the potential candidate should not be able to correct inaccurate data, add further advantageous information, or do anything else that could improve his or her chances of being invited to an interview. With respect to the compatibility of the purposes at hand, it was unclear whether or not the profiling of potential employees in order to find the right job applicants could be seen as a sub-category of the original purpose of the professional network to connect professionals and, thus, as identical to it. In order to avoid any doubts, the participants of the Law Clinic session sought to apply the compatibility test proposed by the Article 29 Data Protection Working Party. The question of whether or not the later processing was compatible with the original purpose of the professional networks depended, indeed, on a bundle of criteria which was very similar, if not identical, to the balancing test required under Article 7 lit. f of the Data Protection Directive (and Article 6 sect. 1 lit. f of the General Data Protection Regulation).37 There were several reasons in favour of compatibility: 1) the relationship between the later processing and the original purpose was close because the later processing could have been considered a subcategory of the original purpose; 2) the data appeared not to be sensitive since it was published by the users and the categories of the profiles did not reveal any information about race, geo-location or similar attributes; 3) the later processing seemed to belong to the same context (professional life?); and 4) the user might consequently have expected the later use. On the other hand, the impact on the individual concerned could have been significant if he or she was filtered out only because his or her profile did not match the potential employer's expectations. This was even more the case since there was no proof of whether or not the profile really mirrored the likelihood that the employee lacked the expected attributes.

37 Cf. the criteria proposed by the Article 29 Data Protection Working Party, Opinion 03/2013 on purpose limitation, pp. 20 ff., and Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC, pp. 33 ff.

5. Interim conclusion: Uncertainty about the concept of protection and its legal effects

In conclusion, although the last two examples differed significantly from each other with respect to their impact, it was hard, if not impossible, to answer the question of whether or not the later data processing was legal. Similarly, the first example already illustrated that the requirement to specify the purpose creates uncertainty in itself.
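The source of this uncertainty can be made more tangible with a minimal, purely illustrative sketch. The factor names loosely follow the criteria discussed in the two examples above (relationship of purposes, context, nature of the data, impact, safeguards, reasonable expectations), but the scoring and threshold are invented assumptions and do not reproduce the Article 29 Working Party's method; the point is only that a multi-factor balancing exercise, unlike an if-then rule, frequently yields no clear yes-or-no answer.

# Hypothetical, simplified weighing of compatibility factors; weights and
# threshold are invented for illustration only.
def compatibility_indication(factors):
    """Return a rough indication: 'compatible', 'incompatible' or 'unclear'."""
    score = 0
    score += 2 if factors.get("close_relationship_of_purposes") else -2
    score += 1 if factors.get("same_context") else -1
    score += -2 if factors.get("sensitive_data") else 1
    score += -2 if factors.get("negative_impact") else 1
    score += 2 if factors.get("safeguards_such_as_anonymization") else 0
    score += 1 if factors.get("within_reasonable_expectations") else -1
    if score >= 3:
        return "compatible"
    if score <= -3:
        return "incompatible"
    return "unclear"

# The social heat map constellation sketched above: no negative impact and
# immediate anonymization, but a loose relationship of purposes, sensitive
# geo-location data and no corresponding user expectations.
print(compatibility_indication({
    "close_relationship_of_purposes": False,
    "same_context": True,
    "sensitive_data": True,
    "negative_impact": False,
    "safeguards_such_as_anonymization": True,
    "within_reasonable_expectations": False,
}))  # prints 'unclear'

Depending on how the individual factors are weighted, the same constellation can tip toward compatibility or incompatibility, which mirrors the legal uncertainty described above.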
These observations shed light on what startups might mean when they express hope for improvement in political regulation and bureaucracy, rather than for social or advisory support.38 However, it shall again be stressed that these examples only illustrate the general questions of how to specify the purpose and how to determine which later use is compatible with the original purpose and which is not. An answer to these general questions does not depend on the practical examples but on the legal concept of protection. Nevertheless, finding an answer to these questions is highly important for companies and organizations. These entities try to apply the law because of their reputation, amongst other factors.39 If a data protection authority examines their use of data and comes to the conclusion that they are using that data illegally, they run a high risk of losing their reputation in the market. Consequently, the higher the risk of a loss of reputation, the more important it is for the processing entity to be able to rely on clear criteria that assist in correctly applying the law.

Correspondingly, the same uncertainty exists with respect to the individuals concerned by the processing of data. Hallinan and Friedewald examined, in one of their works, more than ten public opinion surveys, supplemented by further sources such as ethnographic studies and focus groups, regarding the European public's perception of the data environment. One of their aims was to find out why individuals' behavior "at first sight appears erratic and even contradictory to declared privacy preferences."40

38 See Kollmann et al., European Startup Monitor 2015, pp. 62 and 63, indeed showing financial support as the even higher-ranked hope.
39 Cf. Jarchow and Estermann, Big Data: Chances, Risks and Need for Action of the Swiss Confederation, pp. 14 and 15.
40 See Hallinan and Friedewald, Public Perception of the Data Environment and Information Transactions – A selected-survey analysis of the European public's views on the data environment and data transactions, pp. 62 and 76/77.

Irrespective of differences in national perceptions,41 the European public considers the protection of personal data to be very important and the disclosure of personal data to raise significant concerns. However, individuals appear to accept the disclosure of personal data, considering it "simply a part of modern life".42 In order to explain the individual logic behind these contradictory observations, Hallinan and Friedewald referred to economic considerations proposed by Acquisti and Grossklags about potential limiting factors for rational decision-making.43 In light of these considerations, the contradictions between general privacy awareness and specific disclosure of personal data result, in particular, from the following three aspects: First, individuals often have only a limited understanding of the risks implied in data transactions.44 For example, while they are specifically aware of ID fraud as a serious threat, only few individuals consider or understand "the more abstract, invisible and complex aspects" such as "the value of the data, the nature of the technologies involved or the shape or nature of data flows – that is to say, (…) the critical parts of the data environment".45 The second reason, besides limited information or conceptual understanding, is psychological distortion.
Individuals tend, for instance, to prefer certain short-term rewards, such as an online service "for free", to uncertain long-term risks caused by a potential misuse of data. Finally, ideological or personal attitudes constitute another factor for why an individual might not disclose personal data at all, even though the benefits are higher than the potential losses, or vice versa.46 Hallinan and Friedewald stress that these factors challenge the common understanding of economic behavior that the current data protection system is actually built on. The legislator's misconception about the individual's behavior might be the reason why the European public has the feeling that the current laws do not fulfill their objective.47

41 See, for example, Vodafone Institute for Society and Communications: Big Data – A European Survey on the Opportunities and Risks of Data Analytics, p. 17, showing that "Germans are especially critical concerning privacy issues", while "South Europeans in the survey are generally more relaxed as far as the collection and use of their data is concerned".
42 See Hallinan and Friedewald, ibid., pp. 65 and 68.
43 See Hallinan and Friedewald, ibid., pp. 70 et seq., with reference to Acquisti, Alessandro and Grossklags, Jens, "Privacy Attitudes and Privacy Behavior: Losses, Gains, and Hyperbolic Discounting", in: Camp, J. L. and Lewis, S. (eds.), The Economics of Information Security, 2004 Kluwer, as well as ibid., "Privacy and rationality in individual decision making", IEEE Security and Privacy 2005, pp. 26 to 33.
44 See Hallinan and Friedewald, ibid., pp. 72 to 74.
45 See Hallinan and Friedewald, ibid., p. 75.
46 See Hallinan and Friedewald, ibid., p. 74.
47 See Hallinan and Friedewald, ibid., pp. 65 and 71.

In light of this, critics recognize that current data protection law suffers, from both the individual's perspective and the controller's perspective, from a "credibility crisis".48 Several legal scholars stress that this credibility crisis results from the uncertainty about the conception behind data protection law.49 In particular, v. Lewinski unfolds, in detail, the different dimensions of protection covered by the broad term "data protection". While data protection laws are typically meant to regulate the relationship between individuals, on the one hand, and companies and the State, on the other hand, the object of protection, as well as the concept of protection, is less clear.50 In v. Lewinski's opinion, the term "data protection" refers to several objects of protection (i.e. the question of "what is protected") such as the individual's dignity, his or her private sphere, or the societal balance of informational power.51 Similarly, there are several possible concepts of protection (i.e. referring to the question of "how to protect the objects"), such as: first, practical protection mechanisms such as self-protection; second, normative mechanisms such as social, technical and legal norms, but also mechanisms of self-regulation such as standards, codes of conduct, and certificates; third, institutions that enable, for example, individuals' self-protection, limit informational power, or enforce legal requirements; and fourth, the range of protection, such as protection against concrete infringements, against specific risks and dangers, or even precautionary protection against unspecific risks and abstract dangers.52

48 See Kuner et al., The Data Protection Credibility Crisis, IDPL 2015 Vol. 5 no. 3, pp. 161.
49 Cf. Stentzel, The Fundamental Right to ...? The Search of the Object of Protection of Data Protection in the European Union, PinG 05.15, pp. 185; cf. Solove, Understanding Privacy; cf. v. Lewinski, The Matrix of Data Protection.
50 See v. Lewinski, ibid., pp. 1 to 16.
51 See v. Lewinski, ibid., pp. 7 as well as 17 to 63; see also De Hert and Gutwirth, Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action, p. 5.
52 See v. Lewinski, ibid., pp. 64 to 85.

Irrespective of whether or not this "matrix of data protection" is correct and comprehensive,53 it does help clarify that the meaning and extent of the principle of purpose limitation cannot be determined without being clear on the object and concept of protection of data protection law. Only if the object and concept of protection are sufficiently precise is it possible to answer the question of how to balance the need for innovation against its risks with respect to the processing of personal data.

53 See v. Lewinski, ibid., pp. 87 to 90, commenting on the deficits of such a matrix and highlighting, however, its main use for structuring the public debate, enhancing legal comparison on an international level, and discovering deficits of legal protection.

II. Research questions and approach

Therefore, the research questions of this doctoral thesis are:

1. What is the meaning and function of the principle of purpose limitation in the private sector, in light of the object and concept of protection of data protection law?

2. In order to find a balance between the societal need for data-driven innovation and protection against its risks, which regulation instruments should transpose the principle of purpose limitation in the private sector?

In order to answer these questions, this doctoral thesis builds upon the research approach regarding innovation developed by the Center of Law and Innovation (CERI) in Hamburg, Germany.

1. Legal research about innovation

The CERI research project "Law and Innovation" reacted to the situation that, at the beginning of the 1990s, legal scholarship had not yet started, at least not in Germany, doing research on innovation, in contrast to other research disciplines such as the technical, economic, and social sciences.54 Consequently, this research approach does not primarily ask how to innovate the law, but rather how the regulator can regulate technological, economic and social innovation in today's society.55 This approach acknowledges that the primary objective of the law is to protect against harm and risks and that it thus restricts the scope of action of entities that actually cause these harms and risks. Such a restriction is, in particular, at stake if the law expands its scope of protection from known risks to even unknown risks. One instrument for expanding the scope of protection can be the so-called precautionary principle (as discussed in chapter B. II. Data protection as a risk regulation). However, regulating innovation not only leads to the question of how to protect against the actual risks caused by innovation, but also of how to enable the development of innovation within society.56 Contrary to the common prejudice that the law is an inherent barrier to innovation, the law also levels, protects and enforces innovation.57

54 See Hoffmann-Riem, Openness toward Innovation and Responsibility for Innovation by means of Law, p. 256.

Taking both of these effects of law into account, i.e.
those restricting the scope of action of risk-causing innovators, as well as those leveling, protecting, and enforcing innovation, Hoffmann-Riem summarizes this approach by posing the essential question: How should legal instruments be shaped in order to enable and even promote innovation without denying necessary protection? From this point of view, only those regulations that do not take the particularities of innovation processes into account, and thus are badly drafted, are an unjustified barrier to innovation.58

2. The regulator's perspective

Referring to theories of evolutionary economics, the research approach that focuses on innovation builds upon modern movements in administrative law that seek to cope with the problem that the regulator has limited knowledge of future events.59 With respect to German law, Voßkuhle pinpoints the essential differences between this new and the traditional approach by giving a brief summary of its historical development. The traditional approach mainly concentrates on the judicial act and examines its conformity with the law. This examination is based on a systematic review of positive law and the elaboration of underlying principles. It results, in essence, in either a yes or a no answer. Its primary aim is to bind the executive to the rule of law.60 Several studies from the 1970s had, however, demonstrated high enforcement deficits of this classic form of imperative public law, particularly in the environmental sector. Emerging new forms of informal cooperation between the public and private sectors appeared, at the time, to function better than these classic forms of regulation. Researchers therefore started to thoroughly investigate the interrelationship between legislative rule-making, administrative as well as judicial decision-making, and their implementation within society. As a main starting point for alternative strategies and forms of regulation, they discovered that the regulator, in particular, did not have full knowledge of a situation caused by more and more complex environments (particularly in the environmental, telecommunications, and other technology-driven sectors), their increasingly non-linear dynamics, and, thus, (objectively) unforeseeable and (sometimes) irreversible effects.61 Methodologically, the new regulatory approach ties into the concept of control theory developed in the political sciences.62 Elaborating on this approach, German legal scholars in administrative law usually build on a concept of control focusing on the actions of those individuals or entities that are affected by it. This concept differentiates between the individuals and entities affected, the aim and instruments of control, as well as the controlling entity.

55 See Hoffmann-Riem, ibid., p. 257.
56 See Hoffmann-Riem, ibid., pp. 256 ff.
57 See, instead of many, Mayer-Schönberger, The Law as Stimulus: The Role of Law in Fostering Innovative Entrepreneurship, pp. 159 to 169; Gasser, Cloud Innovation and the Law: Issues, Approaches, and Interplay, pp. 19 and 20.
58 Hoffmann-Riem, ibid., pp. 260 and 261; cf. also Brownsword and Yeung, Regulating Technologies: Tools, Targets, and Thematics, p. 21.
59 See Hoffmann-Riem, ibid., pp. 259 to 262; Appel, Tasks and Procedures of the Innovation Impact Assessment, p. 149.
Indeed, the term "controlling entity" should not conceal the fact that there is often no single entity but rather an interactive process consisting of several entities, working with and against each other, and producing regulatory outputs.63 Similarly, with respect to the individuals and entities affected by the regulation, legal scholars recognize that society finds its solutions to problems in complex structures and that a central regulator, in particular the legislator, may have difficulties in appropriately addressing the individuals in order to achieve its regulatory aims. Keeping this in mind, the modern regulatory approach nevertheless focuses on the state's point of view and on legislative measures as its main regulation instrument. With these measures, the state seeks to create a certain impact on the individual or entity by focusing on their legal liability should they not adhere to the system. This is the main conceptual difference to the so-called governance perspective, which applies a different point of view that is not restricted to pursuing specific aims by legal means.64 Focusing on Internet governance, Hofmann, Katzenbach and Gollatz advocate that the governance perspective instead focuses on reflexive coordination and, thus, "refers to addressing, questioning, and renegotiating Internet-related coordination practices."65 However, despite or rather because of the analytical difference between both perspectives, the new regulatory approach may well refer to theoretical concepts and empirical findings of the governance approach in order to find out whether "self-regulation" processes already fulfill the regulator's aims or whether there is a need for state regulatory support.

60 See Voßkuhle, New Regulatory Approach of Administrative Law, cip. 2 to 8.
61 See the summary of this evolvement at Voßkuhle, ibid., cip. 10 and 11; cf. also Hoffmann-Riem, ibid., pp. 261 to 265; Eifert, New Regulatory Approach of Administrative Law, cip. 1 and 2.
62 See Voßkuhle, ibid., cip. 18.
63 See Voßkuhle, ibid., cip. 20.
64 See Eifert, ibid., cip. 5 and 6; Voßkuhle, ibid., cip. 21; cf. also Braithwaite et al., Can regulation and governance make a difference?, p. 3; Hofmann, Katzenbach and Gollatz, Between coordination and regulation: Finding the governance in Internet governance, pp. 6 and 7.
65 See Hofmann, Katzenbach and Gollatz, ibid., p. 13.

On an international level, legal scholars equally elaborate on the functions, modes, and strategies coming into question for regulation in complex and non-linear environments, although not always using the same terminology.66 The common starting point consists, as mentioned previously, in the knowledge deficiencies of regulators acting in these environments. Raab and De Hert describe this common starting point by stressing that any understanding of the functioning of regulation (and its "tools") requires one to consider the regulatory activity as a process "in which, in theory, several actors may participate in the making, using, and governing of each tool".67

66 Cf. Baldwin and Cave, Understanding Regulation – Theory, Strategy and Practice; Raab and De Hert, Tools for Technology Regulation: Seeking Analytical Approaches Beyond Lessig and Hood; Murray, Conceptualising the Post-Regulatory (Cyber)state, with further references, amongst others, to Black, Decentring Regulation: Understanding the Role of Regulation and Self Regulation in a 'Post-Regulatory' World, as well as Scott, Regulation in the Age of Governance: The Rise of the Post Regulatory State, further developed in ibid., The Regulation of Cyberspace – Control in the Online Environment; Franzius, Modes and Impact Factors for the Control through Law; Eifert, Regulation Strategies.
67 See Raab and De Hert, ibid., p. 282.

The terminology regarding the regulatory functions, modes, and strategies is often not entirely clear. The German scholar Eifert explains this terminological ambiguity with reference to the diversity of the theoretical concepts applied. He favours determining the regulatory strategies, at least, according to the state's role within the regulation, distinguishing between imperative law ("command and control", often also described as "rules"), state-regulated self-regulation ("co-regulation", often referring to "principles" or "standards"), and societal self-regulation. Focusing on two main types of regulation, i.e. imperative law (command-and-control) and instruments of regulated self-regulation (co-regulation),68 Eifert sums up the positive and negative aspects of these two types of regulation. On the one hand, command-and-control regulation provides for high legal certainty (given by the clarity of legal "if-then" rules and the direct effects of their execution). On the other hand, this kind of regulation might be inefficient because it does not take individuals' economic behaviour into consideration. The inflexibility of this kind of regulation constrains an individual's actions more intensively. This restriction leads to three effects: First, it lowers the acceptance of the regulation amongst individuals; second, it increases the probability that the individuals will try to circumvent the regulation; and finally, it increases the efforts the state must make to hinder the individuals' circumvention of the law. Therefore, this kind of regulation is considered to work best when the following two conditions are met: first, the regulator aims to prevent third parties' rights or interests from being harmed; and, second, the regulator has sufficient knowledge about the effectiveness and efficiency of the corresponding protection instruments. In contrast, if the regulator does not possess sufficient knowledge, such as in a dynamic and non-linear environment, and creativity is needed in order to solve a variety of problems, this kind of command-and-control regulation does not provide the appropriate instruments.69

68 See Eifert, ibid., cip. 13 to 15; focusing on privacy-related principles, Maxwell, Principles-based regulation of personal data: the case of 'fair processing', pp. 212 to 214, referring to J Black, 'Forms and Paradoxes of Principles Based Regulation', LSE Law, Society and Economy Working Paper 13/2008, SSRN abstract no. 1267722; L Kaplow, 'Rules Versus Standards: An Economic Analysis' (1992) 42 Duke L. J. 557; R Posner, Economic Analysis of Law (8th edn., Aspen/Wolters Kluwer, New York, 2011), p. 747.

Instead, in order to enhance problem-solving creativity, Eifert stresses co-regulation as the more appropriate regulation strategy. Taking the decentralized knowledge of private entities into account does not only increase the problem-solving capacities in society. Rather, the fact that the regulator adapts its regulation instruments to the inherent logics of the entities acting on the private market also increases their acceptance of the regulation instruments. Furthermore, this kind of regulation decreases administrative costs because the private structures used for it are often also financed privately. Finally, instruments of co-regulation can provide a solution to the territorial problem of command-and-control regulation because their execution does not depend, at least not directly, on the State but on private entities, which are not bound to national territories.70 However, a possible disadvantage is that this kind of regulation may not meet the regulator's expectations but, instead, make the regulation more complex, more opaque and less effective or efficient than the classic form. Another risk is that the regulated private entities abuse their knowledge advantage over the State. This could be the case, for example, if the State gives privileges to these private entities because it thinks that their solutions really serve society, while in reality they serve only their particular interests.71 In any case, Eifert stresses, like Franzius, that the complexity of this form of regulation requires the regulator to learn. This means frequently evaluating the effectiveness and efficiency of its regulation instruments.72 Such an evaluation should refer to other disciplines, such as the social and economic sciences, and build upon their validated knowledge. The moment the legislator extends its view to the effects of its regulation, references to these other disciplines, including their methodologies, will increase the rationality of the law.73

69 See Eifert, ibid., cip. 25 and 26.
70 See Eifert, ibid., cip. 59.
71 See Eifert, ibid., cip. 60.
72 Cf. Eifert, ibid., cip. 60; Franzius, ibid., cip. 81 to 103.
73 See Hoffmann-Riem, Innovation Responsibility, p. 39.

3. Possible pitfalls of taking the effects of regulation instruments into account

In conclusion, Voßkuhle summarizes the promises and possible pitfalls of this legal research approach, which seeks to gain deeper knowledge about the complex effects of law as a regulation instrument. He considers the promises to be the following: first, this approach broadens the scope of analysis, in which the law is just one regulation mechanism amongst others, beside further mechanisms of economic markets, of networks, or within organizations; second, the approach enables researchers to ascertain and take into account the effects and efficiency of legal instruments and their interplay with further mechanisms; and third, in doing so, the approach enables legal researchers to interconnect with other research disciplines. This last aspect enables researchers to build on theoretical frameworks and empirical methodologies already elaborated in other disciplines. However, the possible pitfalls of this approach are the following: On the one hand, legal scholars considering the effects of regulation instruments may over-simplify the complex interplay of cause and effect.
The reason is that all theoretical models mirror just one part of reality, and the choice of regulation instruments based on them thus runs the risk of failing to meet the legislator's goal. On the other hand, the regulatory function of the law is not its only function. The law also serves as an expression of the values provided for by the constitution. This means that legal provisions do not lose their validity just because, in some circumstances, they have little effect, for example because of inefficient execution of the law.74 These considerations are important for the examination of the principle of purpose limitation pursued in this thesis. The principle of purpose limitation indeed suffers from a lack of execution in the private market. And this may result from the uncertainty about its precise meaning and extent.75 However, this lack of execution does not mean, per se, that the principle of purpose limitation should be abandoned as a whole. This hesitation is particularly justified because the uncertainty about its meaning and extent is not a special problem of the principle of purpose limitation but of all legal principles in general. The less imperative law and its conditional if-then scheme serve as regulation instruments, the more important other instruments, such as legal principles, become. Principles do not provide for a binary scheme that answers the question of whether an act is legal or not, but allow individuals to explore different, and in the best case optimal, solutions.76 Indeed, with the abandonment of imperative law and its conditional decision rule, the individuals' legal uncertainty increases because individuals do not know whether the solution found meets the regulator's expectations. Consequently, individuals and the regulator have to start an interactive process, reconstructing together the certainty of legal rules.77 Whether, and in which way, the principle of purpose limitation meets the regulator's expectations depends, in the first instance, on the above-mentioned research questions of this thesis.

74 See Voßkuhle, ibid., cip. 22 to 28.
75 See, in general, the above-mentioned studies as well as, in particular, the observations made in the HIIG Law Clinic, where startups simply went on developing their products if they could not definitely clarify how to apply the principle of purpose limitation and expected that data protection authorities would not become aware of their practice anyway.

III. Course of examination

In order to answer the research questions, the next chapter clarifies the conceptual definitions that provide a basis for the regulation of innovation. The first sub-chapter illustrates how economic theories define and conceptualize "innovation" and "entrepreneurship" and which role the law plays in these conceptualizations of "innovative entrepreneurship". In doing so, one particular focus is on the illustration of economic models describing the non-linearity of innovative entrepreneurship processes. Subsequently, the examination goes on to review literature from both economic and legal perspectives and examines the effects of legal certainty on "innovative entrepreneurship". The first sub-chapter concludes with the emerging regulatory conflict: On the one hand, as discussed, regulation instruments such as the principle of purpose limitation are open toward innovation but decrease legal certainty; on the other hand, legal uncertainty hinders innovation.
Therefore, it will be key to explore mechanisms that combine both aspects, i.e. being open toward innovation but also ensuring legal certainty and, thus, even promoting innovation. The second sub-chapter draws attention to the other side of the "innovation" coin, i.e. data protection law as a regulation of risks caused by innovation. This sub-chapter clarifies the terms "risks" and "dangers", as well as the often correspondingly used protection mechanisms of "prevention" and "precaution". This distinction is highly relevant for exploring the function of the principle of purpose limitation at a later stage. The discussion of various protection instruments for different types of threats leads to the last sub-chapter, which clarifies the conceptual definitions for the regulation of data-driven innovation: the question of what is threatened, in terms of data protection, and, thus, which object of protection the principle of purpose limitation serves. Based on Nissenbaum's work Privacy in Context, this last sub-chapter provides an overview of the prevailing theories, concepts, and approaches regarding the value of privacy. At this stage, the thesis does not yet clarify the distinction between privacy and data protection and, correspondingly, between privacy and data protection laws; this distinction is an essential element of the conceptual work of this thesis and will be proposed later on. This sub-chapter finally gives a first response to Nissenbaum's critique of the purpose-oriented concept of protection by clarifying the relationship between the terms "purpose" and "context". This will lead to a first insight into the function of the principle of purpose limitation.

76 See Franzius, ibid., cip. 7; cf. Raab and De Hert, ibid., p. 278.
77 Cf. Franzius, ibid., cip. 17.

The third chapter contains the main part of this thesis: an analysis of the legal framework determining the meaning and function of the principle of purpose limitation. Elaborating on the object and concept of protection of data protection law, this chapter seeks to clarify three main questions: first, the precise meaning and extent of the requirement to specify the purpose; second, the precise meaning and extent of the requirement to limit the later use of data to the purposes originally specified; and third, which specific instruments are appropriate for establishing these two requirements in the private sector in order to find a sound balance between enabling innovation and protecting against its risks in society. In doing so, the first sub-chapter clarifies the interrelationship between the different regimes of fundamental rights, focusing on the European Convention on Human Rights (ECHR), the European Charter of Fundamental Rights (ECFR), and the German Basic Rights (GG). Furthermore, it treats the question of the effects of these fundamental rights in the private sector, in particular of the right to privacy under Article 8 ECHR, the rights to privacy and data protection under Articles 7 and 8 ECFR, as well as the German right to informational self-determination under Article 1 sect. 1 in combination with Article 2 sect. 1 GG. The question is whether these fundamental rights directly bind private entities that process personal data, as they bind the State, or whether they have only an indirect effect in the private sector. The thought behind this question is that the second alternative gives the legislator more room for transposing the constitutional requirements into secondary and/or ordinary law.
The sub-chapter goes on to analyze the object and concept of protection developed by the European Court of Human Rights (ECtHR), the European Court of Justice (ECJ), and the German Constitutional Court (BVerfG) with respect to each of the above-mentioned fundamental rights. This parallel analysis will effectively allow one to compare the differences between the corresponding objects, as well as concepts, of protection. The first sub-chapter concludes with an analytical result on the challenges arising, in general, from these objects and concepts of protection being very broad and vague. A theoretical solution provides a first hint at how this may also affect the determination of the function of the principle of purpose limitation. The next sub-chapter draws attention to the main problem resulting from such intrinsically broad and/or vague concepts of protection: the uncertainty about how to legally specify the purpose of the data processing. On the European level, the analysis will illustrate that there are almost no criteria, provided by the courts in light of the corresponding fundamental rights, which help specify the purpose. However, it will be illustrated that the specification of the purpose is an essential element in secondary law because several further definitions and requirements, such as the scope of application, refer to the purpose specified. Despite this essential role, the Article 29 Data Protection Working Party, which has an advisory status for questions about the interpretation of the Data Protection Directive, does not provide reliable criteria for the specification of the purpose either (nor does the General Data Protection Regulation address this issue). Therefore, the sub-chapter goes on to examine how secondary law itself specifies certain purposes of processing, such as "marketing electronic communications services" pursuant to Art. 6 sect. 3 of Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (ePrivacy Directive). Subsequently, the examination turns to the question of how the German legislator transposes these requirements into German ordinary law. Since there are almost no criteria provided for by European fundamental rights, this allows the concept of protection established within ordinary law to be compared, at least, with the German basic rights. The analysis of European secondary and German ordinary law, as well as its comparison with the (so far developed) constitutional requirements, reveals several flaws in the concept of protection. The results not only confirm the general challenges stemming, as concluded previously, from the object and concept of protection being very broad and vague, now with particular respect to the requirement of purpose specification. Rather, it is apparent from the results that these flaws consist, in essence, in the fact that the constitutional requirements for the processing of data by the State are applied equally to private entities. Since private entities have different means for specifying purposes at their disposal than the State, the effects of the requirements are even stricter for private entities than for the State.
This sub-chapter hence concludes, with a particular focus on the European Charter of Fundamental Rights, with a refinement of the object and concept of protection that provides private entities with a better scale for the specification of the purpose of their data processing. The following sub-chapter treats the second component of the principle of purpose limitation, i.e. the question of the precise meaning and extent of the requirement to limit the later processing to the purpose(s) initially specified. The examination exemplifies two different models: the European model of purpose compatibility and the German model, which requires strict purpose identity while allowing, however, a change of purpose if this change is proportionate. With regard to the European model, this doctoral thesis examines the criteria developed by the European Court of Human Rights, as well as by the European Court of Justice, in light of the corresponding fundamental rights. While the European Court of Human Rights mainly refers to the "reasonable expectations" of the individual concerned by the processing of data related to him or her, the European Court of Justice does not. Interestingly, the Article 29 Data Protection Working Party, in proposing its criteria for answering the question of the extent of the requirement of purpose compatibility, nevertheless refers to the individual's "reasonable expectations",78 although the Data Protection Directive does not either (interestingly, Article 6 sect. 4 lit. a to e of the General Data Protection Regulation also lists all of the criteria except the "reasonable expectations" criterion). It is apparent from the analysis that the criteria proposed do not actually help in answering the question of the extent of the requirement of purpose compatibility. In order to receive inspiration on which functions the limitation of purposes can have, this doctoral thesis therefore goes on to examine the German model. Interestingly, although German ordinary law transposes the European directive, it deviates, at least formally, from the compatibility requirement. The examination therefore draws attention to the concept of protection provided for by the German basic right to informational self-determination in order to find the reason for this deviation. Since the reason for the deviation appears to come, indeed, from the application of the German basic right (and not of the European fundamental rights), this thesis presents three alternative approaches proposed within German legal literature in order to get a clearer understanding of the possible functions of the principle of purpose limitation. Indeed, all three approaches refer to the processing of data by the State. Taking the results of the preceding analysis into account, the thesis concludes this sub-chapter with a new approach defining the meaning and extent of the principle of purpose limitation for the private sector. On the basis of the author's own approaches developed in the two preceding sub-chapters, the last sub-chapter treats the question of which specific regulation instruments serve best to establish this new understanding of the meaning and extent of the principle of purpose limitation in the private sector.

78 See the Art. 29 Data Protection Working Party, Opinion 03/2013 on the principle of purpose limitation, pp. 24 and 25.
Here, the thesis exemplifies, iteratively, the impact of this understanding on the following elements: first, the scope(s) of application of all protection instruments; second, the specific application of the protection instruments in the private sector (in particular, the necessity as well as the interplay of the individual's consent and other legitimate bases laid down by law); and third, particular aspects of consent, its withdrawal, and a right to object to the data processing, as well as further protection instruments such as rights of information, participation, and deletion of personal data, taking the individual's decision-making process as a whole into account.79 Finally, on the basis of the refined concept of protection regarding the principle of purpose limitation and related protection instruments, the last chapter of this thesis comes back to questions about the effects of these instruments. These questions refer to both sides of the "innovation" coin, i.e. the effects on processes of "innovative entrepreneurship" as well as on the efficiency of risk protection instruments. The preceding chapters will have made apparent certain remaining questions that cannot sufficiently be answered by legal analysis alone. This last chapter therefore proposes an empirical methodology that helps answer these remaining questions. On the basis of these results, the regulator might answer the overarching question of which instruments best fit its regulatory aims.

79 Cf. the concept and terminology of "choice architectures" at Thaler and Sunstein, Nudge – Improving Decisions About Health, Wealth, and Happiness.

B. Conceptual definitions as a link for regulation

This chapter clarifies the conceptual definitions that provide a link for the regulation of innovation. While the first sub-chapter refers to economic theories defining the terms "innovation" and "entrepreneurship", the second sub-chapter draws attention to the other side, i.e. data protection law as a regulation of risks caused by innovation. This sub-chapter illustrates the discussion of various protection instruments for different types of threats, such as prevention and precaution or dangers and risks. This leads to the last sub-chapter, which treats the question of what is actually threatened. The clarification of the interplay between "context" and "purpose" provides a first understanding of the meaning and extent of the principle of purpose limitation.

I. Innovation and entrepreneurship

If the regulator refers, at least implicitly, to entrepreneurial innovation, this permits one to tie in definitions that have been developed by other research disciplines.80 Indeed, in other disciplines there is no common definition of "innovation" or "entrepreneurship". Scholars consider innovation and entrepreneurship to be phenomena that can and should be analyzed from various, interdisciplinary perspectives. This might be the reason for the lack of common definitions.81 However, Schumpeter, coming from an evolutionary understanding of private markets, was one of the first economists to recognize innovation as an essential force for societal change. In his work Capitalism, Socialism & Democracy, he disagreed with the common view of price competition as the main driver of the economy and determined, instead, "the new consumers' goods, the new methods of production or transportation, the new markets, the new forms of industrial organization
80 See Hoffmann-Riem, Openness toward Innovation and Responsibility for Innovation by means of Law, p. 257. 81 See regarding the first term Fagerberg, Innovation: A Guide to the Literature, p. 1, and regarding the second term Fueglistaller et al., Entrepreneurship – Basics, p. 6. that capitalist enterprise creates” as the fundamental impulse “that sets and keeps the capitalist engine in motion”.82 From this perspective, the “function of entrepreneurs is to reform or revolutionize the pattern by exploiting an invention or, more generally, an untried technological possibility for producing a new commodity or producing an old one in a new way (…) and so on.”83 Hence, Schumpeter differentiated between inventions, i.e. the first realization of a solution to a problem, and the innovation bringing an invention to the market.84 This differentiation is still widely recognized today. Today’s economists focus, in essence, on four types of innovations: first, product and service innovations; second, process innovations; third, business model innovations; and fourth, social innovations, which often refer to new forms of communication or cooperation and are mostly considered either as the basis for the aforementioned types of innovations or as their result.85 Further categories classify innovations pursuant to their impact on current production processes or market structures. This perspective differentiates between, on the one hand, “incremental” or “marginal” innovations describing continuous improvements of one or more of the innovation types listed previously and, on the other hand, “radical” innovations or “technological revolutions” referring to the introduction of a new technology or cluster of technologies which did not exist before in society.86 Keeping this in mind, it is common ground today that data provides, more and more, the basis for many, if not most, of these types or categories of innovation.87 82 See Schumpeter, Capitalism, Socialism & Democracy, pp. 82 and 83. 83 See Schumpeter, ibid., p. 132. 84 See Fagerberg, ibid., p. 5; Fueglistaller et al., Entrepreneurship – Innovation and Entrepreneurship, p. 98. 85 See Fueglistaller, ibid., pp. 99 and 100; cf. also Neveling et al., Economic and Sociological Approaches of Innovation Research, pp. 369 and 370, as well as Fagerberg, ibid., pp. 8 and 9. 86 See Fagerberg, ibid., p. 9 referring to Schumpeter. 87 See, instead of many, Mayer-Schönberger and Cukier, Big Data: A Revolution That Will Transform How We Live, Work, and Think, in particular at pp. 6 to 35 and 322 to 336.
1. Process of innovative entrepreneurship
Entrepreneurship research poses, in particular, the question of how entrepreneurs create such innovation.88 After researchers had initially focused on the personality of the entrepreneur per se, Drucker stressed, in his influential article The Discipline of Innovation, that it is less the personality per se that constitutes entrepreneurship than the entrepreneurial activity.89 Over time, several economists have elaborated on models describing the entrepreneurial process as the overarching unit of analysis encompassing entrepreneurial phenomena such as activity, novelty, and change.90 In order to extract a common model that is both generic, i.e. describing the common patterns of all different kinds of entrepreneurial processes, as well as distinct, i.e.
differentiating entrepreneurial from non-entrepreneurial processes, Moroz and Hindle analyzed more than 32 existing models. They came to the result, however, that the models analyzed were too fragmented to achieve the initial aim of building a common model that is both generic and distinct.91 Despite this fragmentation, or rather because of it, three aspects shall be explained in more detail because they may serve as reference points for answering the question of how legal regulation instruments function with respect to the logics of entrepreneurs creating innovation.
a) Key elements of the entrepreneurial process
The first aspect of interest for this doctoral thesis refers to key elements which are decisive for entrepreneurship. Gartner, who conducted, in the 1980s, a study with academics, practitioners and politicians related to the entrepreneurial field in order to gain a more comprehensive understanding of what kind of activity or situation is considered entrepreneurial, elaborated on several of these key elements.92 88 See Drucker, The Discipline of Innovation, p. 3. 89 See Drucker, The Discipline of Innovation, p. 3. 90 See Moroz and Hindle, Entrepreneurship as a Process: Toward Harmonizing Multiple Perspectives. 91 See Moroz and Hindle, Entrepreneurship as a Process: Toward Harmonizing Multiple Perspectives, p. 781. Pursuant to this model, entrepreneurs locate business opportunities, accumulate resources, and build organizations in order to produce and market products or services, while constantly responding to their environment.93 Moroz and Hindle stress that this model does not actually describe a behavior that is distinct from others, such as pure managerial activities. However, they also point to the implicit distinctness of this model describing the entrepreneur as being “involved in a multidimensional process of organizational emergence that is focused upon the creation of a new venture that is independent, profit oriented, and driven by individual expertise. The newness attached to this process is linked to products, processes, markets, or technologies where the firm is considered a new entrant or supplier to a market.”94 Fueglistaller proposes a very similar process model determined by the following five key elements: the entrepreneur, a business opportunity, sufficient resources, a form of organization, and a supportive environment.95 92 See Gartner, What are we talking about when we talk about entrepreneurship?, as well as, A Conceptual Framework for Describing the Phenomenon of New Venture Creation. 93 See Gartner, A Conceptual Framework for Describing the Phenomenon of New Venture Creation, p. 702. 94 See Moroz and Hindle, ibid., p. 800. 95 See Fueglistaller et al., Entrepreneurship – Basics, p. 7.
Graphic: Key elements of the entrepreneurial process – the entrepreneur, a business opportunity, resources, an organization, and the surrounding environment96
Consequently, the entrepreneur constitutes the core of an enterprise, discovering or creating business opportunities, evaluating them, and using them. In such an emergent process, the individual capacities, capabilities, and attitudes play a decisive role.
The entrepreneur’s cognitive capacities influence the identification or creation of business opportunities; the evaluation of the opportunity depends, on the one hand, on the characteristics of the opportunity and, on the other hand, on individual attitudes, such as the attitude toward risk; and the use of the opportunities depends on the ability to practically organize the process as a whole.97 96 Following Fueglistaller et al., ibid., p. 8. 97 See Fueglistaller et al., ibid., pp. 7 to 14.
b) Business opportunities: Discovery and creation
The second aspect focuses on how entrepreneurs identify or create business opportunities. Economists usually consider a “business opportunity” to exist if “there is an opportunity to introduce a new product, new service, or new method and to sell it for a higher price than its production costs”.98 They also agree on the assumption that such an opportunity arises “whenever competitive imperfections in an industry or market exist”.99 However, economists disagree about where these market imperfections come from: Does an entrepreneur discover or create these market imperfections and, as a consequence, the business opportunity? There are two main theories seeking to answer this question, the Discovery Theory and the Creation Theory. Tying into teleological theories of human action, both theories aim to explain the relationship between entrepreneurial action and the ability to produce innovation.100 Alvarez and Barney summarize the essential differences between both theories as follows:101
Table: Differentiating aspects of Discovery and Creation Theories102
– Nature of business opportunities/market imperfections: Discovery Theory – caused by exogenous shocks to pre-existing industries or markets; Creation Theory – caused by endogenous actions of individuals to produce new products or services.
– Nature of entrepreneurs: Discovery Theory – entrepreneurs are different than non-entrepreneurs in some critical and enduring ways; Creation Theory – entrepreneurs may be the same or different than non-entrepreneurs; any differences, ex ante, may be magnified by entrepreneurial actions.
– Nature of decision making: Discovery Theory – those who are aware of and seek to exploit opportunities operate under conditions of risk; Creation Theory – those creating opportunities act under conditions of uncertainty.
98 See Fueglistaller et al., ibid., p. 10: “Im Allgemeinen spricht man von einer unternehmerischen Gelegenheit, wenn sich die Möglichkeit bietet, ein neues Produkt, eine neue Dienstleistung oder eine neue Methode einzuführen und zu einem höheren Preis als die Produktionskosten zu verkaufen.” 99 See Alvarez and Barney, Discovery and Creation: Alternative Theories of Entrepreneurial Action, p. 6. 100 See Alvarez and Barney, ibid., pp. 2 to 4. 101 See Alvarez and Barney, ibid., pp. 2 and 6. 102 Following Alvarez and Barney, ibid., p. 6. The last category, i.e. the nature of decision making, clarifies the interplay of both theories.
Alvarez and Barney differentiate, pursuant to the possibility and probability of outcomes, between the terms “certainty”, “risk”, “ambiguity”, and “uncertainty”: While the term “certainty” refers to situations where a certain outcome is sure, entrepreneurs act under conditions of “risk” if they know (or are able to know) which outcomes are possible and with which degree of probability; in contrast, an outcome is “ambiguous” if an entrepreneur has sufficient information (or is at least able to retrieve it) to foresee that an outcome is possible but does not have enough information to determine its probability. Finally, an entrepreneur acts under “uncertainty” if he or she does not even know that an outcome is possible. This differentiation allows one to clarify the knowledge-related pre-conditions of each theory: While the Discovery Theory assumes that entrepreneurs are able, principally, “to predict both the range of possible outcomes associated with producing new products or services, as well as the probability that these different outcomes will occur”103, the Creation Theory “assumes that the end of an emergent process cannot be known from the beginning.”104 In such an uncertain situation, entrepreneurs are hence not able to calculate, based on a risk-calculation methodology, the opportunity costs related to their actions. As a consequence, the Creation Theory instead proposes focusing on the losses an entrepreneur can accept if his or her actions do not lead to a successful outcome.105 Alvarez and Barney draw from these assumptions the following implications: “Discovery Theory suggests that entrepreneurs maximize their probability of success by (1) carefully collecting and analyzing information about opportunities to calculate their return and possible opportunity costs, (2) developing a rigorous business plan that describes the opportunities they are going to pursue, and (3) obtaining capital to execute these plans from outside sources. Creation Theory suggests that entrepreneurs maximize their probability of success by (1) engaging in iterative, incremental, and inductive decision making, (2) developing very flexible and constantly adjusting business plans, and (3) obtaining capital from friends and family—people who are willing to bet on them and not on the opportunities they may or may not exploit.”106 103 See Alvarez and Barney, ibid., p. 13. 104 See Alvarez and Barney, ibid., p. 20. 105 See Alvarez and Barney, ibid., pp. 20 and 21. Alvarez and Barney stress that the Creation Theory may also solve problems that appear to arise in other economic research fields, such as in strategic management theories. For example, these theories have so far not been able to explain the empirical finding that entrepreneurs generate competitive advantages by using “valuable, rare, and costly to imitate resources”.107 The Creation Theory can explain such phenomena, arguing that the path dependency of a process that emerged under uncertainty “is likely to generate resources that, from the point of view of potential competitors, are intractable (…) and causally ambiguous (…).”108 The differences between both theories do not mean that they must be considered, practically, as mutually exclusive. Instead, the conditions under which entrepreneurs act rather clarify which theory is more appropriate for predicting successful entrepreneurial behavior in specific situations.
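Alvarez and Barney’s knowledge-related differentiation can be restated schematically. The following minimal Python sketch assumes, purely for illustration, that a decision situation can be described by two questions only (are the possible outcomes known, and are their probabilities known?); it then classifies the situation as certainty, risk, ambiguity, or uncertainty in the sense used above. It is an illustration of the taxonomy, not a model proposed by the authors.

```python
def classify_decision_situation(outcomes_known: bool,
                                probabilities_known: bool,
                                single_sure_outcome: bool = False) -> str:
    """Rough classification following the terminology summarized above;
    the reduction to booleans is an assumption made purely for illustration."""
    if not outcomes_known:
        # The entrepreneur cannot even foresee which outcomes are possible.
        return "uncertainty"
    if not probabilities_known:
        # Possible outcomes are foreseeable, but their probabilities are not.
        return "ambiguity"
    if single_sure_outcome:
        # One particular outcome is sure.
        return "certainty"
    # Outcomes and their probabilities are known or can be retrieved.
    return "risk"

# Example: creating a market that does not yet exist
print(classify_decision_situation(outcomes_known=False, probabilities_known=False))
# -> "uncertainty", the setting for which the Creation Theory is formulated
```

Into which branch a concrete situation falls then indicates, in the terms used above, whether the Discovery Theory or the Creation Theory offers the more appropriate behavioral prediction.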
In situations where the entrepreneur has sufficient knowledge or, at least, is able to retrieve it in order to determine the risks, his or her actions lead more likely to successful innovation if they are consistent with the Discovery Theory; in contrast, if the entrepreneur acts under uncertainty, thus, is not even able to foresee that a specific outcome is possible, he or she will more likely be successful when acting consistent with Creation Theory.109 Indeed, Alvarez and Barney also stress for cases in between: First, ambiguous situations where an entrepreneur has enough information to foresee that an outcome is possible, but not its probability; in these cases their predictions are less clear.110 Second, there are also situations where the advantage of one process methodology toward the other one may change over time if entrepreneurs are moving from “risky” to “uncertain” situations, and vice versa.111 In any case, both theories provide illustrative examples of how economics conceptualize the action-related logics of entrepreneurs and which role legal regulation may play with respect to the knowledge base for their activities. 106 See Alvarez and Barney, ibid., p. 32. 107 See Alvarez and Barney, ibid., p. 36. 108 See Alvarez and Barney, ibid., pp. 36 and 37. 109 See Alvarez and Barney, ibid., pp. 33 and 34. 110 See Alvarez and Barney, ibid., p. 35. 111 See Alvarez and Barney, ibid., p. 34. B. Conceptual definitions as a link for regulation 68 Strategic management: Causation and effectuation This leads to the third aspect being of interest for this thesis. Economics discuss two approaches describing in more detail the different logics of how entrepreneurs may decide and act in specific situations named “causation” and “effectuation”. Sarasvathy describes these two approaches as: “Causation processes take a particular effect as given and focus on selecting between means to create that effect. 
Effectuation processes take a set of means as given and focus on selecting between possible effects that can be created with that set of means.”112 Sarasvathy exemplifies the implications of this approach as follows:
Table: Differentiating aspects of Causation and Effectuation Processes (words in bold and/or italic highlighted by the author)113
– Givens: Causation – effect is given; Effectuation – only some means or tools are given.
– Decision-making selection criteria: Causation – help choose between means to achieve the given effect; selection criteria based on expected return; effect dependent: choice of means is driven by characteristics of the effect the decision maker wants to create and his or her knowledge of possible means. Effectuation – help choose between possible effects that can be created with given means; selection criteria based on affordable loss or acceptable risk; actor dependent: given specific means, choice of effect is driven by characteristics of the actor and his or her ability to discover and use contingencies.
– Competencies employed: Causation – excellent at exploiting knowledge; Effectuation – excellent at exploiting contingencies.
– Context of relevance: Causation – more ubiquitous in nature; more useful in static, linear, and independent environments. Effectuation – more ubiquitous in human action; explicit assumption of dynamic, nonlinear and ecological environments.
– Nature of unknowns: Causation – focus on the predictable aspects of an uncertain future; Effectuation – focus on the controllable aspects of an unpredictable future.
– Underlying logic: Causation – to the extent we can predict the future, we can control it; Effectuation – to the extent we can control the future, we do not need to predict it.
– Outcomes: Causation – market share in existent markets through competitive strategies; Effectuation – new markets created through alliances and other cooperative strategies.
112 See Sarasvathy, Causation and Effectuation, p. 245. 113 Following Sarasvathy, ibid., p. 251. Fueglistaller et al. refer to both approaches in order to illustrate the process of strategic management. They define the term “strategy” as the “systematic planning of all business activities and processes in order to pursue long-term competitive advantages.”114 The classic strategic management process is usually categorized along four phases: analysis, development of strategic goals, strategic execution, and control.115 In contrast to such a linear-causal approach, the effectuation approach focuses on the means available in a specific situation and the iterative, non-linear development of the strategic aims. The effectuation approach thus fits well with situations defined by many unknown factors, in which, for example, startups mainly operate.116
Entrepreneurial contexts: The Law as one influencing factor in innovation processes amongst others
When the focus lies on specific situations and the means actually available to an entrepreneur, the context plays a more important role. Welter highlights the importance that specific historical, institutional, societal and social contexts can have in determining the resources as well as the opportunities and boundaries for entrepreneurial activities.
From this perspective, the legal regulatory framework is, as an example of formal institutions, one impact factor for “entrepreneurship as taking place in (further) intertwined social, societal, and geographical contexts, which can change over time and all of them which can be perceived as an asset or a liability by entrepreneurs” (word in brackets added by the author).117 Welters also stresses the recursivity of links between these contexts during the entrepreneurial process.118 Innovation produced by entrepreneurs hence, is not the result of a one-dimensional and linear process, but of a multi-factor-based non-linear process.119 Fagerberg highlights this interdependency as an essential read) 114 See Müller et al., Entrepreneurship – Strategy and business model, p. 138: “Strategie: Die planvolle Ausrichtung sämtlicher Unternehmensaktivitäten und -prozesse zur Erzielung langfristig wirkender Wettbewerbsvorteile.” 115 See Müller et al., ibid., p. 143. 116 See Müller et al., ibid., p. 147 to 150. 117 See Welter, Contextualizing Entrepreneurship, pp. 172 and 176. 118 See Welter, ibid., pp. 177. 119 See Neveling et al., ibid., pp. 371 and 372 with references to J. S. Metcalfe, Impulse and Diffusion in the Study of Technical Change, Futures 13 (1981), p. 347, B. Conceptual definitions as a link for regulation 70 son for why many inventions take time turning, if at all, into an innovation as: “There may not be sufficient need (yet!) or it may be impossible to produce and/or market because some vital inputs or complementary factors are not (yet!) available.”120 Mayer-Schönberger concludes from this that many current laws suffer from a conceptual flaw because they would imply, in his opinion, a linear model of innovation processes. Taking the multi-dimensional and non-linear model seriously, the legislator should give up its reactive approach and understand itself, instead, as proactive actor directly creating – equally beside the other mechanisms (be they technical, social, cultural etc.) – business opportunities, and not only facilitating them.121 Regulation of innovative entrepreneurship The preceding illustration of Entrepreneurship theories provides several links in order to answer the question of how innovation may be regulated through the law. First, considering entrepreneurs as the main driver of innovation (in which organizational form ever this occurs)122 they appear to be appropriate addressees of laws aiming to regulate such innovation. Second, the action-oriented approach of entrepreneurship theories, in particular, the Discovery and Creation Theory corresponds to the regulatory approach applied in this thesis, which focuses, equally, on action.123 Third, entrepreneurship models describing the entrepreneurial process correspond with the observation made in practice, as well as in regulation theory, that innovation often, if not mainly, occurs in highly dynamic non-linear processes, and not in causal-linear ways.124 There are indeed causallinear innovation processes, such as in research science; however, academics stress that most innovations do not occur in research settings but instead is driven by the experience of users and, thus, in more non-linear 2. as well as K. J. Schmidt-Tiedemann, A New Model of the Innovation Process 25 (1982), pp. 18 ff. 120 See Fagerberg, ibid., pp. 5 and 6. 121 See Mayer-Schönberger, The Law as Stimulus: The Role of Law in Fostering Innovative Entrepreneurship, pp. 180 to 183. 122 See Fueglistaller et al., Entrepreneurship – Basics, pp. 12 and 13. 
123 See above under point A. II. Research questions and approach. 124 Cf. above under point A. I. 4. Practical examples referring to two typical scenarios, and A. II. Research questions and approach. I. Innovation and entrepreneurship 71 environments.125 Finally, the context-oriented view of entrepreneurship research corresponds with the self-understanding of the regulatory approach considering the law as just one mechanism beside further ones, such as informal norms or geographical conditions.126 Even if there is neither a common understanding of innovation or entrepreneurship research, in general, nor a holistic theory of entrepreneurial processes and its contextualization, in particular, the preceding aspects make it suitable as a conceptual model of reality for doing research on the effects of legal regulation instruments on processes of innovation.127 The following paragraphs shall shed further light on the various effects of regulation on “innovative entrepreneurship” discussed in entrepreneurship as well as legal literature. Do laws simply shift societal costs either protecting against or being open to innovation? The legislator may shape laws conflicting with the non-linearity of innovation processes in order to protect individuals concerned. The principle of purpose limitation could be considered as an example for such a law, at least so long as it requires from the controller to exactly specify the intended use of personal data and then strictly limit the later use to this initial specification. Such an understanding of the principle of purpose limitation principally conflicts with the openness of innovation processes because it does not allow controllers to use the data for purposes other than for those that the controller could foresee when the data is collected. Mayer-Schönberger describes such a law as simply shifting costs between different groups in society. He gives an example of labor law in order to illustrate his opinion: The legislator can structure labor law in such a way, allowing entrepreneurs to easily hire and fire employees. On the one hand, this would enable entrepreneurs to save costs, i.e. constantly adapt expenses for human resources to the actual need at low transaction costs. On the other hand, either the employee concerned has to bear the costs for finding new employment (or other ways of financing his or her living expenses) or a) 125 See Fagerberg, ibid., Box 1.3 “What innovation is not: the linear model“, p. 11. 126 Cf. above under point A. II. Research questions and approach. 127 See again Fagerberg, ibid., p. 1; Fueglistaller et al., ibid., p. 6; Moroz and Hindle, ibid., p. 781; Welter, ibid., p. 177. B. Conceptual definitions as a link for regulation 72 the state for supporting the unemployed.128 In light of this, the principle of purpose limitation as described before may be considered as simply shifting costs from the individual concerned to the controller referring to an assumption as: If the later use of personal data is limited to the originally specified purpose, the individual (or the social welfare state) may suffer less harm and though have less costs; the controller bears these costs, in turn, being limited in its innovation process. Principles between openness toward innovation and legal uncertainty In contrast, the legislator might also choose another way and decrease costs overall. 
Instead of shaping a law that only shifts costs from one group in society to another, the legislator might “also influence the probability of incurring a cost even when holding expected values (and thus costs for taxpayers) constant, thus prompting more people to engage in entrepreneurial activity”.129 In the first instance, principles may be considered as such a regulation instrument. As illustrated in the introduction, the legislator does not often have sufficient knowledge for determining precisely the circumstances of an entrepreneurial process and its impact on society. Therefore, the legislator can choose to establish principles, which leaves private companies more room in finding the best solutions themselves in order to meet the regulatory aim. Indeed, this form of regulation decreases legal certainty because the companies are not able to state whether or not they actually meet the regulator’s exact expectations.130 So far, at least, from this perspective, the principle of purpose limitation does not simply shift costs from the individuals to the controllers. Instead, it gives controllers room to find the best solution to apply the principle of purpose limitation and, thus, different ways of avoiding costs, not only for themselves, but also for the individuals concerned. This approach assumes that it is possible, in principle, for the controller to use, for example, personal data in a very broad way, or even for another purpose than initially specified, so that the way the data is being used does not harm the individual, and thus, does not lead in an increase in costs for the individual or sob) 128 See Mayer-Schönberger, ibid., with further examples on pp. 175 ff. 129 See Mayer-Schönberger, ibid., p. 180, see also pp. 176/177. 130 Cf. again Raab and De Hert, ibid., p. 278; Eifert, ibid., cip. 25 and 26; Franzius, ibid., cip. 7, 17, 81 to 103;. I. Innovation and entrepreneurship 73 ciety. If this assumption turned out to be correct, i.e. no costs for the individual or society, the subsequent question is: what impact the decrease of legal certainty has on entrepreneurial activity. Legal (un)certainty as a factor that mediates the regulatory burden In order to answer this question, two empirical studies shall be highlighted. First, the study conducted by Hartog et al. examined the impact of the regulatory burden and rule of law on entrepreneurial activity. Their results confirmed previous works “suggesting that social security entitlements, taxes, and employment protection legislation are negatively associated with (different forms of) entrepreneurial activity.”131 This result corresponds to Mayer-Schönberger’s understanding of the type of regulation that shifts costs from one group in society to another. However, their study additionally came to the (seemingly) counter-intuitive result that countries with stronger rule of law had lower entrepreneurial activities. The authors considered this result as counter-intuitive because they assumed that a strong rule of law would not only hinder entrepreneurial activity, but would also help entrepreneurs, for example when they want to enforce their own contracts that they have concluded with third parties.132 Hartog et al. 
considered that a possible reason for this result was that, in developed countries, primarily large enterprises profit from the benefits of a strong rule of law.133 The second study, which was conducted by Levie and Autio, proposes a more detailed explanation for this phenomenon: “Entrepreneurial and new ventures face disproportionately high compliance costs, because their small initial size makes it costly for them to maintain compliance functions internally. For industry incumbents, whose large size permits a greater degree of internal specialisation and the maintenance of a larger administrative function in absolute terms, compliance costs are less significant.”134 131 See Hartog et al., Institutions and Entrepreneurship: The Role of the Rule of Law, p. 3. 132 See Hartog et al., ibid., p. 8. 133 See Hartog et al., ibid., p. 3. 134 See Levie and Autio, Regulatory Burden, Rule of Law, and Entry of Strategic Entrepreneurs: An International Panel Study, p. 1411. If one were to presuppose that there is a causal relationship, these considerations lead to the result that higher legal certainty hinders innovative entrepreneurs rather than enabling them to pursue their activity. At least this is the case so long as the entrepreneur’s organizational structure remains so small that bearing the compliance costs is still disproportionate. In this study, however, Levie and Autio came to a more nuanced result. They took a deeper look at the particular interplay between the regulatory burden and the rule of law and its effects on strategic entrepreneurial decisions. Referring, amongst other units of analysis, to an individual’s decision to enter into business and, conceptually, to Signaling Theory, they assumed that individuals, who aim to profit most from their decisions, make their decisions in light of how they perceive the influence of institutional factors within society in relation to their activities. Similar to Mayer-Schönberger’s understanding of a regulation shifting costs between different societal groups, the way entrepreneurs perceive these factors regulates “the distribution of profits between stakeholders and, thus, the accumulation and appropriability of returns to entrepreneurial efforts.”135 Levie and Autio derived a further conceptual dimension from this: their findings confirmed, firstly, the already known assumption that a “lighter regulatory burden (is) associated with a higher rate and relative prevalence of strategic entrepreneurial entry (word in brackets added by the author).”136 However, the new finding was that rule of law “moderates this effect such that regulation has a significant effect on strategic entry only when rule of law is strong.”137 Instead of a weaker rule of law, as considered by Hartog et al., Levie and Autio thus suggest that a stronger rule of law enables entrepreneurship, under the condition that the regulatory burden is low. In order to explain this suggestion, Levie and Autio generally considered four different types of interrelationships: First, if the rule of law is weak and the regulatory burden is heavy, corrupt officials get the opportunity to siphon off entrepreneurial rents; even if corruption is low, strategic entrepreneurs are more likely to interact with officials than non-strategic entrepreneurs and, thus, run a higher risk of being regulated heavily.
Second, if the rule of law is weak and the regulatory burden is light, corrupt officials have fewer opportunities to siphon off entrepreneurial rents; however, entrepreneurs are less able to defend their own interests against other private parties by means of law. Third, if the rule of law is strong and the 135 See Levie and Autio, ibid., p. 1395. 136 See Levie and Autio, ibid., p. 1392. 137 See Levie and Autio, ibid., p. 1392. I. Innovation and entrepreneurship 75 regulatory burden is heavy, officials have fewer opportunities to siphon off entrepreneurial rents and entrepreneurs are able to defend their interests against other parties by legal means; however, they must pay the costs resulting from a heavy (effective) regulation. Consequently, Levie and Autio promote the fourth case as the best solution; if the rule of law is strong and the regulatory burden is low, entrepreneurs do not end up paying for corruption costs resulting from heavy regulation, but they also have sufficient legal means to defend their interests.138 Even if their study referred to the distribution of profits between entrepreneurs and employees and, thus, to the choice of being a potential employer or an employee,139 they draw a more general conclusion as: “Bureaucracy and red tape hamper entrepreneurial growth and divert scarce resources of potentially highgrowth entrepreneurial firms away from their core business. Regulations, then, can adversely affect the prevalence and anatomy of entrepreneurial activity, particularly in countries in which the rule of law is respected.”140 Thus, in their opinion, if the regulatory burden is low, high legal certainty not only enables innovative large companies, but also small and middlesized companies. Conditioning further legal certainty as a promoting factor for entrepreneurial activity These results lead back to Mayer-Schönberger’s approach. He considers a strong rule of law as an incentive for entrepreneurial activity. He argues that in light of the many uncertainties entrepreneurs are confronted with, they generally prefer to precisely know what the law expects from them. In Mayer-Schönberger’s opinion, this knowledge would enable them to calculate their legal risks and associated costs. From this point of view, “the role of the legal system in facilitating entrepreneurial activity is to reduce the uncertainties that entrepreneurs perceive.”141 Mayer-Schönberger refers, similarly to Levie and Autio, to the Expected Utility Theory. Howbb) 138 See Levie and Autio, ibid., pp. 1400 and 1401. 139 See Levie and Autio, ibid., pp. 1395 and 1396. 140 See Levie and Autio, ibid., p. 1411. 141 See Mayer-Schönberger, ibid., pp. 177 and 178; cf. also Kloepfer, Law enables Technology – About an understimated function of environmental and technology law, p. 417 and 418. B. Conceptual definitions as a link for regulation 76 ever, he emphasises that the focus should be on how the law may play a decisive role in entrepreneurial risk calculation: In light of the individually different capabilities of evaluating risks, Mayer-Schönberger clarifies, at first, that more legal certainty does not necessarily lead to better entrepreneurial decisions but, at least, to more entrepreneurial activities. Second, in light of empirical findings demonstrating that individuals become more risk-averse the higher the potential payoff is, he suggests to increase legal predictability if entrepreneurs face high benefits or costs. 
Third, since individuals are more risk-averse when they evaluate potential benefits and more risk-taking regarding possible losses, he proposes “that lawmakers should focus on making legal rules more certain for financial benefits offered to entrepreneurs, like subsidies, rather than costs, like taxes”.142 He concludes that this perspective would enable the regulator to enhance entrepreneurial activity without decreasing protection, i.e. increasing costs, for third parties.143 142 See Mayer-Schönberger, ibid., pp. 179 and 180. 143 See Mayer-Schönberger, ibid., p. 180.
c) Interim conclusion with respect to the principle of purpose limitation
So far, there appears to be a conflict. In the first instance, the principle of purpose limitation is principally open toward innovation because it leaves data controllers enough room to find the most cost-effective way of applying the principle. However, in the second instance, the principle of purpose limitation decreases legal certainty and therefore fails to enhance entrepreneurial activity. Nevertheless, the previous considerations allow us to come to the conclusion that there are different hypotheses regarding the interplay between the principle of purpose limitation and data-driven innovation: First, legal certainty acts as an incentive for entrepreneurs to apply the law, so long as the regulatory burden does not turn into red tape. Whether this is the case or not with respect to the principle of purpose limitation depends on its interpretation and application in the specific case. Second, the higher the potential payoff for entrepreneurs is, the better legal certainty can act as an incentive to apply the principle of purpose limitation. This means that mechanisms clarifying how to apply the principle of purpose limitation work better the more the data controllers potentially stand to lose or gain. The first might be the case if the penalties for non-compliance with the principle of purpose limitation are so high that the controller would consider their enforcement a real loss. The second might be the case if the controller is about to break through in gaining users, customers or financial investors for its product, service or enterprise and these parties require, in exchange for giving data controllers their trust (i.e. personal data, money or investment), an assurance that the controller is applying the law (the principle of purpose limitation). This second case refers to the so-called competitive advantage of data protection law:144 Users may only disclose their data to the data controller or customers may only pay for the product if certain data protection principles are met. Financial investors might verify whether the data controller has complied with data protection law, similarly to compliance with copyright law, as a condition for their investment. Indeed, there is little scientific evidence as to what extent users, customers, or investors really expect such compliance with data protection law. However, there is at least one study which demonstrates that users prefer products from online merchants with better privacy policies even if they have to pay a higher price for the product.145 In any case, so long as a user or customer base does not yet constitute a real asset for the data controller or it does not need external investment, these requirements do not serve as an incentive per se.
However, the moment where these factors constitute an asset for the controller, the second hypothesis becomes relevant: Since potential gains serve better than potential losses as incentive, the legislator should focus more, if it had to choose, on increasing legal certainty enabling entrepreneurs to exploit a competitive advantage than on penalties. 144 See, instead of many others, the ”Statement by Vice President Neelie Kroes‚ on the consequences of living in an age of total information’“ from the 4th of July 2013, retrieved on the 10th of March 2016 from http://europa.eu/rapid/press-relea se_MEMO-13-654_en.htm. 145 See Nissenbaum, Privacy in Context, p. 106 referring to Tsai, J., Egelman, S., Cranor, L., and Acquisti, A. 2007. The Effect of Online Privacy Information on Purchasing Behavior: An Experimental Study. Paper presented at the 6th Workshop on the Economics of Information Security (WEIS), Carnegie Mellon University, Pittsburgh, PA, p. 35. B. Conceptual definitions as a link for regulation 78 Data protection as a risk regulation After having illustrated how economic models about innovative entrepreneurship provide links for doing research on the regulation of innovation, this sub-chapter draws the attention to the other side of the regulation of data-driven innovation, i.e. the protection against the risks. In the preceding considerations, the terms “risks”, “dangers”, “threats” and “harms” were already mentioned frequently, even if, however, rather casually. The following considerations clarify the meaning of these terms and how they serve, conceptually, as links for regulation. Risk terminology oscillating between “prevention” and “precaution” Legal scholars stress the function of data protection law as a regulation of risks.146 And many data protection sources indeed aim to regulate risks caused by the processing of personal data. The revised OECD Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data define, for example, its scope of application by referring to personal data as “which, because of the manner in which they are processed, or because of their nature or the context in which they are used, pose a risk to privacy and individual liberties.”147 With respect to the EU directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the movement of data (Data Protection Directive), the Article 29 Data Protection Working Party stresses that the risk-based approach is “not a new concept, since it is already well known under the current Directive 95/46/EC.”148 Indeed, in several provisions, the Data Protection Directive explicitly refers, for instance, to “the risks represented by the processing” (regarding data security under Article 17), to “specific risks to the rights and freedoms of data subjects” (regarding prior checking under Article 20), and to the proportionality test (general clause II. 1. 146 See Kuner et al., Risk management in data protection; Costa, Privacy and the precautionary principle; Gellert, Data protection: a risk regulation? Between the risk regulation of everything and the precautionary alternative. 147 See OECD Guidelines Governing The Protection Of Privacy And Transborder Flows Of Personal Data in Article 2. 148 See the Article 29 Data Protection Working Party, Statement on the role of a riskbased approach in data protection legal frameworks, p. 2. II. Data protection as a risk regulation 79 for the controller’s legitimate interests under Article 7 lit. 
f) that is typical for risk regulation regimes.149 In the forthcoming General Data Protection Regulation (GDPR), risks play an even more important role, in particular, with respect to the so-called risk-based approach. Veil categorizes the multitude of terms referring to the risk-based approach and its legal consequences. For example, while one category referring to high risks can lead to the application of specific requirements, another category referring to low risks may result in the exclusion of requirements; yet another category determines, for instance, the extent and manner of how data controllers must implement measures protecting against risks.150 In this last regard, Article 24 of the General Data Protection Regulation provides a central provision, stating: “Taking into account the nature, scope, context and purposes of the processing as well as the risks of varying likelihood and severity for the rights and freedoms of individuals, the controller shall implement appropriate technical and organisational measures to ensure and be able to demonstrate that the processing of personal data is performed in compliance with this Regulation. These measures shall be reviewed and updated where necessary.”151 The Article 29 Data Protection Working Party stresses that such a risk-based approach “goes beyond a narrow ‘harm-based-approach’ that concentrates only on damages and should take into consideration every potential as well as actual adverse effect, assessed on a very wide scale ranging from an impact on the person concerned by the processing in question to a general societal impact (e.g. loss of social trust).”152 149 With respect to the last aspect, see Kuner et al., ibid., p. 98, as well as Costa, ibid., p. 19. 150 See Veil, GDPR: Risk-based approach instead of rigid principle of prohibition, pp. 351 and 352. 151 Cf. already the Article 29 Data Protection Working Party, Opinion 3/2010 on the principle of accountability. 152 See the Article 29 Data Protection Working Party, Statement on the role of a risk-based approach in data protection legal frameworks, p. 4. From a historical perspective, it is indeed not a new idea to focus on risks, thus on a moment before a danger occurs. The idea behind such a temporal extension of protection is that protection for an individual, who might be the subject of the use of information, could come too late if he or she were only able to claim against the specific use of that information after it had been collected. Legal scholars recognized, very early in the discussions about data protection as well as privacy, that protection directed against the collection of the data (which provides the basis for the information) can instead be more effective. For instance, in 1969, Miller highlighted that “the most effective privacy protection scheme is one that minimizes the amount of potentially dangerous material that is collected and preserved; a regulatory scheme that focuses on the end use of the data by governmental or private systems might be a case of too little, too late.”153 The reason for this fear is that once information is spread, in metaphorical words, the cat is let out of the bag, and it is difficult to get it back.
Once the State or a private entity knows something about somebody else, it can base its decisions (with all possibly negative consequences for the individual concerned) on this knowledge.154 Thus, from a regulatory perspective, it seems more difficult to force the State or a private entity not to base its decisions on this knowledge than to regulate the collection of the personal data as the source of this informational risk. Such a risk-related regulatory approach also plays an important role in Germany. Costa refers to the so-called precautionary principle that was first formalized by Germany during the 1970s in environmental law;155 and Gellert quotes the “pioneering” data protection legislation established by the German Land Hessen that “implicitly frames data protection as a risk regulation regime since one of its purposes is to: ‘safeguard the constitutional structure of the state (…) against all risks entailed by automatic data processing’.”156 The German legal scholar Roßnagel draws attention to the regulator’s protection instruments resulting from such a risk approach. He highlights the principle of data minimization as an example of the precautionary principle because it extends, similar to the minimization principle in environmental law, the protection provided for by preventative means by adding precautionary means. In his opinion, the requirement of data minimization in particular goes beyond the necessity requirement (i.e. the requirement that the data processing must be necessary for achieving the purpose of the intended processing) because the latter depends on a specific purpose while the former questions the purpose per se. 153 See Miller, Personal Privacy in the Computer Age: The Challenge of a New Technology in an Information-Oriented Society, p. 1221. 154 See Grimm, Data protection before its refinement, p. 586. 155 See Costa, ibid., p. 4, referring to Olivier Godard, “Introduction générale“, in: “Le principe de précaution dans la conduite des affaires humaines“ (Paris: Editions de la Maison des sciences de l’homme Institut National de Recherche Agronomique, 1994), p. 25. 156 See Gellert, ibid., p. 5, referring to Lee A Bygrave, Data Protection Law—Approaching Its Rationale, Logic, and Its Limits (Kluwer, The Hague; London; New York 2002), 39, at 5. Thus, the principle of data minimization does not require asking whether or not the processing is necessary for a given purpose but whether the purpose as such can be formulated more narrowly in order to minimize the data collection as a whole. In light of this, Roßnagel differentiates between both principles pursuant to their range of protection: while the necessity requirement serves the prevention of dangers, the requirement of data minimization is a means of precaution.157 This consideration leads to the question of how to differentiate, actually, between prevention and precaution.
Sociological approaches defining “dangers” and “risks”
The German legal scholar Jaeckel considers the difference between prevention and precaution as corresponding to the question of how to differentiate between dangers and risks.158 Indeed, while there is common ground on the meaning of an actual harm or damage, e.g. “a loss to a person or their property”159, the precise meaning of terms like danger and risk referring to a potential harm (i.e. an overall threat) is less clear.
Jaeckel gives an overview about sociological and legal conceptions of how to differentiate between dangers and risks.160 From a sociological perspective, she highlights the concepts proposed by Evers and Novotny, on the one hand, and Luhmann, on the other hand. Evers’ and Novotny’s starting point is to define “risk” as a term seeking to make dangers calculable. Thus, the specific knowledge about the probability and severity of a threat turns dangers into risks.161 Subsequently, Evers and Novotny draw the attention to the normative dimension of risks. 2. 157 See Roßnagel, The Requirement of Data Minimization, pp. 43 to 45. 158 See Jaeckel, Differentiating between Danger and Risk, p. 117; Prevention of Danger through Law and Legal Conceptualization of Risk, p. 70. 159 See, for example, Costa, ibid., p. 14. 160 See Jaeckel, Prevention of Danger through Law and Legal Conceptualization of Risk, pp. 49 ff. 161 See Jaeckel, ibid., pp. 51 and 52, by referring to Evers and Novotny, Umgang mit Unsicherheit, Suhrkamp 1987, Berlin; cf. also Gellert, ibid., pp. 7 and 13, referring to Patrick Peretti-Watel, La société du risque (Repères. La Découverte, Paris 2010); Olivier Borraz, Les politiques du risque (Presses de Sciences Po, Paris 2008), Jenny Steele, Risks and Legal Theory, vol 68 (Hart Publishing, Oxford, B. Conceptual definitions as a link for regulation 82 They stress that the difference between dangers and risks depends on its general perception in today’s society. For example, citizens express their concerns and fears about a certain issue like environmental pollution or state surveillance based on an abuse of personal data because there is a societal consensus that environmental health or privacy or autonomy in a democratic civil society is a value. Thus, the moment citizens perceive a non-calculable threat for environmental health, their privacy or autonomy, this perception can turn a risk back to a danger for these values. Jaeckel stresses Evers’ and Novotny’s conclusion that mathematic and system-analytical methods of calculating risks alone can hence not explain the treatment of uncertainties in a society; instead, this treatment also depends on its normative expectations.162 Luhmann, in contrast, differentiates between dangers and risks pursuant to the question of who is considered as responsible for the (potential) harm. If the harm is considered as resulting from an external factor, Luhmann refers to the term “danger”; instead, there is a risk if the harm is considered as resulting from a human decision. Jaeckel considers this perspective as interesting from a legal viewpoint because it illustrates that not only decisions which lead to active action but also decisions not to act, may in itself be considered as causing risks. For example, the prohibition of a certain medicine against a certain disease can avoid risks resulting from unwanted side effects but, simultaneously, create or increase the risk caused by the disease itself. This nature of decisions as a two-sided sword UK; Portland, Oregon 2004) 21, Jacqueline Peel, Science and Risk Regulation in International Law (Cambridge University Press, Cambridge, UK 2010) 79–80. 162 See Jaeckel, ibid., pp. 51 and 52, by referring to Evers and Novotny, Umgang mit Unsicherheit, Suhrkamp 1987, Berlin; cf. also van Dijk, Gellert and Rommetveit, A risk to a right? Beyond data protection risk assessments, p. 13, referring, amongst others, to Felt U,Wynne B, Callon M, Gonçalves ME, Jasanoff S, Jepsen M, et al. 
Taking European knowledge society seriously (report of the expert group on science and governance to the science, economy and society directorate, directorate-general for research). Luxembourg: European Commission; 2007, as well as Irwin A,Wynne B, editors. Misunderstanding science? – the public reconstruction of science and technology. Cambridge: Cambridge University Press; 1996; see, regarding the German perspective, at Forum Privatheit, White Paper – Data Protection Impact Assessment, pp. 29 and 30. II. Data protection as a risk regulation 83 leads to the result that potential negative effects must always be weighed against potential positive effects in order to determine the overall risk.163 In any case, Jaeckel comes to the conclusion that both concepts do actually not correspond to approaches developed so far in (German) legal literature: Luhmann’s concept does not help, in her opinion, determine the real risk or danger and, therefore, does not answer the question of which protection instruments are needed in order to establish against real risks or dangers. And the concept by Evers and Novotny contradicts the legal discussion considering the relationship between danger and risk in the reverse direction. In Germany, at least, the legal discussion considered that a danger was the calculable threat, whereas a risk was considered as an uncertain threat that could not comprehensively be grasped.164 German legal perspectives: Different protection instruments for different types of threat In Germany, initially focusing on police law, the debate centered, for more than a century, on the notion of prevention of danger. In contrast, the legal debate started to develop the notion of precaution against risks in the 1980’s, holding the reference to this relatively new term as a necessary answer to the scientific and technological progress.165 This progress produced a new type of threat that did not appear to fit to the classic understanding of a danger. The debate discovered, in particular, the following characteristics: First, these threats only become apparent after a long period had lapsed and/or when it is looked at from a global perspective; second, only the combination of several issues, which are, per se, not risky if they remain a singular phenomenon, together cause a threat; or third, a threat is indeed extremely unlikely but runs the risk of causing an ex- 3. 163 See Jaeckel, ibid., pp. 53 to 56, referring, amongst others, to Luhmann, Soziologie des Risikos, pp. 30 ff, as well as, ibid., Die Moral des Risikos and das Risiko der Moral, in: Bechmann, Risiko und Gesellschaft, pp. 327 and 331. 164 See Jaeckel, ibid., pp. 52 as well as 55 and 56. 165 See Jaeckel, ibid., p. 57, referring, amongst others, to decisions of the Prussian Higher Administrative Court (Preußisches Oberverwaltungsgericht) as well as to Murswiek, Die staatliche Verantwortung für die Risiken der Technik, p. 80, and Kloepfer, Umweltrecht, 1. Auflage 1989, p. 45 cip. 46. B. Conceptual definitions as a link for regulation 84 tremely severe and irreparable harm.166 In light of the perception of such risks in society as a new form of threat, the legislator started to use the term in law, and the legal discussion started to react to this term by clarifying its precise meaning and extent. 
Protection pursuant to the degree of probability At first, the legal discussion elaborated on a three-layered model differentiating between dangers, risks, and remaining risks combined with different legal consequences: While a regulator had to strictly prevent a danger, it could only minimize a risk; and there also is a remaining risk that had to be accepted without protection against it. On the basis of this differentiation, this model defined the term danger as a situation that may turn, with sufficient probability, into a harm for a specific object of protection if nobody were to stop this causal chain. Certainty about the harm, thus, is not necessary; however, the concept of harm as being an only possible threat was considered as insufficient for regulation. Between these two poles, i.e. certainty and possibility, the regulation depended on the probability of the harm. Indeed, there is no fixed probability required, instead, the following balancing exercise had to be carried out: The more severe the potential harm is, the less probable it had to be in order to create a state duty of protection, and vice versa. Indeed, the moment the existence of a danger could be determined, the State had to prevent it, irrespective of how much effort had to be spent on prevention; in the worst case scenario, the State or any other party had to refrain from the action or decision that caused the danger.167 In contrast to such a prevention of dangers, precaution against risks takes place before preventative measures can protect against threats. Pursuant to the three-layered model, a situation is risky if harm is possible but the methods elaborated with respect to a danger cannot determine its probability. This might be the case because of one of the following three reasons, which were mentioned previously: First, the negative effects of an action or decision may take place too far in the future; second, its causality is hard to determine because there are too many factors leading to the poa) 166 See Jaeckel, ibid., p. 58 with reference to Murswiek, Die staatliche Verantwortung für die Risiken der Technik, p. 80. 167 See Jaeckel, ibid., pp. 57 to 60 with further references. II. Data protection as a risk regulation 85 tential harm; or third, its probability is just too low. In light of the lower threat of a risky situation than of a dangerous one, the regulator does not have to prevent the threat as a whole but only to minimize it. Furthermore, this duty depends on the technical possibilities, as well as the proportionality between efforts and utility. Another difference between prevention of a danger and precaution against risks is that the individual concerned has a subjective right to protection only against dangers but not against risks. Finally, this three-layered model acknowledged a third category of threat, i.e. remaining risks that must be socially accepted without having protection measures against it. This results from the fact that no technology can guarantee full protection against all threats imaginable. A duty of protection against such threats would therefore be disproportionate and lead to a prohibition of technology development.168 Jaeckel confirms that this three-layered approach brought to light the issue that there are different kinds of threats that require different protection instruments. However, the problem of this model was that it only superficially provided a clear differentiation between dangers, risks, and remaining risks. 
In fact, it was hardly possible to precisely determine which situation bears a danger, a risk, or only a remaining risk. This uncertainty was problematic because the three-layered model tied precise legal requirements to these three categories: if one type of threat (i.e. danger) requires preventative protection measures, another type of threat (i.e. risk) requires only minimizing measures, and a third type of threat (i.e. remaining risk) requires no protection at all, then their differentiation should be clear.169 In order to minimize this problem, legal scholars therefore proposed a two-layered model that mainly differentiated between dangers and risks, on the one hand, and remaining risks, on the other. This two-layered model considered risk as the umbrella term and a danger as a specific type of risk. From this perspective, the term risk covered all possible threats, whereas a danger is a threat with a certain probability.170 Jaeckel affirms that this concept enables one to tie different proportionate protection instruments to different types of threats without drawing an artificial and over-formalistic line of distinction. However, in her opinion, it would nevertheless be helpful to clearly differentiate between dangers and risks in order to choose the adequate and proportionate protection instruments.171

168 See Jaeckel, ibid., pp. 60 and 61 with further references.
169 See Jaeckel, ibid., pp. 62 to 63.
170 See Jaeckel, ibid., p. 66 referring to Murswiek, Die staatliche Verantwortung für die Risiken der Technik, pp. 80 ff. and 335 ff.

b) Protection pursuant to the available knowledge in linear-causal and non-linear environments

Tying into the conceptual approaches developed by Di Fabio and Ladeur, Jaeckel finally comes to the conclusion that the actual difference between dangers and risks consists in the methodologies for (administrative) “decisions under uncertainty”:172 A danger refers to a type of threat that is, based on individual and societal experience, already known, so that the State is able to react to it with an experienced set of methodologies. In contrast, the term “risk” refers to knowledge that is not certain. This perceived uncertainty results from the conceptual shift from a linear and causal approach to a non-linear and dynamic approach in understanding the world.173 In a non-linear dynamic world, “the loose connection between cause and effect requires new concepts for actions or decisions based on uncertain knowledge: ‘The connection between action and knowledge, which was made in the past through the term of danger, has to be made today, under the conditions of increased complexity and uncertainty, through the term of risk.’”174 From this knowledge perspective, the main difference between a danger and a risk hence is that an objective observer having all the knowledge of the world is principally able to determine under which conditions a danger turns into harm; in contrast, regarding risks, there is no objective knowledge horizon about the outcome, but principally only a subjective point of view.
In Jaeckel’s opinion, the regulator reacts to this paradigm shift (i.e. with respect to the knowledge uncertainties) by introducing, more and more, subjective elements into the law: first, by accumulating knowledge through the integration of expert groups and private entities; second, by stretching these procedures from a time perspective, as well as by binding them to procedural rules; and third, by acknowledging that the introduction of legal objectives, like broad legal terms and principles, corresponds with a certain limitation of judicial review. If knowledge is exclusively subjective, then the Courts have to acknowledge this subjectivity and cannot substitute it with their own “objective” point of view. Indeed, Jaeckel stresses that this limitation of judicial review only applies insofar as there really is an uncertainty that limits the construction of an objective knowledge horizon.175

171 See Jaeckel, ibid., pp. 69 and 70.
172 See Jaeckel, ibid., p. 77.
173 See Jaeckel, ibid., pp. 78 to 80.
174 See Jaeckel, ibid., p. 81, quoting Ladeur, The Environmental Law of the Knowledge Society: From the protection against dangers to the management of risks, p. 78.
175 See Jaeckel, Differentiating between Danger and Risk, p. 120.
176 Jaeckel, ibid., p. 123.
177 See Jaeckel, Duties of Protection in German and European Law, pp. 85 to 88 as well as 165 and 166; cf. also van Dijk, Gellert and Rommetveit, A risk to a right? Beyond data protection risk assessments, pp. 17 and 18.

c) Interim conclusion: Fundamental rights determining the appropriateness of protection

With respect to the protection instruments, preventative measures thus seek to directly protect against dangers, i.e. linear-causal threats of sufficient probability for specific objects of protection. In contrast, precautionary measures react to the knowledge deficiencies resulting from dynamic and non-linear environments. They serve to maintain possibilities for action if there is, for example, no objective proof for a causal connection between a certain action and a later harm for a specific object of protection. Therefore, they often refer, at first, to informational measures rather than to control. Jaeckel advocates that this conceptual difference enables the regulator to choose, with respect to the particularities of a certain area of life, the proportionate protection instruments for the different types of threats.176

Indeed, the choice of the proportionate protection instruments consists of two different questions: The first question refers to the duty of protection of the State and asks which type of threat requires which protection instrument, in other words, whether preventative or precautionary measures are necessary in order to (finally) avoid a potential harm. The answer depends, similarly to the actual harm, on the fundamental rights of the individuals concerned or other constitutional guarantees (e.g. environmental protection under Article 37 of the European Charter of Fundamental Rights).177 The second question is whether the protection instrument established in order to fulfill a State duty of protection is proportionate or not. The answer to this question does not only refer to the fundamental rights of the individual concerned, but also to the fundamental rights of the entities (e.g. entrepreneurs) which must apply this protection instrument. This answer therefore depends on the balancing exercise between the opposing fundamental rights.
This balancing exercise may result in the fact that the prevention of a certain action (e.g. its prohibition) that leads to a risk (not a danger) would be disproportionate, whereas a precautionary measure, which only seeks to gather information in order to potentially discover a danger, is proportionate. The reason is that the requirement to gather information infringes the fundamental rights of the entrepreneur less than a prohibition of its actions.178

4. Searching for a scale in order to determine the potential impact of data protection risks

The essential point here is that this doctoral thesis does not purport to decide which definition of risks and dangers is appropriate. Its aim is rather to illustrate that there are different kinds of threats that require different protection instruments. Therefore, this thesis mainly refers to the term “threat” or uses both terms “risks” and “dangers” synonymously, unless stated otherwise. In conclusion, amongst these threats, there are particular situations where there is insufficient knowledge in order to specify an object of protection threatened by a certain action or to determine a causal link between this action and a potential harm. Costa describes the precaution against these kinds of threats, giving yet another definition, as based on “hypotheses that have not been scientifically confirmed”, in contrast to the prevention of “identifiable risks”.179 In other words, “while the prevention is the remedy against the exposure with regard to a known harm, precaution is meant to avoid the mere possibility of suffering harm or loss.”180 From this point of view, both approaches of protection, i.e. prevention of known risks and precaution against unknown risks, do not exclude each other but instead complement each other. The moment the risk is “known” or “identified” is thus the essential moment when there is a switch from precautionary to preventative measures: from this moment on, the protection instruments do not primarily aim to identify a risk anymore but to prevent it.181 Such a differentiating approach is particularly important if protection measures shall not forbid all future innovations but, instead, the protection instruments applied shall be proportionate, respecting the conflicting constitutional positions, such as fundamental rights.182

178 See Jaeckel, Duties of Protection in German and European Law, pp. 85 to 88 as well as 165 and 166; Dietlein, The Doctrine of Duties of Protection of Basic Rights, pp. 105 to 109; cf. Kuner et al., ibid., p. 98; see below in more detail regarding the duties of protection point C. I. b) The effects of fundamental rights on the private sector.
179 See Costa, ibid., p. 15.

However, the most urgent challenge of such a “risk-based” approach applied to data protection law is the question of how to determine the potential harm, i.e. the object of protection that actually is threatened by a certain action or decision. Many scholars stress that, beyond common sense, i.e. that not only material but also immaterial harm must be considered, there is little agreement on how to determine the corresponding threats.183 This is a desperate situation for a regulation aiming to protect against threats caused by the processing of personal data, because effective protection is possible only if it is clear which of these threats are legally relevant.
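The switch from precautionary to preventative measures that Costa describes can likewise be sketched, in a purely hypothetical way, as a simple decision rule. The measure labels and parameter names are assumptions made for this illustration and are not taken from any legal instrument discussed in this thesis.

```python
# Hypothetical sketch of the switch described above: as long as a threat is
# not yet identified, the instruments primarily serve to gather information
# (precaution); once the risk is 'known' or 'identified', the instruments
# aim to prevent or minimize it. All labels are illustrative assumptions.

def select_protection_instruments(risk_identified: bool,
                                  causal_link_proven: bool) -> list[str]:
    if not risk_identified:
        # Precaution: maintain possibilities for action and build up
        # knowledge, e.g. through informational rather than control measures.
        return ["monitoring", "impact assessment", "documentation duties"]
    if causal_link_proven:
        # Prevention: the threat is a known, linear-causal danger.
        return ["prohibition or authorisation requirements"]
    # Identified but not fully proven: minimize the risk proportionately.
    return ["technical and organisational mitigation measures"]

print(select_protection_instruments(risk_identified=False, causal_link_proven=False))
print(select_protection_instruments(risk_identified=True, causal_link_proven=True))
```

Which concrete measures are proportionate in a given case depends, however, on precisely the open question raised above, namely how the potential harm, i.e. the threatened object of protection, is to be determined.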
The answer to this general question may lead, in particular, to further answers to more specific questions, such as: what kind of information is actually needed in order to discover threats; which threats must be accepted without protection instruments against them; how to avoid “rabulistic games” with numbers determining the probability and severity of threats; and thus how to avoid, first, that the risk-based approach undermines rights and duties provided for by fundamental rights and, second, that risk management processes provided for by ordinary data protection law “may be perverted into a self-legitimation exercise that serves no other purpose than that of managing operational and reputational risks, and which, ultimately, is itself a risk to the management of (primary) risks.”184

180 See Costa, ibid., p. 5.
181 Cf. Costa, ibid., pp. 2, 5, and 14 to 18.
182 See the criticism of the precautionary principle provided for by data protection, in particular, at Thierer, Privacy Law’s Precautionary Principle Problem.
183 See, for example, Kuner et al., ibid., p. 97; Center for Information Policy Leadership, The Role of Risk Management in Data Protection – Paper 2 of the Project on Privacy Risk Framework and Risk-based Approach to Privacy, p. 13.
184 See Gellert, ibid., pp. 14 to 17, referring, with respect to the quote, to Michael Power, The Risk Management of Everything – Rethinking the Politics of Uncertainty (Demos, London 2004), p. 19.
185 See, for example, in relation to EU law, the discussion about the terminological (and conceptual) shift from “privacy“ to “data protection“ at González-Fuster, The Emergence of Data Protection as a Fundamental Right of the EU.
186 See Nissenbaum, Privacy in Context, p. 13.
187 See Nissenbaum, ibid., p. 73.

III. Theories about the value of privacy and data protection

In order to answer this question, it is necessary to determine the overall objective that data protection actually serves. It should be stressed that this chapter does not yet precisely differentiate between theories, concepts, or approaches of privacy, on the one hand, and data protection, on the other. Both terms are therefore (still) used synonymously.185

1. The individual’s autonomy and the private/public dichotomy

Without requiring a complete and detailed description of each single theory on this matter, Nissenbaum provides, in her book Privacy in Context, an overview of “predominant themes and principles, as well as a few of the well-known theories that embody them.”186 In doing so, Nissenbaum organizes these theories into two categories: first, theories that consider privacy as related or even necessary to further moral or political values; and, second, theories that attribute the legitimacy question of privacy to the individual’s capacity to control a certain “private zone”.187 With respect to the first category, i.e. theories connecting privacy with further moral or political values, the individual’s autonomy plays an important role. There can be several threats endangering the autonomy of individuals who are concerned by the processing of personal data. Quoting Stanley Benn, Nissenbaum defines autonomy as “self-determination embodied in the individual ‘whose actions are governed by principles that are his own’ and who ‘subjects his principles to critical review, rather than taking them over unexamined from his social environment’”.188
Nissenbaum acknowledges that such an understanding of autonomy might indeed be endangered in light of the thought experiment proposed by Jeffrey Reiman called the “informational panopticum”:189 similar to Jeremy Bentham’s panoptic prison, the life of an individual trapped in an informational panopticum can be observed from one single point of view. Given the current development of collection, aggregation, and analysis of personal data, Nissenbaum does not consider such a thought experiment unreasonable.190 Instead, she delves deeper into the four types of risks for an individual’s autonomy that Reiman considers to be caused by the informational panopticum: “risks of extrinsic and intrinsic losses of freedom, symbolic risks, and risks of ‘psycho-political metamorphosis’”.191 An extrinsic loss of freedom arises when an individual suffers from negative decisions by third parties due to information these third parties are able to gather about the individual; for example, an employer receives information (which could be true or untrue) about the work performance of a potential employee and decides, based on this information, not to give the potential employee the job. An intrinsic loss of freedom results from anticipatory self-censorship: the individual fears such potential external losses and therefore omits behaviors that could lead, once somebody else is informed about them, to a negative decision made by others. The symbolic risk refers to a lack of institutional bodies and concepts affirming the right of the individual to act autonomously without having to fear losses of his or her freedom. The fourth risk of psycho-political metamorphosis finally “follows Reiman’s speculation that if people are subjected to constant surveillance, they will be stunted not only in how they act, but in how they think. They will aspire to a middle-of-the-road conventionality — to seek in their thoughts a ‘happy medium.’”192 From this perspective, a right to privacy and/or data protection protecting against these threats indeed serves an individual’s autonomy.193

188 See Nissenbaum, ibid., p. 81 quoting Stanley Benn (1971), Privacy, Freedom and Respect for Persons, in: Privacy, ed. J. R. Pennock and J. W. Chapman, New York: Atherton Press, pp. 1 to 27 (p. 24), reprinted in Philosophical Dimensions of Privacy: An Anthology, ed. F. Schoeman, Cambridge: Cambridge University Press, 1984, pp. 223 to 244.
189 See Nissenbaum, ibid., quoting Jeffrey Reiman (1995), Driving to the Panopticum: A Philosophical Exploration of the Risks to Privacy Posed by the Highway Technology of the Future, Santa Clara Computer and High Technology Law Journal 11(1): pp. 27 to 44 (p. 33).
190 See Nissenbaum, ibid., p. 75 referring to Jeremy Bentham (1995), The Panopticon Writings, M. Bozovic, ed., London: Verso.
191 See Nissenbaum, ibid., pp. 75 and 76 referring to Jeffrey Reiman (1995), Driving to the Panopticum: A Philosophical Exploration of the Risks to Privacy Posed by the Highway Technology of the Future, Santa Clara Computer and High Technology Law Journal 11(1): pp. 27 to 44 (p. 42).

However, Nissenbaum concedes that autonomy does not require that individuals are totally free from any social influence. There is a thin line to draw between coercion, manipulation, and deception, on the one hand, and respecting the individual’s autonomy, on the other.
In particular, there is no proof that the processing of personal data leads, in general and automatically, to harm for the individual’s autonomy, but only that it may.194 The preceding considerations about the individual’s autonomy lead to the second value of privacy, i.e. its value for human relationships. Several theorists stress that privacy enables individuals to decide whom they want to trust or not, i.e. it is the individuals who decide with whom they want to share personal information. Autonomy therefore is an important precondition for developing relationships.195 Finally, and equally related to the concept of autonomy, Nissenbaum refers to another scholar who stresses the importance of privacy for society as a whole: Priscilla Regan promotes the notion that privacy enables individuals to decide which aspects of their personal life they want to place in the background, distinguishing them from others, and which aspects they choose to share with others in order to signal their commonalities. This ability is an essential pre-requisite for being a citizen in a democracy, which becomes particularly obvious with respect to the freedom of association. However, there are further constitutional positions related to or even dependent on privacy, such as the fundamental right to anonymous speech or the institution of the secret ballot. These examples make apparent that privacy per se must not be at the complete disposal of individuals, who may use or abandon it, but has to be considered as a collective good. Regan advocates that this nature of privacy “as a non-excludable, indivisible collective good like clean air and national defense” gives a good reason for concluding that the legislator should regulate privacy by public law and not completely leave it to the mechanisms of the private market.196

192 See Nissenbaum, ibid.
193 Cf. Nissenbaum, ibid., p. 81.
194 See Nissenbaum, ibid., pp. 83 and 84.
195 See Nissenbaum, ibid., pp. 84 and 85 referring to Charles Fried (1986), Privacy: A Moral Analysis, Yale Law Journal 77(1): pp. 475 to 493 (pp. 477 ff.) as well as Ferdinand Schoeman (1984), Privacy and Intimate Information, in: Philosophical Dimensions of Privacy: An Anthology, ed. Ferdinand Schoeman, Cambridge University Press, pp. 403 to 418 (p. 408) and James Rachels (1975), Why Privacy Is Important, Philosophy & Public Affairs 4(4): pp. 383 to 423 (p. 326).

The second category of theories equally considers privacy as important for an individual’s ability to avoid scrutiny and approbation and hence, more generally, threats to his or her autonomy. However, these theories consider privacy as conceptualized by the preceding theories as too broad and therefore focus on its function to define a specific “private zone”. From this point of view, all concepts of privacy can only refer to a private realm but not to the public sphere. Nissenbaum calls this approach the “private/public dichotomy”.197 Pursuant to her analysis, there are three basic strands defining this private/public dichotomy. The first strand defines the dichotomy by distinguishing between private and public “actors”. The second strand defines it by distinguishing between private and public spaces.
And the third strand refers to the distinction between private and public information.198 Pursuant to these theories, a right to privacy shall exist only for these private zones, otherwise the value of privacy and, thus, protection for it is unclear.199 Criticism: From factual to conceptual changes Nissenbaum criticizes all of these approaches. With respect to the second category, theories referring to the private/public dichotomy, in her opinion, these theories are not problematic as such, but are not useful in today’s world for elaborating on a normative concept of protection. She argues: “Although, in the past, it might have served as a useful approximation for delineating the scope of a right to privacy, its limitations have come to light as digital information technologies radically alter the terms under which others – individuals and private organizations as well as government – have access to us and to information about us in what are traditionally understood as private and public domains. In the period before such 2. 196 See Nissenbaum, ibid., p. 87 referring to Priscilla Regan (1995), Legislating Privacy, Chapel Hill: University of North Carolina Press, pp. 226 and 227. 197 See Nissenbaum, ibid., pp. 89 and 90. 198 See Nissenbaum, ibid., pp. 91 ff. 199 See Nissenbaum, ibid., pp. 98. B. Conceptual definitions as a link for regulation 94 technologies were common, people could count on going unnoticed and unknown in public arenas; they could count on disinterest in the myriad scattered details about them.”200 Today, in contrast, personal data can be, once it is collected in a certain context, permanently stored and can always be analyzed and used in another context. In light of this “always-possible context change”, the linear private/public dichotomy, hence, does not serve as a useful criterion reliably distinguishing, for example, between private and public spaces or private and public information anymore.201 However, the theories described before, which focus on the value of privacy in relation to further moral or political values, in particular to autonomy, do not provide reliable criteria in order to distinguish various forms of data processing from others either. Nissenbaum summarizes, in particular, the following weaknesses of these theories as: “One recurring skeptical challenge, for instance, cites the lack of concern many people seem to demonstrate in day- to-day behaviors, contradicting claims that privacy is a deeply important moral and political value that deserves stringent protection. Another is the clearly evident cultural and historical variation in commitments to privacy, hard to explain if privacy is supposed to be a fundamental human right. A third points to the difficulty of resolving conflicts between privacy and other moral and political values, such as property, accountability, and security.”202 The shortcomings of all these theories become, in Nissenbaum’s opinion, most apparent in light of their inappropriate answers to the threats to privacy caused by modern Internet and Information technologies. The existing theories lead to the result that the public discourse discusses some of the new technologies with great anxiety even if they do actually not pose a significant risk to privacy. 
In contrast, existing concepts do not provide for sufficient protection measures against other technologies, which heavily put traditional understandings of privacy in question, only because their principles are “’blind’ to essential elements and differences” of these technologies.203 As a consequence of all these challenges, Nissenbaum finally develops her approach not by creating her own new principles of privacy, but rather by reacting to altered factual conditions and, thus, elaborating 200 See Nissenbaum, ibid., pp. 116 and 117. 201 Cf. Nissenbaum, ibid., pp. 113 ff. 202 See Nissenbaum, ibid., p. 14. 203 See Nissenbaum, ibid., pp. 103 and 104. III. Theories about the value of privacy and data protection 95 on the existing principles:204 the framework of “contextual integrity”.205 One essential element of this approach is to specify conditions for the flow of personal information with respect to a certain context. From this point of view, a right to privacy is not a right to secrecy or to control of certain information, but to appropriate flow of information.206 Interestingly, Nissenbaum also heavily criticizes the purpose-based approach. However, before analyzing this criticism and, as a consequence, coming to the question of the relationship between a “context” in which the data processing (aka information flow) takes place and the “purpose” of this processing, the next paragraph delves deeper into the approach of contextual integrity. The reason is that this approach may help, once the question of the context-purpose-relationship is clarified, find an answer to the research question about the meaning and extent of the principle of purpose limitation. Nissenbaum’s framework of “contextual integrity” Elaborating on her framework of contextual integrity, Nissenbaum underlines, as mentioned previously, that she does not want to substitute current intuitive principles of privacy. In contrast, she seeks to provide a concept, which functions better than current theories, in order to evaluate whether or not a certain flow of information infringes such intuitive principles of privacy. Pursuant to her framework, a certain use of information infringes “contextual integrity” only if it conflicts with “informational norms” that exist in specific contexts. These informational norms are specified by the 3. 204 See Nissenbaum, ibid., p. 118 quoting Lawrence Lessig (1999), Code and Other Laws of Cyberspace, New York: Basic Books, p. 116 as: “This form of argument is common in our constitutional history, and central to the best in our constitutional tradition. It is an argument that responds to changed circumstances by proposing a reading that neutralizes those changes and preserves an original meaning... It is reading the amendment differently to accommodate the changes in protection that have resulted from changes in technology. It is translation to preserve meaning’”; cf. the same approach in German law, Grimm, Data protection before its refinement, p. 585, who differentitates between the over-arching aim specified by the object of protection of fundamental rights and the concept of protection that must be adapted, from time to time, to the changes of the environment. 205 See Nissenbaum, ibid., p. 14. 206 See Nissenbaum, ibid., pp. 127 and 239. B. 
Conceptual definitions as a link for regulation 96 following factors: First, the corresponding context; second, the actors involved; third, attributes such as the type of information; and fourth, principles for the transmission of the information.207 Nissenbaum proposes the following explanations for these factors: the term “context” refers to “structured social settings with characteristics that have evolved over time (sometimes long periods of time) and are subject to a host of causes and contingencies of purpose, place, culture, historical accident, and more.”208 By way of example, she names contexts such as health care, education, employment, religion, family, and the commercial marketplace.209 The second factor, i.e. the type of information, can refer to the dichotomy between private and publically available information, but it is however, not restricted to these types. Instead, further types can equally be relevant. In this regard, Nissenbaum provides examples that friends might share intimate information amongst each other but not, for example, their salaries; in contrast, the same people might share the information about their salaries with their bankers or tax lawyers, but not the intimate information shared with their friends; similarly, the information exchange about religious affiliation might be appropriate amongst friends, but not between an employer and his or her employee; and finally, a physician might ask for medical information but not about the religious or financial matters of an individual.210 Correspondingly, the definition of the social role by the individual also depends on the context. For example, in a health-care context it is decisive in order to define the social norms, whether the doctor, receptionist, nurse, or bookkeeper receives certain types of information.211 This example also points to the fourth factor, i.e. the transmission principle. Nissenbaum stresses that her framework of contextual integrity is not restricted to a binary transmission principle, such as having access or not having access to information. Instead, she stresses the point that there are several possible conditions governing how in a certain context, certain types of information might be shared amongst certain actors. For instance, there may be a principle of reciprocity for sharing information, such as amongst friends; or rights of receiving certain information; or duties of providing for certain 207 See Nissenbaum, ibid., p. 181. 208 See Nissenbaum, ibid., p. 130. 209 See Nissenbaum, ibid., p. 130. 210 See Nissenbaum, ibid., pp. 143 and 144. 211 See Nissenbaum, ibid., pp. 141 and 142. III. Theories about the value of privacy and data protection 97 information; or a right for individuals to determine by themselves the conditions of a certain information flow; there may be a principle requiring that information is shared voluntarily or consensually or based on the knowledge of the individual concerned (“notice”) or on his or her permission (“consent”), or a combination of all or some of these conditions.212 In any event, Nissenbaum stresses that “contexts are not formally defined constructs, but (…) are intended as abstract representations of social structures experienced in daily life. (…). 
In other words, the activity of fleshing out the details of particular types of contexts, such as education or health care, is more an exercise of discovery than of definition.”213 Irrespective of whether this statement is correct or not, and supposing that the particularities of a specific context is fleshed out in detail, and its informational norms are determined, the next step in the framework of contextual integrity is to evaluate whether or not a certain flow of information challenges the corresponding norms and therefore violates its contextual integrity. Nissenbaum recognizes the fact that if all information flows that challenge an already existing norm were considered as violating its contextual integrity, the evolvement of new norms, i.e. change per se, would be problematic. In order to avoid a “lock-in effect” in entrenched norms that hinders new developments, Nissenbaum hence adds to her framework a normative component: the value of a specific context. In light of this component, new informational norms challenging existing ones “can be justified on moral grounds insofar as they support the attainment of general as well as context-based values”.214 Thus, coming from her approach that existing informational norms are presumed to be appropriate norms, she considers that new norms can also be justified, so long as they are more effective in supporting, promoting or achieving contextrelated values than existing informational norms.215 These contextual values, in other words, purposes, objectives or ends hence play an essential role for evaluating whether or not a new informational norm within a given context violates the contextual integrity. Nissenbaum stresses, referring to Schatzki’s “teleology”, the function of these contextual values as necessary for any understanding of why individuals behave in certain contexts in a certain way, in more abstract words, why certain context-related infor- 212 See Nissenbaum, ibid., pp. 145 to 147. 213 See Nissenbaum, ibid., p. 134. 214 See Nissenbaum, p. 181 and pp. 158 ff. 215 See Nissenbaum, p. 181 and pp. 158 ff. B. Conceptual definitions as a link for regulation 98 mational norms exist. She comes to the conclusion that even if “settling on a definitive and complete list of contextual values is neither simple nor non-contentious, the central point is that contextual roles, activities, practices, and norms make sense largely in relation to contextual teleology, including goals, purposes, and ends.”216 Clarifying the relationship between “context” and “purpose” Promoting this approach of contextual integrity, Nissenbaum also criticizes, as mentioned previously, the purpose-based approach. 
In her opinion, the principle of purpose limitation, which consists of the two requirements, first, to specify the purpose of the processing of personal data and, second, to limit the later use of the data to the purpose initially specified, has “only indexical meaning”.217 She stresses that so long as there are no substantive criteria in order to specify a purpose, privacy and/or data protection laws “constitute a mere shell, formally defining relationships among the principles (that refer to the purpose of the data processing) and laying out procedural steps to guide information flows.”218 Since such a concept of protection, which leaves the specification of the purpose to the controller’s will, constitutes a “glaring loophole”,219 Nissenbaum comes to the conclusion that another concept focusing on a principle of “respect for context” is “something materially different, something better.”220

217 See Nissenbaum, Respect for Context as a Benchmark, p. 291.
218 See Nissenbaum, ibid., p. 292.
219 See Nissenbaum, ibid., p. 291, referring to Fred Cate (2006), “The failure of Fair Practice Information Principles,” Consumer Protection in the Age of the Information Economy, July 8. Accessed July 1, 2013 from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1156972.
220 See Nissenbaum, ibid., p. 292.

In essence, Nissenbaum’s criticism of the principle of purpose limitation refers to the same challenges as mentioned in the introduction of this thesis. However, considering a context-based approach as materially different and (sic!) better than a purpose-based approach requires, at first, determining the “tertium comparationis” (i.e. the commonality allowing a comparison) of both approaches.221 In addition, such a conclusion presupposes that there is no framework helping to determine, similar to the approach of contextual integrity, substantive criteria for the specification of the purpose. Such an implicit presumption is particularly important for Nissenbaum’s conclusion, since she admits that the success of her approach also depends on how the “context” is interpreted.222 However, her observation that the principle of purpose limitation constitutes, without such a framework providing for substantive criteria, a mere shell remains valid. In 1989, the German legal scholar Badura criticized the legislative process of the German Federal Data Protection Law in a similar way, stating at the time that it remained unclear “what the term ‘purpose’ actually means (…)”.223 However, the term “context”, with respect to its function for a right to privacy, is clearer today, in particular in light of Nissenbaum’s approach. Thus, it should be possible to elaborate on a concept that equally clarifies the term “purpose”.

216 See Nissenbaum, p. 134 referring to Schatzki, T. (2001), Practice Minded Orders, in: The Practice Turn in Contemporary Theory, ed. T. R. Schatzki, K. K. Cetina, and E. von Savigny, London: Routledge, pp. 42 to 55.
Indeed, before turning to this task it is necessary to clarify the interrelationship between both terms “context” and “purpose” because legal scholars, as well as data protection authorities, often use these terms ‘simultaneously, at least, without explicitly clarifying the precise differences in their meaning.224 In its “Decision on Population Census”, the German Constitutional Court provided the first and, compared to its following decisions, most comprehensive approach in defining both terms and explaining their interrelated functions. In order to determine the extent of the basic right to informational self-determination, it held that “it is not only necessary to examine the type of the data provided but also to examine the possibilities of 221 Cf. Bygrave, p. 157, associating the criteria of „context“ with „purpose compatibility“ and also the individual’s „reasonable expectations“ (with respect to this latter relationship, see in particular below under C. II. 1. a) ECtHR and ECJ: Almost no criteria. 222 See Nissenbaum, ibid., p. 292. 223 See Albers, Treatment of Personal Information and Data, cip. 124 quoting Peter Badura, Anhörungsbeitrag in der öffentlichen Anhörung des Innenausschusses des Deutschen Bundestages vom 19. Juni 1989, in: Deutscher Bundestag (Hrsg.), Fortentwicklung der Datenverarbeitung und des Datenschutzes, Zur Sache 17/1990, S. 15 (16): “Es sei unklar, was denn Zweck überhaupt ist, wie eng oder wie weit der Zweck zu sehen ist, ob Zweck etwa gleich Aufgabe ist oder organisatorisch definiert werden kann usw.” 224 See, instead of many, the Article 29 Data Protection Working Grouop, Opinion 03/2013 on purpose limitation, pp. 23 and 24. B. Conceptual definitions as a link for regulation 100 its usage. These depend, on the one hand, on the purpose of the collection and, on the other hand, on the possibilities of the specific technique of processing the data and on the possibilities of its combination. Consequently, a datum that is, per se, irrelevant can become relevant; insofar, under the conditions of automated data processing, there is no ‘irrelevant’ data. Whether information is sensitive cannot only depend on the intimacy of the events. In order to determine the relevance of the datum for the personality right, it is rather necessary to know the context of its usage. Only when it is clear for which purpose the information is required and which possibilities of linking and usage exist, it is possible to answer the question of whether the infringement of the right to informational self-determination is constitutionally legal or not (underlining by the author).”225 In essence, the Court clarified that the relevance of data with respect to the personality right of the data subject does not only depend, similar to Nissenbaum’s approach, on the type of data or the intimacy of the event, but also on further factors. One decisive factor for determining the legal relevance of data is, from the Court’s perspective, the context of its usage. Interestingly, the Court determines the context by referring to the purpose of the collection of the data, as well as referring to the actual technical possibilities of how the data can be combined and used.226 Therefore, in order to answer the question of what the term purpose really means, it seems plausible to refer to contexts in the meaning that Nissenbaum describes. The specification of the 225 See BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83, cip. 176 and 177: “(...) 
Dabei kann nicht allein auf die Art der Angaben abgestellt werden. Entscheidend sind ihre Nutzbarkeit und Verwendungsmöglichkeit. Diese hängen einerseits von dem Zweck, dem die Erhebung dient, und andererseits von den der Informationstechnologie eigenen Verarbeitungsmöglichkeiten und Verknüpfungsmöglichkeiten ab. Dadurch kann ein für sich gesehen belangloses Datum einen neuen Stellenwert bekommen; insoweit gibt es unter den Bedingungen der automatischen Datenverarbeitung kein ‚belangloses’ Datum mehr. Wieweit Informationen sensibel sind, kann hiernach nicht allein davon abhängen, ob sie intime Vorgänge betreffen. Vielmehr bedarf es zur Feststellung der persönlichkeitsrechtlichen Bedeutung eines Datums der Kenntnis seines Verwendungszusammenhangs: Erst wenn Klarheit darüber besteht, zu welchem Zweck Angaben verlangt werden und welche Verknüpfungsmöglichkeiten und Verwendungsmöglichkeiten bestehen, lässt sich die Frage einer zulässigen Beschränkung des Rechts auf informationelle Selbstbestimmung beantworten. (...)” 226 See also Britz, Informational Self-Determination between Legal Doctrine and Constitutional Case Law, p. 575. III. Theories about the value of privacy and data protection 101 purpose serves, from this perspective, to pre-determine the (future) context of the intended use of data and, thus, the context-related informational norms. Indeed, Hofmann already stated in his work Purpose Limitation as Anchor Point for a Procedural Approach in Data Protection from 1991 that the specification of the purpose serves to create “well-designed, transparent and controllable structures” and its limitation to “maintain the original context of collection”.227 Pohle stresses the similarity, if not equality, of these functions with Nissenbaum’s approach of “contextual integrity”.228 In any event, the determination of a future context in advance through the specification of the purpose makes it possible to determine, for example, the transmission principles before the use of data takes actually place. Having the considerations on the regulation of risks in mind, referring to the purpose of the data processing enables the data controller to apply the transmission principles (or to prepare their application) in advance in order to avoid the (potential) later harm, that means, a later violation of contextual integrity. So far, the requirement to specify the purpose would not be a mere shell as Nissenbaum promotes. Instead, it is just another legal link for regulation. This approach focuses, by expanding legal protection before the violation of contextual integrity can take place, on the prevention of or precaution against risks for the individual’s autonomy. However, despite the German Constitutional Court’s elaborated approach, the difference between both terms “context” and “purpose” is not sufficiently clear when reviewing the different acts of data treatment, i.e. stages of the information flow: Firstly, there is no clear distinction between contexts of different acts of data treatment over time. The Court only refers to the context of later usage. In contrast, the collection of data is also embedded in a certain context. This differentiation is important in order to exactly determine, as Nissenbaum proposes, the context in which the data usage precisely occurs and whether this use challenges the corresponding informational norms or not. Furthermore, the difference is im- 227 See Hofmann, Purpose Limitation as Anchor Point for a Procedural Approach in Data Protection, p. 
25/26 regarding the first quote, and p. 126 regarding the second quote; cf. Bygrave, Data Privacy Law, p. 153, who highlights the importance of the principle of purpose limitation “ensuring adequate information quality and that the data-processing outcomes conform with the expectations of data controllers”. 228 See Pohle, Purpose limitation revisited, footnote 24, referring to Helen Nissenbaum, Privacy as contextual integrity, Washington Law Review 79, pp. 101 to 139. B. Conceptual definitions as a link for regulation 102 portant in order to obtain a clear distinction between the purpose specified the moment the data is collected and each later use of data. The reason is that one must be clear about the fact that each time the data is used, this use might pursue another purpose which would then determine another future context of the data treatment etc. etc. The second unclear aspect is that there is no specific explanation for the interplay between, the purpose of the collection and (…) the possibilities of the specific technique of processing the data and on the possibilities of its combination. The Court thus differs between the usage intended by the data controller and the usages that are factually possible. In doing so, the Court appears to imply that all factual possibilities of data processing could be pre-determined. Such an implication becomes reasonable in light of the data processing techniques that had existed at the time. In the 1980’s, data processing was based on very few large central-computing systems. These central systems determined the different phases and possibilities of the processing of data and its possible combination. The legal terms of collection, storage, processing, change, usage, and deletion of personal data actually followed the technical environment at the time. Instead, today, the treatment of personal data often takes place in highly decentralized and non-linear environments. The different stages of the treatment of data, such as the collection, changing, combination, and transfer of data – how it is often described in literature and within the German law – do not necessarily succeed in this linear direction. Instead, in today’s non-linear environment, these different types of data processing occur simultaneously or parallel and are intertwined, again and again, with the information constantly retrieved. Consequently, the information depends, more than before, on the corresponding context of usage.229 This leads to the result that the computing system as such cannot determine all factual possibilities of data processing. A concept protecting (in other words, preserving) principles of privacy and/or data protection and, thus, a definition of the terms “context” and “purpose” must mirror this consequence. In conclusion, in light of the fact that de-centralized and non-linear environments do not allow for the pre-determination of all factual possibilities of data processing, one has to, firstly, focus on examining the present 229 See Albers, ibid., cip. 121 and 122; highlighting the current change of the computational systems and environments compared to the times of the first “Decision on Population Census” in 1983, Hoffmann-Riem, Protection of the Confidentiality and Integrity of Information Technological Systems, pp., 1009 and 1010. III. Theories about the value of privacy and data protection 103 context in which the data is currently processed. Secondly, an appropriate legal link for determining the future context, is the present purpose. 
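Before turning to the definitions used in this thesis, this idea of the purpose as a legal link between the present context of collection and an intended future context of use can be sketched, again purely hypothetically, as follows. All class names, fields, and example values are assumptions made for this illustration; they merely echo the parameters of contextual integrity discussed above (actors, attributes or types of information, and transmission principles) and do not reproduce any legal test.

```python
from dataclasses import dataclass

# Hypothetical sketch only: the purpose bundles acts of data processing and
# pre-determines the intended future context and its informational norms.

@dataclass
class Context:
    name: str                           # e.g. "website visit"
    actors: list[str]                   # e.g. publisher, analytics provider, user
    attributes: list[str]               # types of information flowing in this context
    transmission_principles: list[str]  # e.g. "notice", "consent", "confidentiality"

@dataclass
class Purpose:
    description: str                    # the reason behind the controller's processing
    present_context: Context
    intended_future_context: Context

def compatible_with_purpose(purpose: Purpose, later_use: Context) -> bool:
    """Rough illustration, not the legal compatibility test itself: a later use
    stays within the purpose if it introduces no actors, attributes or
    transmission principles beyond those of the intended future context."""
    intended = purpose.intended_future_context
    return (set(later_use.actors) <= set(intended.actors)
            and set(later_use.attributes) <= set(intended.attributes)
            and set(later_use.transmission_principles)
                <= set(intended.transmission_principles))

collection = Context("website visit", ["publisher", "analytics provider", "user"],
                     ["IP address", "visit behavior"], ["notice"])
analysis = Context("usage analysis", ["publisher", "analytics provider"],
                   ["visit behavior"], ["confidentiality"])
purpose = Purpose("improve the user experience of the website", collection, analysis)
print(compatible_with_purpose(purpose, analysis))  # True: stays within the intended context
```

The example values anticipate the website-analytics scenario discussed below.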
Therefore, in this thesis, the term “purpose” means the intended reason behind the data controller’s treatment of the data referring to a future context; from this point of view, the realization of the purpose is a causal process with, at least an analytical final end that is determined by this purpose. The purpose serves to bundle the different acts of the data processing to a meaningful unity. From the perspective of the entity setting the purpose, the purpose thus decides on whether the means, which are used in order to reach the purpose, are appropriate or not.230 In contrast, the term “context” does not primarily refer, be it a present or future one, to a certain result of a human-caused process but, as quoted previously, to “structured social settings with characteristics that have evolved over time (sometimes long periods of time) and are subject to a host of causes and contingencies of purpose, place, culture, historical accident, and more.”231 So far, this definition of the term “purpose” does not exclude or substitute the “context” as defined within the framework of contextual integrity but rather incorporates it. Indeed, Nissenbaum also refers, in turn, to the term “purpose” when she elaborates on the definition of context. However, it is obvious that her context definition referring to the ‘causes and contingencies of purpose’ rather means the value, objective or end of a specific context than the subjective purpose formulated by an individual within that context. In any event, this thesis explicitly ties into the definition by the German Constitutional Court considering a purpose set by an individual not only referring to a future context of the data use, but also as another factor characterizing the present context. The reason is that a determination of the legal responsibility of the entity processing personal data, must also take its purpose into account. Without the knowledge about the purpose of the processing, it would be hard to determine the reason of the entrepreneurs behavior and, thus, at least, the entrepreneurs social role.232 Hence, the context of a data treatment includes the purpose of the data processing – and this purpose characterizes, together with further circum- 230 See Albers, ibid., cip. 123; Pohle, Purpose limitation revisited, pp. 142 and 143; see, from a sociological perspective, Luhmann, The Concept of Purpose and the Rationality of Systems, in particular, pp. 1 ff., 9 ff. and 114 ff. 231 See Nissenbaum, ibid., p. 130. 232 Cf. Nissenbaum herself with respect to the necessity of knowing the purpose of a context in order to understand it, ibid., p. 134. B. Conceptual definitions as a link for regulation 104 stances, the corresponding context. A purpose thus links the existent context of the current act of data treatment to a future, intended one.233 By means of an example: The startups mentioned in the introduction each publish their own websites, in order to improve the process and experience of users of their websites, and use the service of a provider of analytical tools, who in turn analyze the behavior of the users visiting the website. This analysis is based on the collection and processing of user data, such as the time and date of his or her visits, the visit behavior (for example, from which page does the user come from, on which page does he or she start, how much time does the user stay and when does he or she leave) as well as, possibly, the user’s IP address, the location and type of his or her device and the browser (“attributes”). 
The moment a user’s data is collected, the context is determined by: the publisher of the website using the service of the service provider, the service provider itself (both with respect to their corresponding purposes) and the social role of the data subject the moment when he or she uses the website (“actors”); the general expectations of whether the data might be collected or under which conditions and for which purposes it might be used (“norms”). Thus, the purposes of the website publisher and the service provider determine, amongst others, the context of the data collection. The future contexts can be, given that the website publisher and the service provider constantly develop their products further, mainly prescribed by these purposes. The way the website is developed and the analytical software used per se, only allows in a limited way to pre-determine, pursuant to the technical environment, the future context of the concrete data processing. Values as a normative scale in order to determine the “contexts” and “purposes” However, this example evidences that there is, over time, not only an unlimited number of contexts in which the data processing may occur but also, an unlimited number of purposes which pre-determine these contexts. Accordingly, the service provider collects the data, deletes certain other data and combines it with further data, firstly, for the purpose of analyzing it. The analysis as such takes place for the purpose of transferring 5. 233 Cf. Albers, ibid., cip. 121 and 122. III. Theories about the value of privacy and data protection 105 the analytical results to the website publisher and, possibly, in order to improve the functioning of its analytical software. While all purposes take place in order to maintain the corresponding businesses, the service provider may know or not know the true purposes of the website publisher using the analytical results. The publisher of the website might use them, as described above, for the purpose of improving the user experience of its website but also in order to present it to (potential) cooperation partners and financiers. Even the storage of the data for an unknown purpose is, as such, a purpose. Hence, there are many acts of a data treatment occurring iteratively or simultaneously for many different purposes and, consequently, in corresponding contexts. For example, the purpose of a preceding act can lead to a following one, i.e. a subsequent purpose, or be completely different. Depending on the respective purposes, data may not only be intended to be transferred from one context into another one, but also the context in which the processing occurs may remain the same or turn into another one. The reason for this change is that the determination of the context depends on the perspective of the observer (whoever exercises this judgment task), just like the specification of the purpose depends on the actors’ point of view. The question therefore is how to distinguish the different purposes and contexts, as well as the different acts of data treatment from a legal point of view: Which acts of the data treatment, which corresponding purposes, which contexts are legally relevant? Nissenbaum herself provides a solution to this question: The values serve as the main criteria for determining a context as a common unity of analysis. Values explain the reason of behavior in a context and, thus, which elements observed are relevant within this context and which are not. 
Values hence not only help answer the question of which new informational norms that challenge entrenched ones are justified, but already, in a preceding step, the question of how to determine the specific context, i.e. which elements observed belong to a specific context and which do not. As a consequence, values may fulfill the same function in determining the relevance of the purpose of data processing. From this perspective, the main task of this thesis is to elaborate on such values as a normative concept that can assist in determining context-relative informational norms and, within this framework, the function of the principle of purpose limitation.234 This may imply answers to the question of how precisely purposes of data processing must, or how broadly they may, be specified.

234 Cf. De Hert and Gutwirth, Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action, p. 4, summarizing how data protection regulation "formulates the conditions under which processing is legitimate.”

C. The function of the principle of purpose limitation in light of Article 8 ECFR and further fundamental rights

As a main part of this thesis, this chapter illustrates the legal framework surrounding the collection and processing of personal data with respect to the principle of purpose limitation. Seeking to prove the hypothesis made in the preceding chapter that values define the contexts in which data is being processed and, consequently, the purposes for which the data is processed, this chapter elaborates on a normative concept for the definition of purposes and contexts. This concept intends to clarify which informational norms govern certain contexts and, consequently, what legal function the principle of purpose limitation has in our digital society. In order to elaborate on such a normative concept, the first sub-chapter examines the constitutional framework that is applicable, in general, to the processing of personal data in the private sector within the European Union. On this basis, the second sub-chapter draws attention to the first component of the principle of purpose limitation, i.e. the requirement to specify the purpose, in light of the specific fundamental rights concerned. The third sub-chapter focuses on the second component, i.e. the requirement to limit the later processing to the purpose initially specified. Finally, the fourth sub-chapter treats the question of which regulation instruments may be considered for establishing, by means of ordinary law, the principle of purpose limitation in the private sector.

I. Constitutional framework

Any ordinary law and, consequently, any regulation instrument, as well as its interpretation, must correspond to our current notion of fundamental rights. Thus, the constitutional framework, such as the European Charter of Fundamental Rights, not only serves as a scale of control for the interpretation of ordinary law by the judiciary and the executive, such as the (independent) data protection authorities, but it also determines the scope of decision-making for the legislator.235 Even if all fundamental rights regimes treated in this thesis cover, in principle, privacy and/or data protection, there are essential differences with respect to the respective objects and concepts of protection.
These differences are highly relevant in determining the function of the principle of purpose limitation with respect to the European Charter of Fundamental Rights. This sub-chapter hence starts by clarifying the scope of application of the different fundamental rights regimes and their legal effects in the private sector. The analysis then examines the object and concept of protection of the German right to informational self-determination. In light of the extensive case law developed on this right over the last 30 years, this examination serves as a starting point for analyzing the different objects and concepts of protection of the fundamental rights regimes provided for at the European level. From this perspective, it may hence serve as a source of inspiration.236 In this regard, it must be stressed that the subsequent analysis is not a complete evaluation of all existing case law regarding data protection and/or privacy in the European Union. Instead, the analysis concentrates on those Court decisions that appear to be most suitable in providing guidance in order to answer the main research question of this thesis.

1. Interplay and effects of fundamental rights regimes

Consequently, the following three constitutional frameworks surrounding privacy and/or data protection are relevant in the European Union, as well as in Germany (as one of its Member States): the European Convention on Human Rights, the European Charter of Fundamental Rights of the European Union and, as an example of the national level, the German Basic Rights.237 In contrast, international treaties such as the OECD Guidelines play a role in this thesis only insofar as the Courts which interpret the fundamental rights explicitly refer to them.238

235 Cf. Britz, Informational Self-Determination between Legal Doctrine and Constitutional Case Law, pp. 562 and 563; Burgkardt, Data Protection between the German Basic Law and Union Law, p. 29.
236 Cf. Rouvroy and Poullet, The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy, p. 49; Bäcker, Constitutional Protection of Information regarding Private Parties, pp. 115 and 116.
237 See Burgkardt, ibid., pp. 53 and 81.

a) The interplay between the European Convention on Human Rights, the European Charter of Fundamental Rights and the German Basic Rights

In this triangle, the European Convention on Human Rights affects both the legal framework of the European Union and those of its Member States, which are also members of the Council of Europe and, as such, addressees of the European Convention. The European Convention has the status of constitutional or, at least, ordinary law in most members of the Council of Europe.239 In contrast, the European Union has not yet acceded to the European Convention. Therefore, the European Convention does not directly bind the European Union.240 However, Article 6 sect. 3 of the Treaty on European Union and Article 52 sect. 3 ECFR require the European Court of Justice to interpret the European Charter of Fundamental Rights in light of the European Convention.241 Historically, this requirement results from the fact that the European Convention on Human Rights served as a source for the establishment of the Charter of Fundamental Rights.242 The European Charter of Fundamental Rights primarily binds the institutions, bodies, offices and agencies of the European Union.
It also binds the Member States, but only when the respective Member State is implementing Union law, Article 51 sect. 1 sent. 1 ECFR.243 This principle of "primacy of application" seeks to avoid the divergent application of Union law amongst the EU Member States. If each Member State could interpret Union law in the light of its national constitution, Union law would run the risk of being applied differently in each Member State.244 Given that there is no legal definition of when a Member State is implementing Union law, the European Court of Justice has developed a solution through several types of cases in which Union law was deemed to apply.

238 See, however, on the general impact of the OECD guidelines, Kirby, The history, achievement and future of the 1980 OECD guidelines on privacy.
239 Cf. Schweizer, European Convention and Data Protection, pp. 462 and 463.
240 See Burgkardt, ibid., p. 246.
241 See Streinz/Michl in: Streinz, EUV/AEUV, EUV Art. 6 cip. 25, 21 ff.
242 See Niedobitek, Development and General Principles, cip. 95.
243 See Streinz/Michl, ibid., GR-Charta Art. 51 cip. 3.
244 See Streinz/Michl, ibid., EUV Art. 4 cip. 35 (and the following).

Firstly, European fundamental rights undoubtedly govern European regulations that are directly applicable in all Member States.245 An important example in this context is the General Data Protection Regulation, which will apply from 25 May 2018 pursuant to its Article 99. Less certain is the scale of control in relation to the application of European directives within the Member States, such as the Data Protection and ePrivacy Directives. Directives are not directly applicable within the Member States. Instead, they must be transposed into national law by the national legislator. This leads critics to various opinions, as summarized by Burgkardt: Some come to the conclusion that the transposition into national law falls under the scope of national constitutional law. In contrast, the prevailing opinion argues that many directives are so precise in their wording that the directive can almost be translated on a literal basis into national law. If the national legislator has no room to interpret a directive, national fundamental rights consequently do not apply. These critics therefore differentiate between the parts of the directive that must be transposed identically and the other parts that have to be interpreted. While European fundamental rights govern the former, national basic rights principally provide a scale of control for the latter.246 Indeed, the European Court of Justice stresses that this room for interpretation does not apply to notions that are autonomously interpreted in light of European law.247 Thus, if the ePrivacy Directive authorizes, for example, the processing of personal data for "marketing electronic communications services or for the provision of value added services", these terms appear to leave no room for interpretation by the Member States.248

245 See Burgkardt, ibid., p. 33.
246 See Burgkardt, ibid., p. 34, with further references, who stresses that the European Court of Justice holds European fundamental rights binding for national legislators even where there is a certain scope of transposition, because the transposition must never contradict the directive which, for its part, embodies the purposes of European fundamental rights.
247 See Britz, The Fundamental Right to Data Protection in Article 8 ECFR, pp. 8 and 9.
248 See Article 6 sect. 3 sent. 1 of the Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications).

This leads to the situation whereby the scope of the directive defines whether the European Charter of Fundamental Rights or national constitutional law, such as the German Basic Law, applies. The application of the European Charter of Fundamental Rights to Member States therefore depends on two factors. The first factor pertains to the scope of the directive. The second relates to the room for interpretation that the European legislator left to the national legislator for transposing the secondary law.249 In conclusion, both the European Union and its Member States have to respect the European Convention. The European Charter of Fundamental Rights binds, in any case, the European Union. Whether the European Charter of Fundamental Rights also binds the Member States depends on whether or not they are implementing Union law. This will undoubtedly be the case if Member States execute European regulations such as the General Data Protection Regulation. In contrast, if Member States transpose European directives into national law, it will depend on the scope of the directive and the room for interpretation it leaves.

249 Cf. Grimm, Data protection before its refinement, pp. 589 to 592, who stresses the extremely wide scope of application of the right to data protection under Article 8 ECFR because this right covers, cutting across the normal fundamental rights, all areas of social life on the sole condition that the processing of personal data is at stake; Burgkardt, ibid., pp. 53 and 59.

b) The effects of fundamental rights on the private sector

The different fundamental rights regimes undoubtedly address the public bodies, i.e. the legislator, the executive, and the judiciary. Indeed, the subject-matter of this thesis is not to examine the effects of the principle of purpose limitation on the collection and processing of personal data by the State but by private companies operating in the private sector. The way in which fundamental rights affect private parties depends on the concept of protection provided for by the respective constitutional regimes.250

250 Cf. Britz, ibid., pp. 562 and 563.

aa) Third-party effect, protection and defensive function

The basic differentiation is whether fundamental rights have an indirect or a direct effect on third parties. In the latter case, fundamental rights bind not only the State but also private entities. This leads to the situation where not only the State but also private parties have to justify any harm caused to an individual's fundamental right. In the former case, in contrast, only the public bodies are bound by fundamental rights.
In this case, only the State is bound to justify all infringements, whereas private parties are principally free, for example, to process personal data even if this harms another's fundamental right to privacy and/or data protection.251 Another terminological issue shall be stressed in this regard: this thesis calls a State intrusion into the scope of protection of a fundamental right an "infringement"; in contrast, if a private party intrudes into the scope of protection, this intrusion is called a "harm" to the fundamental right.252 In any case, if a private party harms another party's fundamental right(s), the public bodies must balance, through the establishment and execution of regulation instruments, the colliding fundamental rights of these private entities interacting in the private sector.253 This duty to balance can also be described by two different functions of fundamental rights. Firstly, there is a defensive function that enables the private party to defend him or herself against actions of the State. Secondly, there is a protection function that obliges the State to protect an individual's fundamental right against threats caused by sources other than the State if the individual is not able to protect him or herself against this threat.254 This can be the case with respect to natural disasters, for example, because a person alone is not able to protect his or her house against a flood. However, in situations where a threat does not result from natural sources but from third parties' behavior, the protection and defensive functions potentially come into conflict with each other: in these situations, the same State action intended, on the one side, to protect the basic rights of individuals against harmful behavior of third parties may infringe, on the other side, the defensive function of the third parties' basic rights. The State hence has to weigh these colliding fundamental rights in order to make both rights as effective as possible in practice.255 Amongst the Member States of the European Union, an indirect effect of fundamental rights on the private sector is widely recognized only with regard to the law of torts. However, critics believe that there is a general tendency amongst countries to transfer the concept to further areas of law. Germany, Switzerland, the United Kingdom, Italy, France, Spain (and the USA as well) appear, more or less, to acknowledge in principle an indirect effect of their fundamental rights.256 In contrast, the concept of the protection function of fundamental rights is generally less acknowledged.

251 See Papier, Third-Party Effect of German Basic Rights, cip. 23/24; cf. Bethge, Collision of Fundamental Rights, cip. 9 to 11, who apparently refers in his criticism to the direct third-party effect; with particular respect to the processing of personal data, see Gusy, Informational Self-Determination and Data Protection: Continuing or New Beginning?, p. 60.
252 Cf. Eckhoff, The Infringement of Fundamental Rights, pp. 288 to 290; Grimm, Data protection before its refinement, p. 587.
253 See Papier, ibid., cip. 23/24; cf. Bethge, ibid., cip. 9 to 11.
254 See, with regard to German Basic Rights, Dietlein, The Doctrine of Duties of Protection of Basic Rights, pp. 103 and 104.
Leading scholars of constitutional law consider that only Germany, Austria, France, and Ireland recognize the protection function as a basic principle within their constitutional regimes.257 Given the diversity of the doctrinal concepts amongst these countries, it is worth illustrating to what extent the fundamental rights regimes considered in this thesis generally provide for an indirect effect or even the protection function, and, in particular, to what extent their respective fundamental rights to privacy and/or data protection do so.

255 Cf. Calliess, with regard to German Basic Rights, Duties of Protection, cip. 3 and 5 as well as 18 and 22; Jaeckel, Duties of Protection in German and European Law, pp. 63 to 79, who also stresses the frequent difficulties when trying to clearly differentiate between both functions.
256 See Papier, ibid., cip. 47 and 48.
257 See Calliess, ibid., cip. 15.

(1) European Convention on Human Rights

While the European Convention on Human Rights does not directly bind third parties, the European Court of Human Rights recognizes the protection function by establishing what are called "positive obligations" on the members of the Council of Europe. The term "positive obligations" means that the members have to establish protective measures against harm to fundamental rights caused by third parties in the private sector.258

258 See Schweizer in: Handbook of Basic Rights – Europe I, § 138 cip. 64 (and the following); however, see also Lynskey, The Foundations of EU Data Protection Law, pp. 115-118 (referring to further sources), who also applies the concept of "mittelbare Drittwirkung" to the ECHR.

(a) Positive obligations with respect to Article 8 ECHR

Indeed, the extent of such a protection function differs according to the fundamental right in question. The protection function of Article 2 ECHR only protects against intentional harm or intentional killing. In contrast, the protection function of the right to respect for private and family life under Article 8 ECHR protects not only against intentional but also against non-intentional harms.259 In the case of "López Ostra vs. Spain", the Court considered that "naturally, severe environmental pollution may affect individuals' well-being and prevent them from enjoying their homes in such a way as to affect their private and family life adversely, without, however, seriously endangering their health."260 Indeed, the Court appears not to differentiate conceptually between the protection and the defensive function, in light of the following reasoning: "whether the question is analysed in terms of a positive duty on the State – to take reasonable and appropriate measures to secure the applicant's rights under paragraph 1 of Article 8 (...) –, as the applicant wishes in her case, or in terms of an 'interference by a public authority' to be justified in accordance with paragraph 2 (...), the applicable principles are broadly similar. In both contexts regard must be had to the fair balance that has to be struck between the competing interests of the individual and of the community as a whole, and in any case the State enjoys a certain margin of appreciation."261 Critics stress that even if the positive function of Article 8 ECHR is therefore recognized, its concept of protection with respect to its effects in the private sector is not comprehensively clear.262

259 See Calliess, ibid., cip. 16.
260 See ECtHR "López Ostra vs. Spain" (Application no. 16798/90), cip. 51.
261 See ECtHR, ibid., cip. 51.
262 See Calliess, ibid., cip. 16; ECtHR "Guerra et al. vs. Italy" (Application no. 14967/89), cip. 58 and 60; Jaeckel, ibid., pp. 179 to 181.

(b) Right to respect for private life under Article 8 ECHR

Legal scholars stress the importance of the positive duties of protection in Article 8 ECHR in light of the wording 'right to respect for private life' (underlining by the author).263 Thus, regarding the different guarantees mentioned before, they consider two substantial elements which undoubtedly fall under Article 8 ECHR: the right to private life serves, firstly, a defensive function (also called negative duty of protection) and, secondly, a protection function (also called positive duty of protection).264 With regard to the private sector, for example, in the case of "Craxi vs. Italy", the press published information that originally stemmed from private court files. The European Court of Human Rights held, in general, that the public bodies concerned were obliged, pursuant to Article 8 ECHR, to provide measures necessary for the protection of private life.265 With a particular view to the processing of personal data, the protection function of the right to respect for private life may also provide, for instance, for a right of access to personal data, the deletion of personal data and the correction of inaccurate data; even the need for a supervisory authority can result from this right.266

263 See Schweizer, DuD 2009, Decisions of the European Court of Human Rights on the Fundamental Rights to Personality and Data Protection (Die Rechtsprechung des Europäischen Gerichtshofs für Menschenrechte zum Persönlichkeits- und Datenschutz), p. 464.
264 See Burgkardt, ibid., p. 247.
265 See ECtHR, Case of Craxi vs. Italy from 17 July 2003 (application no. 25337/94), cip. 73.
266 See De Hert and Gutwirth, Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action, pp. 7 and 19.

With respect to the balancing of colliding fundamental rights, in the case of "K. U. vs. Finland" the European Court of Human Rights had, in particular, to balance the right to private life in Article 8 ECHR between two private parties. In this case, information about a 12-year-old boy, such as his age, physical characteristics, telephone number and address, together with his alleged desire for an intimate relationship with a boy of his own age or older, was published on a dating website without the boy's knowledge. The boy subsequently became the victim of an apparent pedophile. Despite the gravity of the harm caused, the service provider for the website did not provide the dynamic IP address of the person who had published the information.267 The European Court of Human Rights finally weighed the right to confidentiality of the so far unknown person who had published the data against the right to physical integrity of the boy concerned.268 Legal scholars stress that the Court thereby, at least indirectly, balanced the defensive and the protection function of the right to private life under Article 8 ECHR, on the one side in favor of the person who published the information and, on the other side, in favor of the boy.269 Thus, even if the concept of protection regarding the negative and positive duties of a State is not comprehensively clear, the Court structurally applies the general principle of weighing the colliding fundamental rights.
(2) European Charter of Fundamental Rights

Amongst legal scholars, it is heavily debated whether European constitutional law directly applies to the private sector or not. While some critics deny a third-party effect in general, pointing to the lack of applicability of Union law to private parties, others confirm it, at least, with regard to the market freedoms.270

267 See ECtHR, Case of K.U. vs. Finland from 2 December 2008 (application no. 2872/02), cip. 6 to 14.
268 See ECtHR, Case of K.U. vs. Finland from 2 December 2008 (application no. 2872/02), cip. 48.
269 See Burgkardt, ibid., pp. 280 to 282.
270 See Niedobitek, ibid., cip. 103 with further references.

(a) Market freedoms and fundamental rights

Interestingly, the European Court of Justice affirmed in several decisions a direct third-party effect of two market freedoms: the freedom to provide services and the freedom of movement for workers under Articles 56 and 45 of the Treaty on the Functioning of the European Union. In the cases of "Walrave and Koch vs. Association Union Cycliste Internationale" and "Gaetano Donà vs. Mario Mantero", the Court affirmed the third-party effect for collective agreements in the sectors of services and employment. In addition, in the case of "Angonese vs. Cassa di Risparmio", the Court finally confirmed the third-party effect even for agreements concluded on an individual basis.271 In contrast, with regard to the principle of free movement of goods, the European Court of Justice denied a direct third-party effect in the private sector. In the case of "Dansk Supermarked vs. Imerco", the Court stated that the breach of an individual agreement prohibiting the commercial exploitation of a good in a certain Member State must not be considered as an infringement of unfair competition law. The decision clearly addressed the referring court, which had to interpret the national unfair competition clause, with the result that the principle of free movement of goods had only an indirect effect on the private sector. In the case of "Bayer vs. Süllhöfer", the European Court of Justice explicitly denied a direct third-party effect of the principle of free movement of goods. In the case of "Commission vs. France", the Court finally stated that the Member State was obliged to guarantee the free movement of goods on the single market and that it had to, where private parties hinder such free movement, weigh this freedom against the colliding fundamental rights.272 In conclusion, the European Court of Justice affirmed the third-party effect only in relation to the freedom to provide services and the free movement of workers. In relation to the principle of free movement of goods, the Court denied a direct third-party effect and instead appeared to favor the protection function. This means that it is not the private parties but the Member States who are bound and must balance the fundamental freedoms with the fundamental rights of the private parties concerned. The decisions described above concerned, primarily, the fundamental freedoms and not the fundamental rights. Critics conclude that the European Court of Justice will, at least, apply the protection function to the fundamental rights as well.273 Calliess stresses, in particular, the wording and importance of Article 1 ECFR, which states that "Human Dignity is inviolable (and/..) must be respected and protected" (underlining by the author).
From his point of view, this duty of protection implies, in light of the fact that human dignity is inherent in all fundamental rights,274 that the protection function applies, in general, to the fundamental rights of the European Charter.275 The European Court of Justice did not clearly comment on the effects of the fundamental rights to private life under Article 7 ECFR and to data protection under Article 8 ECFR between private parties, for example, in the cases "Lindqvist" and "PROMUSICAE". Since these and further decisions all referred, so far, to the European directives applicable to both the public and the private sector, it is not exactly clear which kind of effects the European Court of Justice considers for the fundamental rights to private life and data protection.276 In any case, in order to illustrate in more detail how the European Court of Justice weighs the opposing fundamental rights of the private parties involved, the following few decisions of the European Court of Justice shall be discussed.

271 See Papier, ibid., cip. 50 to 54 with references to ECJ C36/74, ECJ 13/76, ECJ C-415/93, and ECJ C-281/98.
272 See Papier, ibid., cip. 55 to 59 with references to ECJ 58/80, ECJ 65/86, and ECJ C-295/95.
273 See Jaeckel, ibid., pp. 279 to 281.
274 Cf. Papier, ibid., cip. 23.
275 See Calliess, ibid., cip. 17.
276 See Britz, Europeanisation of Data Protection Provided for by Fundamental Rights?, p. 8; v. Danwitz, The Fundamental Rights to Private Life and to Data Protection, p. 585; ECJ C-101/01 (Lindqvist); ECJ C-275/06 (PROMUSICAE); see Kokott and Sobotta, The distinction between privacy and data protection in the jurisprudence of the CJEU and the ECtHR, p. 225, stressing an only indirect effect on the private sector; in contrast, De Hert and Gutwirth, Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action, pp. 9 and 10, seem to assume a direct effect on the private sector, stating that the "Charter extends the protection of personal data to private relations and to the private sector."

(b) The right to data protection under Article 8 ECFR and/or the right to private life under Article 7 ECFR

In these decisions, it becomes clear that the European Court of Justice does not (yet) clearly differentiate between the right to private life and the right to data protection under Articles 7 and 8 ECFR. In the cases "Telekom vs. Germany", "SABAM vs. Scarlet" and "SABAM vs. Netlog", for example, the Court referred to the right to data protection under Article 8 ECFR only. In the first-mentioned case, "Telekom vs. Germany", a German telecommunications network provider, Deutsche Telekom AG, published, based on the individuals' consent, the names and telephone numbers of its own customers as well as those of third parties in the public directory. The claimants, Go Yellow GmbH and Telix AG, operated an Internet inquiry service and a telephone directory enquiry service, offering the said data in return for payment. The companies demanded from Deutsche Telekom, on the grounds of Article 25 section 2 of the Universal Service Directive 2002/22/EC, that it provide not only the data of the customers of Deutsche Telekom AG but also that of the third parties.
Pursuant to Article 25 section 2 of the Universal Service Directive 2002/22/EC, "Member States shall ensure that all undertakings which assign telephone numbers to subscribers meet all reasonable requests to make available, for the purposes of the provision of publicly available directory enquiry services and directories, the relevant information in an agreed format on terms which are fair, objective, cost oriented and non-discriminatory." The referring German court asked the European Court of Justice to consider whether Article 12 of the Directive on privacy and electronic communications 2002/58/EC hindered the transfer of the data concerned, in light of the fact that the defendant lacked the explicit consent or objection of the said third parties or their customers.277 Article 12 sect. 2 of the Directive on privacy and electronic communications 2002/58/EC only obliges the Member States, amongst others, to "ensure that subscribers are given the opportunity to determine whether their personal data are included in a public directory." In order to answer this question, the Court stated, referring only to Article 8 ECFR, as follows: "Article 8(2) of the Charter authorizes the processing of personal data if certain conditions are satisfied. It provides that personal data 'must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law'. (…) Moreover, the Directive on privacy and electronic communications makes it clear that that directive makes the publication, in printed or electronic directories, of personal data concerning subscribers conditional on the consent of those subscribers."278

277 See ECJ C-543/09, cip. 19, 20, and 27.
278 See ECJ C-543/09, cip. 52 and 54.

The decision appears, at first glance, to presume a direct effect of Article 8 section 2 ECFR between the parties involved. Since it was not public bodies but private companies that collected and transferred the data in question, the Court seems to presume that Article 8 ECFR addresses these private parties. However, on a second view, such a third-party effect becomes questionable if one focuses on which entity actually caused the transfer of the data. Article 25 section 2 of the Universal Service Directive establishes an obligation for private undertakings to make the personal data available to third parties. Since the law obliged these private companies to transfer the data, they had no choice as to whether or not to transfer it. It is hence the legislature establishing the obligation, and not the private company, that infringes the right under Article 8 ECFR. The right to data protection therefore had, so far, no direct effect on the private parties.

In the next case, "SABAM vs. Scarlet", Scarlet was an Internet service provider offering its customers access to the Internet. SABAM was an association of authors, composers and publishers representing the interests of its members in the field of copyright. SABAM had noticed that Internet users used the service of Scarlet to download copyright-protected works of members of SABAM without any authorization or payment of royalties. SABAM applied for an injunction against Scarlet to block any illegal file sharing.
The referring Belgian court asked the European Court of Justice to consider whether such a filtering system harmed the fundamental right to the protection of personal data in Article 8 ECFR, since such a filtering system implied the processing of certain IP addresses.279 Similarly, in the case of "SABAM vs. Netlog", Netlog was a social online community where users were able to set up a personal profile and communicate with each other, sharing all sorts of information. SABAM was of the opinion that users on Netlog shared copyright-protected works of its members and applied for an injunction against Netlog in order to make it cease illegally making available the said musical and audiovisual content of SABAM's repertoire, by installing a filter system. The Belgian court also referred this case to the European Court of Justice, asking whether, amongst other matters, the Data Protection Directive and the Directive on privacy and electronic communications "permit Member States to authorize a national court (…) to order a hosting service provider to introduce, for all its customers, in abstracto and as a preventive measure (…) a system for filtering most of the information which is stored on its servers in order to identify" works of the said repertoire.280

279 See ECJ C-70/10, cip. 15 to 26.
280 See ECJ C-360/10, cip. 15 to 25.

The European Court of Justice balanced the right to data protection of the individuals using the Internet service and the social network, respectively, as well as the rights of the providers, with the opposing fundamental rights of the claimant, i.e. the association of authors, composers, and publishers. The Court stated, at first, that "such an injunction would result in a serious infringement of the freedom of the ISP concerned to conduct its business since it would require that ISP to install a complicated, costly, permanent computer system at its own expense (…). In those circumstances, it must be held that the injunction to install the contested filtering system is to be regarded as not respecting the requirement that a fair balance be struck between, on the one hand, the protection of the intellectual-property right enjoyed by copyright holders, and, on the other hand, that of the freedom to conduct business enjoyed by operators such as ISPs. Moreover, the effects of that injunction would not be limited to the ISP concerned, as the contested filtering system may also infringe the fundamental rights of that ISP's customers, namely their right to protection of their personal data and their freedom to receive or impart information, which are rights safeguarded by Articles 8 and 11 of the Charter respectively." In the case of "SABAM vs. Netlog", the Court considered in more detail how such a filtering system would harm the fundamental right to data protection of the users of the social network in question: "Indeed, the injunction requiring installation of the contested filtering system would involve the identification, systematic analysis and processing of information connected with the profiles created on the social network by its users. The information connected with those profiles is protected personal data because, in principle, it allows those users to be identified". The Court concluded, referring to the preceding case of "SABAM vs.
Scarlet", that the injunction would not be in line with the requirement of a fair balance between, on the one side, the copyright of the SABAM members and, on the other, the right to protection of the personal data of the users of the social network. While the European Court of Justice referred only to the right to data protection under Article 8 ECFR in the preceding cases, it additionally referred, in the cases of "ASNEF vs. FECEMD" and "González vs. Google Spain", to the right to private life provided for by Article 7 ECFR. The first case, "ASNEF vs. FECEMD", is interesting because the Court did not weigh the opposing rights itself. Instead, the Court decided on the question of whether or not the Spanish legislator had correctly balanced the opposing rights, in light of Articles 7 and 8 ECFR, in accordance with Article 7 lit. f) of the Data Protection Directive. Article 7 lit. f) of the directive states that the Member States shall provide, when transposing the directive into national law, that personal data may be processed only if the "processing is necessary for the purposes of the legitimate interests pursued by the controller or by the third party or parties to whom the data are disclosed, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection under Article 1 (1)" of the directive. The Spanish legislator transposed this provision into Spanish law by excluding, in general, the processing of personal data that had not been made publicly available before.281

281 See ECJ C-468/10 and C-469/10, cip. 22.

The European Court of Justice stated, at first, that the "Member States must, when transposing Directive 95/46, take care to rely on an interpretation of that directive which allows a fair balance to be struck between the various fundamental rights and freedoms protected by the EU legal order". The Court agreed with the national legislator that the fact that the data was already publicly available might influence the intensity of the harm to the fundamental rights of the individual concerned. The intensity of harm for the individual is much higher if the data was not publicly available before its processing. This higher intensity of harm must be taken into account when balancing the individual's rights with the opposing rights of the third parties. However, the Court stated that the Spanish legislator contravened Article 7 lit. f) of the Data Protection Directive by "excluding, in a categorical and generalized manner, the possibility of processing certain categories of personal data, without allowing the opposing rights and interests at issue to be balanced against each other in a particular case." The Court added that this might only be different, in accordance with Article 8 of the Data Protection Directive, with respect to special categories of data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade-union membership, and to the processing of data concerning health or sex life. While the European Court of Justice decided this case in favor of the data controllers,282 it followed, in the case of "González vs. Google Spain", a more restrictive approach in favor of the individual concerned by the data processing. In this case, the claimant had been involved, in 1998, in a real-estate auction as a measure for recovering social security debts.
A Spanish newspaper had published articles about the auction which Internet users could find, until 2012, under the claimant's name via Google's search engine. The claimant requested not only that the newspaper delete his name from the articles or, at least, use technical tools so that Google's search engine could not find the articles, but also that Google itself delete the links to the articles. The case ended up before the European Court of Justice, which finally denied the first but affirmed the second request: Google had to delete the links.283 The European Court of Justice weighed the fundamental rights to privacy and data protection of Mr. González against the fundamental rights of the search engine operator linking to the articles and of the Internet users who could find these articles by searching for his name. In doing so, the Court clearly differentiated not only between the interests of the publishers of the articles and those of the operator of the Internet search engine, but also between the effects of the publication of the articles as such and the fact that they can be found by means of the search engine.284 In the Court's opinion, the increased possibilities of finding and interconnecting the articles within the Internet can have an even worse effect on the claimant than the first publication of the articles in the newspaper itself. The Court concluded from this that Articles 7 and 8 ECFR "override, as a rule, not only the economic interest of the operator of the search engine but also the interest of the general public in having access to that information (...)."285 From the Court's point of view, this might only be different "if it appeared, for particular reasons, such as the role played by the data subject in public life, that the interference with his fundamental rights is justified by the preponderant interest of the general public in having, on account of its inclusion in the list of results, access to the information in question."286

282 See the similar case of ECJ C-582/14, cip. 50 to 64.
283 See ECJ C-131/12, cip. 14 to 20.
284 See ECJ C-131/12, cip. 87.
285 See ECJ C-131/12, cip. 97.
286 See ECJ C-131/12, cip. 97.

(3) German Basic Rights

On the German level, finally, constitutional law primarily binds, pursuant to Article 1 sect. 3 GG, the State and not private parties. However, some critics believe that German Basic Rights address not only the State but also private individuals. They argue that, nowadays, it is not only the State but also private entities that are able to infringe fundamental rights.287 Simitis, who also chaired the Expert Group set up by the European Commission in order to prepare the European Charter of Fundamental Rights, particularly considers that the personality right, more precisely the right to informational self-determination guaranteed in Article 2 sect. 1 and Article 1 sect. 1 GG, serves as a "classic link for the third-party effect of constitutional rights".288 Nevertheless, the prevailing opinion denies such a direct effect of fundamental rights on the private sector, even if third parties have comprehensive power of control.

287 See Papier, ibid., cip. 4 to 6.
288 See Expert Group on Fundamental Rights, p. 27; Simitis, NJW 1984, p. 401; denying Wente, NJW 1984, 1446.
A direct third-party effect is only recognized in exceptions explicitly provided for by the German Basic Rights.289 Irrespective of the question of a direct third-party effect of German Basic Rights, it is common ground that these rights have an indirect effect on third parties. Legal doctrine has elaborated several objective and subjective functions of the Basic Law. In light of these functions, the Basic Rights not only serve, as illustrated previously, the defensive function that is at stake if someone seeks to defend him or herself against state regulation, but also serve a protection function. This function results from the "objective order of values" provided for by the German Basic Law. The justification of the protection function refers especially to Article 1 sect. 1 sent. 1 GG, which requires, similarly to Article 1 ECFR, that all state authorities must respect and protect human dignity.290

289 See Jarass in: Jarass/Pieroth, GG, Art. 1 cip. 50; Jaeckel, ibid., pp. 100/101.
290 See Papier, ibid., cip. 7 to 10.

(a) Protection function of the right to informational self-determination

In the decision "Release of Confidentiality" (Schweigepflichtentbindung), the German Constitutional Court affirmed this concept of protection with particular respect to data processing by private parties. In this case, the claimant complained about a certain contractual obligation in her disability insurance contract that contained an authorization for the release of confidential information relating to the insurance policy. The claimant had concluded an agreement with the insurance company for a life policy with supplementary insurance for occupational disablement.291 The contract for this supplementary insurance contained the claimant's duty to authorize the insurance company to "retrieve appropriate information from all doctors, hospitals, nursing homes, where I (the claimant) was or will be treated, as well as from my (the claimant's) health insurance company and other personal insurance companies, social insurance companies, public agencies, current and former employers."292 When an insurance event occurred, the claimant refused to authorize the general release of confidential information and instead offered to authorize the respective entities to disclose her personal information on a case-by-case basis. The defendant refused to accept this and, consequently, refused to pay out the policy. The claimant brought an action against the insurance company declaring that the specific clauses of the agreement in question were illegal and demanded that the insurance company pay out according to the policy. After the civil courts denied the claim in all instances, the claimant brought a constitutional complaint against the decisions of the civil courts on the grounds that the decisions infringed her basic right to informational self-determination.293 The Constitutional Court affirmed the claim, stating that the decisions of the civil courts infringed the claimant's general personality right in its specific form as the right to informational self-determination.

291 See BVerfG, 1 BvR 2027/02 (Release of Confidentiality), cip. 1 to 11.
292 See BVerfG, ibid., cip. 13.
293 See BVerfG, ibid., cip. 12 to 23.
The Court grounds the state duty of protection regarding the right to informational self-determination in the following reasoning: "The judgments in question of the Regional Court and the Higher Regional Court must be measured against the duty of the public authorities, resulting from Art. 2 sect. 1 in combination with Art. 1 sect. 1 GG, to guarantee the individual's informational self-determination in relation to third parties (…). The general personality right comprises the right of the individual to determine by him or herself the disclosure and usage of his or her personal data (…). This right also affects (…) private law. If the judge who decides a case according to private law misunderstands the object of protection of the general personality right, he or she infringes, by means of his or her decision, the protection function of the citizen's basic right (…). Indeed, especially in the private sector, the general personality right does not constitute an absolute control over certain information. The individual has rather to be considered as a personality that develops within the social community and depends on communication (…). This might result in the situation in which the individual has to respect the communication interests of others. Principally, it is for the individual to shape his or her communicational relationships and to decide whether he or she discloses or keeps certain information secret. The freedom to disclose personal information is also protected by basic rights. For the individual, it is generally possible and reasonable to take preventive measures in order to maintain his or her interests of confidentiality. The general personality right safeguards that the legal order provides and maintains the conditions under which the individual is able to participate in communication processes in a self-determined way and so to develop his or her personality. To this end, informational self-protection must actually be possible and reasonable for the individual. If this is not the case, there is a responsibility of the State to establish the conditions for a self-determined participation in communication. In such a case, the State cannot deny the persons concerned protection by referring to the only seeming voluntariness of the disclosure of certain information. The duty of protection that results from the general personality right rather requires the responsible public agencies to provide the legal preconditions for an effective informational self-protection."294 Thus, the duty of protection resulting from the right to informational self-determination obliges the State to establish and safeguard mechanisms that enable the individual concerned to protect him or herself against the threats resulting from data processing by third parties.

294 See BVerfG, ibid., cip. 27 to 33: "Die angegriffenen Urteile des Landgerichts und des Oberlandesgerichts sind an der aus Art. 2 Abs. 1 in Verbindung mit Art. 1 Abs. 1 GG folgenden Pflicht der staatlichen Gewalt zu messen, dem Einzelnen seine informationelle Selbstbestimmung im Verhältnis zu Dritten zu ermöglichen. Das allgemeine Persönlichkeitsrecht umfasst die Befugnis des Einzelnen, über die Preisgabe und Verwendung seiner persönlichen Daten selbst zu bestimmen (...). Dieses Recht entfaltet als Norm des objektiven Rechts seinen Rechtsgehalt auch im Privatrecht.
Verfehlt der Richter, der eine privatrechtliche Streitigkeit entscheidet, den Schutzgehalt des allgemeinen Persönlichkeitsrechts, so verletzt er durch sein Urteil das Grundrecht des Bürgers in seiner Funktion als Schutznorm (...). Gerade im Verkehr zwischen Privaten lässt sich dem allgemeinen Persönlichkeitsrecht allerdings kein dingliches Herrschaftsrecht über bestimmte Informationen entnehmen. Der Einzelne ist vielmehr eine sich innerhalb der sozialen Gemeinschaft entfaltende, auf Kommunikation angewiesene Persönlichkeit (...). Dies kann Rücksichtnahmen auf die Kommunikationsinteressen anderer bedingen. Grundsätzlich allerdings obliegt es dem Einzelnen selbst, seine Kommunikationsbeziehungen zu gestalten und in diesem Rahmen darüber zu entscheiden, ob er bestimmte Informationen preisgibt oder zurückhält. Auch die Freiheit, persönliche Informationen zu offenbaren, ist grundrechtlich geschützt. Dem Einzelnen ist es regelmäßig möglich und zumutbar, geeignete Vorsorgemaßnahmen zu treffen, um seine Geheimhaltungsinteressen zu wahren. Das allgemeine Persönlichkeitsrecht gewährleistet, dass in der Rechtsordnung gegebenenfalls die Bedingungen geschaffen und erhalten werden, unter denen der Einzelne selbstbestimmt an Kommunikationsprozessen teilnehmen und so seine Persönlichkeit entfalten kann. Dazu muss dem Einzelnen ein informationeller Selbstschutz auch tatsächlich möglich und zumutbar sein. Ist das nicht der Fall, besteht eine staatliche Verantwortung, die Voraussetzungen selbstbestimmter Kommunikationsteilhabe zu gewährleisten. In einem solchen Fall kann dem Betroffenen staatlicher Schutz nicht unter Berufung auf eine nur scheinbare Freiwilligkeit der Preisgabe bestimmter Informationen versagt werden. Die aus dem allgemeinen Persönlichkeitsrecht folgende Schutzpflicht gebietet den zuständigen staatlichen Stellen vielmehr, die rechtlichen Voraussetzungen eines wirkungsvollen informationellen Selbstschutzes bereitzustellen."

(b) Priority of contractual agreements and the imbalance of powers

Subsequently, the German Court specified under which conditions the decision of an individual in relation to a contractual agreement has to be considered as voluntary or as 'only seemingly voluntary', the misjudgment of which finally led to the infringement of the basic right by the deciding courts: "The contract is the essential instrument for realizing free and self-responsible action in relation to third parties. The contract, which mirrors the concurring will of the contracting parties, generally allows the assumption of a fair balance of their interests and must principally be respected by the State. However, if it is apparent that one party to the contract is so powerful that he or she can, in fact, unilaterally determine the content of the contract, the law must safeguard the constitutional positions of both contracting parties in order to prevent the self-determination of one party from turning into determination by the other. Such unilateral power of determination can result, amongst others, from the fact that the service offered by one party is of such essential importance for securing the personal circumstances of the other that the latter cannot reasonably refuse to conclude the contract and, subsequently, to disclose the information demanded by the former.
If those contract clauses – which concern the right to informational self-determination – are, in fact, not negotiable, the corresponding duty of protection requires the judge to weigh the confidentiality interests of the one party against the disclosure interests of the other."295

295 See BVerfG, ibid., cip. 34 to 36: "Der Vertrag ist das maßgebliche Instrument zur Verwirklichung freien und eigenverantwortlichen Handelns in Beziehung zu anderen. Der in ihm zum Ausdruck gebrachte übereinstimmende Wille der Vertragsparteien lässt in der Regel auf einen sachgerechten Interessenausgleich schließen, den der Staat grundsätzlich zu respektieren hat (...). Ist jedoch ersichtlich, dass in einem Vertragsverhältnis ein Partner ein solches Gewicht hat, dass er den Vertragsinhalt faktisch einseitig bestimmen kann, ist es Aufgabe des Rechts, auf die Wahrung der Grundrechtspositionen beider Vertragspartner hinzuwirken, um zu verhindern, dass sich für einen Vertragsteil die Selbstbestimmung in eine Fremdbestimmung verkehrt (...). Eine solche einseitige Bestimmungsmacht eines Vertragspartners kann sich auch daraus ergeben, dass die von dem überlegenen Vertragspartner angebotene Leistung für den anderen Partner zur Sicherung seiner persönlichen Lebensverhältnisse von so erheblicher Bedeutung ist, dass die denkbare Alternative, zur Vermeidung einer zu weitgehenden Preisgabe persönlicher Informationen von einem Vertragsschluss ganz abzusehen, für ihn unzumutbar ist. Sind in einem solchen Fall die Vertragsbedingungen in dem Punkt, der für die Gewährleistung informationellen Selbstschutzes von Bedeutung ist, zugleich praktisch nicht verhandelbar, so verlangt die aus dem allgemeinen Persönlichkeitsrecht folgende Schutzpflicht eine gerichtliche Überprüfung, ob das Geheimhaltungsinteresse des unterlegenen Teils dem Offenbarungsinteresse des überlegenen Teils angemessen zugeordnet wurde. Dazu sind die gegenläufigen Belange einander im Rahmen einer umfassenden Abwägung gegenüberzustellen (...)."

The Court finally came to the conclusion that the negotiating power of the contracting parties was so unbalanced that the claimant could not safeguard her informational self-protection on her own. The Court stated that, in light of the current low level of state insurance for occupational disability, professionals have to take out private insurance policies in order to safeguard their standard of living. Furthermore, the Court held that the clause in question was not negotiable. Even though the claimant could choose between different policies offered by different insurance companies, the differences between the policies on the market referred only to the conditions and the extent of the services of the policy as such, but not to the collection and processing of the personal data. Thus, the Court found that there was no competition in the market with regard to the clauses relevant to data protection law.296

296 See BVerfG, ibid., cip. 37 to 40.

(c) Balancing the colliding constitutional positions

Consequently, the German Constitutional Court set out how the constitutional positions of the contracting parties may be weighed against each other. On the one hand, the Court considered, with the following reasoning, that the contractual obligation to release confidentiality essentially harmed the claimant's right to informational self-determination: "The persons and institutions that are, in part, only very generally described in the authorization of release from confidentiality can have sensitive information about the claimant which profoundly affects the development of her personality.
(…) (Given the release of confidentiality), the claimant loses the possibility to control the safeguarding of her confidentiality interests on her own because of the general wording of the authorization, which specifies neither particular offices to be asked nor particular requests for information, so that she cannot foresee which information about her will be demanded by whom. (…) The authorization demanded by the defendant is comparable to a general authorization to retrieve sensitive information with respect to the insurance event, the extent of which is hardly foreseeable for the claimant. (…) Because of the broad term 'appropriate', the policy-holder is not able to determine in advance which information can be retrieved on the basis of the authorization. The district court considered 'all facts which might be, even indirectly, legally relevant for the approval and execution of the policy services' as appropriate. As a consequence, practically any reference to the event of insurance suffices in order to allow the inquiry. (…) Mechanisms of control to verify whether the collection of the data occurs in accordance with the (… /clause) are lacking. (…) The contract does not provide for any duty to specifically inform the policy-holder about individual collections of the data. Only after the disclosure of the information, and provided that he or she becomes aware of it, does the insured person have the possibility to review its legitimacy and to bring judicial action against it. However, at this moment, his or her interest may already have been irreparably harmed (… /by the insurance company)."297

On the other hand, the German Constitutional Court considered that the defendant had an equally essential interest in obtaining the information: "It is of high relevance for the insurance company to be able to verify whether the event of insurance really occurred. (…) In addition, in light of the variety of conceivable cases, the insurance company is not able to pre-list, already in the contract clause, all the information that might become relevant for the subsequent verification. In evaluating the importance of the defendant's interests, the organizational and financial effort required by the different possibilities of verification may also be taken into consideration."298

297 See BVerfG, ibid., cip. 43, 45 to 48: "Wenn die Beklagte von der Beschwerdeführerin die Abgabe der begehrten Schweigepflichtentbindung verlangen kann, wird deren Interesse an wirkungsvollem informationellem Selbstschutz in erheblichem Ausmaß beeinträchtigt. Die in der formularmäßigen Erklärung der Schweigepflichtentbindung genannten, zum Teil sehr allgemein umschriebenen Personen und Stellen können über sensible Informationen über die Beschwerdeführerin verfügen, die deren Persönlichkeitsentfaltung tief greifend berühren. (...) Dabei begibt sie sich auch der Möglichkeit, die Wahrung ihrer Geheimhaltungsinteressen selbst zu kontrollieren, da wegen der weiten Fassung der Erklärung, in der weder bestimmte Auskunftsstellen noch bestimmte Auskunftsersuchen bezeichnet sind, für sie praktisch nicht absehbar ist, welche Auskünfte über sie von wem eingeholt werden können. (...) Die von der Beklagten verlangte Ermächtigung kommt damit einer Generalermächtigung nahe, sensible Informationen mit Bezug zu dem Versicherungsfall zu erheben, deren Tragweite die Beschwerdeführerin kaum zuverlässig abschätzen kann. (...) Es fehlt an einem wirksamen Kontrollmechanismus für die Überprüfung der Sachdienlichkeit einer Informationserhebung. (...)
clause, all the information that might become relevant for the subsequent verification. In evaluating the weight of the defendant's interests, the organizational and financial efforts resulting from different possibilities of verification may also be taken into consideration."298 In conclusion, the German Constitutional Court examines, first, whether the State has actually violated its duty of protection and, in doing so, whether the individual concerned is really able to protect him or herself. Only if this is not the case does the State (in this case, the Constitutional Court) have the duty to weigh the opposing fundamental rights itself, instead of leaving this to the private parties. bb) Balance between defensive and protection function As demonstrated so far, the European Court of Human Rights does not precisely differentiate between the defensive and the protection function of human rights. In turn, the European Court of Justice does not even clarify, at least not explicitly, the type of effect that the fundamental rights to private life and/or data protection have on the private sector. In contrast, the German Constitutional Court explicitly applies an indirect effect of basic rights, elaborating precisely on the protection and defensive functions in order to balance the basic rights that oppose the German right to informational self-determination. Therefore, even if not all fundamental rights regimes recognize the defensive and protection functions as applicable principles, it is worth examining their interplay in general, as it can serve as a structural aid for finding a sound balance between colliding fundamental rights.299 298 See BVerfG, ibid., cip. 50 to 52: "Dem Interesse der Beschwerdeführerin an informationeller Selbstbestimmung steht ein Offenbarungsinteresse der Beklagten von gleichfalls erheblichem Gewicht gegenüber. Es ist für das Versicherungsunternehmen von hoher Bedeutung, den Eintritt des Versicherungsfalls überprüfen zu können. (...) Zudem ist es aufgrund der Vielzahl denkbarer Fallgestaltungen dem Versicherer nicht möglich, bereits in der Vertragsklausel alle Informationen im Voraus zu beschreiben, auf die es für die Überprüfung ankommen kann. Im Rahmen der Gewichtung des Interesses der Beklagten kann auch der organisatorische und finanzielle Aufwand berücksichtigt werden, den verschiedene Prüfungsmöglichkeiten erfordern." 299 Cf. Jaeckel, ibid., p. 103, who stresses the many commonalities of all three fundamental rights regimes, i.e. the ECHR, the ECFR, and the German Basic Rights, regarding the state duty of protection; Eckhoff, ibid., regarding the terminology, pp. 288 to 290.
(1) The 3-Step-Test: Assessing the defensive and protection function There is a rough consensus on how to assess both the protection function and the defensive function of fundamental rights.300 Both assessments usually follow three steps: Firstly, it is necessary to determine the scope of protection of the fundamental right in question. The second step requires examining whether or not a certain action intrudes into that scope. Up to this point, the first and second steps follow a very similar approach for both functions. The third step seeks to assess whether or not the intrusion into the scope of protection leads to a disproportionate violation of the fundamental right. It is in this third step that the assessment differs between the defensive and the protection function, as demonstrated below. As mentioned previously, like the defensive function, the protection function applies to all three state powers, i.e. the legislator, the executive, and the judiciary. Regarding the protection function, the third step of the assessment refers to the question of whether or not the harm caused by a private party to another private party must be considered as a non-fulfillment of the duty of protection by the State. However, with respect to a legislator's action, or rather omission, the protection function is particular. In Germany, it can be assessed pursuant to the principle called "prohibition of insufficient means". The German Constitutional Court requires, in essence, only "an adequate level of protection – taking colliding objects of protection into account –; it is essential that such protection is effective. The measures provided for by the legislator must be sufficient for an adequate and effective protection and must be, in addition, based on an accurate investigation of facts and on reasonable estimations."301 Hence, the duty of protection principally follows the objects of protection guaranteed by the fundamental rights.302 Consequently, these guarantees also determine the so-called range of protection. 300 See Jaeckel, ibid., examining in detail the criteria for the distinction between the protection and defensive function in the light of German Basic Rights, pp. 63 to 79, the ECHR, pp. 141 to 154, and the ECFR, pp. 247 to 159. 301 See Calliess, ibid., cip. 6 with reference to BVerfGE 88, 203, cip. 159: "Notwendig ist ein – unter Berücksichtigung entgegenstehender Rechtsgüter – angemessener Schutz; entscheidend ist, daß er als solcher wirksam ist. Die Vorkehrungen, die der Gesetzgeber trifft, müssen für einen angemessenen und wirksamen Schutz ausreichend sein und zudem auf sorgfältigen Tatsachenermittlungen und vertretbaren Einschätzungen beruhen".
The following three questions essentially determine the range of protection required in order to provide for an adequate level of protection: Is subsequent protection against a harm that has already occurred sufficient? Or is preventative protection against specific risks necessary? Or is precautionary protection against unspecific risks even required?303 Calliess stresses a further factor determining the duty of protection: the state "monopoly on the use of force".304 This monopoly forbids individuals from enforcing their rights themselves. Therefore, the less private individuals are legally allowed to protect themselves against harms by third parties, the more the State is in charge of controlling the protection of their fundamental rights. In contrast, the more the legislator provides mechanisms enabling private parties to protect themselves, e.g. by self-regulation mechanisms such as codes of conduct, certificates or the individual's consent, the less strict the state duty of protection becomes.305 Similarly, if private entities become so powerful that they can unilaterally determine the conditions on the market, the state duty of protection requires rebalancing this market power.306 Overall, the State must ensure that the legal system effectively and efficiently enables the individual to protect him or herself; the system of protection provided for must be suited to repelling the harm (depending on the risk and intensity it poses), in accordance with the fundamental right in question.307 However, even if the duty of protection is strict, the legislator always has a certain margin of discretion as to how to fulfill its duty of protection. This is the particularity of the protection function with respect to the legislator, compared to the executive or the judiciary. This margin results from the separation of powers: a Constitutional Court, belonging to the judiciary, must not substitute itself for the legislator, which is democratically empowered by its citizens. The Constitutional Court would substitute itself for the legislator if it ordered how the legislator has to fulfill its protection function.308 Only the importance of the substantial guarantee in question, the severity of the infringement, and the importance of opposing constitutional guarantees can restrict this margin of appreciation.309 302 See Dietlein, The Doctrine of Duties of Protection of Basic Rights, pp. 86 and 87. 303 See above under point B. II. 3. c) Interim conclusion: Fundamental rights determining the appropriateness of protection; Jaeckel, ibid., regarding the German Basic Rights, pp. 85 to 88, the ECHR, pp. 165 and 166, and the ECFR, pp. 260 to 265; cf. Kuner et al., Risk management in data protection, p. 98. 304 See Calliess, ibid., cip. 2. 305 See Calliess, ibid., cip. 20 to 22. 306 Cf. v. Danwitz, The Fundamental Rights to Private Life and to Data Protection, pp. 584 and 585. 307 See Calliess, ibid., cip. 20 to 22, 25, and 26. In contrast, the assessment of whether or not a state action conflicts with the defensive function of a fundamental right generally leaves only a narrow margin of discretion and is thus stricter. Here, the assessment always refers to a specific state action. If this specific action infringes the scope of protection of a fundamental right, the question is whether or not the infringement is legitimate.
In answering this last question, a proportionality test plays a decisive role.310 This proportionality test refers to the following four questions:
1. Does the action intruding into the scope of the fundamental right pursue a legitimate aim? (Pre-question)
2. If so, is the action adequate in order to achieve this aim?
3. If so, is the action necessary for this aim, in other words, is there no other action that is equally efficient in achieving the legitimate aim while intruding less into the scope of the fundamental right?
4. If so, is the action proportionate with respect to the colliding fundamental rights?
In conclusion, the regulator has to balance the colliding fundamental rights by respecting, with regard to the protection function, the "prohibition of insufficient means" and, with respect to the defensive function, the proportionality test.311 In this regard, it is the legislator who is primarily in charge of balancing the colliding fundamental rights by means of implementing ordinary law, be it civil, administrative, or penal law. And even if it is the classic role of civil law to resolve conflicts of interest amongst private individuals, civil law does not have to be considered the only regime of regulation instruments. Administrative law, comparably, serves to prevent such conflicts, especially with regard to relationships involving multiple individuals.312 This might particularly be the case if the object of regulation concerns a collective good, which must therefore not depend entirely on the disposal of private parties. As mentioned previously, Regan advocates considering privacy as such a collective good because it constitutes a pre-condition for being a citizen in a democracy.313 In any event, the legislator provides this legal framework on both an abstract and a general level and therefore has a wide scope with respect to the consideration of the relevant facts, their evaluation, and, finally, the establishment of the regulation instruments.314 308 See, with respect to German law, Callies, ibid., cip. 6; Rupp, The State Duty of Protection for the Right to Informational Self-Determination in the Press Sector, pp. 46 to 53. 309 Cf. v. Danwitz, The Fundamental Rights to Private Life and to Data Protection, p. 582. 310 See, regarding the European Convention on Human Rights, Matscher, Methods of Interpretation of the Convention, p. 67; regarding the European Charter of Fundamental Rights, González-Fuster, The Emergence of Data Protection as a Fundamental Right of the EU, pp. 200 to 205, who also stresses the uncertainties on the interplay of Article 8 sect. 2 and 3 ECFR and Article 52 ECFR. 311 See Grimm, Data protection before its refinement, pp. 587 and 588. (2) A first review: decomposing the object and concept of protection Weighing both functions correctly is thus a rather complex task. This does not only depend on the object of protection guaranteed by the fundamental right concerned, but also on the specific protection instruments. The challenge of drawing the line between efficient protection of fundamental rights and an infringement of opposing fundamental rights through over-regulation becomes particularly apparent with respect to privacy and data protection, in other words, with respect to threats caused by the "processing of personal data". (a) Which instruments actually protect which object of protection?
With respect to the German right to informational self-determination, the way in which the State balances the duty of protection against opposing fundamental rights can be differentiated, in essence, into the following categories: first, a ban on disclosing personal data (e.g. by legal prohibitions or technical means); and second, support for informational self-protection (e.g. by information or technical self-protection).315 The State is usually able to fulfill its duty of protection by the second means, i.e. supporting measures. Only if these supporting measures are not effective, or in order to protect the fundamental rights of third parties affected by the disclosure, is the State allowed to prohibit the self-disclosure of personal data. In any case, abstract constitutional aims (such as environmental protection) do not create a duty of protection. Such constitutional positions can only help to justify provisions that infringe the defensive function of fundamental rights in the balancing exercise between the colliding fundamental rights.316 312 See Bethge, § 72 – Collision of Basic Rights, cip. 16, 17, 22, and 24; Dietlein, The Doctrine of Duties of Protection of Basic Rights, pp. 109 and 110. 313 See Nissenbaum, ibid., p. 87, referring to Priscilla Regan (1995), Legislating Privacy, Chapel Hill: University of North Carolina Press, pp. 226 and 227. 314 Cf. Jarass, ibid., Vorb. vor. Art. 1 cip. 56; Callies, ibid., cip. 6. 315 Cf. Sandfuchs, Privacy against one's will?, pp. 299 to 302. 316 See Dietlein, The Doctrine of Duties of Protection of Basic Rights, pp. 104 and 105. 317 See Buchner, ibid., p. 183, with further references to the German discussion; Rouvroy and Poullet, The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy, p. 50. (b) Example: "Commercialized" consent threatening the object of protection including… Regarding these abstract constitutional positions, Buchner unfolds the diverse aspects of the object of protection of the right to informational self-determination that are discussed in German literature, with a focus on consent. In particular, the following aspects of the object of protection are discussed: a protection of individuality, of solidarity, and of democracy in society. Promoters of these positions argue that the focus on the individual's consent as the main self-regulation instrument of informational self-determination inevitably leads, in the private market, to its commercialization and, as a consequence, endangers not only the dignity of the individual but also society as a whole. Individuals would degrade themselves to mere economic assets, which simultaneously disintegrates the basis for a democratic civil society.317 Buchner does not negate these criticisms per se, but stresses that this discussion actually refers to the relationship between reality and law. He asserts that the economic exploitation of personal data is a fact. Meanwhile, there is a long-standing market in which participants trade data as economic goods. Consequently, he poses the question for the legislator: "Should its regulatory function focus on guaranteeing, by means of certain procedural rules, a minimum of balance between the market participants?
Or should the legislator also be in charge of setting up ethical rules and enforcing them, possibly even against the actual covetousness of the market?"318 Buchner responds to these questions by referring to the "Marlene Dietrich" decision of the German Federal Court of Justice, i.e. the highest civil court in Germany: the legal order must restrain the commercialization of the personality right "where superior legal or ethical principles require this".319 Buchner then unfolds these principles with respect to the commercialization of the right to informational self-determination. (c) … individuality? First, Buchner addresses the criticism that, as a result of commercialization, individuals would degrade themselves to mere economic assets. From this perspective, human life would increasingly be interpreted in economic categories, and human beings would be reduced to mere figures and thus become quantitatively measurable and comparable. Critics therefore assume that the economic exploitation of personal data automatically increases the pressure of homogenization and eliminates qualitative differences. In contrast, Buchner challenges this mechanism by pointing to the factual development of personalized marketing. Its aim is not to equalize the individual but to capture his or her particularities in order to increase customer loyalty. From this point of view, the commercialization of personal data indeed leads less to a homogenization of individuals than to an individualization of production and marketing processes.320 However, beyond marketing, Buchner admits that there is a pressure of adaptation: private parties decide with whom and under which conditions they want to contract on the basis of the available information. For instance, the more information private companies (such as insurance companies, creditors, landlords or employers) have or gain about individuals (such as debtors, tenants and employees), the higher the pressure becomes for those individuals to comply with those expectations. However, Buchner considers that this pressure is neither objectionable in itself nor new, at least so long as it safeguards proper legal or contractual behavior and the processing of data is correct and fair. Rather, the new issue raised by the processing of personal data is the increasing differentiation in how certain characteristics of the potential contractual partner, and consequently the contractual relations, are pre-determined.321 318 See Buchner, ibid., pp. 185 and 186. 319 See Buchner, ibid., p. 187 with reference to BGHZ 143, 214 (225) – Marlene Dietrich. 320 See Buchner, ibid., pp. 184, 189, and 190. (d) … solidarity? The last aspect leads to another criticism regarding the commercialization of personal data: the disintegration of the community of solidarity. The more individuals can profit, in the form of economic advantages, from the disclosure of their personal data, the less they will be willing to accept common (contractual) conditions protecting others who cause higher risks or costs.
Buchner concludes from this that the more information can, in principle, be retrieved, be it through better algorithms or through a higher willingness of individuals to share their data, the more difficult it will be to impose, by means of law, an artificial ignorance in favour of equality amongst individuals.322 In essence, there are two, partly intertwined, categories of law covering this phenomenon: the rights to equality and non-discrimination, and the Social State Principle guaranteed by the German Basic Law. Buchner stresses that even if the increased differentiations do not infringe the rights to equality and non-discrimination of the individuals concerned, they increase the challenges for those individuals who do not fit the advantageous expectations of the economy. Consequently, Buchner recognizes an increasing social gap between individuals with, in an economic sense, 'good' and 'bad' data, respectively. However, he regards this phenomenon primarily as a problem of the Social State principle. Therefore, he asks whether the State can or should impose, by means of data protection law, its social responsibility on private companies. Buchner favors solving this social problem through public social law rather than through data protection law regulating interactions between private parties.323 321 See Buchner, ibid., pp. 190 and 191. 322 See Buchner, ibid., p. 194. 323 See Buchner, ibid., pp. 197 and 198. (e) … democracy? Finally, Buchner deals with the criticism of whether, and if so, to what extent, the commercialization of personal data in the private sector endangers the pre-conditions of a democratic civil society. He identifies, as a main source of this criticism, the "Decision on Population Census", in which the German Constitutional Court stated: "In light of the right to informational self-determination, no social or legal order would be possible in which citizens were not able to know what information others have about them. The person who is unsure whether his or her deviant behavior will be noted and permanently stored, used or transferred will attempt not to attract attention through such behavior. The person who is aware of being registered by the State when he or she takes part in an assembly or is part of an association will possibly refrain from exercising his or her corresponding fundamental rights (…). This would not only restrict the chances of individual freedom of development but also impair the common welfare, because self-determination is an essential condition for a free and democratic civil society that builds upon the ability of its citizens to act and participate."324 324 See BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83 (Decision on Population Census), retrieved on the 7th of February 2016 from https://openjur.de/u/268440.html, cip. 172: "Wer nicht mit hinreichender Sicherheit überschauen kann, welche ihn betreffende Informationen in bestimmten Bereichen seiner sozialen Umwelt bekannt sind, und wer das Wissen möglicher Kommunikationspartner nicht einigermaßen abzuschätzen vermag, kann in seiner Freiheit wesentlich gehemmt werden, aus eigener Selbstbestimmung zu planen oder zu entscheiden. Mit dem Recht auf informationelle Selbstbestimmung wären eine Gesellschaftsordnung und eine diese ermöglichende Rechtsordnung nicht vereinbar, in der Bürger nicht mehr wissen können, wer was wann und bei welcher Gelegenheit über sie weiß.
Wer unsicher ist, ob abweichende Verhaltensweisen jederzeit notiert und als Information dauerhaft gespeichert, verwendet oder weitergegeben werden, wird versuchen, nicht durch solche Verhaltensweisen aufzufallen. Wer damit rechnet, daß etwa die Teilnahme an einer Versammlung oder einer Bürgerinitiative behördlich registriert wird und daß ihm dadurch Risiken entstehen können, wird möglicherweise auf eine Ausübung seiner entsprechenden Grundrechte (Art. 8, 9 GG) verzichten. Dies würde nicht nur die individuellen Entfaltungschancen des Einzelnen beeinträchtigen, sondern auch das Gemeinwohl, weil Selbstbestimmung eine elementare Funktionsbedingung eines auf Handlungsfähigkeit und Mitwirkungsfähigkeit seiner Bürger begründeten freiheitlichen demokratischen Gemeinwesens ist." These considerations are similar to the approach promoted by Priscilla Regan.325 However, Buchner stresses that the Constitutional Court developed this reasoning with respect to the State. He agrees that the processing of data by the State endangers free political discourse but doubts that the processing of personal data in the private sector is relevant for the individual's ability to freely participate in public discourses. Buchner argues that private legal transactions primarily serve the exchange of goods and services but not the exercise of civil rights. Even if the concepts of private and public autonomy were inextricably linked to each other, he doubts that the commercialization of personal data would hinder the individual's autonomy. In his opinion, while the disclosure of personal data indeed increases the knowledge of third parties, this does not automatically hinder the autonomy of the individual concerned. Autonomy does not require individuals to know anything about other individuals, nor does one's own knowledge always lead to another's manipulation. Therefore, Buchner advocates concentrating only on the really problematic cases and not on every single aspect of data processing by private parties, because otherwise every social interaction in a digitized society would become problematic.326 325 See above point B. III. 1. The individual's autonomy and the private/public dichotomy. 326 See Buchner, ibid., pp. 193 and 194. cc) Equal or equivalent level of protection compared to state data processing? Before coming to a first conclusion on the previous considerations, there is still another question to be considered. Given that fundamental rights have only an indirect effect and that the object of protection is so broad, covering abstract constitutional positions (such as individuality, solidarity, and democracy), the question is whether the data protection instruments established in the private sector should be identical to those in the public sector or, at least, equivalent. There are two contrasting opinions on this issue amongst legal scholars. Pursuant to the first opinion, the level of protection and the regulation instruments are the same for both the public and the private sector. An 'equal level' of protection is considered appropriate because the imbalance of power caused by the processing of personal data is the same in the public and in the private sector.
De Hert and Gutwirth give a vivid explanation of why data protection law is often considered equally applicable in the public and in the private sector: "The power of those, be it in the public or in the private sector, who process personal data concerning others (whether with the help of information technology or not) is generally already greater to begin with. The stream of personal data primarily flows from the weak actors to the strong. Citizens not only need to provide information to the authorities, but they also need to do so as a tenant, job seeker, customer, loan applicant and patient. That is precisely why legal tools of transparency and accountability under the form of data protection regulations were devised for application both in the public and in the private sector."327 In contrast, legal scholars promoting an 'equivalent level' of protection do not require the same protection instruments but allow for different protection instruments to be implemented in the private and the public sector. Depending on the particular circumstances of the case, this might result in a higher, equal or lower level of protection. Others, finally, doubt that these questions make sense at all. Buchner argues, for example, that such a comparison of different levels of protection implies an objective scale. In the private sector, such an objective scale does not exist, in his opinion, because the fundamental right of the individual concerned is not an 'absolute' right but must instead be weighed against the opposing fundamental rights. The result is that fundamental rights always lack an objective scale, which would actually be the pre-condition for answering the question of whether there should be a higher, lower or equivalent level of protection.328 327 See De Hert and Gutwirth, Privacy, data protection and law enforcement. Opacity of the individual and transparency of power, p. 78. 328 See above C. I. 1. b) cc) Equal or equivalent level of protection compared to state data processing?, referring to Buchner, Informational self-determination in the private sector, pp. 44 and 45 with further references, as well as pp. 57 and 58. c) Interim conclusion: Interdisciplinary research on the precise object and concept of protection The previous discussion illustrates the difficulties of deciding on the appropriate regulation instruments whilst balancing, in the private sector, the opposing fundamental rights and further constitutional positions. All three fundamental rights regimes, i.e. the European Convention on Human Rights, the European Charter of Fundamental Rights, and the German Basic Rights, tend to apply only an indirect effect of fundamental rights between private parties. Even if not all particularities are comprehensively clarified, the 3-step tests assessing the protection and defensive functions of fundamental rights can provide structural help for this balancing exercise. In this regard, the question of how the legislator should provide for protection against threats resulting from the processing of personal data by private entities depends on the objects and concepts of protection of the fundamental rights. However, already defining the object of protection of privacy and/or data protection is a difficult task.
Buchner decomposes the object of protection of the German right to informational self-determination by considering individuality, solidarity and democracy as abstract constitutional positions or, in his words, superior legal or ethical principles. Indeed, these constitutional positions do not create per se a state duty of protection. However, the legislator may refer to these positions in order to justify protection instruments that are established primarily to protect an individual's fundamental right. And in doing so, the legislator has a wide margin of discretion for establishing the adequate protection instruments. Therefore, the legislator can indeed decide to impose certain mechanisms on the private sector, supplementing the social basis for a democratic and supportive Civil Society. Even if Buchner's observations are principally correct, the legislator can therefore well decide, for example, to implement certain Social State principles by means of data protection law rather than by Social Law. At least, this thought applies so long as these objective constitutional aims are not the only reason for the regulation, but are additional to the protection of an individual's fundamental right. Equally, this idea applies, in principle, to the discussion on whether the data protection instruments applied in the public and the private sector should be, in light of the same (or similar) imbalances of informational power, the same or equivalent. If the legislator comes to the conclusion that there are informational imbalances in both the public and the private sector, it can well address these imbalances with the same or with different protection instruments. However, there is another problematic aspect to this regulation: all of the negative effects discussed in legal discourse regarding the processing of personal data in the private sector are mainly grounded on assumptions. For example, do contractual differentiations between private parties, such as in the insurance industry, really increase the pressure of social adaptation, and if so, to what extent? If private parties are able, more and more, to pre-control their contractual partners instead of retrospectively sanctioning them for disappointing trustful expectations, does this destroy social trust as a pre-condition for autonomous behaviour? How much do imbalances of information threaten the balance of public discourses? Are there informational power inequalities? And how do we actually capture these inequalities in our theoretical concepts? The concepts underlying these questions are similar, if not identical, to the concepts discussed previously: Nissenbaum summarized them, referring to autonomy, human relationships, and society as a whole as the actual values of privacy. If such concepts serve as a basis for the legislator, it is absolutely necessary to clarify and validate both their theoretical and empirical presumptions in order to improve the rationality of the law.329 Only if it is clear what the fundamental rights protect is it possible to validate, first, the actual threats to these objects of protection and, second, the efficiency of the protection instruments applied in order to achieve these aims, such as the principle of purpose limitation.330 The object and concept of protection of the German right to informational self-determination Clarifying the object and concept of protection is hence key to helping data controllers apply the principle of purpose limitation.
As illustrated in the introduction, data controllers often have difficulties in precisely specifying the purpose of the processing intended. The German Constitutional Court has developed the object and concept of protection of the German right to informational self-determination over three decades. Examining these decisions shall thus serve as a comparison with (or even a source of inspiration for the development of) the rights to private life and data protection under Articles 7 and 8 ECFR. 2. 329 See Hoffmann-Riem, Innovation Responsibility, p. 39. 330 See above point B. II. 4. Searching for a scale in order to determine the potential impact of data protection risks. C. The function of the principle of purpose limitation in light of Article 8 ECFR 144 Genesis and interplay with co-related basic rights The German State of Hessen established, in 1970, the first data protection law in the world.331 However, interestingly, German Basic Law does not explicitly state that an individual's data is protected. Legal scholars consider that the various plans to introduce the right to data protection in German Basic Law became superfluous in light of the comprehensive definition provided for by the German Constitutional Court in the “Decision on Population Census” (Volkszählungsurteil) from 1983. In this case, the German Constitutional Court recognized the so-called right to informational self-determination as an autonomous guarantee provided for by the general personality right.332 The right to informational self-determina-tion primarily served to protect the individual against the informational interest of the State. Under German Basic Law, there are several rights that mirror this purpose of protection with regard to specific aspects of life, such as the right to privacy of correspondence, posts and telecommunication under Article 10 GG, as well as the right to the inviolability of the home under Article 13 GG.333 Another fundamental right related to the right to informational self-determination refers to the protection of the confidentiality and integrity of information technological systems (Grundrecht auf Vertraulichkeit und Integrität informationstechnischer Systeme). This fundamental right extends the general scope of protection for the individual’s personality to the moment before the personal data is collected. This right protects the individual’s trust that the information technological system used by him or her functions properly. Recognizing this kind of protection, the German Constitutional Court decided not to discuss this issue under the right to informational self-determination, because this would have meant extending its already broad scope of protection even further. Instead, the Court decided to establish a new guarantee, which indeed is also provided for by the general personality right.334 Despite the different guarantees provided for by the German basic rights surrounding the protection of personal data, the German Constitutional Court often connects them in order to evaluate an infringement by the State. For example, the Court a) 331 See Rudolf, Right to Informational Self-Determination, cip. 8. 332 See Rudolf, ibid., cip. 8 and cf. Burgkardt, ibid., p. 85. 333 Cf. Burgkardt, ibid., p. 85. 334 See Hoffmann-Riem, Protection of the Confidentiality and Integrity of Information Technological Systems, p. 1015. I. 
Constitutional framework 145 considers the basic right to privacy of telecommunications under Article 10 GG and the basic right to privacy of the home under 13 GG as “specifications of the basic right to informational self-determination”, and applies their principles to the more general right to informational self-determi-nation, at least, “as long as they are not the result of the particularities of the special guarantees.”335 Before the recognition of the basic right to informational self-determination, the German Constitutional Court referred in similar cases to the protection of being private, comparable to Art. 8 ECHR and Art. 7 ECFR. This right resulted in a “right to be left alone.”336 Pursuant to the so-called theory of spheres, the more that the data was considered as being connected to the individual concerned, the stricter the protection of personal data was. Despite the clarity of this concept, the theory of spheres failed to provide clear criteria in order to differentiate between the different spheres. Some scholars view this as the essential problem that finally lead to the development of the right to informational self-determination, and was recognized by the German Constitutional Court in the famous “Decision on Population Census”.337 In light of the development of both the following constitutional decisions, as well as the technical possibilities of data collection and processing today, the introduction of this decision is worth being quoted in this thesis. In this case, citizens within Germany filed several constitutional complaints against a law for a state census including population, housing, profession and work areas. The German Court described the social backgrounds that lead to the constitutional complaints in the introduction of its judgment as: “The data collection intended by this law caused anxiety even in those parts of the population who respect as loyal citizens the right and duty of the State 335 See BVerfG, 4th of April 2006, 1 BvR 518/02 (Dragnet Investigation), cip. 90: “Da diese Grundrechte spezielle Ausprägungen des Grundrechts auf informationelle Selbstbestimmung darstellen (...), sind diese Maßstäbe auch auf das allgemeinere Grundrecht anwendbar, soweit sie nicht durch die für die speziellen Gewährleistungen geltenden Besonderheiten geprägt sind.“ as well as BVerfG, 14th of July 1999, 1 BvR 2226/94 (Surveillance of Telecommunications), cip. 137, and BVerfG, 3rd of March 2004, 1 BvR 2378/98 (Big Eavesdropping Operation), cip. 169. 336 See Burgkardt, ibid., p. 87. 337 See Albers, Informational Self-Determination, pp. 211 and 212; Burgkardt, ibid., p. 88; cf. the criticism of the “private/public dichotomy” by Nissenbaum above under point B. III. 2. “Criticism: From factual to conceptual changes”. C. The function of the principle of purpose limitation in light of Article 8 ECFR 146 to collect the information necessary for reasonable public action. This might result from the fact that the extent and purpose of the census was, to a great extent, unknown and that the necessity to reliably inform the citizens concerned was not taken early enough into account despite the fact that public awareness (…) increased in view of the development of automated data processing. 
Nowadays, the possibilities of modern data processing are, to a large extent, transparent only to experts and can provoke the fear of uncontrolled profiling, even if the legislator demands the collection of such information which is necessary and reasonable.”338 Thus, in this decision, the Court stated, with respect to the public sector, that the “free development of the personality requires, under the modern conditions of data processing, the protection of the individual against unlimited collection, storage, usage and transfer of his or her personal data.”339 In this statement, the Court does not want to protect the individual against all kinds of treatment of ‘his or her’ data but instead, only wants to protect the individual against the unlimited treatment of data.340 The subsequent analysis will therefore illustrate how the German Court frames the principle of purpose limitation in light of the object and concept of protection of the right to informational self-determination in order to protect against an unlimited use of personal data. 338 See BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83 (Decision on Population Census), cip. 8: “Die durch dieses Gesetz angeordnete Datenerhebung hat Beunruhigung auch in solchen Teilen der Bevölkerung ausgelöst, die als loyale Staatsbürger das Recht und die Pflicht des Staates respektieren, die für rationales und planvolles staatliches Handeln erforderlichen Informationen zu beschaffen. Dies mag teilweise daraus zu erklären sein, daß weithin Unkenntnis über Umfang und Verwendungszwecke der Befragung bestand und daß die Notwendigkeit zur verläßlichen Aufklärung der Auskunftspflichtigen nicht rechtzeitig erkannt worden ist, obwohl sich das allgemeine Bewußtsein durch die Entwicklung der automatisierten Datenverarbeitung (...) erheblich verändert hatte. Die Möglichkeiten der modernen Datenverarbeitung sind weithin nur noch für Fachleute durchschaubar und können beim Staatsbürger die Furcht vor einer unkontrollierbaren Persönlichkeitserfassung selbst dann auslösen, wenn der Gesetzgeber lediglich solche Angaben verlangt, die erforderlich und zumutbar sind. (...)” 339 See BVerfG, ibid., cip. 173: “Freie Entfaltung der Persönlichkeit setzt unter den modernen Bedingungen der Datenverarbeitung den Schutz des Einzelnen gegen unbegrenzte Erhebung, Speicherung, Verwendung und Weitergabe seiner persönlichen Daten voraus.” 340 Cf. Hoffmann-Riem, ibid., p. 1015. I. Constitutional framework 147 Autonomous substantial guarantee In this same “Decision on Population Census”, the Court firstly determined on the conceptual provenance and normative aim of the right to informational self-determination. In this regard, it must be stressed that this thesis uses, so far, the terms “object of protection” and “substantial guarantee” provided for by fundamental rights, synonymously. Both the meaning and differences of the terms shall be examined, later on, with respect to the differentiation of the fundamental rights to privacy and data protection under Article 7 and 8 ECFR.341 In any case, the German Constitutional Court considers the normative substance of the right to informational self-determination as: “The human dignity of a person who acts as a member of a free society in a free and self-determined manner constitutes the center of the constitutional order. Besides specific guarantees of freedom, the general personality right of Art. 2 sect. 1 in combination with Art. 1 sect. 
1 GG serves as a protection (of human dignity) and can become relevant especially in the light of modern developments and new dangers for the human personality. (…) Stemming from the idea of self-determination, it (the general personality right) contains (…) the right of the individual to basically decide by him or herself when and to what extent personal facts about his or her live are revealed. (…) Individual self-determination requires (…) that the individual can freely decide on his or her actions, including the freedom to genuinely act corresponding to their decisions.”342 b) 341 See under point C. I. 3. c) cc) “Referring to substantial guarantees as method of interpreting fundamental rights in order to avoid a scope of protection that is too broad and/or too vague”. 342 See BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83 (Decision on Population Census), cip. 170 to 172: “Im Mittelpunkt der grundgesetzlichen Ordnung stehen Wert und Würde der Person, die in freier Selbstbestimmung als Glied einer freien Gesellschaft wirkt. Ihrem Schutz dient - neben speziellen Freiheitsverbürgungen - das in Art 2 Abs. 1 in Verbindung mit Art 1 Abs. 1 GG gewährleistete allgemeine Persönlichkeitsrecht, das gerade auch im Blick auf moderne Entwicklungen und die mit ihnen verbundenen neuen Gefährdungen der menschlichen Persönlichkeit Bedeutung gewinnen kann (...). (Die bisherigen Konkretisierungen durch die Rechtsprechung umschreiben den Inhalt des Persönlichkeitsrechts nicht abschließend.) Es umfaßt (...) auch die aus dem Gedanken der Selbstbestimmung folgende Befugnis des Einzelnen, grundsätzlich selbst zu entscheiden, wann und innerhalb welcher Grenzen persönliche Lebenssachverhalte offenbart werden (...). Diese Befugnis bedarf unter den heutigen und künftigen Bedingungen der automatischen Datenverarbeitung in besonderem Maße des Schutzes. Sie ist vor allem deshalb gefährdet, C. The function of the principle of purpose limitation in light of Article 8 ECFR 148 The phrase ‘that the individual can freely decide on his or her actions, including the freedom to genuinely act corresponding to their decisions’ appears to mean that the right to informational self-determination primarily serves to protect the individual’s freedom of action. In this sense, the specific rights of freedom could add to a differentiated scale that helps determine the extent of the right and, thus, the specification of the purpose as required for the data processing.343 In other words, the specific rights to freedom may define the informational norms governing a certain context. However, in the following decisions, the Court clarified that the extent of the right to informational self-determination does not depend on a specific risk for other basic rights. This becomes particularly apparent in the case of “License Plate Recognition” (Kennzeichenerfassung).344 In this case dated 11 March 2008, the constitutional action was brought against provisions of police law, which authorized the automated recognition of license plates of cars. Using this method, video cameras record the passing cars on the street. 
Certain software extracts the code with numbers and figures of the license plates and is then automatically checked against police investiweil (bei Entscheidungsprozessen nicht mehr wie früher auf manuell zusammengetragene Karteien und Akten zurückgegriffen werden muß, vielmehr) heute mit Hilfe der automatischen Datenverarbeitung Einzelangaben über persönliche oder sachliche Verhältnisse (einer bestimmten oder bestimmbaren Person (personenbezogene Daten (vgl. § 2 Abs. 1 BDSG)) technisch gesehen unbegrenzt speicherbar und jederzeit ohne Rücksicht auf Entfernungen in Sekundenschnelle abrufbar sind. Sie können darüber hinaus - vor allem beim Aufbau integrierter Informationssysteme - mit anderen Datensammlungen zu einem teilweise oder weitgehend vollständigen Persönlichkeitsbild zusammengefügt werden, ohne daß der Betroffene dessen Richtigkeit und Verwendung zureichend kontrollieren kann. Damit haben sich in einer bisher unbekannten Weise die Möglichkeiten einer Einsichtnahme und Einflußnahme erweitert, welche auf das Verhalten des Einzelnen schon durch den psychischen Druck öffentlicher Anteilnahme einzuwirken vermögen. Individuelle Selbstbestimmung setzt aber (- auch unter den Bedingungen moderner Informationsverarbeitungstechnologien -) voraus, daß dem Einzelnen Entscheidungsfreiheit über vorzunehmende oder zu unterlassende Handlungen einschließlich der Möglichkeit gegeben ist, sich auch entsprechend dieser Entscheidung tatsächlich zu verhalten. (...)” 343 Cf. Grimm, Data protection before its refinement, p. 585 and 586, who stresses, first, the delimited scope of protection in the light of the fact that all personal data are relevant and, second, considers the specific rights to freedom and possible legal links determining the scope of protection. 344 In this regard, it must be stressed that the German Constitutional Court does not differentiate, terminologically, between risks and dangers as elaborated on in the preceding chapter B. II. Data protection as a risk regulation. I. Constitutional framework 149 gation data. In the case of a match, the software delivers a report, stores the data together with further information such as the time and place of the car recorded and provides, in doing so, the basis for potentially follow up investigations. If there is no match, the records, as well as the code of the license plates, are immediately deleted. The wording of the provisions authorizing the automatic license plate recognition stated: “The police authorities are authorized to automatically collect on public streets and spaces data from license plates of cars for the purpose of checking the data against the data files for open investigations. Data that is not part of the data files for open investigations must immediately be deleted.”345 The German Constitutional Court affirmed that the legal provisions, which the claimant addressed in its constitutional claim, infringed the general personality right, more precisely, the right to informational self-determination. Pursuant to the Court’s decision, this right “meets the threats of dangers of infringements of the personality which for the individual results, especially under the conditions of modern data processing, from informational measures. This right supplements and broadens the protection of freedom of action and of being private; it (the protection) already begins as soon as there is danger to the personality. 
Such a danger may already exist before there is a specific threat for an object of protection.”346 Thus, the right to informational self-determination is conceptually independent from the other basic rights and only indirectly serves to protect the specific rights of freedom. Consequently, these further rights do not add, so far, to a differentiated scale in order to determine its scope, the purpose of the data processing or the context in which the processing occurs. However, it is clear that the object and concept of protection of the right to informational self-determination is very similar to the other rights to privacy. This 345 See BVerfG, 11th of March 2008, 1 BvR 2074/05 and 1 BvR 1254/07 (License Plate Recognition), cip. 1, 2 and 9: “Die Polizeibehörden können auf öffentlichen Straßen und Plätzen Daten von Kraftfahrzeugkennzeichen zum Zwecke des Abgleichs mit dem fahndungsbestand automatisiert erheben. Daten, die im Fahndungsbestand nicht enthalten sind, sind unverzüglich zu löschen.” 346 See BVerfG, ibid., cip. 63: “Das Recht auf informationelle Selbstbestimmung trägt Gefährdungen und Verletzungen der Persönlichkeit Rechnung, die sich für den Einzelnen, insbesondere unter den Bedingungen moderner Datenverarbeitung, aus informationsbezogenen Maßnahmen ergeben (...). Dieses Recht flankiert und erweitert den grundrechtlichen Schutz von Verhaltensfreiheit und Privatheit; es lässt ihn schon auf der Stufe der Persönlichkeitsgefährdung beginnen. Eine derartige Gefährdungslage kann bereits im Vorfeld konkreter Bedrohungen von Rechtsgütern entstehen.” C. The function of the principle of purpose limitation in light of Article 8 ECFR 150 becomes particularly apparent in the decision of “Retrieval of Bank Account Master Data” (Kontostammdatenabfrage) from 2007. In this case, a German financial institution and two individuals who received social security benefits filed a constitutional complaint against the “law for the advancement of the financial market” and the law “for the encouragement of tax compliance”. The law for the advancement of the financial market obliged each financial institution to store certain master data relating to its bank accounts. The Federal Financial Supervisory Authority (BaFin) was authorized to automatically retrieve these data as long as it was necessary for purposes of its supervision. The data only referred to the existence of the bank account and the person(s) who was authorized to view it. The law did not authorize the use of further information such as account activities. The use of information by BaFin occurred without notifying the financial institution that stored the data, because they did not want to alert the financial institutions unnecessarily. BaFin was allowed to transfer the data to public state agencies, such as competent courts for international legal assistance in criminal matters. The law for the encouragement of tax compliance then broadened the circuit to which the data could be transmitted, such as to tax or social security authorities. In order to authorize the transfer of data all that was required was that authorization had to refer to a notion or term contained in the Income Tax Act.347 In this case, the Constitutional Court clarified the differences, or better, interplay between the various basic rights as: “The general personality right guarantees elements of the personality which are not protected by special guarantees of freedom but are, nevertheless, not less constitutive for the personality. (...) 
The acknowledgement of a concrete claim by the claimant in relation to the different aspects of the personality right hence depends on the different threats for the personality that result from the circumstances of the individual case. (…) The right to informational self-determination complements prevailing special guarantees of being private such as the right to privacy of correspondences, posts and telecommunications of Art. 10 GG and the right to spatial privacy guaranteed by Art. 13 GG. It exists beside other basic rights typifying the general personality right which can also guarantee constitutional protection of being private against revelation and usage of information, such as the protection of the private sphere or the right to the spoken word.”348 347 See BVerfG, 13th June 2007, 1 BvR 1550/03 (Retrieval of Bank Account Master Data), cip. 10 to 29. 348 See BVerfG, ibid., cip. 62 and 63: “Das allgemeine Persönlichkeitsrecht gewährleistet Elemente der Persönlichkeit, die nicht Gegenstand der besonderen I. Constitutional framework 151 Right to control disclosure and usage of personal data as protection instrument? Similarly to other rights to privacy enabling the individual to decide on whether or not someone else intrudes into his or her private sphere, the right to informational self-determination provides an individual’s ‘right to basically determine by him or herself the disclosure and the usage of his or her personal data’.349 The German Constitutional Court justifies this right of control, particularly, with the ‘increased danger which is based on the technical possibilities under modern conditions of data processing’ resulting in the situation that the ‘data are not only, on a second-by-second basis, retrievable at any time and place but can also be, especially in the case c) Freiheitsgarantien des Grundgesetzes sind, diesen aber in ihrer konstituierenden Bedeutung für die Persönlichkeit nicht nachstehen (...). (Einer solchen lückenschließenden Gewährleistung bedarf es insbesondere, um neuartigen Gefährdungen zu begegnen, zu denen es im Zuge des wissenschaftlich-technischen Fortschritts und gewandelter Lebensverhältnisse kommen kann (...).) Die Zuordnung eines konkreten Rechtsschutzbegehrens zu den verschiedenen Aspekten des Persönlichkeitsrechts richtet sich daher vor allem nach der Art der Persönlichkeitsgefährdung, die den konkreten Umständen des Anlassfalls zu entnehmen ist (...). (Das allgemeine Persönlichkeitsrecht trägt in seiner Ausprägung als Recht auf informationelle Selbstbestimmung Gefährdungen und Verletzungen der Persönlichkeit Rechnung, die sich für den Einzelnen aus informationsbezogenen Maßnahmen, insbesondere unter den Bedingungen moderner Datenverarbeitung, ergeben (...). Es gibt dem Einzelnen die Befugnis, grundsätzlich selbst über die Preisgabe und Verwendung seiner personenbezogenen Daten zu bestimmen (...).) Das Recht auf informationelle Selbstbestimmung ergänzt besonders geregelte Garantien der Privatheit, die ihm vorgehen, insbesondere das Post- und Fernmeldegeheimnis nach Art. 10 GG (...) und den durch Art. 13 GG gewährleisteten Schutz der räumlichen Privatsphäre des Wohnungsinhabers (...). Es steht neben anderen Ausprägungen des allgemeinen Persönlichkeitsrechts, die als Gewährleistungen von Privatheit gleichfalls grundrechtlichen Schutz gegenüber Kenntnisnahme und Verarbeitung von Informationen vermitteln können, wie dem Schutz der Privatsphäre (...) 
oder dem Recht am gesprochenen Wort (...).” 349 See BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83 (Decision on Population Census), cip. 173; cf. equally BVerfG, 14th of July 1999, 1 BvR 2226/94 (Surveillance of Telecommunications), cip. 136 and BVerfG, 3rd of March 2004, 1 BvR 2378/98 (Big Eavesdropping Operation), cip. 132 and BVerfG, 4th of April 2006, 1 BvR 518/02 (Dragnet Investigation), cip. 64 and BVerfG, 13th June 2007, 1 BvR 1550/03 (Retrieval of Bank Account Master Data), cip. 63; BVerfG, 1 BvR 2027/02 (Release of Confidentiality), cip. 31. of integrated information systems, combined with other data collections leading to multiple possibilities of usage and linking’.350 Some legal scholars praise, even at an international level, this object and concept of protection (which Westin had already advocated in 1967)351 in light of its “intermediate value” serving the final values of “dignity”, “autonomy” and, therefore, the “free and democratic society” as a whole.352 And indeed, the construction of this right and the considerations behind it appear to be very similar to some of the conceptual thoughts surrounding the value of privacy as summarized by Nissenbaum and illustrated previously in the chapter “Theories about the value of privacy and data protection”.353 However, the German Court seems to have foreseen that such a concept might lead to far-reaching effects in social interactions. Already in its first “Decision on Population Census”, it stressed that the right does not guarantee the individual absolute control over his or her social representation (i.e. how he or she is perceived by others) based on data related to him or her. Rather, the concept only guarantees certain ‘chances of individual freedom of development’.354 It explicitly stated that “the individual does not have a right in the meaning of an absolute and boundless control about ‘his or her’ data; (conceptually), he or she rather has to be considered as a personality developing within the social community who depends on communication. Information constitutes, even if it is related to a person, a picture of social reality that cannot be exclusively attributed to the person concerned.”355 In the decision of “Release of Confidentiality” (Schweigepflichtentbindung), the Constitutional Court stressed this thought with particular respect to data processing by private parties. 350 See only BVerfG, 4th of April 2006, 1 BvR 518/02 (Dragnet Investigation), cip. 65. 351 See Westin, Privacy and Freedom, p. 7: “Privacy is the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.” 352 See Rouvroy and Poullet, The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy, p. 57 and 58. 353 See above under point B. III. 1 The individual’s autonomy and the private/public dichotomy. 354 See BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83 (Decision on Population Census), cip. 174. 355 See BVerfG, ibid., cip. 174. The Court stated that “especially in the private sector, the general personality right does not constitute an absolute control about certain information.
The individual has to be rather considered as a personality that develops within the social community and depends on communication (…).”356 Despite these statements about the individual’s dependency on communication in the social community, the scope of application of the right to informational self-determination remains rather broad. As noted above, the specific rights of freedom do not delimit that scope. Indeed, the scope is even wider than that of certain prevailing rights to privacy. In the case of “Big Eavesdropping Operation” (Großer Lauschangriff) in 2004, the Court decided that an eavesdropping operation conducted from outside the protected rooms infringes the right to privacy of the home under Article 13 GG only if the communication could not have been heard from outside without acoustic aids. In this case, the objects of the constitutional complaint were several provisions of the German Code of Criminal Procedure. The complaint focused on the central provision of § 100 c sect. 1 nr. 3 StPO, which authorized the State to record non-public communications of a suspected person in his or her home if certain facts justified the suspicion that the person had committed a crime listed by the law with respect to organized crime. The State measure referred only to the suspected person. Nevertheless, the law also authorized the observation of homes of third parties if the suspected person was staying in the third party’s home. The observation was used exclusively for state investigative purposes. The data could only be transferred, in principle, for criminal proceedings. In addition, the law restricted the duty to notify the person under surveillance: with a special authorization by the competent court, the State could defer notification for a period of six months or more after the end of the observation.357 356 See BVerfG, 1 BvR 2027/02 (Release of Confidentiality), cip. 32: “Gerade im Verkehr zwischen Privaten lässt sich dem allgemeinen Persönlichkeitsrecht allerdings kein dingliches Herrschaftsrecht über bestimmte Informationen entnehmen. Der Einzelne ist vielmehr eine sich innerhalb der sozialen Gemeinschaft entfaltende, auf Kommunikation angewiesene Persönlichkeit (...).” 357 See BVerfG, 3rd of March 2004, 1 BvR 2378/98 (Big Eavesdropping Operation), cip. 14, 20 and 21. The German Constitutional Court clarified in this decision that “even the perception of such a communication that can be heard from outside without acoustic means can infringe the guarantee of being private. However, such communication is not protected by Article 13 GG if the person concerned makes the perception of the communication from outside by him or
It equally protects personal data that is publicly available: “(…) even if the individual takes him or herself to the public, the right to informational self-determination protects his or her interest that the related personal information is not automatically collected for the purpose of storage enabling to further use.”359 In the case of “Video Surveillance” (Videoüberwachung), the Court finally clarified that the right to informational self-determination protects an individual against being recorded in public even if the person concerned knows that he or she will be recorded the moment he or she enters a monitored space.360 In this case, a city installed an artwork at one of its main squares. It was a relief on the soil mirroring the rest of the medieval synagogue hidden under the ground. The artwork should serve as a meeting place for the public. After several incidences, the city decided to implement video cameras in order to police the place. A citizen filed a complaint against the video surveillance before the administrative court.361 When the case finally came to the Constitutional Court, the Constitutional Court affirmed that the right to informational self-determination also protects against such a collection of personal data in the public.362 The Court clarified in this case also the question of whether the individuals recorded by the video camera gave their consent to the recording because they knew that they were being filmed. From the Court’s point of 358 See BVerfG, ibid., cip. 138: “Zwar kann auch die Wahrnehmung der aus der Wohnung nach außen dringenden und ohne technische Hilfsmittel hörbaren Kommunikation deren Privatheit beeinträchtigen. Solche Lebensäußerungen nehmen aber nicht am grundrechtlichen Schutz des Art. 13 GG teil, weil der Betroffene die räumliche Privatsphäre nicht zu seinem Schutz nutzt, wenn er die Wahrnehmbarkeit der Kommunikation von außen selbst ermöglicht.” 359 See BVerfG, ibid., cip. 67: “Auch wenn der Einzelne sich in die Öffentlichkeit begibt, schützt das Recht der informationellen Selbstbestimmung dessen Interesse, dass die damit verbundenen personenbezogenen Informationen nicht im Zuge automatisierter Informationserhebung zur Speicherung mit der Möglichkeit der Weiterverwertung erfasst werden (...).” 360 See BVerfG, 23rd of February 2007, 1 BvR 2368/06 (Video Surveillance), cip. 39 and 40. 361 See BVerfG, ibid., cip. 2 to 14. 362 See BVerfG, ibid., cip. 39 and 40. I. Constitutional framework 155 view, a person who does not explicitly disagree with the recording, does not automatically consent to it.363 Thus, even if the individual has a choice of not entering the monitored space and voluntarily enters that space, the right to informational self-determination still protects him or her. So far, the Court’s statement that the individual has no “right in the meaning of an absolute and boundless control about ‘his or her’ data”364 has little effect on the scope of protection. Comparably, the Court’s statements that the right to informational selfdetermi-nation seeks to guarantee “that the individual can freely decide on his or her actions, including the freedom to genuinely act corresponding to their decisions”365 and, therefore, “supplements and broadens the protection of freedom of action and of being private”366 does not determine the scope of application. 
363 See BVerfG, ibid., cip. 39 and 40. 364 See again BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83 (Decision on Population Census), cip. 174: “(…) Der Einzelne hat nicht ein Recht im Sinne einer absoluten, uneinschränkbaren Herrschaft über "seine" Daten (…).” 365 See BVerfG, ibid., cip. 172: “(…) daß dem Einzelnen Entscheidungsfreiheit über vorzunehmende oder zu unterlassende Handlungen einschließlich der Möglichkeit gegeben ist, sich auch entsprechend dieser Entscheidung tatsächlich zu verhalten.” 366 See BVerfG, 11th of March 2008, 1 BvR 2074/05 and 1 BvR 1254/07 (License Plate Recognition), cip. 63: “Das Recht auf informationelle Selbstbestimmung trägt Gefährdungen und Verletzungen der Persönlichkeit Rechnung, die sich für den Einzelnen, insbesondere unter den Bedingungen moderner Datenverarbeitung, aus informationsbezogenen Maßnahmen ergeben (...). Dieses Recht flankiert und erweitert den grundrechtlichen Schutz von Verhaltensfreiheit und Privatheit; es lässt ihn schon auf der Stufe der Persönlichkeitsgefährdung beginnen. Eine derartige Gefährdungslage kann bereits im Vorfeld konkreter Bedrohungen von Rechtsgütern entstehen.” Conversely, in “Big Eavesdropping Operation” as well as in the case of “Surveillance of Telecommunications” (Telekommunikationsüberwachung I), the Court actually applied the reverse methodology: in these cases, it is not the right to informational self-determination that supplements the rights to freedom but, vice versa, the rights to freedom that supplement the right to informational self-determination. In the second-mentioned case of “Surveillance of Telecommunications”, the Constitutional Court decided on the relationship between the right to informational self-determination and the right to privacy of correspondence, posts and telecommunications of Art. 10 GG. The claimants lodged a constitutional complaint against the surveillance, collection and processing of telecommunications data by the German Federal Bureau of Investigation. The so-called law for the suppression of crime expanded, amongst other issues, the legal possibility to collect and process personal data communicated by means of telecommunications. On the one hand, this law added several purposes for the collection and processing of data, such as the prevention, intelligence-gathering and criminal prosecution of international terrorist attacks, the international distribution of weapons of war, the import of drugs into the Federal Republic, and the counterfeiting of currency committed abroad. On the other hand, this law applied only to non-cable based telecommunications and, amongst other conditions, only where concrete facts indicated the planning or commission of one of the crimes mentioned. The law did not authorize the observation of individual telecommunication connections, but it enabled a selection by means of certain key words in order to fulfill the purposes described. Nevertheless, the observation of individual telecommunication connections of foreigners abroad was possible.
Finally, the observation did not have to be communicated to the person concerned if the data was deleted within three months.367 Several of the claimants, who were journalists living in Germany and abroad, who carried out research and published news articles in the field of international terrorism, argued that their conversations with contacts in Germany and abroad could potentially contain key words which fit those key words provided by the German Federal Bureau of Investigation. They argued that the general collection, the selection corresponding to the key words and acts following those collections would consequently infringe their right to privacy of correspondence, posts and communications in Art. 10 GG.368 In this case, the German Constitutional Court explicitly stressed the significance of other fundamental rights, such as the freedom of the press stating that “the protection of Art. 10 GG can be supplemented by further fundamental guarantees which depends on the specific content and context of the communication or on the negative effects resulting from the usage of the information which is used in new contexts.”369 And in the case of “Big Eavesdropping Operation”, the Court provided the example of a conversation between a married couple at home which could not only fall, from its point of view, under the right to privacy of the home pursuant to Article 13 section 1 GG but also under Article 6 section 1 GG which provides for special protection of a marriage. Comparably, the protection of conversations with people who have to respect professional secrets can equally be supplemented by further basic rights such as, for example, clerical people, 367 See BVerfG, 14th of July 1999, 1 BvR 2226/94 (Surveillance of Telecommunications), cip. 11 to 14, 16 to 18. 368 See BVerfG, ibid., cip. 28, 28, 49 to 51. 369 See BVerfG, ibid., cip. 154. I. Constitutional framework 157 by the freedom of faith and conscience under Article 4 GG. The Court also set down certain criteria in order to determine when the general personality right is supplemented by further special guarantees, which is “the special necessity for protection of the communicating people”.370 Indeed, both decisions referred to the prevailing rights to privacy of Article 10 and 13 GG. Since the principles of these two basic rights and of the right to informational self-determination (of Article 2 section 1 in combination with Article 1 section 1 GG) can be transposed between each other,371 it is very likely that the specific rights of freedom also supplement the right to informational self-determination. Thus, in light of these considerations, not the right to informational self-determination supplements the rights to freedom but, in the opposite, the rights to freedom supplement the right to informational self-determination. Infringement by ‘insight into personality’ and ‘particularity of state interest’ In summary, the broad scope of the right to informational self-determination protects against all threats against the individual’s personality by automated data processing, irrespective of whether or not there is a specific risk in relation to specific rights of freedom or privacy. Consequently, the German Constitutional Court principally considers each act of collection and processing – such as the storage, filtering, and transferal – of personal data as an infringement of its scope. 
In the case of “Surveillance of Telecommunications”, the Court clarified that the collection of personal data can also infringe Article 10 GG, if it cannot immediately be related to d) 370 See BVerfG, 3rd of March 2004, 1 BvR 2378/98 (Big Eavesdropping Operation), cip. 135: “Auch in Bezug auf die Kommunikation mit Berufsgeheimnisträgern können neben dem grundrechtlichen Schutz der räumlichen Privatsphäre Grundrechte in Betracht kommen, die - wie etwa Art. 4 GG im Hinblick auf das Gespräch mit einem Geistlichen - der besonderen Schutzbedürftigkeit der Kommunizierenden Rechnung tragen.” 371 See BVerfG, 14th of July 1999, 1 BvR 2226/94 (Surveillance of Telecommunications), cip. 137, BVerfG, 3rd of March 2004, 1 BvR 2378/98 (Big Eavesdropping Operation), cip. 169, and BVerfG, 4th of April 2006, 1 BvR 518/02 (Dragnet Investigation), cip. 90. C. The function of the principle of purpose limitation in light of Article 8 ECFR 158 a person at that time but easily at a later stage. 372 However, the Court also acknowledged that certain acts of data treatment do not infringe the scope of protection. With respect to telecommunication data, it decided that “the collection does not infringe Art. 10 GG, so long as the telecommunication between German connection points is only unintentionally collected because of technical reasons and is, directly after the conditioning of the signal, technically eliminated without a trace.”373 This exception was particular to the case at hand. The question therefore is whether, and if so, there exists a more general principle in order to answer the question whether an act of data treatment infringes the scope of protection of the right informational self-determination. With respect to a similar situation, the Court argued, slightly different, in the case of “License Plate Recognition”, that the collection and processing of personal data does not infringe the right to informational self-determination “if checking against key investigation words immediately occurs after the collection, that leads to a negative result (…) and if it is legally and technically safeguarded that the data remain anonymous and is immediately deleted without leaving the possibility to relate it to a person. In contrast, the storage of the license plate that was recorded, which provides the basis for potentially further measures, infringes the basic right.”374 The Court justified this differentiation stating that “this is the intended goal of the measure if the license plate matches the key words (…). From this point in time, the license plate recorded is available for the processing by state agencies and the specific danger for the freedom of action and of being private occurs, 372 See BVerfG, 14th of July 1999, 1 BvR 2226/94 (Surveillance of Telecommunications), cip. 160. 373 See BVerfG, ibid., cip. 160. 374 See BVerfG, 11th of March 2008, 1 BVR 2047/05 and 1 BvR 1254/07 (License Plate Recognition), cip. 69: “(Zu einem Eingriff in den Schutzbereich des Rechts auf informationelle Selbstbestimmung kommt es daher in den Fällen der elektronischen Kennzeichenerfassung dann nicht,) wenn der Abgleich mit dem Fahndungsbestand unverzüglich vorgenommen wird und negative ausfällt (sogenannter Nichttrefferfall) sowie zusätzlich rechtlich und technisch gesichert ist, dass die Daten anonym bleiben und sofort spurenlos und ohne die Möglichkeit, einen Personenbezug herzustellen, gelöscht werden. 
Demgegenüber kommt es zu einem Eingriff in das Grundrecht, wenn ein erfasstes Kennzeichen im Speicher festgehalten wird und gegebenenfalls Grundlage weiterer Maßnahmen werden kann.” which justifies the protection of the basic right to informational self-determination.”375 375 See BVerfG, ibid, cip. 69: “Darauf vor allem ist die Maßnahme gerichtet, wenn das Kraftfahrzeugkennzeichen im Fahndungsbestand aufgefunden wird (sogenannter Trefferfall). Ab diesem Zeitpunkt steht das erfasste Kennzeichen zur Auswertung durch Staatliche Stellen zur Verfügung und es beginnt die spezifische Persönlichkeitsgefährdung für Verhaltensfreiheit und Privatheit, die den Schutz des Grundrechts auf informationelle Selbstbestimmung auslöst.”
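The dividing line that the Court draws here between a ‘non-hit’ (Nichttrefferfall) and a ‘hit’ (Trefferfall) can be pictured, in purely illustrative terms, as a small piece of control flow. The following Python sketch is merely an illustration; the names check_plate and search_inventory are invented for this purpose and do not stem from the judgment or from any real recognition system.

def check_plate(captured_plate, search_inventory):
    # Non-hit case: the comparison against the search inventory fails; the
    # capture is discarded at once, anonymously and without a trace, so that
    # no reference to a person can be established. On the Court's reasoning,
    # the scope of the right to informational self-determination is not even
    # infringed in this constellation.
    if captured_plate not in search_inventory:
        return None
    # Hit case: the plate is retained and becomes available as the basis for
    # potential further measures; from this moment the infringement begins.
    return {"plate": captured_plate, "stored": True}

The sketch is not meant to suggest that the constitutional assessment could be automated; it merely locates the point in the processing chain at which, according to the Court, the specific danger for freedom of action and privacy arises.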
In other cases, such as the “Dragnet Investigation” (Rasterfahndung), the Court, in order to determine an infringement, also referred to the state’s intended purpose and to the fact that the data treatment would provide a basis for further measures. In this case from 2006, the claimant contested judicial decisions in relation to police orders of the so-called “Rasterfahndung” (dragnet investigation). The dragnet investigation is a special tracing method based on data processing, whereby the data of a large number of people are checked against existing data in a database in order to find wanted persons. There are two types of laws that permit the use of this tracing method in Germany. Firstly, § 98a StPO permits the dragnet investigation for the purposes of criminal proceedings. Secondly, police law permits it in order to prevent the commission of crimes. Originally, most of these provisions required a present danger to the security of the State or to the life, health or freedom of a natural person and referred to certain types of data that could be collected and processed. Most of the German States (Länder) changed these requirements, abandoning the criterion of a ‘present’ danger or even the requirement of a ‘danger’ entirely. After the terrorist attacks of 11th of September 2001, the German States, together with the Federal Bureau of Investigation (Bundeskriminalamt), organized a Germany-wide dragnet investigation. The working group “Internal Security” defined nation-wide criteria in order to discover potential Islamist terrorists. The State demanded from universities, residents’ registration offices and the central register of foreigners data relating to the following criteria: whether the person was male and aged between 18 and 40, whether he was a student or former student, whether he was of Islamic religion, and whether his country of birth or nationality was a state with a mainly Islamic population. These data were collected at State level and were then transferred to the Federal Bureau of Investigation, where they were stored in a network file named “Schläfer” (sleeper).376 The State of Nordrhein-Westfalen authorized, via its own law, the collection and processing not only of certain types of data but also of ‘other data which are necessary for the concrete case’. It collected approximately 5.2 million data sets fitting several preliminary criteria defined by its public agencies. These data were then automatically checked against the criteria defined by the working group “Internal Security”, with the result of around 11,000 data sets (the persons concerned were afterwards informed about the collection and treatment of their data); the rest was deleted. More than 1,000 of the data sets transferred to the Federal Bureau of Investigation did not fit the requirements of the judicial order, because the persons concerned were either female or Christian. Consequently, these data sets were deleted, and the rest were transferred to the competent police stations, which manually checked the personal identity of the individuals concerned. The remaining 816 cases were sent back to the Federal Bureau of Investigation, which started further investigations in 72 cases.377 In total, data of 200,000 to 300,000 people across Germany were temporarily stored. None of the further investigations revealed “sleepers” or led to prosecutions of any individuals.378 The claimant in the particular case fit several of the criteria defined by the working group “Internal Security”, as he was born in 1978 and was of Moroccan nationality and Islamic faith. When the judicial orders that he contested with his constitutional complaint came into force, he was studying at the University of Duisburg in Germany.379 In this case, the German Constitutional Court first considered whether “the information about each of the single data (concerned) provides, in combination with other data, a separate insight into the personality” and then held it as essential “to determine whether the state interest, with respect to the overarching context and with respect to the purpose of surveillance and usage, for the data concerned is so particular that it qualitatively affects a person’s fundamental right.”380 The Court came to the conclusion that “the combination of the data in question – name, address, day and place of birth – combined with other data such as (…) nationality, religion or field of studies can and shall provide information about personal conduct and, by these means, suspicious facts and especially – as it is stated within (…/the law contested by the claimant) – about ‘danger-increasing characteristics of this person’.”381 376 See BVerfG, 4th of April 2006, 1 BvR 518/02 (Dragnet Investigation), cip. 7 to 12. 377 See BVerfG, ibid., cip. 22 to 27. 378 See BVerfG, ibid., cip. 12 and 13. 379 See BVerfG, ibid., cip. 29. 380 See BVerfG, ibid., cip. 67 and 69: “Maßgeblich ist, ob sich bei einer Gesamtbetrachtung mit Blick auf den durch den Überwachungs- und Verwendungszweck bestimmten Zusammenhang das behördliche Interesse an den betroffenen Daten bereits derart verdichtet, dass ein Betroffensein in einer einen Grundrechtseingriff auslösenden Qualität zu bejahen ist.” 381 See BVerfG, ibid., cip. 67: “Die Kombination der (ausdrücklich in § 31 Abs. 2 PolG NW 1990) benannten Daten - Name, Anschrift, Tag und Ort der Geburt mit anderen, etwa, (wie im vorliegenden Fall,) der Staatsangehörigkeit, der Religionszugehörigkeit oder der Studienfachrichtung, kann und soll Aufschluss über Verhaltensweisen und damit Verdachtsmomente und insbesondere (- wie es in § 31 Abs. 1 PolG NRW 2003 nunmehr ausdrücklich heißt -) über "gefahrenverstärkende Eigenschaften dieser Personen" ermöglichen.”
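The ‘insight into the personality’ that the Court describes arises from the combination of attributes which, taken individually, appear unremarkable. Purely as an illustration of that mechanism, with criteria paraphrased from the facts of the case and with invented records and names, such a criteria-based check could be sketched in Python as follows:

criteria = {"sex": "male", "age_range": (18, 40), "student": True, "religion": "Islamic"}

def matches(record):
    low, high = criteria["age_range"]
    return (record["sex"] == criteria["sex"]
            and low <= record["age"] <= high
            and record["student"] == criteria["student"]
            and record["religion"] == criteria["religion"])

# Attributes merged from several sources (universities, residents' registration
# offices, central register of foreigners): none of them is decisive on its own,
# but their combination produces the kind of combined profile that the Court
# treats as an insight into the personality.
records = [
    {"name": "A", "sex": "male", "age": 24, "student": True, "religion": "Islamic"},
    {"name": "B", "sex": "female", "age": 31, "student": False, "religion": "Christian"},
]
hits = [record for record in records if matches(record)]  # only record "A" remains

The point of the sketch is not the filtering itself but what it yields: a combined record whose informational value exceeds that of each single attribute, which is precisely the threshold at which the Court locates the qualitative effect on the fundamental right.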
Similarly, in the previously mentioned case of “Retrieval of Bank Account Master Data”, the Court examined whether the collection and processing of the claimant’s personal data provided an insight into his personality and why the state’s interest in it became so specific that it ‘qualitatively affected his fundamental right’: “Corresponding to the current customs, most of the payments (…) are processed via banking accounts. If information about the content of the accounts of one person is deliberately compiled, this provides an insight into the economic situation and the social contacts of the person concerned, insofar as these (…/social contacts) have a financial dimension. Some of the account data could also allow for further conclusions about the conduct of the person concerned. The state investigations (…) based on the contested provisions can prepare measures which can essentially concern the individual’s interests and which would not have been possible without the knowledge retrieved.”382 382 See BVerfG, 13th June 2007, 1 BvR 1550/03 (Retrieval of Bank Account Master Data), cip. 68 and 69: “Nach den gegenwärtigen Gepflogenheiten werden die meisten Zahlungsvorgänge, die über Bargeschäfte des täglichen Lebens hinausgehen, über Konten abgewickelt. Werden Informationen über die Inhalte der Konten einer bestimmten Person gezielt zusammengetragen, ermöglicht dies einen Einblick in die Vermögensverhältnisse und die sozialen Kontakte des Betroffenen, soweit diese - etwa durch Mitgliedsbeiträge oder Unterhaltsleistungen eine finanzielle Dimension aufweisen. Manche Konteninhaltsdaten, etwa die Höhe von Zahlungen im Rahmen verbrauchsabhängiger Dauerschuldverhältnisse, können auch weitere Rückschlüsse auf das Verhalten des Betroffenen ermöglichen. Die auf der Grundlage der hier angegriffenen Normen erfolgenden behördlichen Ermittlungen über Kontostammdaten können anschließende Maßnahmen vorbereiten, die ohne die erlangten Kenntnisse nicht möglich wären und die die Belange der Betroffenen erheblich berühren können.” The considerations described make it apparent that the criteria developed by the German Court in order to determine whether a state act of data treatment infringes the individual’s right to informational self-determination are not clear-cut. At least four requirements can be discerned: First, the data treatment must provide an insight into the personality of the individual concerned. This is the case if the data reveal, for example, the person’s conduct, economic situation or social contacts. Second, the Court considers not only the enforced revelation of data by the State, but also the factual treatment of data, such as by secret or public observations.383 Third, the Court requires that there must be an intention or a purpose behind the collection of data, as is apparent when it refers to the ‘intended goal’ or to the ‘state interest, with respect to the overarching context and with respect to the purpose’. The merely coincidental collection of data, without any further interest in its use, does not infringe the right to informational self-determination.384 Finally, the Court does not consider each act of data treatment intended by the state as an infringement. An infringement occurs only if the treatment either constitutes a ‘specific danger for the freedom of action and of being private’, or if it ‘qualitatively affects a person’s fundamental right’, or if it can ‘essentially concern the individual’s interests’. Indeed, it remains unclear in what way these last criteria relate to each other and what they actually mean.
For example, does the term ‘specific danger for the freedom of action and of being private’ only require the data to be stored for the purpose of providing the basis for potential further measures, or must these measures be specific? Does the term ‘particularity of the state interest qualitatively affecting a fundamental right’ mean that there must be a specific threat for another fundamental right, be it a specific right to privacy, freedom or equality or is any type of unspecific threat sufficient? Finally, does the term ‘individual’s interests’ cover more aspects than a fundamental right? One thought seems at least to be clear. The Court considers the accumulation of data related to the same person, as well as the retrieval of information through combining data, as different types of one infringement. In contrast, the Court considers subsequent measures, which are based on an infringement as previously described, as a separate infringement. For example, if license plates recorded are combined with further data, such as the type of car etc., this means that there has been an extension of the infringement of the right to informational self-determination. If these different types of data are combined and processed retrieving further information regarding, for instance, the driver, the court considers this a deepening of the infringement. In contrast, if this gathered information leads to the result that the police stops the car in order to, for example, check the driver’s license, this is seen as a separate infringement.385 383 See Bechler, Informational Harm by Intransparent Treatment of Personal Data, pp. 58 f. 384 Cf. Bechler, ibid., pp. 60 ff. 385 See BVerfG, 11th of March 2008, 1 BVR 2047/05 and 1 BvR 1254/07 (License Plate Recognition), cip. 74. I. Constitutional framework 163 Purpose specification as the essential link for legal evaluation Last but not least, these considerations lead to another important aspect of the object and concept of protection of the right to informational self-determination: The relevant moment for the legal evaluation, in particular, of whether the principle of purpose limitation is met or not. In the public sector: Interplay between the three principles clarity of law, proportionality, and purpose limitation The relevant moment regarding the legal evaluation becomes particularly apparent with respect to infringements by the State. The German Constitutional Court combines the principle of clarity of law, the principle of proportionality and the principle of purpose limitation essentially resulting in the requirement that all future acts of usage of personal data must be predetermined when it is collected. Principles of clarity of law and purpose limitation referring to the moment when data is collected This requirement already becomes apparent in the Court’s first “Decision on Population Census”. With respect to individualized data, i.e. data which is not anonymized, the Court stated: “An obligation for the provision of personal data requires that the legislator precisely and specifically determines in certain areas the purpose of usage and should ensure that the information is suitable and necessary for achieving this purpose. The collection ahead of non-anonymized data for an undetermined or not yet determinable purpose is disproportionate with this (requirement). All (public) agencies collecting personal data in order to perform their tasks are restrained to the minimum which is necessary for achieving their given goals. 
The usage of the data is restricted to the purpose determined by the statute. In the light of the dangers of automated data processing, it is necessary to establish protection, by means of transfer and usage bans, against the misuse of data for purposes other than those originally determined. Obligations to clarify and to inform about the data processing as well as to delete the data are essential measures of procedural protection.”386 Indeed, the Court does not forbid the State from collecting data in advance for purposes not yet determined if the State only processes anonymized data for statistical purposes. However, the Court limits this broader range of action through other procedural restrictions and specifies the general aim of these requirements as follows: “Clearly defined requirements for the processing of data are necessary in order to guarantee that the individual does not become, under the conditions of automated collection and processing of his or her personal data, a mere object of information.”387 Consistent with these requirements, the Court handed down its reasoning in the case of “License Plate Recognition”. In this case, the Court stressed again that the moment at which personal, non-anonymized data is collected is the cardinal point for the question of whether later acts of data processing are constitutionally legitimate: “The concrete requirements for the precision and clarity of the authorizing provision depend on the type and severity of the infringement. Hence, the authorizing provision must especially make apparent whether it allows serious infringements. If it does not exclude such (serious) infringements in a sufficiently clear manner, the provision has to meet the particular requirements of precision which apply to these (serious) infringements.”388 386 See BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83 (Decision on Population Census), cip. 179 and 180: “Ein Zwang zur Angabe personenbezogener Daten setzt voraus, daß der Gesetzgeber den Verwendungszweck bereichsspezifisch und präzise bestimmt und daß die Angaben für diesen Zweck geeignet und erforderlich sind. Damit wäre die Sammlung nicht anonymisierter Daten auf Vorrat zu unbestimmten oder noch nicht bestimmbaren Zwecken nicht zu vereinbaren. Auch werden sich alle Stellen, die zur Erfüllung ihrer Aufgaben personenbezogene Daten sammeln, auf das zum Erreichen des angegebenen Zieles erforderliche Minimum beschränken müssen. Die Verwendung der Daten ist auf den gesetzlich bestimmten Zweck begrenzt. Schon angesichts der Gefahren der automatischen Datenverarbeitung ist ein - amtshilfefester - Schutz gegen Zweckentfremdung durch Weitergabeverbote und Verwertungsverbote erforderlich. Als weitere verfahrensrechtliche Schutzvorkehrungen sind Aufklärungspflichten, Auskunftspflichten und Löschungspflichten wesentlich.” 387 See BVerfG, ibid., cip. 167: “Es müssen klar definierte Verarbeitungsvoraussetzungen geschaffen werden, die sicherstellen, daß der Einzelne unter den Bedingungen einer automatischen Erhebung und Verarbeitung der seine Person betreffenden Angaben nicht zum bloßen Informationsobjekt wird.”
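The requirement described above, namely that the purposes of any later use must be determined with sufficient precision at the moment the data is collected, can be modelled in purely illustrative terms as a record whose permissible purposes are fixed once at collection and checked on every subsequent use. The following Python sketch rests on invented names: the class CollectedRecord and the example purposes do not correspond to any statutory wording.

class CollectedRecord:
    def __init__(self, payload, purposes):
        self.payload = payload
        # the permissible purposes are fixed once, at the moment of collection
        self.purposes = frozenset(purposes)

    def use_for(self, purpose):
        # corresponds, very loosely, to the transfer and usage bans that the
        # Court demands against misuse for purposes not originally determined
        if purpose not in self.purposes:
            raise PermissionError("use for '" + purpose + "' was not predetermined")
        return self.payload

record = CollectedRecord("entry in the census file", purposes={"statistical evaluation"})
record.use_for("statistical evaluation")   # permitted
# record.use_for("tax enforcement")        # would raise PermissionError

Such a sketch obviously cannot capture the balancing exercise that the Court undertakes; it only visualizes why, on this conception, every future act of usage must already be foreseeable at the point of collection.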
In the case of “Data Retention”, the Court provided its reasoning on the function of this requirement. In this case, the German Court had to decide on the validity of the German provisions transposing the European Data Retention Directive into German law – before the High Court of Ireland referred the homonymous case to the European Court of Justice.389 According to Article 1 of that directive, Member States were required to oblige network and service providers to retain data for “the purpose of the investigation, detection, and prosecution of serious crime, as defined by each Member State in its national law.” The directive applied to traffic and location data but not to the content of electronic communications, Article 1 section 2 of the Data Retention Directive. Pursuant to its Article 4, Member States had to “adopt measures to ensure that data retained in accordance with this Directive are provided only to competent national authorities in specific cases and in accordance with national law. The procedure to be followed and the conditions to be fulfilled in order to gain access to retained data (…) shall be defined by each Member State”. While Article 6 of the directive required the data to be retained for periods of between six months and two years, its Article 7 regulated certain data protection and security measures. In contrast to the Irish Court, the German Constitutional Court did not stay its proceedings in order to have the European Court of Justice review the validity of the directive against the European Charter of Fundamental Rights, but decided the case autonomously on the grounds of the German Basic Law. The German Court argued that it could decide the case autonomously because the Data Retention Directive left the national legislator enough room to implement it in accordance with German basic rights.390 388 See BVerfG, 11th of March 2008, 1 BvR 2074/05 and 1 BvR 1254/07 (License Plate Recognition), cip. 95: “Die konkreten Anforderungen an die Bestimmtheit und Klarheit der Ermächtigung richten sich nach der Art und Schwere des Eingriffs (...). Die Eingriffsgrundlage muss darum erkennen lassen, ob auch schwerwiegende Eingriffe zugelassen werden sollen. Wird die Möglichkeit derartiger Eingriffe nicht hinreichend deutlich ausgeschlossen, so muss die Ermächtigung die besonderen Bestimmtheitsanforderungen wahren, die bei solchen Eingriffen zu stellen sind (...).” 389 See beneath, under point C. I. 3. c) aa) (2) (b) Protection against collection, storage, and subsequent risk of abuse, the homonymous case of “Digital Rights vs. Ireland”, decided by the ECJ in 2014, ECJ C-293/12 and C-594/12. 390 See BVerfG, 2nd March 2010, 1 BvR 256/08, 1 BvR 263/08, and 1 BvR 586/08 (Data Retention), cip. 186, discussed above under point C. I. 2. d) aa) (1) Principles of clarity of law and purpose limitation referring to the moment when data is collected. In its opinion, the German Basic Law did not prohibit per se the transposition of the Data Retention Directive into German law, so that the ‘primacy of application’ of European fundamental rights did not become relevant.391 At first, the German Constitutional Court classified the retention of the data by the providers as a direct state interference because the providers pursued public purposes only and were left no room for decisions of their own.
Furthermore, albeit other laws should provide pre-conditions for a concrete request of data by the state authorities, it already considered the provisions regarding the transfer as an infringement because these provisions already listed the general purposes for the later use of data. Consequently, these provisions released the providers from their duty of confidentiality.392 However, regarding the later usage of the data that was collected, i.e. its treatment by Intelligent Services that provide their results to state authorities, the Court clarified that “the constitutional limits of these authorities using the data must not be undermined by a wider authorization for the preceding usage (by the Intelligence Services).”393 Thus, the flux of data and the retrieval of information are principally bound to the constitutional evaluation the moment it is first collected and stored. The proportionality test also takes the use of data at a later stage into account In relation to the test of proportionality of the legal provisions that authorize the collection of personal data, the Constitutional Court takes several criteria into account: First, who and how many individuals are concerned; second, under which circumstances the data is collected, e.g. whether the individuals gave a reason or not or whether the data collection occurs secretly or open; and third, the intensity of the infringement.394 With regard to the last aspect, i.e. the intensity of the infringement, the Court considers the essential criteria as: first, how relevant the information is for the per- (2) 391 See BVerfG, ibid., cip. 187. 392 See BVerfG, ibid., cip. 192 to 194. 393 See BVerfG, 2nd March 2010, 1 BvR 256/08, 1 BvR 263/08, and 1 BvR 586/08 (Data Retention), cip. 233: “(Dies ist erst möglich durch Folgemaßnahmen der für die Gefahrenabwehr zuständigen Behörden,) deren verfassungsrechtliche Begrenzungen bei der Datenverwendung nicht durch weitergehende Verwendungsbefugnisse im Vorfeld unterlaufen werden dürfen.” 394 See BVerfG, 14th of July 1999, 1 BvR 2226/94 (Surveillance of Telecommunications), cip. 192. I. Constitutional framework 167 sonality of the individuals, in particular, if it is combined with further data; and second, whether or not the individuals could expect that the data about them would be treated in a certain way.395 In this last respect, the intensity of an infringement is particularly high if it interferes with the expectation of privacy in the home or regarding the use of telecommunications.396 In contrast, an infringement in relation to an individual’s conduct within the public is less intensive.397 The possibilities of later usage of the data also play an essential role.398 Consequently, the Court takes the disadvantages caused by the later usage for the individual into account. In doing so, the Court considers not only real disadvantages but also potential disadvantages that the individuals have reasonably to fear in order to determine the intensity of the infringement. The Court justifies the first aspect, i.e. 
real disadvantages, by considering that the state treatment of data related to unsuspicious individuals leads to their risk of being an object of state investigations, which adds to their general risk of being unreasonably suspected.399 It also indirectly increases the risk of being stigmatized in daily or professional life, in particular, if the treatment of data refers to criteria, such as religion or ethnic origin, listed in Article 3 of the German Basic Law, which guarantees the freedom of equality.400 The Court also takes into account whether or not the individual is able to defend him or herself against the current or following state measures.401 With respect to the second aspect, i.e. potential disadvantages, the Constitutional Court stresses that the individual’s fear of being surveyed can lead, in advance, to a bias in communication and to adaptations of personal conduct. These chilling effects concern not only the individual but also communication in society 395 See BVerfG, 4th of April 2006, 1 BvR 518/02 (Dragnet Investigation), cip. 92 and 93. 396 See BVerfG, ibid., cip. 93. 397 See BVerfG, 11th of March 2008, 1 BVR 2047/05 and 1 BvR 1254/07 (License Plate Recognition), cip. 83. 398 See BVerfG, 13th June 2007, 1 BvR 1550/03 (Retrieval of Bank Account Master Data), cip. 109; cf. also BVerfG, 11th of March 2008, 1 BVR 2047/05 and 1 BvR 1254/07 (License Plate Recognition), cip. 82. 399 See BVerfG, 3rd of March 2004, 1 BvR 2378/98 (Big Eavesdropping Operation), cip. 227; BVerfG, 4th of April 2006, 1 BvR 518/02 (Dragnet Investigation), cip. 103. 400 See BVerfG, 4th of April 2006, 1 BvR 518/02 (Dragnet Investigation), cip. 106. 401 See BVerfG, 13th June 2007, 1 BvR 1550/03 (Retrieval of Bank Account Master Data), cip. 111. C. The function of the principle of purpose limitation in light of Article 8 ECFR 168 as a whole.402 Comparably, it takes into account the ‘diffuse threat’ for the individual. This threat results from the fact that the individuals know that the State has some information about them but do not know the precise information it has and what it will do with it.403 However, if the State meets certain requirements, the treatment of data can nevertheless be proportionate. In the decision of “Data Retention”, the Court precisely elaborated on the procedural measures coming into account in order to meet the principle of proportionality. The Court stressed that this can be, in particular, the case, if the authorizing law provides sufficiently clear rules, beside the extent and purpose of the data processing, on the security, transparency, and sanctions of the treatment of the data itself.404 With respect to the first point, data security requirements, the Court was of the opinion that the retention required an especially high standard of data security, because the collected data attracted, in light of its multifunctional informative value, the attention of many different stakeholders. Given that these stakeholders are private entities, they have little incentive to maintain a high level of data security. In order to maintain a particularly high standard of data security, for example, the following issues come into question: the systemic separation of the data, its encryption, a secure access control, and an irreversible documentation.405 Regarding the transparency of the data retention, the Court stressed, at first, that “the legislator must tackle the diffuse threat, which results from the data storage, by effective transparency rules. 
These serve to diminish the unspecific threat resulting from the lack of knowledge about the real relevance of the data, to counter unsettling speculations, and to enable the individuals concerned to question these measures in a public discourse. Furthermore, these requirements result from the principle of effective judicial relieve, pursuant to Art. 10 sect. 1 GG in combination with Art. 19 sect. 4 GG. Without corresponding knowledge, the individuals concerned can neither claim against an illicit usage of data by the authorities nor for 402 See BVerfG, 14th of July 1999, 1 BvR 2226/94 (Surveillance of Telecommunications), cip. 207; BVerfG, 3rd of March 2004, 1 BvR 2378/98 (Big Eavesdropping Operation), cip. 230. 403 See BVerfG, 2nd March 2010, 1 BvR 256/08, 1 BvR 263/08, and 1 BvR 586/08 (Data Retention), cip. 241. 404 See BVerfG, ibid., cip. 220. 405 See BVerfG, ibid., cip. 222 and 224. I. Constitutional framework 169 their rights to deletion, rectification or compensation.”406 Finally, the Court stressed the importance of effective sanctions in order to meet the principle of proportionality as “if even severe infringements of the privacy of telecommunications were not sanctioned, with the result that the protection of the personality right specified in Art. 10 sect. 1 GG became stunted in light of its immaterial nature, this would contradict the state duty to enable the individual developing his or her personality and to protect him against dangers for his or her personality caused by third parties. This might be in particular the case if illicitly retrieved data could be freely used or an illicit usage of data remained without compensation, serving the satisfaction of the individual concerned, because there is no material damage.”407 In the most recent case of “Federal Criminal Police Office Act” (Bundeskriminalamtgesetz), the Constitutional Court consolidated its previous decisions, and highlighted another aspect being relevant for meeting the principle of proportionality. In this case, several individuals, such as politicians, lawyers, psychologists and journalists lodged a constitutional complaint against the law for the prevention of dangers of international terrorism through the Federal Criminal Po- 406 See BVerfG, ibid., cip. 241: “Der Gesetzgeber muss die diffuse Bedrohlichkeit, die die Datenspeicherung hierdurch erhalten kann, durch wirksame Transparenzregeln auffangen. (...) Sie haben zum einen die Aufgabe, eine sich aus dem Nichtwissen um die tatsächliche Relevanz der Daten ergebende Bedrohlichkeit zu mindern, verunsichernde Spekulationen entgegenzuwirken und den Betroffenen die Möglichkeit zu schaffen, solche Maßnahmen in die öffentliche Diskussion zu stellen. Zum anderen sind solche Anforderungen auch aus dem Gebot des effektiven Rechtsschutzes gemäß Art. 10. Abs. 1 GG in Verbindung mit Art. 19 Abs. 4 GG herzuleiten. Ohne Kenntnis können die Betroffenen weder eine Unrechtmäßigkeit der behördlichen Datenverwendung noch etwaige Rechte auf Löschung, Berichtigung oder Genugtuung geltend machen.” 407 See BVerfG, ibid., cip. 252: “Würden auch schwere Verletzungen des Telekommunikationsgeheimnisses im Ergebnis sanktionslos bleiben mit der Folge, dass der Schutz des Persönlichkeitsrechts, auch soweit er in Art. 10 Abs. 1 GG eine spezielle Ausprägung gefunden hat, angesichts der immateriellen Natur dieses Rechts verkümmern würde (...), widerspräche dies der Verpflichtung der staatlichen Gewalt, dem Einzelnen die Entfaltung seiner Persönlichkeit zu ermöglichen (...) 
und ihn vor Persönlichkeitsgefährdungen durch Dritte zu schützen (...). Dies kann insbesondere der Fall sein, wenn unberechtigt gewonnene Daten weitgehend ungehindert verwendet werden dürften oder eine unberechtigte Verwendung der Daten mangels materiellen Schadens regelmäßig ohne einen der Genugtuung der Betroffenen dienenden Ausgleich bliebe.” lice Office (Gesetz zur Abwehr von Gefahren des internationalen Terrorismus durch das Bundeskriminalamt). This law authorizes, amongst others, secret measures carried out by the German Federal Criminal Police Office, such as long-term observations, acoustic and optical surveillance of the home, online investigations, and surveillance of telecommunications, as well as the later use of the data for purposes other than those for which it was originally collected.408 The claimants argued that this law would infringe their basic right to inviolability of the home under Article 13 GG, their right to privacy of correspondence, posts and telecommunications under Article 10 GG, and their rights to the confidentiality and integrity of information technology systems as well as to informational self-determination, both provided for by Article 2 sect. 1 in combination with Article 1 sect. 1 GG. They justified their complaint on the grounds that, in light of their human rights-related activities, they could come into contact with individuals whom the law, pursuant to its broad provisions, considers international terrorists, and that they could therefore also be affected by the surveillance measures.409 In this case, the Court stressed, besides the requirements mentioned previously, the importance of supervisory authorities controlling the treatment of data as well as of reporting duties towards parliament and the public. The particularity of these additional requirements results, in the Court’s opinion, from the fact that the measures foreseen in the law are usually taken in secret, so that the individuals concerned cannot defend themselves against them.410 In the private sector: The contract as an essential link for legal evaluation The concept of protection of the right to informational self-determination in relation to the private sector is similar to the approach described with respect to the public sector. In the private sector, from the Court’s point of view, “the contract is the essential instrument in order to develop free and self-responsible actions in relation to third parties.”411 408 See BVerfG, 20th of April 2016, 1 BvR 966/09 and 1 BvR 1140/09 (Federal Bureau of Investigation Law), cip. 1 to 5. 409 See BVerfG, ibid., cip. 79 to 84. 410 See BVerfG, ibid., cip. 140 to 143. 411 See BVerfG, 23rd of October 2006, 1 BvR 2027/02 (Release of Confidentiality), cip. 34: “Der Vertrag ist das maßgebliche Instrument zur Verwirklichung freien und eigenverantwortlichen Handelns in Beziehung zu anderen.” Placing the contract at the center of the exercise of the right to informational self-determination, the Court declares that, in comparison to the public sector, the essential determining point is the moment at which a contract is concluded. Since the conclusion of the contract usually precedes the collection of data, this means that the essential moment for the legal evaluation lies before the data is collected. However, it recognized that the conclusion of the contract is not the only possible moment for evaluating the treatment of data at a later stage.
With respect to the release of confidential information, the Court weighed the effects of the release of confidential information about the individual concerned against the equally important interest of the insurance company to receive the information.412 Balancing the opposing constitutional positions, the Court also considered that the point after the contract had been concluded was also relevant in respect of evaluating the legal relevance of the later treatment of data. In the Court’s opinion, such moments would have been possible by using alternative or supplementary mechanisms as: First, by means of specific releases of confidentiality for the particular request, referring to the specific institutions involved; second, by an information mechanism which enables the policy holder to object to the retrieval of data intended; third, by a mechanism where the institution involved does not provide the information about the policy holder directly to the insurance company but, before, to the policy holder who can then decide to add information and forward it to the insurance company or not, with the possible result that it looses the insurance claim.413 Interim conclusion: Conceptual link between ‘privacy’ and ‘data processing’ In conclusion, the concept of protection of the German right to informational self-determination establishes an autonomous substantial guarantee providing the individual a right to ‘basically determine by him or herself the disclosure and later usage of ‘his or her’ data’. This concept leads to several problematic aspects of protection: First, the concept leads to a rather broad scope of protection of the basic right. The broad scope results in the situation that each treatment of personal data must be justified. If the State treats personal data, this basically f) 412 See BVerfG, ibid., cip. 43, 45 to 48 as well as 50 and 51. 413 See BVerfG, ibid., cip. 59 and 60. C. The function of the principle of purpose limitation in light of Article 8 ECFR 172 constitutes an infringement of the basic right and consequently must be justified by a parliamentary law.414 Given that such a right shall not be an absolute right but rather be considered with regard to its function in society as a whole, the German Constitutional Court seeks to restrain the broadness of its scope in two ways. First, by determining what acts actually infringe the scope of protection. Second, when using a balancing exercise, by taking the intensity of the infringement into account. In the public sector, the essential moment for this examination is at the point of collection.415 In the private sector, a private party’s treatment of data related to an individual does not infringe his or her basic right, but can harm this basic right. Because of the protection function of the basic right, the State has to provide for protection instruments that enable the individual to effectively protect him or herself. A main protection instrument is the private contract. The broad scope principally leads, also in the private sector, to the situation that an individual can ‘basically determine by him or herself the disclosure and later usage of ‘his or her’ data’. However, in the private sector, the moment of legal evaluation of the data processing does not have only to be when the data is first collected, but also at later stages, depending on the specific contractual arrangement in question. 
This leads to the second problematic aspect of the concept of protection: that the specification of the purpose, which serves as an essential link for determining the legal relevance of the treatment of data, mainly refers to the moment of collection. Critics give two reasons for this approach: The first reason is, here again, that the concept of protection provides for an individual’s right to control the collection and usage of ‘his or her’ data; such a control right naturally begins with the data collection. The second reason is that the concept of protection actually implies a centralized and linear environment in which the data processing takes place. Critics consider this problematic because the requirement of purpose specification should rather be considered, in light of today’s de-centralized and non-linear environment, as a regulation instrument serving to structure the non-linear processes of data treatment.416 Thus, the requirement of purpose specification should not focus on the moment that personal data is collected, as this results in the situation that all possible future purposes must be pre-determined at the moment of collection. Rather, it should refer to the specific data processing and usage of information, irrespective of the moment it occurs.
414 See Härting, Purpose limitation and change of purpose in data protection law, p. 3284. 415 See Hoffmann-Riem, Protection of the Confidentiality and Integrity of Information Technological Systems, p. 1014.
3. Different approach of Article 7 and 8 ECFR with respect to Article 8 ECHR
The challenges described with respect to the concept of protection of the German right to informational self-determination raise the question of how they might be avoided. The German Constitutional Court has developed the concept of protection of the right to informational self-determination over decades, starting at a time when data processing environments were still centralized and linear. This makes it difficult for private data controllers to apply, in particular today, the requirements surrounding the principle of purpose limitation in innovative non-linear environments. The previous insights thus constitute a great opportunity for elaborating on the object and concept of protection of the new fundamental right to data protection under Article 8 ECFR. The object and concept of protection of this right, in particular in relation to the fundamental right to private life in Article 7 ECFR, are still not sufficiently clear.417 It is therefore not only a demanding but even more so a promising task to elaborate on Article 8 ECFR as a fundamental right that fits the needs of non-linear environments.
416 See Albers, Treatment of Personal Information and Data, cip. 121 to 123; highlighting the current change of the computational systems and environments compared to the times of the first “Decision on Population Census” in 1983, Hoffmann-Riem, Protection of the Confidentiality and Integrity of Information Technological Systems, pp. 1009 and 1010. 417 See, instead of many, Schneider, Status of and Perspectives for the European Data Traffic and Data Protection Law, pp. 515 and 516.
Genesis and interplay of both rights
Before the European Charter of Fundamental Rights came into force, the European Court of Justice referred to the right to private life under Article 8 ECHR when it had to decide on cases in which data protection and/or privacy played a role.
Under normal circumstances, the European Court of Justice also referred to the constitutional traditions amongst the Member States in order to develop, on the level of the European Union, the respective definition of fundamental rights. However, in relation to the definition of “data protection” there were, and still are, no common principles in the constitutional traditions. For example, while there is an explicit fundamental right for data protection in the Netherlands, Finland, Austria, Belgium and Greece treat it as part of the right of private life. Denmark, Estonia and Italy frame data protection under the right of communication, and in Germany, it results from the general personality right.418 In light of these different concepts, the European Court of Justice could not refer to a common tradition amongst Member States but had to focus on the European Convention. Today, after the European Charter of Fundamental Rights came into force, the wording of Article 8 ECHR reappears, almost literally, in the right to private life under Article 7 ECFR.419 However, beside that Article, the European legislator established the right to data protection under Article 8 ECFR in order to harmonize the different approaches of data protection amongst the Member States by strengthening the protection of individuals against the new risks caused by the processing of personal data. Some critics stress that it is, actually, this new right that enables judicial courts to interpret internal market instruments, such as the Data Protection Directive in a way that effectively protects the individual’s fundamental rights.420 a) 418 See Bernsdorff, European Charter of Fundamental Rights, cip. 3; see also De Hert and Gutwirth, Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action, p. 14; and Lynskey, The Foundations of EU Data Protection Law, p. 89. 419 See Burgkardt, ibid., p. 343. 420 Cf. De Hert and Gutwirth, Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action, pp. 8 and 9; De Hert and Gutwirth, Privacy, data protection and law enforcement. Opacity of the individual and transparency of power, p. 81; Tzanou, Data protection as a fundamental right next to privacy? ‘Reconstructing’ a not so new right, p. 94. I. Constitutional framework 175 In light of the above, it is necessary to examine whether, and if so, to what extent the objects and concepts of protection of Articles 8 and 7 ECFR, as well as of Article 8 ECHR differ to each other. Article 52 section 3 ECFR states, in this regard: “in so far as this Charter contains rights which correspond to rights guaranteed by the Convention for the Protection of Human Rights and Fundamental Freedoms, the meaning and scope of those rights shall be the same as those laid down by the said Convention. This provision shall not prevent Union law providing more extensive protection.” The explanations of the European Charter of Fundamental Rights provide further assistance in order to answer the question of whether or not Articles 7 and/or 8 ECFR correspond to Article 8 ECHR. 
Pursuant to the Explanations of the European Charter of Fundamental Rights, only “the rights guaranteed in Article 7 (ECFR) correspond to those guaranteed by Article 8 of the ECHR.” In relation to Article 8 ECFR, the Explanations of the European Charter of Fundamental Rights state that “this Article has been based on (…) Article 8 of the ECHR” (underlining by the author).421 In light of this wording and further systematic reasons, legal scholars conclude that Article 8 ECFR does not exactly correspond to Article 8 ECHR, but is interpreted by the European Court of Justice within the general framework provided for by the European Convention on Human Rights.422 Legal scholars stress that the establishment of the new right to data protection solves several problems that existed with respect to the protection of personal data under the right to private life in Article 8 ECHR. For example, the right to access personal data and to have it rectified, pursuant to its section 2, tackles problems that remain unanswered by the European Court of Human Rights.423 However, the precise interplay between the right to data protection under Article 8 ECFR and the right to private life provided for by Article 7 ECFR is heavily debated amongst legal scholars.424 Eichenhofer and González-Fuster summarize the spectrum of opinions into three categories: first, approaches considering both rights as exclusive of each other; second, approaches considering them as complementary to each other; and third, approaches understanding one right as prevailing over the other.425
421 See Explanations of the European Charter of Fundamental Rights, 2007/C 303/02. 422 See Burgkardt, ibid., p. 348 with further references. 423 See De Hert and Gutwirth, Privacy, data protection and law enforcement. Opacity of the individual and transparency of power, p. 81. 424 See also the unclear interplay between privacy and data protection in the OECD Guidelines, Tzanou, Data protection as a fundamental right next to privacy? ‘Reconstructing’ a not so new right, p. 91.
The so-called exclusivity approach considers the right to private life as solely covering aspects of private life, whereas the right to data protection only protects against risks caused by the processing of personal data.426 In contrast, the second approach advocates that the right to data protection covers a special part of the broader right to private life and, thus, prevails over the right to private life as long as the processing of personal data is at issue.427 This opinion is supported by the fact that more recently established secondary law refers only to the right to data protection and no longer to the right to private life.428 This approach, indeed, provides for an exception to the exclusive attribution of the processing of personal data to the fundamental right to data protection if the data processing constitutes a particular risk to the personality of the individual concerned. For instance, this can be the case if the processing leads to extensive profiles of the individuals concerned. In such a case, as an exception, the fundamental right to private life prevails.429 The third approach finally considers both rights as intersecting with each other in certain cases. Pursuant to this opinion, both rights may cover, jointly, certain situations while each of them has an autonomous scope of application.
On the one hand, the right to private life is wider than the right to data protection because it protects an individual’s private life, irrespective of the processing of personal data. On the other hand, the right to data protection is wider than the right to private life because it also 425 See Eichenhofer, Privacy in the Internet as Protection of Trust, p. 61; González- Fuster, The Emergence of Data Protection as a Fundamental Right of the EU, p. 200; cf. also Lynskey, The Foundations of EU Data Protection Law, pp. 89-130, regarding the case law provided by the ECtHR with respect to the right to private life under Art. 8 ECHR 426 See Eichenhofer, ibid., p. 61, referring to González-Fuster, ibid., p. 200, referring, in turn, to Carlos Ruiz-Miguel, El derecho a la protección de los datos personales en la Carta de Derechos Fundamentales de Unión Europea: Análisis crítico, p. 8. 427 See Bernsdorff, European Charter of Fundamental Rights, Art. 8 cip. 13; Mehde, Handbook of European Fundamental Rights, § 21 cip. 13; Eichenhofer, ibid., p. 61, with further references. 428 See Eichenhofer, ibid., p. 61, referring to González-Fuster, ibid., pp. 243 ff. 429 See Eichenhofer, ibid., p. 61, referring, amongst others, to Opinion of Advocate General Cruz Villalón, 12th of December 2013, Case C-293/12 (Digital Rights vs Ireland), cip. 65. I. Constitutional framework 177 protects against risks caused by data processing that do not refer to the individual’s private life. A certain action can therefore either only conflict with the right to private life under Article 7 ECFR or with the right to data protection of Article 8 ECFR, or, simultaneously, with both fundamental rights.430 Concept of Article 8 ECHR: Purpose specification as a mechanism for determining the scope of application (i.e. the individual’s ‘reasonable expectation’) Before analyzing in detail how the European Court of Justice constructs the interplay of both rights to private life and to data protection under Articles 7 and 8 ECFR, so far, it is essential to examine the decisions of the European Court of Human Rights with respect to the right to private life under Article 8 ECHR. Regarding the European Convention for Human Rights, data protection falls under the right for private life and family in Article 8 ECHR. As mentioned before, only Article 7 ECFR corresponds to Article 8 ECHR, whereas, Article 8 ECFR is only based on it. Furthermore, the right to data protection of Article 8 ECFR explicitly mentions the requirement of purpose specification, while the right to private life under Article 7 ECFR does not. Therefore, it is helpful to first understand the function of purpose specification applied by the European Court of Human Rights with respect to Article 8 ECHR. As a second step, this analysis can further help answer the question about the interplay of Articles 7 and 8 ECFR. Substantial guarantee of “private life”: Trust in confidentiality and unbiased behavior In 1950, when the European Convention on Human Rights was signed, data protection as such, was not publically discussed. Therefore, beside b) aa) 430 See, for example, De Hert and Gutwirth, Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action, p. 6; Kokott and Sobotta, The distinction between privacy and data protection in the jurisprudence of the CJEU and the ECtHR; Albers, Treatment of Personal Information and Data, cip. 43; Eichenhofer, ibid., p. 61, with further references. C. 
The function of the principle of purpose limitation in light of Article 8 ECFR 178 the terms “private and family life”, “correspondences”, and the “home”, data protection is not explicitly set out or conceptualized in the text of the convention. Nevertheless, the scope of application of Article 8 ECHR is considered to be broad enough to cover the recent technical and social development of data processing and accordingly, is interpreted by the European Court of Human Rights. In doing so, the Court does not always clarify whether it considers the processing of personal data as falling under “correspondences” or “private life”.431 In any case, with respect to the term “private life”, the Court has developed its definition through case law, instead of providing for a common definition that is generally applicable to all types of cases.432 This approach has meant that there is now a fairly ambiguous and wide scope of application of Article 8 ECHR that appears to repel several particular risks for its substantial guarantee(s).433 In the case of “Gillan and Quinton vs. The United Kingdom”, the European Court of Human Rights summarized, for example, several guarantees, which it has elaborated on the term “private life”, and clarified that “(…) the concept of ‘private life’ is a broad term not susceptible to exhaustive definition. It covers the physical and psychological integrity of a person. The notion of personal autonomy is an important principle underlying the interpretation of its guarantees (…). The Article also protects a right to identity and personal development, and the right to establish relationships with other human beings and the outside world. It may include activities of a professional or business nature. There is, therefore, a zone of interaction of an individual with others, even in a public context, which may fall within the scope of ‘private life’.”434 431 See De Hert and Gutwirth, Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action, p. 18; Burgkardt, ibid., pp. 247 with further references. 432 See Matscher, Methods of Interpretation of the Convention, pp. 63/64, with respect to the method of interpretation of the European Court of Human Rights, in general. 433 See, instead of many, Schweizer, European Convention and Data Protection, p. 464; Eichenhofer, Privacy in the Internet as Protection of Trust, p. 58, with further references; regarding the fact that not all data processing falls under the scope of protection, see De Hert and Gutwirth, Privacy, data protection and law enforcement. Opacity of the individual and transparency of power, pp. 80 and 81. 434 See ECtHR, Case of Gillan and Quinton v. The United Kingdom from 12 January 2010 (application no. 4158/05), cip. 61. I. Constitutional framework 179 Indeed, it is a difficult task to generalize certain rulings of the European Court of Human Rights because those are based on a case-by-case approach. Legal scholars, however, stress that the principle of autonomy plays a significant role in all rulings of the European Court of Human Rights on the right to private life.435 From this perspective, the general objective of the right to private life is to protect the individual’s interest that certain actions and opinions by him or her remain confidential.436 This aspect becomes particularly apparent in a case where the European Court of Human Rights decided about the treatment of medical data. In this case of “Z. vs. 
Finland”, the Court stated that “the protection of personal data, not least medical data, is of fundamental importance to a person’s enjoyment of his or her right to respect for private and family life as guaranteed by Article 8 of the Convention (…). Respecting the confidentiality of health data is a vital principle in the legal systems of all the Contracting Parties to the Convention. It is crucial not only to respect the sense of privacy but also to preserve his or her confidence in the medical profession and in the health services in general. Without such protection, those in need of medical assistance may be deterred from revealing such information of a personal and intimate nature as may be necessary in order to receive appropriate treatment and, even, from seeking such assistance, thereby endangering their own health and, in the case of transmissible diseases, that of the community.”437 In light of these considerations, Article 8 ECHR provides individuals with the confidence that their privacy is respected so that they can act within society in an unbiased manner, which is necessary to protect both themselves and society as a whole.
bb) Criteria established for certain cases: Context of collection, nature of data, way of usage, and results obtained
In light of such a guarantee, which is relatively broad but also takes into account the case-by-case approach, the question is the following: What kind of data is precisely protected against which kind of usage?
435 See De Hert and Gutwirth, Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action, p. 15, with further references to corresponding considerations by the ECtHR. 436 See Schweizer, ibid., p. 466. 437 See ECtHR, Case of Z. vs. Finland from 25 February 1997 (application no. 22009/93), cip. 95.
The European Court of Human Rights, indeed, does not recognize all types of personal data as being protected.438 Hence, in order to answer this question, it is necessary to examine in detail the types of cases the European Court of Human Rights has considered as falling under the scope of application of Article 8 ECHR. One type of case concerns telecommunication data, which is protected insofar as participants in telecommunication processes usually expect their data to be confidential.439 Therefore, both the content of the communication and its metadata are protected, for example, phone numbers as well as the time and duration of a call.440 Besides telecommunication data, other forms of “correspondences”, such as letters, documents, and files, fall under Article 8 ECHR.441 Another type of case refers to the “physical and psychological integrity” of the individual. The Court has elaborated in several cases on what this term means. In the case of “S. and Marper vs. The United Kingdom”, the European Court of Human Rights lists, in particular, the following aspects covered by this term: “Elements such as, for example, gender identification, name and sexual orientation and sexual life fall within the personal sphere protected by Article 8 (…). Beyond a person’s name, his or her private and family life may include other means of personal identification and of linking to a family (…). Information about the person’s health is an important element of private life (…).
The Court furthermore considers that an individual’s ethnic identity must be regarded as another such element (see, in particular, Article 6 of the Data Protection Convention quoted in paragraph 41 above, which lists personal data revealing racial origin as a special category of data along with other sensitive information about an individual).”442 Legal critics stress that the European Court of Human Rights acknowledged the category of sensitive data in this decision.443 In any case, the European Court of Human Rights clarified that “in determining whether the personal information retained by the authorities involves any of the private-life aspects mentioned above, the Court will have due regard to the specific context in which the information at issue has been recorded and retained, the nature of the records, the way in which these records are used and processed and the results that may be obtained (…).”444 Thus, the above-listed categories of personal information are not protected per se. Instead, their protection must be examined, again, on a case-by-case basis, pursuant to the context of collection, the nature of the data, the way it is used, and the results retrieved from it.
438 See De Hert and Gutwirth, Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action, pp. 24 to 26. 439 Cf. ECtHR, Case of Copland vs. The United Kingdom from 3 April 2007 (application no. 62617/00), cip. 41 and 42; ECtHR, Case of Halford vs. The United Kingdom from 25 June 1997 (application no. 20606/92), cip. 42 to 46. 440 See ECtHR, Case of Copland vs. The United Kingdom from 3 April 2007 (application no. 62617/00), cip. 43. 441 See examples at Schweizer, ibid., p. 465.
cc) Particular reference to the individual’s “reasonable expectations”
When undertaking this exercise in further cases, the European Court of Human Rights often also grounds its decisions by referring to the “reasonable expectations” of the individual concerned. In this regard, the purpose of the data processing can play a decisive role.445
442 See ECtHR, Case of S. and Marper vs. the United Kingdom from 4 December 2008 (application nos. 30562/04 and 30566/04), cip. 66. 443 See Schweizer, ibid., p. 466; with respect to genetic data, see ECtHR, Case of S. and Marper vs. the United Kingdom from 4 December 2008 (application nos. 30562/04 and 30566/04), cip. 70 to 77. 444 See ECtHR, Case of S. and Marper v. The United Kingdom from 4 December 2008 (application nos. 30562/04 and 30566/04), cip. 67. 445 The following categorization is not the only possible one, of course. For example, Lynskey categorizes possible infringements of the right to private life under Art. 8 ECHR along five elements: “storage of data relating to the private life of an individual”, “systematic collection and storage of (non-private) data”, “use of collected data infringing the individual’s ‘reasonable expectations’”, “concerned data constitute sensitive personal information”, and “whether consent was given” - see Lynskey, The Foundations of EU Data Protection Law, pp. 108-110. The most apparent difference to Lynskey’s scheme is that the following criteria are altogether categorized under the umbrella criterion of the “individual’s reasonable expectations”.
(1) ‘Intrusion into privacy’
The Court refers, for instance, to the individual’s “reasonable expectations” in order to determine whether or not an intrusion into his or her private sphere infringes the right to private life. The Court hence does not affirm that each intrusion into the individual’s privacy is an infringement of the individual’s right to private life. However, in the case of “Copland vs. The United Kingdom”, the Court affirmed that such an infringement took place. In this case, the claimant worked at a state college in England. When her supervisor suspected that she had an “improper relationship” with another male employee at the college, the supervisor started to monitor her telephone, email and Internet usage.446 In this case, the European Court of Human Rights affirmed that the claimant’s right to private life under Article 8 ECHR had been infringed, considering that “the applicant in the present case had been given no warning that her calls would be liable to monitoring, therefore she had a reasonable expectation as to the privacy of calls made from her work telephone. The same should apply in relation to the applicant’s email and Internet usage.”447 In the case of “Halford vs. The United Kingdom”, the Court also examined further factors, besides the mere use of telecommunications by the individual, which reinforced her “reasonable expectations” of privacy. In this case, the claimant was an Assistant Chief Constable at the Merseyside police office in England. When her supervisor refused to promote the claimant, despite existing vacancies, the claimant started proceedings before the judicial court on the grounds of gender discrimination. The claimant furthermore alleged that her employer had intercepted the telephones that she used in her office in order to use that information against her in the discrimination proceedings. The claimant had two telephones, one for business and one for private use. There were no restrictions or guidance given by her employer for the use of these phones.448
446 See ECtHR, Case of Copland vs. The United Kingdom from 3 April 2007 (application no. 62617/00), cip. 6 to 17. 447 See ECtHR, ibid., cip. 41. 448 See ECtHR, Case of Halford vs. The United Kingdom from 25 June 1997 (application no. 20606/92), cip. 8 to 20.
The Court confirmed that an infringement had taken place: “There is no evidence of any warning having been given to Ms. Halford, as a user of the internal telecommunications system operated at the Merseyside police headquarters, that calls made on that system would be liable to interception. She would, the Court considers, have had a reasonable expectation of privacy for such calls, which expectation was moreover reinforced by a number of further factors. As Assistant Chief Constable she had sole use of her office where there were two telephones, one of which was specifically designated for her private use. Furthermore, she had been given the assurance, in response to a memorandum, that she could use her office telephones for the purposes of her sex-discrimination case.”449 Both cases illustrate that the Court does not strictly differentiate between the two legal terms “private life” and “correspondences”. However, the Court examines whether an infringement of Article 8 ECHR took place by commonly referring to the individual’s “reasonable expectations”. (2) Public situations: ‘Systematic or permanent storage’ vs.
‘passer-by situations’ In cases related to public situations, the European Court of Human Rights elaborates extensively on the criteria regarding the individual’s “reasonable expectations”. This was in particular the case in the decision of “P.G. and J.H. vs. The United Kingdom”. In this case, the police wanted to compare voice samples of the applicants with voices recorded during a conversation held on the occasion of an earlier event. Given that the applicants had refused, during their arrest, to voluntarily provide such voice samples, the police installed covert listening devices in order to record their voices while police officers asked them formal questions. Hence, the applicants did not know that their voices were being recorded during that conversation.450 In order to determine the scope of protection of Article 8 ECHR, the Court took into account that “there are a number of elements relevant to a consideration of whether a person’s private life is concerned in measures effected outside a person’s home or private premises. Since there are occasions when people knowingly or intentionally involve themselves in activities that are or may be recorded or reported in a public manner, a person’s reasonable expectation as to privacy may be a significant, although not necessarily conclusive, factor. A person who walks down the street will, inevitably, be visible to any member of the public who is also present. Monitoring by technological means of the same public scene (for example, a security guard viewing through closed-circuit television) is of similar character. Private-life considerations may arise, however, once any systematic or permanent record comes into existence of such material from the public domain. It is for this reason that files gathered by security services on a particular individual fall within the scope of Article 8, even where the information has not been gathered by an intrusive or covert method (…)”451 The Court referred in this decision to the precedent cases of “Amann vs. Switzerland” and “Rotaru vs. Romania”: While in the first case, “the storing of information about the applicant on a card in a file was found to be an interference with private life, even though it contained no sensitive information and had probably never been consulted”452, the Court had stressed in the second case that the systematic or permanent storage of public information especially falls under Article 8 if “such information concerns a person’s distant past (…,) some of the information has been declared false and is likely to injure the applicant’s reputation.”453 Consequently, in the case of “Herbecq vs. Belgium”, the European Court of Human Rights decided that a video system monitoring a public space does not fall under Article 8 ECHR if the visual data is not recorded, because “it is difficult to see how the visual data obtained could be made available to the general public or used for purposes other than to keep a watch on places.”454
449 ECtHR, ibid., cip. 45. 450 See ECtHR, Case of P.G. and J.H. vs. The United Kingdom from 25 September 2001 (application no. 44787/98), cip. 15 and 16. 451 See ECtHR, Case of P.G. and J.H. vs. The United Kingdom from 25 September 2001 (application no. 44787/98), cip. 57. 452 See ECtHR, Case of P.G. and J.H. vs. The United Kingdom from 25 September 2001 (application no. 44787/98), cip. 57.
From the Court’s point of view, “the data available to a person looking at monitors is identical to that which he or she could have obtained
by being on the spot in person (…). Therefore all that can be observed is essentially public behavior.”455
453 See ECtHR, Case of Rotaru vs. Romania from 4 May 2000 (application no. 28341/95), cip. 43 and 44. 454 See ECtHR, Case of Herbecq and the Association League des Droits de l’Homme vs. Belgium from 14 January 1998 (application nos. 32200/96 and 32201/96), p. 97. 455 See ECtHR, Case of Herbecq and the Association League des Droits de l’Homme vs. Belgium from 14 January 1998 (application nos. 32200/96 and 32201/96), p. 97.
(3) ‘Data relating to private or public matters’, ‘limited use’ and/or ‘made available to the general public’
While in the case of “Herbecq vs. Belgium” the right to private life under Article 8 ECHR did not apply because there was no “systematic or permanent storage” of personal data at all, the Court denied the application of Article 8 ECHR in the cases of “Lupker vs. the Netherlands” and “Friedl vs. Austria” for further reasons. The Court stated that these decisions “concerned the unforeseen use by authorities of photographs which had been previously voluntarily submitted to them (…/for example, during an application process for a passport or driver’s license) and the use of photographs taken by the authorities during a public demonstration (…).”456 In these cases, the photographs taken during an application process were later used for criminal proceedings, and the photographs taken by the authorities during a public demonstration were used only for policing the demonstration. The Court decided these cases by referring to the following criteria: “In those cases, the Commission attached importance to whether the photographs amounted to an intrusion into the applicant’s privacy (as, for instance, by entering and taking photographs in a person’s home), whether the photograph related to private or public matters and whether the material thus obtained was envisaged for a limited use or was likely to be made available to the general public. In (../the second case) the Commission noted that there was no such intrusion into the ‘inner circle’ of the applicant’s private life, that the photographs taken of a public demonstration related to a public event and that they had been used solely as an aid to policing the demonstration on the relevant day. In this context, the Commission attached weight to the fact that the photographs taken remained anonymous in that no names were noted down, the personal data recorded and photographs taken were not entered into a data-processing system and no action had been taken to identify the persons photographed on that occasion by means of data processing (ibid.). Similarly, in (../the first case), the Commission specifically noted that the police used the photographs to identify offenders in criminal proceedings only and that there was no suggestion that the photographs had been made available to the general public or would be used for any other purpose.”457 Consequently, the Court considered that the use of the data did not infringe the right to private life under Article 8 ECHR. In the subsequent case of “Peck vs.
The United Kingdom”, the European Court of Human Rights at first tied into the criteria considered in the case of “Herbecq vs. Belgium” – whether the treatment of data is comparable to a passer-by or security situation – and then explicitly differentiated between the moment the data is collected and its later usage. In this case, the camera of a CCTV-system had filmed the applicant walking around at a junction with a kitchen knife in his hand, directly after he tried to commit suicide.458 The defendant published the record in its CCTV News publication while the identity of the applicant was not appropriately masked.459 The Court stressed in its decision that the “applicant did not complain that the collection of data through the CCTV-camera monitoring of his movements (…) amounted to an interference to his private life. (…) Rather, he argued that it was the disclosure of that record of his movements to the public in a manner in which he could never have foreseen which gave rise to such an interference.”460 The Court affirmed a serious infringement of Article 8 ECHR had occurred taking into account that “the footage was disclosed to the media for further broadcasting and publication purposes. Those media included the audiovisual media: Angelia Television broadcast locally to approximately 350,000 people and the BBC broadcast nationally, and it is ‘commonly acknowledged that the audiovisual media have often a much more immediate and powerful effect than the print media’ (…). (…/The applicant) was 457 See ECtHR, Case of Peck vs. the United Kingdom from 28 January 2003 (application no. 44647/98), cip. 61. 458 See ECtHR, Case of Peck vs. the United Kingdom from 28 January 2003 (application no. 44647/98), cip. 10. 459 See ECtHR, Case of Peck vs. the United Kingdom from 28 January 2003 (application no. 44647/98), cip. 62. 460 See ECtHR, Case of Peck vs. the United Kingdom from 28 January 2003 (application no. 44647/98), cip. 60. I. Constitutional framework 187 recognized by certain members of his family and by his friends, neighbours and colleagues.”461 The Court therefore decided that “the relevant moment was viewed to an extent which far exceeded any exposure to a passer-by or to security observation (…) and to a degree surpassing that which the applicant could possibly have foreseen when he walked (…/in the moment he was filmed).”462 In light of the seriousness of the infringement, that being: the applicant’s identity was not appropriately masked, and that the footage was not published for purposes of crime detection or prevention, the Court came to the conclusion that the infringement was not justified.463 ‘Unexpected use’ pursuant to the purpose perceptible by the individual concerned In the cases described, the European Court of Human Rights more or less implicitly referred to the purpose of the collection and usage of the personal data in order to examine whether the individual could reasonably expect the collection and, more importantly, the later usage or not. In all of the cases, the Court examined whether the data ‘amounted to an intrusion into the applicant’s privacy, related to private or public matters and whether the information obtained was envisaged for a limited use or was likely to be made available to the general public’.464 However, even a limited use, not being a publication of data, can interfere with an individual’s ‘reasonable expectation’. In the above-mentioned case of “P.G. and (4) 461 See ECtHR, Case of Peck vs. the United Kingdom from 28 January 2003 (application no. 
44647/98), cip. 62 and 63. 462 See ECtHR, Case of Peck vs. the United Kingdom from 28 January 2003 (application no. 44647/98), cip. 62. 463 See ECtHR, Case of Peck vs. the United Kingdom from 28 January 2003 (application no. 44647/98), cip. 79, 85, and 87. 464 Cf. ECtHR, Case of Herbecq and the Association League des Droits de l’Homme vs. Belgium from 14 January 1998 (application nos. 32200/96 and 32201/96), p. 97; ECtHR, Case of P.G. and J.H. vs. The United Kingdom from 25 September 2001 (application no. 44787/98), cip. 58; ECtHR, Case of Peck vs. the United Kingdom from 28 January 2003 (application no. 44647/98), cip. 61 referring to the Case of Friedl vs. Austria from 31 January 1995 (Series A no. 305-B) and Case of Lupker vs. the Netherlands from 7 December 1992 (application no. 18395/91); see also, for example, ECtHR, Case of von Hannover vs. Germany from 24 June 2004 (application no. 59320/00), cip. 52. C. The function of the principle of purpose limitation in light of Article 8 ECFR 188 J.H. vs. The United Kingdom”, the Court came to the conclusion that the covert recording of voices during a conversation in the police station fell within the scope of Article 8 ECHR. In conclusion, the Court did not follow the opinion of the defending government that the applicants could not expect their privacy in that context.465 From the Court’s point of view, “a permanent record has nonetheless been made of the person’s voice and it is subject to a process of analysis directly relevant to identifying that person in the context of other personal data. Though it is true that when being charged the applicants answered formal questions in a place where police officers were listening to them, the recording and analysis of their voices on this occasion must still be regarded as concerning the processing of personal data about the applicants.”466 While the Court referred in this case only to the fact that the covert voice sample became “subject to a process of analysis directly relevant to identifying (…/the applicant) in the context of other personal data”467, the purpose of the data treatment played in the other decisions a more explicit role. In the case “Herbecq vs. Belgium”, the Court held it as essential that the visual data from the video camera control could not be, in light of the fact that it did not record the data, “used for purposes other than to keep a watch on places.”468 In the cases of “Friedl vs. Austria” and of “Lupker vs. the Netherlands”, the Court considered that the photographs had been used, in the first case, “solely as an aid to policing the demonstration on the relevant day” and, in the other case, “to identify offenders in criminal proceedings only (…/without giving) suggestion that the photographs (…) would be used for any other purpose.”469 In the case of “Peck vs. the United Kingdom”, the Court finally came to the conclusion that the usage of the visual data had clearly surpassed what the applicant could have fore- 465 Cf. ECtHR, Case of P.G. and J.H. vs. The United Kingdom from 25 September 2001 (application no. 44787/98), cip. 54. 466 See ECtHR, Case of P.G. and J.H. vs. The United Kingdom from 25 September 2001 (application no. 44787/98), cip. 59. 467 See ECtHR, Case of P.G. and J.H. vs. The United Kingdom from 25 September 2001 (application no. 44787/98), cip. 59. 468 See ECtHR, Case of Herbecq and the Association League des Droits de l’Homme vs. Belgium from 14 January 1998 (application nos. 32200/96 and 32201/96), p. 97. 469 See ECtHR, Case of Peck vs. 
the United Kingdom from 28 January 2003 (application no. 44647/98), cip. 61. seen because it was not only recorded for security reasons, but also “disclosed to the media for further broadcasting and publication purposes.”470 Finally, the purpose plays an even more explicit and decisive role in the decisions of “Perry vs. the United Kingdom” and of “M.S. vs. Sweden”. In the case of “Perry vs. the United Kingdom”, the applicant had, in connection with a robbery of which he was accused, refused to take part in an identity parade. The police therefore decided to make the identity parade indirectly possible by means of a video recording: An engineer adjusted a custody suite camera in the police station in order to ensure that it took clear pictures of the applicant at the moment when he, having been arrested, entered the police station. After the recording, the police prepared a compilation video in which other persons mimicked the actions of the applicant as recorded. When this compilation was shown to witnesses of the robbery, some of them positively identified the applicant.471 Similar to the case of “P.G. and J.H. vs. The United Kingdom”, the defending Government argued that the police station “could not be regarded as a private place, and that as the cameras which were running for security purposes were visible to the applicant he must have realized that he was being filmed, with no reasonable expectation of privacy in the circumstances.”472 In contrast, the European Court of Human Rights took a more differentiated approach to privacy within the meaning of Article 8 ECHR. It affirmed, at first, that “the normal use of security cameras, whether in public or on premises, such as shopping centres, or police stations, where they serve a legitimate and foreseeable purpose, do not raise issues under Article 8 § 1 of the Convention. However, the police regulated the security camera so that it could take clear footage of the applicant in the custody suite and inserted it in a montage of film of other persons to show to witnesses for the purposes of seeing whether they identified the applicant as the perpetrator of the robberies under investigation. The video was also shown during the applicant’s trial in a public court room. (…) The Court recalls that the applicant had been brought to the police station to attend an identity parade and that he had refused to participate. Whether or not he was aware of the security cameras running in the custody suite, there is no indication that the applicant had any expectation that footage was being taken of him within the police station for use in a video identification procedure and, potentially, as evidence prejudicial to his defence at trial. This ploy adopted by the police went beyond the normal or expected use of this type of camera, as indeed is demonstrated by the fact that the police were required to obtain permission and an engineer had to adjust the camera.
470 See ECtHR, Case of Peck vs. the United Kingdom from 28 January 2003 (application no. 44647/98), cip. 62. 471 See ECtHR, Case of Perry vs. the United Kingdom from 17 July 2003 (application no. 63737/00), cip. 14 and 15. 472 See ECtHR, Case of Perry vs. the United Kingdom from 17 July 2003 (application no. 63737/00), cip. 39.
(…) The Court considers therefore that the recording and use of the video footage of the applicant in this case discloses an interference with his right to respect for private life.”473 This interference was not justified because the police did not inform the applicant about the actual purpose of the filming beforehand, as required by the national law concerned.474 Finally, in the case of “M.S. vs. Sweden”, a medical clinic had sent the applicant’s medical records to a Social Insurance Office without prior notice to the applicant. The Office had requested the data because of the applicant’s claim for compensation after she had an accident at work.475 In this case, the European Court of Human Rights examined whether the transfer constituted an infringement of Article 8 ECHR, taking into account “that the medical records in question contained highly personal and sensitive data about the applicant (…). Although the records remained confidential, they had been disclosed to another public authority and therefore to a wider circle of public servants (…). Moreover, whilst the information had been collected and stored at the clinic in connection with medical treatment, its subsequent communication had served a different purpose, namely to enable the Office to examine her compensation claim. It did not follow from the fact that she had sought treatment at the clinic that she would consent to the data being disclosed to the Office (…). Having regard to these considerations, the Court finds that the disclosure of the data by the clinic to the Office entailed an interference with the applicant’s right to respect for private life guaranteed by paragraph 1 of Article 8.”476 However, the Court considered that the interference was justified under Article 8 ECHR because the Office had a legitimate interest in the data, as it could not otherwise have checked whether the applicant’s claim for compensation was well-founded. Furthermore, the receiving office was under a duty to verify that the pre-conditions for the transfer were met. In addition, it was also under a duty to keep this information confidential, so that limitations on further use existed, as well as safeguards against abuse.477
473 See ECtHR, Case of Perry vs. the United Kingdom from 17 July 2003 (application no. 63737/00), cip. 40, 41, and 43. 474 See ECtHR, Case of Perry vs. the United Kingdom from 17 July 2003 (application no. 63737/00), cip. 47 and 49. 475 See ECtHR, Case of M.S. vs. Sweden from 27 August 1997 (74/1996/693/885), cip. 8 to 14. 476 See ECtHR, Case of M.S. vs. Sweden from 27 August 1997 (74/1996/693/885), cip. 35.
dd) Consent: Are individuals given a choice to avoid the processing altogether?
In the same case, the European Court of Human Rights also examined, in more detail, the pre-conditions and extent of a potential waiver of the individual’s right to private life. The Court discussed, in particular, whether the applicant had consented to the transfer of her medical data, which would have excluded, in the Court’s opinion, the application of Article 8 ECHR. In doing so, it took into account that the “communication of such data by the clinic to the Office would be permissible under the Insurance Act only if the latter authority had made a request and only to the extent that the information was deemed to be material to the application of the Insurance Act (…).
This assessment was left exclusively to the competent authorities, the applicant having no right to be consulted or informed beforehand (…). It thus appears that the disclosure depended not only on the fact that the applicant had submitted her compensation claim to the Office but also on a number of factors beyond her control. It cannot therefore be inferred from her request that she had waived in an unequivocal manner her right under Article 8 § 1 of the Convention to respect for private life with regard to the medical records at the clinic. Accordingly, the Court considers that this provision applies to the matters under consideration.”478 The Court similarly focused on the question of whether or not the individual is able to control the collection of his or her data in the case of “Gillan and Quinton vs. the United Kingdom”.
477 See ECtHR, Case of M.S. vs. Sweden from 27 August 1997 (74/1996/693/885), cip. 42 to 44. 478 See ECtHR, Case of M.S. vs. Sweden from 27 August 1997 (74/1996/693/885), cip. 32.
In this case, the police had stopped passers-by on the grounds of the Terrorism Act 2000 and searched their bags in connection with a demonstration.479 The government argued that the individuals concerned had given their consent to the search because they had “brought themselves into contact with the public sphere through their voluntary engagement with a public demonstration.”480 The European Court of Human Rights accepted neither this argument nor, in particular, “the analogy drawn with the search to which passengers uncomplainingly submit at airports or at the entrance of a public building. It does not need to decide whether the search of the person and of his bags in such circumstances amounts to an interference with an individual’s Article 8 rights, albeit one which is clearly justified on security grounds, since for the reasons given by the applicants the situations cannot be compared. An air traveller may be seen as consenting to such a search by choosing to travel. He knows that he and his bags are liable to be searched before boarding the aeroplane and has a freedom of choice, since he can leave personal items behind and walk away without being subjected to a search. The search powers under section 44 are qualitatively different. The individual can be stopped anywhere and at any time, without notice and without any choice as to whether or not to submit to a search.”481 The Court concluded from this that the searches interfered with Article 8 ECHR and were not justified on the grounds of the authorizing law (section 44 of the Terrorism Act 2000). The reason was that the searches were “neither sufficiently circumscribed nor subject to adequate legal safeguards against abuse”.482 As it had already affirmed that an infringement of Article 8 ECHR had taken place, the Court held that it was not necessary to examine further rights under the ECHR, such as the freedom of expression or assembly.483
479 See ECtHR, Case of Gillan and Quinton vs. the United Kingdom from 12 January 2010 (application no. 4158/05), cip. 7 to 9. 480 See ECtHR, Case of Gillan and Quinton vs. the United Kingdom from 12 January 2010 (application no. 4158/05), cip. 60. 481 See ECtHR, Case of Gillan and Quinton vs. the United Kingdom from 12 January 2010 (application no. 4158/05), cip. 65. 482 See ECtHR, Case of Gillan and Quinton vs. the United Kingdom from 12 January 2010 (application no. 4158/05), cip. 87.
483 See ECtHR, Case of Gillan and Quinton vs. the United Kingdom from 12 January 2010 (application no. 4158/05), cip. 88 to 90.
ee) Conclusion: Assessment of ‘reasonable expectations’ on a case-by-case basis
In conclusion, the European Court of Human Rights does not define in general terms which acts of data treatment are legally relevant, but rather examines this on a case-by-case basis, be it with respect to medical or communication data or to a human action in public. The Court tends to answer the question of whether or not the treatment of data is legally relevant by determining the specific context. In doing so, it takes into account “whether the (… / personal data) amounted to an intrusion into the applicant’s privacy, whether (… / it) related to private or public matters and whether the material thus obtained was envisaged for a limited use or was likely to be made available to the general public.”484 In this examination exercise, the Court does not explicitly refer to the principle of purpose limitation, but rather to the individual’s “reasonable expectations”. In this regard, indeed, the purpose of the data processing plays an important instrumental role.485 The explicit purpose of the collection for the individual concerned provides a link for examining whether or not he or she could expect an intrusion into his or her private sphere or, respectively, could expect how his or her data would be used later on. However, the European Court of Human Rights does not refer to any further human rights in order to determine the impact resulting from the treatment of the data for the individual. In the case of “Gillan and Quinton vs. The United Kingdom”, the Court rather concluded that it did not have to examine any further rights under the European Convention on Human Rights, such as the freedom of expression or of assembly, since it had already affirmed a violation under Article 8 ECHR.486
484 See ECtHR, Case of Peck vs. the United Kingdom from 28 January 2003 (application no. 44647/98), cip. 61. 485 However, see Bygrave, Data Privacy Law, p. 154, who sees the principle of purpose limitation “far from salient in ECtHR case law”. 486 See ECtHR, Case of Gillan and Quinton vs. the United Kingdom from 12 January 2010 (application no. 4158/05), cip. 88 to 90.
Concept of Articles 7 and 8 ECFR: Ambiguous interplay of scopes going beyond Article 8 ECHR
After having examined the reasoning developed by the European Court of Human Rights with respect to Article 8 ECHR, it is now possible to analyze how the European Court of Justice transposes these functions of purpose specification into the concept of protection of Articles 7 and 8 ECFR, respectively.
Comparing the decisions of the European Court of Justice with the principles developed by the European Court of Human Rights
A comparison of the decisions rendered by the European Court of Human Rights, on the one hand, and by the European Court of Justice, on the other hand, reveals more differences than commonalities. One reason for this is that the European Court of Justice clearly developed the concept of protection further by referring either to the right to private life under Article 7 ECFR, to the right to data protection under Article 8 ECFR, or to both fundamental rights.
General definition of the term ‘personal data’ under Article 7 and 8 ECFR instead of case-by-case approach The first difference concerns the way how the European Court of Justice constructs the scope of protection of the fundamental rights, respectively. After the European Charter of Fundamental Rights came into force, the European Court of Justice commonly defined the scope(s) of protection of both rights to private life under Article 7 ECFR and data protection under Article 8 ECFR by referring to the term ‘personal data’. In doing so, the European Court of Justice principally applies the reasoning of the European Court of Human Rights. This becomes particularly apparent in the case of “Schecke vs. Land Hessen”. In this case, the applicants of the main proceedings were a group of agricultural companies that were financially supported by the department of European agricultural funds. According to the corresponding European regulation, the executive public agency published data about the applicants, such as their names, their place of establishment and residence, as well as the annual amounts of the money received from the department. The claimants brought c) aa) (1) I. Constitutional framework 195 an action against the publication of their information, which was finally referred by the national court to the European Court of Justice.487 The European Court of Justice explicitly referred to the decisions of “Amann vs. Switzerland” and “Rotaru vs. Romania” of the European Court of Human Rights stating not only that the right to data protection under Article 8 ECFR “is closely connected with the right to private life expressed in Article 7 ECFR”488 but also “that the term ‘private life’ must not be interpreted restrictively”.489 The Court appears to construct one common fundamental right, stressing: “The right to respect for private life with regard to the processing of personal data, recognized by Article 7 and Article 8 of the Charter, concerns any information relating to an identified or identifiable individual (…) and the limitations which may lawfully be imposed on the right to the protection of personal data correspond to those tolerated in relation to Article 8 of the Convention.”490 While some critics consider that the Court “assimilates Article 7 and 8 of the Charter to create an unprecedented right”,491 others stress that the unclear reasoning does not automatically mean that the Court assumes both Articles 7 and 8 ECFR as one fundamental right in relation to the meaning of Article 8 ECHR.492 The European Court of Justice affirmed this combination of Article 7 and 8 ECFR in the case of “FECEMD and ASNEF”.493 However, the Court basically applies the same definition for affirming the scope of protection in decisions where it refers to the right to data protection under Article 8 ECFR, only. This is the case, for example, in the decisions of “SABAM vs. Scarlet” and “SABAM vs. Netlog”.494 In both cases, the European Court of Justice simply affirmed that the IP addresses concerned did 487 See ECJ C-92/09 and C-93/09 (Schecke vs. Land Hessen), cip. 25 to 28. 488 See ECJ C- 92/09 and C-93/09 cip. 47 and 52. 489 See ECJ C-92/09 and C-93/09 cip. 59. 490 See ECJ C-92/09 and C-93/09 cip. 52. 491 See González-Fuster, The Emergence of Data Protection as a Fundamental Right of the EU, pp. 234 to 236. 492 See Burgkardt, ibid., pp. 349 to 356 with further references. 493 See ECJ C-468/10 and C-469/10, cip. 40 to 42, and the facts of the case above under point C. I. 1. 
b) aa) (2) (b) The right to data protection under Article 8 ECFR and/or the right to private life under Article 7 ECFR. 494 See the facts of the case above under point C. I. 1. b) aa) (2) (b) The right to data protection under Article 8 ECFR and/or the right to private life under Article 7 ECFR. Legal critics are of the opinion that this reasoning indicates a rather broad interpretation of the term ‘personal data’ without any further requirements, such as a link to the private sphere or data sensitivity.496 So far, the essential aspect is that the European Court of Justice uses the term ‘personal data’ for defining both scopes of protection of Article 7 and Article 8 ECFR, like the European Court of Human Rights with respect to Article 8 ECHR, but uses a different method for constructing the scopes. The European Court of Human Rights constructs the scope of protection of the right to private life on a case-by-case basis and does not provide for a definition of private life that is capable of general application.497 Consequently, the legal doctrine elaborating on such a general definition plays a much smaller role at the level of the European Court of Human Rights than it does in the continental European tradition. Based on the more empirical approach of the common law, there is, consequently, no “general formula” determining the “implicit limitations” of fundamental rights. Instead, these limitations must be defined for each (type of) case, for example, by means of affirming or denying the scope of protection.498 In contrast, scholars stress that the European Court of Justice does not sufficiently take into account the particularities of the case at hand.499 Therefore, even if there is not yet a commonly accepted normative methodology of interpreting Union law,500 the European Court of Justice shows a strong tendency – at least with respect to the rights to private life and data protection under Articles 7 and 8 ECFR – to apply a different method of interpretation from that of the European Court of Human Rights. The European Court of Justice defines the scopes of protection of both fundamental rights under Article 7 and 8 ECFR by referring, in general, to the term “personal data”. This term serves as the Court’s main starting point when considering, by means of its deductive method, all processing of personal data as falling under the scope(s) of protection. 495 See ECJ C-70/10 cip. 51 and ECJ C-360/10 cip. 49. 496 See Burgkardt, ibid., pp. 349 to 356 with further references. 497 See above under point C. I. 3. c) aa) (1) General definition of the term ‘personal data’ under Article 7 and 8 ECFR instead of case-by-case approach. 498 See Matscher, Methods of Interpretation of the Convention, pp. 63 to 67, who also stresses that a comparative analysis with the judicature of the European Court of Justice would be interesting. 499 See Fleischer, European Methodology, p. 717, referring to Vogenauer, Die Auslegung von Gesetzen in England und auf dem Kontinent I und II (2001), pp. 255 ff. 500 See Fleischer, ibid., pp. 707 to 710, referring, indeed, to prescriptive methodologies such as Ulla B. Neergaard, Ruth Nielsen, Lynn M. Rosenberry, European Legal Method: Paradoxes and Revitalisation (2011).
This leads to the result that the European Court of Human Rights remains, in light of its case-by-case approach, relatively free in examining the particularities of the case at hand and, thus, in affirming or denying the scope of application of the right to private life under Article 8 ECHR. In contrast, the European Court of Justice, which refers to its general definition of the term “personal data”, is bound, once personal data is the main focal point of the case, to affirm the scope of protection of the rights to private life and/or data protection under Article 7 and 8 ECFR. (2) Differences between private life and data protection under Articles 7 and 8 ECFR The second difference concerns the elements that were originally covered, taken together, by the right to private life under Article 8 ECHR and are now located, in one part, under the homologue right of Article 7 ECFR and, in another part, under the new right to data protection of Article 8 ECFR. So far, this re-location is not a substantive further development regarding the concept of protection provided for by Article 8 ECHR. It is, rather, a formal change due to the explicit wording of Article 8 sect. 2 and 3 ECFR. However, since the European Court of Justice does not apply a case-by-case approach, as the Court of Human Rights does, but sets up a common definition for both fundamental rights, it is necessary to examine how the European Court of Justice differentiates between both fundamental rights. (a) Protection against first publication and profiles based on public data At first, the European Court of Justice affirms, similar to the European Court of Human Rights, an infringement of the right to private life under Article 7 ECFR if personal data is published for the first time. In doing so, the Court basically considers, such as in its decision in “Schecke vs. Land Hessen”, the right to data protection as “closely connected with the right to private life”.501 However, in the case of “González vs. Google Spain”, the data was in fact already published. In this case, a sort of instrumental character of the (new) right to data protection for the (old) right to private life becomes apparent. Here in particular, the purpose of the data processing is also an essential element behind the Court’s reasoning.502 The European Court of Justice examined, at first, the effects of the data processing by Google’s search engine on Mr. González’ right to private life. It then considered and answered the question of whether or not Mr. González could request Google to delist the articles containing information about him from its search results. In particular, the Court took into account the purpose of the initial publication and the time that had elapsed after the first publication of the article (16 years). Referring to the Data Protection Directive, the Court stressed that “it follows from those requirements, laid down in Article 6(1) lit. c) to (e) (…), that even initially lawful processing of accurate data may, in the course of time, become incompatible with the directive where those data are no longer necessary in the light of the purposes for which they initially were collected or processed.
That is so in particular where they appear to be inadequate, irrelevant or no longer relevant, or excessive in relation to those purposes and in the light of the time that has elapsed.”503 The Court went on to state that such a right to be delisted does not require “that the inclusion of the information in question in the list of results causes prejudice to the data subject. As the data subject may, in the light of his fundamental rights under Articles 7 and 8 of the Charter, request that the information in question no longer be made available to the general public by its inclusion in such a list of results, it should be held (…) that those rights override, as a rule, not only the economic interest of the operator of the search engine but also the interest of the general public in finding that information”.504 The specification of the purpose basically required by Article 8 sect. 2 ECFR thus played an instrumental role in order to safeguard Mr. González’ right to private life. 501 See ECJ C-92/09 and C-93/09 cip. 47 and 52, and the facts of this case above under point C. I. 3. c) aa) (1) General definition of the term ‘personal data’ under Article 7 and 8 ECFR instead of case-by-case approach. 502 See the facts of this case above under point C. I. 1. b) aa) (2) (b) The right to data protection under Article 8 ECFR and/or the right to private life under Article 7 ECFR. 503 See ECJ C-131/12, cip. 93. 504 See ECJ C-131/12, cip. 96 and 97. Indeed, the Court did not discuss whether this requirement directly applies to the private sector, nor did it examine what the initial purpose was and why the later usage of that data by the search engine operator actually conflicted with this initial purpose. However, so far, the reasoning appears to be consistent with the principles provided for by the European Court of Human Rights. The Court of Human Rights would probably have considered whether or not the constant availability of these articles through Google’s search engine interfered with Mr. González’ “reasonable expectations”.505 This might have been the case because the “processing enables any internet user to obtain through the list of results a structured overview of the information relating to that individual that can be found on the internet — information which potentially concerns a vast number of aspects of his private life and which, without the search engine, could not have been interconnected or could have been only with great difficulty — and thereby to establish a more or less detailed profile of him. Furthermore, the effect of the interference with those rights of the data subject is heightened on account of the important role played by the internet and search engines in modern society, which render the information contained in such a list of results ubiquitous (...).”506 When the newspapers initially published the information 16 years ago, Mr. González therefore had probably not expected the profile that was later created through the Internet search engine when Internet users typed in the claimant’s name. In addition, from the point of view of the European Court of Human Rights, it might have played a role that the first publication “took place upon order of the Ministry of Labor and Social Affairs and was intended to give maximum publicity to the auction (in which Mr. González was involved at the time) in order to secure as many bidders as possible”. The first publication, hence, depended not only on the fact that Mr.
González could not pay his social security debts ‘but also on a number of factors beyond his control.’507 505 Cf. ECtHR, Case of Peck vs. the United Kingdom from 28 January 2003 (application no. 44647/98), cip. 62. 506 See ECJ C-131/12 cip. 80. 507 Cf. ECtHR, Case of M.S. vs. Sweden from 27 August 1997 (74/1996/693/885), cip. 32. (b) Protection against collection, storage, and subsequent risk of abuse The right to data protection under Article 8 ECFR, in particular the requirement to specify the purpose, can therefore play an important role in the Court’s reasoning in order to determine an infringement of the right to private life under Article 7 ECFR. In the subsequent case of “Digital Rights vs. Ireland”, the Court again refers to the purpose of the data processing in order to examine an infringement of the right to private life. However, in this case, the Court more precisely differentiates between both fundamental rights. In this case, Digital Rights Ireland Ltd. lodged a complaint before an Irish court challenging national legislative and administrative measures regarding the retention of data related to electronic communications. These measures were based on the Data Retention Directive.508 In light of the broad scope of the directive, the Irish court, unlike the German Constitutional Court, referred the decision to the European Court of Justice, asking about its legality with respect to the right to privacy in Article 7 ECFR, the right to data protection in Article 8 ECFR, and the freedom of expression in Article 11 ECFR.509 With respect to the scopes of application of the fundamental rights, the European Court of Justice stressed, at first, that the “data, taken as a whole, may allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained, such as the habits of everyday life, permanent or temporary places of residence, daily or other movements, the activities carried out, the social relationships of those persons and the social environments frequented by them.”510 The Court concluded from this that, even though no content of the communications was to be retained, “it is not inconceivable that the retention of the data in question might have an effect on the use, by subscribers or registered users, of the means of communication covered by that directive and, consequently, on their exercise of the freedom of expression guaranteed by Article 11 of the Charter.”511 508 See ECJ C-293/12 and C-594/12 cip. 17; Directive 2006/24/EC of the European Parliament and of the Council of 15 March 2006 on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks and amending Directive 2002/58/EC (Data Retention Directive). 509 Cf. above under point Principles of clarity of law and purpose limitation referring to the moment when data is collected, referring to BVerfG, 2nd March 2010, 1 BvR 256/08, 1 BvR 263/08, and 1 BvR 586/08 (Data Retention), cip. 186. 510 See ECJ C-293/12 and C-594/12 cip. 27.
The Court continued to state that “the retention of data for the purpose of possible access to them by the competent authorities (…) directly and specifically affects private life and, consequently, the rights guaranteed by Article 7 of the Charter.”512 With respect to Article 8 ECFR, the Court finally added that “such a retention of data also falls under Article 8 of the Charter because it constitutes the processing of personal data within the meaning of that article and, therefore, necessarily has to satisfy the data protection requirements arising from that article (…).”513 Regarding an infringement of these rights, the Court stressed, at first, “the fact that data retained and subsequently used without the subscriber or registered user being informed is likely to generate in the minds of the persons concerned the feeling that their private lives are the subject of constant surveillance.”514 However, the Court clarified that “it does not matter whether the information on the private lives concerned is sensitive or whether the people concerned have been inconvenienced in any way”.515 As a consequence, both the obligation to retain the data and the obligation to grant access to it interfere “with the rights guaranteed by Article 7 of the Charter.”516 With respect to the right to data protection, the Court simply considered that “likewise, (…/the Data Retention Directive) constitutes an interference with the fundamental right to the protection of personal data guaranteed by Article 8 of the Charter because it provides for the processing of personal data.”517 Examining whether these infringements are justified, the ECJ principally upheld the distinction between the right to private life in Article 7 ECFR and the right to data protection in Article 8 ECFR. At first, it determined whether the Data Retention Directive affects the essence of the corresponding fundamental right: “So far as concerns the essence of the fundamental right to privacy and the other rights laid down in Article 7 of the Charter, it must be held that, even though the retention of data required (…) constitutes a particularly serious interference with those rights, it is not such as to adversely affect the essence of those rights given that, as follows from Article 1(2) of the directive, the directive does not permit the acquisition of knowledge of the content of the electronic communication as such. 511 See ECJ C-293/12 and C-594/12 cip. 28. 512 See ECJ C-293/12 and C-594/12 cip. 29. 513 See ECJ C-293/12 and C-594/12 cip. 29. 514 See ECJ C-293/12 and C-594/12 cip. 37. 515 See ECJ C-293/12 and C-594/12 cip. 33. 516 See ECJ C-293/12 and C-594/12 cip. 34 and 35. 517 See ECJ C-293/12 and C-594/12 cip. 36. Nor is that retention of data such as to adversely affect the essence of the fundamental right to the protection of personal data enshrined in Article 8 of the Charter, because Article 7 of (…/the Data Retention Directive) provides, in relation to data protection and data security, that, without prejudice to the provisions adopted pursuant to (…/the Data Protection Directive) and (…/the ePrivacy Directive), certain principles of data protection and data security must be respected by (…/service and network providers).
According to those principles, Member States are to ensure that appropriate technical and organisational measures are adopted against accidental or unlawful destruction, accidental loss or alteration of the data.”518 Thus, while the essence of Article 7 ECFR is that nobody else gains access to the content of communications, the essence of Article 8 ECFR requires a minimum set of data protection and data security principles. However, coming to the question of whether the interferences with Article 7 and 8 ECFR are proportionate, the European Court of Justice again interconnects both rights. The Court considered, at first, that the Member States’ margin of discretion implementing the Data Retention Directive into national law is limited and can therefore be strictly reviewed by the Court because “of the important role played by the protection of personal data in the light of the fundamental right to respect for private life and the extent and seriousness of the interference with that right caused by (…/the directive)”.519 It then went on to state that “the fight against serious crime, in particular against organized crime and terrorism (…), however fundamental it may be, does not, in itself, justify a retention measure such as that established by (…/the directive)”.520 The Court stressed that “so far as concerns the right to respect for private life, the protection of that fundamental right requires, according to the Court’s settled case-law, in any event, that derogations and limitations in relation to the protection of personal data must apply only in so far as is strictly necessary (…).”521 518 See ECJ C-293/12 and C-594/12 cip. 39 and 40; Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (ePrivacy Directive). 519 See ECJ C-293/12 and C-594/12 cip. 45 to 48. 520 See ECJ C-293/12 and C-594/12 cip. 51. While it referred, in this respect, only to the right to privacy in Article 7 ECFR, it continued, taking the right to data protection into account: “In that regard, it should be noted that the protection of personal data resulting from the explicit obligation laid down in Article 8(1) of the Charter is especially important for the right to respect for private life enshrined in Article 7 of the Charter.”522 In conclusion, the European Court of Justice tends to refer to the fundamental right to private life if there is a direct effect on the individual’s privacy, such as conclusions to be drawn, based on the collection of the personal data, about “the private lives of the persons whose data has been retained, such as the habits of everyday life.”523 In contrast, the Court rather refers to the right to data protection under Article 8 ECFR if there are no “sufficient safeguards (…) to ensure effective protection (…) against the risk of abuse and against any unlawful access and use of that data.”524 The Court thus appears to focus on the right to private life as protecting against the direct impact of the collection of data on the individual, while focusing on the right to data protection as an instrument protecting against potential threats caused by the storage and potential later usage of the data. In its essence, the European Court of Justice affirmed this differentiation in the subsequent decision of “Schrems vs. Facebook”. In this case, Mr. Schrems, an Austrian resident and national, was a user of the social network Facebook.
When users register on the platform, Facebook concludes a contract with them regulating, amongst other things, the processing of their personal data. This data is transmitted from the subsidiary Facebook Ireland to Facebook Inc. in the USA, and stored there. Mr. Schrems lodged a complaint with the Data Protection Commissioner in Ireland demanding that Facebook Ireland be stopped from transferring his personal data to the USA. He argued, based on Mr. Snowden’s revelations about the processing of personal data by the National Security Agency (NSA), that the level of protection in the USA was not adequate to the level within the European Union and that the data transfer therefore conflicted with the Data Protection Directive. 521 See ECJ C-293/12 and C-594/12 cip. 52; affirmed in the subsequent case of “Digital Rights vs. Ireland”, ECJ C-293/12 and C-594/12, cip. 92. 522 See ECJ C-293/12 and C-594/12 cip. 53. 523 See ECJ C-293/12 and C-594/12 cip. 27; see also Kokott and Sobotta, The distinction between privacy and data protection in the jurisprudence of the CJEU and the ECtHR, p. 224, giving further examples of similar wordings. 524 See ECJ C-293/12 and C-594/12 cip. 66, cf. also ECJ C-92/09 and C-93/09 cip. 52 as well as ECJ C-468/10 and C-469/10, cip. 41. The Data Protection Commissioner refused the complaint. From the Commissioner’s point of view, he was prevented from investigating the facts of Mr. Schrems’ complaint because, amongst other reasons, the European Commission had found in its Decision 2000/520 (the so-called Safe Harbour decision) that the level of data protection in the USA was adequate. Mr. Schrems lodged a claim against this decision of the Commissioner before the High Court of Ireland, which finally referred the case to the European Court of Justice.525 In this decision, the Court took into account, on the one hand, “the important role played by the protection of personal data in the light of the fundamental right to respect for private life”526 and concluded from this that an “interference with the fundamental rights guaranteed by Articles 7 and 8 of the Charter must (…) lay down clear and precise rules governing the scope and application of a measure and imposing minimum safeguards, so that the persons whose personal data is concerned have sufficient guarantees enabling their data to be effectively protected against the risk of abuse and against any unlawful access and use of that data.”527 On the other hand, the Court affirmed a separate infringement of the essence of the fundamental right to private life under Article 7 ECFR because the legislation in question allowed “the public authorities to have access on a generalised basis to the content of electronic communications”.528 (3) Reference to further fundamental rights under Article 7 and/or 8 ECFR In the same cases, the European Court of Justice additionally referred to further fundamental rights, besides the right to private life and the right to data protection under Article 7 and 8 ECFR. This reference to further fundamental rights constitutes a third difference from the decisions of the European Court of Human Rights with respect to the right to private life protected by Article 8 ECHR. 525 See ECJ C-362/14 (Schrems vs. Facebook), cip. 26 to 36. 526 See ECJ C-362/14 (Schrems vs. Facebook), cip. 78. 527 See ECJ C-362/14 (Schrems vs. Facebook), cip. 91. 528 See ECJ C-362/14 (Schrems vs. Facebook), cip. 94.
(a) Which right is used to discuss other fundamental rights? In the case of “Schrems vs. Facebook”, the European Court of Justice pointed, in relation to the right to data protection under Article 8 ECFR, to further fundamental rights besides the right to private life under Article 7 ECFR. In doing so, the European Court of Justice referred, at first, to Article 1, as well as Recitals 2 and 10 of the Data Protection Directive, which state that not only the fundamental rights to private life and data protection under Article 7 and 8, but also all other fundamental rights, are to be protected.529 However, the European Court of Justice makes it clear that this function of data protection instruments referring to all fundamental rights does not only result from secondary law, but also from the fundamental right to data protection under Article 8 ECFR. From its point of view, if the Safe Harbour decision prevented a national data protection commissioner from examining an individual’s claim, these individuals “would be denied the right, guaranteed by Article 8(1) and (3) of the Charter, to lodge with the national supervisory authorities a claim for the purpose of protecting their fundamental rights.”530 The European Court of Justice also examines, in more detail, which further fundamental right comes into question as being supplemented by the rights guaranteed by Article 8 sect. 1 and 3 ECFR. In this case, for instance, the Court referred to Article 47 ECFR, stating that “legislation not providing for any possibility for an individual to pursue legal remedies in order to have access to personal data relating to him, or to obtain the rectification or erasure of such data, does not respect the essence of the fundamental right to effective judicial protection”.531 The European Court of Justice also took, in its preceding decision of “Digital Rights vs. Ireland”, further fundamental rights into account. Indeed, the European Court of Justice discussed the fundamental right to freedom of expression provided for by Article 11 ECFR in relation to the right to private life under Article 7 ECFR. In particular, the Court considered the unspecified threat of constant surveillance, as well as the likelihood that individuals limit their communication. Even if the Court did not use these considerations in order to determine the scope of Article 7 ECFR, it referred to them in order to determine the intensity of the infringement.532 529 See ECJ C-362/14 (Schrems vs. Facebook), cip. 39. 530 See ECJ C-362/14 (Schrems vs. Facebook), cip. 58 as well as 56. 531 See ECJ C-362/14 (Schrems vs. Facebook), cip. 95. However, the reason for the different attribution of further fundamental rights, on the one hand, to the right to private life and, on the other hand, to the right to data protection, appears to lie in the different type of threat: As analyzed before, the European Court of Justice tends to refer to the right to private life if the collection of personal data leads to a direct effect on the individual’s privacy.533 Or, as Advocate General Cruz Villalón put it in his Opinion on the case of “Digital Rights vs. Ireland”: In this case, “it is not the processing of the data retained, (..)
in terms of the manner in which they are used (…), which requires the utmost vigilance, but the actual collection and retention of the data at issue, as well as the data’s impact on the right to privacy”.534 The reason for this is that these “are data which, qualitatively, relate essentially to private life, to the confidentiality of private life (…). The issue which arises in such cases is not yet that of the guarantees relating to data processing but, at an earlier stage, that of the data as such, that is to say, the fact that it has been possible to record the circumstances of a person’s private life in the form of data, data which can consequently be subject to information processing.”535 Thus, the deterrent effect of this kind of data collection on the exercise of the freedom of expression “would be merely a collateral consequence of interference with the right to privacy”.536 In contrast, the European Court of Justice tends to refer to the right to data protection if the threat results from the storage and later use of the data retained rather than from the collection per se.537 (b) The answer depends on the type of threat posed Indeed, the preceding decisions do not definitely clarify under which circumstances the reference to further fundamental rights should be related to Article 7 and to Article 8 ECFR. 532 See ECJ C-293/12 and C-594/12 cip. 37 referring to Opinion of Advocate General Cruz Villalón delivered on 12 December 2013 on Case C‑293/12, cip. 52. 533 See above under point C. I. 3. c) aa) (2) (b) Protection against collection, storage, and subsequent risk of abuse. 534 See Opinion of Advocate General Cruz Villalón delivered on 12 December 2013 on Case C‑293/12, cip. 59. 535 See ibid., cip. 65. 536 See ibid., cip. 52. 537 See above under point C. I. 3. c) aa) (2) (b) Protection against collection, storage, and subsequent risk of abuse. However, the idea of referring privacy and/or data protection to further areas of social life protected by other fundamental rights already became apparent in an earlier case, which was decided before the European Charter of Fundamental Rights came into force. Thus, at the time of this decision, i.e. the case of “Rechnungshof vs. ORF”, the European Court of Justice still decided on the grounds of the European Convention on Human Rights. It was thus still unclear whether the European Court of Justice would refer to other fundamental rights under the angle of the right to private life protected by Article 7 ECFR or the right to data protection under Article 8 ECFR. In this case, an Austrian law obliged institutions subject to the control of the Austrian Court of Audit to inform the Court of the salaries and pensions of employees that exceeded a certain amount. Several institutions refused to provide the information or provided it without personal data such as the names of the employees concerned.
The Court of Audit insisted on receiving all the information required and, as a consequence, brought an action before the Austrian Constitutional Court, which finally stayed the proceedings and asked the European Court of Justice whether the duty of information provided for by the Austrian law conflicted with Community law, in particular with Article 8 ECHR.538 Before treating the hypothetical question about the fundamental rights angle possibly chosen by the European Court of Justice if the European Charter of Fundamental Rights had already been in force, it is necessary to examine, in more detail, the Court’s reasoning in the case. Referring, here again, to the decisions “Amann vs. Switzerland” and “Rotaru vs. Romania” decided by the European Court of Human Rights, the European Court of Justice stated: “First of all, the collection of data by name relating to an individual’s professional income, with a view to communicating it to third parties, falls within the scope of Article 8 of the Convention.” Subsequently, the Court differentiated, pursuant to the context in which the data was processed, stressing that “while the mere recording by an employer of data by name relating to the remuneration paid to his employees cannot as such constitute an interference with private life, the communication of that data to third parties, in the present case a public authority, infringes the right of the persons concerned to respect for private life, whatever the subsequent use of the information thus communicated, and constitutes an interference within the meaning of Article 8 of the Convention.”539 538 See ECJ C-465/00, C-138/01 and C-139/01 (Rechnungshof vs. ORF), cip. 3, 18 to 21, and 48. Examining the intensity of the infringement, the Court took into consideration that the individuals concerned by the disclosure of the information required “may suffer harm as a result of the negative effects of the publicity attached to their income from employment, in particular on their prospects of being given employment by other undertakings, whether in Austria or elsewhere, which are not subject to control by the Rechnungshof.”540 The Court concluded from this that the referring Austrian Constitutional Court had to examine whether not only the disclosure of the salaries and pensions exceeding the thresholds defined by the Austrian law, but also the names of the employees concerned, was really necessary and appropriate in order to meet the aim of the law in question.541 In conclusion, the European Court of Justice did not consider each act of data treatment as legally relevant. The collection and processing of personal data by the employer for purposes of payroll accounting did not amount to harm under Article 8 ECHR. In contrast, the transfer of that data for the purpose of its publication did.542 The decision is interesting when compared with the decisions of the European Court of Human Rights: While its conclusion was in line with the concept of protection developed by the European Court of Human Rights, its reasoning was different.
Both Courts principally consider that the publication of personal data infringes the right to private life of the individuals concerned.543 However, once the European Court of Human Rights had affirmed a violation of the right to private life, it did not examine whether or not there was an additional violation of another human right.544 In contrast to this approach, the European Court of Justice also took, at least implicitly, other fundamental rights into account. The Court considered that the publication of the individuals’ salaries in relation to their names could have negative effects on their chances of being given employment by other undertakings.545 539 See ECJ C-465/00, C-138/01 and C-139/01 cip. 73 and 74. 540 See ECJ C-465/00, C-138/01 and C-139/01, cip. 89. 541 See ECJ C-465/00, C-138/01 and C-139/01, cip. 90. 542 See ECJ C-465/00, C-138/01 and C-139/01 cip. 73 and 74. 543 See, on behalf of the European Court of Justice, also ECJ C-92/09 and C-93/09 cip. 58; on behalf of the European Court of Human Rights, ECtHR, Case of Peck vs. the United Kingdom from 28 January 2003 (application no. 44647/98), cip. 61. 544 Cf. ECtHR, Case of Gillan and Quinton vs. the United Kingdom from 12 January 2010 (application no. 4158/05), cip. 88 to 90. Indeed, in the case of “Rotaru vs. Romania”, the European Court of Human Rights also considered that the ‘systematic and permanent storage’ of personal data falls under Article 8 ECHR, especially if the “information concerns a person’s distant past (…) has been declared false and is likely to injure the applicant’s reputation (underlining by the author).”546 However, the individual’s reputation belongs rather to the individual’s ‘psychological or social integrity’ protected by Article 8 ECHR than to another fundamental right. In contrast, the individual’s chances of ‘being employed by another undertaking’ rather fall under a fundamental right related to work. Indeed, when the European Court of Justice decided on the case of “Rechnungshof vs. ORF”, the European Charter of Fundamental Rights was not yet in force. However, the Charter already existed as a draft.547 In light of this, it appears reasonable that the European Court of Justice thought, at least, about the freedom to choose an occupation and the right to engage in work provided for by Article 15 ECFR. Presupposing that the European Charter of Fundamental Rights had already been in force, these considerations may allow the following hypothetical analysis: The fact that the Court considered the later usage of the information, and not the data collection, as legally relevant principally speaks in favor of Article 8 ECFR, which provides the instruments of protection for the right to work. In favor of the right to private life, in contrast, it can be stressed that the publication of the information already creates the risk to the individual’s right to engage in work. In this instance, the Court usually considers the publication as an infringement of the right to private life under Article 7 ECFR in combination with Article 8 ECFR. Therefore, it is also possible that the European Court of Justice would have discussed the freedom to choose an occupation protected by Article 15 ECFR in relation to both the right to data protection and the right to private life.548 In any case, the essential point here is that the concept of referring to the right to engage in work in order to examine the effects of the data processing on the individual concerned can easily be transferred to further fundamental rights of freedom or equality.549 545 See ECJ C-465/00, C-138/01 and C-139/01, cip. 89. 546 See ECtHR, Case of Rotaru vs. Romania from 4 May 2000 (application no.
28341/95), cip. 43 and 44. 547 The decision was delivered on 20th May 2003, while the proclamation of the Charter of Fundamental Rights took place in 2000, retrieved from http://ec.europa.eu/justice/fundamental-rights/charter/index_en.htm. 548 Cf. Tzanou, Data protection as a fundamental right next to privacy? ‘Reconstructing’ a not so new right, pp. 94 and 95. (4) Protection in (semi)-public spheres irrespective of ‘reasonable expectations’? Another difference between the decisions of the European Court of Justice and the European Court of Human Rights concerns the mechanism of the individual’s “reasonable expectations” when determining the scope of protection of the fundamental rights. This mechanism was already mentioned, briefly, with respect to the case of “Mr. González vs. Google Spain”.550 By conducting a thought experiment, the following question was raised: whether the European Court of Human Rights would have come to the same result as the European Court of Justice, or a different one, if it had referred to Mr. González’ “reasonable expectations”. This decision was based on both the right to private life and the right to data protection under Articles 7 and 8 ECFR. The same thought experiment conducted in “González vs. Google Spain” will now also be applied to the three following cases of “Telekom vs. Germany”, “SABAM vs. Scarlet”, and “SABAM vs. Netlog”, where personal data was also already published, at least, in (semi)-public spheres. In these cases, the European Court of Justice referred only to Article 8 ECFR.551 In the case of “Telekom vs. Germany”, the European Court of Justice does not explain why it refers only to the right to data protection under Article 8 ECFR. One reason might be that the personal data in question was already made publicly available so that the second publication of the personal data simply in another directory did not reveal any more aspects of the individual’s private life.552 549 Cf. Britz, Europeanisation of Data Protection Provided for by Fundamental Rights?, p. 11; De Hert and Gutwirth, Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action, p. 44. 550 See above under point C. I. c) aa) (2) (a) Protection against first publication and profiles. 551 See the facts of these cases above under point C.I. 1. b) aa) (2) (b) The right to data protection under Article 8 ECFR and/or the right to private life under Article 7 ECFR. Another reason might be that the decision depended on the individual’s consent, which is explicitly foreseen under Article 8 ECFR, and not under Article 7 ECFR. Indeed, in the subsequent cases of “SABAM vs. Scarlet” and “SABAM vs. Netlog”, the Court equally referred only to Article 8 ECFR even if, here, the consent of the individuals did not play a role. Therefore, regarding the cases of “SABAM vs. Scarlet” and “SABAM vs. Netlog”, the reason might be that the Court implicitly considered that the personal data identifying the individuals concerned was already public, at least, within the sharing communities, so that the filtering of the data did not reveal information about their private life. If we were to suppose that this consideration is correct, the decisions in “SABAM vs. Scarlet” and “SABAM vs.
Netlog” appear to deviate from the principles developed by the European Court of Human Rights. The European Court of Human Rights usually refers, if the data is collected in public spheres, to the individual’s “reasonable expectations”. If the data controller reveals the real purpose of the processing, the individual concerned is principally able to avoid the processing for this purpose by not entering the sphere where the data is collected: the purpose recognizable for the individual concerned frames his or her “reasonable expectations”.553 In contrast, the European Court of Justice does not refer, so far, to the individual’s “reasonable expectations”. This observation is interesting in light of the same thought experiment as conducted with respect to the decision of “Mr. González vs. Google Spain”: In the cases of “SABAM vs. Scarlet” and “SABAM vs. Netlog”, the filtering systems would probably not have infringed the users’ right to private life under Article 8 ECHR if the Internet access provider and the social network had informed them of the processing and further usage of the data through these systems. This information would thus have framed their expectations. Indeed, such an approach would probably have far-reaching effects for the users and even for the Internet society as a whole. If just the information about the existence and purpose of the filtering system excluded an infringement of the fundamental right, most Internet access providers and social networks would likely start to filter the information in order to avoid damage claims by copyright holders for the copyright infringements conducted by the users.554 552 Cf. Opinion of Advocate General Cruz Villalón delivered on 12 December 2013 on Case C‑293/12, cip. 65. 553 Cf. above under point C. I. 3. b) cc) Particular reference to the individual’s “reasonable expectations”; cf. Kokott and Sobotta, The distinction between privacy and data protection in the jurisprudence of the CJEU and the ECtHR, p. 227, who argue, in a similar way, with respect to the decision of “González vs. Google Spain”. Therefore, the European Court of Justice potentially had the same reasoning as the German Constitutional Court in mind, considering a negative impact on the users ‘becoming an object of copyright enforcement which adds to their general risk of being unreasonably suspected’.555 Supposing that all Internet access and social network providers installed such systems, it might, furthermore, be questionable whether or not the users really had a choice of avoiding the treatment of ‘their’ data by these systems. Indeed, in light of the reasoning given by the European Court of Human Rights in “Gillan and Quinton vs. The United Kingdom”, a rather liberal approach has been applied. In this case, the Court considered, as stressed before, that the individuals concerned by the airplane access control could avoid it by choosing not to travel by plane.556 Given this, Internet users equally have a choice of not using Internet access services or social networks, respectively, or, at least, of not sharing content through these services. Like air travellers who could choose to travel by train or by boat, Internet users could use, instead, classic means of communication such as postal services. The European Court of Justice might have foreseen the far-reaching consequences.
If the pure information about the filtering systems excluded an infringement of the Internet users’ “reasonable expectations” and, consequently, of their fundamental right to data protection, there would be no protection against these surveillance measures and the accompanying risk of being unreasonably suspected. It might be for this reason that the European Court of Justice has not referred, so far, to the “reasonable expectations” mechanism in determining the scope of protection of the right to data protection of Article 8 ECFR. In the case of “Mr. González vs. Google Spain”, the same thought experiment was applied. However, the European Court of Justice had the chance to circumvent the question of Mr. González’ “reasonable expectations” – or those of other individuals, who must expect, at least today, that almost everything is re-published on the Internet. 554 Cf. Rouvroy and Poullet, The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy, p. 48. 555 Cf. BVerfG, 3rd of March 2004, 1 BvR 2378/98 (Big Eavesdropping Operation), cip. 227; BVerfG, 4th of April 2006, 1 BvR 518/02 (Dragnet Investigation), cip. 103. 556 See ECtHR, Case of Gillan and Quinton vs. the United Kingdom from 12 January 2010 (application no. 4158/05), cip. 65. The Court was able to avoid this question by referring to the direct impact on the individual concerned; it clearly differentiated between the effects of the publication of the articles as such and the fact that they can be found by means of an Internet search engine. Since the latter effects can be even worse for the claimant than the publication of the articles per se, the Court makes it clear that Article 7 ECFR particularly protects against such profiling, even if the information was known before.557 In contrast, in the cases of “Telekom vs. Germany”, “SABAM vs. Scarlet”, and “SABAM vs. Netlog”, the Court did not refer to such an impact of the data processing on the individuals concerned – and probably could not, because the filtering per se does not constitute a profile and has no comparable impact – but to the right to data protection only. Since all these cases related, at least, to situations in semi-public spheres, the question is why the European Court of Justice did not refer to the users’ “reasonable expectations”. The reason might be that the application of this mechanism would have far too extensive effects on the scope of protection of the fundamental right to data protection overall. Even if it had been possible to deny such expectations in the present cases, the pure reference to this mechanism principally opens a floodgate for legitimizing the processing of personal data in the future: the pure information about the filtering systems can ‘frame’ the individuals’ “reasonable expectations”.558 The Court therefore appears to have used the opportunity to elaborate on the right to data protection as a fundamental right distinct from the right to private life of Article 8 ECHR and, consequently, from Article 7 ECFR. (5) Going beyond the requirement of consent provided for under Article 8 ECHR With respect to the individual’s consent, the decision of “Telekom vs. Germany” reveals another and, so far, final difference from the concept applied by the European Court of Human Rights. 557 See ECJ C-131/12 cip. 87. 558 Cf. Rouvroy and Poullet, The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy, p. 48.
As set out previously, the referring national court asked the European Court of Justice to consider whether the ePrivacy Directive hindered the defendant from transferring personal data for the purpose of, again, publishing it in another directory. The reason for this doubt was that the defendant lacked the individuals’ explicit consent for the transfer and second publication.559 In order to answer this question, the European Court of Justice referred only to Article 8 ECFR and affirmed, implicitly, here again, that the customers’ names and telephone numbers were considered as personal data.560 Referring exclusively to Article 8 ECFR, the Court examined, in more detail, the purpose that essentially determined the extent and function of the individual’s consent. The Court stated that “where a subscriber has consented to the passing of his personal data to a given undertaking with a view to their publication in a public directory of that undertaking, the passing of the same data to another undertaking intending to publish a public directory without renewed consent having been obtained from that subscriber is not capable of substantively impairing the right to protection of personal data, as recognized in Article 8 of the Charter.”561 The Court also clarified which requirements apply to the information to be provided by the private company. It must inform the subscriber, “before the first inclusion of the data in the public directory, of the purpose of that directory and of the fact that those data may be communicated to another telephone service provider and that it is guaranteed that those data will not, once passed on, be used for purposes other than those for which they were collected with a view to their first publication.”562 Even if this decision principally applies the logic of the European Court of Human Rights, it seems to refine the requirement of purpose specification in one aspect: Principally, the European Court of Human Rights considers an unconsented publication of personal data as an infringement of Article 8 ECHR because it usually interferes with the “reasonable expectation” of the individual concerned. However, from the moment the data controller communicates the purpose to the individual, this information frames his or her expectation of how the data will be used and, as a consequence, the processing does not infringe his or her right to private life. 559 See ECJ C-543/09 cip. 19, 20, and 27, and see above the further facts of this case under point C. I. 1. b) aa) (2) (b) The right to data protection under Article 8 ECFR and/or the right to private life under Article 7 ECFR. 560 See ECJ C-543/09 cip. 49 to 54. 561 See ECJ C-543/09 cip. 66. 562 See ECJ C-543/09 cip. 66 and 67. In this regard, it should be stressed that the pure information about the purpose already excludes an interference with the individual’s expectation. The individual does not have to give his or her consent in a particular form. It is sufficient that he or she has an initial choice of avoiding the way the data will be treated and the possibility to refuse it.563 However, the European Court of Justice goes one step beyond this.
In the Court’s judgment, it is not only necessary to inform the individual concerned about the concrete purpose but also ‘of the fact (…) that it is guaranteed that those data will not, once passed on, be used for purposes other than those for which they were collected’. Thus, while the European Court of Human Rights only requires that the data not be factually used at a later stage for other purposes, the European Court of Justice requires that this fact must be explicitly stated in the initial information provided to the individual. Whether this means that the treatment of data infringes the right to data protection under Article 8 ECFR if the information only informs the individual about the positive purposes, but not about the guarantee that the data will not be used for further purposes, must, so far, remain open. 563 Cf., on the one hand, under point C. I. 3 b) dd) “Consent: are individuals given a choice to avoid the processing altogether?”, as well as ECtHR, Case of Gillan and Quinton vs. the United Kingdom from 12 January 2010 (application no. 4158/05), cip. 87; ECtHR, Case of Rotaru vs. Romania from 4 May 2000 (application no. 28341/95), cip. 46; Case of Leander vs. Sweden from 26 March 1987 (application no. 9248/81), cip. 48; Case of Kopp vs. Switzerland from 25 March 1998 (application no. 13/1997/797/1000), cip. 53; Case of Amann vs. Switzerland from 16 February 2000 (application no. 27798/95), cip. 69; and, on the other hand, Article 2 lit. h of the Data Protection Directive stating that “‘the data subject’s consent’ shall mean any freely given specific and informed indication of his wishes by which the data subject signifies his agreement to personal data relating to him being processed” and, finally, § 13 sect. 2 of the German Telemedia Law that states that the consent must be given, at least, in electronic form. bb) Interim conclusion: Article 8 ECFR as a regulation instrument? In conclusion, it became apparent that the European Court of Justice does not strictly apply the principles developed by the European Court of Human Rights with respect to Article 8 ECHR, but instead has started to elaborate on the particularities of the concept of protection provided for by Article 7 and Article 8 ECFR. (1) Location of protection instruments under Article 8 ECFR One important difference is that the European Court of Justice discusses ‘effective protection of the data retained against the risk of abuse and against any unlawful access and use of that data’ not with respect to Article 7 ECFR, which protects, correspondingly to Article 8 ECHR, the right to private life, but under the new right to data protection provided for by Article 8 ECFR.564 The decisions developed by the European Court of Human Rights equally foresee such safeguards against abuse by further usage of the data.565 However, this re-location is not a substantive further development regarding the concept of protection provided for by Article 8 ECHR, but rather a formal change.
With respect to the publication of personal data, the European Court of Justice essentially applies the principles developed by the European Court of Human Rights.566 For example, just as the publication of an individual’s name and salary interferes with Article 8 ECHR, so does, after the European Charter of Fundamental Rights came into force, the publication of an individual’s name and the amount of funding received from the State interfere with Article 7 in combination with Article 8 ECFR.567 However, when it comes to the question of the extent to which consent limits the protection against the publication, the European Court of Justice only refers to Article 8 ECFR. According to these decisions, Article 8 ECFR appears to provide for regulation instruments that are necessary in order to protect, at least, the right to private life under Article 7 ECFR.568 564 See ECJ C-293/12 and C-594/12 cip. 66. 565 See, for example, ECtHR, Case of Z. vs. Finland from 25 February 1997 (application no. 22009/93), cip. 95; ECtHR, Case of M.S. vs. Sweden from 27 August 1997 (74/1996/693/885), cip. 41. 566 See above under point C. I. 3. b) cc) (3) ‘Data relating to private or public matters’, ‘limited use’ and/or ‘made available to the general public’. 567 See, regarding the first case, ECJ C-465/00, C-138/01 and C-139/01 (Rechnungshof vs. ORF), and with respect to the second case, ECJ C-92/09 and C-93/09 (Schecke vs. Land Hessen). (2) Protection going beyond Article 8 ECHR However, this mediating function of the right to data protection of Article 8 ECFR does not mean that its level of protection is lower than that of the right to private life under Article 8 ECHR. In contrast, with respect to the individual’s “reasonable expectations”, the European Court of Justice appears, so far, not to apply the principles developed by the European Court of Human Rights under Article 8 ECHR. In the cases of “SABAM vs. Scarlet” and “SABAM vs. Netlog”, the European Court of Justice confirmed that there was an infringement of the right to data protection under Article 8 ECFR, even though the providers of the Internet access service or the social network, respectively, would be able, in the future, to inform their users about the filtering systems and thus frame the users’ “reasonable expectations”. The Court might have foreseen the future negative effects for the Internet society that the introduction of the “reasonable expectations” mechanism into the concept of protection of Article 8 ECFR would have caused. This mechanism is principally able to open the floodgates for surveillance measures, essentially making Internet users, in the terms of the German Constitutional Court, ‘an object of surveillance that adds to their general risk of being unreasonably suspected’.569 The European Court of Justice might therefore have avoided referring to the individuals’ “reasonable expectations”. Similarly, in the case of “Mr. González vs. Google Spain”, the Court did not explicitly or, at least, not precisely elaborate on the function of the requirement of purpose specification provided for by Article 8 ECFR. It might have implicitly considered that Mr. González could not reasonably expect that Internet search engines would one day make use of the information initially published about him in newspapers. 568 See above under point C. I. 3. c) aa) (2) (b) Protection against collection, storage, and subsequent risk of abuse, referring, for example, to ECJ C-293/12 and C-594/12 cip. 53. 569 Cf.
BVerfG, 3rd of March 2004, 1 BvR 2378/98 (Big Eavesdropping Operation), cip. 227; BVerfG, 4th of April 2006, 1 BvR 518/02 (Dragnet Investigation), cip. 103, and see above under point C. I. 2. d) aa) (2) The proportionality test also takes the use of data at a later stage into account. However, it is doubtful that the Court would deny protection merely because individuals today can expect the profiling of information by Internet search engines. In contrast, in this case, the Court appears to apply a different approach, referring to the individual’s ‘social and/or psychological integrity’ protected by Article 7 ECFR and using the principle of purpose limitation provided for by Article 8 ECFR in order to evaluate the infringement of the right to private life and its justification from a time perspective. These differences between the concept of protection under Article 8 ECHR and under Article 7 and 8 ECFR do not interfere with Article 52 sect. 3 ECFR. Article 52 sect. 3 ECFR states, as stressed before, that this “provision shall not prevent Union law providing more extensive protection.” Following the explanations of the European Charter of Fundamental Rights, the European Court of Justice therefore appears to apply the principles of the European Court of Human Rights when interpreting the corresponding right to private life but elaborates further on the concept of protection under Article 8 ECFR, which is only “based on (…) Article 8 of the ECHR”.570 This development leads to a more extensive protection and becomes particularly apparent if the regulation instruments provided for by Article 8 ECFR serve not only to protect the right to private life of Article 7 ECFR, but also the other fundamental rights to freedom and non-discrimination. This leads to the last important difference between the concept of protection under Article 8 ECHR and that provided for by Article 7 and 8 ECFR. In contrast to the European Court of Human Rights, the European Court of Justice also takes other fundamental rights into account. In the case of “Rechnungshof vs. ORF”, it considered the negative effects for the individuals concerned by the publication of their salaries with respect to the risk of ‘being employed by another undertaking’. Since the European Charter of Fundamental Rights only existed, at the time of this decision, as a draft, the European Court of Justice appears to have, at least, thought about the freedom to choose an occupation and the right to engage in work under Article 15 ECFR. In contrast, by the time of the case of “Digital Rights vs. Ireland”, the Charter of Fundamental Rights was already in force. In this case, the Court explicitly referred to the right to freedom of expression under Article 11 ECFR. 570 See Explanations of the European Charter of Fundamental Rights, 2007/C 303/02; Burgkardt, ibid., p. 348, with further references. The Court considered that the collection and storage of the telecommunications data is likely to lead to a bias in communication. Indeed, the Court took these effects into account in order to determine the intensity of the infringement of the right to private life under Article 7 and not to orient the protection instruments provided for by Article 8 ECFR toward the substantial guarantees endangered by the later usage of the data.
However, the likely reason is that the treatment of personal data in question essentially consisted in the collection, and not in the later usage, of the data. In contrast, in the case of “Schrems vs. Facebook”, the European Court of Justice considered that the rights under Article 8 sect. 1 and 3 ECFR also serve to “lodge (…) a claim for the purpose of protecting their fundamental rights” and, in particular, “the fundamental right to effective judicial protection, as enshrined in Article 47 of the Charter.”571 Whether the European Court of Justice discusses further fundamental rights in relation to the right to private life under Article 7 ECFR or the right to data protection under Article 8 ECFR appears, thus, to depend on the type of threat caused by the data processing.572 Remaining uncertainty about the interplay between Article 7 and 8 ECFR In light of these decisions, there is indeed a tendency by the European Court of Justice to differentiate between Article 7 and Article 8 ECFR in the following way: while Article 8 ECFR rather provides regulation instruments for the treatment of personal data, the right to private life protects a more substantial guarantee. This becomes apparent, for example, in the case of “Digital Rights vs. Ireland”, where the Court states that the Data Retention Directive “does not provide for sufficient safeguards (…) to ensure effective protection (…) against the risk of abuse and against any unlawful access and use of that data”573 and that Article 8 ECFR is, in this regard, “especially important for”574 the right to private life in Article 7 ECFR. 571 See ECJ C-362/14 (Schrems vs. Facebook), cip. 56, 58, and 95. 572 See above under point C. I. 3. c) aa) (3) (b) The answer depends on the type of threat posed. 573 See ECJ C-293/12 and C-594/12 cip. 66; cf. also ECJ C-92/09 and C-93/09 cip. 52 as well as ECJ C-468/10 and C-469/10, cip. 41. 574 See ECJ C-293/12 and C-594/12 cip. 53. However, the Court does not clarify what is actually threatened. It only refers to the causes of threat, i.e. ‘unlawful access and use of (…) data’. The Court only states that Article 8 ECFR is “especially important for”575 the right to private life in Article 7 ECFR. Its precise functioning with respect to this right remains unclear. The problem of such an unclear concept of protection becomes obvious in the case of “González vs. Google Spain”. The European Court of Justice affirmed Mr.
González’ right to require Google Spain to delist him from the search results because the right to private life and to data protection “override, as a rule, not only the economic interest of the operator of the search engine but also the interest of the general public in finding that information.”576 The Court considered that this might exceptionally not be the case “if it appeared, for particular reasons, such as the role played by the data subject in public life, that the interference with his fundamental rights is justified by the preponderant interest of the general public in having (…) access to the information in question.”577 The result of this reasoning is that the European Court of Justice provides the individual concerned, by tying into its definition of personal data in both Articles 7 and 8 ECFR, with a rather comprehensive right to control the social interaction that others have with him or her.578 If the relationship of rule and exception developed by the European Court of Justice in the case of “González vs. Google Spain” generally applies – ‘as a rule’ – to any other situation where personal data is treated, the extent of such a right risks conflicting with the often-repeated statement of the European Court of Justice that this right “is not an absolute right but must be considered in relation to its function in society.”579 One technical reason for this conflict is that the European Court of Justice does not define, unlike the European Court of Human Rights, the scope of protection on a case-by-case basis. Instead, it sets out a general definition, referring to the term “personal data”, for both Articles 7 and 8 ECFR. This difference has far-reaching consequences for the scopes: 575 See ECJ C-293/12 and C-594/12 cip. 53. 576 See ECJ C-131/12, cip. 97. 577 See ECJ C-131/12, cip. 97. 578 See v. Grafenstein and Schulz, The right to be forgotten in data protection law: a search for the concept of protection, pp. 262 to 264; cf. Grimm, Data protection before its refinement, p. 588. 579 See, for example, ECJ C-92/09 and C-93/09, cip. 48. While the European Court of Human Rights is principally free, based on its case-by-case approach, to deny or affirm protection with reference to certain types of cases, the deductive method of the European Court of Justice leads to the situation that any processing of personal data generally falls under the scope of protection.580 580 See above under point C. I. 3. c) aa) (1) General definition of the term ‘personal data’ under Article 7 and 8 ECFR instead of case-by-case approach. Referring to substantial guarantees as a method of interpreting fundamental rights in order to avoid a scope of protection that is too broad and/or too vague A potential solution for this conflict might be not to focus on the term ‘personal data’ as the only criterion for determining the scope of protection of both fundamental rights, but on their substantial guarantees. In order to explain this idea, it is necessary to illustrate in more detail how the scope of protection of a fundamental right can be constructed. Usually, the definition of the scope of protection has two functions. First, the definition determines the threshold of constitutional protection. Courts defining the scope of protection thus have a mechanism at their disposal for deciding whether or not fundamental rights protect individuals against certain acts of others, be it by the State or by private parties. The individual concerned can claim protection against a certain act only if it falls under the scope of a fundamental right.
Secondly, the scope of protection determines which fundamental right is applicable in a particular case. This second issue is paramount with respect to Articles 7 and 8 ECFR. By commonly referring to the term ‘personal data’, the European Court of Justice defines both rights under the same scope of protection. This raises the question of how to distinguish these fundamental rights from each other. The approach referring to a substantial guarantee provided for by fundamental rights provides an alternative method of distinguishing fundamental rights. It is more normative than the method of defining the scope pursuant to certain ontological categories. The latter usually refers to pre-known phenomena as so-called objects of protection, such as ‘family’, ‘privacy’ or ‘personal data’; this method falls short, however, if the object of protection is too broad or too vague. The object of protection of personal data is, as such, a purely ontological category that is both too broad and too vague.581 The reason why the scope is too vague: Difference between data and information The term is too vague, at least, with respect to the legal effects of the treatment of data for the individual concerned. Legal scholars stress, in this regard, the difference between data and information.582 In particular, the German scholars Albers and Britz conclude from this differentiation that it is not data as such, but the information retrieved from data which provides the basis for social interaction.583 Thus, it is not the data but the information that possibly leads to an infringement of fundamental rights. Data are signs stored on physical carriers, be it in analogue form as text, audio or video documents or as digital data retained in memory chips; in order to make sense, they must first be interpreted in accordance with the social context. This interpretation constitutes the information serving as a basis for the social interaction which possibly infringes the fundamental rights of the individual concerned by the treatment of ‘his or her’ data.584 Focusing on the German right to informational self-determination, Britz concludes from this that a concept of protection directly referring to an individual’s right to determine data guarantees something that is not necessary; in contrast, a concept of protection providing for an individual’s right to determine information is not possible. While basic rights can only guarantee the determination of data by individuals, because data as such does not depend on subjective interpretation, data has no direct relevance for constitutional protection. 581 See v. Grafenstein and Schulz, ibid., pp. 254 to 257, with further references; cf. also Dietlein, The Doctrine of Duties of Protection of Basic Rights, pp. 78 to 81, stressing, amongst others, “property”, “marriage and family”, “free press” as well as “free research” as so-called institutional guarantees that cannot be pre-determined pursuant to ontological categories but must be normatively specified by the legislator. 582 See Pombriant, Data, Information and Knowledge – Transformation of data is key, pp. 97 and 98, who adds, furthermore, the third dimension of subjective “knowledge”; Albers, Treatment of personal information and data, cip. 8 to 15; Britz, Informational Self-Determination between Legal Doctrine and Constitutional Case Law, pp. 567 and 568. 583 See Albers, ibid.; Britz, ibid.; Grimm, Data protection before its refinement, p. 586.
584 See Albers, ibid., cip. 8 to 15 and 68; Britz, ibid., pp. 567 and 568; Grimm, ibid., p. 586. Consequently, an individual’s right to dispose of data must mainly be considered as an instrument of protection for specific guarantees provided for (also) by other fundamental rights.585 Albers does not consider the German right to informational self-determination as purely instrumental. However, she particularly criticizes that the concept of protection developed so far by the German Constitutional Court focuses on data instead of information. This leads to a flood of protection instruments that have no substantive object of protection and therefore miss the actual threats caused by the use of context-related information.586 Britz similarly argues that the German Constitutional Court has, in principle, acknowledged the social preconditions of information, quoting the “Decision on Population Census”:587 “The individual does not have a right in the meaning of an absolute and boundless control over ‘his or her’ data; (conceptually), he or she rather has to be considered as a personality developing within the social community who depends on communication. Information constitutes, even if it is related to a person, a picture of social reality that cannot be attributed exclusively to the person concerned. The Basic Law decided (…) that the field of tension between the individual and the community has to be solved in the way that the former is related and bound to the latter.”588 However, Britz considers that the German Court does not actually transpose this reasoning into its concept of protection. Instead, it falls short by affirming an individual’s right to comprehensively determine the disclosure and, even more importantly, the usage of ‘his or her’ personal data.589 The result of this inconsistent concept of protection is that the individual does not merely have certain chances of influencing social interaction but can determine it in a rather comprehensive way.590 585 See Britz, ibid., pp. 567 and 568. 586 See Albers, ibid., cip. 68. 587 See Britz, Informational Self-Determination between Legal Doctrine and Constitutional Case Law, p. 566. 588 See BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83 (Decision on Population Census), cip. 174. 589 See Britz, ibid., p. 567. 590 Cf. Rouvroy and Poullet, The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy, pp. 51 and 52. These considerations apply comparably to the so far ambiguous concept of protection developed by the European Court of Justice. As stressed above, the European Court of Justice also acknowledges that the right to data protection under Article 8 ECFR ‘is not an absolute right but must be considered in relation to its function in society’.591 Despite this assertion, it also essentially affirms, particularly in the case of “González vs. Google Spain”, an individual’s right to comprehensively control the social interaction based on the processing of personal data.
The European Court of Justice does so by affirming that an individual who is concerned by the processing of ‘his or her personal data’ has a right which, as a rule, supersedes the opposing fundamental rights of others using that data. Thus, so long as the term ‘personal data’ serves as the only and common link in order to define the scopes of both the right to private life under Article 7 ECFR and the right to data protection under Article 8 ECFR, it is, at least, too vague to determine the scope of protection of both rights in light of its functioning in society. The reason why the scope is too broad: Increasing digitization in society The vagueness of the term ‘personal data’ additionally results, in combination with the ambiguous concept of protection, in an object of protection that is too broad. The reason for this is that both the right to private life and the right to data protection under Article 7 and 8 ECFR risk, in light of increasing digitization in society, substituting the other fundamental rights more and more. The more digitization extends into different areas of social life, the broader the scope of application of both rights becomes.592 In light of the broad definition of the term ‘personal data’ by the European Court of Justice, both rights ‘concern any information relating to an identified or identifiable individual’.593 591 See above under point C. I. 3. c) bb) (3) Remaining uncertainty about interplay between Article 7 and 8 ECFR, referring, for example, to ECJ C-92/09 and C-93/09, cip. 48. 592 See already v. Grafenstein and Schulz, The right to be forgotten in data protection law: a search for the concept of protection, p. 262. 593 See, for example, ECJ C-92/09 and C-93/09 cip. 52, ECJ C-70/10 cip. 51, and ECJ C-360/10 cip. 49. Given this broad definition, and in light of the increasing digitization, Articles 7 and 8 ECFR apply more and more to any given social interaction. The reason for this is that the diversity of social interaction consists, more and more, in the processing of personal data. Before digitization, in contrast, different areas of social life were covered by the diversity of all fundamental rights. For example, in the “analogue world”, concluding contracts in the private sector actually falls under the private autonomy guaranteed by fundamental rights. The fundamental right to the physical integrity of a person usually covers health-related situations. The freedom to choose an occupation and the right to engage in work principally protects against actions, be it by the State or private parties, hampering the individual in conducting his or her work. Cases of discrimination are normally answered in light of the fundamental rights to non-discrimination.594 Instead, in a digital world, the more digitization penetrates all these different areas of social life, the more comprehensively the rights to private life and to data protection apply, superseding the other fundamental rights. Advantages and challenges: ‘Personal data’ as legal link for a subjective right However, the term ‘personal data’ as an essential link for legal regulation also has advantages. It is information, not data, that provides the basis for social interaction and thus possibly leads to an infringement of fundamental rights. Even if information provides a more direct link for legal instruments regulating informational social interaction, it cannot be the direct reference point of an individual’s subjective right.
Since information builds on data that must be interpreted pursuant to social contexts in order to make sense, the individual to whom the information is related cannot directly refer to it, at least, cannot determine it.595 In contrast, linking the regulation instruments not to information, but to specific data enables an individual to directly enforce his or her subjective right: While the individual cannot determine interpretations of third parties by him or herself, he or she can in- (3) 594 See, for example, Folz, Article 16 ECFR – Freedom to Conduct a Business, cip. 3, and Article 3 ECFR – Freedom to Integrity, cip. 1 to 3, and Article 15 ECFR – Freedom to Work, cip. 4, and Article 21 – Freedom to non-discrimination, cip. 1 to 5. 595 Cf. Albers, ibid., cip. 68. C. The function of the principle of purpose limitation in light of Article 8 ECFR 226 deed determine the disclosure and use of data on which the information is built on.596 In this thesis, this is the legal link that will be taken up in light of the explicit wording of Article 8 sect. 1 ECFR, which states: “Everyone has the right to the protection of personal data concerning him or her.” Indeed, since it is not data, but information that possibly leads to harm or an infringement of fundamental rights, a “right to the protection of personal data” must be understood as just a certain legal link for regulating the use of information.597 At this moment, indeed, the question again is how to avoid that the scope of application of such a protection instrument becomes too broad and vague. With respect to the German right to informational self-determination, the legal scholar Albers therefore promotes a combination of an objective and a subjective regulatory approach: On a first level, the German general personality right shall mainly provide the necessary regulation instruments. These are: First, the objective requirement that data and information is only processed and used in an appropriate and transparent manner; second, an individual’s guarantee that he or she is able being informed of the informational actions related to him or her; and third, an individual’s guarantee that he or she can participate in the informational process, be it through a claim of cease and desist of certain usages of information, of deletion and rectification of certain information or positively influence the information. On a second level, all other German basic rights shall provide the scale determining the contexts for informational protection and, as a consequence, which kind of informational action and, consequently, which kind of informational protection is legally relevant.598 Britz builds upon Albers’ approach proposing a compromise between the two-level concept by Albers and the more subjective approach applied by the German Constitutional Court. As mentioned previously, in order to avoid a scope of protection becoming too broad and vague, Britz advocates that the German right to informational self-determination should be considered, at least partly, as an accessory right, which provides for protection for the other “more specific” constitutional norms.599 Indeed, the 596 Cf. Albers, Treatment of Personal Information and Data, cip. 11; as well as Hoffmann-Riem, Protection of the Confidentiality and Integrity of Information Technological Systems, p. 1010. 597 Cf. Britz, ibid., pp. 573 and 574. 598 See Albers, ibid., cip. 69 to 83. 599 See Britz, ibid., pp. 573 and 574. I. 
Constitutional framework 227 German Constitutional Court actually seeks, already, to determine the right to informational self-determination by referring to other basic rights.600 However, in Britz’ opinion, the German Court does so only when balancing, as a last step of the proportionality assessment, the right with opposing constitutional positions. In contrast, Britz stresses the other basic rights should already determine its scope, thus, as a first step of the assessment.601 So far, this thesis does not decide for one or the other approach. Rather, this thesis seeks to illustrate different ways of how a broad and vague scope of protection, which results from a commingling of the phenomena and terms “data” and “information”, could be avoided. In this regard, however, there is one aspect regarding Britz’ concept that shall be clarified: Even if her considerations are principally correct, she however overlooks that the German Court does not only refer to other basic rights in its balancing exercise, but already before, as a second step of the proportionality assessment, when examining whether or not harm or an infringement exists.602 Indeed, as was stressed before, the Court appears to be reluctant to narrow the scope, at this level. The ambiguity possibly results from the far-reaching effects that the indirect restriction of the scope – by narrowly defining harm or an infringement – has on the concept of protection. The moment where certain acts of usage of personal data do not fall under the scope of application, the Constitutional Court is not able to react to the same with its corresponding regulations.603 600 See above under point C. I. 2. d) Infringement by ‘insight into personality’ and ‘particularity of state interest’, and C. I. 2. e) aa) (2) The proportionality test also takes takes the use of data at a later stage into account. 601 See Britz, ibid., pp. 566 to 568 as well as 573 and 574. 602 See above under point C. I. 2. d) Infringement by ‘insight into personality’ and ‘particularity of state interest’. 603 Cf. above under point C. I. 1. b) bb) (1) The 3-Step-Test: Assessing the defensive and protection function; v. Grafenstein and Schulz, The right to be forgotten in data protection law: a search for the concept of protection, pp. 254 to 257 with further references. C. The function of the principle of purpose limitation in light of Article 8 ECFR 228 Possible consequence: A legal scale provided for by all fundamental rights which determine the regulation instruments under Art. 8 ECFR In conclusion, a concept of protection that refers to data, not to information, in order to provide for an individual’s subjective right bears two risks: Either, it is too vague and broad and, therefore, inefficient; or, a narrow determination of which act constitutes a harm or an infringement restricts the scope and therefore fails, perhaps too early, in providing for protection at all. One solution for this conflict could be to open, first, the scope of application of the fundamental right to data protection at a very early stage. So far, the reference to the term ‘personal data’ indeed opens a broad and vague scope of protection. However, the other fundamental rights of privacy, freedom and non-discrimination could then determine. 
As a second step, which specific data protection instruments are necessary in order to efficiently protect against the threats for the provided substantial guarantees.604 Such a concept serves three advantages compared, at least, to the current concepts of protection: First, it focuses not only on the scope(s) per se which is, so far, mainly determined by the term ‘personal data’, but on the substantial guarantees allowing one more precisely to differentiate between fundamental rights. In this respect, it should be noted that the distinction between the guarantees help not only to see whether an individual’s behavior is principally covered by the scope, but also whether it (e.g. a certain processing of personal data), conflicts with this guarantee and whether or, more precisely, under which conditions it might legitimately limit this fundamental right.605 In light of this normative approach, the right to data protection under Article 8 ECFR could be considered as a regulation instrument serving to protect the substantial guarantees provided for by all the other fundamental rights. In this respect, Article 8 ECFR would not only serve to protect the guarantees to respect for private and family life, home and communications in Article 7 ECFR, but also substantial guarantees provided for by further fundamental rights. This protection function serving all fundamental rights could help avoid the scope of application being too vague and broad. (4) 604 See v. Grafenstein and Schulz, ibid., pp. 260. 605 See v. Grafenstein and Schulz, ibid., pp. 254 and 255. I. Constitutional framework 229 Second, such a concept of protection would avoid the situation where it provides either too much (i.e. ineffective and inefficient) or too little protection. As shown before, it would open the scope of protection at a very early stage but determine its specific protection instruments pursuant to the other fundamental rights. And third, if all fundamental rights provide a scale in order to determine the legal relevance of data processing, Article 8 ECFR is not exclusively linked to privacy.606 Instead, the fundamental right to data protection can equally serve specific rights to freedom and non-discrimination. The fundamental right to data protection hence does not provide a right to informational self-determination with the result that the individual had a ‘right to basically determine by him or herself about the disclosure and the usage of his or her personal data’607. It does not merely focus on the individual’s consent as the main regulation instrument but provides for further regulation instrument for the treatment of personal data constituting a “heading of a set of rights and obligations and limitations to these which are put together as an elaborated system of checks and balances.”608 In conclusion, such a concept of protection corresponds to the different contexts of social life that are endangered by a data treatment and correspondingly protected by the substantial guarantees provided for by all fundamental rights. Regarding Nissenbaum’s context-based approach, all the fundamental rights could thus provide a normative scale in order to determine the context-relative informational norms.609 And as a possible consequence, the diversity of all fundamental rights may also help determine the function of the principle of purpose limitation. 606 Cf. above under point C. I. 2. f) Interim conclusion: Conceptual link between ‘privacy’ and ‘data processing’. 
607 See BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83 (Decision on Population Census), cip. 173; cf. equally BVerfG, 14th of July 1999, 1 BvR 2226/94 (Surveillance of Telecommunications), cip. 136 and BVerfG, 3rd of March 2004, 1 BvR 2378/98 (Big Eavesdropping Operation), cip. 132 and BVerfG, 4th of April 2006, 1 BvR 518/02 (Dragnet Investigation), cip. 64 and BVerfG, 13th of June 2007, 1 BvR 1550/03 (Retrieval of Bank Account Master Data), cip. 63. 608 See Kranenborg, Article 8 – Protection of Personal Data, cip. 8.176. 609 See above under point B. III. 5. Values as a normative scale in order to determine the “contexts” and “purposes”. The requirement of purpose specification and its legal scale The vague and broad nature of the scope of protection of the fundamental right to private life under Article 7 ECFR and/or the fundamental right to data protection under Article 8 ECFR, which was, so far, considered from a theoretical point of view, becomes obvious, in practice, in relation to the requirement of purpose specification. As mentioned in the introduction of this thesis, private entities often have difficulties answering the question of how precisely they have to specify the purpose of their data processing. Neither the decisions of the European Court of Human Rights nor those of the European Court of Justice provide reliable criteria in order to answer this question, even though the purpose plays a central role in secondary and ordinary data protection laws. Therefore, this sub-chapter will analyze how European secondary laws themselves specify purposes of data processing. It will also illustrate how the German legislator transposes the requirements of the European directives into national law. In light of the conceptual differences between European and German laws, the German provisions will then be compared to the concept of protection of the German right to informational self-determination. The idea behind this is that the German legislator perhaps tied into the German concept of protection rather than into that of Article 8 ECFR, since the latter was not yet as developed as the German right. In any case, the comparison will reveal several flaws in the current concepts of protection when applied to the requirement of purpose specification in the private sector. On the basis of these results, this sub-chapter concludes by refining the object and concept of protection of the fundamental right to data protection of Article 8 ECFR with respect to the function of the requirement of purpose specification. Main problem: Precision of purpose specification The following sections will, firstly, illustrate the criteria provided for by the European Court of Human Rights and the European Court of Justice. So far, in fact, there are only a few criteria that help determine the purpose. In light of this, it is necessary to examine which requirements are established by European secondary law and how, in particular, the Article 29 Data Protection Working Party interprets them. The next chapter will examine how the German legislator transposed the requirements provided for by the European directives into German ordinary law. It will become apparent that the German discussion on how to interpret the German requirements refers less to European constitutional law than to the German right to informational self-determination.
Therefore, the criteria developed by the German Constitutional Court in relation to purpose specification assist in providing a deeper understanding of the requirements discussed in German legal literature. However, in light of the Court’s comprehensive body of decisions, this case law might also provide a further source in order to develop criteria for the precision of purpose specification with respect to Articles 7 and 8 ECFR. ECtHR and ECJ: Almost no criteria The European Court of Human Rights does not explicitly deal with the issue of how precise the purpose needs to be in relation to the processing of data. The reason for this is that it does not explicitly require the controller to specify the purpose but, instead, examines the purpose pursued by the controller in order to evaluate an infringement under Article 8 ECHR.610 In doing so, the range of purposes classified by the Court in order to undertake the evaluation is limited. The collection of data intruding into the individuals’ privacy, as well as the purpose of publishing personal data, usually infringes Article 8 ECHR. With regard to the State, the Court has also confirmed that there will be an infringement of Article 8 ECHR if the data is ‘systematically and permanently’ stored. This is the case even if “it contained no sensitive information and had probably never been consulted”.611 However, the limited re-use of data, which was collected and stored for another limited purpose, usually does not infringe Article 8 ECHR. The only exception to this rule is if the later use of data differs considerably from the supposed purpose, thereby interfering with the individual’s ‘reasonable expectation’. From a data controller’s perspective, it might be clear enough how to avoid an infringement of Article 8 ECHR by not intruding into someone’s privacy and not publishing ‘his or her’ personal data. In contrast, a data controller might have difficulties defining which purpose is limited and which one goes beyond an individual’s ‘reasonable expectation’. This might be less the case if the controller already has the intended use of the data in mind at the moment the data is collected. 610 See above the analysis under point C. I. 3. b) cc) Particular reference to the individual’s “reasonable expectations”. 611 See ECtHR, Case of P.G. and J.H. vs. The United Kingdom from 25 September 2001 (application no. 44787/98), cip. 57. Instead, if the controller wants to re-use the data at a later stage, going beyond the initial purpose, the controller might have more difficulties in defining the criteria for its legitimate usage. Applying its case-by-case approach, the European Court of Human Rights does not provide more general criteria in order to determine which purposes and, correspondingly, which acts of usage interfere with the individual’s right to private life.612 The European Court of Justice provides even fewer criteria. Similar to the European Court of Human Rights, the European Court of Justice considers the publication of personal data as infringing the right to private life provided for by Article 7 ECFR with respect to the right to data protection in Article 8 ECFR.613 However, with particular regard to the private sector, even if the Court examines the purpose in more detail in the case of “Telekom vs. Germany”, it does not actually provide any criteria for determining the precision of the purpose in general.
The Court stated that the data controller must, in essence, inform the individual about the publication of the data before its first inclusion in the public directory.614 This case hence refers again only to a publication of the data. Comparably, in the case of “Mr. González vs. Google Spain”, the Court did not precisely examine what the initial purpose of the newspaper publishing the articles and the later purpose of the Internet search engine were and why this resulted in an infringement of Mr. González’ right to private life in Article 7 ECFR combined with Article 8 ECFR.615 With respect to the processing of personal data by the State, the European Court of Justice does not elaborate on precise criteria in order to specify the purpose either. In the case of “Digital Rights vs. Ireland”, the Court examined whether or not the legislator of the Data Retention Directive met the requirement that limitations of the right to data protection, with respect to the protection of the individuals’ private life, must be limited to what is strictly necessary in order to achieve the legislator’s objective. 612 See above under point C. I. 3. b) ee) Conclusion: Assessment of ‘reasonable expectations’ on a case-by-case basis. 613 See ECJ C-465/00, C-138/01 and C-139/01 (Rechnungshof vs. ORF), and ECJ C-92/09 and C-93/09 (Schecke vs. Land Hessen). 614 See ECJ C-543/09 cip. 66 and 67. 615 See above under point C. I. 3. c) aa) (2) (a) Protection against first publication and profiles. In this regard, the Court simply criticized the following failures: first, the directive did not differentiate between the specific crimes in question; second, the directive did not limit the authorities obtaining access to the data in light of their specific tasks; third, it did not require that a control mechanism be put in place prior to accessing the data, for example by a court or another independent public authority. Finally, the directive did not provide any criteria in order to limit the period for which the data could be held to what is strictly necessary for the aim pursued.616 The Court referred to these considerations in the later case of “Schrems vs. Facebook”, stating “that legislation is not limited to what is strictly necessary where it authorises, on a generalised basis, storage of all the personal data (…) without any differentiation, limitation or exception being made in the light of the objective pursued and without an objective criterion being laid down by which to determine the limits of the access of the public authorities to the data, and of its subsequent use, for purposes which are specific, strictly restricted and capable of justifying the interference which both access to that data and its use entail”.617 These considerations do not, in any detail, treat the issue of the degree of precision with which the State has to specify the purpose of the processing of data. Requirements provided for by European secondary law Irrespective of the few criteria provided for by the European Courts, European secondary law (i.e. the Data Protection Directive, the ePrivacy Directive, the Civil Rights Directive, and the upcoming General Data Protection Regulation) foresees a comprehensive system regulating data processing in the private sector, which revolves around the purposes of the processing.
This system serves several goals: The Data Protection Directive generally pursues, on the one hand, the free movement of personal data in the European Single Market and, on the other hand, the protection of individuals in relation to the treatment of ‘their personal data’.618 The ePrivacy Directive establishes further requirements with respect to personal data processed by means of information and communication technologies (ICT), in particular, Internet and electronic messaging services. 616 See ECJ C-293/12 and C-594/12 cip. 56 to 64. 617 See ECJ C-362/14 (Schrems vs. Facebook), cip. 92 and 93. 618 Regarding the Data Protection Directive, Ehmann/Helfrich, EU Data Protection Directive, Introduction, cip. 4. The Civil Rights Directive finally amended several provisions of the ePrivacy Directive. It reacted to technological development, particularly with respect to “new applications based on devices for data collection and identification, which could be contactless devices using radio frequencies” such as Radio Frequency Identification Devices (RFIDs).619 Finally, the General Data Protection Regulation, which shall apply, pursuant to its Article 99, from the 25th of May 2018, will replace the Data Protection Directive and be directly applicable in all EU Member States. Pursuant to these laws, the processing of personal data must comply with certain principles and requirements in order to be lawful. In particular, the data controller must fulfil the following two requirements together: first, the processing must be based either on the individual’s consent or on an authorizing law. The general prohibition on processing personal data therefore applies not only to the public but also to the private sector.620 Second, Article 6 sect. 1 lit. b of the Data Protection Directive and Article 5 sect. 1 lit. b of the General Data Protection Regulation require that personal data must be “collected for specified, explicit and legitimate purposes”. In the subsequent chapters, we will review: first, the role of this requirement within the current legal framework in relation to data protection; second, the criteria discussed in order to specify the purpose; and finally, the purposes specified within the laws themselves. Central role of purpose specification within the legal system In relation to European Data Protection Law, the specification of the purpose plays a decisive role. Amongst several other factors, it determines the scope of application of the applicable laws, and which entity is legally responsible for applying the laws (i.e. who is the ‘controller’ and who is the ‘processor’). 619 See recital 56 of the Civil Rights Directive. 620 See, regarding Article 7 of the Data Protection Directive, Ehmann/Helfrich, ibid., Art. 7, cip. 1; Dammann/Simitis, EU Data Protection Directive, Art. 7, Explanations sect. 1, and regarding Article 6 GDPR, Härting, Data Protection Regulation: The new data protection law in operational practice, cip. 318. Scope of protection: ‘Personal data’ The definition of the term ‘personal data’ plays an essential role because it determines the scope of application. Article 2 lit. a of the Data Protection Directive, and Article 4 sect.
1 of the General Data Protection Regulation, essentially define the term ‘personal data’ as “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity (bold words added in the General Data Protection Regulation)”. ‘All the means reasonably likely to be used’ Recital 26 of the Data Protection Directive further clarifies that in order “to determine whether a person is identifiable, account should be taken of all the means likely reasonably to be used either by the controller or by any other person to identify the said person”. In its recital 26, the General Data Protection Regulation ties into these considerations (sent. 3) and adds (sent. 4): “To ascertain whether means are reasonably likely to be used to identify the natural person, account should be taken of all objective factors, such as the costs of and the amount of time required for identification, taking into consideration the available technology at the time of the processing and technological developments.” Example: IP addresses as ‘personal data’? One prominent example of this assessment concerns the question of whether IP addresses constitute personal data or not; the same question arises with respect to ‘unique device identifiers’ (UID or UDID) used for portable devices, and ‘media access control’ (MAC) addresses used for network technologies such as Ethernet and Wi-Fi.621 621 See, for example, Schreibauer, Federal Data Protection Law and further Provisions, § 11 TMG, cip. 4. In relation to IP addresses, the prevailing opinion considers static IP addresses as ‘personal data’, as long as they relate to natural persons. The reason for this is that the individuals behind the static addresses can always be identified by means of WHOIS search requests, for example on www.ripe.net. This opinion leads to the situation that IP addresses under the new Internet protocol IPv6 are automatically ‘personal data’ because, with IPv6, each device receives one single address. In light of the sheer amount of addresses available through the implementation of IPv6 (in contrast to IPv4, whose 32-bit address space provides for approximately 4.3 billion addresses, the 128-bit address space of IPv6 provides around 340 sextillion addresses622), critics argue that the relation of an IP address to a natural person becomes so complex that IP addresses under IPv6 should be considered as anonymized data.623 However, with respect to IPv4, which is still mainly used, IP addresses are related to individuals or, more precisely, to the devices used by individuals, not statically but dynamically, that is, only for a certain period of time. Indeed, some legal scholars advocate a rather strict approach: as long as it is theoretically possible to identify the individual, IP addresses must be considered as ‘personal data’. In contrast, other legal scholars argue that IP addresses can only be considered as ‘personal data’ if the data controller is itself able to identify the individual using the address.624 The European Court of Justice stated in the above-illustrated cases of “SABAM vs. Scarlet” and “SABAM vs.
Netlog” that the IP addresses concerned did fall under Article 8 ECFR “because (they) allow those users to be precisely identified.”625 Some legal scholars conclude from this that the European Court of Justice generally considers all IP addresses as ‘personal data’. In contrast, other legal scholars argue that the Court only affirmed the nature of IP addresses as ‘personal data’ because the providers of the Internet access and the social network had the registration data and could only therefore identify the individuals.626 In light of this, the European Court of Justice had indeed not yet answered this question, explicitly – until the case of “Breyer vs. Germany”. 622 See Federal Communications Commission: Internet Protocol Version 6: IPv 6 for Consumers. 623 See Schreibauer, ibid., cip. 5 with further references. 624 See Schreibauer, ibid., cip. 7 and 8, who summarizes the spectrum of opinions, with further references. 625 See ECJ C-70/10 cip. 51 and ECJ C-360/10 cip 49. 626 See Schreibauer, ibid., cip. 9 with further references. II. The requirement of purpose specification and its legal scale 237 The case of “Breyer vs. Germany” In the case of “Breyer vs. Germany”, the entity processing the IP addresses could not identify the users itself. This decision therefore sheds further light on how the Court elaborates on the definition of the scope of application of the Data Protection Directive in light of the right to data protection under Article 8 ECFR. In this case, the referring German Civil Supreme Court asked the European Court of Justice whether IP addresses have to be considered as personal data within the meaning of the Data Protection Directive. Pursuant to the facts of the case, a public agency processed IP addresses of the users of its website. In particular, the agency recorded which IP addresses accessed the website at which time and date in order to guarantee not only the specific but also more general functionality of the website, for instance, in order to prosecute potential cyber attacks against the website in the case of denial-of-service attacks. As stressed before, the public agency providing the website could not identify the user behind the IP address by itself. For identifying the user, the agency had to combine the IP address with further data stored at and by the Internet service provider. 
The question of the referring German court therefore was whether the definition of “personal data” in the Data Protection Directive requires that the public agency itself is able to identify the user or whether it is sufficient that the agency can identify the user through the Internet service provider as a middle-man.627 Referring to recital 26 of the Data Protection Directive, the European Court of Justice affirms that additional information held by an internet service provider can be sufficient in order to identify the individual.628 The Court affirmed, in particular, that the combination of that data is a ‘reasonable means’ because it is not “prohibited by law or practically impossible on account of the fact that it requires a disproportionate effort in terms of time, cost and man-power, so that the risk of identification appears in reality to be insignificant.”629 In this decision the Court explicitly refers to the General Advocate who has stated, in its opinion: “Just as recital 26 refers not to any means which may be used by the controller (in this case, the provider of services on the Internet), but only to those that it is likely ‘reasonably’ to use, the legislature must also be understood as referring to ‘third parties’ who, also in a reasonable manner, may be approached by a (c) 627 See Opinion of Advocate General Campos Sánchez-Bordona, C-582/14, 12th of May 2016, cip. 1 to 10 as well as 79 and 80. 628 See ECJ C-582/14, cip. 40 to 44. 629 See ECJ C-582/14, cip. 46. C. The function of the principle of purpose limitation in light of Article 8 ECFR 238 controller seeking to obtain additional data for the purpose of identification. This will not occur when contact with those third parties is, in fact, very costly in human and economic terms, or practically impossible or prohibited by law. Otherwise, as noted earlier, it would be virtually impossible to discriminate between the various means, since it would always be possible to imagine the hypothetical contingency of a third party who, no matter how inaccessible to the provider of services on the Internet, could — now or in the future — have additional relevant data to assist in the identification of a user.”630 Referring to these considerations, the European Court of Justice came, in the present case, to the conclusion “that, in particular, in the event of cyber attacks legal channels exist so that the online media services provider is able to contact the competent authority, so that the latter can take the steps necessary to obtain that information from the internet service provider and to bring criminal proceedings. Thus, it appears that the online media services provider has the means which may likely reasonably be used in order to identify the data subject, with the assistance of other persons, namely the competent authority and the internet service provider, on the basis of the IP addresses stored."631 In conclusion, this decision applies the same reasoning as considered by the European Commission which has stressed that the processing of the IP address is, in particular, reasonable because it was stored exactly for that purpose to identify the user, in the case of cyber attacks.632 Thus, it would be contradictory not to consider the IP addresses as personal data, albeit they are collected for the purpose to identify the user. The purpose hence plays, here again, an essential role in order to ascertain whether the scope of protection applies or not. 
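The technical core of this identifiability reasoning can be illustrated with a minimal sketch (all data, names and field labels below are hypothetical and are not taken from the judgment): a dynamic IP address in a website operator’s access log identifies nobody on its own, but it becomes identifying once it can be combined with the assignment records held by the internet service provider.

```python
from datetime import datetime

# Hypothetical access log kept by the website operator (the "online media
# services provider" in Breyer): dynamic IP address plus time of access.
access_log = [
    {"ip": "203.0.113.42", "accessed_at": datetime(2016, 10, 19, 14, 3)},
]

# Hypothetical assignment records held only by the internet service provider:
# which subscriber used which dynamic IP address during which period.
isp_assignments = [
    {"ip": "203.0.113.42",
     "assigned_from": datetime(2016, 10, 19, 12, 0),
     "assigned_to": datetime(2016, 10, 19, 18, 0),
     "subscriber": "subscriber-007"},
]

def identify(log_entry, assignments):
    """Return the subscriber behind a logged IP address, if the operator's log
    can be combined with the provider's records for that point in time."""
    for record in assignments:
        if (record["ip"] == log_entry["ip"]
                and record["assigned_from"] <= log_entry["accessed_at"] <= record["assigned_to"]):
            return record["subscriber"]
    return None  # without the provider's additional data, the address identifies nobody

print(identify(access_log[0], isp_assignments))  # -> subscriber-007
```

The legal assessment under recital 26 therefore turns not on whether such a combination is computationally possible, which it trivially is, but on whether obtaining the provider’s additional data constitutes a means ‘likely reasonably to be used’, for instance via the legal channels for prosecuting cyber attacks referred to by the Court.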
However, as the General Advocate correctly stressed, this case concerns a situation where an internet service provider is the middle-man. Thus, it does not refer to other situations where further individuals or entities might be able to identify the user.633 How far these considerations can be transferred to further cases, in particular, in light of 630 See Opinion of Advocate General Campos Sánchez-Bordona, C-582/14, 12th of May 2016, cip. 68. 631 See ECJ C-582/14, cip. 47 and 48. 632 See Opinion of Advocate General Campos Sánchez-Bordona, C-582/14, 12th of May 2016, cip. 38. 633 See Opinion of Advocate General Campos Sánchez-Bordona, C-582/14, 12th of May 2016, cip. 63. II. The requirement of purpose specification and its legal scale 239 the upcoming General Data Protection Regulation must remain, so far, an open question. Liability for ‘data processing’: ‘Controller’ and ‘processor’ In order to determine who is responsible for the data processing, the purpose also plays an essential role. In this regard, it must first be clarified what the term “data processing” means. Pursuant to Article 2 lit. b of the Data Protection Directive, the “‘processing of personal data’ (..) shall mean any operation or set of operations which is performed upon personal data or sets of personal data, whether or not by automatic means, such as collection, recording, organization, structuring, storage, adaptation, or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available or combination, blocking (or restriction), erasure or destruction (bold words in brackets added or changed in Article 4 sect. 2 GDPR)”. Some legal scholars believe that this definition covers as many acts of data processing as possible: For example, even the act of deletion of data or the mere reading of data by an individual falls under the scope of protection.634 In order to determine who is responsible for the processing, Article 2 lit. d of the Data Protection Directive, and Article 4 sect. 7 sent. 1 of the General Data Protection Regulation, define the ’controller’ as “the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data”. This definition implies a dynamic and non-linear understanding regarding the concept of data processing, which results in the situation that different controllers might be involved in one process.635 In contrast to the “controller”, a “processor” essentially is, pursuant to Article 2 lit. e of the Data Protection Directive and Article 4 sect. 8 of the General Data Protection Regulation, a “natural or legal person, public authority, agency or any other, body which processes personal data on behalf (2) 634 See Ehmann/Helfrich, ibid., Art. 2, cip. 27 et seqq; Dammann/Simitis, EU Data Protection Directive, Art. 2 cip. 5 et seqq. 635 Cf. Ehmann/Ehrlich, ibid., cip. 39 et seqq.,; Dammann/Simitis, ibid., cip. 11 et seqq.; see also “Opinion 1/2010 on the concepts of ’controller’ and ’processor’ “ by the Article 29 Data Protection Working Party, p. 12; also affirmed in Article 26 GDPR (‘joint controllers’). C. The function of the principle of purpose limitation in light of Article 8 ECFR 240 of the controller.” This definition implies several aspects: First, amongst other requirements, the controller must contractually bind the processor to its purpose of the data processing. 
The moment the processor itself determines the purposes and means, it becomes a controller and is thus subject to more extensive liability in relation to data protection compliance. This is decisive because, even though the General Data Protection Regulation, in contrast to the Data Protection Directive, stipulates that the processor must adhere to several duties, liability is still much more extensive for the controller than for the processor.636 636 See Härting, ibid., cip. 577 to 584. For example, while the requirement to implement appropriate measures of security-by-design applies to both the controller and the processor, pursuant to Article 32 GDPR, the requirement to implement measures of data protection-by-design provided for by Article 24 GDPR applies only to the controller. In conclusion, the specification of the purpose plays an important role in order to determine the contractual powers of the processor and which legal requirements the controller and/or the processor has to fulfill in order to protect the individual concerned by the data processing. Further legal provisions referring to the purpose There are further requirements provided for by law which also depend on the purpose. For example, the principles of data minimisation and storage limitation, provided for by Article 6 lit. c and e of the Data Protection Directive and Article 5 sect. 1 lit. c and e of the General Data Protection Regulation, require that personal data must be “adequate, relevant and not excessive in relation to the purposes for which they are collected and further processed” and “kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which they were collected or for which they are further processed” (crossed-out words are only mentioned in the directive). The first principle means that the individual concerned must be able, before the data is collected about him or her, to determine whether the collection is relevant with respect to the purposes specified by the controller. From a formalistic point of view, legal scholars admit that the collection of data for the purpose of simply ‘storing’ it would actually be sufficiently relevant. However, since these scholars also pre-suppose that the Data Protection Directive requires a strict purpose limitation, any later usage going beyond the storage would not be allowed.637 Other scholars provide further considerations regarding the terms “adequate” and “excessive”. For example, data collected about an individual’s health or political views is, in principle, not adequate in order to evaluate him or her as a potential employee; and therefore, more generally, the processing of personal data in more detail than is necessary for the purpose is deemed excessive. The second principle adds a time dimension to the first: the moment the purpose is fulfilled, the further storage of personal data is only allowed if it can no longer be related to the individual.
While some legal scholars stress that this requires that the data be completely anonymized,638 others consider that the Member States have to answer this question when transposing the directive into national law.639 In any case, Article 17 of the General Data Protection Regulation essentially builds upon this requirement and establishes an individual’s right to have personal data deleted, amongst other grounds, if the data is no longer necessary in relation to the purposes for which the data was collected or otherwise processed in the first place; this so-called right to be forgotten does not apply, for example, if the processing is necessary for exercising the freedom of expression. The European Court of Justice explicitly referred, in the case of “Mr. González vs. Google Spain”, to these principles without precisely examining, indeed, what the initial and the current purposes were.640 The principle of accuracy under Article 6 lit. d of the Data Protection Directive and Article 5 sect. 1 lit. d of the General Data Protection Regulation states that personal data must be “accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate or incomplete, having regard to the purposes for which they were collected or for which they are further processed, are erased or rectified without delay” (words crossed-out only mentioned in the directive, bold words added in the regulation). Based on this principle, the individual concerned has the right to rectify incorrect data or to complete incomplete data, pursuant to Article 16 of the regulation.

637 Cf. Ehmann/Helfrich, ibid., cip. 24. 638 See Dammann/Simitis, ibid., cip. 17. 639 See Ehmann/Helfrich, ibid., cip. 31. 640 See above under point C. I. 3. c) aa) (2) (a) Protection against first publication and profiles.

Besides these principles, there are further requirements for the “legitimate” processing of personal data and further rights and duties, which refer to the purpose specification requirement. Article 7 lit. a of the Data Protection Directive and Article 6 sect. 1 lit. a of the General Data Protection Regulation state that the processing of personal data is lawful only if the individual concerned has given his or her consent to the processing of his or her personal data for one or more specific purposes. Article 7 lit. f of the Data Protection Directive and Article 6 sect. 1 lit. f of the General Data Protection Regulation authorize the processing of personal data if it “is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data”. Articles 10 lit. b and 11 sect. 1 lit. b of the Data Protection Directive, as well as Article 13 sect. 1 lit. c and sect. 3, and Article 14 sect. 1 lit. c and sect. 4 of the General Data Protection Regulation, require the controller to inform the data subject about the purposes of the processing of his or her data. Article 12 lit. a of the directive and Article 15 sect. 1 lit. a of the regulation provide that an individual also has the right to that information.
Furthermore, the General Data Protection Regulation provides, in its Articles 24 and 32, for the following: „Taking into account the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for the rights and freedoms of natural persons, the controller shall implement appropriate technical and organisational measures to ensure and to be able to demonstrate that processing is performed in accordance with this Regulation. Those measures shall be reviewed and updated where necessary.” The data protection impact assessment required under Article 35 of the regulation also refers to the purpose, providing for the duty of prior consultation of the data protection authority if the assessment reveals a high risk under Article 36. Pursuant to Article 29 sect. 2 of the regulation, a data protection officer must “have due regard to the risk associated with processing operations, taking into account the nature, scope, context and purposes of processing”. The controller’s duty to designate a representative, provided for by Article 27 sect. 2 lit. a of the regulation, also depends on the risks for the individual in light of the purpose of the processing. Finally, the administrative fines foreseen under Article 83 equally refer to the nature, gravity and duration of the infringement taking into account the nature, scope or purpose of the processing. II. The requirement of purpose specification and its legal scale 243 All these requirements refer to the purpose specified by the controller. However, if the data protection laws itself determine the purpose, it is, in principle, not so difficult for the entities processing personal data to fulfill the purpose specification requirement.641 In contrast, if the purpose is not determined by law, the question is how the entities have to specify the purpose (on which, as shown before, all the before-mentioned requirements depend). Criteria discussed for purpose specification Unfortunately, data protection laws do not provide explicit criteria in order to determine how precisely the purposes should be specified.642 With respect to the Data Protection Directive, legal critics stress that the term ‘collected for specified and explicit purposes’ requires that the purpose of the data processing is made explicit to the data subject before its collection. These critics explain this requirement by referring to the legislation process. The European Parliament stated with respect to the first draft of the Data Protection Directive that there must be as much transparency as possible about which data is stored, about whom, and for which purpose; if individuals shall have the right to contest the storage, it must firstly be clear what shall be contested in the first place.643 While some legal scholars advocate further that the purposes must usually be specified in written form644, others stress that this requirement was actually abandoned bb) 641 See, for example: Article 2 sect. 2 lit. b GDPR (material scope of the regulation regarding purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security); Article 4 sect. 
9 GDPR (recipient of personal data); Article 85 (journalistic, academic, artistic or literary purposes); Article 88 GDPR (recruitment purposes and purposes of exercise of rights and benefits related to employment); Article 89 GDPR (archiving, scientific or historical research, or statistical purposes). 642 Cf. regarding the Data Protection Regulation, Härting, ibid., cip. 95. 643 See “Allgemeine Beobachtungen des Berichterstatters” in der Begründung zur Stellungnahme im Bericht des Ausschusses für Recht und Bürgerrechte (Hoon-Report) vom 15. Januar 1992, S. 16: “Es muss die größtmögliche Transparenz darüber bestehen, welche Daten über welche Personen und für welche Zwecke gespeichert werden. Wenn Menschen das Recht erhalten sollen, Einspruch zu erheben, so muss zuerst feststehen, wogegen Einspruch erhoben werden soll.” 644 See Ehmann/Helfrich, ibid., cip. 13.

in the course of the legislation process.645 However, if the purpose is not clear, some critics consider that the controller is not allowed to process the data.646 Regarding the precision of the purpose specified by the controller, scholars provide the example that the purpose must not be so broad that it implicitly includes unlawful sub-purposes.647 And the legal scholars Ehmann and Helfrich quote the European Commission: “A general or vague definition or description of the object of the processing (such as for “commercial purposes”) does not meet the principle of purpose specification required by Article 6 lit. b” of the directive.648

645 See Dammann/Simitis, ibid., cip. 6. 646 See Ehmann/Helfrich, ibid., cip. 13. 647 See Dammann/Simitis, ibid., cip. 7. 648 See Ehmann and Helfrich, EU Data Protection Directive, Article 6 cip. 12, referring to “Geänderter Vorschlag der Kommission, ABl. EG Nr. C 311 v. 27.11.1992, S. 15: “Eine allgemeine oder vage Definition oder Beschreibung des Gegenstandes einer Verarbeitung (beispielsweise “für kommerzielle Zwecke”) entspricht dem Grundsatz der Definition der Zweckbestimmung nach Artikel 6 Buchstabe b nicht.”

Correspondingly, recital 28 of the Data Protection Directive states that “(…) purposes must be explicit and legitimate and must be determined at the time of collection of the data”. The General Data Protection Regulation only slightly liberalizes this approach by changing, pursuant to its recital 39 sent. 6, the “must”-requirement into a “should”-recommendation. None of these considerations effectively help answer the question of how precisely the purpose must be specified.

(1) Preliminary note: Clarifying conceptual (mis)understandings

The Article 29 Data Protection Working Group seeks to provide further guidance in order to determine the requirement to specify the purpose. In its “Opinion 03/2013 on purpose limitation”, the Working Group principally differentiates between the requirements of purpose specification and purpose limitation, even if it intermingles both requirements, conceptually and in the wording, from time to time. For example, while the Group structures the role of the concept of the principle of purpose limitation into a ‘first building block: purpose specification’ and a ‘second building block: compatible use’, it states with respect to Article 8 ECFR that the “Charter clearly establishes the principle
of purpose limitation, specifying that personal data must be processed ‘fairly for specified purposes’.”649 In fact, Article 8 ECFR does not refer to the requirement of purpose limitation, but only to that of purpose specification (at least with respect to its explicit wording). Comparably, the Working Group does not refer, in a precise way, to the case law developed by the European Court of Human Rights with respect to the right to private life under Article 8 ECHR. From its point of view, the approach provided for by Article 8 ECHR “is based on a general prohibition of interference with the right of privacy and allows exceptions only under strictly defined conditions. In cases where there is ‘interference with privacy’ a legal basis is required, as well as the specification of a legitimate purpose as a precondition to assess the necessity of the interference.”650 It adds that “in the course of time, the European Court of Human Rights also developed the test of ‘reasonable expectations of privacy’ to help decide whether there had been an interference with the right to privacy.”651 The Working Group hence appears to consider two debatable aspects: First, that the concept of protection developed by the European Court of Human Rights ‘is based on a general prohibition of’ data processing; and second, that there is a requirement of purpose specification, which functions in order to evaluate, first, the necessity of an infringement of Article 8 ECHR and, second, its justification. In contrast, as shown previously, the European Court of Human Rights mainly examines the purpose pursued by the data controller in order to determine whether there is an infringement at all.652 More importantly: even if the case-by-case approach of the European Court of Human Rights has led to a rather wide scope of protection, it cannot be concluded from its decisions that the right to private life under Article 8 ECHR is based on a general prohibition of data processing.653 This is in particular the case with respect to the private sector, where the Court instead refers to the ‘positive obligations’ of the States to establish safeguards protecting the interests of confidentiality of individuals against a misuse of ‘their’ data by third private parties.654

649 See the Article 29 Data Protection Working Group, Opinion 03/2013 on purpose limitation, with respect to the first aspect, pp. 11 to 12, and, with respect to the second aspect, p. 10. 650 See the Article 29 Data Protection Working Group, Opinion 03/2013 on purpose limitation, p. 7, as well as Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC, p. 6. 651 See the Article 29 Data Protection Working Group, Opinion 03/2013 on purpose limitation, p. 6. 652 See above under point C. I. 3. b) cc) Particular reference to the individual’s “reasonable expectations”.

However, despite these ambiguous considerations, the opinion of the Working Party on the requirement of purpose specification is highly important in order to understand the concept provided for by the European data protection laws. The Working Party felt compelled to elaborate on its opinion in light of the divergent interpretations existing amongst the EU Member States. It stated: “In some countries, specific rules may apply to the public sector.
In others, purposes may sometimes be defined in very broad terms. The approaches in the different Member States also vary as to how the purposes are made explicit, for example, whether specification of purpose is required in the notification to the data protection authority or in the notice to the data subject.”655 Thus, in order to give guidance for a consistent interpretation, the Working Group stresses, at first, the connection between the requirement of purpose specification and related concepts: Transparency, predictability, and user control. In its opinion “there is a strong connection between transparency and purpose specification. When the specified purpose is visible and shared with stakeholders such as data protection authorities and data subjects, safeguards can be fully effective. Transparency ensures predictability and enables user control. (…) If data subjects fully understand the purposes of the processing, they can exercise their rights in the most effective way. For instance, they can object to the processing or request the correction or deletion of their data.”656 Legal opinion on the function of the specification of a purpose Subsequently, the Group elaborates on the meaning and function of the terms ‘specified, explicit and legitimate’ purposes. From its point of view, (2) 653 See above under point C. I. 3. b) ee) Conclusion: Assessment of ‘reasonable expectations’ on a case-by-case basis. 654 See above under point C. I. 1. b) aa) (1) European Convention on Human Rights. 655 See the Article 29 Data Protection Working Group, Opinion 03/0213 on purpose limitation, p. 10. 656 See the Article 29 Data Protection Working Group, ibid., pp. 13 and 14. II. The requirement of purpose specification and its legal scale 247 the requirement to specify the purpose serves to “determine whether data processing complies with the law, and to establish what data protection safeguards should be applied (…/and therefore is) a necessary precondition to identify the specific purpose(s) for which the collection of personal data is required.”657 It adds: “Purpose specification requires an internal assessment carried out by the data controller and is a necessary condition for accountability. It is a key first step that a controller should follow to ensure compliance with applicable data protection law. The controller must identify what the purposes are, and must also document, and be able to demonstrate, that it has carried out this internal assessment.”658 The Working Group also advocates “that the purposes must be specified prior to, and in any event, not later than, the time when the collection of personal data occurs” and “must be detailed enough to determine what kind of processing is and is not included within the specified purpose, and to allow that compliance with the law can be assessed and data protection safeguards applied”.659 The Group concludes from this function that purposes “such as, for instance ‘improving users’ experience’, ‘marketing purposes’, ‘IT-security purposes’ or ‘future research’ will – without more detail – usually not meet the criteria of being ‘specific’.”660 However, it recognizes that “the degree of detail in which a purpose should be specified depends on the particular context in which the data are collected and the personal data involved.”661 With respect to the fact that data is usually processed for several purposes, it states as: “Personal data can be collected for more than one purpose. 
In some cases, these purposes, while distinct, are nevertheless related to some degree. In other cases the purposes may be unrelated. A question that arises here is to what extent the controller should specify each of these distinct purposes separately, and how much additional detail should be provided.”662 With these questions, the Working Group points to the core challenge of the requirement of purpose specification. However, so far, it only provides a method for applying this requirement: “For ‘related’ processing operations, the concept of an overall purpose, under whose umbrella a number of data processing operations take place, can be useful. That said, controllers should avoid identifying only one broad purpose in order to justify various further processing activities which are in fact only remotely related to the actual initial purpose.”663 In conclusion, the Article 29 Data Protection Working Group only provides a rather superficial objective scale in order to determine the degree of precision of a purpose: “Ultimately, in order to ensure compliance with Article 6(1)(b), each separate purpose should be specified in enough detail to be able to assess whether collection of personal data for this purpose complies with the law, and to establish what data protection safeguards to apply.”664

657 See the Article 29 Data Protection Working Group, ibid., p. 13. 658 See the Article 29 Data Protection Working Group, ibid., p. 15. 659 See the Article 29 Data Protection Working Group, ibid., p. 15. 660 See the Article 29 Data Protection Working Group, ibid., p. 16. 661 See the Article 29 Data Protection Working Group, ibid., p. 16. 662 See the Article 29 Data Protection Working Group, ibid., p. 16. 663 See the Article 29 Data Protection Working Group, ibid., p. 16. 664 See the Article 29 Data Protection Working Group, ibid., p. 16.

(3) Legal opinion on the function of ‘making a specified purpose explicit’

The Working Group also elaborates on the meaning and function of the requirement that the specified purpose must be made explicit to the individual. In its opinion, “the purposes of collection must not only be specified in the minds of the persons responsible for data collection. They must also be made explicit. In other words, they must be clearly revealed, explained or expressed in some intelligible form. It follows from the previous analysis that this should not happen later than the time when the collection of personal data occurs. (…) The requirement that the purposes be specified ‘explicitly’ contributes to transparency and predictability. (…) It helps all those processing data on behalf of the controller, as well as data subjects, data protection authorities and other stakeholders, to have a common understanding of how the data can be used. This, in turn, reduces the risk that the data subject’s expectation will differ from the expectations of the controller. In many situations, the requirement also allows data subjects to make informed choices – for example, to deal with a company that uses personal data for a limited set of purposes rather than with a company that uses personal data for a wider variety of purposes.”665 In this regard, the Working Group also stresses how differently Member States transposed this requirement into national laws.

665 See the Article 29 Data Protection Working Group, ibid., p. 17.
While some Member States, often linguistically originating from the Latin family of languages, refer to the requirement in the meaning of ‘unfold, unravel, and explain’, other countries such as Germany or Hungary understand it to mean ‘unambiguous’. This second understanding does not necessarily require that the purpose must be expressed in a certain form.666 However, the Working Group gives examples of how the specified purpose may be made explicit: “Describing the purposes in a notice provided to the data subjects, in a notification provided to the supervisory authority, or internally in the information provided to a data protection officer.”667 It also stresses the function of the requirement with respect to accountability. For example, on the one hand, purposes made explicit in written form or another appropriate documentation help data controllers to verify that they have fulfilled the requirement of purpose specification. On the other hand, such documentation equally helps data subjects to exercise their rights. However, the Working Group clarifies that such documentation might not be necessary in every case. In some cases, it is sufficiently clear for which purpose the controller uses the data.668

666 See the Article 29 Data Protection Working Group, ibid., footnote 42. 667 See the Article 29 Data Protection Working Group, ibid., p. 18. 668 See the Article 29 Data Protection Working Group, ibid., p. 18; cf. the reasoning of the German Constitutional Court, 16th of June 2009, 2 BvR 902/06 (Email Confiscation), cip. 102, illustrated beneath under point C. III. 1. b) bb) (3) Identification marks as control-enhancing mechanisms.

(4) Legal opinion on the reconstruction of a purpose and its legitimacy

The Working Group considers that data processing that does not meet the specification requirements is not automatically unlawful. Instead, “it will be necessary to reconstruct the purposes of processing, keeping in mind the facts of the case. While the publicly specified purpose is the main indicator of what the data processing will actually aim at, it is not an absolute reference: where the purposes are specified inconsistently or the specified purposes do not correspond to reality (for instance in case of a misleading data protection notice), all factual elements, as well as the common understanding and reasonable expectations of the data subjects based on such facts, shall be taken into account to determine the actual purposes.”669

669 See the Article 29 Data Protection Working Group, ibid., p. 18.

Finally, the Working Group states on the legitimacy requirement: “In order for the purposes to be legitimate, the processing must – at all different stages and at all time – be based on at least one of the legal grounds provided for by Article 7 (of the Data Protection Directive). However, the requirement that the purposes must be legitimate is broader than the scope of Article 7. In addition, Article 6(1)(b) also requires that the purposes must be in accordance with all provisions of applicable data protection law, as well as other applicable laws such as employment law, contract law, consumer law, and so on. (…) This includes all forms of written and common law, primary and secondary legislation, municipal decrees, judicial precedents, constitutional principles, fundamental rights, other legal principles, as well as jurisprudence, as such ‘law’ would be interpreted and taken into account by competent courts.
Within the confines of law, other elements such as customs, codes of conduct, codes of ethics, contractual arrangements, and the general context and facts of the case, may also be considered when determining whether a particular purpose is legitimate. This will include the nature of the underlying relationship between the controller and the data subjects, whether it be commercial or otherwise.”670

670 See the Article 29 Data Protection Working Group, ibid., pp. 19 and 20.

cc) Purposes of processing specified when consent is given

In addition to the requirements described, data processing must either be based on the consent of the individual concerned or on an authorizing law. With respect to consent, Article 2 lit. f of the ePrivacy Directive refers to the same requirements as provided for by the Data Protection Directive. Article 2 lit. h of the Data Protection Directive states: “‘The data subject’s consent’ shall mean any freely given specific and informed indication of his wishes by which the data subject signifies his agreement to personal data relating to him being processed.” Regarding the term ‘specific’, the Working Group considers that a “blanket consent without determination of the exact purposes does not meet the threshold.”671 Legal scholars refine these criteria by stressing that the individual must be informed not only about the specific data processing, but also about its consequences. In their opinion, the term ‘specific’ does not exclude future acts of usage but rather means concrete circumstances, including the purpose of the processing. In addition, the question of in how much detail the controller must specify the consequences depends on how intensively the later usage affects the individual’s fundamental rights.672 However, the term ‘specific’ does not reveal, so far, further criteria determining the purposes provided for within the consent.

671 See the Article 29 Data Protection Working Group, ibid., p. 34. 672 See Dammann/Simitis, ibid., cip. 22.

dd) Purposes of data processing authorized by legal provisions

As mentioned previously, the limited criteria set out in order to determine the precision of the purpose are less problematic for the controller (and further entities) if the law itself defines the purpose. The ePrivacy Directive, the Data Protection Directive and the General Data Protection Regulation provide for several provisions authorizing the processing of personal data for specific purposes.

(1) ePrivacy Directive

The ePrivacy Directive provides, in its current version amended by the Civil Rights Directive, several authorizations for the processing of personal data that prevail over the general provisions in the Data Protection Directive. These provisions mainly concern four types of data: 1. ‘Communications and the related traffic data’ and, with a particular view to cookies, ‘information stored in the terminal equipment of a subscriber or user’, Article 5; 2. ‘traffic data relating to subscribers and users processed and stored by the provider of a public communications network or publicly available electronic communication service’, Article 6; 3. ‘location data other than traffic data’, Article 9; and 4. ‘information provided for by electronic calling and communication systems without human intervention (automatic calling machines), facsimile machines (fax) or electronic mail’, Article 13 (‘unsolicited communications’).
Article 2 lit. d defines the term ‘communication’ as “any information exchanged or conveyed between a finite number of parties by means of a publicly available electronic communications service”; pursuant to Article 2 lit. b, the term ‘traffic data’ means “any data processed for the purpose of the conveyance of a communication on an electronic communications network or for the billing thereof”; Article 2 lit. c defines the term ‘location data’ as “any data processed in an electronic communications network, indicating the geographic position of the terminal equipment of a user of a publicly available electronic communications service”; and Article 2 lit. h defines the term ‘electronic mail’ as “any text, voice, sound or image message sent over a public communications network which can be stored in the network or in the recipient’s terminal equipment until it is collected by the recipient.”

Regarding the first type of data, Article 5 of the ePrivacy Directive requires EU Member States to ensure that communications and related traffic data remain confidential. This kind of data may be processed, in the private sector, only under the following conditions: 1. If it is based on the user’s consent (sect. 1 sent. 2); 2. Its storage only if it is necessary for the conveyance of a communication without prejudice to the principle of confidentiality (sect. 1 sent. 3); 3. The recording of communications and related traffic data carried out in the course of lawful business practice for the purpose of evidence of a commercial transaction or of any other business communication if it is legally authorized (sect. 2); and 4. Finally, the storing of information, or the gaining of access to information already stored, in the terminal equipment of a subscriber or user, here again, either on the basis of his or her consent, or for the sole purpose of carrying out the transmission of a communication over an electronic communications network, or if it is strictly necessary for the provider of an Information Society service explicitly requested by the subscriber or user to provide the service (sect. 3).

Regarding the second type, traffic data, Article 6 of the ePrivacy Directive authorizes, in essence, its processing only: 1. If it is made anonymous the moment it is no longer needed for the purpose of the transmission of a communication (sect. 1); 2. For the purposes of subscriber billing and interconnection payments (sect. 2); and 3. For the purposes of marketing electronic communications services or for the provision of value added services, as long as it is necessary for the marketing or service and if the subscriber or the user has given his or her prior consent (sect. 3); in this last respect, Article 2 lit. g of the directive defines the term ‘value added service’ as “any service which requires the processing of traffic data or location data beyond what is necessary for the transmission of a communication or the billing thereof.”

Concerning the third type of data, i.e. location data other than traffic data, the requirements are the strictest: Article 9 of the ePrivacy Directive authorizes its processing only if it is made anonymous or with the consent of the subscribers or users, to the extent and for the duration necessary for the provision of a value added service.
Finally, regarding the fourth type of data, unsolicited communications, Article 13 of the ePrivacy Directive authorizes the use of automated calling machines, fax or email for the purposes of direct marketing only if the subscribers or users have given their prior consent.

(2) Data Protection Directive and General Data Protection Regulation

As far as the prevailing provisions of the ePrivacy Directive do not apply, the Data Protection Directive provides several purposes under which the processing of personal data is justified. In essence, the upcoming General Data Protection Regulation corresponds to these provisions. Irrespective of the processing of special categories of data, Article 7 of the Data Protection Directive, as well as Article 6 sect. 1 of the General Data Protection Regulation, generally authorize the processing of personal data:673 1. If it is necessary for the performance of a contract (lit. b); 2. If it is necessary for compliance with a legal obligation of the data controller (lit. c); 3. If it is necessary in order to protect the vital interests of the individual concerned (lit. d); 4. If it is necessary for a task carried out in the public interest (lit. e); 5. Or, if it is necessary for the purposes of the legitimate interests pursued by the controller or by the third party to whom the data are disclosed, except where such interests are overridden by the individual’s interests for fundamental rights and freedoms which require protection under Article 1 sect. 1 of the directive or Article 1 sect. 2 of the regulation.

673 See Dammann/Simitis, ibid., cip. 4 referring to the Explanation sect. 4.

Pursuant to Article 1 sect. 1 of the directive, Member States transposing the directive into national law are required to protect not only the individual’s right to privacy with respect to the processing of personal data, but also the other fundamental rights and freedoms. And Article 1 sect. 2 of the regulation states: “This Regulation protects fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data.”

(a) Preliminary note: Clarifying conceptual (mis)understandings

The Article 29 Data Protection Working Group also provides in this regard, in its “Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC”, guidance on how to interpret these purposes specified within the law itself. Comparably to its “Opinion 03/2013 on purpose limitation”, the Working Group briefly refers, at first, to the European Convention on Human Rights and the European Charter of Fundamental Rights in order to explain the conceptual background of its recommendations.
With respect to the European Convention on Human Rights, the Working Group is, here again, of the opinion that the approach developed by the European Court of Human Rights “is based on a general prohibition of interference with the right of privacy and allows exceptions only under strictly defined conditions.”674 It adds: “In cases where there is 'interference with privacy' a legal basis is required, as well as the specification of a legitimate purpose as a precondition to assess the necessity of the interference.”675 In the Working Group’s opinion, “this approach explains that the ECHR does not provide for a list of possible legal grounds but concentrates on the necessity of a legal basis, and on the conditions this legal basis should meet.”676

674 See the Article 29 Data Protection Working Group, Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC, p. 6. 675 See the Article 29 Data Protection Working Group, ibid., p. 6. 676 See the Article 29 Data Protection Working Group, ibid., p. 6.

Similarly, the Group refers to the European Charter of Fundamental Rights, stating: “The Charter enshrines the protection of personal data as a fundamental right under Article 8, which is distinct from the respect for private and family life under Article 7. Article 8 lays down the requirement for a legitimate basis for the processing. In particular, it provides that personal data must be processed ‘on the basis of the consent of the person concerned or some other legitimate basis laid down by law’. These provisions reinforce both the importance of the principle of lawfulness and the need for an adequate legal basis for the processing of personal data.”677 The Working Group hence appears to conclude from both the European Convention on Human Rights and the European Charter of Fundamental Rights a general prohibition of the processing of personal data even for the private sector. In any event, it does not address the question of whether these rights have a direct or an indirect effect on private parties processing personal data.678 However, the respectable aim of the Working Group is to “clarify the relationship of the ‘legitimate interests’ ground with the other grounds of lawfulness – e.g. in relation to consent, contracts, tasks of public interest” in order to “contribute to legal certainty”. This is highly creditable since both the Data Protection Directive and the General Data Protection Regulation establish a general prohibition of the processing of personal data, not only for the public but also for the private sector, and the data controller therefore heavily depends on these legitimate grounds.679 In relation to the interplay between consent and the other legal grounds provided for by the directive, though, the Working Group firstly states: “the first ground, Article 7(a), focuses on the self-determination of the data subject as a ground for legitimacy. All other grounds, in contrast, allow processing – subject to safeguards and measures – in situations where, irrespective of consent, it is appropriate and necessary to process the data within a certain context in pursuit of a specific legitimate interest.”680

677 See the Article 29 Data Protection Working Group, ibid., p. 8. 678 Cf. above under points C. I. 1. b) The effects of fundamental rights on the private sector, and C. I. 3.
b) Concept of Article 8 ECHR: Purpose specification as a mechanism for determining the scope of application (i.e. the individual’s ‘reasonable expectation’, and C. I. 3. c) Concept of Articles 7 and 8 ECFR: Ambiguous interplay of scopes going beyond Article 8 ECHR. 679 See the Article 29 Data Protection Working Group, ibid., p. 10. 680 See the Article 29 Data Protection Working Group, ibid., p. 13. C. The function of the principle of purpose limitation in light of Article 8 ECFR 256 Legal opinion on ‘performance of a contract’ Article 7 lit. b of the directive provides, and allows, the processing of certain data which is necessary for the performance of a contract. In relation to a contract that had already existed before the data was processed, the Working Party provides examples about which situations may meet this requirement (i.e. for the ‘performance’ of a contract) and which do not: The profiling of an individual regarding his or her purchase behavior usually does not meet the requirement because the contract most often refers to the delivery of products or services and not to profiling (in the Working Group’s opinion, this is not even the case if the profiling is explicitly mentioned “in the small print of the contract”);681 while “a company-wide internal employee database containing the name, business address, telephone number and email address of all employees, to enable employees to reach their colleagues may in certain situations be considered as necessary”682, “electronic monitoring of employee internet, email or telephone use, or video-surveillance of employees” is more likely not to be necessary for the performance of the employment contract; while formal reminders referring to outstanding contractual obligations usually meet the requirement, the transfer of personal data to external debt collection or lawyers’ companies do not.683 However, other legal grounds such as for the ‘legitimate interests’ might authorize these kinds of data processing.684 Regarding data processing prior to the entering of a contract, these considerations comparably apply: “If an individual requests a retailer to send her an offer for a product, processing for these purposes, such as keeping address details and information on what has been requested, for a limited period of time, will be appropriate”. In contrast, “detailed background checks, for example, processing the data of medical check-ups before an insurance company provides health insurance”, “credit reference checks prior to the grant of a loan” or “direct marketing at the initiative of the retailer/controller” is not necessary for the contract that shall be conclud- (b) 681 See the Article 29 Data Protection Working Group, ibid., p. 17. 682 See the Article 29 Data Protection Working Group, ibid., p. 17. 683 See the Article 29 Data Protection Working Group, ibid., pp. 17 and 18. 684 See the Article 29 Data Protection Working Group, ibid., pp. 17 and 18. II. The requirement of purpose specification and its legal scale 257 ed.685 Of course, again, the ‘legitimate interests’ in Article 7 lit. f of the directive might authorize the data processing.686 Legal opinion on ‘legal obligation’, ‘vital interests’, and ‘public task’ With respect to the other purposes of data processing authorized by Article 7 lit. c to e of the directive, the Working Group provides further guidelines regarding its interpretation of the same. Article 7 lit. c of the directive provides for the processing of personal data in order to fulfill a legal obligation. 
The Working Group regards this as “the data controller must not have a choice whether or not to fulfill the obligation. Voluntary unilateral engagements and public-private partnerships” do not meet this provision. Consequently, “Article 7(c) (only) applies on the basis of legal provisions referring explicitly to the nature and object of the processing. The controller should not have an undue degree of discretion on how to comply with the legal obligation. The legislation may in some cases set only a general objective, while more specific obligations are imposed at a different level, for instance, either in secondary legislation or by a binding decision of a public authority in a concrete case.”687 Article 7 lit. d of the directive authorizes the processing of personal data if it is necessary in order to protect the vital interests of the data subject. In this regard, the Working Party essentially considers this to be as: first, referring to recital 31 of the directive, the term ‘vital interest’ limits the scope only to questions of life and death situations; second, the situation must refer to a specific threat to the individuals life (an abstract threat is not sufficient); and third, the controller is allowed to refer to this legal provision only if it cannot seek consent from the data subject.688 Article 7 lit. e of the directive furthermore authorizes data processing by private parties in relation to a ‘public task’. The Working Party clarifies that this provision particularly becomes relevant if “there is no requirement for the controller to act under a legal obligation”, for example, if the controller becomes aware of a fraud and wants to inform public authorities, even if it is not legally obliged to do so. Here again, the Working Group stresses (c) 685 See the Article 29 Data Protection Working Group, ibid., p. 18. 686 See the Article 29 Data Protection Working Group, ibid., p. 18. 687 See the Article 29 Data Protection Working Group, ibid., pp. 19 and 20. 688 See the Article 29 Data Protection Working Group, ibid., p. 20. C. The function of the principle of purpose limitation in light of Article 8 ECFR 258 that the “official authority or public task will have been typically attributed in statutory laws or other legal regulations. If the processing implies an invasion of privacy or if this is otherwise required under national law to ensure the protection of the individuals concerned, the legal basis should be specific and precise enough in framing the kind of data processing that may be allowed.”689 Legal opinion on ‘legitimate interests’ Finally, Article 7 lit. f of the directive authorizes the data processing which ‘is necessary for the purposes of the legitimate interests pursued by the controller or by the third party or parties to whom the data are disclosed’. In this regard, EU Member States are only allowed to specify these interests but not to broaden or limit the provision.690 Concerning this last authorization, the European Court of Justice came to the conclusion in the case of “ASNEF vs. FECEMD” that the Spanish legislator had not found an adequate balance between the opposing fundamental rights.691 Transposing Article 7 lit. 
f of the Data Protection Directive into Spanish ordinary law, the Spanish legislator had excluded the processing of personal data, which had not been made publically available before, from this provision.692 This general exclusion of this type of data, not yet made publically available, conflicted, in the European Court of Justice’ opinion, with the general clause of Article 7 lit. f. Similarly, in the case of “Breyer vs. Germany”, the German legislator cannot restrict, when transposing this provision into national law, the storage of personal data to such cases where it is necessary for guaranteeing the specific operability of a certain service. In contrast, Article 7 lit. f of the directive may also authorize the storage of that data if it is necessary for the general operability of the service.693 In contrast, in the case of “Mr. González vs. Google Spain”, the Court decided, turning the relationship between rule and exception provided for by Article 7 lit. f on its head, that the fundamental rights to private life and to data protection “override, as a (d) 689 See the Article 29 Data Protection Working Group, ibid., pp. 21 and 22. 690 See Dammann/Simitis, ibid., cip. 2. 691 See ECJ C-468/10 and C-469/10, cip. 43 to 48. 692 See ECJ C-468/10 and C-469/10, cip. 22. 693 See ECJ C-582/14, cip. 50 to 64. II. The requirement of purpose specification and its legal scale 259 rule, not only the economic interest of the operator of the search engine but also the interest of the general public in having access to that information”.694 The Article 29 Data Protection Working Party provides further guidance on how to interpret Article 7 lit. f of the directive. At first, regarding the requirement that the data processing must be ‘necessary’ for the purpose of the legitimate interest, the Working Party states that “this condition complements the requirement of necessity under Article 6 (of the directive), and requires a connection between the processing and the interests pursued. (…) As in other cases, this means that it should be considered whether other less invasive means are available to serve the same end.”695 With respect to the question on how precisely the ‘interest’ must be articulated, the Working Party advocates that there must be “a real and present interest, something that corresponds with current activities or benefits that are expected in the very near future. In other words, interests that are too vague or speculative will not be sufficient.”696 The Working Party also promotes that the term ‘legitimate’ interest can “include a broad range of interests, whether trivial or very compelling, straightforward or more controversial. 
It will then be a second step, when it comes to balancing these interests against the interests and fundamental rights of the data subjects, that a more restricted approach and more substantive analysis should be taken.”697 Consequently, it gives several examples of ‘legitimate interests’: the exercise of the right to freedom of expression or information, including in the media and the arts; conventional direct marketing and other forms of marketing or advertisement; unsolicited non-commercial messages, including for political campaigns or charitable fundraising; enforcement of legal claims including debt collection via out-of-court procedures; prevention of fraud, misuse of services, or money laundering; employee monitoring for safety or management purposes; whistle-blowing schemes; physical security, IT and network security; processing for historical, scientific or statistical purposes; processing for research purposes (including marketing research).

694 See ECJ C-131/12, cip. 92 to 99. 695 See the Article 29 Data Protection Working Group, ibid., p. 29. 696 See the Article 29 Data Protection Working Group, ibid., p. 24. 697 See the Article 29 Data Protection Working Group, ibid., p. 24.

Whatever the specific interest might be, the Working Group stresses that “an interest can be considered as legitimate as long as the controller can pursue this interest in a way that is in accordance with data protection and other laws. In other words, a legitimate interest must be ‘acceptable under the law’.”698 This is in particular the case if the interests are guaranteed by fundamental rights such as: the freedom of expression and information; the freedom of the arts and sciences; the right to access to documents; the right to liberty and security; the freedom of thought, conscience, and religion; the freedom to conduct a business; the right to property; the right to effective remedy and to a fair trial; and the presumption of innocence and right of defense.699 In conclusion, the Working Party provides several examples of legitimate interests: a company’s interest in knowing the ‘needs and desires’ of its customers is principally allowed. In contrast, ‘unduly monitoring’ their online and offline activities is not allowed. Similarly, the combination of vast amounts of data from different sources that were initially collected in other contexts and for different purposes is not allowed. The creation – with the involvement of data brokers as intermediaries – of complex profiles might also not be allowed.700 With respect to the interests of third parties, the Working Party also takes the interests of the public into account. In this regard, it takes into consideration the transparency and accountability of private entities. For example, the salaries of top managers within large corporations might be disclosed; the re-publication of data, for example by the press or, in general, in a more innovative and user-friendly way, is another example. Finally, and in addition, historical or other kinds of research might be a legitimate interest under Article 7 lit. f of the Data Protection Directive.701 Interestingly, the European Court of Justice came, in the cases of “Rechnungshof vs. ORF” and “González vs. Google Spain”, to a result contrasting with the above.
The publication of the salaries of the individuals concerned in combination with their name was not proportionate; and the re-publication of personal data via an Internet search engine did not override the interests of the individual concerned. With respect to the second aspect, the Court indeed considered that this re-publication was very user friendly. However, this did not lead to being deemed a legitimate 698 See the Article 29 Data Protection Working Group, ibid., p. 25. 699 See the Article 29 Data Protection Working Group, ibid., p. 34. 700 Cf. the Article 29 Data Protection Working Group, ibid., p. 26. 701 Cf. the Article 29 Data Protection Working Group, ibid., pp. 27 and 28. II. The requirement of purpose specification and its legal scale 261 interest. Rather, it was the opposite, as it was deemed a particular severe harm for the individual concerned.702 Transposition of the requirement of purpose specification into German law The German legislator transposed these requirements into German ordinary law, as set out in the following sections. In contrast to the two directives on the European Level, which currently apply, in Germany, there essentially are three laws regulating the processing of personal data in the private sector: The Federal Data Protection Law, the Telemedia Law, and the Telecommunication Law. The dispersion of data protection instruments over several laws makes it very difficult to decide which law actually applies. Consequently, the potential interplay of these laws, and with respect to a particular case (whatever that may be), is highly debated in German legal literature.703 In principle, the Federal Data Protection Law provides the basic regulation instruments for any kind of processing of personal data.704 However, pursuant to Article 3 sect. 3 sent. 1 of the Federal Data Protection Law, other, more specific laws must prevail if applicable. This regulation leads to a prevalence of the more specific Telemedia Law and the Telecommunication Law over the Federal Data Protection Law. Telemedia Law and the Telecommunication Law differ from each other in terms of the different services regulated by these laws: While the Telemedia Law applies – correspondingly to Information Society services – to so-called telemedia services, the Telecommunication Law applies to telecommunication services. Article 1 sect. 1 sent. 1 of the Telemedia Law defines the term ‘telemedia’ as “any electronic information and communication service as long as it is not a telecommunication service (…) or a telecommunication-based service in the meaning of the Telecommunication Law c) 702 See above under points C. I. 3. c) aa) (3) (b) The answer depends on the type of threat posed, and C. I. 3. c) aa) (2) (a) Protection against first publication and profiles based on public data. 703 See, for example, Boos et al., Data protection and cloud computing pursuant to the Telecommunication Law, Telemedia Law, and Federal Data Protection Law. 704 See v. Lewinski, Federal Data Protection Law and further Provisions, § 1 BDSG, cip. 31. C. The function of the principle of purpose limitation in light of Article 8 ECFR 262 (…)”.705 Thus, the Telemedia Law partly defines its scope negatively to the Telecommunication Law. Article 3 no. 
24 of the Telecommunication Law, in turn, defines the term ‘telecommunication services’ as “services (…) which totally or mainly consist in the transfer of signals via telecommunication networks”.706 Applying these definitions, legal scholars provide the following examples of telecommunication services: cloud services, as long as they provide the infrastructure but not the hosting as such; email services, as long as they consist in the transport, but not the storage and administration; Voice over IP, Internet VPN, and messaging services, as long as they control the transfer, even if it is based on a third party’s infrastructure.707 In contrast, telemedia services are considered to include, for example: chat rooms, blogs, Internet search engines, online shops, advertising emails, wikis, and online games.708 These principles of the Telemedia Law and the Telecommunication Law could potentially lead to the situation where a provider combines both telemedia and telecommunication services and therefore has to apply all laws simultaneously when offering its services.709 However, all three laws apply the systematic approach of the directives: The processing of personal data, irrespective of the specific type of data, is only allowed on the basis of a legal provision or if the individual provides the necessary consent to the processing.

705 Article 1 sect. 1 sent. 1 TMG states: “Dieses Gesetz gilt für alle elektronischen Informations- und Kommunikationsdienste, soweit sie nicht Telekommunikationsdienste nach § 3 Nr. 24 des Telekommunikationsgesetzes, die ganz in der Übertragung von Signalen über Telekommunikationsnetze bestehen, telekommunikationsgestützte Dienste nach § 3 Nr. 25 des Telekommunikationsgesetzes oder Rundfunk nach § 2 des Rundfunkstaatsvertrages sind (Telemedien).” 706 Article 3 no. 24 TKG states: “Im Sinne dieses Gesetzes ist oder sind ‚Telekommunikationsdienste’ in der Regel gegen Entgelt erbrachte Dienste, die ganz oder überwiegend in der Übertragung von Signalen über Telekommunikationsnetzwerke bestehen, einschließlich Übertragungsdienste in Rundfunknetzen.” 707 See v. Lewinski, ibid., Vor. zu § 88 TKG, cip. 37 to 60 as well as 23. 708 See Schreibauer, Federal Data Protection Law and further Provisions, Vor. zu § 11 TMG, cip. 8 with further references. 709 See v. Lewinski, ibid., cip. 61.

aa) Purposes of processing authorized by the Telecommunication Law

The most restrictive law, the Telecommunication Law, differentiates between three different types of data: personal data in relation to a contract, traffic data, and location data. In comparison to the ePrivacy Directive, the Telecommunication Law authorizes the processing of personal data for specifically listed purposes within these three categories. At first, Article 88 sect. 1 and 2 of the Telecommunication Law generally protects the content of telecommunication, as well as its ‘closer circumstances’, against any kind of processing. Pursuant to sect. 3, providers of telecommunication services are only allowed to process such data: 1. In order to provide the telecommunication service; 2. For safeguarding the protection of the technical system; or 3. For other purposes authorized by legal provisions that explicitly limit the scope of application; such provisions are, amongst others, Article 96 regarding traffic data and, concerning location data, Article 98 of the Telecommunication Law.
In contrast, data collected in relation to a contract does not necessarily fall under Article 88 of the Telecommunication Law because, in principle, it does not directly relate to a specific communication process.710 Article 3 no. 3 of the Telecommunication Law defines the term ’data in relation to a contract’ as “data of a participant (of the telecommunication network) which is collected for the conclusion, alignment, changing, or termination of a contract.” Legal scholars give, as examples of such data: telephone numbers, email addresses, personal names, addresses or birthdays, device numbers, static IP addresses, passwords, or bank account and credit card data. Even if this kind of data does not fall under Article 88 of the Telecommunication Law, its Article 95 regulates its use as follows: 1. Pursuant to sect. 1, this data may, in essence, be used only for the purposes of the contract agreed between the service provider and the participant; 2. Pursuant to sect. 2, for purposes of marketing or market research, the usage of data related to participants of other networks is allowed only on the basis of their consent; 3. In contrast, the provider may use the data related to participants of its own network for purposes of marketing or market research as long as the participants do not object.711

710 See Heun, Federal Data Protection Law and further Provisions, § 88 TKG, cip. 7 to 13, and § 95 cip. 1. 711 See Heun, ibid., § 95 TKG, cip. 8 et seqq.

After the termination of the contract, the service provider must delete the data after a certain period of time, as set out under Article 95 sect. 3 of the Telecommunication Law. Articles 96 and 98 of the Telecommunication Law regulate the processing of traffic and location data. Legal scholars justify the strictness of these provisions by the vast amount of information that this kind of data may reveal and by the particularly high interest that private companies and the police therefore have in such data.712 This is why the provisions authorizing the processing of such data are especially restrictive. Just as Article 6 of the ePrivacy Directive, Article 96 sect. 1 of the Telecommunication Law essentially allows the collection and processing of traffic data only: 1. For the purpose of transferring the telecommunication signals and for billing purposes;713 2. For purposes of marketing, improving the service or providing value added services, and only if the participant belonging to the network has consented to it; since this consent also authorizes the processing of data related to the individuals who are called or contacted, sentence 2 states that their personal data must immediately be anonymized.714 Parallel to the provisions provided for by Article 9 of the ePrivacy Directive, the requirements regarding location data are even stricter: pursuant to Article 98 sect. 1 of the Telecommunication Law, the collection and processing of location data is allowed only in anonymized form or with the participants’ consent. In addition, providers of value-added services collecting location data regarding their participants or users have to inform them about each collection, for example via text messages. Only if the telecommunication service provider uses the location data exclusively to show the participant his or her own location is the text message not required.715 These provisions lead to the result that providers of value-added
712 See Heun, ibid., § 96 cip. 1 and § 98 cip. 1. 713 See Heun, ibid., § 96 sect. 1 cip. 12. 714 See Heun, ibid., cip. 18 to 23. 715 See Heun, ibid., cip. 22 referring to Article 98 sect. 1 sent. 3 of the Telecommunication Law. II. The requirement of purpose specification and its legal scale 265 services most often require the user’s consent because their product or business model needs the user to be identified (e.g. for location based marketing), and so, the data collected cannot be anonymized.716 In addition, if the provider of the value-added service is not the provider of the telecommunication network, but a third party, the consent must be, amongst other requirements, given in writing.717 This results in the situation whereby many value-added services can hardly be offered because the requirement of submitting a written confirmation to use the data enforces users to get in contact with the value-added service provider by post. The only solution for this problem seems to be that the telecommunication service provider includes the purpose of processing for such a value-added service offered by third parties in the contracts with its own participants. Purposes of processing authorized by the Telemedia Law European directives do not regulate data processing in relation with ‘telemedia services’; however, the German legislator decided to nevertheless apply certain regulatory principles for these services.718 The provisions of the German Telemedia Law refer to data processing in relation with a contract (article 14) and to ‘usage data’ (article 15). In contrast, ‘content data’ do not fall under these provisions but under the Federal Data Protection Law. Legal scholars define the term of ‘content data’ as data referring to a transaction where the Telemedia service is not the object of the contract but is only used during the process of agreement. Hence, data referring to a contract that could also have been concluded in the offline world, for example, the online contract of an offline purchase (such as an Amazon or Ebay purchase), are considered to be ‘content data’.719 If ‘content data’ is not at stake, but data in relation to a contract or ‘usage data’ is, providers of Information Society services are allowed to only process this kind of data, pursuant to Article 12, as: bb) 716 See Heun, ibid., cip. 12 to 14. 717 See Heun, ibid., cip. 18 referring to the electronic consent regulated in Article 94 of the Telecommunication Law. 718 See Schreibauer, Federal Data Protection Law and further Provisions, § 11 TMG, cip. 2. 719 See Schreibauer, ibid., cip. 11 with further references. C. The function of the principle of purpose limitation in light of Article 8 ECFR 266 1. On the basis of the user’s consent; 2. Or a provision explicitly limiting the scope of application; in German law, so far, there are only Articles 14 and 15 of the Telemedia Law limiting this scope.720 Article 14 sect. 1 of the Telemedia Law only allows the data processing if it occurs in relation with a contract. Legal scholars consider, comparably to Telecommunication Law, the following data as falling under the provision: names, addresses, email addresses, user names, or passwords.721 The collection and processing of that data is only allowed if it is necessary for the conclusion, alignment, or changing of a contract between the user and the service provider. 
Indeed, the meaning of the term ‘necessary’ is discussed in German legal literature: While some legal scholars require that the data must be necessary for the provision of the service itself, others consider a legitimate interest in the data with respect to the conclusion, alignment, or changing of the contract to be sufficient.722 Article 15 sect. 1 of the Telemedia Law only allows the processing of ‘usage data’ if it is necessary for the provision of the telemedia service or for its billing.723 The provision exemplifies usage data as: identifiers referring to the user; information about the beginning, the end, and the extent of the concrete usage, for example, the time, data volume or downloads; and information about the concrete usage of the services, such as the specific websites visited by the user. The record of which fields the user tapped on the websites and the information provided for by cookies can also be usage data. The definition makes it principally possible that this kind of data simultaneously relates to the contract and the usage. In these cases, both Articles 14 and 15 of the Telemedia Law apply.724 The term ‘necessary’ means, here again, that the provider has a legitimate interest in the data for providing the service. Legal scholars argue that the collection and processing of this kind of data is necessary, at least, if there is no reasonable technical alternative. In order to ascertain whether there is a technical alternative, it must be taken into account whether it is possible to irreversibly anonymize personal data or, at least, to pseudonymize it.725

720 See Schreibauer, ibid., cip. 9.
721 See Schreibauer, ibid., cip. 6 and 7 with further references.
722 See Schreibauer, ibid., cip. 15 with further references.
723 See also the discussion on the extent of security purposes covered by this provision above under point C. II. 1. b) dd) (2) (d) Opinion on ’legitimate interests’, referring to ECJ C-582/14, cip. 50 to 64.
724 See Schreibauer, ibid., cip. 6 to 10 with further references.

Pseudonymization is a technical term which means that the data is separated from the identifier, such as the name or the email address of the person concerned.726 Consequently, as far as IP addresses are considered as personal data, their processing is allowed only if it is necessary for the provision of the service. For example, web tracking including IP addresses is usually not necessary because the provision of the service would also be possible without the tracking. In contrast, session cookies are allowed as long as they serve the user’s progression from one sub-website to another or the purchase process in an online shop. In this last respect, of course, this is only allowed once the user has actually chosen certain products to buy. Comparably, log data including IP addresses or user profiles are not allowed pursuant to Article 15 sect. 1 of the Telemedia Law. However, in these cases either the user’s consent or Article 15 sect. 3 of the Telemedia Law might authorize the processing.727 Article 15 sect. 3 of the Telemedia Law allows the processing of ‘usage data’ for the purposes of advertising, market research and technical improvements of the user experience under the following conditions (see the sketch after this list):
1. First, the data is pseudonymized;
2. Second, the user does not object to the processing of his or her data; and
3. Third, the transfer of such data to third parties is only allowed in anonymized form.
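For readers who want to see these notions in code, the following minimal Python sketch illustrates, under simplifying assumptions, pseudonymization as the separation of usage data from the identifier (kept in a separate lookup table) and a rough check of the three conditions just listed. All field names and helper functions are hypothetical; the sketch does not reproduce the statutory wording or any official guidance.

```python
import hashlib
import secrets

# Minimal, illustrative sketch only: pseudonymization understood as separating the
# usage data from the identifier (e.g. name or email address), which is kept in a
# separate lookup table. Field names and the condition check are hypothetical.

SECRET_SALT = secrets.token_hex(16)  # kept apart from the usage data

def pseudonym_for(identifier: str) -> str:
    """Derive a stable pseudonym; the identifier itself is not stored with the usage data."""
    return hashlib.sha256((SECRET_SALT + identifier).encode()).hexdigest()[:16]

def pseudonymize(record: dict, lookup: dict) -> dict:
    """Split a usage record: the identifier goes into the lookup, the rest keeps only the pseudonym."""
    pseudonym = pseudonym_for(record["email"])
    lookup[pseudonym] = record["email"]          # stored separately from the usage data
    return {"pseudonym": pseudonym,
            "pages_visited": record["pages_visited"],
            "session_length_s": record["session_length_s"]}

def profile_without_consent_ok(pseudonymized: bool, user_objected: bool,
                               third_party_transfer_anonymized: bool) -> bool:
    """Rough rendering of the three conditions listed above (not the statutory wording)."""
    return pseudonymized and not user_objected and third_party_transfer_anonymized

lookup_table: dict = {}
raw = {"email": "alice@example.org", "pages_visited": ["/start", "/shop"], "session_length_s": 312}
profile = pseudonymize(raw, lookup_table)
print(profile)                                            # contains no direct identifier
print(profile_without_consent_ok(True, False, True))      # True under the sketched conditions
```

The essential design point is that the usage record itself no longer contains the direct identifier, while re-identification remains possible only via the separately stored lookup table.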
If these conditions are met, the data controller does not need the user’s consent. In contrast, if these purposes also cover other types of data, such as ‘content data’ or data in relation to a contract, the processing must also be based on the user’s consent. Hence, analytical tools for websites do not require consent if the data are anonymized or, at least, pseudonymized.728 Pursuant to Article 15 sect. 3 sent. 3 of the Telemedia Law, the repseudonymization (i.e. combining the data with the identifier) is forbidden. This is only allowed if the user requires information about the data stored under his or her pseudonym, pursuant to Article 15 sect. 7 of the Telemedia Law in combination with article 34 of the Federal Data Protection Law. 725 See Schreibauer, ibid., cip. 13. 726 Cf. Article 4 no. 5 of the General Data Protection Regulation. 727 See Schreibauer, ibid., cip. 14 to 16. 728 See Schreibauer, ibid., cip. 19 to 22 with further references. C. The function of the principle of purpose limitation in light of Article 8 ECFR 268 Purposes of processing authorized by the Federal Data Protection Law If both the Telecommunication Law and Telemedia Law do not apply, the processing of personal data possibly falls under the scope of the Federal Data Protection Law. The Federal Data Protection law principally differentiates, in contrast to the European Data Protection laws, between the public and private sector.729 However, the principle that data processing is only allowed on the basis of an authorizing law or the individual’s consent applies not only to the public but also to the private sector.730 For the private sector, Article 27 et seqq. of the Federal Data Protection Law provide several of these authorizing provisions. Here, the purpose of the data processing again plays a decisive role because it provides a link for the degree of regulation. The legislator principally differentiates between data processing for the data controller’s ‘own’ purposes and that for third parties. Legal scholars justify this disparity with the different levels of risk posed for the individual. As soon as the data processing does not occur for the controller’s own purpose but for the purpose of third parties, the risk significantly increases. If the data is used for different purposes by third parties, the individual concerned has less overview and can control the later usage of the data less. In light of this, the German regulation applies a stricter approach if the data controller pursues the interest of a third party instead of its own.731 Three basic legitimate grounds Irrespective of legal provisions in other data protection laws and the individual’s consent, the Federal Data Protection Law establishes, in essence, three different legitimate grounds for the processing of personal data as:732 1. The processing occurs in relation with a legal or quasi-legal obligation (article 28 sect. 1 sent. 1 no. 1); cc) (1) 729 See Simitis, Federal Data Protection Law, § 27, cip. 1. 730 See Kramer, Federal Data Protection Law and further provisions, § 28 BDSG, cip. 1. 731 See Simitis, ibid., cip. 4 and 5. 732 See Kramer, ibid., cip. 7. II. The requirement of purpose specification and its legal scale 269 2. The legitimate interests of the data controller are not overridden by the interests of the individual concerned (article 28 sect. 1 sent. 1 no. 2); 3. The data processed is generally accessible (article 28 sect. 1 sent. 1 no. 3); Beside the type of data (e.g. 
special categories of data such as health data regulated under article 28 sect. 6 to 9), the legitimate grounds are more or less limited pursuant to further (more specific) purposes. The Federal Data Protection Law differentiates, in essence, between: the processing of personal data for the controller’s purpose of address trading and marketing (article 28 sect. 3 to 3b); the transfer of data to credit agencies (article 28a); the processing of data for scoring (article 28b); the collection, storage and transfer of data to third parties, in particular, for purposes of marketing, credit agencies, or address trading (article 29); the treatment of data for purposes of market research for third parties (article 30a); the treatment of data for purposes of an employment (article 32); the treatment of data for scientific research (article 40); and the treatment of data for purposes of journalism and literature (article 41). In principle, the data controller is not obliged to explicitly say on which of these legitimate grounds it bases its treatment of data. However, pursuant to article 28 sect. 1 sent. 2, the controller must stipulate, as soon as the data is collected, the purpose of the data processing.733 Before addressing the restrictions or privileges under legislation, in relation to the abovementioned purposes, the next paragraphs will provide a summary about the pre-conditions provided for by the basic legitimate grounds listed above. ‘Performance of a contract’, Article 28 sect. 1 sent. 1 no. 1 BDSG Comparably to the European directives, Article 28 sect. 1 sent. 1 no. 1 of the Federal Data Protection Law states that the data processing is allowed if it is ‘needed to create, carry out or terminate a legal obligation or quasilegal obligation with the data subject’. The reference to ‘legal and quasilegal obligation’ includes not only contracts, but also transactions which do not require a contract, for instance, price competitions or spontaneous (2) 733 See Kramer, ibid., cip. 8. C. The function of the principle of purpose limitation in light of Article 8 ECFR 270 associations.734 With respect to the term ‘needed’, there is an ongoing discussion in German legal literature about what this term actually means. At least, it seems to be common ground that there must be a direct relationship between the processing intended and the concrete purpose of usage.735 However, the following examples might illustrate the difficulties in defining this requirement: While the contracting parties usually disclose their names and contact addresses, the shipping address is undoubtedly necessary for the delivery; in contrast, the complete address is ‘only’ useful, for example, when it comes to enforcement proceedings.736 Another example is the credit assessment before the conclusion of a contract: Since the provider of a product or service could always retain the purchase until full payment, the assessment would not be necessary.737 Therefore, legal scholars stress that the requirement actually leads to a balancing act of the colliding interests of the data subject and the data controller. In so doing, they consider that the interests of the data controller might often prevail the interests of the data subject because the data subject can, in principle, decide whether or not to enter into the contract.738 ‘Justified interests of the controller’, Art. 28 sect. 1 sent. 1 no. 2 BDSG With respect to the second legitimate ground listed above, Article 28 sect. 1 sent. 1 no. 
2 of the Federal Data Protection Law authorizes the processing of personal data “in so far as this is necessary to safeguard justified interests of the controller (…) and there is no reason to assume that the data subject has an overriding legitimate interest in his data being excluded from processing or use”. This provision provides a legitimate basis equivalent to the other legal grounds. However, legal scholars stress that the legal ground provided for by Article 28 sect. 1 sent. 1 no. 2 should usually be examined after the other legal grounds. The reason is that a contract concluded between the data subject and the controller might, for example, reveal interests or even contain explicit clauses of confidentiality which prevail over the general weighing of interests foreseen in article 28 sect. 1 sent. 1 no. 2.739 If this is not the case, the data controller can pursue any interest so long as this interest does not conflict with the law.740 Article 28 sect. 1 sent. 1 no. 2 of the Federal Data Protection Law structures the weighing in two steps: At first, the data processing must be, here again, ‘necessary’ to safeguard the justified interests. If this is the case, the data controller must examine whether there is a ‘reason to assume’ that the individual has an overriding interest in the data controller not using his or her personal data. This means, on the one hand, that the individual’s interest must override the controller’s interest, not the reverse; on the other hand, the moment the data controller assumes that an overriding interest of the individual exists, it cannot base its processing on this provision.741

734 See Kramer, ibid., cip. 39 with further references.
735 See Kramer, ibid., cip. 30 with further references.
736 See Kramer, ibid., cip. 32 with further references.
737 See Kramer, ibid., cip. 50 with further references.
738 See Kramer, ibid., cip. 31 with further references, particularly, to contra opinions.
739 See Kramer, ibid., cip. 57 with further references.
740 See Kramer, ibid., cip. 57 with further references.
741 Cf. Kramer, ibid., cip. 59 and 74 whose methodology is even more complex.

(4) ‘Generally accessible data’, Art. 28 sect. 1 sent. 1 no. 3 BDSG

In contrast to this rather differentiated approach, article 28 sect. 1 sent. 1 no. 3 of the Federal Data Protection Law provides, with respect to generally accessible data, a priority rule when weighing the interests. Article 10 sect. 5 sent. 2 of the Federal Data Protection Law defines the term ‘generally accessible data’ as: “Data which anyone can use, be it with or without prior registration, permission or the payment of a fee.” This is, for instance, the case with respect to data stemming from public registers, such as the population register. Secondly, private databases are generally accessible as long as the access does not depend on an arbitrary decision of the provider. Therefore, for example, the processing of data stemming from public profiles in social networks usually falls under this provision.742 However, there are two restrictions provided for by Article 28 sect. 1 sent. 1 no. 3 of the Federal Data Protection Law: First, the processing must exclusively refer to the type of data that is collected, i.e. the publicly available data. That means that the controller is not allowed, on the basis of this provision, to combine publicly available data with data that is not publicly available.743 Some legal scholars are of the opinion that this provision does not even cover the combination of data stemming from different generally accessible sources.744 Second, this kind of data may be processed on the basis of article 28 sect. 1 sent. 1 no. 3 of the Federal Data Protection Law “unless the data subject's legitimate interest in his data being excluded from processing or use clearly outweighs the justified interest of the controller of the filing system.” Thus, the priority rule does not apply if an ‘objective and impartial observer’ would identify an issue of confidentiality.745 This might be the case, for instance, if information about an individual is stored in online or in offline archives and should, from an objective perspective, be ‘forgotten’; or if information about an individual in public rating portals harms his or her social reputation. This is in particular the case if the individual has objected to any further publication.746

742 See Kramer, ibid., cip. 16 to 20 with further references.
743 See Kramer, ibid., cip. 14; Simitis, Federal Data Protection Law, § 28 cip. 164.
744 See Gola/Schomerus, Federal Data Protection Law, § 28 BDSG cip. 31; Wolff/Brink, Federal Data Protection Law, § 28 BDSG cip. 84; contra opinion by Kramer, ibid., cip. 14.
745 See Kramer, ibid., cip. 23 with further references.
746 See Kramer, ibid., cip. 24 to 27 with further references.

(5) Privileges and restrictions pursuant to the purpose

These three basic legitimate grounds are more or less restricted with respect to the (further) specific purposes described above. The following sections will highlight a few of these specific purposes in order to give an impression of the idea (and the complexity) of the regulatory approach. Article 28 sect. 3 to 4 of the Federal Data Protection Law essentially regulates the processing of data for the purpose of marketing and address trading, on the one hand, in the controller’s interest and, on the other hand, in the interest of a third party. In principle, this kind of processing is only allowed on the basis of the consent of the individual concerned (sect. 3 sent. 1). However, the data controller is allowed to process the data without consent if the following conditions are met (sent. 2): First, the data refers only to the items listed in the provision, amongst them the individual’s name, title, academic degree, profession, address, and year of birth, as well as the name of his or her branch, business, and profession. And second, this data is used only for purposes, first, of marketing the controller’s own offers, second, of marketing in relation to the profession of the individual and under his or her professional address, or third, of marketing for donations. In light of both the restriction (sent. 1) and the privilege (sent. 2), it is decisive to define the term ‘marketing or address trading for own purposes’. If Article 28 sect. 3 to 4 do not apply, the processing may only be based on Article 28 sect. 1 and 2 (the ‘basic legitimate grounds’), Article 30a (‘market research for third parties’), or Article 29 (‘address trading for third parties’). This complicated systematic approach has led to a heated debate within German legal literature. In conclusion, the following ‘rules of thumb’ apply: in terms of marketing, if the controller uses the customer contact primarily in order to sell new products, Article 28 sect. 3 to 4 apply – in contrast, if the customer contact primarily occurs for the service regarding products already sold, the processing falls under Article 28 sect. 1 and 2; in terms of market research, if the market research is conducted for third parties, Article 30a applies – in contrast, the treatment of data for an internal market research falls under Article 28 sect. 1 or 2; and in terms of address trading, this falls under Article 28 sect. 3 to 4 only if it occurs for the purpose of direct marketing – if not, Article 29 applies.747 The transfer of personal data to credit agencies regarding a legal claim is, pursuant to Article 28a of the Federal Data Protection Law, essentially allowed if the obligation is not fulfilled in time and the claim has either (first) been officially verified by a judicial court, (second) been established during the course of an insolvency procedure, or (third) been expressly acknowledged by the individual concerned. For “scoring purposes” (similar to profiling) in relation to the conclusion, execution or termination of a contract, Article 28b of the Federal Data Protection Law essentially establishes the following procedural requirements: First, the data is, pursuant to an established mathematical-statistical method, relevant for calculating the probability of a particular behavior; second, the profiling is not based on address data (such as a home address) alone; and third, if the home address of an individual concerned is used, he or she is informed about the usage before the calculation. Article 29 of the Federal Data Protection Law authorizes the commercial collection and processing of personal data for the purpose of transferring it to third parties under conditions similar to those provided for by Article 28 sect. 1 (‘own business purposes’), namely: first, there is no reason to assume that the individual concerned has an interest in confidentiality; second, the data concerned stem from publicly available sources and the interests of the individual concerned do not prevail; or third, the conditions provided for by Article 28a sect. 1 or 2 (‘transfer to credit agencies’) are met. The treatment of data for purposes of market research for third parties is, pursuant to Article 30a of the Federal Data Protection Law, allowed under conditions similar to those provided for by Article 29 (‘address trading for third parties’). Finally, Article 32 of the Federal Data Protection Law regulates the processing of employees’ data. Section 1 sentence 1 states: “Personal data of an employee may be collected, processed or used for employment-related purposes where necessary for hiring decisions or, after hiring, for carrying out or terminating the employment contract.” Legal scholars essentially discuss, in this regard, the following issues: first, the interplay of this Article with Article 28 sect. 1 and 2 (‘own business purposes’); second, what the term ‘employment-related purposes’ actually means; and third, whether there actually is a difference in the methods by which the colliding interests are weighed against each other.748

747 See Kramer, ibid., cip. 92 to 101 with further references.

Purposes of processing specified when consent is given

Besides all these legitimate grounds provided for by law, the data controller can also base its data processing on the consent of the individual concerned.
Legal scholars stress that the data controller should only seek the individual’s consent, so long as there is no legal provision authorizing the processing of the data. The reason is that the principle of good faith might prohibit the controller to fall back on the legal provisions if the consent is illegal or the individual objects to it.749 If the individual objects to the processing, the data controller only has to stop the processing of the individual’s data. If the consent, as a whole is illegal, the processing itself (from the start) would also be illegal. dd) 748 See Kramer, ibid., cip. 60 to 61 with further references. 749 See Kramer, ibid., cip. 60 to 61 with further references; Gola/Schomerus, ibid., § 4 cip. 16. II. The requirement of purpose specification and its legal scale 275 Not a waiver but execution of right to informational selfdetermination Irrespective of the fact that the requirements for consent provided for by German ordinary law are based on the European directives, German legal scholars refer to the German informational self-determination right when they stress that the consent is not a waiver but a form of execution of German Basic Law.750 They refer, in particular, to the decision of “Release of Confidentiality” which stated, as quoted previously: “The general personality right safeguards that the legal order provides and maintains the legal conditions under which the individual is able to participate in communicational processes in a self-determined way and to develop his/her personality. (…) The contract is the essential instrument in order to develop free and self-responsible actions in relation to third parties. The contract, which mirrors the harmonious will of the contracting parties generally allows the assumption of a fair balance of their interests and must be principally respected by the State.”751 Pursuant to the concept of protection of the German right to informational self-determination, the consent provided, thus is not a waiver, but a form of execution of this right. In contrast, the European Court of Human Rights, appears to consider the individual’s consent as a waiver of a right to private life under Article 8 ECHR. In the decision of “M.S. vs. Sweden”, the Court explicitly dealt with this issue, as quoted previously: “It cannot therefore be inferred from her request that she had waived in an unequivocal manner her right under Article 8 § 1 of the Convention to respect for private life with regard to the medical records at the clinic.”752 Since the right to private life in Article 7 ECFR corresponds to Article 8 ECHR, the European Court of Justice might equally consider the consent as a waiver of fundamental rights, at least with respect to Article 7 ECFR. In this case, the German legislator would have to apply this concept of protection as long as there is no mar- (1) 750 See Kramer, ibid., § 28 BDSG cip. 1 and 2 referring to the quoted decision ”Release of Confidentiality“ by the German Constitutional Court as well as to Simitis, Federal Data Protection Law, § 4a cip. 2. 751 See BVerfG, 1 BvR 2027/02, cip. 33 and 34. 752 See ECtHR, Case of M.S. vs. Sweden from 27 August 1997 (74/1996/693/885), cip. 32. C. 
The function of the principle of purpose limitation in light of Article 8 ECFR 276 gin of discretion transposing the European directives into German law.753 However, the European Court of Justice has not yet decided, at least not explicitly, on this issue.754 Requirements for consent and consequences of its failure With respect to the formal requirements, the Federal Data Protection Law, on the one hand, as well as the Telecommunication Law and the Telemedia Law, on the other hand, provide different requirements for the consent that needs to be given. While Article 4a of the Federal Data Protection Law principally requires the consent in writing, Article 94 of the Telecommunication Law and Article 13 sect. 2 of the Telemedia Law allow the user to also consent in electronic form. These two last-mentioned provisions, hence, avoid the scenario whereby the participants or users of the regulated services, have to change from the online world into the offline world.755 In essence, the consent in electronic form is only legitimate if the service provider meets the following requirements: that the user consents to the processing of ‘his or her’ data explicitly and unambiguously; the consent is documented by the controller; the user is able to always access his or her consent; and the user gets the opportunity to always object the processing. In whatever form the consent is provided for, finally, the question always is: What happens with the consent from a legal perspective if the individual consenting to his or her data processing made a mistake when they initially provided their consent. Most often, legal scholars, as well as judicial courts, consider the consent given as invalid. This may be the case, for instance, if the data controller fooled the individual or did not inform him or her about relevant circumstances of the treatment of data.756 The question thus is closely connected to the information that the data controller has to provide to the individual concerned. In this regard, Ar- (2) 753 See above under point C. I. 1. a) The interplay between European Convention for Human Rights, European Charter of Fundamental Rights and German Basic Rights. 754 See in more detail beneath under point C. IV. 3. b) aa) Consent: “Later processing covered by specified purpose?”. 755 See Schreibauer, ibid., § 12 cip. 10. 756 See Kramer, ibid., § 4a BDSG cip. 12, 13 and 22 with further references to Gola/ Schomerus, ibid., § 4a cip. 22; Plath, ibid., § 4a cip. 29; OLG Köln, decision from the 17th of June 2011 (6 U 8/11). II. The requirement of purpose specification and its legal scale 277 ticle 4a sect. 1 sent. 2 of the Federal Data Protection Law refers, in particular, to the purpose of the treatment of data as: “Data subjects shall be informed of the purpose of collection, processing or use”. However, similarly to the European directives, there are no further criteria provided for by law in order to determine the purpose. Discussion on the degree of precision of a specified purpose As a consequence, these criteria are also highly discussed in German legal literature, particularly, referring to the degree of precision. In summary, the reasoning provided for in the discussion, often appears to be circular and/or overly strict. However, some legal scholars at least refer to the constitutional concept of protection in order to justify their reasoning. 
As a common ground, comparably to the European discussion, these scholars stress that the individual must be able to understand the factual extent of his or her consent. However, when reviewing the details in order to determine the ‘extent’ of the consent, this criteria starts to get blurred. For example, Taeger summarizes that, “the consent must be so precise that the type of personal data and the purpose of collection or usage as well as, in the case of a transfer, possible recipients are sufficiently specified.”757 Däubler ads a dynamic element, referring to the intensity of the possible infringements, as: “The consent must not be, pursuant to the common understanding, a blanket; it must specify which data is processed or used for which purpose. The more the protection of the personality is concerned the more precise the possibilities of processing must be specified.”758 Däubler continues to provide examples: “This is the case (i.e. it is sufficiently precise) if a medical patient consents to the transfer of the remuneration claim for ‘billing purposes’ and to the transfer of the related in- (3) 757 See Taeger/Gabel, BDSG Kommentar, § 4a, cip. 30: “Vielmehr muss die Erklärung so bestimmt sein, dass die Art der personenbezogenen Daten und der zweck der Erhebung und Verwendung sowie im Falle der Übermittlung etwaige Empfänger hinreichend genau benannt werden.” 758 See Däubler/Klebe/Welde/Weichert, BDSG, § 4a cip. 18 with further references to court decisions: “Die Einwilligung nach allgemeiner Auffassung keinen pauschalen Charakter tragen; sie muss erkennen lassen, welche Daten zu welchem Zweck verarbeitet werden oder genutzt werden sollen. Je stärker der Schutz der Persönlichkeit tangiert ist, umso präziser müssen die Verarbeitungsmöglichkeiten umschrieben sein.” C. The function of the principle of purpose limitation in light of Article 8 ECFR 278 formation. In contrast, the consent to a transfer to ‘any refinancing bank’ is illicit because the individual concerned cannot overview the extent of his or her consent.”759 Däubler comes to the conclusion that “the requirement of specification is justified, at the end, because the individual is only on its basis able to overview the process (…).”760 Kramer provides comparable examples. From his point of view, the information about the “transfer (of the data) to partner companies” is not sufficient because the recipients of the data would not be identifiable when the data is first collected. Comparably, the information given about the usage of data ‘for marketing purposes’ would not be sufficient because it is not clear whether the usage will be that of either the controller or that of a third party.761 In this last respect, this reasoning clearly refers to the German legislator’s thoughts that the risks for the individual caused by the processing of personal data in relation to a controller’s own interest and that of a third party is different. Therefore, this legal scholar argues the controller must clarify their interest as a basis for gathering personal data.762 Däubler and Kramer already referred, more or less, to a broader concept of protection. 
However, Simitis explicitly ties into the concept of protection developed by the German Constitutional Court stating as: “The consent does only serve to safeguard and concretize the individual’s right of decision if it is sufficiently specified, in other words, if it informs about under which conditions the individuals consented to the processing of 759 See Däubler/Klebe/Welde/Weichert, BDSG, § 4a cip. 18 with further references to court decisions: “Dem ist Rechnung getragen, wenn ein Patient in die Abtretung der Honorarforderung des Arztes ‚zu Abrechnungszwecken’ und in die Weitergabe der dafür notwendigen Informationen einwilligt; eines besonderen Hinweises auf die ärztliche Schweigepflicht bedarf es nicht. Unzulässig ist dagegen eine Einwilligung zur Abtretung an jede ‚refinanzierende Bank; hier kann der Betroffene die Tragweite seiner Erklärung nicht überblicken.” 760 See Däubler/Klebe/Welde/Weichert, BDSG, § 4a cip. 18 with further references to court decisions: “Das Bestimmtheitserfordernis rechtfertigt sich insgesamt damit, dass nur auf diese Weise der Vorgang für den Einzelnen überschaubar und damit der Grundsatz der Datentransparenz gewahrt bleibt.” 761 See Kramer, ibid., § 4a BDSG cip. 21 with references to Wolff/Brink, ibid., § 4a cip. 44 and Plath, ibid., § 4a cip. 47. 762 See above the introduction of point C. II. 3. c) cc) Purposes of processing authorized by the Federal data Protection Law, referring to Simitis, Federal Data Protection Law, § 27, cip. 4 and 5. II. The requirement of purpose specification and its legal scale 279 which data.”763 Indeed, Simitis admits that the information is limited: “Nobody may seriously expect that the individual’s consent meticulously refers to each single detail of the data process. (…) The degree of precision of the consent depends on the particular case. However, in any case, the consent has to refer not only to the information given by the individual but also to the agreed aims and phases of the processing.””.764 He therefore has a rather strict approach. In his opinion, information about the following purposes is not sufficiently precise: for ‘prudent business management’; ‘usual support of the authorizing person’; ‘credit security’; ‘market research’; not even the transfer of ‘data of the debtor for credit processing’ would suffice, in his opinion, to the individual’s right to self-determination. Instead, the data controller must specify which concrete data is processed and used. Only in special circumstances, would the reference to a certain “type” of data could be sufficient. In summary, “only information which is as precise as possible enables the individual concerned to principally hinder the processing of single information that he or she considers as particularly dangerous, for example, the unlimited transfer of ‘negative credit data’ to credit agencies. The same applies with respect to a general consent to the transfer of data to other companies of the same branch. An effective protection depends, here like in other situations, on an early enough restriction of the circle of recipients.”765 763 See Simitis, Federal Data Protection Law, § 4a cip. 77: “Die Einwilligung kann vielmehr die ihr zugewiesenen Aufgabe, das Entscheidungsvorrecht der Betroffenen zu gewährleisten wie zu konkretisieren, nur dann erfüllen, wenn sie hinreichend bestimmt ist, also klar zu erkennen gibt, unter welchen Bedingungen sich die Betroffenen mit der Verarbeitung welcher Daten einverstanden erklärt haben.” 764 See Simitis, ibid., § 4a cip. 
80: “Niemand kann ernsthaft mit einer Äußerung der betroffenen rechnen, die minutiös alle Einzelheiten des Verarbeitungsprozesses aufgreift. (...) Wie spezifiziert die Erklärung zu sein hat, lässt sich letztlich nur vor dem Hintergrund der konkreten Verarbeitungssituation beurteilen. So viel steht jedoch fest: Der Erklärung müssen in jedem Fall nicht nur die jeweils in Betracht kommenden Angaben zu entnehmen sein, sondern auch die gebilligten Verarbeitungsziele und Verarbeitungsphasen.“ 765 See Simitis, ibid., § 4a cip. 81 and 82: “Kurzum, nur eine möglichst präzise Aussage räumt den Betroffenen grundsätzlich die Chance ein, eine aus ihrer Sicht besonders gefährliche Verarbeitung einzelner Angaben, etwa die uneingeschränkte Übermittlung von ’Negativmerkmalen’ an Kreditinformationssysteme, rechtzeitig zu verhindern.” C. The function of the principle of purpose limitation in light of Article 8 ECFR 280 Comparison with principles developed by the German Constitutional Court In light of the divergence of examples, more or less referring to the right to informational self-determination, it is helpful to examine, in more detail, the criteria developed by the German Constitutional Court with respect to the precision of the purpose. As set out in chapter C. I. 2. d) Purpose specification as the essential link for legal evaluation, the German Constitutional Court decided, with respect to both the public and the private sector, on this question. In this regard, the difference between the requirement of purpose specification in the public and the private sector also becomes clearer. Public sector: Purpose specification as a result of the principle of clarity of law With respect to the public sector, the German Constitutional Court assesses the requirement of purpose specification as part of the proportionality assessment. In light of this, the requirement of purpose specification supplements the requirement to limit the later use, and is particularly strengthened by the principle of clarity of law. Function of purpose specification (basic conditions) In the case of “Decision on Population Census”, the German Constitutional Court stated, on the main criteria for specifying the purpose, which legal scholars many times referred to, as: “An obligation for the provision of personal data requires that the legislator precisely and specifically determines in certain areas the purpose of usage and should ensure that the information is suitable and necessary for achieving this purpose. The collection ahead of non-anonymized data for an undetermined or not yet determinable purpose is disproportionate with this (requirement).”766 ee) (1) (a) 766 See BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83 (Decision on Population Census), cip. 179: “Ein Zwang zur Angabe personenbezogener Daten setzt voraus, daß der Gesetzgeber den Verwendungszweck bereichsspezifisch und präzise bestimmt und daß die Angaben für diesen Zweck II. The requirement of purpose specification and its legal scale 281 The Court referred to these criteria in its subsequent decisions, and particularly clarified, in its decision of “Retrieval of Bank Account Master Data”, the interrelationship between the principle of purpose limitation and the principle of clarity of law. In the Court’s opinion, “the principle of clarity of law is based, with respect to the right to informational self-determination, on Art. 2 sect. 1 in combination with Art. 1 sect. 1 GG per se. 
It shall guarantee that public agencies find legal criteria for the execution of the law and that the judicial courts are able to control it; furthermore, the principle of clarity of law enables the citizens concerned to be prepared by potentially infringing measures. Essentially, the reason, the purpose and the limits of the infringing measure must be provided for by the provision in a precise, legally clear manner, as well as specifically in relation to certain areas. (…) If a legal provision authorizes an infringement of the right to informational self-determination, the principle of clarity of law has a specific function to provide a sufficiently precise determination of the purpose of usage for the information concerned. It hence supplements the constitutionally required purpose limitation with respect to the information retrieved. The right to informational self-determination protects the individual against information related measures that he or she cannot foresee nor control.”767 Thus, beside further requirements, such as the specification of the reason for the infringing measure and the extent of the data colgeeignet und erforderlich sind. Damit wäre die Sammlung nicht anonymisierter Daten auf Vorrat zu unbestimmten oder noch nicht bestimmbaren Zwecken nicht zu vereinbaren. (...)” 767 See BVerfG, 13th June 2007, 1 BvR 1550/03 (Retrieval of Banking Account Matser Data), cip. 71, 73 and 74: “Das Bestimmtheitsgebot findet im Hinblick auf das Recht auf informationelle Selbstbestimmung seine Grundlage in Art. 2 Abs. 1 in Verbindung mit Art. 1 Abs. 1 GG selbst (…). Es soll sicherstellen, dass die gesetzesausführende Verwaltung für ihr Verhalten steuernde und begrenzende Handlungsmaßstäbe vorfindet und dass die Gerichte die Rechtskontrolle durchführen können; ferner erlauben die Bestimmtheit und Klarheit der Norm, dass der betroffene Bürger sich auf mögliche belastende Maßnahmen einstellen kann (…). (…) Ermächtigt eine gesetzliche Regelung zu einem Eingriff in das Recht auf informationelle Selbstbestimmung, so hat das Gebot der Bestimmtheit und Klarheit die spezifische Funktion, eine hinreichend präzise Umgrenzung des Verwendungszwecks der betroffenen Informationen sicherzustellen. Auf diese Weise wird das verfassungsrechtliche Gebot der Zweckbindung der erhobenen Information verstärkt. Das Recht auf informationelle Selbstbestimmung schützt den Einzelnen gegen informationsbezogene Maßnahmen, die für ihn weder überschaubar noch beherrschbar sind. (…)”. C. The function of the principle of purpose limitation in light of Article 8 ECFR 282 lected and further processed, the requirement of purpose specification results from the principle of clarity of law.768 However, both the principle of clarity of law and the principle of purpose limitation, are directly based in the right to informational self-determination. The German Constitutional Court also clarified that the requirement of purpose specification, as one element of the principle of clarity of law, does not, per se, forbid the usage of undetermined legal provisions. Rather, the legislator could choose, depending on the particular issue, between different regulation instruments in order to determine the requirements of the infringement. 
For example, the collection of personal data for statistical purposes cannot be comprehensively pre-determined in advance.769 In the case of “Surveillance of Telecommunications”, the Court comparably took the concrete possibilities of pre-determining the purposes into account, as: “In view of the task and operational method of intelligence services, a more precise determination of the pre-conditions for the surveillance was not possible.”770 In contrast, state measures, such as those based on social legal provisions, could typically be categorized and consequently listed according to the matter at hand.771 In the case of “License Plate Recognition”, the Court finally specified further criteria for the proportionality assessment: “The concrete requirements for the pre-determined clarification of the authorizing provision depend on the type and intensity of the infringement. Hence, the authorizing provision must especially pre-determine whether it allows serious infringements. If it does not exclude such (serious) infringements in a suffi- 768 Confirmed in BVerfG, 20th of April 2016, 1 BvR 966/09 and 1 BvR 1140/09 (Federal Bureau of Investigation Law), cip. 285; see also Härting, Purpose limitation and change of purpose in data protection law, p. 3285. 769 See BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83 (Decision on Population Census), cip. 187. 770 See BVerfG, ibid., cip. 181: “(Der Gesetzgeber hat insbesondere die Zwecke, zu denen Telekommunikationsbeziehungen überwacht und die so erlangten Erkenntnisse verwendet werden dürfen, hinreichend präzise und normenklar festgelegt. Die Gefahrenlagen, auf deren Früherkennung die Beobachtung oder Überwachung zielt, werden genau genug beschrieben und durch die Bezugnahme auf andere Gesetze noch weiter verdeutlicht. Der Umfang der Überwachung ist durch die Begrenzung auf den internationalen nicht leitungsgebundenen Verkehr bestimmt.) Eine nähere Bestimmung der Voraussetzungen, unter denen die Überwachung stattfinden darf, war angesichts der Aufgabe und Arbeitsweise von Nachrichtendiensten nicht möglich.” 771 See BVerfG, ibid., cip. 76 and 77. II. The requirement of purpose specification and its legal scale 283 ciently clear manner, the provision has to also meet the legal requirements which apply to these (serious) infringements.”772 Already in the “Decision on the Population Census”, the Court has required, similarly: if the legislator cannot narrowly specify the purpose, “corresponding restrictions within the information system must balance the collection and processing of information. Clearly defined requirements for the processing of data are necessary in order to guarantee that the individual does not become, under the conditions of automated collection and processing of his or her personal data, a mere object of information.”773 Examples for specific purposes: Certain areas of life or explicitly listed crimes Given these criteria, the Constitutional Court came, in the cases of “Decision on Population Census”, “Surveillance of Telecommunications”, and “Big Eavesdropping Operation” to the conclusion that the purposes, which were provided for by the corresponding law, were sufficiently precise. 
In the case of “Decision on Population Census”, it clarified that “a legal provision is sufficiently determined if its purpose becomes clear with respect to the text of the provision and its legislative material; thereby, it is sufficient if the purpose results from the context of the provision with respect to the area of life that shall be regulated. The description of the data (…) (b) 772 See BVerfG, 11th of March 2008, 1 BVR 2047/05 and 1 BvR 1254/07, cip. 95: “Die konkreten Anforderungen an die Bestimmtheit und Klarheit der Ermächtigung richten sich nach der Art und Schwere des Eingriffs (…). Die Eingriffsgrundlage muss darum erkennen lassen, ob auch schwerwiegende Eingriffe zugelassen werden sollen. Wird die Möglichkeit derartiger Eingriffe nicht hinreichend deutlich ausgeschlossen, so muss die Ermächtigung die besonderen Bestimmtheitsanforderungen wahren, die bei solchen Eingriffen zu stellen sind (…).” 773 See BVerfG, BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83 (Decision on Population Census), cip. 184 and 185: “Ist die Vielfalt der Verwendungsmöglichkeiten und Verknüpfungsmöglichkeiten damit bei der Statistik von der Natur der Sache her nicht im voraus bestimmbar), müssen der Informationserhebung und Informationsverarbeitung innerhalb des Informationssystems zum Ausgleich entsprechende Schranken gegenüberstehen. Es müssen klar definierte Verarbeitungsvoraussetzungen geschaffen werden, die sicherstellen, daß der Einzelne unter den Bedingungen einer automatischen Erhebung und Verarbeitung der seine Person betreffenden Angaben nicht zum bloßen Informationsobjekt wird.(...)” C. The function of the principle of purpose limitation in light of Article 8 ECFR 284 provided for by the law for the census from 1983 meets these requirements; the citizen is able to understand which fundamental facts of the social structure he or she will be asked. The main purposes result from the type of collection – a census for population, profession, housing, and work areas, from the program of collection and from the legislative material. The legislator is not obliged to determine the concrete purpose for each single information that must be provided for by citizens. This is especially the case with respect to the particularities of the collection of data for statistical purposes, in particular, of a census of population; the listing of the separate purposes is, given its multifunctional aims, impossible.”774 With respect to non-statistical purposes, in the case of “Surveillance of Telecommunications”, the Court stated “especially the purposes for which the telecommunication is controlled and the information retrieved can be used (such as the prevention, intelligence, and criminal prosecution of international terrorist attacks, of international distribution of weapons of war, of exports of drugs into the Federal Republic, and of counterfeiting of currencies committed abroad) are sufficiently precise and clear. The dangers, which the observation and surveillance seeks to discover in advance, are sufficiently pre-determined. The extent of the surveillance is determined by its restriction to traffic of international non-cable based telecommunication.”775 In the case of “Big Eavesdropping Operation”, the legislator also met the requirements of purpose limitation and clarity of law, giv- 774 See BVerfG, ibid., cip. 
199: “Hinreichend bestimmt ist ein Gesetz, wenn sein Zweck aus dem Gesetzestext in Verbindung mit den Materialien deutlich wird (…); dabei reicht es aus, wenn sich der Gesetzeszweck aus dem Zusammenhang ergibt, in dem der Text des Gesetzes zu dem zu regelnden Lebensbereich steht (…). Diesen Anforderungen genügt die Beschreibung der zu erhebenden Merkmale im Volkszählungsgesetz 1983; der Bürger kann erkennen, über welche Grundtatbestände der Sozialstruktur er befragt werden soll. Die Hauptzwecke lassen sich aus der Art der Erhebung - einer Volkszählung, Berufszählung, Wohnungszählung und Arbeitsstättenzählung -, dem Erhebungsprogramm und den Gesetzesmaterialien hinreichend deutlich entnehmen. Nicht erforderlich ist, daß der Gesetzgeber zu jeder einzelnen gesetzlichen Verpflichtung auch den konkreten Zweck im Gesetz selbst erläutert. Dies gilt namentlich mit Rücksicht auf die Besonderheiten der Erhebung von Daten für statistische Zwecke, zumal bei einer Volkszählung; hier ist eine Auflistung der einzelnen Zwecke aufgrund ihrer multifunktionalen Zielsetzung unmöglich.” 775 See BVerfG, 14th of July 1999, 1 BvR 2226/94, cip. 181: “Der Gesetzgeber hat insbesondere die Zwecke, zu denen Telekommunikationsbeziehungen überwacht und die so erlangten Erkenntnisse verwendet werden dürfen, hinreichend präzise II. The requirement of purpose specification and its legal scale 285 en that the surveillance was only used for the investigation of explicitly listed crimes.776 Examples for unspecific purposes: Abstract dangers or unknown purposes In contrast, in the case of “Dragnet Investigation”, the German Constitutional Court clarified which purpose provided for by law was sufficiently precise and which was not: “The transfer of the data serves the purpose of automated synchronization regarding other data sets so long as it is necessary for the defense of specific dangers, here, for the existence or security of the Federal State or of one Land or for physical integrity, life or freedom of a person. (…) The law determines the police as the receiving public agency. (…) Given the pre-conditions mentioned, (…/the law offended) is also sufficiently determined as it authorizes not only the retrieval and processing of the explicitly listed types of data but (…) also ‘other data that are necessary for the concrete case’. The requirement of pre-determined clarification of legal rules is met because the notion of ‘other data which is necessary for the concrete case’ can be, with respect to the purpose of the defense of danger (…), typified in a manner that the principle of proportionality is met.”777 In contrast, the Court stressed “without restriction to a specific danger, there was not sufficient criteria in order to (c) und normenklar festgelegt. Die Gefahrenlagen, auf deren Früherkennung die Beobachtung oder Überwachung zielt, werden genau genug beschrieben und durch die Bezugnahme auf andere Gesetze noch weiter verdeutlicht. Der Umfang der Überwachung ist durch die Begrenzung auf den internationalen nicht leitungsgebundenen Verkehr bestimmt. Eine nähere Bestimmung der Voraussetzungen, unter denen die Überwachung stattfinden darf, war angesichts der Aufgabe und Arbeitsweise von Nachrichtendiensten nicht möglich.” 776 See BVerfG, 3rd of March 2004, 1 BvR 2378/98, cip. 307 to 319. 777 See BVerfG, 4th of April 2006, 1 BvR 518/02, cip. 145 to 147: “Gemäß § 31 Abs. 
1 PolG NW 1990 dient die Datenübermittlung dem Zweck des automatisierten Abgleichs mit anderen Datenbeständen, soweit dies zur Abwehr bestimmter Gefahren, nämlich für den Bestand oder die Sicherheit des Bundes oder eines Landes oder für Leib, Leben oder Freiheit einer Person, erforderlich ist. (Als Verwendungszweck ist damit der automatisierte Abgleich der übermittelten Daten mit anderen Datenbeständen zur Abwehr der in § 31 Abs. 1 PolG NW 1990 benannten Gefahren festgelegt. Das ist hinreichend. Auch dem für Übermittlungsregelungen geltenden Gebot einer hinreichend sicher erschließbaren C. The function of the principle of purpose limitation in light of Article 8 ECFR 286 interpretatively determine the data concerned, especially with respect to the notion ‘other data which are necessary for the concrete case’. If there is no specific danger, it is not possible to sufficiently pre-determine which data is necessary ‘for the concrete case’. If a general terroristic danger was the reference for the dragnet investigation and consequently for the determination of the data required by the police, there would be a merely unlimited authorization (for data collection and processing). (…) This would infringe the constitutional requirements for the clarity of law.”778 In the case of “Retrieval of Bank Account Master Data”, the Court also affirmed the claim that the “law for the encouragement of tax compliance” infringed the requirement of purpose specification. This law required that the retrieval of data had to only relate to terms under the income tax act. The Constitutional Court stressed that such a requirement “does not determine the circuit of public agencies which shall be authorized to retrieve Kennzeichnung der Empfangsbehörden, einhergehend mit Regeln, welche die Übermittlung auf deren jeweiligen spezifischen Aufgabenbereich konzentrieren (…), ist nur genügt, wenn der Gefahrenbegriff zur Einschränkung der Ermächtigung verfügbar ist.) Als Empfangsbehörde für die übermittelten Daten ist die Polizei benannt. (Der Verwendungszweck ist auf den Zweck der Abwehr von Gefahren für im Einzelnen benannte, hochwertige Schutzgüter der öffentlichen Sicherheit begrenzt, also auf einen Zweck, dessen Verfolgung zum spezifischen Aufgabenbereich der Polizeibehörden zählt (…)).§ 31 PolG NW 1990 ist unter den genannten Bedingungen auch insoweit hinreichend bestimmt, als nicht nur die ausdrücklich aufgezählten Typen von Daten, sondern nach Absatz 2 auch "andere für den Einzelfall benötigte Daten" verlangt und verarbeitet werden dürfen. Die Bestimmtheitsanforderungen sind insoweit gewahrt, weil der Begriff der "anderen für den Einzelfall benötigten Daten" unter Berücksichtigung des Normzwecks der Gefahrenabwehr und damit auch hinsichtlich der Feststellung, wozu die Daten "benötigt" werden, so konkretisiert werden kann, dass der Verhältnismäßigkeitsgrundsatz gewahrt bleibt.” 778 See BVerfG, ibid., cip. 148: “Ohne die Begrenzung auf das Vorliegen einer konkreten Gefahr gäbe es demgegenüber keine hinreichenden Anhaltspunkte zur teleologischen Bestimmung der erfassbaren Daten, insbesondere soweit es sich um "andere für den Einzelfall benötigte Daten" handelt. Fehlt es an einer konkreten Gefahr, ist nicht mit verfassungsrechtlich hinreichender Bestimmtheit ermittelbar, unter welchen Bedingungen Daten "für den Einzelfall" benötigt werden. 
Wäre Bezugspunkt der Rasterfahndung etwa eine allgemeine Terrorismusgefahr und würde diese somit zum Bezugspunkt der Konkretisierung der Art der Daten, die von der Polizei benötigt werden, wäre eine nahezu grenzenlose Ermächtigung geschaffen. Es fehlten jegliche Anhaltspunkte für die Prüfung, ob die zu erhebenden Daten "für den Einzelfall benötigt" werden. Dies würde verfassungsrechtliche Bestimmtheitsanforderungen verletzen.” II. The requirement of purpose specification and its legal scale 287 the data and the tasks for that the retrieval serves in a sufficiently precise manner. (…/The wording allows) each notional accordance between the law that shall be executed and the income tax act in order to authorize the retrieval of the account data. Consequently, the scope of application would be unlimited in light of the fact that the income tax act contains numerous notions without concrete references to tax law which also exist in a multitude of other laws with totally different objectives.”779 The Court came to the conclusion that the retrieval of data has to relate to specific terms under the Act. In the Court’s opinion, there would be too many other laws containing such terms and, thus, allowing for the retrieval of data.780 As described above, in the case of “License Plate Recognition”, the Court weighed the criteria of both the requirement of purpose specification and the principle of proportionality against each other. As a first step, it examined to what extent the legal provision authorizing the automated license plate recognition determined the purposes for the collection of the data. The Court came to the conclusion that the police law originally offended, did not provide “concrete requirements for the state measure, it especially did not pre-determine the reason and the purpose of usage which was sufficiently specific for certain areas and legally clear.”781 Indeed, the provision authorized the collection of data for the purpose of checking it against the data files that were open for investigation. The Court argued, however, that this term “does not determine the purpose for that the collection and the checking of the data shall finally serve. Only the manner how an investigation purpose shall be, after the collection of the data, 779 See BVerfG, 13th June 2007, 1 BvR 1550/03, cip. 79 and 80: “Auf diese Weise werden der Kreis der Behörden, die zu Abrufersuchen berechtigt sein sollen, und die Aufgaben, denen solche Ersuchen dienen sollen, nicht präzise genug festgelegt. Sollte der Wortlaut von § 93 Abs. 8 AO weit zu verstehen sein, so genügte jede begriffliche Übereinstimmung zwischen dem anzuwendenden Gesetz und dem Einkommensteuergesetz, damit ein Kontoabruf in Betracht käme. In der Folge wäre der Anwendungsbereich der Norm praktisch unübersehbar, da das Einkommensteuergesetz zahlreiche Begriffe enthält, die keinen besonderen steuerrechtlichen Bezug aufweisen und sich auch in einer Vielzahl anderer Gesetze mit völlig unterschiedlichen Regelungsgegenständen finden (...).” 780 See BVerfG, ibid., cip. 81. 781 See BVerfG, 11th of March 2008, 1 BVR 2047/05 and 1 BvR 1254/07, cip. 98: “In den angegriffenen Bestimmungen fehlt es an näheren Voraussetzungen für die Maßnahme, insbesondere an einer hinreichenden bereichsspezifischen und normenklaren Bestimmung des Anlasses und des Verwendungszwecks der automatisierten Erhebung.” C. The function of the principle of purpose limitation in light of Article 8 ECFR 288 achieved is mentioned. 
This purpose (itself) indeed remains open.”782 Pursuant to the Court’s decision, the notion ‘open investigation’, at least, did not determine the purpose because there was no legal or commonly accepted definition of the term.783 Notably, the broad specification of the purpose did not exclude the possibility of using the collected data for police surveillance or even for purposes of criminal investigation.784 The requirement of ‘public streets and spaces’ did indeed restrict the locations where the data could legally be collected but did not refine the purpose of the collection.785 Finally, it was not possible to restrain the purpose by narrowly interpreting the provision because there was no identifiable core objective of the regulation.786 Consequently, the undetermined purpose of the collection of the data led to the result that the information gathered on the basis of that data was equally illegitimate.787 Given the broad definition of the purpose, and all potential purposes considered, the Court then examined, as a second step, whether or not the provision met the requirement of proportionality. It came to the conclusion that the provision was not proportionate for the following reasons: first, the lack of a reason for the collection of the data could lead to chilling effects on society as a whole; second, the purpose was not restricted to the defense of concrete dangers; and third, the provision did not differentiate between the reasons for the inclusion of certain individuals in the data files for the open investigation.788 In addition, the Court stressed that the provision did not clearly exclude the collection of the data even once the reason for which the license plates had been included in the open investigation file had fallen away. Finally, the provision did not limit the collection and usage of the data to the determined purposes. The Court pointed out that this lack of limitation could lead to roaming data. This would also lead to an infringement of the principle of proportionality.789
782 See BVerfG, ibid., cip. 99: “Erwähnt wird lediglich das Mittel, mit dem ein Ermittlungszweck nach der Erhebung weiter verfolgt werden soll. Welcher Zweck das sein soll, bleibt jedoch offen.” 783 See BVerfG, ibid., cip. 100. 784 See BVerfG, ibid., cip. 136 and 149. 785 See BVerfG, ibid., cip. 144. 786 See BVerfG, ibid., cip. 153. 787 See BVerfG, ibid., cip. 157. 788 See BVerfG, ibid., cip. 170 to 176. 789 See BVerfG, ibid., cip. 177 and 178.
(d) Liberalization of the strict requirement by referring to the object of protection
In conclusion, the German Constitutional Court considers a purpose provided for by law for the treatment of data by the State as sufficiently precise if it results, for example, from the type of collection, such as a ‘census for population, profession, housing, and work areas’, or pursues the ‘prevention, intelligence, and criminal prosecution of international terrorist attacks’ or other explicitly listed crimes. In contrast, the ‘defense of an abstract danger’ or notions such as ‘open investigation’, which refer to unknown purposes and, as such, only to the means by which these unknown purposes shall be achieved, are not sufficiently precise. In the case of “Federal Criminal Police Office Act”, the Constitutional Court finally refined the general conditions described above. In this case, the Court clarified, as a first step, the criteria to be considered in order to decide whether a later use of data still pursues the same purpose or whether this usage pursues another purpose and must thus be considered as a change of purpose.790 Pursuant to this decision, the later use of data in another procedure (other than that of the collection) but for the same purpose requires, on the one hand, a proper legal basis. However, this extension does not constitute a change of purpose and, thus, does not have to meet the strict proportionality requirements for a change of purpose. Instead, such a later use of data must strictly apply the conditions set up by the law that authorized the collection. In this regard, this law has to determine, first, the public agency that is allowed to collect the data; second, the specific purpose; and third, further requirements set up for the collection of the data. By refining these specific criteria, the Constitutional Court clarified that it is not sufficient to specify the purpose by simply referring to the abstract task of a public agency. Instead, the purpose specified within the law that authorizes the collection of data sets the limit for the later processing and must, as a consequence, be more specific than the abstract task of the public agency. However, the purpose originally specified does not always have to refer, for example, to explicitly listed crimes. Instead, it can also refer to the object of protection that is protected by these criminal provisions.791
790 See BVerfG, 20th of April 2016, 1 BvR 966/09 and 1 BvR 1140/09 (Federal Bureau of Investigation Law), cip. 277; see with respect to further refinements in this decision beneath point C. III. 1. b) bb) (2) Proportionate change of purpose.
This last criterion is highly important because the reference to an object of protection leaves the public agency with more room for action than a reference to an explicit provision which is established in order to protect the object of protection. The reason is that the object of protection is broader than the explicit provision. This conclusion is based on the Court’s wording: “A later usage for the same purpose is therefore only possible if it is carried out by the same public agency, for the same task, and if it serves the same object of protection as decisive for the collection: If this (the data collection) is allowed for the protection of specific objects of protection or for the prevention of specific crimes, only, this limits the immediate or later use even in the same public agency (… /words in brackets and underlining added by the author)”.792 Indeed, this conclusion is not free of doubt because the Court refers, in a subsequent paragraph, to both criteria not alternatively (“or”) but cumulatively (“and”): “In conclusion, it is decisive (…) that the public agency authorized for the data collection uses the data for the same task, the same objects of protection and for the prosecution or prevention of the same crimes as specified in the law authorizing the collection of the data. (Underlining added by the author.)”793 In the first quote, the Court thus appears to allow both options as alternatives, whereas in the second quote both options appear to form a cumulative requirement. The second option would lead, in contrast to the conclusion drawn in this thesis, to a narrower room for action for the public agency than the first one. 791 Cf.
BVerfG, ibid., cip. 278 and 279. 792 See BVerfG, ibid., cip. 279: “Eine weitere Nutzung innerhalb der ursprünglichen Zwecksetzung kommt damit nur seitens derselben Behörde im Rahmen derselben Aufgabe und für den Schutz derselben Rechtsgüter in Betracht wie für die Datenerhebung maßgeblich: Ist diese nur zum Schutz bestimmter Rechtsgüter oder zur Verhütung bestimmter Straftaten erlaubt, so begrenzt dies deren unmittelbare sowie weitere Verwendung auch in derselben Behörde, (soweit keine gesetzliche Grundlage für eine zulässige Zweckänderung eine weitergehende Nutzung erlaubt).” 793 See BVerfG, ibid., cip. 282: “Für die Wahrung der Zweckbindung kommt es demnach darauf an, dass die erhebungsberechtigte Behörde die Daten im selben Aufgabenkreis zum Schutz derselben Rechtsgüter und zur Verfolgung oder Verhütung derselben Straftaten nutzt, wie es die jeweilige Datenerhebungsvorschrift erlaubt.” II. The requirement of purpose specification and its legal scale 291 In any case, as a second step, the Court also refines the concept of protection with respect to the reason of the data processing. This refinement does not require the same reason as required for its collection. The Court stressed, at first, that the requirement to specify the reason for State action, such as the “adequately specified danger” in the area of danger prevention or the “adequate grounds of suspicion” in the area of prosecution of crimes, does not result from the principle of purpose limitation.794 As illustrated previously, this requirement indeed results from the principle of clarity of law, which only supplements the principle of purpose limitation, but is equally based in the right to informational self-determination.795 As a consequence, the public agency can use the data at a later stage as a baseline for further investigation, even if there is no specific danger. The existence of a “specific investigative reason” usually suffices.796 However, even if no specific danger is required, the object of protection must be clear because, here again, the later use must pursue the same task and serve the same objects of protection as the data collection. The Court makes it very clear that this refinement is not a further tightening of the principle of purpose limitation, but a liberalization of it.797 The Court justifies this liberalization as: “This (liberalization) acknowledges the fact that the production of knowledge cannot be based – not least if it is about understanding terroristic structures – on the pure addition of single, separated data being taken into account only formally pursuant to criteria specified by law. (…) Through the boundaries to the tasks specified in the moment of collection and the objects of protection, the later usage of the data as a pure baseline for further investigation is adequately limited”.798 794 See BVerfG, ibid., cip. 285. 795 See above under point C. II. 1. c) ee) (1) (a) Function of purpose specification (basic conditions), referring to BVerfG, 13th June 2007, 1 BvR 1550/03 (Retrieval of Banking Account Matser Data), cip. 71, 73 and 74. 796 See BVerfG, 20th of April 2016, 1 BvR 966/09 and 1 BvR 1140/09 (Federal Bureau of Investigation Law), cip. 289: ”konkreter Ermittlungsansatz“. 797 See BVerfG, ibid., cip. 
292: “Hierin liegt keine Verschärfung der Maßstäbe, sondern eine behutsame Einschränkung, indem das Kriterium der hypothetischen Datenneuerhebung nicht strikt angewandt (…), sondern in Blick auf die - die zu fordernde Aktualität der Gefahrenlage bestimmenden - Eingriffsschwellen gegenüber früheren Anforderungen (…) teilweise zurückgenommen wird.” 798 See BVerfG, ibid., cip. 281: “Dies trägt dem Umstand Rechnung, dass sich die Generierung von Wissen – nicht zuletzt auch, wenn es um das verstehen terroristischer Strukturen geht – nicht vollständig auf die Addition von je getrennten, nach Rechtskriterien formell ein- oder ausblendbaren Einzeldaten reduzieren lässt. (...) Durch die Bindung an die für die Datenerhebung maßgeblichen Aufgaben und die Anforderungen des Rechtsgüterschutzes hat auch eine Verwendung der Daten als Spurenansatz einen hinreichend konkreten Ermittlungsbezug, (den der Gesetzgeber nicht durch weitere einschränkende Maßgaben absichern muss).” 799 See BVerfG, ibid., cip. 283. 800 See BVerfG, ibid., cip. 119 to 129 as well as 238 and 239.
In contrast, this liberalization does not apply to data which is collected through an infringement of the right to inviolability of the home or the right to the confidentiality and integrity of information technological systems. The later use of this kind of data by the State is legitimate only if there is, in addition to the before-mentioned conditions, again a specific or even urgent danger. The German Constitutional Court justifies this stricter condition for this type of data with regard to the particular severity of the infringement of these fundamental rights.799 The Court considers an infringement of these basic rights as particularly severe because it typically concerns the essence of private life, which receives supplementary protection from the right to human dignity in Article 1 GG. From this perspective, an individual’s private home is particularly protected against surveillance because it typically concerns highly private or sensitive communication. Similarly, information technological systems typically contain information stored over a longer period of time. An intrusion into these systems can reveal highly private or sensitive information as well as, if the data is processed further, personal weaknesses and attitudes which should be kept secret. In contrast, the German Court does not consider, for example, an infringement of the right to privacy of telecommunications as equally severe because this typically concerns single acts of immediate communication only. The essence of these rights must therefore be protected differently.800
(2) Private sector: ‘Self-control of legitimacy’
With respect to the private sector, at first glance, the German Court pursues a similar approach. The Court refers to the same idea behind the regulation for both the private and public sector, such as: ‘The right to informational self-determination protects the individual against information related measures which he or she cannot foresee nor control’801 and ‘the general personality right consists of the right of the individual to determine by him or herself the disclosure and usage of his or her personal data’802. However, the mechanisms safeguarding this guarantee are different.
As illustrated above, in the case of “Release of Confidentiality”, the Court ruled on the claimant’s duty to authorize her insurance company to “retrieve appropriate information from all doctors, hospitals, nursing homes, where (../the claimant) was or will be treated, as well as from (../the claimant’s) health insurance company and other personal insurance companies, social insurance companies, public agencies, current and former employers.”803 The Constitutional Court came to the conclusion that the authorization was too broad, despite former decisions of lower courts holding that it was legal, and consequently led to an infringement of the claimant’s right to informational self-determination. Interestingly, the purpose itself provided for by the release of confidentiality does not appear to be broader than the purposes lawfully provided for by ordinary law regarding a State’s treatment of data: the authorization, and the insurance policy, made clear that any retrieval of information would only occur with respect to the event of insurance and the approval and execution of the policy services. However, given the sensitivity of the data, the general list of rather unspecific inquiry offices and the lack of determination of the specific inquiries themselves, the Constitutional Court held the authorization to be too vague. From its point of view, the claimant lost “the possibility to control her interests of confidentiality on her own”.804
801 See, for the public sector, for example, BVerfG, 4th of April 2006, 1 BvR 518/02 (Retrieval of Bank Account Master Data), cip. 74; and for the private sector, BVerfG, 1 BvR 2027/02 (Release of Confidentiality), cip. 43. 802 See, for the private sector, BVerfG, 1 BvR 2027/02, cip. 31; and for the public sector, BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83 (Decision on Population Census), cip. 173; cf. equally BVerfG, 14th of July 1999, 1 BvR 2226/94 (Surveillance of Telecommunications), cip. 136 and BVerfG, 3rd of March 2004, 1 BvR 2378/98 (Big Eavesdropping Operation), cip. 132 and BVerfG, 4th of April 2006, 1 BvR 518/02 (Dragnet Investigation), cip. 64 and BVerfG, 13th June 2007, 1 BvR 1550/03 (Retrieval of Bank Account Master Data), cip. 63. 803 See BVerfG, ibid., cip. 13: “von allen Ärzten, Krankenhäusern und Krankenanstalten, bei denen ich in Behandlung war oder sein werde sowie von meiner Krankenkasse: … und von Versicherungsgesellschaften, Sozialversicherungsträgern, Behörden, derzeitigen und früheren Arbeitgebern sachdienliche Auskünfte einzuholen.”
This ability of self-control is apparently the essential difference for the Court when it examines whether the control mechanisms implemented by a private data controller are sufficient. This is the reason why the Court also examines, in detail, alternative mechanisms. As stressed previously, the Court examined, at first, whether the claimant relied on the insurance and whether there was no other insurance company offering such a policy without the same authorization. Both questions referred to market mechanisms enabling the individual to control the disclosure of the information.
Correspondingly, the Court considered whether the defendant had offered, on an organizational level, alternative mechanisms such as subsequent releases relating to her confidentiality that would have respected the claimant’s possibility of self-determination.805 Thus, so far, it can be summarized that the German Constitutional Court applies different scales for the public and the private sector in order to answer the question of whether the processing of personal data complies with the idea of informational self-determination.
804 See BVerfG, 1 BvR 2027/02 (Release of Confidentiality), cip. 43: “Dabei begibt sie sich auch der Möglichkeit, die Wahrung ihrer Geheimhaltungsinteressen selbst zu kontrollieren (…).” 805 See above under point C. I. 2. d) bb) In the private sector: The contract as an essential link for legal evaluation.
2. Criticism: Stricter effects on the private than the public sector
The preceding chapters illustrated the important role that the specification of purposes plays in the European data protection system. The requirement of purpose specification provides a central link for further legal requirements. First, it serves to define the scope of application by determining which data are identifiable. Second, it determines the data controller who is responsible for safeguarding the regulation. Third, it determines further requirements such as the adequacy, relevance, and necessity of the processing of personal data. However, despite its important role, several aspects remain unclear. First, there appears to be a different scale in determining the precision of purposes specified, on the one hand, by the legislator authorizing certain acts of data processing and, on the other hand, by data controllers which base their processing either on the law or on the individual’s consent. Second, there are further ambiguities surrounding the specification of the purpose in light of the applicable concept of protection, in particular regarding the requirements for consent and the consequences of their non-fulfillment. Finally, none of the concepts, be they appropriately applied or not, provides reliable criteria that help data controllers in the private sector to determine how precisely they shall specify their processing purposes. The subsequent analysis will show that it appears as though the initial concept of protection developed by the constitutional courts with respect to the processing of personal data by the State is simply transferred to the private sector.806 The surprising bottom line of such a transfer is, given the different situations of the State and private data controllers, that the effects of the requirements discussed are even stricter for controllers acting in the private sector than for the State.
a) Difference in precision of purposes specified by legislator and data controllers
The first aspect which became apparent during the previous analysis is the divergence between the purpose being specified, on the one hand, by the legislator and, on the other hand, by data controllers in the private sector. In summary, the legislator is allowed to specify purposes in a broader way than data controllers. An examination of the purposes for which the law authorizes certain acts of data processing reveals essentially four types. The first type refers to data processing which is necessary for the technical conveyance of a communication service.807 The second type authorizes data processing for the necessary conclusion, execution or termination of a contract.808 The third type refers to obligations provided for by law and public interests.809 Finally, the fourth type refers to the interests of the data controller.
806 Cf. above under points C. II. 1. b) bb) (1) Preliminary note: Clarifying conceptual (mis)understandings, and dd) (2) (a) Preliminary note: Clarifying conceptual (mis)understandings. 807 See in the ePrivacy Directive Article 5 sect. 1 sent. 3 and sect. 3, Article 6 sect. 1; in the German Telecommunication Law Article 88 sect. 3, Article 96 sect. 1, as well as in the German Telemedia Law Article 15 sect. 1. 808 This includes billing purposes; see in the ePrivacy Directive Article 6 sect. 2; in the Data Protection Directive Article 7 lit. b; in the German Telecommunication Law Article 95 sect. 1 and Article 96 sect. 1; in the German Telemedia Law Article 14 sect. 1 and Article 15 sect. 1; and in the German Federal Data Protection Law Article 28 sect. 1 sent. 1 no. 1. 809 See, for example, in the ePrivacy Directive Article 15 sect. 1 referring to Article 13 sect. 1 of the Data Protection Directive; in the Data Protection Directive Article 7 lit. c and e as well as the before mentioned Article 13 sect. 1; in the German Telecommunication Law, in particular, Articles 108 et seqq.
aa) Data processing for undisputed ‘marketing purposes’ authorized by law
In this last respect, the law provides both provisions authorizing data processing for the data controller’s interests in general, as long as these are legitimate,810 and provisions referring to more specific purposes. In particular, the purposes of ‘marketing’ and ‘market research’ play a prominent role, as the following provisions illustrate:
– Article 6 sect. 3 of the ePrivacy Directive allows the processing of traffic data for the purpose of marketing or for the provision of value added services if it is based on the user’s consent;
– Article 95 sect. 3 sent. 1 of the German Telecommunication Law also authorizes the processing of data related to a contract with a telecommunication service provider for the provider’s own purposes of marketing and market research if it is based on the user’s consent;
– Article 96 sect. 3 of the German Telecommunication Law equally allows the processing of traffic data for purposes of marketing of telecommunication services, technically improving the usability of the telecommunication services or for the provision of value added services if it is based on the user’s consent;
– Article 15 sect. 3 of the German Telemedia Law allows the creation of user profiles with ‘usage data’ for purposes of marketing, market research, or technical improvement of the usability of Information Society services if it is pseudonymized and the user does not object;
– The German Federal Data Protection Law finally authorizes data processing for the data controller’s own purposes of marketing and address trading (Article 28 sect. 3 to 3b); the commercial data treatment for third parties’ purposes of marketing and address trading (Article 29); and the commercial data treatment for third parties’ purposes of market research (Article 30a).
810 See in the Data Protection Directive Article 7 lit. f and in the German Federal Data Protection Law the basic legitimate ground in Article 28 sect. 1 sent. 1 no. 3.
In legal literature, legal scholars do not doubt that these purposes specified within the law itself are sufficiently precise.811
bb) Disputed ‘marketing purposes’ specified by data controllers
However, it is interesting to see that while particular purposes of ‘marketing’ specified within the law are hardly disputed amongst legal scholars, the same purposes specified by data controllers in the private sector are disputed. The Article 29 Data Protection Working Party argues that a controller simply using the term ‘marketing purpose’ does not sufficiently meet the requirement of purpose specification pursuant to Article 6 sect. 1 lit. a of the Data Protection Directive.812 Comparably, German legal scholars argue that the purpose of ‘market research’ included in the user’s consent does not meet the requirement of purpose specification either.813 This is at least the case if the data controller does not differentiate between its own and third parties’ marketing purposes.814 This second line of reasoning requires the data controller to apply at least the distinction between purposes drawn by German law itself. As described above, the German Federal Data Protection Law differentiates between the controller’s own purposes and purposes pursued on behalf of third parties in order to cover the different risks of the data processing for the individual concerned.815
811 See, for example, Simitis, Federal Data Protection Law, § 30a cip. 67 to 95, who undertakes great efforts to define and distinguish the admittedly vague statutory terms of market and opinion research as purposes of data processing while not calling into question their blatant vagueness (indeed, he interprets the terms also with respect to the type of data concerned, which is, at least, given by the law). 812 See the Article 29 Data Protection Working Group, Opinion 03/2013 on purpose limitation, p. 16. 813 See Simitis, ibid., § 4a cip. 81 and 82: “Kurzum, nur eine möglichst präzise Aussage räumt den Betroffenen grundsätzlich die Chance ein, eine aus ihrer Sicht besonders gefährliche Verarbeitung einzelner Angaben, etwa die uneingeschränkte Übermittlung von ‚Negativmerkmalen’ an Kreditinformationssysteme, rechtzeitig zu verhindern.” 814 See Kramer, ibid., § 4a BDSG cip. 21 with references to Wolff/Brink, ibid., § 4a cip. 44 and Plath, ibid., § 4a cip. 47.
Another reason why legal scholars take such a strict view of the term ‘marketing purposes’ used by data controllers seems to be that it often does not refer to a certain type of data. The legal provisions authorizing data processing for marketing purposes most often refer to certain types of data such as ‘traffic data’ (Article 6 sect. 3 of the ePrivacy Directive and Article 96 sect. 3 of the German Telecommunication Law) or ‘usage data’ (Article 15 sect. 3 of the German Telemedia Law). However, the German Federal Data Protection Law only partly refers to a certain type of data, such as in Article 28 sect. 3 to 3b for the controller’s own marketing purposes, and authorizes the processing for third parties’ marketing and market research purposes for any kind of data (Articles 29 and 30a). Thus, the law itself does not consistently apply such a strict approach.
cc) Further examples for different scales applied in order to specify the purpose
This difference also becomes apparent with respect to other purposes. For example, Article 96 sect. 1 of the German Telecommunication Law and Article 15 sect. 3 of the Telemedia Law refer to the purpose of technically improving usability for the processing of ‘traffic’ and ‘usage’ data, respectively. In contrast, the Working Group argues that the term ‘improving user experience’ used by data controllers is not sufficiently precise.816 In the German Telecommunication Law, Article 88 sect. 3 authorizes the processing of ‘content data’ and ‘related circumstances’ for the purpose of protecting the technical telecommunication system. However, the Working Party considers the purpose of ‘IT security’ as not sufficiently specified.817 While Article 6 sect. 1 lit. b sent. 2 of the Data Protection Directive exempts ‘scientific purposes’ from the requirement of purpose limitation, the Working Party denies that the term ‘future research’ used by data controllers meets the requirement of purpose specification in the first sentence of that article.818 Comparably, while Article 28a of the German Federal Data Protection Law allows the transfer of certain data to credit agencies, legal scholars consider terms used by data controllers such as ‘transfer of the remuneration claim to any refinancing bank’, ‘data of the debtor for credit processing’, or ‘credit security’ to be insufficiently precise.819
815 See above under point C. II. 1. c) cc) (5) Privileges and restrictions pursuant to purposes. 816 See the Article 29 Data Protection Working Group, Opinion 03/2013 on purpose limitation, p. 16. 817 See the Article 29 Data Protection Working Group, ibid., p. 16.
dd) Can the context help interpret a specified purpose?
As mentioned above, the individual’s consent to the processing of his or her data for the purposes of ‘prudent business management’ or ‘usual support of the authorizing person’ is not sufficiently precise. These two last examples are particularly interesting in light of the criteria that the German Constitutional Court developed with respect to the purposes provided for by law. This is the case because the German Constitutional Court states “that the legislator precisely and specifically determines in certain areas the purpose of usage.”820 Pursuant to this requirement, “a legal provision is sufficiently determined (…) if the purpose results from the context of the provision with respect to the area of life that shall be regulated.”821 In light of this consideration, the terms ‘prudent business management’ and ‘usual support of the authorizing person’ would, in principle, allow the individual concerned, as well as the data controller, to conclude from the specific context which kind of processing shall be covered and which shall not. The context of the interrelationship and the area of life referred to in the individual’s consent indeed appear to clarify the extent of the data processing.
The Article 29 Data Protection Working Party similarly considers that “the degree of detail in which a purpose should be specified depends on the particular context in which the data are collected and the personal data involved.”822
818 See the Article 29 Data Protection Working Group, ibid., p. 16. 819 See Simitis, ibid., § 4a cip. 81 and 82; Däubler/Klebe/Welde/Weichert, BDSG, § 4a cip. 18. 820 BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83, cip. 161. 821 BVerfG, ibid., cip. 180. 822 See the Article 29 Data Protection Working Group, ibid., p. 16.
Regarding the requirement of ‘making the specified purpose explicit’, it stresses that the context may be sufficient to inform the individual about the purpose of the processing.823 Indeed, a data controller could significantly increase the probability that such purposes meet the requirement of purpose specification if it provides examples of how the data will be processed and used. In this regard, the Working Party stated: “For ‘related’ processing operations, the concept of an overall purpose, under whose umbrella a number of data processing operations take place, can be useful.”824 However, it adds “that controllers should avoid identifying only one broad purpose in order to justify various further processing activities which are in fact only remotely related to the actual initial purpose.”825 Hence, the context itself does not seem to provide sufficient criteria to legitimize purposes such as ‘prudent business management’ or ‘usual support of the authorizing person’. In conclusion, a solution to the question of how precisely the data controller has to specify the purpose could be an objective scale which assists in defining the context.826
ee) A different scale for ‘purpose specification’ pursuant to the German concept of protection
In any event, the German Constitutional Court appears to consider two different objective scales in order to determine the degree of precision of the purpose specified, on the one hand, by the legislator and, on the other hand, by data controllers in the private sector. It considers purposes provided for by law for the treatment of data by the State as lawful if they result, for example, from the type of collection, such as a ‘census for population, profession, housing, and work areas’, or pursue the ‘prevention, intelligence, and criminal prosecution of international terrorist attacks’ or other explicitly listed crimes.827
823 See the Article 29 Data Protection Working Group, ibid., p. 18. 824 See the Article 29 Data Protection Working Group, ibid., p. 16. 825 See the Article 29 Data Protection Working Group, ibid., p. 16. 826 See introduction under point B. III. 5. Values as normative scale defining “contexts” and “purposes”. 827 See above under point C. II. 1. c) ee) Examples for specific purposes: Certain areas of life or explicitly listed crimes.
In this regard, the Court considers whether the collection of data occurs in order to prevent abstract or concrete dangers. The law which was questioned in the case of “Dragnet Investigation” had authorized the collection of ‘other data which is necessary for the concrete case’. The Court concluded that this notion can be, in light of the overall aim, typified in a way that it is proportionate.
In contrast, without reference to a concrete threat, it was not possible to interpret the notion in a way which limits the data concerned. If the notion referred only to an abstract threat, this “would infringe the constitutional requirements for the clarity of law.”828 As a consequence, the ‘defense of an abstract danger’ or notions which refer to unknown purposes are not sufficiently precise. On the other hand, the Court considered an individual’s consent not sufficiently precise where it authorized his or her insurance company to “retrieve appropriate information from” certain types of medical institutions in the health care sector such as doctors, hospitals, etc.829 All of this data was intended to be gathered for the purpose of ‘approval and execution of the policy services’. Interestingly, if the Court had strictly applied its considerations made in the decision of “Dragnet Investigation”, it would probably have concluded that the release of confidentiality was sufficiently precise. Indeed, the Court considered the release of confidentiality as ‘comparable with a general authorization to retrieve sensitive information with respect to the insurance event (…)’ because the broad term ‘appropriate’ did not enable the policy-holder ‘to pre-estimate which information can be retrieved on the basis of the authorization’. However, the release of confidentiality required a ‘concrete (insurance) case’ as a pre-condition for the collection of the data. And from this angle, it would have been possible to sufficiently pre-determine which data is appropriate ‘for the concrete case’. In conclusion, even if the release of confidentiality required, as a pre-condition for the retrieval of personal data, a concrete insurance case, the Court considered the consent given as not being sufficiently precise. The reason for this appears to be that the Court referred to two different objective scales in order to determine the degree of precision of the purpose specified, on the one hand, by the legislator through the means of legal provisions and, on the other hand, by the data controller in the private sector gathering the individual’s consent.
828 See BVerfG, 4th of April 2006, 1 BvR 518/02, cip. 145 to 148. 829 See BVerfG, 1 BvR 2027/02, cip. 13.
In the case of “Release of Confidentiality”, which referred to the individual’s consent and not to an authorizing law, the decisive fact was, in the Court’s opinion, the extent of control that would have been possible on the basis of the “consent” as a protection instrument. From the Court’s point of view, the release of confidentiality was too broad because the individual concerned lost “the possibility to control her interests of confidentiality on her own”.830 Thus, the aim of giving an individual control over his or her own confidentiality justifies a different objective scale for determining the purpose than if the purpose is determined by an authorizing law. If the legislator authorizes the data processing on the basis of a legal provision, the individual has lost this possibility of self-control in any case. This appears to justify why purposes can be specified more broadly in an authorizing legal provision than in the individual’s consent.
ff) Interim conclusion: Do regulation instruments dictate the scale for ‘purpose specification’?
In light of this reasoning, it becomes apparent that the concrete regulation instrument might dictate the degree of precision of the purpose specified. In light of the concept of protection of the right to informational self-determination, this differentiation is reasonable. The German Constitutional Court considers the individual’s consent, in the private sector, as “the essential instrument in order to develop free and self-responsible actions in relation to third parties.”831 The Article 29 Data Protection Working Group also sees the individual’s consent as an expression of “self-determination”.832 Thus, both ideas lead to the result that the purpose must be more precisely specified within the individual’s consent than in an authorizing provision.
830 See BVerfG, ibid., cip. 43. 831 See BVerfG, ibid., cip. 31, 32, and 34. 832 See above under point C. II. 1. b) dd) Preliminary note: Clarifying conceptual (mis)understandings, referring to the Article 29 Data Protection Working Group, Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC, p. 13.
Indeed, whether the European Court of Justice applies a similar approach is not (yet) clear. In the decision of “Telekom vs. Germany”, it applied a more functional approach. In this case, the Court stated that “the consent given (…) to the publication of his personal data in a public directory relates to the purpose of that publication and thus extends to any subsequent processing of those data by third-party undertakings active in the market (…), provided that such processing pursues that same purpose.”833 The Court concludes from this that the transfer of personal data from one party to another pursuing the same purpose does not harm the individual’s right to data protection.834 In this instance, the Court does not refer to an individual’s self-determination but simply to his or her right to data protection. And in light of this right, the purpose specified within the individual’s consent simply ‘bundles’, from a normative perspective, several acts of data processing. Thus, so long as the data processing occurs for the same purpose that was made explicit when the first consent was given, it does not harm Article 8 ECFR. Such a function does not, per se, provide for stricter requirements regarding the purpose specified in the consent than in an authorizing law.835
b) Further ambiguities and possible reasons behind the same
However, there are further ambiguities regarding the concept of protection that become apparent in the legal discussion on the requirement of purpose specification in the private sector. While German legal scholars and the Article 29 Data Protection Working Group have a similar understanding regarding the requirements to specify the purpose and make the specified purpose explicit, their reasoning appears to intermingle the applicable concepts of protection provided for by the different constitutions. Even more so, further considerations will bring to light that certain requirements may simply be transferred from the public sector to the private sector. At least, this is the case with respect to the individual’s consent, in particular, the moment where these requirements are considered to be relevant, and the legal consequences if the requirements are not applied.
Examining these aspects, the subsequent considerations will, from time to time, refer to decisions by the German Constitutional Court regarding the right to informational self-determination.
833 See ECJ C-543/09 cip. 65. 834 See ECJ C-543/09 cip. 66. 835 However, see the discussion at Lynskey, The Foundations of EU Data Protection Law, pp. 190 et seq., as to whether the consent incorporates the concept of individual self-control enabling an individual not only to determine what can be done with data relating to him or her, but also who is allowed to do that.
Indeed, German basic rights barely apply when interpreting European law.836 However, a comparison with the German concept of protection helps one to better understand certain conceptual components and to decide which of these components should be incorporated in the concept of protection provided for by the European Charter of Fundamental Rights.
aa) Common understanding about the function of ‘purpose specification’
Firstly, the function of specifying purposes and making these purposes explicit, and how this function is interpreted at both the European and the German level, shall be analyzed. As mentioned above, the Article 29 Data Protection Working Group reviewed and discussed in its “Opinion 03/2013 on purpose limitation” the function of both requirements, i.e. to specify the purpose and to limit the later processing of data to the originally specified purpose. In its opinion, the requirement to specify the purpose serves, as quoted previously, to “determine whether data processing complies with the law, and to establish what data protection safeguards should be applied (…).”837 From this perspective, the specification of the purpose “requires an internal assessment carried out by the data controller and is a necessary condition for accountability.”838 This function is similar to the German concept of protection. The German Constitutional Court considered that “only when it is clear for which purpose the information is required and which possibilities of linking and usage exist, it is possible to answer the question of whether the infringement of the right to informational self-determination is constitutionally legal or not.”839 Therefore, the specification of the purpose plays, in relation to both concepts of protection, an essential role because it provides the legal link for subsequent legal requirements.
836 See above under point C. I. 1. a) The interplay between European Convention for Human Rights, European Charter of Fundamental Rights and German Basic Rights. 837 See the Article 29 Data Protection Working Group, Opinion 03/2013 on purpose limitation, pp. 13 and 15. 838 See the Article 29 Data Protection Working Group, ibid., pp. 13 and 15. 839 See BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83, cip. 159.
bb) Ambiguous understanding regarding the functions of ‘making specified purpose explicit’
Compared with the requirement to specify the purpose, the requirement of making the specified purpose explicit has another function. The Working Party considers, in this regard: “The purposes of collection must not only be specified in the minds of the persons responsible for data collection. They must also be made explicit. In other words, they must be clearly revealed, explained or expressed in some intelligible form. (…) The requirement that the purposes be specified ‘explicitly’ contributes to transparency and predictability. (…) It helps all those processing data on behalf of the controller, as well as data subjects, data protection authorities and other stakeholders, to have a common understanding of how the data can be used.”840 The German right to informational self-determination provides a function equivalent to the requirement of ‘making specified purposes explicit’. However, the German Constitutional Court locates this function in the principle of clarity of law, in particular with respect to the State. The principle of clarity of law “shall guarantee that public agencies find legal criteria for the execution of the law and that the judicial courts are able to control it; furthermore, the principle of clarity of law enables the citizens concerned to prepare themselves for potentially infringing measures.”841 However, while predictability plays an important role by protecting “the individual against information related measures (by the State) which he or she cannot foresee nor control”842, the right to informational self-determination safeguards, in the private sector, “that the legal order provides and maintains the legal conditions under which the individual is able to participate in communicational processes in a self-determined way and to develop his or her personality.”843 For this approach, “the contract is the essential instrument.”844
840 See the Article 29 Data Protection Working Group, ibid., p. 17. 841 See BVerfG, 13th June 2007, 1 BvR 1550/03, cip. 71, 73. 842 See BVerfG, ibid., cip. 74. 843 See BVerfG, 1 BvR 2027/02, cip. 33. 844 See BVerfG, ibid., cip. 34.
In light of this conceptual difference, the question on the European level is where the idea behind the requirement to ‘make the specified purpose explicit’ to the individual concerned originates. If there is no other justification for this requirement, it appears to introduce in the private sector a protection instrument that, in Germany, primarily protects individuals against data processing authorized by law. Indeed, Britz makes clear that this function may also be transposed to the private sector: informing the individual about the harm to his or her fundamental right may diminish its intensity because it principally enables the individual to adjust to it, to correct wrong data, and to seek legal protection against it.845 The European Court of Human Rights refers to a similar idea with respect to Article 8 ECHR: it examines the purpose pursued by the data controller in order to determine whether there is an infringement at all. The information provided by the controller to the individual concerned about the purpose of the data processing frames the individual’s “reasonable expectation”, enables him or her to react correspondingly, and therefore determines whether the data processing harms his or her right to private life or not.846 However, so far, the concept of protection provided for by the European Charter of Fundamental Rights is not sufficiently clear in order to answer the question on the precise function of the requirement to ‘make the purpose explicit’. In the case of “Digital Rights vs.
Ireland”, the Court indeed considered, in determining the intensity of the infringement, the unspecified threat to the individual concerned that may result from being constantly surveilled.847 However, this decision referred to the processing of personal data by the State, and it is still unclear whether, and if so, to what extent, this idea can and should be transferred to the private sector.848
845 See Britz, ibid., p. 584. 846 See above under point C. I. 3. c) b) cc) Particular reference to the individual’s “reasonable expectations”. 847 See ECJ C-293/12 and C-594/12 cip. 37 referring to Opinion of Advocate General Cruz Villalón delivered on 12 December 2013 on Case C-293/12, cip. 52. 848 See beneath under point C. II. 3. a) bb) (3) Function of making specified purpose explicit.
cc) Arguable focus on data collection for legal evaluation in the private sector
It appears to be exactly such a transfer of certain conceptual elements, originally developed for the processing of personal data by the State, to the private sector that has led legal scholars and the Article 29 Working Group to two further conclusions on this issue. First, it is commonly agreed that the legal evaluation of the specified purpose mainly focuses on the moment the data is collected.849 Consequently, the specified purpose must also be explicit before the personal data is collected. Recital 28 of the Data Protection Directive states, correspondingly, that the “purposes must be explicit and legitimate and must be determined at the time of collection”.850 The Working Party also stresses that “it follows from the previous analysis that this should not happen later than the time when the collection of personal data occurs.”851 Thus, this requirement not only applies to the consent given by the individual, but also to any kind of data processing in general. Such a broad understanding of the requirement might be questioned because it refers to all data collected, irrespective of how important the data is for the individual concerned. Since the definition of the term ‘personal data’ potentially covers any data that more or less relates to the individual, the individual can quickly be overwhelmed by information. The reason is that, in light of the increasing digitization of our society, the controller will be obliged to inform the individual more and more frequently about the processing of data that is somehow related to him or her. This is at least the case if the controller’s information duties are not adapted to the specific risk posed by the processing of that data. Of course, the Article 29 Working Party considers that the context may sufficiently determine for which purpose the controller uses the data and, thus, which information the controller has to provide to the individual.852 However, again, in order to fulfill this function, it must be clear how to define the context.853
849 See above under points C. II. 1. b) bb) (2) Legal opinion on the function of purpose specification, and C. II. 1. b) cc) Purposes of processing specified when consent is given. 850 See, however, recital 39 sent. 6 of the General Data Protection Regulation, which changes the “must”-requirement into a “should”-recommendation (see the possible impact of this amendment at the end of this paragraph).
With respect to the general requirement of purpose specification, it was only recommended that the specification of the purpose should be carried
out before the data is collected. However, with respect to the individual’s consent, the controller must specify the purpose before the data is collected.
851 See the Article 29 Data Protection Working Group, Opinion 03/2013 on purpose limitation, p. 17. 852 See the Article 29 Data Protection Working Group, ibid., p. 18. 853 See above under point B. III. 5. Values as a normative scale in order to determine the “contexts” and “purposes”.
The European Court of Justice stated in the case of “Telekom vs. Germany” that the individuals concerned must be “informed, before the first inclusion of their data in a public directory, of the purpose of that directory”.854 Furthermore, the European legislator clarified in Article 2 sect. 6 of the Civil Rights Directive that the marketing of electronic communication services or the provision of value-added services is only allowed on the basis of the individual’s ‘prior consent’. With respect to German ordinary law, legal scholars comparably agree that the consent must be given before the data is processed.855 However, the German Constitutional Court applies a more differentiated approach in this regard. Placing the contract at the center of the exercise of the right to informational self-determination, it declares, comparably to the public sector, the moment of the conclusion of the contract, i.e. the legitimate basis for the subsequent collection of the data, as the essential anchor point. However, it admits that the moment of the conclusion of the contract need not be the only possible moment for evaluating the subsequent data treatment. In the case of “Release of Confidentiality”, it acknowledged that the insurance company was, “in light of the variety of the events, not able to pre-list, already in the contract clause, all the information that might become relevant for the subsequent verification.”856 The Court therefore also considered moments subsequent to the conclusion of the contract in order to evaluate the final consequences of the treatment of data. In the Court’s opinion, such moments would have been possible by using alternative or supplementary mechanisms.857 In this regard, it is important to note that this decision only referred to one specific purpose. In contrast, the Article 29 Working Group argues that the individual can only give his or her consent in the course of an ongoing data processing operation if there is a new purpose.858
854 See ECJ C-543/09 cip. 67. 855 See, for example, Simitis, Federal Data Protection Law, § 4a, cip. 27. 856 See BVerfG, 1 BvR 2027/02, cip. 50 and 51. 857 See BVerfG, ibid., cip. 59 and 60. 858 See the Article 29 Data Protection Working Group, Opinion 15/2011 on the definition of consent, p. 34.
In conclusion, on the European level, the data controller has to comprehensively inform the individual about all purposes existing at the moment the data is collected. In the end, such a focus on the moment of collection by private parties corresponds to the strict requirement applied, in Germany, to the processing of personal data by the State.859 From this perspective, the slight liberalization foreseen in recital 39 sent. 6 of the General Data Protection Regulation might become very relevant. As stressed before, this recital does not require, but only recommends, that the controller make the purpose explicit at the moment the data is collected.
Thus, the European legislator now appears to have foreseen situations where it makes more sense to specify and make explicit the purpose at a later stage. Arguable legal consequences surrounding the validity of the consent The second arguable conclusion considered by legal scholars concerns the legal consequences resulting from the fact that the data controller does not meet the requirement to make the specified purpose explicit. With respect to Article 6 sect. 1 lit. a of the Data Protection Directive, some legal scholars consider that the controller must not process the data if the purpose of the data processing is unclear.860 In contrast, the Article 29 Working Party promotes that if a data controller fails to meet this requirement, it does not mean that the processing as such is illegal. Instead, “it will be necessary to reconstruct the purposes of processing, keeping in mind the facts of the case. While the publicly specified purpose is the main indicator of what the data processing will actually aim at, it is not an absolute reference: where the purposes are specified inconsistently or the specified purposes do not correspond to reality (for instance in case of a misleading data protection notice), all factual elements, as well as the common understanding and reasonable expectations of the data subjects based on such facts, shall be taken into account to determine the actual purposes.”861 In fact, the Working Party intermingles, here again, the requirements of ‘purpose specification’ and ‘making the specified purpose explicit’. The reason appears to be that the Working Party itself is not clear about which functions these requirements precisely have. dd) 859 See above under point C. I. 2. e) aa) (1) Principles of clarity of law and purpose limitation referring to the moment when data is collected. 860 See Ehmann/Helfrich, ibid., cip. 13. 861 See the Article 29 Data Protection Working Group, Opinion 03/0213 on purpose limitation, p. 18. C. The function of the principle of purpose limitation in light of Article 8 ECFR 310 In any case, the Working Group’s considerations are interesting with respect to the situation where the data processing is based on the individual’s consent. Usually, German legal scholars, as well as judicial courts consider that the consent is invalid if the controller does not sufficiently inform the individual about the processing of data. This is in particular the case, if the controller does not sufficiently specify the purpose of the data processing in the consent form.862 In light of the principles of good faith, the data controller might not be allowed to fall back on legal provisions authorizing the processing.863 Therefore, if the controller does not or, even worse, is not able to specify all purposes the moment it collects the data, it is not allowed to adapt its processing operations at a later stage in order to legitimize its processing of the data. Instead, the data processing as a whole is forbidden. With respect to the European level, the above-mentioned recommendation that ‘the purpose must not be so broad that it implicitly includes unlawful sub-purposes’864 points into a similar direction. However, the approach is arguable because it transposes the idea of the legal consequences of an infringement of the principle of clarity of law, which undoubtedly applies to actions of the State to data controllers operating in the private sector. 
On the German level, as stressed before, the Constitutional Court developed the requirements of clarity of law in combination with the principle of purpose limitation with respect to the State. In the case of “License Plate Recognition”, the Court elaborates on the function of this interplay as follows: if a processing purpose specified within the law that shall authorize the data processing does not exclude serious infringements of the fundamental rights of the individual concerned, this authorizing provision must meet the strict proportionality requirement also for those serious infringements.865

862 See above under point C. II. 1. c) Requirements for consent and consequences of its failure; Kramer, ibid., § 4a BDSG cip. 12, 13 and 22 with further references to Gola/Schomerus, ibid., § 4a cip. 22; Plath, ibid., § 4a cip. 29; OLG Köln, decision from the 17th of June 2011 (6 U 8/11).
863 See Kramer, ibid., § 28 BDSG cip. 60 to 61 with further references; Gola/Schomerus, Federal Data Protection Law, § 4 cip. 16; in contrast, see Article 17 sect. 1 lit. b GDPR, which excludes the individual’s right to require the controller, based on an objection to his or her consent, to delete the personal data if the controller can base the processing on another legitimate ground foreseen by law.
864 See Dammann/Simitis, ibid., cip. 7.
865 See BVerfG, 11th of March 2008, 1 BvR 2047/05 and 1 BvR 1254/07, cip. 95.

And, in the case of “Data Retention”, the Court explicitly stressed the idea behind that function. With respect to the treatment of data by the intelligence services, which in turn provide their results to State authorities, the Court clarified that “the constitutional limits of these authorities using the data (later on) must not be undermined by a wider authorization for the preceding usage (by the Intelligence Services).”866 Thus, the flux of data and the retrieval of information are principally bound to the requirement that the later usage of information must already be determined at the moment the data is first collected.

That said, it becomes apparent that the idea that an individual’s consent is illegal as a whole if it does not specify possible harm in advance imposes the strict requirements for state data processing equally on private parties: private parties, like the State, would have to specify and make explicit their purposes at the moment the data is collected, excluding all possible later processing that might harm an individual’s fundamental rights in another way than specified. In contrast, the above-mentioned consideration of the Article 29 Data Protection Working Party that the purpose must be re-constructed, pursuant to the real circumstances of a data processing, points in another direction.867 Indeed, these considerations referred to the requirement of ‘making specified purposes explicit’ and not to the consent. However, if transferred to the consent, these considerations could mean that the consent would not be illegal as a whole. Rather, the alternative could be that the purpose specified in the consent simply answers the question of whether or not a later processing activity is still covered by the consent. In this case, the question would not be whether the consent is illegal as a whole, but whether the specific later processing of data is legal or not.
866 See BVerfG, ibid., cip. 233.
867 See the Article 29 Data Protection Working Group, Opinion 03/2013 on purpose limitation, p. 18.

c) The lack of a legal scale for ‘purpose specification’ in the private sector

The preceding criticism provided several arguments that the ambiguous understanding of the different concepts of protection (provided for by different constitutions) led to a transfer of requirements initially developed for the State to data controllers operating in the private sector. In light of this transferal, it is astonishing to note that the requirement to specify the purpose is actually stricter for data controllers operating in the private sector than for the legislator authorizing the data processing by law.868 This is even more the case since a closer look at the structural conditions surrounding the requirement of purpose specification will reveal, in this sub-chapter, that private data controllers additionally have, in practice, more difficulty specifying the purpose than the State. This result is particularly relevant since there are, overall, only few reliable criteria which help determine the precision of the purposes being specified.869 This sub-chapter therefore goes on to examine criteria that may help private data controllers fulfill their task. In doing so, the following will be examined: first, the differentiation between the terms ‘purpose’, ‘means’, and ‘interests’; and second, which ‘purpose’ or which ‘interest’ is, from a time perspective, sufficiently specified.

868 See above under point C. II. 2. a) Difference in precision of purposes specified by legislator and data controllers.
869 See above under points C. II. 1. b) bb) Criteria discussed for purpose specification, and C. II. 1. b) cc) Purposes of processing specified when consent is given, and C. II. 1. c) dd) (3) Discussion on degree of precision of specified purpose.
870 See above under point C. II. 1. a) ECtHR and ECJ: Almost no criteria.

aa) No legal system providing for ‘objectives’ of data processing in the private sector

As mentioned above, the European Courts provide few criteria that help specify a purpose of data processing.870 In contrast, the German Constitutional Court has elaborated, during the last 30 years, a rather detailed approach. Indeed, this approach mainly refers to purposes specified by the State. This is the crucial point because the State is able to refer, in order to specify the purposes of its processing of data, to a rather extensively developed legal system. Such a legal system helps to specify the purposes because it extensively provides for the objectives as to how the data shall be processed. In the case of “Retrieval of Bank Account Master Data”, the German Court highlights this function in particular. In this case, the law for the encouragement of tax compliance authorized the retrieval of data by state authorities from private banks only under the condition that the concrete provision had to relate to the income tax act. The German Court came to the conclusion that such a “scope of application would be unlimited in light of the fact that the income tax act contains numerous notions without concrete references to tax law which also exist in a multitude of other laws with totally different objectives.
(Underlining by the author)”871 Thus, the crucial point to consider here is that the objectives of a certain law, to that a legal provision authorizing a data processing refers, not only help specify the purpose for the processing, but also implies the consequences for the individuals concerned.872 Accordingly, most German legal scholars who elaborate on a more comprehensive approach in order to determine the requirement of purpose specification, discuss this with respect to the State. In doing so, they refer to specific tasks and functions of public agencies formulated by the legislator.873 These tasks and functions determined under State Law help resolve the purpose of the data processing, to a remarkable extent. Accordingly, Eifert highlights, in particular, that the legal order provides, “in light of the legal reservation and the principle of purpose limitation a relatively precise image of the flux of information between public agencies”.874 In contrast, data controllers operating in the private sector do not have such a reference system at their disposal; they cannot refer to established laws determining their “tasks and functions” in the private sector.875 The consequences of data processing in the private 871 See BVerfG, 13th June 2007, 1 BvR 1550/03, cip. 79 and 80. 872 See, for example, BVerfG, 11th March 2008, 1 BVR 2047/05 and 1 BvR 1254/07 (License Plate Recognition), cip. 98 to 178; BVerfG, 13th June 2007, 1 BvR 1550/03 (Retrieval of Bank Account Master Data), cip. 79 to 124; BVerfG, 14th July 1999, 1 BvR 2226/94 (Surveillance of Telecommunications), cip. 180 and 181; BVerfG, 3rd March 2004, 1 BvR 2378/98 (Big Eavesdropping Operation), cip 307 to 319; BVerfG, 4th April 2006, 1 BvR 518/02 (Dragnet Investigation), cip. 145 to 149; cf. also the Article 29 Data Protection Working Group, “Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 96/46/EC“, pp. 19 and 20 as well as pp. 21 and 22. 873 See, for example, Hofmann, Purpose Limitation as Anchor Point for a Procedural Approach in Data Protection, p.76 ff., Forgó/Krügel/Rapp, Purpose Specification and Informational Separation of Powers, p. 35 f. m. w. N. 874 See Eifert, Purpose Congruence instead of Purpose Limitation, p. 151: “(...) angesichts des eng verstandenen Gesetzesvorbehalts und der Zweckbindung ein relativ gutes Abbild der Informationsströme zwischen den Verwaltungen“. 875 At least, such a solution is barely discussed in legal literature; however, see the approach of Buchner, ibid., pp. 262 and 263, who refers to the German Civil Law in order to assess the legitimacy of the controller’s interest in the data processing. C. The function of the principle of purpose limitation in light of Article 8 ECFR 314 sector are thus less predictable because the flux of information cannot be so extensively predicted, in light of the diversity of participants, their actions and intentions, as well as their entanglements in a free market economy.876 Thus, in practice, private data controllers have less possibilities at their disposal in order to specify the purpose of its data processing than public agencies. Differentiating between the terms ‘purpose’, ‘means’ and ‘interest’ This is an astonishing result and it gives further reasons for why it is important to elaborate on reliable criteria that help data controllers acting in the private sector to specify the purpose of their data processing activities. 
Therefore, it seems to be promising to examine, precisely, the terms of ‘purpose’, ‘means’, and ‘interest’. Differentiating between these terms may help clarify the question of what purpose actually is legally relevant. As highlighted before, the term ‘purpose’ is mentioned in various Articles provided for by law. The term ‘means’ is mentioned, for example, together with the term ‘purpose’, in Article 2 lit. d of the Data Protection Directive and Article 4 sect. 7 of the General Data Protection Regulation determining who the ‘data controller’ is.877 On the German level, the German Constitutional Court also refers to the term ‘means’ as the way of how data is processed and, in doing so, differentiates it from the term ‘purpose’.878 Finally, while Article 28 sect. 1 sent. 1 no. 2 of the German Federal Data Protection Law only refers to ‘interests’, Article 7 lit. f of the Data Protection Directive refers to both terms ‘purpose’ and ‘interests’ as: “Personal data may be processed only if processing is necessary for the purposes of the legitimate interests pursued by the controller or by the third party or parties to whom the data are disclosed”. It is therefore important to know how these notions can be differentiated from each other. bb) 876 Cf. Bäcker, Constitutional Protection of Information regarding Private Parties, p. 100. 877 See above under point C. II. 1. b) (2) Liability for ’data processing’: ’Controller’ and ’processor’. 878 See above under point C. II. 1. c) ee) (1) (c) Examples for unspecific purposes: Abstract dangers or unknown purposes, referring to BVerfG, 11th of March 2008, 1 BVR 2047/05 and 1 BvR 1254/07, cip. 99. II. The requirement of purpose specification and its legal scale 315 An analysis of all three terms may provide criteria in order to determine which purpose is legally relevant. ‘Interests’ protected by the controller’s fundamental rights The Article 29 Data Protection Working Group provides some guidelines on how to differentiate between ‘purposes’, ‘means’ and ‘interests’. With respect to the difference between the terms ‘purpose’ and ‘means’ it defined the first as an “anticipated outcome that is intended or that guides planned actions” and the second as “how a result is obtained or an end is achieved”.879 It elaborates on these definitions as: “(…) determining the purposes and the means amounts to determining respectively the ‘why’ and the ‘how’ of certain processing activities.”880 With respect to the difference between the terms of ‘purpose’ and ‘interest’, the Group furthermore states: “The concept of ‘interest’ is closely related to, but distinct from, the concept of ‘purpose’ mentioned in Article 6 of the Directive. In data protection discourse, ‘purpose’ is the specific reason why the data are processed: the aim or intention of the data processing. An interest, on the other hand, is the broader stake that a controller may have in the processing, or the benefit that the controller derives – or that society might derive – from the processing. For instance, a company may have an interest in ensuring the health and safety of its staff working at its nuclear powerplant. Related to this, the company may have as a purpose the implementation of specific access control procedures which justifies the processing of certain specified personal data in order to help ensure the health and safety of staff.”881 In conclusion, the Working Group defines the ‘purpose’ referring to the ‘why’ of the data processing. 
It defines the ‘means’ by referring to ‘how’ this purpose is obtained. And, it defines the ‘interest’ by referring to the ‘benefit that the controller derives’ from that purpose. At a first glance, these definitions appear to provide reliable criteria in order to differentiate (1) 879 See the Article 29 Data Protection Working Group, Opinion 1/2010 on the concepts of ’controller’ and ’processor’, p. 13. 880 See the Article 29 Data Protection Working Group, ibid., p. 14. 881 See the Article 29 Data Protection Working Group, Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 96/46/EC, p. 24. C. The function of the principle of purpose limitation in light of Article 8 ECFR 316 between the terms. However, applying them to a particular case, it becomes apparent that the definitions highly depend on the circumstances of the case at hand. In the example provided for in the second chapter, the publisher of an online newspaper used an analytical tool in order to review the ‘usage data’ of visitors of its website.882 While the ‘purpose’ might be the improvement of the website experience, the analysis would be the ‘means’, and the ‘interest’ could be to increase the user traffic. However, this ‘interest’ could also be the ‘purpose’ for the ‘interest’ to increase the price for banner advertisement and the improvement of the website would then be the ‘means’. Accordingly, this broader ‘interest’ could be the ‘purpose’ for the even broader ‘interest’ to finance the costs for the journalistic labor of the website holder and the ‘means’ would be the efforts of increasing the user traffic. Finally, this ‘interest’ could again be the ‘purpose’ for the ultimate ‘interest’ of surviving on the private market and so on. In conclusion, each ‘purpose’ could become the ‘means’ for the next following ‘purpose’, and each ‘interest’ the ‘purpose’ for the next broader ‘interest’. The question therefore remains: How to differentiate between purposes, means and interests? Or, in other words, if means and interests can also be considered as purposes, which of these purposes are deemed to be legally relevant? In fact, this question cannot be answered by technically differentiating between the terms ‘purposes’, ‘means’ and ‘interests’. Instead, it can only be answered, from a normative perspective, through an objective scale. The examples provided for by the Article 29 Working Group demonstrate that the method proposed leads to a circular reasoning. With respect to the difference between the terms of ‘purpose’ and ‘interest’, the Working Group exemplifies, as listed previously, possible ‘legitimate interests’ as: Conventional direct marketing and other forms of marketing or advertisement; unsolicited non-commercial messages; employee monitoring for safety or management purposes; physical security, IT and network security; processing for historical, scientific or statistical purposes; processing for research purposes (including marketing research). Most of these ‘interests’ are not only ‘purposes’ authorized by law but the Working Party itself also names them ‘purposes’! However, other examples given by the Working Party for ‘legitimate interests’ point to a solution, which provides 882 See above under point B. III. 4. Clarifying the relationship between ”context“ and ”purpose“, and 5. Values as normative scale determining ”contexts“ and ”purposes“. II. The requirement of purpose specification and its legal scale 317 an objective scale in order to define ‘interests’. 
In order to evaluate the importance of the ‘legitimate interests’ of the data controller during the balancing exercise with the opposing interests, the Working Group also refers to the data controller’s fundamental rights. These indeed protect ‘interests’ and therefore provide an objective scale for determining the ‘interests’ of the data controller. Is the ‘purpose’ determined by the individual’s fundamental rights? With respect to the definition of the term of ‘purpose’, the German Constitutional Court pointed, in its decision of “License Plate Recognition” into the same direction, even if it did so in favor of the individual. Again, the decisions provided for by the German Constitutional Court do not provide criteria for the interpretation of European laws.883 However, the decisions can provide a source of inspiration for how the terms could be differentiated on a European level. In the case of “License Plate Recognition“, as illustrated before, the law offended permitted the collection of data related to license plates of cars for the purpose of checking it against police data files that were open for investigation. The Court came to the conclusion that the law offended did not provide “concrete requirements for the state measure, it especially did not pre-determine the reason and the purpose of usage which was sufficiently specific for certain areas and legally clear.”884 The law offended has indeed named the ‘purpose’ of the data collection as ‘checking against the data stored in the police files open for investigation’. However, the Court argued that this term “does not determine the purpose for that the collection and the checking of the data shall finally serve. Only the manner how an investigation purpose shall be, after the collection of the data, achieved is mentioned. This purpose (itself) indeed remains open.”885 The Constitutional Court hence considered that the ‘checking of the data collected against other data stored in police files’ was not the ‘purpose’ but the ‘means’. The actual ‘purpose’ instead was the notion of ‘open investigation’. In the Court’s opinion, this notion did (2) 883 See above under point C. I. 1. a) The interplay between European Convention for Human Rights, European Charter of Fundamental Rights and German Basic Rights. 884 See BVerfG, 11th of March 2008, 1 BVR 2047/05 and 1 BvR 1254/07, cip. 98. 885 See BVerfG, ibid., cip. 99. C. The function of the principle of purpose limitation in light of Article 8 ECFR 318 not sufficiently specify the purpose because there was no legal or commonly accepted definition of the term.886 In particular, the fact that the purpose was so broad did not exclude the possibility to use the collected data for police surveillance purposes or even for purposes of criminal prosecution.887 The last two considerations finally point to the solution, which is based, again, on an objective scale and determines which purpose is legally relevant or not. As described before, the German Constitutional Court examines, whether the informational measures by the State are constitutional or not. The German Court does so, by assessing, at first, whether the information measure offended constitutes an infringement or not. 
This is the case if it provides ‘an insight into the personality’ of the individual concerned and the ‘state interest, with respect to the overarching context and with respect to the purpose’ either constitutes a ‘specific danger for the freedom of action and of being private’ or if it ‘qualitatively affects a person’s fundamental right’ or if it can ‘essentially concern the individual’s interests’.888 Whatever the concrete scale might be, evaluating the intensity of the infringement, the Court also takes the other fundamental rights of the individual concerned into account.889 For example, it considers the right to privacy of the home or telecommunications.890 It also takes the individual’s risk of being stigmatized into account, in particular, if the treatment of data refers to criteria, such as religion or ethnic origin, listed in Article 3 of the German Basic Law, which guarantees the freedom of equality.891 In addition, it stresses that the individual’s fear of being surveyed can, in advance, lead to a bias in communication, which is protected by the freedom of opinion.892 Finally, taking the disadvantages for the individuals into account, the Court considers their risk of being an object of state investigations, which adds to their general risk of being unreasonably 886 See BVerfG, ibid., cip. 100. 887 See BVerfG, ibid., cip. 136 and 149. 888 See above under point C. I. 2. d) Infringement by ’insight into personality’ and ’particularity of state interest’. 889 See in detail above under point C. I. 2. e) aa) In the public sector: Interplay between the three principles clarity of law, proportionality, and purpose limitation. 890 See BVerfG, 4th of April 2006, 1 BvR 518/02, cip. 93 (Dragnet Investigation). 891 See BVerfG, 4th of April 2006, 1 BvR 518/02 (Dragnet Investigation), cip. 106. 892 See BVerfG, 3rd of March 2004, 1 BvR 2378/98 (Big Eavesdropping Operation), cip. 230. II. The requirement of purpose specification and its legal scale 319 suspected.893 This last consideration refers, at least implicitly, to the right to a fair trial and/or the individual’s general freedom of action. In conclusion, be it in relation to the infringement by an informational measure, or in relation to the proportionality of the infringement, the German Constitutional Court refers to the individual’s basic rights. Fundamental rights can therefore not only provide an objective scale in order to determine the ‘interests’ on behalf of the controller, but also, vice versa, the ‘purpose’ of the data processing with respect to the fundamental rights of the individual concerned. Thus, while the fundamental rights of the controller of the personal data can provide a scale for determining its interests, the opposing fundamental rights of the individual concerned could provide a legal scale in order to specify the purpose of the data processing. This differentiation at least enables one, so far, to clarify both terms used in Article 7 lit. f of the Data Protection Directive and Article 6 sect. 1 lit. f of the General Data Protection Regulation. While the term ‘interest’ refers to the fundamental rights of the controller, the term ‘purpose’ may refer to the fundamental rights of the individual who is concerned by the processing of data concerning him or her. Inclusion or exclusion of future ‘purposes’ and ‘interests’ Another question is which ‘interests’ and ‘purposes’ are recognized in terms of time. With respect to the ‘interests’ mentioned in Article 7 lit. 
f of the Data Protection Directive, the Article 29 Working Group states that there must be “a real and present interest, something that corresponds with current activities or benefits that are expected in the very near future. In other words, interests that are too vague or speculative will not be sufficient.”894 With respect to Article 2 lit. h of the Data Protection Directive, legal scholars comparably argue that the ‘specific’ consent does not exclude future acts of usage, but rather must refer to concrete circumstances, including the purpose of the processing.895 At first view, both statements bb) 893 See BVerfG, 3rd of March 2004, 1 BvR 2378/98 (Big Eavesdropping Operation), cip. 227; BVerfG, 4th of April 2006, 1 BvR 518/02 (Dragnet Investigation), cip. 103. 894 See ”Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 96/46/EC“, p. 24. 895 See Dammann/Simitis, ibid., cip. 22. C. The function of the principle of purpose limitation in light of Article 8 ECFR 320 seem to refer to the same question: How specific must the controller’s interest be, or, how specific must the consent be specified in terms of time? However, the answer depends on the fundamental rights. If the fundamental rights provide a legal scale in order to determine, on the one hand, the ‘interests’ on behalf of the controller and, on the other hand, the ‘purpose’ of the data processing with respect to the individual concerned, it becomes clear that there actually are two different starting points for answering this question. Present interests vs. future interests As proposed previously, a data controller’s ‘interest’ is determined by its fundamental rights. This differentiation refers to controllers acting through the private sector. With respect to the state processing of personal data, the Data Protection Directive, as well as the General Data Protection Regulation, also use the term ‘interest’, in more precise words, ‘public interest’.896 However, the State, as the controller does not process personal data in favor of its own fundamental rights. Instead, the State processes personal data in order to protect the fundamental rights of third private parties or other constitutional positions conflicting with the individual’s fundamental rights.897 In Germany, the German Constitutional Court summarizes, in its recent decision of “Federal Bureau of Investigation Law”, how the legislator has to specify these “interests”: in particular, first, it must specify the object of protection being protected by the data processing; second the task of the public agency that is allowed to process the personal data; and third, the reason given for the data processing. In the Court’s opinion, the reason, such as an abstract or concrete danger for the object of protection that shall be protected, does not result from the principle of purpose limitation but from the principle of clarity of law. If the collected data is re-used, for the same purpose by the same public agency, the Court slightly liberalized, in this decision, its approach. Before this new decision, the re-use of data required the same reason to be given as the initial reason (e.g. an urgent (1) 896 See, for example, Article 7 lit. e of the Data Protection Directive and Article 6 sect. 1 lit. e of the General Data Protection Regulation. 897 See above under point C. I. 1. b) bb) (1) The 3-Step-Test: Assessing the defensive and protection function. II. 
The requirement of purpose specification and its legal scale 321 danger for human life). Instead, pursuant to the recent case, the re-use of personal data does not require anymore the same reason to be given for its collection, but only for a so-called investigative reason.898 In any case, even if the Court slightly liberalized, in this regard, the concept of protection, the duty of the legislator to specify the reason still restricts, essentially, the State from data processing and, therefore, still constitutes an important element in the proportionality assessment.899 The Article 29 Data Protection Working Group applies a similar approach with respect to data controllers acting through the private sector requiring “a real and present interest, something that corresponds with current activities or benefits that are expected in the very near future.”900 Again, the reasoning provided for by the German Constitutional Court shall not serve, of course, as a source for the interpretation of European secondary law. However, also with respect to the European constitution, there is a difference principally between the State and private parties being regulated.901 Thus, the Working Group has to justify why it wants to regulate private parties similar or equal to the State. Private parties are not bound to the principle of clarity of law. In contrast, they are themselves protected by fundamental rights.902 There must hence be another reason justifying the restriction that their ‘interest’ must be ‘a real and present interest’. As highlighted before, fundamental rights do not only protect present interests, but also broader expectations, even against unspecific risks.903 At least, the right to freedom to conduct a business under Article 16 ECFR covers, as the more general right compared to the fundamental rights to occupation and property under Articles 15 and 17 ECFR, all 898 See above under point C. II. c) ee) (1) (d) Liberalization of the strict requirement by referring to the object of protection, referring to BVerfG, 20th of April 2016, 1 BvR 966/09 and 1 BvR 1140/09 (Federal Bureau of Investigation Law), cip. 289. 899 See, in particular, BVerfG, 11th of March 2008, 1 BVR 2047/05 and 1 BvR 1254/07 (License Plate Recognition), cip. 75; BVerfG, 13th June 2007, 1 BvR 1550/03 (Retrieval of Banking Account Matser Data), cip. 71, 73 and 74. 900 See the Article 29 Data Protection Working Group, Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 96/46/EC, p. 24. 901 See above under point C. I. 1. b) The effects of fundamental rights on the private sector. 902 See above under point C. I. 1. b) aa) Third party effect, protection and defensive function. 903 See above under point B. II. 3. c) Interim conclusion: Fundamental rights determining appropriateness of protection. C. The function of the principle of purpose limitation in light of Article 8 ECFR 322 kinds of business activity.904 This right apparently protects, therefore, not only present profit prospects but also strategic aims. As a consequence, the restriction of a controller’s data processing operations to ‘present’ interests must thus be justified by the prevailing interests of the individual concerned covered by his or her fundamental rights. Purpose specification pursuant to the type of threat? With respect to the ‘purpose’ of the data processing, the question to consider is which type of threat the processing of data causes for the individual’s fundamental rights. 
The German Constitutional Court provides for the following differentiation: “the right to informational self-determination supplements and broadens the constitutional protection of freedom of action and of privacy by extending its scope already at the level of danger for the personality. Such a danger can already exist before the concrete threat of certain objects of legal protection, especially if personal information is used and combined in a manner that the individual is unable to overview or control it.”905 In chapter B. II. Data protection as a risk regulation, the differences that exist between the terms of ‘danger’ and ‘risk’ and which protection instruments are appropriate for these different types of threat were discussed. Different theories were presented, not in order to decide which theory prevails, but instead, in order to highlight the fact that different threats require different protection instruments. An essential difference concerned the fact of whether the object of the threat is known or not. If the threat is not known, effective instruments often require the threat-causing entity to gather or provide the information necessary in order to monitor the threats and discover, if so, the threat for a specific object of protection. This protection instrument constitutes a low regulatory burden because it only slightly restricts the room of action of the “threat causing” data controller. Simultaneously, the risk discovery function of this protection function safeguards the possibility of avoiding or at least reducing the threat before it turns into real harm. Thus, the question for example is whether the purpose specified by the controller must refer to threats for specific objects of (2) 904 Cf. Folz, Article 15 ECFR – Freedom to Conduct a Business, cip. 3. 905 See BVerfG, 13th June 2007, 1 BvR 1550/03, cip. 64 (”Kontostammdatenabfrage“). II. The requirement of purpose specification and its legal scale 323 protection, only, or whether it can also refer to unspecific threats.906 This question leads to the function of the requirement to specify the purpose with respect to the fundamental rights of the individual concerned. Summary of conceptual ambiguities The previous criticism carved out several arguable considerations made in the legal discussion with respect to the processing of personal data in the private sector: First, the current framework mainly refers to the purpose of the collection of personal data, for the private sector just as for the public sector, in order to evaluate the need for protection against the risks caused by the processing as a whole. But focusing on the moment of collection conflicts, in principle, with innovation processes in non-linear environments. The reason for this is that focusing on the moment of collection requires the controller to predict the later use of data, albeit the outcome of innovation processes is hardly predictable. Second, some legal scholars consider the individual’s consent invalid as a whole, if the private data controller did not specify the purpose in a sufficiently precise or comprehensive manner at the outset. This approach actually transfers the concept of protection, applicable for the processing of data by the State, to the private sector. Indeed, in the public sector, a law authorizing the processing of personal data is principally invalid as a whole if it is disproportionate. 
In contrast, in the private sector, it would be possible that an individual’s consent containing a very broad purpose is not invalid as a whole. Instead, such a consent could be considered as providing the basis only for such data processing that corresponds to the purposes specified in the consent. Other data processing activities that harm the individual more than specified do not render the consent void per se; such processing would simply not be covered by the individual’s consent. Finally, comparing the requirements considered, on the one hand, for the purpose specified in the consent and, on the other hand, within the law itself brings to light the following result: the effects are, in practice, stricter on the private sector than on the public sector. This result is particularly surprising in light of the fact that the legislator is, unlike private controllers, directly bound to the individual’s fundamental rights. The reason for all of these results may be that the concept of protection initially developed with respect to the processing of data by the State is directly transferred to the data processing in the private sector.907 However, the previous criticism also sheds light on a possible solution for this contradictory result. When elaborating on a possible solution for the problem of how one could differentiate between the terms ‘purpose’, ‘means’, and ‘interest’, it was found that the fundamental rights of both the individual concerned and the controller could, respectively, provide for the necessary objective legal scale. Hence, the individual’s fundamental rights could also provide a legal scale in order to determine which purpose of the data processing is legally relevant and, as a consequence, how precisely a private data controller has to specify the purpose of its data processing. The subsequent analysis will demonstrate how this may work, applying the framework of the regulation of risks, as illustrated in the second chapter.

906 See above under point B. II. 3. c) Interim conclusion: Fundamental rights determining appropriateness of protection.

3. Solution approach: Purpose specification as a risk-discovery process

Data protection law is considered to be a regulation of risks caused by the processing of personal data. One of the challenges of such a risk-based approach is to find an objective scale for measuring the impact of risks on the individuals concerned and society as a whole. Without such an objective scale, the risk-based approach itself runs the risk of turning into a self-legitimizing procedural practice for data controllers.908 With respect to the question of the object of data protection laws, scholars argue that these laws protect the individual’s autonomy. Indeed, since the concept of individual autonomy is rather broad and therefore barely provides clear criteria for a legal concept of protection, scholars, as well as Constitutional Courts, refer to the specific context of a data processing activity. Nissenbaum argues, in particular, that such a “context”-based approach helps to determine the “informational norms” that govern specific contexts and, as such, provides a better framework for assessing the individual’s privacy

907 See above under point B. II. Data protection as a risk regulation.
908 See above under point B. II. Data protection as a risk regulation.

II.
The requirement of purpose specification and its legal scale 325 than a “purpose” of data processing.909 However, this thesis has clarified that the “purpose” of the processing of personal data constitutes just another legal link for regulation, focusing on risk protection. This legal link, i.e. the purpose, determines the intended “future” context of the data processing and enables regulators, data controllers, and individuals concerned to determine and adapt, in advance, to the “informational norms” that govern a certain context. Using the purpose as a legal link for determining a future context hence avoids the risk of an infringement of its “contextual integrity”. Indeed, the definition of the context depends on “values” inherent in a social context,910 and consequently, the definition of the purpose also requires an objective scale in order to determine which context (aka purpose) is legally relevant.911 Therefore, the search for an objective scale draws attention to the concept of protection. Interestingly, the concept of protection elaborated on by the German Constitutional Court regarding the right to informational self-determination does not provide, so far, reliable criteria in order to determine the contexts and purposes, at least, not in the private sector.912 Similarly, the concept of protection that the European Court of Justice had started to elaborate on with respect to the fundamental rights to private life and to data protection under Article 7 and 8 ECFR, does not provide reliable criteria either. So far, the discussion mainly treats the question of the exact interplay between the fundamental right to private life under Article 7 ECFR and the fundamental right to data protection under Article 8 ECFR.913 However, it was demonstrated that both scopes of protection essentially are defined by referring exclusively to the 909 See above under point B. III. 4. Clarifying the relationship between ”context“ and ”purpose“, referring to Nissenbaum, Respect for Context as a Benchmark, p. 291 and 292. 910 See above under point B. III. 4. Clarifying the relationship between ”context“ and ”purpose“, referring to Nissenbaum, Respect for Context as a Benchmark, p. 292, and point B. III. Theories about the value of privacy aka data protection. 911 See above under point B. III. 4. Clarifying the relationship between ”context“ and ”purpose“, referring to Nissenbaum, Respect for Context as a Benchmark, p. 292, and point B. III. Theories about the value of privacy aka data protection. 912 See above under point C. II. 1. c) ee) Comparison with principles developed by German Constitutional Court. 913 See above under point C. I. 3. a) Genesis and interplay of both rights. C. The function of the principle of purpose limitation in light of Article 8 ECFR 326 term “personal data”. Such a concept of protection leads, in light of the increasing digitization of society, to the displacement of the other, eventually more specific, fundamental rights. This theoretical finding is particularly relevant because these other fundamental rights could actually provide the necessary criteria in order to determine the context in which data processing occurs and, correspondingly, the purpose of data processing.914 On the basis of these findings, this chapter proposes, in its first sub-chapter, a concept of protection for the fundamental right to data protection under Article 8 ECFR, avoiding the criticized “broadness and vagueness” of its scope. 
The second part illustrates the functioning of this concept of protection with respect to the substantial guarantees provided for by the other fundamental rights to privacy, freedom, and non-discrimination of the individual concerned by the processing of data concerning him or her. The last part concludes, by emphasizing that this concept of protection corresponds with the openness of data-driven innovation in the private sector. Regulative aim: Data protection for the individual’s autonomy This thesis promotes that the essential value added by the fundamental right to data protection under Article 8 ECFR consists of the following elements as subsequently assessed: The first chapter examines individual autonomy as the ultimate objective of the right to data protection, which is an essential pre-condition for a free and democratic civil society. However, since individual autonomy is itself a too broad concept in order to provide a precise scale determining specific requirements for the risks caused by the processing of personal data, the specific requirements must be determined by the totality of all fundamental rights. This leads to the second and third elements, which will be examined in the next chapter. The right to data protection under Article 8 ECFR regulates, as a central norm, the risks caused by the processing of personal data for all fundamental rights and freedoms and, in doing so, extends the range of protection to unspecific risks, i.e. before a specific object of protection of the a) 914 See above under point C. I. 3) c) cc) Referring to substantial guarantees as method of interpreting fundamental rights in order to avoid a scope of protection that is too broad and/or too vague. II. The requirement of purpose specification and its legal scale 327 other fundamental rights is threatened.915 The concept promoted thus avoids a conceptual link between privacy and data protection, because this conceptual link would inevitably lead to an exclusive focus on privacy and the moment when the data is collected. Instead, the concept promoted in this thesis equally links data protection regulation to the other fundamental rights. Consequently, the later processing is equally important for the evaluation of the risks. Hence, the requirement of purpose specification serves as an instrument of risk discovery. As will be demonstrated, this concept bears several advantages with respect to the interplay of the general scope of protection of the right to data protection and the application of its protection instruments balancing the opposing fundamental rights. Finally, since there are clear tendencies by the Constitutional Courts and the legislator that can assist in refining the current concept of protection, the last chapter concludes with highlighting how this refinement might be worked out in order to balance, more appropriately, the opposing interests of data controllers and individuals concerned in the private sector. Intermediate function of data protection As shown in chapter “C. I. 3. a) Genesis and interplay of both rights”, one part of the legal discussion surrounding the right to private life under Article 7 ECFR, and the right to data protection under Article 8 ECFR, concerns their precise interplay.916 A similar, but however distinct debate, concerns the nature of the fundamental right to data protection per se. This debate treats the question on the ultimate value of this right or, in other words, the object and concept provided for. 
Regarding this issue, Tzanou gives a dense overview and summarizes several values of data protection discussed in legal literature: First of all, the protection of privacy is deemed to be one value; another value is meant to be data security, i.e. securing personal data against its potential misuse (such as by loss or access by unauthorized persons); data quality is considered as another value, which means that personal data is accurate, relevant, and up-to-date; comparably, Tzanou mentions “transparency, foreseeability in data processing, accountability of data controllers, and (…) participation of the data subject aa) 915 Cf. BVerfG, 11th of March 2008, 1 BvR 2074/05 and 1 BvR 1254/07 (License Plate Recognition), cip. 63. 916 See above under point C. I. 3. a) Genesis and interplay of both rights. C. The function of the principle of purpose limitation in light of Article 8 ECFR 328 in the processing of his/her information” as further values (which data protection laws establish by means of fair information principles such as fair processing, purpose specification and individual participation); as yet another value is considered the principle of non-discrimination; and Tzanou even lists the proportionality principle as a value expressed within the law (in form of the necessity requirement).917 Indeed, this multi-facetted “value collection”, as discussed in legal literature, does not provide a consistent theory on the object and concept of protection of the fundamental to data protection under Article 8 ECFR. In particular, it remains unclear why these values, and maybe even further ones, are the values of data protection law. Tzanou therefore consequently turns to the two, so far, mostcomprehensively developed theories, on the one hand, by the scholars De Hert and Gutwirth and, as a reaction to it, Rouvroy and Poullet.918 Different functions of rights (opacity and transparency) De Hert and Gutwirth appraise the new fundamental right to data protection,919 and consider privacy and data protection as two distinct instruments of power control: Privacy protects, in their opinion, as a “tool of opacity”, the individual by determining ”what is deemed so essentially individual that it must be shielded against public and private interference”;920 in contrast, as a “tool of transparency”, data protection becomes relevant “after these normative choices have been made in order still to channel the normatively accepted exercise of power”.921 In De Hert and Gutwirth’s opinion, data protection laws hence are, contrary to laws legit- (1) 917 See Tzanou, Data protection as a fundamental right next to privacy? ‘Reconstructing’ a not so new right, pp. 91 and 92 referring, amongst others, to: Lee Bygrave, ‘The Place of Privacy in Data Protection Law’ (2001) 24 University of New South Wales Law Journal 277, 278; Helen Nissenbaum, ‘Protecting Privacy in an Information Age: The Problem of Privacy in Public’ (1998) 17 Law and Philosophy 559, 576; Herbert Burkert, ‘Towards a New Generation of Data Protection Legislation’ in Gutwirth and others (eds), Reinventing Data Protection? (Springer: Dordrecht, 2009) 339; C Kuner and others ‘The challenge of “big data” for data protection’ (2012) 2 International Data Privacy Law 47–49. 918 See Tzanou, ibid., p. 92. 919 See De Hert and Gutwirth, Privacy, data protection and law enforcement. Opacity of the individual and transparency of power, p. 81. 920 See De Hert and Gutwirth, ibid., p. 70. 921 See De Hert and Gutwirth, ibid., p. 70. II. 
The requirement of purpose specification and its legal scale 329 imizing an interference of privacy, “based upon the assumption that the processing of personal data is in principle allowed and legal.”922 Both authors consider, thus, the logic behind current data protection laws as not being prohibitive. Indeed, it might appear prohibitive because these laws principally forbid the processing of personal data unless certain conditions are met. However, for example, the general clause of Article 7 lit. f of the Data Protection Directive “can obviously 'make data processing legitimate' for every thinkable business interest.”923 De Hert and Gutwirth consider few exceptions from this rule. In particular, they see only provisions as exceptionally prohibitive regarding sensitive data, profiling, and the principle of purpose limitation. The first exception results, in their opinion, from the nature of the data that “bears the supplementary risk of discrimination"; however, they stress that the other exceptions do actually not strictly limit the use of data. For instance, the compatibility assessment of the principle of purpose limitation foresees that certain conditions need to be met (in order to pass the test) rather than a strict limitation, for example, as the strict requirement of purpose identity does.924 In contrast to De Hert and Gutwirth, Rouvroy and Poullet do not appraise the new right to data protection under Article 8 ECFR, but criticize its elevation into the status of a fundamental right.925 Rouvroy and Poullet advocate the high importance of the individual’s autonomy as the final objective behind privacy, stating that the right to privacy should be understood as “essentially an instrument for fostering the specific yet changing autonomic capabilities of individuals that are (…) necessary for sustaining a vivid democracy.”926 Turning the focus on the right to data protection, Rouvroy and Poullet indeed acknowledge that data protection laws are “among the tools through which the individual exercises his right to privacy” and even that “data protection is also a tool for protecting other rights than the right to privacy”.927 However, the similarities of both rights are 922 See De Hert and Gutwirth, ibid., p. 78. 923 See De Hert and Gutwirth, ibid., p. 78 and 79. 924 See De Hert and Gutwirth, ibid., pp. 79 and 80. 925 See Rouvroy and Poullet, The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy, p. 71. 926 See Rouvroy and Poullet, ibid., p. 46. 927 See Rouvroy and Poullet, The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy, p. 70. C. The function of the principle of purpose limitation in light of Article 8 ECFR 330 strong. 
In their opinion, it “appears obvious (…) that data protection regimes are intended both, with regard to the ‘seclusion’ aspect of privacy, to protect our ‘private sphere’ (for instance by forbidding the processing of certain sensitive data or by enlarging the secrecy of the correspondence to electronic mails) on the one hand and, on the other hand, with regard to the ‘decisional autonomy’ aspect of privacy, to increase the transparency of information flows and to limit them in order to prevent disproportionate informational power relationships to be developed or perpetuated between public and private data controllers and citizens.”928 Both rights thus “intersect but are also different tools for enabling individual reflexive autonomy and, as a consequence, also collective deliberative democracy."929 Rouvroy and Poullet fear these similarities because it risks “obscuring the essential relation existing between privacy and data protection and further estrange data protection from the fundamental values of human dignity and individual autonomy, foundational to the concept of privacy and in which data protection regimes have their roots (…).”930 Disconnecting the exclusive link between data protection to privacy Tzanou criticizes both approaches of De Hert and Gutwirth, as well as of Rouvroy and Poullet, because they do not provide a robust analysis of the fundamental right to data protection as such.931 With respect to Rouvroy and Poullet, she stresses that their “fears (...) remain unsubstantiated” because they do not make “clear why data protection cannot have an instrumental value, while at the same time being on an equal footing with privacy.”932 She observes that both authors obviously negate any proper value of the fundamental right to data protection “because this might allegedly end up in trumping the instrumental value of privacy, and thus undermine privacy as a fundamental right.”933 Indeed, Tzanou affirms the important values of autonomy, human dignity, and self-development that Rouvroy and Poullet highlight when discussing the appropriateness of privacy and (2) 928 See Rouvroy and Poullet, ibid., p. 70. 929 See Rouvroy and Poullet, ibid., p. 70. 930 See Rouvroy and Poullet, ibid., p. 75. 931 See Tzanou, ibid., pp. 92 and 93. 932 See Tzanou, ibid., p. 94. 933 See Tzanou, ibid., p. 94. II. The requirement of purpose specification and its legal scale 331 data protection as fundamental rights. However, before discussing the inappropriateness of data protection as a fundamental right, it is necessary, in her opinion, to clarify the precise concept of protection of the right to data protection.934 Irrespective of the principle legitimacy of her criticism, Tzanou indeed overlooks one decisive aspect in the reasoning of Rouvroy and Poullet. Their fear lies, undoubtedly, in the similarity of both rights. However, the reason for this similarity (and, consequently, for their fear) is how they conceptualize both rights. They locate “the two ‘aspects’ of privacy (the right to seclusion and the right to decisional autonomy)” under the right to privacy and the right to data protection. 
Hence, they link data protection exclusively to privacy when they state that data protection organises, like the right to privacy (and the German right to informational self-determination), “a system of disclosure of personal data respectful of the individual’s right to self-determination, as both opacity and transparency therefore contribute to sustaining the individual’s self-development.”935 In Rouvroy’s and Poullet’s understanding, there is hence no clear conceptual difference between both rights, and this indeed obscures the relation between the right to privacy and the right to data protection and, consequently, the overall concept of protection.
Yet this conceptual difference is exactly what makes the concept developed by De Hert and Gutwirth so important. Interestingly, Tzanou also criticizes their approach because both authors essentially define the right to data protection by referring to the right to private life, instead of elaborating a concept of data protection that is independent of the right to privacy.936 She formulates this criticism as follows: “There is, however, a paradox in their line of thinking: their theory, while it aims to be a theory on data protection, does not focus on data protection itself. Rather, the added value of data protection is demonstrated through its distinction from privacy. By preaching separation, they strive to show the indispensability of data protection. But, their very argument proves them wrong. In the end, according to de Hert and Gutwirth, everything will be judged on the basis of privacy, as the tool of opacity will be the benchmark for establishing prohibited interference. Data protection, as a transparency tool, merely describes the permitted level of processing; the limits will then be set on the basis of privacy. This, however, means that data protection is not indispensable: we could live well without it. Of course we are better off with it, as it has some utility as a useful transparency tool, but still we could live without it, since every possible interference will be judged against privacy. De Hert and Gutwirth fail to prove, therefore, why data protection is so fundamental, that it explains its constitutional entrenchment.”937
Tzanou concludes from her critique that it is necessary to elaborate a concept of protection of the right to data protection independently of the right to private life. This would enable one to clarify the real value added by this right. In particular, conceptualizing the fundamental right to data protection merely as a transparency tool leads, in her opinion, to the problem that the right remains dependent on other rights. Such an understanding limits the value of data protection. Therefore, she promotes the notion of elaborating ‘hard core’ data protection principles that make this right autonomous from other fundamental rights.938 In doing so, she claims, amongst others, that the right to data protection has a ‘core’ or ‘essence’ and that it should be balanced per se against opposing fundamental rights, “not through the proxy of privacy.”939 Indeed, Tzanou’s criticism highlights an important point. If the right to data protection is to add protection with respect to other fundamental rights, its concept of protection must be clear in comparison to those other fundamental rights.
934 See Tzanou, ibid., p. 94. 935 See Rouvroy and Poullet, ibid., p. 58. 936 See Tzanou, ibid., pp. 92 and 93.
However, in fact, the European Court of Justice has already balanced the right to data protection per se against opposing rights.940 And, meanwhile, the Court has also affirmed a ‘core’ or ‘essence’ of the fundamental right to data protection.941 Both affirmations undoubtedly strengthen this fundamental right and make it more independent of other rights, such as the right to privacy. However, this is not enough to clarify the value added by this right in the European Charter of Fundamental Rights. Tzanou therefore stops too early in elaborating on the precise concept of the right to data protection and neglects the fact that all the authors she criticizes (De Hert, Gutwirth, Rouvroy, and Poullet) already point in the direction that makes this right so indispensable.
937 See Tzanou, ibid., p. 93. 938 See Tzanou, ibid., pp. 96 and 97. 939 See Tzanou, ibid., p. 98. 940 See the decisions above under point C. I. 1. b) aa) (2) (b) The right to data protection under Article 8 ECFR and/or the right to private life under Article 7 ECFR. 941 See the decisions above under point C. I. 3. c) aa) (2) (b) Protection against collection, storage, and subsequent risk of abuse, referring to ECJ C-293/12 and C-594/12 cip. 39 and 40.
(3) Data protection for all rights to privacy, freedom, and equality
What makes the fundamental right to data protection so indispensable is that it protects the individual against the risks that the processing of personal data causes for all of his or her fundamental rights. Its paramount objective is to protect the individual’s autonomy. However, because individual autonomy is a concept that is too broad to provide a differentiated scale for determining the requirements for the processing of personal data in each particular case, it is the diversity of all fundamental rights that provides the necessary objective scale. This is what all the legal scholars referred to here point to; however, they do not actually deal with it in detail. For example, Rouvroy and Poullet stress that “autonomy and self-determination (…) cannot be characterized as legal ‘rights’, they are not something that the state can ‘provide’ the individuals with (…).”942 Instead, the State is rather capable of “showing respect for individual autonomy and, as far as possible, providing some of the conditions necessary for individuals to develop their capacity for individual deliberative autonomy (…) and for collective deliberative democracy (…).”943 Thus, the value of individual autonomy must always be intermediated by more specific rights. Unfortunately, the concept of protection considered by Rouvroy and Poullet in relation to the right to privacy does not provide clarity in this regard. Rather, both authors regard it as a natural result of the “intermediate value” of the right to privacy that this right is indeterminate in itself. In order to tackle the ambiguous notion of privacy, both authors therefore call for “taking fully into account the context in which our liberties have to express themselves”.944
942 See Rouvroy and Poullet, The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy, pp. 59 and 60. 943 See Rouvroy and Poullet, The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy, pp. 59 and 60.
With respect to data protection, both authors even more explicitly point to the idea that data protection laws are not only “among the tools through which the individual exercises his right to privacy” but “also a tool for protecting other rights than the right to privacy”.945 Hence, the other fundamental rights to freedom and equality can, besides the right to privacy, well provide a sufficient legal scale for refining the ambiguous notion of a right to data protection, which protects individual autonomy against the risks posed by automated data processing. De Hert and Gutwirth follow a similar line of thought, stating: “Last and foremost, data protection has grown in response to problems generated by new technology. It brings no added value to reduce all these responses to 'privacy'. Other values and concerns are also at play. Take for instance the right not to be discriminated against that is protected by Article 15 of the European Data Protection Directive. There is also a special regime for 'sensitive data' in the Directive prohibiting processing of data relating to racial or ethnic origin, political opinions, religious or philosophical beliefs and so on. The connection with rights and liberties such as the freedom of religion, freedom of conscience and the political freedoms is obvious.”946 Both authors also highlight the instrumental value of data protection for these more substantial values and even raise abstract constitutional claims such as: “Data protection principles seem less substantive and more procedural compared to other rights norms but they are in reality closely tied to substantial values and protect a broad scale of fundamental values other than privacy. Because of its reputation of only focusing on the benefits for individuals, putting data protection in the privacy frame hampers the realization of the societal benefits of data protection rights and therefore puts these rights essentially in conflict with the needs of society.”947
944 See Rouvroy and Poullet, The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy, p. 61. 945 See Rouvroy and Poullet, The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy, p. 70. 946 See De Hert and Gutwirth, Privacy, data protection and law enforcement. Opacity of the individual and transparency of power, p. 82. 947 See De Hert and Gutwirth, Data Protection in the Case Law of Luxemburg and Strasburg, p. 44.
In fact, Tzanou also follows this approach, concluding: “Nevertheless, privacy and data protection are not identical rights. (…) Privacy is a much broader concept that embodies a range of rights and values, such as the right to be let alone, intimacy, seclusion, personhood, and so on according to the various definitions. (…) Furthermore, unlike privacy’s elusive and subjective nature that makes the right different in different contexts and jurisdictions, data protection has an essential procedural nature that makes it more objective as a right in different contexts.
Finally, data protection is more than informational privacy itself because, as will be demonstrated below, it serves other, further fundamental rights and values besides privacy.”948 Unfortunately, in the end, and contrary to her intentions, Tzanou does not demonstrate, at least not in detail, how the fundamental right to data protection protects not only privacy but also the other fundamental rights. This lack of detail in the concept of the right to data protection is characteristic of all the approaches discussed so far. All authors stress the important function of data protection for individual autonomy and, consequently, for the other fundamental rights providing for more substantial values, be they privacy, liberties, or equality. However, they do not show how this might be implemented in reality.
948 See Tzanou, ibid., p. 90.
bb) Purpose specification as a risk regulation instrument
This sub-chapter therefore illustrates, from a risk regulation point of view, how the concept of protection of Article 8 ECFR, serving the other fundamental rights to privacy, freedom, and equality, can be implemented. As mentioned previously, this thesis promotes individual autonomy as the ultimate objective of the fundamental right to data protection. Since the concept of individual autonomy is too broad to provide an objective scale for each particular case of data processing, it is the diversity of all fundamental rights that provides this differentiated scale. From the perspective that data protection is a regulation of risks, the (more instrumental) right to data protection serves, as a central norm, to protect the (more substantial) values provided for by the other fundamental rights against the risks caused by the processing of personal data. In doing so, the right extends its scope of protection even to unspecific risks, i.e. to situations before the data processing threatens a specific object of protection (i.e. a substantial guarantee or value) of the other fundamental rights, such as freedom or equality. This type of protection is, however, an extension to unspecific risks and not a substitute for protection against specific risks. Thus, it extends the range of protection in the sense that it adds a precautionary level of protection against unspecific risks to the preventative level of protection against specific risks. In this system, the requirement of purpose specification plays a decisive role because it determines which type of risk, unspecific or specific, is caused by the data processing and, if specific, which fundamental right is actually concerned. The type of risk and, eventually, the specific substantial guarantee of the fundamental right concerned then determine which instruments are necessary for protection. Thus, the substantial guarantees provided for by the fundamental rights to privacy, freedom, and non-discrimination determine the context and related ‘informational norms’ and, consequently, which purpose of the data processing is legally relevant.
(1) ‘A risk to a right’: Quantitative vs. qualitative evaluation?
Indeed, putting data protection within a framework of risk regulation can be challenging. Focusing on privacy impact assessments as well as data protection impact assessments, van Dijk, Gellert and Rommetveit examine in more detail the relation between a risk and a right. They carve out the following conundrum, which results from the conceptual link of a ‘risk to a right’.
In doing so, the authors stress the provenance of privacy impact assessments from technology assessments and environmental impact assessments. Thus, privacy impact assessments import the risk management practices developed for these technology assessments into data protection regimes.949 The authors stress the methodological challenge resulting from the fact that these risk management practices were typically “concerned with physical consequences for the natural environment and human health” and, thus, “defined through scientific concepts of probability in dealing with the possibilities” of these future events.950 In contrast, risk management practices, once introduced in the field of data protection, “direct risk assessment exercises to the consequences of technologies (ICTs) upon citizens’ fundamental rights”.951 In light of this conceptual shift, the authors particularly raise one question that is especially relevant for this part of the thesis:952 how should a ‘risk to a right’ be conceptualized within the context of data protection law?
(a) Challenges of bridging risks to rights
Van Dijk, Gellert and Rommetveit particularly criticize the quantitative risk-based approach that is becoming more and more dominant within general data protection regulation. They formulate their concern as follows: “The basic processes of risk assessment and management are not fundamentally concerned with the nature of rights, but rather with the likelihood of certain consequences occurring. Classical statements about the nature of risk assessments typically highlight quantification as intrinsic to the risk assessment process: ‘it is the major task of risk assessment to identify and explore, preferably in quantitative terms, the types, intensities and likelihood of the (normally undesired) consequences related to an activity or event’.”953 The authors conclude from this that a certain shift is taking place: legal questions are no longer answered by legal analysis but, more and more, through risk assessment practices.954
Several legal scholars seek to address this methodological challenge. For instance, the German White Paper on “Data Protection Impact Assessment” (Datenschutz-Folgenabschätzung) by the research project Forum Privatheit explicitly underlines that the legal requirements guaranteed by the fundamental rights of the individuals concerned must not be undermined by a risk assessment.955
949 See van Dijk, Gellert and Rommetveit, A risk to a right? Beyond data protection risk assessments, pp. 287 and 288. 950 See van Dijk, Gellert and Rommetveit, ibid., p. 290. 951 See van Dijk, Gellert and Rommetveit, ibid., p. 290. 952 See van Dijk, Gellert and Rommetveit, ibid., pp. 289 and 290. 953 See van Dijk, Gellert and Rommetveit, ibid., p. 293, quoting Renn O. Risk governance coping with uncertainty in a complex world. Earthscan, London: Sterling, VA; 2008, p. 5. 954 See van Dijk, Gellert and Rommetveit, ibid., p. 293, quoting Renn O. Risk governance coping with uncertainty in a complex world. Earthscan, London: Sterling, VA; 2008, p. 5. 955 See Forum Privatheit, White Paper – Data Protection Impact Assessment, p. 18.
This argument explicitly ties into the similar paper on “The Role of Risk Management in Data Protection” published by the Centre for Information Policy Leadership, which stressed: “Risk management does not alter rights or obligations. If a law conveys a
right to data protection, or provides individuals with specific rights, such as rights of access, correction or deletion, risk management cannot alter those rights; just as the law imposes obligations on controllers or processors, risk management does not change those obligations. Rather, risk management is a valuable tool for calibrating accountability, prioritising action, raising and informing awareness about risks, identifying appropriate mitigation measures and, in the words of the Article 29 Working Party, providing a ‘scalable and proportionate approach to compliance’”.956 The German White Paper therefore proposes a risk assessment process intended to establish “a bridging of the risk-based approach and the fundamental rights approach”.957
(b) Example: German White Paper on DPIA
However, a closer look at this approach reveals how difficult it is to consistently bridge a ‘risk to a right’. The risk assessment methodology proposed by the German White Paper builds upon so-called “protection goals” (Schutzziele). German data protection experts developed these protection goals, modelled on the protection goals known from IT security, in order to improve the assessment of whether or not specific protection instruments implemented in practice satisfy the requirements of data protection law.958 These data protection goals add to the already known IT security goals and create certain “fields of conflicts” between each other. This shall be illustrated as follows:959
[Figure omitted; it depicts the protection goals integrity, non-combinability, accessibility, confidentiality, transparency, and possibilities to intervene.]
956 See the Center for Information Policy Leadership, The Role of Risk Management in Data Protection, p. 13, quoting the Article 29 Data Protection Working Party, Statement on the role of a risk-based approach in data protection legal frameworks, p. 2. 957 See Forum Privatheit, White Paper – Data Protection Impact Assessment, p. 18: “Der im folgenden Kapitel skizzierte Prozess zur Durchführung von DSFAen versucht den Brückenschlag zwischen dem Risikoansatz sowie dem Ansatz zur Grundrechtsgewährleistung und kombiniert die als sinnvoll erachteten Elemente mit dem Ziel, ein für alle Beteiligten nützliches Werkzeug zu schaffen.” 958 See Rost, Standardized Modeling of Data Protection; Rost and Bock, Privacy by Design and the New Protection Goals: Principles, objectives, and requirements; Rost and Pfitzmann, Data Protection Goals – revisited. 959 See Forum Privatheit, ibid., pp. 24 and 25.
For example, the protection goal of “non-combinability” shall constitute criteria for the assessment of whether or not a data controller meets, in practice, the principle of purpose limitation on an organizational and technical level. This principally leads to a conflict with the protection goal of “transparency”.
The reason is that the more technical and organizational measures guarantee that personal data stored or processed by the controller cannot be combined (for instance, by means of anonymization, pseudonymization, and the isolation of data sets, systems, and processes), the less the controller is able to make the personal data processed transparent, as a whole, to the individual concerned.960 In order to assess the data protection risk, the White Paper uses the same scale as developed for IT security risks. Indeed, the scenarios set out in the paper, and the consequences attached to them, appear at times to be rather arbitrary. In any case, in the opinion of the authors of the White Paper, the fundamental rights approach requires one to consider each processing of personal data, even if it is undoubtedly legal, as harming the fundamental rights to private life and data protection under Articles 7 and 8 ECFR. Therefore, the data protection risk assessment must not depend only on the likelihood and the intensity of the harm. Instead, even the “normal” processing of personal data already requires a certain minimum level of protection. In contrast, a higher level of protection is needed if, first, personal data is processed that belongs to the especially protected categories listed in the law or, second, the individual concerned depends on the decision or service of the controller. In the second scenario, there are two additional aspects increasing the need for protection: the risk caused by the data processing either leads to significant consequences for the individual concerned or the individual has no effective means at his or her disposal in order to protect him or herself.
960 See Forum Privatheit, ibid., pp. 25 and 28.
Consequently, the need for protection is very high if the aforementioned aspects apply at the same time, i.e. personal data belonging to a special category of data is processed and the individual concerned depends on the decision or service of the controller (plus the additional requirements). Finally, a high need for protection can also result from the cumulative effects of data processing that poses only a normal level of risk for an individual. This might be the case if personal data of many individuals is processed or the data is processed for many purposes.961
In order to specify these protection goals, the German data protection authorities formed the so-called “Technology” working group. This working group is to set up a manual of specific protection measures that the controller can implement in order to meet the aforementioned goals. The manual will be updated continuously in step with new technologies being launched in Germany. In conclusion, the risk assessment methodology enables the controller (and data protection authorities) to evaluate the risk for the data protection goals that results from deviations from this catalogue. If the data controller uses protection measures other than those listed in the catalogue, it bears the burden of proving that these measures meet the protection goals equally well. A data protection authority can therefore assess, on this basis, whether or not there are deficiencies in the protection of personal data overall.962
961 See Forum Privatheit, ibid., p. 27. 962 See Forum Privatheit, ibid., pp. 27 to 29.
(c) Criticism: Incoherence of current risk criteria
In conclusion, the German White Paper refers, on the one hand, to criteria such as the sensitivity of the data and the possible consequences for the individual concerned in order to evaluate the level of risk. On the other hand, the data protection goals provide a reference point for the question of what is at risk. This approach is rather surprising because the criteria determining the level of risk are actually more substantial than the data protection goals, which here appear to constitute the object of protection. In contrast, one might think that the possibility to intervene (listed in the White Paper as a “goal”) should enable the individual to avoid negative consequences for him or herself, whether these are protected against in general or specifically by his or her fundamental rights. The reason for this confusion is thus that the White Paper mixes procedural measures, such as transparency-enhancing mechanisms, with substantial aspects, such as the consequences for the individual. In summary, the data protection risk assessment proposed may therefore well serve as a “bridging” of risk assessment methodologies and the fundamental rights approach. However, the approach unfortunately lacks consistency, at least regarding the question of why certain criteria should determine the level of risk while other criteria serve as reference points for the object of protection. In light of these challenges, van Dijk, Gellert and Rommetveit rightly stress how important it is to clarify and establish the risk criteria by distinguishing the following questions: Which event, if realized, should be considered as relevant (i.e. what is the harm)? Who would suffer the harm and who is responsible for it? And how should the risk be measured?
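The difficulty that this criticism points to becomes more tangible if one tries to write the White Paper’s escalation logic down schematically. The following Python sketch is purely illustrative and is not taken from the White Paper itself; all names are hypothetical, and the reading of “plus the additional requirements” as requiring at least one of the two aggravating aspects is an assumption of the sketch, not of the White Paper.

```python
from dataclasses import dataclass
from enum import Enum


class ProtectionNeed(Enum):
    """Need-for-protection tiers as paraphrased above (illustrative only)."""
    NORMAL = 1      # even "normal" processing requires a minimum level of protection
    HIGH = 2
    VERY_HIGH = 3


@dataclass
class ProcessingScenario:
    # Hypothetical attribute names paraphrasing the White Paper's criteria.
    special_category_data: bool          # especially protected categories listed in the law
    individual_depends_on_controller: bool
    significant_consequences: bool       # aggravating aspect in the dependence scenario
    no_effective_self_protection: bool   # aggravating aspect in the dependence scenario
    cumulative_effects: bool             # many individuals concerned or many purposes


def need_for_protection(s: ProcessingScenario) -> ProtectionNeed:
    # Very high: special-category data AND dependence on the controller's decision
    # or service, plus at least one aggravating aspect (an assumed reading of
    # "plus the additional requirements").
    if (s.special_category_data and s.individual_depends_on_controller
            and (s.significant_consequences or s.no_effective_self_protection)):
        return ProtectionNeed.VERY_HIGH
    # High: one of the two criteria on its own, or cumulative effects that raise
    # an otherwise normal level of risk.
    if (s.special_category_data or s.individual_depends_on_controller
            or s.cumulative_effects):
        return ProtectionNeed.HIGH
    # Normal: every processing of personal data still requires minimum protection.
    return ProtectionNeed.NORMAL
```

Even in this compressed form, the sketch mixes procedural criteria (the individual’s means of self-protection) with substantial ones (the consequences for the individual), which is precisely the incoherence criticised above, and it leaves the three questions just raised unanswered: what the relevant harm is, who suffers it and who is responsible for it, and how it should be measured.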
Van Dijk, Gellert and Rommetveit propose to answer these questions by referring more rigorously to the attributes of law.963 The second part of this thesis has already discussed this issue, showing that it depends on the fundamental right ‘at risk’ which object or substantial value is guaranteed and, thus, which type of threat is relevant and, as a consequence, what kind of protection instrument is required. The discussion differentiated between preventative measures against dangers to a substantial guarantee that is already specified, i.e. specific risks, and precautionary measures for situations in which it is not yet clear which harm they may bring about; these risks were called unspecific risks. For unspecific risks, the appropriate protection instruments often refer to the gathering of information in order to discover, in a timely fashion, the specific risks. Furthermore, instruments that primarily gather information are often more proportionate with respect to opposing fundamental rights than instruments that create a higher regulatory burden, such as the precautionary prohibition of certain actions.964
963 See van Dijk, Gellert and Rommetveit, ibid., pp. 302 to 304. 964 See above under point B. II. 3. c) Interims conclusion: Fundamental rights determining appropriateness of protection; cf. van Dijk, Gellert and Rommetveit, ibid., p. 295, referring to the ECtHR, which also focuses, more and more, on the production of knowledge as part of the risk assessment.
(2) Purpose specification discovering risks posed to all fundamental rights
This thesis therefore argues that the first component of the principle of purpose limitation, i.e. the requirement to specify the purpose of the processing, is a risk regulation instrument that primarily serves to discover the risks that the processing of personal data causes for all fundamental rights of the individual concerned. As mentioned before, the Article 29 Data Protection Working Party considers the requirement the necessary precondition for being able to “determine whether data processing complies with the law, and to establish what data protection safeguards should be applied”.965 The German Constitutional Court similarly states, regarding informational measures imposed by the State, that “only when it is clear for which purpose the information is required (…) is it possible to answer the question of whether the infringement of the right to informational self-determination is constitutionally legal or not.”966 The specification of the purpose thus serves as the essential link for evaluating the legal relevance of the data processing.
(a) Pooling different actions together in order to create meaning
German legal scholars indeed stress that the requirement of purpose specification must not be confused with the principle of clarity of law.967 As mentioned previously, with respect to the public sector, the German Constitutional Court derives the requirement of purpose specification from the principle of clarity of law, which strengthens the principle of purpose limitation.968
965 See the Article 29 Data Protection Working Party, Opinion 03/2013 on purpose limitation, pp. 13 and 15. 966 See BVerfG, 15th of December 1983, 1 BvR 209, 269, 362, 420, 440, 484/83, cip. 159. 967 See Albers, Treatment of Personal Information and Data, cip. 123; Britz, Informational Self-Determination between Legal Doctrine and Constitutional Case Law, p. 583.
However, Britz underlines that the requirement serves primarily to diminish the intensity of an infringement by the State and therefore relates it to the principle of proportionality.969 Albers instead highlights that the requirement serves to structure the treatment of personal data from a normative perspective, determining which of the single acts are legally relevant.970 This second function is particularly relevant for the processing of data in the private sector. As mentioned previously, the German Constitutional Court also referred to the purpose when examining the processing of personal data by a private party.971 However, the Court did not explicitly explain the specific function of the requirement to specify the purpose in the private sector. Here, the function is therefore more uncertain than in the public sector because private parties are undoubtedly not bound by the principles of clarity of law and proportionality.972 In light of this, the main function of the requirement to specify the purpose in the private sector indeed seems to be the structuring function promoted by Albers. This function is also supported from a historical point of view. Pohle stresses that the principle of purpose limitation historically originates in the individual’s consent. In this regard, the requirement to specify the purpose was considered, in particular, to define the context and the conditions under which the controller should be allowed to use the data.973 If this function is transferred to the processing of data in general,974 the specification of the purpose indeed serves to pre-determine the conditions of the processing with respect to a specific context and, thus, the “informational norms” governing this context.975 However, this structuring function in the private sector does not answer the question of how precisely the controller has to specify the purpose.
968 See above under point C. II. 1. c) ee) (1) Public sector: Purpose specification as a result of the principle of clarity of law. 969 See Britz, Informational Self-Determination between Legal Doctrine and Constitutional Case Law, pp. 583 and 584. 970 See Albers, Treatment of Personal Information and Data, cip. 123. 971 See above under point C. II. 1. c) ee) (2) Private sector: ‘Self-control of legitimacy’. 972 At least, so long as private parties are not directly bound to fundamental rights, see the discussion above under point C. I. 1. b) aa) Third-party effect, protection and defensive function. 973 See Pohle, Purpose limitation revisited, p. 141, referring to Oscar M. Ruebhausen and Orville G. Brim Jr. (1965), “Privacy and Behavioral Research”, Columbia Law Review 65.7, p. 1199. 974 See this conceptual shift from the historical perspective at Pohle, ibid., pp. 141 and 142. 975 See above under point B. III. 4. Clarifying the relationship between “context” and “purpose”.
The preceding chapters made it clear that the task of precisely specifying the purpose is, from a practical viewpoint, even more difficult for data controllers operating in the private sector than for the legislator. In contrast to the legislator, a private data controller is not able to refer to a legal system that comprehensively and precisely determines the objectives of different laws and, subsequently, the consequences for the individual concerned.
If the legislator refers, for example, to the income tax act in a sufficiently precise way in order to authorize the processing of certain data, the individual concerned is able to foresee the consequences if and when the tax authorities retrieve his or her data. In contrast, a private data controller does not have such a legal system at its disposal and is, to put it pointedly, left to its own devices. In addition, the flow of data in the private sector is less predictable in light of the diversity of the participants, their actions and intentions, as well as their interconnections in a free market economy. For the data controller operating in the private sector, the need for reliable criteria is therefore even more pressing.976 Thus, which criteria could help a controller operating in the private sector to specify the purpose?
(b) Separating unspecific from specific risks (first reason why data protection is indispensable)
Two slight shifts regarding the concept of protection discussed so far assist in answering this question. On the one hand, there is the intermediate function of the right to data protection for individual autonomy, which is determined by all further fundamental rights. As quoted previously, Rouvroy and Poullet seek to counter, for example, the indeterminateness of the right to privacy by “taking fully into account the context in which our liberties have to express themselves”.977 This points to what has already been presumed with respect to Nissenbaum’s approach: since the definition of a context and, thus, of the purpose referring to that context depends on values, it is necessary to elaborate an objective concept for this definition.978 It is the liberties of the individual, in other words his or her fundamental rights to freedom, but also to privacy and non-discrimination, that define which context, and thus which purpose, is legally relevant when the data is processed.
976 See above under point C. II. 2. c) aa) No legal system providing for ‘objectives’ of data processing in the private sector. 977 See Rouvroy and Poullet, The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy, p. 61.
On the other hand, the concept of protection of the right to data protection builds upon the regulation of risks. De Hert and Gutwirth do not explicitly mention this function but elaborate on the particularity of the concept of protection of the right to data protection compared to the right to privacy. In their opinion, the right to data protection provides, as a “tool of transparency”, a more procedural kind of protection than the right to privacy as a “tool of opacity”. However, with this procedural function, the right to data protection is also closely linked to further substantial values besides the right to privacy.979 Indeed, De Hert and Gutwirth do not compare the concept of the right to data protection with other fundamental rights, even if they mention them. This deficiency led Tzanou to criticize the fact that both authors define, in a negative manner, “the added value of data protection (…) through its distinction from privacy.”980 However, considering that the transparency tools provided for by the right to data protection protect not only the right to privacy but also the other fundamental rights, i.e. to freedom and non-discrimination, verifies that ‘we cannot live without it’, i.e. without the right to data protection.
The reason is that the right to data protection, perceived as a risk regulation instrument, protects against the risks that the processing of personal data causes for all fundamental rights. Since this kind of protection already starts before a specific object of protection of one of the other fundamental rights is threatened, the other fundamental rights cannot provide similar protection.981 There must be an autonomous fundamental right if it is to protect, as a precautionary measure, further rights before there is a specific threat to them.
978 See above under point B. III. 5. Values as normative scale determining “contexts” and “purposes”. 979 See De Hert and Gutwirth, Privacy, data protection and law enforcement. Opacity of the individual and transparency of power, pp. 69 and 70; and, ibid., Data Protection in the Case Law of Luxemburg and Strasburg, p. 44. 980 See Tzanou, ibid., p. 93. 981 Cf. BVerfG, 11th of March 2008, 1 BvR 2074/05 and 1 BvR 1254/07 (License Plate Recognition), cip. 63.
The indispensability of the right to data protection, which protects the individual against risks to all his or her fundamental rights, becomes particularly apparent with respect to the requirement of purpose specification. From the perspective of risk regulation, the requirement to specify the purpose primarily serves the function of discovering the risks to the other fundamental rights. So long as the purpose specified by the controller does not concern one of the further fundamental rights of the individual concerned, the data processing does not bear a specific but an unspecific risk. In this regard, the requirement to specify the purpose is a risk regulation instrument that primarily serves to gather information. However, the moment the purpose of the data processing concerns one of the other fundamental rights, the substantial guarantee of this fundamental right determines the further protection instruments necessary in order to protect the individual, i.e. the bearer of this fundamental right, against the specific risk. The answer to the question of how precisely the purpose must be specified is hence twofold: So long as the processing intended does not bear a specific risk for the other fundamental rights, there are no specific requirements regarding the precision of the purpose specified by the controller. Up to that point, the requirement to specify the purpose only obliges the controller to constantly assess whether its data processing bears a specific risk for the individual’s fundamental rights. In the end, this function appears more or less to correspond to the conclusion that Kokott and Sobotta draw after having compared the fundamental right to data protection with the right to private life. They stress that “the requirements that personal data must be processed fairly and for a specified purpose cover many instances where an interference with privacy would have to be justified. These specific requirements of data protection help to focus the debate on areas that are particularly susceptible to interference with fundamental rights.”982 This might mean that, the moment the specified purpose reveals a specific risk to one of the fundamental rights, the substantial guarantee of this right determines, amongst other things, how precisely the purpose must be specified.
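To make the structure of this twofold answer visible at a glance, the following minimal sketch restates it schematically. It is purely illustrative, uses hypothetical names, and is not a statement of positive law; in particular, which protection instruments and which degree of precision actually follow from a concerned right is exactly what that right's substantial guarantee has to determine and is therefore deliberately left open here.

```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional


class RiskType(Enum):
    UNSPECIFIC = "unspecific"  # precautionary level: no substantial guarantee concerned yet
    SPECIFIC = "specific"      # preventative level: a substantial guarantee is concerned


@dataclass
class SpecificationDuties:
    risk_type: RiskType
    duties: List[str]


def purpose_specification_duties(concerned_right: Optional[str]) -> SpecificationDuties:
    """Illustrative restatement of the twofold answer developed above.

    `concerned_right` is None as long as the specified purpose does not touch a
    substantial guarantee of another fundamental right (e.g. privacy, freedom of
    expression, non-discrimination); otherwise it names the right concerned.
    """
    if concerned_right is None:
        return SpecificationDuties(
            risk_type=RiskType.UNSPECIFIC,
            duties=[
                "specify and make explicit the purpose (no heightened precision required)",
                "continuously reassess whether the processing begins to concern a fundamental right",
            ],
        )
    return SpecificationDuties(
        risk_type=RiskType.SPECIFIC,
        duties=[
            f"specify the purpose as precisely as the guarantee of '{concerned_right}' requires",
            f"apply the protection instruments that follow from '{concerned_right}'",
        ],
    )
```

The sketch only restates the structure of the argument; the decisive evaluative step, namely deciding whether a specified purpose concerns a substantial guarantee at all and what that guarantee then demands, cannot be reduced to such a mechanical rule.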
Albers sums up this result, with respect to the German right to informational self-determination and to the public sector, as follows: “The required degree of specification thus depends on the needs for protection and the context of regulation.”983
982 See Kokott and Sobotta, The distinction between privacy and data protection in the jurisprudence of the CJEU and the ECtHR, p. 228. 983 See Albers, Treatment of Personal Information and Data, cip. 124: “Der gebotene Konkretionsgrad hängt also von den Schutzerfordernissen und vom Regelungskontext ab.”; cf. also Britz, Informational Self-Determination between Legal Doctrine and Constitutional Case Law, pp. 284 and 285.
(c) Central function with respect to all fundamental rights (second reason why data protection is indispensable)
This function is an illustrative example of why there must be a right that is autonomous from the other fundamental rights, at least from those to freedom and equality. However, this does not yet explain why this function must also be autonomous from the right to privacy. Since every kind of processing of personal data starts with collection, it indeed seems possible that the right to privacy might equally provide for this protection function. This corresponds, in principle, to the concept of the right to private life under Article 8 ECHR, even if the European Court of Human Rights does not determine the protection instruments by referring to the other fundamental rights.984 Furthermore, this approach appears to correspond to what Rouvroy and Poullet promote when referring to the fundamental right to privacy and, correspondingly, to the German right to informational self-determination. However, there is one important reason why it makes sense that a right to data protection should also be autonomous from the right to privacy. An autonomous right to data protection makes it easier to differentiate between the specific substantial guarantees concerned and, thus, to react to the threats against these guarantees with different protection instruments. As illustrated before, the German Constitutional Court differentiates between several guarantees surrounding the individual’s privacy, such as: the right to the inviolability of the home; the right to confidentiality of telecommunications; the right to the integrity and confidentiality of information-technological systems; and, last but not least, the right to informational self-determination. The reason why the Court differentiates between these rights is that each separate right contains a specific guarantee, and this diversity gives the Court a more comprehensive scale for, first, determining the risks for the individual concerned; second, choosing the appropriate protection instrument; and, third, weighing these guarantees against the opposing constitutional positions.985 The European Court of Human Rights achieves a similar result, albeit referring to just one fundamental right (i.e. the right to private life under Article 8 ECHR), by applying its case-by-case approach.
984 See above under point C. I. 3. b) Concept of Article 8 ECHR: Purpose specification as mechanism for determining the scope of application (i.e. the individual’s ‘reasonable expectation’).
This approach allows the Court to elaborate on several types of cases without being strictly bound to a general definition of the scope of protection. This is an important difference from the tendency of the European Court of Justice to apply, at least with respect to Articles 7 and 8 ECFR, a deductive method starting from a general definition of the scope.986 However, the German Constitutional Court, providing for a conceptual link between data protection and privacy, requires a common protection instrument (i.e. the individual’s right to determine by him or herself the disclosure and later usage of the data relating to him or her). This conceptual link makes it more difficult to elaborate on alternative and more differentiated protection instruments than would be the case without such a link.987 If the data processing concerns other fundamental rights to freedom, these rights in effect rather supplement the scope of protection of the right to informational self-determination, which is already rather broad and vague, instead of refining it.988 Similarly, the European Court of Human Rights does not refer to other fundamental rights at all.989 Elaborating the right to data protection under the umbrella of the right to private life, or even privacy, thus makes it more difficult to apply a differentiated approach.
In conclusion, the constitutional legislator clarified, through the separation of both rights, that there are two different guarantees. This is what De Hert and Gutwirth pointed out: the protection functions of the right to privacy and the right to data protection are structurally different. Their definition makes it clear that the right to data protection must be, at least substantially, autonomous not only from the other rights to freedom and equality but also from the right to privacy. Functionally, with respect to the risks caused by the processing of personal data, the right to data protection stands at the center of all other fundamental rights.
985 See above under points C. I. 2. The object and concept of protection of the German right to informational self-determination, and C. I. 1. b) bb) Balance between protection and defensive function. 986 See above under point C. I. 3. c) aa) (1) General definition of the term ‘personal data’ under Article 7 and 8 ECFR instead of case-by-case approach. 987 See above under point C. I. 2. f) Interim conclusion: Conceptual link between ‘privacy’ and ‘data processing’. 988 See above under point C. I. 2. c) Right to control disclosure and usage of personal data as protection instrument?. 989 See above under point C. I. 3. b) ee) Conclusion: Assessment of ‘reasonable expectations’ on a case-by-case basis, referring to ECtHR, Case of Gillan and Quinton vs. the United Kingdom from 12 January 2010 (application no. 4158/05), cip. 88 to 90.
Function of making specified purposes explicit
This leads to the additional requirement that the specified purpose must be made explicit to the individual concerned. The European legislator established the requirement of making specified purposes explicit in Article 5 sect. 1 lit. b GDPR (Article 6 sect. 1 lit. b of the Data Protection Directive). As quoted previously, the Article 29 Data Protection Working Party interprets this requirement as follows: “The purposes of collection must not only be specified in the minds of the persons responsible for data collection. They must also be made explicit.
In other words, they must be clearly revealed, explained or expressed in some intelligible form. (…) The requirement that the purposes be specified ‘explicitly’ contributes to transparency and predictability. (…) It helps all those processing data on behalf of the controller, as well as data subjects, data protection authorities and other stakeholders, to have a common understanding of how the data can be used.”990 Thus, this requirement builds upon the requirement of specifying the purpose and safeguards predictability and transparency of the processing of data. However, the fundamental right to data protection under Article 8 ECFR