WHY PRIVACY MATTERS:
Youth, Identity and Online Sociability
Our everyday lives are increasingly dependent upon social media for communication with our friends, family, schoolmates, and colleagues. Social network sites such as Facebook, real-time information services such as Twitter, micro-blogging sites such as Tumblr, and video-sharing sites such as YouTube or Vimeo serve creative, constructive, and even mundane purposes. In our ever-connected world, it is more difficult than ever to disconnect.
Many admonish youth for a supposedly cavalier attitude toward their personal privacy on social media. But in our research we have found that youth manage their privacy in nuanced, savvy ways and possess a grounded sense of the interpersonal ethics surrounding social media communication. They have told us that they are often frustrated by the brazen attitude of social media companies that routinely change their privacy controls, and that they want more of a voice in policymaking.
This film explores these issues and brings forward the voices of Canadian youth, media educators, and academics. It is designed to serve both as an introduction to the issues surrounding privacy, social media, and youth, and as a provocation for youth to become more involved, not only in shaping the social media landscape but also in educating policymakers about their insights and concerns about online privacy.
Here we provide some basic information about the issues explored in the film: definitions of privacy and related issues, along with resources and further reading on the topic.
The classic definition of privacy comes from American legal scholars Samuel Warren and Louis Brandeis, who in 1890 described it as “the right to be let alone.” As members of high society in Boston, they felt that the new technology of instant photography threatened their privacy. Even though their understanding of privacy has privileged origins, Warren and Brandeis’s definition is one of the first framings of privacy as a human right and civil liberty.
Alan Westin (1967) reformulated Warren and Brandeis’s concerns by situating the state of being “let alone” in terms of information. For Westin, privacy constitutes “the claim of individuals, groups, or institutions to determine when, how, and to what extent information is communicated about them to others.”
Westin’s conception of privacy rests on the idea of control; in relation to networked communication technologies, this control can be difficult to exercise because the workings of information online are often invisible.
Philip Agre and Marc Rotenberg (1997) characterize a number of changes to the technological landscape that have presented challenges for thinking about privacy. For example, technical changes to database organization, storage, and usage – such as the spread of data-mining algorithms that collect and transmit personal information instantaneously – make it difficult to locate and control the “digital personas” that get constructed from this mass of data.
informational privacy
Informational privacy describes the way that personal information disclosed online is accessed by institutional and commercial audiences. Anita Allen (1999) defines informational privacy as “confidentiality, secrecy, data protection, and control over personal information.” She cites the increasing penetration of data-based technologies – such as the development of wearable computers “capable of transmitting data around the world” – as one of the main causes of informational privacy erosion.
social privacy
Another aspect of privacy online concerns who can see your personal information on social media platforms. danah boyd and Alice Marwick (2011) define privacy as a “social norm that is achieved through a wide array of social practices configured by structural conditions.” The most illustrative example of social privacy issues online is probably the everyday behaviour of people on social network sites: determining who in a network of friends (and in the larger audience of the site and search engine results) can access your information, photographs, links, communications, and so on.
Both social and informational privacy issues may be conceptualized together, as in Valerie Steeves’s (2006) four ways of understanding privacy. For her, privacy can be seen as:
- part of the democratic process, where personal autonomy and freedom lie at the heart of democratic citizenship;
- a social value or norm that affords people with the power to define and control their personal relationships and position within society;
- data protection, which includes the management of access to truthful personal data; and
- a human right resting on the idea of personal integrity.
privacy as contextual integrity
Helen Nissenbaum’s (2004; 2010) concept of contextual integrity frames technologically mediated processes of enacting privacy as negotiations of “context-relative informational norms.” Informational norms are the normative valuations a person holds, as determined by his or her position within a specific context, that is, a system of cultural beliefs or expectations. Contextual integrity thus means that privacy is respected when the norms around information are likewise respected, and violated when those norms are breached.
In this way, more privacy is not necessarily better: privacy needs change with the local context and with the informational norms that operate within it. This framework is useful for thinking about how both social and informational privacy issues come to the fore in online spaces for social, commercial, and institutional transactions.
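Nissenbaum's idea that privacy depends on whether an information flow fits the norms of its context can be sketched in a few lines of code. This is a toy illustration only: the contexts, information types, and recipients below are invented for the example and are not part of Nissenbaum's formal framework.

```python
# A minimal sketch of "context-relative informational norms".
# Each norm says: within this context, this type of information
# may appropriately flow to this kind of recipient.
NORMS = {
    ("medical", "health record"): {"doctor", "nurse"},
    ("school", "grades"): {"teacher", "parent"},
    ("social media", "party photos"): {"friend"},
}

def flow_respects_context(context: str, info_type: str, recipient: str) -> bool:
    """Privacy is respected when an information flow matches the norms
    of its context; it is violated when the flow breaches them."""
    allowed = NORMS.get((context, info_type), set())
    return recipient in allowed

# A doctor seeing a health record fits the medical context's norms...
print(flow_respects_context("medical", "health record", "doctor"))          # True
# ...but an advertiser seeing party photos breaches the social context.
print(flow_respects_context("social media", "party photos", "advertiser"))  # False
```

The point of the sketch is that the same piece of information can be private or not depending on who receives it and in what context, which is exactly why "more privacy" is not a single dial to turn up.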
What is so crucial about privacy as contextual integrity goes back to Warren and Brandeis’s initial framing of privacy as a right. As Leslie Regan Shade (2008) points out, “considering privacy as a human right recognizes it as an important element of civil liberties and a necessary component of a communication rights framework.” In this way, protection of privacy becomes a central mandate in the context of federal policy, which fundamentally constitutes citizens in terms of their rights, including privacy.
Canadian privacy legislation
Canadian privacy legislation, like that of other advanced industrial states, entered public policy agendas in the late 1960s and 1970s. The Department of Communication, for instance, issued two reports in this period, Instant World: A Report on Telecommunications in Canada (1970) and Privacy & Computers (1972). Both examined the integrity of personal information and the surveillance potential of nascent database technologies for sorting, tracking, and linking records by governments and private industry, alongside concerns that the outsourcing of Canadian data would erode national sovereignty.
Since the mainstream adoption of internet technology in the mid-1990s, legislative debates around privacy in new mediated environments have only intensified in the Canadian context. For example, Sheila Finestone, then Chair of the House of Commons Standing Committee on Human Rights and the Status of Persons with Disabilities (and later a Senator), brought online privacy issues to national attention with the release of the report Privacy: Where do we draw the line? (1997). In the report, Finestone notes that “Canadians view privacy as far more than the right to be left alone, or to control who knows what about us. It is an essential part of the consensus that enables us not only to define what we do in our own space, but also to determine how we interact with others – either with trust, openness and a sense of freedom, or with distrust, fear and a sense of insecurity.”
This report proved influential for new conceptions of privacy in networked technologies: Finestone argued that the social consequences of these technologies were in fact more important than the artifacts themselves. Her goal was to raise both public and governmental awareness of the privacy implications of new technologies, since the pace of innovation was too fast for human rights legislation to keep up. One drawback of this slow pace of legislative change is that it has produced incoherent and incomplete privacy protections. Even today, Canadian privacy regulation is a patchwork of legislative tools addressing various privacy issues in newer and older mediated environments, and more broadly in everyday life.
Charter of Rights and Freedoms (1982)
Canada’s patchwork of privacy legislation is subtended by the Canadian Charter of Rights and Freedoms (1982), under Legal Rights, Sections 7 and 8:
7. Everyone has the right to life, liberty and security of the person and the right not to be deprived thereof except in accordance with the principles of fundamental justice.
8. Everyone has the right to be secure against unreasonable search or seizure.
While these Sections do not represent an explicit constitutional right to privacy, they have been interpreted by the Supreme Court of Canada as protections against “unreasonable” violations of privacy (Shade 2008, 83). The case that set this precedent was R. v. Duarte (1990), where an instance of electronic surveillance was found to infringe Section 8 of the Charter, despite being allowed by the Criminal Code of Canada (1985). As a result of the Charter’s overriding of the Criminal Code in this case, Section 8 (as an extension of Section 7) has been subsequently interpreted as a robust claim to privacy rights.
Privacy Act (1985)
Because the Charter never explicitly mentions privacy, statutory protections are essential. The Privacy Act came into effect in 1983 and was consolidated in its current form in 1985.
The Privacy Act applies only to the federal public sector in relation to data collection, placing limitations on the collection, use, disclosure, and disposal of personal information held by the federal government and federal agencies.
It provides the basis for individuals to file complaints against government agencies that breach the privacy of citizens’ personal information. These complaints are dealt with by the Office of the Privacy Commissioner of Canada (OPC), an office that was created with the passing of the Canadian Human Rights Act in 1977 to serve as an advocate for the privacy rights of Canadians whose personal information was stored in federal databanks.
Personal Information Protection and Electronic Documents Act (PIPEDA) (2000)
The Personal Information Protection and Electronic Documents Act (PIPEDA) was brought into law in 2000 and began a three-year phase-in period the following year. PIPEDA shares the Privacy Act’s complaint resolution system: citizens and organizations may file complaints with the OPC if they feel their privacy has been violated by the commercial collection, use, and/or disclosure of their personal information. This more specific statute governs the federally regulated private sector’s collection, use, and disclosure of personal information, but only in the course of commercial activities.
One of the main challenges that remains with PIPEDA, and indeed with any privacy legislation, is in ensuring informed consent as an extension of privacy defined as contextual integrity. The implied consent typically used in online spaces often results in people’s information being collected unless they actively understand and exercise the ability to “opt-out” of data collection.
The OPC has attempted to address the problem of opt-out consent in its recent reviews of PIPEDA, where newer, more sophisticated modes of data collection online may require legislative changes in order to maintain privacy protections. The aim of the OPC is to make sure that the law remains “principle based and technology neutral” in order to keep pace with rapid technological change. So while networked technologies have entailed new ways for people to engage in commercial activities, the OPC sees legislation as dealing with “new technologies, old questions”: such as older issues around transborder information flows, as discussed in the reports from the early 1970s, that still carry over into online environments of today.
surveillance & security
One of the main concerns around information flows across borders emerges from what was framed as the trade-off between privacy and security in the rhetoric of the “War on Terror.” Security has recently been championed over privacy in practices such as government surveillance, enabled in the U.S. by the sweeping measures of the 2001 USA Patriot Act. Title II of the Act invested U.S. government agencies with expanded surveillance powers under “Enhanced Surveillance Procedures,” including the interception of communications networks, the seizure of communications records, and access to the records of communications customers.
The speedy implementation of this provision was widely criticized as an incarnation of George Orwell’s famous figure of Big Brother from his novel Nineteen Eighty-Four (1949). The novel is set in a totalitarian state whose inhabitants live under constant surveillance by Big Brother, through telescreens and loudspeakers reminding everyone that “Big Brother is watching you.” Now a popular shorthand for the abuse of civil liberties by government surveillance, Big Brother aptly describes the way that security tactics inaugurated in the War on Terror took government surveillance into new mediated environments, where electronic communications over the internet were watched for traces of “terrorism.”
David Lyon (2001) describes this kind of institutional surveillance using the examples of closed-circuit television, genetic screening, and credit card monitoring. According to Lyon, these practices threaten our civil liberties: constant surveillance, especially when networked, undermines not only individual but also social autonomy, that is, the social norms that arise in and are solidified by everyday situations.
The increased penetration of surveillance through networked media invokes Michel Foucault’s famous theorization of the panopticon in Discipline and Punish (Surveiller et punir). Devised in the 18th century by political philosopher Jeremy Bentham as a model for prisons, the panopticon is a circular structure in which the guard sits in a central tower, invisible to the inmates who populate the individual cells encircling it. The inmates are thus always visible not only to each other but to the guard, who might be watching them at any time without their awareness. For Foucault, the panopticon describes how we internalize the gaze of surveillance and thus always “watch ourselves,” since we can never be sure when we are being watched. Moreover, the distributed nature of panoptic surveillance offers an insightful starting point for thinking about online surveillance as even more diffuse than the panopticon envisions: instead of one central tower, there are potentially multiple sources of invisible surveillance, with different audiences watching different aspects of our online behaviour.
One of the main audiences in online surveillance practices comprises corporate rather than government actors, namely marketers watching over information transactions in order to use that data for targeted marketing. In the context of online surveillance, such institutional threats to informational privacy are often termed “dataveillance,” where personal information rather than observed behaviour is the target for internet Big Brothers. Rather than direct supervision, dataveillance describes the accumulation of coded information, augmented by new technologies for tracking people’s activity.
Despite the threat posed by dataveillance to privacy as a right, the widespread practice of surveillance in online marketing is seen as potentially beneficial to consumers, by matching their interests with targeted ads for example. In this way, internet users are drawn into a state of constant dataveillance through their seemingly voluntary decision to participate in online marketing practices.
As the most common of these online marketing practices, behavioural advertising uses dataveillance techniques to determine a person’s internet browsing habits for the purpose of targeting ads to his or her specific interests. Brian Stallworth (2010) explains that the key feature of behavioural advertising is the tracking of someone’s web activities over time, “including the searches the consumer has conducted, the web pages visited, and the content viewed,” in order to make predictions about his or her broader consumer behaviour. These predictions are typically integrated into online profiles, where not only tracking information but other personal data disclosed online, such as name, address, and credit card information, are amassed into a virtual identity profile as part of micro-targeted marketing.
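The core mechanism Stallworth describes, tracking pages visited over time, inferring interests, and matching ads to them, can be sketched very simply. This is a deliberately simplified, hypothetical illustration: real ad networks use far richer signals (cookies, device fingerprints, purchase data), and the sites, categories, and ads below are invented for the example.

```python
# A toy sketch of behavioural-advertising profiling:
# count visits per interest category, then target the top interest.
from collections import Counter

def build_profile(browsing_history):
    """Count visits per interest category to infer a user's interests."""
    return Counter(category for _url, category in browsing_history)

def target_ad(profile, ad_inventory):
    """Pick the ad whose category matches the user's strongest interest."""
    if not profile:
        return None
    top_interest, _count = profile.most_common(1)[0]
    return ad_inventory.get(top_interest)

history = [
    ("example.com/shoes", "fashion"),
    ("example.com/sneakers", "fashion"),
    ("example.com/laptops", "tech"),
]
ads = {"fashion": "20% off sneakers", "tech": "New laptop sale"}

profile = build_profile(history)
print(target_ad(profile, ads))  # prints "20% off sneakers"
```

Even this crude version shows why the practice raises privacy concerns: the profile is built entirely from observed behaviour, without the user ever explicitly disclosing an interest, and the same inference machinery scales to any data a tracker can collect.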
A central issue that arises with online behavioural advertising and profiling is how the swiftly advancing tracking and data collection mechanisms are outpacing privacy regulation and leaving consumer privacy in a vulnerable state. The OPC has indeed addressed this problem in the Canadian context through a recent series of public consultations about online tracking, profiling and targeting. In its final 2011 report, the OPC lays out the results of those consultations and responses, summarizing the concerns raised by privacy advocates, scholars and citizens about practices like behavioural advertising:
Key industry associations acknowledged that about half of Canadians they surveyed express some discomfort with respect to being tracked online. Those industry associations that commented on the issue acknowledged that such practices risk losing customer trust, typically because the practice of tracking individuals is invisible to users.
In the response to the draft report, an advocacy organization expressed the view that the business models built around online tracking greatly challenge the balance between e-commerce and privacy that underpins PIPEDA. Noting that in such models the user is not so much the customer as the product, the organization argued that revenues are dependent on obtaining more personal information and that the true customers are advertisers. As a result, the concept of legitimacy in terms of the purposes for collecting, using or disclosing personal information in PIPEDA is challenged.
Another risk or concern raised was that such practices threaten the individual’s ability to control the flow of their personal information. In one of the responses to the draft of this report, an advocacy organization challenged industry assertions that individuals view advertising as a benefit; rather, it believes that users tolerate advertising. Other risks raised include the use of potentially inaccurate data affecting users’ online experiences, as well as decisions made about them—often without them being aware and, consequently, having no ability to challenge the accuracy of the information. Another significant risk is that profiling can be used to discriminate against individuals, for example through pricing schemes. In sum, these practices threaten consumer autonomy.
For Further Reading
Agre, Philip E., and Marc Rotenberg. 1997. Technology and Privacy: The New Landscape. Cambridge MA: MIT Press.
Allen, Anita L. 1999. Coercing Privacy. William and Mary Law Review 40, no. 3: 723-757.
boyd, danah, and Alice Marwick. 2011. Social Privacy in Networked Publics: Teens’ Attitudes, Practices, and Strategies. Paper presented at the Privacy Law Scholars Conference, Berkeley CA, June 2, 2011.
DeCew, Judith Wagner. 1997. In Pursuit of Privacy: Law, Ethics, and the Rise of Technology. Ithaca, NY: Cornell University Press.
Franklin, Ursula. 1996. Stormy Weather: Conflicting Forces in the Information Society. Closing address at the 18th International Privacy and Data Protection Conference, Ottawa, Canada, September 19.
Nissenbaum, Helen. 2010. Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford: Stanford University Press.
Nissenbaum, Helen. 2004. Privacy as Contextual Integrity. Washington Law Review 79, no. 1: 101-139.
Palen, Leysia, and Paul Dourish. 2003. Unpacking “privacy” for a networked world. In Proceedings of the ACM Conference on Human Factors in Computing Systems, 129-136. New York: Association for Computing Machinery.
Steeves, Valerie. 2006. Privacy and New Media. In Mediascapes: New Patterns in Canadian Communication, eds. Paul Attallah and Leslie Regan Shade, 2nd ed., 250-265. Toronto: Thomson Nelson.
Warren, Samuel, and Louis Brandeis. 1890. The Right to Privacy. Harvard Law Review 4, no. 5: 193-220.
Westin, Alan. 1967. Privacy and Freedom. New York: Atheneum.
privacy in everyday life
Debatin, Bernhard, Janet P. Lovejoy, Ann-Kathrin Horn, and Brittany N. Hughes. 2009. Facebook and Online Privacy: Attitudes, Behaviors and Unintended Consequences. Journal of Computer-Mediated Communication 15: 83-108.
Kerr, Ian, Valerie Steeves, and Carole Lucock, eds. 2009. Lessons from the Identity Trail: Anonymity, Privacy and Identity in a Networked Society. Oxford: Oxford University Press.
Nippert-Eng, Christena. 2010. Islands of Privacy. Chicago: University of Chicago Press.
Papacharissi, Zizi, and Paige L. Gibson. 2011. Fifteen Minutes of Privacy: Privacy, Sociality, and Publicity on Social Network Sites. In Privacy Online: Perspectives on Privacy and Self-Disclosure in the Social Web, eds. Sabine Trepte and Leonard Reinecke, 75-89. New York: Springer.
Raynes-Goldie, Kate. 2010. Aliases, creeping, and wall cleaning: Understanding privacy in the age of Facebook. First Monday 15, no. 1 (January).
Solove, Daniel J. 2007. ‘I’ve Got Nothing to Hide’ and Other Misunderstandings of Privacy. San Diego Law Review 44, no. 4: 745-772.
Stallworth, Brian. 2010. Future Imperfect: Googling for Principles in Online Behavioural Advertising. Federal Communications Law Journal 62, no. 2 (March): 465-491.
Altman, Irwin. 1977. Privacy Regulation: Culturally Universal or Culturally Specific? Journal of Social Issues, 33, no. 3: 66-84.
Bennett, Colin, and Martin French. 2003. The state of privacy in the Canadian state: The fallout from 9/11. Journal of Contingencies and Crisis Management 11, no. 1: 2-11.
Department of Communication. 1970. Instant World: A Report on Telecommunications in Canada. Ottawa: Government of Canada.
Department of Communication/Department of Justice. 1972. Privacy & Computers. Ottawa: Information Canada.
Government of Canada. 2000. Personal Information Protection and Electronic Documents Act (PIPEDA). (S.C. 2000, c. 5).
Government of Canada. 1985. Criminal Code (R.S.C., 1985, c. C-46).
Government of Canada. 1983. The Privacy Act (R.S.C., 1985, c. P-21).
Government of Canada. 1982. Canadian Charter of Rights and Freedoms. Ottawa: Government of Canada.
Government of Canada. 1977. Canadian Human Rights Act. Ottawa: Canadian Human Rights Commission.
Government of the United States of America. 2001. Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism (USA Patriot Act) Act of 2001, Pub. L. No. 107-56, 115 Stat. 272.
Government of the United States of America. 1998. Children’s Online Privacy Protection Act (COPPA). 15 U.S.C. §§ 6501–6506 (Pub. L. No. 105-277, 112 Stat. 2581-728, enacted October 21).
House of Commons Standing Committee on Human Rights and the Status of Persons With Disabilities. 1997. Privacy: Where Do We Draw the Line? (The Hon. Sheila Finestone, Chair).
Office of the Privacy Commissioner of Canada (OPC). 2011. Report on the 2010 Office of the Privacy Commissioner of Canada’s Consultations on Online Tracking, Profiling and Targeting and Cloud Computing. Ottawa: Office of the Privacy Commissioner.
R. v. Duarte. 1990. 1 S.C.R. 30, January 25.
Raboy, Marc, and Jeremy Shtern. 2010. Media Divides: Communication Rights and the Right to Communicate in Canada. Vancouver: UBC Press.
Shade, Leslie Regan. 2008. Reconsidering the Right to Privacy in Canada. Bulletin of Science, Technology & Society 28, no. 1 (February): 80-91.
Bennett, Colin. 2008. The Privacy Advocates: Resisting the Spread of Surveillance. Cambridge, MA: MIT Press.
Canadian Internet Policy and Public Interest Clinic (CIPPIC). 2008. PIPEDA Complaint: Facebook. Filed 30 May.
Denham, Elizabeth. 2009, July 16. Report of Findings into the Complaint Filed by Canadian Internet Policy and Public Interest Clinic (CIPPIC) against Facebook Inc. Under the Personal Information Protection and Electronic Documents Act. Ottawa: Office of the Privacy Commissioner.
Lawford, John. 2004. Consumer Privacy Under PIPEDA: How Are We Doing? Ottawa: Public Interest Advocacy Centre.
Lawford, John. 2008. All in the Data Family: Children’s Privacy Online. September. Ottawa: Public Interest Advocacy Centre.
Lawson, Philippa. 1999. Privacy in Canada: A Public Interest Perspective. Address to the Riley Conference on Privacy and Bill C-54. February 23. Ottawa: Public Interest Advocacy Centre.
Campbell, John Edward, and Matt Carlson. 2002. Panopticon.com: Online Surveillance and the Commodification of Privacy. Journal of Broadcasting & Electronic Media 46, no. 4: 586-606.
Foucault, Michel. 1975. Surveiller et punir: Naissance de la prison. Paris: Gallimard.
Gates, Kelly, and Shoshana Magnet. 2007. Communication Research and the Study of Surveillance. The Communication Review 10: 277-293.
Lyon, David. 2001. Surveillance Society: Monitoring Everyday Life. Buckingham: Open University Press.
Orwell, George. 1949. Nineteen Eighty-Four. London: Secker & Warburg.
Office of the Privacy Commissioner of Canada, Jennifer Stoddart
Office of the Information and Privacy Commissioner of Ontario, Ann Cavoukian
Office of the Information and Privacy Commissioner of British Columbia, Elizabeth Denham
Stop Online Spying
The New Transparency: Surveillance and Social Sorting