Person:
O'Brien, David


Search Results

Now showing 1 - 8 of 8
  • Publication
    Practical approaches to big data privacy over time
    (Oxford University Press (OUP), 2018) Altman, Micah; Wood, Alexandra; O'Brien, David; Gasser, Urs
    Key Points: Governments and businesses are increasingly collecting, analysing, and sharing detailed information about individuals over long periods of time. Vast quantities of data from new sources and novel methods for large-scale data analysis promise to yield deeper understanding of human characteristics, behaviour, and relationships and advance the state of science, public policy, and innovation. The collection and use of fine-grained personal data over time, at the same time, is associated with significant risks to individuals, groups, and society at large. This article examines a range of long-term research studies in order to identify the characteristics that drive their unique sets of risks and benefits and the practices established to protect research data subjects from long-term privacy risks. We find that many big data activities in government and industry settings have characteristics and risks similar to those of long-term research studies, but are subject to less oversight and control. We argue that the risks posed by big data over time can best be understood as a function of temporal factors comprising age, period, and frequency and non-temporal factors such as population diversity, sample size, dimensionality, and intended analytic use. Increasing complexity in any of these factors, individually or in combination, creates heightened risks that are not readily addressable through traditional de-identification and process controls. We provide practical recommendations for big data privacy controls based on the risk factors present in a specific case and informed by recent insights from the state of the art and practice.
  • Publication
    Bridging the Gap between Computer Science and Legal Approaches to Privacy
    (Harvard Law School, 2018) Nissim, Kobbi; Bembenek, Aaron; Wood, Alexandra; Bun, Mark; Gaboardi, Marco; Gasser, Urs; O'Brien, David; Vadhan, Salil; Steinke, Thomas
    The analysis and release of statistical data about individuals and groups of individuals carries inherent privacy risks, and these risks have been conceptualized in different ways within the fields of law and computer science. For instance, many information privacy laws adopt notions of privacy risk that are sector- or context-specific, such as in the case of laws that protect from disclosure certain types of information contained within health, educational, or financial records. In addition, many privacy laws refer to specific techniques, such as deidentification, that are designed to address a subset of possible attacks on privacy. In doing so, many legal standards for privacy protection rely on individual organizations to make case-by-case determinations regarding concepts such as the identifiability of the types of information they hold. These regulatory approaches are intended to be flexible, allowing organizations to (1) implement a variety of specific privacy measures that are appropriate given their varying institutional policies and needs, (2) adapt to evolving best practices, and (3) address a range of privacy-related harms. However, in the absence of clear thresholds and detailed guidance on making case-specific determinations, flexibility in the interpretation and application of such standards also creates uncertainty for practitioners and often results in ad hoc, heuristic processes. This uncertainty may pose a barrier to the adoption of new technologies that depend on unambiguous privacy requirements. It can also lead organizations to implement measures that fall short of protecting against the full range of data privacy risks.
  • Publication
    Governments and Cloud Computing: Roles, Approaches, and Policy Considerations
    (Berkman Center for Internet & Society, 2014) Gasser, Urs; O'Brien, David
    Governments from Bogota to Beijing are engaging with emerging cloud computing technologies and their industry in a variety of overlapping contexts. Based on a review of a representative number of advanced cloud computing strategies developed by governments from around the world, including the United States, United Kingdom, the European Union, and Japan, we observed that these governments – mostly implicitly – have taken on several different “roles” with respect to their approaches to cloud computing. In particular, we identify six distinguishable but overlapping roles assumed by governments: users, regulators, coordinators, promoters, researchers, and service providers. In this paper, we describe and discuss each of these roles in detail using examples from our review of cloud strategies, and share high-level observations about the roles as well as the contexts in which they arise. The paper concludes with a set of considerations for policymakers to take into account when developing approaches to the rapidly evolving cloud computing technologies and industry.
  • Publication
    Elements of a New Ethical Framework for Big Data Research
    (Washington & Lee University School of Law, 2016) Vayena, Effy; Gasser, Urs; Wood, Alexandra; O'Brien, David; Altman, Micah
    Emerging large-scale data sources hold tremendous potential for new scientific research into human biology, behaviors, and relationships. At the same time, big data research presents privacy and ethical challenges that the current regulatory framework is ill-suited to address. In light of the immense value of large-scale research data, the central question moving forward is not whether such data should be made available for research, but rather how the benefits can be captured in a way that respects fundamental principles of ethics and privacy. In response, this Essay outlines elements of a new ethical framework for big data research. It argues that oversight should aim to provide universal coverage of human subjects research, regardless of funding source, across all stages of the information lifecycle. New definitions and standards should be developed based on a modern understanding of privacy science and the expectations of research subjects. In addition, researchers and review boards should be encouraged to incorporate systematic risk-benefit assessments and new procedural and technological solutions from the wide range of interventions that are available. Finally, oversight mechanisms and the safeguards implemented should be tailored to the intended uses, benefits, threats, harms, and vulnerabilities associated with a specific research activity. Development of a new ethical framework with these elements should be the product of a dynamic multistakeholder process that is designed to capture the latest scientific understanding of privacy, analytical methods, available safeguards, community and social norms, and best practices for research ethics as they evolve over time. Such a framework would support big data utilization and help harness the value of big data in a sustainable and trust-building manner.
  • Publication
    Don't Panic: Making Progress on the "Going Dark" Debate
    (Berkman Center for Internet & Society at Harvard Law School, 2016) Gasser, Urs; Gertner, Nancy; Goldsmith, Jack; Landau, Susan; Nye, Joseph; O'Brien, David; Olsen, Matthew; Renan, Daphna; Sanchez, Julian; Schneier, Bruce; Schwartzol, Larry; Zittrain, Jonathan
    Just over a year ago, with support from the William and Flora Hewlett Foundation, the Berkman Center for Internet & Society at Harvard University convened a diverse group of security and policy experts from academia, civil society, and the U.S. intelligence community to begin to work through some of the particularly vexing and enduring problems of surveillance and cybersecurity. The group came together understanding that there has been no shortage of debate. Our goals were to foster a straightforward, non-talking-point exchange among people who do not normally have a chance to engage with each other and then to contribute in meaningful and concrete ways to the discourse on these issues. A public debate unfolded alongside our meetings: the claims and questions around the government finding a landscape that is “going dark” due to new forms of encryption introduced into mainstream consumer products and services by the companies who offer them. We have sought to distill our conversations and some conclusions in this report. The participants in our group who have signed on to the report, as listed on the following page, endorse “the general viewpoints and judgments reached by the group, though not necessarily every finding and recommendation.” In addition to endorsing the report, some signatories elected to individually write brief statements, which appear in Appendix A of the report and also as individual posts on Lawfareblog.com, written by Jonathan Zittrain, Bruce Schneier, and Susan Landau. Our participants who are currently employed full-time by government agencies are precluded from signing on because of their employment, and nothing can or should be inferred about their views from the contents of the report. We simply thank them for contributing to the group discussions.
  • Publication
    Privacy and Cybersecurity Research Briefing
    (Berkman Klein Center for Internet & Society, 2016) O'Brien, David; Budish, Ryan; Faris, Robert; Gasser, Urs; Lin, Tiffany
  • Publication
    Privacy and Open Data Research Briefing
    (Berkman Klein Center for Internet & Society, 2016) Wood, Alexandra; O'Brien, David; Gasser, Urs
  • Publication
    Differential Privacy: A Primer for a Non-Technical Audience
    (Vanderbilt University, 2018) Wood, Alexandra; Altman, Micah; Bembenek, Aaron; Bun, Mark; Gaboardi, Marco; Honaker, James; Nissim, Kobbi; O'Brien, David; Steinke, Thomas; Vadhan, Salil
    Differential privacy is a formal mathematical framework for quantifying and managing privacy risks. It provides provable privacy protection against a wide range of potential attacks, including those currently unforeseen. Differential privacy is primarily studied in the context of the collection, analysis, and release of aggregate statistics. These range from simple statistical estimations, such as averages, to machine learning. Tools for differentially private analysis are now in early stages of implementation and use across a variety of academic, industry, and government settings. Interest in the concept is growing among potential users of the tools, as well as within legal and policy communities, as it holds promise as a potential approach to satisfying legal requirements for privacy protection when handling personal information. In particular, differential privacy may be seen as a technical solution for analyzing and sharing data while protecting the privacy of individuals in accordance with existing legal or policy requirements for de-identification or disclosure limitation. This primer seeks to introduce the concept of differential privacy and its privacy implications to non-technical audiences. It provides a simplified and informal, but mathematically accurate, description of differential privacy. Using intuitive illustrations and limited mathematical formalism, it discusses the definition of differential privacy, how differential privacy addresses privacy risks, how differentially private analyses are constructed, and how such analyses can be used in practice. A series of illustrations is used to show how practitioners and policymakers can conceptualize the guarantees provided by differential privacy. These illustrations are also used to explain related concepts, such as composition (the accumulation of risk across multiple analyses), privacy loss parameters, and privacy budgets. 
This primer aims to provide a foundation that can guide future decisions when analyzing and sharing statistical data about individuals, informing individuals about the privacy protection they will be afforded, and designing policies and regulations for robust privacy protection.
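The concepts the primer covers — noise calibrated to a privacy loss parameter ε, and budgets that accumulate under composition — can be illustrated with a short sketch. The following is a minimal, hypothetical Python example of the Laplace mechanism, the textbook construction for ε-differential privacy; it is not code from the primer, and the function names are illustrative only.

```python
import math
import random

def laplace_scale(sensitivity, epsilon):
    """Noise scale b for the Laplace mechanism: b = sensitivity / epsilon."""
    return sensitivity / epsilon

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    """Release true_value plus Laplace(0, sensitivity/epsilon) noise.

    Satisfies epsilon-differential privacy for a query whose output
    changes by at most `sensitivity` when one individual's record is
    added or removed (e.g. sensitivity = 1 for a counting query).
    """
    b = laplace_scale(sensitivity, epsilon)
    # Sample Laplace(0, b) by inverse transform on u ~ Uniform(-1/2, 1/2).
    u = rng.random() - 0.5
    noise = -b * (1 if u >= 0 else -1) * math.log(1 - 2 * abs(u))
    return true_value + noise

def total_budget(epsilons):
    """Basic composition: k analyses with budgets eps_1..eps_k together
    consume a total privacy budget of eps_1 + ... + eps_k."""
    return sum(epsilons)
```

For example, releasing a count (sensitivity 1) with ε = 0.5 adds Laplace noise of scale 2, and releasing three statistics at ε = 0.5, 0.25, and 0.25 consumes a total budget of ε = 1.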