ICPSR staffer blogs from International Association of Privacy Professionals (IAPP) Global Privacy Summit 2018 #GPS18

 

 

Editor's note: This post, written by ICPSR's Johanna Bleckman, will be continually updated during the #GPS18 conference, March 25-28, 2018, in Washington, DC.
 

Beyond "Notice, Choice, and Purpose Limitation"

Wednesday, March 28, 2:30 p.m. EDT: I have just completed a workshop walking us through the 'post-legal' considerations necessary to support an ethical approach to the collection and use of Big Data. The Fair Information Practice Principles (FIPPs) used to be sufficient for protecting the privacy of an individual's data. These principles require that individuals be notified when their data are collected, that they be given the choice to opt out of that collection, and that the purposes for which the data will be used be limited to reasonable expectations at the time of collection. Professor Dennis Hirsch of Ohio State University's Law School (Go Blue) laid out how these principles are no longer enough: too much data is being collected, from too many points and for too many purposes, for individuals to be given useful and timely notice, reasonable and informed choice, or anywhere near a full understanding of the purposes for which the data may be used.

So what do we do? Since these three tenets are no longer sufficient to protect individuals' rights, consideration and mitigation of the following ethical risks need to be part of a new data ethics norm within industry: bias baked into Big Data, opaque or 'black box' analytic methods, and manipulation of individuals based on vulnerabilities identified using Big Data. 

How does this intersect with the social sciences and ICPSR's work? Michelle De Mooy, Director of the Center for Democracy and Technology's Privacy and Data Project, recommends that industry consider, among other things, the social impact of its systems, tools, and products. Data collected by all of the devices in our lives, for instance, are inherently biased (who drives smart cars? who carries smartphones? etc.). When those data are shared responsibly and opened to interrogation by the research community, though, they can be understood, evaluated, and put in context.

If you've stuck with me this far, thanks. Data sharing for-the-(ethical)-win.

 

Privacy v. the Social Good and Protecting "Big [Classroom] Data"

Wednesday, March 28, 12:00 p.m. EDT: The closing session of #GPS18 is underway! We are hearing from Viviane Reding, former Vice President of the European Commission, who oversaw the original drafting of the General Data Protection Regulation (GDPR), the new EU regulation that takes effect in May of this year and one of the main topics of this year's Summit.

But it's not over yet: on my docket today are sessions on balancing the individual right to privacy against broader social values, and on understanding the explosion of educational technology in K-12 classrooms and anticipating how the Children's Online Privacy Protection Act (COPPA) and the Family Educational Rights and Privacy Act (FERPA) may or may not apply to, and help protect, this new source of "Big Data."

 

"Metadata is a Love Note to the Future"

Tuesday, March 27, 5:00 p.m. EDT: Learning about how businesses use, or perhaps don't yet use, metadata. "If you don't tag your data, you don't know what you have; if you don't know what you have, you can't protect it." That was from Dana Simberkoff, Chief Risk, Privacy and Information Security Officer at AvePoint, and I feel like she's a kindred spirit. We do love metadata at ICPSR.

 

Privacy v. Data Flows

Tuesday, March 27, 1:30 p.m. EDT: I just came out of a panel discussion on the limitations of data localization policies, which require that data be stored within the country or jurisdiction where they were collected. We heard from, among others, Shannon Coe, Team Lead for Data Flows and Privacy for the International Trade Administration at the U.S. Department of Commerce, and Jane Horvath, Senior Director of Global Policy at Apple and former Chief Privacy Counsel and Civil Liberties Officer at the U.S. Department of Justice.

My main takeaway? The tension between data movement/access and privacy is truly top of mind for government and industry (not just us!). ICPSR continues to pursue and push forward the tools and policies that maximize both, knowing that they are not fundamentally at odds. As Shannon Coe said, "privacy and data flows can be mutually reinforcing." 

 

Privacy Bootcamp? Check!

Tuesday, March 27, 8:00 a.m. EDT: I'm thrilled to be at the International Association of Privacy Professionals (IAPP) Global Privacy Summit 2018! Yesterday, I went through their well-known Privacy Bootcamp, a five-hour session on data privacy and security taught by Kirk Nahra, an attorney and internationally recognized leader in the field of data privacy laws and regulations. That was followed by a workshop on assessing privacy and security risks to personally identifiable information, hosted by folks from the White House Office of Management and Budget (OMB) and the National Institute of Standards and Technology (NIST).
 
As we finalize our researcher credentialing white paper and continue to spec out and build the credentialing system at ICPSR, I am feeling so lucky to have the opportunity to join this community and learn from privacy professionals across government and industry. Responsibly archiving and sharing sensitive or otherwise restricted data has long been a critical part of ICPSR's business, and it's great to bring our expertise into this world and to bring home theirs. Stay tuned for more updates from #GPS18!
 
On my lineup today are sessions on data ethics, the risks and benefits of data localization, legislative developments relevant to data privacy, and more!

Mar 27, 2018
