- 1 FILE NOTES: National Strategy for Trusted Identities in Cyberspace Privacy Workshop
- 1.1 June 27-28, 2011 / MIT Media Lab / Boston, Mass.
- 1.2 First panelist: Jeffrey Friedberg, Microsoft
- 1.3 Second panelist: John Clippinger, Harvard Berkman Center
- 1.4 Third panelist: Ken Mortensen, chief privacy officer, CVS Caremark
- 1.5 Fourth panelist: Seth Schoen, staff technologist, Electronic Frontier Foundation
- 1.6 Fifth panelist: Kaliya Hamlin – Internet Identity Commons
- 1.7 Dawn Jutla, Saint Mary’s University, Halifax, Nova Scotia
- 1.8 Q and A session
National Strategy for Trusted Identities in Cyberspace
June 27-28, 2011 / MIT Media Lab / Boston, Mass.
By Bill Densmore
Hash tags: #nstic.mit #nstic
This morning’s panel covers usability. The panelists are introduced by Dazza Greenwood of the eCitizen Foundation.
First panelist: Jeffrey Friedberg, Microsoft
Microsoft: experiences investigating user-centric privacy systems.
- Limit tracing by claims providers
- Limit tracking across websites
Discussion of concept of user agent
- What does the user agent do? An intermediary between claims providers and service providers. The user agent could be locally on the computer or in the cloud, hosted somewhere else. It allows the user to collect different claims sets from different claims providers. It could create minimally derived claims.
- Who provides the audit trail? The user agent could do this. It can also provide selective disclosure.
Example: User goes to a claims provider and asks for a verified claim of age, which she then sends along to the relying party.
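The flow Friedberg describes can be sketched in miniature. This is a hypothetical illustration, not the actual Microsoft design: real minimal-disclosure systems such as U-Prove use zero-knowledge cryptography, whereas this sketch stands in a shared HMAC key for the provider's signature. All names here (`issue_derived_claim`, `user_agent_forward`, the key) are invented for the example.

```python
import hmac, hashlib, json

PROVIDER_KEY = b"claims-provider-secret"  # stands in for a real signing key

def issue_derived_claim(birth_year: int, current_year: int) -> dict:
    """Claims provider: derive 'over 18' without disclosing the birth date."""
    claim = {"age_over_18": current_year - birth_year >= 18}
    payload = json.dumps(claim, sort_keys=True).encode()
    claim["sig"] = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return claim

def verify_claim(claim: dict) -> bool:
    """Relying party: check the provider's signature on the derived claim."""
    body = {k: v for k, v in claim.items() if k != "sig"}
    payload = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(claim["sig"], expected)

audit_log = []  # the user agent keeps the audit trail of disclosures

def user_agent_forward(claim: dict, relying_party: str) -> dict:
    audit_log.append({"to": relying_party,
                      "disclosed": [k for k in claim if k != "sig"]})
    return claim

claim = issue_derived_claim(birth_year=1990, current_year=2011)
forwarded = user_agent_forward(claim, "example-wine-shop")
assert verify_claim(forwarded)        # relying party accepts the claim
assert "birth_year" not in forwarded  # the birth date itself never leaves
```

Note the trade-off the panel discusses: the relying party learns only the derived predicate, and the user agent, not the claims provider, records where claims went, which limits tracing by the provider.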
“Especially in the private sector there is a very strong incentive to collect more.”
Trust marks as part of the system: they can be spoofed pretty easily.
Conclusion: There’s a need for guidance and research, a need to align user interface, architecture and mental models.
Second panelist: John Clippinger, Harvard Berkman Center
Also has an appointment at MIT Media Lab. He’s been working on a project called “mustard seed.”
The whole point is that the new data is going to need to flow in a principled way and enable new markets. Design solutions that leverage the technology. Law is not just a set of rules; it is a set of mechanisms. A trust framework is like a technology.
“Mustard seed” is a platform to build and test different trust frameworks, to get reference use cases you can point to. “At some point this is just too complex: too many intellectual specialties, people talk around and don’t understand each other, so you just need to have a reference solution.”
The issue is not about privacy; it is about level of risk.
An independent party will manage the trust framework – as in the credit-card network. How do you create layers of trust? All members of a trust framework have to be authenticated into it. “A violation of the terms as a result of an audit can result in a member being quarantined and there are economic consequences.”
Working with the Boston school system, putting a trust wrapper around Diaspora so they can have a “card” that allows them to have role-based sharing with different parties around the Boston school system. “Can you really get a card that is secure that people will use?”
Third panelist: Ken Mortensen, chief privacy officer, CVS Caremark
Mortensen served as a privacy officer in the U.S. Department of Justice. He is a lawyer. CVS Caremark has pharmacies but also has a managed-benefits business. It also has mini-clinics, staffed by nurse practitioners, in about 600 stores.
“The reality is things are not easy. Things are harder than you might think . . . the world in which this came together came from many different angles. How do you build a trust framework? How do you deal with privacy adjudication?”
“In the world I live in there is a lot of information going all over the place.”
“There are sometimes very good use of personal information that the individual is not entirely cognizant of.”
- The paper prescription the doctor writes – electronic prescribing is now about 40% of the market
- Pharmacy and pharmacist will dispense. Between the two is a prescribing network.
- Payment – Run against insurance card / Medicare/Medicaid
- Pass back to health-care plan
- A portion passed back to any large employer
Fourth panelist: Seth Schoen, staff technologist, Electronic Frontier Foundation
With new identity technologies, what incentives are identity providers and relying parties going to have to align what they do with the preferences of consumers? “It’s great to see that people are paying attention to this.”
To what extent do these parties agree about what is supposed to happen?
Sometimes parties agree: Wells Fargo sending a token for two-factor authentication to a bank account – something Seth asked for.
Sometimes relying parties and users don’t agree. Example: Google operates as an identity provider and a relying party to send a tracking cookie to Seth’s computer.
Distinction between authentication and identification: You might be authenticated without being identified. We might want to protect accounts against unauthorized access and use, without revealing who owns them. This is not anonymity, it is pseudonymity. “I think it’s a trust framework that is terrible from the point of view of user privacy because it creates mountains of data which is accessible for a very long time.”
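Schoen's distinction can be made concrete with a small sketch, assuming a simple challenge-response design of my own invention (the pseudonym, function names, and storage scheme are all hypothetical): the server can verify that whoever answers the challenge controls the account, while storing no real-world identity at all.

```python
import hmac, hashlib, os

# Server-side store maps a pseudonym to a shared secret -- no name,
# email, or address is ever collected, so the account is authenticated
# but its owner is not identified.
accounts = {}

def register(pseudonym: str) -> bytes:
    """Create a pseudonymous account; the secret stays on the user's device."""
    secret = os.urandom(32)
    accounts[pseudonym] = secret
    return secret

def authenticate(pseudonym: str, challenge: bytes, response: bytes) -> bool:
    """Verify control of the account without learning who the user is."""
    secret = accounts.get(pseudonym)
    if secret is None:
        return False
    expected = hmac.new(secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

secret = register("mallard42")
challenge = os.urandom(16)                     # server issues a fresh nonce
response = hmac.new(secret, challenge, hashlib.sha256).digest()
assert authenticate("mallard42", challenge, response)        # access granted
assert not authenticate("mallard42", challenge, b"\x00" * 32)  # forgery fails
```

The point of the sketch is Schoen's: unauthorized access is prevented, yet the server's database contains nothing linking "mallard42" to a person.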
Fraud protection is a consumer-protection interest, but so is privacy – and having a lot of data helps prevent fraud.
There may be increased pressure to provide identity as it gets easier to do so. It may become more difficult to resist or evade these. Is there a way inherently in the technology where the user can say, “I don’t agree with you requesting this.”
Fifth panelist: Kaliya Hamlin – Internet Identity Commons
As “identity woman” she is a founder and executive director of the Personal Data Ecosystem Consortium.
Introduction by Dazza Greenwood:
“Free people own their names and can name themselves whatever they want. Slaves are named by their households. Free people in common law can have as many aliases as they like so long as they have none of those for the purpose of committing a fraud.”
She notes that FIPPS were first developed three decades ago. There are now multiple new devices – how are we going to use these new things to make the aspirations of FIPPS real.
Current ecosystem includes data collectors, data brokers, data users. “We are not understanding these data flows, which are completely disconnected from the individual.”
“There’s a middle way. And that’s what we’re really focused on. You can have privacy and new business opportunities if you shift and you allow people to track themselves and they can gain value from their own data using new devices and network nodes.”
Searches, calendars, location, social graph, interests, purchases are all part of the use of individual data.
“Why is this important? Because people shift and move contexts and they don’t want those contexts linked together.”
She says individuals are really the only ethical integrators of all this data. “We need business models for agents that work on the user’s behalf.”
People beginning to form businesses in this area:
- Project Danube
“We are growing a community of startups working in this space … we’ll have 19 in our startup circle shortly.” “You can’t really join unless you have a commitment to open standards.” “People have to be able to export their data from one service to another service.”
Unisys survey of 1,000 people – they are most concerned about securing their personal information.
Dawn Jutla, Saint Mary’s University, Halifax, Nova Scotia
She is from the school of business. She has been leader of the OASIS effort to create privacy standards in the corporate environment.
ISTPA – International Security, Trust and Privacy Alliance – there was an absence of common understanding across many jurisdictions.
Key terms: Agreement, control, validation, certification, audit, enforcement, interaction, usage, agent, access.
Focus of her talk is on the challenge of international and multi-state regulations.
Context is critical when use cases are organized.
Contact: dawn dot jutla at smu dot ca
Q and A session
What incentives have to emerge to get the system focused on the user?
Jutla: Nervous about how the user is incentivized. Incent them so they benefit from the marketplace. That does not incent them to protect their privacy. “We need to create incentives for them to protect their data. … they more so need to protect their data if they are going to keep personal data stores than ever before.”
Hamlin: “What are the moral rights that can’t be given away?”
Clippinger: “You are going to see that the data in privacy and financial regulations are going to become very close.”
Mortensen: From 1996 to 2011, health-care providers have been incentivized to focus on privacy – many participants now have chief privacy officers.
Schoen: People’s mental model of who is a relying party varies. Who has obligations to the user? The person knows who the doctor is. On the other hand, “we have a lot of relying parties that the users have never heard of. They don’t know these parties have their data, they don’t know they are being tracked . . . in that case the incentives are very poorly aligned between the user (and the relying party). I don’t know how to bridge that gap.”
Jutla: We encouraged everyone to get on the Internet without realizing the implications on personal data.
All questions asked at once:
Bob Panera (spelling uncertain): Haven’t discussed what privacy-enhancing technologies are. What about U-Prove, which Microsoft has acquired rights to? Where does Microsoft see U-Prove going? How do you envision it playing?
The gap between the user and the relying party – IBM has Identity Mixer, a user-centric cryptographic approach. Could technology, and access to technology by a user and a relying party in a similar way, be a way to bridge that gap?
Doug Shepards (spelling uncertain): W3C. They held an identity workshop two weeks ago. Strong interest from browsers to build some sort of identity management into the browser, which might aggregate the info locally or use an IDP to broker the information. Where might that be appropriate or not appropriate?
Paul Trevithick – Azigo – has been involved in lots of open-source efforts in this space. “Privacy is about allowing me control who can see what about me and in what context.” “We have spent many millions working on user agents … but so far we have not brought home the bacon.” Most of the innovation is coming from startups in this space. In the past Microsoft has played a powerful role, open-source projects played a role. “We didn’t quite get there, we learned a lot. But what’s the best way to foster the collaboration?” Is it just to let the startups struggle, is there a role for the government play. “What is the process, what would best foster and facilitate the emergence of these user agents?”
Friedberg: A challenge around CardSpace was the lack of use cases at the time it was released. The demand wasn’t there. Challenges around the heavy UI. There needs to be extra work to figure out how to make the experiences map better to what people are thinking. He thinks U-Prove is elegant and has cryptographic strength. Microsoft is intrigued about where this is all going to lead. Cross-sector digital identity project with NSTIC.
Clippinger: The source of innovation is coming from the startup community. It’s hard for larger institutions to keep pace. Facebook has emerged while everyone has been considering this. There are those who think the whale will be overturned and the ad-networks model will go away and be replaced by a user-centric environment. “The whole idea of personally identifiable information is an old concept.” … It will become part of brand building to build a trust relationship with users.
Mortensen: It’s critical for people in government to understand the pragmatic aspects of the use of this information. “Building these disconnected from the people who need to use the technology is problematic.”
Don’t forget about how much paper is still used.
Hamlin: Need to focus on agents lying between the individual and relying parties. “Because the individual just dealing with relying parties is not enough.”
Schoen: It is a challenge explaining the math of security to people. Sites directly circumvent users’ privacy preferences by constructing a new cookie if you delete one. In that environment, it’s hard to see what to do.
It’s amazing to see the power that browser vendors are acquiring in terms of the ability to protect consumer privacy. In terms of setting defaults, make information flows more visible to users, and understand and visualize sites’ policies and preferences. They can enable the use of multi-factor authentication. “Browser vendors can do a lot, really.”
Jutla: “I don’t believe we have the technology yet to protect personal data stores” from security hackers. “When we create these personal data stores we are opening Pandora’s Box if we don’t do it correctly.”
-- END OF PANEL DISCUSSION --