Monday, October 31, 2011

Privacy and Security in the Implementation of Health Information Technology (Electronic Health Records): U.S. and EU Compared

Privacy and Security in the Implementation of Health Information Technology (Electronic Health Records): U.S. and EU Compared, B.U. J. SCI. & TECH. L., Vol. 17, Winter 2011. "The importance of the adoption of Electronic Health Records (EHRs) and the associated cost savings cannot be ignored as an element in the changing delivery of health care. However, the potential cost savings predicted in the use of EHR are accompanied by potential risks, either technical or legal, to privacy and security. The U.S. legal framework for healthcare privacy is a combination of constitutional, statutory, and regulatory law at the federal and state levels. In contrast, it is generally believed that EU protection of privacy, including personally identifiable medical information, is more comprehensive than that of U.S. privacy laws. Direct comparisons of U.S. and EU medical privacy laws can be made with reference to the five Fair Information Practices Principles (FIPs) adopted by the Federal Trade Commission and other international bodies. The analysis reveals that while the federal response to the privacy of health records in the U.S. seems to be a gain over conflicting state law, in contrast to EU law, U.S. patients currently have little choice in the electronic recording of sensitive medical information if they want to be treated, and minimal control over the sharing of that information. A combination of technical and legal improvements in EHRs could make the loss of privacy associated with EHRs de minimis. The EU has come closer to this position, encouraging the adoption of EHRs and confirming the application of privacy protections at the same time. It can be argued that the EU is proactive in its approach; whereas because of a different viewpoint toward an individual's right to privacy, the U.S. system lacks a strong framework for healthcare privacy, which will affect the implementation of EHRs. If the U.S. is going to implement EHRs effectively, technical and policy aspects of privacy must be central to the discussion."

Monday, October 24, 2011

Jim Dempsey Op-ed: The shocking strangeness of our 25-year-old digital privacy law


By Jim Dempsey 

Op-ed: Twenty-five years after it was passed, the Electronic Communications Privacy Act still governs much of our privacy online, and the Center for Democracy and Technology argues that ECPA needs an overhaul. The opinions in this post do not necessarily reflect the views of Ars Technica.

Cell phones the size of bricks, "portable" computers weighing 20 pounds, Ferris Bueller's Day Off, and the federal statute that lays down the rules for government monitoring of mobile phones and Internet traffic all have one thing in common: each is celebrating its 25th anniversary this year.
The Electronic Communications Privacy Act (ECPA) was signed into law on October 21, 1986. Although it was forward-looking at the time, ECPA’s privacy protections have remained stuck in the past while technology has raced ahead, providing us means of communication that not too long ago existed only in the minds of science fiction writers.

Citing ECPA, the government claims it can track your movements without having to get a warrant from a judge, using the signal your mobile phone silently sends out every few seconds. The government also claims it can read your e-mail and sneak a peek at your online calendar and the private photos you have stored in “the cloud," all without a warrant.

The government admits that if it wants to seize photos on your hard drive, it needs a warrant from a judge. And if it wants to intercept your e-mail en route, well, it needs a warrant for that, too. But once the data comes to rest on the Internet’s servers, the government claims you’ve lost your privacy rights in it. Same data, different rules.

Sound illogical? Out of step with the way people use technology today? It is. Most people assume the Constitution protects them against unreasonable searches and seizures, regardless of technology. The Justice Department thinks differently. It argues that the Fourth Amendment's warrant requirement does not apply to data stored online.

That’s the same argument the government made about telephones 80 years ago. If you really wanted your privacy, the government argued, you wouldn’t use the telephone. Unfortunately, in 1928 the Supreme Court agreed and said that wiretapping was not covered by the Constitution. It took the Court 40 years to rule that ordinary telephone calls were protected.

The courts have been equally slow in recognizing the significance of the Internet. The Supreme Court still has never ruled on whether e-mail is protected by the Constitution. Next month, the Supreme Court will hear oral argument in a case involving GPS tracking; let’s hope it doesn’t tell us we have to wait 40 years for the Constitution to cover GPS. But whatever the outcome in that case, it is unlikely to resolve all the issues associated with the new technologies we depend on now in our daily lives.

Search, but with a warrant
It’s time for Congress to update ECPA to require a warrant whenever the government reads our e-mail or tracks our movements. No competent programmer would be content to release version 1.0 of a program and then just walk away, ignoring bug reports and refusing all requests for upgraded features. Why should Congress be content with version 1.0 of our digital privacy law?

The good news is that an upgrade is in the works. Leading Internet companies and public interest groups from the left and the right have founded the Digital Due Process coalition to press Congress to enact reforms to ECPA. DDP's chief request is that, just as the government needs a warrant to enter your house or seize your computer, it should get a warrant before gaining access to your private communications stored online or to track you via your mobile phone.

Congress has taken note. Earlier this week, Senators Ron Wyden (D-OR) and Mark Kirk (R-IL) held a press conference to highlight their bipartisan sponsorship of a bill requiring government agents to get a warrant before using technological means to track an individual. The press conference was held amid a "Retro Tech Fair" that displayed a dazzling array of 1986-era computers—highlighting just how far technology has come since ECPA was passed.

Just yesterday, Sen. Patrick Leahy (D-VT), the original author of ECPA, announced his intention to schedule a Committee markup before year's end on his ECPA reform bill.

These are encouraging steps, but you can be sure that the Justice Department will put up a fight. Prosecutors would rather act on their own, without going before a judge. They will raise all kinds of arguments about why the standard set in the Constitution over 200 years ago should not apply to the Internet.

Proponents of stronger privacy protection are gearing up, too. A left-right coalition has launched a campaign in which individuals can add their names to a petition urging Congress to enact strong privacy protections.

You can get nostalgic for a 25-year-old movie, but there's nothing endearing about a 25-year-old digital privacy law.

Jim Dempsey is the Vice President for Public Policy at the Center for Democracy & Technology in Washington, DC.

Monday, October 17, 2011

The Default Choice, So Hard to Resist


In the wide-open Web, choice and competition are said to be merely “one click away,” to use Google’s favorite phrase. But in practice, the power of digital distribution channels, default product settings and traditional human behavior often matters most.

In a Senate hearing last month about Google, Jeremy Stoppelman, the chief executive of Yelp, pointed to that reality in his testimony. “If competition really were just ‘one click away,’ as Google suggests,” he said, “why have they invested so heavily to be the default choice on Web browsers and mobile phones?”

“Clearly,” he added, “they are not taking any chances.”

Indeed, Google made a big bet early in its history: In 2002, it reached a deal with AOL, guaranteeing a payment of $50 million to come from advertising revenue if AOL made Google its automatic first-choice search engine — the one shown to users by default. Today, Google pays an estimated $100 million a year to Mozilla, coming from shared ad revenue, to be the default search engine on Mozilla’s popular Firefox Web browser in the United States and other countries. Google has many such arrangements with Web sites.

Most economists agree that Google’s default deals aren’t anticompetitive. Rivals like Bing, the general search engine from Microsoft, and partial competitors like Yelp, an online review and listing service for local businesses, have their own Web sites and other paths of distribution. Choice, in theory, is one click away.

But most people, of course, never make that single click. Defaults win.

The role of defaults in steering decisions is by no means confined to the online world. For behavioral economists, psychologists and marketers, defaults are part of a rich field of study that explores “decision architecture” — how a choice is presented or framed. The field has been popularized by the 2008 book “Nudge,” by Richard H. Thaler, an economist at the University of Chicago and a frequent contributor to the Sunday Business section, and Cass R. Sunstein, a Harvard Law School professor who is now on leave and is working for the Obama administration. Nudges are default choices.

In decision-making, examples of the default preference abound: Workers are far more likely to save in retirement plans if enrollment is the automatic option. And the percentage of pregnant women tested for H.I.V. in some African nations where AIDS is widespread has surged since the test became a regular prenatal procedure and women had to opt out if they didn’t want it.

A study published in 2003 showed that while large majorities of Americans approved of organ donations, only about a quarter consented to donate their own. By contrast, nearly all Austrians, French and Portuguese consent to donate theirs. The default explains the difference. In the United States, people must choose to become an organ donor. In much of Europe, people must choose not to donate.

Defaults, according to economists and psychologists, frame how a person is presented with a choice. But they say there are other forces that make the default path hard to resist. One is natural human inertia, or laziness, that favors making the quick, easy choice instead of exerting the mental energy to make a different one. Another, they say, is that most people perceive a default as an authoritative recommendation.

“All those work, and that is why defaults are so powerful,” says Eric J. Johnson, a professor at the Columbia Business School and co-director of the university’s Center for Decision Sciences.

The default values built into product designs can be particularly potent in the infinitely malleable medium of software, and on the Internet, where a software product or service can be constantly fine-tuned.

“Computing allows you to slice and dice choices in so many ways,” says Ben Shneiderman, a computer scientist at the University of Maryland. “Those design choices also shape our social, cultural and economic choices in ways most people don’t appreciate or understand.”
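The power of a software default can be made concrete with a small sketch. The following is a purely hypothetical illustration (the class and field names are invented, not drawn from any real product): a privacy preference shipped with an opt-out default, where doing nothing means staying tracked.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # The designer's choice of default is itself a policy decision:
    # tracking stays on unless the user explicitly acts to turn it off.
    tracking_enabled: bool = True

def opt_out(settings: PrivacySettings) -> PrivacySettings:
    # Opting out requires a deliberate extra step by the user.
    settings.tracking_enabled = False
    return settings

# A user who never opens the settings screen keeps the default:
passive_user = PrivacySettings()

# Only the user who takes the extra step escapes tracking:
active_user = opt_out(PrivacySettings())
```

The point of the sketch is that the passive path and the default path are the same path: inertia, the cost of finding the setting, and the implicit recommendation carried by the default all push users toward whatever value the designer chose.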

Default design choices play a central role in the debate over the privacy issues raised by marketers’ tracking of online consumer behavior. The Federal Trade Commission is considering what rules should limit how much online personal information marketers can collect, hold and pass along to other marketers — and whether those rules should be government regulations or self-regulatory guidelines.

Privacy advocates want tighter curbs on gathering online behavioral data, and want marketers to have to ask consumers for permission before collecting and sharing their information, presumably in exchange for discount offers or extra services. Advertisers want a fairly free hand to track online behavior, and to cut back only if consumers choose to opt out.

New research by a team at Carnegie Mellon University suggests the difficulty that ordinary users have in changing the default settings on Internet browsers or in configuring software tools for greater online privacy. The project, called “Why Johnny Can’t Opt Out,” has just been completed and the results have not yet been published. Forty-five people of various backgrounds and ages in the Pittsburgh area were recruited for the study.

To qualify as research subjects, they had to be frequent Internet users and express an interest in learning about protecting their privacy online. Each was interviewed for 90 minutes, and each watched a video showing how online behavioral advertising works.

Then, each person was given a laptop computer and told to set privacy settings as he or she preferred, using one of nine online tools. The tools included the privacy options on browsers like Mozilla Firefox and Microsoft’s Internet Explorer, and online programs like Ghostery and Adblock Plus, as well as Consumer Choice from the Digital Advertising Alliance.

The privacy tools typically proved too complicated and confusing to serve the needs of rank-and-file Internet users.

“The settings they chose didn’t block as much as they thought they were blocking, often blocking nothing,” says Lorrie Faith Cranor, a computer scientist at Carnegie Mellon who led the research.

Ms. Cranor says the research points to the need to simplify privacy software down to a few choices. “If you turn it on, it should be pretty privacy-protective,” she says. “The defaults are crucial.”

Monday, October 3, 2011

New Book (Ordered): Jeff Jarvis: Are We Too Hung Up on Privacy?

Jeff Jarvis: 'We now take it for granted that any piece of information we want is likely a search away.'


For many years, privacy has been evolving to become a right as fundamental as equal protection or free speech. But what if it comes at too high a cost? What if we have too much privacy when technology now makes sharing information so much easier and the value of shared information so much greater?

This is the thesis of a new book, "Public Parts: How Sharing in the Digital Age Improves the Way We Work and Live," by Jeff Jarvis, a journalism professor at the City University of New York. In contrast to privacy activists, he argues for "publicness" to make the most of modern technologies.

"Just as we now take it for granted that any piece of information we want is likely a search away," Mr. Jarvis writes, "we are coming to rely on the idea that the people we want to meet are a connection away."

The benefits of social media are leading people increasingly to be more public. Most Americans over the age of 12 now have accounts on Facebook, whose entire purpose is to connect and share personal information. "To join up with fellow diabetics or vegetarians or libertarians or Star Trek fans, we first have to reveal ourselves as members of those groups," Mr. Jarvis writes.

Mr. Jarvis details privacy fears over time arising from new technologies, from the printing press to the telephone to the microphone. A century ago, Kodak cameras made it easy for the first time to take and share photos of people; Teddy Roosevelt for a time banned cameras from parks in Washington as a privacy violation.

Mr. Jarvis is his own test case for the benefits of sharing information. A hyperactive blogger and Twitter poster, he is famous among the social media set for his frank postings about his prostate cancer and the occasional embarrassing side effects of the treatment. He says in return for being so public, he got advice from men who had undergone the same procedure and the satisfaction of urging others to seek treatment. He once complained so bitterly online about problems with his computer that he created what became known online as "Dell hell."

The more than 80,000 people who follow Mr. Jarvis on Twitter know his views on many topics. These are often cranky. He credits WikiLeaks as journalism, thinks advertising revenue will somehow again be enough to fund news reporting, and last Friday reported he had visited Occupy Wall Street demonstrators (Twitter post: "Glad to see they were eating well & a generator powering many Macs") and made a donation. Those of us who disagree with these views appreciate much of the rest of his posts, including his links to interesting articles and events.

Mr. Jarvis is not starry-eyed about the Web. "Some people turn trollish, just announcing that they don't like what I'm saying, adding nothing to the discussion but venom," he told me last week. "They sometimes accuse me of oversharing. Well, I say they're over-listening. If they don't like what I say and don't choose to enter a discussion, they shouldn't follow me—and they shouldn't try to tell me what not to say."

Mr. Jarvis argues it should be up to each person where to balance the risks and rewards of being more public. "When new technologies cause change and fear, government's reflex is to regulate them to protect the past," he says. "But in doing so, they also can cut off the opportunities for the future."

Congress is considering several privacy bills. But Mr. Jarvis calls it a "dire mistake to regulate and limit this new technology before we even know what it can do."
Privacy is notoriously difficult to define legally. Mr. Jarvis says we should think about privacy as a matter of ethics instead. We should respect what others intend to keep private, but publicness reflects the choices "made by the creator of one's own information." The balance between privacy and publicness will differ from person to person in ways that laws applying to all can't capture.

"Perhaps this will lead to what I call the doctrine of mutually assured humiliation," Mr. Jarvis says. "I won't make fun of your silly picture if you don't make fun of mine. Perhaps it will lead to a greater expectation of openness from corporations and transparency from government. Perhaps it will also lead to people being more connected, for they can no longer run away from each other as they'll always be only a link or two apart."

No one could have known when the printing press was invented that it would enable people to share and test new ideas, from democracy to the scientific method. Some day we'll know whether digital technologies have such a profound impact, but we are already altering our behavior to take advantage of what they offer. Yesterday's expectation of privacy is rapidly giving way to something new, and perhaps better.

--------------------------------------------------
Stefaan G. Verhulst
Chief of Research
Markle Foundation
10 Rockefeller Plaza, Floor 16
New York, NY 10020-1903

Tel. 212 713 7630