Wednesday, June 6, 2012

Health Data Map (Latanya Sweeney) Launched

As Health Records Go Digital, Where They End Up Might Surprise You

Jordan Robertson, Data Privacy Lab, Harvard University, June 5, 2012 8:08 

From Latanya Sweeney, Data Privacy Lab, Harvard University, 2010
A graphic depicting the sharing of a person's health data.

Two years ago, Latanya Sweeney created a graphic on the widespread sharing of medical files that shocked lawmakers, technologists and doctors.

Sweeney, who founded the Data Privacy Lab at Harvard University, produced a “health data map” that looks like a windshield cracked by a few big rocks. At the center is someone’s health record, medical provider and insurance company. Emanating from them are webs of more than two dozen organizations that could have legitimate access to the file, including transcription services, medical researchers, and even data-mining firms and pharmaceutical companies.

“Collectively, you’d hear a gasp and then a moment of silence — that was pretty universal,” she said, describing the reaction during her congressional testimony and presentations to privacy summits, academic conferences and medical schools.

However, Sweeney said there are limitations in tracking the movement of medical data, and many doctors are in the dark about where their patients’ data go. So at the Health Privacy Summit in Washington, D.C., which starts Wednesday, she plans to unveil a new project to harness the collective knowledge of doctors, data-breach victims, whistle-blowers, technology specialists and others to build a new, more comprehensive health data map.

“If we can get a lot of people to march in this direction and keep them there and entertained and incentivized, I think what we’ll uncover will be mind-blowing,” said Sweeney, who is a computer scientist.

Her project comes amid a U.S. push to digitize patient records, which has created lifesaving benefits but has also made it easier for medical files to end up in unexpected places. As I reported last month in a special report for Bloomberg.com, loopholes in the federal law have allowed the collection and sharing of private medical information without patients’ consent.

Sweeney’s work has focused on identifying those unexpected places and on showing that it’s possible to determine some people’s identities from medical data, even after the records have been stripped of personal information. Adding to the alarm, she said the number of third-party entities receiving medical data has more than doubled in the past decade, and some firms that once received only “anonymized” data now get records that identify people.
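The linkage idea behind such re-identification can be shown with a toy sketch (all names, values, and fields below are fabricated for illustration and are not Sweeney's data or method): a "de-identified" medical table that retains quasi-identifiers such as ZIP code, birth date, and sex can be joined against a public roster, like a voter list, that carries the same fields plus names.

```python
# Toy linkage attack: re-identifying "anonymized" medical records by
# joining on quasi-identifiers (ZIP code, birth date, sex).
# All records below are fabricated for illustration.

# "De-identified" medical records: direct identifiers removed,
# but quasi-identifiers retained.
medical = [
    {"zip": "02138", "dob": "1965-07-22", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "73301", "dob": "1980-01-15", "sex": "M", "diagnosis": "asthma"},
]

# Public roster (e.g. a voter list) with names and the same quasi-identifiers.
roster = [
    {"name": "Jane Doe", "zip": "02138", "dob": "1965-07-22", "sex": "F"},
    {"name": "John Roe", "zip": "90210", "dob": "1975-03-02", "sex": "M"},
]

def reidentify(medical, roster):
    """Link any records that share all three quasi-identifiers."""
    index = {(r["zip"], r["dob"], r["sex"]): r["name"] for r in roster}
    hits = []
    for m in medical:
        key = (m["zip"], m["dob"], m["sex"])
        if key in index:
            hits.append((index[key], m["diagnosis"]))
    return hits

print(reidentify(medical, roster))  # [('Jane Doe', 'hypertension')]
```

The point of the sketch is that no name ever appears in the medical table, yet the combination of a few ordinary attributes is often unique enough to put one back.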

While Sweeney’s earlier mapping effort drew on her experience as an expert in legal cases and her work with the privacy lab, her new project, thedatamap.org, needs submissions from others to help sketch a more complete picture of how medical data are shared.

At first, she’s seeking submissions of Internet links that show data-sharing relationships between medical providers and others. People will sign up with an e-mail address to be “data detectives,” and the accuracy of their submissions will be checked by other people who have signed up to submit links. Eventually, the map could include information from other sources.

Deborah Peel, a physician and founder of Patient Privacy Rights, the Austin, Texas-based group putting on the conference, said a promising aspect of Sweeney’s project is its open nature, which will help ensure accuracy by allowing organizations that are mentioned on the map to respond.

“There’s some self-regulation there — we’re pretty hopeful that if somebody says something wrong about a hospital or a corporation, that they’d respond and provide the right information,” Peel said.  “It’s kind of ridiculous we’re forced to resort to this because there’s no chain of custody for our data.”

Even if the project gets little public input, the research can still be used to pressure lawmakers into mandating that data-sharing arrangements become more transparent, Peel said.

Sweeney said a goal of the research is to identify areas where patient data might be vulnerable to theft or abuse. It’s not to prevent the sharing of medical data entirely, she said.

“Because you don’t know where your data is going, harms are almost impossible to report and detect,” she said. “We don’t want to stop data sharing. There are a lot of uses and benefits that come from it. But how do we do it in a responsible way? As long as the data sharing is invisible, you can’t possibly do that.”

Thursday, February 23, 2012

We Can’t Wait: Obama Administration Unveils Blueprint for a “Privacy Bill of Rights” to Protect Consumers Online

The White House, Office of the Press Secretary
For Immediate Release  February 23, 2012

Internet Advertising Networks Announce Commitment to “Do-Not-Track” Technology to Allow Consumers to Control Online Tracking

WASHINGTON, DC – The Obama Administration today unveiled a “Consumer Privacy Bill of Rights” as part of a comprehensive blueprint to improve consumers’ privacy protections and ensure that the Internet remains an engine for innovation and economic growth. The blueprint will guide efforts to give users more control over how their personal information is used on the Internet and to help businesses maintain consumer trust and grow in the rapidly changing digital environment. At the request of the White House, the Commerce Department will begin convening companies, privacy advocates and other stakeholders to develop and implement enforceable privacy policies based on the Consumer Privacy Bill of Rights.

In addition, leading Internet companies and online advertising networks announced a commitment to act on Do Not Track technology in most major web browsers to make it easier for users to control online tracking. Companies that represent the delivery of nearly 90 percent of online behavioral advertisements, including Google, Yahoo!, Microsoft, and AOL, have agreed to comply when consumers choose to control online tracking. Companies that make this commitment will be subject to FTC enforcement.

“American consumers can’t wait any longer for clear rules of the road that ensure their personal information is safe online,” said President Obama. “As the Internet evolves, consumer trust is essential for the continued growth of the digital economy. That’s why an online privacy Bill of Rights is so important.  For businesses to succeed online, consumers must feel secure. By following this blueprint, companies, consumer advocates and policymakers can help protect consumers and ensure the Internet remains a platform for innovation and economic growth.”

The advertising industry also committed not to release consumers’ browsing data to companies that might use it for purposes other than advertising, such as employers making hiring decisions or insurers determining coverage.

“It’s great to see that companies are stepping up to our challenge to protect privacy so consumers have greater choice and control over how they are tracked online. More needs to be done, but the work they have done so far is very encouraging,” said FTC Chairman Jon Leibowitz.

A Consumer Privacy Bill of Rights
The Consumer Privacy Bill of Rights is outlined in a report released today by the White House, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy.

“Every day, millions of Americans shop, sell, bank, learn, talk and work online.  At the turn of the century, online retail sales were around $20 billion in the United States, now they’re nearing $200 billion,” said Secretary Bryson.  “The Internet has become an engine of innovation, business growth, and job creation, so we need a strong foundation of clear protections for consumers, and a set of basic principles to help businesses guide their privacy and policy decisions.  This privacy blueprint will do just that.”

The Consumer Privacy Bill of Rights provides a baseline of clear protections for consumers and greater certainty for businesses. The rights are:

·       Individual Control:  Consumers have a right to exercise control over what personal data organizations collect from them and how they use it.

·       Transparency:  Consumers have a right to easily understandable information about privacy and security practices.
·       Respect for Context:  Consumers have a right to expect that organizations will collect, use, and disclose personal data in ways that are consistent with the context in which consumers provide the data.
·       Security:  Consumers have a right to secure and responsible handling of personal data.
·       Access and Accuracy:  Consumers have a right to access and correct personal data in usable formats, in a manner that is appropriate to the sensitivity of the data and the risk of adverse consequences to consumers if the data are inaccurate.
·       Focused Collection:  Consumers have a right to reasonable limits on the personal data that companies collect and retain.
·       Accountability:  Consumers have a right to have personal data handled by companies with appropriate measures in place to assure they adhere to the Consumer Privacy Bill of Rights.

The Consumer Privacy Bill of Rights is one of four key elements of the report, which also includes a stakeholder-driven process to specify how these rights apply in particular business contexts;  strong enforcement by the Federal Trade Commission (FTC);  and greater interoperability between the United States’ privacy framework and those of our international partners.

In the coming weeks, the Commerce Department’s National Telecommunications and Information Administration will convene stakeholders – including companies, privacy and consumer advocates, technical experts, international partners, and academics – to establish specific practices or codes of conduct that implement the general principles in the Consumer Privacy Bill of Rights.

The Administration also will work with Congress to develop legislation based on these rights to promote trust in the digital economy and extend baseline privacy protections to commercial sectors that existing federal privacy laws do not cover.

Today’s report results from a comprehensive review of the intersection of privacy policy and innovation in the Internet economy. The Commerce Department’s Internet Policy Task Force launched the review in 2010, seeking public comment on an initial set of issues and later on a set of policy recommendations.

Wednesday, January 25, 2012

New Privacy Framework in Europe

European Commission - Press Release

Commission proposes a comprehensive reform of data protection rules to increase users' control of their data and to cut costs for businesses

Brussels, 25 January 2012 – The European Commission has today proposed a comprehensive reform of the EU's 1995 data protection rules to strengthen online privacy rights and boost Europe's digital economy. Technological progress and globalisation have profoundly changed the way our data is collected, accessed and used.

In addition, the 27 EU Member States have implemented the 1995 rules differently, resulting in divergences in enforcement. A single law will do away with the current fragmentation and costly administrative burdens, leading to savings for businesses of around €2.3 billion a year. The initiative will help reinforce consumer confidence in online services, providing a much needed boost to growth, jobs and innovation in Europe.

"17 years ago less than 1% of Europeans used the internet. Today, vast amounts of personal data are transferred and exchanged, across continents and around the globe in fractions of seconds," said EU Justice Commissioner Viviane Reding, the Commission's Vice-President. "The protection of personal data is a fundamental right for all Europeans, but citizens do not always feel in full control of their personal data.

My proposals will help build trust in online services because people will be better informed about their rights and in more control of their information. The reform will accomplish this while making life easier and less costly for businesses. A strong, clear and uniform legal framework at EU level will help to unleash the potential of the Digital Single Market and foster economic growth, innovation and job creation."

The Commission's proposals update and modernise the principles enshrined in the 1995 Data Protection Directive to guarantee privacy rights in the future. They include a policy Communication setting out the Commission's objectives and two legislative proposals: a Regulation setting out a general EU framework for data protection and a Directive on protecting personal data processed for the purposes of prevention, detection, investigation or prosecution of criminal offences and related judicial activities.

Key changes in the reform include:
·       A single set of rules on data protection, valid across the EU. Unnecessary administrative requirements, such as notification requirements for companies, will be removed. This will save businesses around €2.3 billion a year.
·       Instead of the current obligation of all companies to notify all data protection activities to data protection supervisors (a requirement that has led to unnecessary paperwork and costs businesses €130 million per year), the Regulation provides for increased responsibility and accountability for those processing personal data.
·       For example, companies and organisations must notify the national supervisory authority of serious data breaches as soon as possible (if feasible within 24 hours).
·       Organisations will only have to deal with a single national data protection authority in the EU country where they have their main establishment. Likewise, people can refer to the data protection authority in their country, even when their data is processed by a company based outside the EU. Wherever consent is required for data to be processed, it is clarified that it has to be given explicitly, rather than assumed.
·       People will have easier access to their own data and be able to transfer personal data from one service provider to another more easily (right to data portability). This will improve competition among services.
·       A 'right to be forgotten' will help people better manage data protection risks online: people will be able to delete their data if there are no legitimate grounds for retaining it.
·       EU rules must apply if personal data is handled abroad by companies that are active in the EU market and offer their services to EU citizens.
·       Independent national data protection authorities will be strengthened so they can better enforce the EU rules at home. They will be empowered to fine companies that violate EU data protection rules. This can lead to penalties of up to €1 million or up to 2% of the global annual turnover of a company.
·       A new Directive will apply general data protection principles and rules for police and judicial cooperation in criminal matters. The rules will apply to both domestic and cross-border transfers of data.

The Commission's proposals will now be passed on to the European Parliament and EU Member States (meeting in the Council of Ministers) for discussion. They will take effect two years after they have been adopted.

Background
Personal data is any information relating to an individual, whether it relates to his or her private, professional or public life. It can be anything from a name, a photo, an email address, bank details, your posts on social networking websites, your medical information, or your computer's IP address. The EU Charter of Fundamental Rights says that everyone has the right to personal data protection in all aspects of life: at home, at work, whilst shopping, when receiving medical treatment, at a police station or on the Internet.

In the digital age, the collection and storage of personal information are essential. Data is used by all businesses – from insurance firms and banks to social media sites and search engines. In a globalised world, the transfer of data to third countries has become an important factor in daily life. There are no borders online and cloud computing means data may be sent from Berlin to be processed in Boston and stored in Bangalore.

On 4 November 2010, the Commission set out a strategy to strengthen EU data protection rules (IP/10/1462 and MEMO/10/542). The goals were to protect individuals' data in all policy areas, including law enforcement, while reducing red tape for business and guaranteeing the free circulation of data within the EU. The Commission invited reactions to its ideas and also carried out a separate public consultation to revise the EU's 1995 Data Protection Directive (95/46/EC).

EU data protection rules aim to protect the fundamental rights and freedoms of natural persons, and in particular the right to data protection, as well as the free flow of data.

This general Data Protection Directive has been complemented by other legal instruments, such as the e-Privacy Directive for the communications sector. There are also specific rules for the protection of personal data in police and judicial cooperation in criminal matters (Framework Decision 2008/977/JHA).

The right to the protection of personal data is explicitly recognised by Article 8 of the EU's Charter of Fundamental Rights and by the Lisbon Treaty. The Treaty provides a legal basis for rules on data protection for all activities within the scope of EU law under Article 16 of the Treaty on the Functioning of the European Union.

For more information
MEMO/12/41
Press pack: data protection reform:
http://ec.europa.eu/justice/newsroom/data-protection/news/120125_en.htm
Homepage of Vice-President Viviane Reding, EU Justice Commissioner:
http://ec.europa.eu/reding
European Commission – data protection:
http://ec.europa.eu/justice/data-protection
Justice Directorate General Newsroom:
http://ec.europa.eu/justice/news/intro/news_intro_en.htm

Monday, January 23, 2012

Unique ID Debate at WSJ

Should Every Patient Have a Unique ID Number for All Medical Records?
The WSJ Debate

·       Yes: It means better care, says Michael F. Collins.
·       No: Privacy would suffer, says Deborah C. Peel.
·       Read the complete Big Issues: Health Care report.

As the U.S. invests billions of dollars to convert from paper-based medical records to electronic ones, has the time come to offer everyone a unique health-care identification number?

Proponents say universal patient identifiers, or UPIs, deserve a serious look because they are the most efficient way to connect patients to their medical data. They say UPIs not only facilitate information sharing among doctors and guard against needless medical errors, but may also offer a safety advantage in that health records would never again need to be stored alongside financial data like Social Security numbers. UPIs, they say, would both improve care and lower costs.

Privacy activists aren't buying it. They say that information from medical records already is routinely collected and sold for commercial gain without patient consent and that a health-care ID system would only encourage more of the same. The result, they say, will be more patients losing trust in the system and hiding things from their doctors, resulting in a deterioration in care. They agree that it's crucial to move medical records into the digital age. But they say it can be done without resorting to universal health IDs.

Yes: It Means Better Care
By Michael F. Collins
The U.S. health-care industry has an identity crisis.

Lacking an easy, uniform way to identify patients and link them to their health data, doctors, hospitals, pharmacies, insurance plans and others throughout health care have created a sea of unrelated patient-identity numbers that are bogging down our medical-records system.

An ID system 'could be the safest and most efficient way to manage health-care data.' -- MICHAEL F. COLLINS

Indeed, in an age when it's possible to pay for a cup of coffee using a cellphone, transferring a single patient's medical data from one health provider to another is often a struggle, sometimes resulting in treatment delays and even needless medical errors.

That is why, as the nation invests billions of dollars to convert from paper-based medical records to an electronic system, the time has come to offer everyone a universal patient identifier, or UPI.

A UPI system, using one number that seamlessly connects a person to all of his or her records, could be the safest and most efficient way to manage health-care data. It would guard against misidentification and make it much easier to pull together a patient's records from disparate providers. Using today's best technologies and practices, UPIs could help dramatically improve the quality of health care, lower costs, accelerate medical discovery and better preserve privacy.

That last point is by far the most contentious. It was privacy advocates who stopped the move toward UPIs more than a decade ago, leading to a ban on the use of federal funds just to study this approach. Enough has changed that UPIs deserve another look.

Cases of Mistaken Identity
In the 2010 federal health-care law, substantial resources are dedicated to promoting technology in medicine. We are investing billions of dollars to convert from paper to electronic health records, and to connect health-information hubs across the nation.

UPIs could make such systems more efficient. Currently, health-care providers and administrators struggle daily to match patients to records organized by disparate systems that rely on names, addresses, birth dates and sometimes Social Security numbers. Names can be presented in numerous formats, leading to duplicative records that cost money and lead to errors. As our population grows, the number of people with the same name and other similar personal data multiplies. Research cited by RAND Corp. indicates patients are misidentified at a rate of about 7% to more than 10% during record searches. As databases grow, the problem will only worsen. UPIs can correct this situation.
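The matching problem the paragraph describes can be made concrete with a toy sketch (all records fabricated): the same patient keyed by name and birth date fragments into duplicates when the name is formatted differently, while a single shared identifier keeps the records together.

```python
# Toy illustration of record fragmentation without a universal identifier.
# Two systems store the same (fabricated) patient with different name formats.

records = [
    {"name": "Smith, John Q.", "dob": "1970-05-01", "source": "hospital"},
    {"name": "John Smith",     "dob": "1970-05-01", "source": "pharmacy"},
]

# Keying by raw name + birth date treats these as two different patients.
by_name = {(r["name"], r["dob"]) for r in records}
print(len(by_name))  # 2 -- duplicate records for one person

# A shared identifier (the UPI idea) collapses them into one.
records_with_upi = [dict(r, upi="UPI-0001") for r in records]
by_upi = {r["upi"] for r in records_with_upi}
print(len(by_upi))  # 1
```

Real matching systems use probabilistic linkage rather than exact keys, but the failure mode is the same: free-text identity fields fragment, a stable identifier does not.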

What about data security? It is difficult—especially without being able to study UPIs—to know what the safest approach is. Admittedly, no IT system is immune to breaches.

That said, patients with UPIs hold a distinct and important advantage in that their medical information is compiled and stored according to that unique identifier, separate from financial data typically coveted by thieves. UPIs can even be set up so that patients could choose to have no identifying data in their record, making it completely anonymous. UPIs can be created with built-in checks against typing errors and counterfeiting, and if a UPI is compromised, patients can "retire" it and obtain a new one. Without a UPI, one can only regain medical privacy by changing one's identity, not dissimilar from participation in a witness-protection program!

Could the UPI be co-opted, the way the Social Security number has been, and used for other things? That's something we must guard against. By establishing a system where patients request the number through their doctor's office, and from a third party, not the government, we can help keep the UPI associated with medical data only.

Gaining Patients' Trust
Critics contend that UPIs will only make it easier for companies and others to use medical data for commercial purposes. To protect against this, they say, we need a system where physicians have to ask patients for permission to access their information. Because there has been so little study of UPIs, it's difficult to say whether those fears are valid. But having patients decide which doctor gets which data is the wrong choice. Doctors need full access to all of a patient's data, so they can deliver the appropriate care. That is the essence of the doctor-patient covenant. Furthermore, in critical-care situations, the patient might be unconscious and, therefore, unable to grant access to essential health information.

While narrowing access isn't optimal for patient care, new UPI technology does make it possible. For example, one type of UPI could be used for patients who want all of their physicians to have broad access to their medical data, while another would indicate the patient must first authorize access. Patients get to choose.

Even with all these protections, not every person will trust the system. Studies show that many people already refuse testing and treatment because they are worried it could be used to discriminate against them. UPI critics say a universal health-care ID system will only undermine trust further, but I would argue the opposite is true. Problems related to misidentifying patients and accessing their health information in a timely manner have eroded trust in the current low-tech system, which is why we need a new approach. Building an efficient records system that is more secure and offers better coordinated care can only enhance trust between patients and providers.

Congress should lift the ban on federal funding for UPI research, and we should better inform patients about the benefits of UPIs. No one wants medical data to fall into the wrong hands, but neither do we want patients to suffer because their medical information cannot be accessed.

Dr. Collins, a board-certified physician in internal medicine, is chancellor of the University of Massachusetts Medical School in Worcester, Mass. He can be reached at reports@wsj.com.

No: Privacy Would Suffer
By Deborah C. Peel

Doctors and patients need to find a better way to collect and share personal medical records from the innumerable places health data are collected and stored. But linking people to their health data via a unique identifying number isn't the answer.

'History shows that universal IDs are always used in unintended ways.' -- DEBORAH C. PEEL

Yes, assigning everyone a universal patient identifier, or UPI, would improve doctors' ability to share information and make it easier for hospitals to differentiate one John Smith from another. But a universal health ID system would empower government and corporations to exploit the single biggest flaw in health-care technology today: Patients can't control who sees, uses and sells their sensitive health data.

Searching for sensitive patient information would take just one number, not dozens of account numbers at professional offices, hospitals, pharmacies, labs, treatment facilities, government agencies and health plans. UPIs would make it vastly easier for government, corporations and others to use the nation's health information for their own gain without patients even knowing it.

What's more, any benefits associated with UPIs would be erased when patients, knowing their doctors have no control over where health-care data go, refuse to share sensitive information about their minds and bodies. This is a very real issue: Without privacy, patients won't trust doctors. In 2005, a California Healthcare Foundation survey found that due to the lack of privacy, one in eight patients lies, omits critical details, refuses tests or otherwise keeps sensitive health information private. Six hundred thousand people per year avoid early diagnosis for cancer alone.

Invitation to Snoop
We are in the midst of an unprecedented data-privacy crisis. Changes to federal regulations in 2002 eliminated patient control over who sees personal health information and led to explosive growth in the data-mining industry. Pharmacies, health-care IT vendors, insurers and others routinely sell and commercialize prescription records, genetic tests, hospital and office records, and claims data to drug companies and any willing purchasers. Even with names and key identifiers stripped off, it's simple to reidentify patients. Under the guise of improving health, lowering costs or promoting innovation, even government agencies sell and give away large databases of patient records.

Universal health-care IDs would only exacerbate such practices.

Further, UPIs would encourage the government and corporations to build massive, centralized databases of health information, rich targets for data theft and abuse. UPIs would become a de facto universal identification system far more harmful than Social Security numbers, enabling millions of government and corporate workers to snoop into anyone's medical records.

But concerns about health IDs go much deeper. UPIs exacerbate the commoditization of patients by encouraging the perspective that government agencies and corporations have superior rights to decide and control core aspects of who we are. A unique ID system is like giving master keys to millions who work in health care—they no longer need to ask patients to see records.

In the end, cutting out the patient will mean the erosion of patient trust. And the less we trust the system, the more patients will put health and life at risk to protect their privacy.

Such an obvious outcome makes a mockery of claims that UPIs would "reduce errors" and improve "patient safety." Similarly, claims that UPIs will be kept separate from personal and financial IDs are wishful thinking. All health records have financial records attached. But more important, history shows that universal IDs are always used in unintended ways. Social Security numbers were to be used only for payroll taxes, but morphed into universal IDs for health and commerce. UPIs will share the same fate.

Patients in Control
If a single ID number isn't the answer, what is? The best way to share sensitive health information is to build electronic-records systems where patients are in control of their own medical records, not government and industry. Health professionals should seek permission to see personal data, but only patients should release or link it. This is how it works with paper records systems, and there's no reason we should be less concerned about privacy in the digital age.

Existing technologies can allow patients to set default rules to govern data exchanges electronically, such as: "In emergencies, treating physicians may access my entire medical record" or "Anytime I receive health treatment, send copies to my family doctor." Consent rules can be changed instantly online, and sensitive information can be selectively withheld at the patient's discretion.
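A minimal sketch of how such patient-set default rules might be evaluated (the rule names, fields, and default-deny behavior here are hypothetical illustrations, not a description of any real consent system):

```python
# Toy evaluation of patient-set consent rules for a requested record access.
# Rule fields and values are hypothetical.

consent_rules = [
    {"condition": "emergency", "scope": "entire_record", "allow": True},
    {"condition": "treatment", "scope": "copy_to_family_doctor", "allow": True},
    {"condition": "marketing", "scope": "any", "allow": False},
]

def is_permitted(purpose, rules):
    """Return True only if some rule explicitly allows this purpose."""
    for rule in rules:
        if rule["condition"] == purpose:
            return rule["allow"]
    return False  # default-deny: no matching rule means no access

print(is_permitted("emergency", consent_rules))  # True
print(is_permitted("marketing", consent_rules))  # False
```

The design choice the op-ed argues for is visible in the last line of the function: access that no rule covers is denied until the patient grants it, rather than granted until the patient objects.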

Unique patient IDs are unnecessary for this system. Much like using online banking to pay bills, patients can use online health systems to send encrypted information from medical accounts to whomever they choose.

Decentralized systems with smaller data sets protect privacy because if any account is broken into, only some information is compromised. More important, they require mediation by the patient. Imagine a universal ID system for all financial transactions where all retailers had our IDs. Commercial transactions would be more efficient if retailers could see and debit our accounts without consent. But it would be unacceptable—and it should also be unacceptable for others to use your health records without permission.

I agree that we need to transform the health-IT system so health professionals and researchers can electronically tap into complete and accurate health information. But any such technology should allow professionals to treat patients as individuals whose needs come first. That won't happen if we create an electronic medical-record system that no one trusts.

Dr. Peel, a psychiatrist and health-privacy expert in Austin, Texas, is the founder of Patient Privacy Rights and leader of the bipartisan Coalition for Patient Privacy. She can be reached at reports@wsj.com.

Tuesday, November 1, 2011

Why Johnny Can't Opt Out: A Usability Evaluation of Tools to Limit Online Behavioral Advertising

Title:  Why Johnny Can’t Opt Out: A Usability Evaluation of Tools to Limit Online Behavioral Advertising   
Authors:        Pedro G. Leon, Blase Ur, Rebecca Balebako, Lorrie Faith Cranor, Richard Shay, and Yang Wang
Publication Date:       October 31, 2011     

Abstract
We present results of a 45-participant laboratory study investigating the usability of tools to limit online behavioral advertising (OBA). We tested nine tools, including tools that block access to advertising websites, tools that set cookies indicating a user’s preference to opt out of OBA, and privacy tools that are built directly into web browsers. We interviewed participants about OBA, observed their behavior as they installed and used a privacy tool, and recorded their perceptions and attitudes about that tool. We found serious usability flaws in all nine tools we examined.

The online opt-out tools were challenging for users to understand and configure. Users tended to be unfamiliar with most advertising companies, and therefore were unable to make meaningful choices. Users liked the fact that the browsers we tested had built-in Do Not Track features, but were wary of whether advertising companies would respect this preference. Users struggled to install and configure blocking lists to make effective use of blocking tools. They often erroneously concluded the tool they were using was blocking OBA when they had not properly configured it to do so.
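For context on the Do Not Track feature the study mentions: the preference is expressed as a single HTTP request header (`DNT: 1`) attached to every request, which is why honoring it is entirely up to the receiving advertising company. A minimal sketch using Python's standard library (the URL is a placeholder; no request is actually sent here):

```python
import urllib.request

# Attach the Do Not Track header to an outgoing request. Browsers with
# built-in DNT support do exactly this on the user's behalf; ad networks
# may or may not respect it.
req = urllib.request.Request("https://example.com/", headers={"DNT": "1"})
print(req.get_header("Dnt"))  # prints: 1
```

Opt-out cookies work similarly in spirit: the tool sets a per-ad-network cookie signaling the preference, which is why users must know which networks exist to opt out of all of them.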

Full Report: CMU-CyLab-11-017

--

Privacy: "Things are just getting worse"

Your phone company is selling your personal data

David Goldman @CNNMoneyTech November 1, 2011

NEW YORK (CNNMoney) -- Your phone company knows where you live, what websites you visit, what apps you download, what videos you like to watch, and even where you are. Now, some have begun selling that valuable information to the highest bidder.

In mid-October, Verizon Wireless changed its privacy policy to allow the company to record customers' location data and Web browsing history, combine it with other personal information like age and gender, aggregate it with millions of other customers' data, and sell it on an anonymized basis.

That kind of data could be very useful -- and lucrative -- to third-party companies. For instance, if a small business owner wanted to figure out the best place to open a new pet store, the owner could buy a marketing report from Verizon about a designated area. The report might reveal which city blocks get the most foot or car traffic from people whose Web browsing history reveals that they own pets.
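The kind of "anonymized, aggregate" report described above typically works by counting customers per area and segment, and (in careful implementations) suppressing groups too small to hide in. A hypothetical sketch of that aggregation step, with invented block names, segment labels, and threshold:

```python
from collections import Counter

# Toy input: (city block, inferred interest segment) per observed customer.
visits = [("block_1", "pet_owner"), ("block_1", "pet_owner"),
          ("block_1", "pet_owner"), ("block_2", "pet_owner")]

K = 3  # suppress groups smaller than K to reduce re-identification risk
counts = Counter(block for block, segment in visits if segment == "pet_owner")
report = {block: n for block, n in counts.items() if n >= K}
print(report)  # {'block_1': 3} -- block_2's lone pet owner is suppressed
```

Whether carriers apply suppression thresholds like this, and at what granularity, is exactly the kind of detail privacy researchers press them to disclose.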

Verizon (VZ, Fortune 500) is the first mobile provider to publicly confirm that it is actually selling information gleaned from its customers directly to businesses. But it's hardly alone in using data about its subscribers to make extra cash.

All four national carriers use aggregated customer information to help outside parties target ads to their subscribers. AT&T, Sprint and T-Mobile insist that subscriber data is never actually handed over to third-party vendors; nevertheless, they all make money on it.

AT&T's (T, Fortune 500) AdWorks program, for instance, promotes AT&T's customer base to advertisers. On its AdWorks website, AT&T touts its ability to "reach customized audience segments based on anonymous and aggregate demographics." It then shows customers carefully tailored coupons, in-app ads and Web ads.

Sprint (S, Fortune 500), like Verizon, tracks the kinds of websites a customer visits on their mobile devices as well as what applications they use, according to spokesman Jason Gertzen. Sprint uses that data to help third parties target ads to customers.

That's a step further than Verizon goes. Verizon also lets advertisers target customized messages to its subscribers' mobile phones, but for that initiative it does not incorporate customers' Web surfing or location data, according to a company spokesman. Instead, Verizon relies on other personal information, including customers' demographic details and home address.

T-Mobile declined to answer specific questions about what kind of information it shares or sells, instead pointing CNNMoney to T-Mobile's privacy policy. The policy's open-ended terms seem to suggest that the company does not divulge customer information, but a T-Mobile spokeswoman acknowledged that the company "collects information about the websites that customers visit and their location" and that it "may use that information in an anonymous, aggregate form to improve our services."

Selling customer information is an age-old practice that is certainly not exclusive to the wireless industry. Brian Kennish, a former DoubleClick engineer who developed the advertising network's mobile ad server, noted that wireless companies have been sharing users' location data with third parties for more than a decade.

But the rise of smartphones has given mobile providers an accidental treasure trove of marketable data: The gadgets are hyper-personalized tracking devices that "know" more about their owners than any other product on the market.

Wireless providers are taking advantage of their gold mine.

"At the end of the day, we're getting to a situation where customers are the products that these wireless companies are selling," said Nasir Memon, a professor of computer science at New York University's Polytechnic Institute. "They're creating a playground to attract people and sell them to advertisers. People are their new business."

There's a lot of money to be made in the largely untapped local advertising markets. A BIA/Kelsey study from March predicts that U.S. local online ad revenues will reach $42.5 billion annually in 2015.

Google (GOOG, Fortune 500) and Facebook are scrambling to sign local businesses to their new services like Facebook Places, Google Wallet and Google Places. But with smartphone customer data in their arsenal, wireless carriers are well positioned to swoop in as well.

"Verizon revealed the industry's strategy," said Jeff Chester, executive director of the Center for Digital Democracy. "This is more than the camel's nose under the tent. With NFC [near field communication, an emerging technology for mobile payments] and GPS, there's a new digital gold rush here, and wireless companies want to reap the tremendous financial rewards that will come with dominating a local advertising market."

Chester noted that Verizon was the first to admit that it's selling customer data for local advertising and business-development purposes, but he said he believes all of the industry's players are involved in using subscriber information for that purpose.

"They're all doing this," he said. "Everyone is aware that big growth in the digital economy is mobile and location-based services."

For its part, Verizon has largely been applauded by privacy groups for at least being transparent about what it's doing and pointing users to an opt-out site if they don't wish to participate. But privacy advocates are concerned about the direction wireless companies are headed.

"The Web pages we go to and searches we do are the closest thing to our thoughts, the most private info of all, that can be recorded," said Kennish, who now heads up Disconnect, an online privacy tool. "If Verizon succeeds, I'm sure others will follow. Despite all the talk about privacy lately, things are just getting worse." 


Monday, October 31, 2011

Privacy and Security in the Implementation of Health Information Technology (Electronic Health Records): U.S. and EU Compared

Privacy and Security in the Implementation of Health Information Technology (Electronic Health Records): U.S. and EU Compared, B.U. J. SCI. & TECH. L., Vol. 17, Winter 2011. "The importance of the adoption of Electronic Health Records (EHRs) and the associated cost savings cannot be ignored as an element in the changing delivery of health care. However, the potential cost savings predicted in the use of EHR are accompanied by potential risks, either technical or legal, to privacy and security. The U.S. legal framework for healthcare privacy is a combination of constitutional, statutory, and regulatory law at the federal and state levels. In contrast, it is generally believed that EU protection of privacy, including personally identifiable medical information, is more comprehensive than that of U.S. privacy laws. Direct comparisons of U.S. and EU medical privacy laws can be made with reference to the five Fair Information Practices Principles (FIPs) adopted by the Federal Trade Commission and other international bodies. The analysis reveals that while the federal response to the privacy of health records in the U.S. seems to be a gain over conflicting state law, in contrast to EU law, U.S. patients currently have little choice in the electronic recording of sensitive medical information if they want to be treated, and minimal control over the sharing of that information. A combination of technical and legal improvements in EHRs could make the loss of privacy associated with EHRs de minimis. The EU has come closer to this position, encouraging the adoption of EHRs and confirming the application of privacy protections at the same time. It can be argued that the EU is proactive in its approach; whereas because of a different viewpoint toward an individual’s right to privacy, the U.S. system lacks a strong framework for healthcare privacy, which will affect the implementation of EHRs. If the U.S. is going to implement EHRs effectively, technical and policy aspects of privacy must be central to the discussion."