Hockey Hall of Famer Wayne Gretzky famously said that he skates “to where the puck is going to be, not where it has been.” With that goal in mind, in our own arena of privacy and data security, we aim to provide you with top-notch advice on cutting-edge issues to help keep you ahead of the game. That’s why we are delighted to welcome the newest member of our team: Chris Kuner, a renowned expert on European privacy and data protection regulation, who recently joined the firm’s Brussels office.

In his more than 15 years of practicing data protection and EU regulatory law, Chris has established himself as a “go-to” privacy lawyer in Europe. He has represented clients in numerous dealings with the European Commission, the Article 29 Working Party, and European and national data protection authorities. His accomplishments are notable: he successfully negotiated the EC’s landmark adequacy decisions regarding the standard contractual clauses for data transfers; assisted multinational clients in complying with the complex requirements of European data protection law; and established himself as a thought leader, serving as chairman of the Task Force on Privacy and the Protection of Personal Data of the International Chamber of Commerce (ICC) in Paris, as well as the editor-in-chief of International Data Privacy Law.

Next Thursday, July 19, we are hosting a program in Palo Alto with Chris and David Vladeck, the Director of the Bureau of Consumer Protection at the Federal Trade Commission. Chris and David will discuss the current state of privacy law in the U.S. and the EU. We are certain that this will be an informative program worth attending, and hope you can join us. Please click here to register for the event.

This second edition of Eye on Privacy covers a variety of timely and relevant issues that we hope will be of interest to you. If there are other topics you’d like to see us address in a future edition, please let us know by contacting us at PrivacyAlerts@wsgr.com.


Lydia Parnes
Partner, Wilson Sonsini Goodrich & Rosati


  • The Privacy Niche Fallacy: Identifying Those Types of Businesses That Should Care about Privacy and Data Security in Transactions
  • Recent Developments in “Do Not Track”
  • Dismissal of Amazon Privacy Suit Signals Uphill Battle for Plaintiffs’ CFAA Claims
  • Do Consumers Understand Mobile and Online Advertising?
  • Myspace Reaches Consent Agreement with FTC over Misrepresentations in Privacy Policy
  • The Reverberations of Carrier IQ: Federal Communications Commission Steps into Privacy Debate


    The Privacy Niche Fallacy: Identifying Those Types of Businesses That Should Care about Privacy and Data Security in Transactions

    By Gerard Stegmaier, Michael Ringler, and David Thomas

    A few years ago, privacy and data security issues rarely showed up in the papers, much less as diligence items in transactions. Times have changed significantly. Due diligence is a key way that businesses manage risk in the mergers and acquisitions process, and the prevalence of privacy and data security issues in the media, especially in business publications like the Wall Street Journal, will lead to the next wave of focused diligence. Much like environmental law and related issues in their early rise, privacy and data security issues increasingly impact transactions, especially in the middle market and consumer Internet markets. The following piece explains how and why that has come to be the case and why it matters.

    In a nutshell, privacy and data security have been viewed as a niche practice by many investment professionals and those who support them. It is true that only two types of businesses should care about privacy and data protection, but unfortunately for those who choose to remain blissfully ignorant of the issues, those two types are businesses that have customers and businesses that have employees. For that reason, privacy and data protection issues should be considerations in every deal.

    The simple reality about customer privacy claims is that privacy-related class action litigation is booming in a plaintiffs-lawyer-driven gold rush. For over a decade, enterprising lawyers have sought a toehold for class action claims based on the use or disclosure of personal information. Past litigation waves have ranged from cases based on an obscure federal statute known as the Fair Credit Reporting Act for the printing of credit-card digits on receipts to recent claims against companies like AOL and Netflix over whether anonymized data truly remained anonymous.

    The current waves of litigation focus on two general areas: security incidents and the monitoring and tracking of device usage information through technical variations of cookies or device IDs. Even if the cases are not ultimately won, they are costly to defend and are an unnecessary and unplanned distraction. Much like ships traveling through the North Atlantic ice fields in the evening, prudent enterprises will pay careful attention to the landscape because what sits invisibly below the surface remains the most dangerous threat.

    Even if consumer class actions are not a concern, employee-related privacy issues also impact many transactions, especially for rapid-growth companies. These issues are becoming more prevalent in part because acquiring companies want to learn about potential liabilities associated with the employees and teams they have chosen to acquire. Whether a transaction involves the processing of government personal identification numbers or learning about employees from Europe, questions will arise regarding what can be shared and where the underlying information may be processed. Because the first non-U.S. stage of enterprise growth is often in the European markets, differences in European law that limit the processing of human resources data in the U.S. as part of back-office functions can ensnare well-intentioned companies focused on running lean operations. Indeed, there are potential criminal penalties for the improper processing of personal data from Europe that should give pause to even the most aggressive risk-takers.

    In the day-to-day operations of many companies, privacy and data protection issues may be easy to overlook or neglect, and doing so may even seem to make sense—or, at the very least, may have seemed to make sense in the not-so-distant past. That is part of why these issues are becoming so important in transactions today. Over the last few years, for the first time, some deals and transactions have been at risk of not moving forward because a key asset of the target—its customer or employee data—was encumbered due to the way in which it was collected or maintained. Similarly, reputational risk arising from public action, or the risk of investigation, might limit the value of certain business combinations. As a result, investors increasingly are looking to deal teams with specialists who emphasize privacy and information governance as part of their practices to help identify and understand data-related risks.

    For those who recognize these risks, a few observations may be helpful:

    1. Recognizing that data, like other intellectual property, represents a significant corporate asset is an important first step.

    2. As with any asset, how an organization safeguards and monetizes its assets can have important implications for enterprise value.

    3. The sheer volume of regulatory and litigation interest in this area means that corporate decisions, including very recent decisions, may be called into question in financially significant ways. Doing things the way they have always been done simply will not work, and businesses that adopt such an approach increasingly pay the price.

    All of this raises a very practical first-line question for dealmakers: What kind of data is involved in this deal, and if an acquirer cannot get the data, or if its use will be scrutinized, what does that mean for achieving business goals?

    Will privacy and data protection issues drive every deal? Certainly not, but they will impact deals for two kinds of companies—those that have customers and those that have employees.




    Recent Developments in “Do Not Track”

    By Edward Holman

    The past few months have seen a flurry of activity regarding the adoption of “Do Not Track” (DNT) as a new web privacy standard: Twitter announced that it would support Mozilla Firefox’s DNT implementation; Microsoft announced that it would enable DNT by default in Internet Explorer 10, evoking reactions from the advertising industry, the World Wide Web Consortium (W3C) Tracking Protection Working Group developing the DNT standard, two U.S. congressmen, a Federal Trade Commission (FTC) commissioner, and a European Commission official; and Opera Software released a new version of its web browser that implements DNT. The following article will cover those developments and bring you up to speed on the current state of DNT.

    Background on DNT

    In late 2010, FTC Staff proposed “Do Not Track” as a method of simplified choice for consumers to opt out of data collection and use for online behavioral advertising (OBA). The proposal was part of the FTC’s preliminary privacy report that was finalized in March 2012. In its final report, FTC Staff did not define a specific implementation for DNT, but rather laid out principles for a DNT system to follow and praised the efforts of several groups and companies to find a suitable approach. One such implementation addressed by the FTC—and the one relevant to this article—is the “browser header” approach. Its essential element is a simple browser header, sent on all outgoing HTTP requests, that signals whether a user wants to be tracked. This is the method implemented by the Firefox, Safari, and, most recently, Opera web browsers. Yet this rather simple implementation belies a difficult policy question: Once a website receives a DNT signal, what should—or must—it do?
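
    The browser-header mechanism described above can be sketched in a few lines. This is an illustrative sketch only, not from the W3C specification text: the header name and value (“DNT: 1”) follow the W3C draft, while the function names and the server’s response to the signal are assumptions for the example.

```python
def build_request_headers(dnt_enabled):
    """Headers a DNT-aware browser might send with every outgoing HTTP request."""
    headers = {"User-Agent": "ExampleBrowser/1.0"}
    if dnt_enabled:
        headers["DNT"] = "1"  # per the W3C draft, "1" means the user opts out of tracking
    return headers

def server_should_track(headers):
    """A website's first-line check of the incoming DNT signal."""
    # What a site must *do* upon receiving the signal is the open policy
    # question; this sketch simply declines to track when DNT is "1".
    return headers.get("DNT") != "1"

print(server_should_track(build_request_headers(dnt_enabled=True)))   # False
print(server_should_track(build_request_headers(dnt_enabled=False)))  # True
```

    The simplicity of the mechanism is the point: the header is trivial to send and to read, which is why the hard questions are about compliance semantics rather than engineering.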

    One example of how DNT can be adopted by businesses can be seen in Twitter’s announcement that it would support DNT in connection with the testing of its “tailored suggestions” feature.1 Twitter’s tailored suggestions feature is designed to suggest people to follow based on a user’s visits to websites that have integrated Twitter buttons or widgets. The feature works by analyzing the websites that a particular user visits and suggesting people who frequently are followed by other Twitter users who visit the same websites.2 Twitter users may opt out of the feature on the Twitter website; alternatively, anyone can turn on DNT in their browser to stop data collection for the feature.

    In another example of DNT adoption, the mobile ad network Jumptap announced that it would support the use of DNT as a means of opting out of targeted advertising.3 Specifically, the company stated that for users who had the DNT setting in the mobile version of Firefox turned on, the company would deliver an untargeted ad and would update the user’s profile as “opted-out.” This opt-out would persist even if the user subsequently turned DNT off. To receive targeted ads after turning DNT off, the user would have to opt back in via the company’s website.
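
    The “sticky” opt-out behavior Jumptap described can be sketched as follows. The class and method names are hypothetical illustrations of the policy as stated in the announcement, not Jumptap’s actual implementation.

```python
class AdProfile:
    """Hypothetical ad-network profile implementing a persistent DNT opt-out."""

    def __init__(self):
        self.opted_out = False

    def handle_request(self, dnt_header):
        # Seeing DNT once marks the profile opted-out; the flag persists
        # even if later requests arrive without the header.
        if dnt_header == "1":
            self.opted_out = True

    def ad_type(self):
        return "untargeted" if self.opted_out else "targeted"

    def opt_back_in(self):
        # Per the announcement, re-enabling targeting requires an explicit
        # action via the company's website, not merely turning DNT off.
        self.opted_out = False

profile = AdProfile()
profile.handle_request("1")   # user browses with DNT turned on
profile.handle_request(None)  # user later turns DNT off
print(profile.ad_type())      # prints "untargeted" -- the opt-out persists
```

    The design choice worth noting is the asymmetry: the browser signal is sufficient to opt out, but insufficient to opt back in.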

    The W3C DNT Proposal

    The actual technical definition of DNT is currently in the hands of the W3C Tracking Protection Working Group. The W3C is an international organization that creates standards for the web with participation from member organizations, W3C staff, and the public. The Tracking Protection Working Group was chartered to define mechanisms for expressing user tracking preferences and for blocking or allowing tracking elements. There are two specifications related to DNT that the working group is currently developing: the “Tracking Preference Expression”4 specification, which defines the technical mechanism for DNT, and the “Tracking Compliance and Scope”5 specification, which defines the meaning of a DNT preference and outlines how websites should comply with DNT. Both specifications are still in draft form and are the subjects of active debate and ongoing revisions.

    The W3C process has attracted mainstream interest through an increasingly public debate over whether DNT should be turned on by default, or whether this practice should even be allowed. In May, Microsoft announced that it would ship the next version of its web browser, Internet Explorer 10 (IE10), with DNT turned on by default. This decision elicited a strong reaction from the Interactive Advertising Bureau (IAB), a prominent association of online advertisers, which asserted that “[a] default setting that automatically blocks content violates the consumer’s right to choose.”6 Shortly after Microsoft’s announcement, the W3C working group released a revised draft of the DNT specification that states that compliant browsers cannot send a DNT signal without explicit user consent. In effect, a browser compliant with that draft cannot set DNT either on or off by default. If the W3C working group’s position is ultimately adopted in the final draft of the specification and Microsoft maintains its position of turning DNT on by default, websites receiving the DNT signal from IE10 browsers may end up ignoring the signal because it would be considered non-compliant, thus rendering the setting ineffective.

    The W3C working group’s draft position on turning DNT on by default also evoked governmental responses in both the U.S. and the EU. U.S. Congressmen Edward Markey (D-MA) and Joe Barton (R-TX) were the first to weigh in, sending a letter to the working group that called on its participants to support allowing browsers to turn on DNT by default. This letter subsequently prompted a response by FTC Commissioner J. Thomas Rosch, who asserted that “Microsoft’s default DNT setting means that Microsoft, not consumers, will be exercising choice as to what signal the browser will send.”7 Finally, Robert Madelin, Director General for Information Society and Media in the European Commission, also sent a letter to the working group expressing a desire to see DNT deployed soon. The letter further stated that the three different tracking preference expressions drafted so far seem “sensible,” and that the W3C specification should not address what the default DNT setting should be or whether a browser should be able to choose a default setting. Rather, Director Madelin proposed, browsers should inform users upon installation or first use that they have a choice regarding DNT and prompt them for a decision.

    Conclusion

    It is important to remember that the W3C DNT proposal is still very much in draft form. While the basic technical aspects of the draft are unlikely to change, important policy decisions as to what compliance with an affirmative DNT signal looks like remain up in the air. Thus, businesses considering whether they need to or should comply with DNT as it currently stands should keep its draft state in mind, especially when considering whether to declare compliance with a still-evolving standard. That said, businesses developing or updating online products and services should always consider the potential privacy concerns raised by those products and services and how best to provide notice and choice to consumers, including whether to adopt DNT or other consumer-choice methods.


    1 Twitter Help Center, “Twitter Supports ‘Do Not Track,’” Twitter, https://support.twitter.com/articles/20169453.
    2 Twitter Help Center, “About Tailored Suggestions,” Twitter, https://support.twitter.com/articles/20169421.
    3 “Jumptap Supports ‘Do Not Track,’” Jumptap (February 23, 2012), http://www.jumptap.com/blog/jumptap-supports-%E2%80%98do-not-track%E2%80%99/.
    4 W3C Working Draft, “Tracking Preference Expression (DNT),” W3C (March 13, 2012), http://www.w3.org/TR/tracking-dnt/.
    5 W3C Working Draft, “Tracking Compliance and Scope,” W3C (March 13, 2012), http://www.w3.org/TR/tracking-compliance/.
    6 Randall Rothenberg, “‘Do Not Track’ Set to ‘On’ By Default in Internet Explorer 10—IAB Response,” IAB (May 31, 2012), http://www.iab.net/InternetExplorer.
    7 Letter from FTC Commissioner J. Thomas Rosch to W3C Tracking Protection Working Group (June 20, 2012), http://ftc.gov/speeches/rosch/120620W3Cltr.pdf.




    Dismissal of Amazon Privacy Suit Signals Uphill Battle for Plaintiffs’ CFAA Claims

    By Michael Wolk, John McGaraghan, and Lixian Hantover

    On June 1, 2012, the U.S. District Court for the Western District of Washington dismissed in part, with prejudice, a class action suit against Amazon alleging that the company’s use of cookies to track consumers’ personal data constituted a violation of the Computer Fraud and Abuse Act (CFAA) (Vecchio v. Amazon.com, Inc., No. C11-366RSL). This case is part of a growing trend of cases in which courts have dismissed CFAA claims related to the use of cookies and other forms of Internet usage-tracking technologies to better target consumers.1 In past cases, and here in Vecchio, the courts granted the defendants’ motions to dismiss CFAA claims on the grounds that the plaintiffs failed to articulate specific harms resulting from the defendants’ tracking of their personal data.

    Vecchio v. Amazon.com

    While there are many different mechanisms for tracking user activity online, the Vecchio case addresses Amazon’s use of cookies. Cookies are alphanumeric identifiers that website operators can configure their servers to send to Internet browsers accessing their sites. Each subsequent time a web page on the site is loaded, the operator can access data stored in those cookies to customize web pages and log browsing activities. Here, the plaintiffs complained that Amazon “‘exploit[ed]’ a known frailty” in the Internet Explorer browser to place cookies on their hard drives against their wishes, and despite their browsers’ filter settings. The plaintiffs claimed they had suffered economic injury as a result of Amazon’s conduct and sought relief under multiple claims, most notably under the CFAA.
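
    The cookie round trip described above can be sketched at the HTTP-header level. This is a minimal illustration of the general mechanism only; the header names (Set-Cookie, Cookie) are standard HTTP, while the identifier format and function names are assumptions for the example and do not describe Amazon’s implementation.

```python
import secrets

def server_first_response():
    """On first visit, the server assigns an alphanumeric identifier
    via a Set-Cookie header."""
    visitor_id = secrets.token_hex(8)
    return {"Set-Cookie": f"visitor_id={visitor_id}"}, visitor_id

def browser_next_request(stored_cookie):
    """On each later request, the browser echoes the stored cookie back."""
    return {"Cookie": stored_cookie}

def server_identify(request_headers):
    """The server reads the cookie to log browsing activity or
    customize pages for this visitor."""
    cookie = request_headers.get("Cookie", "")
    return cookie.partition("=")[2] or None

response_headers, original_id = server_first_response()
request_headers = browser_next_request(response_headers["Set-Cookie"])
print(server_identify(request_headers) == original_id)  # True
```

    The identifier itself carries no personal data; the privacy questions in cases like Vecchio arise from the browsing history the server can accumulate against it, and from how the cookie was placed despite the browser’s filter settings.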

    Under the CFAA, a plaintiff can state a civil cause of action against a defendant that intentionally accesses a computer without authorization, where such conduct causes the plaintiff loss or damages of at least $5,000 over a one-year period. In arguing that they met the damages threshold, the plaintiffs claimed that Amazon derived substantial financial gain through its use of cookies to gather the plaintiffs’ personal information. As a result, the plaintiffs argued that Amazon had deprived them of the opportunity to monetize their own valuable information. The court acknowledged that in theory, opportunity costs may be used to satisfy the damage threshold requirement. However, the court observed that the plaintiffs’ claims were entirely speculative because they did not demonstrate that they had the capacity or opportunity to independently monetize their raw information. As a result, the court granted Amazon’s motion to dismiss for the plaintiffs’ failure to state a claim under the CFAA.

    The court did, however, observe that the plaintiffs still might have a viable claim under the State of Washington’s Consumer Protection Act (CPA).2 The CPA requires a showing of injury, but unlike the CFAA, it does not require a plaintiff to demonstrate monetary damages in order to satisfy the requirement. In this case, the court stated that in order to allege an injury, the plaintiffs would need to demonstrate that Amazon accessed their computers or their information without authorization. The court noted that Amazon’s “Conditions of Use and Privacy Notice” appear to notify visitors to Amazon sites that the company uses cookies and that the terms state that the plaintiffs’ use of Amazon was conditioned on their acceptance of those very terms. The court asked the parties to file additional briefings on the issues of (a) authorization under the CPA and (b) whether Amazon’s conduct was unfair or deceptive in light of Amazon’s terms.

    Implications

    The Vecchio case is part of a larger trend toward limiting the application of the CFAA in the context of tracking and targeting advertisements and other content to web users based on the use of cookies and similar mechanisms. Like the dismissal in Vecchio, other data-tracking cases involving CFAA claims in California and New York3 have been dismissed where the plaintiffs were unable to produce a detailed allegation of actual damages and therefore could not sufficiently state a claim entitled to relief. Claims under certain state consumer protection laws, such as Washington’s CPA, which do not require a showing of damages in order to state a claim, may remain viable in cases in which the plaintiffs can demonstrate that the defendants acted without authorization.

    1 In re iPhone Application Litig., No. 11-MD-00250-LHK (N.D. Cal. Sept. 20, 2011); La Court v. Specific Media, No. SACV-10-1256-JW (C.D. Cal. Apr. 28, 2011).
    2 Wash. Rev. Code § 19.86.
    3 In addition to the California cases cited above, see Bose v. Interclick, Inc., No. 10-CV-09183-DAB (S.D.N.Y. Aug. 17, 2011).




    Do Consumers Understand Mobile and Online Advertising?

    By Seth Silber and Joseph Molosky

    Do you or your customers know what “#spon” means? If not, you are not alone. The Federal Trade Commission (FTC) held a one-day workshop on May 30, 2012, to discuss online and mobile disclosures in small-space media, such as “#spon,” which refers to a sponsored message on Twitter. The workshop included four panels that discussed a variety of questions, including whether the FTC’s 2000 online-disclosure guidance should be updated to address new technologies. One overarching question that continued to arise throughout the workshop discussions was: Who are reasonable consumers of small-space media, and what do they know? These will be key considerations for the FTC as the agency looks to provide updated guidance to companies operating in this space.

    At the workshop, panelists agreed that the high-level principles set forth in the FTC’s 2000 “Dot Com Disclosures Guidelines,”1 especially proximity and prominence of disclosures, still hold true even in today’s rapidly changing information culture. Several panelists suggested that when working to meet these principles, companies may find it helpful to identify the assumptions they make about consumers. For example, how deep is a consumer’s understanding of new media? Whom do consumers expect to teach or train them on new methods or techniques of advertising and disclosures? What are consumers’ primary means for gathering information—mobile-centric or traditional devices? Many panelists further suggested that the FTC consider these questions when determining whether and how to update the agency’s disclosure guidance. At the same time, several panelists emphasized that companies should be aware of how their own customers answer these questions in order to provide the best possible experience for them.

    Another topic of significant discussion at the workshop was the degree of consumer understanding as consumer attention spans shorten and more activities take place on mobile, small-space devices. Workshop panelist Jennifer King, a Ph.D. candidate at the University of California, Berkeley, School of Information, presented her research on consumer viewing patterns on web pages and noted that disclosures must be placed in a context that makes sense and is easily understood by the average consumer. Other panelists commented that in the context of mobile devices, this can be a complicated task due to the necessity of scrolling and reduced real estate for placement of disclosures. Robert Weissman of Public Citizen offered his view that the concern over consumer understanding may continue to hold true even as consumers become more comfortable and proficient with small-space devices, as consumers’ understanding of advertising may not advance at the same pace.

    With the significant increase in small-space advertising, one of the most debated concepts at the workshop was the use of icon and badge disclosures. One of the clearest examples of this issue and consumer knowledge is the “#spon” disclosure being used in tweets to designate a sponsored message. Malcolm Faulds of Word of Mouth Marketing Association (WOMMA) explained in one of the panel discussions that WOMMA has included the use of this hashtag in guidance for its members. Many panelists, however, did not agree that it appropriately conveyed the disclosure to consumers. These individuals felt that there is a lack of research on consumer understanding of the hashtag and that the word “sponsored” does not take up much more space than “#spon.” Nevertheless, a few panelists were quick to point out that as with any new disclosure tools, consumers must be given an opportunity to learn about these new concepts for the disclosures to be effective. Along those lines, a common sentiment expressed by industry representatives on the panels was the desire for the FTC to allow for flexibility in any new guidelines as the industry explores new options for ads and disclosures in the evolving small-space area.

    Panelists also noted that in many areas of small-space marketing, icon and badge usage is not as distinctive as the “#spon” hashtag, which may cause additional complications for companies and consumers. A few different organizations in attendance at the workshop are publishing and implementing their own sets of icons or badges. For example, Moms with Apps2 and PrivacyChoice3 are creating simplified privacy disclosure labels with the goal of disclosing a company’s privacy practices to consumers in a uniform and easy-to-read manner. One panelist noted, however, that competing methods of disclosure could lead to significant consumer confusion and increase the time needed for consumers to gain an understanding of each label, thereby reducing the disclosures’ effectiveness. Another panelist observed that this confusion also could cause harm to a company’s brand.

    An additional topic discussed by the panelists was how consumers access information. Panelist Anna Bager of the Mobile Marketing Center of Excellence at the Interactive Advertising Bureau explained that as more and more consumers move to mobile-centric computing instead of web-centric or traditional devices for obtaining information, the methods used by application and website developers will be key to ensuring that ads and disclosures simultaneously meet both the FTC’s guidance and consumer experience expectations. One promising area discussed by panelists was responsive design, a new development that allows websites to render effectively across multiple platforms with minimal effort. This method of development segments a site’s content into boxes and allows a device to rearrange the boxes to accommodate the size of the device’s screen. Most panelists agreed that this type of mobile-first attitude will benefit companies as consumers become more and more mobile-first themselves, while at the same time easing compliance with FTC guidelines due to the reduced complexity of adding appropriate disclosures.

    The FTC’s updated guidance for mobile and online advertising disclosures will likely not be available for some time, but based on the workshop panel discussions, the knowledge and experience of consumers are likely to play a significant role. As such, it is never too early for companies to begin discussing how they can best understand consumers and how to comply with the upcoming FTC guidance as well as meet consumers’ needs.


    1 Federal Trade Commission, “Dot Com Disclosures: Information about Online Advertising” (2000), available at http://www.ftc.gov/os/2000/05/0005dotcomstaffreport.pdf.
    2 See Privacy Icon, Moms with Apps, http://momswithapps.com/privacy-icon/.
    3 See PrivacyMark, PrivacyChoice, http://www.privacychoice.org/privacymark.


    Myspace Reaches Consent Agreement with FTC over Misrepresentations in Privacy Policy

    By Matthew Staples

    On May 8, 2012, the Federal Trade Commission (FTC) announced that it had reached a consent agreement with Myspace, a social networking service. The agreement settled charges that Myspace misled its users about the extent to which it shared personal information with third-party advertisers. The order signals increased scrutiny by the FTC of the ways in which social networks share information about their users with advertisers.

    The consent order prohibits Myspace from making misrepresentations regarding the extent to which it maintains the privacy of its users' information and the extent to which it complies with the U.S.-EU Safe Harbor Framework and similar programs. The order also requires Myspace to establish a "comprehensive privacy program" and obtain biennial assessments of the program by an independent auditor for the next 20 years. The order is the third consent order that specifically requires a comprehensive privacy program, following similar orders that the FTC entered into with Google and, more recently, Facebook.1

    Background

    The FTC alleged that, contrary to representations in its privacy policy, Myspace shared users' personally identifiable information (PII) with third-party advertisers by providing those advertisers with users' "Friend IDs." A Myspace Friend ID is a persistent, unique number that easily can be used to access a Myspace user's profile page. Depending on a user's privacy settings, different types of information may be available publicly on that user's profile page. According to the complaint, Myspace designated certain profile information, including a user's profile picture, location, gender, age, display name, and full name, as "basic profile information" outside the scope of its privacy settings, and thus made that information publicly viewable. While a user could change Myspace's default setting, which showed the user's full name publicly, allegedly only approximately 16 percent of users had made their full names private as of July 2010. Thus, as of July 2010, the Friend ID allegedly could be used to access, at minimum, the basic profile information of Myspace users, including the full names of approximately 84 percent of those users.

    Beginning in January 2009, Myspace allegedly shared the Friend IDs, ages, and genders of users viewing the Myspace website with third-party advertisers. According to the complaint, Myspace transmitted this information to third-party advertisers from January 2009 to June 2010 whenever its affiliated advertising network lacked an appropriate ad to serve. While Myspace allegedly started encrypting the Friend IDs, ages, and genders of users viewing the Myspace website in June 2010, it provided the encryption key to its affiliated ad network in order to permit use of this information to target advertising to those users. On October 29, 2010, Myspace's affiliated ad network was sold to a third party, and Myspace allegedly continued to provide the encryption key to the ad network’s new owners for the following year.

    In the privacy policy applicable at the time this information sharing took place, Myspace allegedly represented that it would not share a user's PII with unaffiliated third parties. Myspace further allegedly represented that: (a) while some users' profile information might have been used to customize ads delivered to them, the information provided for this purpose could not be used to personally identify each user; (b) anonymous web traffic and aggregated demographic information could be shared with advertisers; and (c) Myspace maintained a current self-certification to the U.S.-EU Safe Harbor Framework from December 9, 2010, onward and, as stated in its privacy policy, complied with the framework.2

    The FTC's Claims

    The FTC alleged in its complaint that four of Myspace's representations were false or misleading, thereby qualifying as deceptive acts or practices that violated Section 5 of the FTC Act. Key to the FTC's claims was that third-party advertisers receiving a user's Friend ID could use that ID to access the user's public profile and obtain PII about that user. Specifically, the FTC alleged that Myspace misrepresented that:

    1. Myspace would not share a user's PII with third parties without giving notice to and receiving permission from the user. The FTC claimed that by providing third parties with a user's Friend ID, Myspace gave those third parties access to that user's PII without providing the user with notice or obtaining permission.

    2. The means Myspace used to customize ads did not allow advertisers to access PII or individually identify users. Again, the FTC claimed that by providing third parties with a user's Friend ID, Myspace gave those third parties access to that user's PII.

    3. Information about users' web-browsing activity was anonymized when shared with advertisers. The FTC claimed that by providing advertisers with access to a user's PII via that user's Friend ID, advertisers could link web-browsing activity collected via cookies stored on that user's browser to PII available on that user's Myspace profile.

    4. It complied with the U.S.-EU Safe Harbor privacy principles of notice and choice. The FTC claimed that Myspace did not adhere to these principles.

    Settlement Terms

    Myspace's settlement with the FTC is similar to recent privacy consent orders that the FTC entered into with Facebook and Google. As noted above, the consent order prohibits Myspace from making misrepresentations regarding user privacy and compliance with the U.S.-EU Safe Harbor Framework. Myspace also must establish a comprehensive privacy program and obtain biennial assessments of the program by an independent auditor for the next 20 years. The specific requirements of the privacy program and the required contents of the biennial assessments are virtually identical to the programs and assessments imposed by the FTC's settlements with Google and Facebook.

    One subtle but notable addition to the Myspace consent order that distinguishes it from the previous privacy consent orders is the explicit inclusion of "device ID" as "covered information" within the consent order’s definitions. The order prohibits misrepresentations regarding the privacy of covered information, and the required comprehensive privacy program must be designed to protect such information. The FTC's definition of "covered information" has included persistent identifiers in the past, but this order represents the first time that "device ID" specifically has been identified as a persistent identifier in a privacy consent order. This subtle inclusion may signal that the agency intends to scrutinize the use of device IDs in targeted advertising going forward, which could have a significant impact on companies developing mobile applications supported by such advertising.

    Implications

    This settlement has important implications for businesses engaged in targeted advertising, particularly those also engaged in the social media space. Companies should take care that representations made in their privacy policies regarding data collection, sharing, and use are accurate. Additionally, companies that self-certify compliance with the U.S.-EU Safe Harbor Framework or other self-regulatory codes should ensure that they understand the requirements of those programs and act accordingly. Companies that share personally identifiable information about their users, even indirectly, may find themselves subject to regulatory scrutiny if they do not provide their users with what the FTC believes to be adequate notice and choice regarding information-sharing practices.

    1 Our WSGR Alert regarding this Facebook consent agreement is available at http://www.wsgr.com/wsgr/Display.aspx?SectionName=publications/PDFSearch/wsgralert-facebook-ftc-settlement-privacy.htm.
    2 The U.S.-EU Safe Harbor Framework serves as a method that U.S. companies may use to transfer personal data outside of the European Union, consistent with the requirements of the European Union’s “Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data,” known commonly as the Data Protection Directive. The framework is a voluntary program through which a company self-certifies to the U.S. Department of Commerce that it complies with the framework's principles and requirements.

    The Reverberations of Carrier IQ: Federal Communications Commission Steps into Privacy Debate

    By Wendell Bartnick

    With the Federal Trade Commission’s (FTC’s) extensive involvement in protecting consumers’ privacy interests, it is easy to forget that other federal agencies also can play a key role. The public outcry following the discovery of wireless carriers’ use of Carrier IQ’s key-logging software on wireless devices has prompted another government agency to consider becoming more heavily involved in regulating privacy and data security issues.

    Under the Communications Act of 1934, as amended, the Federal Communications Commission (FCC) has jurisdiction to regulate telecommunications carriers, including wireless service providers. Under this authority, the FCC recently sought public comment on “the privacy and security practices of mobile wireless service providers with respect to customer information stored on their users’ mobile communications devices, and the application of existing privacy and security requirements to that information.”1 

    Evolution of the Wireless Industry

    The FCC acknowledged that “technologies and business practices have evolved dramatically” since its most recent solicitation of public input on these matters in 2007. Among the industry changes, the FCC stated, was the Carrier IQ incident, in which “[m]obile carriers are directing the collection and storage of customer-specific information on mobile devices.” Wireless service providers have explained that this technology allows them to gather data to help operate, maintain, and improve the network and services. The FCC concluded that although the use of this information may be legitimate and effective for providing better services, “the collection, transmission, and storage of this customer-specific network information raise new privacy and security concerns.”

    FCC Authority

    Congress has granted the FCC the authority to enforce telecommunications carriers’ duty to “protect the confidentiality of proprietary information of, and relating to . . . customers.”2 This duty generally requires a “telecommunications carrier that receives or obtains customer proprietary network information [(CPNI)] by virtue of its provision of a telecommunications service [to] only use, disclose, or permit access to individually identifiable [CPNI]” as necessary to provide the service.3 The FCC asked for public comment on its preliminary conclusion that CPNI could include data collection before its transmission to the wireless carrier, as well as the factors that may affect that conclusion. If the FCC concludes that CPNI includes data collected and stored on mobile devices at the direction of wireless carriers, then it likely has the authority to regulate the collection, use, and disclosure of such information.

    In addition, the FCC is requesting public comment on other topics, including:

    • how data practices have evolved;
    • the effectiveness of notice to consumers about data practices;
    • the encouragement of privacy by design by the FCC;
    • the application of privacy and security obligations to wireless service providers and third parties involved in collecting and storing customer information that wireless service providers cause to be collected and stored on mobile devices; and
    • general privacy concerns.

    Implications

    Privacy interests related to the mobile space have become a hot topic in Washington, and we likely can expect significant guidance and, potentially, regulation in this area in the near term. In addition to the FCC’s recent indication of future involvement, the National Telecommunications and Information Administration (NTIA) is holding stakeholder meetings on mobile privacy to help develop industry codes of conduct as part of the White House’s Consumer Privacy Bill of Rights.4 In addition, in its March 2012 privacy report,5 the FTC stated that mobile services are a top priority, and it recently held a public workshop discussing privacy disclosures on mobile devices. In light of this government agency involvement, companies involved in the mobile space should carefully review their potentially evolving privacy obligations, which may differ from those required of Internet websites.

    1 The public notice is available at http://transition.fcc.gov/Daily_Releases/Daily_Business/2012/db0525/DA-12-818A1.pdf.
    2 47 U.S.C. § 222 (a).
    3 47 U.S.C. § 222 (c)(1).
    4 Our WSGR Alert discussing the White House’s Consumer Privacy Bill of Rights is available at http://www.wsgr.com/wsgr/Display.aspx?SectionName=publications/PDFSearch/wsgralert-consumer-privacy-bill-of-rights.htm.
    5 Our WSGR Alert discussing the FTC’s final report on privacy is available at http://www.wsgr.com/WSGR/Display.aspx?SectionName=publications/PDFSearch/wsgralert-FTC-final-privacy-report.htm.


    Please click here for a printable version of this edition of Eye on Privacy.

    This communication is provided for your information only and is not intended to
    constitute professional advice as to any particular situation.

    © 2012 Wilson Sonsini Goodrich & Rosati, Professional Corporation