Where's the Market for Online Privacy?

Why are market forces so weak in protecting users’ online privacy?

The main reason is that the online marketplace is economically structured around users being a commodity (data to be aggregated and mined), not customers to be served and protected in a competitive marketplace. That’s because the overriding economic force that created the free and open commercial Internet, the predominant Silicon Valley venture capital/IPO value creation model, was and remains largely antithetical to protecting online privacy.

The Silicon Valley venture capital/IPO-driven model is laser-focused on achieving Internet audience/user scale fastest in order to gain first-mover advantage and then rapid dominance of a new product or service segment. This predominant Internet economic model is predicated on a precious few investments achieving such rapid user scale that it: warrants a buy-out at an enormous premium multiple; enables fast and exceptionally profitable liquidity (via the new secondary derivative market for private venture shares or employee options); or offers broad liquidity via a public IPO.

What is the essential element of achieving audience/user scale fastest? Free. No direct cost to the user fuels the fastest, most frictionless, viral adoption. This free economic model presupposes online advertising as an eventual monetization mechanism and shuns products and services directly paid for by the user, because their inherent time-to-market is too slow and their upfront sunk cost of sales and customer service is too high for this predominant value creation model.

The other essential element of a fastest-adoption-possible model is user trust. User trust is created legitimately by providing the user with very valuable innovation at no monetary cost. However, user trust is also illegitimately manufactured and maintained via misrepresentation: that the free service works for the user (when the real monetary value creation comes from the Silicon Valley liquidity derivative market and the online advertising market), and that it has only the interests of users in mind. That misrepresentation works by downplaying privacy concerns, risks or harms; by implying that privacy undermines the free-speech and sharing ethos of the free and open Internet; and by claiming that privacy is outdated, anti-innovation, and no longer the current social norm.

Privacy policies generally meet the letter of full disclosure but seldom the spirit, because they decline to explain openly and in detail the purposes, amount, breadth and sensitivity of the private data being collected on a user, and how that aggregated information could be used or abused by the collector, a third party or the government.

The second big reason that market forces for privacy protection are so weak is that the user is not the customer but the product. Once Internet companies’ founders, early investors, and employees have generated wealth via the Silicon Valley value creation model, their companies shift to fully harvesting the value of that model via advertising revenue growth. In the online advertising model, the user is the product and the advertiser is the customer. Importantly, the online advertising model assumes and depends on publicacy (the opposite of privacy), because what makes online advertising work is the unlimited business freedom to maximally leverage whatever customer data (private/personal information) a company can mine in order to micro-target users’ personal hot buttons most effectively and profitably.

The third big reason market forces for privacy protection are so weak is the glaring lack of user leverage/consumer power in the market equation. By design, the Silicon Valley venture capital/IPO model produces first-movers that can dominate their chosen Internet segment: Google in search, Facebook in social networking, eBay in online auctions and payments, Amazon in retailing, Twitter in real-time micro-blogging, Zynga in games, etc. By design, this adoption-fastest model seeks to preclude or limit the viability of any significant competitive alternative. Thus the purveyors of this model can claim users have privacy choice, when they know their model limits the choice of alternatives, and that the few alternatives available have little market incentive to protect privacy.

Given the Internet loophole in privacy law (industries such as health care, financial services and communications operate under strict privacy laws, while the Internet does not), online users have no meaningful privacy rights or power in the Internet marketplace to protect their privacy. This means consumers face the exact opposite of the market situation they enjoy in other markets, where consumers (buyers) have unique private knowledge of their own wants, needs, means and budget. In the online market, the seller holds most of that private information about the buyer, and the buyer generally does not know it, so buyers have dramatically less leverage or negotiating power than they do in a market where they are the customer and not the product.

Despite the substantial value exchanged when users use ad-based online services, there is no real market transaction for privacy in that exchange, i.e. no market choice for users to protect their privacy -- or to sell it, should one choose to exploit one’s privacy for one’s own financial benefit. The current model assumes that the user is and always will be a data-pawn without a real economic role or say in the market transaction over their personal data.

Tellingly, Smart Money reports that Michael Fertik, CEO of Reputation.com, estimates that a user’s “personal information can be worth between $50 and $5,000 per person per year to advertisers and market researchers.” If this estimate is remotely accurate, why couldn’t or shouldn’t there be a market mechanism for users either to protect their private/personal information or to sell it for their personal benefit? What’s wrong or unworkable about giving users a market role in influencing the market outcomes for their personal information? It is truly remarkable that such a rich marketplace, worth literally tens of billions of dollars per year in revenues, almost completely shuts the user out of participating openly and directly in the transactions involving their private information.

Where does this leave us? Recently, the Supreme Court ruled in U.S. v. Jones that attaching a GPS tracking device to a vehicle was a trespass on private property that constituted a search under the Fourth Amendment. If people’s private data is in fact a legal form of private property -- one over which a user has some right to exercise a substantial amount of personal control -- then the online advertising model may be built on a foundation of sand, not the foundation of rock that people assume.

Moreover, there is mounting evidence that in the future users will have more power over their privacy/private information than they do today. The EU is proposing an update of its privacy rules for the first time since 1995, and it proposes to give users much greater ability to opt out and to control what personal information an online company holds on them. The FTC favors an Internet “Do Not Track” mechanism like the FTC’s wildly popular Do Not Call List, and Do Not Track legislation has been introduced in Congress. The Department of Commerce has proposed a Privacy Bill of Rights. Google’s centralization of private information under its new privacy policy has generated strong opposition, a bipartisan letter from lawmakers urging Google to allow users the freedom to opt out, and charges that Google is violating the FTC-Google Buzz privacy agreement. There is evidence that Facebook users continue to be concerned about oversharing with Facebook’s new Timeline. And the FTC has sanctioned Google, Facebook, and Twitter for not adequately protecting users’ privacy.

In sum, isn’t it ironic that in this supposed market that allegedly serves and empowers the interests of users “at the edge,” there is no real privacy innovation to protect users’ privacy the way that users want, only innovation to more effectively invade, abuse, or monetize people’s privacy, largely without their knowledge or permission?

What does all this mean? It means there is a serious market failure in protecting users’ online privacy.

Let me be crystal clear here. I have nothing against venture investing; it is a sound capital market essential to funding high-risk innovation, and I am not saying the venture capital model is a market failure. Nor am I saying that online advertising is a market failure. I have nothing against online advertising; it is a completely legitimate and useful business model and mechanism to fund free content, provided that purveyors of the model fairly represent the inherent privacy/financial conflicts of interest and risks to users, so that users have the accurate information they need to protect themselves, if they choose to do so.

The very specific market failure here is twofold. First, extremely lax enforcement of Section 5 of the FTC Act, which prohibits deceptive business practices, created market failure because users were systematically denied fair representation in the marketplace, which is the first and most important line of defense against consumer fraud. Without effective fair-representation law enforcement, consumers incorrectly assume the online businesses in question are being forthright. Second, dysfunctional Federal privacy law -- which protects privacy expectations offline but not online -- creates market failure as well, because users have minimal control over the market for their online personal data. What is needed is new legislation establishing a consumer-driven, technology- and competition-neutral privacy framework that works both online and offline.

Simply put, correcting this specific market failure requires law enforcement to ensure that online businesses fairly represent their privacy and financial conflicts of interest to consumers/users so they can better protect themselves, and requires Congress to harmonize Federal privacy law to close the huge Internet loophole so that users enjoy the same expected privacy protections online that they do offline.

Scott Cleland is President of Precursor LLC, a consultancy serving Fortune 500 clients, some of which are Google competitors; he is also author of “Search & Destroy: Why You Can’t Trust Google Inc.” In addition, Cleland is Chairman of NetCompetition, a pro-competition e-forum supported by broadband interests. During the George H.W. Bush Administration, Cleland served as Deputy United States Coordinator for International Communications and Information Policy.

 

