Regulators of internet platforms should look beyond legal aspects

Larissa Brunner

Date: 10/12/2019
Effective regulation of internet platforms should address the fundamental question of how to regulate in light of uncertainty over future outcomes.

The rise of large internet platforms has changed digital ecosystems and economic structures around the world. Many of these changes have been positive: enabling people to connect with each other more easily and affordably, even across great distances; and to access information within seconds and an immense range of products at the click of a button. However, they have also come at the cost of increasing concerns about data protection, consumer rights and unfair competition.

Regulation objectives

Regulation must strike a balance in the face of this uncertainty. On the one hand, it must be robust enough to uphold the precautionary principle, minimise risks for consumers and prevent anti-competitive behaviour. On the other hand, attempting to eliminate all risk could result in overly restrictive regulation that stifles new businesses and hinders innovation.

Regulators thus need to balance a set of legal objectives against an economic one. On the legal side stand fair competition, to minimise market distortions and ensure consumer benefits; data protection, to safeguard citizens’ right to privacy as well as the fundamental rights to freedom of expression and non-discrimination; and consumer rights, to prevent unfair terms and pricing techniques and ensure transparency. Against these stand wider economic benefits, including the innovation, growth and consumer surplus generated by new products and services.

In practice, however, there is a tendency to prioritise the first set of objectives – all of which are legal – over the potential economic benefits. This can be explained partly by the fact that fair competition, data protection and consumer rights tend to be clearly defined – companies either protect user data in line with their legal obligations or they do not – while wider economic benefits tend to be diffuse and take time to materialise. It is often virtually impossible to assess in advance how a new platform will contribute to growth, innovation and consumer surplus. Moreover, regulators have a legal mandate, and thus an obligation to focus on clearly defined legal categories.

As a result, in light of the uncertainty surrounding the future impact of innovations, regulators tend to try to eliminate rather than manage risk.

Non-monetary prices

The example of non-monetary prices illustrates the problem. Many internet platforms (e.g. Google, Facebook) provide a variety of services to consumers without charging a monetary price. Instead, they collect user data and monetise it, for example by showing users targeted advertisements.

This approach raises issues regarding data protection and consumer rights and has even given rise to illegal behaviour in extreme cases. Perhaps the most egregious violation to date is the Facebook-Cambridge Analytica scandal of early 2018, when it was revealed that the political consulting firm Cambridge Analytica had used personal data from millions of Facebook profiles for political advertising without the users’ consent.

However, it is not just illegal behaviour that has raised concerns. Indeed, the fundamental premise of many internet giants’ business models – collecting and monetising user data in return for ‘free’ services – has caused unease among regulators and some consumers, even when the companies are abiding by the law. Moreover, there is a risk that large companies may stretch the law and grow so big that they can no longer be adequately regulated.

For example, Google saves its users’ location (if they use it on their smartphones), search, YouTube and app activity histories, among many other things.[1] Such data could allow Google to draw conclusions about the intimate lives of its users – what their daily schedules look like, where and how often they travel, how they spend their free time, and what religious and political views they hold.

This could lead legally minded regulators to argue that the risks associated with non-monetary prices are unacceptably high and that the practice should be banned or at least severely restricted. In extremis, this would mean forcing platforms like Facebook or Google to charge users a monetary price in exchange for a commitment not to use their private data for commercial purposes.

Missing a piece of the puzzle?

While this idea may have a certain appeal from a data protection point of view, it overlooks a piece of the puzzle.

For all its flaws, the use of non-monetary prices also holds several benefits. First, allowing companies to collect and analyse user data can give rise to new business models and lead to the creation of services that otherwise would not exist, supporting innovation and economic growth. Moreover, if the data is used for research, either by the companies themselves or by academic institutions, important insights may be gleaned.

Second, the use of non-monetary prices may lead to higher consumer surplus. As long as the value consumers derive from using the new services is greater than the perceived costs of having their data collected, they are better off than if those business models did not exist. However, any such calculation should take into account externalities between consumers (i.e. an individual’s data being disclosed by someone else) and the difficulty of accurately estimating the perceived costs of data collection.

Moreover, even if business models that currently rely on data collection were feasible under a monetary price model, consumers may still prefer the former option. Several studies have shown how easy it is to incentivise consumers to give up personal data. For example, a 2017 Stanford study found that a large majority of participants were willing to disclose the email addresses of three friends in exchange for free pizza.[2] This suggests that consumers tend to value their data less than companies do. Such choices may well be irrational, but they are nonetheless indicative of consumers’ preferences – and preferences need not be rational. This implies that forcing consumers to pay companies a price equivalent to the value of their data would reduce consumer surplus.

Third, the use of non-monetary prices can ensure accessibility across all sections of society, regardless of financial resources. This is especially important for marginalised and vulnerable groups who might otherwise be excluded, which would deepen the digital divide and entrench their precarious position. At the same time, these groups tend to suffer the greatest harm from data-related malpractice, highlighting the need for regulation.

These considerations suggest that regulators should look beyond purely legal aspects when making decisions. Trying to eliminate rather than manage risk can stifle innovation and prevent the creation of consumer surplus. While not all relevant factors can be easily measured and weighed against each other, regulators should be aware that all of the four areas mentioned above – fair competition, data protection, consumer rights and wider economic benefits – exist and need to be balanced in policymaking to obtain optimal results.

This publication is part of our activities as co-project leader of The Digital Clearinghouse initiative, which aims to create a platform facilitating cooperation, dialogue and exchange of insights and best practices between regulatory authorities, policymakers, researchers and other stakeholders. Its key mission is to achieve a better and more coherent protection of individuals in an era of big data and artificial intelligence.

The support the European Policy Centre receives for its ongoing operations, or specifically for its publications, does not constitute an endorsement of their contents, which reflect the views of the authors only. Supporters and partners cannot be held responsible for any use that may be made of the information contained therein.

[1] Curran, Dylan, “Are you ready? Here is all the data Facebook and Google have on you”, The Guardian, 30 March 2018.
[2] Athey, Susan; Christian Catalini and Catherine E. Tucker (2017), “The Digital Privacy Paradox: Small Money, Small Costs, Small Talk”, Stanford: Stanford Graduate School of Business.
