'Making Markets Work: New Challenges for EU Competition Law': The 2019 Mackenzie-Stuart Lecture (audio)

Duration: 35 mins 23 secs

About this item
Description: The Centre for European Legal Studies (CELS) hosts an annual public lecture in honour of Lord Mackenzie-Stuart, the first British Judge to be President of the Court of Justice. Among the eminent scholars of European legal studies invited to give the lecture are Professor Joseph Weiler, former Judge David Edward and Advocate-General Francis Jacobs, both of the European Court of Justice. The texts of the Mackenzie-Stuart Lectures are published in the Cambridge Yearbook of European Legal Studies.

The 2019 Mackenzie-Stuart Lecture was delivered by Commissioner Margrethe Vestager, European Commission, under the title 'Making Markets Work: New Challenges for EU Competition Law' on 4 February 2019.

More information about this lecture, including photographs from the event, is available from the Centre for European Legal Studies website at:

https://www.cels.law.cam.ac.uk/mackenzie-stuart-lectures

This entry provides an audio source for iTunes.
 
Created: 2019-02-05 13:00
Collection: The Mackenzie-Stuart Lecture: The Centre for European Legal Studies (audio)
Cambridge Law: Public Lectures from the Faculty of Law
Publisher: University of Cambridge
Copyright: Mr D.J. Bates
Language: eng (English)
Distribution: World (downloadable)
Keywords: Competition; EU Competition; Regulation; EU; Data Protection;
Categories: iTunes - Law & Politics - Law
Explicit content: No
Transcript:
It’s a very great honour to be here in Cambridge, to give this Mackenzie-Stuart Lecture.

Seventy-five years ago, almost to the day, digital technology scored one of its very first successes. At Bletchley Park, less than 50 miles from here, the world’s first programmable digital electronic computer, known as Colossus, cracked its first code on 5 February 1944.

Colossus was a huge advance on the machines that came before it. But to crack a code, it needed information from intercepted radio messages. Those messages were very faint and hard to make out, and just one mistake in the message would make the whole text useless.

So it was fortunate that the team at Bletchley could rely on skilled radio operators. Because Colossus – like every computer the world has seen since – could only reach its full potential if it had the right data.

The data explosion
The pioneers of computing would be astonished if they could see how far we’ve come in 75 years.

But perhaps the most remarkable thing of all would be the enormous explosion in the amount of data we produce.

Every time we shop online, or do an Internet search, we leave a trail of data behind us. Every time we walk around town with our smartphones in our pockets, they’re building up a record of the places we’ve been. And behind the scenes, modern factories and the industrial products they make constantly record data that can help them run more efficiently. The latest Airbus planes, for instance, each have thousands of sensors on board, producing terabytes of data every day.

The benefits of data
And with that data at our fingertips – and the technology to make sense of it – we can understand our world in a way we couldn’t before. Companies can tell when a machine is about to break down. Doctors can get a much better understanding of the factors that decide how we’ll respond to a medicine. Computers can learn to make judgments that have long been the preserve of humans – recognising faces, for instance, or perhaps one day even driving cars.

And the more we understand our world, the better we can control it.

So data gives us power. Power we can use to save resources, to find new solutions to our problems, to help us live happier and more fulfilling lives.

So this is an exciting moment. But it’s also a moment when we have choices to make, about who gets to use that power.

Data and competition
When you or I sit down to play a game of chess, we’re playing the same game a grandmaster plays. But they see the board very differently from us. We see a jumble of pieces – the grandmaster takes in the whole board at a glance. We struggle to see just a couple of moves ahead – Magnus Carlsen, the highest-ranked player in history, can see a sequence of as many as 20 moves.

And data can turn us into grandmasters. It can give us the ability to see more deeply, to understand our world more thoroughly, to make faster, better decisions. But if only some of us have those abilities, it can mean that the rest have no chance to keep up.

That’s worrying, when you think about what it could mean for the openness of our societies. But we also need to think what it means for competition.

Because we need competition, to keep our markets working fairly – to keep up the drive for companies to cut prices, improve quality, come up with innovative new products. And competition can’t work if just a few companies control a vital resource that you need to be able to compete – and if they refuse to share it with others.

Right now, it looks as though data is becoming one of those vital resources. And if that’s so in a particular case, then we need to make sure it’s not monopolised by a few.

Understanding how data affects competition
It’s true that data isn’t that easy to monopolise. There’s no limit to the number of companies that can use the same data, at the same time. And some types of data are pretty easy for anyone to get hold of, by collecting it from users or buying it in.

A few years ago, when Microsoft bought LinkedIn, we investigated whether LinkedIn’s data would let Microsoft squeeze its rivals out of the market. As it turned out, the sort of data which LinkedIn had available wasn’t crucial for rivals to be able to compete – and similar data was available on the market. But that’s not always the case.

Between Denmark and Sweden runs a narrow stretch of sea known as the Øresund. It’s probably best known these days for the bridge that crosses it – and the TV show named after the bridge. But in the past, when what’s now southern Sweden was part of Denmark, the Øresund – along with two other Danish Straits – was the only way in and out of the Baltic Sea.

No single country controlled the whole Baltic. But for centuries, the Danish kings could decide who entered and left – and charge handsome fees for the privilege. Because they controlled the most vital thing – access.

The Internet offers us a huge wealth of choice. We can choose products from millions of sellers, find news from every country in the world. But all those sites are funnelled to us through just a handful of platforms – the search engines that help us find different offers for a product, the online marketplaces where buyers meet sellers, the news aggregators that collect stories from around the web.

And like the Danish kings, with their castles by the Øresund, those platforms do very well from the fact that so much trade has to pass through their dominions. They have access to huge amounts of data, about every part of the market. And we need to be sure that the way they use that data doesn’t undermine competition.

Amazon’s Marketplace is a platform that links sellers and buyers. But Amazon also sells products directly – often in competition with the very same sellers. That raises the question of how Amazon uses the data it collects about other sellers through the platform, and whether that leads to unfair competition.

This is still at a very early stage. We certainly can’t say today that Amazon has done anything wrong. But what matters is that we’re already looking very closely at whether companies are using their control of data to harm competition.

Looking to the future of competition policy
But we need to be sure that the competition rules are ready for a world where data becomes even more vital.

A few weeks ago, in Brussels, we held a conference with some of the world’s leading experts on technology and society. And one common theme that kept coming up – at the conference, and in the more than a hundred written contributions we got in advance – was concern about what it could mean for competition, if a few companies got control of the data that you need to compete.

There were also some interesting ideas about how we could tackle this issue. And we’ve asked our three special advisers to put forward their ideas in the report they will produce next month, on what digitisation means for consumers, and how competition enforcers should respond.

And I’m very grateful for all those ideas. Because this isn’t a simple issue.

The thing is, that word “data” covers a vast range of different things. The way that companies in different sectors collect and use data can be so different that it might not even be possible to have one set of rules that works for them all. It might be better, instead, to adjust our approach to fit the way data is used in each sector of the economy separately.

But whether we have one set of rules for everyone, or different approaches for different sectors, no solution will work unless it’s fair to everyone. It will have to give companies access to the data they need to compete. But it will also have to be fair to the companies that have put money and effort into building those sets of data; and it will have to respect the privacy rights of the people whose data it is.

Privacy
But competition rules won’t solve all the issues we face, when it comes to the power of data.

People need to know that this power won’t fall into the hands of just a few companies. But they also need to know that, whoever uses that data, they’ll use it in our interests – and above all, that they won’t undermine our privacy.

Competition enforcement can help with that. After all, the point of competition is to give consumers the power to insist on the kind of service they want. And if privacy is something that’s important to consumers, competition should drive companies to offer better protection.

But if we want to make sure people’s privacy is really protected, the most important thing is strong privacy rules, firmly enforced. Already, the EU’s new rules on privacy – the General Data Protection Regulation – are leading the way for a new approach around the world. And the fine which the French data protection authority imposed on Google last month shows that those rules have teeth.

But as consumers, we also have our part to play. Having these rights is important. But to make a difference, we also need to use them.

Of course, we need to be realistic. Few of us have the time to dig very deep into what’s happening with our data. But if we show that we care about how our data is used – and that we want to have more control – then there’s a lot of room for business to step in with products that can guide us through the maze, and make sure that our rights are protected.

That could mean keeping an eye on what’s happening with our data – who’s sharing it, and what they’re using it for – throughout the Internet. It could mean helping us compare the privacy of different services, so we can pick out the ones we prefer. It could even mean helping consumers get the full value out of their data. All of those are services that business could provide – if, as consumers, we show that we want them.

Ethics in AI
But even if we make sure our own data is safe, that on its own won’t solve all our challenges.

Because when we hand over data, we’re not just telling a computer – and the business or the government that runs it – about ourselves as individuals. We’re also helping them learn about how people in our society think and feel and act – building an understanding that they can use to make decisions about all of us, or to try to persuade us.

So we also need to come together as a society, to make sure we’re not fuelling harmful uses of data. We need to make sure, for example, that artificial intelligence doesn’t entrench our human prejudices. When a computer makes a decision, we easily assume that it’s objective – that it’s based purely on logic. And yet, if the data it’s trained on is biased, there’s a serious risk that the results will be too.

This is why the Commission has put together a group of experts, to come up with ethical rules to make sure AI doesn’t undermine our fundamental values. That group has already published draft guidelines, and the final version should be ready in March.

Conclusion
In the last few years, as data has revealed more and more of its potential, it’s sometimes seemed that society is falling behind. Our understanding of what’s changing, and our will to shape that change, haven’t always kept pace with the way technology has developed.

But I think that, as a society, we’re starting to catch up. We’re starting to understand that those fun quizzes on social media are really there to harvest data; that our smartphones are tracking us everywhere we go; that even the biggest companies can be caught out by data leaks. And there’s a growing understanding that if we want our values to be protected in a data-driven world, we’ll have to take action.

That won’t make life easier for those who work with data. Companies that develop AI will need to think carefully, right from the start, about how to make their products ethical. Businesses will have to be clearer with their users about exactly what they plan to do with their data.

But that’s how it should be. Because data can do great things for us – but we can’t buy those benefits at the expense of our values. We can’t trade our freedom for better maps, or our democracy for a better social media algorithm. We have a lot to gain from data – but we need to make sure that we use it in a way that is really good for society.

So we must not be afraid, as a society, to take control of this new world. Because in the end, it’s not technology that will decide our future. It’s us.

Thank you.
Available Formats
Format  Sample rate  Bitrate            Size
MP3     44100 Hz     249.83 kbits/sec   64.75 MB
MP3     44100 Hz     62.22 kbits/sec    16.19 MB