The European Commission has named 17 Very Large Online Platforms (VLOPs) and 2 Very Large Online Search Engines (VLOSEs) that will face stricter obligations under the Digital Services Act (DSA). The Commission said these companies were singled out because they each have over 45 million monthly active users.
The companies that will have to adhere to the strict new rules, detailed below, are Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, Twitter, Wikipedia, YouTube, Zalando, Bing, and Google Search. The last two are the VLOSEs; the rest are the VLOPs.
Now that they’ve been designated, they will have to comply with the following rules within the next four months:
- More user empowerment:
- Users will get clear information on why certain content is recommended to them and will have the right to opt out of recommendation systems based on profiling;
- Users will be able to report illegal content easily, and platforms will have to process such reports diligently;
- Advertisements cannot be displayed based on the sensitive data of the user (such as ethnic origin, political opinions or sexual orientation);
- Platforms need to label all ads and inform users of who is promoting them;
- Platforms need to provide an easily understandable, plain-language summary of their terms and conditions, in the languages of the Member States where they operate.
- Strong protection of minors:
- Platforms will have to redesign their systems to ensure a high level of privacy, security, and safety of minors;
- Targeted advertising based on profiling towards children is no longer permitted;
- Special risk assessments, including for negative effects on mental health, will have to be provided to the Commission four months after designation and made public at the latest a year later;
- Platforms will have to redesign their services, including their interfaces, recommender systems, and terms and conditions, to mitigate these risks.
- More diligent content moderation, less disinformation:
- Platforms and search engines need to take measures to address risks linked to the dissemination of illegal content online and to negative effects on freedom of expression and information;
- Platforms need to have clear terms and conditions and enforce them diligently and non-arbitrarily;
- Platforms need to have a mechanism for users to flag illegal content and act upon notifications expeditiously;
- Platforms need to analyse their specific risks, and put in place mitigation measures – for instance, to address the spread of disinformation and inauthentic use of their service.
- More transparency and accountability:
- Platforms need to ensure that their risk assessments and their compliance with all the DSA obligations are externally and independently audited;
- They will have to give access to publicly available data to researchers; later on, a special mechanism for vetted researchers will be established;
- They will need to publish repositories of all the ads served on their interface;
- Platforms need to publish transparency reports on content moderation decisions and risk management.
The Digital Services Act will be enforced through a “pan-European supervisory architecture”: the Commission will supervise the designated platforms and search engines, working alongside national Digital Services Coordinators (DSCs). The DSCs will also be responsible for supervising smaller platforms and search engines.
The strict rules mean the platforms and search engines have a lot of work ahead to meet the necessary standards. No penalties were outlined in the announcement, but last year Neowin reported that companies that break the rules could be fined up to 6% of their global turnover.