
DSA vs. DMA: How Europe’s twin digital regulations are hitting Big Tech


It’s no accident that the European Union’s Digital Services Act and Digital Markets Act have such similar-sounding names: They were conceived together and, at the end of 2020, proposed in unison as a twin package of digital policy reforms. EU lawmakers had overwhelmingly approved them by mid-2022, and both regimes were fully up and running by early 2024. While each law aims to achieve distinct things, via its own set of differently applied rules, they are best understood as a joint response to Big Tech’s market power.

Key concerns driving lawmakers include a belief that major digital platforms have ignored consumer welfare in their rush to scale and fatten profits online. The EU also sees dysfunctional digital markets as undermining the bloc’s competitiveness, thanks to phenomena like network effects and the power of big data to cement a winner-takes-all dynamic.

The argument is that this is both bad for competition and bad news for consumers who are vulnerable to exploitation when markets tip.

Broadly speaking, the DSA is concerned with rising risks to consumer welfare in an era of growing uptake of digital services. That could be the online distribution of illegal goods (fakes, dangerous products) on marketplaces or illegal content (CSAM, terrorism, etc.) on social media. But there are thornier issues too, such as online disinformation: There may be civic risks (such as election interference), but how such content is handled (whether it’s taken down, made less visible, labeled, etc.) could have implications for fundamental rights like freedom of expression.

The bloc decided it needed an updated digital framework to tackle all these risks and ensure “a fair and open online platform environment” that can underpin the next decades of online growth.

The DSA is very much a balancing act, though: The bloc is aiming to drive up content moderation standards in a quasi-hands-off way, by regulating the processes and procedures involved in content-related decisions rather than defining what can and can’t be put online. The aim is to harmonize and raise standards around platforms’ governance and decision-making processes, including by ensuring communication channels exist with relevant external experts, in order to make platforms more responsible in moderating content.

There’s a further twist: While the DSA’s general rules apply to all sorts of digital apps and services, the strictest requirements — concerning algorithmic risk assessment and risk mitigation — only apply to a subset of the largest platforms. So the law has been designed to have the greatest impact on popular platforms, reflecting higher risks of harm flowing from stronger market power.

But when it comes to impact on Big Tech, the DMA is the real biggie: The mission of the DSA’s sister regulation is to drive market contestability itself. The EU wants this regulation to rebalance power at the very top of the tech industry pyramid. That’s why this regime is so highly targeted, applying to just over a handful of power players.

Laws with teeth big enough to bite Big Tech?

Another important thing to note is that both laws have sizable teeth. The EU has long had a range of rules that apply to online businesses, but none of its earlier dedicated digital regulations pack this much punch. The DSA contains penalties of up to 6% of global annual turnover for infringements; the DMA allows for fines of up to 10% (or even 20% for repeat offenses). In some cases, that could mean billions of dollars in fines.
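To put those ceilings in concrete terms, here is a minimal sketch in Python of how the caps scale with a company’s global annual turnover; the percentages come from the rules described above, while the €100 billion revenue figure and the function itself are purely hypothetical.

```python
# Illustrative only: theoretical maximum fine ceilings under the DSA and DMA,
# applied to a hypothetical global annual turnover figure.

DSA_CAP = 0.06          # DSA: up to 6% of global annual turnover
DMA_CAP = 0.10          # DMA: up to 10% of global annual turnover
DMA_REPEAT_CAP = 0.20   # DMA: up to 20% for repeat offenses

def max_fines(global_annual_turnover_eur: float) -> dict:
    """Return the theoretical maximum fines for a given turnover."""
    return {
        "dsa_max": global_annual_turnover_eur * DSA_CAP,
        "dma_max": global_annual_turnover_eur * DMA_CAP,
        "dma_repeat_max": global_annual_turnover_eur * DMA_REPEAT_CAP,
    }

# A hypothetical company with €100B in global annual turnover would face
# ceilings of €6B (DSA), €10B (DMA) and €20B (DMA, repeat offenses).
print(max_fines(100_000_000_000))
```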

There’s a growing list of platform giants subject to the DSA’s strictest level of oversight, including major marketplaces like Amazon, Shein and Temu; dominant mobile app stores operated by Apple and Google; social network giants, including Facebook, Instagram, LinkedIn, TikTok and X (Twitter); and, more recently, a handful of adult content sites that have also been designated as very large online platforms (VLOPs) after crossing the DSA usage threshold of 45 million or more monthly active users in the EU.

The European Commission directly oversees compliance with DSA rules for VLOPs, centralizing the rulebook’s enforcement on Big Tech at the EU level (whereas enforcement of the DSA’s general rules is decentralized to member state-level authorities). This structure underlines that the bloc’s lawmakers are keen to avoid forum shopping undermining its ability to enforce these rules on Big Tech, as has happened with other major digital rulebooks (such as the GDPR).

The Commission’s early priorities for DSA enforcement fall into a few broad areas: illegal content risks; election security; child protection; and marketplace safety, though its investigations opened to date cover a wider range of issues.

Around 20 companies, operating roughly two dozen platforms between them, are in scope of the EU’s enforcement of the DSA’s rules for VLOPs. The Commission maintains a list of designated VLOPs and any actions it’s taken on each.

The DMA is also enforced centrally by the Commission. But this regime applies to far fewer tech giants: Just six companies were originally designated as “gatekeepers.” Back in May, European travel giant Booking was named the seventh.

The gatekeeper designation kicks in for tech giants with at least 45 million monthly EU end users and 10,000 annual business users. And, in a similar fashion to the DSA, the DMA applies its rules to specific types of platforms (a similar number of platforms are in scope of each law, though the respective lists are not identical). The EU also has some discretion over whether to designate particular platforms: Apple’s iMessage was let off the hook, as were Microsoft’s advertising business and Edge browser, while Apple’s iPadOS was added to the list of core platform services in April.
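As a rough illustration of how those headline user thresholds work, here is a minimal Python sketch; the function name and example figures are hypothetical, and real designation also weighs other criteria and leaves the Commission the discretion described above.

```python
# Simplified sketch of the DMA's headline user-number thresholds for
# gatekeeper designation. Actual designation also involves further
# criteria and Commission discretion not modeled here.

MONTHLY_EU_END_USERS_THRESHOLD = 45_000_000
ANNUAL_EU_BUSINESS_USERS_THRESHOLD = 10_000

def meets_user_thresholds(monthly_end_users: int, annual_business_users: int) -> bool:
    """Check only the two headline user-number criteria mentioned above."""
    return (monthly_end_users >= MONTHLY_EU_END_USERS_THRESHOLD
            and annual_business_users >= ANNUAL_EU_BUSINESS_USERS_THRESHOLD)

# Example: a platform with 50M monthly EU end users and 12,000 annual
# business users clears both headline thresholds.
print(meets_user_thresholds(50_000_000, 12_000))  # True
```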

Regulated categories for the DMA cover strategic infrastructure where Big Tech platforms may be mediating other businesses’ access to consumers, including operating systems; messaging platforms; ad services; social networks; and various other types of intermediation.

The structure means there can be overlap of application between the DSA and the DMA. For example, Google Search is both a DSA VLOP (technically, it’s a very large online search engine, or VLOSE; the EU also uses VLOPSE to refer to both categories) and a DMA core platform service. The respective mobile app stores of Apple and Google are also both VLOPs and CPS. Such platforms face a double whammy of compliance requirements, though the EU would say this reflects their strategic importance to digital markets.

Shooting for a digital market reboot

Problems the EU wants the regulations to address by reshaping behavior in digital markets include reduced consumer choice (i.e., fewer and less innovative services) and higher costs (free services can still carry steep costs, such as forcing a lack of privacy on users).

Online business models that do not pay proper attention to consumer welfare, such as ad-funded Big Tech platforms that seek to drive engagement through outrage/polarization, are another target. And making platform power more responsible and accountable is a unifying thread running through both regimes.

The EU thinks this is necessary to drive trust in online services and power future growth. Its thesis is that, without fair and open competition online, not even startups can ride to the rescue of digital markets: It’s harder for startups to reach as many consumers as the dominant players, which means there’s little chance that innovation alone will prevent or correct negative effects. Hence the bloc’s decision to lean into regulation.

The DSA and DMA take different approaches to Big Tech

So while the DSA aims to leverage the power of transparency to drive accountability on major platforms — such as by making it obligatory for VLOPs to publish an ad archive and provide data access to independent researchers so they can study the societal impacts of their algorithmic content-sorting — the DMA tries to have a more upfront effect by laying down rules on how gatekeepers can operate strategic services that are prone to becoming choke points under a winner-takes-all playbook.

The EU likes to refer to these rules as the DMA’s list of “dos and don’ts,” which boil down to a pretty specific set of operational requirements based on stuff the bloc’s enforcers have seen before, via earlier antitrust enforcements, such as the EU’s multiple cases against Google over the past two decades. It hopes these commandments will nip any repeat bad behaviors in the bud.

One of the dos on the list, however, is an important order that aims to force gatekeepers to open up their CPS to third parties, in a bid to stop them using control of their dominant platforms to close down competition.

Changes Apple announced to iOS in the EU earlier this year, allowing sideloading of apps through web distribution and third-party app stores, are a couple of examples of the DMA forcing more openness than was on offer under Big Tech’s standard playbook.

Another key DMA interoperability mandate applies to messaging platforms. This “do” will require Meta, so far the only designated gatekeeper with messaging CPS (WhatsApp and Messenger), to build infrastructure that lets smaller platforms offer ways for their users to communicate with people on, say, WhatsApp without needing to sign up for a WhatsApp account.

This requirement is in force but has yet to translate into new opportunities for messaging app consumers and competitors, given that the DMA allows implementation periods for the necessary technical work. The EU has also allowed Meta more time to build the technical connectors. But policymakers are hoping that, over time, the interoperability mandate for messaging will level the playing field in this area, because it would empower consumers to choose services based on innovation rather than the pull of network effects.

The same competitive leveling goal applies across all the CPS types the DMA regulates. The bloc’s big hope is that a set of operational commandments applied to the most powerful forces in tech will trigger a wide-ranging market reset that rekindles service innovation and supports consumer welfare. But the success or otherwise of that competitive reset mission remains to be seen.

The regulation only started applying to gatekeepers in March 2024 (versus late August 2023 for the DSA’s rules on VLOPSEs). The real-world effects of this flagship digital market reform will be playing out for months and years yet.

That said, if anyone thought the DMA’s fixed “dos and don’ts” would be self-executing as soon as the law began to apply, the Commission’s swift announcement (in March 2024) of a clutch of investigations into suspected noncompliance should have dispelled that notion. On certain issues, some gatekeepers are clearly digging in and preparing to fight.

List of DMA investigations opened to date

Apple: Since March, the EU has been looking into whether Apple’s rules on steering developers in the App Store comply with the DMA; the design of its choice screens for alternatives to the Safari web browser; and whether its core technology fee (CTF), a new charge introduced with the set of business terms that implement DMA entitlements, meets the bloc’s rules. The law doesn’t include a specific ban on gatekeepers charging fees, but they must abide by FRAND (fair, reasonable and nondiscriminatory) terms.

In June 2024, the Commission announced preliminary findings on the first two Apple probes and confirmed the formal CTF investigation. Its draft findings at that point included a view that Apple is breaching the DMA by not letting developers freely inform their users of alternative purchase opportunities. All these probes remain ongoing.

Alphabet/Google: Since March, the EU has also been investigating Alphabet’s rules on steering in Google Play, as well as self-preferencing in Google search results.

Meta: Meta’s “pay or consent” model also came under DMA investigation in March. Since November 2023, the tech giant has forced EU users of Facebook and Instagram to agree to being tracked and profiled for ad targeting in order to get free access to its social networks; otherwise, they must pay a monthly subscription to use the services. On July 1, the EU issued a preliminary finding that this binary choice breaches the DMA. The investigation is ongoing.

DSA: EU investigations into VLOPSEs

On the DSA side, the Commission has been slower to open formal investigations, although it does now have multiple probes open.

By far its most-used enforcement action is the power to ask platforms for more information about how they’re operating regulated services (known as a request for information, or RFI). This underpins the EU’s ability to monitor and assess compliance and build cases where it identifies concerns, which explains why the tool has been used repeatedly in the 11 months since the compliance deadline for VLOPSEs.

X (Twitter): The first DSA investigation the EU opened was on X, back in December 2023. The formal proceeding concerned a raft of issues including suspected breaches of rules related to risk management; content moderation; dark patterns; advertising transparency; and data access for researchers. In July 2024 the Commission issued its first DSA preliminary findings, which concern aspects of its investigation of X.

One of the preliminary findings is that the design of the blue check on X is an illegal dark pattern under the DSA. A second preliminary finding is that X’s ad repository does not comply with the regulatory standard. A third preliminary finding is that X has failed to provide the requisite data access for researchers. X was given a chance to respond.

The EU continues to investigate X in other areas, too, including the spread of illegal content; the platform’s handling of disinformation; and its Community Notes content moderation feature. On these, it has yet to reach a preliminary view.

TikTok: In February 2024, the EU announced a DSA probe of video social network TikTok that it said is focused on protection of minors; advertising transparency; data access for researchers; and the risk management of addictive design and harmful content.

AliExpress: In March 2024, the Commission opened its first DSA probe of an ecommerce marketplace, targeting AliExpress over suspected failings related to risk management and mitigation; content moderation; its internal complaint-handling mechanisms; the transparency of advertising and recommender systems; the traceability of traders; and data access for researchers.

Meta: In April 2024, the EU took aim at Meta’s social networks Facebook and Instagram, opening a formal DSA investigation into suspected breaches of election integrity rules. Specifically, it said it’s concerned about the tech giant’s moderation of political ads. It’s also concerned about Meta’s policies for moderating non-paid political content, suggesting they are opaque and overly restrictive.

The EU also said it would look into Meta’s policies around enabling outsiders to monitor elections. A further grievance it’s probing relates to Meta’s processes for letting users flag illegal content, which EU enforcers are concerned are not easy enough to use.

Penalties and impacts

So far, no DSA or DMA investigations have been formally concluded by the Commission, meaning that no penalties have been issued yet. But that’s likely to change as probes conclude in the coming months and years.

As with all EU regulations, it’s worth emphasizing that enforcement is a spectrum, not an event. Just the fact of oversight can apply pressure and lead to operational changes, ahead of any formal finding of noncompliance. Assessing impact based on headline penalties and sanctions alone would be a very crude way of trying to understand a regulation’s effect.

Notably, there have already been some big changes to how major platforms operate in the EU. Early DMA-related developments include Apple being forced to allow sideloading and offer choice screens for alternatives to Safari, and Google having to ask users before it links their data for ad targeting across core platform services.

But it’s also true that some major business model reforms have yet to happen.

Notably, Apple has so far stuck to its fee-based model for the App Store (creating a new fee, the CTF, in a bid to work around the effect of being forced to open up the App Store), and Meta has sought to cling to a privacy-hostile model by forcing users to choose between being tracked and paying to use its historically free services, despite blowback from enforcers of multiple EU rules.

On the DSA side, the EU has been quick to trumpet a number of developments as early wins. It has credited the DSA with helping drive improvements in platforms’ responsiveness to election security concerns ahead of the EU elections (also following its publication of detailed guidance and pre-election stress-testing exercises), and it has highlighted LinkedIn’s decision to disable certain types of ads data linking following a DSA complaint. Another example the EU points to as evidence of early impact is TikTok pulling functionality from the TikTok Lite app in the region over addiction concerns.

DMA effects the Commission may be less keen to own are claims by Apple and Meta that they’re delaying the launch of certain AI features in the EU, as they’re unsure how the DMA applies.


