Taking Stock: Platform Transparency under the DSA

 

With the Digital Services Act now over a year old, it's worth taking stock of what we've learned so far.

One way to assess progress is to examine how the DSA's groundbreaking transparency requirements for Very Large Online Platforms (VLOPs) have worked out in practice. The Act's requirements are meant to push platforms toward transparency that empowers users and regulators, and the resulting disclosures can tell us whether platforms are genuinely committed to transparency and are taking their duty of care to users seriously.

Transparency under the DSA

The DSA sets up a solid framework of clear transparency obligations, organized around key checkpoints:

1.      Publishing clear rules (Articles 14 & 15): Providers must clearly disclose their content moderation policies, including any algorithmic or human review processes, and publish annual transparency reports on their moderation activities and statistics.

2.      Explaining and contesting content decisions (Articles 16-18, 20 & 25): Users need clear, accessible statements of reasons for each restriction or removal, a free internal complaint-handling system to challenge those decisions, and online interfaces that avoid manipulative or deceptive design that could impair informed choice.

3.      Revealing recommendation and advertising mechanics (Articles 26, 27 & 39): Platforms must disclose core algorithmic recommendation parameters (Article 27) and maintain an accessible, transparent advertising repository providing detailed information about the ads presented (Article 39).

4.      Offering non-profiling alternatives (Article 38): VLOPs must offer a feed that operates without personal profiling.

5.      Enabling tangible and external oversight (Articles 34, 35, 37, 40 & 42): Platforms are required to assess and mitigate systemic risks (Articles 34 & 35), undergo independent audits (Article 37), provide access to data for oversight (Article 40), and comply with detailed transparency reporting obligations (Article 42).

Transparency or Performance?

We recently conducted a comprehensive evaluation of transparency disclosures from 13 major platforms, examining approximately 75 regulatory documents published since the DSA came into force, including systemic risk assessments, terms of service, and semi-annual transparency reports. Despite the DSA's robust legal framework, our analysis highlights significant shortcomings that undermine its transparency goals.

A prominent finding from our analysis is platforms' widespread reliance on repetitive and superficial disclosures, often described as "transparency theatre". Nearly half of these disclosures were excessively vague, resorting to generic references to "platform practices" rather than explicitly detailing how specific systems such as content moderation or recommendation algorithms operate and how their inherent risks are assessed.

Typical boilerplate phrases, such as "We are committed to transparency," appear frequently across platforms, promising broad assurances without delivering substantive clarity. On issues of fairness and bias, platforms consistently recycle generic statements about their commitments without offering specifics about the methodologies for bias testing, the metrics used for assessing disparities, or concrete steps for mitigating biases.

This reliance on generic, superficial language obstructs meaningful oversight by regulators, researchers, and the public. By maintaining surface-level compliance, platforms present an appearance of transparency without providing the operational details critical for genuine accountability.

A Better Way to Measure Transparency?

So how do we get past this corporate-speak? One answer lies in more structured transparency indicators: a more systematic way to turn the DSA's requirements into specific, measurable criteria that platforms must address.

Such indicators can be drawn from extensive work by scholars and experts who have already thought long and hard about the puzzle of transparency and accountability in algorithmic systems such as recommender systems and content moderation. Science and Technology Studies (STS) scholars have identified many fundamental features and indicators that, if adjusted to the requirements of the DSA, together make up a comprehensive set of reasonable queries to put to platforms.

Building on this groundwork, it is straightforward to first break transparency down into three layers: systemic (how the platform works overall), operational (day-to-day processes), and societal (broader impacts).

From there, we can develop a tentative, initial framework of transparency indicators for platforms' algorithmic systems (content moderation and recommender systems).

We established over 60 questions to assess current platform transparency practices via content analysis. These are not prescriptive; rather, they provide a foundational approach to operationalizing transparency assessment under the DSA's requirements, facilitating consistent evaluation across stakeholders. The aim was to distinguish between substantive transparency and performative compliance:

Systemic Transparency

  • Objective Transparency (11 questions): Explicit articulation of algorithmic objectives such as engagement optimization, content diversity goals, user retention strategies, and value alignment mechanisms. This dimension reveals the normative foundations underlying algorithmic design choices and enables assessment of goal-outcome coherence. Core DSA Anchor: Art. 14 §§1-4 (T&Cs); Art. 27 §§1-2 (main recommender parameters); Arts. 34-35 (systemic-risk identification & mitigation)

 

Operational or Governance Transparency

  • Input Transparency (9 questions): Comprehensive disclosure of data inputs encompassing behavioral signals, contextual metadata, third-party data streams, and inferential attribution processes. This transparency layer exposes the informational architecture shaping algorithmic decision-making and enables evaluation of the legitimacy of data processing. Core DSA Anchor: Art. 27 §§1-2 (main signals & weighting)

  • Curation Transparency (23 questions): Exposition of content selection mechanisms, ranking methodologies, filtering criteria, and prioritization logic. This operational visibility enables stakeholders to understand content governance processes and evaluate their consistency with stated platform policies. Core DSA Anchor: Art. 27 §§1-2 (ranking parameters); Arts. 16-18 (notice-and-action flow, statement of reasons); Art. 20 (internal complaint handling, process visibility)

  • Personalization Transparency (8 questions): Requirements for accessible user interfaces that provide granular visibility into individual algorithmic profiles, personalization parameters, and preference modeling. This empowers user agency through informed choice and meaningful control over personalized experiences. Core DSA Anchor: Art. 27 §3 (users can influence recommender criteria); Art. 38 (non-profiling option); Art. 26 (ad-targeting disclosures); Art. 17 (individual explanations)

Societal Transparency

  • Fairness Transparency (8 questions): Public disclosure of bias detection methodologies, mitigation strategies, fairness metrics, and equality impact assessments. This dimension addresses algorithmic systems' differential impacts across demographic groups and vulnerable populations. Core DSA Anchor: Art. 34 §1(b) (risk of discrimination); Art. 35 §1 (mitigation); Art. 37 §4(g) (audit opinion on compliance)

  • Auditability Transparency (2 questions): External verification through comprehensive activity logging, standardized audit interfaces, and researcher data access protocols. This enables independent oversight and democratic accountability mechanisms. Core DSA Anchor: Art. 37 (independent audit regime); Art. 40 (vetted-researcher data access); Art. 42 (public disclosure of risk & audit reports)
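To make the framework concrete, here is a minimal sketch of how these dimensions could be encoded for a content analysis, with each dimension carrying its layer, its number of questions, and its core DSA anchors. The dimension names, question counts, and article references come from the list above; the data structure itself, the field names, and the TRANSPARENCY_FRAMEWORK constant are illustrative assumptions, not the actual coding instrument we used.

```python
# Illustrative sketch only: one possible encoding of the indicator framework
# described above. Dimension names, layers, question counts, and DSA anchors
# follow the list; the structure itself is a hypothetical convenience.
from dataclasses import dataclass

@dataclass(frozen=True)
class Dimension:
    layer: str                     # "systemic", "operational", or "societal"
    questions: int                 # number of assessment questions in this dimension
    dsa_anchors: tuple[str, ...]   # core DSA articles the dimension maps onto

TRANSPARENCY_FRAMEWORK = {
    "objective":       Dimension("systemic",    11, ("Art. 14", "Art. 27", "Arts. 34-35")),
    "input":           Dimension("operational",  9, ("Art. 27",)),
    "curation":        Dimension("operational", 23, ("Art. 27", "Arts. 16-18", "Art. 20")),
    "personalization": Dimension("operational",  8, ("Art. 27(3)", "Art. 38", "Art. 26", "Art. 17")),
    "fairness":        Dimension("societal",     8, ("Art. 34(1)(b)", "Art. 35(1)", "Art. 37(4)(g)")),
    "auditability":    Dimension("societal",     2, ("Art. 37", "Art. 40", "Art. 42")),
}

total_questions = sum(d.questions for d in TRANSPARENCY_FRAMEWORK.values())
print(f"{total_questions} questions across {len(TRANSPARENCY_FRAMEWORK)} dimensions")  # 61 questions, 6 dimensions
```

Grouping the questions by dimension in this way keeps the mapping between each disclosure question and its legal basis explicit, which is what makes consistent, cross-platform comparison possible.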

What the Numbers Tell Us

We assessed platforms across these transparency indicators: over 60 specific questions, each scored from 0 (no disclosure) to 2 (detailed, platform-specific disclosure). Most platforms averaged around 1, indicating a prevalence of generic or superficial disclosures. Detailed disclosures were uncommon, demonstrating that platforms largely meet transparency requirements only at a surface level rather than substantively.
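As a rough illustration of how such scores aggregate, the sketch below averages hypothetical 0-2 scores per dimension and overall for a single fictional platform. The 0-2 rubric mirrors the one just described; the example scores, the subset of dimensions shown, and the dimension_averages helper are invented purely for illustration and are not our actual results.

```python
# Hypothetical example of aggregating 0-2 content-analysis scores.
# The rubric (0 = no disclosure, 1 = generic, 2 = detailed, platform-specific)
# follows the text above; the scores and dimensions below are made up.
from statistics import mean

def dimension_averages(scores: dict[str, list[int]]) -> dict[str, float]:
    """Average the per-question scores within each transparency dimension."""
    return {dimension: round(mean(values), 2) for dimension, values in scores.items()}

# Invented scores for a fictional platform, keyed by dimension.
example_platform = {
    "objective":    [1, 1, 2, 0, 1, 1, 1, 0, 1, 1, 1],  # 11 questions
    "fairness":     [1, 0, 1, 1, 1, 0, 1, 1],           # 8 questions
    "auditability": [2, 1],                              # 2 questions
}

per_dimension = dimension_averages(example_platform)
overall = round(mean(s for values in example_platform.values() for s in values), 2)
print(per_dimension)  # {'objective': 0.91, 'fairness': 0.75, 'auditability': 1.5}
print(overall)        # an overall average near 1 signals mostly generic disclosures
```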

Towards Genuine Accountability

A critical weak point remains the auditing mechanism mandated by Article 37. Presently, auditing is dominated by the Big Four accounting firms, creating potential conflicts of interest. These firms often perform financial services for the platforms they audit, fostering conditions ripe for "audit capture," where auditors may soften findings to maintain lucrative corporate relationships. Moreover, platforms themselves reportedly define the benchmarks against which they are audited, further undermining independence.

These observations primarily concern platform discourse rather than tangible affordances or the actual functional properties of platforms, which would better reflect genuine adherence to regulatory requirements. Nevertheless, clearly articulating discursive expectations can establish a foundational reference point for systematically assessing transparency practices. Establishing these expectations facilitates meaningful dialogue among stakeholders, regulators, and platforms about what genuine transparency could and should entail. By anchoring discussions in explicitly defined standards and expectations, stakeholders can more effectively identify gaps between stated commitments and actual practices, thereby driving accountability and continuous improvement in platform governance.

The DSA is an opportunity to transform how platforms approach transparency and accountability. Without structured indicators and genuinely independent audits, however, transparency risks remaining little more than symbolic compliance. Realizing the DSA's vision of empowering users and safeguarding democratic engagement requires a forward-looking, inclusive, participatory approach, anchored in rigorously defined indicators and strengthened by independent oversight.