Discussions
The EU’s new digital regulations set out obligations that affect how technology companies operating in the Union handle language, documentation, and compliance. The AI Act, the Digital Services Act (DSA) and the Digital Markets Act (DMA) together create a complex set of obligations that must often be fulfilled not only through technical measures but also through documentation and disclosures: how companies describe their systems, risks, terms and user rights in writing.
These new regulations sit on top of existing EU consumer protection and data protection law, which already requires clear and accessible information for users across all Member States, in their own languages. The Artificial Intelligence (AI) Act requires providing technical documentation and user instructions for high-risk AI systems; the Digital Services Act (DSA) restructures how online platforms express their terms of service, content moderation rules and transparency reports; and the Digital Markets Act (DMA) forces large “gatekeeper” platforms to explain complex data and ranking practices in terms that business users and consumers can actually understand.
For many companies, multilingual documentation is no longer just a secondary localisation task for the marketing or UX departments: it has become a matter of legal compliance.
Many of the operative duties established by the AI Act, DSA and DMA are structured as information duties, such as technical files, user-facing notices, terms of service summaries, transparency reports, and algorithmic disclosures. Documentation is no longer a back-office by-product: it is the basis on which regulators conduct their audits and users exercise their rights.
A mistranslated legal term, an ambiguous summary of a risk or a right, or an inconsistency between language versions can thus become a point of regulatory scrutiny or litigation. For instance, if ‘high-risk AI system’ under Article 6 AI Act is rendered in another official language of the Union as ‘high-impact AI system’, readers could conclude that an AI system falls outside the stricter category in that jurisdiction.
The Artificial Intelligence Act (AI Act): technical documentation and AI literacy
For AI systems, the AI Act requires technical documentation with specified minimum content, drafted before placing the system on the market, kept up to date, and clear and comprehensive enough for authorities to assess compliance.
AI providers must also support a sufficient level of AI literacy among staff and users: in practice, this means intelligible, audience-appropriate documentation and training materials, not just dense legal annexes.
In a multilingual Union, those same documents, risk descriptions and instructions must be accurately localised to preserve definitions, risk categorisations and limitations of use across all markets where the system is deployed.
The Digital Services Act (DSA): user-facing language and accessibility
The DSA requires that intermediary services, especially very large online platforms, provide clear and concise summaries of their terms of service, including remedies and redress mechanisms. They must also make their terms and conditions accessible in the official languages of all Member States where they offer services.
Emphasis is placed on ensuring that the choice of language does not itself become a barrier to communication, encouraging providers and authorities to overcome language gaps with appropriate human or technological resources.
It also seeks to prevent “dark patterns”: deceptive, unethical user interface (UI) designs crafted to manipulate users into taking actions they did not intend, such as making purchases, subscribing to services, or sharing personal data.
Under Article 26 DSA, advertising transparency must be delivered “for every ad”: standardised text templates translated into 24 official EU languages are provided so that disclosure wording remains consistent and compliant across jurisdictions. This means, in practice, that a video platform must display, in the viewer’s language, why a particular political advert is shown: for example, because the user follows a specific channel or matches a demographic segment.
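In engineering terms, that consistency is easier to maintain when every disclosure is generated from one reviewed template per language rather than translated ad hoc per advert. A minimal sketch, assuming a hypothetical in-memory template store (the locales, wording and function names below are invented illustrations, not official DSA text):

```python
# Hypothetical sketch: render every ad disclosure from one reviewed template
# per locale so the wording stays consistent across adverts and languages.
# Template texts below are invented examples, not official DSA wording.

AD_DISCLOSURE_TEMPLATES = {
    "en": "You are seeing this ad because {reason}. Sponsor: {sponsor}.",
    "es": "Ves este anuncio porque {reason}. Patrocinador: {sponsor}.",
    "de": "Sie sehen diese Anzeige, weil {reason}. Sponsor: {sponsor}.",
}

def render_disclosure(locale: str, reason: str, sponsor: str) -> str:
    """Render the disclosure in the viewer's language, falling back to
    English when no reviewed template exists for the locale."""
    template = AD_DISCLOSURE_TEMPLATES.get(locale, AD_DISCLOSURE_TEMPLATES["en"])
    return template.format(reason=reason, sponsor=sponsor)
```

The point of the single template table is that legal review happens once per language, at the template level, instead of once per advert.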
Under the DSA, any mistranslation of concepts such as “sponsor”, “targeting parameters” or the available appeal mechanisms can distort users’ understanding of their rights, exposing platforms to enforcement action or accusations of dark patterns.
The Digital Markets Act (DMA): fairness, transparency and plain language
The DMA establishes obligations for “gatekeeper” platforms: that is, large digital firms, such as Alphabet, Amazon, Apple, ByteDance, Meta, and Microsoft, that control essential “core platform services” – such as app stores, search engines, or operating systems – thus serving as gateways between businesses and consumers.
The obligations imposed by the DMA are based on transparency and fairness principles, requiring terms and conditions and data-use information that are publicly available and written in plain and intelligible language.
Emphasis is placed on ensuring that the information that enables users to exercise their DMA rights is accessible, neutral and provided in a timely manner, avoiding manipulative or overly complex wording that would undermine meaningful choice.
Because gatekeepers typically operate across the EU, they must articulate identical rights, data flows and ranking or interoperability rules in multiple languages without losing legal nuance between versions.
Why multilingual legal precision matters for compliance
Regulatory risk and evidence
- Technical documentation and user-facing notices can serve as primary evidence of compliance: if they are unclear or inconsistent between language versions, regulators can question both substance and process.
- Divergent translations of key concepts (for example, “high-risk use”, “profiling”, “suspension for misuse”) can be read as unequal treatment of users in different Member States, undermining fundamental-rights arguments and exposing companies to complaints.
Interplay with GDPR and sector rules
- In terms of transparency and accountability, the DMA and related instruments largely build on the General Data Protection Regulation (GDPR), which already requires concise, intelligible and easily accessible information in clear and plain language. By extending this logic into AI and platform regulation, these instruments significantly increase the volume of texts that must meet that standard in every market language.
Operational complexity and governance
- AI and platform providers must keep their documentation up to date: updates to models, safety measures or terms trigger parallel updates in all relevant languages, requiring structured translation workflows, terminology management, and version control.
For example, a platform that retrains a recommender system to comply with Article 27 DSA will have to adjust recommender explanations and associated help-centre content in every language where that recommender is offered.
- Companies will be expected to adopt standardised multilingual templates, like those released for DSA ad transparency, or to match their precision with templates of their own.
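The version-control point above can be sketched concretely: if each translation records the source-document revision it was produced from, a change to the source immediately flags every stale language version. Everything below (revision scheme, data layout, function names) is an invented illustration of the workflow, not a description of any particular tool:

```python
# Invented sketch of version tracking for parallel language updates: each
# translation stores the source-document revision it was translated from,
# so updating the source reveals which language versions are out of date.

import hashlib

def revision(text: str) -> str:
    """Content hash identifying one revision of the source document."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()[:12]

def stale_locales(source_text: str, translations: dict) -> list:
    """Return locales whose translation was made from an older revision."""
    current = revision(source_text)
    return [loc for loc, t in translations.items() if t["source_rev"] != current]
```

In a real pipeline the same idea is usually delegated to the version-control system itself, but the invariant is identical: no language version is considered current unless it points at the current source revision.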
User trust, rights and litigation posture
- Clear and consistent multilingual documentation supports AI literacy initiatives and makes it easier for users and staff to understand limitations, risks, and redress channels.
- From a litigation standpoint, having aligned language versions of terms and conditions and technical documentation reduces the scope for disputes over authentic versions, forum-shopping based on language, or claims that users were misled by a localised interface.
For example, in a dispute over account suspension under Article 20 DSA, a user might point to discrepancies between the English and Spanish versions of the terms to argue that they were not properly informed of the grounds for suspension.
Implications for legal translation
Under the EU’s new digital regulations, translation has shifted from marketing-focused localisation to a core compliance function. The AI Act, DSA and DMA establish a set of obligations to be discharged through what providers state in their documentation, terms of service, transparency notices and user-facing explanations. Regulators will ask not only whether this information exists, but also whether it is accurate, consistent and understandable in each language in which services are offered.
Legal translation is now, together with impact assessments and technical controls, part of the documentary evidence that companies rely on to prove their legal compliance and support their position against potential enforcement or litigation.
In addition to being best practice, managing legal terminology is becoming part of regulatory infrastructure. Organisations will need shared multilingual glossaries for AI risk categories, content-moderation outcomes, ranking parameters, data uses and redress mechanisms to express the same legal reality across 24 official EU languages without semantic drift. Consistent terms for concepts such as “high-risk AI system”, “systemic risk”, “suspension of service”, “profiling”, or “preferential ranking” will provide the terminology that allows authorities, users, and courts in different Member States to interpret obligations in a harmonised way rather than fragmenting along language lines.
Legal-linguistic experts are thus working together with lawyers, product managers, and engineers to design documentation that is technically precise, aligned with internal risk assessments, and accessible to non-specialists in each target language. They are also creating and maintaining controlled multilingual terminology sets that ensure that obligations and user rights are framed identically across jurisdictions.
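A controlled terminology set of this kind can be sketched as a small term base plus an automated drift check run over each language version before publication. The entries, translations and function names below are invented illustrations (not official terminology resources), shown only to make the workflow concrete:

```python
# Illustrative multilingual term base: each canonical concept maps to one
# approved rendering per language, plus known unapproved variants. A language
# version fails the check if it uses a rejected variant or omits the approved
# term. All entries below are invented examples, not official translations.

TERM_BASE = {
    "high-risk AI system": {
        "es": {"approved": "sistema de IA de alto riesgo",
               "rejected": ["sistema de IA de alto impacto"]},
        "de": {"approved": "Hochrisiko-KI-System",
               "rejected": ["KI-System mit hoher Wirkung"]},
    },
}

def check_terminology(concept: str, locale: str, text: str) -> list:
    """Return a list of issues found for one concept in one language version."""
    entry = TERM_BASE[concept][locale]
    issues = []
    for variant in entry["rejected"]:
        if variant in text:
            issues.append(f"uses rejected variant: {variant!r}")
    if entry["approved"] not in text:
        issues.append(f"approved term missing: {entry['approved']!r}")
    return issues
```

Run against every language version on each documentation update, a check like this catches exactly the kind of drift described above, where “high-risk” quietly becomes “high-impact” in one market.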
Under the new regulatory frameworks, legal translators do not merely convert text from one language to another: they help shape the formulations that companies and law firms rely on to document their compliance with digital regulations across all territories and jurisdictions in the single market.