RegImpact
EU AI Act · Published 7/30/2025

Overview of the Code of Practice

The Code of Practice offers a clear framework to help developers of General Purpose AI (GPAI) models meet the requirements of the EU AI Act. While providers can choose to follow the Code, they are also free to demonstrate compliance through other appropriate methods. This post provides a concise overview of each Chapter, Commitment, and […]

What this rule actually says

The EU AI Act is a new law regulating AI systems placed on the market or used in the EU. The Code of Practice is the EU's official playbook for how AI builders—especially those making general-purpose models like GPT-style systems—can prove they're complying with it. Following the Code is voluntary, but it's the clearest path to staying legal; builders can also demonstrate compliance through other means if they prefer to do their own thing.

Who it applies to

  • If you're selling or serving users in the EU, this applies to you, even if the company is based elsewhere.
  • If you're building a general-purpose AI model (like a fine-tuned LLM or multi-task foundation model), this is directly relevant. Custom chatbots trained on proprietary data are less likely to trigger it.
  • If you're using AI as a tool (e.g., a medical scribe that uses an off-the-shelf model like Claude or GPT-4), you're probably *not* the primary target—the model provider is. But you may inherit some compliance obligations depending on how you deploy it.
  • High-risk use cases matter more. Medical diagnosis, hiring decisions, or systems that could deny people services face stricter rules than a support chatbot answering FAQs.
  • Your data practices matter: The Code addresses training data, testing data, and how you handle user information once people interact with your system. If you're collecting EU user data, data governance is not optional.

What founders need to do

  1. Check if you're a "provider" under the Act (roughly: you built or released the AI model itself). If you licensed a third-party model and wrapped it, you're less exposed. *(1 day)*
  2. If you are a provider, read the Code's chapters relevant to your use case—focus on transparency, testing, and risk management sections. Skim the full text or find a summary tailored to your sector. *(2–3 days)*
  3. Document what you're already doing: testing procedures, bias checks, user disclosures, data handling. Compliance often means writing down practices you should be doing anyway. *(3–5 days)*
  4. Set up a light compliance check at launch and quarterly: Is your model drifting? Are users in the EU? Are there complaints? Keep a simple log. *(ongoing, ~2 hours/month)*
  5. Get a lawyer's read if you're in high-risk domains (healthcare, employment, financial decisions). A 1-hour consultation now beats a fine later. *(1–2 days to source and schedule)*
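The quarterly log in step 4 doesn't need tooling—a spreadsheet works—but if you'd rather keep it in your repo, here's a minimal sketch. All field names and the `ComplianceCheck` structure are our own invention, not anything the Code prescribes; adapt the fields to whatever your lawyer tells you actually matters.

```python
from dataclasses import dataclass, asdict
from datetime import date
import json

# Hypothetical sketch of a quarterly compliance-log entry.
# Field names are illustrative, not taken from the Code of Practice.
@dataclass
class ComplianceCheck:
    check_date: str
    eu_users: bool          # are users in the EU interacting with the system?
    model_version: str      # which model/version was live this quarter
    drift_observed: bool    # any noticeable change in model behavior?
    complaints: int         # user complaints logged this quarter
    notes: str = ""

def append_check(log_path: str, check: ComplianceCheck) -> None:
    """Append one entry as a JSON line: crash-safe and diff-friendly in git."""
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(check)) + "\n")

# Example quarterly entry (values are made up)
entry = ComplianceCheck(
    check_date=str(date.today()),
    eu_users=True,
    model_version="support-bot-v3",
    drift_observed=False,
    complaints=0,
    notes="No changes to data handling this quarter.",
)
```

One JSON object per line means each quarter's entry is a single git commit you can point to if a regulator ever asks what you were checking and when.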

Bottom line

If you're a solo founder building a support chatbot on top of OpenAI, monitor but don't panic. If you trained and released your own foundation model serving Europeans, act now—start documenting and reading the Code immediately.