
EU AI Act for SMEs (2025): what you must do now
I advise companies that use or build AI. Below is a simple plan for SMEs to prepare for the EU AI Act without wasting time or money.
Why this matters
The EU AI Act entered into force on 1 August 2024, and its obligations apply in phases. If you prepare early, you avoid last-minute costs and keep your projects on track.
What applies when (plain timeline)
- 2 Feb 2025: bans on prohibited practices start.
- 2 Aug 2025: rules for general-purpose AI (GPAI) models and EU-level governance start.
- 2 Aug 2026: most remaining duties apply (including transparency under Article 50 and Annex III high-risk systems).
- 2 Aug 2027: extended deadline for high-risk AI embedded in products already covered by EU product-safety legislation.
Step 1 — Map your AI use (30 minutes)
- Where do you use AI? List tools and vendors (chatbots, scoring, document analysis, vision, etc.).
- Who provides it? A vendor, an open-source model, or an in-house team.
- What is the impact? Does it affect people’s rights (hiring, credit, access to services) or their safety?
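The inventory above can live in a spreadsheet, but a small structured file works just as well and is easy to review each quarter. A minimal sketch (the field names are illustrative; the Act does not prescribe any format):

```python
# Minimal AI-use inventory sketch. Fields mirror the three mapping
# questions above; names and structure are illustrative only.
import json
from dataclasses import dataclass, asdict

@dataclass
class AIUseCase:
    tool: str             # e.g. chatbot, scoring, document analysis
    provider: str         # vendor, open-source model, or in-house
    affects_people: bool  # rights/safety impact (hiring, credit, services)
    notes: str = ""

inventory = [
    AIUseCase("support chatbot", "vendor", False, "customer FAQ only"),
    AIUseCase("CV screening", "vendor", True, "possible Annex III high-risk"),
]

# Dump the inventory as JSON so it can be versioned and shared.
print(json.dumps([asdict(u) for u in inventory], indent=2))
```

Flagging `affects_people` up front makes Step 2 (risk classification) a simple filter over the same list.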
Step 2 — Classify the risk
- Not high-risk: most internal productivity use (drafting, summaries) tends to be lower risk, but keep basic transparency and records.
- High-risk (Annex III): e.g., certain uses in employment, creditworthiness, education or access to essential services. Providers of such systems need a quality management system, risk management, data governance, technical documentation, human oversight measures, post-market monitoring and, in many cases, a conformity assessment. Deployers have lighter but real duties: use the system per its instructions, ensure human oversight, monitor operation and keep records.
- GPAI models: if you provide a general-purpose model, special provider rules apply; if you only use someone else’s model, focus on vendor assurances and your deployer duties.
Step 3 — Transparency and content
- AI interactions: people should know when they interact with an AI system (unless this is obvious from the context).
- AI-generated or altered content (“deepfakes”): plan to mark synthetic media so it can be detected by machines and recognised by people. Build the workflow now; obligations phase in with the broader transparency rules.
- Documentation: keep simple records of what you used, why, data sources and human review.
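To make the "machine-detectable" idea concrete, one lightweight workflow is a provenance sidecar file written next to each AI-generated asset, combining the marking and record-keeping points above. This is a sketch only: every name here (`mark_ai_generated`, the record fields) is an illustrative assumption, and production systems should look at established standards such as C2PA / Content Credentials rather than rolling their own format.

```python
# Sketch: write a machine-readable provenance record next to a generated
# file. Illustrative only; the AI Act does not prescribe this format, and
# established standards (e.g. C2PA) exist for production use.
import hashlib
import json
import tempfile
from datetime import datetime, timezone
from pathlib import Path

def mark_ai_generated(asset: Path, model: str, reviewer: str) -> Path:
    """Write '<asset name>.provenance.json' declaring the asset AI-generated."""
    record = {
        "asset": asset.name,
        "sha256": hashlib.sha256(asset.read_bytes()).hexdigest(),
        "ai_generated": True,
        "model": model,            # which system produced the content
        "human_review": reviewer,  # who checked it before release
        "created": datetime.now(timezone.utc).isoformat(),
    }
    sidecar = asset.with_name(asset.name + ".provenance.json")
    sidecar.write_text(json.dumps(record, indent=2))
    return sidecar

# Usage: mark a generated draft and keep the sidecar with the file.
with tempfile.TemporaryDirectory() as tmp:
    draft = Path(tmp) / "press_release.txt"
    draft.write_text("AI-generated draft ...")
    print(mark_ai_generated(draft, model="example-model", reviewer="j.doe").name)
```

The hash ties the record to an exact file version, and the `human_review` field doubles as the simple record of human review mentioned above.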
Step 4 — Vendors and contracts (quick wins)
- Ask for: model card/technical sheet, training-data info, known limitations, safety tests, and EU AI Act compliance roadmap.
- Put it in the contract: duty to inform about material changes, security incidents, and non-compliance; right to audit/assurance reports; cooperation for transparency labels and user notices.
- Security & privacy: align with your GDPR and information-security controls (access, retention, logs).
Step 5 — Governance for SMEs (lightweight)
- One owner: name a practical AI contact in the company (can be the DPO or compliance lead).
- One policy page: who can use which tools, where to store outputs, when to involve a human reviewer.
- Training: brief managers on acceptable use, bias risks and escalation paths.
- Sandbox & guidance: consider regulatory sandboxes and official guidance aimed at SMEs.
Fines (what SMEs should know)
Severe breaches (e.g., prohibited practices) can trigger fines of up to EUR 35 million or 7% of worldwide annual turnover. The Act provides proportional treatment for SMEs and start-ups, capping their fines at the lower of the two amounts, but do not rely on that: fix issues early.
Common pitfalls I see
- “We only use off-the-shelf tools, so we are outside scope.” Deployers still have duties (transparency, monitoring, records).
- Missing the timeline. Work back from August 2026; sort prohibited uses and transparency now.
- No vendor evidence. Contracts without documentation duties make compliance and audits hard.
- Security gap. AI outputs end up in uncontrolled drives or chats—fix with simple rules.
My 6-point starter kit for SMEs
- Inventory of AI tools and use cases; mark possible high-risk ones.
- Choose one transparency workflow for AI-generated content (media, docs, images).
- Add an AI clause to vendor contracts (assurances + notifications).
- One-page internal policy; short training for managers.
- For high-risk cases: start the documentation pack and human-oversight plan.
- Review quarterly: update the inventory and close open fixes.
FAQs
We are not building models. Do we still have duties?
Yes. As a deployer, you still handle transparency, records and safe use. For high-risk use cases, more duties can apply.
When do we need to label AI-generated content?
Plan the labelling workflow now. Transparency duties form part of the 2026 phase-in; some GPAI and governance rules start in 2025.
Are fines different for SMEs?
Fines can be large, but the Act provides proportional treatment for SMEs and start-ups. Good preparation reduces risk and cost.
Call to action
Schedule a personal consultation. I will map your use cases, draft lightweight controls, align your vendor contracts, and set a clear path to compliance.
Further reading (official)
- European Commission — AI Act timeline and overview
- EU AI Act — Implementation timeline (key dates)
- Council of the EU — Final adoption (press release)
Disclaimer: This article provides general information and does not replace individual legal advice.