Why the CRA now belongs on the executive agenda
If your company sells a connected product — a machine with a cloud back end, an IoT sensor, a mobile app, a back-end service, a SaaS platform with customers in the EU — then EU Regulation 2024/2847, the Cyber Resilience Act (CRA), defines what your product must do from 11 December 2027 to remain on the EU market. This is not a technical detail. It is market access, liability, and penalties up to €15 million or 2.5 % of global annual turnover.
The regulation entered into force on 10 December 2024 and applies in stages. The reporting obligations under Article 14 take effect on 11 September 2026 — at the time of writing, only months away. Full applicability of all requirements follows on 11 December 2027. The decisions you make as management, IT leadership, or product owner in 2026 determine whether you ship on time or scramble under deadline pressure. Either is possible. Only one is affordable.
This article maps what the CRA actually requires — based on the consolidated text on EUR-Lex and the supplementary Q&A from the European Commission — and shows which decisions belong on the table now and which investments follow. Not a legal guide. A decision aid for the people accountable — with enough technical depth to evaluate the effort estimates your engineering team will put in front of you.
What the CRA actually requires
The CRA sets horizontal cybersecurity requirements for "products with digital elements" placed on the EU market. The scope is wide: hardware with software, standalone software, components — anything that can communicate with other devices or networks via a data or network connection, or is intended to.
Essential requirements from Annex I
Annex I has two parts. Part I covers product properties: secure by design and secure by default, minimal attack surface, protection against unauthorised access, confidentiality of stored and transmitted data, integrity of code, stored data, and configuration, data minimisation, availability, resilience against denial-of-service attacks, logging of security-relevant activity, and secure update mechanisms. Part II covers vulnerability handling across the full support period: identification and documentation of components and vulnerabilities, explicitly including an SBOM in a machine-readable format; timely security updates; and coordinated vulnerability disclosure.
Reporting obligations under Article 14
From 11 September 2026, manufacturers must report actively exploited vulnerabilities and severe security incidents through ENISA's Single Reporting Platform. The deadlines are tight:
- Early warning: within 24 hours of becoming aware
- Incident notification: within 72 hours of becoming aware
- Final report: within 14 days of a corrective measure being available for vulnerabilities, within one month for severe incidents
The architectural meaning of those deadlines is concrete: you need telemetry and tooling that escalates a security-relevant event to your team in a form that supports a substantive first report within hours — not days. A logfile that rolls every 24 hours and nobody reads will not do it.
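To make the clock concrete, here is a minimal sketch deriving the first two Article 14 deadlines from the moment of awareness. The helper name and return shape are my own, not from the regulation; the final-report deadline is omitted because it runs from the availability of a corrective measure, not from awareness.

```python
from datetime import datetime, timedelta

def article14_deadlines(aware_at: datetime) -> dict:
    """Derive the Article 14 reporting deadlines from the moment of awareness.

    Sketch only: the final report (14 days / one month) is keyed to the
    corrective measure or incident notification, so it is not derived here.
    """
    return {
        "early_warning": aware_at + timedelta(hours=24),
        "incident_notification": aware_at + timedelta(hours=72),
    }

aware = datetime(2026, 9, 14, 9, 30)
deadlines = article14_deadlines(aware)
print(deadlines["early_warning"])  # 2026-09-15 09:30:00
```

The point of writing it down this way: "becoming aware" starts the clock, so the timestamp of awareness must itself be captured and auditable.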
Conformity assessment and CE marking
Before placing a product on the market, manufacturers must perform a conformity assessment, maintain technical documentation, issue an EU declaration of conformity, and affix CE marking. The required depth of assessment depends on the product class — more on that shortly.
Support period
The manufacturer sets a support period — at least five years, or the typical product lifetime, whichever is longer. During that period, vulnerabilities must be identified and remediated, security updates provided, and technical documentation kept current. Buyers must be informed of the support end date at the time of purchase, at minimum to month and year.
Who is in scope — and who is not
"Products with digital elements" is deliberately broad. Build an industrial machine with a cloud back end and you are in. Distribute a mobile app that talks to a back end and you are in. Write firmware for an IoT device and you are in. Operate a SaaS platform whose software is part of a product placed on the EU market and you are in.
The main carve-outs cover sectors regulated by their own product-specific frameworks — medical devices (MDR/IVDR), motor vehicles (UN R155/R156), civil aviation, certain marine equipment — and products developed exclusively for national security or defence. Pure SaaS without a product character is not directly covered, but often falls under NIS2 instead.
Four product categories, three assessment paths
The CRA structures products into four categories with different conformity-assessment requirements:
- Default class — the bulk of products. Self-assessment is permitted regardless of which standards are used.
- Important products class I (Annex III) — for example identity management systems, browsers, password managers, VPNs, routers, SIEM tools, smart locks, home automation with security functions. Self-assessment is permitted only if harmonised standards, common specifications, or a European cybersecurity certification scheme are applied — otherwise, third-party assessment via a notified body is required.
- Important products class II (Annex III) — for example hypervisors, firewalls, intrusion detection systems. Third-party assessment is mandatory.
- Critical products (Annex IV) — for example smart meter gateways, hardware security modules, smartcards with security elements. European cybersecurity certification through a notified body is mandatory.
For most mid-sized (Mittelstand) manufacturers of connected industrial products, the default class is the starting point. But anyone building security-relevant features — authentication, encryption, access control as a product capability — should check early whether they fall into Annex III. That changes the conformity-assessment effort substantially.
Architecture implications — where this actually lands
The CRA is not a compliance exercise you can paper over with a PDF at the end. It changes what a connected platform must do. Five points matter from an architecture perspective.
SBOM means: build pipeline plus vulnerability scanning
Annex I, Part II, paragraph 1 requires manufacturers to identify and document the components and vulnerabilities in their products — at minimum at the level of top-level dependencies, in a commonly used and machine-readable format. The regulation does not mandate a specific format; in practice, SPDX and CycloneDX are the established options. The Commission may issue further specifications via delegated acts.
The architectural consequence: the SBOM cannot be a spreadsheet that someone assembles before an audit. It must become part of the build output. Concretely: a CI step that generates the SBOM at build time (Syft, the CycloneDX plugins for Maven, Gradle, or npm, Trivy with `--format cyclonedx`); a step that scans it against vulnerability databases (Grype, OSV-Scanner, Trivy); a versioned archive of SBOMs per release that is available on request from market surveillance authorities. Software composition analysis stops being optional and becomes part of the build infrastructure's data model and API surface.
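A minimal sketch of the quality-gate step, assuming a CycloneDX JSON SBOM and a simplified scanner result. The `findings` shape (one dict with `component` and `severity` per finding) is illustrative, not the actual Grype or Trivy output schema.

```python
import json

def sbom_gate(sbom_json: str, findings: list[dict],
              block_severities=("critical",)) -> list[str]:
    """Return SBOM components with blocking vulnerabilities.

    `findings` is a simplified scanner result -- a stand-in for real
    Grype/Trivy output. A non-empty return value fails the build.
    """
    components = {c["name"] for c in json.loads(sbom_json).get("components", [])}
    return sorted(
        f["component"] for f in findings
        if f["severity"].lower() in block_severities and f["component"] in components
    )

sbom = json.dumps({"components": [{"name": "libexpat"}, {"name": "zlib"}]})
findings = [{"component": "libexpat", "severity": "Critical"},
            {"component": "zlib", "severity": "Low"}]
assert sbom_gate(sbom, findings) == ["libexpat"]  # build fails on libexpat
```

In a real pipeline this logic lives in the scanner's own policy configuration; the sketch only shows where the decision sits: in the build, not in a pre-audit spreadsheet.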
Vulnerability handling means: hotfix path, audit trail, versioning
Vulnerabilities must be handled across the entire support period — at least five years, often longer. Two architecture consequences are commonly underestimated.
First, you need a hotfix path that does not run through the full release-train process. When a critical vulnerability appears in a library you depend on, you have to patch, sign, roll out, and inform customers — quickly. An architecture that pushes every update through a six-week QA cycle collides with the operational reality of this regulation.
Second, you need an audit trail of which version of a product contained which components, when a vulnerability became known, when it was remediated, and when an update reached the end customer. Append-only event logs in the platform database — per device or product instance — are a robust pattern for this. The lifecycle data model we built at LITE BLOX, with a birth snapshot per unit, covers exactly this class of requirement.
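A sketch of the append-only pattern, here with hash chaining for tamper evidence. Class and field names are illustrative; in production this would sit behind the platform database, not an in-memory list.

```python
import hashlib, json
from datetime import datetime, timezone

class DeviceEventLog:
    """Append-only, hash-chained event log per device instance (sketch).

    Each entry carries the hash of its predecessor, so any retroactive
    edit breaks the chain and is detectable on verification.
    """
    def __init__(self, device_id: str):
        self.device_id = device_id
        self.entries: list[dict] = []

    def append(self, event_type: str, payload: dict) -> dict:
        prev = self.entries[-1]["hash"] if self.entries else "genesis"
        body = {
            "device_id": self.device_id,
            "ts": datetime.now(timezone.utc).isoformat(),
            "type": event_type,
            "payload": payload,
            "prev": prev,
        }
        body["hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(body)
        return body

    def verify(self) -> bool:
        prev = "genesis"
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if e["prev"] != prev or e["hash"] != hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest():
                return False
            prev = e["hash"]
        return True

log = DeviceEventLog("unit-0042")
log.append("vulnerability_known", {"cve": "CVE-2026-0001"})
log.append("update_installed", {"version": "2.3.1"})
assert log.verify()
```

The chain answers the audit question directly: which events, in which order, and whether anyone has rewritten history since.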
Incident reporting means: real-time telemetry and a clear escalation path
24 hours is not a lot. "Actively exploited" means you have evidence the vulnerability is being exploited in the wild — not that the damage is already done. To recognise that within hours, you need:
- Runtime instrumentation that surfaces anomalies — Sentry for application errors, Grafana/Prometheus or equivalent for metrics, structured logging with correlation IDs for forensics
- Alerting that escalates to an on-call engineer — not an email read on Monday morning
- A documented incident-response procedure that defines who confirms and submits the ENISA notification, in what form, with what template
The ENISA Single Reporting Platform is scheduled to be operational by 11 September 2026. Until then, a dry run on a fictitious incident pays off: who would do what, in which order, with what data on hand?
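The plumbing behind the three points above can be sketched in a few lines: a structured event with a correlation ID and an explicit escalation route. The routing rule and field names are illustrative, not a prescribed scheme.

```python
import uuid, json, logging

logging.basicConfig(level=logging.INFO, format="%(message)s")
logger = logging.getLogger("security")

def security_event(event_type: str, severity: str, detail: dict) -> dict:
    """Emit a structured security event and decide the escalation route.

    Sketch: anything that could start the 24-hour Article 14 clock pages
    the on-call engineer; everything else goes into the ticket queue.
    """
    record = {
        "correlation_id": str(uuid.uuid4()),  # ties logs, alert, and report together
        "type": event_type,
        "severity": severity,
        "detail": detail,
        "route": "page_oncall" if severity in ("critical", "high") else "ticket",
    }
    logger.info(json.dumps(record))
    return record

evt = security_event("auth_bruteforce_detected", "high", {"source_ip": "203.0.113.7"})
assert evt["route"] == "page_oncall"
```

The correlation ID is the detail that pays off later: the same identifier appears in the forensic logs, the alert, and the draft of the ENISA notification.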
Secure by default is not "compliance by design"
The regulation requires products to ship with a secure default configuration; authentication, identity and access control to follow the state of the art; data to be protected at rest and in transit; only data necessary for the intended purpose to be processed. None of this is new — but it must be anchored in data-model schemata, authentication layers, and API contracts. In concrete terms:
- No default passwords — first-time setup forces credential creation
- TLS 1.2 or higher as the floor on every external interface, mTLS for device-to-back-end
- Secrets management as a first-class architectural component (HashiCorp Vault, cloud KMS, or — on owned infrastructure — a dedicated secret store), not environment variables in the repository
- Role and permission model visible in the data model — not as a post-hoc filter in the UI layer
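What "visible in the data model" means for the last point, as a minimal sketch. The role names and permission strings are placeholders; the pattern is that the authorisation decision lives on the model, not in a UI filter.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Role:
    name: str
    permissions: frozenset[str]

@dataclass
class User:
    user_id: str
    roles: list[Role] = field(default_factory=list)

    def can(self, permission: str) -> bool:
        # authorisation is decided here, in the data model --
        # not filtered after the fact in the UI layer
        return any(permission in r.permissions for r in self.roles)

VIEWER = Role("viewer", frozenset({"telemetry:read"}))
ADMIN = Role("admin", frozenset({"telemetry:read", "config:write", "firmware:update"}))

u = User("u-1", [VIEWER])
assert u.can("telemetry:read")
assert not u.can("config:write")
```

Because the check is a model method, every API endpoint and every background job calls the same logic, and the permission set is auditable as data.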
Three regulatory currents, one architecture
The CRA does not stand alone. It meets the EU Data Act (applicable since 12 September 2025), which gives users access to the data their connected products generate, and the GDPR, which governs what may happen to personal data. Three regulations reaching into the same data model — field telemetry, usage data, maintenance logs — and producing different obligations. A platform that stitches them together after the fact accumulates debt. A platform that tracks data origin, purpose binding, retention policy, and access rights as fields in the model can answer all three without reinventing itself.
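One way to carry those obligations as fields in the model, sketched below. The field names are illustrative; the idea is that origin, purpose binding, retention, and access rights travel with each data field rather than living in a separate compliance document.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class DataField:
    """Governance metadata carried alongside each data field (sketch).

    origin / purpose / retention / access answer CRA, Data Act, and
    GDPR questions from the same record.
    """
    name: str
    origin: str            # e.g. "device_telemetry", "user_input"
    purpose: str           # purpose binding (GDPR Art. 5)
    retention_days: int    # retention policy
    user_accessible: bool  # Data Act access right
    personal_data: bool    # triggers GDPR handling

    def expires_on(self, collected: date) -> date:
        return collected + timedelta(days=self.retention_days)

f = DataField("vibration_rms", origin="device_telemetry",
              purpose="predictive_maintenance", retention_days=730,
              user_accessible=True, personal_data=False)
assert f.expires_on(date(2026, 1, 1)) == date(2028, 1, 1)
```

With this in place, "which fields must we export for the user under the Data Act?" and "which fields must be deleted after the retention period?" become queries, not projects.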
What you should actually build in 2026
Anyone setting up a new connected platform today — or refactoring an existing system — should treat the following building blocks as 2026 architecture goals:
- SBOM generation in the CI pipeline on every release build, with automated scanning against NVD/OSV and a quality gate that blocks known-critical vulnerabilities
- Append-only event log per product or device instance for security-relevant events — auth attempts, configuration changes, update installations, incident references
- Hotfix release channel that ships independently of the regular release cycle, with signed updates and a rollback path
- Secrets management as a first-class component, separate from application config, with auditable access control
- Runtime instrumentation (Sentry for application errors, Grafana for metrics or equivalent) coupled to an on-call escalation channel
- Vulnerability disclosure policy published on the product page, with a security@ address or a HackerOne / Open Bug Bounty channel — the CRA requires a coordinated disclosure process, not the heroic one-off
- Documented support end dates per product or hardware generation, visible in the sales and onboarding flow
None of these is a new concept on its own. What the CRA changes is the binding force: what used to be best practice becomes a regulatory obligation — backed by market surveillance, fines, and potential market withdrawal for non-compliance.
OSS and third-party libraries
Open-source software is not blanket-exempt under the CRA but is governed by its own logic. A maintainer who publishes free software without commercial activity — the classic GitHub repo author — does not fall under manufacturer obligations. The regulation does, however, introduce a new role: the open-source software steward. Stewards are legal persons that systematically and on a sustained basis support the development of OSS intended for commercial activity (typically foundations such as Apache, Eclipse, or the Linux Foundation). Stewards have a reduced obligation set — a documented cybersecurity policy, reporting of actively exploited vulnerabilities, cooperation with market surveillance — and are explicitly not subject to administrative fines.
For you as a commercial manufacturer, that does not mean "OSS = safe". If you embed an OSS library in your product and place that product on the EU market commercially, you are responsible for the security of the resulting product — including its OSS components. The practical consequence for supply chain and SBOM: you must know what is inside, must track vulnerabilities, must patch or replace. The OSS community supplies the material — the responsibility for the finished product stays with you.
Penalties and enforcement
The CRA defines a three-tier administrative fine regime (Article 64). In each case, the higher of the two figures applies, measured against the previous financial year's turnover:
- Up to €15 million or 2.5 % of global annual turnover — for non-compliance with the essential requirements of Annex I and the manufacturer core obligations under Articles 13 and 14 (vulnerability handling and reporting)
- Up to €10 million or 2 % — for other CRA obligations, including those of importers, distributors, and conformity assessment
- Up to €5 million or 1 % — for incorrect, incomplete, or misleading information supplied to authorities or notified bodies
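The "whichever is higher" rule is simple enough to state as code; the numbers below are the tier-one figures from Article 64, and the function name is my own.

```python
def max_fine(tier_cap_eur: int, turnover_share: float,
             global_turnover_eur: int) -> int:
    """Article 64 fine ceiling: the higher of the fixed cap and the
    turnover share of the previous financial year applies."""
    return max(tier_cap_eur, int(global_turnover_eur * turnover_share))

# Tier 1: EUR 15 million or 2.5 % of global annual turnover
assert max_fine(15_000_000, 0.025, 400_000_000) == 15_000_000    # cap dominates
assert max_fine(15_000_000, 0.025, 1_000_000_000) == 25_000_000  # turnover dominates
```

For any company with more than EUR 600 million in turnover, the percentage figure, not the fixed cap, sets the tier-one ceiling.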
Open-source stewards are explicitly exempt from administrative fines. Microenterprises and small enterprises cannot be sanctioned for missing the 24-hour early-warning deadline. In Germany, market surveillance falls to the BSI — the federal government has tabled the implementing legislation that designates the BSI as the central market surveillance and notifying authority. The BSI will also provide awareness and training for SMEs and OSS stewards.
How we approach this
We do not sell "CRA-compliance-as-a-service". Compliance is not a product — it is the consequence of a cleanly built architecture. What we do: we build platforms for connected products where the obligations from the CRA, EU Data Act, and GDPR are not retrofitted but anchored in the data model and the API layer. Append-only event log per product instance, SBOM in the build pipeline, signed updates, audit trail in tamper-evident form — as architectural patterns we have implemented in production.
The B2B IoT project LITE BLOX is exactly the use case where these obligations land: connected industrial units in the field, lifecycle data model with a birth snapshot per unit, continuous telemetry, update path. If you are responsible for a comparable initiative — a new platform or a refactor of an existing system — we are happy to talk architecture before the spec sheet hits the first sprint.
Resources
- EU Regulation 2024/2847 (Cyber Resilience Act), consolidated text: eur-lex.europa.eu/eli/reg/2024/2847
- European Commission, Q&A on the Cyber Resilience Act: digital-strategy.ec.europa.eu/en/policies/cyber-resilience-act
- European Commission, summary of the legislative text: digital-strategy.ec.europa.eu/en/policies/cra-summary
- European Commission, open source under the CRA: digital-strategy.ec.europa.eu/en/policies/cra-open-source
- BSI, Cyber Resilience Act in Germany: bsi.bund.de — Cyber Resilience Act
- ENISA, technical cybersecurity guidance: enisa.europa.eu
As of May 2026. The interpretation of individual articles may change through implementing acts, harmonised standards (standardisation request M/606), and national implementing legislation. For legally binding assessment of your specific case, please consult qualified legal counsel.
IntegrIT Solutions
Your Partner for High-Quality Mobile Applications
Email: info@integritsol.de
About IntegrIT Solutions
IntegrIT Solutions is your specialized software agency for developing performant mobile applications. With solid experience in developing business apps for B2B clients, we combine technical competence with business understanding. Our apps are reliable, user-friendly, and deliver measurable business results.
