Klaus Elk Books

Embedded Cyber Security

What to protect? - and concrete developer actions

Cyber security is a complex subject - filled with heavy math, but also with a lot of compliance and regulation. Before opening the latter - sometimes boring - box, let's take a look at what we want to protect. I use a lot of terms here; many of them are - to a degree - explained in a table at the bottom of this page.

What do we want to do?

Which processes does the above lead to?

Testing & verification pipeline:

Acceptance criteria examples (testable):

Prioritization (MVP / Phase 2 / Long-term):

Common pitfalls (summary):

Developer checklist:

Terms and acronyms used above

AEAD: Authenticated Encryption with Associated Data. The modern way to both encrypt and authenticate messages safely.
AES-GCM: Advanced Encryption Standard in Galois/Counter Mode. AES superseded DES and triple-DES; GCM turns it into an AEAD cipher.
CA: Certificate Authority. An organization serving as a common "root of trust". Some are installed with your OS and/or browser.
ChaCha20-Poly1305: ChaCha20 stream cipher combined with Poly1305 message authentication.
CERT: Computer Emergency Response Team (also refers to the SEI CERT coding standards).
CoAP: Constrained Application Protocol. UDP-based protocol for resource-constrained devices.
CRC: Cyclic Redundancy Check. Can catch memory overwrites, transmission bit errors etc. - but is not security relevant. See HMAC.
CRL: Certificate Revocation List. List of certificates that are no longer valid.
CVE: Common Vulnerabilities and Exposures. Specific issues found in e.g., libraries. See more below.
DAST: Dynamic Application Security Testing (dynamic analysis). Black-box tools doing e.g., SQL injection.
DTLS: Datagram Transport Layer Security. Does for UDP what TLS does for TCP.
ECDHE: Elliptic Curve Diffie-Hellman Ephemeral. Key exchange providing Forward Secrecy. Used in BLE and many other protocols.
FIPS: Federal Information Processing Standards. Computer security standards from NIST.
HMAC: Hash-based Message Authentication Code. Based on a symmetric shared key. Ensures integrity and authenticity while the message stays in plaintext.
HSM: Hardware Security Module. Hardware server that generates, stores and manages keys and signatures. May be used in production to provision individual certificates in devices.
JTAG: Joint Test Action Group. Interface and protocol for debugging and testing chips on a PCB.
JWT: JSON Web Token. Format for representing security claims.
MCU: Microcontroller Unit.
MISRA: Motor Industry Software Reliability Association (coding guidelines). See more below.
mTLS: Mutual TLS (Mutual Transport Layer Security).
OCSP: Online Certificate Status Protocol. A way of establishing whether a given certificate is revoked.
OSCORE: Object Security for Constrained RESTful Environments. A lightweight secure IoT protocol for REST.
OTP: One-Time Programmable (memory/fuse) - context-dependent. Often used in production for e.g., serial numbers.
OTA: Over-The-Air (updates). Wireless firmware updates.
PRNG: Pseudo-Random Number Generator. Not as random as a TRNG.
PSK: Pre-Shared Key. The key is shared via other media - e.g., typed by a user or scanned from a QR code.
RNG: Random Number Generator. On a computer, 'random' is surprisingly difficult.
RTC: Real-Time Clock. Hardware clock with calendar info. Normally battery driven.
RTOS: Real-Time Operating System. Discussed in many places on this site.
SAST: Static Application Security Testing (static analysis). Aka Static Code Analysis.
SBOM: Software Bill of Materials. List of modules and libraries - with versions - used in a given release.
SCA: Software Composition Analysis. A way to generate an SBOM.
SNI: Server Name Indication. The client tells the server which hostname it addresses. Enables hosting several sites on one IP address.
SWD: Serial Wire Debug. Two-wire ARM standard for debugging microcontrollers.
TLS: Transport Layer Security. Modern form of SSL that enables web-based privacy.
TPM: Trusted Platform Module. Hardware solution for secure boot and encryption.
TRNG: True Random Number Generator. More random than pseudo-random generators.

Codebase processes

Many of the above "protect"-actions can be implemented like "normal" features - e.g., the use of signed certificates and various keys in protocol handshakes. Some processes, however, are embedded in the daily work and are meant to protect the codebase. These are what we focus on now.

The figure below shows most of the processes an embedded developer may be involved with when it comes to protecting the codebase.

The left column in the figure is the main input. Standards and guidelines may be real input, whereas requirements in many organizations change along the way - hence the dashed arrow from the design activity. The center column is where many developers will spend most of their time. The right column contains output - such as documentation and source, but also activities related to vulnerabilities.
Workflow - seen from an R&D point of view

Vulnerabilities

A vulnerability is an issue in your product that may be exploited by hackers - or triggered accidentally by normal use. Traditionally, in medical and transportation, "safety" has been the major (only) risk considered for users. However, these days all sectors need to (also) consider security. There are many different kinds of vulnerabilities:

Vulnerabilities are not in the center column of the figure above, because dealing with them often requires a lot of work from QA, marketing, sales and sometimes top management. "Vulnerability Disclosure" is about informing users about the issue - should they put it on a shelf until there is a patch? Are there workarounds? When will there be a patch - if ever? And so on. "Vulnerability Reporting" is targeted at authorities - including updating the various vulnerability databases, in case someone uses your product in theirs. The term "CVE" means Common Vulnerabilities and Exposures and refers to the databases that register these issues.

Weaknesses

Examples of weaknesses are "Buffer Overflow", "Memory Leakage", "Null-Pointer Dereference" etc. These are classic problems that most of us have met in our careers. There are more or less advanced tools that can scan your software for these kinds of errors. This is not a test in the target product, but a scan of the source code on a PC. It is called "Static Code Analysis".

The term "CWE" means Common Weakness Enumeration. It is an eternal source of confusion that CWE and CVE look and sound almost the same, and deal with issues in the same domain - yet are very different.

Secure Coding Standards

For most software developers, the "Coding" phase in the above figure is where we want to be. Here, too, we see compliance demands. Many organizations require coding guidelines, normally inspired by CERT, MISRA or OWASP.

The right-side menu has links to the above three organizations. Note that while CERT and OWASP are easy to browse, MISRA charges money for their documents. OWASP has an interesting Top 10 based on user input:

Broken Access Control: Users are allowed to act outside their intended permissions.
Cryptographic Failures: Missing or weak (homegrown) cryptography, old hash functions like MD5 (or even worse: CRC), insufficient randomness.
Injection: SQL or (Unix-style) command injection as well as Cross-Site Scripting (XSS).
Insecure Design: Use threat modeling, secure design patterns and reference architectures.
Security Misconfiguration: Highly configurable components are nice - but we need to know how to use them.
Vulnerable and Outdated Components: These are the CVEs mentioned above.
Identification and Authentication Failures: Used to be called "Broken Authentication".
Software and Data Integrity Failures: Insecure software updates and CI/CD pipelines.
Security Logging and Monitoring Failures: Do log failed login attempts etc. (and throttle), but do not disclose debug information to users.
Server-Side Request Forgery: Never trust URLs etc. from users.

I recently participated in a training program from Secure Code Warrior - covering all the above guidelines and rules. It was surprisingly interactive and educational.

Standards and Requirements

I asked ChatGPT to provide an overview of the involved standards - and what they enforce. The output below (except for my own short descriptions) was the response. I am not sure it's much help...

The table had one column per standard:

EU: Cyber Resilience Act (CRA)
EU: Radio Equipment Directive, Delegated Reg. 2022/30 (RED DA)
EU: NIS2 (Directive 2022/2555)
US: IoT Cybersecurity Improvement Act (2020) (federal procurement)
US: FCC U.S. Cyber Trust Mark (IoT label)
US: FDA FD&C 524B (medical devices)
US: OMB M-22-18 / M-23-16 + CISA Attestation (federal software)
US: California SB-327 (consumer IoT)

For each practice below, the first line combines the row label with my short description; the paragraph after it contains ChatGPT's notes for the standards, in the order listed above (not every standard received a note).

Static code analysis (SAST): software is scanned at source level - no running target.
Risk-based, not prescriptive; part of secure development expected. Not prescriptive; meet essential requirements (e.g., network protection) - methods up to manufacturer. Not prescriptive for products; focuses on org. risk mgmt. Not prescriptive; follows NIST guidance for federal IoT buys. Encouraged via NISTIR 8425 conformance; not named explicitly. Expected as part of secure design evidence; not mandated by name. Align with NIST SSDF practices; attestation of controls, not tool-specific.

Dynamic testing (DAST): classic tests - unit/integration/system.
Risk-based; acceptable means to verify requirements. Not prescriptive; verification approach is up to manufacturer. Not prescriptive for products. Not prescriptive; NIST guidelines inform testing. Implied in evaluation against criteria; not required by name. Expected as part of verification evidence; not mandated by name. Attest to secure dev practices; tool-agnostic.

Software Composition Analysis (SCA) / SBOM: documenting SW libraries etc.
SCA helpful; SBOM not strictly mandated in CRA text; vulnerability handling required. Not prescriptive; component security must be managed. Not product-focused; supports CVD eco-system. Encouraged via NIST guidelines; not universal requirement. Often part of program criteria/registry details; not universally required. SBOM explicitly required in submissions for 'cyber devices'. SBOM/artifacts may be requested; attestation required (SSDF aligned).

Fuzz testing: communication interfaces tested with illegal packets.
Risk-based; not prescriptive. Optional technique to meet criteria. Useful evidence; not expressly mandated. Optional per supplier practice; not required.

Threat modeling & secure-by-design: architecture analysis & traceability.
Yes: security by design & risk assessment expected (Annex I). Yes: meet essential requirements via risk analysis. Yes: risk management measures for in-scope entities. Encouraged by criteria; not named explicitly. Yes: risk assessment and cybersecurity plan required. Yes: NIST SSDF-aligned practices in attestation.

Vulnerability scanning (infra/app): comparing the SBOM with databases of known vulnerabilities in components used.
Part of vulnerability handling lifecycle; not prescriptive. Implied by essential requirements. Org-level measure; not product-specific. Often part of evaluation; not mandatory by name. Expected as part of monitoring & maintenance. Tool-agnostic; control attestation.

Coordinated Vulnerability Disclosure (CVD) policy: rules for informing users about vulnerabilities.
Explicitly required for manufacturers. Implied manufacturer responsibilities for vulnerabilities. Explicitly addressed: Member States designate CVD coordinators. Encouraged in NIST guidance and federal procurement baselines. Required/encouraged through program criteria based on NISTIR 8425. Expected policy/process for postmarket handling. Vulnerability disclosure practices aligned to SSDF; attest.

Incident & exploited-vuln reporting to authorities: rules for informing authorities about exploited vulnerabilities.
Yes: report exploited vulns & severe incidents via ENISA platform; 24h initial, follow-ups. Not applicable (no central authority reporting). Yes: entity incident reporting to national CSIRTs (not product manufacturers per se). Not applicable outside federal procurement; no central reporting. No authority reporting; consumer labeling program. Yes: submission content & postmarket expectations; engage FDA as required. No central reporting; agencies collect attestations.

Secure update & patching mechanism: in-field updates of software.
Yes: security updates during support period required. Yes: protection against fraud/network harm implies update & patching capability. Org-level continuity; not product mandate. Patchability emphasized in NIST guidance for federal IoT. Yes: program criteria include update policy/capabilities. Yes: secure update mechanisms and maintenance plan required. Addressed through secure development/maintenance practices. Implied only via 'reasonable security'; not explicit.

Logging/monitoring & telemetry: collecting data on running products.
Required to detect/respond proportionately (risk-based). Implied by essential requirements. Yes: org-level detection & response measures. Program criteria expect baseline logging/telemetry. Yes: monitoring and logging addressed in guidance. Covered under SSDF practice areas; attest.

Cryptography & secure communications: protecting data in transit & at rest.
Yes: state-of-the-art protection for data & comms. Yes: explicitly protect network & privacy. Guidance-driven (no hardcoded creds, secure comms). Yes: criteria include secure comms & data protection. Yes: requirements for encryption/authentication. Addressed via SSDF-aligned practices; not prescriptive. Yes: effectively bans default passwords; implies stronger auth.

Secure default configuration (e.g., no default passwords): role-based access on a need-to-do basis.
Yes: secure-by-default expectation; ban on known-insecure defaults. Yes: measures to prevent harm/fraud & protect privacy imply secure defaults. Org-level; not product-specific. Encouraged in NIST baselines (no hard-coded creds). Yes: criteria expect strong default posture. Yes: secure configuration expectations in submissions. Yes: attest to secure configuration practices. Explicit: unique passwords / no universal default passwords.

Penetration testing: using white-hat hackers.
Not mandated; best practice. Often expected/recommended; not mandated.

Security documentation / technical file:
Yes: technical documentation incl. support period info. Yes: demonstrate conformity to essential requirements. Policy/procedure documentation at org level. Documentation per NIST guidance for procurement. Documentation required for label registry/QR details. Yes: submit cybersecurity documentation incl. SBOM. Yes: supplier attestation & artifacts repository. Not specified beyond general compliance.

Supplier attestation to secure development practices: documentation on all the above.
EU Declaration of Conformity & CE marking; not SSDF attestation. Conformity assessment to RED DA; possible Notified Body. Not relevant (operational directive). Vendors must meet NIST baselines to sell to federal gov. Third-party evaluation against program criteria. Regulatory submission demonstrates compliance. Yes: mandatory secure software development self-attestation to agencies.

© 2025 KlausElk.com & ElkTronic.dk