
2026-03-10
In this #KEYMASTER session, Sven Rajala sat down with Chief PKI Officer Tomas Gustavsson to unpack a deceptively dry but critically important topic: standards in PKI, and what happens when they go wrong.
Framed as the good, the bad, and the ugly, the discussion explored how deviations from well-established standards can create real-world security, interoperability, and maintenance problems.
Public Key Infrastructure (PKI) operates at a massive global scale. The standards it relies on, such as RFC 5280, have earned trust because they are mature, widely implemented, and extensively reviewed.
When systems follow these shared standards, they benefit from stable libraries, predictable behavior, and strong security guarantees. The moment an organization decides to "tweak" or reinterpret those standards, it steps onto much shakier ground.
As Tomas explains, violating standards almost always leads to custom code. Custom code is expensive to build, harder to maintain, less tested, and more likely to contain security flaws. Worse, it often locks organizations into brittle solutions that don’t age well.
The conversation highlights several concrete cases where industry or vertical standards conflict with core PKI specifications. Each of these conflicts increases the likelihood of interoperability failures and forces libraries to grow more complex, raising the risk of bugs and vulnerabilities.
Conflicting standards don't just create theoretical problems; they break real systems. Libraries built to strictly follow the RFCs may fail when encountering certificates that violate them. To compensate, developers add exception handling and workaround logic, making codebases harder to reason about and easier to exploit.
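That pattern can be sketched with a toy example (deliberately not a real X.509 parser; the field names and rules are hypothetical): a strict validator that follows its profile to the letter, and a lenient variant that has absorbed issuer-specific quirks over time. Each workaround widens the code paths that must be tested and reasoned about.

```python
# Toy illustration of "strict per the spec" vs. "lenient with workarounds".
# The record format and rules here are invented for the sketch, not RFC 5280.
from datetime import datetime, timezone

def check_strict(cert: dict) -> bool:
    """Accept only records that follow our (hypothetical) profile exactly."""
    return (
        cert.get("version") == 3
        and isinstance(cert.get("serial"), int) and cert["serial"] > 0
        and cert.get("not_after", datetime.min.replace(tzinfo=timezone.utc))
            > datetime.now(timezone.utc)
    )

def check_lenient(cert: dict) -> bool:
    """The same check, plus the kind of special cases that accumulate
    once deployed certificates start deviating from the profile."""
    # Workaround 1: one issuer encodes the serial as a decimal string.
    serial = cert.get("serial")
    if isinstance(serial, str) and serial.isdigit():
        cert = {**cert, "serial": int(serial)}
    # Workaround 2: another issuer omits the version on v3-style records.
    if "version" not in cert and "extensions" in cert:
        cert = {**cert, "version": 3}
    return check_strict(cert)
```

A conforming record passes both checkers; a deviant one passes only the lenient path. The point of the sketch is the trend: every new non-conforming issuer adds another branch to `check_lenient`, and every branch is a place for bugs to hide.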
This fragmentation is especially dangerous because PKI depends on trust, predictability, and consistency across the internet.
So how do we avoid repeating these mistakes? The answer is collaboration and humility. Industry-specific standards often reference foundational RFCs while simultaneously violating them. That contradiction needs to be addressed early—before standards are mandated and widely deployed.
Sven and Tomas argue strongly for closer cooperation between industry groups and established standards bodies like the IETF. Fewer silos, more shared review, and more alignment would go a long way toward preventing these conflicts.
The discussion closes by looking ahead to post-quantum cryptography (PQC). This transition introduces many new standards, but also a rare opportunity: to get things right from the start.
Instead of inventing private or proprietary approaches, organizations should rely on established standardization bodies and widely adopted specifications. PQC is a chance to learn from past mistakes—not repeat them.

