The internet was built bottom-up: permissionless, pseudonymous, no central authority. Real identity is top-down: governments, civil registries, hierarchical trust. The middle ground where the two meet is messy and often toxic. Sign in with Google and you hand over more than you need. Verify your identity for KYC and you’re uploading a selfie next to your passport. Deepfakes and synthetic identities make the trust problem worse every month.
A better model has existed for years, often under the label Self-Sovereign Identity (SSI): the issuer signs a credential, you carry it in a wallet, you present it directly to the verifier. No phone home. Private sector incentives were weak and government adoption was slow. Until now. The EUDI Wallet is forcing the issue. By late 2026, every EU Member State must offer one, interoperable across 27 countries and 450 million citizens.
The model is simple. The implementation isn’t. There’s no single credential format that covers every scenario: proximity, online, cross-border, machine-to-machine. The EUDI Wallet has to make multiple formats work together, at scale. If you’re building anything that needs to embed trust, these formats are worth getting familiar with.
An SD-JWT VC is not a W3C VC. An mdoc is not an X.509 certificate. They come from different standards bodies, different eras, and different design goals. This post is the crash course I wish I’d had. Let’s start with the trust layer the others build on: X.509.
The trust anchor you already use: X.509
X.509 came out of the ITU-T X.500 directory standards in 1988, designed for authenticating entities in hierarchical directory services. The original X.500 protocol stack was largely supplanted by LDAP, but X.509 survived because Netscape picked it up for SSL/TLS in the 1990s. Every HTTPS connection since then relies on X.509 certificates. That’s a pattern you’ll see repeated: formats rarely end up where they started.
In the EU, X.509 is also the backbone of qualified trust services under eIDAS: qualified electronic signatures, qualified website authentication certificates (QWACs), and qualified electronic seals all use X.509. The trust layer you use daily without thinking about it.
A certificate binds a public key to an identity, signed by a Certificate Authority. You trust the root CA, which signs intermediates, which sign end-entity certificates. Simple, hierarchical, battle-tested.
Root CA ← self-signed, pre-installed in OS/browser
└── Intermediate CA ← e.g. Belgian PID Provider CA
└── End-entity ← e.g. PID Provider BE
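The chaining rule a verifier applies is mechanical: each certificate's issuer must match the subject of the next certificate up, and the top of the chain must be a trusted root. A toy Python sketch of that structural walk (names mirror the diagram above; the actual signature verification at each step is omitted and needs a crypto library):

```python
# Toy chain walk: the name-chaining rule a verifier applies before any
# signature check. Dicts stand in for parsed certificates.
certs = [
    {"subject": "Root CA", "issuer": "Root CA", "ca": True},  # self-signed
    {"subject": "Belgian PID Provider CA", "issuer": "Root CA", "ca": True},
    {"subject": "PID Provider BE", "issuer": "Belgian PID Provider CA", "ca": False},
]

def chain_ok(chain, trusted_roots):
    # Each cert must be issued by the next one up, and every issuer
    # must itself be a CA; the top of the chain must be trusted.
    for child, parent in zip(chain, chain[1:]):
        if child["issuer"] != parent["subject"] or not parent["ca"]:
            return False
    return chain[-1]["subject"] in trusted_roots

# Chain ordered end-entity first, root last.
print(chain_ok(list(reversed(certs)), {"Root CA"}))  # → True
```

In production this walk is what `openssl verify` does, plus signature checks, validity windows, key usage constraints, and revocation.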
X.509 certificate · human-readable rendering via openssl x509 -text
Certificate:
Data:
Version: 3 (0x2)
Serial Number: 4a:3b:2c:1d:...
Signature Algorithm: ecdsa-with-SHA256
Issuer: CN=Belgian PID Provider CA, O=FOD BOSA, C=BE
Validity
Not Before: Jan 1 00:00:00 2025 GMT
Not After : Jan 1 00:00:00 2026 GMT
Subject: CN=PID Provider BE, O=FOD BOSA, C=BE
Subject Public Key Info:
Public Key Algorithm: id-ecPublicKey (P-256)
X509v3 extensions:
X509v3 Key Usage: critical
Digital Signature
X509v3 Basic Constraints:
CA:FALSE
qc-Statements:
id-etsi-qcs-QcCompliance
id-etsi-qcs-QcType 2 (id-etsi-qct-eseal)
X509v3 CRL Distribution Points:
URI:http://crl.eid.belgium.be/ecc.crl
Signature Algorithm: ecdsa-with-SHA256
30:45:02:21:00:ab:cd:ef:...
The actual certificate is binary (ASN.1 DER). Source: ITU-T X.509 (1988).
In the EUDI Wallet, X.509 isn’t something citizens see. It’s the bridge to existing trust infrastructure: issuer certificates, relying party authentication, trusted lists. The ecosystem is already there, and X.509 lets the wallet plug into it.
But X.509 was built for a narrow job: binding a key to an identity with a fixed set of attributes. Every certificate type needs its own OID, its own profile, its own validation logic. A wallet carrying dozens of credential types (driving licenses, diplomas, health certificates, age tokens) would need dozens of custom certificate profiles. And there's no selective disclosure: you present the whole certificate or nothing. Which is why the wallet needs something else.
The format your phone presents in person: mDL / mdoc
mDL stands for mobile driving license. The spec (ISO 18013-5, published 2021) was designed for a specific problem: showing your driver’s license on your phone instead of a plastic card. The work started around 2015, driven by US states and Australia, where the driving license is the de facto national ID. The underlying data format, mdoc, has since been generalized for broader mobile credentials via ISO 23220.
The encoding is CBOR (binary, compact, fast), designed for proximity transport like NFC and BLE. The issuer hashes each data element individually (with a random salt) and signs the full set of digests in a Mobile Security Object (MSO). When you present, you choose which elements to include. The verifier re-hashes each revealed element and checks it against the signed MSO. Selective disclosure, native to the format.
mdoc credential · CBOR diagnostic notation
{
"docType": "org.iso.18013.5.1.mDL",
"issuerSigned": {
"nameSpaces": {
"org.iso.18013.5.1": [
24(<< { "digestID": 0, "random": h'8798...', "elementIdentifier": "family_name", "elementValue": "TURNER" } >>),
24(<< { "digestID": 1, "random": h'51C2...', "elementIdentifier": "given_name", "elementValue": "SUSAN" } >>),
24(<< { "digestID": 2, "random": h'D48C...', "elementIdentifier": "birth_date", "elementValue": 1004("1990-08-28") } >>),
24(<< { "digestID": 7, "random": h'2605...', "elementIdentifier": "age_over_18", "elementValue": true } >>)
]
},
"issuerAuth": [
h'A10126', ; protected: alg=ES256
{ 33: h'3082...' }, ; unprotected: x5chain (issuer X.509 cert)
<< { "digestAlgorithm": "SHA-256", ; MSO payload
"docType": "org.iso.18013.5.1.mDL",
"valueDigests": { "org.iso.18013.5.1": { 0: h'7516...', 1: h'C570...', 2: h'A4B1...', 7: h'FF75...' } }
} >>,
h'5901A4...' ; ECDSA signature
]
}
}
Source: ISO 18013-5 (2021), Annex D.
The holder chooses which data elements to include in the presentation. The verifier only sees those elements. But the namespace (org.iso.18013.5.1) is the only thing giving those fields meaning.
mdoc’s strength is where it started: proximity. Age verification at a vending machine, identity check at a hotel check-in, presenting your driving license during a traffic stop. Tap your phone, share only what’s needed.
mdoc has no semantic layer. For a single credential type like a driving license, that’s fine: every country uses the same ISO namespace, so verifiers know what each field means. But for cross-border attestations (professional qualifications, educational credentials, health certificates), there’s no shared namespace. Two issuers can use the same field name and mean different things, with no mechanism in the format to resolve the ambiguity.
No unlinkability either. The issuer signature stays the same across presentations. A colluding set of verifiers can link your presentations together.
The format a website verifies online: SD-JWT VC
Where mdoc’s strength is proximity, SD-JWT VC’s strength is the web. The split between mdoc and SD-JWT VC follows the encoding: CBOR is binary, compact (typically 20-50% smaller than JSON), and has deterministic encoding, which matters over NFC and BLE. JWT is native to the web and the existing OAuth/OpenID ecosystem.
SD-JWT (RFC 9901, published 2025) added selective disclosure to JWTs. SD-JWT VC layers credential semantics on top: credential typing, issuer metadata, status checking. SD-JWT and SD-JWT VC both emerged from the IETF OAuth working group.
The encoding is JWT: header, payload, signature. Claims you want to keep hidden are replaced by salted hashes in an _sd array. Each hidden claim has a corresponding disclosure (a base64url-encoded [salt, name, value] triple). At presentation, you include only the disclosures you want to reveal.
SD-JWT VC credential · JWT payload with selective disclosure
{
"vct": "https://credentials.example.com/identity_credential",
"iss": "https://example.com/issuer",
"iat": 1683000000,
"exp": 1883000000,
"cnf": {
"jwk": { "kty": "EC", "crv": "P-256", "x": "TCAER19Zvu3...", "y": "ZxjiWWbZMQG..." }
},
"_sd": [
"09vKrJMOlyTWM0sjpu_pdOBVBQ2M1y3KhpH515nXkpY",
"2rsjGbaC0ky8mT0pJrPioWTq0_daw1sX76poUlgCwbI",
"EkO8dhW0dHEJbvUHlE_VCeuC9uRELOieLZhh7XbUTtA",
"JzYjH4svliH0R3PyEMfeZu6Jt69u5qehZo7F7EPYlSE",
"PorFbpKuVu6xymJagvkFsFXAbRoc2JGlAUA2BA4o7cI"
],
"_sd_alg": "sha-256"
}
// Disclosures (illustrative; hashes above are placeholders):
// ["2GLC42sKQveCfGfryNRN9w", "given_name", "John"]
// ["eluV5Og3gSNII8EYnsxA_A", "family_name", "Doe"]
// ["Qg_O64zqAxe412a108iroA", "address", {"street_address": "123 Main St", ...}]
// ["AJx-095VPrpTtN4QMOqROA", "birthdate", "1940-01-01"]
// ["Pc33JM2LchcU_lHggv_ufQ", "is_over_18", true]
// The holder selects which disclosures to reveal at presentation.
Source: SD-JWT VC, built on SD-JWT (RFC 9901, 2025). Adapted from draft-ietf-oauth-sd-jwt-vc-15.
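The disclosure mechanics above can be sketched with the standard library alone. A hedged illustration (salts and claims are made up; a real implementation also verifies the JWT signature and key binding, omitted here):

```python
import base64, hashlib, json, secrets

def b64url(data: bytes) -> str:
    # base64url without padding, as SD-JWT requires
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_disclosure(name: str, value) -> str:
    # A disclosure is base64url(JSON [salt, name, value]).
    salt = b64url(secrets.token_bytes(16))
    return b64url(json.dumps([salt, name, value]).encode())

def sd_hash(disclosure: str) -> str:
    # The _sd array holds base64url(SHA-256(ascii(disclosure))).
    return b64url(hashlib.sha256(disclosure.encode("ascii")).digest())

# Issuer: create disclosures, put only their hashes in the payload.
disclosures = {n: make_disclosure(n, v)
               for n, v in [("given_name", "John"), ("is_over_18", True)]}
payload_sd = sorted(sd_hash(d) for d in disclosures.values())

# Holder: present the signed JWT plus only the chosen disclosures.
presented = [disclosures["is_over_18"]]

# Verifier: hash each presented disclosure, check membership in _sd,
# then decode it to recover the claim.
for d in presented:
    assert sd_hash(d) in payload_sd
    padded = d + "=" * (-len(d) % 4)
    salt, name, value = json.loads(base64.urlsafe_b64decode(padded))
    print(name, value)  # → is_over_18 True
```

The payload itself carries nothing but salted hashes, so a hidden claim is unrecoverable without its disclosure, exactly as in the mdoc MSO.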
The vct claim identifies what kind of credential this is. Verifiers can resolve issuer keys, type metadata, and credential status through separate endpoints. A website needs to verify your age, your nationality, your professional qualification: SD-JWT VC handles the online flow with full selective disclosure. Mastercard and Google’s Verifiable Intent spec for agent authorization uses SD-JWT credential chains for delegated payments.
The trade-off: vct defines the credential as one monolithic type, with claim meaning agreed out of band. That’s fast and works well for most credential types. W3C VC’s @context takes a different approach: every claim links to a resolvable vocabulary, and you can compose claims from multiple vocabularies in a single credential. More expressive, but JSON-LD processing is heavier.
And like mdoc, no native unlinkability. The holder’s key binding is the same across presentations. Batch issuance (multiple credentials with different keys) can mitigate this, but unlinkability becomes an operational choice, not a cryptographic guarantee.
The format that carries meaning across borders: W3C VC
Now the hard case. A Spanish university issues a diploma. A German employer needs to verify it. Not just “is this credential valid?” but “what does this qualification mean in my context?”
The W3C Verifiable Credentials Data Model (v1.0 in 2019, v2.0 in 2025) was developed by the W3C Verifiable Credentials Working Group, bringing together actors from government digital identity (EU, Canada), education (credential transparency), and decentralized identity (DID community). Unlike mdoc and SD-JWT VC, where selective disclosure is baked into the credential structure at issuance, W3C VC separates the data model from the proof layer. The credential itself is plain JSON-LD with all claims visible. The Data Integrity specification adds proof mechanisms on top: ECDSA for signing (with an optional selective disclosure variant), or BBS for unlinkable selective disclosure. The choice of proof determines what’s possible at presentation.
The @context field links every claim to a resolvable vocabulary.
W3C VC credential · JSON-LD with BBS proof
{
"@context": [
"https://www.w3.org/ns/credentials/v2",
"https://data.europa.eu/snb/model/elm"
],
"type": ["VerifiableCredential", "EuropeanQualificationCredential"],
"issuer": "did:web:uni-madrid.es",
"validFrom": "2026-01-15T00:00:00Z",
"credentialSubject": {
"name": "María García",
"qualificationLevel": {
"@id": "http://data.europa.eu/snb/eqf/6",
"name": "Bachelor"
},
"fieldOfStudy": {
"@id": "https://esco.ec.europa.eu/en/classification/skills?uri=http://data.europa.eu/esco/isced-f/0610",
"name": "Information and Communication Technologies"
}
},
"proof": {
"type": "DataIntegrityProof",
"cryptosuite": "bbs-2023",
"verificationMethod": "did:web:uni-madrid.es#key-1",
"proofPurpose": "assertionMethod",
"proofValue": "u2V0BhVh..."
}
}
Source: W3C VC Data Model 2.0 (2025) with Data Integrity BBS. Structure based on the European Learning Model.
qualificationLevel isn’t a string. It’s a link to the European Qualifications Framework. A German employer’s system can resolve snb/eqf/6 and map it to its own framework automatically. A Spanish diploma and a German Zeugnis can be interpreted as the same qualification, because the context tells the verifier what each field means. Not by convention. Deterministically.
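The verifier-side idea is that interpretation keys off the `@id` URI, not the display name. A minimal Python sketch: the mapping table to the German DQR is an illustrative assumption, not an official crosswalk (real systems would resolve published EQF referencing reports).

```python
# Hypothetical crosswalk from EQF level URIs to a local framework.
# Illustrative assumption only; real mappings come from published
# national referencing reports.
EQF_TO_DQR = {
    f"http://data.europa.eu/snb/eqf/{n}": f"DQR Niveau {n}"
    for n in range(1, 9)
}

def qualification_level(credential_subject: dict) -> str:
    # Interpretation keys off the @id URI; the "name" field is
    # only a human-readable label.
    level = credential_subject["qualificationLevel"]
    return EQF_TO_DQR.get(level["@id"], f"unmapped: {level.get('name')}")

subject = {
    "name": "María García",
    "qualificationLevel": {"@id": "http://data.europa.eu/snb/eqf/6",
                           "name": "Bachelor"},
}
print(qualification_level(subject))  # → DQR Niveau 6
```

Two credentials with different display names ("Bachelor", "Grado") but the same `@id` resolve identically, which is the deterministic interpretation the paragraph above describes.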
And then there’s unlinkability. BBS signatures (draft-irtf-cfrg-bbs-signatures) produce derived proofs: each presentation generates a mathematically distinct proof. The proof value itself is unlinkable across presentations. Verifiers cannot correlate based on the cryptographic proof alone. Disclosed attributes and metadata can still be used for correlation, but the cryptographic foundation is fundamentally different from mdoc and SD-JWT VC where the same issuer signature appears in every presentation.
DC4EU validated this in practice: a large-scale pilot across 25 countries with nearly 100 partners, covering education credentials and professional qualifications. The Commission’s own reference implementation was extended to support the full W3C VC issuance and presentation flow. This isn’t a proposal. It’s completed, publicly funded work.
Where it breaks down: complexity and adoption. JSON-LD processing is heavier than JWT or CBOR. The linked vocabularies only work if both parties adopt the same ones, which still requires coordination. And in EUDI specifically, the implementing regulations formally reference W3C VC through ETSI TS 119 472-1, but the operational scaffolding (PID encoding tables, presentation profiles, issuance protocol specifications) exists only for mdoc and SD-JWT VC.
No encoding tables for W3C VC. No presentation profile for JSON-LD with embedded Data Integrity proofs. No regulated issuance protocol. A recent consultation (deadline: March 5, 2026) drew 43 contributions, and this gap was the dominant theme: W3C VC hasn’t been removed from the normative perimeter, but it has been deprived of the scaffolding necessary to function. DC4EU strategic committee member Alex Grech has argued that assuming semantic interoperability can be layered on later is a critical strategic vulnerability.
And the BBS curve (BLS12-381) isn’t on any EU-approved cryptographic mechanisms list. No national agencies currently endorse pairing-based cryptography, and common secure elements like TPMs don’t support BLS12-381. The privacy primitive is technically ready, but not formally regulated. Which creates a contradiction: Article 5a(16)(b) of the regulation requires unlinkability where identification is not needed, but the only format that delivers it cryptographically doesn’t have regulatory approval for its curve.
A legal obligation without a cryptographic mechanism.
Side by side
| | X.509 | mDL / mdoc | SD-JWT VC | W3C VC |
|---|---|---|---|---|
| Encoding | ASN.1 (binary) | CBOR (binary) | JSON (JWT) | JSON-LD |
| Selective disclosure | No | Per claim (salted hashes in MSO) | Per claim (salted hashes in JWT) | Depends on proof (yes with BBS) |
| Unlinkability | No | No | No | Yes (BBS, not yet approved for EUDI) |
| Semantic layer | OIDs | Namespace-defined | vct (type identifier) | @context (resolvable vocabularies) |
| Trust model | Hierarchical CA | X.509 issuer certificates | X.509, DIDs, trusted lists | X.509, DIDs, trusted lists |
| EUDI Wallet role | Trust anchor | Required for PID | Required for PID | Referenced, incomplete scaffolding |
What the trade-offs reveal
The EUDI Wallet didn’t invent these formats. But by forcing them into the same system, it exposes the trade-offs:
- Selective disclosure is solved by mdoc and SD-JWT VC (salted hashes) and W3C VC (Data Integrity proofs, including BBS). X.509 has none.
- Unlinkability is only solved cryptographically by BBS. Everything else is linkable across presentations.
- Semantic interoperability is only solved by W3C VC’s @context. mdoc and SD-JWT VC tell you a credential is valid. W3C VC tells you what it means.
All four will coexist. X.509 as trust anchor. mdoc and SD-JWT VC as the two mandatory PID formats. W3C VC for cross-border credentials where meaning matters, and for unlinkable privacy once BBS gets regulatory approval. The question is whether the operational scaffolding catches up for all of them. Right now, for one of them, it hasn’t.