Shane Deconinck Trusted AI Agents · Decentralized Trust

EUDI Credential Formats Crash Course: X.509, mDL, SD-JWT VC, and W3C VC


The web was built bottom-up: permissionless, pseudonymous, no central authority. Legal identity is top-down: governments, civil registries, hierarchical trust. The middle ground where the two meet is messy and often toxic. Sign in with Google and you hand over more than you need. Verify your identity for KYC and you’re uploading a selfie next to your passport. Deepfakes and synthetic identities are widening the gap.

The cryptography to fix this has existed for years, often under the label Self-Sovereign Identity (SSI): cryptographic credentials you carry in a wallet and present directly, without phoning home to the issuer. What was missing was adoption. The EUDI Wallet, mandated by eIDAS 2.0 (the follow-up to eIDAS, expanding electronic identity from trust services to citizen-facing digital wallets), is changing that: by late 2026, every EU Member State must offer one, interoperable across 27 countries and 450 million citizens.

The model is simple. The implementation isn’t. A wallet that works at a border gate, on a website, across 27 legal systems, and between machines can’t, it turns out, rely on a single data format. The EUDI Wallet has to make multiple formats work together, at scale. If you’re building anything that needs to embed trust, these formats are worth getting familiar with.

The EUDI Wallet relies on four of them: X.509, mdoc, SD-JWT VC, and W3C VC. Different standards bodies, different eras, different design goals. This post walks through each of them.

The trust anchor you already use: X.509

X.509 came out of the ITU-T X.500 directory standards in 1988, designed for authenticating entities in hierarchical directory services. The original X.500 protocol stack was largely replaced by LDAP, but X.509 survived because Netscape picked it up for SSL/TLS in the 1990s. Every HTTPS connection since then relies on X.509 certificates. That’s a pattern you’ll see repeated: formats rarely end up where they started.

In the EU, X.509 is also the backbone of eIDAS trust services: electronic signatures, website authentication, and electronic seals. The trust layer you use daily without thinking about it.

A certificate binds a public key to an identity, signed by a Certificate Authority. You trust the root CA, which signs intermediates, which sign end-entity certificates. Simple, hierarchical, battle-tested.

Root CA                    ← self-signed, pre-installed in OS/browser
  └── Intermediate CA      ← e.g. Belgian PID Provider CA
        └── End-entity     ← e.g. PID Provider BE
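
You can exercise that chain end-to-end with throwaway certificates. A sketch with OpenSSL (file names and subjects are illustrative; RSA keys for brevity, where real EUDI issuers use EC keys):

```shell
# Create a self-signed root CA (illustrative names throughout)
openssl req -x509 -newkey rsa:2048 -nodes -keyout root.key -out root.pem \
  -subj "/CN=Demo Root CA" -days 1

# Create an end-entity key and certificate signing request
openssl req -newkey rsa:2048 -nodes -keyout leaf.key -out leaf.csr \
  -subj "/CN=example.test"

# The root signs the end-entity certificate
openssl x509 -req -in leaf.csr -CA root.pem -CAkey root.key \
  -CAcreateserial -out leaf.pem -days 1

# Verification walks the chain back up to a trusted root
openssl verify -CAfile root.pem leaf.pem   # → leaf.pem: OK
```

In production the root is pre-installed in the OS or trust list rather than passed on the command line, and there is usually an intermediate between root and end-entity, but the verification logic is the same.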
X.509 certificate · this site's TLS certificate via openssl x509 -text
Certificate:
    Data:
        Version: 3 (0x2)
        Serial Number:
            06:d7:95:3c:03:28:8a:58:23:8e:12:2b:9d:ef:a6:e6:07:52
        Signature Algorithm: ecdsa-with-SHA384
        Issuer: C=US, O=Let's Encrypt, CN=E7
        Validity
            Not Before: Jan 25 16:20:48 2026 GMT
            Not After : Apr 25 16:20:47 2026 GMT
        Subject: CN=shanedeconinck.be
        Subject Public Key Info:
            Public Key Algorithm: id-ecPublicKey
                Public-Key: (256 bit)
                pub:
                    04:d6:46:77:72:99:23:88:b6:8f:7a:01:63:86:6b:
                    be:91:52:a3:25:8f:ba:76:b7:55:76:34:4d:99:5f:
                    95:bf:03:4a:5d:0f:a1:91:47:ea:e1:e9:c6:80:72:
                    58:48:3f:f2:d7:1f:22:53:86:56:1d:76:7e:49:4f:
                    b7:a0:d2:98:8d
                ASN1 OID: prime256v1
                NIST CURVE: P-256
        X509v3 extensions:
            X509v3 Key Usage: critical
                Digital Signature
            X509v3 Extended Key Usage:
                TLS Web Server Authentication, TLS Web Client Authentication
            X509v3 Basic Constraints: critical
                CA:FALSE
            X509v3 Subject Alternative Name:
                DNS:*.shanedeconinck.be, DNS:shanedeconinck.be
    Signature Algorithm: ecdsa-with-SHA384
         30:65:02:31:00:ae:ce:2f:e5:d7:8b:63:32:e9:e5:f9:...

This site’s own TLS certificate via openssl x509 -text. The actual certificate is binary (ASN.1 DER).

In the EUDI Wallet, X.509 isn’t something citizens see. It’s the bridge to existing trust infrastructure: issuer certificates, relying party authentication, trusted lists. The ecosystem is already there, and X.509 lets the wallet plug into it.

But X.509 was built for a narrow job: binding a key to an identity. Version 3 (1996) added extensions, and in theory they’re open-ended: you can define custom OIDs (Object Identifiers, globally unique numeric codes; 2.5.29.17 in the example above is the one for Subject Alternative Name). Custom OIDs need to be registered under your organisation’s branch (via IANA or a national registry), and you can put any data you want in them.

In practice, it’s cumbersome. Each extension is ASN.1 encoded, so you define the exact binary structure upfront. Mark an extension critical and every verifier that doesn’t understand it must reject the certificate. Mark it non-critical and verifiers can silently ignore it.
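
The criticality rule boils down to a small piece of verifier logic. A minimal sketch, with extensions modeled as (OID, critical) pairs rather than parsed ASN.1:

```python
# Sketch of the X.509 criticality rule; not a real ASN.1 parser.
def check_extensions(extensions, understood_oids):
    """extensions: iterable of (oid_string, critical_flag) pairs."""
    for oid, critical in extensions:
        if critical and oid not in understood_oids:
            return False  # critical and not understood: must reject
    return True  # unknown non-critical extensions are silently ignored

# Subject Alternative Name (2.5.29.17), non-critical: ignorable if unknown
assert check_extensions([("2.5.29.17", False)], understood_oids=set())
# A critical custom extension the verifier doesn't know: rejection
assert not check_extensions([("1.2.3.4", True)], understood_oids=set())
```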

That’s not a flaw. X.509 was designed for a world of well-known certificate types, not arbitrary credentials. But a wallet that carries driving licenses, diplomas, health certificates, and age tokens needs something more flexible: a generic container where new credential types don’t require new parsing logic. And it needs selective disclosure: presenting just your age without revealing your name. X.509 has neither. Which is why the wallet needs something else.

The format your phone presents in person: mDL / mdoc

mDL stands for mobile driving license. The spec (ISO 18013-5, published 2021) was designed for a specific problem: showing your driver’s license on your phone instead of a plastic card. The work started around 2015, driven by US states and Australia, where the driving license is the de facto national ID. The underlying data format, mdoc, has since been generalized for broader mobile credentials via ISO 23220.

The encoding is CBOR (binary, compact, fast), designed for proximity transport like NFC and BLE. The issuer hashes each data element individually (with a random salt) and signs the full set of digests in a Mobile Security Object (MSO). When you present, you choose which elements to include. The verifier re-hashes each revealed element and checks it against the signed MSO. Selective disclosure, native to the format. All credential formats in this article can be verified offline (the signature is self-contained), but mdoc was designed for it: compact enough for NFC, with proximity transport built into the spec.

mdoc credential · CBOR diagnostic notation
{
  "docType": "org.iso.18013.5.1.mDL",
  "issuerSigned": {
    "nameSpaces": {
      "org.iso.18013.5.1": [
        24(<< { "digestID": 0, "random": h'8798...', "elementIdentifier": "family_name",  "elementValue": "TURNER" } >>),
        24(<< { "digestID": 1, "random": h'51C2...', "elementIdentifier": "given_name",   "elementValue": "SUSAN" } >>),
        24(<< { "digestID": 2, "random": h'D48C...', "elementIdentifier": "birth_date",   "elementValue": 1004("1990-08-28") } >>),
        24(<< { "digestID": 7, "random": h'2605...', "elementIdentifier": "age_over_18",  "elementValue": true } >>)
      ]
    },
    "issuerAuth": [
      h'A10126',                          ; protected: alg=ES256
      { 33: h'3082...' },                 ; unprotected: x5chain (issuer X.509 cert)
      << { "digestAlgorithm": "SHA-256",  ; MSO payload
           "docType": "org.iso.18013.5.1.mDL",
           "valueDigests": { "org.iso.18013.5.1": { 0: h'7516...', 1: h'C570...', 2: h'A4B1...', 7: h'FF75...' } }
      } >>,
      h'5901A4...'                        ; ECDSA signature
    ]
  }
}

CBOR diagnostic notation (human-readable rendering of the binary format). Based on ISO 18013-5 (2021), Annex D.

The holder chooses which data elements to include in the presentation. The verifier only sees those elements. But the namespace (org.iso.18013.5.1) is the only thing giving those fields meaning.
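
The salted-digest check can be sketched with the standard library. This is a simplification: real mdoc hashes the CBOR encoding of the whole IssuerSignedItem, and the digest set is covered by the issuer's COSE signature; here a naive byte concatenation stands in for CBOR and the signature step is omitted.

```python
import hashlib

def element_digest(random_salt, identifier, value):
    # Simplified stand-in for hashing the CBOR-encoded IssuerSignedItem.
    return hashlib.sha256(random_salt + identifier.encode() + value.encode()).digest()

# Issuer side: fresh random salt per element; the digest set goes into the signed MSO.
salt = b"\x87\x98\x01\x02"  # illustrative
value_digests = {0: element_digest(salt, "family_name", "TURNER")}

# Holder reveals element 0; the verifier re-hashes it against the signed digest.
revealed = {"digestID": 0, "random": salt,
            "elementIdentifier": "family_name", "elementValue": "TURNER"}
ok = element_digest(revealed["random"], revealed["elementIdentifier"],
                    revealed["elementValue"]) == value_digests[revealed["digestID"]]
assert ok  # element verified without touching the unrevealed ones
```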

mdoc’s strength is where it started: proximity. Age verification at a vending machine, identity check at a hotel check-in, presenting your driving license during a traffic stop. Tap your phone, share only what’s needed.

mdoc has no semantic layer. What does family_name mean? You’d look it up in the ISO 18013-5 specification, where every field in the org.iso.18013.5.1 namespace is defined. But that definition lives out of band, in a human-readable document, not in the format itself. Like X.509, meaning is agreed outside the format. Unlike X.509, adding a new credential type is lightweight: define a new namespace string, use the same CBOR structure, no OID registration or custom parsing needed.

For a single credential type like a driving license, that works: every country uses the same ISO namespace, so verifiers know what each field means. Any credential type can be standardized this way. But as the number of types grows (professional qualifications, educational credentials, health certificates, each across 27 countries), coordinating all those namespace definitions out of band becomes its own problem. The format has no built-in mechanism to tell the verifier what the fields mean.

No unlinkability either. The issuer signature stays the same across presentations. A colluding set of verifiers can link your presentations together.

The format a website verifies online: SD-JWT VC

Where mdoc’s strength is proximity, SD-JWT VC’s strength is the web. The split between mdoc and SD-JWT VC follows the encoding: CBOR is binary, compact (typically 20-50% smaller than JSON), and has deterministic encoding, which matters over NFC and BLE. JWT is native to the web and the existing OAuth/OpenID ecosystem. That’s why SD-JWT VC was chosen for EUDI alongside mdoc: it meets the web where it already is, adding selective disclosure to a stack developers already know.

JWTs (JSON Web Tokens) are the standard token format on the web: OAuth, OpenID Connect, API authentication. Billions of tokens in circulation. But a regular JWT is all-or-nothing, just like X.509. SD-JWT (RFC 9901, 2025) added selective disclosure: the ability to reveal only some claims from a signed token.

SD-JWT VC (Selective Disclosure JWT Verifiable Credential) layers credential semantics on top of that: credential typing, issuer metadata, status checking. Both specs emerged from the IETF OAuth working group.

The encoding is JWT: header, payload, signature. Claims you want to keep hidden are replaced by salted hashes in an _sd array. Each hidden claim has a corresponding disclosure (a base64url-encoded [salt, name, value] triple). At presentation, you include only the disclosures you want to reveal.

SD-JWT VC credential · JWT payload with selective disclosure
{
  "vct": "https://credentials.example.com/identity_credential",  // → resolves to Type Metadata (JSON Schema, display info)
  "iss": "https://example.com/issuer",                           // → resolves to issuer keys via .well-known endpoint
  "iat": 1683000000,
  "exp": 1883000000,
  "cnf": {
    "jwk": { "kty": "EC", "crv": "P-256", "x": "TCAER19Zvu3...", "y": "ZxjiWWbZMQG..." }
  },
  "_sd": [
    "09vKrJMOlyTWM0sjpu_pdOBVBQ2M1y3KhpH515nXkpY",
    "2rsjGbaC0ky8mT0pJrPioWTq0_daw1sX76poUlgCwbI",
    "EkO8dhW0dHEJbvUHlE_VCeuC9uRELOieLZhh7XbUTtA",
    "JzYjH4svliH0R3PyEMfeZu6Jt69u5qehZo7F7EPYlSE",
    "PorFbpKuVu6xymJagvkFsFXAbRoc2JGlAUA2BA4o7cI"
  ],
  "_sd_alg": "sha-256"
}

// Disclosures (illustrative; hashes above are placeholders):
// ["2GLC42sKQveCfGfryNRN9w", "given_name", "John"]
// ["eluV5Og3gSNII8EYnsxA_A", "family_name", "Doe"]
// ["Qg_O64zqAxe412a108iroA", "address", {"street_address": "123 Main St", ...}]
// ["AJx-095VPrpTtN4QMOqROA", "birthdate", "1940-01-01"]
// ["Pc33JM2LchcU_lHggv_ufQ", "is_over_18", true]
// The holder selects which disclosures to reveal at presentation.

Source: SD-JWT VC, built on SD-JWT (RFC 9901, 2025). Adapted from draft-ietf-oauth-sd-jwt-vc-15.
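
The disclosure mechanics fit in a few lines of standard-library Python. A sketch (the salt and claim are illustrative; in practice the hash is taken over the exact base64url disclosure string the issuer produced, so its JSON serialization must be preserved byte for byte):

```python
import base64, hashlib, json

def b64url(data):
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_disclosure(salt, name, value):
    # A disclosure is the base64url encoding of the JSON array [salt, name, value].
    disclosure = b64url(json.dumps([salt, name, value]).encode())
    # The credential's _sd array holds the base64url-encoded SHA-256 of that string.
    digest = b64url(hashlib.sha256(disclosure.encode()).digest())
    return disclosure, digest

disclosure, digest = make_disclosure("2GLC42sKQveCfGfryNRN9w", "given_name", "John")
# Issuer: put `digest` in _sd. Holder: append `disclosure` to the presentation
# (<jwt>~<disclosure>~...). Verifier: re-hash each disclosure and match it in _sd.
```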

The vct claim identifies what kind of credential this is. Verifiers can resolve issuer keys, type metadata, and credential status through separate endpoints. A website needs to verify your age, your nationality, your professional qualification: SD-JWT VC handles the online flow with full selective disclosure. Mastercard and Google’s Verifiable Intent spec for agent authorization uses SD-JWT credential chains for delegated payments.

The trade-off: vct defines the credential as one monolithic type, with claim meaning agreed out of band. That’s fast and works well for most credential types. W3C VC’s @context takes a different approach: every claim links to a resolvable vocabulary, and you can compose claims from multiple vocabularies in a single credential. More expressive, but JSON-LD processing is heavier.

And like mdoc, no native unlinkability. The holder’s key binding is the same across presentations. Batch issuance (multiple credentials with different keys) can mitigate this, but unlinkability becomes an operational choice, not a cryptographic guarantee.

The format that carries meaning across borders: W3C VC

Now the hard case. A Spanish university issues a diploma. A German employer needs to verify it. Both mdoc and SD-JWT VC could carry the data, with meaning agreed out of band. But what if the credential needs to combine fields from different vocabularies (education, professional qualification, identity) and the verifier’s system needs to resolve what each field means from the credential itself?

The W3C Verifiable Credentials Data Model (v1.0 in 2019, v2.0 in 2025) was developed by the W3C Verifiable Credentials Working Group, bringing together actors from government digital identity (EU, Canada), education (credential transparency), and decentralized identity (DID community). Unlike mdoc and SD-JWT VC, where selective disclosure is baked into the credential structure at issuance, W3C VC separates the data model from the proof layer. The credential itself is plain JSON-LD with all claims visible. The Data Integrity specification adds proof mechanisms on top: ECDSA for signing (with an optional selective disclosure variant), or BBS for unlinkable selective disclosure. The choice of proof determines what’s possible at presentation.

The @context field links every claim to a resolvable vocabulary.

W3C VC credential · JSON-LD with BBS proof
{
  "@context": [
    "https://www.w3.org/ns/credentials/v2",
    "https://data.europa.eu/snb/model/elm"
  ],
  "type": ["VerifiableCredential", "EuropeanQualificationCredential"],
  "issuer": "did:web:uni-madrid.es",
  "validFrom": "2026-01-15T00:00:00Z",
  "credentialSubject": {
    "name": "María García",
    "qualificationLevel": {
      "@id": "http://data.europa.eu/snb/eqf/6",
      "name": "Bachelor"
    },
    "fieldOfStudy": {
      "@id": "https://esco.ec.europa.eu/en/classification/skills?uri=http://data.europa.eu/esco/isced-f/0610",
      "name": "Information and Communication Technologies"
    }
  },
  "proof": {
    "type": "DataIntegrityProof",
    "cryptosuite": "bbs-2023",
    "verificationMethod": "did:web:uni-madrid.es#key-1",
    "proofPurpose": "assertionMethod",
    "proofValue": "u2V0BhVh..."
  }
}

Source: W3C VC Data Model 2.0 (2025) with Data Integrity BBS. Structure based on the European Learning Model.

qualificationLevel isn’t a string. It’s a link to the European Qualifications Framework. A German employer’s system can resolve snb/eqf/6 and map it to its own framework automatically. A Spanish diploma and a German Zeugnis can be interpreted as the same qualification, because the context tells the verifier what each field means. Not by convention. Deterministically.
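
A hypothetical sketch of that resolution step: the EQF URI is a real identifier, but the mapping table and German labels are invented for illustration (a real system would resolve the URI against published framework data rather than a hard-coded dict).

```python
# Hypothetical verifier-side mapping from EQF level URIs to a local framework.
EQF_TO_LOCAL = {
    "http://data.europa.eu/snb/eqf/6": "DQR level 6 (Bachelor-equivalent)",
    "http://data.europa.eu/snb/eqf/7": "DQR level 7 (Master-equivalent)",
}

claim = {"@id": "http://data.europa.eu/snb/eqf/6", "name": "Bachelor"}
local_level = EQF_TO_LOCAL[claim["@id"]]  # keyed on the URI, not the display label
```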

This isn’t new to the EU. EBSI (European Blockchain Services Infrastructure, now transitioning to the EUROPEUM-EDIC) established W3C VC as the credential format for cross-border public services, with diploma verification as a flagship use case. DC4EU built on that foundation and validated it at scale: a large-scale pilot across 25 countries with nearly 100 partners, covering education credentials and professional qualifications. The Commission’s own reference implementation was extended to support the full W3C VC issuance and presentation flow. This isn’t a proposal. It’s completed, publicly funded work.

W3C VC also enables unlinkability, though not yet in EUDI. BBS signatures (draft-irtf-cfrg-bbs-signatures) produce derived proofs: each presentation generates a mathematically distinct proof. The proof value itself is unlinkable across presentations. Verifiers cannot correlate based on the cryptographic proof alone. Disclosed attributes and metadata can still be used for correlation, but the cryptographic foundation is fundamentally different from mdoc and SD-JWT VC where the same issuer signature appears in every presentation.

Where it breaks down: complexity and adoption. JSON-LD processing is heavier than JWT or CBOR. The linked vocabularies only work if both parties adopt the same ones, which still requires coordination. And in EUDI specifically, the implementing regulations formally reference W3C VC through ETSI (the European standards body for telecom and digital infrastructure) TS 119 472-1, but the operational scaffolding (PID encoding tables, presentation profiles, issuance protocol specifications) exists only for mdoc and SD-JWT VC.

In practice, that means: no specification for how to encode a W3C VC as a PID, no defined flow for how a wallet presents one to a verifier, and no regulated issuance protocol. mdoc and SD-JWT VC have all three. W3C VC is mentioned in the regulation but lacks the practical specs to actually use it.

A recent public consultation (deadline: March 5, 2026) drew 43 contributions, and this gap was the dominant theme. More than half the substantive contributions — from universities, trust service providers, standards bodies, and digital rights organisations across over a dozen Member States — converge on the same diagnosis: de jure inclusion, de facto exclusion.

And the BBS curve (BLS12-381) isn’t on ENISA’s agreed cryptographic mechanisms list (v2.0, April 2025). No national agencies currently endorse pairing-based cryptography, and common secure elements like TPMs don’t support BLS12-381. The privacy primitive is technically ready, but not formally regulated. Which creates a contradiction: Article 5a(16)(b) of the regulation requires unlinkability where identification is not needed, but the only format that delivers it cryptographically doesn’t have regulatory approval for its curve.

A legal obligation without a cryptographic mechanism.

Side by side

| | X.509 | mDL / mdoc | SD-JWT VC | W3C VC |
|---|---|---|---|---|
| Encoding | ASN.1 (binary) | CBOR (binary) | JSON (JWT) | JSON-LD |
| Selective disclosure | No | Per claim (salted hashes in MSO) | Per claim (salted hashes in JWT) | ECDSA-SD or BBS (lacks EUDI scaffolding; BBS curve also unapproved) |
| Unlinkability | No | No | No | Yes (BBS, not yet approved for EUDI) |
| Semantic layer | OIDs | Namespace-defined | vct (type identifier) | @context (resolvable vocabularies) |
| Trust model | Hierarchical CA | X.509 issuer certificates | X.509, DIDs, trusted lists | X.509, DIDs, trusted lists |
| EUDI Wallet role | Trust anchor | Required for PID | Required for PID | Referenced, incomplete scaffolding |

What the trade-offs reveal

The EUDI Wallet didn’t invent these formats. But by forcing them into the same system, it exposes the trade-offs:

  • Selective disclosure is solved by mdoc and SD-JWT VC (salted hashes) and W3C VC (Data Integrity proofs: ECDSA-SD for selective disclosure, BBS for selective disclosure plus unlinkability). X.509 has none.
  • Unlinkability is only solved cryptographically by BBS, but BBS isn’t approved for EUDI (unapproved curve). ECDSA-SD gives selective disclosure with approved cryptography (P-256), but not unlinkability — and both are W3C Data Integrity cryptosuites, which lack EUDI scaffolding regardless. Everything else is linkable across presentations.
  • Semantic interoperability is only solved by W3C VC’s @context. mdoc and SD-JWT VC tell you a credential is valid. W3C VC tells you what it means.

All four will coexist. X.509 as trust anchor. mdoc and SD-JWT VC as the two mandatory PID formats. W3C VC for cross-border credentials where meaning matters, and for unlinkable privacy once BBS gets regulatory approval. The question is whether the operational scaffolding catches up for all of them. Right now, for one of them, it hasn’t.

What about zero-knowledge proofs?

Everything above assumes that privacy properties like selective disclosure and unlinkability must be built into the credential format. BBS and ECDSA-SD do this at the proof layer. mdoc and SD-JWT VC do it with salted hashes. X.509 doesn’t do it at all.

Zero-knowledge proofs (ZKPs) take a different approach entirely: instead of changing the credential, you change the presentation. A ZKP circuit can prove that you hold a valid credential, signed by a trusted CA, bound to your key, without revealing the credential itself, the signature, or the public key. Selective disclosure and unlinkability become properties of the proof system, not the format.

Early work is exploring this for EUDI. EUDI-ZK demonstrates ZKP circuits over existing eIDAS X.509 certificates and e-seals, reusing current signing infrastructure as-is with cryptography that’s already on the approved lists. If this matures, the format trade-offs in this article look different. The rigid, all-or-nothing X.509 certificate could gain privacy properties it was never designed for, without changing a single byte of the format.

It’s early. But it’s worth watching.
