This section elaborates the theory behind the SSI Kit:
SSI Kit | Basics - Learn what the SSI Kit is and what it does.
SSI Flavors & Ecosystems - Learn which SSI flavors and identity ecosystems we support.
Architecture - Explore the SSI Kit's multi-layered architecture and components.
Use Cases - Explore use cases you can implement with the SSI Kit.
SSI-Kit feature list - Explore all features in an overview list.
Our products are agnostic towards the underlying technologies used to implement Trust Registries, which means that the SSI Kit is potentially compatible with any type of Trust Registry.
The SSI Kit supports:
Permissionless Blockchains (e.g. Ethereum),
Permissioned Blockchains (e.g. Ethereum Enterprise/Hyperledger Besu),
Domain Name Service (DNS),
Pure peer-to-peer approaches that do not require Registries.
Note that we are continuously adding support for new Registries and underlying technologies.
You can learn more about Trust Registries here.
Our open source solutions enable you to use different types of DIDs and different identity ecosystems. Every relevant function is supported, from the generation of DIDs and DID Documents to anchoring or resolving them on/from Registries.
We currently support the following DID methods:
did:ebsi
did:web
did:key
did:jwk
did:iota
did:cheqd
Note that we are continuously adding support for new DID methods.
You can learn more about DIDs here.
We believe in a multi-ecosystem future.
This is why we built an abstraction layer for ecosystem-specific operations and business logic. The idea is to support any ecosystem with a single solution that does not put any additional burden on developers. As a result, you can use our solutions to participate in different ecosystems without having to switch between different technical implementations.
We currently support:
EBSI/ESSIF (EU's new decentralized identity ecosystem)
Gaia-X (EU's new cloud infrastructure)
Velocity Network
cheqd Network
IOTA
Note that we are continuously adding new ecosystems.
Cryptographic keys convey control over digital identities and enable core functionality such as encryption and authentication.
The SSI Kit supports:
EdDSA / ed25519
ECDSA / secp256k1
ECDSA / secp256r1
RSA
Note that we are continuously adding support for new key types.
You can learn more about keys here.
Here are the most important things you need to know about the SSI Kit:
It is written in Kotlin/Java. It can be directly integrated (Maven/Gradle dependency) or run as RESTful web-service. A CLI tool allows you to run all functions manually.
It is open source (Apache 2). You can use the code for free and without strings attached.
It is a holistic solution that allows you to build use cases “end-to-end”. There is no need to research, combine or tweak different libraries to build pilots or production systems.
It abstracts complexity and low-level functionality via different interfaces (CLI, APIs). Additional services facilitate development and integration (e.g. Issuer and Verifier Portals).
It is modular, composable and built on open standards, allowing you to customize and extend functionality with your own or third-party implementations and preventing lock-in.
It is flexible in the sense that you can deploy and run it on-premise, in your (multi-)cloud environment or as a library in your application.
It enables you to use different identity ecosystems like Europe’s emerging identity ecosystem (EBSI, ESSIF) in anticipation of a multi-ecosystem future.
This software-layer holds a set of generic core services for common SSI and cryptographic functions. The services are in the scope of key management, decentralized identifiers, verifiable credentials and data storage.
The low-level services expose common interfaces that can conveniently be utilized directly via Kotlin/Java or via the REST API (see the Swagger doc of the Core API).
The following is a short summary of the interfaces available. The detailed functions are described in the documentation further on.
Handles keys and cryptographic operations like the generation of signatures (e.g. linked data, JWT) with signature types such as ES256K or EdDSA.
Keys can be stored in a file- or database-based keystore, which is extendable to HSMs and WebKMS.
Abstracts common functionality related to Decentralised Identifiers (DIDs, DID Documents) for methods like “did:web”, “did:key”, “did:ebsi”.
Abstracts common functionality related to Verifiable Credentials (VCs) and Verifiable Presentations (VPs) in different formats like JSON and JSON-LD.
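As a rough illustration of how these low-level services fit together, the following Kotlin sketch generates a key and creates a did:key on top of it. It assumes the SSI Kit is added as a Maven/Gradle dependency; class names (ServiceMatrix, KeyService, DidService) follow the library's documented usage, but treat the exact signatures as assumptions that may vary between versions.

```kotlin
import id.walt.servicematrix.ServiceMatrix
import id.walt.crypto.KeyAlgorithm
import id.walt.model.DidMethod
import id.walt.services.did.DidService
import id.walt.services.key.KeyService

fun main() {
    // Load the service -> implementation mappings (see the configuration section).
    ServiceMatrix("service-matrix.properties")

    // Key Management: generate an ed25519 key in the configured keystore.
    val keyId = KeyService.getService().generate(KeyAlgorithm.EdDSA_Ed25519)

    // Decentralized Identifiers: create a did:key backed by that key.
    val did = DidService.create(DidMethod.key, keyId.id)

    println("Created DID: $did")
}
```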
A shared and trusted record of information.
Registries serve as a single source of truth which all participants of an SSI ecosystem can trust. Depending on the ecosystem, registries make information accessible to anyone or just a limited group. Registries are important because they enable:
(Distributed) Public Key Infrastructures (DPKIs), which establish an open distribution system for public keys that can be used for encryption and authentication, among other things.
Trust Registries hold reliable information about people, organizations, things and even credentials (e.g. data models, status and validity information) to ensure that different parties can trust each other and the identity-related data they exchange.
Different technologies can be used to implement Registries. For example:
Blockchains or L1: Typically blockchains are used because it is unfeasible (or even impossible) to tamper with them. The fact that no single organization can change the contents of a blockchain or manipulate the terms by which it is governed is well aligned with the requirements for identity ecosystems. Today, we see a growing number of developers and organizations focusing on so-called permissioned blockchains (i.e. only a selected group can “write”) like Ethereum Quorum/Enterprise. Permissionless blockchains, like Ethereum, are still used, but less than the permissioned alternatives for a variety of reasons, such as scalability, costs and the lack of customisable governance frameworks.
L2: Layer two networks sit on top of blockchains and aggregate data before anchoring it. The main idea behind them is to circumvent common challenges of public, permissionless blockchains like scalability and cost issues. The most popular implementations in the context of identity are “ION” (for Bitcoin) and “Element” (for Ethereum).
Other Distributed Ledger Technologies (DLTs): Sometimes other DLTs are utilised like the Interplanetary File System (IPFS) though its use for digital identity remains limited.
Domain Name Service (DNS): Considering certain drawbacks of DLTs and their relatively slow adoption by the mass market, DNS can also be used to serve as a registry. Though it is not fully decentralised (considering its underlying governance framework), DNS has many advantages like its maturity and global adoption.
Importantly, SSI can be implemented without registries, particularly without blockchains, because identity data (or at least personal data of individuals) is never anchored due to privacy and compliance reasons. However, by combining SSI with blockchains (or other technologies), robust and trustworthy identity ecosystems that utilise transparent DPKIs and reliable Trust Registries can emerge.
Authentication and data exchange protocols (e.g. OIDC/SIOP) enable the exchange of data (VCs) between different parties.
The SSI Kit supports the latest OpenID Connect extension for SSI:
The implementation of the protocols is conformant with the latest specs from EBSI (https://api-conformance.ebsi.eu/docs/wallet-conformance).
You can learn more about protocols here.
Learn about Self-Sovereign Identity (SSI).
Welcome to our Introduction to Self-Sovereign Identity (SSI) for developers and technical readers.
Before you get started, feel free to explore other (less technical) resources that will help you and your team to get a more holistic understanding of SSI and digital identity in general:
The following subsections show examples of interactions with the EBSI/ESSIF frameworks using the SSI Kit command line interface, REST APIs and library SDK.
You can find more information about EBSI/ESSIF here.
The SSI Kit exposes high-level interfaces / APIs to hide the complexity introduced by
low-level services (e.g. key management, signing, data storage)
different ecosystems (i.e. different SSI flavors, business logic and governance frameworks).
The functionality of the high-level interfaces correlates with the SSI Kit components. The functions are grouped around:
issuing Verifiable Credentials by the Signatory,
holding (storing, presenting) Verifiable Credentials by the Custodian
and verifying Verifiable Credentials by the Auditor.
The interfaces can be used in JVM-based applications directly, or via the REST API.
The Swagger documentation can be found under section REST API.
Signatory REST API functions.
The Signatory API exposes the "issuance" endpoint, which provides flexible integration possibilities for anyone intending to act as an "Issuer" (i.e. create, sign and issue Verifiable Credentials), as follows:
Credentials - issue credentials
Templates - create and manage credential templates
Revocations - revocation related functions
If you're new to VCs, check out the intro section for an overview.
The /v1/credentials/issue endpoint issues a specified credential.
E.g. Issue a UniversityDegree credential in the default JSON-LD format. In case you don't have the DID for the Issuer and/or the Holder, you can create one here.
Check out the Issue with status section to learn about how to issue a verifiable credential with a credentialStatus property.
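For orientation, here is a minimal Kotlin client calling the issuance endpoint over REST. The base URL/port and the exact request-body fields (templateId, issuerDid, subjectDid) are assumptions for illustration; check the Signatory API's Swagger doc for the precise schema of your version.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Assumption: the Signatory API is exposed locally; adjust host/port to your deployment.
    val signatoryApi = "http://localhost:7001"

    // Sketch of the issuance request: template plus issuer/holder DIDs (placeholders).
    val body = """
        {
          "templateId": "UniversityDegree",
          "config": {
            "issuerDid": "did:key:<issuer>",
            "subjectDid": "did:key:<holder>"
          }
        }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("$signatoryApi/v1/credentials/issue"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body()) // the issued Verifiable Credential
}
```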
The currently available template functions are:
list - display the list of Templates
import - import a custom template
load - display the content of the template having the specified id
The /v1/templates endpoint returns the list of the available template ids.
No parameter
E.g. List the templates
The /v1/templates/{id} endpoint imports your custom credential template, with the parameters:
id - path parameter (required) - the id of the template, e.g. MyCustomCredential
The /v1/templates/{id} endpoint displays the content of the template, having the parameters:
id - path parameter (required) - the id of the template
No parameter
E.g. Load the template for the id set to UniversityDegree.
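A template lookup is a plain GET against the id. The sketch below assumes a locally exposed Signatory API (the base URL is a placeholder) and loads the UniversityDegree template.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Assumption: adjust the base URL to where your Signatory API runs.
    val signatoryApi = "http://localhost:7001"

    val request = HttpRequest.newBuilder()
        .uri(URI.create("$signatoryApi/v1/templates/UniversityDegree"))
        .GET()
        .build()

    val response = HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body()) // the JSON content of the UniversityDegree template
}
```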
Refer to Credential Statuses section for more details on verifiable credential revocations.
Verifiable Credentials (VCs) are digital identity documents that can easily and securely be shared with and verified (incl. validity, integrity, authenticity, provenance) by anyone in a privacy preserving way. Importantly, they are never (!) stored on a blockchain due to privacy and compliance reasons.
The SSI Kit supports W3C Verifiable Credentials in different formats:
JSON / JWT
JSON-LD
Note that we are continuously adding support for new VC types and formats.
You can learn more about VCs here.
Custodian REST API functions.
The Custodian API provides management functions for maintaining secrets and sensitive data (e.g. keys, Verifiable Credentials) in a secure way:
Data Exchange (Protocols) enable the exchange of data (VCs) between different parties.
Different authentication and data exchange protocols are used to securely transfer identity data (e.g. VCs, VPs) between parties (e.g. from an Issuer to a Holder). They typically establish a mutually authenticated and encrypted data channel between the communicating parties.
The most common data exchange protocols used for SSI are:
OIDC4SSI / SIOP (Self-Issued OpenID Connect Provider): An extension of a mature authentication and authorization protocol called "OpenID Connect" (OIDC).
DIDComm: A novel protocol specifically designed for SSI and maintained by the Decentralized Identity Foundation (DIF).
Credential Handler API: A proposed browser-extension that may be used to connect the user's identity wallet to a web-application.
Our solutions enable you to use different data exchange protocols like OIDC/SIOP as required by different ecosystems.
Core REST API functions.
The Core API exposes wallet core functionality in the scope of storing and managing:
The Core API exposes most of the functionalities provided by the SSI Kit, however newer features will only be released in the other API endpoints. Therefore, it is recommended to use the Signatory API, Custodian API and Auditor API for most use cases.
The following DID management functions are available:
list - list DIDs
load - load DID
delete by url - delete by DID url
create - create DID
resolve - resolve DID
import - import DID
The /v1/did endpoint lists the available DIDs.
E.g. List the available DIDs.
The /v1/did/{id} endpoint loads a DID specified by:
id - path parameter (required) - the DID url string
E.g. Load the DID = did:key:z6Mkm8NbvDnnxJ2t5zLGSkYGCWZiqq11Axr58xQ3ZG1Jss3z.
The /v1/did/{id} endpoint deletes the DID by:
id - path parameter (required) - the DID url string
E.g. Delete the DID = did:key:z6Mkm8NbvDnnxJ2t5zLGSkYGCWZiqq11Axr58xQ3ZG1Jss3z.
The /v1/did/create endpoint creates a DID.
The method and keyAlias properties are common for all did-method requests: method is required, while keyAlias is optional (if not specified, a new key will be automatically created using the default algorithm according to the did-method). The method-dependent options have default values, if not specified otherwise. Below are the available properties by did-method.
useJwkJcsPub (default: false) - specifies whether to create a did:key using the jwk_jcs-pub multicodec (code: 0xeb51)
didWebDomain (default: "walt.id")
didWebPath (default: empty string)
version (default: 1)
network (default: "testnet")
E.g. Create a DID using the key method and automatically generate a new key.
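As a sketch, creating a did:key via the Core API boils down to posting the method (and optionally a keyAlias) to /v1/did/create. The base URL/port below is an assumption; verify the exact body against the Core API's Swagger doc.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Assumption: the Core API is exposed locally; adjust host/port to your deployment.
    val coreApi = "http://localhost:7000"

    // "method" is required; omitting "keyAlias" lets the kit generate a suitable key.
    val body = """{ "method": "key" }"""

    val request = HttpRequest.newBuilder()
        .uri(URI.create("$coreApi/v1/did/create"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body()) // the newly created DID
}
```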
The /v1/did/resolve endpoint resolves a DID url string to a DID document.
E.g. Resolve the DID = did:key:z6MkqmaCT2JqdUtLeKah7tEVfNXtDXtQyj4yxEgV11Y5CqUa.
The /v1/did/import endpoint resolves and imports the specified DID url to the underlying data store.
E.g. Import the DID = did:key:z6MkqmaCT2JqdUtLeKah7tEVfNXtDXtQyj4yxEgV11Y5CqUa.
DID management functions enable the following:
List - lists the available DIDs
Load - loads a DID by the specified id
Delete - deletes a DID by the specified url
Create - creates a new DID
Resolve - resolves a DID to a document
Import - import a DID
For more info on DIDs, go here.
The /did endpoint lists the available DIDs.
E.g. List the available DIDs.
The /did/{id} endpoint loads a DID specified by:
id - path parameter (required) - the DID url string
E.g. Load the DID having the id = did:web:walt.id.
The /did/{id} endpoint deletes the DID specified by:
url - path parameter (required) - the DID url string
E.g. Delete the DID having id = did:web:walt.id.
The /did/create endpoint creates a DID.
The method and keyAlias properties are common for all did-method requests: method is required, while keyAlias is optional (if not specified, a new key will be automatically created using the default algorithm according to the did-method). The method-dependent options have default values, if not specified otherwise. Below are the available properties by did-method.
useJwkJcsPub (default: false) - specifies whether to create a did:key using the jwk_jcs-pub multicodec (code: 0xeb51)
didWebDomain (default: "walt.id")
didWebPath (default: empty string)
version (default: 1)
network (default: "testnet")
E.g. Create a DID using the web method, having the domain set to walt.id.
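The same options map onto the request body; the sketch below creates a did:web for the walt.id domain. The Custodian base URL/port is an assumption, and the body layout should be double-checked against the Swagger doc.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Assumption: the Custodian API is exposed locally; adjust host/port to your deployment.
    val custodianApi = "http://localhost:7002"

    // did:web options as listed above: didWebDomain and didWebPath.
    val body = """{ "method": "web", "didWebDomain": "walt.id", "didWebPath": "" }"""

    val request = HttpRequest.newBuilder()
        .uri(URI.create("$custodianApi/did/create"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    println(HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString()).body())
}
```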
The /did/resolve endpoint resolves a DID.
E.g. Resolve the DID having id = did:key:z6MkkLmAVeM3P6B2LJ2xGrK1wVojCoephK4G9VrCcct42ADX.
The /did/import endpoint resolves and imports the DID to the underlying data store.
Request body: the DID url string.
E.g. Import the DID having id = did:key:z6Mkm8NbvDnnxJ2t5zLGSkYGCWZiqq11Axr58xQ3ZG1Jss3z.
Manage keys, DIDs, issue Verifiable Credentials, and verify them using the SSI-Kit command line tool.
Choose between a Docker or a JVM-based runtime.
Make sure you have a Docker build environment installed on your machine.
1. Pulling the project directly from DockerHub
2. Setting an alias for convenience
3. Getting an overview of the commands and options available
Make sure you have a JDK 16+ build environment including Gradle installed on your machine.
1. Clone the project
2. Change the folder
3. Run the project
The first time you run the command, you will be asked to build the project. You can confirm the prompt.
You will now see an overview of all the different commands and options available.
4. Set an alias
To make it more convenient to use, you can also set an alias for the wrapper script as follows:
5. Get the overview again
If you want to get a more detailed overview of the options provided for building the project on your machine, please refer to building the project.
For debug info, add "-v", e.g.:
Explore the components of the SSI Kit and their functionality:
To expose the API service using the CLI tool or the docker container, use one of the following commands:
Show all options for specifying bind address and ports:
On localhost only using the default ports 7000-7003
Binding on all network interfaces, using the default ports 7000-7003
Binding on a specific network interface (e.g.: 192.168.0.1)
When using Docker, one needs to bind to 0.0.0.0 in the container and limit the binding from the outside using the docker run -p syntax, like so:
Use custom ports by using the -p (Core API), -e (ESSIF API), -s (Signatory API) command options
Key management functions include:
List - lists the available keys
Load - loads a key specified by its alias
Generate - generate a key using the specified algorithm
Import - imports a key
Delete - deletes a specific key
Export - exports public and private key parts (if supported by the underlying keystore)
The /keys endpoint lists the keys available to the Custodian.
E.g. List the available keys.
The /keys/{alias} endpoint loads a key specified by its alias.
E.g. Load a key with id e548f032cadf4145ab6886a57c2e87e6.
The /keys/generate endpoint generates a key using the specified algorithm.
E.g. Generate a key using the EdDSA_Ed25519 algorithm.
The /keys/import endpoint imports a key (JWK or PEM format) to the underlying keystore.
Request body: the key string in JWK or PEM format.
E.g. Import a public key specified in JWK format.
The /keys/{id} endpoint deletes the key specified by the parameter:
id - path parameter (required) - the key alias
E.g. Delete the key with id bc6fa6b0593648238c4616800bed7746.
The /keys/export endpoint exports a key.
E.g. Export the public key with id = e548f032cadf4145ab6886a57c2e87e6 as JWK.
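To give an idea of the key endpoints in practice, the sketch below generates an EdDSA_Ed25519 key via the Custodian API. The base URL and the request-body field name are assumptions; consult the Swagger doc for the exact schema.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Assumption: the Custodian API is exposed locally; adjust host/port to your deployment.
    val custodianApi = "http://localhost:7002"

    // Assumption: the algorithm is passed as "keyAlgorithm"; verify against the Swagger doc.
    val body = """{ "keyAlgorithm": "EdDSA_Ed25519" }"""

    val request = HttpRequest.newBuilder()
        .uri(URI.create("$custodianApi/keys/generate"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    println(HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString()).body())
}
```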
The following credentials management functions are available:
list - list verifiable credentials
load - load the verifiable credential
delete - delete the verifiable credential
create - create a verifiable credential
present - create a verifiable presentation from the supplied credentials
verify - verify credential / presentation
import - import the verifiable credential
The /v1/vc endpoint lists the available credentials.
E.g. List the available credentials.
The /v1/vc/{id} endpoint loads a credential specified by:
id - path parameter (required) - the credential id
E.g. Load the credential having id = urn:uuid:d36986f1-3cc0-4156-b5a4-6d3deab84270.
The /v1/vc/{id} endpoint deletes a credential by:
id - path parameter (required) - the credential's id
E.g. Delete the credential with id = urn:uuid:d36986f1-3cc0-4156-b5a4-6d3deab84270.
The /v1/vc/create endpoint creates a credential.
E.g. Create a credential from the UniversityDegree template, having issuer = did:key:z6MkqmaCT2JqdUtLeKah7tEVfNXtDXtQyj4yxEgV11Y5CqUa and holder = did:key:z6MkkLmAVeM3P6B2LJ2xGrK1wVojCoephK4G9VrCcct42ADX.
The /v1/vc/present endpoint creates a verifiable presentation from the supplied credential. Only the JSON-LD format is supported.
E.g. Create a verifiable presentation from the provided VerifiableID credential for a holder with did = did:web:my.domain.
The /v1/vc/verify endpoint verifies the supplied credential or presentation against the signature policy.
E.g. Verify a presentation.
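A verification call posts the credential or presentation together with the policies to apply (policy names as listed in the Auditor section). The base URL and body layout below are assumptions for illustration; the authoritative schema is in the Swagger doc.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Assumption: the Core API is exposed locally; adjust host/port to your deployment.
    val coreApi = "http://localhost:7000"

    // Sketch of a verification request: the presentation (placeholder) plus policy names.
    val body = """
        {
          "value": "<the Verifiable Presentation as JSON-LD or JWT>",
          "policies": ["SignaturePolicy", "JsonSchemaPolicy"]
        }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("$coreApi/v1/vc/verify"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    println(HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString()).body())
}
```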
The /v1/vc/import endpoint imports a verifiable credential.
E.g. Import the UniversityDegree credential.
The demo shows an end-to-end SSI use case from the perspective of an individual (Holder).
Issuer Portal: Shows issuance of Verifiable Credentials (VCs) to a Holder’s wallet.
Visit our Issuer Portal.
Click on "Sign in". You do not need to input an email or a password.
Select and "confirm" your Verifiable Credential (VC) claim.
Wallet: Shows the receipt and management of VCs by a Holder.
Log into the Web Wallet (after being redirected from the Issuer Portal). The demo requires neither account creation nor a valid password. Just type an email address to log into your demo account.
Accept the connection request (from the Issuer Portal), by clicking on "share".
Review and accept the Verifiable Credential (VC).
Verifier Portal: Shows the presentation of VCs by the Holder in order to authenticate or identify themselves towards a Relying Party (Verifier).
Our hosted demo verifier is currently not working with our wallet. We are working on fixing the issue. Please reach out in case of any questions.
Visit the Verifier Portal.
Connect the wallet to share VCs.
Log into the wallet (you used before)
Accept the credential request by the Verifier.
Get access to the Verifier’s products, services or other benefits.
For building the project JDK 16+ is required.
First clone the Git repo and switch into the project folder:
The walt.id wrapper script ssikit.sh is a convenient way for building and using the library on Linux.
The script takes one of the following arguments:
build|build-docker|build-podman|extract|execute.
For example, for building the project, simply supply the "build" argument:
Manually with Gradle:
After the Gradle build you can run the executable.
In build/distributions/ you have two archives, a .tar and a .zip.
Extract either one of them, and run waltid-ssi-kit-1.0-SNAPSHOT/bin/waltid-ssi-kit.
ESSIF specific operations.
Commands:
onboard ESSIF Onboarding flow
auth-api ESSIF EBSI Authentication flow
did ESSIF DID operations.
tir ESSIF Trusted Issuer Registry operations.
timestamp EBSI Timestamp API operations.
taor ESSIF Trusted Accreditation Organization operations.
tsr ESSIF Trusted Schema Registry operations
The following key management functions are available:
list - list of key ids
load - load the public key in JWK format
delete - delete key
generate - generate key
import - import key
export - export key
The /v1/key endpoint lists the available key ids.
E.g. List the available key ids.
The /v1/key/{id} endpoint loads the public component of the provided key id in JWK format:
id - path parameter (required) - the key id
E.g. Load the key having id = e548f032cadf4145ab6886a57c2e87e6.
The /v1/key/{id} endpoint deletes the specified key.
E.g. Delete the key having id = e548f032cadf4145ab6886a57c2e87e6.
The /v1/key/gen endpoint generates a new key using the specified algorithm.
E.g. Generate a new key using the EdDSA_Ed25519 algorithm.
The /v1/key/import endpoint imports a key (JWK or PEM format) to the underlying keystore.
E.g. Import a public key specified in JWK format.
The /v1/key/export endpoint exports the public and private key parts (if supported by the underlying keystore).
E.g. Export the public key with id = bc6fa6b0593648238c4616800bed7746 as JWK.
This use case describes the steps, which are required to register a DID on the EBSI blockchain.
Key generation (type ECDSA Secp256k1, which is required for signing ETH transactions)
Generation of the DID document
EBSI/ESSIF Onboarding flow.
As a prerequisite, the bearer token (validity of 15 min) from https://app-pilot.ebsi.eu/users-onboarding/v2 must be placed in the file data/ebsi/bearer-token.txt.
After successfully completing the onboarding process, the Verifiable Authorization (validity of 6 months) from the EBSI Onboarding Service is placed in data/ebsi/verifiable-authorization.json
EBSI/ESSIF Auth API flow
After successfully completing the Auth API flow, the decrypted EBSI Access Token (validity of 15 min) can be accessed in the file data/ebsi/ebsi_access_token.json
EBSI/ESSIF DID registration
DID Resolution (only to check if the DID was correctly anchored with the EBSI blockchain)
The resulting DID document from the EBSI blockchain:
First, pull the latest container.
Start the container as a RESTful service.
Key generation (type ECDSA Secp256k1, which is required for signing ETH transactions)
Generation of the DID document
EBSI/ESSIF Onboarding flow
EBSI/ESSIF Auth flow
EBSI/ESSIF DID registration
DID Resolution (only to check if the DID was correctly anchored with the EBSI blockchain)
The did:ebsi example shows how to register an EBSI DID in Java.
Services come with their own configuration files.
For the configuration of service -> implementation mappings, ServiceMatrix is used.
The default mapping file is "service-matrix.properties", and looks like this:
e.g., to change the keystore service, simply replace the line
id.walt.services.keystore.KeyStoreService=id.walt.services.keystore.SqlKeyStoreService
with your own implementation mapping, e.g. for the Azure HSM keystore:
id.walt.services.keystore.KeyStoreService=id.walt.services.keystore.azurehsm.AzureHSMKeystoreService
To add a service configuration:
id.walt.services.keystore.KeyStoreService=id.walt.services.keystore.SqlKeyStoreService:sql.conf
Service configuration is by default in HOCON format. Refer to the specific service for how its configuration is laid out.
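In code, the mapping file is loaded once at startup; afterwards, service lookups resolve to whatever implementation the matrix configured. A minimal sketch (class names as used by the kit; the exact API may vary by version):

```kotlin
import id.walt.servicematrix.ServiceMatrix
import id.walt.services.keystore.KeyStoreService

fun main() {
    // Load the default mapping file; pass another path to use a custom matrix.
    ServiceMatrix("service-matrix.properties")

    // Resolves to the mapped implementation, e.g. SqlKeyStoreService configured via sql.conf.
    val keyStore = KeyStoreService.getService()
    println("Active keystore implementation: ${keyStore::class.qualifiedName}")
}
```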
EBSI is a blockchain run by member states and public authorities. The idea is to create a trusted, public ledger that serves as a single source of truth for vital information. As such, EBSI instantiates different Trust Registries, such as
Trusted Issuers Registry (TIR), which contains information about organisations (e.g. public keys, accreditations, legal name, ...)
Trusted Accreditation Organisation Registry (TAOR), which is similar to the TIR, but contains information about organisations that can authorise Issuers to issue credentials in regulated fields.
Trusted Schemas Registry (TSR), which contains information about semantic contexts and vocabularies, policies and templates of VCs and their data models to ensure semantic interoperability.
ESSIF's goal is to bring SSI to Europe. To do so, it fulfills different functions:
Governance and Trust Framework: ESSIF drives the creation of an EU Governance and Trust Framework supported by Member States.
Standards and specifications: ESSIF creates business, functional and technical specifications, which make up the European “flavor of SSI”.
Facilitate Adoption: ESSIF coordinates the adoption of SSI across Europe starting with pilot projects of “Early Adopters” which are public authorities from different member states.
Note that ESSIF specifies its own DID Method and different types of Verifiable Credentials (VCs):
Verifiable IDs, which are mainly used to identify entities at a high-level of assurance. As such, they can be compared to passports or national IDs.
Verifiable Attestations, which encode basically any other type of identity information, such as education or work records, financial data or health data.
Verifiable Mandates, which enable delegation as outlined in this paper.
There are also other variations of VCs that evolved over time, such as Verifiable Accreditations (as issued by TAOs; see above) or Verifiable Authorizations (used to authorize parties to interact with EBSI).
This section describes the main steps required to interact with Velocity network:
Intro to Velocity Network
Velocity Network™ is a public permissioned distributed network, based on a permissioned version of the Ethereum Blockchain utilizing Hyperledger Besu. Operating a node and writing to the Velocity Ledger requires permission from the Velocity Network Foundation®.
The following data is stored on chain:
organization metadata - DID, profile, endpoint
credential metadata (encrypted) - ID, type, public key, revocation status
verification voucher transactions
credential types and schemas
Holder - a person that holds the credential on behalf of the subject (themselves or another person)
Issuer - an organization that creates and issues credentials
first party issuer - an entity that can directly attest to the claims within the credential
notary issuer - an entity that can evaluate evidence to attest to the claims within the credential
Relying Party - an entity that requests and verifies credentials from a Holder
Wallet Provider (Holder App Provider) - an organization offering digital wallets to be used by Holders
Credential Agent Operator - an organization operating a credential agent
Agent - an interface to the network used by organizations (Issuer, Relying Party, Holder) to call contracts and retrieve account states; agents form the 'layer-2' network
Tenant - an organization’s delegate on which behalf the agent is acting
Node Operator - an organization operating a node
Node - a participant on the network holding copies of the underlying ledger
Members (Stewards) - read-only nodes with limited data access that forward write operations to Validators
Validators - full-write permission nodes that participate in consensus
Velocity Network Registrar - a set of centralized services that are used for administering the accredited organizations and credential types on the Network
Credits hub - a module where Velocity credits are administered and credit reward transactions can be executed
Voucher hub - a module where Velocity vouchers are administered and top up transactions can be executed
The ledger - the distributed blockchain-based, continuously-replicated, global cryptographic database maintained by Stewards operating nodes communicating with the Velocity consensus protocol
Issuing - the process of asserting claims about a Holder who receives the verifiable credential
by writing a transaction to the Velocity Ledger which includes the credential ID, its type, the Issuer ID, and the public key matching the private key that signed it
Revocation - the act of an Issuer revoking the validity of a credential
by writing a transaction to the Velocity Ledger marking the credential as revoked
Verification - the process of confirming that a verifiable credential is not modified, revoked or expired and is issued by a trusted authority
by accessing the Velocity Ledger to retrieve the unique public key associated with the credential and verify its signature
Velocity currently uses the JWT format for encoding credentials with JWS signatures using SECP256K1 as proofs.
Verifiable credentials are divided into the following categories:
Layer-1 credential types - network’s core set of credential types (e.g. Email, IdDocument, OpenBadgeCredential)
for each issued credential, the Issuer receives a reward in the form of Velocity Credits
Layer-2 credential types - any custom credential type
should be mapped to a Layer-1 type in order for the Issuer to be eligible for a reward
More about credential types here https://docs.velocitynetwork.foundation/docs/developers/basics-credential-types.
did:ion - used to identify organizations or individuals
received when registering with the Registrar
did:velocity - used to identify credentials
is immutable
stores only a single key and credential type
resolving it will burn an NFT to permit DID resolution
This is a holistic SSI use case, which demonstrates the setup of two identities for an Issuer and a Holder on the EBSI blockchain. It also shows the steps to issue two diploma credentials to the Holder (e.g student), which then creates a Verifiable Presentation including both credentials in order to be verified. The Verifier then resolves the DIDs from the EBSI ledger and uses the corresponding public keys to verify the signatures from the issued credentials.
Creating a work-dir for all three parties of the trust triangle (Issuer, Holder & Verifier)
Setting up the Issuer (generating a key, EBSI DID and registering it on the EBSI ledger)
Setting up the Holder (generating a key, EBSI DID and registering it on the EBSI ledger)
Setting up the Verifier (only run ssikit in order to initialize the work-dir)
Issuing two credentials, one Bachelor & one Master degree (values are defined by running the interactive shell). Both credentials are based on the VerifiableDiploma Template
Creating the Verifiable Presentation containing both - the Master and the Bachelor credential
Verifying the Verifiable Presentation by resolving DIDs (public keys) from the EBSI ledger and verifying the signatures from each VC (Bachelor & Master degree credential) and from the VP itself.
Note that the order of the policies does matter. The TrustedIssuerDidPolicy & TrustedSubjectDidPolicy verify the presence of the DIDs on the EBSI ledger, and if they are present, the keys are imported into the key store. Once the keys are available, the SignaturePolicy can be applied in order to verify each signature.
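Expressed against the library API, the verification step could look roughly like the sketch below, with the policies passed in the order described above. Class names (Auditor, PolicyRegistry) follow the kit's documented usage, but treat the exact signatures as assumptions.

```kotlin
import id.walt.auditor.Auditor
import id.walt.auditor.PolicyRegistry
import id.walt.servicematrix.ServiceMatrix

fun main() {
    ServiceMatrix("service-matrix.properties")

    // Placeholder: the Verifiable Presentation created by the Holder (JSON-LD string).
    val vp = "<verifiable presentation>"

    // Policy order matters: the trusted-DID policies resolve and import the keys from EBSI
    // before the SignaturePolicy checks the signatures of the VCs and the VP.
    val result = Auditor.getService().verify(
        vp,
        listOf(
            PolicyRegistry.getPolicy("TrustedIssuerDidPolicy"),
            PolicyRegistry.getPolicy("TrustedSubjectDidPolicy"),
            PolicyRegistry.getPolicy("SignaturePolicy")
        )
    )
    println(result)
}
```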
Issuing involves an exchange between a Holder and an Issuer, by which the Holder receives a set of offers from the Issuer. Once the Holder accepts the offers, the Issuer converts them into verifiable credentials and supplies them to the Holder.
Depending on data source location and process initiating party, the following issuing types are supported:
custom - credential agent loads offers from itself as well as calling out webhooks
demand triggered - Holder initiates the credential claiming process
Issuer responds with the available credential offers according to Holder’s criteria
supply triggered - Issuer initiates the credential claiming process
offers are made available to the Holder using a notification mechanism by sending a deep-link or qr-code to claim the credentials
batch - credential agent loads offers only from itself
a specialized version of supply triggered custom issuing
More information on issuing can be found at https://docs.velocitynetwork.foundation/docs/developers/developers-guide-issuing.
Use cases you can build with the SSI Kit.
You can use Self-Sovereign Identity (SSI) - and by extension the SSI Kit - to solve any identity-related problem.
You can use the SSI Kit to enable your users, customers, employees or partners to access information, services or products. By this, you can replace today's cumbersome sign-up and login processes (usernames, passwords) with more seamless experiences.
In other words, you can use SSI to authenticate stakeholders you already know.
You can use the SSI Kit to identify people, organizations or even things to provide them with information, services or products.
Identity proofing is particularly important in AML (anti-money laundering) regulated industries, but is seeing growing adoption by non-regulated industries and platforms to prevent fraud, SPAM and other malicious behaviour.
Simply put, you can use SSI to identify stakeholders you do not yet know.
You can use the SSI Kit to verify any identity-related information beyond a person’s or company’s core identity (see Identity Proofing / Verification), which can be important when evaluating risks or performing compliance assessments.
For example, you can use SSI for
employment background checks (education, work, criminal history)
financial due diligence (bank account information, liquidity events, credit ratings)
any other type of data verification required for transactions from insurance or health data to social proofs like ratings or recommendations.
SSI can be used to digitize any type of identity-related information in order to replace paper-based identity documents or cards with digital ones that are easier to manage, share and verify as well as harder to forge.
For example, think about official public sector documents such as identity certificates or about licenses or certificates that convey allowance to perform regulated activities.
You can find more examples in our White Papers:
Me, Myself & (SS)I (co-authored by the Boston Consulting Group)
If you have any questions, feel free to get in touch.
We apologize that our current implementation does not yet support the Stardust Upgrade from IOTA. As such, you cannot issue or verify credentials associated with a did:iota. Please refer to our roadmap for more information on when our products will be updated to include these latest changes.
The following section outlines the planned integration of the IOTA identity framework with the walt.id SSI Kit and gives insight on the architecture of the integration and the required changes that need to be applied to the SSI Kit to establish smooth interoperability with the IOTA ecosystem.
The IOTA identity framework, much like the walt.id SSI Kit, is based on open standards for decentralized identity, such as the W3C specifications for verifiable credentials and decentralized identifiers (DIDs), and provides creation, management and registration of DIDs on the IOTA DLT technology (tangle), as well as issuance, signing and validation of verifiable credentials.
In addition to the open standards for decentralized identity, the IOTA identity framework implements a custom DID method, which needs to be supported by the walt.id SSI Kit to ensure compatibility.
Thanks to the use of W3C standards for DID documents and verifiable credentials, the SSI Kit is mostly compatible with the IOTA identity framework with regards to DID documents and issuance/validation of verifiable credentials. Also, the key type Ed25519, used in the IOTA framework, is already supported by the SSI Kit.
The following aspects have been identified, which require implementation changes and/or integration work in the SSI Kit:
IOTA DID method: Creation, management and registration of DIDs on the IOTA tangle
Key management: Integrate SSI Kit key management seamlessly, such that keys managed by the SSI Kit (or a supported key store implementation) can be leveraged in the context of the IOTA framework.
Signature type: To ensure compatibility of issued credentials with both the SSI Kit and the IOTA framework, the LD-signature type JcsEd25519Signature2020 needs to be supported in the SSI Kit.
Public key format: The public key, stored in the verification methods of the IOTA DID documents, is formatted in multibase encoding, for which support in the SSI Kit needs to be provided.
The following subsections give more details on the planned integration work.
The following chart outlines the overall architecture of the integration between the walt.id SSI Kit and the IOTA framework:
In order to overcome the native-to-managed-code gap between the IOTA identity framework libraries, written in Rust, and the SSI Kit in Kotlin/JVM, a wrapper library is implemented in Rust, which includes the IOTA libraries as a dependency and exposes a plain C-compatible application binary interface (ABI).
The wrapper library can be loaded in Kotlin/JVM using the JNR-FFI abstracted foreign function layer library. The advantage compared to JNI (Java Native Interface) is that no Java-specific interface code needs to be written in the native wrapper library, such that the same library could be used from various other programming and scripting languages that support loading of native dynamic libraries.
This approach also facilitates portability of the wrapper library to all operating systems and platforms supported by the Rust compiler and the IOTA framework library.
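To make the native-to-managed bridge more concrete, the hypothetical Kotlin sketch below loads such a Rust wrapper library via JNR-FFI. The interface name and its function are invented for illustration; the real ABI is defined by the planned wrapper, not here.

```kotlin
import jnr.ffi.LibraryLoader

// Hypothetical ABI of the Rust wrapper library; the actual exported functions
// will be defined by the integration work described above.
interface IotaWrapper {
    fun create_did(publicKeyMultibase: String): String
}

fun main() {
    // Loads libiota_wrapper.{so,dylib,dll} from the library path (the name is a placeholder).
    val wrapper = LibraryLoader.create(IotaWrapper::class.java).load("iota_wrapper")
    val didDocument = wrapper.create_did("<public key>")
    println(didDocument)
}
```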
For DID creation and management, the wrapper library implements an interface method, called by the IotaService component in the SSI Kit.
The public and private keys for creating the DID should be managed by the SSI Kit and its key store abstraction layer, with support for various key store implementations (see also Key management below).
The wrapper library makes use of the AccountBuilder of the identity_iota::account module to create and register a DID on the IOTA ledger. After creation the library updates the DID document to include the various verification method relationships, such that issuance (assertionMethod) and presentation (authentication) of verifiable credentials is permitted using the new DID.
The created DID document is returned to the SSI Kit, where it can be parsed and stored for further use.
In order to be able to make full use of the SSI Kit together with the IOTA framework, it is preferred to share the key store between both worlds.
The SSI Kit provides a key store abstraction layer that has support for various key store implementations, including an embedded key store and cryptography library, as well as cloud-based HSM stores, such as Azure Key Vault and the walt.id Storage Kit, a general-purpose distributed encrypted data store.
In order to leverage the SSI Kit key store abstraction with the IOTA identity framework integration, we plan to implement a key store mediator component in the Rust wrapper library, which exposes the IOTA storage interface on the one hand and, on the other hand, communicates the signing or encryption/decryption requests to the SSI Kit via a native-to-managed callback function. This key store mediator can be passed to the AccountBuilder as the storage interface to use for DID creation. The SSI Kit can then fulfill the cryptographic requests using the configured key store implementation and hand back the result to the wrapper library and IOTA framework internals.
The IOTA identity framework makes use of the JcsEd25519Signature2020 LD-signature type for signing and validation of verifiable credentials.
Thus, to be compatible with the credentials issued by the IOTA framework, the SSI Kit needs to be extended to support this type of signature.
To make the SSI Kit compatible with the DID documents created by the IOTA identity framework, it is required to support Multibase encoding of the verification material in the verification method objects, according to the latest DID specification, as for the time being, an older version of said specification is implemented, using Base58 encoding.
Given the support of multibase encoded verification material, the DID documents should be fully compatible with the SSI Kit.
The SSI Kit abstracts complexity for developers by following a "multi-stack approach" that enables you to use different implementations or "flavours" of SSI.
As a result, you can participate in different identity ecosystems (e.g. EBSI/ESSIF, Gaia-X, Velocity Network, cheqd and IOTA) and avoid technology-related lock-in effects.
Based on our Introduction to Self-Sovereign Identity (SSI), we distinguish the following concepts or building blocks:
Read on to learn which concrete technologies and implementations we support on the level of
Trust Registries
Keys
Decentralized Identifiers (DIDs)
Verifiable Credentials (VCs)
Data Exchange Protocols
It has always been our goal to provide developers and organizations with great tools, so they can focus on delivering holistic identity solutions. Taking the lessons learned from previous products, we decided to redesign our current offering, resulting in what we now call The Community Stack. A collection of open-source products providing everything to launch any identity solution with ease. You can learn more about it here.
Starting from December 2023, the SSI-Kit will halt feature enhancements, leading to a complete discontinuation planned for end-Q3 2024. It's essential to plan your transition to the new stack effectively. The table below indicates which components of the SSI-Kit are already supported in the new stack.
For Kotlin/Java projects where the SSI-Kit was used as a native dependency, utilize the provided Library for equivalent features in the new stack.
If you employed the REST APIs, simply switch to the supplied API in the new stack.
If you have any questions, please reach out.
All relevant new libraries and APIs have found their place in the waltid-identity repo.
SSI-Kit Feature | The Community Stack |
---|---|
Signatory allows you to digitize paper credentials and automate data provision to your stakeholders.
It provides all functionality required by “Issuers”. For example (see the sketch after this list):
Process and authenticate data requests by people or organisations,
Import data (from local storage or third parties),
Create re-usable VC templates,
Create VCs in different formats (e.g. JSON/JWT, JSON-LD),
Sign VCs using different key types (e.g. ed25519, secp256K1, RSA),
Manage the lifecycle of VCs (e.g. revocation).
Issue VCs (e.g. via OIDC/SIOP)
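As a library-level sketch of the issuance step, the following Kotlin snippet asks the Signatory for a VerifiableId credential with an LD proof. Class names (Signatory, ProofConfig, ProofType) follow the kit's documented usage; the DIDs are placeholders and the exact signatures may differ between versions.

```kotlin
import id.walt.servicematrix.ServiceMatrix
import id.walt.signatory.ProofConfig
import id.walt.signatory.ProofType
import id.walt.signatory.Signatory

fun main() {
    ServiceMatrix("service-matrix.properties")

    // Placeholders: create the issuer and holder DIDs first (e.g. via DidService or the Core API).
    val issuerDid = "did:key:<issuer>"
    val holderDid = "did:key:<holder>"

    // Issue a VerifiableId credential from the built-in template, signed as an LD proof.
    val vc = Signatory.getService().issue(
        "VerifiableId",
        ProofConfig(issuerDid = issuerDid, subjectDid = holderDid, proofType = ProofType.LD_PROOF)
    )
    println(vc)
}
```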
Custodian is a secure data hub for people and organizations. It provides all functionality required by “Holders”. For example:
Interact with Registries (read, write)
Create, store, manage keys, data (DIDs, VCs) and other secrets,
Request and import data (VCs) from third parties,
Selectively disclose data (VCs/VPs) for authentication and identification,
Manage consent and data access in a user-centric fashion.
Auditor allows you to verify your stakeholders’ identity data and offer frictionless access to services or products. It provides all functionality required by “Verifiers”. For example:
request data (VCs/VPs) from stakeholders,
verify data (VCs/VPs; incl. integrity, validity, provenance, authenticity),
trigger pre-defined actions following the verification.
The verification steps can be dynamically configured by passing "verification policies" to each verification attempt.
The SSI Kit comes with the following set of built-in verification policies:
SignaturePolicy: Loads or resolves DID, loads public key and verifies the credentials signature.
JsonSchemaPolicy: Validates the credential against the JSON schema.
TrustedSchemaRegistryPolicy: Checks if the JSON schema is anchored in the EBSI Trusted Schema Registry.
TrustedIssuerDidPolicy: Checks if the issuer DID is anchored on the EBSI DID registry.
TrustedIssuerRegistryPolicy: Checks if the issuer got inserted in the EBSI TIR (Trusted Issuer Registry).
TrustedSubjectDidPolicy: Checks if the subject DID is anchored on the EBSI DID registry.
IssuedDateBeforePolicy: Checks if the issued date is in the past.
ValidFromBeforePolicy: Checks if the valid-from date is in the past.
ExpirationDateAfterPolicy: Checks if the expiration date is in the future.
CredentialStatusPolicy: Checks if credential is revoked based on the credential-status list.
The SSI Kit establishes an identity infrastructure layer for any use case in any industry. Its core services are in the scope of:
Registry Interactions (e.g. read, write; agnostic towards the underlying tech e.g. DLT, DNS)
Key Management (e.g. generate, sign, import, export, manage lifecycle)
Decentralized Identifier (DID) operations (e.g. register, resolve, manage lifecycle)
Verifiable Credential/Presentations (VC, VP) operations (e.g. create, issue, present, verify)
Ecosystem specific use cases (e.g. onboarding, data exchange and monetization)
Illustration:
Learn what the SSI Kit is.
The SSI Kit offers everything you need to use Self-Sovereign Identity (SSI) with ease.
The following sections elaborate the SSI Kit's unique properties, enabled functionality and components.
Important: Please be informed that, beginning from December 2023, the SSI Kit will no longer receive new features. Furthermore, the SSI Kit is planned for discontinuation by the end of Q3 2024. However, all functionalities offered by the SSI Kit will be integrated into our new libraries, APIs, and apps in the walt.id identity repo, giving you more modularity, flexibility and ease of use to build end-to-end digital identity and wallet solutions. Read the transition guide here. For any clarification or queries, feel free to contact us as we aim to make this transition as smooth as possible.
This documentation will help you understand how the SSI Kit works and how you can use it. However, it presumes a certain level of knowledge about Self-Sovereign Identity (SSI) so
if you are already familiar with SSI, you can jump to the introduction of the SSI Kit.
if you are new to SSI, please continue with our introduction to Self-Sovereign Identity.
The architecture of the SSI Kit consists of three layers:
Low-Level Services Abstraction: Abstracts complex, low-level operations (e.g. cryptography, key management, digital signatures, data storage).
Ecosystem Abstraction: Abstracts ecosystem-specific requirements based on the relevant technical and governance frameworks (e.g. SSI flavors, business logic, policies).
High-Level Interfaces / APIs: Provides high-level interfaces that hide complexity and facilitate usage for developers.
Also, the architecture allows for the integration of third party solutions throughout the stack. For example:
Key storage (e.g. HSM, WebKMS)
Data storage (e.g. identity hubs, confidential storage)
Registries (e.g. blockchains, DNS)
This architectural openness prevents vendor lock-in and allows you to build SSI-based solutions that meet your unique requirements.
Illustration:
Read on to explore all three abstraction layers in more detail.
Learn about the technologies and concepts on which SSI is based.
Understanding SSI requires the understanding of a few core concepts:
Registries, which hold a shared and trusted record of information. In other words, they represent a “layer of trust” and can be referred to as the “single source of truth”.
Cryptographic keys, which convey control over digital identities and enable core functionality such as encryption and authentication.
Decentralized Identifiers (DIDs), which give us the power of verifying information, for example credentials, anywhere, anytime, through the establishment of a public key infrastructure. They link keys to unique identifiers that allow different parties to find and interact with each other.
Verifiable Credentials (VCs) which are digital identity documents that can easily and securely be shared with and verified (incl. validity, integrity, authenticity, provenance) by anyone in a privacy preserving way. Importantly, they are never (!) stored on a blockchain due to privacy and compliance reasons.
Verifiable Presentations (VPs), which contain identity data for verification from one or multiple VCs and are mainly created by the holder of the VCs.
Protocols enable the exchange of data (VCs) between different parties.
Wallets, which store our keys (control) and VCs (identity data) and enable the management and sharing of our digital identities and data via easy-to-use applications.
The following graphic shows how SSI works and highlights the core concepts (in blue)
Think of these core concepts as different building blocks that are available in different variations and can be put together in different ways:
As a result, there are different “flavours” of SSI depending on which variations of which building blocks have been used and how they have been put together.
Importantly, the differences in terms of technologies that are being used illustrate why interoperability has always been one of the most important topics within the industry and why the development and use of open standards (e.g. by the W3C, Decentralized Identity Foundation, OpenID Foundation and others) are vital for technology and vendor selection.
A Verifiable Presentation (VP) is a collection of one or more Verifiable Credentials, where the authorship of the whole collection can be cryptographically verified. VPs are standardized as part of the W3C Verifiable Credentials Data Model.
Verifiable Presentations make it possible to combine and share the data of one or more Verifiable Credentials in a tamper-evident way. The shared presentation of the data is encoded in such a way that the authorship of the data can be trusted after a process of cryptographic verification. In situations where only a subset of the original Verifiable Credential data is revealed, for example to enhance user privacy, Zero-Knowledge Proofs can help us keep that data verifiable.
Verifiable Presentations represent a composition of claims, which can come from one or multiple Verifiable Credentials, of which the authorship is verified. This gives the holder of credentials the chance to compose context-specific presentations, which only contain the data that is relevant in that context. When presenting the composition to a verifier, it can easily be validated.
Taking a closer look at how they are built up, we will see four different layers:
Presentation Layer - Being the Verifiable Presentation itself with the required metadata
Credential Layer - Referenced by Layer 1 and pointing to one or more credentials
Credential Proof Layer - Holding the proofs of the credentials and the signatures from Layer 2
Presentation Proof Layer - Holding the proof of the Verifiable Presentation and its signatures
If you want to get a better understanding of the different attributes present, please visit our section about VCs.
Our open source products enable you to act as a Holder (share VPs) and as a Verifier (request and verify VPs).
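For the Holder side, a presentation can be assembled from stored credentials roughly as sketched below, using the Custodian's presentation function. Method and parameter names follow the kit's documented usage but should be treated as assumptions; the credential and DID values are placeholders.

```kotlin
import id.walt.custodian.Custodian
import id.walt.servicematrix.ServiceMatrix

fun main() {
    ServiceMatrix("service-matrix.properties")

    // Placeholders: one or more Verifiable Credentials held by the Holder, plus the Holder's DID.
    val credentials = listOf("<verifiable credential JSON>")
    val holderDid = "did:key:<holder>"

    // Combine the credentials into a Verifiable Presentation signed by the Holder.
    val vp = Custodian.getService().createPresentation(credentials, holderDid)
    println(vp)
}
```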
Getting started with the SSI Kit.
Important: Please be informed that, beginning from December 2023, the SSI Kit will no longer receive new features. Furthermore, the SSI Kit is planned for discontinuation by the end of Q3 2024. However, all functionalities currently offered by the SSI Kit will be integrated into our new libraries, APIs, and apps under The Community Stack. This is aimed at providing a more modular, flexible, and efficient solution for your needs. For any clarification or queries, feel free to contact us as we aim to make this transition as smooth as possible.
The SSI-Kit's functionality can be used in a variety of ways. Select your preference to get started
My First Verifiable Credential (VC) - Issue and verify your first VC using the SSI-Kit Api
Advanced Verifiable Credentials (VC) - Leverage custom credential templates, the credential status property, prebuilt and custom verification policies.
Learn what SSI is.
Self-Sovereign Identity (SSI) is a user-centric approach to digital identity that gives people and organizations full control over their data. As a result, SSI enables anyone to easily share their data and reliably prove their identity (i.e. who they are and anything about them) without sacrificing security or privacy.
In other words, SSI enables you to “bring your own identity” and this is true for potentially any type of information - from your core identity (e.g. name, age, address) to your education and work records, your health and insurance data, bank account and financial information, etc.
Moreover, SSI can be used to model the digital identities of people, organizations and things.
At the end of the day, SSI promises a digital world in which interactions are effortless and worry-free. It is simply the next evolutionary step in identity management, a new paradigm in which our digital identities are no longer fragmented and locked into silos that are under someone else’s control, but only at our own disposal to be shared securely and privately.
SSI allows us to model digital identity just like we are used to identity working in the non-digital world, based on paper documents and cards. There are just some minor twists.
For example, instead of our identity documents being made of paper or plastic, they are digital credentials made of bits and bytes and instead of storing them in wallets made of leather, they are stored in digital wallets on our phones. Importantly, these digital credentials can be reliably verified by anyone they are shared with online or offline.
In doing so, SSI enables decentralized ecosystems in which different parties can exchange and verify identity-related information. These ecosystems look like three-sided marketplaces, so that every party can take on three roles:
Issuers - Parties who “issue” identity-related data to people or organizations (“Holders”) in the form of digital credentials. They are the original data sources of an SSI ecosystem. For example, a government issues digital passports to citizens or a university issues digital diplomas to graduates.
Holders - Individuals or organizations who receive digital credentials that contain data about themselves from various sources (“Issuers”). By aggregating and storing such credentials in digital wallets, Holders can build holistic digital identities that are under their control and can easily be shared with third parties ("Verifiers").
Verifiers - Parties who rely on data to provide products and services. They can reliably verify and process data that has been provided by others (“Holders”). Verifiers, also called “Relying Parties”, are usually organizations or individuals acting in their professional capacity.
Usually, a single party plays only one of these roles per interaction. However, it is perfectly normal for a party to take on different roles in different interactions.
For example:
A university (Holder) is accredited by a national authority (Issuer) to issue certain types of educational credentials.
A university (Issuer) issues a digital diploma to a graduate (Holder), who can share this information with a recruiter (Verifier) in the course of a job application.
After the recruiting process, a recruiter (Issuer) issues the results of an applicant’s assessment (e.g. skills, referral) to the applicant (Holder), who can share this information with a new manager or another recruiter (Verifier).
A manager (Issuer) issues the results of a performance review to their employee (Holder), who can share this information with HR (e.g. to improve talent development programs).
Manage keys and DIDs, issue Verifiable Credentials, and verify them using the SSI-Kit's REST API.
Make sure you have Docker or a JDK 16 build environment (including Gradle) installed on your machine.
Pull the docker container directly from Docker Hub and run the project:
This will create a folder called data in your current directory, used as storage for VCs, DIDs, keys and other data that needs to be persisted to provide the full functionality.
1. Clone the project
2. Change into the project folder
3. Run the project
The first time you run the command, you will be asked to build the project. You can confirm the prompt.
If you want to get a more detailed overview of the options provided for building the project on your machine, please refer to building the project.
After successfully running the project, you will have the endpoints, described below, available for use.
Exposed endpoints:
Type | Locally | Generally Available |
---|---|---|
The Core API exposes most of the functionality provided by the SSI Kit; however, newer features will only be released in the other API endpoints. Therefore, it is recommended to use the Signatory API, Custodian API and Auditor API for most use cases.
Issuance - Learn how to issue credentials
Holders - Learn how to maintain secrets and sensitive data (e.g. keys, Verifiable Credentials)
Verifiers - Learn how to verify credentials
My First VC - Play through a whole use case from Issuance to Verification
DIDs give us the power of verifying information, for example credentials, anywhere, anytime, through the establishment of a public key infrastructure.
DIDs are unique identifiers (URIs) which are standardised by the W3C. They can refer to any subject - from a person, to an organization, to a thing or basically anything else for that matter. Before we have a look at how a DID is structured and the benefit it provides, let's understand the shortcomings of current identifiers first and then see how the DID solves those.
In the digital economy, data exchange happens a lot, which makes it increasingly important to be able to identify persons, concepts or anything else for that matter in a secure and verifiable way.
A person today can have multiple different identifiers, like:
name@surname.com
https://www.surname.com
0000-0000-0000-0000 (ORCID based identifier, used to identify authors of scholarly communication)
All of these identifiers work, but none of them is decentralized, persistent, resolvable and cryptographically verifiable, as the following questions show.
Is the identifier decentralized?
https://www.surname.com
depends on a single point of failure. What happens if the hosting site disappears?
0000-0000-0000-0000 (ORCID)
depends on the ORCID database. What happens if it is discontinued, hacked, etc?
Is the identifier persistent?
https://www.surname.com
If I no longer pay for that domain, the identifier will be gone or may even be bought by somebody else.
Is the identifier resolvable?
How can I get additional information about 0000-0000-0000-0000 (ORCID) identifiers?
Is the identifier verifiable?
How can I prove that I own the domain https://www.surname.com?
If I stopped paying for my domain https://www.surname.com and somebody else bought it, how would anyone know that the information provided on the site was actually mine?
All these problems make it hard to be completely sure, when exchanging information, that the party we are exchanging it with is actually that party and not some malicious actor pretending to be them.
DIDs, a new form of unique identifier standardized by the W3C, are designed to address these problems of current identifiers by being:
Decentralized
The DIDs no longer depend on centralized registries, identity providers, authorities, etc.
Persistent
Once created, the DID is permanently assigned to the subject.
Resolvable
It is possible to find a basic set of information when resolving the DID. Typically, this will lead to a DID Document.
Cryptographically verifiable
There is a mechanism to cryptographically prove identity and ownership via information provided by the DID Document.
The DID is a simple text string built up of three parts:
A variety of “DID methods”, which are different implementations of the DID specification, exist. Since DID methods differ in how DIDs are created, registered and resolved, different methods come with different advantages and disadvantages.
For example, while DIDs are often anchored on Registries, such as EBSI (did:ebsi) or the Domain Name Service (did:web), new methods emerged that do not require Registries because their distribution is based on peer-to-peer interactions (e.g. did:key).
Now we can take any DID, look at its method and resolve it based on the framework around that method. The resolved content will most of the time be JSON or JSON-LD, although other data formats might be added in the future. The resolved content is called the DID Document.
A container of information holding:
The Subject
The owner of the DID Document
The Controllers
The entities allowed to introduce changes to the DID Document. The controller may or may not be identical to the subject. For example, if the DID Document belonged to the DID of a book, the controller would be the author or another related person rather than the book itself.
Cryptographic data
The DID Document contains the public keys of the corresponding entity. The keys might use any algorithm (typically elliptic-curve or RSA keys) and are mostly encoded in the JWK format, although other encodings such as PEM or Multibase are also supported. The keys can be classified for certain use cases such as authentication, verification, etc.
Service endpoints
Services that the subject wants to mention
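To make these parts more tangible, here is a minimal Kotlin sketch that models the DID Document fields described above. The class and property names follow the W3C DID Core vocabulary and are illustrative only; they are not SSI Kit classes.

```kotlin
// Illustrative model of a DID Document's main parts (not an SSI Kit class).
data class VerificationMethod(
    val id: String,                        // e.g. "<did>#key-1"
    val type: String,                      // e.g. "JsonWebKey2020"
    val controller: String,                // DID of the controlling entity
    val publicKeyJwk: Map<String, String>  // public key material, typically JWK-encoded
)

data class ServiceEndpoint(
    val id: String,
    val type: String,
    val serviceEndpoint: String            // URL of a service the subject wants to mention
)

data class DidDocument(
    val id: String,                              // the DID of the subject
    val controller: List<String>,                // entities allowed to change this document
    val verificationMethod: List<VerificationMethod>,
    val authentication: List<String>,            // references to keys used for authentication
    val service: List<ServiceEndpoint> = emptyList()
)
```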
The identifier did:ebsi:2A9RkiYZJsBHT1nSB3HZAwYMNfgM7Psveyodxrr8KgFvGD5y of the method ebsi would resolve to the following DID document:
Our open source products enable you to use different DID methods for different identity ecosystems. Every relevant functionality (e.g. generation, anchoring, resolution) is supported.
Verifiable Credentials (VCs) are digital credentials that contain actual identity data of people or organizations and are standardized by the W3C. They are digital equivalents of paper-based identity documents like passports or diplomas.
Before we dive deeper into Verifiable Credentials and learn about their structure and how they work, we will have a look at the problems of today's credentials.
Today's credentials are easy to fake, hard to verify, and not privacy-preserving by design. This makes it hard for businesses and people, offline and especially online, to trust each other when exchanging information and data. This brings about many problems, including:
Verifying that a presented document or claim is actually valid can take up a lot of resources and time. Just think about what you had to do the last time you opened a bank account: presenting your ID card via a video call, taking selfies, etc.
Often, the credentials you provide to get access to a service are then stored on centralized servers. This makes them not only vulnerable to data breaches; you also need to trust the organization to use the data only in ways you would agree with.
You might be forced to disclose more information than needed. The police officer checking your driver's license, in most cases, only needs to know that you are allowed to drive, not where you live or what your name is.
Organizations may employ people who claim to have a skill by presenting a fake certificate; when such jobs are performed poorly, the consequences can be catastrophic.
This is why we need a better way to verify the claims presented, and that is where Verifiable Credentials come in.
With VCs and the standard introduced by the W3C, we now have a way of creating a digital piece of information that identifies a particular entity or verifies a specific attribute, qualification or claim about them in a way that is almost impossible to forge, easy to verify, and privacy-preserving by design. This leaves us with the following benefits:
Easy to verify: There is a clearly defined and reliable way of verifying a Verifiable Credential.
Tamper-proof: No one except the issuer (the entity creating the VC) can change the claims stated in the VC.
Independent: No need to contact the issuer of the presented certificate to be certain about its validity. The check can happen in an independent, asynchronous way.
Data is owned: The holder of a certificate now owns the data and decides what to share and when, only providing proof but never actually giving it (a copy) to the service provider.
Portable: The user is free to choose where to take their VC and in which wallet it is saved.
For us to understand the typical lifecycle of a Verifiable Credential, we need to make sure we understand the idea behind an Issuer, Holder and Verifier and what DIDs and DID Documents are. With that out of the way, let's start with the cycle.
Registration of the Issuer: Depending on the governance framework, the issuer is accredited by a trusted entity before their DID and DID Document are put into the Registry. The Registry is the single source of truth and trust that Verifiers use as a reference point to make sure a presented VC is valid.
Holder setup: The holder generates a DID via the wallet and saves the associated private and public keys, so they can request, receive and present Verifiable Credentials from then on. The holder's DID and DID Document are never put into any registry; they only exist locally in the wallet.
Verifier setup: The verifier only needs the technology to communicate with the registries when presented with a VC, so they can validate its authenticity using the DID and DID Document of the issuer.
After the registration of the issuer and the setup of the wallet for the holder, the holder can now receive a VC from the issuer.
When the holder receives their Verifiable Credential, it will be saved in their wallet and will contain the following:
Metadata:
The DID of the issuer
The status of the credential (expiration and issuance date, revocation state)
Claims:
The DID of the holder of the credential
The claims about the subject (what the issuer asserts about the subject). This could be whether they are allowed to drive and what type of vehicle (driver's license), or the subject of their studies and the knowledge areas they are skilled in (university certificate).
Proof:
This will contain the signatures of the issuer, which can be used to check whether the content of the VC has been tampered with and to verify its authenticity.
The holder can now use the VC in their wallet to access services and products by presenting it to the service/product provider (the Verifier), thereby creating a Verifiable Presentation. The verifier goes through the following steps to make sure the credential is valid:
Before the content of the credential can be validated, the VC needs to be parsed from the supported JSON-LD or JWT format. Depending on the ecosystem used, the credential schema may also be validated.
Validate that the DID of the holder stated in the credential belongs to the person presenting the VC.
Check that all the status values are valid (expiration date and whether the credential has been revoked).
Check the claims about the subject and whether they match the requirements for granting the person access to the service they are requesting.
Check the signatures of the issuer and the holder by resolving the issuer's DID from the registry and obtaining the holder's DID from their wallet, then validating the signatures using the public keys in the related DID Documents.
When all the checks pass, the verifier can now grant the holder access to the service requested.
Our open source products enable you to act as an "Issuer" (create and issue VCs), as a Holder (manage and share VCs/VPs) and as a Verifier (request and verify VCs/VPs).
Key management functions like generation, listing, export/import, and deletion.
SSI-Kit CLI key management commands can be accessed with the key
command. It provides the following functionality:
Generate key - using gen command
List keys - using list command
Import key - using import command
Export key - using export command
Delete key - using delete command
All commands have the help option available:
<your-command> -h
or <your-command> --help
E.g. key gen -h
Use the gen
command to create an asymmetric key pair using the specified algorithm. Supported algorithms are:
RSA:
key gen -a RSA
or key gen --algorithm RSA
ECDSA Secp256k1:
key gen -a Secp256k1
or key gen --algorithm Secp256k1
EdDSA Ed25519 (default)
key gen
or key gen -a Ed25519
or key gen --algorithm Ed25519
The returned value represents the keyId
of the newly created key.
E.g. key gen -a Secp256k1
Use the list
command to list all keys in the key store:
key list
It will output the following fields:
key index - index within the list
keyId - key identification number
key algorithm - algorithm used to create the key
crypto service - the cryptographic service used to create the key
Use the import
command to import a key in JWK or PEM format:
key import <your-key-file-path>
JWK - based on the JWK key ID and key material, an internal key object will be created and placed in the corresponding key store
PEM - if there's no key ID in the PEM file (which is usually the case), a random key ID will be generated and, based on the key material, an internal key object will be created and placed in the corresponding key store. PEM files must have the file extension 'pem':
RSA keys - file should contain either the private key or private and public keys concatenated with a 'new line' character
Ed25519, Secp256k1 - file should contain both private and public keys concatenated with a 'new line' character
E.g.
Ed25519 JWK public key
key import ./ed25519jwk.json
Secp256k1 PEM key
key import ./secp256k1.pem
Use the export
command to export a key with the specified ID, type and format.
Available key types:
public (default):
key export <your-key-id>
or key export <your-key-id> --pub
private:
key export <your-key-id> --priv
Available export formats:
JWK (default):
key export <your-key-id>
or key export <your-key-id> -f JWK
or key export <your-key-id> --key-format JWK
PEM:
key export <your-key-id> -f PEM
key export <your-key-id> --key-format PEM
The output will display the exported key in the specified format.
E.g.
key export 17592087c6f04c358b9b813dbe2ef027 --pub -f PEM
key export 17592087c6f04c358b9b813dbe2ef027 --pub
key export 17592087c6f04c358b9b813dbe2ef027 --priv -f PEM
key export 17592087c6f04c358b9b813dbe2ef027 --priv
Use the delete
command to delete a key with the specified ID:
key delete <your-key-id>
E.g. key delete 17592087c6f04c358b9b813dbe2ef027
VC related operations like issuing, verifying and revoking VCs.
VC related operations like issuing, verifying and revoking VCs. If you're new to VCs, check out the intro section for an overview.
Commands:
Issue and save a VC - using the issue command
Present VC - using present command
Verify VC or VP - using verify command
List verification policies - using policies command
Import VC to custodian store - using import command
VC templates - using templates command
List VCs - using list command
All commands have the help option available:
<your-command> -h
or <your-command> --help
E.g. vc issue -h
Use the issue
command to issue a W3C Verifiable Credential with either a JWT or a JSON_LD signature.
options:
-t, --template TEXT
specify the VC template. To create your own template, have a look here [Required]
-i, --issuer-did TEXT
DID of the issuer (associated with signing key). [Required]
-s, --subject-did TEXT
DID of the VC subject (receiver of VC). [Required]
-v, --issuer-verification-method TEXT
KeyId of the issuer's signing key
-y, --proof-type [JWT|LD_PROOF]
Proof type to be used [LD_PROOF]
-p, --proof-purpose TEXT
Proof purpose to be used [assertion]
--interactive
Interactively prompt for VC data to fill in
--ld-signature, --ld-sig [Ed25519Signature2018|Ed25519Signature2020|EcdsaSecp256k1Signature2019|RsaSignature2018|JsonWebSignature2020|JcsEd25519Signature2020]
--ecosystem [DEFAULT|ESSIF|GAIAX|IOTA]
Specify ecosystem, for specific defaults of issuing parameters
--statusType [StatusList2021Entry|SimpleCredentialStatus2022]
specify the credentialStatus type
e.g.
vc issue -t OpenBadgeCredential -s did:key:z6MkpuUYdpaZPcpnEWnkE8vb7s2u2geTZJden1BwGXsdFUz3 -i did:ebsi:zZ5apnsHPUXNqjWELjNZhYW
This returns a credential document (JSON format).
Use present command to present a VC or VP to a verifier.
-i, --holder-did TEXT
DID of the holder (owner of the VC)
-v, --verifier-did TEXT
DID of the verifier (recipient of the VP)
-d, --domain TEXT
Domain name to be used in the LD proof
-c, --challenge TEXT
Challenge to be used in the LD proof
Use the verify command to verify a VC or VP.
-p, --policy VALUE
Verification policy. Can be specified multiple times. By default, SignaturePolicy is used. For more details on how to specify the policies, refer to Verification Policies.
To see available verification policies, use vc policies
command
Import VC to custodian store
Learn about VC template related functions like the listing and exporting of templates, as well as how to create/import your own custom VC template.
list
List VC Templates.
vc template list
result
export <template-name>
Export VC Template.
Options:
-n, --name <Name>
Name of the template
e.g. vc templates export --name VerifiableId
import <customCredentialPath.json>
Options:
-n, --name <Name>
Name of the template
Arguments:
credential path
the last argument of the command references the path to the custom credential, which should be imported
e.g vc templates import -n MyCustomCredential custom.json
custom.json
Output of the command
list
Lists the VCs saved in the custodian store.
e.g. vc list
This section shows how you can use the SSI Kit to interact with Europe’s emerging identity ecosystem based on the
EU Blockchain Service Infrastructure (EBSI)
EU Self-Sovereign Identity Framework (ESSIF)
Here you can find examples of how to use the SSI Kit command line interface to interact with the EBSI/ESSIF ecosystem.
Note that some of the flows are not yet in their final version. Slight modifications are to be expected.
Creation and anchoring of a new DID on the EBSI ledger (incl. use of "Verifiable Authorizations" and EBSI access tokens).
For SSIKit usage examples, refer to: EBSI DID registration
Onboarding of a legal entity to the EBSI/ESSIF ecosystem (incl. combined VC request and DID registration).
Gaining access to protected EBSI/ESSIF services (incl. presentation of "Verifiable Authorization").
The ESSIF protocol for the issuance of verifiable credentials aims to be compliant with the OIDC for Credential Issuance specification. Refer to the respective section for details:
The ESSIF protocol for the presentation of verifiable credentials to a Verifier or Relying Party aims to be compliant with the OIDC/SIOPv2 specification. Refer to this section for details:
OIDC/SIOPv2 for verifiable presentations
Several code examples showing how to use the ESSIF functionality of the SSI Kit are available on GitHub.
Note that the EBSI/ESSIF specifications are evolving, which means that some flows are not yet available in their final version. Modifications are to be expected.
Creation and anchoring of a new DID on the EBSI ledger (incl. use of "Verifiable Authorizations" and EBSI access tokens).
For SSIKit usage examples, refer to: EBSI DID registration
Onboarding of a legal entity to the EBSI/ESSIF ecosystem (incl. combined VC request and DID registration).
Gaining access to protected EBSI/ESSIF services (incl. presentation of "Verifiable Authorization").
The ESSIF protocol for the issuance of verifiable credentials aims to be compliant with the OIDC for Credential Issuance specification. Refer to the respective section for details:
The ESSIF protocol for the presentation of verifiable credentials to a Verifier or Relying Party aims to be compliant with the OIDC/SIOPv2 specification. Refer to this section for details:
OIDC/SIOPv2 for verifiable presentations
Several code examples showing how to use the ESSIF functionality of the SSI Kit are available on GitHub.
DID related operations, like registering, updating and deactivating DIDs. For more info on DIDs, go here.
Commands:
Create DID - using create command.
Resolve DID - using resolve command.
List DIDs - using list command.
Import DID to custodian store - using import command.
Delete DID from custodian - using delete command.
All commands have the help option available:
<your-command> -h
<your-command> --help
E.g. did create -h
Creates a DID document using did create [options]
command based on the corresponding SSI ecosystem (DID method). Optionally the associated asymmetric key is also created.
-m, --did-method [key | web | ebsi | iota | jwk | cheqd]
- Specify the DID method [key]. Supported DID methods are: "key", "web", "ebsi", "iota", "jwk", "cheqd"
-k, --key TEXT
- Specific key (ID or alias)
-d, --domain TEXT
- Domain for did:web
-p, --path TEXT
- Path for did:web
-v, --version INT
- Version of did:ebsi. Allowed values: 1 (default), 2
-n, --network [testnet | mainnet]
- cheqd network, default is testnet
-j, --useJwkJcsPub
- specifies whether to create a did:key using the jwk_jcs-pub multicodec (code: 0xeb51)
The returned value represents the DID document.
E.g. did create -m ebsi -k 8a2c3628acdd45999b4c0b5a69911437
IOTA support
For creating IOTA DIDs and registering them on the IOTA tangle, a wrapper library needs to be installed and available in the local library path.
The wrapper library is included in the SSIKit Docker image, so no additional setup is required for Docker users.
CLI users can find instructions for building the library and integrating it with the SSIKit at:
Resolves the DID document.
Options:
-d, --did TEXT DID to be resolved
-r, --raw / -t, --typed
-w, --write
List all created DIDs using did list
command
Import DID to custodian store using did import [options]
command
-k, --key-id TEXT
- Specify the key ID for the imported DID; if left empty, only the public key will be imported
-f, --file TEXT
- Load the DID document from the given file
-d, --did TEXT
- Try to resolve DID document for the given DID
Use the delete
command to delete a DID:
did delete <your did>
E.g. did delete -d "did:ebsi:zs79GYJvzEnQYxkAAj4UX1j"
The SSI Kit can also be used as direct dependency for JVM-based applications. In this case an existing application can easily be enhanced with SSI functionality.
The following illustrates how the SSI Kit can be used via Gradle or Maven (look for the current version on GitHub https://github.com/walt-id/waltid-ssikit)
Gradle
Maven
Required Maven repos:
You can find the latest version here. Make sure to add the version without the leading 'v'.
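As a rough sketch, the Gradle (Kotlin DSL) setup could look like the following. The repository URL and artifact coordinates below are assumptions for illustration only; take the exact values and the current version from the GitHub repository linked above.

```kotlin
// build.gradle.kts (sketch) - verify the repository URL, coordinates and version against the repository's README
repositories {
    mavenCentral()
    // assumed walt.id Maven repository; check the README for the authoritative URL
    maven { url = uri("https://maven.walt.id/repository/waltid/") }
}

dependencies {
    // assumed coordinates; replace <version> with the latest release (without the leading 'v')
    implementation("id.walt:waltid-ssi-kit:<version>")
}
```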
The verification process (inspection) is initiated by disclosure exchanges in which the Relying Party requests credentials from the Holder. These exchanges can be encoded in the following ways:
deep links - a URI that matches the spec:
QR-codes - a visual representation of a deep link
Depending on the use case, the Relying Party can request either:
verified credentials
requires payment in tokens
returns the verification checks (policies) result
unverified credentials:
requires no payment in tokens
can be verified later
The verification checks performed against the credential are the following:
UNTAMPERED
pass - hasn't been tampered with
fail - has been tampered with
voucher_reserve_exhausted - a voucher is required for verification
TRUSTED_ISSUER
pass - issuer is trusted
fail - issuer is not a member of Velocity
self_signed - data attested by the Holder
voucher_reserve_exhausted - a voucher is required for verification
UNREVOKED
pass - hasn't been revoked
fail - has been revoked
voucher_reserve_exhausted - a voucher is required for verification
UNEXPIRED
pass - hasn't expired
fail - has expired
More details on credential verification checks can be found at https://docs.velocitynetwork.foundation/docs/developers/developers-guide-disclosure-exchange#credential-verification-checks.
More on credential verification at https://docs.velocitynetwork.foundation/docs/developers/developers-guide-disclosure-exchange.
Auditor REST API functions.
The Auditor API enables anybody to act as a "Verifier" (i.e. verify Verifiable Credentials or Verifiable Presentations). The validation steps can be easily configured by existing or custom policies.
The following functionality is available:
- credential / presentation verification
- policy related functions
The /v1/verify
endpoint verifies a list of credentials / presentations specified in the JSON-LD
format against a set of policies. Each of the policies should be registered with the Auditor before being used in verification. If at least one of the listed policies fails, the entire credential is considered invalid.
E.g. Verification of a UniversityDegree credential against the Signature and JsonSchema policies, where the SignaturePolicy fails.
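For illustration, such a verification request can also be sent programmatically. The following Kotlin sketch uses only the JDK's HTTP client and the Auditor's local address from the endpoint overview (port 7003); the policy names and the credential placeholder are illustrative.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Verify one credential against the SignaturePolicy and JsonSchemaPolicy.
    val body = """
        {
          "policies": [
            { "policy": "SignaturePolicy" },
            { "policy": "JsonSchemaPolicy" }
          ],
          "credentials": [ <your-verifiable-credential-json> ]
        }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://127.0.0.1:7003/v1/verify")) // Auditor API
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body()) // per-policy verification results
}
```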
The Auditor Rest API also enables policy management with the following methods:
The /v1/policies
endpoint lists the available verification policies. The policy id
field is used to reference the policy during verification.
E.g. Listing of the verification policies
The /v1/create/{name}
creates a dynamic policy. The following parameters can be specified:
name
path parameter (required) - specifies the value to be used as the policy id
update
query parameter (optional, defaults to false
) - accepts boolean
values and specifies whether it should override an existing policy with the same name
(only if the policy is mutable)
downloadPolicy
query parameter (optional, defaults to false
) - accepts boolean
values and identifies how the policy
field is interpreted:
true - the policy field specifies a remote source that should be resolved to a policy
false - the policy field specifies the actual policy content
E.g. Creating a Rego policy that checks if a credential subject id is not null or empty
Code 200
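As a rough Kotlin sketch of such a request: the policy name and the URL to the Rego source below are placeholders, and the body fields follow the dynamic-policy parameters documented in the Dynamic Policies section.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Register a dynamic policy under the id "SubjectNotEmptyPolicy" (illustrative name).
    val body = """
        {
          "name": "SubjectNotEmptyPolicy",
          "description": "Checks that the credential subject id is not null or empty",
          "policy": "https://example.com/policies/subject-not-empty.rego"
        }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://127.0.0.1:7003/v1/create/SubjectNotEmptyPolicy")) // Auditor API
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val status = HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString()).statusCode()
    println("Policy creation returned HTTP $status") // 200 on success
}
```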
The /v1/delete/{name}
endpoint deletes a dynamic policy. The following parameters can be specified:
name
path parameter (required) - specifies the id
value of the policy
Policy removed / Policy not found
E.g. Removing the policy named 'MyPolicy'
Policy removed / Policy not found
The following functions are available for credentials management:
- lists the available credentials
- lists credential ids
- loads a credential by id
- store a credential
- delete a credential by id
- create a verifiable presentation from specific credentials
- create a verifiable presentation from specific stored credential ids
The /credentials
endpoint lists the available credentials:
id - query parameter (optional) - the list of credentials ids
E.g. List the credentials having ids urn:uuid:d36986f1-3cc0-4156-b5a4-6d3deab84270 and urn:uuid:d36986f1-3cc0-4156-b5a4-6d3deab84271
The /credentials/list/credentialIds
lists the available credential ids.
E.g. List the available credential ids.
The /credentials/{id}
loads a credential specified by:
id - path parameter (required) - the credential id
E.g. Load the credential having id = urn:uuid:d36986f1-3cc0-4156-b5a4-6d3deab84270.
The /credentials/{alias}
endpoint stores a verifiable credential by:
alias - path parameter (required) - the credential's id
The body should contain the VC to be stored. If no adjustments to the VC are required, the body can be the VC itself (e.g. the one received from the create VC endpoint).
E.g. Store the UniversityDegree
verifiable credential.
The /credentials/{alias}
deletes a credential by:
alias - path parameter (required) - the credential's id
E.g. Delete the credential with id = urn:uuid:d36986f1-3cc0-4156-b5a4-6d3deab84270
The /credentials/present
endpoint creates a verifiable presentation from the specified credentials.
E.g. Create a verifiable presentation from the provided VerifiableId and OpenBadgeCredential credentials for a holder with did = did:web:my.domain.
The /credentials/presentIds
endpoint creates a verifiable presentation from the specified credential ids.
E.g. Create a verifiable presentation from the stored credential having id = urn:uuid:d36986f1-3cc0-4156-b5a4-6d3deab84270 for the holder with did = did:web:my.domain.
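To tie a few of these endpoints together, here is a small Kotlin sketch that lists the stored credential ids and then loads one of them from the Custodian API (port 7002, see the endpoint overview); the credential id is the placeholder used in the examples above.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    val client = HttpClient.newHttpClient()
    val base = "http://127.0.0.1:7002" // Custodian API

    // List the ids of all credentials stored in the Custodian.
    val ids = client.send(
        HttpRequest.newBuilder().uri(URI.create("$base/credentials/list/credentialIds")).GET().build(),
        HttpResponse.BodyHandlers.ofString()
    ).body()
    println("Stored credential ids: $ids")

    // Load a single credential by id.
    val credential = client.send(
        HttpRequest.newBuilder()
            .uri(URI.create("$base/credentials/urn:uuid:d36986f1-3cc0-4156-b5a4-6d3deab84270"))
            .GET().build(),
        HttpResponse.BodyHandlers.ofString()
    ).body()
    println(credential)
}
```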
This section describes the following functions implemented as part of Velocity network integration:
Velocity Network-specific operations are available under the velocity
command:
Velocity credential verification is available with the verify
command:
E.g. Verify credential.
The interaction with the Velocity network is implemented in SSIKit using a REST API client, which currently exposes the functionality through the command line interface. The available functions can be grouped as follows:
organization related
onboarding
tenant configuration
credential related
issuance
verification
In order to cover the credential-related functions, as well as tenant management, SSIKit uses a credential agent deployed on walt.id infrastructure. For this, walt.id is registered on the Velocity network (currently only on testnet) as a credential agent operator and can issue and verify credentials on behalf of an issuer, using either the issuer's keys or walt.id keys. The diagram below shows how the Velocity integration is currently done with SSIKit.
The organization-related functions, such as onboarding and DID acquisition, are implemented by calling the Velocity network registrar REST API.
You can launch identity wallets or extend your existing applications with the SSI capabilities. Check out our if you are interested to learn more.
- display the available verification policies
- create a dynamic verification policy
- remove a dynamic verification policy
More details on creating verification policies and fields definitions can be found at .
Keys Operations: Create, Update, Delete
All types of operations are supported
ed25519
Available via: • Crypto Lib • Wallet-API
secp256k1
Available via: • Crypto Lib • Wallet-API
secp256r1
Available via: • Crypto Lib • Wallet-API
rsa
Available via: • Crypto Lib • Wallet-API
DIDs Operations: Create, Register, Resolve, Delete
All types of operations are supported
did:key
Available via: • DID Lib • Wallet API
did:jwk
Available via: • DID Lib • Wallet API
did:web
Available via: • DID Lib • Wallet API
did:cheqd
Available via: • DID Lib • Wallet API
did:iota
Available via: • DID Lib • Wallet API
did:ebsi
Not yet supported
W3C Credentials
Issuance
W3C Credential Issuance as JWTs
Available via: • Verifiable Credential Lib • Issuer API
W3C Credential Issuance as SD-JWTs
Available via: • Verifiable Credential Lib • Issuer API
W3C Credential Issuance as JSON-LD
Not yet supported
Verification
W3C Credential Verification (JWTs)
Available via: • Verifiable Credential Lib • Verifier API. Please note that the Issuer API only supports did:key at this point.
W3C Credential Verification (SD-JWTs)
Available via: • Verifiable Credential Lib • Verifier API
W3C Credential Verification (JSON-LD)
Not yet supported
Other Credential Features
Credential Templates
In The Community Stack, we no longer have the notion of a credential template. Issuance simply happens by providing the full W3C data schema, which will then be signed. A list of credential schemas can be found here.
Credential Revocation
Not yet supported
Policies
Similar. A list of all policies can be found here.
Open-Policy Agent Policies
Not yet supported. However, the new webhook policies also give you great flexibility until we reintroduce Open Policy Agent policies.
Issuance & Verification via OpenID4VC
Available via: • OpenID4VC Lib • Issuer API • Verifier API
Signatory API - For Issuers
http://127.0.0.1:7001
Custodian API - For Holders
http://127.0.0.1:7002
Auditor API - For Verifiers
http://127.0.0.1:7003
Core API
http://127.0.0.1:7000
ESSIF API
http://127.0.0.1:7004
Scheme
The did URI scheme identifier
DID Method
The identifier for the did method
DID Method-Specific Identifier
An identifier based on the requirements of the did method
REST API
Learn how to issue, verify and manage Verifiable Credentials, Keys and DIDs via API.
Java | Kotlin
Learn how to issue, verify and manage Verifiable Credentials, Keys and DIDs directly in a Java or Kotlin Application. Adding the SSI-Kit as a direct dependency.
CLI - Command Line Tool
Learn how to issue, verify and manage Verifiable Credentials, Keys and DIDs via the CLI.
OIDC For Credential Issuance
OIDC for Verifiable Presentations
Java
Examples on how to issue and verify Verifiable Credentials in a Java project.
Kotlin
Examples on how to issue and verify Verifiable Credentials in a Kotlin project.
In order to operate on Velocity network, any entity (regardless of scope - issuer, relying party or credential agent operator) has to register with the network.
The onboarding is currently done manually, and a DID will be custodied on the did:ion network:
get an account with the registrar
by sending an email to support@velocitynetwork.foundation
set up the organization(s)
set the required services according to the use case:
issuer - VlcCareerIssuer_v1
verifier - VlcInspector_v1
agent operator - VlcCredentialAgentOperator_v1
configure the tenants
add the required keys according to the use case:
issuer - ISSUING_METADATA
verifier - EXCHANGES
agent operator - DLT_TRANSACTIONS
The configuration steps from above can be completed either using:
or Rest API
More information on Velocity onboarding can be found at https://docs.velocitynetwork.foundation/docs/developers/developers-guide-getting-started.
ESSIF REST API functions.
The ESSIF API exposes the necessary endpoints for running the ESSIF specific flows between Issuers (incl. ESSIF Onboarding Services), Holders and Verifiers.
Aligned with the ESSIF terminology, the API is grouped by the User Wallet (wallet API for consumers / natural persons) and Enterprise Wallet (wallet API for organisations / legal entities).
Note that the EBSI/ESSIF specifications are expected to evolve which will be reflected in continuous updates of the proposed APIs.
The following functions are available:
onboard - EBSI onboarding flow, requests a Verifiable Authorization from EOS
authorize - ESSIF authorization flow
register did - registers DID on the EBSI blockchain
create timestamp - creates a timestamp in EBSI ledger
load timestamp by id - loads a timestamp by the timestamp id
load timestamp by hash - loads a timestamp by the transaction hash
The /v1/client/onboard
endpoint onboards the specified DID to the EBSI blockchain. It runs the ESSIF onboard API and requires a Bearer token, which can be acquired from https://app.preprod.ebsi.eu/users-onboarding.
E.g. Onboard the did = did:ebsi:zYubw2L8tCZSKKpAcMmCY2Q.
The /v1/client/auth
runs the ESSIF authorization flow for the specified did.
E.g. Run the authorization flow for did = did:ebsi:zYubw2L8tCZSKKpAcMmCY2Q.
The /v1/client/registerDid
endpoint registers the specified DID on the EBSI ledger.
E.g. Register did = did:ebsi:zwdPobJGue3w86Gpqhq5Cni on the EBSI ledger.
The /v1/client/timestamp
endpoint creates a timestamp on the EBSI ledger, using the provided DID as a key. The data will be written to the data-field of the timestamp.
E.g. Create a timestamp using did = did:ebsi:zYubw2L8tCZSKKpAcMmCY2Q.
The /v1/client/timestamp/id/{timestampId}
endpoint loads the timestamp based on the provided parameter:
timestampId - path parameter (required) - the timestamp id
E.g. Load the timestamp having id = uEiBEtXzl3QXshn5Z1V4dgZVtMlvnx3E1f2IWFDQVqzEv_Q.
The /v1/client/timestamp/txhash/{txhash}
endpoint loads the timestamp based on the provided parameter:
txhash - path parameter (required) - the transaction hash
E.g. Load the timestamp having the transaction hash 0x9c60ca0094771afe4093b0e47260eb623d5d18140e188e671cf912609cd0e169.
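For illustration, loading a timestamp by id could be done programmatically as follows, using the ESSIF API's local address from the endpoint overview (port 7004) and the timestamp id from the example above.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    val timestampId = "uEiBEtXzl3QXshn5Z1V4dgZVtMlvnx3E1f2IWFDQVqzEv_Q" // id from the example above

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://127.0.0.1:7004/v1/client/timestamp/id/$timestampId")) // ESSIF API
        .GET()
        .build()

    println(HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString()).body())
}
```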
A Cosmos based blockchain
The cheqd Network is a Cosmos SDK-based blockchain that facilitates the exchange of trusted, verifiable data (off-chain Verifiable Credentials) using on-chain identifiers (Decentralised Identifiers, 'DIDs'). The cheqd Network leverages the cheqd DID method and enables linked resources to be written to the network, associated with a DID and controlled using the verification methods in the DID Document. Through this approach, the cheqd Network is able to natively support all major credential types: VC-JWT, JSON-LD and AnonCreds, across an array of enterprise SDKs. cheqd also has a dedicated token, $CHEQ, used for identity writes to the network, voting in a decentralised governance framework, as well as for various payment flows between verifiers, holders and issuers of Verifiable Credentials.
Create DID - Create your first did:cheqd
Issue VC - Issue your first Verifiable Credential based on a did:cheqd
Verify VC - Verify your Credential based on a did:cheqd
Detailed instructions on how to verify VCs using the CLI.
Detailed instructions on how to verify VCs using REST.
Detailed instructions on how to issue VCs using the CLI.
Detailed instructions on how to issue VCs using REST.
For the verification of verifiable credentials, the SSI-Kit offers a wide range of predefined static and parameterized verification policies, which are ready to use and designed for common use cases. For more complex verification needs, custom policies can be created using a policy execution engine such as the Open Policy Agent.
These policies are predefined and cover a variety of common use cases, enabling developers to verify credentials without having to dive into dynamic or custom policy creation and scripting languages. Some of these policies include SignaturePolicy
, JsonSchemaPolicy
, ValidFromBeforePolicy
, ExpirationDateAfterPolicy
, and more.
Learn more about Static Verification Policies.
Parameterized policies are a type of policy that requires certain parameters or arguments for their execution.
Learn more about Parameterized Verification Policies.
Dynamic policies offer a more customized approach to credential verification, enabling even the most complex of use-cases. Policies can be created based on different policy engine languages.
In this tutorial, our fictional SSI Use-Case involves Emma, a recent university graduate, who wants to apply for a job. The employer requires proof of her academic credentials. To achieve this, she will use her Verifiable Degree Credential issued by her university.
Where we start:
Utilizing the walt.id SSI-Kit's REST-API, we will provide Emma with the Verifiable Degree Credential by creating a DID for both the university (Issuer) and Emma (Holder). After that, the employer (Verifier) will authenticate the Degree Credential using the associated DIDs.
What we do:
Create DIDs - for Emma + University
Issue Credential to Emma
Verify Credential from Emma
Make sure you have Docker or a JDK 16 build environment (including Gradle) installed on your machine.
Pull the docker container directly from Docker Hub and run the project:
This will create a folder called data in your current directory, used as storage for VCs, DIDs, keys and other data that needs to be persisted to provide the full functionality.
1. Clone the project
2. Change into the project folder
3. Run the project
The first time you run the command, you will be asked to build the project. You can confirm the prompt.
Now, with the project up and running, visit the walt.id Custodian API URL displayed in your terminal. We will use it to create our first DID (Decentralised Identifier). If you want to learn more about DIDs, visit the general section.
Body Parameters
method
: [string] method of the did. Value can be one of key, web, ebsi, iota, cheqd, jwk
Example
Create DIDs for the University (Issuer) and Emma (Holder) by running the example above twice. Make sure to save the DIDs somewhere, as we will need them later for the issuance of the credential.
Using the DIDs from the previous step, we will now issue a University Degree Credential. To issue the credential, we will be using the Signatory API endpoint. Below you find the final credential structure. Verifiable Diploma
This Verifiable Diploma includes the following key elements:
"type"
: Defines the credential as both a general "VerifiableCredential" and a more specific "UniversityDegreeCredential".
"@context"
: Provides references to the necessary standards and examples for interpreting the credential.
"id"
: A unique identifier for the credential.
"issuer"
: The DID of the university issuing the diploma (e.g., "did:example:456").
"issuanceDate"
: The timestamp indicating when the credential was issued.
"issued"
: The timestamp indicating when the credential was issued.
"validFrom"
: The timestamp indicating from when the credential is valid.
"credentialSubject"
: Contains the DID of the credential holder (Emma - e.g., "did:example:123") and the details of the degree earned, such as the name of the degree and its type.
proof
: The proof object contains the digital signature and related information that is used to verify the authenticity and integrity of the Verifiable Credential.
type
: "JsonWebSignature2020" indicates the cryptographic suite used for generating the digital signature. In this case, it's the JSON Web Signature (JWS) standard based on the Ed25519 signature algorithm.
creator
: "did:example:456" is the DID of the entity that created the digital signature, in this case, the issuer of the Verifiable Credential.
created
: "2023-03-21T15:35:08Z" is the timestamp when the digital signature was created. It is in the ISO 8601 format.
verificationMethod
: "did:example:456" is the identifier for the public key or cryptographic method used to verify the digital signature.
jws
: is the actual digital signature (JWS) generated using the Ed25519 algorithm. This signature is used to verify the authenticity and integrity of the Verifiable Credential.
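Putting these elements together, a sketch of such a diploma credential could look as follows (held in a Kotlin string purely for illustration). The @context values, the credential id and the jws value are placeholders, and the issuance/validity timestamps reuse the example timestamp above for simplicity; the tutorial's own JSON remains the authoritative structure.

```kotlin
// Illustrative reconstruction of the Verifiable Diploma described above (not the tutorial's exact output).
val verifiableDiploma = """
{
  "@context": [
    "https://www.w3.org/2018/credentials/v1",
    "https://www.w3.org/2018/credentials/examples/v1"
  ],
  "id": "urn:uuid:<credential-id>",
  "type": ["VerifiableCredential", "UniversityDegreeCredential"],
  "issuer": "did:example:456",
  "issuanceDate": "2023-03-21T15:35:08Z",
  "issued": "2023-03-21T15:35:08Z",
  "validFrom": "2023-03-21T15:35:08Z",
  "credentialSubject": {
    "id": "did:example:123",
    "degree": {
      "name": "Bachelor of Science and Arts",
      "type": "BachelorDegree"
    }
  },
  "proof": {
    "type": "JsonWebSignature2020",
    "creator": "did:example:456",
    "created": "2023-03-21T15:35:08Z",
    "verificationMethod": "did:example:456",
    "jws": "<signature>"
  }
}
""".trimIndent()
```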
Now that we know what the credential looks like, let's issue it.
Issue
Body Parameters
templateId
: [string] The identifier of the template used for issuing a Verifiable Credential. This template defines the structure and format of the credential being issued.
config
: [object] Contains configuration parameters for the issuance process.
issuerDid
: [string] The DID of the entity issuing the credential (University).
subjectDid
: [string] The DID of the entity receiving the credential (Emma).
proofType
: [string] Specifies the format and cryptographic algorithm used for the digital signature of the Verifiable Credential. E.g. LD_PROOF
credentialData
: [object] Contains the actual data of the credential being issued.
credentialSubject
: [object] Holds the information about the credential holder and the earned degree.
id
: [string] The DID of the credential holder, identical to the subjectDid
in the config
object.
degree
: [object] Contains details of the degree earned by the credential holder.
name
: [string] The name of the earned degree (e.g., "Bachelor of Science and Arts").
type
: [string] The type of the earned degree (e.g., "BachelorDegree").
Example:
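If you prefer a programmatic call over curl, a rough Kotlin sketch of the issuance request could look like this. The request body mirrors the parameters described above, and the Signatory API's local port (7001) is taken from the endpoint overview, but the issuance route (/v1/credentials/issue) and the UniversityDegree template name are assumptions; check the Signatory API's Swagger docs for the authoritative values.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Replace the placeholder DIDs with the ones created for the university and Emma.
    val body = """
        {
          "templateId": "UniversityDegree",
          "config": {
            "issuerDid": "<university-did>",
            "subjectDid": "<emma-did>",
            "proofType": "LD_PROOF"
          },
          "credentialData": {
            "credentialSubject": {
              "id": "<emma-did>",
              "degree": {
                "name": "Bachelor of Science and Arts",
                "type": "BachelorDegree"
              }
            }
          }
        }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://127.0.0.1:7001/v1/credentials/issue")) // assumed Signatory issuance route
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    println(HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString()).body()) // the issued credential
}
```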
Awesome, you just issued your first VC 🎉. Now Emma can use it to prove her academic credentials to her employer.
Emma needs to verify her academic credentials to her employer, which involves the following steps performed by the verifier (employer):
Verify the credential's digital signature: The Verifier checks if the digital signature on the credential is valid and has been signed by the Issuer's private key. This ensures that the credential was indeed issued by the claimed Issuer.
Check the Issuer's DID: The Verifier validates the Issuer's Decentralized Identifier (DID) by resolving it to a DID Document. This step helps confirm the Issuer's identity and retrieve their public key for signature verification.
Verify the Holder's DID: The Verifier checks the credential's subject (i.e., the Holder) by validating their DID. This ensures that the credential was issued to the correct entity and prevents unauthorized use of the credential.
Check the credential's status: (Not for our use-case) The Verifier may also query the credential's status on a registry, ledger, or another trusted source. This step helps confirm whether the credential is still valid or has been revoked by the Issuer.
Check the credential's integrity: The Verifier confirms that the credential data has not been tampered with since issuance. This is typically done by comparing the signed data with the credential's content to detect any discrepancies.
Verify the credential's issuance and expiration dates: The Verifier checks the issuance and expiration dates (if applicable) of the credential to ensure it was issued within the specified time frame and is still valid.
To verify the credential, we will be using the Auditor API endpoint.
Example
You have successfully issued and verified a Verifiable Credential. Great job!! Emma can start her new job 🎉 Where to go next?
Exploring ssi-kit Capabilities - Familiarize yourself with the various features of ssi-kit by reviewing the REST API or Command Line section.
Understanding Verification Policies - Discover the potential applications of Verification Policies and how they can be useful for your particular use-case.
Exploring Supported Ecosystems - Visit our ecosystem overview page to explore the different ecosystems (did:methods) that we support.
Some policies may require a parameter or argument for execution. The parameter is indicated in the policy list output, together with the expected data type.
Please refer to the SSI-Kit setup section to execute the command successfully. Let's verify a credential using the parameterless SignaturePolicy and the ChallengePolicy, which takes a parameter.
Flags
-p, --policy
: Verification policy. Can be specified multiple times. By default, SignaturePolicy is used. To specify a policy argument (if required), use the format PolicyName='{"myParam": "myValue", ...}', to specify the JSON object directly, or PolicyName=path/to/arg.json, to read the argument from a JSON file.
The Challenge Policy
It checks that the challenge of the credential is one of the challenges given in the ChallengePolicyArg argument.
Please refer to the SSI-Kit setup section to serve the API.
Using the /v1/verify
endpoint in the Auditor API to verify a credential
Body
policies
: [array] A list of policy definitions to verify against
policy
: [string] The name/id of the policy
argument
: [JSON] The argument needed by the policy (if required)
credentials
: [array] An array of credentials in JWT, or LD_PROOF format
They are the blueprints from which Verifiable Credentials are issued, and we offer two variants:
Prebuilt templates for the most relevant use-cases
Custom templates to help you issue use-case-specific verifiable credentials when the prebuilt ones are not enough. You can also watch our intro video on custom templates.
Onboarding Emma to a Virtual Company, issuing custom credentials, and utilizing dynamic verification policies.
In this tutorial, we will expand on the "My First VC" tutorial by onboarding Emma to a virtual company. During the onboarding, Emma will receive an EmployeeID credential based on a custom credential template that uses the status property. With that, you will learn about two more concepts: custom credential templates and the status property. The credential will also include Emma's role as a property, which can then be checked with a dynamic verification policy, another new concept.
Where we start:
Using the walt.id SSI-Kit's REST API, we'll streamline access control within the company by
Establishing a Digital Identity (DID): The company will be set up as the issuer through its own Digital Identity (DID).
Creating an EmployeeID Template: This template will be used for generating verifiable credentials for employees.
Issuing a Verifiable Credential to Emma: Emma, an employee, will be issued a Verifiable Credential, making her the holder.
Authenticating with Door Access System: The system, acting as the Verifier, will:
Confirm Emma's employment status
Assess her role within the company
Grant or deny access to specific areas on the company campus based on the above.
Make sure you have Docker or a JDK 16 build environment (including Gradle) installed on your machine.
Pull the docker container directly from Docker Hub and run the project:
This will create a folder called data in your current directory, used as storage for VCs, DIDs, keys and other data that needs to be persisted to provide the full functionality.
1. Clone the project
2. Change into the project folder
3. Run the project
The first time you run the command, you will be asked to build the project. You can confirm the prompt.
Now with the SSI-Kit up and running, we can create the DID for the company, establishing its Digital Identity.
You have two options for creating a DID: executing a curl command or visiting the Swagger docs. The Custodian endpoint should be visible in the terminal after you have run the serve command; visit the endpoint to see the Swagger docs.
Body Parameters
method
: [string] method of the did. Value can be one of key, web, ebsi, iota, cheqd, jwk
Example
Create one DID for the company using the command above. In case you haven't done the My First VC tutorial or no longer have the DID for Emma, create another one for her. Make sure to save both, as we will need them later for the issuance of the EmployeeID. You can also use the list DID endpoint to see all previously created DIDs.
The SSI-Kit supports the creation of credential templates, which are blueprints that establish reusable data structures based on which credentials can be issued.
Let's create one for an EmployeeID. The template will include the following properties:
id
: Identifier for the employee
name
: Name of the employee
role
: Role or designation in the company
joiningDate
: Date of joining the company
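A minimal sketch of what such an EmployeeID template could look like, held in a Kotlin string for illustration; the @context and type values are assumptions, and the JSON referenced in the step below remains the authoritative template body.

```kotlin
// Illustrative EmployeeID credential template; the field names follow the list above.
val employeeIdTemplate = """
{
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  "type": ["VerifiableCredential", "EmployeeID"],
  "credentialSubject": {
    "id": "",
    "name": "",
    "role": "",
    "joiningDate": ""
  }
}
""".trimIndent()
```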
Path Parameters
id
: The name for the credential template, which we will later reference during issuance. e.g. EmployeeID
Body Use the JSON structure from above as the body. Example
We can streamline the process of issuing a credential to Emma by utilizing the EmployeeID credential template. However, we also have the opportunity to further simplify the process using an additional feature: prefilling credential properties. This will be particularly beneficial as Emma likely won't be the only one receiving the Engineer role assignment. By creating a template where the Engineer role is prefilled, we can improve the issuance process.
You can now also save this template and use it later for the issuance of our VC, or use the one we created before and provide the value for the role attribute during issuance.
In order to issue Emma's EmployeeID VC, we will use one of the credential templates. Additionally, we will include a credential status so that we have the option to revoke the credential if Emma leaves the company without needing to remove it from her ownership. For more information on credential status and the types we support, read our guide.
Body Parameters
templateId
: [string] The identifier of the template used for issuing a Verifiable Credential. In our case this would be the EmployeeID or EngineerEmployeeID template.
config
: [object] Contains configuration parameters for the issuance process.
issuerDid
: [string] The DID of the entity issuing the credential (the company).
subjectDid
: [string] The DID of the entity receiving the credential (Emma).
proofType
: [string] Specifies the format and cryptographic algorithm used for the digital signature of the Verifiable Credential. E.g. LD_PROOF
statusType
: [statusType] Specifies if the credential should be issued with status and the type of the status. Options StatusList2021Entry
or SimpleCredentialStatus2022
credentialData
: [object] Contains the actual data of the credential being issued.
credentialSubject
: [object] Holds the information about the employee as described in the EmployeeID credential template.
The id
in the credential subject does not need to be provided, as it will be prefilled automatically with the DID of Emma.
Example EmployeeID:
Example EngineerEmployeeID:
Now we can use the issued VC in the company campus verification system.
To successfully authorize access, our authentication system needs to verify four crucial elements regarding the Verifiable Credential presented during the verification process:
Confirm the signature of the Verifiable Credential to ensure it's legitimate.
Ascertain the credential's validity by validating its current status, ensuring it has not expired or been revoked.
Verify that the issuer is indeed our company, thereby eliminating any fraudulent issuers.
Evaluate whether the role of the employee, as indicated on the credential, authorizes them to access the desired area.
To facilitate successful verification, we will use Verification Policies. For the initial two cases, which are relatively common, we can utilize our built-in policies, namely SignaturePolicy
and CredentialStatusPolicy
. These policies essentially verify that the signature is authentic and that the status is still valid.
For the additional cases, we will create custom policies, leveraging the Open Policy Agent (OPA) Engine and the REGO language. You can learn more about it here.
Please install the Open Policy Agent as described here, to ensure seamless verification.
Make sure to replace {issuerDID} with the DID of your issuer (company)
Creating Custom Policies - IsRole
This custom policy holds greater flexibility, as it accepts an argument containing a 'role' property. This property is then cross-verified with the role present in the Verifiable Credential. Alternatively, the role can be hardcoded, and the policy can be renamed accordingly, such as IsEngineer
, similar to the IsCompany
policy.
Path parameters:
policyName
: [string] Name of the policy, e.g. IsCompany
Query parameters:
update
: [boolean] Specifies if existing policy with same name should be overridden (if mutable)
downloadPolicy
: [boolean] When using a URL to reference the to-be-created policy: downloads and/or saves the policy definition locally, rather than keeping the reference to the original URL
Body
name
: [string] Policy name, must not conflict with existing policies
description
: [string] Optional policy description
input
: [JSON] Input JSON object for rego query, which can be overridden/extended on verification. Can be a JSON string or JSON file
policy
: [URL, REGO] Whole Policy or URL to policy definition.
dataPath
: [JSON path] JSON path to the data in the credential which should be verified, default: "$" (whole credential object)
policyQuery
: [string] The query string in the policy engine language. Defaults to
"data.system.main".
policyEngine
: [string] Policy engine type, default: OPA. Options, OPA
applyToVC
: [boolean] Apply/Don't apply to verifiable credentials (default: apply)
applyToVP
: [boolean] Apply/Don't apply to verifiable presentation (default: don't apply)
Using the endpoint above, you can save both policies.
Now with the custom policies created, we will use those and the two predefined ones to verify Emma's credential and enable access.
Body
policies
: [array] A list of policy definitions to verify against
policy
: [string] The name/id of the policy
argument
: [JSON] The argument needed by the policy (optional)
credentials
: [array] An array of credentials in JWT, or LD_PROOF format
Congratulations, you've reached the finish line! 🎉 You have skillfully utilized an array of features to bring this use case to life. Now, you have the opportunity to explore further and deepen your understanding of the concepts introduced today. Happy building!
Credential Templates - Dive deeper into credential templates and what you can do with them
Credential Status - Learn more about the credential status property and what is enables.
Verification Policies - Explore our pre-built policies and learn more about how to build and leverage custom ones using OPA (Open Policy Agent) and the Rego language.
Guide to SD-JWTs for selective disclosure; create, verify.
Intro - Learn about selective disclosure and its importance in the context of digital identity, and how SD-JWTs can be used for implementing it.
Issuing a SD-JWT Credential - Learn how to issue a Verifiable Credential using the SD-JWT format.
Verifying a SD-JWT Credential - Learn how to verify a Verifiable Credential using the SD-JWT format.
SSI Kit supports custom policies, written in any of the supported policy engine languages. A dynamic policy can either be executed on the fly (if all required parameters are provided) or saved under a specific name for later reference in the verify command or REST API.
Note: To use dynamic policies with Open Policy Agent, setup of the OPA Engine is required. Refer to the OPA Engine configuration for more details.
Create a dynamic policy - Learn how to create a dynamic policy via CLI or REST
Use a dynamic policy - Learn how to verify a VC using a dynamic policy via CLI or REST
Remove a dynamic policy - Learn how to remove a dynamic policy via CLI or REST
Data classes - Examine data classes used internally.
Example of a Rego policy
A simple Rego policy that takes a credential subject as input and verifies the subject DID against a given parameter is located in the SSIKit test resources: src/test/resources/rego/subject-policy.rego
Please refer to the SSI-Kit setup section to execute the command successfully.
You can save the policy by name, which simplifies its usage in future verifications.
Please refer to the SSI-Kit setup section to execute the command successfully. Example
Flags:
-n, --name
: Policy name, must not conflict with existing policies
-D, --description
: Optional policy description
-p, --policy
: Path or URL to policy definition. e.g.: rego file for OPA policy engine
-i, --input
: Input JSON object for rego query, which can be overridden/extended on verification. Can be a JSON string or JSON file
-d, --data-path
: JSON path to the data in the credential which should be verified, default: "$" (whole credential object)
-s, --save-policy
: Downloads and/or saves the policy definition locally, rather than keeping the reference to the original URL
-f, --force
: Override existing policy with that name (static policies cannot be overridden!)
-e, --policy-engine
: Policy engine type, default: OPA. Options: OPA
--vc / --no-vc
: Apply/Don't apply to verifiable credentials (default: apply)
--vp / --no-vp
: Apply/Don't apply to verifiable presentations (default: don't apply)
Please refer to the SSI-Kit setup section to serve the API.
Path parameters:
policyName
: [string] Name of the policy, e.g. MyCustomPolicy
Query parameters:
update
: [boolean] Specifies if existing policy with same name should be overridden (if mutable)
downloadPolicy
: [boolean] When using a URL to reference the policy to be created: downloads and saves the policy definition locally instead of keeping the reference to the original URL
Body
name
: [string] Policy name, must not conflict with existing policies
description
: [string] Optional policy description
input
: [JSON] Input JSON object for rego query, which can be overridden/extended on verification. Can be a JSON string or JSON file
policy
: [URL, REGO] Whole Policy or URL to policy definition.
dataPath
: [JSON path] JSON path to the data in the credential which should be verified, default: "$" (whole credential object)
policyQuery
: [string] The query string in the policy engine language. Defaults to "data.system.main".
policyEngine
: [string] Policy engine type, default: OPA. Options: OPA
applyToVC
: [boolean] Apply/Don't apply to verifiable credentials (default: apply)
applyToVP
: [boolean] Apply/Don't apply to verifiable presentations (default: don't apply)
A dynamic policy requires an argument of the DynamicPolicyArg type, defined as follows:
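The exact definition lives in the SSI Kit source; reconstructed from the properties listed below, it corresponds roughly to this Kotlin shape (exact types may differ):

```kotlin
// Sketch of the DynamicPolicyArg shape, based on the properties documented below.
// Exact types in the SSI Kit source may differ.
data class DynamicPolicyArg(
    val name: String = "DynamicPolicy",
    val description: String? = null,
    val input: Map<String, Any?> = emptyMap(),  // input data required by the policy
    val policy: String,                         // file path, URL, JSON path or inline policy
    val dataPath: String = "$",                 // credential data selected for verification
    val policyQuery: String = "data.system.main",
    val policyEngine: String = "OPA",
    val applyToVC: Boolean = true,
    val applyToVP: Boolean = false
)
```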
The properties are as follows:
name
: The policy name. Defaults to "DynamicPolicy".
description
: An optional description of the policy.
input
: A generic map (JSON object) holding the input data required by the policy. If no input is required, this can be an empty map.
policy
: The policy definition. Can be a file path, URL, JSON Path (if policy is defined in a credential property), or the policy script directly.
dataPath
: The path to the credential data to be verified. Defaults to the entire credential object ($). If you want to use only the credential subject as verification data, specify the JSON path like this: $.credentialSubject.
policyQuery
: The query string in the policy engine language. Defaults to "data.system.main".
policyEngine
: The engine used for policy execution. Defaults to OPA (Open Policy Agent).
applyToVC
: Determines whether this policy should apply to verifiable credentials. Defaults to true.
applyToVP
: Determines whether this policy should apply to verifiable presentations. Defaults to false.
The policy is executed by the specified policy engine, with the Open Policy Agent currently being the only supported engine. OPA receives an input object containing the dynamic policy's input parameter and the credential data configured in the policy argument.
The input object for the policy engine is structured as follows:
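Shown here as a Kotlin sketch with illustrative values (only the two documented keys, parameter and credentialData, are part of the structure):

```kotlin
// Sketch of the object handed to OPA: "parameter" carries the dynamic policy's input,
// "credentialData" carries the part of the credential selected by dataPath (values illustrative).
val opaInput = mapOf(
    "parameter" to mapOf("role" to "engineer"),
    "credentialData" to mapOf(
        "id" to "did:key:z6Mk...",
        "role" to "engineer"
    )
)
```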
This structure allows the Rego policy definition to access the input properties as follows:
input.parameter
: The input object defined in the DynamicPolicyArg's input property.
input.credentialData
: The credential data selected by the JSON path provided in the DynamicPolicyArg's dataPath property.
SimpleCredentialStatus2022 is a credentialStatus method by which the status of a verifiable credential can be checked. When a credential is issued with a SimpleCredentialStatus2022 credentialStatus type, it gets assigned a non-delegated revocation base-token. In order to check the status or revoke the verifiable credential, a delegated revocation derived-token is used.
The did:cheqd method is supported with the same functionality as the other DID methods (see the Decentralized Identifiers sections for the command-line interface and the REST API). Creating a did:cheqd will also onboard it with the Universal Registrar. The created DID can be checked at https://resolver.cheqd.net/1.0/identifiers/{your-did} or using the Universal Resolver.
did:cheqd requires keys of type Ed25519. They can be either:
imported into SSIKit and then referenced via their kid when creating the DID, or
created with SSIKit, either standalone or by default when running the did-create command.
Detailed instructions on how to build and run the SSI-Kit's CLI.
E.g. create a did:cheqd on testnet and also create the key for it by default
or
Detailed instructions on how to build and run the SSI-Kit's REST API.
POST https://core.ssikit.walt.id/v1/did/create/
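As a minimal sketch, the endpoint can be called with a JSON body containing the method and network parameters documented for this request (shown here with Java's built-in HTTP client):

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Minimal sketch: create a did:cheqd on testnet via the endpoint above.
// The body uses the documented "method" and "network" parameters.
fun main() {
    val body = """{ "method": "cheqd", "network": "testnet" }"""

    val request = HttpRequest.newBuilder()
        .uri(URI.create("https://core.ssikit.walt.id/v1/did/create/"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body()) // the created DID, resolvable via the cheqd resolver mentioned above
}
```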
We apologize that our current implementation does not yet support the Stardust Upgrade from IOTA. As such, you cannot issue or verify credentials associated with a did:iota. Please refer to our roadmap for more information on when our products will be updated to include these latest changes.
Authors: Severin Stampler (walt.id) December 2022
Login-with-IOTA is defined as a profile of the OpenID Connect for Verifiable Presentations specification [OIDC4VP], which defines the protocol for authorization using SSI, based on top of OAuth 2.0 [RFC6749] and introduces protocol extensions for the presentation of claims via Verifiable Credentials [VC_DATA_MODEL]. In this document we will describe the specifics of using the OIDC4VP protocol in the scope of Login-with-IOTA to ensure compatibility with the IOTA identity framework [IOTA_IDENTITY].
The IOTA identity framework [IOTA_IDENTITY] defines a custom DID method [IOTA_DID], based on the public key of the user account. The key material used is an EdDSA/Ed25519 [RFC8032] key pair.
| DID method | Key algorithm |
|---|---|
| did:iota | EdDSA / Ed25519 [RFC8032] |
To ensure compatibility with the IOTA identity framework [IOTA_IDENTITY], the issuers and holders of verifiable credentials should use a did:iota for issuance and as credential subject.
The credentials used by the IOTA identity framework [IOTA_IDENTITY] are in line with the W3C specification for Verifiable Credentials [VC_DATA_MODEL]. Every type of credential that is compatible with the W3C specification should, in theory, be supported.
Proofs for the credentials are created in the linked data format, ldp_vc or ldp_vp, as described by the W3C data integrity specification [VC_DATA_INTEGRITY], using [JSON-LD] as the credential format and the JCS Ed25519 Signature 2020 [JcsEd25519Signature2020] signature type.
| Format | Signature type |
|---|---|
| ldp_vc / ldp_vp ([JSON-LD]) | JcsEd25519Signature2020 |
The [OIDC4VP] specification is based on top of OAuth 2.0 [RFC6749], which enables implementers to also build on top of OpenID Connect [OIDC] and the Self-issued OpenId Provider specification [SIOPv2].
This Login-with-IOTA profile of the [OIDC4VP] specification supports only W3C Verifiable Credentials [VC_DATA_MODEL].
As described by [OIDC4VP], verifiable presentations can be requested by adding the presentation_definition parameter to the authorization request. The presentation is returned in the vp_token response parameter.
Both same-device and cross-device flows are supported by this profile. For cross-device scenarios, [OIDC4VP] introduces a new response mode, post, to support verifier-initiated verification.
The parameters for a Login-with-IOTA authorization request are a subset of the parameters defined by the [OIDC4VP] specification:
response_type
: REQUIRED. This parameter is defined in [RFC6749]. The possible values are determined by the response type registry established by [RFC6749]. This specification introduces the response type vp_token, which asks the Authorization Server (AS) to return only a VP Token in the Authorization Response.
presentation_definition
: CONDITIONAL. A string containing a presentation_definition JSON object as defined in Section 4 of [DIF.PresentationExchange].
presentation_definition_uri
: CONDITIONAL. A string containing a URL pointing to a resource where a presentation_definition JSON object as defined in Section 4 of [DIF.PresentationExchange] can be retrieved.
nonce
: REQUIRED. This parameter follows the definition given in [OIDC]. It is used to securely bind the verifiable presentation(s) provided by the AS to the particular transaction.
state
: OPTIONAL. State provided by the authorization client, that is passed through to the response.
A request MUST contain either a presentation_definition or a presentation_definition_uri. Those two ways to request credential presentations are mutually exclusive. The wallet MUST refuse any request violating this requirement.
The presentation_definition parameter must contain a JSON representation of a Presentation Definition object, according to the DIF Presentation Exchange specification [DIF.PresentationExchange]. Alternatively, the request can contain the presentation_definition_uri parameter, containing a URL to a presentation definition object.
The following example shows a presentation definition object, requesting the presentation of a VerifiableId credential:
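A minimal sketch of such a presentation definition, following [DIF.PresentationExchange] and constraining the credential type to VerifiableId (identifiers are illustrative):

```kotlin
// Illustrative DIF Presentation Exchange definition requesting a credential whose
// "type" array contains "VerifiableId".
val presentationDefinitionJson = """
{
  "id": "login-with-iota-vid",
  "input_descriptors": [{
    "id": "verifiable-id",
    "constraints": {
      "fields": [{
        "path": ["$.type"],
        "filter": { "type": "array", "contains": { "const": "VerifiableId" } }
      }]
    }
  }]
}
""".trimIndent()
```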
This is an example authorization request:
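A sketch of such a request, assembled from the parameters defined above (the wallet authorization endpoint, redirect_uri, nonce and state values are illustrative):

```kotlin
import java.net.URLEncoder

// Sketch of a Login-with-IOTA authorization request URI; endpoint and values are illustrative.
fun main() {
    val params = listOf(
        "response_type" to "vp_token",
        "response_mode" to "post", // "form_post" for the same-device flow (see below)
        "presentation_definition" to "<presentation definition JSON from the example above>",
        "nonce" to "n-0S6_WzA2Mj",
        "state" to "af0ifjsldkj",
        "redirect_uri" to "https://verifier.example.org/present"
    )
    val authorizationRequest = "https://wallet.example.org/authorize?" +
        params.joinToString("&") { (k, v) -> "$k=${URLEncoder.encode(v, Charsets.UTF_8)}" }
    println(authorizationRequest)
}
```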
The response parameters depend on the response_type defined in the authorization request. Possible response parameters include:
id_token
: The ID token as defined by section 2 of the [OIDC] core specification.
presentation_submission
: The presentation submission object, as defined in [DIF.PresentationExchange], which links the input descriptors of the presentation definition in the request to the corresponding presentation(s) in the vp_token response.
state
: Optional state parameter passed through from the authorization request.
Depending on the response_type given in the authorization request, the response should contain the following parameters, as described in section 6.1 of [OIDC4VP]:
If only vp_token is used as the response_type, the VP Token is provided in the authorization response.
If id_token is used as the response_type alongside vp_token, the VP Token is provided in the OpenID Connect authentication response along with the ID Token.
In all other cases, if vp_token is not used but the presentation_definition parameter is present, the VP Token is provided in the Token Response.
Any combination of vp_token with a response_type other than id_token is undefined.
The vp_token response parameter contains the verifiable presentation, or an array of verifiable presentations, matching the input descriptors of the presentation definition specified in the authorization request.
The only supported format of the verifiable presentation in this specification is the ldp_vp / [JSON-LD] format. The JSON data can be either a single presentation object or an array of JSON objects and must be URL encoded.
The presentation submission object contains the correlations of the input descriptors, specified in the presentation definition of the authorization request, with the verifiable presentations in the VP token of the response. The format of the presentation submission objects is defined in section 6 of [DIF.PresentationExchange].
This is an example response for the response_type=vp_token request parameter, with the presentation_submission and vp_token response parameters:
An example vp_token, containing a presentation with a VerifiableId credential:
An example presentation_submission object:
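A minimal sketch of such an object, following [DIF.PresentationExchange] (identifiers are illustrative and correspond to the presentation definition example above):

```kotlin
// Illustrative presentation_submission: maps the "verifiable-id" input descriptor to the
// single ldp_vp presentation in the vp_token and the credential nested inside it.
val presentationSubmissionJson = """
{
  "id": "submission-1",
  "definition_id": "login-with-iota-vid",
  "descriptor_map": [{
    "id": "verifiable-id",
    "format": "ldp_vp",
    "path": "$",
    "path_nested": { "format": "ldp_vc", "path": "$.verifiableCredential[0]" }
  }]
}
""".trimIndent()
```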
For creating compliant error responses, please refer to section 6.3 of [OIDC4VP].
For the same-device flow, the verifier or relying party links directly to the authorization endpoint of the wallet, passing the request parameters as specified in the Authorization request section.
The response_mode should be set to form_post. After getting the user's consent, the wallet generates the response parameters, as specified in the Authorization response section, and performs an HTTP form POST to the redirect_uri specified in the authorization request. The relying party can now verify the authorization response and redirect the user to the protected web page.
For the cross-device flow, the verifier or relying party initiates an internally cached authorization session and displays a QR code containing the authorization request URI with the request parameters, as specified in the Authorization request section.
The response_mode should be set to post. The wallet scans the QR code and parses the authorization request. After getting the user's consent, the wallet generates the response parameters, as specified in the Authorization response section, and posts the response to the redirect_uri specified in the authorization request via the HTTP POST method. The relying party can now verify the authorization response and update the state of the internally cached authorization session. Depending on the concrete implementation (for example, by polling for the session state), the relying party UI can now be refreshed and redirected to the protected page.
Terbu, O., Lodderstedt, T., Yasuda, K., Lemmon, A., Looker, T., "OpenID for Verifiable Presentations", September 2022, https://openid.net/specs/openid-4-verifiable-presentations-1_0.html
Hardt, D., Ed., "The OAuth 2.0 Authorization Framework", RFC 6749, DOI 10.17487/RFC6749, October 2012, https://www.rfc-editor.org/info/rfc6749
Sporny, M., Longley, D., Chadwick, D., "Verifiable Credentials Data Model v1.1", March 2022, https://www.w3.org/TR/vc-data-model/
IOTA Foundation, "IOTA Identity Framework Guide", https://wiki.iota.org/identity.rs/introduction/
Millenaar, J., IOTA Foundation, "IOTA DID Method Specification", https://wiki.iota.org/identity.rs/specs/did/iota_did_method_spec/
Josefsson, S., Liusvaara, I., "Edwards-Curve Digital Signature Algorithm (EdDSA)", RFC 8032, DOI 10.17487/RFC8032, January 2017, https://datatracker.ietf.org/doc/html/rfc8032
Longley, D., Sporny, M., "Verifiable Credential Data Integrity 1.0", https://w3c.github.io/vc-data-integrity/
Sporny, M., Longley, D., Kellog, G., Lanthaler, M., Champin, P., Lindström, N., "JSON-LD 1.1", https://www.w3.org/TR/json-ld11/
Cohen, G., Steele, O., Decentralized Identity Foundation, "JCS Ed25519 Signature 2020", https://identity.foundation/JcsEd25519Signature2020/
N. Sakimura, J. Bradley, M. Jones, B. de Medeiros, C. Mortimore, "OpenID Connect Core 1.0 incorporating errata set 1", November 8, 2014, https://openid.net/specs/openid-connect-core-1_0.html
K. Yasuda, M. Jones, T. Lodderstedt, "Self-Issued OpenID Provider v2", September 2022, https://openid.bitbucket.io/connect/openid-connect-self-issued-v2-1_0.html
Buchner, D., Zundel, B., Riedel, M., Duffy, K. H., "Presentation Exchange 2.0.0", https://identity.foundation/presentation-exchange/spec/v2.0.0/
In order to be able to onboard a did:cheqd on testnet and mainnet, SSIKit relies on a cheqd Universal Registrar deployed on walt.id infrastructure. The DID will be created using a key that is either imported into or created with SSIKit.
Velocity issuance commands are available under the issue command as follows:
offer management
credential management
Before being able to issue verifiable credentials, the credential data needs to be prepared. Offers represent the way to set up credential data. Basically, an offer is a credential that has not been signed. The offer management functions can be accessed from the command:
Currently available functions are:
create offer
E.g. Create an offer.
Credential management functions include:
issue credential
E.g. Issue credential.
Velocity onboarding commands are available under the onboard command as follows:
Once an account has been set up with the registrar (see onboarding), the CLI tool can be used to register the organization, using the command:
E.g. Onboarding organization.
Every organization needs a tenant on the credential agent. Tenant functions are available under the tenant
command:
create tenant
E.g. Add a tenant having verifier, issuer and agent operator purposes.
In order to issue / verify credentials, the correct identification disclosure must be set up. Current disclosure management functions are:
create disclosure
E.g. Create an integrated issuing identification disclosure.
Selective Disclosure allows holders to reveal only necessary information.
Selective disclosure enables a holder to choose which pieces of information contained in a Verifiable Credential will be revealed to a verifier, rather than being forced to reveal all the data present in a Verifiable Credential.
For example, Alice could now only share her age to verify being old enough to purchase products offered in an ecommerce shop, without revealing other personal information present in her Verifiable ID document used for verification. This allows for greater privacy and control over personal data.
Selective disclosure is a critical aspect of SSI because it enables individuals to share only the minimum amount of personal information necessary to complete a transaction or interaction, while keeping the rest of their personal data private. This reduces the risk of identity theft and other types of fraud.
Our implementation of selective disclosure currently does not follow any specific standard, as standards in the field are still under development. As a reference, we used the IETF's Selective Disclosure for JWTs (SD-JWT) specification. Please note that our implementation is subject to change.
A Selective Disclosure JSON Web Token (SD-JWT) is a type of JWT in which the claims in the body are hashed, making them unreadable without disclosure. By providing the necessary disclosures, the original values of the claims can be revealed.
When presenting a classical credential via JWT, claims are visible to the verifier in plain text. With an SD-JWT credential, claims are stored only in a hashed format, making them unreadable. This allows the holder to choose which claims to reveal to the verifier by providing the plain-text key-value pairs (known as disclosures) next to the SD-JWT. The verifier can then hash these disclosures and compare them to the values in the SD-JWT, verifying that the claim was part of the SD-JWT. Additionally, SD-JWTs allow for decoy hashes to be included in the credential, which are dummy values that conceal the actual number of claims in the credential.
In the following section, we will see how to issue and verify SD-JWT credentials.
The credentialStatus property is used to identify the status of a verifiable credential. It is an optional property (meaning when it's missing, the credential is not subject to any status change), but when specified, it includes the following mandatory fields:
id - a URI which identifies a location for the credential's status
type - an arbitrary string which identifies the type of the credential status (typically revocation or suspension)
Depending on the type, a credentialStatus property can contain additional fields, according to its model specification.
Learn about issuing SD-JWT credentials which includes hashing, creating disclosures and adding decoy hashes
Credential Creation: The issuer first creates the credential, as usual.
Conversion to SD-JWT Credential: The issuer then transforms the credential into an SD-JWT. This is done by hashing all or only a subset of claims, adding decoy hashes, and preparing disclosures. At the end, the SD-JWT credential can contain plain-text claims next to disclosable ones.
Claim Hashing: The claim is transformed into a disclosure, which is a plain-text representation of the claim, by concatenating the attribute name and value and prefixing them with a salt. The salt prevents attackers from guessing plain-text values via dictionary attacks. This is then converted to a base64 string, which represents the disclosure. For example, the disclosure may look like this (salt + attribute name + value): [ “dC12Y2xpYi9tYXN0ZXI”, “given_name”, “John” ]. The disclosure is then put into a hash function, and the result gets included in the SD-JWT (see the sketch after this list).
Adding Decoy Hashes: At this point, decoy hashes are also added to the SD-JWT. These decoy hashes are essentially dummy values that help conceal the actual number of claims a credential holds. By using decoy hashes, the issuer can prevent potential observers from determining how many claims are contained within the credential based on the number of hashes.
SD-JWT Credential Transfer to Holder: The issuer sends the SD-JWT and all disclosures to the holder. Because the holder now has the SD-JWT credential as well as all the disclosures, he can read the entire content of the credential. This allows the holder to decide which disclosures to send alongside the SD-JWT in a transaction with a verifier.
Transfer Format: On transfer, the SD-JWT is shared with the concatenated disclosures appended using the ~ sign, i.e. <SD-JWT>~<disclosure 1>~<disclosure 2>~...~<disclosure n>.
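To make the disclosure and digest mechanics above concrete, here is a minimal Kotlin sketch. The salt generation, hash algorithm (SHA-256) and base64url encoding follow the IETF SD-JWT draft; this is an illustration, not the SSI Kit's implementation:

```kotlin
import java.security.MessageDigest
import java.security.SecureRandom
import java.util.Base64

// Sketch: build a disclosure for the claim given_name="John" and compute the digest
// that would be embedded in the SD-JWT payload (the disclosure itself goes to the holder).
fun main() {
    val base64Url = Base64.getUrlEncoder().withoutPadding()

    // 1. Disclosure = base64url( [salt, claim name, claim value] )
    val salt = base64Url.encodeToString(ByteArray(16).also { SecureRandom().nextBytes(it) })
    val disclosureJson = """["$salt", "given_name", "John"]"""
    val disclosure = base64Url.encodeToString(disclosureJson.toByteArray())

    // 2. Digest = base64url( SHA-256(disclosure) ); this value ends up in the SD-JWT.
    val digest = base64Url.encodeToString(
        MessageDigest.getInstance("SHA-256").digest(disclosure.toByteArray())
    )

    println("disclosure: $disclosure")
    println("digest in SD-JWT: $digest")
}
```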
Using either the CLI, Kotlin or REST option, you can start issuing your SD-JWT credential.
Please refer to the SSI-Kit setup section, if you have never used the SSI-Kit before.
I will be using ssikit as an alias for ./ssikit.sh in this section.
Creating a did:key for our SD-JWT credential. Please refer to the Decentralized Identifiers section for all options.
Example Response
We will be using the VerifiableId credential template for this example, but you can use whatever template you want. When issuing, we specify the format of the VC as SD-JWT, the fields we want to make selectively disclosable and, optionally, the number of decoy digests we want to include.
Options:
-s, --subject-did
: DID of the subject
-i, --issuer-did
: DID of the issuer
-y, --proof-type
: Either JWT, LD_PROOF or SD_JWT
-t, --template
: VC template, e.g. VerifiableId
--sd, --selective-disclosure
: Path to selectively disclosable fields (if supported by the chosen proof type), in a simplified JsonPath format, can be specified multiple times, e.g. credentialSubject.dateOfBirth
--num-decoys
: Number of SD-JWT decoy digests to add (fixed mode), or max num of decoy digests (random mode)
--interactive
: Interactively prompt for VC data to fill in
vc.txt
: Path to output the generated VC
Example Response
Receiving a JWT token with disclosures appended via '~'.
Viewing body of SD-JWT in JSON format
Options:
-c
: credential content or file path
Example Response
coming soon
StatusList2021Entry is a credentialStatus method by which the status of a verifiable credential can be checked. The basic idea is that each issued credential has a corresponding position in a bit-string (also called the status list), with a value of either 0 (not revoked) or 1 (revoked). This status list is published by the issuer as a verifiable credential with a type that includes StatusList2021Credential.
The StatusList2021Entry credentialStatus contains the following fields:
id - a URL identifying the status information for the verifiable credential
type - StatusList2021Entry
statusPurpose - the purpose of the status entry (typically revocation or suspension)
statusListIndex - the bit position of the credential within the bit-string
statusListCredential - the URL of the StatusList2021Credential credential that encapsulates the bit-string
e.g.
The StatusList2021Credential is a verifiable credential that encapsulates the bit-string information about all the credentials ever issued. The following fields have to be explicitly provided:
id - (optional) the URL to this credential (should match the statusListCredential value from StatusList2021Entry)
type - should contain StatusList2021Credential
credentialSubject
type - StatusList2021
statusPurpose - the purpose of the status credential (StatusList2021Entry should match this value)
encodedList - the compressed and base64 encoded value of the bit-string
e.g.
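To illustrate how a verifier consumes the encodedList, here is a minimal Kotlin sketch. It assumes GZIP compression and left-to-right bit order within each byte, as described by the StatusList2021 specification; in practice, the SSI Kit's CredentialStatusPolicy performs this check for you:

```kotlin
import java.util.Base64
import java.util.zip.GZIPInputStream

// Sketch: decode and decompress the encodedList from the StatusList2021Credential,
// then test the bit at statusListIndex (assumes left-to-right bit order within a byte).
fun isRevoked(encodedList: String, statusListIndex: Int): Boolean {
    val compressed = Base64.getDecoder().decode(encodedList)
    val bitstring = GZIPInputStream(compressed.inputStream()).readBytes()
    val byte = bitstring[statusListIndex / 8].toInt() and 0xFF
    val mask = 0x80 shr (statusListIndex % 8)
    return (byte and mask) != 0
}
```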
Verifier requests, holder shares selective disclosures, verifier verifies hashes and signature.
Selective Disclosure Sharing: During the verification process, the verifier asks for a certain set of claims. The selective disclosure sharing mechanism allows the holder to share only those required claims, by sending the whole SD-JWT plus the disclosures which are needed for verification. This process therefore helps with privacy by not revealing more identity information than what is specifically requested.
Disclosure Verification: The verifier, upon receiving the shared disclosures and the SD-JWT, can confirm that the shared disclosures are part of the SD-JWT. This is done by hashing the received disclosures in the same manner as the issuer did during issuance.
Hash Comparison and Tamper Check: The verifier then compares the hashed values of the shared disclosures with the values present in the SD-JWT. If the hashed values match, the verifier can be confident that the shared values haven't been tampered with and are actually part of the SD-JWT.
Transfer Format: Transferring the credential from holder to verifier happens by sharing the SD-JWT with the disclosures chosen to be revealed, concatenated using the ~ sign, i.e. <SD-JWT>~<disclosure 1>~...~<disclosure n>.
Using either the CLI, Kotlin or REST option, you can start verifying your SD-JWT credential.
We create a presentation to provide the verifier with the holder's credentials for verification. The presentation can include data from multiple credentials, making verification easier as only one interaction is required. We provide the holder DID and the disclosures to create a presentation, which we can then present to a verifier, via the present command. Example Command
Options:
-i, --holder-did
: DID of the holder (owner of the VC)
-c, --challange
: Challenge to be used in the LD proof
--sd, --selective-disclosure
: Path to selectively disclosed fields, in a simplified JsonPath format. Can be specified multiple times. By default, NONE of the sd fields are disclosed. For multiple credentials, the path can be prefixed with the index of the presented credential, e.g. "credentialSubject.familyName", "0.credentialSubject.familyName", "1.credentialSubject.dateOfBirth".
other options
--sd-all-for
: Selects all selective disclosures for the credential at the specified index to be disclosed. Overrides --sd flags!
--sd-all
: Selects all selective disclosures for all presented credentials to be disclosed.
Example Response
Parsing the presentation to JSON
Using the parse command, you can print the presentation as a JSON object.
Options:
-r
: Recursively parse credentials in presentation
-c
: Credential content or file path
We can check the validity of the presentation by providing the verify command with it. Use the storage location printed at the end of the last command.
Example Command
Example Output
coming soon
Once a dynamic policy has been saved with a specific name, as explained in Create a dynamic policy, you can use it to verify Verifiable Credentials.
Please refer to the SSI-Kit setup section to execute the command successfully.
-p, --policy
: Verification policy. Can be specified multiple times. By default, SignaturePolicy is used. To specify a policy argument (if required), use the format PolicyName='{"myParam": "myValue", ...}', to specify the JSON object directly, or PolicyName=path/to/arg.json, to read the argument from a JSON file.
We can verify a credential with the SubjectPolicy and the VerifiableId located in src/test/resources/rego/VerifiableId.json, which are provided when cloning the project, so no setup is needed.
Please refer to the SSI-Kit setup section to serve the API.
Body
policies
: [array] A list of policy definitions to verify against
policy
: [string] The name/id of the policy
argument
: [JSON] The argument needed by the policy (optional)
credentials
: [array] An array of credentials in JWT, or LD_PROOF format
In order to issue a verifiable credential with a credentialStatus, the statusType property of the proofConfig object should be provided (e.g. 'SimpleCredentialStatus2022', 'StatusList2021Entry', etc.). If no statusType is provided, the credential will be issued without any credentialStatus property.
e.g. Issue a UniversityDegree credential having a StatusList2021Entry credentialStatus using the REST API interface issue endpoint: https://signatory.ssikit.walt.id/v1/credentials/issue. The request-body is presented below.
e.g. Issue a UniversityDegree credential having a StatusList2021Entry credentialStatus using the command-line interface issue command: ssikit vc issue -h
Currently, the SSI Kit supports the following credentialStatus methods:
More details on credentialStatus specification can be found at .
More details about StatusList2021Entry and StatusList2021Credential can be found at .
| Name | Type | Description |
|---|---|---|
| method* | String | did schema (use cheqd) |
| network | String | mainnet or testnet |
| Name | Description | Argument |
|---|---|---|
| SignaturePolicy | Verifies the signature of the W3C Verifiable Credential. | None |
| JsonSchemaPolicy | Verifies against the associated JSON schema. Note that the attribute credentialSchema must be set and the JSON schema must be accessible by the http URL. | None |
| ValidFromBeforePolicy | Verifies the credentials based on their valid-from date. | None |
| ExpirationDateAfterPolicy | Verifies the credentials based on their expiration date. | None |
| ChallengePolicy | Verifies the challenge. | ChallengePolicyArg, which contains specific challenges to check against. |
| VpTokenClaimPolicy | Verifies a verifiable presentation by the OIDC/SIOPv2 VP token claim. | VpTokenClaim |
| CredentialStatusPolicy | Verifies credentials based on their status. | None |
| EbsiTrustedSchemaRegistryPolicy | Verifies by the EBSI Trusted Schema Registry. Checks performed: 1. the credential schema id has the correct format. | None |
| EbsiTrustedIssuerDidPolicy | Verifies by trusted issuer DID. Checks performed: 1. the issuer DID is resolvable against EBSI. | None |
| EbsiTrustedIssuerRegistryPolicy | Verifies by the EBSI Trusted Issuer Registry record. Checks performed: 1. the issuer has a record on the trusted registry having an authorization claim matching the VC schema; 2. the issuer's TIR record contains a VerifiableId credential; 3. the authorized claim record (from 1.) has the type provided as argument to the policy; 4. the issuer's accreditation is valid - verified against EbsiTrustedIssuerAccreditationPolicy. | EbsiTrustedIssuerRegistryPolicyArg |
| EbsiTrustedSubjectDidPolicy | Verifies by trusted subject DID. Checks performed: 1. the subject DID is resolvable against EBSI. | None |
| EbsiTrustedIssuerAccreditationPolicy | Verifies by the issuer's authorized claims. Checks performed: 1. fetches the attribute specified by the termsOfUse property; 2. checks whether the credential stored as the attribute body has the required accreditation claims to match the current VC schema. | None |
| IssuedDateBeforePolicy | Verifies by issuance date. | None |
| GaiaxTrustedPolicy | Verifies Gaia-X trusted fields. | None |
| GaiaxSDPolicy | Verifies Gaia-X SD fields. | None |