Here are the most important things you need to know about the SSI Kit:
It is written in Kotlin/Java. It can be directly integrated (Maven/Gradle dependency) or run as a RESTful web service. A CLI tool allows you to run all functions manually.
It is open source (Apache 2). You can use the code for free and without strings attached.
It is a holistic solution that allows you to build use cases “end-to-end”. There is no need to research, combine or tweak different libraries to build pilots or production systems.
It abstracts complexity and low-level functionality via different interfaces (CLI, APIs). Additional services facilitate development and integration (e.g. Issuer and Verifier Portals).
It is modular, composable and built on open standards, allowing you to customize and extend functionality with your own or third-party implementations and to prevent lock-in.
It is flexible in a sense that you can deploy and run it on-premise, in your (multi) cloud environment or as a library in your application.
It enables you to use different identity ecosystems like Europe’s emerging identity ecosystem (EBSI, ESSIF) in anticipation of a multi-ecosystem future.
Learn what the SSI Kit is.
The SSI Kit offers everything you need to use Self-Sovereign Identity (SSI) with ease.
The following sections elaborate on the SSI Kit's unique properties, enabled functionality and components.
It has always been our goal to provide developers and organizations with great tools, so they can focus on delivering holistic identity solutions. Taking the lessons learned from previous products, we decided to redesign our current offering, resulting in what we now call The Community Stack. A collection of open-source products providing everything to launch any identity solution with ease. You can learn more about it here.
Starting from December 2023, the SSI-Kit will halt feature enhancements, leading to a complete discontinuation planned for end-Q3 2024. It's essential to plan your transition to the new stack effectively. The table below indicates which components of the SSI-Kit are already supported in the new stack.
For Kotlin/Java projects where the SSI-Kit was used as a native dependency, utilize the provided Library for equivalent features in the new stack.
If you employed the REST APIs, simply switch to the supplied API in the new stack.
If you have any questions, please reach out.
All relevant new libraries and APIs have found their place in the waltid-identity repo.
Keys Operations: Create, Update, Delete
All types of operations are supported
ed25519
secp256k1
secp256r1
rsa
DIDs Operations: Create, Register, Resolve, Delete
All types of operations are supported
did:key
did:jwk
did:web
did:cheqd
did:iota
did:ebsi
Not yet supported
W3C Credentials
Issuance
W3C Credential Issuance as JWTs
W3C Credential Issuance as SD-JWTs
W3C Credential Issuance as JSON-LD
Not yet supported
Verification
W3C Credential Verification (JWTs)
W3C Credential Verification (SD-JWTs)
W3C Credential Verification (JSON-LD)
Not yet supported
Other Credential Features
Credential Templates
Credential Revocation
Not yet supported
Policies
Open-Policy Agent Policies
Issuance & Verification via OpenID4VC
Signatory allows you to digitize paper credentials and automate data provision to your stakeholders.
It provides all functionality required by “Issuers”. For example:
Process and authenticate data requests by people or organisations,
Import data (from local storage or third parties),
Create re-usable VC templates,
Create VCs in different formats (e.g. JSON/JWT, JSON-LD),
Sign VCs using different key types (e.g. ed25519, secp256k1, RSA),
Manage the lifecycle of VCs (e.g. revocation),
Issue VCs (e.g. via OIDC/SIOP).
Custodian is a secure data hub for people and organizations. It provides all functionality required by “Holders”. For example:
Interact with Registries (read, write)
Create, store, manage keys, data (DIDs, VCs) and other secrets,
Request and import data (VCs) from third parties,
Selectively disclose data (VCs/VPs) for authentication and identification,
Manage consent and data access in a user-centric fashion.
Auditor allows you to verify your stakeholders’ identity data and offer frictionless access to services or products. It provides all functionality required by “Verifiers”. For example:
request data (VCs/VPs) from stakeholders,
verify data (VCs/VPs; incl. integrity, validity, provenance, authenticity),
trigger pre-defined actions following the verification.
The verification steps can be dynamically configured by passing "verification policies" to each verification attempt.
The SSI Kit comes with the following set of built-in verification policies:
SignaturePolicy: Loads or resolves the DID, loads the public key and verifies the credential's signature.
JsonSchemaPolicy: Validates the credential against the JSON schema.
TrustedSchemaRegistryPolicy: Checks if the JSON schema is anchored in the EBSI Trusted Schema Registry.
TrustedIssuerDidPolicy: Checks if the issuer DID is anchored on the EBSI DID registry.
TrustedIssuerRegistryPolicy: Checks if the issuer is registered in the EBSI TIR (Trusted Issuer Registry).
TrustedSubjectDidPolicy: Checks if the subject DID is anchored on the EBSI DID registry.
IssuedDateBeforePolicy: Checks if the issuance date is in the past.
ValidFromBeforePolicy: Checks if the valid-from date is in the past.
ExpirationDateAfterPolicy: Checks if the expiration date is in the future.
CredentialStatusPolicy: Checks if the credential is revoked, based on the credential-status list.
The SSI Kit establishes an identity infrastructure layer for any use case in any industry. Its core services are in the scope of:
Registry Interactions (e.g. read, write; agnostic towards the underlying tech e.g. DLT, DNS)
Key Management (e.g. generate, sign, import, export, manage lifecycle)
Decentralized Identifier (DID) operations (e.g. register, resolve, manage lifecycle)
Verifiable Credential/Presentations (VC, VP) operations (e.g. create, issue, present, verify)
Ecosystem specific use cases (e.g. onboarding, data exchange and monetization)
Illustration:
Please note, the issuer API in the new stack only supports did:key at this point.
In The Community Stack, we no longer have the notion of a credential template. Issuance simply happens by providing the full W3C data schema, which is then signed. A list of credential schemas is available in the new stack.
Policies work similarly in the new stack; a list of all policies is available there.
Open-Policy Agent policies are not yet supported. However, the new stack's policies already give you great flexibility until we reintroduce Open-Policy Agent policies.
Important: Please be informed that, beginning from December 2023, the SSI Kit will no longer receive new features. Furthermore, the SSI Kit is planned for discontinuation by the end of Q3 2024. However, all functionalities offered by the SSI Kit will be integrated into our new libraries, APIs and apps in the walt.id Community Stack, giving you more modularity, flexibility and ease of use to build end-to-end digital identity and wallet solutions. Read the transition guide for details. For any clarification or queries, feel free to reach out, as we aim to make this transition as smooth as possible.
If you are already familiar with SSI, you can jump straight to the SSI Kit sections.
If you are new to SSI, please continue with our introduction to SSI.
Cryptographic keys convey control over digital identities and enable core functionality such as encryption and authentication.
The SSI Kit supports:
EdDSA / ed25519
ECDSA / secp256k1
ECDSA / secp256r1
RSA
Note that we are continuously adding support for new key types.
You can learn more about keys here.
Our open source solutions enable you to use different types of DIDs and different identity ecosystems. Every relevant functionality is supported from the generation of DIDs and DID Documents to anchoring or resolving them on/from Registries.
We currently support the following DID methods:
did:ebsi
did:web
did:key
did:jwk
did:iota
did:cheqd
Note that we are continuously adding support for new DID methods.
You can learn more about DIDs here.
Authentication and data exchange protocols (e.g. OIDC/SIOP) enable the exchange of data (VCs) between different parties.
The SSI Kit supports the latest OpenID Connect extensions for SSI.
The implementation of the protocols is conformant with the latest specs from EBSI: https://api-conformance.ebsi.eu/docs/wallet-conformance
You can learn more about protocols here.
This section elaborates the theory behind the SSI Kit:
SSI Kit | Basics - Learn what the SSI Kit is and what it does.
SSI Flavors & Ecosystems - Learn which SSI flavors and identity ecosystems we support.
Architecture - Explore the SSI Kit's multi-layered architecture and components.
Use Cases - Explore use cases you can implement with the SSI Kit.
SSI-Kit feature list - Explore all features in an overview list.
The SSI Kit abstracts complexity for developers by following a "multi-stack approach" that enables you to use different implementations or "flavours" of SSI.
As a result, you can participate in different identity ecosystems (e.g. EBSI/ESSIF, Gaia-X, Velocity Network, cheqd and IOTA) and avoid technology-related lock-in effects.
Based on our Introduction to Self-Sovereign Identity (SSI), we distinguish the following concepts or building blocks:
Read on to learn which concrete technologies and implementations we support on the level of:
Trust Registries
Keys
Decentralized Identifiers (DIDs)
Verifiable Credentials (VCs)
Data Exchange Protocols
Our products are agnostic towards the underlying technologies used to implement Trust Registries, which means that the SSI Kit is potentially compatible with any type of Trust Registry.
The SSI Kit supports:
Permissionless Blockchains (e.g. Ethereum),
Permissioned Blockchains (e.g. Ethereum Enterprise/Hyperledger Besu),
Domain Name Service (DNS),
Pure peer-to-peer approaches that do not require Registries.
Note that we are continuously adding support for new Registries and underlying technologies.
You can learn more about Trust Registries here.
Verifiable Credentials (VCs) are digital identity documents that can easily and securely be shared with and verified (incl. validity, integrity, authenticity, provenance) by anyone in a privacy preserving way. Importantly, they are never (!) stored on a blockchain due to privacy and compliance reasons.
The SSI Kit supports W3C Verifiable Credentials in different formats:
JSON / JWT
JSON-LD
Note that we are continuously adding support for new VC types and formats.
You can learn more about VCs here.
The SSI Kit exposes high-level interfaces / APIs to hide the complexity introduced by
low-level services (e.g. key management, signing, data storage)
different ecosystems (i.e. different SSI flavors, business logic and governance frameworks).
The functionality of the high-level interfaces correlates with the SSI Kit Components. The functions are grouped around:
issuing Verifiable Credentials by the Signatory,
holding (storing, presenting) Verifiable Credentials by the Custodian
and verifying Verifiable Credentials by the Auditor.
The interfaces can be used in JVM-based applications directly, or via the REST API.
The Swagger documentation can be found under section REST API.
Use cases you can build with the SSI Kit.
You can use Self-Sovereign Identity (SSI) - and by extension the SSI Kit - to solve any identity-related problem.
You can use the SSI Kit to enable your users, customers, employees or partners to access information, services or products. By this, you can replace today's cumbersome sign-up and login processes (usernames, passwords) with more seamless experiences.
In other words, you can use SSI to authenticate stakeholders you already know.
You can use the SSI Kit to identify people, organizations or even things to provide them with information, services or products.
Identity proofing is particularly important in AML (anti-money laundering) regulated industries, but is seeing growing adoption by non-regulated industries and platforms to prevent fraud, SPAM and other malicious behaviour.
Simply put, you can use SSI to identify stakeholders you do not yet know.
You can use the SSI Kit to verify any identity-related information beyond a person’s or company’s core identity (see Identity Proofing / Verification), which can be important when evaluating risks or performing compliance assessments.
For example, you can use SSI for
employment background checks (education, work, criminal history)
financial due diligence (bank account information, liquidity events, credit ratings)
any other type of data verification required for transactions from insurance or health data to social proofs like ratings or recommendations.
SSI can be used to digitize any type of identity-related information in order to replace paper-based identity documents or cards with digital ones that are easier to manage, share and verify as well as harder to forge.
For example, think about official public sector documents such as identity certificates or about licenses or certificates that convey allowance to perform regulated activities.
You can find more examples in our White Papers:
Me, Myself & (SS)I (co-authored by the Boston Consulting Group)
If you have any questions, feel free to get in touch.
The architecture of the SSI Kit consists of three layers:
Low-Level Services Abstraction: Abstracts complex, low-level operations (e.g. cryptography, key management, digital signatures, data storage).
Ecosystem Abstraction: Abstracts ecosystem-specific requirements based on the relevant technical and governance frameworks (e.g. SSI flavors, business logic, policies).
High-Level Interfaces / APIs: Provides high-level interfaces that hide complexity and facilitate usage for developers.
Also, the architecture allows for the integration of third party solutions throughout the stack. For example:
Key storage (e.g. HSM, WebKMS)
Data storage (e.g. identity hubs, confidential storage)
Registries (e.g. blockchains, DNS)
This architectural openness prevents vendor lock-in and allows you to build SSI-based solutions that meet your unique requirements.
Illustration:
Read on to explore all three abstraction layers in more detail.
This software-layer holds a set of generic core services for common SSI and cryptographic functions. The services are in the scope of key management, decentralized identifiers, verifiable credentials and data storage.
The following is a short summary of the interfaces available. The detailed functions are described in the documentation further on.
Handles keys and cryptographic operations like the generation of signatures (e.g. linked data, JWT) with signature types such as ES256K or EdDSA.
Keys can be stored in a file and database keystore, which is extendable to HSMs and WebKMS.
Abstracts common functionality related to Decentralised Identifiers (DIDs, DID Documents) for methods like “did:web”, “did:key”, “did:ebsi”.
Abstracts common functionality related to Verifiable Credentials (VCs) and Verifiable Presentations (VPs) in different formats like JSON and JSON-LD.
You can launch identity wallets or extend your existing applications with SSI capabilities. Check out our other products if you are interested in learning more.
The low-level services expose common interfaces that can conveniently be utilized directly via Kotlin/Java or via the REST API.
We believe in a multi-ecosystem future.
This is why we built an abstraction layer for ecosystem-specific operations and business logic. The idea is to support any ecosystem with a single solution that does not put any additional burden on developers. As a result, you can use our solutions to participate in different ecosystems without having to switch between different technical implementations.
We currently support:
EBSI/ESSIF (EU's new decentralized identity ecosystem)
Gaia-X (EU's new cloud infrastructure)
Velocity Network
cheqd Network
IOTA
Note that we are continuously adding new ecosystems.
Data Exchange (Protocols) enable the exchange of data (VCs) between different parties.
Different authentication and data exchange protocols are used to securely transfer identity data (e.g. VCs, VPs) between parties (e.g. from an Issuer to a Holder). They typically establish a mutually authenticated and encrypted data channel between the communicating parties.
The most common data exchange protocols used for SSI are:
OIDC4SSI / SIOP (Self-Issued OpenID Connect Provider): An extension of a mature authentication and authorization protocol called "OpenID Connect" (OIDC).
DIDComm: A novel protocol specifically designed for SSI and maintained by the Decentralized Identity Foundation (DIF).
Credential Handler API: A proposed browser-extension that may be used to connect the user's identity wallet to a web-application.
Our solutions enable you to use different data exchange protocols like OIDC/SIOP as required by different ecosystems.
DIDs give us the power of verifying information, for example credentials, anywhere, anytime, through the establishment of a public key infrastructure.
DIDs are unique identifiers (URIs) which are standardised by the W3C. They can refer to any subject - from a person, to an organization, to a thing or basically anything else for that matter. Before we have a look at how a DID is structured and the benefit it provides, let's understand the shortcomings of current identifiers first and then see how the DID solves those.
In the digital economy, data exchange happens a lot, which makes it increasingly important to be able to identify persons, concepts or anything else for that matter in a secure and verifiable way.
A person today can have multiple different identifiers, like:
name@surname.com
https://www.surname.com
0000-0000-0000-0000 (ORCID based identifier, used to identify authors of scholarly communication)
All of these identifiers work, but none of them is decentralized, persistent, resolvable and cryptographically verifiable, as the following questions show.
Is the identifier decentralized?
https://www.surname.com depends on a single point of failure. What happens if the hosting site disappears?
0000-0000-0000-0000 (ORCID) depends on the ORCID database. What happens if it is discontinued, hacked, etc.?
Is the identifier persistent?
https://www.surname.com: if I no longer pay for that domain, the identifier will be gone or may even be bought by somebody else.
Is the identifier resolvable?
How can I get additional information about 0000-0000-0000-0000 (ORCID) identifiers?
Is the identifier verifiable?
How can I prove that I own the domain https://www.surname.com?
If I stopped paying for my domain https://www.surname.com and somebody else bought it, how would anybody know that the information previously provided on the site was actually mine?
All of these problems make it hard to be completely sure, when exchanging information, that the other party actually is who they claim to be and not some malicious actor impersonating them.
DIDs, a new form of unique identifier standardized by the W3C, address these problems of current identifiers by being:
Decentralized
The DIDs no longer depend on centralized registries, identity providers, authorities, etc.
Persistent
Once created, the DID is permanently assigned to the subject.
Resolvable
It is possible to find a basic set of information when resolving the DID. Typically, this will lead to a DID Document.
Cryptographically verifiable
There is a mechanism to cryptographically prove identity and ownership via information provided by the DID Document.
The DID is a simple text string built up of three parts:
Schema
The did URI scheme identifier
DID Method
The identifier for the did method
DID Method-Specific Identifier
An identifier based on the requirements of the did method
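To make the three-part structure concrete, here is a small Kotlin sketch that splits a DID string (the did:ebsi identifier used later in this section) into its scheme, method and method-specific identifier:

```kotlin
fun main() {
    // Example DID (the did:ebsi identifier used further below in this section).
    val did = "did:ebsi:2A9RkiYZJsBHT1nSB3HZAwYMNfgM7Psveyodxrr8KgFvGD5y"

    // A DID is "scheme:method:method-specific-identifier".
    val (scheme, method, methodSpecificId) = did.split(":", limit = 3)

    println("Scheme:                     $scheme")            // did
    println("DID method:                 $method")            // ebsi
    println("Method-specific identifier: $methodSpecificId")  // 2A9Rki...
}
```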
A variety of “DID methods”, which are different implementations of the DID specification, exist. Considering that DID methods differ in terms of how they are created, registered and resolved, different methods come with different advantages and disadvantages.
For example, while DIDs are often anchored on Registries, such as EBSI (did:ebsi) or the Domain Name Service (did:web), new methods emerged that do not require Registries because their distribution is based on peer-to-peer interactions (e.g. did:key).
Now we can take any DID, look at the method and resolve it based on the framework around the method. The resolved content will most of the time be JSON or JSON-LD, although other data formats might be added in the future. The resolved content is called the DID Document.
A container of information holding:
The Subject
The owner of the DID Document
The Controllers
The entities allowed to introduce changes to the DID Document. The controller may or may not be identical to the subject. For example, if the DID Document belonged to the DID of a book, the controller would be the author or another related person, rather than the book itself.
Cryptographic data
The DID Document contains the public keys of the corresponding entity. The keys might use any algorithm (typically elliptic curve or RSA keys) and are mostly encoded in the JWK format, although other encoding formats such as PEM or Multibase are also supported. The keys can be classified for certain uses, such as authentication or verification.
Service endpoints
Services that the subject wants to mention
The identifier did:ebsi:2A9RkiYZJsBHT1nSB3HZAwYMNfgM7Psveyodxrr8KgFvGD5y of the ebsi method would resolve to the following DID Document:
Our open source products enable you to use different DID methods for different identity ecosystems. Every relevant functionality (e.g. generation, anchoring, resolution) is supported.
Learn about Self-Sovereign Identity (SSI).
Welcome to our Introduction to Self-Sovereign Identity (SSI) for developers and technical readers.
Before you get started, feel free to explore other (less technical) resources that will help you and your team to get a more holistic understanding of SSI and digital identity in general:
Learn about the technologies and concepts on which SSI is based.
Understanding SSI requires the understanding of a few core concepts:
Cryptographic keys, which convey control over digital identities and enable core functionality such as encryption and authentication.
Wallets, which store our keys (control) and VCs (identity data) and enable the management and sharing of our digital identities and data via easy-to-use applications.
The following graphic shows how SSI works and highlights the core concepts (in blue)
Think of these core concepts as different building blocks that are available in different variations and can be put together in different ways:
As a result, there are different “flavours” of SSI depending on which variations of which building blocks have been used and how they have been put together.
Importantly, the differences in terms of technologies that are being used illustrate why interoperability has always been one of the most important topics within the industry and why the development and use of open standards (e.g. by the W3C, Decentralized Identity Foundation, OpenID Foundation and others) are vital for technology and vendor selection.
Registries, which hold a shared and trusted record of information. In other words, they represent a “layer of trust” and can be referred to as the “single source of truth”.
Decentralized Identifiers (DIDs), which give us the power of verifying information, for example credentials, anywhere, anytime, through the establishment of a public key infrastructure. They link keys to unique identifiers that allow different parties to find and interact with each other.
Verifiable Credentials (VCs), which are digital identity documents that can easily and securely be shared with and verified (incl. validity, integrity, authenticity, provenance) by anyone in a privacy-preserving way. Importantly, they are never (!) stored on a blockchain due to privacy and compliance reasons.
Verifiable Presentations (VPs), which contain identity data for verification from one or multiple VCs and are mainly created by the holder of the VCs.
Data Exchange Protocols, which enable the exchange of data (VCs) between different parties.
Learn what SSI is.
Self-Sovereign Identity (SSI) is a user-centric approach to digital identity that gives people and organizations full control over their data. As a result, SSI enables anyone to easily share their data and reliably prove their identity (i.e. who they are and anything about them) without sacrificing security or privacy.
In other words, SSI enables you to “bring your own identity” and this is true for potentially any type of information - from your core identity (e.g. name, age, address) to your education and work records, your health and insurance data, bank account and financial information, etc.
Moreover, SSI can be used to model the digital identities of people, organizations and things.
At the end of the day, SSI promises a digital world in which interactions are effortless and worry-free. It is simply the next evolutionary step in identity management, a new paradigm in which our digital identities are no longer fragmented and locked into silos that are under someone else’s control, but only at our own disposal to be shared securely and privately.
SSI allows us to model digital identity just like identity works in the non-digital world based on paper documents and cards, with just a few twists.
For example, instead of our identity documents being made of paper or plastic, they are digital credentials made of bits and bytes and instead of storing them in wallets made of leather, they are stored in digital wallets on our phones. Importantly, these digital credentials can be reliably verified by anyone they are shared with online or offline.
In doing so, SSI enables decentralized ecosystems in which different parties can exchange and verify identity-related information. These ecosystems look like three-sided marketplaces, so that every party can take on three roles:
Issuers - Parties who “issue” identity-related data to people or organizations (“Holders”) in the form of digital credentials. They are the original data sources of an SSI ecosystem. For example, a government issues digital passports to citizens or a university issues digital diplomas to graduates.
Holders - Individuals or organizations who receive digital credentials that contain data about themselves from various sources (“Issuers”). By aggregating and storing such credentials in digital wallets, Holders can build holistic digital identities that are under their control and can easily be shared with third parties ("Verifiers").
Verifiers - Parties who rely on data to provide products and services can reliably verify and process data that has been provided by others (“Holders”). Verifiers, also called “Relying Parties”, are usually organizations or individuals in their professional capacity.
Usually, a single party plays only one of these roles per interaction. However, it is perfectly normal for a party to take on different roles in different interactions.
For example:
A university (Holder) is being accredited to issue certain types of educational credentials by a national authority (Issuer).
A university (Issuer) issues a digital diploma to a graduate (Holder), who can share this information with a recruiter (Verifier) in the course of a job application.
After the recruiting process, a recruiter (Issuer) issues the results of an applicant’s assessment (e.g. skills, referral) to the applicant (Holder), who can share this information with a new manager or another recruiter (Verifier).
A manager (Issuer) issues the results of a performance review to his employee (Holder) who can share this information with HR (e.g. to improve talent development programs).
A shared and trusted record of information.
Registries serve as a single source of truth which all participants of an SSI ecosystem can trust. Depending on the ecosystem, registries make information accessible to anyone or just a limited group. Registries are important because they enable:
(Distributed) Public Key Infrastructures (DPKIs), which establish an open distribution system for public keys that can be used for encryption and authentication, among others.
Trust Registries hold reliable information about people, organizations, things and even credentials (e.g. data models, status and validity information) to ensure that different parties can trust each other and the identity-related data they exchange.
Different technologies can be used to implement Registries. For example:
Blockchains or L1: Typically, blockchains are used because it is unfeasible (or even impossible) to tamper with them. The fact that no single organization can change the contents of a blockchain or manipulate the terms by which it is governed is very well aligned with the requirements for identity ecosystems. Today, we see a growing number of developers and organizations focusing on so-called permissioned blockchains (i.e. only a selected group can “write”) like Ethereum Quorum/Enterprise. Permissionless blockchains, like Ethereum, are still used, but less than the permissioned alternatives for a variety of reasons like scalability, costs and the lack of customisable governance frameworks.
L2: Layer two networks sit on top of blockchains and aggregate data before anchoring it. The main idea behind them is to circumvent common challenges of public, permissionless blockchains like scalability and cost issues. The most popular implementations in the context of identity are “ION” (for Bitcoin) and “Element” (for Ethereum).
Other Distributed Ledger Technologies (DLTs): Sometimes other DLTs are utilised like the Interplanetary File System (IPFS) though its use for digital identity remains limited.
Domain Name Service (DNS): Considering certain drawbacks of DLTs and their relatively slow adoption by the mass market, DNS can also be used to serve as a registry. Though it is not fully decentralised (considering its underlying governance framework), DNS has many advantages like its maturity and global adoption.
Importantly, SSI can be implemented without registries, particularly without blockchains, because identity data (or at least personal data of individuals) is never anchored due to privacy and compliance reasons. However, by combining SSI with blockchains (or other technologies), robust and trustworthy identity ecosystems that utilise transparent DPKIs and reliable Trust Registries can emerge.
Getting started with the SSI Kit.
Important: Please be informed that, beginning from December 2023, the SSI Kit will no longer receive new features. Furthermore, the SSI Kit is planned for discontinuation by the end of Q3 2024. However, all functionalities currently offered by the SSI Kit will be integrated into our new libraries, APIs, and apps under The Community Stack. This is aimed at providing a more modular, flexible, and efficient solution for your needs. For any clarification or queries, feel free to contact us as we aim to make this transition as smooth as possible.
The SSI-Kit's functionality can be used in a variety of ways. Select your preference to get started:
My First Verifiable Credential (VC) - Issue and verify your first VC using the SSI-Kit API
Advanced Verifiable Credentials (VC) - Leverage custom credential templates, the credential status property, prebuilt and custom verification policies.
Before we dive deeper into Verifiable Credentials and learn about their structure and how they work, we will have a look at the problems of today's credentials.
Today's credentials are easy to fake, hard to verify, and not privacy-preserving by design, making it hard for businesses and people, offline but especially online, to trust each other when exchanging information and data. This brings about many problems, including:
Verifying that a presented document or claim is actually valid can take a lot of resources and time. Just think about what you had to do the last time you opened a bank account: presenting your ID card via a video call, taking selfies, etc.
The credentials you provide to get access to a service are often stored on centralized servers. This not only makes them vulnerable to data breaches, but also forces you to trust that the organization only uses the data in ways you would agree with.
You might be forced to disclose more information than needed. The police officer checking your driver's license, in most cases, only needs to know that you are allowed to drive, not where you live or what your name is.
People who claim to have a skill by presenting a fake certificate can get jobs which, when performed poorly, could have catastrophic consequences for the organizations employing them.
This is why we need a better way to verify the claims presented, and that is where Verifiable Credentials come in.
Easy to verify: There is a clearly defined and reliable way of verifying a Verifiable Credential.
Tamper-proof: No one except the issuer (the entity creating the VC) can change the claims stated in the VC.
Independent: No need to contact the issuer of the presented certificate to be certain about its validity. The check can happen in an independent, asynchronous way.
Data is owned: The holder of a certificate now owns the data and decides what to share and when, only providing proof but never actually giving it (a copy) to the service provider.
Portable: The user is free to choose where to take their VC and in which wallet it is saved.
Holder setup: The holder generates a DID via the wallet, which keeps the associated private key and DID Document (containing the public key), so that they can request, receive and present Verifiable Credentials from there on. The DID and the DID Document will never be put into any registry; they only exist locally in the wallet.
Verifier setup: The verifier only needs to have the technology to communicate with the registries when presented with a VC, to validate its authenticity using the DID and the DID Document from the issuer.
After the registration of the issuer and the setup of the wallet for the holder, the holder can now receive a VC from the issuer.
When the holder receives their Verifiable Credential it will be saved on their wallet, and it will contain the following:
Metadata:
The DID of the issuer
The status of the credential (expiration and issuance dates, revocation state)
Claims:
The DID of the holder of the credential
The claims about the subject (what the issuer asserts about the subject). This could be whether they can drive a car and what type of car (driver's license), or their field of study and the knowledge areas they are skilled in (university certificate).
Proof:
This will contain the signatures of the issuer, which can be used to check whether the content of the VC has been tampered with and to verify its authenticity.
Validating that the DID of the holder, stated in the certificate, belongs to the person presenting the VC.
Checking if all the state values are valid (expiration date and if the certificate is revoked or not).
Checking the claims about the subject and if they match the requirements to give the person access to the service they are requesting to get access to.
Checking the signatures of the issuer and the holder, by getting the DID of the issuer from the registry and the DID from the holder in their wallet and validating it using the public keys presented in the related DID documents.
When all the checks pass, the verifier can now grant the holder access to the service requested.
Our open source products enable you to act as an "Issuer" (create and issue VCs), as a Holder (manage and share VCs/VPs) and as a Verifier (request and verify VCs/VPs).
Verifiable Presentations represent a composition of claims, which can come from one or multiple Verifiable Credentials, for which the authorship is verified. This gives the holder of credentials the chance to compose context-specific presentations which only contain the data that is relevant in that context. When presenting the composition to a verifier, it can easily be validated.
Taking a closer look at how they are built up, we see four different layers:
Presentation Layer - The Verifiable Presentation itself with the required metadata
Credential Layer - Referenced by Layer 1 and pointing to one or more credentials
Credential Proof Layer - Holding the proofs of the credentials and the signatures from Layer 2
Presentation Proof Layer - Holding the proof of the Verifiable Presentation and its signatures
Our open source products enable you to act as a Holder (share VPs) and as a Verifier (request and verify VPs).
Verifiable Credentials (VCs) are digital credentials that contain actual identity data of people or organizations and are standardized by the W3C. They are digital equivalents of paper-based identity documents like passports or diplomas.
With VCs and the standard introduced by the W3C, we now have a way of creating a digital piece of information that identifies a particular entity or verifies a specific attribute, qualification or claim about them, in a way that is almost impossible to forge, easy to verify, and privacy-preserving by design. This leaves us with the following benefits:
For us to understand the typical lifecycle of a Verifiable Credential, we need to make sure we understand the idea behind an SSI ecosystem and what DIDs are. With that out of the way, let's start with the cycle.
Registration of the Issuer: Depending on the governance framework, the issuer will be accredited by a trusted entity before their DID as well as their DID Document is put into the Registry. The Registry is the single source of truth and trusted entity which verifiers will use as a reference point to make sure a presented VC is valid.
The holder can now use the VC in their wallet to access services and get access to products by presenting it to the service/product provider (the Verifier), thereby turning it into a Verifiable Presentation. The verifier will go through the following steps to make sure the certificate is valid:
Before the validation of the content of the certificate can take place, the VC needs to be parsed from the supported JSON-LD or JWT format. Depending on the ecosystem used, additional checks of the credential may also take place.
A Verifiable Presentation (VP) is a collection of one or more Verifiable Credentials, where the authorship of the whole collection can be cryptographically verified. VPs are standardized as part of the W3C Verifiable Credentials Data Model.
Verifiable Presentations make it possible to combine and share data from one or more Verifiable Credentials in a tamper-evident way. The shared presentation of the data is encoded in such a way that the authorship of the data can be trusted after a process of cryptographic verification. In situations where only a subset of the original Verifiable Credential data is revealed, for example to enhance user privacy, Verifiable Presentations help us keep that data verifiable.
If you want to get a better understanding of the different attributes present, please visit our section about Verifiable Credentials.
REST API
Learn how to issue, verify and manage Verifiable Credentials, Keys and DIDs via API.
Java | Kotlin
Learn how to issue, verify and manage Verifiable Credentials, Keys and DIDs directly in a Java or Kotlin Application. Adding the SSI-Kit as a direct dependency.
CLI - Command Line Tool
Learn how to issue, verify and manage Verifiable Credentials, Keys and DIDs via the CLI.
Signatory REST API functions.
The Signatory API exposes the "issuance" endpoint, which provides flexible integration possibilities for anyone intending to act as an "Issuer" (i.e. create, sign and issue Verifiable Credentials), as follows:
Credentials - issue credentials
Templates - create and manage credential templates
Revocations - revocation related functions
If you're new to VCs, check out the intro section for an overview.
The /v1/credentials/issue
endpoint issues a specified credential.
E.g. Issue a UniversityDegree
credential in the default JSON-LD format. In case you don't have the DID for the Issuer and or the Holder, you can create one here.
Check out the Issue with status section to learn about how to issue a verifiable credential with a credentialStatus property.
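As a rough sketch of what a call against this endpoint can look like from Kotlin (using the JDK's built-in HttpClient and the Signatory API's default port 7001 from the port table below), the request body shown here is only illustrative — the field names are assumptions, so check the Swagger documentation for the exact schema:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Illustrative request body: the exact field names/structure are an assumption,
    // see the Signatory API's Swagger docs for the authoritative schema.
    val body = """
        {
          "templateId": "UniversityDegree",
          "config": {
            "issuerDid": "<issuer-did>",
            "subjectDid": "<subject-did>"
          }
        }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://127.0.0.1:7001/v1/credentials/issue")) // Signatory API default port
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())

    println(response.body()) // the issued Verifiable Credential (JSON-LD by default)
}
```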
The currently available template functions are:
list - display the list of Templates
import - import a custom template
load - display the content of the template having the specified id
The /v1/templates endpoint returns the list of the available template ids.
No parameter
E.g. List the templates
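A minimal Kotlin sketch of this GET call against a locally running Signatory API (default port 7001, see the port table below):

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // List the available credential template ids from the Signatory API.
    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://127.0.0.1:7001/v1/templates"))
        .GET()
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())

    println(response.body()) // e.g. a JSON list of template ids such as "UniversityDegree"
}
```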
The /v1/templates/{id} endpoint imports your custom credential template.
id - path parameter (required) - the id of the template, e.g. MyCustomCredential
The /v1/templates/{id} endpoint displays the content of the template specified by:
id - path parameter (required) - the id of the template
No parameter
E.g. Load the template with the id set to UniversityDegree.
Refer to Credential Statuses section for more details on verifiable credential revocations.
Manage keys, DIDs, issue Verifiable Credentials, and verify them using the SSI-Kit's REST API.
Make sure you have Docker or a JDK 16 build environment including Gradle installed on your machine
After successfully running the project, you will have the endpoints, described below, available for use.
Exposed endpoints:
The Core API exposes most of the functionalities provided by the SSI Kit, however newer features will only be released in the other API endpoints. Therefore, it is recommended to use the Signatory API, Custodian API and Auditor API for most use cases.
Auditor REST API functions.
The Auditor API enables anybody to act as a "Verifier" (i.e. verify Verifiable Credentials or Verifiable Presentations). The validation steps can be easily configured by existing or custom policies.
The following functionality is available:
The /v1/verify endpoint verifies a list of credentials / presentations specified in the JSON-LD format against a set of policies. Each of the policies should be registered with the Auditor before being used in the verification. If at least one of the listed policies fails the verification, then the entire credential is considered to be invalid.
E.g. Verification of a UniversityDegree credential against the Signature and JsonSchema policies, where SignaturePolicy is failing.
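A Kotlin sketch of such a verification call against the Auditor API (default port 7003); the request body shape is an assumption, so verify the exact schema against the Auditor API's Swagger docs:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Replace this placeholder with the full credential (JSON-LD) you want to verify.
    val credentialJson = """{ "type": ["VerifiableCredential"] }"""

    // Illustrative request body: policy/credential field names are assumptions,
    // consult the Auditor API's Swagger docs for the exact schema.
    val body = """
        {
          "policies": [
            { "policy": "SignaturePolicy" },
            { "policy": "JsonSchemaPolicy" }
          ],
          "credentials": [ $credentialJson ]
        }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://127.0.0.1:7003/v1/verify")) // Auditor API default port
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())

    println(response.body()) // overall result plus the outcome of each policy
}
```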
The Auditor Rest API also enables policy management with the following methods:
The /v1/policies
endpoint lists the available verification policies. The policy id
field is used to reference the policy during verification.
E.g. Listing of the verification policies
The /v1/create/{name} endpoint creates a dynamic policy. The following parameters can be specified:
name - path parameter (required) - specifies the value to be used as the policy id
update - query parameter (optional, defaults to false) - accepts boolean values and specifies whether it should override an existing policy with the same name (only if the policy is mutable)
downloadPolicy - query parameter (optional, defaults to false) - accepts boolean values and identifies the scope of the policy field: if true, the policy field specifies a remote source that should be resolved to a policy; if false, it specifies the actual policy content
E.g. Creating a Rego policy that checks if a credential subject id is not null or empty
Code 200
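To illustrate the parameter combination described above, the following Kotlin sketch creates a dynamic policy from a remote policy source via the Auditor API (default port 7003). The policy URL is a hypothetical placeholder and the request body shape is an assumption; check the Swagger docs for the authoritative format:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Assumed body shape: a JSON object whose "policy" field holds the remote source,
    // because downloadPolicy=true tells the Auditor to resolve it into a policy.
    // The URL below is a hypothetical placeholder.
    val body = """{ "policy": "https://example.com/policies/subject-id-not-empty.rego" }"""

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://127.0.0.1:7003/v1/create/MyPolicy?update=true&downloadPolicy=true"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())

    println(response.statusCode()) // 200 on success ("Code 200" above)
}
```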
The /v1/delete/{name}
endpoint deletes a dynamic policy. The following parameters can be specified:
name
path parameter (required) - specifies the id
value of the policy
Policy removed / Policy not found
E.g. Removing the policy having 'MyPolicy' name
Policy removed / Policy not found
Custodian REST API functions.
The Custodian API provides management functions for maintaining secrets and sensitive data (e.g. keys, Verifiable Credentials) in a secure way:
Pull the docker container directly from docker hub and run the project
This will create a folder called data in your current directory as storage for the VC, DIDs, Keys and other things which need to be stored in order to provide all the functionality.
1. Clone the project
2. Change the folder
3. Run the project
The first time you run the command you will be asked to build the project. You can confirm the prompt.
If you want to get a more detailed overview of the options provided for building the project on your machine, please refer to the build instructions.
Signatory API - Learn how to issue credentials
Custodian API - Learn how to maintain secrets and sensitive data (e.g. keys, Verifiable Credentials)
Auditor API - Learn how to verify credentials
- Play through a whole use case from Issuance to Verification
/v1/verify - credential / presentation verification
Policies - policy related functions:
/v1/policies - display the available verification policies
/v1/create/{name} - create a dynamic verification policy
/v1/delete/{name} - remove a dynamic verification policy
More details on creating verification policies and field definitions can be found in the verification policies documentation.
Signatory API - For Issuers: http://127.0.0.1:7001
Custodian API - For Holders: http://127.0.0.1:7002
Auditor API - For Verifiers: http://127.0.0.1:7003
Core API: http://127.0.0.1:7000
ESSIF API: http://127.0.0.1:7004
The following key management functions are available:
list - list of key ids
load - load the public key in JWK format
delete - delete key
generate - generate key
import - import key
export - export key
The /v1/key
endpoint lists the available key ids.
E.g. List the available key ids.
The /v1/key/{id}
endpoint loads the public component of the provided key id in JWK format:
id - path parameter (required) - the key id
E.g. Load the key having id = e548f032cadf4145ab6886a57c2e87e6.
The /v1/key/{id}
endpoint deletes the specified key.
E.g. Delete the key having id = e548f032cadf4145ab6886a57c2e87e6.
The /v1/key/gen
generates a new key using the specified algorithm.
E.g. Generate a new key using the EdDSA_Ed25519 algorithm.
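A Kotlin sketch of this call, assuming the /v1/key endpoints are served by the Core API on its default port 7000 (adjust host/port to your deployment); the body field name is an assumption to verify against the Swagger docs:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Generate a new EdDSA_Ed25519 key; the field name "keyAlgorithm" is an assumption,
    // see the Swagger docs for the exact request schema.
    val body = """{ "keyAlgorithm": "EdDSA_Ed25519" }"""

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://127.0.0.1:7000/v1/key/gen"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())

    println(response.body()) // the id of the newly generated key
}
```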
The /v1/key/import
endpoint imports a key (JWK or PEM format) to the underlying keystore.
E.g. Import a public key specified in JWK format.
The /v1/key/export
endpoint exports public and private key part (if supported by underlying keystore).
E.g. Export the public key with id = bc6fa6b0593648238c4616800bed7746 as JWK.
Key management functions include:
List - lists the available keys
Load - loads a key specified by its alias
Generate - generate a key using the specified algorithm
Import - imports a key
Delete - deletes a specific key
Export - exports public and private key parts (if supported by the underlying keystore)
The /keys endpoint lists the keys available to the Custodian.
E.g. List the available keys
The /keys/{alias}
endpoint loads a key specified by its alias.
E.g. Load a key with id
e548f032cadf4145ab6886a57c2e87e6
The /keys/generate
endpoint generates a key using the specified algorithm.
E.g. Generate a key using the EdDSA_Ed25519 algorithm.
The /keys/import
endpoint imports a key (JWK or PEM format) to the underlying keystore.
The key string in JWK or PEM format
E.g. Import a public key specified in JWK format.
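Since the request body is simply the key string in JWK or PEM format, a Kotlin sketch against the Custodian API (default port 7002) could look like this; the Ed25519 JWK below is a placeholder with dummy key material:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Placeholder Ed25519 public key in JWK format – replace "x" with real key material.
    val jwk = """{ "kty": "OKP", "crv": "Ed25519", "x": "<base64url-encoded-public-key>", "kid": "my-imported-key" }"""

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://127.0.0.1:7002/keys/import")) // Custodian API default port
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(jwk))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())

    println(response.body())
}
```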
The /keys/{id} endpoint deletes the key specified by:
id - path parameter (required) - the key alias
E.g. Delete the key with id
bc6fa6b0593648238c4616800bed7746
The /keys/export
endpoint exports a key.
E.g. Export the public key with id = e548f032cadf4145ab6886a57c2e87e6 as JWK.
DID management functions enable the following:
List - lists the available DIDs
Load - loads a DID by the specified id
Delete - deletes a DID by the specified url
Create - creates a new DID
Resolve - resolves a DID to a document
Import - import a DID
For more info on DIDs, go here.
The /did
endpoint lists the available DIDs.
E.g. List the available DIDs
The /did/{id}
endpoint loads a DID specified by:
id path parameter (required) - the DID url string
E.g. Load the DID having the id = did:web:walt.id.
The /did/{id}
deletes the DID specified by:
url - path parameter (required) - the DID url string
E.g. Delete the DID having id = did:web:walt.id.
The /did/create
endpoint creates a DID.
The method and keyAlias properties are common for all did-method requests: method is required, while keyAlias is optional (if not specified, a new key will be automatically created using the default algorithm according to the did-method). The method-dependent options have default values, if not specified otherwise. Below are the available properties by did-method.
useJwkJcsPub (default: false) - specifies whether to create a did:key using the jwk_jcs-pub multicodec (code: 0xeb51)
didWebDomain (default: "walt.id")
didWebPath (default: empty string)
version (default: 1)
network (default: "testnet")
E.g. Create a DID using the web method having the domain set to walt.id.
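Using the documented method and didWebDomain properties, a Kotlin sketch of this did:web example against the Custodian API (default port 7002) might look like the following:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Create a did:web for the domain walt.id; a new key is generated automatically
    // because no keyAlias is specified.
    val body = """{ "method": "web", "didWebDomain": "walt.id" }"""

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://127.0.0.1:7002/did/create")) // Custodian API default port
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())

    println(response.body()) // the newly created DID, e.g. did:web:walt.id
}
```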
The /did/resolve
endpoint resolves a DID.
E.g. Resolve the DID having id = did:key:z6MkkLmAVeM3P6B2LJ2xGrK1wVojCoephK4G9VrCcct42ADX.
The /did/import
endpoint resolves and imports the DID to the underlying data store.
The DID url string.
E.g. Import the DID having id = did:key:z6Mkm8NbvDnnxJ2t5zLGSkYGCWZiqq11Axr58xQ3ZG1Jss3z.
The SSI Kit can also be used as direct dependency for JVM-based applications. In this case an existing application can easily be enhanced with SSI functionality.
The following illustrates how the SSI Kit can be used via Gradle or Maven (look for the current version on GitHub https://github.com/walt-id/waltid-ssikit)
Gradle
Maven
Required Maven repos:
You can find the latest version here. Make sure when adding the version you add it without the 'v' in front.
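Once the dependency is on the classpath, usage can look like the following minimal Kotlin sketch. The class names (ServiceMatrix, DidService, DidMethod) follow the examples in the SSI Kit repository, but treat the exact packages and signatures as assumptions and check the GitHub README and examples for the authoritative API:

```kotlin
import id.walt.servicematrix.ServiceMatrix
import id.walt.model.DidMethod
import id.walt.services.did.DidService

fun main() {
    // Bootstrap the SSI Kit service implementations from the service-matrix config file
    // (file name as used in the SSI Kit examples; adjust to your project layout).
    ServiceMatrix("service-matrix.properties")

    // Create a did:key together with a freshly generated key and print the DID.
    val did = DidService.create(DidMethod.key)
    println(did)
}
```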
Core REST API functions.
The Core API exposes wallet core functionality in the scope of storing and managing:
The Core API exposes most of the functionalities provided by the SSI Kit, however newer features will only be released in the other API endpoints. Therefore, it is recommended to use the Signatory API, Custodian API and Auditor API for most use cases.
Commands:
All commands have the help option available:
<your-command> -h
<your-command> --help
E.g. did create -h
Creates a DID document using did create [options]
command based on the corresponding SSI ecosystem (DID method). Optionally the associated asymmetric key is also created.
-m, --did-method [key | web | ebsi | iota | jwk | cheqd]
- Specify DID method [key], Supported DID methods are: "key", "web", "ebsi", "iota", "jwk"
-k, --key TEXT
- Specific key (ID or alias)
-d, --domain TEXT
- Domain for did:web
-p, --path TEXT
- Path for did:web
-v, --version INT
- Version of did:ebsi. Allowed values: 1 (default), 2
-n, --network [testnet | mainnet]
- cheqd network, default is testnet
The returned value represents the DID document.
E.g. did create -m ebsi -k 8a2c3628acdd45999b4c0b5a69911437
IOTA support
For creating IOTA DIDs and registering them on the IOTA tangle, a wrapper library needs to be installed and available in the local library path.
The wrapper library is included in the SSIKit Docker image, such that for Docker users no additional setup is required.
CLI users can find instructions for building the wrapper library and integrating it with the SSI Kit in the corresponding repository.
Resolves the DID document.
Options:
-d, --did TEXT DID to be resolved
-r, --raw / -t, --typed
-w, --write
List all created DIDs using did list
command
Import DID to custodian store using did import [options]
command
-k, --key-id TEXT
- Specify key ID for imported did, if left empty, only public key will be imported
-f, --file TEXT
- Load the DID document from the given file
-d, --did TEXT
- Try to resolve DID document for the given DID
Use the delete
command to delete a DID:
did delete <your did>
E.g. did delete -d "did:ebsi:zs79GYJvzEnQYxkAAj4UX1j"
VC related operations like issuing, verifying and revoking VCs.
Commands:
All commands have the help option available:
<your-command> -h
or <your-command> --help
E.g. vc issue -h
Use the issue
command to issue a W3C Verifiable Credential with either a JWT or a JSON_LD signature.
options:
-i, --issuer-did TEXT
DID of the issuer (associated with signing key). [Required]
-s, --subject-did TEXT
DID of the VC subject (receiver of VC). [Required]
-v, --issuer-verification-method TEXT
KeyId of the issuer's signing key
-y, --proof-type [JWT|LD_PROOF]
Proof type to be used [LD_PROOF]
-p, --proof-purpose TEXT
Proof purpose to be used [assertion]
--interactive
Interactively prompt for VC data to fill in
--ld-signature, --ld-sig [Ed25519Signature2018|Ed25519Signature2020|EcdsaSecp256k1Signature2019|RsaSignature2018|JsonWebSignature2020|JcsEd25519Signature2020]
--ecosystem [DEFAULT|ESSIF|GAIAX|IOTA]
Specify ecosystem, for specific defaults of issuing parameters
--statusType [StatusList2021Entry|SimpleCredentialStatus2022]
specify the credentialStatus type
E.g. vc issue -t OpenBadgeCredential -s did:key:z6MkpuUYdpaZPcpnEWnkE8vb7s2u2geTZJden1BwGXsdFUz3 -i did:ebsi:zZ5apnsHPUXNqjWELjNZhYW returns a credential document (JSON format).
Use the present command to present a VC or VP to a verifier.
-i, --holder-did TEXT
DID of the holder (owner of the VC)
-v, --verifier-did TEXT
DID of the verifier (recipient of the VP)
-d, --domain TEXT
Domain name to be used in the LD proof
-c, --challenge TEXT
Challenge to be used in the LD proof
Use the verify command to verify a VC or VP.
To see available verification policies, use vc policies
command
Import VC to custodian store
Learn about VC template related functions like the listing and exporting of templates, as well as how to create/import your own custom VC template.
list
List VC Templates.
vc template list
result
export <template-name>
Export VC Template.
Options:
-n, --name <Name>
Name of the template
e.g. vc templates export --name VerifiableId
import <customCredentialPath.json>
Options:
-n, --name <Name>
Name of the template
Arguments:
credential path
the last argument of the command references the path to the custom credential, which should be imported
e.g vc templates import -n MyCustomCredential custom.json
custom.json
Output of the command
list
VCs saved in the custodian store
e.g. vc list
Key management functions like generation, listing, export/import, and deletion.
SSI-Kit CLI key management commands can be accessed with the key
command. It provides the following functionality:
All commands have the help option available:
<your-command> -h
or <your-command> --help
E.g. key gen -h
Use the gen command to create an asymmetric key pair using the specified algorithm. Supported algorithms are:
RSA:
key gen -a RSA
or key gen --algorithm RSA
ECDSA Secp256k1:
key gen -a Secp256k1
or key gen --algorithm Secp256k1
EdDSA Ed25519 (default)
key gen
or key gen -a Ed25519
or key gen --algorithm Ed25519
The returned value represents the keyId
of the newly created key.
E.g. key gen -a Secp256k1
Use the list
command to list all keys in the key store:
key list
It will output the following fields:
key index - index within the list
keyId - key identification number
key algorithm - algorithm used to create the key
crypto service - the cryptographic service used to create the key
Use the import
command to import a key in JWK or PEM format:
key import <your-key-file-path>
JWK - based on the JWK key ID and key material, an internal key object will be created and placed in the corresponding key store
PEM - if there's no key ID in the PEM file (which is usually the case), a random key ID will be generated and, based on the key material, an internal key object will be created and placed in the corresponding key store. PEM files must have the file extension 'pem':
RSA keys - file should contain either the private key or private and public keys concatenated with a 'new line' character
Ed25519, Secp256k1 - file should contain both private and public keys concatenated with a 'new line' character
E.g.
Ed25519 JWK public key
key import ./ed25519jwk.json
Secp256k1 PEM key
key import ./secp256k1.pem
Use the export
command to export a specified key type with the specified id and format.
Available key types:
public (default):
key export <your-key-id>
or key export <your-key-id> --pub
private:
key export <your-key-id> --priv
Available export formats:
JWK (default):
key export <your-key-id>
or key export <your-key-id> -f JWK
or key export <your-key-id> --key-format JWK
PEM:
key export <your-key-id> -f PEM
key export <your-key-id> --key-format PEM
The output will display the exported key in the specified format.
E.g.
key export 17592087c6f04c358b9b813dbe2ef027 --pub -f PEM
key export 17592087c6f04c358b9b813dbe2ef027 --pub
key export 17592087c6f04c358b9b813dbe2ef027 --priv -f PEM
key export 17592087c6f04c358b9b813dbe2ef027 --priv
Use the delete
command to delete a key with the specified ID:
key delete <your-key-id>
E.g. key delete 17592087c6f04c358b9b813dbe2ef027
To expose the API service using the CLI tool or the docker container, use one of the following commands:
Show all options for specifying bind address and ports:
On localhost only using the default ports 7000-7003
Binding on all network interfaces, using the default ports 7000-7003
Binding on a specific network interface (e.g.: 192.168.0.1)
Using docker one needs to bind to 0.0.0.0 in the container and limit the binding from outside using the docker run -p syntax like so:
Use custom ports by using the -p (Core API), -e (ESSIF API), -s (Signatory API) command options
Manage keys, DIDs, issue Verifiable Credentials, and verify them using the SSI-Kit command line tool.
Choose between a Docker or a JVM-based runtime.
For debug info, add "-v", e.g.:
Explore the components of the SSI Kit and their functionality:
The following DID management functions are available:
The /v1/did
endpoint lists the available DIDs.
E.g. List the available DIDs.
The /v1/did/{id}
endpoint loads a DID specified by:
id - path parameter (required) - the DID url string
E.g. Load the DID = did:key:z6Mkm8NbvDnnxJ2t5zLGSkYGCWZiqq11Axr58xQ3ZG1Jss3z.
The /v1/did/{id}
endpoint deletes the DID by:
id - path parameter (required) - the DID url string
E.g. Delete the DID = did:key:z6Mkm8NbvDnnxJ2t5zLGSkYGCWZiqq11Axr58xQ3ZG1Jss3z.
The /v1/did/create
creates a DID.
The method and keyAlias properties are common for all did-method requests: method is required, while keyAlias is optional (if not specified, a new key will be automatically created using the default algorithm according to the did-method). The method-dependent options have default values, if not specified otherwise. Below are the available properties by did-method.
didWebDomain (default: "walt.id")
didWebPath (default: empty string)
version (default: 1)
network (default: "testnet")
E.g. Create a DID using the key method and automatically generate a new key.
The /v1/did/resolve
resolves a DID url string to a DID document.
E.g. Resolve the DID = did:key:z6MkqmaCT2JqdUtLeKah7tEVfNXtDXtQyj4yxEgV11Y5CqUa.
The /v1/did/import
endpoint resolves and imports the specified DID url to the underlying data store.
E.g. Import the DID = did:key:z6MkqmaCT2JqdUtLeKah7tEVfNXtDXtQyj4yxEgV11Y5CqUa.
The following credentials management functions are available:
The /v1/vc
endpoint lists the available credentials.
E.g. List the available credentials.
The /v1/vc/{id}
endpoint loads a credential specified by:
id - path parameter (required) - the credential id
E.g. Load the credential having id = urn:uuid:d36986f1-3cc0-4156-b5a4-6d3deab84270.