Digital Success Fabric Connectors
Summary
Digital Success Fabric (DSF) is a Digital Banking Platform that enables financial institutions to provide modern, reliable banking services across digital channels such as mobile, web, conversational AI bots and others.
The architecture of DSF is Event-Driven, providing a high degree of decoupling between the Customer-facing APIs and capabilities and the underlying downstream systems, such as Core Banking, Card Issuing Processors, Customer Information Files and others.
All communication with the underlying systems happens through Commands and Events: Commands are instructions, usually sent by the Customer-facing Capabilities, and Events capture facts that happened in the underlying system. Customer-facing capabilities react to and process these events to update their standalone data projections and execute other required actions.
Documentation Purpose
The portal provides documentation of Connectors, which are divided into business areas called Domains. Each domain has one or more Connector Services. Each service receives commands and sends events (some services receive events as well).
The role of a Connector Service is to translate the protocol and messages of the underlying systems into the commands and events supported by DSF. E.g. the Account Connector Service receives an OpenAccount Command in JSON format, transforms the JSON object into a WebService XML payload, and calls the underlying Core Banking WebService. Once the operation succeeds, the connector transforms the received XML response into an AccountOpened event in JSON format and publishes it to the Event Bus topic of DSF.
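The translation flow above can be sketched as follows. This is a minimal illustration, not the actual connector implementation: the field names (accountType, branch), XML element names, and response shape are all hypothetical and differ per Connector Service and core banking system.

```python
import json
import xml.etree.ElementTree as ET

def open_account_command_to_xml(command_json: str) -> str:
    """Transform an OpenAccount JSON command into a core-banking XML payload.

    Field and element names here are hypothetical examples."""
    command = json.loads(command_json)
    root = ET.Element("OpenAccountRequest")
    ET.SubElement(root, "AccountType").text = command["accountType"]
    ET.SubElement(root, "Branch").text = command["branch"]
    return ET.tostring(root, encoding="unicode")

def xml_response_to_account_opened_event(response_xml: str) -> str:
    """Transform the core-banking XML response into an AccountOpened event payload."""
    root = ET.fromstring(response_xml)
    event = {"originAccountId": root.findtext("AccountId")}
    return json.dumps(event)
```

In a real connector, the XML payload would be sent to the Core Banking WebService and the resulting event published to the DSF Event Bus topic.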
Intended Audience
The intended audience of this documentation portal is developers working on integrating DSF with underlying downstream systems.
Common Concepts
Identifiers
Commands and Events use different IDs to link internal entities with the corresponding entities of underlying systems, as well as to reference entities in underlying systems. There are generally three types of entity IDs, as described below.
Local ID: IDs of external entities as generated, registered and formatted in the downstream system, e.g. Core Banking. These IDs are only processed in Connectors and are never sent further to DSF. Sometimes referred to as unqualifiedIds.
NB. “local” has the same meaning as local name in the XML Namespaces specification.
Qualified ID: IDs of external entities, formatted as a URI to support multiple systems managing the same types of entities, e.g. multiple Core Banking systems. A qualified ID has the following structure: coreBankingPrefix ':' branch '/' localId. Example: flexcube:001/account-01. Qualified IDs are often prefixed with origin in actual commands and events.
All external IDs are referenced in DSF using the Qualified ID format.
NB. “qualified” has the same meaning as qualified names in the XML Namespaces specification.
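The Qualified ID structure above can be built and parsed with a couple of string operations. A minimal sketch (the function names are illustrative, not part of the DSF API):

```python
def to_qualified_id(core_banking_prefix: str, branch: str, local_id: str) -> str:
    """Build a Qualified ID: coreBankingPrefix ':' branch '/' localId."""
    return f"{core_banking_prefix}:{branch}/{local_id}"

def parse_qualified_id(qualified_id: str):
    """Split a Qualified ID back into (coreBankingPrefix, branch, localId).

    The local ID may itself contain '/' characters, so only the first
    ':' and the first '/' after it are treated as separators."""
    prefix, rest = qualified_id.split(":", 1)
    branch, local_id = rest.split("/", 1)
    return prefix, branch, local_id
```

For example, to_qualified_id("flexcube", "001", "account-01") yields the qualified ID flexcube:001/account-01 from the documentation.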
Internal ID: Internal IDs are synthetic IDs generated by the connector and passed to DSF via events. Internal IDs are used when exposing Qualified IDs directly in Customer-facing APIs would introduce Data Privacy risks.
Message Envelope
All DSF messages (Events and Commands) are wrapped in an Envelope before being dispatched via the EventBus. The purpose of the Envelope is to pass additional metadata / headers alongside the message payload, such as plmr-correlation-id.
The actual set of expected metadata properties is defined in the Header section of the Operation, in the AsyncAPI Specification of each corresponding Connector Service.
Please note that all the Events and Commands Schemas in this portal are only documenting the Payload part of the Envelope.
Below is the structure of the Envelope:
{
"metadata" : {"properties": {}},
"payload" : {}
}
Example of a typical message. Note that the payload is passed as a string, not as a plain JSON object:
{
"metadata" : {
"properties": {
"plmr-correlation-id": "213594b7-1d97-4972-bb2a-9ae6d286461d"
}
},
"payload" : "{\"originAccountId\":\"flexcube:001/account-01\",\"userId\":\"user-01\"}"
}
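Producing an envelope like the one above is mostly a matter of serializing the payload to a JSON string before embedding it. A minimal sketch, assuming the only required metadata property is plmr-correlation-id (the full set is defined per service in its AsyncAPI Specification):

```python
import json
import uuid

def wrap_in_envelope(payload: dict, correlation_id=None) -> str:
    """Wrap a message payload in the DSF Envelope.

    Note: the payload is serialized to a JSON *string*, not embedded
    as a plain JSON object."""
    envelope = {
        "metadata": {
            "properties": {
                "plmr-correlation-id": correlation_id or str(uuid.uuid4()),
            }
        },
        "payload": json.dumps(payload, separators=(",", ":")),
    }
    return json.dumps(envelope)
```

A consumer does the reverse: parse the envelope, then parse the payload string as JSON a second time.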
Message signing
For security and data integrity, all messages (commands and events) exchanged via the message broker are signed using asymmetric cryptography. The signing process involves creating a digital signature for each message, which can be verified by the recipient to ensure that the message has not been tampered with and originates from a trusted source. Canonicalization is a critical step that transforms the message into a standardized format before signing, ensuring consistent signature verification regardless of how the message is transported or stored.
All messages produced by Plumery are signed. Signing is optional for messages produced by customers.
Canonicalization Format
When a message is prepared for signing, it is transformed into a canonical representation with the following structure:
topic;timestamp;key;value;canonicalizedHeaders
Where:
- topic: The Kafka topic name
- timestamp: The Unix timestamp (in milliseconds) when the message was signed
- key: The Kafka message key (empty string if null)
- value: The message payload as a JSON string
- canonicalizedHeaders: Any additional Kafka headers in a standardized format
Header Canonicalization
Headers are processed as follows:
- Signature-related headers are excluded
- Remaining headers are sorted alphabetically by key, then by value
- Each header is formatted as key=value
- Headers are joined with a semicolon (;) separator
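The canonicalization rules above can be sketched as a few lines of Python. This is an illustration, not the reference implementation; in particular, the worked example later in this section orders traceparent before X-B3-TraceId, which suggests a case-insensitive key sort, and that assumption is encoded below.

```python
# Headers added by the signing process itself are excluded from canonicalization.
SIGNATURE_HEADERS = {"x-signature", "x-alg", "x-key-id", "x-ts", "x-sig-v", "x-nonce"}

def canonicalize_headers(headers: dict) -> str:
    """Sort non-signature headers by key then value, format as key=value, join with ';'.

    Key ordering is case-insensitive -- an assumption inferred from the
    worked example in this section."""
    items = [
        (k, "" if v is None else str(v))
        for k, v in headers.items()
        if k not in SIGNATURE_HEADERS
    ]
    items.sort(key=lambda kv: (kv[0].lower(), kv[1]))
    return ";".join(f"{k}={v}" for k, v in items)

def canonicalize_message(topic: str, timestamp: int, key: str, value: str, headers: dict) -> str:
    """Build the 'topic;timestamp;key;value;canonicalizedHeaders' canonical form."""
    return ";".join([topic, str(timestamp), key or "", value, canonicalize_headers(headers)])
```

Both producer and consumer must apply exactly these rules; any divergence (ordering, encoding, null handling) makes signature verification fail.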
Signature Headers
After signing, the following headers are added to the Kafka message:
| Header Name | Description |
|---|---|
| x-signature | The cryptographic signature of the canonicalized message |
| x-alg | The algorithm used for signing (e.g., “Ed25519”) |
| x-key-id | Identifier for the key used to create the signature |
| x-ts | Timestamp when the signature was created |
| x-sig-v | Version of the canonicalization format |
| x-nonce | Unique value to prevent replay attacks |
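A producer-side sketch of attaching these headers. The actual Ed25519 signing primitive is abstracted behind a `sign_fn` callback because the portal does not prescribe a crypto library; the x-sig-v value "1" is an assumed placeholder.

```python
import time
import uuid

def sign_and_attach_headers(canonical: bytes, headers: dict, key_id: str, sign_fn) -> dict:
    """Sign the canonicalized message and return headers with signature metadata attached.

    `sign_fn` stands in for the Ed25519 signing primitive of whatever
    crypto library is in use; it takes the canonical bytes and returns
    the signature."""
    signed = dict(headers)
    signed["x-signature"] = sign_fn(canonical)
    signed["x-alg"] = "Ed25519"
    signed["x-key-id"] = key_id
    signed["x-ts"] = str(int(time.time() * 1000))  # signing timestamp, milliseconds
    signed["x-sig-v"] = "1"  # canonicalization format version (assumed value)
    signed["x-nonce"] = str(uuid.uuid4())  # unique per message, for replay protection
    return signed
```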
Verification Process
To verify a signed message:
- Extract all headers from the received message
- Rebuild the canonicalized message using the same process
- Use the x-key-id to retrieve the appropriate public key
- Verify the signature against the canonicalized message
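The verification steps above can be sketched as follows. The key lookup (`get_public_key`) and the Ed25519 verification primitive (`verify_fn`) are pluggable assumptions, not part of the DSF contract, and the canonicalization helper mirrors the rules from the Canonicalization Format section (with a case-insensitive key sort inferred from the worked example).

```python
SIGNATURE_HEADERS = {"x-signature", "x-alg", "x-key-id", "x-ts", "x-sig-v", "x-nonce"}

def _canonicalize(topic: str, timestamp: str, key: str, value: str, headers: dict) -> str:
    """Rebuild 'topic;timestamp;key;value;canonicalizedHeaders' exactly as the producer did."""
    extra = sorted(
        ((k, "" if v is None else str(v)) for k, v in headers.items()
         if k not in SIGNATURE_HEADERS),
        key=lambda kv: (kv[0].lower(), kv[1]),
    )
    canonical_headers = ";".join(f"{k}={v}" for k, v in extra)
    return ";".join([topic, str(timestamp), key or "", value, canonical_headers])

def verify_message(topic: str, key: str, value: str, headers: dict,
                   get_public_key, verify_fn) -> bool:
    """Verify a signed message.

    `get_public_key` resolves x-key-id to a public key; `verify_fn`
    stands in for the crypto library's Ed25519 verification call."""
    canonical = _canonicalize(topic, headers["x-ts"], key, value, headers).encode("utf-8")
    public_key = get_public_key(headers["x-key-id"])
    return verify_fn(public_key, headers["x-signature"], canonical)
```

Note that the signing timestamp is taken from the x-ts header, since the verifier must reuse the exact timestamp the producer signed with.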
Example
For a message with:
- Topic: cards.event.card-issued
- Timestamp: 1632152400000
- Key: 133265
- Value: {"metadata": {"timestamp": "2025-09-25T12:09:08.163715862Z","properties": {"plmr-correlation-id": "cards-0530854a-6cdd-4610-958e-4b4f96bdc925"}},"payload": "{\"id\":\"d9fc9715-8037-49fb-8cc2-c40ea4f21f39\",\"createdAt\":\"2025-09-25T12:09:07Z\",\"maskedPan\":\"111111______3798\",\"expirationDate\":\"1025\",\"cardStatus\":\"ACTIVE\",\"alias\":\"Helicopter card\",\"productId\":\"4ee74f29-7001-4834-8e08-ea8b354397ec\",\"userId\":\"6c017bfa-def6-4626-8515-10a756315039\",\"cardToken\":\"5175b183-4def-40f7-bb6d-266119953f01\",\"fulfillmentStatus\":\"ISSUED\"}"}
- Headers: x-request-id=db539fc2-cc14-414b-b2a9-1a58682e28ea,traceparent=00-4706552648e1fbe17f5e3c10b57d1c2f-93087ddece8d8be8-01,X-B3-TraceId=4706552648e1fbe17f5e3c10b57d1c2f
The canonicalized message would be:
cards.event.card-issued;1632152400000;133265;{"metadata": {"timestamp": "2025-09-25T12:09:08.163715862Z","properties": {"plmr-correlation-id": "cards-0530854a-6cdd-4610-958e-4b4f96bdc925"}},"payload": "{\"id\":\"d9fc9715-8037-49fb-8cc2-c40ea4f21f39\",\"createdAt\":\"2025-09-25T12:09:07Z\",\"maskedPan\":\"111111______3798\",\"expirationDate\":\"1025\",\"cardStatus\":\"ACTIVE\",\"alias\":\"Helicopter card\",\"productId\":\"4ee74f29-7001-4834-8e08-ea8b354397ec\",\"userId\":\"6c017bfa-def6-4626-8515-10a756315039\",\"cardToken\":\"5175b183-4def-40f7-bb6d-266119953f01\",\"fulfillmentStatus\":\"ISSUED\"}"};traceparent=00-4706552648e1fbe17f5e3c10b57d1c2f-93087ddece8d8be8-01;X-B3-TraceId=4706552648e1fbe17f5e3c10b57d1c2f;x-request-id=db539fc2-cc14-414b-b2a9-1a58682e28ea
Implementation Notes
- The canonicalization process is consistent across producers and consumers
- Header values are converted to strings (empty string if null)
- UTF-8 encoding is used throughout the process
- Consumers must follow the exact same canonicalization process to verify signatures successfully