flowchart LR
    P[Probe] -->|Standardized result| A(Aggregation Engine)
    P1[Probe] -->|Standardized result| A
    P2[Probe] -->|Standardized result| A
    A -.->|Ask for a timestamp| RTS(Third party timestamper)
    P -.-> H[Agents registry]
    P1 -.-> H
    P2 -.-> H
    A -.-> H
    A -->|HTTP POST| B[FastAPI]
    B -.->|Ask for a timestamp| RTS
    B -->|Write| G[Database]
    E[External source] -->|HTTP POST| B

In summary, the outlined architecture operates as follows. Probes gather data through localized scans conducted at diverse locations. The aggregation engine consolidates the data from these probes and generates cryptographic timestamps in adherence to the RFC 3161 standard. Powered by FastAPI, the API offers services for the storage and retrieval of checks (scans, etc.) and the proofs of the checks.

The remote timestamper can be any third-party service implementing RFC 3161.

An early demonstration is available on YouTube:

The responsibility of the different components is explained in detail below.

The project’s changelog, detailing its updates, can be accessed here.


The core part of the API is responsible for collecting data, verifying its format (with Pydantic), and providing different ways to access the results from the various scanning tools.

The API is based on the FastAPI framework, well known for its excellent performance.

The OpenAPI documentation is available in this section.

The API also provides a PubSub mechanism. Below is a simple client example.

import asyncio
import os

from fastapi_websocket_pubsub import PubSubClient

PORT = int(os.environ.get("PORT") or "8000")

async def on_events(data, topic):
    print(f"running callback for {topic}!")

async def main():
    # Create a client and subscribe to topics 'scan' and 'tst'.
    client = PubSubClient(["scan", "tst"], callback=on_events)
    client.start_client(f"ws://localhost:{PORT}/pubsub")
    await client.wait_until_done()

asyncio.run(main())

Types of agents#

Each agent is authenticated, registered, and declares its availability (for the presence notification system).

Ad hoc module: a module for sharing data with external platforms, such as MISP [1] or other database systems.

Each agent can provide an HTML view and different services.

Aggregation Engine#

Time-Stamp Protocol (TSP) RFC 3161#

The main responsibility of this agent is to collect data from the different scanning tools. This agent is also responsible for timestamping the collected data by using a third-party provider (see RFC 3161).
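A rough sketch of the first half of that workflow, assuming the collected results are JSON-serializable: RFC 3161 timestamps a hash (the "message imprint") rather than the data itself, so the engine only has to send a digest to the third-party provider.

```python
import hashlib
import json

# Hypothetical collected result; in practice this comes from the probes.
collected = {"uuid": "example-uuid", "tool": "scanner", "result": {"status": "up"}}

# RFC 3161 timestamps a message imprint (a hash), not the data itself:
# serialize deterministically, hash it, and embed the digest in the
# TimeStampReq sent to the third-party timestamper.
serialized = json.dumps(collected, sort_keys=True).encode("utf-8")
imprint = hashlib.sha256(serialized).hexdigest()

print(imprint)
```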

Probe agent#

The probe agent is responsible for:

  • embedding different scanning tools (probes);

  • normalizing and verifying the format of the analysis tools' output;

  • transferring the standardized data to the aggregation engine.
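The normalization step could be sketched as below; the envelope fields (`uuid`, `tool`, `timestamp`, `result`) are assumptions for the sketch, not the project's actual wire format.

```python
import datetime

def normalize(tool_name, raw_output, agent_uuid):
    # Wrap a scanning tool's raw output in a standardized envelope
    # before transferring it to the aggregation engine.
    return {
        "uuid": agent_uuid,
        "tool": tool_name,
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "result": raw_output,
    }

standardized = normalize("scanner", {"status": "up"}, "agent-42")
```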

Configuration file of a probe agent:

   "uuid": "",
   "period": 3600,
   "target": "",
   "command": "<-how-to-launch-the-scanning-tool>",
   "args": [],
   "result_parser": "",
   "up_agent": ""

One shot#

A one-shot probe agent can be launched for a one-off task, for example a task triggered by a user action via a graphical user interface. An agent is able to manage a list of jobs. For a large number of jobs, it is possible to launch several agents in parallel.
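With asyncio, running an agent's list of jobs can be sketched as follows; `run_job` is a hypothetical stand-in for launching the scanning tool and forwarding its result.

```python
import asyncio

async def run_job(job):
    # Placeholder for launching the scanning tool and
    # forwarding the result to the aggregation engine.
    await asyncio.sleep(0)
    return f"done: {job}"

async def main(jobs):
    # One agent processes its own list of jobs concurrently; for a large
    # number of jobs, several agents can run in parallel as separate processes.
    return await asyncio.gather(*(run_job(j) for j in jobs))

results = asyncio.run(main(["scan-a", "scan-b"]))
```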


An agent capable of executing a specific task periodically, at a scheduled interval.
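Such a periodic loop can be sketched with asyncio; in practice the interval would come from the `period` field of the agent's configuration file. The bounded iteration count is only there to make the demo terminate.

```python
import asyncio

async def scheduled_agent(task, period, iterations):
    # Run `task` every `period` seconds; `iterations` bounds the loop for
    # this demo (a real agent would loop until stopped).
    for _ in range(iterations):
        task()
        await asyncio.sleep(period)

ticks = []
asyncio.run(scheduled_agent(lambda: ticks.append("tick"), period=0.01, iterations=3))
```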


Fig. 1 List behaviours of the Correlation Engine#

Fig. 2 Messages received by the Correlation Engine from various probes.#

Fig. 3 Presence notification#

Fig. 4 Some details about a contact of the Correlation Engine.#