This repository provides the latest tools for the standalone Trust Assessment Framework.
You can get a pre-compiled version of the go-taf tools in the Releases section.
First, clone this repository:
git clone [email protected]:coordination/go-taf-tools.git
Also clone the following internal dependencies into a shared common folder:
git clone [email protected]:coordination/go-taf.git
git clone [email protected]:coordination/tlee-implementation.git
git clone [email protected]:coordination/crypto-library-interface.git
The resulting folder structure should look like this:
├── crypto-library-interface
├── go-taf
├── go-taf-tools
└── tlee-implementation
Next, go to the go-taf-tools directory and run make:
cd go-taf-tools
make build-all
To run the compiled tools, switch to the out/ folder and run them:
cd out
./playback
./watch
./recorder
To debug incoming Kafka communication from the perspective of the TAF, this repository provides a helper application that emulates the Kafka topic consumption behavior of the TAF and validates incoming messages. To build only this helper, use:
make build-watch
And to run it:
make run-watch
The helper application will now dump any incoming messages on the following topics: "taf", "tch", "aiv", "mbd", "application.ccam". For each consumed message, the application will do the following:
- check whether the message is valid JSON
- check whether the message is valid according to its Schema
- create a struct according to the type of message (unmarshalling)
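The per-message steps above can be sketched in Go. This is a minimal, illustrative example: the struct name and fields are hypothetical, and the JSON-Schema check is omitted because it requires an external schema library.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// TrustAssessmentMessage is a hypothetical message struct; the actual
// go-taf types depend on the message schemas in the repository.
type TrustAssessmentMessage struct {
	Type    string          `json:"type"`
	Payload json.RawMessage `json:"payload"`
}

// checkMessage mirrors the watch tool's per-message steps: a syntactic
// JSON check followed by unmarshalling into a typed struct. The schema
// validation step would sit between the two and is omitted here.
func checkMessage(raw []byte) (*TrustAssessmentMessage, error) {
	if !json.Valid(raw) {
		return nil, fmt.Errorf("not valid JSON")
	}
	var msg TrustAssessmentMessage
	if err := json.Unmarshal(raw, &msg); err != nil {
		return nil, fmt.Errorf("unmarshalling failed: %w", err)
	}
	return &msg, nil
}

func main() {
	msg, err := checkMessage([]byte(`{"type":"TAM_INIT","payload":{}}`))
	if err != nil {
		fmt.Println("rejected:", err)
		return
	}
	fmt.Println("accepted message of type", msg.Type)
}
```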
The watch application has the option to interactively introspect Kafka messages. By default, the web interface is disabled. Setting -web-interface-port to any non-zero value will start it on the given port. For example:
$ go run cmd/watch/watch.go -web-interface-port 8080
Starting web interface server: http://127.0.0.1:8080
To induce a workload on the TAF, this repository provides another helper application that creates Kafka messages based on a folder specifying the workload.
To build this helper separately, use:
make build-playback
And to run it:
make run-playback
The playback application can take three inputs: --story, --config and --target. The --story input is mandatory; the other two are optional. The --target input must be specified last.
The order of the other two attributes does not matter.
The usage of the playback application is shown in the following:
usage: ./playback --story=path [--no-validation] [--config=path] [--target targetlist]
example: ./playback --story=storydirectory/storyline1 --config=configdirectory/config1.json --target taf aiv mbd
The playback application takes the input from a workload folder specified via the --story attribute and produces according Kafka messages.
A workload folder consists of two main components: (1) a script.csv file that orchestrates the workload and (2) a set of JSON files that represent the messages to be sent as the record value of a Kafka message.
The CSV file uses the following structure:
| Rel. Timestamp since Start (in milliseconds) | Sender Name | Destination Topic Name | JSON File to use as Message Content |
|---|---|---|---|
1000,"application","taf","001-init.json"
In the given example entry, after 1000 ms, the playback component sends a message from the "application" component to the topic "taf" and uses the file content of "001-init.json" as record value.
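A script.csv row like the one above can be parsed with Go's standard csv package. The ScriptEntry type and helper below are illustrative, not the playback tool's actual code.

```go
package main

import (
	"encoding/csv"
	"fmt"
	"strconv"
	"strings"
)

// ScriptEntry models one row of a playback script.csv.
// Field names are illustrative.
type ScriptEntry struct {
	OffsetMillis int64  // relative timestamp since start
	Sender       string // sending component
	Topic        string // destination Kafka topic
	File         string // JSON file used as the record value
}

// parseScriptLine parses one CSV row of the documented form, e.g.
// 1000,"application","taf","001-init.json"
func parseScriptLine(line string) (ScriptEntry, error) {
	rec, err := csv.NewReader(strings.NewReader(line)).Read()
	if err != nil {
		return ScriptEntry{}, err
	}
	if len(rec) != 4 {
		return ScriptEntry{}, fmt.Errorf("expected 4 columns, got %d", len(rec))
	}
	offset, err := strconv.ParseInt(rec[0], 10, 64)
	if err != nil {
		return ScriptEntry{}, fmt.Errorf("bad timestamp %q: %w", rec[0], err)
	}
	return ScriptEntry{OffsetMillis: offset, Sender: rec[1], Topic: rec[2], File: rec[3]}, nil
}

func main() {
	e, err := parseScriptLine(`1000,"application","taf","001-init.json"`)
	if err != nil {
		fmt.Println("parse error:", err)
		return
	}
	fmt.Printf("%+v\n", e)
}
```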
By default, the playback application validates every message of the workload against the corresponding JSON schema file for the message type used. When the --no-validation flag is used, this validation is skipped, allowing the use of invalid messages that do not follow their schema definition.
The playback application uses an internal configuration with hardcoded defaults. To change the configuration, you can use a JSON file (template located in res/taf.json) and specify the actual file location in the environment variable TAF_CONFIG or via the --config attribute. See documentation in the TAF Implementation repository.
The playback application sends by default all Kafka messages that are specified in the workload.
However, by using the --target attribute, the workload can be tailored and filtered for a set of specific targets.
If one or more targets are set, then only messages to be sent to these targets will be emitted, and messages sent by these targets will be omitted.
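This filtering rule can be expressed as a small predicate. The function and parameter names below are illustrative, not the playback tool's actual implementation.

```go
package main

import "fmt"

// shouldEmit sketches the --target filtering described above: with no
// targets set, everything is played back; with one or more targets, a
// message is emitted only if its destination topic is a target, and it
// is dropped if its sender is a target (those messages are expected to
// come from the live components instead).
func shouldEmit(sender, topic string, targets []string) bool {
	if len(targets) == 0 {
		return true // no filtering: play back the full workload
	}
	isTarget := func(name string) bool {
		for _, t := range targets {
			if t == name {
				return true
			}
		}
		return false
	}
	return isTarget(topic) && !isTarget(sender)
}

func main() {
	targets := []string{"taf", "aiv"}
	fmt.Println(shouldEmit("application", "taf", targets)) // emitted: target topic
	fmt.Println(shouldEmit("taf", "aiv", targets))         // omitted: sent by a target
	fmt.Println(shouldEmit("application", "mbd", targets)) // omitted: not a target topic
}
```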
This helper application allows you to record messages exchanged during the execution of the TAF and its surrounding components by capturing all messages sent via relevant Kafka topics.
The output of the recorder tool are folders that can be used as input for the playback tool.
A recording folder contains all plain messages as well as the script with the timestamps of sending.
A recorded execution can also be shared with others in order to replicate the exact same message exchanges, thereby simplifying (remote) debugging, enabling deterministic executions, and allowing others to replay executions from environments they do not have access to.
To build this helper separately, use:
make build-recorder
And to run it:
make run-recorder
The helper application will now record any incoming messages on the following topics: "taf", "tch", "aiv", "mbd", "application.ccam". For each consumed message, the application will do the following:
- check whether the message is valid JSON
- write it to a JSON file in the output directory
- add the message to the script.csv in the output directory with the correct relative timestamp
The recorder application uses an internal configuration with hardcoded defaults. To change the configuration, you can use a JSON file (template located in res/taf.json) and specify the actual file location in the environment variable TAF_CONFIG or via the --config attribute. See documentation in the TAF Implementation repository.
The path where the recorded storylines are saved; it defaults to the current working directory.
You can overwrite the set of recorded Kafka topics by specifying a space-separated list of topics with the --topics parameter.
If enabled, this flag causes the playback tool to pause until specified targets respond with the expected messages. This allows the tool to re-enact complex message exchanges.
- go-taf Repo: The main go-taf development repository