Tools for Trust Assessment Framework

This repository provides the latest tools for the standalone Trust Assessment Framework.

Getting Started

Getting a Pre-Compiled Binary

You can get a pre-compiled version of the go-taf tools in the Releases section.

Build from Source

First, clone this repository:

git clone [email protected]:coordination/go-taf-tools.git

Also clone the following internal dependencies into the same parent folder:

git clone [email protected]:coordination/go-taf.git
git clone [email protected]:coordination/tlee-implementation.git
git clone [email protected]:coordination/crypto-library-interface.git

The resulting folder structure should look like this:

├── crypto-library-interface
├── go-taf
├── go-taf-tools
└── tlee-implementation

Next, go to the go-taf-tools directory and run make:

cd go-taf-tools
make build-all

To run the compiled tools, switch to the out/ folder:

cd out
./playback
./watch
./recorder

Watch Application for Testing/Debugging

To debug incoming Kafka communication from the perspective of the TAF, this repository provides a helper application that emulates the TAF's Kafka topic consumption behavior and validates incoming messages. To build only this helper, use:

make build-watch

And to run it:

make run-watch

The helper application will now dump any incoming messages on the following topics: "taf", "tch", "aiv", "mbd", "application.ccam". For each consumed message, the application will do the following:

  • check whether the message is valid JSON
  • check whether the message is valid according to its JSON schema
  • create a struct according to the type of message (unmarshalling)
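
The following Go sketch illustrates these three steps for a single record value. It is a minimal sketch, not the actual watch implementation: the use of the gojsonschema library, the Message struct, and the inline schema and message are assumptions made purely for illustration.

package main

import (
    "encoding/json"
    "fmt"

    "github.com/xeipuuv/gojsonschema"
)

// Message is a hypothetical placeholder for one of the TAF message types.
type Message struct {
    Type string `json:"type"`
}

// checkMessage mirrors the three steps listed above for one Kafka record value.
func checkMessage(raw []byte, schema string) (*Message, error) {
    // 1. Check whether the payload is syntactically valid JSON.
    if !json.Valid(raw) {
        return nil, fmt.Errorf("payload is not valid JSON")
    }

    // 2. Check whether the payload conforms to its JSON schema.
    result, err := gojsonschema.Validate(
        gojsonschema.NewStringLoader(schema),
        gojsonschema.NewBytesLoader(raw),
    )
    if err != nil {
        return nil, fmt.Errorf("schema validation failed: %w", err)
    }
    if !result.Valid() {
        return nil, fmt.Errorf("message violates schema: %v", result.Errors())
    }

    // 3. Unmarshal into the struct matching the message type.
    var msg Message
    if err := json.Unmarshal(raw, &msg); err != nil {
        return nil, fmt.Errorf("unmarshalling failed: %w", err)
    }
    return &msg, nil
}

func main() {
    // Hypothetical schema and message, purely for demonstration.
    schema := `{"type": "object", "required": ["type"]}`
    raw := []byte(`{"type": "EXAMPLE"}`)

    if msg, err := checkMessage(raw, schema); err != nil {
        fmt.Println("rejected:", err)
    } else {
        fmt.Println("accepted message of type", msg.Type)
    }
}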

Parameter: --web-interface-port

The watch application offers a web interface to interactively inspect Kafka messages. By default, the web interface is disabled; setting --web-interface-port to any non-zero value starts it on the given port. For example:

$ go run cmd/watch/watch.go -web-interface-port 8080
Starting web interface server: http://127.0.0.1:8080

Playback Application for Testing/Debugging

To induce a workload on the TAF, this repository provides another helper application that creates Kafka messages based on a folder specifying the workload.
To build this helper separately, use:

make build-playback

And to run it:

make run-playback

The playback application accepts the parameters --story, --config, and --target as well as the flag --no-validation. Only --story is mandatory; the others are optional. The --target parameter has to be specified last; the order of the remaining parameters does not matter.

The usage of the playback application is shown below:

usage: ./playback --story=path [--no-validation] [--config=path] [--target targetlist]

example: ./playback --story=storydirectory/storyline1 --config=configdirectory/config1.json --target taf aiv mbd

Parameter: --story

The playback application takes its input from a workload folder specified via the --story attribute and produces the corresponding Kafka messages. A workload folder consists of two main components: (1) a script.csv file that orchestrates the workload and (2) a set of JSON files that represent the messages to be sent as the record value of a Kafka message. The CSV file uses the following structure:

Rel. Timestamp since Start (in milliseconds), Sender Name, Destination Topic Name, JSON File to use as Message Content
1000,"application","taf","001-init.json"

In the given example entry, after 1000 ms the playback component sends a message on behalf of the "application" component to the topic "taf", using the content of the file "001-init.json" as the record value.
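
For illustration, a slightly longer script.csv might look like this (topics as listed above, file names hypothetical):

1000,"application","taf","001-init.json"
1500,"application","taf","002-trust-request.json"
3000,"aiv","taf","003-aiv-report.json"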

Parameter: --no-validation

By default, the playback application validates every message of the workload against the JSON schema for its message type. When this flag is set, validation is skipped, allowing the use of invalid messages that do not follow their schema definition.
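
For example, to replay a storyline that deliberately contains messages violating their schema (paths as in the usage example above):

./playback --story=storydirectory/storyline1 --no-validation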

Parameter: --config

The playback application uses an internal configuration with hardcoded defaults. To change the configuration, you can use a JSON file (template located in res/taf.json) and specify the actual file location in the environment variable TAF_CONFIG or via the --config attribute. See documentation in the TAF Implementation repository.
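
Both ways of passing a configuration file are shown below, reusing the paths from the usage example above:

./playback --story=storydirectory/storyline1 --config=configdirectory/config1.json
TAF_CONFIG=configdirectory/config1.json ./playback --story=storydirectory/storyline1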

Parameter: --target

By default, the playback application sends all Kafka messages specified in the workload. Using the --target attribute, the workload can be tailored and filtered for a set of specific targets. If one or more targets are set, only messages addressed to these targets will be emitted, while messages sent by these targets will be omitted.
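
For example, to emit only the messages addressed to the TAF and omit the messages the TAF itself would send:

./playback --story=storydirectory/storyline1 --target taf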

Recorder Application for Testing/Debugging

This helper application allows you to record the messages exchanged during an execution of the TAF and its surrounding components by capturing all messages sent via the relevant Kafka topics. The recorder's output is a folder that can be used as input for the playback tool: it contains all plain messages as well as a script with the timestamps at which they were sent. A recorded execution can also be shared with others to replicate the exact same message exchange, which simplifies (remote) debugging, deterministic executions, and the sampling of executions in environments not available to others.

To build this helper separately, use:

make build-recorder

And to run it:

make run-recorder

The helper application will now record any incoming messages on the following topics: "taf", "tch", "aiv", "mbd", "application.ccam". For each consumed message, the application will do the following:

  • check whether the message is valid JSON
  • write it to a JSON file in the output directory
  • add the message to the script.csv in the output directory with the correct relative timestamp
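
A recording folder therefore has the same layout as a playback workload folder, for example (file names are hypothetical):

├── script.csv
├── 001-init.json
├── 002-trust-request.json
└── 003-aiv-report.json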

Parameter: --config

The recorder application uses an internal configuration with hardcoded defaults. To change the configuration, you can use a JSON file (template located in res/taf.json) and specify the actual file location in the environment variable TAF_CONFIG or via the --config attribute. See documentation in the TAF Implementation repository.

Parameter: --output

The path where the recorded storylines are saved. Defaults to the current working directory.
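
For example, recording into a dedicated folder and later replaying the recorded storyline with the playback tool might look like this (paths are hypothetical; the exact folder created by the recorder may differ):

./recorder --output recordings
./playback --story=recordings/recorded-storyline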

Parameter: --topics

You can override the default set of recorded Kafka topics by specifying a space-separated list of topics with the --topics parameter.
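
For example, to record only the "taf" and "tch" topics:

./recorder --topics taf tch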

Parameter: --await-responses

If enabled, this flag causes the playback tool to pause until specified targets respond with the expected messages. This allows the tool to re-enact complex message exchanges.

See Also
