This is one of 4531 IT projects that we have successfully completed with our customers.

How can we support you?


Deltix/Trayport TB data-based algo trading

Brief description

Data streams from upstream trading systems are required to develop commodity-specific algo trading strategies and to meet compliance and reporting requirements. The upstream systems supply the trading data regardless of the type of trade; in particular, there is no separation between electricity, gas and certificate transactions. In this project, the data stream is separated by transaction type in a middleware solution consisting of Kafka and Flink and made available specifically to the downstream customised solutions.
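A minimal sketch of what such a separation could look like on the Kafka side is shown below, assuming an illustrative topic layout with one raw inbound topic and one topic per transaction type; the topic names, partition counts and replication factor are placeholders, not the project's actual configuration.

```java
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateCommodityTopics {
    public static void main(String[] args) throws Exception {
        Properties conf = new Properties();
        conf.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:9092");

        try (AdminClient admin = AdminClient.create(conf)) {
            // One raw inbound topic plus one topic per transaction type; partition and
            // replication counts are placeholders, not sized for the real load.
            admin.createTopics(List.of(
                    new NewTopic("trades-raw", 12, (short) 3),
                    new NewTopic("trades-electricity", 12, (short) 3),
                    new NewTopic("trades-gas", 12, (short) 3),
                    new NewTopic("trades-certificates", 12, (short) 3)
            )).all().get();
        }
    }
}
```

Keeping one topic per transaction type lets each downstream consumer subscribe only to the commodity it is responsible for.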

Supplement

The exponential growth of energy-trading transaction data generated by algo trading systems, together with the long list of possible new algorithms, increases the complexity of the company's IT landscape. Large volumes of data (approx. 15,000,000 transactions per day) must flow end-to-end through the systems in accordance with business rules and requirements. In this project, we are developing a software solution based on streaming technologies that feeds these large data volumes into the downstream systems in a customised manner. New tools for monitoring and reporting are also in the scope of the project. PTA's responsibilities include eliciting and validating requirements and ensuring that the developed software solution is implemented according to business needs.
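As a rough illustration of how throughput at this volume could be monitored, the sketch below registers a counter with Flink's built-in metric system; the metric name and the pass-through map function are assumptions for illustration, not the project's actual monitoring design.

```java
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.metrics.Counter;

// Counts every trade that passes through the operator; the counter is exposed via
// whichever metrics reporter (e.g. Prometheus, JMX) the Flink cluster is configured with.
public class TradeThroughputCounter extends RichMapFunction<String, String> {

    private transient Counter processedTrades;

    @Override
    public void open(Configuration parameters) {
        processedTrades = getRuntimeContext().getMetricGroup().counter("trades_processed");
    }

    @Override
    public String map(String trade) {
        processedTrades.inc();
        return trade; // pass the record through unchanged
    }
}
```

Such a function can be chained into the data stream with a .map() step before the routing logic, so that per-operator throughput is visible in the monitoring stack.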

Subject description

In larger energy trading houses, the trading departments are separated by transaction type, such as electricity, gas or certificates. Trading strategies as well as monitoring and reporting are desk-specific. Separating the data streams from the trader systems, such as Trayport or Deltix, by transaction type is therefore also necessary in real-time processing solutions. For this purpose, the solution uses Apache Kafka to transport the data and Apache Flink to prepare the data streams specifically for the consuming systems.
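The sketch below illustrates how a Flink job could perform this separation, assuming the newer KafkaSource/KafkaSink connector API, a plain JSON string payload with a commodity field, and the illustrative topic names used above; the project's actual message schema and routing rules are not part of this description.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
import org.apache.flink.connector.kafka.sink.KafkaSink;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.ProcessFunction;
import org.apache.flink.util.Collector;
import org.apache.flink.util.OutputTag;

public class TradeSplitterJob {

    // Side-output tags for the transaction types that are not the main output.
    private static final OutputTag<String> GAS = new OutputTag<String>("gas") {};
    private static final OutputTag<String> CERTIFICATES = new OutputTag<String>("certificates") {};

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Unseparated trade messages as delivered by the upstream trading systems.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("kafka:9092")
                .setTopics("trades-raw")
                .setGroupId("trade-splitter")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> raw =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "trades-raw");

        // Route each record by an assumed "commodity" field: electricity stays on the
        // main output, gas and certificate trades go to side outputs.
        SingleOutputStreamOperator<String> electricity =
                raw.process(new ProcessFunction<String, String>() {
                    @Override
                    public void processElement(String trade, Context ctx, Collector<String> out) {
                        if (trade.contains("\"commodity\":\"GAS\"")) {
                            ctx.output(GAS, trade);
                        } else if (trade.contains("\"commodity\":\"CERTIFICATE\"")) {
                            ctx.output(CERTIFICATES, trade);
                        } else {
                            out.collect(trade);
                        }
                    }
                });

        // One commodity-specific topic per consuming system.
        electricity.sinkTo(kafkaSink("trades-electricity"));
        electricity.getSideOutput(GAS).sinkTo(kafkaSink("trades-gas"));
        electricity.getSideOutput(CERTIFICATES).sinkTo(kafkaSink("trades-certificates"));

        env.execute("trade-splitter");
    }

    private static KafkaSink<String> kafkaSink(String topic) {
        return KafkaSink.<String>builder()
                .setBootstrapServers("kafka:9092")
                .setRecordSerializer(KafkaRecordSerializationSchema.builder()
                        .setTopic(topic)
                        .setValueSerializationSchema(new SimpleStringSchema())
                        .build())
                .build();
    }
}
```

Electricity trades stay on the main output while gas and certificate trades are routed via Flink side outputs, so a single pass over the raw stream can feed all three commodity-specific topics.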

Overview

Project start: 16.08.2022

Have we sparked your interest?


Dr. Andreas Schneider

Head of Energy



Contact now

We provide information on the handling of the data collected here in our privacy policy.
