
Senior Data Engineer - Trading Development (f/m/d)

Experienced · No visa sponsorship

Posted 7 days ago


**Senior Data Engineer - Trading Development (f/m/d) in Prague**

Design, develop & maintain high-volume data pipelines using ELT processes, Python, Scala, Spark, Kafka, and RDBMS. Collaborate cross-functionally to understand data needs and deliver scalable applications processing billions of datasets daily. Manage data storage & retrieval on GCP, Azure, and Cloudera, ensuring data quality and compliance. Troubleshoot data pipeline performance and stay updated with industry trends. Bring 3+ years of data engineering experience, cloud expertise (GCP/Azure), strong programming skills (Scala/Spark, Python/PySpark, Java), and familiarity with data architecture and version control tools. Proficiency in English and an understanding of financial markets are essential. Enjoy a hybrid work model, flexible hours, and a comprehensive benefits package.

Compensation: Not specified
Currency: Not specified

City: Prague
Country: Czech Republic

Full Job Description

Prague

Your career at Deutsche Börse Group

Your area of work:

Are you ready to work at the intersection of financial markets, cutting-edge technology, and data innovation? We're looking for a passionate Data Product Developer / Data Engineer to join our international team and help build scalable, high-performance applications that process petabytes of sensitive data in real time.

As part of the StatistiX IT team within the Clearing & Risk IT department, you'll be contributing to a next-generation data platform that powers mission-critical regulatory use cases. Built on a data mesh architecture, our platform enables domain-driven ownership, interoperability, and agility across the organization, and serves as the foundation for AI/ML innovation.

You'll work on designing, engineering, and developing data products and applications that handle billions of datasets daily, ensuring reliability, accuracy, and compliance. From architecture to deployment and production support, your impact will be felt across the financial ecosystem.

If you're excited by the challenge of building intelligent, resilient data systems in a fast-paced, data-centric environment, we want to hear from you.

Your responsibilities:

  • Design, develop, and maintain scalable data pipelines (ELT) and data products covering both batch and streaming workloads (see the sketch after this list).
  • Implement data integration solutions using Python, Scala, Spark, Kafka, and RDBMS such as Oracle.
  • Collaborate with cross-functional teams to understand data requirements and deliver solutions.
  • Optimize and manage data storage and retrieval processes on GCP, Azure, and Cloudera platforms.
  • Ensure data quality and integrity through rigorous testing and validation.
  • Troubleshoot data pipeline performance and reliability.
  • Stay updated with the latest industry trends and technologies to continuously improve our data infrastructure.
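
To make the streaming side of this concrete, below is a minimal sketch of an ELT pipeline in PySpark that reads trade events from Kafka and appends them to a landing table. It is illustrative only: the broker address, topic name, schema, and output paths are all hypothetical, since the posting does not describe the actual systems.

```python
# Minimal streaming ELT sketch with PySpark + Kafka.
# Requires the spark-sql-kafka package on the Spark classpath.
# All topic names, schemas, and paths below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import (DoubleType, LongType, StringType,
                               StructField, StructType)

spark = SparkSession.builder.appName("trade-events-elt").getOrCreate()

# Hypothetical schema for incoming trade events.
trade_schema = StructType([
    StructField("instrument_id", StringType()),
    StructField("price", DoubleType()),
    StructField("quantity", LongType()),
    StructField("event_time", StringType()),
])

# Extract: subscribe to a Kafka topic of raw trade events.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "trade-events")               # hypothetical topic
    .load()
)

# Load first, transform later (ELT): parse the JSON payload into columns.
trades = (
    raw.select(from_json(col("value").cast("string"), trade_schema).alias("t"))
    .select("t.*")
)

# Sink: append to a landing table; the checkpoint tracks progress
# so the stream can restart without reprocessing everything.
query = (
    trades.writeStream.format("parquet")
    .option("path", "/data/bronze/trades")        # hypothetical path
    .option("checkpointLocation", "/chk/trades")  # hypothetical path
    .outputMode("append")
    .start()
)
query.awaitTermination()
```

The same transformation logic can be reused for batch runs by swapping `readStream`/`writeStream` for `read`/`write`, which is one common way a single codebase covers both batch and streaming.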


Your profile:

  • Education: University degree in Computer Science, Engineering, or a related technical field.
  • Experience: Ideally 3+ years of relevant professional experience in data engineering or data product development.
  • Cloud Expertise: Solid hands-on experience with hyperscalers such as Google Cloud Platform (GCP) or Microsoft Azure.
  • Data Platforms: Experience with and exposure to data platforms on GCP such as BigQuery, Dataproc, and/or Databricks.
  • Programming Skills: Proficient in languages such as Scala/Spark, Python/PySpark, and Java.
  • Workflow Orchestration: Experience with batch scheduling and workflow orchestration tools; familiarity with Control-M and/or Airflow is a plus (a minimal example follows this list).
  • Data Architecture: Strong understanding of data modeling, domain-driven design, and data platform design. Experience with open architectures and open data formats such as Delta or Iceberg is a big advantage.
  • Database Knowledge: Solid working knowledge of SQL and relational databases.
  • Version Control: Proficient with tools like Git, GitHub, or GitLab.
  • DevOps & CI/CD: Exposure to CI/CD pipelines and DevOps methodologies is an advantage.
  • AI Tooling: Responsible use of AI to augment the SDLC, with tools such as Copilot, Gemini, or Claude, is a big advantage.
  • Domain Knowledge: Good understanding of financial markets; knowledge of the trading side of a stock exchange is a big advantage.
  • Communication: Strong interpersonal and communication skills to collaborate effectively across diverse teams.
  • Languages: Fluency in English is required; German language skills are a plus.
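
As a hedged illustration of the workflow-orchestration point above, here is what a daily batch pipeline could look like as a minimal Airflow DAG. The DAG id, schedule, bucket, and commands are invented for the example; the posting does not specify the actual jobs or tooling.

```python
# Hypothetical Airflow 2.x DAG sketching daily orchestration of a batch
# ELT job. DAG id, schedule, bucket, and commands are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_trades_elt",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",  # run each morning at 06:00
    catchup=False,
) as dag:
    # Extract/load raw files from object storage into a landing area.
    load_raw = BashOperator(
        task_id="load_raw",
        bash_command="gsutil -m cp gs://raw-bucket/trades/* /landing/",  # hypothetical
    )
    # Transform the landed files with a Spark batch job.
    transform = BashOperator(
        task_id="transform",
        bash_command="spark-submit transform_trades.py /landing /data/silver",  # hypothetical
    )
    load_raw >> transform  # transform runs only after load_raw succeeds
```

Control-M expresses the same idea, tasks plus dependencies, through job definitions rather than Python code.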


You can look forward to our benefits package:

  • Hybrid Work and Flexible working hours
  • Work from abroad - 12 days of remote work from EU countries per year
  • Group Share Plan - discount on company shares
  • Pension fund contribution - 3% of your gross salary (5% after 5 years with us)
  • Health & Wellbeing - fully covered Multisport card, life & accident insurance, sick days and 100% salary contribution during sick leave (up to 56 days)
  • 25 vacation days
  • Mobility - fully covered public transport in Prague & free parking
  • Flexible Benefit Account (Pluxee) - 1200 per month
  • Free Access to E-Learning Platforms, Internal Development Programs, Mentoring & Learning Budget
