Canary Wharfian - Online Investment Banking & Finance Community.

Data Engineer III - Databricks

Experienced · No visa sponsorship
at J.P. Morgan

Bulge Bracket Investment Banks

Posted 7 days ago

**Data Engineer III - Databricks: design and maintain scalable data pipelines in Bournemouth's leading financial tech team. Seeking an applicant with proven proficiency in Python, SQL, Spark, ETL/ELT processes, and cloud data platforms (AWS/Azure/GCP). Collaborate with data scientists and stakeholders to derive business insights from structured and unstructured data for cash forecasting and fund movement. Requires 5+ years of relevant experience.**

Compensation
Not specified

Currency: £ (GBP)

City
Not specified
Country
United Kingdom

Full Job Description

Location: BOURNEMOUTH, DORSET, United Kingdom

As a Data Engineer III within our data engineering team, you will design, implement, and maintain scalable data pipelines for collecting, transforming, and delivering data across systems.

The Cash Account Platform (CAP) is a firm-wide, strategic initiative to enhance the risk management, oversight, control, and reporting related to JP Morgan's intraday liquidity. In support of this, the platform provides real-time dashboards for senior executives which visualize Cash, Credit, and Collateral balances and activity at the Firmwide, Line of Business, and Institutional Client level. It is also responsible for the firm's cash forecast and funding functions, in conjunction with the nostro account reference data system. The Bournemouth development team is one of several global Agile teams contributing to the active development of the suite of applications, with emphasis on the business areas pertaining to Cash Management. From a technical perspective, there is a strong push to progress our architecture further into the Cloud.

Job Responsibilities

  • Design, implement, and maintain scalable data pipelines for collecting, transforming, and delivering data across systems
  • Ensure data quality, reliability, and timeliness throughout the pipeline
  • Develop solutions for securely and efficiently moving data between internal and external systems
  • Work with both structured and unstructured data sources
  • Analyze large datasets to extract actionable insights and present findings in a business-friendly format
  • Collaborate with data scientists and business stakeholders to identify opportunities for impactful analysis
  • Provide clean, well-structured data to support predictive models and algorithms for cash forecasting and fund movement
  • Work closely with product, engineering, and business teams to understand requirements and deliver solutions
  • Document processes and share knowledge with team members

 

Required qualifications, capabilities and skills

  • Proven experience in designing and building data pipelines (ETL/ELT) using modern technologies (e.g., Python, SQL, Spark, Airflow, etc.).
  • Strong analytical skills with the ability to interpret complex data and deliver business value.
  • Experience integrating data from multiple sources and systems.
  • Familiarity with cloud data platforms (e.g., AWS, Azure, GCP) and big data technologies.
  • Ability to work independently and collaboratively in a fast-paced environment.
  • Excellent communication and documentation skills.

Lead the data strategy, building real-time dashboards that turn cash, credit, and collateral insights into smarter decisions.
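
To give candidates a concrete sense of the extract-transform-load skills the role asks for (Python, SQL, pipeline design), here is a minimal, self-contained sketch. It is purely illustrative and not JP Morgan code: the account names, amounts, and table schema are invented, and SQLite stands in for whatever data platform the team actually uses.

```python
# Illustrative ETL sketch: extract raw cash-movement records from CSV,
# transform them (type casting + filtering), and load them into SQL.
import csv
import io
import sqlite3

# Hypothetical raw feed of cash movements (invented data).
RAW = """account,currency,amount
NOSTRO-001,GBP,1500.25
NOSTRO-002,USD,-320.10
NOSTRO-001,GBP,99.75
"""

def extract(text):
    """Extract: parse the raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast amounts to float and keep only credit entries."""
    return [
        (r["account"], r["currency"], float(r["amount"]))
        for r in rows
        if float(r["amount"]) > 0
    ]

def load(rows, conn):
    """Load: write the cleaned rows into a SQL table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS cash_movements "
        "(account TEXT, currency TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO cash_movements VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)

# Downstream consumers (e.g. a dashboard) query the loaded data.
total = conn.execute(
    "SELECT SUM(amount) FROM cash_movements WHERE account = 'NOSTRO-001'"
).fetchone()[0]
print(total)  # 1600.0
```

In a production pipeline of the kind described above, the same extract/transform/load stages would typically run on Spark with an orchestrator such as Airflow, but the shape of the work is the same.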
