Canary Wharfian - Online Investment Banking & Finance Community.

Job Details

J.P. Morgan logo
Bulge Bracket Investment Banks

Data Scientist Associate

at J.P. Morgan

Experienced · No visa sponsorship

Posted 16 days ago


Senior data scientist role within JPMorgan Chase's DART team supporting Consumer and Community Banking Operations. Responsible for designing, developing, and maintaining ETL pipelines and data integrations using AWS Glue, Java/Python, Oracle and AWS RDS, and migrating legacy ETL to cloud-native solutions. Collaborates with AWS infrastructure teams to deploy containerized applications on Amazon ECS, participates in CI/CD, code reviews and production support, and analyzes large datasets to drive business decisions.

Compensation: Not specified

Currency: Not specified

City: Not specified

Country: India

Full Job Description

Location: India

 

DART (Data, Analytics and Reporting Team) serves as the primary provider of data analytics and automation solutions for all functions within Consumer and Community Banking Operations. DART is a global team with a presence in the United States, India, and the Philippines, and this role offers the opportunity to develop cutting-edge solutions along with the infrastructure needed to support them. The position involves collaborating with stakeholders across management levels to ensure delivery of the most effective solutions.

Job summary:

As a Data Scientist Associate in the DART (Data, Analytics and Reporting Team), you will be responsible for building future-ready solutions and the related infrastructure.

Job Responsibilities

  • Design, develop, and maintain ETL data pipelines using AWS Glue, Java, or Python (a brief illustrative sketch follows this list)
  • Write secure, high-quality code following best practices and engineering standards
  • Collaborate with AWS infrastructure teams to deploy and optimize containerized applications on Amazon ECS
  • Develop and maintain data integrations with Oracle databases and AWS RDS
  • Partner with business stakeholders to gather requirements and translate business needs into technical solutions
  • Participate in code reviews, testing, and CI/CD pipeline maintenance
  • Apply knowledge of the Software Development Life Cycle, including Jenkins, Docker, and Git
  • Gather, analyze, and draw conclusions from large datasets to support decision-making
  • Participate in production support and incident response as needed
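
For readers less familiar with the stack named above, the following is a minimal, hypothetical sketch of an AWS Glue ETL job written in Python with PySpark. The database, table, and S3 bucket names are assumptions made for illustration and are not taken from this posting.

    import sys

    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext

    # Resolve the job name passed in by the Glue runtime.
    args = getResolvedOptions(sys.argv, ["JOB_NAME"])

    sc = SparkContext()
    glue_context = GlueContext(sc)
    spark = glue_context.spark_session
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read a source table registered in the Glue Data Catalog
    # (the database and table names are hypothetical).
    source = glue_context.create_dynamic_frame.from_catalog(
        database="ops_db", table_name="transactions"
    )

    # Example transformation: keep only completed transactions.
    completed = source.toDF().filter("status = 'COMPLETED'")

    # Write the result back to S3 as Parquet (bucket name is hypothetical).
    completed.write.mode("overwrite").parquet(
        "s3://example-bucket/curated/transactions/"
    )

    job.commit()

In practice, a job like this would typically be parameterized and wired into the team's scheduling and CI/CD setup rather than hard-coding source and target paths.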
 

Required Qualifications, Capabilities, and Skills

  • Bachelor’s degree with 7+ years, or Master’s degree with 5+ years, of relevant experience in a quantitative field (e.g., Computer Science, Engineering).
  • Experience designing, developing, and maintaining ETL data pipelines using AWS Glue, Java, or Python, including migrating legacy ETL to cloud-native solutions.
  • Experience building and maintaining data integrations with Oracle databases and AWS RDS; familiarity with Liquibase for database migrations.
  • Ability to write secure, high-quality code following best practices and engineering standards; experience with Java, Spring Boot, and data processing frameworks such as PySpark and Pandas (a short example follows this list).
  • Experience collaborating with AWS infrastructure teams to deploy and optimize containerized applications on Amazon ECS.
  • Knowledge of the Software Development Life Cycle, including Jenkins, Docker, and Git; experience with code reviews, testing, and CI/CD pipeline maintenance.
  • Ability to gather, analyze, and draw conclusions from large datasets to support decision-making.
  • Willingness to participate in production support and incident response as needed; knowledge of observability tools such as Splunk or CloudWatch.
  • Experience partnering with business stakeholders to gather requirements and translate business needs into technical solutions.
  • Strong problem-solving mindset, excellent communication skills, highly organized, intellectually curious, and able to work independently on unstructured problems.
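
As a small illustration of the PySpark and Pandas experience listed above, here is a hypothetical sketch that aggregates a large dataset in Spark and hands only the resulting summary to Pandas; the column names and S3 path are assumptions made for the example.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("ops-metrics").getOrCreate()

    # Load a (hypothetical) Parquet dataset of operational events.
    events = spark.read.parquet("s3://example-bucket/curated/events/")

    # Aggregate in Spark first so only a small summary leaves the cluster.
    summary = (
        events.groupBy("region")
        .agg(
            F.count("*").alias("event_count"),
            F.avg("handle_time_sec").alias("avg_handle_time_sec"),
        )
        .orderBy(F.desc("event_count"))
    )

    # Convert the small summary to Pandas for reporting or plotting.
    summary_pdf = summary.toPandas()
    print(summary_pdf.head())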
 

Preferred Qualifications, Capabilities, and Skills

  • Experience in financial services or highly regulated industries.
  • AWS certifications (Solutions Architect Associate, Developer Associate, or Data Analytics Specialty) and experience migrating legacy ETL processes to cloud-native solutions.
  • Proficiency with Java and Spring Boot, and hands-on experience with data processing frameworks such as PySpark and Pandas.
  • Familiarity with Liquibase for database migrations and knowledge of observability tools such as Splunk or CloudWatch.
  • Strong problem-solving mindset; excels at solving unstructured problems independently.
  • Intellectually curious, strategic, and eager to learn, with a focus on business goals and innovation; highly organized, able to prioritize multiple tasks, and a strong communicator who builds effective relationships with stakeholders.

     

Build your career as a Data Scientist while working at the world’s most innovative bank, which values creativity and excellence.
