Job Details

Bulge Bracket Investment Banks

Data Engineer

at Barclays

Experienced · No visa sponsorship

Posted 17 days ago

Barclays is hiring a Data Engineer to design, build and maintain scalable data pipelines, data warehouses and data lakes to ensure accurate, accessible and secure data. The role focuses on ETL/ELT development using AWS services (Redshift, Glue, Lambda, etc.), big data frameworks and advanced SQL/Python or PySpark programming. You will work with structured, semi-structured and unstructured data across multiple formats and integrate with orchestration, CI/CD and data quality/metadata frameworks. The position involves collaboration with data scientists to deploy models and support analytics across the financial services domain.

Compensation: Not specified
Currency: Not specified

City: Pune
Country: India

Full Job Description

Join us as a "Data Engineer" at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.

You may be assessed on the key critical skills relevant for success in the role, such as relevant experience and skills to meet business requirements, as well as job-specific skillsets.

To be successful as a "Data Engineer", you should have experience with:

Basic/ Essential Qualifications:

  • Must be a graduate / hold a bachelor's degree.

  • Solid experience in data engineering, ETL, and building and maintaining pipelines for structured, semi-structured and unstructured data from multiple upstream systems.

  • Work with various formats and protocols (CSV, JSON, XML, Avro, Parquet, APIs, streaming feeds and messaging queues).

  • Develop scalable ETL/ELT workflows using AWS and big data frameworks (a minimal sketch follows this list).

  • Strong experience with AWS (Redshift, Glue, Lambda, Step Functions, CloudFormation templates, CloudWatch, API Gateway).

  • Excellent programming skills in Python or PySpark.

  • Experience with ETL orchestration tools, workflow schedulers and CI/CD pipelines.

  • Excellent knowledge of data storage and warehousing concepts.

  • Model and maintain datasets within warehouses and data lakes (S3, Redshift, Hive/Glue Catalog).

  • Experience with database systems (relational: Oracle, SQL Server, PostgreSQL, MySQL; columnar: Redshift, Snowflake; NoSQL: MongoDB, Cassandra).

  • Advanced SQL skills (DDL, DML, performance tuning) and scripting experience (PL/SQL, T-SQL, Python or Shell).

  • Knowledge of data warehousing concepts (Inmon/Kimball) and ETL tools (e.g., Informatica).

  • Cloud platform experience, ideally AWS (S3, Redshift, Glue) and Data Lake implementation. 
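
As an illustration of the ETL/ELT work described above, here is a minimal PySpark sketch that reads raw CSV, applies basic cleansing and writes partitioned Parquet. It is a sketch only, not Barclays' actual pipeline: the S3 paths and the column names (trade_id, trade_date, notional) are hypothetical placeholders, and it assumes a PySpark installation with S3 connectivity configured.

    # Minimal ETL sketch: CSV in, cleansed and partitioned Parquet out.
    # Paths and column names are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

    # Extract: read raw CSV from a hypothetical landing zone.
    trades = spark.read.csv(
        "s3://example-raw-bucket/trades/",
        header=True,
        inferSchema=True,
    )

    # Transform: deduplicate, normalise types, drop incomplete rows.
    cleaned = (
        trades
        .dropDuplicates(["trade_id"])
        .withColumn("trade_date", F.to_date("trade_date"))
        .filter(F.col("notional").isNotNull())
    )

    # Load: write columnar Parquet, partitioned for query engines that
    # read via the Glue Catalog (e.g. Athena or Redshift Spectrum).
    cleaned.write.mode("overwrite").partitionBy("trade_date").parquet(
        "s3://example-curated-bucket/trades/"
    )

    spark.stop()

Partitioning on the date column keeps downstream scans cheap, which is the usual reason curated data is landed as Parquet rather than CSV.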

Some other highly valued skills may include:

  • Experience with big data ecosystems (Hadoop, Databricks, Snowflake, etc.).

  • Knowledge of Kafka and real-time flows with Amazon MSK (see the consumer sketch after this list).

  • Knowledge of Trino/Presto, Delta/Iceberg/Hudi.

  • Experience with Data Quality frameworks and metadata management.

  • Exposure to post-trade settlement, clearing, reconciliations or financial markets preferred.
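
As a hedged illustration of the Kafka point above, the sketch below consumes JSON events using the kafka-python client. The broker address and topic name are hypothetical stand-ins for an MSK cluster endpoint; a production MSK consumer would additionally configure TLS or IAM authentication.

    # Minimal real-time consumer sketch using kafka-python.
    # Broker and topic are hypothetical placeholders for an MSK endpoint.
    import json

    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "trade-events",
        bootstrap_servers=["broker.example.com:9092"],
        auto_offset_reset="earliest",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    )

    for message in consumer:
        event = message.value
        # In a real pipeline, records would be validated here and forwarded
        # to a sink such as S3 or Redshift rather than printed.
        print(event)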

This role is based out of Pune.

Purpose of the role

To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities

  • Build and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete and consistent data.
  • Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
  • Development of processing and analysis algorithms fit for the intended data complexity and volumes.
  • Collaboration with data scientists to build and deploy machine learning models (a minimal batch-scoring sketch follows this list).
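
As a hedged sketch of the data-engineering side of that collaboration, the snippet below loads a trained model artefact and scores a batch of curated records. The file paths, the feature columns and the scikit-learn-style predict() interface are all assumptions for illustration.

    # Batch-scoring sketch: load a model artefact and score curated records.
    # Paths, columns and the model's predict() interface are hypothetical.
    import pickle

    import pandas as pd

    # Model artefact handed over by the data-science team (hypothetical path).
    with open("model.pkl", "rb") as fh:
        model = pickle.load(fh)

    # Curated input batch (hypothetical file and feature columns).
    batch = pd.read_parquet("curated/trades.parquet")
    batch["score"] = model.predict(batch[["notional", "tenor_days"]])

    # Persist scores for downstream analytics.
    batch.to_parquet("scored/trades.parquet", index=False)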

Analyst Expectations

  • To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement.
  • Requires in-depth technical knowledge and experience in their assigned area of expertise.
  • Thorough understanding of the underlying principles and concepts within the area of expertise.
  • They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources.
  • If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others.
  • OR for an individual contributor, they develop technical expertise in work area, acting as an advisor where appropriate.
  • Will have an impact on the work of related teams within the area.
  • Partner with other functions and business areas.
  • Takes responsibility for end results of a team’s operational processing and activities.
  • Escalate breaches of policies / procedure appropriately.
  • Take responsibility for embedding new policies/ procedures adopted due to risk mitigation.
  • Advise and influence decision making within own area of expertise.
  • Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulation and codes of conduct.
  • Maintain and continually build an understanding of how own sub-function integrates with function, alongside knowledge of the organisation's products, services and processes within the function.
  • Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function.
  • Make evaluative judgements based on the analysis of factual information, paying attention to detail.
  • Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
  • Guide and persuade team members and communicate complex / sensitive information.
  • Act as contact point for stakeholders outside of the immediate function, while building a network of contacts outside team and external to the organisation.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
