JPMorgan Chase & Co.

AWS Software Engineer III, Big Data and ETL

JPMorgan Chase & Co., Aurora, Colorado, United States, 80012


We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.

As an AWS Software Engineer III, Big Data and ETL at JPMorgan Chase within Corporate Technology, specifically as part of the Consumer and Community Banking Risk Technology team, you will contribute to an agile team responsible for designing and delivering trusted, market-leading technology products that are secure, stable, and scalable.

You will be tasked with creating complex data pipelines on AWS infrastructure capable of processing tens of millions of records. As a senior member of a high-functioning agile team, you will employ tools such as Apache Spark running on EMR, EC2 & EKS, AWS Glue, S3, and Apache Kafka to implement solutions that are secure, highly performant, resilient, and cost-effective. You will be a key contributor to modernization efforts that ensure the firm's continuing ability to deliver trusted, market-leading consumer banking products.

Job responsibilities

Design & build new applications utilizing leading-edge technologies and modernize existing applications

Implement batch & real-time software components consistent with architectural best practices of reliability, security, operational efficiency, cost-effectiveness, and performance

Ensure quality of deployed code via automated unit, integration & acceptance testing

Collaborate with multi-national agile development, support and business teams to meet sprint objectives

Participate in all agile meetings & rituals, including daily standups, sprint planning, backlog reviews, demos, and retrospectives

Provide level 2 support for production systems

Learn and apply new processes, tools & technologies for personal & team growth and to continuously improve the team's products

Contribute to a team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities and skills

Formal training and certification on software engineering concepts and 3+ years of applied experience

Strong experience in Big Data development & ETL data pipeline implementation using Apache Spark

Experience provisioning and tuning AWS infrastructure for ETL, such as EMR, S3, Glue, and Athena

Experience designing, developing, and deploying solutions on AWS using services such as EC2, EKS, Aurora, SQS, and MSK

Must demonstrate strong analytical and troubleshooting skills

Preferred qualifications, capabilities and skills

AWS Developer, Solutions Architect, or Data Engineer certification strongly preferred

Experience coding Java applications using Spring Boot

Experience using Terraform to deploy infrastructure-as-code to public cloud

Experience with scripting on Linux using languages such as Bash, KSH, or Python
