Saxon Global

Data Engineer - AWS/Python/Java

Saxon Global, Concord, North Carolina, United States, 28027


The Customer Data Technology group, supporting our Cloud Acceleration group within Fidelity's Personal Investing (PI) organization, is seeking a Data Engineer to aid in our migration to the cloud. This role is critical in delivering Fidelity's promise of creating the best customer experience. Beyond cloud migration and the associated cloud tooling, you will gain exposure to our CI/CD deployment infrastructure, Agile development tools and procedures, relational and non-relational database technologies, and technology leadership opportunities.

The Expertise and Skills You Bring

• Strong understanding of and experience with data warehouse environments such as Snowflake and Redshift
• Hands-on development experience with the AWS Cloud services below:

  • AWS Glue
  • S3 Storage
  • SQS or Lambda

• Hands-on programming experience in Python, Java, or Scala
• Experience building data streaming pipelines from scratch in Spark (a minimal sketch follows this list)
• NoSQL experience (DynamoDB) is a plus
• Strong experience developing in an Agile team setting (Kanban and/or Scrum)
• Experience working in a DevOps and CI/CD environment is a plus
• AWS certification is a plus
• A Bachelor's or Master's degree in Computer Science, Information Technology, or equivalent experience
• Passion and intellectual curiosity for learning new technologies and business areas
• Ability to deal with ambiguity and work in a fast-paced environment
• Excellent verbal and written communication skills
• Excellent collaboration skills for working with multiple teams across the organization
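To make the Spark streaming requirement concrete, here is a minimal, illustrative sketch of a Structured Streaming pipeline in Python (PySpark) that reads JSON events landing in S3, filters them, and writes Parquet back to S3. This is not Fidelity's actual pipeline; the bucket names, event schema, and checkpoint paths are hypothetical, and it assumes a Spark runtime (for example, an AWS Glue or EMR streaming job) with S3 access already configured.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("customer-events-stream").getOrCreate()

# Schema of the incoming events (assumed for illustration).
event_schema = StructType([
    StructField("customer_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

# Treat new files arriving under the input prefix as an unbounded stream.
events = (
    spark.readStream
    .schema(event_schema)
    .json("s3://example-input-bucket/events/")
)

# Example transformation: keep login events and add a processing-date column.
logins = (
    events
    .filter(F.col("event_type") == "login")
    .withColumn("process_date", F.to_date(F.col("event_ts")))
)

# Write the stream out as Parquet; the checkpoint lets the job resume
# where it left off after a restart.
query = (
    logins.writeStream
    .format("parquet")
    .option("path", "s3://example-output-bucket/logins/")
    .option("checkpointLocation", "s3://example-output-bucket/checkpoints/logins/")
    .outputMode("append")
    .start()
)

query.awaitTermination()

The same readStream/writeStream pattern applies whether the source is S3 files, Kafka, or Kinesis; only the source format and its options change.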

The Team

We are responsible for providing the data engineering expertise and technical leadership used by our Managed Solutions Platform. This role is critical to delivering Fidelity's promise of the best customer experience on our mobile and web-based applications.

Required Skills:
- Very strong hands-on coding in Python, Java, or Scala
- Hands-on development experience with the following AWS Cloud services: AWS Glue, S3 Storage, SQS, or Lambda (a minimal Lambda/SQS sketch follows this list)
- Experience building data streaming pipelines from scratch in Spark
- Strong SQL skills
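As an illustration of the SQS/Lambda piece of this stack, below is a minimal sketch of a Python Lambda handler triggered by an SQS queue, landing each message in S3. The bucket name and key layout are hypothetical placeholders, not a description of the project's actual code.

import json
import boto3

s3 = boto3.client("s3")

BUCKET = "example-landing-bucket"  # hypothetical bucket name

def handler(event, context):
    # An SQS-triggered Lambda receives a batch of messages under "Records".
    for record in event["Records"]:
        payload = json.loads(record["body"])
        # Land each message as its own JSON object, keyed by the SQS message id.
        s3.put_object(
            Bucket=BUCKET,
            Key=f"raw/{record['messageId']}.json",
            Body=json.dumps(payload).encode("utf-8"),
        )
    # Returning normally tells Lambda the whole batch succeeded.
    return {"processed": len(event["Records"])}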

Basic Qualification:
• Snowflake
• Step Functions (AWS)
• Kinesis (AWS) (a minimal publishing sketch follows this list)
• SQS (AWS)
• Experience in migration/rewrite projects
• Experience migrating Informatica to cloud ETL
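For the Kinesis item above, here is a minimal sketch of publishing records to an Amazon Kinesis data stream with boto3 in Python. The stream name, record shape, and partition-key choice are hypothetical placeholders that only illustrate the put_record call pattern.

import json
import boto3

# Kinesis client; assumes AWS credentials and region are configured.
kinesis = boto3.client("kinesis")

STREAM_NAME = "example-customer-events"  # hypothetical stream name

def publish_event(customer_id: str, event_type: str) -> str:
    """Send one JSON event to the stream, partitioned by customer id."""
    record = {"customer_id": customer_id, "event_type": event_type}
    response = kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=json.dumps(record).encode("utf-8"),
        # Partitioning by customer id keeps one customer's events on one
        # shard, preserving their relative order.
        PartitionKey=customer_id,
    )
    return response["SequenceNumber"]

if __name__ == "__main__":
    print(publish_event("c-123", "login"))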

Background Check: Yes
Drug Screen: Yes
Candidate must be your W2 Employee: Yes
Exclusive to Apex: Yes
Face-to-face interview required: No
Candidate must be local: No
Candidate must be authorized to work without sponsorship: No
Interview times set: No
Type of project: Development/Engineering
Master Job Title: Dev: Java
Branch Code: Boston