Hexaware Technologies
Data Engineer (AWS)
Hexaware Technologies, Reston, Virginia, United States, 22090
Job Description
Required Experience: 6-9 years.

Responsibilities:
- Participate in efforts to design, build, and develop rapid Proof-of-Concept (POC) solutions and services.
- Build applications using Python, SQL, Databricks, and AWS (see the sketch after this list).
- Serve as a key team member in the design and development work of the Marketing product team.
- Apply knowledge of basic principles, methods, and practices to simple and moderately complex assignments.
- Proactively identify and implement opportunities to automate tasks and develop reusable frameworks.
- Act as a Run manager and provide Run/DevOps support.
- Adhere to standard methodologies for coding, testing, and designing reusable code/components.
- Participate in sprint planning meetings and provide estimates for technical implementation.
- Contribute to exploring and understanding new tools, techniques, and proposed improvements to the data pipeline.
- Work as a data engineer within the US Value & Access IS team, which uses a range of data, search, and AWS technologies.
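As a rough illustration of the Python, SQL, Databricks, and AWS stack named above, the sketch below shows a minimal PySpark job that reads raw JSON from S3 and writes a curated Delta table. The bucket, columns, and table name are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch of a Databricks/PySpark job over data landed in S3.
# All bucket names, columns, and table names are illustrative placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("marketing-poc").getOrCreate()

# Read raw JSON events from a (hypothetical) S3 landing zone.
raw = spark.read.json("s3://example-marketing-raw/events/")

# Mix the DataFrame API with SQL for a simple daily aggregation.
raw.createOrReplaceTempView("events")
curated = spark.sql("""
    SELECT campaign_id,
           CAST(event_ts AS DATE) AS event_date,
           COUNT(*)               AS event_count
    FROM events
    GROUP BY campaign_id, CAST(event_ts AS DATE)
""")

# Persist as a Delta table for downstream marketing analytics.
curated.write.format("delta").mode("overwrite").saveAsTable("marketing.campaign_daily_events")
```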

The Role offers:
- The opportunity to learn a wide range of tools and technologies and to deepen expertise in cloud computing.
- An outstanding opportunity to re-imagine, redesign, and apply technology to add value to the business and its operations.
- The opportunity to understand and analyze acceptance criteria with colleagues in order to plan, manage, and deliver project work.

Essential Skills:
- Advanced knowledge of AWS services and architecture.
- Experience with AWS compute services such as EC2, Lambda, Elastic Beanstalk, Batch, or ECS.
- Experience with AWS storage services such as S3, EFS, and Glacier.
- Experience with the AWS management and governance suite, including CloudTrail and CloudWatch.
- Experience with AWS analytics services such as Athena, EMR, Glue, Redshift, and Kinesis.
- Strong knowledge of object-oriented programming in Python.
- Strong experience with AWS database services such as RDS and DynamoDB.
- Experience using APIs for software development or programming.
- Experience with AWS application integration services such as Simple Notification Service (SNS), Simple Queue Service (SQS), and Step Functions (see the sketch after this list).
- Experience with AWS developer tools such as CodeDeploy and CodePipeline.
- Experience with enterprise data lakes, data warehouses, data marts, big data, JSON, and SQL.
- Experience with data migration, cloud migration, and ETL processes.
- Experience determining the causes of operating errors and taking corrective action.
- A strong understanding of Agile, Scrum, Design Thinking, and Lean Startup principles.
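As a hedged illustration of how several of the integration services listed above fit together (S3, SNS, and Lambda via boto3), the sketch below shows an S3-triggered Lambda handler that publishes a notification about a newly landed object. The topic ARN, bucket, and event shape are assumptions for illustration, not details from this posting.

```python
# Minimal boto3 sketch: an S3-triggered Lambda that notifies downstream
# consumers (e.g., SQS queues subscribed to an SNS topic).
# The ARN, bucket, and keys are illustrative placeholders.
import json
import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")

TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:example-data-events"  # placeholder

def handler(event, context):
    """Summarize a newly landed S3 object and publish a notification."""
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Inspect object metadata without downloading the full payload.
    head = s3.head_object(Bucket=bucket, Key=key)

    message = {"bucket": bucket, "key": key, "size_bytes": head["ContentLength"]}
    sns.publish(TopicArn=TOPIC_ARN, Message=json.dumps(message))
    return message
```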

Essential Qualification:
Any bachelor's degree is required.