
AWS Data Engineer

Society of Exploration Geophysicists, Charlotte, North Carolina, United States, 28245


Location: Charlotte, NC (hybrid, 2-3 days a week on site)
3-year contract with opportunity for extension or full-time hire
W-2 only; no Corp-to-Corp or 1099

Brooksource is searching for an AWS Data Engineer with experience in data warehousing using AWS Redshift to join our Fortune 500 Energy & Utilities client in Charlotte, NC.

RESPONSIBILITIES:
- Provide technical direction, guide the team on key technical aspects, and own product tech delivery
- Lead the design, build, test, and deployment of components
- Collaborate with lead developers (Data Engineer, Software Engineer, Data Scientist, Technical Test Lead)
- Understand requirements and use cases to outline technical scope and lead delivery of the technical solution
- Confirm the developers and skill sets required for the product
- Provide leadership, direction, peer review, and accountability to developers on the product
- Work closely with the Product Owner to align on delivery goals and timing
- Assist the Product Owner with prioritizing and managing the team backlog
- Collaborate with data and solution architects on key technical decisions
- Design architecture to deliver the required functionality
- Develop data pipelines, focusing on long-term reliability and high data quality
- Design data warehousing solutions with the end user in mind, ensuring ease of use without compromising performance
- Manage and resolve issues in production data warehouse environments on AWS

TECHNICAL REQUIREMENTS:
- 5+ years of AWS experience, specifically including AWS Redshift
- AWS services: S3, EMR, Glue Jobs, Lambda, Athena, CloudTrail, SNS, SQS, CloudWatch, Step Functions, QuickSight
- Experience with Kafka/messaging, preferably Confluent Kafka
- Experience with data stores and catalogs such as Glue Catalog, Lake Formation, Redshift, DynamoDB, and Aurora
- Experience with AWS data warehousing tools such as Amazon Redshift and Amazon Athena
- Proven track record in the design and implementation of data warehouse solutions on AWS
- Skilled in data modeling and in executing ETL processes tailored for data warehousing
- Competence in developing and refining data pipelines within AWS
- Proficient in handling both real-time and batch data processing tasks
- Extensive understanding of database management fundamentals
- Expertise in creating alerts and automated solutions for handling production problems
- Tools and languages: Python, Spark, PySpark, and Pandas
- Infrastructure as Code: Terraform/CloudFormation
- Experience with a secrets management platform such as Vault or AWS Secrets Manager
- Experience with event-driven architecture
- DevOps pipelines (CI/CD); Bitbucket; Concourse
- Experience with RDBMS platforms and strong proficiency with SQL
- Experience with REST APIs and API Gateway
- Deep knowledge of IAM roles and policies
- Experience using AWS monitoring services such as CloudWatch, CloudTrail, and CloudWatch Events
- Deep understanding of networking: DNS, TCP/IP, and VPN
- Experience with an AWS workflow orchestration tool such as Airflow or Step Functions

PREFERRED SKILLS:
- Experience with native AWS technologies for data and analytics, such as Kinesis and OpenSearch
- Databases: DocumentDB, MongoDB
- Hadoop platform (Hive, HBase, Druid)
- Java, Scala, Node.js
- Workflow automation
- Experience transitioning on-premises big data platforms to cloud-based platforms such as AWS
- Strong background in Kubernetes, distributed systems, microservice architecture, and containers

ADDITIONAL REQUIREMENTS:
- Ability to perform hands-on development and peer review for certain components/tech stacks on the product
- Stand up development instances and migration paths (with the required security and access/roles)
- Develop components and related processes (e.g., data pipelines and associated ETL processes, workflows)
- Lead implementation of an integrated data quality framework
- Ensure optimal framework design and load-testing scope to optimize performance (specifically for big data)
- Support data scientists with testing and validation of models
- Perform impact analysis and identify risks in design changes
- Ability to build new data pipelines, identify existing data gaps, and provide automated solutions that deliver analytical capabilities and enriched data to applications
- Ability to implement data pipelines with the right attention to durability and data quality
- Implement data warehousing products with the end user's experience in mind (ease of use with the right performance)
- Ensure test-driven development
- 5+ years of experience leading teams to deliver complex products
- Strong technical and communication skills
- Strong skills in business stakeholder interaction
- Strong solutioning and architecture skills
- 5+ years of experience building real-time (event-driven) data ingestion streams
- Ensure data security and permissions solutions, including data encryption, user access controls, and logging
