Damco Solutions

GCP Data Engineer

Damco Solutions, Texas City, Texas, US 77592


Position Title: Data Engineer (only US citizens and Green Card holders)
Location: Hybrid (3x onsite), 2 location options:
Houston, TX (Downtown)
Evansville, IN 47708

Project Details:
Looking for a Data Engineer to work in the IT and Data Analytics group. They will help develop, construct, test, and maintain data acquisition pipelines for large volumes of data, covering both batch and real-time processing. They will also help build large datasets and collaborate with stakeholders to identify opportunities for data acquisition. Looking for someone who can facilitate the overarching data process, from Python all the way through to SQL.

Must Have Skills:

Level 2:
Bachelor's degree in Data Science, Analytics, Mathematics, Statistics, Computer Science, or a related STEM field
Minimum of 2 years of related experience in data engineering, data analysis, ETL, or a similar role delivering data pipelines
Experience with Python and strong proficiency in SQL, with direct working experience leveraging these skills
Understanding of data modeling concepts and experience with data modeling tools

Level 3:
Minimum of 5 years of related experience in data engineering, data analysis, ETL, or a similar role delivering data pipelines
Experience with Google BigQuery and Python, and strong proficiency in SQL, with direct working experience leveraging these skills
Understanding of data modeling concepts and experience with data modeling tools
GCP experience

Nice to Have Skills:
Utility experience
Experience with data governance
Experience working in an agile team environment

Day to Day Responsibilities:
• Develop, construct, test, and maintain data acquisition pipelines for large volumes of structured and unstructured data, including batch and real-time processing.
• Build large and complex datasets based on business requirements.
• Construct 'big data' pipeline architecture.
• Identify opportunities for data acquisition by collaborating with stakeholders and business clients.
• Leverage a variety of tools such as Python, SAP Data Services, Dataflow, Datastream, Google Cloud Functions, Spark, Google Cloud Run, SAP SLT, Google Pub/Sub, etc. to integrate systems and data pipelines.
• Recommend ways to improve data quality, reliability, and efficiency.
• Develop JSON messaging structures for integrating with various applications.