Myticas Consulting

Google Cloud Data Engineer (31926)

Myticas Consulting, Atlanta, Georgia, United States, 30383


Google Cloud Data Engineer | Direct Hire | Hybrid

Grand Rapids, MI; Plano, TX; or Atlanta, GA

Clover Consulting (a Myticas Consulting Company) has a large, direct client seeking a skilled Senior/Principal Backend Engineer.

This is a direct hire opportunity with a hybrid work model requiring onsite workdays in one of three locations: Grand Rapids, MI; Plano, TX; or Atlanta, GA.

What We Are Looking For:

- Architect, design, and develop core data platform components with a microservices architecture, abstracting platform and infrastructure intricacies.
- Create and maintain essential data platform SDKs and libraries, adhering to industry best practices.
- Design and develop connector frameworks and modern connectors to source data from disparate applications, both on-prem and cloud.
- Design and optimize data storage, processing, and querying performance for large-scale datasets using industry best practices.
- Design and develop data quality frameworks and processes to ensure the accuracy and reliability of data.
- Collaborate with data scientists, analysts, and cross-functional teams to design data models, database schemas, and data storage solutions.
- Design and develop advanced analytics and machine learning capabilities on the data platform.
- Design and develop observability and data governance frameworks and practices.
- Stay up to date with the latest data engineering trends, technologies, and best practices.
- Drive the deployment and release cycles, ensuring a robust and scalable platform.

Requirements:

- 10+ years (for Senior) or 15+ years (for Principal) of proven experience in modern cloud data engineering, data architectures, data warehousing, and software engineering.
- Expertise in architecting, designing, and building end-to-end data platforms in the GCP environment using BigQuery and other services, while adhering to best-practice guidelines such as open standards, cost, performance, time to market, and minimizing vendor lock-in.
- Solid experience building data platforms in the GCP environment.
- Solid experience designing and developing modular, distributed data platform components with a microservices architecture; strong experience with Docker, Kubernetes, and APIs is needed.
- Proficiency in data engineering tools and technologies: SQL, Python, Spark, DBT, Airflow, Kafka.
- Solid experience implementing data lineage, data quality, and data observability for big data workflows.
- Strong experience with modern data modeling, data architecture, and data governance principles.
- Excellent experience with DataOps principles and test automation.
- Excellent experience with observability tools: Grafana and Datadog.

Nice to have:

- Experience with Data Mesh architecture.
- Experience building Semantic layers for data platforms.
- Experience building scalable IoT architectures.