Raisso

Data Engineer General

Raisso, Dearborn, Michigan, United States, 48120


Title - Data Engineer
Duration - 12 months
Location - Dearborn, MI
Working Hours - 40

Position Description:
High-Speed Modeling Service (HSMS) is a platform that allows data scientists and marketing specialists to quickly build and score models for implementation in marketing campaigns.

Key Responsibilities:
- Working with large datasets and solving difficult analytical problems.
- Conducting end-to-end analyses, including data collection, processing, and analysis.
- Building prototype analysis pipelines to generate insights (as sketched below).
- Developing data structures and metrics for upcoming product development and sales activities.
- Finding trends in data sets and developing algorithms to make raw data more useful across teams.
- Designing, building, operationalizing, securing, and monitoring data processing systems on Google Cloud Platform.
- Deploying, leveraging, and continually training and improving existing machine learning models.
- Identifying, designing, and implementing internal process improvements; automating manual processes to enhance delivery.
- Meeting business objectives in collaboration with data science teams and key stakeholders.
- Combining data sources to create reliable pipelines.
- Designing data stores and distributed systems.
- Capturing lessons learned to further improve our processes.

Required Skills:
- Proven experience as a Machine Learning Engineer or in a similar role
- Understanding of data structures, data modeling, and software architecture
- Deep knowledge of math, probability, statistics, and algorithms
- Self-motivated, with good oral and written communication skills
- Energetic, with a positive, can-do mindset
- Problem-solver
- Good planning and organizational skills
- Passion for process development, execution, and a continuous-improvement mindset
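For illustration, a prototype pipeline of the kind described above is typically orchestrated as an Airflow DAG. The sketch below is a minimal example only: the DAG name and task bodies are hypothetical, and it assumes Airflow 2.4+ for the schedule argument.

```python
# Minimal, illustrative Airflow DAG (hypothetical names and task logic).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("extracting")  # placeholder: pull raw data from a source system


def transform():
    print("transforming")  # placeholder: clean and reshape the extracted data


def load():
    print("loading")  # placeholder: write results to the warehouse


with DAG(
    dag_id="hsms_prototype_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

In a real pipeline, each task would call into GCP services (BigQuery, Dataproc, etc.) rather than print.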

Skills Required:
- Google Cloud Platform: Vertex AI, AI/ML, Dataproc, Dataflow, BigQuery, Pub/Sub, Cloud Logging and Monitoring
- AI/ML Tools: Seldon
- Infrastructure as Code: Terraform / GitHub
- ELT Framework: dbt
- Orchestrator/Scheduler: Astronomer/Airflow
- Programming Languages/Scripting: Python, shell scripting
- IDE: VS Code
- Agile Software: Rally
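As a small illustration of the Python-on-GCP combination listed above, querying BigQuery from Python usually follows the pattern below. The project, dataset, and table names are hypothetical; the sketch assumes the google-cloud-bigquery package is installed and default application credentials are configured.

```python
# Minimal BigQuery query from Python (hypothetical table).
from google.cloud import bigquery

client = bigquery.Client()  # picks up project and credentials from the environment

query = """
    SELECT campaign_id, COUNT(*) AS scored_rows
    FROM `my_project.my_dataset.model_scores`  -- hypothetical table
    GROUP BY campaign_id
    ORDER BY scored_rows DESC
    LIMIT 10
"""

# client.query() submits the job; .result() waits and returns an iterator of rows.
for row in client.query(query).result():
    print(row.campaign_id, row.scored_rows)
```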

Experience Required:
- 5+ years of data engineering experience developing applications in Python
- 2+ years of experience building AI/ML models using Vertex AI
- GCP experience building data warehouse systems, with the ability to understand ETL principles and write complex SQL queries
- 3 years of GCP experience working in GCP-based big data deployments (batch/real-time) leveraging BigQuery, Dataproc, Bigtable, Google Cloud Storage, Data Fusion, Dataflow, etc.
- 1 year of experience deploying Google Cloud services using Terraform
- 1+ years of experience with the ELT framework dbt
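On the real-time side of the deployments mentioned above, publishing an event to Pub/Sub from Python is a short sketch like the following. The project and topic names are hypothetical; it assumes the google-cloud-pubsub package with default credentials.

```python
# Minimal Pub/Sub publish (hypothetical project and topic).
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "model-score-events")  # hypothetical

# publish() is asynchronous and returns a future; result() blocks until the
# message is accepted and returns the server-assigned message ID.
future = publisher.publish(topic_path, data=b'{"campaign_id": 42, "score": 0.87}')
print("Published message ID:", future.result())
```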

Experience Preferred:
- Google Cloud Platform (GCP) Certification

Education Required: BS in Computer Science or a related field

Additional Information:
- Selection process: hands-on test (GCP Dataproc, Python, PySpark & Airflow, as sketched below) and interview
- Preference given to local candidates
- HRA test required
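For candidates preparing for the hands-on test, a minimal PySpark aggregation of the flavor run on Dataproc might look like the sketch below; the bucket path and column names are hypothetical.

```python
# Minimal PySpark aggregation (hypothetical file path and columns).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("hsms-example").getOrCreate()

# Read a CSV from Cloud Storage, inferring column types from the data.
df = spark.read.csv("gs://my-bucket/scores.csv", header=True, inferSchema=True)

# Average score per campaign, highest first.
(df.groupBy("campaign_id")
   .agg(F.avg("score").alias("avg_score"))
   .orderBy(F.desc("avg_score"))
   .show(10))

spark.stop()
```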