Motion Recruitment
DevOps Engineer
Motion Recruitment, Auburn Hills, MI, United States
Our client, one of the world's largest automotive manufacturers, is hiring a DevOps Engineer to join their Enterprise Data and Advanced Analytics team in Auburn Hills, MI.
This team supports data ingestion and processing. As an Engineer on this team, you will be responsible for orchestrating data pipelines using Python and Informatica and for automating deployments with CI/CD tools such as GitHub Actions or Argo CD.
Responsibilities:
Design & Develop Data Pipelines
- Orchestrate complex data pipelines using data integration tools like Informatica and Python, ensuring seamless data flow from various sources.
- Leverage GCP Dataflow, Cloud Functions, and other cloud technologies to build scalable and resilient data ingestion and processing pipelines (see the sketch after this list).
- Implement robust CI/CD workflows using GitHub Actions and Argo CD to automate pipeline deployments and ensure consistency.
- Monitor and manage production solutions. Optimize and fine-tune models for performance, accuracy, and scalability.
- Document best practices and quality standards to be adhered to during development of data science solutions.
- Conduct reviews and provide feedback on data science applications.
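To give candidates a flavor of the pipeline work, below is a minimal Apache Beam sketch of the kind of ingestion job that could run on the GCP Dataflow runner. It is an illustration only, not the client's actual code; the project, bucket, table, and schema names are hypothetical placeholders.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run() -> None:
    # Hypothetical project, region, and bucket; swap in real values to run.
    options = PipelineOptions(
        runner="DataflowRunner",  # use "DirectRunner" for local testing
        project="example-project",
        region="us-central1",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            # Read newline-delimited JSON events from Cloud Storage.
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "Parse" >> beam.Map(json.loads)
            # Append parsed rows to a (hypothetical) BigQuery table.
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="event_id:STRING,ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```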
Manage & Analyze Data
- Work with diverse data sources, including relational databases (Oracle, SQL Server, MySQL, Postgres, Snowflake), big data platforms (Hadoop, Parquet files, BigQuery, BigLake managed Iceberg), and streaming data (Kafka, GCP Dataflow/Dataproc).
- Employ powerful compute engines like Hive, Impala, and Spark to analyze massive datasets and derive valuable insights (see the sketch below).
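As an illustration of the analysis side, here is a minimal PySpark sketch that rolls up daily revenue from partitioned Parquet files; the paths and column names are hypothetical placeholders, not the client's schema.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-revenue-rollup").getOrCreate()

# Read partitioned Parquet files (hypothetical path on Cloud Storage or HDFS).
orders = spark.read.parquet("gs://example-bucket/warehouse/orders/")

# Aggregate revenue and order counts per region and day, largest first.
daily = (
    orders
    .groupBy("region", F.to_date("order_ts").alias("day"))
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
    .orderBy(F.desc("revenue"))
)
daily.show(20)
```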
Deliver Actionable Insights
- Collaborate with business stakeholders to understand their challenges and requirements.
- Translate business problems into analytical frameworks and identify opportunities to address complex problems.
- Build APIs and user-friendly interfaces to present data results and empower informed decision-making (see the sketch below).
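For the API-building responsibility, a minimal sketch using FastAPI (one common choice; the posting does not name a framework) might look like the following, with a hypothetical endpoint and an in-memory stand-in for pipeline output.

```python
from fastapi import FastAPI, HTTPException

app = FastAPI(title="Analytics API")

# Stand-in for results materialized by an upstream data pipeline.
DAILY_REVENUE = {"2024-01-01": 125000.0, "2024-01-02": 98500.0}


@app.get("/revenue/{day}")
def get_revenue(day: str) -> dict:
    """Return the precomputed revenue figure for a given day."""
    if day not in DAILY_REVENUE:
        raise HTTPException(status_code=404, detail=f"no data for {day}")
    return {"day": day, "revenue": DAILY_REVENUE[day]}
```

Run locally with `uvicorn main:app --reload`, assuming the file is saved as `main.py`.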
Drive Machine Learning Innovation
- Explore and implement Vertex AI models to generate quick insights and support business requirements (see the sketch below).
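As a sketch of working with Vertex AI, the snippet below queries an already-deployed model endpoint for online predictions using the google-cloud-aiplatform SDK; the project, region, endpoint id, and feature names are hypothetical placeholders.

```python
from google.cloud import aiplatform

# Hypothetical project and region.
aiplatform.init(project="example-project", location="us-central1")

# Reference an already-deployed endpoint by its (hypothetical) resource name.
endpoint = aiplatform.Endpoint(
    "projects/example-project/locations/us-central1/endpoints/1234567890"
)

# Request an online prediction; the instance fields depend on the model.
prediction = endpoint.predict(
    instances=[{"feature_a": 1.2, "feature_b": "blue"}]
)
print(prediction.predictions)
```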
Stay at the Forefront
- Continuously learn and adapt to emerging data technologies and best practices.
- Contribute to the ongoing improvement of data infrastructure and processes.
Requirements:
- Bachelor's Degree in a STEM field
- 4+ years of Python development experience
- Experience building machine learning and data pipelines
- Experience leveraging cloud technologies to scale data ingestion using Informatica
- Experience implementing CI/CD processes using GitHub Actions and Argo CD for automation
- Experience working with diverse data sources, including relational databases (Oracle, SQL Server, MySQL, Postgres, Snowflake), big data platforms (Hadoop, Parquet files, BigQuery, BigLake managed Iceberg), and streaming data (Kafka, GCP Dataflow/Dataproc)