Expeditors

Senior Data Engineer

Expeditors, Romulus, Michigan, United States, 48174


“We’re not in the shipping business; we’re in the information business.” – Peter Rose, Expeditors Founder

Global supply chain management is what we do, but at the heart of Expeditors you will find professionalism, leadership, and a friendly environment, all of which foster an innovative, customer service-based approach to logistics.

15,000 trained professionals

250+ locations worldwide

Fortune 500

Globally unified systems

As a Senior Data Engineer, you will play a pivotal role in our agile analytics team, leading the design, development, and optimization of complex data infrastructure and pipelines. Your expertise will be crucial in delivering high-impact reporting, analytics, and machine learning solutions that drive business success. You will leverage your extensive experience to not only build and maintain critical data systems but also to mentor and guide junior engineers, ensuring the seamless translation of intricate business requirements into robust and scalable data solutions.

This position is within the GEO-IS Data Platforms Product Development team, which supports our entire GEO-IS Solutions data infrastructure.

Successful candidates must be able to understand complex business requirements and translate them into scalable, trusted data pipelines and models. At the core, a successful Senior Data Engineer will excel at the following:

Serve as the subject matter expert for data and systems.

Design, develop, and maintain scalable data pipelines and models, driving architecture and best practices for high performance.

Identify and lead data quality improvements, standardizing and enriching data to solve complex challenges and provide insights.

Collaborate with cross-functional teams to enhance data models and support advanced BI and analytics.

Mentor junior data engineers, lead code reviews, and promote best practices and skill development.

Optimize data pipelines for performance and scalability while staying updated on industry trends and technologies.

Major Duties and Responsibilities

Translate complex business needs into aligned data models and architecture.

Design and implement scalable, high-performance data pipelines and models.

Optimize data infrastructure to meet evolving business and tech needs.

Monitor and enhance data pipelines for performance and integrity.

Lead testing and troubleshooting for complex data issues.

Drive data quality initiatives and enforce standards.

Manage version control, backups, and disaster recovery with rigorous documentation.

Create and maintain up-to-date technical documentation.

Document and communicate data lineage and governance.

Establish and promote best practices for data pipeline and model development.

Train and mentor end-users on data models and business logic.

Refine data models in collaboration with stakeholders for optimal performance.

Support agile practices and keep work management systems up to date.

Minimum Qualifications

Bachelor's degree in an IT-related field or equivalent experience, with 5-7 years of progressive experience in data engineering.

Extensive expertise in schema design, dimensional data modeling, and data warehousing, including experience with cloud-based platforms (e.g., AWS, Azure, Google Cloud).

Proven ability to lead and collaborate effectively with cross-functional development teams, including stakeholders and executive leadership.

Demonstrated capability to independently manage and prioritize complex projects, driving team commitments and meeting strategic objectives.

Expert-level proficiency in writing, optimizing, and troubleshooting complex SQL queries across multiple databases.

Advanced skills in data processing and manipulation using programming languages such as Python, with a focus on developing scalable data solutions.

Experience with on-prem data engineering tools such as dbt Core.

Proven track record in automating complex data tasks and workflows, with experience in designing and implementing automation frameworks.

Desired Qualifications

Master’s degree or higher in Computer Science, Data Engineering, or a related field.

Relevant certifications in cloud platforms (e.g., AWS Certified Data Analytics, Google Cloud Professional Data Engineer, Microsoft Azure Data Engineer Associate).

Proven experience in migrating data infrastructures from on-premises to cloud environments, including strategy development, execution, and post-migration optimization.

Experience with big data technologies such as PySpark or Kafka for processing and analyzing large-scale datasets.

Experience with modern data warehousing solutions (e.g., Azure Synapse, Databricks) and their integration with BI tools.

Knowledge of data governance, data privacy regulations (e.g., GDPR, CCPA), and best practices for ensuring data security and compliance.

Familiarity with DevOps practices and tools (e.g., Docker, Kubernetes) for deploying and managing data pipelines in a CI/CD environment.

Proficiency with advanced ETL (Extract, Transform, Load) tools and frameworks (e.g., Apache NiFi, Talend).

Experience with managing and optimizing MS SQL Server environments (e.g., advanced T-SQL, performance tuning, linked servers, proactive maintenance).

The annual salary for this position is between $130,000 and $160,000.

The position is full time (40 hours per week), Monday through Friday, in office. Our dress code is business attire.

Applicants are required to be eligible to lawfully work in the U.S. immediately; employer will not transfer or sponsor applicants for U.S. work authorization (such as an H-1B visa) for this opportunity.

Expeditors offers excellent benefits:

Paid Vacation, Holiday, Sick Time

Health Plan: Medical, Prescription Drug, Dental and Vision

Life and Long-Term Disability Insurance

401(k) Retirement Savings Plan (US only)

Employee Stock Purchase Plan

Training and Personnel Development Program

All your information will be kept confidential according to EEO guidelines.