Costco
Data Engineer - Costco Travel
Costco, Issaquah, Washington, United States, 98027
Data Engineers are responsible for developing and operationalizing data pipelines/integrations to make data available for consumption (e.g., Reporting, Data Science/Machine Learning, Data APIs). This includes data ingestion, data transformation, data validation/quality, data pipeline optimization, orchestration, and deploying code to production via CI/CD. The Data Engineer role requires knowledge of software development/programming methodologies, various data sources (relational databases, flat files (CSV, delimited), APIs, XML, JSON, etc.), and data access (SQL, Python, etc.), as well as expertise in data modeling, cloud architectures/platforms, data warehousing, and data lakes. This role also partners closely with Product Owners, Data Architects, Platform/DevOps Engineers, and others to design, build, test, implement, and maintain data pipelines.
Costco Travel is looking for a creative, team-oriented, and motivated Database Engineer to mentor and assist project teams with database modeling, design, development, and optimization. The work spans both internal and online applications, as well as the health of the production environment, ensuring all database systems run efficiently and smoothly. This role will also partner closely with the DBA team, Product Owners, Data Architects, and Platform/DevOps Engineers. Costco Travel offers a fast-paced, growing, and exciting team environment; our ideal candidate enjoys challenges, applies creative thinking to solve complex business problems, and wants to be part of a team that is goal-focused and results-oriented.
Job Duties/Essential Functions
• Develops complex SQL and Python code against a variety of data sources.
• Implements streaming data pipelines using event/message-based architectures.
• Demonstrates the ability to communicate technical concepts to non-technical audiences in both written and verbal form.
• Works in tandem with Data Architects to align on data architecture requirements provided by the requestor.
• Defines and maintains optimal data pipeline architecture.
• Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery/orchestration.
• Demonstrates a strong understanding of coding and programming concepts used to build data pipelines (e.g., data transformation, data quality, data integration).
• Analyzes data to spot anomalies and trends, and correlates data to ensure data quality.
• Develops data pipelines to store data in defined data models/structures.
• Demonstrates a strong understanding of data integration techniques and tools (e.g., Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT)).
• Demonstrates strong understanding of database storage concepts (data lake, relational databases, NoSQL, Graph, data warehousing).
• Identifies ways to improve data reliability, efficiency, and quality of data management.
• Performs peer reviews of other Data Engineers' work.
• Optimizes existing SQL code for performance.
• Works within and across teams to solve complex technical challenges and priority issues.
• Coaches and mentors the development teams on all things SQL and data.
• Develops, maintains, and operationalizes Azure-based ETL pipelines for reporting, advanced analytics, testing, and archiving.
• Develops data engineering best practices; continually evaluates our processes and reporting to identify opportunities to improve, enhance, and automate existing and new capabilities.
• Performs peer review for the Data Engineering team and other development teams as needed.
• Works in tandem with Architects, Product Owners, and Engineers to design data requirements and recommends ongoing optimization of data storage, data ingestion, data quality, and orchestration.
• Regular and reliable workplace attendance at your assigned location.
Ability to operate vehicles, equipment or machinery:
• Computer, phone, printer, copier, fax
Non-Essential Functions
• Assists in other areas of the department as necessary.
• Assists in other areas of the company as necessary.
Ability to operate vehicles, equipment or machinery:
• Same as essential functions
Experience, Skills, Education & Licenses/Certifications
Required:
• 5 years' data engineering experience and a BS or MS in Computer Science, Engineering or related technical discipline.
• 5 years' SQL Server experience.
• 3 years' Azure, ADF, ADLS, and Synapse experience, or equivalent.
• 3 years' creating data pipelines and ETL solutions.
• 2 years' Python experience.
• MS SQL scripting and SQL Server subject matter expert.
• Able to work in a fast-paced agile development environment.
Recommended:
• Proficient in Google Workspace applications, including Sheets, Docs, Slides, and Gmail.
• Successful internal candidates will have spent one year or more on their current team.
Other Conditions
• Management will review the Job Analysis for this position prior to a job offer.
Required Documents
• Cover Letter
• Resume
• Last two performance reviews
• Attendance records for current year (Do not include absences covered by paid sick/personal time, FMLA, or other protected absences.)
California applicants, please click here to review the Costco Applicant Privacy Notice.