Data Engineer Job at Robert Half in Seattle
Robert Half, Seattle, WA, US
Job Description
We are offering a long-term contract employment opportunity for a Data Engineer in the Hospitality industry, located in Seattle, Washington. As a Data Engineer, you will be responsible for developing, maintaining, and testing the infrastructure for data generation. You will work closely with data architects and will be expected to have in-depth knowledge of data-related technologies.
Key responsibilities:
- Data Pipelines: Develop and maintain robust, scalable data pipelines to ingest, process, and transform large data sets using a variety of tools including, but not limited to, Python, Snowflake, and Bash
- ML Ops Integration: Implement ML Ops practices to streamline the deployment, monitoring, and management of machine learning models in production environments
- Cloud Services: Utilize AWS or other cloud services as required for data engineering, data science, and ML modeling
- Dev Ops Practices: Apply modern software development practices, including CI/CD, version control, and automated testing, to ensure efficient and reliable data engineering processes
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts and business stakeholders, to understand data requirements and deliver robust, scalable data pipelines based on those requirements
- Documentation: Maintain comprehensive documentation of data workflows, pipelines and processes
- Performance Optimization: Continuously monitor and optimize data processes and systems for performance, scalability and cost-efficiency
Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- Experience: A minimum of 5 years of experience in data engineering or a related role, with a proven record of building and maintaining data pipelines and infrastructure
- SQL Skills: Advanced proficiency in SQL and experience with relational databases such as Snowflake and Oracle
- Programming: Strong programming skills in Python or a similar language, with experience developing data processing scripts and applications
- Scripting: Proficiency in Bash scripting for automating data workflows and processes
- Cloud Services: Hands-on experience with AWS or other cloud platforms, including services like S3, EC2, RDS, Lambda and Glue
- Dev Ops and ML Ops: Hands-on experience with modern software development practices, including CI/CD pipelines for data and machine learning workflows
- Analytical Skills: Strong analytical and problem-solving skills, with the ability to translate business requirements into technical solutions
- Communication: Excellent communication and collaboration skills, with the ability to work effectively in a remote and distributed team environment
Preferred Qualifications:
- Visualization: Experience with data visualization tools like Power BI, Tableau or similar platforms
- Machine Learning: Experience with machine learning frameworks and libraries such as TensorFlow, PyTorch or Scikit-learn
- Certifications: Relevant certifications in Snowflake, AWS, Data Engineering, or Machine Learning