Robert Half is hiring: Data Engineer in Miami Beach
Robert Half, Miami Beach, FL, US
Job Description
We are a data services company with a long-term contract employment opportunity for a Data Engineer based in Miami Beach, Florida. In this role, you will build and maintain the data infrastructure and pipelines behind our revenue management systems. You will collaborate with data science and business stakeholders to meet their data needs consistently while keeping the pipelines robust, scalable, and efficient.
Responsibilities:
• Build and maintain scalable, robust data pipelines to ingest, process, and transform large data sets using tools such as Python, Snowflake, and Bash.
• Implement MLOps practices to streamline the deployment, monitoring, and management of machine learning models in production environments.
• Utilize AWS or other cloud services as necessary for Data Engineering, Data Science, and ML modeling.
• Apply modern software development practices, including CI/CD, version control, and automated testing, to ensure an efficient and reliable data engineering process.
• Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver robust, scalable data pipelines based on those requirements.
• Maintain comprehensive documentation of data workflows, pipelines, and processes.
• Continuously monitor and optimize data processes and systems for performance, scalability, and cost-efficiency.
Requirements:
• Minimum of 5 years of experience in data engineering or a related field
• Proficiency in cloud technologies
• Familiarity with data visualization techniques
• Expertise in AWS technologies
• Strong Python programming and scripting skills
• Experience with Snowflake data platform
• Ability to perform Shell scripting
• Knowledge of Continuous Integration / Continuous Delivery (CI/CD) practices
• Experience in setting up Continuous Integration (CI) systems
• Proven ability to design and manage data pipelines