Hatchpad

Data Ops Engineer

Hatchpad, Annapolis, Maryland, United States, 21403


Up to $160,000 | Partially Remote | Annapolis, MD

hatch I.T. is partnering with Expression to find a Data Ops Engineer. See details below:

About The Role:

Expression is seeking a skilled Data Ops Engineer to join their team in Annapolis, MD in a hybrid role. As a Data Ops Engineer, you will play a crucial role in bridging their data and infrastructure teams. This strategically designed position offers you the chance to tackle challenging tasks that will foster your professional growth and development. You will be at the forefront of ensuring their data systems are reliable, efficient, and scalable, enabling seamless data flows that empower data-driven decision-making for their clients.

Responsibilities:

- Design, implement, and maintain robust data infrastructure, including databases, data warehouses, and data lakes, to support a rapidly expanding data landscape.
- Develop, deploy, and test ETL pipelines for extracting, transforming, and loading data from various sources.
- Collaborate with data scientists and data engineers to integrate and test machine learning models within the data systems.
- Implement automation and orchestration tools to streamline data operations and boost efficiency.
- Continuously assess and optimize data pipelines and infrastructure for performance, scalability, and cost-effectiveness.
- Establish proactive monitoring and alerting mechanisms to detect and address potential issues in real time.
- Work closely with cross-functional teams to understand evolving data requirements.
- Create comprehensive documentation of data infrastructure, pipelines, and processes.

Requirements:

- Top Secret clearance with the capability to obtain a CI Poly
- Security+ certification (or willingness to get certified within the first month)
- Associate's degree or higher in engineering, computer science, or a related field and 5+ years of experience as a DevOps/Cloud/Software engineer, OR 8+ years of experience as a DevOps/Cloud/Software engineer
- Proficiency in programming languages such as Python, Java, or Scala
- Strong experience with relational databases (e.g., PostgreSQL, MySQL) and big data technologies (e.g., Hadoop, Spark)
- Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform
- Experience with data pipeline orchestration tools (e.g., Airflow, Luigi) and workflow automation tools (e.g., Jenkins, GitLab CI/CD)
- Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes) is a plus
- Excellent problem-solving skills and attention to detail
- Strong communication and collaboration skills
