Codex
Enterprise Data Quality Lead
Codex, Houston, Texas, United States
Location: Houston, Texas (Hybrid)
Salary: $150,000 - $175,000 (base)

Job Description
We are looking for an experienced and proactive Enterprise Data Quality Lead to ensure the accuracy, consistency, completeness, and integrity of organizational data. This role is critical to supporting the transition to a modern, cloud-based ERP platform by implementing data quality standards and leading cross-functional initiatives to address data challenges. The ideal candidate will bring deep expertise in data quality management, a strong understanding of ERP systems (particularly Oracle Cloud), and experience navigating complex, enterprise-level implementation projects.

Roles & Responsibilities

Data Quality Assurance and Strategy
- Design and implement a scalable, repeatable data quality framework.
- Define data quality standards, metrics, and KPIs to ensure high-quality data across ERP modules such as Financials, Supply Chain, HCM, Procurement, and Customer Experience.
- Collaborate with stakeholders to establish data ownership, roles, and accountability.
- Ensure compliance with regulatory and organizational policies, including SOX and GDPR.
- Oversee data profiling, cleansing, validation, and enrichment during migration and day-to-day operations.
- Identify and resolve data quality issues by partnering with technical and functional teams to develop mitigation strategies.
- Lead comprehensive data quality testing during implementation phases, including System Integration Testing (SIT) and User Acceptance Testing (UAT).
- Collaborate with ERP implementation partners to align data quality efforts with broader project goals.

Stakeholder Collaboration
- Engage closely with business process owners, functional leads, and technical teams to define and address data quality requirements.
- Act as the primary contact for all data quality-related issues and escalations.
- Facilitate workshops and training sessions to raise data quality awareness and capabilities across the organization.

Tools and Technology
- Utilize tools such as Oracle Data Integrator (ODI), Oracle Enterprise Data Quality (EDQ), and others to automate and streamline data quality processes.
- Configure rules and workflows within data quality tools to meet project and operational needs.
- Develop dashboards and reports to track, analyze, and communicate data quality performance.

Required Knowledge, Skills & Abilities
- 8 years of IT experience in technical domains, with a focus on data engineering or analytics.
- Bachelor's degree in Computer Science, Information Systems, Business Administration, or a related field.
- 4 years of experience configuring and implementing cloud-based data warehousing and archiving tools.
- 3 years of experience designing and implementing data integration platforms and ETL/ELT tools such as Oracle Data Integrator or Boomi.
- Proficiency in at least two programming or scripting languages, such as PL/SQL, Python, Spark, Scala, or R, with experience in tools like Jupyter Notebook.
- Strong familiarity with databases, including Oracle, SQL Server, and PostgreSQL.
- Proven ability to work independently under enterprise architectural guidelines, with the potential to supervise other developers.
- Expertise in data migration strategies, tools, and methodologies.
- Strong skills in data profiling, cleansing, validation, and enrichment techniques (see the illustrative sketch at the end of this posting).
- Exceptional analytical, problem-solving, and communication abilities.
- Ability to balance multiple priorities effectively in a fast-paced environment.
Preferred
- Experience with Oracle Cloud ERP modules such as Financials, Supply Chain, or HCM.
- Familiarity with Oracle tools like EDQ, ODI, FAW, and OAC.
- Experience designing and deploying cloud infrastructure on platforms like Microsoft Azure, including Azure Data Lake and Synapse Analytics.
- Hands-on experience with data modeling, cloud-based data pipelines, and creating dashboards.
- Proficiency with Azure Data Lake Gen2, Synapse Analytics, or equivalent tools.
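For illustration only, below is a minimal sketch of the kind of data profiling and rule-based validation work this role describes, written in Python with pandas. The file name, column names, and thresholds are hypothetical assumptions, not details from the posting; in practice, tools such as Oracle EDQ or ODI would typically express comparable rules through their own configuration rather than hand-written code.

```python
# Minimal data quality sketch: profile a hypothetical supplier master extract
# and apply a few illustrative validation rules. All names and thresholds
# below are assumptions for demonstration, not requirements from the posting.
import pandas as pd

REQUIRED_COLUMNS = ["supplier_id", "supplier_name", "tax_id", "country_code"]
COMPLETENESS_THRESHOLD = 0.98  # assumed minimum share of non-null values per column


def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Compute per-column completeness and cardinality metrics."""
    metrics = pd.DataFrame({
        "completeness": df.notna().mean(),           # share of non-null values
        "distinct_values": df.nunique(dropna=True),  # cardinality per column
    })
    metrics["passes_completeness"] = metrics["completeness"] >= COMPLETENESS_THRESHOLD
    return metrics


def validate(df: pd.DataFrame) -> list[str]:
    """Return a list of violations for a few illustrative rules."""
    issues = []
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        issues.append(f"missing required columns: {missing}")
        return issues
    # Uniqueness: supplier_id is expected to behave like a primary key.
    dupes = int(df["supplier_id"].duplicated().sum())
    if dupes:
        issues.append(f"{dupes} duplicate supplier_id values")
    # Validity: country_code is expected to be a two-letter uppercase code.
    bad_codes = int((~df["country_code"].astype(str).str.fullmatch(r"[A-Z]{2}")).sum())
    if bad_codes:
        issues.append(f"{bad_codes} rows with malformed country_code")
    return issues


if __name__ == "__main__":
    df = pd.read_csv("supplier_master_extract.csv")  # hypothetical extract file
    print(profile(df))
    for issue in validate(df):
        print("RULE VIOLATION:", issue)
```

The profiling output feeds the kind of dashboards and KPI reporting mentioned under Tools and Technology, while the rule violations would typically be routed back to data owners for cleansing before migration.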