CCS IT

63436 - Application ETL Developer Expert - ONSITE - Must be NE Local

CCS IT, Lincoln, Nebraska, United States, 68511


The ETL / Data Integration Developer role will be responsible for developing and implementing ETL processes from a variety of data sources. This includes building high-level conceptual models and logical models and producing high-quality documentation. The ETL Developer will also participate in requirements gathering, collecting data definitions, source-to-target mapping, and defining and implementing data transformation, quality-checking, cleansing, and standardization processes. The role will also be responsible for testing the resulting processes for accuracy and performance.

Although this is an individual contributor role, it sits within a highly collaborative Agile team environment and requires personal flexibility in performing a variety of roles, such as analyst, designer, developer, tester, troubleshooter, performance tuner, and mentor when needed, all in order to ensure our customers' needs are met accurately, promptly, and to the maximum extent possible.

MINIMUM QUALIFICATIONS: Bachelor's degree in computer science or a related field, or related experience. Minimum of five years of experience related to the essential functions of the position. Any equivalent education and/or work experience may be substituted in order to meet the minimum qualifications of the position.

PREFERRED QUALIFICATIONS: Extensive experience in data migration projects utilizing ETL/ELT tools, with strong expertise in database modeling techniques. Proven track record in application database re-engineering, redesign, and data migration from legacy databases across multiple agencies. Skilled in developing complex SQL Server stored procedures, PL/SQL, and ETL frameworks, with a strong grasp of data warehousing and cloud computing concepts. Proficient with a range of tools, including MuleSoft, Qlik, Talend, DataStage, AWS Glue, AWS Data Pipeline, and Azure Data Factory. Hands-on experience with database programming, Azure DevOps, Azure SQL, SQL performance tuning, data modeling, and ETL job scheduling. Successful execution of data warehousing projects across platforms such as Snowflake, Databricks, Redshift, and other cloud-based data platforms.

Knowledge and Abilities:

KNOWLEDGE:

•Writes advanced SQL, including some query tuning.

•Experience identifying data quality issues.

•Experience creating ETL jobs using DataStage, Qlik, Talend, or MuleSoft for data migration.

•Utilize ad-hoc techniques to perform on-the-fly analysis of data in all forms (structured, unstructured, and semi-structured).

•Proficient in designing, implementing, and troubleshooting a production data platform environment.

•Design, build, and implement effective ETL/ELT jobs, both in scheduled batch and in real time (see the illustrative sketch after this list).

•Implement stored procedures and effectively query multiple databases using established security procedures.

•Identify and test for bugs and bottlenecks in the ETL solution.

•Ensure the best possible performance and quality in the ETL packages.

•Provide support and fix issues in the ETL packages.

•Author and execute unit test scripts.

•Experience developing and using APIs for sharing and consuming data.

•Knowledgeable about data infrastructure both on premises and in the cloud, preferably Azure or AWS.

•Experience working in DevOps, Continuous Integration, and Continuous Delivery environments.

•Work within the established and developing policies, standards, procedures, and practices of the State of Nebraska OCIO.
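
As an illustration of the scheduled batch ETL and data-quality work referenced above, the following is a minimal Python sketch. It assumes a hypothetical CSV source extract and a SQL Server staging target reached through pyodbc; the file path, connection string, table name, and quality rules are placeholders for illustration only and do not describe the actual State of Nebraska environment.

    """Minimal batch ETL sketch: extract from a CSV, apply basic quality checks,
    and load cleansed rows into a staging table. All names below are hypothetical."""

    import csv
    from datetime import datetime

    import pyodbc  # assumed driver for the SQL Server target

    SOURCE_FILE = "legacy_extract.csv"                  # hypothetical source extract
    CONN_STR = "DSN=StagingDW;Trusted_Connection=yes"   # placeholder connection string
    TARGET_TABLE = "stg.customer"                       # hypothetical staging table


    def extract(path):
        """Read the source extract into a list of dicts."""
        with open(path, newline="", encoding="utf-8") as f:
            return list(csv.DictReader(f))


    def transform(rows):
        """Standardize fields and reject rows that fail simple quality checks."""
        clean, rejects = [], []
        for row in rows:
            rec = {
                "customer_id": row.get("customer_id", "").strip(),
                "email": row.get("email", "").strip().lower(),
                "loaded_at": datetime.utcnow(),
            }
            # Quality rule: require a key and a plausible email address.
            if rec["customer_id"] and "@" in rec["email"]:
                clean.append(rec)
            else:
                rejects.append(row)
        return clean, rejects


    def load(rows):
        """Bulk insert cleansed rows into the staging table."""
        with pyodbc.connect(CONN_STR) as conn:
            cur = conn.cursor()
            cur.executemany(
                f"INSERT INTO {TARGET_TABLE} (customer_id, email, loaded_at) "
                "VALUES (?, ?, ?)",
                [(r["customer_id"], r["email"], r["loaded_at"]) for r in rows],
            )
            conn.commit()


    if __name__ == "__main__":
        extracted = extract(SOURCE_FILE)
        cleansed, rejected = transform(extracted)
        load(cleansed)
        print(f"loaded {len(cleansed)} rows, rejected {len(rejected)}")

In practice, an equivalent job would typically be built in one of the ETL tools listed above (for example DataStage, Talend, AWS Glue, or Azure Data Factory) and scheduled through the team's orchestration and DevOps tooling rather than run as a standalone script.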