Saxon Global

Cloud Data Engineer

Saxon Global, Madison, WI, United States


Role: Certified Cloud Engineer

End Client: State of WI.

Contract Length: 9+ months

Work Type: LOCAL CANDIDATES IN WI ONLY; CAN WORK 100% REMOTE

Location: Madison, WI.

Rate C2C: $100/hr

Work Auth: All except H-1B

***Please send proof of current WI residence. I will only be accepting candidates who are currently RESIDING IN WI.***

Job Summary:

  • Collaborate with data engineering and development teams to design, develop, test, and maintain robust and scalable ELT/ETL pipelines using SQL scripts, Redshift stored procedures, and other AWS tools and services.
  • Partner with our engineering and data teams to understand business requirements and data integration needs, and translate them into effective data solutions that yield top-quality outcomes.
  • Architect, implement, and manage end-to-end data pipelines, ensuring data accuracy, reliability, quality, performance, and timeliness.
  • Employ AWS DMS and other services for efficient data ingestion from on-premises databases into Redshift.
  • Design and implement ETL processes, encompassing Change Data Capture (CDC) and Slowly Changing Dimension (SCD) logic, to seamlessly integrate data from diverse source systems (a minimal SCD sketch follows this list).
  • Provide expertise in Redshift database optimization, performance tuning, and query optimization.
  • Design and implement efficient orchestration workflows using Airflow, ensuring seamless coordination of complex ETL processes (see the DAG sketch after this list).
  • Integrate Redshift with other AWS services, such as AWS DMS, AWS Glue, AWS Lambda, Amazon S3, Airflow, and more, to build end-to-end data pipelines.
  • Perform data profiling and analysis to troubleshoot data-related issues and build solutions to address them.
  • Proactively identify opportunities to automate tasks and develop reusable frameworks.
  • Work closely with the team to maintain a well-organized, documented repository of code, scripts, and configurations using Git.
  • Provide technical guidance and mentorship to fellow developers, sharing insights into best practices, tips, and techniques for optimizing Redshift-based data solutions.
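
To make the CDC/SCD item above concrete, here is a minimal SCD Type 2 apply step run through the Redshift Data API. Every identifier (cluster, database, user, and the dim_customer/stg_customer_cdc tables and columns) is a hypothetical placeholder rather than anything specified by this posting, and a production pipeline would typically wrap both statements in a Redshift stored procedure for atomicity.

```python
# SCD Type 2 sketch: expire changed current rows, then insert new versions.
# Assumes one staged CDC row per customer_id and that dim_customer generates
# its surrogate key via an IDENTITY column. All names are hypothetical.
import boto3

EXPIRE_SQL = """
UPDATE dim_customer d
SET is_current = FALSE, valid_to = s.change_ts
FROM stg_customer_cdc s
WHERE d.customer_id = s.customer_id
  AND d.is_current
  AND d.address <> s.address;
"""

INSERT_SQL = """
INSERT INTO dim_customer (customer_id, address, valid_from, valid_to, is_current)
SELECT s.customer_id, s.address, s.change_ts, NULL, TRUE
FROM stg_customer_cdc s
WHERE NOT EXISTS (
    SELECT 1 FROM dim_customer d
    WHERE d.customer_id = s.customer_id
      AND d.is_current
      AND d.address = s.address
);
"""

client = boto3.client("redshift-data")
for sql in (EXPIRE_SQL, INSERT_SQL):
    # execute_statement is asynchronous; a real job would poll
    # describe_statement for completion between the two steps.
    resp = client.execute_statement(
        ClusterIdentifier="analytics-cluster",  # hypothetical
        Database="dw",
        DbUser="etl_user",
        Sql=sql,
    )
    print("Submitted statement:", resp["Id"])
```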
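And for the Airflow orchestration item, a minimal DAG sketch, assuming Airflow 2.4+ for the `schedule` parameter. The DAG id, task names, and schedule are hypothetical, and the task bodies are stubs standing in for real COPY/stored-procedure calls.

```python
# Minimal Airflow DAG sketch: run a staging step, then a Redshift load.
# All identifiers (dag_id, task ids) are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_to_redshift(**context):
    """Stub task body: in a real pipeline this would call a Redshift
    stored procedure, e.g. via the Redshift Data API."""
    print(f"Loading partition for {context['ds']} into Redshift")


with DAG(
    dag_id="redshift_daily_elt",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    stage = PythonOperator(
        task_id="stage_from_s3",
        python_callable=lambda: print("COPY staged files from S3"),
    )
    load = PythonOperator(
        task_id="load_to_redshift",
        python_callable=load_to_redshift,
    )
    stage >> load  # run staging before the load
```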
REQUIRED SKILLS:
  • 12+ years of experience with the AWS stack, including:
    • Glue, Spark, Python, and Redshift.
  • 10+ years working with and personally implementing solutions involving the following concepts:
    • CDC (Change Data Capture), SCD (Slowly Changing Dimensions), and surrogate keys.
  • Proficient in SQL and Redshift for data manipulation.
  • Hands-on experience with AWS services like DMS, S3, Glue, Redshift, and Airflow.
  • Strong ETL knowledge and data modeling skills.
  • Skilled in complex ETL scenarios and data integration.
  • Expertise in AWS DMS for on-prem to cloud data migration (see the sketch at the end of this posting).
  • Proficient in Python for Airflow DAGs and operators.
  • Oracle to Redshift script conversion experience.
  • Familiar with Git for version control.
  • Adept at identifying and optimizing Redshift queries.
  • Strong problem-solving skills with data accuracy focus.
  • Collaborative, agile team player with technical communication skills.
  • Proven track record of on-time, high-quality data solutions.
  • Experience in large-scale data environments.
  • Extensive Redshift and AWS experience for scalable solutions.
  • AWS data engineering or database certifications are a plus.
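
As referenced in the AWS DMS requirement above, here is a minimal boto3 sketch of starting a pre-configured DMS replication task (full load plus ongoing CDC) from an on-prem source into Redshift. The task ARN is a placeholder, and the task's source and target endpoints are assumed to already exist.

```python
# Sketch: kick off an existing AWS DMS replication task that copies an
# on-prem database into Redshift. The ARN below is a placeholder.
import boto3

dms = boto3.client("dms")

resp = dms.start_replication_task(
    ReplicationTaskArn="arn:aws:dms:us-east-2:123456789012:task:EXAMPLE",  # hypothetical
    StartReplicationTaskType="start-replication",  # "resume-processing" continues CDC
)
print("Task status:", resp["ReplicationTask"]["Status"])
```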