JobRialto
Data Engineer - AWS, Snowflake/Postgres
JobRialto, West Lake Hills, Texas, United States
Job Summary:
We are seeking a Senior Data Engineer to join our team in Westlake, TX. The Data Engineer will be responsible for building and modernizing the database, business rules, and API layers of our Customer Relationship Management (CRM) platform, enabling seamless customer interactions across our services. This role involves leveraging cloud-native technologies, primarily AWS and Snowflake, to deliver innovative, high-impact data solutions. The Data Engineer will play a critical role in ensuring scalable, resilient, and efficient solutions in line with the client's commitment to exceptional customer service.
Key Responsibilities:
• Build and modernize CRM platform components, including databases, business rules, and API layers.
• Develop, implement, and optimize cloud-based data solutions using AWS services such as Lambda, Glue, and Step Functions.
• Build data pipelines in Snowflake and manage data migrations from on-premises systems to AWS.
• Design, develop, and debug complex SQL statements, PL/SQL packages, and procedures.
• Collaborate on DevOps and CI/CD processes using tools such as Maven, Jenkins, Terraform, and GitHub.
• Manage API-database connections across relational databases like Oracle and PostgreSQL.
• Monitor, validate, and troubleshoot issues in development, testing, and production environments.
• Ensure data quality, governance, and lineage in line with established data management practices.
• Collaborate with cross-functional teams to design end-to-end, scalable data solutions.
Required Qualifications:
• Bachelor's degree in Computer Science or related field.
• Expertise in data modeling, data profiling, data analysis, and data governance.
• Proven experience with AWS services (Lambda, Glue, Step Functions) and database migration to AWS.
• Strong SQL skills with a background in ETL and experience in data pipeline scheduling.
• Experience building data pipelines in Snowflake; experience with Aurora PostgreSQL is a plus.
• Proficiency in DevOps/CI/CD pipelines using tools like Maven, Jenkins, Terraform, and Ansible.
• Familiarity with messaging technologies such as Kafka, Kinesis, SNS, and SQS.
• Experience in managing API connections to relational databases (Oracle, PostgreSQL).
• Excellent written and verbal communication skills.
Preferred Qualifications:
• Cloud-based certifications in AWS architecture, security, or data management.
• Knowledge of open-source technologies for building distributed systems.
• Background in highly scalable systems and distributed data management.
Education:
Bachelor's degree