SRS Consulting Inc
Data Engineer - DBT
SRS Consulting Inc, McLean, Virginia, United States
Strong in Python and Snowflake. Use attached Skill Matrix.

Responsibilities:
- Develop data pipelines for streaming and batch data processing needs to move data in and out of the Snowflake data warehouse
- Collaborate with engineering and DevOps team members to implement, test, deploy, and operate data pipelines and ETL solutions
- Develop scripts to extract, load, and transform data, along with other utility functions
- Hands-on experience with Snowflake, including schema design, query optimization, and data load techniques
- Strong experience with DBT, including model development, testing, and documentation
- Familiarity with ETL/ELT tools and processes
- Troubleshoot issues such as data load problems and transformation or translation problems raised by team members, investigating and providing effective solutions
- Optimize data pipelines, ETL processes, and data integrations for large-scale data analytics use cases
- Ensure data quality, integrity, and governance standards are maintained in data processing workflows

Required Skills and Experience:
- Bachelor's degree in Computer Science, Information Systems, or a related field
- 7 years of experience building and maintaining data pipelines and ETL/ELT processes in data-intensive organizations
- Design, build, and maintain scalable data pipelines using Snowflake and DBT
- Develop and manage ETL processes to ingest data from various sources into Snowflake
- Optimize and tune data pipelines for performance and cost efficiency
- Able to build data integrations and ingestion pipelines for streaming and batch data
- Competence in cloud database systems, including 3 years with Snowflake
- Hands-on experience with data movement using Snowpipe, SnowSQL, etc.
- Strong coding skills with Python and SQL for manipulating and analyzing data
- Knowledge of data governance and data security best practices
- Hands-on experience with cloud platforms such as AWS and Google Cloud

Other Desired Skills:
- Minimum 5 years of designing and implementing operational, production-grade, large-scale data pipelines, ETL/ELT, and data integration solutions
- Hands-on experience with productionized data ingestion and processing pipelines
- Strong understanding of Snowflake internals and integration of Snowflake with other data processing and reporting technologies
- Proficiency with Kafka, AWS S3, SQS, Lambda, Pub/Sub, AWS DMS, and Glue
- Experience working with structured, semi-structured, and unstructured data

Preferred Skills:
- Background in healthcare data, especially patient-centric clinical data and provider data, is a plus
- Familiarity with API security frameworks, token management, and user access control
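
For candidates unfamiliar with the stack, the core workflow described above (bulk-load data into Snowflake, then transform and test it with DBT) can be illustrated with a minimal Python sketch. This is not part of the role's codebase; the connection parameters, warehouse, database, schema, stage, table, and model names below are hypothetical placeholders, and the dbt CLI is assumed to be installed and configured separately.

```python
# Illustrative sketch only: a minimal batch load into Snowflake followed by a dbt run.
import os
import subprocess

import snowflake.connector  # pip install snowflake-connector-python


def load_and_transform() -> None:
    # Connect to Snowflake using credentials supplied via environment variables.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",    # hypothetical warehouse
        database="ANALYTICS",   # hypothetical database
        schema="RAW",           # hypothetical schema
    )
    try:
        cur = conn.cursor()
        # Bulk-load staged files into a raw table; @raw_stage and RAW_EVENTS are placeholders.
        cur.execute(
            """
            COPY INTO RAW_EVENTS
            FROM @raw_stage/events/
            FILE_FORMAT = (TYPE = 'JSON')
            ON_ERROR = 'CONTINUE'
            """
        )
    finally:
        conn.close()

    # Build and test the downstream DBT models that depend on the raw table.
    subprocess.run(["dbt", "run", "--select", "stg_events+"], check=True)
    subprocess.run(["dbt", "test", "--select", "stg_events+"], check=True)


if __name__ == "__main__":
    load_and_transform()
```

In practice, the batch COPY step shown here is often replaced by Snowpipe for continuous ingestion, with DBT handling the downstream modeling, testing, and documentation.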