Compunnel

Data Engineer

Compunnel, Richmond, Virginia, United States, 23214


As a Data Engineer, you will be responsible for developing complex data sources and pipelines into our data platform (Snowflake), along with other data applications (e.g., Azure, Terraform), automation, and innovation.

Primary Responsibilities:
- Create and maintain data pipelines using Azure and Snowflake as primary tools
- Create SQL stored procedures and functions to perform complex transformations
- Understand data requirements and design optimal pipelines to fulfill the use cases
- Create logical and physical data models to ensure data integrity is maintained
- Manage code and create and automate CI/CD pipelines using GitHub and GitHub Actions
- Tune and optimize data processes
- Design and build best-in-class processes to clean and standardize data
- Deploy code to the production environment and troubleshoot production data issues
- Model high-volume datasets to maximize performance for our Business Intelligence and Data Science teams

Qualifications

Required Qualifications:
- Bachelor's degree in Computer Science or a similar field
- 1-4 years of industry experience as a hands-on data engineer
- Excellent communication skills, verbal and written
- Excellent knowledge of SQL
- Excellent knowledge of Azure services such as Blob Storage, Functions, Azure Data Factory, Service Principals, Containers, Key Vault, etc.
- Excellent knowledge of Snowflake: architecture, features, and best practices
- Excellent knowledge of data warehousing and BI solutions
- Excellent knowledge of change data capture (CDC), ETL, ELT, SCD, etc.
- Hands-on experience with the following:
  o Developing data pipelines in Azure and Snowflake
  o Writing complex SQL queries
  o Building ETL/ELT/data pipelines using SCD logic (see the sketch after this list)
  o Query analysis and optimization
- Analytical and problem-solving experience applied to Big Data datasets
- Data warehousing principles, architecture, and their implementation in large environments
- Experience working on projects with agile/scrum methodologies and high-performing teams
- Knowledge of different data modelling techniques such as Star Schema, dimensional models, and Data Vault is an advantage
- Experience in code lifecycle management and repositories such as Git and GitHub
- Exposure to DevOps methodology
- Good understanding of access control and data masking
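To illustrate the SCD logic referenced above, here is a minimal sketch of an SCD Type 2 load written in Snowflake SQL. The table and column names (stg_customer, dim_customer, valid_from, valid_to, is_current) are hypothetical and only stand in for whatever staging and dimension structures the actual pipelines would use.

-- Step 1: expire the current dimension row when the incoming record has changed.
MERGE INTO dim_customer AS d
USING stg_customer AS s
  ON d.customer_id = s.customer_id
 AND d.is_current = TRUE
WHEN MATCHED AND (d.email <> s.email OR d.address <> s.address) THEN
  UPDATE SET is_current = FALSE,
             valid_to   = CURRENT_TIMESTAMP();

-- Step 2: insert a new current row for customers that are new or were just expired.
INSERT INTO dim_customer (customer_id, email, address, valid_from, valid_to, is_current)
SELECT s.customer_id, s.email, s.address, CURRENT_TIMESTAMP(), NULL, TRUE
FROM stg_customer AS s
LEFT JOIN dim_customer AS d
  ON d.customer_id = s.customer_id
 AND d.is_current = TRUE
WHERE d.customer_id IS NULL;

This two-statement pattern preserves history: the old row is closed out with a valid_to timestamp, and the new version becomes the single current row per customer_id.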