Saxon Global
Data Engineer
Saxon Global, Newark, New Jersey, US, 07175
Primary Responsibilities:
Data modeling and data warehousing.
Cloud-based data migration.
Database management (MSSQL, Azure SQL Database, Azure Data Lake Storage, and Azure Synapse).
Leverage and implement shortcuts across workspaces to aggregate data from different sources (see the sketch after this list).
Implement the different types of security for lakehouses and data warehouses, i.e., row-level, column-level, and object-level security (RLS/CLS/OLS).
Bitbucket / Jenkins / Git integration using Azure Repos for source control.
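For illustration only, a minimal PySpark sketch of the shortcut-based aggregation pattern, assuming a Fabric notebook with a default lakehouse attached and a pre-initialized spark session; the shortcut table names (sales_emea, sales_apac) and columns (region, amount) are hypothetical placeholders, not part of this role's actual data model:

# Minimal sketch: aggregate data exposed in this lakehouse through OneLake
# shortcuts that point at Delta tables in other workspaces. Shortcut tables
# surface like ordinary lakehouse tables; all names below are hypothetical.
from pyspark.sql import functions as F

emea = spark.read.table("sales_emea")   # shortcut to a table in workspace A
apac = spark.read.table("sales_apac")   # shortcut to a table in workspace B

# Union the two sources and aggregate by region.
combined = emea.unionByName(apac, allowMissingColumns=True)
summary = combined.groupBy("region").agg(F.sum("amount").alias("total_amount"))

# Persist the aggregate as a Delta table so a Direct Lake semantic model can consume it.
summary.write.mode("overwrite").format("delta").saveAsTable("sales_summary")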
What You Bring:
Required Skills:
10+ years of data engineering experience.
5+ years of experience building data and analytics platforms focused on Azure data and analytics solutions.
5+ years of programming experience in Python/PySpark.
Must have Microsoft Fabric experience and have implemented at least 2 major initiatives.
Built multiple semantic models in Fabric leveraging Direct Lake query mode.
Knowledge of data modeling and data warehousing concepts.
Experience with cloud-based data migration.
Database management experience (MSSQL, Azure SQL Database, Azure Data Lake Storage, and Azure Synapse).
Leveraged and implemented shortcuts across workspaces to aggregate data from different sources.
Implemented the different types of security for lakehouses and data warehouses, i.e., row-level, column-level, and object-level security (RLS/CLS/OLS); see the sketch after this list.
ETL tool experience (e.g., SSIS, Azure Data Factory, Power BI Dataflow).
Bitbucket / Jenkins / Git integration using Azure Repos for source control.
Experience with Azure services such as:
Azure SQL Database
Azure Data Factory
Azure Synapse Analytics
Azure Data Lake
Notebooks
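As a concrete illustration of the RLS item above, a minimal Python sketch that applies a row-level security policy to a warehouse table through its T-SQL endpoint; it assumes pyodbc with ODBC Driver 18, and the server, database, schema, table (dbo.Sales), column (SalesRep), and user names are hypothetical. CLS and OLS would typically be handled in the same way with GRANT/DENY statements.

# Minimal sketch: create a row-level security (RLS) filter on a warehouse table
# via its SQL endpoint. All identifiers below are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=your-warehouse-sql-endpoint.datawarehouse.fabric.microsoft.com;"
    "Database=YourWarehouse;"
    "Authentication=ActiveDirectoryInteractive;"
)
cursor = conn.cursor()

# Ensure a schema exists to hold the predicate function.
cursor.execute("IF SCHEMA_ID('Security') IS NULL EXEC('CREATE SCHEMA Security');")

# Predicate: a row is visible only to its own sales rep or to a designated manager login.
cursor.execute("""
    CREATE FUNCTION Security.fn_rls_predicate(@SalesRep AS nvarchar(128))
        RETURNS TABLE
        WITH SCHEMABINDING
    AS
        RETURN SELECT 1 AS allowed
        WHERE @SalesRep = USER_NAME() OR USER_NAME() = 'Manager';
""")

# Security policy: attach the predicate as a filter on dbo.Sales.
cursor.execute("""
    CREATE SECURITY POLICY Security.SalesFilter
    ADD FILTER PREDICATE Security.fn_rls_predicate(SalesRep) ON dbo.Sales
    WITH (STATE = ON);
""")
conn.commit()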
Preferred Certifications:
MCSE
Azure Data Engineer Associate
Microsoft Fabric Analytics Architect / Engineer Associate
Additional background:
Leadership/soft skills: collaborative team player, self-starter, curious about technology.
Agile experience.