Insight Global

Azure Data Engineering Manager (Hybrid TO Charlotte, NC)

Insight Global, Charlotte, North Carolina, United States, 28245


A client in Charlotte, NC is looking for an Azure Data Engineering Manager to join their team. As the Manager, your primary responsibility will be to lead the design, development, and implementation of data solutions that empower the organization to derive actionable insights from complex datasets. You will guide a team of data engineers (onshore and offshore), foster collaboration with cross-functional teams, and drive initiatives to strengthen our data infrastructure, CI/CD pipelines, and analytics capabilities. Responsibilities are shown below:

- Apply advanced knowledge of data engineering principles, methodologies, and techniques to design and implement data loading and aggregation frameworks across broad areas of the organization.
- Gather and process raw, structured, semi-structured, and unstructured data using batch and real-time data processing frameworks.
- Implement and optimize data solutions in enterprise data warehouses and big data repositories, focusing primarily on movement to the cloud.
- Drive new and enhanced capabilities to Enterprise Data Platform partners to meet the needs of product, engineering, and business teams.
- Build enterprise systems, especially using Databricks, Snowflake, and platforms such as Azure, AWS, and GCP.
- Leverage strong Python, Spark, and SQL programming skills to construct robust pipelines for efficient data processing and analysis.
- Implement CI/CD pipelines for automating build, test, and deployment processes to accelerate the delivery of data solutions.
- Implement data modeling techniques to design and optimize data schemas, ensuring data integrity and performance.
- Drive continuous improvement initiatives to enhance the performance, reliability, and scalability of our data infrastructure.
- Collaborate with data scientists, analysts, and other stakeholders to understand business requirements and translate them into technical solutions.
- Implement best practices for data governance, security, and compliance to ensure the integrity and confidentiality of our data assets.

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com.

To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/

- Prior experience working with an offshore team (India, Europe, etc.)
- Relevant certifications (e.g., Azure, Databricks, Snowflake) would be a plus
- Experience working with Snowflake and/or Microsoft Fabric
- 12+ years of experience in a data engineering role, with expertise in designing and building data pipelines, ETL processes, and data warehouses
- 2+ years of experience working as a manager
- Strong proficiency in SQL, specifically with window functions such as rank, average, and standard deviation
- Python and PySpark programming languages
- Strong experience with the Azure cloud platform
- Experience working in a retail environment
- Extensive experience working with Databricks and Azure Data Factory for data lake and data warehouse solutions
- Experience implementing CI/CD pipelines for automating build, test, and deployment processes
- Hands-on experience with big data technologies (such as Hadoop, Spark, Kafka)