Lorven Technologies
Azure Data Engineer
Lorven Technologies, Dallas, TX, United States
Hi,
Our client is looking for an Azure Data Engineer for a long-term contract project in multiple locations (anywhere in the USA). Below is the detailed requirement.
Position: Azure Data Engineer
Location: Multiple Locations (Anywhere in USA)
Duration: Long-term Contract
Job Description:
Must-Have Qualifications:
- 12+ Years of IT Experience: Proven track record of working in IT with a focus on data engineering, data warehousing, and ETL processes.
- Cloud Technologies: Extensive experience with cloud-based data solutions, particularly on the Azure platform, including ADF, Synapse, and Databricks.
- Design and Documentation: Ability to understand and interpret complex design requirements, source-to-target mapping (STTM), and create detailed technical specification documents.
- Client-Site Flexibility: Willingness and ability to operate from client office locations as required, ensuring seamless collaboration and project delivery.
- Mentorship: Capable of mentoring and guiding junior resources, providing technical support and knowledge transfer as needed.
Responsibilities:
- Lead the design, development, and deployment of scalable data solutions on Azure.
- Collaborate with cross-functional teams to gather requirements and translate them into technical solutions.
- Design and implement robust ETL pipelines using ADF and Databricks, ensuring data quality and integrity.
- Develop and optimize PySpark/Python code for data processing tasks.
- Provide technical leadership and mentorship to junior team members, fostering a collaborative and knowledge-sharing environment.
- Create and maintain comprehensive documentation, including design specs, STTM, and technical specifications.
- Ensure data solutions align with industry best practices and compliance requirements, particularly in the banking domain, where applicable.
Technical Skills:
- Azure: Extensive hands-on experience with Azure Cloud services, including Azure Data Factory (ADF), Synapse Analytics, and related Azure components.
- Databricks: Proficiency in working with Databricks for data processing, transformation, and analytics.
- ADF (Azure Data Factory): Expertise in designing and implementing ETL pipelines using ADF for large-scale data processing.
- PySpark/Python: Advanced programming skills in PySpark and Python for data engineering tasks.
- Data Warehousing: In-depth knowledge of data warehousing concepts, architecture, and implementation, including experience with various data warehouse platforms.