PSI (Proteam Solutions)

Data Engineer

PSI (Proteam Solutions), Philadelphia, Pennsylvania, United States, 19117


Job Description

Position summary:

We are seeking a mid-level Data Engineer with strong expertise in Databricks and Azure technologies. The ideal candidate will be responsible for designing, implementing, and maintaining robust data pipelines, while collaborating effectively within an Agile team environment. This role focuses on delivering scalable data solutions using modern cloud technologies and best practices.

Essential functions and responsibilities:
- Design and implement end-to-end ETL pipelines using Databricks and Azure Synapse Analytics
- Develop and maintain data workflows using Scala in the Databricks environment
- Create and optimize data models for analytical and operational use cases
- Collaborate with cross-functional teams using Azure DevOps for project management and version control
- Write clean, maintainable code following team standards and best practices
- Participate in code reviews and technical design discussions
- Implement and maintain CI/CD pipelines in Azure DevOps
- Work independently while contributing effectively to team objectives
- Break down and execute User Stories within sprint cycles
- Provide production support on a rotating basis, including occasional non-business hours

Required Skills:
- 5+ years of hands-on experience with Databricks, including ETL pipeline development
- 3+ years of experience with Azure Synapse Analytics and Azure cloud services
- Demonstrated experience working in an Azure DevOps environment
- Strong understanding of data modeling and data warehouse concepts
- Experience working with Agile methodologies and executing User Stories
- Proven track record of successful team collaboration on data engineering projects
- Excellent problem-solving and analytical skills
- Strong verbal and written communication abilities

Required Experience:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 3+ years of experience developing data/database systems and integrating with third-party solutions
- 3+ years of Extract, Transform, and Load (ETL) experience
- 2+ years of experience architecting data/database solutions and designing integrations with third-party solutions
- 3+ years of experience working in a hybrid Agile development methodology

Preferred Experience:
- Experience with Python and/or R programming
- Familiarity with AI/ML workflows in Databricks
- Experience with Delta Lake and Spark optimization
- Knowledge of data governance and security best practices
- Experience with Scala
- Cloud certifications (Azure Data Engineer, Databricks, etc.)