Karkidi
Cloud Data Engineer
Karkidi, Chicago, Illinois, United States
Responsibilities:
Design, implement, and optimize end-to-end data pipelines in cloud and on-prem environments.
Utilize Kubernetes for efficient container orchestration and deployment of data applications.
Develop and maintain Python-based scripts and code for data processing and transformation.
Leverage Snowflake for managing and enhancing data warehousing capabilities.
Implement, manage, and troubleshoot data workflows using Apache Airflow.
Conduct thorough data analysis to extract valuable insights and support decision-making processes.
Utilize Azure services for cloud data solutions and integrations.
Requirements:
Bachelor's degree in Computer Science, Data Engineering, or a related field.
Proven experience in cloud data engineering, demonstrating a strong understanding of data architecture and best practices.
Proficient in Python for data manipulation, scripting, and automation.
Hands-on experience with Kubernetes for containerized application deployment.
Expertise in Snowflake for effective data warehousing and analytics.
Proficiency with Apache Airflow for workflow automation and orchestration.
Experience with Azure cloud services for data solutions and integrations.
Solid background in data analysis, utilizing tools and techniques to derive meaningful insights.
Strong problem-solving skills and the ability to troubleshoot complex data issues.
Effective communication skills to collaborate with cross-functional teams.
Proactive in staying abreast of industry trends and advancements in cloud data engineering.
Experience with traditional ETL tools such as IBM DataStage and scheduling tools such as Control-M is a plus.
If you possess these qualifications and are eager to contribute to a dynamic team, please submit your resume outlining your relevant experience.
#J-18808-Ljbffr