Cargill, Incorporated
Sr. Data Platform Engineer, Framework (Hybrid - Midtown, Atlanta, Georgia)
Cargill, Incorporated, Atlanta, Georgia, United States, 30383
Cargill’s size and scale allow us to make a positive impact in the world. Our purpose is to nourish the world in a safe, responsible and sustainable way. We are a family company providing food, ingredients, agricultural solutions and industrial products that are vital for living. We connect farmers with markets so they can prosper. We connect customers with ingredients so they can make meals people love. And we connect families with daily essentials — from eggs to edible oils, salt to skincare, feed to alternative fuel. Our 160,000 colleagues, operating in 70 countries, make essential products that touch billions of lives each day. Join us and reach your higher purpose at Cargill.
Job Purpose and Impact
The Sr. Data Platform Engineer will focus on Cargill's Data Engineering Frameworks, taking the lead in building the data and analytics platform for modern business applications across the company. In this role, you will apply your in-depth knowledge of modern infrastructure and cloud software engineering practices to build, secure and maintain the core capabilities used by our data and application teams to drive business value. You will also coach and mentor junior engineers to deliver highly scalable and resilient systems using infrastructure as code across our data centers and cloud environments.
Key Accountabilities
- Build the platforms, systems and infrastructure using in-depth knowledge of software development and infrastructure as code practices.
- Take the lead to design, develop, test, deploy, support and enhance complex and varied automated infrastructure and platform components.
- Drive large efforts, stories and tasks to completion.
- Participate in the engineering community to maintain and share relevant technical approaches and modern skills, and present code best practices.
- Build prototypes to test new concepts and contribute ideas on reusable frameworks and components to help promote adoption of new technologies.
- Independently handle complex issues with minimal supervision, escalating only the most complex issues to appropriate staff.
- Other duties as assigned.
Qualifications
Minimum Qualifications
- Bachelor's degree in a related field or equivalent experience.
- Minimum of four years of related work experience.

Preferred Qualifications
- Experience with CI/CD and other automation for cloud deployments.
- Familiarity with infrastructure as code.
- Cloud infrastructure management and use of serverless technologies (AWS preferred).
- Experience creating technical documentation.
- Experience with networking (firewalls, routing, etc.) and disaster recovery methods such as high availability and scalability.
- Programming knowledge in languages such as SQL, Python, Java or equivalent.
- Experience with transformation and modeling tools, including SQL-based transformation, orchestration and quality frameworks such as dbt, Apache NiFi, Talend, AWS Glue, Airflow, Dagster, Great Expectations, Oozie and others.
- Experience with cloud-native technologies such as Kubernetes and Docker.
- Proven understanding of modern data architectures and concepts such as cloud services (AWS, Azure, GCP), real-time data distribution and processing (Kafka, Flink), and modern data warehouse tools (Snowflake, Databricks).
- Knowledge of secure coding practices, including secrets management and vulnerability remediation.
- Proven knowledge of Linux/Unix operating systems.
- Ability to perform technical deep-dives into code, networking, operating systems and compute infrastructure.
- Experience with financial operations as related to the platform.
- Anticipates and adopts innovations in business-building digital and technology applications.
- Experience growing team capabilities through role modeling and mentoring.
- Plans and prioritizes work to meet commitments aligned with organizational goals.
- Solid interpersonal skills and a desire to improve the developer experience.
- Ability to work effectively as part of a team, group and culture.
- Ability to navigate ambiguity and work in agile ways.
- Ability to make well-supported tradeoffs in complex situations.
Equal Opportunity Employer, including Disability/Vet.