Saxon Global
Sr. Data Engineer
Saxon Global, Raleigh, North Carolina, United States, 27601
• THIS IS AN ONSITE POSITION IN DOWNTOWN CHICAGO. THE CANDIDATE IS EXPECTED TO BE ONSITE 3-4 DAYS A WEEK.
Top 5 Skill sets:
• DevOps
• AWS Cloud
• Terraform
• Python
• CI/CD pipelines

Nice to have skills or certifications:
• Blue-Green deployments
• Kubernetes
• Ansible Playbooks

REQUIRED
• Bachelor's degree in a quantitative field (statistics, software engineering, business analytics, information systems, aviation management, or related degree)
• 5+ years of experience in a data engineering or ETL development role
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
• Strong analytic skills for working with structured, semi-structured, and unstructured datasets
• Experience with BigQuery, SQL Server, etc.
• Experience with AWS cloud services: Redshift, S3, Athena, etc.
• Experience with SQL and various database interface tools: SSMS, Oracle SQL Developer, etc.
• Passionate about solving problems through data and analytics and creating data products, including data models
• Strong initiative to take ownership of data-focused projects, get involved in the details of validation and testing, and provide a business user perspective to their work
• Ability to communicate complex quantitative concepts in a clear, precise, and actionable manner
• Proven proficiency with Microsoft Excel and PowerPoint
• Strong problem-solving skills, using data to tackle problems
• Outstanding writing, communication, and presentation skills

PREFERRED
• Master's degree
• Experience with Quantum Metrics and Akamai
• Experience with languages: Python, R, etc.
• Strong experience with continuous integration & delivery using Agile methodologies
• Data engineering experience in the transportation/airline industry
• Strong problem-solving skills

OVERVIEW/SUMMARY:
• The Product Analytics team is on a transformational journey to unlock the full potential of enterprise data, build a dynamic, diverse, and inclusive culture, and develop a modern cloud-based data lake architecture to scale our applications and drive growth using data and machine learning. Our objective is to enable the enterprise to unleash the potential of data through innovation and agile thinking, and to execute an effective data strategy that transforms business processes, rapidly accelerates time to market, and enables insightful decision making.
JOB OVERVIEW AND RESPONSIBILITIES:
In this role you will partner with various teams to define and execute data acquisition, storage, transformation, and processing, and to make data actionable for operational and analytics initiatives that create sustainable revenue and share growth. This role requires expertise in United's data sources and technology, business intuition, and a working knowledge of data transformation and analytical tools.
• Support large-scale data pipelines in a distributed and scalable environment
• Enable and optimize the production AWS environment for data infrastructure and frameworks
• Expertise in creating Terraform modules to automate deployments
• Knowledge of Databricks and data lake technologies
• Partner with development teams and other department leaders/stakeholders to provide cutting-edge technical solutions that enable business capabilities
• Participate in and lead the design and development of innovative batch and streaming data applications using AWS technologies
• Provide the team with technical direction and the approach to be taken, and guide them in resolving queries/issues
• AWS Certification
• Knowledge: Python, Bash scripting, PySpark, AWS services (Airflow, Glue, Lambda, others), Terraform, Databricks
• Skills: thorough troubleshooter, hands-on AWS technology leader, people person with the ability to see an undertaking through to completion
• Abilities: solving problems under pressure and in constrained scenarios, leadership, sound judgment
• Must be fluent in English (written and spoken)
• Coordinate and guide cross-functional projects that involve team members across all areas of the enterprise, vendors, external agencies, and partners
• Ability to manage multiple short- and long-term deliverables in a busy, demanding environment while staying flexible to dynamic needs and priority levels
• Manage agile development and delivery by collaborating with the project manager, product owner, and development leads