Saxon Global
Sr Data Engineer
Saxon Global, McLean, Virginia, US, 22107
Sr Data Engineer | VA, TX, CA, NY, IL, GA |
Freddie Mac is building a culture based on modern software delivery practices to support the secondary mortgage market through rapid technology and best-practice innovation.
DATA ENGINEER
• Must have 4-5 years of experience.
• Candidates must be local or willing to relocate at their own expense.
• Option to convert to FTE after 6-8 months with no fee.
• Provide one loaded hourly bill rate for each position.
Required Technical Skills:
• Minimum 3 years of experience developing Big Data ETL pipelines using Spark, Python, Hive, Hortonworks Data Platform (HDP), and Unix shell scripting.
• At least 2 years of experience working with business analysts, testing engineers, and end users (modelers, data scientists) on data projects.
• Ability to drive automation of data pipelines using Autosys, Unix shell scripting, or Airflow and Kubernetes.
• Ability to identify process improvement opportunities and implement solutions.
• Strong Unix experience with shell scripting on Solaris and Red Hat Linux.
• Demonstrated ability to manage competing demands, prioritize work, and manage customer expectations.
• Prior experience in advanced analytics using AI/ML technologies and good familiarity with modeling techniques is preferred.
• Strong verbal and written communication skills.
Responsibilities
• Work with business analysts, testing engineers, end users, and data scientists.
• Team player attitude; ability to interact with all levels of an organization in a professional, diplomatic, and tactful manner.
• Participate in meetings with technical staff to discuss issues in plans and strategies.
• Knowledge of Big Data platforms, Snowflake, and Kubernetes.
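To illustrate the extract-transform-load pattern the skills list above refers to, here is a minimal sketch in plain Python using only the standard library. This is an illustration of the pipeline shape only, not the actual stack named in the posting (a production job there would run on Spark/Hive over HDP), and the column names and filter rule are invented for the example:

```python
import csv
import io

def extract(raw: str) -> list[dict]:
    """Extract stage: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform stage: keep only funded loans and cast the amount to int.
    (The 'funded' filter is a made-up business rule for illustration.)"""
    return [
        {"loan_id": r["loan_id"], "amount": int(r["amount"])}
        for r in rows
        if r["status"] == "funded"
    ]

def load(rows: list[dict]) -> str:
    """Load stage: serialize the cleaned rows back out as CSV."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["loan_id", "amount"])
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

raw = "loan_id,status,amount\nA1,funded,1000\nA2,rejected,500\nA3,funded,2500\n"
cleaned = transform(extract(raw))
print(load(cleaned))
```

In a Spark version each stage would map onto DataFrame reads, transformations, and writes, and a scheduler such as Airflow or Autosys would trigger the job on a cadence.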