Saxon Global

Remote Data Cloud Engineer Job at Saxon Global in Des Moines

Saxon Global, Des Moines, IA, United States


Apex Systems is looking for a Remote Data Engineer. This is a contract role through the end of the year, with potential to extend.

Description: The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure that an optimal data delivery architecture is consistent across ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even redesigning our company's data architecture to support our next generation of products and data initiatives.

Qualifications for Data Engineer

Advanced SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of database systems.

Experience building and optimizing 'big data' data pipelines, architectures and data sets.

Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.

Strong analytic skills related to working with unstructured datasets.

Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management.

Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores.

They should also have experience using the following software/tools:

Experience with relational SQL and NoSQL databases: RDS, DynamoDB.

Experience with data pipeline and workflow management tools: Glue ETL, Step Functions, Airflow, etc.

Experience with AWS cloud compute services: EC2, Glue, Athena, QuickSight, Lambda, Redshift.

Experience with AWS cloud database/storage services: RDS, S3, Lake Formation.

Experience with object-oriented/object-functional scripting languages: Python, Java.
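To give a concrete sense of the day-to-day data wrangling this role involves, here is a minimal sketch of a cleansing step such a pipeline might perform, using only the Python standard library. The `id`/`amount` record layout is a hypothetical example, not an actual schema from this project:

```python
import csv
import io

def transform(csv_text):
    """Parse raw CSV, drop malformed rows, and emit clean records.

    The id/amount layout is a hypothetical example schema.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    records = []
    for row in reader:
        try:
            amount = float(row["amount"])
        except (KeyError, TypeError, ValueError):
            continue  # skip rows with a missing or non-numeric amount
        records.append({"id": row["id"], "amount": amount})
    return records

raw = "id,amount\n1,19.99\n2,oops\n3,5.00\n"
print(transform(raw))  # the row with amount "oops" is dropped
```

In practice, logic like this would typically run inside a Glue ETL job, a Lambda function, or an Airflow task rather than as a standalone script.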

Required Skills : Ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. Experience with relational SQL and NoSQL databases: RDS, DynamoDB. Experience with data pipeline and workflow management tools: Glue ETL, Step Functions, Airflow, etc. Experience with AWS cloud compute services: EC2, Glue, Athena, QuickSight, Lambda, Redshift. Experience with AWS cloud database/storage services: RDS, S3.
Basic Qualification :
Additional Skills :
Background Check :Yes
Notes :
Selling points for candidate :
Project Verification Info :
Candidate must be your W2 Employee :No
Exclusive to Apex :No
Face to face interview required :No
Candidate must be local :No
Candidate must be authorized to work without sponsorship :No
Interview times set :Yes
Type of project :Development/Engineering
Master Job Title :Data Analyst
Branch Code :Des Moines