Saxon Global

Data Modeler

Saxon Global, Philadelphia, Pennsylvania, United States, 19117


Client: CHS

Location: Inver Grove Heights, MN

Rate: Open

Duration: 1 year+

This role is on the Common Schema Lakehouse Project. The client needs a data modeler to build out a Data Vault 2.0 raw layer, ideally someone who has used Erwin to do it. Snowflake experience is a plus, but Data Vault 2.0 experience is required. They are moving all of the data from their legacy ERP systems into a warehouse built inside their data lake, which is why the project is called a Lakehouse. The modeler must understand how to incorporate different domain areas and build out a raw layer so that data marts can be built on top of it and downstream applications can pull from this data set (a brief illustrative sketch of the raw-layer pattern appears after the job description below). Here is a generic job description, but someone with Data Vault 2.0 expertise is what we are targeting:

Basic Qualifications (Required):

• Bachelor's degree in Computer Science, MIS, or a related field

• Experience in Big Data Engineering or Business Intelligence

o Data modeling with Data Vault 2.0 and Erwin

o ETL/ELT and SQL development

• 3 or more years of experience in Software Engineering

o Development lifecycle

o Test-driven development

• 1 or more years of experience with cloud big data technologies

o CDC tools (HVR, Qlik Replicate)

o AWS native tools (Glue, DMS, S3, Athena), Snowflake, Cloudera CDP, or Databricks

o Big data engines such as Hive and Spark

Preferred Qualifications (Desired):

• In-depth experience with creating data models

• Experience with Cloudera Data Platform, including Impala

• Master's degree in Information Systems, Computer Science, or a related field

• 5 or more years of experience with C/C++

• NoSQL background, such as MongoDB or HBase

• Experience with search engine tools such as Lucene and Elasticsearch

• Experience with IDEs such as Microsoft Visual Studio Code, Eclipse, and/or PyCharm

• Experience with version control using Git

• Experience with Agile methodology using Scrum

• Knowledge of data science using Python and R
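For candidates less familiar with the pattern, here is a minimal Python sketch of the Data Vault 2.0 raw-layer load this role centers on: hubs carry hashed business keys, satellites carry descriptive attributes with a hashdiff for change detection, and every row records a load timestamp and record source. All table, column, and source-system names below are hypothetical, not CHS's actual model; on the project the modeling itself would be done in Erwin and the loads implemented with ETL/ELT tooling against the lakehouse.

```python
import hashlib
from datetime import datetime, timezone

def hash_key(*business_keys: str) -> str:
    """Data Vault 2.0-style hash key: trim and uppercase each business
    key, join with a delimiter, then hash the result."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

def hash_diff(attributes: dict) -> str:
    """Hashdiff over a satellite's descriptive attributes, used to
    detect change between loads without column-by-column compares."""
    payload = "||".join(str(attributes[k]).strip().upper() for k in sorted(attributes))
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

# One row as it might arrive from a legacy ERP extract (illustrative data).
erp_row = {"customer_id": "C-1001", "name": "Acme Co-op", "region": "MN"}
load_ts = datetime.now(timezone.utc)
record_source = "ERP_LEGACY_A"  # assumed source-system code

# Hub row: business key, its hash key, and load metadata only.
hub_customer = {
    "customer_hk": hash_key(erp_row["customer_id"]),
    "customer_id": erp_row["customer_id"],
    "load_dts": load_ts,
    "record_source": record_source,
}

# Satellite row: descriptive attributes hang off the hub's hash key.
descriptive = {"name": erp_row["name"], "region": erp_row["region"]}
sat_customer = {
    "customer_hk": hub_customer["customer_hk"],
    "hash_diff": hash_diff(descriptive),
    "load_dts": load_ts,
    "record_source": record_source,
    **descriptive,
}

print(hub_customer)
print(sat_customer)
```

Because hubs hold only immutable business keys while satellites are insert-only and change-tracked via the hashdiff, new domain areas from other ERP systems can be folded into the same raw layer without remodeling what is already there, which is what lets data marts build on top of it downstream.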