Saxon Global

Azure Data Engineer

Saxon Global, Minneapolis, Minnesota, United States, 55400


Fully Remote out of MN.

CST working hours

GC or USC only.

This team is very sensitive to any questionable activity. No headsets, no video or audio lag. Candidates must be able to perform live coding with a shared screen.

If the hiring manager senses any deviation, the candidate will be dismissed immediately and the vendor will be blacklisted by the client.

ID check and Interview Prep will take place before a scheduled interview.

Please only send your best.

The client name will be shared once the candidate is selected for consideration.

Interview Process:

Phone and video, with two technical rounds

Description:

Summary of the project/initiative:

The candidate will work closely with IT team members to build a robust marketing technology platform that enables real-time campaigns and provides data for the marketing and data science teams to make quick decisions.

Describe the team:

Break-down of the team:

We are currently planning to have two teams with 4 developers [including QE], an SM, and a PO.

Top 5-10 responsibilities:
• Should have prior experience managing data lake and data warehouse type applications
• Should have prior experience managing a big data platform to ingest/transform time-series and batch data for real-time access (see the sketch after this list)
• 1 to 3 years of experience with Snowflake/Synapse, Databricks, Kafka, Event Hub, Spark
• Should have experience with data security and governance guidelines
• Should have experience managing such applications on a cloud platform (preferably Azure) and understand native cloud concepts such as Event Hub, ADLS, and Azure Data Factory
• Should have experience managing various BI tools for reporting, such as Tableau
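To illustrate the kind of real-time ingestion work described above, here is a minimal sketch, assuming PySpark Structured Streaming on Databricks reading from an Event Hubs Kafka-compatible endpoint and landing data in ADLS as Delta. This is not taken from the client's description; all topic, namespace, schema, and storage names are hypothetical placeholders, and authentication options (SASL) are omitted.

# Minimal sketch; names and schema are hypothetical, auth options omitted.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType, DoubleType

spark = SparkSession.builder.appName("campaign-event-ingest").getOrCreate()

# Assumed schema for a marketing campaign event (placeholder, not from the job description)
event_schema = StructType([
    StructField("campaign_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
    StructField("value", DoubleType()),
])

# Read the time-series event stream (Event Hubs exposes a Kafka-compatible endpoint on port 9093)
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")  # placeholder
    .option("subscribe", "campaign-events")                                         # placeholder topic
    .load()
)

# Parse the JSON payload into columns
events = raw.select(from_json(col("value").cast("string"), event_schema).alias("e")).select("e.*")

# Land the stream in ADLS as a Delta table for downstream marketing / data science use
(
    events.writeStream.format("delta")
    .option("checkpointLocation", "abfss://checkpoints@<storage>.dfs.core.windows.net/campaign-events")
    .start("abfss://lake@<storage>.dfs.core.windows.net/bronze/campaign_events")
)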

Ideal candidate background:

ETL, data warehousing, and Python experience

Skills/attributes required:
• Undergraduate degree or equivalent experience
• 5+ years of IT experience
• 1 to 3 years of experience with Snowflake/Synapse, Databricks, Kafka, Event Hub, Spark [required]

Skills/attributes preferred:

1 to 3 or more years of experience with Snowflake/Synapse, Databricks, Kafka, Event Hub, and Spark.