Technogen International Company

AWS Data Engineer

Location: Princeton, NJ

Duration: Long term

Responsibilities

Building, optimizing & supporting data pipelines in Python, SQL & JavaScript on Snowflake & AWS (a minimal pipeline sketch follows this list)

Building & maintaining frameworks using stored procedures & Python to simplify design patterns

Building Data APIs (Web/REST APIs)

Developing, building & testing object-oriented, enterprise-standard applications, systems & frameworks

Setting Python standards & working with the infrastructure team to set up infrastructure

Working with the DevOps team to standardize deployment of Python applications

Resolving overnight batch issues escalated by the offshore team

Performing code reviews for other team members
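
As a rough illustration of the pipeline work described above, the sketch below loads a CSV file from S3 into a Snowflake table using boto3, pandas & the Snowflake Python connector. The bucket, table & credential values are placeholders, not details from this posting.

    # Minimal S3 -> Snowflake load step (illustrative only; all names are placeholders)
    import boto3
    import pandas as pd
    import snowflake.connector
    from snowflake.connector.pandas_tools import write_pandas

    def load_file(bucket: str, key: str) -> int:
        # Pull the raw file from S3 and parse it with pandas
        s3 = boto3.client("s3")
        obj = s3.get_object(Bucket=bucket, Key=key)
        df = pd.read_csv(obj["Body"])

        # Light transformation before loading: normalize column names
        df.columns = [c.strip().upper() for c in df.columns]

        # Write the frame into Snowflake; credentials belong in a secrets manager,
        # and the target table is assumed to exist (or pass auto_create_table=True)
        conn = snowflake.connector.connect(
            account="my_account",   # placeholder
            user="etl_user",        # placeholder
            password="***",
            warehouse="ETL_WH",
            database="ANALYTICS",
            schema="STAGING",
        )
        try:
            _, _, nrows, _ = write_pandas(conn, df, table_name="RAW_LOAD")
        finally:
            conn.close()
        return nrows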

Requirements

Good communication & problem-solving abilities

Ability to work as an individual contributor & to collaborate with a global team

Strong experience with data warehousing: OLTP, OLAP, dimensions, facts & data modeling

Expertise implementing Python design patterns (creational, structural & behavioral)

Expertise building data applications in Python, including reading, transforming & writing data sets

Strong experience using boto3, pandas, NumPy, PyArrow, Requests, FastAPI, asyncio, aiohttp, pytest, OAuth 2.0, multithreading, multiprocessing, the Snowflake Python connector & Snowpark

Experience building data APIs (Web/REST APIs) in Python (see the API sketch after this list)

Experience with Snowflake, including SQL, Pipes, Streams, Tasks, Time Travel, Data Sharing & query optimization

Experience with scripting in Snowflake, including SQL stored procedures, JavaScript stored procedures & Python UDFs

Understanding of Snowflake internals & experience integrating it with reporting & UI applications

Strong experience with AWS services such as S3, Athena, Glue, Lambda, SNS & SQS

Experience packaging & distributing applications & libraries, e.g. wheel packages, zipapp, PyInstaller & Docker containerization

Experience working in financial services, preferably at buy-side firms
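
For the data-API requirement above, a minimal read-only FastAPI endpoint backed by Snowflake might look like the sketch below; the route, table & connection settings are hypothetical.

    # Illustrative read-only data API (FastAPI + Snowflake Python connector)
    from fastapi import FastAPI, HTTPException
    import snowflake.connector
    from snowflake.connector import DictCursor

    app = FastAPI(title="Positions API")

    def get_connection():
        # In practice credentials come from a vault or environment variables
        return snowflake.connector.connect(
            account="my_account", user="api_user", password="***",
            warehouse="API_WH", database="ANALYTICS", schema="CORE",
        )

    @app.get("/positions/{account_id}")
    def positions(account_id: str):
        conn = get_connection()
        try:
            cur = conn.cursor(DictCursor)
            cur.execute(
                "SELECT symbol, quantity, market_value "
                "FROM positions WHERE account_id = %s",
                (account_id,),
            )
            rows = cur.fetchall()
        finally:
            conn.close()
        if not rows:
            raise HTTPException(status_code=404, detail="account not found")
        return rows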

Good to have

Familiarity with building reports using tools such as Tableau

High-level understanding of ETL tools such as Informatica

Familiarity with batch schedulers such as Active Batch

Experience with real-time data streaming using message queues

Python libraries such as Kivy, Dash & PyTorch, plus the Poetry dependency-management tool

Experience building UIs in Python with libraries such as Matplotlib, Plotly & Streamlit (see the sketch at the end of this list)

DevOps experience, especially using Azure DevOps to deploy Python applications

Experience with scripting languages such as PowerShell & Unix shell
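
To illustrate the UI-library item above, a one-file Streamlit page charting a pandas frame with Plotly could look like this sketch; the data & column names are invented.

    # Tiny Streamlit + Plotly example (run with: streamlit run app.py)
    import pandas as pd
    import plotly.express as px
    import streamlit as st

    st.title("Daily batch volumes")

    # Placeholder data; in practice this would come from Snowflake or an API
    df = pd.DataFrame({
        "run_date": pd.date_range("2024-01-01", periods=10, freq="D"),
        "rows_loaded": [120, 135, 128, 150, 149, 160, 155, 170, 165, 180],
    })

    fig = px.line(df, x="run_date", y="rows_loaded", title="Rows loaded per run")
    st.plotly_chart(fig)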