TechnoGen

Data Engineer

TechnoGen, Indianapolis, Indiana, US, 46262


Position: Data Engineer
Duration: 12 Months
Location: Remote
Interview mode: Webcam only

Essential Duties/Responsibilities:
- Work closely with solution leads, project managers, data architects, data scientists, and data analysts on solution design, architecture, and implementation
- Perform extraction, transformation, and loading of data from a wide variety of data sources using various data engineering tools and methods
- Design and implement data solutions for operational and secure integration across systems
- Assist, with guidance and oversight from data architects, in creating database models, architecture designs, and documentation
- Conduct research and development and contribute to the long-term positioning of emerging technologies related to data sourcing, cleansing, and integration
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code
- Improve operations by conducting systems analysis and recommending changes in policies and procedures
- Participate in requirements gathering and solution reviews, and explain technical complexities and business benefits in layperson terms

Job Requirements:
- Bachelor's degree in Computer Science, Engineering, or a similar field is required
- 3 years of data engineering, software engineering, or similar experience
- 2 years of hands-on industry experience working with SQL on various relational database platforms (Microsoft, Oracle, Hana, Postgres, etc.)
- 2 years of hands-on industry experience working with enterprise ETL/DW tools such as Azure Data Factory, Redshift, Informatica, etc.
- Hands-on experience with all aspects of data engineering design and implementation, including data sourcing, data modeling of warehouses/marts/repositories, data integration/transformation/ETL, APIs, reporting, business intelligence, and analytics
- Hands-on experience with modern programming languages such as Python, C#, JavaScript, etc.
- Hands-on experience with cloud platforms such as AWS, Azure, GCP, etc.
- Experience with Docker for containerization and Kubernetes for orchestration is a plus
- Collaborative team player who is detail oriented and focused on solution quality and execution
- Progressive mindset, particularly around deployment models and emerging technologies

Skills:
- Hands-on experience with data engineering design and implementation
- Experience with data modeling design and implementation
- Hands-on industry experience programming in SQL on relational database platforms (T-SQL and PL/SQL preferred)
- Hands-on industry experience working with enterprise ETL/ELT tools (Azure Data Factory and Databricks preferred)
- Hands-on experience with modern programming languages such as Python, C#, JavaScript, etc. (Python preferred)
- Hands-on experience with Azure, AWS, and/or GCP cloud platforms (Azure preferred)
- Bachelor's degree in Computer Science, Data Science, Software Engineering, Information Technology, or a similar field
- Progressive mindset, particularly around deployment models and emerging technologies
- Collaborative team player who is detail oriented and focused on solution quality and execution
- Experience with Docker for containerization and Kubernetes for orchestration