Data Engineer / Full Stack Developer
Infojini Consulting, New York, NY
TASKS:
• Create an optimal data pipeline architecture that is coherent and scalable, based on best
practices for integrating data into a consolidated repository.
• Perform the technical design, development, and component testing.
• Build data and analytics tools, services, and products that manage data and metadata or utilize
data pipelines to provide actionable insights into customer engagement and experience,
operational efficiency, and other key business performance metrics.
• Build the infrastructure required for optimal extraction, transformation, and loading (ETL) of
data from a wide variety of data sources using SQL, cloud, and 'big data' technologies.
• Develop ETLs to move data securely from source to target systems.
• Develop new APIs or build against existing ones for data access, or land data as output for
further downstream consumption in the appropriate target data store.
MANDATORY SKILLS/EXPERIENCE
Note: Candidates who do not have the mandatory skills will not be considered.
• 8+ years of experience in Java application development and the implementation of large
technology projects.
• 5+ years of experience in writing SQL.
• Experience in front-end and back-end application development.
• Proven experience as a Full Stack Developer with expertise in Java, Spring Boot, Python,
JavaScript, and related frameworks (e.g., React, Angular, Node.js, Vue).
• Experience with RDBMS (Oracle, MySQL, PostgreSQL).
• Understanding of the basic design principles behind a scalable application.
• Experience with service-oriented architecture and RESTful web services.
• Knowledge of JSON, XML, XSD, WSDL, JDBC, MQ, and SOAP concepts.
• Strong problem-solving skills and enthusiasm for learning new technologies.
• Experience with an agile, iterative development process and version control tools such as Git,
GitHub, or Subversion.
• Strong understanding of data engineering concepts and experience with tools such as SQL,
NoSQL databases, and data processing frameworks (e.g., Apache Spark).
• Experience with cloud platforms (e.g., AWS, Azure, GCP) and containerization technologies
(e.g., Docker, Kubernetes) is a plus.
• Experience working on large datasets, data models and full-cycle data pipeline development.
• Experience working with Amazon Web Services or Microsoft Azure cloud computing platform
and services.
• Experience developing cloud-ready applications.
• Excellent problem-solving and communication skills.
• Ability to work effectively in a fast-paced and collaborative environment.
DESIRABLE SKILLS/EXPERIENCE:
• ETL development experience with tools such as Informatica PowerCenter, IBM DataStage, and
B2B Data Transformation.
• Experience using Oracle 12c, AWS RDS/MySQL, and/or a database appliance.
• Knowledge of IBM Master Data Management (MDM) implementations.
• Knowledge of metadata-driven enterprise reporting platforms.
• Prior experience working on complex data integration projects.