Cognitio Corp
ETL Java Developer
Cognitio Corp, Reston, Virginia, United States, 22090
Job Description
We have an IMMEDIATE NEED for an ETL Java Developer SME to provide Agile DevOps support to mission-critical systems. As an ETL Java Developer on this program, you will have the opportunity to build strong systems, software, and cloud environments and provide operations and maintenance for critical systems. The candidate will provide technical expertise and support in the design, development, implementation, and testing of customer tools and applications in support of extracting, transforming, and loading data into an enterprise Data Lake. Working within a DevOps framework, the candidate will participate in and/or direct major deliverables of projects through all aspects of the software development lifecycle, including scope and work estimation, architecture and design, coding, and unit testing. This role requires the development of custom code and scripts to quickly extract, triage, and exploit data across domains and data stores.
Primary Responsibilities:
- Design and implement a large-scale ingest system in a big data environment; read, analyze, and digest what the enterprise needs to accomplish with its data and design the best possible ETL process to support those objectives (a minimal sketch of such an ingest step follows this list)
- Recommend methodologies to optimize the visualization, organization, storage, and availability of large-scale data in support of enterprise requirements
- Participate in software programming initiatives to support innovation and enhancement, using HTML, CSS, JavaScript, Java, Python, Spring Boot, and Hibernate
- Develop and direct software system validation and testing methods using JUnit and Katalon
- Develop and integrate custom-developed software solutions to leverage automated deployment technologies
- Develop, prototype, and deploy solutions within commercial cloud platforms leveraging infrastructure platform services
- Analyze (through proof of concept, performance, and end-to-end testing) and effectively coordinate infrastructure needs driven by developed software to meet customer mission needs
- Support the Agile software development lifecycle following program SAFe practices
- Use industry-leading DevOps tools such as GitHub, Jenkins, and Unix bash scripting
- Document and perform systems software development, including deployment of build artifacts across different environments leveraging GitFlow constructs
- Leverage the Atlassian tool suite, including JIRA and Confluence, to track activities
- Apply and identify best practices and standard operating procedures
- Coordinate closely with team members, Product Owners, and Scrum Masters to ensure User Story alignment and implementation to customer use cases
- Communicate key project data to team members and build team cohesion and effectiveness; hold meetings with the PMO and enterprise stakeholders
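To illustrate the kind of ingest work described in the first responsibility above, here is a minimal sketch of a single ETL step in Java: consume raw records from a Kafka topic, apply a trivial transformation, and batch-load the results into a MySQL table over JDBC. The topic name, table, schema, and connection settings are illustrative assumptions, not details of this program.

```java
// Illustrative ETL ingest step (assumed topic, table, and credentials):
// extract records from Kafka, normalize them, and batch-load into MySQL.
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class IngestJob {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "ingest-job");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             Connection db = DriverManager.getConnection(
                     "jdbc:mysql://localhost:3306/datalake", "etl_user", "etl_password");
             PreparedStatement insert = db.prepareStatement(
                     "INSERT INTO staged_records (record_key, payload) VALUES (?, ?)")) {

            consumer.subscribe(List.of("raw-events"));   // extract: pull raw records

            while (true) {
                ConsumerRecords<String, String> batch = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> msg : batch) {
                    // transform: normalize whitespace and case before loading
                    String cleaned = msg.value() == null ? "" : msg.value().trim().toLowerCase();
                    insert.setString(1, msg.key());
                    insert.setString(2, cleaned);
                    insert.addBatch();
                }
                if (!batch.isEmpty()) {
                    insert.executeBatch();   // load: write the transformed batch to MySQL
                    consumer.commitSync();   // commit offsets only after a successful load
                }
            }
        }
    }
}
```

Committing offsets only after the batch insert succeeds keeps this sketch at-least-once: a failed load is retried on the next poll rather than silently dropped.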
Requirements
Basic Qualifications:
- Candidate must have an active TS/SCI with a Polygraph
- Candidate must have a Master's degree with 15+ years of prior relevant experience or a Doctorate with 13+ years of prior relevant experience
- Demonstrated experience performing ETL activities including, but not limited to, parser development and deployment; data flow management; implementing data lifecycle policies; troubleshooting data access issues; and developing data models (a minimal parser test sketch follows this list)
- Experience with data modeling
- Extensive experience with relational databases, such as MySQL, that utilize SQL queries
- Extensive experience using Java for data processing, manipulation, or querying (SQL or NoSQL)
- ETL/data integration experience using Spring, NiFi, Kafka, and Elasticsearch
- Experience with development in commercial cloud platforms (e.g., AWS, Google Cloud, Azure)
- Experience with development leveraging cloud data services (e.g., S3, RDS, EFS)
- Excellent communication skills (written and verbal)
- Experience leading development scrum teams
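As a hedged illustration of the parser development and JUnit-based validation referenced above, the sketch below pairs a trivial delimited-record parser with a JUnit 5 test. The record format, class names, and assertions are hypothetical assumptions for illustration only.

```java
// Hypothetical parser and JUnit 5 test; the "key=value;key=value" record
// format and all names are illustrative, not artifacts of this program.
import org.junit.jupiter.api.Test;

import java.util.LinkedHashMap;
import java.util.Map;

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

class DelimitedRecordParser {
    /** Parses "key=value;key=value" records into a map, rejecting malformed fields. */
    static Map<String, String> parse(String line) {
        Map<String, String> fields = new LinkedHashMap<>();
        for (String pair : line.split(";")) {
            String[] kv = pair.split("=", 2);
            if (kv.length != 2 || kv[0].isBlank()) {
                throw new IllegalArgumentException("Malformed field: " + pair);
            }
            fields.put(kv[0].trim(), kv[1].trim());
        }
        return fields;
    }
}

class DelimitedRecordParserTest {

    @Test
    void parsesWellFormedRecord() {
        Map<String, String> fields = DelimitedRecordParser.parse("id=42;source=sensor-a");
        assertEquals("42", fields.get("id"));
        assertEquals("sensor-a", fields.get("source"));
    }

    @Test
    void rejectsMalformedField() {
        assertThrows(IllegalArgumentException.class,
                () -> DelimitedRecordParser.parse("id=42;missing-value"));
    }
}
```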
Preferred Qualifications:
- Demonstrated experience using Neo4J