Leidos
Software Developer Integration Engineer
Leidos, Ogden, Utah, United States, 84403
Description
The Leidos Digital Modernization Group seeks an Integration Engineer with a strong software development background to support the Enterprise Situational Awareness/Common Operational Picture (SA/COP) Team. The candidate will be responsible for integrating various data sources into Confluent (Kafka) and Elastic platforms, developing robust integration solutions, and contributing to data governance practices. The role requires expertise in Kafka, cloud platforms, and full software lifecycle automation, with experience in deploying and managing systems in a multi-site, multi-cluster environment.
As a key SA/COP team member, you will work on a fast-paced, Agile development and implementation team to architect, design, and develop an integration solution: a unified, integrated data platform that expands the foundational Integrated Data Architecture platform (Confluent and the ELK stack). You will work alongside others on a dedicated scrum team in support of operational end-user and support-team requirements.
Primary Responsibilities:
Integration Solutions:
Develop and implement integration solutions for the JMT project using Kafka and Elastic as the primary data architecture platforms.
Data Integration:
Integrate data sources into Confluent (Kafka) and Elastic platforms. Develop Kafka system integrations between Elasticsearch/Logstash and other systems.
Kafka Integration & Development:
Develop Kafka system integrations and custom connectors, and work with ksqlDB and Kafka Streams for data processing based on the design solution (a minimal sketch follows this list).
Kafka Cluster Management:
Deploy and manage Kafka clusters on Kubernetes in multi-site environments (both on-premises and cloud).
Software Lifecycle Automation:
Automate the full software lifecycle, from design and development to testing and deployment, including production environments.
DevOps Pipelines:
Design and build application deployment pipelines, including containerized environments using Kubernetes and Docker, and automated testing pipelines.
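For concreteness, the ksqlDB/Kafka Streams responsibility above typically follows a read-transform-write pattern. Below is a minimal Kafka Streams sketch in Java; it is illustrative only, and the topic names (raw-events, enriched-events), application id, and broker address are assumptions rather than details of the SA/COP system. An Elasticsearch sink connector subscribed to the output topic could then carry records into the Elastic platform.

```java
// Minimal Kafka Streams topology sketch: read from an input topic,
// filter and normalize records, and forward to an output topic.
// Topic names, application id, and broker address are hypothetical.
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class EnrichmentTopology {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "sa-cop-enrichment"); // hypothetical app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> raw = builder.stream("raw-events");
        raw.filter((key, value) -> value != null && !value.isEmpty()) // drop empty records
           .mapValues(String::trim)                                   // normalize the payload
           .to("enriched-events");        // output topic; a Kafka Connect Elasticsearch
                                          // sink could consume it downstream

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close)); // clean shutdown
        streams.start();
    }
}
```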
Basic Qualifications:
Education:
B.S. in Computer Science, Mathematics, Physics, Electrical Engineering, or Computer Engineering.
Experience:
5+ years of combined experience with Kafka, Java, RESTful services, AWS, and full-stack development.
Programming Background:
Software development experience with Python, Java, and SQL. Working knowledge of HTML and JavaScript.
Event Streaming & Integration:
Advanced understanding of event streaming and Kafka integration.
Application Integration:
Experience in application integration design and strong communication skills for collaboration with virtual teams.
Software Design:
Experience in developing detailed software designs, particularly with ksqlDB or Kafka Streams.
Software Development Lifecycle:
Proficiency in following a software development lifecycle and maintaining production-quality code.
Experience with distributed version control using Git and hosting platforms such as Bitbucket.
Knowledge of and ability to apply principles, theories, and concepts of Software Engineering.
Experience developing software on a UNIX command line platform.
Software Documentation & Requirements:
Develop DoD requirements, traceability, and detailed plans/schedules. Write software systems engineering documents and interface documents (IDDs/ICDs).
Security Clearance:
Ability to obtain an interim Secret DoD security clearance before the start date.
Certifications:
Ability to obtain a Security+ certification, or an equivalent DoD 8570 IAT Level II certification, within 14 days of the start date.
Preferred Qualifications:
Text Mining & ELK Stack:
Experience with text mining tools and techniques, including ELK Stack for summarization, search, and entity extraction.
CI/CD & DevOps:
Familiarity with CI/CD techniques, containerized pipelines, and DevOps practices.
Search & Analytics Applications:
Experience with BI tools like Kibana and Splunk, and technologies like Elasticsearch, Logstash, Kafka, and NiFi.
Kubernetes & Agile:
Familiarity with Kubernetes deployment and with Agile methodologies and tools.
Cloud Expertise:
Familiarity with AWS GovCloud and cloud infrastructure, including networking and security policies.
Cross-Team Collaboration:
Work within a matrixed organization, collaborating with project leadership and core GMS teams to combine software and integration practices with data engineering.
This role offers the opportunity to work on advanced integration projects, combining software engineering with data integration in a dynamic environment. If you meet the qualifications and are passionate about integration engineering, we encourage you to apply.