Leidos
Software Developer Integration Engineer
Leidos, Odenton, Maryland, United States, 21113
Description
The Leidos Digital Modernization Group seeks an Integration Engineer with a strong software development background to support the Enterprise Situational Awareness/Common Operational Picture (SA/COP) Team. The candidate will be responsible for integrating various data sources into the Confluent (Kafka) and Elastic platforms, developing robust integration solutions, and contributing to data governance practices. The role requires expertise in Kafka, cloud platforms, and full software lifecycle automation, along with experience deploying and managing systems in a multi-site, multi-cluster environment.
As a key SA/COP team member, you will work as part of a fast-paced, Agile development and implementation team to architect, design, and develop an integration solution that delivers a unified, integrated data platform expanding the foundational Integrated Data Architecture platform (Confluent and the ELK stack). You will work alongside others in a dedicated scrum team in support of operational end-user and support-team requirements.
Primary Responsibilities:
Integration Solutions:
Develop and implement integration solutions for the JMT project using Kafka and Elastic as the primary data architecture platforms.
Data Integration:
Integrate data sources (ServiceNow, Terminal Cert DB, Modem Cert DB, Baseband Mission Workup, TRS, MRS, UDL) into Confluent (Kafka) and Elastic platforms. Develop Kafka system integrations between Elasticsearch/Logstash and other systems.
Kafka Integration & Development:
Develop Kafka system integrations, custom connectors, and work with ksqlDB and Kafka Streams for data processing based on the design solution.
Kafka Cluster Management:
Deploy and manage Kafka clusters on Kubernetes in multi-site environments (both on-premises and cloud).
Software Lifecycle Automation:
Automate the full software lifecycle, from design and development to testing and deployment, including production environments.
DevOps Pipelines:
Design and build application deployment pipelines, including containerized environments using Kubernetes and Docker, and automated testing pipelines.
Basic Qualifications:
Education:
B.S. in Computer Science, Mathematics, Physics, Electrical Engineering, or Computer Engineering.
Experience:
5+ years of combined experience in Kafka, Java, RESTful services, AWS, and full stack development.
Programming Background:
Software development experience with Python, Java and SQL. Working knowledge of HTML and JavaScript.
Event Streaming & Integration:
Advanced understanding of event streaming and Kafka integration.
Application Integration:
Experience in application integration design and strong communication skills for collaboration with virtual teams.
Software Design:
Experience developing detailed software designs, particularly with ksqlDB or Kafka Streams.
Software Development Lifecycle:
Proficiency in following a software development lifecycle and maintaining production-quality code.
Experience with distributed version control software such as Git and Bitbucket.
Knowledge of and ability to apply principles, theories, and concepts of Software Engineering.
Experience developing software on a UNIX command line platform.
Software Documentation & Requirements:
Experience developing DoD requirements, traceability, and detailed plans/schedules, and writing software systems engineering documents and interface documents (IDDs/ICDs).
Security Clearance:
Ability to obtain an interim DoD Secret security clearance before the start date.
Certifications:
Ability to obtain Security+ certification or equivalent DoD 8570 IAT II certification within 14 days of the start date.
Preferred Qualifications:
Text Mining & ELK Stack:
Experience with text mining tools and techniques, including ELK Stack for summarization, search, and entity extraction.
CI/CD & DevOps:
Familiarity with CI/CD techniques, containerized pipelines, and DevOps practices.
Search & Analytics Applications:
Experience with BI tools like Kibana and Splunk, and technologies like Elasticsearch, Logstash, Kafka, and NiFi.
Kubernetes & Agile:
Familiarity with Kubernetes deployment, Agile methodologies, and tools.
Cloud Expertise:
Familiarity with AWS GovCloud and cloud infrastructure, including networking and security policies.
Cloud Platform Expertise:
Expert knowledge of cloud-integrated platforms for integration and deployment tasks.
Cross-Team Collaboration:
Work within a matrixed organization, collaborating with project leadership and core GMS teams to combine software and integration practices with data engineering.
System Architecture & Operational Stability:
Apply knowledge of system architecture, networks, and Centralized Logging (ELK) to support data transformation initiatives.
Cloud & DoD Environments:
Experience developing and deploying software in a DoD environment (DISA experience is a plus), including building and deploying software applications that meet DoD security standards, such as updating applications and code to pass security scans and complying with security implementation guidelines (e.g., STIGs).
Agile Processes:
Experience with Agile methodologies and related tools. Experience with Atlassian tools, including JIRA and Confluence.
Certifications:
Confluent Certified Developer and Elastic Certified Engineer.
Remote Teamwork:
Experience working remotely with a geographically dispersed team.
Original Posting Date:
2024-09-12
While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days, with an anticipated close date of no earlier than 3 days after the original posting date as listed above.
Pay Range:
Pay Range $101,400.00 - $183,300.00
The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.