Orbis Operations
DevOps System Engineer (TS/SCI with Poly)
Department: Federal
Employment Type: Full Time
Location: Annapolis Junction
Reporting To: Robert Wolfe
Description
The successful candidate will be responsible for creating, sustaining, and troubleshooting complex operational data flows, including data storage, data transport, data management, data security, data compliance, and knowledge store management. Additional tasking includes working with the mission customer to perform exploratory data analysis on raw data, cleaning, enriching, transforming, and converting it into the required formats. The candidate will also be responsible for devising methods to improve existing operational data flow processing, distribution, and reliability.
What you'll be doing:
- Must be authorized to work in the US
- Active TS/SCI with appropriate level Polygraph required
- Experience with AWS (S3, VPCs and networking, EC2, ECS/EKS)
- Experience with containerization (Docker, Kubernetes, registries)
- Experience with IaC (Terraform/CloudFormation)
- Experience with CI/CD (Jenkins/GitLab/GitHub Actions)
- Experience creating, managing, and troubleshooting complex operational data flows
- Experience with Corporate data flow processes and tools
- Experience with Corporate data security and compliance procedures and policies
- Experience with the Atlassian Tool Suite (JIRA, Confluence, Bitbucket)
- Experience with the Unix/Linux CLI

What you'll need:
- Must be authorized to work in the US
- Active TS/SCI with appropriate level Polygraph required
- Experience using the Unix CLI
- Experience creating, managing, and troubleshooting complex operational data flows
- Experience using the Apache NiFi canvas to process and distribute data
- Experience with Corporate data flow processes and tools
- Experience with Corporate data security and compliance procedures and policies
- Experience with the Atlassian Tool Suite (JIRA, Confluence, Bitbucket)
- Experience with Linux
- Minimum of 5 years' experience as a DevOps/Systems Engineer on programs and contracts of similar scope, type, and complexity is required
- Bachelor's Degree in System Engineering, Computer Science, Information Systems, Engineering Science, Engineering Management, or a related discipline; five (5) years of additional SE experience may be substituted for a bachelor's degree
- SQL querying experience
- General HPC technical knowledge regarding compute, network, memory, and storage components
- Experience with the Elastic Stack (Elasticsearch/Kibana)
- Experience with time-series visualization tools such as Grafana
- Experience writing scripts using Bash/Python
- Experience with IaC principles and automation tools
- Responsible for the deployment of microservices
- Demonstrated experience in cloud-based system development and integration testing