DigiFlight
Cloud Software Engineer, L2 - TS/SCI with Polygraph
DigiFlight, Fort George Meade, Maryland, United States
Join an outstanding team that offers exciting job opportunities with the goal of providing the absolute best support to our customers. Here at DigiFlight we embrace integrity and innovative solutions, put our customers first, and offer a highly competitive benefits package!
Role description: The Cloud Software Engineer (Level 2) develops, maintains, and enhances complex and diverse Big-Data Cloud systems based upon documented requirements. Directly contributes to all stages of back-end processing, analyzing, and indexing. Provides expertise in Cloud Computing, the Hadoop ecosystem (including implementing Java applications), Distributed Computing, Information Retrieval (IR), and Object-Oriented Design. Works individually or as part of a team. Reviews and tests software components for adherence to the design requirements and documents test results. Resolves software problem reports. Utilizes software development and software design methodologies appropriate to the development environment. Provides specific input to the software components of system design, including hardware/software trade-offs, software reuse, use of Commercial Off-the-Shelf (COTS)/Government Off-the-Shelf (GOTS) products in place of new development, and requirements analysis and synthesis from the system level down to individual software components.
Required skills:
- Active TS/SCI security clearance with full-scope polygraph.
- Eight (8) years of software engineering experience in programs and contracts of similar scope, type, and complexity; two (2) years of which must be in programs utilizing Big-Data Cloud technologies and/or Distributed Computing.
- Bachelor's degree in Computer Science or a related discipline from an accredited college or university. Four (4) years of cloud software engineering experience on projects with similar Big-Data systems may be substituted for the bachelor's degree.
- A Master's degree in Computer Science or a related discipline from an accredited college or university may be substituted for two (2) years of experience.
- Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of Cloud experience.
The following Cloud-related experience is required:
- Two (2) years of Cloud and Distributed Computing Information Retrieval (IR).
- One (1) year of experience implementing code that interacts with an implementation of Cloud Big Table.
- One (1) year of experience implementing code that interacts with an implementation of a Cloud Distributed File System.
- One (1) year of experience implementing complex MapReduce analytics (see the illustrative sketch after this list).
- One (1) year of experience implementing code that interacts with Cloud Distributed Coordination Frameworks.
- Experience with Computer Network Operations: Utility Computing, Network Management, Virtualization (VMware or VirtualBox), Cloud Computing.
- Experience with Multi-Node Management and Installation: management and installation of Cloud and Distributed Computing on multiple nodes using Python, CFEngine, Bash, Ruby, or related technologies.
- Experience with Information Assurance: securing Cloud-based and Distributed applications through industry-standard techniques such as firewalls, PKI certificates, and server authentication, with experience in corporate authentication service(s).
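For illustration only (not an additional requirement): a minimal sketch of the kind of MapReduce analytic referenced above, written against the standard Hadoop Java API. The input layout (CSV lines of the form timestamp,eventType,payload) and the class and path names are hypothetical placeholders.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Counts records per event type from hypothetical CSV lines: timestamp,eventType,payload
public class EventTypeCount {

  public static class EventMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text eventType = new Text();

    @Override
    protected void map(LongWritable offset, Text line, Context context)
        throws IOException, InterruptedException {
      String[] fields = line.toString().split(",", 3);
      if (fields.length >= 2) {
        eventType.set(fields[1].trim());
        context.write(eventType, ONE); // emit (eventType, 1)
      }
    }
  }

  public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text key, Iterable<IntWritable> counts, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable c : counts) {
        sum += c.get();
      }
      context.write(key, new IntWritable(sum)); // total per event type
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "event-type-count");
    job.setJarByClass(EventTypeCount.class);
    job.setMapperClass(EventMapper.class);
    job.setCombinerClass(SumReducer.class); // safe: the sum is associative and commutative
    job.setReducerClass(SumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Such a job would typically be packaged as a JAR and launched with hadoop jar against input and output directories in HDFS.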
Experience with Information Technology:
- Object-Oriented Design and Programming, Java, Eclipse or a similar development environment, Maven, RESTful web services.
- Cloud and Distributed Computing Technologies, in at least one or a combination of the following areas: YARN, J2EE, MapReduce, ZooKeeper, HDFS, HBase, JMS, Concurrent Programming, Multi-Node implementation/installation, and other applicable technologies.
- Cloud and Distributed Computing Information Retrieval, in at least one or a combination of the following areas: HDFS, HBase, Apache Lucene, Apache Solr, MongoDB (see the HBase sketch after this list).
- Ingesting, Parsing, and Analysis of Disparate Data Sources and Formats: XML, JSON, CSV, binary formats, Sequence or Map files, Avro, and related technologies.
- Aspect-Oriented Design and Development.
- Debugging and Profiling Cloud and Distributed Installations: Java Virtual Machine (JVM) memory management, profiling Java applications.
- UNIX/Linux, CentOS.
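For illustration only: a minimal sketch of client code interacting with Apache HBase, the open-source Big Table implementation named above. The table name ("events"), column family, and row key are hypothetical placeholders, and the cluster location is assumed to come from an hbase-site.xml on the classpath.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

// Writes one cell to a hypothetical "events" table and reads it back.
public class HBaseRoundTrip {
  public static void main(String[] args) throws Exception {
    // Cluster/ZooKeeper quorum location is picked up from hbase-site.xml on the classpath.
    Configuration conf = HBaseConfiguration.create();
    try (Connection connection = ConnectionFactory.createConnection(conf);
         Table table = connection.getTable(TableName.valueOf("events"))) {

      // Write: row key "row-0001", column family "d", qualifier "type".
      Put put = new Put(Bytes.toBytes("row-0001"));
      put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("type"), Bytes.toBytes("login"));
      table.put(put);

      // Read the same cell back.
      Result result = table.get(new Get(Bytes.toBytes("row-0001")));
      byte[] value = result.getValue(Bytes.toBytes("d"), Bytes.toBytes("type"));
      System.out.println("type = " + Bytes.toString(value));
    }
  }
}
```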
Experience with SIGINT:
- Experience with at least one SIGINT collection discipline area (FORNSAT, CABLE, Terrestrial/Microwave, Overhead, and ELINT).
- Geolocation, emitter identification, and signal applications.
- Joint program collection platforms and dataflow architectures; signals characterization analysis.
Experience with Other:
- CentOS and Linux/Red Hat.
- Configuration management tools such as Subversion, ClearQuest, or Razor.
Responsibilities:
- Provide in-depth knowledge of Information Retrieval, assisting the software development team in designing, developing, and testing Cloud Information Retrieval.
- Implement complex workflows that manage Cloud MapReduce analytics.
- Implement code that interacts with a Cloud Distributed Coordination Framework (see the ZooKeeper sketch after this list).
- Oversee one or more software development tasks and ensure the work is completed in accordance with the constraints of the software development process being used on the project.
- Make recommendations for improving documentation and software development process standards.
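For illustration only: a minimal sketch of code interacting with Apache ZooKeeper, the distributed coordination framework listed in the technologies above. The connection string, /workers path, and node naming are hypothetical placeholders; production code would more likely use a recipe library such as Apache Curator.

```java
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.CountDownLatch;

import org.apache.zookeeper.CreateMode;
import org.apache.zookeeper.KeeperException;
import org.apache.zookeeper.Watcher;
import org.apache.zookeeper.ZooDefs;
import org.apache.zookeeper.ZooKeeper;

// Registers this worker under a /workers znode so peers can discover it; the
// ephemeral-sequential node is removed automatically if the session dies.
public class WorkerRegistration {
  public static void main(String[] args) throws Exception {
    String connect = args.length > 0 ? args[0] : "localhost:2181"; // placeholder quorum address
    CountDownLatch connected = new CountDownLatch(1);

    ZooKeeper zk = new ZooKeeper(connect, 15000, event -> {
      if (event.getState() == Watcher.Event.KeeperState.SyncConnected) {
        connected.countDown();
      }
    });
    connected.await(); // block until the session is established

    // Ensure the parent path exists; ignore the race where another worker created it first.
    try {
      zk.create("/workers", new byte[0], ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.PERSISTENT);
    } catch (KeeperException.NodeExistsException ignored) {
    }

    // Register this host as an ephemeral, sequentially numbered child node.
    String me = zk.create("/workers/worker-",
        InetAddress.getLocalHost().getHostName().getBytes(StandardCharsets.UTF_8),
        ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.EPHEMERAL_SEQUENTIAL);
    System.out.println("Registered as " + me);

    // A simple read against the coordination service: list the current membership.
    System.out.println("Current workers: " + zk.getChildren("/workers", false));

    zk.close();
  }
}
```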
Our People
DigiFlight attracts the most highly skilled workforce to protect some of our nation's most sensitive systems. Before joining the company, many DigiFlight professionals served our country in a civilian and/or military capacity. Our diverse team provides innovative solutions as they support critical clients in tackling tough challenges. Most importantly, our team is passionate about their work and making a difference.
Our corporate culture promotes a healthy work/life balance.
Our Benefits
DigiFlight's competitive benefits package allows employees to manage their personal and professional portfolios through a variety of features and programs. Our benefits include:
- Health, Dental, Vision, and Flexible Spending Account
- Paid Time Off (PTO)
- 11 paid holidays
- Tuition Education Assistance
- Professional Development
- 401(k) retirement plan
- Life insurance and short- and long-term disability insurance
- Employee Referral Program
- Marketing Incentive Plans
DigiFlight, Inc. (DFI) is an Affirmative Action, Equal Opportunity Employer. DFI offers a highly competitive, family-oriented benefits package.