Cornerstone Defense

Software Developer

Cornerstone Defense, Chantilly, Virginia, United States, 22021


Location: Chantilly, Virginia
Type: Contract
Job #3247

Title: Software Developer
Location: Chantilly, VA
Clearance: Active TS/SCI w/ Polygraph needed to apply

Company Overview:
Cornerstone Defense is the Employer of Choice within the Intelligence, Defense, and Space communities of the U.S. Government. Realizing early on that our most prized assets are our employees, we continually focus our attention on improving the overall work/life experience they have supporting the mission. Our team is pushed every day to use their industry-leading knowledge to provide end-to-end solutions to combat our nation's toughest and most secure problems. If you are looking for a place to be not only professionally challenged, but also encouraged and supported by a company that cares, look no further than Cornerstone Defense.

CRITICAL CORE COMPETENCIES and KEY ROLES

Current Role and Responsibilities

If you are an incumbent on this effort/in this space, please list your current role and responsibilities for the team. Include any details that you feel will help us create a more robust staffing approach, as well as any details you think might help identify you to the customer.

System Architecture/Engineering (SE):

Design, review, and implementation of collaboration network architecture, including graphing tool recommendations, data flows, data integration tools, and user interfaces. Develop operation and maintenance documentation, to include User Support, Developer Support, Tool Integration, and Data Requirements/Integration standard operating procedures.

SW Engineering/Developer (SW-E):

Design, development, and implementation of software systems/services solutions based on requirements analysis; perform operational feasibility evaluations and recommend improvements; develop and maintain system and solution code, architecture, and security documentation; assist Sponsor with software licensing and purchase efforts.

DevSecOps (DevOps):

Build, deploy, and secure software applications and tools on the Sponsor's development and collaboration networks; support agile processes and automation in the system development life cycle; deliver new features, fixes, and updates aligned with Sponsor business needs and security requirements.

Data Engineering:

Data integration with provenance and accessibility requirements; design and implementation of data processing pipelines to support streaming, batch and ad hoc scheduled tasks; application of Sponsor community data engineering standards.

Mission Data Integration:

Define data requirements (including ingest, metadata, data security, storage, processing, enrichment, and transformation) and design and implement data processing pipelines covering those same stages. Support the application of Sponsor community data engineering, data integration, and data security standards.
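
As a rough, illustrative sketch of one batch stage in such a pipeline, the Python/PySpark snippet below ingests raw JSON records, attaches simple provenance metadata, de-duplicates, and writes partitioned Parquet. The bucket paths, column names, and feed label are hypothetical placeholders, not references to any Sponsor system.

    from pyspark.sql import SparkSession, functions as F

    # Minimal batch stage: ingest -> enrich with provenance metadata -> de-duplicate -> store.
    # All paths and column names are hypothetical placeholders.
    spark = SparkSession.builder.appName("batch-enrich-stage").getOrCreate()

    raw = spark.read.json("s3://example-bucket/ingest/dt=2024-01-01/")

    curated = (
        raw
        .withColumn("ingest_ts", F.current_timestamp())   # when the record entered the pipeline
        .withColumn("source_system", F.lit("feed-a"))     # which feed produced it
        .dropDuplicates(["record_id"])                     # simple de-duplication on a record key
    )

    (curated.write
        .mode("append")
        .partitionBy("source_system")
        .parquet("s3://example-bucket/curated/"))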

Database Development:

Design, develop, implement, and maintain NoSQL databases; utilize expert knowledge of the SQL language to perform database tasks. Translate business requirements into databases, data warehouses, and data streams; create processes and procedures to ensure data accuracy and accessibility, and to process, clean, and integrate the data. Analyze, plan, and define the data architecture framework, including security, reference data, metadata, and master data. Write and engage with server-side APIs and applications using existing Sponsor data pipelines, application APIs, and services, or self-written data pipelines, applications, and APIs. Gather data requirements such as how long the data needs to be stored, how it will be used, and which people and systems need access to the data. Process data for specific needs using tools that access data from different sources, transform and enrich the data, summarize it, and store it in the storage system.
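
The skills list later in this posting calls out partition and sort key design for large graph-style tables in DynamoDB. As one hedged illustration of that idea, the sketch below uses boto3 with a hypothetical "edges" table and an adjacency-list pattern: the partition key holds the source vertex and a sort-key prefix encodes the edge label, so neighbor lookups stay within a single partition.

    import boto3
    from boto3.dynamodb.conditions import Key

    # Hypothetical "edges" table using an adjacency-list pattern:
    #   partition key "pk" = source vertex, sort key "sk" = edge label + target vertex.
    dynamodb = boto3.resource("dynamodb")
    edges = dynamodb.Table("edges")

    # Write one edge: A -KNOWS-> B
    edges.put_item(Item={"pk": "vertex#A", "sk": "edge#KNOWS#vertex#B", "weight": 1})

    # Query all KNOWS edges leaving vertex A; the sort-key prefix bounds the read
    # to one partition and one edge label.
    resp = edges.query(
        KeyConditionExpression=Key("pk").eq("vertex#A") & Key("sk").begins_with("edge#KNOWS#")
    )
    for item in resp["Items"]:
        print(item["sk"], item.get("weight"))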

Data Science:

Support the data acquisition process and dataset characterization; conduct data investigation and exploratory data analysis; implement models and algorithms; correlate similar data to find actionable results. Assess the effectiveness and accuracy of new data sources and data gathering techniques. Develop custom data models and algorithms to apply to data sets, using predictive modeling to improve and optimize customer experiences and other business outcomes. Coordinate with different functional teams to implement models and monitor outcomes. Develop processes and tools to monitor and analyze model performance and data accuracy.
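
As a very small illustration of the predictive-modeling and model-monitoring workflow described above, the sketch below trains and evaluates a classifier with scikit-learn (one of many possible PyPI libraries). The feature matrix and labels are synthetic stand-ins, not any particular dataset.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import roc_auc_score

    # Synthetic stand-in data; in practice X and y would come from curated mission datasets.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 10))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    model = GradientBoostingClassifier().fit(X_train, y_train)
    scores = model.predict_proba(X_test)[:, 1]

    # Track a holdout metric so model performance and drift can be monitored over time.
    print("holdout AUC:", roc_auc_score(y_test, scores))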

KEY Role - Software Development Lead (SDL):

Serve as a POC to oversee the software development activities and ensure proper alignment of development activities to the Sponsor's priorities. Oversee the successful integration of the current graphing system with the Sponsor's enterprise systems and core technical systems, and the design and development of the next iteration solution for the Sponsor collaboration and development network implementation. Design and oversee the execution of integration testing to ensure the Sponsor's enterprise-level support meets the performance requirements defined by the Sponsor. Apply in-depth, hands-on software engineering expertise to develop automated workflows that ensure compliance with Sponsor security, regulatory, and compliance requirements, such as the design of queueing and message processing workflows for large-scale operations. Maintain oversight over the Sponsor's infrastructure capabilities to ensure key metrics are collected and made available for continuity of operations.
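
The queueing and message-processing workflows mentioned above could take many forms; purely as a hedged sketch, the snippet below shows a minimal consumer loop against an Amazon SQS queue using boto3, with long polling and deletes only after successful processing. The queue URL and the process_record function are hypothetical placeholders.

    import json
    import boto3

    # Hypothetical queue; in a real deployment the URL would come from configuration.
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/example-ingest-queue"
    sqs = boto3.client("sqs")


    def process_record(record: dict) -> None:
        """Placeholder for the real per-message work (validation, enrichment, storage)."""
        print("processing", record.get("record_id"))


    while True:
        # Long polling reduces empty receives on bursty, large-scale feeds.
        resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20)
        for msg in resp.get("Messages", []):
            process_record(json.loads(msg["Body"]))
            # Delete only after successful processing so failed messages are redelivered.
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])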

KEY Role - Data Engineering Lead (DEL):

Serve as a POC to oversee data engineering requirements and engineering efforts to support successful integration of data into the data graphing tool while ensuring provenance and accessibility requirements are met using existing Sponsor pipelines and codebase. Design and implement the data processing pipelines necessary to ensure streaming, batch, and ad hoc scheduled tasks execute successfully and deliver high-quality data. Provide expert-level oversight of the application of Sponsor community standards.
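
Because this role centers on integrating data into a Gremlin-based graphing tool, the sketch below shows a minimal load-and-traverse round trip with the gremlinpython client. The endpoint, labels, and property names are hypothetical, and the snippet is illustrative only.

    from gremlin_python.driver.driver_remote_connection import DriverRemoteConnection
    from gremlin_python.process.anonymous_traversal import traversal
    from gremlin_python.process.graph_traversal import __

    # Hypothetical Gremlin Server / TinkerPop-compatible endpoint.
    conn = DriverRemoteConnection("wss://example-graph-endpoint:8182/gremlin", "g")
    g = traversal().withRemote(conn)

    # Load two vertices and an edge between them (idempotency handling omitted).
    a = g.addV("entity").property("entity_id", "A").next()
    b = g.addV("entity").property("entity_id", "B").next()
    g.V(a).addE("related_to").to(__.V(b)).next()

    # Traverse: two-hop neighborhood of A, de-duplicated and capped for interactive use.
    neighbors = (
        g.V().has("entity", "entity_id", "A")
         .both().both().dedup().limit(100)
         .valueMap(True)
         .toList()
    )
    print(neighbors)

    conn.close()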

KEY Role - System Engineering Transformation Lead (SETL):

Serve as a POC to operate as the focal point for integrating Sponsor community best practices already in use for data and solution approach, developing new recommendations as directed by the Sponsor, and defining detailed engineering requirements for implementing the new analytics, databases, and models. Engage with Sponsor communities of practice, stakeholders, and engineering teams to apply technical expertise on the Sponsor's mission data to develop holistic, scalable solutions. Lead the implementation of Sponsor technical standards for the data graphing tool and associated integrations with Sponsor systems to ensure information security requirements are met in an efficiently scalable manner that enables measurement and monitoring.

REQUIRED and DESIRED SKILLS

Demonstrated experience with designing cloud-native architectures using cloud services such as AWS, Google, IBM, and Oracle
Demonstrated experience designing and operating big data systems
Demonstrated experience building and optimizing performance of large-scale graph databases (tens of billions of edges) using DynamoDB or new enhanced capabilities
Demonstrated experience developing and operating graph traversal capabilities using data graphing tool traversal capabilities built upon Apache Gremlin or new enhanced capabilities
Demonstrated experience developing and operating NoSQL solutions to complex big data applications
Demonstrated experience in data modeling for performance, partition sharding, record/event aggregation workflows, stream processing, and metrics gathering
Demonstrated experience designing and operating large-scale serverless geospatial indexes built with GeoMesa
Demonstrated experience with partition and sort key design and implementation to ensure consistent performance
Demonstrated experience with aggregation operations to de-duplicate records on continuous data feeds
Demonstrated subject matter expertise with relational databases to NoSQL
Demonstrated experience building and operating high-performance data processing pipelines using Lambda, Step Functions, and PySpark
Demonstrated experience building high-quality user interfaces/user experiences with the React framework and WebGL
Demonstrated experience designing and operating large-scale graph databases using Apache Cassandra
Demonstrated experience performing in-depth technical analysis of large-scale graph databases to develop implementation strategies for search optimizations
Demonstrated experience developing technical capabilities for processing, persistence, and search of datasets that are collected or maintained using standards common in the Sponsor's community
Demonstrated experience facilitating engineering discussions across teams representing multiple stakeholders to develop and execute implementation strategies that meet mission needs
Demonstrated experience developing Machine Learning Operations (MLOps) pipelines for large-scale applications
Demonstrated experience maintaining configuration of software using configuration management resources such as GitHub
Demonstrated experience designing, building, and operating big data systems (persistence, partitioning, indexing) at a scale of trillions of records/events
Demonstrated experience with Niagara Files (NiFi) applications or new enhanced capabilities
Demonstrated experience developing and operating Kubernetes infrastructure
Demonstrated experience supporting engineering efforts that will contribute to delivery of capabilities such as datasets and functionality such as communications and geospatial workflows
Demonstrated experience implementing DevSecOps and agile development in production environments
Demonstrated experience with agile software development and testing
Demonstrated experience with federal security, regulatory, and compliance requirements and security accreditation package development
Demonstrated experience with data security and governance using centralized security controls like LDAP, encrypting the data, and auditing access to the data
Demonstrated experience with specialized technologies that are optimized for the particular use of the data, such as relational databases, a NoSQL database (Cassandra), or object storage
Demonstrated experience with Apache TinkerPop, Gremlin, and/or JanusGraph to design, develop, implement, and maintain systems
Demonstrated knowledge of graph databases to design, develop, implement, and maintain systems
Demonstrated experience with C or C++ to write interfaces
Demonstrated experience using centralized security controls like LDAP, encrypting data, and auditing access to data
Demonstrated experience with:

Databases: Postgres, MariaDB, ELK, Minio, AWS S3, Neo4j, MongoDB, noSQL

Demonstrated experience with:

Languages: Python (pypi libraries)

Demonstrated experience with:

Operating Systems: CentOS 7, Rocky Linux 8

Demonstrated experience with:

Orchestration: Kubernetes, Docker, Docker-Compose, Docker-Swarm

Demonstrated experience with:

Development Tools: VS Code, GitLab, JupyterHub/notebooks, MATLAB

Demonstrated experience with:

Environments: large collaboration and development environments

Demonstrated experience with:

Data types: Unstructured, structured, or semi-structured data, including CSV, JSON, JSONL, Avro, Protocol Buffers, Parquet, etc.

DESIRED: Demonstrated experience with designing cloud-native architectures using the Sponsor's cloud services
DESIRED: Demonstrated experience designing and operating big data systems within the Sponsor's policy and regulatory environment
DESIRED: Demonstrated experience developing and operating graph traversal capabilities using the Sponsor's data graphing tool traversal capabilities built upon Apache Gremlin
DESIRED: Demonstrated experience building and operating high-performance data processing pipelines using Lambda, Step Functions, and PySpark on the Sponsor's infrastructure with EMR
DESIRED: Demonstrated experience working with the Sponsor's enterprise services used for Data Management, including the enterprise catalog service (and associated APIs) and Policy Decision Points (PDPs)
DESIRED: Demonstrated experience developing Machine Learning Operations (MLOps) pipelines for large-scale applications in the Sponsor's environment
DESIRED: Demonstrated experience and understanding of IT Service Management and common SLA measurements
DESIRED: Demonstrated experience presenting solutions, requirements, and presentations to diverse audiences
DESIRED: Demonstrated experience working with container orchestration technologies such as AWS ECS, AWS Fargate, and Kubernetes, or other enhanced capabilities available
DESIRED: Demonstrated experience in managing large operational cloud environments spanning multiple tenants using multi-account management, AWS Well-Architected best practices, and AWS Organizational Units/Service Control Policies (OU/SCP)
DESIRED: Demonstrated experience with microservices, such as building decoupled systems, utilizing RESTful endpoints, and lightweight systems
DESIRED: Demonstrated experience in total systems perspectives, including a technical understanding of systems and applications relationships, dependencies, and requirements of hardware and software components
DESIRED: Demonstrated experience consulting with customers to determine present and future user needs
DESIRED: Demonstrated experience providing frequent contact with customers, traceability within program documents, and the overall computing environment and architecture

DESIRED Certifications:
AWS Certified Solutions Architect
AWS Machine Learning Certification(s)
Agile certification
Azure
Security+
GSEC
CCNA