SAIC

Chief Data Manager

SAIC, McLean, Virginia, US, 22107


Description

SAIC's Space & Intel Business Group, USG Mission and Information Technology Division, is seeking a Chief Data Manager to support a program that leverages integrated discrete technologies for massive data processing, storage, modeling, and analytics across several thousand unique data sources, performing threat identification and analysis and supporting efforts to meet tactical and strategic goals. This position is in McLean, VA, and requires an active TS/SCI clearance with polygraph.

Job responsibilities include, but are not limited to:
- Identify and assess new external requirements for data engineering and analytic support, and determine an appropriate course of action to support those requirements for Customer consideration. This includes the work required to identify the data needed, document the methodology used to perform analysis, document results, and make determinations on classifications and handling restrictions of resulting product(s).
- Work with Data Engineers to develop and evolve sophisticated data models to facilitate the exploitation of structured, semi-structured, and unstructured data.
- Identify and evaluate a wide range of bulk data, data exploitation methods, data models, and algorithms used to design novel solutions for complex mission-driven problems of high importance.
- Support and evolve processes and mechanisms.
- Design processes and systems to manage the acquisition, ingestion, and modeling of data.

Qualifications
- Active TS/SCI with polygraph.
- Bachelor's degree in mathematics, computer science, engineering, or a similar scientific or technical discipline and 18 or more years of experience; Master's degree and 16 or more years of experience; or PhD or JD and 13 or more years of experience.
- Demonstrated experience developing and maintaining data catalog and reporting processes and systems that establish the pedigree and lineage of records and datasets, in order to adhere to data owner, legal, and policy requirements.
- Experience working in teams to develop and evolve sophisticated data models to facilitate exploitation of structured, semi-structured, and unstructured data.
- Experience identifying and effectively evaluating a wide range of bulk data, data exploitation methods, data models, and algorithms used to design novel solutions for complex mission-driven problems of high importance.
- Hands-on experience working with COTS, GOTS, and open-source capabilities for the processing and exploitation of disparate high-volume, high-velocity data streams to meet analytical requirements.
- Ability to customize COTS and GOTS capabilities.

Desired Skills
- Graduate degree in computer science, information systems, engineering, or another scientific or technical discipline.
- Demonstrated experience using Enterprise Control Language (ECL) and the LexisNexis High Performance Computing Cluster (HPCC) platform.
- Experience performing all-source data analysis to provide analytic support.
- Experience developing custom algorithms to support analytic requirements against massive data stores supporting the Sponsor.
- Ability to perform technical analysis using massive data processing systems.
- Experience writing cables.
- Experience planning and coordinating program activities such as installation and upgrading of hardware and software, utilization of cloud services, programming or systems design and development, modification of IT networks, or implementation of Internet and intranet sites.
- Experience deploying web applications to a cloud-managed environment, including DevOps and security configuration management.
- Experience developing, implementing, and maintaining cloud infrastructure services such as EC2, ELB, RDS, S3, and VPC.
- Experience planning, coordinating, and executing the activities required to produce documentation that meets the Sponsor's data compliance requirements (e.g., legal, data policy).
