Chief Data Architect

SAIC, Reston, Virginia, United States, 22090


Description

SAIC, a leading provider of systems development & deployment, targeting & intelligence analysis, systems engineering & integration, and training capabilities and solutions for the Intelligence Community, is seeking creative and dedicated professionals to fulfill their career goals and objectives while delivering mission excellence on programs of national importance.

SAIC is seeking a Chief Data Architect to define a Big Data implementation framework; translate mission needs into requirements for Big Data systems, analytics, and visualization tools; and work with multi-disciplinary teams to design and guide engineering solutions, then explore, examine, and visualize data to extract knowledge and insights from large and complex collections of digital data. The role is responsible for managing digital data from many different providers, consisting of a variety of structured, semi-structured, and unstructured data types, with enormous volume, varying velocity (from periodic static documents to real-time data feeds), and rapid growth. Duties include the design, development, testing, deployment, and use of associated advanced analytic and data science tools, including data visualization, link analysis, correlation, sophisticated analytics programs, machine learning, and statistical methods to prepare data for use in predictive and prescriptive modeling.

Qualifications

Required:

Active TS/SCI with polygraph clearance.

Bachelor’s degree in computer science, data analytics, software engineering, or a related technical field.

A minimum of five (5) years of hands-on experience with Big Data platforms such as Hadoop and their associated applications, including administration, configuration management (CM), monitoring, and performance tuning.

Desired:

Experience working in a government mission Data Exploitation environment (e.g., acquiring, storing, processing, analyzing, and visualizing data; turning data into intelligence products; disseminating data; and knowledge management).

Experience supporting a cloud computing environment (e.g., private, hybrid, and/or public).

Experience leading and managing cross-functional technology teams.

Experience with major programming and scripting languages such as Java, C++, PHP, Ruby, Python, and/or R, including development in Linux environments.

Experience working with ETL tools such as Informatica, Talend, and/or Pentaho.

Experience designing solutions for multiple large data warehouses with clustered and parallel architectures, as well as high-scale or distributed platforms.

Experience in the research and development of applications/services in disciplines such as Natural Language Processing, Machine Learning, Conceptual Modeling, Statistical Analysis, Predictive Modeling, and Hypothesis Testing.
