SAIC
Big Data Architect
SAIC, Chantilly, Virginia, United States, 22021
Description
SAIC is seeking an experienced, results-oriented, mission-driven Big Data Architect with a specialized focus on Data Engineering to perform data model design, data formatting, and ETL development optimized for efficient storage, access, and computation in support of national security objectives.

Responsibilities include, but are not limited to:
- As part of an Agile team, increase innovation capacity and drive the velocity of development of data ingestion and data analysis.
- Synchronize efforts with other tasks in assembling data technologies to control the flow of data from source to value, with the goal of speeding up the process of deriving value and insight.
- Bring a passion for unlocking the secrets held by a dataset, along with a solid understanding of and experience with developing, automating, and enhancing all parts of the data pipeline, including ingestion, processing, storage, and exposing data for consumption.
- Implement data quality tests, improve inefficient tooling, and adopt new, transformative technologies while maintaining operational continuity.

Qualifications

Required:
- Active TS/SCI with Polygraph clearance
- Bachelor's degree in Computer Science, Information Systems, or Engineering, or additional years of experience in lieu of a degree
- 14 years of overall related professional experience
- 3+ years of hands-on development experience using Java, JavaScript, or Python for ETL
- ETL experience, including formats such as XML, JSON, and YAML, data normalization, and high-volume data ingestion
- 3+ years of experience using and ingesting data into SQL and NoSQL database systems
- Familiarity with the NEXIS platform
- Experience with Apache NiFi
- Experience programming in Apache Spark and PySpark

Desired:
- Familiarity with building containerized services (e.g., via Docker)
- Familiarity with the Databricks platform
- Experience developing and maintaining data processing flows
- Experience with Amazon Web Services (AWS)
- Experience with CI/CD pipelines
- Experience with Agile methodologies and the Kanban framework
- Experience using relational databases, including MySQL and/or Oracle, for designing database schemas
- Experience with Linux, REST services, and HTTP