NuWave Solutions

Data Engineer

NuWave Solutions, Washington, District of Columbia, US, 20022


Overview

BigBear.ai is seeking to hire a Data Engineer (Cloud) to work with one of our clients in Washington, DC. As a Data Engineer, you will interpret business needs, select appropriate technologies, and implement data governance for shared and/or master data sets. You will work with key business stakeholders, IT experts, and subject-matter experts to plan and deliver optimal data solutions. You will create, maintain, and optimize data pipelines as workloads move from development to production, ensuring seamless data flow for each use case. You will perform technical and non-technical analyses of project issues and help ensure technical implementations follow quality assurance metrics. You will analyze data and systems architecture, create designs, and implement information systems solutions.

What you will do

- Define and communicate a clear product vision for our client's software products, aligning with user needs and business objectives
- Create and manage product roadmaps that reflect both innovation and growth strategies
- Partner with a government product owner and a product team of 7-8 FTEs
- Develop and design data pipelines to support an end-to-end solution (a hypothetical sketch follows this list)
- Develop and maintain artifacts (e.g., schemas, data dictionaries, and transforms related to ETL processes)
- Integrate data pipelines with AWS cloud services to extract meaningful insights
- Manage production data within multiple datasets, ensuring fault tolerance and redundancy
- Design and develop robust, functional dataflows to support raw data and expected data
- Provide Tier 3 technical support for deployed applications and dataflows
- Collaborate with the rest of the data engineering team to design and launch new features
- Coordinate and document dataflows, capabilities, etc.
- Occasionally (as needed) support off-hours deployments, such as evenings or weekends
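For illustration only, here is a minimal sketch of the kind of end-to-end pipeline described above, written against the Apache Airflow 2.x TaskFlow API (Airflow is one of the orchestration tools listed in the requirements below). The DAG name, schedule, and sample records are assumptions chosen for the example, not part of the role description.

```python
# Hypothetical extract-transform-load dataflow sketch (Airflow 2.4+).
# Names, schedule, and data are illustrative assumptions only.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_ingest_pipeline():
    @task
    def extract() -> list[dict]:
        # In a real dataflow this step would pull raw objects from a source
        # such as S3; here it returns a stubbed record.
        return [{"id": 1, "value": "raw"}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Apply transforms of the kind captured in ETL artifacts
        # (schemas, data dictionaries).
        return [{**r, "value": r["value"].upper()} for r in records]

    @task
    def load(records: list[dict]) -> None:
        # In production this would write to a downstream datastore
        # (e.g., Redshift or DynamoDB); here it just reports the count.
        print(f"loaded {len(records)} records")

    load(transform(extract()))


example_ingest_pipeline()
```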

What you need to have

- Clearance: Must possess and maintain a TS/SCI clearance
- Bachelor's degree or equivalent practical experience
- Understanding of cloud architectures and enabling tools and technologies, such as AWS Cloud (GovCloud/C2S)
- Familiarity with Amazon Web Services (AWS) managed services
- Working knowledge of software platforms and services such as Docker, Kubernetes, JMS/SQS, SNS, Kafka, AWS Lambda, NiFi, Airflow, or similar
- Proficiency with JavaScript, Elasticsearch, JSON, SQL, and XML
- Working knowledge of datastores such as MongoDB/DynamoDB, PostgreSQL, S3, Redshift, JDBC/ODBC, and Redis
- Familiarity with Linux/Unix server environments
- Experience with Agile development methodology
- Experience publishing and/or presenting design reports
- Experience coordinating with other team members to reach project milestones and deadlines
- Working knowledge of collaboration tools such as Jira and Confluence

What we'd like you to have

- Master's degree or equivalent experience in a related field
- Familiarity and experience with the Intelligence Community (IC) and the intel cycle
- Familiarity and experience with the Department of Homeland Security (DHS)
- Direct experience with DHS and Intelligence Community (IC) components' data architectures and environments (IC-GovCloud experience preferred)
- Experience with cloud message APIs and usage of push notifications
- Keen interest in learning and using the latest software tools, methods, and technologies to solve real-world problem sets vital to national security
- Working knowledge of public keys and digital certificates
- Experience with DevOps environments
- Expertise in various COTS, GOTS, and open-source tools that support development of data integration and visualization applications
- Specialization in object-oriented programming languages, scripting, and databases

About BigBear.ai

BigBear.ai is a leading provider of AI-powered decision intelligence solutions for national security, supply chain management, and digital identity. Customers and partners rely on BigBear.ai’s predictive analytics capabilities in highly complex, distributed, mission-based operating environments. Headquartered in Columbia, Maryland, BigBear.ai is a public company traded on the NYSE under the symbol BBAI. For more information, visit https://bigbear.ai/ and follow BigBear.ai on LinkedIn: @BigBear.ai and X: @BigBearai.
