
Databricks Subject Matter Specialist

BigBear.ai, Washington, District of Columbia, US, 20022


BigBear.ai is hiring for a Databricks Subject Matter Specialist.

This position requires a solid background in developing solutions for high-volume, low-latency applications and the ability to operate in a fast-paced, highly collaborative environment. The role is responsible for managing and administering a Databricks environment. An Active TS/SCI w/Poly is required. This position is hybrid/remote and is expected to report to the Walker Lane office (Alexandria, VA) at least 2-3 days per week or as needed.

What you will do

- Oversee a Databricks environment, including clusters, workspaces, jobs, and notebooks, to ensure optimal performance, reliability, and scalability.
- Adjust and optimize Databricks clusters and resources based on workload requirements and best practices (see the illustrative sketch after this list).
- Create and manage workspaces and permissions.
- Understand and craft all applicable AWS IAM roles and policies necessary for program use cases.
- Inspect Databricks logs for auditing and cost reporting.
- Understand and execute DoD security best practices across the Databricks platform.
- Work closely with Advana teams to ensure smooth operation and physical design of databases, clusters, jobs, and workspaces.
- Provide support to data engineers, data scientists, and analysts, enabling them to leverage the full potential of Databricks for analytics and machine learning.
- Manage multiple environments: maintain and support configuration management across unclassified and classified environments.
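For illustration only, a minimal Python sketch of the kind of cost-control check this role involves: listing clusters through the Databricks Clusters REST API (GET /api/2.0/clusters/list) and flagging any that never auto-terminate. The DATABRICKS_HOST and DATABRICKS_TOKEN environment variable names are assumptions for this sketch, not part of the posting.

```python
import os
import requests

# Assumed environment variables for this sketch: workspace URL and a personal access token.
host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]

# Databricks Clusters API 2.0: list clusters visible to the caller.
resp = requests.get(
    f"{host}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()

for cluster in resp.json().get("clusters", []):
    name = cluster.get("cluster_name", "<unnamed>")
    autoterm = cluster.get("autotermination_minutes", 0)
    # A value of 0 means the cluster never auto-terminates, which tends to drive up cost.
    if autoterm == 0:
        print(f"Review for cost control: cluster '{name}' has no auto-termination configured")
```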

What you need to have

- Bachelor's degree with 5 years of experience.
- Experience as the Databricks account owner, managing workspaces, AWS accounts, audit logs, and high-level usage monitoring.
- Experience as a Databricks workspace admin, managing workspace users and groups, including single sign-on, provisioning, access control, and workspace storage.
- Experience with Databricks security and privacy setup.
- Experience optimizing usage for performance and cost preferred.
- Ability to multitask and reprioritize tasking on the fly according to the needs of a growing platform and its stakeholders.
- Strong coding skills in SQL and Python (PySpark) with experience optimizing code (see the illustrative sketch after this list).
- Must possess and maintain an Active TS/SCI w/Poly.
- Experience working with DoD data.
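As an illustrative sketch of the PySpark code-optimization skills listed above (table names and columns are hypothetical): prune columns, filter early, and broadcast the small side of a join to avoid a shuffle.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("optimization-sketch").getOrCreate()

# Hypothetical tables for this sketch: a large fact table and a small lookup table.
events = spark.read.table("events")
regions = spark.read.table("region_lookup")

result = (
    events
    .select("event_id", "region_id", "event_ts", "amount")  # prune columns early
    .filter(F.col("event_ts") >= "2024-01-01")               # filter before the join
    .join(F.broadcast(regions), "region_id")                 # broadcast the small dimension table
    .groupBy("region_name")
    .agg(F.sum("amount").alias("total_amount"))
)

result.show()
```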

What we'd like you to have

- Experience with Qlik dashboarding and development functions.
- Experience in using DoD Data Management to manipulate and integrate databases with SQL, SAS, or other languages.
- Knowledge of Big Data systems, including Hadoop, HDFS, Hive, or Cloudera.
- Able to discern stakeholder needs, effectively communicate development plans, and track progress milestones.
- Excellent organizational and time management skills to handle multiple tasks simultaneously.
- Excellent critical thinking skills to assess numbers, trends, and data to reach new conclusions based on findings.
- Excellent quantitative skills, including statistical analysis, process design, and data management.

About BigBear.ai

BigBear.ai is a leading provider of AI-powered decision intelligence solutions for national security, supply chain management, and digital identity. Customers and partners rely on BigBear.ai’s predictive analytics capabilities in highly complex, distributed, mission-based operating environments. Headquartered in Columbia, Maryland, BigBear.ai is a public company traded on the NYSE under the symbol BBAI. For more information, visit bigbear.ai and follow BigBear.ai on LinkedIn: @BigBear.ai and X: @BigBearai.
