Databricks
Specialist Solutions Architect - Cloud Infrastructure & Security
Databricks, Oklahoma City, Oklahoma, United States
FEQ126R112
Mission
As a Specialist Solutions Architect (SSA) - Cloud Infrastructure & Security, you will guide a variety of customers in the administration and security of their Databricks deployments. This is a customer-facing role, working with and supporting Solution Architects, and it requires hands-on production experience with the public clouds - AWS, Azure, and GCP. SSAs help customers design and successfully implement essential workloads while aligning their technical roadmap for expanding usage of the Databricks Platform. As a deep go-to expert reporting to the Specialist Field Engineering Manager, you will continue to strengthen your technical skills through mentorship, learning, and internal training programs, and will establish yourself in an area of specialty - whether that be cloud deployments, security, networking, or something else.
The impact you will have:
Provide technical leadership to guide strategic customers to the successful administration of Databricks, from design through deployment.
Architect production-level deployments, including meeting necessary security and networking requirements.
Become a technical expert in an area such as cloud platforms, automation, security, networking, or identity management.
Assist Solution Architects with the more advanced aspects of the technical sale, including custom proof-of-concept content and custom architectures.
Provide tutorials and training to improve community adoption (including hackathons and conference presentations).
Contribute to the Databricks Community.
What we look for:
5+ years of experience in a technical role with expertise in at least one of the following:
Cloud Platforms & Architecture: cloud-native architecture in CSPs such as AWS, Azure, and GCP; serverless architecture.
Security: platform security, network security, data security, Gen AI & model security, encryption, vulnerability management, compliance.
Networking: architecture design, implementation, and performance.
Identity Management: provisioning, SCIM, OAuth, SAML, federation.
Platform Administration: high availability and disaster recovery, cluster management, observability, logging, monitoring, audit, cost management.
Infrastructure Automation: InfraOps with IaC tools like Terraform; maintaining and extending a Databricks environment to evolve with complex needs.
Deep specialty expertise in at least one of the following areas:
Security - understanding how to secure data platforms and manage identities.
Complex deployments.
Public Cloud - experience designing data platforms on cloud infrastructure and services, such as AWS, Azure, or GCP, using best practices in cloud security and networking.
Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent work experience.
Hands-on experience with Python, Java, or Scala; proficiency in SQL; Terraform experience is desirable.
2 years of professional experience with Big Data technologies (e.g., Spark, Hadoop, Kafka) and architectures.
2 years of customer-facing experience in a pre-sales or post-sales role.
Ability to meet expectations for technical training and role-specific outcomes within 6 months of hire.
This role can be remote, but we prefer that you be located in the job listing area and can travel up to 30% when needed.