Snowflake

Data Platform Architect - Data Engineering, Field CTO Office

Snowflake, Washington, District of Columbia, US, 20022


Build the future of the AI Data Cloud. Join the Snowflake team.

There is only one Data Cloud. Snowflake's founders started from scratch and designed a data platform built for the cloud that is effective, affordable, and accessible to all data users. But it didn't stop there. They engineered Snowflake to power the Data Cloud, where thousands of organizations unlock the value of their data with near-unlimited scale, concurrency, and performance. This is our vision: a world with endless insights to tackle the challenges and opportunities of today and reveal the possibilities of tomorrow.

Our Sales Engineering organization is seeking a Data Platform Architect to join our Field CTO Office who can provide leadership in working with both technical and business executives on the design and architecture of the Snowflake Cloud Data Platform as a critical component of their enterprise data architecture and overall ecosystem. In this role you will work with sales teams, product management, and technology partners, leveraging your expertise, best practices, and reference architectures to highlight Snowflake's Cloud Data Platform capabilities across Data Warehouse, Data Lake, and Data Engineering workloads. As a Data Platform Architect, you must share our passion and vision for helping our customers and partners drive faster time to insight through Snowflake's Cloud Data Platform, thrive in a dynamic environment, and have the flexibility and willingness to jump in and get things done. You are equally comfortable in both a business and technical context, interacting with executives and talking shop with technical audiences.

IN THIS ROLE YOU WILL GET TO:

Apply your multi-cloud data architecture expertise while presenting Snowflake technology and vision to executives and technical contributors at strategic prospects, customers, and partners.

Partner with sales teams and channel partners to understand the needs of our customers, strategize on how to navigate and accelerate winning sales cycles, provide compelling value-based enterprise architecture deliverables and working sessions to ensure customers are set up for success operationally, and support strategic enterprise pilots / proof-of-concepts.

Collaborate closely with our Product team to effectively influence the Cloud Data Platform product roadmaps based on field team and customer feedback.

Partner with Product Marketing teams to spread awareness and support pipeline building via customer roundtables, conferences, events, blogs, webinars, and whitepapers.

Contribute to the creation of reference architectures, blueprints, best practices, etc. based on field experience to continue to up-level and enable our internal stakeholders.

ON DAY ONE, WE WILL EXPECT YOU TO HAVE:

10+ years of architecture and data engineering experience within the Enterprise Data space.

Deep Technical Hands-on Expertise within one or more of the following (preferably with several): Data Warehouse Modernization / Migrations, Data Lakes, Data Engineering.

Development experience with technologies such as SQL, Python, Pandas, Spark, PySpark, Hadoop, Hive, and other big data technologies.

Prior knowledge of Data Engineering tools for ingestion, transformation, and curation.

Familiarity with real-time or near-real-time use cases (e.g., CDC) and technologies (e.g., Kafka, Flink), a deep understanding of integration services and tools for building ETL and ELT data pipelines, and experience with orchestration technologies such as Matillion, Fivetran, Airflow, Informatica, etc.

3+ years of Cloud Provider experience, including certifications in AWS, GCP, and/or Azure, with detailed knowledge of cloud storage and services supporting data ingestion, transformation, pipelines, and stream processing.

Working knowledge of open table formats such as Hudi or Iceberg; experience with at least one big data project for a data lake or data engineering is a plus.

Hands-on experience with database change management and DevOps processes and technologies such as GitHub, GitLab, Jenkins, and Terraform.

Strong architectural expertise in data engineering to confidently present and demo to business executives and technical audiences, and effectively handle any impromptu questions.

Bachelor's degree required; Master's degree in computer science, engineering, mathematics, or a related field, or equivalent experience, preferred.

Every Snowflake employee is expected to follow the company's confidentiality and security standards for handling sensitive data. Snowflake employees must abide by the company's data security plan as an essential part of their duties. It is every employee's duty to keep customer information secure and confidential.

The following represents the expected range of compensation for this role: the estimated base salary range is $162,000 - $225,700. Additionally, this role is eligible to participate in Snowflake's bonus and equity plan.

The successful candidate's starting salary will be determined based on permissible, non-discriminatory factors such as skills, experience, and geographic location. This role is also eligible for a competitive benefits package that includes: medical, dental, vision, life, and disability insurance; 401(k) retirement plan; flexible spending & health savings account; at least 12 paid holidays; paid time off; parental leave; employee assistance program; and other company benefits.
