Snowflake Computing
Senior Software Engineer - Data Lake
Snowflake Computing, Clyde Hill, Washington
Build the future of data. Join the Snowflake team.

The Snowflake Data Lake team's mission is to power open standards with Snowflake innovation. Our customers want to bring more data to Snowflake to support a variety of data lake use cases with large data sets, but they face the common challenges of control, cost, and interoperability. This team aims to address these challenges and enable customers to benefit from Snowflake's rich features and integrated platform capabilities while embracing their choice of open table standards (e.g., Apache Iceberg), file formats (e.g., Apache Parquet), storage solutions, and third-party open source tool sets (e.g., Apache Spark). We are early in the journey to build the best data lake solutions for any workload at scale. We are seeking talented Senior Software Engineers who are technical leaders in the big data open source community to join us to define the strategy, deliver innovation into the open source community, and bring Snowflake to millions of big data professionals.

AS A SENIOR SOFTWARE ENGINEER AT SNOWFLAKE, YOU WILL:
- Understand customer requirements and define product strategies.
- Design, develop, and operate highly reliable, large-scale data lake systems.
- Combine Snowflake innovations with open source standards and tool sets.
- Actively influence the direction of open source standards.
- Partner closely with Product teams to understand requirements and design cutting-edge new capabilities that go directly into customers' hands.
- Analyze and solve fault-tolerance and high-availability issues as well as performance and scale challenges.
- Ensure operational excellence of the services and meet commitments to our customers regarding reliability, availability, and performance.
THE IDEAL CANDIDATE WILL HAVE MOST OF THE FOLLOWING QUALIFICATIONS:
- 8 years of hands-on experience with large-scale, data-intensive distributed systems, especially distributed file systems, object storage, data warehouses, data lakes, data analytics, and data platform infrastructure.
- Strong development skills in Java and C++.
- Active PMC (Project Management Committee) member or Committer status in open source projects such as Apache Iceberg, Parquet, Spark, Hive, Flink, Delta Lake, Presto, Trino, and Avro.
- A proven track record of leading and delivering large, complex big data projects across organizations.
- A growth mindset and excitement about challenging the status quo by seeking innovative solutions.
- An excellent team player who consistently makes everyone around them better.
- Experience with public clouds (AWS, Azure, GCP) is a plus.
- BS/MS in Computer Science or a related major, or equivalent experience.

Every Snowflake employee is expected to follow the company's confidentiality and security standards for handling sensitive data. Snowflake employees must abide by the company's data security plan as an essential part of their duties. It is every employee's duty to keep customer information secure and confidential.

The following represents the expected range of compensation for this role: The estimated base salary range for this role is $214,000 - $327,750. Additionally, this role is eligible to participate in Snowflake's bonus and equity plan. The successful candidate's starting salary will be determined based on permissible, non-discriminatory factors such as skills, experience, and geographic location. This role is also eligible for a competitive benefits package that includes: medical, dental, vision, life, and disability insurance; 401(k) retirement plan; flexible spending and health savings accounts; at least 12 paid holidays; paid time off; parental leave; employee assistance program; and other company benefits.
Snowflake is growing fast, and we're scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building