Staff Software Engineer / Data
WEX, San Francisco, CA, United States
About the Team/Role
WEX is an innovative global commerce platform and payments technology company forging the way in a rapidly changing environment to simplify the business of doing business for customers, freeing them to spend more time, with less worry, on the things they love and care about. We are on a journey to build a consistent, world-class user experience across our products and services and to leverage customer-focused innovations across all our strategic initiatives, including big data, AI, and Risk.
We are looking for a highly motivated, high-potential Staff Engineer to join our Data team, make a big business impact, and grow your career.
This is an exciting time to be a technical leader on the Data team at WEX. WEX's sophisticated business and products empower a wide variety of customer businesses, and the data generated by these production systems, applications, and platforms is rich and complex. As one of WEX's most valuable assets, this data holds enormous potential value for our customers and our business. It is the Data team's responsibility to build the big data technology, platforms, systems, and tools that clean, process, enrich, and optimize the company's core data, and to make that data easy and efficient to use so that WEX customers and internal teams can generate customer and business value. We also build value-added data products for WEX customers. We develop and leverage advanced technologies from the industry, including modern big data and AI technologies, to ensure our efficiency and effectiveness, and we use agile development with a combined engineering approach and the product operating model.
We have challenging problems with huge potential business impact for you to lead, work on, and grow with. We also have a strong team of highly talented, skillful engineers and leaders to support, guide, and coach you.
If you aspire to be a strong engineer who solves tough problems, leads, generates big impact, and grows fast, this is a great opportunity for you!
How you'll make an impact
Collaborate with partners/stakeholders to learn about our customers' business and key challenges.
Design, code, test, and instrument new data products, systems, platforms, and pipelines of high complexity with simple, high-quality solutions.
Effectively measure, inspect, and drive decisions using data.
Effectively develop and maintain CI/CD automation using tools such as GitHub Actions.
Effectively implement Infrastructure as Code (IaC) using tools like Terraform, including provisioning and managing cloud-based data infrastructure.
Develop software effectively using TDD and BDD and microservice- and event-oriented architectures, with high efficiency, reliability, quality, and scalability.
Support live data products/systems/platforms/pipelines, and promote proactive monitoring, high data quality, rapid incident response, and continuous improvement.
Independently analyze data, existing systems, and processes to identify bottlenecks and opportunities for improvement.
Mentor your peers, foster continuous learning of new technology within your team and organization.
Attract top talent from the industry to your team; help with interviews and provide quality, timely feedback.
Model our team's processes and best practices, and apply them to your tasks with help from peers and your manager. Proactively understand the customer/business problems those tasks are meant to solve, and ensure your design and implementation actually solve them in an effective, reliable, and sustainable way.
Partner with, assist, or lead your peers in completing complex tasks.
Lead a scrum team hands-on with proper agile development practices, and ensure high-quality, timely development and delivery that solves the target problems effectively.
Own large complex components or systems/products/platforms.
Participate and lead technical discussions.
Efficiently design and build high-quality, performant systems with craftsmanship.
Independently and productively complete work of medium-to-large complexity, and proactively seek reviews from senior engineers to ensure high quality.
Proactively identify and communicate dependencies.
Proactively review work from peers and provide constructive feedback/comments.
Build reliable, secure, high-quality, efficient, and easy-to-use big data platforms and tools at scale to support data transfer, ingestion, processing, serving, delivery, consumption, and governance needs.
Build reliable, scalable, highly efficient, and performant systems, platforms, data pipelines, and tools for the end-to-end data lifecycle (data ingestion, cleaning, processing, enrichment, optimization, and serving) by leveraging the Data platform. It is our responsibility to deliver high-quality, rich, easy-to-understand and easy-to-use data for external and internal purposes. To that end, you will build data quality measurement and monitoring techniques and systems, metadata and data catalogs, Master Data Management, and more.
Use data modeling techniques to design and implement efficient, easy-to-use data models and structures.
Become a deep subject matter expert in your functional area and best practices.
Assess unique circumstances and apply creative problem-solving techniques to resolve issues or suggest various approaches.
Use data and/or AI technology or tools in your design and development for high productivity and better solution quality. Influence your peers in this area.
Lead initiatives for your team by using your wide-ranging experience and deep technical knowledge to make decisions on method and approach to solving issues.
Hold yourself and your team accountable for delivering quality results using defined OKRs.
Interact with Senior Managers to discuss plans, results, and advise on complex matters.
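As a small illustration of the TDD practice mentioned above (all names here are hypothetical, not WEX code): the test is written first, then a minimal data-cleaning function is implemented to make it pass.

```python
def normalize_merchant_name(raw: str) -> str:
    """Trim whitespace, collapse internal runs of spaces, and uppercase."""
    return " ".join(raw.split()).upper()


def test_normalize_merchant_name():
    # Leading/trailing and repeated internal whitespace are removed.
    assert normalize_merchant_name("  acme   fuel co ") == "ACME FUEL CO"
    # Already-clean input is unchanged.
    assert normalize_merchant_name("ACME FUEL CO") == "ACME FUEL CO"


test_normalize_merchant_name()
```

In practice such tests would live in a test suite (e.g., run by a test runner) and be written before the implementation, driving the design of each small unit.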
Experience you'll bring:
Bachelor's degree in Computer Science, Software Engineering, or a related field, OR demonstrable equivalent deep understanding, experience, and capability.
Master's or PhD degree in Computer Science (or a related field) and 5+ years of experience in software engineering, or 7+ years of experience in software engineering at large scale. Experience in data system/platform development.
A technically deep, innovative, empathetic, and passionate technical leader able to act on and deliver to business needs.
Very strong problem-solving skills, excellent communication and collaboration skills.
Highly self-motivated and eager to learn, always watching for new technologies and adopting appropriate ones to improve your productivity and the quality and effectiveness of your deliverables. For example, skilled at leveraging GenAI technologies and tools to increase your work's productivity and quality, and to build innovative products/systems for your customers. Lead the team in this regard.
Rich experience designing simple, high-quality, performant, and efficient solutions for large, complex problems.
Rich experience and good understanding of CI/CD automation.
Rich experience in combined engineering practice and agile development.
Very rich experience and strong implementation skills in languages such as Java, C#, Golang, and Python, including coding, automated testing, measurement, and monitoring, with high productivity and throughput. Experienced in the TDD approach.
Very experienced with, and a good understanding of, data processing techniques, such as data pipeline/platform development, SQL, and databases.
Very experienced in data ingestion, cleaning, processing, enrichment, storage, serving, and quality assurance techniques and tools, such as data pipeline development, SQL and relational algebra, databases, and ELT.
Experienced in cloud technology, such as AWS and Azure.
Good understanding of data warehousing and dimensional modeling, etc.
Passionate about understanding and solving customer/business problems.
Understanding of data governance is a plus.
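As an illustration of the data quality assurance work described above (a minimal sketch with hypothetical names, not a WEX system): compute per-column null rates for a batch of records and fail fast when a threshold is exceeded.

```python
def null_rates(rows: list[dict]) -> dict[str, float]:
    """Return the fraction of None values per column across a batch of rows."""
    if not rows:
        return {}
    columns = rows[0].keys()
    return {
        col: sum(1 for r in rows if r.get(col) is None) / len(rows)
        for col in columns
    }


def check_quality(rows: list[dict], max_null_rate: float = 0.1) -> None:
    """Raise if any column's null rate exceeds the allowed threshold."""
    for col, rate in null_rates(rows).items():
        if rate > max_null_rate:
            raise ValueError(f"column {col!r} null rate {rate:.0%} exceeds threshold")


batch = [
    {"txn_id": 1, "amount": 42.0},
    {"txn_id": 2, "amount": None},
    {"txn_id": 3, "amount": 19.5},
]
rates = null_rates(batch)  # txn_id has no nulls; amount has 1 of 3
```

A production pipeline would run checks like this per batch and feed the rates into monitoring/alerting rather than raising inline.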
Preferred Qualifications:
Experience building various architectures: data lakes, data lakehouses, and data warehouses.
Extensive experience in data engineering, including data pipeline development, SQL, and database management. Proven ability to optimize data ingestion, transformation, and storage processes, ensuring scalable and efficient data flow for business-critical systems.
Experience in data modeling using industry best practices for cloud-based data lakes, data lakehouses, and data warehouses.
Experience implementing large-scale data lakehouses and data warehouses with dimensional modeling techniques.
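To sketch the dimensional modeling idea referenced above (a minimal, hypothetical star schema using Python's built-in sqlite3, not a WEX schema): a fact table keyed to a dimension table, queried with a join and an aggregate.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension: descriptive attributes, one row per merchant.
    CREATE TABLE dim_merchant (
        merchant_key  INTEGER PRIMARY KEY,
        merchant_name TEXT NOT NULL
    );
    -- Fact: one row per transaction, keyed to the dimension.
    CREATE TABLE fact_transaction (
        txn_id       INTEGER PRIMARY KEY,
        merchant_key INTEGER REFERENCES dim_merchant(merchant_key),
        amount       REAL NOT NULL
    );
""")
conn.execute("INSERT INTO dim_merchant VALUES (1, 'ACME FUEL')")
conn.executemany(
    "INSERT INTO fact_transaction VALUES (?, ?, ?)",
    [(100, 1, 42.0), (101, 1, 19.5)],
)

# Typical analytical query: aggregate facts grouped by a dimension attribute.
row = conn.execute("""
    SELECT d.merchant_name, SUM(f.amount)
    FROM fact_transaction f
    JOIN dim_merchant d USING (merchant_key)
    GROUP BY d.merchant_name
""").fetchone()
# row -> ('ACME FUEL', 61.5)
```

In a real lakehouse or warehouse the same shape appears at scale: wide dimension tables conformed across subject areas, narrow fact tables partitioned for efficient scans.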