Netskope

Data Engineer

Netskope, Charlotte, North Carolina, United States, 28245


About Netskope

Today, there are more data and users outside the enterprise than inside, causing the network perimeter as we know it to dissolve. We realized a new perimeter was needed, one that is built in the cloud and follows and protects data wherever it goes, so we started Netskope to redefine Cloud, Network and Data Security.

Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, St. Louis, Bangalore, London, Melbourne, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events (pre- and hopefully post-Covid) and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive and interactive. Visit us at Netskope Careers. Please follow us on LinkedIn and Twitter @Netskope.

About the position:

*Please note: This is a 4-month contract role.

This role will be responsible for designing, building, and enabling the infrastructure that supports our GTM data-driven initiatives. Examples include, but are not limited to: customer segmentation, sales territory construction, sales coverage analysis, customer hierarchy structure development, and data flow orchestration and optimization. This role will report into the Go-To-Market (GTM) Strategy, Operations and Enablement (GSOE) organization under the Data Operations team. This position will work with various operational teams to ensure that the data has integrity and structure and is consumable in relevant formats for analysis and decision-making.

Responsibilities:

Data Ingestion and Pipeline Orchestration: Develop and maintain data pipelines that collect data from various sources, and build processes to keep the data flowing through those pipelines current.

Data Transformation: Clean, normalize, and enrich data to ensure its suitability for analysis.

Data Storage: Design, implement, and manage data storage solutions for replication and flexibility.

ETL (Extract, Transform, Load): Create and maintain ETL processes to extract, transform, and load data into target storage systems (a minimal sketch of this pattern appears after this list).

Data Modeling: Define data schemas and structures to support efficient querying and analysis.

Data Quality and Governance: Implement data validation, profiling, and quality checks to ensure data accuracy and compliance with data governance policies.

Ad Hoc Data Analysis and Clean-Up: Build and run ad hoc queries, summaries, and analyses, which may lead to data clean-up work to standardize and normalize the data.

Performance Optimization: Optimize data infrastructure for scalability and efficiency.

Security and Compliance: Implement data security measures to ensure data compliance.

Collaboration: Work closely with the GTM Strategy & Planning, Data Operations, Marketing Operations, SFDC/Systems Operations, Sales Operations, and Channel Operations teams to understand data requirements and advise on and provide the necessary infrastructure and data workflows to support business processes.

Scalability: Design data workflows that support business processes, including large-volume and bulk data sets, with efficiency, automation, and growth in mind.
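
To make the responsibilities above concrete, the following is a minimal, illustrative extract-transform-validate-load sketch in Python. It is not Netskope's actual pipeline: the field names (account_id, region, arr_usd), the sample records, and the local SQLite target are assumptions chosen so the example is self-contained; a real pipeline would pull from GTM systems such as SFDC and load into a warehouse such as Snowflake.

    # Illustrative extract-transform-validate-load sketch (not a prescribed implementation).
    # Field names and the in-memory SQLite target are assumptions for demonstration only.
    import sqlite3

    import pandas as pd


    def extract() -> pd.DataFrame:
        # Stand-in for an export or API pull from a GTM source system.
        return pd.DataFrame(
            {
                "account_id": ["001A", "001B", "001B", None],
                "region": ["AMER ", "emea", "emea", "APAC"],
                "arr_usd": ["120000", "85,000", "85,000", "40000"],
            }
        )


    def transform(df: pd.DataFrame) -> pd.DataFrame:
        # Normalize text, fix numeric types, and drop exact duplicates.
        df = df.copy()
        df["region"] = df["region"].str.strip().str.upper()
        df["arr_usd"] = df["arr_usd"].str.replace(",", "", regex=False).astype(float)
        return df.drop_duplicates()


    def validate(df: pd.DataFrame) -> pd.DataFrame:
        # Basic quality checks: required keys present, no negative amounts.
        df = df.dropna(subset=["account_id"])
        assert (df["arr_usd"] >= 0).all(), "ARR must be non-negative"
        return df


    def load(df: pd.DataFrame, conn: sqlite3.Connection) -> None:
        # Local SQLite stands in for the target storage system.
        df.to_sql("accounts", conn, if_exists="replace", index=False)


    if __name__ == "__main__":
        with sqlite3.connect(":memory:") as conn:
            load(validate(transform(extract())), conn)
            summary = pd.read_sql(
                "SELECT region, SUM(arr_usd) AS arr_usd FROM accounts GROUP BY region", conn
            )
            print(summary)

Keeping the extract, transform, validate, and load steps separate makes it straightforward to swap the SQLite stand-in for a warehouse loader or to add further quality checks without touching the other steps.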

Requirements:

Proven experience as a data engineer, working with GTM data sets and tools (e.g., SFDC).

Proficiency in SQL and data modeling (see the dimensional-model sketch after this list).

Strong programming skills, with knowledge of languages such as Python or Java.

Experience with ETL tools and data pipeline orchestration (e.g., KNIME, Alteryx, Tableau Prep), including work with both internal and external data sets.

Familiarity with relational (SQL) databases and cloud data warehouse technologies (e.g., Snowflake).

Knowledge of data security and compliance best practices.

Strong problem-solving and verbal/written communication skills.

Ability to work collaboratively in a global team environment.
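
As a rough illustration of the SQL and data-modeling skills referenced above, the sketch below defines a simple account dimension and opportunity fact table and runs an aggregate query against them. The table and column names are hypothetical, and SQLite stands in for a cloud warehouse such as Snowflake purely so the example runs on its own.

    # Illustrative dimensional-model sketch (hypothetical table and column names).
    # SQLite stands in for a cloud warehouse such as Snowflake.
    import sqlite3

    DDL = """
    CREATE TABLE dim_account (
        account_id   TEXT PRIMARY KEY,
        account_name TEXT NOT NULL,
        region       TEXT NOT NULL,
        segment      TEXT NOT NULL
    );
    CREATE TABLE fact_opportunity (
        opportunity_id TEXT PRIMARY KEY,
        account_id     TEXT NOT NULL REFERENCES dim_account(account_id),
        amount_usd     REAL NOT NULL,
        stage          TEXT NOT NULL
    );
    """

    with sqlite3.connect(":memory:") as conn:
        conn.executescript(DDL)
        conn.executemany(
            "INSERT INTO dim_account VALUES (?, ?, ?, ?)",
            [("001A", "Acme", "AMER", "Enterprise"),
             ("001B", "Globex", "EMEA", "Commercial")],
        )
        conn.executemany(
            "INSERT INTO fact_opportunity VALUES (?, ?, ?, ?)",
            [("006X", "001A", 120000.0, "Closed Won"),
             ("006Y", "001B", 85000.0, "Negotiation")],
        )
        # Downstream analysis typically consumes aggregates like this one.
        query = """
            SELECT a.region, a.segment, SUM(o.amount_usd) AS pipeline_usd
            FROM fact_opportunity o
            JOIN dim_account a ON a.account_id = o.account_id
            GROUP BY a.region, a.segment
        """
        for row in conn.execute(query):
            print(row)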

Education:

Bachelor's or higher degree in Computer Science, Information Technology, or a related field.

#LI-AF1

Netskope is committed to implementing equal employment opportunities for all employees and applicants for employment. Netskope does not discriminate in employment opportunities or practices based on religion, race, color, sex, marital or veteran status, age, national origin, ancestry, physical or mental disability, medical condition, sexual orientation, gender identity/expression, genetic information, pregnancy (including childbirth, lactation and related medical conditions), or any other characteristic protected by the laws or regulations of any jurisdiction in which we operate.

Netskope respects your privacy and is committed to protecting the personal information you share with us. Please refer to Netskope's Privacy Policy for more details.