JobRialto
Data Engineer
JobRialto, New York, New York, US, 10261
Job Summary:
• 12+ years of relevant experience.
• Experience in real-time data streaming (preferably Kafka with Snowflake).
• Knowledge of Snowpark.
• Conversant with designing architectural diagrams/flowcharts.
• Experience in data warehousing and big data processing.
• Expertise in Snowflake, its features, and its data storage architecture.
• Features include internal and external stages, COPY INTO, PUT, Snowpipe, streaming data processing, the SQL API, etc. (a minimal loading sketch follows this list).
• Experience designing ETL pipelines and workflows using Snowflake.
• Hands-on experience writing SQL, Python (including Snowpark), and stored procedures.
• Experience working with public cloud data storage.
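The staging and loading features above compose into a short batch-load path. A minimal sketch in Snowpark Python, assuming placeholder connection settings and hypothetical names (a raw_stage internal stage, an events table, a local events.json file), not this role's actual environment:

from snowflake.snowpark import Session

# Placeholder connection settings; none of these values are real.
session = Session.builder.configs({
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}).create()

# Hypothetical internal stage and target table.
session.sql("CREATE STAGE IF NOT EXISTS raw_stage").collect()
session.sql(
    "CREATE TABLE IF NOT EXISTS events "
    "(id INT, payload VARIANT, loaded_at TIMESTAMP)"
).collect()

# PUT uploads a local file into the internal stage (gzip-compressed by default).
session.file.put("events.json", "@raw_stage", auto_compress=True)

# COPY INTO bulk-loads the staged file, with a light transformation.
session.sql(
    """
    COPY INTO events
    FROM (SELECT $1:id::INT, $1, CURRENT_TIMESTAMP() FROM @raw_stage)
    FILE_FORMAT = (TYPE = 'JSON')
    """
).collect()

session.close()

PUT into an internal stage followed by COPY INTO is the manual batch path; Snowpipe automates the same COPY step for continuous loading from a stage.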
Key Responsibilities:
• Design and implement real-time data streaming solutions using Kafka and Snowflake (see the sketch after this list).
• Create and maintain architectural diagrams and flowcharts.
• Develop and optimize data warehouse and big data processing solutions.
• Utilize Snowflake features for data storage and processing.
• Design ETL pipelines and workflows using Snowflake.
• Write and maintain SQL, Snowpark Python code, and stored procedures.
• Collaborate with cross-functional teams to ensure seamless data integration and processing.
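For the Kafka responsibility, production pipelines typically use the Snowflake Kafka connector or Snowpipe Streaming; the consumer loop below is only a sketch of the data flow, with placeholder broker, credential, topic, and table names.

import json

import snowflake.connector
from confluent_kafka import Consumer

# Placeholder Kafka settings; the topic name "events" is hypothetical.
consumer = Consumer({
    "bootstrap.servers": "<broker:9092>",
    "group.id": "snowflake-loader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["events"])

# Placeholder Snowflake credentials.
conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
    database="<database>",
    schema="<schema>",
)
cur = conn.cursor()

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            raise RuntimeError(msg.error())
        record = json.loads(msg.value())
        # PARSE_JSON keeps the raw message as a VARIANT alongside its id.
        cur.execute(
            "INSERT INTO events (id, payload) SELECT %s, PARSE_JSON(%s)",
            (record["id"], json.dumps(record)),
        )
finally:
    cur.close()
    conn.close()
    consumer.close()

Row-by-row inserts like this are fine for illustration but costly at scale, which is why the connector buffers messages and lands them through stages or Snowpipe Streaming instead.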
Required Qualifications:
• 12+ years of experience in relevant fields.
• Proficiency in Snowflake, SQL, Snowpark Python, stored procedures, and data storage architecture (a stored-procedure sketch follows this list).
• Experience in real-time data streaming (preferably Kafka with Snowflake).
• Knowledge of Snowpark.
• Ability to design architectural diagrams/flowcharts.
• Strong understanding of data warehousing and big data processing.
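Stored-procedure work in this stack usually means registering Python functions server-side with Snowpark. A minimal sketch, again with placeholder connection settings and a hypothetical events table carrying a loaded_at column:

from snowflake.snowpark import Session
from snowflake.snowpark.types import IntegerType

# Placeholder connection settings.
session = Session.builder.configs({
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}).create()

def purge_old_events(session: Session, days: int) -> int:
    """Delete rows older than `days` days; return the number removed."""
    rows = session.sql(
        "DELETE FROM events "
        f"WHERE loaded_at < DATEADD('day', -{int(days)}, CURRENT_TIMESTAMP())"
    ).collect()
    return rows[0]["number of rows deleted"]

# Register the function as a (temporary) stored procedure in Snowflake.
session.sproc.register(
    purge_old_events,
    return_type=IntegerType(),
    input_types=[IntegerType()],
    name="purge_old_events",
    packages=["snowflake-snowpark-python"],
    replace=True,
)

# Call it like any other stored procedure.
print(session.call("purge_old_events", 30))
session.close()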
Skills:
• Snowflake
• SQL
• Snowpark Python
• Python
• Stored procedures
• Data storage architecture
• Real-time data streaming (preferably Kafka with Snowflake)
• Architectural diagrams/flowcharts
Education:
Bachelor's Degree