Purgo
Data/ML Architect
Purgo, Palo Alto, California, United States, 94306
The Data Architect role offers the successful candidate the opportunity to pioneer the adoption of generative AI in the design, development, and migration of data applications. This is a hands-on technical role involving deep collaboration with both Purgo AI’s product/engineering team and its customers/partners. The role drives the maturation and adoption of Purgo AI’s redefined software design lifecycle amongst both cloud data warehouse partners and customers.
Requirements
Responsibilities
Identify and quantify customer business requirements across the product’s use cases.
Design Purgo AI’s generative AI-powered software design lifecycle solutions for the identified use cases across leading cloud data warehouses such as Snowflake and Databricks.
Architect automated machine learning training pipelines and machine learning inference setups generated dynamically from data.
Architect text-to-inference workflows that use NLP to perform online ML inference.
Architect, implement, and optimize automated data pipelines, ETL processes, model workflows, and data warehousing solutions.
Design, document, and advocate best practices for data architecture, AI modeling, and data integration to accelerate adoption of Purgo AI solutions.
Conduct technical assessments, develop proofs of value, and present automated solutions that drive optimized paths of product adoption within customer organizations.
Continue to build deep subject-matter expertise in the latest industry trends and advancements in cloud data platforms, code-generation LLMs, text-to-query, and related technologies.
Architect and implement security, compliance, and data governance standards in all Purgo AI solutions.
About you
Graduate education in computer science or information technology.
Minimum of 5 years of experience in a Data Architect or Solutions Architect role with a focus on implementing cloud data warehouse solutions (e.g., Snowflake, Databricks).
Enterprise customer-facing experience driving large-scale cloud data warehouse implementations/migrations is strongly preferred.
Strong expertise in ML training and ML inference across algorithms such as classification, regression, and neural networks.
Strong expertise in NLP and text processing for ML interfaces.
Strong expertise in data warehousing, ETL processes, AI modeling, and data integration techniques.
Proficiency in programming languages such as SQL, Python, and Scala.
Experience with cloud platforms such as AWS, Azure, or Google Cloud.
Deep understanding of data modeling, AI algorithms (e.g., forecasting, anomaly detection), data governance, and data security principles.
Excellent problem-solving skills and the ability to work in a fast-paced, collaborative environment.
Strong communication and presentation skills, with the ability to effectively convey technical concepts to non-technical stakeholders.
Relevant certifications in Snowflake, Databricks, AI, or cloud platforms are a plus.
About the Company
Purgo AI is an application design studio powered by generative AI that interprets high-level business problem statements and generates the design, source code, and deployment of data applications over cloud data warehouses. The product’s fine-tuned LLMs specialize in solving business problems with business intelligence (ETL/ELT), cloud migration, and on-demand machine learning and inferencing for forecasting, anomaly detection, and pattern recognition. Purgo AI integrates out of the box with all leading cloud data warehouses, including Databricks, Snowflake, Microsoft Fabric, Google BigQuery, and AWS Redshift.