Accesa IT Consulting SRL
Data Platforms Solutions Architect
Accesa IT Consulting SRL, Snowflake, Arizona, United States, 85937
We are seeking an experienced Data Platforms Solutions Architect to join our team. This role goes beyond traditional architecture, with a strong emphasis on applying technology knowledge to business development. As a Solutions Architect, you will collaborate with clients and internal teams to design and present data platform solutions tailored to the needs of each customer opportunity.
You will leverage your hands-on experience with modern data engineering tools paired with your ability to provide strategic insights during presales engagements.
Responsibilities:
Drive innovation in Data & AI business opportunities: actively contribute to business development by identifying opportunities, understanding client needs, and proposing innovative data platform solutions based on modern cloud accelerators and frameworks (e.g. Databricks, Snowflake, Synapse).
Tailor client-focused solutions: partner with sales teams to deliver compelling technical presentations, demonstrations, and proofs of concept, influencing clients' strategic decisions by showcasing the value of data engineering and platform architectures tailored to their needs.
Lead discovery and shape solutions: lead Discovery Workshops with new clients, collecting business requirements and working with the BA/PO to document them.
Design operational-fit architectures: carefully analyse all use cases of the data platform for the business value they provide (e.g. pipelines for BI, AI-driven decision-making, data mesh self-service architectures).
Offer technical advisory: guide clients through tool selection and best practices in data platform design, explaining the reasoning behind key choices; pitch the end-to-end solution to internal and end-customer stakeholders.
Advance technical expertise: design and implement scalable, secure, and high-performance data platforms, including data lakes, warehouses, and integration pipelines.
Collaborate with cross-functional teams: partner with stakeholders to align business goals with technology solutions, proposing practical models to calculate the total cost of ownership (TCO).
Provide technical leadership: implement the PoCs associated with the high-level solution designs and lead mid-senior data engineers in evolving each PoC into a production MVP.
Qualifications
Must have:
7+ years of experience in data engineering, focusing on data governance, data lakes, pipelines, and ETL/ELT processes;
3+ years of experience in a Data Platforms Solutions Architect role, leading platform design and implementation;
Expertise in data lake platforms such as Databricks, Azure Data Lake, AWS Lake Formation, or Google BigQuery;
Proficiency in data processing frameworks (Apache Spark, Kafka, Flink) and orchestration tools (Airflow, ADF, AWS Step Functions);
Strong knowledge of security & governance (RBAC, encryption, GDPR and HIPAA compliance) and distributed storage formats (Parquet, ORC, Delta Lake);
Experience with cloud-native platforms (AWS, Azure, GCP) and IaC tools (Terraform, CloudFormation, or Bicep);
Proficiency in automation, DevOps, and observability using tools like Datadog, Prometheus, or cloud-native monitoring;
Strong skills in Python, SQL, or Scala for building data transformations and performance-optimized models (star, snowflake, data vault);
Pre-sales support experience, including RFPs, technical proposals, and business development support;
Strong communication and leadership skills, with the ability to engage technical and non-technical stakeholders.
Nice to have:
Experience integrating data lakes with machine learning platforms like TensorFlow, PyTorch, or cloud-based AI/ML services (e.g. AWS SageMaker, Azure ML, Google Vertex AI);
Familiarity with feature stores and their role in operationalizing ML (e.g. Feast, Databricks Feature Store);
Understanding of data modeling, data ingestion, and ETL/ELT processes tailored for AI/ML use cases;
The following certifications are a plus: Microsoft Certified: Azure Solutions Architect Expert, Google Professional Cloud Architect, Databricks Certified Data Engineer Professional, Snowflake Advanced Architect Certification, Certified AI Practitioner (CAIP).