Accenture
GCP Data Platform Architect
Accenture, Walnut Creek, California, United States, 94598
Are you ready to step up to the New and take your technology expertise to the next level?
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate, and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture and make delivering innovative work part of your extraordinary career.
As part of our Google Cloud Platform practice, you will lead technology innovation for our clients through robust delivery of world-class solutions. There will never be a typical day, and that’s why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of technology experts who are highly collaborative, taking on today’s biggest, most complex business challenges. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in technology at Accenture!
Position Overview
The Google Cloud Platform (GCP) Senior Data Platform Architect is responsible for designing, implementing, and managing scalable and secure data solutions on Google Cloud Platform. This individual will be a technical leader, collaborating with various teams to understand data requirements and translate them into robust architectural blueprints. The ideal candidate possesses a deep understanding of GCP's data services, data warehousing, data lakes, big data technologies, and data security best practices. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions using an appropriate combination of GCP and third-party technologies for deployment on GCP.
Key Responsibilities
Architecture and Design:
Design and implement end-to-end data solutions on GCP, encompassing data ingestion, storage, processing, transformation, and analytics.
Data Warehousing and Lakes:
Architect, design, and deploy data warehouses and data lakes, utilizing technologies like BigQuery, Dataflow, and Dataproc.
Big Data Solutions:
Design and implement big data solutions on GCP, including data pipelines, streaming analytics, and machine learning workflows.
Data Security and Governance:
Establish and maintain robust data security frameworks, implement access controls, and enforce data governance practices.
Performance Optimization:
Monitor, analyze, and optimize data platform performance to ensure efficiency and cost-effectiveness.
Technology Evaluation:
Stay current on the latest GCP data technologies, evaluating and recommending their adoption within the organization.
Collaboration:
Work collaboratively with data engineers, data scientists, business analysts, and other stakeholders to understand requirements and deliver optimal solutions.
Documentation:
Develop clear and comprehensive documentation, including architectural diagrams, design specifications, and operational guidelines.
Foundational GenAI:
Understanding of LLMs, prompt engineering, model evaluation and fine-tuning, MLOps, and ethical and responsible AI.
Travel may be required for this role. The amount of travel will vary from 25% to 100% depending on business need and client requirements.
Basic Qualifications
GCP Expertise:
A minimum of 3 years of professional experience with GCP data services, including BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, and related technologies.
Data Warehousing:
A minimum of 6 years of strong proficiency in data warehousing concepts, data modeling, and ETL/ELT processes.
Big Data:
A minimum of 6 years of professional experience with big data technologies such as Hadoop, Spark, and NoSQL databases.
Data Security:
Deep understanding of data security principles and best practices on GCP, with a minimum of 3 years of professional experience.
Cloud Architecture:
A minimum of 6 years of technical solution implementation, architecture design, evaluation, and investigation in a cloud environment. Proven track record of designing and implementing cloud-based data architectures.
Education:
Bachelor's degree or equivalent (minimum 12 years) work experience. (If Associate's degree, must have a minimum of 6 years of work experience.)
Preferred Qualifications
GCP:
Experience working in GCP, including organization policies, IAM, VMs, databases, Kubernetes, and containers.
Vertex AI:
Proficiency in using Google Cloud's Vertex AI platform for building, deploying, and managing machine learning models, including GenAI models.
Generative AI Studio:
Experience with Generative AI Studio for prototyping and experimenting with generative AI models.
Model Garden:
Familiarity with Google's Model Garden and its offerings for accessing and deploying pre-trained GenAI models.
MLOps for GenAI:
Experience implementing MLOps practices for the development, deployment, and monitoring of GenAI models.
Problem-Solving:
Excellent analytical and problem-solving skills.
Communication:
Strong communication and interpersonal skills, with the ability to collaborate effectively across teams.
Certifications:
GCP Professional Data Engineer or equivalent certifications are highly desirable.
Compensation at Accenture varies depending on a wide array of factors, which may include but are not limited to the specific office location, role, skill set, and level of experience. As required by local law, Accenture provides a reasonable range of compensation for roles that may be hired in California, Colorado, District of Columbia, Illinois, Maryland, Minnesota, New York, or Washington as set forth below. We accept applications on an ongoing basis, and there is no fixed deadline to apply.
Equal Employment Opportunity Statement
Accenture is an Equal Opportunity Employer. We believe that no one should be discriminated against because of their differences, such as age, disability, ethnicity, gender, gender identity and expression, religion, or sexual orientation.