Accenture
GCP Data Engineer
Accenture, San Francisco, California, United States, 94199
Are you ready to step up to the New and take your technology expertise to the next level?
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture and make delivering innovative work part of your extraordinary career.
As part of our Google Cloud Platform practice, you will lead technology innovation for our clients through robust delivery of world-class solutions. There will never be a typical day and that’s why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of technology experts who are highly collaborative taking on today’s biggest, most complex business challenges. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in technology at Accenture!
Position Overview
The Google Cloud Platform (GCP) Data Engineer will be responsible for architecting the transformation and modernization of enterprise data solutions on GCP, integrating native GCP services and third-party data technologies. Solid experience with, and understanding of, the considerations for large-scale architecting, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate the ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on GCP.
Key Responsibilities
- Lead a team of data engineers in designing, developing, testing, and deploying high-performance data analytics solutions on GCP.
- Work with implementation teams from concept to operations, providing deep technical subject-matter expertise for successfully deploying large-scale data solutions in the enterprise, using modern data/analytics technologies on premises and in the cloud.
- Build solution architectures and provision infrastructure, delivering secure and reliable data-centric services and applications on GCP.
- Implement end-to-end data analytics solutions (from data ingestion through visualization) for large-scale, complex client environments.
- Analyze and understand big data and analytical technologies on GCP, and provide thought leadership to clients to help define their architecture, data, and analytics strategies.
- Communicate complex technical topics to non-technical business stakeholders and senior executives.
- Apply experience with recommendation engines, data pipelines, or distributed machine learning, as well as with data analytics and data visualization techniques and software.
- Support data migration and transformation projects.
- Apply knowledge of Google's AutoML framework to build intelligence into data pipelines.
Minimum Qualifications
- Minimum of 5 years' experience in any cloud platform, including 2 years of deep experience with GCP data and analytics services such as Spark, Cloud Dataproc, Cloud Dataflow, Apache Beam, Bigtable, BigQuery, Cloud Pub/Sub, and Cloud Functions.
- Minimum of 3 years of proven ability to re-architect and re-platform on-premises data warehouses to GCP, and to design and build production data pipelines within hybrid architectures using Java, Python, or Scala.
- Minimum of 3 years of expertise in architecting and implementing next-generation data and analytics platforms on GCP, including data engineering, ingestion, and curation functions using GCP-native services or custom programming.
- Minimum of 3 years of hands-on experience architecting and designing data lakes on GCP, and implementing data ingestion solutions using GCP-native services or third-party tools such as Talend or Informatica.
- Minimum of 3 years of proficiency in designing and optimizing data models on GCP using BigQuery and Bigtable.
- Minimum of 2 years of proficiency with Google Cloud's Vertex AI platform for building, deploying, and managing machine learning models, including GenAI models.
- Minimum of 2 years of experience implementing MLOps practices for the development, deployment, and monitoring of GenAI models.
- Bachelor's degree, or equivalent work experience (minimum 12 years; if an Associate's degree, a minimum of 6 years of work experience).
Preferred Qualifications
- Experience architecting and implementing metadata management, data governance, and security for data platforms on GCP.
- Ability to design operations architecture and conduct performance engineering for large-scale data lakes in production.
- Experience architecting and operating large production Hadoop/NoSQL clusters on premises or in the cloud.
- Familiarity with introducing and operationalizing self-service data preparation tools (e.g., Trifacta, Paxata) on GCP.
- 3+ years of experience writing complex SQL queries and stored procedures.
- Experience with Generative AI Studio for prototyping and experimenting with generative AI models.
- Familiarity with Google's Model Garden and its offerings for accessing and deploying pre-trained GenAI models.
- Google Cloud Certified Professional Data Engineer certification.
Travel may be required for this role. The amount of travel will vary from 25% to 100% depending on business need and client requirements.
Compensation
Compensation at Accenture varies depending on a wide array of factors, which may include but are not limited to the specific office location, role, skill set, and level of experience. As required by local law, Accenture provides a reasonable range of compensation for roles that may be hired in California, Colorado, District of Columbia, Illinois, Maryland, Minnesota, New York, or Washington, as set forth below. We accept applications on an ongoing basis, and there is no fixed deadline to apply.
Equal Employment Opportunity Statement
Accenture is an Equal Opportunity Employer. We believe that no one should be discriminated against because of their differences, such as age, disability, ethnicity, gender, gender identity and expression, religion or sexual orientation.