Deloitte

Data Science Solution Specialist - Generative AI

Deloitte, Charlotte, NC


Data Scientist (Generative AI) - Solution Specialist - USDC

Are you an experienced, passionate pioneer in technology - a solutions builder, a roll-up-your-sleeves technologist who wants a collaborative, think-tank environment where you can share new ideas with colleagues daily - without the extensive demands of travel? If so, consider an opportunity with our US Delivery Center - we are breaking the mold of a typical Delivery Center.

Our US Delivery Centers have been growing since 2014 with significant, continued growth on the horizon. Interested? Read more about our opportunity below.

Work you'll do

The Generative AI Engineer will, as part of several client delivery teams, be responsible for designing, developing, and maintaining cutting-edge AI-based systems, ensuring smooth and engaging user experiences. Additionally, the Generative AI Engineer will participate in a wide variety of Natural Language Processing activities, including refining and optimizing prompts to improve the outcomes of Large Language Models (LLMs), as well as code and design review. The Generative AI Engineer's activities will also include, but not be limited to:
  • Working across client teams to architect and develop Generative AI solutions using machine learning and GenAI technologies
  • Developing and promoting standards across the community
  • Evaluating and selecting appropriate AI tools and machine learning models for tasks, as well as building and training working versions of those models using Python and other open-source technologies
  • Working with leadership and stakeholders to identify AI opportunities and promote strategy
  • Developing and conducting training sessions for users across the Government & Public Services landscape on the principles used to develop models and on how to interact with models to facilitate their business processes
  • Building and prioritizing the backlog for future machine-learning-enabled features to support client business processes
  • Designing and building generative models, selecting the most suitable architecture (e.g., GANs, VAEs) based on the desired output (text, images, code); this involves writing code using Python libraries like TensorFlow or PyTorch
  • Training models on prepared data, fine-tuning hyperparameters to achieve optimal performance, then evaluating the models' outputs to assess their effectiveness and identify areas for improvement
  • Collaborating with other engineers to integrate generative AI solutions into existing systems or develop new applications, which may involve deploying models on cloud platforms for scalability
  • Staying abreast of the latest research, advancements, and ethical considerations in the rapidly evolving field of generative AI
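The build-train-evaluate loop described above looks the same regardless of architecture. As an illustration only (a real GAN or VAE would be built in TensorFlow or PyTorch, as noted in the bullets), here is that loop sketched with a toy character-level Markov generator in plain Python:

```python
import random
from collections import defaultdict, Counter

def train(corpus, order=2):
    """'Training' here is just counting character transitions."""
    model = defaultdict(Counter)
    for i in range(len(corpus) - order):
        context = corpus[i:i + order]
        model[context][corpus[i + order]] += 1
    return model

def generate(model, seed, length=40, rng=None):
    """Sample characters from the learned transition counts."""
    rng = rng or random.Random(0)  # fixed seed keeps the sketch reproducible
    out = seed
    while len(out) < length:
        context = out[-len(seed):]
        counter = model.get(context)
        if not counter:
            break
        chars, weights = zip(*counter.items())
        out += rng.choices(chars, weights=weights)[0]
    return out

# Build and "train" the model on a tiny toy corpus.
corpus = "the quick brown fox jumps over the lazy dog. " * 20
model = train(corpus, order=2)
sample = generate(model, "th", length=40)

# Evaluate: fraction of generated trigrams that also occur in the training data.
seen = {corpus[i:i + 3] for i in range(len(corpus) - 2)}
coverage = sum(sample[i:i + 3] in seen for i in range(len(sample) - 2)) / (len(sample) - 2)
print(sample, round(coverage, 2))
```

The corpus, the `order` parameter, and the trigram-coverage metric are all invented for the sketch; in practice evaluation would use task-appropriate metrics (perplexity, human review, etc.).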
The Team

Artificial Intelligence & Data Engineering

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment.

The Artificial Intelligence & Data Engineering team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

Artificial Intelligence & Data Engineering will work with our clients to:
  • Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms
  • Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
  • Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements
Qualifications

Required:
  • 3+ years of experience programming in Python or R.
  • Knowledge of Python libraries such as pandas, scikit-learn, NumPy, and NLTK
  • 3+ years of experience with Natural Language Processing (NLP) and Large Language Models (LLMs)
  • 3+ years of experience building and maintaining scalable API solutions
  • Experience working with RAG technologies and LLM frameworks (e.g., LangChain, LlamaIndex), LLM model registries (Hugging Face), LLM APIs (e.g., Anthropic's Claude), embedding models, and vector databases (FAISS, Milvus, OpenSearch, Pinecone, etc.)
  • Experience working with Retrieval Augmented Thoughts (RAT) and chain-of-thought prompting
  • Experience building scalable data models and performing complex relational database queries using SQL (Oracle, MySQL, PostgreSQL, etc.)
  • Experience working with cloud computing platforms (e.g., AWS, Azure, Google Cloud) and containerization technologies (e.g., Docker, Kubernetes).
  • Experience using tools such as Docker, Kubernetes, and Git to build and manage AI pipelines
  • Experience driving DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI solutions
  • Experience with machine learning libraries and services like TensorFlow, PyTorch, or Amazon SageMaker.
  • Experience integrating GenAI solutions on cloud platforms (e.g., AWS, Azure, Google Cloud)
  • 3+ years of experience designing solutions to address client requirements
  • 1+ years of experience with the design and implementation of automated cloud solutions (building, containerizing, and deploying end-to-end automated data and ML pipelines)
  • 3+ years of experience in developing algorithms using data science technologies to build analytical models
  • 3+ years of data extraction/manipulation experience using scripts specific to AI/ML
  • 3+ years of modeling experience using a variety of regression and supervised and unsupervised learning techniques.
  • 3+ years of experience in data wrangling/cleansing, statistical modeling, and programming
  • 3+ years of experience working in an Agile development environment
  • 3+ years of experience working fluently with both structured and unstructured data (SQL, NoSQL)
  • 3+ years of production experience with Apache Spark
  • 3+ years of hands-on experience with web APIs, CI/CD for ML, and Serverless Deployment
  • 3+ years of experience with presentation and data analysis software such as SAS, R, SPSS, MATLAB, QlikView, Excel, and Access
  • 1+ years of experience/familiarity with Linux OS and Windows servers
  • 1+ years of experience with/knowledge of Docker, Jenkins, Kubernetes, and other DevOps tools
  • Must live in a commutable distance (approximately 100-mile radius) to one of the following Delivery locations: Atlanta, GA; Charlotte, NC; Dallas, TX; Gilbert, AZ; Houston, TX; Lake Mary, FL; Mechanicsburg, PA; Philadelphia, PA; with the ability to commute to assigned location for the day, without the need for overnight accommodations
  • Expectation to co-locate in your designated Delivery location up to 30% of the time based on business needs. This may include a maximum of 10% overnight client/project travel
  • Ability to travel 10%, on average, based on the work you do and the clients and industries/sectors you serve. This may include overnight travel
  • Bachelor's degree, preferably in Computer Sciences, Information Technology, Computer Engineering, or related IT discipline; or equivalent experience
  • Must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future
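Several of the required items above (RAG technologies, embedding models, vector databases such as FAISS or Pinecone) revolve around the same retrieval step: embed the documents and the query, then return the nearest neighbors by cosine similarity. A minimal stdlib-only sketch, where hand-written 3-dimensional vectors stand in for a real embedding model:

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Toy pre-computed embeddings; a real pipeline would call an embedding model.
docs = {
    "refund policy":  (0.9, 0.1, 0.0),
    "shipping times": (0.1, 0.9, 0.2),
    "account login":  (0.0, 0.2, 0.9),
}

def retrieve(query_vec, k=2):
    """Nearest-neighbor search: what FAISS/Milvus/Pinecone do at scale."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

# Vector standing in for the embedding of a question about refunds; the
# retrieved passages would then be inserted into the LLM prompt (the "A" in RAG).
query = (0.8, 0.2, 0.1)
print(retrieve(query))
```

The document names and vectors are invented for illustration; the point is only the shape of the retrieval step, not any particular vector database's API.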
Preferred:
  • Previous Government Consulting and/or professional services experience
  • In depth understanding of AI protocols and standards
  • Understanding of technology risks and the ability to assess and mitigate them
  • Deep knowledge of a specific domain or industry, with a focus on applying NLP/LLM solutions in that context
  • Experience with debugging and troubleshooting software or solutions design issues
  • Proven ability to stay current with best practices and new technology solutions in the field
  • Ability to display both breadth and depth of knowledge regarding functional and technical issues
  • Experience presenting and selling ideas to various audiences (technical and non-technical), including clients and other decision makers
  • Certification from any of the three major cloud platforms (AWS / Azure / GCP) in Cloud Architecture / Engineering / DevOps / ML.
  • Familiarity with Kubeflow or MLflow
  • Experience with machine learning pipelines (Azure ML)
  • Familiarity with the latest Natural Language Processing or Computer Vision related algorithms
Information for applicants with a need for accommodation: https://www2.deloitte.com/us/en/pages/careers/articles/join-deloitte-assistance-for-disabled-applicants.html

Recruiting tips

From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits

At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture

Our diverse, equitable, and inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Learn more about our inclusive culture.

Our purpose

Deloitte's purpose is to make an impact that matters for our clients, our people, and in our communities. We are creating trust and confidence in a more equitable society. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. We are focusing our collective efforts to advance sustainability, equity, and trust that come to life through our core commitments. Learn more about Deloitte's purpose, commitments, and impact.

Professional development

From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.