Tech Data Analyst Job at eTeam in San Jose
eTeam, San Jose, CA, United States
Role: Tech Data Analyst
Location: San Jose, CA
Duration: 6+ Months
The Opportunity
We are looking for a Technical Data Analyst with hands-on experience migrating Sales users and Sales data (Accounts, Opportunities, Contacts) from Client.com to Dynamics 365. The candidate should also have strong experience in data profiling, data integration, and transformations, and should understand how data impacts downstream systems such as Sales and Finance data warehouses/data lakes and reporting.
What You'll Do
- Work with the client's internal data teams and business teams to decipher the data-related requirements for the project
- Bring hands-on experience with SFDC-to-Dynamics field mappings and data models (see the sketch after this list)
- Design an end-to-end strategy to stitch together objects from various systems and financial KPIs
- Build integrations between SFDC, Dynamics, SAP ECC, and Databricks (DBX) for Power BI reporting and financial metrics
- Leverage data sources across the enterprise to build sophisticated and insightful analyses and data models for Sales, Finance and Marketing
- Bring hands-on experience migrating Marketing and Sales data and users from Client.com to Dynamics
- Work with the Product Managers to build detailed data requirements/specifications for Engineering teams to build the solution in downstream data management and reporting systems.
- Apply migration lessons learned from similar experiences and build creative solutions for migrating data from SFDC to Dynamics
- Consolidate requirements and suggest building new reporting capabilities for analysis using advanced BI techniques and tools.
- Proactively collaborate with various product managers to bring a perspective on all data we work on.
- Conduct QA testing and validation and provide input to the Engineering teams, along with the PdMs
- Support release planning, scheduling backlog items into regular releases aligned to business priority, while working with the PdMs
- Support production cutover and production acceptance testing
- Support post-go-live sessions with the business, and address and drive technical issues raised during Hyper Care, together with the PdMs and Engineering teams
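To make the field-mapping responsibility above concrete, here is a minimal PySpark sketch, assuming a Databricks-style environment. The source path, the SFDC field names, and the Dynamics attribute names shown are hypothetical placeholders, not the project's actual mapping; the point is mapping by API/schema field names rather than display labels, plus a quick profiling pass before the data feeds downstream systems.

```python
# Illustrative sketch only: hypothetical SFDC Account fields mapped to
# hypothetical Dynamics 365 attribute names. Real mappings come from the
# project's data-model and field-mapping workshops.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sfdc_to_dynamics_accounts").getOrCreate()

# Field (not label) mapping: SFDC API field name -> Dynamics schema name
ACCOUNT_FIELD_MAP = {
    "Id": "sfdc_sourceid",
    "Name": "name",
    "BillingCountry": "address1_country",
    "OwnerId": "ownerid_sourceid",
    "AnnualRevenue": "revenue",
}

# Hypothetical landing path for the extracted SFDC Accounts
sfdc_accounts = spark.read.parquet("/mnt/raw/sfdc/accounts")

# Keep only mapped fields and rename them to the Dynamics schema names
dynamics_accounts = sfdc_accounts.select(
    [sfdc_accounts[src].alias(dst) for src, dst in ACCOUNT_FIELD_MAP.items()]
)

# Simple profiling checks before handing off to downstream Sales/Finance systems
dynamics_accounts.groupBy("address1_country").count().show()
print("Accounts with null names:", dynamics_accounts.filter("name IS NULL").count())
```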
Qualifications
- Requires a bachelor's degree. Preferred candidates will have a major in computer science, an MBA from a reputable institution, or equivalent experience.
- 4+ years of data analytics, 'data BSA', or data product management experience, with a solid understanding of how to deliver data solutions in an agile environment.
- Strong proficiency in SQL/SparkSQL/Python to query and manipulate large data sets. Experience with platforms like Databricks, Power BI and Tableau.
- You are a self-starter, independent, hard worker, with a high degree of motivation to go above and beyond the task at hand. You anticipate and creatively implement next steps in a complex environment.
- You have mastered the ability to influence outcomes and to navigate and mediate to consensus with integrity. You possess great interpersonal communication, presentation, and social skills, and a solid sense of humor.
- Data requirement writing skills: collecting, prioritizing, and gathering input from multiple sources, providing accurate requirements with attention to detail.
- You already know or can rapidly learn enterprise application capabilities in order to deliver transaction- and event-driven data solutions (examples: SAP/HANA, MS Dynamics or Client Data, ADLS/Hadoop/Databricks data lake/lakehouse solutions, and/or Kafka streams)
Few pointers:
Migration of users and data from SF (Salesforce) to MSD (Microsoft Dynamics). The candidate should be data-savvy and understand field mapping (not label mapping), and how this data feeds downstream systems. Much of the data is Finance-related, so the analysis must cover how it affects SAP ECC; hands-on data analysis is required.
Databricks, Power BI, SQL, and Python knowledge.
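As a rough illustration of the Databricks/SQL/Python profiling work described above, here is a minimal SparkSQL sketch. The table name (migrated.opportunities) and column names are hypothetical placeholders used only to show the kind of check run on Finance-related data before it feeds downstream reporting such as Power BI models.

```python
# Illustrative sketch only: a quick SparkSQL profiling pass on a hypothetical
# migrated Opportunities table before it feeds Finance reporting downstream.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("opportunity_profiling").getOrCreate()

profile = spark.sql("""
    SELECT
        currency_code,                                      -- hypothetical column
        COUNT(*)                                            AS opportunity_count,
        SUM(amount)                                         AS total_amount,
        SUM(CASE WHEN amount IS NULL THEN 1 ELSE 0 END)     AS missing_amounts
    FROM migrated.opportunities
    GROUP BY currency_code
""")

profile.show()
```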