Infojini Consulting
Data Warehouse Analyst Advanced
Infojini Consulting, Tallahassee, FL
Scope of Services
Responsible for gathering and assessing business information needs and preparing system requirements. Performs analysis, development, and evaluation of data mining in the Snowflake cloud data warehouse environment, which includes data design, database architecture, and data management utilizing Informatica PowerCenter, Intelligent Data Quality, and Enterprise Data Catalog. Uses data mining and data analysis tools, including Power BI and Tableau. Reviews and validates data loaded into the data warehouse for accuracy. Interacts with the user community to produce reporting requirements. Provides technical consulting to users of the various data stores and advises users on conflicts and inappropriate data usage. Responsible for prototyping and building solutions, preparing test scripts, and conducting tests for Snowflake data population, including data replication, extraction, loading, cleansing, and transformation. Responsible for enterprise data modeling. Maintains knowledge of software tools, languages, scripts, and shells that effectively support the Snowflake cloud data warehouse environment.
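For illustration only, the sketch below shows the kind of Snowflake data population work described above (loading a staged extract, then cleansing and merging it into a warehouse table). It assumes the snowflake-connector-python package; the account, stage, and table names are hypothetical placeholders, and in practice this work would typically run through Informatica PowerCenter jobs rather than hand-written scripts.

# Minimal sketch of a Snowflake load-and-transform step, assuming the
# snowflake-connector-python package. Connection parameters, stage, and
# table names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",   # hypothetical account identifier
    user="example_user",
    password="example_password",
    warehouse="ANALYTICS_WH",
    database="EDW",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load raw extracts from an external stage into a staging table.
    cur.execute("""
        COPY INTO STAGING.CUSTOMER_RAW
        FROM @EDW.STAGING.CUSTOMER_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    # Cleanse the staged rows and merge them into a warehouse dimension table.
    cur.execute("""
        MERGE INTO EDW.DW.DIM_CUSTOMER AS tgt
        USING (
            SELECT CUSTOMER_ID,
                   TRIM(CUSTOMER_NAME) AS CUSTOMER_NAME,
                   UPPER(STATE) AS STATE
            FROM STAGING.CUSTOMER_RAW
        ) AS src
        ON tgt.CUSTOMER_ID = src.CUSTOMER_ID
        WHEN MATCHED THEN UPDATE SET
            tgt.CUSTOMER_NAME = src.CUSTOMER_NAME,
            tgt.STATE = src.STATE
        WHEN NOT MATCHED THEN INSERT (CUSTOMER_ID, CUSTOMER_NAME, STATE)
            VALUES (src.CUSTOMER_ID, src.CUSTOMER_NAME, src.STATE)
    """)
finally:
    conn.close()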
Education
Bachelor's Degree in Computer Science, Information Systems, or related field is required. Equivalent work experience may substitute for the degree requirement.
Experience
A minimum of 7 years of IT work experience with data management tools, business intelligence tools, and data warehousing is required. Extensive experience with Informatica PowerCenter and Snowflake is preferred.
Primary Job Duties/Tasks
1. Participates in strategic planning for the Snowflake enterprise data warehouse
2. Analyzes transactional data stores and develops data warehouse models to optimize the warehouse data stores for reporting and analytics
3. Prototypes, builds, and tests extraction, transformation, and load (ETL or ELT) jobs
4. Prototypes, builds, and tests data quality processes and related jobs (illustrated in the sketch following this list)
5. Ensures data warehouse metadata is collected and maintained
6. Prototypes, builds, and tests Power BI reports and dashboards
7. Prototypes, builds, and tests Tableau reports and dashboards
8. Coaches and mentors peers in data warehousing concepts and the use of the tools employed to analyze data, design the warehouse models, and populate the warehouse
9. Assists with the development and maintenance of methods and practices documentation
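As an illustration of duty 4, the sketch below shows the kind of completeness and validity rules a data quality job might enforce against a staging table. In this role such rules would normally be built in Informatica Intelligent Data Quality; the connection parameters, table, and column names here are hypothetical.

# Minimal illustration of data quality rules run against a hypothetical
# staging table, using the snowflake-connector-python package. Actual rules
# for this role would be built in Informatica Intelligent Data Quality.
import snowflake.connector

QUALITY_CHECKS = {
    # Rows missing the natural key cannot be merged into the warehouse.
    "null_customer_id":
        "SELECT COUNT(*) FROM STAGING.CUSTOMER_RAW WHERE CUSTOMER_ID IS NULL",
    # State codes should be two uppercase letters after cleansing.
    "bad_state_code":
        "SELECT COUNT(*) FROM STAGING.CUSTOMER_RAW "
        "WHERE STATE IS NOT NULL AND NOT REGEXP_LIKE(STATE, '^[A-Z]{2}$')",
}

def run_quality_checks(conn):
    """Run each check and return the count of offending rows per rule."""
    results = {}
    cur = conn.cursor()
    for name, sql in QUALITY_CHECKS.items():
        cur.execute(sql)
        results[name] = cur.fetchone()[0]
    return results

if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account="example_account",   # hypothetical connection parameters
        user="example_user",
        password="example_password",
        warehouse="ANALYTICS_WH",
        database="EDW",
    )
    try:
        failures = {k: v for k, v in run_quality_checks(conn).items() if v > 0}
        if failures:
            raise SystemExit(f"Data quality checks failed: {failures}")
    finally:
        conn.close()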
Specific Knowledge, Skills, and Abilities
1. Extensive knowledge of data warehouse and data mart concepts
2. Ability to model transactional data for data warehousing usage
3. Knowledge of and skill in Snowflake cloud data warehouse functionality
4. Extensive knowledge of and expert skill in Informatica PowerCenter functionality
5. Knowledge of and skill in Informatica Intelligent Data Quality functionality
6. Knowledge of and skill in Informatica Enterprise Data Catalog functionality
7. Knowledge of and skill in Power BI functionality
8. Knowledge of and skill in Tableau functionality
9. Knowledge of and skill in relational database platforms, including DB2, SQL Server, and Oracle