Minnesota Timberwolves
Data Scientist
Minnesota Timberwolves, Minneapolis, Minnesota, United States, 55400
Minnesota Timberwolves Basketball, LP seeks a Data Scientist in Minneapolis, MN to develop and integrate predictive models into our tech ecosystem, introduce and maintain AI, ML, and neural network assets, evolve a dynamic pricing model, and optimize queries against custom data sources. These technical objectives are paired with the need to apply advanced quantitative and qualitative analytical methodologies to deliver actionable insights and solutions to complex business challenges. The Data Scientist will consult with internal stakeholders and lead complex data systems and modeling initiatives to accelerate organizational objectives while enabling data-driven decisions across the company.
The specific duties and responsibilities include:
- Develop and train internal Python-based predictive statistical models to identify opportunities, mitigate risk, and drive business objectives;
- Design, implement, and maintain AI/ML workloads, custom SQL data sources, and a GitHub repository for all business, operational, and performance metrics;
- Execute ETL workflows and leverage Power Apps to ingest data into a single, consistent data source;
- Create, test, and grow an artificial neural network (ANN) to identify hidden patterns and correlations in raw data;
- Partner with B.I. developers, analysts, and the data warehouse vendor to acquire, structure, and manage data from platforms and applications across the dataverse;
- Own data governance across multiple domains by using relational database management languages (SQL, Python, R, etc.) to audit and scrub all reporting- and model-centric schemas;
- Lead the integration of new business domains and their respective data into the data warehouse and the relevant reporting portfolio;
- Be a resource to the business by leveraging comprehensive knowledge of the professional sports industry, paired with the technical output of models, AI/ML, and various analytic methodologies, to influence dynamic pricing strategy, marketing campaigns, lead generation, etc.
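For candidates unfamiliar with the modeling side of the role, the first duty above can be illustrated with a toy sketch. This is purely hypothetical: the data is synthetic, and it assumes nothing about the team's actual tooling or metrics. It only shows the general shape of a Python-based predictive statistical model.

```python
import math
import random

random.seed(0)

# Synthetic stand-in for warehouse metrics: 3 features per row,
# label = 1 when the feature sum is positive (a made-up rule).
X = [[random.uniform(-1.0, 1.0) for _ in range(3)] for _ in range(200)]
y = [1 if sum(row) > 0 else 0 for row in X]

def sigmoid(z: float) -> float:
    z = max(-60.0, min(60.0, z))  # clamp to avoid math.exp overflow
    return 1.0 / (1.0 + math.exp(-z))

# Logistic regression trained with plain stochastic gradient descent.
w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.5
for _ in range(200):
    for xi, yi in zip(X, y):
        err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
        w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
        b -= lr * err

# Training accuracy on the linearly separable synthetic data.
preds = [sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) > 0.5 for xi in X]
accuracy = sum(int(p) == t for p, t in zip(preds, y)) / len(y)
```

In practice this kind of model would be trained on real business and performance metrics drawn from the data warehouse, with a proper library rather than hand-rolled gradient descent.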
We require a Bachelor’s degree in Information Systems, Data Science, Computational Engineering, or a closely related field. We also require three years of experience in SQL programming for relational database management and in integrating data across SAS Viya and SQL Server Reporting Services (SSRS), at least one year of which must include data integration across Tableau. We additionally require at least one year of experience with R/Python, ETL workflows and data pipelining, workflow automation scripting, AI/machine learning, SSRS in a Microsoft environment, and SQL within Snowflake.