TechWish
Data Architect
TechWish, Chicago, Illinois, United States, 60290
Data Architect
Must be local to Chicago. Very strong Snowflake skills are required.
Responsibilities:
- Work on a Snowflake data warehouse (DW) design and implementation.
- Design and deploy services to AWS using Terraform and CloudFormation stacks.
- Design CI/CD pipelines for applications and deploy them.
- Design near real-time data pipelines using the Kafka streaming service.

Requirements:
- Exposure to Amazon AWS or another cloud provider
- Experience with business intelligence tools such as Tableau, ThoughtSpot, Power BI, and/or Looker
- Familiarity with data warehousing platforms and data pipeline tools such as Redshift, Snowflake, SQL Server, etc.
- Passionate about programming and learning new technologies; focused on helping yourself and the team improve skills
- Effective problem-solving and analytical skills; ability to manage multiple projects and reports simultaneously across different stakeholders
- Rigorous attention to detail and accuracy
- Aware of and motivated by driving business value
- Experience with large-scale enterprise applications using open-source big data solutions such as Spark, Kafka, Elasticsearch/Solr, Hadoop, and HBase
- Experience with or knowledge of basic programming and database technologies (SQL, Python, Cassandra, PostgreSQL, AWS Aurora, AWS RDS, MongoDB, Redis, Couchbase, Oracle, MySQL, Teradata)