Improving
Sr. Data Engineer
Improving, Minneapolis, Minnesota, United States, 55400
Sr. Data Engineer (Cloud)
Description
As a Data Engineer, you'll be working with the latest cloud and technology stacks to help clients implement and mature their modern data architecture. You will work with tools and platforms like Kafka, Snowflake, and Databricks to help clients get the most out of their data, which may live in systems like Salesforce, SAP, SQL Server, or file storage. With a variety of projects, technologies, and clients, you will constantly be growing and never be bored.
Responsibilities
At least 4 years of experience as a hands-on software or data engineer
At least 1-2 years building production-grade data solutions (e.g., ETL/ELT, Spark, Azure Data Factory, AWS Data Migration Services, streaming systems)
Demonstrated aptitude for problem-solving and creativity
Ability to learn new technologies and apply learnings to production-grade solutions
Experience with at least one prominent cloud provider (e.g., AWS, Azure, GCP)
Strong working knowledge of a querying language like SQL
Understanding of CI/CD, automated testing, and the DevOps culture
Effectively communicate complex technical solutions to a variety of audiences through oral and written mediums
Qualifications and Skills
Production experience with at least one distributed data system like Snowflake, Databricks, Cassandra, DynamoDB, Elastic, or Hadoop
Production experience with at least one messaging technology like Kafka, Kinesis, Pulsar, or RabbitMQ
Certification on at least one relevant platform/tool (AWS, Azure, GCP, Snowflake, Databricks, Spark)
Ability to translate business needs into optimized and efficient data models in SQL or NoSQL
Experience with service frameworks such as Spring Boot, Ratpack, Vert.x, or Play
Knowledge of data analytics, visualization, and governance
Experience working in an agile development framework like Scrum or Kanban