Robert Half is hiring: Data Engineer in Minneapolis
Robert Half, Minneapolis, MN, US
Job Description
We are offering an exciting opportunity for a Data Engineer in the non-profit sector. Located in Minneapolis, Minnesota, this role involves working within a team to drive the development and optimization of our data systems.
Responsibilities:
• Designing and constructing highly scalable data processing systems.
• Utilizing Apache Kafka, Apache Pig, and Apache Spark to manage data flow and processing.
• Implementing algorithms and data visualization strategies.
• Developing APIs and utilizing AWS technologies for cloud-based solutions.
• Employing SQL for database management and optimization.
• Building and maintaining ETL (extract, transform, load) processes.
• Ensuring data quality and security through effective data governance frameworks.
• Utilizing data modeling tools such as ERwin, ER/Studio, PowerDesigner, or similar.
• Managing and cataloging data with metadata management tools.
• Maintaining a solid understanding of relational database concepts and designing optimized database schemas and indexes.
Requirements:
• Minimum of 5 years of experience as a Data Engineer or in a similar role.
• Proficiency with Apache Kafka, Apache Pig, and Apache Spark.
• Solid understanding and experience with Cloud Technologies.
• Expertise in Data Visualization and Algorithm Implementation.
• Strong background in Analytics and Apache Hadoop.
• Experience in API Development and AWS Technologies.
• Proficiency in SQL.
• Demonstrable experience with ETL processes.
• Strong ability in Data Modeling.
• Prior experience working in the non-profit industry is an advantage.
• Excellent problem-solving skills and attention to detail.
• Ability to work independently and as part of a team.
• Excellent written and verbal communication skills.