InterEx Group

Data Engineer

InterEx Group, Raleigh, NC, United States


Job Summary

We are seeking a skilled and motivated Data Engineer to join a specialty insurance firm. In this role, you will build, maintain, and optimize the data infrastructure and pipelines that support advanced analytics, risk modeling, and data-driven decision-making.

Key Responsibilities

  • Data Pipeline Development: Design, build, and maintain scalable data pipelines to efficiently process, transform, and load data from multiple sources into our data lake or data warehouse.
  • Data Integration: Integrate diverse data sources such as claims data, underwriting data, and third-party data for comprehensive analytics and reporting.
  • Data Quality & Governance: Implement robust data quality checks and governance standards to ensure data accuracy, consistency, and compliance with industry regulations.
  • Database Management: Develop, optimize, and manage data storage solutions using cloud-based platforms (e.g., AWS, Azure, or GCP) and relational or NoSQL databases.
  • Collaboration with Stakeholders: Work closely with data scientists, actuaries, and business analysts to understand data requirements and support analytics initiatives.
  • Automation: Automate repetitive data workflows, ensuring timely data availability and minimizing manual interventions.
  • Data Security: Maintain data security protocols and collaborate with IT to safeguard sensitive insurance data in line with regulatory standards.
  • Documentation: Document data processes, pipeline configurations, and database designs to ensure maintainability and transparency.

Qualifications

  • Education: Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or related field.
  • Experience: 3+ years of experience as a Data Engineer, preferably within the insurance or financial services sector.
  • Technical Skills:
      • Proficiency in SQL, Python, and data pipeline tools (e.g., Apache Spark, Kafka, Airflow).
      • Experience with cloud platforms such as AWS (Redshift, S3, Glue), Azure (Synapse, Data Factory), or Google Cloud (BigQuery, Dataflow).
      • Familiarity with ETL tools (e.g., Informatica, Talend, Alteryx) and data warehousing concepts.
      • Knowledge of data modeling, data lakes, and data warehouse architecture.
      • Strong understanding of API integrations and data streaming technologies.
  • Industry Knowledge: Experience in or knowledge of the insurance sector, particularly with claims, underwriting, and actuarial data, is a plus.
  • Analytical Skills: Ability to work with large datasets and deliver meaningful insights and solutions for business stakeholders.
  • Communication: Excellent written and verbal communication skills to explain complex technical concepts to non-technical teams.

Nice to Have

  • Experience with machine learning frameworks and deployment of models in production.
  • Knowledge of data visualization tools (e.g., Tableau, Power BI) to create actionable dashboards.
  • Cloud certification (e.g., AWS Certified Data Analytics – Specialty, Google Cloud Professional Data Engineer).