Tekfortune Inc

Sr Data Engineer

Tekfortune Inc, Edina, Minnesota, United States


Tekfortune is a fast-growing consulting firm specializing in permanent, contract, and project-based staffing services for the world's leading organizations across a broad range of industries. In this quickly changing economic landscape, virtual recruiting and remote work are critical to the future of work. To support active project demands and close skills gaps, our staffing experts can help you find the job that best fits you.

Role: Sr Data Engineer
Location: 100% remote
Duration: 6-12 month contract-to-hire (C2H)

Required Skills (top requirements):
• 5+ years of data engineering experience developing large data pipelines
• Strong fundamental Scala and Python programming skills
• Basic understanding of AWS or other cloud provider resources (S3)
• Strong SQL skills and the ability to create queries to analyze complex datasets
• Hands-on production environment experience with distributed processing systems such as Spark
• Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines (see the sketch after this list)
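The top requirements above pair hands-on Spark work with Airflow orchestration. Purely as an illustration of how the two fit together, here is a minimal Airflow DAG (Python, Airflow 2.4+ style) that submits a Spark batch job on a daily schedule; the DAG name, schedule, and job path are hypothetical and not taken from this posting, and a Databricks- or SparkSubmit-specific operator could stand in for the shell call.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Minimal illustrative DAG: one task that submits a Spark batch job.
# dag_id, schedule, and the job path are hypothetical placeholders.
with DAG(
    dag_id="example_daily_events_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ keyword; older 2.x versions use schedule_interval
    catchup=False,
) as dag:
    spark_ingest = BashOperator(
        task_id="spark_ingest",
        # A Databricks or SparkSubmit operator could replace this shell call.
        bash_command="spark-submit --deploy-mode cluster /jobs/ingest_events.py",
    )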

Job Description:

Responsibilities

Focus on major areas of work (typically 20% or more of the role):
• Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines in Scala and Python/Spark while maintaining strict uptime SLAs
• Extend the functionality of current Core Data platform offerings, including metadata parsing, extending the Metastore API, and building new integrations with APIs both internal and external to the Data organization
• Implement ingestion of new batch and streaming data pipelines using Scala, Databricks, and Airflow (a minimal sketch follows this list)
• Implement the Lakehouse architecture, working with customers, partners, and stakeholders to shift towards a Lakehouse-centric data platform
• Implement shared libraries in Scala and Python that abstract complex business logic, enabling consistent functionality across all data pipelines in the Data organization
• Work with a tech stack that includes Airflow, Spark, Databricks, Delta Lake, Snowflake, Scala, and Python
• Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
• Contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, partitioning strategies, and more
• Ensure high operational efficiency and quality of Core Data platform datasets so that our solutions meet SLAs and project reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams)
• Be an active participant in and advocate of agile/scrum ceremonies, collaborating to improve processes for our team
• Engage with and understand our customers, forming relationships that allow us to understand and prioritize both innovative new offerings and incremental platform improvements
• Maintain detailed documentation of your work and changes to support data quality and data governance requirements
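As a rough, hypothetical sketch of the batch-ingestion and Lakehouse work described above, the PySpark snippet below lands a raw batch into a partitioned Delta Lake table. The bucket names, paths, and column names are assumptions made for illustration, and a Delta-enabled Spark environment (for example, a Databricks cluster) is assumed.

from pyspark.sql import SparkSession, functions as F

# Assumes a Spark environment with Delta Lake libraries configured
# (e.g., a Databricks cluster). All paths and columns are hypothetical.
spark = SparkSession.builder.appName("core_data_ingest_sketch").getOrCreate()

# Read a raw daily batch landed in S3 (path is an assumption).
raw = spark.read.json("s3://example-raw-bucket/events/2024-01-01/")

# Light standardization: typed timestamp, derived partition column,
# and an ingest timestamp to support lineage and data governance.
cleaned = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
    .withColumn("event_date", F.to_date("event_ts"))
    .withColumn("ingested_at", F.current_timestamp())
)

# Append to a partitioned Delta table, the Lakehouse storage format in the stack.
(
    cleaned.write.format("delta")
    .mode("append")
    .partitionBy("event_date")
    .save("s3://example-lake-bucket/bronze/events/")
)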

Basic Qualifications:
• 5+ years of data engineering experience developing large data pipelines
• Strong algorithmic problem-solving expertise
• Strong fundamental Scala and Python programming skills
• Basic understanding of AWS or other cloud provider resources (S3)
• Strong SQL skills and the ability to create queries to analyze complex datasets
• Hands-on production environment experience with distributed processing systems such as Spark
• Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
• Some scripting language experience
• Willingness and ability to learn and pick up new skill sets
• Self-starting problem solver with an eye for detail and excellent analytical and communication skills

Preferred Qualifications:
• Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Redshift, BigQuery)
• Experience developing APIs with GraphQL
• Deep understanding of AWS or other cloud providers, as well as infrastructure as code
• Familiarity with data modeling techniques and data warehousing standard methodologies and practices
• Familiarity with Scrum and Agile methodologies

Required Education: STEM Bachelor's or Master's degree

For more information and other available jobs, please contact our recruitment team at careers@tekfortune.com. To view all the jobs available in the USA and Asia, please visit our website at https://www.tekfortune.com/careers/.