Intelliswift

Data Engineer II

Intelliswift, Austin, Texas, US, 78716


Job ID: 24-05160

Pay rate range: $55/hr. to $57/hr. on W2

Austin, TX - Hybrid, 3 days in office

Must Have

- AWS Cloud services
- Data Engineering
- Python or Scala

REQUIRED SKILLS

- Looking for a Sr. Data Engineer
- Very comfortable with data modeling, data warehousing, ETL, and scripting; orchestration, data pipelines, taxonomy
- Python, Scala or Spark, SQL
- Looking for someone who is already experienced with AWS cloud services such as S3, Lambda, Glue, EMR, Lake Formation, Step Functions, EC2
- A minimum of 2 years of experience with AWS services is a must

Years of Experience:

3 - 5+ years of experience

Degree or Certification:

- BS in Computer Science, Statistics, or Engineering
- Master's preferred
- Data Engineer certification
- AWS certification

Key job responsibilities

As a Data Engineer, your pivotal role will revolve around designing, building, and maintaining robust data pipelines and infrastructure to drive data-driven insights.

- Core responsibilities include architecting scalable and fault-tolerant pipelines to extract, transform, and load data from various sources into data warehouses or lakes.
- Leveraging cutting-edge tools and technologies, you'll implement ingestion, transformation, and integration processes, ensuring stringent data quality through validation, cleansing, and deduplication.
- You'll design and implement high-performance, scalable, and cost-efficient data storage solutions. Utilizing CI/CD principles, you'll automate infrastructure deployment and maintenance processes, ensuring a streamlined data ecosystem.
- Data modeling and engineering will be crucial, involving designing and implementing optimized data models, developing and maintaining ETL processes, and collaborating with analysts and functional managers to understand data needs and provide tailored data structures.
- Ensuring data governance, security, and compliance will be paramount. You'll implement robust policies and processes, enforce access controls, encryption, and auditing mechanisms, and collaborate cross-functionally to establish and maintain data governance standards.
- Continuous improvement and automation will be woven into your daily activities: continuously evaluating and optimizing processes, pipelines, and infrastructure, implementing automation and monitoring tools, and staying ahead of emerging data engineering technologies and trends.
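For illustration only (not part of the role description), the kind of extract-transform-load step described above might look like the following minimal PySpark sketch: read raw records, apply basic validation and deduplication, and write a curated, partitioned table. All bucket paths and column names here are hypothetical placeholders, not systems referenced by this posting.

```python
# Minimal ETL sketch: extract raw events, validate and deduplicate, load curated data.
# Paths and column names (event_id, event_time) are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: load raw JSON events from a source location (e.g. an S3 prefix).
raw = spark.read.json("s3://example-bucket/raw/events/")

# Transform: validate required fields, normalize the timestamp,
# and deduplicate on a business key.
cleaned = (
    raw
    .filter(F.col("event_id").isNotNull() & F.col("event_time").isNotNull())
    .withColumn("event_time", F.to_timestamp("event_time"))
    .dropDuplicates(["event_id"])
)

# Load: write the curated data in a columnar format, partitioned by date.
(
    cleaned
    .withColumn("event_date", F.to_date("event_time"))
    .write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-bucket/curated/events/")
)
```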

BASIC QUALIFICATIONS

- 3+ years of data engineering experience with data modeling, warehousing, and building ETL pipelines
- Knowledge of professional software engineering and best practices for the full software development life cycle, including coding standards, software architectures, code reviews, source control management, continuous deployments, testing, and operational excellence
- Knowledge of distributed systems as it pertains to data storage and computing
- Experience with SQL
- A good candidate can partner with business owners directly to understand their requirements and provide data that can help them observe patterns and spot anomalies.
- Bachelor's degree in computer science, engineering, analytics, mathematics, statistics, IT, or equivalent

PREFERRED QUALIFICATIONS

- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases/data stores (object storage, document or key-value stores, graph databases, column-family databases)
- Master's degree in computer science, engineering, analytics, mathematics, statistics, IT, or equivalent
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS
- Experience building world-class data environments with a high emphasis on data quality and data security
- Strong problem-solving and analytical skills, with the ability to translate business requirements into technical solutions