Wavicle Data Solutions
Sr. Data Engineer
Wavicle Data Solutions, Chicago, Illinois, United States, 60290
About the Role
We are looking for a Senior Data Engineer with strong hands-on experience in Python development, including PySpark, in an AWS Cloud environment. The ideal engineer will have AWS Cloud data-integration experience with Apache Spark, EMR, Glue, Kafka, Kinesis, and Lambda across S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems.
Responsibilities:
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of sources (Hadoop, Spark, AWS Lambda, etc.).
- Design, develop, test, deploy, maintain, and improve data integration pipelines.
- Develop pipeline objects using Apache Spark / PySpark / Python or Scala.
- Design and develop data pipeline architectures using Hadoop, Spark, and related AWS services.
- Load- and performance-test data pipelines built using the above technologies.
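For illustration, a minimal extract-transform-load step of the kind described in the responsibilities above can be sketched in plain Python. The field names and sample data here are hypothetical; a production pipeline would typically run this logic as a PySpark or AWS Glue job reading from S3 rather than using the standard library:

```python
import csv
import io

def transform_row(row):
    # Transform: normalize a raw record by trimming whitespace
    # and casting the amount field to a float.
    return {
        "customer_id": row["customer_id"].strip(),
        "amount": round(float(row["amount"]), 2),
    }

def run_etl(raw_csv_text):
    # Extract: parse CSV from an in-memory source
    # (stands in for S3, RDS, or a Kinesis stream).
    reader = csv.DictReader(io.StringIO(raw_csv_text))
    # Transform: clean each record.
    cleaned = [transform_row(r) for r in reader]
    # Load: here we simply return the records; a real job would
    # write them to Redshift, S3, or DynamoDB.
    return cleaned

raw = "customer_id,amount\n  c-001 ,19.99\nc-002,5.0\n"
print(run_etl(raw))
```

The same extract/transform/load shape carries over directly to PySpark, where the list comprehension becomes a DataFrame transformation and the load step becomes a writer call.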
Required Knowledge and Level of Experience:
- 5+ years of professional experience building data pipelines, preferably with a strategic/management consulting organization.
- 5+ years of professional experience with AWS and Python programming; experience with Python frameworks (e.g., Django, Flask, Bottle) is required.
- Expert-level SQL knowledge, with the ability to write complex, highly optimized queries across large volumes of data.
- 2+ years of hands-on professional experience implementing ETL pipelines using AWS services such as Glue, Lambda, EMR, Athena, S3, SNS, Kinesis, Data Pipeline, and PySpark is required.
- 2+ years of hands-on professional experience developing ETL pipelines using Scala, Python, R, or Java is required.
- Hands-on experience building logical and physical data models using tools such as Erwin, Enterprise Architect, or Visio is required.
- Knowledge of or experience with architectural best practices for building data lakes.
- Strong problem-solving and troubleshooting skills, with the ability to exercise mature judgment.
- Strong written and verbal communication skills.
- Experience building strong relationships with client leaders and participating in pre-sales proposal activities is highly desired.
- Bachelor's or Master's degree in Computer Science, Engineering, or an equivalent field is required.
- Must be open to travel to the client location, as required by the client engagement.