Technogen International Company, Chicago, IL
Job Title: Data Engineer 3 (Python & AWS)
Request #39305-1
Location: Chicago, IL office preferred; open to Peoria, IL. Hybrid role, on-site 1-2 times a week.
Duration: 12 months
Position's Contributions to Work Group:
Python and AWS development to build, enhance, and maintain monitoring capabilities.
Typical task breakdown:
- Develop datasets and microservices in support of new monitoring solutions and automations to accelerate issue detection and correction.
- Transformation of data to enable machine learning and/or monitoring of anomalies.
- Consumption of metadata to enable anomaly alarms and monitoring.
- Perform other job duties as assigned by Caterpillar management from time to time.
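To illustrate the kind of serverless Python work the tasks above describe, here is a minimal sketch of a Lambda-style handler that consumes metric records and flags anomalies. This is an illustration only: the event shape follows the standard AWS SQS-to-Lambda format, but the metric fields, names, and threshold are hypothetical and not taken from this posting.

```python
import json

# Hypothetical threshold; real alarm rules would be configuration-driven.
ANOMALY_THRESHOLD = 3.0

def handler(event, context=None):
    """Lambda-style entry point: scan SQS-shaped records for anomalies.

    `event` follows the AWS SQS event shape ({"Records": [{"body": ...}]}),
    where each body is assumed to be a JSON metric such as
    {"metric": "latency_ms", "zscore": 4.2}.
    """
    alarms = []
    for record in event.get("Records", []):
        metric = json.loads(record["body"])
        # Flag any metric whose z-score exceeds the threshold in magnitude.
        if abs(metric.get("zscore", 0.0)) >= ANOMALY_THRESHOLD:
            alarms.append({"metric": metric["metric"],
                           "zscore": metric["zscore"]})
    return {"alarm_count": len(alarms), "alarms": alarms}
```

A unit test for such a handler (per the testing requirement below) would feed a synthetic event and assert on the returned alarm count, with no AWS resources involved.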
Interaction with team:
- Liaise with designers, engineers, and support teams to improve data pipeline performance & reliability.
Work environment:
- Work independently and collaborate with internal and cross-functional teams via Teams meetings, chat, and/or email.
Candidate Requirements
Education & Experience Required:
- Bachelor's degree in computer programming or a relevant field with 5-7 years' experience; or
- Master's degree or higher with 4+ years' experience; or
- Associate degree with a minimum of 10 years of experience in this field.
Technical Skills (Required)
• Experience with serverless design in AWS
• Four years or more of experience in data engineering and/or software development.
• Three years or more of experience with development, operations, or architecture in AWS using Python.
• Experience with development and delivery of microservices using serverless AWS services (S3, RDS, Aurora, DynamoDB, Lambda, SNS, SQS, Kinesis, IAM)
• Strong competency in testing, including unit testing to coverage standards and integration testing.
(Desired)
• Software development experience with object-oriented development and design patterns
• AWS technical certifications (Developer Associate, Solution Architect Associate)
• Familiarity with machine learning and data design to support machine learning
• Experience with productionizing machine learning workloads
• Experience with ElasticSearch/ELK, Kibana, Grafana, and/or Prometheus
• Experience with OpenTelemetry
Soft Skills (Required)
• Strong verbal and written communication, problem-solving, customer service, and interpersonal skills.
• Ability to work independently and manage one's own time.
(Desired)
• Ability to work collaboratively in a complex, rapidly changing, and culturally diverse environment.
• Ability to clearly communicate complex technical ideas.
• Comfortable working in a dynamic environment where digital is still evolving as a core offering.
Disqualifiers/Red Flags:
• No experience with automated testing.
• The candidate's place of residence must be listed on the resume; otherwise the candidate will be disqualified.
• No public cloud experience.
Interview Process:
- 1st round: initial screen with the manager (non-technical questions), 30 minutes.
- 2nd round: technical interview, 30 minutes.
- Interviews will be conducted via Microsoft Teams.