Intelliswift Software
Data Engineer - Java and Spark, Parquet, Avro, Hive, Iceberg
Intelliswift Software, Seattle, Washington, United States
Must Have skills: Java and Spark, Parquet, Avro, Hive, Iceberg

Detailed Job Description:
- 6 years of overall software development experience, with at least 3 years on large-scale data platforms.
- Expert-level proficiency in Spark and Java.
- Knowledge of Golang is highly desirable.
- Good understanding of containerization using Docker and Kubernetes.
- Understanding of version control (Git) and CI/CD workflows.
- Understanding of Parquet, Avro, Hive, and Iceberg.
- Well experienced in debugging and troubleshooting.
- Good communication and excellent collaboration skills to interact with internal customers.

Deliverables:
- Help support the Data Replication Service (DRS) Spark jobs and develop features and bug fixes.
- Help support SparkCp (a library) by debugging and fixing issues encountered by DRS.
- Help support the CDH Migration Tool by debugging and fixing issues encountered by customers.
- Implement monitoring/alerting improvements to the three tools above for better supportability.
- Perform onboarding tasks to help move customer data.
- Monitor and address user requests and issues as they arise.