SysMind Tech
Senior Software Engineer (Big Data Development)
SysMind Tech, Orlando, Florida, US, 32885
Responsibilities
• Build components of a large-scale data platform for real-time and batch processing, and own features of big data applications to meet evolving business needs
• Build next-generation, cloud-based big data infrastructure for batch and streaming applications, and continuously improve its performance, scalability, and availability
• Contribute to engineering best practices, including the use of design patterns, CI/CD, code review, and automated testing
• Drive ground-breaking innovation and apply state-of-the-art technologies
• As a key member of the team, contribute to all aspects of the software lifecycle: design, experimentation, implementation, and testing
• Collaborate with program managers, product managers, SDETs, and researchers in an open and innovative environment

What to Bring
• Bachelor's degree or above in Computer Science or Electrical Engineering
• 4+ years of professional programming experience in Java, Scala, Python, etc.
• 3+ years of big data development experience with technology stacks such as Spark, Flink, SingleStore, Kafka, NiFi, and AWS big data technologies
• Knowledge of system and application design and architecture
• Experience building industry-grade, highly available, and scalable services
• Passion for technology and openness to interdisciplinary work
Technical Qualifications
• Experience processing large amounts of data at the petabyte level
• Demonstrated ability with cloud infrastructure technologies, including Terraform, Kubernetes (K8s), Spinnaker, IAM, ALB, etc.
• Experience with ClickHouse, Druid, Snowflake, Impala, Presto, Kinesis, etc.
• Experience with widely used Web frameworks (React.js, Vue.js, Angular, etc.) and good knowledge of the Web stack: HTML, CSS, and Webpack