JobRialto
Senior Java Software Engineer
JobRialto, Boston, MA, United States
Job Summary:
The Customer Data Technology group within the Personal Investing (PI) organization is seeking an experienced Software Engineer to build and maintain large-scale data processing systems. In this role, you will leverage cutting-edge technologies to design and deliver innovative Big Data solutions, with a focus on enhancing customer experiences in the financial services space. You will work in a collaborative environment with opportunities to innovate and to drive the evolution of data processing technologies.
Key Responsibilities:
- Design, develop, and maintain scalable, high-performance data processing systems that enhance customer experiences.
- Build microservices (REST APIs, GraphQL APIs) on AWS using object-oriented languages (Java, Python) and frameworks such as Spring Boot.
- Apply expertise in Big Data solutions, utilizing AWS services to manage and process large datasets.
- Work with relational databases (AWS RDS, Oracle, Postgres) and NoSQL databases (DynamoDB, Elasticsearch, graph databases) to ensure data integrity and scalability.
- Develop event-driven applications using messaging technologies like SQS, SNS, Kinesis, Kafka, and Lambda.
- Implement continuous integration and continuous delivery (CI/CD) pipelines using tools like Maven, Jenkins, GitHub, and Docker.
- Collaborate with cross-functional teams to design and implement data-driven solutions that align with business objectives.
- Research and explore emerging technology trends to drive innovation in the data ecosystem.
- Support and deploy machine learning models using AWS SageMaker.
- Participate in Agile (Kanban/Scrum) development processes to deliver high-quality software.
Required Qualifications:
- Bachelor's or Master's degree in a technology-related field (e.g., Computer Science, Engineering).
- 6+ years of experience in implementing Big Data solutions, with a strong focus on cloud technologies (AWS preferred).
- Expertise in object-oriented programming languages (Java, Python) and in building microservices (REST APIs, GraphQL APIs) on AWS.
- Experience with relational databases (AWS RDS, Oracle, Postgres) and NoSQL databases (DynamoDB, Elasticsearch, graph databases).
- Solid experience in developing event-driven/stream processing applications using technologies like SQS, SNS, Kinesis, Kafka, and Lambda.
- Proficiency in DevOps practices, including CI/CD pipelines using tools like Maven, Jenkins, GitHub, and Docker.
- Knowledge of deploying machine learning models using AWS SageMaker.
- Experience with Agile methodologies (Kanban, Scrum).
- Exposure to the Python machine learning ecosystem (NumPy, Pandas, Scikit-learn, TensorFlow, NLP).
- Familiarity with in-memory caching technologies like AWS ElastiCache and Redis.
- Strong problem-solving abilities and a passion for technology innovation.
- Ability to work collaboratively in a fast-paced and dynamic team environment.
Certifications (if any):
- AWS Certified Solutions Architect - Associate or Professional.
- AWS Certified DevOps Engineer.
- Any other relevant cloud or data-related certifications.
Education: Bachelor's Degree