Arrowstreet Capital

Senior Cloud Data Engineer, Framework Engineering

Arrowstreet Capital, Boston, Massachusetts, us, 02298


Job Overview

The Investment Process Engineer is responsible for the fidelity of Arrowstreet Capital's data-driven investment process. The role involves building visibility and recovery mechanisms for our modern distributed investment process. The daily investment process includes processing external vendor data and managing the data flow through pre-trade, optimization, post-trade, and reporting. The modern investment process is driven by many common frameworks, distributed platforms, and applications with complex dependencies, which requires a systematic approach and consistent tooling to maintain a comprehensive view and reduce process debt.

The ideal candidate is interested in connecting modern technology and the processes that drive the daily automated investment process. This role presents a great opportunity for a technologist interested in the end-to-end investment process and working with the latest cloud native technology.

Responsibilities

- Develop cloud-native solutions to enable visibility, rapid response, and reduced mean time to recovery for the end-to-end data-driven investment process
- Design and build a system that interacts with distributed investment process applications to obtain relevant information for observability and management
- Work with key stakeholders to build tooling that visualizes the investment process network and identifies potential system risk factors and capacity issues
- Design and build the ability to run simulations through the entire system
- Apply knowledge of data warehousing concepts, dimensional modeling, and data modeling techniques
- Be accountable for the completion of assigned deliverables in a timely manner with little day-to-day management
- Assess the trade-offs of different candidate systems, processes, or technologies
- Work with key business stakeholders, IT experts, and subject-matter experts to plan and deliver optimal financial and system data visualization solutions
- Working with cross-functional teams, participate in gathering and documenting requirements to meet business needs, and use those requirements to help design, develop, test, and implement reports and dashboards that utilize the underlying data to deliver cutting-edge process management products

Qualifications

- Bachelor's degree in computer science, systems analysis, or a related field, or equivalent experience
- 5+ years of experience building cloud-based applications in Python
- 5+ years of experience with cloud services (AWS, GCP, Azure), with AWS preferred
- 5+ years of experience with AWS data services such as Redshift, Aurora PostgreSQL, S3 (Parquet), Athena, or DynamoDB
- 3+ years of experience developing and optimizing data transformation processes using tools like Apache Spark or cloud-native services (e.g., AWS Glue, EMR)
- 2+ years of experience building CI/CD pipelines, with strong knowledge of Git
- 2+ years of experience with container technologies such as Docker and Kubernetes
- 2+ years of experience writing Infrastructure as Code with Python, Terraform, or CloudFormation
- Familiarity with tools like AWS OpenSearch, Prometheus, CloudWatch, CloudTrail, and IAM resource and role policies
- Prior experience with object-oriented languages like C# or Java is a plus
- Prior AWS certifications would be a plus
- Previous experience writing cloud-native applications using AWS SAM (or Serverless), Lambda, Step Functions, and/or DynamoDB would be a plus

We maintain a friendly, team-oriented environment and place a high value on professionalism, attitude, and initiative.