Saxon Global
AWS Data Architect
Saxon Global, Boston, MA, United States
Top required skills based on interviews and feedback (the client wants to see experience across all of these):
- Healthcare Payor / Provider client experience REQUIRED.
- Deep experience working as an AWS Data Solution Architect with 5+ years on AWS Cloud. Must excel in client/stakeholder interactions, bridging the gap between technical complexities and business requirements. Clear and concise communication of technical decisions to non-technical stakeholders will be crucial for success and successful adoption.
- 10+ years of experience creating data engineering solutions and ETL/ELT pipelines on multi-terabyte SQL/NoSQL databases and data warehouse environments
- Data Warehouse / Data Platforms (Kafka and Kinesis required) - the client has NO INTEREST in candidates with only Azure or GCP experience
- Hands-on experience building AWS POCs for data solutions, including:
  - Moving legacy on-prem data platforms to the cloud
  - ETL / data modeling / data parsing
  - Data streaming / real-time data transfer
- Business-side interaction: comfortable talking with business stakeholders and working with them on POCs to gather data requirements and define the future state.
- Experience with Aurora DB loads in real-time/streaming (Kafka, Kinesis).
- Experience with data parsing in JSON or XML.
- Creates and maintains overall data & analytics solutions for applications/products across multiple projects of medium complexity.
- Performs POCs to establish standard data & analytics solution patterns that can be leveraged across multiple products and applications.
- Ensures that the design is consistent with the overall enterprise architecture.
- Responsible for securing approvals from review boards for the data & analytics solutions
- Partners with peer solution architects, engineering, product, operations teams to build feasible & scalable solutions.
- Researches and documents architecturally significant aspects/views of the systems.
- Ensures consistent use of technologies and platforms within data & analytics ecosystems.
- Leads or participates in project planning of architecture and design activities and provides estimates.
- Contributes to the creation of the IT roadmap.
- Responsible for application planning and platform management
- Establishes enterprise level Data Security, Master Data Management and Reference Data Management guidelines.
- Translates detailed business requirements into optimal technology solutions using AWS services such as Lambda, EC2, and Glue, and other leading open-source tools and technologies.
- Manages technical delivery for projects, designs processes, and drives execution with the help of consultants and associates.
- Diagnoses and addresses complex problems, including performance and scale issues, and drives them to resolution to meet business initiatives.
- Assists data analysts and end users across all functional areas in identifying long-term, strategic data needs for the enterprise, defining how the data relates to business processes, and developing conceptual and functional models of the data.
- Assists with and advocates for bringing in new technologies for data management, governance, and usage.
- Works hands-on with emerging technologies during software evaluations, POCs, and pilots.
- Ensures that data is secure and protected from unauthorized access or modification.
- Minimum 8 years of experience in data, analytics, AI/ML, insights, business intelligence, data warehouse products, platforms and applications.
- Minimum 5 years of experience working as a Solution/Data Architect on cloud platforms (AWS is highly desirable).
- Minimum 4 years of experience with ETL/ELT design and development using tools like Talend, Informatica, AWS Glue, Oracle Data Integrator (ODI) or equivalent.
- Minimum 5 years of experience in database architecture using relational SQL (e.g. Oracle, PostgreSQL, MySQL), and NoSQL (DynamoDB, MongoDB, Elasticsearch).
- Minimum 4 years of experience with programming languages such as Node.js, Python, Java, or Scala.
- Minimum 3 years of experience building applications with serverless architecture on AWS using Athena, Glue, EC2, Lambda, Kinesis, etc.
- Minimum 2 years of experience with large-scale data processing platforms such as Spark, EMR, and/or HPC computing experience with e.g. Apache Aurora, Slurm.
- Minimum 4 years of solutioning experience in an AWS Cloud environment, leveraging multiple services for storing, processing, and analyzing large volumes of data.
- Strong database architecture, critical thinking, and problem-solving abilities, along with an ability to handle ambiguous and evolving requirements.
- Minimum 2 years of experience building solutions using tools like Visio, Draw.io, and Miro.
- Experience working with RESTful APIs and general service-oriented architectures.
- Experience with DevOps, Continuous Integration and Continuous Delivery technologies is desirable.
- AWS Solution Architect certification (associate & above) is highly desirable.
- Healthcare domain knowledge required.
- Excellent verbal and written communication skills.