INSPYR Solutions
Data Architect - Long Term Project - (Remote/US)
INSPYR Solutions, New York, New York, United States
Title: Data Architect
Location: Remote/US
Duration: Long Term Project - 7 months
Compensation: $70-79.95
Work Requirements: US Citizen, GC Holders, or Authorized to Work in the U.S.

What We Do/Project
The Data Architect will be passionate about data as a first-class citizen to empower and enable data-driven business decisions. The Data Architect will lead our full-stack data architecture strategy and implementation, including data modeling, data governance, warehouse aggregation, and BI tooling. They will drive our modernization efforts to ensure our data flows, processing, and reporting are accurate and insightful. They will work closely with senior leadership, research, marketing, product management, architects, and developers to design, build, and deploy data warehouse solutions and reporting tools that meet the growing analytical needs of our organization.

Job Responsibilities / Typical Day in the Role
• Define the vision and requirements for the Enterprise Data Lake and lead its development, focusing on business needs
• Design and develop the data platform to efficiently and cost-effectively address various data needs across the business
• Build software across our entire cutting-edge data platform, including event-driven data processing, storage, and serving through scalable and highly available APIs
• Change how we think about, act on, and utilize our data by performing exploratory and quantitative analytics, data mining, and discovery
• Think of new ways to make our data platform more scalable, resilient, and reliable, then work across our team to put your ideas into action
• Help us stay ahead of the curve by working closely with data engineers, stream processing specialists, API developers, our DevOps team, and analysts to design systems that can scale elastically in ways that make other groups jealous
• Help build and maintain foundational data products such as, but not limited to, Finance, Titles, Content Sales, Theatrical, and Consumer Products
• Work closely with data analysts and business stakeholders to make data easily accessible and understandable to them
• Ensure data quality by implementing reusable data quality frameworks
• Work closely with various other data engineering teams to roll out new capabilities
• Build processes and tools to maintain Machine Learning pipelines in production
• Develop and enforce data engineering, security, and data quality standards through automation
• Be responsible for cloud cost and improving efficiency

Must Have Skills / Requirements
1) Experience building and scaling data platforms
2) Experience with cloud data technologies, including Snowflake, AWS, and Apache Airflow
3) Experience working as an Architect designing enterprise data solutions

Nice to Have Skills / Preferred Requirements
1) Public speaking and presentation skills
2) Deep understanding of API connectivity and data streaming architecture

Soft Skills
• Passion for working with data and deriving insights to answer business questions that drive actions and decision-making
• Experience with data modeling tools
• Experience leveraging creative analytics techniques and output to tell a story that drives business decisions
• Champions the capabilities and benefits of analytical data assets; partners effectively across departments and stakeholders
• Solid business acumen and critical problem-solving ability, with a capacity for strategic thinking
• Comfort with ambiguity and the ability to manage multiple projects at the same time
• Excellent communication, presentation, and customer relationship skills
Education / Certifications
1) Bachelor's degree in computer science, information systems, or information technology

Technology requirements:
• Experience with programming languages: SQL and Python
• Experience with one or more automation and scheduling tools (for example, Redwood, Airflow, etc.)

Years of experience:
• 14 years of experience building and scaling data platforms
• 5 years of experience with cloud data technologies, including Snowflake, AWS, and Apache Airflow
• 3 years of experience working as an Architect designing enterprise data solutions