Serigor Inc.
Serigor Inc. is hiring: Big Data Architect IT Consultant Master (ONSITE) in Washington, DC
Job Description
Job Title: Big Data Architect IT Consultant Master (ONSITE)
Location: Washington, DC
Duration: 12 Months+
Job Description:
The client seeks an experienced IT Consultant to support the design, development, implementation, and maintenance of an enterprise Big Data solution as part of the client's Data Modernization Effort.
This role will provide expertise to support the development of a Big Data / Data Lake system architecture that supports enterprise data operations for the District of Columbia government, including Internet of Things (IoT) / Smart City projects, the enterprise data warehouse, the open data portal, and data science applications. This is an exciting opportunity to work as part of a collaborative senior data team supporting the client. The architecture includes Databricks, Microsoft Azure platform tools (including Data Lake and Synapse), Apache platform tools (including Hadoop, Hive, Impala, Spark, Sedona, and Airflow), and data pipeline/ETL development tools (including StreamSets, Apache NiFi, and Azure Data Factory). The platform will be designed for District-wide use and integration with other client Enterprise Data tools such as Esri, Tableau, MicroStrategy, API gateways, and Oracle databases and integration tools.
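For a concrete flavor of the stack named above, the sketch below shows a minimal PySpark query using Apache Sedona's spatial SQL against Parquet data in an Azure data lake. It is purely illustrative: the storage path, column names, and boundary polygon are assumptions, not details from the posting.

```python
# Hypothetical sketch: query IoT readings in a data lake with Spark + Sedona.
# The path, table, columns, and polygon are invented for illustration only.
# (Assumes Apache Sedona's Spark jars are already on the classpath.)
from sedona.spark import SedonaContext

# Build a Spark session with Sedona's spatial SQL functions registered.
config = SedonaContext.builder().getOrCreate()
sedona = SedonaContext.create(config)

# Read sensor readings from Parquet files in the lake (assumed layout).
readings = sedona.read.parquet("abfss://lake@example.dfs.core.windows.net/iot/readings/")
readings.createOrReplaceTempView("readings")

# Count readings that fall inside an assumed boundary polygon.
result = sedona.sql("""
    SELECT COUNT(*) AS n
    FROM readings
    WHERE ST_Contains(
        ST_GeomFromWKT('POLYGON((-77.05 38.88, -77.00 38.88, -77.00 38.93,
                                 -77.05 38.93, -77.05 38.88))'),
        ST_Point(longitude, latitude)
    )
""")
result.show()
```

The same pattern extends to the rest of the stack: Databricks or Synapse hosts the Spark runtime, while NiFi, StreamSets, or Azure Data Factory would land the Parquet files that the query reads.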
Responsibilities:
- Coordinates IT project management, engineering, maintenance, QA, and risk management.
- Plans, coordinates, and monitors project activities.
- Develops technical applications to support users.
- Develops, implements, maintains and enforces documented standards and procedures for the design, development, installation, modification, and documentation of assigned systems.
- Provides training for system products and procedures.
- Performs application upgrades.
- Performs monitoring, maintenance, and reporting on real-time databases, real-time network and serial data communications, and real-time graphics and logic applications.
- Troubleshoots problems.
- Ensures the project life cycle complies with District standards and procedures.
Minimum Education/Certification Requirements:
- Bachelor’s degree in Information Technology or related field or equivalent experience
Skills:
- Experience implementing Big Data storage and analytics platforms such as Databricks and Data Lakes (Required, 5 years)
- Knowledge of Big Data and Data Architecture and implementation best practices (Required, 5 years)
- Knowledge of architecture and implementation of networking, security, and storage on cloud platforms such as Microsoft Azure (Required, 5 years)
- Experience with deployment of data tools and storage on cloud platforms such as Microsoft Azure (Required, 5 years)
- Knowledge of data-centric systems for the analysis and visualization of data, such as Tableau, MicroStrategy, ArcGIS, Kibana, Oracle (Required, 10 years)
- Experience querying structured and unstructured data sources, including SQL and NoSQL databases (Required, 5 years)
- Experience modeling and ingesting data into and between various data systems through the use of data pipelines (Required, 5 years)
- Experience implementing Apache data products such as Spark, Sedona, Airflow, Atlas, NiFi, Hive, Impala (Required, 5 years)
- Experience with API / web services (REST/SOAP) (Required, 3 years)
- Experience with complex event processing and real-time streaming data (Required, 3 years; see the sketch after this list)
- Experience with deployment and management of data science tools and modules such as JupyterHub (Required, 3 years)
- Experience with ETL, data processing, and analytics using languages such as Python, Java, or R (Required, 3 years)
- Experience with Cloudera Data Platform (Highly desired, 3 years)
- 16+ years planning, coordinating, and monitoring project activities (Required, 16 years)
- 16+ years leading projects, ensuring they comply with established standards/procedures (Required, 16 years)
- Bachelor's degree in IT or related field, or equivalent experience (Required)
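To illustrate the real-time streaming skill referenced above, here is a minimal Spark Structured Streaming sketch in Python. The Kafka broker, topic, and event schema are invented for illustration, and the Kafka source assumes the spark-sql-kafka connector is on the Spark classpath; it is a sketch of the technique, not a prescribed implementation.

```python
# Hypothetical sketch: read JSON events from Kafka, aggregate per device over
# one-minute windows, and print results to the console.
# Broker address, topic name, and event schema are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col, window
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("iot-stream-sketch").getOrCreate()

# Assumed shape of each JSON event on the topic.
schema = StructType([
    StructField("device_id", StringType()),
    StructField("value", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read the raw Kafka stream (assumed broker and topic; requires the
# spark-sql-kafka connector package).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "iot-events")
       .load())

# Parse the JSON payload, then compute windowed averages with a watermark
# so state for late events is bounded.
events = raw.select(from_json(col("value").cast("string"), schema).alias("e")).select("e.*")
averages = (events
            .withWatermark("event_time", "2 minutes")
            .groupBy(window(col("event_time"), "1 minute"), col("device_id"))
            .agg({"value": "avg"}))

query = averages.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```

The watermark bounds how long late events are retained, which is the usual trade-off in windowed streaming aggregations; a production pipeline would swap the console sink for a data lake or serving table.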