Trident Consulting
Data Architect
Trident Consulting, Parsippany, New Jersey, US, 07054
Trident Consulting is seeking a "Data Architect" for one of our clients in Parsippany, NJ, a global leader in business and technology services.

Role: Data Architect
Location: Parsippany, NJ (Onsite)
Type: Contract
Required Skills: Python, Amazon Redshift, Amazon S3, Data Architecture, Data Modelling, DB Performance Optimization
Job summary: We are seeking a highly skilled Sr. Architect with 12 to 15 years of experience to join our team. The ideal candidate will have extensive experience with cloud data pipelines, along with strong architecting and data modelling skills.
Must haves:
• Experience in AWS and enterprise data warehousing/ETL projects (building ETL pipelines), enterprise data engineering, and analytics projects.
• Data modelling design (ER/dimensional modelling) - conceptual, logical, and physical.
• Clear understanding of data warehousing and data lake concepts.
• Redshift implementation with hands-on experience in AWS.
• Understand business requirements and existing system designs, enterprise applications, IT security guidelines, and legal protocols.
• Should possess data modelling experience and be able to collaborate with other teams within the project/program.
• Proven experience in data modelling and analysis, data migration strategy, cleansing and migrating large master data sets, data alignment across multiple applications, and data governance.
• Should be able to assist in making technology choices and decisions in an enterprise architecture scenario.
• Should possess working experience in different database environments/applications such as OLTP and OLAP.
• Design, build, and operationalize data solutions and applications using one or more AWS data and analytics services (EMR, Redshift, Kinesis, Glue) in combination with third-party tools.
• Actively participate in optimization and performance tuning of data ingestion and SQL processes.
• Knowledge of basic AWS services such as S3, EC2, etc.
• Experience in any of the following: AWS Athena, Glue PySpark, EMR, Redshift.
• Design and build production data pipelines from ingestion to consumption within a big data architecture using Java, Python, or Scala (a minimal sketch follows this list).
• Design and implement data engineering, ingestion, and curation functions on AWS cloud using AWS-native services or custom programming.
• Analyze, re-architect, and re-platform on-premises data warehouses onto data platforms on AWS cloud using AWS or third-party services.
• Understand and implement security and version controls.
• Support data engineers with the design of ETL processes, code reviews, and knowledge sharing.
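For illustration only (this sketch is not part of the client's requirements; the bucket names, paths, and column names are hypothetical), the following minimal PySpark job shows the kind of ingestion-to-curation pipeline described above: raw CSV landed in S3 is cleansed, typed, and staged as Parquet that Redshift could then load via COPY.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Illustrative sketch only: bucket names, paths, and column names are hypothetical.
spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Ingest raw CSV files landed in S3.
raw = (spark.read
       .option("header", "true")
       .csv("s3://example-raw-bucket/orders/"))

# Light cleansing and typing before the data reaches the warehouse.
curated = (raw
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
           .dropDuplicates(["order_id"]))

# Stage curated Parquet back to S3; Redshift can then ingest it with COPY.
(curated.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3://example-curated-bucket/orders/"))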
Roles and Responsibilities:
• Ability to explain data lake architecture using AWS services.
• In-depth knowledge of the AWS Well-Architected Framework.
• Good programming skills using scripting languages (e.g., Python).
• Good to have: experience with at least one ETL tool.
• Clarify and finalize the detailed scope for migration.
• Conduct customer interviews to understand existing standards, policies, quality compliance, and enterprise metadata standards.
• Work with various SMEs to understand business process flows, functional requirements specifications of existing systems, their current challenges and constraints, and future expectations.
• Document the current state and prepare the target-state architecture.
• Excellent client-interfacing and communication skills.
• Very good understanding of data intelligence concepts, technologies, etc.
• Address customer issues with speed and efficiency.
• Develop and manage relationships with key client stakeholders.
• Identify resources required for project completion.

About Trident
Trident Consulting is an award-winning IT/engineering staffing company founded in 2005 and headquartered in San Ramon, CA. We specialize in placing high-quality, vetted technology and engineering professionals in contract and full-time roles. Trident's commitment is to deliver the best and brightest individuals in the industry for our clients' toughest requirements.
Some of our recent awards include:
• 2022, 2021, 2020 Inc. 5000 fastest-growing private companies in America
• 2022, 2021 SF Business Times 100 fastest-growing private companies in the Bay Area
"
Data Architect
"
for one of our clients in " Parsippany, NJ "
A global leader in business and technology services.
Role:
Data ArchitectLocation:
Parsippany, NJ (Onsite)Type:
Contract
Required Skills : Python, Amazon Redshift, Amazon S3, Data Architect, Data Modelling, DB Performance Optimization
Job summary :We are seeking a highly skilled Sr. Architect with 12 to 15 years of experience to join our team.The ideal candidate will have extensive experience in Cloud Data pipeline, with Architecting and modelling skills.
Must haves:Experience in AWS and enterprise data warehousing project/ETL (building ETL pipeline), Enterprise Data Engineering and Analytics projects.Data Modelling design (ER/Dimensional Modelling) - Conceptual/Logical/Physical.Clear understanding Data warehousing and Data Lake concepts.Redshift implementation with hands-on experience in AWS.Understand business requirements and existing system designs, enterprise applications, IT security guidelines, Legal ProtocolsShould possess Data modelling experience and should be able to collaborate with other teams within project/program.Proven experience in data modelling & analysis, data migration strategy, cleanse and migrate large master data sets, data alignment across multiple applications, data governance.Should be able to assist in making technology choices and decisions in an enterprise architecture scenario.Should possess working experience in different database environment/applications like OLTP, OLAP etc.Design, build and operationalize data solutions and applications using one or more of AWS data and analytics services in combination with 3rd partiesEMR, RedShift, Kinesis, Glue.Actively participate in optimization and performance tuning of data ingestion and SQL processesKnowledge on basic AWS services like S3, EC2, etcExperience in any of the following AWS Athena and Glue PySpark, EMR, RedshiftDesign and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, Scala.Design and implement data engineering, ingestion and curation functions on AWS cloud using AWS native or custom programmingAnalyze, re-architect, and re-platform on-premises data warehouses to data platforms on AWS cloud using AWS or 3rd party servicesUnderstand and implement security and version controlsSupport data engineers with design of ETL processes, code reviews, and knowledge sharing
Roles and Responsibility:Ability to explain data lake architecture using AWS servicesIn-depth knowledge on AWS well-architected framework - Good programming skill using scripting (e.g., Python)Good to have experience on one ETL toolClarify and finalize detailed scope for migrationConduct customer interviews to understand existing standards, policies and quality compliance, enterprise metadata standardsWork with various SMEs in understanding business process flows, functional requirements specifications of existing system, their current challenges and constraints and future expectationDocument current state & prepare target state architectureExcellent client interfacing and communication skillsVery good understanding of Data Intelligence concepts technologies etc.Address customer issues with speed and efficiencyDevelop and manage relations with key client stakeholdersIdentify resources required for project completionAbout TridentTrident Consulting is an award-winning IT/engineering staffing company founded in 2005 and headquartered in San Ramon, CA. We specialize in placing high-quality vetted technology and engineering professionals in contract and full-time roles. Trident's commitment is to deliver the best and brightest individuals in the industry for our clients' toughest requirements.
Some of our recent awards include• 2022, 2021, 2020 Inc. 5000 fastest-growing private companies in America• 2022, 2021 SF Business Times 100 fastest-growing private companies in Bay Area