Cognizant North America
AWS Data Architect/Databricks (Onsite)
Cognizant North America, Hartford, Connecticut, US 06112
We are Cognizant Artificial Intelligence.
Digital technologies, including analytics and AI, give companies a once-in-a-generation opportunity to perform orders of magnitude better than before. But to seize that opportunity, clients need new business models built on analyzing their customers and business operations from every angle.
By applying artificial intelligence and data science to business decisions via enterprise data management solutions, we help leading companies prototype, refine, validate, and scale their most desirable products and delivery models within weeks.
You must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future.
This is an onsite position open to any qualified applicant in the United States.
Job Title: AWS Data Architect - Databricks
Job Summary:
We are seeking an experienced Architect with 10 to 13 years of experience to join our team. The ideal candidate will have extensive technical skills in Spark with Scala, Delta Sharing, Databricks Unity Catalog administration, the Databricks CLI, Delta Live Tables pipelines, Structured Streaming, risk management, Apache Airflow, Amazon S3, Amazon Redshift, Python, Databricks SQL, Databricks Delta Lake, Databricks Workflows, and PySpark. Experience in the Property & Casualty Insurance domain is mandatory.

Roles/Responsibilities:
- Own the design and implementation of data architecture solutions using Spark in Scala and Databricks technologies.
- Oversee the development and deployment of Delta Sharing and Databricks Unity Catalog administration.
- Guide the use of the Databricks CLI and Delta Live Tables to streamline data workflows.
- Implement and manage Structured Streaming solutions for real-time data processing.
- Apply risk management principles to ensure data security and compliance.
- Use Apache Airflow to orchestrate complex data workflows.
- Lead data storage and retrieval using Amazon S3 and Amazon Redshift.
- Develop and maintain Python scripts for data processing and automation.
- Build and optimize Databricks SQL queries for efficient data analysis.
- Implement Databricks Delta Lake for scalable, reliable data lakes.
- Create and manage Databricks Workflows to automate data pipelines.
- Apply PySpark for large-scale data processing and analytics.
- Collaborate with cross-functional teams to ensure data solutions meet business requirements.
- Provide technical guidance and mentorship to junior team members.
- Ensure all solutions follow industry best practices and company standards.
- Contribute to the continuous improvement of data architecture processes and methodologies.
- Stay current with industry trends and technologies to drive innovation.
- Align architecture solutions with company goals and objectives.
- Support the Property & Casualty Insurance domain with tailored data solutions.
- Deliver high-quality, scalable, and maintainable data architecture solutions.
- Use the hybrid work model effectively to maintain productivity.
Qualifications:
- Extensive experience with Spark in Scala and Databricks technologies.
- Proficiency in Delta Sharing and Databricks Unity Catalog administration.
- Expertise with the Databricks CLI and Delta Live Tables.
- Strong knowledge of Structured Streaming and risk management.
- Experience with Apache Airflow, Amazon S3, and Amazon Redshift.
- Proficiency in Python and Databricks SQL.
- Experience with Databricks Delta Lake and Databricks Workflows.
- Solid PySpark skills for data processing and analytics.
- Mandatory experience in the Property & Casualty Insurance domain.
- Ability to work effectively in a hybrid work model.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork abilities.
- Commitment to continuous learning and professional development.

Certifications Required:
Databricks Certified Data Engineer Associate, AWS Certified Solutions Architect, Apache Airflow Certification.

Salary and Other Compensation:
Applications will be accepted until January 16, 2025. The annual salary for this position is between $81,000 and $140,000, depending on the experience and other qualifications of the successful candidate. This position is also eligible for Cognizant's discretionary annual incentive program, based on performance and subject to the terms of Cognizant's applicable plans.

Benefits:
Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
- Medical/Dental/Vision/Life Insurance
- Paid holidays plus Paid Time Off
- 401(k) plan and contributions
- Long-term/Short-term Disability
- Paid Parental Leave
- Employee Stock Purchase Plan

Disclaimer:
The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.