Data Engineer
Stifel Financial Corp, Saint Louis, MO
Summary
The Data Engineer is responsible for designing, developing, and maintaining data pipelines and infrastructure on the AWS platform. The Data Engineer works with large volumes of data, ensuring its quality, reliability, and accessibility. Tasks may include data ingestion, transformation, storage, data sharing and consumption, and the implementation of data security and privacy measures. This role is crucial in enabling efficient and effective data-driven decision-making.
Essential Duties & Responsibilities
- Build and maintain scalable, reliable data pipelines, ensuring the smooth flow of data from various sources to the desired destinations in the AWS cloud environment (a minimal pipeline sketch follows this list).
- Work closely with stakeholders to understand their data requirements and design data solutions that meet their needs, including understanding data models/schemas and implementing ETL (Extract, Transform, and Load) processes that turn raw data into a usable format at the destination.
- Monitor and optimize the performance of data pipelines, troubleshoot any issues that arise, and ensure data quality and integrity.
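For illustration, here is a minimal sketch of the kind of ETL job these duties describe: an AWS Glue PySpark script that reads raw CSV data from S3, applies basic cleanup, and writes curated Parquet output. The bucket names, paths, and the `account_id` column are hypothetical placeholders, not part of this posting.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard AWS Glue job boilerplate: resolve arguments and set up contexts.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Ingest: read raw CSV from S3 (hypothetical bucket and prefix).
raw = spark.read.option("header", "true").csv("s3://example-raw-bucket/trades/")

# Transform: drop duplicates and rows missing a key field (hypothetical column).
curated = raw.dropDuplicates().filter(raw["account_id"].isNotNull())

# Load: write curated Parquet to the destination bucket (hypothetical).
curated.write.mode("overwrite").parquet("s3://example-curated-bucket/trades/")

job.commit()
```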
Qualifications
- Proficient in programming languages such as Python and SQL for database querying and manipulation.
- Strong understanding of AWS services related to data engineering, such as Amazon S3, Amazon Redshift, Amazon Aurora PostgreSQL, AWS Glue, AWS Lambda, AWS Step Functions, AWS Lake Formation, Amazon DataZone, Amazon Kinesis, Amazon MSK, and Amazon EMR.
- Knowledge of database design principles and experience with database management systems.
- Experience with data storage technologies like relational databases (e.g., SQL Server, PostgreSQL) and distributed data processing frameworks (e.g., PySpark).
- Understanding of Extract, Transform, Load (ETL) processes and experience with ETL tools like AWS Glue and SQL Server Integration Services are essential; candidates should be skilled at integrating disparate data sources and ensuring data quality and consistency.
- Understanding of and experience with orchestration tools like Apache Airflow, AWS Glue Workflows, and AWS Step Functions, as well as notification services (see the orchestration sketch after this list).
- Familiarity with infrastructure as code (IaC) tools such as Terraform, along with Git and DevOps pipelines.
- Strong analytical thinking and problem-solving abilities are essential for identifying and resolving data-related issues effectively, including the ability to analyze complex data sets, identify patterns, and derive actionable insights.
- Awareness of data governance practices, data privacy regulations, and security protocols is crucial; experience implementing data security measures and ensuring compliance with relevant standards is desirable.
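As a hedged illustration of the orchestration tools named above, the following is a minimal Apache Airflow DAG sketch chaining extract, transform, and load steps on a daily schedule. The DAG id, task names, and callables are hypothetical placeholders, and the sketch assumes the Airflow 2.x API.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder callables; real tasks would invoke ingestion and ETL code.
def extract():
    print("pull raw data from the source system")

def transform():
    print("clean and reshape the raw data")

def load():
    print("write curated data to the destination")

# Hypothetical DAG: three tasks chained on a daily schedule.
with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Enforce ordering: extract, then transform, then load.
    t_extract >> t_transform >> t_load
```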
Education & Experience
- Bachelor's Degree in Computer Science, related field, or equivalent experience.
- 3+ years of progressive post-Bachelor's experience in data engineering.
Systems & Technology
- Proficient in the following programming languages:
- Python
- SQL
- AWS technologies, including (a short Python usage sketch follows this list):
- Glue
- S3
- Redshift
- Lambda
- Lake Formation
- DataZone
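As a small, hedged example of how Python ties into the AWS services listed above, the sketch below uses boto3 to upload a file to S3 and then start a Glue job. The bucket, key, job name, and job argument are hypothetical placeholders.

```python
import boto3

# Hypothetical bucket, key, and Glue job name, used purely for illustration.
BUCKET = "example-raw-bucket"
KEY = "raw/trades/2024-01-01.csv"
GLUE_JOB = "example-curation-job"

# Upload a local extract to S3.
s3 = boto3.client("s3")
s3.upload_file("trades.csv", BUCKET, KEY)

# Kick off the Glue job that curates the newly landed file.
glue = boto3.client("glue")
run = glue.start_job_run(
    JobName=GLUE_JOB,
    Arguments={"--source_key": KEY},  # custom job argument (hypothetical)
)
print("started Glue run:", run["JobRunId"])
```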
About Stifel
Stifel is more than 130 years old and still thinks like a start-up. We are a global wealth management and investment banking firm serious about innovation and fresh ideas. Built on a simple premise coined by our namesake, Herman Stifel, of safeguarding our clients' money as if it were our own, our success is intimately tied to our commitment to helping families, companies, and municipalities find their own success.
While our headquarters is in St. Louis, we have offices in New York, San Francisco, Baltimore, London, Frankfurt, Toronto, and more than 400 other locations. Stifel is home to approximately 9,000 individuals who are currently building their careers as financial advisors, research analysts, project managers, marketing specialists, developers, bankers, operations associates, and hundreds of other roles. Let's talk about how you can find your place here at Stifel, where success meets success.
At Stifel, we offer an entrepreneurial environment and a comprehensive benefits package, including health, dental, and vision care, a 401(k) plan, wellness initiatives, life insurance, and paid time off.
Stifel is an Equal Opportunity Employer.
#LI-LL1