Spectraforce Technologies

Senior Data Engineer

Spectraforce Technologies, Richardson, Texas, United States, 75080


Job Title: Senior Data Engineer

Location: Richardson, TX (in office 3x a week)

Duration: 8 months

Job Description:

Qualifications:

* Bachelor's degree (BA/BS) in a related field such as information systems, mathematics, or computer science.

* Typically has 6 years of relevant work experience; consideration given to an equivalent combination of education and experience.

* Excellent written and verbal communication skills. Strong organizational and analytical skills.

* Expertise in Data Extraction, Transformation, Loading, Data Analysis, Data Profiling, and SQL Tuning.

* Expertise in relational and dimensional databases on engines such as SQL Server, Postgres, and Oracle.

* Strong experience in designing and developing enterprise-scale data warehouse systems using Snowflake.

* Strong expertise in designing and developing reusable and scalable data products with data quality scores and integrity checks.

* Strong expertise in developing complex end-to-end data workflows using data ingestion tools such as SnapLogic, ADF, Matillion, etc.

* Experience with AWS and Azure cloud platforms, Agile methodologies, and DevOps is a big plus.

* Experience in architecting cloud-native solutions across multiple B2B and B2B2C data domains.

* Experience architecting modern APIs for securely sharing data across internal application components and with external technology partners.

* Experience with data orchestration tools such as Apache Airflow and Chronos on a Mesos cluster (a minimal orchestration sketch follows this list).

* Expertise in designing and developing data transformation models in dbt.

* Ability to compare and analyze statistical information to identify patterns, relationships, and problems, and to use this information to design conceptual and logical data models and flowcharts for presentation to management.

* Experience developing CI/CD pipelines in Jenkins or Azure DevOps.

* Knowledge of Python for data manipulation and automation.

* Knowledge of data governance frameworks and best practices.

* Knowledge of integrating with source code versioning tools such as GitHub.
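
For illustration only (not part of the job requirements): a minimal sketch of the kind of Airflow orchestration referenced above. It assumes Airflow 2.4+, and the DAG name, task names, and extract/load logic are hypothetical placeholders, not the client's actual pipeline.

```python
# Minimal sketch only; DAG name, task names, and callables are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder: pull the previous day's records from a source system.
    print(f"extracting orders for {context['ds']}")


def load_to_warehouse(**context):
    # Placeholder: load the staged extract into the warehouse.
    print("loading staged orders into the warehouse")


with DAG(
    dag_id="orders_daily_ingest",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # parameter name used in Airflow 2.4+
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    extract >> load  # run ingestion before the load step
```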

Responsibilities:

* Plans, analyzes, develops, maintains, and enhances client systems, and supports systems of moderate to high complexity.

* Participates in the design, specification, implementation, and maintenance of systems.

* Designs, codes, tests, and documents software programs of moderate complexity as per the requirement specifications.

* Design, develop, and maintain scalable data pipelines using Snowflake, dbt, SnapLogic, and other ETL tools.

* Participates in design reviews and technical briefings for specific applications.

* Integrate data from various sources, ensuring consistency, accuracy, and reliability.

* Develop and manage ETL/ELT processes to support data warehousing and analytics.

* Assists in preparing requirement specifications; analyzes data; and designs and develops data-driven applications, including documenting and revising user procedures and/or manuals.

* Involved in resolving software development issues of medium to high complexity that may arise in a production environment.

* Utilize Python for data manipulation, automation, and integration tasks (a short Python sketch follows this list).

* Assemble large, complex data sets that meet functional and non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.

* Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL Server, PostgreSQL, SSIS, T-SQL, and PL/SQL.

* Work with stakeholders, including the Product, Data, Design, Frontend, and Backend teams, to assist with data-related technical issues and support their data infrastructure needs.

* Write complex SQL, T-SQL, and PL/SQL queries, stored procedures, functions, and cursors in SQL Server and PostgreSQL. Peer-review other team members' code (a parameterized query sketch follows this list).

* Analyze long-running queries, functions, and procedures, and design and develop a performance optimization strategy.

* Create and manage SSIS packages and/or Informatica workflows to perform day-to-day ETL activities. Use a variety of strategies for complex data transformations within an ETL tool.

* Perform DBA activities such as maintaining system health and performance tuning, managing database access, deployments to higher environments, and on-call support; shell scripting and Python scripting are a plus.

* Participate in employing Continuous Integration and Continuous Delivery/Deployment (CI/CD) tools for optimal productivity.

* Collaborate with scrum team members during daily standup and actively engage in sprint refinement, planning, review and retrospective.

* Analyzes, reviews, and alters programs to increase operating efficiency or adapt to new requirements.

* Writes documentation to describe program development, logic, coding, and corrections.
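
For illustration only: a minimal sketch of the kind of Python-based data manipulation and integrity checking described in the responsibilities above. The column names, sample data, and pass/fail criteria are assumptions for the example, not the role's actual pipelines.

```python
# Minimal sketch only; column names, sample data, and thresholds are hypothetical.
import pandas as pd


def check_orders(df: pd.DataFrame) -> dict:
    """Return a simple data-quality summary for an orders extract."""
    checks = {
        "row_count": len(df),
        "null_order_ids": int(df["order_id"].isna().sum()),
        "duplicate_order_ids": int(df["order_id"].duplicated().sum()),
        "negative_amounts": int((df["amount"] < 0).sum()),
    }
    checks["passed"] = (
        checks["null_order_ids"] == 0
        and checks["duplicate_order_ids"] == 0
        and checks["negative_amounts"] == 0
    )
    return checks


if __name__ == "__main__":
    # Tiny illustrative data set with one null, one duplicate, and one negative value.
    sample = pd.DataFrame(
        {"order_id": [1, 2, 2, None], "amount": [10.0, -5.0, 7.5, 3.0]}
    )
    print(check_orders(sample))
```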
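Also for illustration only: a minimal sketch of calling a parameterized PostgreSQL query from Python, in the spirit of the SQL responsibilities above. The connection settings, schema, and table are hypothetical, and psycopg2 is just one common driver choice, not necessarily the one used on this team.

```python
# Minimal sketch only; connection details, schema, and table are hypothetical.
import psycopg2


def fetch_large_orders(min_amount: float):
    """Return (order_id, amount) rows at or above a threshold."""
    conn = psycopg2.connect(
        host="localhost", dbname="warehouse", user="etl_user", password="***"
    )
    try:
        with conn.cursor() as cur:
            # Parameterized query: values are bound by the driver, not string-formatted.
            cur.execute(
                "SELECT order_id, amount FROM staging.orders WHERE amount >= %s",
                (min_amount,),
            )
            return cur.fetchall()
    finally:
        conn.close()
```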