United IT Solutions

Data Quality Engineer

United IT Solutions, Dallas, Texas, United States, 75215


Job Requirements: Data Quality Engineer

Location: Dallas, TX

Client: Qentelli / SWA (Southwest Airlines)

Mode: Hybrid (local candidates only)

Requirements:
• Local candidates only, with a Dallas, TX driver's license
• Do not submit candidates born after 1990
• Must be on our W2; no layers
• Must confirm the candidate has not been submitted to SWA previously

Rate: $55 to $60 on W2 (max)

Description:

Work Description: SDET - ETL, Advanced SQL, Unix/Linux, Python, and Data Cloud

• Work with business stakeholders, Business Systems Analysts, and Developers to ensure quality delivery of software.
• Interact with key business functions to confirm data quality policies and governed attributes.
• Follow quality management best practices and processes to bring consistency and completeness to integration service testing.
• Design and manage AWS testing environments for data workflows during development and deployment of data products.
• Assist the team with test estimation and test planning.
• Design and develop reports and dashboards.
• Analyze and evaluate data sources, data volume, and business rules.
• Be well versed in the data flow and test strategy for cloud/on-prem ETL testing.
• Interpret and analyze data from various source systems to support data integration and data reporting needs.
• Test database applications to validate source-to-destination data movement and transformation.
• Work with team leads to prioritize business and information needs.
• Develop and summarize data quality analysis and dashboards.
• Apply knowledge of data modeling and data warehousing concepts, with emphasis on cloud/on-prem ETL.
• Execute testing of data analytics and data integration on time and within budget.
• Troubleshoot data issues and anomalies and determine the best resolution.
• Perform functional, regression, system, integration, and end-to-end testing.
• Bring a deep understanding of data architecture and data modeling best practices and guidelines for different data and analytics platforms.
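The source-to-destination validation duty above can be sketched in miniature. This is an illustrative example only, not the client's actual tooling; all names are hypothetical, and in practice the rows would be fetched from the source system and the ETL target rather than built in memory:

```python
import hashlib

def row_fingerprint(row):
    """Build a stable hash of one record so source and target rows can be matched."""
    canonical = "|".join(str(v) for v in row)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def validate_movement(source_rows, target_rows):
    """Compare row counts and per-row fingerprints between source and target."""
    src = {row_fingerprint(r) for r in source_rows}
    tgt = {row_fingerprint(r) for r in target_rows}
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_in_target": len(src - tgt),
        "unexpected_in_target": len(tgt - src),
    }

# Hypothetical sample rows standing in for extracted data:
source = [(1, "DAL", "on-time"), (2, "HOU", "delayed")]
target = [(1, "DAL", "on-time"), (2, "HOU", "delayed")]
print(validate_movement(source, target))
# → {'count_match': True, 'missing_in_target': 0, 'unexpected_in_target': 0}
```

Hashing each row once on either side keeps the comparison O(n) and avoids moving full datasets across systems; real pipelines typically compute the fingerprints inside the source and target databases instead.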

Job Requirements:

Must-Have Skills
• Extensive experience with Python scripting and cloud technologies
• Extensive experience with AWS components such as S3, Athena, EMR, Glue, Redshift, Kinesis, and SageMaker
• Experience automating ETL processes with Python, and automation around AWS data and infrastructure
• Extensive experience with SQL and Unix/Linux scripting is a must
• Extensive experience developing and testing cloud/on-prem ETL (Ab Initio, AWS Glue, Informatica, Alteryx)
• Extensive experience in data migration is a must (Teradata to Redshift preferred)
• Experience in large-scale application development testing: cloud/on-prem data warehouse, data lake, data science
• Experience building data-flow CI/CD pipelines in GitLab
• Extensive experience in the DevOps/DataOps space
• Experience with data science platforms such as SageMaker, Machine Learning Studio, or H2O
• Strong experience with Kafka
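A common pattern behind the SQL/Python ETL automation and data migration items above is a reconciliation query that compares aggregates between a source and a target table. A minimal sketch, using Python's built-in sqlite3 as a stand-in for the actual warehouses (Teradata/Redshift in the role); table and column names are hypothetical:

```python
import sqlite3

def reconcile(conn, source_table, target_table, amount_col):
    """Compare row counts and a summed measure between two tables."""
    cur = conn.cursor()
    checks = {}
    for label, table in (("source", source_table), ("target", target_table)):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}")
        checks[label] = cur.fetchone()
    return checks["source"] == checks["target"], checks

# In-memory database with hypothetical booking tables:
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_bookings (id INTEGER, fare REAL);
    CREATE TABLE tgt_bookings (id INTEGER, fare REAL);
    INSERT INTO src_bookings VALUES (1, 120.0), (2, 89.5);
    INSERT INTO tgt_bookings VALUES (1, 120.0), (2, 89.5);
""")
ok, detail = reconcile(conn, "src_bookings", "tgt_bookings", "fare")
print(ok, detail)
# → True {'source': (2, 209.5), 'target': (2, 209.5)}
```

Count-plus-sum checks catch dropped or duplicated rows cheaply; row-level fingerprinting is the usual follow-up when an aggregate check fails.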

Nice-to-Have Skills
• Experience using Jenkins and GitLab
• Experience using both Waterfall and Agile methodologies
• Experience testing storage tools such as S3 and HDFS
• Experience with one or more industry-standard defect or test case management tools

Soft Skills
• Great communication skills (regularly interacts with cross-functional team members)
• Takes ownership to complete tasks on time with minimal supervision
• Guides developers and automation teams when issues arise
• Monitors, reviews, and manages technical operations
• Effective problem-solving, troubleshooting, code debugging, and root cause analysis skills