Senior Data Engineer / Data Analyst (Control Automation & ETL Testing)

Job Overview

Location
Wilmington
Job Type
Full Time
Date Posted
25 days ago

Job Description

Job Title:
Senior Data Engineer / Data Analyst (Control Automation & ETL Testing)
Location:
Wilmington, DE (Hybrid: 3 days onsite per week)

Engagement Type:
Long-Term Contract (W2)

Client:
Capital One (former Capital One experience required)

Job Summary / Description:
We are seeking an experienced Data Analyst to join the Risk and Controls organization at Capital One. This role bridges data engineering and control execution, with a strong focus on ETL development, automated data validation, and scripted QA testing.

The ideal candidate will bring deep expertise in Python and SQL, along with hands-on experience building and monitoring production-grade ETL pipelines. This position plays a critical role in ensuring data integrity, lineage, and compliance across first-line risk controls in a highly collaborative, fast-paced environment.

Key Responsibilities:
• Design, develop, and maintain scalable ETL pipelines that transform raw data into reliable datasets supporting risk and control frameworks.
• Optimize complex SQL queries and Python-based data pipelines to improve performance and reliability across platforms such as Postgres and Snowflake.
• Integrate structured and unstructured data sources (including JSON) into a unified data layer for reporting and control execution.
• Build and maintain automated data quality and QA test suites using Python frameworks such as PyTest and Great Expectations.
• Implement data-as-code testing frameworks to proactively detect anomalies, schema drift, and data integrity issues.
• Perform unit and integration testing to validate ETL logic against business and system rules.
• Support data governance initiatives, including metadata management, technical lineage, and CI/CD deployment of data assets.
• Evaluate upstream and downstream integration points to ensure SQL logic accurately reflects system states and reporting requirements.
• Identify bottlenecks in data pipelines and implement automation solutions to eliminate manual processes.
• Partner closely with Engineering, Operations, and Risk teams to translate control requirements into technical ETL specifications.
• Communicate data risks, discrepancies, and remediation plans clearly to both technical and non-technical stakeholders.
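As an illustration of the scripted QA testing described above, a minimal PyTest-style data-quality check might look like the sketch below. The table layout, column names, and rules are hypothetical examples chosen for illustration, not Capital One specifics.

```python
# Minimal sketch of an automated data-quality check in the PyTest style.
# EXPECTED_COLUMNS and the validation rules are hypothetical examples.

EXPECTED_COLUMNS = {"account_id", "balance", "as_of_date"}

def validate_rows(rows):
    """Return a list of human-readable violations found in `rows`."""
    violations = []
    for i, row in enumerate(rows):
        missing = EXPECTED_COLUMNS - row.keys()
        if missing:
            # Schema drift: a required column disappeared upstream.
            violations.append(f"row {i}: missing columns {sorted(missing)}")
            continue
        if row["account_id"] is None:
            violations.append(f"row {i}: account_id is null")
        if row["balance"] is not None and row["balance"] < 0:
            violations.append(f"row {i}: negative balance {row['balance']}")
    return violations

def test_clean_rows_pass():
    rows = [{"account_id": "A1", "balance": 100.0, "as_of_date": "2024-01-31"}]
    assert validate_rows(rows) == []

def test_schema_drift_is_caught():
    rows = [{"account_id": "A2", "balance": 50.0}]  # as_of_date dropped upstream
    assert any("missing columns" in v for v in validate_rows(rows))
```

Checks like these can run as plain functions or be collected by PyTest in CI, so schema drift and integrity issues surface before a pipeline publishes data to downstream controls.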

Basic Qualifications:
• Master’s degree in a quantitative or technical discipline.
• Proven experience developing and supporting ETL pipelines in production environments.
• Expert-level proficiency in Python and SQL for data manipulation, transformation, and automated testing.
• Experience working with relational and non-relational databases (Postgres, MySQL, DynamoDB, Cassandra, or similar).

Preferred Qualifications:
• Experience building automated QA and data validation frameworks.
• Hands-on experience with AWS services such as S3, Glue, Lambda, and IAM.
• Familiarity with data orchestration tools (Airflow, Prefect) and version control systems (Git).
• Strong experience processing and transforming unstructured data (JSON) for structured analytics and reporting.


Inspire Global Solutions

At Inspire Global Solutions, we help you define your requirements and pursue the career you aspire to. With a clear vision and a team of experienced professionals, we provide a platform to build and advance your professional career.


© 2018-2026 Inspire Global Solutions. All rights reserved.