Job Description & Details
Automation testing and data pipelines are becoming the backbone of modern, data‑driven enterprises, especially as real‑time processing gains traction. Companies are scrambling for experts who can blend solid ETL validation with robust code and streaming platforms. This contract role in Dallas offers a chance to leverage deep Python, Kafka, and AWS expertise while working on high‑impact automation projects.
Job Summary
We are seeking an experienced Automation SDET to design, develop, and maintain automated test frameworks for ETL processes, data validation, and streaming integrations. The candidate will write Python scripts, configure Kafka consumers/producers, and utilize AWS services to ensure data quality and pipeline reliability. This is a hybrid, contract position requiring on‑site client interviews and local presence in Dallas.
Top 3 Critical Skills
| Skill | Why it's critical | Mastery Level |
|---|---|---|
| ETL Testing & Data Validation | Guarantees accurate data movement across systems, preventing costly downstream errors. | Senior |
| Python Programming | Powers the automation framework, scripting, and integration with AWS/Kafka services. | Senior |
| Kafka & AWS Services | Enables real‑time data streaming and cloud‑based processing, essential for modern pipelines. | Senior |
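To make the ETL validation skill concrete, here is a minimal sketch of the kind of check an automated ETL test might run: comparing row counts between a source and target table after a load. The `orders` table and the in-memory SQLite connections are hypothetical stand-ins for real source and target databases; a production framework would add checksums, column-level comparisons, and parameterized test cases.

```python
import sqlite3

def validate_row_counts(src_conn, tgt_conn, table):
    """Compare row counts between source and target copies of a table."""
    src = src_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    tgt = tgt_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    return src == tgt, src, tgt

# Demo: in-memory SQLite standing in for real source/target databases
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
rows = [(1, 9.99), (2, 19.50)]
src.executemany("INSERT INTO orders VALUES (?, ?)", rows)
tgt.executemany("INSERT INTO orders VALUES (?, ?)", rows)

ok, src_count, tgt_count = validate_row_counts(src, tgt, "orders")
print(ok, src_count, tgt_count)
```

In a real framework this function would typically live in a pytest suite, with connections supplied by fixtures so the same check can run against any table in the pipeline.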
Interview Preparation
- Describe your end‑to‑end experience building an ETL testing framework in Python.
  What the interviewer is looking for: depth of knowledge in test design, handling of large data sets, and automation best practices.
- How do you validate data integrity when consuming from a Kafka topic?
  What the interviewer is looking for: understanding of offset management, schema validation, and idempotent consumption.
- Explain a challenging bug you uncovered in an AWS‑based data pipeline and how you resolved it.
  What the interviewer is looking for: problem‑solving skills, familiarity with AWS services (e.g., S3, Lambda, Glue), and debugging techniques.
- What strategies do you use to keep automated tests maintainable as ETL jobs evolve?
  What the interviewer is looking for: test modularity, data‑driven testing, and version control practices.
- Can you walk through the steps to set up a CI/CD pipeline for ETL test suites?
  What the interviewer is looking for: experience with CI tools (Jenkins, GitLab CI), containerization, and automated reporting.
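For the Kafka data-integrity question above, a useful talking point is per-message schema validation. The sketch below shows one way to validate a deserialized message payload; the `EXPECTED_FIELDS` schema and the sample payloads are hypothetical, and the actual consumption (offset management, polling) would be handled by a client library such as kafka-python or confluent-kafka, which is omitted here to keep the example self-contained.

```python
import json

# Hypothetical schema for an order event: field name -> expected type
EXPECTED_FIELDS = {"event_id": str, "amount": float, "ts": str}

def validate_record(raw_bytes):
    """Validate a single Kafka message payload against the expected schema."""
    try:
        record = json.loads(raw_bytes)
    except json.JSONDecodeError:
        return False, "not valid JSON"
    for field, ftype in EXPECTED_FIELDS.items():
        if field not in record:
            return False, f"missing field: {field}"
        if not isinstance(record[field], ftype):
            return False, f"wrong type for {field}"
    return True, "ok"

good = b'{"event_id": "e1", "amount": 9.5, "ts": "2024-01-01T00:00:00Z"}'
bad = b'{"event_id": "e1", "amount": "oops"}'
print(validate_record(good))
print(validate_record(bad))
```

In an interview, you could extend this by mentioning schema-registry-based validation (e.g., Avro) and dead-letter topics for records that fail the check.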
Resume Optimization
- Automation SDET
- ETL testing
- Data validation
- Python
- Kafka
- AWS services
- Hybrid work model
- Contract
- Dallas, TX
- 9+ years experience
Application Strategy
When reaching out to the recruiter, send a concise email that starts with a friendly greeting, attaches your up‑to‑date resume, and clearly highlights your top relevant skills. Make sure to mention your expertise in Python automation, ETL testing, and Kafka/AWS integrations, and reference any recent projects that showcase these abilities. Emphasize that you are a local Dallas candidate ready for an in‑person interview.
Career Roadmap
| Current Role | Typical Experience | Core Focus | Next Position |
|---|---|---|---|
| Automation SDET (Contract) | 9+ years | ETL automation, Python, Kafka, AWS | Senior Automation Engineer (10‑12 yrs) |
| Senior Automation Engineer | 10‑12 yrs | End‑to‑end pipeline reliability, team leadership | Lead SDET / Automation Architect |
| Lead SDET / Automation Architect | 13+ yrs | Strategy, large‑scale test frameworks, cross‑team mentorship | Director of Quality Engineering |