Databricks Engineer (Retail Domain)

Salary: Not Disclosed

Job Description & Details

The retail sector is rapidly digitizing, and companies need robust data pipelines to stay competitive. A Databricks Engineer who can bridge Python development with enterprise‑grade support is in high demand right now. This role offers a chance to work on critical integration services and high‑priority incident resolution for a leading retail inventory platform.

Job Summary

We are looking for a Databricks Python L2 Support Engineer to provide extensive support for Integration Services, troubleshoot batch programs in the Aptos Store Inventory Management System, and resolve high‑priority (P1/P2) incidents. The role requires strong Python and Databricks expertise, familiarity with PL/SQL and ServiceNow, and the ability to analyze root causes in a fast‑paced retail environment.

Top 3 Critical Skills

| Skill | Why it's critical | Mastery Level |
| --- | --- | --- |
| Databricks | Core platform for data processing and analytics in retail pipelines | Senior |
| Python | Primary language for automation, batch jobs, and incident scripts | Senior |
| Incident Management (P1/P2) | Ensures rapid resolution of high‑impact issues affecting store inventory | Mid |

Interview Preparation

  1. How do you design a scalable Databricks job to process nightly batch inventory feeds?
    What the interviewer is looking for: Understanding of cluster sizing, job orchestration, fault tolerance, and performance tuning.
  2. Explain how you would troubleshoot a failing Python ETL job in Databricks that throws intermittent errors.
    What the interviewer is looking for: Debugging methodology, logging strategies, and use of Databricks notebooks and job runs.
  3. Describe your experience integrating ServiceNow with data pipelines for incident ticket creation.
    What the interviewer is looking for: Knowledge of APIs, webhook automation, and end‑to‑end incident lifecycle handling.
  4. What steps would you take to perform root‑cause analysis on a P1 incident affecting the Aptos inventory system?
    What the interviewer is looking for: Systematic approach, use of logs, metrics, and collaboration with cross‑functional teams.
  5. Can you discuss a scenario where you optimized a PL/SQL query used in a Databricks workflow?
    What the interviewer is looking for: Ability to bridge relational databases with Spark, query optimization techniques, and performance impact assessment.
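The debugging theme in question 2 can be sketched in code. A common pattern for intermittent ETL failures is to retry transient errors with exponential backoff and log each attempt. The helper below is a minimal illustration under assumed conditions; `with_retries` and the simulated `flaky_extract` step are hypothetical names for this sketch, not Databricks APIs.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl-retry")

def with_retries(fn, max_attempts=4, base_delay=1.0):
    """Run fn, retrying on failure with exponential backoff.

    Illustrative helper only; in a real job you would catch specific
    transient exception types rather than bare Exception.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception as exc:
            if attempt == max_attempts:
                log.error("attempt %d/%d failed permanently: %s",
                          attempt, max_attempts, exc)
                raise
            delay = base_delay * 2 ** (attempt - 1)
            log.warning("attempt %d/%d failed (%s); retrying in %.2fs",
                        attempt, max_attempts, exc, delay)
            time.sleep(delay)

# Simulated flaky extract step: fails twice, then succeeds.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("transient source timeout")
    return ["row1", "row2"]

rows = with_retries(flaky_extract, base_delay=0.01)
print(rows)  # ['row1', 'row2'] after two retried failures
```

In an interview answer, pairing a pattern like this with Databricks job-run logs and notebook-level debugging shows both methodology and tooling awareness.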

Resume Optimization

  • Databricks
  • Python
  • L2 Support
  • Integration Services
  • Batch Programs
  • Aptos Store Inventory Management System
  • P1/P2 Incident Resolution
  • PL/SQL
  • ServiceNow
  • Retail Data Engineering

Application Strategy

When reaching out to the recruiter, send a concise, well‑structured email, attach your resume, and highlight your top skills up front. Mention directly relevant skills such as Databricks, Python, and incident management, and reference projects where you resolved high‑priority retail data incidents or built batch pipelines.

Career Roadmap

| Current Role | Typical Experience | Core Focus | Next Position |
| --- | --- | --- | --- |
| Databricks Engineer | 2‑4 years in Python & Databricks | Incident support, batch pipelines | Senior Databricks Engineer |
| Senior Databricks Engineer | 5‑7 years, end‑to‑end data platforms | Architecture, performance tuning | Data Platform Lead |
| Data Platform Lead | 8‑10 years, cross‑team leadership | Strategy, governance, scaling | Director of Data Engineering |