The AI engineering field is expanding rapidly as businesses race to embed generative intelligence into their products, making skilled engineers a hot commodity. This role in Phoenix offers the chance to build production-grade AI pipelines on leading cloud platforms and to directly shape innovative solutions. If you have 8+ years of Python and cloud experience, this position could be a career-defining move.

# Job Summary
We are looking for a senior-level AI Engineer to design, develop, and deploy scalable data pipelines and generative-AI solutions on GCP or AWS. The candidate must combine deep Python expertise with hands-on experience in LLM integration, data modeling, and API development to deliver production-ready, cloud-native applications.

# Top 3 Critical Skills
| Skill | Why It's Critical | Mastery Level |
|-------|-------------------|---------------|
| Advanced Python programming | Core for building data pipelines and AI models | Senior |
| Generative AI / LLM integration | Drives the core AI product functionality | Senior |
| Cloud platforms (GCP/AWS) | Enables scalable, production-grade deployment | Senior |

# Interview Preparation
1. **How do you design a scalable data pipeline for training large language models on cloud infrastructure?**
   *What the interviewer is looking for:* Understanding of distributed processing, storage choices, and cost-effective scaling on GCP/AWS.
2. **Explain the trade-offs between using managed services (e.g., BigQuery, SageMaker) versus custom compute for AI workloads.**
   *What the interviewer is looking for:* Ability to evaluate performance, latency, security, and operational overhead.
3. **Describe a situation where you integrated an LLM into an existing API service.
What challenges did you face, and how did you overcome them?**
   *What the interviewer is looking for:* Real-world experience with prompt engineering, latency optimization, and versioning of prompts and models.
4. **What best practices do you follow for data modeling in a high-throughput AI pipeline?**
   *What the interviewer is looking for:* Knowledge of schema design, partitioning, data validation, and governance.
5. **Walk through your approach to monitoring and troubleshooting a cloud-native AI application in production.**
   *What the interviewer is looking for:* Familiarity with logging, alerting, A/B testing, and automated rollback mechanisms.

# Resume Optimization
Weave these keywords into your resume wherever they match real experience:

- Python
- Generative AI
- Large Language Models (LLMs)
- GCP
- AWS
- Data Engineering
- Scalable Data Pipelines
- API Development
- Cloud-native Applications
- Data Modeling

# Application Strategy
When reaching out to the recruiter, send a concise email that opens with a friendly greeting, attaches your resume, and clearly highlights your top skills. Mention closely related skills you possess, such as advanced Python development, generative AI/LLM integration, and extensive experience with GCP/AWS, and reference specific projects where you built scalable AI pipelines or cloud-native solutions that align with the job description.

# Career Roadmap
| Current Role | Typical Experience | Core Focus | Next Position |
|--------------|--------------------|------------|---------------|
| AI Engineer | 8+ years in Python, GenAI, cloud | Building scalable AI pipelines | Senior AI Engineer |
| Senior AI Engineer | 10-12 years, lead projects | Architecture & mentorship | Lead AI Engineer |
| Lead AI Engineer | 13-15 years, strategic AI initiatives | Cross-team AI strategy | AI Engineering Manager |
| AI Engineering Manager | 15+ years, people & product leadership | Managing teams & roadmap | Director of AI Engineering |
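Interview question 3 above probes hands-on experience with integrating an LLM into an API service, including latency optimization. A minimal sketch of two of those concerns (response caching and latency tracking) is below; `CachedLLMClient` and the `fake_llm` stub are illustrative names, not a real provider SDK — in practice the stub would be replaced by a call to a managed endpoint such as Vertex AI or Bedrock.

```python
import hashlib
import time


def fake_llm(prompt):
    """Stand-in for a real LLM endpoint call (illustrative only)."""
    return f"echo: {prompt}"


class CachedLLMClient:
    """Wraps an LLM call with a response cache and simple latency tracking."""

    def __init__(self, llm=fake_llm):
        self.llm = llm
        self.cache = {}
        self.last_latency_ms = None

    def complete(self, prompt):
        # Hash the prompt so the cache key stays fixed-size.
        key = hashlib.sha256(prompt.encode()).hexdigest()
        if key in self.cache:
            return self.cache[key]  # cache hit: skip the model call
        start = time.perf_counter()
        result = self.llm(prompt)
        self.last_latency_ms = (time.perf_counter() - start) * 1000
        self.cache[key] = result
        return result


client = CachedLLMClient()
print(client.complete("hello"))  # first call hits the model
print(client.complete("hello"))  # repeat call is served from the cache
```

In an interview answer, this pattern pairs naturally with a discussion of cache invalidation when prompts or model versions change.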