About Deloitte
Deloitte refers to one or more of Deloitte Touche Tohmatsu Limited, a UK private company limited by guarantee (“DTTL”), its network of
member firms, and their related entities. DTTL and each of its member firms are legally separate and independent entities. DTTL (also
referred to as “Deloitte Global”) does not provide services to clients. In the United States, Deloitte refers to one or more of the US
member firms of DTTL, their related entities that operate using the “Deloitte” name in the United States and their respective affiliates.
Certain services may not be available to attest clients under the rules and regulations of public accounting. Please see
www.deloitte.com/about to learn more about our global network of member firms.
Copyright © 2026 Deloitte Development LLC. All rights reserved.
Sr. Analyst – Data Engineer – Deloitte Support Services India Private Limited
Work you’ll do:
We are seeking a seasoned Data Engineer to design, build, and operate scalable data pipelines and curated datasets that power analytics, reporting, and data products. This role requires strong hands-on data engineering, data modeling, and data architecture experience in cloud environments (preferably GCP) and Python. You will collaborate with a cross-functional agile team of analytics engineers, human-centered designers, web developers, and DevOps engineers to launch innovative AI-driven data and analytics products. Experience with Azure and AWS data engineering tools is a plus.
Responsibilities:
─ Build and maintain batch and/or streaming data pipelines to ingest, transform, and serve data for analytics and AI use cases
─ Enhance data collection procedures to gather the most relevant data while upholding data security and privacy norms and obligations.
─ Develop reusable Python components for data processing, automation, and orchestration.
─ Design and implement logical and physical data models to support BI, analytics and AI consumption.
─ Contribute to data architecture decisions: source-to-target design, layering (raw/curated/semantic), integration patterns, and scalability considerations.
─ Implement data quality checks, validation, monitoring/alerting, and production support processes.
─ Optimize performance and cost (query tuning, partitioning/clustering, pipeline efficiency).
─ Support AI-driven analytics products by enabling feature-ready datasets, metadata, lineage, and reproducible data inputs for model training/inference
─ Participate in code reviews, design reviews, documentation, and knowledge sharing.
─ Develop an understanding of the client's core business functions and bring innovative analytics solution ideas to support those functions.
Required Technical Skills:
Must Have:
─ Data engineering: building production-grade ETL/ELT pipelines and managing structured datasets.
─ Data modeling: translating business needs into analytics-ready models; strong SQL.
─ Data architecture: designing end-to-end data flows, standards, and patterns for reliability and scalability.
─ GCP: hands-on experience with common services like BigQuery, Cloud Storage (GCS), Pub/Sub, Dataflow, Dataproc (Spark), Cloud Composer (Airflow), Cloud Functions and/or Cloud Run, Cloud Logging/Monitoring, Vertex AI, Secret Manager, IAM.
─ Python: strong scripting/software engineering fundamentals (testing, packaging, debugging, maintainable code).
─ Proven technical troubleshooting and performance tuning experience
─ Experience integrating BI tools such as Looker, Power BI, or Tableau.
Good to have:
─ Azure data engineering tools/services (e.g., Fabric, Data Factory, Databricks).
─ AWS data engineering tools/services (e.g., Glue, Redshift, EMR, Lambda).
─ Familiarity with AI engineering concepts (feature engineering, training/inference data consistency, model monitoring data needs, governance for AI/ML data).
─ Exposure or experience supporting AI-driven data and analytics products, such as preparing ML-ready datasets, feature pipelines, or integrating data workflows into model development/serving processes.
─ Familiarity with CI/CD and Infrastructure-as-Code practices for data platforms.
─ Experience with governance concepts (catalog/lineage, access controls, PII handling)
Soft Skills:
─ Excellent verbal and written communication skills, with an ability to convey complex concepts clearly and concisely.
─ Skilled at engaging business stakeholders, business engagement, and requirement discovery.
─ Analytical ability to manage multiple projects and prioritize tasks into manageable work products
─ Able to operate independently or with minimal supervision.
─ Strong sense of ownership, urgency, and drive
─ High level of initiative and self-motivation
Education:
─ Bachelor's degree, preferably in Computer Science, Information Technology, Computer Engineering, or related IT discipline; or equivalent experience
Experience:
─ 3–5 years of hands-on experience in data engineering and analytics data delivery.
─ Demonstrated experience designing data models and contributing to data architecture for cloud-based platforms.
─ Experience supporting production pipelines (monitoring, troubleshooting, SLAs) and working in Agile delivery teams.
─ Prior experience partnering with data engineers, AI engineers and DevOps practitioners to operationalize data for model training and analytics products is preferred.
Location: Hyderabad
Shift Timing: 2:00 PM to 11:00 PM IST