The Customer Team empowers organizations to build deeper relationships with customers through innovative strategies, advanced analytics, GenAI, transformative technologies, and creative design. We enable Deloitte client service teams to enhance customer experiences and drive sustained growth and customer value creation and capture through customer and commercial strategies, digital product and innovation, marketing, commerce, sales, and service. We are a team of strategists, data scientists, operators, creatives, designers, engineers, and architects who balance business strategy, technology, creativity, and ongoing managed services to help solve the biggest problems that impact customers, partners, constituents, and the workforce. We also offer Business Process as a Service, enabling organizations to streamline operations and achieve greater efficiency through scalable, technology-enabled managed insights that guide ongoing transformation and operational excellence.
Position Summary
Level: Consultant or equivalent
Work you’ll do
As a Data Engineer within Deloitte’s Customer Strategy & Design (CS&D) practice, you will design, build, and optimize large-scale data ingestion, transformation, and orchestration systems that support analytics, AI, and digital transformation initiatives. You will play a critical role in building resilient data ecosystems, driving automation, and enabling actionable insights for global clients.
1. Data Pipeline Development & Architecture
· Design, build, and optimize ETL/ELT pipelines using Python/PySpark/SQL on distributed data platforms (see the illustrative sketch below this list).
· Engineer streaming and batch workflows using Azure Data Factory/Databricks/Apache Spark/Snowflake.
· Implement ingestion frameworks for structured, semi-structured, and unstructured data (APIs/Kafka/SFTP/Delta Lake).
· Design scalable Lakehouse architectures integrating Delta Lake/Azure Synapse/AWS Glue.
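As an illustration of the kind of ETL/ELT pipeline work described above, a minimal PySpark batch sketch follows; the source path, table names, and columns are hypothetical placeholders, not a Deloitte or client artifact.

```python
# Illustrative only: a minimal PySpark batch ETL job. Paths, table
# names, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_etl").getOrCreate()

# Extract: read semi-structured data landed by an upstream ingestion job.
raw = spark.read.json("s3://example-bucket/raw/orders/2024-01-01/")

# Transform: deduplicate, type, and derive the partitioning column.
orders = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Load: write a partitioned Delta table (assumes the Delta Lake
# libraries and an "analytics" schema exist, e.g. on Databricks).
(orders.write
       .format("delta")
       .mode("overwrite")
       .partitionBy("order_date")
       .saveAsTable("analytics.orders_daily"))
```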
2. Cloud & Big Data Platforms
· Deploy and maintain data pipelines on Azure/AWS/GCP using Data Factory/Glue/Dataflow/BigQuery/Redshift.
· Leverage Databricks/Spark SQL/Hadoop/Hive/Presto for distributed processing.
· Work with Snowflake/Azure Synapse/BigQuery for data warehousing and analytics.
· Implement partitioning, clustering, and indexing for optimal performance (see the illustrative sketch below this list).
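The partition-aware layout tuning mentioned above can be sketched, for illustration only, with Spark SQL issued from PySpark; the database, table, and column names are hypothetical, and the same idea maps to clustering keys in Snowflake or BigQuery.

```python
# Illustrative only: partition-aware table layout and a query that
# benefits from partition pruning, issued as Spark SQL from PySpark.
# Database, table, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("layout_tuning_demo").getOrCreate()

spark.sql("CREATE DATABASE IF NOT EXISTS analytics")

# Partition on a low-cardinality column that appears in most filters;
# clustering keys in Snowflake/BigQuery serve the same purpose.
spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics.web_events (
        event_id   STRING,
        user_id    STRING,
        event_type STRING,
        event_date DATE
    )
    USING parquet
    PARTITIONED BY (event_date)
""")

# A filter on the partition column lets Spark skip irrelevant files.
spark.sql("""
    SELECT event_type, COUNT(*) AS events
    FROM analytics.web_events
    WHERE event_date = DATE'2024-01-01'
    GROUP BY event_type
""").show()
```

Choosing the partition or clustering column to match the dominant filter predicate is usually the single biggest lever for query cost and latency.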
3. Data Modeling & Management
· Design robust data models (3NF/dimensional/data vault/star/snowflake).
· Implement metadata management, lineage, and cataloging with Apache Atlas/Purview/Collibra.
· Enforce data governance and compliance (GDPR/HIPAA/CCPA).
· Ensure data quality using Great Expectations/Deequ/Monte Carlo (see the illustrative sketch below this list).
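As a hand-rolled stand-in for the checks that frameworks such as Great Expectations or Deequ formalize, the following illustrative PySpark validation sketch uses a hypothetical table and rules.

```python
# Illustrative only: hand-rolled data quality checks in PySpark.
# Frameworks such as Great Expectations or Deequ formalize the same
# idea (declared expectations, run results, alerting); the table name
# and rules below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()
orders = spark.table("analytics.orders_daily")

total = orders.count()
checks = {
    # Primary key must be present and unique.
    "order_id_not_null": orders.filter(F.col("order_id").isNull()).count() == 0,
    "order_id_unique": orders.select("order_id").distinct().count() == total,
    # Business rule: order amounts must be positive.
    "amount_positive": orders.filter(F.col("amount") <= 0).count() == 0,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # A production pipeline would alert and quarantine the bad partition
    # rather than simply failing the job.
    raise ValueError(f"Data quality checks failed: {failed}")
```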
4. Orchestration, Automation & CI/CD
· Automate pipelines using Apache Airflow/dbt/Azure Synapse Pipelines (see the illustrative sketch below this list).
· Implement CI/CD with Azure DevOps/GitHub Actions/Jenkins.
· Use Terraform/Ansible for Infrastructure-as-Code (IaC).
· Build observability dashboards using Prometheus/Grafana/Datadog.
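A minimal Apache Airflow DAG, shown for illustration only, sketches how daily extract/transform/publish steps could be orchestrated; the task bodies, names, and schedule are hypothetical and assume a recent Airflow 2.x TaskFlow setup.

```python
# Illustrative only: a minimal Airflow DAG (TaskFlow API, Airflow 2.4+)
# orchestrating daily extract/transform/publish steps. Task bodies,
# names, and the schedule are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_daily_pipeline():
    @task
    def extract() -> str:
        # e.g. land API/SFTP files into the raw zone
        return "s3://example-bucket/raw/orders/"

    @task
    def transform(raw_path: str) -> str:
        # e.g. trigger a Spark/Databricks job that cleans the raw data
        return "analytics.orders_daily"

    @task
    def publish(table_name: str) -> None:
        # e.g. refresh downstream marts and notify consumers
        print(f"Published {table_name}")

    publish(transform(extract()))


orders_daily_pipeline()
```

In practice the heavy lifting would run in Spark/Databricks or dbt; the orchestrator only sequences, retries, and monitors the steps.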
5. Streaming & Real-Time Processing
· Design and deploy real-time pipelines using Apache Kafka/Kinesis/Azure Event Hubs.
· Implement streaming ETL with Spark Structured Streaming/Kafka Streams/Flink (see the illustrative sketch below this list).
· Integrate near real-time data into analytics and ML workflows.
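For illustration, a minimal Spark Structured Streaming job reading click events from Kafka into a Delta table; the broker address, topic, schema, and storage paths are hypothetical, and the Kafka and Delta connector packages are assumed to be available on the cluster.

```python
# Illustrative only: Spark Structured Streaming from Kafka to Delta.
# Broker, topic, schema, and paths are hypothetical placeholders, and
# the Kafka/Delta connector packages must be present on the cluster.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("clickstream_ingest").getOrCreate()

event_schema = StructType([
    StructField("user_id", StringType()),
    StructField("page", StringType()),
    StructField("event_ts", TimestampType()),
])

events = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "broker-1:9092")
         .option("subscribe", "clickstream")
         .load()
         # Kafka delivers bytes; parse the JSON payload into columns.
         .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
         .select("e.*")
)

query = (
    events.writeStream
          .format("delta")
          .option("checkpointLocation", "s3://example-bucket/checkpoints/clickstream/")
          .outputMode("append")
          .start("s3://example-bucket/bronze/clickstream/")
)
query.awaitTermination()
```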
6. Collaboration & Documentation
· Collaborate with data scientists, architects, analysts, and product teams to translate business needs into engineering solutions.
· Document architecture, logic, and lineage for audit and reproducibility.
· Contribute to internal knowledge sharing and reusable frameworks.
If you are open to working night shifts, please let our Talent team know during the hiring process.
KEY AREAS OF EXPERTISE
· Languages & Frameworks: Python/PySpark/Scala/SQL/Java
· Big Data & Cloud: Hadoop/Spark/Databricks/Snowflake/Synapse/BigQuery/Redshift
· ETL & Orchestration: Azure Data Factory/Apache Airflow/dbt/NiFi/Informatica
· Data Modeling: Kimball/Inmon/Data Vault/Star/Snowflake
· Data Quality & Governance: Great Expectations/Apache Atlas/Collibra/Purview
· Visualization Tools: Power BI/Tableau/Looker/Fabric
The team:
The Customer Strategy & Design (CS&D) team is a core part of Deloitte’s Customer portfolio, helping organizations reimagine customer engagement, drive growth, and enhance experiences across the lifecycle. We operate at the intersection of strategy, design, and digital transformation, bringing together strategists, designers, analysts, and industry experts.
We work with C-level leaders to tackle complex challenges—from launching new ventures and redefining go-to-market strategies to shaping omnichannel experiences and driving marketing, sales, pricing, and service excellence. Our strength lies in delivering executable strategies that balance long-term vision with practical implementation.
Partnering across industries, we ensure that our solutions deliver both measurable impact and meaningful customer outcomes, guiding clients from insight to execution.
Qualifications
Must Have Skills/Project Experience/Certifications:
· 4-6 years of relevant experience for Senior Consultants and 2-3 years for Consultants
· Comfortable working night shifts.
Deloitte is seeking an experienced Data Engineering professional with deep expertise in building scalable data platforms and enabling enterprise data transformation.
· Professional experience: Data engineering and analytics consulting experience at consulting firms (including the Big 4), or experience within enterprise data teams in industry, with a focus on data architecture, pipeline development, cloud data platforms, and analytics enablement through scalable data solutions.
· Industry experience: Experience across the industries listed below is preferable:
o Retail, Consumer Goods & Industrial Products
o Telecom, Media & Technology
o Life Sciences & Healthcare
o Energy & Industrial
o Transportation & Hospitality
· Good understanding of how businesses price products and services for different customers in B2B, B2C, or B2B2C environments.
· Core Consulting skills: Managing the pace and delivery of data engineering projects, including coordination with key project stakeholders, defining technical requirements, and ensuring successful execution within timelines and quality standards.
· Technical & Engineering Skills: Strong foundation in data architecture, pipelines, and cloud data platforms. Able to communicate technical outcomes effectively to non-technical stakeholders.
· Expertise in Advanced Data Engineering & Emerging Platforms:
o Understanding of DataOps/DevOps/MLOps for continuous integration and deployment.
o Experience implementing CI/CD/IaC using Terraform/CloudFormation and automation tools (Azure DevOps/GitHub Actions/Jenkins).
o Knowledge of real-time data streaming architectures and modern data principles (Data Mesh/Data Fabric).
· Mandatory Tools Experience:
o Microsoft Office: Proficient in Microsoft Excel for advanced data analysis and financial modeling, and Microsoft PowerPoint for creating impactful presentations and visualizing data insights
o Python/R: Proficiency in Python or R for data manipulation, statistical analysis, and machine learning
o SQL: Strong command over SQL for data querying and management
o Tableau/Power BI: Experience in data visualization using Tableau or Power BI to communicate insights effectively
· Master’s degree (M.Tech/M.Sc/MS) in Computer Science, Information Technology, Data Engineering/Data Science, or related fields from top institutes in India.
Location:
· Bengaluru/Hyderabad