Human Capital
The Human Capital Offering Portfolio focuses on helping organizations manage and sustain their performance through their most important asset: their people. Centered on five core issues, this Portfolio signifies to the market that we see Human Capital as a topic critical to the C-suite. As we go to market, we will show our clients that we serve more than HR organizations, from the CEO and CFO to Risk Managers and Business Unit leaders, and that we deliver on our issues and help create value for our clients.
Position Summary
Level: Manager
As a Manager at Deloitte Consulting, you will oversee the technical delivery of enterprise-scale software solutions, lead cross-functional and global teams, and mentor junior team members. You will collaborate with stakeholders to understand functional requirements, support sales and proposal efforts, and drive end-to-end project delivery, including estimation and planning, to ensure successful outcomes.
Work you’ll do:
HC Forward is Deloitte’s innovation engine for Human Capital, integrating technology, data, and industry expertise to create scalable solutions and assets that extend client capabilities and drive ongoing value across all Human Capital offerings.
In this role you will:
- Design and evolve multi-tenant data platforms serving analysts, data scientists, ML engineers, and GenAI applications.
- Architect scalable solutions for batch, streaming, and real-time data workloads at terabyte scale.
- Define and implement data mesh principles and build self-serve data platform capabilities for domain teams.
- Create reference architectures for common HR and workforce analytics patterns (e.g., People Analytics, Skills Intelligence, Payroll Optimization).
- Lead the design of cloud-agnostic and hybrid platforms using Azure, AWS, GCP, and open-source technologies.
- Establish and enforce data architecture standards, reusable patterns, and best practices across the organization.
- Lead architecture reviews and ensure alignment across products and platform initiatives.
- Maintain technical documentation using frameworks such as C4 modeling, TOGAF, Zachman, and ADRs.
- Guide engineering teams to ensure implementation aligns with the architectural vision and roadmap.
- Drive technical decision-making that balances innovation, performance, scalability, and cost.
- Build proof-of-concepts and prototypes to validate and iterate architectural approaches.
- Develop reusable components, templates, and accelerators for pipelines, governance, and quality frameworks.
- Troubleshoot complex production issues requiring deep architectural insight and cross-system understanding.
- Collaborate with product managers, client engagement teams, and senior stakeholders to align platform capabilities with business goals.
- Mentor engineers in system design and data architecture and contribute to thought leadership through whitepapers and conferences.
- Travel: Based on business needs
The team:
Our Insights, Innovation & Operate Offering is designed to enhance key aspects of our clients' businesses by leveraging cutting-edge technology, data, and a blend of deep technical and human expertise. We innovate and deliver creative, industry-specific solutions that streamline operations and accelerate speed-to-value.
Qualifications
Must Have Skills/Project Experience/Certifications:
- 12–15+ years in software/data engineering with progressive experience in data architecture roles.
- 4–5+ years as a senior architect or technical lead designing large-scale enterprise data platforms.
- Proven leadership in technical initiatives, team mentoring, and stakeholder engagement at the manager level.
- Demonstrated experience in at least one major industry vertical (e.g., Human Capital, Finance, Healthcare, Retail, Manufacturing).
- Designed and scaled data platforms at terabyte+ scale serving 100+ concurrent users.
- Deep expertise in distributed batch processing (e.g., Spark, Databricks).
- Strong knowledge of modern storage solutions: Data Lakes (S3, ADLS), Warehouses (Snowflake, BigQuery, Synapse), and Lakehouses (Apache Iceberg).
- Experience with orchestration tools (Airflow, Argo, Dagster, or equivalents).
- Hands-on with relational and OLAP databases (e.g., PostgreSQL, MySQL/MariaDB, DuckDB).
- Solid grounding in data modeling patterns: Dimensional, Data Vault, event sourcing, and CQRS.
- Proficient in Python and SQL for data engineering and architecture tasks.
- Familiar with infrastructure tooling: Kubernetes, Terraform/Pulumi, GitOps, and cloud-native stacks.
- Strong focus on performance tuning: query optimization, indexing, partitioning, and caching.
- Knowledge of data security and compliance: encryption, RBAC, GDPR, HIPAA.
- Skilled in stakeholder communication, cost optimization (FinOps), and mentoring technical teams.
Good to Have Skills/Project Experience/Certifications:
- Exposure to emerging data architectures like data mesh, data fabric, and federated learning.
- Understanding of GenAI integration patterns: RAG architecture, embedding pipelines, vector search.
- Experience with graph databases (e.g., Neo4j, Amazon Neptune) for relationship-based analytics.
- Relevant cloud certifications (AWS Solutions Architect, Azure Data Engineer, GCP Professional).
- Contributions to open-source projects, industry publications, or conference speaking engagements.
Education:
BE/B.Tech/MCA/M.Sc (CS) degree or equivalent from an accredited university
Location:
Bengaluru/Hyderabad/Pune/Chennai/Kolkata