Databricks Consultant
Human Capital
The Human Capital Offering Portfolio focuses on helping organizations manage and sustain their performance through their most important asset: their people. Centered on five core issues, this Portfolio signals to the market that we see Human Capital as a topic critical to the C-Suite. As we go to market, we will show our clients that we serve more than HR organizations – from the CEO to the CFO, from the Risk Manager to the Business Unit leader – and that we deliver on our issues and help create value for our clients.
Insights, Innovation & Operate Offering:
Our Insights, Innovation & Operate Offering is designed to enhance key aspects of our clients' businesses by leveraging cutting-edge technology, data, and a blend of deep technical and human expertise. We innovate and deliver creative, industry-specific solutions that streamline operations and accelerate speed-to-value.
Position Summary
Databricks Data Scientist (Consultants, Senior Consultants)
As a Databricks Data Scientist at Deloitte Consulting, you will be responsible for designing, developing, and deploying advanced analytics and machine learning solutions using the Databricks Unified Analytics Platform. You will work with large-scale datasets to extract meaningful insights, build predictive models, and drive data-driven decision-making for our clients across various industries.
We are looking for professionals who can:
· Design and implement scalable data science solutions on the Databricks platform
· Build and deploy machine learning models using Databricks ML and MLflow
· Architect end-to-end data pipelines leveraging Delta Lake architecture
· Collaborate with cross-functional teams to translate business problems into analytical solutions
· Drive innovation through advanced analytics, AI/ML, and big data technologies
Work you'll do:
HC Forward is Deloitte's innovation engine for Human Capital, integrating technology, data, and industry expertise to create scalable solutions and assets that extend client capabilities and drive ongoing value across all Human Capital offerings.
Common (All Levels)
· Design and develop machine learning models and analytics solutions on the Databricks platform
· Leverage Databricks SQL, Delta Lake, and MLflow for efficient data processing and model lifecycle management
· Build scalable data pipelines using Apache Spark and PySpark within the Databricks environment
· Collaborate with data engineers, architects, and business stakeholders to deliver end-to-end solutions
· Perform exploratory data analysis, feature engineering, and model evaluation
· Document technical solutions, best practices, and knowledge transfer materials
· Travel as required by project needs
· Demonstrate strong written and spoken communication skills to facilitate technical and business conversations
Consultant (4–6 Years)
· Develop and deploy machine learning models using Databricks ML and scikit-learn
· Implement data transformations and ETL processes using PySpark and Spark SQL
· Create interactive dashboards and visualizations using Databricks notebooks
· Participate in model validation, testing, and performance optimization
· Support senior team members in solution architecture and design decisions
· Conduct code reviews and ensure adherence to coding standards
Senior Consultant (6–10 Years)
· Lead the design and implementation of complex analytics solutions on the Databricks platform
· Architect scalable machine learning pipelines with model versioning and deployment automation
· Optimize Spark jobs and cluster configurations for performance and cost efficiency
· Mentor junior team members on Databricks best practices and advanced analytics techniques
· Drive technical solutioning discussions with clients and stakeholders
· Integrate Databricks solutions with enterprise data ecosystems (Azure, AWS, GCP)
· Lead proof-of-concepts and innovation initiatives using emerging Databricks features
Common Skills:
· Deep expertise in the Databricks platform (Workspace, SQL, ML, Delta Lake)
· Proficiency in Python, PySpark, and Spark SQL
· Experience with machine learning frameworks (scikit-learn, TensorFlow, PyTorch)
· Understanding of MLOps principles and MLflow
· Knowledge of cloud platforms (Azure, AWS, or GCP)
· Strong foundation in statistics, mathematics, and machine learning algorithms
· Experience with Delta Lake architecture and data lakehouse concepts
· Ability to work with large-scale distributed data processing
Preferred Skills:
· Databricks certification (Associate or Professional level)
· Experience with advanced ML techniques (deep learning, NLP, computer vision)
· Knowledge of streaming data processing with Structured Streaming
· Familiarity with Databricks Unity Catalog and data governance
· Experience with CI/CD pipelines for ML models
· Understanding of containerization (Docker, Kubernetes)
· Knowledge of data visualization tools (Tableau, Power BI)
· Experience in the HR analytics or workforce analytics domain
Education: BE/B.Tech/MCA/M.Sc (CS, Data Science, Statistics, Mathematics, or a related field)
Locations: Bengaluru, Hyderabad, Pune, Chennai