Artificial Intelligence & Engineering
AI & Engineering leverages cutting-edge engineering capabilities to help build, deploy, and operate integrated/verticalized sector solutions in software, data, AI, network, and hybrid cloud infrastructure. These solutions are powered by engineering for business advantage, helping transform clients' mission-critical operations.
Join our AI & Engineering team to help transform technology platforms, drive innovation, and make a significant impact on our clients' success. You'll work alongside talented professionals reimagining and re-engineering operations and processes that are critical to businesses.
Position Summary
Level: Consultant
As an experienced Consultant at Deloitte Consulting, you will be responsible for independently delivering high-quality work products on time. As needed, you will mentor and/or direct junior team members and liaise with onsite/offshore teams to understand functional requirements.
Work you’ll do:
- Apply industry knowledge to requirements gathering and analysis, ensuring alignment with sector needs.
- Demonstrate expertise across the entire software development lifecycle, from design and coding to deployment and defect resolution, collaborating with stakeholders for effective delivery.
- Conduct peer and team reviews to maintain quality standards.
- Actively engage in Agile practices, including sprint planning, retrospectives, and effort estimation.
- Build and share business process knowledge, supporting project knowledge management and team training.
- Proactively recommend process improvements, track efficiency gains, and contribute to automation and innovation, all aimed at enhancing project outcomes and delivering greater value to clients.
The team:
AI & Data offers a full spectrum of solutions for designing, developing, and operating cutting-edge Data and AI platforms, products, insights, and services. Our offerings help clients innovate and enhance their data, AI, and analytics capabilities so they can mature and scale effectively.
AI & Data professionals will work with our clients to:
- Design and modernize large-scale data and analytics platforms, including data management, governance, and the integration of structured and unstructured data to generate insights, leveraging cloud, edge, and AI/ML technologies, platforms, and methodologies for storage and processing
- Leverage automation, cognitive, and AI-based solutions to manage data, predict scenarios, and prescribe actions
- Drive operational efficiency by managing and upgrading clients' data ecosystems and application platforms, applying analytics expertise and providing As-a-Service models to ensure continuous insights and enhancements.
Qualifications
- Must Have Skills/Project Experience/Certifications:
- 3–6 years of hands-on experience in Scala development, preferably in a data engineering or data pipeline context.
- Strong proficiency in Scala, including functional programming concepts.
- Experience building and maintaining ETL/data pipelines (a brief illustrative sketch follows this list).
- Solid understanding of data structures, algorithms, and software engineering principles.
- Experience with workflow orchestration/scheduling tools (e.g., Apache Airflow, Oozie, Luigi, or similar).
- Familiarity with distributed data processing frameworks (e.g., Apache Spark, Kafka, Flink).
- Proficiency in writing unit and integration tests for data pipelines.
- Experience with version control systems (e.g., Git).
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
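To make the Scala and ETL expectations above concrete, here is a minimal, hypothetical sketch (Scala 2.13+) of a functional-style transform; every name in it (Order, parse, totalByOrder, the sample rows) is illustrative and not drawn from any specific project:

```scala
// Minimal, hypothetical sketch of a functional-style ETL step in Scala.
// All names (Order, parse, totalByOrder) are illustrative only.
object EtlSketch {
  final case class Order(id: String, amountCents: Long)

  // Pure parsing: bad rows become Left errors instead of throwing,
  // which keeps the transform easy to unit-test.
  def parse(line: String): Either[String, Order] =
    line.split(',') match {
      case Array(id, amount) =>
        amount.trim.toLongOption
          .toRight(s"non-numeric amount in: $line")
          .map(Order(id.trim, _))
      case _ => Left(s"malformed row: $line")
    }

  // Pure transform: keep positive amounts, aggregate per order id.
  def totalByOrder(rows: Seq[String]): Map[String, Long] =
    rows
      .flatMap(parse(_).toOption)   // drop bad rows (log them in real code)
      .filter(_.amountCents > 0)
      .groupMapReduce(_.id)(_.amountCents)(_ + _)

  def main(args: Array[String]): Unit = {
    val sample = Seq("a1,100", "a1,50", "b2,-10", "oops")
    assert(totalByOrder(sample) == Map("a1" -> 150L))
    println(totalByOrder(sample))
  }
}
```

Keeping parsing and aggregation pure in this way is what makes the unit- and integration-testing expectations above straightforward to meet.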
- Key Responsibilities:
- Collaborate with business analysts and stakeholders to gather and analyze requirements for data pipeline solutions.
- Design, develop, and maintain scalable data pipelines using Scala and related technologies (see the sketch after this list).
- Write clean, efficient, and well-documented Scala code for data ingestion, transformation, and processing.
- Develop and execute unit, integration, and end-to-end tests to ensure data quality and pipeline reliability.
- Orchestrate and schedule data pipelines using tools such as Apache Airflow, Oozie, or similar workflow schedulers.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Participate in code reviews, provide constructive feedback, and adhere to best practices in software development.
- Document technical solutions, data flows, and pipeline architectures.
- Work closely with DevOps and Data Engineering teams to deploy and maintain solutions in production environments.
- Stay current with emerging technologies and industry trends in big data and Scala development.
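As a rough, hypothetical illustration of the pipeline design and deployment duties above, here is a minimal Apache Spark batch job in Scala; the paths, schema, and column names (amount, order_date, the S3 locations) are assumptions made for the sketch, not a prescribed implementation:

```scala
// Hypothetical Spark batch pipeline: ingest CSV, clean, write Parquet.
// Paths, schema, and column names are illustrative only.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object PipelineSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .appName("orders-daily-batch")
      .getOrCreate()

    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("s3://example-bucket/raw/orders/")   // hypothetical source

    val cleaned = raw
      .filter(col("amount") > 0)                // drop refunds/bad rows
      .withColumn("ingested_at", current_timestamp())

    cleaned.write
      .mode("overwrite")
      .partitionBy("order_date")                // assumes an order_date column
      .parquet("s3://example-bucket/curated/orders/")

    spark.stop()
  }
}
```

In production, a job like this would typically be triggered on a schedule by an orchestrator such as Apache Airflow, then monitored, troubleshot, and tuned as described above.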
- Good to Have Skills/Project Experience:
- Experience with cloud platforms (AWS, Azure, or GCP) and related data services.
- Knowledge of SQL and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB).
- Familiarity with containerization and orchestration tools (Docker, Kubernetes).
- Exposure to CI/CD pipelines and DevOps practices.
- Experience with data modeling and data warehousing concepts.
- Knowledge of other programming languages (e.g., Python, Java) is a plus.
- Experience working in Agile/Scrum environments.
Education:
- BE/B.Tech/M.C.A./M.Sc (CS) degree or equivalent from an accredited university
Location:
- Bengaluru/Hyderabad/Pune/Chennai/Kolkata