Artificial Intelligence & Engineering
AI & Engineering leverages cutting-edge engineering capabilities to help build, deploy, and operate integrated/verticalized sector solutions in software, data, AI, network, and hybrid cloud infrastructure. These solutions are powered by engineering for business advantage, helping transform mission-critical operations.
Join our AI & Engineering team to help transform technology platforms, drive innovation, and make a significant impact on our clients' success. You'll work alongside talented professionals reimagining and re-engineering operations and processes that are critical to businesses.
Position Summary
Level: Senior Consultant or equivalent
As a Senior Consultant at Deloitte Consulting, you will design, develop, and deploy enterprise-scale software solutions, lead the creation of robust pipelines, and manage code deployment across environments. You will collaborate with cross-functional, global teams to translate functional requirements into effective deliverables, independently guiding and mentoring junior team members. Your role spans the full project lifecycle, including estimation, planning, execution, and tracking key metrics for analysis, ensuring high-quality and timely delivery of solutions.
Work you’ll do:
- Lead discussions with business and functional analysts to understand requirements and assess integration impacts on business architecture
- Prepare technical architecture and design; clarify requirements and resolve ambiguities
- Develop solutions in line with established technical designs, integration standards, and quality processes
- Create and enforce technical design and development guides, templates, and standards
- Facilitate daily scrum meetings, manage deliverables, and prepare weekly status reports for leadership review
- Conduct detailed deliverable reviews and provide technical guidance to team members
- Collaborate with onsite clients, coordinators, analysts, and cross-functional teams
- Design templates or scripts to automate routine development or operational tasks
The team:
AI & Data offers a full spectrum of solutions for designing, developing, and operating cutting-edge Data and AI platforms, products, insights, and services. Our offering helps clients innovate and enhance their data, AI, and analytics capabilities, ensuring they can mature and scale effectively.
AI & Data professionals will work with our clients to:
- Design and modernize large-scale data and analytics platforms, including data management, governance, and the integration of structured and unstructured data to generate insights, leveraging cloud, edge, and AI/ML technologies, platforms, and methodologies for storage and processing
- Leverage automation, cognitive, and AI-based solutions to manage data, predict scenarios, and prescribe actions
- Drive operational efficiency by managing and upgrading their data ecosystems and application platforms, utilizing analytics expertise and providing As-a-Service models to ensure continuous insights and enhancements.
Qualifications
Must Have Skills/Project Experience/Certifications:
- 6-10 years of hands-on experience in Apache Spark and Python programming
- Deep technical understanding of distributed computing and broader awareness of different Spark versions
- Strong UNIX operating system concepts and shell scripting knowledge
- Deep experience developing data processing tasks in PySpark, such as reading data from external sources, merging data, performing data enrichment, and loading into target data destinations (a minimal sketch follows this list)
- Experience deploying and operationalizing code; knowledge of scheduling tools such as Airflow or Control-M is preferred (see the Airflow sketch after this list)
- Working experience with the AWS ecosystem, Google Cloud, BigQuery, etc. is an added advantage
- Hands-on experience with AWS S3 filesystem operations
- Good knowledge of Hadoop, Hive, and the Cloudera/Hortonworks Data Platform
- Exposure to Jenkins or an equivalent CI/CD tool and Git repositories
- Experience handling CDC operations on huge data volumes (see the CDC sketch after this list)
- Understanding of and working experience with the Agile delivery model
- Experience in Spark-related performance tuning
- Well versed in design documents such as high-level design (HLD) and technical design (TDD) documents
- Well versed in historical data loads and overall framework concepts
- Participation in different kinds of testing, such as unit testing, system testing, and user acceptance testing
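To give a flavor of the PySpark work described above, here is a minimal sketch of a read-merge-enrich-load job. The bucket, paths, and column names are hypothetical, and reading from s3a:// paths assumes the cluster is configured with the hadoop-aws connector and valid credentials:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("customer-enrichment").getOrCreate()

# Read from external sources (bucket and paths are hypothetical)
orders = spark.read.parquet("s3a://example-bucket/raw/orders/")
customers = spark.read.option("header", "true").csv("s3a://example-bucket/raw/customers.csv")

# Merge: join orders with customer master data on a shared key
merged = orders.join(customers, on="customer_id", how="left")

# Enrich: derive an order-value tier from the order amount
enriched = merged.withColumn(
    "value_tier",
    F.when(F.col("order_amount") >= 1000, "high").otherwise("standard"),
)

# Load into the target destination, partitioned for downstream reads
enriched.write.mode("overwrite").partitionBy("value_tier").parquet(
    "s3a://example-bucket/curated/orders_enriched/"
)
```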
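For scheduling and operationalization, a minimal Airflow 2.x DAG along these lines could submit such a job daily. The DAG id, schedule, and application path are illustrative; SparkSubmitOperator requires the apache-airflow-providers-apache-spark package and a configured Spark connection:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="orders_enrichment_daily",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",     # run daily at 02:00
    catchup=False,
) as dag:
    run_pipeline = SparkSubmitOperator(
        task_id="run_enrichment",
        application="/opt/jobs/orders_enrichment.py",  # hypothetical job path
        conn_id="spark_default",
    )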
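For CDC on large volumes, one common pattern is to keep only the latest change event per key and drop deletes before overwriting the target. This sketch assumes hypothetical column names (customer_id as the key, change_ts as the event time, op as the I/U/D operation flag); on very large feeds, tuning settings such as spark.sql.shuffle.partitions for the window shuffle is part of the performance work mentioned above:

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("cdc-apply").getOrCreate()

# CDC feed with one row per change event (schema is hypothetical)
changes = spark.read.parquet("s3a://example-bucket/raw/customer_changes/")

# Keep only the most recent event per key, then drop deletes
w = Window.partitionBy("customer_id").orderBy(F.col("change_ts").desc())
latest = (
    changes.withColumn("rn", F.row_number().over(w))
    .filter(F.col("rn") == 1)
    .drop("rn")
    .filter(F.col("op") != "D")
)

latest.write.mode("overwrite").parquet("s3a://example-bucket/curated/customers_current/")
```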
Good to Have Skills/Project Experience/Certifications:
- Exposure to PySpark, Cloudera/Hortonworks, Hadoop, and Hive.
- Exposure to AWS S3/EC2 and Apache Airflow
- Participation in client interactions/meetings is desirable.
- Participation in code tuning is desirable.
Location:
- Bengaluru/Hyderabad/Pune/Chennai/Kolkata