Artificial Intelligence & Engineering
AI & Engineering leverages cutting-edge engineering capabilities to help build, deploy, and operate integrated, verticalized sector solutions across software, data, AI, network, and hybrid cloud infrastructure. These solutions are powered by engineering for business advantage, helping transform mission-critical operations.
Join our AI & Engineering team to help transform technology platforms, drive innovation, and make a significant impact on our clients' success. You’ll work alongside talented professionals reimagining and re-engineering operations and processes that are critical to businesses.
Position Summary
Level: Consultant
As an experienced Consultant at Deloitte Consulting, you will be responsible for individually delivering high-quality work products within agreed timelines. On a need basis, you will mentor and/or direct junior team members and liaise with onsite/offshore teams to understand the functional requirements.
Work you’ll do:
- Apply industry knowledge to requirements gathering and analysis, ensuring alignment with sector needs.
- Demonstrate expertise across the entire software development lifecycle, from design and coding to deployment and defect resolution, collaborating with stakeholders for effective delivery.
- Conduct peer and team reviews to maintain quality standards.
- Actively engage in Agile practices, including sprint planning, retrospectives, and effort estimation.
- Build and share business process knowledge, supporting project knowledge management and team training.
- Proactively recommend process improvements, track efficiency gains, and contribute to automation and innovation, all aimed at enhancing project outcomes and delivering greater value to clients.
The team:
AI & Data offers a full spectrum of solutions for designing, developing, and operating cutting-edge Data and AI platforms, products, insights, and services. Our offerings help clients innovate and enhance their data, AI, and analytics capabilities, ensuring they can mature and scale effectively.
AI & Data professionals will work with our clients to:
- Design and modernize large-scale data and analytics platforms, including data management, governance, and the integration of structured and unstructured data to generate insights, leveraging cloud, edge, and AI/ML technologies, platforms, and methodologies for storage and processing
- Leverage automation, cognitive, and AI-based solutions to manage data, predict scenarios, and prescribe actions
- Drive operational efficiency by managing and upgrading their data ecosystems and application platforms, utilizing analytics expertise and providing As-a-Service models to ensure continuous insights and enhancements.
Qualifications
Must Have Skills/Project Experience/Certifications:
- 3-6 years of experience in designing and implementing migrations of enterprise legacy systems to a big data ecosystem for data warehousing projects.
- Must have excellent knowledge of Apache Spark and strong Python programming experience
- Deep technical understanding of distributed computing and broader awareness of the wider Spark ecosystem
- Strong UNIX operating system concepts and shell scripting knowledge
- Hands-on experience using Spark & Python
- Deep experience developing data processing tasks using PySpark, such as reading data from external sources, merging data sets, performing data enrichment, and loading into target data destinations.
- Experience in deploying and operationalizing code; knowledge of scheduling tools such as Airflow, Control-M, etc. is preferred
- Working experience with the AWS ecosystem, Google Cloud, BigQuery, etc. is an added advantage
- Hands-on experience with AWS S3 filesystem operations
- Good knowledge of Hadoop, Hive and Cloudera/ Hortonworks Data Platform
- Should have exposure to Jenkins or an equivalent CI/CD tool and a Git repository
- Experience handling CDC (change data capture) operations for huge volumes of data
- Should understand and have operating experience with Agile delivery model
- Should have experience in Spark related performance tuning
- Should be well versed with design documents such as HLD, TDD, etc.
- Should be well versed with historical data loads and overall framework concepts
- Should have participated in different kinds of testing, such as Unit Testing, System Testing, User Acceptance Testing, etc.
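Several of the items above (PySpark pipelines, CDC handling, historical loads) revolve around the same merge-and-load pattern. As a rough illustration only, the core CDC upsert idea can be sketched in plain Python; the record layout and the I/U/D operation flags here are assumptions, and a real project would express this in PySpark against Hive or S3 rather than over in-memory dicts:

```python
# Minimal sketch of a CDC (change data capture) upsert.
# Assumptions (hypothetical, for illustration): each change record carries
# a primary key "pk", an "op" flag (I=insert, U=update, D=delete), and the
# new row payload under "row".

def apply_cdc(target, changes):
    """Apply a batch of CDC records to a target table state.

    target:  dict mapping primary key -> row (current table state)
    changes: iterable of dicts like {"pk": ..., "op": "I"/"U"/"D", "row": {...}}
    Returns a new dict representing the updated table state.
    """
    result = dict(target)  # copy so the original state is untouched
    for change in changes:
        pk, op = change["pk"], change["op"]
        if op in ("I", "U"):
            result[pk] = change["row"]   # insert new key or overwrite existing
        elif op == "D":
            result.pop(pk, None)         # delete if present, ignore otherwise
        else:
            raise ValueError(f"unknown CDC op: {op!r}")
    return result


state = {1: {"name": "old"}}
batch = [
    {"pk": 1, "op": "U", "row": {"name": "new"}},
    {"pk": 2, "op": "I", "row": {"name": "added"}},
    {"pk": 1, "op": "D", "row": None},
]
print(apply_cdc(state, batch))  # {2: {'name': 'added'}}
```

In PySpark, the same logic is typically expressed as a join between the change set and the target table (or a `MERGE INTO` statement where the storage layer supports it), with the batch applied in change order.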
Good to Have Skills/Project Experience/Certifications:
- Exposure to PySpark, Cloudera/ Hortonworks, Hadoop and Hive.
- Exposure to AWS S3/EC2 and Apache Airflow
- Participation in client interactions/meetings is desirable.
- Participation in code-tuning is desirable.
- BE/B.Tech/M.C.A./M.Sc (CS) degree or equivalent from accredited university
Location:
- Bengaluru/Hyderabad/Pune/Chennai/Kolkata
Benefits to help you thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.