Strategy & Analytics
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, making clear, data-driven decisions that deliver enterprise value in a dynamic business environment.
The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
AI & Data will work with our clients to:
- Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms
- Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
- Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements
Amazon Web Services
Qualifications:
- 3-9 years of technology consulting experience
- A minimum of 2 years of experience in cloud operations
- High degree of knowledge of AWS services such as Lambda, Glue, S3, Redshift, SNS, SQS, and more.
- Strong scripting experience with Python, the ability to write SQL queries, and strong analytical skills (a brief scripting sketch follows this list).
- Experience with CI/CD/DevOps is nice to have.
- Proven experience with agile/iterative methodologies for implementing cloud projects.
- Ability to translate business and technical requirements into technical designs.
- Good knowledge of end-to-end project delivery methodologies for cloud projects.
- Strong knowledge of UNIX operating system concepts and shell scripting
- Good knowledge of cloud computing technologies and current computing trends.
- Effective communication skills (written and verbal) to properly articulate complicated cloud reports to management and other IT development partners.
- Ability to operate independently with clear focus on schedule and outcomes.
- Experience with algorithm development, including statistical and probabilistic analysis, clustering, recommendation systems, natural language processing, and performance analysis
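As a rough illustration of the day-to-day scripting this role involves, the sketch below uses boto3 to list objects in S3 and notify a downstream consumer via SQS. The bucket and queue names are hypothetical placeholders, and the same pattern carries over to code running inside Lambda or Glue jobs.

```python
# Minimal AWS scripting sketch with boto3; bucket and queue names are hypothetical.
import boto3

s3 = boto3.client("s3")
sqs = boto3.client("sqs")

# List objects under a prefix in a hypothetical bucket.
response = s3.list_objects_v2(Bucket="example-data-bucket", Prefix="raw/2024/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])

# Notify a downstream consumer via a hypothetical SQS queue.
queue_url = sqs.get_queue_url(QueueName="example-ingest-queue")["QueueUrl"]
sqs.send_message(QueueUrl=queue_url, MessageBody="raw/2024/ batch listed")
```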
Google Cloud Platform - Data Engineer
Qualifications:
- BA/BS degree in Computer Science, Mathematics or related technical field, or equivalent practical experience.
- 3-9 years of relevant consulting, industry or technology experience
- Experience in Cloud SQL and Cloud Bigtable
- Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics
- Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer
- Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing models and frameworks (such as MapReduce, Flume).
- Experience working with technical customers.
- Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript.
- Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytics tools and environments (see the BigQuery sketch after this list).
- Experience in technical consulting.
- Experience architecting and developing internet-scale, production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have)
- Experience working with big data, information retrieval, data mining or machine learning, as well as experience building multi-tier, high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow).
- Working knowledge of ITIL and/or agile methodologies
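As a rough sketch of the BigQuery work referenced above, the snippet below uses the google-cloud-bigquery client library to run a simple aggregate query. The project, dataset, and table names are hypothetical placeholders.

```python
# Minimal BigQuery sketch; project and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

query = """
    SELECT order_date, SUM(amount) AS total_amount
    FROM `example-project.sales.orders`  -- hypothetical table
    GROUP BY order_date
    ORDER BY order_date
"""

# client.query() submits the job; iterating the job waits for and yields result rows.
for row in client.query(query):
    print(row["order_date"], row["total_amount"])
```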
PySpark
Qualifications:
- Education: B.Tech/M.Tech/MCA/MS
- 3-9 years of experience in designing and implementing the migration of enterprise legacy systems to a Big Data ecosystem for data warehousing projects.
- Must have excellent knowledge of Apache Spark and strong Python programming experience
- Deep technical understanding of distributed computing and broader awareness of different Spark versions
- Strong knowledge of UNIX operating system concepts and shell scripting
- Hands-on experience using Spark & Python
- Deep experience developing data processing tasks with PySpark, such as reading data from external sources, merging data, performing data enrichment, and loading into target data destinations (see the sketch after this list).
- Experience deploying and operationalizing code; knowledge of scheduling tools such as Airflow, Control-M, etc. is preferred
- Working experience with the AWS ecosystem, Google Cloud, BigQuery, etc. is an added advantage
- Hands-on experience with AWS S3 filesystem operations
- Good knowledge of Hadoop, Hive, and the Cloudera/Hortonworks Data Platform
- Should have exposure to Jenkins or an equivalent CI/CD tool and Git repositories
- Experience handling CDC (change data capture) operations for large volumes of data
- Should understand and have operating experience with the Agile delivery model
- Should have experience in Spark-related performance tuning
- Should be well versed in design documents such as HLD, TDD, etc.
- Should have participated in different kinds of testing, such as unit testing, system testing, user acceptance testing, etc.
- Should be well versed with historical data loads and overall framework concepts
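The sketch below illustrates the read, merge, enrich, and load pattern described above in PySpark. The S3 paths and column names are hypothetical and stand in for whatever sources and targets a given project uses.

```python
# Minimal PySpark ETL sketch; paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Read from external sources (hypothetical S3 locations).
orders = spark.read.parquet("s3a://example-bucket/raw/orders/")
customers = spark.read.csv("s3a://example-bucket/raw/customers/", header=True)

# Merge and enrich: join on a shared key, then derive a simple flag column.
enriched = (
    orders.join(customers, on="customer_id", how="left")
          .withColumn("is_large_order", F.col("amount") > 1000)
)

# Load into the target destination, partitioned by date.
enriched.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-bucket/curated/orders_enriched/"
)
```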
Python Developer
Qualifications:
- 3-9 years of technology consulting experience
- A minimum of 2 years of experience in unit testing and debugging
- Extensive experience with Pandas/NumPy DataFrames, slicing, data wrangling, and aggregations (see the sketch after the Primary Skills list).
- Lambda functions and decorators.
- Vectorized operations on Pandas DataFrames/Series.
- Application of the applymap, apply, and map functions.
- Understanding of how to choose a framework based on specific needs and requirements.
- Understanding of the threading limitations of Python and of multi-process architectures
- Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3
- Good Understanding of fundamental design principles behind a scalable application
- Good Understanding of accessibility and security compliance
- Familiarity with event-driven programming in Python
- Proficient understanding of code versioning tools (Git, Mercurial, or SVN)
- Knowledge of PowerShell and SQL Server
- Familiarity with big data technologies such as Spark or Flink, and comfort working with web-scale datasets
- An eye for detail, good data intuition, and a passion for data quality
- Good knowledge of user authentication and authorization between multiple systems, servers, and environments
Primary Skills (Must Have)
- Python and data analysis libraries (Pandas, NumPy, SciPy).
- Django
- DS/Algo
- SQL (Read & Write)
- CRUD
- Awareness of Microservices
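A short Pandas sketch, on synthetic data, of the slicing, vectorized operations, apply/map usage, and aggregations listed in the qualifications above.

```python
# Pandas sketch of vectorized operations, apply/map, slicing, and aggregation
# on a small synthetic DataFrame.
import pandas as pd

df = pd.DataFrame({
    "region": ["north", "south", "north", "west"],
    "units": [10, 4, 7, 12],
    "price": [2.5, 3.0, 2.5, 4.0],
})

df["revenue"] = df["units"] * df["price"]                 # vectorized column operation
df["region"] = df["region"].map(str.upper)                # element-wise map on a Series
df["price_band"] = df["price"].apply(lambda p: "high" if p >= 3 else "low")  # apply + lambda

# Slicing and aggregation: total revenue by region for the higher-priced rows.
summary = df.loc[df["price_band"] == "high"].groupby("region")["revenue"].sum()
print(summary)
```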
Data Scientist
Qualifications
- Experience in descriptive & predictive analytics
- Education: Bachelor's or Master's degree in a quantitative field. Analytics certificate programs from a premier institute are preferred.
- Should have hands-on experience implementing and executing Data Science projects throughout the entire lifecycle
- Expertise in Sentiment Analysis, Entity extraction, document classification, Natural Language Processing (NLP) & Natural Language Generation (NLG)
- Strong understanding of text pre-processing & normalization techniques such as tokenization (see the sketch after this list)
- Strong knowledge of Python is a must
- Strong SQL querying skills & data processing using Spark
- Hands on experience in data mining with Spark
- Strong expertise in a commercial data visualization tool such as Tableau, Qlik, Spotfire, etc.
- Good understanding of Hadoop ecosystem
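A minimal sketch of the text pre-processing & normalization steps named above: lowercasing, punctuation stripping, tokenization, and stop-word removal. The stop-word list and rules are illustrative only, not a production pipeline.

```python
# Illustrative text pre-processing: lowercase, strip punctuation, tokenize, drop stop words.
import re

STOP_WORDS = {"the", "a", "an", "is", "and", "of"}  # tiny illustrative stop list

def preprocess(text: str) -> list:
    """Lowercase, remove punctuation, tokenize on whitespace, and drop stop words."""
    text = text.lower()
    text = re.sub(r"[^a-z0-9\s]", " ", text)  # replace punctuation/symbols with spaces
    tokens = text.split()
    return [t for t in tokens if t not in STOP_WORDS]

print(preprocess("The product is great, and the delivery was fast!"))
# -> ['product', 'great', 'delivery', 'was', 'fast']
```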
Full Stack Developer
Qualifications
- A Bachelor's degree and 3.5 to 10 years of front-end design experience are required.
- Available portfolio of in-market examples of successful user interface design, focused on web or mobile.
- Expertise in designing interactive user interfaces and web interface details for various mobile and interactive platforms.
- Demonstrated experience using HTML5, CSS, and Adobe CS tools.
- Ability to work well in teams while independently managing all of the design components related to a given project.
- Communicates clearly and concisely, both in writing and verbally.
- Strong problem solving and troubleshooting skills with the ability to exercise mature judgment
- Experience designing interfaces for wearable technology.
- Experience in animation and videography.
- Experience in data visualization using Tableau or a comparable BI tool.
- Experience with Spotfire, Qlikview, D3, R, SPSS, SAS.
- Graduate degree with a major in Human Computer Interaction or a closely-related design field.
Additional Qualifications (for experienced candidates)
- Professional/Internship experience working in one or more of the following analytics domains: Customer & Growth, Supply Chain, Finance, Risk, and Workforce
- Familiarity with Spark and Hadoop/MapReduce
- Good experience in working with very large-scale data
- Familiarity with modeling and programming on both Windows- and UNIX-based operating systems