Cloud is shifting business models at our clients and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate it. Our clients' journeys span cloud strategy, implementation, migration of legacy applications, support of cloud ecosystem operations, and everything in between. Deloitte's Cloud Delivery Center supports our client project teams on this journey by delivering new solutions through which IT services are obtained, used, and managed.
You will work with other technologists to deliver cutting-edge solutions using Google Cloud Platform (GCP) services, programming, and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in a hybrid cloud, deploying a global cloud application across multiple countries, or supporting a set of cloud-managed services. Our teams of technologists have a diverse range of skills, and we are always looking for new ways to innovate and help our clients succeed. You will have the opportunity to leverage the skills you already have, try new technologies, and develop skills that will strengthen your brand and career as a well-rounded, cutting-edge technologist.
Work you’ll do
As a GCP Data Engineer, you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze, and explore/visualize data on the Google Cloud Platform. You will work on data migration and transformation projects, partnering with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues.
In this role, you are the Data Engineer working with Deloitte's most strategic cloud customers. Together with the team, you will support customer implementations of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring, and much more.
The key responsibilities may involve some or all of the areas listed below:
- Act as a trusted technical advisor to customers and solve complex Big Data challenges.
- Create and deliver best-practice recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders.
- Identify new tools and processes to improve the cloud platform and automate processes.
Qualifications
Technical Requirements
- BA/BS degree in Computer Science, Mathematics, or a related technical field, or equivalent practical experience.
- Experience in Cloud SQL and Cloud Bigtable
- Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub, and Genomics
- Experience in Google Transfer Appliance, Cloud Storage Transfer Service, and BigQuery Data Transfer Service
- Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume).
- Experience working with technical customers.
- Experience writing software in one or more languages such as Java, C++, Python, Go, and/or JavaScript.
Consulting Requirements
- 6-10 years of relevant consulting, industry, or technology experience
- Strong problem solving and troubleshooting skills
- Strong communication skills
- Willingness to travel when required by the project