GCP Data Engineer
Our client's team is processing customer support phone calls and other interactions to understand intent and reasons for contact. This analysis ultimately makes it possible to identify issues in the services and offers provided to customers, and to give feedback to business functions so those issues can be resolved. The call processing is being built as an automated data pipeline from call source systems into Google Cloud Platform, where machine learning and natural language models are applied, alongside general analytics and reporting.
The scope of the consultant services is to assist our client in the responsibilities outlined below.
We are looking for a strong GCP Data Engineer for these responsibilities:
– Integrate into an agile Scrum team of Data Scientists, Data Analysts, Data Engineers, and Software Engineers
– Work with and contribute to a DevOps setup (continuous integration, etc.) for test-driven development on Google Cloud Platform
– Set up ETL and data pipelines to facilitate analytics and reporting
– Develop and contribute to a data pipeline that integrates calls and other data into Google Cloud Platform, facilitates automated machine learning training and prediction services, and exports results to downstream consumers
– Set up monitoring for data and machine learning quality
– Work closely with the Product Owner and business stakeholders to ensure business value
Desired knowledge, experience, and competence:
– Agile & Scrum
– Google Cloud Platform
– Data Engineering, ETL
– Apache Beam / Dataflow
– BigQuery & SQL
– Python
– DevOps, DataOps, CI/CD, deployment on GCP
– Self-driven, action- and goal-oriented, with good communication skills towards technical and non-technical stakeholders
Three most important areas of experience:
1. Data Engineering: >7 years' experience (Expert)
2. Google Cloud Platform: 4–7 years' experience (Senior)
3. DevOps: 4–7 years' experience (Senior)
More information is available on application.
Remote working role with future visits to Sweden to meet the management team.
1-year contract.
Consultant: Darren Jacks