Design, create, test and maintain production-grade ETL pipelines for the enterprise graph database.
Handle extraction tasks that deal with multi-modal, disconnected datasets.
Build jobs consisting of multiple machine learning steps that transform multi-modal datasets into an insurance graph.
Ensure robustness and stability of data pipelines.
Optimize the performance and load times of delta (incremental) data loads.
Implement service monitoring of data pipelines.
Automate builds and deployments through CI/CD integration with Azure DevOps.
Maintain a scalable microservice architecture based on best practices.
Contribute to a multi-language code base of Python, TypeScript and Java.
Participate in code reviews.
Work with PaaS and SaaS components in a cloud environment (e.g. ADLS Gen2, Neo4j, Databricks, Azure Kubernetes Service).
Coordinate development activities with adjacent product teams and Munich Re’s central cloud and IT infrastructure.
Excellent communication skills.
Fluent in Python and TypeScript (clean-code experience).
Extensive ETL experience.
Strong background working with graph databases, e.g. Neo4j, TigerGraph or Stardog.
Experience with SQL and NoSQL databases.
Cloud development experience, preferably in Azure.
Familiarity with Azure Active Directory and OAuth.
Strong DevOps experience, including Git, Azure DevOps and GitOps.
Nice to have: experience working with distributed ML applications and MLOps in Databricks, Azure API