We are looking for an experienced Data Engineer (f/m/d) to join our newly established "Regulatory Reporting IT" team. As a heavily regulated company, we must deliver data to regulatory authorities within strict deadlines. Our focus is on building a platform based on cloud solutions such as Databricks, Looker/Looker Studio, Kubernetes, and Python.
In this role you will work with SQL, Jira & Scrum, Terraform, RabbitMQ, Docker, and GCP services (Cloud Storage, BigQuery, and potentially Dataproc, Cloud Composer, and Cloud Data Fusion).
You will work in an agile, growing team and take part in daily stand-ups, refinement sessions, and sprint reviews. The choice is yours: work remotely from anywhere in Germany or join us at our Leipzig office.
Above all, we value the exchange of ideas, mutual support, and individual growth.
- You have completed a university degree in Information Technology/Computer Science or an IT specialist apprenticeship.
- You have several years of experience developing and supporting ETL/data pipelines.
- You are familiar with at least one major cloud platform (AWS, Azure, or GCP).
- Ideally, you have already worked with Databricks.
Please reach out to me, Louise Bagge.