Senior Analytics Engineer (f/m/d)

Permanent employee, Full-time · Munich, Berlin, Hamburg

Our Mission
At Circus (Xetra: CA1), headquartered in Munich with offices in Berlin and Hamburg, we are pioneering global developments in on-demand autonomous food production. From culinary creation to production and operations, everything is powered by seamlessly integrated robotics and transformative AI.

Our foundation lies in groundbreaking advancements in robotics, driven by artificial intelligence — the long-missing link in realizing fully autonomous food automation. The time has come for a new status quo: just as autonomous vehicles are reshaping our cities and daily lives, we are redefining the entire experience of food.

Our team brings together leading minds in Robotics, AI, Engineering, and Culinary, working closely with industry-leading partners to revolutionize how food is prepared, accessed, and experienced.
Join us as we shape a future where the art of cooking meets the science of technology — and where food becomes something fundamentally new.

About the Role
We are looking for a Senior Analytics Engineer to play a key role in building and evolving our end-to-end analytics infrastructure.
In this role, you will architect, implement, and maintain our data models, pipelines, warehouse, and reporting layers. You’ll help turn raw data into clear, actionable insights for business and product teams, enabling data-informed decisions at all levels of the company.

This is a hands-on role with high impact and visibility, but you won’t be alone: you’ll collaborate closely with Engineering, Product, Operations and Finance to build a reliable, scalable and user-friendly analytics ecosystem.

How We Work
  • We value collaboration over “lone heroes”: you’ll work closely with experienced engineers, product managers and domain experts.
  • We aim for high standards and psychological safety: we learn from mistakes instead of hiding them.
  • We support hybrid work.

Your Daily Business
  • Own and develop our analytics infrastructure end-to-end: from data ingestion and transformation to reporting and visualization.
  • Design, develop and maintain robust data models using SQL and dbt, ensuring high data quality, integrity and performance.
  • Build and manage data pipelines for batch and (where needed) real-time data using Snowflake, dbt and other relevant tools.
  • Develop, maintain and improve our Looker reporting layer to enable self-serve analytics across the company.
  • Partner with stakeholders to define clear KPIs, dashboards and reporting needs, and turn questions into well-designed datasets.
  • Document your work and contribute to high-quality governance practices around data access, definitions and lineage.
  • Continuously improve analytics processes, automation and scalability, including CI/CD practices for analytics.
  • Monitor and maintain pipeline health, job failures and data freshness to ensure reliability and trust in our data.
  • Own and evolve our demand forecasting model together with business stakeholders.

Base Qualifications
If you recognize yourself in most of these, we’d love to talk — even if you don’t tick every single box.
  • Strong experience with SQL and data modeling (e.g. dimensional modeling, star/snowflake schemas).
  • Hands-on experience with a cloud data warehouse (e.g. Snowflake, BigQuery, Redshift).
  • Experience using dbt or similar tools to build and maintain analytics models.
  • Experience with at least one BI / dashboarding tool (e.g. Looker, Looker Studio, Tableau, Power BI, Mode).
  • Experience working in a production environment with version control and CI/CD for analytics or data models.
  • A demonstrated ownership mindset: you’ve led analytics projects end-to-end and can drive them from idea to production.
  • Strong communication skills and the ability to translate business questions into technical requirements and vice versa.

Preferred Qualifications
You do not need all of these to apply. These are areas where you can grow on the job:
  • Experience with Python for scripting, data validation, and light data engineering tasks.
  • Familiarity with the AWS ecosystem, especially Airflow or similar tools for orchestrating data workflows.
  • Experience working with event-driven data architectures (e.g. Kafka, Kinesis, pub/sub).
  • Hands-on work with time-series forecasting and forecasting pipelines.
  • Experience designing or maintaining data engineering pipelines in Python.
If you’re unsure whether your experience is a match: as long as you meet most of the Base Qualifications and are excited about the role, we encourage you to apply.

Our Offer
  • Competitive Compensation: Receive a competitive compensation package, including stock options for full-time employees.
  • Work-Life Balance: Enjoy regular team events, a hybrid work setup, 2 months/year of remote work (Workation), and unlimited vacation days, along with a corporate pension plan.
  • Personal Development: A €500 personal budget for training, in-house training sessions, and growth opportunities in a fast-paced environment.
  • Mobility: Benefit from a €50 monthly budget for mobility or other expenses and a subsidized Dance mobility subscription for e-bikes and e-mopeds.
  • Well-Being: Get access to discounted Urban Sports Club membership and Wellhub for a variety of well-being offerings.

Disclaimer
We pride ourselves on being an inclusive, equal-opportunity employer with a diverse and multicultural team. Our workplace will foster your development regardless of gender, civil status, family status, sexual orientation, religion, age, disability, education level, or race. We celebrate diversity and are committed to creating an inclusive environment for all employees.