
Senior Data Engineer (f/m/d) at 7Mind GmbH
  • Berlin


  • What we offer

    • Purpose. Use your skills for something good and work with our passionate team to improve the quality of our users' lives through mindfulness.

    • A mindful, supportive, and safe environment to grow both professionally and personally

    • Regular workshops and team-building events (remote and on-site)

    • Learning budget (books, conferences, etc.)

    • A diverse and inclusive team representing more than 10 nationalities

    • Health and wellness perks, including flexible options to suit different needs (e.g. Urban Sports Club or EGYM Wellpass)

    • Comprehensive mental health support through nilo.health

    • Lifetime access to 7Mind Plus 

    • Flexible working hours (80%, 90%, or 100% weeks)

    • Flexible home office policy (post-COVID hybrid office/remote working)

    Our stack
    • Google Analytics, Google Tag Manager, Rudderstack, Tableau, Looker, Adjust, Braze, Emarsys, AWS, BigQuery, Firebase, Airflow, dbt, Databricks, Amplitude


    Who we are looking for

    As the Senior Data Engineer, you'll play a pivotal role in shaping and scaling our data infrastructure to empower actionable insights and data-driven strategies. Reporting directly to the Head of Data & Analytics, you will own all data engineering initiatives at Gymondo and, together with fellow engineers, scale your impact across all sister companies within the holding.

    With a modern tech stack and exciting cross-functional challenges, this role is perfect for someone passionate about building scalable data systems, enabling stakeholders, and proactively driving data innovation.

    • Data Infrastructure Ownership: Design, build, test, and maintain scalable, reliable, and secure data pipelines and architectures.

    • Cross-functional Collaboration: Work closely with Marketing, Product, Engineering, and Content teams to ingest new data sources, automate workflows, and provide actionable insights.

    • Tool Implementation: Support the implementation of new data tools (e.g., customer data platforms, database replication for live CRM streams).

    • Data Architecture Leadership: Shape and document scalable data guidelines, ensuring compliance with data governance and security policies.

    • Data Model Development: Acquire datasets and design models that align with business needs and KPIs.

    • BigQuery Optimization: Optimize performance and cost-efficiency of the BigQuery ecosystem.

    • Event Tracking Design: Collaborate with app and web developers to design event logs that align with analytics KPIs.

    • Knowledge Sharing: Proactively share knowledge about data assets and best practices with data producers and consumers.

    • Stakeholder Communication: Present complex data topics to both technical and non-technical team members across organizational levels.


    What you bring

    • A Master’s degree in a quantitative discipline such as computer science, statistics, applied mathematics, engineering, or a related field.

    • 4-7 years of relevant experience with a proven track record of building and maintaining data warehouses in an industry context.

    • Proficiency in SQL and Python (must-have).

    • Hands-on experience with Google Cloud Platform (BigQuery, GCS) or similar cloud services (AWS, Azure).

    • Strong expertise with data orchestration tools such as Airflow or RudderStack.

    • Familiarity with Databricks for advanced data processing and transformations.

    • Experience with ETL/ELT concepts and writing test cases for efficient data governance.

    • Working knowledge of Git/GitHub/GitLab for version control and CI/CD pipelines.

    • Proficiency in Bash/Linux scripting.

    • Ability to transform raw data into structured formats to support analytical objectives and empower decision-making at scale.

    • Strong communication skills to present technical concepts, proofs of concept, and data models to a broad range of stakeholders.

    • A passion for fast-paced environments and startups, with a willingness to adapt and learn new tools and technologies.

    Nice to have/Bonus:

    • Familiarity with the digital marketing & product ecosystem (e.g., attribution, performance, event tracking, subscriptions, and e-commerce).

    • Experience with container-based environments (Docker/Kubernetes).

    • Knowledge of Terraform for infrastructure automation.

    • Experience with visualization tools such as Tableau, Apache Superset, or Looker.

    You don't need to have all of these skills. Depending on your seniority level, a combination of four or more, together with the ability to learn quickly, is sufficient.
