Who we are…
At BuildingMinds, we drive advanced digital strategies for the real estate industry, creating a positive impact on the planet, people's wellbeing, and sustainable profitability.
Our mission is to unlock the unexploited data treasures still hidden in building structures and data silos. A single, centralized, and secure platform with a dynamic Digital Building Twin at its core allows our customers to tap into a new level of data-driven insights, unleash machine learning and AI, and make more informed decisions for a better present and a more sustainable future.
We are on the lookout for talented, passionate, natural-born disruptors to join our team and become a “NetZero Hero”.
How we work…
Being at the forefront of disrupting an entire industry, we offer the freedom to be creative and forward-thinking. As a start-up in Berlin with the full backing of Schindler, we also have the stability, support and resources of an established world leader in the buildings industry. This enables us to bring together the best of two worlds and to create a truly unique culture of creativity, accelerated innovation and learning. We value speed, agility and creativity.
Your role as (Senior) Data Engineer…
As a Data Engineer, you will be a crucial expert member of the technology team and safeguard our core business model: bringing together building data from various sources in a reliable and scalable way. Data is the central asset of the BuildingMinds platform and all its services, and you will be its guardian and facilitator.
By bringing in your expertise in data pipelines, you will play an essential role in growing and nurturing the data sources and data quality in our platform, taking our product to the next level.
In detail you will…
- Assume stewardship of data quality, consistency and usability
- Document all data structures, transformations and content
- Automate and monitor the ingestion and storage of data from various heterogeneous and global data sources
- Develop and extend the central BuildingMinds data model to store all incoming data
- Implement and maintain standard and ad-hoc data queries and reports
We would like to hear from you if you have…
- An MSc or PhD in an appropriate technology field (computer science, physics, statistics, applied mathematics, operations research, etc.)
- Hands-on experience with the Big Data ecosystem, e.g. Spark, Airflow, Kafka and Flink
- Craftsmanship in modern programming languages, such as Python, Scala or Java
- Familiarity with microservice paradigms and related technologies, such as Kubernetes, Docker and REST APIs
- Very good database knowledge (database design and programming), e.g. PostgreSQL and MongoDB
- Practical experience implementing automated processes for data imports and exports, including monitoring, testing, logging and debugging, following DevOps paradigms
- Experience in handling structured and unstructured data, such as text, image, sensor and machine data
- Practical knowledge of the Azure data stack (Azure Data Lake, Databricks, Data Factory, etc.) is highly beneficial
- Familiarity with industry data standards, such as Microsoft’s Common Data Model
We offer you…
- A diverse team with people from all over the world and of all ages, a supportive atmosphere and good vibes
- A hybrid work policy based in Berlin
- Annual learning budget focusing on people's professional and personal development
- The latest Apple technology for maximum efficiency
- The opportunity to shape building sustainability
- Competitive compensation and benefits, plus a variety of snacks and drinks in the office
- A permanent employment contract with 30 days of vacation per year
- Free German classes with the Goethe Institute