Fully Interconnected Construction Projects that Automatically Adjust for Changes. But How?
Mads Holten Rasmussen

The Architecture, Engineering and Construction (AEC) industry is a fragmented one where organisations are formed temporarily around each project. These organisations consist of domain experts like developers, architects and engineers affiliated with different companies. The experts collaborate on defining and developing the project - an iterative process with major changes and pivots along the way. If everybody worked in an interconnected and digitized environment this might not be such a big problem, but unfortunately, it is not so. With the introduction of Building Information Modelling (BIM) one might have the impression that we are there, but the reality is that the current implementations are based on siloed 3D models that are only integrated to the extent that they live in the same coordinate system and can therefore be compared and checked for clashes.

Construction projects are getting increasingly complicated, and nobody has a full understanding of the derived consequences of a change.
Mads Holten Rasmussen, Business Development Director, NIRAS

In this article, I suggest an approach to transitioning the industry to the next level by modelling information beyond 3D geometries, considering component properties such as volume flow, temperature, and power output using Linked Data technologies and incremental reasoning made possible by software such as RDFox.

Case study

Let’s take a deeper look at a case from the Heating, Ventilation and Air Conditioning (HVAC) domain. Having worked within HVAC for many years, this is where my expertise lies, but you can probably imagine how a similar technique could be applied to other domains such as structural analysis, Life Cycle Assessment (LCA), planning, cost calculation, thermal simulations, etc. The particular case we will focus on is the design of the central heating system in a new building.

When doing so, the engineer needs to calculate the heat losses for each space in the building so the space heaters can be sized accordingly. If you ask a typical engineer what is needed for such a calculation, the answer would probably be the latest plan drawings, typically as PDF documents. The actual information takeoff from the PDF document is then carried out by that engineer: the geometrical attributes of the building envelope are measured with a ruler and transferred to a dedicated spreadsheet with standard formulas for calculating the heat losses. In some situations, the individual engineer might even build a new calculation sheet from scratch.

The problem is that these areas are only valid until the architect changes the layout of the building. Especially in the early stages, it is quite typical that walls are moved, and keeping track of these changes and updating the heat loss calculations is, as described, a largely manual process. When you then consider that this is only one minor piece in a large network of interconnected elements, a single change can easily result in a whole cascade of consequences elsewhere in the project. Focusing only on the heating system, the heat loss determines the size of the radiator, which determines the required flow of hot water in the pipe system, which in turn determines the size of the individual pipes, pumps and other mechanical components. It becomes quite a cumbersome job just to monitor and adapt as changes occur.

In an ideal world, the engineer would have a language in which to express, in a computer-understandable way, all the interconnections and the rules that should apply when a change occurs. Something similar to what we know from spreadsheets, where cells are interconnected by formulas that automatically adapt to changes.

Logical Reasoning and RDF

We have recently seen some major leaps in probabilistic AI with ChatGPT and other Large Language Models (LLMs). But when we are dealing with engineering, we are not interested in a probabilistic result. We are interested in THE result: the one where the building will not fall apart and where we do not use an unreasonable amount of materials.

Fortunately, there exists an alternative. With the Resource Description Framework (RDF) we have the language we need to describe the world as we see it in a formal way that computers also “understand”. Thereby it is possible to teach the computer how to “read between the lines” and infer those things that are implicitly given from the things that are explicitly present. This is handled by simple logical rules that could, for example, infer that a person’s father’s brother is that person’s uncle. Rules can also contain arithmetic calculations and could, for example, describe that the area of a window is the product of its width and height.
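To give a flavour of how such rules can be written down, here is a minimal sketch in RDFox’s Datalog syntax. The prefix and property names (:hasFather, :hasBrother, :hasUncle, :width, :height, :area) are made up for illustration and are not taken from any particular ontology.

```
@prefix : <https://example.org/model#> .

# A person's father's brother is that person's uncle.
[?person, :hasUncle, ?uncle] :-
    [?person, :hasFather, ?father],
    [?father, :hasBrother, ?uncle] .

# The area of a window is the product of its width and height.
[?window, :area, ?area] :-
    [?window, :width, ?width],
    [?window, :height, ?height],
    BIND(?width * ?height AS ?area) .
```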

With RDF, a digital model of a building that goes beyond geometry can be defined. Using ontologies and principles backed by the Linked Building Data (LBD) W3C community group and DiCon, I have created a minimal example. In the building, we have a space, and this space is adjacent to a wall and a window. We use an interface to describe exactly what part of the wall is adjacent to the particular space, and while doing so, we describe that it is also an interface to the outside. We assign to these elements all the thermal and geometrical properties that are necessary to perform a heat loss calculation. When all these boundary conditions are explicitly set by the engineer, or derived from data in the architect’s 3D model, they can be formalized with RDF in a graph like the one below.
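To give a flavour of what such a graph contains, here is a small, hand-written Turtle fragment. The property names and values are illustrative assumptions only; the actual example follows the LBD and DiCon vocabularies and is considerably richer.

```
@prefix : <https://example.org/model#> .

:space_01     :heatingDesignTemperature  20.0 ;        # indoor design temperature [°C]
              :adjacentElement           :wall_01 , :window_01 .

:wall_01      :uValue                    0.18 .        # thermal transmittance [W/(m²·K)]

:interface_01 :interfaceOf               :space_01 ;   # the part of the wall facing the space
              :interfaceOfElement        :wall_01 ;
              :boundsExternalZone        :outside ;
              :interfaceArea             12.4 .        # area of the interface [m²]

:outside      :heatingDesignTemperature  -12.0 .       # outdoor design temperature [°C]
```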

The graph and the RDF can be further investigated in my JSON-LD visualizer, but it is not within the scope of this article to cover that topic in detail. What I will cover, however, is the fact that for every space in the model, the transmission heat loss (the part of the heat loss that goes through the building envelope) can be calculated by a set of simple logical rules.

The details

In detail, the rules could be defined as laid out in this section (non-HVAC engineers can safely skip it); a sketch of how they might look as RDFox rules follows the list.

  • Rule 1: The temperature difference [K] of the heat transmission interface is calculated from the difference between the two zones’ heating design temperatures.
  • Rule 2: The UA-value [W/K] of the heat transmission interface can be described as the product of the wall’s U-value and the interface’s area.
  • Rule 3: The total transmission loss [W] over the interface is the product of the temperature difference and the UA-value.
  • Rule 4: The space’s total heat transmission loss is the sum of heat losses of all heat transfer interfaces.
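As a hedged sketch of how these four rules could be expressed in RDFox’s Datalog syntax, here they are using the same made-up property names as the Turtle fragment above; the real vocabulary and implementation are the subject of the follow-up tutorial.

```
@prefix : <https://example.org/model#> .

# Rule 1: temperature difference [K] over the interface.
[?iface, :temperatureDifference, ?dT] :-
    [?iface, :interfaceOf, ?space],
    [?iface, :boundsExternalZone, ?zone],
    [?space, :heatingDesignTemperature, ?tInt],
    [?zone, :heatingDesignTemperature, ?tExt],
    BIND(?tInt - ?tExt AS ?dT) .

# Rule 2: UA-value [W/K] = the element's U-value times the interface area.
[?iface, :uaValue, ?ua] :-
    [?iface, :interfaceOfElement, ?element],
    [?element, :uValue, ?u],
    [?iface, :interfaceArea, ?a],
    BIND(?u * ?a AS ?ua) .

# Rule 3: transmission loss [W] over the interface.
[?iface, :transmissionLoss, ?q] :-
    [?iface, :temperatureDifference, ?dT],
    [?iface, :uaValue, ?ua],
    BIND(?dT * ?ua AS ?q) .

# Rule 4: the space's total transmission loss is the sum over its interfaces.
[?space, :totalTransmissionLoss, ?qTotal] :-
    AGGREGATE(
        [?iface, :interfaceOf, ?space],
        [?iface, :transmissionLoss, ?q]
        ON ?space
        BIND SUM(?q) AS ?qTotal) .
```

With the example numbers above, this single interface would yield a temperature difference of 32 K, a UA-value of about 2.2 W/K and a transmission loss of roughly 71 W.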

With such a setup a reasoning engine can make sure that if the areas of the heat transfer interfaces change due to changes in the architectural model, the space’s heat loss will automatically be updated. The same is the case if the wall’s or window’s thermal properties change.

How will it affect how we work?

Transitioning to this completely new way of working is probably not easy. The mindset of the engineers will need to change from “I need to calculate this” to “I need to collect the boundary conditions, and the model will make sure that derived information is instantly available”. This doesn’t sound hard, but in my experience it is in the nature of engineers to enjoy building small private systems for these sorts of things, and we should be aware that there will certainly be resistance. On the other hand, the technology will allow for a solve-it-once paradigm as we know it from programming, and perhaps this will simply mean that we can continuously improve our services by building on top of each other’s work.

We must transition into a solve-it-once paradigm.

Ideally, the BIM authoring tools should automatically update the state of the common information model so that there is a live view of the project. However, I am not sure exactly how this should work. It would probably be a disaster if everything just updated automatically at the slightest change, so there might be a need for private and public spaces, or for branching as we know it from git. Linked Data allows the common model to be a federated one that is spread across multiple servers, and while this gives a very clear division of responsibility, there are still lots of challenges to be conquered before we are there entirely. In this podcast, my colleague Jeroen Werbrouck and I discuss the potential of such a decentralized BIM.

Challenges

It will probably be hard to keep track of all these rules and understand how they affect one another, so it is important to develop tools that provide a good overview. Further, the whole thing will break if multiple rules together create circular dependencies or infinite loops: for example, if the area of a door is derived as the product of its width and height while the height is in turn derived from the area and the width. It is therefore important to use quite explicit property names, such as heightWidthDerivedArea.
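As a small, hedged sketch of the kind of cycle to avoid (again with made-up property names), the first pair of rules below derive each other’s inputs, whereas the explicit property name in the last rule makes the direction of the derivation unambiguous:

```
@prefix : <https://example.org/model#> .

# Problematic: each rule derives an input of the other (a circular dependency).
[?door, :area, ?a]   :- [?door, :width, ?w], [?door, :height, ?h], BIND(?w * ?h AS ?a) .
[?door, :height, ?h] :- [?door, :area, ?a], [?door, :width, ?w], BIND(?a / ?w AS ?h) .

# Safer: the derived value gets its own explicit property name, so it can
# never be mistaken for an input to another rule.
[?door, :heightWidthDerivedArea, ?a] :-
    [?door, :width, ?w],
    [?door, :height, ?h],
    BIND(?w * ?h AS ?a) .
```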

The hardest challenge to tackle, however, is probably the resistance from the engineers, who will need to embrace a totally different way of working. But this change is probably inevitable, so those who try to fight it will likely fail. After all, transferring manual, repetitive tasks to the machines will free up time that can be used to do more detailed analyses, take more features into consideration and deliver better projects.

Other potentials

Besides the obvious potential that a system like this would relieve a lot of pressure from the shoulders of the engineers who currently keep track of all this in their heads, and the fact that formalizing our knowledge allows us to achieve better reuse and thereby deliver better projects, there is one more potential I would like to mention.

When the project is finalized, the current situation is that the client receives the as-built material as PDF documents and siloed BIM models. The data model proposed in this article will be incredibly rich and would therefore have the potential to deliver priceless value over the lifetime of the building. What happens if a wall is removed and two rooms are thereby combined? Ask the model! What if we wish to extend the heating system? Ask for the consequences!

Who wins?

The building owner will get a more well-coordinated project with fewer mistakes. The whole federated design team will have faster feedback and will have a stronger foundation for making the right decisions.

The biggest challenge faced by the construction industry is its enormous carbon footprint and in order to conquer this we need to work smarter!

Next up

Did I catch your interest? In my next article, I will provide a tutorial that describes in detail how this is done using RDFox.

Mads Holten Rasmussen is Business Development Director at NIRAS and holds a Ph.D. in Linked Building Data. He also holds an M.Sc. in Architectural Engineering with a specialization in energy and indoor climate simulations and HVAC design, and worked in this field early in his career.

NIRAS and Oxford Semantic Technologies have a partnership.

Take your first steps towards a solution.

Start with a free RDFox demo!

Team and Resources

The team behind Oxford Semantic Technologies started working on RDFox in 2011 at the Computer Science Department of the University of Oxford with the conviction that flexible and high-performance reasoning was a possibility for data-intensive applications without jeopardising the correctness of the results. RDFox is the first market-ready knowledge graph designed from the ground up with reasoning in mind. Oxford Semantic Technologies is a spin-out of the University of Oxford and is backed by leading investors including Samsung Venture Investment Corporation (SVIC), Oxford Sciences Enterprises (OSE) and Oxford University Innovation (OUI).