BIM Busters - Quality Checks are the only way to achieve data quality
The article explores the difference between quality checks, e.g., with the IDS, and quality assurance - avoiding mistakes in the first place - illustrated with the example of quantity takeoff for costing.
Last week's BIM Buster on the IDS and its strengths and weaknesses struck a chord in the BIM community. It seems it's a hot topic, and I hope to reach as many people with this article as last week, but I fear I won't, as it is a less popular take on data quality. Still, it's a topic close to my heart. It's probably once again an example of the social media law that says:
"The importance of the topic is indirectly proportional to the amount of likes."
In other words, the more important a topic, the fewer likes and shares it gets.
Why data quality matters
Unlike many BIM enthusiasts, I don't believe that more data is better than less. It's all about what you can do with the data - how to use it for better decision-making to achieve higher-quality buildings or any other business value.
In the last three years, I have looked at over 1000 architectural BIMs, and most of them shared these characteristics:
Geometry was good enough for plan generation in the architect's BIM solution (not from the IFC). To illustrate the difference, I have an excellent example from a recent discussion with a surveyor. They handed over plans and a model from their survey. These plans looked good, as you would expect. But their job was to create an Archicad model for the architect to continue working with. They modeled walls and slabs as thin layers showing what they could measure, plus a thick one with their estimate (exported as an IfcBuildingElementProxy). It makes sense to transport this information. However, the two layers make it very hard to use the model for continuous work. When I talked to them, I could not get through to them. Their point of view was that they had done everything correctly as quantity surveyors. And I agreed they had. What they did not do was consider how somebody would continue to work with their model.
The models are good enough to get an idea of the building. They do not convey any atmosphere or beauty. (The exceptions are BIMs for mansions; their architects pay more attention to conveying atmosphere.)
Metadata was mostly inconsistent, rudimentary, and not usable. The space names, usually found in the attribute "LongName" or sometimes "Description", are the only constant that is usable.
Using the model for more than just looking at it was only possible with extensive data wrangling. However, this might be a selection bias, because cost calculators and thermal simulation engineers uploaded the models to abstractBIM precisely when they were unusable. So I only see the bad ones!
So please prove me wrong and tell me there are better models in the wilderness!
When I work with high data quality, I feel like a magician, asking questions about the model and getting valuable answers. This can help:
In project management. With high data quality, I can control project costs and schedules by controlling the quality.
In risk management. Data visualization, for example, allows me to understand concepts faster and make better decisions.
In automation. Once I have data I can trust, I can use it to further enrich models automatically - with an impact on the first two topics.
The first strategy for quality BIM data: Quality Checks
The idea is simple. Define which data you need, communicate your needs, and then check whether the deliverables fulfill them. If not, send them back to be corrected. The IDS helps to convey the metadata requirements on a technical level and makes it easy for the delivering party to check quality with a tool of their choice, with little or no setup time. Setting up a quality gate should increase the likelihood that somebody runs a check before sending a file out. Then again, how often did somebody review the plans before sending them out? It's a mindset issue!
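On the technical side, such a gate can be scripted. The sketch below is only an illustration: it assumes the open-source IfcOpenShell library together with its ifctester module, and the file names are placeholders - treat it as a starting point, not a finished quality gate.

import ifcopenshell
from ifctester import ids, reporter  # assumed to ship with IfcOpenShell

# Load the requirements (IDS) and the delivered model (IFC); file names are placeholders
specifications = ids.open("requirements.ids")
model = ifcopenshell.open("delivered_model.ifc")

# Check every specification against the model and print a console report
specifications.validate(model)
reporter.Console(specifications).report()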
There are a few other issues with this strategy. First, it takes work to formulate and communicate data requirements, especially when you have little data competency. Defining is hard work, and you need the willingness to look at your workflows. As a client, you can contractually enforce data quality, but often it is a subcontractor who needs the high data quality, which makes enforcing it more difficult. A subcontractor has to pay close attention to how, when, and what to communicate; otherwise, the subcontractor won't get the contract.
Second, there is often insufficient time (or willingness) for the feedback loop, and people resort to mitigation strategies, primarily by compromising and doing manual work. I have experienced this:
as a client rep with the area calculations from the architects. They rarely follow the standard and are therefore unsuitable for benchmarking.
as a project manager needing quantities for cost calculations. It was easier to measure manually than to use the BIM.
working for a wood construction company with manufacturing departments. Production was always based on our own models, made by professionals with experience working in the factory.
at abstract, working with thermal simulation engineers and cost calculators. The architect almost always believes their BIM is great, but it's unsuitable for import into calculation or simulation tools. That led simulation engineers to give up on the architect's BIM. Just last week, I saw a simulation engineer doing a manual takeoff on PDF, although he got an IFC with well-modeled spaces, windows, and doors and could have used the abstractBIM. But the reputation of bad architects' BIMs prevailed, and he preferred to start a 4-5-day manual takeoff instead of using the model.
Exploring Requirements for Quantity Takeoff
So, let's look at the example of quantity takeoff: what needs to be covered in an IDS and in the modeling guidelines so that the BIM can feed cost calculations.
We need to define which elements must be in the model and, of course, ensure that they are modeled, labeled, and assigned correctly. For example, we need to ensure the model does not contain duplicated geometry. Even worse, we can do all the quality checks we want, and automated BIM checks or the IDS won't help us if the correct metadata exists in the model but is assigned to the wrong elements. For example, we won't be able to automatically find an underground exterior wall that is wrongly labeled as a gypsum board wall.
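The IDS cannot express such geometry checks, but rough heuristics can be scripted. The sketch below is my own illustration with IfcOpenShell (the file name and rounding are placeholders): it flags elements of the same class that share the same name and insertion point, a common symptom of accidentally duplicated geometry.

import ifcopenshell
import ifcopenshell.util.placement
from collections import defaultdict

model = ifcopenshell.open("architect_model.ifc")  # placeholder file name
candidates = defaultdict(list)

for element in model.by_type("IfcBuildingElement"):
    if element.ObjectPlacement is None:
        continue
    # Use the translation part of the local placement matrix as a rough position fingerprint
    matrix = ifcopenshell.util.placement.get_local_placement(element.ObjectPlacement)
    position = tuple(round(float(v), 3) for v in matrix[:3, 3])
    candidates[(element.is_a(), element.Name, position)].append(element)

for (ifc_class, name, position), elements in candidates.items():
    if len(elements) > 1:
        print(f"Possible duplicates: {len(elements)}x {ifc_class} '{name}' at {position}")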
For ease of the exercise, I only focus on the main cost drivers in an architectural model: IfcSpace, IfcSlab, IfcWall, IfcCovering, IfcWindow, and IfcDoor. But the more you want to extract from the model, the more needs to be defined.
Especially for walls and slabs, we need to define the level of geometric detail so that we can do a meaningful quantity takeoff. To make it easier, we are only interested in the main surface area in m² and not all the different layers of the wall (I firmly believe it is too late to start with the cost calculation once you get the correct wall layers in the model). Therefore, we need one solid with the following attributes (a small check sketch follows the list):
As long as we don't aim for automation across projects, we just need a Name attribute that is somewhat consistent. A gypsum board wall should always be called the same; it should not be "Gypsum board wall X" in one part of the model and something else in another. The moment we want to automate across projects, we need to standardize the material names even further.
The PredefinedType specifies the type of the entity in more detail. E.g., it distinguishes whether a slab is a baseplate, a floor, or a roof. It is relevant for cost calculation and is already predefined in the IFC standard - hence the name of the attribute.
IsExternal is a property in the common property sets (e.g., Pset_WallCommon) and is defined as a boolean value. Therefore, it should contain TRUE or FALSE.
FireRating (the fire resistance) is another value in the common property set (Pset). Here, we need to define the allowed values according to national norms. The IFC standard provides a place to store the data; the allowed values are not determined. I added this property to visualize the interdependence between property values and geometric modeling guidelines. When we want to assign the fire resistance in a one-to-one relationship, we need to split the walls where the fire resistance changes. Usually, this information is available in the fire protection plan already in the early design stages, but the architect only incorporates it in the model during the construction phase (if ever). So, if we want to have it in the model for costing, we need to change how we usually work. We need to split walls much earlier to assign the fire resistance values in the model.
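To make these requirements concrete, here is a minimal sketch of how they could be checked with IfcOpenShell. It is an illustration only: the file name is a placeholder, and I use FireRating as the Pset_WallCommon property carrying the fire resistance.

import ifcopenshell
import ifcopenshell.util.element

model = ifcopenshell.open("architect_model.ifc")  # placeholder file name

for wall in model.by_type("IfcWall"):
    common = ifcopenshell.util.element.get_psets(wall).get("Pset_WallCommon", {})
    problems = []
    if not wall.Name:
        problems.append("missing Name")
    if wall.PredefinedType in (None, "NOTDEFINED"):
        problems.append("missing PredefinedType")
    if "IsExternal" not in common:
        problems.append("missing IsExternal")
    if not common.get("FireRating"):
        problems.append("missing FireRating")
    if problems:
        print(wall.GlobalId, wall.Name, ", ".join(problems))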
Next, we need to define the required quantities and, for an automated workflow across projects, the units and the names of the Qset and its attributes. Every architectural BIM solution follows its own standard for quantities. That's why I recommend that quantity surveyors calculate them in a post-processing step, e.g., in SimpleBIM.
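For illustration, quantities that are already in the IFC can be read directly via the standard relations, independent of the authoring tool. The sketch below (file name is a placeholder) walks IfcRelDefinesByProperties and prints every area quantity it finds; the Qset and quantity names vary between tools, which is exactly the problem.

import ifcopenshell

model = ifcopenshell.open("architect_model.ifc")  # placeholder file name

for wall in model.by_type("IfcWall"):
    for rel in wall.IsDefinedBy:
        if not rel.is_a("IfcRelDefinesByProperties"):
            continue
        definition = rel.RelatingPropertyDefinition
        if not definition.is_a("IfcElementQuantity"):
            continue
        for quantity in definition.Quantities:
            if quantity.is_a("IfcQuantityArea"):
                # e.g. NetSideArea in Qto_WallBaseQuantities, if the tool exported it
                print(wall.Name, definition.Name, quantity.Name, quantity.AreaValue)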
Usually, getting the cost classification in the BIM is unnecessary. The reason is simple: most of the time, it will be wrong anyway because the modeler does not know it as well as the cost calculator.
I hope you see that cost calculation is a complex task requiring a lot of definitions and BIM quality checks. Moreover, the necessary level of geometry and information will often be included only in the late design stages, when it is usually too late for the cost calculations. In these late stages, we don't have the flexibility to make meaningful changes and save money.
I started defining the IDS for it and gave up; it's too much. With the IDS, it would be possible to check the metadata. Checking for duplicated geometry, or whether the geometry is split correctly, is not possible to describe in the standard. Don't get me wrong, I think the IDS is great; I just want to point out that reality is more complex and that maybe another strategy than religiously checking the data quality is better suited.
Second strategy for quality BIM data: Quality Assurance
The difference between quality checks and quality assurance is that the former focuses on finding mistakes after the work is done, and quality assurance focuses on the workflow to avoid mistakes in the first place.
I'm a big fan of assurance, which is closer to value creation. To set up quality assurance processes, we can follow a few principles:
Make the quality relevant for the person doing the work. When modelers have to enter information they don't need, and maybe don't even understand, the data is likely to be incorrect.
Make it hard to make mistakes. For example, with drop-down lists to select values instead of typing them.
Ask for as little as possible. The fewer requirements, the better!
Make it easy to spot mistakes and missing data, e.g., through visualization and colored plans, or by hiding all the metadata that is not relevant. This means the modelers only see the values they need to work with. Vectorworks is an excellent example: when properly set up, the modeler only sees the relevant attributes.
Avoid any duplication of information. Any duplication increases the probability of inconsistencies - and the number of quality checks and cross-checks you need to find them.
The best quality assurance workflows work with as little input data as possible and usually incorporate some quality control aspects. Some requirements will always be necessary; we must ensure they are as simple as possible!
Quantity Takeoff workflow with quality assurance
For the example with quantity takeoff/costing, the requirements can be as simple as the following (a small pre-flight check sketch follows the list):
Ask for an IFC file following the proper IFC structure.
Communicate the requirements early on. When using the abstractBIM, it's enough to ask for the IfcSpaces, modeled between the top of the final floor and the bottom of the structural slab in every void of the building.
Include the space names consistently in the LongName attribute.
Model exterior spaces like balconies and set the attribute IsExternal.
Model and export windows and doors as IfcWindow and IfcDoor.
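A minimal pre-flight check for these four points could look like the sketch below (my own illustration with IfcOpenShell; the file name is a placeholder). The IDS file afterwards covers the same ground declaratively.

import ifcopenshell
import ifcopenshell.util.element

model = ifcopenshell.open("export_for_abstractbim.ifc")  # placeholder file name

spaces = model.by_type("IfcSpace")
print(f"{len(spaces)} IfcSpaces found" if spaces else "No IfcSpaces found - stop here")

for space in spaces:
    if not space.LongName:
        print("Space without LongName:", space.GlobalId)
    common = ifcopenshell.util.element.get_psets(space).get("Pset_SpaceCommon", {})
    if "IsExternal" not in common:
        print("Space without IsExternal:", space.GlobalId)

print(len(model.by_type("IfcWindow")), "windows,", len(model.by_type("IfcDoor")), "doors")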
Here is an IDS file I created with the usBIM.IDSeditor for IFC4:
<ids:ids xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://standards.buildingsmart.org/IDS http://standards.buildingsmart.org/IDS/0.9.6/ids.xsd" xmlns:ids="http://standards.buildingsmart.org/IDS">
<!--edited with usBIM.IDSeditor (http://www.accasoftware.com)-->
<ids:info>
<ids:title>IDS for abstractBIM for QTO</ids:title>
<ids:copyright>MIT Licence</ids:copyright>
<ids:version>0.1</ids:version>
<ids:author>simon.dilhas@abstractbim.com</ids:author>
<ids:date>2024-01-19</ids:date>
<ids:purpose>Checking BIM models before abstraction in abstractBIM.com</ids:purpose>
</ids:info>
<ids:specifications>
<ids:specification ifcVersion="IFC4 IFC4X3" name="Has Spaces" minOccurs="1" maxOccurs="unbounded" description="The Model must contain IfcSpaces as 3D Geometry. Best modeled between the top of the flooring and the bottom of the slab and between the walls.">
<ids:applicability>
<ids:entity>
<ids:name>
<ids:simpleValue>IFCSPACE</ids:simpleValue>
</ids:name>
</ids:entity>
</ids:applicability>
<ids:requirements>
<ids:property datatype="IFCBOOLEAN" minOccurs="0" maxOccurs="unbounded">
<ids:propertySet>
<ids:simpleValue>Pset_SpaceCommon</ids:simpleValue>
</ids:propertySet>
<ids:name>
<ids:simpleValue>IsExternal</ids:simpleValue>
</ids:name>
<ids:value>
<xs:restriction base="xs:boolean">
<xs:pattern value="true|false" />
</xs:restriction>
</ids:value>
</ids:property>
</ids:requirements>
</ids:specification>
<ids:specification ifcVersion="IFC4 IFC4X3" name="May have windows or doors" minOccurs="0" maxOccurs="unbounded" description="The model may contain windows and doors.">
<ids:applicability>
<ids:entity>
<ids:name>
<xs:restriction base="xs:string">
<xs:enumeration value="IFCWINDOW" />
<xs:enumeration value="IFCDOOR" />
</xs:restriction>
</ids:name>
</ids:entity>
</ids:applicability>
<ids:requirements />
</ids:specification>
</ids:specifications>
</ids:ids>
When the architect follows these basic rules, the abstractBIM algorithm can generate an abstractBIM with the main building elements. As the elements are automatically generated, there is:
consistency in the metadata, e.g., every interior wall is labeled as interior.
geometric uniformity. For example, the spaces have a wall covering element that excludes the windows and doors (something challenging to model manually). And all the geometry is simplified. This goes against the traditional BIM theory that the LOG steadily increases from LOG 100 to finally reach LOG 500 for facility management. For simulations, you want a simplified model even when the architectural model is already at LOG 350, and for FM, a less detailed model with a high LOI would often be better.
topological connectivity between all elements, e.g., every wall knows exactly which two spaces it sits between (and this is not a very abstract concept like the 2nd Level Space Boundary, but a simple-to-address attribute).
This topological connectivity especially opens up possibilities for cost calculation. Let's think about the typical materials used in a multifamily home.
Often, the walls of a staircase have a special material and need to be addressed separately in a calculation. We can query the model to show only the wall area of walls that have the staircase in the attribute "Space connection". These walls can often be assigned as REI 90 concrete walls.
A 12.5 cm wall is often either brick or drywall.
A 12.5 cm wall around a bathroom is often not a gypsum board wall.
Experience shows that 40 such rules are often enough to entirely describe a multifamily home built in Switzerland (for cost calculation).
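Such rules are easy to express in plain code. The sketch below is purely illustrative: the rules, thicknesses, and wall records are made up to mirror the examples above, and "space_connection" stands in for the topological attribute mentioned earlier.

# Illustrative rule set, ordered from most specific to most general.
# Each rule is a (condition, cost position) pair; the first match wins.
rules = [
    (lambda w: "Staircase" in w["space_connection"], "Concrete wall REI 90"),
    (lambda w: w["thickness_cm"] == 12.5 and "Bathroom" in w["space_connection"], "Masonry wall, moisture resistant"),
    (lambda w: w["thickness_cm"] == 12.5, "Gypsum board wall"),
]

def assign_position(wall):
    """Return the first matching cost position, or flag the wall for manual review."""
    for condition, position in rules:
        if condition(wall):
            return position
    return "UNMATCHED - review manually"

# Hypothetical wall records as they could be extracted from an abstractBIM export
walls = [
    {"id": "wall-01", "thickness_cm": 12.5, "space_connection": ["Bathroom", "Corridor"], "area_m2": 9.2},
    {"id": "wall-02", "thickness_cm": 25.0, "space_connection": ["Staircase", "Apartment 1.1"], "area_m2": 14.8},
]

for wall in walls:
    print(wall["id"], wall["area_m2"], "m2 ->", assign_position(wall))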
So, once these rules are defined, they can be adapted and reused for other projects, bringing the effort for a quantity takeoff down to minutes instead of hours or days. At least for the significant main positions - but please consider the 80/20 rule. When you have the main quantities, you can focus on the details and the unit prices.
Summary
Quality control and quality assurance work best hand in hand. The best approach is to have simple BIM requirements for specific use cases, to use assurance workflows for enrichment and for avoiding mistakes, and to run quality checks at defined quality gates. From a technological point of view, we can do very complex quality checks, but they are hard to build up and maintain.
Now, with the hype of the IDS, it's worth remembering that it's always better to avoid mistakes in the first place than to find them. If we can't avoid them, seeing them as early as possible is better! So, build up your toolbelt:
Use the IDS to check metadata.
Use quality assurance workflows, e.g., with the abstractBIM, to make it hard for anybody to make mistakes.
Use precise and simple requirements (based on goals) to start, define what "quality" means, and communicate these requirements so that humans can fulfill them.
Only when we as a BIM community cut the crap and don't jump on every hype, thinking it's the solution, will we be able to convince a wider audience to implement other, better, and easier workflows. If we don't, we won't be able to change how we work at a wider level in the industry. We are just at the beginning…