January 2013 bSa Challenge: Graphisoft
by E. William East, PhD, PE, F.ASCE, Chris Bogen, PhD, and Mariangelica Carrasquillo, Engineer Research and Development Center, U.S. Army Corps of Engineers
The January 2013 buildingSMART alliance Challenge was the first buildingSMART Challenge event that expanded the set of contracted information exchanges beyond the Construction-Operations Building information exchange (COBie). This event introduced standards for the exchange of building programming information, at the early stages of project planning, and standards for the exchange of building systems, from design through the production of construction shop drawings. Software systems supporting COBie version 2.4, the Building Programming information exchange (BPie), and the Heating, Ventilating and Air Conditioning information exchange (HVACie) formats were demonstrated.
The detailed requirements for this Challenge were published by the buildingSMART alliance in February 2012 and discussed among the participants during monthly software developer meetings between February and October 2012. Meeting notes were provided to all potential Challenge participants; an example of these extensive meeting notes can be seen here. A new tool for the automated checking of COBie files, the COBie Toolkit, was developed and used for this challenge. The rules used by the Toolkit were documented in the COBie Responsibility Matrix and developed with direct feedback from the software developers and programmers. Another important contribution of these monthly discussions was the development of, and agreement to, rules governing the order of precedence when merging multiple model files at the Construction Documents stage of design. Final submissions were due 15 December 2012. This page provides information about the performance of one specific commercial software product.
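The idea of precedence rules for merging multiple discipline models can be illustrated with a minimal sketch. The discipline order, record keys, and field names below are assumptions for illustration only; the actual rules agreed to by the participants are documented in the meeting notes.

```python
# Minimal sketch of precedence-based merging of records from several
# design models. Models are listed highest-precedence first; when two
# models describe the same record, the higher-precedence value is kept.

def merge_by_precedence(models):
    """Merge a list of {record_key: record} dicts, earliest model winning."""
    merged = {}
    for model in models:
        for key, record in model.items():
            merged.setdefault(key, record)  # keep the first (highest) value
    return merged

# Hypothetical example: the architectural model takes precedence over the
# mechanical model for a shared space record.
architectural = {"Space-101": {"Name": "101", "Category": "Office"}}
mechanical = {"Space-101": {"Name": "101", "Category": "Mech"},
              "AHU-1": {"Name": "AHU-1", "Category": "Air Handler"}}

result = merge_by_precedence([architectural, mechanical])
```

Records unique to a lower-precedence model (here, `AHU-1`) survive the merge; only conflicting records are resolved by the precedence order.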
Files tested in the bSa Challenge are produced by software that could be commercially purchased on or before the date of the Challenge. The table below provides information about the product tested and links to user guides and support hotlines.
Software companies were required to provide a set of files to demonstrate their product's compliance with the requirements of the bSa Challenge. The table below provides a link to these files. As appropriate, notes are provided below the table.
1 Software companies were required to provide an IFC model based on the Facility Management Handover Model View Definition. Since that MVD specification is based on IFC 2x4 and current software has implemented only IFC 2x3, files based on the IFC 2x3 Coordination Model View Definition were acceptable for this event.
2 The Solibri File Optimizer was used to compress the natively created IFC 2x3 Coordination Model View Definition file. This compression eliminated redundant properties and relationships allowed in native IFC Coordination Model View Definition files.
3 The COBie file produced for this challenge was created with the COBie Tool Kit (ZIP), based on the rules and IFC-to-spreadsheet mappings documented in the COBie Responsibility Matrix.
4 The Facility Management Handover Model View Definition IFC file listed here is the file produced by the COBie Tool Kit. Those working with IFC file exchanges will notice the reduction in file size when the non-asset and geometric information found in the Coordination MVD, but not the FM Handover MVD, is omitted.
Test Procedure: Design Phase
The procedure for testing COBie design models required that the submitted design data match the information content found on the Medical Clinic drawings. To that end, a presentation documented the detailed asset information found on the Clinic drawings for two typical rooms. This presentation shows the complexity of incorporating the complete set of architectural, mechanical, electrical, plumbing, fire protection, communication, and other systems assets identified in a detailed cross-referenced review of the clinic drawings. The presentation was used to explain the level of detail required to establish whether a COBie file contains the same information as printed drawings. Once the file was received, an additional room, the first-floor mechanical room, was added to the evaluation. To summarize the presentation, evaluation criteria for a typical design COBie file should include, at a minimum, the following types of reviews.
To support the integration of scheduled information into the model, an updated example COBie file was created based on a manual takeoff of equipment/product schedules. This updated Clinic COBie file will be publicly released through the Common BIM File page as "version 12" of the Clinic COBie example files on or about March 2013.
While significant effort was spent to develop the underlying redacted drawings, initial models, and updated COBie models, no design software vendor fully met the COBie Challenge requirement that the submitted model match the provided design drawings and example COBie file. As a result, the following criteria were established to evaluate design model submissions and estimate the quality of each submitted data file.
The table below provides the results of these tests and evaluations. The first column, "COBie Sheet," identifies each worksheet in a COBie file. Worksheets that are not required must still be present but may be left empty. The second column, "In Scope," indicates whether a given worksheet (or its equivalent IFC-formatted data) was required for the related challenge. If the information was in scope, the column contains the value "Yes." If the information was not in scope, the column contains the value "No," and the worksheet, while present in the file, should be empty (or contain only the first row of column headings).
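The scope rule above, that every worksheet must be present but out-of-scope worksheets may hold only the heading row, can be sketched as a simple check. The sheet names and scope table below are assumptions for illustration, not the actual challenge configuration.

```python
# Sketch of the worksheet scope rule: every sheet must be present, and
# out-of-scope sheets must contain at most the first (heading) row.

def check_scope(sheets, in_scope):
    """Return a list of scope violations for a COBie-style workbook.

    sheets: {sheet_name: list of rows, where the first row is headings}
    in_scope: {sheet_name: True if the sheet is required for the challenge}
    """
    errors = []
    for name, required in in_scope.items():
        rows = sheets.get(name)
        if rows is None:
            errors.append(f"{name}: worksheet missing (must be present)")
        elif not required and len(rows) > 1:
            errors.append(f"{name}: out of scope but contains data rows")
    return errors

# Hypothetical workbook: "Spare" is out of scope yet holds data, and the
# out-of-scope "Job" sheet is missing entirely.
workbook = {
    "Space": [["Name", "Category"], ["101", "Office"]],
    "Spare": [["Name", "TypeName"], ["Filter-1", "AHU"]],
}
errors = check_scope(workbook, {"Space": True, "Spare": False, "Job": False})
```

Both violations are reported: the populated out-of-scope sheet and the missing sheet.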
Design Software Results
The three right-most columns in the table below provide the results of the testing team's evaluation of the accuracy of the vendor's design model submissions. The "Quality Control Errors" column identifies technical errors found in the vendor's file; in some cases the COBie Tool Kit reported errors that required additional explanation, and notes are provided for these. The next column, "Record (Count/Errors)," identifies the total count of COBie rows found for the required record type and the number of rows containing errors when evaluated against information found on the design drawings. The "Attribute (Count/Errors)" column identifies the attributes, or properties, found for all records of the given type. Attribute errors were identified when the delivered values did not, at a minimum, match the related schedule data found on the design drawings.
1 Although the value is not exactly the same as provided in the sample, the vendor's selection is close to the correct room classification; therefore, no penalty was applied for this item.
2 There was a discrepancy between the units of measure used for usable height, but after unit conversion the values are accurate, so no penalty was applied.
3 The vendor did not model the same zones as the original file, but the zone information delivered was correct; therefore, no penalty was applied.
4 The Type.Name values provided did not match the values in the sample clinic file, which are the same values that would be found in the construction drawings. Therefore, a penalty of 31 minutes was applied for this error.
5 It was expected that an architectural model would contain the electrical fixtures from the reflected ceiling plan as well as the plumbing fixtures for the bathroom (sinks, toilets, etc.). No penalty was applied for this discrepancy.
6 The data provided by the vendor was accurate, but the naming conventions were not consistent with the sample Clinic model; the data therefore does not match the construction drawings, and a penalty of 472 minutes was applied.
7 The quality of the attribute data could not be assessed because the vendor modeled different attributes than those provided in the sample clinic file. Since penalties are not assessed for missing attributes, no penalty was applied.
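The "Record (Count/Errors)" tally described above can be sketched as a comparison between the delivered records and a manual takeoff from the design drawings. The record names and attributes below are assumptions for illustration, not data from the actual evaluation.

```python
# Sketch of the record tally: count delivered rows of a given type and
# how many fail to match the takeoff from the design drawings.

def tally_records(delivered, takeoff):
    """Return (row_count, error_count) for one COBie worksheet.

    delivered, takeoff: {record_name: {attribute: value}}
    A delivered record is in error if it is absent from the drawing
    takeoff or if any shared attribute value disagrees with it.
    """
    errors = 0
    for name, attrs in delivered.items():
        expected = takeoff.get(name)
        if expected is None:
            errors += 1
            continue
        if any(expected.get(k) != v for k, v in attrs.items() if k in expected):
            errors += 1
    return len(delivered), errors

# Hypothetical example: one record matches the drawings, one does not
# appear on them at all.
takeoff = {"AHU-1": {"Category": "Air Handler", "UsableHeight": "2.1 m"}}
delivered = {"AHU-1": {"Category": "Air Handler", "UsableHeight": "2.1 m"},
             "AHU-2": {"Category": "Air Handler"}}
count, errs = tally_records(delivered, takeoff)
```

The tally reports two delivered records with one error, which would appear in the table as "2/1".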
This company successfully completed the construction COBie challenge by producing the handover COBie file of the Medical Clinic model. Based on the quality control report, zero errors were identified.
The vendor produced the architectural data for a portion of the original clinic model. However, the provided data was inconsistent with the sample data, which resulted in a penalty of 503 minutes (approximately 8 hours). This means that a user of this product is estimated to spend roughly 8 hours cleaning and correcting the COBie file for a facility of comparable size and complexity.