January 2013 bSa Challenge: Bentley Systems
by E. William East, PhD, PE, F.ASCE, Chris Bogen, PhD, and Mariangelica Carrasquillo, Engineer Research and Development Center, U.S. Army Corps of Engineers
The January 2013 buildingSMART alliance Challenge was the first buildingSMART Challenge event that expanded the set of contracted information exchanges beyond the Construction-Operations Building information exchange (COBie). This event introduced standards for the exchange of building programming information, at the early stages of project planning, and standards for the exchange of building systems, from design through the production of construction shop drawings. Software systems supporting COBie version 2.4, the Building Programming information exchange (BPie), and the Heating, Ventilating and Air Conditioning information exchange (HVACie) formats were demonstrated.
The detailed requirements for this Challenge were published by the buildingSMART alliance in February 2012 and discussed among the participants during monthly software developer meetings between February and October 2012. Meeting notes were provided to all potential Challenge participants; an example of the extensive notes from one of these meetings can be seen here. A new tool for the automated checking of COBie files, the COBie Toolkit, was developed and used for this challenge. The rules used by the Toolkit were documented in the COBie Responsibility Matrix and developed with direct feedback from the software developers and programmers. Another important contribution of these monthly discussions was the development of, and agreement to, rules governing the order of precedence when merging multiple model files at the Construction Documents stage of design. Final submissions were due 15 December 2012. This page provides information about the performance of one specific commercial software product.
Files tested in the bSa Challenge were produced from software that could be commercially purchased on or before the date of the Challenge. The table below provides information about the product tested and links to user guides and support hotlines.
Software companies were required to provide a set of files to demonstrate their products' compliance with the requirements of the bSa Challenge. The table below provides a link to these files. As appropriate, notes are provided below the table.
1 Software companies were required to provide an IFC model based on the Facility Management Handover Model View Definition. Since the MVD specification is based on IFC 2x4 and current software has only implemented IFC 2x3, files based on the IFC 2x3 Coordination Model View Definition were acceptable for this event.
Test Procedure: Design Phase
The procedure for testing COBie design models required that the submitted design data match the information content found on the Medical Clinic drawings. To that end, two presentations were developed to identify specific asset information found on the Clinic drawings. The first presentation provided the complete set of asset information found on the drawings for two specific rooms. The presentation is available here. The complete set of architectural, mechanical, electrical, plumbing, fire protection, communication, and other systems assets were identified from a detailed cross-referenced review of the Clinic drawings. This presentation was used to define the level of detail that would be required if the entire set of building asset information were identified in the COBie model at the design stage. The second of the two COBie quality assessment presentations, created but not used as part of this challenge, identified the same information for two additional spaces in the building. The original plan for evaluating design software was to simply count the deviations between the model and the drawings and assess the time required to manually correct the model.
After extensive review and discussion among the design challenge participants, it was determined that COBie data should focus on managed assets, not on all information about all assets. This change was driven primarily by the majority of design software companies, who did not follow the Challenge guidelines and develop the full Clinic model. To respond to those companies, the organizers of the Challenge were required to develop an enhanced COBie file through a painstaking process of electronic and manual takeoff of the drawings and equipment/product schedule information.
The resulting working COBie file was updated and distributed to the design software firms. While the enhanced model contains room, door, window, and MEP schedules, it did not contain COBie.Components for specific custom medical casework and owner-furnished medical equipment (identified as Type.Name = "Equip*"). Following the publication of these pages, an updated version of this working Clinic COBie file will be provided through the Common BIM File page.
Given the production and distribution of the enhanced working COBie file, the quality metric used for design data models was as follows:
Each COBie row containing one or more deviations that could not be explained by direct inspection of the vendor's data file was assessed a one-minute penalty. This penalty estimates the time required for someone skilled in the use of COBie to manually correct the data in a partially completed design COBie file.
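The metric above can be sketched in a few lines. This is a hypothetical illustration, not the actual COBie Toolkit logic; the row keys and the `penalty_minutes` function are assumptions for the example.

```python
# Hypothetical sketch of the design-data quality metric: each COBie row
# with one or more unexplained deviations costs a one-minute penalty,
# estimating the manual correction time for a skilled COBie user.

def penalty_minutes(row_deviations, explained_rows=frozenset()):
    """row_deviations: dict mapping a row key (e.g. a COBie Name) to the
    number of deviations found against the drawings.
    explained_rows: rows whose deviations could be explained by direct
    inspection of the vendor's data file and are therefore exempt."""
    return sum(
        1
        for row, count in row_deviations.items()
        if count > 0 and row not in explained_rows
    )

# Three rows deviate; one row's deviations are explained away.
deviations = {"Space-101": 2, "Space-102": 0, "Door-D1": 1, "Type-T9": 3}
print(penalty_minutes(deviations, explained_rows={"Type-T9"}))  # 2
```

Note that the penalty counts rows, not individual deviations: a row with three deviations still costs one minute.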
Submissions for the January 2013 bSa Challenge for COBie were tested for compliance with the COBie format, the business rules for design and construction deliverables, and the data quality standards noted in the previous paragraphs. The table below provides the results of these tests and evaluations. The first column, "COBie Sheet," identifies each worksheet in a COBie file. Worksheets that are not required must still be present but may be left empty. The second column, "In Scope," indicates whether a given worksheet (or its equivalent IFC-formatted data) was required for the related challenge. If the information was in scope, the column contains "Yes." If the information was not in scope, the column contains "No," and the worksheet, while present in the file, should be empty (or contain only the first row of column headings).
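A minimal sketch of that scope rule follows. The workbook representation (a dict of row lists) and the worksheet names are assumptions for illustration; they are not the Toolkit's internal model.

```python
# Sketch of the "In Scope" rule: out-of-scope worksheets must contain
# nothing beyond the column-heading row (row 0), while in-scope
# worksheets are expected to carry data.

def check_scope(workbook, in_scope):
    """workbook: dict mapping worksheet name -> list of rows,
    where row 0 is the column-heading row."""
    problems = []
    for name, rows in workbook.items():
        if name in in_scope:
            continue
        if len(rows) > 1:  # data rows beyond the headings
            problems.append(f"{name}: out of scope but contains data")
    return problems

workbook = {
    "Space": [["Name", "Category"], ["101", "Office"]],
    "Spare": [["Name"], ["Filter-01"]],  # out of scope yet populated
}
print(check_scope(workbook, in_scope={"Space"}))
```

A fuller checker would also flag in-scope worksheets that are missing entirely, since every worksheet must be present in the file.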
Design Software Results
The three right-most columns in the table below provide the results of the testing team's evaluation of the accuracy of the vendor's design model submissions. "Quality Control Errors" identifies technical errors found in the vendor's file; in some cases the COBie Toolkit reported errors that required additional explanation, and notes are provided for these. The next column, "Record (Count/Errors)," identifies the total count of COBie rows found for the required worksheet and the number of rows containing errors when evaluated against information found on the design drawings. The "Attribute (Count/Errors)" column identifies the attributes, or properties, found for all records of the given type. An attribute error was identified if the attribute data did not, at a minimum, match the related schedule data found on the design drawings.
1 A penalty was applied for data mismatches, since the "right answer" was given by the drawings and the manually created model file.
2 Space.GrossArea and Space.NetArea - due to differing unit conventions between the files, the measurements in the produced file are slightly different from those in the sample file. Although the data does not match the requirements, the discrepancies between the produced and sample data were considered small enough; therefore, no penalty was applied for this error.
3 COBie.Category - Although the value is not exactly the same as provided in the sample, the vendor's selection is close to the room's classification; therefore, no penalty was applied for this item.
4 The vendor partially produced the data on the Type worksheet. 38% of the instances were missing the Type.Category name token. A penalty of 30 minutes (1 min/row) was applied.
5 The vendor partially produced the data on this worksheet. 183 entries were missing the unit field, and a penalty of 183 minutes (1 min/row) was applied for this error. Additionally, the vendor produced attributes at the component level but failed to remove those attributes associated with types (1,098 entries). Because these could be deleted en masse, only an additional 5-minute penalty was applied. The total penalty for attributes was 188 minutes.
This company successfully completed the construction COBie challenge by producing the handover COBie file of the Medical Clinic model. No errors were encountered in the quality control report; therefore, no penalty was applied for data format, delivery of required fields, or proper referencing across the worksheets.
The quality of the produced data did not match the sample data 100%, and penalties were applied as described above. A total penalty of 218 minutes (3.6 hours) was assessed for data mismatches. This means that a user of the Bentley software would have to spend about 3.6 hours fixing and cleaning the COBie file.
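The total quoted above can be reproduced directly from the footnotes: the Type worksheet penalty plus the two Attribute worksheet penalties.

```python
# Reproducing the overall penalty from the footnoted figures.
type_penalty = 30            # footnote 4: missing Type.Category tokens
attribute_penalty = 183 + 5  # footnote 5: missing units + bulk-deletable
                             # type-level attributes (188 minutes total)
total = type_penalty + attribute_penalty
print(total, round(total / 60, 1))  # 218 minutes -> 3.6 hours
```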