January 2014 bSa Challenge: Autodesk
by E. William East, PhD, PE, F.ASCE - Engineer Research and Development Center, U.S. Army Corps of Engineers
The January 2014 buildingSMART alliance Challenge was the first buildingSMART Challenge event to expand the set of contracted information exchanges beyond the Construction-Operations Building information exchange (COBie). This event introduced standards for the exchange of building programming information at the early stages of project planning, and standards for the exchange of building systems information from design through the production of construction shop drawings. Software systems supporting the COBie version 2.4, Building Programming information exchange (BPie), and Heating, Ventilating and Air Conditioning information exchange (HVACie) formats were demonstrated.
The detailed requirements for this Challenge were published by the buildingSMART alliance in February 2012 and discussed among the participants during monthly software developer meetings between February and October 2012. Meeting notes were provided to all potential Challenge participants; an example of one of these extensive sets of meeting notes can be seen here. A new tool for the automated checking of COBie files, the COBie ToolKit, was developed and used for this Challenge. The rules used by the ToolKit were documented in the COBie Responsibility Matrix and developed with direct feedback from the software developers and programmers. Another important contribution of these monthly discussions was the development of, and agreement to, rules governing the order of precedence when merging multiple model files at the Construction Documents stage of design. Final submissions were due 1 December 2013. This page provides information about the performance of one specific commercial software product.
Files tested in the bSa Challenge were produced by software that could be commercially purchased on or before the date of the Challenge. The table below provides information about the product tested and links to user guides and support hotlines.
Software companies were required to provide a set of files to demonstrate their products' compliance with the requirements of the bSa Challenge. The table below provides a link to these files. Where appropriate, notes are provided below the table.
Test Procedure: Design Phase
The procedure for testing COBie design models required that the submitted design data match the information content found on the Medical Clinic drawings. To that end, two presentations were developed to identify specific asset information found on the Clinic drawings. The first presentation provided the complete set of asset information found on the drawings for two specific rooms; that presentation is available here. The complete set of architectural, mechanical, electrical, plumbing, fire protection, communication, and other systems assets was identified from a detailed cross-referenced review of the Clinic drawings. This presentation was used to define the level of detail that would be required if the entire set of building asset information were identified in the COBie model at the design stage. The second presentation, created but not used as part of this Challenge, identified the same information for two additional spaces in the building. The original plan for evaluating design software was simply to count the deviations between the model and the drawings and assess the time required to manually correct the model.
After extensive review and discussion by the design challenge participants, it was determined that COBie data should be focused on managed assets, not all information about all assets. This change was driven primarily by the fact that the majority of design software companies did not follow the Challenge guidelines and develop the full Clinic model. To respond to the design software companies who did not meet the stated goal of the Challenge, the organizers were required to develop an enhanced COBie file based on a painstaking process of electronic and manual takeoff of drawings and equipment/product schedule information.
The resulting working COBie file was updated and distributed to the design software firms. While the enhanced model contains room, door, window, and MEP schedules, it does not contain COBie.Components for specific custom medical casework and owner-furnished medical equipment (identified as Type.Name = "Equip*"). Following the publication of these pages, an updated version of this working Clinic COBie file will be provided through the Common BIM File page.
Given the production and distribution of the enhanced working COBie file, the quality metric used for design data models was as follows:
Each COBie row containing one or more deviations that could not be explained by direct inspection of the vendor's data file was assessed a one-minute penalty. This penalty is an estimate of the time required for someone skilled in the use of COBie to manually correct data in a partially completed design COBie file.
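The penalty metric above can be sketched as a short script. The data below is purely illustrative (the actual evaluation used the COBie ToolKit and manual review); the point is only that the penalty counts rows with at least one unexplained deviation, not the deviations themselves.

```python
# Illustrative sketch of the deviation-penalty metric: each COBie row
# with one or more unexplained deviations costs an estimated one minute
# of manual correction time.

# Hypothetical per-row deviation counts keyed by (worksheet, row number);
# these values are examples, not actual Challenge results.
deviations = {
    ("Space", 12): 2,     # two fields disagree with the drawings
    ("Component", 7): 1,  # one field disagrees
    ("Type", 3): 0,       # no deviations -> no penalty
}

# One minute per row containing at least one unexplained deviation.
penalty_minutes = sum(1 for count in deviations.values() if count > 0)

print(f"Estimated correction time: {penalty_minutes} minute(s)")
```

Note that a row with two deviations still costs only one minute under this rule, since the penalty is assessed per row rather than per field.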
Submissions for the January 2014 bSa Challenge for COBie were tested to ensure compliance with the COBie format, the business rules for design and construction deliverables, and the data quality standards noted in the previous paragraphs. The table below provides the results of these tests and evaluations. The first column, "COBie Sheet," identifies each worksheet in a COBie file. Worksheets that are not required must still be present, but may be left empty. The second column, "In Scope," indicates whether a given worksheet (or its equivalent IFC-formatted data) was required for the related challenge. If the information was in scope, the column contains the value "Yes." If the information was not in scope, the column contains the value "No," and the worksheet, while present in the file, should be empty (or contain only the first row with the column headings).
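The scope rule above can be sketched as a small checker: every worksheet must be present, and an out-of-scope worksheet must contain nothing beyond its header row. This is a hypothetical sketch in which sheet contents are modeled as CSV text; a real checker, such as the COBie ToolKit, would read the spreadsheet workbook directly and apply many more rules.

```python
import csv
import io

def check_scope(sheets: dict, in_scope: dict) -> list:
    """Return human-readable violations of the worksheet scope rule.

    sheets   -- {worksheet name: CSV text of that sheet}
    in_scope -- {worksheet name: True if required, False if out of scope}
    """
    problems = []
    for name, required in in_scope.items():
        if name not in sheets:
            # Every worksheet must be present, even if out of scope.
            problems.append(f"{name}: worksheet missing")
            continue
        rows = list(csv.reader(io.StringIO(sheets[name])))
        data_rows = rows[1:]  # everything after the header row
        if not required and any(any(cell.strip() for cell in r) for r in data_rows):
            problems.append(f"{name}: out-of-scope sheet contains data")
    return problems

# Illustrative example: "Spare" is out of scope but contains a data row.
sheets = {
    "Space": "Name,CreatedBy\n101,architect@example.com\n",
    "Spare": "Name,CreatedBy\nFilter,contractor@example.com\n",
}
in_scope = {"Space": True, "Spare": False}
print(check_scope(sheets, in_scope))
```

The sheet names and the `check_scope` helper are assumptions for illustration; only the present-but-empty rule itself comes from the Challenge requirements.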
Design Software Results
The three right-most columns in the table below provide the results of the testing team's evaluation of the accuracy of the vendor's design model submissions. The "Quality Control Errors" column identifies technical errors found in the vendor's file; in some cases the COBie ToolKit reported errors that required additional explanation, and in these cases notes are provided. The next column, "Record (Count/Errors)," identifies the total count of COBie rows found for the required worksheet and the number of rows containing errors when evaluated against information found on the design drawings. The "Attribute (Count/Errors)" column identifies the attributes or properties found for all records of the given type. Attribute errors were counted when the attribute data did not, at a minimum, match the related schedule information found on the design drawings.
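The "Count/Errors" tallies described above can be sketched as a simple aggregation over per-row evaluation results. The record structure and values below are hypothetical, invented only to show how the per-worksheet pairs in the results table would be computed.

```python
# Hypothetical per-row evaluation results: whether the row itself had an
# error, how many attributes it carried, and how many of those attribute
# values disagreed with the design drawings. Field names are illustrative.
records = [
    {"sheet": "Component", "row_error": False, "attrs": 6, "attr_errors": 0},
    {"sheet": "Component", "row_error": True,  "attrs": 6, "attr_errors": 2},
    {"sheet": "Type",      "row_error": False, "attrs": 9, "attr_errors": 1},
]

def tally(records):
    """Return {sheet: (record count, record errors, attr count, attr errors)}."""
    totals = {}
    for r in records:
        c, e, a, ae = totals.get(r["sheet"], (0, 0, 0, 0))
        totals[r["sheet"]] = (
            c + 1,                    # one more record on this sheet
            e + r["row_error"],       # bool counts as 0 or 1
            a + r["attrs"],           # attributes found
            ae + r["attr_errors"],    # attributes disagreeing with drawings
        )
    return totals

for sheet, (c, e, a, ae) in tally(records).items():
    print(f"{sheet}: Record {c}/{e}, Attribute {a}/{ae}")
```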
This company successfully completed the construction COBie challenge by producing the handover COBie file of the Medical Clinic model. Based on the format quality check, no problems were reported in the QC report, and the vendor successfully exported all of the information required by the 2014 COBie challenge. This means that a user should not have to spend additional time editing a COBie file exported by the vendor's software to fill in missing data or correct format errors.
There are several reasons why buildingSMART alliance efforts to create the National BIM Standard-United States (NBIMS-US) have been successful. First, owners have begun to understand the need to receive building information that decreases both first and recurring costs. There is now a standard (the COBie standard) and commentary (the COBie Guide) that define the required information for building information deliverables.
Second, the use of open standards decreases the cost for software developers to meet owners' requirements. Finally, innovation in this domain is now beginning in earnest, since a common information framework frees software companies from chasing owner-specific and proprietary software-to-software information exchange protocols.
The publication of NBIMS-US v 2.0 began the transformation of planning, design, construction, and operations from a document-centric approach that began in the Italian Renaissance to a process where documents are provided with matching data. The overall result of the January 2014 bSa Challenge demonstrates that this transformation has begun. The specific results provided on this page give direct evidence that this process is underway.