January 2014 bSa Challenge: EcoDomus
by E. William East, PhD, PE, F.ASCE - Engineer Research and Development Center, U.S. Army Corps of Engineers
Information exchange standards developed by buildingSMART chapters are intended to streamline the delivery of building information through the life of capital projects. buildingSMART alliance Challenge events allow software companies to demonstrate their ability to meet these standards. Challenge criteria require that software companies produce and/or consume the required information exchanges consistent with both the format and content of what would be provided in contract documents. The mantra "data must match deliverables" is the fundamental quality standard of bSa Challenge events.
An independent quality control and/or quality assurance process was conducted to verify software claims of compliance. Software vendors are required to demonstrate their products and provide sufficient configuration information to allow users to repeat this process on their own projects.
Files tested in the bSa Challenge are produced from software that can be commercially purchased on or before the date of the Challenge. The table below provides information about the product tested and links to user guides and support hotlines.
Software companies were required to provide a set of files to demonstrate their products' compliance with the requirements of the bSa Challenge. The table below provides a link to these files. As appropriate, notes are provided below the table.
Test Procedure: Construction Phase
The COBie design-stage model can be thought of as an outline that is filled in during the construction process. While there are minor deviations resulting from change orders, the essential role of construction software that supports COBie is to help the contractor add detail to the designer's model as construction progresses. To meet this goal, construction software must import design model data correctly, allow the contractor to update the changed information, and then export that updated set of information to the owner. Testing of COBie construction files was based solely on the ability of the construction software to produce properly formed COBie files. COBie construction files were not tested against the quality of the design models, only on whether the data updated during construction was found in the correct location in the completed COBie file. The testing of these files was based on the COBie Tool Kit Construction Quality Control report.
The COBie Construction Challenge also included several participants whose software supports only a portion of the construction workflow. For these companies, only the worksheets containing the COBie data related to their applications were evaluated. As a result, there may be differences between the worksheets exported by one construction software firm and another. For COBie worksheets that are within the scope of a given software's business case, each COBie data row that contained deviations that could not otherwise be explained was assessed a one-minute penalty. This penalty is a low estimate of the time required for someone to manually correct COBie construction files.
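The per-row penalty rule above can be sketched as follows. This is an illustrative reconstruction, not the actual scoring tool: the function name, the deviation counts, and the worksheet names in the example are assumptions made for the sketch.

```python
def assess_penalty(deviations_by_sheet, in_scope_sheets, minutes_per_row=1):
    """Total penalty in minutes: one minute per unexplained deviating
    row, counted only for worksheets within the software's scope."""
    return sum(
        count * minutes_per_row
        for sheet, count in deviations_by_sheet.items()
        if sheet in in_scope_sheets
    )

# Hypothetical example: Space and Component are in scope; Job is not,
# so its 5 deviating rows carry no penalty.
penalty = assess_penalty(
    {"Space": 3, "Component": 2, "Job": 5},
    in_scope_sheets={"Space", "Component", "Type"},
)
print(penalty)  # 5
```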
Note that some vendors are able to "pass along" information provided from design. This means their output file will contain information provided from design even though this information cannot be modified by the software. Although this pass-through information is present in the file and is checked by the construction QC report, the test result section does not account for it.
Submissions for the January 2014 bSa Challenge for COBie were tested to ensure compliance with the COBie format, the business rules for design and construction deliverables, and the data quality standards noted in the previous paragraphs. The table below provides the results of these tests and evaluations. The first column, "COBie Sheet," identifies each worksheet in a COBie file. The second column, "In Scope," indicates whether a given worksheet (or its equivalent IFC-formatted data) was required, meaning the software vendor is able to modify information in the identified worksheet. If the information was in scope, the column contains the value "Yes." If the information was not in scope, the column contains the value "No," and the worksheet, while present in the file, should be empty (or contain only the first row of column headings).
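The scope rule above (in-scope worksheets may carry data rows; out-of-scope worksheets must contain at most the heading row) can be sketched as a simple check. This is a minimal sketch under assumed data structures; the function name, the dict layout, and the example rows are illustrative, not part of the actual COBie Tool Kit.

```python
def check_scope(workbook, scope):
    """workbook: sheet name -> list of rows (first row = column headings).
    scope: sheet name -> "Yes" or "No" from the "In Scope" column.
    Returns the names of out-of-scope sheets that contain data rows."""
    violations = []
    for sheet, rows in workbook.items():
        # An out-of-scope sheet may be empty or hold only its heading row.
        if scope.get(sheet) == "No" and len(rows) > 1:
            violations.append(sheet)
    return violations

# Hypothetical example: Job is out of scope but carries a data row.
workbook = {
    "Space": [["Name", "CreatedBy"], ["101", "a@b.com"]],
    "Job": [["Name", "CreatedBy"], ["PM-1", "a@b.com"]],
}
scope = {"Space": "Yes", "Job": "No"}
print(check_scope(workbook, scope))  # ['Job']
```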
Construction Software Results
The next three columns in the table below describe the accuracy of the vendor's construction model submissions. Quality Control Errors identify technical errors found in the file provided by the vendor. In some cases, the COBie Tool Kit reported errors that required additional explanation; in these cases notes are provided.
The next column indicates the number of errors reported by the COBie Tool Kit Quality Control Report. In cases where errors were not properly reported, additional notes may be provided. The next column, "Record," identifies the total count of records found in the vendor's file. Finally, attributes of Spaces, Types, and Components represent the count of attributes for these worksheets found in the vendor's file. Both the record and attribute counts were compared to the counts in the test file during the checking process; values for the test file are shown in parentheses in both columns.
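The count comparison described above can be sketched as follows. This is an assumed reconstruction of the reporting convention, not the actual checking tool: the function name and the example counts are hypothetical, and the "vendor (test)" string mirrors the parenthesized test-file values shown in the results table.

```python
def compare_counts(vendor_counts, test_counts):
    """vendor_counts/test_counts: sheet name -> record (or attribute) count.
    Returns a "vendor (test)" string per sheet, plus sheets whose
    vendor count deviates from the test file."""
    report = {
        sheet: f"{vendor_counts.get(sheet, 0)} ({expected})"
        for sheet, expected in test_counts.items()
    }
    mismatches = [
        sheet for sheet, expected in test_counts.items()
        if vendor_counts.get(sheet, 0) != expected
    ]
    return report, mismatches

# Hypothetical example: the vendor file is two Type records short.
report, mismatches = compare_counts(
    {"Space": 64, "Type": 20}, {"Space": 64, "Type": 22}
)
print(report["Type"])  # 20 (22)
print(mismatches)      # ['Type']
```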
This company successfully completed the construction COBie challenge by consuming the handover COBie file of the Medical Clinic model and exporting an updated data set in the COBie format. No errors were reported by the construction quality control report; therefore no penalty was applied. This means a user of EcoDomus' software will not have to spend time cleaning or fixing the COBie file.