January 2013 bSa Challenge: EcoDomus
by E. William East, PhD, PE, F.ASCE and Mariangelica Carrasquillo, Engineer Research and Development Center, U.S. Army Corps of Engineers
The January 2013 buildingSMART alliance Challenge was the first buildingSMART Challenge event that expanded the set of contracted information exchanges beyond the Construction-Operations Building information exchange (COBie). This event introduced standards for the exchange of building programming information, at the early stages of project planning, and standards for the exchange of building systems, from design through the production of construction shop drawings. Software systems supporting COBie version 2.4, the Building Programming information exchange (BPie), and the Heating, Ventilating and Air Conditioning information exchange (HVACie) formats were demonstrated.
The detailed requirements for this Challenge were published by the buildingSMART alliance in February 2012 and discussed with the participants during monthly software developer meetings held between February and October 2012. Meeting notes were provided to all potential Challenge participants; an example of one of these extensive sets of meeting notes can be seen here. A new tool for the automated checking of COBie files, the COBie Toolkit, was developed and used for this Challenge. The rules applied by the Toolkit were documented in the COBie Responsibility Matrix and developed with direct feedback from the software developers and programmers. Another important contribution of these monthly discussions was the development of, and agreement to, rules governing the order of precedence when merging multiple model files at the Construction Documents stage of design. Final submissions were due 15 December 2012. This page provides information about the performance of one specific commercial software product.
Files tested in the bSa Challenge were produced by software that was commercially available on or before the date of the Challenge. The table below provides information about the product tested and links to user guides and support hotlines.
Software companies were required to provide a set of files to demonstrate their product's compliance with the requirements of the bSa Challenge. The table below provides a link to these files. Where appropriate, notes are provided below the table.
Test Procedure: Construction Phase
The COBie design-stage model can be thought of as an outline of facility asset information that is filled in during the construction process. While there are minor deviations due to change orders that must be addressed in the base model, the essential purpose of construction software is to provide a tool through which the contractor, and associated business partners, can add detail to the designer's model as construction progresses. To meet this goal, construction software must import design model data correctly, allow the contractor to update that information, and then export the resulting set of information to the owner to reflect the current state of completion of the facility. Testing of COBie construction files, therefore, was based solely on the ability of the construction software to produce properly formed COBie files. COBie construction files were not tested against the quality of the design models, only on whether the data updated during construction was found in the correct location in the completed COBie file. The testing of these files was based on the COBie Toolkit Construction Quality Control report.
The COBie Construction Challenge also included several new participants whose software supports only a portion of the construction workflow. For these companies, only the COBie worksheets related to their specific business applications were evaluated. As a result, there will be differences between the worksheets exported by one construction software product and those exported by another. For COBie worksheets within the scope of a given product's business case, each COBie row containing one or more deviations that could not be explained by direct inspection of the vendor's data file was assessed a one-minute penalty. This penalty is a low estimate of the time someone skilled in the manual process would require to correct a COBie construction file by hand.
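The penalty rule described above can be sketched as follows. This is a minimal illustration, not part of the COBie Toolkit: the function name and the dictionary representation of quality-control results are hypothetical, and it assumes deviating rows have already been identified per worksheet.

```python
# Hypothetical sketch of the one-minute-per-row penalty rule: each COBie row
# with one or more unexplained deviations costs one minute, and worksheets
# outside the vendor's business scope are not assessed at all.

PENALTY_MINUTES_PER_ROW = 1  # one minute assessed per deviating row

def assess_penalty(deviation_rows_by_sheet, in_scope_sheets):
    """Return total penalty minutes for worksheets within the vendor's scope."""
    return sum(
        rows * PENALTY_MINUTES_PER_ROW
        for sheet, rows in deviation_rows_by_sheet.items()
        if sheet in in_scope_sheets
    )

# Example: one unexplained deviation on the Attribute sheet, none elsewhere.
penalty = assess_penalty(
    {"Attribute": 1, "Component": 0},
    in_scope_sheets={"Type", "Component", "Attribute"},
)
print(penalty)  # 1
```

Under these assumptions, a single deviating Attribute row yields the one-minute total reported for this vendor below.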
Submissions for the January 2013 bSa COBie Challenge were tested for compliance with the COBie format, the business rules for design and construction deliverables, and the data quality standards noted in the previous paragraphs. The table below provides the results of these tests and evaluations. The first column, "COBie Sheet," identifies each worksheet in a COBie file. Worksheets that are not required must still be present, but may be left empty. The second column, "In Scope," indicates whether a given worksheet (or its equivalent IFC-formatted data) was required for the related challenge. If the information was in scope, the column contains the value "Yes." If the information was not in scope, the column contains the value "No," and the worksheet, while present in the file, should be empty (or contain only the first row with the column headings).
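The worksheet-scope rule just described can be illustrated with a short sketch. This is a simplified model, not Toolkit code: it treats a workbook as a mapping from sheet name to a list of rows (real COBie files are spreadsheets), and the helper and example sheet contents are assumptions for illustration only.

```python
# Hypothetical check of the scope rule described above: every worksheet must
# be present in the file; an out-of-scope worksheet must be empty or contain
# only its column-heading row.

def check_worksheet_scope(workbook, required_sheets, in_scope):
    """Return a list of human-readable violations of the scope rule."""
    violations = []
    for sheet in required_sheets:
        rows = workbook.get(sheet)
        if rows is None:
            violations.append(f"{sheet}: worksheet missing")
        elif sheet not in in_scope and len(rows) > 1:
            # Out of scope: only the column-heading row may be present.
            violations.append(f"{sheet}: out of scope but contains data rows")
    return violations

# Example: 'Spare' is out of scope but holds a data row beyond its headings,
# and 'Job' is missing from the workbook entirely.
book = {
    "Component": [["Name", "CreatedBy"], ["AHU-01", "designer@example.com"]],
    "Spare": [["Name", "CreatedBy"], ["Filter-01", "contractor@example.com"]],
}
print(check_worksheet_scope(book, ["Component", "Spare", "Job"],
                            in_scope={"Component"}))
```

The sketch flags both kinds of violations the table's "In Scope" column implies: a missing worksheet, and an out-of-scope worksheet that carries data rows.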
Construction Software Results
The next three columns in the table below describe the accuracy of the vendor's construction model submissions. Quality Control Errors identify technical errors found in the file provided by the vendor. In some cases the COBie Toolkit reported errors that required additional explanation; in these cases, notes are provided.
The next column indicates the number of errors reported by the COBie Toolkit Quality Control report. In cases where errors were not properly reported, additional notes may be provided. The next column, "Types," identifies the total count of COBie.Type records found in the vendor's file and the number of items containing errors when evaluated against information found on the design drawings. The "Components" column identifies the total number of components in the model and the number of components in error when compared with the design drawings. Finally, attributes of Types and Components were compared with the values found on drawing schedules.
1An error in the Attribute sheet was encountered due to a failure to pick a category for a specific attribute. A one-minute penalty (1 min/error) was applied for this error.
This company successfully completed the construction COBie challenge by producing the handover COBie file for the Medical Clinic model. Based on the errors in the quality control report, a total penalty of 1 minute was applied, which means that a user of EcoDomus software would spend approximately 1 minute cleaning or fixing the COBie file.