
January 2013 bSa Challenge: Bentley Systems

by E. William East, PhD, PE, F.ASCE, Chris Bogen, PhD, and Mariangelica Carrasquillo, Engineer Research and Development Center, U.S. Army Corps of Engineers

The January 2013 buildingSMART alliance Challenge was the first buildingSMART Challenge event that expanded the set of contracted information exchanges beyond the Construction-Operations Building information exchange (COBie). This event introduced standards for the exchange of building programming information, at the early stages of project planning, and standards for the exchange of building systems, from design through the production of construction shop drawings. Software systems supporting COBie version 2.4, the Building Programming information exchange (BPie), and the Heating, Ventilating and Air Conditioning information exchange (HVACie) formats were demonstrated.

The detailed requirements for this Challenge were published by the buildingSMART alliance in February 2012 and discussed among the participants during monthly software developer meetings between February and October 2012. Meeting notes were provided to all potential Challenge participants; an example of one of these extensive sets of meeting notes can be seen here. A new tool for the automated checking of COBie files, the COBie Toolkit, was developed and used for this Challenge. The rules used by the Toolkit were documented in the COBie Responsibility Matrix and developed with direct feedback from the participating software developers and programmers. Another important contribution of these monthly discussions was the development of, and agreement to, rules governing the order of precedence when merging multiple model files at the Construction Documents stage of design. Final submissions were due 15 December 2012. This page provides information about the performance of one specific commercial software product.
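As a rough illustration only (not the COBie Toolkit itself), the sketch below shows the kind of structural check such a tool automates: verifying that the worksheets expected in a design-phase COBie deliverable are present and carry data rows. The file name, the worksheet list, and the use of the openpyxl library are assumptions made for the example.

```python
# Minimal sketch of a COBie structural check (not the COBie Toolkit itself).
# Assumes a hypothetical workbook "clinic_cobie.xlsx" and the openpyxl package.
from openpyxl import load_workbook

# Worksheets assumed to be in scope for the design-phase deliverable on this page.
EXPECTED_SHEETS = ["Contact", "Facility", "Floor", "Space", "Zone",
                   "Type", "Component", "Document", "Attribute"]

def check_cobie_structure(path):
    """Return a list of structural problems found in a COBie workbook."""
    wb = load_workbook(path)
    problems = []
    for sheet in EXPECTED_SHEETS:
        if sheet not in wb.sheetnames:
            problems.append(f"{sheet}: worksheet is missing")
        elif wb[sheet].max_row < 2:        # row 1 holds the column headings
            problems.append(f"{sheet}: contains no data rows")
    return problems

if __name__ == "__main__":
    for problem in check_cobie_structure("clinic_cobie.xlsx"):
        print(problem)
```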

Product Information

Files tested in the bSa Challenge are produced by software that could be commercially purchased on, or before, the date of the Challenge. The table below provides information about the product tested and links to user guides and support hotlines.

Company Name: Bentley Systems, Incorporated
Company Logo: Bentley Logo (JPG)
Product Name: AECOsim Building Designer (BETA)
Product Version: Not provided
COBie User Guide: COBie User Guide (PDF)
COBie-US Template: Building Template US COBie (ZIP)
COBie ABD Mapping Guide: COBie ABD Mapping (PDF)
Migration Path for Existing Users: Existing Project to COBie Upgrade Guide (PDF)
User Support Contact: Not provided
Files Produced: Coordinated Design Phase COBie

Deliverable Information

Software companies were required to provide a set of files to demonstrate their product's compliance with the requirements of the bSa Challenge. The table below provides links to these files. As appropriate, notes are provided below the table.

Native BIM File: Not provided
Exported IFC File 1: 2013-01-21_ClinicModel, 21 MB (ZIP) / 216.2 MB (IFC)
Quality Control Report 2: 2013-01-21_ClinicModel_DesignQC, 48 KB (HTML)


1 Software companies were required to provide an IFC model based on the Facility Management Handover Model View Definition. Since that MVD specification is based on IFC 2x4 and current software has only implemented IFC 2x3, files based on the IFC 2x3 Coordination Model View Definition were acceptable for this event.

2 The Quality Control Report, for either the design or construction phase, was produced by the COBie Toolkit based on business rules developed and documented in the COBie Responsibility Matrix.

Test Procedure: Design Phase

The procedure for testing COBie design models required that the submitted design data match the information content found on the Medical Clinic drawings. To that end, two presentations were developed to identify specific asset information found on the Clinic drawings. The first presentation provided the complete set of asset information found on the drawings for two specific rooms; that presentation is available here. The complete set of architectural, mechanical, electrical, plumbing, fire protection, communication, and other systems assets was identified from a detailed cross-referenced review of the Clinic drawings. This presentation was used to define the level of detail that would be required if the entire set of building asset information were identified in the COBie model at the design stage. The second of the two COBie quality-assessment presentations, created but not used as part of this Challenge, identified the same information for two additional spaces in the building. The original plan for evaluating design software was simply to count the deviations between the model and the drawings and assess the time required to manually correct the model.

After extensive review and discussion by the design Challenge participants, it was determined that COBie data should focus on managed assets, rather than all information about all assets. This decision was primarily driven by the fact that the majority of design software companies did not follow the Challenge guidelines and develop the full Clinic model. To respond to the design software companies that did not meet the stated goal of the Challenge, the organizers were required to develop an enhanced COBie file through a painstaking process of electronic and manual takeoff of drawings and equipment/product schedule information.

The resulting working COBie file was then updated and distributed to the design software firms. While the enhanced model contains room, door, window, and MEP schedules, it does not contain COBie.Components for specific custom medical casework and owner-furnished medical equipment (identified as Type.Name = "Equip*"). Following the publication of these pages, an updated version of this working Clinic COBie file will be provided through the Common BIM File page.
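As a small illustration of that "Equip*" exclusion, the sketch below drops components whose type name matches the pattern before counting managed assets. The row values are invented for the example and do not come from the Clinic model.

```python
# Illustrative sketch: excluding owner-furnished "Equip*" types from a
# managed-asset count. The Type and Component rows below are placeholders.
from fnmatch import fnmatch

types = [
    {"Name": "Door-Single-36"},
    {"Name": "Equip-ExamTable"},    # owner-furnished medical equipment
]
components = [
    {"Name": "Door-101", "TypeName": "Door-Single-36"},
    {"Name": "ExamTable-01", "TypeName": "Equip-ExamTable"},
]

excluded_types = {t["Name"] for t in types if fnmatch(t["Name"], "Equip*")}
managed = [c for c in components if c["TypeName"] not in excluded_types]
print(len(managed))   # 1: the exam table is dropped from the managed-asset count
```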

Given the production and distribution of the enhanced working COBie file, the quality metrics used for design data models were as follows:

  1. Spatial Assets and Room Schedules. Did the architectural and construction documents stage submission correctly identify each space and (at a minimum) its associated room schedule attributes found on the drawings?

  2. Architectural Schedules. Did the architectural and construction documents stage submissions include the following classes of assets, and were these assets correctly identified with (at a minimum) the scheduled properties found on the drawings? The expected architectural product schedules are Doors, Windows, Plumbing Fixtures, and Lighting Fixtures.

  3. MEP Schedules. Did the construction documents stage submission include both the spaces and architectural assets and properties, as well as the HVAC, electrical, and plumbing schedules, with (at a minimum) the scheduled properties found on the drawings? The minimum MEP scheduled products checked were Air Handling Units, Boilers, Chillers, Pumps, and VAV boxes. Lighting fixtures and plumbing fixtures were also checked.

Each COBie row containing one or more deviations that could not be explained by direct inspection of the vendor's data file was assessed a one-minute penalty. This penalty estimates the time required for someone skilled in the use of COBie to manually complete the data from a partially completed design COBie file.
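A minimal sketch of how that penalty rule translates into a time estimate is shown below. The deviation counts are placeholders invented for illustration; the actual counts come from comparing each worksheet against the drawings, as reported in the results table.

```python
# Sketch of the one-minute-per-row penalty rule. The deviation counts below
# are placeholders, not the results reported on this page.
PENALTY_MINUTES_PER_ROW = 1

deviating_rows = {      # worksheet name -> rows with unexplained deviations
    "Type": 12,
    "Component": 7,
    "Attribute": 40,
}

total_minutes = sum(deviating_rows.values()) * PENALTY_MINUTES_PER_ROW
print(f"Estimated correction time: {total_minutes} minutes "
      f"({total_minutes / 60:.1f} hours)")
```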

Test Results

Submissions for the January 2013 bSa Challenge for COBie were tested to ensure compliance with the COBie format, the business rules for design and construction deliverables, and the data quality standards noted in the previous paragraphs. The table below provides the results of these tests and evaluations. The first column, "COBie Sheet," identifies each worksheet in a COBie file. Worksheets that are not required must still be present, but may be left empty. The second column, "In Scope," indicates whether a given worksheet (or its equivalent IFC-formatted data) was required for the related Challenge. If the information was in scope, the column contains the value "Yes." If the information was not in scope, the column contains the value "No," and the worksheet, while present in the file, should be empty (or contain only the first row of column headings).
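That scope rule can be expressed compactly. The sketch below flags any out-of-scope worksheet that contains data rows; the scope flags mirror the table that follows, while the row counts passed in are placeholders for illustration.

```python
# Sketch of the "In Scope" rule: out-of-scope worksheets must still be present
# in the file but should hold only the heading row. Scope flags mirror the
# results table below; the row counts are placeholders.
IN_SCOPE = {
    "Contact": True, "Facility": True, "Floor": True, "Space": True,
    "Zone": True, "Type": True, "Component": True, "Document": True,
    "Attribute": True, "System": False, "Assembly": False,
    "Connection": False, "Spare": False, "Resource": False, "Job": False,
    "Impact": False, "Coordinate": False, "Issue": False, "Picklists": False,
}

def scope_violations(data_row_counts):
    """Return out-of-scope worksheets that nevertheless contain data rows."""
    return [sheet for sheet, rows in data_row_counts.items()
            if not IN_SCOPE.get(sheet, True) and rows > 0]

print(scope_violations({"Space": 259, "System": 4}))   # -> ['System']
```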

Design Software Results

The three right-most columns in the table below provide the results of the testing team's evaluation of the accuracy of the vendor's design model submissions. The "QC Format Errors" column identifies technical errors found in the vendor's file; in cases where the COBie Toolkit reported errors that required additional explanation, notes are provided. The next column, "Record (Count / Errors)," identifies the total count of COBie rows found for the required worksheet and the number of those rows containing errors when evaluated against information found on the design drawings. The "Attribute (Count / Errors)" column identifies the attributes, or properties, found for all of the records of the given type. Errors were identified if these attributes did not, at a minimum, match the schedule information found on the related design drawings.

COBie Sheet | In Scope | QC Format Errors | Record (Count / Errors 1) | Attribute (Count / Errors)
Contact     | Yes      | 0    | 1 / n/a        | 0 / 0
Facility    | Yes      | 0    | 1 / n/a        | 8 / n/a
Floor       | Yes      | 0    | 2 / n/a        | 14 / n/a
Space       | Yes      | 0    | 259 / 0% 2,3   | 1,873 / 0%
Zone        | Yes      | 0    | 230 / 0        | 0 / 0
Type        | Yes      | 0    | 80 / 38% 4     | 978 / 6% 5
Component   | Yes      | 0    | 1,397 / 0%     | 8,244 / 1% 5
System      | No       | n/a  | n/a            | n/a
Assembly    | No       | n/a  | n/a            | n/a
Connection  | No       | n/a  | n/a            | n/a
Spare       | No       | n/a  | n/a            | n/a
Resource    | No       | n/a  | n/a            | n/a
Job         | No       | n/a  | n/a            | n/a
Impact      | No       | n/a  | n/a            | n/a
Document    | Yes      | 0    | 80 / n/a       | 0 / 0
Attribute   | Yes      | 0    | 18,117 / 1% 5  | n/a
Coordinate  | No       | n/a  | n/a            | n/a
Issue       | No       | n/a  | n/a            | n/a
Picklists   | No       | n/a  | n/a            | n/a


1 A penalty was applied for data mismatches, since the "right answer" was provided by the drawings and the manually created model file.

2 Space.GrossArea and Space.NetArea: due to differing unit conventions between files, the measurements in the produced file are slightly different from the measurements in the sample file. Although the data does not match the requirements exactly, the discrepancies between the produced and sample data were considered small enough that no penalty was applied for this error.

3 COBie.Category: although the value is not exactly the same as that provided in the sample, the vendor's selection is close to the room's classification; therefore, no penalty was applied for this item.

4 The vendor partially produced the data on the Type worksheet: 38% of the instances are missing the Type.Category name token. A penalty of 30 minutes (1 min/row) was applied.

5 The vendor partially produced the data on this worksheet: 183 entries were missing the unit field, and a penalty of 183 minutes (1 min/row) was applied for this error. Additionally, the vendor produced attributes at the component level but failed to remove those attributes associated with types (1,098 entries). Because these could be deleted en masse, only an additional 5-minute penalty was applied. The total penalty for attributes was 188 minutes.


Format Compliance

This company successfully completed the design COBie Challenge by producing the Coordinated Design Phase COBie file of the Medical Clinic model. No errors were encountered in the quality control report; therefore, no penalty was applied for data format, delivery of required fields, or proper referencing across the worksheets.

Content Quality

The quality of the produced data did not match the provided sample data 100%, and penalties were applied as described above. A total penalty of 218 minutes (3.6 hours) was applied for the data mismatches. This means that a user of the Bentley software would need to spend approximately 3.6 hours fixing and cleaning the COBie file.
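The 218-minute total follows directly from the penalties assessed in the footnotes to the results table:

```python
# Worked total of the penalties listed in the table footnotes above.
type_category_penalty = 30    # footnote 4: missing Type.Category tokens, 1 min/row
attribute_unit_penalty = 183  # footnote 5: missing unit field, 1 min/row
bulk_delete_penalty = 5       # footnote 5: type-level attributes removable en masse

total_minutes = type_category_penalty + attribute_unit_penalty + bulk_delete_penalty
print(total_minutes, round(total_minutes / 60, 1))   # 218 minutes, 3.6 hours
```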

Additional Resources

Previous March 2010 Challenge Results

COBie homepage

COBie ToolKit

Example Model Files

Back to the 2013 bSa Challenge
