
January 2013 bSa Challenge: Autodesk

by E. William East, PhD, PE, F.ASCE, Chris Bogen, PhD, and Mariangelica Carrasquillo, Engineer Research and Development Center, U.S. Army Corps of Engineers

The January 2013 buildingSMART alliance Challenge was the first buildingSMART Challenge event that expanded the set of contracted information exchanges beyond the Construction-Operations Building information exchange (COBie). This event introduced standards for the exchange of building programming information at the early stages of project planning, and standards for the exchange of building systems information from design through the production of construction shop drawings. Software systems supporting the COBie version 2.4, Building Programming information exchange (BPie), and Heating, Ventilating and Air Conditioning information exchange (HVACie) formats were demonstrated.

The detailed requirements for this Challenge were published by the buildingSMART alliance in February 2012 and discussed among the participants during monthly software developer meetings between February and October 2012. Meeting notes were provided to all potential Challenge participants; an example of one of these extensive sets of meeting notes can be seen here. A new tool for the automated checking of COBie files, the COBie Toolkit, was developed and used for this challenge. The rules used by the Toolkit were documented in the COBie Responsibility Matrix and developed with direct feedback from the software developers and programmers. Another important contribution of these monthly discussions was the development of, and agreement to, rules governing the order of precedence when merging multiple model files at the Construction Documents stage of design. Final submissions were due 15 December 2012. This page provides information about the performance of one specific commercial software product.
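To illustrate the kind of cross-reference rule the Toolkit enforces, the minimal sketch below checks that every Component row references a Space name that actually appears on the Space worksheet, the same class of error noted in the test results further down this page. This is not the Toolkit's actual implementation; the file name "clinic_design.xlsx" and the column positions (Space.Name in column A, Component.Name and Component.Space in columns A and E, per the COBie template) are assumptions for illustration.

    # Minimal sketch of one COBie cross-reference check, not the Toolkit itself.
    from openpyxl import load_workbook

    def check_component_spaces(path):
        wb = load_workbook(path, read_only=True)
        # Space.Name is assumed to be in column A of the Space worksheet.
        space_names = {row[0].value for row in wb["Space"].iter_rows(min_row=2)}
        errors = []
        for i, row in enumerate(wb["Component"].iter_rows(min_row=2), start=2):
            name = row[0].value
            space = row[4].value if len(row) > 4 else None
            # A Component may reference several Spaces as a comma-delimited list.
            for ref in str(space or "").split(","):
                if ref.strip() and ref.strip() not in space_names:
                    errors.append((i, name, ref.strip()))
        return errors

    for line, component, space in check_component_spaces("clinic_design.xlsx"):
        print(f"Component row {line} ({component}): unknown Space '{space}'")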

Product Information

Files tested in the bSa Challenge are produced from software that could be commercially purchased on or before the date of the Challenge. The table below provides information about the product tested and links to user guides and support hotlines.

Company Name: Autodesk, Inc.
Company Logo: Logo File (BMP)
Product Name: Revit
Product Version: 2013
Configuration Guide: Autodesk Revit Software Configuration Guide (DOC)
User Support: Contact
Files Produced: Coordinated Design Phase COBie

Deliverable Information

Software companies were required to provide a set of files to demonstrate their product's compliance with the requirements of the bSa Challenge. The table below provides links to these files. As appropriate, notes are provided below the table.

Native BIM File: Clinic A, 14.4 MB (RVT); Clinic MEP, 34.4 MB (RVT)
COBie File: COBie Challenge Autodesk Final, 1 MB (XLS)
Quality Control Report [1]: COBie QC report - Design Deliverable, 50 KB (HTML)

Notes:

[1] The Quality Control Report, for either the design or construction phase, was produced by the COBie Toolkit based on business rules developed and documented in the COBie Responsibility Matrix.

Test Procedure: Design Phase

The procedure for testing COBie design models required that the submitted design data match the information content found on the Medical Clinic drawings. To that end, a presentation documented the detailed asset information found on the Clinic drawings for two typical rooms. This presentation shows the complexity of incorporating the complete set of architectural, mechanical, electrical, plumbing, fire protection, communication, and other systems assets identified in a detailed cross-referenced review of the clinic drawings, and it was used to explain the level of detail required to establish whether a COBie file contains the same information as printed drawings. Once the file was received, an additional room, the first floor mechanical room, was added to the evaluation. To summarize the presentation, the evaluation criteria for a typical design COBie file should include, at minimum, the following types of reviews; a sketch of such a schedule comparison follows the list.

  1. Spatial Assets and Room Schedules. Do all design submissions correctly identify each space and associated room schedule attributes as found on the drawings?
  2. Architectural Schedules. Do architectural and construction documents stage submissions include the following classes of assets? Were these assets correctly identified, and do they have (at a minimum) the scheduled properties found on the drawings? The expected architectural product schedules are Doors, Windows, Plumbing Fixtures, and Lighting Fixtures.
  3. MEP Schedules. Do the coordinated design or construction documents stage submissions include both the spaces and architectural assets and properties, and the HVAC, electrical, and plumbing fixture schedules, with (at a minimum) the scheduled properties found on the drawings? The expected minimum MEP scheduled products checked were Air Handling Units, Boilers, Chillers, Pumps, and VAV boxes. In the case of a conflict between drawings, lighting fixtures and plumbing fixtures in the combined model should reflect the engineering drawings, not the architectural drawings.
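The schedule reviews in items 2 and 3 amount to comparing scheduled property values transcribed from the drawings against the values carried in the COBie file. Below is a minimal sketch of that comparison, assuming the drawing takeoff has been transcribed into a dict; all asset names, property names, and values are invented for illustration and are not taken from the Clinic drawings.

    # Drawing takeoff keyed by Type name; property names/values are invented.
    expected_schedule = {
        "AHU-1": {"SupplyAirflow": "4000 cfm", "Voltage": "460 V"},
        "VAV-12": {"MaxAirflow": "550 cfm"},
    }

    def review_type(type_name, cobie_attributes):
        # cobie_attributes: {attribute name: value} taken from COBie.Attribute
        # rows whose SheetName is "Type" and whose RowName matches type_name.
        mismatches = []
        for prop, drawing_value in expected_schedule.get(type_name, {}).items():
            if cobie_attributes.get(prop) != drawing_value:
                mismatches.append((prop, drawing_value, cobie_attributes.get(prop)))
        return mismatches

    print(review_type("AHU-1", {"SupplyAirflow": "4000 cfm", "Voltage": "208 V"}))
    # [('Voltage', '460 V', '208 V')] -- the kind of deviation noted under Test Results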

To support the integration of scheduled information into the model, an updated example COBie file was created based on a manual takeoff of the equipment/product schedules. This updated Clinic COBie file will be publicly released through the Common BIM File page as "version 12" of the Clinic COBie example files on or about March 2013.

While significant effort was spent developing the underlying redacted drawings, initial models, and updated COBie models, not one design software vendor fully met the COBie Challenge requirement that the submitted model match the provided design drawings or example COBie file. As a result, the following criteria were established to evaluate design model submissions and estimate the quality of each submitted data file.

  1. COBie format assessment. The design model was checked against the format requirements using the COBie Toolkit's Design QC report. Vendors were expected to have zero formatting errors; any row containing a formatting error was therefore assessed a one-minute penalty.
  2. COBie.Space worksheet assessment. The design model files were filtered to evaluate only three specific Spaces: the two identified in the design quality control presentation, plus the first floor mechanical room. Unexplained deviations between the vendor's file and the sample data were assessed penalties.
  3. COBie.Zone worksheet assessment. The design model files were filtered to evaluate only the zones identified for the three specific Spaces. Zones were accepted as long as they were correctly formatted, even if they did not match the zones identified in the sample file. Any row found to be in error, or internally inconsistent, was penalized once per row.
  4. COBie.Type worksheet assessment. The sample COBie model file was filtered to identify only those Types appearing in the three specific Spaces. Since vendors did not completely model the clinic facility, only those Types found to match were evaluated.
  5. COBie.Attribute assessment for COBie.Types. The sample COBie model was filtered to identify only the attributes of those Types whose components appear in the three specific Spaces. Since vendors did not provide a complete set of Type Attributes, only those Type Attributes found to match were evaluated.
  6. COBie.Component worksheet assessment. The design model files were filtered to evaluate the list of COBie.Components in each of the three specific Spaces. Since vendors did not provide a complete set of COBie.Components, only those Components found to match were evaluated.
  7. COBie.Attribute assessment for COBie.Components. The sample COBie model was filtered to identify only the attributes of the Components appearing in the three specific Spaces. Since vendors did not provide a complete set of Component Attributes, only those Component Attributes found to match were evaluated.
  8. Penalty assessment. Each COBie row containing one or more deviations that could not be explained by direct inspection of the vendor's data file was assessed a one-minute penalty. This penalty estimates the time required for someone skilled in the use of COBie to manually correct the data in a partially completed design COBie file. Even where information did not match, if the vendor provided a reasonable explanation and the complete information set was internally consistent, the error was waived. For each worksheet, the percentage of rows in error for the three sample rooms tested was assumed to apply to the entire model; that error percentage was applied across all other rows of the worksheet to estimate the total correction time (a worked sketch of this extrapolation follows the list).
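To make the extrapolation in item 8 concrete, the minimal sketch below scales a sampled error rate across a whole worksheet at one penalty minute per erroneous row. All numbers here are illustrative, not taken from the actual evaluation.

    # Illustrative only: extrapolate the sampled error rate across the
    # whole worksheet, at one penalty minute per erroneous row.
    def estimated_fix_minutes(sample_rows, sample_errors, total_rows):
        if sample_rows == 0:
            return 0.0
        error_rate = sample_errors / sample_rows  # fraction of sampled rows in error
        return error_rate * total_rows            # one minute per erroneous row

    # e.g., 2 erroneous rows among 30 sampled rows of a 478-row Component sheet:
    print(estimated_fix_minutes(30, 2, 478))  # roughly 32 minutes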

Test Results

The table below provides the results of these tests and evaluations. The first column, "COBie Sheet," identifies each worksheet in a COBie file. Worksheets that are not required must still be present but may be left empty. The second column, "In Scope," indicates whether a given worksheet (or its equivalent IFC-formatted data) was required for the related challenge. If the information was in scope, the column contains the value "Yes." If the information was not in scope, the column contains the value "No," and the worksheet, while present in the file, should be empty (or contain only the first row of column headings).
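A minimal sketch of that emptiness check for out-of-scope worksheets follows; it assumes an openpyxl workbook object as in the earlier Component/Space sketch, with sheet names taken from the COBie template.

    # Illustrative sketch of the "In Scope" rule: out-of-scope worksheets
    # must be present but hold only the heading row.
    OUT_OF_SCOPE = ["Assembly", "Connection", "Spare", "Resource",
                    "Job", "Impact", "Document"]

    def check_out_of_scope(wb):
        problems = []
        for sheet in OUT_OF_SCOPE:
            # Any populated cell below row 1 means data beyond the headings.
            rows = wb[sheet].iter_rows(min_row=2)
            if any(cell.value is not None for row in rows for cell in row):
                problems.append(sheet)
        return problems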

Design Software Results

The three right-most columns in the table below provide the results of the testing team's evaluation of the accuracy of the vendor's design model submission. The "QC Format Errors" column identifies technical errors found in the vendor's file; in some cases the COBie Toolkit reported errors that required additional explanation, and notes are provided for these. The "Record (Count / Error Ratio)" column identifies the total count of COBie rows found for the required sheet and the ratio of rows containing errors when evaluated against information found on the design drawings. The "Attribute (Count / Error Ratio)" column identifies the attributes or properties found for all records of the given type; attribute errors were identified where the values did not, at a minimum, match the schedule data found on the related design drawings.

COBie Sheet | In Scope | QC Format Errors | Record (Count / Error Ratio) | Attribute (Count / Error Ratio)
Contact | No | 0 | 57 / n/a | 0 / n/a
Facility | Yes | 0 | 1 / n/a | 0 / n/a
Floor | Yes | 0 | 5 / n/a | 0 / n/a
Space | Yes | 0 | 269 / 0 [1] | 5486 / 0
Zone | Yes | 0 | 18 / 0 [2] | 0 / n/a
Type | Yes | 0 | 37 / 0 [3] | 118 / 11% [4]
Component | Yes | 1 [5] | 478 / 0 [3] | 3144 / 0
System | Yes | 0 | 13 / n/a | 0 / n/a
Assembly | No | 0 | 0 / n/a | 0 / n/a
Connection | No | 0 | 0 / n/a | 0 / n/a
Spare | No | 0 | 0 / n/a | 0 / n/a
Resource | No | 0 | 0 / n/a | 0 / n/a
Job | No | 0 | 0 / n/a | 0 / n/a
Impact | No | 0 | 0 / n/a | 0 / n/a
Document | No | 0 | 0 / n/a | 0 / n/a
Attribute | Yes | 0 | 8748 / 0 | n/a
Coordinate | No | n/a | 525 / n/a | n/a
Issue | No | n/a | n/a | n/a
Picklists | No | n/a | n/a | n/a

Notes:

[1] There was a discrepancy in the units of measure used for usable height, but based on the unit conversion the values are accurate, and no penalty is applied.

[2] In the newest version of COBie (2.4), the standard convention for Zone.SpaceNames is to provide only one space instead of a comma-delimited list of spaces; a Zone row is now uniquely identified by Zone.Name + Zone.Category + Zone.SpaceNames. However, no penalty is applied because the provided data is accurate and is still an acceptable representation of Zones.

[3] The description fields for many rows contain the Autodesk family name for the Component or Type. Though this does not match the original model, it is a valid use of Description.

[4] Based on the inspected data, Voltage values (e.g., for the Cold Water Pump) did not match the values in the original model. This incurs a 4-minute penalty for the time it takes to manually fix these errors.

[5] This error is due to one Component row referencing a Space name that does not exist on the Space worksheet. This incurs a one-minute penalty.
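The convention change described in note 2 is straightforward to normalize mechanically. The sketch below expands one legacy comma-delimited Zone row into the one-space-per-row form keyed by Zone.Name + Zone.Category + Zone.SpaceNames; the zone and space names are invented for illustration.

    # Expand a legacy Zone row (comma-delimited SpaceNames) into one row
    # per space, keyed by Name + Category + SpaceNames per COBie 2.4.
    def normalize_zone(name, category, space_names):
        return [(name, category, space.strip())
                for space in str(space_names).split(",")
                if space.strip()]

    print(normalize_zone("HVAC Zone 1", "Circulation Zone", "101, 102, 103"))
    # [('HVAC Zone 1', 'Circulation Zone', '101'),
    #  ('HVAC Zone 1', 'Circulation Zone', '102'),
    #  ('HVAC Zone 1', 'Circulation Zone', '103')]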

Conclusion

Format Compliance

This company successfully completed the design COBie challenge by producing the coordinated design COBie file of the Medical Clinic model. Based on the quality control report, there was only one small error with respect to the internal consistency of the output format; it incurred a one-minute penalty and, extrapolated across the full model, would require an estimated 5 minutes to correct.

Content Quality

The vendor did not produce 100% of the sample data; however, the data produced closely matched the expected sample data found on the project drawings. Based on the content quality criteria, a 4-minute penalty was applied as described in the section above, bringing the total penalty assessed for the submitted file to 5 minutes. With the format correction extrapolated across the full model (an estimated 5 minutes, as noted above), a user of Autodesk Revit 2013 is estimated to spend 9 minutes cleaning and fixing the COBie file for a facility of comparable size and complexity.

Additional Resources

Previous June 2010 Revit Challenge Results (for COBie Output)

COBie homepage

COBie ToolKit

Example Model Files

Back to the 2013 bSa Challenge
