
January 2014 bSa Challenge: Graphisoft

by E. William East, PhD, PE, F.ASCE - Engineer Research and Development Center, U.S. Army Corps of Engineers

The January 2014 buildingSMART alliance Challenge was the first buildingSMART Challenge event that expanded the set of contracted information exchanges beyond the Construction-Operations Building information exchange (COBie). This event introduced standards for the exchange of building programming information, at the early stages of project planning, and standards for the exchange of building systems, from design through the production of construction shop drawings. Software systems supporting COBie version 2.4, the Building Programming information exchange (BPie), and the Heating, Ventilating and Air Conditioning information exchange (HVACie) formats were demonstrated.

The detailed requirements for this Challenge were published by the buildingSMART alliance in February 2012 and discussed among the participants during monthly software developer meetings between February and October 2012. Meeting notes were provided to all potential Challenge participants. An example of one of these extensive sets of meeting notes can be seen here. A new tool for the automated checking of COBie files, the COBie ToolKit, was developed and used for this challenge. The rules used by the ToolKit were documented in the COBie Responsibility Matrix and developed with direct feedback from the software developers and programmers. Another important contribution of these monthly discussions was the development of, and agreement to, rules governing the order of precedence when merging multiple model files at the Construction Documents stage of design. Final submissions were due 1 December 2013. This page provides information about the performance of one specific commercial software product.
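The actual checks performed by the COBie ToolKit are defined in the COBie Responsibility Matrix; as a rough illustration of the kind of automated rule the ToolKit applies, the sketch below (an assumption, not the ToolKit's actual code) checks two representative COBie 2.4 rules: required fields must be populated, and cross-worksheet references must resolve.

```python
# Illustrative sketch only: the real rules live in the COBie Responsibility
# Matrix and are implemented by the COBie ToolKit. Sheet and column names
# follow COBie 2.4 conventions; the rule set here is a hypothetical subset.

def check_cobie(sheets):
    """sheets: dict mapping worksheet name -> list of row dicts."""
    errors = []
    # Rule 1: every Space row needs a non-empty Name.
    # (Spreadsheet data starts on row 2; row 1 is the column headings.)
    for i, row in enumerate(sheets.get("Space", []), start=2):
        if not row.get("Name", "").strip():
            errors.append(f"Space row {i}: missing Name")
    # Rule 2: every Component.Space value must reference an existing Space.
    space_names = {r["Name"] for r in sheets.get("Space", [])}
    for i, row in enumerate(sheets.get("Component", []), start=2):
        for ref in row.get("Space", "").split(","):
            ref = ref.strip()
            if ref and ref not in space_names:
                errors.append(f"Component row {i}: unknown Space '{ref}'")
    return errors

sheets = {
    "Space": [{"Name": "1A01"}, {"Name": ""}],
    "Component": [{"Space": "1A01"}, {"Space": "9Z99"}],
}
print(check_cobie(sheets))
```

Running the sketch reports one missing Space name and one dangling Component reference; the real ToolKit applies many more rules of this general shape.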

Product Information

Files tested in the bSa Challenge are produced from software that could be commercially purchased on or before the date of the Challenge. The table below provides information about the product tested and links to user guides and support hotlines.

Company Name: GRAPHISOFT SE
Product Name: ArchiCAD
Product Version: 17
User Guide: User Manual
User Support: Yanni Alexakis, GRAPHISOFT NA, Newton, MA, (617) 485-4215, http://www.graphisoft.com/support/ifc/

Files Produced:
  Project Planning/Programming: BPie
  Architectural Design Phase: COBie
  Coordinated Design Phase: COBie
  Construction Phase: COBie
  Construction Shop Drawing Phase: HVACie

Deliverable Information

Software companies were required to provide a set of files to demonstrate their product's compliance with the requirements of the bSa Challenge. The table below provides links to these files. As appropriate, notes are provided below the table.

Native BIM File: Clinic_A_24102013_ArchiCAD17-Native model.zip, 25.92 MB (.zip)
Exported IFC File: Clinic_A_24102013_ArchiCAD17-IFC.zip, 12.71 MB (.zip)
COBie File (See Note 2): Clinic_A_24102013_ArchiCAD17-Spreadsheet.zip, 610.47 KB (.zip)
Quality Control Report (See Note 4): Clinic_A_24102013_ArchiCAD17.html, 50.59 KB (html)

Notes:
(1) Software companies were required to provide an IFC model based on the Facility Management Handover Model View Definition. Since that MVD specification is based on IFC 2x4 and current software has only implemented IFC 2x3, files based on the IFC 2x3 Coordination Model View Definition were acceptable for this event.
(2) The COBie File produced for this challenge was created through the use of the COBie Tool Kit, based on the rules and IFC-to-spreadsheet mappings documented in the COBie Responsibility Matrix.
(3) The Facility Management Handover Model View Definition IFC file listed here is the file that was produced by the COBie Tool Kit. Those working with IFC file exchanges will notice the reduction in file size when non-asset and geometric information, found in the Coordination MVD but not the FM Handover MVD, is omitted.
(4) The Quality Control Report, for either the design or construction phases, was produced by the COBie ToolKit based on business rules developed and documented in the COBie Responsibility Matrix.

Test Procedure: Design Phase

The procedure for testing COBie design models required that the submitted design data match the information content found on the Medical Clinic drawings. To that end, two presentations were developed to identify specific asset information found on the Clinic drawings. The first presentation provided the complete set of asset information found on the drawings for two specific rooms. The presentation is available here. The complete set of architectural, mechanical, electrical, plumbing, fire protection, communication, and other systems assets was identified from a detailed cross-referenced review of the clinic drawings. This presentation was used to define the level of detail that would be required if the entire set of building asset information were identified in the COBie model at the design stage. The second presentation, created but not used as part of this challenge, identified the same information for two additional spaces in the building. The original plan for evaluating design software was simply to count the deviations between the model and the drawings and assess the time required to manually correct the model.

After extensive review and discussion by the design challenge participants, it was determined that COBie data should focus on managed assets, not on all information about all assets. This decision was primarily driven by the majority of design software companies, who did not follow the Challenge guidelines and develop the full Clinic model. To respond to the design software companies who did not meet the stated goal of the Challenge, the organizers were required to develop an enhanced COBie file through a painstaking process of electronic and manual takeoff of drawings and equipment/product schedule information.

The resulting working COBie file was then updated and distributed to the design software firms. While the enhanced model contains room, door, window and MEP schedules, it did not contain COBie.Components for specific custom medical casework and owner-furnished medical equipment (identified as Type.Name = "Equip*"). Following the publication of these pages, an updated version of this working Clinic COBie file will be provided through the Common BIM File page.

Given the production and distribution of the enhanced working COBie file, the quality metrics used for design data models were as follows:

  1. Spatial Assets and Room Schedules. Did the architectural and construction documents stage submission correctly identify each space and (at a minimum) its associated room schedule attributes found on the drawings?
  2. Architectural Schedules. Did the architectural and construction documents stage submission include the following classes of assets and were these assets correctly identified and have (at a minimum) scheduled properties found on the drawings? Expected Architectural products schedules are Doors, Windows, Plumbing Fixtures, and Lighting Fixtures.
  3. MEP Schedules. Did the construction documents stage submission include both the spaces and architectural assets and properties, and the HVAC, electrical and plumbing fixture schedules, with (at a minimum) the scheduled properties found on the drawings? The minimum expected MEP scheduled products checked were Air Handling Units, Boilers, Chillers, Pumps, and VAV boxes. Lighting and plumbing fixtures were also checked.

Each COBie row containing one or more deviations that could not be explained by direct inspection of the vendor's data file was assessed a one-minute penalty. This penalty is an estimate of the time required for someone skilled in the use of COBie to manually complete the data in a partially completed design COBie file.
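The penalty arithmetic above is straightforward; a minimal sketch (the function name and structure are assumptions for illustration) shows how the estimated repair time follows directly from the count of deviating rows:

```python
# Sketch of the quality metric described above: each COBie row with one or
# more unexplained deviations is assessed a one-minute penalty, giving an
# estimate of manual repair time for someone skilled in the use of COBie.

PENALTY_MINUTES_PER_ROW = 1  # per the Challenge's stated metric

def repair_time_estimate(rows_with_deviations):
    """Return estimated repair time in minutes, given the number of COBie
    rows whose data deviates from the design drawings without explanation."""
    return rows_with_deviations * PENALTY_MINUTES_PER_ROW

print(repair_time_estimate(12))  # 12 deviating rows -> 12 minutes
```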

Test Results

Submissions for the bSa Challenge for COBie in January 2014 were tested to ensure compliance with the COBie format, the business rules for design and construction deliverables, and the data quality standards noted in the previous paragraphs. The table below provides the results of these tests and evaluations. The first column, "COBie Sheet," identifies each worksheet in a COBie file. Worksheets that are not required must still be present, but may be left empty. The second column, "In Scope," indicates whether a given worksheet (or its equivalent IFC-formatted data) was required for the related challenge. If the information was in scope, the column contains the value "Yes." If the information was not in scope, the column contains the value "No," and the worksheet, while present in the file, should be empty (or contain only the first row of column headings).
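The scoping rule above can be expressed as a simple check. The sketch below is an assumption about how a checker might implement it, not the ToolKit's actual code: every worksheet must exist, and an out-of-scope worksheet may hold only its header row.

```python
# Hypothetical implementation of the scoping rule described above:
# all COBie worksheets must be present, and out-of-scope worksheets may
# contain only the first (header) row.

def check_scope(workbook, in_scope):
    """workbook: dict of sheet name -> list of rows (header row included).
    in_scope: dict of sheet name -> bool (True if required for the challenge)."""
    problems = []
    for sheet, required in in_scope.items():
        rows = workbook.get(sheet)
        if rows is None:
            problems.append(f"{sheet}: worksheet missing (must be present)")
        elif not required and len(rows) > 1:
            problems.append(f"{sheet}: out of scope but contains data rows")
    return problems

wb = {"Space": [["Name"], ["1A01"]], "Spare": [["Name"], ["S-1"]]}
scope = {"Space": True, "Spare": False, "Issue": False}
print(check_scope(wb, scope))
```

For the example workbook this flags the out-of-scope Spare sheet for carrying data and the Issue sheet for being absent, while the in-scope Space sheet passes.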

Design Software Results

The three right-most columns in the table below provide the results of the testing team's evaluation of the accuracy of the vendor's design model submissions. Quality Control Errors identify technical errors found in the vendor's file. In some cases the COBie Tool Kit reported errors that required additional explanation; in these cases notes are provided. The next column, "Record (Count/Errors)," identifies the total count of COBie rows found for the required row type and the number of rows containing errors when evaluated against information found on the design drawings. The "Attribute (Count/Errors)" column identifies the attributes or properties found for all of the records of the given type. Errors related to these attributes were identified if the related schedule data did not, at a minimum, match the information found on the related design drawings.

COBie Sheet   In Scope   QC Format Errors   Record (Count/Errors)   Attribute (Count/Errors)
Contact       No         0                  1 / n/a                 0 / 0
Facility      Yes        0                  1 / 0                   8 / 0
Floor         Yes        0                  5 / 0                   20 / 0
Space         Yes        0                  269 / 0 (Note 1)        4962 / 0
Zone          Yes        0                  259 / 0 (Note 2)        0 / n/a
Type          Yes        0                  118 / 0                 0 / n/a
Component     Yes        0                  3140 / 0                13635 / 0
System        Yes        0                  428 / n/a               0 / 0
Assembly      No         0                  0 / 0                   0 / 0
Connection    No         0                  0 / 0                   0 / 0
Spare         No         0                  0 / 0                   0 / 0
Resource      No         0                  0 / 0                   0 / 0
Job           No         0                  0 / 0                   0 / 0
Impact        No         0                  0 / 0                   0 / 0
Document      No         0                  118 / 0                 0 / 0
Attribute     Yes        0                  18625 / n/a             n/a
Coordinate    No         n/a                n/a                     n/a
Issue         No         n/a                n/a                     n/a
Picklists     No         n/a                n/a                     n/a

Notes:
1. The total count of spaces matched, although the vendor did not include the loading dock outside the building (1F02) or the Site, and partitioned 2AC1 into four areas. Since the missing spaces are outside the building, this is not an error, but simply a noted modeling deviation.
2. The count of the spaces in the "Logistics" zone did not match the test model because of the absence of space 1F02 noted in (1). This is therefore not logged as an error, but a noted modeling deviation.

Conclusion

This company successfully completed the construction COBie challenge by producing the handover COBie file of the Medical Clinic model. Based on the quality control report there were no format errors reported. The vendor also correctly exported the data required by the COBie design challenge criteria. This means that a user should not have to spend additional time correcting a COBie file exported from the vendor's software.

The full COBie Challenge for 2014 was to ensure that the data in the COBie model matched the data on the published design drawings. For this Challenge the submitter did not allocate the modeling time needed to meet the full COBie Challenge. Given the accuracy of what was included in the model, it can be expected that additional manual effort would be required to repair properties in the design model that do not match those found on the construction contract drawings.

There are several reasons why buildingSMART alliance efforts to create the United States Building Information Model Standards have been successful. First, owners have begun to understand the need to receive building information that decreases both the first and recurring costs. There is now a standard (the COBie standard) and commentary (the COBie Guide) that defines the required information for building information deliverables.

Second, use of open standards decreases the cost of software developers to meet owners' requirements. Finally, innovation in this domain is now beginning in earnest since a common information framework frees software companies from chasing owner-specific and proprietary software-to-software information exchange protocols.

The publication of NBIMS-US v 2.0 began the transformation of planning, design, construction, and operations from a document-centric approach that began in the Italian Renaissance to a process where documents are provided with matching data. The overall result of the January 2014 bSa Challenge demonstrates that this transformation has begun. The results provided on this page give specific evidence that this process is underway.

Additional Resources

Graphisoft 2013 bSa Challenge Results

Back to the 2014 bSa Challenge
