The Copy/Paste Method

All the recent chatter about whether COBie is dead or not has caused me to look at a few things. In particular, one item came to mind that gets overlooked more than we realize.

Let me present a scenario: we are working on a project requiring the construction of a particular retaining wall. During the course of the project, this retaining wall is built a total of three different times, each time demolishing the previous and rebuilding from scratch. If this actually happened on a project, someone might lose their job over the amount of rework.

Now let us turn our attention to how data is handled during the course of a project. From project inception to completion, the data for a project may be recreated multiple times. At project genesis, the design team creates a program that develops into a working model. This model then generates the construction documents and specifications. As the project is awarded to a contractor, that contractor will proceed through their own processes, creating more data unique to their workflows.

Even during the course of construction, data is duplicated and triplicated as submittal data and specification data begin to accumulate. When commissioning starts, it's entirely possible that this data will be recreated yet again in order to satisfy owner requirements for facilities maintenance, and so on. This may not be the case at every company or for every project; some are more efficient at managing their data during the project life cycle. Yet I venture to say that many work within this type of model. The rebuilding of this data just isn't as "in your face" as a retaining wall built three times over.

The problem is quite simply this: How do we avoid the recreation of project data during the life of the project? How do we harvest the data from the design team in order to begin the propagation of data that can eventually be used by all project team members? Why can't we use the data from the MEP BIMs to start identifying specific equipment needed for the Owner's facilities group? I know that I am not the only one who feels this way.

How do we go about harvesting the data from a BIM to begin this process?

Currently, a few programs and/or companies exist to offer services that help facilitate this process. However, these can be pricey and potentially lock an owner into a long, drawn-out process of bringing in a third party to 'help' manage this data and the transition into their CMMS systems. The common thread in all of this is the 1s and 0s. As Dr. Evil said, "I'm the boss, I need the info." The currently available options for extracting data from BIMs are limited. You can export schedules to Excel, then copy and paste into spreadsheets. I'm not a big fan of a copy/paste workflow; it's prone to errors and mistakes. The other option is to export the data to a database, manage it there, and create exports to different formats, such as XML, spreadsheets or another database type. This is probably the best solution for managing large amounts of data from different sources.

But how do we do this?

Currently you are extremely limited in linking a database to a BIM model, and your options are even fewer when it comes to leveraging anything AutoCAD related. As BIM people, how do we go about harvesting and collectively managing the data? There is no one perfect solution at the moment, but rather a hodge-podge of methods that get results, but at what cost? BIM builds its reputation while the copy/paste method stands as an accepted use of the technologies we have in place. Can you go around with a sense of pride saying, "I copied and pasted that!"? Should this be our battle cry? I hope not.

The question is: where do we go from here?

In order to answer that question, we need to know exactly where 'here' is. All of us are at a different place on this path, some just beginning the journey and some possibly going down the wrong path. The effort is there. The solution lies in the extraction and migration of data from different sources, then merging that data into one source from which the project team can pull whatever specific data is required. This can range from equipment matrices, to specific data sets for FM imports, to performance evaluations of how the process was managed. The potential is mind-boggling.

The mindset needs to change from wholesale model management to overall data management. With this type of data management, it becomes possible to create an as-built model with all pertinent project data integrated into it. I know some can claim that this is possible at the present moment, but again, how many hoops and barrel rolls do I have to complete in order to make it work? Until we create a true information exchange on a BIM level (by this I mean models, submittals, manufacturer data and anything else desired), we will resort to the tried and true 'copy/paste method.'
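The extract-merge-pull idea above can be sketched in a few lines: records from different sources (say, design-model data and submittal data) are merged into one dataset keyed on a shared identifier, and each consumer pulls only the fields it needs. The "mark" key and the field names are hypothetical stand-ins for whatever identifiers a real project uses.

```python
def merge_sources(*sources):
    """Merge per-source record lists into one dataset keyed on 'mark'.

    Later sources overwrite earlier ones field-by-field, so the merged
    record accumulates data as the project progresses.
    """
    merged = {}
    for source in sources:
        for record in source:
            merged.setdefault(record["mark"], {}).update(record)
    return merged

def fm_export(merged, fields=("mark", "model", "serial", "room")):
    """Pull just the subset of fields an FM import requires."""
    return [{f: rec.get(f, "") for f in fields} for rec in merged.values()]
```

An equipment matrix, an FM import, or a process audit would each be a different projection of the same merged dataset rather than a fresh round of re-entry.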


  1. DJ-

    The problem of data preservation vs. re-creation throughout the project life is well-recognized, although not (I dare say) as well-understood or well-solved. Part of it comes out of the architecture-engineering tradition. In the '70s, when I was beginning my practice of architecture, we thought nothing of drawing and re-drawing a project, manually, two or three (or more!) times. For example, a floor plan might be:
    -Sketched by the project designer;
    -"Hard-lined" on yellow trace to work out dimensional issues;
    -Traced on vellum for the SD client presentation;
    -"Hard-line" constructed again for SD/CD drawings, with more detail embedded;
    -Finally traced on vellum or Mylar with pencil and/or ink for CDs;

    We accepted this vast amount of re-work as our standard workflow, and considered the re-working as a method of quality control and refinement. Actual changes were additional to this workflow, sometimes incorporated when changing drawing stages, sometimes erased / redrawn within a stage.

    CAD solved quite a bit of the redraw but didn't change the basic 2D management or workflow. In fact, for most AE's, CAD was difficult to review on-screen, and check-printing became the standard review process. We actually increased our consumption of paper as the result of an automated process.

    Now we have 3D BIM models generating 2D drawing views. For many firms, that's the extent of it — their deliverables are drawings, and that's that. BIM is primarily a drawing coordinator for them.

    But COBie, and the Penn State PXP process, point to a standard way of embedding spec data in BIM objects, with the tantalizing possibility that we may be able to capture and retain the entire information history of an object throughout its life-cycle: from programming (performance requirements) through schematic design and design development (spec requirements) through procurement and installation (product asset and maintenance data). In an ideal world, we'd be able to query an object at any stage of its life and look all the way back in time to its initial requirements spec, and understand (when and if something goes wrong) where the error occurred, so we can either sue (if we're in the USA) or improve our process for the future (if we're elsewhere 🙂). This is in theory possible, and may indeed happen some day.

    There are (as I see it) two major challenges in the theoretical COBie framework:

    The first is that performance standards are usually specified by space in the project, e.g.:
    "The clean room shall have
    -### SF of ergonomic work surfaces
    -work surface lighting of ### footcandles
    -Internal air temperatures maintained between ## and ## degrees Fahrenheit at ##% RH;
    -## air changes per hour;"

    This is all fine and good program practice. The problem is that there is no direct path between these high-level performance specs and the detailed design results other than the designer's sketches and calculations. Some would say that this is how it should be–this is the role of a professional, after all. I don't disagree, but only observe that the main improvement in design audits resulting from BIM automation is the fact that you know where the specs should reside. In theory, of course. (Perhaps this is indeed a great leap forward and I'm being too skeptical.)

    The second, and more practical, issue is that the COBie implementations I've seen so far are (understandably enough) focused on export, with (I suspect) little import implementation. Certainly this was the case when COBie was spreadsheets only. Now that COBie is (or can be) embedded in the BIM as part of its IFC data, I think there is the real possibility of a cradle-to-grave data assessment of the building model. But realistically, I think it's still a year or three off from being demonstrable.

    Entering and managing COBie data in BIM authoring tools is still too fragmented and difficult. I do believe this will change as more jurisdictions adopt COBie as the "I in BIM" standard.

    • Robert,

      Great reply. Thanks for sharing your insight. I too come from an architectural background where recreation was an accepted practice. The point of this post was to elicit responses from others and see their thoughts on this workflow that many currently employ, the copy/paste method. I do think we are closer than we realize to implementing COBie data, or any specific data, into our BIMs. What needs to be realized is that this interface is going to occur in a database environment, not in Revit or any other BIM authoring software; the interfaces and workflows there are currently a little too cumbersome. It's already possible to exchange data on a database level with BIM authoring programs; it's the initial creation and propagation of that information that I am currently focused on. It's definitely the 'I' in BIM. However, I think we also need to pay special attention to how much 'I' we embed; sometimes too much information is a bad thing.

  2. Yes, you have my permission to say "I copied that!" Thanks for the comments.

    • Bob

      Thanks. Also, do you consider the NIST estimate of $15.8 billion per year to be too conservative an estimate of the "Cut and Paste Waste"?

  3. I refer to the generation going through undergraduate college programs the "copy-paste" generation in all my classes. The students often submit assignments simply "copied and pasted" with little reflection or analysis–let alone reading it once! I'm lucky if they even provide a citation…thank goodness for "Google Scholar" to find the source.

    • Bob

      DJ – Some excellent questions. Can I get your permission to "Copy and Paste" these questions for the NBIMS Terminology SubCommittee meeting this morning? I'd like to reference your points alongside the ten-year-old NIST study of poorly designed information workflows by Mike Gallaher et al., which estimated that over $15.8 billion is wasted each year… minimum.

      This report, prepared for NIST by RTI International and the Logistics
      Management Institute, estimates the cost of inadequate interoperability
      in the U.S. capital facilities industry to be $15.8 billion per year.
