Using Scatter Plots to Assess Building Performance–Part 6

While teaching a class earlier today, I realized that there were a couple of useful techniques that I had yet to illustrate in terms of working with scatter plots.  These techniques are particularly useful in that they allow you to fairly quickly bracket the cost associated with a particular energy pattern if you were to isolate it.  If that pattern is an undesirable one, then those costs are the savings potential associated with eliminating it.

Counting a Few More Dots

Basically, these techniques automate the concept I illustrated in the post titled Using Scatter Plots to Assess Building Performance–Part 4, where I determined the savings by “counting the dots” since each dot represents 15 minutes (or whatever the sample interval is) of consumption.  In fact, the video clip works with that same data set to quantify another opportunity that could be identified in the scatter plot.
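As a quick refresher on the arithmetic behind the dot counting (the dot count, kW value, and cell references below are hypothetical and purely illustrative, not numbers from this data set):  each dot is one 15 minute interval, so the energy a cluster of dots represents is simply the number of dots times 0.25 hours times the kW associated with them.

```
Hypothetical cluster:   1,000 dots × 0.25 h per dot × 30 kW ≈ 7,500 kWh
In a spreadsheet:       =COUNT(B2:B35041)*0.25*AVERAGE(B2:B35041)
```

In the spreadsheet version, column B is assumed to hold the kW values for the dots you have isolated;  35,040 rows is one year of 15 minute data.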

As you may recall, our electrical data cloud looked like this.

[Scatter plot of the electrical interval data cloud]

And, based on other information we had about the facility, we had concluded that the lower band in the cloud (nominally 20 kW) likely represented base load; i.e. equipment and systems that ran around the clock.

Computers in server rooms are likely a part of this load, as are things like vending machines, security lighting, personal computers that are left on all of the time, etc.  Most facilities have some opportunities for improvement in this area, so being able to isolate it and quantify it would be desirable, and the techniques in the video could be used to do that.

Based on its magnitude and the nature of the equipment in the facility, we had also concluded the upper band in the cloud (the band at approximately 55 – 90+ kW) likely represented the operation of the chiller since it was the only piece of equipment that could generate that big of a step in consumption.

We acknowledged that it might also be possible for the air handling systems to generate a peak in that area if they ramped up to full load.   But field observations and trends indicated that seldom happened.

Finally, we concluded that the middle cloud (the nominal 32 to 55 kW band) likely represented the normal operating load associated with the HVAC equipment and related utility systems.  All of those systems are scheduled;  they should only run Monday through Friday between the hours of 7 am and 5 pm.  Otherwise the systems should be off other than for an occasional weekend event or a short duration run for night set-back or set-up to keep the building temperatures from drifting too far out of limits during extreme weather.

Given the Monterey location, there is not that much extreme weather.  So, you would expect that if you filtered out the normal operating hours and days (i.e., turned off the data points associated with them), then the middle band and upper band in the cloud would disappear.  Such an experiment is very easy to perform with a scatter plot, and when we did it for this data set, it was quite revealing.  Here is what we saw when we eliminated:

  • Data points at or below 20 kW (the base load), and
  • Data points associated with Saturday and Sunday (times when the building should be off other than the rare weekend event), and
  • Data points associated with 7 am – 6 pm Monday through Friday (the normal operating hours).

[Scatter plot of the data remaining after the filters were applied]

This was not what we were expecting to see;  if everything really was off, the hours remaining in the data set should have had little if any consumption associated with them and the cloud should have totally disappeared.  But the chart gives a very quick visual clue that this is not the case.
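If you want to set up the same sort of filter in your own workbook, here is a rough sketch of one way to do it (the column layout, row references, and the NA() approach are my assumptions about a typical spreadsheet, not necessarily how the workbook in the video is built).  With time stamps in column A and kW in column B, a helper column keeps only the weekday, after-hours points above the 20 kW base load;  an XY scatter series plotted from that column simply skips the #N/A cells, so the filtered points drop off the chart.

```
C2:  =IF(AND(WEEKDAY(A2,2)<=5, OR(HOUR(A2)<7, HOUR(A2)>=18), B2>20), B2, NA())
```

The formula is copied down through the data set (C2:C35041 for a year of 15 minute data), and adding Saturday and Sunday back in is just a matter of relaxing the WEEKDAY() test.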

Adding a few simple formulas quantifies the potential operating cost associated with things running when they should not be as being in the range of 44,631 to 60,383 kWh per year ($8,034 to $10,869).  Adding Saturday and Sunday back in (four mouse clicks) increases the potential to 63,767 to 86,273 kWh annually ($11,478 to $15,529).  Those are pretty firm numbers since they come from the building data and a few engineering assumptions, and they certainly merit some additional investigation to see if the savings can be captured.
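Incidentally, the ratio of the dollar figures to the kWh figures above works out to a blended rate of about $0.18 per kWh.  The video walks through the actual formulas;  as a rough sketch of the general approach (kW_low and kW_high below are named cells holding your own bracketing assumptions about the load that is running when it should not be, and the cell references are illustrative), once the helper column has isolated the dots of interest, you can count them and bracket the energy and cost:

```
Dots remaining:      =COUNT(C2:C35041)                      COUNT skips the #N/A cells
kWh, low estimate:   =COUNT(C2:C35041)*0.25*kW_low
kWh, high estimate:  =COUNT(C2:C35041)*0.25*kW_high
Cost, low estimate:  =COUNT(C2:C35041)*0.25*kW_low*0.18
```

If you would rather work from the metered values than a bracketing assumption, =AGGREGATE(9,6,C2:C35041)*0.25 sums the actual kW of the remaining dots while ignoring the #N/A errors.  And if the data set covers more or less than a full year, the results would need to be scaled accordingly.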

The Dot Counting Technique

This video clip walks through the steps I used to develop the numbers above.  The techniques are simple and quick to implement and, as you can see, they yield some powerful results that are grounded in the metrics of the building.

And once you have a spreadsheet set up for a typical data set from a given utility or metering system, it is not difficult at all to copy and paste new data into it, meaning your initial effort really yields a tool that you can use as an ongoing resource in addition to helping assess savings for the current project.

At this point in the discussion, all of the savings are theoretical;  we are making projections based on assumptions applied to the metered data.  The results help focus retrocommissioning efforts on the targets that we suspect are the drivers behind the patterns, based on our filters.

But as you will see in the next post, the guidance provided by these techniques frequently leads to some measure of the anticipated savings.  And the same techniques used to evaluate the opportunity can be used to assess the benefit delivered.


David Sellers
Senior Engineer – Facility Dynamics Engineering

Click here for a recent index to previous posts
