This post and several that follow will round out the video clips and information in the preceding posts in this series by providing a few additional examples of how interval data and scatter charts can be used to perform building diagnostics and project savings.
Just Another Outdoor Air vs. kW Regression
This example starts out as a sort of “vanilla” interval data example. The facility was in Monterey, CA, a fairly mild climate. It was a classroom facility with offices, a number of small training rooms, and one large training room that could be divided into four quadrants. The initial plot of kW vs. Outdoor Air Temperature (OAT) gave us a fairly big cloud with maybe two layers that you could see; perhaps a hint of a third layer.
My shading trick revealed that there were in fact three layers.
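For readers who want to reproduce this kind of chart outside of Excel, the shading effect can be sketched in Python with matplotlib: plotting each point with a low alpha (opacity) makes dense layers stack up darker than sparse ones, so hidden bands emerge. The data below is entirely synthetic, invented just to illustrate the technique — it is not the project's data.

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs headless
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(0)
n = 3000
# Hypothetical 15-minute interval data with three operating "bands"
oat = rng.uniform(45, 85, n)                            # outdoor air temp, deg F
band = rng.choice([20.0, 36.0, 65.0], size=n, p=[0.4, 0.4, 0.2])
kw = band + rng.normal(0.0, 3.0, n) + 0.15 * (oat - 45.0)

fig, ax = plt.subplots()
# A low alpha makes each point mostly transparent; dense layers stack
# darker than sparse ones, which is what reveals the bands in the cloud.
ax.scatter(oat, kw, s=8, alpha=0.08, color="black")
ax.set_xlabel("Outdoor Air Temperature (°F)")
ax.set_ylabel("Building kW")
```

The key parameter is the alpha value: too high and the cloud is a uniform blob, too low and sparse points vanish entirely.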
Identifying the Drivers Behind the Bands in the Cloud
It was pretty easy to explain the difference between the 32-40 kW band and the 60-70 kW band. One circuit on the chiller was about a 30 kW load, and it was the only thing in the building that was big enough to cause a jump of that magnitude.
I suspected the 32-40 kW band was equipment running when the building was occupied and the 20 kW band was equipment that ran around the clock. But the 32-40 kW band was lower than I had expected given the equipment schedule. In other words, the sum of the major equipment kW from the motor schedule was in the range of 130 kW. The major loads were VAV air handling units, two pumps, a boiler and a chiller that could stage up and down.
So while you would not expect the load to jump up 130 kW from the 20 kW base load when things started up, you would expect to see a cloud that had its lower edge at some fraction of the 130 kW, representing the systems at minimum capacity, for instance the VAV systems at minimum flow and the chiller off. And when things loaded up, you would expect to see the top of the cloud up at about 130 kW over the base load.
So, prior to going on site, based on the equipment schedules, I was expecting a cloud with a base maybe at 60 kW (about 40 or so kW over the base load) and a top at about 150 kW (130 kW over the observed base load). The cloud would be darkest at the load condition encountered most frequently. Clearly, that was not what I was seeing.
When we got on site and looked around, the reason for the lower than expected cloud shape became obvious. We discovered that most of the equipment was running at significantly less than the full load bhp rating, as summarized in the table below.
Based on our site observations, I was fairly confident that the middle cloud – the 32-50 kW or so layer – represented the building operating with one stage of the chiller cycling. I also concluded that the cloud above that just about had to be the chiller’s second stage cycling on and off along with systems occasionally ramping up to their full rated bhp.
An Economizer Performance Insight
While on site, we were able to verify that the air handling systems tended to run with discharge temperatures in the 50-60°F range. Since the systems had economizers, that meant that if the outdoor air temperatures were at or below those conditions, you should not need to run a chiller. What was striking about that in the context of our kW cloud was this.
Assuming that the upper band in the cloud was associated with the second stage of the chiller, you would not expect to see any events at temperatures below 50-60°F. In other words, you should not need any chilled water, let alone two stages of chiller capacity below about 60°F. But if you filtered the data for events below 60°F and kW above 50 (an “eyeball average” assessment of where the top of the middle cloud, representing typical operation was), you ended up with this chart.
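The filter described here is simple enough to sketch in a few lines of Python. The sample points below are made up for illustration; the thresholds (OAT below 60°F, kW above 50) are the ones from the discussion above.

```python
# Hypothetical filter mirroring the one described in the text: keep
# 15-minute intervals where the OAT is below 60 F but demand is above
# the ~50 kW top of the middle cloud -- i.e., intervals where the
# chiller appears to be running even though outdoor air alone should
# have been able to do the cooling.
records = [
    # (outdoor air temp deg F, building kW) -- made-up sample points
    (55, 68), (58, 66), (72, 65), (52, 35), (59, 52), (48, 21),
]
suspect = [(t, p) for (t, p) in records if t < 60 and p > 50]
# suspect now holds only the "chiller on in cool weather" intervals
```

With real interval data this would typically be a filter on a spreadsheet or dataframe, but the logic is the same two-condition test.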
So if our assumptions were correct, it appeared that something was causing the chiller to run unnecessarily during cool weather. In other words, the economizers appeared to not be delivering economy.
Counting the Dots to Assess the Savings
Here is where counting the dots came in. Each point in the filtered data represented a kW value that existed for 15 minutes. If you assumed that each of those points represented the added load associated with the chiller running one circuit, then by counting the dots and dividing by 4 (because there are four 15 minute intervals in an hour), I could come up with how many hours the chiller used at least one circuit of capacity when it didn’t need to.
I could then do some basic math to extrapolate how much energy that might be costing the facility, as summarized in the table below.
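The count-the-dots arithmetic can be written out as a short calculation. Every number below except the four-intervals-per-hour conversion and the 30 kW chiller circuit (both from the text) is a placeholder assumption, not a value from the actual project.

```python
# Hedged sketch of the "count the dots" savings math.
dot_count = 1200          # assumed count of filtered 15-minute points
intervals_per_hour = 4    # four 15-minute intervals per hour
chiller_stage_kw = 30     # added load of one chiller circuit (from the post)
rate_per_kwh = 0.12       # assumed blended electric rate, $/kWh

# Hours of unnecessary chiller operation
unnecessary_hours = dot_count / intervals_per_hour    # 300 hours
# Energy and cost attributable to that operation
wasted_kwh = unnecessary_hours * chiller_stage_kw     # 9,000 kWh
annual_cost = wasted_kwh * rate_per_kwh               # about $1,080
```

The point is not precision; it is that a defensible order-of-magnitude savings estimate falls straight out of the filtered interval data.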
So my bottom line up to this point is that the interval data, combined with some field-based assumptions, allowed us to project significant savings that were grounded in real data and that focused the Owner's subsequent RCx efforts.
Additional Opportunities Come to Light
One other thing that caught our eye as we walked through the facility the first time was that the large meeting room in the core of the facility was very lightly loaded. It spent a lot of time sitting empty or with a very small group of people in it. But the system serving it ran any time the building was occupied rather than only when one of the meeting rooms was occupied. So right off the bat, we realized we had a simple scheduling opportunity: only run the system if a meeting room was in use.
But there was another significant opportunity that you could see in the energy data clouds. To understand that, you need to know a bit more about the system serving the meeting rooms. It was a dedicated four zone variable volume reheat air handling system that included a preheat coil and chilled water coil at the air handling unit. Unless the unit was in a warm-up cycle, the system delivered air in the 52-60°F range in anticipation of the need to maintain the space relative humidity at or below 50% and handle the cooling loads associated with an internal space that would have people, lights, and equipment in it.
Each of the four terminal units had a fixed minimum flow setting that was about 30% of the maximum flow setting and in general terms would support an occupancy of about 10-20 people per room, depending on exactly how you did the minimum outdoor air flow analysis.
Thinking Through How the System Might Use Energy
If you think about how a system like that would function if there was nobody using the meeting rooms, you will realize the following sequence of events might occur:
- The system would start up as scheduled and do what it needed to do to bring the space conditions under control.
- If the space had cooled off during the unoccupied cycle, then the system would run with the terminal units at minimum flow and use the reheat coils to warm the space up. There would be no outdoor air introduced at the main AHU since the warm-up cycle would occur prior to occupancy.
- If the space had warmed up during the unoccupied cycle, the system would use outdoor air cooling if it was viable and operate at peak flow to bring the spaces down to set point. If the outdoor air was hot and humid the system would still operate at peak flow but would provide the cooling via chiller operation using no outdoor air since the spaces would not be occupied during a cool down cycle.
- Once the spaces were at the target temperature for occupancy and the building was occupied, the system would shift to its normal operating mode, providing at least minimum outdoor air and providing supply air in the 52-60°F range in anticipation of a net cooling load in the space due to internal gains.
- Here is where the opportunity comes into play. If there is nobody in the space generating a cooling load, then eventually, the system would drive to the minimum flow rate setting for the terminal units and start to reheat. And it would just sit at that condition until the load in the zone became high enough to drive the flow requirement up off of the minimum flow setting.
In other words, by nature, the system was set up to do simultaneous heating and cooling. And given the lightly loaded spaces, we suspected it might be doing a lot of it.
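A rough sense of the magnitude of that simultaneous heating and cooling can be sketched with the standard sensible-heat relation for an air stream, Q (Btu/h) ≈ 1.08 × CFM × ΔT. The 30% minimum flow fraction and 52-60°F supply range come from the text; the 2,000 CFM terminal size and the 70°F reheat discharge temperature are my own illustrative assumptions.

```python
# Rough sketch of why a minimum-flow reheat condition wastes energy on
# both meters at once. Values are illustrative assumptions, not measured.
cfm_min = 0.30 * 2000.0        # terminal at its 30% minimum of an assumed 2,000 CFM max
t_supply = 55.0                # AHU discharge temperature, deg F (mid-range from the text)
t_reheat = 70.0                # assumed air temp leaving the reheat coil, deg F

# Sensible heat added by the reheat coil: Q (Btu/h) ~= 1.08 * CFM * dT
reheat_btuh = 1.08 * cfm_min * (t_reheat - t_supply)
# The AHU first cooled that same air down to t_supply, so a comparable
# cooling load is being carried by the chiller at the very same time.
```

Even at minimum flow, a single terminal in this condition represents thousands of Btu/h of heating that exists only to cancel out cooling.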
Getting a Visual on Simultaneous Heating and Cooling
It turned out that you could see this in the scatter plots. For this site, we were able to get interval data for the gas meter in addition to the electric meter. The caveat was that the gas interval was once every 24 hours vs. the once every 15 minutes available for the electrical data.
So the thermal data was not nearly as “granular” as the electrical data. Meaning you could not use it to understand hour-by-hour trends, but you could use it to understand daily trends, compare weekends to weekdays, etc. So still pretty powerful.
Here is what the gas data looked like.
The thermal data also had three layers, with one of the layers being 0 consumption, which occurred a lot of the time when the building was unoccupied over the weekend given the mild Monterey environment.
Looking at kW and Therms Concurrently
Since we were looking for simultaneous heating and cooling clues, I decided it would be nice to be able to look at both clouds together. The problem was that due to the scaling of the axes, they ended up on top of each other.
Some Good News and Some Bad News (Then Some More Good News)
Having the clouds on top of each other had its good and bad points. One of the good points was that it allowed us to see that the cloud shapes seemed to be related for both utilities. Some of that is related to the magnitude of the consumption being similar given the units. In other words, the building kW data was in the range of 0 to 120 kW and the building thermal data was in the range of 0 to 120 therms.
So in the context of the Y axis, the clouds lie on top of each other simply by virtue of the units of measure used. If I plotted electricity as watts and gas energy as therms, then the clouds would separate, but the thermal cloud would compress to a line given the scale I would need to use for a peak electrical load of 120,000 watts.
But the fact that the clouds overlap in terms of the x axis tells us something. In a very simple building with gas heat and electric cooling, you would anticipate that by the time the electric cooling started, the thermal heat would have stopped. So, if therms were blue and kW were red, the blue cloud would exist to the left of the temperature that represented the balance point for the building.1
The red cloud would likely exist there too because of lights, plug loads, the furnace fan, and things like that. But it would not grow vertically until you were to the right of the balance point, meaning you now were using electricity to cool the building in addition to providing the other functions.
What catches your eye about the data set we are looking at is that the blue cloud (thermal consumption) exists concurrently for all of the conditions where we have a red cloud (electrical consumption), including when it is significantly above the likely building balance point (60-65°F would be a good guess).
Excel Has a Secondary Horizontal Axis Capability
To try to get a better picture of that, I decided to separate the clouds from each other. To do that, I used the secondary axis feature of Excel. Specifically, I plotted the thermal data on the primary vertical and horizontal axes and the electrical data on the secondary vertical and horizontal axes.
Most of you probably are aware of Excel’s ability to plot data on a secondary vertical axis. The specifics of doing it vary from version to version, but in general terms, you pick the data series you want on the secondary axis, select format, and then select “Secondary Axis”. Here is what that looks like in Excel 2013.
When I selected the red data series in the graph above and then selected “Secondary Axis”, here is what I got. Note that there is now a secondary axis on the right side of the graph and the red data is being plotted on that basis. I changed the scale to make that obvious.
Once you have established a secondary vertical axis, Excel allows you to add a secondary horizontal axis via the chart design tools.
Closing In on a Result
Here is what my chart looked like after going through these steps along with a bit of additional formatting. Note that I used colors for the axis lines to help your eye associate them with the proper data series. I also scaled the two horizontal axes differently, which is the “trick” that let me shift the clouds apart.
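For anyone who works in Python rather than Excel, the same dual-axis arrangement can be approximated in matplotlib by chaining `twinx()` (secondary vertical axis) and `twiny()` (secondary horizontal axis). This is my own illustrative analog, not part of the original workflow, and all of the data below is synthetic.

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs headless
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(1)
# Made-up daily gas data and 15-minute electric data
oat_gas = rng.uniform(40, 75, 120)
therms = np.clip(90.0 - oat_gas + rng.normal(0, 8, 120), 0, None)
oat_kw = rng.uniform(40, 85, 2000)
kw = 25.0 + 0.8 * (oat_kw - 40.0) + rng.normal(0, 5, 2000)

fig, ax_gas = plt.subplots()
ax_gas.scatter(oat_gas, therms, s=10, color="blue", alpha=0.5)
ax_gas.set_xlabel("OAT (°F), gas data", color="blue")
ax_gas.set_ylabel("Therms per day", color="blue")

# twinx() adds a secondary vertical axis; chaining twiny() on top of it
# adds a secondary horizontal axis, so the electric cloud gets fully
# independent x and y scales -- the analog of Excel's secondary axis pair.
ax_kw = ax_gas.twinx().twiny()
ax_kw.scatter(oat_kw, kw, s=6, color="red", alpha=0.15)
# Stretching the secondary x range shifts the red cloud sideways,
# the same trick as scaling the two Excel horizontal axes differently.
ax_kw.set_xlim(0, 130)
ax_kw.set_xlabel("OAT (°F), electric data", color="red")
```

The differing x-axis limits are what pull the two clouds apart visually even though both are plotted against outdoor air temperature.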
When I filtered both data sets to show what was going on for temperatures above 60°F when the building is occupied and to isolate the top cloud, I got this. Bear in mind that for the gas data, I got one dot for every day at the indicated condition, whereas for the electrical data, I got one dot for every 15 minutes at the given condition, which is why there are a lot more red dots than blue dots.
Since this building should not need much heat to offset the envelope losses, this filtered data pattern suggests that the boiler is working to offset cooling being done by the chiller, assuming that our hypothesis about the top part of the electrical kW cloud being associated with chiller operation is correct.
That is exactly what you would expect to happen with a reheat system, especially if it did not have an integrated economizer process and reverted to recirculating air once the outdoor temperature got above the required discharge temperature set point.
The Bottom Line
So the bottom line on this example is that the interval data plots supported our hypothesis about the impact of the large unoccupied meeting rooms on the building consumption patterns. We could have used the data to assess the savings potential, just like I used it to assess the savings potential for the chiller operating when the economizers should have been able to handle the load.
But Brian and Jay, the guys I was working with on this, were so excited by the discovery that I was talking to myself when I suggested we do that. They were already off figuring out how to capture the savings, something they accomplished by adding occupancy sensors to the meeting rooms and initiating a project that would add CO2 sensors to reset the minimum flow rate as a function of occupancy when the rooms were in use.
Senior Engineer – Facility Dynamics Engineering
1. The balance point of a building is a concept associated with the degree-day approach for estimating energy consumption. It is the temperature at which the internal gains in a simple building exactly offset the losses through the envelope. As a result, you do not need to provide any additional energy to heat or cool the building in order to maintain the desired internal temperature.
Once the outdoor temperature rises above the balance point, you will need to do cooling to maintain your desired set point. If it drops below the balance point, you will need to do heating to maintain set point.
For a complex building with core loads that require year round cooling, the concept of a balance point is not particularly valid in the context of the building taken as a whole. However, for the perimeter zones, it is still a useful concept in terms of understanding when you probably are doing heating vs. cooling.
Typically, for a perimeter zone these days, the balance point will be in the 60-70°F range, with 65°F being a sort of de facto standard. That is why the standard base temperature for heating degree days is typically 65°F.
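For readers unfamiliar with the degree-day bookkeeping behind this footnote, here is a minimal heating-degree-day calculation using the 65°F base just described. The daily mean temperatures are made-up sample values.

```python
# Minimal heating-degree-day calculation using the 65 F de facto
# balance point discussed above. Daily mean temps are illustrative.
BASE_F = 65.0
daily_mean_temps = [42.0, 55.0, 63.0, 68.0, 71.0, 60.0]

# Only days with a mean below the balance point contribute heating
# degree days; days above it contribute zero (they are cooling days).
hdd = sum(max(BASE_F - t, 0.0) for t in daily_mean_temps)
```

Multiplying the accumulated degree days by a building's heat loss coefficient is the classical way to estimate seasonal heating energy.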