Brett Sargent, Chief Technology Officer and VP/GM of Products and Solutions at LumaSense Technologies, Inc., talks to AZoSensors about the management of Big Data in the sensor industry.
KK – You’ve recently presented at two back-to-back conferences on data analytics. What were the main points covered during your presentations?
BS – The main points covered in the presentation were:
- The Grid is not smart just because you put smart meters at its end points. The infrastructure, on a global basis, is old, very old; many components are now 40+ years old. While having smart meters at the end may enable pricing options such as Time of Use and Demand Response, it does not help with sustainable and reliable electricity flow from generation to the end user. Lose your smart meter…you are back in the 1980s. Lose your grid…welcome to the 1880s.
- The grid is getting pushed harder and harder. The number of significant power outages has risen sharply in recent years, from 76 in 2007 to 307 in 2011. The average age of a transformer in North America is over 40 years.
- With such an aged grid, you really have two choices on what you can do:
- Build new infrastructure to meet growing electrical demand and replace the aged infrastructure. This path faces huge obstacles: lack of budget, NIMBY (Not In My Backyard) and BANANA (Build Absolutely Nothing Anywhere Near Anyone) opposition, an aging workforce, etc.
- Make things last longer – take steps to extend the life of existing assets on the electrical grid. This is the more viable solution, and it can be accomplished through the addition of sensors.
- The new approach, and the right approach, is SLEx (Substation Life Extension): evaluate the assets at a substation, then outfit it with monitoring as needed to extend its life to the point at which you can upgrade or replace it.
- However, putting a high number of sensors in place to enable SLEx also steps into the pit of Big Data: a heavily instrumented substation can produce 3-5 Gb/sec of data, overwhelming a utility and crushing the bit pipe used to carry it.
- To avoid this, utilities should embrace Intelligent Sensing at the Edge, which keeps data storage and analytics close to the sensor head and then uses the "report by exception" philosophy: information is sent back to the utility on an as-needed basis, instead of flowing all of the data continuously back to the utility. The data is not lost or misplaced…just stored near the asset being monitored and available to be analyzed as needed.
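The "report by exception" idea described above can be sketched in a few lines. The following is a minimal illustration, not LumaSense's implementation: a hypothetical edge node buffers every reading locally, next to the asset, and forwards upstream only readings that cross a configurable alarm threshold. All names and the threshold value are illustrative assumptions.

```python
from collections import deque

class EdgeNode:
    """Hypothetical edge monitor: keep all data near the sensor head,
    report back to the utility only by exception."""

    def __init__(self, limit, history=10_000):
        self.limit = limit                  # alarm threshold (illustrative)
        self.store = deque(maxlen=history)  # local ring buffer near the asset
        self.sent = []                      # what actually crosses the bit pipe

    def ingest(self, reading):
        self.store.append(reading)          # every sample is kept at the edge
        if reading > self.limit:            # report by exception
            self.sent.append(reading)

# Example: hypothetical transformer hot-spot temperatures in degrees C
node = EdgeNode(limit=90.0)
for temp in [62.1, 63.0, 95.4, 64.2, 61.8]:
    node.ingest(temp)

print(len(node.store))  # 5 readings retained locally
print(node.sent)        # [95.4] -- only the exception is transmitted
```

All five samples remain available near the asset for later analysis, but only one value travels back to the utility, which is the bandwidth saving the philosophy is after.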
KK – What are your views on Big Data for Big Substations and the challenges this brings for the utility companies?
BS – Big Data means different things to different people, depending on the Volume, Velocity, Variety and Veracity of the data coming at them. The definition of Big Data for Google will be different than it is for a utility customer. Big substations need a large number of sensors in order to monitor and keep track of the assets located there, and this can push 3-5 Gb/sec of data out of a substation, a stream that is almost impossible to handle effectively. So where do you process the data? The options are:
- At the Edge
- In the Cloud
- In the Server (at HQ).
Each has pros and cons, but keep in mind that the further you move data, the more likely something will go wrong: the data can get hacked, lost, corrupted, etc. It also costs more money to move data a greater distance. Factors to consider when deciding where to process the data include:
- Bandwidth – which is the most important
- Security – NERC CIP
- Size of Network.
Sensors are needed, and they create Big Data. The challenge is knowing where you are going to process that Big Data and how you are going to handle it. The most cost-effective way is to do intelligent "sensing at the edge".
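The scale of the bandwidth problem is easy to see with back-of-the-envelope arithmetic. Assuming "Gb" in the figures above means gigabits (an assumption on my part), even the low end of the 3-5 Gb/sec range from a single substation adds up to tens of terabytes per day:

```python
# Back-of-the-envelope data volume for one substation,
# assuming "Gb/sec" means gigabits per second.
GBITS_PER_SEC = 3                            # low end of the 3-5 Gb/sec range
bytes_per_sec = GBITS_PER_SEC * 1e9 / 8      # bits -> bytes
tb_per_day = bytes_per_sec * 86_400 / 1e12   # seconds per day -> terabytes

print(round(tb_per_day, 1))  # 32.4 TB per day from a single substation
```

Multiplied across hundreds of substations, continuously backhauling that raw stream is clearly impractical, which is why filtering and analyzing at the edge is the cost-effective choice.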