Big data gives power providers a better view of energy consumption

Power companies throughout the United States are now using data collection tools to gather information from smart meters, substations and distributed resources in an effort to figure out how the grid can become more efficient. In order to effectively obtain actionable intelligence, utilities are amassing this digital information and processing it through analytics programs hosted on cloud servers.

Obtaining an operational perspective 
According to InformationWeek, General Electric Water and Power Chief Technology Officer Jim Fowler said that the emergence of critical masses of data, combined with the ability to merge large data sets, has enabled the energy technology company to amass 100 million hours of operating and maintenance information from 1,700 turbines and compare it with weather statistics. Fowler noted that GE is helping its customers attain a 1 percent improvement in output.

Though this statistic may seem insignificant, the slight improvement in production will result in anywhere from $2 million to $5 million in annual savings per turbine. Over the next 15 years, GE clientele utilizing the company's 1,700 turbines stand to save $66 million. This success is an example of how big data is being organized to yield market-changing knowledge with residual effects on the energy economy. The production rates of these machines may never have been addressed if not for the analytics tools used to interpret the information delivered by the sensors.

GE is one of many companies capitalizing on the Internet of Things by embedding networked sensors in its products to gain algorithmic insight. Over time, this feedback allows executives and engineers to readjust the original designs of particular equipment so that machinery operates to the best of its ability.

Know how it is used 
Due to the variable nature of energy demand, it's imperative that utilities possess a comprehensive view of all assets participating in the ebb and flow of electricity. Intelligent electronic devices act on the information transmitted by smart meters, substations and distributed resources to determine where power could best be used. Cloud computing has offered utilities a means of storing such intelligence, but they are finding it difficult to decide which data deserve the most attention.

Bob Ritchie, a contributor to Smart Grid News, claimed that the North American grid is undergoing a monumental shift in protocol. More renewable energy sources are being integrated into critical infrastructure, electric vehicles have driven up consumption rates, and coal and nuclear power are slowly fading out of practice due to regulatory and financial pressures.

"More supply sources (many with no means of central control) coupled with increased demand volatility will complicate grid management," said Ritchie. "Utilities will need to deliver a broader range of information, faster service and near instantaneous status updates."

Stored in the cloud, the data sets collected by these assets are used to fuel analytics tools. Ritchie noted that such technology can help utilities anticipate transformer failures, identify energy theft attempts and merge weather and demographic statistics with consumption data in order to predict demand. A push for sustainability combined with a rising population is forcing energy providers to adapt to the shifting environment, one that includes big data as another grid asset.
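To make the weather-based demand prediction Ritchie describes more concrete, the sketch below fits a toy least-squares line relating daily peak temperature to metered demand. It is purely illustrative: the figures are invented sample data, not real utility numbers, and production analytics platforms would work at far larger scale with many more variables.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance of x and y, and variance of x, drive the slope.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical history: daily peak temperature (F) vs. metered demand (MW).
temps = [70, 75, 80, 85, 90, 95]
demands = [400, 430, 470, 520, 580, 650]

slope, intercept = fit_line(temps, demands)

def predict_demand(temp_f):
    """Forecast demand for a forecasted temperature using the fitted line."""
    return slope * temp_f + intercept
```

A utility could feed tomorrow's weather forecast into `predict_demand` to anticipate load; real systems would add demographic, calendar and historical-usage features rather than temperature alone.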
