Issue 45

The Energy That Makes the World Move

Septimiu Nechifor
Head of Research Group @Siemens



PROGRAMMING

Throughout its intense and eventful history, the technology that deals with the processing, communication and storage of data has produced a landscape of continuing diversification, heterogeneity and distribution in all walks of life. The interaction between technology and the way it is used generates innovation and puts useful pressure on our ability to innovate and adapt. Just as with individuals, and with groups made up of people and animals, heterogeneous technical systems entail high complexity in terms of control and require a continuing ability to react, to learn, to infer and to predict, and to adapt to an ever-changing environment.

What also becomes critical is how the varying performance of the participating systems is handled, in particular how the systems react to incidents, whether foreseeable or not. Given this understanding of the surrounding context, adopting a solution suited to functional and cost optimisation becomes essential for the proper functioning of these complex artefacts. Energy is an integral part of human history: every step human groups have taken towards finding, transporting, storing and using energy has decisively shaped the world we live in.

There are two fundamental milestones in the history of civilisation: 1. the invention of the steam engine, which brought about the industrial revolution and a phenomenal change in freedom of movement and production capabilities, and 2. the ability to produce and transport electrical energy, which had an extraordinary impact on quality of life and on the world's rapid urbanisation. The technological revolution spans the production, transport, distribution and consumption of energy, whether electrical, thermal or mechanical, and has given the world access to safety, comfort and opportunities that are hard to imagine otherwise. It is also worth noting how innovation has been sustained and nurtured over the last half century by the evolution of computing power, data storage and access to telecommunication services across many locations.

Steps towards intelligent energy

The relationship between energy and IT, from production to consumption, has evolved over time and makes for a very interesting story. In what follows, we will limit ourselves to a less visible field and the way it benefits from Big Data Analytics and Machine Learning. This field is Maintenance, and the topic at the crossroads of Predictive Analytics and Maintenance is called Predictive Maintenance. It focuses on detecting the signals and trends that indicate a system's or sub-assembly's tendency to function sub-optimally; left unaddressed, this sub-optimality eventually leads to uncontrollable faults and to downtime for the affected system. Predictive Maintenance gathers the data obtained by monitoring a given infrastructure, plus additional relevant data such as weather forecasts or raw-material costs, and provides recommendations over various time horizons, recommendations shaped by the impact of how the monitored infrastructure is maintained. This is certainly a step ahead of blind maintenance, where installation checks are performed at regular intervals or scheduled according to statistical estimates of how worn the various components are.
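As a purely illustrative sketch of the trend-detection idea described above (not the actual system), the following Python snippet flags a monitored parameter whose rolling average drifts outside an assumed healthy operating band; the window size, the thresholds and the bearing-temperature example are invented for illustration.

```python
import statistics
from collections import deque

# Hypothetical trend detector: flags a signal whose rolling mean drifts
# outside an assumed healthy operating band. All names and thresholds
# are illustrative and not taken from any real monitoring system.
def drifting(readings, window=50, healthy_low=60.0, healthy_high=80.0):
    recent = deque(maxlen=window)
    for value in readings:
        recent.append(value)
        if len(recent) == window:
            mean = statistics.fmean(recent)
            if not healthy_low <= mean <= healthy_high:
                return True  # trend suggests sub-optimal behaviour
    return False

# Example: a bearing temperature creeping slowly upward over time
temperatures = [70 + 0.05 * i for i in range(400)]
print(drifting(temperatures))  # True once the rolling mean exceeds 80
```

In a real Predictive Maintenance setting, such a flag would be only one input among many, combined with external data such as weather forecasts before a recommendation is issued.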

The large facilities that produce electrical energy, more commonly known as power plants, are complex, high-volume, long-term investments. The last decades have shown that the mix of possible resources, both fossil and renewable (hydro, wind, solar), brings specific changes in planning and in the decision to invest in one or more sources, since the end consumer expects an optimal, predictable price. At the same time, for power plants, which do not shut down except for overhauls, the total cost of ownership (TCO) is estimated at no more than 30% initial investment and at least 70% maintenance costs. These figures show that an intelligently run maintenance process has an extremely high impact over the entire lifespan of a power plant. All of this has driven the search for IT solutions capable of monitoring, in real time or near real time, the relevant process parameters (the so-called process instrumentation) and of evaluating as accurately as possible the influence of the operating context on the relevant processes: combustion, vibrations, wind availability (for wind power), temperature or overload in the support installations designed for fuel transport. All these sources generate an immense volume of data, giving rise to a complex Big Data problem whose purpose is to provide decision support in the form of actionable insight for the maintenance of component parts. In this way, a Big Data problem gives rise to Smart Data.

A context for Big Data

To understand how such a problem is approached with Big Data, we can identify a number of generic steps (the terms are largely self-explanatory): identifying the relevant measurement points, placing the sensors, transmitting the measured parameters periodically or on request (operations typically performed by SCADA systems), data pruning, classification/clustering, data mining/pattern matching, model learning, model calibration, decision making/context analysis/diagnosis, and issuing a request for modification.
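To make the flow of these steps more concrete, here is a deliberately simplified, hypothetical sketch in Python; the function names, the toy thresholds and the statistics used are assumptions for illustration and do not correspond to any specific SCADA or analytics product.

```python
from statistics import fmean, pstdev

# Hypothetical, toy-scale walk through the generic steps listed above.
# Real systems ingest SCADA streams; a plain list stands in for them here.

def prune(samples):
    """Data pruning: drop missing or obviously invalid readings."""
    return [s for s in samples if s is not None and 0.0 <= s <= 200.0]

def cluster(samples, threshold):
    """Crude clustering into 'normal' and 'suspect' readings."""
    return ([s for s in samples if s <= threshold],
            [s for s in samples if s > threshold])

def learn_model(normal):
    """Model learning/calibration: mean and spread of healthy behaviour."""
    return fmean(normal), pstdev(normal)

def diagnose(sample, mean, spread, k=3.0):
    """Decision making/diagnosis: flag readings far outside the learned band."""
    return abs(sample - mean) > k * spread

# Toy usage: vibration readings with a sensor dropout and one outlier
raw = [1.1, 1.2, None, 1.15, 1.3, 9.8, 1.05]
clean = prune(raw)
normal, _ = cluster(clean, threshold=2.0)
mean, spread = learn_model(normal)
print([diagnose(s, mean, spread) for s in clean])  # only 9.8 is flagged
```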

To give a complete picture of such a problem: a remote diagnosis centre capable of managing several wind turbine parks receives measured data for 3,200 parameters per turbine, amounting to about 300 million measurements per week. While the databases holding the historical measurement record contained 97 terabytes in 2013, their volume had grown to 270 terabytes by 2015. At such a scale, it is obviously impossible for a human operator to spot useful trends quickly enough to justify a correct field intervention.
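A back-of-the-envelope calculation (using only the aggregate figure quoted above; the per-turbine sampling pattern is not given here) shows why manual inspection is hopeless:

```python
# Roughly 300 million measurements per week arriving at the centre
measurements_per_week = 300_000_000
seconds_per_week = 7 * 24 * 3600

rate = measurements_per_week / seconds_per_week
print(f"~{rate:.0f} measurements per second, sustained around the clock")
# -> roughly 500 measurements per second
```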

The large volume of detailed real-time data is the key to the success of such a system, because it enables the experts who build the diagnostic models to use Machine Learning algorithms and simulation models to determine, with high precision, what kind of fault they are dealing with and how maintenance activities need to be carried out. The final performance improves accordingly: the system allows about 80% of possible fault cases to be fixed remotely, within a minimal window in which the turbine is taken out of production. Nowadays, at most 15% of all cases require technicians to be physically present to solve the problem.
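The article does not name the concrete algorithms used, so the sketch below is only a generic illustration of the supervised fault-classification idea, written with scikit-learn and entirely invented features, labels and readings; it assumes labelled historical fault cases are available for training.

```python
from sklearn.ensemble import RandomForestClassifier

# Illustrative only: a tiny, invented training set of turbine snapshots.
# Features: [vibration_rms, bearing_temp_C, power_deviation_pct]
X_train = [
    [1.0, 65, 2],   # healthy
    [1.1, 66, 1],   # healthy
    [3.5, 90, 15],  # bearing fault
    [3.8, 92, 18],  # bearing fault
    [1.2, 67, 25],  # pitch/controller fault
    [1.0, 64, 22],  # pitch/controller fault
]
y_train = ["healthy", "healthy", "bearing", "bearing", "pitch", "pitch"]

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Classify a new snapshot streamed from the field
print(clf.predict([[3.6, 91, 16]]))  # -> ['bearing']
```

A real diagnostic model would of course be trained on years of labelled operating history and validated against simulation models, as the text suggests.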

The paradigm that changes us

Technology is, by all means, extremely useful, but its impact would be much lower without a priori knowledge of the way the monitored installations work. At the same time, Big Data Analytics allows the collected data to be analysed and correlated over long stretches of time, and enables new, analytics-ready products designed for an energy market capable of offering consumers trustworthy, ecological and stable prices.

Big Data has already changed, and will continue to profoundly change, the way IT is done, because of a completely different work paradigm that is much better adapted to reality, whether we are talking about markets, industrial processes or life sciences (in both pharma and diagnostics). Instead of staying anchored in databases or relatively inflexible processes, the most precious asset is raw data. By analysing raw data, the data miners of the future can discover correlations and changes to which cyber-physical systems, in combination with the human factor, can react more efficiently. Each pre-processing step relevant to the Big Data approach, such as data pruning, anomaly elimination and dimensionality reduction, prepares the other key steps: statistical analysis and the application of suitable learning algorithms to identify functional models for pattern matching. As a final remark, we can see how Big Data is profoundly changing the way tomorrow's engineers will tackle problems.
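As a minimal, hypothetical illustration of how these preprocessing steps can feed a learning step (the techniques, thresholds and data below are assumptions chosen for brevity, not the article's), one might chain anomaly elimination and dimensionality reduction before fitting a simple model:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Toy raw data: 200 snapshots of 10 correlated sensor channels (invented).
rng = np.random.default_rng(0)
base = rng.normal(size=(200, 2))
raw = base @ rng.normal(size=(2, 10)) + 0.1 * rng.normal(size=(200, 10))
raw[5] += 50  # inject one gross anomaly

# Anomaly elimination: drop rows far from the column-wise median.
deviation = np.abs(raw - np.median(raw, axis=0)).max(axis=1)
clean = raw[deviation < 20]

# Dimensionality reduction: keep the two dominant principal components.
reduced = PCA(n_components=2).fit_transform(clean)

# Learning step: group operating regimes with a simple clustering model.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(reduced)
print(clean.shape, reduced.shape, np.bincount(labels))
```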
