Matrica - The Future of Electricity Forecasting Part 1

We are entering the smart grid revolution, moving to a new paradigm of electricity forecasting and trading.

The new paradigm sees large, centralised generation and transmission replaced by interconnected local grids, each largely self-sufficient through local embedded, renewable generation, batteries, and load flexibility. Consumption and generation will be managed by artificial intelligence (AI) systems, driven by ‘big data’.

The new paradigm foresees greater consumer empowerment, enabling unbundling of boundary point meter volumes into separate consumption and generation data streams, each capable of being traded with different market participants. Indeed, major changes to wholesale electricity market rules are under way to reward consumer flexibility and encourage more active market participation by consumers.

The new paradigm is being enabled by advances in measurement and control technology, including AI. Although ‘big data’ is an over-used cliché in some areas, it has a real meaning in electricity supply, normally referring to the growing volumes of data from domestic ‘smart’ meters, weather feeds, and SCADA systems that enable the real-time forecasting, monitoring, trading, balancing, and control of consumption and generation points on the grids.

The move to smart grids and big data will have major implications for electricity load and renewables forecasting, and will have knock-on impacts into intra-day trading and balancing.

Energy Forecasting – the arrival of ‘big data’

Until recently, relatively few consumers had half-hourly metering. Domestic and small business customers were on standard profiles. Renewable generation capacity was tiny, and had an insignificant impact on system balancing. Large-scale battery storage was science fiction. Consumers were passive, at most choosing between incremental price differences offered by a few large suppliers where competition existed. Opportunities for demand-side participation and auto-generation were rare, and limited to larger industrial and commercial customers.

Load forecasting was relatively straightforward, and could be conducted on large spreadsheets or small relational databases, using established forecasting techniques, e.g. multivariate regression. Weather data updates were infrequent and restricted to a few variables, because profiling defined the ‘official’ weather variables and how they entered the regression coefficients. More accurate weather data for domestic forecasting was unnecessary.
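The classic approach can be sketched as a multivariate regression of load on weather and calendar variables. The data, variables, and coefficients below are illustrative inventions, not any real settlement profile:

```python
# Minimal sketch of old-style load forecasting: regress daily load on
# temperature and a weekend flag, then forecast. All figures are synthetic.
import numpy as np

rng = np.random.default_rng(42)

# One year of synthetic history: temperature (degrees C), weekend flag, load (MW)
temperature = rng.uniform(-5, 25, size=365)
is_weekend = (np.arange(365) % 7 >= 5).astype(float)
load = 50.0 - 1.2 * temperature + 5.0 * is_weekend + rng.normal(0, 2, 365)

# Design matrix with an intercept column; ordinary least squares fit
X = np.column_stack([np.ones(365), temperature, is_weekend])
coeffs, *_ = np.linalg.lstsq(X, load, rcond=None)

# Forecast load for a cold (-2 degrees C) weekday
forecast = coeffs @ np.array([1.0, -2.0, 0.0])
print(round(forecast, 1))
```

On a spreadsheet the same fit would be a `LINEST` call; the point is that the whole model fits comfortably in memory, which is exactly what stops being true under ‘big data’.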

Integration of forecasting with trading systems was limited, with updates sent to pricing and trading departments only a few times per day during the busy supply contract renewal periods, and less frequently at quieter times. It was easy to become complacent.

However, the electricity industry always recognised that domestic and small commercial load would one day move to interval metering, bringing hugely increased data volumes and the need to deploy more sophisticated forecasting and telecommunication systems. The implementation of smart meters means that this vision has finally arrived – as has the need to manage the corresponding increases in data.

With the rise of renewables, weather forecasting became more important, as wind and solar radiation variables became a major component of the intermittent generation forecasting mix.

Weather data is going through a big data upheaval of its own, including spatial resolution, additional variables, and update frequency, which must be addressed in forecasts:

  • Spatial resolution has improved to one reading per 90 square metres, so the impact of local topography is important;
  • ‘Height’ is now a major wind variable, because the variation in windspeed at different heights is significant for turbines of different sizes;
  • Solar panel ‘orientation’ has become important because, at 90 square metres spatial resolution, variable shade from vegetation and buildings impacts output through the day;
  • Frequency of weather data updates has risen dramatically, requiring more frequent refreshes of weather-sensitive forecasts.

The widening range of weather variables calls for new weather-based forecasting models, because some load and generation may be sensitive to previously unsuspected combinations of existing and new weather variables. Identifying hidden patterns across all possible combinations is an R&D project in its own right, and is likely to involve AI.

From a forecasting perspective, ‘big data’ on energy supply means we have more data, updated more frequently, driving more sophisticated models, all striving toward satisfying the voracious appetite of the intraday trading and balancing systems.

What is certain is that ‘big data’ will not be uninvented.

What is the threshold for ‘Big Data’ solutions?

Although the term ‘big data’ is a nebulous, overused cliché in some areas, Matrica believes it has real meaning in electricity forecasting, so this question about thresholds is serious and cannot be ignored.

To Matrica, ‘big data’ in electricity forecasting refers to the large increase in the amount, granularity, and frequency of data about a specific point on the grid, be it load or generation. Taking the domestic consumer as an example, in the ‘olden days’ we had quarterly and estimated meter readings applied to profiles, using some simple arithmetic around regression coefficients to convert a few readings a year into half-hourly values for trading and settlement purposes. Now we have domestic smart meters which measure half-hourly and can be read daily. Taken to the extreme, smart meters enable real-time streaming of the load and generation points to which they are attached, giving forecasters unprecedented detail on the behaviour of the entities under forecast.
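The ‘simple arithmetic’ of profiling can be sketched as spreading an annual consumption estimate across half-hourly periods using profile coefficients. The coefficients below are invented for illustration; real settlement profiles are published per profile class, day type, and season:

```python
# Illustrative profiling sketch: convert an annual consumption estimate
# (derived from a handful of quarterly reads) into half-hourly values.
annual_kwh = 3100.0  # hypothetical estimated annual consumption

# Toy daily shape for 48 half-hours: low overnight, higher in the day,
# an evening peak. Each coefficient becomes the fraction of *annual*
# consumption attributed to that half-hour.
daily_shape = [0.5] * 14 + [1.0] * 20 + [2.0] * 14
total_weight = sum(daily_shape) * 365
coefficients = [w / total_weight for w in daily_shape]

half_hourly_kwh = [annual_kwh * c for c in coefficients]

# Sanity check: the half-hourly values re-aggregate to the annual estimate
print(round(sum(half_hourly_kwh) * 365, 1))  # 3100.0
```

The contrast with smart metering is that these 48 values were a deemed shape applied to every similar customer, whereas a smart meter records 48 actual measurements per customer per day.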

Matrica believes that the evolution of energy forecasting from the 1990s to ‘big data’ has resulted in a five orders of magnitude increase in data quantities, assuming a move from domestic profiling to multi-supplier smart meters, and from half-hourly thermal generation to 5-minute wind turbines of variable height, and solar panels under variable shade.
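A back-of-envelope check makes the scale of that increase concrete. All figures below are illustrative assumptions, not Matrica's actual analysis:

```python
# Rough arithmetic behind the data-growth claim (illustrative figures only).
quarterly_reads = 4            # reads per profiled meter per year
half_hourly_reads = 48 * 365   # reads per smart meter per year = 17,520

per_meter_growth = half_hourly_reads / quarterly_reads
print(f"per-meter growth: {per_meter_growth:,.0f}x")  # 4,380x

# Generation side: half-hourly thermal vs 5-minute wind readings
frequency_growth = 12 / 2      # 12 five-minute vs 2 half-hourly slots per hour
print(f"generation frequency growth: {frequency_growth:.0f}x")  # 6x
```

The per-meter factor alone is nearly four orders of magnitude; multiplying in the generation-side frequency increase, extra variables (height bands, shading), and the far larger population of metered points takes the combined growth plausibly to five orders of magnitude or more.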

Forecasting and data managers often ask us, “At what point does energy forecasting enter the realms of ‘big data’?”

Generally, a relational database starts to struggle badly if it has to process 500 million active records. A vertically integrated company offering wind-generated supply to smart meter customers can easily generate millions of active rows per day, almost all required for forecasting purposes. Furthermore, the rate at which these new rows are created is much faster now than before, triggering more frequent re-runs of the forecasting models. Thus the requirement to support processing of ‘big data’ can arrive very quickly where extensive renewable generation and domestic smart meters are deployed, more so if forecasting is to be integrated directly into intra-day trading and balancing.
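How quickly that 500-million-row threshold arrives can be sketched with some simple capacity arithmetic. The fleet sizes below are hypothetical, chosen only to show the order of magnitude:

```python
# Hypothetical vertically integrated supplier: how long until the
# forecasting database holds ~500 million active rows?
smart_meters = 1_000_000
meter_rows_per_day = smart_meters * 48            # half-hourly reads

wind_turbines = 2_000
turbine_rows_per_day = wind_turbines * 24 * 12    # 5-minute SCADA readings

daily_rows = meter_rows_per_day + turbine_rows_per_day
days_to_threshold = 500_000_000 / daily_rows

print(f"{daily_rows:,} rows/day -> threshold in ~{days_to_threshold:.0f} days")
```

On these assumptions the database crosses 500 million rows in under a fortnight, before any weather data, derived forecasts, or trading positions are stored alongside the raw readings.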

So the simple answer to the question is, “When your forecasting relational database exceeds 500 million active rows”.

Large companies with heavy reliance on forecasting will need to review their system architectures, especially if they expect to move into intra-day or algorithmic trading, balancing, and optimisation. Simply adding more hardware to operate billion-row relational databases is not a viable long-term solution.

‘Big data’ is just one element of the new paradigm. In the second Blog, we’ll consider the forthcoming changes to electricity market rules, specifically the demise of the ‘Supplier Hub’ Model, and the rise of the ‘prosumer’.

If you have any questions arising from this Blog, or would like to know more about how Matrica can help prepare your forecasting for ‘big data’ and the new smart grid, please contact Dr M F Earthey mark.earthey@matrica.co.uk

By Dr M F Earthey
September 26, 2019