Building the connected utility of the future

Today’s utilities bear little resemblance to the utility companies of decades ago.

For most of the last century, electric utility operations have been built around an architecture of one-way power flow from generation to customers, under fairly predictable conditions. Today, however, the environment in which utilities operate is very different.

Climate change is contributing to wildfires and extreme temperatures, forcing many utility companies to operate under conditions they never anticipated.1 At the same time, the integration of renewable generation, storage and other DERs is driving a transition to a DSO model in which the grid is more distributed and electricity flows bi-directionally between utilities and consumers. Finally, through consolidations and mergers, utilities have become more organizationally complex, adding yet another obstacle to responding dynamically to these new conditions.

Meeting the magnitude of these challenges requires an unprecedented level of connectivity between distributed organizations. To harden systems, predict when equipment will fail, and communicate thoughtfully with tens of thousands of customers during emergency events, utilities need an enterprise-wide, data-driven operating system that spans operators, analysts and executives.

Bridging the gap between analytics and operations

Utilities today have access to a rich set of data, but they may not be making the most of it. With the right systems in place, utilities can gather real-time signals about grid operations from smart meters, leverage LiDAR for highly accurate GIS records, and deploy drones to remotely inspect asset conditions. To harness this data, many utilities are investing in data science teams and new, out-of-the-box ML/AI toolkits. While these efforts frequently create impressive data assets and promising models, many prototypes never make it to production and fail to deliver the promised operational improvements.

For example, imagine a utility that has started a pilot project deploying new sensors to help manage DER integration. The data science team has leveraged industry-standard data science tools to detect power quality faults and recommend where maintenance inspections should start looking for the source of a problem.

To get the pilot off the ground, the data science team’s business counterparts export spreadsheets of model outputs for maintenance engineers every day.

Engineers then review the recommendations alongside context from other applications – a time-consuming and error-prone process – before committing their decisions to a legacy planning system. As part of the review, engineers cross-reference model results with various maintenance schedules, often performing ad hoc analysis based on their real-world experience. Additionally, outdated historical assumptions are often built into the models, requiring engineers to manually correct the same discrepancies each time the data science team delivers a new set of results.

For the data science team, model development is also a tedious process involving manual data cleaning and feature engineering. Data scientists struggle to run the model across all feeders and to reuse models from other analytics projects. More importantly, they don’t get timely feedback from their consumers. Without frequent updates on which recommendations were accurate and which were not, it is difficult to improve the models iteratively and keep them accurate over time.
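The missing feedback loop described above can be made concrete with a minimal sketch: join each model recommendation with the inspection outcome engineers later recorded, then measure precision so the data science team can see which predictions held up. All names here (`Recommendation`, `precision_of`, the feeder IDs) are hypothetical illustrations, not any actual utility or Foundry API.

```python
# Hypothetical sketch of a recommendation-outcome feedback loop.
from dataclasses import dataclass

@dataclass
class Recommendation:
    feeder_id: str
    predicted_fault: bool  # model flagged this feeder for inspection

def precision_of(recs, outcomes):
    """Fraction of fault predictions that field inspections confirmed."""
    flagged = [r for r in recs if r.predicted_fault]
    if not flagged:
        return None  # no positive predictions to evaluate
    confirmed = sum(1 for r in flagged if outcomes.get(r.feeder_id, False))
    return confirmed / len(flagged)

recs = [
    Recommendation("feeder-01", True),
    Recommendation("feeder-02", True),
    Recommendation("feeder-03", False),
]
# Field-verified results reported back by maintenance engineers.
outcomes = {"feeder-01": True, "feeder-02": False}

print(precision_of(recs, outcomes))  # 0.5
```

Even a simple metric like this, recomputed whenever engineers log inspection results, gives data scientists the timely signal they need to retrain and refine their models.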

With so many distributed information sources and the messy process of combining human judgment with data, utilities can find it difficult to get the most out of their investments in new data sources and analytical capabilities. Many utility companies face technical and organizational barriers that prevent their data, analytics, engineering, and operations teams from iterating closely together.

Why most utilities’ IT toolkits are not enough

For decades, utilities have relied on foundational software systems that are critical to power infrastructure but also very burdensome. Purpose-built on-premises systems such as DMS, SCADA and GIS are essential for managing network infrastructure data, but they are specialized for a narrow set of operational data and processes, making them of limited use to utilities grappling with an ever-changing set of challenges.

Newer iterations of these systems, like ADMS and DERMS, promise to help utilities manage a more distributed, DER-enabled network. However, they are still too specialized to address the larger issues facing utilities. These systems typically don’t integrate external data, incorporate custom models, or help utility companies create new workflows. Utilities will need these capabilities and more to adapt their operations in the face of bi-directional power flow through distributed energy resources, more complex weather events and other 21st-century challenges.

Apart from these systems, utilities have followed the lead of other industries by investing in new cloud systems such as data warehouses and analytics engines. These are capable of collecting data from different sources, but they are removed from the core operations that made legacy systems, such as DMS and EMS, so valuable. Cloud-based data warehouses can encapsulate a complex reality, but they don’t offer levers to act on or respond to new threats. They don’t go the last mile of putting analytics in the hands of field teams and network operators so they can make better decisions day in and day out. For example, even after years of investing in cloud modernization, when faced with an emergency operation or an urgent new work program, many utilities fall back on spreadsheets, email and file shares, which quickly become obsolete and leave no trace of where the information came from. Too often, new cloud tools don’t actually improve business outcomes when they’re needed most.

Creating a common operating environment

In the face of massive disruption, analytics alone are not enough: utilities need a platform that can help them power complex operations by combining the best of data and human insight. This platform should break down data silos and ultimately enable a feedback loop between data teams, analysts, engineering and planning teams, and network operators.

Pacific Gas and Electric (PG&E) uses Palantir Foundry to overlay different sets of information, such as real-time grid conditions and risk models. The utility can also perform preventive maintenance by developing models to predict the condition of equipment, and by analyzing data from smart meters to understand where to prioritize repairs. PG&E credits Foundry with helping the company manage the risk of wildfires caused by electrical equipment, keeping Californians safe.

Palantir has partnered with numerous utilities to deploy Foundry across the entire value chain, from emergency operations to predictive maintenance to electric vehicle planning. To learn more about how Foundry can enhance your utility’s decision-making, check out our website and meet us at DISTRIBUTECH in Dallas, Texas, May 23-25.


1) Jones, Matthew W., Adam Smith, Richard Betts, Josep G. Canadell, I. Colin Prentice, and Corinne Le Quéré. “Climate change increases the risk of wildfires.” ScienceBrief Review 116 (2020): 117. https://www.preventionweb.net/files/73797_wildfiresbriefingnote.pdf
