enOPTIMAL: Energy, Optimization and Learning

Virtual seminar series at the interface of energy, optimization and machine learning research


The enOPTIMAL seminar series brings together scholars and professionals to discuss the latest developments in the theory and applications of optimization and machine learning in modern energy systems. We invite speakers from across the globe to give a 40-minute talk on their most recent research, followed by a 20-minute Q&A session. Please check the links below to see past and future seminars and to subscribe to seminar updates.

 

Organized by Vladimir Dvorkin (MITEI & MIT LIDS), Stefanos Delikaraoglou (MIT LIDS) and Audun Botterud (MIT LIDS)

Logistics

Join via Zoom: https://mit.zoom.us/j/97460304229

Watch livestream on YouTube

Fridays: 8am PT, 11am ET, 5pm CET

Fill in the Google Form to stay updated

Add to Google Calendar

Watch previous talks

Fall 2021 Seminars

Emiliano Dall’Anese

University of Colorado Boulder

October 1

Maryam Kamgarpour

Swiss Federal Institute of Technology Lausanne

October 15

Liyang Han

Technical University of Denmark

October 29

Victor Zavala

University of Wisconsin–Madison

Updated: Monday, November 1

Ana Radovanovic

Google Research      

November 12

Hao Zhu

University of Texas at Austin

November 19

Hélène Le Cadre

VITO/EnergyVille     

December 3

Martin Schmidt

Trier University

December 17

Eilyan Bitar

Cornell University     

December (TBA)

Upcoming Seminars

Maryam Kamgarpour

Laboratoire d’Automatique

Swiss Federal Institute of Technology Lausanne

Learning in Multi-Agent Systems with Applications to Electricity Markets

October 15, 2021      

A rising challenge in the control of large-scale systems such as energy and transportation networks is to address the autonomous decision making of interacting agents. In this setting, a Nash equilibrium is a stable outcome in the sense that no agent finds it profitable to unilaterally deviate from her decision. Due to geographic distance, privacy concerns or simply the scale of these systems, each agent can only base her decision on local measurements. Hence, a fundamental question is: do agents learn to play a Nash equilibrium strategy based only on local information? I will motivate this question with problems arising in energy markets and discuss my work on addressing it for two classes of games, those with continuous and those with finite action spaces. I will present our algorithms for learning Nash equilibria in these games, discuss their convergence and show their applications in energy markets. This is joint work with my colleague T. Tatarenko and my former and current doctoral students O. Karaca and P. G. Sessa.
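
As a self-contained toy illustration of learning a Nash equilibrium from local information only (our own Python sketch with a made-up quadratic game, not the algorithms presented in the talk), each agent repeatedly updates its decision using nothing but its own partial gradient:

import numpy as np

# Hypothetical two-player quadratic game (illustrative only):
# agent i minimizes J_i(x_i, x_-i) = 0.5*a_i*x_i^2 + b_i*x_i + c*x_i*x_-i
a, b, c = np.array([2.0, 3.0]), np.array([1.0, -2.0]), 0.5

# Analytic Nash equilibrium solves a_i*x_i + b_i + c*x_-i = 0 for both agents
A = np.array([[a[0], c], [c, a[1]]])
x_star = np.linalg.solve(A, -b)

# Gradient play: each agent only uses its own (local) partial gradient
x = np.zeros(2)
step = 0.1
for _ in range(500):
    local_grad = a * x + b + c * x[::-1]   # one entry per agent
    x = x - step * local_grad

print("learned decisions:", x, "  Nash equilibrium:", x_star)

For this strongly monotone example the iterates converge to the analytic equilibrium; the talk considers far more general continuous and finite-action games and the information needed for convergence.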

Biography

Maryam Kamgarpour holds a Doctor of Philosophy in Engineering from the University of California, Berkeley and a Bachelor of Applied Science from the University of Waterloo, Canada. Her research is on safe decision-making and control under uncertainty, game theory and mechanism design, and mixed-integer and stochastic optimization and control. Her theoretical research is motivated by control challenges arising in intelligent transportation networks, robotics, power grid systems, financial markets and healthcare. She is the recipient of the NASA High Potential Individual Award, the NASA Excellence in Publication Award, the European Research Council (ERC) Starting Grant and the NSERC Discovery Accelerator Grant. She was with the faculty of Electrical Engineering & Information Technology at ETH Zurich from 2016 to 2019, with the faculty of Electrical and Computer Engineering at UBC from 2020 to 2021, and seems to have finally found her home in the faculty of Mechanical Engineering at EPFL, which she joined in October 2021.

Liyang Han

Department of Electrical Engineering

Technical University of Denmark

A Data Market under a Lasso Regression Framework

October 29, 2021      

As the complexity and uncertainty of modern energy systems grow, so does the need for high-quality data to improve system and market operations. Traditionally, data in energy operations has largely been regarded as a free and highly accessible commodity, which stands in significant contrast to rising concerns about data privacy. Therefore, recent academic research has been investigating various forms of data markets to incentivize data exchange.

In this talk, I will present our recent work on a specific type of data market that is based on linear regression frameworks. The chosen use case is wind power forecasting, in which wind agents can improve their forecasts by gaining access to measurement data from neighboring wind agents. Based on how data payments are determined, I will introduce two market clearing mechanisms: 1) a cooperative game theoretic profit allocation, and 2) a lasso (least absolute shrinkage and selection operator) payment scheme. A comparison of the two payment mechanisms will show the differences in the market properties they preserve and in the incentives they provide for data sellers and data buyers.

Joint work with Pierre Pinson and Jalal Kazempour.
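
As a minimal illustration of the regression framework behind such a market (an assumption-laden sketch with synthetic data and a made-up payment rule, not the clearing mechanisms of the talk), a buyer can fit a lasso model on neighboring agents' offered measurements and let the fitted coefficients indicate whose data actually improves the forecast:

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Synthetic example: the buyer's wind power depends on sellers 0 and 2 only
X = rng.normal(size=(200, 4))            # measurements offered by four sellers
y = 0.8 * X[:, 0] + 0.5 * X[:, 2] + 0.1 * rng.normal(size=200)

model = Lasso(alpha=0.05).fit(X, y)

# Sellers whose data is irrelevant receive (near-)zero coefficients thanks to
# lasso shrinkage; a simple, hypothetical rule pays in proportion to |coefficient|.
weights = np.abs(model.coef_)
payment_shares = weights / weights.sum()
print("lasso coefficients:", model.coef_)
print("illustrative payment shares:", payment_shares)

The mechanisms compared in the talk (the cooperative profit allocation versus the lasso payment scheme) differ precisely in how such payments are derived and which market properties they preserve.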

Biography

Liyang Han is a postdoctoral researcher in the Energy Analytics and Markets group at the Technical University of Denmark. His current research focuses on the economics of data, with applications in renewable energy forecasting and other energy problems. Prior to coming to DTU, he received his PhD from the University of Oxford with a thesis on the application of cooperative game theoretic frameworks in prosumer-centric energy markets. He is interested in expanding his research on data markets to broader applications, with metrics to measure the feasibility and fairness of such markets.

Victor M. Zavala

Chemical and Biological Engineering

University of Wisconsin–Madison

Remunerating Space-Time, Load-Shifting Flexibility from Data Centers in Electricity Markets

Updated: Monday, November 1, 2021

This talk discusses an electricity market clearing formulation that seeks to remunerate spatio-temporal, load-shifting flexibility provided by data centers (DaCes). Load-shifting flexibility is a key asset for power grid operators as they aim to integrate larger amounts of intermittent renewable power and to decarbonize the grid. Central to our study is the concept of virtual links, which provide non-physical pathways that can be used by DaCes to shift power loads (by shifting computing loads) across space and time. We use virtual links to show that the clearing formulation treats DaCes as prosumers that simultaneously request load and provide a load-shifting flexibility service. Our analysis also reveals that DaCes are remunerated for the provision of load-shifting flexibility based on nodal price differences (across space and time). We also show that DaCe flexibility helps relieve space-time price volatility and show that the clearing formulation satisfies fundamental economic properties that are expected from coordinated markets (e.g., provides a competitive equilibrium and achieves revenue adequacy and cost recovery). The concepts presented are applicable to other key market players that can offer space-time shifting flexibility such as distributed manufacturing facilities and storage systems. Case studies are presented to demonstrate these properties.
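
To make the virtual-link idea concrete, here is a minimal sketch in our own hypothetical notation, not the exact formulation of the talk. Let \delta_{(i,t)\to(j,t')} \ge 0 denote computing load that a DaCe shifts from node i at time t to node j at time t'; the cleared DaCe load and the flexibility remuneration then take the form

d_{i,t} \;=\; \bar d_{i,t} \;-\; \sum_{(j,t')} \delta_{(i,t)\to(j,t')} \;+\; \sum_{(j,t')} \delta_{(j,t')\to(i,t)}, \qquad \sum_{i,t} d_{i,t} \;=\; \sum_{i,t} \bar d_{i,t},

\text{flexibility payment} \;\propto\; \big(\pi_{i,t} - \pi_{j,t'}\big)\,\delta_{(i,t)\to(j,t')},

where \bar d_{i,t} is the baseline computing load and \pi_{i,t} are the nodal prices produced by the market clearing, so remuneration is driven by space-time price differences as stated above.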

Biography

Victor M. Zavala is the Baldovin-DaPra Professor in the Department of Chemical and Biological Engineering at the University of Wisconsin-Madison and a computational mathematician in the Mathematics and Computer Science Division at Argonne National Laboratory. He holds a B.Sc. degree from Universidad Iberoamericana and a Ph.D. degree from Carnegie Mellon University, both in chemical engineering. He is on the editorial board of the Journal of Process Control, Mathematical Programming Computation, and Computers & Chemical Engineering. He is a recipient of NSF and DOE Early Career awards and of the Presidential Early Career Award for Scientists and Engineers (PECASE). His research interests include statistics, control, and optimization and applications to energy and environmental systems.

Ana Radovanovic

Carbon and Efficiency Driven Computing

Google Research

Carbon Aware Computing at Google

November 12, 2021      

Biography

Ana’s research focuses on fundamental principles of the operation, design and control of systems in which uncertainty is an inherent property and an important assumption in the analysis and design. She is mainly interested in developing approaches that lead to explicit and insightful results, which highlight business tradeoffs and provide general design guidelines. Ana has developed models, analysis and optimization approaches in different areas of technology: web caching, stochastic service networks, revenue management and pricing in capacitated systems with reusable resources, job scheduling algorithms in large computer centers, and online advertising.

Hao Zhu

Electrical & Computer Engineering

University of Texas at Austin

Physics-Aware and Risk-Aware Machine Learning for Power System Operations

November 19, 2021      

Recent years have witnessed rapid efforts to adapt contemporary advances in machine learning (ML) and data science to aid the transition of energy systems into a truly sustainable, resilient, and distributed infrastructure. A blind application of the latest-and-greatest ML algorithms to stylized grid operation problems, however, may fail to recognize the underlying physics models or safety constraint requirements. This talk will introduce two examples of integrating physics-aware and risk-aware ML advances into efficient and reliable grid operations. First, we develop a topology-aware approach using graph neural networks (GNNs) to predict the prices and line congestion that are the outputs of the real-time AC optimal power flow problem. Building upon the underlying relation between prices and topology, the proposed solution significantly reduces the model complexity of existing end-to-end ML methods while efficiently adapting to varying grid topology. Second, we put forth a risk-aware ML method that provides safety guarantees for data-driven, scalable reactive power dispatch policies in distribution grids. An adaptive sampling method is leveraged to improve the gradient computation of the conditional value-at-risk (CVaR) and attain guaranteed voltage violation performance. Our proposed algorithms have demonstrated the advantages of incorporating power system modeling into the design of learning-enabled power system operations.
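
As a toy illustration of what "topology-aware" means here (our own Python sketch with made-up data and a plain graph filter, not the GNN architecture of the talk), nodal features are propagated over the network adjacency before a learned read-out, so the same weights can be reused when the topology changes:

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 5-bus system: the adjacency matrix encodes which buses are connected
A = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(5)                         # add self-loops
S = np.diag(1.0 / A_hat.sum(axis=1)) @ A_hat  # normalized graph-shift operator

x = rng.normal(size=(5, 3))                   # nodal features (e.g., loads, limits)
W1, W2 = rng.normal(size=(3, 8)), rng.normal(size=(8, 1))

# One graph-convolution layer plus a read-out: the weights W1, W2 do not depend
# on the topology, so a new S (after a switching action) can be plugged in
# without changing the model size.
h = np.tanh(S @ x @ W1)
price_prediction = S @ h @ W2
print(price_prediction.ravel())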

Biography

Hao Zhu is currently an Assistant Professor of Electrical and Computer Engineering (ECE) at The University of Texas at Austin. She received the B.S. degree from Tsinghua University in 2006, and the M.Sc. and Ph.D. degrees from the University of Minnesota in 2009 and 2012, respectively, all in electrical engineering. From 2012 to 2017, she was a Postdoctoral Research Associate and then an Assistant Professor of ECE at the University of Illinois at Urbana-Champaign. Her research focuses on developing innovative algorithmic solutions for learning and optimization problems in future energy systems. Her current research interests include physics-aware and risk-aware machine learning for power systems, and the design of energy management systems under cyber-physical coupling. Dr. Zhu is a recipient of the NSF CAREER Award (2017), an invited attendee of the US NAE Frontiers of Engineering Symposium (2020), and the faculty advisor for three Best Student Papers awarded at the North American Power Symposium. She is currently a member of the IEEE Power & Energy Society (PES) Long Range Planning (LRP) Committee and an Associate Editor for the IEEE Transactions on Smart Grid.

Hélène Le Cadre

VITO/EnergyVille

Network Games Equilibrium Computation: Duality Extension and Privacy

December 3, 2021      

Biography

Hélène holds a Magistère in mathematics from Ecole Normale Supérieure (ENS) Rennes and Rennes 1 University, France. She graduated from Institut Mines-Télécom (IMT) Atlantique with a speciality in information science and control theory, and received a Ph.D. degree in applied mathematics from Rennes 1 University. Hélène worked as a research fellow and assistant professor at Ecole des Mines de Paris in Sophia-Antipolis, France, and was a visiting researcher at the Center for Operations Research and Econometrics at the Université catholique de Louvain in Belgium. She is currently working as a researcher at the VITO/EnergyVille research center in Belgium. Her research interests revolve around (algorithmic) game theory, optimization, machine learning and forecasting, with applications in energy, telecommunication networks and economics. She has been involved in European H2020 projects (Magnitude, SmartNet, Elvire, Etics), as well as in fundamental research projects from PGMO, a French research program dedicated to operations research, optimization, and their links with data science, co-funded by EDF R&D, Thalès and Orange, and from the Belgian fundamental research funding agency (FWO).

Martin Schmidt

Department of Mathematics

Trier University

Multilevel Mixed-Integer Nonlinear Optimization for Electricity Market Design: Motivation, Models, Solution Techniques, and Results

December 17, 2021      

We consider the combination of a network design and a graph partitioning model in a multilevel framework for determining the optimal network expansion and the optimal zonal configuration of zonal-pricing electricity markets. The two classical discrete optimization problems of network design and graph partitioning, together with nonlinearities due to economic modeling, yield extremely challenging mixed-integer nonlinear multilevel models for which we develop two problem-tailored solution techniques. The first approach relies on an equivalent bilevel formulation and a standard KKT transformation thereof, including novel primal-dual bound tightening techniques, whereas the second is a tailored Benders-like decomposition approach. We prove that both methods yield globally optimal solutions. Afterward, we compare the approaches in a numerical study and show that the tailored Benders approach clearly outperforms the standard KKT transformation. Finally, we present a case study that illustrates the economic effects captured in our model and discuss some recent results for the German electricity sector.
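
For readers less familiar with the single-level reformulation behind the first approach, the generic pattern is sketched below in our own notation (a textbook device, not the specific market-design model of the talk): when the lower-level problem is convex and satisfies a constraint qualification, it can be replaced by its KKT conditions,

\min_{x,\,y}\; F(x,y) \quad \text{s.t.} \quad y \in \arg\min_{y'} \big\{\, f(x,y') : g(x,y') \le 0 \,\big\}

\;\Longrightarrow\; \min_{x,\,y,\,\lambda}\; F(x,y) \quad \text{s.t.} \quad \nabla_y f(x,y) + \nabla_y g(x,y)^\top \lambda = 0, \quad g(x,y) \le 0, \quad \lambda \ge 0, \quad \lambda^\top g(x,y) = 0,

where the complementarity condition is typically linearized with big-M constraints; this is where tighter primal-dual bounds pay off, since they translate into smaller big-M values.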

Biography

Martin Schmidt

Eilyan Bitar

School of Electrical and Computer Engineering

Cornell University

TBA

December 2021 (date TBA)

Biography

Eilyan Bitar is an Associate Professor in the School of Electrical and Computer Engineering at Cornell University. His current research is focused on the design and analysis of robust and stochastic optimization algorithms, focusing primarily on problems that entail sequential decision-making in uncertain and adversarial environments. Over the course of his career, he has explored a variety of application domains spanning electricity markets, transportation systems, and e-commerce. He received his BS (2006) and PhD (2011) from UC Berkeley (advisor: Kameshwar Poolla). Prior to joining Cornell, he spent one year as a Postdoctoral Fellow at the California Institute of Technology (advisor: Steven Low) and UC Berkeley (advisor: Kameshwar Poolla). He is a recipient of the NSF Faculty Early Career Development Award (CAREER), the David D. Croll Sesquicentennial Faculty Fellowship, the John and Janet McMurtry Fellowship, the John G. Maurer Fellowship, and the Robert F. Steidel Jr. Fellowship.

Previous Seminars

Vassilis Kekatos

Department of Electrical and Computer Engineering

Virginia Tech

Physics-Aware Deep Learning for Optimal Power Flow

April 16, 2021       Video

Distribution grids are currently challenged by the rampant integration of distributed energy resources (DER). Scheduling DERs via an optimal power flow (OPF) problem in real time and at scale under uncertainty is a formidable task. Prompted by the success of deep neural networks (DNNs) in other fields, this talk presents two learning-based schemes for near-optimal DER operation. The first solution engages a DNN to predict the solutions of an OPF given the anticipated demands and renewable generation. Different from the generic learning setup, the training dataset here (namely past OPF solutions) features rich yet largely unexploited structure. To leverage prior information, we train a DNN to match not only the OPF solutions, but also their partial derivatives with respect to the input parameters of the OPF. Sensitivity analyses for non-convex and convexified renditions of the OPF show how such derivatives can be readily computed from the OPF solutions. The proposed sensitivity-informed DNN achieves improved sample efficiency at a modest increase in computation. Nonetheless, this two-stage OPF-then-learn approach may not be suitable for DER operation when the OPF input parameters are changing rapidly. To deal with such setups, we put forth an alternative OPF-and-learn scheme. Here the DNN is not trained to mimic the OPF minimizers but is rather organically integrated into a stochastic OPF formulation. A key advantage of this second DNN is that it can be driven by partial OPF inputs or by proxies, namely the measurements the utility can collect from the grid in real time.
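
As a bare-bones sketch of the sensitivity-informed training idea (our own illustration with made-up dimensions and random stand-in data, not the code behind the talk), the loss penalizes mismatch in both the predicted OPF solutions and their Jacobians with respect to the OPF inputs:

import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical sizes: 10 OPF inputs (loads/renewables), 4 outputs (DER setpoints)
model = nn.Sequential(nn.Linear(10, 64), nn.Tanh(), nn.Linear(64, 4))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Stand-in training data: in the actual scheme these come from solved OPF
# instances together with their sensitivities (d solution / d input).
X = torch.randn(32, 10)
Y = torch.randn(32, 4)
J = torch.randn(32, 4, 10)

for epoch in range(5):
    value_loss = ((model(X) - Y) ** 2).mean()
    jac_loss = 0.0
    for x_i, J_i in zip(X, J):  # per-sample Jacobian of the DNN output
        J_pred = torch.autograd.functional.jacobian(model, x_i, create_graph=True)
        jac_loss = jac_loss + ((J_pred - J_i) ** 2).mean()
    loss = value_loss + 0.1 * jac_loss / len(X)
    opt.zero_grad()
    loss.backward()
    opt.step()
    print(epoch, float(loss))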

Biography

Vassilis Kekatos is an Assistant Professor with the power systems group in the Bradley Dept. of ECE at Virginia Tech. He obtained his Ph.D. from the Univ. of Patras, Greece in 2007. He is a recipient of the US National Science Foundation CAREER Award in 2018 and the Marie Curie Fellowship during 2009-2012. He has been a postdoctoral research associate with the ECE Dept. at the Univ. of Minnesota, and a visiting researcher with the Univ. of Texas at Austin and the Ohio State Univ. His current research focus is on optimization and learning for future energy systems. He is currently serving on the editorial board of the IEEE Trans. on Smart Grid.

Kyri Baker

Civil, Environmental, and Architectural Engineering Department

University of Colorado Boulder

Emulating AC OPF Solvers for Obtaining Sub-second Feasible, Near-Optimal Solutions

April 30, 2021       Video

Optimization of electric power grids is a challenging, large-scale, non-convex problem. In order to optimize assets across these networks on fast operational timescales, the problem is typically simplified using linear models or other heuristics, resulting in increased cost of operation and potentially decreased reliability. Much work has been done on improving these models through convexifications and other approximations, but here we take an alternate approach. We use the abundance of data from grid operators, or data generated offline, to train machine learning models that can calculate optimal grid setpoints without even solving an optimization problem. Through an interesting combination of a neural network and a post-processing procedure, feasible, near-optimal solutions to the AC optimal power flow problem can be obtained in milliseconds.
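
A deliberately simplified sketch of the predict-then-post-process pattern (illustrative only: the "model", limits, and data below are made up, and the actual procedure recovers the remaining variables through a power-flow solve rather than simple clipping):

import numpy as np

rng = np.random.default_rng(2)

# Stand-in for a trained network mapping nodal loads to generator setpoints;
# a random linear map is used here purely as a placeholder.
W = rng.normal(size=(6, 3))
def predict_setpoints(load):
    return load @ W

p_min, p_max = np.zeros(3), np.array([1.0, 2.0, 1.5])   # generator limits

load = rng.uniform(0.2, 1.0, size=(1, 6))
raw = predict_setpoints(load)

# Post-processing step 1: project the raw prediction onto the box constraints.
p_projected = np.clip(raw, p_min, p_max)

# Post-processing step 2 (not shown): fix these setpoints and solve an AC power
# flow to recover voltages and the slack injection, which is what restores full
# AC feasibility in the approach described in the talk.
print("raw prediction:   ", raw.ravel())
print("after projection: ", p_projected.ravel())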

Biography

Dr. Kyri Baker received her B.S., M.S., and Ph.D. in Electrical and Computer Engineering from Carnegie Mellon University in 2009, 2010, and 2014, respectively. From 2015 to 2017, she worked at the National Renewable Energy Laboratory. Since Fall 2017, she has been an Assistant Professor at the University of Colorado Boulder, and is a Fellow of the Renewable and Sustainable Energy Institute (RASEI). She received the NSF CAREER award in 2021. She develops computationally efficient optimization and learning algorithms for energy systems ranging from building-level assets to transmission grids.

Subhonmesh Bose

Department of Electrical and Computer Engineering

University of Illinois at Urbana-Champaign

Risk-Sensitive Market Design for the Power Grid

May 14, 2021       Video

Integration of variable renewable and distributed energy resources in the grid makes demand and supply conditions uncertain. In this talk, we explore customized algorithms to tackle risk-sensitive electricity market clearing problems, where power delivery risk is modeled via the conditional value at risk (CVaR) measure. The market clearing formulations allow a system operator to effectively explore the cost-reliability tradeoff. We discuss algorithmic architectures and their convergence properties for solving these risk-sensitive optimization problems at scale. The first half of the talk will focus on a CVaR-sensitive optimization problem that can be cast as a large linear program. For this problem, we propose and analyze an algorithm that shares parallels and differences with Benders decomposition. The second half of the talk will focus on another CVaR-sensitive problem for which we propose a stochastic primal-dual algorithm and analyze its sample complexity.
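
For readers unfamiliar with CVaR, the standard Rockafellar–Uryasev representation (a generic formula, not the specific market-clearing model of the talk) indicates why such problems become large linear programs once the uncertain loss Z is represented by N scenarios Z_1, ..., Z_N:

\mathrm{CVaR}_\alpha(Z) \;=\; \min_{t \in \mathbb{R}} \; t + \tfrac{1}{1-\alpha}\,\mathbb{E}\big[(Z - t)_+\big] \;\approx\; \min_{t,\; u \ge 0} \; t + \tfrac{1}{(1-\alpha)N} \sum_{s=1}^{N} u_s \quad \text{s.t.} \quad u_s \ge Z_s - t, \;\; s = 1,\dots,N,

which remains a linear program whenever each scenario loss Z_s is itself linear in the dispatch decisions.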

Biography

Subhonmesh Bose is an Assistant Professor in the Department of Electrical and Computer Engineering at UIUC. His research focuses on facilitating the integration of renewable and distributed energy resources into the grid edge, leveraging tools from optimization, control and game theory. Before joining UIUC, he was a postdoctoral fellow at the Atkinson Center for Sustainability at Cornell University. Prior to that, he received his MS and Ph.D. degrees from Caltech in 2012 and 2014, respectively. He received the NSF CAREER Award in 2021. He has been the co-recipient of best paper awards at the IEEE Power and Energy Society General Meetings in 2013 and 2019. His research projects have been supported by grants from NSF, PSERC, Siebel Energy Institute and C3.ai, among others.

Andrew A. Chien

Large-scale Sustainable Systems Group

The University of Chicago

Understanding Productive Coupling of Adaptive Loads for the Renewable Power Grid. Opportunities and Challenges

May 28, 2021       Video

For seven years we have studied stranded power (curtailment and negative pricing), a large (TWh) and rapidly growing phenomenon (>30% CAGR), characterizing its temporal and spatial structure in power grids with growing renewable generation. Our goal is to exploit this power for productive use, both reducing the associated carbon emissions for that use (e.g. cloud computing) and increasing grid renewable absorption. Our work shows that dispatchable computing loads that exploit stranded power can improve renewable absorption significantly. High-performance and cloud computing workloads can be supported with high quality of service, and superior economics (total cost of ownership) can be achieved despite intermittent power. Such loads can now exceed 1 gigawatt per site. In 2021, variations of these ideas are achieving acceptance and early deployment.

The rise of adaptive loads optimized for selfish purposes raises significant questions about acceptable and constructive behavior for both grid stability and operations (e.g. social welfare and decarbonization). We have explored the impact of adaptive cloud computing load on power prices and carbon emissions, varying the coupling model (none, selfish optimization, and grid optimization). The results show that adaptive computing loads can disturb grid dispatch and both increase carbon emissions and inflict financial harm on other customers. At today’s cloud-computing penetration levels, such effects would be significant and increasing.

More generally, these studies expose fundamental challenges in making large-scale adaptive load a pillar of increasing renewable absorption. Time permitting, we will discuss the major economic and operational obstacles to creating adaptive load (flexibility) and effective coupling interfaces (e.g. services and markets). These challenges underpin an emerging “struggle for control” between competing spheres of optimization; directly, the tensions between societal aspirations for grid decarbonization and acceptable power cost.

Biography

Andrew A. Chien is the William Eckhardt Professor at the University of Chicago, Director of the CERES Center for Unstoppable Computing, and a Senior Scientist at Argonne National Laboratory. Since 2017, he has served as Editor-in-Chief of the Communications of the ACM, and he currently serves on the National Science Foundation CISE Advisory Committee. Chien leads the Zero-carbon Cloud project, exploring synergies between cloud computing and the renewable power grid. Chien is a global research leader in parallel computing, computer architecture, clusters, and cloud computing, and has received numerous awards for his research. Dr. Chien served as Vice President of Research at Intel Corporation from 2005-2010, leading long-range and “disruptive technologies” research as well as worldwide academic and government partnerships. From 1998-2005, he was the SAIC Chair Professor at UCSD, and prior to that a Professor at the University of Illinois. Dr. Chien is an ACM, IEEE, and AAAS Fellow, and earned his PhD, MS, and BS from the Massachusetts Institute of Technology.

Emiliano Dall’Anese

Department of Electrical, Computer, and Energy Engineering

University of Colorado Boulder

Data-based Online Optimization of Networked Systems: High-Probability Analysis and Applications to Power Grids

October 1, 2021       Video

The talk focuses on the synthesis of data-based online algorithms to optimize the behavior of networked systems, with an emphasis on power grids, operating in uncertain, dynamically changing environments and with human-in-the-loop components. Formalizing the optimization objectives through a time-varying optimization problem, the talk considers the design of online algorithms with the following features: i) concurrent learning modules are utilized to learn the cost function on the fly (using infrequent function evaluations and leveraging Gaussian processes), and ii) algorithms are implemented in closed loop with the plant to bypass the need for accurate knowledge of the algebraic map of the system. Leveraging contraction arguments, the performance of the online algorithm is analyzed in terms of tracking of an optimal solution trajectory implicitly defined by the time-varying optimization problem; in particular, results in terms of convergence in expectation and in high probability are presented, with the latter derived by adopting a sub-Weibull assumption on measurement errors and on the error in the reconstruction of the cost. Additional results are offered in terms of cumulative fixed-point residual. The online algorithm is applied to solve real-time demand response problems in power grids, where one would like to strike a balance between maximizing the profit for providing grid services and minimizing the (unknown) discomfort function of the end-users. Additional examples include real-time optimal power flow applications.
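
A stripped-down sketch of the online, measurement-driven projected-gradient idea (our own toy example with a made-up time-varying quadratic cost and noisy plant measurements; it omits the Gaussian-process cost learning and the sub-Weibull analysis discussed in the talk):

import numpy as np

rng = np.random.default_rng(3)
step, lo, hi = 0.2, -1.0, 1.0

x = np.zeros(2)
for k in range(201):
    target = np.array([np.sin(0.05 * k), np.cos(0.05 * k)])  # drifting optimum
    y_meas = x + 0.01 * rng.normal(size=2)     # noisy measurement from the "plant"
    grad_estimate = y_meas - target            # gradient of 0.5*||x - target||^2
    x = np.clip(x - step * grad_estimate, lo, hi)  # projected online update
    if k % 50 == 0:
        print(k, "tracking error:", np.linalg.norm(x - target))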

Biography

Emiliano Dall’Anese is an Assistant Professor in the Department of Electrical, Computer, and Energy Engineering at the University of Colorado Boulder, where he is also an affiliate faculty with the Department of Applied Mathematics. He received the Ph.D. degree from the Department of Information Engineering, University of Padova, Italy, in 2011. His research interests span the areas of optimization, control, and learning, with current emphasis on online optimization and optimization of dynamical systems; tools and methods are applied to problems in energy, transportation, and healthcare. He received the National Science Foundation CAREER Award in 2020, and the IEEE PES Prize Paper Award in 2021.