Performance Evaluation and Stochastic Optimization with Gradually Changing Non-Stationary Data
Mar 30, 2022 09:00 AM Singapore (Registration will open at 08:50 AM.)
Join Zoom Meeting:
Meeting ID: 940 4340 7158
We propose and analyze effective estimators for data-oriented expected performance evaluation and data-oriented stochastic optimization problems in the presence of non-stationarity in the data that reflects gradual distribution changes. Under a guaranteed mean squared error criterion, we prove optimality of the proposed estimators, which intuitively assign relatively more weight to the more recently collected data and discard data that were collected long ago. Under a limiting regime naturally implied by the gradually changing non-stationarity, we prove the first set of central limit theorems for the proposed estimators in such non-stationary environments, enabling valid large-sample statistical inference. We then prove central limit theorems for a weighted sample average approximation approach designed to solve data-oriented stochastic optimization problems under gradually changing non-stationarity. The limit theorems and techniques proved here may be useful for general bias-variance trade-off analysis when data-oriented decision optimization tasks encounter non-stationary data.
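To make the weighting idea concrete, here is a minimal illustrative sketch (not the talk's actual estimator) of an estimator that assigns geometrically decaying weights to older observations and discards data outside a fixed window. The `decay` and `window` parameters are hypothetical; in the work described above, the weights and cutoff would be derived from the assumed rate of distribution change rather than chosen ad hoc.

```python
import numpy as np

def weighted_mean_estimate(data, decay=0.9, window=50):
    """Weighted average favoring recent observations.

    data: 1-D sequence ordered from oldest to newest.
    decay, window: illustrative tuning parameters (assumptions,
    not the optimally derived weights from the talk).
    """
    recent = np.asarray(data, dtype=float)[-window:]  # discard old data
    n = len(recent)
    # Newest observation gets weight 1; older ones decay geometrically.
    weights = decay ** np.arange(n - 1, -1, -1)
    return float(np.sum(weights * recent) / np.sum(weights))
```

For example, if the underlying mean drifts from 0 to 10 near the end of the sample, this estimator lands closer to the recent level than a plain sample mean would, illustrating the bias-variance trade-off: heavier weight on recent data reduces bias from the distribution shift at the cost of higher variance.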
About the Speaker
Zeyu Zheng is an Assistant Professor in the Department of Industrial Engineering and Operations Research at the University of California, Berkeley. He received his PhD in Operations Research, a PhD minor in Statistics, and an MA in Economics from Stanford University, and a BS in Mathematics from Peking University. His research interests include simulation and non-stationary stochastic modeling and decision making.
For more information about the ESD Seminar, please email firstname.lastname@example.org