Record Details

Stochastic Newton Methods With Enhanced Hessian Estimation

Electronic Theses of Indian Institute of Science

 
Title Stochastic Newton Methods With Enhanced Hessian Estimation
 
Creator Reddy, Danda Sai Koti
 
Subject Stochastic Newton Methods
Hessian Estimation
Simultaneous Perturbation Stochastic Approximation (SPSA)
Random Directions Stochastic Approximation (RDSA)
Finite-difference Stochastic Approximation (FDSA)
Hessian Estimation Scheme
Computer Science
 
Description Optimization problems involving uncertainties are common in a variety of engineering disciplines such as transportation systems, manufacturing, communication networks, healthcare and finance. The large number of input variables and the lack of a system model prohibit a precise analytical solution, and a viable alternative is to employ simulation-based optimization. The idea here is to simulate the stochastic system under consideration a few times while updating the system parameters until a good enough solution is obtained.
Formally, given only noise-corrupted measurements of an objective function, we wish to find a parameter which minimises the objective function. Iterative algorithms using statistical methods search the feasible region to improve upon the candidate parameter. Stochastic approximation algorithms are the best-suited, most studied and most widely applied algorithms for finding solutions when the feasible region is a continuously valued set. One can use information on the gradient/Hessian of the objective to aid the search process. However, owing to the lack of knowledge of the noise distribution, one needs to estimate the gradient/Hessian from noisy samples of the cost function obtained from simulation. Simple gradient search schemes take many iterations to converge to a local minimum and are heavily dependent on the choice of step-sizes. Stochastic Newton methods, on the other hand, can counter the ill-conditioning of the objective function as they incorporate second-order information into the stochastic updates. Stochastic Newton methods are often more accurate than simple gradient search schemes.
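To make the simulation-optimization setting above concrete, the following is a minimal sketch of a gradient search driven by simultaneous-perturbation (SPSA-style) gradient estimates from noisy function measurements only. The function names, gain sequences and test objective here are illustrative assumptions, not the thesis's exact algorithms.

```python
import numpy as np

def spsa_gradient(f, theta, delta, rng):
    """One simultaneous-perturbation gradient estimate.

    f     : noisy objective (only function measurements are available)
    theta : current parameter vector
    delta : perturbation half-width
    rng   : NumPy random generator
    """
    # Rademacher (+/-1) perturbation direction
    d = rng.choice([-1.0, 1.0], size=theta.shape)
    y_plus = f(theta + delta * d)
    y_minus = f(theta - delta * d)
    # The same two measurements estimate every coordinate of the gradient
    # (note 1/d_i = d_i for +/-1 perturbations)
    return (y_plus - y_minus) / (2.0 * delta) * d

def sgd_spsa(f, theta0, steps=500, a=0.1, delta=0.05, seed=0):
    """Simple gradient search using SPSA estimates with decaying gains."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(1, steps + 1):
        g = spsa_gradient(f, theta, delta / k**0.101, rng)
        theta = theta - (a / k**0.602) * g  # commonly used gain decay rates
    return theta
```

Note that each iteration costs only two function measurements regardless of the dimension of theta, which is what makes simultaneous-perturbation schemes attractive for simulation-based optimization.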
We propose enhancements to the Hessian estimation scheme used in two recently proposed stochastic Newton methods, based on the ideas of random directions stochastic approximation (2RDSA) [21] and simultaneous perturbation stochastic approximation (2SPSA-3) [6], respectively. The proposed scheme, inspired by [29], reduces the error in the Hessian estimate by (i) incorporating a zero-mean feedback term and (ii) optimizing the step-sizes used in the Hessian recursion. We prove that both 2RDSA and 2SPSA-3 with our Hessian improvement scheme converge asymptotically to the true Hessian. The key advantage of 2RDSA and 2SPSA-3 is that they require only 75% of the per-iteration simulation cost of 2SPSA with improved Hessian estimation (2SPSA-IH) [29]. Numerical experiments show that 2RDSA-IH outperforms both 2SPSA-IH and 2RDSA without the improved Hessian estimation scheme.
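As background to the Hessian recursion discussed above, the sketch below shows a plain 2SPSA-style per-iteration Hessian estimate averaged over iterations; it is a baseline illustration only, not the thesis's improved scheme (which additionally subtracts a zero-mean feedback term and optimizes the averaging step-sizes). All names and gain values are assumptions.

```python
import numpy as np

def spsa_hessian_estimate(f, theta, c, c_tilde, rng):
    """One simultaneous-perturbation Hessian estimate from four measurements.

    Two independent Rademacher perturbations d1, d2 give one-sided gradient
    estimates at theta +/- c*d1, whose difference yields a rank-one,
    symmetrized Hessian estimate that is unbiased in expectation.
    """
    p = theta.size
    d1 = rng.choice([-1.0, 1.0], size=p)
    d2 = rng.choice([-1.0, 1.0], size=p)
    # One-sided SP gradient estimates (componentwise division; 1/d2_i = d2_i)
    g_plus = (f(theta + c * d1 + c_tilde * d2) - f(theta + c * d1)) / (c_tilde * d2)
    g_minus = (f(theta - c * d1 + c_tilde * d2) - f(theta - c * d1)) / (c_tilde * d2)
    dG = (g_plus - g_minus) / (2.0 * c)
    H_hat = np.outer(dG, 1.0 / d1)
    # Symmetrize, since the true Hessian is symmetric
    return 0.5 * (H_hat + H_hat.T)

def averaged_hessian(f, theta, iters=4000, c=0.1, c_tilde=0.1, seed=0):
    """Running average of per-iteration estimates; converges to the Hessian."""
    rng = np.random.default_rng(seed)
    H = np.zeros((theta.size, theta.size))
    for k in range(1, iters + 1):
        H_hat = spsa_hessian_estimate(f, theta, c, c_tilde, rng)
        H += (H_hat - H) / k  # equal-weight averaging recursion
    return H
```

Each estimate here costs four function measurements; the per-iteration error of the raw estimates is large, which is exactly why schemes such as [29] add a feedback term and tuned step-sizes to accelerate the averaging.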
 
Contributor Bhatnagar, Shalabh
 
Date 2018-05-22T15:14:41Z
2018-05-22T15:14:41Z
2018-05-22
2017
 
Type Thesis
 
Identifier http://etd.iisc.ernet.in/2005/3582
http://etd.iisc.ernet.in/abstracts/4450/G28169-Abs.pdf
 
Language en_US
 
Relation G28169