An Asymptotically Efficient Stochastic Newton Method with First Order Cost for Online Convex Optimization
Stochastic Newton methods capable of achieving asymptotic efficiency have historically required a per-iteration cost of O(d³) for problems of dimension d. This presentation will first review the concept of asymptotic efficiency, the statistical benchmark for an optimal estimator. We then introduce an online algorithm that achieves this same statistical optimality with a reduced per-iteration cost of O(ℓd²), where the mask size ℓ can be chosen much smaller than the dimension d.
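To make the cost claim concrete, below is a minimal, hypothetical sketch of a masked stochastic Newton update on a toy streaming least-squares model. It is not the algorithm presented in the talk: the model, the function names (hessian_columns, masked_newton_step), the ridge and damping constants, and the rank-2ℓ Woodbury update are all assumptions made only to illustrate how refreshing ℓ rows and columns of a running Hessian estimate each step can keep the per-iteration cost at O(ℓd²) instead of O(d³).

```python
# Hypothetical sketch of a masked stochastic Newton step (not the talk's algorithm).
import numpy as np

rng = np.random.default_rng(0)


def hessian_columns(x, cols):
    """Selected columns of the per-sample Hessian x x^T for the toy
    least-squares loss 0.5 * (x^T theta - y)^2 (model choice is an assumption)."""
    return np.outer(x, x[cols])


def masked_newton_step(theta, S, A_inv, x, y, t, ell, lr=2.0, t0=50):
    """Refresh ell rows/columns of the running Hessian estimate S, update the
    stored inverse of the ridge-regularized A = S + I via a rank-2*ell Woodbury
    formula (O(ell * d^2)), then take a Newton-type step (O(d^2))."""
    d = theta.size
    g = (x @ theta - y) * x                     # stochastic gradient, O(d)

    # Draw a random mask of ell coordinates; recompute only those columns.
    # The offset t0 damps early updates (a standard Robbins-Monro tweak).
    cols = rng.choice(d, size=ell, replace=False)
    w = 1.0 / (t + t0)
    new_cols = (1 - w) * S[:, cols] + w * hessian_columns(x, cols)
    D = new_cols - S[:, cols]                   # d x ell change of masked columns
    D_S = D[cols, :]                            # ell x ell block of the change

    # Symmetric in-place refresh: S <- S + D e_S^T + e_S D^T - e_S D_S e_S^T.
    S[:, cols] += D
    S[cols, :] += D.T
    S[np.ix_(cols, cols)] -= D_S

    # The same change applied to A = S + I is the rank-2*ell matrix U M U^T with
    # U = [D, e_S] and M^{-1} = [[D_S, I], [I, 0]]; update A_inv by Woodbury.
    E = np.zeros((d, ell))
    E[cols, np.arange(ell)] = 1.0
    U = np.hstack([D, E])
    M_inv = np.block([[D_S, np.eye(ell)], [np.eye(ell), np.zeros((ell, ell))]])
    AiU = A_inv @ U                             # O(ell * d^2)
    inner = np.linalg.inv(M_inv + U.T @ AiU)    # (2*ell) x (2*ell) inverse
    A_inv -= AiU @ inner @ AiU.T                # O(ell * d^2)

    # Newton-type step with the maintained inverse; no O(d^3) solve anywhere.
    theta = theta - (lr / (t + t0)) * (A_inv @ g)
    return theta, S, A_inv


# Tiny usage example on a synthetic stream.
d, ell, n = 20, 4, 3000
theta_star = rng.standard_normal(d)
theta, S = np.zeros(d), np.zeros((d, d))
A_inv = np.eye(d)                               # inverse of S + I at S = 0
for t in range(1, n + 1):
    x = rng.standard_normal(d)
    y = x @ theta_star + 0.1 * rng.standard_normal()
    theta, S, A_inv = masked_newton_step(theta, S, A_inv, x, y, t, ell)
print("final estimation error:", np.linalg.norm(theta - theta_star))
```

In this sketch the per-iteration work is dominated by the two O(ℓd²) matrix products in the Woodbury update and the O(d²) preconditioned step, so the O(d³) cost of forming or inverting a full Hessian never appears; how the actual algorithm attains both this cost and asymptotic efficiency is the subject of the talk.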