Albert S. Berahas

Assistant Professor

Location

2783 IOE

Biography

Albert S. Berahas is an Assistant Professor in the Department of Industrial and Operations Engineering at the University of Michigan. Prior to this appointment, he was a Postdoctoral Research Fellow at Lehigh University (2018–2020) and at Northwestern University (2018). He received his PhD in Engineering Sciences and Applied Mathematics from Northwestern University in 2018, an MSc in Engineering Sciences and Applied Mathematics from Northwestern University in 2012, and his undergraduate degree in Operations Research and Industrial Engineering from Cornell University in 2009. His research broadly focuses on designing, developing, and analyzing algorithms for solving large-scale nonlinear optimization problems. Specifically, he is interested in and has explored several sub-fields of nonlinear optimization, including: (i) general nonlinear optimization algorithms, (ii) optimization algorithms for machine learning, (iii) constrained optimization, (iv) stochastic optimization, (v) derivative-free optimization, and (vi) distributed optimization.

Education

  • PhD, Northwestern University, Engineering Sciences and Applied Mathematics (2013–2018)
  • MSc, Northwestern University, Engineering Sciences and Applied Mathematics (2011–2012)
  • BSc, Cornell University, Operations Research and Industrial Engineering (2005–2009)

Research Interests

  • Nonlinear Optimization
  • Machine Learning
  • Deep Learning
  • Stochastic Optimization
  • Constrained Optimization
  • Derivative-Free Optimization
  • Distributed Optimization

Publications

  • A. S. Berahas, F. E. Curtis, D. P. Robinson, and B. Zhou. Sequential Quadratic Optimization for Nonlinear Equality Constrained Stochastic Optimization. SIAM Journal on Optimization, 31(2):1352–1379, 2021.
  • A. S. Berahas, L. Cao, and K. Scheinberg. Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise. SIAM Journal on Optimization, 31(2):1489–1518, 2021.
  • A. S. Berahas, F. E. Curtis, and B. Zhou. Limited-Memory BFGS with Displacement Aggregation. Mathematical Programming, DOI: 10.1007/s10107-021-01621-6, 2021.
  • A. S. Berahas, R. Bollapragada, and J. Nocedal. An Investigation of Newton-Sketch and Subsampled Newton Methods. Optimization Methods and Software, 35(4):661–680, 2020.
  • A. S. Berahas, R. H. Byrd, and J. Nocedal. Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods. SIAM Journal on Optimization, 29(2):965–993, 2019.
  • A. S. Berahas, R. Bollapragada, N. S. Keskar, and E. Wei. Balancing Communication and Computation in Distributed Optimization. IEEE Transactions on Automatic Control, 64(8):3141–3155, 2018.
  • A. S. Berahas, J. Nocedal, and M. Takáč. A Multi-Batch L-BFGS Method for Machine Learning. In Advances in Neural Information Processing Systems, pages 1055–1063, 2016.