The Air Force Office of Scientific Research (AFOSR) has honored Albert S. Berahas by selecting him for its 2025 Young Investigator Program (YIP). The award recognizes his proposal titled “Computational Optimization in the Absence of Derivatives and in the Presence of Noise: From Aircraft Design to Large Scale Neural Networks.”
The project
The objective of Berahas’ project is to develop, analyze and implement algorithms for solving noisy constrained derivative-free optimization (DFO) problems. These are complex optimization problems in which derivative information for the objective and constraint functions is often unavailable and function values are contaminated with noise. Such problems arise widely in fields including engineering design, medicine and machine learning.
The project is off to a promising start: Berahas and his team have already developed a derivative-free Sequential Quadratic Programming (SQP) method tailored to equality-constrained problems, demonstrated its practical applicability and established convergence guarantees. Building on this preliminary work, the team will extend the approach to inequality constraints and relaxed assumptions, and will develop extensions based on Interior Point (IP) methods.
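For readers unfamiliar with this problem class, the short sketch below illustrates what a noisy, equality-constrained problem looks like when only function values are available. It uses SciPy’s off-the-shelf SLSQP solver, an SQP-type method that falls back to finite-difference gradient estimates when no derivatives are supplied. The quadratic objective, the noise level and the solver choice are assumptions made purely for illustration; this is a generic example of the setting, not Berahas’ algorithm.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Noisy black-box objective: each evaluation returns the true value
# corrupted by random noise, and no analytic derivatives are available.
def noisy_objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2 + 1e-3 * rng.standard_normal()

# Equality constraint h(x) = 0, also available only through evaluations.
def constraint(x):
    return x[0] + x[1] - 1.0

# SLSQP is a generic SQP-type solver; with no Jacobian provided it
# estimates gradients by finite differences, which noise can corrupt.
result = minimize(
    noisy_objective,
    x0=np.zeros(2),
    method="SLSQP",
    constraints=[{"type": "eq", "fun": constraint}],
)

print(result.x, result.fun)
```

Even in this toy example, the noise degrades the finite-difference gradient estimates a standard solver relies on, which is exactly the difficulty that noise-aware derivative-free methods with convergence guarantees are meant to address.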
This research has the potential to significantly enhance AFOSR’s capabilities and aligns closely with the Mathematical Optimization Program by delivering new theory, algorithms and software. Berahas’ team will also develop an open-source software package to make these algorithms accessible to a wider audience.
About the AFOSR Young Investigator Program (YIP)
This year, the YIP awarded 48 early-career professionals three-year grants of up to $450,000 to further their exceptional research projects. The program aims to foster innovative basic research in science and engineering, support early-career development and create engagement opportunities that align with the Department of the Air Force’s mission.
More about Berahas
Berahas is an assistant professor in the University of Michigan (U-M) Department of Industrial and Operations Engineering (IOE). Before joining the faculty at U-M IOE, he served as a postdoctoral research fellow at Lehigh University and Northwestern University. He received his B.S.E. in Operations Research and Industrial Engineering from Cornell University, and both his M.S. and Ph.D. in Engineering Sciences and Applied Mathematics from Northwestern University.
His research focuses on large-scale nonlinear optimization. He dedicates his efforts to creating, analyzing and implementing algorithms that tackle complex optimization challenges. His expertise spans several sub-fields, including optimization for machine learning, constrained optimization and stochastic optimization. He also works on derivative-free optimization and decentralized optimization.