New Methods Move Beyond Traditional Techniques
Paul Grigas, associate professor of Industrial Engineering and Operations Research at the University of California, Berkeley, shared new research on how to more efficiently solve complex optimization problems whose solutions depend on changing inputs, or hyperparameters. Traditionally, these problems are tackled by discretization: picking many individual values of the hyperparameter and solving a separate problem for each one, a time-consuming process. Grigas introduced a new approach that uses machine learning to approximate the entire range of solutions at once, avoiding the need to re-solve the problem for every input variation.
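To make the contrast concrete, here is a minimal Python sketch of the traditional discretization approach on a toy ridge-regression problem. The problem instance, the grid, and all variable names are illustrative assumptions for this example, not details taken from the talk.

```python
import numpy as np

# Toy parametric problem: ridge regression, whose optimal solution
#   w*(lam) = argmin_w ||X w - y||^2 + lam * ||w||^2
# depends on the hyperparameter lam.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=100)

def solve_ridge(lam):
    """Solve the problem exactly at one hyperparameter value."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Traditional discretization: pick a grid of hyperparameter values and
# re-solve the problem from scratch at every one of them.
grid = np.logspace(-3, 1, 50)
solutions = np.array([solve_ridge(lam) for lam in grid])  # 50 separate solves
```

Each grid point costs a full solve, and the grid must be made finer whenever more accuracy is needed between points.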
Faster, Smarter Algorithms
This learning-based method works by training a model to predict near-optimal solutions across all possible inputs, significantly cutting down on computing time. The approach performs especially well when combined with standard optimization techniques such as stochastic gradient descent. For problems with a single hyperparameter and smooth structure, Grigas also introduced new algorithms that exploit how gradually the optimal solution changes as the hyperparameter varies, tracing the solution path much as one traces a curve. These methods improve speed while maintaining high accuracy, as demonstrated in tests on real-world tasks such as classifying data and managing investment portfolios.
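As a rough illustration of the learning-based idea, the sketch below uses stochastic gradient descent to train a small model that maps any hyperparameter value directly to an approximate solution of the same toy ridge problem. The polynomial feature map, step size, and iteration count are arbitrary choices for this example, not details of Grigas's method.

```python
import numpy as np

# Same toy ridge problem: w*(lam) minimizes ||X w - y||^2 + lam * ||w||^2.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=100)

def phi(lam):
    """Polynomial features of log10(lam), rescaled to [-1, 1]."""
    t = (np.log10(lam) + 1.0) / 2.0
    return np.array([1.0, t, t**2, t**3])

# Model: w_hat(lam) = Theta.T @ phi(lam). One training run yields an
# approximate solution for EVERY lam, instead of one solve per grid point.
Theta = np.zeros((4, X.shape[1]))
step = 1e-4
for _ in range(20000):
    lam = 10 ** rng.uniform(-3, 1)               # sample a random hyperparameter
    f = phi(lam)
    w = Theta.T @ f                              # predicted solution at this lam
    g_w = 2 * X.T @ (X @ w - y) + 2 * lam * w    # gradient of the objective in w
    Theta -= step * np.outer(f, g_w)             # chain rule: d(obj)/dTheta = phi g_w^T

# Query the learned solution map anywhere, with no further optimization.
w_at = lambda lam: Theta.T @ phi(lam)
```

A single training run replaces the many separate solves of the grid approach, and the learned map can then be evaluated at any hyperparameter value at negligible cost.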
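The path-following idea can likewise be sketched on the same toy problem: differentiating the optimality condition with respect to the hyperparameter yields an ordinary differential equation for the solution path, which can be integrated numerically. This is a generic homotopy-style sketch under those assumptions, not a reproduction of the specific algorithms from the talk.

```python
import numpy as np

# For ridge regression, the optimality condition
#   (X^T X + lam I) w*(lam) = X^T y
# differentiates to the ODE  dw*/dlam = -(X^T X + lam I)^{-1} w*(lam),
# so the entire solution path can be traced by following this ODE.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=100)

A = X.T @ X
d = A.shape[0]

lam = 1e-3
w = np.linalg.solve(A + lam * np.eye(d), X.T @ y)  # exact start of the path

path = [(lam, w.copy())]
dlam = 0.01
while lam < 10.0:
    # One explicit Euler step along the ODE; the fixed step size is an
    # illustrative choice, and practical path-following methods adapt it.
    dw = -np.linalg.solve(A + lam * np.eye(d), w)
    w += dlam * dw
    lam += dlam
    path.append((lam, w.copy()))
```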
About the Speaker
Grigas focuses on combining optimization and machine learning to make better, data-driven decisions. He is a faculty member at UC Berkeley and part of the National Science Foundation’s AI Institute for Advances in Optimization. His work has earned several top awards, including honors from INFORMS for outstanding research in operations and optimization.