Using Smoothing Functions to Obtain Guarantees on the Tightness of Global Solutions Attained by Algorithms for Non-Convex Optimization

Sunday, September 7, 2025
  • Lecturer: Chen Zakaim
  • Location: Amado 814
Abstract:
In this talk, we will discuss the problem of maximizing a convex function, a problem known to be NP-hard. When the objective function and the constraints are twice differentiable, a known algorithm solves the problem. The novelty of my thesis is in developing an approach that also works when the twice-differentiability assumption does not hold, for example in the case of a function that is a sum of convex functions. Non-smoothness is addressed by approximating the function from above and below with suitable smooth functions. Furthermore, we show that the method extends to functions that are not necessarily twice differentiable, and even to problems that are not convex, by representing them as a difference of two convex functions (DC programming).

Advisor: Prof. Aharon Ben-Tal
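To give a flavor of smoothing from above and below: the talk's specific smoothing functions are not stated in the abstract, but a standard illustration is the log-sum-exp smoothing of the (non-smooth, convex) max function. For a smoothing parameter mu > 0, the smooth function mu*log(sum_i exp(x_i/mu)) bounds max(x) from above, and shifting it down by mu*log(n) bounds max(x) from below, so the gap between the two smooth bounds is controlled by mu. A minimal sketch (function names are my own, not from the thesis):

```python
import numpy as np

def smooth_max(x, mu):
    """Log-sum-exp smoothing of max(x): a smooth upper bound.

    Satisfies max(x) <= smooth_max(x, mu) <= max(x) + mu*log(n).
    """
    x = np.asarray(x, dtype=float)
    m = x.max()  # shift by the max for numerical stability
    return m + mu * np.log(np.sum(np.exp((x - m) / mu)))

def smooth_max_lower(x, mu):
    """Smooth lower bound on max(x): the upper bound shifted down by mu*log(n)."""
    return smooth_max(x, mu) - mu * np.log(len(x))

x = [1.0, -2.0, 0.5]
mu = 0.1
lo, hi = smooth_max_lower(x, mu), smooth_max(x, mu)
assert lo <= max(x) <= hi  # the two smooth functions sandwich the max
```

As mu shrinks, both bounds converge to max(x), which is the mechanism that lets smooth-case guarantees transfer to the non-smooth objective.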