Plenary speakers



  • MAURICIO RESENDE
    Mathematical Optimization and Planning (MOP)
    Amazon.com, Inc., Seattle - USA

    Metaheuristic approaches to continuous global optimization

    Metaheuristics were introduced in the 1980s as effective algorithms for solving hard combinatorial optimization problems. More recently, their application has been extended to continuous global optimization. In this talk we review two such metaheuristics: Continuous GRASP (CGRASP) and Biased Random-Key Genetic Algorithms (BRKGA). We apply them to the problem of minimizing a function of continuous variables subject to box constraints. Several applications are described.
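    As a rough illustration of the second metaheuristic (a minimal sketch, not the speaker's implementation: all function and parameter names below are hypothetical, and the decoder is the simplest affine map from random keys onto the box), a BRKGA for box-constrained minimization can look like:

```python
import random

def brkga_box_min(f, lower, upper, pop_size=30, elite_frac=0.2,
                  mutant_frac=0.1, rho=0.7, generations=100, seed=0):
    """Minimal BRKGA sketch for minimizing f over the box [lower, upper].

    Chromosomes are vectors of random keys in [0,1]; a decoder maps them
    affinely into the box. Parameter names are illustrative only."""
    rng = random.Random(seed)
    n = len(lower)
    decode = lambda keys: [lo + k * (up - lo)
                           for k, lo, up in zip(keys, lower, upper)]
    pop = [[rng.random() for _ in range(n)] for _ in range(pop_size)]
    n_elite = max(1, int(elite_frac * pop_size))
    n_mut = max(1, int(mutant_frac * pop_size))
    for _ in range(generations):
        pop.sort(key=lambda c: f(decode(c)))          # elite = best chromosomes
        elite = pop[:n_elite]
        # mutants are fresh random-key vectors, keeping diversity
        mutants = [[rng.random() for _ in range(n)] for _ in range(n_mut)]
        offspring = []
        while len(elite) + len(mutants) + len(offspring) < pop_size:
            e = rng.choice(elite)
            o = rng.choice(pop[n_elite:])
            # biased crossover: inherit each key from the elite parent
            # with probability rho > 0.5
            offspring.append([e[i] if rng.random() < rho else o[i]
                              for i in range(n)])
        pop = elite + mutants + offspring
    best = min(pop, key=lambda c: f(decode(c)))
    return decode(best), f(decode(best))
```

    For example, minimizing the shifted sphere function sum((x_i - 1)^2) over [-5, 5]^3 drives the population toward the optimum at (1, 1, 1).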

  • AURELIO RIBEIRO LEITE DE OLIVEIRA
    Instituto de Matemática, Estatística e Computação Científica (IMECC)
    Universidade Estadual de Campinas (UNICAMP)

    Interior point methods and DC power systems - How to squeeze a matrix

    Interior point methods have been applied successfully to optimal power system problems since shortly after their appearance. In particular, they perform even better on the so-called DC formulation, since that formulation often leads to quadratic problems with separable variables. In this talk, some formulations are discussed and interior point methods tailored to these specific problems are developed. The sparsity pattern of the resulting linear systems is exploited: owing to the power system structure and the underlying physical laws, the required matrix operations can be greatly reduced. These matrix reductions are presented starting from a simple formulation and are generalized to progressively more complete formulations, up to a full model proposed for the Brazilian transmission network power system.
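    To illustrate why separable variables help (a minimal sketch under simplifying assumptions, not one of the talk's formulations; all names are hypothetical), consider a log-barrier method for a separable box-constrained quadratic. Because the barrier Hessian is diagonal, each Newton system collapses into independent scalar divisions instead of a full linear solve:

```python
def separable_qp_barrier(q, c, lower, upper, mu=1.0, shrink=0.5,
                         outer=30, inner=20):
    """Log-barrier sketch for: min sum_i 0.5*q[i]*x_i^2 + c[i]*x_i
    subject to lower[i] <= x_i <= upper[i], with q[i] > 0.

    Separability makes the barrier Hessian diagonal, so the Newton
    direction is computed coordinate by coordinate with one division."""
    n = len(q)
    # start strictly inside the box
    x = [(lo + up) / 2 for lo, up in zip(lower, upper)]
    for _ in range(outer):
        for _ in range(inner):
            for i in range(n):
                dl, du = x[i] - lower[i], upper[i] - x[i]
                # gradient and Hessian of the i-th barrier subproblem
                g = q[i] * x[i] + c[i] - mu / dl + mu / du
                h = q[i] + mu / dl ** 2 + mu / du ** 2
                step = -g / h
                # damp the step so the iterate stays strictly interior
                while not (lower[i] < x[i] + step < upper[i]):
                    step *= 0.5
                x[i] += step
        mu *= shrink  # follow the central path by shrinking the barrier
    return x
```

    For instance, minimizing x1^2 - 2*x1 + x2^2 + 6*x2 over [0, 2] x [0, 2] yields x1 close to 1 (interior) and x2 pushed toward its active lower bound 0.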

  • JACEK GONDZIO
    School of Mathematics
    University of Edinburgh, Scotland, UK

    Continuation in Optimization: From interior point methods for large-scale optimization to Big Data optimization
    In this talk we will discuss similarities between two homotopy-based approaches:
    - (inexact) primal-dual interior point method for LP/QP, and
    - preconditioned Newton conjugate gradient method for big data optimization.
    Both approaches rely on clever exploitation of the curvature of the optimized functions and deliver efficient techniques for solving optimization problems of unprecedented sizes. We will address both theoretical and practical aspects of these methods.

    References:
    [1]  J. Gondzio, Interior point methods 25 years later, European Journal of Operational Research 218 (2012), pp. 587--601. DOI: 10.1016/j.ejor.2011.09.017
    [2]  K. Fountoulakis and J. Gondzio, A second-order method for strongly convex L1-regularization problems, Mathematical Programming 156 (2016), pp. 189--219. DOI: 10.1007/s10107-015-0875-4
    [3]  K. Fountoulakis and J. Gondzio, Performance of first- and second-order methods for big data optimization, Technical Report ERGO-15-005, School of Mathematics, University of Edinburgh.
         http://www.maths.ed.ac.uk/~gondzio/reports/trillion.html
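    As a rough illustration of the building block shared by both homotopy approaches (a sketch under simplifying assumptions: a small dense matrix, the basic Jacobi preconditioner, hypothetical function names), here is a preconditioned conjugate gradient loop of the kind used inside a Newton-CG method to solve each Newton system H d = -g approximately:

```python
def pcg(A, b, x0=None, tol=1e-10, max_iter=200):
    """Jacobi-preconditioned conjugate gradients for a symmetric
    positive definite A given as nested lists.

    The preconditioner M = diag(A) is the simplest choice; inside a
    Newton-CG method A would be the (regularized) Hessian."""
    n = len(b)
    x = x0[:] if x0 else [0.0] * n
    matvec = lambda v: [sum(A[i][j] * v[j] for j in range(n))
                        for i in range(n)]
    r = [bi - Axi for bi, Axi in zip(b, matvec(x))]      # residual b - A x
    z = [ri / A[i][i] for i, ri in enumerate(r)]         # apply M^{-1}
    p = z[:]
    rz = sum(ri * zi for ri, zi in zip(r, z))
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rz / sum(pi * Api for pi, Api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * Api for ri, Api in zip(r, Ap)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:        # residual small enough
            break
        z = [ri / A[i][i] for i, ri in enumerate(r)]
        rz_new = sum(ri * zi for ri, zi in zip(r, z))
        beta = rz_new / rz
        rz = rz_new
        p = [zi + beta * pi for zi, pi in zip(z, p)]
    return x
```

    In a large-scale setting the matrix-vector product would exploit sparsity or matrix-free Hessian-vector products rather than dense loops; the preconditioner is what controls the iteration count, which is the practical point of the methods discussed in the references.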