
Search results

  • On-site consultation only

    A framework around limited-memory partitioned quasi-Newton methods

    Bigeon, Jean

    Montréal : École des hautes études commerciales. Groupe d'études et de recherche en analyse des décisions, 2023
    Books

  • On-site consultation only

    Online Newton's method with linear time-varying equality constraints

    Lesage-Landry, Antoine

    Montréal : École des hautes études commerciales. Groupe d'études et de recherche en analyse des décisions, 2022
    Books

  • On-site consultation only

    A proximal quasi-Newton trust-region method for nonsmooth regularized optimization

    Aravkin, Aleksandr

    Montréal : École des hautes études commerciales. Groupe d'études et de recherche en analyse des décisions, 2021
    Books

  • On-site consultation only

    Diagonal quasi-Newton updating methods for large-scale nonlinear least squares

    Al-Siyabi, Ahmed

    Montréal : École des hautes études commerciales. Groupe d'études et de recherche en analyse des décisions, 2024
    Books

  • Étude du comportement des méthodes BFGS et L-BFGS pour résoudre un sous-problème de région de confiance

    Bourhis, Johann

    Montréal : École des hautes études commerciales. Groupe d'études et de recherche en analyse des décisions (GERAD), 2019

    …a new relationship between CG and the Broyden class, the class of quasi-Newton methods that generalizes the BFGS method. This new result allows us to rediscover earlier results established by Broyden in 1970 [1]. In addition, we study the use of the limited-memory BFGS (L-BFGS) method in a trust-region algorithm by providing the…

    Books
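
    The excerpt above refers to the limited-memory BFGS (L-BFGS) method. As a minimal sketch of what "limited memory" means in practice, assuming the standard two-loop recursion and initial scaling from Nocedal and Wright rather than anything specific to this thesis, the routine below applies the implicit inverse-Hessian approximation built from the stored curvature pairs:

      import numpy as np

      def lbfgs_direction(grad, s_list, y_list):
          """Two-loop recursion: return -H @ grad, where H is the L-BFGS
          inverse-Hessian approximation built from the stored pairs
          s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k (newest pair last)."""
          q = grad.copy()
          rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
          alphas = []
          # First loop: newest to oldest pair.
          for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
              alpha = rho * np.dot(s, q)
              alphas.append(alpha)
              q = q - alpha * y
          # Standard initial Hessian scaling gamma = s'y / y'y.
          if s_list:
              gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
          else:
              gamma = 1.0
          r = gamma * q
          # Second loop: oldest to newest pair.
          for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
              beta = rho * np.dot(y, r)
              r = r + (alpha - beta) * s
          return -r  # quasi-Newton search direction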

  • Méthodes numériques en actuariat avec R : analyse numérique

    Goulet, Vincent

    Québec : Goulet, Vincent, 2019

    …of the problem 33 · 5.2 Setting the context 33 · 5.3 Hint for the problem 35 · 5.4 Bisection method 36 · 5.5 Hint for the problem 40 · 5.6 Fixed-point method 40 · 5.7 Hint for the problem 44 · 5.8 Newton–Raphson method 46 · 5.9 Hint for the problem 54 · 5.10 Optimization functions in Excel and R…

    Books

  • Scaled projected-directions methods with application to transmission tomography

    Mestdagh, Guillaume

    Montréal : École des hautes études commerciales. Groupe d'études et de recherche en analyse des décisions, 2019

    …algorithms: TRON, a trust-region method with exact second derivatives, and L-BFGS-B, a linesearch method with a limited-memory quasi-Newton Hessian approximation. We compare our approach with one where a change of variable is made in the problem. On two reconstruction problems, our approach converges faster than the change of…

    Books

  • Scaled projected-directions methods with application to transmission tomography

    Mestdagh, Guillaume

    Montréal : École des hautes études commerciales. Groupe d'études et de recherche en analyse des décisions, 2020

    …and L-BFGS-B, a linesearch method with a limited-memory quasi-Newton Hessian approximation. We compare our approach with one where a change of variable is made in the problem. On two reconstruction problems, our approach converges faster than the change of variable approach, and achieves much tighter accuracy in terms of…

    Books
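
    The two records above benchmark against L-BFGS-B, a linesearch method with a limited-memory quasi-Newton Hessian approximation for bound-constrained problems. A minimal usage sketch with SciPy's standard implementation; the nonnegative least-squares toy below is illustrative, not the tomography problem from these papers:

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(0)
      A = rng.standard_normal((50, 10))
      b = rng.standard_normal(50)

      def fun(x):
          r = A @ x - b
          return 0.5 * r @ r          # least-squares objective

      def jac(x):
          return A.T @ (A @ x - b)    # exact gradient

      # Nonnegativity bounds, as in reconstructions requiring x >= 0.
      res = minimize(fun, x0=np.zeros(10), jac=jac, method="L-BFGS-B",
                     bounds=[(0.0, None)] * 10)
      print(res.x, res.fun)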

  • A matrix-free augmented Lagrangian algorithm with application to large-scale structural design optimization

    Arreckx, Sylvain

    Montréal (Québec) Canada : GERAD HEC Montréal, 2014

    …feasible or realistic to store Jacobians or Hessians explicitly. Matrix-free implementations of standard optimization methods, which do not explicitly form Jacobians and Hessians and may use quasi-Newton derivative approximations, circumvent those restrictions, but such implementations are virtually non-existent. We develop…

    Books
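
    The record above argues for matrix-free implementations that never form Jacobians or Hessians explicitly. One common way to realize this in Python is SciPy's LinearOperator, which exposes only matrix-vector products; the diagonal quadratic below is a stand-in for illustration, not the structural design problem from the report:

      import numpy as np
      from scipy.sparse.linalg import LinearOperator, cg

      n = 1000

      def hessvec(v):
          # Hessian-vector product of f(x) = 0.5 * sum(i * x_i^2),
          # computed without ever storing the (diagonal) Hessian.
          return np.arange(1, n + 1) * v

      H = LinearOperator((n, n), matvec=hessvec, dtype=float)

      # Solve the Newton system H d = -g matrix-free with CG.
      g = np.ones(n)
      d, info = cg(H, -g)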

  • On optimization algorithms for maximum likelihood estimation / Anh Tien Mai, Fabian Bastin, Michel Toulouse

    Mai, Anh Tien, author

    Montréal (Québec) : CIRRELT, 2014

    …these conditions are often violated, and the estimation can fail to converge. This has led Bunch (1987) to consider the log-likelihood problem as a particular case of generalized regression and to propose adding a correction term to the Hessian approximation, similar to the Gauss–Newton method in the context of least…

    Books

  • A symmetric formulation of the linear system arising in interior methods for convex optimization with bounded condition number

    Ghannad, Alexandre

    Montréal : École des hautes études commerciales. Groupe d'études et de recherche en analyse des décisions, 2020

    …for (1): ∇φ(x) + D₁²x − Aᵀy − z = 0 (3a), Ax + D₂²y = b (3b), Xz = µe (3c), (x, z) > 0 (3d), from which we eliminated r = D₂y. Linesearch-based interior methods for (1) apply Newton's method for nonlinear equations to (3). At an approximate solution (x, y, z) with (x, z) > 0, they compute search directions (Δx, Δy, Δz) from systems…

    Books
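
    The record above applies Newton's method for nonlinear equations to system (3). For reference, linearizing (3a)-(3c) about the current (x, y, z) gives the step equations the excerpt alludes to; this is a routine derivation from the system as reconstructed above, with H denoting the Hessian of φ:

      \begin{aligned}
      (H + D_1^2)\,\Delta x - A^T \Delta y - \Delta z &= -(\nabla\varphi(x) + D_1^2 x - A^T y - z),\\
      A\,\Delta x + D_2^2\,\Delta y &= b - Ax - D_2^2 y,\\
      Z\,\Delta x + X\,\Delta z &= \mu e - X z.
      \end{aligned}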

  • Solving unconstrained nonlinear programs using ACCPM

    Dehghani, Ahad

    [Montréal] : [GERAD HEC Montréal], 2012

    …Aᵀy + s = c. (26) Using the KKT conditions we get the system S⁻¹e − λ = 0, Aλ = 0, Aᵀy + s = c, s > 0, where S = diag(s₁, s₂, …, sₘ). Part of the challenge of computing the analytic center is that we are not given an initial point with s = c − Aᵀy > 0. Goffin and Mokhtarian [12] suggested using an infeasible Newton method…

    Books
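
    The record above quotes the KKT system of the analytic-center problem, i.e. minimize −∑ log sᵢ subject to Aᵀy + s = c. A one-step derivation with multiplier λ reproduces the quoted system:

      \begin{aligned}
      \mathcal{L}(y, s, \lambda) &= -\textstyle\sum_{i=1}^m \log s_i + \lambda^T (A^T y + s - c),\\
      \nabla_s \mathcal{L} = 0 &\;\Longrightarrow\; S^{-1} e - \lambda = 0, \qquad
      \nabla_y \mathcal{L} = 0 \;\Longrightarrow\; A \lambda = 0,\\
      &\phantom{\;\Longrightarrow\;} A^T y + s = c, \quad s > 0.
      \end{aligned}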

  • Deep LDA-pruned nets and their robustness

    Tian, Qing

    Montréal : École des hautes études commerciales. Groupe d'études et de recherche en analyse des décisions, 2020

    …VGG16). Note: Ori. means original nets, Gauss. represents Gaussian noise (stddev 5), speckle noise strength is 0.05. FGSM Attack: Fast Gradient Sign Method [5]. Newton Attack: NewtonFool Attack [14]. Accuracy difference (Acc Dif) under Gaussian noise, Ori. / Pruned: CIFAR100 −2.5% / −2.0%; Adience −0.5% / −0.1%; LFWA-G −5.2% / −4.2…

    Books

  • Continuous variable neighborhood search (C-VNS) for solving systems of nonlinear equations

    Montréal : École des hautes études commerciales. Groupe d'études et de recherche en analyse des décisions, 2018

    …main reason to have more than twenty heuristic approaches proposed in the literature, which additionally clearly indicates that exact solution methods cannot be the right choice for solving the NSE problem. There are many techniques for finding just one solution to the given system, such as Newton and quasi-Newton…

    Books

  • The conjugate residual method in linesearch and trust-region methods

    Dahito, Marie-Ange

    Montréal : École des hautes études commerciales. Groupe d'études et de recherche en analyse des décisions, 2019

    …particularly appealing in inexact Newton methods for optimization, typically used in a linesearch context. CR is also relevant in a trust-region context as it causes monotonic decrease of convex quadratic models (Fong and Saunders, 2012). We investigate modifications that make CR suitable, even in the presence of negative curvature…

    Books
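
    The record above studies the conjugate residual (CR) method. A minimal textbook implementation for a symmetric positive definite system Ax = b is sketched below, without the negative-curvature modifications the abstract investigates; it follows the standard recurrences, in which the residual norm decreases monotonically:

      import numpy as np

      def cr(A, b, x0=None, tol=1e-8, maxiter=200):
          """Conjugate residual method for symmetric positive definite A."""
          x = np.zeros_like(b, dtype=float) if x0 is None else x0.astype(float)
          r = b - A @ x
          p = r.copy()
          Ap = A @ p
          Ar = Ap.copy()                # since p = r initially
          rAr = r @ Ar
          for _ in range(maxiter):
              if np.linalg.norm(r) < tol:
                  break
              alpha = rAr / (Ap @ Ap)   # step minimizing ||r - alpha * A p||
              x = x + alpha * p
              r = r - alpha * Ap
              Ar = A @ r
              rAr_new = r @ Ar
              beta = rAr_new / rAr      # keeps directions A^2-conjugate
              rAr = rAr_new
              p = r + beta * p
              Ap = Ar + beta * Ap
          return x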

  • An interior-point method-based solver for simulation of aircraft parts riveting

    Stefanova, Maria

    Montréal : GERAD HEC Montréal, 2017

    …a polynomial-time algorithm for linear programming problems, and a competitor to the simplex method. Equivalence of the Karmarkar algorithm to a barrier Newton method was shown by Gill et al. (1986). We use this result to describe ideas of interior point methods for quadratic programming problems. In the barrier method, the…

    Books
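
    The record above introduces the barrier-method view of interior-point methods for quadratic programming. A minimal sketch of a damped Newton log-barrier iteration for min ½xᵀQx + cᵀx subject to x > 0; all parameter values are illustrative choices, not from the paper:

      import numpy as np

      def barrier_qp(Q, c, mu=1.0, shrink=0.2, outer=20, inner=30):
          """Damped Newton on the barrier subproblem
          min 0.5 x'Qx + c'x - mu * sum(log x), driving mu -> 0."""
          n = len(c)
          x = np.ones(n)                             # strictly feasible start
          for _ in range(outer):
              for _ in range(inner):
                  g = Q @ x + c - mu / x             # barrier gradient
                  H = Q + np.diag(mu / x**2)         # barrier Hessian
                  d = np.linalg.solve(H, -g)         # Newton direction
                  t = 1.0                            # damping: keep x > 0
                  while np.any(x + t * d <= 0):
                      t *= 0.5
                  x = x + 0.99 * t * d
                  if np.linalg.norm(g) < 1e-8:
                      break
              mu *= shrink                           # follow the central path
          return x

      # barrier_qp(np.eye(2), np.array([-1.0, -2.0])) approaches (1, 2),
      # the unconstrained minimizer, which is already interior.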

  • Convex nondifferentiable optimization: a survey focussed on the analytic center cutting plane method

    Montréal : GERAD HEC Montréal, 1999

    …this theory. Useful results from interior point methods: analytic centers can be found via a damped Newton's method. To this end we recall the derivatives of F: F′(y) = A S⁻¹e and F″(y) = A S⁻² Aᵀ. The Newton step with respect to F is p_y = −F″(y)⁻¹F′(y) = −(A S⁻² Aᵀ)⁻¹ A S⁻¹e. The analytic center is uniquely…

    Books
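
    The record above recalls F′(y) = A S⁻¹e and F″(y) = A S⁻² Aᵀ for F(y) = −∑ log sᵢ with s = c − Aᵀy. A minimal sketch of the resulting damped Newton iteration, assuming a strictly feasible starting point y0; the halving damping rule is an illustrative choice, not the survey's:

      import numpy as np

      def analytic_center(A, c, y0, iters=50, tol=1e-10):
          """Damped Newton on F(y) = -sum(log(c - A.T @ y));
          y0 must satisfy c - A.T @ y0 > 0."""
          y = y0.copy()
          for _ in range(iters):
              s = c - A.T @ y                # slacks, must stay positive
              g = A @ (1.0 / s)              # F'(y)  = A S^{-1} e
              H = (A / s**2) @ A.T           # F''(y) = A S^{-2} A.T
              p = np.linalg.solve(H, -g)     # Newton step p_y
              t = 1.0                        # damp to keep s > 0
              while np.any(c - A.T @ (y + t * p) <= 0):
                  t *= 0.5
              y = y + t * p
              if np.linalg.norm(g) < tol:
                  break
          return y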

  • Importance of data loading pipeline in training deep neural networks

    Zolnouri, Mahdi

    Montréal : École des hautes études commerciales. Groupe d'études et de recherche en analyse des décisions, 2020

    …computation. A training process optimizes a loss function, here L(w) = ∑ᵢ₌₁ⁿ yᵢ log σ(xᵢᵀw) + (1 − yᵢ) log(1 − σ(xᵢᵀw)), (1) where σ(x) = {1 + exp(−x)}⁻¹ is the sigmoid activation. For the case of logistic regression, iteratively re-weighted least squares is often used to optimize L, which is equivalent to Newton's method. Newton…

    Books
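
    The record above writes the logistic log-likelihood (1) and notes that iteratively re-weighted least squares (IRLS) is equivalent to Newton's method; as written, (1) is the log-likelihood, which the sketch below maximizes. The data are synthetic placeholders:

      import numpy as np

      def sigmoid(t):
          return 1.0 / (1.0 + np.exp(-t))

      def logistic_newton(X, y, iters=25, tol=1e-10):
          """Newton's method to maximize
          L(w) = sum_i y_i log s_i + (1 - y_i) log(1 - s_i), s_i = sigmoid(x_i'w).
          Each iteration solves the weighted least-squares (IRLS) system."""
          n, d = X.shape
          w = np.zeros(d)
          for _ in range(iters):
              s = sigmoid(X @ w)
              grad = X.T @ (y - s)                # gradient of L
              W = s * (1.0 - s)                   # IRLS weights
              H = X.T @ (X * W[:, None])          # X' W X = -Hessian of L
              w = w + np.linalg.solve(H, grad)    # Newton (ascent) step
              if np.linalg.norm(grad) < tol:
                  break
          return w

      # Synthetic placeholder data:
      rng = np.random.default_rng(0)
      X = rng.standard_normal((200, 3))
      y = (X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(200)) > 0
      w_hat = logistic_newton(X, y.astype(float))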

  • Implementing a smooth exact penalty function for equality-constrained nonlinear optimization

    Montréal : École des hautes études commerciales. Groupe d'études et de recherche en analyse des décisions, 2019

    …focused on linesearch schemes that require computing an explicit Hessian approximation and using it to compute a Newton direction. One of our goals is to show how to adapt the method to large-scale problems by taking advantage of computational advances made since Fletcher's proposal. Improved sparse matrix factorizations and…

    Books

  • A regularized factorization-free method for equality-constrained optimization

    Arreckx, Sylvain

    Montréal (Québec) Canada : GERAD HEC Montréal, 2016

    …a KKT point of (6) for any ρ ≥ 0 and δ > 0. Proof. Immediate, by direct comparison of (3) and (7). Sequential quadratic programming methods for (6) may be interpreted as applying Newton's method to (7). A Newton-like step for (7) from (xₖ, uₖ, yₖ) solves a block linear system in (Δx, Δu, Δy) whose coefficient matrix is built from Hₖ + ρₖI, Jₖᵀ, δₖI, and Jₖ blocks, with right-hand side beginning with gₖ…

    Books
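
    The record above interprets SQP as Newton's method applied to regularized optimality conditions. The sketch below is a generic illustration, not the paper's exact three-block system: it takes one Newton-like step on the commonly used regularized KKT system for min f(x) subject to c(x) = 0:

      import numpy as np

      def regularized_sqp_step(g, H, J, c, y, rho=1e-4, delta=1e-4):
          """One Newton-like step on regularized KKT conditions:
              [H + rho*I    J.T     ] [dx]     [g + J.T @ y]
              [J           -delta*I ] [dy] = - [c          ]
          (a generic regularization, not the exact system from the paper)."""
          n, m = H.shape[0], J.shape[0]
          K = np.block([[H + rho * np.eye(n), J.T],
                        [J, -delta * np.eye(m)]])
          rhs = -np.concatenate([g + J.T @ y, c])
          step = np.linalg.solve(K, rhs)
          return step[:n], step[n:]   # (dx, dy)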