L. D. Macdonald

ORCID: 0009-0009-9972-5557
About
Research Areas
  • Advanced Chemical Physics Studies
  • Advanced Optimization Algorithms Research
  • High-pressure geophysics and materials
  • Advanced Physical and Chemical Molecular Interactions
  • Graphene research and applications
  • Inorganic Chemistry and Materials
  • Sparse and Compressive Sensing Techniques
  • Radiative Heat Transfer Studies
  • Ammonia Synthesis and Nitrogen Reduction
  • Stochastic Gradient Optimization Techniques
  • Numerical methods in inverse problems
  • Spectroscopy and Quantum Chemical Studies

University of Toronto
1987-1992

The Harbola-Sahni exchange potential is the work needed to move an electron against the electric field of its hole charge distribution. We prove that it is not exact in density-functional theory by showing that it yields the wrong second-order gradient expansion in the slowly varying limit. But we also discover that it yields the correct local-density approximation. Thus it is a more physical version of the Slater potential, one better suited for molecular and solid-state applications. As a step in our derivation, we present the third-order density expansion and discuss...

10.1103/physreva.41.78 article EN Physical Review A 1990-01-01
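
For reference, a sketch of the work formalism the abstract describes: the potential is obtained as the work done against the electric field of the exchange (Fermi) hole charge. The symbols $\rho_x(\mathbf{r},\mathbf{r}')$, $\boldsymbol{\mathcal{E}}_x$, and $W_x$ and the sign convention below are standard textbook notation, assumed here rather than taken from the paper itself:

\boldsymbol{\mathcal{E}}_x(\mathbf{r})
  = \int \rho_x(\mathbf{r},\mathbf{r}')\,
    \frac{\mathbf{r}-\mathbf{r}'}{|\mathbf{r}-\mathbf{r}'|^{3}}\,
    \mathrm{d}\mathbf{r}',
\qquad
W_x(\mathbf{r})
  = -\int_{\infty}^{\mathbf{r}}
    \boldsymbol{\mathcal{E}}_x(\mathbf{r}'')\cdot \mathrm{d}\boldsymbol{\ell}''.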

Abstract This paper studies the convergence properties of a family of Relaxed $\ell$-Minimal Gradient Descent methods for quadratic optimization, which includes the omnipresent Steepest Descent method as well as the Minimal Gradient method. Simple proofs are provided that show that, in an appropriately chosen norm, the gradient and the distance of the iterates from optimality converge linearly for all members of the family. Moreover, function values decrease...

10.1007/s10589-025-00670-3 article EN cc-by Computational Optimization and Applications 2025-03-06
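
To make the family concrete, here is a minimal, hypothetical sketch (not code from the paper) of relaxed gradient descent on a quadratic f(x) = ½ xᵀAx − bᵀx with symmetric positive definite A. The "steepest" rule uses the classical exact-line-search step, the "minimal" rule picks the step that minimizes the norm of the next gradient, and a fixed relaxation factor theta scales the step; all names and defaults are illustrative assumptions.

import numpy as np

def relaxed_gradient_descent(A, b, x0, rule="steepest", theta=1.0,
                             tol=1e-10, max_iter=1000):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive definite A."""
    x = np.asarray(x0, dtype=float).copy()
    for k in range(max_iter):
        g = A @ x - b                      # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        Ag = A @ g
        if rule == "steepest":
            alpha = (g @ g) / (g @ Ag)     # exact line search along -g
        else:
            alpha = (g @ Ag) / (Ag @ Ag)   # minimizes the next gradient norm
        x -= theta * alpha * g             # relaxed update (theta = 1: unrelaxed)
    return x, k

# toy usage on a small SPD system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_star, iters = relaxed_gradient_descent(A, b, np.zeros(2), rule="minimal", theta=0.9)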

This paper studies the convergence properties of a family of Relaxed $\ell$-Minimal Gradient Descent methods for quadratic optimization, which includes the omnipresent Steepest Descent method as well as the Minimal Gradient method. Simple proofs are provided that show that, in an appropriately chosen norm, the gradient and the distance of the iterates from optimality converge linearly for all members of the family. Moreover, function values decrease and iteration complexity results are provided. All theoretical results hold when (fixed) relaxation is employed. It also...

10.48550/arxiv.2404.19255 preprint EN arXiv (Cornell University) 2024-04-30