This project gathers experience with multiple optimization problems at the core of Machine Learning (ML), which in turn serve as the driving force behind broader concepts in Artificial Intelligence (AI). The repository is organized into modules, each addressing a distinct optimization topic and providing Python implementations.
- Module 1 – Linear Programming (LP)
- Module 2 – Isoperimetric Optimization
- Module 3 – Piecewise Constant Signal Reconstruction
- Module 4 – Basis Pursuit
- Module 5 – Backtracking Line Search
- Module 6 – Newton's Method
- Module 7 – Levenberg–Marquardt Parameter Estimation
- Module 8 – Nonlinear Constrained Least Squares Optimization
- Module 9 – Quasi-Newton Method
- Module 10 – Linear Programming (LP) Solutions using Sequential Barrier Method (SBM)
- Bibliography
## Module 1 – Linear Programming (LP)

Examples of solving linear programming optimization problems using the CVXPY library in Python. The examples demonstrate different modeling techniques and solution approaches (see the sketch after this list):

- Scalar vs. vector-based formulations
- Alternative transformations of constraints
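For flavor, here is a minimal sketch of a small LP in the vector-based CVXPY style. The problem data (`c`, `A`, `b`) are illustrative and not taken from the module's scripts.

```python
import cvxpy as cp
import numpy as np

# Illustrative LP data for: maximize c^T x  s.t.  A x <= b, x >= 0
c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0],
              [2.0, 1.0]])
b = np.array([4.0, 5.0])

x = cp.Variable(2)                        # vector-based formulation
problem = cp.Problem(cp.Maximize(c @ x), [A @ x <= b, x >= 0])
problem.solve()

print("optimal value:", problem.value)    # 8.0 at the vertex x = (0, 4)
print("optimal x:", x.value)
```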
## Module 2 – Isoperimetric Optimization

Implementation of an isoperimetric optimization problem using convex programming techniques. The objective is to determine a function f(x) that maximizes the area under the curve (the integral of f(x)) while satisfying (a discretized sketch follows the list):
- A specified total length of the curve
- A maximum curvature constraint
- Passing through given fixed points
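One common discretization is sketched below. The grid size, the length budget `L`, the curvature bound `C`, and the zero endpoint values are all illustrative assumptions, and the curvature constraint is approximated by bounding second differences.

```python
import cvxpy as cp
import numpy as np

n = 100                                   # grid points on [0, 1]
xs = np.linspace(0.0, 1.0, n)
h = xs[1] - xs[0]
L = 1.5                                   # assumed total curve length budget
C = 20.0                                  # assumed curvature (2nd-difference) bound

f = cp.Variable(n)
area = h * cp.sum(f)                      # rectangle-rule area under the curve

constraints = [
    f[0] == 0.0, f[-1] == 0.0,            # assumed fixed points at the ends
    # Arc length: sum of segment lengths ||(h, f_{i+1} - f_i)||_2 <= L
    cp.sum(cp.norm(cp.vstack([h * np.ones(n - 1), cp.diff(f)]), 2, axis=0)) <= L,
    # Curvature proxy: bound the second differences of f
    cp.abs(cp.diff(f, 2)) <= C * h ** 2,
]

problem = cp.Problem(cp.Maximize(area), constraints)
problem.solve()
print("maximal area:", problem.value)
```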
## Module 3 – Piecewise Constant Signal Reconstruction

Two scripts that reconstruct piecewise constant signals from noisy measurements (a sketch of the LASSO variant follows the list):

- LASSO-based Optimization (`zad1.py`): uses the Least Absolute Shrinkage and Selection Operator (LASSO). Minimizes the L2 norm of the measurement errors with an L1 norm constraint on the signal's gradient to promote sparsity.
- Linear Programming-based Optimization (`zad2.py`): reformulates the signal reconstruction as a Linear Programming task. Minimizes discrepancies between the measured noisy signal and the estimated signal, enforcing piecewise constant behavior via linear constraints.
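The LASSO variant fits in a few lines of CVXPY. This sketch uses the penalized form (an L1 penalty with an assumed weight `lam`) rather than an explicit L1 constraint; the two are equivalent for a suitable choice of `lam`. The synthetic signal and noise level are also illustrative.

```python
import cvxpy as cp
import numpy as np

# Synthetic piecewise constant signal with additive Gaussian noise.
rng = np.random.default_rng(0)
true = np.concatenate([np.zeros(50), 2.0 * np.ones(50), -np.ones(50)])
y = true + 0.3 * rng.standard_normal(true.size)

x = cp.Variable(true.size)
lam = 5.0                                 # assumed regularization weight
# Least-squares data fit plus an L1 penalty on the first difference,
# which pushes most jumps of the estimate to exactly zero.
problem = cp.Problem(cp.Minimize(cp.sum_squares(y - x) + lam * cp.norm1(cp.diff(x))))
problem.solve()

print("RMSE vs. true signal:", np.sqrt(np.mean((x.value - true) ** 2)))
```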
## Module 4 – Basis Pursuit

Implementation of a Basis Pursuit problem using an overcomplete dictionary of Gabor basis functions (see the sketch after this list):
- A synthetic signal is generated with varying amplitude and phase.
- An overcomplete dictionary of Gabor functions is constructed.
- L1 regularization (Lasso) selects a sparse subset of these functions that best represents the signal.
- A refined approximation is obtained through a least-squares fit on the selected basis elements.
- The code evaluates the reconstruction quality via metrics (e.g., mean squared error, relative error) and visualizes the time-frequency distribution of both the original and reconstructed signals.
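A compressed sketch of the selection-and-refit pipeline. The test signal, the dictionary grid (centers, frequencies, window width), the weight `lam`, and the support threshold are all illustrative assumptions, and the L1 step is written in CVXPY.

```python
import numpy as np
import cvxpy as cp

t = np.linspace(0.0, 1.0, 200)
signal = np.exp(-((t - 0.5) ** 2) / 0.02) * np.sin(2 * np.pi * 12 * t)

# Overcomplete Gabor-style dictionary: Gaussian windows at several
# centers, modulated at several frequencies and two phases.
atoms = []
for center in np.linspace(0.0, 1.0, 10):
    window = np.exp(-((t - center) ** 2) / 0.02)
    for freq in (4, 8, 12, 16):
        atoms.append(window * np.cos(2 * np.pi * freq * t))
        atoms.append(window * np.sin(2 * np.pi * freq * t))
D = np.column_stack(atoms)

# L1-regularized (Lasso) selection of a sparse subset of atoms.
coeffs = cp.Variable(D.shape[1])
lam = 0.1                                 # assumed regularization weight
cp.Problem(cp.Minimize(cp.sum_squares(signal - D @ coeffs)
                       + lam * cp.norm1(coeffs))).solve()

# Refined approximation: least-squares refit on the selected atoms only.
support = np.abs(coeffs.value) > 1e-3
refit, *_ = np.linalg.lstsq(D[:, support], signal, rcond=None)
recon = D[:, support] @ refit
print("atoms selected:", int(support.sum()))
print("relative error:", np.linalg.norm(signal - recon) / np.linalg.norm(signal))
```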
## Module 5 – Backtracking Line Search

Implementation of the Backtracking Line Search algorithm, a common method for finding suitable step lengths in iterative optimization. It ensures a sufficient decrease in the objective function by checking the Armijo condition:

φ(s) ≤ φ(0) + α * s * φ'(0)

where:

- φ(s) is the value of the objective function at step length s,
- α ∈ (0, 1) is a parameter controlling the sufficient decrease condition,
- φ'(0) is the derivative of the objective function at s = 0.

A minimal sketch follows the list.
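In the sketch below, the shrink factor `beta`, the default `alpha = 0.3`, and the quadratic test function are illustrative assumptions.

```python
import numpy as np

def backtracking(f, grad, x, d, alpha=0.3, beta=0.8):
    """Shrink the step length s until the Armijo condition
    f(x + s*d) <= f(x) + alpha * s * grad(x)^T d holds."""
    s = 1.0
    fx = f(x)
    slope = grad(x) @ d                   # phi'(0) for phi(s) = f(x + s*d)
    while f(x + s * d) > fx + alpha * s * slope:
        s *= beta                         # geometric shrinking of the step
    return s

# Usage on f(x) = ||x||^2 with the steepest-descent direction.
f = lambda x: x @ x
grad = lambda x: 2.0 * x
x0 = np.array([2.0, -1.0])
print("accepted step:", backtracking(f, grad, x0, -grad(x0)))
```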
## Module 6 – Newton's Method

Comparison and implementation of two variations of Newton's method for nonlinear optimization problems (a sketch of the damped variant follows the list):

- Classic Newton's Method: uses the gradient and Hessian matrix to iteratively minimize a function.
- Damped Newton's Method: enhances the classic approach by incorporating a backtracking line search for better convergence properties.
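The damped variant is sketched below (the classic method is the special case that always keeps s = 1). The smooth convex test function and the α, β, and tolerance values are illustrative assumptions.

```python
import numpy as np

def f(x):                                 # smooth convex test function
    return np.log(np.exp(x[0]) + np.exp(-x[0])) + x[1] ** 2

def grad(x):
    return np.array([np.tanh(x[0]), 2.0 * x[1]])

def hess(x):
    return np.diag([1.0 - np.tanh(x[0]) ** 2, 2.0])

def damped_newton(x, tol=1e-8, alpha=0.3, beta=0.8):
    while True:
        g, H = grad(x), hess(x)
        step = np.linalg.solve(H, -g)     # Newton direction
        if -g @ step / 2.0 < tol:         # Newton decrement stopping rule
            return x
        s = 1.0                           # classic Newton always keeps s = 1
        while f(x + s * step) > f(x) + alpha * s * (g @ step):
            s *= beta                     # damping via backtracking
        x = x + s * step

print("minimizer:", damped_newton(np.array([3.0, 1.0])))  # approaches (0, 0)
```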
## Module 7 – Levenberg–Marquardt Parameter Estimation

Implementation of the Levenberg–Marquardt (LM) algorithm to estimate parameters of various models (a sketch of the sinusoidal case follows the list):

- Sinusoidal Function (`zad1.py`): y(t) = A * sin(ω * t + φ). Estimated parameters: amplitude A, angular frequency ω, phase shift φ.
- Damped Sinusoidal Function (`zad2.py`): y(t) = A * exp(-a * t) * sin(ω * t + φ). Estimated parameters: amplitude A, damping coefficient a, angular frequency ω, phase shift φ.
- First-Order Inertia (`zad3.py`): y(t) = k * (1 - exp(-t / T)). Estimated parameters: gain k, time constant T.
- Double Inertia (`zad4.py`): y(t) = k * [1 - (1 / (T1 - T2)) * (T1 * exp(-t / T1) - T2 * exp(-t / T2))]. Estimated parameters: gain k, time constants T1, T2.
- Second-Order Oscillatory System (`zad5.py`): y(t) = k * [1 - exp(-γ * t) * (cos(β * t) + (γ / β) * sin(β * t))]. Estimated parameters: gain k, damping factor γ, oscillation frequency β.
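A hand-rolled LM loop for the sinusoidal model might look as follows. The synthetic data, the initial guess, and the damping schedule (the μ update factors) are illustrative assumptions.

```python
import numpy as np

def model(p, t):
    A, w, phi = p
    return A * np.sin(w * t + phi)

def jacobian(p, t):                       # columns: d/dA, d/dw, d/dphi
    A, w, phi = p
    s, c = np.sin(w * t + phi), np.cos(w * t + phi)
    return np.column_stack([s, A * t * c, A * c])

rng = np.random.default_rng(1)
t = np.linspace(0.0, 4.0, 200)
y = model([2.0, 3.0, 0.5], t) + 0.05 * rng.standard_normal(t.size)

p, mu = np.array([1.5, 2.5, 0.0]), 1e-2
for _ in range(100):
    r = y - model(p, t)
    J = jacobian(p, t)
    # LM step: solve (J^T J + mu * I) dp = J^T r
    dp = np.linalg.solve(J.T @ J + mu * np.eye(3), J.T @ r)
    if np.sum((y - model(p + dp, t)) ** 2) < np.sum(r ** 2):
        p, mu = p + dp, mu * 0.7          # accept the step, reduce damping
    else:
        mu *= 2.0                         # reject the step, increase damping

print("estimated A, omega, phi:", p)      # typically approaches 2.0, 3.0, 0.5
```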
## Module 8 – Nonlinear Constrained Least Squares Optimization

Nonlinear constrained optimization algorithms using the Augmented Lagrangian Algorithm (ALA) combined with the Levenberg–Marquardt (LM) method (a simplified ALA loop is sketched after the list):

- `zad1.py`: solves a 2D nonlinear least squares problem with a single nonlinear constraint using ALA and LM. Includes residual visualization.
- `zad2.py`: compares the Augmented Lagrangian Algorithm (ALA) and the penalty method on a 3D constrained optimization problem, visualizing residual convergence and parameter evolution.
- `zad3.py`: a Boolean least squares problem minimizing ||Ax - b||^2 with the elements of x restricted to +1 or -1. Compares brute-force and ALA solutions.
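A simplified version of the ALA outer loop for an equality-constrained least-squares problem is sketched below. The residuals, the constraint g(x) = x1^2 + x2^2 - 1 = 0, and the multiplier/penalty schedules are illustrative; for brevity the inner minimization uses scipy.optimize.minimize, whereas the module's scripts use an LM solver.

```python
import numpy as np
from scipy.optimize import minimize

def residuals(x):                         # least-squares residuals
    return np.array([x[0] - 2.0, x[1] - 1.0])

def g(x):                                 # single nonlinear equality constraint
    return x[0] ** 2 + x[1] ** 2 - 1.0

x, lam, mu = np.array([0.0, 0.0]), 0.0, 1.0
for _ in range(20):
    # Inner problem: minimize ||r(x)||^2 + lam * g(x) + (mu / 2) * g(x)^2
    aug = lambda z: residuals(z) @ residuals(z) + lam * g(z) + 0.5 * mu * g(z) ** 2
    x = minimize(aug, x).x
    lam += mu * g(x)                      # first-order multiplier update
    mu *= 2.0                             # stiffen the penalty term

print("solution:", x)                     # closest point to (2, 1) on the circle
print("constraint violation:", g(x))
```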
## Module 9 – Quasi-Newton Method

Scripts to find the minimum of a two-variable function using quasi-Newton optimization methods (a BFGS sketch follows the list):

- SR1 (Symmetric Rank One)
- DFP (Davidon–Fletcher–Powell)
- BFGS (Broyden–Fletcher–Goldfarb–Shanno)
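As a representative of the family, below is a sketch of the BFGS variant with a simple Armijo backtracking step; SR1 and DFP differ only in the update formula for the Hessian approximation. The Rosenbrock test function and all tolerances here are illustrative assumptions.

```python
import numpy as np

def f(x):                                 # Rosenbrock test function
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def grad(x):
    return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                     200 * (x[1] - x[0] ** 2)])

x, H = np.array([-1.2, 1.0]), np.eye(2)   # H approximates the inverse Hessian
for _ in range(200):
    g = grad(x)
    if np.linalg.norm(g) < 1e-8:
        break
    d = -H @ g                            # quasi-Newton search direction
    t = 1.0                               # Armijo backtracking line search
    while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
        t *= 0.5
    s = t * d
    y = grad(x + s) - g
    x = x + s
    if y @ s > 1e-12:                     # curvature condition guards the update
        rho = 1.0 / (y @ s)
        I = np.eye(2)
        # BFGS inverse update: H <- (I - rho*s*y^T) H (I - rho*y*s^T) + rho*s*s^T
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)

print("minimizer:", x)                    # typically approaches (1, 1)
```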
## Module 10 – Linear Programming (LP) Solutions using Sequential Barrier Method (SBM)

Python scripts demonstrating solutions to Linear Programming problems via the Sequential Barrier Method (SBM). Results are compared with standard LP solvers (e.g., `linprog` from SciPy). A minimal sketch is given below.
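One way to sketch the method: Newton centering on the barrier objective t * c^T x - sum(log(b_i - a_i^T x)), with t increased geometrically between centering rounds. The LP data, the t schedule, and the tolerances are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative LP: minimize c^T x  s.t.  A x <= b (includes x >= 0 rows)
c = np.array([1.0, 2.0])
A = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 1.0]])
b = np.array([0.0, 0.0, 4.0])

x, t = np.array([1.0, 1.0]), 1.0          # strictly feasible starting point
for _ in range(8):                         # outer loop: sharpen the barrier
    for _ in range(50):                    # inner loop: Newton centering
        slack = b - A @ x                  # must stay strictly positive
        g = t * c + A.T @ (1.0 / slack)    # gradient of the barrier objective
        H = A.T @ np.diag(1.0 / slack ** 2) @ A
        step = np.linalg.solve(H, -g)
        if -g @ step / 2.0 < 1e-10:        # Newton decrement stopping rule
            break
        s = 1.0                            # backtrack to stay strictly feasible
        while np.any(b - A @ (x + s * step) <= 0):
            s *= 0.5
        x = x + s * step
    t *= 10.0                              # increase t between centering rounds

print("barrier solution:", x)
print("scipy linprog:   ", linprog(c, A_ub=A, b_ub=b).x)
```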
## Bibliography

- A. Ben-Tal and A. Nemirovski. *Lectures on Modern Convex Optimization*. SIAM, 2001.
- Stephen Boyd and Lieven Vandenberghe. *Convex Optimization*. Cambridge University Press, New York, NY, USA, 2004. Available online: http://web.stanford.edu/~boyd/cvxbook/
- Stephen Boyd and Lieven Vandenberghe. *Additional Exercises for Convex Optimization*, 2004. https://web.stanford.edu/~boyd/cvxbook/bv_cvxbook_extra_exercises.pdf
- G.C. Calafiore and L. El Ghaoui. *Optimization Models*. Cambridge University Press, 2014.
- E.K.P. Chong and S.H. Zak. *An Introduction to Optimization*. Wiley, 2004.
- Ulrich Münz, Amer Mešanović, Michael Metzger, and Philipp Wolfrum. Robust optimal dispatch, secondary, and primary reserve allocation for power systems with uncertain load and generation. *IEEE Transactions on Control Systems Technology*, 26(2):475–485, 2018.
- Course materials from Optimization Methods for the Master's program in Computer Science at WUT.