@mdmitry1
Last active January 2, 2026 17:51
Eggholder Function Optimization

This project demonstrates global optimization techniques using SciPy on the Eggholder function, a complex mathematical function commonly used for testing optimization algorithms.

Overview

The Eggholder function is a challenging optimization problem with many local minima, making it an excellent benchmark for testing global optimization algorithms.
[Figure: 3D surface plot of the Eggholder function]

f(x₁, x₂) = -(x₂ + 47) · sin(√|x₁/2 + x₂ + 47|) - x₁ · sin(√|x₁ - (x₂ + 47)|)

Global minimum: f(x*) = -959.6407, at x* = (512, 404.2319)

This implementation:

  • Visualizes the Eggholder function in 3D
  • Generates a dataset of function values
  • Compares two global optimization methods: SHGO and Dual Annealing
  • Sorts and displays results
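The comparison described above can be sketched with SciPy's shgo and dual_annealing directly (the script's exact solver parameters are not shown in this README, so the values below are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import shgo, dual_annealing

def eggholder(x):
    # Standard Eggholder definition; global minimum f(512, 404.2319) = -959.6407
    return (-(x[1] + 47.0) * np.sin(np.sqrt(abs(x[0] / 2.0 + x[1] + 47.0)))
            - x[0] * np.sin(np.sqrt(abs(x[0] - (x[1] + 47.0)))))

bounds = [(-512.0, 512.0), (-512.0, 512.0)]

# SHGO with Sobol sampling tends to locate the global minimum on this function
res_shgo = shgo(eggholder, bounds, n=64, iters=3, sampling_method='sobol')

# Dual annealing is stochastic; fix the seed for reproducibility
res_da = dual_annealing(eggholder, bounds, seed=42)

print(f"SHGO:           f = {res_shgo.fun:.4f} at {res_shgo.x}")
print(f"Dual annealing: f = {res_da.fun:.4f} at {res_da.x}")
```

Both methods handle the many local minima far better than a local optimizer started from a single point would.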

Files

  • optimization_ex.py - Main script that visualizes and optimizes the Eggholder function

Requirements

pip install -r requirements.txt

Usage

Running the Optimization

python3 optimization_ex.py

This will:

  1. Generate a 3D plot of the Eggholder function (closes automatically after 5 seconds)
  2. Create dataset.txt containing X1, X2, Y1 coordinates
  3. Sort the dataset by the objective function value
  4. Run two optimization algorithms and compare results
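Steps 2 and 3 can be sketched as follows; the actual column layout and grid resolution of dataset.txt are not documented here, so those details are assumptions:

```python
import numpy as np

def eggholder(x1, x2):
    # Vectorized Eggholder function
    return (-(x2 + 47.0) * np.sin(np.sqrt(np.abs(x1 / 2.0 + x2 + 47.0)))
            - x1 * np.sin(np.sqrt(np.abs(x1 - (x2 + 47.0)))))

# Step 2: sample the function on a grid and write X1, X2, Y1 columns
x = np.linspace(-512.0, 512.0, 50)
x1, x2 = np.meshgrid(x, x)
data = np.column_stack([x1.ravel(), x2.ravel(), eggholder(x1, x2).ravel()])
np.savetxt("dataset.txt", data, header="X1 X2 Y1", fmt="%.6f")

# Step 3: sort rows by the objective value (third column)
data_sorted = data[data[:, 2].argsort()]
np.savetxt("dataset_sorted.txt", data_sorted, header="X1 X2 Y1", fmt="%.6f")
```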

Validation (dataset creation and SHGO optimization):

pytest

Non-linear Constraints: Cattle Feed Problem (HS73)

Problem Description

This is a classic non-linear optimization problem from Hock and Schittkowski's test problem collection (problem 73), also known as the cattle-feed problem. It demonstrates constrained optimization with both linear and non-linear constraints.

Mathematical Formulation

Objective Function

Minimize:
f(x) = 24.55x₁ + 26.75x₂ + 39x₃ + 40.50x₄

Constraints

Linear Constraint:
2.3x₁ + 5.6x₂ + 11.1x₃ + 1.3x₄ - 5 ≥ 0

Non-linear Constraint:
12x₁ + 11.9x₂ + 41.8x₃ + 52.1x₄ - 1.645√(0.28x₁² + 0.19x₂² + 20.5x₃² + 0.62x₄²) - 21 ≥ 0

Equality Constraint:
x₁ + x₂ + x₃ + x₄ - 1 = 0

Bounds:
0 ≤ xᵢ ≤ 1 for all i ∈ {1, 2, 3, 4}

Problem Context

This problem represents a cattle feed mixture optimization where:

  • Decision variables (x₁, x₂, x₃, x₄): proportions of four different feed ingredients
  • Objective: Minimize the total cost of the feed mixture
  • Constraints: Ensure nutritional requirements are met while maintaining mixture proportions

The equality constraint ensures that the proportions sum to 1 (100% of the mixture).

Optimal Solution

The approximate optimal solution is:

x₁ ≈ 0.6355216
x₂ ≈ -0.12 × 10⁻¹¹ (essentially 0)
x₃ ≈ 0.3127019
x₄ ≈ 0.05177655

f(x*) ≈ 29.894378

Key Features

  • Problem Type: Non-linear programming (NLP)
  • Difficulty: The non-linear constraint, involving the square root of a weighted sum of squares, makes the feasible region non-convex
  • Dimensions: 4 variables, 4 constraints (3 inequality, 1 equality)
  • Applications: Feed formulation, mixture problems, portfolio optimization

Implementation Notes

To solve this problem, you would typically use:

  • Non-linear optimization solvers (e.g., IPOPT, SNOPT, fmincon)
  • Sequential Quadratic Programming (SQP) methods
  • Interior point methods

The problem requires careful handling of the non-linear constraint; in particular, the square-root term is non-differentiable where its argument vanishes (i.e., at x = 0), which can trouble gradient-based solvers.
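As one concrete illustration of the SQP route mentioned above, HS73 can be solved with SciPy's SLSQP method. This is a sketch under the formulation given earlier, not a reference implementation:

```python
import numpy as np
from scipy.optimize import minimize

c = np.array([24.55, 26.75, 39.0, 40.50])  # feed-ingredient costs

def objective(x):
    return c @ x

constraints = [
    # Linear nutritional constraint: 2.3x1 + 5.6x2 + 11.1x3 + 1.3x4 - 5 >= 0
    {'type': 'ineq',
     'fun': lambda x: 2.3*x[0] + 5.6*x[1] + 11.1*x[2] + 1.3*x[3] - 5.0},
    # Non-linear constraint with the square-root term
    {'type': 'ineq',
     'fun': lambda x: (12.0*x[0] + 11.9*x[1] + 41.8*x[2] + 52.1*x[3]
                       - 1.645*np.sqrt(0.28*x[0]**2 + 0.19*x[1]**2
                                       + 20.5*x[2]**2 + 0.62*x[3]**2)
                       - 21.0)},
    # Proportions must sum to 1
    {'type': 'eq', 'fun': lambda x: x.sum() - 1.0},
]

res = minimize(objective, x0=np.full(4, 0.25), method='SLSQP',
               bounds=[(0.0, 1.0)] * 4, constraints=constraints)
print(res.x, res.fun)  # the known optimum is f(x*) ≈ 29.894378 (see above)
```

Starting from the feasible interior point x = (0.25, 0.25, 0.25, 0.25) avoids the non-differentiable origin noted above.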

References

[1] Hock, W. and Schittkowski, K., "Test Examples for Nonlinear Programming Codes", Lecture Notes in Economics and Mathematical Systems, Vol. 187, Springer-Verlag, 1981.

#!/usr/bin/python3.14
# Bayesian optimization example, adapted from:
# https://www.geeksforgeeks.org/artificial-intelligence/bayesian-optimization-in-machine-learning/
from matplotlib import pyplot as plt
from skopt import gp_minimize
from skopt.space import Real
from skopt.plots import plot_convergence
from hashlib import sha256
from sys import argv
from math import inf

# Objective function to minimize: a convex bowl with its minimum at (2, 3)
def objective_function(x):
    return (x[0] - 2) ** 2 + (x[1] - 3) ** 2

def main(timeout: float = inf):
    # Define the search space
    space = [Real(0.0, 5.0, name='x1'),  # Continuous range for x1
             Real(0.0, 5.0, name='x2')]  # Continuous range for x2

    # Perform Bayesian optimization with a Gaussian-process surrogate
    result = gp_minimize(objective_function,  # The function to minimize
                         space,               # The search space
                         n_calls=100,         # Number of function evaluations
                         xi=0.0001,           # Exploration/exploitation trade-off
                         random_state=42)     # Seed for reproducibility

    # Plot convergence; close the window automatically after `timeout` ms
    plt.switch_backend('Qt5Agg')
    fig, ax = plt.subplots(figsize=(10, 6))
    plot_convergence(result, ax=ax)
    if timeout != inf:
        timer = fig.canvas.new_timer(interval=timeout,
                                     callbacks=[(plt.close, [], {})])
        timer.start()
    plt.show()

    # Print the best parameters and the corresponding minimum value
    pprint_res = (f"Best parameters: x1 = {result.x[0]:.3f}, x2 = {result.x[1]:.3f}\n"
                  f"Minimum value: {result.fun:.3f}")
    print(pprint_res)
    return sha256(pprint_res.encode()).hexdigest()

if __name__ == "__main__":
    print(main()) if len(argv) < 2 else print(main(float(argv[1])))
