Optimization with Constraints - Python Numerical Methods


Optimization of an Equation for Max or Min with Constraints

This page covers optimization using SciPy's minimize function and the PuLP library. First we will use minimize to find the maximum or minimum of an objective function. The programs for maxima and minima are the same except that, for a maximum, the objective function is multiplied by negative one: SciPy only minimizes, so a maximization problem must be converted by negating the objective. The second topic is the use of PuLP for optimizing linear equations. Note that, in typical textbook problems, constraints of the form aX + bY >= C accompany minimization problems, while constraints of the form aX + bY <= C accompany maximization problems. This page was updated in April 2026. Suggested reading is Numerical Methods for Engineers, Fifth Edition, by Chapra and Canale.
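As a minimal illustration of the negation trick (a sketch using a made-up one-variable function, not one of the examples below): to maximize g(x) = 5 - (x - 3)^2, minimize its negative and then negate the result back.

```python
import numpy as np
from scipy.optimize import minimize

# Toy function for illustration: g(x) = 5 - (x - 3)^2, maximized at x = 3.
def neg_g(v):
    return -(5 - (v[0] - 3)**2)   # negate so that minimize finds the maximum

res = minimize(neg_g, x0=np.array([0.0]))
print("argmax x:", res.x[0])      # close to 3
print("max value:", -res.fun)     # negate back; close to 5
```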



Two Examples of Minimum Optimization with Constraining Equations

Minimum optimization with inequality constraints. Example 1.
min f(x,y,z) = log(x**2+1) + y**4 + x*z
subject to: x**3 - y**2 >= 1
x >= 0, z >= 0


import numpy as np
from scipy.optimize import minimize
# Objective function
def f(vars):
    x, y, z = vars
    return np.log(x**2 + 1) + y**4 + x*z
# Inequality constraints (SciPy uses form g(x) >= 0)
cons = [
    {"type": "ineq", "fun": lambda v: v[0]**3 - v[1]**2 - 1},  # x^3 - y^2 - 1 >= 0
    {"type": "ineq", "fun": lambda v: v[0]},                  # x >= 0
    {"type": "ineq", "fun": lambda v: v[2]},                  # z >= 0
]
# Initial guess
x0 = np.array([1.5, 0.1, 0.1])
# Run optimization (function, initial guess, and constraints)
result = minimize(f, x0, constraints=cons)
print("Success:", result.success)
print("Optimal variables (x, y, z):", result.x)
print("Minimum value:", result.fun)

#Output
#Success: True
#Optimal variables (x, y, z): [ 9.99999994e-01  1.32265285e-06 -1.30115393e-11]
#Minimum value: 0.6931471750392096
#The output rounds to [1,0,0].
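The rounded answer can be checked by hand: f is increasing in x and z for x, z >= 0, so the constraint forces x = (1 + y^2)^(1/3); the best choice is then y = 0, z = 0, x = 1, giving f = log(2) ≈ 0.6931, which matches result.fun. A quick numeric confirmation of that value (a check added here, not part of the original listing):

```python
import numpy as np

# Evaluate the objective at the reported optimum (x, y, z) = (1, 0, 0).
x, y, z = 1.0, 0.0, 0.0
f_val = np.log(x**2 + 1) + y**4 + x*z
print(f_val)                  # log(2) = 0.6931471805599453
print(x**3 - y**2 >= 1)       # constraint holds (with equality) -> True
```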
      

Table of constraints and SciPy equivalents:

Constraint form      SciPy constraint dictionary
g(x) >= 0            {'type': 'ineq', 'fun': g}
g(x) <= c            {'type': 'ineq', 'fun': lambda x: c - g(x)}
g(x) >= c            {'type': 'ineq', 'fun': lambda x: g(x) - c}
h(x) = c             {'type': 'eq',   'fun': lambda x: h(x) - c}



***************************************************************************************





Second example. We use the SLSQP minimization method because it supports both equality and inequality constraints, with an initial guess of [0.5, 0.5]. We will also plot the equations.

Objective function:
Minimize f(x, y) = x^2 + y^2

Subject to constraints:
Equality: ( x + y = 1 )
Inequality: ( x - y >= 1 )

  
import numpy as np
from scipy.optimize import minimize

# Objective function: f(x, y) = x^2 + y^2
def objective(vars):
    x, y = vars
    return x**2 + y**2

# Equality constraint: x + y = 1
def constraint_eq(vars):
    x, y = vars
    return x + y - 1

# Inequality constraint: x - y >= 1  -->  x - y - 1 >= 0
def constraint_ineq(vars):
    x, y = vars
    return x - y - 1

# Define constraints in scipy format
constraints = [
    {'type': 'eq', 'fun': constraint_eq},     # equality
    {'type': 'ineq', 'fun': constraint_ineq}  # inequality
]

# Initial guess
x0 = np.array([0.5, 0.5])

# Run optimization
result = minimize(
    objective,
    x0,
    method='SLSQP',  # Sequential Least Squares Programming
    constraints=constraints,
    options={'disp': True}
)

# Output results
if result.success:
    print("Optimal solution found:")
    """ :.4f — Formats the number as a floating-point
value with exactly 4 digits after the decimal point."""
    print(f"x = {result.x[0]:.4f}, y = {result.x[1]:.4f}")
    print(f"Min. value of objective func = {result.fun:.4f}")
else:
    print("Optimization failed:", result.message)

#Output:
#Optimization terminated successfully
#Optimal solution found:
#x = 1.0000, y = 0.0000
#Min. value of objective func = 1.0000
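A quick hand check (not in the original): substituting the equality constraint y = 1 - x into f gives x^2 + (1 - x)^2, and the inequality x - y >= 1 reduces to 2x - 1 >= 1, i.e. x >= 1. The derivative 4x - 2 is positive for x >= 1, so the minimum lies on the boundary at x = 1, y = 0, f = 1, matching the solver. A small grid scan confirms this:

```python
import numpy as np

# Substitute y = 1 - x (equality constraint) into f(x, y) = x^2 + y^2,
# and scan the feasible region x >= 1 (from x - y >= 1 with y = 1 - x).
x = np.linspace(1, 3, 2001)
f = x**2 + (1 - x)**2
i = np.argmin(f)
print("x =", x[i], " y =", 1 - x[i], " f =", f[i])   # x = 1.0  y = 0.0  f = 1.0
```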

 

   
#Plot the three eq.: f(x, y) = x^2 + y^2, x + y = 1 ,x - y = 1
import numpy as np
import matplotlib.pyplot as plt

# Create grid
x = np.linspace(-3, 3, 400)
y = np.linspace(-3, 3, 400)
X, Y = np.meshgrid(x, y)

# Function f(x, y) = x^2 + y^2
Z = X**2 + Y**2

# Plot contour
plt.figure(figsize=(7, 7))
contours = plt.contour(X, Y, Z, levels=15, cmap='viridis')
plt.clabel(contours, inline=True, fontsize=8)

# Plot lines
plt.plot(x, 1 - x, 'r', label='x + y = 1')
plt.plot(x, x - 1, 'b', label='x - y = 1')

# Labels and legend
plt.xlabel('x')
plt.ylabel('y')
plt.title('Contour f(x, y) = x² + y² with x+y=1 and x−y=1')
plt.legend()
plt.grid(True)
plt.axis('equal')

plt.show()  
  

Three-equation plot, including the contour plot: f(x, y) = x² + y² with x + y = 1 and x − y = 1

Two Examples of Maximization Optimization with Constraining Equations

This first example uses maximization optimization with one constraining equation. The program below has explanatory comments. The steps are the same as above; the difference is that we will be maximizing the objective function rather than minimizing it. Because SciPy only supports minimization, we multiply the objective function by -1: maximizing f is the same as minimizing -f.

SciPy does not have a maximization function; it minimizes, so you convert to a maximization problem by negating the coefficients of the objective function.
Objective (maximize): f(x,y) = 3x + 5y
Equality constraint: 3x + 2y = 18
Bounds: 0 <= x <= 4; 0 <= y <= 6
SciPy minimize formulation:
Maximize (3x + 5y) -> Minimize -(3x + 5y)


import numpy as np
from scipy.optimize import minimize

# Objective: maximize 3x + 5y ->  minimize -(3x + 5y)
def obj(vars):
    x, y = vars
    return -(3*x + 5*y)

# Equality constraint: 3x + 2y = 18
def constraint_eq(vars):
    x, y = vars
    return 3*x + 2*y - 18

# Bounds: 0 ≤ x ≤ 4, 0 ≤ y ≤ 6
b = [(0, 4), (0, 6)]

# Define constraints in SciPy format
con = ({
    'type': 'eq',
    'fun': constraint_eq
})

# Initial guess
x0 = np.array([1, 1])

# Solve
result=minimize(obj,x0,method='SLSQP',bounds=b,constraints=con)
print("Optimal x, y:", result.x)
#Notice that we take negative of result.fun
print("Maximum value:", -result.fun)

#Output:
#Optimal x, y: [2. 6.]
#Maximum value: 36.0
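Because this problem is linear, scipy.optimize.linprog can solve it directly as a cross-check (a sketch; linprog also minimizes, so the objective coefficients are negated):

```python
import numpy as np
from scipy.optimize import linprog

# Maximize 3x + 5y  ->  minimize -(3x + 5y)
c = [-3, -5]
A_eq = [[3, 2]]              # 3x + 2y = 18
b_eq = [18]
bounds = [(0, 4), (0, 6)]    # 0 <= x <= 4, 0 <= y <= 6

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print("Optimal x, y:", res.x)       # [2. 6.]
print("Maximum value:", -res.fun)   # 36.0
```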

   

You can plot the constraint line 3x + 2y = 18, the box constraints x = 4, y = 6, and the objective function f(x,y) = 3x + 5y (as contour lines) in one Python figure.

Plot of f(x,y) = 3x + 5y with x = 4, y = 6, and 3x + 2y = 18



*******************************************************************************************





Second example. Maximum optimization with constraining equations.
Since scipy.optimize.minimize minimizes functions, we can maximize by minimizing the negative of the objective function.

Objective function to maximize: f(x,y)= 150x+175y
Constraints: 7x+11y<=77; 10x+8y<=80; x<=9; y<=6; x>=0; y>=0

SciPy minimize formulation:
Objective: We negate the objective so that minimize will find the maximum.
Constraints: In SciPy, inequality constraints are written so that fun(x) >= 0.
For example, 7x + 11y <= 77 becomes 77 - (7x + 11y) >= 0.
Bounds: Here, the bounds are implemented as inequality constraints, but you could also use bounds=[(0,9),(0,6)] for x and y.


import numpy as np
from scipy.optimize import minimize

# Objective function (negative for maximization)
def objective(vars):
    x, y = vars
    return -(150 * x + 175 * y)  # negate to maximize

# Inequality constraints: each must be >= 0 in SciPy
con = [
    {'type': 'ineq', 'fun': lambda v: 77 - (7*v[0] + 11*v[1])},  # 7x + 11y <= 77
    {'type': 'ineq', 'fun': lambda v: 80 - (10*v[0] + 8*v[1])},  # 10x + 8y <= 80
    {'type': 'ineq', 'fun': lambda v: 9 - v[0]},                 # x <= 9
    {'type': 'ineq', 'fun': lambda v: 6 - v[1]},                 # y <= 6
    {'type': 'ineq', 'fun': lambda v: v[0]},                     # x >= 0
    {'type': 'ineq', 'fun': lambda v: v[1]}                      # y >= 0
]

# Initial guess
x0 = [0, 0]

# Solve
result = minimize(objective,x0,method='SLSQP',constraints=con)

if result.success:
    x_opt, y_opt = result.x
    max_value = -result.fun  # negate to get max
    print(f"Optimal solution: x = {x_opt:.4f}, y = {y_opt:.4f}")
    print(f"Maximum value of objective function: {max_value:.4f}")
else:
    print("Optimization failed:", result.message)

#Output:
#Optimal solution: x = 4.8889, y = 3.8889
#Maximum value of objective function: 1413.8889
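Since all constraints here are also linear, the answer can be cross-checked with scipy.optimize.linprog (a sketch; the exact optimum is x = 44/9, y = 35/9, with value 12725/9 ≈ 1413.8889):

```python
import numpy as np
from scipy.optimize import linprog

# Maximize 150x + 175y  ->  minimize -(150x + 175y)
c = [-150, -175]
A_ub = [[7, 11],             # 7x + 11y <= 77
        [10, 8]]             # 10x + 8y <= 80
b_ub = [77, 80]
bounds = [(0, 9), (0, 6)]    # 0 <= x <= 9, 0 <= y <= 6

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("Optimal x, y:", res.x)       # approximately [4.8889, 3.8889]
print("Maximum value:", -res.fun)   # approximately 1413.8889
```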


#Plot
import numpy as np
import matplotlib.pyplot as plt

# Create x range
x = np.linspace(0, 12, 400)

# Constraint lines
y1 = (77 - 7*x) / 11          # 7x + 11y = 77
y2 = (80 - 10*x) / 8          # 10x + 8y = 80

# Vertical and horizontal boundaries
x_bound = 9
y_bound = 6

"""Objective function line (choose a constant value, e.g., f = 150*0 + 175*0 = 0)
To show direction, pick a higher constant, e.g., f = 150*9 + 175*6"""
f_const = 150*9 + 175*6
y_obj = (f_const - 150*x) / 175

plt.figure(figsize=(8,6))

# Plot constraint lines
plt.plot(x, y1, label="7x + 11y = 77")
plt.plot(x, y2, label="10x + 8y = 80")
plt.axvline(x_bound, color='purple', label="x = 9")
plt.axhline(y_bound, color='orange', label="y = 6")

# Plot objective function line
plt.plot(x,y_obj,'k--',label="Objective line (150x+175y=const)")

# Feasible region shading (optional)
y_min = np.maximum(0, np.minimum(y1, y2))
y_min = np.minimum(y_min, y_bound)
plt.fill_between(x, 0, y_min, where=(x <= x_bound), color='gray', alpha=0.2)

plt.xlim(0, 12)
plt.ylim(0, 12)
plt.xlabel("x")
plt.ylabel("y")
plt.title("Plot of Constraints and Objective Function")
plt.legend()
plt.grid(True)
plt.show()
 

Plot of f(x,y)= 150x+175y,
Constraints: 7x+11y<=77; 10x+8y<=80; x<=9; y<=6; x>=0; y>=0

Optimization Using PuLP to Maximize Objective of Linear Equation with Constraints

We will now switch to the PuLP module and solve an optimization problem using PuLP, a linear programming library. PuLP must be installed with pip; activate your virtual environment first if you are using one. Type the following to install PuLP in Windows (installation in Ubuntu may differ). There are installation instructions for Jupyter Notebook on the author's webpage at www.salarsen.com, Chapters 1 through 4.


 python -m pip install pulp  

In this optimization problem we will be looking at a factory that makes tables, chairs, and bookcases. We want to optimize the production of each to maximize profit. We are given the following information.

x1= number of tables
x2= number of chairs
x3= number of bookcases
Objective Function: Profit = 40*x1+30*x2+45*x3
Constraints:
Labor: 2*x1 + 1*x2 + 2.5*x3 <= 60 hours
Machine: 0.8*x1 + 0.6*x2 + 1.0*x3 <= 16 hours
Wood: 30*x1 + 20*x2 + 30*x3 <= 400 board-feet
Tables: x1 >= 10
x1, x2, x3 >= 0


  
# Import all classes of PuLP module
from pulp import *
  
#Optimum number of tables, chairs, or bookcases
#A linear program problem using Maximization
model = LpProblem("FurnitureProblem", LpMaximize)
  
#Create 3 variables tables, chairs, and bookcases
# First=item,Second=LowerBound,Third=UpperBound,Fourth=DataType
x1 = LpVariable("tables", 0, None, LpInteger)
x2 = LpVariable("chairs", 0, None, LpInteger)
x3 = LpVariable("bookcases", 0, None, LpInteger)
  
# Create maximize objective function
# Profit equals $ * item
model += 40 * x1 + 30 * x2 + 45 * x3
  
# Constraints
model += 2 * x1 + 1 * x2 + 2.5 * x3 <= 60, "Labor"
model += 0.8 * x1 + 0.6 * x2 + 1.0 * x3 <= 16, "Machine"
model += 30 * x1 + 20 * x2 + 30 * x3 <= 400, "Wood"
model += x1 >= 10, "Tables"

# The problem is solved using PuLP's choice of solver
model.solve()

# Each of the variables is printed with the optimum value
for v in model.variables():
    print(v.name, "=", v.varValue)

#Output:
# bookcases = 0.0
# chairs = 5.0
# tables = 10.0
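If you prefer to stay inside SciPy, the same integer program can be cross-checked with scipy.optimize.milp (available in SciPy 1.9 and later; a sketch, negating the profit since milp minimizes):

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Maximize 40*x1 + 30*x2 + 45*x3  ->  minimize the negated profit
c = np.array([-40, -30, -45])

# Labor, machine, and wood limits (upper bounds only)
A = np.array([[2.0, 1.0, 2.5],
              [0.8, 0.6, 1.0],
              [30.0, 20.0, 30.0]])
limits = LinearConstraint(A, ub=[60, 16, 400])

# x1 >= 10 (tables constraint), x2, x3 >= 0; all variables integer
bounds = Bounds(lb=[10, 0, 0])
integrality = np.ones(3)

res = milp(c, constraints=limits, bounds=bounds, integrality=integrality)
print("tables, chairs, bookcases:", res.x)   # [10. 5. 0.]
print("max profit:", -res.fun)               # 550.0
```

This matches the PuLP result above: 10 tables, 5 chairs, 0 bookcases, for a profit of $550.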





This website consists of example problems from numerical methods for engineers. The first examples apply to roots, plotting roots, maximums, minimums, and optimization problems. There are enough examples that you can become familiar with the syntax used in Python. The examples have been tested, and the output of each program is listed in its comments. All programs were run in Jupyter Notebook. All programs except the PuLP examples will run on https://python-fiddle.com.







Introduction to Engineering Python: For First Year Engineering Students