
CENG463 Machine Learning
2024 – 2025 Fall

Week - 2

ÖMER MİNTEMUR
Optimization

• Hypothesis

• Assume that the output is 2

• There can be many functions that satisfy this condition.

• This is an example for a single number.

• However, the problem is:

• We want to find an optimal function for a batch of inputs


Optimization
• Optimization:
• Optimization is the process of finding the best solution to a problem, typically by
minimizing or maximizing a function.

Introduction
• You are trying to find the lowest point on a hill.

• You start at a random point and take small steps downhill.

• Optimization algorithms guide you in the direction of steepest descent, helping you reach the lowest point as quickly as possible.

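As a rough sketch of "taking small steps downhill", here is a minimal gradient-descent-style illustration in Python; the function, its derivative, the learning rate, and the number of steps are illustrative assumptions, not taken from the slides.

def descend(f_grad, x0, lr=0.1, n_steps=100):
    # Repeatedly step in the direction of steepest descent (the negative gradient).
    x = x0
    for _ in range(n_steps):
        x = x - lr * f_grad(x)   # small step downhill
    return x

# Example: f(x) = (x - 4)^2 has derivative 2 * (x - 4); the minimum is at x = 4
print(descend(lambda x: 2 * (x - 4), x0=10.0))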
Optimization
• Convex Functions
• A function where any line segment connecting two points on its graph lies on or above the graph.
• A parabola opening upwards.
• Any local minimum is also the global minimum.
• Many optimization algorithms are designed specifically for convex functions, leading to efficient solutions.

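• Formally, f is convex if f(λx + (1 − λ)y) ≤ λ f(x) + (1 − λ) f(y) for all points x, y and all λ in [0, 1]; this standard inequality (not spelled out on the slide) is exactly the "line segment lies on or above the graph" condition.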
Optimization
• Non-Convex Functions
• A function that is not convex
• A sine wave
• Can have multiple local minima and a global minimum
• Finding the global minimum can be difficult due to the presence of local minima

Optimization
• Multimodal Functions
• A function with multiple local minima
• A function with several peaks and valleys
• Identifying the global minimum can be computationally expensive, especially for high-dimensional spaces

Optimization
• To sum up

https://burcuukoca.medium.com/effective-comparison-of-unconstrained-optimization-algorithms-103d4a9f6485

Optimization
• These functions are all one-dimensional.
• Dimensionality:
• Dimensionality refers to the number of variables or features in an optimization
problem.
• As the dimensionality of a problem increases, the search space becomes
exponentially larger, making optimization more challenging
• Challenges:
• Optimization algorithms can become computationally expensive as the dimensionality
grows
• It's difficult to visualize and understand high-dimensional spaces

https://smowl.net/en/blog/learning-by-doing-definition-methodology/
Optimization
• Consider 2 Dimensions (2D)

Optimization
• More than 2D:
• It's difficult to visualize functions with more than three variables.

• Our intuition is limited to three-dimensional space, making it challenging to understand the behavior of high-
dimensional functions

• The likelihood of encountering local minima and saddle points increases in high-dimensional landscapes,
making it harder to find the global optimum

• The sparsity of data in high-dimensional spaces can make it difficult to generalize and find meaningful
patterns

• In high-dimensional spaces, sparsity refers to the phenomenon where most of the data points have many
zero or near-zero values.

Optimization
• In our case, we know the functions

• We can define a boundary and find an optimal value for the function

• Which one should we choose?

https://www.linkedin.com/pulse/top-100-modern-optimization-algorithms-dinesh-thapa-czuze/

Optimization
• We will see
• Random Walk
• Simulated Annealing
• Hill Climbing

Optimization
• Random Walk:
• Pseudocode (a sketch in Python follows below):
• Define the function to be minimized

• Pick a random point in the domain of the function

• Evaluate the function at that point

• Define a probability value that decides

• whether the current point goes up

• or whether the current point goes down

• Repeat this process n times

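A minimal one-dimensional sketch of this pseudocode in Python; the test function f(x) = (x - 2)^2, the step size, and the up/down probability are illustrative assumptions rather than values fixed by the slide.

import random

def random_walk(f, x0, step=0.1, p_up=0.5, n_iters=1000):
    # Minimal random-walk minimization: wander up or down at random,
    # remembering the best (lowest) point seen so far.
    x = x0
    best_x, best_f = x, f(x)
    for _ in range(n_iters):
        # The probability p_up decides whether the current point goes up or down.
        x = x + step if random.random() < p_up else x - step
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Example: minimize f(x) = (x - 2)^2 starting from a random point in [-10, 10]
print(random_walk(lambda x: (x - 2) ** 2, random.uniform(-10, 10)))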
Random Walk

Optimization Random Walk

Simulated Annealing
• The algorithm works by iteratively adjusting the temperature of the system

• It accepts or rejects candidate solutions based on a probabilistic function

• Acceptances or rejections are based on the current temperature

• At high temperatures, the algorithm explores the search space. It accepts worse solutions.

• As temperature decreases, the algorithm becomes more selective.

• It is important how you decrease the temperature

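• A commonly used acceptance rule is the Metropolis criterion (not spelled out on the slide): a worse neighbor is accepted with probability exp(−ΔE / T), where ΔE is the increase in the objective value and T is the current temperature, so worse moves become less likely as the temperature drops.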
Simulated Annealing
• Initialization
• Set an initial temperature
• Generate a random initial solution

• Iteration
• Generate a neighbor solution
• Calculate the energy difference between the current and neighbor solutions
• If the neighbor solution has lower energy (better), accept it
• If the neighbor solution has higher energy (worse), accept it with a probability based on the temperature and energy difference
• Reduce the temperature according to the cooling schedule

• Termination
• Stop the algorithm when the temperature reaches a predefined minimum or a maximum number of iterations is reached

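A minimal Python sketch of these steps; the geometric cooling schedule, the step size, and the test function x^2 + 10·sin(x) are illustrative assumptions rather than choices made on the slides.

import math
import random

def simulated_annealing(f, x0, T0=1.0, T_min=1e-3, cooling=0.95,
                        step=0.5, iters_per_temp=100):
    # Minimize a 1-D function f by simulated annealing.
    x, fx = x0, f(x0)
    T = T0
    while T > T_min:
        for _ in range(iters_per_temp):
            # Generate a neighbor solution by a small random perturbation
            x_new = x + random.uniform(-step, step)
            f_new = f(x_new)
            delta = f_new - fx                      # energy difference
            # Accept better solutions; accept worse ones with probability exp(-delta / T)
            if delta < 0 or random.random() < math.exp(-delta / T):
                x, fx = x_new, f_new
        T *= cooling                                # reduce the temperature (cooling schedule)
    return x, fx

# Example: a non-convex function with several local minima
f = lambda x: x ** 2 + 10 * math.sin(x)
print(simulated_annealing(f, x0=random.uniform(-10, 10)))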
Simulated Annealing

Hill Climbing
• We can also maximize a function.

Hill Climbing
• Initialization
• Start from a random initial point.

• Iteration
• Evaluate the function at the current point
• Generate a set of neighboring points
• Select the neighbor with the highest (or lowest) function value
• If that neighbor improves on the current point, move to it

• Termination
• Stop when no neighbor has a higher (or lower) function value than the current point.

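A minimal Python sketch of hill climbing, used here for maximization; the fixed step size, the two-neighbor scheme, and the test function are illustrative assumptions.

import random

def hill_climbing(f, x0, step=0.1, max_iters=10_000):
    # Maximize a 1-D function f by greedily moving to the best neighbor.
    x, fx = x0, f(x0)
    for _ in range(max_iters):
        # Generate a set of neighboring points
        neighbors = [x - step, x + step]
        best = max(neighbors, key=f)
        if f(best) <= fx:       # no neighbor is better than the current point: stop
            break
        x, fx = best, f(best)
    return x, fx

# Example: maximize f(x) = -(x - 3)^2 + 5 starting from a random point
print(hill_climbing(lambda x: -(x - 3) ** 2 + 5, random.uniform(-10, 10)))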
Hill Climbing
