Machine Learning Program / Project - 04

Question 04: Implement Gradient Descent Algorithm to find the local minima of a function. For example, find the local minima of the function y=(x+3)² starting from the point x=2.

Download the whole Program / Project code by clicking the following link:
How does the Gradient Descent Algorithm help in finding the local minima of a function like y = (x + 3)²?
Gradient Descent is an iterative optimization algorithm used to find the local minimum of a function by updating parameters in the opposite direction of the gradient. For the function y = (x + 3)^2, the steps are:
  1. Compute the derivative (gradient): dy/dx = 2(x + 3)
  2. Update rule:
    x = x - alpha * gradient
    where alpha is the learning rate.
  3. Start at initial value: Start from x = 2 and iteratively update using the rule above until convergence (e.g., gradient is close to 0).
The algorithm will gradually move x toward the local minimum, which is at x = -3 for this function.
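
As a quick hand-worked illustration (using an assumed learning rate of alpha = 0.1, chosen only for this trace), the first few updates starting from x = 2 can be printed with a few lines of Python:

x = 2.0                      # starting point, as given in the question
alpha = 0.1                  # learning rate (assumed for this illustration)
for step in range(5):
    grad = 2 * (x + 3)       # gradient of y = (x + 3)²
    x = x - alpha * grad     # move opposite to the gradient
    print(f"step {step + 1}: gradient = {grad:.4f}, x = {x:.4f}")

Each update moves x closer to the minimum: 2 → 1 → 0.2 → -0.44 → -0.952 → -1.3616, and so on, approaching -3.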

Write simple pseudocode to implement gradient descent for minimizing y = (x + 3)², starting at x = 2.
Initialize:
    x = 2            # Starting point
    alpha = 0.1      # Learning rate
    tolerance = 0.0001
    max_iter = 1000

Function:
    gradient(x) = 2 * (x + 3)

Loop:
    for i = 1 to max_iter:
        grad = gradient(x)
        if abs(grad) < tolerance:
            break
        x = x - alpha * grad

Output:
    print "Local minimum at x =", x
This pseudocode represents a basic implementation of gradient descent. The loop continues updating `x` until the gradient is very small (indicating convergence) or a maximum number of iterations is reached. The final value of `x` approximates the local minimum of the function.
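
As a sketch, this pseudocode translates almost line for line into Python (the names and values below simply mirror the pseudocode; note that it stops when the gradient is small, whereas the ML_P04.py program below stops when the step size is small):

x = 2.0                          # starting point
alpha = 0.1                      # learning rate
tolerance = 0.0001               # stop once the gradient is this small
max_iter = 1000                  # safety cap on the number of iterations

def gradient(x):
    return 2 * (x + 3)           # derivative of y = (x + 3)²

for i in range(max_iter):
    grad = gradient(x)
    if abs(grad) < tolerance:    # converged: gradient is close to 0
        break
    x = x - alpha * grad         # step in the direction opposite the gradient

print("Local minimum at x =", x) # prints a value very close to -3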

Programming Code:
Write the following code in: ML_P04.py
# ML Project Program 04

# Gradient Descent Algorithm
# initialize parameters
cur_x = 2                 # x = 2, given starting point
rate = 0.01               # learning rate
precision = 0.000001      # stop when the step size is smaller than this
previous_step_size = 1    # initialized > precision so the loop runs at least once
max_iters = 1000          # maximum number of iterations
iters = 0                 # iteration counter

# Gradient (derivative) of y = (x + 3)²:  dy/dx = 2(x + 3)
df = lambda x: 2 * (x + 3)

# Create a loop to perform Gradient Descent
while previous_step_size > precision and iters < max_iters:
    prev_x = cur_x
    cur_x -= rate * df(prev_x)
    previous_step_size = abs(prev_x - cur_x)
    iters += 1
    
print("Local Minima Occurs at : ", cur_x)

# Thanks For Reading.
Output:

Running ML_P04.py prints the final value of cur_x, which is very close to the true minimum x = -3.
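
As an optional cross-check (a small sketch that assumes SciPy is installed; it is not part of ML_P04.py), the same minimum can be found with a library optimizer:

from scipy.optimize import minimize_scalar

y = lambda x: (x + 3) ** 2                          # the objective function
result = minimize_scalar(y)                         # numerical minimization
print("SciPy finds the minimum at x =", result.x)   # approximately -3.0

Both the hand-written gradient descent and the library routine place the minimum of y = (x + 3)² at x = -3.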