Sunday, April 28, 2024

Week 4 - Funny Backpropagating AI !!!

Why did the neural network refuse to go to the party?

Because it was too busy backpropagating through its social network!!!

Hahahaa !!!

Week 4 - Backpropagation in AI

In simple language, backpropagation is all about derivatives. To find where a function reaches its maximum or minimum value, we set the first derivative equal to zero, and then use the second derivative to classify the point. Backpropagation applies this same derivative machinery to the functions inside a neural network. See the example below.

Consider this function 

f(x) = 10x^3 + 20x + 20

To find the derivative of the function f(x) = 10x^3 + 20x + 20, we differentiate each term separately using the power rule for derivatives:

  1. For the term 10x^3, the derivative is d/dx(10x^3) = 30x^2.
  2. For the term 20x, the derivative is d/dx(20x) = 20.
  3. For the constant term 20, the derivative is d/dx(20) = 0, since the derivative of a constant is zero.

Therefore, the derivative of the function f(x) = 10x^3 + 20x + 20 is:

f'(x) = 30x^2 + 20
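Just to be safe, the derivative can be double-checked numerically with a central finite difference (a quick sketch I'm adding, not part of the derivation itself):

```python
# Numerical sanity check of f'(x) = 30x^2 + 20 using a central
# finite difference: f'(x) ≈ (f(x + h) - f(x - h)) / (2h).

def f(x):
    return 10 * x**3 + 20 * x + 20

def f_prime(x):
    return 30 * x**2 + 20

h = 1e-6
for x in [-2.0, 0.0, 1.5]:
    numeric = (f(x + h) - f(x - h)) / (2 * h)
    assert abs(numeric - f_prime(x)) < 1e-3
```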

Notice that f'(x) = 30x^2 + 20 is always positive, so f(x) = 10x^3 + 20x + 20 has no critical points at all. To get a more interesting example, let's change the function to f(x) = 40x^3 - 20x + 20 and try to find its maximum value using calculus.

First, let's find the critical points by setting the derivative equal to zero:

f'(x) = 120x^2 - 20 = 0

Solving for x:

120x^2 = 20
x^2 = 20/120 = 1/6
x = ±1/sqrt(6)

So, the critical points are x = 1/sqrt(6) and x = -1/sqrt(6).

Now, let's analyze the second derivative to determine if these critical points are maxima, minima, or points of inflection:

f''(x) = 240x

When x = 1/sqrt(6), f''(x) = 240x is positive, indicating a local minimum; when x = -1/sqrt(6), f''(x) is negative, indicating a local maximum.

Since the function f(x) = 40x^3 - 20x + 20 increases without bound as x approaches positive infinity and decreases without bound as x approaches negative infinity, it has no global maximum or minimum value. Instead, it has a local maximum at x = -1/sqrt(6) and a local minimum at x = 1/sqrt(6).
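We can also check the critical points of f(x) = 40x^3 - 20x + 20 in a few lines of Python (my own quick check, nothing official):

```python
import math

# derivative and second derivative of f(x) = 40x^3 - 20x + 20
def f_prime(x):
    return 120 * x**2 - 20

def f_double_prime(x):
    return 240 * x

x_crit = 1 / math.sqrt(6)
# the first derivative vanishes at both critical points
assert abs(f_prime(x_crit)) < 1e-9
assert abs(f_prime(-x_crit)) < 1e-9
# second derivative test: positive => local minimum, negative => local maximum
assert f_double_prime(x_crit) > 0
assert f_double_prime(-x_crit) < 0
```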

The function above was created by me, and it is going sky-bound. Hahaha

Here is one more example:


Let's consider the function f(x) = x^4 - 6x^2 + 9.

To find the critical points, we'll take the derivative of f(x) and set it equal to zero:

f'(x) = 4x^3 - 12x = 0

Factoring out 4x, we get:

4x(x^2 - 3) = 0

Setting each factor equal to zero:

4x = 0   and   x^2 - 3 = 0

From the first equation, we get x = 0.

From the second equation, we get x = ±sqrt(3).

Now, let's analyze the second derivative to determine if these critical points are maxima, minima, or points of inflection:

f''(x) = 12x^2 - 12

When x = 0, f''(0) = -12, which is negative, indicating a local maximum.

When x = ±sqrt(3), f''(x) = 12(3) - 12 = 36 - 12 = 24, which is positive, indicating local minima.

So, the function f(x) = x^4 - 6x^2 + 9 has a local maximum value of f(0) = 9 at x = 0 and a minimum value of f(±sqrt(3)) = 0 at x = ±sqrt(3). (In fact, f(x) = (x^2 - 3)^2, so the minimum value of 0 is easy to see.)
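The values at these critical points are easy to confirm in Python (again, just my own sanity check):

```python
import math

def f(x):
    return x**4 - 6 * x**2 + 9

assert f(0) == 9                       # value at the critical point x = 0
assert abs(f(math.sqrt(3))) < 1e-12    # value at the critical point x = +sqrt(3)
assert abs(f(-math.sqrt(3))) < 1e-12   # value at the critical point x = -sqrt(3)
```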

However, in AI the functions are non-linear and high-dimensional, and instead of a maximum we usually look for the minimum of a loss function, using this same derivative concept (the derivative itself being defined as a limit). The process of backpropagation is commonly associated with neural networks and machine learning, particularly in the context of training neural networks through gradient descent.

Backpropagation is specifically applied in the context of neural networks to update the weights of the network based on the error between the predicted output and the actual output during the training process.

If this function were used as part of a neural network, backpropagation could be applied to calculate the gradients of the loss function with respect to the parameters (weights and biases) of the network, and those gradients would then be used to update the parameters through gradient descent.

In summary, while backpropagation is not directly applicable to the function f(x) = 10x^3 + 20x + 20 itself, it could be used in the context of a neural network that incorporates this function as part of its architecture.
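To make the connection concrete, here is a minimal sketch of how a gradient (the thing backpropagation computes) drives a weight update. The one-weight model, the data point, and the learning rate are all invented by me purely for illustration:

```python
# Toy example: a single-weight "network" pred = w * x trained by
# gradient descent on a squared-error loss, loss = (pred - target)**2.
x, target = 2.0, 6.0   # made-up data; the ideal weight is 3.0
w = 0.0                # initial weight
lr = 0.1               # learning rate

for _ in range(100):
    pred = w * x                     # forward pass
    grad = 2 * (pred - target) * x   # d(loss)/dw, via the chain rule
    w -= lr * grad                   # gradient-descent update

assert abs(w - 3.0) < 1e-6           # the weight has converged to 3.0
```

Real backpropagation does exactly this chain-rule gradient computation, just for millions of weights at once.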

Welcome to the world of artificial neurons, made by the human brain using real neurons !!!