u(t) is called 60 times per second.
t: Elapsed time in seconds.
S: Shorthand for Math.sin.
C: Shorthand for Math.cos.
T: Shorthand for Math.tan.
R: Function that generates rgba-strings, usage ex.: R(255, 255, 255, 0.5)
c: A 1920x1080 canvas.
x: A 2D context for that canvas.
A Gradient Descent (iterative) algorithm used to calculate sqrt(2). The big dots are the two roots -sqrt(2) and +sqrt(2) of the equation y = x**3 - 2*x. Gradient Descent is an optimization algorithm for finding a local minimum of a function near a starting point, taking successive steps in the direction of the negative of the gradient: x[n+1] = x[n] - η·∇f(x[n]), where η is the "Learning Rate". Here the function being minimized is f(x) = x**4/4 - x**2, whose gradient is x**3 - 2*x, so the minima of f sit exactly at the two roots shown. Gradient Descent is widely used for training Machine Learning models. #neural #ai
Simple version that calculates sqrt(2). If you change the initial "x" to -1, it converges to -sqrt(2), the other root of the equation y = x**3 - 2*x. Code: x = 1; learning_rate = 0.1; for (i = 0; i < 50; i++) { x -= learning_rate * (x**3 - 2*x); } throw x; (the throw statement is just a quick way to surface the computed value in this environment).
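For anyone who wants to run the idea outside the canvas environment, here is a minimal standalone JavaScript sketch of the same loop. The gradient, learning rate, and step count come from the description above; the grad helper name and the console.log check are illustrative additions, not part of the original dweet.

// Gradient descent on f(x) = x**4/4 - x**2, whose gradient is x**3 - 2*x.
// The minima of f sit at x = ±sqrt(2), so descent converges to a square root of 2.
const grad = x => x ** 3 - 2 * x;   // ∇f(x)
const learningRate = 0.1;           // η, the step size from the description

let x = 1;                          // start at 1 → +sqrt(2); start at -1 → -sqrt(2)
for (let i = 0; i < 50; i++) {
  x -= learningRate * grad(x);      // x[n+1] = x[n] - η·∇f(x[n])
}

console.log(x, Math.sqrt(2));       // both print ≈ 1.41421356...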