### Description

Use gradient descent to find local minima, or gradient ascent to find local maxima

### Usage

graddsc(fp, x, h = 0.001, tol = 1e-04, m = 1000)

gradasc(fp, x, h = 0.001, tol = 1e-04, m = 1000)

gd(fp, x, h = 100, tol = 1e-04, m = 1000)


### Arguments

* `fp` — function representing the derivative of `f`
* `x` — an initial estimate of the minimum
* `h` — the step size
* `tol` — the error tolerance
* `m` — the maximum number of iterations

### Details

Gradient descent finds local minima of a function `f` given its derivative `fp`. Starting from the initial estimate `x`, it repeatedly updates the estimate using the step size `h` until the change between iterations falls below the error tolerance `tol`, or until `m` iterations have been performed. `gradasc` performs the analogous gradient ascent to find local maxima.
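The iteration described above can be sketched as follows. This is an illustrative re-implementation mirroring the documented interface, not the package's actual code; the name `graddsc_sketch` and the stopping rule (update magnitude below `tol`) are assumptions.

```r
# Minimal sketch of gradient descent over the documented interface.
# Assumed behavior: step against the derivative, stop when the update
# is smaller than tol or after m iterations.
graddsc_sketch <- function(fp, x, h = 0.001, tol = 1e-4, m = 1000) {
  for (i in seq_len(m)) {
    step <- h * fp(x)            # scaled derivative at the current estimate
    x <- x - step                # move downhill
    if (abs(step) < tol) break   # converged within tolerance
  }
  x
}

fp <- function(x) { 2 * (x - 3) }          # derivative of f(x) = (x - 3)^2
graddsc_sketch(fp, x = 0, m = 5000)        # approaches the minimum at x = 3
```

With the small default step size, more iterations (here `m = 5000`) may be needed before the update falls below `tol`.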

### Value

the x value of the minimum found

Other optimz: bisection(), goldsect(), hillclimbing(), newton(), sa(), secant()

### Examples

fp <- function(x) { x^3 + 3 * x^2 - 1 }
graddsc(fp, x = 0)  # descend from the initial estimate x = 0