An entire subfield of numerical analysis, called stenciling, exists for this purpose. Depending on the function or differential equation/system, different stencils are used.
Stenciling doesn't just deal with the first derivative; it tries to come up with approximations, involving a fixed number of sample points (the stencil), for any differential operator, e.g. the Laplacian, higher-order derivatives, etc.
> The term "stencil" was coined for such patterns to reflect the concept of laying out a stencil in the usual sense over a computational grid to reveal just the numbers needed at a particular step. [2]
Another thing I'm wondering about is how to calculate the arc length of a function.
What I'm thinking is that you can treat a function like a piece of string: you take f(x) and generate a pair of functions, one above and one below f(x).
You generate the functions at a fixed (infinitesimal) distance 𝛿t perpendicular to the curve, and generate deltas for {x, y}:
mul = 𝛿t/((𝛿x)^2 + (𝛿y)^2)^.5
Δx = 𝛿y * mul
Δy = 𝛿x * mul
u(x - Δx) = f(x) + Δy
l(x + Δx) = f(x) - Δy
The idea is that the function's length is the area ∫(u(x) - l(x)) dx divided by the thickness 2 * 𝛿t.
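As a rough numerical sketch of the construction (my own code, with 𝛿t a small finite number standing in for the infinitesimal and f assumed smooth), using the shoelace formula for the strip's area:

    import numpy as np

    # "Rope" arc length: offset the curve by dt on each side along the normal,
    # measure the area of the resulting strip, divide by the thickness 2*dt,
    # and compare with the ordinary polyline arc length.

    def rope_length(f, a, b, dt=1e-4, n=10_000):
        x = np.linspace(a, b, n)
        y = f(x)

        # Tangent direction (dx, dy) at each sample, then unit normal (-dy, dx).
        dx, dy = np.gradient(x), np.gradient(y)
        norm = np.hypot(dx, dy)
        nx, ny = -dy / norm, dx / norm

        # Upper and lower offset curves at distance dt along the normal.
        upper = np.column_stack((x + dt * nx, y + dt * ny))
        lower = np.column_stack((x - dt * nx, y - dt * ny))

        # Area of the closed strip (upper forward, lower backward) via shoelace.
        poly = np.vstack((upper, lower[::-1]))
        area = 0.5 * abs(np.dot(poly[:, 0], np.roll(poly[:, 1], -1))
                         - np.dot(poly[:, 1], np.roll(poly[:, 0], -1)))
        return area / (2 * dt)

    def polyline_length(f, a, b, n=10_000):
        x = np.linspace(a, b, n)
        y = f(x)
        return np.hypot(np.diff(x), np.diff(y)).sum()

    print(rope_length(np.sin, 0, 2 * np.pi))      # ~7.6404
    print(polyline_length(np.sin, 0, 2 * np.pi))  # ~7.6404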
I have no idea how wrong this is, but if you can point me toward any resources that'll help when I (inevitably) get stuck, it would be very much appreciated.
Not exactly sure what you are getting at, but it does remind me of Cauchy's integral formula, which expresses derivatives as (contour) integrals.
I turn the curve into a 'rope' of fixed infinitesimal thickness (2 * 𝛿t) and measure its area.
Then I divide the area by the thickness to get the length.
The rope is defined as the area between two curves, both of them manipulations of f(x).
The curves are generated by moving each point (on f(x)) a constant infinitesimal distance perpendicular to the function's tangent (this gives the rope its fixed thickness), one in either direction.
For the upper curve {Δx, Δy} = {-(n * 𝛿y), (n * 𝛿x)}, for the lower {(n * 𝛿y), -(n * 𝛿x)}, where n is the normalisation constant 𝛿t/((𝛿x)^2 + (𝛿y)^2)^.5
Once you have these two functions, you can use the integral of the difference to get the area.
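For what it's worth, here's a back-of-the-envelope check (assuming f is smooth and 𝛿t is smaller than the curve's minimum radius of curvature) that the strip's area really is thickness times length:

    % Parametrise the curve by arc length s in [0, L], with unit normal n(s)
    % and curvature \kappa(s).  The strip is the set of points r(s) + t n(s)
    % for -\delta t <= t <= \delta t, with area element (1 - t \kappa(s)) dt ds.
    \mathrm{Area}
      = \int_0^L \int_{-\delta t}^{\delta t} \bigl(1 - t\,\kappa(s)\bigr)\,dt\,ds
      = \int_0^L 2\,\delta t\,ds
      = 2\,\delta t\,L

The curvature term is odd in t, so it integrates away, and Area / (2 * 𝛿t) gives back the length L (as long as the rope doesn't overlap itself).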
I'm a brainlet, or maybe it's the 10+ hours of work I did today, but I just can't grasp it right now.
There are a lot of ways to take integrals, though, so I wouldn't doubt your method could work. Generally, the holes in a lot of these integration methods are their inability to handle functions that are pathological:
https://en.wikipedia.org/wiki/Pathological_(mathematics)
(No idea if you'll ever see this; I also have no damned idea what I'm talking about.)
Looking at the Weierstrass function, it's a uniformly convergent series whose term-by-term derivative series doesn't converge.
(The standard difference formulas don't apply, because the assumption of linearisation at the scale of 𝛿x is invalid, due to the scale-invariant properties of the fractal.)
In that case the integral function would be even more convergent since the 'noise' (pretty much) adds to zero.
So can you differentiate via integration, and get a gradient for a related function (with less noise)?
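I played with a toy version of this question (my own sketch, nothing rigorous): sample a noisy sine on a fine grid, build its running integral F, and estimate the slope two ways with the same coarse step H.

    import numpy as np

    # "Differentiate via integration" toy experiment:
    #   direct:        (y(x+H) - y(x-H)) / (2H)
    #   via integral:  (F(x+H) + F(x-H) - 2 F(x)) / H^2
    # The integral averages the noise inside each window, so the second
    # estimate tends to come out smoother.

    h, H = 1e-3, 0.1                      # fine sample spacing, coarse step
    k = int(round(H / h))                 # coarse step measured in samples
    x = np.arange(0.0, 2 * np.pi, h)
    rng = np.random.default_rng(0)
    y = np.sin(x) + 1e-3 * rng.standard_normal(x.size)   # noisy samples

    # Running integral F via the trapezoid rule.
    F = np.concatenate(([0.0], np.cumsum((y[1:] + y[:-1]) * h / 2)))

    true = np.cos(x[k:-k])
    direct = (y[2 * k:] - y[:-2 * k]) / (2 * H)
    via_F = (F[2 * k:] + F[:-2 * k] - 2 * F[k:-k]) / H**2

    print("RMS error, direct:      ", np.sqrt(np.mean((direct - true) ** 2)))
    print("RMS error, via integral:", np.sqrt(np.mean((via_F - true) ** 2)))

In this setup the integral-based estimate does come out noticeably less noisy, though that relies on the integral being computed on a much finer grid than the differencing step.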
I messed around with this, taking the areas of a pair of little triangle approximations and adding them to get a ~rectangle.
Then divide that area by 𝛿x to get 𝛿y, and divide that by 𝛿x to get 𝛿y/𝛿x.
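In terms of the antiderivative F, adding the two little triangles is the same as subtracting the left slab from the right one (my paraphrase, treating 𝛿y as roughly the same on both sides):

    \bigl(F(x+\delta x) - F(x)\bigr) - \bigl(F(x) - F(x-\delta x)\bigr)
      \approx \Bigl(f(x)\,\delta x + \tfrac{1}{2}\,\delta y\,\delta x\Bigr)
            - \Bigl(f(x)\,\delta x - \tfrac{1}{2}\,\delta y\,\delta x\Bigr)
      = \delta y\,\delta x

Dividing by (𝛿x)^2 then gives the slope.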
This was the form that I reached:
let y = f(x)
let F(x) = ∫f(x)𝛿x
𝛿y/𝛿x = (F(x + 𝛿x) + F(x - 𝛿x) - 2 * F(x))/((𝛿x) ^ 2)
At the very least, it passed my polynomial sanity check:
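Something along these lines (a minimal version, with f a simple cubic and F its exact antiderivative):

    import numpy as np

    # Sanity check on a polynomial: f(x) = x^3 - 2x, with exact antiderivative
    # F(x) = x^4/4 - x^2.  The second symmetric difference of F should give
    # back the ordinary derivative f'(x) = 3x^2 - 2.

    def F(x):
        return x**4 / 4 - x**2

    def dydx_via_F(x, dx=1e-4):
        return (F(x + dx) + F(x - dx) - 2 * F(x)) / dx**2

    x = np.linspace(-3, 3, 7)
    print(dydx_via_F(x))   # compare against
    print(3 * x**2 - 2)    # the exact derivative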
That's awesome. Glad to see your method worked. If you ever have a chance to encounter a serious mathematician, they can probably give you the proper name of it. Or perhaps we'll have to name it the Yarg integral :-)
Ah yes, this is why epsilon squared equals 0 in the dual numbers -- to automatically remove all higher-order derivatives when they appear. Exponents on derivatives are indicative of their rank/order.
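A minimal sketch of that idea (my own illustration): a tiny dual-number type that drops the ε^2 term in multiplication, so evaluating f(x + ε) returns f(x) in the real part and f'(x) in the ε part.

    from dataclasses import dataclass

    # Minimal dual numbers: a + b*eps with eps^2 = 0.  Dropping the eps^2 term
    # in multiplication is exactly what discards the higher-order pieces, so
    # f(a + eps) = f(a) + f'(a)*eps for anything built from + and *.

    @dataclass
    class Dual:
        re: float   # value
        du: float   # coefficient of eps

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other, 0.0)
            return Dual(self.re + other.re, self.du + other.du)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other, 0.0)
            # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps + bd*eps^2, and eps^2 = 0
            return Dual(self.re * other.re, self.re * other.du + self.du * other.re)

        __rmul__ = __mul__

    def f(x):
        return 3 * x * x + 2 * x   # f'(x) = 6x + 2

    result = f(Dual(5.0, 1.0))     # evaluate at x = 5 + eps
    print(result.re, result.du)    # 85.0 32.0  ->  f(5) and f'(5)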
> The limit is called the second symmetric derivative. Note that the second symmetric derivative may exist even when the (usual) second derivative does not.
> This limit can be viewed as a continuous version of the second difference for sequences.
So it actually can do what I intended it to do, nothing original - but there was a degree of satisfaction in deriving it myself.
[0] https://en.wikipedia.org/wiki/Five-point_stencil
[1] https://en.m.wikipedia.org/wiki/Finite_difference_coefficien...
[2] https://en.wikipedia.org/wiki/Stencil_(numerical_analysis)