Order behavior of integrate and derivative for univariate Taylor series #230
Comments
Thanks for reporting. Let me first say that the convention we have is that the result keeps the same (fixed) order as the initial polynomial. Then, for
julia> a = Taylor1([1.0], 0) # zero-th order Taylor1 polynomial
1.0 + 𝒪(t¹)
julia> integrate(a)
0.0 + 𝒪(t¹)
If, however, you do the following, you get what is expected:
julia> b = Taylor1([1.0], 1) # order-1 Taylor1 polynomial
1.0 + 𝒪(t²)
julia> integrate(b)
1.0 t + 𝒪(t²)
Note that, if you further integrate the last answer, i.e.,
Regarding the
A recent, somewhat related issue is #226.
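To make the fixed-order convention concrete, here is a small sketch of what further integration does under it; the annotated results follow from the convention described above rather than from output shown in this thread:
using TaylorSeries

b = Taylor1([1.0], 1)   # 1.0 + 𝒪(t²), an order-1 polynomial
ib = integrate(b)       # 1.0 t + 𝒪(t²); the order stays fixed at 1
# Integrating once more would mathematically give 0.5 t², but the result is
# still truncated at order 1, so that term falls outside the truncation:
iib = integrate(ib)     # expected: 0.0 + 𝒪(t²)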
Derivative seems to be a special case, different from any other function, since it reduces the order instead of increasing it. It is true that you are losing information (the number of known coefficients of the polynomial), so probably it really should reduce the order.
Thanks a lot for the quick answers. Yes, I agree that integrate and derivative are different in that
The convention of giving the resulting series the same order as the input series should perhaps have some exceptions, so as to guarantee that the results are mathematically correct. By the way, I've checked that
Why? Don't you expect that
@dpsanders I agree that
The convention on the order adopted in TaylorSeries.jl is not what I expected. I find the big-O notation misleading: I interpret s = a0 + a1 t + a2 t^2 + O(t^3) as s = a0 + a1 t + a2 t^2 + a3 t^3 + a4 t^4 + ... with unknown coefficients a3, a4, ...

Going back to the example I mentioned in my latest comment: consider sin(t), which can be Taylor-expanded at t0 = 0 as s = t + O(t^2). If we divide s by t (or by anything that can be expanded as t + O(t^2)), I would expect s/t = 1 + O(t). Indeed, sin(t)/t = 1 + O(t). Clearly, sin(t)/t = 1 + O(t^2) is wrong.

I understand that with the order convention adopted in TaylorSeries.jl, the elements of type Taylor1 are interpreted on input as polynomials, not as series. The order field of an object of type Taylor1 is only used on output: the result of evaluating a function f on an object p of type Taylor1 with order = n is the polynomial obtained by truncating at order n the series expansion of f(p). Is that right?
Yes, you are right. To be explicit, we consider the Taylor expansions as polynomials truncated at a certain order, which is kept fixed.
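To illustrate that reading with the sin(t)/t example from the previous comment, here is a sketch under the fixed-order convention; the annotated results are what that convention implies, not verified REPL output:
using TaylorSeries

s = Taylor1([0.0, 1.0], 1)  # the polynomial t, truncated at order 1: 1.0 t + 𝒪(t²)
p = sin(s)                  # sin of that polynomial, still truncated at order 1: 1.0 t + 𝒪(t²)
q = p / s                   # under the fixed-order convention: 1.0 + 𝒪(t²);
                            # reading 𝒪 as a true remainder, one would instead expect 1 + O(t)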
@AnderMuruaUria Sorry to address this issue with such a delay! I just pushed a few changes in #248 which address some of the issues pointed out here. With them, we have:
julia> derivative(Taylor1([1., 0.], 1))
0.0 + 𝒪(t¹)
There are still a few things to do, e.g., that if
cc @PerezHz
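For reference, one way to check the new behavior programmatically might be the following sketch, assuming `get_order` as the order accessor and the lb/iss230 behavior just described:
using TaylorSeries

p = Taylor1([1.0, 0.0], 1)   # order-1 polynomial: 1.0 + 𝒪(t²)
dp = derivative(p)           # with #248: 0.0 + 𝒪(t¹)
get_order(p), get_order(dp)  # expected: (1, 0), i.e. differentiation lowers the order by one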
Hi Marc,
I hope this email reaches you, since on GitHub I can't find your actual comment to reply to.
Your comment is well taken: integration is anti-differentiation, in the sense that the two are "inverse" operations.
The current way (in master) in which `differentiate` (`derivative` is a synonym) is implemented is to
keep the same degree of the polynomial simply by setting the last coefficient to zero. This naive solution
is wrong in the sense that we are stating that the last coefficient is zero, while it may not be so; below
I include an example. This is the bottom line of AnderMuruaUria's criticism.
As an example, consider the following code (uses master of TaylorSeries):
julia> using TaylorSeries
julia> t = Taylor1(6)
1.0 t + 𝒪(t⁷)
julia> sin(t)
1.0 t - 0.16666666666666666 t³ + 0.008333333333333333 t⁵ + 𝒪(t⁷)
julia> differentiate(sin(t))
1.0 - 0.5 t² + 0.041666666666666664 t⁴ + 𝒪(t⁷)
The last result should be equivalent to `cos(t)`, but it is not, simply because the last term of `cos(t)`
(the t⁶ coefficient) is not zero:
julia> cos(t)
1.0 - 0.5 t² + 0.041666666666666664 t⁴ - 0.001388888888888889 t⁶ + 𝒪(t⁷)
The implementation proposed in #248 (you can test it using the branch lb/iss230) yields, for the same
differentiate call:
julia> differentiate(sin(t))
1.0 - 0.5 t² + 0.041666666666666664 t⁴ + 𝒪(t⁶)
which corresponds to `cos(t)` up to order 5.
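A quick way to spell that check out, for reference (a sketch; it assumes `getcoeff` is the coefficient accessor and compares up to the common order 5):
using TaylorSeries

t = Taylor1(6)
d = differentiate(sin(t))   # on lb/iss230, truncated at order 5
c = cos(t)                  # truncated at order 6
all(getcoeff(d, k) ≈ getcoeff(c, k) for k in 0:5)   # expected: true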
I've checked that everything is consistent against the code in the discourse thread, and there are
no problems.
The following displays some examples where there will be changes with respect to current master:
1) Division by factorization
- master:
julia> sin(t)/t
1.0 - 0.16666666666666666 t² + 0.008333333333333333 t⁴ + 𝒪(t⁷)
- lb/iss230:
julia> sin(t)/t
1.0 - 0.16666666666666666 t² + 0.008333333333333333 t⁴ + 𝒪(t⁶)
2) sqrt/pow for a factorizable polynomial:
- master
julia> sqrt(t^4)
1.0 t² + 𝒪(t⁷)
- lb/iss230
julia> sqrt(t^4)
1.0 t² + 𝒪(t⁴)
3) An example inspired by your comment:
- master:
julia> integrate(differentiate(sin(t)))
1.0 t - 0.16666666666666666 t³ + 0.008333333333333333 t⁵ + 𝒪(t⁷)
- lb/iss230:
julia> integrate(differentiate(sin(t)))
1.0 t - 0.16666666666666666 t³ + 0.008333333333333333 t⁵ + 𝒪(t⁶)
By the way, regarding your comment on the integration, we do have a way of including
the integration constant: `integrate(f, c)`.
A question related to integration: shall we increase the degree (we have everything needed to give
the correct answer)?
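As a usage note on `integrate(f, c)`, here is a sketch of the round trip discussed above; `p[0]` is used as the constant-coefficient accessor, and the final comment describes the expected result under the lb/iss230 behavior (see example 3 above), not verified output:
using TaylorSeries

t = Taylor1(6)
p = sin(t)
# differentiate discards the constant term; pass it back as the integration constant:
q = integrate(differentiate(p), p[0])
# q is expected to reproduce sin(t) up to the reduced truncation order, 𝒪(t⁶)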
I think the examples show that, if this PR gets merged, it is truly breaking, since the lower order is the one
that rules operations mixing Taylor series of different orders. This is actually quite important in our package
TaylorIntegration, because such sudden changes of order may truly disrupt the integration, either causing errors
or silently yielding a less precise integration; part of this is what I am checking.
Something similar can happen in `TaylorModels`, but that package is less developed/used.
All the best, and sorry for the late answer,
Luis
… On 26 Mar 2021, at 12:18 PM, Marc-Cox-08 ***@***.***> wrote:
About ".. that should really reduce the order." because ".. losing information (number of known coefficients of the polynomial)"
Correct in that derivatives will reduce the order, but believe that no information should be actually "lost" per se ;
Here's the reasoning: Integrals are "Anti-Derivatives",
IOW Integrals are the inverse of derivatives,
IOW Df^-1(Df^+1(x)) = x (if you will allow the notation)
So for Df^-1(Df^+1(x)) = x to hold true "round trip" then information cannot be "lost" when taking the Derivative Df^1(x) ;
but, in fact, somewhat paradoxically I have found that there is some need to recover the Constant of Integration **.
Constant of Integration ** https://en.wikipedia.org/wiki/Constant_of_integration <https://en.wikipedia.org/wiki/Constant_of_integration>
"This constant expresses an ambiguity inherent in the construction of antiderivatives."
Which I often ignored until I actually tried inverting the function lol.
So to recover the integration constant I suggest something like the following pseudo-code snippet:
using Statistics
# f_mean_integral estimates the integration constant, so that it can be subtracted out;
# here `extrapolate(f_integral, x)` stands for whatever evaluates the integrated series at x.
f_mean_integral(f_integral, range_x) = mean(extrapolate(f_integral, r_x) for r_x in range_x)
# The usual caveats about the sample rate of very stiff functions apply,
# so you might need to dig into quadrature methods to recover integration constants
# completely and elegantly for very stiff functions.
So the above is somewhat the "second half" of lbenet's helpful tips about inverse functions using TaylorSeries.jl
(domain: machine learning); see https://discourse.julialang.org/t/inverse-functions-using-taylorseries-jl/43382/7?u=marc.cox
HTH
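A slightly more concrete version of that idea might look like the following sketch; `estimate_constant`, `f_ref`, and `xs` are illustrative names (not part of TaylorSeries.jl), and `evaluate` is the package's point-evaluation function:
using Statistics, TaylorSeries

# Estimate the integration constant as the mean offset between a reference
# function and the constant-free antiderivative, then add it back in.
estimate_constant(f_ref, F, xs) = mean(f_ref(x) - evaluate(F, x) for x in xs)

t = Taylor1(6)
F = integrate(differentiate(sin(t)))                  # antiderivative with constant set to 0
c = estimate_constant(sin, F, range(-0.1, 0.1; length = 21))
F_corrected = F + c                                   # should approximate sin(t) near 0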
@dpsanders What about the integration? Should we increase the order?
I don't like the way integrate and derivative deal with the order of univariate series in TaylorSeries.jl.
I've checked that integrate(1. + O(t)) gives 0. + O(t) on output.
In my opinion, integrate(1. + O(t)) should give t + O(t^2).
Similarly, derivative(t + O(t^2)) gives 1 + O(t^2). In my opinion, derivative(t + O(t^2)) should result in 1 + O(t).
Is there some reason for integrate and derivative to behave in that (in my opinion, mathematically unnatural) way in TaylorSeries.jl?