This is a follow-up to #101.

I recently came across this paper outlining an algorithm by Roger Willcocks for subdividing Bezier curves based on "flatness", i.e. a maximum error tolerance for the approximating polygon.
What's nice is that it does not need to compute curvature, just the appropriate Bernstein polynomial. Thus recursively splitting the curve in half and testing each half for flatness is sufficiently fast. The core of it in JavaScript looks roughly like this (sketched on plain {x, y} points; hooked into the library, the split would of course go through .split(0.5)):
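```js
// Willcocks flatness criterion for a cubic with control points p0..p3
// ({x, y} objects): the curve stays within `flatness` of the chord p0-p3
// whenever this bound holds.
function isFlatEnough(p0, p1, p2, p3, flatness) {
  const ux = 3 * p1.x - 2 * p0.x - p3.x,
        uy = 3 * p1.y - 2 * p0.y - p3.y,
        vx = 3 * p2.x - 2 * p3.x - p0.x,
        vy = 3 * p2.y - 2 * p3.y - p0.y;
  return Math.max(ux * ux, vx * vx) + Math.max(uy * uy, vy * vy)
      <= 16 * flatness * flatness;
}

// Split at t = 0.5 until every piece passes the test, collecting the end
// point of each flat piece as a polygon vertex (seed `out` with p0).
function flatten(p0, p1, p2, p3, flatness, out) {
  if (isFlatEnough(p0, p1, p2, p3, flatness)) {
    out.push(p3);
    return out;
  }
  // de Casteljau subdivision at t = 0.5
  const mid = (a, b) => ({ x: (a.x + b.x) / 2, y: (a.y + b.y) / 2 });
  const p01 = mid(p0, p1), p12 = mid(p1, p2), p23 = mid(p2, p3),
        p012 = mid(p01, p12), p123 = mid(p12, p23),
        p0123 = mid(p012, p123);
  flatten(p0, p01, p012, p0123, flatness, out);
  flatten(p0123, p123, p23, p3, flatness, out);
  return out;
}
```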
If you are interested in adding this to your library, maybe as a flatten method, I would be prepared to do a PR.
To get a feeling for efficiency, contrast this with the algorithm used by Ghostscript, which computes the number of steps needed once, and then goes on just like your LUT:
```
 * To calculate how many points to sample along a path in order to
 * approximate it to the desired degree of flatness, we define
 *      dist((x,y)) = abs(x) + abs(y);
 * then the number of points we need is
 *      N = 1 + sqrt(3/4 * D / flatness),
 * where
 *      D = max(dist(p0 - 2*p1 + p2), dist(p1 - 2*p2 + p3)).
 * Since we are going to use a power of 2 for the number of intervals,
 * we can avoid the square root by letting
 *      N = 1 + 2^(ceiling(log2(3/4 * D / flatness) / 2)).
 * (Reference: DEC Paris Research Laboratory report #1, May 1989.)
```
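Translated into the same JavaScript setting (and skipping the power-of-two rounding), this amounts to something like the following sketch; the inline Bernstein evaluation stands in for what .compute(t) would do in the library:

```js
// Ghostscript-style flattening: estimate the number of points once,
// then sample the curve at evenly spaced t values (like a LUT).
function flattenGS(p0, p1, p2, p3, flatness) {
  const dist = (x, y) => Math.abs(x) + Math.abs(y);
  const D = Math.max(
    dist(p0.x - 2 * p1.x + p2.x, p0.y - 2 * p1.y + p2.y),
    dist(p1.x - 2 * p2.x + p3.x, p1.y - 2 * p2.y + p3.y)
  );
  // N = 1 + sqrt(3/4 * D / flatness), rounded up to an integer point count.
  const N = Math.max(2, Math.ceil(1 + Math.sqrt(0.75 * D / flatness)));
  const points = [];
  for (let i = 0; i < N; i++) {
    const t = i / (N - 1), s = 1 - t;
    // cubic Bernstein evaluation, i.e. what .compute(t) does
    points.push({
      x: s * s * s * p0.x + 3 * s * s * t * p1.x + 3 * s * t * t * p2.x + t * t * t * p3.x,
      y: s * s * s * p0.y + 3 * s * s * t * p1.y + 3 * s * t * t * p2.y + t * t * t * p3.y,
    });
  }
  return points;
}
```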
I did not verify the math used here, but given that it has been part of the software for probably 30 years, I assume it is solid.
I tried out what happens with a cubic Bezier of (0,0, 95,10, 60,-15, 80,120) and flatness = 0.1.
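In terms of the sketches above, that test amounts to roughly this (exact point counts depend on rounding details, so they may not match the numbers below to the point):

```js
const p0 = { x: 0, y: 0 },   p1 = { x: 95, y: 10 },
      p2 = { x: 60, y: -15 }, p3 = { x: 80, y: 120 };

const polyW  = flatten(p0, p1, p2, p3, 0.1, [p0]); // Willcocks: adaptive subdivision
const polyGS = flattenGS(p0, p1, p2, p3, 0.1);     // Ghostscript: uniform in t
console.log(polyW.length, polyGS.length);
```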
With the Willcocks algorithm, this produces 38 points, so 36 calls to .split(0.5) and 37 calls to isFlatEnough().
With the Ghostscript algorithm, you get 48 points (rounding to a power of 2 only makes sense with fixed-point math, which JavaScript does not use) and 46 non-trivial (i.e. t != 0 or 1) calls to .compute(t).
So in terms of computational effort Ghostscript wins, but in terms of the shortness of the table Willcocks wins.
Qualitatively, here is a screenshot from Inkscape to compare the samples:

- the shortest line segments found by Willcocks are significantly longer than in the Ghostscript sample (2.3 vs. 1.5)
- the longest segments are comparable, but in different places (8.5 vs. 8.4)
- the Ghostscript sample, being evenly spaced in t, has a certain "smoothness"