Provide Asserts for System.Numerics types #3316
I think we have wanted to de-emphasize the *Assert classes in the past in favor of the constraint model. Can you give examples of the kind of assertions you would like? For example, we already cover equality and inequality comparisons via IEquatable and IComparable.
Most of these types are used for geometric and image processing, so others might want additional assertions, but for now I'll provide the ones I commonly use:

All types:
Vectors:
Matrices:
Quaternions:
Here are some examples I am using in my code:

```csharp
public static void AreEqual(Quaternion expected, Quaternion actual, double delta = 0)
{
    Assert.AreEqual(expected.X, actual.X, delta, "X");
    Assert.AreEqual(expected.Y, actual.Y, delta, "Y");
    Assert.AreEqual(expected.Z, actual.Z, delta, "Z");
    Assert.AreEqual(expected.W, actual.W, delta, "W");
}

public static void AngleLessOrEqual(Quaternion a, Quaternion b, double angle)
{
    var c = Quaternion.Concatenate(a, Quaternion.Inverse(b));
    var actualAngle = Math.Acos(c.W) * 2;
    Assert.LessOrEqual(actualAngle, angle);
}

public static void IsNormal(Vector2 vector, double delta = 0)
{
    var lenSquared = vector.X * vector.X + vector.Y * vector.Y;
    Assert.AreEqual(1, lenSquared, delta * delta);
}
```
Great, thanks for the examples! If the team decides in favor of this, we'll probably have it up for grabs since our bandwidth is needed elsewhere. I think we would require a robust set of tests. @nunit/framework-team What do you think? Since these are BCL types, would this belong in NUnit or in an external library such as the one Vicente was thinking of writing? Would we do a constraint model or Assert class or both?
I personally think this is too specific an area of .NET to justify inclusion in the main framework - but I think it is exactly the sort of thing that is ideal for a 3rd party library, which NUnit should reference through the docs. To me, this case is exactly what our extensibility options are intended to cover - and (if the team agrees) I wish you the best of luck with it becoming popular, @vpenades.
I agree with @ChrisMaddock on everything except equality. That's pretty basic 😺 and so I think it should be dealt with in our implementation. In fact, it pretty much has to be dealt with there, since we provide for extensibility with new constraints but not for adding new type support for a given constraint.
Here you can see what I am using right now... more or less.
I do agree with Chris and Charlie. Exact equality of all these types already works, but shall we open a separate issue to enable tolerance to work against this list of types in our existing APIs? I wonder if you could gain some of the benefit by us pointing people to your nuget.org package from our docs as our official recommendation for anyone who needs it. This is just me thinking out loud, not sure what others on the team would think.
Well, the code snippet I've posted before is just one of the many copies I have around in my test projects. Sometimes I tailor them specifically for each test, but generally speaking they're quite standard tests. But I feel it's not that much code to justify a full nuget package project (or maybe yes); I just thought that since the types are BCL and most of the assertions are equality, less equal, greater equal, etc., it could fit into NUnit. What's critical about all these assertions is that they all require a tolerance value: there are very few geometric operations that don't lose some precision after a few operations, so, for these types, exact equality is almost useless.
Tolerance already works for a wide range of Types in NUnit, and could be extended. I guess you have seen that Assert.AreEqual only supports it for float and double, but AreEqual is considered legacy and we have only added new features to the Constraint-based assertions (Assert.That) for many years now. It would be interesting to try some of the types in question and see how they work. I think I would start with the scalar types you mentioned first and then move on to Vectors, etc. In fact, if you want to implement "classic" Assertions for these types, similar to AreEqual, I would strongly recommend you base them on the underlying constraints just as NUnit does, rather than having the actual code in the AreEqual. That way it would be usable with both types of Assert.
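The recommendation above — classic asserts delegating to the constraint model — could look roughly like this. This is a hedged sketch, not NUnit code; the `NumericsAssert` class name and its component-wise comparison are assumptions:

```csharp
using System.Numerics;
using NUnit.Framework;

public static class NumericsAssert
{
    // Hypothetical classic-style assert that delegates to the constraint
    // model, so the same comparison logic backs both Assert styles.
    public static void AreEqual(Vector3 expected, Vector3 actual, double tolerance = 0)
    {
        // Each component is compared through Assert.That + Is.EqualTo,
        // rather than duplicating comparison code inside AreEqual.
        Assert.That(actual.X, Is.EqualTo(expected.X).Within(tolerance), "X");
        Assert.That(actual.Y, Is.EqualTo(expected.Y).Within(tolerance), "Y");
        Assert.That(actual.Z, Is.EqualTo(expected.Z).Within(tolerance), "Z");
    }
}
```

Because the body is just `Assert.That` calls, the failure messages come from the constraints themselves, which is what makes the two styles behave consistently.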
Here's a test I threw together using BigInteger...

```csharp
[Test]
public void BigIntegerEquality()
{
    var big1 = new System.Numerics.BigInteger(ulong.MaxValue);
    var big2 = new System.Numerics.BigInteger(ulong.MaxValue);
    var big3 = big2 - 5;

    Assert.That(big1, Is.EqualTo(big2));
    Assert.That(big1, Is.Not.EqualTo(big3));
    Assert.That(big1, Is.EqualTo(big3).Within(10));
    Assert.That(big1, Is.EqualTo(big3).Within(.01).Percent);
}
```

Key takeaways:

Interestingly, you can fake it by using this Assert, which passes...

```csharp
Assert.That(big1, Is.InRange(big2 - 10, big2 + 10));
```
@CharliePoole Interesting... I have to admit I didn't know about the constraint semantics, and they look very powerful. But I'm not sure how we could fit multidimensional values like Vector3 into the constraints API...

Also, in geometry, checking for normalized values is as common as checking for equality. Right now it could be faked like this:

```csharp
Assert.That(vector.Length(), Is.EqualTo(1).Within(0.0001));
```

But compared to:

```csharp
NumericsAssert.LengthIsOne(vector, 0.0001);
```

I have to write much less 😅 Maybe the constraints should support something like this:

```csharp
Assert.That(vector, Is.Normalized.Within(0.0001));
```
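A minimal sketch of what an `Is.Normalized`-style check could look like as a user-written NUnit constraint, assuming NUnit 3's `Constraint`/`ConstraintResult` extension points. `NormalizedConstraint` is an illustrative name, not an existing NUnit type:

```csharp
using System;
using System.Numerics;
using NUnit.Framework.Constraints;

// Hypothetical custom constraint: passes when a Vector2's length
// is 1 within the given tolerance.
public class NormalizedConstraint : Constraint
{
    private readonly double _tolerance;

    public NormalizedConstraint(double tolerance) => _tolerance = tolerance;

    public override string Description =>
        $"a normalized vector (length 1 within {_tolerance})";

    public override ConstraintResult ApplyTo<TActual>(TActual actual)
    {
        if (actual is Vector2 v)
        {
            double length = Math.Sqrt(v.X * v.X + v.Y * v.Y);
            return new ConstraintResult(this, length, Math.Abs(length - 1) <= _tolerance);
        }
        // Anything that is not a Vector2 fails the constraint.
        return new ConstraintResult(this, actual, false);
    }
}
```

Usage would then be `Assert.That(vector, new NormalizedConstraint(0.0001));`; a static helper class could wrap the constructor to get closer to the `Is.Normalized.Within(...)` syntax above.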
Exactly (that is, your last example)... NUnit is designed to allow you to write your own constraints. But the constraints themselves are not designed to be particularly extensible. The NUnit EqualConstraint is one of the most complex, and adding a numeric type to it requires a good understanding of how it works, but it doesn't actually take that much code. I spiked this for BigInteger:

```csharp
[Test]
public void BigIntegerEquality()
{
    var big1 = new System.Numerics.BigInteger(ulong.MaxValue);
    var big2 = new System.Numerics.BigInteger(ulong.MaxValue);
    var big3 = big2 - 5;

    Assert.That(big1, Is.EqualTo(big2));
    Assert.That(big1, Is.Not.EqualTo(big3));
    Assert.That(big1, Is.EqualTo(big3).Within(10));
    Assert.That(big1, Is.EqualTo(big3).Within(.01).Percent);
    Assert.That(big1, Is.GreaterThan(big3));
    Assert.That(big3, Is.LessThan(big1));
    Assert.That(big3, Is.InRange(big2 - 10, big2 + 10));
}
```

While it didn't take a lot of code, there was a lot of repetition. If I were to do this for real (not a spike, tdd, etc.) I'd probably refactor.

WRT vectors and other multi-valued types, it's a different and possibly simpler problem. NUnit already handles equality of arrays, lists, collections and enumerations. That's part of what makes...

@nunit/framework-team I'm willing to do a PR for BigInteger and any other scalars (I only see Complex in the list) based on my spike, but first there has to be a decision as to whether to support these types.
Equality on multi-valued types like Vector3 or Matrix4x4 would require the same logic as comparing arrays, but unfortunately the numerics types don't implement IEnumerable, so they can't be used with collection constraints. Maybe the solution for multi-valued vectors would be to implicitly convert the contents to arrays, but it would need to be implicit for the whole API. For example, another classic check would be this:

```csharp
Assert.That(new Vector3(0, 0.6f, 1), Is.InRange(Vector3.Zero, Vector3.One));
```

and

```csharp
Assert.That(new Vector3(0, 0.6f, 1), Is.InRange(0, 1));
```

This check would fail if any of the components of a vector is out of range. The first InRange would allow for specific limits for every component. So, I don't know how this would fit with the collection constraints...
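The component-wise range check described above can be approximated today with a small helper, since `Is.InRange` does not understand `Vector3`. A sketch, where the `VectorAssert` class is hypothetical:

```csharp
using System.Numerics;
using NUnit.Framework;

public static class VectorAssert
{
    // Hypothetical helper: passes only when every component of `actual`
    // lies within the corresponding [min, max] component range.
    public static void IsInRange(Vector3 actual, Vector3 min, Vector3 max)
    {
        Assert.That(actual.X, Is.InRange(min.X, max.X), "X");
        Assert.That(actual.Y, Is.InRange(min.Y, max.Y), "Y");
        Assert.That(actual.Z, Is.InRange(min.Z, max.Z), "Z");
    }
}
```

For example, `VectorAssert.IsInRange(new Vector3(0, 0.6f, 1), Vector3.Zero, Vector3.One)` would pass, while a vector with any component outside 0-1 would fail on that component's assert.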
Sure, but the logic for comparing various kinds of arrays, lists, collections, enumerables, etc. didn't exist till we put it there either. Key thing is what the team wants to see inside NUnit versus as an addon. After that, it's only implementation. Use of comparison constraints on collections is not something NUnit currently supports. I've always thought we should, but never found the time to work on it.
I'm in favor of adding tolerance capability for all of the scalar types. If we could come up with a consistent definition of tolerance for the vectors, I'd be in favor of that too. Would it be possible for someone to expect that tolerance for vectors is in terms of euclidean distance rather than elementwise?
@jnm2 No, expecting tolerance only for the length would mean that Vector3(1,0,0) and Vector3(0,1,0) are equal, because their length is the same (one), but they're pointing in different directions.

You could say that vectors have two main "qualities": direction and length. There are times you can check that two vectors point in the same direction, regardless of their length, or you can check their length, regardless of the direction (normalization), or you might want to check both (equality).

But this is when a vector is interpreted in euclidean space. Image processing libraries make heavy use of vectors because they're optimized with SIMD by the JIT; you can see some very extreme uses in the SixLabors.ImageSharp library. For image processing, a Vector4 is interpreted in color space, where the X,Y,Z,W components are interpreted as R,G,B,A, C,M,Y,A or Y,U,V,A values in the range of 0 to 1, so there are a lot of checks usually done in that space: checking for component-wise equality, or checking that all the components stay in the 0-1 range.

Then there are Quaternions, which are a thing of their own... they look like a Vector4, but they represent rotations on a hypersphere, with their own set of unique checks.

I'm beginning to think that maybe the best approach would be, for now, to go for an old-school "NumericsAssert" that gathers all the checks typically done by developers using System.Numerics, and after a while, see how they could fit in the constraints API. Doing it the other way around might lead to misfires: paying too much attention to checks rarely used, and leaving very common checks too verbose to be nice to use.
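The "same direction regardless of length" check mentioned above can be sketched with a dot product of the normalized vectors. `SameDirection` is an illustrative helper, not an NUnit API, and `Math.Clamp` assumes .NET Core 2.0 or later:

```csharp
using System;
using System.Numerics;
using NUnit.Framework;

public static class DirectionAssert
{
    // Hypothetical check: the angle between the two directions must not
    // exceed maxAngleRadians; vector lengths are deliberately ignored.
    public static void SameDirection(Vector3 a, Vector3 b, double maxAngleRadians)
    {
        var dot = Vector3.Dot(Vector3.Normalize(a), Vector3.Normalize(b));
        // Clamp guards against Acos receiving values slightly outside
        // [-1, 1] due to floating point error.
        var angle = Math.Acos(Math.Clamp(dot, -1f, 1f));
        Assert.That(angle, Is.LessThanOrEqualTo(maxAngleRadians));
    }
}
```

With this, `SameDirection(new Vector3(2, 0, 0), new Vector3(5, 0, 0), 0.001)` passes even though the lengths differ, while `Vector3(1,0,0)` versus `Vector3(0,1,0)` fails.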
I'm "old school" but on a different dimension. 😄 I want to subdivide problems so that each individual change to the ongoing system is simple, even trivial if possible. My preferred mode of subdivision here would be operations like equality, comparison, etc., further subdivided by Type. But I'd also be comfortable with Type, subdivided by operation. In either case, I'd only make changes based on one operation for one Type at a time.

If a type in System.Numerics supports an operation which is normally supported by NUnit for more common numeric types, then I think we should support it. In my view, that would include equality with or without a tolerance, the four basic comparison operations, and other secondary operations that those first ones support. I picked equality as the most common operation and BigInteger because it was first on your list, probably arranged alphabetically. It's not hard to implement.

So... I'm still stuck with the question of whether the @nunit/framework-team wants to have any pieces of this idea in NUnit. If I read the discussion correctly, @ChrisMaddock said no, I gave a qualified yes and @jnm2 agreed with both of us. 😄
So depending on the use of the vector, someone might want tolerance to mean the vectors refer to points in euclidean space that are within the tolerant distance of each other:

Or that only vector length is being compared, not direction:

And possibly other interpretations. Therefore I'm only in favor of making our existing tolerance understand the scalar types.

@CharliePoole I think Chris was saying no to new APIs but we will have to wait and see what he says. You also said no to new APIs but suggested making our existing tolerance understand numeric scalar types. I'm in agreement on both points.
@jnm2 In euclidean space, the only three checks that are commonly used are:

Also, for component-wise checks, the most common check is whether the individual elements are within a range, typically between 0 and 1. It just happens that checking for normalization is the same as checking for length == 1, but that specific check is SO common that it makes sense to have a specific check just for it. Also, checking if a quaternion is normalized is essentially the same, that is, quaternion length == 1, but it happens that quaternions don't have "length"; another way to put it is that quaternions only make sense mathematically when their length is 1 (or close).

Anyway, I can see there's a lot of decision making before doing any kind of integration within NUnit, and doing it in a hurry would do no good... so I believe I'm going to develop my own library for my own needs, and at a later time you can review it and adopt whatever you need for NUnit.
I agree with Joseph's last paragraph in #3316 (comment) and I think that one issue is fine to begin with. If the work is larger than expected then we can create multiple issues.
@vpenades That makes sense. I'm still not sure we're talking about the same thing for one item:
For this one in particular, were you saying that elementwise tolerance is what everyone would expect who uses this? In other words, no one would really think of tolerance as euclidean (or e.g. taxicab) distance between vectors A and B?

@CharliePoole I'm thinking the same as Mikkel. I don't have a strong opinion, but I think I'd lean towards opening issues only if we need specific discussion and this issue has gotten too long, or if the work spans multiple milestones. (This is also on the assumption that PRs may be both more or less granular than issues.)
I think any of these two would be fine:

```csharp
// element-wise equality check
public static void AreEqual(Vector3 expected, Vector3 actual, double tolerance = 0)
{
    Assert.AreEqual(expected.X, actual.X, tolerance, "X");
    Assert.AreEqual(expected.Y, actual.Y, tolerance, "Y");
    Assert.AreEqual(expected.Z, actual.Z, tolerance, "Z");
}
```

```csharp
// euclidean equality check
public static void AreEqual(Vector3 expected, Vector3 actual, double tolerance = 0)
{
    var diff = (expected - actual).LengthSquared();
    Assert.AreEqual(0, diff, tolerance * tolerance);
}
```

The second will probably be more correct when vectors represent points in euclidean space... but as I said, vectors are not used only for euclidean space. For example, if Vector3 represents RGB colors, the first, element-wise equality approach would make more sense. And since we cannot make assumptions about which space is being represented by Vector3, it's better to take the more general, element-wise approach.

Also, there are types that don't have an implicit "length" quality, like quaternions and matrices, for which almost the only way to check equality with tolerance is doing it element-wise. For all these reasons, in my implementation I chose element-wise in favour of euclidean. But that's my opinion! Maybe someone with a deeper maths background might differ!
I'm not the person with that background and I'm hesitant to add something that we feel we can't change later, but I'm warming to elementwise for non-scalar types. I'm pretty sure quaternions are in euclidean space, but that's the only one that by definition would be. The rest might be holding color channel values like you say. I do like the symmetry with matrices, sequences, tuples, basically everything else that has the concept of elements. We could always add something like...
I came here having tried to work out how to adapt or extend an existing constraint, or else how to write one and then use it, to make this work:

The one thing I was trying to do was keep the use of...
You can define your own version.
@vpenades Did you do anything more on this issue (now a bit of an old one)?
I wrote my own extensions and I have them in use across some of my projects, at first sight:

But I wrote them like 5 years ago, so they don't follow the modern "fluent" pattern of NUnit. A while ago I took a look at the newer NUnit APIs and I couldn't figure out how to extend the API to support these types.

If it were me, I would consider all the vectors and matrices as "tuples" and treat them as fixed-size collections. So a Vector3 would be element-wise equivalent to a float[3], or a ValueTuple of 3 elements, so I would be able to do this:

```csharp
Assert.That(new Vector3(1, 2, 3), Is.EqualTo(new float[] { 1, 2, 3 }));
Assert.That(new Vector3(1, 2, 3), Is.EqualTo((1f, 2f, 3f)));
```

In these cases, Vector3 and (1f, 2f, 3f) would be converted to array collections. For floating point tolerance...

Then for geometry-specific operations, I would add additional constraints like...

The problem I see with this is that since the constraints are blind to the types, too many constraints would confuse users not using them. Personally, I would have preferred an...

Also, with Net7.0 and the introduction of operator interfaces, I think more people will create their own types based on these operators. So maybe geometric constraints could take advantage of these interfaces. Sadly, only net7 and up.
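The "treat vectors as fixed-size collections" idea could be approximated without changing NUnit by converting the vector explicitly. The `ToArray` extension below is an assumption for illustration, not part of System.Numerics:

```csharp
using System.Numerics;
using NUnit.Framework;

public static class VectorExtensions
{
    // Hypothetical adapter: expose a Vector3 as a float[] so NUnit's
    // existing collection equality (including Within tolerance on the
    // numeric elements) can compare it element-wise.
    public static float[] ToArray(this Vector3 v) => new[] { v.X, v.Y, v.Z };
}

// Usage:
// Assert.That(new Vector3(1, 2, 3).ToArray(),
//             Is.EqualTo(new float[] { 1, 2, 3 }).Within(0.0001f));
```

This keeps the constraint API untouched at the cost of the explicit `.ToArray()` call on the actual value, which is the trade-off against the implicit conversion suggested earlier in the thread.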
Hi,

I'm using the System.Numerics.Vectors namespace quite often, and I'm used to rewriting my own assert tests every time; I am about to consider writing my own library. Just for the sake of not reinventing the wheel, I would like to ask if it could be possible for the NUnit team to consider including Asserts for these types.

I agree a lot of people would like to see the types they use often included with NUnit, but I believe all the types within System.Numerics can be considered intrinsic .NET types. They're implicitly included in NetCore and NetStandard, and even the Roslyn compiler gives them special treatment.

The proposal would be to add a NumericsAssert static class that would cover these types: