Provide Asserts for System.Numerics types #3316

Open

vpenades opened this issue Jul 12, 2019 · 28 comments

@vpenades

Hi

I'm using the System.Numerics.Vectors namespace quite often, and I end up rewriting my own assert helpers every time, to the point that I'm considering writing my own library.

Just for the sake of not reinventing the wheel, I would like to ask if it could be possible for the NUnit team to consider including Asserts for these types.

I agree a lot of people would like to see the types they use often included with NUnit, but I believe all the types within System.Numerics can be considered intrinsic .NET types. They're implicitly included in .NET Core and .NET Standard, and even the Roslyn compiler gives them special treatment.

The proposal would be to add a NumericsAssert static class that would cover these types:

  • BigInteger
  • Complex
  • Matrix3x2
  • Matrix4x4
  • Plane
  • Quaternion
  • Vector
  • Vector2
  • Vector3
  • Vector4
@jnm2
Contributor

jnm2 commented Jul 12, 2019

I think we have wanted to deemphasize the *Assert classes in the past in favor of the constraint model.

Can you give examples of the kind of assertions you would like? For example, we already cover equality and inequality comparisons via IEquatable and IComparable.
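
For instance, this already passes today, because the types provide their own equality (values are just illustrative):

    Assert.That(new System.Numerics.Vector3(1, 2, 3), Is.EqualTo(new System.Numerics.Vector3(1, 2, 3)));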

@vpenades
Author

vpenades commented Jul 12, 2019

Most of these types are used for geometric and image processing, so others might want additional assertions, but for now I'll provide the ones I commonly use:

All types:

  • Equality with delta
  • IsFinite or IsReal check (all values are finite values, not NaN and not Infinite)

Vectors:

  • IsNormal: asserts the vector has a length of 1 (with delta).

Matrices:

  • Equality (with delta)
  • Determinant checks
  • Orthogonal checks
  • Invertible checks

Quaternions:

  • IsNormal: asserts the quaternion has a length of 1 (with delta).

Here's some examples I am using in my code:

        public static void AreEqual(Quaternion expected, Quaternion actual, double delta = 0)
        {
            Assert.AreEqual(expected.X, actual.X, delta, "X");
            Assert.AreEqual(expected.Y, actual.Y, delta, "Y");
            Assert.AreEqual(expected.Z, actual.Z, delta, "Z");
            Assert.AreEqual(expected.W, actual.W, delta, "W");
        }

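        // Asserts that the rotation 'a' is within 'angle' radians of the rotation 'b' (assumes unit quaternions).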
        public static void AngleLessOrEqual(Quaternion a, Quaternion b, double angle)
        {
            var c= Quaternion.Concatenate(a, Quaternion.Inverse(b));

            var actualAngle = Math.Acos(c.W) * 2;

            Assert.LessOrEqual(actualAngle, angle);
        }

        public static void IsNormal(Vector2 vector, double delta = 0)
        {
            var lenSquared = vector.X * vector.X + vector.Y * vector.Y;

            Assert.AreEqual(1, lenSquared, delta * delta);
        }

@jnm2
Contributor

jnm2 commented Jul 12, 2019

Great, thanks for the examples! If the team decides in favor of this, we'll probably have it up for grabs since our bandwidth is needed elsewhere. I think we would require a robust set of tests.

@nunit/framework-team What do you think? Since these are BCL types, would this belong in NUnit or in an external library such as the one Vicente was thinking of writing? Would we do a constraint model or Assert class or both?

@ChrisMaddock
Member

I personally think this is too specific an area of .NET to justify inclusion in the main framework - but I think it is exactly the sort of thing that is ideal for a 3rd party library, which NUnit should reference through the docs. To me, this case is exactly what our extensibility options are intended to cover - and (if the team agree) - I wish you the best of luck with it becoming popular, @vpenades.

@CharliePoole
Member

I agree with @ChrisMaddock on everything except equality. That's pretty basic 😺 and so I think it should be dealt with in our implementation. In fact, it pretty much has to be dealt with there, since we provide for extensibility with new constraints but not for adding new type support for a given constraint.

@vpenades
Author

Here you can see what I am using right now... more or less.

@jnm2
Contributor

jnm2 commented Jul 12, 2019

I do agree with Chris and Charlie. Exact equality of all these types already works, but shall we open a separate issue to enable tolerance to work against this list of types in our existing APIs?

I wonder if you could gain some of the benefit by us pointing people to your nuget.org package from our docs as our official recommendation for anyone who needs it. This is just me thinking out loud, not sure what others on the team would think.

@vpenades
Author

Well, the code snippet I posted before is just one of the many copies I have around in my test projects; sometimes I tailor them specifically for each test, but generally speaking they're quite standard tests.

But I feel it's not that much code to justify a full NuGet package project (or maybe it is). I just thought that since the types are BCL and most of the assertions are equality, less-or-equal, greater-or-equal, etc., it could fit into NUnit.

What's critical about all these assertions is that they all require a tolerance value; very few geometric operations don't lose some precision after a few steps, so, for these types, exact equality is almost useless.

@CharliePoole
Member

Tolerance already works for a wide range of types in NUnit, and could be extended. I guess you have seen that Assert.AreEqual only supports it for float and double, but AreEqual is considered legacy and we have only added new features to the constraint-based assertions (Assert.That) for many years now.

It would be interesting to try some of the types in question and see how they work. I think I would start with the scalar types you mentioned first and then move on to Vectors, etc.

In fact, if you want to implement "classic" assertions for these types, similar to AreEqual, I would strongly recommend you base them on the underlying constraints just as NUnit does, rather than putting the actual code in the AreEqual itself. That way it would be usable with both styles of assert.
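
For example, a "classic" wrapper built that way could be as thin as this (just a sketch of the pattern; the Within call assumes the BigInteger tolerance support discussed here is in place):

    public static void AreEqual(System.Numerics.BigInteger expected, System.Numerics.BigInteger actual, System.Numerics.BigInteger tolerance)
    {
        // Delegate to the constraint so the classic and constraint-based styles share one implementation.
        Assert.That(actual, Is.EqualTo(expected).Within(tolerance));
    }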

@CharliePoole
Member

Here's a test I threw together using BigInteger...

        [Test]
        public void BigIntegerEquality()
        {
            var big1 = new System.Numerics.BigInteger(ulong.MaxValue);
            var big2 = new System.Numerics.BigInteger(ulong.MaxValue);
            var big3 = big2 - 5;
            Assert.That(big1, Is.EqualTo(big2));
            Assert.That(big1, Is.Not.EqualTo(big3));
            Assert.That(big1, Is.EqualTo(big3).Within(10));
            Assert.That(big1, Is.EqualTo(big3).Within(.01).Percent);
        }

Key takeaways:

  1. It all compiles
  2. The first two asserts pass, as one would expect.
  3. The third assert fails. That's because NUnit doesn't know it's possible to apply a tolerance to BigInteger. Unfortunately, NUnit has to define "numeric" by a list of Types and BigInteger is not in that list. It should be easy to add it however. Basically, NUnit simply needs to know that it can add and subtract the tolerance from the expected value to come up with a range.

Interestingly, you can fake it by using this Assert, which passes...

            Assert.That(big1, Is.InRange(big2 - 10, big2 + 10));

@vpenades
Author

@CharliePoole Interesting... I have to admit I didn't know about the constraint semantics and they look very powerful.

But I'm not sure how we could fit multidimensional values like Vector3 into the constraints API...

Also, in geometry, checking for normalized values is as common as checking for equality; right now it could be faked like this:

Assert.That(vector.Length(), Is.EqualTo(1).Within(0.0001) );

But compared to:

NumericsAssert.LengthIsOne(vector, 0.0001);

I have to write much less 😅

Maybe the constraints should support something like this:

Assert.That(vector, Is.Normalized.Within(0.0001) );

@CharliePoole
Member

Exactly (that is, your last example)...

NUnit is designed to allow you to write your own constraints. But the constraints themselves are not designed to be particularly extensible. The NUnit EqualConstraint is one of the most complex, and adding a numeric type to it requires a good understanding of how it works, but it doesn't actually take that much code.
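
For instance, the Is.Normalized idea above could start life as a user-defined constraint along these lines (a rough sketch; here the tolerance goes through the constructor rather than through Within):

    using System;
    using System.Numerics;
    using NUnit.Framework.Constraints;

    public class NormalizedConstraint : Constraint
    {
        private readonly double _tolerance;

        public NormalizedConstraint(double tolerance = 0) => _tolerance = tolerance;

        public override string Description => "a vector or quaternion of length 1";

        public override ConstraintResult ApplyTo<TActual>(TActual actual)
        {
            // Sketch only: a few types handled explicitly; others would need their own cases.
            double? length = null;
            if (actual is Vector2 v2) length = v2.Length();
            else if (actual is Vector3 v3) length = v3.Length();
            else if (actual is Vector4 v4) length = v4.Length();
            else if (actual is Quaternion q) length = q.Length();

            bool isSuccess = length.HasValue && Math.Abs(length.Value - 1.0) <= _tolerance;
            return new ConstraintResult(this, actual, isSuccess);
        }
    }

    // usage: Assert.That(vector, new NormalizedConstraint(0.0001));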

I spiked this for BigInteger. This test, expanded from the above, all passes for me...

        [Test]
        public void BigIntegerEquality()
        {
            var big1 = new System.Numerics.BigInteger(ulong.MaxValue);
            var big2 = new System.Numerics.BigInteger(ulong.MaxValue);
            var big3 = big2 - 5;

            Assert.That(big1, Is.EqualTo(big2));
            Assert.That(big1, Is.Not.EqualTo(big3));
            Assert.That(big1, Is.EqualTo(big3).Within(10));
            Assert.That(big1, Is.EqualTo(big3).Within(.01).Percent);

            Assert.That(big1, Is.GreaterThan(big3));
            Assert.That(big3, Is.LessThan(big1));
            Assert.That(big3, Is.InRange(big2 - 10, big2 + 10));
        }

While it didn't take a lot of code, there was a lot of repetition. If I were to do this for real (not a spike; with TDD, etc.) I'd probably refactor Numerics.cs. It has been OK because we have not added any types for a long time, but if we are to add more, I think it should be better organized for doing that.

WRT vectors and other multi-valued types, it's a different and possibly simpler problem. NUnit already handles equality of arrays, lists, collections and enumerations. That's part of what makes EqualConstraint fairly complex. We would again be extending it, but in a slightly different part of the code, which is why I suggested doing scalars before vectors, etc.

@nunit/framework-team I'm willing to do a PR for BigInteger and any other scalars (I only see Complex in the list) based on my spike, but first there has to be a decision as to whether to support these types.

@vpenades
Author

Equality on multi-valued types like Vector3 or Matrix4x4 would require the same logic as comparing arrays, but unfortunately the numerics types don't implement IEnumerable, so they can't be used with collection constraints.

Maybe the solution for multi-valued vectors would be to implicitly convert the content to arrays, but it would need to be implicit for the whole API.

For example, another classic check would be this:

Assert.That( new Vector3(0,0.6f,1), Is.InRange(Vector3.Zero, Vector3.One) );

and

Assert.That( new Vector3(0,0.6f,1), Is.InRange(0, 1) );

This check would fail if any component of the vector is out of range. The first InRange would allow specific limits for every component.

So, I don't know how this would fit with the collections constraints...
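
Right now, the closest approximation I can see is to spell it out per component, something like this (purely illustrative):

    var v = new Vector3(0, 0.6f, 1);

    // Check each component separately; Assert.Multiple reports every failing component at once.
    Assert.Multiple(() =>
    {
        Assert.That(v.X, Is.InRange(0f, 1f), "X");
        Assert.That(v.Y, Is.InRange(0f, 1f), "Y");
        Assert.That(v.Z, Is.InRange(0f, 1f), "Z");
    });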

@CharliePoole
Member

Sure, but the logic for comparing various kinds of arrays, lists, collections, enumerables, etc. didn't exist till we put it there either. The key thing is what the team wants to see inside NUnit versus as an add-on. After that, it's only implementation.

Use of comparison constraints on collections is not something NUnit currently supports. I've always thought we should, but never found the time to work on it.

@jnm2
Contributor

jnm2 commented Jul 13, 2019

I'm in favor of adding tolerance capability for all of the scalar types. If we could come up with a consistent definition of tolerance for the vectors, I'd be in favor of that too. Would it be possible for someone to expect that tolerance for vectors is in terms of euclidean distance rather than elementwise?

@vpenades
Author

@jnm2 No, expecting tolerance only for the length would mean that Vector3(1,0,0) and Vector3(0,1,0) are equal, because their length is the same (one), but they're pointing in different directions.

You could say that vectors have two main "qualities": direction and length. There are times you check that two vectors point in the same direction, regardless of their length; or you check their length, regardless of the direction (normalization); or you might want to check both (equality).

But this is when a vector is interpreted in Euclidean Space.

Image processing libraries make heavy use of vectors because they get SIMD-optimized by the runtime; you can see some very extreme uses in the SixLabors.ImageSharp library. For image processing, a Vector4 is interpreted in color space, where the X,Y,Z,W components are interpreted as R,G,B,A, C,M,Y,A or Y,U,V,A values in the range of 0 to 1, so there are a lot of checks usually done in that space: checking for component-wise equality, or checking that all the components stay in the 0-1 range.

Then there are quaternions, which are a thing of their own... they look like a Vector4, but they represent rotations on a hypersphere, with their own set of unique checks.

I'm beginning to think that maybe the best approach would be, for now, to go for an old-school "NumericsAssert" that gathers all the checks typically done by developers using System.Numerics, and after a while see how they could fit into the constraints API. Doing it the other way around might lead to misfires: paying too much attention to rarely used checks, and leaving very common checks too verbose to be pleasant to use.

@CharliePoole
Member

I'm "old school" but on a different dimension. 😄 I want to subdivide problems so that each individual change to the ongoing system is simple, even trivial if possible. My preferred mode of subdivision here would be operations like equality, comparison, etc. further subdivided by Type. But I'd also be comfortable with Type, subdivided by operation. In either case, I'd only make changes based on one operation for one Type at a time.

If a type in System.Numerics supports an operation that NUnit normally supports for more common numeric types, then I think we should support it. In my view, that would include equality with or without a tolerance, the four basic comparison operations, and secondary operations built on those, like InRange.

I picked equality as the most common operation and BigInteger because it was first on your list, probably arranged alphabetically. It's not hard to implement.

So... I'm still stuck with the question of whether the @nunit/framework-team wants to have any pieces of this idea in NUnit. If I read the discussion correctly, @ChrisMaddock said no, I gave a qualified yes and @jnm2 agreed with both of us. 😄

@jnm2
Contributor

jnm2 commented Jul 13, 2019

So depending on the use of the vector, someone might want tolerance to mean the vectors refer to points in euclidean space that are within the tolerance distance of each other:

sqrt((v1.x - v2.x)^2 + (v1.y - v2.y)^2) <= tolerance

Or that only vector length is being compared, not direction:

abs(sqrt(v1.x^2 + v1.y^2) - sqrt(v2.x^2 + v2.y^2)) <= tolerance

And possibly other interpretations. Therefore I'm only in favor of making our Within API work with all the scalar numeric types.

@CharliePoole I think Chris was saying no to new APIs but we will have to wait and see what he says. You also said no to new APIs but suggested making our existing tolerance understand numeric scalar types. I'm in agreement on both points.

@vpenades
Author

vpenades commented Jul 15, 2019

@jnm2 in euclidean space, the only three checks that are commonly used are:

  • the values are real and finite (not NaN, not infinite)
  • Is the vector A equal to vector B (with tolerance)
  • Is vector A normalized (with tolerance)

Also, for component-wise checks, the most common one is whether the individual elements are within a range, typically between 0 and 1.

It just happens that checking for normalization is the same as checking for length == 1, but that specific check is SO common that it makes sense to have a dedicated check just for it. Also, checking if a quaternion is normalized is essentially the same, that is, quaternion length == 1, but quaternions don't really have a "length" quality; another way to put it is that quaternions only make sense mathematically when their length is 1 (or close).

Anyway, I can see there's a lot of decision making to do before any kind of integration within NUnit, and doing it in a hurry would do no good... so I believe I'm going to go ahead with my own library for my own needs, and at a later time you can review it and adopt whatever you need for NUnit.

@CharliePoole
Member

@jnm2 In that case, I'll do some PRs as time allows. Do you think we should have an issue per Type or just work against this one?

@vpenades I'll invite you to review them. I'm also interested in your ideas on vectors for another framework I'm working on, so I'll be in touch.

@mikkelbu
Member

I agree with Joseph's last paragraph in #3316 (comment) and I think that one issue is fine to begin with. If the work is larger than expected then we can create multiple issues.

@jnm2
Contributor

jnm2 commented Jul 15, 2019

@vpenades That makes sense. I'm still not sure we're talking about the same thing for one item:

Is the vector A equal to vector B (with tolerance)

For this one in particular, were you saying that elementwise tolerance is what everyone would expect who uses this? In other words, no one would really think of tolerance as euclidean (or e.g. taxicab) distance between vectors A and B?

@CharliePoole I'm thinking the same as Mikkel. I don't have a strong opinion, but I think I'd lean towards opening issues only if we need specific discussion and this issue has gotten too long, or if the work spans multiple milestones. (This is also on the assumption that PRs may be both more or less granular than issues.)

@vpenades
Author

vpenades commented Jul 15, 2019

@vpenades That makes sense. I'm still not sure we're talking about the same thing for one item:

Is the vector A equal to vector B (with tolerance)

For this one in particular, were you saying that elementwise tolerance is what everyone would expect who uses this? In other words, no one would really think of tolerance as euclidean (or e.g. taxicab) distance between vectors A and B?

I think any of these two would be fine:

// element-wise equality check
public static void AreEqual(Vector3 expected, Vector3 actual, double tolerance = 0)
{
    Assert.AreEqual(expected.X, actual.X, tolerance, "X");
    Assert.AreEqual(expected.Y, actual.Y, tolerance, "Y");
    Assert.AreEqual(expected.Z, actual.Z, tolerance, "Z");
}

// euclidean equality check
public static void AreEqual(Vector3 expected, Vector3 actual, double tolerance = 0)
{
    var diff = (expected - actual).LengthSquared();
    Assert.AreEqual(0, diff, tolerance * tolerance);
}

The second will probably be more correct when vectors represent points in euclidean space... but as I said, vectors are not used only for euclidean space.

For example, if Vector3 represents RGB colors, the first element-wise equality approach would make more sense. And since we cannot make assumptions about which space is being represented by Vector3, it's better to take the more general, element-wise approach.

Also, there are types that don't have an implicit "length" quality, like quaternions and matrices, for which almost the only way to check equality with tolerance is doing it element-wise.

For all these reasons, in my implementation I chose element-wise over euclidean.

But that's just my opinion! Maybe someone with a deeper maths background might differ!

@jnm2
Contributor

jnm2 commented Jul 15, 2019

I'm not the person with that background and I'm hesitant to add something that we feel we can't change later, but I'm warming to elementwise for non-scalar types. I'm pretty sure quaternions are in euclidean space, but that's the only one that by definition would be. The rest might be holding color channel values like you say. I do like the symmetry with matrices, sequences, tuples, basically everything else that has the concept of elements.

We could always add something like .Within(tolerance, ToleranceMetric.Euclidean) if we guess wrong and it turns out to be common.

@ClickRick

ClickRick commented Dec 1, 2019

I came here having tried to work out how to adapt or extend an existing constraint, or else how to write one and then use it, to make this work:

    [Test]
    public void BigIntegerEquality()
    {
        var big = new System.Numerics.BigInteger(1);
        Assert.That(big, Is.Positive);
    }

The one thing I was trying to do was keep the use of Is consistent with other numeric tests, but I suspect I'd need to add my own static class BigIs or some such.
Unless I've missed an opportunity here?
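
Something along these lines is what I had in mind (BigIs is my own hypothetical class, not an existing NUnit type):

    public class BigIs : NUnit.Framework.Is
    {
        // Hides Is.Positive with a comparison against BigInteger.Zero, so the
        // BigInteger IComparable path is used instead of the default int 0.
        public static new NUnit.Framework.Constraints.GreaterThanConstraint Positive
            => new NUnit.Framework.Constraints.GreaterThanConstraint(System.Numerics.BigInteger.Zero);
    }

    // usage: Assert.That(big, BigIs.Positive);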

@lucid-dreamm

lucid-dreamm commented Apr 11, 2022

You can define your own version.
An example:

using System.Collections.Generic;
using UnityEngine;

namespace TestTools
{
    public class Vector3IntEqualityComparer : IEqualityComparer<Vector3Int>
    {
        private static readonly Vector3IntEqualityComparer m_Instance = new Vector3IntEqualityComparer();

        /// <summary>
        /// A comparer instance
        ///</summary>
        public static Vector3IntEqualityComparer Instance => m_Instance;

        /// <summary>
        /// Initializes a new instance of the Vector3IntEqualityComparer class.
        /// </summary>
        public Vector3IntEqualityComparer()
        {
        }

        /// <summary>
        /// Compares the actual and expected Vector3Int objects for equality.
        /// </summary>
        /// <param name="expected">The expected Vector3Int used for comparison</param>
        /// <param name="actual">The actual Vector3Int to test</param>
        /// <returns>True if the vectors are equal, false otherwise.</returns>
        /// <example>
        /// The following example shows how to verify if two Vector3Int are equal
        ///<code>
        ///[TestFixture]
        /// public class Vector3IntTest
        /// {
        ///     [Test]
        ///     public void VerifyThat_TwoVector3IntObjectsAreEqual()
        ///     {
        ///         var actual = new Vector3Int(0, 0, 0);
        ///         var expected = new Vector3Int(0, 0, 0);
        ///         var comparer = new Vector3IntEqualityComparer();
        /// 
        ///         Assert.That(actual, Is.EqualTo(expected).Using(comparer));
        /// 
        ///         actual = new Vector3Int(0, 0, 0);
        ///         expected = new Vector3Int(0, 1, 0);
        /// 
        ///         Assert.That(actual, Is.EqualTo(expected).Using(Vector3IntEqualityComparer.Instance));
        ///      }
        ///  }
        /// </code>
        /// </example>
        public bool Equals(Vector3Int expected, Vector3Int actual)
        {
            return expected.x == actual.x && expected.y == actual.y && expected.z == actual.z;
        }

        /// <summary>
        /// Serves as the default hash function.
        /// </summary>
        /// <param name="vec2">A not null Vector3Int</param>
        /// <returns>Returns 0</returns>
        public int GetHashCode(Vector3Int vec2)
        {
            return 0;
        }
    }
}

@OsirisTerje
Member

@vpenades Did you do anything more on this issue (it's a bit of an old one now)?
@nunit/framework-team This looks like an interesting case - anyone interested? And: extra library, or built-in?

@vpenades
Author

vpenades commented Apr 23, 2024

@vpenades Did you do anything more on this issue (it's a bit of an old one now)? @nunit/framework-team This looks like an interesting case - anyone interested? And: extra library, or built-in?

I wrote my own extensions and I have them in use across some of my projects, at first sight:

But I wrote them like 5 years ago, so they don't follow the modern "fluent" pattern of NUnit. A while ago I took a look at the newer NUnit APIs and I couldn't figure out how to extend the API to support these types.

If it were up to me, I would consider all the vectors and matrices as "tuples" and treat them as fixed-size collections.

So a Vector3 would be element-wise equivalent to a float[3], or a ValueTuple of 3 elements, and I would be able to do this:

Assert.That(new Vector3(1, 2, 3), Is.EqualTo(new float[] { 1, 2, 3 }));
Assert.That(new Vector3(1, 2, 3), Is.EqualTo((1f, 2f, 3f)));

In these cases, Vector3 and (1f, 2f, 3f) would be converted to array collections.

For floating-point tolerance, .Within(x), I would apply it element-wise too; maybe .Within could be extended to support collections, expecting the same number of elements in the compared collections.
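
One stop-gap that already works with the current API is plugging an element-wise comparer into Using; a rough sketch (the comparer type and the tolerance value are just illustrative):

    using System;
    using System.Collections.Generic;
    using System.Numerics;

    public sealed class Vector3Comparer : IEqualityComparer<Vector3>
    {
        private readonly float _tolerance;

        public Vector3Comparer(float tolerance) => _tolerance = tolerance;

        // Element-wise comparison with tolerance, as discussed above.
        public bool Equals(Vector3 a, Vector3 b) =>
            Math.Abs(a.X - b.X) <= _tolerance &&
            Math.Abs(a.Y - b.Y) <= _tolerance &&
            Math.Abs(a.Z - b.Z) <= _tolerance;

        public int GetHashCode(Vector3 v) => 0; // tolerant equality cannot produce a consistent hash
    }

    // usage: Assert.That(actual, Is.EqualTo(expected).Using(new Vector3Comparer(1e-4f)));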

Then, for geometry-specific operations, I would add additional constraints like Has.GeometricLength(xxx).

The problem I see with this is that since the constraints are blind to the types, too many constraints would confuse users not using them.

Personally, I would have preferred an Assert.That(value).TypeAwareConstraint(xxx), so only constraints applicable to the given type would show up in IntelliSense.

Also, with .NET 7.0 and the introduction of the generic math operator interfaces, I think more people will create their own types based on these operators.

So maybe geometric constraints could take advantage of these interfaces. Sadly, that's only .NET 7 and up.
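
A rough sketch of what those interfaces would allow (a hypothetical helper, requiring the .NET 7 generic math interfaces):

    using System.Numerics;

    static class ApproxMath
    {
        // One tolerant comparison for any type implementing INumber<T> (int, double, BigInteger, ...).
        public static bool ApproximatelyEqual<T>(T expected, T actual, T tolerance)
            where T : INumber<T>
            => T.Abs(expected - actual) <= tolerance;
    }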
