v5 Roadmap #563
An idea: we could set a global default unit, for example for pressure, so that when creating a new pressure value the default unit is assumed. My idea is not to change the unit it uses for its calculations behind the scenes, just to change the unit it uses on its inputs/outputs if nothing else is specified.
Doesn't seem like anything the client application couldn't do itself: stash a static "global" or default unit and pass it into the constructor or From method. I think you're also potentially talking about the idea in #547.
I think in general it is not a good idea to tamper with global static variables in a library. If you're using another library or project that also depends on UnitsNet but expects different defaults, you can easily run into issues here. Also, the default behavior shipped with UnitsNet could change in a major version bump, possibly causing similar issues if you have a lot of code just passing in values and assuming some unit will be used. It just sounds flaky with not much benefit in my eyes, but I can understand that the convenience sounds appealing. This kind of convenience is also very easy to build per-application; just create a factory type like this:

```csharp
static class Quantitator2000
{
    // Members of a static class must themselves be static.
    public static PressureUnit DefaultPressureUnit { get; set; } = PressureUnit.Psi;
    public static Pressure Pressure(double value) => new Pressure(value, DefaultPressureUnit);
}

var p = Quantitator2000.Pressure(200); // 200 Psi
```
Extensibility
I'd recommend moving the units to their own interfaces instead of enums. This could of course be implemented for convenience, and could be used in dependency-injection frameworks. The base features could be extended with custom abbreviations/namings that could be automatically picked up by the parsers etc. The biggest issue I'm facing with Units.NET currently is interop with other systems. For the most part the abbreviations are correct, but sometimes they just don't translate well (LPH instead of L/h, ...). A lot of that would be fixable by setting up some dependency injection and allowing library users to inject their custom unit definitions into the system at runtime.
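A minimal sketch of what injectable, interface-based unit definitions might look like. All names here (`IUnitDefinition`, `LiterPerHour`) are illustrative assumptions for this proposal, not the current UnitsNet API:

```csharp
using System.Collections.Generic;

// Hypothetical sketch: units as injectable definitions instead of enum members.
public interface IUnitDefinition
{
    string Name { get; }                          // e.g. "LiterPerHour"
    IReadOnlyList<string> Abbreviations { get; }  // e.g. "L/h", plus interop aliases like "LPH"
    double ToBaseUnit(double value);              // convert to the quantity's base unit
    double FromBaseUnit(double value);            // convert back
}

public sealed class LiterPerHour : IUnitDefinition
{
    public string Name => "LiterPerHour";
    public IReadOnlyList<string> Abbreviations { get; } = new[] { "L/h", "LPH" };
    public double ToBaseUnit(double value) => value / 3600.0;   // L/h -> L/s
    public double FromBaseUnit(double value) => value * 3600.0; // L/s -> L/h
}

// A DI container could then register custom definitions at startup, e.g.:
// services.AddSingleton<IUnitDefinition, LiterPerHour>();
```

With something like this, a parser that resolves abbreviations through the registered `IUnitDefinition` instances would automatically pick up application-specific aliases such as "LPH".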
Compound Units (L/h, m/s)
Units should combine information from their base units, allowing combinations to exist without explicitly adding them: Velocity should be Length/Time, and as such allow any length and time unit, not just the ones defined in the JSONs. If a new length unit gets added, a new velocity unit should become available.
Prefixes (kilo, hecto, deca, ...)
Don't have Kilogram as a separate unit from Gram etc.; kilo, mega, giga, ... are SI prefixes. Both of these would mean that using enums is no longer an option though, but that might be a good thing considering #563 (comment).
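One way to picture the compound-unit idea: every (length unit, time unit) pair implies a velocity unit, with the conversion factor derived rather than hand-listed. This is a hypothetical sketch, not a proposed UnitsNet design; all type names are made up for illustration:

```csharp
// Illustrative sketch: compound units derived from base units,
// so any (length, time) pair yields a velocity unit automatically.
public record LengthUnit(string Abbreviation, double MetersPerUnit);
public record TimeUnit(string Abbreviation, double SecondsPerUnit);

public record VelocityUnit(LengthUnit Length, TimeUnit Time)
{
    public string Abbreviation => $"{Length.Abbreviation}/{Time.Abbreviation}";

    // The conversion factor to m/s follows from the two base units.
    public double MetersPerSecondPerUnit => Length.MetersPerUnit / Time.SecondsPerUnit;
}

var kilometer = new LengthUnit("km", 1000);
var hour = new TimeUnit("h", 3600);
var kmh = new VelocityUnit(kilometer, hour); // abbreviation "km/h", factor 1000/3600
```

Adding a new `LengthUnit` would then make the corresponding velocity units available for free, which is the core of the argument against enum-based unit lists.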
Change all … (example: UnitsNet/UnitsNet/UnitConverter.cs, line 214 in c4cab69).
Isn't this temperature arithmetic confusion a result of conflating points in affine space with translation vectors in affine space? You can't add two points in affine space to get a new location or a vector; it doesn't make sense. i.e. London + Oslo = ?, but London - Oslo = 1,160 miles. A temperature measurement is a point on the temperature scale (a 1D affine space) in relation to the chosen origin. A temperature difference is a translation vector.
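The point/vector distinction can be expressed directly in the type system. A sketch with hypothetical types (`TemperaturePoint`, `TemperatureVector` are illustrative names, not UnitsNet's API), defining only the operations that make sense in an affine space:

```csharp
// Sketch: a point type and a vector type with only the affine-legal operations.
public readonly record struct TemperatureVector(double Kelvins);

public readonly record struct TemperaturePoint(double Kelvins)
{
    // point - point = vector
    public static TemperatureVector operator -(TemperaturePoint a, TemperaturePoint b)
        => new(a.Kelvins - b.Kelvins);

    // point + vector = point
    public static TemperaturePoint operator +(TemperaturePoint p, TemperatureVector v)
        => new(p.Kelvins + v.Kelvins);

    // Deliberately no operator +(TemperaturePoint, TemperaturePoint):
    // adding two points is meaningless, so it simply does not compile.
}
```

Under this scheme, `London + Oslo` style mistakes are rejected at compile time rather than producing a surprising result.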
@AlexC84 You are not wrong, but as outlined in the top post and in the discussions of the issues it links to, I believe it comes down to what people of different technical backgrounds, or people working in different contexts, expect the correct behavior to be. The current behavior is what you consider correct: subtraction results in `TemperatureDelta`. The main argument against changing it is that in order to make a breaking change, I need to be convinced that we are changing it for the better; that more people will like it than dislike it. I don't currently know if that statement holds. So, unless someone brings convincing evidence that the majority will favor changing it, I'm inclined to keep the current behavior.
I would expect 30 C - 20 C to equal 10 (unitless). That's the way Boost.Units works too, actually.
Nice, a third option, this keeps getting better 😆 I don't have any strong opinions on this myself; I think it's fine as it is. I don't see a clear "winner" of the three options, and I reckon some people will probably be confused no matter which method we go with. I do think that returning …
https://www.boost.org/doc/libs/1_45_0/boost/units/absolute.hpp There's the Boost implementation. I don't like it though. We should think about it a bit; I am not currently in the state to, with a newborn :)
Sadly, not. The temperature scale is not an affine space: you can't cross 0 K. Better yet, negative temperatures are hotter than positive ones, and in any case people don't really use kelvins there. The trick here is what temperature differences are actually used for. So, a radical idea: disable addition and implicit subtraction for temperatures; they do not make sense. Keep an explicit temperature delta.
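A sketch of what the "explicit only" proposal could look like in code. This is a hypothetical API shape (`DeltaFrom` is a made-up name), not current UnitsNet:

```csharp
// Sketch: no +/- operators on Temperature; differences must be requested by name.
public readonly struct TemperatureDelta
{
    public double Kelvins { get; }
    public TemperatureDelta(double kelvins) => Kelvins = kelvins;
}

public readonly struct Temperature
{
    public double Kelvins { get; }
    public Temperature(double kelvins) => Kelvins = kelvins;

    // Explicit, named operation instead of operator-:
    public TemperatureDelta DeltaFrom(Temperature other)
        => new TemperatureDelta(Kelvins - other.Kelvins);
}

// var delta = today.DeltaFrom(yesterday); // intent is unambiguous at the call site
```

The trade-off is verbosity: users who expect `today - yesterday` to "just work" have to discover the named method, but the ambiguity discussed above disappears.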
@mostanes Interesting, are you saying that these operations don't make sense to you? Honest question. LINQPad:

```csharp
var temp20C = Temperature.FromDegreesCelsius(20);
var tempDelta5C = TemperatureDelta.FromDegreesCelsius(5);

Temperature temp25C = temp20C + tempDelta5C;
Temperature temp15C = temp20C - tempDelta5C;
TemperatureDelta deltaMinus5C = temp20C - temp25C;

temp25C.ToUnit(TemperatureUnit.DegreeCelsius).Dump();           // 25 °C
temp15C.ToUnit(TemperatureUnit.DegreeCelsius).Dump();           // 15 °C
deltaMinus5C.ToUnit(TemperatureDeltaUnit.DegreeCelsius).Dump(); // -5 ∆°C
```

One thing I do find unintuitive here is that it defaults to the Kelvin unit when the left and right hand side are both Celsius.
This is perfectly fine.
This is a bad idea (thought experiment: why would degrees Celsius -> temperature delta be different from degrees Celsius -> kelvins -> temperature delta?).
From a theoretical physics standpoint, these operations are meaningless. On a more practical note, I'm finding it quite hard to actually find a reason to need these operations (except maybe for numerical stability, and whoever requires numerical stability probably uses their own implementation of temperature). Maybe keeping the support for …
I think it would be better to make it explicit (say, …).
Well, Celsius is more of a pseudo-unit and is therefore ambiguous (just check Wikipedia). This is the point where people should really start thinking about what they're doing, because it implies translating the rigorous types into casual speak (which defeats the point of strong typing if done improperly), so the compiler can't help them with consistency any further. I think this is worth more consideration than the fact that it's outputting (delta?) kelvins instead of delta-Celsius.
Thanks for the details.
I'm not sure I follow. Is this what you meant? If so, the implementation considers them equal.

```csharp
var tempDelta5K = TemperatureDelta.FromKelvins(5);
(tempDelta5C == tempDelta5K).Dump(); // True
```
You are probably right about the former. I'm a layman when it comes to temperature units, so in my simple mind I might want to calculate the temperature difference of today (30 °C) vs yesterday (20 °C) and return that it was 10 degrees Celsius hotter today:

```csharp
TemperatureDelta tempDifference = todayTemperature - yesterdayTemperature; // 10 ∆°C
```

In that particular context, subtraction feels intuitive to me.
I'm positive about removing ambiguous operations like you suggest here. There is clearly a lot of confusion and differing expectations when it comes to temperature arithmetic from people of different backgrounds. I just wish there was a way to design this that made it absolutely intuitive to everyone, so people won't come asking why they can't do a simple …
Point is: 5 Celsius -> 5 Celsius delta, while 5 Celsius -> 278.15 Kelvin -> 278.15 Kelvin delta -> 278.15 Celsius delta. This would not happen when changing proper units.
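The discrepancy between the two conversion paths can be spelled out numerically. Plain arithmetic, not UnitsNet calls:

```csharp
// Path A: reinterpret the Celsius value directly as a delta.
double celsius = 5;
double deltaA = celsius;                        // 5 ∆°C

// Path B: convert to kelvins first, then reinterpret as a delta.
double kelvins = celsius + 273.15;              // 278.15 K
double deltaB = kelvins;                        // 278.15 ∆K, i.e. 278.15 ∆°C
                                                // (delta degrees are the same size in both scales)

// deltaA != deltaB: the result depends on which path you took, which is why a
// Temperature -> TemperatureDelta conversion is ill-defined for non-absolute scales.
```

For proper (absolute) units, conversion factors commute with reinterpretation and no such divergence exists; that is exactly the "would not happen when changing proper units" point above.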
That's why I said an explicit temperature delta is OK for user UI: as a user, you only want to know it's 10 Celsius hotter than yesterday; what exactly is going on behind the scenes is of less importance. As a programmer, though, you need to know precisely what's going on with your code, especially since Celsius is not an absolute scale and you should not just willy-nilly "subtract" temperatures, especially when in Celsius.
This is hard. First …
Would the explicit definition of a "Scale" help? I see that the OM ontology has that addition to their quantities. To quote the paper: …
Here is the list of Scales in the current version of the ontology. Could this somehow be of use to us?
Thanks for the reference. I am not familiar with the OM ontology, but this is absolutely interesting knowledge to lean on. I don't immediately see how to apply this information, but at least the terms … Do you have any ideas for how we could make the scale definition more explicit?
One possible modification to the JSON converter schema.
#1158 merges v5 to master, prerelease nugets are already out. What remains before we can call it a release and postpone the rest for v6? |
It would be great to see the release complete. Fix show-stopping bugs, but set the bar high or the release will never be done.
@MisinformedDNA I agree. I primarily wanted to give it some time to gather feedback. It's been about 2 months since the first v5 prerelease nuget now, so I think it's getting ready to be bumped to stable. I will probably get to it this week, since there is nothing planned that blocks the release.
After 2 months of pre-release, v5 is now considered stable 🎉
@josesimoes Just a heads up, I know you were waiting for the nuget to leave pre-release.
@angularsen Awesome!! Thank you for the heads up. Looking forward to reverting the hack in the nanoFramework pipelines. 😉
TODO: Summarize what will make the cut for v5
Temperature arithmetic
In #518 we discussed changing this. Long story short, it's ambiguous what addition and subtraction of `Temperature` should result in, because 0 Celsius != 0 Fahrenheit != 0 Kelvins. Some people expect the temperature difference (ex: 30 C - 20 C = 10 delta C), others expect a new temperature (ex: 30 C - 20 C = 303 K - 293 K = 10 K = -263 C). We currently don't know what ratio of people favor one way over the other, so to make an informed decision I think we need to poll our userbase somehow or find a way to implement this that will cause the least amount of confusion.
PRs #550 + #560 implemented this change, but we realized we were not able to agree on whether to make this change or not in time for v4.