FieldCount with a computed ignored property is filling the array with invalid element count #216
Can you please put this together into a short-form test case showing the actual source data (the serialized form) that is runnable and reproducible for someone else? There are inconsistencies with your screenshot. Your code shows a class with members:
And yet your debug view from supposedly this same class shows members of:
So it's hard to tell whether you are providing the wrong source data, or whether there actually is something going on in the BinarySerializer library. You should be able to easily put together a test case in line with this: with your source data and your expected output data, and then it should fail. |
I have stripped the class to remove noise and make the actual problem easy to see. Here is a real and complete example:

using BinarySerialization;
namespace IgnoreBind;
public sealed class Preview
{
/// <summary>
/// Gets the section name placeholder
/// </summary>
[FieldOrder(0)][FieldLength(12)][SerializeAs(SerializedType.TerminatedString)] public string TableName { get; set; }
/// <summary>
/// Gets the length of this section
/// </summary>
[FieldOrder(1)] public uint TableLength { get; set; }
/// <summary>
/// Gets the image width, in pixels.
/// </summary>
[FieldOrder(2)] public uint ResolutionX { get; set; } = 224;
/// <summary>
/// Gets the operation mark 'x'
/// </summary>
[FieldOrder(3)][FieldLength(4)][SerializeAs(SerializedType.TerminatedString)] public string Mark { get; set; } = "x";
/// <summary>
/// Gets the image height, in pixels.
/// </summary>
[FieldOrder(4)] public uint ResolutionY { get; set; } = 168;
[Ignore] public uint DataSize => ResolutionX * ResolutionY * 2;
[FieldOrder(5)][FieldCount(nameof(DataSize))] public byte[] Data { get; set; } = null!;
public override string ToString()
{
return $"{nameof(TableName)}: {TableName}, {nameof(TableLength)}: {TableLength}, {nameof(ResolutionX)}: {ResolutionX}, {nameof(Mark)}: {Mark}, {nameof(ResolutionY)}: {ResolutionY}, {nameof(DataSize)}: {DataSize}, {nameof(Data)}: {Data.Length}";
}
}
sealed class Program
{
static void Main(string[] args)
{
var serializer = new BinarySerializer { Endianness = Endianness.Little };
using var inputFile = new FileStream("testfile.px6s", FileMode.Open, FileAccess.Read);
inputFile.Seek(28, SeekOrigin.Begin); // Go to preview address value
var address = serializer.Deserialize<uint>(inputFile);
inputFile.Seek(address, SeekOrigin.Begin); // Go to preview address
var preview = serializer.Deserialize<Preview>(inputFile);
Console.WriteLine(preview);
if (preview.Data.Length != preview.DataSize)
{
Console.WriteLine($"ERROR: Expecting data to be {preview.DataSize} bytes, but got {preview.Data.Length} bytes!");
}
}
}

Project sample: IgnoreBind.zip with test file |
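For anyone without the original .px6s file, a small Python script (my own sketch, not part of the project) can fabricate a minimal file matching the Preview layout above, with the preview placed right after a 32-byte header whose offset 28 holds the preview address, as the Main method expects. The table name "PREVIEW" and the 32-byte header size are assumptions for illustration only.

```python
import struct

res_x, res_y = 4, 3
pixel_data = bytes(i % 256 for i in range(res_x * res_y * 2))

preview = (
    b"PREVIEW\x00".ljust(12, b"\x00")   # TableName: 12-byte terminated string (name assumed)
    + struct.pack("<I", 0)              # TableLength (uint, little-endian)
    + struct.pack("<I", res_x)          # ResolutionX
    + b"x\x00\x00\x00"                  # Mark: 4-byte terminated string "x"
    + struct.pack("<I", res_y)          # ResolutionY
    + pixel_data                        # Data: ResolutionX * ResolutionY * 2 bytes
)

header = bytearray(32)                  # header size is an assumption
preview_address = len(header)
header[28:32] = struct.pack("<I", preview_address)  # offset 28: preview address

with open("testfile.px6s", "wb") as f:
    f.write(header + preview)
```

Running the repro program against a file generated this way should make the failure reproducible without the original sample.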
Works for me...

namespace BinarySerialization.Test.Issues.Issue216
{
public class Preview
{
/// <summary>
/// Gets the image width, in pixels.
/// </summary>
[FieldOrder(0)]
public uint ResolutionX { get; set; }
/// <summary>
/// Gets the image height, in pixels.
/// </summary>
[FieldOrder(2)]
public uint ResolutionY { get; set; }
[Ignore]
public uint DataSize => ResolutionX * ResolutionY * 2;
[FieldOrder(3)]
[FieldCount(nameof(DataSize), BindingMode = BindingMode.OneWay)]
public byte[] Data { get; set; }
}
}
using Microsoft.VisualStudio.TestTools.UnitTesting;
namespace BinarySerialization.Test.Issues.Issue216
{
[TestClass]
public class Issue216Tests : TestBase
{
[TestMethod]
public void TestIssue216()
{
var expected = new Preview
{
ResolutionX = 2,
ResolutionY = 3,
Data = new byte[] { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12 },
};
var actual = Roundtrip(expected, new byte[] { 2, 0, 0, 0, 3, 0, 0, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12 });
Assert.AreEqual(expected.ResolutionX, actual.ResolutionX);
Assert.AreEqual(expected.ResolutionY, actual.ResolutionY);
Assert.AreEqual(expected.Data.Length, actual.Data.Length);
for (int i = 0; i < expected.Data.Length; i++)
Assert.AreEqual(expected.Data[i], actual.Data[i]);
}
}
} |
There is a bad habit of blindly trusting tests we write ourselves, which means there is a chance the test itself is written incorrectly. I think that is the case here: your test is not valid. Anyway, we don't need a test; let's look at the evidence:
To prove my theory, I just trimmed the file to end at the preview data. This way it now works as expected (because the file ends at preview.Data). I don't know what is happening inside the library, but for sure it doesn't like this use case. |
If you think the unit test I've written does not demonstrate your problem, you are free to rewrite it so that it does. |
In order to see your test fail, just add a field after Data:

public class Preview
{
/// <summary>
/// Gets the image width, in pixels.
/// </summary>
[FieldOrder(0)]
public uint ResolutionX { get; set; }
/// <summary>
/// Gets the image height, in pixels.
/// </summary>
[FieldOrder(2)]
public uint ResolutionY { get; set; }
[Ignore]
public uint DataSize => ResolutionX * ResolutionY * 2;
[FieldOrder(3)]
[FieldCount(nameof(DataSize), BindingMode = BindingMode.OneWay)]
public byte[] Data { get; set; }
[FieldOrder(4)] public uint AnotherField { get; set; }
} |
With a ValueGetter, the getter appears to be evaluated only once, when initially converted to a TypeNode. This is unfortunately prior to any deserialized values being realised. This is a slightly hacky solution, but it simply re-attempts to evaluate getters where this is the case. Signed-off-by: Bevan Weiss <bevan.weiss@gmail.com>
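A minimal sketch (not the library's actual code) of the difference the commit message describes: caching a getter's result once at tree-construction time versus re-evaluating it on access, after deserialized values have arrived.

```python
class FieldNode:
    """Toy stand-in for a TypeNode holding a bound value getter."""

    def __init__(self, getter):
        self._getter = getter
        self._cached = getter()   # evaluated once, eagerly, at construction

    @property
    def eager(self):
        # stale if the underlying state changed after construction
        return self._cached

    @property
    def lazy(self):
        # re-evaluated on every access, sees deserialized values
        return self._getter()


state = {"x": 224, "y": 168}
node = FieldNode(lambda: state["x"] * state["y"] * 2)

state["x"] = 2                    # simulate a value arriving during deserialization
print(node.eager)                 # 75264 — pre-deserialization snapshot
print(node.lazy)                  # 672   — reflects the deserialized value
```

The PR's "re-attempt to evaluate getters" fix corresponds to moving from the eager to the lazy behaviour for computed `[Ignore]` properties.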
@sn4k3 are you able to check the PR that I created that should address the more complex getter situation you were using on the [Ignore] field? |
Going to try to look at this tomorrow, thanks. |
My inclination is to say that this is "as designed". The way the serializer works is that it creates a parallel object tree and serializes that rather than the value itself. There is currently no attempt to preserve internal logic from the original object, as this would create a deep rabbit hole. While some very basic things could be possible, it feels like a slippery slope with an ill-defined bottom (e.g. what about stateful things? Whose state?).

These are the moments when you realize that we are simply using the language as scaffolding for serialization and ignoring the parts that may not make as much sense. To that end, I've often thought that BinarySerializer should probably be based on JSON or some such thing to remove the temptation to go beyond basic declaration. However, the ability to use POCOs has proven too much for me to give up.

I'm open to a discussion on this if anyone has any clever ideas, but I do think the limit and scope of such a capability would need to be well-defined and obvious to the consumer. I'll admit that it may not currently be obvious, as it may seem intuitive that basic calculated fields would be supported. |
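To illustrate the "parallel object tree" point above, here is a hypothetical sketch (my own simplification, not the library's implementation): a tree builder snapshots each member's current value, so the logic inside a computed property does not travel with the tree.

```python
class Node:
    """One node in the parallel tree: a name and a snapshotted value."""
    def __init__(self, name, value):
        self.name = name
        self.value = value


def build_tree(obj, members):
    # Snapshot each member's current value; any computed-property
    # logic on obj is lost — only the value at this instant survives.
    return [Node(m, getattr(obj, m)) for m in members]


class Preview:
    def __init__(self):
        self.resolution_x = 224
        self.resolution_y = 168

    @property
    def data_size(self):              # computed, not a stored field
        return self.resolution_x * self.resolution_y * 2


p = Preview()
tree = build_tree(p, ["resolution_x", "resolution_y", "data_size"])

p.resolution_x = 2                    # later change on the original object
print(tree[2].value)                  # 75264 — stale snapshot in the tree
print(p.data_size)                    # 672   — live computed property
```

This is why preserving arbitrary internal logic across the tree boundary is the "rabbit hole" described above.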
I think there should be an easy way to compute a value and use it together with the serializer. Having to create a converter or implement a custom serializer just to achieve the same thing as a simple computed property is somewhat painful. The worst part is that the library allows it (no throw), leading to unexpected results. I have been trying this for years, ever since I started using the library, and it took me a very long time to realize the problem. I also looked through the issues, because there's no way I'm the only one who wants this (it is the most common and logical way to express it), and found this one. However, it looked like it worked for that person, so I thought: "Am I crazy? How does it not work for me!? Is my code wrong? What am I doing wrong?". None of my attempts worked, which led me to open the issue two years ago, and now again with more details.

I understand that using fields outside the scope of the serializer can lead to "holes", but then we must make sure it's well documented and that a proper error is thrown, to avoid what happened to me. Like when the serializer finds a …

Still, I think we need an easy way to achieve this, so if ignored fields aren't the way, what about adding a new attribute like FieldVirtual? Example:

public class Preview
{
/// <summary>
/// Gets the image width, in pixels.
/// </summary>
[FieldOrder(0)] public uint ResolutionX { get; set; } = 224;
/// <summary>
/// Gets the operation mark 'x'
/// </summary>
[FieldOrder(1)] [FieldLength(4)] [SerializeAs(SerializedType.TerminatedString)] public string Mark { get; set; } = "x";
/// <summary>
/// Gets the image height, in pixels.
/// </summary>
[FieldOrder(2)] public uint ResolutionY { get; set; } = 168;
[FieldVirtual] public uint DataSize => ResolutionX * ResolutionY * 2;
[FieldOrder(3)] [FieldCount(nameof(DataSize))] public byte[] Data { get; set; } = Array.Empty<byte>();
} |
@jefffhaynes I'm unsure if there's an easy way to determine whether a getter/setter is 'too complex', in which case an exception could be thrown to surface the issue. Alternatively, the serialiser/deserialiser could return a tuple with the value and a list of warnings from the serialisation/deserialisation (things that aren't bad enough to warrant no result, but might suggest the operation wasn't exactly as intended). I do think this [Ignore] calculated field, along with the serialised equivalent, makes sense as used in Issue216. Maybe the big problems can just be left for now, and simply executing the ValueGetter where present would suffice. I disagree with adding extra terminology. |
Ok agree we should do something. Let me think about it. |
Deserializing that results in:
Note the outrageous Data[10730659] instead of the expected 75264.
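For reference, the expected 75264 comes straight from the DataSize getter with the defaults shown in the Preview class (the factor of 2 is assumed to be two bytes per pixel):

```python
resolution_x = 224   # default ResolutionX from the Preview class
resolution_y = 168   # default ResolutionY from the Preview class
bytes_per_pixel = 2  # assumption: the *2 in DataSize means 2 bytes per pixel

data_size = resolution_x * resolution_y * bytes_per_pixel
print(data_size)     # 75264
```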