
Decimal is truncated when saving multiple records on SQL Server #6538

Closed
recalde opened this issue Sep 15, 2016 · 22 comments
recalde commented Sep 15, 2016

Steps to reproduce

  1. Create a test schema with 20 fields, each decimal(15,6)
  2. Create a new C# console app targeting .NET 4.6.1
  3. Scaffold EF
  4. Insert 1 record with decimal values (confirm that this works correctly)
  5. Insert 2 records with decimal values (confirm that the values are rounded from 1.123456 to 1.000000)
using (var context = new Data.TestRounding.c8c48i_b92605Context())
{
    for (int i = 0; i < recordCount; i++)
    {
        context.Test.Add(new Data.TestRounding.Test
        {
            TestField01 = 1.123456m,
            TestField02 = 2.123456m,
            TestField03 = 3.123456m,
            TestField04 = 4.123456m,
            TestField05 = 5.123456m,
            TestField06 = 6.123456m,
            TestField07 = 7.123456m,
            TestField08 = 8.123456m,
            TestField09 = 9.123456m,
            TestField10 = 10.123456m,
            TestField11 = 11.123456m,
            TestField12 = 12.123456m,
            TestField13 = 13.123456m,
            TestField14 = 14.123456m,
            TestField15 = 15.123456m,
            TestField16 = 16.123456m,
            TestField17 = 17.123456m,
            TestField18 = 18.123456m,
            TestField19 = 19.123456m,
            TestField20 = 20.123456m,
        });
    }
    context.SaveChanges();
}

The issue

Decimal is rounded from 1.123456 to 1.000000 when multiple rows are saved.

Further technical details

EF Core 1.0.0
.NET 4.6.1
Console Application
Windows 7
VS 2015
SQL Server 2012 RC2

@divega divega self-assigned this Sep 15, 2016
divega commented Sep 15, 2016

@recalde the problem is in your OnModelCreating() code, where you are configuring each property like this:

  entity.Property(e => e.TestField02)
    .HasColumnType("decimal")
    .HasDefaultValueSql("99.99");

In particular, the call to HasColumnType("decimal") overrides any precision or scale that EF would use by default for the column. Instead, you get the same precision and scale you would get if you declared the column to be of type "decimal" directly in SQL Server, i.e. the default precision of 18 and the default scale of 0.

Instead, you could call HasColumnType("decimal(18,6)") or similar, or avoid invoking HasColumnType() altogether to get the defaults from EF Core (we use a scale of 2 by default).
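For example, a minimal sketch of that configuration, reusing the entity and property names from the scaffolded code quoted above (illustrative only, not the full generated OnModelCreating()):

```csharp
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Test>(entity =>
    {
        // Spell out precision and scale in the store type so EF does not
        // fall back to SQL Server's defaults for a bare "decimal" (18, 0).
        entity.Property(e => e.TestField02)
            .HasColumnType("decimal(15, 6)")
            .HasDefaultValueSql("99.99");
    });
}
```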

As a side note, also notice that Test objects that you just added will get returned by tracking queries on the same context, therefore you won't be able to observe the actual precision and scale with which values got stored in the database unless you query with a separate context instance. I assume this may be the reason you believed the first call to InsertRecords() was working as you expected.
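The verification described above might look like the following sketch (the context type and property names are assumed from the repro code and schema in this thread):

```csharp
// Read back with a fresh context so the values come from the database
// rather than from the change tracker of the context that inserted them.
using (var verifyContext = new Data.TestRounding.c8c48i_b92605Context())
{
    foreach (var row in verifyContext.Test.ToList())
    {
        Console.WriteLine($"{row.TestID}: {row.TestField01}");
    }
}
```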

divega commented Sep 15, 2016

Note for Triage: The observed behavior is currently by design but I can see it being confusing. Nonetheless there are several aspects we may consider addressing (not trying to make any judgement on the value or feasibility of any of them at this point, just leaving that for the triage discussion):

  1. There doesn't seem to be a way to control precision and scale in the fluent API without resorting to HasColumnType().
  2. There is no warning when data gets truncated.
  3. There is no way for users to specify a basic column type and then have EF Core and the provider merge that with configured or default facets, e.g. so that you could specify "varchar" and get a column of type "varchar(max)" or that you could specify "decimal" and get a column of type "decimal(18,2)". We intentionally chose HasColumnType() not to implement that behavior.

recalde commented Sep 16, 2016

@divega thank you! I'm going to try to override the column type of our scaffolded code now to see if that fixes it on our side.

A couple other questions / comments -

  • The first call to InsertRecords(1) was confirmed to have full decimal precision on the database side, so this example seems to work correctly end-to-end when only one record is included in the SaveChanges(). Any idea why that would be the case?
  • Do you think the HasColumnType mapping is a bug/missing feature in EF Core scaffolding? Should it be fixed there?
  • Note: we re-scaffold our model quite often, and we have a heavily finance/actuarial-driven data model that is shared with other applications, so code-first is not an option for us, and this would be a lot of customization every time we run ef-scaffold. Do you think we should look into overriding the behavior of the Microsoft.EntityFrameworkCore.Scaffolding.Internal classes to add a default type mapping of decimal that includes our full/standard precision, the same way we did for CandidateNamingService when the naming convention changed there?

divega commented Sep 16, 2016

@recalde I read your initial post too quickly and somehow missed that this code was generated by our scaffolding. It does look like a bug.

The first call to InsertRecords(1) was confirmed to have full decimal precision on the database side, so it seems like this is actually working correctly in this example end-to-end when only one record is included in the SaveChanges(), any idea why that would be the case?

I couldn't observe this yesterday when I tried your repro: I ran a query on a separate DbContext instance and the values had been truncated. It would be great if you could double-check; if confirmed, this could be a separate issue.

divega commented Sep 16, 2016

@lajones does this ring a bell? Didn't we fix something like this already? Or was it only for string columns?

lajones commented Sep 16, 2016

@divega - you're thinking of #4312 which was the interaction between HasColumnType() and HasMaxLength() - it's similar but, yes, that was just for string columns.

This seems to be another issue about not maintaining the precision and scale for a SQL Server decimal (or, I would guess, numeric) column. We (I?) should definitely investigate this. As you wrote above, we should be generating HasColumnType("decimal(18,6)") instead of just HasColumnType("decimal").

(Note: we should update https://github.com/aspnet/EntityFramework/blob/dev/test/Microsoft.EntityFrameworkCore.SqlServer.Design.FunctionalTests/ReverseEngineering/E2E.sql to have such columns and then update the tests that use it.)

@divega divega assigned lajones and unassigned divega Sep 16, 2016
divega commented Sep 16, 2016

Thanks for confirming @lajones. Can you think of a simple way to override the behavior as described in the third bullet point at #6538 (comment)?

recalde commented Sep 16, 2016

@divega yes, I just re-ran tests (1) and (2) and confirmed that (1) works correctly. Full decimal precision and scale was confirmed via an SSMS query, not the db context, and the truncation only happens when 2 or more rows are included in the same SaveChanges().

However, changing the scaffolded code to HasColumnType("decimal(15, 6)") completely fixes the issue; I verified it works correctly with 100 records.

The difference in behavior between 1 record and 2+ records is what made tracking the source of the issue back to EF difficult (we couldn't reproduce it in unit tests without inserting multiple rows).

public static void Main(string[] args)
{
    // This works fine (inserted row 105 with full scale)
    InsertRecords(1);

    // This does not work (inserted row 106 and 107 with truncated scale)
    InsertRecords(2);
}

ssms - SELECT * FROM dbo.Test

TestID  TestField01  TestField02  TestField03  ...
105     1.123456     2.123456     3.123456     ...
106     1.000000     2.000000     3.000000     ...
107     1.000000     2.000000     3.000000     ...

lajones commented Sep 16, 2016

@divega I think it's more likely we need to do something in https://github.com/aspnet/EntityFramework/blob/dev/src/Microsoft.EntityFrameworkCore.SqlServer.Design/Internal/SqlServerScaffoldingModelFactory.VisitTypeMapping() to override the emitted data type depending on the precision and scale, similar to what we already do for datetime precision.

We already collect the Precision and Scale info in https://github.com/aspnet/EntityFramework/blob/21fdcf99d8aa00defb03d05e6c883d4df3de79fc/src/Microsoft.EntityFrameworkCore.Relational.Design/Metadata/ColumnModel.cs so it should be a matter of hooking them up. It might be possible for them to pass in an overridden version of SqlServerScaffoldingModelFactory which does that as a temporary fix.

@ajcvickers

See also #5410

@rowanmiller

The issue with reverse engineer generating the wrong type name is covered by #5410.

@recalde we have not been able to reproduce the issue you see with a difference between 1 and 2+ records being inserted. Can you provide a complete code listing or project that we can run to see the issue?

recalde commented Sep 28, 2016

@rowanmiller I'm sorry, I just noticed that the .sql file in the attached zip wasn't saved. Below is the SQL Server 2012 schema. I'm not sure how best to package it; should I update the project to create the schema on a SQL Server instance, or re-zip with the .sql file? I didn't try to reproduce on SQL Express, so I'm not sure whether it's SQL Server-only.

USE [c8c48i_b92605]
GO

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [dbo].[Test](
    [TestID] [int] IDENTITY(1,1) NOT NULL,
    [TestField01] [decimal](15, 6) NULL,
    [TestField02] [decimal](15, 6) NULL,
    [TestField03] [decimal](15, 6) NULL,
    [TestField04] [decimal](15, 6) NULL,
    [TestField05] [decimal](15, 6) NULL,
    [TestField06] [decimal](15, 6) NULL,
    [TestField07] [decimal](15, 6) NULL,
    [TestField08] [decimal](15, 6) NULL,
    [TestField09] [decimal](15, 6) NULL,
    [TestField10] [decimal](15, 6) NULL,
    [TestField11] [decimal](15, 6) NULL,
    [TestField12] [decimal](15, 6) NULL,
    [TestField13] [decimal](15, 6) NULL,
    [TestField14] [decimal](15, 6) NULL,
    [TestField15] [decimal](15, 6) NULL,
    [TestField16] [decimal](15, 6) NULL,
    [TestField17] [decimal](15, 6) NULL,
    [TestField18] [decimal](15, 6) NULL,
    [TestField19] [decimal](15, 6) NULL,
    [TestField20] [decimal](15, 6) NULL,
 CONSTRAINT [PK_Test] PRIMARY KEY CLUSTERED 
(
    [TestID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField01]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField02]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField03]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField04]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField05]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField06]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField07]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField08]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField09]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField10]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField11]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField12]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField13]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField14]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField15]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField16]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField17]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField18]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField19]
GO

ALTER TABLE [dbo].[Test] ADD  DEFAULT ((99.99)) FOR [TestField20]
GO

divega commented Sep 29, 2016

@rowanmiller the first post in the issue has a link to a project. With this SQL script it should be complete.

So the database is originally created with the columns at the right precision and scale, but the EF model thinks the store type is just "decimal". The part that remains unexplained is how inserting one record can produce a different result than inserting two records (no rounding vs. rounding).

It sounds like debugging how we create parameters should help.

recalde commented Sep 29, 2016

@divega Exactly. I'm completely satisfied with the solution (a replace-all to add precision to the mappings) and a future fix in scaffolding; it just seems very strange that we only observed the issue when calling SaveChanges() with 2+ records. It seems like that could lead to a lot of difficult-to-diagnose defects.

@rowanmiller rowanmiller added this to the 1.1.0 milestone Oct 3, 2016
@rowanmiller rowanmiller modified the milestones: 1.2.0, 1.1.0 Oct 6, 2016
@divega divega removed this from the 1.2.0 milestone Oct 21, 2016
divega commented Oct 21, 2016

I haven't done any further investigation with this repro but another issue (#6835) was filed today that explains the difference in behavior between single row updates and multiple row updates:

While precision and scale are never configured on the DbParameters, they are fixed to what is configured in the model in the table variable that we create for batching (see https://github.com/aspnet/EntityFramework/blob/dev/src/Microsoft.EntityFrameworkCore.SqlServer/Update/Internal/SqlServerUpdateSqlGenerator.cs#L353).

Closing this as a duplicate of the other bug.

sonphnt commented Jul 5, 2017

Today I hit this bug as well.

The problem is that inserting 1 record works perfectly, but inserting 2 or more records truncates the decimal as reported.

In a SQL Profiler trace I can see two completely different generated SQL scripts. The script generated for 2 records looks more complex: it converts to decimal, selects from a temp table, and so on. That might be causing this issue.

Is there a fix for this yet?

Thanks

@ajcvickers

@sonphnt Make sure your decimal types are explicitly mapped to the correct database type. See #6944 and the discussion on #6835.

sonphnt commented Jul 5, 2017

Hi @ajcvickers

I am scaffolding from the database. The type in the database is decimal(16,2), but the generated context file has only "decimal".

Is the issue in the title fixed?

Thanks

@ajcvickers

@sonphnt Change it to decimal(16,2). The reverse engineering issue that creates "decimal" by default will be fixed in an upcoming release.

sonphnt commented Jul 6, 2017

@ajcvickers Yes, I have to change it manually for now. Which upcoming release will fix this problem? Preview 4?

I am thinking of using Preview 3 to generate the context file in a temp solution and then copying the generated files back to my primary project, but I am not sure whether Preview 3 solved this issue.

@ajcvickers

@sonphnt I think this is already fixed in EF Core 2.0 preview 2. @smitpatel Can you confirm?

@smitpatel

Verified fixed in 2.0.0-preview2-final: we scaffold .HasColumnType("decimal(16, 2)").
