Blocked SQL connections leading to connection pool starvation on Linux with MARS and conn.StateChange #1992

Closed
oncicaradupopovici opened this issue Apr 12, 2023 · 3 comments
Labels
2️⃣ Duplicate Issue/PR that is a duplicate and already exists.

Comments


oncicaradupopovici commented Apr 12, 2023

Describe the bug

Connection timeout exceptions occur when all of the following are met:

  1. Running on Linux
  2. MARS enabled
  3. Concurrency level greater than the connection pool size (in our case at least 200)
  4. There is some network latency between the client and the server (for example a SQL Server Managed Instance running in Azure)
  5. There is a conn.StateChange event handler that executes a SQL command when the connection state becomes Open

The exact exception is:
System.InvalidOperationException: Timeout expired.  The timeout period elapsed prior to obtaining a connection from the pool.  This may have occurred because all pooled connections were in use and max pool size was reached.

This happens on both the .NET 6 and .NET 7 runtimes, with any version of the Microsoft.Data.SqlClient or System.Data.SqlClient packages.

After a short period of time, all 100 connections maintained by the pool are blocked indefinitely, and all subsequent SQL connections time out on open. Everything remains blocked.

To reproduce

// See https://aka.ms/new-console-template for more information

using Microsoft.Data.SqlClient;

const string ConnectionString = "Server=YOUR_SERVER;Database=YOUR_DB;User Id=YOUR_USER;Password=YOUR_PASS;MultipleActiveResultSets=true";

await ExecuteConcurrentCommands();

Console.WriteLine("FINISHED");

Console.ReadLine();

async Task ExecuteConcurrentCommands()
{
    var tasks = Enumerable.Range(0, 1000).Select(ExecuteCommand);

    await Task.WhenAll(tasks);
}

async Task ExecuteCommand(int index)
{
    using var conn = new SqlConnection(ConnectionString);
    conn.StateChange += (sender, evt) =>
    {
        if (evt.CurrentState == System.Data.ConnectionState.Open)
        {
            var senderConnection = (SqlConnection)sender;

            var commandText = "SET TEXTSIZE " + index;

            using (var command = senderConnection.CreateCommand())
            {
                command.CommandText = commandText;
                command.ExecuteNonQuery();
            }
        }
    };

    try
    {
        await conn.OpenAsync();

        using var command = conn.CreateCommand();
        command.CommandText = $"SELECT TOP 1 {index}  from [__DocumentSignerMigration]";
        var result = await command.ExecuteScalarAsync();

        Console.WriteLine($"Executed select command {index}, result: " + result);
    }
    catch (Exception ex)
    {
        Console.WriteLine("CAUGHT EXCEPTION, " + ex);
    }
    finally
    {
        await conn.CloseAsync();
    }
}

Further technical details

Microsoft.Data.SqlClient version: 4.0.0
.NET target: .NET 7
SQL Server version: Azure SQL Managed Instance
Operating system: Debian 11

@JRahnama JRahnama added the 🆕 Triage Needed For new issues, not triaged yet. label Apr 12, 2023
@JRahnama
Contributor

@oncicaradupopovici this looks similar to #422. Can you follow up with the workarounds on that issue, please?

@oncicaradupopovici
Author

Yes, it is similar to #422, but in our case there is one more thing: all connections maintained by the pool remain blocked, all 100 of them, indefinitely. The only thing we can do in this case is restart the process.
The workarounds in our case were to either remove MARS where possible, or, where MARS was required, to remove the event handler and execute the "SET TEXTSIZE" command after the connection opened, without the event handler, as sketched below.
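
A minimal sketch of that second workaround, reusing the ConnectionString constant and the (placeholder) table name from the repro above; the method name is just illustrative. The StateChange handler is dropped entirely and the SET TEXTSIZE command is issued asynchronously right after OpenAsync completes:

async Task ExecuteCommandWithoutHandler(int index)
{
    using var conn = new SqlConnection(ConnectionString);
    await conn.OpenAsync();

    // Previously issued from the StateChange handler; now run inline,
    // asynchronously, once the connection is fully open.
    using (var setup = conn.CreateCommand())
    {
        setup.CommandText = "SET TEXTSIZE " + index;
        await setup.ExecuteNonQueryAsync();
    }

    using var command = conn.CreateCommand();
    command.CommandText = $"SELECT TOP 1 {index} FROM [__DocumentSignerMigration]";
    var result = await command.ExecuteScalarAsync();

    Console.WriteLine($"Executed select command {index}, result: " + result);
}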

@JRahnama JRahnama added 2️⃣ Duplicate Issue/PR that is a duplicate and already exists. and removed 🆕 Triage Needed For new issues, not triaged yet. labels Apr 18, 2023
@JRahnama
Contributor

@oncicaradupopovici can you provide the repro in issue #422 and follow the progress there?

Closing this issue as a duplicate.
