I currently have three instances of a console application running; each collects different sensor data and sends it to an InfluxDB database. So far this works well.
Now I would like to send the sensor data of each console application to two different databases for testing purposes. After one to three hours, at least one instance of WritePointsAsync gets stuck at 100% CPU utilization (of one core). The console application itself stays alive and continues to receive sensor data.
The other two console applications keep running until they also end up in the endless loop.
Steps to reproduce:
Create two instances of InfluxDBClient pointing to two different destinations.
Continuously write to both destinations at the same time.
Wait until a call to WritePointsAsync never completes.
Expected behavior:
Send data to both DBs.
Actual behavior:
InfluxDBClient hangs in an infinite loop.
Specifications:
Client Version: Started testing with v4.3; the problem still exists on v4.9
InfluxDB Version: InfluxDB v2.6
Platform: Windows 10 x64, .NET Framework
Example
influxDBClient = new List<InfluxDBClient>();

// One client per destination URL (URLs are separated by ';' in the settings).
foreach (var url in settings.Url.Split(new[] { ';' }, StringSplitOptions.RemoveEmptyEntries))
{
    var clientOptions = new InfluxDBClientOptions.Builder()
        .Url(url)
        .AuthenticateToken(settings.Token.ToCharArray())
        .Org(settings.Organization)
        .Bucket(settings.Bucket)
        .Build();

    influxDBClient.Add(new InfluxDBClient(clientOptions));
}

while (!end)
{
    ...
    // Write the same points to every destination in parallel.
    Parallel.ForEach(influxDBClient,
        client =>
        {
            // Wait(SendTimeOut) returns false when the write does not finish in time.
            if (!client.GetWriteApiAsync().WritePointsAsync(values).ContinueWith(ee =>
            {
                if (ee.IsFaulted)
                {
                    var ex = ee.Exception.InnerException ?? ee.Exception;
                    Log($"{ex.GetType().Name}: {ex.Message}");
                    Log(ex, null, false);
                }
            }).Wait(SendTimeOut))
                Log($"TimeOut send to DB: DB={client} {swSend.Elapsed.TotalMinutes.ToString("0.##")}min", Logtype.Warning);
        });
    ...
}
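For comparison, here is a minimal sketch of the same double write using async/await instead of Parallel.ForEach + Wait. It reuses the names from the example above (influxDBClient, values, SendTimeOut, Log, Logtype) and is only meant to illustrate the write pattern; it is not the exact code that produced the log below.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using InfluxDB.Client;
using InfluxDB.Client.Writes;

// Sketch: write the same points to every destination with async/await,
// giving up after SendTimeOut instead of blocking inside Parallel.ForEach.
private async Task WriteToAllAsync(List<InfluxDBClient> influxDBClient,
                                   List<PointData> values,
                                   TimeSpan SendTimeOut)
{
    // Start one write per destination.
    var writeTasks = influxDBClient
        .Select(client => client.GetWriteApiAsync().WritePointsAsync(values))
        .ToList();

    try
    {
        var all = Task.WhenAll(writeTasks);
        if (await Task.WhenAny(all, Task.Delay(SendTimeOut)) != all)
        {
            // At least one write did not finish within SendTimeOut.
            Log("TimeOut send to DB", Logtype.Warning);
        }
        else
        {
            await all; // rethrows if any write faulted
        }
    }
    catch (Exception ex)
    {
        Log($"{ex.GetType().Name}: {ex.Message}");
        Log(ex, null, false);
    }
}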
WritePointsAsync hangs when the output stops:
2023-01-15T11:11:49.2223432+01:00 Write to DB: qValues.Count=29
2023-01-15T11:11:49.2263650+01:00 Remove from Queue: SendTime=0,004s
2023-01-15T11:11:52.2303350+01:00 Write to DB: qValues.Count=32
2023-01-15T11:11:52.2343375+01:00 Remove from Queue: SendTime=0,004s
2023-01-15T11:11:55.2383351+01:00 Write to DB: qValues.Count=30
2023-01-15T11:11:55.2413354+01:00 Remove from Queue: SendTime=0,004s
2023-01-15T11:11:58.2463351+01:00 Write to DB: qValues.Count=26
2023-01-15T11:11:58.2503355+01:00 Remove from Queue: SendTime=0,005s
2023-01-15T11:12:01.2553352+01:00 Write to DB: qValues.Count=27
2023-01-15T11:12:01.2613348+01:00 Remove from Queue: SendTime=0,006s
2023-01-15T11:12:04.2633822+01:00 Write to DB: qValues.Count=30
2023-01-15T11:12:04.2673864+01:00 Remove from Queue: SendTime=0,004s
2023-01-15T11:12:07.2703350+01:00 Write to DB: qValues.Count=27
2023-01-15T11:12:07.2743355+01:00 Remove from Queue: SendTime=0,004s
2023-01-15T11:12:10.2783351+01:00 Write to DB: qValues.Count=26
2023-01-15T11:12:10.2823354+01:00 Remove from Queue: SendTime=0,004s
2023-01-15T11:12:13.2863351+01:00 Write to DB: qValues.Count=29