I am using the Redis session state provider in an ASP.NET app running on an Azure App Service, with Azure Cache for Redis as the backing cache.
Last week I looked at my app's logs and noticed that I was getting several dozen Redis timeout errors per day. In the past, Redis timeouts have brought my app to a complete standstill, but that doesn't seem to be happening now: the timeouts occur sporadically and don't cause a critical disruption.
Service levels:
App Service: Standard S1
Redis: P1. I was using C2 until Friday, but the upgrade doesn't seem to have made a difference.
Today I tried increasing the operation timeout from 1000 ms to 2000 ms (see the configuration sketch below), but I'm still seeing about a dozen timeouts per hour.
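For reference, here is roughly what the relevant part of my web.config looks like. The host name, access key, and provider name are placeholders, and attributes I left at their defaults are omitted; operationTimeoutInMilliseconds is the value I raised from 1000 to 2000:

```xml
<sessionState mode="Custom" customProvider="RedisSessionStateStore">
  <providers>
    <!-- host, accessKey, and the provider name are placeholders, not my real values -->
    <add name="RedisSessionStateStore"
         type="Microsoft.Web.Redis.RedisSessionStateProvider"
         host="mycache.redis.cache.windows.net"
         port="6380"
         accessKey="..."
         ssl="true"
         operationTimeoutInMilliseconds="2000" />
  </providers>
</sessionState>
```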
Below is the error message and stack trace from one of these timeouts. What can I do to figure out why these are happening, and how can I remedy them?
StackExchange.Redis.RedisTimeoutException: Timeout performing EVAL (2000ms), next: EVAL, inst: 12, qu: 0, qs: 0, aw: False, bw: Inactive, rs: ReadAsync, ws: Idle, in: 0, last-in: 0, cur-in: 0, sync-ops: 392, async-ops: 1, serverEndpoint: *************:6380, conn-sec: 396.97, mc: 1/1/0, mgr: 10 of 10 available, clientName: *******************(SE.Redis-v2.6.90.64945), IOCP: (Busy=0,Free=1000,Min=1,Max=1000), WORKER: (Busy=3,Free=8188,Min=1,Max=8191), v: 2.6.90.64945 (Please take a look at this article for some common client-side issues that can cause timeouts: https://stackexchange.github.io/StackExchange.Redis/Timeouts)
Stack Trace: at StackExchange.Redis.ConnectionMultiplexer.ExecuteSyncImpl[T](Message message, ResultProcessor`1 processor, ServerEndPoint server, T defaultValue) in /_/src/StackExchange.Redis/ConnectionMultiplexer.cs:line 1980
at StackExchange.Redis.RedisBase.ExecuteSync[T](Message message, ResultProcessor`1 processor, ServerEndPoint server, T defaultValue) in /_/src/StackExchange.Redis/RedisBase.cs:line 62
at StackExchange.Redis.RedisDatabase.ScriptEvaluate(String script, RedisKey[] keys, RedisValue[] values, CommandFlags flags) in /_/src/StackExchange.Redis/RedisDatabase.cs:line 1520
at Microsoft.Web.Redis.StackExchangeClientConnection.<>c__DisplayClass7_0.<Eval>b__0() in C:\TeamCity\buildAgent\work\59b31e8e4035fd30\src\Shared\StackExchangeClientConnection.cs:line 68
at Microsoft.Web.Redis.StackExchangeClientConnection.OperationExecutor(Func`1 redisOperation) in C:\TeamCity\buildAgent\work\59b31e8e4035fd30\src\Shared\StackExchangeClientConnection.cs:line 95
at Microsoft.Web.Redis.StackExchangeClientConnection.RetryLogic(Func`1 redisOperation) in C:\TeamCity\buildAgent\work\59b31e8e4035fd30\src\Shared\StackExchangeClientConnection.cs:line 118
at Microsoft.Web.Redis.StackExchangeClientConnection.Eval(String script, String[] keyArgs, Object[] valueArgs) in C:\TeamCity\buildAgent\work\59b31e8e4035fd30\src\Shared\StackExchangeClientConnection.cs:line 68
at Microsoft.Web.Redis.RedisConnectionWrapper.TryTakeWriteLockAndGetData(DateTime lockTime, Int32 lockTimeout, Object& lockId, ISessionStateItemCollection& data, Int32& sessionTimeout) in C:\TeamCity\buildAgent\work\59b31e8e4035fd30\src\RedisSessionStateProvider\RedisConnectionWrapper.cs:line 184
at Microsoft.Web.Redis.RedisSessionStateProvider.GetItemFromSessionStore(Boolean isWriteLockRequired, HttpContextBase context, String id, CancellationToken cancellationToken, Boolean& locked, TimeSpan& lockAge, Object& lockId, SessionStateActions& actions) in C:\TeamCity\buildAgent\work\59b31e8e4035fd30\src\RedisSessionStateProvider\RedisSessionStateProvider.cs:line 291
at Microsoft.Web.Redis.RedisSessionStateProvider.<GetItemExclusiveAsync>d__20.MoveNext() in C:\TeamCity\buildAgent\work\59b31e8e4035fd30\src\RedisSessionStateProvider\RedisSessionStateProvider.cs:line 206
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.AspNet.SessionState.SessionStateModuleAsync.<GetSessionStateItemAsync>d__74.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.AspNet.SessionState.SessionStateModuleAsync.<AcquireStateAsync>d__65.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.AspNet.SessionState.TaskAsyncHelper.EndTask(IAsyncResult ar)
at Microsoft.AspNet.SessionState.SessionStateModuleAsync.EndAcquireState(IAsyncResult result)
at System.Web.HttpApplication.AsyncEventExecutionStep.InvokeEndHandler(IAsyncResult ar)
at System.Web.HttpApplication.AsyncEventExecutionStep.OnAsyncEventCompletion(IAsyncResult ar)