
The specified configuration does not have a telemetry channel. (Parameter 'configuration') #2195

Closed
nulltoken opened this issue Mar 26, 2021 · 13 comments

Comments

@nulltoken

nulltoken commented Mar 26, 2021

  • List of NuGet packages and versions that you are using:
    • Microsoft.ApplicationInsights.AspNetCore: 2.17.0
    • Azure.Storage.Blobs: 12.8.0
  • Runtime version: netcoreapp3.1
  • Hosting environment: AzureWebApp

Describe the bug

Instrumentation of interactions with storage accounts through the Azure.Storage.Blobs package throws the following exception:

  Message: 
    System.ArgumentException : The specified configuration does not have a telemetry channel. (Parameter 'configuration')
  Stack Trace: 
    TelemetryClient.ctor(TelemetryConfiguration configuration)
    DiagnosticsEventHandlerBase.ctor(TelemetryConfiguration configuration)
    AzureSdkDiagnosticsEventHandler.ctor(TelemetryConfiguration configuration)
    AzureSdkDiagnosticListenerSubscriber.GetEventHandler(String diagnosticListenerName)
    DiagnosticSourceListenerBase`1.OnNext(DiagnosticListener value)
    AllListenerObservable.OnNewDiagnosticListener(DiagnosticListener diagnosticListener)
    DiagnosticListener.ctor(String name)
    DiagnosticScopeFactory.ctor(String clientNamespace, String resourceProviderNamespace, Boolean isActivityEnabled)
    ClientDiagnostics.ctor(ClientOptions options)
    StorageRequestValidationPipelinePolicy.ctor(ClientOptions options)
    StorageClientOptions.Build(ClientOptions options, HttpPipelinePolicy authentication, Uri geoRedundantSecondaryStorageUri)
    StorageClientOptions.Build(ClientOptions options, Object credentials, Uri geoRedundantSecondaryStorageUri)
    BlobClientOptions.Build(Object credentials)
    BlobContainerClient.ctor(String connectionString, String blobContainerName, BlobClientOptions options)
    BlobContainerClient.ctor(String connectionString, String blobContainerName)
    BlobsDownloader.BuildBlobContainerAccessor(IConfiguration configuration, String connectionStringKey, String name) line 96
    PublicFilesFixture.ctor(PublicFilesWebServerSetupFixture wss) line 27

To Reproduce

Discovered while migrating some code from the now-deprecated "Microsoft.Azure.Storage.Blob" package to "Azure.Storage.Blobs".
This was initially mentioned in Azure/azure-sdk-for-net#19068, and @jsquire kindly recommended discussing it here instead.

The call stack comes from an automated test, running against Azurite 3.11.0, which tries to read a non-existent file from an existing container.

If this call stack and my explanation don't let you pinpoint the root of the issue within a few minutes, I'll find the time to create a minimal repro case.
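
For illustration, here is a minimal sketch of the failing pattern; the container and blob names are hypothetical, and it assumes Azurite is running with the default development connection string. The point is that merely constructing the client builds the Azure.Core pipeline and its DiagnosticListener, which is where the Application Insights handler in the stack trace above gets invoked.

using System.Threading.Tasks;
using Azure.Storage.Blobs;

public class BlobSmokeTest
{
    public async Task ReadMissingBlob()
    {
        // Constructing the client builds the Azure.Core pipeline; per the stack
        // trace above, ClientDiagnostics creates a DiagnosticListener here, which
        // notifies the Application Insights AzureSdkDiagnosticsEventHandler.
        var container = new BlobContainerClient("UseDevelopmentStorage=true", "public-files");

        // Read a blob that does not exist in an existing container.
        var blob = container.GetBlobClient("does-not-exist.txt");
        await blob.DownloadAsync();
    }
}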

@nulltoken nulltoken added the bug label Mar 26, 2021
@nulltoken
Author

As additional information, this may be related to a concurrency/race issue.

Depending on the subset of tests I run on my side, some pass and some fail.

If I run the failing tests one by one, each of them independently, they all pass.

@nulltoken
Author

Hmm... I spent some hours trying to create a small repro case but failed :(

What would be the best course of action to help you fix this?

FWIW, if that would be an acceptable option on your side, I could make myself available for a screen-sharing session.

@alejandromelis

Hi @nulltoken,

I've detected the same issue with Azure Functions V3 on .NET Core 3.1 in a Windows Consumption plan after migrating to version 12.x.x.

https://github.com/Azure/azure-sdk-for-net/blob/Azure.Storage.Blobs_12.8.0/sdk/core/Azure.Core/src/Pipeline/Internal/RequestActivityPolicy.cs#L20-L26

It seems a race condition is the cause of the failure; it occurs in the Azure.Core library, and I fear it may be affecting other SDK clients.

@alejandromelis

alejandromelis commented Apr 2, 2021

They have recently patched the DiagnosticScopeFactory, which may solve your issue.

Azure/azure-sdk-for-net@502d5f1

In my case the stack trace is quite different: the affected pipeline policy is RequestActivityPolicy, and that class does not use the factory; it just references the DiagnosticListener as a static field.

https://github.com/Azure/azure-sdk-for-net/blob/Azure.Storage.Blobs_12.8.0/sdk/core/Azure.Core/src/Pipeline/Internal/RequestActivityPolicy.cs#L20

System.TypeInitializationException : The type initializer for 'Azure.Core.Pipeline.RequestActivityPolicy' threw an exception. 
 System.ArgumentException : The specified configuration does not have a telemetry channel. (Parameter 'configuration')
   at Microsoft.ApplicationInsights.TelemetryClient..ctor(TelemetryConfiguration configuration)
   at Microsoft.ApplicationInsights.DependencyCollector.Implementation.EventHandlers.DiagnosticsEventHandlerBase..ctor(TelemetryConfiguration configuration)
   at Microsoft.ApplicationInsights.DependencyCollector.Implementation.AzureSdkDiagnosticsEventHandler..ctor(TelemetryConfiguration configuration)
   at Microsoft.ApplicationInsights.DependencyCollector.Implementation.AzureSdkDiagnosticListenerSubscriber.GetEventHandler(String diagnosticListenerName)
   at Microsoft.ApplicationInsights.DependencyCollector.Implementation.DiagnosticSourceListenerBase`1.OnNext(DiagnosticListener value)
   at System.Diagnostics.DiagnosticListener.AllListenerObservable.OnNewDiagnosticListener(DiagnosticListener diagnosticListener)
   at System.Diagnostics.DiagnosticListener..ctor(String name)
   at Azure.Core.Pipeline.RequestActivityPolicy..cctor()
   End of inner exception
   at async Azure.Core.Pipeline.RequestActivityPolicy.ProcessAsync(??)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at async Azure.Core.Pipeline.ResponseBodyPolicy.ProcessAsync(??)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at async Azure.Core.Pipeline.ResponseBodyPolicy.ProcessAsync(??)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at async Azure.Core.Pipeline.LoggingPolicy.ProcessAsync(??)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at async Azure.Core.Pipeline.LoggingPolicy.ProcessAsync(??)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at async Azure.Core.Pipeline.HttpPipelineSynchronousPolicy.ProcessAsync(HttpMessage message,ReadOnlyMemory`1 pipeline)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at async Azure.Core.Pipeline.HttpPipelineSynchronousPolicy.ProcessAsync(HttpMessage message,ReadOnlyMemory`1 pipeline)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at async Azure.Core.Pipeline.RetryPolicy.ProcessAsync(??)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at async Azure.Core.Pipeline.RetryPolicy.ProcessAsync(??)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at async Azure.Core.Pipeline.HttpPipelineSynchronousPolicy.ProcessAsync(HttpMessage message,ReadOnlyMemory`1 pipeline)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at async Azure.Core.Pipeline.HttpPipelineSynchronousPolicy.ProcessAsync(HttpMessage message,ReadOnlyMemory`1 pipeline)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at async Azure.Core.Pipeline.HttpPipelineSynchronousPolicy.ProcessAsync(HttpMessage message,ReadOnlyMemory`1 pipeline)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at async Azure.Storage.Blobs.BlobRestClient.BlockBlob.UploadAsync(ClientDiagnostics clientDiagnostics,HttpPipeline pipeline,Uri resourceUri,Stream body,Int64 contentLength,String version,Nullable`1 timeout,Byte[] transactionalContentHash,String blobContentType,String blobContentEncoding,String blobContentLanguage,Byte[] blobContentHash,String blobCacheControl,IDictionary`2 metadata,String leaseId,String blobContentDisposition,String encryptionKey,String encryptionKeySha256,Nullable`1 encryptionAlgorithm,String encryptionScope,Nullable`1 tier,Nullable`1 ifModifiedSince,Nullable`1 ifUnmodifiedSince,Nullable`1 ifMatch,Nullable`1 ifNoneMatch,String ifTags,String requestId,String blobTagsString,Boolean async,String operationName,CancellationToken cancellationToken)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Threading.Tasks.ValueTask`1.get_Result()
   at async Azure.Storage.Blobs.Specialized.BlockBlobClient.UploadInternal(Stream content,BlobHttpHeaders blobHttpHeaders,IDictionary`2 metadata,IDictionary`2 tags,BlobRequestConditions conditions,Nullable`1 accessTier,IProgress`1 progressHandler,String operationName,Boolean async,CancellationToken cancellationToken)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at async Azure.Storage.Blobs.Specialized.BlockBlobClient.<>c__DisplayClass54_0.b__0(??)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at async Azure.Storage.PartitionedUploader`2.UploadInternal[TServiceSpecificArgs,TCompleteUploadReturn](Stream content,TServiceSpecificArgs args,IProgress`1 progressHandler,Boolean async,CancellationToken cancellationToken)
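
To make the failure mode above concrete, here is a reduced sketch (not the actual Azure.Core source, and the type and event names are stand-ins) of the static-listener pattern: because the DiagnosticListener lives in a static field, a subscriber that throws during the listener's construction escapes the type's static constructor, which is why the error surfaces as a TypeInitializationException.

using System.Diagnostics;

// Hypothetical stand-in for RequestActivityPolicy; illustration only.
internal static class StaticListenerPolicy
{
    // Creating the listener synchronously notifies every registered
    // IObserver<DiagnosticListener> (see AllListenerObservable.OnNewDiagnosticListener
    // in the trace above). If a subscriber's OnNext throws, the exception escapes
    // this field initializer, i.e. the static constructor.
    private static readonly DiagnosticListener s_diagnosticSource =
        new DiagnosticListener("Azure.Core");

    public static void Process()
    {
        // The first use of the type runs the static constructor; a throwing
        // subscriber therefore shows up here as a TypeInitializationException.
        if (s_diagnosticSource.IsEnabled("Azure.Core.Http.Request"))
        {
            s_diagnosticSource.Write("Azure.Core.Http.Request", new { });
        }
    }
}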

@nulltoken
Author

👋 Could this issue be triaged? 🙏

/cc @TimothyMothra @cijothomas

@TimothyMothra
Member

Hello, my team is looking into this.

For context, this exception is thrown from here:

public TelemetryClient(TelemetryConfiguration configuration)
{
    if (configuration == null)
    {
        CoreEventSource.Log.TelemetryClientConstructorWithNoTelemetryConfiguration();
        configuration = TelemetryConfiguration.Active;
    }

    this.configuration = configuration;

    if (this.configuration.TelemetryChannel == null)
    {
        throw new ArgumentException("The specified configuration does not have a telemetry channel.", nameof(configuration));
    }
}

protected DiagnosticSourceListenerBase(TelemetryConfiguration configuration)
{
    this.Configuration = configuration;
    this.Client = new TelemetryClient(configuration);
}

From this perspective:

  • The DiagnosticSourceListenerBase has received a TelemetryConfiguration and tries to initialize a new TelemetryClient.
  • The TelemetryClient constructor is correctly throwing an exception because the configuration is invalid.
  • The challenge is finding where this invalid configuration is coming from.
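
As a minimal sketch of that invalid state (an illustration only, assuming nothing beyond the constructor shown above), a TelemetryConfiguration whose TelemetryChannel is null is enough to reproduce the exact message from this issue:

using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;

// A freshly constructed TelemetryConfiguration normally carries an in-memory
// channel; clearing it stands in for whatever upstream code path produces the
// invalid configuration that the listener ends up receiving.
var configuration = new TelemetryConfiguration();
configuration.TelemetryChannel = null;

var client = new TelemetryClient(configuration);
// -> System.ArgumentException: The specified configuration does not have a
//    telemetry channel. (Parameter 'configuration')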

@nulltoken
Author

The challenge is finding where this invalid configuration is coming from.

@TimothyMothra Would a screen sharing session be of any help?

@TimothyMothra
Member

Hello All,
Thank you for continuing to report this exception.

The SDK is expected to throw exceptions when the TelemetryConfiguration provided is invalid.
This was designed to warn users of a situation that would result in no telemetry being sent to Azure Monitor.
Our current theory is that one of our partners has a misconfiguration, but we've been unable to reproduce the error.

Next Steps:
Please open a support ticket with Azure and reference this GitHub issue.
We need to be able to collect a process dump to investigate what is causing this condition.
It is not recommended to share dumps on GitHub because they may contain PII/OII.
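
(For reference, one common way to capture such a dump locally is the dotnet-dump global tool; this is a general suggestion rather than guidance specific to this investigation, and the process id and file path below are placeholders.)

dotnet tool install --global dotnet-dump
dotnet-dump collect --process-id <pid-of-the-failing-process>
dotnet-dump analyze <path-to-the-collected-dump>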

@nulltoken
Author

@TimothyMothra Thanks for the feedback.

Please open a support ticket with Azure

As this is failing, we've never deployed it.

However, as previously mentioned, I've stored in a branch the code that consistently reproduces this on my local computer. If you point me to a guide for collecting a process dump and a way to forward it to you, I'd be happy to help.

@cijothomas
Contributor

@nulltoken #2294 is merged and will be part of tonight's nightly release to MyGet. Would you try it out and see if it resolves the issue you are observing?
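
For anyone else wanting to try the nightly build, here is a sketch of a NuGet.config entry pointing at the nightly package source; the feed URL is a placeholder, since the exact address is not given in this thread:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Placeholder: substitute the Application Insights nightly (MyGet) feed URL. -->
    <add key="applicationinsights-nightly" value="https://www.myget.org/F/<feed-name>/api/v3/index.json" />
  </packageSources>
</configuration>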

@nulltoken
Author

@cijothomas Woot! Great job, guys! By upgrading from 2.17.0 to 2.18.0-nightly-build00210, the issue vanished into thin air.


@atifaziz

@cijothomas Should this or (its) PR #2294 be linked with milestone 2.18 (assuming it'll ship in there)?

@cijothomas
Contributor

@cijothomas Should this or (its) PR #2294 be linked with milestone 2.18 (assuming it'll ship in there)?

It'll be part of 2.18. (It's mentioned in the changelog already.)
