diff --git a/.github/copilot-instructions.md b/.github/copilot-instructions.md
new file mode 100644
index 00000000000..e5e77b86217
--- /dev/null
+++ b/.github/copilot-instructions.md
@@ -0,0 +1,183 @@
+# .NET Extensions Repository
+
+**Always reference these instructions first and fall back to search or bash commands only when you encounter unexpected information that does not match the info here.**
+
+The .NET Extensions repository contains a suite of libraries providing facilities commonly needed when creating production-ready applications. Major areas include AI abstractions, compliance mechanisms, diagnostics, contextual options, resilience (Polly), telemetry, AspNetCore extensions, static analysis, and testing utilities.
+
+## Working Effectively
+
+### Prerequisites and Bootstrap
+- **CRITICAL**: Ensure you have access to the appropriate Azure DevOps feeds. If the build fails with "Name or service not known" for `pkgs.dev.azure.com`, this indicates network/authentication issues with the required internal feeds.
+- Install .NET SDK 9.0.109 (as specified in global.json) if you do not already have it:
+  ```bash
+  curl -sSL https://dot.net/v1/dotnet-install.sh | bash /dev/stdin --version 9.0.109 --install-dir ~/.dotnet
+  export PATH="$HOME/.dotnet:$PATH"
+  ```
+- Verify installation: `dotnet --version` should show `9.0.109`
+
+### Essential Build Commands
+- **Restore dependencies**: `./restore.sh` (Linux/Mac) or `restore.cmd` (Windows)
+  - **CRITICAL - NEVER CANCEL**: Takes 5-10 minutes on first run. Always set the timeout to 30+ minutes.
+  - Downloads the .NET SDK, tools, and packages from Azure DevOps feeds
+  - Equivalent to `./build.sh --restore`
+
+- **Build the entire solution**: `./build.sh` (Linux/Mac) or `build.cmd` (Windows)
+  - **CRITICAL - NEVER CANCEL**: Takes 45-60 minutes for a full build. Always set the timeout to 90+ minutes.
+  - The repository has 124 projects in total - build times are expected to be substantial
+  - Without parameters: equivalent to `./build.sh --restore --build`
+  - Individual actions: `--restore`, `--build`, `--test`, `--pack`
+
+- **Run tests**: `./build.sh --test`
+  - **CRITICAL - NEVER CANCEL**: Takes 20-30 minutes. Always set the timeout to 60+ minutes.
+  - Runs all unit tests across the solution
+
+### Solution Generation (Key Workflow)
+This repository does NOT contain a single solution file. Instead, use the slngen tool to generate filtered solutions:
+
+- **Generate a filtered solution**: `./build.sh --vs <keywords>`
+  - Examples:
+    - `./build.sh --vs Http,Fakes,AspNetCore`
+    - `./build.sh --vs Polly` (for resilience libraries)
+    - `./build.sh --vs AI` (for AI-related projects)
+    - `./build.sh --vs '*'` (all projects - quote the asterisk so the shell does not expand it)
+  - Creates `SDK.sln` in the repository root
+  - Also performs a restore
+  - **NEVER CANCEL**: Takes 5-15 minutes depending on the filter scope. Set the timeout to 30+ minutes.
+
+- **Open in Visual Studio Code**: `./start-code.sh` (Linux/Mac) or `start-code.cmd` (Windows)
+  - Sets up environment variables for proper .NET SDK resolution
+  - Opens the repository in VS Code with the correct configuration
+
+### Build Validation and CI Requirements
+- **Always run before committing** (see the example sequence below):
+  ```bash
+  ./build.sh --restore --build --test
+  ```
+- **Check for API changes**: If you modify public APIs, run `./scripts/MakeApiBaselines.ps1` to update the API baseline manifest files
+- **NEVER CANCEL** long-running builds or tests - this repository has well over a hundred projects and build times are expected to be lengthy
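+
+A typical pre-commit sequence might look like the following sketch. The `AI` keyword is only an example - substitute the keywords that match the area you changed - and invoking the baseline script through `pwsh` assumes PowerShell is available on the machine:
+
+```bash
+./build.sh --vs AI                    # generate a focused SDK.sln for the area you changed (also restores)
+./build.sh --build --test             # full validation; allow a 60+ minute timeout and never cancel
+pwsh ./scripts/MakeApiBaselines.ps1   # only needed if your change modified the public API surface
+```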
**"Workload manifest microsoft.net.sdk.aspire not installed"**: + - Run `git clean -xdf` then restore again + - Caused by multiple SDK installations + +2. **"Could not resolve SDK Microsoft.DotNet.Arcade.Sdk"** or feed access errors: + - Indicates no access to internal Microsoft Azure DevOps feeds + - **NOT A CODE ISSUE** - environment/network limitation + - Document as "Build requires access to internal Microsoft feeds" + +3. **SSL connection errors during restore**: + - Try disabling IPv6 on network adapter + - May indicate network/firewall restrictions + +### Project Structure and Navigation + +#### Key Directories +- `src/` - Main source code: + - `src/Analyzers/` - Code analyzers + - `src/Generators/` - Source generators + - `src/Libraries/` - Core extension libraries + - `src/Packages/` - NuGet package definitions + - `src/ProjectTemplates/` - Project templates +- `test/` - Test projects (organized to match src/ structure) +- `docs/` - Documentation including `building.md` +- `scripts/` - PowerShell automation scripts +- `eng/` - Build engineering and configuration + +#### Build Outputs +- `artifacts/bin/` - Compiled binaries +- `artifacts/log/` - Build logs (including `Build.binlog` for MSBuild Structured Log Viewer) +- `artifacts/packages/` - Generated NuGet packages + +#### Key Files +- `global.json` - Specifies required .NET SDK version (9.0.109) +- `Directory.Build.props` - MSBuild properties for entire repository +- `NuGet.config` - Package source configuration (internal Microsoft feeds) + +## Validation Scenarios + +**After making changes, always execute these validation steps**: +1. **Generate relevant filtered solution**: `./build.sh --vs ` + - For AI libraries: `./build.sh --vs AI` + - For AspNetCore: `./build.sh --vs AspNetCore` + - For telemetry: `./build.sh --vs Telemetry,Logging,Metrics` + - For resilience: `./build.sh --vs Polly,Resilience` +2. **Build and test**: `./build.sh --build --test` (remember: NEVER CANCEL, 60+ minute timeout) +3. **For public API changes**: Run `./scripts/MakeApiBaselines.ps1` to update API baseline manifest files +4. **Manual verification**: + - Check that your changes compile across target frameworks (net8.0, net9.0) + - Review generated packages in `artifacts/packages/` if applicable + - Verify no new build warnings in `artifacts/log/Build.binlog` + +**Cannot run applications interactively** - this is a library repository. Validation is primarily through unit tests and integration tests. + +**Common validation patterns by library type**: +- **Source generators** (Microsoft.Gen.*): Build consumer projects that use the generator +- **AspNetCore extensions**: Build test web applications that reference the extensions +- **Testing utilities**: Use them in test projects to verify functionality +- **Analyzers**: Build projects that trigger the analyzer rules + +## Time Expectations and Timeouts + +**CRITICAL - NEVER CANCEL BUILD OR TEST COMMANDS**: +- **First-time setup**: 15-20 minutes (SDK download + initial restore) - timeout: 45+ minutes +- **Restore operation**: 5-10 minutes - **ALWAYS set timeout to 30+ minutes, NEVER CANCEL** +- **Full build**: 45-60 minutes - **ALWAYS set timeout to 90+ minutes, NEVER CANCEL** +- **Test execution**: 20-30 minutes - **ALWAYS set timeout to 60+ minutes, NEVER CANCEL** +- **Filtered solution generation**: 5-15 minutes - **ALWAYS set timeout to 30+ minutes** + +**Repository has 124 total projects** - build times are substantial by design. 
+
+## Time Expectations and Timeouts
+
+**CRITICAL - NEVER CANCEL BUILD OR TEST COMMANDS**:
+- **First-time setup**: 15-20 minutes (SDK download + initial restore) - timeout: 45+ minutes
+- **Restore operation**: 5-10 minutes - **ALWAYS set the timeout to 30+ minutes, NEVER CANCEL**
+- **Full build**: 45-60 minutes - **ALWAYS set the timeout to 90+ minutes, NEVER CANCEL**
+- **Test execution**: 20-30 minutes - **ALWAYS set the timeout to 60+ minutes, NEVER CANCEL**
+- **Filtered solution generation**: 5-15 minutes - **ALWAYS set the timeout to 30+ minutes**
+
+**The repository has 124 projects in total** - build times are substantial by design. If commands appear to hang, wait at least 60 minutes before considering alternatives.
+
+## Advanced Usage
+
+### Custom Solution Generation
+```bash
+# Using the PowerShell script directly with options
+./scripts/Slngen.ps1 -Keywords "Http","Fakes" -Folders -OutSln "MyCustom.sln"
+```
+
+### Build Configuration Options
+- `--configuration Debug|Release` - Build configuration
+- `--verbosity minimal|normal|detailed|diagnostic` - MSBuild verbosity
+- `--onlyTfms "net8.0;net9.0"` - Build specific target frameworks only
+
+### Code Coverage
+```bash
+./build.sh --restore --build --configuration Release --testCoverage
+```
+Results are available at: `artifacts/TestResults/Release/CoverageResultsHtml/index.html`
+
+## Common Tasks Reference
+
+**Key library areas and their keywords for filtered solutions**:
+- **AI libraries**: `./build.sh --vs AI` (Microsoft.Extensions.AI.*, embedding, chat completion)
+- **Telemetry**: `./build.sh --vs Telemetry,Logging,Metrics` (logging, metrics, tracing)
+- **AspNetCore**: `./build.sh --vs AspNetCore` (middleware, diagnostics, testing)
+- **Resilience**: `./build.sh --vs Polly,Resilience` (Polly integration, resilience patterns)
+- **Compliance**: `./build.sh --vs Compliance,Audit` (data classification, audit reports)
+- **Hosting**: `./build.sh --vs Hosting,Options` (contextual options, ambient metadata)
+- **Testing utilities**: `./build.sh --vs Testing,Fakes` (mocking, fake implementations)
+
+**Find projects by pattern**:
+```bash
+# AI-related projects
+find src/Libraries -name "*AI*" -name "*.csproj"
+
+# All AspNetCore extensions
+find src/Libraries -name "*AspNetCore*" -name "*.csproj"
+
+# Source generators
+find src/Generators -name "*.csproj"
+
+# Test projects for a specific library
+find test/Libraries -name "*[LibraryName]*" -name "*.csproj"
+```
+
+**Working with specific libraries workflow**:
+1. Identify the library area: `ls src/Libraries/` or use the find commands above
+2. Generate a focused solution: `./build.sh --vs <keywords>`
+3. Navigate to the library directory: `cd src/Libraries/Microsoft.Extensions.[Area]`
+4. Check the corresponding tests: `cd test/Libraries/Microsoft.Extensions.[Area].Tests`
+5. Review the library README: `cat src/Libraries/Microsoft.Extensions.[Area]/README.md`
+6.
Build and test: `./build.sh --build --test` (with appropriate timeouts) \ No newline at end of file diff --git a/eng/Publishing.props b/eng/Publishing.props index 0bcb04d4f32..a39ae2df5ac 100644 --- a/eng/Publishing.props +++ b/eng/Publishing.props @@ -1,5 +1,9 @@ + + + + 3 true diff --git a/eng/Signing.props b/eng/Signing.props index d4b1ec3e6cf..133e317085a 100644 --- a/eng/Signing.props +++ b/eng/Signing.props @@ -1,6 +1,7 @@ + \ No newline at end of file diff --git a/eng/Version.Details.xml b/eng/Version.Details.xml index b75b66eaea6..e4f5ea11353 100644 --- a/eng/Version.Details.xml +++ b/eng/Version.Details.xml @@ -194,17 +194,17 @@ - + https://github.com/dotnet/arcade - 5fe939db0a156be6f10e17c105b1842c0c8c8bdc + a4c9a07d978c070ef5c19d2ec9f811d6a5b20914 - + https://github.com/dotnet/arcade - 5fe939db0a156be6f10e17c105b1842c0c8c8bdc + a4c9a07d978c070ef5c19d2ec9f811d6a5b20914 - + https://github.com/dotnet/arcade - 5fe939db0a156be6f10e17c105b1842c0c8c8bdc + a4c9a07d978c070ef5c19d2ec9f811d6a5b20914 diff --git a/eng/Versions.props b/eng/Versions.props index c704134fe48..91a09df773f 100644 --- a/eng/Versions.props +++ b/eng/Versions.props @@ -83,7 +83,7 @@ 9.0.9 - 9.0.0-beta.25428.3 + 10.0.0-beta.25476.2 @@ -159,5 +159,9 @@ https://github.com/dotnet/arcade/blob/f5a7c5d5c56197b09715dece7541ca06beb94eb0/src/Microsoft.DotNet.Arcade.Sdk/tools/XUnit/XUnit.targets#L9 --> 2.9.3 + + 2.8.2 diff --git a/eng/common/CIBuild.cmd b/eng/common/CIBuild.cmd index 56c2f25ac22..ac1f72bf94e 100644 --- a/eng/common/CIBuild.cmd +++ b/eng/common/CIBuild.cmd @@ -1,2 +1,2 @@ @echo off -powershell -ExecutionPolicy ByPass -NoProfile -command "& """%~dp0Build.ps1""" -restore -build -test -sign -pack -publish -ci %*" \ No newline at end of file +powershell -ExecutionPolicy ByPass -NoProfile -command "& """%~dp0Build.ps1""" -restore -build -test -sign -pack -publish -ci %*" diff --git a/eng/common/SetupNugetSources.ps1 b/eng/common/SetupNugetSources.ps1 index 792b60b49d4..9445c314325 100644 --- a/eng/common/SetupNugetSources.ps1 +++ b/eng/common/SetupNugetSources.ps1 @@ -157,7 +157,7 @@ if ($dotnet31Source -ne $null) { AddPackageSource -Sources $sources -SourceName "dotnet3.1-internal-transport" -SourceEndPoint "https://pkgs.dev.azure.com/dnceng/_packaging/dotnet3.1-internal-transport/nuget/v2" -Creds $creds -Username $userName -pwd $Password } -$dotnetVersions = @('5','6','7','8','9') +$dotnetVersions = @('5','6','7','8','9','10') foreach ($dotnetVersion in $dotnetVersions) { $feedPrefix = "dotnet" + $dotnetVersion; diff --git a/eng/common/SetupNugetSources.sh b/eng/common/SetupNugetSources.sh index facb415ca6f..ddf4efc81a4 100755 --- a/eng/common/SetupNugetSources.sh +++ b/eng/common/SetupNugetSources.sh @@ -99,7 +99,7 @@ if [ "$?" 
== "0" ]; then PackageSources+=('dotnet3.1-internal-transport') fi -DotNetVersions=('5' '6' '7' '8' '9') +DotNetVersions=('5' '6' '7' '8' '9' '10') for DotNetVersion in ${DotNetVersions[@]} ; do FeedPrefix="dotnet${DotNetVersion}"; diff --git a/eng/common/build.ps1 b/eng/common/build.ps1 index 438f9920c43..8cfee107e7a 100644 --- a/eng/common/build.ps1 +++ b/eng/common/build.ps1 @@ -7,6 +7,7 @@ Param( [string] $msbuildEngine = $null, [bool] $warnAsError = $true, [bool] $nodeReuse = $true, + [switch] $buildCheck = $false, [switch][Alias('r')]$restore, [switch] $deployDeps, [switch][Alias('b')]$build, @@ -20,6 +21,7 @@ Param( [switch] $publish, [switch] $clean, [switch][Alias('pb')]$productBuild, + [switch]$fromVMR, [switch][Alias('bl')]$binaryLog, [switch][Alias('nobl')]$excludeCIBinarylog, [switch] $ci, @@ -71,6 +73,9 @@ function Print-Usage() { Write-Host " -msbuildEngine Msbuild engine to use to run build ('dotnet', 'vs', or unspecified)." Write-Host " -excludePrereleaseVS Set to exclude build engines in prerelease versions of Visual Studio" Write-Host " -nativeToolsOnMachine Sets the native tools on machine environment variable (indicating that the script should use native tools on machine)" + Write-Host " -nodeReuse Sets nodereuse msbuild parameter ('true' or 'false')" + Write-Host " -buildCheck Sets /check msbuild parameter" + Write-Host " -fromVMR Set when building from within the VMR" Write-Host "" Write-Host "Command line arguments not listed above are passed thru to msbuild." @@ -97,6 +102,7 @@ function Build { $bl = if ($binaryLog) { '/bl:' + (Join-Path $LogDir 'Build.binlog') } else { '' } $platformArg = if ($platform) { "/p:Platform=$platform" } else { '' } + $check = if ($buildCheck) { '/check' } else { '' } if ($projects) { # Re-assign properties to a new variable because PowerShell doesn't let us append properties directly for unclear reasons. @@ -113,6 +119,7 @@ function Build { MSBuild $toolsetBuildProj ` $bl ` $platformArg ` + $check ` /p:Configuration=$configuration ` /p:RepoRoot=$RepoRoot ` /p:Restore=$restore ` @@ -122,11 +129,13 @@ function Build { /p:Deploy=$deploy ` /p:Test=$test ` /p:Pack=$pack ` - /p:DotNetBuildRepo=$productBuild ` + /p:DotNetBuild=$productBuild ` + /p:DotNetBuildFromVMR=$fromVMR ` /p:IntegrationTest=$integrationTest ` /p:PerformanceTest=$performanceTest ` /p:Sign=$sign ` /p:Publish=$publish ` + /p:RestoreStaticGraphEnableBinaryLogger=$binaryLog ` @properties } diff --git a/eng/common/build.sh b/eng/common/build.sh index ac1ee8620cd..9767bb411a4 100755 --- a/eng/common/build.sh +++ b/eng/common/build.sh @@ -42,6 +42,8 @@ usage() echo " --prepareMachine Prepare machine for CI run, clean up processes after build" echo " --nodeReuse Sets nodereuse msbuild parameter ('true' or 'false')" echo " --warnAsError Sets warnaserror msbuild parameter ('true' or 'false')" + echo " --buildCheck Sets /check msbuild parameter" + echo " --fromVMR Set when building from within the VMR" echo "" echo "Command line arguments not listed above are passed thru to msbuild." echo "Arguments can also be passed in with a single hyphen." 
@@ -63,6 +65,7 @@ restore=false build=false source_build=false product_build=false +from_vmr=false rebuild=false test=false integration_test=false @@ -76,6 +79,7 @@ clean=false warn_as_error=true node_reuse=true +build_check=false binary_log=false exclude_ci_binary_log=false pipelines_log=false @@ -87,7 +91,7 @@ verbosity='minimal' runtime_source_feed='' runtime_source_feed_key='' -properties='' +properties=() while [[ $# > 0 ]]; do opt="$(echo "${1/#--/-}" | tr "[:upper:]" "[:lower:]")" case "$opt" in @@ -127,19 +131,22 @@ while [[ $# > 0 ]]; do -pack) pack=true ;; - -sourcebuild|-sb) + -sourcebuild|-source-build|-sb) build=true source_build=true product_build=true restore=true pack=true ;; - -productBuild|-pb) + -productbuild|-product-build|-pb) build=true product_build=true restore=true pack=true ;; + -fromvmr|-from-vmr) + from_vmr=true + ;; -test|-t) test=true ;; @@ -173,6 +180,9 @@ while [[ $# > 0 ]]; do node_reuse=$2 shift ;; + -buildcheck) + build_check=true + ;; -runtimesourcefeed) runtime_source_feed=$2 shift @@ -182,7 +192,7 @@ while [[ $# > 0 ]]; do shift ;; *) - properties="$properties $1" + properties+=("$1") ;; esac @@ -216,7 +226,7 @@ function Build { InitializeCustomToolset if [[ ! -z "$projects" ]]; then - properties="$properties /p:Projects=$projects" + properties+=("/p:Projects=$projects") fi local bl="" @@ -224,15 +234,21 @@ function Build { bl="/bl:\"$log_dir/Build.binlog\"" fi + local check="" + if [[ "$build_check" == true ]]; then + check="/check" + fi + MSBuild $_InitializeToolset \ $bl \ + $check \ /p:Configuration=$configuration \ /p:RepoRoot="$repo_root" \ /p:Restore=$restore \ /p:Build=$build \ - /p:DotNetBuildRepo=$product_build \ - /p:ArcadeBuildFromSource=$source_build \ + /p:DotNetBuild=$product_build \ /p:DotNetBuildSourceOnly=$source_build \ + /p:DotNetBuildFromVMR=$from_vmr \ /p:Rebuild=$rebuild \ /p:Test=$test \ /p:Pack=$pack \ @@ -240,7 +256,8 @@ function Build { /p:PerformanceTest=$performance_test \ /p:Sign=$sign \ /p:Publish=$publish \ - $properties + /p:RestoreStaticGraphEnableBinaryLogger=$binary_log \ + ${properties[@]+"${properties[@]}"} ExitWithExitCode 0 } diff --git a/eng/common/cibuild.sh b/eng/common/cibuild.sh index 1a02c0dec8f..66e3b0ac61c 100755 --- a/eng/common/cibuild.sh +++ b/eng/common/cibuild.sh @@ -13,4 +13,4 @@ while [[ -h $source ]]; do done scriptroot="$( cd -P "$( dirname "$source" )" && pwd )" -. "$scriptroot/build.sh" --restore --build --test --pack --publish --ci $@ \ No newline at end of file +. "$scriptroot/build.sh" --restore --build --test --pack --publish --ci $@ diff --git a/eng/common/core-templates/job/job.yml b/eng/common/core-templates/job/job.yml index 8da43d3b583..5ce51840619 100644 --- a/eng/common/core-templates/job/job.yml +++ b/eng/common/core-templates/job/job.yml @@ -19,11 +19,11 @@ parameters: # publishing defaults artifacts: '' enableMicrobuild: false + enableMicrobuildForMacAndLinux: false microbuildUseESRP: true enablePublishBuildArtifacts: false enablePublishBuildAssets: false enablePublishTestResults: false - enablePublishUsingPipelines: false enableBuildRetry: false mergeTestResults: false testRunTitle: '' @@ -74,9 +74,6 @@ jobs: - ${{ if ne(parameters.enableTelemetry, 'false') }}: - name: DOTNET_CLI_TELEMETRY_PROFILE value: '$(Build.Repository.Uri)' - - ${{ if eq(parameters.enableRichCodeNavigation, 'true') }}: - - name: EnableRichCodeNavigation - value: 'true' # Retry signature validation up to three times, waiting 2 seconds between attempts. 
# See https://learn.microsoft.com/en-us/nuget/reference/errors-and-warnings/nu3028#retry-untrusted-root-failures - name: NUGET_EXPERIMENTAL_CHAIN_BUILD_RETRY_POLICY @@ -128,23 +125,12 @@ jobs: - ${{ preStep }} - ${{ if and(eq(parameters.runAsPublic, 'false'), ne(variables['System.TeamProject'], 'public'), notin(variables['Build.Reason'], 'PullRequest')) }}: - - ${{ if eq(parameters.enableMicrobuild, 'true') }}: - - task: MicroBuildSigningPlugin@4 - displayName: Install MicroBuild plugin - inputs: - signType: $(_SignType) - zipSources: false - feedSource: https://dnceng.pkgs.visualstudio.com/_packaging/MicroBuildToolset/nuget/v3/index.json - ${{ if eq(parameters.microbuildUseESRP, true) }}: - ${{ if eq(variables['System.TeamProject'], 'DevDiv') }}: - ConnectedPMEServiceName: 6cc74545-d7b9-4050-9dfa-ebefcc8961ea - ${{ else }}: - ConnectedPMEServiceName: 248d384a-b39b-46e3-8ad5-c2c210d5e7ca - env: - TeamName: $(_TeamName) - MicroBuildOutputFolderOverride: '$(Agent.TempDirectory)' + - template: /eng/common/core-templates/steps/install-microbuild.yml + parameters: + enableMicrobuild: ${{ parameters.enableMicrobuild }} + enableMicrobuildForMacAndLinux: ${{ parameters.enableMicrobuildForMacAndLinux }} + microbuildUseESRP: ${{ parameters.microbuildUseESRP }} continueOnError: ${{ parameters.continueOnError }} - condition: and(succeeded(), in(variables['_SignType'], 'real', 'test'), eq(variables['Agent.Os'], 'Windows_NT')) - ${{ if and(eq(parameters.runAsPublic, 'false'), eq(variables['System.TeamProject'], 'internal')) }}: - task: NuGetAuthenticate@1 @@ -160,27 +146,15 @@ jobs: - ${{ each step in parameters.steps }}: - ${{ step }} - - ${{ if eq(parameters.enableRichCodeNavigation, true) }}: - - task: RichCodeNavIndexer@0 - displayName: RichCodeNav Upload - inputs: - languages: ${{ coalesce(parameters.richCodeNavigationLanguage, 'csharp') }} - environment: ${{ coalesce(parameters.richCodeNavigationEnvironment, 'internal') }} - richNavLogOutputDirectory: $(System.DefaultWorkingDirectory)/artifacts/bin - uploadRichNavArtifacts: ${{ coalesce(parameters.richCodeNavigationUploadArtifacts, false) }} - continueOnError: true - - ${{ each step in parameters.componentGovernanceSteps }}: - ${{ step }} - - ${{ if eq(parameters.enableMicrobuild, 'true') }}: - - ${{ if and(eq(parameters.runAsPublic, 'false'), ne(variables['System.TeamProject'], 'public'), notin(variables['Build.Reason'], 'PullRequest')) }}: - - task: MicroBuildCleanup@1 - displayName: Execute Microbuild cleanup tasks - condition: and(always(), in(variables['_SignType'], 'real', 'test'), eq(variables['Agent.Os'], 'Windows_NT')) + - ${{ if and(eq(parameters.runAsPublic, 'false'), ne(variables['System.TeamProject'], 'public'), notin(variables['Build.Reason'], 'PullRequest')) }}: + - template: /eng/common/core-templates/steps/cleanup-microbuild.yml + parameters: + enableMicrobuild: ${{ parameters.enableMicrobuild }} + enableMicrobuildForMacAndLinux: ${{ parameters.enableMicrobuildForMacAndLinux }} continueOnError: ${{ parameters.continueOnError }} - env: - TeamName: $(_TeamName) # Publish test results - ${{ if or(and(eq(parameters.enablePublishTestResults, 'true'), eq(parameters.testResultsFormat, '')), eq(parameters.testResultsFormat, 'xunit')) }}: diff --git a/eng/common/core-templates/job/onelocbuild.yml b/eng/common/core-templates/job/onelocbuild.yml index edefa789d36..c5788829a87 100644 --- a/eng/common/core-templates/job/onelocbuild.yml +++ b/eng/common/core-templates/job/onelocbuild.yml @@ -4,7 +4,7 @@ parameters: # Optional: A defined YAML 
pool - https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=vsts&tabs=schema#pool pool: '' - + CeapexPat: $(dn-bot-ceapex-package-r) # PAT for the loc AzDO instance https://dev.azure.com/ceapex GithubPat: $(BotAccount-dotnet-bot-repo-PAT) @@ -27,7 +27,7 @@ parameters: is1ESPipeline: '' jobs: - job: OneLocBuild${{ parameters.JobNameSuffix }} - + dependsOn: ${{ parameters.dependsOn }} displayName: OneLocBuild${{ parameters.JobNameSuffix }} @@ -86,8 +86,7 @@ jobs: isAutoCompletePrSelected: ${{ parameters.AutoCompletePr }} ${{ if eq(parameters.CreatePr, true) }}: isUseLfLineEndingsSelected: ${{ parameters.UseLfLineEndings }} - ${{ if eq(parameters.RepoType, 'gitHub') }}: - isShouldReusePrSelected: ${{ parameters.ReusePr }} + isShouldReusePrSelected: ${{ parameters.ReusePr }} packageSourceAuth: patAuth patVariable: ${{ parameters.CeapexPat }} ${{ if eq(parameters.RepoType, 'gitHub') }}: @@ -100,22 +99,20 @@ jobs: mirrorBranch: ${{ parameters.MirrorBranch }} condition: ${{ parameters.condition }} - - template: /eng/common/core-templates/steps/publish-build-artifacts.yml - parameters: - is1ESPipeline: ${{ parameters.is1ESPipeline }} - args: - displayName: Publish Localization Files - pathToPublish: '$(Build.ArtifactStagingDirectory)/loc' - publishLocation: Container - artifactName: Loc - condition: ${{ parameters.condition }} + # Copy the locProject.json to the root of the Loc directory, then publish a pipeline artifact + - task: CopyFiles@2 + displayName: Copy LocProject.json + inputs: + SourceFolder: '$(System.DefaultWorkingDirectory)/eng/Localize/' + Contents: 'LocProject.json' + TargetFolder: '$(Build.ArtifactStagingDirectory)/loc' + condition: ${{ parameters.condition }} - - template: /eng/common/core-templates/steps/publish-build-artifacts.yml + - template: /eng/common/core-templates/steps/publish-pipeline-artifacts.yml parameters: is1ESPipeline: ${{ parameters.is1ESPipeline }} args: - displayName: Publish LocProject.json - pathToPublish: '$(System.DefaultWorkingDirectory)/eng/Localize/' - publishLocation: Container - artifactName: Loc - condition: ${{ parameters.condition }} \ No newline at end of file + targetPath: '$(Build.ArtifactStagingDirectory)/loc' + artifactName: 'Loc' + displayName: 'Publish Localization Files' + condition: ${{ parameters.condition }} diff --git a/eng/common/core-templates/job/publish-build-assets.yml b/eng/common/core-templates/job/publish-build-assets.yml index b103b7ee168..37dff559fc1 100644 --- a/eng/common/core-templates/job/publish-build-assets.yml +++ b/eng/common/core-templates/job/publish-build-assets.yml @@ -20,9 +20,6 @@ parameters: # if 'true', the build won't run any of the internal only steps, even if it is running in non-public projects. 
runAsPublic: false - # Optional: whether the build's artifacts will be published using release pipelines or direct feed publishing - publishUsingPipelines: false - # Optional: whether the build's artifacts will be published using release pipelines or direct feed publishing publishAssetsImmediately: false @@ -32,8 +29,19 @@ parameters: is1ESPipeline: '' + # Optional: 🌤️ or not the build has assets it wants to publish to BAR + isAssetlessBuild: false + + # Optional, publishing version + publishingVersion: 3 + + # Optional: A minimatch pattern for the asset manifests to publish to BAR + assetManifestsPattern: '*/manifests/**/*.xml' + repositoryAlias: self + officialBuildId: '' + jobs: - job: Asset_Registry_Publish @@ -56,6 +64,11 @@ jobs: value: false # unconditional - needed for logs publishing (redactor tool version) - template: /eng/common/core-templates/post-build/common-variables.yml + - name: OfficialBuildId + ${{ if ne(parameters.officialBuildId, '') }}: + value: ${{ parameters.officialBuildId }} + ${{ else }}: + value: $(Build.BuildNumber) pool: # We don't use the collection uri here because it might vary (.visualstudio.com vs. dev.azure.com) @@ -77,15 +90,33 @@ jobs: - checkout: ${{ parameters.repositoryAlias }} fetchDepth: 3 clean: true - - - task: DownloadBuildArtifacts@0 - displayName: Download artifact - inputs: - artifactName: AssetManifests - downloadPath: '$(Build.StagingDirectory)/Download' - checkDownloadedFiles: true - condition: ${{ parameters.condition }} - continueOnError: ${{ parameters.continueOnError }} + + - ${{ if eq(parameters.isAssetlessBuild, 'false') }}: + - ${{ if eq(parameters.publishingVersion, 3) }}: + - task: DownloadPipelineArtifact@2 + displayName: Download Asset Manifests + inputs: + artifactName: AssetManifests + targetPath: '$(Build.StagingDirectory)/AssetManifests' + condition: ${{ parameters.condition }} + continueOnError: ${{ parameters.continueOnError }} + - ${{ if eq(parameters.publishingVersion, 4) }}: + - task: DownloadPipelineArtifact@2 + displayName: Download V4 asset manifests + inputs: + itemPattern: '*/manifests/**/*.xml' + targetPath: '$(Build.StagingDirectory)/AllAssetManifests' + condition: ${{ parameters.condition }} + continueOnError: ${{ parameters.continueOnError }} + - task: CopyFiles@2 + displayName: Copy V4 asset manifests to AssetManifests + inputs: + SourceFolder: '$(Build.StagingDirectory)/AllAssetManifests' + Contents: ${{ parameters.assetManifestsPattern }} + TargetFolder: '$(Build.StagingDirectory)/AssetManifests' + flattenFolders: true + condition: ${{ parameters.condition }} + continueOnError: ${{ parameters.continueOnError }} - task: NuGetAuthenticate@1 @@ -97,10 +128,10 @@ jobs: scriptLocation: scriptPath scriptPath: $(System.DefaultWorkingDirectory)/eng/common/sdk-task.ps1 arguments: -task PublishBuildAssets -restore -msbuildEngine dotnet - /p:ManifestsPath='$(Build.StagingDirectory)/Download/AssetManifests' + /p:ManifestsPath='$(Build.StagingDirectory)/AssetManifests' + /p:IsAssetlessBuild=${{ parameters.isAssetlessBuild }} /p:MaestroApiEndpoint=https://maestro.dot.net - /p:PublishUsingPipelines=${{ parameters.publishUsingPipelines }} - /p:OfficialBuildId=$(Build.BuildNumber) + /p:OfficialBuildId=$(OfficialBuildId) condition: ${{ parameters.condition }} continueOnError: ${{ parameters.continueOnError }} @@ -122,6 +153,17 @@ jobs: Copy-Item -Path $symbolExclusionfile -Destination "$(Build.StagingDirectory)/ReleaseConfigs" } + - ${{ if eq(parameters.publishingVersion, 4) }}: + - template: 
/eng/common/core-templates/steps/publish-pipeline-artifacts.yml + parameters: + is1ESPipeline: ${{ parameters.is1ESPipeline }} + args: + targetPath: '$(Build.ArtifactStagingDirectory)/MergedManifest.xml' + artifactName: AssetManifests + displayName: 'Publish Merged Manifest' + retryCountOnTaskFailure: 10 # for any logs being locked + sbomEnabled: false # we don't need SBOM for logs + - template: /eng/common/core-templates/steps/publish-build-artifacts.yml parameters: is1ESPipeline: ${{ parameters.is1ESPipeline }} @@ -131,7 +173,7 @@ jobs: publishLocation: Container artifactName: ReleaseConfigs - - ${{ if eq(parameters.publishAssetsImmediately, 'true') }}: + - ${{ if or(eq(parameters.publishAssetsImmediately, 'true'), eq(parameters.isAssetlessBuild, 'true')) }}: - template: /eng/common/core-templates/post-build/setup-maestro-vars.yml parameters: BARBuildId: ${{ parameters.BARBuildId }} @@ -152,6 +194,7 @@ jobs: -WaitPublishingFinish true -ArtifactsPublishingAdditionalParameters '${{ parameters.artifactsPublishingAdditionalParameters }}' -SymbolPublishingAdditionalParameters '${{ parameters.symbolPublishingAdditionalParameters }}' + -SkipAssetsPublishing '${{ parameters.isAssetlessBuild }}' - ${{ if eq(parameters.enablePublishBuildArtifacts, 'true') }}: - template: /eng/common/core-templates/steps/publish-logs.yml diff --git a/eng/common/core-templates/job/source-build.yml b/eng/common/core-templates/job/source-build.yml index 5baedac1e03..d805d5faeb9 100644 --- a/eng/common/core-templates/job/source-build.yml +++ b/eng/common/core-templates/job/source-build.yml @@ -12,9 +12,10 @@ parameters: # The name of the job. This is included in the job ID. # targetRID: '' # The name of the target RID to use, instead of the one auto-detected by Arcade. - # nonPortable: false + # portableBuild: false # Enables non-portable mode. This means a more specific RID (e.g. fedora.32-x64 rather than - # linux-x64), and compiling against distro-provided packages rather than portable ones. + # linux-x64), and compiling against distro-provided packages rather than portable ones. The + # default is portable mode. # skipPublishValidation: false # Disables publishing validation. By default, a check is performed to ensure no packages are # published by source-build. @@ -33,9 +34,6 @@ parameters: # container and pool. platform: {} - # Optional list of directories to ignore for component governance scans. 
- componentGovernanceIgnoreDirectories: [] - is1ESPipeline: '' # If set to true and running on a non-public project, @@ -96,4 +94,3 @@ jobs: parameters: is1ESPipeline: ${{ parameters.is1ESPipeline }} platform: ${{ parameters.platform }} - componentGovernanceIgnoreDirectories: ${{ parameters.componentGovernanceIgnoreDirectories }} diff --git a/eng/common/core-templates/job/source-index-stage1.yml b/eng/common/core-templates/job/source-index-stage1.yml index 662b9fcce15..30530359a5d 100644 --- a/eng/common/core-templates/job/source-index-stage1.yml +++ b/eng/common/core-templates/job/source-index-stage1.yml @@ -1,8 +1,5 @@ parameters: runAsPublic: false - sourceIndexUploadPackageVersion: 2.0.0-20250425.2 - sourceIndexProcessBinlogPackageVersion: 1.0.1-20250425.2 - sourceIndexPackageSource: https://pkgs.dev.azure.com/dnceng/public/_packaging/dotnet-tools/nuget/v3/index.json sourceIndexBuildCommand: powershell -NoLogo -NoProfile -ExecutionPolicy Bypass -Command "eng/common/build.ps1 -restore -build -binarylog -ci" preSteps: [] binlogPath: artifacts/log/Debug/Build.binlog @@ -16,12 +13,6 @@ jobs: dependsOn: ${{ parameters.dependsOn }} condition: ${{ parameters.condition }} variables: - - name: SourceIndexUploadPackageVersion - value: ${{ parameters.sourceIndexUploadPackageVersion }} - - name: SourceIndexProcessBinlogPackageVersion - value: ${{ parameters.sourceIndexProcessBinlogPackageVersion }} - - name: SourceIndexPackageSource - value: ${{ parameters.sourceIndexPackageSource }} - name: BinlogPath value: ${{ parameters.binlogPath }} - template: /eng/common/core-templates/variables/pool-providers.yml @@ -34,12 +25,10 @@ jobs: pool: ${{ if eq(variables['System.TeamProject'], 'public') }}: name: $(DncEngPublicBuildPool) - image: 1es-windows-2022-open - os: windows + image: windows.vs2022.amd64.open ${{ if eq(variables['System.TeamProject'], 'internal') }}: name: $(DncEngInternalBuildPool) - image: 1es-windows-2022 - os: windows + image: windows.vs2022.amd64 steps: - ${{ if eq(parameters.is1ESPipeline, '') }}: @@ -47,35 +36,9 @@ jobs: - ${{ each preStep in parameters.preSteps }}: - ${{ preStep }} - - - task: UseDotNet@2 - displayName: Use .NET 8 SDK - inputs: - packageType: sdk - version: 8.0.x - installationPath: $(Agent.TempDirectory)/dotnet - workingDirectory: $(Agent.TempDirectory) - - - script: | - $(Agent.TempDirectory)/dotnet/dotnet tool install BinLogToSln --version $(sourceIndexProcessBinlogPackageVersion) --add-source $(SourceIndexPackageSource) --tool-path $(Agent.TempDirectory)/.source-index/tools - $(Agent.TempDirectory)/dotnet/dotnet tool install UploadIndexStage1 --version $(sourceIndexUploadPackageVersion) --add-source $(SourceIndexPackageSource) --tool-path $(Agent.TempDirectory)/.source-index/tools - displayName: Download Tools - # Set working directory to temp directory so 'dotnet' doesn't try to use global.json and use the repo's sdk. 
- workingDirectory: $(Agent.TempDirectory) - - script: ${{ parameters.sourceIndexBuildCommand }} displayName: Build Repository - - script: $(Agent.TempDirectory)/.source-index/tools/BinLogToSln -i $(BinlogPath) -r $(System.DefaultWorkingDirectory) -n $(Build.Repository.Name) -o .source-index/stage1output - displayName: Process Binlog into indexable sln - - - ${{ if and(eq(parameters.runAsPublic, 'false'), ne(variables['System.TeamProject'], 'public'), notin(variables['Build.Reason'], 'PullRequest')) }}: - - task: AzureCLI@2 - displayName: Log in to Azure and upload stage1 artifacts to source index - inputs: - azureSubscription: 'SourceDotNet Stage1 Publish' - addSpnToEnvironment: true - scriptType: 'ps' - scriptLocation: 'inlineScript' - inlineScript: | - $(Agent.TempDirectory)/.source-index/tools/UploadIndexStage1 -i .source-index/stage1output -n $(Build.Repository.Name) -s netsourceindexstage1 -b stage1 + - template: /eng/common/core-templates/steps/source-index-stage1-publish.yml + parameters: + binLogPath: ${{ parameters.binLogPath }} \ No newline at end of file diff --git a/eng/common/core-templates/jobs/codeql-build.yml b/eng/common/core-templates/jobs/codeql-build.yml index 4571a7864df..dbc14ac580a 100644 --- a/eng/common/core-templates/jobs/codeql-build.yml +++ b/eng/common/core-templates/jobs/codeql-build.yml @@ -15,7 +15,6 @@ jobs: enablePublishBuildArtifacts: false enablePublishTestResults: false enablePublishBuildAssets: false - enablePublishUsingPipelines: false enableTelemetry: true variables: diff --git a/eng/common/core-templates/jobs/jobs.yml b/eng/common/core-templates/jobs/jobs.yml index 3129670b338..01ada747665 100644 --- a/eng/common/core-templates/jobs/jobs.yml +++ b/eng/common/core-templates/jobs/jobs.yml @@ -5,9 +5,6 @@ parameters: # Optional: Include PublishBuildArtifacts task enablePublishBuildArtifacts: false - # Optional: Enable publishing using release pipelines - enablePublishUsingPipelines: false - # Optional: Enable running the source-build jobs to build repo from source enableSourceBuild: false @@ -30,6 +27,9 @@ parameters: # Optional: Publish the assets as soon as the publish to BAR stage is complete, rather doing so in a separate stage. publishAssetsImmediately: false + # Optional: 🌤️ or not the build has assets it wants to publish to BAR + isAssetlessBuild: false + # Optional: If using publishAssetsImmediately and additional parameters are needed, can be used to send along additional parameters (normally sent to post-build.yml) artifactsPublishingAdditionalParameters: '' signingValidationAdditionalParameters: '' @@ -44,6 +44,7 @@ parameters: artifacts: {} is1ESPipeline: '' repositoryAlias: self + officialBuildId: '' # Internal resources (telemetry, microbuild) can only be accessed from non-public projects, # and some (Microbuild) should only be applied to non-PR cases for internal builds. 
@@ -84,7 +85,6 @@ jobs: - template: /eng/common/core-templates/jobs/source-build.yml parameters: is1ESPipeline: ${{ parameters.is1ESPipeline }} - allCompletedJobId: Source_Build_Complete ${{ each parameter in parameters.sourceBuildParameters }}: ${{ parameter.key }}: ${{ parameter.value }} @@ -97,7 +97,7 @@ jobs: ${{ parameter.key }}: ${{ parameter.value }} - ${{ if and(eq(parameters.runAsPublic, 'false'), ne(variables['System.TeamProject'], 'public'), notin(variables['Build.Reason'], 'PullRequest')) }}: - - ${{ if or(eq(parameters.enablePublishBuildAssets, true), eq(parameters.artifacts.publish.manifests, 'true'), ne(parameters.artifacts.publish.manifests, '')) }}: + - ${{ if or(eq(parameters.enablePublishBuildAssets, true), eq(parameters.artifacts.publish.manifests, 'true'), ne(parameters.artifacts.publish.manifests, ''), eq(parameters.isAssetlessBuild, true)) }}: - template: ../job/publish-build-assets.yml parameters: is1ESPipeline: ${{ parameters.is1ESPipeline }} @@ -109,13 +109,12 @@ jobs: - ${{ if eq(parameters.publishBuildAssetsDependsOn, '') }}: - ${{ each job in parameters.jobs }}: - ${{ job.job }} - - ${{ if eq(parameters.enableSourceBuild, true) }}: - - Source_Build_Complete runAsPublic: ${{ parameters.runAsPublic }} - publishUsingPipelines: ${{ parameters.enablePublishUsingPipelines }} - publishAssetsImmediately: ${{ parameters.publishAssetsImmediately }} + publishAssetsImmediately: ${{ or(parameters.publishAssetsImmediately, parameters.isAssetlessBuild) }} + isAssetlessBuild: ${{ parameters.isAssetlessBuild }} enablePublishBuildArtifacts: ${{ parameters.enablePublishBuildArtifacts }} artifactsPublishingAdditionalParameters: ${{ parameters.artifactsPublishingAdditionalParameters }} signingValidationAdditionalParameters: ${{ parameters.signingValidationAdditionalParameters }} repositoryAlias: ${{ parameters.repositoryAlias }} + officialBuildId: ${{ parameters.officialBuildId }} diff --git a/eng/common/core-templates/jobs/source-build.yml b/eng/common/core-templates/jobs/source-build.yml index 0b408a67bd5..d92860cba20 100644 --- a/eng/common/core-templates/jobs/source-build.yml +++ b/eng/common/core-templates/jobs/source-build.yml @@ -2,28 +2,19 @@ parameters: # This template adds arcade-powered source-build to CI. A job is created for each platform, as # well as an optional server job that completes when all platform jobs complete. - # The name of the "join" job for all source-build platforms. If set to empty string, the job is - # not included. Existing repo pipelines can use this job depend on all source-build jobs - # completing without maintaining a separate list of every single job ID: just depend on this one - # server job. By default, not included. Recommended name if used: 'Source_Build_Complete'. - allCompletedJobId: '' - # See /eng/common/core-templates/job/source-build.yml jobNamePrefix: 'Source_Build' # This is the default platform provided by Arcade, intended for use by a managed-only repo. defaultManagedPlatform: name: 'Managed' - container: 'mcr.microsoft.com/dotnet-buildtools/prereqs:centos-stream9' + container: 'mcr.microsoft.com/dotnet-buildtools/prereqs:centos-stream-10-amd64' # Defines the platforms on which to run build jobs. One job is created for each platform, and the # object in this array is sent to the job template as 'platform'. If no platforms are specified, # one job runs on 'defaultManagedPlatform'. platforms: [] - # Optional list of directories to ignore for component governance scans. 
- componentGovernanceIgnoreDirectories: [] - is1ESPipeline: '' # If set to true and running on a non-public project, @@ -34,23 +25,12 @@ parameters: jobs: -- ${{ if ne(parameters.allCompletedJobId, '') }}: - - job: ${{ parameters.allCompletedJobId }} - displayName: Source-Build Complete - pool: server - dependsOn: - - ${{ each platform in parameters.platforms }}: - - ${{ parameters.jobNamePrefix }}_${{ platform.name }} - - ${{ if eq(length(parameters.platforms), 0) }}: - - ${{ parameters.jobNamePrefix }}_${{ parameters.defaultManagedPlatform.name }} - - ${{ each platform in parameters.platforms }}: - template: /eng/common/core-templates/job/source-build.yml parameters: is1ESPipeline: ${{ parameters.is1ESPipeline }} jobNamePrefix: ${{ parameters.jobNamePrefix }} platform: ${{ platform }} - componentGovernanceIgnoreDirectories: ${{ parameters.componentGovernanceIgnoreDirectories }} enableInternalSources: ${{ parameters.enableInternalSources }} - ${{ if eq(length(parameters.platforms), 0) }}: @@ -59,5 +39,4 @@ jobs: is1ESPipeline: ${{ parameters.is1ESPipeline }} jobNamePrefix: ${{ parameters.jobNamePrefix }} platform: ${{ parameters.defaultManagedPlatform }} - componentGovernanceIgnoreDirectories: ${{ parameters.componentGovernanceIgnoreDirectories }} enableInternalSources: ${{ parameters.enableInternalSources }} diff --git a/eng/common/core-templates/post-build/post-build.yml b/eng/common/core-templates/post-build/post-build.yml index 2ee8bbfff54..f6f87fe5c67 100644 --- a/eng/common/core-templates/post-build/post-build.yml +++ b/eng/common/core-templates/post-build/post-build.yml @@ -60,6 +60,11 @@ parameters: artifactNames: '' downloadArtifacts: true + - name: isAssetlessBuild + type: boolean + displayName: Is Assetless Build + default: false + # These parameters let the user customize the call to sdk-task.ps1 for publishing # symbols & general artifacts as well as for signing validation - name: symbolPublishingAdditionalParameters @@ -188,9 +193,6 @@ stages: buildId: $(AzDOBuildId) artifactName: PackageArtifacts checkDownloadedFiles: true - itemPattern: | - ** - !**/Microsoft.SourceBuild.Intermediate.*.nupkg # This is necessary whenever we want to publish/restore to an AzDO private feed # Since sdk-task.ps1 tries to restore packages we need to do this authentication here @@ -320,3 +322,4 @@ stages: -RequireDefaultChannels ${{ parameters.requireDefaultChannels }} -ArtifactsPublishingAdditionalParameters '${{ parameters.artifactsPublishingAdditionalParameters }}' -SymbolPublishingAdditionalParameters '${{ parameters.symbolPublishingAdditionalParameters }}' + -SkipAssetsPublishing '${{ parameters.isAssetlessBuild }}' diff --git a/eng/common/core-templates/steps/cleanup-microbuild.yml b/eng/common/core-templates/steps/cleanup-microbuild.yml new file mode 100644 index 00000000000..c0fdcd3379d --- /dev/null +++ b/eng/common/core-templates/steps/cleanup-microbuild.yml @@ -0,0 +1,28 @@ +parameters: + # Enable cleanup tasks for MicroBuild + enableMicrobuild: false + # Enable cleanup tasks for MicroBuild on Mac and Linux + # Will be ignored if 'enableMicrobuild' is false or 'Agent.Os' is 'Windows_NT' + enableMicrobuildForMacAndLinux: false + continueOnError: false + +steps: + - ${{ if eq(parameters.enableMicrobuild, 'true') }}: + - task: MicroBuildCleanup@1 + displayName: Execute Microbuild cleanup tasks + condition: and( + always(), + or( + and( + eq(variables['Agent.Os'], 'Windows_NT'), + in(variables['_SignType'], 'real', 'test') + ), + and( + ${{ eq(parameters.enableMicrobuildForMacAndLinux, 
true) }}, + ne(variables['Agent.Os'], 'Windows_NT'), + eq(variables['_SignType'], 'real') + ) + )) + continueOnError: ${{ parameters.continueOnError }} + env: + TeamName: $(_TeamName) diff --git a/eng/common/core-templates/steps/generate-sbom.yml b/eng/common/core-templates/steps/generate-sbom.yml index 7f5b84c4cb8..c05f6502797 100644 --- a/eng/common/core-templates/steps/generate-sbom.yml +++ b/eng/common/core-templates/steps/generate-sbom.yml @@ -5,7 +5,7 @@ # IgnoreDirectories - Directories to ignore for SBOM generation. This will be passed through to the CG component detector. parameters: - PackageVersion: 9.0.0 + PackageVersion: 10.0.0 BuildDropPath: '$(System.DefaultWorkingDirectory)/artifacts' PackageName: '.NET' ManifestDirPath: $(Build.ArtifactStagingDirectory)/sbom diff --git a/eng/common/core-templates/steps/get-delegation-sas.yml b/eng/common/core-templates/steps/get-delegation-sas.yml index 9db5617ea7d..d2901470a7f 100644 --- a/eng/common/core-templates/steps/get-delegation-sas.yml +++ b/eng/common/core-templates/steps/get-delegation-sas.yml @@ -31,16 +31,7 @@ steps: # Calculate the expiration of the SAS token and convert to UTC $expiry = (Get-Date).AddHours(${{ parameters.expiryInHours }}).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ssZ") - # Temporarily work around a helix issue where SAS tokens with / in them will cause incorrect downloads - # of correlation payloads. https://github.com/dotnet/dnceng/issues/3484 - $sas = "" - do { - $sas = az storage container generate-sas --account-name ${{ parameters.storageAccount }} --name ${{ parameters.container }} --permissions ${{ parameters.permissions }} --expiry $expiry --auth-mode login --as-user -o tsv - if ($LASTEXITCODE -ne 0) { - Write-Error "Failed to generate SAS token." - exit 1 - } - } while($sas.IndexOf('/') -ne -1) + $sas = az storage container generate-sas --account-name ${{ parameters.storageAccount }} --name ${{ parameters.container }} --permissions ${{ parameters.permissions }} --expiry $expiry --auth-mode login --as-user -o tsv if ($LASTEXITCODE -ne 0) { Write-Error "Failed to generate SAS token." diff --git a/eng/common/core-templates/steps/install-microbuild.yml b/eng/common/core-templates/steps/install-microbuild.yml new file mode 100644 index 00000000000..d6b9878f54d --- /dev/null +++ b/eng/common/core-templates/steps/install-microbuild.yml @@ -0,0 +1,91 @@ +parameters: + # Enable install tasks for MicroBuild + enableMicrobuild: false + # Enable install tasks for MicroBuild on Mac and Linux + # Will be ignored if 'enableMicrobuild' is false or 'Agent.Os' is 'Windows_NT' + enableMicrobuildForMacAndLinux: false + # Determines whether the ESRP service connection information should be passed to the signing plugin. + # This overlaps with _SignType to some degree. We only need the service connection for real signing. + # It's important that the service connection not be passed to the MicroBuildSigningPlugin task in this place. + # Doing so will cause the service connection to be authorized for the pipeline, which isn't allowed and won't work for non-prod. + # Unfortunately, _SignType can't be used to exclude the use of the service connection in non-real sign scenarios. The + # variable is not available in template expression. _SignType has a very large proliferation across .NET, so replacing it is tough. + microbuildUseESRP: true + # Location of the MicroBuild output folder + # NOTE: There's something that relies on this being in the "default" source directory for tasks such as Signing to work properly. 
+ microBuildOutputFolder: '$(Build.SourcesDirectory)' + + continueOnError: false + +steps: + - ${{ if eq(parameters.enableMicrobuild, 'true') }}: + - ${{ if eq(parameters.enableMicrobuildForMacAndLinux, 'true') }}: + # Needed to download the MicroBuild plugin nupkgs on Mac and Linux when nuget.exe is unavailable + - task: UseDotNet@2 + displayName: Install .NET 8.0 SDK for MicroBuild Plugin + inputs: + packageType: sdk + version: 8.0.x + installationPath: ${{ parameters.microBuildOutputFolder }}/.dotnet + workingDirectory: ${{ parameters.microBuildOutputFolder }} + condition: and(succeeded(), ne(variables['Agent.Os'], 'Windows_NT')) + + - script: | + REM Check if ESRP is disabled while SignType is real + if /I "${{ parameters.microbuildUseESRP }}"=="false" if /I "$(_SignType)"=="real" ( + echo Error: ESRP must be enabled when SignType is real. + exit /b 1 + ) + displayName: 'Validate ESRP usage (Windows)' + condition: and(succeeded(), eq(variables['Agent.Os'], 'Windows_NT')) + - script: | + # Check if ESRP is disabled while SignType is real + if [ "${{ parameters.microbuildUseESRP }}" = "false" ] && [ "$(_SignType)" = "real" ]; then + echo "Error: ESRP must be enabled when SignType is real." + exit 1 + fi + displayName: 'Validate ESRP usage (Non-Windows)' + condition: and(succeeded(), ne(variables['Agent.Os'], 'Windows_NT')) + + # Two different MB install steps. This is due to not being able to use the agent OS during + # YAML expansion, and Windows vs. Linux/Mac uses different service connections. However, + # we can avoid including the MB install step if not enabled at all. This avoids a bunch of + # extra pipeline authorizations, since most pipelines do not sign on non-Windows. + - task: MicroBuildSigningPlugin@4 + displayName: Install MicroBuild plugin (Windows) + inputs: + signType: $(_SignType) + zipSources: false + feedSource: https://dnceng.pkgs.visualstudio.com/_packaging/MicroBuildToolset/nuget/v3/index.json + ${{ if eq(parameters.microbuildUseESRP, true) }}: + ConnectedServiceName: 'MicroBuild Signing Task (DevDiv)' + ${{ if eq(variables['System.TeamProject'], 'DevDiv') }}: + ConnectedPMEServiceName: 6cc74545-d7b9-4050-9dfa-ebefcc8961ea + ${{ else }}: + ConnectedPMEServiceName: 248d384a-b39b-46e3-8ad5-c2c210d5e7ca + env: + TeamName: $(_TeamName) + MicroBuildOutputFolderOverride: ${{ parameters.microBuildOutputFolder }} + SYSTEM_ACCESSTOKEN: $(System.AccessToken) + continueOnError: ${{ parameters.continueOnError }} + condition: and(succeeded(), eq(variables['Agent.Os'], 'Windows_NT'), in(variables['_SignType'], 'real', 'test')) + + - ${{ if eq(parameters.enableMicrobuildForMacAndLinux, true) }}: + - task: MicroBuildSigningPlugin@4 + displayName: Install MicroBuild plugin (non-Windows) + inputs: + signType: $(_SignType) + zipSources: false + feedSource: https://dnceng.pkgs.visualstudio.com/_packaging/MicroBuildToolset/nuget/v3/index.json + ${{ if eq(parameters.microbuildUseESRP, true) }}: + ConnectedServiceName: 'MicroBuild Signing Task (DevDiv)' + ${{ if eq(variables['System.TeamProject'], 'DevDiv') }}: + ConnectedPMEServiceName: beb8cb23-b303-4c95-ab26-9e44bc958d39 + ${{ else }}: + ConnectedPMEServiceName: c24de2a5-cc7a-493d-95e4-8e5ff5cad2bc + env: + TeamName: $(_TeamName) + MicroBuildOutputFolderOverride: ${{ parameters.microBuildOutputFolder }} + SYSTEM_ACCESSTOKEN: $(System.AccessToken) + continueOnError: ${{ parameters.continueOnError }} + condition: and(succeeded(), ne(variables['Agent.Os'], 'Windows_NT'), eq(variables['_SignType'], 'real')) diff --git 
a/eng/common/core-templates/steps/publish-logs.yml b/eng/common/core-templates/steps/publish-logs.yml index 0623ac6e112..10f825e270a 100644 --- a/eng/common/core-templates/steps/publish-logs.yml +++ b/eng/common/core-templates/steps/publish-logs.yml @@ -34,7 +34,9 @@ steps: '$(akams-client-id)' '$(microsoft-symbol-server-pat)' '$(symweb-symbol-server-pat)' + '$(dnceng-symbol-server-pat)' '$(dn-bot-all-orgs-build-rw-code-rw)' + '$(System.AccessToken)' ${{parameters.CustomSensitiveDataList}} continueOnError: true condition: always() @@ -45,6 +47,7 @@ steps: SourceFolder: '$(System.DefaultWorkingDirectory)/PostBuildLogs' Contents: '**' TargetFolder: '$(Build.ArtifactStagingDirectory)/PostBuildLogs' + condition: always() - template: /eng/common/core-templates/steps/publish-build-artifacts.yml parameters: diff --git a/eng/common/core-templates/steps/source-build.yml b/eng/common/core-templates/steps/source-build.yml index 0718e4ba902..acf16ed3496 100644 --- a/eng/common/core-templates/steps/source-build.yml +++ b/eng/common/core-templates/steps/source-build.yml @@ -11,10 +11,6 @@ parameters: # for details. The entire object is described in the 'job' template for simplicity, even though # the usage of the properties on this object is split between the 'job' and 'steps' templates. platform: {} - - # Optional list of directories to ignore for component governance scans. - componentGovernanceIgnoreDirectories: [] - is1ESPipeline: false steps: @@ -23,25 +19,12 @@ steps: set -x df -h - # If file changes are detected, set CopyWipIntoInnerSourceBuildRepo to copy the WIP changes into the inner source build repo. - internalRestoreArgs= - if ! git diff --quiet; then - internalRestoreArgs='/p:CopyWipIntoInnerSourceBuildRepo=true' - # The 'Copy WIP' feature of source build uses git stash to apply changes from the original repo. - # This only works if there is a username/email configured, which won't be the case in most CI runs. - git config --get user.email - if [ $? -ne 0 ]; then - git config user.email dn-bot@microsoft.com - git config user.name dn-bot - fi - fi - # If building on the internal project, the internal storage variable may be available (usually only if needed) # In that case, add variables to allow the download of internal runtimes if the specified versions are not found # in the default public locations. 
internalRuntimeDownloadArgs= if [ '$(dotnetbuilds-internal-container-read-token-base64)' != '$''(dotnetbuilds-internal-container-read-token-base64)' ]; then - internalRuntimeDownloadArgs='/p:DotNetRuntimeSourceFeed=https://dotnetbuilds.blob.core.windows.net/internal /p:DotNetRuntimeSourceFeedKey=$(dotnetbuilds-internal-container-read-token-base64) --runtimesourcefeed https://dotnetbuilds.blob.core.windows.net/internal --runtimesourcefeedkey $(dotnetbuilds-internal-container-read-token-base64)' + internalRuntimeDownloadArgs='/p:DotNetRuntimeSourceFeed=https://ci.dot.net/internal /p:DotNetRuntimeSourceFeedKey=$(dotnetbuilds-internal-container-read-token-base64) --runtimesourcefeed https://ci.dot.net/internal --runtimesourcefeedkey $(dotnetbuilds-internal-container-read-token-base64)' fi buildConfig=Release @@ -50,88 +33,33 @@ steps: buildConfig='$(_BuildConfig)' fi - officialBuildArgs= - if [ '${{ and(ne(variables['System.TeamProject'], 'public'), notin(variables['Build.Reason'], 'PullRequest')) }}' = 'True' ]; then - officialBuildArgs='/p:DotNetPublishUsingPipelines=true /p:OfficialBuildId=$(BUILD.BUILDNUMBER)' - fi - targetRidArgs= if [ '${{ parameters.platform.targetRID }}' != '' ]; then targetRidArgs='/p:TargetRid=${{ parameters.platform.targetRID }}' fi - runtimeOsArgs= - if [ '${{ parameters.platform.runtimeOS }}' != '' ]; then - runtimeOsArgs='/p:RuntimeOS=${{ parameters.platform.runtimeOS }}' - fi - - baseOsArgs= - if [ '${{ parameters.platform.baseOS }}' != '' ]; then - baseOsArgs='/p:BaseOS=${{ parameters.platform.baseOS }}' - fi - - publishArgs= - if [ '${{ parameters.platform.skipPublishValidation }}' != 'true' ]; then - publishArgs='--publish' - fi - - assetManifestFileName=SourceBuild_RidSpecific.xml - if [ '${{ parameters.platform.name }}' != '' ]; then - assetManifestFileName=SourceBuild_${{ parameters.platform.name }}.xml + portableBuildArgs= + if [ '${{ parameters.platform.portableBuild }}' != '' ]; then + portableBuildArgs='/p:PortableBuild=${{ parameters.platform.portableBuild }}' fi ${{ coalesce(parameters.platform.buildScript, './build.sh') }} --ci \ --configuration $buildConfig \ - --restore --build --pack $publishArgs -bl \ + --restore --build --pack -bl \ + --source-build \ ${{ parameters.platform.buildArguments }} \ - $officialBuildArgs \ $internalRuntimeDownloadArgs \ - $internalRestoreArgs \ $targetRidArgs \ - $runtimeOsArgs \ - $baseOsArgs \ - /p:SourceBuildNonPortable=${{ parameters.platform.nonPortable }} \ - /p:ArcadeBuildFromSource=true \ - /p:DotNetBuildSourceOnly=true \ - /p:DotNetBuildRepo=true \ - /p:AssetManifestFileName=$assetManifestFileName + $portableBuildArgs \ displayName: Build -# Upload build logs for diagnosis. 
-- task: CopyFiles@2 - displayName: Prepare BuildLogs staging directory - inputs: - SourceFolder: '$(System.DefaultWorkingDirectory)' - Contents: | - **/*.log - **/*.binlog - artifacts/sb/prebuilt-report/** - TargetFolder: '$(Build.StagingDirectory)/BuildLogs' - CleanTargetFolder: true - continueOnError: true - condition: succeededOrFailed() - - template: /eng/common/core-templates/steps/publish-pipeline-artifacts.yml parameters: is1ESPipeline: ${{ parameters.is1ESPipeline }} args: displayName: Publish BuildLogs - targetPath: '$(Build.StagingDirectory)/BuildLogs' + targetPath: artifacts/log/${{ coalesce(variables._BuildConfig, 'Release') }} artifactName: BuildLogs_SourceBuild_${{ parameters.platform.name }}_Attempt$(System.JobAttempt) continueOnError: true condition: succeededOrFailed() sbomEnabled: false # we don't need SBOM for logs - -# Manually inject component detection so that we can ignore the source build upstream cache, which contains -# a nupkg cache of input packages (a local feed). -# This path must match the upstream cache path in property 'CurrentRepoSourceBuiltNupkgCacheDir' -# in src\Microsoft.DotNet.Arcade.Sdk\tools\SourceBuild\SourceBuildArcade.targets -- template: /eng/common/core-templates/steps/component-governance.yml - parameters: - displayName: Component Detection (Exclude upstream cache) - is1ESPipeline: ${{ parameters.is1ESPipeline }} - ${{ if eq(length(parameters.componentGovernanceIgnoreDirectories), 0) }}: - componentGovernanceIgnoreDirectories: '$(System.DefaultWorkingDirectory)/artifacts/sb/src/artifacts/obj/source-built-upstream-cache' - ${{ else }}: - componentGovernanceIgnoreDirectories: ${{ join(',', parameters.componentGovernanceIgnoreDirectories) }} - disableComponentGovernance: ${{ eq(variables['System.TeamProject'], 'public') }} diff --git a/eng/common/core-templates/steps/source-index-stage1-publish.yml b/eng/common/core-templates/steps/source-index-stage1-publish.yml new file mode 100644 index 00000000000..e9a694afa58 --- /dev/null +++ b/eng/common/core-templates/steps/source-index-stage1-publish.yml @@ -0,0 +1,35 @@ +parameters: + sourceIndexUploadPackageVersion: 2.0.0-20250818.1 + sourceIndexProcessBinlogPackageVersion: 1.0.1-20250818.1 + sourceIndexPackageSource: https://pkgs.dev.azure.com/dnceng/public/_packaging/dotnet-tools/nuget/v3/index.json + binlogPath: artifacts/log/Debug/Build.binlog + +steps: +- task: UseDotNet@2 + displayName: "Source Index: Use .NET 9 SDK" + inputs: + packageType: sdk + version: 9.0.x + installationPath: $(Agent.TempDirectory)/dotnet + workingDirectory: $(Agent.TempDirectory) + +- script: | + $(Agent.TempDirectory)/dotnet/dotnet tool install BinLogToSln --version ${{parameters.sourceIndexProcessBinlogPackageVersion}} --add-source ${{parameters.SourceIndexPackageSource}} --tool-path $(Agent.TempDirectory)/.source-index/tools + $(Agent.TempDirectory)/dotnet/dotnet tool install UploadIndexStage1 --version ${{parameters.sourceIndexUploadPackageVersion}} --add-source ${{parameters.SourceIndexPackageSource}} --tool-path $(Agent.TempDirectory)/.source-index/tools + displayName: "Source Index: Download netsourceindex Tools" + # Set working directory to temp directory so 'dotnet' doesn't try to use global.json and use the repo's sdk. 
+ workingDirectory: $(Agent.TempDirectory) + +- script: $(Agent.TempDirectory)/.source-index/tools/BinLogToSln -i ${{parameters.BinlogPath}} -r $(System.DefaultWorkingDirectory) -n $(Build.Repository.Name) -o .source-index/stage1output + displayName: "Source Index: Process Binlog into indexable sln" + +- ${{ if and(ne(parameters.runAsPublic, 'true'), ne(variables['System.TeamProject'], 'public'), notin(variables['Build.Reason'], 'PullRequest')) }}: + - task: AzureCLI@2 + displayName: "Source Index: Upload Source Index stage1 artifacts to Azure" + inputs: + azureSubscription: 'SourceDotNet Stage1 Publish' + addSpnToEnvironment: true + scriptType: 'ps' + scriptLocation: 'inlineScript' + inlineScript: | + $(Agent.TempDirectory)/.source-index/tools/UploadIndexStage1 -i .source-index/stage1output -n $(Build.Repository.Name) -s netsourceindexstage1 -b stage1 diff --git a/eng/common/cross/arm64/tizen/tizen.patch b/eng/common/cross/arm64/tizen/tizen.patch index af7c8be0590..2cebc547382 100644 --- a/eng/common/cross/arm64/tizen/tizen.patch +++ b/eng/common/cross/arm64/tizen/tizen.patch @@ -5,5 +5,5 @@ diff -u -r a/usr/lib/libc.so b/usr/lib/libc.so Use the shared library, but some functions are only in the static library, so try that secondarily. */ OUTPUT_FORMAT(elf64-littleaarch64) --GROUP ( /lib64/libc.so.6 /usr/lib64/libc_nonshared.a AS_NEEDED ( /lib/ld-linux-aarch64.so.1 ) ) +-GROUP ( /lib64/libc.so.6 /usr/lib64/libc_nonshared.a AS_NEEDED ( /lib64/ld-linux-aarch64.so.1 ) ) +GROUP ( libc.so.6 libc_nonshared.a AS_NEEDED ( ld-linux-aarch64.so.1 ) ) diff --git a/eng/common/cross/armel/armel.jessie.patch b/eng/common/cross/armel/armel.jessie.patch deleted file mode 100644 index 2d261561935..00000000000 --- a/eng/common/cross/armel/armel.jessie.patch +++ /dev/null @@ -1,43 +0,0 @@ -diff -u -r a/usr/include/urcu/uatomic/generic.h b/usr/include/urcu/uatomic/generic.h ---- a/usr/include/urcu/uatomic/generic.h 2014-10-22 15:00:58.000000000 -0700 -+++ b/usr/include/urcu/uatomic/generic.h 2020-10-30 21:38:28.550000000 -0700 -@@ -69,10 +69,10 @@ - #endif - #ifdef UATOMIC_HAS_ATOMIC_SHORT - case 2: -- return __sync_val_compare_and_swap_2(addr, old, _new); -+ return __sync_val_compare_and_swap_2((uint16_t*) addr, old, _new); - #endif - case 4: -- return __sync_val_compare_and_swap_4(addr, old, _new); -+ return __sync_val_compare_and_swap_4((uint32_t*) addr, old, _new); - #if (CAA_BITS_PER_LONG == 64) - case 8: - return __sync_val_compare_and_swap_8(addr, old, _new); -@@ -109,7 +109,7 @@ - return; - #endif - case 4: -- __sync_and_and_fetch_4(addr, val); -+ __sync_and_and_fetch_4((uint32_t*) addr, val); - return; - #if (CAA_BITS_PER_LONG == 64) - case 8: -@@ -148,7 +148,7 @@ - return; - #endif - case 4: -- __sync_or_and_fetch_4(addr, val); -+ __sync_or_and_fetch_4((uint32_t*) addr, val); - return; - #if (CAA_BITS_PER_LONG == 64) - case 8: -@@ -187,7 +187,7 @@ - return __sync_add_and_fetch_2(addr, val); - #endif - case 4: -- return __sync_add_and_fetch_4(addr, val); -+ return __sync_add_and_fetch_4((uint32_t*) addr, val); - #if (CAA_BITS_PER_LONG == 64) - case 8: - return __sync_add_and_fetch_8(addr, val); diff --git a/eng/common/cross/build-android-rootfs.sh b/eng/common/cross/build-android-rootfs.sh index 7e9ba2b75ed..fbd8d80848a 100755 --- a/eng/common/cross/build-android-rootfs.sh +++ b/eng/common/cross/build-android-rootfs.sh @@ -6,10 +6,11 @@ usage() { echo "Creates a toolchain and sysroot used for cross-compiling for Android." 
echo - echo "Usage: $0 [BuildArch] [ApiLevel]" + echo "Usage: $0 [BuildArch] [ApiLevel] [--ndk NDKVersion]" echo echo "BuildArch is the target architecture of Android. Currently only arm64 is supported." echo "ApiLevel is the target Android API level. API levels usually match to Android releases. See https://source.android.com/source/build-numbers.html" + echo "NDKVersion is the version of Android NDK. The default is r21. See https://developer.android.com/ndk/downloads/revision_history" echo echo "By default, the toolchain and sysroot will be generated in cross/android-rootfs/toolchain/[BuildArch]. You can change this behavior" echo "by setting the TOOLCHAIN_DIR environment variable" @@ -25,10 +26,15 @@ __BuildArch=arm64 __AndroidArch=aarch64 __AndroidToolchain=aarch64-linux-android -for i in "$@" - do - lowerI="$(echo $i | tr "[:upper:]" "[:lower:]")" - case $lowerI in +while :; do + if [[ "$#" -le 0 ]]; then + break + fi + + i=$1 + + lowerI="$(echo $i | tr "[:upper:]" "[:lower:]")" + case $lowerI in -?|-h|--help) usage exit 1 @@ -43,6 +49,10 @@ for i in "$@" __AndroidArch=arm __AndroidToolchain=arm-linux-androideabi ;; + --ndk) + shift + __NDK_Version=$1 + ;; *[0-9]) __ApiLevel=$i ;; @@ -50,8 +60,17 @@ for i in "$@" __UnprocessedBuildArgs="$__UnprocessedBuildArgs $i" ;; esac + shift done +if [[ "$__NDK_Version" == "r21" ]] || [[ "$__NDK_Version" == "r22" ]]; then + __NDK_File_Arch_Spec=-x86_64 + __SysRoot=sysroot +else + __NDK_File_Arch_Spec= + __SysRoot=toolchains/llvm/prebuilt/linux-x86_64/sysroot +fi + # Obtain the location of the bash script to figure out where the root of the repo is. __ScriptBaseDir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" @@ -78,6 +97,7 @@ fi echo "Target API level: $__ApiLevel" echo "Target architecture: $__BuildArch" +echo "NDK version: $__NDK_Version" echo "NDK location: $__NDK_Dir" echo "Target Toolchain location: $__ToolchainDir" @@ -85,8 +105,8 @@ echo "Target Toolchain location: $__ToolchainDir" if [ ! -d $__NDK_Dir ]; then echo Downloading the NDK into $__NDK_Dir mkdir -p $__NDK_Dir - wget -q --progress=bar:force:noscroll --show-progress https://dl.google.com/android/repository/android-ndk-$__NDK_Version-linux-x86_64.zip -O $__CrossDir/android-ndk-$__NDK_Version-linux-x86_64.zip - unzip -q $__CrossDir/android-ndk-$__NDK_Version-linux-x86_64.zip -d $__CrossDir + wget -q --progress=bar:force:noscroll --show-progress https://dl.google.com/android/repository/android-ndk-$__NDK_Version-linux$__NDK_File_Arch_Spec.zip -O $__CrossDir/android-ndk-$__NDK_Version-linux.zip + unzip -q $__CrossDir/android-ndk-$__NDK_Version-linux.zip -d $__CrossDir fi if [ ! -d $__lldb_Dir ]; then @@ -116,16 +136,11 @@ for path in $(wget -qO- https://packages.termux.dev/termux-main-21/dists/stable/ fi done -cp -R "$__TmpDir/data/data/com.termux/files/usr/"* "$__ToolchainDir/sysroot/usr/" +cp -R "$__TmpDir/data/data/com.termux/files/usr/"* "$__ToolchainDir/$__SysRoot/usr/" # Generate platform file for build.sh script to assign to __DistroRid echo "Generating platform file..." 
-echo "RID=android.${__ApiLevel}-${__BuildArch}" > $__ToolchainDir/sysroot/android_platform - -echo "Now to build coreclr, libraries and installers; run:" -echo ROOTFS_DIR=\$\(realpath $__ToolchainDir/sysroot\) ./build.sh --cross --arch $__BuildArch \ - --subsetCategory coreclr -echo ROOTFS_DIR=\$\(realpath $__ToolchainDir/sysroot\) ./build.sh --cross --arch $__BuildArch \ - --subsetCategory libraries -echo ROOTFS_DIR=\$\(realpath $__ToolchainDir/sysroot\) ./build.sh --cross --arch $__BuildArch \ - --subsetCategory installer +echo "RID=android.${__ApiLevel}-${__BuildArch}" > $__ToolchainDir/$__SysRoot/android_platform + +echo "Now to build coreclr, libraries and host; run:" +echo ROOTFS_DIR=$(realpath $__ToolchainDir/$__SysRoot) ./build.sh clr+libs+host --cross --arch $__BuildArch diff --git a/eng/common/cross/build-rootfs.sh b/eng/common/cross/build-rootfs.sh index 4b5e8d7166b..8abfb71f727 100755 --- a/eng/common/cross/build-rootfs.sh +++ b/eng/common/cross/build-rootfs.sh @@ -5,7 +5,7 @@ set -e usage() { echo "Usage: $0 [BuildArch] [CodeName] [lldbx.y] [llvmx[.y]] [--skipunmount] --rootfsdir ]" - echo "BuildArch can be: arm(default), arm64, armel, armv6, ppc64le, riscv64, s390x, x64, x86" + echo "BuildArch can be: arm(default), arm64, armel, armv6, loongarch64, ppc64le, riscv64, s390x, x64, x86" echo "CodeName - optional, Code name for Linux, can be: xenial(default), zesty, bionic, alpine" echo " for alpine can be specified with version: alpineX.YY or alpineedge" echo " for FreeBSD can be: freebsd13, freebsd14" @@ -15,6 +15,7 @@ usage() echo "llvmx[.y] - optional, LLVM version for LLVM related packages." echo "--skipunmount - optional, will skip the unmount of rootfs folder." echo "--skipsigcheck - optional, will skip package signature checks (allowing untrusted packages)." + echo "--skipemulation - optional, will skip qemu and debootstrap requirement when building environment for debian based systems." echo "--use-mirror - optional, use mirror URL to fetch resources, when available." echo "--jobs N - optional, restrict to N jobs." 
exit 1 @@ -52,28 +53,27 @@ __UbuntuPackages+=" symlinks" __UbuntuPackages+=" libicu-dev" __UbuntuPackages+=" liblttng-ust-dev" __UbuntuPackages+=" libunwind8-dev" -__UbuntuPackages+=" libnuma-dev" __AlpinePackages+=" gettext-dev" __AlpinePackages+=" icu-dev" __AlpinePackages+=" libunwind-dev" __AlpinePackages+=" lttng-ust-dev" __AlpinePackages+=" compiler-rt" -__AlpinePackages+=" numactl-dev" # runtime libraries' dependencies __UbuntuPackages+=" libcurl4-openssl-dev" __UbuntuPackages+=" libkrb5-dev" __UbuntuPackages+=" libssl-dev" __UbuntuPackages+=" zlib1g-dev" +__UbuntuPackages+=" libbrotli-dev" __AlpinePackages+=" curl-dev" __AlpinePackages+=" krb5-dev" __AlpinePackages+=" openssl-dev" __AlpinePackages+=" zlib-dev" -__FreeBSDBase="13.3-RELEASE" -__FreeBSDPkg="1.17.0" +__FreeBSDBase="13.4-RELEASE" +__FreeBSDPkg="1.21.3" __FreeBSDABI="13" __FreeBSDPackages="libunwind" __FreeBSDPackages+=" icu" @@ -91,18 +91,18 @@ __HaikuPackages="gcc_syslibs" __HaikuPackages+=" gcc_syslibs_devel" __HaikuPackages+=" gmp" __HaikuPackages+=" gmp_devel" -__HaikuPackages+=" icu66" -__HaikuPackages+=" icu66_devel" +__HaikuPackages+=" icu[0-9]+" +__HaikuPackages+=" icu[0-9]*_devel" __HaikuPackages+=" krb5" __HaikuPackages+=" krb5_devel" __HaikuPackages+=" libiconv" __HaikuPackages+=" libiconv_devel" -__HaikuPackages+=" llvm12_libunwind" -__HaikuPackages+=" llvm12_libunwind_devel" +__HaikuPackages+=" llvm[0-9]*_libunwind" +__HaikuPackages+=" llvm[0-9]*_libunwind_devel" __HaikuPackages+=" mpfr" __HaikuPackages+=" mpfr_devel" -__HaikuPackages+=" openssl" -__HaikuPackages+=" openssl_devel" +__HaikuPackages+=" openssl3" +__HaikuPackages+=" openssl3_devel" __HaikuPackages+=" zlib" __HaikuPackages+=" zlib_devel" @@ -128,10 +128,12 @@ __AlpineKeys=' 616adfeb:MIICIjANBgkqhkiG9w0BAQEFAAOCAg8AMIICCgKCAgEAq0BFD1D4lIxQcsqEpQzU\npNCYM3aP1V/fxxVdT4DWvSI53JHTwHQamKdMWtEXetWVbP5zSROniYKFXd/xrD9X\n0jiGHey3lEtylXRIPxe5s+wXoCmNLcJVnvTcDtwx/ne2NLHxp76lyc25At+6RgE6\nADjLVuoD7M4IFDkAsd8UQ8zM0Dww9SylIk/wgV3ZkifecvgUQRagrNUdUjR56EBZ\nraQrev4hhzOgwelT0kXCu3snbUuNY/lU53CoTzfBJ5UfEJ5pMw1ij6X0r5S9IVsy\nKLWH1hiO0NzU2c8ViUYCly4Fe9xMTFc6u2dy/dxf6FwERfGzETQxqZvSfrRX+GLj\n/QZAXiPg5178hT/m0Y3z5IGenIC/80Z9NCi+byF1WuJlzKjDcF/TU72zk0+PNM/H\nKuppf3JT4DyjiVzNC5YoWJT2QRMS9KLP5iKCSThwVceEEg5HfhQBRT9M6KIcFLSs\nmFjx9kNEEmc1E8hl5IR3+3Ry8G5/bTIIruz14jgeY9u5jhL8Vyyvo41jgt9sLHR1\n/J1TxKfkgksYev7PoX6/ZzJ1ksWKZY5NFoDXTNYUgzFUTOoEaOg3BAQKadb3Qbbq\nXIrxmPBdgrn9QI7NCgfnAY3Tb4EEjs3ON/BNyEhUENcXOH6I1NbcuBQ7g9P73kE4\nVORdoc8MdJ5eoKBpO8Ww8HECAwEAAQ== 616ae350:MIICIjANBgkqhkiG9w0BAQEFAAOCAg8AMIICCgKCAgEAyduVzi1mWm+lYo2Tqt/0\nXkCIWrDNP1QBMVPrE0/ZlU2bCGSoo2Z9FHQKz/mTyMRlhNqTfhJ5qU3U9XlyGOPJ\npiM+b91g26pnpXJ2Q2kOypSgOMOPA4cQ42PkHBEqhuzssfj9t7x47ppS94bboh46\nxLSDRff/NAbtwTpvhStV3URYkxFG++cKGGa5MPXBrxIp+iZf9GnuxVdST5PGiVGP\nODL/b69sPJQNbJHVquqUTOh5Ry8uuD2WZuXfKf7/C0jC/ie9m2+0CttNu9tMciGM\nEyKG1/Xhk5iIWO43m4SrrT2WkFlcZ1z2JSf9Pjm4C2+HovYpihwwdM/OdP8Xmsnr\nDzVB4YvQiW+IHBjStHVuyiZWc+JsgEPJzisNY0Wyc/kNyNtqVKpX6dRhMLanLmy+\nf53cCSI05KPQAcGj6tdL+D60uKDkt+FsDa0BTAobZ31OsFVid0vCXtsbplNhW1IF\nHwsGXBTVcfXg44RLyL8Lk/2dQxDHNHzAUslJXzPxaHBLmt++2COa2EI1iWlvtznk\nOk9WP8SOAIj+xdqoiHcC4j72BOVVgiITIJNHrbppZCq6qPR+fgXmXa+sDcGh30m6\n9Wpbr28kLMSHiENCWTdsFij+NQTd5S47H7XTROHnalYDuF1RpS+DpQidT5tUimaT\nJZDr++FjKrnnijbyNF8b98UCAwEAAQ== 
616db30d:MIICIjANBgkqhkiG9w0BAQEFAAOCAg8AMIICCgKCAgEAnpUpyWDWjlUk3smlWeA0\nlIMW+oJ38t92CRLHH3IqRhyECBRW0d0aRGtq7TY8PmxjjvBZrxTNDpJT6KUk4LRm\na6A6IuAI7QnNK8SJqM0DLzlpygd7GJf8ZL9SoHSH+gFsYF67Cpooz/YDqWrlN7Vw\ntO00s0B+eXy+PCXYU7VSfuWFGK8TGEv6HfGMALLjhqMManyvfp8hz3ubN1rK3c8C\nUS/ilRh1qckdbtPvoDPhSbTDmfU1g/EfRSIEXBrIMLg9ka/XB9PvWRrekrppnQzP\nhP9YE3x/wbFc5QqQWiRCYyQl/rgIMOXvIxhkfe8H5n1Et4VAorkpEAXdsfN8KSVv\nLSMazVlLp9GYq5SUpqYX3KnxdWBgN7BJoZ4sltsTpHQ/34SXWfu3UmyUveWj7wp0\nx9hwsPirVI00EEea9AbP7NM2rAyu6ukcm4m6ATd2DZJIViq2es6m60AE6SMCmrQF\nwmk4H/kdQgeAELVfGOm2VyJ3z69fQuywz7xu27S6zTKi05Qlnohxol4wVb6OB7qG\nLPRtK9ObgzRo/OPumyXqlzAi/Yvyd1ZQk8labZps3e16bQp8+pVPiumWioMFJDWV\nGZjCmyMSU8V6MB6njbgLHoyg2LCukCAeSjbPGGGYhnKLm1AKSoJh3IpZuqcKCk5C\n8CM1S15HxV78s9dFntEqIokCAwEAAQ== +66ba20fe:MIICIjANBgkqhkiG9w0BAQEFAAOCAg8AMIICCgKCAgEAtfB12w4ZgqsXWZDfUAV/\n6Y4aHUKIu3q4SXrNZ7CXF9nXoAVYrS7NAxJdAodsY3vPCN0g5O8DFXR+390LdOuQ\n+HsGKCc1k5tX5ZXld37EZNTNSbR0k+NKhd9h6X3u6wqPOx7SIKxwAQR8qeeFq4pP\nrt9GAGlxtuYgzIIcKJPwE0dZlcBCg+GnptCUZXp/38BP1eYC+xTXSL6Muq1etYfg\nodXdb7Yl+2h1IHuOwo5rjgY5kpY7GcAs8AjGk3lDD/av60OTYccknH0NCVSmPoXK\nvrxDBOn0LQRNBLcAfnTKgHrzy0Q5h4TNkkyTgxkoQw5ObDk9nnabTxql732yy9BY\ns+hM9+dSFO1HKeVXreYSA2n1ndF18YAvAumzgyqzB7I4pMHXq1kC/8bONMJxwSkS\nYm6CoXKyavp7RqGMyeVpRC7tV+blkrrUml0BwNkxE+XnwDRB3xDV6hqgWe0XrifD\nYTfvd9ScZQP83ip0r4IKlq4GMv/R5shcCRJSkSZ6QSGshH40JYSoiwJf5FHbj9ND\n7do0UAqebWo4yNx63j/wb2ULorW3AClv0BCFSdPsIrCStiGdpgJDBR2P2NZOCob3\nG9uMj+wJD6JJg2nWqNJxkANXX37Qf8plgzssrhrgOvB0fjjS7GYhfkfmZTJ0wPOw\nA8+KzFseBh4UFGgue78KwgkCAwEAAQ== ' __Keyring= __KeyringFile="/usr/share/keyrings/ubuntu-archive-keyring.gpg" __SkipSigCheck=0 +__SkipEmulation=0 __UseMirror=0 __UnprocessedBuildArgs= @@ -162,9 +164,13 @@ while :; do armel) __BuildArch=armel __UbuntuArch=armel - __UbuntuRepo="http://ftp.debian.org/debian/" - __CodeName=jessie + __UbuntuRepo="http://archive.debian.org/debian/" + __CodeName=buster __KeyringFile="/usr/share/keyrings/debian-archive-keyring.gpg" + __LLDB_Package="liblldb-6.0-dev" + __UbuntuPackages="${__UbuntuPackages// libomp-dev/}" + __UbuntuPackages="${__UbuntuPackages// libomp5/}" + __UbuntuSuites= ;; armv6) __BuildArch=armv6 @@ -180,6 +186,18 @@ while :; do __Keyring="--keyring $__KeyringFile" fi ;; + loongarch64) + __BuildArch=loongarch64 + __AlpineArch=loongarch64 + __QEMUArch=loongarch64 + __UbuntuArch=loong64 + __UbuntuSuites=unreleased + __LLDB_Package="liblldb-19-dev" + + if [[ "$__CodeName" == "sid" ]]; then + __UbuntuRepo="http://ftp.ports.debian.org/debian-ports/" + fi + ;; riscv64) __BuildArch=riscv64 __AlpineArch=riscv64 @@ -264,44 +282,21 @@ while :; do ;; xenial) # Ubuntu 16.04 - if [[ "$__CodeName" != "jessie" ]]; then - __CodeName=xenial - fi - ;; - zesty) # Ubuntu 17.04 - if [[ "$__CodeName" != "jessie" ]]; then - __CodeName=zesty - fi + __CodeName=xenial ;; bionic) # Ubuntu 18.04 - if [[ "$__CodeName" != "jessie" ]]; then - __CodeName=bionic - fi + __CodeName=bionic ;; focal) # Ubuntu 20.04 - if [[ "$__CodeName" != "jessie" ]]; then - __CodeName=focal - fi + __CodeName=focal ;; jammy) # Ubuntu 22.04 - if [[ "$__CodeName" != "jessie" ]]; then - __CodeName=jammy - fi + __CodeName=jammy ;; noble) # Ubuntu 24.04 - if [[ "$__CodeName" != "jessie" ]]; then - __CodeName=noble - fi - if [[ -n "$__LLDB_Package" ]]; then - __LLDB_Package="liblldb-18-dev" - fi - ;; - jessie) # Debian 8 - __CodeName=jessie - __KeyringFile="/usr/share/keyrings/debian-archive-keyring.gpg" - - if [[ -z "$__UbuntuRepo" ]]; then - __UbuntuRepo="http://ftp.debian.org/debian/" + __CodeName=noble + if [[ -z 
"$__LLDB_Package" ]]; then + __LLDB_Package="liblldb-19-dev" fi ;; stretch) # Debian 9 @@ -319,7 +314,7 @@ while :; do __KeyringFile="/usr/share/keyrings/debian-archive-keyring.gpg" if [[ -z "$__UbuntuRepo" ]]; then - __UbuntuRepo="http://ftp.debian.org/debian/" + __UbuntuRepo="http://archive.debian.org/debian/" fi ;; bullseye) # Debian 11 @@ -340,10 +335,28 @@ while :; do ;; sid) # Debian sid __CodeName=sid - __KeyringFile="/usr/share/keyrings/debian-archive-keyring.gpg" + __UbuntuSuites= - if [[ -z "$__UbuntuRepo" ]]; then - __UbuntuRepo="http://ftp.debian.org/debian/" + # Debian-Ports architectures need different values + case "$__UbuntuArch" in + amd64|arm64|armel|armhf|i386|mips64el|ppc64el|riscv64|s390x) + __KeyringFile="/usr/share/keyrings/debian-archive-keyring.gpg" + + if [[ -z "$__UbuntuRepo" ]]; then + __UbuntuRepo="http://ftp.debian.org/debian/" + fi + ;; + *) + __KeyringFile="/usr/share/keyrings/debian-ports-archive-keyring.gpg" + + if [[ -z "$__UbuntuRepo" ]]; then + __UbuntuRepo="http://ftp.ports.debian.org/debian-ports/" + fi + ;; + esac + + if [[ -e "$__KeyringFile" ]]; then + __Keyring="--keyring $__KeyringFile" fi ;; tizen) @@ -370,7 +383,7 @@ while :; do ;; freebsd14) __CodeName=freebsd - __FreeBSDBase="14.0-RELEASE" + __FreeBSDBase="14.2-RELEASE" __FreeBSDABI="14" __SkipUnmount=1 ;; @@ -388,6 +401,9 @@ while :; do --skipsigcheck) __SkipSigCheck=1 ;; + --skipemulation) + __SkipEmulation=1 + ;; --rootfsdir|-rootfsdir) shift __RootfsDir="$1" @@ -420,16 +436,15 @@ case "$__AlpineVersion" in elif [[ "$__AlpineArch" == "x86" ]]; then __AlpineVersion=3.17 # minimum version that supports lldb-dev __AlpinePackages+=" llvm15-libs" - elif [[ "$__AlpineArch" == "riscv64" ]]; then + elif [[ "$__AlpineArch" == "riscv64" || "$__AlpineArch" == "loongarch64" ]]; then + __AlpineVersion=3.21 # minimum version that supports lldb-dev + __AlpinePackages+=" llvm19-libs" + elif [[ -n "$__AlpineMajorVersion" ]]; then + # use whichever alpine version is provided and select the latest toolchain libs __AlpineLlvmLibsLookup=1 - __AlpineVersion=edge # minimum version with APKINDEX.tar.gz (packages archive) else __AlpineVersion=3.13 # 3.13 to maximize compatibility __AlpinePackages+=" llvm10-libs" - - if [[ "$__AlpineArch" == "armv7" ]]; then - __AlpinePackages="${__AlpinePackages//numactl-dev/}" - fi fi esac @@ -439,15 +454,6 @@ if [[ "$__AlpineVersion" =~ 3\.1[345] ]]; then __AlpinePackages="${__AlpinePackages/compiler-rt/compiler-rt-static}" fi -if [[ "$__BuildArch" == "armel" ]]; then - __LLDB_Package="lldb-3.5-dev" -fi - -if [[ "$__CodeName" == "xenial" && "$__UbuntuArch" == "armhf" ]]; then - # libnuma-dev is not available on armhf for xenial - __UbuntuPackages="${__UbuntuPackages//libnuma-dev/}" -fi - __UbuntuPackages+=" ${__LLDB_Package:-}" if [[ -z "$__UbuntuRepo" ]]; then @@ -496,7 +502,7 @@ if [[ "$__CodeName" == "alpine" ]]; then arch="$(uname -m)" ensureDownloadTool - + if [[ "$__hasWget" == 1 ]]; then wget -P "$__ApkToolsDir" "https://gitlab.alpinelinux.org/api/v4/projects/5/packages/generic/v$__ApkToolsVersion/$arch/apk.static" else @@ -512,11 +518,6 @@ if [[ "$__CodeName" == "alpine" ]]; then echo "$__ApkToolsSHA512SUM $__ApkToolsDir/apk.static" | sha512sum -c chmod +x "$__ApkToolsDir/apk.static" - if [[ -f "/usr/bin/qemu-$__QEMUArch-static" ]]; then - mkdir -p "$__RootfsDir"/usr/bin - cp -v "/usr/bin/qemu-$__QEMUArch-static" "$__RootfsDir/usr/bin" - fi - if [[ "$__AlpineVersion" == "edge" ]]; then version=edge else @@ -536,6 +537,10 @@ if [[ "$__CodeName" == "alpine" ]]; then 
__ApkSignatureArg="--keys-dir $__ApkKeysDir" fi + if [[ "$__SkipEmulation" == "1" ]]; then + __NoEmulationArg="--no-scripts" + fi + # initialize DB # shellcheck disable=SC2086 "$__ApkToolsDir/apk.static" \ @@ -557,7 +562,7 @@ if [[ "$__CodeName" == "alpine" ]]; then "$__ApkToolsDir/apk.static" \ -X "http://dl-cdn.alpinelinux.org/alpine/$version/main" \ -X "http://dl-cdn.alpinelinux.org/alpine/$version/community" \ - -U $__ApkSignatureArg --root "$__RootfsDir" --arch "$__AlpineArch" \ + -U $__ApkSignatureArg --root "$__RootfsDir" --arch "$__AlpineArch" $__NoEmulationArg \ add $__AlpinePackages rm -r "$__ApkToolsDir" @@ -573,7 +578,7 @@ elif [[ "$__CodeName" == "freebsd" ]]; then curl -SL "https://download.freebsd.org/ftp/releases/${__FreeBSDArch}/${__FreeBSDMachineArch}/${__FreeBSDBase}/base.txz" | tar -C "$__RootfsDir" -Jxf - ./lib ./usr/lib ./usr/libdata ./usr/include ./usr/share/keys ./etc ./bin/freebsd-version fi echo "ABI = \"FreeBSD:${__FreeBSDABI}:${__FreeBSDMachineArch}\"; FINGERPRINTS = \"${__RootfsDir}/usr/share/keys\"; REPOS_DIR = [\"${__RootfsDir}/etc/pkg\"]; REPO_AUTOUPDATE = NO; RUN_SCRIPTS = NO;" > "${__RootfsDir}"/usr/local/etc/pkg.conf - echo "FreeBSD: { url: \"pkg+http://pkg.FreeBSD.org/\${ABI}/quarterly\", mirror_type: \"srv\", signature_type: \"fingerprints\", fingerprints: \"${__RootfsDir}/usr/share/keys/pkg\", enabled: yes }" > "${__RootfsDir}"/etc/pkg/FreeBSD.conf + echo "FreeBSD: { url: \"pkg+http://pkg.FreeBSD.org/\${ABI}/quarterly\", mirror_type: \"srv\", signature_type: \"fingerprints\", fingerprints: \"/usr/share/keys/pkg\", enabled: yes }" > "${__RootfsDir}"/etc/pkg/FreeBSD.conf mkdir -p "$__RootfsDir"/tmp # get and build package manager if [[ "$__hasWget" == 1 ]]; then @@ -681,7 +686,7 @@ elif [[ "$__CodeName" == "haiku" ]]; then ensureDownloadTool - echo "Downloading Haiku package tool" + echo "Downloading Haiku package tools" git clone https://github.com/haiku/haiku-toolchains-ubuntu --depth 1 "$__RootfsDir/tmp/script" if [[ "$__hasWget" == 1 ]]; then wget -O "$__RootfsDir/tmp/download/hosttools.zip" "$("$__RootfsDir/tmp/script/fetch.sh" --hosttools)" @@ -691,34 +696,42 @@ elif [[ "$__CodeName" == "haiku" ]]; then unzip -o "$__RootfsDir/tmp/download/hosttools.zip" -d "$__RootfsDir/tmp/bin" - DepotBaseUrl="https://depot.haiku-os.org/__api/v2/pkg/get-pkg" - HpkgBaseUrl="https://eu.hpkg.haiku-os.org/haiku/master/$__HaikuArch/current" + HaikuBaseUrl="https://eu.hpkg.haiku-os.org/haiku/master/$__HaikuArch/current" + HaikuPortsBaseUrl="https://eu.hpkg.haiku-os.org/haikuports/master/$__HaikuArch/current" + + echo "Downloading HaikuPorts package repository index..." + if [[ "$__hasWget" == 1 ]]; then + wget -P "$__RootfsDir/tmp/download" "$HaikuPortsBaseUrl/repo" + else + curl -SLO --create-dirs --output-dir "$__RootfsDir/tmp/download" "$HaikuPortsBaseUrl/repo" + fi - # Download Haiku packages echo "Downloading Haiku packages" read -ra array <<<"$__HaikuPackages" for package in "${array[@]}"; do echo "Downloading $package..." - # API documented here: https://github.com/haiku/haikudepotserver/blob/master/haikudepotserver-api2/src/main/resources/api2/pkg.yaml#L60 - # The schema here: https://github.com/haiku/haikudepotserver/blob/master/haikudepotserver-api2/src/main/resources/api2/pkg.yaml#L598 + hpkgFilename="$(LD_LIBRARY_PATH="$__RootfsDir/tmp/bin" "$__RootfsDir/tmp/bin/package_repo" list -f "$__RootfsDir/tmp/download/repo" | + grep -E "${package}-" | sort -V | tail -n 1 | xargs)" + if [ -z "$hpkgFilename" ]; then + >&2 echo "ERROR: package $package missing." 
+ exit 1 + fi + echo "Resolved filename: $hpkgFilename..." + hpkgDownloadUrl="$HaikuPortsBaseUrl/packages/$hpkgFilename" if [[ "$__hasWget" == 1 ]]; then - hpkgDownloadUrl="$(wget -qO- --post-data '{"name":"'"$package"'","repositorySourceCode":"haikuports_'$__HaikuArch'","versionType":"LATEST","naturalLanguageCode":"en"}' \ - --header 'Content-Type:application/json' "$DepotBaseUrl" | jq -r '.result.versions[].hpkgDownloadURL')" wget -P "$__RootfsDir/tmp/download" "$hpkgDownloadUrl" else - hpkgDownloadUrl="$(curl -sSL -XPOST --data '{"name":"'"$package"'","repositorySourceCode":"haikuports_'$__HaikuArch'","versionType":"LATEST","naturalLanguageCode":"en"}' \ - --header 'Content-Type:application/json' "$DepotBaseUrl" | jq -r '.result.versions[].hpkgDownloadURL')" curl -SLO --create-dirs --output-dir "$__RootfsDir/tmp/download" "$hpkgDownloadUrl" fi done for package in haiku haiku_devel; do echo "Downloading $package..." if [[ "$__hasWget" == 1 ]]; then - hpkgVersion="$(wget -qO- "$HpkgBaseUrl" | sed -n 's/^.*version: "\([^"]*\)".*$/\1/p')" - wget -P "$__RootfsDir/tmp/download" "$HpkgBaseUrl/packages/$package-$hpkgVersion-1-$__HaikuArch.hpkg" + hpkgVersion="$(wget -qO- "$HaikuBaseUrl" | sed -n 's/^.*version: "\([^"]*\)".*$/\1/p')" + wget -P "$__RootfsDir/tmp/download" "$HaikuBaseUrl/packages/$package-$hpkgVersion-1-$__HaikuArch.hpkg" else - hpkgVersion="$(curl -sSL "$HpkgBaseUrl" | sed -n 's/^.*version: "\([^"]*\)".*$/\1/p')" - curl -SLO --create-dirs --output-dir "$__RootfsDir/tmp/download" "$HpkgBaseUrl/packages/$package-$hpkgVersion-1-$__HaikuArch.hpkg" + hpkgVersion="$(curl -sSL "$HaikuBaseUrl" | sed -n 's/^.*version: "\([^"]*\)".*$/\1/p')" + curl -SLO --create-dirs --output-dir "$__RootfsDir/tmp/download" "$HaikuBaseUrl/packages/$package-$hpkgVersion-1-$__HaikuArch.hpkg" fi done @@ -744,25 +757,67 @@ elif [[ "$__CodeName" == "haiku" ]]; then popd rm -rf "$__RootfsDir/tmp" elif [[ -n "$__CodeName" ]]; then + __Suites="$__CodeName $(for suite in $__UbuntuSuites; do echo -n "$__CodeName-$suite "; done)" + + if [[ "$__SkipEmulation" == "1" ]]; then + if [[ -z "$AR" ]]; then + if command -v ar &>/dev/null; then + AR="$(command -v ar)" + elif command -v llvm-ar &>/dev/null; then + AR="$(command -v llvm-ar)" + else + echo "Unable to find ar or llvm-ar on PATH, add them to PATH or set AR environment variable pointing to the available AR tool" + exit 1 + fi + fi + + PYTHON=${PYTHON_EXECUTABLE:-python3} + + # shellcheck disable=SC2086,SC2046 + echo running "$PYTHON" "$__CrossDir/install-debs.py" --arch "$__UbuntuArch" --mirror "$__UbuntuRepo" --rootfsdir "$__RootfsDir" --artool "$AR" \ + $(for suite in $__Suites; do echo -n "--suite $suite "; done) \ + $__UbuntuPackages + + # shellcheck disable=SC2086,SC2046 + "$PYTHON" "$__CrossDir/install-debs.py" --arch "$__UbuntuArch" --mirror "$__UbuntuRepo" --rootfsdir "$__RootfsDir" --artool "$AR" \ + $(for suite in $__Suites; do echo -n "--suite $suite "; done) \ + $__UbuntuPackages + exit 0 + fi + + __UpdateOptions= if [[ "$__SkipSigCheck" == "0" ]]; then __Keyring="$__Keyring --force-check-gpg" + else + __Keyring= + __UpdateOptions="--allow-unauthenticated --allow-insecure-repositories" fi # shellcheck disable=SC2086 echo running debootstrap "--variant=minbase" $__Keyring --arch "$__UbuntuArch" "$__CodeName" "$__RootfsDir" "$__UbuntuRepo" - debootstrap "--variant=minbase" $__Keyring --arch "$__UbuntuArch" "$__CodeName" "$__RootfsDir" "$__UbuntuRepo" + # shellcheck disable=SC2086 + if ! 
debootstrap "--variant=minbase" $__Keyring --arch "$__UbuntuArch" "$__CodeName" "$__RootfsDir" "$__UbuntuRepo"; then + echo "debootstrap failed! dumping debootstrap.log" + cat "$__RootfsDir/debootstrap/debootstrap.log" + exit 1 + fi + + rm -rf "$__RootfsDir"/etc/apt/*.{sources,list} "$__RootfsDir"/etc/apt/sources.list.d mkdir -p "$__RootfsDir/etc/apt/sources.list.d/" + + # shellcheck disable=SC2086 cat > "$__RootfsDir/etc/apt/sources.list.d/$__CodeName.sources" < token2) - (token1 < token2) + else: + return -1 if isinstance(token1, str) else 1 + + return len(tokens1) - len(tokens2) + +def compare_debian_versions(version1, version2): + """Compare two Debian package versions.""" + epoch1, upstream1, revision1 = parse_debian_version(version1) + epoch2, upstream2, revision2 = parse_debian_version(version2) + + if epoch1 != epoch2: + return epoch1 - epoch2 + + result = compare_upstream_version(upstream1, upstream2) + if result != 0: + return result + + return compare_upstream_version(revision1, revision2) + +def resolve_dependencies(packages, aliases, desired_packages): + """Recursively resolves dependencies for the desired packages.""" + resolved = [] + to_process = deque(desired_packages) + + while to_process: + current = to_process.popleft() + resolved_package = current if current in packages else aliases.get(current, [None])[0] + + if not resolved_package: + print(f"Error: Package '{current}' was not found in the available packages.") + sys.exit(1) + + if resolved_package not in resolved: + resolved.append(resolved_package) + + deps = packages.get(resolved_package, {}).get("Depends", "") + if deps: + deps = [dep.split(' ')[0] for dep in deps.split(', ') if dep] + for dep in deps: + if dep not in resolved and dep not in to_process and dep in packages: + to_process.append(dep) + + return resolved + +def parse_package_index(content): + """Parses the Packages.gz file and returns package information.""" + packages = {} + aliases = {} + entries = re.split(r'\n\n+', content) + + for entry in entries: + fields = dict(re.findall(r'^(\S+): (.+)$', entry, re.MULTILINE)) + if "Package" in fields: + package_name = fields["Package"] + version = fields.get("Version") + filename = fields.get("Filename") + depends = fields.get("Depends") + provides = fields.get("Provides", None) + + # Only update if package_name is not in packages or if the new version is higher + if package_name not in packages or compare_debian_versions(version, packages[package_name]["Version"]) > 0: + packages[package_name] = { + "Version": version, + "Filename": filename, + "Depends": depends + } + + # Update aliases if package provides any alternatives + if provides: + provides_list = [x.strip() for x in provides.split(",")] + for alias in provides_list: + # Strip version specifiers + alias_name = re.sub(r'\s*\(=.*\)', '', alias) + if alias_name not in aliases: + aliases[alias_name] = [] + if package_name not in aliases[alias_name]: + aliases[alias_name].append(package_name) + + return packages, aliases + +def install_packages(mirror, packages_info, aliases, tmp_dir, extract_dir, ar_tool, desired_packages): + """Downloads .deb files and extracts them.""" + resolved_packages = resolve_dependencies(packages_info, aliases, desired_packages) + print(f"Resolved packages (including dependencies): {resolved_packages}") + + packages_to_download = {} + + for pkg in resolved_packages: + if pkg in packages_info: + packages_to_download[pkg] = packages_info[pkg] + + if pkg in aliases: + for alias in aliases[pkg]: + if alias in packages_info: + 
packages_to_download[alias] = packages_info[alias] + + asyncio.run(download_deb_files_parallel(mirror, packages_to_download, tmp_dir)) + + package_to_deb_file_map = {} + for pkg in resolved_packages: + pkg_info = packages_info.get(pkg) + if pkg_info: + deb_filename = pkg_info.get("Filename") + if deb_filename: + deb_file_path = os.path.join(tmp_dir, os.path.basename(deb_filename)) + package_to_deb_file_map[pkg] = deb_file_path + + for pkg in reversed(resolved_packages): + deb_file = package_to_deb_file_map.get(pkg) + if deb_file and os.path.exists(deb_file): + extract_deb_file(deb_file, tmp_dir, extract_dir, ar_tool) + + print("All done!") + +def extract_deb_file(deb_file, tmp_dir, extract_dir, ar_tool): + """Extract .deb file contents""" + + os.makedirs(extract_dir, exist_ok=True) + + with tempfile.TemporaryDirectory(dir=tmp_dir) as tmp_subdir: + result = subprocess.run(f"{ar_tool} t {os.path.abspath(deb_file)}", cwd=tmp_subdir, check=True, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE) + + tar_filename = None + for line in result.stdout.decode().splitlines(): + if line.startswith("data.tar"): + tar_filename = line.strip() + break + + if not tar_filename: + raise FileNotFoundError(f"Could not find 'data.tar.*' in {deb_file}.") + + tar_file_path = os.path.join(tmp_subdir, tar_filename) + print(f"Extracting {tar_filename} from {deb_file}..") + + subprocess.run(f"{ar_tool} p {os.path.abspath(deb_file)} {tar_filename} > {tar_file_path}", check=True, shell=True) + + file_extension = os.path.splitext(tar_file_path)[1].lower() + + if file_extension == ".xz": + mode = "r:xz" + elif file_extension == ".gz": + mode = "r:gz" + elif file_extension == ".zst": + # zstd is not supported by standard library yet + decompressed_tar_path = tar_file_path.replace(".zst", "") + with open(tar_file_path, "rb") as zst_file, open(decompressed_tar_path, "wb") as decompressed_file: + dctx = zstandard.ZstdDecompressor() + dctx.copy_stream(zst_file, decompressed_file) + + tar_file_path = decompressed_tar_path + mode = "r" + else: + raise ValueError(f"Unsupported compression format: {file_extension}") + + with tarfile.open(tar_file_path, mode) as tar: + tar.extractall(path=extract_dir, filter='fully_trusted') + +def finalize_setup(rootfsdir): + lib_dir = os.path.join(rootfsdir, 'lib') + usr_lib_dir = os.path.join(rootfsdir, 'usr', 'lib') + + if os.path.exists(lib_dir): + if os.path.islink(lib_dir): + os.remove(lib_dir) + else: + os.makedirs(usr_lib_dir, exist_ok=True) + + for item in os.listdir(lib_dir): + src = os.path.join(lib_dir, item) + dest = os.path.join(usr_lib_dir, item) + + if os.path.isdir(src): + shutil.copytree(src, dest, dirs_exist_ok=True) + else: + shutil.copy2(src, dest) + + shutil.rmtree(lib_dir) + + os.symlink(usr_lib_dir, lib_dir) + +if __name__ == "__main__": + parser = argparse.ArgumentParser(description="Generate rootfs for .NET runtime on Debian-like OS") + parser.add_argument("--distro", required=False, help="Distro name (e.g., debian, ubuntu, etc.)") + parser.add_argument("--arch", required=True, help="Architecture (e.g., amd64, loong64, etc.)") + parser.add_argument("--rootfsdir", required=True, help="Destination directory.") + parser.add_argument('--suite', required=True, action='append', help='Specify one or more repository suites to collect index data.') + parser.add_argument("--mirror", required=False, help="Mirror (e.g., http://ftp.debian.org/debian-ports etc.)") + parser.add_argument("--artool", required=False, default="ar", help="ar tool to extract debs (e.g., ar, llvm-ar 
etc.)") + parser.add_argument("packages", nargs="+", help="List of package names to be installed.") + + args = parser.parse_args() + + if args.mirror is None: + if args.distro == "ubuntu": + args.mirror = "http://archive.ubuntu.com/ubuntu" if args.arch in ["amd64", "i386"] else "http://ports.ubuntu.com/ubuntu-ports" + elif args.distro == "debian": + args.mirror = "http://ftp.debian.org/debian-ports" + else: + raise Exception("Unsupported distro") + + DESIRED_PACKAGES = args.packages + [ # base packages + "dpkg", + "busybox", + "libc-bin", + "base-files", + "base-passwd", + "debianutils" + ] + + print(f"Creating rootfs. rootfsdir: {args.rootfsdir}, distro: {args.distro}, arch: {args.arch}, suites: {args.suite}, mirror: {args.mirror}") + + package_index_content = asyncio.run(download_package_index_parallel(args.mirror, args.arch, args.suite)) + + packages_info, aliases = parse_package_index(package_index_content) + + with tempfile.TemporaryDirectory() as tmp_dir: + install_packages(args.mirror, packages_info, aliases, tmp_dir, args.rootfsdir, args.artool, DESIRED_PACKAGES) + + finalize_setup(args.rootfsdir) diff --git a/eng/common/cross/tizen-fetch.sh b/eng/common/cross/tizen-fetch.sh index 28936ceef3a..37c3a61f1de 100755 --- a/eng/common/cross/tizen-fetch.sh +++ b/eng/common/cross/tizen-fetch.sh @@ -156,13 +156,8 @@ fetch_tizen_pkgs() done } -if [ "$TIZEN_ARCH" == "riscv64" ]; then - BASE="Tizen-Base-RISCV" - UNIFIED="Tizen-Unified-RISCV" -else - BASE="Tizen-Base" - UNIFIED="Tizen-Unified" -fi +BASE="Tizen-Base" +UNIFIED="Tizen-Unified" Inform "Initialize ${TIZEN_ARCH} base" fetch_tizen_pkgs_init standard $BASE diff --git a/eng/common/cross/toolchain.cmake b/eng/common/cross/toolchain.cmake index 9a7ecfbd42c..0ff85cf0367 100644 --- a/eng/common/cross/toolchain.cmake +++ b/eng/common/cross/toolchain.cmake @@ -67,6 +67,13 @@ elseif(TARGET_ARCH_NAME STREQUAL "armv6") else() set(TOOLCHAIN "arm-linux-gnueabihf") endif() +elseif(TARGET_ARCH_NAME STREQUAL "loongarch64") + set(CMAKE_SYSTEM_PROCESSOR "loongarch64") + if(EXISTS ${CROSS_ROOTFS}/usr/lib/gcc/loongarch64-alpine-linux-musl) + set(TOOLCHAIN "loongarch64-alpine-linux-musl") + else() + set(TOOLCHAIN "loongarch64-linux-gnu") + endif() elseif(TARGET_ARCH_NAME STREQUAL "ppc64le") set(CMAKE_SYSTEM_PROCESSOR ppc64le) if(EXISTS ${CROSS_ROOTFS}/usr/lib/gcc/powerpc64le-alpine-linux-musl) @@ -118,7 +125,7 @@ elseif(TARGET_ARCH_NAME STREQUAL "x86") set(TIZEN_TOOLCHAIN "i586-tizen-linux-gnu") endif() else() - message(FATAL_ERROR "Arch is ${TARGET_ARCH_NAME}. Only arm, arm64, armel, armv6, ppc64le, riscv64, s390x, x64 and x86 are supported!") + message(FATAL_ERROR "Arch is ${TARGET_ARCH_NAME}. Only arm, arm64, armel, armv6, loongarch64, ppc64le, riscv64, s390x, x64 and x86 are supported!") endif() if(DEFINED ENV{TOOLCHAIN}) @@ -148,6 +155,25 @@ if(TIZEN) include_directories(SYSTEM ${TIZEN_TOOLCHAIN_PATH}/include/c++/${TIZEN_TOOLCHAIN}) endif() +function(locate_toolchain_exec exec var) + set(TOOLSET_PREFIX ${TOOLCHAIN}-) + string(TOUPPER ${exec} EXEC_UPPERCASE) + if(NOT "$ENV{CLR_${EXEC_UPPERCASE}}" STREQUAL "") + set(${var} "$ENV{CLR_${EXEC_UPPERCASE}}" PARENT_SCOPE) + return() + endif() + + find_program(EXEC_LOCATION_${exec} + NAMES + "${TOOLSET_PREFIX}${exec}${CLR_CMAKE_COMPILER_FILE_NAME_VERSION}" + "${TOOLSET_PREFIX}${exec}") + + if (EXEC_LOCATION_${exec} STREQUAL "EXEC_LOCATION_${exec}-NOTFOUND") + message(FATAL_ERROR "Unable to find toolchain executable. 
Name: ${exec}, Prefix: ${TOOLSET_PREFIX}.") + endif() + set(${var} ${EXEC_LOCATION_${exec}} PARENT_SCOPE) +endfunction() + if(ANDROID) if(TARGET_ARCH_NAME STREQUAL "arm") set(ANDROID_ABI armeabi-v7a) @@ -178,66 +204,24 @@ elseif(FREEBSD) set(CMAKE_MODULE_LINKER_FLAGS "${CMAKE_MODULE_LINKER_FLAGS} -fuse-ld=lld") elseif(ILLUMOS) set(CMAKE_SYSROOT "${CROSS_ROOTFS}") + set(CMAKE_SYSTEM_PREFIX_PATH "${CROSS_ROOTFS}") + set(CMAKE_C_STANDARD_LIBRARIES "${CMAKE_C_STANDARD_LIBRARIES} -lssp") + set(CMAKE_CXX_STANDARD_LIBRARIES "${CMAKE_CXX_STANDARD_LIBRARIES} -lssp") include_directories(SYSTEM ${CROSS_ROOTFS}/include) - set(TOOLSET_PREFIX ${TOOLCHAIN}-) - function(locate_toolchain_exec exec var) - string(TOUPPER ${exec} EXEC_UPPERCASE) - if(NOT "$ENV{CLR_${EXEC_UPPERCASE}}" STREQUAL "") - set(${var} "$ENV{CLR_${EXEC_UPPERCASE}}" PARENT_SCOPE) - return() - endif() - - find_program(EXEC_LOCATION_${exec} - NAMES - "${TOOLSET_PREFIX}${exec}${CLR_CMAKE_COMPILER_FILE_NAME_VERSION}" - "${TOOLSET_PREFIX}${exec}") - - if (EXEC_LOCATION_${exec} STREQUAL "EXEC_LOCATION_${exec}-NOTFOUND") - message(FATAL_ERROR "Unable to find toolchain executable. Name: ${exec}, Prefix: ${TOOLSET_PREFIX}.") - endif() - set(${var} ${EXEC_LOCATION_${exec}} PARENT_SCOPE) - endfunction() - - set(CMAKE_SYSTEM_PREFIX_PATH "${CROSS_ROOTFS}") - locate_toolchain_exec(gcc CMAKE_C_COMPILER) locate_toolchain_exec(g++ CMAKE_CXX_COMPILER) - - set(CMAKE_C_STANDARD_LIBRARIES "${CMAKE_C_STANDARD_LIBRARIES} -lssp") - set(CMAKE_CXX_STANDARD_LIBRARIES "${CMAKE_CXX_STANDARD_LIBRARIES} -lssp") elseif(HAIKU) set(CMAKE_SYSROOT "${CROSS_ROOTFS}") set(CMAKE_PROGRAM_PATH "${CMAKE_PROGRAM_PATH};${CROSS_ROOTFS}/cross-tools-x86_64/bin") - - set(TOOLSET_PREFIX ${TOOLCHAIN}-) - function(locate_toolchain_exec exec var) - string(TOUPPER ${exec} EXEC_UPPERCASE) - if(NOT "$ENV{CLR_${EXEC_UPPERCASE}}" STREQUAL "") - set(${var} "$ENV{CLR_${EXEC_UPPERCASE}}" PARENT_SCOPE) - return() - endif() - - find_program(EXEC_LOCATION_${exec} - NAMES - "${TOOLSET_PREFIX}${exec}${CLR_CMAKE_COMPILER_FILE_NAME_VERSION}" - "${TOOLSET_PREFIX}${exec}") - - if (EXEC_LOCATION_${exec} STREQUAL "EXEC_LOCATION_${exec}-NOTFOUND") - message(FATAL_ERROR "Unable to find toolchain executable. 
Name: ${exec}, Prefix: ${TOOLSET_PREFIX}.") - endif() - set(${var} ${EXEC_LOCATION_${exec}} PARENT_SCOPE) - endfunction() - set(CMAKE_SYSTEM_PREFIX_PATH "${CROSS_ROOTFS}") + set(CMAKE_C_STANDARD_LIBRARIES "${CMAKE_C_STANDARD_LIBRARIES} -lssp") + set(CMAKE_CXX_STANDARD_LIBRARIES "${CMAKE_CXX_STANDARD_LIBRARIES} -lssp") locate_toolchain_exec(gcc CMAKE_C_COMPILER) locate_toolchain_exec(g++ CMAKE_CXX_COMPILER) - set(CMAKE_C_STANDARD_LIBRARIES "${CMAKE_C_STANDARD_LIBRARIES} -lssp") - set(CMAKE_CXX_STANDARD_LIBRARIES "${CMAKE_CXX_STANDARD_LIBRARIES} -lssp") - # let CMake set up the correct search paths include(Platform/Haiku) else() @@ -307,7 +291,7 @@ endif() # Specify compile options -if((TARGET_ARCH_NAME MATCHES "^(arm|arm64|armel|armv6|ppc64le|riscv64|s390x|x64|x86)$" AND NOT ANDROID AND NOT FREEBSD) OR ILLUMOS OR HAIKU) +if((TARGET_ARCH_NAME MATCHES "^(arm|arm64|armel|armv6|loongarch64|ppc64le|riscv64|s390x|x64|x86)$" AND NOT ANDROID AND NOT FREEBSD) OR ILLUMOS OR HAIKU) set(CMAKE_C_COMPILER_TARGET ${TOOLCHAIN}) set(CMAKE_CXX_COMPILER_TARGET ${TOOLCHAIN}) set(CMAKE_ASM_COMPILER_TARGET ${TOOLCHAIN}) diff --git a/eng/common/darc-init.sh b/eng/common/darc-init.sh index 36dbd45e1ce..e889f439b8d 100755 --- a/eng/common/darc-init.sh +++ b/eng/common/darc-init.sh @@ -68,7 +68,7 @@ function InstallDarcCli { fi fi - local arcadeServicesSource="https://pkgs.dev.azure.com/dnceng/public/_packaging/dotnet-tools/nuget/v3/index.json" + local arcadeServicesSource="https://pkgs.dev.azure.com/dnceng/public/_packaging/dotnet-eng/nuget/v3/index.json" echo "Installing Darc CLI version $darcVersion..." echo "You may need to restart your command shell if this is the first dotnet tool you have installed." diff --git a/eng/common/dotnet.cmd b/eng/common/dotnet.cmd new file mode 100644 index 00000000000..527fa4bb38f --- /dev/null +++ b/eng/common/dotnet.cmd @@ -0,0 +1,7 @@ +@echo off + +:: This script is used to install the .NET SDK. +:: It will also invoke the SDK with any provided arguments. + +powershell -ExecutionPolicy ByPass -NoProfile -command "& """%~dp0dotnet.ps1""" %*" +exit /b %ErrorLevel% diff --git a/eng/common/dotnet.ps1 b/eng/common/dotnet.ps1 new file mode 100644 index 00000000000..45e5676c9eb --- /dev/null +++ b/eng/common/dotnet.ps1 @@ -0,0 +1,11 @@ +# This script is used to install the .NET SDK. +# It will also invoke the SDK with any provided arguments. + +. $PSScriptRoot\tools.ps1 +$dotnetRoot = InitializeDotNetCli -install:$true + +# Invoke acquired SDK with args if they are provided +if ($args.count -gt 0) { + $env:DOTNET_NOLOGO=1 + & "$dotnetRoot\dotnet.exe" $args +} diff --git a/eng/common/dotnet.sh b/eng/common/dotnet.sh new file mode 100644 index 00000000000..2ef68235675 --- /dev/null +++ b/eng/common/dotnet.sh @@ -0,0 +1,26 @@ +#!/usr/bin/env bash + +# This script is used to install the .NET SDK. +# It will also invoke the SDK with any provided arguments. 
+ +source="${BASH_SOURCE[0]}" +# resolve $SOURCE until the file is no longer a symlink +while [[ -h $source ]]; do + scriptroot="$( cd -P "$( dirname "$source" )" && pwd )" + source="$(readlink "$source")" + + # if $source was a relative symlink, we need to resolve it relative to the path where the + # symlink file was located + [[ $source != /* ]] && source="$scriptroot/$source" +done +scriptroot="$( cd -P "$( dirname "$source" )" && pwd )" + +source $scriptroot/tools.sh +InitializeDotNetCli true # install + +# Invoke acquired SDK with args if they are provided +if [[ $# > 0 ]]; then + __dotnetDir=${_InitializeDotNetCli} + dotnetPath=${__dotnetDir}/dotnet + ${dotnetPath} "$@" +fi diff --git a/eng/common/generate-locproject.ps1 b/eng/common/generate-locproject.ps1 index 524aaa57f2b..fa1cdc2b300 100644 --- a/eng/common/generate-locproject.ps1 +++ b/eng/common/generate-locproject.ps1 @@ -33,15 +33,27 @@ $jsonTemplateFiles | ForEach-Object { $jsonWinformsTemplateFiles = Get-ChildItem -Recurse -Path "$SourcesDirectory" | Where-Object { $_.FullName -Match "en\\strings\.json" } # current winforms pattern +$wxlFilesV3 = @() +$wxlFilesV5 = @() $wxlFiles = Get-ChildItem -Recurse -Path "$SourcesDirectory" | Where-Object { $_.FullName -Match "\\.+\.wxl" -And -Not( $_.Directory.Name -Match "\d{4}" ) } # localized files live in four digit lang ID directories; this excludes them if (-not $wxlFiles) { $wxlEnFiles = Get-ChildItem -Recurse -Path "$SourcesDirectory" | Where-Object { $_.FullName -Match "\\1033\\.+\.wxl" } # pick up en files (1033 = en) specifically so we can copy them to use as the neutral xlf files if ($wxlEnFiles) { - $wxlFiles = @() - $wxlEnFiles | ForEach-Object { - $destinationFile = "$($_.Directory.Parent.FullName)\$($_.Name)" - $wxlFiles += Copy-Item "$($_.FullName)" -Destination $destinationFile -PassThru - } + $wxlFiles = @() + $wxlEnFiles | ForEach-Object { + $destinationFile = "$($_.Directory.Parent.FullName)\$($_.Name)" + $content = Get-Content $_.FullName -Raw + + # Split files on schema to select different parser settings in the generated project. 
+ if ($content -like "*http://wixtoolset.org/schemas/v4/wxl*") + { + $wxlFilesV5 += Copy-Item $_.FullName -Destination $destinationFile -PassThru + } + elseif ($content -like "*http://schemas.microsoft.com/wix/2006/localization*") + { + $wxlFilesV3 += Copy-Item $_.FullName -Destination $destinationFile -PassThru + } + } } } @@ -114,7 +126,32 @@ $locJson = @{ CloneLanguageSet = "WiX_CloneLanguages" LssFiles = @( "wxl_loc.lss" ) LocItems = @( - $wxlFiles | ForEach-Object { + $wxlFilesV3 | ForEach-Object { + $outputPath = "$($_.Directory.FullName | Resolve-Path -Relative)\" + $continue = $true + foreach ($exclusion in $exclusions.Exclusions) { + if ($_.FullName.Contains($exclusion)) { + $continue = $false + } + } + $sourceFile = ($_.FullName | Resolve-Path -Relative) + if ($continue) + { + return @{ + SourceFile = $sourceFile + CopyOption = "LangIDOnPath" + OutputPath = $outputPath + } + } + } + ) + }, + @{ + LanguageSet = $LanguageSet + CloneLanguageSet = "WiX_CloneLanguages" + LssFiles = @( "P210WxlSchemaV4.lss" ) + LocItems = @( + $wxlFilesV5 | ForEach-Object { $outputPath = "$($_.Directory.FullName | Resolve-Path -Relative)\" $continue = $true foreach ($exclusion in $exclusions.Exclusions) { diff --git a/eng/common/native/install-dependencies.sh b/eng/common/native/install-dependencies.sh new file mode 100644 index 00000000000..477a44f335b --- /dev/null +++ b/eng/common/native/install-dependencies.sh @@ -0,0 +1,62 @@ +#!/bin/sh + +set -e + +# This is a simple script primarily used for CI to install necessary dependencies +# +# Usage: +# +# ./install-dependencies.sh + +os="$(echo "$1" | tr "[:upper:]" "[:lower:]")" + +if [ -z "$os" ]; then + . "$(dirname "$0")"/init-os-and-arch.sh +fi + +case "$os" in + linux) + if [ -e /etc/os-release ]; then + . /etc/os-release + fi + + if [ "$ID" = "debian" ] || [ "$ID_LIKE" = "debian" ]; then + apt update + + apt install -y build-essential gettext locales cmake llvm clang lld lldb liblldb-dev libunwind8-dev libicu-dev liblttng-ust-dev \ + libssl-dev libkrb5-dev pigz cpio + + localedef -i en_US -c -f UTF-8 -A /usr/share/locale/locale.alias en_US.UTF-8 + elif [ "$ID" = "fedora" ] || [ "$ID" = "rhel" ] || [ "$ID" = "azurelinux" ]; then + pkg_mgr="$(command -v tdnf 2>/dev/null || command -v dnf)" + $pkg_mgr install -y cmake llvm lld lldb clang python curl libicu-devel openssl-devel krb5-devel lttng-ust-devel pigz cpio + elif [ "$ID" = "alpine" ]; then + apk add build-base cmake bash curl clang llvm-dev lld lldb krb5-dev lttng-ust-dev icu-dev openssl-dev pigz cpio + else + echo "Unsupported distro. distro: $ID" + exit 1 + fi + ;; + + osx|maccatalyst|ios|iossimulator|tvos|tvossimulator) + echo "Installed xcode version: $(xcode-select -p)" + + export HOMEBREW_NO_INSTALL_CLEANUP=1 + export HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK=1 + # Skip brew update for now, see https://github.com/actions/setup-python/issues/577 + # brew update --preinstall + brew bundle --no-upgrade --file=- < Msbuild engine to use to run build ('dotnet', 'vs', or unspecified)." + Write-Host " -excludeCIBinaryLog When running on CI, allow no binary log (short: -nobl)" Write-Host "" Write-Host "Command line arguments not listed above are passed thru to msbuild." 
} @@ -34,10 +37,11 @@ function Print-Usage() { function Build([string]$target) { $logSuffix = if ($target -eq 'Execute') { '' } else { ".$target" } $log = Join-Path $LogDir "$task$logSuffix.binlog" + $binaryLogArg = if ($binaryLog) { "/bl:$log" } else { "" } $outputPath = Join-Path $ToolsetDir "$task\" MSBuild $taskProject ` - /bl:$log ` + $binaryLogArg ` /t:$target ` /p:Configuration=$configuration ` /p:RepoRoot=$RepoRoot ` @@ -64,7 +68,7 @@ try { $GlobalJson.tools | Add-Member -Name "vs" -Value (ConvertFrom-Json "{ `"version`": `"16.5`" }") -MemberType NoteProperty } if( -not ($GlobalJson.tools.PSObject.Properties.Name -match "xcopy-msbuild" )) { - $GlobalJson.tools | Add-Member -Name "xcopy-msbuild" -Value "17.12.0" -MemberType NoteProperty + $GlobalJson.tools | Add-Member -Name "xcopy-msbuild" -Value "17.13.0" -MemberType NoteProperty } if ($GlobalJson.tools."xcopy-msbuild".Trim() -ine "none") { $xcopyMSBuildToolsFolder = InitializeXCopyMSBuild $GlobalJson.tools."xcopy-msbuild" -install $true diff --git a/eng/common/sdk-task.sh b/eng/common/sdk-task.sh new file mode 100644 index 00000000000..3270f83fa9a --- /dev/null +++ b/eng/common/sdk-task.sh @@ -0,0 +1,121 @@ +#!/usr/bin/env bash + +show_usage() { + echo "Common settings:" + echo " --task Name of Arcade task (name of a project in SdkTasks directory of the Arcade SDK package)" + echo " --restore Restore dependencies" + echo " --verbosity Msbuild verbosity: q[uiet], m[inimal], n[ormal], d[etailed], and diag[nostic]" + echo " --help Print help and exit" + echo "" + + echo "Advanced settings:" + echo " --excludeCIBinarylog Don't output binary log (short: -nobl)" + echo " --noWarnAsError Do not warn as error" + echo "" + echo "Command line arguments not listed above are passed thru to msbuild." +} + +source="${BASH_SOURCE[0]}" + +# resolve $source until the file is no longer a symlink +while [[ -h "$source" ]]; do + scriptroot="$( cd -P "$( dirname "$source" )" && pwd )" + source="$(readlink "$source")" + # if $source was a relative symlink, we need to resolve it relative to the path where the + # symlink file was located + [[ $source != /* ]] && source="$scriptroot/$source" +done +scriptroot="$( cd -P "$( dirname "$source" )" && pwd )" + +Build() { + local target=$1 + local log_suffix="" + [[ "$target" != "Execute" ]] && log_suffix=".$target" + local log="$log_dir/$task$log_suffix.binlog" + local binaryLogArg="" + [[ $binary_log == true ]] && binaryLogArg="/bl:$log" + local output_path="$toolset_dir/$task/" + + MSBuild "$taskProject" \ + $binaryLogArg \ + /t:"$target" \ + /p:Configuration="$configuration" \ + /p:RepoRoot="$repo_root" \ + /p:BaseIntermediateOutputPath="$output_path" \ + /v:"$verbosity" \ + $properties +} + +binary_log=true +configuration="Debug" +verbosity="minimal" +exclude_ci_binary_log=false +restore=false +help=false +properties='' +warnAsError=true + +while (($# > 0)); do + lowerI="$(echo $1 | tr "[:upper:]" "[:lower:]")" + case $lowerI in + --task) + task=$2 + shift 2 + ;; + --restore) + restore=true + shift 1 + ;; + --verbosity) + verbosity=$2 + shift 2 + ;; + --excludecibinarylog|--nobl) + binary_log=false + exclude_ci_binary_log=true + shift 1 + ;; + --noWarnAsError) + warnAsError=false + shift 1 + ;; + --help) + help=true + shift 1 + ;; + *) + properties="$properties $1" + shift 1 + ;; + esac +done + +ci=true + +if $help; then + show_usage + exit 0 +fi + +. 
"$scriptroot/tools.sh" +InitializeToolset + +if [[ -z "$task" ]]; then + Write-PipelineTelemetryError -Category 'Task' -Name 'MissingTask' -Message "Missing required parameter '-task '" + ExitWithExitCode 1 +fi + +taskProject=$(GetSdkTaskProject "$task") +if [[ ! -e "$taskProject" ]]; then + Write-PipelineTelemetryError -Category 'Task' -Name 'UnknownTask' -Message "Unknown task: $task" + ExitWithExitCode 1 +fi + +if $restore; then + Build "Restore" +fi + +Build "Execute" + + +ExitWithExitCode 0 diff --git a/eng/common/sdl/packages.config b/eng/common/sdl/packages.config index 4585cfd6bba..e5f543ea68c 100644 --- a/eng/common/sdl/packages.config +++ b/eng/common/sdl/packages.config @@ -1,4 +1,4 @@ - + diff --git a/eng/common/templates-official/job/job.yml b/eng/common/templates-official/job/job.yml index 81ea7a261f2..92a0664f564 100644 --- a/eng/common/templates-official/job/job.yml +++ b/eng/common/templates-official/job/job.yml @@ -31,6 +31,7 @@ jobs: PathtoPublish: '$(Build.ArtifactStagingDirectory)/artifacts' ArtifactName: ${{ coalesce(parameters.artifacts.publish.artifacts.name , 'Artifacts_$(Agent.Os)_$(_BuildConfig)') }} condition: always() + retryCountOnTaskFailure: 10 # for any logs being locked continueOnError: true - ${{ if and(ne(parameters.artifacts.publish.logs, 'false'), ne(parameters.artifacts.publish.logs, '')) }}: - output: pipelineArtifact @@ -39,6 +40,7 @@ jobs: displayName: 'Publish logs' continueOnError: true condition: always() + retryCountOnTaskFailure: 10 # for any logs being locked sbomEnabled: false # we don't need SBOM for logs - ${{ if eq(parameters.enablePublishBuildArtifacts, true) }}: @@ -46,7 +48,7 @@ jobs: displayName: Publish Logs PathtoPublish: '$(Build.ArtifactStagingDirectory)/artifacts/log/$(_BuildConfig)' publishLocation: Container - ArtifactName: ${{ coalesce(parameters.enablePublishBuildArtifacts.artifactName, '$(Agent.Os)_$(Agent.JobName)' ) }} + ArtifactName: ${{ coalesce(parameters.enablePublishBuildArtifacts.artifactName, '$(Agent.Os)_$(Agent.JobName)_Attempt$(System.JobAttempt)' ) }} continueOnError: true condition: always() sbomEnabled: false # we don't need SBOM for logs diff --git a/eng/common/templates-official/steps/publish-build-artifacts.yml b/eng/common/templates-official/steps/publish-build-artifacts.yml index 100a3fc9849..fcf6637b2eb 100644 --- a/eng/common/templates-official/steps/publish-build-artifacts.yml +++ b/eng/common/templates-official/steps/publish-build-artifacts.yml @@ -24,6 +24,10 @@ parameters: - name: is1ESPipeline type: boolean default: true + +- name: retryCountOnTaskFailure + type: string + default: 10 steps: - ${{ if ne(parameters.is1ESPipeline, true) }}: @@ -38,4 +42,5 @@ steps: PathtoPublish: ${{ parameters.pathToPublish }} ${{ if parameters.artifactName }}: ArtifactName: ${{ parameters.artifactName }} - + ${{ if parameters.retryCountOnTaskFailure }}: + retryCountOnTaskFailure: ${{ parameters.retryCountOnTaskFailure }} diff --git a/eng/common/templates-official/steps/source-index-stage1-publish.yml b/eng/common/templates-official/steps/source-index-stage1-publish.yml new file mode 100644 index 00000000000..9b8b80942b5 --- /dev/null +++ b/eng/common/templates-official/steps/source-index-stage1-publish.yml @@ -0,0 +1,7 @@ +steps: +- template: /eng/common/core-templates/steps/source-index-stage1-publish.yml + parameters: + is1ESPipeline: true + + ${{ each parameter in parameters }}: + ${{ parameter.key }}: ${{ parameter.value }} diff --git a/eng/common/templates/job/job.yml b/eng/common/templates/job/job.yml index 
5bdd3dd85fd..238fa0818f7 100644 --- a/eng/common/templates/job/job.yml +++ b/eng/common/templates/job/job.yml @@ -46,6 +46,7 @@ jobs: artifactName: ${{ coalesce(parameters.artifacts.publish.artifacts.name , 'Artifacts_$(Agent.Os)_$(_BuildConfig)') }} continueOnError: true condition: always() + retryCountOnTaskFailure: 10 # for any logs being locked - ${{ if and(ne(parameters.artifacts.publish.logs, 'false'), ne(parameters.artifacts.publish.logs, '')) }}: - template: /eng/common/core-templates/steps/publish-pipeline-artifacts.yml parameters: @@ -56,6 +57,7 @@ jobs: displayName: 'Publish logs' continueOnError: true condition: always() + retryCountOnTaskFailure: 10 # for any logs being locked sbomEnabled: false # we don't need SBOM for logs - ${{ if ne(parameters.enablePublishBuildArtifacts, 'false') }}: @@ -66,7 +68,7 @@ jobs: displayName: Publish Logs pathToPublish: '$(Build.ArtifactStagingDirectory)/artifacts/log/$(_BuildConfig)' publishLocation: Container - artifactName: ${{ coalesce(parameters.enablePublishBuildArtifacts.artifactName, '$(Agent.Os)_$(Agent.JobName)' ) }} + artifactName: ${{ coalesce(parameters.enablePublishBuildArtifacts.artifactName, '$(Agent.Os)_$(Agent.JobName)_Attempt$(System.JobAttempt)' ) }} continueOnError: true condition: always() diff --git a/eng/common/templates/steps/publish-build-artifacts.yml b/eng/common/templates/steps/publish-build-artifacts.yml index 6428a98dfef..605e602e94d 100644 --- a/eng/common/templates/steps/publish-build-artifacts.yml +++ b/eng/common/templates/steps/publish-build-artifacts.yml @@ -25,6 +25,10 @@ parameters: type: string default: 'Container' +- name: retryCountOnTaskFailure + type: string + default: 10 + steps: - ${{ if eq(parameters.is1ESPipeline, true) }}: - 'eng/common/templates cannot be referenced from a 1ES managed template': error @@ -37,4 +41,6 @@ steps: PublishLocation: ${{ parameters.publishLocation }} PathtoPublish: ${{ parameters.pathToPublish }} ${{ if parameters.artifactName }}: - ArtifactName: ${{ parameters.artifactName }} \ No newline at end of file + ArtifactName: ${{ parameters.artifactName }} + ${{ if parameters.retryCountOnTaskFailure }}: + retryCountOnTaskFailure: ${{ parameters.retryCountOnTaskFailure }} diff --git a/eng/common/templates/steps/source-index-stage1-publish.yml b/eng/common/templates/steps/source-index-stage1-publish.yml new file mode 100644 index 00000000000..182cec33a7b --- /dev/null +++ b/eng/common/templates/steps/source-index-stage1-publish.yml @@ -0,0 +1,7 @@ +steps: +- template: /eng/common/core-templates/steps/source-index-stage1-publish.yml + parameters: + is1ESPipeline: false + + ${{ each parameter in parameters }}: + ${{ parameter.key }}: ${{ parameter.value }} diff --git a/eng/common/templates/steps/vmr-sync.yml b/eng/common/templates/steps/vmr-sync.yml new file mode 100644 index 00000000000..599afb6186b --- /dev/null +++ b/eng/common/templates/steps/vmr-sync.yml @@ -0,0 +1,207 @@ +### These steps synchronize new code from product repositories into the VMR (https://github.com/dotnet/dotnet). +### They initialize the darc CLI and pull the new updates. +### Changes are applied locally onto the already cloned VMR (located in $vmrPath). 
+ +parameters: +- name: targetRef + displayName: Target revision in dotnet/ to synchronize + type: string + default: $(Build.SourceVersion) + +- name: vmrPath + displayName: Path where the dotnet/dotnet is checked out to + type: string + default: $(Agent.BuildDirectory)/vmr + +- name: additionalSyncs + displayName: Optional list of package names whose repo's source will also be synchronized in the local VMR, e.g. NuGet.Protocol + type: object + default: [] + +steps: +- checkout: vmr + displayName: Clone dotnet/dotnet + path: vmr + clean: true + +- checkout: self + displayName: Clone $(Build.Repository.Name) + path: repo + fetchDepth: 0 + +# This step is needed so that when we get a detached HEAD / shallow clone, +# we still pull the commit into the temporary repo clone to use it during the sync. +# Also unshallow the clone so that forwardflow command would work. +- script: | + git branch repo-head + git rev-parse HEAD + displayName: Label PR commit + workingDirectory: $(Agent.BuildDirectory)/repo + +- script: | + vmr_sha=$(grep -oP '(?<=Sha=")[^"]*' $(Agent.BuildDirectory)/repo/eng/Version.Details.xml) + echo "##vso[task.setvariable variable=vmr_sha]$vmr_sha" + displayName: Obtain the vmr sha from Version.Details.xml (Unix) + condition: ne(variables['Agent.OS'], 'Windows_NT') + workingDirectory: $(Agent.BuildDirectory)/repo + +- powershell: | + [xml]$xml = Get-Content -Path $(Agent.BuildDirectory)/repo/eng/Version.Details.xml + $vmr_sha = $xml.SelectSingleNode("//Source").Sha + Write-Output "##vso[task.setvariable variable=vmr_sha]$vmr_sha" + displayName: Obtain the vmr sha from Version.Details.xml (Windows) + condition: eq(variables['Agent.OS'], 'Windows_NT') + workingDirectory: $(Agent.BuildDirectory)/repo + +- script: | + git fetch --all + git checkout $(vmr_sha) + displayName: Checkout VMR at correct sha for repo flow + workingDirectory: ${{ parameters.vmrPath }} + +- script: | + git config --global user.name "dotnet-maestro[bot]" + git config --global user.email "dotnet-maestro[bot]@users.noreply.github.com" + displayName: Set git author to dotnet-maestro[bot] + workingDirectory: ${{ parameters.vmrPath }} + +- script: | + ./eng/common/vmr-sync.sh \ + --vmr ${{ parameters.vmrPath }} \ + --tmp $(Agent.TempDirectory) \ + --azdev-pat '$(dn-bot-all-orgs-code-r)' \ + --ci \ + --debug + + if [ "$?" 
-ne 0 ]; then + echo "##vso[task.logissue type=error]Failed to synchronize the VMR" + exit 1 + fi + displayName: Sync repo into VMR (Unix) + condition: ne(variables['Agent.OS'], 'Windows_NT') + workingDirectory: $(Agent.BuildDirectory)/repo + +- script: | + git config --global diff.astextplain.textconv echo + git config --system core.longpaths true + displayName: Configure Windows git (longpaths, astextplain) + condition: eq(variables['Agent.OS'], 'Windows_NT') + +- powershell: | + ./eng/common/vmr-sync.ps1 ` + -vmr ${{ parameters.vmrPath }} ` + -tmp $(Agent.TempDirectory) ` + -azdevPat '$(dn-bot-all-orgs-code-r)' ` + -ci ` + -debugOutput + + if ($LASTEXITCODE -ne 0) { + echo "##vso[task.logissue type=error]Failed to synchronize the VMR" + exit 1 + } + displayName: Sync repo into VMR (Windows) + condition: eq(variables['Agent.OS'], 'Windows_NT') + workingDirectory: $(Agent.BuildDirectory)/repo + +- ${{ if eq(variables['Build.Reason'], 'PullRequest') }}: + - task: CopyFiles@2 + displayName: Collect failed patches + condition: failed() + inputs: + SourceFolder: '$(Agent.TempDirectory)' + Contents: '*.patch' + TargetFolder: '$(Build.ArtifactStagingDirectory)/FailedPatches' + + - publish: '$(Build.ArtifactStagingDirectory)/FailedPatches' + artifact: $(System.JobDisplayName)_FailedPatches + displayName: Upload failed patches + condition: failed() + +- ${{ each assetName in parameters.additionalSyncs }}: + # The vmr-sync script ends up staging files in the local VMR so we have to commit those + - script: + git commit --allow-empty -am "Forward-flow $(Build.Repository.Name)" + displayName: Commit local VMR changes + workingDirectory: ${{ parameters.vmrPath }} + + - script: | + set -ex + + echo "Searching for details of asset ${{ assetName }}..." + + # Use darc to get dependencies information + dependencies=$(./.dotnet/dotnet darc get-dependencies --name '${{ assetName }}' --ci) + + # Extract repository URL and commit hash + repository=$(echo "$dependencies" | grep 'Repo:' | sed 's/Repo:[[:space:]]*//' | head -1) + + if [ -z "$repository" ]; then + echo "##vso[task.logissue type=error]Asset ${{ assetName }} not found in the dependency list" + exit 1 + fi + + commit=$(echo "$dependencies" | grep 'Commit:' | sed 's/Commit:[[:space:]]*//' | head -1) + + echo "Updating the VMR from $repository / $commit..." + cd .. + git clone $repository ${{ assetName }} + cd ${{ assetName }} + git checkout $commit + git branch "sync/$commit" + + ./eng/common/vmr-sync.sh \ + --vmr ${{ parameters.vmrPath }} \ + --tmp $(Agent.TempDirectory) \ + --azdev-pat '$(dn-bot-all-orgs-code-r)' \ + --ci \ + --debug + + if [ "$?" -ne 0 ]; then + echo "##vso[task.logissue type=error]Failed to synchronize the VMR" + exit 1 + fi + displayName: Sync ${{ assetName }} into (Unix) + condition: ne(variables['Agent.OS'], 'Windows_NT') + workingDirectory: $(Agent.BuildDirectory)/repo + + - powershell: | + $ErrorActionPreference = 'Stop' + + Write-Host "Searching for details of asset ${{ assetName }}..." 
+ + $dependencies = .\.dotnet\dotnet darc get-dependencies --name '${{ assetName }}' --ci + + $repository = $dependencies | Select-String -Pattern 'Repo:\s+([^\s]+)' | Select-Object -First 1 + $repository -match 'Repo:\s+([^\s]+)' | Out-Null + $repository = $matches[1] + + if ($repository -eq $null) { + Write-Error "Asset ${{ assetName }} not found in the dependency list" + exit 1 + } + + $commit = $dependencies | Select-String -Pattern 'Commit:\s+([^\s]+)' | Select-Object -First 1 + $commit -match 'Commit:\s+([^\s]+)' | Out-Null + $commit = $matches[1] + + Write-Host "Updating the VMR from $repository / $commit..." + cd .. + git clone $repository ${{ assetName }} + cd ${{ assetName }} + git checkout $commit + git branch "sync/$commit" + + .\eng\common\vmr-sync.ps1 ` + -vmr ${{ parameters.vmrPath }} ` + -tmp $(Agent.TempDirectory) ` + -azdevPat '$(dn-bot-all-orgs-code-r)' ` + -ci ` + -debugOutput + + if ($LASTEXITCODE -ne 0) { + echo "##vso[task.logissue type=error]Failed to synchronize the VMR" + exit 1 + } + displayName: Sync ${{ assetName }} into (Windows) + condition: ne(variables['Agent.OS'], 'Windows_NT') + workingDirectory: $(Agent.BuildDirectory)/repo diff --git a/eng/common/templates/vmr-build-pr.yml b/eng/common/templates/vmr-build-pr.yml new file mode 100644 index 00000000000..ce3c29a62fa --- /dev/null +++ b/eng/common/templates/vmr-build-pr.yml @@ -0,0 +1,42 @@ +# This pipeline is used for running the VMR verification of the PR changes in repo-level PRs. +# +# It will run a full set of verification jobs defined in: +# https://github.com/dotnet/dotnet/blob/10060d128e3f470e77265f8490f5e4f72dae738e/eng/pipelines/templates/stages/vmr-build.yml#L27-L38 +# +# For repos that do not need to run the full set, you would do the following: +# +# 1. Copy this YML file to a repo-specific location, i.e. outside of eng/common. +# +# 2. Add `verifications` parameter to VMR template reference +# +# Examples: +# - For source-build stage 1 verification, add the following: +# verifications: [ "source-build-stage1" ] +# +# - For Windows only verifications, add the following: +# verifications: [ "unified-build-windows-x64", "unified-build-windows-x86" ] + +trigger: none +pr: none + +variables: +- template: /eng/common/templates/variables/pool-providers.yml@self + +- name: skipComponentGovernanceDetection # we run CG on internal builds only + value: true + +- name: Codeql.Enabled # we run CodeQL on internal builds only + value: false + +resources: + repositories: + - repository: vmr + type: github + name: dotnet/dotnet + endpoint: dotnet + +stages: +- template: /eng/pipelines/templates/stages/vmr-build.yml@vmr + parameters: + isBuiltFromVmr: false + scope: lite diff --git a/eng/common/tools.ps1 b/eng/common/tools.ps1 index 9b3ad8840fd..06b44de7870 100644 --- a/eng/common/tools.ps1 +++ b/eng/common/tools.ps1 @@ -65,10 +65,8 @@ $ErrorActionPreference = 'Stop' # Base-64 encoded SAS token that has permission to storage container described by $runtimeSourceFeed [string]$runtimeSourceFeedKey = if (Test-Path variable:runtimeSourceFeedKey) { $runtimeSourceFeedKey } else { $null } -# True if the build is a product build -[bool]$productBuild = if (Test-Path variable:productBuild) { $productBuild } else { $false } - -[String[]]$properties = if (Test-Path variable:properties) { $properties } else { @() } +# True when the build is running within the VMR. 
+[bool]$fromVMR = if (Test-Path variable:fromVMR) { $fromVMR } else { $false } function Create-Directory ([string[]] $path) { New-Item -Path $path -Force -ItemType 'Directory' | Out-Null @@ -259,7 +257,20 @@ function Retry($downloadBlock, $maxRetries = 5) { function GetDotNetInstallScript([string] $dotnetRoot) { $installScript = Join-Path $dotnetRoot 'dotnet-install.ps1' + $shouldDownload = $false + if (!(Test-Path $installScript)) { + $shouldDownload = $true + } else { + # Check if the script is older than 30 days + $fileAge = (Get-Date) - (Get-Item $installScript).LastWriteTime + if ($fileAge.Days -gt 30) { + Write-Host "Existing install script is too old, re-downloading..." + $shouldDownload = $true + } + } + + if ($shouldDownload) { Create-Directory $dotnetRoot $ProgressPreference = 'SilentlyContinue' # Don't display the console progress UI - it's a huge perf hit $uri = "https://builds.dotnet.microsoft.com/dotnet/scripts/$dotnetInstallScriptVersion/dotnet-install.ps1" @@ -383,8 +394,8 @@ function InitializeVisualStudioMSBuild([bool]$install, [object]$vsRequirements = # If the version of msbuild is going to be xcopied, # use this version. Version matches a package here: - # https://dev.azure.com/dnceng/public/_artifacts/feed/dotnet-eng/NuGet/Microsoft.DotNet.Arcade.MSBuild.Xcopy/versions/17.12.0 - $defaultXCopyMSBuildVersion = '17.12.0' + # https://dev.azure.com/dnceng/public/_artifacts/feed/dotnet-eng/NuGet/Microsoft.DotNet.Arcade.MSBuild.Xcopy/versions/17.13.0 + $defaultXCopyMSBuildVersion = '17.13.0' if (!$vsRequirements) { if (Get-Member -InputObject $GlobalJson.tools -Name 'vs') { @@ -533,7 +544,8 @@ function LocateVisualStudio([object]$vsRequirements = $null){ if (Get-Member -InputObject $GlobalJson.tools -Name 'vswhere') { $vswhereVersion = $GlobalJson.tools.vswhere } else { - $vswhereVersion = '2.5.2' + # keep this in sync with the VSWhereVersion in DefaultVersions.props + $vswhereVersion = '3.1.7' } $vsWhereDir = Join-Path $ToolsDir "vswhere\$vswhereVersion" @@ -541,7 +553,8 @@ function LocateVisualStudio([object]$vsRequirements = $null){ if (!(Test-Path $vsWhereExe)) { Create-Directory $vsWhereDir - Write-Host 'Downloading vswhere' + Write-Host "Downloading vswhere $vswhereVersion" + $ProgressPreference = 'SilentlyContinue' # Don't display the console progress UI - it's a huge perf hit Retry({ Invoke-WebRequest "https://netcorenativeassets.blob.core.windows.net/resource-packages/external/windows/vswhere/$vswhereVersion/vswhere.exe" -OutFile $vswhereExe }) @@ -604,14 +617,7 @@ function InitializeBuildTool() { } $dotnetPath = Join-Path $dotnetRoot (GetExecutableFileName 'dotnet') - # Use override if it exists - commonly set by source-build - if ($null -eq $env:_OverrideArcadeInitializeBuildToolFramework) { - $initializeBuildToolFramework="net9.0" - } else { - $initializeBuildToolFramework=$env:_OverrideArcadeInitializeBuildToolFramework - } - - $buildTool = @{ Path = $dotnetPath; Command = 'msbuild'; Tool = 'dotnet'; Framework = $initializeBuildToolFramework } + $buildTool = @{ Path = $dotnetPath; Command = 'msbuild'; Tool = 'dotnet'; Framework = 'net' } } elseif ($msbuildEngine -eq "vs") { try { $msbuildPath = InitializeVisualStudioMSBuild -install:$restore @@ -620,7 +626,7 @@ function InitializeBuildTool() { ExitWithExitCode 1 } - $buildTool = @{ Path = $msbuildPath; Command = ""; Tool = "vs"; Framework = "net472"; ExcludePrereleaseVS = $excludePrereleaseVS } + $buildTool = @{ Path = $msbuildPath; Command = ""; Tool = "vs"; Framework = "netframework"; ExcludePrereleaseVS = 
$excludePrereleaseVS } } else { Write-PipelineTelemetryError -Category 'InitializeToolset' -Message "Unexpected value of -msbuildEngine: '$msbuildEngine'." ExitWithExitCode 1 @@ -653,7 +659,6 @@ function GetNuGetPackageCachePath() { $env:NUGET_PACKAGES = Join-Path $env:UserProfile '.nuget\packages\' } else { $env:NUGET_PACKAGES = Join-Path $RepoRoot '.packages\' - $env:RESTORENOHTTPCACHE = $true } } @@ -775,26 +780,13 @@ function MSBuild() { $toolsetBuildProject = InitializeToolset $basePath = Split-Path -parent $toolsetBuildProject - $possiblePaths = @( - # new scripts need to work with old packages, so we need to look for the old names/versions - (Join-Path $basePath (Join-Path $buildTool.Framework 'Microsoft.DotNet.ArcadeLogging.dll')), - (Join-Path $basePath (Join-Path $buildTool.Framework 'Microsoft.DotNet.Arcade.Sdk.dll')), - (Join-Path $basePath (Join-Path net7.0 'Microsoft.DotNet.ArcadeLogging.dll')), - (Join-Path $basePath (Join-Path net7.0 'Microsoft.DotNet.Arcade.Sdk.dll')), - (Join-Path $basePath (Join-Path net8.0 'Microsoft.DotNet.ArcadeLogging.dll')), - (Join-Path $basePath (Join-Path net8.0 'Microsoft.DotNet.Arcade.Sdk.dll')) - ) - $selectedPath = $null - foreach ($path in $possiblePaths) { - if (Test-Path $path -PathType Leaf) { - $selectedPath = $path - break - } - } + $selectedPath = Join-Path $basePath (Join-Path $buildTool.Framework 'Microsoft.DotNet.ArcadeLogging.dll') + if (-not $selectedPath) { - Write-PipelineTelemetryError -Category 'Build' -Message 'Unable to find arcade sdk logger assembly.' + Write-PipelineTelemetryError -Category 'Build' -Message "Unable to find arcade sdk logger assembly: $selectedPath" ExitWithExitCode 1 } + $args += "/logger:$selectedPath" } @@ -857,8 +849,8 @@ function MSBuild-Core() { } # When running on Azure Pipelines, override the returned exit code to avoid double logging. - # Skip this when the build is a child of the VMR orchestrator build. - if ($ci -and $env:SYSTEM_TEAMPROJECT -ne $null -and !$productBuild -and -not($properties -like "*DotNetBuildRepo=true*")) { + # Skip this when the build is a child of the VMR build. + if ($ci -and $env:SYSTEM_TEAMPROJECT -ne $null -and !$fromVMR) { Write-PipelineSetResult -Result "Failed" -Message "msbuild execution failed." # Exiting with an exit code causes the azure pipelines task to log yet another "noise" error # The above Write-PipelineSetResult will cause the task to be marked as failure without adding yet another error diff --git a/eng/common/tools.sh b/eng/common/tools.sh index 01b09b65796..c1841c9dfd0 100755 --- a/eng/common/tools.sh +++ b/eng/common/tools.sh @@ -5,6 +5,9 @@ # CI mode - set to true on CI server for PR validation build or official build. ci=${ci:-false} +# Build mode +source_build=${source_build:-false} + # Set to true to use the pipelines logger which will enable Azure logging output. # https://github.com/Microsoft/azure-pipelines-tasks/blob/master/docs/authoring/commands.md # This flag is meant as a temporary opt-opt for the feature while validate it across @@ -58,7 +61,8 @@ use_installed_dotnet_cli=${use_installed_dotnet_cli:-true} dotnetInstallScriptVersion=${dotnetInstallScriptVersion:-'v1'} # True to use global NuGet cache instead of restoring packages to repository-local directory. -if [[ "$ci" == true ]]; then +# Keep in sync with NuGetPackageroot in Arcade SDK's RepositoryLayout.props. 
+if [[ "$ci" == true || "$source_build" == true ]]; then use_global_nuget_cache=${use_global_nuget_cache:-false} else use_global_nuget_cache=${use_global_nuget_cache:-true} @@ -68,8 +72,8 @@ fi runtime_source_feed=${runtime_source_feed:-''} runtime_source_feed_key=${runtime_source_feed_key:-''} -# True if the build is a product build -product_build=${product_build:-false} +# True when the build is running within the VMR. +from_vmr=${from_vmr:-false} # Resolve any symlinks in the given path. function ResolvePath { @@ -296,8 +300,29 @@ function GetDotNetInstallScript { local root=$1 local install_script="$root/dotnet-install.sh" local install_script_url="https://builds.dotnet.microsoft.com/dotnet/scripts/$dotnetInstallScriptVersion/dotnet-install.sh" + local timestamp_file="$root/.dotnet-install.timestamp" + local should_download=false if [[ ! -a "$install_script" ]]; then + should_download=true + elif [[ -f "$timestamp_file" ]]; then + # Check if the script is older than 30 days using timestamp file + local download_time=$(cat "$timestamp_file" 2>/dev/null || echo "0") + local current_time=$(date +%s) + local age_seconds=$((current_time - download_time)) + + # 30 days = 30 * 24 * 60 * 60 = 2592000 seconds + if [[ $age_seconds -gt 2592000 ]]; then + echo "Existing install script is too old, re-downloading..." + should_download=true + fi + else + # No timestamp file exists, assume script is old and re-download + echo "No timestamp found for existing install script, re-downloading..." + should_download=true + fi + + if [[ "$should_download" == true ]]; then mkdir -p "$root" echo "Downloading '$install_script_url'" @@ -324,6 +349,9 @@ function GetDotNetInstallScript { ExitWithExitCode $exit_code } fi + + # Create timestamp file to track download time in seconds from epoch + date +%s > "$timestamp_file" fi # return value _GetDotNetInstallScript="$install_script" @@ -339,22 +367,14 @@ function InitializeBuildTool { # return values _InitializeBuildTool="$_InitializeDotNetCli/dotnet" _InitializeBuildToolCommand="msbuild" - # use override if it exists - commonly set by source-build - if [[ "${_OverrideArcadeInitializeBuildToolFramework:-x}" == "x" ]]; then - _InitializeBuildToolFramework="net9.0" - else - _InitializeBuildToolFramework="${_OverrideArcadeInitializeBuildToolFramework}" - fi } -# Set RestoreNoHttpCache as a workaround for https://github.com/NuGet/Home/issues/3116 function GetNuGetPackageCachePath { if [[ -z ${NUGET_PACKAGES:-} ]]; then if [[ "$use_global_nuget_cache" == true ]]; then export NUGET_PACKAGES="$HOME/.nuget/packages/" else export NUGET_PACKAGES="$repo_root/.packages/" - export RESTORENOHTTPCACHE=true fi fi @@ -451,25 +471,13 @@ function MSBuild { fi local toolset_dir="${_InitializeToolset%/*}" - # new scripts need to work with old packages, so we need to look for the old names/versions - local selectedPath= - local possiblePaths=() - possiblePaths+=( "$toolset_dir/$_InitializeBuildToolFramework/Microsoft.DotNet.ArcadeLogging.dll" ) - possiblePaths+=( "$toolset_dir/$_InitializeBuildToolFramework/Microsoft.DotNet.Arcade.Sdk.dll" ) - possiblePaths+=( "$toolset_dir/net7.0/Microsoft.DotNet.ArcadeLogging.dll" ) - possiblePaths+=( "$toolset_dir/net7.0/Microsoft.DotNet.Arcade.Sdk.dll" ) - possiblePaths+=( "$toolset_dir/net8.0/Microsoft.DotNet.ArcadeLogging.dll" ) - possiblePaths+=( "$toolset_dir/net8.0/Microsoft.DotNet.Arcade.Sdk.dll" ) - for path in "${possiblePaths[@]}"; do - if [[ -f $path ]]; then - selectedPath=$path - break - fi - done + local 
selectedPath="$toolset_dir/net/Microsoft.DotNet.ArcadeLogging.dll" + if [[ -z "$selectedPath" ]]; then - Write-PipelineTelemetryError -category 'Build' "Unable to find arcade sdk logger assembly." + Write-PipelineTelemetryError -category 'Build' "Unable to find arcade sdk logger assembly: $selectedPath" ExitWithExitCode 1 fi + args+=( "-logger:$selectedPath" ) fi @@ -506,8 +514,8 @@ function MSBuild-Core { echo "Build failed with exit code $exit_code. Check errors above." # When running on Azure Pipelines, override the returned exit code to avoid double logging. - # Skip this when the build is a child of the VMR orchestrator build. - if [[ "$ci" == true && -n ${SYSTEM_TEAMPROJECT:-} && "$product_build" != true && "$properties" != *"DotNetBuildRepo=true"* ]]; then + # Skip this when the build is a child of the VMR build. + if [[ "$ci" == true && -n ${SYSTEM_TEAMPROJECT:-} && "$from_vmr" != true ]]; then Write-PipelineSetResult -result "Failed" -message "msbuild execution failed." # Exiting with an exit code causes the azure pipelines task to log yet another "noise" error # The above Write-PipelineSetResult will cause the task to be marked as failure without adding yet another error @@ -530,6 +538,13 @@ function GetDarc { fi "$eng_root/common/darc-init.sh" --toolpath "$darc_path" $version + darc_tool="$darc_path/darc" +} + +# Returns a full path to an Arcade SDK task project file. +function GetSdkTaskProject { + taskName=$1 + echo "$(dirname $_InitializeToolset)/SdkTasks/$taskName.proj" } ResolvePath "${BASH_SOURCE[0]}" diff --git a/eng/common/vmr-sync.ps1 b/eng/common/vmr-sync.ps1 new file mode 100644 index 00000000000..97302f3205b --- /dev/null +++ b/eng/common/vmr-sync.ps1 @@ -0,0 +1,138 @@ +<# +.SYNOPSIS + +This script is used for synchronizing the current repository into a local VMR. +It pulls the current repository's code into the specified VMR directory for local testing or +Source-Build validation. + +.DESCRIPTION + +The tooling used for synchronization will clone the VMR repository into a temporary folder if +it does not already exist. These clones can be reused in future synchronizations, so it is +recommended to dedicate a folder for this to speed up re-runs. + +.EXAMPLE + Synchronize current repository into a local VMR: + ./vmr-sync.ps1 -vmrDir "$HOME/repos/dotnet" -tmpDir "$HOME/repos/tmp" + +.PARAMETER tmpDir +Required. Path to the temporary folder where repositories will be cloned + +.PARAMETER vmrBranch +Optional. Branch of the 'dotnet/dotnet' repo to synchronize. The VMR will be checked out to this branch + +.PARAMETER azdevPat +Optional. Azure DevOps PAT to use for cloning private repositories. + +.PARAMETER vmrDir +Optional. Path to the dotnet/dotnet repository. When null, gets cloned to the temporary folder + +.PARAMETER debugOutput +Optional. Enables debug logging in the darc vmr command. + +.PARAMETER ci +Optional. Denotes that the script is running in a CI environment. 
+#> +param ( + [Parameter(Mandatory=$true, HelpMessage="Path to the temporary folder where repositories will be cloned")] + [string][Alias('t', 'tmp')]$tmpDir, + [string][Alias('b', 'branch')]$vmrBranch, + [string]$remote, + [string]$azdevPat, + [string][Alias('v', 'vmr')]$vmrDir, + [switch]$ci, + [switch]$debugOutput +) + +function Fail { + Write-Host "> $($args[0])" -ForegroundColor 'Red' +} + +function Highlight { + Write-Host "> $($args[0])" -ForegroundColor 'Cyan' +} + +$verbosity = 'verbose' +if ($debugOutput) { + $verbosity = 'debug' +} +# Validation + +if (-not $tmpDir) { + Fail "Missing -tmpDir argument. Please specify the path to the temporary folder where the repositories will be cloned" + exit 1 +} + +# Sanitize the input + +if (-not $vmrDir) { + $vmrDir = Join-Path $tmpDir 'dotnet' +} + +if (-not (Test-Path -Path $tmpDir -PathType Container)) { + New-Item -ItemType Directory -Path $tmpDir | Out-Null +} + +# Prepare the VMR + +if (-not (Test-Path -Path $vmrDir -PathType Container)) { + Highlight "Cloning 'dotnet/dotnet' into $vmrDir.." + git clone https://github.com/dotnet/dotnet $vmrDir + + if ($vmrBranch) { + git -C $vmrDir switch -c $vmrBranch + } +} +else { + if ((git -C $vmrDir diff --quiet) -eq $false) { + Fail "There are changes in the working tree of $vmrDir. Please commit or stash your changes" + exit 1 + } + + if ($vmrBranch) { + Highlight "Preparing $vmrDir" + git -C $vmrDir checkout $vmrBranch + git -C $vmrDir pull + } +} + +Set-StrictMode -Version Latest + +# Prepare darc + +Highlight 'Installing .NET, preparing the tooling..' +. .\eng\common\tools.ps1 +$dotnetRoot = InitializeDotNetCli -install:$true +$darc = Get-Darc +$dotnet = "$dotnetRoot\dotnet.exe" + +Highlight "Starting the synchronization of VMR.." + +# Synchronize the VMR +$darcArgs = ( + "vmr", "forwardflow", + "--tmp", $tmpDir, + "--$verbosity", + $vmrDir +) + +if ($ci) { + $darcArgs += ("--ci") +} + +if ($azdevPat) { + $darcArgs += ("--azdev-pat", $azdevPat) +} + +& "$darc" $darcArgs + +if ($LASTEXITCODE -eq 0) { + Highlight "Synchronization succeeded" +} +else { + Fail "Synchronization of repo to VMR failed!" + Fail "'$vmrDir' is left in its last state (re-run of this script will reset it)." + Fail "Please inspect the logs which contain path to the failing patch file (use -debugOutput to get all the details)." + Fail "Once you make changes to the conflicting VMR patch, commit it locally and re-run this script." + exit 1 +} diff --git a/eng/common/vmr-sync.sh b/eng/common/vmr-sync.sh new file mode 100644 index 00000000000..44239e331c0 --- /dev/null +++ b/eng/common/vmr-sync.sh @@ -0,0 +1,207 @@ +#!/bin/bash + +### This script is used for synchronizing the current repository into a local VMR. +### It pulls the current repository's code into the specified VMR directory for local testing or +### Source-Build validation. +### +### The tooling used for synchronization will clone the VMR repository into a temporary folder if +### it does not already exist. These clones can be reused in future synchronizations, so it is +### recommended to dedicate a folder for this to speed up re-runs. +### +### USAGE: +### Synchronize current repository into a local VMR: +### ./vmr-sync.sh --tmp "$HOME/repos/tmp" "$HOME/repos/dotnet" +### +### Options: +### -t, --tmp, --tmp-dir PATH +### Required. Path to the temporary folder where repositories will be cloned +### +### -b, --branch, --vmr-branch BRANCH_NAME +### Optional. Branch of the 'dotnet/dotnet' repo to synchronize. 
The VMR will be checked out to this branch +### +### --debug +### Optional. Turns on the most verbose logging for the VMR tooling +### +### --remote name:URI +### Optional. Additional remote to use during the synchronization +### This can be used to synchronize to a commit from a fork of the repository +### Example: 'runtime:https://github.com/yourfork/runtime' +### +### --azdev-pat +### Optional. Azure DevOps PAT to use for cloning private repositories. +### +### -v, --vmr, --vmr-dir PATH +### Optional. Path to the dotnet/dotnet repository. When null, gets cloned to the temporary folder + +source="${BASH_SOURCE[0]}" + +# resolve $source until the file is no longer a symlink +while [[ -h "$source" ]]; do + scriptroot="$( cd -P "$( dirname "$source" )" && pwd )" + source="$(readlink "$source")" + # if $source was a relative symlink, we need to resolve it relative to the path where the + # symlink file was located + [[ $source != /* ]] && source="$scriptroot/$source" +done +scriptroot="$( cd -P "$( dirname "$source" )" && pwd )" + +function print_help () { + sed -n '/^### /,/^$/p' "$source" | cut -b 5- +} + +COLOR_RED=$(tput setaf 1 2>/dev/null || true) +COLOR_CYAN=$(tput setaf 6 2>/dev/null || true) +COLOR_CLEAR=$(tput sgr0 2>/dev/null || true) +COLOR_RESET=uniquesearchablestring +FAILURE_PREFIX='> ' + +function fail () { + echo "${COLOR_RED}$FAILURE_PREFIX${1//${COLOR_RESET}/${COLOR_RED}}${COLOR_CLEAR}" >&2 +} + +function highlight () { + echo "${COLOR_CYAN}$FAILURE_PREFIX${1//${COLOR_RESET}/${COLOR_CYAN}}${COLOR_CLEAR}" +} + +tmp_dir='' +vmr_dir='' +vmr_branch='' +additional_remotes='' +verbosity=verbose +azdev_pat='' +ci=false + +while [[ $# -gt 0 ]]; do + opt="$(echo "$1" | tr "[:upper:]" "[:lower:]")" + case "$opt" in + -t|--tmp|--tmp-dir) + tmp_dir=$2 + shift + ;; + -v|--vmr|--vmr-dir) + vmr_dir=$2 + shift + ;; + -b|--branch|--vmr-branch) + vmr_branch=$2 + shift + ;; + --remote) + additional_remotes="$additional_remotes $2" + shift + ;; + --azdev-pat) + azdev_pat=$2 + shift + ;; + --ci) + ci=true + ;; + -d|--debug) + verbosity=debug + ;; + -h|--help) + print_help + exit 0 + ;; + *) + fail "Invalid argument: $1" + print_help + exit 1 + ;; + esac + + shift +done + +# Validation + +if [[ -z "$tmp_dir" ]]; then + fail "Missing --tmp-dir argument. Please specify the path to the temporary folder where the repositories will be cloned" + exit 1 +fi + +# Sanitize the input + +if [[ -z "$vmr_dir" ]]; then + vmr_dir="$tmp_dir/dotnet" +fi + +if [[ ! -d "$tmp_dir" ]]; then + mkdir -p "$tmp_dir" +fi + +if [[ "$verbosity" == "debug" ]]; then + set -x +fi + +# Prepare the VMR + +if [[ ! -d "$vmr_dir" ]]; then + highlight "Cloning 'dotnet/dotnet' into $vmr_dir.." + git clone https://github.com/dotnet/dotnet "$vmr_dir" + + if [[ -n "$vmr_branch" ]]; then + git -C "$vmr_dir" switch -c "$vmr_branch" + fi +else + if ! git -C "$vmr_dir" diff --quiet; then + fail "There are changes in the working tree of $vmr_dir. Please commit or stash your changes" + exit 1 + fi + + if [[ -n "$vmr_branch" ]]; then + highlight "Preparing $vmr_dir" + git -C "$vmr_dir" checkout "$vmr_branch" + git -C "$vmr_dir" pull + fi +fi + +set -e + +# Prepare darc + +highlight 'Installing .NET, preparing the tooling..' +source "./eng/common/tools.sh" +InitializeDotNetCli true +GetDarc +dotnetDir=$( cd ./.dotnet/; pwd -P ) +dotnet=$dotnetDir/dotnet + +highlight "Starting the synchronization of VMR.." 
+set +e + +if [[ -n "$additional_remotes" ]]; then + additional_remotes="--additional-remotes $additional_remotes" +fi + +if [[ -n "$azdev_pat" ]]; then + azdev_pat="--azdev-pat $azdev_pat" +fi + +ci_arg='' +if [[ "$ci" == "true" ]]; then + ci_arg="--ci" +fi + +# Synchronize the VMR + +export DOTNET_ROOT="$dotnetDir" + +"$darc_tool" vmr forwardflow \ + --tmp "$tmp_dir" \ + $azdev_pat \ + --$verbosity \ + $ci_arg \ + $additional_remotes \ + "$vmr_dir" + +if [[ $? == 0 ]]; then + highlight "Synchronization succeeded" +else + fail "Synchronization of repo to VMR failed!" + fail "'$vmr_dir' is left in its last state (re-run of this script will reset it)." + fail "Please inspect the logs which contain path to the failing patch file (use --debug to get all the details)." + fail "Once you make changes to the conflicting VMR patch, commit it locally and re-run this script." + exit 1 +fi diff --git a/eng/packages/General.props b/eng/packages/General.props index 5be4031ad4d..496309b7346 100644 --- a/eng/packages/General.props +++ b/eng/packages/General.props @@ -2,7 +2,7 @@ - + diff --git a/eng/pipelines/templates/BuildAndTest.yml b/eng/pipelines/templates/BuildAndTest.yml index bd008ead075..bd760e84f3c 100644 --- a/eng/pipelines/templates/BuildAndTest.yml +++ b/eng/pipelines/templates/BuildAndTest.yml @@ -169,11 +169,12 @@ steps: displayName: Build Azure DevOps plugin - script: ${{ parameters.buildScript }} + -restore -sign $(_SignArgs) -publish $(_PublishArgs) -configuration ${{ parameters.buildConfig }} -warnAsError 1 /bl:${{ parameters.repoLogPath }}/publish.binlog - /p:Restore=false /p:Build=false + /p:Build=false $(_OfficialBuildIdArgs) displayName: Sign and publish diff --git a/global.json b/global.json index bb42ae14f2c..fd61e068f3a 100644 --- a/global.json +++ b/global.json @@ -1,9 +1,9 @@ { "sdk": { - "version": "9.0.109" + "version": "10.0.100-rc.1.25451.107" }, "tools": { - "dotnet": "9.0.109", + "dotnet": "10.0.100-rc.1.25451.107", "runtimes": { "dotnet": [ "8.0.0", @@ -18,7 +18,7 @@ "msbuild-sdks": { "Microsoft.Build.NoTargets": "3.7.0", "Microsoft.Build.Traversal": "3.2.0", - "Microsoft.DotNet.Arcade.Sdk": "9.0.0-beta.25428.3", - "Microsoft.DotNet.Helix.Sdk": "9.0.0-beta.25428.3" + "Microsoft.DotNet.Arcade.Sdk": "10.0.0-beta.25476.2", + "Microsoft.DotNet.Helix.Sdk": "10.0.0-beta.25476.2" } } diff --git a/src/Generators/Microsoft.Gen.MetricsReports/MetricsReportsHelpers.cs b/src/Generators/Microsoft.Gen.MetricsReports/MetricsReportsHelpers.cs index ee610c0c032..0354528f810 100644 --- a/src/Generators/Microsoft.Gen.MetricsReports/MetricsReportsHelpers.cs +++ b/src/Generators/Microsoft.Gen.MetricsReports/MetricsReportsHelpers.cs @@ -6,6 +6,7 @@ using Microsoft.Gen.Metrics.Model; namespace Microsoft.Gen.MetricsReports; + internal static class MetricsReportsHelpers { internal static ReportedMetricClass[] MapToCommonModel(IReadOnlyList meteringClasses, string? rootNamespace) diff --git a/src/Generators/Shared/RoslynExtensions.cs b/src/Generators/Shared/RoslynExtensions.cs index 82860a09f59..ef4f7e07911 100644 --- a/src/Generators/Shared/RoslynExtensions.cs +++ b/src/Generators/Shared/RoslynExtensions.cs @@ -2,7 +2,6 @@ // The .NET Foundation licenses this file to you under the MIT license. using System; -using System.Collections.Immutable; using Microsoft.CodeAnalysis; using Microsoft.CodeAnalysis.CSharp.Syntax; @@ -103,33 +102,6 @@ internal static class RoslynExtensions ? 
throw new ArgumentException("The input type must correspond to a named type symbol.") : GetBestTypeByMetadataName(compilation, type.FullName); - public static ImmutableArray ToImmutableArray(this ReadOnlySpan span) - { -#pragma warning disable S109 // Magic numbers should not be used - switch (span.Length) - { - case 0: - return ImmutableArray.Empty; - case 1: - return ImmutableArray.Create(span[0]); - case 2: - return ImmutableArray.Create(span[0], span[1]); - case 3: - return ImmutableArray.Create(span[0], span[1], span[2]); - case 4: - return ImmutableArray.Create(span[0], span[1], span[2], span[3]); - default: - var builder = ImmutableArray.CreateBuilder(span.Length); - foreach (var item in span) - { - builder.Add(item); - } - - return builder.MoveToImmutable(); - } -#pragma warning restore S109 // Magic numbers should not be used - } - public static SimpleNameSyntax GetUnqualifiedName(this NameSyntax name) => name switch { diff --git a/src/Libraries/.editorconfig b/src/Libraries/.editorconfig index 287246445ae..d33ed319482 100644 --- a/src/Libraries/.editorconfig +++ b/src/Libraries/.editorconfig @@ -5889,7 +5889,7 @@ dotnet_diagnostic.SA1414.severity = warning # Title : Braces for multi-line statements should not share line # Category : StyleCop.CSharp.LayoutRules # Help Link: https://github.com/DotNetAnalyzers/StyleCopAnalyzers/blob/master/documentation/SA1500.md -dotnet_diagnostic.SA1500.severity = warning +dotnet_diagnostic.SA1500.severity = suggestion # rule does not work well with field-based property initializers # Title : Statement should not be on a single line # Category : StyleCop.CSharp.LayoutRules @@ -5957,7 +5957,7 @@ dotnet_diagnostic.SA1512.severity = none # Title : Closing brace should be followed by blank line # Category : StyleCop.CSharp.LayoutRules # Help Link: https://github.com/DotNetAnalyzers/StyleCopAnalyzers/blob/master/documentation/SA1513.md -dotnet_diagnostic.SA1513.severity = warning +dotnet_diagnostic.SA1513.severity = suggestion # rule does not work well with field-based property initializers # Title : Element documentation header should be preceded by blank line # Category : StyleCop.CSharp.LayoutRules diff --git a/src/Libraries/Microsoft.AspNetCore.Diagnostics.Middleware/Microsoft.AspNetCore.Diagnostics.Middleware.csproj b/src/Libraries/Microsoft.AspNetCore.Diagnostics.Middleware/Microsoft.AspNetCore.Diagnostics.Middleware.csproj index 15e0e9d9225..2d6e518f1b5 100644 --- a/src/Libraries/Microsoft.AspNetCore.Diagnostics.Middleware/Microsoft.AspNetCore.Diagnostics.Middleware.csproj +++ b/src/Libraries/Microsoft.AspNetCore.Diagnostics.Middleware/Microsoft.AspNetCore.Diagnostics.Middleware.csproj @@ -11,6 +11,8 @@ $(NetCoreTargetFrameworks) true + + $(InterceptorsNamespaces);Microsoft.Extensions.Configuration.Binder.SourceGeneration true true false @@ -39,7 +41,6 @@ - diff --git a/src/Libraries/Microsoft.AspNetCore.HeaderParsing/HeaderParsingFeature.cs b/src/Libraries/Microsoft.AspNetCore.HeaderParsing/HeaderParsingFeature.cs index 915fbbee722..ca5e38f3651 100644 --- a/src/Libraries/Microsoft.AspNetCore.HeaderParsing/HeaderParsingFeature.cs +++ b/src/Libraries/Microsoft.AspNetCore.HeaderParsing/HeaderParsingFeature.cs @@ -15,7 +15,6 @@ namespace Microsoft.AspNetCore.HeaderParsing; /// public sealed partial class HeaderParsingFeature { - private readonly IHeaderRegistry _registry; private readonly ILogger _logger; private readonly HeaderParsingMetrics _metrics; @@ -26,11 +25,10 @@ public sealed partial class HeaderParsingFeature internal HttpContext? 
Context { get; set; } - internal HeaderParsingFeature(IHeaderRegistry registry, ILogger logger, HeaderParsingMetrics metrics) + internal HeaderParsingFeature(ILogger logger, HeaderParsingMetrics metrics) { _logger = logger; _metrics = metrics; - _registry = registry; } /// @@ -91,12 +89,11 @@ internal sealed class PoolHelper : IDisposable public PoolHelper( ObjectPool pool, - IHeaderRegistry registry, ILogger logger, HeaderParsingMetrics metrics) { _pool = pool; - Feature = new HeaderParsingFeature(registry, logger, metrics); + Feature = new HeaderParsingFeature(logger, metrics); } public void Dispose() diff --git a/src/Libraries/Microsoft.AspNetCore.HeaderParsing/Microsoft.AspNetCore.HeaderParsing.csproj b/src/Libraries/Microsoft.AspNetCore.HeaderParsing/Microsoft.AspNetCore.HeaderParsing.csproj index adbae73bd9e..32020fa29f9 100644 --- a/src/Libraries/Microsoft.AspNetCore.HeaderParsing/Microsoft.AspNetCore.HeaderParsing.csproj +++ b/src/Libraries/Microsoft.AspNetCore.HeaderParsing/Microsoft.AspNetCore.HeaderParsing.csproj @@ -10,6 +10,8 @@ $(NetCoreTargetFrameworks) true + + $(InterceptorsNamespaces);Microsoft.Extensions.Configuration.Binder.SourceGeneration true true true @@ -30,10 +32,6 @@ - - - - diff --git a/src/Libraries/Microsoft.Extensions.AI.Abstractions/CHANGELOG.md b/src/Libraries/Microsoft.Extensions.AI.Abstractions/CHANGELOG.md index 8a0e875557a..3ba9420a870 100644 --- a/src/Libraries/Microsoft.Extensions.AI.Abstractions/CHANGELOG.md +++ b/src/Libraries/Microsoft.Extensions.AI.Abstractions/CHANGELOG.md @@ -1,6 +1,6 @@ # Release History -## NOT YET RELEASED +## 9.9.1 - Added new `ChatResponseFormat.ForJsonSchema` overloads that export a JSON schema from a .NET type. - Added new `AITool.GetService` virtual method. diff --git a/src/Libraries/Microsoft.Extensions.AI.Abstractions/ChatCompletion/ChatFinishReason.cs b/src/Libraries/Microsoft.Extensions.AI.Abstractions/ChatCompletion/ChatFinishReason.cs index 18b3ec658e3..1852aa07f7c 100644 --- a/src/Libraries/Microsoft.Extensions.AI.Abstractions/ChatCompletion/ChatFinishReason.cs +++ b/src/Libraries/Microsoft.Extensions.AI.Abstractions/ChatCompletion/ChatFinishReason.cs @@ -14,9 +14,6 @@ namespace Microsoft.Extensions.AI; [JsonConverter(typeof(Converter))] public readonly struct ChatFinishReason : IEquatable { - /// The finish reason value. If because `default(ChatFinishReason)` was used, the instance will behave like . - private readonly string? _value; - /// Initializes a new instance of the struct with a string that describes the reason. /// The reason value. /// is . @@ -24,11 +21,11 @@ namespace Microsoft.Extensions.AI; [JsonConstructor] public ChatFinishReason(string value) { - _value = Throw.IfNullOrWhitespace(value); + Value = Throw.IfNullOrWhitespace(value); } /// Gets the finish reason value. - public string Value => _value ?? Stop.Value; + public string Value => field ?? Stop.Value; /// public override bool Equals([NotNullWhen(true)] object? obj) => obj is ChatFinishReason other && Equals(other); diff --git a/src/Libraries/Microsoft.Extensions.AI.Abstractions/ChatCompletion/ChatResponseUpdate.cs b/src/Libraries/Microsoft.Extensions.AI.Abstractions/ChatCompletion/ChatResponseUpdate.cs index 6a4f11c3777..0605d0785bb 100644 --- a/src/Libraries/Microsoft.Extensions.AI.Abstractions/ChatCompletion/ChatResponseUpdate.cs +++ b/src/Libraries/Microsoft.Extensions.AI.Abstractions/ChatCompletion/ChatResponseUpdate.cs @@ -35,9 +35,6 @@ public class ChatResponseUpdate /// The response update content items. private IList? 
_contents; - /// The name of the author of the update. - private string? _authorName; - /// Initializes a new instance of the class. [JsonConstructor] public ChatResponseUpdate() @@ -64,8 +61,8 @@ public ChatResponseUpdate(ChatRole? role, IList? contents) /// Gets or sets the name of the author of the response update. public string? AuthorName { - get => _authorName; - set => _authorName = string.IsNullOrWhiteSpace(value) ? null : value; + get; + set => field = string.IsNullOrWhiteSpace(value) ? null : value; } /// Gets or sets the role of the author of the response update. diff --git a/src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/DataContent.cs b/src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/DataContent.cs index 4b8813c144e..7b2ef0ccb1a 100644 --- a/src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/DataContent.cs +++ b/src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/DataContent.cs @@ -15,6 +15,7 @@ using System.Text.Json.Serialization; using Microsoft.Shared.Diagnostics; +#pragma warning disable IDE0032 // Use auto property #pragma warning disable CA1307 // Specify StringComparison for clarity namespace Microsoft.Extensions.AI; diff --git a/src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/ErrorContent.cs b/src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/ErrorContent.cs index 4588531262b..4b82c4c5e91 100644 --- a/src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/ErrorContent.cs +++ b/src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/ErrorContent.cs @@ -14,22 +14,19 @@ namespace Microsoft.Extensions.AI; [DebuggerDisplay("{DebuggerDisplay,nq}")] public class ErrorContent : AIContent { - /// The error message. - private string? _message; - /// Initializes a new instance of the class with the specified error message. /// The error message to store in this content. public ErrorContent(string? message) { - _message = message; + Message = message; } /// Gets or sets the error message. [AllowNull] public string Message { - get => _message ?? string.Empty; - set => _message = value; + get => field ?? string.Empty; + set; } /// Gets or sets an error code associated with the error. diff --git a/src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/TextContent.cs b/src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/TextContent.cs index 0f70a5f8b0a..d6bac57420f 100644 --- a/src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/TextContent.cs +++ b/src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/TextContent.cs @@ -12,15 +12,13 @@ namespace Microsoft.Extensions.AI; [DebuggerDisplay("{DebuggerDisplay,nq}")] public sealed class TextContent : AIContent { - private string? _text; - /// /// Initializes a new instance of the class. /// /// The text content. public TextContent(string? text) { - _text = text; + Text = text; } /// @@ -29,8 +27,8 @@ public TextContent(string? text) [AllowNull] public string Text { - get => _text ?? string.Empty; - set => _text = value; + get => field ?? 
string.Empty; + set; } /// diff --git a/src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/TextReasoningContent.cs b/src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/TextReasoningContent.cs index 345cb430dbd..57fec14cc0e 100644 --- a/src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/TextReasoningContent.cs +++ b/src/Libraries/Microsoft.Extensions.AI.Abstractions/Contents/TextReasoningContent.cs @@ -17,15 +17,13 @@ namespace Microsoft.Extensions.AI; [DebuggerDisplay("{DebuggerDisplay,nq}")] public sealed class TextReasoningContent : AIContent { - private string? _text; - /// /// Initializes a new instance of the class. /// /// The text reasoning content. public TextReasoningContent(string? text) { - _text = text; + Text = text; } /// @@ -34,8 +32,8 @@ public TextReasoningContent(string? text) [AllowNull] public string Text { - get => _text ?? string.Empty; - set => _text = value; + get => field ?? string.Empty; + set; } /// Gets or sets an optional opaque blob of data associated with this reasoning content. diff --git a/src/Libraries/Microsoft.Extensions.AI.Abstractions/Embeddings/EmbeddingGenerationOptions.cs b/src/Libraries/Microsoft.Extensions.AI.Abstractions/Embeddings/EmbeddingGenerationOptions.cs index b9a13d43dd0..4d88c85b760 100644 --- a/src/Libraries/Microsoft.Extensions.AI.Abstractions/Embeddings/EmbeddingGenerationOptions.cs +++ b/src/Libraries/Microsoft.Extensions.AI.Abstractions/Embeddings/EmbeddingGenerationOptions.cs @@ -10,12 +10,10 @@ namespace Microsoft.Extensions.AI; /// Represents the options for an embedding generation request. public class EmbeddingGenerationOptions { - private int? _dimensions; - /// Gets or sets the number of dimensions requested in the embedding. public int? Dimensions { - get => _dimensions; + get; set { if (value is not null) @@ -23,7 +21,7 @@ public int? Dimensions _ = Throw.IfLessThan(value.Value, 1, nameof(value)); } - _dimensions = value; + field = value; } } diff --git a/src/Libraries/Microsoft.Extensions.AI.Abstractions/Image/ImageGenerationResponse.cs b/src/Libraries/Microsoft.Extensions.AI.Abstractions/Image/ImageGenerationResponse.cs index ba3abe8f1a3..8f093634783 100644 --- a/src/Libraries/Microsoft.Extensions.AI.Abstractions/Image/ImageGenerationResponse.cs +++ b/src/Libraries/Microsoft.Extensions.AI.Abstractions/Image/ImageGenerationResponse.cs @@ -11,9 +11,6 @@ namespace Microsoft.Extensions.AI; [Experimental("MEAI001")] public class ImageGenerationResponse { - /// The content items in the generated text response. - private IList? _contents; - /// Initializes a new instance of the class. [JsonConstructor] public ImageGenerationResponse() @@ -24,7 +21,7 @@ public ImageGenerationResponse() /// The contents for this response. public ImageGenerationResponse(IList? contents) { - _contents = contents; + Contents = contents; } /// Gets or sets the raw representation of the image generation response from an underlying implementation. @@ -46,8 +43,8 @@ public ImageGenerationResponse(IList? contents) [AllowNull] public IList Contents { - get => _contents ??= []; - set => _contents = value; + get => field ??= []; + set; } /// Gets or sets usage details for the image generation response. 
diff --git a/src/Libraries/Microsoft.Extensions.AI.AzureAIInference/CHANGELOG.md b/src/Libraries/Microsoft.Extensions.AI.AzureAIInference/CHANGELOG.md index 412427dcad1..15b9f840773 100644 --- a/src/Libraries/Microsoft.Extensions.AI.AzureAIInference/CHANGELOG.md +++ b/src/Libraries/Microsoft.Extensions.AI.AzureAIInference/CHANGELOG.md @@ -1,6 +1,6 @@ # Release History -## NOT YET RELEASED +## 9.9.1-preview.1.25474.6 - Updated to accommodate the additions in `Microsoft.Extensions.AI.Abstractions`. diff --git a/src/Libraries/Microsoft.Extensions.AI.Evaluation.NLP/Microsoft.Extensions.AI.Evaluation.NLP.csproj b/src/Libraries/Microsoft.Extensions.AI.Evaluation.NLP/Microsoft.Extensions.AI.Evaluation.NLP.csproj index 53564605660..c6d52244f87 100644 --- a/src/Libraries/Microsoft.Extensions.AI.Evaluation.NLP/Microsoft.Extensions.AI.Evaluation.NLP.csproj +++ b/src/Libraries/Microsoft.Extensions.AI.Evaluation.NLP/Microsoft.Extensions.AI.Evaluation.NLP.csproj @@ -26,7 +26,6 @@ - diff --git a/src/Libraries/Microsoft.Extensions.AI.Evaluation.Reporting.Azure/JsonSerialization/AzureStorageJsonUtilities.cs b/src/Libraries/Microsoft.Extensions.AI.Evaluation.Reporting.Azure/JsonSerialization/AzureStorageJsonUtilities.cs index 204ac68394b..c2f7b418b01 100644 --- a/src/Libraries/Microsoft.Extensions.AI.Evaluation.Reporting.Azure/JsonSerialization/AzureStorageJsonUtilities.cs +++ b/src/Libraries/Microsoft.Extensions.AI.Evaluation.Reporting.Azure/JsonSerialization/AzureStorageJsonUtilities.cs @@ -13,16 +13,14 @@ internal static partial class AzureStorageJsonUtilities { internal static class Default { - private static JsonSerializerOptions? _options; - internal static JsonSerializerOptions Options => _options ??= CreateJsonSerializerOptions(writeIndented: true); + internal static JsonSerializerOptions Options => field ??= CreateJsonSerializerOptions(writeIndented: true); internal static JsonTypeInfo CacheEntryTypeInfo => Options.GetTypeInfo(); internal static JsonTypeInfo ScenarioRunResultTypeInfo => Options.GetTypeInfo(); } internal static class Compact { - private static JsonSerializerOptions? _options; - internal static JsonSerializerOptions Options => _options ??= CreateJsonSerializerOptions(writeIndented: false); + internal static JsonSerializerOptions Options => field ??= CreateJsonSerializerOptions(writeIndented: false); internal static JsonTypeInfo CacheEntryTypeInfo => Options.GetTypeInfo(); internal static JsonTypeInfo ScenarioRunResultTypeInfo => Options.GetTypeInfo(); } diff --git a/src/Libraries/Microsoft.Extensions.AI.Evaluation.Reporting/CSharp/JsonSerialization/JsonUtilities.cs b/src/Libraries/Microsoft.Extensions.AI.Evaluation.Reporting/CSharp/JsonSerialization/JsonUtilities.cs index fb514bd33c8..9427b67be8e 100644 --- a/src/Libraries/Microsoft.Extensions.AI.Evaluation.Reporting/CSharp/JsonSerialization/JsonUtilities.cs +++ b/src/Libraries/Microsoft.Extensions.AI.Evaluation.Reporting/CSharp/JsonSerialization/JsonUtilities.cs @@ -14,8 +14,7 @@ internal static partial class JsonUtilities { internal static class Default { - private static JsonSerializerOptions? 
_options; - internal static JsonSerializerOptions Options => _options ??= CreateJsonSerializerOptions(writeIndented: true); + internal static JsonSerializerOptions Options => field ??= CreateJsonSerializerOptions(writeIndented: true); internal static JsonTypeInfo DatasetTypeInfo => Options.GetTypeInfo(); internal static JsonTypeInfo CacheEntryTypeInfo => Options.GetTypeInfo(); internal static JsonTypeInfo ScenarioRunResultTypeInfo => Options.GetTypeInfo(); @@ -23,8 +22,7 @@ internal static class Default internal static class Compact { - private static JsonSerializerOptions? _options; - internal static JsonSerializerOptions Options => _options ??= CreateJsonSerializerOptions(writeIndented: false); + internal static JsonSerializerOptions Options => field ??= CreateJsonSerializerOptions(writeIndented: false); internal static JsonTypeInfo DatasetTypeInfo => Options.GetTypeInfo(); internal static JsonTypeInfo CacheEntryTypeInfo => Options.GetTypeInfo(); internal static JsonTypeInfo ScenarioRunResultTypeInfo => Options.GetTypeInfo(); diff --git a/src/Libraries/Microsoft.Extensions.AI.Evaluation.Safety/ContentSafetyService.cs b/src/Libraries/Microsoft.Extensions.AI.Evaluation.Safety/ContentSafetyService.cs index ec3dc720bb0..26dec553915 100644 --- a/src/Libraries/Microsoft.Extensions.AI.Evaluation.Safety/ContentSafetyService.cs +++ b/src/Libraries/Microsoft.Extensions.AI.Evaluation.Safety/ContentSafetyService.cs @@ -20,15 +20,10 @@ internal sealed partial class ContentSafetyService(ContentSafetyServiceConfigura private const string APIVersionForServiceDiscoveryInHubBasedProjects = "?api-version=2023-08-01-preview"; private const string APIVersionForNonHubBasedProjects = "?api-version=2025-05-15-preview"; - private static HttpClient? _sharedHttpClient; - private static HttpClient SharedHttpClient - { - get - { - _sharedHttpClient ??= new HttpClient(); - return _sharedHttpClient; - } - } + private static HttpClient SharedHttpClient => + field ?? + Interlocked.CompareExchange(ref field, new(), null) ?? + field; private static readonly ConcurrentDictionary _serviceUrlCache = new ConcurrentDictionary(); @@ -445,9 +440,8 @@ private async ValueTask AddHeadersAsync( httpRequestMessage.Headers.Authorization = new AuthenticationHeaderValue("Bearer", token.Token); - if (httpRequestMessage.Content is not null) - { - httpRequestMessage.Content.Headers.ContentType = new MediaTypeHeaderValue("application/json"); - } +#pragma warning disable IDE0058 // Temporary workaround for Roslyn analyzer issue (see https://github.com/dotnet/roslyn/issues/80499). + httpRequestMessage.Content?.Headers.ContentType = new MediaTypeHeaderValue("application/json"); +#pragma warning restore IDE0058 } } diff --git a/src/Libraries/Microsoft.Extensions.AI.OpenAI/CHANGELOG.md b/src/Libraries/Microsoft.Extensions.AI.OpenAI/CHANGELOG.md index da67eb7c841..03fe8357eab 100644 --- a/src/Libraries/Microsoft.Extensions.AI.OpenAI/CHANGELOG.md +++ b/src/Libraries/Microsoft.Extensions.AI.OpenAI/CHANGELOG.md @@ -1,6 +1,6 @@ # Release History -## NOT YET RELEASED +## 9.9.1-preview.1.25474.6 - Updated to depend on OpenAI 2.5.0. - Added M.E.AI to OpenAI conversions for response format types. 
diff --git a/src/Libraries/Microsoft.Extensions.AI/CHANGELOG.md b/src/Libraries/Microsoft.Extensions.AI/CHANGELOG.md index e5c3028693f..275131b595a 100644 --- a/src/Libraries/Microsoft.Extensions.AI/CHANGELOG.md +++ b/src/Libraries/Microsoft.Extensions.AI/CHANGELOG.md @@ -1,6 +1,6 @@ # Release History -## NOT YET RELEASED +## 9.9.1 - Updated the `EnableSensitiveData` properties on `OpenTelemetryChatClient/EmbeddingGenerator` to respect a `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT` environment variable. - Updated `OpenTelemetryChatClient/EmbeddingGenerator` to emit recent additions to the OpenTelemetry Semantic Conventions for Generative AI systems. diff --git a/src/Libraries/Microsoft.Extensions.AI/ChatCompletion/DistributedCachingChatClient.cs b/src/Libraries/Microsoft.Extensions.AI/ChatCompletion/DistributedCachingChatClient.cs index 47984962598..44ddcf84081 100644 --- a/src/Libraries/Microsoft.Extensions.AI/ChatCompletion/DistributedCachingChatClient.cs +++ b/src/Libraries/Microsoft.Extensions.AI/ChatCompletion/DistributedCachingChatClient.cs @@ -44,9 +44,6 @@ public class DistributedCachingChatClient : CachingChatClient /// Additional values used to inform the cache key employed for storing state. private object[]? _cacheKeyAdditionalValues; - /// The to use when serializing cache data. - private JsonSerializerOptions _jsonSerializerOptions = AIJsonUtilities.DefaultOptions; - /// Initializes a new instance of the class. /// The underlying . /// An instance that will be used as the backing store for the cache. @@ -59,9 +56,9 @@ public DistributedCachingChatClient(IChatClient innerClient, IDistributedCache s /// Gets or sets JSON serialization options to use when serializing cache data. public JsonSerializerOptions JsonSerializerOptions { - get => _jsonSerializerOptions; - set => _jsonSerializerOptions = Throw.IfNull(value); - } + get; + set => field = Throw.IfNull(value); + } = AIJsonUtilities.DefaultOptions; /// Gets or sets additional values used to inform the cache key employed for storing state. /// Any values set in this list will augment the other values used to inform the cache key. @@ -75,11 +72,11 @@ public IReadOnlyList? CacheKeyAdditionalValues protected override async Task ReadCacheAsync(string key, CancellationToken cancellationToken) { _ = Throw.IfNull(key); - _jsonSerializerOptions.MakeReadOnly(); + JsonSerializerOptions.MakeReadOnly(); if (await _storage.GetAsync(key, cancellationToken) is byte[] existingJson) { - return (ChatResponse?)JsonSerializer.Deserialize(existingJson, _jsonSerializerOptions.GetTypeInfo(typeof(ChatResponse))); + return (ChatResponse?)JsonSerializer.Deserialize(existingJson, JsonSerializerOptions.GetTypeInfo(typeof(ChatResponse))); } return null; @@ -89,11 +86,11 @@ public IReadOnlyList? 
CacheKeyAdditionalValues protected override async Task?> ReadCacheStreamingAsync(string key, CancellationToken cancellationToken) { _ = Throw.IfNull(key); - _jsonSerializerOptions.MakeReadOnly(); + JsonSerializerOptions.MakeReadOnly(); if (await _storage.GetAsync(key, cancellationToken) is byte[] existingJson) { - return (IReadOnlyList?)JsonSerializer.Deserialize(existingJson, _jsonSerializerOptions.GetTypeInfo(typeof(IReadOnlyList))); + return (IReadOnlyList?)JsonSerializer.Deserialize(existingJson, JsonSerializerOptions.GetTypeInfo(typeof(IReadOnlyList))); } return null; @@ -104,9 +101,9 @@ protected override async Task WriteCacheAsync(string key, ChatResponse value, Ca { _ = Throw.IfNull(key); _ = Throw.IfNull(value); - _jsonSerializerOptions.MakeReadOnly(); + JsonSerializerOptions.MakeReadOnly(); - var newJson = JsonSerializer.SerializeToUtf8Bytes(value, _jsonSerializerOptions.GetTypeInfo(typeof(ChatResponse))); + var newJson = JsonSerializer.SerializeToUtf8Bytes(value, JsonSerializerOptions.GetTypeInfo(typeof(ChatResponse))); await _storage.SetAsync(key, newJson, cancellationToken); } @@ -115,9 +112,9 @@ protected override async Task WriteCacheStreamingAsync(string key, IReadOnlyList { _ = Throw.IfNull(key); _ = Throw.IfNull(value); - _jsonSerializerOptions.MakeReadOnly(); + JsonSerializerOptions.MakeReadOnly(); - var newJson = JsonSerializer.SerializeToUtf8Bytes(value, _jsonSerializerOptions.GetTypeInfo(typeof(IReadOnlyList))); + var newJson = JsonSerializer.SerializeToUtf8Bytes(value, JsonSerializerOptions.GetTypeInfo(typeof(IReadOnlyList))); await _storage.SetAsync(key, newJson, cancellationToken); } @@ -151,7 +148,7 @@ protected override string GetCacheKey(IEnumerable messages, ChatOpt additionalValues.CopyTo(arr.AsSpan(FixedValuesCount)); clientValues.CopyTo(arr, FixedValuesCount + additionalValues.Length); - return AIJsonUtilities.HashDataToString(arr.AsSpan(0, length), _jsonSerializerOptions); + return AIJsonUtilities.HashDataToString(arr.AsSpan(0, length), JsonSerializerOptions); } finally { diff --git a/src/Libraries/Microsoft.Extensions.AI/ChatCompletion/FunctionInvocationContext.cs b/src/Libraries/Microsoft.Extensions.AI/ChatCompletion/FunctionInvocationContext.cs index 0e426615cfd..554918b0a8e 100644 --- a/src/Libraries/Microsoft.Extensions.AI/ChatCompletion/FunctionInvocationContext.cs +++ b/src/Libraries/Microsoft.Extensions.AI/ChatCompletion/FunctionInvocationContext.cs @@ -17,18 +17,6 @@ public class FunctionInvocationContext /// private static readonly AIFunction _nopFunction = AIFunctionFactory.Create(() => { }, nameof(FunctionInvocationContext)); - /// The chat contents associated with the operation that initiated this function call request. - private IList _messages = Array.Empty(); - - /// The AI function to be invoked. - private AIFunction _function = _nopFunction; - - /// The function call content information associated with this invocation. - private FunctionCallContent? _callContent; - - /// The arguments used with the function. - private AIFunctionArguments? _arguments; - /// Initializes a new instance of the class. public FunctionInvocationContext() { @@ -37,30 +25,30 @@ public FunctionInvocationContext() /// Gets or sets the AI function to be invoked. public AIFunction Function { - get => _function; - set => _function = Throw.IfNull(value); - } + get; + set => field = Throw.IfNull(value); + } = _nopFunction; /// Gets or sets the arguments associated with this invocation. 
public AIFunctionArguments Arguments { - get => _arguments ??= []; - set => _arguments = Throw.IfNull(value); + get => field ??= []; + set => field = Throw.IfNull(value); } /// Gets or sets the function call content information associated with this invocation. public FunctionCallContent CallContent { - get => _callContent ??= new(string.Empty, _nopFunction.Name, EmptyReadOnlyDictionary.Instance); - set => _callContent = Throw.IfNull(value); + get => field ??= new(string.Empty, _nopFunction.Name, EmptyReadOnlyDictionary.Instance); + set => field = Throw.IfNull(value); } /// Gets or sets the chat contents associated with the operation that initiated this function call request. public IList Messages { - get => _messages; - set => _messages = Throw.IfNull(value); - } + get; + set => field = Throw.IfNull(value); + } = Array.Empty(); /// Gets or sets the chat options associated with the operation that initiated this function call request. public ChatOptions? Options { get; set; } diff --git a/src/Libraries/Microsoft.Extensions.AI/ChatCompletion/FunctionInvokingChatClient.cs b/src/Libraries/Microsoft.Extensions.AI/ChatCompletion/FunctionInvokingChatClient.cs index 4495d592adb..fdc5ef7a204 100644 --- a/src/Libraries/Microsoft.Extensions.AI/ChatCompletion/FunctionInvokingChatClient.cs +++ b/src/Libraries/Microsoft.Extensions.AI/ChatCompletion/FunctionInvokingChatClient.cs @@ -16,8 +16,6 @@ #pragma warning disable CA2213 // Disposable fields should be disposed #pragma warning disable S3353 // Unchanged local variables should be "const" -#pragma warning disable IDE0031 // Use null propagation, suppressed until repo updates to C# 14 -#pragma warning disable IDE0032 // Use auto property, suppressed until repo updates to C# 14 namespace Microsoft.Extensions.AI; @@ -79,12 +77,6 @@ public partial class FunctionInvokingChatClient : DelegatingChatClient /// This component does not own the instance and should not dispose it. private readonly ActivitySource? _activitySource; - /// Maximum number of roundtrips allowed to the inner client. - private int _maximumIterationsPerRequest = 40; // arbitrary default to prevent runaway execution - - /// Maximum number of consecutive iterations that are allowed contain at least one exception result. If the limit is exceeded, we rethrow the exception instead of continuing. - private int _maximumConsecutiveErrorsPerRequest = 3; - /// /// Initializes a new instance of the class. /// @@ -178,7 +170,7 @@ public static FunctionInvocationContext? CurrentContext /// public int MaximumIterationsPerRequest { - get => _maximumIterationsPerRequest; + get; set { if (value < 1) @@ -186,9 +178,9 @@ public int MaximumIterationsPerRequest Throw.ArgumentOutOfRangeException(nameof(value)); } - _maximumIterationsPerRequest = value; + field = value; } - } + } = 40; /// /// Gets or sets the maximum number of consecutive iterations that are allowed to fail with an error. @@ -220,9 +212,9 @@ public int MaximumIterationsPerRequest /// public int MaximumConsecutiveErrorsPerRequest { - get => _maximumConsecutiveErrorsPerRequest; - set => _maximumConsecutiveErrorsPerRequest = Throw.IfLessThan(value, 0); - } + get; + set => field = Throw.IfLessThan(value, 0); + } = 3; /// Gets or sets a collection of additional tools the client is able to invoke. /// @@ -1430,10 +1422,9 @@ private static (List? 
approvals, List cm) diff --git a/src/Libraries/Microsoft.Extensions.AI/ChatReduction/SummarizingChatReducer.cs b/src/Libraries/Microsoft.Extensions.AI/ChatReduction/SummarizingChatReducer.cs index b79d1d18197..f097c1c9a35 100644 --- a/src/Libraries/Microsoft.Extensions.AI/ChatReduction/SummarizingChatReducer.cs +++ b/src/Libraries/Microsoft.Extensions.AI/ChatReduction/SummarizingChatReducer.cs @@ -45,16 +45,14 @@ public sealed class SummarizingChatReducer : IChatReducer private readonly int _targetCount; private readonly int _thresholdCount; - private string _summarizationPrompt = DefaultSummarizationPrompt; - /// /// Gets or sets the prompt text used for summarization. /// public string SummarizationPrompt { - get => _summarizationPrompt; - set => _summarizationPrompt = Throw.IfNull(value); - } + get; + set => field = Throw.IfNull(value); + } = DefaultSummarizationPrompt; /// /// Initializes a new instance of the class with the specified chat client, @@ -79,7 +77,7 @@ public async Task> ReduceAsync(IEnumerable if (summarizedConversion.ShouldResummarize(_targetCount, _thresholdCount)) { summarizedConversion = await summarizedConversion.ResummarizeAsync( - _chatClient, _targetCount, _summarizationPrompt, cancellationToken); + _chatClient, _targetCount, SummarizationPrompt, cancellationToken); } return summarizedConversion.ToChatMessages(); diff --git a/src/Libraries/Microsoft.Extensions.AI/SpeechToText/LoggingSpeechToTextClientBuilderExtensions.cs b/src/Libraries/Microsoft.Extensions.AI/SpeechToText/LoggingSpeechToTextClientBuilderExtensions.cs index 92a67189982..54ed411bd35 100644 --- a/src/Libraries/Microsoft.Extensions.AI/SpeechToText/LoggingSpeechToTextClientBuilderExtensions.cs +++ b/src/Libraries/Microsoft.Extensions.AI/SpeechToText/LoggingSpeechToTextClientBuilderExtensions.cs @@ -14,7 +14,7 @@ namespace Microsoft.Extensions.AI; [Experimental("MEAI001")] public static class LoggingSpeechToTextClientBuilderExtensions { - /// Adds logging to the audio transcription client pipeline. + /// Adds logging to the speech-to-text client pipeline. /// The . /// /// An optional used to create a logger with which logging should be performed. diff --git a/src/Libraries/Microsoft.Extensions.AI/SpeechToText/OpenTelemetrySpeechToTextClient.cs b/src/Libraries/Microsoft.Extensions.AI/SpeechToText/OpenTelemetrySpeechToTextClient.cs new file mode 100644 index 00000000000..40461ebe457 --- /dev/null +++ b/src/Libraries/Microsoft.Extensions.AI/SpeechToText/OpenTelemetrySpeechToTextClient.cs @@ -0,0 +1,367 @@ +// Licensed to the .NET Foundation under one or more agreements. +// The .NET Foundation licenses this file to you under the MIT license. + +using System; +using System.Collections.Generic; +using System.Diagnostics; +using System.Diagnostics.CodeAnalysis; +using System.Diagnostics.Metrics; +using System.IO; +using System.Runtime.CompilerServices; +using System.Threading; +using System.Threading.Tasks; +using Microsoft.Extensions.Logging; +using Microsoft.Shared.Diagnostics; + +#pragma warning disable S3358 // Ternary operators should not be nested +#pragma warning disable SA1111 // Closing parenthesis should be on line of last parameter +#pragma warning disable SA1113 // Comma should be on the same line as previous parameter + +namespace Microsoft.Extensions.AI; + +/// Represents a delegating speech-to-text client that implements the OpenTelemetry Semantic Conventions for Generative AI systems. 
+/// +/// This class provides an implementation of the Semantic Conventions for Generative AI systems v1.37, defined at . +/// The specification is still experimental and subject to change; as such, the telemetry output by this client is also subject to change. +/// +[Experimental("MEAI001")] +public sealed class OpenTelemetrySpeechToTextClient : DelegatingSpeechToTextClient +{ + private readonly ActivitySource _activitySource; + private readonly Meter _meter; + + private readonly Histogram _tokenUsageHistogram; + private readonly Histogram _operationDurationHistogram; + + private readonly string? _defaultModelId; + private readonly string? _providerName; + private readonly string? _serverAddress; + private readonly int _serverPort; + + /// Initializes a new instance of the class. + /// The underlying . + /// The to use for emitting any logging data from the client. + /// An optional source name that will be used on the telemetry data. +#pragma warning disable IDE0060 // Remove unused parameter; it exists for consistency with IChatClient and future use + public OpenTelemetrySpeechToTextClient(ISpeechToTextClient innerClient, ILogger? logger = null, string? sourceName = null) +#pragma warning restore IDE0060 + : base(innerClient) + { + Debug.Assert(innerClient is not null, "Should have been validated by the base ctor"); + + if (innerClient!.GetService() is SpeechToTextClientMetadata metadata) + { + _defaultModelId = metadata.DefaultModelId; + _providerName = metadata.ProviderName; + _serverAddress = metadata.ProviderUri?.Host; + _serverPort = metadata.ProviderUri?.Port ?? 0; + } + + string name = string.IsNullOrEmpty(sourceName) ? OpenTelemetryConsts.DefaultSourceName : sourceName!; + _activitySource = new(name); + _meter = new(name); + + _tokenUsageHistogram = _meter.CreateHistogram( + OpenTelemetryConsts.GenAI.Client.TokenUsage.Name, + OpenTelemetryConsts.TokensUnit, + OpenTelemetryConsts.GenAI.Client.TokenUsage.Description +#if NET9_0_OR_GREATER + , advice: new() { HistogramBucketBoundaries = OpenTelemetryConsts.GenAI.Client.TokenUsage.ExplicitBucketBoundaries } +#endif + ); + + _operationDurationHistogram = _meter.CreateHistogram( + OpenTelemetryConsts.GenAI.Client.OperationDuration.Name, + OpenTelemetryConsts.SecondsUnit, + OpenTelemetryConsts.GenAI.Client.OperationDuration.Description +#if NET9_0_OR_GREATER + , advice: new() { HistogramBucketBoundaries = OpenTelemetryConsts.GenAI.Client.OperationDuration.ExplicitBucketBoundaries } +#endif + ); + } + + /// + protected override void Dispose(bool disposing) + { + if (disposing) + { + _activitySource.Dispose(); + _meter.Dispose(); + } + + base.Dispose(disposing); + } + + /// + /// Gets or sets a value indicating whether potentially sensitive information should be included in telemetry. + /// + /// + /// if potentially sensitive information should be included in telemetry; + /// if telemetry shouldn't include raw inputs and outputs. + /// The default value is , unless the OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT + /// environment variable is set to "true" (case-insensitive). + /// + /// + /// By default, telemetry includes metadata, such as token counts, but not raw inputs + /// and outputs, such as message content, function call arguments, and function call results. + /// The default value can be overridden by setting the OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT + /// environment variable to "true". Explicitly setting this property will override the environment variable. 
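As a usage illustration of the behavior described in the remarks above (not part of this diff): sensitive content capture can be enabled process-wide via the environment variable, or per instance by setting the property, with the explicit property taking precedence. `innerClient` below stands in for any existing `ISpeechToTextClient`.

```csharp
// Hypothetical wiring; assumes `innerClient` is an existing ISpeechToTextClient.
// Process-wide default: set OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
// in the host environment before the process starts.
ISpeechToTextClient client = new OpenTelemetrySpeechToTextClient(innerClient, sourceName: "MyApp.SpeechToText")
{
    // Explicit opt-in on this instance overrides the environment-variable default.
    EnableSensitiveData = true,
};
```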
+ /// + public bool EnableSensitiveData { get; set; } = TelemetryHelpers.EnableSensitiveDataDefault; + + /// + public override object? GetService(Type serviceType, object? serviceKey = null) => + serviceType == typeof(ActivitySource) ? _activitySource : + base.GetService(serviceType, serviceKey); + + /// + public override async Task GetTextAsync(Stream audioSpeechStream, SpeechToTextOptions? options = null, CancellationToken cancellationToken = default) + { + _ = Throw.IfNull(audioSpeechStream); + + using Activity? activity = CreateAndConfigureActivity(options); + Stopwatch? stopwatch = _operationDurationHistogram.Enabled ? Stopwatch.StartNew() : null; + string? requestModelId = options?.ModelId ?? _defaultModelId; + + SpeechToTextResponse? response = null; + Exception? error = null; + try + { + response = await base.GetTextAsync(audioSpeechStream, options, cancellationToken); + return response; + } + catch (Exception ex) + { + error = ex; + throw; + } + finally + { + TraceResponse(activity, requestModelId, response, error, stopwatch); + } + } + + /// + public override async IAsyncEnumerable GetStreamingTextAsync( + Stream audioSpeechStream, SpeechToTextOptions? options = null, [EnumeratorCancellation] CancellationToken cancellationToken = default) + { + _ = Throw.IfNull(audioSpeechStream); + + using Activity? activity = CreateAndConfigureActivity(options); + Stopwatch? stopwatch = _operationDurationHistogram.Enabled ? Stopwatch.StartNew() : null; + string? requestModelId = options?.ModelId ?? _defaultModelId; + + IAsyncEnumerable updates; + try + { + updates = base.GetStreamingTextAsync(audioSpeechStream, options, cancellationToken); + } + catch (Exception ex) + { + TraceResponse(activity, requestModelId, response: null, ex, stopwatch); + throw; + } + + var responseEnumerator = updates.GetAsyncEnumerator(cancellationToken); + List trackedUpdates = []; + Exception? error = null; + try + { + while (true) + { + SpeechToTextResponseUpdate update; + try + { + if (!await responseEnumerator.MoveNextAsync()) + { + break; + } + + update = responseEnumerator.Current; + } + catch (Exception ex) + { + error = ex; + throw; + } + + trackedUpdates.Add(update); + yield return update; + Activity.Current = activity; // workaround for https://github.com/dotnet/runtime/issues/47802 + } + } + finally + { + TraceResponse(activity, requestModelId, trackedUpdates.ToSpeechToTextResponse(), error, stopwatch); + + await responseEnumerator.DisposeAsync(); + } + } + + /// Creates an activity for a speech-to-text request, or returns if not enabled. + private Activity? CreateAndConfigureActivity(SpeechToTextOptions? options) + { + Activity? activity = null; + if (_activitySource.HasListeners()) + { + string? modelId = options?.ModelId ?? _defaultModelId; + + activity = _activitySource.StartActivity( + string.IsNullOrWhiteSpace(modelId) ? 
OpenTelemetryConsts.GenAI.GenerateContentName : $"{OpenTelemetryConsts.GenAI.GenerateContentName} {modelId}", + ActivityKind.Client); + + if (activity is { IsAllDataRequested: true }) + { + _ = activity + .AddTag(OpenTelemetryConsts.GenAI.Operation.Name, OpenTelemetryConsts.GenAI.GenerateContentName) + .AddTag(OpenTelemetryConsts.GenAI.Request.Model, modelId) + .AddTag(OpenTelemetryConsts.GenAI.Provider.Name, _providerName) + .AddTag(OpenTelemetryConsts.GenAI.Output.Type, OpenTelemetryConsts.TypeText); + + if (_serverAddress is not null) + { + _ = activity + .AddTag(OpenTelemetryConsts.Server.Address, _serverAddress) + .AddTag(OpenTelemetryConsts.Server.Port, _serverPort); + } + + if (options is not null) + { + if (EnableSensitiveData) + { + // Log all additional request options as raw values on the span. + // Since AdditionalProperties has undefined meaning, we treat it as potentially sensitive data. + if (options.AdditionalProperties is { } props) + { + foreach (KeyValuePair prop in props) + { + _ = activity.AddTag(prop.Key, prop.Value); + } + } + } + } + } + } + + return activity; + } + + /// Adds speech-to-text response information to the activity. + private void TraceResponse( + Activity? activity, + string? requestModelId, + SpeechToTextResponse? response, + Exception? error, + Stopwatch? stopwatch) + { + if (_operationDurationHistogram.Enabled && stopwatch is not null) + { + TagList tags = default; + + AddMetricTags(ref tags, requestModelId, response); + if (error is not null) + { + tags.Add(OpenTelemetryConsts.Error.Type, error.GetType().FullName); + } + + _operationDurationHistogram.Record(stopwatch.Elapsed.TotalSeconds, tags); + } + + if (_tokenUsageHistogram.Enabled && response?.Usage is { } usage) + { + if (usage.InputTokenCount is long inputTokens) + { + TagList tags = default; + tags.Add(OpenTelemetryConsts.GenAI.Token.Type, OpenTelemetryConsts.TokenTypeInput); + AddMetricTags(ref tags, requestModelId, response); + _tokenUsageHistogram.Record((int)inputTokens, tags); + } + + if (usage.OutputTokenCount is long outputTokens) + { + TagList tags = default; + tags.Add(OpenTelemetryConsts.GenAI.Token.Type, OpenTelemetryConsts.TokenTypeOutput); + AddMetricTags(ref tags, requestModelId, response); + _tokenUsageHistogram.Record((int)outputTokens, tags); + } + } + + if (error is not null) + { + _ = activity? + .AddTag(OpenTelemetryConsts.Error.Type, error.GetType().FullName) + .SetStatus(ActivityStatusCode.Error, error.Message); + } + + if (response is not null) + { + AddOutputMessagesTags(response, activity); + + if (activity is not null) + { + if (!string.IsNullOrWhiteSpace(response.ResponseId)) + { + _ = activity.AddTag(OpenTelemetryConsts.GenAI.Response.Id, response.ResponseId); + } + + if (response.ModelId is not null) + { + _ = activity.AddTag(OpenTelemetryConsts.GenAI.Response.Model, response.ModelId); + } + + if (response.Usage?.InputTokenCount is long inputTokens) + { + _ = activity.AddTag(OpenTelemetryConsts.GenAI.Usage.InputTokens, (int)inputTokens); + } + + if (response.Usage?.OutputTokenCount is long outputTokens) + { + _ = activity.AddTag(OpenTelemetryConsts.GenAI.Usage.OutputTokens, (int)outputTokens); + } + + // Log all additional response properties as raw values on the span. + // Since AdditionalProperties has undefined meaning, we treat it as potentially sensitive data. 
+ if (EnableSensitiveData && response.AdditionalProperties is { } props) + { + foreach (KeyValuePair prop in props) + { + _ = activity.AddTag(prop.Key, prop.Value); + } + } + } + } + + void AddMetricTags(ref TagList tags, string? requestModelId, SpeechToTextResponse? response) + { + tags.Add(OpenTelemetryConsts.GenAI.Operation.Name, OpenTelemetryConsts.GenAI.GenerateContentName); + + if (requestModelId is not null) + { + tags.Add(OpenTelemetryConsts.GenAI.Request.Model, requestModelId); + } + + tags.Add(OpenTelemetryConsts.GenAI.Provider.Name, _providerName); + + if (_serverAddress is string endpointAddress) + { + tags.Add(OpenTelemetryConsts.Server.Address, endpointAddress); + tags.Add(OpenTelemetryConsts.Server.Port, _serverPort); + } + + if (response?.ModelId is string responseModel) + { + tags.Add(OpenTelemetryConsts.GenAI.Response.Model, responseModel); + } + } + } + + private void AddOutputMessagesTags(SpeechToTextResponse response, Activity? activity) + { + if (EnableSensitiveData && activity is { IsAllDataRequested: true }) + { + _ = activity.AddTag( + OpenTelemetryConsts.GenAI.Output.Messages, + OpenTelemetryChatClient.SerializeChatMessages([new(ChatRole.Assistant, response.Contents)])); + } + } +} diff --git a/src/Libraries/Microsoft.Extensions.AI/SpeechToText/OpenTelemetrySpeechToTextClientBuilderExtensions.cs b/src/Libraries/Microsoft.Extensions.AI/SpeechToText/OpenTelemetrySpeechToTextClientBuilderExtensions.cs new file mode 100644 index 00000000000..5e23a41358e --- /dev/null +++ b/src/Libraries/Microsoft.Extensions.AI/SpeechToText/OpenTelemetrySpeechToTextClientBuilderExtensions.cs @@ -0,0 +1,42 @@ +// Licensed to the .NET Foundation under one or more agreements. +// The .NET Foundation licenses this file to you under the MIT license. + +using System; +using System.Diagnostics.CodeAnalysis; +using Microsoft.Extensions.DependencyInjection; +using Microsoft.Extensions.Logging; +using Microsoft.Shared.Diagnostics; + +namespace Microsoft.Extensions.AI; + +/// Provides extensions for configuring instances. +[Experimental("MEAI001")] +public static class OpenTelemetrySpeechToTextClientBuilderExtensions +{ + /// + /// Adds OpenTelemetry support to the speech-to-text client pipeline, following the OpenTelemetry Semantic Conventions for Generative AI systems. + /// + /// + /// The draft specification this follows is available at . + /// The specification is still experimental and subject to change; as such, the telemetry output by this client is also subject to change. + /// + /// The . + /// An optional to use to create a logger for logging events. + /// An optional source name that will be used on the telemetry data. + /// An optional callback that can be used to configure the instance. + /// The . + public static SpeechToTextClientBuilder UseOpenTelemetry( + this SpeechToTextClientBuilder builder, + ILoggerFactory? loggerFactory = null, + string? sourceName = null, + Action? 
configure = null) => + Throw.IfNull(builder).Use((innerClient, services) => + { + loggerFactory ??= services.GetService(); + + var client = new OpenTelemetrySpeechToTextClient(innerClient, loggerFactory?.CreateLogger(typeof(OpenTelemetrySpeechToTextClient)), sourceName); + configure?.Invoke(client); + + return client; + }); +} diff --git a/src/Libraries/Microsoft.Extensions.AI/SpeechToText/SpeechToTextClientBuilder.cs b/src/Libraries/Microsoft.Extensions.AI/SpeechToText/SpeechToTextClientBuilder.cs index dae4224a94d..1945a140762 100644 --- a/src/Libraries/Microsoft.Extensions.AI/SpeechToText/SpeechToTextClientBuilder.cs +++ b/src/Libraries/Microsoft.Extensions.AI/SpeechToText/SpeechToTextClientBuilder.cs @@ -58,7 +58,7 @@ public ISpeechToTextClient Build(IServiceProvider? services = null) return audioClient; } - /// Adds a factory for an intermediate audio transcription client to the audio transcription client pipeline. + /// Adds a factory for an intermediate speech-to-text client to the speech-to-text client pipeline. /// The client factory function. /// The updated instance. public SpeechToTextClientBuilder Use(Func clientFactory) @@ -68,7 +68,7 @@ public SpeechToTextClientBuilder Use(Func clientFactory(innerClient)); } - /// Adds a factory for an intermediate audio transcription client to the audio transcription client pipeline. + /// Adds a factory for an intermediate speech-to-text client to the speech-to-text client pipeline. /// The client factory function. /// The updated instance. public SpeechToTextClientBuilder Use(Func clientFactory) diff --git a/src/Libraries/Microsoft.Extensions.AsyncState/IAsyncState.cs b/src/Libraries/Microsoft.Extensions.AsyncState/IAsyncState.cs index 21f33e05a2d..2593136a736 100644 --- a/src/Libraries/Microsoft.Extensions.AsyncState/IAsyncState.cs +++ b/src/Libraries/Microsoft.Extensions.AsyncState/IAsyncState.cs @@ -54,5 +54,5 @@ public interface IAsyncState /// Registers new async context with the state. /// /// Token that gives access to the reserved context. - public AsyncStateToken RegisterAsyncContext(); + AsyncStateToken RegisterAsyncContext(); } diff --git a/src/Libraries/Microsoft.Extensions.Caching.Hybrid/Internal/DefaultJsonSerializerFactory.cs b/src/Libraries/Microsoft.Extensions.Caching.Hybrid/Internal/DefaultJsonSerializerFactory.cs index f499ba485b3..537968113c2 100644 --- a/src/Libraries/Microsoft.Extensions.Caching.Hybrid/Internal/DefaultJsonSerializerFactory.cs +++ b/src/Libraries/Microsoft.Extensions.Caching.Hybrid/Internal/DefaultJsonSerializerFactory.cs @@ -126,10 +126,9 @@ static void PrepareStateForDepth(Type type, ref Dictionary? state, bool result) { var value = result ? FieldOnlyResult.FieldOnly : FieldOnlyResult.NotFieldOnly; - if (state is not null) - { - state[type] = value; - } +#pragma warning disable IDE0058 // Temporary workaround for Roslyn analyzer issue (see https://github.com/dotnet/roslyn/issues/80499). 
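Stepping back from the hunks above: the new `UseOpenTelemetry` extension slots into a speech-to-text pipeline the same way its chat-client counterpart does. A rough sketch, assuming `providerClient` is whatever `ISpeechToTextClient` a provider SDK hands you and that the builder is started from its `ISpeechToTextClient` constructor overload:

```csharp
// Sketch only; names other than the extension method itself are assumptions.
ISpeechToTextClient client = new SpeechToTextClientBuilder(providerClient)
    .UseOpenTelemetry(
        sourceName: "MyApp.SpeechToText",
        configure: c => c.EnableSensitiveData = false)
    .Build();
```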
+ state?[type] = value; +#pragma warning restore IDE0058 return value; } diff --git a/src/Libraries/Microsoft.Extensions.Diagnostics.ExceptionSummarization/IExceptionSummarizationBuilder.cs b/src/Libraries/Microsoft.Extensions.Diagnostics.ExceptionSummarization/IExceptionSummarizationBuilder.cs index a7a08eb4d3e..fe7d146c579 100644 --- a/src/Libraries/Microsoft.Extensions.Diagnostics.ExceptionSummarization/IExceptionSummarizationBuilder.cs +++ b/src/Libraries/Microsoft.Extensions.Diagnostics.ExceptionSummarization/IExceptionSummarizationBuilder.cs @@ -14,7 +14,7 @@ public interface IExceptionSummarizationBuilder /// /// Gets the service collection into which the summary provider instances are registered. /// - public IServiceCollection Services { get; } + IServiceCollection Services { get; } /// /// Adds a summary provider to the builder. diff --git a/src/Libraries/Microsoft.Extensions.Diagnostics.ExceptionSummarization/IExceptionSummarizer.cs b/src/Libraries/Microsoft.Extensions.Diagnostics.ExceptionSummarization/IExceptionSummarizer.cs index c9c4c4ef48a..3087bded812 100644 --- a/src/Libraries/Microsoft.Extensions.Diagnostics.ExceptionSummarization/IExceptionSummarizer.cs +++ b/src/Libraries/Microsoft.Extensions.Diagnostics.ExceptionSummarization/IExceptionSummarizer.cs @@ -15,5 +15,5 @@ public interface IExceptionSummarizer /// /// The exception to summarize. /// The summary of the given . - public ExceptionSummary Summarize(Exception exception); + ExceptionSummary Summarize(Exception exception); } diff --git a/src/Libraries/Microsoft.Extensions.Diagnostics.ExceptionSummarization/IExceptionSummaryProvider.cs b/src/Libraries/Microsoft.Extensions.Diagnostics.ExceptionSummarization/IExceptionSummaryProvider.cs index 24cfd0d079e..000e34ac0c7 100644 --- a/src/Libraries/Microsoft.Extensions.Diagnostics.ExceptionSummarization/IExceptionSummaryProvider.cs +++ b/src/Libraries/Microsoft.Extensions.Diagnostics.ExceptionSummarization/IExceptionSummaryProvider.cs @@ -26,15 +26,15 @@ public interface IExceptionSummaryProvider /// This method should only get invoked with an exception which is type compatible with a type /// described by . /// - public int Describe(Exception exception, out string? additionalDetails); + int Describe(Exception exception, out string? additionalDetails); /// /// Gets the set of supported exception types that can be handled by this provider. /// - public IEnumerable SupportedExceptionTypes { get; } + IEnumerable SupportedExceptionTypes { get; } /// /// Gets the set of description strings exposed by this provider. /// - public IReadOnlyList Descriptions { get; } + IReadOnlyList Descriptions { get; } } diff --git a/src/Libraries/Microsoft.Extensions.Diagnostics.HealthChecks.Common/IManualHealthCheck.cs b/src/Libraries/Microsoft.Extensions.Diagnostics.HealthChecks.Common/IManualHealthCheck.cs index 6772a33f0d3..e8164908150 100644 --- a/src/Libraries/Microsoft.Extensions.Diagnostics.HealthChecks.Common/IManualHealthCheck.cs +++ b/src/Libraries/Microsoft.Extensions.Diagnostics.HealthChecks.Common/IManualHealthCheck.cs @@ -13,7 +13,7 @@ public interface IManualHealthCheck : IDisposable /// /// Gets or sets the health status. 
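The `state?[type] = value` rewrite in the `DefaultJsonSerializerFactory` hunk above uses null-conditional assignment, a recent C# addition; a tiny illustrative snippet (names are made up, and a compiler supporting the feature is assumed):

```csharp
using System;
using System.Collections.Generic;

Dictionary<Type, int>? state = null;

// The assignment happens only when `state` is non-null; when it is null,
// the whole statement (including the right-hand side) is skipped.
state?[typeof(string)] = 1;

state = new();
state?[typeof(string)] = 1; // now assigns as usual
Console.WriteLine(state[typeof(string)]); // prints 1
```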
/// - public HealthCheckResult Result { get; set; } + HealthCheckResult Result { get; set; } } /// diff --git a/src/Libraries/Microsoft.Extensions.Diagnostics.HealthChecks.Common/ManualHealthCheck.cs b/src/Libraries/Microsoft.Extensions.Diagnostics.HealthChecks.Common/ManualHealthCheck.cs index 0302b1a896f..d9faa46f338 100644 --- a/src/Libraries/Microsoft.Extensions.Diagnostics.HealthChecks.Common/ManualHealthCheck.cs +++ b/src/Libraries/Microsoft.Extensions.Diagnostics.HealthChecks.Common/ManualHealthCheck.cs @@ -10,22 +10,20 @@ internal sealed class ManualHealthCheck : IManualHealthCheck { private static readonly object _lock = new(); - private HealthCheckResult _result; - public HealthCheckResult Result { get { lock (_lock) { - return _result; + return field; } } set { lock (_lock) { - _result = value; + field = value; } } } diff --git a/src/Libraries/Microsoft.Extensions.Http.Resilience/Polly/HttpRetryStrategyOptions.cs b/src/Libraries/Microsoft.Extensions.Http.Resilience/Polly/HttpRetryStrategyOptions.cs index db0a8850e20..7f105586ea6 100644 --- a/src/Libraries/Microsoft.Extensions.Http.Resilience/Polly/HttpRetryStrategyOptions.cs +++ b/src/Libraries/Microsoft.Extensions.Http.Resilience/Polly/HttpRetryStrategyOptions.cs @@ -16,8 +16,6 @@ namespace Microsoft.Extensions.Http.Resilience; /// public class HttpRetryStrategyOptions : RetryStrategyOptions { - private bool _shouldRetryAfterHeader; - /// /// Initializes a new instance of the class. /// @@ -47,12 +45,12 @@ public HttpRetryStrategyOptions() /// public bool ShouldRetryAfterHeader { - get => _shouldRetryAfterHeader; + get; set { - _shouldRetryAfterHeader = value; + field = value; - if (_shouldRetryAfterHeader) + if (field) { DelayGenerator = args => args.Outcome.Result switch { diff --git a/src/Libraries/Microsoft.Extensions.Http.Resilience/Routing/UriEndpoint.cs b/src/Libraries/Microsoft.Extensions.Http.Resilience/Routing/UriEndpoint.cs index 2373144556d..028eaca7762 100644 --- a/src/Libraries/Microsoft.Extensions.Http.Resilience/Routing/UriEndpoint.cs +++ b/src/Libraries/Microsoft.Extensions.Http.Resilience/Routing/UriEndpoint.cs @@ -6,15 +6,11 @@ namespace Microsoft.Extensions.Http.Resilience; -#pragma warning disable IDE0032 // Use auto property - /// /// Represents a URI-based endpoint. /// public class UriEndpoint { - private Uri? _uri; - /// /// Gets or sets the URL of the endpoint. /// @@ -22,9 +18,5 @@ public class UriEndpoint /// Only schema, domain name, and port are used. The rest of the URL is constructed from the request URL. /// [Required] - public Uri? Uri - { - get => _uri; - set => _uri = value; - } + public Uri? Uri { get; set; } } diff --git a/src/Libraries/Microsoft.Extensions.Telemetry.Abstractions/Http/IDownstreamDependencyMetadata.cs b/src/Libraries/Microsoft.Extensions.Telemetry.Abstractions/Http/IDownstreamDependencyMetadata.cs index daef766c1f2..bed838f31e7 100644 --- a/src/Libraries/Microsoft.Extensions.Telemetry.Abstractions/Http/IDownstreamDependencyMetadata.cs +++ b/src/Libraries/Microsoft.Extensions.Telemetry.Abstractions/Http/IDownstreamDependencyMetadata.cs @@ -13,15 +13,15 @@ public interface IDownstreamDependencyMetadata /// /// Gets the name of the dependent service. /// - public string DependencyName { get; } + string DependencyName { get; } /// /// Gets the list of host name suffixes that can uniquely identify a host as this dependency. 
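For context on the `ShouldRetryAfterHeader` conversion above: the setter keeps a body rather than becoming a plain auto-property because assigning it installs a `DelayGenerator` that honors the server's `Retry-After` header. A brief usage sketch; the specific option values are illustrative:

```csharp
using Microsoft.Extensions.Http.Resilience;

var retry = new HttpRetryStrategyOptions
{
    // Enabled by default; when a response carries Retry-After, that delay is used
    // instead of the configured backoff.
    ShouldRetryAfterHeader = true,
    MaxRetryAttempts = 3, // inherited from the Polly RetryStrategyOptions base
};
```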
/// - public ISet UniqueHostNameSuffixes { get; } + ISet UniqueHostNameSuffixes { get; } /// /// Gets the list of all metadata for all routes to the dependency service. /// - public ISet RequestMetadata { get; } + ISet RequestMetadata { get; } } diff --git a/src/Libraries/Microsoft.Extensions.Telemetry.Abstractions/Logging/LoggerMessageHelper.cs b/src/Libraries/Microsoft.Extensions.Telemetry.Abstractions/Logging/LoggerMessageHelper.cs index bb3631909dc..34d9df008ad 100644 --- a/src/Libraries/Microsoft.Extensions.Telemetry.Abstractions/Logging/LoggerMessageHelper.cs +++ b/src/Libraries/Microsoft.Extensions.Telemetry.Abstractions/Logging/LoggerMessageHelper.cs @@ -18,26 +18,11 @@ namespace Microsoft.Extensions.Logging; [EditorBrowsable(EditorBrowsableState.Never)] public static class LoggerMessageHelper { - [ThreadStatic] - private static LoggerMessageState? _state; - /// /// Gets a thread-local instance of this type. /// - public static LoggerMessageState ThreadLocalState - { - get - { - var result = _state; - if (result == null) - { - result = new(); - _state = result; - } - - return result; - } - } + [field: ThreadStatic] + public static LoggerMessageState ThreadLocalState => field ??= new(); /// /// Enumerates an enumerable into a string. diff --git a/src/Libraries/Microsoft.Extensions.Telemetry/Latency/Internal/CheckpointTracker.cs b/src/Libraries/Microsoft.Extensions.Telemetry/Latency/Internal/CheckpointTracker.cs index ba2c1646760..5cee914c68a 100644 --- a/src/Libraries/Microsoft.Extensions.Telemetry/Latency/Internal/CheckpointTracker.cs +++ b/src/Libraries/Microsoft.Extensions.Telemetry/Latency/Internal/CheckpointTracker.cs @@ -16,12 +16,13 @@ internal sealed class CheckpointTracker : IResettable private readonly Registry _checkpointNames; private readonly int[] _checkpointAdded; private readonly Checkpoint[] _checkpoints; - - private long _timestamp; - private int _numCheckpoints; - public long Elapsed => TimeProvider.GetTimestamp() - _timestamp; + public long Elapsed + { + get => TimeProvider.GetTimestamp() - field; + private set; + } public long Frequency => TimeProvider.TimestampFrequency; @@ -36,7 +37,7 @@ public CheckpointTracker(Registry registry) _checkpointAdded = new int[keyCount]; _checkpoints = new Checkpoint[keyCount]; TimeProvider = TimeProvider.System; - _timestamp = TimeProvider.GetTimestamp(); + Elapsed = TimeProvider.GetTimestamp(); } /// @@ -44,7 +45,7 @@ public CheckpointTracker(Registry registry) /// public bool TryReset() { - _timestamp = TimeProvider.GetTimestamp(); + Elapsed = TimeProvider.GetTimestamp(); _numCheckpoints = 0; Array.Clear(_checkpointAdded, 0, _checkpointAdded.Length); return true; diff --git a/src/Libraries/Microsoft.Extensions.Telemetry/Latency/Internal/LatencyContext.cs b/src/Libraries/Microsoft.Extensions.Telemetry/Latency/Internal/LatencyContext.cs index 0777d03c571..251d7cedc1c 100644 --- a/src/Libraries/Microsoft.Extensions.Telemetry/Latency/Internal/LatencyContext.cs +++ b/src/Libraries/Microsoft.Extensions.Telemetry/Latency/Internal/LatencyContext.cs @@ -22,8 +22,6 @@ internal sealed class LatencyContext : ILatencyContext, IResettable private readonly MeasureTracker _measureTracker; - private long _duration; - public LatencyContext(LatencyContextPool latencyContextPool) { var latencyInstrumentProvider = latencyContextPool.LatencyInstrumentProvider; @@ -36,7 +34,11 @@ public LatencyContext(LatencyContextPool latencyContextPool) public LatencyData LatencyData => IsDisposed ? 
default : new(_tagCollection.Tags, _checkpointTracker.Checkpoints, _measureTracker.Measures, Duration, _checkpointTracker.Frequency); - private long Duration => IsRunning ? _checkpointTracker.Elapsed : _duration; + private long Duration + { + get => IsRunning ? _checkpointTracker.Elapsed : field; + set; + } #region Checkpoints public void AddCheckpoint(CheckpointToken token) @@ -82,7 +84,7 @@ public void Freeze() if (IsRunning) { IsRunning = false; - _duration = _checkpointTracker.Elapsed; + Duration = _checkpointTracker.Elapsed; } } diff --git a/src/Libraries/Microsoft.Extensions.Telemetry/Logging/ExtendedLogger.LegacyTagJoiner.cs b/src/Libraries/Microsoft.Extensions.Telemetry/Logging/ExtendedLogger.LegacyTagJoiner.cs index 46bf77f0816..3f75af17a34 100644 --- a/src/Libraries/Microsoft.Extensions.Telemetry/Logging/ExtendedLogger.LegacyTagJoiner.cs +++ b/src/Libraries/Microsoft.Extensions.Telemetry/Logging/ExtendedLogger.LegacyTagJoiner.cs @@ -21,7 +21,6 @@ internal sealed class LegacyTagJoiner : IReadOnlyList> _extraTags = new(TagCapacity); private IReadOnlyList>? _incomingTags; - private int _incomingTagCount; public LegacyTagJoiner() { @@ -34,7 +33,7 @@ public void Clear() { _extraTags.Clear(); _incomingTags = null; - _incomingTagCount = 0; + Count = 0; State = null; Formatter = null; } @@ -43,7 +42,7 @@ public void Clear() public void SetIncomingTags(IReadOnlyList> value) { _incomingTags = value; - _incomingTagCount = _incomingTags.Count; + Count = _incomingTags.Count; } public KeyValuePair this[int index] @@ -77,7 +76,11 @@ public void SetIncomingTags(IReadOnlyList> value) } } - public int Count => _incomingTagCount + _extraTags.Count + StaticTags!.Length; + public int Count + { + get => field + _extraTags.Count + StaticTags!.Length; + private set; + } public IEnumerator> GetEnumerator() { diff --git a/src/Libraries/Microsoft.Extensions.Telemetry/Logging/ExtendedLogger.ThreadLocals.cs b/src/Libraries/Microsoft.Extensions.Telemetry/Logging/ExtendedLogger.ThreadLocals.cs index 77e1ddc1fd1..21cf296e62c 100644 --- a/src/Libraries/Microsoft.Extensions.Telemetry/Logging/ExtendedLogger.ThreadLocals.cs +++ b/src/Libraries/Microsoft.Extensions.Telemetry/Logging/ExtendedLogger.ThreadLocals.cs @@ -5,43 +5,11 @@ namespace Microsoft.Extensions.Logging; -#pragma warning disable S2696 - internal sealed partial class ExtendedLogger : ILogger { - [ThreadStatic] - private static ModernTagJoiner? _modernJoiner; - - [ThreadStatic] - private static LegacyTagJoiner? 
_legacyJoiner; - - private static ModernTagJoiner ModernJoiner - { - get - { - var joiner = _modernJoiner; - if (joiner == null) - { - joiner = new(); - _modernJoiner = joiner; - } - - return joiner; - } - } - - private static LegacyTagJoiner LegacyJoiner - { - get - { - var joiner = _legacyJoiner; - if (joiner == null) - { - joiner = new(); - _legacyJoiner = joiner; - } + [field: ThreadStatic] + private static ModernTagJoiner ModernJoiner => field ??= new(); - return joiner; - } - } + [field: ThreadStatic] + private static LegacyTagJoiner LegacyJoiner => field ??= new(); } diff --git a/src/Libraries/Microsoft.Extensions.TimeProvider.Testing/FakeTimeProvider.cs b/src/Libraries/Microsoft.Extensions.TimeProvider.Testing/FakeTimeProvider.cs index b9404afa30e..cfe601e9dfa 100644 --- a/src/Libraries/Microsoft.Extensions.TimeProvider.Testing/FakeTimeProvider.cs +++ b/src/Libraries/Microsoft.Extensions.TimeProvider.Testing/FakeTimeProvider.cs @@ -19,7 +19,6 @@ public class FakeTimeProvider : TimeProvider private DateTimeOffset _now = new(2000, 1, 1, 0, 0, 0, 0, TimeSpan.Zero); private TimeZoneInfo _localTimeZone = TimeZoneInfo.Utc; private volatile int _wakeWaitersGate; - private TimeSpan _autoAdvanceAmount; /// /// Initializes a new instance of the class. @@ -62,11 +61,11 @@ public FakeTimeProvider(DateTimeOffset startDateTime) /// The time value is less than . public TimeSpan AutoAdvanceAmount { - get => _autoAdvanceAmount; + get; set { _ = Throw.IfLessThan(value.Ticks, 0); - _autoAdvanceAmount = value; + field = value; } } @@ -78,7 +77,7 @@ public override DateTimeOffset GetUtcNow() lock (Waiters) { result = _now; - _now += _autoAdvanceAmount; + _now += AutoAdvanceAmount; } WakeWaiters(); diff --git a/src/ProjectTemplates/GeneratedContent.targets b/src/ProjectTemplates/GeneratedContent.targets index 31093493821..2d30c4faf5f 100644 --- a/src/ProjectTemplates/GeneratedContent.targets +++ b/src/ProjectTemplates/GeneratedContent.targets @@ -33,9 +33,9 @@ Specifies external packages that get referenced in generated template content. 
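As a quick behavioral check of the `AutoAdvanceAmount` property refactored above (an illustrative, test-style snippet, not part of this diff): each `GetUtcNow()` call returns the current fake time and then advances the clock by the configured amount.

```csharp
using System;
using Microsoft.Extensions.Time.Testing;

var time = new FakeTimeProvider(new DateTimeOffset(2000, 1, 1, 0, 0, 0, TimeSpan.Zero))
{
    AutoAdvanceAmount = TimeSpan.FromSeconds(1),
};

var first = time.GetUtcNow();  // the configured start time
var second = time.GetUtcNow(); // one second later, thanks to AutoAdvanceAmount
Console.WriteLine(second - first); // 00:00:01
```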
--> - 9.4.0 - 9.4.0-preview.1.25378.8 - 2.3.0-beta.1 + 9.5.0 + 9.5.0-preview.1.25474.7 + 2.3.0-beta.2 1.0.0-beta.9 1.14.0 11.6.1 @@ -44,7 +44,7 @@ 9.3.1 1.61.0 1.61.0-preview - 0.3.0-preview.2 + 0.4.0-preview.1 5.1.18 1.12.0 0.1.10 diff --git a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/.template.config/template.json b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/.template.config/template.json index d95f7c41c13..cffbaa7598f 100644 --- a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/.template.config/template.json +++ b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/.template.config/template.json @@ -1,7 +1,7 @@ { "$schema": "http://json.schemastore.org/template", "author": "Microsoft", - "classifications": [ "Common", "AI", "Web", "Blazor", ".NET Aspire" ], + "classifications": [ "Common", "AI", "Web", "Blazor", "Aspire" ], "identity": "Microsoft.Extensions.AI.Templates.WebChat.CSharp", "name": "AI Chat Web App", "description": "A project template for creating an AI chat application, which uses retrieval-augmented generation (RAG) to chat with your own data.", @@ -64,6 +64,12 @@ "*.sln" ] }, + { + "condition": "(!IsAspire || !IsOllama)", + "exclude": [ + "ChatWithCustomData-CSharp.Web/OllamaResilienceHandlerExtensions.cs" + ] + }, { "condition": "(IsAspire)", "exclude": [ @@ -177,7 +183,7 @@ "displayName": "Use Aspire orchestration", "datatype": "bool", "defaultValue": "false", - "description": "Create the project as a distributed application using .NET Aspire." + "description": "Create the project as a distributed application using Aspire." }, "IsAspire": { "type": "computed", diff --git a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.AppHost/Program.cs b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.AppHost/AppHost.cs similarity index 74% rename from src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.AppHost/Program.cs rename to src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.AppHost/AppHost.cs index df2f6765d9e..9eff45dcc39 100644 --- a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.AppHost/Program.cs +++ b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.AppHost/AppHost.cs @@ -1,6 +1,6 @@ var builder = DistributedApplication.CreateBuilder(args); #if (IsOllama) // ASPIRE PARAMETERS -#elif (IsOpenAI || IsGHModels) +#elif (IsOpenAI || IsGHModels || IsAzureOpenAI) // You will need to set the connection string to your own value // You can do this using Visual Studio's "Manage User Secrets" UI, or on the command line: @@ -10,24 +10,9 @@ #elif (IsGHModels) // dotnet user-secrets set ConnectionStrings:openai "Endpoint=https://models.inference.ai.azure.com;Key=YOUR-API-KEY" #else // IsAzureOpenAI -// dotnet user-secrets set ConnectionStrings:openai "Endpoint=https://YOUR-DEPLOYMENT-NAME.openai.azure.com;Key=YOUR-API-KEY" +// dotnet user-secrets set ConnectionStrings:openai "Endpoint=https://YOUR-DEPLOYMENT-NAME.openai.azure.com/openai/v1;Key=YOUR-API-KEY" #endif var openai = builder.AddConnectionString("openai"); -#else // IsAzureOpenAI - -// See https://learn.microsoft.com/dotnet/aspire/azure/local-provisioning#configuration -// for 
instructions providing configuration values -var openai = builder.AddAzureOpenAI("openai"); - -openai.AddDeployment( - name: "gpt-4o-mini", - modelName: "gpt-4o-mini", - modelVersion: "2024-07-18"); - -openai.AddDeployment( - name: "text-embedding-3-small", - modelName: "text-embedding-3-small", - modelVersion: "1"); #endif #if (UseAzureAISearch) @@ -58,12 +43,8 @@ .WithReference(embeddings) .WaitFor(chat) .WaitFor(embeddings); -#elif (IsOpenAI || IsGHModels) +#elif (IsOpenAI || IsGHModels || IsAzureOpenAI) webApp.WithReference(openai); -#else // IsAzureOpenAI -webApp - .WithReference(openai) - .WaitFor(openai); #endif #if (UseAzureAISearch) // VECTOR DATABASE REFERENCES webApp diff --git a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.AppHost/ChatWithCustomData-CSharp.AppHost.csproj.in b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.AppHost/ChatWithCustomData-CSharp.AppHost.csproj.in index 3bbd301af75..8f5b8baabc0 100644 --- a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.AppHost/ChatWithCustomData-CSharp.AppHost.csproj.in +++ b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.AppHost/ChatWithCustomData-CSharp.AppHost.csproj.in @@ -7,7 +7,6 @@ net9.0 enable enable - true b2f4f5e9-1083-472c-8c3b-f055ac67ba54 diff --git a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.AppHost/Properties/launchSettings.json b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.AppHost/Properties/launchSettings.json index cff9159f816..ff3cb400c10 100644 --- a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.AppHost/Properties/launchSettings.json +++ b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.AppHost/Properties/launchSettings.json @@ -9,8 +9,8 @@ "environmentVariables": { "ASPNETCORE_ENVIRONMENT": "Development", "DOTNET_ENVIRONMENT": "Development", - "DOTNET_DASHBOARD_OTLP_ENDPOINT_URL": "https://localhost:21000", - "DOTNET_RESOURCE_SERVICE_ENDPOINT_URL": "https://localhost:22000" + "ASPIRE_DASHBOARD_OTLP_ENDPOINT_URL": "https://localhost:21000", + "ASPIRE_RESOURCE_SERVICE_ENDPOINT_URL": "https://localhost:22000" } }, "http": { @@ -21,8 +21,8 @@ "environmentVariables": { "ASPNETCORE_ENVIRONMENT": "Development", "DOTNET_ENVIRONMENT": "Development", - "DOTNET_DASHBOARD_OTLP_ENDPOINT_URL": "http://localhost:19000", - "DOTNET_RESOURCE_SERVICE_ENDPOINT_URL": "http://localhost:20000" + "ASPIRE_DASHBOARD_OTLP_ENDPOINT_URL": "http://localhost:19000", + "ASPIRE_RESOURCE_SERVICE_ENDPOINT_URL": "http://localhost:20000" } } } diff --git a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.ServiceDefaults/Extensions.cs b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.ServiceDefaults/Extensions.cs index 108f1ed2a08..204f7a64164 100644 --- a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.ServiceDefaults/Extensions.cs +++ b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.ServiceDefaults/Extensions.cs @@ -10,11 +10,14 @@ namespace Microsoft.Extensions.Hosting; 
-// Adds common .NET Aspire services: service discovery, resilience, health checks, and OpenTelemetry. +// Adds common Aspire services: service discovery, resilience, health checks, and OpenTelemetry. // This project should be referenced by each service project in your solution. // To learn more about using this project, see https://aka.ms/dotnet/aspire/service-defaults public static class Extensions { + private const string HealthEndpointPath = "/health"; + private const string AlivenessEndpointPath = "/alive"; + public static TBuilder AddServiceDefaults(this TBuilder builder) where TBuilder : IHostApplicationBuilder { builder.ConfigureOpenTelemetry(); @@ -29,21 +32,8 @@ public static TBuilder AddServiceDefaults(this TBuilder builder) where http.RemoveAllResilienceHandlers(); #pragma warning restore EXTEXP0001 -#if (IsOllama) - // Turn on resilience by default - http.AddStandardResilienceHandler(config => - { - // Extend the HTTP Client timeout for Ollama - config.AttemptTimeout.Timeout = TimeSpan.FromMinutes(3); - - // Must be at least double the AttemptTimeout to pass options validation - config.CircuitBreaker.SamplingDuration = TimeSpan.FromMinutes(10); - config.TotalRequestTimeout.Timeout = TimeSpan.FromMinutes(10); - }); -#else // Turn on resilience by default http.AddStandardResilienceHandler(); -#endif // Turn on service discovery by default http.AddServiceDiscovery(); @@ -77,7 +67,12 @@ public static TBuilder ConfigureOpenTelemetry(this TBuilder builder) w .WithTracing(tracing => { tracing.AddSource(builder.Environment.ApplicationName) - .AddAspNetCoreInstrumentation() + .AddAspNetCoreInstrumentation(tracing => + // Exclude health check requests from tracing + tracing.Filter = context => + !context.Request.Path.StartsWithSegments(HealthEndpointPath) + && !context.Request.Path.StartsWithSegments(AlivenessEndpointPath) + ) // Uncomment the following line to enable gRPC instrumentation (requires the OpenTelemetry.Instrumentation.GrpcNetClient package) //.AddGrpcClientInstrumentation() .AddHttpClientInstrumentation() @@ -124,10 +119,10 @@ public static WebApplication MapDefaultEndpoints(this WebApplication app) if (app.Environment.IsDevelopment()) { // All health checks must pass for app to be considered ready to accept traffic after starting - app.MapHealthChecks("/health"); + app.MapHealthChecks(HealthEndpointPath); // Only health checks tagged with the "live" tag must pass for app to be considered alive - app.MapHealthChecks("/alive", new HealthCheckOptions + app.MapHealthChecks(AlivenessEndpointPath, new HealthCheckOptions { Predicate = r => r.Tags.Contains("live") }); diff --git a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/ChatWithCustomData-CSharp.Web.csproj.in b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/ChatWithCustomData-CSharp.Web.csproj.in index 0e753e8c907..967bc1896e0 100644 --- a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/ChatWithCustomData-CSharp.Web.csproj.in +++ b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/ChatWithCustomData-CSharp.Web.csproj.in @@ -21,12 +21,10 @@ #endif --> - - + - diff --git a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/OllamaResilienceHandlerExtensions.cs 
b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/OllamaResilienceHandlerExtensions.cs new file mode 100644 index 00000000000..fed9c91ca93 --- /dev/null +++ b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/OllamaResilienceHandlerExtensions.cs @@ -0,0 +1,34 @@ +using System; +using Microsoft.Extensions.DependencyInjection; + +namespace ChatWithCustomData_CSharp.Web.Services; + +public static class OllamaResilienceHandlerExtensions +{ + public static IServiceCollection AddOllamaResilienceHandler(this IServiceCollection services) + { + services.ConfigureHttpClientDefaults(http => + { +#pragma warning disable EXTEXP0001 // RemoveAllResilienceHandlers is experimental + http.RemoveAllResilienceHandlers(); +#pragma warning restore EXTEXP0001 + + // Turn on resilience by default + http.AddStandardResilienceHandler(config => + { + // Extend the HTTP Client timeout for Ollama + config.AttemptTimeout.Timeout = TimeSpan.FromMinutes(3); + + // Must be at least double the AttemptTimeout to pass options validation + config.CircuitBreaker.SamplingDuration = TimeSpan.FromMinutes(10); + config.TotalRequestTimeout.Timeout = TimeSpan.FromMinutes(10); + }); + + // Turn on service discovery by default + http.AddServiceDiscovery(); + }); + + return services; + } +} + diff --git a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/Program.Aspire.cs b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/Program.Aspire.cs index 84475f45a54..51fa6598129 100644 --- a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/Program.Aspire.cs +++ b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/Program.Aspire.cs @@ -3,9 +3,8 @@ using ChatWithCustomData_CSharp.Web.Services; using ChatWithCustomData_CSharp.Web.Services.Ingestion; #if (IsOllama) -#elif (IsOpenAI || IsGHModels) +#else using OpenAI; -#else // IsAzureOpenAI #endif var builder = WebApplication.CreateBuilder(args); @@ -22,11 +21,7 @@ .AddEmbeddingGenerator(); #elif (IsAzureAiFoundry) #else // (IsOpenAI || IsAzureOpenAI || IsGHModels) -#if (IsOpenAI) var openai = builder.AddOpenAIClient("openai"); -#else -var openai = builder.AddAzureOpenAIClient("openai"); -#endif openai.AddChatClient("gpt-4o-mini") .UseFunctionInvocation() .UseOpenTelemetry(configure: c => @@ -50,6 +45,13 @@ #endif builder.Services.AddScoped(); builder.Services.AddSingleton(); +#if (IsOllama) +// Applies robust HTTP resilience settings for all HttpClients in the Web project, +// not across the entire solution. It's aimed at supporting Ollama scenarios due +// to its self-hosted nature and potentially slow responses. +// Remove this if you want to use the global or a different HTTP resilience policy instead. 
+builder.Services.AddOllamaResilienceHandler(); +#endif var app = builder.Build(); diff --git a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/Program.cs b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/Program.cs index f8d2826ab9a..16130729c6e 100644 --- a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/Program.cs +++ b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/Program.cs @@ -6,15 +6,13 @@ using Azure; #if (UseManagedIdentity) using Azure.Identity; +using System.ClientModel.Primitives; #endif #endif #if (IsOllama) using OllamaSharp; -#elif (IsOpenAI || IsGHModels) -using OpenAI; -using System.ClientModel; #else -using Azure.AI.OpenAI; +using OpenAI; using System.ClientModel; #endif @@ -61,17 +59,18 @@ #if (!UseManagedIdentity) // dotnet user-secrets set AzureOpenAI:Key YOUR-API-KEY #endif -var azureOpenAi = new AzureOpenAIClient( - new Uri(builder.Configuration["AzureOpenAI:Endpoint"] ?? throw new InvalidOperationException("Missing configuration: AzureOpenAi:Endpoint. See the README for details.")), +var azureEndpoint = builder.Configuration["AzureOpenAI:Endpoint"] ?? throw new InvalidOperationException("Missing configuration: AzureOpenAi:Endpoint. See the README for details."); +var azureOpenAiClient = new OpenAIClient( #if (UseManagedIdentity) - new DefaultAzureCredential()); + new BearerTokenPolicy(new DefaultAzureCredential(), ["https://cognitiveservices.azure.com/.default"]), #else - new ApiKeyCredential(builder.Configuration["AzureOpenAI:Key"] ?? throw new InvalidOperationException("Missing configuration: AzureOpenAi:Key. See the README for details."))); + new ApiKeyCredential(builder.Configuration["AzureOpenAI:Key"] ?? throw new InvalidOperationException("Missing configuration: AzureOpenAi:Key. See the README for details.")), #endif + new OpenAIClientOptions() { Endpoint = new Uri(azureEndpoint.TrimEnd('/') + "/openai/v1") }); #pragma warning disable OPENAI001 // Type is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed. -var chatClient = azureOpenAi.GetOpenAIResponseClient("gpt-4o-mini").AsIChatClient(); +var chatClient = azureOpenAiClient.GetOpenAIResponseClient("gpt-4o-mini").AsIChatClient(); #pragma warning restore OPENAI001 // Type is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed. 
-var embeddingGenerator = azureOpenAi.GetEmbeddingClient("text-embedding-3-small").AsIEmbeddingGenerator(); +var embeddingGenerator = azureOpenAiClient.GetEmbeddingClient("text-embedding-3-small").AsIEmbeddingGenerator(); #endif #if (UseAzureAISearch) diff --git a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/README.md b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/README.md index a828e164ff8..e8115bd182e 100644 --- a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/README.md +++ b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/README.md @@ -130,7 +130,7 @@ Configure your Azure OpenAI endpoint for this project, using .NET User Secrets: ``` #### ---#endif -Make sure to replace `YOUR-AZURE-OPENAI-ENDPOINT` with your actual Azure OpenAI endpoint, formatted like https://YOUR-DEPLOYMENT-NAME.openai.azure.com/ (do not include any path after .openai.azure.com/). +Make sure to replace `YOUR-AZURE-OPENAI-ENDPOINT` with your actual Azure OpenAI endpoint, formatted like https://YOUR-DEPLOYMENT-NAME.openai.azure.com (do not include any path after .openai.azure.com/, the /openai/v1 path will be appended automatically). #### ---#else ### 3. Configure API Key and Endpoint Configure your Azure OpenAI API key and endpoint for this project, using .NET User Secrets: @@ -156,7 +156,7 @@ Configure your Azure OpenAI API key and endpoint for this project, using .NET Us ``` #### ---#endif -Make sure to replace `YOUR-AZURE-OPENAI-KEY` and `YOUR-AZURE-OPENAI-ENDPOINT` with your actual Azure OpenAI key and endpoint. Make sure your endpoint URL is formatted like https://YOUR-DEPLOYMENT-NAME.openai.azure.com/ (do not include any path after .openai.azure.com/). +Make sure to replace `YOUR-AZURE-OPENAI-KEY` and `YOUR-AZURE-OPENAI-ENDPOINT` with your actual Azure OpenAI key and endpoint. Make sure your endpoint URL is formatted like https://YOUR-DEPLOYMENT-NAME.openai.azure.com (do not include any path after .openai.azure.com/, the /openai/v1 path will be appended automatically). 
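To make the endpoint-normalization note above concrete, this is roughly what the generated template code does with the configured value (a sketch; the endpoint string is a placeholder):

```csharp
using System;
using OpenAI;

// The configured endpoint has no path; "/openai/v1" is appended before the client is built.
var azureEndpoint = "https://YOUR-DEPLOYMENT-NAME.openai.azure.com"; // placeholder
var options = new OpenAIClientOptions
{
    Endpoint = new Uri(azureEndpoint.TrimEnd('/') + "/openai/v1"),
};
```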
#### ---#endif #### ---#endif diff --git a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/Services/Ingestion/DataIngestor.cs b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/Services/Ingestion/DataIngestor.cs index 7eeb41c99fb..175cc505670 100644 --- a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/Services/Ingestion/DataIngestor.cs +++ b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/ChatWithCustomData-CSharp.Web/Services/Ingestion/DataIngestor.cs @@ -31,7 +31,7 @@ public async Task IngestDataAsync(IIngestionSource source) var deletedDocuments = await source.GetDeletedDocumentsAsync(documentsForSource); foreach (var deletedDocument in deletedDocuments) { - logger.LogInformation("Removing ingested data for {documentId}", deletedDocument.DocumentId); + logger.LogInformation("Removing ingested data for {DocumentId}", deletedDocument.DocumentId); await DeleteChunksForDocumentAsync(deletedDocument); await documentsCollection.DeleteAsync(deletedDocument.Key); } @@ -39,7 +39,7 @@ public async Task IngestDataAsync(IIngestionSource source) var modifiedDocuments = await source.GetNewOrModifiedDocumentsAsync(documentsForSource); foreach (var modifiedDocument in modifiedDocuments) { - logger.LogInformation("Processing {documentId}", modifiedDocument.DocumentId); + logger.LogInformation("Processing {DocumentId}", modifiedDocument.DocumentId); await DeleteChunksForDocumentAsync(modifiedDocument); await documentsCollection.UpsertAsync(modifiedDocument); @@ -54,7 +54,7 @@ async Task DeleteChunksForDocumentAsync(IngestedDocument document) { var documentId = document.DocumentId; var chunksToDelete = await chunksCollection.GetAsync(record => record.DocumentId == documentId, int.MaxValue).ToListAsync(); - if (chunksToDelete.Any()) + if (chunksToDelete.Count != 0) { await chunksCollection.DeleteAsync(chunksToDelete.Select(r => r.Key)); } diff --git a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/README.Aspire.md b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/README.Aspire.md index 0b1934bfff7..14aa513acae 100644 --- a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/README.Aspire.md +++ b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/ChatWithCustomData/README.Aspire.md @@ -84,38 +84,8 @@ Note: Ollama and Docker are excellent open source products, but are not maintain #### ---#if (IsAzureOpenAI || UseAzureAISearch) ## Using Azure Provisioning -The project is set up to automatically provision Azure resources, but local configuration is configured. For detailed instructions, see the [Local Provisioning documentation](https://learn.microsoft.com/dotnet/aspire/azure/local-provisioning#configuration). +The project is set up to automatically provision Azure resources. When running the app for the first time, you will be prompted to provide Azure configuration values. For detailed instructions, see the [Local Provisioning documentation](https://learn.microsoft.com/dotnet/aspire/azure/local-provisioning#configuration). -#### ---#if (hostIdentifier == "vs") -Configure local provisioning for this project using .NET User Secrets: - -1. In Visual Studio, right-click on the ChatWithCustomData-CSharp.AppHost project in the Solution Explorer and select "Manage User Secrets". -2. 
This opens a `secrets.json` file where you can store your API keys without them being tracked in source control. Add the following configuration: - - ```json - { - "Azure": { - "SubscriptionId": "", - "AllowResourceGroupCreation": true, - "ResourceGroup": "", - "Location": "" - } - } - ``` - -#### ---#else -From the command line, configure local provisioning for this project using .NET User Secrets by running the following commands: - -```sh -cd ChatWithCustomData-CSharp.AppHost -dotnet user-secrets set Azure:SubscriptionId "" -dotnet user-secrets set Azure:AllowResourceGroupCreation "true" -dotnet user-secrets set Azure:ResourceGroup "" -dotnet user-secrets set Azure:Location "" -``` -#### ---#endif - -Make sure to replace placeholder values with real configuration values. #### ---#endif #### ---#if (UseQdrant) @@ -143,9 +113,9 @@ Note: Qdrant and Docker are excellent open source products, but are not maintain ## Trust the localhost certificate -Several .NET Aspire templates include ASP.NET Core projects that are configured to use HTTPS by default. If this is the first time you're running the project, an exception might occur when loading the Aspire dashboard. This error can be resolved by trusting the self-signed development certificate with the .NET CLI. +Several Aspire templates include ASP.NET Core projects that are configured to use HTTPS by default. If this is the first time you're running the project, an exception might occur when loading the Aspire dashboard. This error can be resolved by trusting the self-signed development certificate with the .NET CLI. -See [Troubleshoot untrusted localhost certificate in .NET Aspire](https://learn.microsoft.com/dotnet/aspire/troubleshooting/untrusted-localhost-certificate) for more information. +See [Troubleshoot untrusted localhost certificate in Aspire](https://learn.microsoft.com/dotnet/aspire/troubleshooting/untrusted-localhost-certificate) for more information. 
# Updating JavaScript dependencies diff --git a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/McpServer/McpServer-CSharp/.mcp/server.json b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/McpServer/McpServer-CSharp/.mcp/server.json index 34c19714f79..8a2ad2b07a5 100644 --- a/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/McpServer/McpServer-CSharp/.mcp/server.json +++ b/src/ProjectTemplates/Microsoft.Extensions.AI.Templates/src/McpServer/McpServer-CSharp/.mcp/server.json @@ -1,12 +1,16 @@ { - "$schema": "https://modelcontextprotocol.io/schemas/draft/2025-07-09/server.json", + "$schema": "https://static.modelcontextprotocol.io/schemas/2025-07-09/server.schema.json", "description": "", "name": "io.github./", + "version": "0.1.0-beta", "packages": [ { - "registry_name": "nuget", - "name": "", + "registry_type": "nuget", + "identifier": "", "version": "0.1.0-beta", + "transport": { + "type": "stdio" + }, "package_arguments": [], "environment_variables": [] } @@ -14,8 +18,5 @@ "repository": { "url": "https://github.com//", "source": "github" - }, - "version_detail": { - "version": "0.1.0-beta" } } diff --git a/src/Shared/.editorconfig b/src/Shared/.editorconfig index bc980461f04..defd1d59afc 100644 --- a/src/Shared/.editorconfig +++ b/src/Shared/.editorconfig @@ -5868,7 +5868,7 @@ dotnet_diagnostic.SA1414.severity = warning # Title : Braces for multi-line statements should not share line # Category : StyleCop.CSharp.LayoutRules # Help Link: https://github.com/DotNetAnalyzers/StyleCopAnalyzers/blob/master/documentation/SA1500.md -dotnet_diagnostic.SA1500.severity = warning +dotnet_diagnostic.SA1500.severity = suggestion # rule does not work well with field-based property initializers # Title : Statement should not be on a single line # Category : StyleCop.CSharp.LayoutRules @@ -5936,7 +5936,7 @@ dotnet_diagnostic.SA1512.severity = none # Title : Closing brace should be followed by blank line # Category : StyleCop.CSharp.LayoutRules # Help Link: https://github.com/DotNetAnalyzers/StyleCopAnalyzers/blob/master/documentation/SA1513.md -dotnet_diagnostic.SA1513.severity = warning +dotnet_diagnostic.SA1513.severity = suggestion # rule does not work well with field-based property initializers # Title : Element documentation header should be preceded by blank line # Category : StyleCop.CSharp.LayoutRules diff --git a/src/Shared/Debugger/IDebuggerState.cs b/src/Shared/Debugger/IDebuggerState.cs index 9e5eb7ac0be..bcbe3d13187 100644 --- a/src/Shared/Debugger/IDebuggerState.cs +++ b/src/Shared/Debugger/IDebuggerState.cs @@ -13,5 +13,5 @@ internal interface IDebuggerState /// /// Gets a value indicating whether a debugger is attached or not. /// - public bool IsAttached { get; } + bool IsAttached { get; } } diff --git a/src/Shared/JsonSchemaExporter/JsonSchemaExporter.JsonSchema.cs b/src/Shared/JsonSchemaExporter/JsonSchemaExporter.JsonSchema.cs index a395c133980..5380d208259 100644 --- a/src/Shared/JsonSchemaExporter/JsonSchemaExporter.JsonSchema.cs +++ b/src/Shared/JsonSchemaExporter/JsonSchemaExporter.JsonSchema.cs @@ -35,244 +35,204 @@ private JsonSchema(bool trueOrFalse) public string? Schema { - get => _schema; + get; set { VerifyMutable(); - _schema = value; + field = value; } } - private string? _schema; - public string? Title { - get => _title; + get; set { VerifyMutable(); - _title = value; + field = value; } } - private string? _title; - public string? 
Description { - get => _description; + get; set { VerifyMutable(); - _description = value; + field = value; } } - private string? _description; - public string? Ref { - get => _ref; + get; set { VerifyMutable(); - _ref = value; + field = value; } } - private string? _ref; - public string? Comment { - get => _comment; + get; set { VerifyMutable(); - _comment = value; + field = value; } } - private string? _comment; - public JsonSchemaType Type { - get => _type; + get; set { VerifyMutable(); - _type = value; + field = value; } - } - - private JsonSchemaType _type = JsonSchemaType.Any; + } = JsonSchemaType.Any; public string? Format { - get => _format; + get; set { VerifyMutable(); - _format = value; + field = value; } } - private string? _format; - public string? Pattern { - get => _pattern; + get; set { VerifyMutable(); - _pattern = value; + field = value; } } - private string? _pattern; - public JsonNode? Constant { - get => _constant; + get; set { VerifyMutable(); - _constant = value; + field = value; } } - private JsonNode? _constant; - public List>? Properties { - get => _properties; + get; set { VerifyMutable(); - _properties = value; + field = value; } } - private List>? _properties; - public List? Required { - get => _required; + get; set { VerifyMutable(); - _required = value; + field = value; } } - private List? _required; - public JsonSchema? Items { - get => _items; + get; set { VerifyMutable(); - _items = value; + field = value; } } - private JsonSchema? _items; - public JsonSchema? AdditionalProperties { - get => _additionalProperties; + get; set { VerifyMutable(); - _additionalProperties = value; + field = value; } } - private JsonSchema? _additionalProperties; - public JsonArray? Enum { - get => _enum; + get; set { VerifyMutable(); - _enum = value; + field = value; } } - private JsonArray? _enum; - public JsonSchema? Not { - get => _not; + get; set { VerifyMutable(); - _not = value; + field = value; } } - private JsonSchema? _not; - public List? AnyOf { - get => _anyOf; + get; set { VerifyMutable(); - _anyOf = value; + field = value; } } - private List? _anyOf; - public bool HasDefaultValue { - get => _hasDefaultValue; + get; set { VerifyMutable(); - _hasDefaultValue = value; + field = value; } } - private bool _hasDefaultValue; - public JsonNode? DefaultValue { - get => _defaultValue; + get; set { VerifyMutable(); - _defaultValue = value; + field = value; } } - private JsonNode? _defaultValue; - public int? MinLength { - get => _minLength; + get; set { VerifyMutable(); - _minLength = value; + field = value; } } - private int? _minLength; - public int? MaxLength { - get => _maxLength; + get; set { VerifyMutable(); - _maxLength = value; + field = value; } } - private int? _maxLength; - public JsonSchemaExporterContext? GenerationContext { get; set; } public int KeywordCount diff --git a/src/Shared/ServerSentEvents/SseItem.cs b/src/Shared/ServerSentEvents/SseItem.cs index 9c6092fd3cf..013d2fe9098 100644 --- a/src/Shared/ServerSentEvents/SseItem.cs +++ b/src/Shared/ServerSentEvents/SseItem.cs @@ -17,12 +17,6 @@ internal readonly struct SseItem [EditorBrowsable(EditorBrowsableState.Never)] internal readonly string? _eventType; - /// The event's id. - private readonly string? _eventId; - - /// The event's reconnection interval. - private readonly TimeSpan? _reconnectionInterval; - /// Initializes a new instance of the struct. /// The event's payload. /// The event's type. @@ -48,7 +42,7 @@ public SseItem(T data, string? 
eventType = null) /// Thrown when the value contains a line break. public string? EventId { - get => _eventId; + get; init { if (value.AsSpan().ContainsLineBreaks() is true) @@ -56,7 +50,7 @@ public string? EventId ThrowHelper.ThrowArgumentException_CannotContainLineBreaks(nameof(EventId)); } - _eventId = value; + field = value; } } @@ -66,7 +60,7 @@ public string? EventId /// public TimeSpan? ReconnectionInterval { - get => _reconnectionInterval; + get; init { if (value < TimeSpan.Zero) @@ -74,7 +68,7 @@ public TimeSpan? ReconnectionInterval ThrowHelper.ThrowArgumentException_CannotBeNegative(nameof(ReconnectionInterval)); } - _reconnectionInterval = value; + field = value; } } } diff --git a/test/Generators/Microsoft.Gen.Logging/TestClasses/LogPropertiesExtensions.cs b/test/Generators/Microsoft.Gen.Logging/TestClasses/LogPropertiesExtensions.cs index 013f8d85956..3e0e8683ba5 100644 --- a/test/Generators/Microsoft.Gen.Logging/TestClasses/LogPropertiesExtensions.cs +++ b/test/Generators/Microsoft.Gen.Logging/TestClasses/LogPropertiesExtensions.cs @@ -196,10 +196,10 @@ public MyDerivedClass(double privateFieldValue) internal interface IMyInterface { - public int IntProperty { get; set; } + int IntProperty { get; set; } [LogProperties] - public LeafTransitiveBaseClass? TransitiveProp { get; set; } + LeafTransitiveBaseClass? TransitiveProp { get; set; } } internal sealed class MyInterfaceImpl : IMyInterface diff --git a/test/Libraries/Microsoft.AspNetCore.HeaderParsing.Tests/HeaderParsingFeatureTests.cs b/test/Libraries/Microsoft.AspNetCore.HeaderParsing.Tests/HeaderParsingFeatureTests.cs index ce23cb2dfb0..15000ccb21a 100644 --- a/test/Libraries/Microsoft.AspNetCore.HeaderParsing.Tests/HeaderParsingFeatureTests.cs +++ b/test/Libraries/Microsoft.AspNetCore.HeaderParsing.Tests/HeaderParsingFeatureTests.cs @@ -23,13 +23,10 @@ public sealed class HeaderParsingFeatureTests private readonly IOptions _options; private readonly IServiceCollection _services; private readonly FakeLogger _logger = new(); - private IHeaderRegistry? _registry; - private HttpContext? 
_context; - private IHeaderRegistry Registry => _registry ??= new HeaderRegistry(_services.BuildServiceProvider(), _options); + private IHeaderRegistry Registry => field ??= new HeaderRegistry(_services.BuildServiceProvider(), _options); - private HttpContext Context - => _context ??= new DefaultHttpContext { RequestServices = _services.BuildServiceProvider() }; + private HttpContext Context => field ??= new DefaultHttpContext { RequestServices = _services.BuildServiceProvider() }; public HeaderParsingFeatureTests() { @@ -48,7 +45,7 @@ public void Parses_header() var key = Registry.Register(CommonHeaders.Date); Context.Request.Headers["Date"] = date; - var feature = new HeaderParsingFeature(Registry, _logger, metrics) { Context = Context }; + var feature = new HeaderParsingFeature(_logger, metrics) { Context = Context }; Assert.True(feature.TryGetHeaderValue(key, out var value, out var _)); Assert.Equal(date, value.ToString("R", CultureInfo.InvariantCulture)); @@ -67,7 +64,7 @@ public void Parses_multiple_headers() Context.Request.Headers["Date"] = currentDate; Context.Request.Headers["Test"] = futureDate; - var feature = new HeaderParsingFeature(Registry, _logger, metrics) { Context = Context }; + var feature = new HeaderParsingFeature(_logger, metrics) { Context = Context }; Assert.True(feature.TryGetHeaderValue(key, out var value, out var result)); Assert.Equal(currentDate, value.ToString("R", CultureInfo.InvariantCulture)); @@ -89,7 +86,7 @@ public void Parses_with_late_binding() Context.Request.Headers["Date"] = date; - var feature = new HeaderParsingFeature(Registry, _logger, metrics) { Context = Context }; + var feature = new HeaderParsingFeature(_logger, metrics) { Context = Context }; var key = Registry.Register(CommonHeaders.Date); @@ -103,7 +100,7 @@ public void TryParse_returns_false_on_header_not_found() { using var meter = new Meter(nameof(TryParse_returns_false_on_header_not_found)); var metrics = GetMockedMetrics(meter); - var feature = new HeaderParsingFeature(Registry, _logger, metrics) { Context = Context }; + var feature = new HeaderParsingFeature(_logger, metrics) { Context = Context }; var key = Registry.Register(CommonHeaders.Date); Assert.False(feature.TryGetHeaderValue(key, out var value, out var _)); @@ -120,7 +117,7 @@ public void TryParse_returns_default_on_header_not_found() var date = DateTimeOffset.Now.ToString("R", CultureInfo.InvariantCulture); _options.Value.DefaultValues.Add("Date", date); - var feature = new HeaderParsingFeature(Registry, _logger, metrics) { Context = Context }; + var feature = new HeaderParsingFeature(_logger, metrics) { Context = Context }; var key = Registry.Register(CommonHeaders.Date); Assert.True(feature.TryGetHeaderValue(key, out var value, out var result)); @@ -138,7 +135,7 @@ public void TryParse_returns_false_on_error() using var metricCollector = new MetricCollector(meter, "aspnetcore.header_parsing.parse_errors"); Context.Request.Headers["Date"] = "Not a date."; - var feature = new HeaderParsingFeature(Registry, _logger, metrics) { Context = Context }; + var feature = new HeaderParsingFeature(_logger, metrics) { Context = Context }; var key = Registry.Register(CommonHeaders.Date); Assert.False(feature.TryGetHeaderValue(key, out var value, out var result)); @@ -161,7 +158,7 @@ public void Dispose_resets_state_and_returns_to_pool() var metrics = GetMockedMetrics(meter); var pool = new Mock>(MockBehavior.Strict); - var helper = new HeaderParsingFeature.PoolHelper(pool.Object, Registry, _logger, metrics); + var helper = new 
HeaderParsingFeature.PoolHelper(pool.Object, _logger, metrics); helper.Feature.Context = Context; pool.Setup(x => x.Return(helper)); @@ -195,8 +192,8 @@ public void CachingWorks() Context.Request.Headers[HeaderNames.CacheControl] = "max-age=604800"; - var feature = new HeaderParsingFeature(Registry, _logger, metrics) { Context = Context }; - var feature2 = new HeaderParsingFeature(Registry, _logger, metrics) { Context = Context }; + var feature = new HeaderParsingFeature(_logger, metrics) { Context = Context }; + var feature2 = new HeaderParsingFeature(_logger, metrics) { Context = Context }; var key = Registry.Register(CommonHeaders.CacheControl); Assert.True(feature.TryGetHeaderValue(key, out var value1, out var error1)); diff --git a/test/Libraries/Microsoft.AspNetCore.HeaderParsing.Tests/ParserTests.cs b/test/Libraries/Microsoft.AspNetCore.HeaderParsing.Tests/ParserTests.cs index 0932aab6ac0..1a826f086c6 100644 --- a/test/Libraries/Microsoft.AspNetCore.HeaderParsing.Tests/ParserTests.cs +++ b/test/Libraries/Microsoft.AspNetCore.HeaderParsing.Tests/ParserTests.cs @@ -146,8 +146,8 @@ public void ContentDisposition_ReturnsParsedValue() { var sv = new StringValues("attachment; filename=\"cool.html\""); Assert.True(ContentDispositionHeaderValueParser.Instance.TryParse(sv, out var result, out var error)); - Assert.Equal("cool.html", result.FileName); - Assert.Equal("attachment", result.DispositionType); + Assert.Equal("cool.html", result.FileName.ToString()); + Assert.Equal("attachment", result.DispositionType.ToString()); Assert.Null(error); } @@ -174,8 +174,8 @@ public void MediaType_ReturnsParsedValue() { var sv = new StringValues("text/html; charset=UTF-8"); Assert.True(MediaTypeHeaderValueParser.Instance.TryParse(sv, out var result, out var error)); - Assert.Equal("text/html", result.MediaType); - Assert.Equal("UTF-8", result.Charset); + Assert.Equal("text/html", result.MediaType.ToString()); + Assert.Equal("UTF-8", result.Charset.ToString()); Assert.Null(error); } @@ -203,8 +203,8 @@ public void MediaTypes_ReturnsParsedValue() var sv = new StringValues("text/html; charset=UTF-8"); Assert.True(MediaTypeHeaderValueListParser.Instance.TryParse(sv, out var result, out var error)); Assert.Single(result); - Assert.Equal("text/html", result[0].MediaType); - Assert.Equal("UTF-8", result[0].Charset); + Assert.Equal("text/html", result[0].MediaType.ToString()); + Assert.Equal("UTF-8", result[0].Charset.ToString()); Assert.Null(error); } @@ -223,7 +223,7 @@ public void EntityTag_ReturnsParsedValue() var sv = new StringValues("\"HelloWorld\""); Assert.True(EntityTagHeaderValueListParser.Instance.TryParse(sv, out var result, out var error)); Assert.Single(result!); - Assert.Equal("\"HelloWorld\"", result[0].Tag); + Assert.Equal("\"HelloWorld\"", result[0].Tag.ToString()); Assert.Null(error); } @@ -242,7 +242,7 @@ public void StringQuality_ReturnsParsedValue() var sv = new StringValues("en-US"); Assert.True(StringWithQualityHeaderValueListParser.Instance.TryParse(sv, out var result, out var error)); Assert.Single(result!); - Assert.Equal("en-US", result[0].Value); + Assert.Equal("en-US", result[0].Value.ToString()); Assert.Null(error); } @@ -252,8 +252,8 @@ public void StringQuality_Multi() var sv = new StringValues("en-US,en;q=0.5"); Assert.True(StringWithQualityHeaderValueListParser.Instance.TryParse(sv, out var result, out var error)); Assert.Equal(2, result.Count); - Assert.Equal("en-US", result[0].Value); - Assert.Equal("en", result[1].Value); + Assert.Equal("en-US", 
result[0].Value.ToString()); + Assert.Equal("en", result[1].Value.ToString()); Assert.Equal(0.5, result[1].Quality); Assert.Null(error); } @@ -300,7 +300,7 @@ public void Range_ReturnsParsedValue() { var sv = new StringValues("bytes=200-1000"); Assert.True(RangeHeaderValueParser.Instance.TryParse(sv, out var result, out var error)); - Assert.Equal("bytes", result!.Unit); + Assert.Equal("bytes", result!.Unit.ToString()); Assert.Single(result.Ranges); Assert.Equal(200, result.Ranges.Single().From); Assert.Equal(1000, result.Ranges.Single().To); @@ -341,7 +341,7 @@ public void RangeCondition_ReturnsParsedValue() sv = new StringValues("\"67ab43\""); Assert.True(RangeConditionHeaderValueParser.Instance.TryParse(sv, out result, out error)); - Assert.Equal("\"67ab43\"", result!.EntityTag!.Tag); + Assert.Equal("\"67ab43\"", result!.EntityTag!.Tag.ToString()); Assert.Null(error); } diff --git a/test/Libraries/Microsoft.Extensions.AI.Abstractions.Tests/Microsoft.Extensions.AI.Abstractions.Tests.csproj b/test/Libraries/Microsoft.Extensions.AI.Abstractions.Tests/Microsoft.Extensions.AI.Abstractions.Tests.csproj index d2ae2802123..4c275e54993 100644 --- a/test/Libraries/Microsoft.Extensions.AI.Abstractions.Tests/Microsoft.Extensions.AI.Abstractions.Tests.csproj +++ b/test/Libraries/Microsoft.Extensions.AI.Abstractions.Tests/Microsoft.Extensions.AI.Abstractions.Tests.csproj @@ -29,7 +29,6 @@ - diff --git a/test/Libraries/Microsoft.Extensions.AI.Evaluation.Integration.Tests/Microsoft.Extensions.AI.Evaluation.Integration.Tests.csproj b/test/Libraries/Microsoft.Extensions.AI.Evaluation.Integration.Tests/Microsoft.Extensions.AI.Evaluation.Integration.Tests.csproj index 6e3332ebca6..baf584bd8e4 100644 --- a/test/Libraries/Microsoft.Extensions.AI.Evaluation.Integration.Tests/Microsoft.Extensions.AI.Evaluation.Integration.Tests.csproj +++ b/test/Libraries/Microsoft.Extensions.AI.Evaluation.Integration.Tests/Microsoft.Extensions.AI.Evaluation.Integration.Tests.csproj @@ -20,7 +20,6 @@ - diff --git a/test/Libraries/Microsoft.Extensions.AI.Evaluation.Integration.Tests/Settings.cs b/test/Libraries/Microsoft.Extensions.AI.Evaluation.Integration.Tests/Settings.cs index 25ee9d544cb..7be73a05c10 100644 --- a/test/Libraries/Microsoft.Extensions.AI.Evaluation.Integration.Tests/Settings.cs +++ b/test/Libraries/Microsoft.Extensions.AI.Evaluation.Integration.Tests/Settings.cs @@ -57,16 +57,7 @@ public Settings(IConfiguration config) #pragma warning restore CA2208 } - private static Settings? 
_currentSettings; - - public static Settings Current - { - get - { - _currentSettings ??= GetCurrentSettings(); - return _currentSettings; - } - } + public static Settings Current => field ??= GetCurrentSettings(); private static Settings GetCurrentSettings() { diff --git a/test/Libraries/Microsoft.Extensions.AI.Evaluation.Integration.Tests/Setup.cs b/test/Libraries/Microsoft.Extensions.AI.Evaluation.Integration.Tests/Setup.cs index 388ba1f1415..b7e6b13e6d8 100644 --- a/test/Libraries/Microsoft.Extensions.AI.Evaluation.Integration.Tests/Setup.cs +++ b/test/Libraries/Microsoft.Extensions.AI.Evaluation.Integration.Tests/Setup.cs @@ -3,8 +3,9 @@ using System; using System.ClientModel; -using Azure.AI.OpenAI; +using System.ClientModel.Primitives; using Azure.Identity; +using OpenAI; namespace Microsoft.Extensions.AI.Evaluation.Integration.Tests; @@ -15,22 +16,36 @@ internal static class Setup internal static ChatConfiguration CreateChatConfiguration() { - AzureOpenAIClient azureOpenAIClient = GetAzureOpenAIClient(); - IChatClient chatClient = azureOpenAIClient.GetChatClient(Settings.Current.DeploymentName).AsIChatClient(); + OpenAIClient openAIClient = GetOpenAIClient(); + IChatClient chatClient = openAIClient.GetChatClient(Settings.Current.DeploymentName).AsIChatClient(); return new ChatConfiguration(chatClient); } - private static AzureOpenAIClient GetAzureOpenAIClient() + private static OpenAIClient GetOpenAIClient() { - var endpoint = new Uri(Settings.Current.Endpoint); - AzureOpenAIClientOptions options = new(); + var endpoint = Settings.Current.Endpoint; var credential = new ChainedTokenCredential(new AzureCliCredential(), new DefaultAzureCredential()); - AzureOpenAIClient azureOpenAIClient = + OpenAIClient openAIClient = OfflineOnly - ? new AzureOpenAIClient(endpoint, new ApiKeyCredential("Bogus"), options) - : new AzureOpenAIClient(endpoint, credential, options); + ? new OpenAIClient( + new ApiKeyCredential("Bogus"), + new OpenAIClientOptions + { + Endpoint = new Uri(endpoint.TrimEnd('/') + "/openai/v1") + }) + : +#pragma warning disable OPENAI001 // Type is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed. + new OpenAIClient( + new BearerTokenPolicy( + credential, + "https://cognitiveservices.azure.com/.default"), + new OpenAIClientOptions + { + Endpoint = new Uri(endpoint.TrimEnd('/') + "/openai/v1") + }); +#pragma warning restore OPENAI001 // Type is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed. - return azureOpenAIClient; + return openAIClient; } } diff --git a/test/Libraries/Microsoft.Extensions.AI.Evaluation.Reporting.Tests/Settings.cs b/test/Libraries/Microsoft.Extensions.AI.Evaluation.Reporting.Tests/Settings.cs index 366e4549748..1118d7f6624 100644 --- a/test/Libraries/Microsoft.Extensions.AI.Evaluation.Reporting.Tests/Settings.cs +++ b/test/Libraries/Microsoft.Extensions.AI.Evaluation.Reporting.Tests/Settings.cs @@ -27,16 +27,7 @@ public Settings(IConfiguration config) #pragma warning restore CA2208 } - private static Settings? 
_currentSettings; - - public static Settings Current - { - get - { - _currentSettings ??= GetCurrentSettings(); - return _currentSettings; - } - } + public static Settings Current => field ??= GetCurrentSettings(); private static Settings GetCurrentSettings() { diff --git a/test/Libraries/Microsoft.Extensions.AI.Integration.Tests/ImageGeneratorIntegrationTests.cs b/test/Libraries/Microsoft.Extensions.AI.Integration.Tests/ImageGeneratorIntegrationTests.cs index 8e3078efbb5..76b08941bc5 100644 --- a/test/Libraries/Microsoft.Extensions.AI.Integration.Tests/ImageGeneratorIntegrationTests.cs +++ b/test/Libraries/Microsoft.Extensions.AI.Integration.Tests/ImageGeneratorIntegrationTests.cs @@ -43,13 +43,23 @@ public virtual async Task GenerateImagesAsync_SingleImageGeneration() Assert.NotNull(response); Assert.NotEmpty(response.Contents); - Assert.Single(response.Contents); - var content = response.Contents[0]; - Assert.IsType(content); - var dataContent = (DataContent)content; - Assert.False(dataContent.Data.IsEmpty); - Assert.StartsWith("image/", dataContent.MediaType, StringComparison.Ordinal); + var content = Assert.Single(response.Contents); + switch (content) + { + case UriContent uc: + Assert.StartsWith("http", uc.Uri.Scheme, StringComparison.Ordinal); + break; + + case DataContent dc: + Assert.False(dc.Data.IsEmpty); + Assert.StartsWith("image/", dc.MediaType, StringComparison.Ordinal); + break; + + default: + Assert.Fail($"Unexpected content type: {content.GetType()}"); + break; + } } [ConditionalFact] diff --git a/test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/IntegrationTestHelpers.cs b/test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/IntegrationTestHelpers.cs index 5c98f814868..3c3b2c6ab97 100644 --- a/test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/IntegrationTestHelpers.cs +++ b/test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/IntegrationTestHelpers.cs @@ -3,7 +3,7 @@ using System; using System.ClientModel; -using Azure.AI.OpenAI; +using System.ClientModel.Primitives; using Azure.Identity; using OpenAI; @@ -27,11 +27,25 @@ internal static class IntegrationTestHelpers if (apiKey is not null) { - return new AzureOpenAIClient(new Uri(endpoint), new ApiKeyCredential(apiKey)); + return new OpenAIClient( + new ApiKeyCredential(apiKey), + new OpenAIClientOptions + { + Endpoint = new Uri(endpoint.TrimEnd('/') + "/openai/v1") + }); } else { - return new AzureOpenAIClient(new Uri(endpoint), new DefaultAzureCredential()); +#pragma warning disable OPENAI001 // Type is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed. + return new OpenAIClient( + new BearerTokenPolicy( + new DefaultAzureCredential(), + "https://cognitiveservices.azure.com/.default"), + new OpenAIClientOptions + { + Endpoint = new Uri(endpoint.TrimEnd('/') + "/openai/v1") + }); +#pragma warning restore OPENAI001 // Type is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed. 
} } else if (apiKey is not null) diff --git a/test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/Microsoft.Extensions.AI.OpenAI.Tests.csproj b/test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/Microsoft.Extensions.AI.OpenAI.Tests.csproj index 536c250cb47..dd086887a2c 100644 --- a/test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/Microsoft.Extensions.AI.OpenAI.Tests.csproj +++ b/test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/Microsoft.Extensions.AI.OpenAI.Tests.csproj @@ -32,7 +32,6 @@ - diff --git a/test/Libraries/Microsoft.Extensions.AI.Tests/SpeechToText/OpenTelemetrySpeechToTextClientTests.cs b/test/Libraries/Microsoft.Extensions.AI.Tests/SpeechToText/OpenTelemetrySpeechToTextClientTests.cs new file mode 100644 index 00000000000..c243bf2bf12 --- /dev/null +++ b/test/Libraries/Microsoft.Extensions.AI.Tests/SpeechToText/OpenTelemetrySpeechToTextClientTests.cs @@ -0,0 +1,150 @@ +// Licensed to the .NET Foundation under one or more agreements. +// The .NET Foundation licenses this file to you under the MIT license. + +using System; +using System.Collections.Generic; +using System.Diagnostics; +using System.IO; +using System.Linq; +using System.Runtime.CompilerServices; +using System.Text.RegularExpressions; +using System.Threading; +using System.Threading.Tasks; +using OpenTelemetry.Trace; +using Xunit; + +namespace Microsoft.Extensions.AI; + +public class OpenTelemetrySpeechToTextClientTests +{ + [Fact] + public void InvalidArgs_Throws() + { + Assert.Throws("innerClient", () => new OpenTelemetrySpeechToTextClient(null!)); + } + + [Theory] + [InlineData(false, false)] + [InlineData(false, true)] + [InlineData(true, false)] + [InlineData(true, true)] + public async Task ExpectedInformationLogged_Async(bool streaming, bool enableSensitiveData) + { + var sourceName = Guid.NewGuid().ToString(); + var activities = new List(); + using var tracerProvider = OpenTelemetry.Sdk.CreateTracerProviderBuilder() + .AddSource(sourceName) + .AddInMemoryExporter(activities) + .Build(); + + using var innerClient = new TestSpeechToTextClient + { + GetTextAsyncCallback = async (request, options, cancellationToken) => + { + await Task.Yield(); + return new("This is the recognized text.") + { + Usage = new() + { + InputTokenCount = 10, + OutputTokenCount = 20, + TotalTokenCount = 30, + }, + }; + }, + + GetStreamingTextAsyncCallback = TestClientStreamAsync, + + GetServiceCallback = (serviceType, serviceKey) => + serviceType == typeof(SpeechToTextClientMetadata) ? new SpeechToTextClientMetadata("testservice", new Uri("http://localhost:12345/something"), "amazingmodel") : + null, + }; + + static async IAsyncEnumerable TestClientStreamAsync( + Stream request, SpeechToTextOptions? options, [EnumeratorCancellation] CancellationToken cancellationToken) + { + await Task.Yield(); + yield return new("This is"); + yield return new(" the recognized"); + yield return new() + { + Contents = + [ + new TextContent(" text."), + new UsageContent(new() + { + InputTokenCount = 10, + OutputTokenCount = 20, + TotalTokenCount = 30, + }), + ] + }; + } + + using var client = innerClient + .AsBuilder() + .UseOpenTelemetry(null, sourceName, configure: instance => + { + instance.EnableSensitiveData = enableSensitiveData; + }) + .Build(); + + SpeechToTextOptions options = new() + { + ModelId = "mycoolspeechmodel", + AdditionalProperties = new() + { + ["service_tier"] = "value1", + ["SomethingElse"] = "value2", + }, + }; + + var response = streaming ? 
+ await client.GetStreamingTextAsync(Stream.Null, options).ToSpeechToTextResponseAsync() : + await client.GetTextAsync(Stream.Null, options); + + var activity = Assert.Single(activities); + + Assert.NotNull(activity.Id); + Assert.NotEmpty(activity.Id); + + Assert.Equal("localhost", activity.GetTagItem("server.address")); + Assert.Equal(12345, (int)activity.GetTagItem("server.port")!); + + Assert.Equal("generate_content mycoolspeechmodel", activity.DisplayName); + Assert.Equal("testservice", activity.GetTagItem("gen_ai.provider.name")); + + Assert.Equal("mycoolspeechmodel", activity.GetTagItem("gen_ai.request.model")); + Assert.Equal(enableSensitiveData ? "value1" : null, activity.GetTagItem("service_tier")); + Assert.Equal(enableSensitiveData ? "value2" : null, activity.GetTagItem("SomethingElse")); + + Assert.Equal(10, activity.GetTagItem("gen_ai.usage.input_tokens")); + Assert.Equal(20, activity.GetTagItem("gen_ai.usage.output_tokens")); + + Assert.True(activity.Duration.TotalMilliseconds > 0); + + var tags = activity.Tags.ToDictionary(kvp => kvp.Key, kvp => kvp.Value); + if (enableSensitiveData) + { + Assert.Equal(ReplaceWhitespace(""" + [ + { + "role": "assistant", + "parts": [ + { + "type": "text", + "content": "This is the recognized text." + } + ] + } + ] + """), ReplaceWhitespace(tags["gen_ai.output.messages"])); + } + else + { + Assert.False(tags.ContainsKey("gen_ai.output.messages")); + } + + static string ReplaceWhitespace(string? input) => Regex.Replace(input ?? "", @"\s+", " ").Trim(); + } +} diff --git a/test/Libraries/Microsoft.Extensions.AsyncState.Tests/AsyncContextTests.cs b/test/Libraries/Microsoft.Extensions.AsyncState.Tests/AsyncContextTests.cs index 1ad60e9ad4f..0fcdd415fd1 100644 --- a/test/Libraries/Microsoft.Extensions.AsyncState.Tests/AsyncContextTests.cs +++ b/test/Libraries/Microsoft.Extensions.AsyncState.Tests/AsyncContextTests.cs @@ -7,6 +7,7 @@ using Xunit; namespace Microsoft.Extensions.AsyncState.Test; + public class AsyncContextTests { [Fact] diff --git a/test/Libraries/Microsoft.Extensions.AsyncState.Tests/AsyncStateTokenTests.cs b/test/Libraries/Microsoft.Extensions.AsyncState.Tests/AsyncStateTokenTests.cs index 06a1c30e5b2..ef0a5023cbe 100644 --- a/test/Libraries/Microsoft.Extensions.AsyncState.Tests/AsyncStateTokenTests.cs +++ b/test/Libraries/Microsoft.Extensions.AsyncState.Tests/AsyncStateTokenTests.cs @@ -4,6 +4,7 @@ using Xunit; namespace Microsoft.Extensions.AsyncState.Test; + public class AsyncStateTokenTests { [Fact] diff --git a/test/Libraries/Microsoft.Extensions.AsyncState.Tests/FeaturesPooledPolicyTests.cs b/test/Libraries/Microsoft.Extensions.AsyncState.Tests/FeaturesPooledPolicyTests.cs index 909389acb95..5abd3e3d7cd 100644 --- a/test/Libraries/Microsoft.Extensions.AsyncState.Tests/FeaturesPooledPolicyTests.cs +++ b/test/Libraries/Microsoft.Extensions.AsyncState.Tests/FeaturesPooledPolicyTests.cs @@ -6,6 +6,7 @@ using Xunit; namespace Microsoft.Extensions.AsyncState.Test; + public class FeaturesPooledPolicyTests { [Fact] diff --git a/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/ExpirationTests.cs b/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/ExpirationTests.cs index c33bcb50e98..562ba8ae98f 100644 --- a/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/ExpirationTests.cs +++ b/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/ExpirationTests.cs @@ -10,6 +10,7 @@ using static Microsoft.Extensions.Caching.Hybrid.Tests.L2Tests; namespace Microsoft.Extensions.Caching.Hybrid.Tests; + public class 
ExpirationTests(ITestOutputHelper log) { [Fact] diff --git a/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/FunctionalTests.cs b/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/FunctionalTests.cs index 7b8396eb50d..730013dbe4f 100644 --- a/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/FunctionalTests.cs +++ b/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/FunctionalTests.cs @@ -6,6 +6,7 @@ using Microsoft.Extensions.DependencyInjection; namespace Microsoft.Extensions.Caching.Hybrid.Tests; + public class FunctionalTests : IClassFixture { private static ServiceProvider GetDefaultCache(out DefaultHybridCache cache, Action? config = null) diff --git a/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/L2Tests.cs b/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/L2Tests.cs index 5c9cc2a41c5..f5be5b5277d 100644 --- a/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/L2Tests.cs +++ b/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/L2Tests.cs @@ -11,6 +11,7 @@ using Xunit.Abstractions; namespace Microsoft.Extensions.Caching.Hybrid.Tests; + public class L2Tests(ITestOutputHelper log) : IClassFixture { private static string CreateString(bool work = false) diff --git a/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/LocalInvalidationTests.cs b/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/LocalInvalidationTests.cs index 310f7d5cdce..6efc4b14d45 100644 --- a/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/LocalInvalidationTests.cs +++ b/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/LocalInvalidationTests.cs @@ -9,6 +9,7 @@ using static Microsoft.Extensions.Caching.Hybrid.Tests.L2Tests; namespace Microsoft.Extensions.Caching.Hybrid.Tests; + public class LocalInvalidationTests(ITestOutputHelper log) : IClassFixture { private static ServiceProvider GetDefaultCache(out DefaultHybridCache cache, Action? config = null) diff --git a/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/PayloadTests.cs b/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/PayloadTests.cs index 1f125336dae..a4a9c470551 100644 --- a/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/PayloadTests.cs +++ b/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/PayloadTests.cs @@ -12,6 +12,7 @@ using static Microsoft.Extensions.Caching.Hybrid.Tests.L2Tests; namespace Microsoft.Extensions.Caching.Hybrid.Tests; + public class PayloadTests(ITestOutputHelper log) : IClassFixture { private static ServiceProvider GetDefaultCache(out DefaultHybridCache cache, Action? 
config = null) diff --git a/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/TagSetTests.cs b/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/TagSetTests.cs index 1c63ff5e5c2..818dac7b45c 100644 --- a/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/TagSetTests.cs +++ b/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/TagSetTests.cs @@ -4,6 +4,7 @@ using Microsoft.Extensions.Caching.Hybrid.Internal; namespace Microsoft.Extensions.Caching.Hybrid.Tests; + public class TagSetTests { [Fact] diff --git a/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/TypeTests.cs b/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/TypeTests.cs index c2ab242a6b0..d0176ab4e49 100644 --- a/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/TypeTests.cs +++ b/test/Libraries/Microsoft.Extensions.Caching.Hybrid.Tests/TypeTests.cs @@ -6,6 +6,7 @@ using Microsoft.Extensions.Caching.Hybrid.Internal; namespace Microsoft.Extensions.Caching.Hybrid.Tests; + public class TypeTests { [Theory] diff --git a/test/Libraries/Microsoft.Extensions.Compliance.Abstractions.Tests/Redaction/RedactionAbstractionsExtensionsTest.cs b/test/Libraries/Microsoft.Extensions.Compliance.Abstractions.Tests/Redaction/RedactionAbstractionsExtensionsTest.cs index 52211d96e62..a890b5396c9 100644 --- a/test/Libraries/Microsoft.Extensions.Compliance.Abstractions.Tests/Redaction/RedactionAbstractionsExtensionsTest.cs +++ b/test/Libraries/Microsoft.Extensions.Compliance.Abstractions.Tests/Redaction/RedactionAbstractionsExtensionsTest.cs @@ -24,12 +24,7 @@ public static void When_Passed_Null_Value_String_Builder_Extensions_Does_Not_App var sb = new StringBuilder(); var redactor = NullRedactor.Instance; - sb.AppendRedacted(NullRedactor.Instance, -#if NETCOREAPP3_1_OR_GREATER - null); -#else - (string?)null); -#endif + sb.AppendRedacted(NullRedactor.Instance, null); Assert.Equal(0, sb.Length); } diff --git a/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IAnotherFakeServiceCounter.cs b/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IAnotherFakeServiceCounter.cs index 61fcf232461..9e3227e92d2 100644 --- a/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IAnotherFakeServiceCounter.cs +++ b/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IAnotherFakeServiceCounter.cs @@ -5,5 +5,5 @@ namespace Microsoft.Extensions.DependencyInjection.Test.Helpers; public interface IAnotherFakeServiceCounter { - public int Counter { get; set; } + int Counter { get; set; } } diff --git a/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IFactoryServiceCounter.cs b/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IFactoryServiceCounter.cs index 592a617d32f..a3d7b92cbb2 100644 --- a/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IFactoryServiceCounter.cs +++ b/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IFactoryServiceCounter.cs @@ -5,5 +5,5 @@ namespace Microsoft.Extensions.DependencyInjection.Test.Helpers; public interface IFactoryServiceCounter { - public int Counter { get; set; } + int Counter { get; set; } } diff --git a/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IFakeMultipleCounter.cs 
b/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IFakeMultipleCounter.cs index 772b0be54df..dea82dbdc7d 100644 --- a/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IFakeMultipleCounter.cs +++ b/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IFakeMultipleCounter.cs @@ -5,5 +5,5 @@ namespace Microsoft.Extensions.DependencyInjection.Test.Helpers; public interface IFakeMultipleCounter { - public int Counter { get; set; } + int Counter { get; set; } } diff --git a/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IFakeOpenGenericCounter.cs b/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IFakeOpenGenericCounter.cs index e7bc46ec661..e9573e4e34f 100644 --- a/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IFakeOpenGenericCounter.cs +++ b/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IFakeOpenGenericCounter.cs @@ -5,5 +5,5 @@ namespace Microsoft.Extensions.DependencyInjection.Test.Helpers; public interface IFakeOpenGenericCounter { - public int Counter { get; set; } + int Counter { get; set; } } diff --git a/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IFakeServiceCounter.cs b/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IFakeServiceCounter.cs index d7708a196d6..f8474f20599 100644 --- a/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IFakeServiceCounter.cs +++ b/test/Libraries/Microsoft.Extensions.DependencyInjection.AutoActivation.Tests/Helpers/IFakeServiceCounter.cs @@ -5,5 +5,5 @@ namespace Microsoft.Extensions.DependencyInjection.Test.Helpers; public interface IFakeServiceCounter { - public int Counter { get; set; } + int Counter { get; set; } } diff --git a/test/Libraries/Microsoft.Extensions.Http.Resilience.Tests/Resilience/HttpStandardResilienceOptionsCustomValidatorTests.cs b/test/Libraries/Microsoft.Extensions.Http.Resilience.Tests/Resilience/HttpStandardResilienceOptionsCustomValidatorTests.cs index f7e164339f0..b5e50b9c273 100644 --- a/test/Libraries/Microsoft.Extensions.Http.Resilience.Tests/Resilience/HttpStandardResilienceOptionsCustomValidatorTests.cs +++ b/test/Libraries/Microsoft.Extensions.Http.Resilience.Tests/Resilience/HttpStandardResilienceOptionsCustomValidatorTests.cs @@ -16,6 +16,7 @@ using Xunit; namespace Microsoft.Extensions.Http.Resilience.Test.Resilience; + public class HttpStandardResilienceOptionsCustomValidatorTests { [Fact] diff --git a/test/Libraries/Microsoft.Extensions.Telemetry.Tests/Buffering/LogBufferingFilterRuleTests.cs b/test/Libraries/Microsoft.Extensions.Telemetry.Tests/Buffering/LogBufferingFilterRuleTests.cs index 222d2aacc9a..a3b6bad6340 100644 --- a/test/Libraries/Microsoft.Extensions.Telemetry.Tests/Buffering/LogBufferingFilterRuleTests.cs +++ b/test/Libraries/Microsoft.Extensions.Telemetry.Tests/Buffering/LogBufferingFilterRuleTests.cs @@ -7,6 +7,7 @@ using Xunit; namespace Microsoft.Extensions.Diagnostics.Buffering.Test; + public class LogBufferingFilterRuleTests { private readonly LogBufferingFilterRuleSelector _selector = new(); diff --git a/test/Libraries/Microsoft.Extensions.Telemetry.Tests/Latency/Internal/MockLatencyContextRegistrationOptions.cs b/test/Libraries/Microsoft.Extensions.Telemetry.Tests/Latency/Internal/MockLatencyContextRegistrationOptions.cs index 
6d4ed5908f8..58d2d50d9ba 100644 --- a/test/Libraries/Microsoft.Extensions.Telemetry.Tests/Latency/Internal/MockLatencyContextRegistrationOptions.cs +++ b/test/Libraries/Microsoft.Extensions.Telemetry.Tests/Latency/Internal/MockLatencyContextRegistrationOptions.cs @@ -5,6 +5,7 @@ using Moq; namespace Microsoft.Extensions.Diagnostics.Latency.Test; + internal static class MockLatencyContextRegistrationOptions { public static IOptions GetLatencyContextRegistrationOptions( diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Infrastructure/Project.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Infrastructure/Project.cs index 38ced5b1867..317e81a661f 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Infrastructure/Project.cs +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Infrastructure/Project.cs @@ -8,30 +8,31 @@ namespace Microsoft.Extensions.AI.Templates.Tests; public sealed class Project(string rootPath, string name) { - private string? _startupProjectRelativePath; - private string? _startupProjectFullPath; - public string RootPath => rootPath; public string Name => name; public string? StartupProjectRelativePath { - get => _startupProjectRelativePath; + get; set { if (value is null) { - _startupProjectRelativePath = null; - _startupProjectFullPath = null; + field = null; + StartupProjectFullPath = null!; } - else if (!string.Equals(value, _startupProjectRelativePath, StringComparison.Ordinal)) + else if (!string.Equals(value, field, StringComparison.Ordinal)) { - _startupProjectRelativePath = value; - _startupProjectFullPath = Path.Combine(rootPath, _startupProjectRelativePath); + field = value; + StartupProjectFullPath = Path.Combine(rootPath, field); } } } - public string StartupProjectFullPath => _startupProjectFullPath ?? rootPath; + public string StartupProjectFullPath + { + get => field ?? rootPath; + private set; + } } diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Infrastructure/TestCommandResult.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Infrastructure/TestCommandResult.cs index 09d09d50a1c..4b5e2dd2a28 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Infrastructure/TestCommandResult.cs +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Infrastructure/TestCommandResult.cs @@ -7,12 +7,9 @@ namespace Microsoft.Extensions.AI.Templates.Tests; public sealed class TestCommandResult(StringBuilder standardOutputBuilder, StringBuilder standardErrorBuilder, int exitCode) { - private string? _standardOutput; - private string? 
_standardError; + public string StandardOutput => field ??= standardOutputBuilder.ToString(); - public string StandardOutput => _standardOutput ??= standardOutputBuilder.ToString(); - - public string StandardError => _standardError ??= standardErrorBuilder.ToString(); + public string StandardError => field ??= standardErrorBuilder.ToString(); public int ExitCode => exitCode; } diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/README.md b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/README.md index d1459703de1..a2f61924c32 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/README.md +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/README.md @@ -17,19 +17,8 @@ This incompatibility can be addressed by upgrading to Docker Desktop 4.41.1. See ## Using Azure Provisioning -The project is set up to automatically provision Azure resources, but local configuration is configured. For detailed instructions, see the [Local Provisioning documentation](https://learn.microsoft.com/dotnet/aspire/azure/local-provisioning#configuration). +The project is set up to automatically provision Azure resources. When running the app for the first time, you will be prompted to provide Azure configuration values. For detailed instructions, see the [Local Provisioning documentation](https://learn.microsoft.com/dotnet/aspire/azure/local-provisioning#configuration). -From the command line, configure local provisioning for this project using .NET User Secrets by running the following commands: - -```sh -cd aichatweb.AppHost -dotnet user-secrets set Azure:SubscriptionId "" -dotnet user-secrets set Azure:AllowResourceGroupCreation "true" -dotnet user-secrets set Azure:ResourceGroup "" -dotnet user-secrets set Azure:Location "" -``` - -Make sure to replace placeholder values with real configuration values. # Running the application @@ -47,9 +36,9 @@ Make sure to replace placeholder values with real configuration values. ## Trust the localhost certificate -Several .NET Aspire templates include ASP.NET Core projects that are configured to use HTTPS by default. If this is the first time you're running the project, an exception might occur when loading the Aspire dashboard. This error can be resolved by trusting the self-signed development certificate with the .NET CLI. +Several Aspire templates include ASP.NET Core projects that are configured to use HTTPS by default. If this is the first time you're running the project, an exception might occur when loading the Aspire dashboard. This error can be resolved by trusting the self-signed development certificate with the .NET CLI. -See [Troubleshoot untrusted localhost certificate in .NET Aspire](https://learn.microsoft.com/dotnet/aspire/troubleshooting/untrusted-localhost-certificate) for more information. +See [Troubleshoot untrusted localhost certificate in Aspire](https://learn.microsoft.com/dotnet/aspire/troubleshooting/untrusted-localhost-certificate) for more information. 
# Updating JavaScript dependencies diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.AppHost/AppHost.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.AppHost/AppHost.cs new file mode 100644 index 00000000000..efa75bc7ec6 --- /dev/null +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.AppHost/AppHost.cs @@ -0,0 +1,19 @@ +var builder = DistributedApplication.CreateBuilder(args); + +// You will need to set the connection string to your own value +// You can do this using Visual Studio's "Manage User Secrets" UI, or on the command line: +// cd this-project-directory +// dotnet user-secrets set ConnectionStrings:openai "Endpoint=https://YOUR-DEPLOYMENT-NAME.openai.azure.com/openai/v1;Key=YOUR-API-KEY" +var openai = builder.AddConnectionString("openai"); + +// See https://learn.microsoft.com/dotnet/aspire/azure/local-provisioning#configuration +// for instructions providing configuration values +var search = builder.AddAzureSearch("search"); + +var webApp = builder.AddProject("aichatweb-app"); +webApp.WithReference(openai); +webApp + .WithReference(search) + .WaitFor(search); + +builder.Build().Run(); diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.AppHost/Program.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.AppHost/Program.cs deleted file mode 100644 index da0220a0b1c..00000000000 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.AppHost/Program.cs +++ /dev/null @@ -1,29 +0,0 @@ -var builder = DistributedApplication.CreateBuilder(args); - -// See https://learn.microsoft.com/dotnet/aspire/azure/local-provisioning#configuration -// for instructions providing configuration values -var openai = builder.AddAzureOpenAI("openai"); - -openai.AddDeployment( - name: "gpt-4o-mini", - modelName: "gpt-4o-mini", - modelVersion: "2024-07-18"); - -openai.AddDeployment( - name: "text-embedding-3-small", - modelName: "text-embedding-3-small", - modelVersion: "1"); - -// See https://learn.microsoft.com/dotnet/aspire/azure/local-provisioning#configuration -// for instructions providing configuration values -var search = builder.AddAzureSearch("search"); - -var webApp = builder.AddProject("aichatweb-app"); -webApp - .WithReference(openai) - .WaitFor(openai); -webApp - .WithReference(search) - .WaitFor(search); - -builder.Build().Run(); diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.AppHost/Properties/launchSettings.json b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.AppHost/Properties/launchSettings.json index 4444e808585..681e3bf0d26 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.AppHost/Properties/launchSettings.json +++ 
b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.AppHost/Properties/launchSettings.json @@ -9,8 +9,8 @@ "environmentVariables": { "ASPNETCORE_ENVIRONMENT": "Development", "DOTNET_ENVIRONMENT": "Development", - "DOTNET_DASHBOARD_OTLP_ENDPOINT_URL": "https://localhost:9999", - "DOTNET_RESOURCE_SERVICE_ENDPOINT_URL": "https://localhost:9999" + "ASPIRE_DASHBOARD_OTLP_ENDPOINT_URL": "https://localhost:9999", + "ASPIRE_RESOURCE_SERVICE_ENDPOINT_URL": "https://localhost:9999" } }, "http": { @@ -21,8 +21,8 @@ "environmentVariables": { "ASPNETCORE_ENVIRONMENT": "Development", "DOTNET_ENVIRONMENT": "Development", - "DOTNET_DASHBOARD_OTLP_ENDPOINT_URL": "http://localhost:9999", - "DOTNET_RESOURCE_SERVICE_ENDPOINT_URL": "http://localhost:9999" + "ASPIRE_DASHBOARD_OTLP_ENDPOINT_URL": "http://localhost:9999", + "ASPIRE_RESOURCE_SERVICE_ENDPOINT_URL": "http://localhost:9999" } } } diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.AppHost/aichatweb.AppHost.csproj b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.AppHost/aichatweb.AppHost.csproj index 54bcd4bc3a0..2ba82b12b27 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.AppHost/aichatweb.AppHost.csproj +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.AppHost/aichatweb.AppHost.csproj @@ -1,20 +1,19 @@  - + Exe net9.0 enable enable - true secret - - - + + + diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.ServiceDefaults/Extensions.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.ServiceDefaults/Extensions.cs index f56908872e0..b44d60b604b 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.ServiceDefaults/Extensions.cs +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.ServiceDefaults/Extensions.cs @@ -10,11 +10,14 @@ namespace Microsoft.Extensions.Hosting; -// Adds common .NET Aspire services: service discovery, resilience, health checks, and OpenTelemetry. +// Adds common Aspire services: service discovery, resilience, health checks, and OpenTelemetry. // This project should be referenced by each service project in your solution. 
// To learn more about using this project, see https://aka.ms/dotnet/aspire/service-defaults public static class Extensions { + private const string HealthEndpointPath = "/health"; + private const string AlivenessEndpointPath = "/alive"; + public static TBuilder AddServiceDefaults(this TBuilder builder) where TBuilder : IHostApplicationBuilder { builder.ConfigureOpenTelemetry(); @@ -64,7 +67,12 @@ public static TBuilder ConfigureOpenTelemetry(this TBuilder builder) w .WithTracing(tracing => { tracing.AddSource(builder.Environment.ApplicationName) - .AddAspNetCoreInstrumentation() + .AddAspNetCoreInstrumentation(tracing => + // Exclude health check requests from tracing + tracing.Filter = context => + !context.Request.Path.StartsWithSegments(HealthEndpointPath) + && !context.Request.Path.StartsWithSegments(AlivenessEndpointPath) + ) // Uncomment the following line to enable gRPC instrumentation (requires the OpenTelemetry.Instrumentation.GrpcNetClient package) //.AddGrpcClientInstrumentation() .AddHttpClientInstrumentation() @@ -111,10 +119,10 @@ public static WebApplication MapDefaultEndpoints(this WebApplication app) if (app.Environment.IsDevelopment()) { // All health checks must pass for app to be considered ready to accept traffic after starting - app.MapHealthChecks("/health"); + app.MapHealthChecks(HealthEndpointPath); // Only health checks tagged with the "live" tag must pass for app to be considered alive - app.MapHealthChecks("/alive", new HealthCheckOptions + app.MapHealthChecks(AlivenessEndpointPath, new HealthCheckOptions { Predicate = r => r.Tags.Contains("live") }); diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.Web/Program.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.Web/Program.cs index 450914c4461..ae9a4a3b9a3 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.Web/Program.cs +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.Web/Program.cs @@ -7,7 +7,7 @@ builder.AddServiceDefaults(); builder.Services.AddRazorComponents().AddInteractiveServerComponents(); -var openai = builder.AddAzureOpenAIClient("openai"); +var openai = builder.AddOpenAIClient("openai"); openai.AddChatClient("gpt-4o-mini") .UseFunctionInvocation() .UseOpenTelemetry(configure: c => diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.Web/Services/Ingestion/DataIngestor.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.Web/Services/Ingestion/DataIngestor.cs index 59732141849..2fe43370071 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.Web/Services/Ingestion/DataIngestor.cs +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.Web/Services/Ingestion/DataIngestor.cs @@ -26,7 +26,7 @@ public async Task IngestDataAsync(IIngestionSource source) var deletedDocuments = 
await source.GetDeletedDocumentsAsync(documentsForSource); foreach (var deletedDocument in deletedDocuments) { - logger.LogInformation("Removing ingested data for {documentId}", deletedDocument.DocumentId); + logger.LogInformation("Removing ingested data for {DocumentId}", deletedDocument.DocumentId); await DeleteChunksForDocumentAsync(deletedDocument); await documentsCollection.DeleteAsync(deletedDocument.Key); } @@ -34,7 +34,7 @@ public async Task IngestDataAsync(IIngestionSource source) var modifiedDocuments = await source.GetNewOrModifiedDocumentsAsync(documentsForSource); foreach (var modifiedDocument in modifiedDocuments) { - logger.LogInformation("Processing {documentId}", modifiedDocument.DocumentId); + logger.LogInformation("Processing {DocumentId}", modifiedDocument.DocumentId); await DeleteChunksForDocumentAsync(modifiedDocument); await documentsCollection.UpsertAsync(modifiedDocument); @@ -49,7 +49,7 @@ async Task DeleteChunksForDocumentAsync(IngestedDocument document) { var documentId = document.DocumentId; var chunksToDelete = await chunksCollection.GetAsync(record => record.DocumentId == documentId, int.MaxValue).ToListAsync(); - if (chunksToDelete.Any()) + if (chunksToDelete.Count != 0) { await chunksCollection.DeleteAsync(chunksToDelete.Select(r => r.Key)); } diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.Web/aichatweb.Web.csproj b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.Web/aichatweb.Web.csproj index a4c6ec68b94..f30e7380293 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.Web/aichatweb.Web.csproj +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.AzureOpenAI_Qdrant_Aspire.verified/aichatweb/aichatweb.Web/aichatweb.Web.csproj @@ -8,14 +8,13 @@ - - + - + diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Basic.verified/aichatweb/Services/Ingestion/DataIngestor.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Basic.verified/aichatweb/Services/Ingestion/DataIngestor.cs index 65b520980c1..89fe287ebed 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Basic.verified/aichatweb/Services/Ingestion/DataIngestor.cs +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Basic.verified/aichatweb/Services/Ingestion/DataIngestor.cs @@ -26,7 +26,7 @@ public async Task IngestDataAsync(IIngestionSource source) var deletedDocuments = await source.GetDeletedDocumentsAsync(documentsForSource); foreach (var deletedDocument in deletedDocuments) { - logger.LogInformation("Removing ingested data for {documentId}", deletedDocument.DocumentId); + logger.LogInformation("Removing ingested data for {DocumentId}", deletedDocument.DocumentId); await DeleteChunksForDocumentAsync(deletedDocument); await documentsCollection.DeleteAsync(deletedDocument.Key); } @@ -34,7 +34,7 @@ public async Task IngestDataAsync(IIngestionSource source) var modifiedDocuments = await source.GetNewOrModifiedDocumentsAsync(documentsForSource); foreach (var modifiedDocument in modifiedDocuments) { - logger.LogInformation("Processing {documentId}", 
modifiedDocument.DocumentId); + logger.LogInformation("Processing {DocumentId}", modifiedDocument.DocumentId); await DeleteChunksForDocumentAsync(modifiedDocument); await documentsCollection.UpsertAsync(modifiedDocument); @@ -49,7 +49,7 @@ async Task DeleteChunksForDocumentAsync(IngestedDocument document) { var documentId = document.DocumentId; var chunksToDelete = await chunksCollection.GetAsync(record => record.DocumentId == documentId, int.MaxValue).ToListAsync(); - if (chunksToDelete.Any()) + if (chunksToDelete.Count != 0) { await chunksCollection.DeleteAsync(chunksToDelete.Select(r => r.Key)); } diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/README.md b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/README.md index c05c18281ef..94c542fda7f 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/README.md +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/README.md @@ -43,9 +43,9 @@ Learn more about [prototyping with AI models using GitHub Models](https://docs.g ## Trust the localhost certificate -Several .NET Aspire templates include ASP.NET Core projects that are configured to use HTTPS by default. If this is the first time you're running the project, an exception might occur when loading the Aspire dashboard. This error can be resolved by trusting the self-signed development certificate with the .NET CLI. +Several Aspire templates include ASP.NET Core projects that are configured to use HTTPS by default. If this is the first time you're running the project, an exception might occur when loading the Aspire dashboard. This error can be resolved by trusting the self-signed development certificate with the .NET CLI. -See [Troubleshoot untrusted localhost certificate in .NET Aspire](https://learn.microsoft.com/dotnet/aspire/troubleshooting/untrusted-localhost-certificate) for more information. +See [Troubleshoot untrusted localhost certificate in Aspire](https://learn.microsoft.com/dotnet/aspire/troubleshooting/untrusted-localhost-certificate) for more information. 
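For reference, the certificate trust step described in the README text above is a single .NET CLI command; this is a minimal illustration using the standard `dotnet dev-certs` tooling (not part of this diff):

```bash
# Trust the ASP.NET Core HTTPS development certificate on the local machine
dotnet dev-certs https --trust
```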
# Updating JavaScript dependencies diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.AppHost/Program.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.AppHost/AppHost.cs similarity index 78% rename from test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.AppHost/Program.cs rename to test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.AppHost/AppHost.cs index d41eea07e40..033dad8ba1e 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.AppHost/Program.cs +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.AppHost/AppHost.cs @@ -1,9 +1,11 @@ var builder = DistributedApplication.CreateBuilder(args); +var builder = DistributedApplication.CreateBuilder(args); + // You will need to set the connection string to your own value // You can do this using Visual Studio's "Manage User Secrets" UI, or on the command line: // cd this-project-directory -// dotnet user-secrets set ConnectionStrings:openai "Endpoint=https://models.inference.ai.azure.com;Key=YOUR-API-KEY" +// dotnet user-secrets set ConnectionStrings:openai "Endpoint=https://YOUR-DEPLOYMENT-NAME.openai.azure.com/openai/v1;Key=YOUR-API-KEY" var openai = builder.AddConnectionString("openai"); var webApp = builder.AddProject("aichatweb-app"); diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.AppHost/Properties/launchSettings.json b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.AppHost/Properties/launchSettings.json index 4444e808585..681e3bf0d26 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.AppHost/Properties/launchSettings.json +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.AppHost/Properties/launchSettings.json @@ -9,8 +9,8 @@ "environmentVariables": { "ASPNETCORE_ENVIRONMENT": "Development", "DOTNET_ENVIRONMENT": "Development", - "DOTNET_DASHBOARD_OTLP_ENDPOINT_URL": "https://localhost:9999", - "DOTNET_RESOURCE_SERVICE_ENDPOINT_URL": "https://localhost:9999" + "ASPIRE_DASHBOARD_OTLP_ENDPOINT_URL": "https://localhost:9999", + "ASPIRE_RESOURCE_SERVICE_ENDPOINT_URL": "https://localhost:9999" } }, "http": { @@ -21,8 +21,8 @@ "environmentVariables": { "ASPNETCORE_ENVIRONMENT": "Development", "DOTNET_ENVIRONMENT": "Development", - "DOTNET_DASHBOARD_OTLP_ENDPOINT_URL": "http://localhost:9999", - "DOTNET_RESOURCE_SERVICE_ENDPOINT_URL": "http://localhost:9999" + "ASPIRE_DASHBOARD_OTLP_ENDPOINT_URL": "http://localhost:9999", + "ASPIRE_RESOURCE_SERVICE_ENDPOINT_URL": "http://localhost:9999" } } } diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.AppHost/aichatweb.AppHost.csproj 
b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.AppHost/aichatweb.AppHost.csproj index ffef1abf363..a7c93eb2928 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.AppHost/aichatweb.AppHost.csproj +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.AppHost/aichatweb.AppHost.csproj @@ -1,18 +1,17 @@  - + Exe net9.0 enable enable - true secret - + diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.ServiceDefaults/Extensions.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.ServiceDefaults/Extensions.cs index f56908872e0..b44d60b604b 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.ServiceDefaults/Extensions.cs +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.ServiceDefaults/Extensions.cs @@ -10,11 +10,14 @@ namespace Microsoft.Extensions.Hosting; -// Adds common .NET Aspire services: service discovery, resilience, health checks, and OpenTelemetry. +// Adds common Aspire services: service discovery, resilience, health checks, and OpenTelemetry. // This project should be referenced by each service project in your solution. // To learn more about using this project, see https://aka.ms/dotnet/aspire/service-defaults public static class Extensions { + private const string HealthEndpointPath = "/health"; + private const string AlivenessEndpointPath = "/alive"; + public static TBuilder AddServiceDefaults(this TBuilder builder) where TBuilder : IHostApplicationBuilder { builder.ConfigureOpenTelemetry(); @@ -64,7 +67,12 @@ public static TBuilder ConfigureOpenTelemetry(this TBuilder builder) w .WithTracing(tracing => { tracing.AddSource(builder.Environment.ApplicationName) - .AddAspNetCoreInstrumentation() + .AddAspNetCoreInstrumentation(tracing => + // Exclude health check requests from tracing + tracing.Filter = context => + !context.Request.Path.StartsWithSegments(HealthEndpointPath) + && !context.Request.Path.StartsWithSegments(AlivenessEndpointPath) + ) // Uncomment the following line to enable gRPC instrumentation (requires the OpenTelemetry.Instrumentation.GrpcNetClient package) //.AddGrpcClientInstrumentation() .AddHttpClientInstrumentation() @@ -111,10 +119,10 @@ public static WebApplication MapDefaultEndpoints(this WebApplication app) if (app.Environment.IsDevelopment()) { // All health checks must pass for app to be considered ready to accept traffic after starting - app.MapHealthChecks("/health"); + app.MapHealthChecks(HealthEndpointPath); // Only health checks tagged with the "live" tag must pass for app to be considered alive - app.MapHealthChecks("/alive", new HealthCheckOptions + app.MapHealthChecks(AlivenessEndpointPath, new HealthCheckOptions { Predicate = r => r.Tags.Contains("live") }); diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.Web/Program.cs 
b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.Web/Program.cs index a98905903b3..922b09dc06a 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.Web/Program.cs +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.Web/Program.cs @@ -8,7 +8,7 @@ builder.AddServiceDefaults(); builder.Services.AddRazorComponents().AddInteractiveServerComponents(); -var openai = builder.AddAzureOpenAIClient("openai"); +var openai = builder.AddOpenAIClient("openai"); openai.AddChatClient("gpt-4o-mini") .UseFunctionInvocation() .UseOpenTelemetry(configure: c => diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.Web/Services/Ingestion/DataIngestor.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.Web/Services/Ingestion/DataIngestor.cs index 59732141849..2fe43370071 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.Web/Services/Ingestion/DataIngestor.cs +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.Web/Services/Ingestion/DataIngestor.cs @@ -26,7 +26,7 @@ public async Task IngestDataAsync(IIngestionSource source) var deletedDocuments = await source.GetDeletedDocumentsAsync(documentsForSource); foreach (var deletedDocument in deletedDocuments) { - logger.LogInformation("Removing ingested data for {documentId}", deletedDocument.DocumentId); + logger.LogInformation("Removing ingested data for {DocumentId}", deletedDocument.DocumentId); await DeleteChunksForDocumentAsync(deletedDocument); await documentsCollection.DeleteAsync(deletedDocument.Key); } @@ -34,7 +34,7 @@ public async Task IngestDataAsync(IIngestionSource source) var modifiedDocuments = await source.GetNewOrModifiedDocumentsAsync(documentsForSource); foreach (var modifiedDocument in modifiedDocuments) { - logger.LogInformation("Processing {documentId}", modifiedDocument.DocumentId); + logger.LogInformation("Processing {DocumentId}", modifiedDocument.DocumentId); await DeleteChunksForDocumentAsync(modifiedDocument); await documentsCollection.UpsertAsync(modifiedDocument); @@ -49,7 +49,7 @@ async Task DeleteChunksForDocumentAsync(IngestedDocument document) { var documentId = document.DocumentId; var chunksToDelete = await chunksCollection.GetAsync(record => record.DocumentId == documentId, int.MaxValue).ToListAsync(); - if (chunksToDelete.Any()) + if (chunksToDelete.Count != 0) { await chunksCollection.DeleteAsync(chunksToDelete.Select(r => r.Key)); } diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.Web/aichatweb.Web.csproj b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.Web/aichatweb.Web.csproj index 7fe3b1022ee..04dd8022e0e 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.Web/aichatweb.Web.csproj +++ 
b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.BasicAspire.verified/aichatweb/aichatweb.Web/aichatweb.Web.csproj @@ -8,8 +8,7 @@ - - + diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/README.md b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/README.md index 70e543ffeae..0ef2c04a907 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/README.md +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/README.md @@ -46,9 +46,9 @@ Note: Qdrant and Docker are excellent open source products, but are not maintain ## Trust the localhost certificate -Several .NET Aspire templates include ASP.NET Core projects that are configured to use HTTPS by default. If this is the first time you're running the project, an exception might occur when loading the Aspire dashboard. This error can be resolved by trusting the self-signed development certificate with the .NET CLI. +Several Aspire templates include ASP.NET Core projects that are configured to use HTTPS by default. If this is the first time you're running the project, an exception might occur when loading the Aspire dashboard. This error can be resolved by trusting the self-signed development certificate with the .NET CLI. -See [Troubleshoot untrusted localhost certificate in .NET Aspire](https://learn.microsoft.com/dotnet/aspire/troubleshooting/untrusted-localhost-certificate) for more information. +See [Troubleshoot untrusted localhost certificate in Aspire](https://learn.microsoft.com/dotnet/aspire/troubleshooting/untrusted-localhost-certificate) for more information. 
# Updating JavaScript dependencies diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.AppHost/Program.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.AppHost/AppHost.cs similarity index 100% rename from test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.AppHost/Program.cs rename to test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.AppHost/AppHost.cs diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.AppHost/Properties/launchSettings.json b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.AppHost/Properties/launchSettings.json index 4444e808585..681e3bf0d26 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.AppHost/Properties/launchSettings.json +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.AppHost/Properties/launchSettings.json @@ -9,8 +9,8 @@ "environmentVariables": { "ASPNETCORE_ENVIRONMENT": "Development", "DOTNET_ENVIRONMENT": "Development", - "DOTNET_DASHBOARD_OTLP_ENDPOINT_URL": "https://localhost:9999", - "DOTNET_RESOURCE_SERVICE_ENDPOINT_URL": "https://localhost:9999" + "ASPIRE_DASHBOARD_OTLP_ENDPOINT_URL": "https://localhost:9999", + "ASPIRE_RESOURCE_SERVICE_ENDPOINT_URL": "https://localhost:9999" } }, "http": { @@ -21,8 +21,8 @@ "environmentVariables": { "ASPNETCORE_ENVIRONMENT": "Development", "DOTNET_ENVIRONMENT": "Development", - "DOTNET_DASHBOARD_OTLP_ENDPOINT_URL": "http://localhost:9999", - "DOTNET_RESOURCE_SERVICE_ENDPOINT_URL": "http://localhost:9999" + "ASPIRE_DASHBOARD_OTLP_ENDPOINT_URL": "http://localhost:9999", + "ASPIRE_RESOURCE_SERVICE_ENDPOINT_URL": "http://localhost:9999" } } } diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.AppHost/aichatweb.AppHost.csproj b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.AppHost/aichatweb.AppHost.csproj index 7637a36ca6f..c78ef8c5d91 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.AppHost/aichatweb.AppHost.csproj +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.AppHost/aichatweb.AppHost.csproj @@ -1,19 +1,18 @@  - + Exe net9.0 enable enable - true secret - - + + diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.ServiceDefaults/Extensions.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.ServiceDefaults/Extensions.cs index 81cc28b27d2..b44d60b604b 100644 --- 
a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.ServiceDefaults/Extensions.cs +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.ServiceDefaults/Extensions.cs @@ -10,11 +10,14 @@ namespace Microsoft.Extensions.Hosting; -// Adds common .NET Aspire services: service discovery, resilience, health checks, and OpenTelemetry. +// Adds common Aspire services: service discovery, resilience, health checks, and OpenTelemetry. // This project should be referenced by each service project in your solution. // To learn more about using this project, see https://aka.ms/dotnet/aspire/service-defaults public static class Extensions { + private const string HealthEndpointPath = "/health"; + private const string AlivenessEndpointPath = "/alive"; + public static TBuilder AddServiceDefaults(this TBuilder builder) where TBuilder : IHostApplicationBuilder { builder.ConfigureOpenTelemetry(); @@ -30,15 +33,7 @@ public static TBuilder AddServiceDefaults(this TBuilder builder) where #pragma warning restore EXTEXP0001 // Turn on resilience by default - http.AddStandardResilienceHandler(config => - { - // Extend the HTTP Client timeout for Ollama - config.AttemptTimeout.Timeout = TimeSpan.FromMinutes(3); - - // Must be at least double the AttemptTimeout to pass options validation - config.CircuitBreaker.SamplingDuration = TimeSpan.FromMinutes(10); - config.TotalRequestTimeout.Timeout = TimeSpan.FromMinutes(10); - }); + http.AddStandardResilienceHandler(); // Turn on service discovery by default http.AddServiceDiscovery(); @@ -72,7 +67,12 @@ public static TBuilder ConfigureOpenTelemetry(this TBuilder builder) w .WithTracing(tracing => { tracing.AddSource(builder.Environment.ApplicationName) - .AddAspNetCoreInstrumentation() + .AddAspNetCoreInstrumentation(tracing => + // Exclude health check requests from tracing + tracing.Filter = context => + !context.Request.Path.StartsWithSegments(HealthEndpointPath) + && !context.Request.Path.StartsWithSegments(AlivenessEndpointPath) + ) // Uncomment the following line to enable gRPC instrumentation (requires the OpenTelemetry.Instrumentation.GrpcNetClient package) //.AddGrpcClientInstrumentation() .AddHttpClientInstrumentation() @@ -119,10 +119,10 @@ public static WebApplication MapDefaultEndpoints(this WebApplication app) if (app.Environment.IsDevelopment()) { // All health checks must pass for app to be considered ready to accept traffic after starting - app.MapHealthChecks("/health"); + app.MapHealthChecks(HealthEndpointPath); // Only health checks tagged with the "live" tag must pass for app to be considered alive - app.MapHealthChecks("/alive", new HealthCheckOptions + app.MapHealthChecks(AlivenessEndpointPath, new HealthCheckOptions { Predicate = r => r.Tags.Contains("live") }); diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.Web/OllamaResilienceHandlerExtensions.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.Web/OllamaResilienceHandlerExtensions.cs new file mode 100644 index 00000000000..ae82393302c --- /dev/null +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.Web/OllamaResilienceHandlerExtensions.cs @@ -0,0 
+1,34 @@
+using System;
+using Microsoft.Extensions.DependencyInjection;
+
+namespace aichatweb.Web.Services;
+
+public static class OllamaResilienceHandlerExtensions
+{
+    public static IServiceCollection AddOllamaResilienceHandler(this IServiceCollection services)
+    {
+        services.ConfigureHttpClientDefaults(http =>
+        {
+#pragma warning disable EXTEXP0001 // RemoveAllResilienceHandlers is experimental
+            http.RemoveAllResilienceHandlers();
+#pragma warning restore EXTEXP0001
+
+            // Turn on resilience by default
+            http.AddStandardResilienceHandler(config =>
+            {
+                // Extend the HTTP Client timeout for Ollama
+                config.AttemptTimeout.Timeout = TimeSpan.FromMinutes(3);
+
+                // Must be at least double the AttemptTimeout to pass options validation
+                config.CircuitBreaker.SamplingDuration = TimeSpan.FromMinutes(10);
+                config.TotalRequestTimeout.Timeout = TimeSpan.FromMinutes(10);
+            });
+
+            // Turn on service discovery by default
+            http.AddServiceDiscovery();
+        });
+
+        return services;
+    }
+}
+
diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.Web/Program.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.Web/Program.cs
index cdc88a082b7..c67c70db5d6 100644
--- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.Web/Program.cs
+++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.Web/Program.cs
@@ -20,6 +20,11 @@ builder.Services.AddQdrantCollection("data-aichatweb-documents");
 builder.Services.AddScoped();
 builder.Services.AddSingleton();
+// Applies robust HTTP resilience settings for all HttpClients in the Web project,
+// not across the entire solution. It's aimed at supporting Ollama scenarios due
+// to its self-hosted nature and potentially slow responses.
+// Remove this if you want to use the global or a different HTTP resilience policy instead.
+builder.Services.AddOllamaResilienceHandler(); var app = builder.Build(); diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.Web/Services/Ingestion/DataIngestor.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.Web/Services/Ingestion/DataIngestor.cs index d0f7a6bc3a8..894b85c10de 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.Web/Services/Ingestion/DataIngestor.cs +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.Web/Services/Ingestion/DataIngestor.cs @@ -26,7 +26,7 @@ public async Task IngestDataAsync(IIngestionSource source) var deletedDocuments = await source.GetDeletedDocumentsAsync(documentsForSource); foreach (var deletedDocument in deletedDocuments) { - logger.LogInformation("Removing ingested data for {documentId}", deletedDocument.DocumentId); + logger.LogInformation("Removing ingested data for {DocumentId}", deletedDocument.DocumentId); await DeleteChunksForDocumentAsync(deletedDocument); await documentsCollection.DeleteAsync(deletedDocument.Key); } @@ -34,7 +34,7 @@ public async Task IngestDataAsync(IIngestionSource source) var modifiedDocuments = await source.GetNewOrModifiedDocumentsAsync(documentsForSource); foreach (var modifiedDocument in modifiedDocuments) { - logger.LogInformation("Processing {documentId}", modifiedDocument.DocumentId); + logger.LogInformation("Processing {DocumentId}", modifiedDocument.DocumentId); await DeleteChunksForDocumentAsync(modifiedDocument); await documentsCollection.UpsertAsync(modifiedDocument); @@ -49,7 +49,7 @@ async Task DeleteChunksForDocumentAsync(IngestedDocument document) { var documentId = document.DocumentId; var chunksToDelete = await chunksCollection.GetAsync(record => record.DocumentId == documentId, int.MaxValue).ToListAsync(); - if (chunksToDelete.Any()) + if (chunksToDelete.Count != 0) { await chunksCollection.DeleteAsync(chunksToDelete.Select(r => r.Key)); } diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.Web/aichatweb.Web.csproj b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.Web/aichatweb.Web.csproj index 85dc735b4c4..03fba5231ac 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.Web/aichatweb.Web.csproj +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.Ollama_Qdrant.verified/aichatweb/aichatweb.Web/aichatweb.Web.csproj @@ -14,7 +14,7 @@ - + diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.OpenAI_AzureAISearch.verified/aichatweb/Services/Ingestion/DataIngestor.cs b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.OpenAI_AzureAISearch.verified/aichatweb/Services/Ingestion/DataIngestor.cs index 65b520980c1..89fe287ebed 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.OpenAI_AzureAISearch.verified/aichatweb/Services/Ingestion/DataIngestor.cs +++ 
b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/aichatweb.OpenAI_AzureAISearch.verified/aichatweb/Services/Ingestion/DataIngestor.cs
@@ -26,7 +26,7 @@ public async Task IngestDataAsync(IIngestionSource source)
         var deletedDocuments = await source.GetDeletedDocumentsAsync(documentsForSource);
         foreach (var deletedDocument in deletedDocuments)
         {
-            logger.LogInformation("Removing ingested data for {documentId}", deletedDocument.DocumentId);
+            logger.LogInformation("Removing ingested data for {DocumentId}", deletedDocument.DocumentId);
             await DeleteChunksForDocumentAsync(deletedDocument);
             await documentsCollection.DeleteAsync(deletedDocument.Key);
         }
@@ -34,7 +34,7 @@ public async Task IngestDataAsync(IIngestionSource source)
         var modifiedDocuments = await source.GetNewOrModifiedDocumentsAsync(documentsForSource);
         foreach (var modifiedDocument in modifiedDocuments)
         {
-            logger.LogInformation("Processing {documentId}", modifiedDocument.DocumentId);
+            logger.LogInformation("Processing {DocumentId}", modifiedDocument.DocumentId);
             await DeleteChunksForDocumentAsync(modifiedDocument);

             await documentsCollection.UpsertAsync(modifiedDocument);
@@ -49,7 +49,7 @@ async Task DeleteChunksForDocumentAsync(IngestedDocument document)
     {
         var documentId = document.DocumentId;
         var chunksToDelete = await chunksCollection.GetAsync(record => record.DocumentId == documentId, int.MaxValue).ToListAsync();
-        if (chunksToDelete.Any())
+        if (chunksToDelete.Count != 0)
         {
             await chunksCollection.DeleteAsync(chunksToDelete.Select(r => r.Key));
         }
diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.AotTrue.verified/mcpserver/mcpserver.csproj b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.AotTrue.verified/mcpserver/mcpserver.csproj
index 27a6ad45810..1b2dd939947 100644
--- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.AotTrue.verified/mcpserver/mcpserver.csproj
+++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.AotTrue.verified/mcpserver/mcpserver.csproj
@@ -38,7 +38,7 @@
-
+
diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.Basic.verified/mcpserver/.mcp/server.json b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.Basic.verified/mcpserver/.mcp/server.json
index 02908c09afb..7c7602bea75 100644
--- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.Basic.verified/mcpserver/.mcp/server.json
+++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.Basic.verified/mcpserver/.mcp/server.json
@@ -1,12 +1,16 @@
 {
-  "$schema": "https://modelcontextprotocol.io/schemas/draft/2025-07-09/server.json",
+  "$schema": "https://static.modelcontextprotocol.io/schemas/2025-07-09/server.schema.json",
   "description": "",
   "name": "io.github./",
+  "version": "0.1.0-beta",
   "packages": [
     {
-      "registry_name": "nuget",
-      "name": "",
+      "registry_type": "nuget",
+      "identifier": "",
       "version": "0.1.0-beta",
+      "transport": {
+        "type": "stdio"
+      },
       "package_arguments": [],
       "environment_variables": []
     }
@@ -14,8 +18,5 @@
   "repository": {
     "url": "https://github.com//",
     "source": "github"
-  },
-  "version_detail": {
-    "version": "0.1.0-beta"
   }
 }
diff --git
a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.Basic.verified/mcpserver/mcpserver.csproj b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.Basic.verified/mcpserver/mcpserver.csproj index a3199648740..f6da2d9485e 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.Basic.verified/mcpserver/mcpserver.csproj +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.Basic.verified/mcpserver/mcpserver.csproj @@ -34,7 +34,7 @@ - + diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.SelfContainedFalse.verified/mcpserver/mcpserver.csproj b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.SelfContainedFalse.verified/mcpserver/mcpserver.csproj index cb3812885c6..a25caa73486 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.SelfContainedFalse.verified/mcpserver/mcpserver.csproj +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.SelfContainedFalse.verified/mcpserver/mcpserver.csproj @@ -27,7 +27,7 @@ - + diff --git a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.net10.verified/mcpserver/mcpserver.csproj b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.net10.verified/mcpserver/mcpserver.csproj index 27c45f47efb..393d0558d5e 100644 --- a/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.net10.verified/mcpserver/mcpserver.csproj +++ b/test/ProjectTemplates/Microsoft.Extensions.AI.Templates.IntegrationTests/Snapshots/mcpserver.net10.verified/mcpserver/mcpserver.csproj @@ -34,7 +34,7 @@ - +
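The server.json hunk above migrates the MCP server manifest to the 2025-07-09 schema: `registry_name`/`name` inside `packages` become `registry_type`/`identifier`, the nested `version_detail` object is replaced by a top-level `version`, and each package now declares an explicit `transport`. As a rough sketch only — the server name, package identifier, and repository URL below are hypothetical placeholders, not values from this PR — a filled-in manifest following the new shape would look like:

```json
{
  "$schema": "https://static.modelcontextprotocol.io/schemas/2025-07-09/server.schema.json",
  "description": "Sample MCP server (hypothetical values for illustration)",
  "name": "io.github.contoso/sample-mcp-server",
  "version": "0.1.0-beta",
  "packages": [
    {
      "registry_type": "nuget",
      "identifier": "Contoso.SampleMcpServer",
      "version": "0.1.0-beta",
      "transport": {
        "type": "stdio"
      },
      "package_arguments": [],
      "environment_variables": []
    }
  ],
  "repository": {
    "url": "https://github.com/contoso/sample-mcp-server",
    "source": "github"
  }
}
```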