Content hashing of archives #3482

Merged: 57 commits, Oct 18, 2022
Commits
197565a  content hashing of arhcives (sigurdm, Jul 5, 2022)
628440d  Update repo spec (sigurdm, Jul 8, 2022)
6308de7  Remove os-marker from gzip output (testing only) (sigurdm, Jul 8, 2022)
5b9cc91  Update lib/src/command/get.dart (sigurdm, Aug 5, 2022)
e2b8cf9  Use hex from package:convert (sigurdm, Aug 5, 2022)
7f96995  Rename enforceContentHashes -> checkContentHashes (sigurdm, Aug 5, 2022)
6f6adf0  forEach -> for (sigurdm, Aug 5, 2022)
5f19d8c  Don't type local (sigurdm, Aug 5, 2022)
407c94c  forEach -> for (sigurdm, Aug 5, 2022)
6bf38ae  Always call LockFile.checkContentHashes - change impossible case to a… (sigurdm, Aug 5, 2022)
d75c4a0  Remove get enforce-lockfile (sigurdm, Aug 25, 2022)
09aa64b  Use summary to explain differences if --enforce-lockfile fails (sigurdm, Aug 23, 2022)
7ece91e  Refactor checkContentHashes logic (sigurdm, Aug 23, 2022)
763dca9  Add missing member (sigurdm, Aug 25, 2022)
1144d02  WIP (sigurdm, Aug 30, 2022)
5eb8571  merge (sigurdm, Aug 30, 2022)
e6ea414  Do hash-checks in result (sigurdm, Sep 1, 2022)
11cf1d2  lint (sigurdm, Sep 1, 2022)
fe90a68  fixes (sigurdm, Sep 1, 2022)
ab0e5a4  Document SystemCache.downloadPackage (sigurdm, Sep 1, 2022)
72634e7  test package server hash content-hashes by default (sigurdm, Sep 1, 2022)
ec54824  Fix separator for windows in test (sigurdm, Sep 2, 2022)
1b1765d  Test of package-change-prefixes (sigurdm, Sep 2, 2022)
80f56cd  Link to spec in gzip testing hack (sigurdm, Sep 2, 2022)
040abef  Make replaceOs private (sigurdm, Sep 2, 2022)
660eb21  Simplify gzip hack (sigurdm, Sep 2, 2022)
e99ede5  setSha256 => overrideArchiveSha256 (sigurdm, Sep 2, 2022)
9fa36b2  Merge (sigurdm, Sep 6, 2022)
fdf1e76  Update doc/repository-spec-v2.md (sigurdm, Sep 23, 2022)
76107a6  Update lib/src/solver/result.dart (sigurdm, Sep 27, 2022)
84c5051  Update lib/src/solver/result.dart (sigurdm, Sep 27, 2022)
5142846  Typo (sigurdm, Sep 27, 2022)
0dcb2df  Fix test (sigurdm, Sep 27, 2022)
04f77b7  Merge remote-tracking branch 'sigurdm/content_hashing' into content_h… (sigurdm, Sep 27, 2022)
c76b1bf  Merge remote-tracking branch 'origin/master' into content_hashing (sigurdm, Sep 27, 2022)
206a5a7  Fix test (sigurdm, Sep 27, 2022)
15013b7  Adjust messages, move check, fixed time comparison (sigurdm, Sep 30, 2022)
74cbf4c  Merge (sigurdm, Oct 3, 2022)
f573b61  Update lib/src/utils.dart (sigurdm, Oct 6, 2022)
aedd523  Update lib/src/source/hosted.dart (sigurdm, Oct 6, 2022)
ae331d7  Show all content mismatches (sigurdm, Oct 6, 2022)
79c75be  Update test/content_hash_test.dart (sigurdm, Oct 6, 2022)
2b091aa  Update lib/src/command/dependency_services.dart (sigurdm, Oct 6, 2022)
b1de661  Use fixedTimeBytesEquals one more place (sigurdm, Oct 6, 2022)
142222d  Retry validation errors (sigurdm, Oct 6, 2022)
8f73023  Revert "Update lib/src/source/hosted.dart" (sigurdm, Oct 6, 2022)
8190243  Adjust expectations (sigurdm, Oct 6, 2022)
71a73f0  Merge (sigurdm, Oct 6, 2022)
e93db2a  WIP (sigurdm, Oct 6, 2022)
1145157  Don't add content-hashes in dependency-services unless they are alrea… (sigurdm, Oct 18, 2022)
7a38f4a  Remove unused import (sigurdm, Oct 18, 2022)
da649bf  Dartfmt (sigurdm, Oct 18, 2022)
dc34703  Missing return type (sigurdm, Oct 18, 2022)
dd2e088  Add missing golden (sigurdm, Oct 18, 2022)
a056776  Fix in dependency_services (sigurdm, Oct 18, 2022)
192d66e  Better import (sigurdm, Oct 18, 2022)
6346956  goldens (sigurdm, Oct 18, 2022)

13 changes: 12 additions & 1 deletion doc/repository-spec-v2.md
@@ -120,7 +120,7 @@ parse the `<message>`.
The `dart pub` client allows users to save an opaque `<token>` for each
`<hosted-url>`. When the `dart pub` client makes a request to a `<hosted-url>`
for which it has a `<token>` stored, it will attach an `Authorization` header
as follows:

* `Authorization: Bearer <token>`
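
For illustration only (not part of this diff, and not the pub client's actual code): a minimal Dart sketch of attaching a stored token when requesting a package's version listing from a `<hosted-url>`. The helper name `fetchVersionListing` and the use of `package:http` are assumptions made for this example; the endpoint path and `Accept` header follow the version-2 listing API described in this spec.

```dart
// Illustrative sketch: attach a stored bearer token to a version-listing
// request against a <hosted-url>.
import 'package:http/http.dart' as http;

Future<http.Response> fetchVersionListing(
  String hostedUrl, // e.g. 'https://pub.example.com', no trailing slash
  String package,
  String? token, // token previously saved for this <hosted-url>, if any
) {
  final url = Uri.parse('$hostedUrl/api/packages/$package');
  return http.get(url, headers: {
    // Only attach the header when a token is stored for this <hosted-url>.
    if (token != null) 'Authorization': 'Bearer $token',
    'Accept': 'application/vnd.pub.v2+json',
  });
}
```

As described further down, the same header is also attached when fetching `archive_url`, provided `<hosted-url>` is a prefix of it.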

@@ -229,6 +229,7 @@ server, this could work in many different ways.
"version": "<version>",
"retracted": true || false, /* optional field, false if omitted */
"archive_url": "https://.../archive.tar.gz",
"archive_sha256": "95cbaad58e2cf32d1aa852f20af1fcda1820ead92a4b1447ea7ba1ba18195d27"
"pubspec": {
/* pubspec contents as JSON object */
}
@@ -238,6 +239,7 @@ server, this could work in many different ways.
"version": "<package>",
"retracted": true || false, /* optional field, false if omitted */
"archive_url": "https://.../archive.tar.gz",
"archive_sha256": "95cbaad58e2cf32d1aa852f20af1fcda1820ead92a4b1447ea7ba1ba18195d27"
"pubspec": {
/* pubspec contents as JSON object */
}
@@ -256,6 +258,15 @@ parameters. This allows for the server to return signed-URLs for S3, GCS or
other blob storage services. If temporary URLs are returned it is wise not to set
expiration to less than 25 minutes (to allow for retries and clock drift).

The `archive_sha256` should be the hex-encoded sha256 checksum of the file at
`archive_url`. It is an optional field that allows the pub client to verify the
integrity of the downloaded archive.

The `archive_sha256` also provides an easy way for clients to detect if
something has changed on the server. In the absence of this field the client can
still download the archive to obtain a checksum and detect changes to the
archive.
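
A minimal sketch of such a verification, assuming `package:crypto` for the digest and `hex` from `package:convert` for the encoding (the PR itself uses `hex` from `package:convert`, per the commit list); the helper name `archiveMatchesSha256` is made up for this example and is not pub's implementation:

```dart
// Illustrative sketch: check a downloaded archive against the hex-encoded
// `archive_sha256` value from the version listing.
import 'dart:io';

import 'package:convert/convert.dart' show hex;
import 'package:crypto/crypto.dart' show sha256;

Future<bool> archiveMatchesSha256(
  String archivePath,
  String expectedSha256Hex,
) async {
  // Stream the file through the sha256 converter so the whole archive never
  // has to be held in memory; `bind` emits one Digest when the stream closes.
  final digest = await sha256.bind(File(archivePath).openRead()).first;
  return hex.encode(digest.bytes) == expectedSha256Hex.toLowerCase();
}
```

The PR also introduces a fixed-time byte comparison (`fixedTimeBytesEquals`, per the commit messages above); an ordinary comparison is enough for this sketch.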

If `<hosted-url>` for the server returning `archive_url` is a prefix of
`archive_url`, then the `Authorization: Bearer <token>` is also included when
`archive_url` is requested. Example: if `https://pub.example.com/path` returns
86 changes: 84 additions & 2 deletions lib/src/command/dependency_services.dart
@@ -24,6 +24,7 @@ import '../pubspec.dart';
import '../pubspec_utils.dart';
import '../solver.dart';
import '../source/git.dart';
import '../source/hosted.dart';
import '../system_cache.dart';
import '../utils.dart';

@@ -357,6 +358,7 @@ class DependencyServicesApplyCommand extends PubCommand {
: null;
final lockFileYaml = lockFile == null ? null : loadYaml(lockFile);
final lockFileEditor = lockFile == null ? null : YamlEditor(lockFile);
final hasContentHashes = _lockFileHasContentHashes(lockFileYaml);
for (final p in toApply) {
final targetPackage = p.name;
final targetVersion = p.version;
@@ -394,6 +396,16 @@ class DependencyServicesApplyCommand extends PubCommand {
lockFileYaml['packages'].containsKey(targetPackage)) {
lockFileEditor.update(
['packages', targetPackage, 'version'], targetVersion.toString());
// Remove the now outdated content-hash - it will be restored below
// after resolution.
if (lockFileEditor
.parseAt(['packages', targetPackage, 'description'])
.value
.containsKey('sha256')) {
lockFileEditor.remove(
['packages', targetPackage, 'description', 'sha256'],
);
}
} else if (targetRevision != null &&
lockFileYaml['packages'].containsKey(targetPackage)) {
final ref = entrypoint.lockFile.packages[targetPackage]!.toRef();
@@ -457,8 +469,58 @@ class DependencyServicesApplyCommand extends PubCommand {
writeTextFile(entrypoint.pubspecPath, updatedPubspec);
}
// Only write the resulting lockfile back if we originally had a lock-file.
if (lockFileEditor != null) {
entrypoint.saveLockFile(solveResult);
if (updatedLockfile != null) {
final updatedPackages = <PackageId>[];
for (var package in solveResult.packages) {
if (package.isRoot) continue;
final description = package.description;

// Handle content-hashes of hosted dependencies.
if (description is ResolvedHostedDescription) {
// Ensure we get content-hashes if the original lock-file had
// them.
if (hasContentHashes) {
if (description.sha256 == null) {
// We removed the hash above before resolution - as we get the
// locked id back we need to find the content-hash from the
// version listing.
//
// `pub get` gets this version-listing from the downloaded
// archive but we don't want to download all archives - so we
// copy it from the version listing.
package = (await cache.getVersions(package.toRef()))
.firstWhere((id) => id == package, orElse: () => package);
if ((package.description as ResolvedHostedDescription)
.sha256 ==
null) {
// This happens when we resolved a package from a legacy
// server not providing archive_sha256. As a side-effect of
// downloading the package we compute and store the sha256.
package = await cache.downloadPackage(package);
}
}
} else {
// The original pubspec.lock did not have content-hashes. Remove
// any content hash, so we don't start adding them.
package = PackageId(
package.name,
package.version,
description.withSha256(null),
);
}
}
updatedPackages.add(package);
}

final newLockFile = LockFile(
updatedPackages,
sdkConstraints: updatedLockfile.sdkConstraints,
mainDependencies: pubspec.dependencies.keys.toSet(),
devDependencies: pubspec.devDependencies.keys.toSet(),
overriddenDependencies: pubspec.dependencyOverrides.keys.toSet(),
);

newLockFile.writeToFile(entrypoint.lockFilePath, cache);
}
},
);
@@ -541,3 +603,23 @@ VersionConstraint _compatibleWithIfPossible(VersionRange versionRange) {
}
return versionRange;
}

/// `true` iff any of the packages described by the [lockfile] has a
/// content-hash.
///
/// Undefined for invalid lock files, but mostly `true`.
bool _lockFileHasContentHashes(dynamic lockfile) {
if (lockfile is! Map) return true;
final packages = lockfile['packages'];
if (packages is! Map) return true;

/// We consider an empty lockfile ready to get content-hashes.
if (packages.isEmpty) return true;
for (final package in packages.values) {
if (package is! Map) return true;
final descriptor = package['description'];
if (descriptor is! Map) return true;
if (descriptor['sha256'] != null) return true;
}
return false;
}
1 change: 1 addition & 0 deletions lib/src/command/get.dart
@@ -49,6 +49,7 @@ class GetCommand extends PubCommand {
log.warning(log.yellow(
'The --packages-dir flag is no longer used and does nothing.'));
}

await entrypoint.acquireDependencies(
SolveType.get,
dryRun: argResults['dry-run'],
6 changes: 5 additions & 1 deletion lib/src/command/outdated.dart
@@ -204,7 +204,11 @@ class OutdatedCommand extends PubCommand {
latestIsOverridden = true;
}

final packageStatus = await current?.source.status(current, cache);
final packageStatus = await current?.source.status(
current.toRef(),
current.version,
cache,
);
final discontinued =
packageStatus == null ? false : packageStatus.isDiscontinued;
final discontinuedReplacedBy = packageStatus?.discontinuedReplacedBy;
67 changes: 21 additions & 46 deletions lib/src/entrypoint.dart
@@ -8,7 +8,6 @@ import 'dart:io';
import 'dart:math';

import 'package:collection/collection.dart';
import 'package:meta/meta.dart';
import 'package:path/path.dart' as p;
import 'package:pool/pool.dart';
import 'package:pub_semver/pub_semver.dart';
@@ -31,6 +30,7 @@ import 'pub_embeddable_command.dart';
import 'pubspec.dart';
import 'sdk.dart';
import 'solver.dart';
import 'solver/report.dart';
import 'source/cached.dart';
import 'source/unknown.dart';
import 'system_cache.dart';
@@ -291,11 +291,11 @@ class Entrypoint {
///
/// Performs version resolution according to [SolveType].
///
/// [useLatest], if provided, defines a list of packages that will be
/// unlocked and forced to their latest versions. If [upgradeAll] is
/// true, the previous lockfile is ignored and all packages are re-resolved
/// from scratch. Otherwise, it will attempt to preserve the versions of all
/// previously locked packages.
/// [useLatest], if provided, defines a list of packages that will be unlocked
/// and forced to their latest versions. If [upgradeAll] is true, the previous
/// lockfile is ignored and all packages are re-resolved from scratch.
/// Otherwise, it will attempt to preserve the versions of all previously
/// locked packages.
///
/// Shows a report of the changes made relative to the previous lockfile. If
/// this is an upgrade or downgrade, all transitive dependencies are shown in
@@ -305,8 +305,8 @@
/// If [precompile] is `true` (the default), this snapshots dependencies'
/// executables.
///
/// if [onlyReportSuccessOrFailure] is `true` only success or failure will be shown ---
/// in case of failure, a reproduction command is shown.
/// if [onlyReportSuccessOrFailure] is `true` only success or failure will be
/// shown --- in case of failure, a reproduction command is shown.
///
/// Updates [lockFile] and [packageRoot] accordingly.
Future<void> acquireDependencies(
@@ -365,17 +365,26 @@
}
}

// We have to download files also with --dry-run to ensure we know the
// archive hashes for downloaded files.
final newLockFile = await result.downloadCachedPackages(cache);

final report = SolveReport(
type, root, lockFile, newLockFile, result.availableVersions, cache,
dryRun: dryRun);
if (!onlyReportSuccessOrFailure) {
await result.showReport(type, cache);
await report.show();
}
_lockFile = newLockFile;

if (!dryRun) {
await result.downloadCachedPackages(cache);
saveLockFile(result);
newLockFile.writeToFile(lockFilePath, cache);
}

if (onlyReportSuccessOrFailure) {
log.message('Got dependencies$suffix.');
} else {
await result.summarizeChanges(type, cache, dryRun: dryRun);
await report.summarize();
}

if (!dryRun) {
@@ -833,21 +842,6 @@
}
}

/// Saves a list of concrete package versions to the `pubspec.lock` file.
///
/// Will use Windows line endings (`\r\n`) if a `pubspec.lock` exists, and
/// uses that.
void saveLockFile(SolveResult result) {
_lockFile = result.lockFile;

final windowsLineEndings = fileExists(lockFilePath) &&
detectWindowsLineEndings(readTextFile(lockFilePath));

final serialized = lockFile.serialize(root.dir);
writeTextFile(lockFilePath,
windowsLineEndings ? serialized.replaceAll('\n', '\r\n') : serialized);
}

/// If the entrypoint uses the old-style `.pub` cache directory, migrates it
/// to the new-style `.dart_tool/pub` directory.
void migrateCache() {
@@ -926,22 +920,3 @@ See https://dart.dev/go/sdk-constraint
'"pub" version, please run "$topLevelProgram pub get".');
}
}

/// Returns `true` if the [text] looks like it uses windows line endings.
///
/// The heuristic used is to count all `\n` in the text and if strictly more than
/// half of them are preceded by `\r` we report `true`.
@visibleForTesting
bool detectWindowsLineEndings(String text) {
var index = -1;
var unixNewlines = 0;
var windowsNewlines = 0;
while ((index = text.indexOf('\n', index + 1)) != -1) {
if (index != 0 && text[index - 1] == '\r') {
windowsNewlines++;
} else {
unixNewlines++;
}
}
return windowsNewlines > unixNewlines;
}
34 changes: 19 additions & 15 deletions lib/src/global_packages.dart
@@ -23,6 +23,7 @@ import 'sdk.dart';
import 'sdk/dart.dart';
import 'solver.dart';
import 'solver/incompatibility_cause.dart';
import 'solver/report.dart';
import 'source/cached.dart';
import 'source/git.dart';
import 'source/hosted.dart';
@@ -178,7 +179,7 @@ class GlobalPackages {
final tempDir = cache.createTempDir();
// TODO(rnystrom): Look in "bin" and display list of binaries that
// user can run.
_writeLockFile(tempDir, LockFile([id]));
LockFile([id]).writeToFile(p.join(tempDir, 'pubspec.lock'), cache);

tryDeleteEntry(_packageDir(name));
tryRenameDir(tempDir, _packageDir(name));
@@ -223,24 +224,32 @@ class GlobalPackages {
// We want the entrypoint to be rooted at 'dep' not the dummy-package.
result.packages.removeWhere((id) => id.name == 'pub global activate');

final sameVersions = originalLockFile != null &&
originalLockFile.samePackageIds(result.lockFile);
final lockFile = await result.downloadCachedPackages(cache);
final sameVersions =
originalLockFile != null && originalLockFile.samePackageIds(lockFile);

final PackageId id = result.lockFile.packages[name]!;
final PackageId id = lockFile.packages[name]!;
if (sameVersions) {
log.message('''
The package $name is already activated at newest available version.
To recompile executables, first run `$topLevelProgram pub global deactivate $name`.
''');
} else {
// Only precompile binaries if we have a new resolution.
if (!silent) await result.showReport(SolveType.get, cache);

await result.downloadCachedPackages(cache);
if (!silent) {
await SolveReport(
SolveType.get,
root,
originalLockFile ?? LockFile.empty(),
lockFile,
result.availableVersions,
cache,
dryRun: false,
).show();
}

final lockFile = result.lockFile;
final tempDir = cache.createTempDir();
_writeLockFile(tempDir, lockFile);
lockFile.writeToFile(p.join(tempDir, 'pubspec.lock'), cache);

// Load the package graph from [result] so we don't need to re-parse all
// the pubspecs.
@@ -263,7 +272,7 @@ To recompile executables, first run `$topLevelProgram pub global deactivate $nam
final entrypoint = Entrypoint.global(
_packageDir(id.name),
cache.loadCached(id),
result.lockFile,
lockFile,
cache,
solveResult: result,
);
@@ -276,11 +285,6 @@ To recompile executables, first run `$topLevelProgram pub global deactivate $nam
if (!silent) log.message('Activated ${_formatPackage(id)}.');
}

/// Finishes activating package [package] by saving [lockFile] in the cache.
void _writeLockFile(String dir, LockFile lockFile) {
writeTextFile(p.join(dir, 'pubspec.lock'), lockFile.serialize(null));
}

/// Shows the user the currently active package with [name], if any.
LockFile? _describeActive(String name, SystemCache cache) {
late final LockFile lockFile;