Delete code associated with `--experimental_dynamic_configs=retroactive`.

This flag value is unused and untested outside of a couple unit tests for individual components. Removing it simplifies work on b/185778053, specifically unblocking the removal of `BuildOptions.DiffForReconstruction`.

Also took this opportunity to apply fixes to warnings in the affected files.

PiperOrigin-RevId: 369970226
justinhorvitz authored and copybara-github committed Apr 22, 2021
1 parent 705b419 commit e362fc9
Showing 39 changed files with 243 additions and 3,277 deletions.
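For orientation before the hunks: the deleted flag value corresponds to `CoreOptions.ConfigsMode.RETROACTIVE`, which `BuildConfiguration` exposed through `trimConfigurationsRetroactively()` and which `BuildView.update()` guarded against whenever top-level aspects were requested. The sketch below condenses that pattern into a minimal, self-contained form; the class names and the use of `IllegalStateException` are illustrative stand-ins rather than Bazel APIs.

// Minimal sketch of the pattern this commit deletes. The mode names and the aspect
// guard mirror the removed code in the hunks below; everything else is illustrative.
public final class RetroactiveTrimmingSketch {

  // Stand-in for CoreOptions.ConfigsMode before this commit; RETROACTIVE is the value being removed.
  enum ConfigsMode {
    OFF,
    ON,
    RETROACTIVE
  }

  // Stand-in for BuildConfiguration, keeping only the two mode accessors.
  static final class Configuration {
    private final ConfigsMode configsMode;

    Configuration(ConfigsMode configsMode) {
      this.configsMode = configsMode;
    }

    // Survives the commit: trimming is active only when the mode is ON.
    boolean trimConfigurations() {
      return configsMode == ConfigsMode.ON;
    }

    // Deleted by the commit: nothing outside a few unit tests ever used RETROACTIVE.
    boolean trimConfigurationsRetroactively() {
      return configsMode == ConfigsMode.RETROACTIVE;
    }
  }

  // Mirrors the guard BuildView.update() applied (twice) before configuring top-level
  // aspects; the commit removes both copies along with the mode itself.
  static void checkAspectsAllowed(Configuration config) {
    if (config != null && config.trimConfigurationsRetroactively()) {
      throw new IllegalStateException(
          "Aspects were requested, but are not supported in retroactive trimming mode.");
    }
  }

  public static void main(String[] args) {
    Configuration on = new Configuration(ConfigsMode.ON);
    System.out.println("trimming enabled: " + on.trimConfigurations());
    checkAspectsAllowed(on); // Passes silently.
    try {
      checkAspectsAllowed(new Configuration(ConfigsMode.RETROACTIVE));
    } catch (IllegalStateException e) {
      System.out.println("Rejected: " + e.getMessage()); // What the removed guard used to do.
    }
  }
}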
2 changes: 0 additions & 2 deletions src/main/java/com/google/devtools/build/lib/analysis/BUILD
@@ -1581,10 +1581,8 @@ java_library(
":config/fragment_options",
"//src/main/java/com/google/devtools/build/lib/cmdline",
"//src/main/java/com/google/devtools/build/lib/skyframe/serialization/autocodec",
"//src/main/java/com/google/devtools/build/lib/skyframe/trimming:trimmed_configuration_cache",
"//src/main/java/com/google/devtools/build/lib/util",
"//src/main/java/com/google/devtools/common/options",
"//third_party:flogger",
"//third_party:guava",
"//third_party:jsr305",
"//third_party/protobuf:protobuf_java",
114 changes: 44 additions & 70 deletions src/main/java/com/google/devtools/build/lib/analysis/BuildView.java
@@ -100,49 +100,44 @@
import javax.annotation.Nullable;

/**
* The BuildView presents a semantically-consistent and transitively-closed
* dependency graph for some set of packages.
* The BuildView presents a semantically-consistent and transitively-closed dependency graph for
* some set of packages.
*
* <h2>Package design</h2>
*
* <p>This package contains the Blaze dependency analysis framework (aka
* "analysis phase"). The goal of this code is to perform semantic analysis of
* all of the build targets required for a given build, to report
* errors/warnings for any problems in the input, and to construct an "action
* graph" (see {@code lib.actions} package) correctly representing the work to
* be done during the execution phase of the build.
* <p>This package contains the Blaze dependency analysis framework (aka "analysis phase"). The goal
* of this code is to perform semantic analysis of all of the build targets required for a given
* build, to report errors/warnings for any problems in the input, and to construct an "action
* graph" (see {@code lib.actions} package) correctly representing the work to be done during the
* execution phase of the build.
*
* <p><b>Configurations</b> the inputs to a build come from two sources: the
* intrinsic inputs, specified in the BUILD file, are called <em>targets</em>.
* The environmental inputs, coming from the build tool, the command-line, or
* configuration files, are called the <em>configuration</em>. Only when a
* target and a configuration are combined is there sufficient information to
* perform a build. </p>
* <p><b>Configurations</b> the inputs to a build come from two sources: the intrinsic inputs,
* specified in the BUILD file, are called <em>targets</em>. The environmental inputs, coming from
* the build tool, the command-line, or configuration files, are called the <em>configuration</em>.
* Only when a target and a configuration are combined is there sufficient information to perform a
* build.
*
* <p>Targets are implemented by the {@link Target} hierarchy in the {@code
* lib.packages} code. Configurations are implemented by {@link
* BuildConfiguration}. The pair of these together is represented by an
* instance of class {@link ConfiguredTarget}; this is the root of a hierarchy
* with different implementations for each kind of target: source file, derived
* file, rules, etc.
* <p>Targets are implemented by the {@link Target} hierarchy in the {@code lib.packages} code.
* Configurations are implemented by {@link BuildConfiguration}. The pair of these together is
* represented by an instance of class {@link ConfiguredTarget}; this is the root of a hierarchy
* with different implementations for each kind of target: source file, derived file, rules, etc.
*
* <p>The framework code in this package (as opposed to its subpackages) is responsible for
* constructing the {@code ConfiguredTarget} graph for a given target and configuration, taking care
* of such issues as:
*
* <p>The framework code in this package (as opposed to its subpackages) is
* responsible for constructing the {@code ConfiguredTarget} graph for a given
* target and configuration, taking care of such issues as:
* <ul>
* <li>caching common subgraphs.
* <li>detecting and reporting cycles.
* <li>correct propagation of errors through the graph.
* <li>reporting universal errors, such as dependencies from production code
* to tests, or to experimental branches.
* <li>reporting universal errors, such as dependencies from production code to tests, or to
* experimental branches.
* <li>capturing and replaying errors.
* <li>maintaining the graph from one build to the next to
* avoid unnecessary recomputation.
* <li>maintaining the graph from one build to the next to avoid unnecessary recomputation.
* <li>checking software licenses.
* </ul>
*
* <p>See also {@link ConfiguredTarget} which documents some important
* invariants.
* <p>See also {@link ConfiguredTarget} which documents some important invariants.
*/
public class BuildView {
private static final GoogleLogger logger = GoogleLogger.forEnclosingClass();
@@ -154,12 +149,11 @@ public class BuildView {

private final ConfiguredRuleClassProvider ruleClassProvider;

/**
* A factory class to create the coverage report action. May be null.
*/
/** A factory class to create the coverage report action. May be null. */
@Nullable private final CoverageReportActionFactory coverageReportActionFactory;

public BuildView(BlazeDirectories directories,
public BuildView(
BlazeDirectories directories,
ConfiguredRuleClassProvider ruleClassProvider,
SkyframeExecutor skyframeExecutor,
CoverageReportActionFactory coverageReportActionFactory) {
@@ -191,8 +185,7 @@ private ArtifactFactory getArtifactFactory() {
@VisibleForTesting
static LinkedHashSet<ConfiguredTarget> filterTestsByTargets(
Collection<ConfiguredTarget> targets, Set<Label> allowedTargetLabels) {
return targets
.stream()
return targets.stream()
.filter(ct -> allowedTargetLabels.contains(ct.getLabel()))
.collect(Collectors.toCollection(LinkedHashSet::new));
}
@@ -228,8 +221,9 @@ public AnalysisResult update(
if (viewOptions.skyframePrepareAnalysis) {
PrepareAnalysisPhaseValue prepareAnalysisPhaseValue;
try (SilentCloseable c = Profiler.instance().profile("Prepare analysis phase")) {
prepareAnalysisPhaseValue = skyframeExecutor.prepareAnalysisPhase(
eventHandler, targetOptions, multiCpu, loadingResult.getTargetLabels());
prepareAnalysisPhaseValue =
skyframeExecutor.prepareAnalysisPhase(
eventHandler, targetOptions, multiCpu, loadingResult.getTargetLabels());

// Determine the configurations
configurations =
@@ -243,12 +237,7 @@ public AnalysisResult update(
// needed. This requires cleaning up the invalidation in SkyframeBuildView.setConfigurations.
try (SilentCloseable c = Profiler.instance().profile("createConfigurations")) {
configurations =
skyframeExecutor
.createConfigurations(
eventHandler,
targetOptions,
multiCpu,
keepGoing);
skyframeExecutor.createConfigurations(eventHandler, targetOptions, multiCpu, keepGoing);
}
try (SilentCloseable c = Profiler.instance().profile("AnalysisUtils.getTargetsWithConfigs")) {
topLevelTargetsWithConfigsResult =
@@ -261,10 +250,9 @@ public AnalysisResult update(
eventHandler, configurations, viewOptions.maxConfigChangesToShow);

if (configurations.getTargetConfigurations().size() == 1) {
eventBus
.post(
new MakeEnvironmentEvent(
configurations.getTargetConfigurations().get(0).getMakeEnvironment()));
eventBus.post(
new MakeEnvironmentEvent(
configurations.getTargetConfigurations().get(0).getMakeEnvironment()));
}
for (BuildConfiguration targetConfig : configurations.getTargetConfigurations()) {
eventBus.post(targetConfig.toBuildEvent());
@@ -274,8 +262,7 @@ public AnalysisResult update(
topLevelTargetsWithConfigsResult.getTargetsAndConfigs();

// Report the generated association of targets to configurations
Multimap<Label, BuildConfiguration> byLabel =
ArrayListMultimap.<Label, BuildConfiguration>create();
Multimap<Label, BuildConfiguration> byLabel = ArrayListMultimap.create();
for (TargetAndConfiguration pair : topLevelTargetsWithConfigs) {
byLabel.put(pair.getLabel(), pair.getConfiguration());
}
@@ -334,13 +321,6 @@ public AnalysisResult update(

String starlarkFunctionName = aspect.substring(delimiterPosition + 1);
for (TargetAndConfiguration targetSpec : topLevelTargetsWithConfigs) {
if (targetSpec.getConfiguration() != null
&& targetSpec.getConfiguration().trimConfigurationsRetroactively()) {
String errorMessage =
"Aspects were requested, but are not supported in retroactive trimming mode.";
throw new ViewCreationFailedException(
errorMessage, createFailureDetail(errorMessage, Analysis.Code.ASPECT_PREREQ_UNMET));
}
aspectConfigurations.put(
Pair.of(targetSpec.getLabel(), aspect), targetSpec.getConfiguration());
aspectKeys.add(
@@ -359,14 +339,6 @@ public AnalysisResult update(

if (aspectFactoryClass != null) {
for (TargetAndConfiguration targetSpec : topLevelTargetsWithConfigs) {
if (targetSpec.getConfiguration() != null
&& targetSpec.getConfiguration().trimConfigurationsRetroactively()) {
String errorMessage =
"Aspects were requested, but are not supported in retroactive trimming mode.";
throw new ViewCreationFailedException(
errorMessage,
createFailureDetail(errorMessage, Analysis.Code.ASPECT_PREREQ_UNMET));
}
// For invoking top-level aspects, use the top-level configuration for both the
// aspect and the base target while the top-level configuration is untrimmed.
BuildConfiguration configuration = targetSpec.getConfiguration();
@@ -445,8 +417,10 @@ public AnalysisResult update(
int numTargetsToAnalyze = topLevelTargetsWithConfigs.size();
int numSuccessful = skyframeAnalysisResult.getConfiguredTargets().size();
if (0 < numSuccessful && numSuccessful < numTargetsToAnalyze) {
String msg = String.format("Analysis succeeded for only %d of %d top-level targets",
numSuccessful, numTargetsToAnalyze);
String msg =
String.format(
"Analysis succeeded for only %d of %d top-level targets",
numSuccessful, numTargetsToAnalyze);
eventHandler.handle(Event.info(msg));
logger.atInfo().log(msg);
}
@@ -765,7 +739,7 @@ private static Pair<ImmutableSet<ConfiguredTarget>, ImmutableSet<ConfiguredTarge
ImmutableSet.Builder<ConfiguredTarget> targetsToTest = ImmutableSet.builder();
ImmutableSet.Builder<ConfiguredTarget> targetsToTestExclusive = ImmutableSet.builder();
for (ConfiguredTarget configuredTarget : allTestTargets) {
Target target = null;
Target target;
try {
target = packageManager.getTarget(eventHandler, configuredTarget.getLabel());
} catch (NoSuchTargetException | NoSuchPackageException e) {
@@ -795,10 +769,10 @@ private void setArtifactRoots(PackageRoots packageRoots) {
}

/**
* Tests and clears the current thread's pending "interrupted" status, and
* throws InterruptedException iff it was set.
* Tests and clears the current thread's pending "interrupted" status, and throws
* InterruptedException iff it was set.
*/
private final void pollInterruptedStatus() throws InterruptedException {
private static void pollInterruptedStatus() throws InterruptedException {
if (Thread.interrupted()) {
throw new InterruptedException();
}
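The hunk above also narrows `pollInterruptedStatus()` from a `private final` instance method to a `private static` helper. Its contract rests on the standard `Thread.interrupted()` idiom: the call both tests and clears the calling thread's interrupt flag, so the helper throws `InterruptedException` iff the flag was set. A short standalone illustration of that idiom follows; the class name is hypothetical and only the method body is drawn from the diff.

// Standalone illustration of the Thread.interrupted() idiom used by pollInterruptedStatus().
public final class InterruptPollSketch {

  // Same shape as the method in BuildView (now a static helper).
  static void pollInterruptedStatus() throws InterruptedException {
    if (Thread.interrupted()) { // Returns true at most once per interrupt: it clears the flag.
      throw new InterruptedException();
    }
  }

  public static void main(String[] args) {
    Thread.currentThread().interrupt(); // Set this thread's interrupt flag.
    try {
      pollInterruptedStatus();
    } catch (InterruptedException e) {
      // Reached exactly once; the flag has been cleared by Thread.interrupted().
      System.out.println("interrupted, flag cleared: " + !Thread.currentThread().isInterrupted());
    }
  }
}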
@@ -55,22 +55,15 @@

/**
* The implementation of AnalysisEnvironment used for analysis. It tracks metadata for each
* configured target, such as the errors and warnings emitted by that target. It is intended that
* a separate instance is used for each configured target, so that these don't mix up.
* configured target, such as the errors and warnings emitted by that target. It is intended that a
* separate instance is used for each configured target, so that these don't mix up.
*/
public class CachingAnalysisEnvironment implements AnalysisEnvironment {
private final ArtifactFactory artifactFactory;
public final class CachingAnalysisEnvironment implements AnalysisEnvironment {

private final ArtifactFactory artifactFactory;
private final ActionLookupKey owner;
/**
* If this is the system analysis environment, then errors and warnings are directly reported
* to the global reporter, rather than stored, i.e., we don't track here whether there are any
* errors.
*/
private final boolean isSystemEnv;
private final boolean extendedSanityChecks;
private final boolean allowAnalysisFailures;

private final ActionKeyContext actionKeyContext;

private boolean enabled = true;
@@ -88,19 +81,18 @@ public class CachingAnalysisEnvironment {
*
* <p>The artifact is stored so that we can deduplicate artifacts created multiple times.
*/
private Map<Artifact, Object> artifacts;
private Map<Artifact, Object> artifacts = new HashMap<>();

/**
* The list of actions registered by the configured target this analysis environment is
* responsible for. May get cleared out at the end of the analysis of said target.
*/
final List<ActionAnalysisMetadata> actions = new ArrayList<>();
private final List<ActionAnalysisMetadata> actions = new ArrayList<>();

public CachingAnalysisEnvironment(
ArtifactFactory artifactFactory,
ActionKeyContext actionKeyContext,
ActionLookupKey owner,
boolean isSystemEnv,
boolean extendedSanityChecks,
boolean allowAnalysisFailures,
ExtendedEventHandler errorEventListener,
@@ -109,14 +101,12 @@ public CachingAnalysisEnvironment(
this.artifactFactory = artifactFactory;
this.actionKeyContext = actionKeyContext;
this.owner = Preconditions.checkNotNull(owner);
this.isSystemEnv = isSystemEnv;
this.extendedSanityChecks = extendedSanityChecks;
this.allowAnalysisFailures = allowAnalysisFailures;
this.errorEventListener = errorEventListener;
this.skyframeEnv = env;
this.starlarkBuiltinsValue = starlarkBuiltinsValue;
middlemanFactory = new MiddlemanFactory(artifactFactory, this);
artifacts = new HashMap<>();
}

public void disable(Target target) {
@@ -146,7 +136,7 @@ private static StringBuilder shortDescription(ActionAnalysisMetadata action) {
*/
public void verifyGeneratedArtifactHaveActions(Target target) {
Collection<String> orphanArtifacts = getOrphanArtifactMap().values();
List<String> checkedActions = null;
List<String> checkedActions;
if (!orphanArtifacts.isEmpty()) {
checkedActions = Lists.newArrayListWithCapacity(actions.size());
for (ActionAnalysisMetadata action : actions) {
@@ -244,10 +234,6 @@ public ActionKeyContext getActionKeyContext() {

@Override
public boolean hasErrors() {
// The system analysis environment never has errors.
if (isSystemEnv) {
return false;
}
Preconditions.checkState(enabled);
return ((StoredEventHandler) errorEventListener).hasErrors();
}
@@ -295,7 +281,7 @@ public Artifact.DerivedArtifact getDerivedArtifact(
PathFragment rootRelativePath, ArtifactRoot root, boolean contentBasedPath) {
Preconditions.checkState(enabled);
return dedupAndTrackArtifactAndOrigin(
artifactFactory.getDerivedArtifact(rootRelativePath, root, getOwner(), contentBasedPath),
artifactFactory.getDerivedArtifact(rootRelativePath, root, owner, contentBasedPath),
extendedSanityChecks ? new Throwable() : null);
}

@@ -304,7 +290,7 @@ public SpecialArtifact getTreeArtifact(PathFragment rootRelativePath, ArtifactRo
Preconditions.checkState(enabled);
return (SpecialArtifact)
dedupAndTrackArtifactAndOrigin(
artifactFactory.getTreeArtifact(rootRelativePath, root, getOwner()),
artifactFactory.getTreeArtifact(rootRelativePath, root, owner),
extendedSanityChecks ? new Throwable() : null);
}

@@ -313,7 +299,7 @@ public SpecialArtifact getSymlinkArtifact(PathFragment rootRelativePath, Artifac
Preconditions.checkState(enabled);
return (SpecialArtifact)
dedupAndTrackArtifactAndOrigin(
artifactFactory.getSymlinkArtifact(rootRelativePath, root, getOwner()),
artifactFactory.getSymlinkArtifact(rootRelativePath, root, owner),
extendedSanityChecks ? new Throwable() : null);
}

@@ -327,13 +313,13 @@ public Artifact.DerivedArtifact getFilesetArtifact(
PathFragment rootRelativePath, ArtifactRoot root) {
Preconditions.checkState(enabled);
return dedupAndTrackArtifactAndOrigin(
artifactFactory.getFilesetArtifact(rootRelativePath, root, getOwner()),
artifactFactory.getFilesetArtifact(rootRelativePath, root, owner),
extendedSanityChecks ? new Throwable() : null);
}

@Override
public Artifact getConstantMetadataArtifact(PathFragment rootRelativePath, ArtifactRoot root) {
return artifactFactory.getConstantMetadataArtifact(rootRelativePath, root, getOwner());
return artifactFactory.getConstantMetadataArtifact(rootRelativePath, root, owner);
}

@Override
@@ -363,12 +349,12 @@ public SkyFunction.Environment getSkyframeEnv() {
}

@Override
public StarlarkSemantics getStarlarkSemantics() throws InterruptedException {
public StarlarkSemantics getStarlarkSemantics() {
return starlarkBuiltinsValue.starlarkSemantics;
}

@Override
public ImmutableMap<String, Object> getStarlarkDefinedBuiltins() throws InterruptedException {
public ImmutableMap<String, Object> getStarlarkDefinedBuiltins() {
return starlarkBuiltinsValue.exportedToJava;
}

@@ -768,14 +768,6 @@ public boolean trimConfigurations() {
return options.configsMode == CoreOptions.ConfigsMode.ON;
}

/**
* Returns whether we should trim configurations to only include the fragments needed to correctly
* analyze a rule.
*/
public boolean trimConfigurationsRetroactively() {
return options.configsMode == CoreOptions.ConfigsMode.RETROACTIVE;
}

/**
* <b>>Experimental feature:</b> if true, qualifying outputs use path prefixes based on their
* content instead of the traditional <code>blaze-out/$CPU-$COMPILATION_MODE</code>.