From 5d1e9224c68d00ce27666b7cf655eaee241d0a1e Mon Sep 17 00:00:00 2001 From: "Napas (Tian) Udomsak" Date: Fri, 17 Jan 2020 13:58:26 -0800 Subject: [PATCH] Merge bazel build master (#9) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit * Fix code for Bazel change --incompatible_no_support_tools_in_action_inputs (#758) * Specs2 filtering runner now filters test cases according to filter. (#759) This allows the bazel test runner correctly generate the test log, based only on tests that actually did run. * Add scala_doc rule (#760) * move collect_plugin_paths to common.bzl * add scala_doc rule + aspect implementations * add basic scala_doc Markdown documentation * add scala_doc example * collect plugins in aspect too * declare_directory for scaldoc output path or else it complains * add a simple test target for scala_doc rule * add doc note about scaladoc being kind of slow * fix scala_doc.md code block * privatize scaladoc aspect * get more src_files/compile_jars * also accept scalacopts in scala_doc * turn off scaladoc warnings for now * use host_path_separator in classpath * Update scala_doc.md fix the string `scala_doc` which was copy-pasted as `scala_binary` * Explicitly convert depset to list (#761) This will be required by Bazel 0.27, where the flag `--incompatible_no_support_tools_in_action_inputs` will be on by default. The function `collect_plugin_paths` iterates over its argument, so we need to flatten the depset. * Make sure that plus-one-deps from exports of direct deps also propagate (#764) * PlusOne propagates PlusOne deps of exports * rename of tests * correct test dependency * use scala toolchain to toggle on +1 collection * fix grpc opencensus stats integration (#762) * fix grpc opencensus stats integration - upgrade opencensus-java packages to the latest (0.22.1) - add opencensus-impl and opencensus-impl-core * add opencensus-impl transitive dependency com.lmax:disruptor * Add Canva to the adopters list (#770) * Depset is not iterable (#773) * Specify which version of bazel is required. (#772) * Specify which version of bazel is required. * Update README.md * Specs2 filtering of JUnit descriptions (#766) * Specs2 now will create its JUnit Description tree with filtered child items * Creating a filtered description tree from Specs2 utilities - keeps ordering and hashCodes intact * Redirecting test error output * Update "Getting started" WORKSPACE block (#778) * Migrate from java_common.create_provider to JavaInfo() (#781) * Migrate from java_common.create_provider to JavaInfo() * Fix scala/private/common.bzl * Fix some build failures. * Fix some more builds. * Remove scala/private/common:create_java_provider * Remove unused _create_provider. * Remove unused load statement * Also propagate deps and runtime_deps in scala_import with no jars. * Address review comments. * Update custom-jvm-rule.bzl * Update BUILD * typo * Removed implicit value for deps_providers. * Add dummy intermediate jar file for scala_import with no jars. * Cleanup code. * Replace + with lists extend. * Revert enabling --all_incompatible_changes test by mistake. 
* Passing source-jar into JavaInfo (#783) * expose source_jar in JavaInfo * nit: clarify conditional * nit: replace one hack with another * nit: replace concat with append * remove "main" attr usage it's no longer needed * add comment * remove usage of attr scala with JavaInfo outputs (#784) * Port bazelbuild/bazel#8196 : improve java_stub_template MANIFEST.MF construction speed (#736) * Import java_stub_template from https://github.com/bazelbuild/rules_scala/commit/8b8271e3ee5709e1340b19790d0b396a0ff3dd0f * Port changes from https://github.com/bazelbuild/bazel/pull/8196 * Remove java_stub_template from WORKSPACE * Update java_stub_template archive URL * Make java_stub_template a normal file * JavaInfo provider should be given for deps (#774) * a JavaInfo provider should be given for deps * flatten providers lists * Revert "flatten providers lists" This reverts commit a464f61f80c9dd70cb0f719fa226d3364fab8bd4. * remove print warning if dep has no JavaInfo since it's required now (#788) * Handle FileAlreadyExistsException in ScalaPBGenerator (#789) * Exclude jmh from reproducibility test, since the code generator is non-deterministic. (#791) https://github.com/bazelbuild/rules_scala/issues/699 * warn if jvm_flags is used in scala_library (#795) * Allow for code coverage to be run on 0.27.1 (#780) * Allow for code coverage to be run on 0.27.1 * Update expected-coverage.dat * actually remove all merge conflicts remove scala_doc merge conflict spacing newline * Replace jar_sha256 with artifact_sha256. (#792) * Simplify _jacoco_offline_instrument. (#790) resolve_command shouldn't be required here. * silence tut (#799) * Due to limitations in the toolchains implementation this is required … (#801) * Add test for host deps * Add a test hopefully to illustrate host deps * Update test * Change api usage to use binds * Remove errant print * See if behavior is different on 0.28.1 * incompatible_string_join_requires_strings: string.join(list) accepts invalid (non-string) list elements * Add a to_list * Another case of depset iterable * Windows ci can only support 0.28.0 via chocolaty right now * Add scalac_jvm_flags option to toolchain (#803) * Add scalac_jvm_flags to scala_toolchain This allows things like setting the max heap on all Scalac workers. * Add docs * Fix comment * Add enable_code_coverage_aspect to the docs * Flags on target should override flags on toolchain. Also fix comment. * Add `scala_test_jvm_flags` to toolchain (#804) * Add scala_test_jvm_flags to the toolchain * Fix package name * Fix target names * Add trivial test suite and rename some things * Wrap all jvm_flags in _expand_location * Remove the deprecated attribute proto_source_root. (#793) * Remove the deprecated attribute proto_source_root. Replace it with strip_import_prefix, its spiritual successor. * Update Bazel version on Travis * Update rules_scala to work with Bazel >=0.27. The flag --incompatible_string_join_requires_strings was flipped, which this repository was incompatible with. * Update to Bazel 0.26 instead. test_coverage_on fails for some mysterious reason that seems unrelated to the cleanup crusade I'm pursuing at the moment. * add point release number so that downloading Bazel succeeds * change whitespace to re-trigger build * update Bazel version, hopefully properly * update test_expected_failure/ * minimize diff * re-trigger Travis * re-trigger Travis again * update README.md note the last version of 0.23 that we ran CI on. 
* Fix a regression that breaks scalapb when jvm_flags are set on the toolchain (#806) * Fix a regression that breaks scalapb when jvm_flags are set on the toolchain * Move passing manual tests to new directory * Migrate from old-style .java provider to JavaInfo. (#807) * Migrate from old-style .java provider to JavaInfo. * Remove usage of .scala. * Flip these around to be correct (#810) * Clean up jmh rule a bit (#811) * Move scala_repositories macro out to its own file + move scala_doc.bzl (#808) * move scala_repositories macro out to its own file * move scala_repositories.bzl and scala_doc.bzl to private * Pin Bazel versions (#812) * Pin Bazel versions * ensure one set of jobs is named test * Keep the original build structure * fix version for windows * add ci stopgap to scripts used in ci * fix ci errors * adjust for buildkite * remove unused param * Tweak travis image config * use sh language * libxml2-utils * use apt addon * Add notes about updating bazel versions * Move scala_binary to its own file (#813) * add a note about code organization in CONTRIBUTING.md * wip move scala_binary out * fully split out scala_binary properly * _library_outputs is the same thing as common_outputs * fix a bunch more scalac_provider references * rename scala_provider function to get_scalac_provider per review * back to the old variable names * Delurk Powershell for a more unified test setup (#814) * Attempt to delurk powershell * 💥 * adjust * adjust * Add a basic PR template (#817) * Move scala_test/scala_test_suite to their own file (#815) * move scala_test rule to its own file * move scala_test_suite to scala_test.bzl, move sanitize_string_for_usage function to common.bzl for now * add a docstring about scala_test.bzl * move test_resolve_deps to private variable in scala_test.bzl * remove suites attribute debug print * move scala_repl rule to its own file (#820) * remove long-deprecated 'suites' attr in scala_test (#819) * move scala_junit_test rule to its own file (#822) * rename proto rule while maintaining backwards compat (#821) * Disable windows CI (#823) * Move library rules (#827) * move library_attrs to common_attributes.bzl * move scala_macro_library rule to its own file * move all _library rules to scala_library.bzl, private stuff too * move _lib into scala_library.bzl * alphasort * Buildifier as the only lint (#826) * Load buildifier directly * update lint scripts * let buildifier reformat everything * Test lints in CI * remove accidental file extension * use skylib version compatible with rules_go and buildifier * fix an unformatted file that breaks ci (#830) * use travis job pipelines (#829) * Default to usage of scala_proto_library instead of scalapb_proto_library alias (#831) * Refactor build_deployable (#832) * build_deployable -> merge_jars with a more explicit interface * add docstring doc to merge_jars * buildifier fix * parameterize the entire progress_message * Update README.md to include Grand Rounds (#835) * Minor fix to error message (#841) Was not properly adding space, as such: ``` java.lang.IllegalStateException: Was not able to discover any classes for archives=testsuite/test/example/example.jarprefixes=Set(), suffixes=Set(Test) ``` * Remove jvm_flags debug print for scala_library (#840) * remove jvm_flags debug print for scala_library * hard fail for jvm_flags in scala_library, re-add jvm_flags attr for other rules * remove fail, not needed if attr isn't supported * Improve classpath manifest file construction (#842) * use java_common.merge instead of manual _collect 
util functions (#838) * Fix paths in --proto_path and avoid copying proto files (#850) * Fix paths in proto_path and avoid copying * Prepare mapping in advance * Condensed all transformations into one method * Added tests * Buildifier corrections * Split sh test (#849) * Split shell ingetration tests * Fix test_repl no target from clean build * Fix scala_library_jar_without_srcs_must_fail_on_mismatching_resource_strip_prefix (#859) * Fix test_scala_import_source_jar_should_not_be_fetched_when_env_bazel_jvm_fetch_sources_is_set_to_non_true (#858) * Add a test build for bazel 1.0 (#861) * Add a test build for bazel 1.0 * cp * cp * I think i messed this up a little and it wasn't running the older ones too, making sure we run everything this time * cp * include Bazel 1.0.0 in compatibility table (#863) * fix typo in codeowners github handle * Mirror all http_archives. (#878) Context: https://github.com/bazelbuild/bazel/issues/10270 * Bump v1.0.0 compatibility test to v1.1.0 (#882) * Bump v1.1.0 compatibility test to v1.2.1 (#883) * Bump v1.1.0 compatibility test to v1.2.0 * Upgrade MacOS from HighSierra to Mojave * Empty commit to trigger a new build * Bump bazel to v1.2.1 * Fix sha for 0.28.0 * Revert "Upgrade MacOS from HighSierra to Mojave" This reverts commit a239d4e25d3073070bbf8d4680ad3dc92f5fac9d. * Update sha for 0.28.0 to HEAD * Explicitly label bazel 0.28.1 (#885) * Bump scala 2.12 to v2.12.10 (#886) * Convert maven_jar to jvm_maven_import_external (#889) * Bump Bazel to v2.0.0 (#902) * authorship attribution for higherkindness/rules_scala design (#903) * Refactor rules into configurable phases (#865) * Add configurable phases * Refactor rules implementation into configurable phases * Customizable phases * Customizable phases tests * Break up init to more reasonable phases * Move final to non-configurable phase * Rename parameter builtin_customizable_phases * Fix ijar * Switch default for buildijar * Add TODOs * Rename provider * Move to advanced_usage * rename custom_phases * Make default phase private * Fix exports_jars * Adjusted_phases * Rename p to be more clear * Add in-line comments * Fix lint * Add doc for phases * Doc for consumers * Doc for contributors * Add more content * Fix md * Test for all rules * Fix junit test * Fix lint * Add more tests * Fix junit test * Fix doc * Change _test_ to _scalatest_ * More doc on provider * expand locations in scalacopts (#890) * expand locations in scalac options * allow plugins in expansion * add a happy path test * make the target names more obvious * comment * Revert "expand locations in scalacopts (#890)" (#904) This reverts commit 5c966ee10f2ab6d33165749b2b03a5088e3d742e. * expand locations in scalacptops, take 2 (#907) * expand locations in scalacopts (#890) * expand locations in scalac options * allow plugins in expansion * add a happy path test * make the target names more obvious * comment * access ctx.attr.plugins with fallback * reformat * Plugin expansion- Use input plugins param instead of ctx (#909) * See test failing * Use input plugins param instead of ctx * Remove phase scala provider (#913) * phase_jvm_flags uses JavaInfo provider instead of scala_provider * remove phase scala_provider * readme clarifies minimum version at HEAD is 1.1.0 * travis.yml moved from 0.28.1 to 1.1.0 * Use https to access central.maven.org (#920) See https://support.sonatype.com/hc/en-us/articles/360041287334 And use repo.maven.apache.org instead of central.maven.org * remove me from codeowners I'm not currently a maintainer of this project. 
* Return providers array instead of legacy struct (#916) * runfiles and files are part of explicit DefaultInfo provider and do not come from attributes * removed transitive_rjars attribute as it was only needed internally and before phases was exposed mistakenly because that's how the infra worked Now internally phases use p.compile.rjars * executable attribute part of DefaultInfo as well * use coverage_common.instrumented_files_info provider instead of attribute * remove redundant attributes * linting * return array of providers instead of struct * scala_import return array of providers instead of struct * Move declare_executable to phase file (#919) * chore(docs): update WORKSPACE setup for skylib (#926) * scalatest final-remove redundant coverage handling (#923) * update bazel-toolchains version (#937) fix #901 * Move code from rule impls/common to phases (#930) * collect_srcjars to phase * move compile_or_empty and needed code to phase_compile (from rule_impls) * phase_compile to take scalac_provider from phases instead of rule_impls * rule_impls loads are loaded as private and unused are removed * get_scalac_provider in phase_scalac_provider and not rule_impls * move write_java_wrapper from rule_impls to phase_java_wrapper * move merge_jars from rule_impls to phase_merge_jars * move get_unused_dependency_checker_mode from rule_impls to get_unused_dependency_checker_mode * move write_manifest from common to get_write_manifest * move collect_jars_from_common_ctx from rule_impls to phase_collect_jars * move write_executable from rule_impls to phase_write_executable * linting * [CR] inline _collect_jars_from_common_ctx * [CR] inline _collect_srcjars * [CR] inline write_java_wrapper * [CR] inline merge_jars * [CR] inline _write_executable * add default for fetch sources Co-authored-by: Laurent Le Brun Co-authored-by: Igal Tabachnik Co-authored-by: Long Cao <48221800+long-stripe@users.noreply.github.com> Co-authored-by: P. 
Oscar Boykin Co-authored-by: Ittai Zeidman Co-authored-by: Mackenzie Starr Co-authored-by: Jonathon Belotti Co-authored-by: ianoc-stripe Co-authored-by: Parth Co-authored-by: Christopher Johnson Co-authored-by: Irina Iancu Co-authored-by: joshrosen-stripe <48632449+joshrosen-stripe@users.noreply.github.com> Co-authored-by: Paul Tarjan Co-authored-by: Benjamin Peterson Co-authored-by: David Haxton Co-authored-by: Alex Beal <39505601+beala-stripe@users.noreply.github.com> Co-authored-by: lberki Co-authored-by: Andy Scott Co-authored-by: Mantas Sakalauskas Co-authored-by: Shachar Anchelovich Co-authored-by: ignasl Co-authored-by: Bor Kae Hwang Co-authored-by: Philipp Wollermann Co-authored-by: chenrui Co-authored-by: Jin Co-authored-by: Andreas Herrmann --- .github/PULL_REQUEST_TEMPLATE.md | 11 + .gitignore | 1 + .travis.yml | 105 +- AUTHORS | 1 + CODEOWNERS | 2 +- CONTRIBUTING.md | 6 + README.md | 55 +- WORKSPACE | 104 +- docs/customizable_phase.md | 149 +++ docs/scala_doc.md | 48 + ...roto_library.md => scala_proto_library.md} | 14 +- docs/scala_toolchain.md | 68 +- java_stub_template/file/BUILD.bazel | 5 + java_stub_template/file/file.txt | 365 ++++++ jmh/jmh.bzl | 44 +- junit/junit.bzl | 6 +- lint.sh | 290 +---- manual_test/README.md | 1 + manual_test/scala_test_jvm_flags/BUILD | 43 + .../scala_test_jvm_flags/EmptyTest.scala | 9 + manual_test/scalac_jvm_opts/BUILD | 61 + manual_test/scalac_jvm_opts/Empty.scala | 3 + manual_test/scalac_jvm_opts/test.proto | 8 + private/format.bzl | 22 - scala/BUILD | 6 + ...erClassToCreateEmptyJarForScalaImport.java | 1 + scala/advanced_usage/providers.bzl | 11 + scala/advanced_usage/scala.bzl | 35 + scala/plusone.bzl | 20 +- scala/private/common.bzl | 111 +- scala/private/common_attributes.bzl | 136 ++ scala/private/common_outputs.bzl | 8 + scala/private/macros/scala_repositories.bzl | 200 +++ scala/private/phases/api.bzl | 86 ++ .../phases/phase_collect_exports_jars.bzl | 15 + scala/private/phases/phase_collect_jars.bzl | 119 ++ .../private/phases/phase_collect_srcjars.bzl | 14 + scala/private/phases/phase_compile.bzl | 526 ++++++++ .../phases/phase_coverage_runfiles.bzl | 28 + .../phases/phase_declare_executable.bzl | 15 + scala/private/phases/phase_final.bzl | 27 + scala/private/phases/phase_java_wrapper.bzl | 68 + scala/private/phases/phase_jvm_flags.bzl | 52 + scala/private/phases/phase_merge_jars.bzl | 30 + scala/private/phases/phase_runfiles.bzl | 68 + .../private/phases/phase_scalac_provider.bzl | 12 + .../phases/phase_unused_deps_checker.bzl | 11 + .../private/phases/phase_write_executable.bzl | 174 +++ scala/private/phases/phase_write_manifest.bzl | 13 + scala/private/phases/phases.bzl | 131 ++ scala/private/rule_impls.bzl | 1142 +---------------- scala/private/rules/scala_binary.bzl | 81 ++ scala/private/rules/scala_doc.bzl | 103 ++ scala/private/rules/scala_junit_test.bzl | 133 ++ scala/private/rules/scala_library.bzl | 248 ++++ scala/private/rules/scala_repl.bzl | 80 ++ scala/private/rules/scala_test.bzl | 142 ++ scala/providers.bzl | 34 - scala/scala.bzl | 721 +---------- scala/scala_cross_version.bzl | 8 +- scala/scala_import.bzl | 111 +- scala/scala_maven_import_external.bzl | 11 +- scala/scala_toolchain.bzl | 6 +- scala_proto/BUILD | 26 +- scala_proto/default_dep_sets.bzl | 5 +- scala_proto/private/proto_to_scala_src.bzl | 11 +- .../scala_proto_default_repositories.bzl | 97 +- scala_proto/private/scalapb_aspect.bzl | 23 +- scala_proto/scala_proto.bzl | 28 +- scala_proto/scala_proto_toolchain.bzl | 20 +- scala_proto/toolchains.bzl | 2 - 
specs2/specs2.bzl | 10 +- specs2/specs2_junit.bzl | 4 +- .../rulesscala/coverage/instrumenter/BUILD | 5 +- src/java/io/bazel/rulesscala/exe/BUILD | 14 +- .../scalac/jvm_export_toolchain.bzl | 8 +- .../specs2/Specs2RunnerBuilder.scala | 61 +- .../io/bazel/rulesscala/specs2/package.scala | 11 +- .../test_discovery/DiscoveredTestSuite.scala | 2 +- .../bazel/rules_scala/scaladoc_support/BUILD | 17 + src/scala/scripts/PBGenerateRequest.scala | 31 +- src/scala/scripts/ScalaPBGenerator.scala | 14 +- test/BUILD | 72 +- test/aspect/BUILD | 4 +- test/coverage/A2.scala | 4 +- test/coverage/BUILD | 16 +- test/coverage/expected-coverage.dat | 22 +- test/jmh/BUILD | 2 +- test/phase/add_to_all_rules/BUILD | 59 + test/phase/add_to_all_rules/PhaseBinary.scala | 7 + .../add_to_all_rules/PhaseJunitTest.scala | 10 + .../phase/add_to_all_rules/PhaseLibrary.scala | 5 + test/phase/add_to_all_rules/PhaseTest.scala | 10 + .../phase_add_to_all_rules.bzl | 10 + .../phase_add_to_all_rules_test.bzl | 60 + test/phase/adjustment/BUILD | 27 + test/phase/adjustment/PhaseLibrary.scala | 5 + test/phase/adjustment/phase_adjustment.bzl | 42 + .../adjustment/phase_adjustment_test.bzl | 79 ++ test/plugins/BUILD | 40 + .../check_expand_location_plugin.scala | 25 + test/plugins/trivial.scala | 5 + test/proto/BUILD | 40 +- test/proto3/BUILD | 8 +- test/shell/test_build_event_protocol.sh | 36 + test/shell/test_compilation.sh | 31 + test/shell/test_deps.sh | 48 + test/shell/test_helper.sh | 112 ++ test/shell/test_javac_jvm_flags.sh | 30 + test/shell/test_junit.sh | 113 ++ test/shell/test_misc.sh | 124 ++ test/shell/test_phase.sh | 90 ++ test_runner.sh => test/shell/test_runner.sh | 4 +- test/shell/test_scala_binary.sh | 22 + test/shell/test_scala_classpath.sh | 22 + test/shell/test_scala_import_source_jar.sh | 68 + test/shell/test_scala_jvm_flags.sh | 21 + test/shell/test_scala_library.sh | 186 +++ test/shell/test_scala_library_jar.sh | 30 + test/shell/test_scala_specs2.sh | 278 ++++ test/shell/test_scalac_jvm_flags.sh | 38 + test/shell/test_toolchain.sh | 19 + test/shell/test_unused_dependency.sh | 54 + .../test/compiler_plugin/BUILD.bazel | 7 +- .../test/extra_protobuf_generator/BUILD | 9 +- .../scala/scalarules/test/fetch_sources/BUILD | 4 +- .../src/main/scala/scalarules/test/ijar/BUILD | 2 +- .../scala/scalarules/test/scala_import/BUILD | 20 +- .../test/scala_import/nl/BUILD.bazel | 2 +- .../main/scala/scalarules/test/scripts/BUILD | 10 + .../test/scripts/PBGenerateRequestTest.scala | 21 + .../test/strict_deps/no_recompilation/BUILD | 2 +- .../twitter_scrooge/twitter_scrooge_test.bzl | 24 +- test_expect_failure/disappearing_class/BUILD | 2 +- .../missing_direct_deps/internal_deps/BUILD | 2 +- .../internal_deps/custom-jvm-rule.bzl | 26 +- test_expect_failure/plus_one_deps/BUILD.bazel | 7 +- .../{exports_deps => deps_of_exports}/A.scala | 0 .../{exports_deps => deps_of_exports}/B.scala | 0 .../plus_one_deps/deps_of_exports/BUILD.bazel | 40 + .../{exports_deps => deps_of_exports}/C.scala | 0 .../{exports_deps => deps_of_exports}/D.scala | 0 .../plus_one_deps/exports_deps/BUILD.bazel | 21 - .../plus_one_deps/exports_of_deps/A.scala | 5 + .../plus_one_deps/exports_of_deps/B.scala | 5 + .../plus_one_deps/exports_of_deps/BUILD.bazel | 26 + .../plus_one_deps/exports_of_deps/C.scala | 3 + .../plus_one_deps/exports_of_deps/D.scala | 5 + .../plus_one_deps/external_deps/BUILD.bazel | 6 +- .../plus_one_deps/internal_deps/BUILD.bazel | 3 + .../with_unused_deps/BUILD.bazel | 8 +- .../proto_source_root/dependency/BUILD | 2 +- 
.../proto_source_root/user/BUILD | 6 +- test_expect_failure/scala_import/BUILD | 8 +- test_expect_failure/scala_junit_test/BUILD | 5 +- .../specs2/SuiteWithOneFailingTest.scala | 15 + .../scala_test_jvm_flags/BUILD | 35 + .../scala_test_jvm_flags/EmptyTest.scala | 9 + test_expect_failure/scalac_jvm_opts/BUILD | 25 + .../scalac_jvm_opts/Empty.scala | 3 + .../transitive/java_to_scala/BUILD | 2 +- test_lint.sh | 2 +- test_reproducibility.ps1 | 3 - test_reproducibility.sh | 11 +- test_rules_scala.ps1 | 36 - test_rules_scala.sh | 1038 +-------------- test_version.sh | 23 +- test_version/version_specific_tests_dir/BUILD | 8 +- .../version_specific_tests_dir/proto/BUILD | 8 +- tools/BUILD | 14 + tools/bazel | 89 ++ tut_rule/tut.bzl | 4 +- twitter_scrooge/twitter_scrooge.bzl | 52 +- 173 files changed, 6240 insertions(+), 3843 deletions(-) create mode 100644 .github/PULL_REQUEST_TEMPLATE.md create mode 100644 docs/customizable_phase.md create mode 100644 docs/scala_doc.md rename docs/{scalapb_proto_library.md => scala_proto_library.md} (89%) create mode 100644 java_stub_template/file/BUILD.bazel create mode 100644 java_stub_template/file/file.txt create mode 100644 manual_test/README.md create mode 100644 manual_test/scala_test_jvm_flags/BUILD create mode 100644 manual_test/scala_test_jvm_flags/EmptyTest.scala create mode 100644 manual_test/scalac_jvm_opts/BUILD create mode 100644 manual_test/scalac_jvm_opts/Empty.scala create mode 100644 manual_test/scalac_jvm_opts/test.proto create mode 100644 scala/PlaceHolderClassToCreateEmptyJarForScalaImport.java create mode 100644 scala/advanced_usage/providers.bzl create mode 100644 scala/advanced_usage/scala.bzl create mode 100644 scala/private/common_attributes.bzl create mode 100644 scala/private/common_outputs.bzl create mode 100644 scala/private/macros/scala_repositories.bzl create mode 100644 scala/private/phases/api.bzl create mode 100644 scala/private/phases/phase_collect_exports_jars.bzl create mode 100644 scala/private/phases/phase_collect_jars.bzl create mode 100644 scala/private/phases/phase_collect_srcjars.bzl create mode 100644 scala/private/phases/phase_compile.bzl create mode 100644 scala/private/phases/phase_coverage_runfiles.bzl create mode 100644 scala/private/phases/phase_declare_executable.bzl create mode 100644 scala/private/phases/phase_final.bzl create mode 100644 scala/private/phases/phase_java_wrapper.bzl create mode 100644 scala/private/phases/phase_jvm_flags.bzl create mode 100644 scala/private/phases/phase_merge_jars.bzl create mode 100644 scala/private/phases/phase_runfiles.bzl create mode 100644 scala/private/phases/phase_scalac_provider.bzl create mode 100644 scala/private/phases/phase_unused_deps_checker.bzl create mode 100644 scala/private/phases/phase_write_executable.bzl create mode 100644 scala/private/phases/phase_write_manifest.bzl create mode 100644 scala/private/phases/phases.bzl create mode 100644 scala/private/rules/scala_binary.bzl create mode 100644 scala/private/rules/scala_doc.bzl create mode 100644 scala/private/rules/scala_junit_test.bzl create mode 100644 scala/private/rules/scala_library.bzl create mode 100644 scala/private/rules/scala_repl.bzl create mode 100644 scala/private/rules/scala_test.bzl create mode 100644 src/scala/io/bazel/rules_scala/scaladoc_support/BUILD create mode 100644 test/phase/add_to_all_rules/BUILD create mode 100644 test/phase/add_to_all_rules/PhaseBinary.scala create mode 100644 test/phase/add_to_all_rules/PhaseJunitTest.scala create mode 100644 
test/phase/add_to_all_rules/PhaseLibrary.scala create mode 100644 test/phase/add_to_all_rules/PhaseTest.scala create mode 100644 test/phase/add_to_all_rules/phase_add_to_all_rules.bzl create mode 100644 test/phase/add_to_all_rules/phase_add_to_all_rules_test.bzl create mode 100644 test/phase/adjustment/BUILD create mode 100644 test/phase/adjustment/PhaseLibrary.scala create mode 100644 test/phase/adjustment/phase_adjustment.bzl create mode 100644 test/phase/adjustment/phase_adjustment_test.bzl create mode 100644 test/plugins/BUILD create mode 100644 test/plugins/check_expand_location_plugin.scala create mode 100644 test/plugins/trivial.scala create mode 100755 test/shell/test_build_event_protocol.sh create mode 100755 test/shell/test_compilation.sh create mode 100755 test/shell/test_deps.sh create mode 100755 test/shell/test_helper.sh create mode 100755 test/shell/test_javac_jvm_flags.sh create mode 100755 test/shell/test_junit.sh create mode 100755 test/shell/test_misc.sh create mode 100755 test/shell/test_phase.sh rename test_runner.sh => test/shell/test_runner.sh (98%) create mode 100755 test/shell/test_scala_binary.sh create mode 100755 test/shell/test_scala_classpath.sh create mode 100755 test/shell/test_scala_import_source_jar.sh create mode 100755 test/shell/test_scala_jvm_flags.sh create mode 100755 test/shell/test_scala_library.sh create mode 100755 test/shell/test_scala_library_jar.sh create mode 100755 test/shell/test_scala_specs2.sh create mode 100755 test/shell/test_scalac_jvm_flags.sh create mode 100755 test/shell/test_toolchain.sh create mode 100755 test/shell/test_unused_dependency.sh create mode 100644 test/src/main/scala/scalarules/test/scripts/BUILD create mode 100644 test/src/main/scala/scalarules/test/scripts/PBGenerateRequestTest.scala rename test_expect_failure/plus_one_deps/{exports_deps => deps_of_exports}/A.scala (100%) rename test_expect_failure/plus_one_deps/{exports_deps => deps_of_exports}/B.scala (100%) create mode 100644 test_expect_failure/plus_one_deps/deps_of_exports/BUILD.bazel rename test_expect_failure/plus_one_deps/{exports_deps => deps_of_exports}/C.scala (100%) rename test_expect_failure/plus_one_deps/{exports_deps => deps_of_exports}/D.scala (100%) delete mode 100644 test_expect_failure/plus_one_deps/exports_deps/BUILD.bazel create mode 100644 test_expect_failure/plus_one_deps/exports_of_deps/A.scala create mode 100644 test_expect_failure/plus_one_deps/exports_of_deps/B.scala create mode 100644 test_expect_failure/plus_one_deps/exports_of_deps/BUILD.bazel create mode 100644 test_expect_failure/plus_one_deps/exports_of_deps/C.scala create mode 100644 test_expect_failure/plus_one_deps/exports_of_deps/D.scala create mode 100644 test_expect_failure/scala_junit_test/specs2/SuiteWithOneFailingTest.scala create mode 100644 test_expect_failure/scala_test_jvm_flags/BUILD create mode 100644 test_expect_failure/scala_test_jvm_flags/EmptyTest.scala create mode 100644 test_expect_failure/scalac_jvm_opts/BUILD create mode 100644 test_expect_failure/scalac_jvm_opts/Empty.scala delete mode 100644 test_reproducibility.ps1 delete mode 100644 test_rules_scala.ps1 create mode 100644 tools/BUILD create mode 100755 tools/bazel diff --git a/.github/PULL_REQUEST_TEMPLATE.md b/.github/PULL_REQUEST_TEMPLATE.md new file mode 100644 index 000000000..a75fcdcbd --- /dev/null +++ b/.github/PULL_REQUEST_TEMPLATE.md @@ -0,0 +1,11 @@ +### Description + + + + + +### Motivation + diff --git a/.gitignore b/.gitignore index ddfd1e42d..3a209dda9 100644 --- a/.gitignore +++ b/.gitignore 
@@ -5,3 +5,4 @@ hash1 hash2 .DS_store .bazel_cache +.ijwb diff --git a/.travis.yml b/.travis.yml index d23b486b8..79d47e914 100644 --- a/.travis.yml +++ b/.travis.yml @@ -1,58 +1,75 @@ -# trusty beta image has jdk8, gcc4.8.4 -dist: trusty +dist: xenial sudo: required -# xcode8 has jdk8 -osx_image: xcode8 -# Not technically required but suppresses 'Ruby' in Job status message. language: sh +addons: + apt: + packages: + - libxml2-utils + cache: directories: - .bazel_cache + - ~/.bazel_binaries -os: - - linux - - osx - - windows +_linux: &linux + os: linux +_osx: &osx + os: osx + osx_image: xcode10.1 +_windows: + os: windows -env: - # Linting is broken. Disable until fixed. - # See https://github.com/bazelbuild/rules_scala/pull/622 - # we want to test the last release - #- V=0.16.1 TEST_SCRIPT=test_lint.sh - - V=0.23.1 TEST_SCRIPT=test_rules_scala - #- V=0.14.1 TEST_SCRIPT=test_intellij_aspect.sh - - V=0.23.1 TEST_SCRIPT=test_reproducibility +### +# +# Want to update/change bazel versions? +# +# 1. Update the bazel_version case statement in +# ./tools/bazel to include hashes for the bazel version +# you're targeting. +# +# 2. either +# - If you're updating the default bazel version, change +# default_bazel_version in ./tools/bazel. +# or +# - If you want to add an additional bazel version to the build +# matrix, set BAZEL_VERSION= along side +# TEST_SCRIPT below. +# +# 3. If you need to update the Windows version, adjust +# the windows specific install code below. +# +### +jobs: + include: +# Lint + - stage: test + <<: *linux + env: TEST_SCRIPT=test_lint +# Test + - <<: *linux + env: TEST_SCRIPT=test_rules_scala BAZEL_VERSION=1.1.0 + - <<: *linux + env: TEST_SCRIPT=test_rules_scala BAZEL_VERSION=2.0.0 + - <<: *linux + env: TEST_SCRIPT=test_reproducibility BAZEL_VERSION=1.1.0 + - <<: *linux + env: TEST_SCRIPT=test_reproducibility BAZEL_VERSION=2.0.0 + - <<: *osx + env: TEST_SCRIPT=test_rules_scala BAZEL_VERSION=1.1.0 + - <<: *osx + env: TEST_SCRIPT=test_rules_scala BAZEL_VERSION=2.0.0 + - <<: *osx + env: TEST_SCRIPT=test_reproducibility BAZEL_VERSION=1.1.0 + - <<: *osx + env: TEST_SCRIPT=test_reproducibility BAZEL_VERSION=2.0.0 before_install: - | if [[ "${TRAVIS_OS_NAME}" == "windows" ]]; then choco install jdk8 -params 'installdir=c:\\java8' - choco install bazel --version ${V} - else - if [[ "${TRAVIS_OS_NAME}" == "osx" ]]; then - OS=darwin - else - sudo sysctl kernel.unprivileged_userns_clone=1 - sudo add-apt-repository -y ppa:openjdk-r/ppa - sudo apt-get update -q - sudo apt-get install openjdk-8-jdk -y - sudo apt-get install libxml2-utils -y - OS=linux - fi - - if [[ $V =~ .*rc[0-9]+.* ]]; then - PRE_RC=$(expr "$V" : '\([0-9.]*\)rc.*') - RC_PRC=$(expr "$V" : '[0-9.]*\(rc.*\)') - URL="https://storage.googleapis.com/bazel/${PRE_RC}/${RC_PRC}/bazel-${V}-installer-${OS}-x86_64.sh" - else - URL="https://github.com/bazelbuild/bazel/releases/download/${V}/bazel-${V}-installer-${OS}-x86_64.sh" - fi - wget -O install.sh "${URL}" - chmod +x install.sh - ./install.sh --user - rm -f install.sh + choco install bazel --version 0.28.0 fi - cat .bazelrc.travis >> .bazelrc @@ -60,7 +77,7 @@ script: - | if [[ "${TRAVIS_OS_NAME}" == "windows" ]]; then powershell -Command 'Set-ExecutionPolicy RemoteSigned -scope CurrentUser' - powershell -File ./${TEST_SCRIPT}.ps1 - else - bash ./${TEST_SCRIPT}.sh ci + export JAVA_HOME='c:\\\\java8' + export BAZEL_VERSION=host fi + bash ./${TEST_SCRIPT}.sh ci diff --git a/AUTHORS b/AUTHORS index ff7a21933..ffc96eb3c 100644 --- a/AUTHORS +++ b/AUTHORS @@ -10,3 +10,4 @@ 
Google Inc. Dino Wernli Oscar Boykin John T. Sullivan +Andy Scott diff --git a/CODEOWNERS b/CODEOWNERS index de1e21c61..6c890b68c 100644 --- a/CODEOWNERS +++ b/CODEOWNERS @@ -1 +1 @@ -* @iitaiz @johnynek @dinowernil +* @ittaiz @dinowernil diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 4e7c9425a..b82797489 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -17,6 +17,12 @@ Before you start working on a larger contribution, you should get in touch with us first. Use the issue tracker to explain your idea so we can help and possibly guide you. +### Code organization + +Core Scala rules (including their implementations) and macros go in [./scala/private/rules/](./scala/private/rules/) +and [./scala/private/macros/](./scala/private/macros/), respectively, and are re-exported for public use +in [./scala/scala.bzl](./scala/scala.bzl). + ### Code reviews and other contributions. **All submissions, including submissions by project members, require review.** Please follow the instructions in [the contributors documentation](http://bazel.io/contributing.html). diff --git a/README.md b/README.md index c9f01f2d5..8e0a0f0d4 100644 --- a/README.md +++ b/README.md @@ -18,19 +18,30 @@ This project defines core build rules for [Scala](https://www.scala-lang.org/) t * [scala_library_suite](docs/scala_library_suite.md) * [scala_test_suite](docs/scala_test_suite.md) * [thrift_library](docs/thrift_library.md) -* [scalapb_proto_library](docs/scalapb_proto_library.md) +* [scala_proto_library](docs/scala_proto_library.md) * [scala_toolchain](docs/scala_toolchain.md) * [scala_import](docs/scala_import.md) +* [scala_doc](docs/scala_doc.md) ## Getting started -In order to use these rules you must have bazel 0.23 or later and add the -following to your WORKSPACE file: +1. [Install Bazel](https://docs.bazel.build/versions/master/install.html), see the [compatibility table](#bazel-compatible-versions). +2. 
Add the following to your `WORKSPACE` file and update the `githash` if needed: ```python -rules_scala_version="a89d44f7ef67d93dedfc9888630f48d7723516f7" # update this as needed - load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive") + +# bazel-skylib 0.8.0 released 2019.03.20 (https://github.com/bazelbuild/bazel-skylib/releases/tag/0.8.0) +skylib_version = "0.8.0" +http_archive( + name = "bazel_skylib", + type = "tar.gz", + url = "https://github.com/bazelbuild/bazel-skylib/releases/download/{}/bazel-skylib.{}.tar.gz".format (skylib_version, skylib_version), + sha256 = "2ef429f5d7ce7111263289644d233707dba35e39696377ebab8b0bc701f7818e", +) + +rules_scala_version="69d3c5b5d9b51537231746e93b4383384c9ebcf4" # update this as needed + http_archive( name = "io_bazel_rules_scala", strip_prefix = "rules_scala-%s" % rules_scala_version, @@ -44,8 +55,8 @@ scala_register_toolchains() load("@io_bazel_rules_scala//scala:scala.bzl", "scala_repositories") scala_repositories() -protobuf_version="66dc42d891a4fc8e9190c524fd67961688a37bbe" -protobuf_version_sha256="983975ab66113cbaabea4b8ec9f3a73406d89ed74db9ae75c74888e685f956f8" +protobuf_version="09745575a923640154bcf307fba8aedff47f240a" +protobuf_version_sha256="416212e14481cff8fd4849b1c1c1200a7f34808a54377e22d7447efdf54ad758" http_archive( name = "com_google_protobuf", @@ -54,6 +65,7 @@ http_archive( sha256 = protobuf_version_sha256, ) ``` + This will load the `rules_scala` repository at the commit sha `rules_scala_version` into your Bazel project and register a [Scala toolchain](#scala_toolchain) at the default Scala version (2.11.12) @@ -77,17 +89,19 @@ Rules scala supports all minor versions of Scala 2.11/2.12. By default `Scala 2. version you need to specify it when calling `scala_repositories`. `scala_repositories` takes a tuple `(scala_version, scala_version_jar_shas)` as a parameter where `scala_version` is the scala version and `scala_version_jar_shas` is a `dict` with -`sha256` hashes for the maven artifacts `scala_library`, `scala_reflect` and `scala_compiler`: +`sha256` hashes for the maven artifacts `scala_compiler`, `scala_library`, and `scala_reflect`: + ```python scala_repositories(( - "2.12.8", + "2.12.10", { - "scala_compiler": "f34e9119f45abd41e85b9e121ba19dd9288b3b4af7f7047e86dc70236708d170", - "scala_library": "321fb55685635c931eba4bc0d7668349da3f2c09aee2de93a70566066ff25c28", - "scala_reflect": "4d6405395c4599ce04cea08ba082339e3e42135de9aae2923c9f5367e957315a" + "scala_compiler": "cedc3b9c39d215a9a3ffc0cc75a1d784b51e9edc7f13051a1b4ad5ae22cfbc0c", + "scala_library": "0a57044d10895f8d3dd66ad4286891f607169d948845ac51e17b4c1cf0ab569d", + "scala_reflect": "56b609e1bab9144fb51525bfa01ccd72028154fc40a58685a1e9adcbe7835730" } )) ``` + If you're using any of the rules `twitter_scrooge`, `tut_repositories`, `scala_proto_repositories` or `specs2_junit_repositories` you also need to specify `scala_version` for them. See `./test_version/WORKSPACE.template` for an example workspace using another scala version. @@ -97,7 +111,10 @@ for an example workspace using another scala version. 
| bazel | rules_scala gitsha | |--------|--------------------| -| 0.23.x | HEAD | +| 2.0.0 | HEAD | +| 1.1.0 | HEAD | +| 0.28.1 | bd0c388125e12f4f173648fc4474f73160a5c628 | +| 0.23.x | ca655e5a330cbf1d66ce1d9baa63522752ec6011 | | 0.22.x | f3113fb6e9e35cb8f441d2305542026d98afc0a2 | | 0.16.x | f3113fb6e9e35cb8f441d2305542026d98afc0a2 | | 0.15.x | 3b9ab9be31ac217d3337c709cb6bfeb89c8dcbb1 | @@ -174,6 +191,16 @@ Unused dependency checking can either be enabled globally for all targets using in these cases you can enable unused dependency checking globally through a toolchain and override individual misbehaving targets using the attribute. +## Advanced configurable rules +To make the ruleset more flexible and configurable, we introduce a phase architecture. By using a phase architecture, where rule implementations are defined as a list of phases that are executed sequentially, functionality can easily be added (or modified) by adding (or swapping) phases. + +Phases provide 3 major benefits: + - Consumers are able to configure the rules to their specific use cases by defining new phases within their workspace without impacting other consumers. + - Contributors are able to implement new functionalities by creating additional default phases. + - Phases give us more clear idea what steps are shared across rules. + +See [Customizable Phase](docs/customizable_phase.md) for more info. + ## Building from source Test & Build: ``` @@ -204,7 +231,9 @@ See [CONTRIBUTING.md](CONTRIBUTING.md) for more info. Here's a (non-exhaustive) list of companies that use `rules_scala` in production. Don't see yours? [You can add it in a PR](https://github.com/bazelbuild/rules_scala/edit/master/README.md)! * [Ascend](https://ascend.io/) +* [Canva](https://www.canva.com/) * [Etsy](https://www.etsy.com/) +* [Grand Rounds](http://grandrounds.com/) * [Kitty Hawk](https://kittyhawk.aero/) * [Meetup](https://meetup.com/) * [Spotify](https://www.spotify.com/) diff --git a/WORKSPACE b/WORKSPACE index 2114e0ac9..2dd6084e8 100644 --- a/WORKSPACE +++ b/WORKSPACE @@ -1,14 +1,26 @@ workspace(name = "io_bazel_rules_scala") - load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive") load("@bazel_tools//tools/build_defs/repo:git.bzl", "git_repository") +load("@bazel_tools//tools/build_defs/repo:jvm.bzl", "jvm_maven_import_external") + +http_archive( + name = "com_github_bazelbuild_buildtools", + sha256 = "cdaac537b56375f658179ee2f27813cac19542443f4722b6730d84e4125355e6", + strip_prefix = "buildtools-f27d1753c8b3210d9e87cdc9c45bc2739ae2c2db", + url = "https://github.com/bazelbuild/buildtools/archive/f27d1753c8b3210d9e87cdc9c45bc2739ae2c2db.zip", +) + +load("@com_github_bazelbuild_buildtools//buildifier:deps.bzl", "buildifier_dependencies") + +buildifier_dependencies() + load("//scala:scala.bzl", "scala_repositories") scala_repositories() load("//scala:scala_maven_import_external.bzl", "scala_maven_import_external") -load("//twitter_scrooge:twitter_scrooge.bzl", "twitter_scrooge", "scrooge_scala_library") +load("//twitter_scrooge:twitter_scrooge.bzl", "scrooge_scala_library", "twitter_scrooge") twitter_scrooge() @@ -28,43 +40,52 @@ load("//specs2:specs2_junit.bzl", "specs2_junit_repositories") specs2_junit_repositories() -load("//scala:scala_cross_version.bzl", "scala_mvn_artifact", "default_scala_major_version") +load("//scala:scala_cross_version.bzl", "default_scala_major_version", "scala_mvn_artifact") + +MAVEN_SERVER_URLS = [ + "https://jcenter.bintray.com", + "https://repo1.maven.org/maven2", +] # test adding a scala jar: 
-maven_jar( +jvm_maven_import_external( name = "com_twitter__scalding_date", artifact = scala_mvn_artifact( "com.twitter:scalding-date:0.17.0", default_scala_major_version(), ), - sha1 = "420fb0c4f737a24b851c4316ee0362095710caa5", + artifact_sha256 = "bf743cd6d224a4568d6486a2b794143e23145d2afd7a1d2de412d49e45bdb308", + server_urls = MAVEN_SERVER_URLS, ) # For testing that we don't include sources jars to the classpath -maven_jar( +jvm_maven_import_external( name = "org_typelevel__cats_core", artifact = scala_mvn_artifact( "org.typelevel:cats-core:0.9.0", default_scala_major_version(), ), - sha1 = "b2f8629c6ec834d8b6321288c9fe77823f1e1314", + artifact_sha256 = "3fda7a27114b0d178107ace5c2cf04e91e9951810690421768e65038999ffca5", + server_urls = MAVEN_SERVER_URLS, ) # test of a plugin -maven_jar( +jvm_maven_import_external( name = "org_psywerx_hairyfotr__linter", artifact = scala_mvn_artifact( "org.psywerx.hairyfotr:linter:0.1.13", default_scala_major_version(), ), - sha1 = "e5b3e2753d0817b622c32aedcb888bcf39e275b4", + artifact_sha256 = "9444dd78684c0cc89d070af0f5ca3f3ae7d56b2a4d7ac1c038f8218ad4d66fad", + server_urls = MAVEN_SERVER_URLS, ) # test of strict deps (scalac plugin UT + E2E) -maven_jar( +jvm_maven_import_external( name = "com_google_guava_guava_21_0_with_file", artifact = "com.google.guava:guava:21.0", - sha1 = "3a3d111be1be1b745edfa7d91678a12d7ed38709", + artifact_sha256 = "972139718abc8a4893fa78cba8cf7b2c903f35c97aaf44fa3031b0669948b480", + server_urls = MAVEN_SERVER_URLS, ) # test of import external @@ -76,8 +97,8 @@ maven_jar( scala_maven_import_external( name = "com_github_jnr_jffi_native", artifact = "com.github.jnr:jffi:jar:native:1.2.17", - fetch_sources = True, artifact_sha256 = "4eb582bc99d96c8df92fc6f0f608fd123d278223982555ba16219bf8be9f75a9", + fetch_sources = True, licenses = ["notice"], server_urls = [ "https://repo.maven.apache.org/maven2/", @@ -85,10 +106,11 @@ scala_maven_import_external( srcjar_sha256 = "5e586357a289f5fe896f7b48759e1c16d9fa419333156b496696887e613d7a19", ) -maven_jar( +jvm_maven_import_external( name = "org_apache_commons_commons_lang_3_5", artifact = "org.apache.commons:commons-lang3:3.5", - sha1 = "6c6c702c89bfff3cd9e80b04d668c5e190d588c6", + artifact_sha256 = "8ac96fc686512d777fca85e144f196cd7cfe0c0aec23127229497d1a38ff651c", + server_urls = MAVEN_SERVER_URLS, ) new_local_repository( @@ -108,23 +130,21 @@ load("@io_bazel_rules_scala//scala:toolchains.bzl", "scala_register_unused_deps_ scala_register_unused_deps_toolchains() - register_toolchains("@io_bazel_rules_scala//test/proto:scalapb_toolchain") - -load("//scala:scala_maven_import_external.bzl", "scala_maven_import_external", "java_import_external") +load("//scala:scala_maven_import_external.bzl", "java_import_external", "scala_maven_import_external") scala_maven_import_external( name = "com_google_guava_guava_21_0", artifact = "com.google.guava:guava:21.0", artifact_sha256 = "972139718abc8a4893fa78cba8cf7b2c903f35c97aaf44fa3031b0669948b480", - srcjar_sha256 = "b186965c9af0a714632fe49b33378c9670f8f074797ab466f49a67e918e116ea", fetch_sources = True, licenses = ["notice"], # Apache 2.0 server_urls = [ "https://repo1.maven.org/maven2/", "https://mirror.bazel.build/repo1.maven.org/maven2", - ], + ], + srcjar_sha256 = "b186965c9af0a714632fe49b33378c9670f8f074797ab466f49a67e918e116ea", ) # bazel's java_import_external has been altered in rules_scala to be a macro based on jvm_import_external @@ -137,7 +157,7 @@ java_import_external( name = "org_apache_commons_commons_lang_3_5_without_file", 
generated_linkable_rule_name = "linkable_org_apache_commons_commons_lang_3_5_without_file", jar_sha256 = "8ac96fc686512d777fca85e144f196cd7cfe0c0aec23127229497d1a38ff651c", - jar_urls = ["https://repo1.maven.org/maven2/org/apache/commons/commons-lang3/3.5/commons-lang3-3.5.jar"], + jar_urls = ["https://repo.maven.apache.org/maven2/org/apache/commons/commons-lang3/3.5/commons-lang3-3.5.jar"], licenses = ["notice"], # Apache 2.0 neverlink = True, ) @@ -148,16 +168,36 @@ load("//private:format.bzl", "format_repositories") format_repositories() +http_archive( + name = "io_bazel_rules_go", + sha256 = "45409e6c4f748baa9e05f8f6ab6efaa05739aa064e3ab94e5a1a09849c51806a", + url = "https://github.com/bazelbuild/rules_go/releases/download/0.18.7/rules_go-0.18.7.tar.gz", +) + +load( + "@io_bazel_rules_go//go:deps.bzl", + "go_register_toolchains", + "go_rules_dependencies", +) + +go_rules_dependencies() + +go_register_toolchains() + http_archive( name = "bazel_toolchains", - sha256 = "5962fe677a43226c409316fcb321d668fc4b7fa97cb1f9ef45e7dc2676097b26", - strip_prefix = "bazel-toolchains-be10bee3010494721f08a0fccd7f57411a1e773e", + sha256 = "8062febd539d2f3246e479715e3f1eb29f0420eca26da369950309cb2bed25fd", + strip_prefix = "bazel-toolchains-0b442a1bf997840c4f1063ee8a90605392418741", urls = [ - "https://mirror.bazel.build/github.com/bazelbuild/bazel-toolchains/archive/be10bee3010494721f08a0fccd7f57411a1e773e.tar.gz", - "https://github.com/bazelbuild/bazel-toolchains/archive/be10bee3010494721f08a0fccd7f57411a1e773e.tar.gz", + "https://mirror.bazel.build/github.com/bazelbuild/bazel-toolchains/archive/0b442a1bf997840c4f1063ee8a90605392418741.tar.gz", + "https://github.com/bazelbuild/bazel-toolchains/archive/0b442a1bf997840c4f1063ee8a90605392418741.tar.gz", ], ) +load("@bazel_skylib//:workspace.bzl", "bazel_skylib_workspace") + +bazel_skylib_workspace() + load("@bazel_toolchains//rules:rbe_repo.bzl", "rbe_autoconfig") # Creates toolchain configuration for remote execution with BuildKite CI @@ -166,13 +206,8 @@ rbe_autoconfig( name = "buildkite_config", ) -git_repository( - name = "bazel_skylib", - remote = "https://github.com/bazelbuild/bazel-skylib.git", - tag = "0.6.0", -) - ## deps for tests of limited deps support + scala_maven_import_external( name = "org_springframework_spring_core", artifact = "org.springframework:spring-core:5.1.5.RELEASE", @@ -181,7 +216,7 @@ scala_maven_import_external( server_urls = [ "https://repo1.maven.org/maven2/", "https://mirror.bazel.build/repo1.maven.org/maven2", - ], + ], ) scala_maven_import_external( @@ -192,10 +227,10 @@ scala_maven_import_external( server_urls = [ "https://repo1.maven.org/maven2/", "https://mirror.bazel.build/repo1.maven.org/maven2", - ], + ], deps = [ - "@org_springframework_spring_core" - ] + "@org_springframework_spring_core", + ], ) ## deps for tests of compiler plugin @@ -205,6 +240,7 @@ scala_maven_import_external( "org.spire-math:kind-projector:0.9.10", default_scala_major_version(), ), + artifact_sha256 = "897460d4488b7dd6ac9198937d6417b36cc6ec8ab3693fdf2c532652f26c4373", fetch_sources = False, licenses = ["notice"], server_urls = [ diff --git a/docs/customizable_phase.md b/docs/customizable_phase.md new file mode 100644 index 000000000..5d53ac359 --- /dev/null +++ b/docs/customizable_phase.md @@ -0,0 +1,149 @@ +# Customizable Phase + +## Contents +* [Overview](#overview) +* [Who needs customizable phase](#who-needs-customizable-phase) +* [As a consumer](#as-a-consumer) +* [As a contributor](#as-a-contributor) + * [Phase naming 
convention](#phase-naming-convention) + +## Overview +Phases increase configurability. Rule implementations are defined as a list of phases. Each phase defines a specific step, which helps breaking up implementation into smaller and more readable groups. Some phases are independent from others, which means the order doesn't matter. However, some phases depend on outputs of previous phases, in this case, we should make sure it meets all the prerequisites before executing phases. + +The biggest benefit of phases is that it is customizable. If default phase A is not doing what you expect, you may switch it with your self-defined phase A. One use case is to write your own compilation phase with your favorite Scala compiler. You may also extend the default phase list for more functionalities. One use case is to check the Scala format. + +## Who needs customizable phase +Customizable phase is an advanced feature for people who want the rules to do more. If you are an experienced Bazel rules developer, we make this powerful API public for you to do custom work without impacting other consumers. If you have no experience on writing Bazel rules, we are happy to help but be aware it may be frustrating at first. + +If you don't need to customize your rules and just need the default setup to work correctly, then just load the following file for default rules: +``` +load("@io_bazel_rules_scala//scala:scala.bzl") +``` +Otherwise read on: + +## As a consumer +You need to load the following 2 files: +``` +load("@io_bazel_rules_scala//scala:advanced_usage/providers.bzl", "ScalaRulePhase") +load("@io_bazel_rules_scala//scala:advanced_usage/scala.bzl", "make_scala_binary") +``` +`ScalaRulePhase` is a phase provider to pass in custom phases. Rules with `make_` prefix, like `make_scala_binary`, are customizable rules. `make_`s take a dictionary as input. It currently supports appending `attrs` and `outputs` to default rules, as well as modifying the phase list. + +For example: +``` +ext_add_custom_phase = { + "attrs": { + "custom_content": attr.string( + default = "This is custom content", + ), + }, + "outputs": { + "custom_output": "%{name}.custom-output", + }, + "phase_providers": [ + "//custom/phase:phase_custom_write_extra_file", + ], +} + +custom_scala_binary = make_scala_binary(ext_add_custom_phase) +``` +`make_`s append `attrs` and `outputs` to the default rule definitions. All items in `attrs` can be accessed by `ctx.attr`, and all items in `outputs` can be accessed by `ctx.outputs`. `phase_providers` takes a list of targets which define how you want to modify phase list. +``` +def _add_custom_phase_singleton_implementation(ctx): + return [ + ScalaRulePhase( + custom_phases = [ + ("last", "", "custom_write_extra_file", phase_custom_write_extra_file), + ], + ), + ] + +add_custom_phase_singleton = rule( + implementation = _add_custom_phase_singleton_implementation, +) +``` +`add_custom_phase_singleton` is a rule solely to pass in custom phases using `ScalaRulePhase`. The `custom_phases` field in `ScalaRulePhase` takes a list of tuples. 
Each tuple has 4 elements: +``` +(relation, peer_name, phase_name, phase_function) +``` + - relation: the position to add a new phase + - peer_name: the existing phase to compare the position with + - phase_name: the name of the new phase, also used to access phase information + - phase_function: the function of the new phase + +There are 5 possible relations: + - `^` or `first` + - `$` or `last` + - `-` or `before` + - `+` or `after` + - `=` or `replace` + +The symbols and words are interchangable. If `first` or `last` is used, it puts your custom phase at the beginning or the end of the phase list, `peer_name` is not needed. + +Then you have to call the rule in a `BUILD` +``` +add_custom_phase_singleton( + name = "phase_custom_write_extra_file", + visibility = ["//visibility:public"], +) +``` + +You may now see `phase_providers` in `ext_add_custom_phase` is pointing to this target. + +The last step is to write the function of the phase. For example: +``` +def phase_custom_write_extra_file(ctx, p): + ctx.actions.write( + output = ctx.outputs.custom_output, + content = ctx.attr.custom_content, + ) +``` +Every phase has 2 arguments, `ctx` and `p`. `ctx` gives you access to the fields defined in rules. `p` is the global provider, which contains information from initial state as well as all the previous phases. You may access the information from previous phases by `p..`. For example, if the previous phase, said `phase_jar` with phase name `jar`, returns a struct +``` +def phase_jar(ctx, p): + # Some works to get the jars + return struct( + class_jar = class_jar, + ijar = ijar, + ) +``` +You are able to access information like `p.jar.class_jar` in `phase_custom_write_extra_file`. You can provide the information for later phases in the same way, then they can access it by `p.custom_write_extra_file.`. + +You should be able to define the files above entirely in your own workspace without making change to the [bazelbuild/rules_scala](https://github.com/bazelbuild/rules_scala). If you believe your custom phase will be valuable to the community, please refer to [As a contributor](#as-a-contributor). Pull requests are welcome. + +## As a contributor +Besides the basics in [As a consumer](#as-a-consumer), the followings help you understand how phases are setup if you plan to contribute to [bazelbuild/rules_scala](https://github.com/bazelbuild/rules_scala). + +These are the relevant files + - `scala/private/phases/api.bzl`: the API of executing and modifying the phase list + - `scala/private/phases/phases.bzl`: re-expose phases for convenience + - `scala/private/phases/phase_.bzl`: all the phase definitions + +Currently phase architecture is used by 7 rules: + - scala_library + - scala_macro_library + - scala_library_for_plugin_bootstrapping + - scala_binary + - scala_test + - scala_junit_test + - scala_repl + +In each of the rule implementation, it calls `run_phases` and returns the information from `phase_final`, which groups the final returns of the rule. To prevent consumers from accidently removing `phase_final` from the list, we make it a non-customizable phase. + +To make a new phase, you have to define a new `phase_.bzl` in `scala/private/phases/`. Function definition should have 2 arguments, `ctx` and `p`. You may expose the information for later phases by returning a `struct`. In some phases, there are multiple phase functions since different rules may take slightly different input arguemnts. 
You may want to re-expose the phase definition in `scala/private/phases/phases.bzl`, so it's more convenient to access in rule files. + +In the rule implementations, put your new phase in the `builtin_customizable_phases` list. The phases are executed sequentially; the order matters if the new phase depends on previous phases. + +If you are adding new return fields to the rule, remember to modify `phase_final`. + +### Phase naming convention +Files in `scala/private/phases/` + - `phase_<phase_name>.bzl`: phase definition file + +Function names in `phase_<phase_name>.bzl` + - `phase_<rule_name>_<phase_name>`: function with custom inputs for a specific rule + - `phase_common_<phase_name>`: function without custom inputs + - `_phase_default_<phase_name>`: private function that takes `_args` for custom inputs + - `_phase_<phase_name>`: private function with the actual logic + +See `phase_compile.bzl` for an example. diff --git a/docs/scala_doc.md b/docs/scala_doc.md new file mode 100644 index 000000000..95f714837 --- /dev/null +++ b/docs/scala_doc.md @@ -0,0 +1,48 @@ +# scala_doc + +```python +scala_doc( + name, + deps, +) +``` + +`scala_doc` generates [Scaladoc](https://docs.scala-lang.org/style/scaladoc.html) for sources +for targets, including sources from upstream deps. Readily hostable HTML is written to a `name.html` output folder. + +Scaladoc can be somewhat slow to build. In that case, you can tell Bazel to build this target manually, +i.e. only when named explicitly and not through wildcards: `tags = ["manual"]`. + +## Example + +```python +scala_doc( + name = "scala_docs", + tags = ["manual"], + deps = [ + ":target1", + ":target2", + ":anothertarget", + ], + scalacopts = [ + "-Ypartial-unification", + "-Ywarn-unused-import", + ], +) + +# Use pkg_tar to tarball up +# https://docs.bazel.build/versions/master/be/pkg.html#pkg_tar +pkg_tar( + name = "scala_docs_archive", + srcs = [":scala_docs"], + extension = "tar.gz", +) +``` + +## Attributes + +| Attribute name | Description | +| --------------------- | ----------------------------------------------------- | +| name | `Name, required`
A unique name for this target. +| deps | `List of labels, optional`
Labels for which you want to create scaladoc. +| scalacopts | `List of strings, optional`
Extra compiler options for this library to be passed to scalac. diff --git a/docs/scalapb_proto_library.md b/docs/scala_proto_library.md similarity index 89% rename from docs/scalapb_proto_library.md rename to docs/scala_proto_library.md index 1278806c6..b682ac75b 100644 --- a/docs/scalapb_proto_library.md +++ b/docs/scala_proto_library.md @@ -1,24 +1,24 @@ -# scalapb_proto_library +# scala_proto_library To use this rule, you'll first need to add the following to your `WORKSPACE` file, which adds a few dependencies needed for ScalaPB: ```python load("@io_bazel_rules_scala//scala_proto:scala_proto.bzl", "scala_proto_repositories") -scala_proto_repositories(scala_version = "2.12.8") # or whatever scala_version you're on +scala_proto_repositories(scala_version = "2.12.10") # or whatever scala_version you're on ``` -Then you can import `scalapb_proto_library` in any `BUILD` file like this: +Then you can import `scala_proto_library` in any `BUILD` file like this: ```python -load("@io_bazel_rules_scala//scala_proto:scala_proto.bzl", "scalapb_proto_library") -scalapb_proto_library( +load("@io_bazel_rules_scala//scala_proto:scala_proto.bzl", "scala_proto_library") +scala_proto_library( name = "my_scala_proto_lib", deps = [":my_target"], ) ``` -`scalapb_proto_library` generates a Scala library of Scala proto bindings +`scala_proto_library` generates a Scala library of Scala proto bindings generated by the [ScalaPB compiler](https://github.com/scalapb/ScalaPB). ## Attributes @@ -65,4 +65,4 @@ toolchain( | extra_generator_dependencies | `List of labels, optional`
| grpc_deps | `List of labels, optional (has default)`
gRPC dependencies. A sensible default is provided. | implicit_compile_deps | `List of labels, optional (has default)`
ScalaPB dependencies. A sensible default is provided. -| scalac | `Label, optional (has default)`
Target for scalac. A sensible default is provided. \ No newline at end of file +| scalac | `Label, optional (has default)`
Target for scalac. A sensible default is provided. diff --git a/docs/scala_toolchain.md b/docs/scala_toolchain.md index bb9796d43..9bcc2b82c 100644 --- a/docs/scala_toolchain.md +++ b/docs/scala_toolchain.md @@ -2,8 +2,6 @@ `scala_toolchain` allows you to define global configuration to all Scala targets. -Currently, the only option that can be set is `scalacopts` but the plan is to expand it to other options as well. - **Some scala_toolchain must be registered!** ### Several options to configure `scala_toolchain`: @@ -46,3 +44,69 @@ scala_register_toolchains() # WORKSPACE register_toolchains("//toolchains:my_scala_toolchain") ``` + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+| Attribute name | Description |
+| ------------------------------ | ----------------------------------------------------- |
+| scalacopts | `List of strings; optional`<br/>Extra compiler options for this binary to be passed to scalac. |
+| scalac_jvm_flags | `List of strings; optional`<br/>List of JVM flags to be passed to scalac. For example `["-Xmx5G"]` could be passed to control memory usage of scalac. This is overridden by the `scalac_jvm_flags` attribute on individual targets. |
+| scala_test_jvm_flags | `List of strings; optional`<br/>List of JVM flags to be passed to the ScalaTest runner. For example `["-Xmx5G"]` could be passed to control memory usage of the ScalaTest runner. This is overridden by the `jvm_flags` attribute on individual targets. |
+| unused_dependency_checker_mode | `String; optional`<br/>Enable unused dependency checking (see Unused dependency checking). Possible values are: `off`, `warn` and `error`. |
+| enable_code_coverage_aspect | `"on" or "off"; optional; defaults to "off"`<br/>This enables instrumenting tests with jacoco code coverage. |
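To make the attributes above concrete, a toolchain definition exercising them might look like the sketch below. The target names and flag values are illustrative only; the registration follows the pattern shown earlier in this document.

```python
# BUILD file in a consumer workspace (illustrative names and values)
load("@io_bazel_rules_scala//scala:scala_toolchain.bzl", "scala_toolchain")

scala_toolchain(
    name = "my_toolchain_impl",
    scalacopts = ["-Ywarn-unused"],
    # Heap for every scalac worker; a target's own scalac_jvm_flags overrides this.
    scalac_jvm_flags = ["-Xmx2G"],
    # Heap for the ScalaTest runner; a target's own jvm_flags overrides this.
    scala_test_jvm_flags = ["-Xmx2G"],
    unused_dependency_checker_mode = "warn",
    enable_code_coverage_aspect = "off",
    visibility = ["//visibility:public"],
)

toolchain(
    name = "my_scala_toolchain",
    toolchain = "my_toolchain_impl",
    toolchain_type = "@io_bazel_rules_scala//scala:toolchain_type",
    visibility = ["//visibility:public"],
)
```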
\ No newline at end of file diff --git a/java_stub_template/file/BUILD.bazel b/java_stub_template/file/BUILD.bazel new file mode 100644 index 000000000..069065102 --- /dev/null +++ b/java_stub_template/file/BUILD.bazel @@ -0,0 +1,5 @@ +filegroup( + name = "file", + srcs = ["file.txt"], + visibility = ["//visibility:public"], +) diff --git a/java_stub_template/file/file.txt b/java_stub_template/file/file.txt new file mode 100644 index 000000000..2deaedd68 --- /dev/null +++ b/java_stub_template/file/file.txt @@ -0,0 +1,365 @@ +#!/usr/bin/env bash +# Copyright 2014 The Bazel Authors. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# +# This script was generated from java_stub_template.txt. Please +# don't edit it directly. +# +# If present, these flags should either be at the beginning of the command +# line, or they should be wrapped in a --wrapper_script_flag=FLAG argument. +# +# --debug Launch the JVM in remote debugging mode listening +# --debug= to the specified port or the port set in the +# DEFAULT_JVM_DEBUG_PORT environment variable (e.g. +# 'export DEFAULT_JVM_DEBUG_PORT=8000') or else the +# default port of 5005. The JVM starts suspended +# unless the DEFAULT_JVM_DEBUG_SUSPEND environment +# variable is set to 'n'. +# --main_advice= Run an alternate main class with the usual main +# program and arguments appended as arguments. +# --main_advice_classpath= +# Prepend additional class path entries. +# --jvm_flag= Pass to the "java" command itself. +# may contain spaces. Can be used multiple times. +# --jvm_flags= Pass space-separated flags to the "java" command +# itself. Can be used multiple times. +# --singlejar Start the program from the packed-up deployment +# jar rather than from the classpath. +# --print_javabin Print the location of java executable binary and exit. +# --classpath_limit= +# Specify the maximum classpath length. If the classpath +# is shorter, this script passes it to Java as a command +# line flag, otherwise it creates a classpath jar. +# +# The remainder of the command line is passed to the program. + +set -o posix + +# Make it easy to insert 'set -x' or similar commands when debugging problems with this script. +eval "$JAVA_STUB_DEBUG" + +# Prevent problems where the caller has exported CLASSPATH, causing our +# computed value to be copied into the environment and double-counted +# against the argv limit. +unset CLASSPATH + +JVM_FLAGS_CMDLINE=() + +# Processes an argument for the wrapper. Returns 0 if the given argument +# was recognized as an argument for this wrapper, and 1 if it was not. 
+function process_wrapper_argument() { + case "$1" in + --debug) JVM_DEBUG_PORT="${DEFAULT_JVM_DEBUG_PORT:-5005}" ;; + --debug=*) JVM_DEBUG_PORT="${1#--debug=}" ;; + --main_advice=*) MAIN_ADVICE="${1#--main_advice=}" ;; + --main_advice_classpath=*) MAIN_ADVICE_CLASSPATH="${1#--main_advice_classpath=}" ;; + --jvm_flag=*) JVM_FLAGS_CMDLINE+=( "${1#--jvm_flag=}" ) ;; + --jvm_flags=*) JVM_FLAGS_CMDLINE+=( ${1#--jvm_flags=} ) ;; + --singlejar) SINGLEJAR=1 ;; + --print_javabin) PRINT_JAVABIN=1 ;; + --classpath_limit=*) + CLASSPATH_LIMIT="${1#--classpath_limit=}" + echo "$CLASSPATH_LIMIT" | grep -q '^[0-9]\+$' || \ + die "ERROR: $self failed, --classpath_limit is not a number" + ;; + *) + return 1 ;; + esac + return 0 +} + +die() { + printf "%s: $1\n" "$0" "${@:2}" >&2 + exit 1 +} + +# Windows +function is_windows() { + [[ "${OSTYPE}" =~ msys* ]] || [[ "${OSTYPE}" =~ cygwin* ]] +} + +# macOS +function is_macos() { + [[ "${OSTYPE}" =~ darwin* ]] +} + +# Parse arguments sequentially until the first unrecognized arg is encountered. +# Scan the remaining args for --wrapper_script_flag=X options and process them. +ARGS=() +for ARG in "$@"; do + if [[ "$ARG" == --wrapper_script_flag=* ]]; then + process_wrapper_argument "${ARG#--wrapper_script_flag=}" \ + || die "invalid wrapper argument '%s'" "$ARG" + elif [[ "${#ARGS}" -gt 0 ]] || ! process_wrapper_argument "$ARG"; then + ARGS+=( "$ARG" ) + fi +done + +# Find our runfiles tree. We need this to construct the classpath +# (unless --singlejar was passed). +# +# Call this program X. X was generated by a java_binary or java_test rule. +# X may be invoked in many ways: +# 1a) directly by a user, with $0 in the output tree +# 1b) via 'bazel run' (similar to case 1a) +# 2) directly by a user, with $0 in X's runfiles tree +# 3) by another program Y which has a data dependency on X, with $0 in Y's runfiles tree +# 4) via 'bazel test' +# 5) by a genrule cmd, with $0 in the output tree +# 6) case 3 in the context of a genrule +# +# For case 1, $0 will be a regular file, and the runfiles tree will be +# at $0.runfiles. +# For case 2, $0 will be a symlink to the file seen in case 1. +# For case 3, we use Y's runfiles tree, which will be a superset of X's. +# For case 4, $JAVA_RUNFILES and $TEST_SRCDIR should already be set. +# Case 5 is handled like case 1. +# Case 6 is handled like case 3. + +# If we are running on Windows, convert the windows style path +# to unix style for detecting runfiles path. +if is_windows; then + self=$(cygpath --unix "$0") +else + self="$0" +fi + +if [[ "$self" != /* ]]; then + self="$PWD/$self" +fi + +if [[ "$SINGLEJAR" != 1 || "%needs_runfiles%" == 1 ]]; then + if [[ -z "$JAVA_RUNFILES" ]]; then + while true; do + if [[ -e "$self.runfiles" ]]; then + JAVA_RUNFILES="$self.runfiles" + break + fi + if [[ $self == *.runfiles/* ]]; then + JAVA_RUNFILES="${self%.runfiles/*}.runfiles" + break + fi + if [[ ! -L "$self" ]]; then + break + fi + readlink="$(readlink "$self")" + if [[ "$readlink" = /* ]]; then + self="$readlink" + else + # resolve relative symlink + self="${self%/*}/$readlink" + fi + done + if [[ -n "$JAVA_RUNFILES" ]]; then + export TEST_SRCDIR=${TEST_SRCDIR:-$JAVA_RUNFILES} + elif [[ -f "${self}_deploy.jar" && "%needs_runfiles%" == 0 ]]; then + SINGLEJAR=1; + else + die 'Cannot locate runfiles directory. 
(Set $JAVA_RUNFILES to inhibit searching.)' + fi + fi +fi + +# If we are running on Windows, we need a windows style runfiles path for constructing CLASSPATH +if is_windows; then + JAVA_RUNFILES=$(cygpath --windows "$JAVA_RUNFILES") +fi + +export JAVA_RUNFILES +export RUNFILES_MANIFEST_FILE="${JAVA_RUNFILES}/MANIFEST" +export RUNFILES_MANIFEST_ONLY=%runfiles_manifest_only% + +if [ -z "$RUNFILES_MANIFEST_ONLY" ]; then + function rlocation() { + if [[ "$1" = /* ]]; then + echo $1 + else + echo "$(dirname $RUNFILES_MANIFEST_FILE)/$1" + fi + } +else + if ! is_macos; then + # Read file into my_array + oifs=$IFS + IFS=$'\n' + my_array=( $(sed -e 's/\r//g' "$RUNFILES_MANIFEST_FILE") ) + IFS=$oifs + + # Process each runfile line into a [key,value] entry in runfiles_array + # declare -A is not supported on macOS because an old version of bash is used. + declare -A runfiles_array + for line in "${my_array[@]}" + do + line_split=($line) + runfiles_array[${line_split[0]}]=${line_split[@]:1} + done + fi + + function rlocation() { + if [[ "$1" = /* ]]; then + echo $1 + else + if is_macos; then + # Print the rest of line after the first space + # First, set the first column to empty and print rest of the line + # Second, use a trick of awk to remove leading and trailing spaces. + echo $(grep "^$1 " $RUNFILES_MANIFEST_FILE | awk '{ $1=""; print }' | awk '{ $1=$1; print }') + else + echo ${runfiles_array[$1]} + fi + fi + } +fi + +if is_macos || [[ ${OSTYPE} == "freebsd" ]]; then + function md5func() { md5 -q $@ ; } +else + function md5func() { md5sum $@ | awk '{print $1}' ; } +fi + +# Set JAVABIN to the path to the JVM launcher. +%javabin% + +if [[ "$PRINT_JAVABIN" == 1 || "%java_start_class%" == "--print_javabin" ]]; then + echo -n "$JAVABIN" + exit 0 +fi + +if [[ "$SINGLEJAR" == 1 ]]; then + CLASSPATH="${self}_deploy.jar" + # Check for the deploy jar now. If it doesn't exist, we can print a + # more helpful error message than the JVM. + [[ -r "$CLASSPATH" ]] \ + || die "Option --singlejar was passed, but %s does not exist.\n (You may need to build it explicitly.)" "$CLASSPATH" +else + # Create the shortest classpath we can, by making it relative if possible. + RUNPATH="${JAVA_RUNFILES}/%workspace_prefix%" + RUNPATH="${RUNPATH#$PWD/}" + CLASSPATH=%classpath% +fi + +# Export the locations which will be used to find the location of the classes from the classpath file. +export SELF_LOCATION="$self" +export CLASSLOADER_PREFIX_PATH="${RUNPATH}" + +# If using Jacoco in offline instrumentation mode, the CLASSPATH contains instrumented files. +# We need to make the metadata jar with uninstrumented classes available for generating +# the lcov-compatible coverage report, and we don't want it on the classpath. +%set_jacoco_metadata% +%set_jacoco_main_class% +%set_jacoco_java_runfiles_root% +%set_java_coverage_new_implementation% + +if [[ -n "$JVM_DEBUG_PORT" ]]; then + JVM_DEBUG_SUSPEND=${DEFAULT_JVM_DEBUG_SUSPEND:-"y"} + JVM_DEBUG_FLAGS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=${JVM_DEBUG_SUSPEND},address=${JVM_DEBUG_PORT}" + + if [[ "$PERSISTENT_TEST_RUNNER" == "true" ]]; then + JVM_DEBUG_FLAGS+=",quiet=y" + fi +fi + +if [[ -n "$MAIN_ADVICE_CLASSPATH" ]]; then + CLASSPATH="${MAIN_ADVICE_CLASSPATH}:${CLASSPATH}" +fi + +# Check if TEST_TMPDIR is available to use for scratch. 
+if [[ -n "$TEST_TMPDIR" && -d "$TEST_TMPDIR" ]]; then + JVM_FLAGS+=" -Djava.io.tmpdir=$TEST_TMPDIR" +fi + +ARGS=( + ${JVM_DEBUG_FLAGS} + ${JVM_FLAGS} + %jvm_flags% + "${JVM_FLAGS_CMDLINE[@]}" + ${MAIN_ADVICE} + %java_start_class% + "${ARGS[@]}") + + +function create_and_run_classpath_jar() { + # Build class path as one single string separated by spaces + MANIFEST_CLASSPATH="" + if is_windows; then + CLASSPATH_SEPARATOR=";" + URI_PREFIX="file:/" # e.g. "file:/C:/temp/foo.jar" + else + CLASSPATH_SEPARATOR=":" + URI_PREFIX="file:$(pwd)/" # e.g. "file:/usr/local/foo.jar" + fi + + URI_PREFIX=${URI_PREFIX//\//\\/} + MANIFEST_CLASSPATH="${CLASSPATH_SEPARATOR}${CLASSPATH}" + + MANIFEST_CLASSPATH=$(sed "s/ /%20/g" <<< "${MANIFEST_CLASSPATH}") + MANIFEST_CLASSPATH=$(sed "s/$CLASSPATH_SEPARATOR/ $URI_PREFIX/g" <<< "${MANIFEST_CLASSPATH}") + + # Create manifest file + MANIFEST_FILE="$(mktemp -t XXXXXXXX.jar_manifest)" + + ( + echo "Manifest-Version: 1.0" + + CLASSPATH_LINE="Class-Path:$MANIFEST_CLASSPATH" + # No line in the MANIFEST.MF file may be longer than 72 bytes. + # A space prefix indicates the line is still the content of the last attribute. + CLASSPATH_MANIFEST_LINES=$(sed -E $'s/(.{71})/\\1\\\n /g' <<< "${CLASSPATH_LINE}") + echo "$CLASSPATH_MANIFEST_LINES" + echo "Created-By: Bazel" + ) >$MANIFEST_FILE + + # Create classpath JAR file + MANIFEST_JAR_FILE="$(mktemp -t XXXXXXXX-classpath.jar)" + if is_windows; then + MANIFEST_JAR_FILE="$(cygpath --windows "$MANIFEST_JAR_FILE")" + MANIFEST_FILE="$(cygpath --windows "$MANIFEST_FILE")" + fi + JARBIN="${JARBIN:-$(rlocation "$1")}" + $JARBIN cvfm "$MANIFEST_JAR_FILE" "$MANIFEST_FILE" >/dev/null || \ + die "ERROR: $self failed because $JARBIN failed" + + # Execute JAVA command + $JAVABIN -classpath "$MANIFEST_JAR_FILE" "${ARGS[@]}" + exit_code=$? + rm -f "$MANIFEST_FILE" + rm -f "$MANIFEST_JAR_FILE" + exit $exit_code +} + +# If the user didn't specify a --classpath_limit, use the default value. 
+if [ -z "$CLASSPATH_LIMIT" ]; then + # Windows per-arg limit MAX_ARG_STRLEN == 8k + # Linux per-arg limit MAX_ARG_STRLEN == 128k + is_windows && CLASSPATH_LIMIT=7000 || CLASSPATH_LIMIT=120000 +fi + +#Difference from https://github.com/bazelbuild/bazel/blob/68c7e5a3c679be51f750d44aae146007f0f04b4d/src/main/java/com/google/devtools/build/lib/bazel/rules/java/java_stub_template.txt +JARBIN="%jarbin%" +PERCENTAGE_LITERAL="%" +JARBIN_LITERAL="${PERCENTAGE_LITERAL}jarbin${PERCENTAGE_LITERAL}" +# if jarbin is evaluated to empty string or not substituted fallback to previous behavior +# for backwards compatibility +if [ -z "$JARBIN" ] || [ "$JARBIN" == "$JARBIN_LITERAL" ] ; then + JARBIN="local_jdk/bin/jar" +fi +#Difference ends + +if is_windows && (("${#CLASSPATH}" > ${CLASSPATH_LIMIT} )); then + create_and_run_classpath_jar "${JARBIN}.exe" +elif (("${#CLASSPATH}" > ${CLASSPATH_LIMIT})); then + create_and_run_classpath_jar "$JARBIN" +else + exec $JAVABIN -classpath $CLASSPATH "${ARGS[@]}" +fi \ No newline at end of file diff --git a/jmh/jmh.bzl b/jmh/jmh.bzl index f59318918..d4274de0d 100644 --- a/jmh/jmh.bzl +++ b/jmh/jmh.bzl @@ -4,11 +4,11 @@ load( _scala_maven_import_external = "scala_maven_import_external", ) -def jmh_repositories(maven_servers = ["https://repo1.maven.org/maven2"]): +def jmh_repositories(maven_servers = ["https://repo.maven.apache.org/maven2"]): _scala_maven_import_external( name = "io_bazel_rules_scala_org_openjdk_jmh_jmh_core", artifact = "org.openjdk.jmh:jmh-core:1.20", - jar_sha256 = "1688db5110ea6413bf63662113ed38084106ab1149e020c58c5ac22b91b842ca", + artifact_sha256 = "1688db5110ea6413bf63662113ed38084106ab1149e020c58c5ac22b91b842ca", licenses = ["notice"], server_urls = maven_servers, ) @@ -19,7 +19,7 @@ def jmh_repositories(maven_servers = ["https://repo1.maven.org/maven2"]): _scala_maven_import_external( name = "io_bazel_rules_scala_org_openjdk_jmh_jmh_generator_asm", artifact = "org.openjdk.jmh:jmh-generator-asm:1.20", - jar_sha256 = "2dd4798b0c9120326310cda3864cc2e0035b8476346713d54a28d1adab1414a5", + artifact_sha256 = "2dd4798b0c9120326310cda3864cc2e0035b8476346713d54a28d1adab1414a5", licenses = ["notice"], server_urls = maven_servers, ) @@ -30,7 +30,7 @@ def jmh_repositories(maven_servers = ["https://repo1.maven.org/maven2"]): _scala_maven_import_external( name = "io_bazel_rules_scala_org_openjdk_jmh_jmh_generator_reflection", artifact = "org.openjdk.jmh:jmh-generator-reflection:1.20", - jar_sha256 = "57706f7c8278272594a9afc42753aaf9ba0ba05980bae0673b8195908d21204e", + artifact_sha256 = "57706f7c8278272594a9afc42753aaf9ba0ba05980bae0673b8195908d21204e", licenses = ["notice"], server_urls = maven_servers, ) @@ -42,7 +42,7 @@ def jmh_repositories(maven_servers = ["https://repo1.maven.org/maven2"]): _scala_maven_import_external( name = "io_bazel_rules_scala_org_ows2_asm_asm", artifact = "org.ow2.asm:asm:6.1.1", - jar_sha256 = "dd3b546415dd4bade2ebe3b47c7828ab0623ee2336604068e2d81023f9f8d833", + artifact_sha256 = "dd3b546415dd4bade2ebe3b47c7828ab0623ee2336604068e2d81023f9f8d833", licenses = ["notice"], server_urls = maven_servers, ) @@ -53,7 +53,7 @@ def jmh_repositories(maven_servers = ["https://repo1.maven.org/maven2"]): _scala_maven_import_external( name = "io_bazel_rules_scala_net_sf_jopt_simple_jopt_simple", artifact = "net.sf.jopt-simple:jopt-simple:5.0.3", - jar_sha256 = "6f45c00908265947c39221035250024f2caec9a15c1c8cf553ebeecee289f342", + artifact_sha256 = "6f45c00908265947c39221035250024f2caec9a15c1c8cf553ebeecee289f342", licenses = ["notice"], server_urls 
= maven_servers, ) @@ -65,7 +65,7 @@ def jmh_repositories(maven_servers = ["https://repo1.maven.org/maven2"]): _scala_maven_import_external( name = "io_bazel_rules_scala_org_apache_commons_commons_math3", artifact = "org.apache.commons:commons-math3:3.6.1", - jar_sha256 = "1e56d7b058d28b65abd256b8458e3885b674c1d588fa43cd7d1cbb9c7ef2b308", + artifact_sha256 = "1e56d7b058d28b65abd256b8458e3885b674c1d588fa43cd7d1cbb9c7ef2b308", licenses = ["notice"], server_urls = maven_servers, ) @@ -75,22 +75,24 @@ def jmh_repositories(maven_servers = ["https://repo1.maven.org/maven2"]): actual = "@io_bazel_rules_scala_org_apache_commons_commons_math3//jar", ) -def _scala_construct_runtime_classpath(deps): - scala_targets = [d.scala for d in deps if hasattr(d, "scala")] - java_targets = [d.java for d in deps if hasattr(d, "java")] - files = [] - for scala in scala_targets: - files.append(scala.transitive_runtime_jars) - for java in java_targets: - files.append(java.transitive_runtime_deps) - return depset(transitive = files) - def _scala_generate_benchmark(ctx): - class_jar = ctx.attr.src.scala.outputs.class_jar - classpath = _scala_construct_runtime_classpath([ctx.attr.src]) + # we use required providers to ensure JavaInfo exists + info = ctx.attr.src[JavaInfo] + + # TODO, if we emit more than one jar, which scala_library does not, + # this might fail. We could possibly extend the BenchmarkGenerator + # to accept more than one jar to scan, and then allow multiple labels + # in ctx.attr.src + outs = info.outputs.jars + if len(outs) != 1: + print("expected exactly 1 output jar in: " + ctx.label) + + # just try to take the first one and see if that works + class_jar = outs[0].class_jar + classpath = info.transitive_runtime_deps ctx.actions.run( outputs = [ctx.outputs.src_jar, ctx.outputs.resource_jar], - inputs = depset([class_jar], transitive = [classpath]), + inputs = classpath, executable = ctx.executable._generator, arguments = [ctx.attr.generator_type] + [ f.path @@ -103,7 +105,7 @@ def _scala_generate_benchmark(ctx): scala_generate_benchmark = rule( implementation = _scala_generate_benchmark, attrs = { - "src": attr.label(allow_single_file = True, mandatory = True), + "src": attr.label(mandatory = True, providers = [[JavaInfo]]), "generator_type": attr.string( default = "reflection", mandatory = False, diff --git a/junit/junit.bzl b/junit/junit.bzl index 221ab217c..4279483ea 100644 --- a/junit/junit.bzl +++ b/junit/junit.bzl @@ -3,11 +3,11 @@ load( _scala_maven_import_external = "scala_maven_import_external", ) -def junit_repositories(maven_servers = ["https://repo1.maven.org/maven2"]): +def junit_repositories(maven_servers = ["https://repo.maven.apache.org/maven2"]): _scala_maven_import_external( name = "io_bazel_rules_scala_junit_junit", artifact = "junit:junit:4.12", - jar_sha256 = "59721f0805e223d84b90677887d9ff567dc534d7c502ca903c0c2b17f05c116a", + artifact_sha256 = "59721f0805e223d84b90677887d9ff567dc534d7c502ca903c0c2b17f05c116a", licenses = ["notice"], server_urls = maven_servers, ) @@ -19,7 +19,7 @@ def junit_repositories(maven_servers = ["https://repo1.maven.org/maven2"]): _scala_maven_import_external( name = "io_bazel_rules_scala_org_hamcrest_hamcrest_core", artifact = "org.hamcrest:hamcrest-core:1.3", - jar_sha256 = "66fdef91e9739348df7a096aa384a5685f4e875584cce89386a7a47251c4d8e9", + artifact_sha256 = "66fdef91e9739348df7a096aa384a5685f4e875584cce89386a7a47251c4d8e9", licenses = ["notice"], server_urls = maven_servers, ) diff --git a/lint.sh b/lint.sh index 9961a25d4..11d0db475 100755 --- 
a/lint.sh +++ b/lint.sh @@ -1,293 +1,5 @@ #!/usr/bin/env bash -# Copyright 2018 The Bazel Authors. All rights reserved. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. set -eou pipefail -# Usage -# ===== -# -# - to lint/check conformance to style and best practice of all the files in -# the current working directory: "./lint.sh" or "./lint.sh check". -# - to fix what can be fixed automatically: "./lint.sh fix". -# - to skip a step, e.g. Skylark linting: "FMT_SKYLINT=false ./lint.sh check". -# -# -# Linters/formatters featured here -# ================================ -# -# - google-java-format: Java code -# - buildifier: BUILD/WORKSPACE files -# - skylint: Skylark files (*.bzl) - check only -# - yapf: Skylark and Python files -# -# -# An important note concerning trailing commas -# ============================================ -# -# Yapf could generate during fixing this Skylark one-liner: -# ``` -# foo = rule(attrs = {"hello": attr.string()}, implementation = _impl) -# ``` -# from some perfectly normal-looking code: -# ``` -# foo = rule( -# implementation = _impl, -# attrs = { -# "hello": attr.string() -# } -# ) -# ``` -# but this reformatting is not expected to pass validation. What is missing -# is trailing commas, after the last argument to `rule` and after the last -# element of the dictionary. If those are put: -# ``` -# foo = rule( -# implementation = _impl, -# attrs = { -# "hello": attr.string(), -# }, -# ) -# ``` -# then our configuration of Yapf won't touch anything. -# -# -# Implementation details: Why we need Buildifier, Skylint and Yapf -# ================================================================ -# -# Why do we need Buildifier, Skylint and Yapf to validate Bazel/Skylark files? -# Skylark is after all only a dialect of Python! The reasons are as follows: -# -# - Buildifier can fix BUILD/WORKSPACE files but breaks *.bzl files when it attempts -# to fix them. -# -# - Yapf is the only utility used here that can fix *.bzl files because it understands Python -# fully and not only a special subset of it. -# -# - However, Yapf does not enforce certain conventions that people have in *.bzl files, -# related to the fact that they are used to enrich BUILD files. That's where -# Buildifier comes in handy. For instance, Yapf could generate this one-liner: -# ``` -# foo = rule(implementation = _impl, attrs = {"hello": attr.string()}) -# ``` -# but this is an error for Buildifier, as it expects every keyword argument -# to be on their own line as well as the attribute dictionary to be split, and the -# `attrs` argument to come before the `implementation` argument (alphabetical order). -# By running Buildifier after Yapf, we ensure that these conventions are respected. -# Here, to force Yapf to split the arguments and the dictionary, we can add a comma after -# the last argument/element. Moreover, Yapf does not reorder keywords. 
Overall, -# if we supply this snippet to Yapf: -# ``` -# foo = rule(attrs = {"hello": attr.string(),}, implementation = _impl,) -# ``` -# we'll end up, after fixing, with: -# ``` -# foo = rule( -# attrs = { -# "hello": attr.string(), -# }, -# implementation = _impl, -# ) -# ``` -# which passes the Buildifier validation. -# -# - Buildifier only validates *.bzl files with respect to their likeness to BUILD files. -# To validate the semantic specific to Skylark files and ensure good practices are followed -# (documentation, unused imports, ...), Skylint can be used. Skylint only operates in "check" -# mode, it cannot fix anything on its own. (On an unrelated note, Pylint gives meaningless -# results when applied to Skylark files, so that's why Skylint is used here.) -# -# - Overall, this sauce has been chosen because it gives an automatic formatting and -# linting warnings that feel natural for Skylark. - -BASE="$(pwd)" -MODE="${1:-check}" - -if [ "$MODE" = "check" ]; then - JAVA_OPTIONS=--dry-run - BUILDIFIER_MODE=check - YAPF_OPTIONS=--diff -else - YAPF_OPTIONS=--in-place - JAVA_OPTIONS=--replace - BUILDIFIER_MODE=fix -fi - -BAZEL_BIN=$(bazel info bazel-bin) -BAZEL_OUTPUT_BASE=$(bazel info output_base) - -function build() { - # NOTE: if and when the Skylink target becomes public, use a sh_binary instead - # of building everything here? - bazel build --color=yes --show_progress_rate_limit=30 \ - @io_bazel//src/tools/skylark/java/com/google/devtools/skylark/skylint:Skylint \ - //private:java_format \ - @com_github_google_yapf//:yapf \ - @io_bazel_buildifier_linux//file \ - @io_bazel_buildifier_darwin//file -} - -function format_py_like() { - local PATTERN=$1 - local STYLE=$(cat) - local OUTPUT - - OUTPUT=$(find "$BASE" \ - -not \( -path $BASE/.git -prune \) \ - -name "$PATTERN" -exec "$BAZEL_BIN/external/com_github_google_yapf/yapf/yapf" \ - $YAPF_OPTIONS \ - "--style=$STYLE" \ - {} \;) - if [ $? != 0 ]; then - return 1 - fi - if [ "$MODE" = "check" ] && [ ! -z "$OUTPUT" ]; then - echo "$OUTPUT" - return 1 - fi -} - -function format_skylark() { - format_py_like "*.bzl" <<'EOF' -{ - based_on_style: google, - spaces_around_default_or_named_assign: True, - blank_lines_around_top_level_definition: 1, - indent_width: 2, - allow_split_before_dict_value: False, - each_dict_entry_on_separate_line: True, - split_arguments_when_comma_terminated: True, -} -EOF -} - -function format_python() { - format_py_like "*.py" <<'EOF' -{ - based_on_style: google, - spaces_around_default_or_named_assign: True, - blank_lines_around_top_level_definition: 2, - indent_width: 2, - indent_dictionary_value: True -} -EOF -} - -function format_bazel() { - if [ "$(uname)" = "Darwin" ]; then - BUILDIFIER=$BAZEL_OUTPUT_BASE/external/io_bazel_buildifier_darwin/file/downloaded - else - BUILDIFIER=$BAZEL_OUTPUT_BASE/external/io_bazel_buildifier_linux/file/downloaded - fi - - ERRORS=0 - $BUILDIFIER -mode=$BUILDIFIER_MODE $( - find "$BASE" \ - -not \( -path $BASE/.git -prune \) \ - -name BUILD -type f) - ERRORS=$((ERRORS+$?)) - $BUILDIFIER -mode=$BUILDIFIER_MODE $( - find "$BASE" \ - -not \( -path $BASE/.git -prune \) \ - -name WORKSPACE -type f) - ERRORS=$((ERRORS+$?)) - - # (buildifier cannot format *.bzl files) - if [ "$MODE" = "check" ] && ! $BUILDIFIER -mode=check $(find "$BASE" -not \( -path $BASE/.git -prune \) -name "*.bzl" -type f) >/dev/null; then - echo "*.bzl BUILDIFIER ERRORS:" - for f in $(find "$BASE" -not \( -path $BASE/.git -prune \) -name "*.bzl" -type f); do - OUTPUT=$($BUILDIFIER -mode=diff $f) - if [ ! 
-z "$OUTPUT" ]; then - echo "$f" - echo "$OUTPUT" - fi - done - # Some errors are false positives. - echo "(buildifier on *.bzl files: not enforced)" - fi - - if [ $ERRORS != 0 ]; then - echo "Errors: $ERRORS" - return 1 - fi -} - -function format_java() { - local OUTPUT - - OUTPUT=$("$BAZEL_BIN/private/java_format" $JAVA_OPTIONS $( - find "$BASE" \ - -not \( -path $BASE/.git -prune \) \ - -name "*.java" -type f)) - - if [ "$MODE" = "check" ] && [ ! -z "$OUTPUT" ]; then - echo "$OUTPUT" - return 1 - fi -} - -# Skylint only operates in "check" mode, it is a no-op in "fix" mode. -function skylint() { - local OUTPUT - - OUTPUT=$( - find "$BASE" \ - -not \( -path $BASE/.git -prune \) \ - -type f -name "*.bzl" -exec \ - "$BAZEL_BIN/external/io_bazel/src/tools/skylark/java/com/google/devtools/skylark/skylint/Skylint" \ - {} \; - ) - if [ "$MODE" = "check" ] && [ ! -z "$OUTPUT" ]; then - echo "$OUTPUT" - return 1 - fi -} - -SUMMARY="" -OVERALL_RESULT=0 - -function record() { - local SECTION_NAME=$1 - local FUNC=$2 - local DO=$3 - local STATUS - - if ! $DO; then - STATUS="Skipped" - elif eval "$FUNC"; then - STATUS="Ok" - else - STATUS="Failure" - OVERALL_RESULT=1 - fi - - SUMMARY+="$SECTION_NAME $STATUS"$'\n' -} - -function summarize() { - echo "============ SUMMARY ============" - echo "$SUMMARY" - return $OVERALL_RESULT -} - -if "${FMT_PREPARE:-true}"; then - build -fi -record skylark format_skylark "${FMT_SKYLARK:-true}" -record python format_python "${FMT_PYTHON:-true}" -record bazel format_bazel "${FMT_BAZEL:-true}" -record java format_java "${FMT_JAVA:-true}" -SKYLINT="${FMT_SKYLINT:-true}" && [ "$MODE" = "check" ] -record skylint skylint "$SKYLINT" -summarize +./tools/bazel run //tools:buildifier@fix diff --git a/manual_test/README.md b/manual_test/README.md new file mode 100644 index 000000000..acaefadf9 --- /dev/null +++ b/manual_test/README.md @@ -0,0 +1 @@ +This directory contains tests that require extra setup such as extra bazel flags. \ No newline at end of file diff --git a/manual_test/scala_test_jvm_flags/BUILD b/manual_test/scala_test_jvm_flags/BUILD new file mode 100644 index 000000000..1012d6dd1 --- /dev/null +++ b/manual_test/scala_test_jvm_flags/BUILD @@ -0,0 +1,43 @@ +load("//scala:scala_toolchain.bzl", "scala_toolchain") +load("//scala:scala.bzl", "scala_test") + +scala_toolchain( + name = "failing_toolchain_impl", + # This will fail because 1M isn't enough + scala_test_jvm_flags = ["-Xmx1M"], + visibility = ["//visibility:public"], +) + +toolchain( + name = "failing_scala_toolchain", + toolchain = "failing_toolchain_impl", + toolchain_type = "@io_bazel_rules_scala//scala:toolchain_type", + visibility = ["//visibility:public"], +) + +scala_toolchain( + name = "passing_toolchain_impl", + # This will pass because 1G is enough + scala_test_jvm_flags = ["-Xmx1G"], + visibility = ["//visibility:public"], +) + +toolchain( + name = "passing_scala_toolchain", + toolchain = "passing_toolchain_impl", + toolchain_type = "@io_bazel_rules_scala//scala:toolchain_type", + visibility = ["//visibility:public"], +) + +scala_test( + name = "empty_test", + srcs = ["EmptyTest.scala"], +) + +scala_test( + name = "empty_overriding_test", + srcs = ["EmptyTest.scala"], + # This overrides the option passed in on the toolchain, and should BUILD, even if + # the `failing_scala_toolchain` is used. 
+ jvm_flags = ["-Xmx1G"], +) diff --git a/manual_test/scala_test_jvm_flags/EmptyTest.scala b/manual_test/scala_test_jvm_flags/EmptyTest.scala new file mode 100644 index 000000000..d1fbfc7a0 --- /dev/null +++ b/manual_test/scala_test_jvm_flags/EmptyTest.scala @@ -0,0 +1,9 @@ +package test_expect_failure.scala_test_jvm_flags + +import org.scalatest.FunSuite + +class EmptyTest extends FunSuite { + test("empty test") { + assert(true) + } +} \ No newline at end of file diff --git a/manual_test/scalac_jvm_opts/BUILD b/manual_test/scalac_jvm_opts/BUILD new file mode 100644 index 000000000..65d39b83a --- /dev/null +++ b/manual_test/scalac_jvm_opts/BUILD @@ -0,0 +1,61 @@ +load("//scala:scala_toolchain.bzl", "scala_toolchain") +load("//scala:scala.bzl", "scala_library") +load( + "//scala_proto:scala_proto.bzl", + "scala_proto_library", +) + +scala_toolchain( + name = "failing_toolchain_impl", + # This will fail because 1M isn't enough + scalac_jvm_flags = ["-Xmx1M"], + visibility = ["//visibility:public"], +) + +toolchain( + name = "failing_scala_toolchain", + toolchain = "failing_toolchain_impl", + toolchain_type = "@io_bazel_rules_scala//scala:toolchain_type", + visibility = ["//visibility:public"], +) + +scala_toolchain( + name = "passing_toolchain_impl", + # This will pass because 1G is enough + scalac_jvm_flags = ["-Xmx1G"], + visibility = ["//visibility:public"], +) + +toolchain( + name = "passing_scala_toolchain", + toolchain = "passing_toolchain_impl", + toolchain_type = "@io_bazel_rules_scala//scala:toolchain_type", + visibility = ["//visibility:public"], +) + +scala_library( + name = "empty_build", + srcs = ["Empty.scala"], +) + +scala_library( + name = "empty_overriding_build", + srcs = ["Empty.scala"], + # This overrides the option passed in on the toolchain, and should BUILD, even if + # the `failing_scala_toolchain` is used. + scalac_jvm_flags = ["-Xmx1G"], +) + +proto_library( + name = "test", + srcs = ["test.proto"], + visibility = ["//visibility:public"], +) + +# This is a regression test for a bug that broke compiling scalapb targets when +# `scalac_jvm_flags` was set on the toolchain. 
+scala_proto_library( + name = "proto", + visibility = ["//visibility:public"], + deps = [":test"], +) diff --git a/manual_test/scalac_jvm_opts/Empty.scala b/manual_test/scalac_jvm_opts/Empty.scala new file mode 100644 index 000000000..691dbdd9b --- /dev/null +++ b/manual_test/scalac_jvm_opts/Empty.scala @@ -0,0 +1,3 @@ +package test_expect_failure.scalac_jvm_opts + +class Empty \ No newline at end of file diff --git a/manual_test/scalac_jvm_opts/test.proto b/manual_test/scalac_jvm_opts/test.proto new file mode 100644 index 000000000..0884e1d05 --- /dev/null +++ b/manual_test/scalac_jvm_opts/test.proto @@ -0,0 +1,8 @@ +syntax = "proto2"; + +option java_package = "test.proto"; + +message TestResponse1 { + optional uint32 c = 1; + optional bool d = 2; +} diff --git a/private/format.bzl b/private/format.bzl index 944d4dbe1..c7939c173 100644 --- a/private/format.bzl +++ b/private/format.bzl @@ -70,25 +70,3 @@ def format_repositories(): ], jar_sha256 = ("7b839bb7534a173f0ed0cd0e9a583181d20850fcec8cf6e3800e4420a1fad184"), ) - - http_file( - name = "io_bazel_buildifier_linux", - urls = [ - "https://github.com/bazelbuild/buildtools/releases/download/0.11.1/buildifier", - ], - sha256 = ( - "d7d41def74991a34dfd2ac8a73804ff11c514c024a901f64ab07f45a3cf0cfef" - ), - executable = True, - ) - - http_file( - name = "io_bazel_buildifier_darwin", - urls = [ - "https://github.com/bazelbuild/buildtools/releases/download/0.11.1/buildifier.osx", - ], - sha256 = ( - "3cbd708ff77f36413cfaef89cd5790a1137da5dfc3d9b3b3ca3fac669fbc298b" - ), - executable = True, - ) diff --git a/scala/BUILD b/scala/BUILD index 712f2e549..a5c8f2416 100644 --- a/scala/BUILD +++ b/scala/BUILD @@ -58,3 +58,9 @@ _declare_scalac_provider( ], visibility = ["//visibility:public"], ) + +java_library( + name = "PlaceHolderClassToCreateEmptyJarForScalaImport", + srcs = ["PlaceHolderClassToCreateEmptyJarForScalaImport.java"], + visibility = ["//visibility:public"], +) diff --git a/scala/PlaceHolderClassToCreateEmptyJarForScalaImport.java b/scala/PlaceHolderClassToCreateEmptyJarForScalaImport.java new file mode 100644 index 000000000..76bad019d --- /dev/null +++ b/scala/PlaceHolderClassToCreateEmptyJarForScalaImport.java @@ -0,0 +1 @@ +public class PlaceHolderClassToCreateEmptyJarForScalaImport { } diff --git a/scala/advanced_usage/providers.bzl b/scala/advanced_usage/providers.bzl new file mode 100644 index 000000000..da329eccb --- /dev/null +++ b/scala/advanced_usage/providers.bzl @@ -0,0 +1,11 @@ +""" +A phase provider for customizable rules +It is used only when you intend to add functionalities to existing default rules +""" + +ScalaRulePhase = provider( + doc = "A custom phase plugin", + fields = { + "custom_phases": "The phases to add. It takes an array of (relation, peer_name, phase_name, phase_function). 
Please refer to docs/customizable_phase.md for more details.", + }, +) diff --git a/scala/advanced_usage/scala.bzl b/scala/advanced_usage/scala.bzl new file mode 100644 index 000000000..552532880 --- /dev/null +++ b/scala/advanced_usage/scala.bzl @@ -0,0 +1,35 @@ +""" +Re-expose the customizable rules +It is used only when you intend to add functionalities to existing default rules +""" + +load( + "@io_bazel_rules_scala//scala/private:rules/scala_binary.bzl", + _make_scala_binary = "make_scala_binary", +) +load( + "@io_bazel_rules_scala//scala/private:rules/scala_junit_test.bzl", + _make_scala_junit_test = "make_scala_junit_test", +) +load( + "@io_bazel_rules_scala//scala/private:rules/scala_library.bzl", + _make_scala_library = "make_scala_library", + _make_scala_library_for_plugin_bootstrapping = "make_scala_library_for_plugin_bootstrapping", + _make_scala_macro_library = "make_scala_macro_library", +) +load( + "@io_bazel_rules_scala//scala/private:rules/scala_repl.bzl", + _make_scala_repl = "make_scala_repl", +) +load( + "@io_bazel_rules_scala//scala/private:rules/scala_test.bzl", + _make_scala_test = "make_scala_test", +) + +make_scala_binary = _make_scala_binary +make_scala_library = _make_scala_library +make_scala_library_for_plugin_bootstrapping = _make_scala_library_for_plugin_bootstrapping +make_scala_macro_library = _make_scala_macro_library +make_scala_repl = _make_scala_repl +make_scala_junit_test = _make_scala_junit_test +make_scala_test = _make_scala_test diff --git a/scala/plusone.bzl b/scala/plusone.bzl index 90b1c1f43..1efd17093 100644 --- a/scala/plusone.bzl +++ b/scala/plusone.bzl @@ -5,13 +5,23 @@ For motivation of plus one see the e2e tests """ PlusOneDeps = provider( fields = { - 'direct_deps' : 'list of direct compile dependencies of a target', - } + "direct_deps": "list of direct compile dependencies of a target", + }, ) def _collect_plus_one_deps_aspect_impl(target, ctx): - return [PlusOneDeps(direct_deps = getattr(ctx.rule.attr,'deps',[]))] + if (ctx.toolchains["@io_bazel_rules_scala//scala:toolchain_type"].plus_one_deps_mode == "off"): + return [] + export_plus_one_deps = [] + for exported_dep in getattr(ctx.rule.attr, "exports", []): + if PlusOneDeps in exported_dep: + export_plus_one_deps.extend(exported_dep[PlusOneDeps].direct_deps) + return [PlusOneDeps(direct_deps = export_plus_one_deps + getattr(ctx.rule.attr, "deps", []))] -collect_plus_one_deps_aspect = aspect(implementation = _collect_plus_one_deps_aspect_impl, - attr_aspects = ['deps'], +collect_plus_one_deps_aspect = aspect( + implementation = _collect_plus_one_deps_aspect_impl, + attr_aspects = ["deps", "exports"], + toolchains = [ + "@io_bazel_rules_scala//scala:toolchain_type", + ], ) diff --git a/scala/private/common.bzl b/scala/private/common.bzl index 2cea9ffd3..456e2b6c1 100644 --- a/scala/private/common.bzl +++ b/scala/private/common.bzl @@ -1,10 +1,6 @@ load("@io_bazel_rules_scala//scala:jars_to_labels.bzl", "JarsToLabelsInfo") load("@io_bazel_rules_scala//scala:plusone.bzl", "PlusOneDeps") -def write_manifest(ctx): - main_class = getattr(ctx.attr, "main_class", None) - write_manifest_file(ctx.actions, ctx.outputs.manifest, main_class) - def write_manifest_file(actions, output_file, main_class): # TODO(bazel-team): I don't think this classpath is what you want manifest = "Class-Path: \n" @@ -13,13 +9,6 @@ def write_manifest_file(actions, output_file, main_class): actions.write(output = output_file, content = manifest) -def collect_srcjars(targets): - srcjars = [] - for target in targets: - if 
hasattr(target, "srcjars"): - srcjars.append(target.srcjars.srcjar) - return depset(srcjars) - def collect_jars( dep_targets, dependency_analyzer_is_off = True, @@ -36,6 +25,21 @@ def collect_jars( else: return _collect_jars_when_dependency_analyzer_is_on(dep_targets) +def collect_plugin_paths(plugins): + """Get the actual jar paths of plugins as a depset.""" + paths = [] + for p in plugins: + if hasattr(p, "path"): + paths.append(p) + elif JavaInfo in p: + paths.extend([j.class_jar for j in p[JavaInfo].outputs.jars]) + # support http_file pointed at a jar. http_jar uses ijar, + # which breaks scala macros + + elif hasattr(p, "files"): + paths.extend([f for f in p.files.to_list() if not_sources_jar(f.basename)]) + return depset(paths) + def _collect_jars_when_dependency_analyzer_is_off( dep_targets, unused_dependency_checker_is_off, @@ -45,27 +49,27 @@ def _collect_jars_when_dependency_analyzer_is_off( runtime_jars = [] jars2labels = {} + deps_providers = [] + for dep_target in dep_targets: # we require a JavaInfo for dependencies # must use java_import or scala_import if you have raw files - if JavaInfo in dep_target: - java_provider = dep_target[JavaInfo] - compile_jars.append(java_provider.compile_jars) - runtime_jars.append(java_provider.transitive_runtime_jars) - - if not unused_dependency_checker_is_off: - add_labels_of_jars_to( - jars2labels, - dep_target, - [], - java_provider.compile_jars.to_list(), - ) - else: - print("ignored dependency, has no JavaInfo: " + str(dep_target)) + java_provider = dep_target[JavaInfo] + deps_providers.append(java_provider) + compile_jars.append(java_provider.compile_jars) + runtime_jars.append(java_provider.transitive_runtime_jars) + + if not unused_dependency_checker_is_off: + add_labels_of_jars_to( + jars2labels, + dep_target, + [], + java_provider.compile_jars.to_list(), + ) if (not plus_one_deps_is_off) and (PlusOneDeps in dep_target): plus_one_deps_compile_jars.append( - depset(transitive = [dep[JavaInfo].compile_jars for dep in dep_target[PlusOneDeps].direct_deps if JavaInfo in dep ]) + depset(transitive = [dep[JavaInfo].compile_jars for dep in dep_target[PlusOneDeps].direct_deps if JavaInfo in dep]), ) return struct( @@ -73,6 +77,7 @@ def _collect_jars_when_dependency_analyzer_is_off( transitive_runtime_jars = depset(transitive = runtime_jars), jars2labels = JarsToLabelsInfo(jars_to_labels = jars2labels), transitive_compile_jars = depset(transitive = compile_jars + plus_one_deps_compile_jars), + deps_providers = deps_providers, ) def _collect_jars_when_dependency_analyzer_is_on(dep_targets): @@ -80,33 +85,33 @@ def _collect_jars_when_dependency_analyzer_is_on(dep_targets): jars2labels = {} compile_jars = [] runtime_jars = [] + deps_providers = [] for dep_target in dep_targets: # we require a JavaInfo for dependencies # must use java_import or scala_import if you have raw files - if JavaInfo in dep_target: - java_provider = dep_target[JavaInfo] - current_dep_compile_jars = java_provider.compile_jars - current_dep_transitive_compile_jars = java_provider.transitive_compile_time_jars - runtime_jars.append(java_provider.transitive_runtime_jars) - - compile_jars.append(current_dep_compile_jars) - transitive_compile_jars.append(current_dep_transitive_compile_jars) - - add_labels_of_jars_to( - jars2labels, - dep_target, - current_dep_transitive_compile_jars.to_list(), - current_dep_compile_jars.to_list(), - ) - else: - print("ignored dependency, has no JavaInfo: " + str(dep_target)) + java_provider = dep_target[JavaInfo] + 
deps_providers.append(java_provider) + current_dep_compile_jars = java_provider.compile_jars + current_dep_transitive_compile_jars = java_provider.transitive_compile_time_jars + runtime_jars.append(java_provider.transitive_runtime_jars) + + compile_jars.append(current_dep_compile_jars) + transitive_compile_jars.append(current_dep_transitive_compile_jars) + + add_labels_of_jars_to( + jars2labels, + dep_target, + current_dep_transitive_compile_jars.to_list(), + current_dep_compile_jars.to_list(), + ) return struct( compile_jars = depset(transitive = compile_jars), transitive_runtime_jars = depset(transitive = runtime_jars), jars2labels = JarsToLabelsInfo(jars_to_labels = jars2labels), transitive_compile_jars = depset(transitive = transitive_compile_jars), + deps_providers = deps_providers, ) # When import mavan_jar's for scala macros we have to use the jar:file requirement @@ -147,14 +152,12 @@ def _provider_of_dependency_label_of(dependency, path): else: return None -# TODO this seems to have limited value now that JavaInfo has everything -def create_java_provider(scalaattr, transitive_compile_time_jars): - return java_common.create_provider( - use_ijar = False, - compile_time_jars = scalaattr.compile_jars, - runtime_jars = scalaattr.transitive_runtime_jars, - transitive_compile_time_jars = depset( - transitive = [transitive_compile_time_jars, scalaattr.compile_jars], - ), - transitive_runtime_jars = scalaattr.transitive_runtime_jars, - ) +def sanitize_string_for_usage(s): + res_array = [] + for idx in range(len(s)): + c = s[idx] + if c.isalnum() or c == ".": + res_array.append(c) + else: + res_array.append("_") + return "".join(res_array) diff --git a/scala/private/common_attributes.bzl b/scala/private/common_attributes.bzl new file mode 100644 index 000000000..2982536fe --- /dev/null +++ b/scala/private/common_attributes.bzl @@ -0,0 +1,136 @@ +"""Shared attributes for rules""" + +load( + "@io_bazel_rules_scala//scala/private:coverage_replacements_provider.bzl", + _coverage_replacements_provider = "coverage_replacements_provider", +) +load( + "@io_bazel_rules_scala//scala:plusone.bzl", + _collect_plus_one_deps_aspect = "collect_plus_one_deps_aspect", +) + +common_attrs_for_plugin_bootstrapping = { + "srcs": attr.label_list(allow_files = [ + ".scala", + ".srcjar", + ".java", + ]), + "deps": attr.label_list( + aspects = [ + _collect_plus_one_deps_aspect, + _coverage_replacements_provider.aspect, + ], + providers = [[JavaInfo]], + ), + "plugins": attr.label_list(allow_files = [".jar"]), + "runtime_deps": attr.label_list(providers = [[JavaInfo]]), + "data": attr.label_list(allow_files = True), + "resources": attr.label_list(allow_files = True), + "resource_strip_prefix": attr.string(), + "resource_jars": attr.label_list(allow_files = True), + "scalacopts": attr.string_list(), + "javacopts": attr.string_list(), + "scalac_jvm_flags": attr.string_list(), + "javac_jvm_flags": attr.string_list(), + "expect_java_output": attr.bool( + default = True, + mandatory = False, + ), + "print_compile_time": attr.bool( + default = False, + mandatory = False, + ), +} + +common_attrs = {} + +common_attrs.update(common_attrs_for_plugin_bootstrapping) + +common_attrs.update({ + # using stricts scala deps is done by using command line flag called 'strict_java_deps' + # switching mode to "on" means that ANY API change in a target's transitive dependencies will trigger a recompilation of that target, + # on the other hand any internal change (i.e. 
on code that ijar omits) WON’T trigger recompilation by transitive dependencies + "_dependency_analyzer_plugin": attr.label( + default = Label( + "@io_bazel_rules_scala//third_party/dependency_analyzer/src/main:dependency_analyzer", + ), + allow_files = [".jar"], + mandatory = False, + ), + "unused_dependency_checker_mode": attr.string( + values = [ + "warn", + "error", + "off", + "", + ], + mandatory = False, + ), + "_unused_dependency_checker_plugin": attr.label( + default = Label( + "@io_bazel_rules_scala//third_party/unused_dependency_checker/src/main:unused_dependency_checker", + ), + allow_files = [".jar"], + mandatory = False, + ), + "unused_dependency_checker_ignored_targets": attr.label_list(default = []), + "_code_coverage_instrumentation_worker": attr.label( + default = "@io_bazel_rules_scala//src/java/io/bazel/rulesscala/coverage/instrumenter", + allow_files = True, + executable = True, + cfg = "host", + ), +}) + +implicit_deps = { + "_singlejar": attr.label( + executable = True, + cfg = "host", + default = Label("@bazel_tools//tools/jdk:singlejar"), + allow_files = True, + ), + "_zipper": attr.label( + executable = True, + cfg = "host", + default = Label("@bazel_tools//tools/zip:zipper"), + allow_files = True, + ), + "_java_toolchain": attr.label( + default = Label("@bazel_tools//tools/jdk:current_java_toolchain"), + ), + "_host_javabase": attr.label( + default = Label("@bazel_tools//tools/jdk:current_java_runtime"), + cfg = "host", + ), + "_java_runtime": attr.label( + default = Label("@bazel_tools//tools/jdk:current_java_runtime"), + ), + "_scalac": attr.label( + default = Label( + "@io_bazel_rules_scala//src/java/io/bazel/rulesscala/scalac", + ), + ), + "_exe": attr.label( + executable = True, + cfg = "host", + default = Label("@io_bazel_rules_scala//src/java/io/bazel/rulesscala/exe:exe"), + ), +} + +launcher_template = { + "_java_stub_template": attr.label( + default = Label("@io_bazel_rules_scala//java_stub_template/file"), + ), +} + +# Single dep to allow IDEs to pickup all the implicit dependencies. 
+resolve_deps = { + "_scala_toolchain": attr.label_list( + default = [ + Label( + "//external:io_bazel_rules_scala/dependency/scala/scala_library", + ), + ], + allow_files = False, + ), +} diff --git a/scala/private/common_outputs.bzl b/scala/private/common_outputs.bzl new file mode 100644 index 000000000..a252237e6 --- /dev/null +++ b/scala/private/common_outputs.bzl @@ -0,0 +1,8 @@ +"""Common outputs used in rule outputs""" + +common_outputs = { + "jar": "%{name}.jar", + "deploy_jar": "%{name}_deploy.jar", + "manifest": "%{name}_MANIFEST.MF", + "statsfile": "%{name}.statsfile", +} diff --git a/scala/private/macros/scala_repositories.bzl b/scala/private/macros/scala_repositories.bzl new file mode 100644 index 000000000..ff8017f37 --- /dev/null +++ b/scala/private/macros/scala_repositories.bzl @@ -0,0 +1,200 @@ +load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive") +load( + "@io_bazel_rules_scala//scala:scala_cross_version.bzl", + _default_scala_version = "default_scala_version", + _default_scala_version_jar_shas = "default_scala_version_jar_shas", + _extract_major_version = "extract_major_version", + _new_scala_default_repository = "new_scala_default_repository", +) +load( + "@io_bazel_rules_scala//scala:scala_maven_import_external.bzl", + _scala_maven_import_external = "scala_maven_import_external", +) + +def _default_scala_extra_jars(): + return { + "2.11": { + "scalatest": { + "version": "3.0.5", + "sha256": "2aafeb41257912cbba95f9d747df9ecdc7ff43f039d35014b4c2a8eb7ed9ba2f", + }, + "scalactic": { + "version": "3.0.5", + "sha256": "84723064f5716f38990fe6e65468aa39700c725484efceef015771d267341cf2", + }, + "scala_xml": { + "version": "1.0.5", + "sha256": "767e11f33eddcd506980f0ff213f9d553a6a21802e3be1330345f62f7ee3d50f", + }, + "scala_parser_combinators": { + "version": "1.0.4", + "sha256": "0dfaafce29a9a245b0a9180ec2c1073d2bd8f0330f03a9f1f6a74d1bc83f62d6", + }, + }, + "2.12": { + "scalatest": { + "version": "3.0.5", + "sha256": "b416b5bcef6720da469a8d8a5726e457fc2d1cd5d316e1bc283aa75a2ae005e5", + }, + "scalactic": { + "version": "3.0.5", + "sha256": "57e25b4fd969b1758fe042595112c874dfea99dca5cc48eebe07ac38772a0c41", + }, + "scala_xml": { + "version": "1.0.5", + "sha256": "035015366f54f403d076d95f4529ce9eeaf544064dbc17c2d10e4f5908ef4256", + }, + "scala_parser_combinators": { + "version": "1.0.4", + "sha256": "282c78d064d3e8f09b3663190d9494b85e0bb7d96b0da05994fe994384d96111", + }, + }, + } + +def scala_repositories( + scala_version_shas = ( + _default_scala_version(), + _default_scala_version_jar_shas(), + ), + maven_servers = ["https://repo.maven.apache.org/maven2"], + scala_extra_jars = _default_scala_extra_jars()): + (scala_version, scala_version_jar_shas) = scala_version_shas + major_version = _extract_major_version(scala_version) + + _new_scala_default_repository( + maven_servers = maven_servers, + scala_version = scala_version, + scala_version_jar_shas = scala_version_jar_shas, + ) + + scala_version_extra_jars = scala_extra_jars[major_version] + + _scala_maven_import_external( + name = "io_bazel_rules_scala_scalatest", + artifact = "org.scalatest:scalatest_{major_version}:{extra_jar_version}".format( + major_version = major_version, + extra_jar_version = scala_version_extra_jars["scalatest"]["version"], + ), + artifact_sha256 = scala_version_extra_jars["scalatest"]["sha256"], + licenses = ["notice"], + server_urls = maven_servers, + ) + _scala_maven_import_external( + name = "io_bazel_rules_scala_scalactic", + artifact = 
"org.scalactic:scalactic_{major_version}:{extra_jar_version}".format( + major_version = major_version, + extra_jar_version = scala_version_extra_jars["scalactic"]["version"], + ), + artifact_sha256 = scala_version_extra_jars["scalactic"]["sha256"], + licenses = ["notice"], + server_urls = maven_servers, + ) + + _scala_maven_import_external( + name = "io_bazel_rules_scala_scala_xml", + artifact = "org.scala-lang.modules:scala-xml_{major_version}:{extra_jar_version}".format( + major_version = major_version, + extra_jar_version = scala_version_extra_jars["scala_xml"]["version"], + ), + artifact_sha256 = scala_version_extra_jars["scala_xml"]["sha256"], + licenses = ["notice"], + server_urls = maven_servers, + ) + + _scala_maven_import_external( + name = "io_bazel_rules_scala_scala_parser_combinators", + artifact = + "org.scala-lang.modules:scala-parser-combinators_{major_version}:{extra_jar_version}".format( + major_version = major_version, + extra_jar_version = scala_version_extra_jars["scala_parser_combinators"]["version"], + ), + artifact_sha256 = scala_version_extra_jars["scala_parser_combinators"]["sha256"], + licenses = ["notice"], + server_urls = maven_servers, + ) + + # used by ScalacProcessor + _scala_maven_import_external( + name = "scalac_rules_commons_io", + artifact = "commons-io:commons-io:2.6", + artifact_sha256 = "f877d304660ac2a142f3865badfc971dec7ed73c747c7f8d5d2f5139ca736513", + licenses = ["notice"], + server_urls = maven_servers, + ) + + _scala_maven_import_external( + name = "io_bazel_rules_scala_guava", + artifact = "com.google.guava:guava:21.0", + artifact_sha256 = "972139718abc8a4893fa78cba8cf7b2c903f35c97aaf44fa3031b0669948b480", + licenses = ["notice"], + server_urls = maven_servers, + ) + + if not native.existing_rule("com_google_protobuf"): + http_archive( + name = "com_google_protobuf", + sha256 = "d82eb0141ad18e98de47ed7ed415daabead6d5d1bef1b8cccb6aa4d108a9008f", + strip_prefix = "protobuf-b4f193788c9f0f05d7e0879ea96cd738630e5d51", + # Commit from 2019-05-15, update to protobuf 3.8 when available. 
+ urls = [ + "https://mirror.bazel.build/github.com/protocolbuffers/protobuf/archive/b4f193788c9f0f05d7e0879ea96cd738630e5d51.tar.gz", + "https://github.com/protocolbuffers/protobuf/archive/b4f193788c9f0f05d7e0879ea96cd738630e5d51.tar.gz", + ], + ) + + if not native.existing_rule("zlib"): # needed by com_google_protobuf + http_archive( + name = "zlib", + build_file = "@com_google_protobuf//:third_party/zlib.BUILD", + sha256 = "c3e5e9fdd5004dcb542feda5ee4f0ff0744628baf8ed2dd5d66f8ca1197cb1a1", + strip_prefix = "zlib-1.2.11", + urls = [ + "https://mirror.bazel.build/zlib.net/zlib-1.2.11.tar.gz", + "https://zlib.net/zlib-1.2.11.tar.gz", + ], + ) + + native.bind( + name = "io_bazel_rules_scala/dependency/com_google_protobuf/protobuf_java", + actual = "@com_google_protobuf//:protobuf_java", + ) + + native.bind( + name = "io_bazel_rules_scala/dependency/commons_io/commons_io", + actual = "@scalac_rules_commons_io//jar", + ) + + native.bind( + name = "io_bazel_rules_scala/dependency/scalatest/scalatest", + actual = "@io_bazel_rules_scala//scala/scalatest:scalatest", + ) + + native.bind( + name = "io_bazel_rules_scala/dependency/scala/scala_compiler", + actual = "@io_bazel_rules_scala_scala_compiler", + ) + + native.bind( + name = "io_bazel_rules_scala/dependency/scala/scala_library", + actual = "@io_bazel_rules_scala_scala_library", + ) + + native.bind( + name = "io_bazel_rules_scala/dependency/scala/scala_reflect", + actual = "@io_bazel_rules_scala_scala_reflect", + ) + + native.bind( + name = "io_bazel_rules_scala/dependency/scala/scala_xml", + actual = "@io_bazel_rules_scala_scala_xml", + ) + + native.bind( + name = "io_bazel_rules_scala/dependency/scala/parser_combinators", + actual = "@io_bazel_rules_scala_scala_parser_combinators", + ) + + native.bind( + name = "io_bazel_rules_scala/dependency/scala/guava", + actual = "@io_bazel_rules_scala_guava", + ) diff --git a/scala/private/phases/api.bzl b/scala/private/phases/api.bzl new file mode 100644 index 000000000..cc8a340af --- /dev/null +++ b/scala/private/phases/api.bzl @@ -0,0 +1,86 @@ +""" +The phase API for rules implementation +""" + +load( + "@io_bazel_rules_scala//scala:advanced_usage/providers.bzl", + _ScalaRulePhase = "ScalaRulePhase", +) + +# A method to modify the built-in phase list +# - Insert new phases to the first/last position +# - Insert new phases before/after existing phases +# - Replace existing phases +def _adjust_phases(phases, adjustments): + # Return when no adjustment needed + if len(adjustments) == 0: + return phases + phases = phases[:] + + # relation: the position to add a new phase + # peer_name: the existing phase to compare the position with + # phase_name: the name of the new phase, also used to access phase information + # phase_function: the function of the new phase + for (relation, peer_name, phase_name, phase_function) in adjustments: + for idx, (needle, _) in enumerate(phases): + if relation in ["^", "first"]: + phases.insert(0, (phase_name, phase_function)) + elif relation in ["$", "last"]: + phases.append((phase_name, phase_function)) + elif needle == peer_name: + if relation in ["-", "before"]: + phases.insert(idx, (phase_name, phase_function)) + elif relation in ["+", "after"]: + phases.insert(idx + 1, (phase_name, phase_function)) + elif relation in ["=", "replace"]: + phases[idx] = (phase_name, phase_function) + return phases + +# Execute phases +def run_phases(ctx, builtin_customizable_phases, fixed_phase): + # Loading custom phases + # Phases must be passed in by provider + phase_providers = [ + 
phase_provider[_ScalaRulePhase] + for phase_provider in ctx.attr._phase_providers + if _ScalaRulePhase in phase_provider + ] + + # Modify the built-in phase list + adjusted_phases = _adjust_phases( + builtin_customizable_phases, + [ + phase + for phase_provider in phase_providers + for phase in phase_provider.custom_phases + ], + ) + + # A placeholder for data shared with later phases + global_provider = {} + current_provider = struct(**global_provider) + for (name, function) in adjusted_phases + [fixed_phase]: + # Run a phase + new_provider = function(ctx, current_provider) + + # If a phase returns data, append it to global_provider + # for later phases to access + if new_provider != None: + global_provider[name] = new_provider + current_provider = struct(**global_provider) + + # The final return of rules implementation + return current_provider + +# A method to pass in phase provider +def extras_phases(extras): + return { + "_phase_providers": attr.label_list( + default = [ + phase_provider + for extra in extras + for phase_provider in extra["phase_providers"] + ], + providers = [_ScalaRulePhase], + ), + } diff --git a/scala/private/phases/phase_collect_exports_jars.bzl b/scala/private/phases/phase_collect_exports_jars.bzl new file mode 100644 index 000000000..35a83dbe3 --- /dev/null +++ b/scala/private/phases/phase_collect_exports_jars.bzl @@ -0,0 +1,15 @@ +# +# PHASE: collect exports jars +# +# DOCUMENT THIS +# +load( + "@io_bazel_rules_scala//scala/private:common.bzl", + "collect_jars", +) + +def phase_collect_exports_jars(ctx, p): + # Add information from exports (is key that AFTER all build actions/runfiles analysis) + # Since after, will not show up in deploy_jar or old jars runfiles + # Notice that compile_jars is intentionally transitive for exports + return collect_jars(ctx.attr.exports) diff --git a/scala/private/phases/phase_collect_jars.bzl b/scala/private/phases/phase_collect_jars.bzl new file mode 100644 index 000000000..086975ffd --- /dev/null +++ b/scala/private/phases/phase_collect_jars.bzl @@ -0,0 +1,119 @@ +# +# PHASE: collect jars +# +# DOCUMENT THIS +# +load( + "@io_bazel_rules_scala//scala/private:rule_impls.bzl", + "is_dependency_analyzer_off", + "is_plus_one_deps_off", +) +load( + "@io_bazel_rules_scala//scala/private:common.bzl", + "collect_jars", +) + +def phase_scalatest_collect_jars(ctx, p): + args = struct( + base_classpath = p.scalac_provider.default_classpath + [ctx.attr._scalatest], + extra_runtime_deps = [ + ctx.attr._scalatest_reporter, + ctx.attr._scalatest_runner, + ], + ) + return _phase_default_collect_jars(ctx, p, args) + +def phase_repl_collect_jars(ctx, p): + args = struct( + base_classpath = p.scalac_provider.default_repl_classpath, + ) + return _phase_default_collect_jars(ctx, p, args) + +def phase_macro_library_collect_jars(ctx, p): + args = struct( + base_classpath = p.scalac_provider.default_macro_classpath, + ) + return _phase_default_collect_jars(ctx, p, args) + +def phase_junit_test_collect_jars(ctx, p): + args = struct( + extra_deps = [ + ctx.attr._junit, + ctx.attr._hamcrest, + ctx.attr.suite_label, + ctx.attr._bazel_test_runner, + ], + ) + return _phase_default_collect_jars(ctx, p, args) + +def phase_library_for_plugin_bootstrapping_collect_jars(ctx, p): + args = struct( + unused_dependency_checker_mode = "off", + ) + return _phase_default_collect_jars(ctx, p, args) + +def phase_common_collect_jars(ctx, p): + return _phase_default_collect_jars(ctx, p) + +def _phase_default_collect_jars(ctx, p, _args = struct()): + return 
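# A minimal sketch (assumed names, not part of this patch) of how a consumer of
# the phase API above could contribute a custom phase: a rule returning the
# ScalaRulePhase provider whose `custom_phases` tuples follow the
# (relation, peer_name, phase_name, phase_function) shape that _adjust_phases()
# splices into the built-in phase list.
load(
    "@io_bazel_rules_scala//scala:advanced_usage/providers.bzl",
    "ScalaRulePhase",
)

def _phase_print_compile_jars(ctx, p):
    # results of earlier phases are exposed on `p` under their phase names
    print(p.collect_jars.compile_jars)

def _example_phase_provider_impl(ctx):
    return [ScalaRulePhase(
        custom_phases = [
            # run immediately after the built-in "compile" phase
            ("+", "compile", "print_compile_jars", _phase_print_compile_jars),
        ],
    )]

example_phase_provider = rule(implementation = _example_phase_provider_impl)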
_phase_collect_jars( + ctx, + _args.base_classpath if hasattr(_args, "base_classpath") else p.scalac_provider.default_classpath, + _args.extra_deps if hasattr(_args, "extra_deps") else [], + _args.extra_runtime_deps if hasattr(_args, "extra_runtime_deps") else [], + _args.unused_dependency_checker_mode if hasattr(_args, "unused_dependency_checker_mode") else p.unused_deps_checker, + ) + +# Extract very common code out from dependency analysis into single place +# automatically adds dependency on scala-library and scala-reflect +# collects jars from deps, runtime jars from runtime_deps, and +def _phase_collect_jars( + ctx, + base_classpath, + extra_deps, + extra_runtime_deps, + unused_dependency_checker_mode): + unused_dependency_checker_is_off = unused_dependency_checker_mode == "off" + dependency_analyzer_is_off = is_dependency_analyzer_off(ctx) + + deps_jars = collect_jars( + ctx.attr.deps + extra_deps + base_classpath, + dependency_analyzer_is_off, + unused_dependency_checker_is_off, + is_plus_one_deps_off(ctx), + ) + + ( + cjars, + transitive_rjars, + jars2labels, + transitive_compile_jars, + deps_providers, + ) = ( + deps_jars.compile_jars, + deps_jars.transitive_runtime_jars, + deps_jars.jars2labels, + deps_jars.transitive_compile_jars, + deps_jars.deps_providers, + ) + + transitive_rjars = depset( + transitive = [transitive_rjars] + + _collect_runtime_jars(ctx.attr.runtime_deps + extra_runtime_deps), + ) + + return struct( + compile_jars = cjars, + jars2labels = jars2labels, + transitive_compile_jars = transitive_compile_jars, + transitive_runtime_jars = transitive_rjars, + deps_providers = deps_providers, + ) + +def _collect_runtime_jars(dep_targets): + runtime_jars = [] + + for dep_target in dep_targets: + runtime_jars.append(dep_target[JavaInfo].transitive_runtime_jars) + + return runtime_jars diff --git a/scala/private/phases/phase_collect_srcjars.bzl b/scala/private/phases/phase_collect_srcjars.bzl new file mode 100644 index 000000000..cabf0bebe --- /dev/null +++ b/scala/private/phases/phase_collect_srcjars.bzl @@ -0,0 +1,14 @@ +# +# PHASE: collect srcjars +# +# DOCUMENT THIS +# + +def phase_collect_srcjars(ctx, p): + # This will be used to pick up srcjars from non-scala library + # targets (like thrift code generation) + srcjars = [] + for target in ctx.attr.deps: + if hasattr(target, "srcjars"): + srcjars.append(target.srcjars.srcjar) + return depset(srcjars) diff --git a/scala/private/phases/phase_compile.bzl b/scala/private/phases/phase_compile.bzl new file mode 100644 index 000000000..0500e2f37 --- /dev/null +++ b/scala/private/phases/phase_compile.bzl @@ -0,0 +1,526 @@ +# +# PHASE: compile +# +# DOCUMENT THIS +# +load("@bazel_tools//tools/jdk:toolchain_utils.bzl", "find_java_runtime_toolchain", "find_java_toolchain") +load( + "@io_bazel_rules_scala//scala/private:coverage_replacements_provider.bzl", + _coverage_replacements_provider = "coverage_replacements_provider", +) +load( + "@io_bazel_rules_scala//scala/private:rule_impls.bzl", + _adjust_resources_path_by_default_prefixes = "adjust_resources_path_by_default_prefixes", + _compile_scala = "compile_scala", + _expand_location = "expand_location", +) + +_java_extension = ".java" + +_scala_extension = ".scala" + +_srcjar_extension = ".srcjar" + +_empty_coverage_struct = struct( + instrumented_files = None, + providers = [], + replacements = {}, +) + +def phase_binary_compile(ctx, p): + args = struct( + buildijar = False, + unused_dependency_checker_ignored_targets = [ + target.label + for target in 
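# A minimal sketch of the `_args = struct()` override pattern used by
# _phase_default_collect_jars above (and by the analogous _phase_default_*
# helpers in the other phase files): a specialized wrapper (the name below is
# hypothetical) sets only the fields it cares about, and the hasattr() checks
# fall back to the common defaults for everything else.
def phase_example_collect_jars(ctx, p):
    args = struct(
        # override only the base classpath; extra_deps, extra_runtime_deps and
        # the unused-dependency-checker mode keep their defaults
        base_classpath = p.scalac_provider.default_classpath,
    )
    return _phase_default_collect_jars(ctx, p, args)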
p.scalac_provider.default_classpath + + ctx.attr.unused_dependency_checker_ignored_targets + ], + ) + return _phase_default_compile(ctx, p, args) + +def phase_library_compile(ctx, p): + args = struct( + srcjars = p.collect_srcjars, + unused_dependency_checker_ignored_targets = [ + target.label + for target in p.scalac_provider.default_classpath + ctx.attr.exports + + ctx.attr.unused_dependency_checker_ignored_targets + ], + ) + return _phase_default_compile(ctx, p, args) + +def phase_library_for_plugin_bootstrapping_compile(ctx, p): + args = struct( + unused_dependency_checker_ignored_targets = [ + target.label + for target in p.scalac_provider.default_classpath + ctx.attr.exports + ], + unused_dependency_checker_mode = "off", + ) + return _phase_default_compile(ctx, p, args) + +def phase_macro_library_compile(ctx, p): + args = struct( + buildijar = False, + unused_dependency_checker_ignored_targets = [ + target.label + for target in p.scalac_provider.default_macro_classpath + ctx.attr.exports + + ctx.attr.unused_dependency_checker_ignored_targets + ], + ) + return _phase_default_compile(ctx, p, args) + +def phase_junit_test_compile(ctx, p): + args = struct( + buildijar = False, + implicit_junit_deps_needed_for_java_compilation = [ + ctx.attr._junit, + ctx.attr._hamcrest, + ], + unused_dependency_checker_ignored_targets = [ + target.label + for target in p.scalac_provider.default_classpath + + ctx.attr.unused_dependency_checker_ignored_targets + ] + [ + ctx.attr._junit.label, + ctx.attr._hamcrest.label, + ctx.attr.suite_label.label, + ctx.attr._bazel_test_runner.label, + ], + ) + return _phase_default_compile(ctx, p, args) + +def phase_repl_compile(ctx, p): + args = struct( + buildijar = False, + unused_dependency_checker_ignored_targets = [ + target.label + for target in p.scalac_provider.default_repl_classpath + + ctx.attr.unused_dependency_checker_ignored_targets + ], + ) + return _phase_default_compile(ctx, p, args) + +def phase_scalatest_compile(ctx, p): + args = struct( + buildijar = False, + unused_dependency_checker_ignored_targets = [ + target.label + for target in p.scalac_provider.default_classpath + + ctx.attr.unused_dependency_checker_ignored_targets + ], + ) + return _phase_default_compile(ctx, p, args) + +def phase_common_compile(ctx, p): + return _phase_default_compile(ctx, p) + +def _phase_default_compile(ctx, p, _args = struct()): + return _phase_compile( + ctx, + p, + _args.srcjars if hasattr(_args, "srcjars") else depset(), + _args.buildijar if hasattr(_args, "buildijar") else True, + _args.implicit_junit_deps_needed_for_java_compilation if hasattr(_args, "implicit_junit_deps_needed_for_java_compilation") else [], + _args.unused_dependency_checker_ignored_targets if hasattr(_args, "unused_dependency_checker_ignored_targets") else [], + _args.unused_dependency_checker_mode if hasattr(_args, "unused_dependency_checker_mode") else p.unused_deps_checker, + ) + +def _phase_compile( + ctx, + p, + srcjars, + buildijar, + # TODO: generalize this hack + implicit_junit_deps_needed_for_java_compilation, + unused_dependency_checker_ignored_targets, + unused_dependency_checker_mode): + manifest = ctx.outputs.manifest + jars = p.collect_jars.compile_jars + rjars = p.collect_jars.transitive_runtime_jars + transitive_compile_jars = p.collect_jars.transitive_compile_jars + jars2labels = p.collect_jars.jars2labels.jars_to_labels + deps_providers = p.collect_jars.deps_providers + default_classpath = p.scalac_provider.default_classpath + + out = _compile_or_empty( + ctx, + manifest, + 
jars, + srcjars, + buildijar, + transitive_compile_jars, + jars2labels, + implicit_junit_deps_needed_for_java_compilation, + unused_dependency_checker_mode, + unused_dependency_checker_ignored_targets, + deps_providers, + default_classpath, + ) + + # TODO: simplify the return values and use provider + return struct( + class_jar = out.class_jar, + coverage = out.coverage, + full_jars = out.full_jars, + ijar = out.ijar, + ijars = out.ijars, + rjars = depset(out.full_jars, transitive = [rjars]), + java_jar = out.java_jar, + source_jars = _pack_source_jars(ctx) + out.source_jars, + merged_provider = out.merged_provider, + ) + +def _compile_or_empty( + ctx, + manifest, + jars, + srcjars, + buildijar, + transitive_compile_jars, + jars2labels, + implicit_junit_deps_needed_for_java_compilation, + unused_dependency_checker_mode, + unused_dependency_checker_ignored_targets, + deps_providers, + default_classpath): + # We assume that if a srcjar is present, it is not empty + if len(ctx.files.srcs) + len(srcjars.to_list()) == 0: + _build_nosrc_jar(ctx) + + scala_compilation_provider = _create_scala_compilation_provider(ctx, ctx.outputs.jar, None, deps_providers) + + # no need to build ijar when empty + return struct( + class_jar = ctx.outputs.jar, + coverage = _empty_coverage_struct, + full_jars = [ctx.outputs.jar], + ijar = ctx.outputs.jar, + ijars = [ctx.outputs.jar], + java_jar = False, + source_jars = [], + merged_provider = scala_compilation_provider, + ) + else: + in_srcjars = [ + f + for f in ctx.files.srcs + if f.basename.endswith(_srcjar_extension) + ] + all_srcjars = depset(in_srcjars, transitive = [srcjars]) + + java_srcs = [ + f + for f in ctx.files.srcs + if f.basename.endswith(_java_extension) + ] + + # We are not able to verify whether dependencies are used when compiling java sources + # Thus we disable unused dependency checking when java sources are found + if len(java_srcs) != 0: + unused_dependency_checker_mode = "off" + + sources = [ + f + for f in ctx.files.srcs + if f.basename.endswith(_scala_extension) + ] + java_srcs + _compile_scala( + ctx, + ctx.label, + ctx.outputs.jar, + manifest, + ctx.outputs.statsfile, + sources, + jars, + all_srcjars, + transitive_compile_jars, + ctx.attr.plugins, + ctx.attr.resource_strip_prefix, + ctx.files.resources, + ctx.files.resource_jars, + jars2labels, + ctx.attr.scalacopts, + ctx.attr.print_compile_time, + ctx.attr.expect_java_output, + ctx.attr.scalac_jvm_flags, + ctx.attr._scalac, + unused_dependency_checker_ignored_targets = + unused_dependency_checker_ignored_targets, + unused_dependency_checker_mode = unused_dependency_checker_mode, + ) + + # build ijar if needed + if buildijar: + ijar = java_common.run_ijar( + ctx.actions, + jar = ctx.outputs.jar, + target_label = ctx.label, + java_toolchain = find_java_toolchain(ctx, ctx.attr._java_toolchain), + ) + else: + # macro code needs to be available at compile-time, + # so set ijar == jar + ijar = ctx.outputs.jar + + source_jar = _pack_source_jar(ctx) + scala_compilation_provider = _create_scala_compilation_provider(ctx, ijar, source_jar, deps_providers) + + # compile the java now + java_jar = _try_to_compile_java_jar( + ctx, + ijar, + all_srcjars, + java_srcs, + implicit_junit_deps_needed_for_java_compilation, + default_classpath, + ) + + full_jars = [ctx.outputs.jar] + ijars = [ijar] + source_jars = [] + if java_jar: + full_jars += [java_jar.jar] + ijars += [java_jar.ijar] + source_jars += java_jar.source_jars + + coverage = _jacoco_offline_instrument(ctx, ctx.outputs.jar) + + if java_jar: + 
merged_provider = java_common.merge([scala_compilation_provider, java_jar.java_compilation_provider]) + else: + merged_provider = scala_compilation_provider + + return struct( + class_jar = ctx.outputs.jar, + coverage = coverage, + full_jars = full_jars, + ijar = ijar, + ijars = ijars, + java_jar = java_jar, + source_jars = source_jars, + merged_provider = merged_provider, + ) + +def _pack_source_jars(ctx): + source_jar = _pack_source_jar(ctx) + + #_pack_source_jar may return None if java_common.pack_sources returned None (and it can) + return [source_jar] if source_jar else [] + +def _build_nosrc_jar(ctx): + resources = _add_resources_cmd(ctx) + ijar_cmd = "" + + # this ensures the file is not empty + resources += "META-INF/MANIFEST.MF=%s\n" % ctx.outputs.manifest.path + + zipper_arg_path = ctx.actions.declare_file("%s_zipper_args" % ctx.label.name) + ctx.actions.write(zipper_arg_path, resources) + cmd = """ +rm -f {jar_output} +{zipper} c {jar_output} @{path} +# ensures that empty src targets still emit a statsfile +touch {statsfile} +""" + ijar_cmd + + cmd = cmd.format( + path = zipper_arg_path.path, + jar_output = ctx.outputs.jar.path, + zipper = ctx.executable._zipper.path, + statsfile = ctx.outputs.statsfile.path, + ) + + outs = [ctx.outputs.jar, ctx.outputs.statsfile] + inputs = ctx.files.resources + [ctx.outputs.manifest] + + ctx.actions.run_shell( + inputs = inputs, + tools = [ctx.executable._zipper, zipper_arg_path], + outputs = outs, + command = cmd, + progress_message = "scala %s" % ctx.label, + arguments = [], + ) + +def _create_scala_compilation_provider(ctx, ijar, source_jar, deps_providers): + exports = [] + if hasattr(ctx.attr, "exports"): + exports = [dep[JavaInfo] for dep in ctx.attr.exports] + runtime_deps = [] + if hasattr(ctx.attr, "runtime_deps"): + runtime_deps = [dep[JavaInfo] for dep in ctx.attr.runtime_deps] + return JavaInfo( + output_jar = ctx.outputs.jar, + compile_jar = ijar, + source_jar = source_jar, + deps = deps_providers, + exports = exports, + runtime_deps = runtime_deps, + ) + +def _pack_source_jar(ctx): + # collect .scala sources and pack a source jar for Scala + scala_sources = [ + f + for f in ctx.files.srcs + if f.basename.endswith(_scala_extension) + ] + + # collect .srcjar files and pack them with the scala sources + bundled_source_jars = [ + f + for f in ctx.files.srcs + if f.basename.endswith(_srcjar_extension) + ] + scala_source_jar = java_common.pack_sources( + ctx.actions, + output_jar = ctx.outputs.jar, + sources = scala_sources, + source_jars = bundled_source_jars, + java_toolchain = find_java_toolchain(ctx, ctx.attr._java_toolchain), + host_javabase = find_java_runtime_toolchain(ctx, ctx.attr._host_javabase), + ) + + return scala_source_jar + +def _jacoco_offline_instrument(ctx, input_jar): + if not ctx.configuration.coverage_enabled or not hasattr(ctx.attr, "_code_coverage_instrumentation_worker"): + return _empty_coverage_struct + + output_jar = ctx.actions.declare_file( + "{}-offline.jar".format(input_jar.basename.split(".")[0]), + ) + in_out_pairs = [ + (input_jar, output_jar), + ] + + args = ctx.actions.args() + args.add_all(in_out_pairs, map_each = _jacoco_offline_instrument_format_each) + args.set_param_file_format("multiline") + args.use_param_file("@%s", use_always = True) + + ctx.actions.run( + mnemonic = "JacocoInstrumenter", + inputs = [in_out_pair[0] for in_out_pair in in_out_pairs], + outputs = [in_out_pair[1] for in_out_pair in in_out_pairs], + executable = ctx.attr._code_coverage_instrumentation_worker.files_to_run, + 
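# Worked example (paths are hypothetical) for the instrumentation action above:
# because of use_param_file("@%s", use_always = True) the worker receives a
# single @params-file argument, and _jacoco_offline_instrument_format_each
# renders each (input, output) pair as one "in=out" line, e.g.
#
#   bazel-out/k8-fastbuild/bin/foo/libfoo.jar=bazel-out/k8-fastbuild/bin/foo/libfoo-offline.jar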
execution_requirements = {"supports-workers": "1"}, + arguments = [args], + ) + + replacements = {i: o for (i, o) in in_out_pairs} + provider = _coverage_replacements_provider.create( + replacements = replacements, + ) + instrumented_files_provider = coverage_common.instrumented_files_info( + ctx, + source_attributes = ["srcs"], + dependency_attributes = _coverage_replacements_provider.dependency_attributes, + extensions = ["scala", "java"], + ) + return struct( + providers = [provider, instrumented_files_provider], + replacements = replacements, + ) + +def _jacoco_offline_instrument_format_each(in_out_pair): + return (["%s=%s" % (in_out_pair[0].path, in_out_pair[1].path)]) + +def _try_to_compile_java_jar( + ctx, + scala_output, + all_srcjars, + java_srcs, + implicit_junit_deps_needed_for_java_compilation, + default_classpath): + if not java_srcs and (not (all_srcjars and ctx.attr.expect_java_output)): + return False + + providers_of_dependencies = _collect_java_providers_of(ctx.attr.deps) + providers_of_dependencies += _collect_java_providers_of( + implicit_junit_deps_needed_for_java_compilation, + ) + providers_of_dependencies += _collect_java_providers_of( + default_classpath, + ) + scala_sources_java_provider = _interim_java_provider_for_java_compilation( + scala_output, + ) + providers_of_dependencies += [scala_sources_java_provider] + + full_java_jar = ctx.actions.declare_file(ctx.label.name + "_java.jar") + + provider = java_common.compile( + ctx, + source_jars = all_srcjars.to_list(), + source_files = java_srcs, + output = full_java_jar, + javac_opts = _expand_location( + ctx, + ctx.attr.javacopts + ctx.attr.javac_jvm_flags + + java_common.default_javac_opts( + java_toolchain = ctx.attr._java_toolchain[java_common.JavaToolchainInfo], + ), + ), + deps = providers_of_dependencies, + #exports can be empty since the manually created provider exposes exports + #needs to be empty since we want the provider.compile_jars to only contain the sources ijar + #workaround until https://github.com/bazelbuild/bazel/issues/3528 is resolved + exports = [], + java_toolchain = find_java_toolchain(ctx, ctx.attr._java_toolchain), + host_javabase = find_java_runtime_toolchain(ctx, ctx.attr._host_javabase), + strict_deps = ctx.fragments.java.strict_java_deps, + ) + + return struct( + ijar = provider.compile_jars.to_list().pop(), + jar = full_java_jar, + source_jars = provider.source_jars, + java_compilation_provider = provider, + ) + +def _adjust_resources_path(path, resource_strip_prefix): + if resource_strip_prefix: + return _adjust_resources_path_by_strip_prefix(path, resource_strip_prefix) + else: + return _adjust_resources_path_by_default_prefixes(path) + +def _add_resources_cmd(ctx): + res_cmd = [] + for f in ctx.files.resources: + c_dir, res_path = _adjust_resources_path( + f.short_path, + ctx.attr.resource_strip_prefix, + ) + target_path = res_path + if target_path[0] == "/": + target_path = target_path[1:] + line = "{target_path}={c_dir}{res_path}\n".format( + res_path = res_path, + target_path = target_path, + c_dir = c_dir, + ) + res_cmd.extend([line]) + return "".join(res_cmd) + +def _adjust_resources_path_by_strip_prefix(path, resource_strip_prefix): + if not path.startswith(resource_strip_prefix): + fail("Resource file %s is not under the specified prefix to strip" % path) + + clean_path = path[len(resource_strip_prefix):] + return resource_strip_prefix, clean_path + +def _collect_java_providers_of(deps): + providers = [] + for dep in deps: + if JavaInfo in dep: + 
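# Worked example (paths are hypothetical) for the resource handling above, in
# the resource_strip_prefix branch:
#
#   _adjust_resources_path_by_strip_prefix(
#       "src/main/custom-res/config/app.conf",
#       "src/main/custom-res/",
#   )  # == ("src/main/custom-res/", "config/app.conf")
#
# so _add_resources_cmd() (used when building the no-source jar) emits the
# zipper mapping
#   config/app.conf=src/main/custom-res/config/app.conf
# while a resource path that does not start with the prefix fails the build.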
providers.append(dep[JavaInfo]) + return providers + +def _interim_java_provider_for_java_compilation(scala_output): + return JavaInfo( + output_jar = scala_output, + compile_jar = scala_output, + neverlink = True, + ) diff --git a/scala/private/phases/phase_coverage_runfiles.bzl b/scala/private/phases/phase_coverage_runfiles.bzl new file mode 100644 index 000000000..00e50cf44 --- /dev/null +++ b/scala/private/phases/phase_coverage_runfiles.bzl @@ -0,0 +1,28 @@ +# +# PHASE: coverage runfiles +# +# DOCUMENT THIS +# +load( + "@io_bazel_rules_scala//scala/private:coverage_replacements_provider.bzl", + _coverage_replacements_provider = "coverage_replacements_provider", +) + +def phase_coverage_runfiles(ctx, p): + coverage_runfiles = [] + rjars = p.compile.rjars + if ctx.configuration.coverage_enabled and _coverage_replacements_provider.is_enabled(ctx): + coverage_replacements = _coverage_replacements_provider.from_ctx( + ctx, + base = p.compile.coverage.replacements, + ).replacements + + rjars = depset([ + coverage_replacements[jar] if jar in coverage_replacements else jar + for jar in rjars.to_list() + ]) + coverage_runfiles = ctx.files._jacocorunner + ctx.files._lcov_merger + coverage_replacements.values() + return struct( + coverage_runfiles = coverage_runfiles, + rjars = rjars, + ) diff --git a/scala/private/phases/phase_declare_executable.bzl b/scala/private/phases/phase_declare_executable.bzl new file mode 100644 index 000000000..c7bce9a20 --- /dev/null +++ b/scala/private/phases/phase_declare_executable.bzl @@ -0,0 +1,15 @@ +# +# PHASE: declare executable +# +# DOCUMENT THIS +# +load( + "@io_bazel_rules_scala//scala/private:rule_impls.bzl", + "is_windows", +) + +def phase_declare_executable(ctx, p): + if (is_windows(ctx)): + return ctx.actions.declare_file("%s.exe" % ctx.label.name) + else: + return ctx.actions.declare_file(ctx.label.name) diff --git a/scala/private/phases/phase_final.bzl b/scala/private/phases/phase_final.bzl new file mode 100644 index 000000000..1fdb9514a --- /dev/null +++ b/scala/private/phases/phase_final.bzl @@ -0,0 +1,27 @@ +# +# PHASE: final +# +# DOCUMENT THIS +# +def phase_binary_final(ctx, p): + defaultInfo = DefaultInfo( + executable = p.declare_executable, + files = depset([p.declare_executable, ctx.outputs.jar]), + runfiles = p.runfiles.runfiles, + ) + return [defaultInfo, p.compile.merged_provider, p.collect_jars.jars2labels] + p.compile.coverage.providers + +def phase_library_final(ctx, p): + defaultInfo = DefaultInfo( + files = depset([ctx.outputs.jar] + p.compile.full_jars), # Here is the default output + runfiles = p.runfiles.runfiles, + ) + return [defaultInfo, p.compile.merged_provider, p.collect_jars.jars2labels] + p.compile.coverage.providers + +def phase_scalatest_final(ctx, p): + defaultInfo = DefaultInfo( + executable = p.declare_executable, + files = depset([p.declare_executable, ctx.outputs.jar]), + runfiles = ctx.runfiles(p.coverage_runfiles.coverage_runfiles, transitive_files = p.runfiles.runfiles.files), + ) + return [defaultInfo, p.compile.merged_provider, p.collect_jars.jars2labels] + p.compile.coverage.providers diff --git a/scala/private/phases/phase_java_wrapper.bzl b/scala/private/phases/phase_java_wrapper.bzl new file mode 100644 index 000000000..1db2cfc8b --- /dev/null +++ b/scala/private/phases/phase_java_wrapper.bzl @@ -0,0 +1,68 @@ +# +# PHASE: java wrapper +# +# DOCUMENT THIS +# +load( + "@io_bazel_rules_scala//scala/private:rule_impls.bzl", + _java_bin = "java_bin", +) + +def phase_repl_java_wrapper(ctx, p): + args = struct( + 
args = " ".join(ctx.attr.scalacopts), + wrapper_preamble = """ +# save stty like in bin/scala +saved_stty=$(stty -g 2>/dev/null) +if [[ ! $? ]]; then + saved_stty="" +fi +function finish() { + if [[ "$saved_stty" != "" ]]; then + stty $saved_stty + saved_stty="" + fi +} +trap finish EXIT +""", + ) + return _phase_default_java_wrapper(ctx, p, args) + +def phase_common_java_wrapper(ctx, p): + return _phase_default_java_wrapper(ctx, p) + +def _phase_default_java_wrapper(ctx, p, _args = struct()): + return _phase_java_wrapper( + ctx, + _args.args if hasattr(_args, "args") else "", + _args.wrapper_preamble if hasattr(_args, "wrapper_preamble") else "", + ) + +def _phase_java_wrapper( + ctx, + args, + wrapper_preamble): + """This creates a wrapper that sets up the correct path + to stand in for the java command.""" + + exec_str = "" + if wrapper_preamble == "": + exec_str = "exec " + + wrapper = ctx.actions.declare_file(ctx.label.name + "_wrapper.sh") + ctx.actions.write( + output = wrapper, + content = """#!/usr/bin/env bash +{preamble} +DEFAULT_JAVABIN={javabin} +JAVA_EXEC_TO_USE=${{REAL_EXTERNAL_JAVA_BIN:-$DEFAULT_JAVABIN}} +{exec_str}$JAVA_EXEC_TO_USE "$@" {args} +""".format( + preamble = wrapper_preamble, + exec_str = exec_str, + javabin = _java_bin(ctx), + args = args, + ), + is_executable = True, + ) + return wrapper diff --git a/scala/private/phases/phase_jvm_flags.bzl b/scala/private/phases/phase_jvm_flags.bzl new file mode 100644 index 000000000..8535a7ad6 --- /dev/null +++ b/scala/private/phases/phase_jvm_flags.bzl @@ -0,0 +1,52 @@ +# +# PHASE: jvm flags +# +# DOCUMENT THIS +# +def phase_jvm_flags(ctx, p): + if ctx.attr.tests_from: + archives = _get_test_archive_jars(ctx, ctx.attr.tests_from) + else: + archives = p.compile.merged_provider.runtime_output_jars + + serialized_archives = _serialize_archives_short_path(archives) + test_suite = _gen_test_suite_flags_based_on_prefixes_and_suffixes( + ctx, + serialized_archives, + ) + return [ + "-ea", + test_suite.archiveFlag, + test_suite.prefixesFlag, + test_suite.suffixesFlag, + test_suite.printFlag, + test_suite.testSuiteFlag, + ] + +def _gen_test_suite_flags_based_on_prefixes_and_suffixes(ctx, archives): + return struct( + archiveFlag = "-Dbazel.discover.classes.archives.file.paths=%s" % + archives, + prefixesFlag = "-Dbazel.discover.classes.prefixes=%s" % ",".join( + ctx.attr.prefixes, + ), + printFlag = "-Dbazel.discover.classes.print.discovered=%s" % + ctx.attr.print_discovered_classes, + suffixesFlag = "-Dbazel.discover.classes.suffixes=%s" % ",".join( + ctx.attr.suffixes, + ), + testSuiteFlag = "-Dbazel.test_suite=%s" % ctx.attr.suite_class, + ) + +def _serialize_archives_short_path(archives): + archives_short_path = "" + for archive in archives: + archives_short_path += archive.short_path + "," + return archives_short_path[:-1] #remove redundant comma + +def _get_test_archive_jars(ctx, test_archives): + flattened_list = [] + for archive in test_archives: + class_jars = [java_output.class_jar for java_output in archive[JavaInfo].outputs.jars] + flattened_list.extend(class_jars) + return flattened_list diff --git a/scala/private/phases/phase_merge_jars.bzl b/scala/private/phases/phase_merge_jars.bzl new file mode 100644 index 000000000..880c5bd4e --- /dev/null +++ b/scala/private/phases/phase_merge_jars.bzl @@ -0,0 +1,30 @@ +# +# PHASE: merge jars +# +# DOCUMENT THIS +# + +def phase_merge_jars(ctx, p): + """Calls Bazel's singlejar utility. 
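# Worked example (attribute values are hypothetical) for phase_jvm_flags above:
# with prefixes = ["com.example."], suffixes = ["Test"], a single discovered
# archive foo/bar_test.jar and suite_class = "com.example.DiscoveredTestSuite",
# the phase returns roughly:
#
#   -ea
#   -Dbazel.discover.classes.archives.file.paths=foo/bar_test.jar
#   -Dbazel.discover.classes.prefixes=com.example.
#   -Dbazel.discover.classes.suffixes=Test
#   -Dbazel.discover.classes.print.discovered=False
#   -Dbazel.test_suite=com.example.DiscoveredTestSuite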
+ + For a full list of available command line options see: + https://github.com/bazelbuild/bazel/blob/697d219526bffbecd29f29b402c9122ec5d9f2ee/src/java_tools/singlejar/java/com/google/devtools/build/singlejar/SingleJar.java#L337 + Use --compression to reduce size of deploy jars. + """ + deploy_jar = ctx.outputs.deploy_jar + jars_list = p.compile.rjars.to_list() + main_class = getattr(ctx.attr, "main_class", "") + progress_message = "Merging Scala jar: %s" % ctx.label + args = ["--compression", "--normalize", "--sources"] + args.extend([j.path for j in jars_list]) + if main_class: + args.extend(["--main_class", main_class]) + args.extend(["--output", deploy_jar.path]) + ctx.actions.run( + inputs = jars_list, + outputs = [deploy_jar], + executable = ctx.executable._singlejar, + mnemonic = "ScalaDeployJar", + progress_message = progress_message, + arguments = args, + ) diff --git a/scala/private/phases/phase_runfiles.bzl b/scala/private/phases/phase_runfiles.bzl new file mode 100644 index 000000000..db5e6e8a2 --- /dev/null +++ b/scala/private/phases/phase_runfiles.bzl @@ -0,0 +1,68 @@ +# +# PHASE: runfiles +# +# DOCUMENT THIS +# +def phase_library_runfiles(ctx, p): + args = struct( + # Using transitive_files since transitive_rjars a depset and avoiding linearization + transitive_files = p.compile.rjars, + ) + return _phase_default_runfiles(ctx, p, args) + +def phase_scalatest_runfiles(ctx, p): + args = "\n".join([ + "-R", + ctx.outputs.jar.short_path, + _scala_test_flags(ctx), + "-C", + "io.bazel.rules.scala.JUnitXmlReporter", + ]) + args_file = ctx.actions.declare_file("%s.args" % ctx.label.name) + ctx.actions.write(args_file, args) + runfiles_ext = [args_file] + + args = struct( + transitive_files = depset( + [p.declare_executable, p.java_wrapper] + ctx.files._java_runtime + runfiles_ext, + transitive = [p.compile.rjars], + ), + args_file = args_file, + ) + return _phase_default_runfiles(ctx, p, args) + +def phase_common_runfiles(ctx, p): + return _phase_default_runfiles(ctx, p) + +def _phase_default_runfiles(ctx, p, _args = struct()): + return _phase_runfiles( + ctx, + _args.transitive_files if hasattr(_args, "transitive_files") else depset( + [p.declare_executable, p.java_wrapper] + ctx.files._java_runtime, + transitive = [p.compile.rjars], + ), + _args.args_file if hasattr(_args, "args_file") else None, + ) + +def _phase_runfiles( + ctx, + transitive_files, + args_file): + return struct( + runfiles = ctx.runfiles( + transitive_files = transitive_files, + collect_data = True, + ), + args_file = args_file, + ) + +def _scala_test_flags(ctx): + # output report test duration + flags = "-oD" + if ctx.attr.full_stacktraces: + flags += "F" + else: + flags += "S" + if not ctx.attr.colors: + flags += "W" + return flags diff --git a/scala/private/phases/phase_scalac_provider.bzl b/scala/private/phases/phase_scalac_provider.bzl new file mode 100644 index 000000000..aff54f32f --- /dev/null +++ b/scala/private/phases/phase_scalac_provider.bzl @@ -0,0 +1,12 @@ +# +# PHASE: scalac provider +# +# DOCUMENT THIS +# +load( + "@io_bazel_rules_scala//scala:providers.bzl", + _ScalacProvider = "ScalacProvider", +) + +def phase_scalac_provider(ctx, p): + return ctx.toolchains["@io_bazel_rules_scala//scala:toolchain_type"].scalac_provider_attr[_ScalacProvider] diff --git a/scala/private/phases/phase_unused_deps_checker.bzl b/scala/private/phases/phase_unused_deps_checker.bzl new file mode 100644 index 000000000..21f0daebb --- /dev/null +++ b/scala/private/phases/phase_unused_deps_checker.bzl @@ -0,0 +1,11 @@ +# 
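# Worked example for _scala_test_flags above: the reporter spec always starts
# with "-oD" (report test durations), then appends "F" or "S" for full/short
# stack traces and "W" when colors are disabled:
#
#   full_stacktraces = True,  colors = True   ->  "-oDF"
#   full_stacktraces = False, colors = True   ->  "-oDS"
#   full_stacktraces = False, colors = False  ->  "-oDSW"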
+# PHASE: unused deps checker +# +# DOCUMENT THIS +# + +def phase_unused_deps_checker(ctx, p): + if ctx.attr.unused_dependency_checker_mode: + return ctx.attr.unused_dependency_checker_mode + else: + return ctx.toolchains["@io_bazel_rules_scala//scala:toolchain_type"].unused_dependency_checker_mode diff --git a/scala/private/phases/phase_write_executable.bzl b/scala/private/phases/phase_write_executable.bzl new file mode 100644 index 000000000..92931196b --- /dev/null +++ b/scala/private/phases/phase_write_executable.bzl @@ -0,0 +1,174 @@ +# +# PHASE: write executable +# +# DOCUMENT THIS +# +load( + "@io_bazel_rules_scala//scala/private:rule_impls.bzl", + "expand_location", + "first_non_empty", + "is_windows", + "java_bin", + "runfiles_root", +) +load( + "@io_bazel_rules_scala//scala/private:coverage_replacements_provider.bzl", + _coverage_replacements_provider = "coverage_replacements_provider", +) + +def phase_scalatest_write_executable(ctx, p): + # jvm_flags passed in on the target override scala_test_jvm_flags passed in on the + # toolchain + final_jvm_flags = first_non_empty( + ctx.attr.jvm_flags, + ctx.toolchains["@io_bazel_rules_scala//scala:toolchain_type"].scala_test_jvm_flags, + ) + args = struct( + rjars = p.coverage_runfiles.rjars, + jvm_flags = [ + "-DRULES_SCALA_MAIN_WS_NAME=%s" % ctx.workspace_name, + "-DRULES_SCALA_ARGS_FILE=%s" % p.runfiles.args_file.short_path, + ] + expand_location(ctx, final_jvm_flags), + use_jacoco = ctx.configuration.coverage_enabled, + ) + return _phase_default_write_executable(ctx, p, args) + +def phase_repl_write_executable(ctx, p): + args = struct( + jvm_flags = ["-Dscala.usejavacp=true"] + ctx.attr.jvm_flags, + main_class = "scala.tools.nsc.MainGenericRunner", + ) + return _phase_default_write_executable(ctx, p, args) + +def phase_junit_test_write_executable(ctx, p): + args = struct( + jvm_flags = p.jvm_flags + ctx.attr.jvm_flags, + main_class = "com.google.testing.junit.runner.BazelTestRunner", + ) + return _phase_default_write_executable(ctx, p, args) + +def phase_common_write_executable(ctx, p): + return _phase_default_write_executable(ctx, p) + +def _phase_default_write_executable(ctx, p, _args = struct()): + return _phase_write_executable( + ctx, + p, + _args.rjars if hasattr(_args, "rjars") else p.compile.rjars, + _args.jvm_flags if hasattr(_args, "jvm_flags") else ctx.attr.jvm_flags, + _args.use_jacoco if hasattr(_args, "use_jacoco") else False, + _args.main_class if hasattr(_args, "main_class") else ctx.attr.main_class, + ) + +def _phase_write_executable( + ctx, + p, + rjars, + jvm_flags, + use_jacoco, + main_class): + executable = p.declare_executable + wrapper = p.java_wrapper + + if (is_windows(ctx)): + return _write_executable_windows(ctx, executable, rjars, main_class, jvm_flags, wrapper, use_jacoco) + else: + return _write_executable_non_windows(ctx, executable, rjars, main_class, jvm_flags, wrapper, use_jacoco) + +def _write_executable_windows(ctx, executable, rjars, main_class, jvm_flags, wrapper, use_jacoco): + # NOTE: `use_jacoco` is currently ignored on Windows. 
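# Worked example (flag values are hypothetical) for the jvm_flags precedence
# above, using first_non_empty() loaded from rule_impls.bzl:
#
#   first_non_empty(["-Xmx2g"], ["-Xmx1g"])  # == ["-Xmx2g"]  target wins
#   first_non_empty([], ["-Xmx1g"])          # == ["-Xmx1g"]  toolchain fallback
#   first_non_empty([], [])                  # == []          all empty -> last arg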
+ # TODO: tests coverage support for Windows + classpath = ";".join( + [("external/%s" % (j.short_path[3:]) if j.short_path.startswith("../") else j.short_path) for j in rjars.to_list()], + ) + jvm_flags_str = ";".join(jvm_flags) + java_for_exe = str(ctx.attr._java_runtime[java_common.JavaRuntimeInfo].java_executable_exec_path) + + cpfile = ctx.actions.declare_file("%s.classpath" % ctx.label.name) + ctx.actions.write(cpfile, classpath) + + ctx.actions.run( + outputs = [executable], + inputs = [cpfile], + executable = ctx.attr._exe.files_to_run.executable, + arguments = [executable.path, ctx.workspace_name, java_for_exe, main_class, cpfile.path, jvm_flags_str], + mnemonic = "ExeLauncher", + progress_message = "Creating exe launcher", + ) + return [] + +def _write_executable_non_windows(ctx, executable, rjars, main_class, jvm_flags, wrapper, use_jacoco): + template = ctx.attr._java_stub_template.files.to_list()[0] + + jvm_flags = " ".join( + [ctx.expand_location(f, ctx.attr.data) for f in jvm_flags], + ) + + javabin = "export REAL_EXTERNAL_JAVA_BIN=${JAVABIN};JAVABIN=%s/%s" % ( + runfiles_root(ctx), + wrapper.short_path, + ) + + if use_jacoco and _coverage_replacements_provider.is_enabled(ctx): + classpath = ctx.configuration.host_path_separator.join( + ["${RUNPATH}%s" % (j.short_path) for j in rjars.to_list() + ctx.files._jacocorunner], + ) + jacoco_metadata_file = ctx.actions.declare_file( + "%s.jacoco_metadata.txt" % ctx.attr.name, + sibling = executable, + ) + ctx.actions.write(jacoco_metadata_file, "\n".join([ + jar.short_path.replace("../", "external/") + for jar in rjars.to_list() + ])) + ctx.actions.expand_template( + template = template, + output = executable, + substitutions = { + "%classpath%": "\"%s\"" % classpath, + "%javabin%": javabin, + "%jarbin%": _jar_path_based_on_java_bin(ctx), + "%jvm_flags%": jvm_flags, + "%needs_runfiles%": "", + "%runfiles_manifest_only%": "", + "%workspace_prefix%": ctx.workspace_name + "/", + "%java_start_class%": "com.google.testing.coverage.JacocoCoverageRunner", + "%set_jacoco_metadata%": "export JACOCO_METADATA_JAR=\"$JAVA_RUNFILES/{}/{}\"".format(ctx.workspace_name, jacoco_metadata_file.short_path), + "%set_jacoco_main_class%": """export JACOCO_MAIN_CLASS={}""".format(main_class), + "%set_jacoco_java_runfiles_root%": """export JACOCO_JAVA_RUNFILES_ROOT=$JAVA_RUNFILES/{}/""".format(ctx.workspace_name), + "%set_java_coverage_new_implementation%": """export JAVA_COVERAGE_NEW_IMPLEMENTATION=YES""", + }, + is_executable = True, + ) + return [jacoco_metadata_file] + else: + # RUNPATH is defined here: + # https://github.com/bazelbuild/bazel/blob/0.4.5/src/main/java/com/google/devtools/build/lib/bazel/rules/java/java_stub_template.txt#L227 + classpath = ctx.configuration.host_path_separator.join( + ["${RUNPATH}%s" % (j.short_path) for j in rjars.to_list()], + ) + ctx.actions.expand_template( + template = template, + output = executable, + substitutions = { + "%classpath%": "\"%s\"" % classpath, + "%java_start_class%": main_class, + "%javabin%": javabin, + "%jarbin%": _jar_path_based_on_java_bin(ctx), + "%jvm_flags%": jvm_flags, + "%needs_runfiles%": "", + "%runfiles_manifest_only%": "", + "%set_jacoco_metadata%": "", + "%set_jacoco_main_class%": "", + "%set_jacoco_java_runfiles_root%": "", + "%workspace_prefix%": ctx.workspace_name + "/", + "%set_java_coverage_new_implementation%": """export JAVA_COVERAGE_NEW_IMPLEMENTATION=NO""", + }, + is_executable = True, + ) + return [] + +def _jar_path_based_on_java_bin(ctx): + java_bin_var = java_bin(ctx) + 
jar_path = java_bin_var.rpartition("/")[0] + "/jar" + return jar_path diff --git a/scala/private/phases/phase_write_manifest.bzl b/scala/private/phases/phase_write_manifest.bzl new file mode 100644 index 000000000..81681a57c --- /dev/null +++ b/scala/private/phases/phase_write_manifest.bzl @@ -0,0 +1,13 @@ +# +# PHASE: write manifest +# +# DOCUMENT THIS +# +load( + "@io_bazel_rules_scala//scala/private:common.bzl", + _write_manifest_file = "write_manifest_file", +) + +def phase_write_manifest(ctx, p): + main_class = getattr(ctx.attr, "main_class", None) + _write_manifest_file(ctx.actions, ctx.outputs.manifest, main_class) diff --git a/scala/private/phases/phases.bzl b/scala/private/phases/phases.bzl new file mode 100644 index 000000000..95e68b575 --- /dev/null +++ b/scala/private/phases/phases.bzl @@ -0,0 +1,131 @@ +""" +Re-expose all the phase APIs and built-in phases +""" + +load( + "@io_bazel_rules_scala//scala/private:phases/api.bzl", + _extras_phases = "extras_phases", + _run_phases = "run_phases", +) +load( + "@io_bazel_rules_scala//scala/private:phases/phase_write_executable.bzl", + _phase_common_write_executable = "phase_common_write_executable", + _phase_junit_test_write_executable = "phase_junit_test_write_executable", + _phase_repl_write_executable = "phase_repl_write_executable", + _phase_scalatest_write_executable = "phase_scalatest_write_executable", +) +load( + "@io_bazel_rules_scala//scala/private:phases/phase_java_wrapper.bzl", + _phase_common_java_wrapper = "phase_common_java_wrapper", + _phase_repl_java_wrapper = "phase_repl_java_wrapper", +) +load( + "@io_bazel_rules_scala//scala/private:phases/phase_collect_jars.bzl", + _phase_common_collect_jars = "phase_common_collect_jars", + _phase_junit_test_collect_jars = "phase_junit_test_collect_jars", + _phase_library_for_plugin_bootstrapping_collect_jars = "phase_library_for_plugin_bootstrapping_collect_jars", + _phase_macro_library_collect_jars = "phase_macro_library_collect_jars", + _phase_repl_collect_jars = "phase_repl_collect_jars", + _phase_scalatest_collect_jars = "phase_scalatest_collect_jars", +) +load( + "@io_bazel_rules_scala//scala/private:phases/phase_compile.bzl", + _phase_binary_compile = "phase_binary_compile", + _phase_common_compile = "phase_common_compile", + _phase_junit_test_compile = "phase_junit_test_compile", + _phase_library_compile = "phase_library_compile", + _phase_library_for_plugin_bootstrapping_compile = "phase_library_for_plugin_bootstrapping_compile", + _phase_macro_library_compile = "phase_macro_library_compile", + _phase_repl_compile = "phase_repl_compile", + _phase_scalatest_compile = "phase_scalatest_compile", +) +load( + "@io_bazel_rules_scala//scala/private:phases/phase_runfiles.bzl", + _phase_common_runfiles = "phase_common_runfiles", + _phase_library_runfiles = "phase_library_runfiles", + _phase_scalatest_runfiles = "phase_scalatest_runfiles", +) +load( + "@io_bazel_rules_scala//scala/private:phases/phase_final.bzl", + _phase_binary_final = "phase_binary_final", + _phase_library_final = "phase_library_final", + _phase_scalatest_final = "phase_scalatest_final", +) +load("@io_bazel_rules_scala//scala/private:phases/phase_scalac_provider.bzl", _phase_scalac_provider = "phase_scalac_provider") +load("@io_bazel_rules_scala//scala/private:phases/phase_write_manifest.bzl", _phase_write_manifest = "phase_write_manifest") +load("@io_bazel_rules_scala//scala/private:phases/phase_collect_srcjars.bzl", _phase_collect_srcjars = "phase_collect_srcjars") 
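# Rough sketch (not the actual rule wiring, which lives in the rule definition
# files) of how a rule implementation is expected to drive run_phases() with
# the phases re-exported in this file: customizable phases are (name, function)
# tuples, the fixed phase runs last, and the rule returns the providers
# produced by that final phase.
def _example_scala_rule_impl(ctx):
    return run_phases(
        ctx,
        [
            ("scalac_provider", phase_scalac_provider),
            ("unused_deps_checker", phase_unused_deps_checker),
            ("write_manifest", phase_write_manifest),
            ("collect_jars", phase_common_collect_jars),
            ("compile", phase_common_compile),
            ("runfiles", phase_library_runfiles),
        ],
        ("final", phase_library_final),
    ).final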
+load("@io_bazel_rules_scala//scala/private:phases/phase_collect_exports_jars.bzl", _phase_collect_exports_jars = "phase_collect_exports_jars") +load("@io_bazel_rules_scala//scala/private:phases/phase_unused_deps_checker.bzl", _phase_unused_deps_checker = "phase_unused_deps_checker") +load("@io_bazel_rules_scala//scala/private:phases/phase_declare_executable.bzl", _phase_declare_executable = "phase_declare_executable") +load("@io_bazel_rules_scala//scala/private:phases/phase_merge_jars.bzl", _phase_merge_jars = "phase_merge_jars") +load("@io_bazel_rules_scala//scala/private:phases/phase_jvm_flags.bzl", _phase_jvm_flags = "phase_jvm_flags") +load("@io_bazel_rules_scala//scala/private:phases/phase_coverage_runfiles.bzl", _phase_coverage_runfiles = "phase_coverage_runfiles") + +# API +run_phases = _run_phases +extras_phases = _extras_phases + +# scalac_provider +phase_scalac_provider = _phase_scalac_provider + +# collect_srcjars +phase_collect_srcjars = _phase_collect_srcjars + +# collect_exports_jars +phase_collect_exports_jars = _phase_collect_exports_jars + +# write_manifest +phase_write_manifest = _phase_write_manifest + +# unused_deps_checker +phase_unused_deps_checker = _phase_unused_deps_checker + +# declare_executable +phase_declare_executable = _phase_declare_executable + +# merge_jars +phase_merge_jars = _phase_merge_jars + +# jvm_flags +phase_jvm_flags = _phase_jvm_flags + +# coverage_runfiles +phase_coverage_runfiles = _phase_coverage_runfiles + +# write_executable +phase_scalatest_write_executable = _phase_scalatest_write_executable +phase_repl_write_executable = _phase_repl_write_executable +phase_junit_test_write_executable = _phase_junit_test_write_executable +phase_common_write_executable = _phase_common_write_executable + +# java_wrapper +phase_repl_java_wrapper = _phase_repl_java_wrapper +phase_common_java_wrapper = _phase_common_java_wrapper + +# collect_jars +phase_scalatest_collect_jars = _phase_scalatest_collect_jars +phase_repl_collect_jars = _phase_repl_collect_jars +phase_macro_library_collect_jars = _phase_macro_library_collect_jars +phase_junit_test_collect_jars = _phase_junit_test_collect_jars +phase_library_for_plugin_bootstrapping_collect_jars = _phase_library_for_plugin_bootstrapping_collect_jars +phase_common_collect_jars = _phase_common_collect_jars + +# compile +phase_binary_compile = _phase_binary_compile +phase_library_compile = _phase_library_compile +phase_library_for_plugin_bootstrapping_compile = _phase_library_for_plugin_bootstrapping_compile +phase_macro_library_compile = _phase_macro_library_compile +phase_junit_test_compile = _phase_junit_test_compile +phase_repl_compile = _phase_repl_compile +phase_scalatest_compile = _phase_scalatest_compile +phase_common_compile = _phase_common_compile + +# runfiles +phase_library_runfiles = _phase_library_runfiles +phase_scalatest_runfiles = _phase_scalatest_runfiles +phase_common_runfiles = _phase_common_runfiles + +# final +phase_binary_final = _phase_binary_final +phase_library_final = _phase_library_final +phase_scalatest_final = _phase_scalatest_final diff --git a/scala/private/rule_impls.bzl b/scala/private/rule_impls.bzl index 0d2c4ce8c..15625830c 100644 --- a/scala/private/rule_impls.bzl +++ b/scala/private/rule_impls.bzl @@ -13,47 +13,16 @@ # limitations under the License. 
"""Rules for supporting the Scala language.""" -load( - "@io_bazel_rules_scala//scala:providers.bzl", - "create_scala_provider", - _ScalacProvider = "ScalacProvider", -) load( "@io_bazel_rules_scala//scala/private:coverage_replacements_provider.bzl", _coverage_replacements_provider = "coverage_replacements_provider", ) load( ":common.bzl", - "add_labels_of_jars_to", - "collect_jars", - "collect_srcjars", - "create_java_provider", - "not_sources_jar", - "write_manifest", + _collect_plugin_paths = "collect_plugin_paths", ) -load("@io_bazel_rules_scala//scala:jars_to_labels.bzl", "JarsToLabelsInfo") -load("@bazel_tools//tools/jdk:toolchain_utils.bzl", "find_java_runtime_toolchain", "find_java_toolchain") - -_java_extension = ".java" - -_scala_extension = ".scala" - -_srcjar_extension = ".srcjar" - -_empty_coverage_struct = struct( - instrumented_files = struct(), - providers = [], - replacements = {}, -) - -def _adjust_resources_path_by_strip_prefix(path, resource_strip_prefix): - if not path.startswith(resource_strip_prefix): - fail("Resource file %s is not under the specified prefix to strip" % path) - - clean_path = path[len(resource_strip_prefix):] - return resource_strip_prefix, clean_path -def _adjust_resources_path_by_default_prefixes(path): +def adjust_resources_path_by_default_prefixes(path): # Here we are looking to find out the offset of this resource inside # any resources folder. We want to return the root to the resources folder # and then the sub path inside it @@ -68,87 +37,23 @@ def _adjust_resources_path_by_default_prefixes(path): return "", path -def _adjust_resources_path(path, resource_strip_prefix): - if resource_strip_prefix: - return _adjust_resources_path_by_strip_prefix(path, resource_strip_prefix) +def expand_location(ctx, flags): + if hasattr(ctx.attr, "data"): + data = ctx.attr.data else: - return _adjust_resources_path_by_default_prefixes(path) - -def _add_resources_cmd(ctx): - res_cmd = [] - for f in ctx.files.resources: - c_dir, res_path = _adjust_resources_path( - f.short_path, - ctx.attr.resource_strip_prefix, - ) - target_path = res_path - if target_path[0] == "/": - target_path = target_path[1:] - line = "{target_path}={c_dir}{res_path}\n".format( - res_path = res_path, - target_path = target_path, - c_dir = c_dir, - ) - res_cmd.extend([line]) - return "".join(res_cmd) - -def _build_nosrc_jar(ctx): - resources = _add_resources_cmd(ctx) - ijar_cmd = "" - - # this ensures the file is not empty - resources += "META-INF/MANIFEST.MF=%s\n" % ctx.outputs.manifest.path - - zipper_arg_path = ctx.actions.declare_file("%s_zipper_args" % ctx.label.name) - ctx.actions.write(zipper_arg_path, resources) - cmd = """ -rm -f {jar_output} -{zipper} c {jar_output} @{path} -# ensures that empty src targets still emit a statsfile -touch {statsfile} -""" + ijar_cmd - - cmd = cmd.format( - path = zipper_arg_path.path, - jar_output = ctx.outputs.jar.path, - zipper = ctx.executable._zipper.path, - statsfile = ctx.outputs.statsfile.path, - ) - - outs = [ctx.outputs.jar, ctx.outputs.statsfile] - inputs = ctx.files.resources + [ctx.outputs.manifest] - - ctx.actions.run_shell( - inputs = inputs, - tools = [ctx.executable._zipper, zipper_arg_path], - outputs = outs, - command = cmd, - progress_message = "scala %s" % ctx.label, - arguments = [], - ) - -def _collect_plugin_paths(plugins): - paths = [] - for p in plugins: - if hasattr(p, "path"): - paths.append(p) - elif hasattr(p, "scala"): - paths.extend([j.class_jar for j in p.scala.outputs.jars]) - elif hasattr(p, "java"): - 
paths.extend([j.class_jar for j in p.java.outputs.jars]) - # support http_file pointed at a jar. http_jar uses ijar, - # which breaks scala macros - - elif hasattr(p, "files"): - paths.extend([f for f in p.files if not_sources_jar(f.basename)]) - return depset(paths) - -def _expand_location(ctx, flags): - return [ctx.expand_location(f, ctx.attr.data) for f in flags] + data = [] + return [ctx.expand_location(f, data) for f in flags] def _join_path(args, sep = ","): return sep.join([f.path for f in args]) +# Return the first non-empty arg. If all are empty, return the last. +def first_non_empty(*args): + for arg in args: + if arg: + return arg + return args[-1] + def compile_scala( ctx, target_label, @@ -172,6 +77,7 @@ def compile_scala( unused_dependency_checker_mode = "off", unused_dependency_checker_ignored_targets = []): # look for any plugins: + input_plugins = plugins plugins = _collect_plugin_paths(plugins) internal_plugin_jars = [] dependency_analyzer_mode = "off" @@ -193,7 +99,7 @@ def compile_scala( transitive_cjars_list = transitive_compile_jars.to_list() indirect_jars = _join_path(transitive_cjars_list) - indirect_targets = ",".join([labels[j.path] for j in transitive_cjars_list]) + indirect_targets = ",".join([str(labels[j.path]) for j in transitive_cjars_list]) current_target = str(target_label) @@ -233,7 +139,7 @@ CurrentTarget: {current_target} ignored_targets = ignored_targets, current_target = current_target, ) - if is_dependency_analyzer_off(ctx) and not _is_plus_one_deps_off(ctx): + if is_dependency_analyzer_off(ctx) and not is_plus_one_deps_off(ctx): compiler_classpath_jars = transitive_compile_jars plugins_list = plugins.to_list() @@ -243,7 +149,7 @@ CurrentTarget: {current_target} compiler_classpath = _join_path(compiler_classpath_jars.to_list(), separator) toolchain = ctx.toolchains["@io_bazel_rules_scala//scala:toolchain_type"] - scalacopts = toolchain.scalacopts + in_scalacopts + scalacopts = [ctx.expand_location(v, input_plugins) for v in toolchain.scalacopts + in_scalacopts] scalac_args = """ Classpath: {cp} @@ -279,7 +185,7 @@ StatsfileOutput: {statsfile_output} resource_src = ",".join([f.path for f in resources]), resource_short_paths = ",".join([f.short_path for f in resources]), resource_dest = ",".join([ - _adjust_resources_path_by_default_prefixes(f.short_path)[1] + adjust_resources_path_by_default_prefixes(f.short_path)[1] for f in resources ]), resource_strip_prefix = resource_strip_prefix, @@ -310,6 +216,13 @@ StatsfileOutput: {statsfile_output} resource_jars + [manifest, argfile] + scalac_inputs ) + # scalac_jvm_flags passed in on the target override scalac_jvm_flags passed in on the + # toolchain + final_scalac_jvm_flags = first_non_empty( + scalac_jvm_flags, + ctx.toolchains["@io_bazel_rules_scala//scala:toolchain_type"].scalac_jvm_flags, + ) + ctx.actions.run( inputs = ins, outputs = outs, @@ -328,212 +241,10 @@ StatsfileOutput: {statsfile_output} # consume the flags on startup. 
arguments = [ "--jvm_flag=%s" % f - for f in _expand_location(ctx, scalac_jvm_flags) + for f in expand_location(ctx, final_scalac_jvm_flags) ] + ["@" + argfile.path], ) -def _interim_java_provider_for_java_compilation(scala_output): - return java_common.create_provider( - use_ijar = False, - compile_time_jars = [scala_output], - runtime_jars = [], - ) - -def _scalac_provider(ctx): - return ctx.toolchains["@io_bazel_rules_scala//scala:toolchain_type"].scalac_provider_attr[_ScalacProvider] - -def try_to_compile_java_jar( - ctx, - scala_output, - all_srcjars, - java_srcs, - implicit_junit_deps_needed_for_java_compilation): - if not java_srcs and (not (all_srcjars and ctx.attr.expect_java_output)): - return False - - providers_of_dependencies = collect_java_providers_of(ctx.attr.deps) - providers_of_dependencies += collect_java_providers_of( - implicit_junit_deps_needed_for_java_compilation, - ) - providers_of_dependencies += collect_java_providers_of( - _scalac_provider(ctx).default_classpath, - ) - scala_sources_java_provider = _interim_java_provider_for_java_compilation( - scala_output, - ) - providers_of_dependencies += [scala_sources_java_provider] - - full_java_jar = ctx.actions.declare_file(ctx.label.name + "_java.jar") - - provider = java_common.compile( - ctx, - source_jars = all_srcjars.to_list(), - source_files = java_srcs, - output = full_java_jar, - javac_opts = _expand_location( - ctx, - ctx.attr.javacopts + ctx.attr.javac_jvm_flags + - java_common.default_javac_opts( - java_toolchain = ctx.attr._java_toolchain[java_common.JavaToolchainInfo], - ), - ), - deps = providers_of_dependencies, - #exports can be empty since the manually created provider exposes exports - #needs to be empty since we want the provider.compile_jars to only contain the sources ijar - #workaround until https://github.com/bazelbuild/bazel/issues/3528 is resolved - exports = [], - java_toolchain = find_java_toolchain(ctx, ctx.attr._java_toolchain), - host_javabase = find_java_runtime_toolchain(ctx, ctx.attr._host_javabase), - strict_deps = ctx.fragments.java.strict_java_deps, - ) - return struct( - ijar = provider.compile_jars.to_list().pop(), - jar = full_java_jar, - source_jars = provider.source_jars, - ) - -def collect_java_providers_of(deps): - providers = [] - for dep in deps: - if JavaInfo in dep: - providers.append(dep[JavaInfo]) - return providers - -def _compile_or_empty( - ctx, - manifest, - jars, - srcjars, - buildijar, - transitive_compile_jars, - jars2labels, - implicit_junit_deps_needed_for_java_compilation, - unused_dependency_checker_mode, - unused_dependency_checker_ignored_targets): - # We assume that if a srcjar is present, it is not empty - if len(ctx.files.srcs) + len(srcjars.to_list()) == 0: - _build_nosrc_jar(ctx) - - # no need to build ijar when empty - return struct( - class_jar = ctx.outputs.jar, - coverage = _empty_coverage_struct, - full_jars = [ctx.outputs.jar], - ijar = ctx.outputs.jar, - ijars = [ctx.outputs.jar], - java_jar = False, - source_jars = [], - ) - else: - in_srcjars = [ - f - for f in ctx.files.srcs - if f.basename.endswith(_srcjar_extension) - ] - all_srcjars = depset(in_srcjars, transitive = [srcjars]) - - java_srcs = [ - f - for f in ctx.files.srcs - if f.basename.endswith(_java_extension) - ] - - # We are not able to verify whether dependencies are used when compiling java sources - # Thus we disable unused dependency checking when java sources are found - if len(java_srcs) != 0: - unused_dependency_checker_mode = "off" - - sources = [ - f - for f in 
ctx.files.srcs - if f.basename.endswith(_scala_extension) - ] + java_srcs - compile_scala( - ctx, - ctx.label, - ctx.outputs.jar, - manifest, - ctx.outputs.statsfile, - sources, - jars, - all_srcjars, - transitive_compile_jars, - ctx.attr.plugins, - ctx.attr.resource_strip_prefix, - ctx.files.resources, - ctx.files.resource_jars, - jars2labels, - ctx.attr.scalacopts, - ctx.attr.print_compile_time, - ctx.attr.expect_java_output, - ctx.attr.scalac_jvm_flags, - ctx.attr._scalac, - unused_dependency_checker_ignored_targets = - unused_dependency_checker_ignored_targets, - unused_dependency_checker_mode = unused_dependency_checker_mode, - ) - - # build ijar if needed - if buildijar: - ijar = java_common.run_ijar( - ctx.actions, - jar = ctx.outputs.jar, - target_label = ctx.label, - java_toolchain = find_java_toolchain(ctx, ctx.attr._java_toolchain), - ) - else: - # macro code needs to be available at compile-time, - # so set ijar == jar - ijar = ctx.outputs.jar - - # compile the java now - java_jar = try_to_compile_java_jar( - ctx, - ijar, - all_srcjars, - java_srcs, - implicit_junit_deps_needed_for_java_compilation, - ) - - full_jars = [ctx.outputs.jar] - ijars = [ijar] - source_jars = [] - if java_jar: - full_jars += [java_jar.jar] - ijars += [java_jar.ijar] - source_jars += java_jar.source_jars - - coverage = _jacoco_offline_instrument(ctx, ctx.outputs.jar) - - return struct( - class_jar = ctx.outputs.jar, - coverage = coverage, - full_jars = full_jars, - ijar = ijar, - ijars = ijars, - java_jar = java_jar, - source_jars = source_jars, - ) - -def _build_deployable(ctx, jars_list): - # This calls bazels singlejar utility. - # For a full list of available command line options see: - # https://github.com/bazelbuild/bazel/blob/master/src/java_tools/singlejar/java/com/google/devtools/build/singlejar/SingleJar.java#L311 - # Use --compression to reduce size of deploy jars. - args = ["--compression", "--normalize", "--sources"] - args.extend([j.path for j in jars_list]) - if getattr(ctx.attr, "main_class", ""): - args.extend(["--main_class", ctx.attr.main_class]) - args.extend(["--output", ctx.outputs.deploy_jar.path]) - ctx.actions.run( - inputs = jars_list, - outputs = [ctx.outputs.deploy_jar], - executable = ctx.executable._singlejar, - mnemonic = "ScalaDeployJar", - progress_message = "scala deployable %s" % ctx.label, - arguments = args, - ) - def _path_is_absolute(path): # Returns true for absolute path in Linux/Mac (i.e., '/') or Windows (i.e., # 'X:\' or 'X:/' where 'X' is a letter), false otherwise. 
@@ -547,162 +258,18 @@ def _path_is_absolute(path): return False -def _runfiles_root(ctx): +def runfiles_root(ctx): return "${TEST_SRCDIR}/%s" % ctx.workspace_name -def _java_bin(ctx): +def java_bin(ctx): java_path = str(ctx.attr._java_runtime[java_common.JavaRuntimeInfo].java_executable_runfiles_path) if _path_is_absolute(java_path): javabin = java_path else: - runfiles_root = _runfiles_root(ctx) - javabin = "%s/%s" % (runfiles_root, java_path) + runfiles_root_var = runfiles_root(ctx) + javabin = "%s/%s" % (runfiles_root_var, java_path) return javabin -def _write_java_wrapper(ctx, args = "", wrapper_preamble = ""): - """This creates a wrapper that sets up the correct path - to stand in for the java command.""" - - exec_str = "" - if wrapper_preamble == "": - exec_str = "exec " - - wrapper = ctx.actions.declare_file(ctx.label.name + "_wrapper.sh") - ctx.actions.write( - output = wrapper, - content = """#!/usr/bin/env bash -{preamble} -DEFAULT_JAVABIN={javabin} -JAVA_EXEC_TO_USE=${{REAL_EXTERNAL_JAVA_BIN:-$DEFAULT_JAVABIN}} -{exec_str}$JAVA_EXEC_TO_USE "$@" {args} -""".format( - preamble = wrapper_preamble, - exec_str = exec_str, - javabin = _java_bin(ctx), - args = args, - ), - is_executable = True, - ) - return wrapper - -def _jar_path_based_on_java_bin(ctx): - java_bin = _java_bin(ctx) - jar_path = java_bin.rpartition("/")[0] + "/jar" - return jar_path - -def _write_executable(ctx, executable, rjars, main_class, jvm_flags, wrapper, use_jacoco): - if (_is_windows(ctx)): - return _write_executable_windows(ctx, executable, rjars, main_class, jvm_flags, wrapper, use_jacoco) - else: - return _write_executable_non_windows(ctx, executable, rjars, main_class, jvm_flags, wrapper, use_jacoco) - -def _write_executable_windows(ctx, executable, rjars, main_class, jvm_flags, wrapper, use_jacoco): - # NOTE: `use_jacoco` is currently ignored on Windows. 
- # TODO: tests coverage support for Windows - classpath = ";".join( - [("external/%s" % (j.short_path[3:]) if j.short_path.startswith("../") else j.short_path) for j in rjars.to_list()], - ) - jvm_flags_str = ";".join(jvm_flags) - java_for_exe = str(ctx.attr._java_runtime[java_common.JavaRuntimeInfo].java_executable_exec_path) - - cpfile = ctx.actions.declare_file("%s.classpath" % ctx.label.name) - ctx.actions.write(cpfile, classpath) - - ctx.actions.run( - outputs = [executable], - inputs = [cpfile], - executable = ctx.attr._exe.files_to_run.executable, - arguments = [executable.path, ctx.workspace_name, java_for_exe, main_class, cpfile.path, jvm_flags_str], - mnemonic = "ExeLauncher", - progress_message = "Creating exe launcher", - ) - return [] - -def _write_executable_non_windows(ctx, executable, rjars, main_class, jvm_flags, wrapper, use_jacoco): - template = ctx.attr._java_stub_template.files.to_list()[0] - - jvm_flags = " ".join( - [ctx.expand_location(f, ctx.attr.data) for f in jvm_flags], - ) - - javabin = "export REAL_EXTERNAL_JAVA_BIN=${JAVABIN};JAVABIN=%s/%s" % ( - _runfiles_root(ctx), - wrapper.short_path, - ) - - if use_jacoco and _coverage_replacements_provider.is_enabled(ctx): - classpath = ctx.configuration.host_path_separator.join( - ["${RUNPATH}%s" % (j.short_path) for j in rjars.to_list() + ctx.files._jacocorunner + ctx.files._lcov_merger], - ) - jacoco_metadata_file = ctx.actions.declare_file( - "%s.jacoco_metadata.txt" % ctx.attr.name, - sibling = executable, - ) - ctx.actions.write(jacoco_metadata_file, "\n".join([ - jar.short_path.replace("../", "external/") - for jar in rjars - ])) - ctx.actions.expand_template( - template = template, - output = executable, - substitutions = { - "%classpath%": "\"%s\"" % classpath, - "%javabin%": javabin, - "%jarbin%": _jar_path_based_on_java_bin(ctx), - "%jvm_flags%": jvm_flags, - "%needs_runfiles%": "", - "%runfiles_manifest_only%": "", - "%workspace_prefix%": ctx.workspace_name + "/", - "%java_start_class%": "com.google.testing.coverage.JacocoCoverageRunner", - "%set_jacoco_metadata%": "export JACOCO_METADATA_JAR=\"$JAVA_RUNFILES/{}/{}\"".format(ctx.workspace_name, jacoco_metadata_file.short_path), - "%set_jacoco_main_class%": """export JACOCO_MAIN_CLASS={}""".format(main_class), - "%set_jacoco_java_runfiles_root%": """export JACOCO_JAVA_RUNFILES_ROOT=$JAVA_RUNFILES/{}/""".format(ctx.workspace_name), - "%set_java_coverage_new_implementation%": """export JAVA_COVERAGE_NEW_IMPLEMENTATION=YES""", - }, - is_executable = True, - ) - return [jacoco_metadata_file] - else: - # RUNPATH is defined here: - # https://github.com/bazelbuild/bazel/blob/0.4.5/src/main/java/com/google/devtools/build/lib/bazel/rules/java/java_stub_template.txt#L227 - classpath = ctx.configuration.host_path_separator.join( - ["${RUNPATH}%s" % (j.short_path) for j in rjars.to_list()], - ) - ctx.actions.expand_template( - template = template, - output = executable, - substitutions = { - "%classpath%": "\"%s\"" % classpath, - "%java_start_class%": main_class, - "%javabin%": javabin, - "%jarbin%": _jar_path_based_on_java_bin(ctx), - "%jvm_flags%": jvm_flags, - "%needs_runfiles%": "", - "%runfiles_manifest_only%": "", - "%set_jacoco_metadata%": "", - "%set_jacoco_main_class%": "", - "%set_jacoco_java_runfiles_root%": "", - "%workspace_prefix%": ctx.workspace_name + "/", - "%set_java_coverage_new_implementation%": """export JAVA_COVERAGE_NEW_IMPLEMENTATION=NO""", - }, - is_executable = True, - ) - return [] - -def _declare_executable(ctx): - if (_is_windows(ctx)): - 
return ctx.actions.declare_file("%s.exe" % ctx.label.name) - else: - return ctx.actions.declare_file(ctx.label.name) - -def _collect_runtime_jars(dep_targets): - runtime_jars = [] - - for dep_target in dep_targets: - runtime_jars.append(dep_target[JavaInfo].transitive_runtime_jars) - - return runtime_jars - def is_dependency_analyzer_on(ctx): if (hasattr(ctx.attr, "_dependency_analyzer_plugin") and # when the strict deps FT is removed the "default" check @@ -714,651 +281,8 @@ def is_dependency_analyzer_on(ctx): def is_dependency_analyzer_off(ctx): return not is_dependency_analyzer_on(ctx) -def _is_plus_one_deps_off(ctx): +def is_plus_one_deps_off(ctx): return ctx.toolchains["@io_bazel_rules_scala//scala:toolchain_type"].plus_one_deps_mode == "off" -# Extract very common code out from dependency analysis into single place -# automatically adds dependency on scala-library and scala-reflect -# collects jars from deps, runtime jars from runtime_deps, and -def _collect_jars_from_common_ctx( - ctx, - base_classpath, - extra_deps = [], - extra_runtime_deps = [], - unused_dependency_checker_is_off = True): - dependency_analyzer_is_off = is_dependency_analyzer_off(ctx) - - deps_jars = collect_jars( - ctx.attr.deps + extra_deps + base_classpath, - dependency_analyzer_is_off, - unused_dependency_checker_is_off, - _is_plus_one_deps_off(ctx), - ) - - ( - cjars, - transitive_rjars, - jars2labels, - transitive_compile_jars, - ) = ( - deps_jars.compile_jars, - deps_jars.transitive_runtime_jars, - deps_jars.jars2labels, - deps_jars.transitive_compile_jars, - ) - - transitive_rjars = depset( - transitive = [transitive_rjars] + - _collect_runtime_jars(ctx.attr.runtime_deps + extra_runtime_deps), - ) - - return struct( - compile_jars = cjars, - jars2labels = jars2labels, - transitive_compile_jars = transitive_compile_jars, - transitive_runtime_jars = transitive_rjars, - ) - -def _lib( - ctx, - base_classpath, - non_macro_lib, - unused_dependency_checker_mode, - unused_dependency_checker_ignored_targets): - # Build up information from dependency-like attributes - - # This will be used to pick up srcjars from non-scala library - # targets (like thrift code generation) - srcjars = collect_srcjars(ctx.attr.deps) - - unused_dependency_checker_is_off = unused_dependency_checker_mode == "off" - jars = _collect_jars_from_common_ctx( - ctx, - base_classpath, - unused_dependency_checker_is_off = unused_dependency_checker_is_off, - ) - - (cjars, transitive_rjars) = (jars.compile_jars, jars.transitive_runtime_jars) - - write_manifest(ctx) - outputs = _compile_or_empty( - ctx, - ctx.outputs.manifest, - cjars, - srcjars, - non_macro_lib, - jars.transitive_compile_jars, - jars.jars2labels.jars_to_labels, - [], - unused_dependency_checker_ignored_targets = [ - target.label - for target in base_classpath + ctx.attr.exports + - unused_dependency_checker_ignored_targets - ], - unused_dependency_checker_mode = unused_dependency_checker_mode, - ) - - transitive_rjars = depset(outputs.full_jars, transitive = [transitive_rjars]) - - _build_deployable(ctx, transitive_rjars.to_list()) - - # Using transitive_files since transitive_rjars a depset and avoiding linearization - runfiles = ctx.runfiles( - transitive_files = transitive_rjars, - collect_data = True, - ) - - # Add information from exports (is key that AFTER all build actions/runfiles analysis) - # Since after, will not show up in deploy_jar or old jars runfiles - # Notice that compile_jars is intentionally transitive for exports - exports_jars = collect_jars(ctx.attr.exports) 
- transitive_rjars = depset( - transitive = [transitive_rjars, exports_jars.transitive_runtime_jars], - ) - - source_jars = _pack_source_jars(ctx) + outputs.source_jars - - scalaattr = create_scala_provider( - class_jar = outputs.class_jar, - compile_jars = depset( - outputs.ijars, - transitive = [exports_jars.compile_jars], - ), - deploy_jar = ctx.outputs.deploy_jar, - full_jars = outputs.full_jars, - ijar = outputs.ijar, - source_jars = source_jars, - statsfile = ctx.outputs.statsfile, - transitive_runtime_jars = transitive_rjars, - ) - - java_provider = create_java_provider(scalaattr, jars.transitive_compile_jars) - - return struct( - files = depset([ctx.outputs.jar] + outputs.full_jars), # Here is the default output - instrumented_files = outputs.coverage.instrumented_files, - jars_to_labels = jars.jars2labels, - providers = [java_provider, jars.jars2labels] + outputs.coverage.providers, - runfiles = runfiles, - scala = scalaattr, - ) - -def get_unused_dependency_checker_mode(ctx): - if ctx.attr.unused_dependency_checker_mode: - return ctx.attr.unused_dependency_checker_mode - else: - return ctx.toolchains["@io_bazel_rules_scala//scala:toolchain_type"].unused_dependency_checker_mode - -def scala_library_impl(ctx): - scalac_provider = _scalac_provider(ctx) - unused_dependency_checker_mode = get_unused_dependency_checker_mode(ctx) - return _lib( - ctx, - scalac_provider.default_classpath, - True, - unused_dependency_checker_mode, - ctx.attr.unused_dependency_checker_ignored_targets, - ) - -def scala_library_for_plugin_bootstrapping_impl(ctx): - scalac_provider = _scalac_provider(ctx) - return _lib( - ctx, - scalac_provider.default_classpath, - True, - unused_dependency_checker_ignored_targets = [], - unused_dependency_checker_mode = "off", - ) - -def scala_macro_library_impl(ctx): - scalac_provider = _scalac_provider(ctx) - unused_dependency_checker_mode = get_unused_dependency_checker_mode(ctx) - return _lib( - ctx, - scalac_provider.default_macro_classpath, - False, # don't build the ijar for macros - unused_dependency_checker_mode, - ctx.attr.unused_dependency_checker_ignored_targets, - ) - -# Common code shared by all scala binary implementations. 
-def _scala_binary_common( - ctx, - executable, - cjars, - rjars, - transitive_compile_time_jars, - jars2labels, - java_wrapper, - unused_dependency_checker_mode, - unused_dependency_checker_ignored_targets, - implicit_junit_deps_needed_for_java_compilation = [], - runfiles_ext = []): - write_manifest(ctx) - outputs = _compile_or_empty( - ctx, - ctx.outputs.manifest, - cjars, - depset(), - False, - transitive_compile_time_jars, - jars2labels.jars_to_labels, - implicit_junit_deps_needed_for_java_compilation, - unused_dependency_checker_ignored_targets = - unused_dependency_checker_ignored_targets, - unused_dependency_checker_mode = unused_dependency_checker_mode, - ) # no need to build an ijar for an executable - rjars = depset(outputs.full_jars, transitive = [rjars]) - - _build_deployable(ctx, rjars.to_list()) - - runfiles = ctx.runfiles( - transitive_files = depset( - [executable, java_wrapper] + ctx.files._java_runtime + runfiles_ext, - transitive = [rjars], - ), - collect_data = True, - ) - - source_jars = _pack_source_jars(ctx) + outputs.source_jars - - scalaattr = create_scala_provider( - class_jar = outputs.class_jar, - compile_jars = depset(outputs.ijars), - deploy_jar = ctx.outputs.deploy_jar, - full_jars = outputs.full_jars, - ijar = outputs.class_jar, # we aren't using ijar here - source_jars = source_jars, - statsfile = ctx.outputs.statsfile, - transitive_runtime_jars = rjars, - ) - - java_provider = create_java_provider(scalaattr, transitive_compile_time_jars) - - return struct( - executable = executable, - coverage = outputs.coverage, - files = depset([executable, ctx.outputs.jar]), - instrumented_files = outputs.coverage.instrumented_files, - providers = [java_provider, jars2labels] + outputs.coverage.providers, - runfiles = runfiles, - scala = scalaattr, - transitive_rjars = - rjars, #calling rules need this for the classpath in the launcher - ) - -def _pack_source_jars(ctx): - source_jars = [] - - # collect .scala sources and pack a source jar for Scala - scala_sources = [ - f - for f in ctx.files.srcs - if f.basename.endswith(_scala_extension) - ] - - # collect .srcjar files and pack them with the scala sources - bundled_source_jars = [ - f - for f in ctx.files.srcs - if f.basename.endswith(_srcjar_extension) - ] - scala_source_jar = java_common.pack_sources( - ctx.actions, - output_jar = ctx.outputs.jar, - sources = scala_sources, - source_jars = bundled_source_jars, - java_toolchain = find_java_toolchain(ctx, ctx.attr._java_toolchain), - host_javabase = find_java_runtime_toolchain(ctx, ctx.attr._host_javabase), - ) - if scala_source_jar: - source_jars.append(scala_source_jar) - - return source_jars - -def scala_binary_impl(ctx): - scalac_provider = _scalac_provider(ctx) - unused_dependency_checker_mode = get_unused_dependency_checker_mode(ctx) - unused_dependency_checker_is_off = unused_dependency_checker_mode == "off" - - jars = _collect_jars_from_common_ctx( - ctx, - scalac_provider.default_classpath, - unused_dependency_checker_is_off = unused_dependency_checker_is_off, - ) - (cjars, transitive_rjars) = (jars.compile_jars, jars.transitive_runtime_jars) - - wrapper = _write_java_wrapper(ctx, "", "") - - executable = _declare_executable(ctx) - - out = _scala_binary_common( - ctx, - executable, - cjars, - transitive_rjars, - jars.transitive_compile_jars, - jars.jars2labels, - wrapper, - unused_dependency_checker_ignored_targets = [ - target.label - for target in scalac_provider.default_classpath + - ctx.attr.unused_dependency_checker_ignored_targets - ], - 
unused_dependency_checker_mode = unused_dependency_checker_mode, - ) - _write_executable( - ctx = ctx, - executable = executable, - jvm_flags = ctx.attr.jvm_flags, - main_class = ctx.attr.main_class, - rjars = out.transitive_rjars, - use_jacoco = False, - wrapper = wrapper, - ) - return out - -def scala_repl_impl(ctx): - scalac_provider = _scalac_provider(ctx) - - unused_dependency_checker_mode = get_unused_dependency_checker_mode(ctx) - unused_dependency_checker_is_off = unused_dependency_checker_mode == "off" - - # need scala-compiler for MainGenericRunner below - jars = _collect_jars_from_common_ctx( - ctx, - scalac_provider.default_repl_classpath, - unused_dependency_checker_is_off = unused_dependency_checker_is_off, - ) - (cjars, transitive_rjars) = (jars.compile_jars, jars.transitive_runtime_jars) - - args = " ".join(ctx.attr.scalacopts) - - executable = _declare_executable(ctx) - - wrapper = _write_java_wrapper( - ctx, - args, - wrapper_preamble = """ -# save stty like in bin/scala -saved_stty=$(stty -g 2>/dev/null) -if [[ ! $? ]]; then - saved_stty="" -fi -function finish() { - if [[ "$saved_stty" != "" ]]; then - stty $saved_stty - saved_stty="" - fi -} -trap finish EXIT -""", - ) - - out = _scala_binary_common( - ctx, - executable, - cjars, - transitive_rjars, - jars.transitive_compile_jars, - jars.jars2labels, - wrapper, - unused_dependency_checker_ignored_targets = [ - target.label - for target in scalac_provider.default_repl_classpath + - ctx.attr.unused_dependency_checker_ignored_targets - ], - unused_dependency_checker_mode = unused_dependency_checker_mode, - ) - _write_executable( - ctx = ctx, - executable = executable, - jvm_flags = ["-Dscala.usejavacp=true"] + ctx.attr.jvm_flags, - main_class = "scala.tools.nsc.MainGenericRunner", - rjars = out.transitive_rjars, - use_jacoco = False, - wrapper = wrapper, - ) - - return out - -def _scala_test_flags(ctx): - # output report test duration - flags = "-oD" - if ctx.attr.full_stacktraces: - flags += "F" - else: - flags += "S" - if not ctx.attr.colors: - flags += "W" - return flags - -def scala_test_impl(ctx): - if len(ctx.attr.suites) != 0: - print("suites attribute is deprecated. 
All scalatest test suites are run") - - scalac_provider = _scalac_provider(ctx) - - unused_dependency_checker_mode = get_unused_dependency_checker_mode(ctx) - unused_dependency_checker_ignored_targets = [ - target.label - for target in scalac_provider.default_classpath + - ctx.attr.unused_dependency_checker_ignored_targets - ] - unused_dependency_checker_is_off = unused_dependency_checker_mode == "off" - - scalatest_base_classpath = scalac_provider.default_classpath + [ctx.attr._scalatest] - jars = _collect_jars_from_common_ctx( - ctx, - scalatest_base_classpath, - extra_runtime_deps = [ - ctx.attr._scalatest_reporter, - ctx.attr._scalatest_runner, - ], - unused_dependency_checker_is_off = unused_dependency_checker_is_off, - ) - ( - cjars, - transitive_rjars, - transitive_compile_jars, - jars_to_labels, - ) = ( - jars.compile_jars, - jars.transitive_runtime_jars, - jars.transitive_compile_jars, - jars.jars2labels, - ) - - args = "\n".join([ - "-R", - ctx.outputs.jar.short_path, - _scala_test_flags(ctx), - "-C", - "io.bazel.rules.scala.JUnitXmlReporter", - ]) - - argsFile = ctx.actions.declare_file("%s.args" % ctx.label.name) - ctx.actions.write(argsFile, args) - - executable = _declare_executable(ctx) - - wrapper = _write_java_wrapper(ctx, "", "") - out = _scala_binary_common( - ctx, - executable, - cjars, - transitive_rjars, - transitive_compile_jars, - jars_to_labels, - wrapper, - unused_dependency_checker_ignored_targets = - unused_dependency_checker_ignored_targets, - unused_dependency_checker_mode = unused_dependency_checker_mode, - runfiles_ext = [argsFile], - ) - - rjars = out.transitive_rjars - - coverage_runfiles = [] - if ctx.configuration.coverage_enabled and _coverage_replacements_provider.is_enabled(ctx): - coverage_replacements = _coverage_replacements_provider.from_ctx( - ctx, - base = out.coverage.replacements, - ).replacements - - rjars = depset([ - coverage_replacements[jar] if jar in coverage_replacements else jar - for jar in rjars - ]) - coverage_runfiles = ctx.files._jacocorunner + ctx.files._lcov_merger + coverage_replacements.values() - - coverage_runfiles.extend(_write_executable( - ctx = ctx, - executable = executable, - jvm_flags = [ - "-DRULES_SCALA_MAIN_WS_NAME=%s" % ctx.workspace_name, - "-DRULES_SCALA_ARGS_FILE=%s" % argsFile.short_path, - ] + ctx.attr.jvm_flags, - main_class = ctx.attr.main_class, - rjars = rjars, - use_jacoco = ctx.configuration.coverage_enabled, - wrapper = wrapper, - )) - - return struct( - executable = executable, - files = out.files, - instrumented_files = out.instrumented_files, - providers = out.providers, - runfiles = ctx.runfiles(coverage_runfiles, transitive_files = out.runfiles.files), - scala = out.scala, - ) - -def _gen_test_suite_flags_based_on_prefixes_and_suffixes(ctx, archives): - return struct( - archiveFlag = "-Dbazel.discover.classes.archives.file.paths=%s" % - archives, - prefixesFlag = "-Dbazel.discover.classes.prefixes=%s" % ",".join( - ctx.attr.prefixes, - ), - printFlag = "-Dbazel.discover.classes.print.discovered=%s" % - ctx.attr.print_discovered_classes, - suffixesFlag = "-Dbazel.discover.classes.suffixes=%s" % ",".join( - ctx.attr.suffixes, - ), - testSuiteFlag = "-Dbazel.test_suite=%s" % ctx.attr.suite_class, - ) - -def _serialize_archives_short_path(archives): - archives_short_path = "" - for archive in archives: - archives_short_path += archive.short_path + "," - return archives_short_path[:-1] #remove redundant comma - -def _get_test_archive_jars(ctx, test_archives): - flattened_list = [] - for archive in 
test_archives: - # because we (rules_scala) use the legacy JavaInfo (java_common.create_provider) - # runtime_output_jars contains more jars than needed - if hasattr(archive, "scala"): - jars = [jar.class_jar for jar in archive.scala.outputs.jars] - else: - jars = archive[JavaInfo].runtime_output_jars - flattened_list.extend(jars) - return flattened_list - -def scala_junit_test_impl(ctx): - if (not (ctx.attr.prefixes) and not (ctx.attr.suffixes)): - fail( - "Setting at least one of the attributes ('prefixes','suffixes') is required", - ) - scalac_provider = _scalac_provider(ctx) - - unused_dependency_checker_mode = get_unused_dependency_checker_mode(ctx) - unused_dependency_checker_ignored_targets = [ - target.label - for target in scalac_provider.default_classpath + - ctx.attr.unused_dependency_checker_ignored_targets - ] + [ - ctx.attr._junit.label, - ctx.attr._hamcrest.label, - ctx.attr.suite_label.label, - ctx.attr._bazel_test_runner.label, - ] - unused_dependency_checker_is_off = unused_dependency_checker_mode == "off" - - jars = _collect_jars_from_common_ctx( - ctx, - scalac_provider.default_classpath, - extra_deps = [ - ctx.attr._junit, - ctx.attr._hamcrest, - ctx.attr.suite_label, - ctx.attr._bazel_test_runner, - ], - unused_dependency_checker_is_off = unused_dependency_checker_is_off, - ) - (cjars, transitive_rjars) = (jars.compile_jars, jars.transitive_runtime_jars) - implicit_junit_deps_needed_for_java_compilation = [ - ctx.attr._junit, - ctx.attr._hamcrest, - ] - - executable = _declare_executable(ctx) - - wrapper = _write_java_wrapper(ctx, "", "") - out = _scala_binary_common( - ctx, - executable, - cjars, - transitive_rjars, - jars.transitive_compile_jars, - jars.jars2labels, - wrapper, - implicit_junit_deps_needed_for_java_compilation = - implicit_junit_deps_needed_for_java_compilation, - unused_dependency_checker_ignored_targets = - unused_dependency_checker_ignored_targets, - unused_dependency_checker_mode = unused_dependency_checker_mode, - ) - - if ctx.attr.tests_from: - archives = _get_test_archive_jars(ctx, ctx.attr.tests_from) - else: - archives = [archive.class_jar for archive in out.scala.outputs.jars] - - serialized_archives = _serialize_archives_short_path(archives) - test_suite = _gen_test_suite_flags_based_on_prefixes_and_suffixes( - ctx, - serialized_archives, - ) - launcherJvmFlags = [ - "-ea", - test_suite.archiveFlag, - test_suite.prefixesFlag, - test_suite.suffixesFlag, - test_suite.printFlag, - test_suite.testSuiteFlag, - ] - _write_executable( - ctx = ctx, - executable = executable, - jvm_flags = launcherJvmFlags + ctx.attr.jvm_flags, - main_class = "com.google.testing.junit.runner.BazelTestRunner", - rjars = out.transitive_rjars, - use_jacoco = False, - wrapper = wrapper, - ) - - return out - -def _jacoco_offline_instrument(ctx, input_jar): - if not ctx.configuration.coverage_enabled or not hasattr(ctx.attr, "_code_coverage_instrumentation_worker"): - return _empty_coverage_struct - - worker_inputs, _, worker_input_manifests = ctx.resolve_command( - tools = [ctx.attr._code_coverage_instrumentation_worker], - ) - - output_jar = ctx.actions.declare_file( - "{}-offline.jar".format(input_jar.basename.split(".")[0]), - ) - in_out_pairs = [ - (input_jar, output_jar), - ] - - args = ctx.actions.args() - args.add_all(in_out_pairs, map_each = _jacoco_offline_instrument_format_each) - args.set_param_file_format("multiline") - args.use_param_file("@%s", use_always = True) - - ctx.actions.run( - mnemonic = "JacocoInstrumenter", - inputs = [in_out_pair[0] for 
in_out_pair in in_out_pairs] + worker_inputs, - outputs = [in_out_pair[1] for in_out_pair in in_out_pairs], - executable = ctx.attr._code_coverage_instrumentation_worker.files_to_run.executable, - input_manifests = worker_input_manifests, - execution_requirements = {"supports-workers": "1"}, - arguments = [args], - ) - - replacements = {i: o for (i, o) in in_out_pairs} - provider = _coverage_replacements_provider.create( - replacements = replacements, - ) - - return struct( - instrumented_files = struct( - dependency_attributes = _coverage_replacements_provider.dependency_attributes, - extensions = ["scala", "java"], - source_attributes = ["srcs"], - ), - providers = [provider], - replacements = replacements, - ) - -def _jacoco_offline_instrument_format_each(in_out_pair): - return (["%s=%s" % (in_out_pair[0].path, in_out_pair[1].path)]) - -def _is_windows(ctx): +def is_windows(ctx): return ctx.configuration.host_path_separator == ";" diff --git a/scala/private/rules/scala_binary.bzl b/scala/private/rules/scala_binary.bzl new file mode 100644 index 000000000..971e24141 --- /dev/null +++ b/scala/private/rules/scala_binary.bzl @@ -0,0 +1,81 @@ +"""Builds Scala binaries""" + +load("@bazel_skylib//lib:dicts.bzl", _dicts = "dicts") +load( + "@io_bazel_rules_scala//scala/private:common_attributes.bzl", + "common_attrs", + "implicit_deps", + "launcher_template", + "resolve_deps", +) +load("@io_bazel_rules_scala//scala/private:common_outputs.bzl", "common_outputs") +load( + "@io_bazel_rules_scala//scala/private:phases/phases.bzl", + "extras_phases", + "phase_binary_compile", + "phase_binary_final", + "phase_common_collect_jars", + "phase_common_java_wrapper", + "phase_common_runfiles", + "phase_common_write_executable", + "phase_declare_executable", + "phase_merge_jars", + "phase_scalac_provider", + "phase_unused_deps_checker", + "phase_write_manifest", + "run_phases", +) + +def _scala_binary_impl(ctx): + return run_phases( + ctx, + # customizable phases + [ + ("scalac_provider", phase_scalac_provider), + ("write_manifest", phase_write_manifest), + ("unused_deps_checker", phase_unused_deps_checker), + ("collect_jars", phase_common_collect_jars), + ("java_wrapper", phase_common_java_wrapper), + ("declare_executable", phase_declare_executable), + # no need to build an ijar for an executable + ("compile", phase_binary_compile), + ("merge_jars", phase_merge_jars), + ("runfiles", phase_common_runfiles), + ("write_executable", phase_common_write_executable), + ], + # fixed phase + ("final", phase_binary_final), + ).final + +_scala_binary_attrs = { + "main_class": attr.string(mandatory = True), + "classpath_resources": attr.label_list(allow_files = True), + "jvm_flags": attr.string_list(), +} + +_scala_binary_attrs.update(launcher_template) + +_scala_binary_attrs.update(implicit_deps) + +_scala_binary_attrs.update(common_attrs) + +_scala_binary_attrs.update(resolve_deps) + +def make_scala_binary(*extras): + return rule( + attrs = _dicts.add( + _scala_binary_attrs, + extras_phases(extras), + *[extra["attrs"] for extra in extras if "attrs" in extra] + ), + executable = True, + fragments = ["java"], + outputs = _dicts.add( + common_outputs, + *[extra["outputs"] for extra in extras if "outputs" in extra] + ), + toolchains = ["@io_bazel_rules_scala//scala:toolchain_type"], + implementation = _scala_binary_impl, + ) + +scala_binary = make_scala_binary() diff --git a/scala/private/rules/scala_doc.bzl b/scala/private/rules/scala_doc.bzl new file mode 100644 index 000000000..59b757d3c --- /dev/null +++ 
b/scala/private/rules/scala_doc.bzl @@ -0,0 +1,103 @@ +"""Scaladoc support""" + +load("@io_bazel_rules_scala//scala/private:common.bzl", "collect_plugin_paths") + +_ScaladocAspectInfo = provider(fields = [ + "src_files", + "compile_jars", + "plugins", +]) + +def _scaladoc_aspect_impl(target, ctx): + """Collect source files and compile_jars from JavaInfo-returning deps.""" + + # We really only care about visited targets with srcs, so only look at those. + if hasattr(ctx.rule.attr, "srcs"): + # Collect only Java and Scala sources enumerated in visited targets, including src_files in deps. + src_files = depset( + direct = [file for file in ctx.rule.files.srcs if file.extension.lower() in ["java", "scala"]], + transitive = [dep[_ScaladocAspectInfo].src_files for dep in ctx.rule.attr.deps if _ScaladocAspectInfo in dep], + ) + + # Collect compile_jars from visited targets' deps. + compile_jars = depset( + direct = [file for file in ctx.rule.files.deps], + transitive = ( + [dep[JavaInfo].compile_jars for dep in ctx.rule.attr.deps if JavaInfo in dep] + + [dep[_ScaladocAspectInfo].compile_jars for dep in ctx.rule.attr.deps if _ScaladocAspectInfo in dep] + ), + ) + + plugins = depset() + if hasattr(ctx.rule.attr, "plugins"): + plugins = depset(direct = ctx.rule.attr.plugins) + + return [_ScaladocAspectInfo( + src_files = src_files, + compile_jars = compile_jars, + plugins = plugins, + )] + else: + return [] + +_scaladoc_aspect = aspect( + implementation = _scaladoc_aspect_impl, + attr_aspects = ["deps"], + required_aspect_providers = [ + [JavaInfo], + ], +) + +def _scala_doc_impl(ctx): + # scaladoc warns if you don't have the output directory already created, which is annoying. + output_path = ctx.actions.declare_directory("{}.html".format(ctx.attr.name)) + + # Collect all source files and compile_jars to pass to scaladoc by way of an aspect. + src_files = depset(transitive = [dep[_ScaladocAspectInfo].src_files for dep in ctx.attr.deps]) + compile_jars = depset(transitive = [dep[_ScaladocAspectInfo].compile_jars for dep in ctx.attr.deps]) + + # Get the 'real' paths to the plugin jars. + plugins = collect_plugin_paths(depset(transitive = [dep[_ScaladocAspectInfo].plugins for dep in ctx.attr.deps]).to_list()) + + # Construct the full classpath depset since we need to add compiler plugins too. + classpath = depset(transitive = [plugins, compile_jars]) + + # Construct scaladoc args, which also include scalac args. + # See `scaladoc -help` for more information. + args = ctx.actions.args() + args.add("-usejavacp") + args.add("-nowarn") # turn off warnings for now since they can obscure actual errors for large scala_doc targets + args.add_all(ctx.attr.scalacopts) + args.add("-d", output_path.path) + args.add_all(plugins, format_each = "-Xplugin:%s") + args.add_joined("-classpath", classpath, join_with = ctx.configuration.host_path_separator) + args.add_all(src_files) + + # Run the scaladoc tool! 
+    ctx.actions.run(
+        inputs = depset(transitive = [src_files, classpath]),
+        outputs = [output_path],
+        executable = ctx.attr._scaladoc.files_to_run.executable,
+        mnemonic = "ScalaDoc",
+        progress_message = "scaladoc {}".format(ctx.label),
+        arguments = [args],
+    )
+
+    return [DefaultInfo(files = depset(direct = [output_path]))]
+
+scala_doc = rule(
+    attrs = {
+        "deps": attr.label_list(
+            aspects = [_scaladoc_aspect],
+            providers = [JavaInfo],
+        ),
+        "scalacopts": attr.string_list(),
+        "_scaladoc": attr.label(
+            cfg = "host",
+            executable = True,
+            default = Label("//src/scala/io/bazel/rules_scala/scaladoc_support:scaladoc_generator"),
+        ),
+    },
+    doc = "Generate Scaladoc HTML documentation for source files from the given dependencies.",
+    implementation = _scala_doc_impl,
+)
diff --git a/scala/private/rules/scala_junit_test.bzl b/scala/private/rules/scala_junit_test.bzl
new file mode 100644
index 000000000..f0a142c60
--- /dev/null
+++ b/scala/private/rules/scala_junit_test.bzl
@@ -0,0 +1,133 @@
+"""Rules for writing tests with JUnit"""
+
+load("@bazel_skylib//lib:dicts.bzl", _dicts = "dicts")
+load(
+    "@io_bazel_rules_scala//scala/private:common_attributes.bzl",
+    "common_attrs",
+    "implicit_deps",
+    "launcher_template",
+)
+load("@io_bazel_rules_scala//scala/private:common_outputs.bzl", "common_outputs")
+load(
+    "@io_bazel_rules_scala//scala/private:phases/phases.bzl",
+    "extras_phases",
+    "phase_binary_final",
+    "phase_common_java_wrapper",
+    "phase_common_runfiles",
+    "phase_declare_executable",
+    "phase_junit_test_collect_jars",
+    "phase_junit_test_compile",
+    "phase_junit_test_write_executable",
+    "phase_jvm_flags",
+    "phase_merge_jars",
+    "phase_scalac_provider",
+    "phase_unused_deps_checker",
+    "phase_write_manifest",
+    "run_phases",
+)
+
+def _scala_junit_test_impl(ctx):
+    if (not (ctx.attr.prefixes) and not (ctx.attr.suffixes)):
+        fail(
+            "Setting at least one of the attributes ('prefixes','suffixes') is required",
+        )
+    return run_phases(
+        ctx,
+        # customizable phases
+        [
+            ("scalac_provider", phase_scalac_provider),
+            ("write_manifest", phase_write_manifest),
+            ("unused_deps_checker", phase_unused_deps_checker),
+            ("collect_jars", phase_junit_test_collect_jars),
+            ("java_wrapper", phase_common_java_wrapper),
+            ("declare_executable", phase_declare_executable),
+            # no need to build an ijar for an executable
+            ("compile", phase_junit_test_compile),
+            ("merge_jars", phase_merge_jars),
+            ("runfiles", phase_common_runfiles),
+            ("jvm_flags", phase_jvm_flags),
+            ("write_executable", phase_junit_test_write_executable),
+        ],
+        # fixed phase
+        ("final", phase_binary_final),
+    ).final
+
+_scala_junit_test_attrs = {
+    "prefixes": attr.string_list(default = []),
+    "suffixes": attr.string_list(default = []),
+    "suite_label": attr.label(
+        default = Label(
+            "//src/java/io/bazel/rulesscala/test_discovery:test_discovery",
+        ),
+    ),
+    "suite_class": attr.string(
+        default = "io.bazel.rulesscala.test_discovery.DiscoveredTestSuite",
+    ),
+    "print_discovered_classes": attr.bool(
+        default = False,
+        mandatory = False,
+    ),
+    "jvm_flags": attr.string_list(),
+    "_junit": attr.label(
+        default = Label(
+            "//external:io_bazel_rules_scala/dependency/junit/junit",
+        ),
+    ),
+    "_hamcrest": attr.label(
+        default = Label(
+            "//external:io_bazel_rules_scala/dependency/hamcrest/hamcrest_core",
+        ),
+    ),
+    "_bazel_test_runner": attr.label(
+        default = Label(
+            "@io_bazel_rules_scala//scala:bazel_test_runner_deploy",
+        ),
+        allow_files = True,
+    ),
+}
+
+_junit_resolve_deps = {
+    "_scala_toolchain":
attr.label_list( + default = [ + Label( + "//external:io_bazel_rules_scala/dependency/scala/scala_library", + ), + Label("//external:io_bazel_rules_scala/dependency/junit/junit"), + Label( + "//external:io_bazel_rules_scala/dependency/hamcrest/hamcrest_core", + ), + ], + allow_files = False, + ), +} + +_scala_junit_test_attrs.update(launcher_template) + +_scala_junit_test_attrs.update(implicit_deps) + +_scala_junit_test_attrs.update(common_attrs) + +_scala_junit_test_attrs.update(_junit_resolve_deps) + +_scala_junit_test_attrs.update({ + "tests_from": attr.label_list(providers = [[JavaInfo]]), +}) + +def make_scala_junit_test(*extras): + return rule( + attrs = _dicts.add( + _scala_junit_test_attrs, + extras_phases(extras), + *[extra["attrs"] for extra in extras if "attrs" in extra] + ), + fragments = ["java"], + outputs = _dicts.add( + common_outputs, + *[extra["outputs"] for extra in extras if "outputs" in extra] + ), + test = True, + toolchains = ["@io_bazel_rules_scala//scala:toolchain_type"], + implementation = _scala_junit_test_impl, + ) + +scala_junit_test = make_scala_junit_test() diff --git a/scala/private/rules/scala_library.bzl b/scala/private/rules/scala_library.bzl new file mode 100644 index 000000000..8173d2446 --- /dev/null +++ b/scala/private/rules/scala_library.bzl @@ -0,0 +1,248 @@ +load("@bazel_skylib//lib:dicts.bzl", _dicts = "dicts") +load( + "@io_bazel_rules_scala//scala/private:common.bzl", + "sanitize_string_for_usage", +) +load( + "@io_bazel_rules_scala//scala/private:common_attributes.bzl", + "common_attrs", + "common_attrs_for_plugin_bootstrapping", + "implicit_deps", + "resolve_deps", +) +load("@io_bazel_rules_scala//scala/private:common_outputs.bzl", "common_outputs") +load( + "@io_bazel_rules_scala//scala/private:coverage_replacements_provider.bzl", + _coverage_replacements_provider = "coverage_replacements_provider", +) +load( + "@io_bazel_rules_scala//scala/private:phases/phases.bzl", + "extras_phases", + "phase_collect_exports_jars", + "phase_collect_srcjars", + "phase_common_collect_jars", + "phase_library_compile", + "phase_library_final", + "phase_library_for_plugin_bootstrapping_collect_jars", + "phase_library_for_plugin_bootstrapping_compile", + "phase_library_runfiles", + "phase_macro_library_collect_jars", + "phase_macro_library_compile", + "phase_merge_jars", + "phase_scalac_provider", + "phase_unused_deps_checker", + "phase_write_manifest", + "run_phases", +) + +## +# Common stuff to _library rules +## + +_library_attrs = { + "main_class": attr.string(), + "exports": attr.label_list( + allow_files = False, + aspects = [_coverage_replacements_provider.aspect], + ), +} + +## +# scala_library +## + +def _scala_library_impl(ctx): + # Build up information from dependency-like attributes + return run_phases( + ctx, + # customizable phases + [ + ("scalac_provider", phase_scalac_provider), + ("collect_srcjars", phase_collect_srcjars), + ("write_manifest", phase_write_manifest), + ("unused_deps_checker", phase_unused_deps_checker), + ("collect_jars", phase_common_collect_jars), + ("compile", phase_library_compile), + ("merge_jars", phase_merge_jars), + ("runfiles", phase_library_runfiles), + ("collect_exports_jars", phase_collect_exports_jars), + ], + # fixed phase + ("final", phase_library_final), + ).final + +_scala_library_attrs = {} + +_scala_library_attrs.update(implicit_deps) + +_scala_library_attrs.update(common_attrs) + +_scala_library_attrs.update(_library_attrs) + +_scala_library_attrs.update(resolve_deps) + +def make_scala_library(*extras): + 
    return rule(
+        attrs = _dicts.add(
+            _scala_library_attrs,
+            extras_phases(extras),
+            *[extra["attrs"] for extra in extras if "attrs" in extra]
+        ),
+        fragments = ["java"],
+        outputs = _dicts.add(
+            common_outputs,
+            *[extra["outputs"] for extra in extras if "outputs" in extra]
+        ),
+        toolchains = ["@io_bazel_rules_scala//scala:toolchain_type"],
+        implementation = _scala_library_impl,
+    )
+
+scala_library = make_scala_library()
+
+# scala_library_suite generates one scala_library per source file, plus a meta
+# target that depends on all of them and exports them to downstream consumers.
+def scala_library_suite(
+        name,
+        srcs = [],
+        exports = [],
+        visibility = None,
+        **kwargs):
+    ts = []
+    for src_file in srcs:
+        n = "%s_lib_%s" % (name, sanitize_string_for_usage(src_file))
+        scala_library(
+            name = n,
+            srcs = [src_file],
+            visibility = visibility,
+            exports = exports,
+            unused_dependency_checker_mode = "off",
+            **kwargs
+        )
+        ts.append(n)
+    scala_library(
+        name = name,
+        visibility = visibility,
+        exports = exports + ts,
+        deps = ts,
+    )
+
+##
+# scala_library_for_plugin_bootstrapping
+##
+
+def _scala_library_for_plugin_bootstrapping_impl(ctx):
+    return run_phases(
+        ctx,
+        # customizable phases
+        [
+            ("scalac_provider", phase_scalac_provider),
+            ("collect_srcjars", phase_collect_srcjars),
+            ("write_manifest", phase_write_manifest),
+            ("collect_jars", phase_library_for_plugin_bootstrapping_collect_jars),
+            ("compile", phase_library_for_plugin_bootstrapping_compile),
+            ("merge_jars", phase_merge_jars),
+            ("runfiles", phase_library_runfiles),
+            ("collect_exports_jars", phase_collect_exports_jars),
+        ],
+        # fixed phase
+        ("final", phase_library_final),
+    ).final
+
+# The Scala compiler plugin used for dependency analysis is itself compiled with
+# `scala_library`. To avoid that cyclic dependency, `scala_library_for_plugin_bootstrapping`
+# omits the plugin-related attributes and thus breaks the cycle.
+_scala_library_for_plugin_bootstrapping_attrs = {}
+
+_scala_library_for_plugin_bootstrapping_attrs.update(implicit_deps)
+
+_scala_library_for_plugin_bootstrapping_attrs.update(_library_attrs)
+
+_scala_library_for_plugin_bootstrapping_attrs.update(resolve_deps)
+
+_scala_library_for_plugin_bootstrapping_attrs.update(
+    common_attrs_for_plugin_bootstrapping,
+)
+
+def make_scala_library_for_plugin_bootstrapping(*extras):
+    return rule(
+        attrs = _dicts.add(
+            _scala_library_for_plugin_bootstrapping_attrs,
+            extras_phases(extras),
+            *[extra["attrs"] for extra in extras if "attrs" in extra]
+        ),
+        fragments = ["java"],
+        outputs = _dicts.add(
+            common_outputs,
+            *[extra["outputs"] for extra in extras if "outputs" in extra]
+        ),
+        toolchains = ["@io_bazel_rules_scala//scala:toolchain_type"],
+        implementation = _scala_library_for_plugin_bootstrapping_impl,
+    )
+
+scala_library_for_plugin_bootstrapping = make_scala_library_for_plugin_bootstrapping()
+
+##
+# scala_macro_library
+##
+
+def _scala_macro_library_impl(ctx):
+    return run_phases(
+        ctx,
+        # customizable phases
+        [
+            ("scalac_provider", phase_scalac_provider),
+            ("collect_srcjars", phase_collect_srcjars),
+            ("write_manifest", phase_write_manifest),
+            ("unused_deps_checker", phase_unused_deps_checker),
+            ("collect_jars", phase_macro_library_collect_jars),
+            ("compile", phase_macro_library_compile),
+            ("merge_jars", phase_merge_jars),
+            ("runfiles", phase_library_runfiles),
+            ("collect_exports_jars", phase_collect_exports_jars),
+        ],
+        # fixed phase
+        ("final", phase_library_final),
+    ).final
+
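For orientation, the `scala_library_suite` macro defined earlier in this file expands each source file into its own `scala_library` and adds a meta target that depends on and exports all of them. A hypothetical BUILD usage might look like the sketch below (the target name, the glob pattern, and the assumption that `scala.bzl` continues to re-export the macro are illustrative):

load("@io_bazel_rules_scala//scala:scala.bzl", "scala_library_suite")

scala_library_suite(
    name = "models",
    # each file becomes its own scala_library target, named roughly
    # models_lib_<sanitized file name>; the "models" target exports them all
    srcs = glob(["*.scala"]),
    visibility = ["//visibility:public"],
)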
+_scala_macro_library_attrs = { + "main_class": attr.string(), + "exports": attr.label_list(allow_files = False), +} + +_scala_macro_library_attrs.update(implicit_deps) + +_scala_macro_library_attrs.update(common_attrs) + +_scala_macro_library_attrs.update(_library_attrs) + +_scala_macro_library_attrs.update(resolve_deps) + +# Set unused_dependency_checker_mode default to off for scala_macro_library +_scala_macro_library_attrs["unused_dependency_checker_mode"] = attr.string( + default = "off", + values = [ + "warn", + "error", + "off", + "", + ], + mandatory = False, +) + +def make_scala_macro_library(*extras): + return rule( + attrs = _dicts.add( + _scala_macro_library_attrs, + extras_phases(extras), + *[extra["attrs"] for extra in extras if "attrs" in extra] + ), + fragments = ["java"], + outputs = _dicts.add( + common_outputs, + *[extra["outputs"] for extra in extras if "outputs" in extra] + ), + toolchains = ["@io_bazel_rules_scala//scala:toolchain_type"], + implementation = _scala_macro_library_impl, + ) + +scala_macro_library = make_scala_macro_library() diff --git a/scala/private/rules/scala_repl.bzl b/scala/private/rules/scala_repl.bzl new file mode 100644 index 000000000..fe454f2ef --- /dev/null +++ b/scala/private/rules/scala_repl.bzl @@ -0,0 +1,80 @@ +"""Rule for launching a Scala REPL with dependencies""" + +load("@bazel_skylib//lib:dicts.bzl", _dicts = "dicts") +load( + "@io_bazel_rules_scala//scala/private:common_attributes.bzl", + "common_attrs", + "implicit_deps", + "launcher_template", + "resolve_deps", +) +load("@io_bazel_rules_scala//scala/private:common_outputs.bzl", "common_outputs") +load( + "@io_bazel_rules_scala//scala/private:phases/phases.bzl", + "extras_phases", + "phase_binary_final", + "phase_common_runfiles", + "phase_declare_executable", + "phase_merge_jars", + "phase_repl_collect_jars", + "phase_repl_compile", + "phase_repl_java_wrapper", + "phase_repl_write_executable", + "phase_scalac_provider", + "phase_unused_deps_checker", + "phase_write_manifest", + "run_phases", +) + +def _scala_repl_impl(ctx): + return run_phases( + ctx, + # customizable phases + [ + ("scalac_provider", phase_scalac_provider), + ("write_manifest", phase_write_manifest), + ("unused_deps_checker", phase_unused_deps_checker), + # need scala-compiler for MainGenericRunner below + ("collect_jars", phase_repl_collect_jars), + ("java_wrapper", phase_repl_java_wrapper), + ("declare_executable", phase_declare_executable), + # no need to build an ijar for an executable + ("compile", phase_repl_compile), + ("merge_jars", phase_merge_jars), + ("runfiles", phase_common_runfiles), + ("write_executable", phase_repl_write_executable), + ], + # fixed phase + ("final", phase_binary_final), + ).final + +_scala_repl_attrs = { + "jvm_flags": attr.string_list(), +} + +_scala_repl_attrs.update(launcher_template) + +_scala_repl_attrs.update(implicit_deps) + +_scala_repl_attrs.update(common_attrs) + +_scala_repl_attrs.update(resolve_deps) + +def make_scala_repl(*extras): + return rule( + attrs = _dicts.add( + _scala_repl_attrs, + extras_phases(extras), + *[extra["attrs"] for extra in extras if "attrs" in extra] + ), + executable = True, + fragments = ["java"], + outputs = _dicts.add( + common_outputs, + *[extra["outputs"] for extra in extras if "outputs" in extra] + ), + toolchains = ["@io_bazel_rules_scala//scala:toolchain_type"], + implementation = _scala_repl_impl, + ) + +scala_repl = make_scala_repl() diff --git a/scala/private/rules/scala_test.bzl b/scala/private/rules/scala_test.bzl new file mode 
100644 index 000000000..4751adce1 --- /dev/null +++ b/scala/private/rules/scala_test.bzl @@ -0,0 +1,142 @@ +"""Rules for writing tests with ScalaTest""" + +load("@bazel_skylib//lib:dicts.bzl", _dicts = "dicts") +load( + "@io_bazel_rules_scala//scala/private:common_attributes.bzl", + "common_attrs", + "implicit_deps", + "launcher_template", +) +load("@io_bazel_rules_scala//scala/private:common.bzl", "sanitize_string_for_usage") +load("@io_bazel_rules_scala//scala/private:common_outputs.bzl", "common_outputs") +load( + "@io_bazel_rules_scala//scala/private:phases/phases.bzl", + "extras_phases", + "phase_common_java_wrapper", + "phase_coverage_runfiles", + "phase_declare_executable", + "phase_merge_jars", + "phase_scalac_provider", + "phase_scalatest_collect_jars", + "phase_scalatest_compile", + "phase_scalatest_final", + "phase_scalatest_runfiles", + "phase_scalatest_write_executable", + "phase_unused_deps_checker", + "phase_write_manifest", + "run_phases", +) + +def _scala_test_impl(ctx): + return run_phases( + ctx, + # customizable phases + [ + ("scalac_provider", phase_scalac_provider), + ("write_manifest", phase_write_manifest), + ("unused_deps_checker", phase_unused_deps_checker), + ("collect_jars", phase_scalatest_collect_jars), + ("java_wrapper", phase_common_java_wrapper), + ("declare_executable", phase_declare_executable), + # no need to build an ijar for an executable + ("compile", phase_scalatest_compile), + ("merge_jars", phase_merge_jars), + ("runfiles", phase_scalatest_runfiles), + ("coverage_runfiles", phase_coverage_runfiles), + ("write_executable", phase_scalatest_write_executable), + ], + # fixed phase + ("final", phase_scalatest_final), + ).final + +_scala_test_attrs = { + "main_class": attr.string( + default = "io.bazel.rulesscala.scala_test.Runner", + ), + "colors": attr.bool(default = True), + "full_stacktraces": attr.bool(default = True), + "jvm_flags": attr.string_list(), + "_scalatest": attr.label( + default = Label( + "//external:io_bazel_rules_scala/dependency/scalatest/scalatest", + ), + ), + "_scalatest_runner": attr.label( + cfg = "host", + default = Label("//src/java/io/bazel/rulesscala/scala_test:runner"), + ), + "_scalatest_reporter": attr.label( + default = Label("//scala/support:test_reporter"), + ), + "_jacocorunner": attr.label( + default = Label("@bazel_tools//tools/jdk:JacocoCoverage"), + ), + "_lcov_merger": attr.label( + default = Label("@bazel_tools//tools/test/CoverageOutputGenerator/java/com/google/devtools/coverageoutputgenerator:Main"), + ), +} + +_test_resolve_deps = { + "_scala_toolchain": attr.label_list( + default = [ + Label( + "//external:io_bazel_rules_scala/dependency/scala/scala_library", + ), + Label( + "//external:io_bazel_rules_scala/dependency/scalatest/scalatest", + ), + ], + allow_files = False, + ), +} + +_scala_test_attrs.update(launcher_template) + +_scala_test_attrs.update(implicit_deps) + +_scala_test_attrs.update(common_attrs) + +_scala_test_attrs.update(_test_resolve_deps) + +def make_scala_test(*extras): + return rule( + attrs = _dicts.add( + _scala_test_attrs, + extras_phases(extras), + *[extra["attrs"] for extra in extras if "attrs" in extra] + ), + executable = True, + fragments = ["java"], + outputs = _dicts.add( + common_outputs, + *[extra["outputs"] for extra in extras if "outputs" in extra] + ), + test = True, + toolchains = ["@io_bazel_rules_scala//scala:toolchain_type"], + implementation = _scala_test_impl, + ) + +scala_test = make_scala_test() + +# This auto-generates a test suite based on the passed set of 
targets +# we will add a root test_suite with the name of the passed name +def scala_test_suite( + name, + srcs = [], + visibility = None, + use_short_names = False, + **kwargs): + ts = [] + i = 0 + for test_file in srcs: + i = i + 1 + n = ("%s_%s" % (name, i)) if use_short_names else ("%s_test_suite_%s" % (name, sanitize_string_for_usage(test_file))) + scala_test( + name = n, + srcs = [test_file], + visibility = visibility, + unused_dependency_checker_mode = "off", + **kwargs + ) + ts.append(n) + native.test_suite(name = name, tests = ts, visibility = visibility) diff --git a/scala/providers.bzl b/scala/providers.bzl index ad17e5b34..66fc97f0b 100644 --- a/scala/providers.bzl +++ b/scala/providers.bzl @@ -1,37 +1,3 @@ -# TODO: this should really be a bazel provider, but we are using old-style rule outputs -# we need to document better what the intellij dependencies on this code actually are -def create_scala_provider( - ijar, - class_jar, - compile_jars, - transitive_runtime_jars, - deploy_jar, - full_jars, - source_jars, - statsfile): - - formatted_for_intellij = [ - struct(class_jar = jar, ijar = None, source_jar = None, source_jars = source_jars) - for jar in full_jars - ] - - rule_outputs = struct( - ijar = ijar, - class_jar = class_jar, - deploy_jar = deploy_jar, - jars = formatted_for_intellij, - statsfile = statsfile, - ) - - # Note that, internally, rules only care about compile_jars and transitive_runtime_jars - # in a similar manner as the java_library and JavaProvider - return struct( - outputs = rule_outputs, - compile_jars = compile_jars, - transitive_runtime_jars = transitive_runtime_jars, - transitive_exports = [], #needed by intellij plugin - ) - ScalacProvider = provider( doc = "ScalacProvider", fields = [ diff --git a/scala/scala.bzl b/scala/scala.bzl index ac53cf986..fb4a9eeaf 100644 --- a/scala/scala.bzl +++ b/scala/scala.bzl @@ -1,704 +1,42 @@ load( - "@io_bazel_rules_scala//scala/private:rule_impls.bzl", - _scala_binary_impl = "scala_binary_impl", - _scala_junit_test_impl = "scala_junit_test_impl", - _scala_library_for_plugin_bootstrapping_impl = "scala_library_for_plugin_bootstrapping_impl", - _scala_library_impl = "scala_library_impl", - _scala_macro_library_impl = "scala_macro_library_impl", - _scala_repl_impl = "scala_repl_impl", - _scala_test_impl = "scala_test_impl", + "@io_bazel_rules_scala//specs2:specs2_junit.bzl", + _specs2_junit_dependencies = "specs2_junit_dependencies", ) load( - "@io_bazel_rules_scala//scala/private:coverage_replacements_provider.bzl", - _coverage_replacements_provider = "coverage_replacements_provider", + "@io_bazel_rules_scala//scala/private:macros/scala_repositories.bzl", + _scala_repositories = "scala_repositories", ) -load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive", "http_file") load( - "@io_bazel_rules_scala//scala:scala_maven_import_external.bzl", - _scala_maven_import_external = "scala_maven_import_external", + "@io_bazel_rules_scala//scala/private:rules/scala_binary.bzl", + _scala_binary = "scala_binary", ) load( - "@io_bazel_rules_scala//scala:scala_cross_version.bzl", - _default_scala_version = "default_scala_version", - _default_scala_version_jar_shas = "default_scala_version_jar_shas", - _extract_major_version = "extract_major_version", - _new_scala_default_repository = "new_scala_default_repository", + "@io_bazel_rules_scala//scala/private:rules/scala_doc.bzl", + _scala_doc = "scala_doc", ) load( - "@io_bazel_rules_scala//specs2:specs2_junit.bzl", - _specs2_junit_dependencies = 
"specs2_junit_dependencies", + "@io_bazel_rules_scala//scala/private:rules/scala_junit_test.bzl", + _scala_junit_test = "scala_junit_test", ) load( - "@io_bazel_rules_scala//scala:plusone.bzl", - _collect_plus_one_deps_aspect = "collect_plus_one_deps_aspect", -) - -_launcher_template = { - "_java_stub_template": attr.label( - default = Label("@java_stub_template//file"), - ), -} - -_implicit_deps = { - "_singlejar": attr.label( - executable = True, - cfg = "host", - default = Label("@bazel_tools//tools/jdk:singlejar"), - allow_files = True, - ), - "_zipper": attr.label( - executable = True, - cfg = "host", - default = Label("@bazel_tools//tools/zip:zipper"), - allow_files = True, - ), - "_java_toolchain": attr.label( - default = Label("@bazel_tools//tools/jdk:current_java_toolchain"), - ), - "_host_javabase": attr.label( - default = Label("@bazel_tools//tools/jdk:current_java_runtime"), - cfg = "host", - ), - "_java_runtime": attr.label( - default = Label("@bazel_tools//tools/jdk:current_java_runtime"), - ), - "_scalac": attr.label( - default = Label( - "@io_bazel_rules_scala//src/java/io/bazel/rulesscala/scalac", - ), - ), - "_exe": attr.label( - executable = True, - cfg = "host", - default = Label("@io_bazel_rules_scala//src/java/io/bazel/rulesscala/exe:exe"), - ), -} - -# Single dep to allow IDEs to pickup all the implicit dependencies. -_resolve_deps = { - "_scala_toolchain": attr.label_list( - default = [ - Label( - "//external:io_bazel_rules_scala/dependency/scala/scala_library", - ), - ], - allow_files = False, - ), -} - -_test_resolve_deps = { - "_scala_toolchain": attr.label_list( - default = [ - Label( - "//external:io_bazel_rules_scala/dependency/scala/scala_library", - ), - Label( - "//external:io_bazel_rules_scala/dependency/scalatest/scalatest", - ), - ], - allow_files = False, - ), -} - -_junit_resolve_deps = { - "_scala_toolchain": attr.label_list( - default = [ - Label( - "//external:io_bazel_rules_scala/dependency/scala/scala_library", - ), - Label("//external:io_bazel_rules_scala/dependency/junit/junit"), - Label( - "//external:io_bazel_rules_scala/dependency/hamcrest/hamcrest_core", - ), - ], - allow_files = False, - ), -} - -# Common attributes reused across multiple rules. -_common_attrs_for_plugin_bootstrapping = { - "srcs": attr.label_list(allow_files = [ - ".scala", - ".srcjar", - ".java", - ]), - "deps": attr.label_list(aspects = [ - _collect_plus_one_deps_aspect, - _coverage_replacements_provider.aspect, - ]), - "plugins": attr.label_list(allow_files = [".jar"]), - "runtime_deps": attr.label_list(providers = [[JavaInfo]]), - "data": attr.label_list(allow_files = True), - "resources": attr.label_list(allow_files = True), - "resource_strip_prefix": attr.string(), - "resource_jars": attr.label_list(allow_files = True), - "scalacopts": attr.string_list(), - "javacopts": attr.string_list(), - "jvm_flags": attr.string_list(), - "scalac_jvm_flags": attr.string_list(), - "javac_jvm_flags": attr.string_list(), - "expect_java_output": attr.bool( - default = True, - mandatory = False, - ), - "print_compile_time": attr.bool( - default = False, - mandatory = False, - ), -} - -_common_attrs = {} - -_common_attrs.update(_common_attrs_for_plugin_bootstrapping) - -_common_attrs.update({ - # using stricts scala deps is done by using command line flag called 'strict_java_deps' - # switching mode to "on" means that ANY API change in a target's transitive dependencies will trigger a recompilation of that target, - # on the other hand any internal change (i.e. 
on code that ijar omits) WON’T trigger recompilation by transitive dependencies - "_dependency_analyzer_plugin": attr.label( - default = Label( - "@io_bazel_rules_scala//third_party/dependency_analyzer/src/main:dependency_analyzer", - ), - allow_files = [".jar"], - mandatory = False, - ), - "unused_dependency_checker_mode": attr.string( - values = [ - "warn", - "error", - "off", - "", - ], - mandatory = False, - ), - "_unused_dependency_checker_plugin": attr.label( - default = Label( - "@io_bazel_rules_scala//third_party/unused_dependency_checker/src/main:unused_dependency_checker", - ), - allow_files = [".jar"], - mandatory = False, - ), - "unused_dependency_checker_ignored_targets": attr.label_list(default = []), - "_code_coverage_instrumentation_worker": attr.label( - default = "@io_bazel_rules_scala//src/java/io/bazel/rulesscala/coverage/instrumenter", - allow_files = True, - executable = True, - cfg = "host", - ), -}) - -_library_attrs = { - "main_class": attr.string(), - "exports": attr.label_list( - allow_files = False, - aspects = [_coverage_replacements_provider.aspect], - ), -} - -_common_outputs = { - "jar": "%{name}.jar", - "deploy_jar": "%{name}_deploy.jar", - "manifest": "%{name}_MANIFEST.MF", - "statsfile": "%{name}.statsfile", -} - -_library_outputs = {} - -_library_outputs.update(_common_outputs) - -_scala_library_attrs = {} - -_scala_library_attrs.update(_implicit_deps) - -_scala_library_attrs.update(_common_attrs) - -_scala_library_attrs.update(_library_attrs) - -_scala_library_attrs.update(_resolve_deps) - -scala_library = rule( - attrs = _scala_library_attrs, - fragments = ["java"], - outputs = _library_outputs, - toolchains = ["@io_bazel_rules_scala//scala:toolchain_type"], - implementation = _scala_library_impl, -) - -# the scala compiler plugin used for dependency analysis is compiled using `scala_library`. 
-# in order to avoid cyclic dependencies `scala_library_for_plugin_bootstrapping` was created for this purpose, -# which does not contain plugin related attributes, and thus avoids the cyclic dependency issue -_scala_library_for_plugin_bootstrapping_attrs = {} - -_scala_library_for_plugin_bootstrapping_attrs.update(_implicit_deps) - -_scala_library_for_plugin_bootstrapping_attrs.update(_library_attrs) - -_scala_library_for_plugin_bootstrapping_attrs.update(_resolve_deps) - -_scala_library_for_plugin_bootstrapping_attrs.update( - _common_attrs_for_plugin_bootstrapping, + "@io_bazel_rules_scala//scala/private:rules/scala_library.bzl", + _scala_library = "scala_library", + _scala_library_for_plugin_bootstrapping = "scala_library_for_plugin_bootstrapping", + _scala_library_suite = "scala_library_suite", + _scala_macro_library = "scala_macro_library", ) - -scala_library_for_plugin_bootstrapping = rule( - attrs = _scala_library_for_plugin_bootstrapping_attrs, - fragments = ["java"], - outputs = _library_outputs, - toolchains = ["@io_bazel_rules_scala//scala:toolchain_type"], - implementation = _scala_library_for_plugin_bootstrapping_impl, -) - -_scala_macro_library_attrs = { - "main_class": attr.string(), - "exports": attr.label_list(allow_files = False), -} - -_scala_macro_library_attrs.update(_implicit_deps) - -_scala_macro_library_attrs.update(_common_attrs) - -_scala_macro_library_attrs.update(_library_attrs) - -_scala_macro_library_attrs.update(_resolve_deps) - -# Set unused_dependency_checker_mode default to off for scala_macro_library -_scala_macro_library_attrs["unused_dependency_checker_mode"] = attr.string( - default = "off", - values = [ - "warn", - "error", - "off", - "", - ], - mandatory = False, -) - -scala_macro_library = rule( - attrs = _scala_macro_library_attrs, - fragments = ["java"], - outputs = _common_outputs, - toolchains = ["@io_bazel_rules_scala//scala:toolchain_type"], - implementation = _scala_macro_library_impl, -) - -_scala_binary_attrs = { - "main_class": attr.string(mandatory = True), - "classpath_resources": attr.label_list(allow_files = True), -} - -_scala_binary_attrs.update(_launcher_template) - -_scala_binary_attrs.update(_implicit_deps) - -_scala_binary_attrs.update(_common_attrs) - -_scala_binary_attrs.update(_resolve_deps) - -scala_binary = rule( - attrs = _scala_binary_attrs, - executable = True, - fragments = ["java"], - outputs = _common_outputs, - toolchains = ["@io_bazel_rules_scala//scala:toolchain_type"], - implementation = _scala_binary_impl, -) - -_scala_test_attrs = { - "main_class": attr.string( - default = "io.bazel.rulesscala.scala_test.Runner", - ), - "suites": attr.string_list(), - "colors": attr.bool(default = True), - "full_stacktraces": attr.bool(default = True), - "_scalatest": attr.label( - default = Label( - "//external:io_bazel_rules_scala/dependency/scalatest/scalatest", - ), - ), - "_scalatest_runner": attr.label( - cfg = "host", - default = Label("//src/java/io/bazel/rulesscala/scala_test:runner"), - ), - "_scalatest_reporter": attr.label( - default = Label("//scala/support:test_reporter"), - ), - "_jacocorunner": attr.label( - default = Label("@bazel_tools//tools/jdk:JacocoCoverage"), - ), - "_lcov_merger": attr.label( - default = Label("@bazel_tools//tools/test/CoverageOutputGenerator/java/com/google/devtools/coverageoutputgenerator:Main"), - ), -} - -_scala_test_attrs.update(_launcher_template) - -_scala_test_attrs.update(_implicit_deps) - -_scala_test_attrs.update(_common_attrs) - -_scala_test_attrs.update(_test_resolve_deps) - 
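The removed lines show `scala_macro_library` reusing the shared attrs but flipping the default of `unused_dependency_checker_mode` to `"off"`. Reduced to its core, the pattern looks like the sketch below (names reuse the earlier sketch; attr objects are immutable, so the whole entry is replaced rather than patched).

```starlark
_macro_attrs_sketch = {}
_macro_attrs_sketch.update(_common_attrs_sketch)

# Replace the whole attr object to change its default; attr objects cannot be
# mutated after construction.
_macro_attrs_sketch["unused_dependency_checker_mode"] = attr.string(
    default = "off",
    values = ["warn", "error", "off", ""],
    mandatory = False,
)
```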
-scala_test = rule( - attrs = _scala_test_attrs, - executable = True, - fragments = ["java"], - outputs = _common_outputs, - test = True, - toolchains = ["@io_bazel_rules_scala//scala:toolchain_type"], - implementation = _scala_test_impl, -) - -_scala_repl_attrs = {} - -_scala_repl_attrs.update(_launcher_template) - -_scala_repl_attrs.update(_implicit_deps) - -_scala_repl_attrs.update(_common_attrs) - -_scala_repl_attrs.update(_resolve_deps) - -scala_repl = rule( - attrs = _scala_repl_attrs, - executable = True, - fragments = ["java"], - outputs = _common_outputs, - toolchains = ["@io_bazel_rules_scala//scala:toolchain_type"], - implementation = _scala_repl_impl, +load( + "@io_bazel_rules_scala//scala/private:rules/scala_repl.bzl", + _scala_repl = "scala_repl", ) - -def _default_scala_extra_jars(): - return { - "2.11": { - "scalatest": { - "version": "3.0.5", - "sha256": "2aafeb41257912cbba95f9d747df9ecdc7ff43f039d35014b4c2a8eb7ed9ba2f", - }, - "scalactic": { - "version": "3.0.5", - "sha256": "84723064f5716f38990fe6e65468aa39700c725484efceef015771d267341cf2", - }, - "scala_xml": { - "version": "1.0.5", - "sha256": "767e11f33eddcd506980f0ff213f9d553a6a21802e3be1330345f62f7ee3d50f", - }, - "scala_parser_combinators": { - "version": "1.0.4", - "sha256": "0dfaafce29a9a245b0a9180ec2c1073d2bd8f0330f03a9f1f6a74d1bc83f62d6", - }, - }, - "2.12": { - "scalatest": { - "version": "3.0.5", - "sha256": "b416b5bcef6720da469a8d8a5726e457fc2d1cd5d316e1bc283aa75a2ae005e5", - }, - "scalactic": { - "version": "3.0.5", - "sha256": "57e25b4fd969b1758fe042595112c874dfea99dca5cc48eebe07ac38772a0c41", - }, - "scala_xml": { - "version": "1.0.5", - "sha256": "035015366f54f403d076d95f4529ce9eeaf544064dbc17c2d10e4f5908ef4256", - }, - "scala_parser_combinators": { - "version": "1.0.4", - "sha256": "282c78d064d3e8f09b3663190d9494b85e0bb7d96b0da05994fe994384d96111", - }, - }, - } - -def scala_repositories( - scala_version_shas = ( - _default_scala_version(), - _default_scala_version_jar_shas(), - ), - maven_servers = ["https://repo1.maven.org/maven2"], - scala_extra_jars = _default_scala_extra_jars(), - fetch_sources = False): - (scala_version, scala_version_jar_shas) = scala_version_shas - major_version = _extract_major_version(scala_version) - - _new_scala_default_repository( - maven_servers = maven_servers, - scala_version = scala_version, - scala_version_jar_shas = scala_version_jar_shas, - fetch_sources = fetch_sources, - ) - - scala_version_extra_jars = scala_extra_jars[major_version] - - _scala_maven_import_external( - name = "io_bazel_rules_scala_scalatest", - artifact = "org.scalatest:scalatest_{major_version}:{extra_jar_version}".format( - major_version = major_version, - extra_jar_version = scala_version_extra_jars["scalatest"]["version"], - ), - jar_sha256 = scala_version_extra_jars["scalatest"]["sha256"], - licenses = ["notice"], - server_urls = maven_servers, - fetch_sources = fetch_sources, - ) - _scala_maven_import_external( - name = "io_bazel_rules_scala_scalactic", - artifact = "org.scalactic:scalactic_{major_version}:{extra_jar_version}".format( - major_version = major_version, - extra_jar_version = scala_version_extra_jars["scalactic"]["version"], - ), - jar_sha256 = scala_version_extra_jars["scalactic"]["sha256"], - licenses = ["notice"], - server_urls = maven_servers, - fetch_sources = fetch_sources, - ) - - _scala_maven_import_external( - name = "io_bazel_rules_scala_scala_xml", - artifact = "org.scala-lang.modules:scala-xml_{major_version}:{extra_jar_version}".format( - major_version = 
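The removed `scala_repositories` body above drives every artifact fetch from a per-major-version table of versions and shas. A condensed sketch of that selection logic follows; the helper name and argument list are assumptions, and the keyword is written as `artifact_sha256`, the name other hunks in this patch migrate to.

```starlark
def _sketch_import_scalatest(major_version, scala_extra_jars, maven_servers, fetch_sources):
    # Pick the version/sha pair for the requested Scala major version.
    info = scala_extra_jars[major_version]["scalatest"]
    _scala_maven_import_external(
        name = "io_bazel_rules_scala_scalatest",
        artifact = "org.scalatest:scalatest_{major}:{version}".format(
            major = major_version,
            version = info["version"],
        ),
        artifact_sha256 = info["sha256"],
        licenses = ["notice"],
        server_urls = maven_servers,
        fetch_sources = fetch_sources,
    )
```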
major_version, - extra_jar_version = scala_version_extra_jars["scala_xml"]["version"], - ), - jar_sha256 = scala_version_extra_jars["scala_xml"]["sha256"], - licenses = ["notice"], - server_urls = maven_servers, - fetch_sources = fetch_sources, - ) - - _scala_maven_import_external( - name = "io_bazel_rules_scala_scala_parser_combinators", - artifact = - "org.scala-lang.modules:scala-parser-combinators_{major_version}:{extra_jar_version}".format( - major_version = major_version, - extra_jar_version = scala_version_extra_jars["scala_parser_combinators"]["version"], - ), - jar_sha256 = scala_version_extra_jars["scala_parser_combinators"]["sha256"], - licenses = ["notice"], - server_urls = maven_servers, - fetch_sources = fetch_sources, - ) - - # used by ScalacProcessor - _scala_maven_import_external( - name = "scalac_rules_commons_io", - artifact = "commons-io:commons-io:2.6", - jar_sha256 = "f877d304660ac2a142f3865badfc971dec7ed73c747c7f8d5d2f5139ca736513", - licenses = ["notice"], - server_urls = maven_servers, - fetch_sources = fetch_sources, - ) - - _scala_maven_import_external( - name = "io_bazel_rules_scala_guava", - artifact = "com.google.guava:guava:21.0", - jar_sha256 = "972139718abc8a4893fa78cba8cf7b2c903f35c97aaf44fa3031b0669948b480", - licenses = ["notice"], - server_urls = maven_servers, - ) - - _scala_maven_import_external( - name = "io_bazel_rules_scala_org_jacoco_org_jacoco_core", - artifact = "org.jacoco:org.jacoco.core:0.7.5.201505241946", - jar_sha256 = "ecf1ad8192926438d0748bfcc3f09bebc7387d2a4184bb3a171a26084677e808", - licenses = ["notice"], - server_urls = maven_servers, - fetch_sources = fetch_sources, - ) - - _scala_maven_import_external( - name = "io_bazel_rules_scala_org_ow2_asm_asm_debug_all", - artifact = "org.ow2.asm:asm-debug-all:5.0.1", - jar_sha256 = "4734de5b515a454b0096db6971fb068e5f70e6f10bbee2b3bd2fdfe5d978ed57", - licenses = ["notice"], - server_urls = maven_servers, - fetch_sources = fetch_sources, - ) - - # Using this and not the bazel regular one due to issue when classpath is too long - # until https://github.com/bazelbuild/bazel/issues/6955 is resolved - if not native.existing_rule("java_stub_template"): - http_archive( - name = "java_stub_template", - sha256 = "1859a37dccaee8c56b98869bf1f22f6f5b909606aff74ddcfd59e9757a038dd5", - urls = ["https://github.com/bazelbuild/rules_scala/archive/8b8271e3ee5709e1340b19790d0b396a0ff3dd0f.tar.gz"], - strip_prefix = "rules_scala-8b8271e3ee5709e1340b19790d0b396a0ff3dd0f/java_stub_template", - ) - - if not native.existing_rule("com_google_protobuf"): - http_archive( - name = "com_google_protobuf", - sha256 = "d82eb0141ad18e98de47ed7ed415daabead6d5d1bef1b8cccb6aa4d108a9008f", - strip_prefix = "protobuf-b4f193788c9f0f05d7e0879ea96cd738630e5d51", - # Commit from 2019-05-15, update to protobuf 3.8 when available. 
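Several of the repositories above are declared only when the workspace has not already provided them, via `native.existing_rule`. A hypothetical `_maybe` wrapper (not part of the patch) makes the guard explicit:

```starlark
def _maybe(repo_rule, name, **kwargs):
    # Skip the declaration if the workspace already defines a repository with
    # this name, so downstream users can pin their own version.
    if not native.existing_rule(name):
        repo_rule(name = name, **kwargs)

# Usage, mirroring the java_stub_template / com_google_protobuf guards above:
# _maybe(http_archive, name = "java_stub_template", urls = [...], sha256 = "...")
```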
- url = "https://github.com/protocolbuffers/protobuf/archive/b4f193788c9f0f05d7e0879ea96cd738630e5d51.tar.gz", - ) - - if not native.existing_rule("zlib"): # needed by com_google_protobuf - http_archive( - name = "zlib", - build_file = "@com_google_protobuf//:third_party/zlib.BUILD", - sha256 = "c3e5e9fdd5004dcb542feda5ee4f0ff0744628baf8ed2dd5d66f8ca1197cb1a1", - strip_prefix = "zlib-1.2.11", - urls = ["https://zlib.net/zlib-1.2.11.tar.gz"], - ) - - native.bind( - name = "io_bazel_rules_scala/dependency/com_google_protobuf/protobuf_java", - actual = "@com_google_protobuf//:protobuf_java", - ) - - native.bind( - name = "io_bazel_rules_scala/dependency/commons_io/commons_io", - actual = "@scalac_rules_commons_io//jar", - ) - - native.bind( - name = "io_bazel_rules_scala/dependency/scalatest/scalatest", - actual = "@io_bazel_rules_scala//scala/scalatest:scalatest", - ) - - native.bind( - name = "io_bazel_rules_scala/dependency/scala/scala_compiler", - actual = "@io_bazel_rules_scala_scala_compiler", - ) - - native.bind( - name = "io_bazel_rules_scala/dependency/scala/scala_library", - actual = "@io_bazel_rules_scala_scala_library", - ) - - native.bind( - name = "io_bazel_rules_scala/dependency/scala/scala_reflect", - actual = "@io_bazel_rules_scala_scala_reflect", - ) - - native.bind( - name = "io_bazel_rules_scala/dependency/scala/scala_xml", - actual = "@io_bazel_rules_scala_scala_xml", - ) - - native.bind( - name = "io_bazel_rules_scala/dependency/scala/parser_combinators", - actual = "@io_bazel_rules_scala_scala_parser_combinators", - ) - - native.bind( - name = "io_bazel_rules_scala/dependency/scala/guava", - actual = "@io_bazel_rules_scala_guava", - ) - -def _sanitize_string_for_usage(s): - res_array = [] - for idx in range(len(s)): - c = s[idx] - if c.isalnum() or c == ".": - res_array.append(c) - else: - res_array.append("_") - return "".join(res_array) - -# This auto-generates a test suite based on the passed set of targets -# we will add a root test_suite with the name of the passed name -def scala_test_suite( - name, - srcs = [], - visibility = None, - use_short_names = False, - **kwargs): - ts = [] - i = 0 - for test_file in srcs: - i = i + 1 - n = ("%s_%s" % (name, i)) if use_short_names else ("%s_test_suite_%s" % (name, _sanitize_string_for_usage(test_file))) - scala_test( - name = n, - srcs = [test_file], - visibility = visibility, - unused_dependency_checker_mode = "off", - **kwargs - ) - ts.append(n) - native.test_suite(name = name, tests = ts, visibility = visibility) - -# Scala library suite generates a series of scala libraries -# then it depends on them with a meta one which exports all the sub targets -def scala_library_suite( - name, - srcs = [], - exports = [], - visibility = None, - **kwargs): - ts = [] - for src_file in srcs: - n = "%s_lib_%s" % (name, _sanitize_string_for_usage(src_file)) - scala_library( - name = n, - srcs = [src_file], - visibility = visibility, - exports = exports, - unused_dependency_checker_mode = "off", - **kwargs - ) - ts.append(n) - scala_library( - name = name, - visibility = visibility, - exports = exports + ts, - deps = ts, - ) - -_scala_junit_test_attrs = { - "prefixes": attr.string_list(default = []), - "suffixes": attr.string_list(default = []), - "suite_label": attr.label( - default = Label( - "//src/java/io/bazel/rulesscala/test_discovery:test_discovery", - ), - ), - "suite_class": attr.string( - default = "io.bazel.rulesscala.test_discovery.DiscoveredTestSuite", - ), - "print_discovered_classes": attr.bool( - default = False, - 
mandatory = False, - ), - "_junit": attr.label( - default = Label( - "//external:io_bazel_rules_scala/dependency/junit/junit", - ), - ), - "_hamcrest": attr.label( - default = Label( - "//external:io_bazel_rules_scala/dependency/hamcrest/hamcrest_core", - ), - ), - "_bazel_test_runner": attr.label( - default = Label( - "@io_bazel_rules_scala//scala:bazel_test_runner_deploy", - ), - allow_files = True, - ), -} - -_scala_junit_test_attrs.update(_launcher_template) - -_scala_junit_test_attrs.update(_implicit_deps) - -_scala_junit_test_attrs.update(_common_attrs) - -_scala_junit_test_attrs.update(_junit_resolve_deps) - -_scala_junit_test_attrs.update({ - "tests_from": attr.label_list(providers = [[JavaInfo]]), -}) - -scala_junit_test = rule( - attrs = _scala_junit_test_attrs, - fragments = ["java"], - outputs = _common_outputs, - test = True, - toolchains = ["@io_bazel_rules_scala//scala:toolchain_type"], - implementation = _scala_junit_test_impl, +load( + "@io_bazel_rules_scala//scala/private:rules/scala_test.bzl", + _scala_test = "scala_test", + _scala_test_suite = "scala_test_suite", ) def scala_specs2_junit_test(name, **kwargs): - scala_junit_test( + _scala_junit_test( name = name, deps = _specs2_junit_dependencies() + kwargs.pop("deps", []), unused_dependency_checker_ignored_targets = @@ -709,3 +47,16 @@ def scala_specs2_junit_test(name, **kwargs): suite_class = "io.bazel.rulesscala.specs2.Specs2DiscoveredTestSuite", **kwargs ) + +# Re-export private rules for public consumption +scala_binary = _scala_binary +scala_doc = _scala_doc +scala_junit_test = _scala_junit_test +scala_library = _scala_library +scala_library_for_plugin_bootstrapping = _scala_library_for_plugin_bootstrapping +scala_library_suite = _scala_library_suite +scala_macro_library = _scala_macro_library +scala_repl = _scala_repl +scala_repositories = _scala_repositories +scala_test = _scala_test +scala_test_suite = _scala_test_suite diff --git a/scala/scala_cross_version.bzl b/scala/scala_cross_version.bzl index 3bd3db270..77095acec 100644 --- a/scala/scala_cross_version.bzl +++ b/scala/scala_cross_version.bzl @@ -57,11 +57,11 @@ def new_scala_default_repository( scala_version, scala_version_jar_shas, maven_servers, - fetch_sources): + fetch_sources=False): _scala_maven_import_external( name = "io_bazel_rules_scala_scala_library", artifact = "org.scala-lang:scala-library:{}".format(scala_version), - jar_sha256 = scala_version_jar_shas["scala_library"], + artifact_sha256 = scala_version_jar_shas["scala_library"], licenses = ["notice"], server_urls = maven_servers, fetch_sources = fetch_sources, @@ -69,7 +69,7 @@ def new_scala_default_repository( _scala_maven_import_external( name = "io_bazel_rules_scala_scala_compiler", artifact = "org.scala-lang:scala-compiler:{}".format(scala_version), - jar_sha256 = scala_version_jar_shas["scala_compiler"], + artifact_sha256 = scala_version_jar_shas["scala_compiler"], licenses = ["notice"], server_urls = maven_servers, fetch_sources = fetch_sources, @@ -77,7 +77,7 @@ def new_scala_default_repository( _scala_maven_import_external( name = "io_bazel_rules_scala_scala_reflect", artifact = "org.scala-lang:scala-reflect:{}".format(scala_version), - jar_sha256 = scala_version_jar_shas["scala_reflect"], + artifact_sha256 = scala_version_jar_shas["scala_reflect"], licenses = ["notice"], server_urls = maven_servers, fetch_sources = fetch_sources, diff --git a/scala/scala_import.bzl b/scala/scala_import.bzl index 0bbfa34ef..e0d889be6 100644 --- a/scala/scala_import.bzl +++ b/scala/scala_import.bzl @@ 
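The `scala_specs2_junit_test` macro above follows the usual inject-defaults-then-delegate pattern: pop any caller-supplied `deps`, prepend the specs2/JUnit runtime, and forward everything else to the underlying test rule. A compact sketch, assuming the loads visible earlier in the file (the ignored-targets attribute set by the real macro is elided):

```starlark
def sketch_specs2_junit_test(name, **kwargs):
    # Prepend the specs2/JUnit runtime to whatever deps the caller passed.
    deps = _specs2_junit_dependencies() + kwargs.pop("deps", [])
    _scala_junit_test(
        name = name,
        deps = deps,
        suite_class = "io.bazel.rulesscala.specs2.Specs2DiscoveredTestSuite",
        **kwargs
    )
```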
-13,9 +13,10 @@ def _scala_import_impl(ctx): intellij_metadata, ) = (target_data.code_jars, target_data.intellij_metadata) current_jars = depset(current_target_compile_jars) - exports = _collect(ctx.attr.exports) - transitive_runtime_jars = _collect_runtime(ctx.attr.runtime_deps) - jars = _collect(ctx.attr.deps) + exports = java_common.merge([export[JavaInfo] for export in ctx.attr.exports]) + transitive_runtime_jars = \ + java_common.merge([dep[JavaInfo] for dep in ctx.attr.runtime_deps]) \ + .transitive_runtime_jars jars2labels = {} _collect_labels(ctx.attr.deps, jars2labels) _collect_labels(ctx.attr.exports, jars2labels) #untested @@ -24,64 +25,30 @@ def _scala_import_impl(ctx): ctx.label, jars2labels, ) #last to override the label of the export compile jars to the current target - return struct( - scala = struct( - outputs = struct(jars = intellij_metadata), - ), - providers = [ - _create_provider( - current_jars, - transitive_runtime_jars, - jars, - exports, - ctx.attr.neverlink, - ctx.file.srcjar, - intellij_metadata, - ), - DefaultInfo( - files = current_jars, - ), - JarsToLabelsInfo(jars_to_labels = jars2labels), - ], - ) - -def _create_provider( - current_target_compile_jars, - transitive_runtime_jars, - jars, - exports, - neverlink, - source_jar, - intellij_metadata): - transitive_runtime_jars = [ - transitive_runtime_jars, - jars.transitive_runtime_jars, - exports.transitive_runtime_jars, - ] - if not neverlink: - transitive_runtime_jars.append(current_target_compile_jars) - - source_jars = [] - - if source_jar: - source_jars.append(source_jar) + if current_target_compile_jars: + current_target_providers = [_new_java_info(ctx, jar) for jar in current_target_compile_jars] else: - for metadata in intellij_metadata: - source_jars.extend(metadata.source_jars) + # TODO(#8867): Migrate away from the placeholder jar hack when #8867 is fixed. 
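This hunk replaces the hand-rolled `_collect` / `_collect_runtime` depset bookkeeping with `java_common.merge` over the deps' `JavaInfo` providers. The essence of the new approach, as a sketch:

```starlark
def _merged_java_info(targets):
    # One merged JavaInfo whose transitive fields cover every target in the list.
    return java_common.merge([t[JavaInfo] for t in targets])

# e.g. the transitive runtime jars of all runtime_deps:
#   _merged_java_info(ctx.attr.runtime_deps).transitive_runtime_jars
```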
+ current_target_providers = [_new_java_info(ctx, ctx.file._placeholder_jar)] - return java_common.create_provider( - use_ijar = False, - compile_time_jars = depset( - transitive = [current_target_compile_jars, exports.compile_jars], + return [ + java_common.merge(current_target_providers), + DefaultInfo( + files = current_jars, ), - transitive_compile_time_jars = depset(transitive = [ - jars.transitive_compile_jars, - current_target_compile_jars, - exports.transitive_compile_jars, - ]), - transitive_runtime_jars = depset(transitive = transitive_runtime_jars), - source_jars = source_jars, + JarsToLabelsInfo(jars_to_labels = jars2labels), + ] + +def _new_java_info(ctx, jar): + return JavaInfo( + output_jar = jar, + compile_jar = jar, + exports = [target[JavaInfo] for target in ctx.attr.exports], + deps = [target[JavaInfo] for target in ctx.attr.deps], + runtime_deps = [target[JavaInfo] for target in ctx.attr.runtime_deps], + source_jar = ctx.file.srcjar, + neverlink = ctx.attr.neverlink, ) def _add_labels_of_current_code_jars(code_jars, label, jars2labels): @@ -122,24 +89,6 @@ def _filter_out_non_code_jars(files): def _is_source_jar(file): return file.basename.endswith("-sources.jar") -# TODO: it seems this could be reworked to use java_common.merge -def _collect(deps): - transitive_compile_jars = [] - runtime_jars = [] - compile_jars = [] - - for dep_target in deps: - java_provider = dep_target[JavaInfo] - compile_jars.append(java_provider.compile_jars) - transitive_compile_jars.append(java_provider.transitive_compile_time_jars) - runtime_jars.append(java_provider.transitive_runtime_jars) - - return struct( - transitive_runtime_jars = depset(transitive = runtime_jars), - transitive_compile_jars = depset(transitive = transitive_compile_jars), - compile_jars = depset(transitive = compile_jars), - ) - def _collect_labels(deps, jars2labels): for dep_target in deps: if JarsToLabelsInfo in dep_target: @@ -150,14 +99,6 @@ def _collect_labels(deps, jars2labels): for jar in java_provider.compile_jars.to_list(): jars2labels[jar.path] = dep_target.label -def _collect_runtime(runtime_deps): - jar_deps = [] - for dep_target in runtime_deps: - java_provider = dep_target[JavaInfo] - jar_deps.append(java_provider.transitive_runtime_jars) - - return depset(transitive = jar_deps) - scala_import = rule( implementation = _scala_import_impl, attrs = { @@ -169,5 +110,9 @@ scala_import = rule( "exports": attr.label_list(), "neverlink": attr.bool(), "srcjar": attr.label(allow_single_file = True), + "_placeholder_jar": attr.label( + allow_single_file = True, + default = Label("@io_bazel_rules_scala//scala:libPlaceHolderClassToCreateEmptyJarForScalaImport.jar"), + ), }, ) diff --git a/scala/scala_maven_import_external.bzl b/scala/scala_maven_import_external.bzl index 0c2dda477..51b1f962c 100644 --- a/scala/scala_maven_import_external.bzl +++ b/scala/scala_maven_import_external.bzl @@ -228,7 +228,7 @@ jvm_import_external = repository_rule( ), "extra_build_file_content": attr.string(), }, - environ = [_FETCH_SOURCES_ENV_VAR_NAME] + environ = [_FETCH_SOURCES_ENV_VAR_NAME], ) def scala_maven_import_external( @@ -382,7 +382,7 @@ java_import_external( "http://repo1.maven.org/maven2/com/google/guava/guava/20.0/guava-20.0.jar", "http://maven.ibiblio.org/maven2/com/google/guava/guava/20.0/guava-20.0.jar", ], - jar_sha256 = "36a666e3b71ae7f0f0dca23654b67e086e6c93d192f60ba5dfd5519db6c288c8", + artifact_sha256 = "36a666e3b71ae7f0f0dca23654b67e086e6c93d192f60ba5dfd5519db6c288c8", deps = [ "@com_google_code_findbugs_jsr305", 
"@com_google_errorprone_error_prone_annotations", @@ -397,13 +397,13 @@ java_import_external( "http://repo1.maven.org/maven2/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar", "http://maven.ibiblio.org/maven2/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar", ], - jar_sha256 = "905721a0eea90a81534abb7ee6ef4ea2e5e645fa1def0a5cd88402df1b46c9ed", + artifact_sha256 = "905721a0eea90a81534abb7ee6ef4ea2e5e645fa1def0a5cd88402df1b46c9ed", ) java_import_external( name = "com_google_errorprone_error_prone_annotations", licenses = ["notice"], # Apache 2.0 - jar_sha256 = "e7749ffdf03fb8ebe08a727ea205acb301c8791da837fee211b99b04f9d79c46", + artifact_sha256 = "e7749ffdf03fb8ebe08a727ea205acb301c8791da837fee211b99b04f9d79c46", jar_urls = [ "http://bazel-mirror.storage.googleapis.com/repo1.maven.org/maven2/com/google/errorprone/error_prone_annotations/2.0.15/error_prone_annotations-2.0.15.jar", "http://maven.ibiblio.org/maven2/com/google/errorprone/error_prone_annotations/2.0.15/error_prone_annotations-2.0.15.jar", @@ -440,9 +440,8 @@ will be loaded into a runtime environment where certain dependencies can be reasonably expected to already be provided. """ -def java_import_external(jar_sha256, **kwargs): +def java_import_external(**kwargs): jvm_import_external( rule_name = "java_import", - jar_sha256 = jar_sha256, **kwargs ) diff --git a/scala/scala_toolchain.bzl b/scala/scala_toolchain.bzl index 157eae715..f57e23302 100644 --- a/scala/scala_toolchain.bzl +++ b/scala/scala_toolchain.bzl @@ -10,6 +10,8 @@ def _scala_toolchain_impl(ctx): unused_dependency_checker_mode = ctx.attr.unused_dependency_checker_mode, plus_one_deps_mode = ctx.attr.plus_one_deps_mode, enable_code_coverage_aspect = ctx.attr.enable_code_coverage_aspect, + scalac_jvm_flags = ctx.attr.scalac_jvm_flags, + scala_test_jvm_flags = ctx.attr.scala_test_jvm_flags, ) return [toolchain] @@ -32,6 +34,8 @@ scala_toolchain = rule( "enable_code_coverage_aspect": attr.string( default = "off", values = ["off", "on"], - ) + ), + "scalac_jvm_flags": attr.string_list(), + "scala_test_jvm_flags": attr.string_list(), }, ) diff --git a/scala_proto/BUILD b/scala_proto/BUILD index 08d8bd7d9..55fe207ad 100644 --- a/scala_proto/BUILD +++ b/scala_proto/BUILD @@ -1,4 +1,5 @@ load("//scala_proto:scala_proto_toolchain.bzl", "scala_proto_toolchain") +load("//scala_proto:default_dep_sets.bzl", "DEFAULT_SCALAPB_COMPILE_DEPS", "DEFAULT_SCALAPB_GRPC_DEPS") toolchain_type( name = "toolchain_type", @@ -7,10 +8,10 @@ toolchain_type( scala_proto_toolchain( name = "default_toolchain_impl", - with_grpc = True, - with_flat_package=False, - with_single_line_to_string=False, visibility = ["//visibility:public"], + with_flat_package = False, + with_grpc = True, + with_single_line_to_string = False, ) toolchain( @@ -20,14 +21,13 @@ toolchain( visibility = ["//visibility:public"], ) - scala_proto_toolchain( name = "enable_all_options_toolchain_impl", + visibility = ["//visibility:public"], + with_flat_package = True, with_grpc = True, - with_flat_package=True, # with_java=True, - with_single_line_to_string=True, - visibility = ["//visibility:public"], + with_single_line_to_string = True, ) toolchain( @@ -36,3 +36,15 @@ toolchain( toolchain_type = "@io_bazel_rules_scala//scala_proto:toolchain_type", visibility = ["//visibility:public"], ) + +java_library( + name = "default_scalapb_compile_dependencies", + visibility = ["//visibility:public"], + exports = DEFAULT_SCALAPB_COMPILE_DEPS, +) + +java_library( + name = "default_scalapb_grpc_dependencies", + visibility = 
["//visibility:public"], + exports = DEFAULT_SCALAPB_GRPC_DEPS, +) diff --git a/scala_proto/default_dep_sets.bzl b/scala_proto/default_dep_sets.bzl index 5ff3e840b..345890cd8 100644 --- a/scala_proto/default_dep_sets.bzl +++ b/scala_proto/default_dep_sets.bzl @@ -1,4 +1,3 @@ - # These are the compile/runtime dependencies needed for scalapb compilation # and grpc compile/runtime. # @@ -24,6 +23,9 @@ DEFAULT_SCALAPB_GRPC_DEPS = [ "//external:io_bazel_rules_scala/dependency/proto/grpc_context", "//external:io_bazel_rules_scala/dependency/proto/guava", "//external:io_bazel_rules_scala/dependency/proto/opencensus_api", + "//external:io_bazel_rules_scala/dependency/proto/opencensus_impl", + "//external:io_bazel_rules_scala/dependency/proto/disruptor", + "//external:io_bazel_rules_scala/dependency/proto/opencensus_impl_core", "//external:io_bazel_rules_scala/dependency/proto/opencensus_contrib_grpc_metrics", "//external:io_bazel_rules_scala/dependency/proto/google_instrumentation", "//external:io_bazel_rules_scala/dependency/proto/netty_codec", @@ -37,4 +39,3 @@ DEFAULT_SCALAPB_GRPC_DEPS = [ "//external:io_bazel_rules_scala/dependency/proto/netty_common", "//external:io_bazel_rules_scala/dependency/proto/netty_handler_proxy", ] - diff --git a/scala_proto/private/proto_to_scala_src.bzl b/scala_proto/private/proto_to_scala_src.bzl index 58d565fe3..b3f7727de 100644 --- a/scala_proto/private/proto_to_scala_src.bzl +++ b/scala_proto/private/proto_to_scala_src.bzl @@ -4,24 +4,20 @@ load( ) load("//scala/private:rule_impls.bzl", "compile_scala") - def _root_path(f): if f.is_source: return f.owner.workspace_root return "/".join([f.root.path, f.owner.workspace_root]) - def _colon_paths(data): return ":".join([ f.path for f in sorted(data) ]) - def encode_named_generators(named_generators): return ",".join([k + "=" + v for (k, v) in sorted(named_generators.items())]) - def proto_to_scala_src(ctx, label, code_generator, compile_proto, include_proto, transitive_proto_paths, flags, jar_output, named_generators, extra_generator_jars): worker_content = "{output}\n{included_proto}\n{flags_arg}\n{transitive_proto_paths}\n{inputs}\n{protoc}\n{extra_generator_pairs}\n{extra_cp_entries}".format( output = jar_output.path, @@ -33,8 +29,8 @@ def proto_to_scala_src(ctx, label, code_generator, compile_proto, include_proto, # Pass inputs seprately because they doesn't always match to imports (ie blacklisted protos are excluded) inputs = _colon_paths(compile_proto), protoc = ctx.executable._protoc.path, - extra_generator_pairs= "-" + encode_named_generators(named_generators), - extra_cp_entries = "-" + _colon_paths(extra_generator_jars) + extra_generator_pairs = "-" + encode_named_generators(named_generators), + extra_cp_entries = "-" + _colon_paths(extra_generator_jars), ) argfile = ctx.actions.declare_file( "%s_worker_input" % label.name, @@ -44,11 +40,10 @@ def proto_to_scala_src(ctx, label, code_generator, compile_proto, include_proto, ctx.actions.run( executable = code_generator.files_to_run, inputs = compile_proto + include_proto + [argfile, ctx.executable._protoc] + extra_generator_jars, + tools = compile_proto, outputs = [jar_output], mnemonic = "ProtoScalaPBRule", progress_message = "creating scalapb files %s" % ctx.label, execution_requirements = {"supports-workers": "1"}, arguments = ["@" + argfile.path], ) - - diff --git a/scala_proto/private/scala_proto_default_repositories.bzl b/scala_proto/private/scala_proto_default_repositories.bzl index f11bd4900..8a1899345 100644 --- 
a/scala_proto/private/scala_proto_default_repositories.bzl +++ b/scala_proto/private/scala_proto_default_repositories.bzl @@ -11,7 +11,7 @@ load( def scala_proto_default_repositories( scala_version = _default_scala_version(), - maven_servers = ["https://repo1.maven.org/maven2"]): + maven_servers = ["https://repo.maven.apache.org/maven2"]): major_version = _extract_major_version(scala_version) scala_jar_shas = { @@ -43,7 +43,7 @@ def scala_proto_default_repositories( "com.thesamet.scalapb:compilerplugin:0.8.4", major_version, ), - jar_sha256 = scala_version_jar_shas["scalapb_plugin"], + artifact_sha256 = scala_version_jar_shas["scalapb_plugin"], licenses = ["notice"], server_urls = maven_servers, ) @@ -59,7 +59,7 @@ def scala_proto_default_repositories( "com.thesamet.scalapb:protoc-bridge:0.7.3", major_version, ), - jar_sha256 = scala_version_jar_shas["protoc_bridge"], + artifact_sha256 = scala_version_jar_shas["protoc_bridge"], licenses = ["notice"], server_urls = maven_servers, ) @@ -75,7 +75,7 @@ def scala_proto_default_repositories( "com.thesamet.scalapb:scalapbc:0.8.4", major_version, ), - jar_sha256 = scala_version_jar_shas["scalapbc"], + artifact_sha256 = scala_version_jar_shas["scalapbc"], licenses = ["notice"], server_urls = maven_servers, ) @@ -90,7 +90,7 @@ def scala_proto_default_repositories( "com.thesamet.scalapb:scalapb-runtime:0.8.4", major_version, ), - jar_sha256 = scala_version_jar_shas["scalapb_runtime"], + artifact_sha256 = scala_version_jar_shas["scalapb_runtime"], licenses = ["notice"], server_urls = maven_servers, ) @@ -105,7 +105,7 @@ def scala_proto_default_repositories( "com.thesamet.scalapb:scalapb-runtime-grpc:0.8.4", major_version, ), - jar_sha256 = scala_version_jar_shas["scalapb_runtime_grpc"], + artifact_sha256 = scala_version_jar_shas["scalapb_runtime_grpc"], licenses = ["notice"], server_urls = maven_servers, ) @@ -120,7 +120,7 @@ def scala_proto_default_repositories( "com.thesamet.scalapb:lenses:0.8.4", major_version, ), - jar_sha256 = scala_version_jar_shas["scalapb_lenses"], + artifact_sha256 = scala_version_jar_shas["scalapb_lenses"], licenses = ["notice"], server_urls = maven_servers, ) @@ -135,7 +135,7 @@ def scala_proto_default_repositories( "com.lihaoyi:fastparse:1.0.0", major_version, ), - jar_sha256 = scala_version_jar_shas["scalapb_fastparse"], + artifact_sha256 = scala_version_jar_shas["scalapb_fastparse"], licenses = ["notice"], server_urls = maven_servers, ) @@ -148,7 +148,7 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = "scala_proto_rules_grpc_core", artifact = "io.grpc:grpc-core:1.19.0", - jar_sha256 = "3cfaae2db268e4da2609079cecade8434afcb7ab23a126a57d870b722b2b6ab9", + artifact_sha256 = "3cfaae2db268e4da2609079cecade8434afcb7ab23a126a57d870b722b2b6ab9", licenses = ["notice"], server_urls = maven_servers, ) @@ -161,7 +161,7 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = "scala_proto_rules_grpc_stub", artifact = "io.grpc:grpc-stub:1.19.0", - jar_sha256 = "711dad5734b4e8602a271cb383eda504d6d1bf5385ced045a0ca91176ae73821", + artifact_sha256 = "711dad5734b4e8602a271cb383eda504d6d1bf5385ced045a0ca91176ae73821", licenses = ["notice"], server_urls = maven_servers, ) @@ -174,7 +174,7 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = "scala_proto_rules_grpc_protobuf", artifact = "io.grpc:grpc-protobuf:1.19.0", - jar_sha256 = "37e50ab7de4a50db4c9f9a2f095ffc51df49e36c9ab7fffb1f3ad20ab6f47022", + artifact_sha256 = 
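The default Maven mirror for `scala_proto_default_repositories` changes above to `repo.maven.apache.org`, but the parameter remains overridable per workspace. A usage sketch with a hypothetical internal mirror URL:

```starlark
# WORKSPACE (mirror URL below is hypothetical):
scala_proto_default_repositories(
    maven_servers = ["https://mirror.example.com/maven2"],
)
```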
"37e50ab7de4a50db4c9f9a2f095ffc51df49e36c9ab7fffb1f3ad20ab6f47022", licenses = ["notice"], server_urls = maven_servers, ) @@ -187,7 +187,7 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = "scala_proto_rules_grpc_netty", artifact = "io.grpc:grpc-netty:1.19.0", - jar_sha256 = "08604191fa77ef644cd9d7323d633333eceb800831805395a21b5c8e7d02caf0", + artifact_sha256 = "08604191fa77ef644cd9d7323d633333eceb800831805395a21b5c8e7d02caf0", licenses = ["notice"], server_urls = maven_servers, ) @@ -200,7 +200,7 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = "scala_proto_rules_grpc_context", artifact = "io.grpc:grpc-context:1.19.0", - jar_sha256 = "8f4df8618c500f3c1fdf88b755caeb14fe2846ea59a9e762f614852178b64318", + artifact_sha256 = "8f4df8618c500f3c1fdf88b755caeb14fe2846ea59a9e762f614852178b64318", licenses = ["notice"], server_urls = maven_servers, ) @@ -215,7 +215,7 @@ def scala_proto_default_repositories( # io.grpc:grpc-core:1.19.0 defines a dependency on guava 26.0-android # see https://search.maven.org/artifact/io.grpc/grpc-core/1.19.0/jar artifact = "com.google.guava:guava:26.0-android", - jar_sha256 = "1d044ebb866ef08b7d04e998b4260c9b52fab6e6d6b68d207859486bb3686cd5", + artifact_sha256 = "1d044ebb866ef08b7d04e998b4260c9b52fab6e6d6b68d207859486bb3686cd5", licenses = ["notice"], server_urls = maven_servers, ) @@ -228,7 +228,7 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = "scala_proto_rules_google_instrumentation", artifact = "com.google.instrumentation:instrumentation-api:0.3.0", - jar_sha256 = "671f7147487877f606af2c7e39399c8d178c492982827305d3b1c7f5b04f1145", + artifact_sha256 = "671f7147487877f606af2c7e39399c8d178c492982827305d3b1c7f5b04f1145", licenses = ["notice"], server_urls = maven_servers, ) @@ -241,7 +241,7 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = "scala_proto_rules_netty_codec", artifact = "io.netty:netty-codec:4.1.32.Final", - jar_sha256 = "dbd6cea7d7bf5a2604e87337cb67c9468730d599be56511ed0979aacb309f879", + artifact_sha256 = "dbd6cea7d7bf5a2604e87337cb67c9468730d599be56511ed0979aacb309f879", licenses = ["notice"], server_urls = maven_servers, ) @@ -254,7 +254,7 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = "scala_proto_rules_netty_codec_http", artifact = "io.netty:netty-codec-http:4.1.32.Final", - jar_sha256 = "db2c22744f6a4950d1817e4e1a26692e53052c5d54abe6cceecd7df33f4eaac3", + artifact_sha256 = "db2c22744f6a4950d1817e4e1a26692e53052c5d54abe6cceecd7df33f4eaac3", licenses = ["notice"], server_urls = maven_servers, ) @@ -267,7 +267,7 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = "scala_proto_rules_netty_codec_socks", artifact = "io.netty:netty-codec-socks:4.1.32.Final", - jar_sha256 = "fe2f2e97d6c65dc280623dcfd24337d8a5c7377049c120842f2c59fb83d7408a", + artifact_sha256 = "fe2f2e97d6c65dc280623dcfd24337d8a5c7377049c120842f2c59fb83d7408a", licenses = ["notice"], server_urls = maven_servers, ) @@ -280,7 +280,7 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = "scala_proto_rules_netty_codec_http2", artifact = "io.netty:netty-codec-http2:4.1.32.Final", - jar_sha256 = "4d4c6cfc1f19efb969b9b0ae6cc977462d202867f7dcfee6e9069977e623a2f5", + artifact_sha256 = "4d4c6cfc1f19efb969b9b0ae6cc977462d202867f7dcfee6e9069977e623a2f5", licenses = ["notice"], server_urls = maven_servers, ) @@ -293,7 +293,7 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = 
"scala_proto_rules_netty_handler", artifact = "io.netty:netty-handler:4.1.32.Final", - jar_sha256 = "07d9756e48b5f6edc756e33e8b848fb27ff0b1ae087dab5addca6c6bf17cac2d", + artifact_sha256 = "07d9756e48b5f6edc756e33e8b848fb27ff0b1ae087dab5addca6c6bf17cac2d", licenses = ["notice"], server_urls = maven_servers, ) @@ -306,7 +306,7 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = "scala_proto_rules_netty_buffer", artifact = "io.netty:netty-buffer:4.1.32.Final", - jar_sha256 = "8ac0e30048636bd79ae205c4f9f5d7544290abd3a7ed39d8b6d97dfe3795afc1", + artifact_sha256 = "8ac0e30048636bd79ae205c4f9f5d7544290abd3a7ed39d8b6d97dfe3795afc1", licenses = ["notice"], server_urls = maven_servers, ) @@ -319,7 +319,7 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = "scala_proto_rules_netty_transport", artifact = "io.netty:netty-transport:4.1.32.Final", - jar_sha256 = "175bae0d227d7932c0c965c983efbb3cf01f39abe934f5c4071d0319784715fb", + artifact_sha256 = "175bae0d227d7932c0c965c983efbb3cf01f39abe934f5c4071d0319784715fb", licenses = ["notice"], server_urls = maven_servers, ) @@ -332,7 +332,7 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = "scala_proto_rules_netty_resolver", artifact = "io.netty:netty-resolver:4.1.32.Final", - jar_sha256 = "9b4a19982047a95ea4791a7ad7ad385c7a08c2ac75f0a3509cc213cb32a726ae", + artifact_sha256 = "9b4a19982047a95ea4791a7ad7ad385c7a08c2ac75f0a3509cc213cb32a726ae", licenses = ["notice"], server_urls = maven_servers, ) @@ -345,7 +345,7 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = "scala_proto_rules_netty_common", artifact = "io.netty:netty-common:4.1.32.Final", - jar_sha256 = "cc993e660f8f8e3b033f1d25a9e2f70151666bdf878d460a6508cb23daa696dc", + artifact_sha256 = "cc993e660f8f8e3b033f1d25a9e2f70151666bdf878d460a6508cb23daa696dc", licenses = ["notice"], server_urls = maven_servers, ) @@ -358,7 +358,7 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = "scala_proto_rules_netty_handler_proxy", artifact = "io.netty:netty-handler-proxy:4.1.32.Final", - jar_sha256 = "10d1081ed114bb0e76ebbb5331b66a6c3189cbdefdba232733fc9ca308a6ea34", + artifact_sha256 = "10d1081ed114bb0e76ebbb5331b66a6c3189cbdefdba232733fc9ca308a6ea34", licenses = ["notice"], server_urls = maven_servers, ) @@ -370,8 +370,8 @@ def scala_proto_default_repositories( _scala_maven_import_external( name = "scala_proto_rules_opencensus_api", - artifact = "io.opencensus:opencensus-api:0.18.0", - jar_sha256 = "45421ffe95271aba94686ed8d4c5070fe77dc2ff0b922688097f0dd40f1931b1", + artifact = "io.opencensus:opencensus-api:0.22.1", + artifact_sha256 = "62a0503ee81856ba66e3cde65dee3132facb723a4fa5191609c84ce4cad36127", licenses = ["notice"], server_urls = maven_servers, ) @@ -381,10 +381,49 @@ def scala_proto_default_repositories( actual = "@scala_proto_rules_opencensus_api//jar", ) + _scala_maven_import_external( + name = "scala_proto_rules_opencensus_impl", + artifact = "io.opencensus:opencensus-impl:0.22.1", + artifact_sha256 = "9e8b209da08d1f5db2b355e781b9b969b2e0dab934cc806e33f1ab3baed4f25a", + licenses = ["notice"], + server_urls = maven_servers, + ) + + native.bind( + name = "io_bazel_rules_scala/dependency/proto/opencensus_impl", + actual = "@scala_proto_rules_opencensus_impl//jar", + ) + + _scala_maven_import_external( + name = "scala_proto_rules_disruptor", + artifact = "com.lmax:disruptor:3.4.2", + artifact_sha256 = "f412ecbb235c2460b45e63584109723dea8d94b819c78c9bfc38f50cba8546c0", + 
licenses = ["notice"], + server_urls = maven_servers, + ) + + native.bind( + name = "io_bazel_rules_scala/dependency/proto/disruptor", + actual = "@scala_proto_rules_disruptor//jar", + ) + + _scala_maven_import_external( + name = "scala_proto_rules_opencensus_impl_core", + artifact = "io.opencensus:opencensus-impl-core:0.22.1", + artifact_sha256 = "04607d100e34bacdb38f93c571c5b7c642a1a6d873191e25d49899668514db68", + licenses = ["notice"], + server_urls = maven_servers, + ) + + native.bind( + name = "io_bazel_rules_scala/dependency/proto/opencensus_impl_core", + actual = "@scala_proto_rules_opencensus_impl_core//jar", + ) + _scala_maven_import_external( name = "scala_proto_rules_opencensus_contrib_grpc_metrics", - artifact = "io.opencensus:opencensus-contrib-grpc-metrics:0.18.0", - jar_sha256 = "1f90585e777b1e0493dbf22e678303369a8d5b7c750b4eda070a34ca99271607", + artifact = "io.opencensus:opencensus-contrib-grpc-metrics:0.22.1", + artifact_sha256 = "3f6f4d5bd332c516282583a01a7c940702608a49ed6e62eb87ef3b1d320d144b", licenses = ["notice"], server_urls = maven_servers, ) diff --git a/scala_proto/private/scalapb_aspect.bzl b/scala_proto/private/scalapb_aspect.bzl index 3fb6f40c5..32b8f6469 100644 --- a/scala_proto/private/scalapb_aspect.bzl +++ b/scala_proto/private/scalapb_aspect.bzl @@ -61,7 +61,7 @@ def _compile_scala( label.name + "_scalac.statsfile", sibling = scalapb_jar, ) - merged_deps = java_common.merge(deps_java_info + implicit_deps) + merged_deps = java_common.merge(_concat_lists(deps_java_info, implicit_deps)) # this only compiles scala, not the ijar, but we don't # want the ijar for generated code anyway: any change @@ -98,9 +98,6 @@ def _compile_scala( compile_jar = output, ) -def _empty_java_info(deps_java_info, implicit_deps): - return java_common.merge(deps_java_info + implicit_deps) - #### # This is applied to the DAG of proto_librarys reachable from a deps # or a scalapb_scala_library. 
Each proto_library will be one scalapb @@ -136,11 +133,11 @@ def _scalapb_aspect_impl(target, ctx): toolchain = ctx.toolchains["@io_bazel_rules_scala//scala_proto:toolchain_type"] flags = [] - imps = [j[JavaInfo] for j in toolchain.implicit_compile_deps] + imps = [j[JavaInfo] for j in ctx.attr._implicit_compile_deps] if toolchain.with_grpc: flags.append("grpc") - imps.extend([j[JavaInfo] for j in toolchain.grpc_deps]) + imps.extend([j[JavaInfo] for j in ctx.attr._grpc_deps]) if toolchain.with_flat_package: flags.append("flat_package") @@ -201,7 +198,7 @@ def _scalapb_aspect_impl(target, ctx): # this target is only an aggregation target src_jars = depset() outs = depset() - java_info = _empty_java_info(deps, imps) + java_info = java_common.merge(_concat_lists(deps, imps)) return [ ScalaPBAspectInfo( @@ -212,6 +209,12 @@ def _scalapb_aspect_impl(target, ctx): ), ] +def _concat_lists(list1, list2): + all_providers = [] + all_providers.extend(list1) + all_providers.extend(list2) + return all_providers + scalapb_aspect = aspect( implementation = _scalapb_aspect_impl, attr_aspects = ["deps"], @@ -221,6 +224,12 @@ scalapb_aspect = aspect( ], attrs = { "_protoc": attr.label(executable = True, cfg = "host", default = "@com_google_protobuf//:protoc"), + "_implicit_compile_deps": attr.label_list(cfg = "target", default = [ + "//external:io_bazel_rules_scala/dependency/proto/implicit_compile_deps", + ]), + "_grpc_deps": attr.label_list(cfg = "target", default = [ + "//external:io_bazel_rules_scala/dependency/proto/grpc_deps", + ]), }, toolchains = [ "@io_bazel_rules_scala//scala:toolchain_type", diff --git a/scala_proto/scala_proto.bzl b/scala_proto/scala_proto.bzl index 70374123b..4bb8211cf 100644 --- a/scala_proto/scala_proto.bzl +++ b/scala_proto/scala_proto.bzl @@ -14,12 +14,27 @@ load( "scalapb_aspect", ) +def register_default_proto_dependencies(): + if native.existing_rule("io_bazel_rules_scala/dependency/proto/grpc_deps") == None: + native.bind( + name = "io_bazel_rules_scala/dependency/proto/grpc_deps", + actual = "@io_bazel_rules_scala//scala_proto:default_scalapb_grpc_dependencies", + ) + + if native.existing_rule("io_bazel_rules_scala/dependency/proto/implicit_compile_deps") == None: + native.bind( + name = "io_bazel_rules_scala/dependency/proto/implicit_compile_deps", + actual = "@io_bazel_rules_scala//scala_proto:default_scalapb_compile_dependencies", + ) + def scala_proto_repositories( scala_version = _default_scala_version(), - maven_servers = ["https://repo1.maven.org/maven2"]): - return scala_proto_default_repositories(scala_version, maven_servers) + maven_servers = ["https://repo.maven.apache.org/maven2"]): + ret = scala_proto_default_repositories(scala_version, maven_servers) + register_default_proto_dependencies() + return ret -def _scalapb_proto_library_impl(ctx): +def _scala_proto_library_impl(ctx): aspect_info = merge_scalapb_aspect_info( [dep[ScalaPBAspectInfo] for dep in ctx.attr.deps], ) @@ -31,10 +46,13 @@ def _scalapb_proto_library_impl(ctx): DefaultInfo(files = aspect_info.output_files), ] -scalapb_proto_library = rule( - implementation = _scalapb_proto_library_impl, +scala_proto_library = rule( + implementation = _scala_proto_library_impl, attrs = { "deps": attr.label_list(aspects = [scalapb_aspect]), }, provides = [DefaultInfo, ScalaPBInfo, JavaInfo], ) + +def scalapb_proto_library(**kwargs): + scala_proto_library(**kwargs) diff --git a/scala_proto/scala_proto_toolchain.bzl b/scala_proto/scala_proto_toolchain.bzl index 0082f2cd3..a733a2fbf 100644 --- 
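The aspect now resolves its implicit compile and gRPC deps through private label-list attributes whose defaults point at `//external` names, and `register_default_proto_dependencies` binds those names to the new `java_library` aggregation targets. A compressed sketch of that wiring (names follow the hunks; the aspect body is reduced to the provider lookup):

```starlark
def _sketch_aspect_impl(target, ctx):
    # JavaInfo of whatever the //external name is currently bound to; in the
    # real aspect these feed java_common.merge alongside the proto deps.
    imps = [dep[JavaInfo] for dep in ctx.attr._implicit_compile_deps]
    return []

sketch_scalapb_aspect = aspect(
    implementation = _sketch_aspect_impl,
    attr_aspects = ["deps"],
    attrs = {
        "_implicit_compile_deps": attr.label_list(
            cfg = "target",
            default = ["//external:io_bazel_rules_scala/dependency/proto/implicit_compile_deps"],
        ),
    },
)
```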
a/scala_proto/scala_proto_toolchain.bzl +++ b/scala_proto/scala_proto_toolchain.bzl @@ -7,15 +7,12 @@ def _scala_proto_toolchain_impl(ctx): with_single_line_to_string = ctx.attr.with_single_line_to_string, blacklisted_protos = ctx.attr.blacklisted_protos, code_generator = ctx.attr.code_generator, - grpc_deps=ctx.attr.grpc_deps, - implicit_compile_deps=ctx.attr.implicit_compile_deps, extra_generator_dependencies = ctx.attr.extra_generator_dependencies, - scalac=ctx.attr.scalac, + scalac = ctx.attr.scalac, named_generators = ctx.attr.named_generators, ) return [toolchain] - # Args: # with_grpc: Enables generation of grpc service bindings for services # with_flat_package: When true, ScalaPB will not append the protofile base name to the package name @@ -28,24 +25,16 @@ scala_proto_toolchain = rule( "with_grpc": attr.bool(), "with_flat_package": attr.bool(), "with_single_line_to_string": attr.bool(), - "blacklisted_protos": attr.label_list(default=[]), + "blacklisted_protos": attr.label_list(default = []), "code_generator": attr.label( executable = True, cfg = "host", default = Label("@io_bazel_rules_scala//src/scala/scripts:scalapb_generator"), - allow_files=True + allow_files = True, ), "named_generators": attr.string_dict(), "extra_generator_dependencies": attr.label_list( - providers = [JavaInfo] - ), - "grpc_deps": attr.label_list( - providers = [JavaInfo], - default = DEFAULT_SCALAPB_GRPC_DEPS - ), - "implicit_compile_deps": attr.label_list( providers = [JavaInfo], - default = DEFAULT_SCALAPB_COMPILE_DEPS, ), "scalac": attr.label( default = Label( @@ -54,6 +43,3 @@ scala_proto_toolchain = rule( ), }, ) - - - diff --git a/scala_proto/toolchains.bzl b/scala_proto/toolchains.bzl index dbc1926cc..b5f84c101 100644 --- a/scala_proto/toolchains.bzl +++ b/scala_proto/toolchains.bzl @@ -1,7 +1,5 @@ def scala_proto_register_toolchains(): native.register_toolchains("@io_bazel_rules_scala//scala_proto:default_toolchain") - def scala_proto_register_enable_all_options_toolchain(): native.register_toolchains("@io_bazel_rules_scala//scala_proto:enable_all_options_toolchain") - diff --git a/specs2/specs2.bzl b/specs2/specs2.bzl index 91fa38e76..a9385daf7 100644 --- a/specs2/specs2.bzl +++ b/specs2/specs2.bzl @@ -14,7 +14,7 @@ def specs2_version(): def specs2_repositories( scala_version = _default_scala_version(), - maven_servers = ["https://repo1.maven.org/maven2"]): + maven_servers = ["https://repo.maven.apache.org/maven2"]): major_version = _extract_major_version(scala_version) scala_jar_shas = { @@ -40,7 +40,7 @@ def specs2_repositories( "org.specs2:specs2-common:" + specs2_version(), major_version, ), - jar_sha256 = scala_version_jar_shas["specs2_common"], + artifact_sha256 = scala_version_jar_shas["specs2_common"], licenses = ["notice"], server_urls = maven_servers, ) @@ -51,7 +51,7 @@ def specs2_repositories( "org.specs2:specs2-core:" + specs2_version(), major_version, ), - jar_sha256 = scala_version_jar_shas["specs2_core"], + artifact_sha256 = scala_version_jar_shas["specs2_core"], licenses = ["notice"], server_urls = maven_servers, ) @@ -62,7 +62,7 @@ def specs2_repositories( "org.specs2:specs2-fp:" + specs2_version(), major_version, ), - jar_sha256 = scala_version_jar_shas["specs2_fp"], + artifact_sha256 = scala_version_jar_shas["specs2_fp"], licenses = ["notice"], server_urls = maven_servers, ) @@ -73,7 +73,7 @@ def specs2_repositories( "org.specs2:specs2-matcher:" + specs2_version(), major_version, ), - jar_sha256 = scala_version_jar_shas["specs2_matcher"], + artifact_sha256 = 
scala_version_jar_shas["specs2_matcher"], licenses = ["notice"], server_urls = maven_servers, ) diff --git a/specs2/specs2_junit.bzl b/specs2/specs2_junit.bzl index 774480b01..aa04e25ac 100644 --- a/specs2/specs2_junit.bzl +++ b/specs2/specs2_junit.bzl @@ -18,7 +18,7 @@ load( def specs2_junit_repositories( scala_version = _default_scala_version(), - maven_servers = ["https://repo1.maven.org/maven2"]): + maven_servers = ["https://repo.maven.apache.org/maven2"]): major_version = _extract_major_version(scala_version) specs2_repositories(scala_version, maven_servers) @@ -40,7 +40,7 @@ def specs2_junit_repositories( "org.specs2:specs2-junit:" + specs2_version(), major_version, ), - jar_sha256 = scala_jar_shas[major_version]["specs2_junit"], + artifact_sha256 = scala_jar_shas[major_version]["specs2_junit"], licenses = ["notice"], server_urls = maven_servers, ) diff --git a/src/java/io/bazel/rulesscala/coverage/instrumenter/BUILD b/src/java/io/bazel/rulesscala/coverage/instrumenter/BUILD index a24d9726a..1b0ed55bd 100644 --- a/src/java/io/bazel/rulesscala/coverage/instrumenter/BUILD +++ b/src/java/io/bazel/rulesscala/coverage/instrumenter/BUILD @@ -9,14 +9,11 @@ java_binary( ], main_class = "io.bazel.rulesscala.coverage.instrumenter.JacocoInstrumenter", visibility = ["//visibility:public"], - runtime_deps = [ - "@io_bazel_rules_scala_org_ow2_asm_asm_debug_all", - ], deps = [ + "@bazel_tools//tools/jdk:JacocoCoverage", "@io_bazel_rules_scala//src/java/com/google/devtools/build/lib:worker", "@io_bazel_rules_scala//src/java/io/bazel/rulesscala/jar", "@io_bazel_rules_scala//src/java/io/bazel/rulesscala/worker", - "@io_bazel_rules_scala_org_jacoco_org_jacoco_core", ], ) diff --git a/src/java/io/bazel/rulesscala/exe/BUILD b/src/java/io/bazel/rulesscala/exe/BUILD index da9e98fe7..0a36ca645 100644 --- a/src/java/io/bazel/rulesscala/exe/BUILD +++ b/src/java/io/bazel/rulesscala/exe/BUILD @@ -1,24 +1,24 @@ java_library( name = "exe-lib", srcs = [ - "LauncherFileWriter.java", "LaunchInfo.java", + "LauncherFileWriter.java", ], + visibility = ["//visibility:private"], deps = [ - "@bazel_tools//tools/java/runfiles", "//external:io_bazel_rules_scala/dependency/scala/guava", + "@bazel_tools//tools/java/runfiles", ], - visibility = ["//visibility:private"], ) java_binary( name = "exe", - main_class = "io.bazel.rulesscala.exe.LauncherFileWriter", - runtime_deps = [ - ":exe-lib", - ], data = [ "@bazel_tools//tools/launcher", ], + main_class = "io.bazel.rulesscala.exe.LauncherFileWriter", visibility = ["//visibility:public"], + runtime_deps = [ + ":exe-lib", + ], ) diff --git a/src/java/io/bazel/rulesscala/scalac/jvm_export_toolchain.bzl b/src/java/io/bazel/rulesscala/scalac/jvm_export_toolchain.bzl index e0626715d..6636a3add 100644 --- a/src/java/io/bazel/rulesscala/scalac/jvm_export_toolchain.bzl +++ b/src/java/io/bazel/rulesscala/scalac/jvm_export_toolchain.bzl @@ -14,12 +14,8 @@ def _export_scalac_repositories_from_toolchain_to_jvm_impl(ctx): default_repl_classpath_files = _files_of( default_repl_classpath_deps, ).to_list() - java_common_provider = java_common.create_provider( - use_ijar = False, - compile_time_jars = default_repl_classpath_files, - runtime_jars = default_repl_classpath_files, - ) - return [java_common_provider] + providers = [JavaInfo(output_jar = jar, compile_jar = jar) for jar in default_repl_classpath_files] + return [java_common.merge(providers)] export_scalac_repositories_from_toolchain_to_jvm = rule( _export_scalac_repositories_from_toolchain_to_jvm_impl, diff --git 
a/src/java/io/bazel/rulesscala/specs2/Specs2RunnerBuilder.scala b/src/java/io/bazel/rulesscala/specs2/Specs2RunnerBuilder.scala index 9f4f8e03c..c26c63a13 100644 --- a/src/java/io/bazel/rulesscala/specs2/Specs2RunnerBuilder.scala +++ b/src/java/io/bazel/rulesscala/specs2/Specs2RunnerBuilder.scala @@ -3,21 +3,24 @@ package io.bazel.rulesscala.specs2 import java.util import java.util.regex.Pattern -import io.bazel.rulesscala.test_discovery._ import io.bazel.rulesscala.test_discovery.FilteredRunnerBuilder.FilteringRunnerBuilder +import io.bazel.rulesscala.test_discovery._ import org.junit.runner.notification.RunNotifier import org.junit.runner.{Description, RunWith, Runner} import org.junit.runners.Suite import org.junit.runners.model.RunnerBuilder import org.specs2.concurrent.ExecutionEnv import org.specs2.control.Action +import org.specs2.data.Trees._ +import org.specs2.fp.TreeLoc import org.specs2.main.{Arguments, CommandLine, Select} -import org.specs2.specification.core.Env +import org.specs2.specification.core.{Env, Fragment, SpecStructure} import org.specs2.specification.process.Stats -import scala.language.reflectiveCalls import scala.collection.JavaConverters._ +import scala.language.reflectiveCalls import scala.util.Try +import scala.util.control.NonFatal @RunWith(classOf[Specs2PrefixSuffixTestDiscoveringSuite]) class Specs2DiscoveredTestSuite @@ -53,6 +56,23 @@ object Specs2FilteringRunnerBuilder { class FilteredSpecs2ClassRunner(testClass: Class[_], testFilter: Pattern) extends org.specs2.runner.JUnitRunner(testClass) { + override def getDescription(env: Env): Description = { + try createFilteredDescription(specStructure, env.specs2ExecutionEnv) + catch { case NonFatal(t) => env.shutdown; throw t; } + } + + private def createFilteredDescription(specStructure: SpecStructure, ee: ExecutionEnv): Description = { + val descTree = createDescriptionTree(ee).map(_._2) + descTree.toTree.bottomUp { + (description: Description, children: Stream[Description]) => + children.filter(matchingFilter).foreach { + child => description.addChild(child) + } + description + }.rootLabel + + } + def matchesFilter: Boolean = { val fqn = testClass.getName + "#" val matcher = testFilter.matcher(fqn) @@ -66,6 +86,13 @@ class FilteredSpecs2ClassRunner(testClass: Class[_], testFilter: Pattern) else sanitized } + private def createDescriptionTree(implicit ee: ExecutionEnv): TreeLoc[(Fragment, Description)] = + Try(allDescriptions[specs2_v4].createDescriptionTree(specStructure)(ee)) + .getOrElse(allDescriptions[specs2_v3].createDescriptionTree(specStructure)) + + private def allFragmentDescriptions(implicit ee: ExecutionEnv): Map[Fragment, Description] = + createDescriptionTree(ee).toTree.flattenLeft.toMap + /** * Retrieves an original (un-sanitized) text of an example fragment, * used later as a regex string for specs2 matching. @@ -73,13 +100,13 @@ class FilteredSpecs2ClassRunner(testClass: Class[_], testFilter: Pattern) * This is done by matching the actual (sanitized) string with the sanitized version * of the original example text. 
*/ - private def specs2Description(desc: String)(implicit ee: ExecutionEnv): String = - Try { allDescriptions[specs2_v4].fragmentDescriptions(specStructure)(ee) } - .getOrElse(allDescriptions[specs2_v3].fragmentDescriptions(specStructure)) + private def specs2Description(desc: String)(implicit ee: ExecutionEnv): String = { + allFragmentDescriptions .keys .map(fragment => fragment.description.show) .find(sanitize(_) == desc) .getOrElse(desc) + } private def toDisplayName(description: Description)(implicit ee: ExecutionEnv): Option[String] = for { name <- Option(description.getMethodName) @@ -106,22 +133,24 @@ class FilteredSpecs2ClassRunner(testClass: Class[_], testFilter: Pattern) * * This function returns a flat list of the descriptions and their children, starting with the root. */ - private def flattenDescription(description: Description): List[Description] = - description.getChildren.asScala.toList.flatMap(d => d :: flattenDescription(d)) + def flattenChildren(root: Description): List[Description] = { + root.getChildren.asScala.toList.flatMap(d => d :: flattenChildren(d)) + } - private def matching(testFilter: Pattern): Description => Boolean = { d => - val testCase = d.getClassName + "#" + d.getMethodName - testFilter.matcher(testCase).matches + private def matchingFilter(desc: Description): Boolean = { + if (desc.isSuite) true + else { + val testCase = desc.getClassName + "#" + Option(desc.getMethodName).mkString + testFilter.toString.r.findFirstIn(testCase).nonEmpty + } } - private def specs2ExamplesMatching(testFilter: Pattern, junitDescription: Description)(implicit ee: ExecutionEnv): List[String] = - flattenDescription(junitDescription) - .filter(matching(testFilter)) - .flatMap(toDisplayName(_)) + private def specs2Examples(implicit ee: ExecutionEnv): List[String] = + flattenChildren(getDescription).flatMap(toDisplayName(_)) override def runWithEnv(n: RunNotifier, env: Env): Action[Stats] = { implicit val ee = env.executionEnv - val specs2MatchedExamplesRegex = specs2ExamplesMatching(testFilter, getDescription).toRegexAlternation + val specs2MatchedExamplesRegex = specs2Examples.toRegexAlternation val newArgs = Arguments(select = Select(_ex = specs2MatchedExamplesRegex), commandLine = CommandLine.create(testClass.getName)) val newEnv = env.copy(arguments overrideWith newArgs) diff --git a/src/java/io/bazel/rulesscala/specs2/package.scala b/src/java/io/bazel/rulesscala/specs2/package.scala index 341cb1649..030842a3b 100644 --- a/src/java/io/bazel/rulesscala/specs2/package.scala +++ b/src/java/io/bazel/rulesscala/specs2/package.scala @@ -2,16 +2,19 @@ package io.bazel.rulesscala import org.junit.runner.Description import org.specs2.concurrent.ExecutionEnv +import org.specs2.fp.TreeLoc import org.specs2.reporter.JUnitDescriptions import org.specs2.specification.core.{Fragment, SpecStructure} package object specs2 { type specs2_v4 = { - def fragmentDescriptions(spec: SpecStructure)(ee: ExecutionEnv): Map[Fragment, Description] + //noinspection ScalaUnusedSymbol + def createDescriptionTree(spec: SpecStructure)(ee: ExecutionEnv): TreeLoc[(Fragment, Description)] } type specs2_v3 = { - def fragmentDescriptions(spec: SpecStructure): Map[Fragment, Description] + //noinspection ScalaUnusedSymbol + def createDescriptionTree(spec: SpecStructure): TreeLoc[(Fragment, Description)] } - def allDescriptions[T] = JUnitDescriptions.asInstanceOf[T] -} + def allDescriptions[T]: T = JUnitDescriptions.asInstanceOf[T] +} \ No newline at end of file diff --git 
a/src/java/io/bazel/rulesscala/test_discovery/DiscoveredTestSuite.scala b/src/java/io/bazel/rulesscala/test_discovery/DiscoveredTestSuite.scala index b11ff4ec7..579f11e06 100644 --- a/src/java/io/bazel/rulesscala/test_discovery/DiscoveredTestSuite.scala +++ b/src/java/io/bazel/rulesscala/test_discovery/DiscoveredTestSuite.scala @@ -49,7 +49,7 @@ object PrefixSuffixTestDiscoveringSuite { val classes = archives.flatMap(discoverClassesIn) if (classes.isEmpty) throw new IllegalStateException("Was not able to discover any classes " + - s"for archives=${archives.mkString(",")}" + + s"for archives=${archives.mkString(",")}, " + s"prefixes=$prefixes, " + s"suffixes=$suffixes") classes diff --git a/src/scala/io/bazel/rules_scala/scaladoc_support/BUILD b/src/scala/io/bazel/rules_scala/scaladoc_support/BUILD new file mode 100644 index 000000000..b48ad5fb4 --- /dev/null +++ b/src/scala/io/bazel/rules_scala/scaladoc_support/BUILD @@ -0,0 +1,17 @@ +load("//scala:scala.bzl", "scala_binary") + +# A simple scala_binary to run scaladoc. +# `bazel run` this target with "-help" as a param for usage text: +# bazel run -- "//src/scala/io/bazel/rules_scala/scaladoc_support:scaladoc_generator" -help +scala_binary( + name = "scaladoc_generator", + main_class = "scala.tools.nsc.ScalaDoc", + visibility = ["//visibility:public"], + runtime_deps = [ + "//external:io_bazel_rules_scala/dependency/scala/parser_combinators", + "//external:io_bazel_rules_scala/dependency/scala/scala_compiler", + "//external:io_bazel_rules_scala/dependency/scala/scala_library", + "//external:io_bazel_rules_scala/dependency/scala/scala_reflect", + "//external:io_bazel_rules_scala/dependency/scala/scala_xml", + ], +) diff --git a/src/scala/scripts/PBGenerateRequest.scala b/src/scala/scripts/PBGenerateRequest.scala index a35d03c31..5eea6e146 100644 --- a/src/scala/scripts/PBGenerateRequest.scala +++ b/src/scala/scripts/PBGenerateRequest.scala @@ -6,6 +6,30 @@ case class PBGenerateRequest(jarOutput: String, scalaPBOutput: Path, scalaPBArgs object PBGenerateRequest { + // This little function fixes a problem, where external/com_google_protobuf is not found. The com_google_protobuf + // is special in a way that it also brings-in protoc and also google well-known proto files. This, possibly, + // confuses Bazel and external/com_google_protobuf is not made available for target builds. Actual causes are unknown + // and this fixTransitiveProtoPath fixes this problem in the following way: + // (1) We have a list of all required .proto files; this is a tuple list (root -> full path), for example: + // bazel-out/k8-fastbuild/bin -> bazel-out/k8-fastbuild/bin/external/com_google_protobuf/google/protobuf/source_context.proto + // (2) Convert the full path to relative from the root: + // bazel-out/k8-fastbuild/bin -> external/com_google_protobuf/google/protobuf/source_context.proto + // (3) From all the included protos we find the first one that is located within dir we are processing -- relative + // path starts with the dir we are processing + // (4) If found -- the include folder is "orphan" and is not anchored in either host or target. To fix we prepend + // root. If not found, return original. This works as long as "external/com_google_protobuf" is available in + // target root. 
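A minimal usage sketch (not part of the patch) of the helper defined just below; the paths echo the comment's example and are purely illustrative:

  import java.nio.file.Paths

  // One included proto whose root is the bazel-out target directory.
  val fix: String => String = PBGenerateRequest.fixTransitiveProtoPath(List(
    (Paths.get("bazel-out/k8-fastbuild/bin"),
     Paths.get("bazel-out/k8-fastbuild/bin/external/com_google_protobuf/google/protobuf/source_context.proto"))
  ))
  // The orphan include dir is re-anchored under its root ...
  assert(fix("external/com_google_protobuf") == "bazel-out/k8-fastbuild/bin/external/com_google_protobuf")
  // ... while an include that matches no included proto is passed through unchanged.
  assert(fix("some/other/include") == "some/other/include")
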
+ def fixTransitiveProtoPath(includedProto: List[(Path, Path)]): String => String = { + val includedRelProto = includedProto.map { case (root, full) => (root.toString, root.relativize(full).toString) } + + { orig => + includedRelProto.find { case (_, rel) => rel.startsWith(orig) } match { + case Some((root, _)) => s"$root/$orig" + case None => orig + } + } + } + def from(args: java.util.List[String]): PBGenerateRequest = { val jarOutput = args.get(0) val protoFiles = args.get(4).split(':') @@ -23,17 +47,18 @@ object PBGenerateRequest { case s if s.charAt(0) == '-' => Some(s.tail) //drop padding character case other => sys.error(s"expected a padding character of - (dash), but found: $other") } - val transitiveProtoPaths = (args.get(3) match { + + val transitiveProtoPaths: List[String] = (args.get(3) match { case "-" => Nil case s if s.charAt(0) == '-' => s.tail.split(':').toList //drop padding character case other => sys.error(s"expected a padding character of - (dash), but found: $other") - }) ++ List(".") + }).map(fixTransitiveProtoPath(includedProto)) ++ List(".") val tmp = Paths.get(Option(System.getProperty("java.io.tmpdir")).getOrElse("/tmp")) val scalaPBOutput = Files.createTempDirectory(tmp, "bazelscalapb") val flagPrefix = flagOpt.fold("")(_ + ":") - val namedGenerators = args.get(6).drop(1).split(',').filter(_.nonEmpty).map { e => + val namedGenerators = args.get(6).drop(1).split(',').filter(_.nonEmpty).map { e => val kv = e.split('=') (kv(0), kv(1)) } diff --git a/src/scala/scripts/ScalaPBGenerator.scala b/src/scala/scripts/ScalaPBGenerator.scala index 5d4140f3e..9ed457d09 100644 --- a/src/scala/scripts/ScalaPBGenerator.scala +++ b/src/scala/scripts/ScalaPBGenerator.scala @@ -1,7 +1,7 @@ package scripts import java.io.PrintStream -import java.nio.file.Path +import java.nio.file.{Path, FileAlreadyExistsException} import io.bazel.rulesscala.io_utils.DeleteRecursively import io.bazel.rulesscala.jar.JarCreator @@ -12,6 +12,7 @@ import scalapb.ScalaPbCodeGenerator import java.nio.file.{Files, Paths} import scalapb.{ScalaPBC, ScalaPbCodeGenerator, ScalaPbcException} import java.net.URLClassLoader +import scala.util.{Try, Failure} object ScalaPBWorker extends GenericWorker(new ScalaPBGenerator) { @@ -33,15 +34,6 @@ object ScalaPBWorker extends GenericWorker(new ScalaPBGenerator) { } class ScalaPBGenerator extends Processor { - def setupIncludedProto(includedProto: List[(Path, Path)]): Unit = { - includedProto.foreach { case (root, fullPath) => - require(fullPath.toFile.exists, s"Path $fullPath does not exist, which it should as a dependency of this rule") - val relativePath = root.relativize(fullPath) - - relativePath.toFile.getParentFile.mkdirs - Files.copy(fullPath, relativePath) - } - } def deleteDir(path: Path): Unit = try DeleteRecursively.run(path) catch { @@ -50,8 +42,6 @@ class ScalaPBGenerator extends Processor { def processRequest(args: java.util.List[String]) { val extractRequestResult = PBGenerateRequest.from(args) - setupIncludedProto(extractRequestResult.includedProto) - val extraClassesClassLoader = new URLClassLoader(extractRequestResult.extraJars.map { e => val f = e.toFile require(f.exists, s"Expected file for classpath loading $f to exist") diff --git a/test/BUILD b/test/BUILD index 319d20edc..daeb207b1 100644 --- a/test/BUILD +++ b/test/BUILD @@ -3,18 +3,19 @@ package(default_testonly = 1) load( "//scala:scala.bzl", "scala_binary", + "scala_doc", + "scala_junit_test", "scala_library", - "scala_test", + "scala_library_suite", "scala_macro_library", "scala_repl", - 
"scala_test_suite", - "scala_library_suite", - "scala_junit_test", "scala_specs2_junit_test", + "scala_test", + "scala_test_suite", ) load( "//scala_proto:scala_proto.bzl", - "scalapb_proto_library", + "scala_proto_library", ) # The examples below show how to combine Scala and Java rules. @@ -137,9 +138,18 @@ scala_library( deps = ["ExportOnly"], ) +scala_doc( + name = "ScalaDoc", + deps = [ + ":HelloLib", + ":OtherLib", + "//test/src/main/scala/scalarules/test/compiler_plugin", # brings kind-projector compiler plugin with it + ], +) + scala_library( name = "UnusedLib", - srcs = ["UnusedLib.scala"] + srcs = ["UnusedLib.scala"], ) scala_library( @@ -155,7 +165,7 @@ scala_library( unused_dependency_checker_mode = "warn", deps = [ "ExportOnly", - "UnusedLib" + "UnusedLib", ], ) @@ -277,24 +287,20 @@ scala_library( srcs = glob(["src/main/scala/scalarules/test/mix_java_scala/*.scala"]) + glob([ "src/main/scala/scalarules/test/mix_java_scala/*.java", ]), - jvm_flags = [ - "-Xms1G", - "-Xmx4G", - ], ) genrule( name = "MixJavaScalaLibTestOutputs", outs = ["mix_java_scala_lib_test_rule_outputs.txt"], cmd = "echo $(locations MixJavaScalaLib) > $@", - tools = [":MixJavaScalaLib"] + tools = [":MixJavaScalaLib"], ) sh_test( name = "MixJavaScalaLibTest", srcs = ["test_scala_library_outputs_mixed_java_scala_jars.sh"], args = ["$(location MixJavaScalaLibTestOutputs) MixJavaScalaLib"], - data = ["MixJavaScalaLibTestOutputs"] + data = ["MixJavaScalaLibTestOutputs"], ) #needed to test java sources are compiled @@ -313,10 +319,6 @@ scala_library( # srcjar created with `jar -cfM Baz.srcjar Baz.java` "src/main/scala/scalarules/test/mix_java_scala/*.srcjar", ]), - jvm_flags = [ - "-Xms1G", - "-Xmx4G", - ], ) #needed to test java sources are compiled @@ -407,9 +409,9 @@ scala_specs2_junit_test( name = "Specs2Tests_unused_dependency_checker", size = "small", srcs = ["src/main/scala/scalarules/test/junit/specs2/Specs2Tests.scala"], - unused_dependency_checker_mode = "error", - unused_dependency_checker_ignored_targets = [":JUnitRuntimeDep"], suffixes = ["Test"], + unused_dependency_checker_ignored_targets = [":JUnitRuntimeDep"], + unused_dependency_checker_mode = "error", deps = [ ":JUnitCompileTimeDep", ":JUnitRuntimeDep", @@ -608,7 +610,7 @@ scala_binary( deps = [":lib_with_scala_proto_dep"], ) -scalapb_proto_library( +scala_proto_library( name = "test_proto_scala_dep", visibility = ["//visibility:public"], deps = [ @@ -617,12 +619,12 @@ scalapb_proto_library( ], ) -scalapb_proto_library( +scala_proto_library( name = "test_proto_scala_import_dep", visibility = ["//visibility:public"], deps = [ - "@com_github_jnr_jffi_native", "//test/proto:test2", + "@com_github_jnr_jffi_native", ], ) @@ -648,9 +650,8 @@ scala_junit_test( name = "JunitSeparateTarget_test_runner", size = "small", suffixes = ["Test"], - runtime_deps = [":JunitSeparateTarget"], tests_from = [":JunitSeparateTarget"], - + runtime_deps = [":JunitSeparateTarget"], ) java_library( @@ -663,9 +664,8 @@ scala_junit_test( name = "JunitJavaSeparateTarget_test_runner", size = "small", suffixes = ["Test"], - runtime_deps = [":JunitJavaSeparateTarget"], tests_from = [":JunitJavaSeparateTarget"], - + runtime_deps = [":JunitJavaSeparateTarget"], ) scala_library( @@ -677,40 +677,39 @@ scala_library( scala_library( name = "JunitSeparateTargetWithDependencyOnTest", srcs = ["src/main/scala/scalarules/test/junit/separate_target/JunitSeparateTargetTest.scala"], - deps = ["//external:io_bazel_rules_scala/dependency/junit/junit"], runtime_deps = 
[":TargetWithTestThatShouldNotRun"], + deps = ["//external:io_bazel_rules_scala/dependency/junit/junit"], ) scala_junit_test( name = "JunitSeparateTargetTransitiveDependencyShouldNotRun_test_runner", size = "small", suffixes = ["Test"], - runtime_deps = [":JunitSeparateTargetWithDependencyOnTest"], tests_from = [":JunitSeparateTargetWithDependencyOnTest"], - + runtime_deps = [":JunitSeparateTargetWithDependencyOnTest"], ) java_library( name = "JunitSeparateJavaTargetWithDependencyOnTest", srcs = ["src/main/scala/scalarules/test/junit/separate_target/JunitJavaSeparateTargetTest.java"], - deps = ["//external:io_bazel_rules_scala/dependency/junit/junit"], runtime_deps = [":TargetWithTestThatShouldNotRun"], + deps = ["//external:io_bazel_rules_scala/dependency/junit/junit"], ) scala_junit_test( name = "JunitSeparateJavaTargetTransitiveDependencyShouldNotRun_test_runner", size = "small", suffixes = ["Test"], - runtime_deps = [":JunitSeparateJavaTargetWithDependencyOnTest"], tests_from = [":JunitSeparateJavaTargetWithDependencyOnTest"], - + runtime_deps = [":JunitSeparateJavaTargetWithDependencyOnTest"], ) scala_library( name = "JunitMixedSeparateTarget", - srcs = ["src/main/scala/scalarules/test/junit/separate_target/SomeScalaClass.scala", - "src/main/scala/scalarules/test/junit/separate_target/JunitJavaSeparateTargetTest.java" - ], + srcs = [ + "src/main/scala/scalarules/test/junit/separate_target/JunitJavaSeparateTargetTest.java", + "src/main/scala/scalarules/test/junit/separate_target/SomeScalaClass.scala", + ], deps = ["//external:io_bazel_rules_scala/dependency/junit/junit"], ) @@ -718,7 +717,6 @@ scala_junit_test( name = "JunitMixedSeparateTarget_test_runner", size = "small", suffixes = ["Test"], - runtime_deps = [":JunitMixedSeparateTarget"], tests_from = [":JunitMixedSeparateTarget"], - + runtime_deps = [":JunitMixedSeparateTarget"], ) diff --git a/test/aspect/BUILD b/test/aspect/BUILD index f2b2f9cdb..fd4122f6d 100644 --- a/test/aspect/BUILD +++ b/test/aspect/BUILD @@ -1,10 +1,10 @@ load(":aspect.bzl", "aspect_test") load( "//scala:scala.bzl", - "scala_library", - "scala_test", "scala_junit_test", + "scala_library", "scala_specs2_junit_test", + "scala_test", ) aspect_test( diff --git a/test/coverage/A2.scala b/test/coverage/A2.scala index 20e8b7a1e..0e58f455d 100644 --- a/test/coverage/A2.scala +++ b/test/coverage/A2.scala @@ -1,5 +1,7 @@ object A2 { def a2(): Unit = { - println("a2: " + B2.b2_a()) + println("a2: " + + "" // B2.b2_a() + ) } } diff --git a/test/coverage/BUILD b/test/coverage/BUILD index 9c95ed14e..1d3cc583b 100644 --- a/test/coverage/BUILD +++ b/test/coverage/BUILD @@ -6,6 +6,7 @@ scala_toolchain( enable_code_coverage_aspect = "on", visibility = ["//visibility:public"], ) + toolchain( name = "enable_code_coverage_aspect", toolchain = "enable_code_coverage_aspect_impl", @@ -17,11 +18,11 @@ scala_test( name = "test-all", srcs = [ "TestAll.scala", - ], + ], deps = [ ":a1", ":a2", - ":b1", + ":b1", ], ) @@ -30,11 +31,11 @@ java_test( srcs = [ "TestB2.java", ], + tags = ["manual"], test_class = "TestB2", deps = [ ":b2", ], - tags = ["manual"], ) scala_library( @@ -58,9 +59,10 @@ scala_library( name = "a2", srcs = [ "A2.scala", - ], + ], deps = [ - ":b2", + # TODO :: Understand why referencing a local java library breaks coverage + # ":b2", ], ) @@ -80,12 +82,12 @@ java_library( ], deps = [ ":c2", - ], + ], ) scala_library( name = "c2", srcs = [ "C2.scala", - ], + ], ) diff --git a/test/coverage/expected-coverage.dat b/test/coverage/expected-coverage.dat index 
78827572f..d142c915f 100755 --- a/test/coverage/expected-coverage.dat +++ b/test/coverage/expected-coverage.dat @@ -20,7 +20,7 @@ LF:3 end_of_record SF:/A2.scala FN:-1,A2$:: ()V -FN:5,A2$:: ()V +FN:7,A2$:: ()V FN:3,A2$::a2 ()V FN:-1,A2::a2 ()V FNDA:1,A2$:: ()V @@ -29,8 +29,8 @@ FNDA:1,A2$::a2 ()V FNDA:0,A2::a2 ()V FNF:4 FNH:3 -DA:3,11 -DA:5,5 +DA:3,4 +DA:7,5 LH:2 LF:2 end_of_record @@ -50,22 +50,6 @@ DA:7,5 LH:1 LF:2 end_of_record -SF:/C2.scala -FN:-1,C2$:: ()V -FN:5,C2$:: ()V -FN:3,C2$::c2 (Ljava/lang/String;)Ljava/lang/String; -FN:-1,C2::c2 (Ljava/lang/String;)Ljava/lang/String; -FNDA:1,C2$:: ()V -FNDA:1,C2$:: ()V -FNDA:1,C2$::c2 (Ljava/lang/String;)Ljava/lang/String; -FNDA:1,C2::c2 (Ljava/lang/String;)Ljava/lang/String; -FNF:4 -FNH:4 -DA:3,9 -DA:5,5 -LH:2 -LF:2 -end_of_record SF:/TestAll.scala FN:10,TestAll$$anonfun$1:: (LTestAll;)V FN:10,TestAll$$anonfun$1::apply ()V diff --git a/test/jmh/BUILD b/test/jmh/BUILD index b47391960..17edd80d1 100644 --- a/test/jmh/BUILD +++ b/test/jmh/BUILD @@ -30,6 +30,6 @@ scala_library( scala_benchmark_jmh( name = "test_benchmark", srcs = ["TestBenchmark.scala"], + data = ["data.txt"], deps = [":add_numbers"], - data = ["data.txt"] ) diff --git a/test/phase/add_to_all_rules/BUILD b/test/phase/add_to_all_rules/BUILD new file mode 100644 index 000000000..fa0b847be --- /dev/null +++ b/test/phase/add_to_all_rules/BUILD @@ -0,0 +1,59 @@ +load( + "//test/phase/add_to_all_rules:phase_add_to_all_rules_test.bzl", + "add_to_all_rules_scala_binary", + "add_to_all_rules_scala_junit_test", + "add_to_all_rules_scala_library", + "add_to_all_rules_scala_library_for_plugin_bootstrapping", + "add_to_all_rules_scala_macro_library", + "add_to_all_rules_scala_repl", + "add_to_all_rules_scala_test", + "add_to_all_rules_singleton", +) + +add_to_all_rules_singleton( + name = "phase_add_to_all_rules", + visibility = ["//visibility:public"], +) + +add_to_all_rules_scala_binary( + name = "PhaseBinary", + srcs = ["PhaseBinary.scala"], + main_class = "scalarules.test.phase.add_to_all_rules.PhaseBinary", +) + +add_to_all_rules_scala_library( + name = "PhaseLibrary", + srcs = ["PhaseLibrary.scala"], + custom_content = "This is custom content in library", +) + +add_to_all_rules_scala_library_for_plugin_bootstrapping( + name = "PhaseLibraryForPluginBootstrapping", + srcs = ["PhaseLibrary.scala"], + custom_content = "This is custom content in library_for_plugin_bootstrapping", +) + +add_to_all_rules_scala_macro_library( + name = "PhaseMacroLibrary", + srcs = ["PhaseLibrary.scala"], + custom_content = "This is custom content in macro_library", +) + +add_to_all_rules_scala_test( + name = "PhaseTest", + srcs = ["PhaseTest.scala"], + custom_content = "This is custom content in test", +) + +add_to_all_rules_scala_junit_test( + name = "PhaseJunitTest", + srcs = ["PhaseJunitTest.scala"], + custom_content = "This is custom content in junit_test", + suffixes = ["Test"], +) + +add_to_all_rules_scala_repl( + name = "PhaseRepl", + srcs = ["PhaseLibrary.scala"], + custom_content = "This is custom content in repl", +) diff --git a/test/phase/add_to_all_rules/PhaseBinary.scala b/test/phase/add_to_all_rules/PhaseBinary.scala new file mode 100644 index 000000000..a1f7bf5da --- /dev/null +++ b/test/phase/add_to_all_rules/PhaseBinary.scala @@ -0,0 +1,7 @@ +package scalarules.test.phase.add_to_all_rules + +object PhaseBinary { + def main(args: Array[String]) { + val message = "You can customize binary phases!" 
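+    // The local value is never used; this binary only exists so the phase tests can
+    // check the generated PhaseBinary.custom-output file (see test/shell/test_phase.sh).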
+ } +} diff --git a/test/phase/add_to_all_rules/PhaseJunitTest.scala b/test/phase/add_to_all_rules/PhaseJunitTest.scala new file mode 100644 index 000000000..bc20bbc44 --- /dev/null +++ b/test/phase/add_to_all_rules/PhaseJunitTest.scala @@ -0,0 +1,10 @@ +package scalarules.test.phase.add_to_all_rules + +import org.junit.Test + +class PhaseJunitTest { + @Test + def someTest: Unit = { + val message = "You can customize junit test phases!" + } +} diff --git a/test/phase/add_to_all_rules/PhaseLibrary.scala b/test/phase/add_to_all_rules/PhaseLibrary.scala new file mode 100644 index 000000000..a1b8b1219 --- /dev/null +++ b/test/phase/add_to_all_rules/PhaseLibrary.scala @@ -0,0 +1,5 @@ +package scalarules.test.phase.add_to_all_rules + +object PhaseLibrary { + val message = "You can customize library phases!" +} diff --git a/test/phase/add_to_all_rules/PhaseTest.scala b/test/phase/add_to_all_rules/PhaseTest.scala new file mode 100644 index 000000000..ca0803a11 --- /dev/null +++ b/test/phase/add_to_all_rules/PhaseTest.scala @@ -0,0 +1,10 @@ +package scalarules.test.phase.add_to_all_rules + +import org.scalatest._ + +class PhaseTest extends FlatSpec { + val message = "You can customize test phases!" + "HelloTest" should "be able to customize test phases!" in { + assert(message.equals("You can customize test phases!")) + } +} diff --git a/test/phase/add_to_all_rules/phase_add_to_all_rules.bzl b/test/phase/add_to_all_rules/phase_add_to_all_rules.bzl new file mode 100644 index 000000000..aa89d08ec --- /dev/null +++ b/test/phase/add_to_all_rules/phase_add_to_all_rules.bzl @@ -0,0 +1,10 @@ +# +# PHASE: add to all rules +# +# A dummy test phase to make sure phase is working for all rules +# +def phase_add_to_all_rules(ctx, p): + ctx.actions.write( + output = ctx.outputs.custom_output, + content = ctx.attr.custom_content, + ) diff --git a/test/phase/add_to_all_rules/phase_add_to_all_rules_test.bzl b/test/phase/add_to_all_rules/phase_add_to_all_rules_test.bzl new file mode 100644 index 000000000..2d9b4c5fc --- /dev/null +++ b/test/phase/add_to_all_rules/phase_add_to_all_rules_test.bzl @@ -0,0 +1,60 @@ +""" +This test makes sure custom phases can be inserted to the desired position through phase API +""" + +load( + "//scala:advanced_usage/providers.bzl", + _ScalaRulePhase = "ScalaRulePhase", +) +load( + "//scala:advanced_usage/scala.bzl", + _make_scala_binary = "make_scala_binary", + _make_scala_junit_test = "make_scala_junit_test", + _make_scala_library = "make_scala_library", + _make_scala_library_for_plugin_bootstrapping = "make_scala_library_for_plugin_bootstrapping", + _make_scala_macro_library = "make_scala_macro_library", + _make_scala_repl = "make_scala_repl", + _make_scala_test = "make_scala_test", +) +load( + "//test/phase/add_to_all_rules:phase_add_to_all_rules.bzl", + _phase_add_to_all_rules = "phase_add_to_all_rules", +) + +# Inputs for the customizable rules +ext_add_to_all_rules = { + "attrs": { + "custom_content": attr.string( + default = "This is custom content", + ), + }, + "outputs": { + "custom_output": "%{name}.custom-output", + }, + "phase_providers": [ + "//test/phase/add_to_all_rules:phase_add_to_all_rules", + ], +} + +# The rule implementation for phase provider +def _add_to_all_rules_singleton_implementation(ctx): + return [ + _ScalaRulePhase( + custom_phases = [ + ("last", "", "add_to_all_rules", _phase_add_to_all_rules), + ], + ), + ] + +# The rule for phase provider +add_to_all_rules_singleton = rule( + implementation = _add_to_all_rules_singleton_implementation, +) + 
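+# The assignments below wire ext_add_to_all_rules into each customizable rule, so every
+# variant gains the custom_content attribute, the %{name}.custom-output output, and the
+# phase provided by the add_to_all_rules_singleton target above.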
+add_to_all_rules_scala_binary = _make_scala_binary(ext_add_to_all_rules) +add_to_all_rules_scala_library = _make_scala_library(ext_add_to_all_rules) +add_to_all_rules_scala_library_for_plugin_bootstrapping = _make_scala_library_for_plugin_bootstrapping(ext_add_to_all_rules) +add_to_all_rules_scala_macro_library = _make_scala_macro_library(ext_add_to_all_rules) +add_to_all_rules_scala_test = _make_scala_test(ext_add_to_all_rules) +add_to_all_rules_scala_junit_test = _make_scala_junit_test(ext_add_to_all_rules) +add_to_all_rules_scala_repl = _make_scala_repl(ext_add_to_all_rules) diff --git a/test/phase/adjustment/BUILD b/test/phase/adjustment/BUILD new file mode 100644 index 000000000..947440aa0 --- /dev/null +++ b/test/phase/adjustment/BUILD @@ -0,0 +1,27 @@ +load( + "//test/phase/adjustment:phase_adjustment_test.bzl", + "adjustment_replace_scala_library", + "adjustment_replace_singleton", + "adjustment_scala_library", + "adjustment_singleton", +) + +adjustment_singleton( + name = "phase_adjustment", + visibility = ["//visibility:public"], +) + +adjustment_replace_singleton( + name = "phase_adjustment_replace", + visibility = ["//visibility:public"], +) + +adjustment_scala_library( + name = "PhaseLibrary", + srcs = ["PhaseLibrary.scala"], +) + +adjustment_replace_scala_library( + name = "PhaseLibraryReplace", + srcs = ["PhaseLibrary.scala"], +) diff --git a/test/phase/adjustment/PhaseLibrary.scala b/test/phase/adjustment/PhaseLibrary.scala new file mode 100644 index 000000000..20e661203 --- /dev/null +++ b/test/phase/adjustment/PhaseLibrary.scala @@ -0,0 +1,5 @@ +package scalarules.test.phase.adjustment + +object PhaseLibrary { + val message = "You can customize library phases!" +} diff --git a/test/phase/adjustment/phase_adjustment.bzl b/test/phase/adjustment/phase_adjustment.bzl new file mode 100644 index 000000000..ae47fd7d4 --- /dev/null +++ b/test/phase/adjustment/phase_adjustment.bzl @@ -0,0 +1,42 @@ +# +# PHASE: adjustment test +# +# Dummy test phases to make sure phase adjustment is working +# +def phase_first(ctx, p): + return struct( + info_first = "info from phase_first", + ) + +def phase_second(ctx, p): + return struct( + info_first = "phase_second redirect " + p.first.info_first, + info_second = "info from phase_second", + ) + +def phase_third(ctx, p): + ctx.actions.write( + output = ctx.outputs.custom_output, + content = "{} {} {}".format(p.first.info_first, p.second.info_first, p.second.info_second), + ) + +def phase_replace(ctx, p): + return struct( + info = "expected info from phase_replace", + ) + +def phase_being_replaced(ctx, p): + return struct( + info = "unexpected info from phase_being_replaced", + ) + +def phase_check_replacement(ctx, p): + final_info = "" + if getattr(p, "replace"): + final_info += p.replace.info + if hasattr(p, "being_replaced"): + final_info += p.being_replaced.info + ctx.actions.write( + output = ctx.outputs.custom_output, + content = "{} we should only see one info".format(final_info), + ) diff --git a/test/phase/adjustment/phase_adjustment_test.bzl b/test/phase/adjustment/phase_adjustment_test.bzl new file mode 100644 index 000000000..5d6699a4c --- /dev/null +++ b/test/phase/adjustment/phase_adjustment_test.bzl @@ -0,0 +1,79 @@ +""" +This test makes sure custom phases can be inserted to the desired position through phase API +""" + +load( + "//scala:advanced_usage/providers.bzl", + _ScalaRulePhase = "ScalaRulePhase", +) +load( + "//scala:advanced_usage/scala.bzl", + _make_scala_library = "make_scala_library", +) +load( + 
"//test/phase/adjustment:phase_adjustment.bzl", + _phase_being_replaced = "phase_being_replaced", + _phase_check_replacement = "phase_check_replacement", + _phase_first = "phase_first", + _phase_replace = "phase_replace", + _phase_second = "phase_second", + _phase_third = "phase_third", +) + +# Inputs for the customizable rules +ext_adjustment = { + "outputs": { + "custom_output": "%{name}.custom-output", + }, + "phase_providers": [ + "//test/phase/adjustment:phase_adjustment", + ], +} + +# The rule implementation for phase provider +def _adjustment_singleton_implementation(ctx): + return [ + _ScalaRulePhase( + custom_phases = [ + ("last", "", "second", _phase_second), + ("before", "second", "first", _phase_first), + ("after", "second", "third", _phase_third), + ], + ), + ] + +# The rule for phase provider +adjustment_singleton = rule( + implementation = _adjustment_singleton_implementation, +) + +adjustment_scala_library = _make_scala_library(ext_adjustment) + +# Inputs for the customizable rules +ext_adjustment_replace = { + "outputs": { + "custom_output": "%{name}.custom-output", + }, + "phase_providers": [ + "//test/phase/adjustment:phase_adjustment_replace", + ], +} + +# The rule implementation for phase provider +def _adjustment_replace_singleton_implementation(ctx): + return [ + _ScalaRulePhase( + custom_phases = [ + ("last", "", "check_replacement", _phase_check_replacement), + ("before", "check_replacement", "being_replaced", _phase_being_replaced), + ("replace", "being_replaced", "replace", _phase_replace), + ], + ), + ] + +# The rule for phase provider +adjustment_replace_singleton = rule( + implementation = _adjustment_replace_singleton_implementation, +) + +adjustment_replace_scala_library = _make_scala_library(ext_adjustment_replace) diff --git a/test/plugins/BUILD b/test/plugins/BUILD new file mode 100644 index 000000000..1a5cf607e --- /dev/null +++ b/test/plugins/BUILD @@ -0,0 +1,40 @@ +load("//scala:scala.bzl", "scala_library") + +scala_library( + name = "check_expand_location", + srcs = ["trivial.scala"], + plugins = [ + ":check_expand_location_plugin_deploy.jar", + ], + scalacopts = [ + "-P:diablerie:location=$(location :check_expand_location_plugin_deploy.jar)", + ], +) + +scala_library( + name = "check_expand_location_plugin", + srcs = [ + "check_expand_location_plugin.scala", + ], + resource_strip_prefix = package_name(), + resources = [ + ":gen-scalac-plugin.xml", + ], + deps = [ + "@io_bazel_rules_scala_scala_compiler", + ], +) + +_gen_plugin_xml_cmd = """ +cat > $@ << EOF + + plugin + plugin.Plugin + +""" + +genrule( + name = "gen-scalac-plugin.xml", + outs = ["scalac-plugin.xml"], + cmd = _gen_plugin_xml_cmd, +) diff --git a/test/plugins/check_expand_location_plugin.scala b/test/plugins/check_expand_location_plugin.scala new file mode 100644 index 000000000..fc5f61289 --- /dev/null +++ b/test/plugins/check_expand_location_plugin.scala @@ -0,0 +1,25 @@ +package plugin + +import scala.tools.nsc.Global +import scala.tools.nsc.Phase +import scala.tools.nsc.plugins.{ Plugin => NscPlugin} +import scala.tools.nsc.plugins.PluginComponent + +import java.io.File + +final class Plugin(override val global: Global) extends NscPlugin { + override val name: String = "diablerie" + override val description: String = "just another plugin" + override val components: List[PluginComponent] = Nil + + override def processOptions(options: List[String], error: String => Unit): Unit = { + options + .find(_.startsWith("location=")) + .map(_.stripPrefix("location=")) + .map(v => new 
File(v).exists) match { + case Some(true) => () + case Some(false) => error("expanded location doesn't exist") + case None => error("missing location argument") + } + } +} diff --git a/test/plugins/trivial.scala b/test/plugins/trivial.scala new file mode 100644 index 000000000..ed5e226bc --- /dev/null +++ b/test/plugins/trivial.scala @@ -0,0 +1,5 @@ +package trivial + +object Trivial { + // feel free to reuse this file for other plugin tests +} diff --git a/test/proto/BUILD b/test/proto/BUILD index ccc708a9a..a4614bb57 100644 --- a/test/proto/BUILD +++ b/test/proto/BUILD @@ -1,6 +1,6 @@ load( "//scala_proto:scala_proto.bzl", - "scalapb_proto_library", + "scala_proto_library", ) load( "//scala:scala.bzl", @@ -20,16 +20,16 @@ scala_proto_toolchain( "//test/proto:blacklisted_proto", "//test/proto:other_blacklisted_proto", ], + extra_generator_dependencies = [ + "//test/src/main/scala/scalarules/test/extra_protobuf_generator", + ], + named_generators = { + "jvm_extra_protobuf_generator": "scalarules.test.extra_protobuf_generator.ExtraProtobufGenerator", + }, visibility = ["//visibility:public"], with_flat_package = False, with_grpc = True, with_single_line_to_string = True, - named_generators = { - 'jvm_extra_protobuf_generator': 'scalarules.test.extra_protobuf_generator.ExtraProtobufGenerator', - }, - extra_generator_dependencies = [ - "//test/src/main/scala/scalarules/test/extra_protobuf_generator", - ], ) toolchain( @@ -74,7 +74,7 @@ proto_library( ], ) -scalapb_proto_library( +scala_proto_library( name = "test_external_dep", visibility = ["//visibility:public"], deps = [":test_external_dep_proto"], @@ -82,19 +82,19 @@ scalapb_proto_library( # Test that the `proto_source_root` attribute is handled properly proto_library( - name = "proto_source_root", + name = "strip_import_prefix", srcs = [ "different_root.proto", "different_root2.proto", ], - proto_source_root = package_name(), + strip_import_prefix = "", visibility = ["//visibility:public"], ) -scalapb_proto_library( - name = "test_proto_source_root", +scala_proto_library( + name = "test_strip_import_prefix", visibility = ["//visibility:public"], - deps = [":proto_source_root"], + deps = [":strip_import_prefix"], ) proto_library( @@ -108,12 +108,19 @@ proto_library( ], ) -scalapb_proto_library( +scala_proto_library( name = "test_proto_nogrpc", visibility = ["//visibility:public"], deps = [":test2"], ) +scala_binary( + name = "test_binary_to_ensure_no_host_deps", + main_class = "a.b.c", + visibility = ["//visibility:public"], + deps = [":test_proto_nogrpc"], +) + java_proto_library( name = "test_proto_java_lib", deps = [ @@ -122,7 +129,7 @@ java_proto_library( ], ) -scalapb_proto_library( +scala_proto_library( name = "test_proto_java_conversions", visibility = ["//visibility:public"], deps = [ @@ -131,7 +138,7 @@ scalapb_proto_library( ], ) -scalapb_proto_library( +scala_proto_library( name = "test_proto", visibility = ["//visibility:public"], deps = [ @@ -151,7 +158,6 @@ scala_test( ], ) - scala_test( name = "test_custom_object_exists", srcs = [ diff --git a/test/proto3/BUILD b/test/proto3/BUILD index 08f800729..441040e72 100644 --- a/test/proto3/BUILD +++ b/test/proto3/BUILD @@ -1,13 +1,13 @@ load( "//scala_proto:scala_proto.bzl", - "scalapb_proto_library", + "scala_proto_library", ) genrule( name = "generated", srcs = ["test.proto"], outs = ["generated.proto"], - cmd = "cp $(SRCS) \"$@\"" + cmd = "cp $(SRCS) \"$@\"", ) proto_library( @@ -16,8 +16,8 @@ proto_library( visibility = ["//visibility:public"], ) -scalapb_proto_library( 
+scala_proto_library( name = "test_generated_proto", visibility = ["//visibility:public"], deps = [":generated-proto-lib"], -) \ No newline at end of file +) diff --git a/test/shell/test_build_event_protocol.sh b/test/shell/test_build_event_protocol.sh new file mode 100755 index 000000000..0ba50edce --- /dev/null +++ b/test/shell/test_build_event_protocol.sh @@ -0,0 +1,36 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. "${dir}"/test_runner.sh +runner=$(get_test_runner "${1:-local}") + +scala_binary_common_jar_is_exposed_in_build_event_protocol() { + local target=$1 + set +e + bazel build test:$target --build_event_text_file=$target_bes.txt + cat $target_bes.txt | grep "test/$target.jar" + if [ $? -ne 0 ]; then + echo "test/$target.jar was not found in build event protocol:" + cat $target_bes.txt + rm $target_bes.txt + exit 1 + fi + + rm $target_bes.txt + set -e +} + +scala_binary_jar_is_exposed_in_build_event_protocol() { + scala_binary_common_jar_is_exposed_in_build_event_protocol ScalaLibBinary +} + +scala_test_jar_is_exposed_in_build_event_protocol() { + scala_binary_common_jar_is_exposed_in_build_event_protocol HelloLibTest +} + +scala_junit_test_jar_is_exposed_in_build_event_protocol() { + scala_binary_common_jar_is_exposed_in_build_event_protocol JunitTestWithDeps +} + +$runner scala_binary_jar_is_exposed_in_build_event_protocol +$runner scala_test_jar_is_exposed_in_build_event_protocol +$runner scala_junit_test_jar_is_exposed_in_build_event_protocol diff --git a/test/shell/test_compilation.sh b/test/shell/test_compilation.sh new file mode 100755 index 000000000..f883dfe2f --- /dev/null +++ b/test/shell/test_compilation.sh @@ -0,0 +1,31 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. "${dir}"/test_runner.sh +. "${dir}"/test_helper.sh +runner=$(get_test_runner "${1:-local}") + +test_compilation_succeeds_with_plus_one_deps_on() { + bazel build --extra_toolchains=//test_expect_failure/plus_one_deps:plus_one_deps //test_expect_failure/plus_one_deps/internal_deps:a +} + +test_compilation_fails_with_plus_one_deps_undefined() { + action_should_fail build //test_expect_failure/plus_one_deps/internal_deps:a +} + +test_compilation_succeeds_with_plus_one_deps_on_for_external_deps() { + bazel build --extra_toolchains="//test_expect_failure/plus_one_deps:plus_one_deps" //test_expect_failure/plus_one_deps/external_deps:a +} + +test_compilation_succeeds_with_plus_one_deps_on_also_for_exports_of_deps() { + bazel build --extra_toolchains="//test_expect_failure/plus_one_deps:plus_one_deps" //test_expect_failure/plus_one_deps/exports_of_deps/... +} + +test_compilation_succeeds_with_plus_one_deps_on_also_for_deps_of_exports() { + bazel build --extra_toolchains="//test_expect_failure/plus_one_deps:plus_one_deps" //test_expect_failure/plus_one_deps/deps_of_exports/... +} + +$runner test_compilation_succeeds_with_plus_one_deps_on +$runner test_compilation_fails_with_plus_one_deps_undefined +$runner test_compilation_succeeds_with_plus_one_deps_on_for_external_deps +$runner test_compilation_succeeds_with_plus_one_deps_on_also_for_exports_of_deps +$runner test_compilation_succeeds_with_plus_one_deps_on_also_for_deps_of_exports diff --git a/test/shell/test_deps.sh b/test/shell/test_deps.sh new file mode 100755 index 000000000..5570c1e09 --- /dev/null +++ b/test/shell/test_deps.sh @@ -0,0 +1,48 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. 
"${dir}"/test_runner.sh +. "${dir}"/test_helper.sh +runner=$(get_test_runner "${1:-local}") + +test_scala_import_library_passes_labels_of_direct_deps() { + dependency_target='//test_expect_failure/scala_import:root_for_scala_import_passes_labels_of_direct_deps' + test_target='test_expect_failure/scala_import:leaf_for_scala_import_passes_labels_of_direct_deps' + + test_scala_library_expect_failure_on_missing_direct_deps $dependency_target $test_target +} + +test_plus_one_deps_only_works_for_java_info_targets() { + #for example doesn't break scala proto which depends on proto_library + bazel build --extra_toolchains="//test_expect_failure/plus_one_deps:plus_one_deps" //test/proto:test_proto +} + +scala_pb_library_targets_do_not_have_host_deps() { + set -e + bazel build test/proto:test_binary_to_ensure_no_host_deps + set +e + find bazel-bin/test/proto/test_binary_to_ensure_no_host_deps.runfiles -name '*.jar' -exec readlink {} \; | grep 'bazel-out/host' + RET=$? + set -e + if [ "$RET" == "0" ]; then + echo "Host deps exist in output of target:" + echo "Possibly toolchains limitation?" + find bazel-bin/test/proto/test_binary_to_ensure_no_host_deps.runfiles -name '*.jar' -exec readlink {} \; | grep 'bazel-out/host' + exit 1 + fi +} + +test_scala_import_expect_failure_on_missing_direct_deps_warn_mode() { + dependency_target1='//test_expect_failure/scala_import:cats' + dependency_target2='//test_expect_failure/scala_import:guava' + test_target='test_expect_failure/scala_import:scala_import_propagates_compile_deps' + + local expected_message1="buildozer 'add deps $dependency_target1' //$test_target" + local expected_message2="buildozer 'add deps $dependency_target2' //$test_target" + + test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message "${expected_message1}" ${test_target} "--strict_java_deps=warn" "ne" "${expected_message2}" +} + +$runner test_scala_import_library_passes_labels_of_direct_deps +$runner test_plus_one_deps_only_works_for_java_info_targets +$runner scala_pb_library_targets_do_not_have_host_deps +$runner test_scala_import_expect_failure_on_missing_direct_deps_warn_mode diff --git a/test/shell/test_helper.sh b/test/shell/test_helper.sh new file mode 100755 index 000000000..7e5f1b989 --- /dev/null +++ b/test/shell/test_helper.sh @@ -0,0 +1,112 @@ +#!/usr/bin/env bash +# +# Test helper functions for rules_scala integration tests. + +action_should_fail() { + # runs the tests locally + set +e + TEST_ARG=$@ + DUMMY=$(bazel $TEST_ARG) + RESPONSE_CODE=$? + if [ $RESPONSE_CODE -eq 0 ]; then + echo -e "${RED} \"bazel $TEST_ARG\" should have failed but passed. $NC" + exit -1 + else + exit 0 + fi +} + +test_expect_failure_with_message() { + set +e + + expected_message=$1 + test_filter=$2 + test_command=$3 + + command="bazel test --nocache_test_results --test_output=streamed ${test_filter} ${test_command}" + output=$(${command} 2>&1) + + echo ${output} | grep "$expected_message" + if [ $? -ne 0 ]; then + echo "'bazel test ${test_command}' should have logged \"${expected_message}\"." + exit 1 + fi + if [ "${additional_expected_message}" != "" ]; then + echo ${output} | grep "$additional_expected_message" + if [ $? -ne 0 ]; then + echo "'bazel test ${test_command}' should have logged \"${additional_expected_message}\"." + exit 1 + fi + fi + + set -e +} + +action_should_fail_with_message() { + set +e + MSG=$1 + TEST_ARG=${@:2} + RES=$(bazel $TEST_ARG 2>&1) + RESPONSE_CODE=$? + echo $RES | grep -- "$MSG" + GREP_RES=$? 
+ if [ $RESPONSE_CODE -eq 0 ]; then + echo -e "${RED} \"bazel $TEST_ARG\" should have failed but passed. $NC" + exit 1 + elif [ $GREP_RES -ne 0 ]; then + echo -e "${RED} \"bazel $TEST_ARG\" should have failed with message \"$MSG\" but did not. $NC" + else + exit 0 + fi +} + +test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message() { + set +e + + expected_message=$1 + test_target=$2 + strict_deps_mode=$3 + operator=${4:-"eq"} + additional_expected_message=${5:-""} + + if [ "${operator}" = "eq" ]; then + error_message="bazel build of scala_library with missing direct deps should have failed." + else + error_message="bazel build of scala_library with missing direct deps should not have failed." + fi + + command="bazel build ${test_target} ${strict_deps_mode}" + + output=$(${command} 2>&1) + status_code=$? + + echo "$output" + if [ ${status_code} -${operator} 0 ]; then + echo ${error_message} + exit 1 + fi + + echo ${output} | grep "$expected_message" + if [ $? -ne 0 ]; then + echo "'bazel build ${test_target}' should have logged \"${expected_message}\"." + exit 1 + fi + if [ "${additional_expected_message}" != "" ]; then + echo ${output} | grep "$additional_expected_message" + if [ $? -ne 0 ]; then + echo "'bazel build ${test_target}' should have logged \"${additional_expected_message}\"." + exit 1 + fi + fi + + set -e +} + +test_scala_library_expect_failure_on_missing_direct_deps() { + dependenecy_target=$1 + test_target=$2 + + local expected_message="buildozer 'add deps $dependenecy_target' //$test_target" + + test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message "${expected_message}" $test_target "--strict_java_deps=error" +} diff --git a/test/shell/test_javac_jvm_flags.sh b/test/shell/test_javac_jvm_flags.sh new file mode 100755 index 000000000..543524765 --- /dev/null +++ b/test/shell/test_javac_jvm_flags.sh @@ -0,0 +1,30 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. "${dir}"/test_runner.sh +. "${dir}"/test_helper.sh +runner=$(get_test_runner "${1:-local}") + +javac_jvm_flags_are_configured(){ + action_should_fail build //test_expect_failure/compilers_jvm_flags:can_configure_jvm_flags_for_javac +} + +javac_jvm_flags_via_javacopts_are_configured(){ + action_should_fail build //test_expect_failure/compilers_jvm_flags:can_configure_jvm_flags_for_javac_via_javacopts +} + +javac_jvm_flags_are_expanded(){ + action_should_fail_with_message \ + "invalid flag: test_expect_failure/compilers_jvm_flags/args.txt" \ + build --verbose_failures //test_expect_failure/compilers_jvm_flags:can_expand_jvm_flags_for_javac +} + +javac_jvm_flags_via_javacopts_are_expanded(){ + action_should_fail_with_message \ + "invalid flag: test_expect_failure/compilers_jvm_flags/args.txt" \ + build --verbose_failures //test_expect_failure/compilers_jvm_flags:can_expand_jvm_flags_for_javac_via_javacopts +} + +$runner javac_jvm_flags_are_configured +$runner javac_jvm_flags_via_javacopts_are_configured +$runner javac_jvm_flags_are_expanded +$runner javac_jvm_flags_via_javacopts_are_expanded diff --git a/test/shell/test_junit.sh b/test/shell/test_junit.sh new file mode 100755 index 000000000..af51d5b66 --- /dev/null +++ b/test/shell/test_junit.sh @@ -0,0 +1,113 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. "${dir}"/test_runner.sh +. 
"${dir}"/test_helper.sh +runner=$(get_test_runner "${1:-local}") + +multiple_junit_suffixes() { + bazel test //test:JunitMultipleSuffixes + + matches=$(grep -c -e 'Discovered classes' -e 'scalarules.test.junit.JunitSuffixIT' -e 'scalarules.test.junit.JunitSuffixE2E' ./bazel-testlogs/test/JunitMultipleSuffixes/test.log) + if [ $matches -eq 3 ]; then + return 0 + else + return 1 + fi +} + +multiple_junit_prefixes() { + bazel test //test:JunitMultiplePrefixes + + matches=$(grep -c -e 'Discovered classes' -e 'scalarules.test.junit.TestJunitCustomPrefix' -e 'scalarules.test.junit.OtherCustomPrefixJunit' ./bazel-testlogs/test/JunitMultiplePrefixes/test.log) + if [ $matches -eq 3 ]; then + return 0 + else + return 1 + fi +} + +multiple_junit_patterns() { + bazel test //test:JunitPrefixesAndSuffixes + matches=$(grep -c -e 'Discovered classes' -e 'scalarules.test.junit.TestJunitCustomPrefix' -e 'scalarules.test.junit.JunitSuffixE2E' ./bazel-testlogs/test/JunitPrefixesAndSuffixes/test.log) + if [ $matches -eq 3 ]; then + return 0 + else + return 1 + fi +} + +test_scala_junit_test_can_fail() { + action_should_fail test test_expect_failure/scala_junit_test:failing_test +} + +junit_generates_xml_logs() { + bazel test //test:JunitTestWithDeps + matches=$(grep -c -e "testcase name='hasCompileTimeDependencies'" -e "testcase name='hasRuntimeDependencies'" ./bazel-testlogs/test/JunitTestWithDeps/test.xml) + if [ $matches -eq 2 ]; then + return 0 + else + return 1 + fi + test -e +} + +test_junit_test_must_have_prefix_or_suffix() { + action_should_fail test test_expect_failure/scala_junit_test:no_prefix_or_suffix +} + +test_junit_test_errors_when_no_tests_found() { + action_should_fail test test_expect_failure/scala_junit_test:no_tests_found +} + +scala_junit_test_test_filter(){ + local output=$(bazel test \ + --nocache_test_results \ + --test_output=streamed \ + '--test_filter=scalarules.test.junit.FirstFilterTest#(method1|method2)$|scalarules.test.junit.SecondFilterTest#(method2|method3)$' \ + test:JunitFilterTest) + local expected=( + "scalarules.test.junit.FirstFilterTest#method1" + "scalarules.test.junit.FirstFilterTest#method2" + "scalarules.test.junit.SecondFilterTest#method2" + "scalarules.test.junit.SecondFilterTest#method3") + local unexpected=( + "scalarules.test.junit.FirstFilterTest#method3" + "scalarules.test.junit.SecondFilterTest#method1" + "scalarules.test.junit.ThirdFilterTest#method1" + "scalarules.test.junit.ThirdFilterTest#method2" + "scalarules.test.junit.ThirdFilterTest#method3") + for method in "${expected[@]}"; do + if ! grep "$method" <<<$output; then + echo "output:" + echo "$output" + echo "Expected $method in output, but was not found." + exit 1 + fi + done + for method in "${unexpected[@]}"; do + if grep "$method" <<<$output; then + echo "output:" + echo "$output" + echo "Not expecting $method in output, but was found." 
+ exit 1 + fi + done +} + +scala_junit_test_test_filter_custom_runner(){ + bazel test \ + --nocache_test_results \ + --test_output=streamed \ + '--test_filter=scalarules.test.junit.JunitCustomRunnerTest#' \ + test:JunitCustomRunner +} + +$runner multiple_junit_suffixes +$runner multiple_junit_prefixes +$runner multiple_junit_patterns +$runner test_scala_junit_test_can_fail +$runner junit_generates_xml_logs +$runner test_junit_test_must_have_prefix_or_suffix +$runner test_junit_test_errors_when_no_tests_found +$runner scala_junit_test_test_filter +$runner scala_junit_test_test_filter_custom_runner diff --git a/test/shell/test_misc.sh b/test/shell/test_misc.sh new file mode 100755 index 000000000..e1a767b71 --- /dev/null +++ b/test/shell/test_misc.sh @@ -0,0 +1,124 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. "${dir}"/test_runner.sh +. "${dir}"/test_helper.sh +runner=$(get_test_runner "${1:-local}") + +test_disappearing_class() { + git checkout test_expect_failure/disappearing_class/ClassProvider.scala + bazel build test_expect_failure/disappearing_class:uses_class + echo -e "package scalarules.test\n\nobject BackgroundNoise{}" > test_expect_failure/disappearing_class/ClassProvider.scala + set +e + bazel build test_expect_failure/disappearing_class:uses_class + RET=$? + git checkout test_expect_failure/disappearing_class/ClassProvider.scala + if [ $RET -eq 0 ]; then + echo "Class caching at play. This should fail" + exit 1 + fi + set -e +} + +test_transitive_deps() { + set +e + + bazel build test_expect_failure/transitive/scala_to_scala:d + if [ $? -eq 0 ]; then + echo "'bazel build test_expect_failure/transitive/scala_to_scala:d' should have failed." + exit 1 + fi + + bazel build test_expect_failure/transitive/java_to_scala:d + if [ $? -eq 0 ]; then + echo "'bazel build test_expect_failure/transitive/java_to_scala:d' should have failed." + exit 1 + fi + + bazel build test_expect_failure/transitive/scala_to_java:d + if [ $? -eq 0 ]; then + echo "'bazel build test_transitive_deps/scala_to_java:d' should have failed." + exit 1 + fi + + set -e + exit 0 +} + +test_repl() { + bazel build $(bazel query 'kind(scala_repl, //test/...)') + echo "import scalarules.test._; HelloLib.printMessage(\"foo\")" | bazel-bin/test/HelloLibRepl | grep "foo java" && + echo "import scalarules.test._; TestUtil.foo" | bazel-bin/test/HelloLibTestRepl | grep "bar" && + echo "import scalarules.test._; ScalaLibBinary.main(Array())" | bazel-bin/test/ScalaLibBinaryRepl | grep "A hui hou" && + echo "import scalarules.test._; ResourcesStripScalaBinary.main(Array())" | bazel-bin/test/ResourcesStripScalaBinaryRepl | grep "More Hello" + echo "import scalarules.test._; A.main(Array())" | bazel-bin/test/ReplWithSources | grep "4 8 15" +} + +test_benchmark_jmh() { + RES=$(bazel run -- test/jmh:test_benchmark -i1 -f1 -wi 1) + RESPONSE_CODE=$? 
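+  # Require a JMH result table in the output, then propagate bazel's exit code.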
+ if [[ $RES != *Result*Benchmark* ]]; then + echo "Benchmark did not produce expected output:\n$RES" + exit 1 + fi + exit $RESPONSE_CODE +} + +scala_test_test_filters() { + # test package wildcard (both) + local output=$(bazel test \ + --cache_test_results=no \ + --test_output streamed \ + --test_filter scalarules.test.* \ + test:TestFilterTests) + if [[ $output != *"tests a"* || $output != *"tests b"* ]]; then + echo "Should have contained test output from both test filter test a and b" + exit 1 + fi + # test just one + local output=$(bazel test \ + --cache_test_results=no \ + --test_output streamed \ + --test_filter scalarules.test.TestFilterTestA \ + test:TestFilterTests) + if [[ $output != *"tests a"* || $output == *"tests b"* ]]; then + echo "Should have only contained test output from test filter test a" + exit 1 + fi +} + +test_multi_service_manifest() { + deploy_jar='ScalaBinary_with_service_manifest_srcs_deploy.jar' + meta_file='META-INF/services/org.apache.beam.sdk.io.FileSystemRegistrar' + bazel build test:$deploy_jar + unzip -p bazel-bin/test/$deploy_jar $meta_file > service_manifest.txt + diff service_manifest.txt test/example_jars/expected_service_manifest.txt + RESPONSE_CODE=$? + rm service_manifest.txt + exit $RESPONSE_CODE +} + +test_override_javabin() { + # set the JAVABIN to nonsense + JAVABIN=/etc/basdf action_should_fail run test:ScalaBinary +} + +test_coverage_on() { + bazel coverage \ + --extra_toolchains="//test/coverage:enable_code_coverage_aspect" \ + //test/coverage/... + diff test/coverage/expected-coverage.dat $(bazel info bazel-testlogs)/test/coverage/test-all/coverage.dat +} + +xmllint_test() { + find -L ./bazel-testlogs -iname "*.xml" | xargs -n1 xmllint > /dev/null +} + +$runner test_disappearing_class +$runner test_transitive_deps +$runner test_repl +$runner test_benchmark_jmh +$runner scala_test_test_filters +$runner test_multi_service_manifest +$runner test_override_javabin +$runner test_coverage_on +$runner xmllint_test diff --git a/test/shell/test_phase.sh b/test/shell/test_phase.sh new file mode 100755 index 000000000..110f752ac --- /dev/null +++ b/test/shell/test_phase.sh @@ -0,0 +1,90 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. "${dir}"/test_runner.sh +. "${dir}"/test_helper.sh +runner=$(get_test_runner "${1:-local}") + +output_file_should_contain_message() { + set +e + MSG=$1 + TEST_ARG=${@:2} + OUTPUT_FILE=$(echo ${@:3} | sed 's#//#/#g;s#:#/#g') + OUTPUT_PATH=$(bazel info bazel-bin)/$OUTPUT_FILE + bazel $TEST_ARG + RESPONSE_CODE=$? + cat $OUTPUT_PATH | grep -- "$MSG" + GREP_RES=$? + if [ $RESPONSE_CODE -ne 0 ]; then + echo -e "${RED} \"bazel $TEST_ARG\" should pass but failed. $NC" + exit 1 + elif [ $GREP_RES -ne 0 ]; then + echo -e "${RED} \"bazel $TEST_ARG\" should pass with \"$MSG\" in file \"$OUTPUT_FILE\" but did not. 
$NC" + exit 1 + else + exit 0 + fi +} + +test_scala_binary_with_extra_phase() { + output_file_should_contain_message \ + "This is custom content" \ + build //test/phase/add_to_all_rules:PhaseBinary.custom-output +} + +test_scala_library_with_extra_phase_and_custom_content() { + output_file_should_contain_message \ + "This is custom content in library" \ + build //test/phase/add_to_all_rules:PhaseLibrary.custom-output +} + +test_scala_library_for_plugin_bootstrapping_with_extra_phase_and_custom_content() { + output_file_should_contain_message \ + "This is custom content in library_for_plugin_bootstrapping" \ + build //test/phase/add_to_all_rules:PhaseLibraryForPluginBootstrapping.custom-output +} + +test_scala_macro_library_with_extra_phase_and_custom_content() { + output_file_should_contain_message \ + "This is custom content in macro_library" \ + build //test/phase/add_to_all_rules:PhaseMacroLibrary.custom-output +} + +test_scala_test_with_extra_phase_and_custom_content() { + output_file_should_contain_message \ + "This is custom content in test" \ + build //test/phase/add_to_all_rules:PhaseTest.custom-output +} + +test_scala_junit_test_with_extra_phase_and_custom_content() { + output_file_should_contain_message \ + "This is custom content in junit_test" \ + build //test/phase/add_to_all_rules:PhaseJunitTest.custom-output +} + +test_scala_repl_with_extra_phase_and_custom_content() { + output_file_should_contain_message \ + "This is custom content in repl" \ + build //test/phase/add_to_all_rules:PhaseRepl.custom-output +} + +test_phase_adjustment_and_global_provider() { + output_file_should_contain_message \ + "info from phase_first phase_second redirect info from phase_first info from phase_second" \ + build //test/phase/adjustment:PhaseLibrary.custom-output +} + +test_phase_adjustment_replace() { + output_file_should_contain_message \ + "expected info from phase_replace we should only see one info" \ + build //test/phase/adjustment:PhaseLibraryReplace.custom-output +} + +$runner test_scala_binary_with_extra_phase +$runner test_scala_library_with_extra_phase_and_custom_content +$runner test_scala_library_for_plugin_bootstrapping_with_extra_phase_and_custom_content +$runner test_scala_macro_library_with_extra_phase_and_custom_content +$runner test_scala_test_with_extra_phase_and_custom_content +$runner test_scala_junit_test_with_extra_phase_and_custom_content +$runner test_scala_repl_with_extra_phase_and_custom_content +$runner test_phase_adjustment_and_global_provider +$runner test_phase_adjustment_replace diff --git a/test_runner.sh b/test/shell/test_runner.sh similarity index 98% rename from test_runner.sh rename to test/shell/test_runner.sh index 33392676a..7e2825699 100644 --- a/test_runner.sh +++ b/test/shell/test_runner.sh @@ -12,7 +12,7 @@ run_test_ci() { local TEST_ARG=$@ local log_file=output_$$.log echo "running test $TEST_ARG" - $TEST_ARG &>$log_file & + eval $TEST_ARG &>$log_file & local test_pid=$! SECONDS=0 test_pulse_printer $! $TIMOUT $TEST_ARG & @@ -83,4 +83,4 @@ get_test_runner() { exit 1 fi echo "run_test_${test_env}" -} \ No newline at end of file +} diff --git a/test/shell/test_scala_binary.sh b/test/shell/test_scala_binary.sh new file mode 100755 index 000000000..df5170ea4 --- /dev/null +++ b/test/shell/test_scala_binary.sh @@ -0,0 +1,22 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. "${dir}"/test_runner.sh +. 
"${dir}"/test_helper.sh +runner=$(get_test_runner "${1:-local}") + +test_scala_binary_expect_failure_on_missing_direct_deps() { + dependency_target='//test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency' + test_target='test_expect_failure/missing_direct_deps/internal_deps:user_binary' + + test_scala_library_expect_failure_on_missing_direct_deps ${dependency_target} ${test_target} +} + +test_scala_binary_expect_failure_on_missing_direct_deps_located_in_dependency_which_is_scala_binary() { + dependency_target='//test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency' + test_target='test_expect_failure/missing_direct_deps/internal_deps:binary_user_of_binary' + + test_scala_library_expect_failure_on_missing_direct_deps ${dependency_target} ${test_target} +} + +$runner test_scala_binary_expect_failure_on_missing_direct_deps +$runner test_scala_binary_expect_failure_on_missing_direct_deps_located_in_dependency_which_is_scala_binary diff --git a/test/shell/test_scala_classpath.sh b/test/shell/test_scala_classpath.sh new file mode 100755 index 000000000..6f5be4669 --- /dev/null +++ b/test/shell/test_scala_classpath.sh @@ -0,0 +1,22 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. "${dir}"/test_runner.sh +runner=$(get_test_runner "${1:-local}") + +test_scala_classpath_resources_expect_warning_on_namespace_conflict() { + local output=$(bazel build \ + --verbose_failures \ + //test/src/main/scala/scalarules/test/classpath_resources:classpath_resource_duplicates + ) + + local expected="Classpath resource file classpath-resourcehas a namespace conflict with another file: classpath-resource" + + if ! grep "$method" <<<$output; then + echo "output:" + echo "$output" + echo "Expected $method in output, but was not found." + exit 1 + fi +} + +$runner test_scala_classpath_resources_expect_warning_on_namespace_conflict diff --git a/test/shell/test_scala_import_source_jar.sh b/test/shell/test_scala_import_source_jar.sh new file mode 100755 index 000000000..9b8a425b8 --- /dev/null +++ b/test/shell/test_scala_import_source_jar.sh @@ -0,0 +1,68 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. "${dir}"/test_runner.sh +. "${dir}"/test_helper.sh +runner=$(get_test_runner "${1:-local}") + +test_scala_import_fetch_sources_with_env_bazel_jvm_fetch_sources_set_to() { + # the existence of the env var should cause the import repository rule to re-fetch the dependency + # and therefore the order of tests is not expected to matter + export BAZEL_JVM_FETCH_SOURCES=$1 + local expect_failure=$2 + + if [[ ${expect_failure} ]]; then + test_scala_import_fetch_sources $expect_failure + else + test_scala_import_fetch_sources + fi + + unset BAZEL_JVM_FETCH_SOURCES +} + +test_scala_import_fetch_sources() { + local srcjar_name="guava-21.0-src.jar" + local bazel_out_external_guava_21=$(bazel info output_base)/external/com_google_guava_guava_21_0 + local expect_failure=$1 + + set -e + bazel build //test/src/main/scala/scalarules/test/fetch_sources/... + set +e + + assert_file_exists $expect_failure $bazel_out_external_guava_21/$srcjar_name +} + +assert_file_exists() { + if [[ $1 ]]; then + if [[ -f $2 ]]; then + echo "File $2 exists but we expect no source jars." + exit 1 + else + echo "File $2 does not exist." + exit 0 + fi + else + if [[ -f $2 ]]; then + echo "File $2 exists." + exit 0 + else + echo "File $2 does not exist but we expect it to exist." 
+ exit 1 + fi + fi +} + +test_scala_import_source_jar_should_be_fetched_when_fetch_sources_is_set_to_true() { + test_scala_import_fetch_sources +} + +test_scala_import_source_jar_should_be_fetched_when_env_bazel_jvm_fetch_sources_is_set_to_true() { + test_scala_import_fetch_sources_with_env_bazel_jvm_fetch_sources_set_to "TruE" # as implied, the value is case insensitive +} + +test_scala_import_source_jar_should_not_be_fetched_when_env_bazel_jvm_fetch_sources_is_set_to_non_true() { + test_scala_import_fetch_sources_with_env_bazel_jvm_fetch_sources_set_to "false" "true" +} + +$runner test_scala_import_source_jar_should_be_fetched_when_fetch_sources_is_set_to_true +$runner test_scala_import_source_jar_should_be_fetched_when_env_bazel_jvm_fetch_sources_is_set_to_true +$runner test_scala_import_source_jar_should_not_be_fetched_when_env_bazel_jvm_fetch_sources_is_set_to_non_true diff --git a/test/shell/test_scala_jvm_flags.sh b/test/shell/test_scala_jvm_flags.sh new file mode 100755 index 000000000..5048d08de --- /dev/null +++ b/test/shell/test_scala_jvm_flags.sh @@ -0,0 +1,21 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. "${dir}"/test_runner.sh +. "${dir}"/test_helper.sh +runner=$(get_test_runner "${1:-local}") + +test_scala_jvm_flags_on_target_overrides_toolchain_passes() { + bazel test --extra_toolchains="//manual_test/scala_test_jvm_flags:failing_scala_toolchain" //manual_test/scala_test_jvm_flags:empty_overriding_test +} + +test_scala_jvm_flags_from_scala_toolchain_passes() { + bazel test --extra_toolchains="//manual_test/scala_test_jvm_flags:passing_scala_toolchain" //manual_test/scala_test_jvm_flags:empty_test +} + +test_scala_jvm_flags_from_scala_toolchain_fails() { + action_should_fail test --extra_toolchains="//test_expect_failure/scala_test_jvm_flags:failing_scala_toolchain" //test_expect_failure/scala_test_jvm_flags:empty_test +} + +$runner test_scala_jvm_flags_on_target_overrides_toolchain_passes +$runner test_scala_jvm_flags_from_scala_toolchain_passes +$runner test_scala_jvm_flags_from_scala_toolchain_fails diff --git a/test/shell/test_scala_library.sh b/test/shell/test_scala_library.sh new file mode 100755 index 000000000..0a5ada9f4 --- /dev/null +++ b/test/shell/test_scala_library.sh @@ -0,0 +1,186 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. "${dir}"/test_runner.sh +. "${dir}"/test_helper.sh +runner=$(get_test_runner "${1:-local}") + +revert_internal_change() { + sed -i.bak "s/println(\"altered\")/println(\"orig\")/" $no_recompilation_path/C.scala + rm $no_recompilation_path/C.scala.bak +} + +revert_change() { + mv $1/$2.bak $1/$2 +} + +test_scala_library_expect_no_recompilation_on_internal_change() { + changed_file=$1 + changed_content=$2 + dependency=$3 + dependency_description=$4 + set +e + no_recompilation_path="test/src/main/scala/scalarules/test/ijar" + build_command="bazel build //$no_recompilation_path/... --subcommands" + + echo "running initial build" + $build_command + echo "changing internal behaviour of $changed_file" + sed -i.bak $changed_content ./$no_recompilation_path/$changed_file + + echo "running second build" + output=$(${build_command} 2>&1) + + not_expected_recompiled_action="$no_recompilation_path$dependency" + + echo ${output} | grep "$not_expected_recompiled_action" + if [ $? -eq 0 ]; then + echo "bazel build was executed after change of internal behaviour of 'dependency' target. 
compilation of $dependency_description should not have been triggered." + revert_change $no_recompilation_path $changed_file + exit 1 + fi + + revert_change $no_recompilation_path $changed_file + set -e +} + +test_scala_library_expect_no_recompilation_of_target_on_internal_change_of_dependency() { + test_scala_library_expect_no_recompilation_on_internal_change $1 $2 ":user" "'user'" +} + +test_scala_library_suite() { + action_should_fail build test_expect_failure/scala_library_suite:library_suite_dep_on_children +} + +test_scala_library_expect_failure_on_missing_direct_internal_deps() { + dependenecy_target='//test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency' + test_target='test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency_user' + + test_scala_library_expect_failure_on_missing_direct_deps $dependenecy_target $test_target +} + +test_scala_library_expect_failure_on_missing_direct_external_deps_jar() { + dependenecy_target='@com_google_guava_guava_21_0//:com_google_guava_guava_21_0' + test_target='test_expect_failure/missing_direct_deps/external_deps:transitive_external_dependency_user' + + test_scala_library_expect_failure_on_missing_direct_deps $dependenecy_target $test_target +} + +test_scala_library_expect_failure_on_missing_direct_external_deps_file_group() { + dependenecy_target='@com_google_guava_guava_21_0_with_file//:com_google_guava_guava_21_0_with_file' + test_target='test_expect_failure/missing_direct_deps/external_deps:transitive_external_dependency_user_file_group' + + test_scala_library_expect_failure_on_missing_direct_deps $dependenecy_target $test_target +} + +test_scala_library_expect_failure_on_missing_direct_deps_strict_is_disabled_by_default() { + expected_message="not found: value C" + test_target='test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency_user' + + test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message "$expected_message" $test_target "" +} + +test_scala_library_expect_failure_on_missing_direct_deps_warn_mode() { + dependenecy_target='//test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency' + test_target='test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency_user' + + expected_message="warning: Target '$dependenecy_target' is used but isn't explicitly declared, please add it to the deps" + + test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message "${expected_message}" ${test_target} "--strict_java_deps=warn" "ne" +} + +test_scala_library_expect_failure_on_missing_direct_deps_warn_mode_java() { + dependency_target='//test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency' + test_target='//test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency_java_user' + + local expected_message="$dependency_target.*$test_target" + + test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message "${expected_message}" ${test_target} "--strict_java_deps=warn" "ne" +} + +test_scala_library_expect_failure_on_missing_direct_deps_off_mode() { + expected_message="test_expect_failure/missing_direct_deps/internal_deps/A.scala:[0-9+]: error: not found: value C" + test_target='test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency_user' + + test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message "${expected_message}" ${test_target} "--strict_java_deps=off" +} + 
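For readers following the strict-deps assertions above, the fixture they run against has roughly the shape sketched below. This is a hypothetical, minimal reconstruction in BUILD form: the target names mirror the ones referenced in these tests, but B.scala and the exact source contents are assumptions made for illustration only, and the real package under test_expect_failure/missing_direct_deps/internal_deps may differ in detail.

    load("//scala:scala.bzl", "scala_library")

    # Illustrative layout (not the actual fixture): C.scala defines the value C that
    # A.scala references, so :transitive_dependency_user needs C but only declares
    # :direct_dependency, leaving the dependency on :transitive_dependency transitive-only.
    scala_library(
        name = "transitive_dependency",
        srcs = ["C.scala"],
    )

    scala_library(
        name = "direct_dependency",
        srcs = ["B.scala"],  # assumed source name
        deps = [":transitive_dependency"],
    )

    scala_library(
        name = "transitive_dependency_user",
        srcs = ["A.scala"],
        deps = [":direct_dependency"],
    )

Against a layout like this, the tests above assert that the default and --strict_java_deps=off builds fail with "not found: value C", that --strict_java_deps=error fails with the buildozer 'add deps ...' suggestion, and that --strict_java_deps=warn builds while logging the "is used but isn't explicitly declared" warning.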
+test_scala_library_expect_no_recompilation_on_internal_change_of_transitive_dependency() { + set +e + no_recompilation_path="test/src/main/scala/scalarules/test/strict_deps/no_recompilation" + build_command="bazel build //$no_recompilation_path/... --subcommands --strict_java_deps=error" + + echo "running initial build" + $build_command + echo "changing internal behaviour of C.scala" + sed -i.bak "s/println(\"orig\")/println(\"altered\")/" ./$no_recompilation_path/C.scala + + echo "running second build" + output=$(${build_command} 2>&1) + + not_expected_recompiled_target="//$no_recompilation_path:transitive_dependency_user" + + echo ${output} | grep "$not_expected_recompiled_target" + if [ $? -eq 0 ]; then + echo "bazel build was executed after change of internal behaviour of 'transitive_dependency' target. compilation of 'transitive_dependency_user' should not have been triggered." + revert_internal_change + exit 1 + fi + + revert_internal_change + set -e +} + +test_scala_library_expect_no_recompilation_on_internal_change_of_scala_dependency() { + test_scala_library_expect_no_recompilation_of_target_on_internal_change_of_dependency "B.scala" "s/println(\"orig\")/println(\"altered\")/" +} + +test_scala_library_expect_no_recompilation_on_internal_change_of_java_dependency() { + test_scala_library_expect_no_recompilation_of_target_on_internal_change_of_dependency "C.java" "s/System.out.println(\"orig\")/System.out.println(\"altered\")/" +} + +test_scala_library_expect_no_java_recompilation_on_internal_change_of_scala_sibling() { + test_scala_library_expect_no_recompilation_on_internal_change "B.scala" "s/println(\"orig_sibling\")/println(\"altered_sibling\")/" "/dependency_java" "java sibling" +} + +test_scala_library_expect_failure_on_missing_direct_java() { + dependency_target='//test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency' + test_target='//test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency_java_user' + + expected_message="$dependency_target.*$test_target" + + test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message "${expected_message}" $test_target "--strict_java_deps=error" +} + +test_scala_library_expect_failure_on_java_in_src_jar_when_disabled() { + test_target='//test_expect_failure/java_in_src_jar_when_disabled:java_source_jar' + + expected_message=".*Found java files in source jars but expect Java output is set to false" + + test_expect_failure_with_message "${expected_message}" $test_target +} + +test_scala_library_expect_better_failure_message_on_missing_transitive_dependency_labels_from_other_jvm_rules() { + transitive_target='.*transitive_dependency-ijar.jar' + direct_target='//test_expect_failure/missing_direct_deps/internal_deps:direct_java_provider_dependency' + test_target='//test_expect_failure/missing_direct_deps/internal_deps:dependent_on_some_java_provider' + + expected_message="Unknown label of file $transitive_target which came from $direct_target" + + test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message "${expected_message}" $test_target "--strict_java_deps=error" +} + +$runner test_scala_library_suite +$runner test_scala_library_expect_failure_on_missing_direct_internal_deps +$runner test_scala_library_expect_failure_on_missing_direct_external_deps_jar +$runner test_scala_library_expect_failure_on_missing_direct_external_deps_file_group +$runner test_scala_library_expect_failure_on_missing_direct_deps_strict_is_disabled_by_default +$runner 
test_scala_library_expect_failure_on_missing_direct_deps_warn_mode +$runner test_scala_library_expect_failure_on_missing_direct_deps_warn_mode_java +$runner test_scala_library_expect_failure_on_missing_direct_deps_off_mode +$runner test_scala_library_expect_no_recompilation_on_internal_change_of_transitive_dependency +$runner test_scala_library_expect_no_recompilation_on_internal_change_of_scala_dependency +$runner test_scala_library_expect_no_recompilation_on_internal_change_of_java_dependency +$runner test_scala_library_expect_no_java_recompilation_on_internal_change_of_scala_sibling +$runner test_scala_library_expect_failure_on_missing_direct_java +$runner test_scala_library_expect_failure_on_java_in_src_jar_when_disabled +$runner test_scala_library_expect_better_failure_message_on_missing_transitive_dependency_labels_from_other_jvm_rules diff --git a/test/shell/test_scala_library_jar.sh b/test/shell/test_scala_library_jar.sh new file mode 100755 index 000000000..8ebd50360 --- /dev/null +++ b/test/shell/test_scala_library_jar.sh @@ -0,0 +1,30 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. "${dir}"/test_runner.sh +. "${dir}"/test_helper.sh +runner=$(get_test_runner "${1:-local}") + +test_resources() { + RESOURCE_NAME="resource.txt" + TARGET=$1 + OUTPUT_JAR="bazel-bin/test/src/main/scala/scalarules/test/resources/$TARGET.jar" + FULL_TARGET="test/src/main/scala/scalarules/test/resources/$TARGET.jar" + bazel build $FULL_TARGET + jar tf $OUTPUT_JAR | grep $RESOURCE_NAME +} + +scala_library_jar_without_srcs_must_include_direct_file_resources(){ + test_resources "noSrcsWithDirectFileResources" +} + +scala_library_jar_without_srcs_must_include_filegroup_resources(){ + test_resources "noSrcsWithFilegroupResources" +} + +scala_library_jar_without_srcs_must_fail_on_mismatching_resource_strip_prefix() { + action_should_fail build test_expect_failure/mismatching_resource_strip_prefix:noSrcsJarWithWrongStripPrefix +} + +$runner scala_library_jar_without_srcs_must_fail_on_mismatching_resource_strip_prefix +$runner scala_library_jar_without_srcs_must_include_direct_file_resources +$runner scala_library_jar_without_srcs_must_include_filegroup_resources diff --git a/test/shell/test_scala_specs2.sh b/test/shell/test_scala_specs2.sh new file mode 100755 index 000000000..c6a2a95e8 --- /dev/null +++ b/test/shell/test_scala_specs2.sh @@ -0,0 +1,278 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. "${dir}"/test_runner.sh +. "${dir}"/test_helper.sh +runner=$(get_test_runner "${1:-local}") + +scala_specs2_junit_test_test_filter_everything(){ + local output=$(bazel test \ + --nocache_test_results \ + --test_output=streamed \ + '--test_filter=.*' \ + test:Specs2Tests) + local expected=( + "[info] JunitSpec2RegexTest" + "[info] JunitSpecs2AnotherTest" + "[info] JunitSpecs2Test") + local unexpected=( + "[info] UnrelatedTest") + for method in "${expected[@]}"; do + if ! grep "$method" <<<$output; then + echo "output:" + echo "$output" + echo "Expected $method in output, but was not found." + exit 1 + fi + done + for method in "${unexpected[@]}"; do + if grep "$method" <<<$output; then + echo "output:" + echo "$output" + echo "Not expecting $method in output, but was found." 
+ exit 1 + fi + done +} + +scala_specs2_junit_test_test_filter_one_test(){ + local output=$(bazel test \ + --nocache_test_results \ + --test_output=streamed \ + '--test_filter=scalarules.test.junit.specs2.JunitSpecs2Test#specs2 tests should::run smoothly in bazel$' \ + test:Specs2Tests) + local expected="+ run smoothly in bazel" + local unexpected="+ not run smoothly in bazel" + if ! grep "$expected" <<<$output; then + echo "output:" + echo "$output" + echo "Expected $expected in output, but was not found." + exit 1 + fi + if grep "$unexpected" <<<$output; then + echo "output:" + echo "$output" + echo "Not expecting $unexpected in output, but was found." + exit 1 + fi +} + +scala_specs2_junit_test_test_filter_whole_spec(){ + local output=$(bazel test \ + --nocache_test_results \ + --test_output=streamed \ + '--test_filter=scalarules.test.junit.specs2.JunitSpecs2Test#' \ + test:Specs2Tests) + local expected=( + "+ run smoothly in bazel" + "+ not run smoothly in bazel") + local unexpected=( + "+ run from another test") + for method in "${expected[@]}"; do + if ! grep "$method" <<<$output; then + echo "output:" + echo "$output" + echo "Expected $method in output, but was not found." + exit 1 + fi + done + for method in "${unexpected[@]}"; do + if grep "$method" <<<$output; then + echo "output:" + echo "$output" + echo "Not expecting $method in output, but was found." + exit 1 + fi + done +} + +scala_specs2_junit_test_test_filter_exact_match(){ + local output=$(bazel test \ + --nocache_test_results \ + --test_output=streamed \ + '--test_filter=scalarules.test.junit.specs2.JunitSpecs2AnotherTest#other specs2 tests should::run from another test$' \ + test:Specs2Tests) + local expected="+ run from another test" + local unexpected="+ run from another test 2" + if ! grep "$expected" <<<$output; then + echo "output:" + echo "$output" + echo "Expected $expected in output, but was not found." + exit 1 + fi + if grep "$unexpected" <<<$output; then + echo "output:" + echo "$output" + echo "Not expecting $unexpected in output, but was found." + exit 1 + fi +} + +scala_specs2_junit_test_test_filter_exact_match_unsafe_characters(){ + local output=$(bazel test \ + --nocache_test_results \ + --test_output=streamed \ + '--test_filter=scalarules.test.junit.specs2.JunitSpec2RegexTest#\Qtests with unsafe characters should::2 + 2 != 5\E$' \ + test:Specs2Tests) + local expected="+ 2 + 2 != 5" + local unexpected="+ work escaped (with regex)" + if ! grep "$expected" <<<$output; then + echo "output:" + echo "$output" + echo "Expected $expected in output, but was not found." + exit 1 + fi + if grep "$unexpected" <<<$output; then + echo "output:" + echo "$output" + echo "Not expecting $unexpected in output, but was found." + exit 1 + fi +} + +scala_specs2_junit_test_test_filter_exact_match_escaped_and_sanitized(){ + local output=$(bazel test \ + --nocache_test_results \ + --test_output=streamed \ + '--test_filter=scalarules.test.junit.specs2.JunitSpec2RegexTest#\Qtests with unsafe characters should::work escaped [with regex]\E$' \ + test:Specs2Tests) + local expected="+ work escaped (with regex)" + local unexpected="+ 2 + 2 != 5" + if ! grep "$expected" <<<$output; then + echo "output:" + echo "$output" + echo "Expected $expected in output, but was not found." + exit 1 + fi + if grep "$unexpected" <<<$output; then + echo "output:" + echo "$output" + echo "Not expecting $unexpected in output, but was found." 
+ exit 1 + fi +} + +scala_specs2_junit_test_test_filter_match_multiple_methods(){ + local output=$(bazel test \ + --nocache_test_results \ + --test_output=streamed \ + '--test_filter=scalarules.test.junit.specs2.JunitSpecs2AnotherTest#other specs2 tests should::(\Qrun from another test\E|\Qrun from another test 2\E)$' \ + test:Specs2Tests) + local expected=( + "+ run from another test" + "+ run from another test 2") + local unexpected=( + "+ not run") + for method in "${expected[@]}"; do + if ! grep "$method" <<<$output; then + echo "output:" + echo "$output" + echo "Expected $method in output, but was not found." + exit 1 + fi + done + for method in "${unexpected[@]}"; do + if grep "$method" <<<$output; then + echo "output:" + echo "$output" + echo "Not expecting $method in output, but was found." + exit 1 + fi + done +} + +scala_specs2_exception_in_initializer_without_filter(){ + expected_message="org.specs2.control.UserException: cannot create an instance for class scalarules.test.junit.specs2.FailingTest" + test_command="test_expect_failure/scala_junit_test:specs2_failing_test" + + test_expect_failure_with_message "$expected_message" $test_filter $test_command +} + +scala_specs2_exception_in_initializer_terminates_without_timeout(){ + local output=$(bazel test \ + --test_output=streamed \ + --test_timeout=10 \ + '--test_filter=scalarules.test.junit.specs2.FailingTest#' \ + test_expect_failure/scala_junit_test:specs2_failing_test) + local expected=( + "org.specs2.control.UserException: cannot create an instance for class scalarules.test.junit.specs2.FailingTest") + local unexpected=( + "TIMEOUT") + for method in "${expected[@]}"; do + if ! grep "$method" <<<$output; then + echo "output:" + echo "$output" + echo "Expected $method in output, but was not found." + exit 1 + fi + done + for method in "${unexpected[@]}"; do + if grep "$method" <<<$output; then + echo "output:" + echo "$output" + echo "Not expecting $method in output, but was found." + exit 1 + fi + done +} + +scala_specs2_all_tests_show_in_the_xml(){ + bazel test \ + --nocache_test_results \ + --test_output=streamed \ + '--test_filter=scalarules.test.junit.specs2.JunitSpecs2Test#' \ + test:Specs2Tests + matches=$(grep -c -e "testcase name='specs2 tests should::run smoothly in bazel'" -e "testcase name='specs2 tests should::not run smoothly in bazel'" ./bazel-testlogs/test/Specs2Tests/test.xml) + if [ $matches -eq 2 ]; then + return 0 + else + echo "Expecting two results, found a different number ($matches). Please check 'bazel-testlogs/test/Specs2Tests/test.xml'" + return 1 + fi + test -e +} + +scala_specs2_only_filtered_test_shows_in_the_xml(){ + bazel test \ + --nocache_test_results \ + --test_output=streamed \ + '--test_filter=scalarules.test.junit.specs2.JunitSpecs2Test#specs2 tests should::run smoothly in bazel$' \ + test:Specs2Tests + matches=$(grep -c -e "testcase name='specs2 tests should::run smoothly in bazel'" -e "testcase name='specs2 tests should::not run smoothly in bazel'" ./bazel-testlogs/test/Specs2Tests/test.xml) + if [ $matches -eq 1 ]; then + return 0 + else + echo "Expecting only one result, found more than one. 
Please check 'bazel-testlogs/test/Specs2Tests/test.xml'" + return 1 + fi + test -e +} + +scala_specs2_only_failed_test_shows_in_the_xml(){ + set +e + bazel test \ + --nocache_test_results \ + --test_output=streamed \ + '--test_filter=scalarules.test.junit.specs2.SuiteWithOneFailingTest#specs2 tests should::fail$' \ + test_expect_failure/scala_junit_test:specs2_failing_test + echo "got results" + matches=$(grep -c -e "testcase name='specs2 tests should::fail'" -e "testcase name='specs2 tests should::succeed'" ./bazel-testlogs/test_expect_failure/scala_junit_test/specs2_failing_test/test.xml) + if [ $matches -eq 1 ]; then + return 0 + else + echo "Expecting only one result, found more than one. Please check './bazel-testlogs/test_expect_failure/scala_junit_test/specs2_failing_test/test.xml'" + return 1 + fi +} + +$runner scala_specs2_junit_test_test_filter_everything +$runner scala_specs2_junit_test_test_filter_one_test +$runner scala_specs2_junit_test_test_filter_whole_spec +$runner scala_specs2_junit_test_test_filter_exact_match +$runner scala_specs2_junit_test_test_filter_exact_match_unsafe_characters +$runner scala_specs2_junit_test_test_filter_exact_match_escaped_and_sanitized +$runner scala_specs2_junit_test_test_filter_match_multiple_methods +$runner scala_specs2_exception_in_initializer_without_filter +$runner scala_specs2_exception_in_initializer_terminates_without_timeout +$runner scala_specs2_all_tests_show_in_the_xml +$runner scala_specs2_only_filtered_test_shows_in_the_xml +$runner scala_specs2_only_failed_test_shows_in_the_xml diff --git a/test/shell/test_scalac_jvm_flags.sh b/test/shell/test_scalac_jvm_flags.sh new file mode 100755 index 000000000..26098c6d5 --- /dev/null +++ b/test/shell/test_scalac_jvm_flags.sh @@ -0,0 +1,38 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. "${dir}"/test_runner.sh +. 
"${dir}"/test_helper.sh +runner=$(get_test_runner "${1:-local}") + +test_scalac_jvm_flags_on_target_overrides_toolchain_passes() { + bazel build --extra_toolchains="//manual_test/scalac_jvm_opts:failing_scala_toolchain" //manual_test/scalac_jvm_opts:empty_overriding_build +} + +test_scalac_jvm_flags_from_scala_toolchain_passes() { + bazel build --extra_toolchains="//manual_test/scalac_jvm_opts:passing_scala_toolchain" //manual_test/scalac_jvm_opts:empty_build +} + +test_scalac_jvm_flags_from_scala_toolchain_fails() { + action_should_fail build --extra_toolchains="//test_expect_failure/scalac_jvm_opts:failing_scala_toolchain" //test_expect_failure/scalac_jvm_opts:empty_build +} + +test_scalac_jvm_flags_work_with_scalapb() { + bazel build --extra_toolchains="//manual_test/scalac_jvm_opts:passing_scala_toolchain" //manual_test/scalac_jvm_opts:proto +} + +test_scalac_jvm_flags_are_configured(){ + action_should_fail build //test_expect_failure/compilers_jvm_flags:can_configure_jvm_flags_for_scalac +} + +test_scalac_jvm_flags_are_expanded(){ + action_should_fail_with_message \ + "--jvm_flag=test_expect_failure/compilers_jvm_flags/args.txt" \ + build --verbose_failures //test_expect_failure/compilers_jvm_flags:can_expand_jvm_flags_for_scalac +} + +$runner test_scalac_jvm_flags_on_target_overrides_toolchain_passes +$runner test_scalac_jvm_flags_from_scala_toolchain_passes +$runner test_scalac_jvm_flags_from_scala_toolchain_fails +$runner test_scalac_jvm_flags_work_with_scalapb +$runner test_scalac_jvm_flags_are_configured +$runner test_scalac_jvm_flags_are_expanded diff --git a/test/shell/test_toolchain.sh b/test/shell/test_toolchain.sh new file mode 100755 index 000000000..1fa2fc209 --- /dev/null +++ b/test/shell/test_toolchain.sh @@ -0,0 +1,19 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. "${dir}"/test_runner.sh +. "${dir}"/test_helper.sh +runner=$(get_test_runner "${1:-local}") + +test_scalaopts_from_scala_toolchain() { + action_should_fail build --extra_toolchains="//test_expect_failure/scalacopts_from_toolchain:failing_scala_toolchain" //test_expect_failure/scalacopts_from_toolchain:failing_build +} + +java_toolchain_javacopts_are_used(){ + action_should_fail_with_message \ + "invalid flag: -InvalidFlag" \ + build --java_toolchain=//test_expect_failure/compilers_javac_opts:a_java_toolchain \ + --verbose_failures //test_expect_failure/compilers_javac_opts:can_configure_jvm_flags_for_javac_via_javacopts +} + +$runner test_scalaopts_from_scala_toolchain +$runner java_toolchain_javacopts_are_used diff --git a/test/shell/test_unused_dependency.sh b/test/shell/test_unused_dependency.sh new file mode 100755 index 000000000..6e8e1e2ff --- /dev/null +++ b/test/shell/test_unused_dependency.sh @@ -0,0 +1,54 @@ +# shellcheck source=./test_runner.sh +dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +. "${dir}"/test_runner.sh +. 
"${dir}"/test_helper.sh +runner=$(get_test_runner "${1:-local}") + +test_unused_dependency_checker_mode_from_scala_toolchain() { + action_should_fail build --extra_toolchains="//test_expect_failure/unused_dependency_checker:failing_scala_toolchain" //test_expect_failure/unused_dependency_checker:toolchain_failing_build +} + +test_unused_dependency_checker_mode_set_in_rule() { + action_should_fail build //test_expect_failure/unused_dependency_checker:failing_build +} + +test_unused_dependency_checker_mode_override_toolchain() { + bazel build --extra_toolchains="//test_expect_failure/unused_dependency_checker:failing_scala_toolchain" //test_expect_failure/unused_dependency_checker:toolchain_override +} + +test_unused_dependency_checker_mode_warn() { + # this is a hack to invalidate the cache, so that the target actually gets built and outputs warnings. + bazel build \ + --strict_java_deps=warn \ + //test:UnusedDependencyCheckerWarn + + local output + output=$(bazel build \ + --strict_java_deps=off \ + //test:UnusedDependencyCheckerWarn 2>&1 + ) + + if [ $? -ne 0 ]; then + echo "Target with unused dependency failed to build with status $?" + echo "$output" + exit 1 + fi + + local expected="warning: Target '//test:UnusedLib' is specified as a dependency to //test:UnusedDependencyCheckerWarn but isn't used, please remove it from the deps." + + echo "$output" | grep "$expected" + if [ $? -ne 0 ]; then + echo "Expected output:[$output] to contain [$expected]" + exit 1 + fi +} + +test_unused_dependency_fails_even_if_also_exists_in_plus_one_deps() { + action_should_fail build --extra_toolchains="//test_expect_failure/plus_one_deps:plus_one_deps_with_unused_error" //test_expect_failure/plus_one_deps/with_unused_deps:a +} + +$runner test_unused_dependency_checker_mode_from_scala_toolchain +$runner test_unused_dependency_checker_mode_set_in_rule +$runner test_unused_dependency_checker_mode_override_toolchain +$runner test_unused_dependency_checker_mode_warn +$runner test_unused_dependency_fails_even_if_also_exists_in_plus_one_deps diff --git a/test/src/main/scala/scalarules/test/compiler_plugin/BUILD.bazel b/test/src/main/scala/scalarules/test/compiler_plugin/BUILD.bazel index 368ae414c..53db39a9b 100644 --- a/test/src/main/scala/scalarules/test/compiler_plugin/BUILD.bazel +++ b/test/src/main/scala/scalarules/test/compiler_plugin/BUILD.bazel @@ -2,6 +2,7 @@ load("//scala:scala.bzl", "scala_library") scala_library( name = "compiler_plugin", - srcs = [ "KindProjected.scala" ], - plugins = ["@org_spire_math_kind_projector//jar"] -) \ No newline at end of file + srcs = ["KindProjected.scala"], + plugins = ["@org_spire_math_kind_projector//jar"], + visibility = ["//visibility:public"], +) diff --git a/test/src/main/scala/scalarules/test/extra_protobuf_generator/BUILD b/test/src/main/scala/scalarules/test/extra_protobuf_generator/BUILD index c522f5274..01cbc4fb0 100644 --- a/test/src/main/scala/scalarules/test/extra_protobuf_generator/BUILD +++ b/test/src/main/scala/scalarules/test/extra_protobuf_generator/BUILD @@ -3,13 +3,10 @@ load("//scala:scala.bzl", "scala_library") scala_library( name = "extra_protobuf_generator", srcs = ["ExtraProtobufGenerator.scala"], + visibility = ["//visibility:public"], deps = [ - "//external:io_bazel_rules_scala/dependency/proto/protoc_bridge", + "//external:io_bazel_rules_scala/dependency/com_google_protobuf/protobuf_java", + "//external:io_bazel_rules_scala/dependency/proto/protoc_bridge", "//external:io_bazel_rules_scala/dependency/proto/scalapb_plugin", - 
"//external:io_bazel_rules_scala/dependency/com_google_protobuf/protobuf_java", - ], - visibility = ["//visibility:public"], - - ) diff --git a/test/src/main/scala/scalarules/test/fetch_sources/BUILD b/test/src/main/scala/scalarules/test/fetch_sources/BUILD index bf90d3f43..8eb33b5f2 100644 --- a/test/src/main/scala/scalarules/test/fetch_sources/BUILD +++ b/test/src/main/scala/scalarules/test/fetch_sources/BUILD @@ -2,6 +2,6 @@ load("//scala:scala.bzl", "scala_library") scala_library( name = "fetch_sources", - srcs = [ "FetchSources.scala" ], - deps = [ "@com_google_guava_guava_21_0//jar" ] + srcs = ["FetchSources.scala"], + deps = ["@com_google_guava_guava_21_0//jar"], ) diff --git a/test/src/main/scala/scalarules/test/ijar/BUILD b/test/src/main/scala/scalarules/test/ijar/BUILD index 2b12aab5f..558e2138a 100644 --- a/test/src/main/scala/scalarules/test/ijar/BUILD +++ b/test/src/main/scala/scalarules/test/ijar/BUILD @@ -1,6 +1,6 @@ package(default_visibility = ["//visibility:public"]) -load("//scala:scala.bzl", "scala_library", "scala_test", "scala_binary") +load("//scala:scala.bzl", "scala_binary", "scala_library", "scala_test") scala_library( name = "user", diff --git a/test/src/main/scala/scalarules/test/scala_import/BUILD b/test/src/main/scala/scalarules/test/scala_import/BUILD index 1d46bb0e3..0ffc07d7c 100644 --- a/test/src/main/scala/scalarules/test/scala_import/BUILD +++ b/test/src/main/scala/scalarules/test/scala_import/BUILD @@ -5,8 +5,8 @@ load("//scala:scala_import.bzl", "scala_import") scala_import( name = "guava_and_commons_lang", jars = [ - "@com_google_guava_guava_21_0_with_file//jar:file", - "@org_apache_commons_commons_lang_3_5//jar:file", + "@com_google_guava_guava_21_0_with_file//:guava-21.0.jar", + "@org_apache_commons_commons_lang_3_5//:commons-lang3-3.5.jar", ], ) @@ -41,23 +41,19 @@ scala_specs2_junit_test( deps = [":relate"], ) -#filter source jars -scala_import( - name = "cats", - jars = ["@org_typelevel__cats_core//jar:file"], -) - scala_library( name = "source_jar_not_oncp", srcs = ["ReferCatsImplicits.scala"], - deps = [":cats"], + # jvm_maven_import_external doesn't fetch source jars automatically + deps = ["@org_typelevel__cats_core//jar"], ) ##Runtime deps scala_import( name = "indirection_for_transitive_runtime_deps", jars = [], - runtime_deps = [":cats"], + # jvm_maven_import_external doesn't fetch source jars automatically + deps = ["@org_typelevel__cats_core//jar"], ) scala_import( @@ -80,8 +76,8 @@ scala_specs2_junit_test( java_import( name = "guava_and_commons_lang_java_import", jars = [ - "@com_google_guava_guava_21_0_with_file//jar:file", - "@org_apache_commons_commons_lang_3_5//jar:file", + "@com_google_guava_guava_21_0_with_file//:guava-21.0.jar", + "@org_apache_commons_commons_lang_3_5//:commons-lang3-3.5.jar", ], ) diff --git a/test/src/main/scala/scalarules/test/scala_import/nl/BUILD.bazel b/test/src/main/scala/scalarules/test/scala_import/nl/BUILD.bazel index 8226a65e6..7b42b7dd6 100644 --- a/test/src/main/scala/scalarules/test/scala_import/nl/BUILD.bazel +++ b/test/src/main/scala/scalarules/test/scala_import/nl/BUILD.bazel @@ -4,7 +4,7 @@ load("//scala:scala_import.bzl", "scala_import") scala_import( name = "scala_import_never_link", jars = [ - "scala_import_never_link.jar" + "scala_import_never_link.jar", ], neverlink = 1, ) diff --git a/test/src/main/scala/scalarules/test/scripts/BUILD b/test/src/main/scala/scalarules/test/scripts/BUILD new file mode 100644 index 000000000..ce0778ee3 --- /dev/null +++ 
b/test/src/main/scala/scalarules/test/scripts/BUILD @@ -0,0 +1,10 @@ +load("//scala:scala.bzl", "scala_library", "scala_specs2_junit_test") +load("//scala:scala_import.bzl", "scala_import") + +scala_specs2_junit_test( + name = "pb_generate_request_test", + size = "small", + srcs = ["PBGenerateRequestTest.scala"], + suffixes = ["Test"], + deps = ["//src/scala/scripts:scala_proto_request_extractor"], +) diff --git a/test/src/main/scala/scalarules/test/scripts/PBGenerateRequestTest.scala b/test/src/main/scala/scalarules/test/scripts/PBGenerateRequestTest.scala new file mode 100644 index 000000000..87a37e569 --- /dev/null +++ b/test/src/main/scala/scalarules/test/scripts/PBGenerateRequestTest.scala @@ -0,0 +1,21 @@ +package scalarules.test.scripts + +import java.nio.file.Paths +import scripts.PBGenerateRequest +import org.specs2.mutable.SpecWithJUnit + +class PBGenerateRequestTest extends SpecWithJUnit { + "fixTransitiveProtoPath should fix path when included proto is available, ignore otherwise" >> { + val includedProtos = List(Paths.get("a/b/c") -> Paths.get("a/b/c/d/e/f.proto")) + Seq("d/e", "x/y/z").map(PBGenerateRequest.fixTransitiveProtoPath(includedProtos)) must + beEqualTo(Seq("a/b/c/d/e", "x/y/z")) + } + + "actual case observed in builds" >> { + val includedProtos = List( + Paths.get("bazel-out/k8-fastbuild/bin") -> + Paths.get("bazel-out/k8-fastbuild/bin/external/com_google_protobuf/google/protobuf/source_context.proto")) + Seq("external/com_google_protobuf").map(PBGenerateRequest.fixTransitiveProtoPath(includedProtos)) must + beEqualTo(Seq("bazel-out/k8-fastbuild/bin/external/com_google_protobuf")) + } +} \ No newline at end of file diff --git a/test/src/main/scala/scalarules/test/strict_deps/no_recompilation/BUILD b/test/src/main/scala/scalarules/test/strict_deps/no_recompilation/BUILD index b997b9ccf..def4d3dfa 100644 --- a/test/src/main/scala/scalarules/test/strict_deps/no_recompilation/BUILD +++ b/test/src/main/scala/scalarules/test/strict_deps/no_recompilation/BUILD @@ -1,6 +1,6 @@ package(default_visibility = ["//visibility:public"]) -load("//scala:scala.bzl", "scala_library", "scala_test", "scala_binary") +load("//scala:scala.bzl", "scala_binary", "scala_library", "scala_test") scala_library( name = "transitive_dependency_user", diff --git a/test/src/main/scala/scalarules/test/twitter_scrooge/twitter_scrooge_test.bzl b/test/src/main/scala/scalarules/test/twitter_scrooge/twitter_scrooge_test.bzl index 495ae330e..8432659fc 100644 --- a/test/src/main/scala/scalarules/test/twitter_scrooge/twitter_scrooge_test.bzl +++ b/test/src/main/scala/scalarules/test/twitter_scrooge/twitter_scrooge_test.bzl @@ -11,16 +11,16 @@ def _scrooge_transitive_outputs(ctx): "thrift_scrooge.jar", "thrift2_a_scrooge.jar", "thrift2_b_scrooge.jar", - "thrift3_scrooge.jar" + "thrift3_scrooge.jar", ]), - depset([out.class_jar.basename for out in ctx.attr.dep[JavaInfo].outputs.jars]) + depset([out.class_jar.basename for out in ctx.attr.dep[JavaInfo].outputs.jars]), ) - unittest.end(env) + return unittest.end(env) scrooge_transitive_outputs_test = unittest.make( _scrooge_transitive_outputs, - attrs = {"dep": attr.label()} + attrs = {"dep": attr.label()}, ) def test_scrooge_provides_transitive_jars(): @@ -28,15 +28,15 @@ def test_scrooge_provides_transitive_jars(): # All associated jars must be included in the outputs for IntelliJ resolution to function correctly. 
scrooge_transitive_outputs_test( name = "transitive_scrooge_test", - dep = ":scrooge1" + dep = ":scrooge1", ) def twitter_scrooge_test_suite(): - test_scrooge_provides_transitive_jars() + test_scrooge_provides_transitive_jars() - native.test_suite( - name = "twitter_scrooge_tests", - tests = [ - ":transitive_scrooge_test", - ], - ) \ No newline at end of file + native.test_suite( + name = "twitter_scrooge_tests", + tests = [ + ":transitive_scrooge_test", + ], + ) diff --git a/test_expect_failure/disappearing_class/BUILD b/test_expect_failure/disappearing_class/BUILD index bcee04a4d..3ec671142 100644 --- a/test_expect_failure/disappearing_class/BUILD +++ b/test_expect_failure/disappearing_class/BUILD @@ -1,4 +1,4 @@ -load("//scala:scala.bzl", "scala_binary", "scala_library", "scala_test", "scala_macro_library") +load("//scala:scala.bzl", "scala_binary", "scala_library", "scala_macro_library", "scala_test") scala_library( name = "uses_class", diff --git a/test_expect_failure/missing_direct_deps/internal_deps/BUILD b/test_expect_failure/missing_direct_deps/internal_deps/BUILD index b249991d3..de6053f4f 100644 --- a/test_expect_failure/missing_direct_deps/internal_deps/BUILD +++ b/test_expect_failure/missing_direct_deps/internal_deps/BUILD @@ -1,6 +1,6 @@ package(default_visibility = ["//visibility:public"]) -load("//scala:scala.bzl", "scala_library", "scala_test", "scala_binary") +load("//scala:scala.bzl", "scala_binary", "scala_library", "scala_test") load(":custom-jvm-rule.bzl", "custom_jvm") scala_library( diff --git a/test_expect_failure/missing_direct_deps/internal_deps/custom-jvm-rule.bzl b/test_expect_failure/missing_direct_deps/internal_deps/custom-jvm-rule.bzl index bd4b553db..b66da1080 100644 --- a/test_expect_failure/missing_direct_deps/internal_deps/custom-jvm-rule.bzl +++ b/test_expect_failure/missing_direct_deps/internal_deps/custom-jvm-rule.bzl @@ -1,23 +1,21 @@ +#This rule is an example for a jvm rule that doesn't support Jars2Labels def _custom_jvm_impl(ctx): - print(ctx.label) - transitive_compile_jars = _collect(ctx.attr.deps) - return struct( - providers = [ - java_common.create_provider( - transitive_compile_time_jars = transitive_compile_jars, - ), - ], + # TODO(#8867): Migrate away from the placeholder jar hack when #8867 is fixed. 
+ jar = ctx.file._placeholder_jar + provider = JavaInfo( + output_jar = jar, + compile_jar = jar, + deps = [target[JavaInfo] for target in ctx.attr.deps], ) - -def _collect(deps): - transitive_compile_jars = depset() - for dep_target in deps: - transitive_compile_jars += dep_target[JavaInfo].transitive_compile_time_jars - return transitive_compile_jars + return [provider] custom_jvm = rule( implementation = _custom_jvm_impl, attrs = { "deps": attr.label_list(), + "_placeholder_jar": attr.label( + allow_single_file = True, + default = Label("@io_bazel_rules_scala//scala:libPlaceHolderClassToCreateEmptyJarForScalaImport.jar"), + ), }, ) diff --git a/test_expect_failure/plus_one_deps/BUILD.bazel b/test_expect_failure/plus_one_deps/BUILD.bazel index 308a5193b..5dd9d2e1f 100644 --- a/test_expect_failure/plus_one_deps/BUILD.bazel +++ b/test_expect_failure/plus_one_deps/BUILD.bazel @@ -1,25 +1,28 @@ load("//scala:scala_toolchain.bzl", "scala_toolchain") + scala_toolchain( name = "plus_one_deps_impl", plus_one_deps_mode = "on", visibility = ["//visibility:public"], ) + toolchain( name = "plus_one_deps", toolchain = "plus_one_deps_impl", toolchain_type = "@io_bazel_rules_scala//scala:toolchain_type", visibility = ["//visibility:public"], ) + scala_toolchain( name = "plus_one_deps_with_unused_error_impl", - unused_dependency_checker_mode = "error", plus_one_deps_mode = "on", + unused_dependency_checker_mode = "error", visibility = ["//visibility:public"], ) + toolchain( name = "plus_one_deps_with_unused_error", toolchain = "plus_one_deps_with_unused_error_impl", toolchain_type = "@io_bazel_rules_scala//scala:toolchain_type", visibility = ["//visibility:public"], ) - diff --git a/test_expect_failure/plus_one_deps/exports_deps/A.scala b/test_expect_failure/plus_one_deps/deps_of_exports/A.scala similarity index 100% rename from test_expect_failure/plus_one_deps/exports_deps/A.scala rename to test_expect_failure/plus_one_deps/deps_of_exports/A.scala diff --git a/test_expect_failure/plus_one_deps/exports_deps/B.scala b/test_expect_failure/plus_one_deps/deps_of_exports/B.scala similarity index 100% rename from test_expect_failure/plus_one_deps/exports_deps/B.scala rename to test_expect_failure/plus_one_deps/deps_of_exports/B.scala diff --git a/test_expect_failure/plus_one_deps/deps_of_exports/BUILD.bazel b/test_expect_failure/plus_one_deps/deps_of_exports/BUILD.bazel new file mode 100644 index 000000000..d4205e98f --- /dev/null +++ b/test_expect_failure/plus_one_deps/deps_of_exports/BUILD.bazel @@ -0,0 +1,40 @@ +load("//scala:scala.bzl", "scala_library") +#Make sure that plus-one-deps from exports of direct deps also propagate + +#example with target only in exports +scala_library( + name = "test_target_using_facade", + srcs = ["A.scala"], + deps = [":facade"], +) + +scala_library( + name = "facade", + exports = [":direct_dep"], +) + +#example with target in deps & exports +scala_library( + name = "test_target", + srcs = ["A.scala"], + deps = [":direct_dep"], +) + +scala_library( + name = "direct_dep", + srcs = ["B.scala"], + exports = [":exported_dep"], + deps = [":exported_dep"], +) + +#common +scala_library( + name = "exported_dep", + srcs = ["C.scala"], + deps = [":plus_one_dep_of_exported"], +) + +scala_library( + name = "plus_one_dep_of_exported", + srcs = ["D.scala"], +) diff --git a/test_expect_failure/plus_one_deps/exports_deps/C.scala b/test_expect_failure/plus_one_deps/deps_of_exports/C.scala similarity index 100% rename from test_expect_failure/plus_one_deps/exports_deps/C.scala rename to 
test_expect_failure/plus_one_deps/deps_of_exports/C.scala diff --git a/test_expect_failure/plus_one_deps/exports_deps/D.scala b/test_expect_failure/plus_one_deps/deps_of_exports/D.scala similarity index 100% rename from test_expect_failure/plus_one_deps/exports_deps/D.scala rename to test_expect_failure/plus_one_deps/deps_of_exports/D.scala diff --git a/test_expect_failure/plus_one_deps/exports_deps/BUILD.bazel b/test_expect_failure/plus_one_deps/exports_deps/BUILD.bazel deleted file mode 100644 index d1628eb99..000000000 --- a/test_expect_failure/plus_one_deps/exports_deps/BUILD.bazel +++ /dev/null @@ -1,21 +0,0 @@ -load("//scala:scala.bzl", "scala_library") -scala_library( - name = "a", - srcs = ["A.scala"], - deps = [":b"], -) -scala_library( - name = "b", - srcs = ["B.scala"], - deps = [":c"], -) -scala_library( - name = "c", - srcs = ["C.scala"], - deps = [":d"], - exports = ["d"], -) -scala_library( - name = "d", - srcs = ["D.scala"], -) diff --git a/test_expect_failure/plus_one_deps/exports_of_deps/A.scala b/test_expect_failure/plus_one_deps/exports_of_deps/A.scala new file mode 100644 index 000000000..6886de430 --- /dev/null +++ b/test_expect_failure/plus_one_deps/exports_of_deps/A.scala @@ -0,0 +1,5 @@ +package scalarules.test_expect_failure.plus_one_deps.internal_deps + +class A { + println(new B().hi) +} diff --git a/test_expect_failure/plus_one_deps/exports_of_deps/B.scala b/test_expect_failure/plus_one_deps/exports_of_deps/B.scala new file mode 100644 index 000000000..dc3fd5165 --- /dev/null +++ b/test_expect_failure/plus_one_deps/exports_of_deps/B.scala @@ -0,0 +1,5 @@ +package scalarules.test_expect_failure.plus_one_deps.internal_deps + +class B extends C { + def hi: String = "hi" +} \ No newline at end of file diff --git a/test_expect_failure/plus_one_deps/exports_of_deps/BUILD.bazel b/test_expect_failure/plus_one_deps/exports_of_deps/BUILD.bazel new file mode 100644 index 000000000..9a96fc34d --- /dev/null +++ b/test_expect_failure/plus_one_deps/exports_of_deps/BUILD.bazel @@ -0,0 +1,26 @@ +load("//scala:scala.bzl", "scala_library") + +#Make sure that plus-one-deps exported targets are also propagated (in addition to the plus-one-dep outputs) +scala_library( + name = "test_target", + srcs = ["A.scala"], + deps = [":direct_dep"], +) + +scala_library( + name = "direct_dep", + srcs = ["B.scala"], + deps = [":plus_one_dep"], +) + +scala_library( + name = "plus_one_dep", + srcs = ["C.scala"], + exports = ["exported_dep"], + deps = [":exported_dep"], +) + +scala_library( + name = "exported_dep", + srcs = ["D.scala"], +) diff --git a/test_expect_failure/plus_one_deps/exports_of_deps/C.scala b/test_expect_failure/plus_one_deps/exports_of_deps/C.scala new file mode 100644 index 000000000..942f31db0 --- /dev/null +++ b/test_expect_failure/plus_one_deps/exports_of_deps/C.scala @@ -0,0 +1,3 @@ +package scalarules.test_expect_failure.plus_one_deps.internal_deps + +class C extends D \ No newline at end of file diff --git a/test_expect_failure/plus_one_deps/exports_of_deps/D.scala b/test_expect_failure/plus_one_deps/exports_of_deps/D.scala new file mode 100644 index 000000000..c74bf6530 --- /dev/null +++ b/test_expect_failure/plus_one_deps/exports_of_deps/D.scala @@ -0,0 +1,5 @@ +package scalarules.test_expect_failure.plus_one_deps.internal_deps + +class D { + +} diff --git a/test_expect_failure/plus_one_deps/external_deps/BUILD.bazel b/test_expect_failure/plus_one_deps/external_deps/BUILD.bazel index ebfebfe63..42d964bb9 100644 --- 
a/test_expect_failure/plus_one_deps/external_deps/BUILD.bazel +++ b/test_expect_failure/plus_one_deps/external_deps/BUILD.bazel @@ -1,7 +1,7 @@ load("//scala:scala.bzl", "scala_library") + scala_library( name = "a", srcs = ["A.scala"], - deps = ["@org_springframework_spring_tx"] - -) \ No newline at end of file + deps = ["@org_springframework_spring_tx"], +) diff --git a/test_expect_failure/plus_one_deps/internal_deps/BUILD.bazel b/test_expect_failure/plus_one_deps/internal_deps/BUILD.bazel index a8a7e6026..1a2eead95 100644 --- a/test_expect_failure/plus_one_deps/internal_deps/BUILD.bazel +++ b/test_expect_failure/plus_one_deps/internal_deps/BUILD.bazel @@ -1,14 +1,17 @@ load("//scala:scala.bzl", "scala_library") + scala_library( name = "a", srcs = ["A.scala"], deps = [":b"], ) + scala_library( name = "b", srcs = ["B.scala"], deps = [":c"], ) + scala_library( name = "c", srcs = ["C.scala"], diff --git a/test_expect_failure/plus_one_deps/with_unused_deps/BUILD.bazel b/test_expect_failure/plus_one_deps/with_unused_deps/BUILD.bazel index 12eb9834f..f7f1fd861 100644 --- a/test_expect_failure/plus_one_deps/with_unused_deps/BUILD.bazel +++ b/test_expect_failure/plus_one_deps/with_unused_deps/BUILD.bazel @@ -1,14 +1,20 @@ load("//scala:scala.bzl", "scala_library") + scala_library( name = "a", srcs = ["A.scala"], - deps = [":b",":c"], + deps = [ + ":b", + ":c", + ], ) + scala_library( name = "b", srcs = ["B.scala"], deps = [":c"], ) + scala_library( name = "c", srcs = ["C.scala"], diff --git a/test_expect_failure/proto_source_root/dependency/BUILD b/test_expect_failure/proto_source_root/dependency/BUILD index 5b922f2da..37ca7370a 100644 --- a/test_expect_failure/proto_source_root/dependency/BUILD +++ b/test_expect_failure/proto_source_root/dependency/BUILD @@ -1,6 +1,6 @@ proto_library( name = "dependency", srcs = glob(["*.proto"]), - proto_source_root = package_name(), + strip_import_prefix = "", visibility = ["//visibility:public"], ) diff --git a/test_expect_failure/proto_source_root/user/BUILD b/test_expect_failure/proto_source_root/user/BUILD index fae72ff1c..4431a2876 100644 --- a/test_expect_failure/proto_source_root/user/BUILD +++ b/test_expect_failure/proto_source_root/user/BUILD @@ -1,16 +1,16 @@ load( "//scala_proto:scala_proto.bzl", - "scalapb_proto_library", + "scala_proto_library", ) proto_library( name = "user", srcs = glob(["*.proto"]), - proto_source_root = package_name(), + strip_import_prefix = "", deps = ["//test_expect_failure/proto_source_root/dependency"], ) -scalapb_proto_library( +scala_proto_library( name = "user_scala", visibility = ["//visibility:public"], deps = [":user"], diff --git a/test_expect_failure/scala_import/BUILD b/test_expect_failure/scala_import/BUILD index 49fa45764..b1d5564ea 100644 --- a/test_expect_failure/scala_import/BUILD +++ b/test_expect_failure/scala_import/BUILD @@ -6,18 +6,18 @@ load("//scala:scala_import.bzl", "scala_import") scala_import( name = "dummy_dependency_to_trigger_create_provider_transitive_compile_jar_usage", - jars = ["@org_psywerx_hairyfotr__linter//jar:file"], + jars = ["@org_psywerx_hairyfotr__linter//jar"], ) scala_import( name = "guava", - jars = ["@com_google_guava_guava_21_0_with_file//jar:file"], + jars = ["@com_google_guava_guava_21_0_with_file//jar"], deps = [":dummy_dependency_to_trigger_create_provider_transitive_compile_jar_usage"], ) scala_import( name = "cats", - jars = ["@org_typelevel__cats_core//jar:file"], + jars = ["@org_typelevel__cats_core//jar"], ) scala_import( @@ -28,7 +28,7 @@ scala_import( scala_import( 
name = "commons_lang_as_imported_jar_cats_and_guava_as_compile_deps", - jars = ["@org_apache_commons_commons_lang_3_5//jar:file"], + jars = ["@org_apache_commons_commons_lang_3_5//jar"], deps = [ ":guava", ":indirection_for_transitive_compile_deps", diff --git a/test_expect_failure/scala_junit_test/BUILD b/test_expect_failure/scala_junit_test/BUILD index 2f91acad5..5e2443ea7 100644 --- a/test_expect_failure/scala_junit_test/BUILD +++ b/test_expect_failure/scala_junit_test/BUILD @@ -23,6 +23,9 @@ scala_junit_test( scala_specs2_junit_test( name = "specs2_failing_test", size = "small", - srcs = ["specs2/FailingTest.scala"], + srcs = [ + "specs2/FailingTest.scala", + "specs2/SuiteWithOneFailingTest.scala", + ], suffixes = ["Test"], ) diff --git a/test_expect_failure/scala_junit_test/specs2/SuiteWithOneFailingTest.scala b/test_expect_failure/scala_junit_test/specs2/SuiteWithOneFailingTest.scala new file mode 100644 index 000000000..bef083903 --- /dev/null +++ b/test_expect_failure/scala_junit_test/specs2/SuiteWithOneFailingTest.scala @@ -0,0 +1,15 @@ +package scalarules.test.junit.specs2 + +import org.specs2.mutable.SpecWithJUnit + +class SuiteWithOneFailingTest extends SpecWithJUnit { + + "specs2 tests" should { + "succeed" >> success + "fail" >> failure("boom") + } + + "some other suite" should { + "do stuff" >> success + } +} diff --git a/test_expect_failure/scala_test_jvm_flags/BUILD b/test_expect_failure/scala_test_jvm_flags/BUILD new file mode 100644 index 000000000..b983e40e4 --- /dev/null +++ b/test_expect_failure/scala_test_jvm_flags/BUILD @@ -0,0 +1,35 @@ +load("//scala:scala_toolchain.bzl", "scala_toolchain") +load("//scala:scala.bzl", "scala_test") + +scala_toolchain( + name = "failing_toolchain_impl", + # This will fail because 1M isn't enough + scala_test_jvm_flags = ["-Xmx1M"], + visibility = ["//visibility:public"], +) + +scala_toolchain( + name = "passing_toolchain_impl", + # This will pass because 1G is enough + scala_test_jvm_flags = ["-Xmx1G"], + visibility = ["//visibility:public"], +) + +toolchain( + name = "failing_scala_toolchain", + toolchain = "failing_toolchain_impl", + toolchain_type = "@io_bazel_rules_scala//scala:toolchain_type", + visibility = ["//visibility:public"], +) + +toolchain( + name = "passing_scala_toolchain", + toolchain = "passing_toolchain_impl", + toolchain_type = "@io_bazel_rules_scala//scala:toolchain_type", + visibility = ["//visibility:public"], +) + +scala_test( + name = "empty_test", + srcs = ["EmptyTest.scala"], +) diff --git a/test_expect_failure/scala_test_jvm_flags/EmptyTest.scala b/test_expect_failure/scala_test_jvm_flags/EmptyTest.scala new file mode 100644 index 000000000..d1fbfc7a0 --- /dev/null +++ b/test_expect_failure/scala_test_jvm_flags/EmptyTest.scala @@ -0,0 +1,9 @@ +package test_expect_failure.scala_test_jvm_flags + +import org.scalatest.FunSuite + +class EmptyTest extends FunSuite { + test("empty test") { + assert(true) + } +} \ No newline at end of file diff --git a/test_expect_failure/scalac_jvm_opts/BUILD b/test_expect_failure/scalac_jvm_opts/BUILD new file mode 100644 index 000000000..8fb8409be --- /dev/null +++ b/test_expect_failure/scalac_jvm_opts/BUILD @@ -0,0 +1,25 @@ +load("//scala:scala_toolchain.bzl", "scala_toolchain") +load("//scala:scala.bzl", "scala_library") +load( + "//scala_proto:scala_proto.bzl", + "scala_proto_library", +) + +scala_toolchain( + name = "failing_toolchain_impl", + # This will fail because 1M isn't enough + scalac_jvm_flags = ["-Xmx1M"], + visibility = ["//visibility:public"], +) + +toolchain( 
+ name = "failing_scala_toolchain", + toolchain = "failing_toolchain_impl", + toolchain_type = "@io_bazel_rules_scala//scala:toolchain_type", + visibility = ["//visibility:public"], +) + +scala_library( + name = "empty_build", + srcs = ["Empty.scala"], +) diff --git a/test_expect_failure/scalac_jvm_opts/Empty.scala b/test_expect_failure/scalac_jvm_opts/Empty.scala new file mode 100644 index 000000000..691dbdd9b --- /dev/null +++ b/test_expect_failure/scalac_jvm_opts/Empty.scala @@ -0,0 +1,3 @@ +package test_expect_failure.scalac_jvm_opts + +class Empty \ No newline at end of file diff --git a/test_expect_failure/transitive/java_to_scala/BUILD b/test_expect_failure/transitive/java_to_scala/BUILD index cf77eab4e..73d288846 100644 --- a/test_expect_failure/transitive/java_to_scala/BUILD +++ b/test_expect_failure/transitive/java_to_scala/BUILD @@ -1,4 +1,4 @@ -load("//scala:scala.bzl", "scala_library", "scala_export_to_java") +load("//scala:scala.bzl", "scala_export_to_java", "scala_library") scala_library( name = "a", diff --git a/test_lint.sh b/test_lint.sh index 46a9a2530..d7de5ad9e 100755 --- a/test_lint.sh +++ b/test_lint.sh @@ -2,4 +2,4 @@ set -eou pipefail -FMT_SKYLINT=false ./lint.sh check +./tools/bazel run //tools:buildifier@check diff --git a/test_reproducibility.ps1 b/test_reproducibility.ps1 deleted file mode 100644 index e70786fc9..000000000 --- a/test_reproducibility.ps1 +++ /dev/null @@ -1,3 +0,0 @@ -Set-StrictMode -Version latest -$ErrorActionPreference = 'Stop' - diff --git a/test_reproducibility.sh b/test_reproducibility.sh index 5d866dbd4..2220d2f82 100755 --- a/test_reproducibility.sh +++ b/test_reproducibility.sh @@ -2,6 +2,11 @@ set -e +if ! bazel_loc="$(type -p 'bazel')" || [[ -z "$bazel_loc" ]]; then + export PATH="$(cd "$(dirname "$0")"; pwd)"/tools:$PATH + echo 'Using ./tools/bazel directly for bazel calls' +fi + md5_util() { if [[ "$OSTYPE" == "darwin"* ]]; then _md5_util="md5" @@ -12,7 +17,7 @@ md5_util() { } non_deploy_jar_md5_sum() { - find bazel-bin/test -name "*.jar" ! -name "*_deploy.jar" | xargs -n 1 -P 5 $(md5_util) | sort + find bazel-bin/test -name "*.jar" ! -name "*_deploy.jar" ! -path 'bazel-bin/test/jmh/*' | xargs -n 1 -P 5 $(md5_util) | sort } test_build_is_identical() { @@ -25,9 +30,9 @@ test_build_is_identical() { diff hash1 hash2 } -dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +test_dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )/test/shell # shellcheck source=./test_runner.sh -. "${dir}"/test_runner.sh +. "${test_dir}"/test_runner.sh runner=$(get_test_runner "${1:-local}") diff --git a/test_rules_scala.ps1 b/test_rules_scala.ps1 deleted file mode 100644 index 29bfc43bc..000000000 --- a/test_rules_scala.ps1 +++ /dev/null @@ -1,36 +0,0 @@ -#!/usr/bin/env pwsh - -Set-StrictMode -Version latest -$ErrorActionPreference = 'Stop' - -$env:JAVA_HOME='c:\\java8' - -function bazel() { - Write-Output ">> bazel $args" - $global:lastexitcode = 0 - $backupErrorActionPreference = $script:ErrorActionPreference - $script:ErrorActionPreference = "Continue" - & bazel.exe @args 2>&1 | %{ "$_" } - $script:ErrorActionPreference = $backupErrorActionPreference - if ($global:lastexitcode -ne 0 -And $args[0] -ne "shutdown") { - Write-Output "<< bazel $args (failed, exit code: $global:lastexitcode)" - throw ("Bazel returned non-zero exit code: $global:lastexitcode") - } - Write-Output "<< bazel $args (ok)" -} - -bazel build //test/... 
-bazel shutdown - -bazel test ` - //test:HelloLibTest ` - //test:HelloLibTestSuite_test_suite_HelloLibTest.scala ` - //test:HelloLibTestSuite_test_suite_HelloLibTest2.scala ` - //test:LongTestSuiteNamesSuite ` - //test:TestFilterTests ` - //test:no_sig ` - //test/aspect:aspect_test ` - //test/aspect:scala_test ` - //test/proto:test_blacklisted_proto ` - //test/src/main/scala/scalarules/test/resource_jars:resource_jars ` - //test/src/main/scala/scalarules/test/twitter_scrooge/... diff --git a/test_rules_scala.sh b/test_rules_scala.sh index 0164e9659..e8456fabf 100755 --- a/test_rules_scala.sh +++ b/test_rules_scala.sh @@ -2,948 +2,14 @@ set -e -test_disappearing_class() { - git checkout test_expect_failure/disappearing_class/ClassProvider.scala - bazel build test_expect_failure/disappearing_class:uses_class - echo -e "package scalarules.test\n\nobject BackgroundNoise{}" > test_expect_failure/disappearing_class/ClassProvider.scala - set +e - bazel build test_expect_failure/disappearing_class:uses_class - RET=$? - git checkout test_expect_failure/disappearing_class/ClassProvider.scala - if [ $RET -eq 0 ]; then - echo "Class caching at play. This should fail" - exit 1 - fi - set -e -} +if ! bazel_loc="$(type -p 'bazel')" || [[ -z "$bazel_loc" ]]; then + export PATH="$(cd "$(dirname "$0")"; pwd)"/tools:$PATH + echo 'Using ./tools/bazel directly for bazel calls' +fi -test_transitive_deps() { - set +e - - bazel build test_expect_failure/transitive/scala_to_scala:d - if [ $? -eq 0 ]; then - echo "'bazel build test_expect_failure/transitive/scala_to_scala:d' should have failed." - exit 1 - fi - - bazel build test_expect_failure/transitive/java_to_scala:d - if [ $? -eq 0 ]; then - echo "'bazel build test_expect_failure/transitive/java_to_scala:d' should have failed." - exit 1 - fi - - bazel build test_expect_failure/transitive/scala_to_java:d - if [ $? -eq 0 ]; then - echo "'bazel build test_transitive_deps/scala_to_java:d' should have failed." - exit 1 - fi - - set -e - exit 0 -} - -test_override_javabin() { - # set the JAVABIN to nonsense - JAVABIN=/etc/basdf action_should_fail run test:ScalaBinary -} - -test_scala_library_suite() { - action_should_fail build test_expect_failure/scala_library_suite:library_suite_dep_on_children -} - -test_expect_failure_with_message() { - set +e - - expected_message=$1 - test_filter=$2 - test_command=$3 - - command="bazel test --nocache_test_results --test_output=streamed ${test_filter} ${test_command}" - output=$(${command} 2>&1) - - echo ${output} | grep "$expected_message" - if [ $? -ne 0 ]; then - echo "'bazel test ${test_command}' should have logged \"${expected_message}\"." - exit 1 - fi - if [ "${additional_expected_message}" != "" ]; then - echo ${output} | grep "$additional_expected_message" - if [ $? -ne 0 ]; then - echo "'bazel test ${test_command}' should have logged \"${additional_expected_message}\"." - exit 1 - fi - fi - - set -e -} - -test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message() { - set +e - - expected_message=$1 - test_target=$2 - strict_deps_mode=$3 - operator=${4:-"eq"} - additional_expected_message=${5:-""} - - if [ "${operator}" = "eq" ]; then - error_message="bazel build of scala_library with missing direct deps should have failed." - else - error_message="bazel build of scala_library with missing direct deps should not have failed." - fi - - command="bazel build ${test_target} ${strict_deps_mode}" - - output=$(${command} 2>&1) - status_code=$? 
- - echo "$output" - if [ ${status_code} -${operator} 0 ]; then - echo ${error_message} - exit 1 - fi - - echo ${output} | grep "$expected_message" - if [ $? -ne 0 ]; then - echo "'bazel build ${test_target}' should have logged \"${expected_message}\"." - exit 1 - fi - if [ "${additional_expected_message}" != "" ]; then - echo ${output} | grep "$additional_expected_message" - if [ $? -ne 0 ]; then - echo "'bazel build ${test_target}' should have logged \"${additional_expected_message}\"." - exit 1 - fi - fi - - set -e -} - -test_scala_library_expect_failure_on_missing_direct_deps_strict_is_disabled_by_default() { - expected_message="not found: value C" - test_target='test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency_user' - - test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message "$expected_message" $test_target "" -} - -test_scala_library_expect_failure_on_missing_direct_deps() { - dependenecy_target=$1 - test_target=$2 - - local expected_message="buildozer 'add deps $dependenecy_target' //$test_target" - - test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message "${expected_message}" $test_target "--strict_java_deps=error" -} - -test_scala_library_expect_failure_on_missing_direct_internal_deps() { - dependenecy_target='//test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency' - test_target='test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency_user' - - test_scala_library_expect_failure_on_missing_direct_deps $dependenecy_target $test_target -} - -test_scala_binary_expect_failure_on_missing_direct_deps() { - dependency_target='//test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency' - test_target='test_expect_failure/missing_direct_deps/internal_deps:user_binary' - - test_scala_library_expect_failure_on_missing_direct_deps ${dependency_target} ${test_target} -} - -test_scala_binary_expect_failure_on_missing_direct_deps_located_in_dependency_which_is_scala_binary() { - dependency_target='//test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency' - test_target='test_expect_failure/missing_direct_deps/internal_deps:binary_user_of_binary' - - test_scala_library_expect_failure_on_missing_direct_deps ${dependency_target} ${test_target} -} - -test_scala_library_expect_failure_on_missing_direct_external_deps_jar() { - dependenecy_target='@com_google_guava_guava_21_0//:com_google_guava_guava_21_0' - test_target='test_expect_failure/missing_direct_deps/external_deps:transitive_external_dependency_user' - - test_scala_library_expect_failure_on_missing_direct_deps $dependenecy_target $test_target -} - -test_scala_library_expect_failure_on_missing_direct_external_deps_file_group() { - dependenecy_target='@com_google_guava_guava_21_0_with_file//jar:jar' - test_target='test_expect_failure/missing_direct_deps/external_deps:transitive_external_dependency_user_file_group' - - test_scala_library_expect_failure_on_missing_direct_deps $dependenecy_target $test_target -} - -test_scala_library_expect_failure_on_missing_direct_deps_warn_mode() { - dependenecy_target='//test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency' - test_target='test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency_user' - - expected_message="warning: Target '$dependenecy_target' is used but isn't explicitly declared, please add it to the deps" - - test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message "${expected_message}" 
${test_target} "--strict_java_deps=warn" "ne" -} - -test_scala_library_expect_failure_on_missing_direct_java() { - dependency_target='//test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency' - test_target='//test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency_java_user' - - expected_message="$dependency_target.*$test_target" - - test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message "${expected_message}" $test_target "--strict_java_deps=error" -} - -test_scala_library_expect_failure_on_java_in_src_jar_when_disabled() { - test_target='//test_expect_failure/java_in_src_jar_when_disabled:java_source_jar' - - expected_message=".*Found java files in source jars but expect Java output is set to false" - - test_expect_failure_with_message "${expected_message}" $test_target -} - -test_scala_library_expect_better_failure_message_on_missing_transitive_dependency_labels_from_other_jvm_rules() { - transitive_target='.*transitive_dependency-ijar.jar' - direct_target='//test_expect_failure/missing_direct_deps/internal_deps:direct_java_provider_dependency' - test_target='//test_expect_failure/missing_direct_deps/internal_deps:dependent_on_some_java_provider' - - expected_message="Unknown label of file $transitive_target which came from $direct_target" - - test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message "${expected_message}" $test_target "--strict_java_deps=error" -} - -test_scala_library_expect_failure_on_missing_direct_deps_warn_mode_java() { - dependency_target='//test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency' - test_target='//test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency_java_user' - - local expected_message="$dependency_target.*$test_target" - - test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message "${expected_message}" ${test_target} "--strict_java_deps=warn" "ne" -} - -test_scala_library_expect_failure_on_missing_direct_deps_off_mode() { - expected_message="test_expect_failure/missing_direct_deps/internal_deps/A.scala:[0-9+]: error: not found: value C" - test_target='test_expect_failure/missing_direct_deps/internal_deps:transitive_dependency_user' - - test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message "${expected_message}" ${test_target} "--strict_java_deps=off" -} - -test_scala_junit_test_can_fail() { - action_should_fail test test_expect_failure/scala_junit_test:failing_test -} - -test_repl() { - echo "import scalarules.test._; HelloLib.printMessage(\"foo\")" | bazel-bin/test/HelloLibRepl | grep "foo java" && - echo "import scalarules.test._; TestUtil.foo" | bazel-bin/test/HelloLibTestRepl | grep "bar" && - echo "import scalarules.test._; ScalaLibBinary.main(Array())" | bazel-bin/test/ScalaLibBinaryRepl | grep "A hui hou" && - echo "import scalarules.test._; ResourcesStripScalaBinary.main(Array())" | bazel-bin/test/ResourcesStripScalaBinaryRepl | grep "More Hello" - echo "import scalarules.test._; A.main(Array())" | bazel-bin/test/ReplWithSources | grep "4 8 15" -} - -test_benchmark_jmh() { - RES=$(bazel run -- test/jmh:test_benchmark -i1 -f1 -wi 1) - RESPONSE_CODE=$? 
- if [[ $RES != *Result*Benchmark* ]]; then - echo "Benchmark did not produce expected output:\n$RES" - exit 1 - fi - exit $RESPONSE_CODE -} - -test_multi_service_manifest() { - deploy_jar='ScalaBinary_with_service_manifest_srcs_deploy.jar' - meta_file='META-INF/services/org.apache.beam.sdk.io.FileSystemRegistrar' - bazel build test:$deploy_jar - unzip -p bazel-bin/test/$deploy_jar $meta_file > service_manifest.txt - diff service_manifest.txt test/example_jars/expected_service_manifest.txt - RESPONSE_CODE=$? - rm service_manifest.txt - exit $RESPONSE_CODE -} - - - -action_should_fail() { - # runs the tests locally - set +e - TEST_ARG=$@ - DUMMY=$(bazel $TEST_ARG) - RESPONSE_CODE=$? - if [ $RESPONSE_CODE -eq 0 ]; then - echo -e "${RED} \"bazel $TEST_ARG\" should have failed but passed. $NC" - exit -1 - else - exit 0 - fi -} - -action_should_fail_with_message() { - set +e - MSG=$1 - TEST_ARG=${@:2} - RES=$(bazel $TEST_ARG 2>&1) - RESPONSE_CODE=$? - echo $RES | grep -- "$MSG" - GREP_RES=$? - if [ $RESPONSE_CODE -eq 0 ]; then - echo -e "${RED} \"bazel $TEST_ARG\" should have failed but passed. $NC" - exit 1 - elif [ $GREP_RES -ne 0 ]; then - echo -e "${RED} \"bazel $TEST_ARG\" should have failed with message \"$MSG\" but did not. $NC" - else - exit 0 - fi -} - -xmllint_test() { - find -L ./bazel-testlogs -iname "*.xml" | xargs -n1 xmllint > /dev/null -} - -multiple_junit_suffixes() { - bazel test //test:JunitMultipleSuffixes - - matches=$(grep -c -e 'Discovered classes' -e 'scalarules.test.junit.JunitSuffixIT' -e 'scalarules.test.junit.JunitSuffixE2E' ./bazel-testlogs/test/JunitMultipleSuffixes/test.log) - if [ $matches -eq 3 ]; then - return 0 - else - return 1 - fi -} - -multiple_junit_prefixes() { - bazel test //test:JunitMultiplePrefixes - - matches=$(grep -c -e 'Discovered classes' -e 'scalarules.test.junit.TestJunitCustomPrefix' -e 'scalarules.test.junit.OtherCustomPrefixJunit' ./bazel-testlogs/test/JunitMultiplePrefixes/test.log) - if [ $matches -eq 3 ]; then - return 0 - else - return 1 - fi -} - -multiple_junit_patterns() { - bazel test //test:JunitPrefixesAndSuffixes - matches=$(grep -c -e 'Discovered classes' -e 'scalarules.test.junit.TestJunitCustomPrefix' -e 'scalarules.test.junit.JunitSuffixE2E' ./bazel-testlogs/test/JunitPrefixesAndSuffixes/test.log) - if [ $matches -eq 3 ]; then - return 0 - else - return 1 - fi -} - -junit_generates_xml_logs() { - bazel test //test:JunitTestWithDeps - matches=$(grep -c -e "testcase name='hasCompileTimeDependencies'" -e "testcase name='hasRuntimeDependencies'" ./bazel-testlogs/test/JunitTestWithDeps/test.xml) - if [ $matches -eq 2 ]; then - return 0 - else - return 1 - fi - test -e -} - -test_junit_test_must_have_prefix_or_suffix() { - action_should_fail test test_expect_failure/scala_junit_test:no_prefix_or_suffix -} - -test_junit_test_errors_when_no_tests_found() { - action_should_fail test test_expect_failure/scala_junit_test:no_tests_found -} - -test_resources() { - RESOURCE_NAME="resource.txt" - TARGET=$1 - OUTPUT_JAR="bazel-bin/test/src/main/scala/scalarules/test/resources/$TARGET.jar" - FULL_TARGET="test/src/main/scala/scalarules/test/resources/$TARGET.jar" - bazel build $FULL_TARGET - jar tf $OUTPUT_JAR | grep $RESOURCE_NAME -} - -scala_library_jar_without_srcs_must_include_direct_file_resources(){ - test_resources "noSrcsWithDirectFileResources" -} - -scala_library_jar_without_srcs_must_include_filegroup_resources(){ - test_resources "noSrcsWithFilegroupResources" -} - 
-scala_library_jar_without_srcs_must_fail_on_mismatching_resource_strip_prefix() { - action_should_fail build test_expect_failure/wrong_resource_strip_prefix:noSrcsJarWithWrongStripPrefix -} - -scala_test_test_filters() { - # test package wildcard (both) - local output=$(bazel test \ - --cache_test_results=no \ - --test_output streamed \ - --test_filter scalarules.test.* \ - test:TestFilterTests) - if [[ $output != *"tests a"* || $output != *"tests b"* ]]; then - echo "Should have contained test output from both test filter test a and b" - exit 1 - fi - # test just one - local output=$(bazel test \ - --cache_test_results=no \ - --test_output streamed \ - --test_filter scalarules.test.TestFilterTestA \ - test:TestFilterTests) - if [[ $output != *"tests a"* || $output == *"tests b"* ]]; then - echo "Should have only contained test output from test filter test a" - exit 1 - fi -} - -scala_junit_test_test_filter(){ - local output=$(bazel test \ - --nocache_test_results \ - --test_output=streamed \ - '--test_filter=scalarules.test.junit.FirstFilterTest#(method1|method2)$|scalarules.test.junit.SecondFilterTest#(method2|method3)$' \ - test:JunitFilterTest) - local expected=( - "scalarules.test.junit.FirstFilterTest#method1" - "scalarules.test.junit.FirstFilterTest#method2" - "scalarules.test.junit.SecondFilterTest#method2" - "scalarules.test.junit.SecondFilterTest#method3") - local unexpected=( - "scalarules.test.junit.FirstFilterTest#method3" - "scalarules.test.junit.SecondFilterTest#method1" - "scalarules.test.junit.ThirdFilterTest#method1" - "scalarules.test.junit.ThirdFilterTest#method2" - "scalarules.test.junit.ThirdFilterTest#method3") - for method in "${expected[@]}"; do - if ! grep "$method" <<<$output; then - echo "output:" - echo "$output" - echo "Expected $method in output, but was not found." - exit 1 - fi - done - for method in "${unexpected[@]}"; do - if grep "$method" <<<$output; then - echo "output:" - echo "$output" - echo "Not expecting $method in output, but was found." - exit 1 - fi - done -} - -scala_junit_test_test_filter_custom_runner(){ - bazel test \ - --nocache_test_results \ - --test_output=streamed \ - '--test_filter=scalarules.test.junit.JunitCustomRunnerTest#' \ - test:JunitCustomRunner -} - -scala_specs2_junit_test_test_filter_everything(){ - local output=$(bazel test \ - --nocache_test_results \ - --test_output=streamed \ - '--test_filter=.*' \ - test:Specs2Tests) - local expected=( - "[info] JunitSpec2RegexTest" - "[info] JunitSpecs2AnotherTest" - "[info] JunitSpecs2Test") - local unexpected=( - "[info] UnrelatedTest") - for method in "${expected[@]}"; do - if ! grep "$method" <<<$output; then - echo "output:" - echo "$output" - echo "Expected $method in output, but was not found." - exit 1 - fi - done - for method in "${unexpected[@]}"; do - if grep "$method" <<<$output; then - echo "output:" - echo "$output" - echo "Not expecting $method in output, but was found." - exit 1 - fi - done -} - -scala_specs2_junit_test_test_filter_whole_spec(){ - local output=$(bazel test \ - --nocache_test_results \ - --test_output=streamed \ - '--test_filter=scalarules.test.junit.specs2.JunitSpecs2Test#' \ - test:Specs2Tests) - local expected=( - "+ run smoothly in bazel" - "+ not run smoothly in bazel") - local unexpected=( - "+ run from another test") - for method in "${expected[@]}"; do - if ! grep "$method" <<<$output; then - echo "output:" - echo "$output" - echo "Expected $method in output, but was not found." 
- exit 1 - fi - done - for method in "${unexpected[@]}"; do - if grep "$method" <<<$output; then - echo "output:" - echo "$output" - echo "Not expecting $method in output, but was found." - exit 1 - fi - done -} - -scala_specs2_junit_test_test_filter_one_test(){ - local output=$(bazel test \ - --nocache_test_results \ - --test_output=streamed \ - '--test_filter=scalarules.test.junit.specs2.JunitSpecs2Test#specs2 tests should::run smoothly in bazel$' \ - test:Specs2Tests) - local expected="+ run smoothly in bazel" - local unexpected="+ not run smoothly in bazel" - if ! grep "$expected" <<<$output; then - echo "output:" - echo "$output" - echo "Expected $expected in output, but was not found." - exit 1 - fi - if grep "$unexpected" <<<$output; then - echo "output:" - echo "$output" - echo "Not expecting $unexpected in output, but was found." - exit 1 - fi -} - -scala_specs2_junit_test_test_filter_exact_match(){ - local output=$(bazel test \ - --nocache_test_results \ - --test_output=streamed \ - '--test_filter=scalarules.test.junit.specs2.JunitSpecs2AnotherTest#other specs2 tests should::run from another test$' \ - test:Specs2Tests) - local expected="+ run from another test" - local unexpected="+ run from another test 2" - if ! grep "$expected" <<<$output; then - echo "output:" - echo "$output" - echo "Expected $expected in output, but was not found." - exit 1 - fi - if grep "$unexpected" <<<$output; then - echo "output:" - echo "$output" - echo "Not expecting $unexpected in output, but was found." - exit 1 - fi -} - -scala_specs2_junit_test_test_filter_exact_match_unsafe_characters(){ - local output=$(bazel test \ - --nocache_test_results \ - --test_output=streamed \ - '--test_filter=scalarules.test.junit.specs2.JunitSpec2RegexTest#\Qtests with unsafe characters should::2 + 2 != 5\E$' \ - test:Specs2Tests) - local expected="+ 2 + 2 != 5" - local unexpected="+ work escaped (with regex)" - if ! grep "$expected" <<<$output; then - echo "output:" - echo "$output" - echo "Expected $expected in output, but was not found." - exit 1 - fi - if grep "$unexpected" <<<$output; then - echo "output:" - echo "$output" - echo "Not expecting $unexpected in output, but was found." - exit 1 - fi -} - -scala_specs2_junit_test_test_filter_exact_match_escaped_and_sanitized(){ - local output=$(bazel test \ - --nocache_test_results \ - --test_output=streamed \ - '--test_filter=scalarules.test.junit.specs2.JunitSpec2RegexTest#\Qtests with unsafe characters should::work escaped [with regex]\E$' \ - test:Specs2Tests) - local expected="+ work escaped (with regex)" - local unexpected="+ 2 + 2 != 5" - if ! grep "$expected" <<<$output; then - echo "output:" - echo "$output" - echo "Expected $expected in output, but was not found." - exit 1 - fi - if grep "$unexpected" <<<$output; then - echo "output:" - echo "$output" - echo "Not expecting $unexpected in output, but was found." - exit 1 - fi -} - -scala_specs2_junit_test_test_filter_match_multiple_methods(){ - local output=$(bazel test \ - --nocache_test_results \ - --test_output=streamed \ - '--test_filter=scalarules.test.junit.specs2.JunitSpecs2AnotherTest#other specs2 tests should::(\Qrun from another test\E|\Qrun from another test 2\E)$' \ - test:Specs2Tests) - local expected=( - "+ run from another test" - "+ run from another test 2") - local unexpected=( - "+ not run") - for method in "${expected[@]}"; do - if ! grep "$method" <<<$output; then - echo "output:" - echo "$output" - echo "Expected $method in output, but was not found." 
- exit 1 - fi - done - for method in "${unexpected[@]}"; do - if grep "$method" <<<$output; then - echo "output:" - echo "$output" - echo "Not expecting $method in output, but was found." - exit 1 - fi - done -} - - -scala_specs2_exception_in_initializer_without_filter(){ - expected_message="org.specs2.control.UserException: cannot create an instance for class scalarules.test.junit.specs2.FailingTest" - test_command="test_expect_failure/scala_junit_test:specs2_failing_test" - - test_expect_failure_with_message "$expected_message" $test_filter $test_command -} - -scala_specs2_exception_in_initializer_terminates_without_timeout(){ - local output=$(bazel test \ - --test_output=streamed \ - --test_timeout=10 \ - '--test_filter=scalarules.test.junit.specs2.FailingTest#' \ - test_expect_failure/scala_junit_test:specs2_failing_test) - local expected=( - "org.specs2.control.UserException: cannot create an instance for class scalarules.test.junit.specs2.FailingTest") - local unexpected=( - "TIMEOUT") - for method in "${expected[@]}"; do - if ! grep "$method" <<<$output; then - echo "output:" - echo "$output" - echo "Expected $method in output, but was not found." - exit 1 - fi - done - for method in "${unexpected[@]}"; do - if grep "$method" <<<$output; then - echo "output:" - echo "$output" - echo "Not expecting $method in output, but was found." - exit 1 - fi - done -} - -scalac_jvm_flags_are_configured(){ - action_should_fail build //test_expect_failure/compilers_jvm_flags:can_configure_jvm_flags_for_scalac -} - -javac_jvm_flags_are_configured(){ - action_should_fail build //test_expect_failure/compilers_jvm_flags:can_configure_jvm_flags_for_javac -} - -javac_jvm_flags_via_javacopts_are_configured(){ - action_should_fail build //test_expect_failure/compilers_jvm_flags:can_configure_jvm_flags_for_javac_via_javacopts -} - -scalac_jvm_flags_are_expanded(){ - action_should_fail_with_message \ - "--jvm_flag=test_expect_failure/compilers_jvm_flags/args.txt" \ - build --verbose_failures //test_expect_failure/compilers_jvm_flags:can_expand_jvm_flags_for_scalac -} - -javac_jvm_flags_are_expanded(){ - action_should_fail_with_message \ - "invalid flag: test_expect_failure/compilers_jvm_flags/args.txt" \ - build --verbose_failures //test_expect_failure/compilers_jvm_flags:can_expand_jvm_flags_for_javac -} - -javac_jvm_flags_via_javacopts_are_expanded(){ - action_should_fail_with_message \ - "invalid flag: test_expect_failure/compilers_jvm_flags/args.txt" \ - build --verbose_failures //test_expect_failure/compilers_jvm_flags:can_expand_jvm_flags_for_javac_via_javacopts -} - -java_toolchain_javacopts_are_used(){ - action_should_fail_with_message \ - "invalid flag: -InvalidFlag" \ - build --java_toolchain=//test_expect_failure/compilers_javac_opts:a_java_toolchain \ - --verbose_failures //test_expect_failure/compilers_javac_opts:can_configure_jvm_flags_for_javac_via_javacopts -} - -revert_internal_change() { - sed -i.bak "s/println(\"altered\")/println(\"orig\")/" $no_recompilation_path/C.scala - rm $no_recompilation_path/C.scala.bak -} - -test_scala_library_expect_no_recompilation_on_internal_change_of_transitive_dependency() { - set +e - no_recompilation_path="test/src/main/scala/scalarules/test/strict_deps/no_recompilation" - build_command="bazel build //$no_recompilation_path/... 
--subcommands --strict_java_deps=error" - - echo "running initial build" - $build_command - echo "changing internal behaviour of C.scala" - sed -i.bak "s/println(\"orig\")/println(\"altered\")/" ./$no_recompilation_path/C.scala - - echo "running second build" - output=$(${build_command} 2>&1) - - not_expected_recompiled_target="//$no_recompilation_path:transitive_dependency_user" - - echo ${output} | grep "$not_expected_recompiled_target" - if [ $? -eq 0 ]; then - echo "bazel build was executed after change of internal behaviour of 'transitive_dependency' target. compilation of 'transitive_dependency_user' should not have been triggered." - revert_internal_change - exit 1 - fi - - revert_internal_change - set -e -} - -test_scala_library_expect_no_recompilation_on_internal_change_of_java_dependency() { - test_scala_library_expect_no_recompilation_of_target_on_internal_change_of_dependency "C.java" "s/System.out.println(\"orig\")/System.out.println(\"altered\")/" -} - -test_scala_library_expect_no_recompilation_on_internal_change_of_scala_dependency() { - test_scala_library_expect_no_recompilation_of_target_on_internal_change_of_dependency "B.scala" "s/println(\"orig\")/println(\"altered\")/" -} - -test_scala_library_expect_no_recompilation_of_target_on_internal_change_of_dependency() { - test_scala_library_expect_no_recompilation_on_internal_change $1 $2 ":user" "'user'" -} - -test_scala_library_expect_no_java_recompilation_on_internal_change_of_scala_sibling() { - test_scala_library_expect_no_recompilation_on_internal_change "B.scala" "s/println(\"orig_sibling\")/println(\"altered_sibling\")/" "/dependency_java" "java sibling" -} - -test_scala_library_expect_no_recompilation_on_internal_change() { - changed_file=$1 - changed_content=$2 - dependency=$3 - dependency_description=$4 - set +e - no_recompilation_path="test/src/main/scala/scalarules/test/ijar" - build_command="bazel build //$no_recompilation_path/... --subcommands" - - echo "running initial build" - $build_command - echo "changing internal behaviour of $changed_file" - sed -i.bak $changed_content ./$no_recompilation_path/$changed_file - - echo "running second build" - output=$(${build_command} 2>&1) - - not_expected_recompiled_action="$no_recompilation_path$dependency" - - echo ${output} | grep "$not_expected_recompiled_action" - if [ $? -eq 0 ]; then - echo "bazel build was executed after change of internal behaviour of 'dependency' target. compilation of $dependency_description should not have been triggered." 
- revert_change $no_recompilation_path $changed_file - exit 1 - fi - - revert_change $no_recompilation_path $changed_file - set -e -} - -revert_change() { - mv $1/$2.bak $1/$2 -} - -test_scala_import_expect_failure_on_missing_direct_deps_warn_mode() { - dependency_target1='//test_expect_failure/scala_import:cats' - dependency_target2='//test_expect_failure/scala_import:guava' - test_target='test_expect_failure/scala_import:scala_import_propagates_compile_deps' - - local expected_message1="buildozer 'add deps $dependency_target1' //$test_target" - local expected_message2="buildozer 'add deps $dependency_target2' //$test_target" - - test_expect_failure_or_warning_on_missing_direct_deps_with_expected_message "${expected_message1}" ${test_target} "--strict_java_deps=warn" "ne" "${expected_message2}" -} - -test_scalaopts_from_scala_toolchain() { - action_should_fail build --extra_toolchains="//test_expect_failure/scalacopts_from_toolchain:failing_scala_toolchain" //test_expect_failure/scalacopts_from_toolchain:failing_build -} - -test_unused_dependency_checker_mode_set_in_rule() { - action_should_fail build //test_expect_failure/unused_dependency_checker:failing_build -} - -test_unused_dependency_checker_mode_from_scala_toolchain() { - action_should_fail build --extra_toolchains="//test_expect_failure/unused_dependency_checker:failing_scala_toolchain" //test_expect_failure/unused_dependency_checker:toolchain_failing_build -} - -test_unused_dependency_checker_mode_override_toolchain() { - bazel build --extra_toolchains="//test_expect_failure/unused_dependency_checker:failing_scala_toolchain" //test_expect_failure/unused_dependency_checker:toolchain_override -} - -test_unused_dependency_checker_mode_warn() { - # this is a hack to invalidate the cache, so that the target actually gets built and outputs warnings. - bazel build \ - --strict_java_deps=warn \ - //test:UnusedDependencyCheckerWarn - - local output - output=$(bazel build \ - --strict_java_deps=off \ - //test:UnusedDependencyCheckerWarn 2>&1 - ) - - if [ $? -ne 0 ]; then - echo "Target with unused dependency failed to build with status $?" - echo "$output" - exit 1 - fi - - local expected="warning: Target '//test:UnusedLib' is specified as a dependency to //test:UnusedDependencyCheckerWarn but isn't used, please remove it from the deps." - - echo "$output" | grep "$expected" - if [ $? -ne 0 ]; then - echo "Expected output:[$output] to contain [$expected]" - exit 1 - fi -} - -test_scala_import_library_passes_labels_of_direct_deps() { - dependency_target='//test_expect_failure/scala_import:root_for_scala_import_passes_labels_of_direct_deps' - test_target='test_expect_failure/scala_import:leaf_for_scala_import_passes_labels_of_direct_deps' - - test_scala_library_expect_failure_on_missing_direct_deps $dependency_target $test_target -} - -test_scala_classpath_resources_expect_warning_on_namespace_conflict() { - local output=$(bazel build \ - --verbose_failures \ - //test/src/main/scala/scalarules/test/classpath_resources:classpath_resource_duplicates - ) - - local expected="Classpath resource file classpath-resourcehas a namespace conflict with another file: classpath-resource" - - if ! grep "$method" <<<$output; then - echo "output:" - echo "$output" - echo "Expected $method in output, but was not found." - exit 1 - fi -} - -scala_binary_common_jar_is_exposed_in_build_event_protocol() { - local target=$1 - set +e - bazel build test:$target --build_event_text_file=$target_bes.txt - cat $target_bes.txt | grep "test/$target.jar" - if [ $? 
-ne 0 ]; then - echo "test/$target.jar was not found in build event protocol:" - cat $target_bes.txt - rm $target_bes.txt - exit 1 - fi - - rm $target_bes.txt - set -e -} - -scala_binary_jar_is_exposed_in_build_event_protocol() { - scala_binary_common_jar_is_exposed_in_build_event_protocol ScalaLibBinary -} - -scala_test_jar_is_exposed_in_build_event_protocol() { - scala_binary_common_jar_is_exposed_in_build_event_protocol HelloLibTest -} - -scala_junit_test_jar_is_exposed_in_build_event_protocol() { - scala_binary_common_jar_is_exposed_in_build_event_protocol JunitTestWithDeps -} - -test_scala_import_source_jar_should_be_fetched_when_fetch_sources_is_set_to_true() { - test_scala_import_fetch_sources -} - -test_scala_import_source_jar_should_be_fetched_when_env_bazel_jvm_fetch_sources_is_set_to_true() { - test_scala_import_fetch_sources_with_env_bazel_jvm_fetch_sources_set_to "TruE" # as implied, the value is case insensitive -} - -test_scala_import_source_jar_should_not_be_fetched_when_env_bazel_jvm_fetch_sources_is_set_to_non_true() { - test_scala_import_fetch_sources_with_env_bazel_jvm_fetch_sources_set_to "false" "and expect no source jars" -} - -test_scala_import_fetch_sources_with_env_bazel_jvm_fetch_sources_set_to() { - # the existence of the env var should cause the import repository rule to re-fetch the dependency - # and therefore the order of tests is not expected to matter - export BAZEL_JVM_FETCH_SOURCES=$1 - local expect_failure=$2 - - if [[ ${expect_failure} ]]; then - action_should_fail test_scala_import_fetch_sources - else - test_scala_import_fetch_sources - fi - - unset BAZEL_JVM_FETCH_SOURCES -} - -test_scala_import_fetch_sources() { - local srcjar_name="guava-21.0-src.jar" - local bazel_out_external_guava_21=$(bazel info output_base)/external/com_google_guava_guava_21_0 - - set -e - bazel build //test/src/main/scala/scalarules/test/fetch_sources/... - set +e - - assert_file_exists $bazel_out_external_guava_21/$srcjar_name -} - -test_compilation_succeeds_with_plus_one_deps_on() { - bazel build --extra_toolchains=//test_expect_failure/plus_one_deps:plus_one_deps //test_expect_failure/plus_one_deps/internal_deps:a -} -test_compilation_fails_with_plus_one_deps_undefined() { - action_should_fail build //test_expect_failure/plus_one_deps/internal_deps:a -} -test_compilation_succeeds_with_plus_one_deps_on_for_external_deps() { - bazel build --extra_toolchains="//test_expect_failure/plus_one_deps:plus_one_deps" //test_expect_failure/plus_one_deps/external_deps:a -} -test_compilation_succeeds_with_plus_one_deps_on_also_for_exports() { - bazel build --extra_toolchains="//test_expect_failure/plus_one_deps:plus_one_deps" //test_expect_failure/plus_one_deps/exports_deps:a -} -test_plus_one_deps_only_works_for_java_info_targets() { - #for example doesn't break scala proto which depends on proto_library - bazel build --extra_toolchains="//test_expect_failure/plus_one_deps:plus_one_deps" //test/proto:test_proto -} -test_unused_dependency_fails_even_if_also_exists_in_plus_one_deps() { - action_should_fail build --extra_toolchains="//test_expect_failure/plus_one_deps:plus_one_deps_with_unused_error" //test_expect_failure/plus_one_deps/with_unused_deps:a -} - -test_coverage_on() { - bazel coverage \ - --extra_toolchains="//test/coverage:enable_code_coverage_aspect" \ - //test/coverage/... - diff test/coverage/expected-coverage.dat $(bazel info bazel-testlogs)/test/coverage/test-all/coverage.dat -} - -assert_file_exists() { - if [[ -f $1 ]]; then - echo "File $1 exists." 
- exit 0 - else - echo "File $1 does not exist." - exit 1 - fi -} - -dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ) +test_dir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )/test/shell # shellcheck source=./test_runner.sh -. "${dir}"/test_runner.sh +. "${test_dir}"/test_runner.sh runner=$(get_test_runner "${1:-local}") $runner bazel build test/... @@ -954,81 +20,23 @@ $runner bazel test third_party/... $runner bazel build "--strict_java_deps=ERROR -- test/... -test:UnusedDependencyChecker" #$runner bazel build "--strict_java_deps=ERROR --all_incompatible_changes -- test/... -test:UnusedDependencyChecker" $runner bazel test "--strict_java_deps=ERROR -- test/... -test:UnusedDependencyChecker" -$runner test_disappearing_class -$runner find -L ./bazel-testlogs -iname "*.xml" -$runner xmllint_test -$runner test_transitive_deps -$runner test_scala_library_suite -$runner test_repl -$runner test_benchmark_jmh -$runner multiple_junit_suffixes -$runner multiple_junit_prefixes -$runner test_scala_junit_test_can_fail -$runner junit_generates_xml_logs -$runner scala_library_jar_without_srcs_must_fail_on_mismatching_resource_strip_prefix -$runner multiple_junit_patterns -$runner test_junit_test_must_have_prefix_or_suffix -$runner test_junit_test_errors_when_no_tests_found -$runner scala_library_jar_without_srcs_must_include_direct_file_resources -$runner scala_library_jar_without_srcs_must_include_filegroup_resources -$runner scala_test_test_filters -$runner scala_junit_test_test_filter -$runner scala_junit_test_test_filter_custom_runner -$runner scala_specs2_junit_test_test_filter_everything -$runner scala_specs2_junit_test_test_filter_one_test -$runner scala_specs2_junit_test_test_filter_whole_spec -$runner scala_specs2_junit_test_test_filter_exact_match -$runner scala_specs2_junit_test_test_filter_exact_match_unsafe_characters -$runner scala_specs2_junit_test_test_filter_exact_match_escaped_and_sanitized -$runner scala_specs2_junit_test_test_filter_match_multiple_methods -$runner scala_specs2_exception_in_initializer_without_filter -$runner scala_specs2_exception_in_initializer_terminates_without_timeout -$runner scalac_jvm_flags_are_configured -$runner javac_jvm_flags_are_configured -$runner javac_jvm_flags_via_javacopts_are_configured -$runner scalac_jvm_flags_are_expanded -$runner javac_jvm_flags_are_expanded -$runner javac_jvm_flags_via_javacopts_are_expanded -$runner test_scala_library_expect_failure_on_missing_direct_internal_deps -$runner test_scala_library_expect_failure_on_missing_direct_external_deps_jar -$runner test_scala_library_expect_failure_on_missing_direct_external_deps_file_group -$runner test_scala_library_expect_failure_on_missing_direct_deps_strict_is_disabled_by_default -$runner test_scala_binary_expect_failure_on_missing_direct_deps -$runner test_scala_binary_expect_failure_on_missing_direct_deps_located_in_dependency_which_is_scala_binary -$runner test_scala_library_expect_failure_on_missing_direct_deps_warn_mode -$runner test_scala_library_expect_failure_on_missing_direct_deps_off_mode -$runner test_unused_dependency_checker_mode_from_scala_toolchain -$runner test_unused_dependency_checker_mode_set_in_rule -$runner test_unused_dependency_checker_mode_override_toolchain -$runner test_scala_library_expect_no_recompilation_on_internal_change_of_transitive_dependency -$runner test_multi_service_manifest -$runner test_scala_library_expect_no_recompilation_on_internal_change_of_scala_dependency -$runner 
test_scala_library_expect_no_recompilation_on_internal_change_of_java_dependency -$runner test_scala_library_expect_no_java_recompilation_on_internal_change_of_scala_sibling -$runner test_scala_library_expect_failure_on_missing_direct_java -$runner test_scala_library_expect_failure_on_java_in_src_jar_when_disabled -$runner test_scala_library_expect_failure_on_missing_direct_deps_warn_mode_java -$runner test_scala_library_expect_better_failure_message_on_missing_transitive_dependency_labels_from_other_jvm_rules -$runner test_scala_import_expect_failure_on_missing_direct_deps_warn_mode $runner bazel build "test_expect_failure/missing_direct_deps/internal_deps/... --strict_java_deps=warn" -$runner test_scalaopts_from_scala_toolchain -$runner test_scala_import_library_passes_labels_of_direct_deps -$runner java_toolchain_javacopts_are_used -$runner test_scala_classpath_resources_expect_warning_on_namespace_conflict $runner bazel build //test_expect_failure/proto_source_root/... --strict_proto_deps=off -$runner scala_binary_jar_is_exposed_in_build_event_protocol -$runner scala_test_jar_is_exposed_in_build_event_protocol -$runner scala_junit_test_jar_is_exposed_in_build_event_protocol -$runner test_scala_import_source_jar_should_be_fetched_when_fetch_sources_is_set_to_true -$runner test_scala_import_source_jar_should_be_fetched_when_env_bazel_jvm_fetch_sources_is_set_to_true -$runner test_scala_import_source_jar_should_not_be_fetched_when_env_bazel_jvm_fetch_sources_is_set_to_non_true -$runner test_unused_dependency_checker_mode_warn -$runner test_override_javabin -$runner test_compilation_succeeds_with_plus_one_deps_on -$runner test_compilation_fails_with_plus_one_deps_undefined -$runner test_compilation_succeeds_with_plus_one_deps_on_for_external_deps -$runner test_compilation_succeeds_with_plus_one_deps_on_also_for_exports -$runner test_plus_one_deps_only_works_for_java_info_targets $runner bazel test //test/... --extra_toolchains="//test_expect_failure/plus_one_deps:plus_one_deps" -$runner test_unused_dependency_fails_even_if_also_exists_in_plus_one_deps -$runner test_coverage_on +. "${test_dir}"/test_build_event_protocol.sh +. "${test_dir}"/test_compilation.sh +. "${test_dir}"/test_deps.sh +. "${test_dir}"/test_javac_jvm_flags.sh +. "${test_dir}"/test_junit.sh +. "${test_dir}"/test_misc.sh +. "${test_dir}"/test_phase.sh +. "${test_dir}"/test_scala_binary.sh +. "${test_dir}"/test_scalac_jvm_flags.sh +. "${test_dir}"/test_scala_classpath.sh +. "${test_dir}"/test_scala_import_source_jar.sh +. "${test_dir}"/test_scala_jvm_flags.sh +. "${test_dir}"/test_scala_library_jar.sh +. "${test_dir}"/test_scala_library.sh +. "${test_dir}"/test_scala_specs2.sh +. "${test_dir}"/test_toolchain.sh +. "${test_dir}"/test_unused_dependency.sh diff --git a/test_version.sh b/test_version.sh index fb76315e7..196552100 100755 --- a/test_version.sh +++ b/test_version.sh @@ -4,30 +4,30 @@ set -e test_scala_version() { SCALA_VERSION=$1 - + SCALA_VERSION_SHAS='' SCALA_VERSION_SHAS+='"scala_compiler": "'$2'",' SCALA_VERSION_SHAS+='"scala_library": "'$3'",' SCALA_VERSION_SHAS+='"scala_reflect": "'$4'"' cd "${dir}"/test_version - + timestamp=$(date +%s) - + NEW_TEST_DIR="test_${SCALA_VERSION}_${timestamp}" - + cp -r version_specific_tests_dir/ $NEW_TEST_DIR - + sed \ -e "s/\${scala_version}/$SCALA_VERSION/" \ -e "s/\${scala_version_shas}/$SCALA_VERSION_SHAS/" \ WORKSPACE.template >> $NEW_TEST_DIR/WORKSPACE - + cd $NEW_TEST_DIR bazel test //... RESPONSE_CODE=$? - + cd .. 
rm -rf $NEW_TEST_DIR @@ -44,7 +44,8 @@ $runner test_scala_version "2.11.12" \ "3e892546b72ab547cb77de4d840bcfd05c853e73390fed7370a8f19acb0735a0" \ "0b3d6fd42958ee98715ba2ec5fe221f4ca1e694d7c981b0ae0cd68e97baf6dce" \ "6ba385b450a6311a15c918cf8688b9af9327c6104f0ecbd35933cfcd3095fe04" -$runner test_scala_version "2.12.6" \ - "3023b07cc02f2b0217b2c04f8e636b396130b3a8544a8dfad498a19c3e57a863" \ - "f81d7144f0ce1b8123335b72ba39003c4be2870767aca15dd0888ba3dab65e98" \ - "ffa70d522fc9f9deec14358aa674e6dd75c9dfa39d4668ef15bb52f002ce99fa" + +$runner test_scala_version "2.12.10" \ + "cedc3b9c39d215a9a3ffc0cc75a1d784b51e9edc7f13051a1b4ad5ae22cfbc0c" \ + "0a57044d10895f8d3dd66ad4286891f607169d948845ac51e17b4c1cf0ab569d" \ + "56b609e1bab9144fb51525bfa01ccd72028154fc40a58685a1e9adcbe7835730" diff --git a/test_version/version_specific_tests_dir/BUILD b/test_version/version_specific_tests_dir/BUILD index 26267b8cf..5c048a244 100644 --- a/test_version/version_specific_tests_dir/BUILD +++ b/test_version/version_specific_tests_dir/BUILD @@ -3,16 +3,16 @@ package(default_testonly = 1) load( "@io_bazel_rules_scala//scala:scala.bzl", "scala_binary", + "scala_junit_test", "scala_library", - "scala_test", "scala_macro_library", "scala_repl", - "scala_junit_test", "scala_specs2_junit_test", + "scala_test", ) load( "@io_bazel_rules_scala//scala_proto:scala_proto.bzl", - "scalapb_proto_library", + "scala_proto_library", ) # The examples below show how to combine Scala and Java rules. @@ -172,7 +172,7 @@ scala_binary( deps = [":lib_with_scala_proto_dep"], ) -scalapb_proto_library( +scala_proto_library( name = "test_proto_scala_dep", visibility = ["//visibility:public"], deps = [ diff --git a/test_version/version_specific_tests_dir/proto/BUILD b/test_version/version_specific_tests_dir/proto/BUILD index f60fba726..98f34bea6 100644 --- a/test_version/version_specific_tests_dir/proto/BUILD +++ b/test_version/version_specific_tests_dir/proto/BUILD @@ -1,6 +1,6 @@ load( "@io_bazel_rules_scala//scala_proto:scala_proto.bzl", - "scalapb_proto_library", + "scala_proto_library", ) proto_library( @@ -28,7 +28,7 @@ proto_library( ], ) -scalapb_proto_library( +scala_proto_library( name = "test_proto_nogrpc", visibility = ["//visibility:public"], deps = [":test2"], @@ -42,7 +42,7 @@ java_proto_library( ], ) -scalapb_proto_library( +scala_proto_library( name = "test_proto_java_conversions", visibility = ["//visibility:public"], deps = [ @@ -52,7 +52,7 @@ scalapb_proto_library( ], ) -scalapb_proto_library( +scala_proto_library( name = "test_proto", visibility = ["//visibility:public"], deps = [":test_service"], diff --git a/tools/BUILD b/tools/BUILD new file mode 100644 index 000000000..599026c00 --- /dev/null +++ b/tools/BUILD @@ -0,0 +1,14 @@ +load("@com_github_bazelbuild_buildtools//buildifier:def.bzl", "buildifier") + +[ + buildifier( + name = "buildifier@%s" % mode, + mode = mode, + ) + for mode in [ + "check", + "diff", + "fix", + "print_if_changed", + ] +] diff --git a/tools/bazel b/tools/bazel new file mode 100755 index 000000000..aad205806 --- /dev/null +++ b/tools/bazel @@ -0,0 +1,89 @@ +#!/usr/bin/env bash +set -e + +default_bazel_version='1.1.0' + +if [ "$BUILDKITE" = true ]; then + bazel_version='host' +else + case $(uname -s) in + Darwin|Linux) + # shellcheck disable=SC2153 + if [ -z "$BAZEL_VERSION" ]; then + bazel_version="$default_bazel_version" + else + bazel_version="$BAZEL_VERSION" + fi + ;; + *) + # windows, presumably + bazel_version='host' + ;; + esac +fi + +case "$bazel_version" in + 'host') + 
bazel_version=$("$BAZEL_REAL" version | awk '/Build label/ {print $3}' | cut -d '-' -f 1) + bazel="$BAZEL_REAL" + ;; + '1.1.0') + darwin_sha='1a552f4ce194860fbbd50eeb319f81788ddf50a849e92378eec72231cc64ef65' + linux_sha='14301099c87568db302d59a5d3585f5eb8a6250ac2c6bb0367c56e623ff6e65f' + ;; + '2.0.0') + darwin_sha='c675fa27d99a3114d681db10eb03ded547c40f702b2048c99b8f4ea8e89b9356' + linux_sha='2fbdc9c0e3d376697caf0ee3673b7c9475214068c55a01b9744891e131f90b87' + ;; + *) + echo "The requested Bazel version '$bazel_version' is not supported" + exit 1 + ;; +esac + +if [ -z "$bazel" ]; then + bazel_bin_loc=~/.bazel_binaries + bazel=$bazel_bin_loc/$bazel_version/bin/bazel-real + + if ! [ -f "$bazel" ]; then + case $(uname -s) in + Darwin) + platform='darwin' + sha=$darwin_sha + ;; + Linux) + platform='linux' + sha=$linux_sha + ;; + *) + echo 'Your OS is not supported for automatic bazel installs.' + exit 1 + ;; + esac + remote_source=https://github.com/bazelbuild/bazel/releases/download + installer_name="bazel-$bazel_version-installer-$platform-x86_64.sh" + url="$remote_source/$bazel_version/$installer_name" + ( + tmp_dir=$(mktemp -d) + # shellcheck disable=SC2064 + trap "rm -rf $tmp_dir" EXIT + cd "$tmp_dir" + (>&2 echo "downloading installer from") + (>&2 echo "$url") + curl -o installer.sh -L "$url" + generated_sha=$(shasum -a 256 installer.sh | awk '{print $1}') + if [ "$generated_sha" != "$sha" ]; then + echo "Sha 256 does not match, expected: $sha" + echo "But found $generated_sha" + echo "Recommend you: update the sha to the expected" + echo "and then re-run this script" + exit 1 + fi + mkdir -p $bazel_bin_loc + chmod +x installer.sh + ./installer.sh --base="$bazel_bin_loc"/"$bazel_version" --bin="$bazel_bin_loc"/"$bazel_version"/bin_t + ) >&2 + fi +fi + +exec "$bazel" "$@" diff --git a/tut_rule/tut.bzl b/tut_rule/tut.bzl index 6ed38c5bf..924b21ea6 100644 --- a/tut_rule/tut.bzl +++ b/tut_rule/tut.bzl @@ -30,7 +30,7 @@ def tut_repositories( "org.tpolecat:tut-core:0.4.8", major_version, ), - jar_sha256 = scala_jar_shas[major_version]["tut_core"], + artifact_sha256 = scala_jar_shas[major_version]["tut_core"], licenses = ["notice"], server_urls = server_urls, ) @@ -58,5 +58,5 @@ def scala_tut_doc(**kw): srcs = [src], outs = ["%s_tut.md" % name], tools = [tool], - cmd = "./$(location %s) $(location %s) \"$@\"" % (tool, src), + cmd = "./$(location %s) $(location %s) \"$@\" >/dev/null" % (tool, src), ) diff --git a/twitter_scrooge/twitter_scrooge.bzl b/twitter_scrooge/twitter_scrooge.bzl index 6e9375eab..4680b3701 100644 --- a/twitter_scrooge/twitter_scrooge.bzl +++ b/twitter_scrooge/twitter_scrooge.bzl @@ -23,13 +23,13 @@ _jar_extension = ".jar" def twitter_scrooge( scala_version = _default_scala_version(), - maven_servers = ["https://repo1.maven.org/maven2"]): + maven_servers = ["https://repo.maven.apache.org/maven2"]): major_version = _extract_major_version(scala_version) _scala_maven_import_external( name = "libthrift", artifact = "org.apache.thrift:libthrift:0.8.0", - jar_sha256 = "adea029247c3f16e55e29c1708b897812fd1fe335ac55fe3903e5d2f428ef4b3", + artifact_sha256 = "adea029247c3f16e55e29c1708b897812fd1fe335ac55fe3903e5d2f428ef4b3", licenses = ["notice"], server_urls = maven_servers, ) @@ -61,7 +61,7 @@ def twitter_scrooge( "com.twitter:scrooge-core:18.6.0", major_version, ), - jar_sha256 = scala_version_jar_shas["scrooge_core"], + artifact_sha256 = scala_version_jar_shas["scrooge_core"], licenses = ["notice"], server_urls = maven_servers, ) @@ -77,7 +77,7 @@ def twitter_scrooge( 
"com.twitter:scrooge-generator:18.6.0", major_version, ), - jar_sha256 = scala_version_jar_shas["scrooge_generator"], + artifact_sha256 = scala_version_jar_shas["scrooge_generator"], licenses = ["notice"], server_urls = maven_servers, ) @@ -92,7 +92,7 @@ def twitter_scrooge( "com.twitter:util-core:18.6.0", major_version, ), - jar_sha256 = scala_version_jar_shas["util_core"], + artifact_sha256 = scala_version_jar_shas["util_core"], licenses = ["notice"], server_urls = maven_servers, ) @@ -107,7 +107,7 @@ def twitter_scrooge( "com.twitter:util-logging:18.6.0", major_version, ), - jar_sha256 = scala_version_jar_shas["util_logging"], + artifact_sha256 = scala_version_jar_shas["util_logging"], licenses = ["notice"], server_urls = maven_servers, ) @@ -212,7 +212,7 @@ def _compile_scala( label.name + "_scalac.statsfile", sibling = scrooge_jar, ) - merged_deps = java_common.merge(deps_java_info + implicit_deps) + merged_deps = java_common.merge(_concat_lists(deps_java_info, implicit_deps)) # this only compiles scala, not the ijar, but we don't # want the ijar for generated code anyway: any change @@ -249,18 +249,11 @@ def _compile_scala( compile_jar = output, ) -def _empty_java_info(deps_java_info, implicit_deps): - merged_deps = java_common.merge(deps_java_info + implicit_deps) - return java_common.create_provider( - use_ijar = False, - compile_time_jars = depset(transitive = [merged_deps.compile_jars]), - transitive_compile_time_jars = depset( - transitive = [merged_deps.transitive_compile_time_jars], - ), - transitive_runtime_jars = depset( - transitive = [merged_deps.transitive_runtime_jars], - ), - ) +def _concat_lists(list1, list2): + all_providers = [] + all_providers.extend(list1) + all_providers.extend(list2) + return all_providers #### # This is applied to the DAG of thrift_librarys reachable from a deps @@ -322,7 +315,7 @@ def _scrooge_aspect_impl(target, ctx): # this target is only an aggregation target src_jars = depset() outs = depset() - java_info = _empty_java_info(deps, imps) + java_info = java_common.merge(_concat_lists(deps, imps)) return [ ScroogeAspectInfo( @@ -379,7 +372,8 @@ def _scrooge_scala_library_impl(ctx): ) if ctx.attr.exports: exports = [exp[JavaInfo] for exp in ctx.attr.exports] - all_java = java_common.merge(exports + [aspect_info.java_info]) + exports.append(aspect_info.java_info) + all_java = java_common.merge(exports) else: all_java = aspect_info.java_info @@ -399,15 +393,15 @@ scrooge_scala_library = rule( ) def _scrooge_scala_import_impl(ctx): - scala_jars = depset(ctx.files.scala_jars) - jars_ji = java_common.create_provider( - use_ijar = False, - compile_time_jars = scala_jars, - transitive_compile_time_jars = scala_jars, - transitive_runtime_jars = scala_jars, - ) + jars_jis = [ + JavaInfo( + output_jar = scala_jar, + compile_jar = scala_jar, + ) + for scala_jar in ctx.files.scala_jars + ] java_info = java_common.merge( - [imp[JavaInfo] for imp in ctx.attr._implicit_compile_deps] + [jars_ji], + [imp[JavaInfo] for imp in ctx.attr._implicit_compile_deps] + jars_jis, ) # to make the thrift_info, we only put this in the