Closed

Changes from all commits
85 commits
45c7bc5
Adding the proto and the dependencies
grundprinzip Jun 16, 2022
c2c6de0
Trying to fix aarch64
grundprinzip Jul 5, 2022
034a44d
Moving connect to its own module
grundprinzip Jul 7, 2022
0cedf00
[SparkConnect] Spark Connect Planner
grundprinzip Jul 4, 2022
5d4b8dd
adding the planner
grundprinzip Jul 25, 2022
506fc44
adding the python code
grundprinzip Jul 26, 2022
0f6a687
fixing tests
grundprinzip Jul 26, 2022
9adf4bd
Adding some very basic unit tests
grundprinzip Jul 29, 2022
ed2fe24
adding explain to the service
grundprinzip Jul 29, 2022
786efbb
adding apache header and small col function
grundprinzip Jul 31, 2022
8693efb
more apache headers
grundprinzip Jul 31, 2022
d336b2a
Adding some more tests
grundprinzip Jul 31, 2022
dbe3d1f
adding some more doc
grundprinzip Jul 31, 2022
3ea55ca
some basic tests for literals
grundprinzip Jul 31, 2022
127a492
Fixing the pom files
grundprinzip Aug 29, 2022
4c63000
Moving to grpc-java
grundprinzip Aug 29, 2022
b713969
Fixing some python tests
grundprinzip Aug 29, 2022
0547e07
Fixing more python tests
grundprinzip Aug 29, 2022
074e1f7
making connect build by default
grundprinzip Aug 29, 2022
e77a018
fixing sbt build
grundprinzip Aug 29, 2022
21224ec
More sbt stuff
grundprinzip Aug 30, 2022
e22975f
More sbt stuff
grundprinzip Aug 30, 2022
9601449
More sbt stuff
grundprinzip Aug 31, 2022
b3ab663
properly shaded spark connect build
grundprinzip Sep 2, 2022
8968f1e
SBT build and testing works
grundprinzip Sep 4, 2022
87a50d4
Python linting
grundprinzip Sep 4, 2022
fefc84a
Restricting enabling Spark Connect to specific tests only and fixing t…
grundprinzip Sep 4, 2022
193e6b0
scala 2.13 fix
grundprinzip Sep 4, 2022
e1f862e
Fixing python lint issues
grundprinzip Sep 4, 2022
d5b6002
Fixing python lint issues
grundprinzip Sep 4, 2022
8bb03d2
Marking all classes in Spark Connect as experimental
grundprinzip Sep 4, 2022
265aca3
Fixing style
grundprinzip Sep 4, 2022
d6f64e8
test infra
grundprinzip Sep 5, 2022
3d00bb0
Missing file for disabling mypy checks
grundprinzip Sep 5, 2022
3525eab
Trying to add a python package the right way
grundprinzip Sep 5, 2022
7e1dd58
Removing grpc/protobuf from pypy
grundprinzip Sep 5, 2022
fa9e85e
Disabling pypy for spark connect
grundprinzip Sep 5, 2022
cf6b19a
Adding licence files
grundprinzip Sep 5, 2022
16bf911
Fixing a bug in UDF handling for Python version selection
grundprinzip Sep 5, 2022
762104a
Trailing newlines
grundprinzip Sep 5, 2022
ca4cfbb
Adding dependency manifest
grundprinzip Sep 5, 2022
79bbfd6
Moving to class
grundprinzip Sep 6, 2022
ec1221b
Merge branch 'master' into spark-connect-grpc-shaded
grundprinzip Sep 6, 2022
63d8ebc
dependencies
grundprinzip Sep 6, 2022
70ad818
M2 cache bomb
grundprinzip Sep 6, 2022
ae75ca4
More m2 removal
grundprinzip Sep 6, 2022
7e615f7
Revert "More m2 removal"
HyukjinKwon Sep 7, 2022
8effefb
Revert "M2 cache bomb"
HyukjinKwon Sep 7, 2022
aa68119
Add Python 3.7
HyukjinKwon Sep 7, 2022
fcefe9e
Revert "Add Python 3.7"
HyukjinKwon Sep 7, 2022
ff8e4ad
Merge remote-tracking branch 'upstream/master' into HEAD
HyukjinKwon Sep 7, 2022
b2f6548
Python 3.7
HyukjinKwon Sep 7, 2022
feb0233
Revert "Python 3.7"
HyukjinKwon Sep 7, 2022
4b02eb3
Tweaking memory consumption of SBT
grundprinzip Sep 7, 2022
aec7e3d
Disabling parallel execution for SBT in doc build
grundprinzip Sep 7, 2022
e4485c3
Avoiding copying the shaded jar in doc build
grundprinzip Sep 8, 2022
4ef5a35
Sbt doc build still
grundprinzip Sep 8, 2022
e13b955
test package build
grundprinzip Sep 8, 2022
6c96205
only test package build
grundprinzip Sep 9, 2022
d1e3c13
using exact pyspark build options
grundprinzip Sep 9, 2022
9d43765
Moving things around
grundprinzip Sep 9, 2022
448fcc8
Desperate attempts
grundprinzip Sep 9, 2022
b3d2d8c
Java 11 anyone?
grundprinzip Sep 9, 2022
b2c6bfd
Revert "Java 11 anyone?"
grundprinzip Sep 13, 2022
11ad9b6
disable tests on assembly run
grundprinzip Sep 13, 2022
7617b82
Adding additional tests
grundprinzip Sep 14, 2022
ce4900b
format + slight python api change
grundprinzip Sep 14, 2022
28e7741
update on readme and import
grundprinzip Sep 14, 2022
8223615
Properly catching exceptions and removing stray debug user
grundprinzip Sep 18, 2022
8bf14f2
scalastyle
grundprinzip Sep 18, 2022
87ba6f2
doc fix
grundprinzip Sep 18, 2022
45f23e0
addressing review comments
grundprinzip Sep 20, 2022
4971980
removing embedded protos for 3p google
grundprinzip Sep 20, 2022
4aafab8
renaming
grundprinzip Sep 20, 2022
38b69ce
Adding @Since annotation
grundprinzip Sep 20, 2022
07b0ec8
Fixing python test with the right package
grundprinzip Sep 20, 2022
f47b8e9
fixing build error due to package refactoring
grundprinzip Sep 20, 2022
b57cbd2
Sql -> SQL
grundprinzip Sep 21, 2022
77470cd
Scala review comments
grundprinzip Sep 22, 2022
ee13ae2
Build file review
grundprinzip Sep 22, 2022
51be506
Python review comments
grundprinzip Sep 22, 2022
549a10e
Python review comments
grundprinzip Sep 22, 2022
b0608f3
black fmt
grundprinzip Sep 22, 2022
279faf4
Addressing review comments
grundprinzip Sep 23, 2022
e5e2347
Addressing review comments
grundprinzip Sep 23, 2022
6 changes: 4 additions & 2 deletions .github/workflows/build_and_test.yml
@@ -242,7 +242,7 @@ jobs:
- name: Install Python packages (Python 3.8)
if: (contains(matrix.modules, 'sql') && !contains(matrix.modules, 'sql-'))
run: |
python3.8 -m pip install 'numpy>=1.20.0' pyarrow pandas scipy unittest-xml-reporting
python3.8 -m pip install 'numpy>=1.20.0' pyarrow pandas scipy unittest-xml-reporting grpcio protobuf
python3.8 -m pip list
# Run the tests.
- name: Run tests
@@ -333,6 +333,8 @@ jobs:
pyspark-pandas
- >-
pyspark-pandas-slow
- >-
pyspark-sql-connect
env:
MODULES_TO_TEST: ${{ matrix.modules }}
HADOOP_PROFILE: ${{ inputs.hadoop }}
@@ -576,7 +578,7 @@ jobs:
# See also https://issues.apache.org/jira/browse/SPARK-38279.
python3.9 -m pip install 'sphinx<3.1.0' mkdocs pydata_sphinx_theme ipython nbsphinx numpydoc 'jinja2<3.0.0' 'markupsafe==2.0.1'
python3.9 -m pip install ipython_genutils # See SPARK-38517
python3.9 -m pip install sphinx_plotly_directive 'numpy>=1.20.0' pyarrow pandas 'plotly>=4.8'
python3.9 -m pip install sphinx_plotly_directive 'numpy>=1.20.0' pyarrow pandas 'plotly>=4.8' grpcio protobuf
python3.9 -m pip install 'docutils<0.18.0' # See SPARK-39421
apt-get update -y
apt-get install -y ruby ruby-dev
5 changes: 5 additions & 0 deletions assembly/pom.xml
@@ -74,6 +74,11 @@
<artifactId>spark-repl_${scala.binary.version}</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-connect_${scala.binary.version}</artifactId>
<version>${project.version}</version>
</dependency>
Member:

May I ask some questions, @grundprinzip and @HyukjinKwon ?

  1. Do we need to embed this in the Apache Spark binary releases inevitably?
  2. Can we publish like Kafka or Avro?
  3. Can we put this under profile like hadoop-cloud at least?

Member:

I will handle this one separately. It will be published like Kafka or Avro, and yes, it will be covered by a profile.

Contributor @LuciferYang, Sep 27, 2022:

After running dev/make-distribution.sh --tgz and decompressing the generated tarball, I found that the grpc-related jars and protobuf-java-util-3.19.2.jar are also placed in the jars directory, as follows:

ls -l *grpc* 
-rw-r--r--  1 yangjie01  staff   256991  9 27 18:03 grpc-api-1.47.0.jar
-rw-r--r--  1 yangjie01  staff    30593  9 27 18:03 grpc-context-1.47.0.jar
-rw-r--r--  1 yangjie01  staff   689433  9 27 18:03 grpc-core-1.47.0.jar
-rw-r--r--  1 yangjie01  staff  9129585  9 27 18:03 grpc-netty-shaded-1.47.0.jar
-rw-r--r--  1 yangjie01  staff     5115  9 27 18:03 grpc-protobuf-1.47.0.jar
-rw-r--r--  1 yangjie01  staff     7570  9 27 18:03 grpc-protobuf-lite-1.47.0.jar
-rw-r--r--  1 yangjie01  staff   838576  9 27 18:03 grpc-services-1.47.0.jar
-rw-r--r--  1 yangjie01  staff    50879  9 27 18:03 grpc-stub-1.47.0.jar

Should we explicitly exclude them here, since they are already shaded?
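
That inspection can be sketched as a small helper; a hedged Python sketch (the directory path and the prefix list are illustrative assumptions based on the listing above, not part of Spark's build tooling):

```python
# Illustrative sketch: find jars in a distribution's jars/ directory that look
# like unshaded gRPC / protobuf artifacts. The prefixes are assumptions taken
# from the listing above, not an official Spark check.
from pathlib import Path

UNSHADED_PREFIXES = ("grpc-", "protobuf-java-util")

def unshaded_leftovers(jars_dir: str) -> list[str]:
    """Return jar file names matching the known unshaded dependency prefixes."""
    return sorted(
        p.name
        for p in Path(jars_dir).glob("*.jar")
        if p.name.startswith(UNSHADED_PREFIXES)
    )
```

If such a helper reported any matches after running `dev/make-distribution.sh --tgz`, that would confirm the duplication described above.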

Member:

Same concerns as raised at #37710 (comment).

Member:

Thanks for pointing this out. I am working on this - would make a PR soon.


<!--
Because we don't shade dependencies anymore, we need to restore Guava to compile scope so
279 changes: 279 additions & 0 deletions connect/pom.xml
@@ -0,0 +1,279 @@
<?xml version="1.0" encoding="UTF-8"?>
<!--
~ Licensed to the Apache Software Foundation (ASF) under one or more
~ contributor license agreements. See the NOTICE file distributed with
~ this work for additional information regarding copyright ownership.
~ The ASF licenses this file to You under the Apache License, Version 2.0
~ (the "License"); you may not use this file except in compliance with
~ the License. You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing, software
~ distributed under the License is distributed on an "AS IS" BASIS,
~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
~ See the License for the specific language governing permissions and
~ limitations under the License.
-->

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.12</artifactId>
<version>3.4.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

<artifactId>spark-connect_2.12</artifactId>
<packaging>jar</packaging>
<name>Spark Project Connect</name>
<url>https://spark.apache.org/</url>
<properties>
<sbt.project.name>connect</sbt.project.name>
<protobuf.version>3.21.1</protobuf.version>
<guava.version>31.0.1-jre</guava.version>
<io.grpc.version>1.47.0</io.grpc.version>
<tomcat.annotations.api.version>6.0.53</tomcat.annotations.api.version>
</properties>

<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<scope>provided</scope>
<exclusions>
<exclusion>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<type>test-jar</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-catalyst_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<scope>provided</scope>
<exclusions>
<exclusion>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<scope>provided</scope>
<exclusions>
<exclusion>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-catalyst_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<type>test-jar</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<type>test-jar</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-tags_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<scope>provided</scope>
<exclusions>
<exclusion>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- #if scala-2.13 --><!--
<dependency>
<groupId>org.scala-lang.modules</groupId>
<artifactId>scala-parallel-collections_${scala.binary.version}</artifactId>
</dependency>
--><!-- #endif scala-2.13 -->
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>${guava.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>failureaccess</artifactId>
<version>1.0.1</version>
</dependency>
<dependency>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-java</artifactId>
<version>${protobuf.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>io.grpc</groupId>
<artifactId>grpc-netty-shaded</artifactId>
<version>${io.grpc.version}</version>
</dependency>
<dependency>
<groupId>io.grpc</groupId>
<artifactId>grpc-protobuf</artifactId>
<version>${io.grpc.version}</version>
</dependency>
<dependency>
<groupId>io.grpc</groupId>
<artifactId>grpc-services</artifactId>
<version>${io.grpc.version}</version>
</dependency>
<dependency>
<groupId>io.grpc</groupId>
<artifactId>grpc-stub</artifactId>
<version>${io.grpc.version}</version>
</dependency>
<dependency> <!-- necessary for Java 9+ -->
<groupId>org.apache.tomcat</groupId>
<artifactId>annotations-api</artifactId>
<version>${tomcat.annotations.api.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.scalacheck</groupId>
<artifactId>scalacheck_${scala.binary.version}</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
<scope>test</scope>
</dependency>

</dependencies>
<build>
<!-- Protobuf compilation for Spark Connect -->
<extensions>
<extension>
<groupId>kr.motd.maven</groupId>
<artifactId>os-maven-plugin</artifactId>
<version>1.6.2</version>
</extension>
</extensions>
<outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
<testOutputDirectory>target/scala-${scala.binary.version}/test-classes</testOutputDirectory>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<executions>
<execution>
<id>add-sources</id>
<phase>generate-sources</phase>
<goals>
<goal>add-source</goal>
</goals>
<configuration>
<sources>
<source>src/main/scala-${scala.binary.version}</source>
</sources>
</configuration>
</execution>
<execution>
<id>add-scala-test-sources</id>
<phase>generate-test-sources</phase>
<goals>
<goal>add-test-source</goal>
</goals>
<configuration>
<sources>
<source>src/test/gen-java</source>
</sources>
</configuration>
</execution>
</executions>
</plugin>
<!-- Add protobuf-maven-plugin and provide ScalaPB as a code generation plugin -->
<plugin>
<groupId>org.xolstice.maven.plugins</groupId>
<artifactId>protobuf-maven-plugin</artifactId>
Contributor:

@HyukjinKwon @grundprinzip Reporting another issue:

Compiling the connect module on CentOS release 6.3, where the default glibc version is 2.12, causes compilation to fail as follows:

[ERROR] PROTOC FAILED: /home/disk0/spark-source/connect/target/protoc-plugins/protoc-3.21.1-linux-x86_64.exe: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /home/disk0/spark-source/connect/target/protoc-plugins/protoc-3.21.1-linux-x86_64.exe)
/home/disk0/spark-source/connect/target/protoc-plugins/protoc-3.21.1-linux-x86_64.exe: /usr/lib64/libstdc++.so.6: version `GLIBCXX_3.4.18' not found (required by /home/disk0/spark-source/connect/target/protoc-plugins/protoc-3.21.1-linux-x86_64.exe)
/home/disk0/spark-source/connect/target/protoc-plugins/protoc-3.21.1-linux-x86_64.exe: /usr/lib64/libstdc++.so.6: version `GLIBCXX_3.4.14' not found (required by /home/disk0/spark-source/connect/target/protoc-plugins/protoc-3.21.1-linux-x86_64.exe)
/home/disk0/spark-source/connect/target/protoc-plugins/protoc-3.21.1-linux-x86_64.exe: /usr/lib64/libstdc++.so.6: version `CXXABI_1.3.5' not found (required by /home/disk0/spark-source/connect/target/protoc-plugins/protoc-3.21.1-linux-x86_64.exe) 

I have already filed a JIRA, SPARK-40593. I think we should at least explicitly document this compilation dependency somewhere.
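
The root cause is a minimum glibc requirement of the prebuilt protoc binary. As a hedged illustration of the version constraint shown in the log above (GLIBC_2.14), a minimal comparison helper:

```python
# Sketch: check a glibc version string against the minimum required by the
# prebuilt protoc 3.21.x binary (GLIBC_2.14, per the error output above).
def glibc_at_least(installed: str, required: str = "2.14") -> bool:
    """True if `installed` (e.g. "2.12") satisfies the `required` minimum."""
    parse = lambda v: tuple(int(part) for part in v.split("."))
    return parse(installed) >= parse(required)

print(glibc_at_least("2.12"))  # CentOS 6.3 default glibc -> False
print(glibc_at_least("2.17"))  # e.g. a newer host -> True
```

A check like this could let the build fail fast with a clear message instead of the opaque PROTOC FAILED error.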

Member:

Thanks @LuciferYang for checking this out.

<version>0.6.1</version>
<configuration>
<protocArtifact>com.google.protobuf:protoc:${protobuf.version}:exe:${os.detected.classifier}</protocArtifact>
<pluginId>grpc-java</pluginId>
<pluginArtifact>io.grpc:protoc-gen-grpc-java:${io.grpc.version}:exe:${os.detected.classifier}</pluginArtifact>
<protoSourceRoot>src/main/protobuf</protoSourceRoot>
</configuration>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>compile-custom</goal>
<goal>test-compile</goal>
</goals>
</execution>
</executions>
</plugin>
<!-- Shade all GRPC / Guava / Protobuf dependencies of this build -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<configuration>
<shadedArtifactAttached>false</shadedArtifactAttached>
<artifactSet>
<includes>
<include>com.google.guava:*</include>
<include>io.grpc:*:</include>
<include>com.google.protobuf:*</include>
</includes>
</artifactSet>
<relocations>
<relocation>
<pattern>com.google.common</pattern>
<shadedPattern>${spark.shade.packageName}.connect.guava</shadedPattern>
<includes>
<include>com.google.common.**</include>
</includes>
</relocation>
<relocation>
<pattern>com.google.thirdparty</pattern>
<shadedPattern>${spark.shade.packageName}.connect.guava</shadedPattern>
<includes>
<include>com.google.thirdparty.**</include>
</includes>
</relocation>
<relocation>
<pattern>com.google.protobuf</pattern>
<shadedPattern>${spark.shade.packageName}.connect.protobuf</shadedPattern>
<includes>
<include>com.google.protobuf.**</include>
</includes>
</relocation>
<relocation>
<pattern>io.grpc</pattern>
<shadedPattern>${spark.shade.packageName}.connect.grpc</shadedPattern>
</relocation>
</relocations>
</configuration>
</plugin>
</plugins>
</build>
</project>
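
The relocation rules in the shade configuration above rewrite the package prefixes of the bundled classes so they cannot clash with unshaded copies on the classpath. A hedged Python sketch of the mapping they apply (assuming `spark.shade.packageName` resolves to `org.sparkproject`, which is defined outside this diff):

```python
# Illustrative sketch of the maven-shade-plugin relocations configured above.
# The org.sparkproject prefix is an assumption; the pom only references
# ${spark.shade.packageName}.
RELOCATIONS = {
    "com.google.common": "org.sparkproject.connect.guava",
    "com.google.thirdparty": "org.sparkproject.connect.guava",
    "com.google.protobuf": "org.sparkproject.connect.protobuf",
    "io.grpc": "org.sparkproject.connect.grpc",
}

def relocate(class_name: str) -> str:
    """Apply the first matching relocation rule to a fully qualified class name."""
    for prefix, shaded in RELOCATIONS.items():
        if class_name.startswith(prefix + "."):
            return shaded + class_name[len(prefix):]
    return class_name

print(relocate("io.grpc.stub.StreamObserver"))
# -> org.sparkproject.connect.grpc.stub.StreamObserver
```

Classes outside the included artifact set (e.g. Spark's own) pass through unchanged, which is why only Guava, gRPC, and protobuf appear in the `<artifactSet>` includes.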
41 changes: 41 additions & 0 deletions connect/src/main/buf.gen.yaml
@@ -0,0 +1,41 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
version: v1
plugins:
- remote: buf.build/protocolbuffers/plugins/cpp:v3.20.0-1
out: gen/proto/cpp
- remote: buf.build/protocolbuffers/plugins/csharp:v3.20.0-1
out: gen/proto/csharp
- remote: buf.build/protocolbuffers/plugins/java:v3.20.0-1
out: gen/proto/java
- remote: buf.build/protocolbuffers/plugins/python:v3.20.0-1
out: gen/proto/python
- remote: buf.build/grpc/plugins/python:v1.47.0-1
out: gen/proto/python
- remote: buf.build/protocolbuffers/plugins/go:v1.28.0-1
out: gen/proto/go
opt:
- paths=source_relative
- remote: buf.build/grpc/plugins/go:v1.2.0-1
out: gen/proto/go
opt:
- paths=source_relative
- require_unimplemented_servers=false
- remote: buf.build/grpc/plugins/ruby:v1.47.0-1
out: gen/proto/ruby
- remote: buf.build/protocolbuffers/plugins/ruby:v21.2.0-1
out: gen/proto/ruby
19 changes: 19 additions & 0 deletions connect/src/main/buf.work.yaml
@@ -0,0 +1,19 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
version: v1
directories:
- protobuf