golang code generation #68
@julian-hj I am actually in the process of refactoring the make process for C++ as well. Let's collaborate on that.
Hi @julian-hj,
A way to make that work with Golang tooling is adding this to the top of a Go source file:
You can read more on `go generate` in the Go documentation.

I personally believe the spec should be a protobuf first and Markdown second, in part to make automated tooling easier. I also think hooks that cause additional commits are a bad idea. Please check the spec repo and note that I already process out the protobuf and a generated Go source with every build and push those to a Gist. I've offered to provide additional assistance in the realm of CI to the CSI team. I think they're happy with the way things are ATM. I agree that it could be better.
Hi @julian-hj, By the way, I've migrated away from that.
See this CI example: while the final stage is broken due to something unrelated, note that the second stage builds Go programs sans Go (a C container) in concert with the generated env var file.
Hi @julian-hj, Holy crap, I'm an idiot. Pardon me. Your original comment literally calls out `go generate`.
Hi @julian-hj, I will say my
Hi @julian-hj, Last thing. Please find my GoCSI PR, which the CNCF folks decided wasn't apropos for merge but still might be useful as an example for your needs. It generates the CSI Go code on demand via a primary Makefile, but it could easily be updated to use `go generate`.

I'd suggest you not import anything but instead generate the code locally. One of the values of gRPC is having a declarative, generative method of recreating types that can be used to marshal/unmarshal common data structures. There's really no need for a central, shared CSI library IMO. In fact, it could be viewed as a bad idea, as it would place a dependency on shared, generated code instead of the spec from which it is generated.
Hi @akutz, Thanks for the links; we will definitely take a look at your approach. For now, the big issue I see is that CSI will need to fit into a bunch of different build environments, and right now the state of package management in golang is very fragmented: some builds use old-school imports with the system gopath, some folks use vendoring and submodules (possibly with godep), some use glide, and probably some use other tools I'm not aware of.

For CSI to be straightforward to consume in all those disparate build environments, IMO there's no substitute for having a git repo somewhere with csi.pb.go already generated. My guess is that if we don't set that up, we will see a proliferation of manually generated, checked-in copies living out there in the wild, and any example build we eventually promote to the /examples folder will not be `go get`table.
Hi @julian-hj, I don't think it's a problem having multiple copies of `csi.pb.go` out there.
By the way, take a look at https://github.com/stevvooe/protobuild for generating pb.go's without resorting to
Hi @cpuguy83, I'm unaware of any of the issues with
And if you were going to use
Minor correction: as of BurntSushi/toml@4678bf2, BurntSushi's toml package is MIT-licensed.
Hi @gpaul, Well that is def good to hear! :)
I added the C++ build support here: #69
I can speak from a C++ project's perspective. There are two common build systems used for C++ projects: autotools and CMake. Both first do some auto-configuration based on the system environment and user input, and then generate a makefile.

Downstream C++ projects of CSI normally expect a library (static will be sufficient in this case) and a few headers they can build and link against. The library and the headers are either already installed in system prefixes, or the downstream project will choose to bundle the dependencies in their repo (e.g., gRPC bundles dependencies like protobuf, boringssl, etc.) to avoid depending on the installation of those packages. Many C++ projects support both modes (e.g., gRPC, Mesos, etc.).

So the best practice for the CSI spec repo is to have a build system (either autotools or CMake) that can build the headers and the library for downstream projects. I prefer CMake because it can be made available on Windows platforms and has a much nicer way of propagating compile flags to downstream projects. The generation of csi.pb.h and csi.pb.cc naturally fits into the build process (see my PR #69).
Hi @jieyu,
Are you referring to C++ specifically? Otherwise, the CSI spec repo does have a build system: it has a Makefile which is GNU Make compliant and is built with Travis-CI. The spec repo's build system should be irrelevant anyway, as it should likely only be executed as part of the CI process.

@jdef has previously stated to me that the consensus was this repo should not directly include source code or even a compiled protobuf file. FWIW I disagreed with both of these positions, but regardless of my own feelings on the matter, the fact is that it is trivial to generate the gRPC sources from the protobuf. I've already stated above why a centralized gRPC library for the CSI spec for golang is a bad idea.

Still, I've volunteered multiple times to help with this repo's CI process and would be more than happy to implement such automated builds of the spec and corresponding language-specific artifacts.
Yes, I am talking about C++ specifically. The current Makefile is not sufficient. For instance, some projects might bundle protobuf (e.g., Mesos or gRPC) and prefer to use the bundled protobuf and protoc. I'd prefer we build the CSI client/server stub (for C++) in the CSI repo itself. Otherwise, downstream projects will have to do the same thing over and over again.
Hi @jieyu,
Ah, that wasn't clear.
Yes, and I don't see that as a problem. One purpose of selecting gRPC was to be able to generate language-specific code from a language-agnostic specification.
This might be true in Go, but for C++ projects this is an anti-pattern. Downstream projects should not be aware of how the upstream project builds the library; they should only link against the library produced by the upstream project and consume its headers.
Hi @jieyu, Sure, I get that. With Go there are no headers and everything is built from source, so this is less of an issue, especially with regard to how Go handles vendored types and fully qualified type import paths (aka package hashes); again, see the issue I raised above with linking to externally generated code.

Even with C++, though, I wouldn't consider this an anti-pattern. If you think of this project not as a library but as the source of a spec that you build into your library or program, it's not really an issue. A centralized set of libraries or reusable artifacts isn't going to prevent incompatible client/server pairings. Unless you're simply concerned with there being a single, dynamic lib to which your program (or all C++ progs) can link on a given host, instead of statically linking some lib you compile locally or even directly introducing the spec into your program as another source?
I think the consensus previously was that the validation tooling will be in this repo in the future (so that when we ship a spec, we have corresponding validation tooling that's in sync with the spec and has the same versioning). As a result, we'll be adding source code into this repo in the future (probably Go). I'd actually suggest that we put the CSI library code into this repo as well (instead of a separate repo) for the same reason mentioned above. This is actually what CNI is doing.

The library needs to be in multiple languages (at least C++ and Go). For C++, a build system that generates headers and libraries is pretty standard practice. For Go, I guess it'll just be source code that downstream projects will use. The only tricky part is the generation of the protobuf file from the spec doc. Here is my suggestion:
We can add the |
Closing this issue as #74 is merged.
There's a very nice & useful makefile in the spec repo that will extract the protobuf snippets from `spec.md` and generate golang code from that protobuf, but we lack a good way to make that makefile work with standard go tooling.

In the normal golang world, I can write a plugin that imports `github.com/container-storage-interface/spec` and then I can `go get` that plugin, and go will fetch all the various code packages and build my plugin executable. But that mechanism only works if the go code is actually available. As it stands today, that `go get` would clone the spec repo and then fail to build because the code is not yet generated. For that reason, by convention `go generate`d code is normally checked in. (Checking in generated code seems kind of gross if you are coming to golang from just about any other programming language, but nonetheless, that seems to be the convention.)

In order to make CSI more go-friendly, I propose to have a CI task that watches for changes to the spec repo and generates the protobuf and go code whenever there is a spec change. We're making a version of that in our cloudfoundry pipelines, but we're happy to push the generated code back to a CSI repo if others agree that this is desirable.