Conversation
This adds `go_vet_test`. It uses the GOPATH structure built by one or more `go_path` rules to run `go tool vet`; the test fails if the vet tool reports any issues.
```diff
@@ -0,0 +1,69 @@
# Copyright 2014 The Bazel Authors. All rights reserved.
```
s/2014/2017/
Will go_fmt_test and go_fix_test also be in this file in the future? tools.bzl is an unfortunately generic name, even though it is accurate. Maybe go_tools.bzl or tool_tests.bzl would be better.
Not all things would be tests.
I was going to stick all the tools in one file, but how about this structure: a tools directory with a `.bzl` file per tool, e.g. go/private/tools/vet.bzl
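A rough sketch of that proposed layout (the file names beyond vet.bzl are assumptions about future tools, not settled structure):

```
go/private/tools/
    vet.bzl   # go_vet_test (this PR)
    fmt.bzl   # go_fmt_test (hypothetical future tool)
    fix.bzl   # go_fix_test (hypothetical future tool)
```

Each tool file would then be loaded individually, keeping the generic tools.bzl name out of the public API.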
go/private/tools.bzl
```starlark
go_toolchain = get_go_toolchain(ctx)
script_file = ctx.new_file(ctx.label.name + ".bash")
gopath = []
files = ctx.files.data
```
In #726, they copied lists from `ctx.attr` (as `ctx.attr.linkopts[:]`). That seems to imply `ctx.files.data` will be immutable in the future.
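A minimal sketch of that defensive-copy pattern (`extra_inputs` here is illustrative, not part of the rule):

```starlark
# Slicing yields a new, mutable list even if ctx.files.data
# becomes immutable in a future Bazel release.
files = ctx.files.data[:]
files.extend(extra_inputs)  # safe: we only mutate our copy
```

The cost of the copy is trivial next to the safety of never mutating a list Bazel hands you.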
Moving tool rules into their own folder, and having a separate bzl file per tool.
This looks really great and solves one of our annoying issues! One question: it looks like the suggested way to use this is to say

```starlark
go_vet_test(
    name = "vet",
    data = [":all_srcs"],
)
```

For Kubernetes, running
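For context, the `:all_srcs` target in the snippet above would be an ordinary filegroup; a plausible definition (the glob pattern is an assumption about how a consumer would wire this up) is:

```starlark
filegroup(
    name = "all_srcs",
    # Collect every Go source file in the package tree so the vet
    # test sees the whole repository at once.
    srcs = glob(["**/*.go"]),
)
```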
@spxtr maybe lint checks in Bazel are weird.
I'm speculating here, but for v2 of these tests, I wonder if we could produce some artifact after a successful test of a package, then only re-test when the inputs change. The question is whether it would be possible to save a lot of extra configuration (avoiding

It would also be good to test a subset of packages. Maybe vendored packages have warnings you don't want to hear about, for example.
This is not intended as the long-term interface to vet; it was written more as a proof that the path code was good enough to write these tools on top of. It is easy to make it slightly more flexible if you want to start using it now (for instance, making it non-recursive, or supporting a list of packages to actually check), and I am happy to do so, but it will definitely change again in the future.

Per-package vet tests with correct dependencies are clearly the way to go, but there are some costs:

* Every separate vet test has to build its own gopath copy that includes that package and all its dependencies.
* If we add it as a separate rule, what about all the other checks (gosimple, gofmt, errcheck, staticcheck, unused, etc.)?

I am currently considering the concept of adding a new toolchain type (go_validator) and having a go_validate_test() rule that uses that toolchain to do its work. Then you could control the set of static checks by switching the toolchain registry. This would allow gazelle to add exactly one extra rule per library, while still giving users full control.

It's still very much a nascent idea, though; I need to try it out and see if it holds up in the real world first.
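To make the go_validator idea concrete, here is a hedged sketch of what such a toolchain might look like. Every name below is an assumption; none of this is real rules_go API:

```starlark
# Hypothetical sketch only: a toolchain carrying a configurable set
# of static checkers (vet, gofmt, errcheck, staticcheck, ...).

def _go_validator_toolchain_impl(ctx):
    return [platform_common.ToolchainInfo(
        # The checker binaries this toolchain runs over each package.
        validators = ctx.attr.validators,
    )]

go_validator_toolchain = rule(
    _go_validator_toolchain_impl,
    attrs = {
        "validators": attr.label_list(cfg = "exec"),
    },
)

# A user would then pick their check set by registering a toolchain,
# e.g. register_toolchains("//tools:strict_go_validator"), and gazelle
# would only need to emit one go_validate_test() per library.
```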