The gtest source was changed in #7237 on an opt-in basis, and the change has
shipped in releases since 3.12.0. The comments state that the default should
change in 3.13.0, but that didn't quite happen.
This change does flip the default and updates the comments to say 3.14.0.
This change updates the README.md to describe the order of resolution
between `options['generate_py_protobufs']`, `--protoc`, and the `PROTOC`
env var.
Since the `setup.py` script is not the main way to find docs, this change
deletes the parts of the docstring that are redundant with `README.md`.
Currently, the logic in `generate_py_protobufs` cannot find a custom `protoc`, even though it's supposed to be possible. Fixes (the resulting lookup is sketched after this list):
1. Mark the `--protoc` flag as accepting an argument.
2. If the `--protoc` flag was not passed, try finding `PROTOC` in the environment.
3. (Existing behavior) Otherwise, fall back to `spawn.find_executable`.
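A minimal sketch of that lookup, assuming nothing beyond the three steps above (this is not the actual `protobuf_distutils` source, and the helper name `_resolve_protoc` is hypothetical):
```python
import os
from distutils import spawn

def _resolve_protoc(protoc_option):
    # protoc_option holds the value of the --protoc flag (or the equivalent
    # setting under options['generate_py_protobufs'] in setup.py), if any.
    if protoc_option:
        return protoc_option
    # Otherwise, honor the PROTOC environment variable...
    if 'PROTOC' in os.environ:
        return os.environ['PROTOC']
    # ...and finally fall back to searching $PATH (the existing behavior).
    return spawn.find_executable('protoc')
```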
Hat tip to @bobhancock for uncovering the problem(s).
The Python C++ extension targets are not used unless
`--@:use_fast_cpp_protos=true`, and may not even be able to build
if the Python headers are missing (note that
`//util/python:python_headers`, bound to `@python_headers//` in
`//:WORKSPACE`, is not currently sufficient). This change adds the
`"manual"` tag to these targets, so that they do not cause
`bazel test ...` to fail when Python headers are missing. Without
the `"manual"` tag, these targets are always selected by wildcard patterns
like `bazel test ...`, even if `--@:use_fast_cpp_protos=false`.
The `:cc_proto_blacklist_test` target is metastable, depending on
whether the `--proto_toolchain_for_cc=` flag names a target with
or without the `@com_google_protobuf//` prefix. We use the correct
prefix for Bazel's default in `kokoro/linux/bazel/build.sh`, but
running `bazel test :cc_proto_blacklist_test` directly (with or without a
leading `//`) fails consistently. Hopefully, this will be fixed when
bazelbuild/bazel#10590 is addressed.
Using a non-versioned script for `build_file` means that every Kokoro Python
config runs Tox, and Tox in turn tests every Python version, so the work is
multiplied by the number of configs. For example, when the `python38` config
runs Tox, the Tox layer runs on Python 3.8 and then tests all of the Python
versions individually, including, for example, python27.
This change fixes the `build_file` paths to point to the version-specific
build script in the same directory. For example,
`kokoro/linux/python27/presubmit.cfg` now uses
`kokoro/linux/python27/build.sh`.
Some additional fixes:
* Use `python -m tox` in tests.sh instead of just `tox`. This helps non-site
installations of tox, where the `tox` script may not be on `$PATH`.
* Ensure that tox and other Python build-related packages are available in
Python testing images. (New images have been pushed.)
* Disable `--warnings_as_errors` due to a deprecated function.
* Remove apt lists per [Docker best practices][1].
[1]: https://docs.docker.com/develop/develop-images/dockerfile_best-practices/
The Codespell checker is saying we should replace "files'" with
"file's", but this is not a real issue since the single-quote on the end
is being used as a quote and not an apostrophe. Let's tell Codespell to
ignore this.
Since `google` is a Python namespace package, the `google/__init__.py`
file should be omitted to avoid collisions. For example, its presence
may cause other Pip packages in the google namespace not to be found.
This change narrows which `__init__.py` files are included in the
`//:protobuf_runtime` target to include only those under
`google/protobuf`. It also moves their canonical inclusion to the
internal `//:python_srcs` rule, eliminating the dual specification in
`python_protobuf.extra_srcs`.
This is the difference in files, lexicographically ordered:
```
$ bazel cquery 'labels(srcs, kind(py_library, deps(:protobuf_python)))' | sort > /tmp/deps-before.txt  # run again after the change, writing deps-after.txt
$ diff /tmp/deps-{before,after}.txt
1,5d0
< //:python/compatibility_tests/v2.5.0/tests/__init__.py (null)
< //:python/compatibility_tests/v2.5.0/tests/google/__init__.py (null)
< //:python/compatibility_tests/v2.5.0/tests/google/protobuf/__init__.py (null)
< //:python/compatibility_tests/v2.5.0/tests/google/protobuf/internal/__init__.py (null)
< //:python/google/__init__.py (null)
51d45
< //:python/protobuf_distutils/protobuf_distutils/__init__.py (null)
```
This seems like a desirable change, since it avoids polluting other
packages, too.
This change is conceptually similar to #7877, but for Bazel.
In #713 and #1296, the `google` package in protobuf sources was found
to cause conflicts with other Google projects, because it was not
properly configured as a namespace package [1]. The initial fix in
786f80f addressed part of the issue, and #1298 fixed the rest.
However, 786f80f (the initial fix) also made `google.protobuf` and
`google.protobuf.pyext` into namespace packages. This was not correct:
they are both regular (non-namespace) subpackages.
The follow-up #1298 did not declare them as namespace packages either,
so the namespace registration behavior has remained, but without any
benefit.
This change removes the unnecessary namespace registration, which has
substantial overhead, and thereby noticeably reduces startup time when
using protobuf.
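For reference, the pkg_resources-style namespace registration in an `__init__.py` looks roughly like the boilerplate below (illustrative; per [1], only the top-level namespace package, `google/__init__.py`, needs it, not regular subpackages like `google.protobuf`):
```python
# pkg_resources-style namespace declaration (illustrative boilerplate).
# Dropping it from google/protobuf/__init__.py and
# google/protobuf/pyext/__init__.py makes them plain packages again and
# avoids the costly pkg_resources import at interpreter startup.
try:
    __import__('pkg_resources').declare_namespace(__name__)
except ImportError:
    __path__ = __import__('pkgutil').extend_path(__path__, __name__)
```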
Because this change affects the import internals, quantifying the
overhead requires a full tear-down/start-up of the Python interpreter.
So, to capture the full cost for every run, I measured the time to
launch a _fresh_ Python instance in a subprocess, varying the
imports and code under test. In other words, I used `timeit` to
measure the time to launch a _fresh_ Python subprocess which actually
performs the imports.
* Reference: normal Python startup (i.e., don't import protobuf at all).
```
% python3 -m timeit -s 'import subprocess' -r 3 -n 10 'subprocess.call(["python3", "-c", "pass"])'
10 loops, best of 3: 27.1 msec per loop
```
* Baseline: cost to import `google.protobuf.descriptor`, with
extraneous namespace packages.
```
% python3 -m timeit -s 'import subprocess' -r 3 -n 10 'subprocess.call(["python3", "-c", "import google.protobuf.descriptor"])'
10 loops, best of 3: 133 msec per loop
```
* This change: cost to import `google.protobuf.descriptor`, without
extraneous namespace packages.
```
% python3 -m timeit -s 'import subprocess' -r 3 -n 10 'subprocess.call(["python3", "-c", "import google.protobuf.descriptor"])'
10 loops, best of 3: 43.1 msec per loop
```
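Net of the ~27 ms interpreter baseline, the import overhead itself drops from roughly 106 ms to about 16 ms per process.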
[1]: https://packaging.python.org/guides/packaging-namespace-packages/
wheel has been required since version 3.13.0 and commit
ff92cee10b.
This will result in the following build failure when cross-compiling:
```
Download error on https://pypi.org/simple/wheel/: unknown url type: https -- Some packages may not be found!
Couldn't find index page for 'wheel' (maybe misspelled?)
Download error on https://pypi.org/simple/: unknown url type: https -- Some packages may not be found!
No local packages or working download links found for wheel
```
Remove the wheel requirement from setup.py, as it is only needed by
release.sh, not by setup.py.
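A sketch of the resulting change (assuming `wheel` appeared in `setup_requires`; the other arguments are elided):
```python
from setuptools import setup

setup(
    name='protobuf',
    # ... other arguments elided ...
    # Previously: setup_requires=['wheel']
    # setup_requires makes setuptools fetch wheel at build time when it is not
    # already installed, which fails in the cross-compilation environment
    # above; building wheels is handled by release.sh, so the requirement can
    # simply be dropped.
    setup_requires=[],
)
```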
Fixes:
- http://autobuild.buildroot.org/results/371c686a10d6870933011b46d36b1879d29046b9
Signed-off-by: Fabrice Fontaine <fontaine.fabrice@gmail.com>
This updates README.md to use an example with `pip` instead of `setup.py`.
README.md is then used as the `long_description` argument to `setup()`, so
that the documentation shows up on PyPI.
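The usual pattern looks roughly like this (a sketch rather than the actual diff; the package name and version are illustrative):
```python
import os
from setuptools import setup

# Use README.md as the long description so PyPI renders it on the project page.
here = os.path.dirname(os.path.abspath(__file__))
with open(os.path.join(here, 'README.md')) as f:
    long_description = f.read()

setup(
    name='example-package',  # illustrative
    version='0.0.1',         # illustrative
    long_description=long_description,
    long_description_content_type='text/markdown',
)
```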
This extension allows Python code to be generated from `.proto` files via
`setup.py`, so the generated modules are created as part of a normal Python
build. The extension uses an already-existing `protoc` binary, which can be
explicitly specified if needed.
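Usage in a project's `setup.py` looks roughly like this (a sketch; the option names and paths shown are illustrative, and the package's README.md is the authoritative reference):
```python
from setuptools import setup

setup(
    name='example_project',
    # Needed at build time only (illustrative):
    setup_requires=['protobuf_distutils'],
    options={
        'generate_py_protobufs': {
            'source_dir': 'path/to/protos',  # directory containing .proto files
            'output_dir': '.',               # where generated *_pb2.py files go
            'protoc': '/path/to/protoc',     # optional: explicit protoc binary
        },
    },
)
```
With that in place, running `python setup.py generate_py_protobufs` emits the generated modules as part of the build.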
This commit makes a couple of fixes:
- Make sure we always update the suffix, since even for a non-RC release
we need to clear the suffix.
- Make sure the suffix is properly quoted and begins with `-rc`.
See https://docs.gradle.org/current/userguide/java_library_plugin.html
> The compile, testCompile, runtime and testRuntime configurations inherited from the Java plugin are still available but are deprecated. You should avoid using them, as they are only kept for backwards compatibility.
UnknownField::AddAll() is called (multiple times) from UnknownField::MergeFrom(). The `extras` parameter is one of: `varintList`, `fixed32List`, `fixed64List`, `lengthDelimitedList`, or `groupList`. All of these members can be null, and they are appropriately checked at other usage sites. When attempting to parse a proto with unknown extensions, a NullReferenceException (NRE) is thrown.
This adds the appropriate null check inside UnknownField::AddAll().