protobuf/protobuf.bzl

# -*- mode: python; -*- PYTHON-PREPROCESSING-REQUIRED
def _GetPath(ctx, path):
  if ctx.label.workspace_root:
    return ctx.label.workspace_root + '/' + path
  else:
    return path

def _GenDir(ctx):
  if not ctx.attr.includes:
    return ctx.label.workspace_root
  if not ctx.attr.includes[0]:
    return _GetPath(ctx, ctx.label.package)
  if not ctx.label.package:
    return _GetPath(ctx, ctx.attr.includes[0])
  return _GetPath(ctx, ctx.label.package + '/' + ctx.attr.includes[0])

def _CcOuts(srcs, use_grpc_plugin=False):
  ret = [s[:-len(".proto")] + ".pb.h" for s in srcs] + \
        [s[:-len(".proto")] + ".pb.cc" for s in srcs]
  if use_grpc_plugin:
    ret += [s[:-len(".proto")] + ".grpc.pb.h" for s in srcs] + \
           [s[:-len(".proto")] + ".grpc.pb.cc" for s in srcs]
  return ret

def _PyOuts(srcs):
  return [s[:-len(".proto")] + "_pb2.py" for s in srcs]

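# Illustrative mapping of proto sources to generated outputs (hypothetical
# file names, shown only to document the naming convention above):
#   _CcOuts(["foo/bar.proto"], use_grpc_plugin=True)
#       -> ["foo/bar.pb.h", "foo/bar.pb.cc",
#           "foo/bar.grpc.pb.h", "foo/bar.grpc.pb.cc"]
#   _PyOuts(["foo/bar.proto"])
#       -> ["foo/bar_pb2.py"]
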
def _RelativeOutputPath(path, include, dest=""):
2015-10-20 00:19:49 +00:00
if include == None:
return path
if not path.startswith(include):
fail("Include path %s isn't part of the path %s." % (include, path))
if include and include[-1] != '/':
include = include + '/'
Bazel build: Keep generated sources and Python runtime in the same directory. Users often encounter a Python import error when trying to build Python protos if protobuf is installed locally on the machine. In this case, Python ends up looking in the wrong directory when importing files (see bazelbuild/bazel#1209 and tensorflow/tensorflow#2021). It seems that the problem is caused by Python getting confused when there are Python source files that are meant to be part of the same package but are in separate directories. Prior to #1233, the Bazel build setup would copy the Python runtime sources and all generated sources for the builtin protos into the root directory (assuming that the protobuf tree is vendored in a google/protobuf directory). With #1233, the two sets of sources are kept in their respective directories but both `src/` and `python/` are added to the `PYTHONPATH` using the new `imports` attribute of the Bazel Python rules. However, both the runtime sources and the generated sources are under the same package: `google.protobuf`, causing Python to become confused when trying to import modules that are in the other directory. This patch adds a workaround to the Bazel build to add a modified version of the original `internal_copied_filegroup` macro to copy the `.proto` files under `src/` to `python/` before building the `py_proto_library` targets for the builtin protos. This ensures that the generated sources for the builtin protos will be in the same directory as the corresponding runtime sources. This patch was tested with the following: * All Python tests in protobuf * All Python tests in tensorflow * All tests in [Skydoc](https://github.com/bazelbuild/skydoc) * Importing protobuf as `//google/protobuf` * Importing and binding targets under `//external` * Importing protobuf as `//third_party/protobuf`
2016-05-20 23:49:04 +00:00
if dest and dest[-1] != '/':
dest = dest + '/'
2015-10-20 00:19:49 +00:00
path = path[len(include):]
Bazel build: Keep generated sources and Python runtime in the same directory. Users often encounter a Python import error when trying to build Python protos if protobuf is installed locally on the machine. In this case, Python ends up looking in the wrong directory when importing files (see bazelbuild/bazel#1209 and tensorflow/tensorflow#2021). It seems that the problem is caused by Python getting confused when there are Python source files that are meant to be part of the same package but are in separate directories. Prior to #1233, the Bazel build setup would copy the Python runtime sources and all generated sources for the builtin protos into the root directory (assuming that the protobuf tree is vendored in a google/protobuf directory). With #1233, the two sets of sources are kept in their respective directories but both `src/` and `python/` are added to the `PYTHONPATH` using the new `imports` attribute of the Bazel Python rules. However, both the runtime sources and the generated sources are under the same package: `google.protobuf`, causing Python to become confused when trying to import modules that are in the other directory. This patch adds a workaround to the Bazel build to add a modified version of the original `internal_copied_filegroup` macro to copy the `.proto` files under `src/` to `python/` before building the `py_proto_library` targets for the builtin protos. This ensures that the generated sources for the builtin protos will be in the same directory as the corresponding runtime sources. This patch was tested with the following: * All Python tests in protobuf * All Python tests in tensorflow * All tests in [Skydoc](https://github.com/bazelbuild/skydoc) * Importing protobuf as `//google/protobuf` * Importing and binding targets under `//external` * Importing protobuf as `//third_party/protobuf`
2016-05-20 23:49:04 +00:00
return dest + path
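# Worked example (hypothetical arguments, for illustration only):
#   _RelativeOutputPath("python/google/protobuf/descriptor.py", "python")
#       -> "google/protobuf/descriptor.py"
#   _RelativeOutputPath("src/google/protobuf/any.proto", "src", "python")
#       -> "python/google/protobuf/any.proto"
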
def _proto_gen_impl(ctx):
  """General implementation for generating protos"""
  srcs = ctx.files.srcs
  deps = []
  deps += ctx.files.srcs
  gen_dir = _GenDir(ctx)
  if gen_dir:
    import_flags = ["-I" + gen_dir, "-I" + ctx.var["GENDIR"] + "/" + gen_dir]
  else:
    import_flags = ["-I."]

  for dep in ctx.attr.deps:
    import_flags += dep.proto.import_flags
    deps += dep.proto.deps

  args = []
  if ctx.attr.gen_cc:
    args += ["--cpp_out=" + ctx.var["GENDIR"] + "/" + gen_dir]
  if ctx.attr.gen_py:
    args += ["--python_out=" + ctx.var["GENDIR"] + "/" + gen_dir]
  if ctx.executable.grpc_cpp_plugin:
    args += ["--plugin=protoc-gen-grpc=" + ctx.executable.grpc_cpp_plugin.path]
    args += ["--grpc_out=" + ctx.var["GENDIR"] + "/" + gen_dir]

  if args:
    ctx.action(
        inputs=srcs + deps,
        outputs=ctx.outputs.outs,
        arguments=args + import_flags + [s.path for s in srcs],
        executable=ctx.executable.protoc,
    )

  return struct(
      proto=struct(
          srcs=srcs,
          import_flags=import_flags,
          deps=deps,
      ),
  )

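# Note: the struct returned above is the lightweight "proto" provider that the
# `for dep in ctx.attr.deps` loop in this implementation consumes
# (dep.proto.import_flags and dep.proto.deps), letting chained _proto_gen
# targets propagate include paths and transitive .proto inputs.
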
_proto_gen = rule(
    attrs = {
        "srcs": attr.label_list(allow_files = True),
        "deps": attr.label_list(providers = ["proto"]),
        "includes": attr.string_list(),
        "protoc": attr.label(
            cfg = HOST_CFG,
            executable = True,
            single_file = True,
            mandatory = True,
        ),
        "grpc_cpp_plugin": attr.label(
            cfg = HOST_CFG,
            executable = True,
            single_file = True,
        ),
        "gen_cc": attr.bool(),
        "gen_py": attr.bool(),
        "outs": attr.output_list(),
    },
    output_to_genfiles = True,
    implementation = _proto_gen_impl,
)

def cc_proto_library(
    name,
    srcs=[],
    deps=[],
    cc_libs=[],
    include=None,
    protoc="//:protoc",
    internal_bootstrap_hack=False,
    use_grpc_plugin=False,
    default_runtime="//:protobuf",
    **kargs):
  """Bazel rule to create a C++ protobuf library from proto source files.

  NOTE: the rule is only an internal workaround to generate protos. The
  interface may change and the rule may be removed when Bazel has introduced
  the native rule.

  Args:
    name: the name of the cc_proto_library.
    srcs: the .proto files of the cc_proto_library.
    deps: a list of dependency labels; must be cc_proto_library.
    cc_libs: a list of other cc_library targets depended on by the generated
        cc_library.
    include: a string indicating the include path of the .proto files.
    protoc: the label of the protocol compiler used to generate the sources.
    internal_bootstrap_hack: a flag indicating that the cc_proto_library is
        used only for bootstrapping. When set to True, no files will be
        generated. The rule will simply be a provider for .proto files, so
        that other cc_proto_library targets can depend on it.
    use_grpc_plugin: a flag to indicate whether to call the gRPC C++ plugin
        when processing the proto files.
    default_runtime: the implicit default runtime that the generated
        cc_library target will depend on.
    **kargs: other keyword arguments that are passed to cc_library.
  """
  includes = []
  if include != None:
    includes = [include]

  if internal_bootstrap_hack:
    # For pre-checked-in generated files, we add the internal_bootstrap_hack
    # which will skip the codegen action.
    _proto_gen(
        name=name + "_genproto",
        srcs=srcs,
        deps=[s + "_genproto" for s in deps],
        includes=includes,
        protoc=protoc,
        visibility=["//visibility:public"],
    )
    # An empty cc_library to make rule dependency consistent.
    native.cc_library(
        name=name,
        **kargs)
    return

  grpc_cpp_plugin = None
  if use_grpc_plugin:
    grpc_cpp_plugin = "//external:grpc_cpp_plugin"

  outs = _CcOuts(srcs, use_grpc_plugin)

  _proto_gen(
      name=name + "_genproto",
      srcs=srcs,
      deps=[s + "_genproto" for s in deps],
      includes=includes,
      protoc=protoc,
      grpc_cpp_plugin=grpc_cpp_plugin,
      gen_cc=1,
      outs=outs,
      visibility=["//visibility:public"],
  )

  if default_runtime and not default_runtime in cc_libs:
    cc_libs += [default_runtime]
  if use_grpc_plugin:
    cc_libs += ["//external:grpc_lib"]

  native.cc_library(
      name=name,
      srcs=outs,
      deps=cc_libs + deps,
      includes=includes,
      **kargs)

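# A minimal usage sketch (hypothetical BUILD file; the load path, target and
# file names are illustrative, not taken from this repository):
#
#   load("//:protobuf.bzl", "cc_proto_library")
#
#   cc_proto_library(
#       name = "foo_cc_proto",
#       srcs = ["foo.proto"],
#       deps = [":bar_cc_proto"],  # deps must themselves be cc_proto_library
#   )
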
def internal_gen_well_known_protos_java(srcs):
  """Bazel rule to generate the gen_well_known_protos_java genrule.

  Args:
    srcs: the well-known protos.
  """
  root = Label("%s//protobuf_java" % (REPOSITORY_NAME)).workspace_root
  if root == "":
    include = " -Isrc "
  else:
    include = " -I%s/src " % root
  native.genrule(
      name = "gen_well_known_protos_java",
      srcs = srcs,
      outs = [
          "wellknown.srcjar",
      ],
      cmd = "$(location :protoc) --java_out=$(@D)/wellknown.jar" +
            " %s $(SRCS) " % include +
            " && mv $(@D)/wellknown.jar $(@D)/wellknown.srcjar",
      tools = [":protoc"],
  )

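# Usage sketch (hypothetical; in practice this macro is invoked with the
# builtin well-known .proto files as srcs; the label below is illustrative):
#
#   internal_gen_well_known_protos_java(
#       srcs = ["//:well_known_protos"],
#   )
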
def internal_copied_filegroup(name, srcs, strip_prefix, dest, **kwargs):
  """Macro to copy files to a different directory and then create a filegroup.

  This is used by the //:protobuf_python py_proto_library target to work
  around an issue caused by Python source files that are part of the same
  Python package being in separate directories.

  Args:
    srcs: The source files to copy and add to the filegroup.
    strip_prefix: Path to the root of the files to copy.
    dest: The directory to copy the source files into.
    **kwargs: extra arguments that will be passed to the filegroup.
  """
  outs = [_RelativeOutputPath(s, strip_prefix, dest) for s in srcs]

  native.genrule(
      name = name + "_genrule",
      srcs = srcs,
      outs = outs,
      cmd = " && ".join(
          ["cp $(location %s) $(location %s)" %
           (s, _RelativeOutputPath(s, strip_prefix, dest)) for s in srcs]),
  )

  native.filegroup(
      name = name,
      srcs = outs,
      **kwargs)

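# A usage sketch (hypothetical target name and paths, for illustration): copy
# the builtin .proto files from src/ next to the Python runtime sources so the
# generated *_pb2.py modules land in the same directory tree.
#
#   internal_copied_filegroup(
#       name = "protos_python",
#       srcs = ["src/google/protobuf/any.proto"],
#       strip_prefix = "src",
#       dest = "python",
#   )
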
def py_proto_library(
    name,
    srcs=[],
    deps=[],
    py_libs=[],
    py_extra_srcs=[],
    include=None,
    default_runtime="//:protobuf_python",
    protoc="//:protoc",
    **kargs):
  """Bazel rule to create a Python protobuf library from proto source files.

  NOTE: the rule is only an internal workaround to generate protos. The
  interface may change and the rule may be removed when Bazel has introduced
  the native rule.

  Args:
    name: the name of the py_proto_library.
    srcs: the .proto files of the py_proto_library.
    deps: a list of dependency labels; must be py_proto_library.
    py_libs: a list of other py_library targets depended on by the generated
        py_library.
    py_extra_srcs: extra source files that will be added to the output
        py_library. This attribute is used for internal bootstrapping.
    include: a string indicating the include path of the .proto files.
    default_runtime: the implicit default runtime that the generated
        py_library target will depend on.
    protoc: the label of the protocol compiler used to generate the sources.
    **kargs: other keyword arguments that are passed to py_library.
  """
  outs = _PyOuts(srcs)
  includes = []
  if include != None:
    includes = [include]

  _proto_gen(
      name=name + "_genproto",
      srcs=srcs,
      deps=[s + "_genproto" for s in deps],
      includes=includes,
      protoc=protoc,
      gen_py=1,
      outs=outs,
      visibility=["//visibility:public"],
  )

  if default_runtime and not default_runtime in py_libs + deps:
    py_libs += [default_runtime]

  native.py_library(
      name=name,
      srcs=outs + py_extra_srcs,
      deps=py_libs + deps,
      imports=includes,
      **kargs)

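# A minimal usage sketch (hypothetical BUILD file; the load path and names are
# illustrative, not taken from this repository):
#
#   load("//:protobuf.bzl", "py_proto_library")
#
#   py_proto_library(
#       name = "foo_py_proto",
#       srcs = ["foo.proto"],
#       deps = [":bar_py_proto"],  # deps must themselves be py_proto_library
#   )
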
def internal_protobuf_py_tests(
    name,
    modules=[],
    **kargs):
  """Bazel rules to create batch tests for protobuf internal.

  Args:
    name: the name of the rule.
    modules: a list of modules for tests. The macro will create a py_test for
        each entry, with the source "python/google/protobuf/internal/%s.py".
    **kargs: extra parameters that will be passed into each py_test.
  """
  for m in modules:
    s = "python/google/protobuf/internal/%s.py" % m
    native.py_test(
        name="py_%s" % m,
        srcs=[s],
        main=s,
        **kargs)

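# A usage sketch (hypothetical module and dependency names): this expands into
# one py_test per listed module, named py_<module>.
#
#   internal_protobuf_py_tests(
#       name = "python_tests_batch",
#       modules = [
#           "descriptor_test",
#           "reflection_test",
#       ],
#       deps = [":python_test_lib"],  # forwarded to each py_test via **kargs
#   )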