Sync from Piper @452551755

PROTOBUF_SYNC_PIPER
This commit is contained in:
Sandy Zhang 2022-06-02 16:14:25 +00:00
parent 6dd8af4ecf
commit f138d5de25
60 changed files with 1051 additions and 784 deletions


@ -1,5 +1,4 @@
# These are fetched as external repositories.
third_party/abseil-cpp
third_party/benchmark
third_party/googletest
_build/

.gitmodules vendored

@ -5,7 +5,3 @@
path = third_party/googletest
url = https://github.com/google/googletest.git
ignore = dirty
[submodule "third_party/abseil-cpp"]
path = third_party/abseil-cpp
url = https://github.com/abseil/abseil-cpp.git
branch = lts_2021_11_02


@ -1,3 +1,63 @@
2022-05-27 version 21.1 (C++/Java/Python/PHP/Objective-C/C#/Ruby)
C++
* cmake: Revert "Fix cmake install targets (#9822)" (#10060)
* Remove Abseil dependency from CMake build (#10056)
Python
* Update python wheel metadata with more information incl. required python version (#10058)
* Fix segmentation fault when instantiating field via repeated field assignment (#10066)
2022-05-25 version 21.0 (C++/Java/Python/PHP/Objective-C/C#/Ruby)
C++
* cmake: Call get_filename_component() with DIRECTORY mode instead of PATH mode (#9614)
* Escape GetObject macro inside protoc-generated code (#9739)
* Update CMake configuration to add a dependency on Abseil (#9793)
* Fix cmake install targets (#9822)
* Use __constinit only in GCC 12.2 and up (#9936)
Java
* Update protobuf_version.bzl to separate protoc and per-language java … (#9900)
Python
* Increment python major version to 4 in version.json for python upb (#9926)
* The C extension module for Python has been rewritten to use the upb library.
This is expected to deliver significant performance benefits, especially when
parsing large payloads. There are some minor breaking changes, but these
should not impact most users. For more information see:
https://developers.google.com/protocol-buffers/docs/news/2022-05-06#python-updates
* Fixed win32 build and fixed str(message) on all Windows platforms. (#9976)
* The binary wheel for macOS now supports Apple silicon.
PHP
* [PHP] fix PHP build system (#9571)
* Fix building packaged PHP extension (#9727)
* fix: reserve "ReadOnly" keyword for PHP 8.1 and add compatibility (#9633)
* fix: phpdoc syntax for repeatedfield parameters (#9784)
* fix: phpdoc for repeatedfield (#9783)
* Change enum string name for reserved words (#9780)
* chore: [PHP] fix phpdoc for MapField keys (#9536)
* Fixed PHP SEGV by not writing to shared memory for zend_class_entry. (#9996)
Ruby
* Allow pre-compiled binaries for ruby 3.1.0 (#9566)
* Implement `respond_to?` in RubyMessage (#9677)
* [Ruby] Fix RepeatedField#last, #first inconsistencies (#9722)
* Do not use range based UTF-8 validation in truffleruby (#9769)
* Improve range handling logic of `RepeatedField` (#9799)
* Support x64-mingw-ucrt platform
Other
* [Kotlin] remove redundant public modifiers for compiled code (#9642)
* [C#] Update GetExtension to support getting typed value (#9655)
* Fix invalid dependency manifest when using `descriptor_set_out` (#9647)
* Fix C# generator handling of a field named "none" in a oneof (#9636)
* Add initial version.json file for 21-dev (#9840)
* Remove duplicate java generated code (#9909)
* Cherry-pick PR #9981 into 21.x branch (#10000)
2022-05-19 version 21.0-rc2 (C++/Java/Python/PHP/Objective-C/C#/Ruby)
Python


@ -5,7 +5,7 @@
# dependent projects use the :git notation to refer to the library.
Pod::Spec.new do |s|
s.name = 'Protobuf'
s.version = '3.21.0-rc1'
s.version = '3.21.1'
s.summary = 'Protocol Buffers v.3 runtime library for Objective-C.'
s.homepage = 'https://github.com/protocolbuffers/protobuf'
s.license = 'BSD-3-Clause'


@ -5,7 +5,7 @@
<title>Google Protocol Buffers tools</title>
<summary>Tools for Protocol Buffers - Google's data interchange format.</summary>
<description>See project site for more info.</description>
<version>3.21.0-rc2</version>
<version>3.21.1</version>
<authors>Google Inc.</authors>
<owners>protobuf-packages</owners>
<licenseUrl>https://github.com/protocolbuffers/protobuf/blob/main/LICENSE</licenseUrl>


@ -4,7 +4,7 @@
<Description>C# runtime library for Protocol Buffers - Google's data interchange format.</Description>
<Copyright>Copyright 2015, Google Inc.</Copyright>
<AssemblyTitle>Google Protocol Buffers</AssemblyTitle>
<VersionPrefix>3.21.0-rc2</VersionPrefix>
<VersionPrefix>3.21.1</VersionPrefix>
<!-- C# 7.2 is required for Span/BufferWriter/ReadOnlySequence -->
<LangVersion>7.2</LangVersion>
<Authors>Google Inc.</Authors>

docs/cpp_build_systems.md Normal file

@ -0,0 +1,344 @@
# How Protobuf supports multiple C++ build systems
This document explains how the Protobuf project supports multiple C++ build
systems.
## Background
Protobuf primarily uses [Bazel](https://bazel.build) to build the Protobuf C++
runtime and Protobuf compiler[^historical_sot]. However, there are several
different build systems in common use for C++, each one of which requires
essentially a complete copy of the same build definitions.
[^historical_sot]:
On a historical note, prior to its [release as Open Source
Software](https://opensource.googleblog.com/2008/07/protocol-buffers-googles-data.html),
the Protobuf project was developed using Google's internal build system, which
was the predecessor to Bazel (the vast majority of Google's contributions
continue to be developed this way). The Open Source Protobuf project, however,
historically used Autoconf to build the C++ implementation.
Over time, other build systems (including Bazel) have been added, thanks in
large part to substantial contributions from the Open Source community. Since
the Protobuf project deals with multiple languages (all of which ultimately
rely upon C++, for the Protobuf compiler), Bazel is a natural choice for a
project-wide build system -- in fact, Bazel (and its predecessor, Blaze)
was designed in large part to support exactly this type of rich,
multi-language build.
Currently, C++ Protobuf can be built with Bazel, Autotools, and CMake. Each of
these build systems has different semantics and structure, but all of them
share the list of files needed to build the runtime and compiler.
## Design
### Extracting information from Bazel
Bazel's Starlark API provides [aspects](https://bazel.build/rules/aspects) to
traverse the build graph, inspect build rules, define additional actions, and
expose information through
[providers](https://bazel.build/rules/rules#providers). For example, the
`cc_proto_library` rule uses an aspect to traverse the dependency graph of
`proto_library` rules, and dynamically attaches actions to generate C++ code
using the Protobuf compiler and compile using the C++ compiler.
To support multiple build systems, the overall build structure is defined
once for each system, and frequently-changing metadata is exposed from Bazel
in a way that each build definition can include. Primarily,
this means exposing the list of source files in a way that can be included
in other build definitions.
Two aspects are used to extract this information from the Bazel build
definitions:
* `cc_file_list_aspect` extracts `srcs`, `hdrs`, and `textual_hdrs` from build
rules like `cc_library`. The sources are exposed through a provider named
`CcFileList`.
* `proto_file_list_aspect` extracts the `srcs` from a `proto_library`, and
also generates the expected filenames that would be generated by the
Protobuf compiler. This information is exposed through a provider named
`ProtoFileList`.
On their own, these aspects have limited utility. However, they can be
instantiated by custom rules, so that an ordinary `BUILD.bazel` target can
produce outputs based on the information gleaned from these aspects.
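As a rough sketch of the mechanism, an aspect of this kind might look like the
following. This is a simplified, hypothetical version: the provider fields
match those shown in the `cquery` output below, but the real
`cc_file_list_aspect` handles more attributes and edge cases.

```starlark
# Hypothetical, simplified sketch of a file-list aspect. The real
# cc_file_list_aspect in //pkg is more involved.
CcFileList = provider(fields = ["srcs", "hdrs", "internal_hdrs", "textual_hdrs"])

def _cc_file_list_aspect_impl(target, ctx):
    attr = ctx.rule.attr
    # Collect the immediate rule's sources and headers (no propagation).
    srcs = [f for t in getattr(attr, "srcs", []) for f in t.files.to_list()]
    hdrs = [f for t in getattr(attr, "hdrs", []) for f in t.files.to_list()]
    return [CcFileList(
        srcs = depset(srcs),
        hdrs = depset(hdrs),
        internal_hdrs = depset(),
        textual_hdrs = depset(),
    )]

cc_file_list_aspect = aspect(implementation = _cc_file_list_aspect_impl)
```

A rule that wants this information attaches the aspect to a dependency
attribute (e.g. `attr.label_list(aspects = [cc_file_list_aspect])`) and reads
the `CcFileList` provider from each dependency.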
### (Aside) Distribution libraries
Bazel's native `cc_library` rule is typically used on a "fine-grained" level, so
that, for example, lightweight unit tests can be written with narrow scope.
Although Bazel does build library artifacts (such as `.so` and `.a` files on
Linux), they correspond to `cc_library` rules.
Since the entire "Protobuf library" includes many constituent `cc_library`
rules, a special rule, `cc_dist_library`, combines several fine-grained
libraries into a single, monolithic library.
For the Protobuf project, these "distribution libraries" are intended to match
the granularity of the Autotools- and CMake-based builds. Since the Bazel-built
distribution library covers the rules with the source files needed by other
builds, the `cc_dist_library` rule invokes the `cc_file_list_aspect` on its
input libraries. The result is that a `cc_dist_library` rule not only produces
composite library artifacts, but also collects and provides the list of sources
that were its inputs.
For example:
```
$ cat cc_dist_library_example/BUILD.bazel
load("@rules_cc//cc:defs.bzl", "cc_library")
load("//pkg:cc_dist_library.bzl", "cc_dist_library")
cc_library(
name = "a",
srcs = ["a.cc"],
)
cc_library(
name = "b",
srcs = ["b.cc"],
deps = [":c"],
)
# N.B.: not part of the cc_dist_library, even though it is in the deps of 'b':
cc_library(
name = "c",
srcs = ["c.cc"],
)
cc_dist_library(
name = "lib",
deps = [
":a",
":b",
],
visibility = ["//visibility:public"],
)
# Note: the output below has been formatted for clarity:
$ bazel cquery //cc_dist_library_example:lib \
--output=starlark \
--starlark:expr='providers(target)["//pkg:cc_dist_library.bzl%CcFileList"]'
struct(
hdrs = depset([]),
internal_hdrs = depset([]),
srcs = depset([
<source file cc_dist_library_example/a.cc>,
<source file cc_dist_library_example/b.cc>,
]),
textual_hdrs = depset([]),
)
```
The upshot is that the "coarse-grained" library can be defined by the Bazel
build, which then exports the list of source files needed to reproduce the
library in a different build system.
One major difference from most Bazel rule types is that the file list aspects do
not propagate. In other words, they only expose the immediate dependency's
sources, not transitive sources. This is for two reasons:
1. Immediate dependencies are conceptually simple, while transitivity requires
substantially more thought. For example, if transitive dependencies were
considered, then some way would be needed to exclude dependencies that
should not be part of the final library (for example, a distribution library
for `//:protobuf` could be defined not to include all of
`//:protobuf_lite`). While dependency elision is an interesting design
problem, the protobuf library is small enough that directly listing
dependencies should not be problematic.
2. Dealing only with immediate dependencies gives finer-grained control over
what goes into the composite library. For example, a Starlark `select()`
could conditionally add fine-grained libraries to some builds, but not
others.
Another subtlety for tests stems from Bazel internals: a slightly different
configuration is used when evaluating `cc_test` rules than when evaluating
`cc_dist_library`. If `cc_test` targets are included in a `cc_dist_library`
rule, and both are evaluated by Bazel, this can result in a build-time error:
the config used for the test contains additional options (telling Bazel how to
execute the test) that the `cc_file_list_aspect` build config lacks. Bazel
detects this as two conflicting actions generating the same outputs. (For
`cc_test` rules, the simplest workaround is to provide sources through a
`filegroup` or similar.)
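Concretely, such a workaround might look like the following (a hypothetical
sketch; actual target names and globs vary by package):

```starlark
# Sketch: expose test sources via a filegroup so that file-list generation
# never needs to evaluate the cc_test target itself.
filegroup(
    name = "test_srcs",
    srcs = glob(["*_test.cc"]),
)

cc_test(
    name = "foo_test",
    srcs = ["foo_test.cc"],
)
```

The file-list generators then reference `:test_srcs` instead of `:foo_test`,
avoiding the conflicting-configuration problem entirely.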
### File list generation
Lists of input files are generated by Bazel in a format that can be imported to
other build systems. Currently, Automake- and CMake-style files can be
generated.
The lists of files are derived from Bazel build targets. The sources can be:
* `cc_dist_library` rules (as described above)
* `proto_library` rules
* individual files
* `filegroup` rules
* `pkg_files` or `pkg_filegroup` rules from
https://github.com/bazelbuild/rules_pkg
For example:
```
$ cat gen_file_lists_example/BUILD.bazel
load("@rules_proto//proto:defs.bzl", "proto_library")
load("//pkg:build_systems.bzl", "gen_cmake_file_lists")
filegroup(
name = "doc_files",
srcs = [
"README.md",
"english_paper.md",
],
)
proto_library(
name = "message",
srcs = ["message.proto"],
)
gen_cmake_file_lists(
name = "source_lists",
out = "source_lists.cmake",
src_libs = {
":doc_files": "docs",
":message": "buff",
"//cc_dist_library_example:c": "distlib",
},
)
$ bazel build gen_file_lists_example:source_lists
$ cat bazel-bin/gen_file_lists_example/source_lists.cmake
# Auto-generated by //gen_file_lists_example:source_lists
#
# This file contains lists of sources based on Bazel rules. It should
# be included from a hand-written CMake file that defines targets.
#
# Changes to this file will be overwritten based on Bazel definitions.
if(${CMAKE_VERSION} VERSION_GREATER 3.10 OR ${CMAKE_VERSION} VERSION_EQUAL 3.10)
include_guard()
endif()
# //gen_file_lists_example:doc_files
set(docs_files
gen_file_lists_example/README.md
gen_file_lists_example/english_paper.md
)
# //gen_file_lists_example:message
set(buff_proto_srcs
gen_file_lists_example/message.proto
)
# //gen_file_lists_example:message
set(buff_srcs
gen_file_lists_example/message.proto.pb.cc
)
# //gen_file_lists_example:message
set(buff_hdrs
gen_file_lists_example/message.proto.pb.h
)
# //gen_file_lists_example:message
set(buff_files
gen_file_lists_example/message-descriptor-set.proto.bin
)
# //cc_dist_library_example:c
set(distlib_srcs
cc_dist_library_example/a.cc
cc_dist_library_example/b.cc
)
# //cc_dist_library_example:c
set(distlib_hdrs
)
```
A hand-written CMake build rule could then use the generated file to define
libraries, such as:
```
include(source_lists.cmake)
add_library(distlib ${distlib_srcs} ${buff_srcs})
```
In addition to `gen_cmake_file_lists`, there is also a `gen_automake_file_lists`
rule. The two rules share most of the same implementation, but define
different file headers and different Starlark "fragment generator" functions
that format the generated list variables.
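The core of a fragment generator is plain string formatting. As an
illustration (written here as ordinary Python, which Starlark closely
resembles; the function name and exact formatting are assumptions, not the
real code in `//pkg:build_systems.bzl`), a CMake-style fragment could be
produced like this:

```python
# Hypothetical sketch of a CMake-style "fragment generator": given a variable
# name and a list of files, emit a set(...) block like those shown above.
def cmake_fragment(varname, files):
    lines = ["set(%s" % varname]
    lines.extend("    %s" % f for f in sorted(files))
    lines.append(")")
    return "\n".join(lines)

print(cmake_fragment(
    "distlib_srcs",
    ["cc_dist_library_example/b.cc", "cc_dist_library_example/a.cc"],
))
```

An Automake-style generator would instead emit `name = \` continuation lines;
swapping in a different fragment function is essentially what distinguishes
the two rule types.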
### Protobuf usage
The main C++ runtimes (lite and full) and the Protobuf compiler use their
corresponding `cc_dist_library` rules to generate file lists. For
`proto_library` targets, the file list generation can extract the source files
directly. For other targets, notably `cc_test` targets, the file list generators
use `filegroup` rules.
In general, adding new targets to a non-Bazel build system in Protobuf (or
adding a new build system altogether) requires some one-time setup:
1. The overall structure of the new build system has to be defined. It should
import lists of files and refer to them by variable, instead of listing
files directly.
2. (Only if the build system is new) A new rule type has to be added to
`//pkg:build_systems.bzl`. Most of the implementation is shared, but a
"fragment generator" is needed to declare a file list variable, and the rule
type itself has to be defined so that it calls the shared implementation.
When files are added or deleted, or when the Protobuf Bazel structure is
changed, these changes may need to be reflected in the file list logic. These
are some example scenarios:
* Files are added to (or removed from) the `srcs` of an existing `cc_library`:
no changes needed. If the `cc_library` is already part of a
`cc_dist_library`, then regenerating the source lists will reflect the
change.
* A `cc_library` is added: the new target may need to be added to the Protobuf
`cc_dist_library` targets, as appropriate.
* A `cc_library` is deleted: if a `cc_dist_library` depends upon the deleted
target, then a build-time error will result. The library needs to be removed
from the `cc_dist_library`.
* A `cc_test` is added or deleted: test sources are handled by `filegroup`
rules defined in the same package as the `cc_test` rule. The `filegroup`s
are usually given a name like `"test_srcs"`, and often use `glob()` to find
sources. This means that adding or removing a test may not require any extra
work, but this can be verified within the same package as the test rule.
* Test-only proto files are added: the `proto_library` might need to be added
to the file list map in `//pkg:BUILD.bazel`, and then the file added to
various build systems. However, most test-only protos are already exposed
through libraries like `//src/google/protobuf:test_protos`.
If there are changes, then the regenerated file lists need to be copied back
into the repo. That way, the corresponding build systems can be used with a git
checkout, without needing to run Bazel first.
### (Aside) Distribution archives
A very similar set of rules is defined in `//pkg` to build source distribution
archives for releases. In addition to the full sources, Protobuf releases also
include source archives sliced by language, so that, for example, a Ruby-based
project can get just the sources needed to build the Ruby runtime. (The
per-language slices also include sources needed to build the protobuf compiler,
so they all effectively include the C++ runtime.)
These archives are defined using rules from the
[rules_pkg](https://github.com/bazelbuild/rules_pkg) project. Although they are
similar to `cc_dist_library` and the file list generation rules, the goals are
different: the build system file lists described above only apply to C++, and
are organized according to what should or should not be included in different
parts of the build (e.g., no tests are included in the main library). On the
other hand, the distribution archives deal with languages other than C++, and
contain all the files that need to be distributed as part of a release (even for
C++, this is more than just the C++ sources).
While it might be possible to use information from the `CcFileList` and
`ProtoFileList` providers to define the distribution files, additional files
(such as the various `BUILD.bazel` files) are also needed in the distribution
archive. The lists of distribution files can usually be generated by `glob()`
anyway, so sharing logic with the file list aspects may not be beneficial.
Currently, all of the file lists are checked in. However, it would be possible
to build the file lists on-the-fly and include them in the distribution
archives, rather than checking them in.


@ -23,7 +23,7 @@ If you are using Maven, use the following:
<dependency>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-java</artifactId>
<version>3.21.0-rc-2</version>
<version>3.21.1</version>
</dependency>
```
@ -37,7 +37,7 @@ protobuf-java-util package:
<dependency>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-java-util</artifactId>
<version>3.21.0-rc-2</version>
<version>3.21.1</version>
</dependency>
```
@ -45,7 +45,7 @@ protobuf-java-util package:
If you are using Gradle, add the following to your `build.gradle` file's dependencies:
```
implementation 'com.google.protobuf:protobuf-java:3.21.0-rc-2'
implementation 'com.google.protobuf:protobuf-java:3.21.1'
```
Again, be sure to check that the version number matches (or is newer than) the version number of protoc that you are using.


@ -4,7 +4,7 @@
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-bom</artifactId>
<version>3.21.0-rc-2</version>
<version>3.21.1</version>
<packaging>pom</packaging>
<name>Protocol Buffers [BOM]</name>


@ -4,7 +4,7 @@
<parent>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-parent</artifactId>
<version>3.21.0-rc-2</version>
<version>3.21.1</version>
</parent>
<artifactId>protobuf-java</artifactId>


@ -39,11 +39,12 @@ import java.util.Map;
*
* <p>See also {@link MessageLite}, which defines most of the methods that typical users care about.
* {@link Message} adds methods that are not available in the "lite" runtime. The biggest added
* features are introspection and reflection; that is, getting descriptors for the message type
* and accessing the field values dynamically.
* features are introspection and reflection; that is, getting descriptors for the message type and
* accessing the field values dynamically.
*
* @author kenton@google.com Kenton Varda
*/
@CheckReturnValue
public interface Message extends MessageLite, MessageOrBuilder {
// (From MessageLite, re-declared here only for return type covariance.)
@ -102,6 +103,7 @@ public interface Message extends MessageLite, MessageOrBuilder {
// (From MessageLite.Builder, re-declared here only for return type
// covariance.)
@Override
@CanIgnoreReturnValue
Builder clear();
/**
@ -121,6 +123,7 @@ public interface Message extends MessageLite, MessageOrBuilder {
*
* <p>This is equivalent to the {@code Message::MergeFrom} method in C++.
*/
@CanIgnoreReturnValue
Builder mergeFrom(Message other);
// (From MessageLite.Builder, re-declared here only for return type
@ -135,9 +138,11 @@ public interface Message extends MessageLite, MessageOrBuilder {
Builder clone();
@Override
@CanIgnoreReturnValue
Builder mergeFrom(CodedInputStream input) throws IOException;
@Override
@CanIgnoreReturnValue
Builder mergeFrom(CodedInputStream input, ExtensionRegistryLite extensionRegistry)
throws IOException;
@ -190,18 +195,21 @@ public interface Message extends MessageLite, MessageOrBuilder {
* Sets a field to the given value. The value must be of the correct type for this field, that
* is, the same type that {@link Message#getField(Descriptors.FieldDescriptor)} returns.
*/
@CanIgnoreReturnValue
Builder setField(Descriptors.FieldDescriptor field, Object value);
/**
* Clears the field. This is exactly equivalent to calling the generated "clear" accessor method
* corresponding to the field.
*/
@CanIgnoreReturnValue
Builder clearField(Descriptors.FieldDescriptor field);
/**
* Clears the oneof. This is exactly equivalent to calling the generated "clear" accessor method
* corresponding to the oneof.
*/
@CanIgnoreReturnValue
Builder clearOneof(Descriptors.OneofDescriptor oneof);
/**
@ -212,6 +220,7 @@ public interface Message extends MessageLite, MessageOrBuilder {
* @throws IllegalArgumentException if the field is not a repeated field, or {@code
* field.getContainingType() != getDescriptorForType()}.
*/
@CanIgnoreReturnValue
Builder setRepeatedField(Descriptors.FieldDescriptor field, int index, Object value);
/**
@ -220,12 +229,15 @@ public interface Message extends MessageLite, MessageOrBuilder {
* @throws IllegalArgumentException if the field is not a repeated field, or {@code
* field.getContainingType() != getDescriptorForType()}
*/
@CanIgnoreReturnValue
Builder addRepeatedField(Descriptors.FieldDescriptor field, Object value);
/** Set the {@link UnknownFieldSet} for this message. */
@CanIgnoreReturnValue
Builder setUnknownFields(UnknownFieldSet unknownFields);
/** Merge some unknown fields into the {@link UnknownFieldSet} for this message. */
@CanIgnoreReturnValue
Builder mergeUnknownFields(UnknownFieldSet unknownFields);
// ---------------------------------------------------------------
@ -234,30 +246,38 @@ public interface Message extends MessageLite, MessageOrBuilder {
// (From MessageLite.Builder, re-declared here only for return type
// covariance.)
@Override
@CanIgnoreReturnValue
Builder mergeFrom(ByteString data) throws InvalidProtocolBufferException;
@Override
@CanIgnoreReturnValue
Builder mergeFrom(ByteString data, ExtensionRegistryLite extensionRegistry)
throws InvalidProtocolBufferException;
@Override
@CanIgnoreReturnValue
Builder mergeFrom(byte[] data) throws InvalidProtocolBufferException;
@Override
@CanIgnoreReturnValue
Builder mergeFrom(byte[] data, int off, int len) throws InvalidProtocolBufferException;
@Override
@CanIgnoreReturnValue
Builder mergeFrom(byte[] data, ExtensionRegistryLite extensionRegistry)
throws InvalidProtocolBufferException;
@Override
@CanIgnoreReturnValue
Builder mergeFrom(byte[] data, int off, int len, ExtensionRegistryLite extensionRegistry)
throws InvalidProtocolBufferException;
@Override
@CanIgnoreReturnValue
Builder mergeFrom(InputStream input) throws IOException;
@Override
@CanIgnoreReturnValue
Builder mergeFrom(InputStream input, ExtensionRegistryLite extensionRegistry)
throws IOException;


@ -344,7 +344,6 @@ public interface MessageLite extends MessageLiteOrBuilder {
* a protobuf in the first place.
* @throws IOException an I/O error reading from the stream
*/
@CanIgnoreReturnValue // TODO(kak): should this be @CheckReturnValue instead?
boolean mergeDelimitedFrom(InputStream input) throws IOException;
/**
@ -357,7 +356,6 @@ public interface MessageLite extends MessageLiteOrBuilder {
* a protobuf in the first place.
* @throws IOException an I/O error reading from the stream
*/
@CanIgnoreReturnValue // TODO(kak): should this be @CheckReturnValue instead?
boolean mergeDelimitedFrom(InputStream input, ExtensionRegistryLite extensionRegistry)
throws IOException;
}


@ -39,6 +39,7 @@ import java.util.Map;
*
* @author jonp@google.com (Jon Perlow)
*/
@CheckReturnValue
public interface MessageOrBuilder extends MessageLiteOrBuilder {
// (From MessageLite, re-declared here only for return type covariance.)


@ -4,7 +4,7 @@
<parent>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-parent</artifactId>
<version>3.21.0-rc-2</version>
<version>3.21.1</version>
</parent>
<artifactId>protobuf-kotlin-lite</artifactId>


@ -4,7 +4,7 @@
<parent>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-parent</artifactId>
<version>3.21.0-rc-2</version>
<version>3.21.1</version>
</parent>
<artifactId>protobuf-kotlin</artifactId>


@ -29,7 +29,7 @@ protobuf Java Lite runtime. If you are using Maven, include the following:
<dependency>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-javalite</artifactId>
<version>3.21.0-rc-2</version>
<version>3.21.1</version>
</dependency>
```


@ -4,7 +4,7 @@
<parent>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-parent</artifactId>
<version>3.21.0-rc-2</version>
<version>3.21.1</version>
</parent>
<artifactId>protobuf-javalite</artifactId>


@ -4,7 +4,7 @@
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-parent</artifactId>
<version>3.21.0-rc-2</version>
<version>3.21.1</version>
<packaging>pom</packaging>
<name>Protocol Buffers [Parent]</name>


@ -4,7 +4,7 @@
<parent>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-parent</artifactId>
<version>3.21.0-rc-2</version>
<version>3.21.1</version>
</parent>
<artifactId>protobuf-java-util</artifactId>


@ -6,3 +6,6 @@ set -ex
cmake -DCMAKE_POSITION_INDEPENDENT_CODE=ON -Dprotobuf_WITH_ZLIB=0 .
make -j8
# The Java build setup expects the protoc binary to be in the src/ directory.
ln -s $PWD/protoc ./src/protoc


@ -52,15 +52,9 @@ tar -C ${DIST_WORK_ROOT} --strip-components=1 -axf ${DIST_ARCHIVE}
#
# Run tests using extracted sources
#
if SOURCE_DIR=${DIST_WORK_ROOT} \
SOURCE_DIR=${DIST_WORK_ROOT} \
CMAKE_GENERATOR=Ninja \
CTEST_PARALLEL_LEVEL=$(nproc) \
kokoro/common/cmake.sh; then
# TODO: remove this conditional.
# The cmake build is expected to fail due to missing abseil sources.
echo "$0: Expected failure, but build passed." >&2
echo "Please update $(basename $0) to remove failure expectation." >&2
echo "FAIL" >&2
exit 1
fi
kokoro/common/cmake.sh
echo "PASS"


@ -2,10 +2,29 @@
#
# Build file to set up and run tests
# Change to repo root
set -eux
set -o pipefail
if [[ -h /tmpfs ]] && [[ ${PWD} == /tmpfs/src ]]; then
# Workaround for internal Kokoro bug: b/227401944
cd /Volumes/BuildData/tmpfs/src
fi
# Default environment variables used by cmake build:
: ${CMAKE_CONFIG_TYPE:=Debug}
export CMAKE_CONFIG_TYPE
: ${CTEST_PARALLEL_LEVEL:=4}
export CTEST_PARALLEL_LEVEL
# Run from the project root directory.
cd $(dirname $0)/../../..
# Prepare worker environment to run tests
source kokoro/macos/prepare_build_macos_rc
#
# Update submodules
#
git submodule update --init --recursive
./tests.sh cpp
#
# Run build
#
kokoro/common/cmake.sh


@ -30,7 +30,7 @@ git clone https://github.com/matthew-brett/multibuild.git
# silently creeping in (see https://github.com/protocolbuffers/protobuf/issues/9180).
# IMPORTANT: always pin multibuild at the same commit for:
# - linux/build_artifacts.sh
# - macos/build_artifacts.sh
# - linux/build_artifacts.sh
# - windows/build_artifacts.bat
(cd multibuild; git checkout b89bb903e94308be79abefa4f436bf123ebb1313)
cp kokoro/release/python/linux/config.sh config.sh


@ -30,7 +30,7 @@ git clone https://github.com/matthew-brett/multibuild.git
# silently creeping in (see https://github.com/protocolbuffers/protobuf/issues/9180).
# IMPORTANT: always pin multibuild at the same commit for:
# - linux/build_artifacts.sh
# - macos/build_artifacts.sh
# - linux/build_artifacts.sh
# - windows/build_artifacts.bat
(cd multibuild; git checkout b89bb903e94308be79abefa4f436bf123ebb1313)
cp kokoro/release/python/macos/config.sh config.sh


@ -18,7 +18,7 @@ REM Pin multibuild scripts at a known commit to avoid potentially unwanted futur
REM silently creeping in (see https://github.com/protocolbuffers/protobuf/issues/9180).
REM IMPORTANT: always pin multibuild at the same commit for:
REM - linux/build_artifacts.sh
REM - macos/build_artifacts.sh
REM - linux/build_artifacts.sh
REM - windows/build_artifacts.bat
cd multibuild
git checkout b89bb903e94308be79abefa4f436bf123ebb1313
@ -34,11 +34,6 @@ SET ZLIB_ROOT=%cd%\zlib
del /Q zlib.zip
del /Q zlib-src.zip
REM Update Submodules
REM This is needed because this build uses CMake <3.13.
git submodule update --init --recursive
SET ABSL_ROOT_DIR=%cd%\third_party\abseil-cpp
REM Create directory for artifacts
SET ARTIFACT_DIR=%cd%\artifacts
mkdir %ARTIFACT_DIR%


@ -49,7 +49,7 @@ mkdir src\.libs
mkdir vcprojects
pushd vcprojects
cmake -G "%generator%" -Dprotobuf_BUILD_SHARED_LIBS=%BUILD_DLL% -Dprotobuf_UNICODE=%UNICODE% -Dprotobuf_BUILD_TESTS=OFF -DABSL_ROOT_DIR=%ABSL_ROOT_DIR% ../cmake || goto :error
cmake -G "%generator%" -Dprotobuf_BUILD_SHARED_LIBS=%BUILD_DLL% -Dprotobuf_UNICODE=%UNICODE% -Dprotobuf_BUILD_TESTS=OFF ../cmake || goto :error
msbuild protobuf.sln /p:Platform=%vcplatform% /p:Configuration=Release || goto :error
dir /s /b
popd


@ -82,12 +82,12 @@ const char *const kReservedNames[] = {
"global", "goto", "insteadof", "interface", "isset",
"list", "match", "namespace", "new", "object",
"or", "parent", "print", "private", "protected",
"public", "readonly", "require", "require_once", "return",
"self", "static", "switch", "throw", "trait",
"try", "unset", "use", "var", "while",
"xor", "yield", "int", "float", "bool",
"string", "true", "false", "null", "void",
"iterable", NULL};
"public", "require", "require_once", "return", "self",
"static", "switch", "throw", "trait", "try",
"unset", "use", "var", "while", "xor",
"yield", "int", "float", "bool", "string",
"true", "false", "null", "void", "iterable",
NULL};
bool is_reserved_name(const char* name) {
int i;


@ -10,15 +10,15 @@
<email>protobuf-opensource@google.com</email>
<active>yes</active>
</lead>
<date>2022-05-19</date>
<time>13:35:18</time>
<date>2022-05-27</date>
<time>09:36:34</time>
<version>
<release>3.21.0RC2</release>
<api>3.21.0</api>
<release>3.21.1</release>
<api>3.21.1</api>
</version>
<stability>
<release>beta</release>
<api>beta</api>
<release>stable</release>
<api>stable</api>
</stability>
<license uri="https://opensource.org/licenses/BSD-3-Clause">BSD-3-Clause</license>
<notes>
@ -1313,5 +1313,35 @@ G A release.
<notes>
</notes>
</release>
<release>
<version>
<release>3.21.0</release>
<api>3.21.0</api>
</version>
<stability>
<release>stable</release>
<api>stable</api>
</stability>
<date>2022-05-25</date>
<time>13:50:26</time>
<license uri="https://opensource.org/licenses/BSD-3-Clause">BSD-3-Clause</license>
<notes>
</notes>
</release>
<release>
<version>
<release>3.21.1</release>
<api>3.21.1</api>
</version>
<stability>
<release>stable</release>
<api>stable</api>
</stability>
<date>2022-05-27</date>
<time>09:36:34</time>
<license uri="https://opensource.org/licenses/BSD-3-Clause">BSD-3-Clause</license>
<notes>
</notes>
</release>
</changelog>
</package>

View File

@ -127,7 +127,7 @@ ZEND_BEGIN_ARG_INFO_EX(arginfo_setter, 0, 0, 1)
ZEND_ARG_INFO(0, value)
ZEND_END_ARG_INFO()
#define PHP_PROTOBUF_VERSION "3.21.0RC2"
#define PHP_PROTOBUF_VERSION "3.21.1"
// ptr -> PHP object cache. This is a weak map that caches lazily-created
// wrapper objects around upb types:

View File

@ -285,12 +285,11 @@ class GPBUtil
"include"=>0, "include_once"=>0, "instanceof"=>0, "insteadof"=>0,
"interface"=>0, "isset"=>0, "list"=>0, "match"=>0, "namespace"=>0,
"new"=>0, "or"=>0, "parent"=>0, "print"=>0, "private"=>0,
"protected"=>0,"public"=>0, "readonly" => 0,"require"=>0,
"require_once"=>0,"return"=>0, "self"=>0, "static"=>0, "switch"=>0,
"throw"=>0,"trait"=>0, "try"=>0,"unset"=>0, "use"=>0, "var"=>0,
"while"=>0,"xor"=>0, "yield"=>0, "int"=>0, "float"=>0, "bool"=>0,
"string"=>0,"true"=>0, "false"=>0, "null"=>0, "void"=>0,
"iterable"=>0
"protected"=>0,"public"=>0, "require"=>0, "require_once"=>0,
"return"=>0, "self"=>0, "static"=>0, "switch"=>0, "throw"=>0,
"trait"=>0, "try"=>0,"unset"=>0, "use"=>0, "var"=>0, "while"=>0,
"xor"=>0, "yield"=>0, "int"=>0, "float"=>0, "bool"=>0, "string"=>0,
"true"=>0, "false"=>0, "null"=>0, "void"=>0, "iterable"=>0
);
if (array_key_exists(strtolower($classname), $reserved_words)) {

View File

@ -334,18 +334,6 @@ class GeneratedClassTest extends TestBase
$this->legacyEnum(new TestLegacyMessage\NestedEnum);
}
public function testLegacyReadOnlyMessage()
{
$this->assertTrue(class_exists('\Upper\READONLY'));
$this->assertTrue(class_exists('\Lower\readonly'));
}
public function testLegacyReadOnlyEnum()
{
$this->assertTrue(class_exists('\Upper_enum\READONLY'));
$this->assertTrue(class_exists('\Lower_enum\readonly'));
}
private function legacyEnum(TestLegacyMessage_NestedEnum $enum)
{
// If we made it here without a PHP Fatal error, the typehint worked
@ -955,7 +943,6 @@ class GeneratedClassTest extends TestBase
$m = new \Lower\PBprivate();
$m = new \Lower\PBprotected();
$m = new \Lower\PBpublic();
$m = new \Lower\PBreadonly();
$m = new \Lower\PBrequire();
$m = new \Lower\PBrequire_once();
$m = new \Lower\PBreturn();
@ -1036,7 +1023,6 @@ class GeneratedClassTest extends TestBase
$m = new \Upper\PBPRIVATE();
$m = new \Upper\PBPROTECTED();
$m = new \Upper\PBPUBLIC();
$m = new \Upper\PBREADONLY();
$m = new \Upper\PBREQUIRE();
$m = new \Upper\PBREQUIRE_ONCE();
$m = new \Upper\PBRETURN();
@ -1118,7 +1104,6 @@ class GeneratedClassTest extends TestBase
$m = new \Lower_enum\PBprotected();
$m = new \Lower_enum\PBpublic();
$m = new \Lower_enum\PBrequire();
$m = new \Lower_enum\PBreadonly();
$m = new \Lower_enum\PBrequire_once();
$m = new \Lower_enum\PBreturn();
$m = new \Lower_enum\PBself();
@ -1198,7 +1183,6 @@ class GeneratedClassTest extends TestBase
$m = new \Upper_enum\PBPRIVATE();
$m = new \Upper_enum\PBPROTECTED();
$m = new \Upper_enum\PBPUBLIC();
$m = new \Upper_enum\PBREADONLY();
$m = new \Upper_enum\PBREQUIRE();
$m = new \Upper_enum\PBREQUIRE_ONCE();
$m = new \Upper_enum\PBRETURN();
@ -1303,7 +1287,6 @@ class GeneratedClassTest extends TestBase
$m = \Lower_enum_value\NotAllowed::iterable;
$m = \Lower_enum_value\NotAllowed::parent;
$m = \Lower_enum_value\NotAllowed::self;
$m = \Lower_enum_value\NotAllowed::readonly;
$m = \Upper_enum_value\NotAllowed::PBABSTRACT;
$m = \Upper_enum_value\NotAllowed::PBAND;
@ -1384,7 +1367,6 @@ class GeneratedClassTest extends TestBase
$m = \Upper_enum_value\NotAllowed::ITERABLE;
$m = \Upper_enum_value\NotAllowed::PARENT;
$m = \Upper_enum_value\NotAllowed::SELF;
$m = \Upper_enum_value\NotAllowed::READONLY;
$this->assertTrue(true);
}

View File

@ -57,7 +57,6 @@ enum print { ZERO51 = 0; }
enum private { ZERO52 = 0; }
enum protected { ZERO53 = 0; }
enum public { ZERO54 = 0; }
enum readonly { ZERO80 = 0; }
enum require { ZERO55 = 0; }
enum require_once { ZERO56 = 0; }
enum return { ZERO57 = 0; }

View File

@ -57,7 +57,6 @@ enum PRINT { ZERO51 = 0; }
enum PRIVATE { ZERO52 = 0; }
enum PROTECTED { ZERO53 = 0; }
enum PUBLIC { ZERO54 = 0; }
enum READONLY { ZERO80 = 0; }
enum REQUIRE { ZERO55 = 0; }
enum REQUIRE_ONCE { ZERO56 = 0; }
enum RETURN { ZERO57 = 0; }

View File

@ -58,7 +58,6 @@ enum NotAllowed {
private = 51;
protected = 52;
public = 53;
readonly = 79;
require = 54;
require_once = 55;
return = 56;

View File

@ -58,7 +58,6 @@ enum NotAllowed {
PRIVATE = 51;
PROTECTED = 52;
PUBLIC = 53;
READONLY = 79;
REQUIRE = 54;
REQUIRE_ONCE = 55;
RETURN = 56;

View File

@ -57,7 +57,6 @@ message print {}
message private {}
message protected {}
message public {}
message readonly {}
message require {}
message require_once {}
message return {}

View File

@ -57,7 +57,6 @@ message PRINT {}
message PRIVATE {}
message PROTECTED {}
message PUBLIC {}
message READONLY {}
message REQUIRE {}
message REQUIRE_ONCE {}
message RETURN {}

View File

@ -114,6 +114,6 @@ def protobuf_deps():
_github_archive(
name = "upb",
repo = "https://github.com/protocolbuffers/upb",
commit = "12efc9b096f35b62055a217f45e6b0fe5fb1a099",
sha256 = "de0ab4ee1e2d8f01b494de39cd70b611e190b63943f1d5c448d4ecb9560dc16f",
commit = "04cb5af6b67c80db61f0aee76dcb6d233e51795c",
sha256 = "62d3519a7b65d6695e011f2733bfc5d7c6ab77f2bd83cdd2dca449da2e739c7f",
)

View File

@ -1,3 +1,3 @@
PROTOC_VERSION = '21.0-rc-2'
PROTOBUF_JAVA_VERSION = '3.21.0-rc-2'
PROTOBUF_PYTHON_VERSION = '4.21.0-rc-2'
PROTOC_VERSION = '21.1'
PROTOBUF_JAVA_VERSION = '3.21.1'
PROTOBUF_PYTHON_VERSION = '4.21.1'
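The bumps above follow protobuf's post-21 versioning scheme: protoc carries the bare `21.x` number, while each language runtime prefixes its own major version (Java stays at 3; the upb-based Python runtime moved to 4, per the changelog entry in the head). A small sketch of that mapping; the helper name is ours, not part of the repo, and the real values are set by release tooling in `protobuf_version.bzl`:

```python
def language_versions(protoc_version: str) -> dict:
    """Derive per-language runtime versions from a bare protoc version.

    Illustrative only: the authoritative versions live in
    protobuf_version.bzl and are written by the release process.
    """
    return {
        "protoc": protoc_version,        # e.g. '21.1'
        "java": f"3.{protoc_version}",   # Java runtime keeps major version 3
        "python": f"4.{protoc_version}", # upb-based Python moved to major 4
    }

versions = language_versions("21.1")
```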

View File

@ -8,7 +8,7 @@
</parent>
<groupId>com.google.protobuf</groupId>
<artifactId>protoc</artifactId>
<version>3.21.0-rc-2</version>
<version>3.21.1</version>
<packaging>pom</packaging>
<name>Protobuf Compiler</name>
<description>

View File

@ -34,10 +34,6 @@
Note that the golden messages exercise every known field type, thus this
test ends up exercising and verifying nearly all of the parsing and
serialization code in the whole library.
TODO(kenton): Merge with wire_format_test? It doesn't make a whole lot of
sense to call this a test of the "message" module, which only declares an
abstract interface.
"""
__author__ = 'gps@google.com (Gregory P. Smith)'
@ -476,6 +472,12 @@ class MessageTest(unittest.TestCase):
'}\n')
self.assertEqual(sub_msg.bb, 1)
def testAssignRepeatedField(self, message_module):
msg = message_module.NestedTestAllTypes()
msg.payload.repeated_int32[:] = [1, 2, 3, 4]
self.assertEqual(4, len(msg.payload.repeated_int32))
self.assertEqual([1, 2, 3, 4], msg.payload.repeated_int32)
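The new `testAssignRepeatedField` covers the segfault fix for repeated-field slice assignment (#10066). Protobuf's repeated-field container follows Python's mutable-sequence slice semantics, where `x[:] = ...` replaces the contents in place rather than rebinding the name; a plain-list sketch of the behavior the test relies on:

```python
# Plain Python lists model the slice-assignment semantics that a
# protobuf repeated field implements: assigning to [:] replaces the
# container's contents without creating a new object.
repeated_int32 = []
alias = repeated_int32              # second reference to the same container

repeated_int32[:] = [1, 2, 3, 4]    # in-place replacement, not rebinding

assert len(repeated_int32) == 4
assert alias == [1, 2, 3, 4]        # the alias sees the new contents
assert alias is repeated_int32      # still the same object
```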
def testMergeFromRepeatedField(self, message_module):
msg = message_module.TestAllTypes()
msg.repeated_int32.append(1)

View File

@ -1,6 +1,6 @@
Gem::Specification.new do |s|
s.name = "google-protobuf"
s.version = "3.21.0.rc.2"
s.version = "3.21.1"
git_tag = "v#{s.version.to_s.sub('.rc.', '-rc')}" # Converts X.Y.Z.rc.N to vX.Y.Z-rcN, used for the git tag
s.licenses = ["BSD-3-Clause"]
s.summary = "Protocol Buffers"

View File

@ -9,7 +9,7 @@
<groupId>com.google.protobuf.jruby</groupId>
<artifactId>protobuf-jruby</artifactId>
<version>3.21.0-rc-2</version>
<version>3.21.1</version>
<name>Protocol Buffer JRuby native extension</name>
<description>
Protocol Buffers are a way of encoding structured data in an efficient yet
@ -76,7 +76,7 @@
<dependency>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-java-util</artifactId>
<version>3.21.0-rc-2</version>
<version>3.21.1</version>
</dependency>
<dependency>
<groupId>org.jruby</groupId>

View File

@ -107,7 +107,7 @@ CSharpType GetCSharpType(FieldDescriptor::Type type) {
}
std::string StripDotProto(const std::string& proto_file) {
int lastindex = proto_file.find_last_of(".");
int lastindex = proto_file.find_last_of('.');
return proto_file.substr(0, lastindex);
}
@ -122,7 +122,7 @@ std::string GetFileNamespace(const FileDescriptor* descriptor) {
// input of "google/protobuf/foo_bar.proto" would result in "FooBar".
std::string GetFileNameBase(const FileDescriptor* descriptor) {
std::string proto_file = descriptor->name();
int lastslash = proto_file.find_last_of("/");
int lastslash = proto_file.find_last_of('/');
std::string base = proto_file.substr(lastslash + 1);
return UnderscoresToPascalCase(StripDotProto(base));
}
@ -422,7 +422,7 @@ std::string GetOutputFile(const FileDescriptor* descriptor,
return ""; // This will be ignored, because we've set an error.
}
namespace_suffix = ns.substr(base_namespace.length());
if (namespace_suffix.find(".") == 0) {
if (namespace_suffix.find('.') == 0) {
namespace_suffix = namespace_suffix.substr(1);
}
}

View File

@ -128,7 +128,7 @@ void EnumGenerator::GenerateHeader(io::Printer* printer) {
continue;
}
if (all_values_[i]->GetSourceLocation(&location)) {
std::string comments = BuildCommentsString(location, true).c_str();
std::string comments = BuildCommentsString(location, true);
if (comments.length() > 0) {
if (i > 0) {
printer->Print("\n");

View File

@ -990,9 +990,9 @@ static std::string HandleExtremeFloatingPoint(std::string val,
return "-INFINITY";
} else {
// float strings with ., e or E need to have f appended
if (add_float_suffix && (val.find(".") != std::string::npos ||
val.find("e") != std::string::npos ||
val.find("E") != std::string::npos)) {
if (add_float_suffix && (val.find('.') != std::string::npos ||
val.find('e') != std::string::npos ||
val.find('E') != std::string::npos)) {
val += "f";
}
return val;

View File

@ -59,18 +59,18 @@ const char* const kReservedNames[] = {
"include_once", "instanceof", "insteadof", "interface", "isset",
"list", "match", "namespace", "new", "or",
"parent", "print", "private", "protected", "public",
"readonly", "require", "require_once", "return", "self",
"static", "switch", "throw", "trait", "try",
"unset", "use", "var", "while", "xor",
"yield", "int", "float", "bool", "string",
"true", "false", "null", "void", "iterable"};
"require", "require_once", "return", "self", "static",
"switch", "throw", "trait", "try", "unset",
"use", "var", "while", "xor", "yield",
"int", "float", "bool", "string", "true",
"false", "null", "void", "iterable"};
const char* const kValidConstantNames[] = {
"int", "float", "bool", "string", "true",
"false", "null", "void", "iterable", "parent",
"self", "readonly"
"self"
};
const int kReservedNamesSize = 80;
const int kValidConstantNamesSize = 12;
const int kReservedNamesSize = 79;
const int kValidConstantNamesSize = 11;
const int kFieldSetter = 1;
const int kFieldGetter = 2;
const int kFieldProperty = 3;
@ -334,7 +334,7 @@ std::string GeneratedMetadataFileName(const FileDescriptor* file,
const Options& options) {
const std::string& proto_file = file->name();
int start_index = 0;
int first_index = proto_file.find_first_of("/", start_index);
int first_index = proto_file.find_first_of('/', start_index);
std::string result = "";
std::string segment = "";
@ -347,7 +347,7 @@ std::string GeneratedMetadataFileName(const FileDescriptor* file,
// Append directory name.
std::string file_no_suffix;
int lastindex = proto_file.find_last_of(".");
int lastindex = proto_file.find_last_of('.');
if (proto_file == kEmptyFile) {
return kEmptyMetadataFile;
} else {
@ -371,12 +371,12 @@ std::string GeneratedMetadataFileName(const FileDescriptor* file,
file_no_suffix.substr(start_index, first_index - start_index), true);
result += ReservedNamePrefix(segment, file) + segment + "/";
start_index = first_index + 1;
first_index = file_no_suffix.find_first_of("/", start_index);
first_index = file_no_suffix.find_first_of('/', start_index);
}
}
// Append file name.
int file_name_start = file_no_suffix.find_last_of("/");
int file_name_start = file_no_suffix.find_last_of('/');
if (file_name_start == std::string::npos) {
file_name_start = 0;
} else {
@ -407,16 +407,6 @@ std::string GeneratedClassFileName(const DescriptorType* desc,
return result + ".php";
}
template <typename DescriptorType>
std::string LegacyReadOnlyGeneratedClassFileName(const DescriptorType* desc,
const Options& options) {
std::string php_namespace = RootPhpNamespace(desc, options);
if (!php_namespace.empty()) {
return php_namespace + "/" + desc->name() + ".php";
}
return desc->name() + ".php";
}
std::string GeneratedServiceFileName(const ServiceDescriptor* service,
const Options& options) {
std::string result = FullClassName(service, options) + "Interface";
@ -485,7 +475,7 @@ std::string PhpSetterTypeName(const FieldDescriptor* field,
}
if (field->is_repeated()) {
// accommodate for edge case with multiple types.
size_t start_pos = type.find("|");
size_t start_pos = type.find('|');
if (start_pos != std::string::npos) {
type.replace(start_pos, 1, ">|array<");
}
@ -1217,7 +1207,7 @@ void GenerateHead(const FileDescriptor* file, io::Printer* printer) {
}
std::string FilenameToClassname(const std::string& filename) {
int lastindex = filename.find_last_of(".");
int lastindex = filename.find_last_of('.');
std::string result = filename.substr(0, lastindex);
for (int i = 0; i < result.size(); i++) {
if (result[i] == '/') {
@ -1237,7 +1227,7 @@ void GenerateMetadataFile(const FileDescriptor* file, const Options& options,
GenerateHead(file, &printer);
std::string fullname = FilenameToClassname(filename);
int lastindex = fullname.find_last_of("\\");
int lastindex = fullname.find_last_of('\\');
if (lastindex != std::string::npos) {
printer.Print(
@ -1262,32 +1252,6 @@ void GenerateMetadataFile(const FileDescriptor* file, const Options& options,
printer.Print("}\n\n");
}
template <typename DescriptorType>
void LegacyReadOnlyGenerateClassFile(const FileDescriptor* file,
const DescriptorType* desc, const Options& options,
GeneratorContext* generator_context) {
std::string filename = LegacyReadOnlyGeneratedClassFileName(desc, options);
std::unique_ptr<io::ZeroCopyOutputStream> output(
generator_context->Open(filename));
io::Printer printer(output.get(), '^');
GenerateHead(file, &printer);
std::string php_namespace = RootPhpNamespace(desc, options);
if (!php_namespace.empty()) {
printer.Print(
"namespace ^name^;\n\n",
"name", php_namespace);
}
std::string newname = FullClassName(desc, options);
printer.Print("class_exists(^new^::class);\n",
"new", GeneratedClassNameImpl(desc));
printer.Print("@trigger_error(__NAMESPACE__ . '\\^old^ is deprecated and will be removed in "
"the next major release. Use ^fullname^ instead', E_USER_DEPRECATED);\n\n",
"old", desc->name(),
"fullname", newname);
}
void GenerateEnumFile(const FileDescriptor* file, const EnumDescriptor* en,
const Options& options,
GeneratorContext* generator_context) {
@ -1299,7 +1263,7 @@ void GenerateEnumFile(const FileDescriptor* file, const EnumDescriptor* en,
GenerateHead(file, &printer);
std::string fullname = FilenameToClassname(filename);
int lastindex = fullname.find_last_of("\\");
int lastindex = fullname.find_last_of('\\');
if (lastindex != std::string::npos) {
printer.Print(
@ -1408,19 +1372,6 @@ void GenerateEnumFile(const FileDescriptor* file, const EnumDescriptor* en,
"new", fullname,
"old", LegacyFullClassName(en, options));
}
// Write legacy file for backwards compatibility with "readonly" keywword
std::string lower = en->name();
std::transform(lower.begin(), lower.end(), lower.begin(), ::tolower);
if (lower == "readonly") {
printer.Print(
"// Adding a class alias for backwards compatibility with the \"readonly\" keyword.\n");
printer.Print(
"class_alias(^new^::class, __NAMESPACE__ . '\\^old^');\n\n",
"new", fullname,
"old", en->name());
LegacyReadOnlyGenerateClassFile(file, en, options, generator_context);
}
}
void GenerateMessageFile(const FileDescriptor* file, const Descriptor* message,
@ -1440,7 +1391,7 @@ void GenerateMessageFile(const FileDescriptor* file, const Descriptor* message,
GenerateHead(file, &printer);
std::string fullname = FilenameToClassname(filename);
int lastindex = fullname.find_last_of("\\");
int lastindex = fullname.find_last_of('\\');
if (lastindex != std::string::npos) {
printer.Print(
@ -1536,19 +1487,6 @@ void GenerateMessageFile(const FileDescriptor* file, const Descriptor* message,
"old", LegacyFullClassName(message, options));
}
// Write legacy file for backwards compatibility with "readonly" keywword
std::string lower = message->name();
std::transform(lower.begin(), lower.end(), lower.begin(), ::tolower);
if (lower == "readonly") {
printer.Print(
"// Adding a class alias for backwards compatibility with the \"readonly\" keyword.\n");
printer.Print(
"class_alias(^new^::class, __NAMESPACE__ . '\\^old^');\n\n",
"new", fullname,
"old", message->name());
LegacyReadOnlyGenerateClassFile(file, message, options, generator_context);
}
// Nested messages and enums.
for (int i = 0; i < message->nested_type_count(); i++) {
GenerateMessageFile(file, message->nested_type(i), options,
@ -1570,7 +1508,7 @@ void GenerateServiceFile(
GenerateHead(file, &printer);
std::string fullname = FilenameToClassname(filename);
int lastindex = fullname.find_last_of("\\");
int lastindex = fullname.find_last_of('\\');
if (!file->options().php_namespace().empty() ||
(!file->options().has_php_namespace() && !file->package().empty()) ||

View File

@ -68,7 +68,7 @@ std::string NumberToString(numeric_type value) {
}
std::string GetRequireName(const std::string& proto_file) {
int lastindex = proto_file.find_last_of(".");
int lastindex = proto_file.find_last_of('.');
return proto_file.substr(0, lastindex) + "_pb";
}
@ -421,7 +421,7 @@ int GeneratePackageModules(const FileDescriptor* file, io::Printer* printer) {
// -> A.B.C
if (package_name.find("::") != std::string::npos) {
need_change_to_module = false;
} else if (package_name.find(".") != std::string::npos) {
} else if (package_name.find('.') != std::string::npos) {
GOOGLE_LOG(WARNING) << "ruby_package option should be in the form of:"
<< " 'A::B::C' and not 'A.B.C'";
}

View File

@ -422,6 +422,25 @@ TEST(MESSAGE_TEST_NAME, ParseFailsIfExtensionWireMalformed) {
EXPECT_FALSE(p.ParseFromString(serialized));
}
TEST(MESSAGE_TEST_NAME, ParseFailsIfGroupFieldMalformed) {
UNITTEST::TestMutualRecursionA original, parsed;
original.mutable_bb()
->mutable_a()
->mutable_subgroup()
->mutable_sub_message()
->mutable_b()
->set_optional_int32(-1);
std::string data;
ASSERT_TRUE(original.SerializeToString(&data));
// Should parse correctly.
ASSERT_TRUE(parsed.ParseFromString(data));
// Overwriting the last byte of varint (-1) to 0xFF results in malformed wire.
data[data.size() - 2] = 0xFF;
EXPECT_FALSE(parsed.ParseFromString(data));
}
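The test above flips the final byte of a serialized varint to 0xFF so its continuation bit never clears. Protobuf varints are base-128 encoded, with the high bit of each byte meaning "more bytes follow"; a minimal decoder sketch (ours, not the real parser) showing why that corruption makes the wire data unparseable:

```python
def decode_varint(data: bytes):
    """Minimal base-128 varint decoder (illustrative only).

    Returns (value, bytes_consumed), or raises ValueError if the input
    ends while the continuation bit is still set.
    """
    value = 0
    for i, byte in enumerate(data):
        value |= (byte & 0x7F) << (7 * i)   # low 7 bits carry the payload
        if byte & 0x80 == 0:                # high bit clear: varint ends here
            return value, i + 1
    raise ValueError("truncated varint: continuation bit never cleared")

# 300 is encoded as 0xAC 0x02 on the wire.
assert decode_varint(b"\xac\x02") == (300, 2)

# Overwriting the terminating byte with 0xFF keeps the continuation bit
# set, so the decoder runs off the end of the buffer: malformed wire.
try:
    decode_varint(b"\xac\xff")
except ValueError:
    pass
```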
TEST(MESSAGE_TEST_NAME, UninitializedAndMalformed) {
UNITTEST::TestRequiredForeign o, p1, p2;
o.mutable_optional_message()->set_a(-1);

View File

@ -865,6 +865,8 @@
// Inconvenient macro names from usr/include/sys/syslimits.h in some macOS SDKs.
#pragma push_macro("UID_MAX")
#undef UID_MAX
#pragma push_macro("GID_MAX")
#undef GID_MAX
#endif // __APPLE__
#if defined(__clang__) || PROTOBUF_GNUC_MIN(3, 0) || defined(_MSC_VER)

View File

@ -144,6 +144,7 @@
#pragma pop_macro("TRUE")
#pragma pop_macro("FALSE")
#pragma pop_macro("UID_MAX")
#pragma pop_macro("GID_MAX")
#endif // __APPLE__
#if defined(__clang__) || defined(__GNUC__) || defined(_MSC_VER)

View File

@ -454,8 +454,13 @@ class RepeatedField final {
//
// Typically, due to the fact that adder is a local stack variable, the
// compiler will be successful in mem-to-reg transformation and the machine
// code will be loop: cmp %size, %capacity jae fallback mov dword ptr [%buffer
// + %size * 4], %val inc %size jmp loop
// code will be
// loop:
// cmp %size, %capacity
// jae fallback
// mov dword ptr [%buffer + %size * 4], %val
// inc %size
// jmp loop
//
// The first version executes at 7 cycles per iteration while the second
// version executes at only 1 or 2 cycles.
@ -467,6 +472,8 @@ class RepeatedField final {
capacity_ = repeated_field_->total_size_;
buffer_ = repeated_field_->unsafe_elements();
}
FastAdderImpl(const FastAdderImpl&) = delete;
FastAdderImpl& operator=(const FastAdderImpl&) = delete;
~FastAdderImpl() { repeated_field_->current_size_ = index_; }
void Add(Element val) {
@ -484,8 +491,6 @@ class RepeatedField final {
int index_;
int capacity_;
Element* buffer_;
GOOGLE_DISALLOW_EVIL_CONSTRUCTORS(FastAdderImpl);
};
// FastAdder is a wrapper for adding fields. The specialization above handles
@ -494,11 +499,12 @@ class RepeatedField final {
class FastAdderImpl<I, false> {
public:
explicit FastAdderImpl(RepeatedField* rf) : repeated_field_(rf) {}
FastAdderImpl(const FastAdderImpl&) = delete;
FastAdderImpl& operator=(const FastAdderImpl&) = delete;
void Add(const Element& val) { repeated_field_->Add(val); }
private:
RepeatedField* repeated_field_;
GOOGLE_DISALLOW_EVIL_CONSTRUCTORS(FastAdderImpl);
};
using FastAdder = FastAdderImpl<>;

View File

@ -37,9 +37,14 @@
#include <google/protobuf/stubs/status.h>
#include <google/protobuf/stubs/statusor.h>
#include <google/protobuf/port_def.inc>
namespace google {
namespace protobuf {
namespace util {
namespace status_macros_internal {
using PROTOBUF_NAMESPACE_ID::util::Status;
} // namespace status_macros_internal
// Run a command that returns a util::Status. If the called code returns an
// error status, return that status up out of this method too.
@ -49,7 +54,8 @@ namespace util {
#define RETURN_IF_ERROR(expr) \
do { \
/* Using _status below to avoid capture problems if expr is "status". */ \
const PROTOBUF_NAMESPACE_ID::util::Status _status = (expr); \
const ::google::protobuf::util::status_macros_internal::Status _status = \
(expr); \
if (PROTOBUF_PREDICT_FALSE(!_status.ok())) return _status; \
} while (0)
@ -86,4 +92,6 @@ Status DoAssignOrReturn(T& lhs, StatusOr<T> result) {
} // namespace protobuf
} // namespace google
#include <google/protobuf/port_undef.inc>
#endif // GOOGLE_PROTOBUF_STUBS_STATUS_H_

View File

@ -72,12 +72,13 @@
#ifndef GOOGLE_PROTOBUF_STUBS_STATUSOR_H_
#define GOOGLE_PROTOBUF_STUBS_STATUSOR_H_
#include <google/protobuf/stubs/status.h>
#include <new>
#include <string>
#include <utility>
#include <google/protobuf/stubs/status.h>
// Must be included last.
#include <google/protobuf/port_def.inc>
namespace google {
@ -87,7 +88,8 @@ namespace statusor_internal {
template <typename T>
class StatusOr {
template<typename U> friend class StatusOr;
template <typename U>
friend class StatusOr;
public:
using value_type = T;
@ -144,6 +146,13 @@ class StatusOr {
// Returns a reference to our current value, or CHECK-fails if !this->ok().
const T& value() const;
T& value();
// Returns a reference to our current value; UB if not OK.
const T& operator*() const { return value(); }
T& operator*() { return value(); }
const T* operator->() const { return &value(); }
T* operator->() { return &value(); }
private:
Status status_;
@ -198,8 +207,7 @@ inline StatusOr<T>::StatusOr(const T& value) {
template <typename T>
inline StatusOr<T>::StatusOr(const StatusOr<T>& other)
: status_(other.status_), value_(other.value_) {
}
: status_(other.status_), value_(other.value_) {}
template <typename T>
inline StatusOr<T>& StatusOr<T>::operator=(const StatusOr<T>& other) {
@ -211,8 +219,7 @@ inline StatusOr<T>& StatusOr<T>::operator=(const StatusOr<T>& other) {
template <typename T>
template <typename U>
inline StatusOr<T>::StatusOr(const StatusOr<U>& other)
: status_(other.status_), value_(other.status_.ok() ? other.value_ : T()) {
}
: status_(other.status_), value_(other.status_.ok() ? other.value_ : T()) {}
template <typename T>
template <typename U>
@ -240,6 +247,13 @@ inline const T& StatusOr<T>::value() const {
return value_;
}
template <typename T>
inline T& StatusOr<T>::value() {
if (!status_.ok()) {
StatusOrHelper::Crash(status_);
}
return value_;
}
} // namespace statusor_internal
using ::google::protobuf::util::statusor_internal::StatusOr;

View File

@ -87,6 +87,16 @@ TEST(StatusOr, TestValueCtor) {
EXPECT_EQ(kI, thing.value());
}
TEST(StatusOr, TestPtrOps) {
const int kI = 4;
StatusOr<int> thing(kI);
EXPECT_TRUE(thing.ok());
EXPECT_EQ(kI, *thing);
StatusOr<StatusOr<int>> thing2(thing);
EXPECT_EQ(kI, thing2->value());
}
TEST(StatusOr, TestCopyCtorStatusOk) {
const int kI = 4;
StatusOr<int> original(kI);

View File

@ -55,6 +55,7 @@ message NestedTestMessageSetContainer {
message NestedTestInt {
optional fixed32 a = 1;
optional int32 b = 3;
optional NestedTestInt child = 2;
}

View File

@ -30,11 +30,16 @@
#include <google/protobuf/util/json_util.h>
#include <algorithm>
#include <cstdint>
#include <list>
#include <string>
#include <vector>
#include <gmock/gmock.h>
#include <gtest/gtest.h>
#include <google/protobuf/stubs/status.h>
#include <google/protobuf/stubs/statusor.h>
#include <google/protobuf/stubs/strutil.h>
#include <google/protobuf/descriptor_database.h>
#include <google/protobuf/dynamic_message.h>
#include <google/protobuf/io/zero_copy_stream.h>
@ -42,71 +47,88 @@
#include <google/protobuf/util/internal/testdata/maps.pb.h>
#include <google/protobuf/util/json_format.pb.h>
#include <google/protobuf/util/json_format_proto3.pb.h>
#include <google/protobuf/util/json_format_proto3.pb.h>
#include <google/protobuf/util/type_resolver.h>
#include <google/protobuf/util/type_resolver_util.h>
#include <google/protobuf/stubs/status_macros.h>
// Must be included last.
#include <google/protobuf/port_def.inc>
namespace google {
namespace protobuf {
namespace util {
namespace {
using proto3::BAR;
using proto3::FOO;
using proto3::TestAny;
using proto3::TestEnumValue;
using proto3::TestMap;
using proto3::TestMessage;
using proto3::TestOneof;
using proto_util_converter::testing::MapIn;
using ::proto3::TestAny;
using ::proto3::TestEnumValue;
using ::proto3::TestMap;
using ::proto3::TestMessage;
using ::proto3::TestOneof;
using ::proto_util_converter::testing::MapIn;
// TODO(b/234474291): Use the gtest versions once that's available in OSS.
MATCHER_P(IsOkAndHolds, inner,
StrCat("is OK and holds ", testing::PrintToString(inner))) {
if (!arg.ok()) {
*result_listener << arg.status();
return false;
}
return testing::ExplainMatchResult(inner, *arg, result_listener);
}
util::Status GetStatus(const util::Status& s) { return s; }
template <typename T>
util::Status GetStatus(const util::StatusOr<T>& s) {
return s.status();
}
MATCHER_P(StatusIs, status,
StrCat(".status() is ", testing::PrintToString(status))) {
return GetStatus(arg).code() == status;
}
#define EXPECT_OK(x) EXPECT_THAT(x, StatusIs(util::StatusCode::kOk))
#define ASSERT_OK(x) ASSERT_THAT(x, StatusIs(util::StatusCode::kOk))
// As functions defined in json_util.h are just thin wrappers around the
// JSON conversion code in //net/proto2/util/converter, in this test we
// only cover some very basic cases to make sure the wrappers have forwarded
// parameters to the underlying implementation correctly. More detailed
// tests are contained in the //net/proto2/util/converter directory.
class JsonUtilTest : public ::testing::Test {
protected:
JsonUtilTest() {}
std::string ToJson(const Message& message, const JsonPrintOptions& options) {
util::StatusOr<std::string> ToJson(const Message& message,
const JsonPrintOptions& options = {}) {
std::string result;
GOOGLE_CHECK_OK(MessageToJsonString(message, &result, options));
RETURN_IF_ERROR(MessageToJsonString(message, &result, options));
return result;
}
bool FromJson(const std::string& json, Message* message,
const JsonParseOptions& options) {
return JsonStringToMessage(json, message, options).ok();
util::Status FromJson(StringPiece json, Message* message,
const JsonParseOptions& options = {}) {
return JsonStringToMessage(json, message, options);
}
bool FromJson(const std::string& json, Message* message) {
return FromJson(json, message, JsonParseOptions());
}
std::unique_ptr<TypeResolver> resolver_;
};
TEST_F(JsonUtilTest, TestWhitespaces) {
TEST(JsonUtilTest, TestWhitespaces) {
TestMessage m;
m.mutable_message_value();
EXPECT_THAT(ToJson(m), IsOkAndHolds("{\"messageValue\":{}}"));
JsonPrintOptions options;
EXPECT_EQ("{\"messageValue\":{}}", ToJson(m, options));
options.add_whitespace = true;
EXPECT_EQ(
"{\n"
EXPECT_THAT(ToJson(m, options), IsOkAndHolds("{\n"
" \"messageValue\": {}\n"
"}\n",
ToJson(m, options));
"}\n"));
}
TEST_F(JsonUtilTest, TestDefaultValues) {
TEST(JsonUtilTest, TestDefaultValues) {
TestMessage m;
EXPECT_THAT(ToJson(m), IsOkAndHolds("{}"));
JsonPrintOptions options;
EXPECT_EQ("{}", ToJson(m, options));
options.always_print_primitive_fields = true;
EXPECT_EQ(
"{\"boolValue\":false,"
EXPECT_THAT(ToJson(m, options), IsOkAndHolds("{\"boolValue\":false,"
"\"int32Value\":0,"
"\"int64Value\":\"0\","
"\"uint32Value\":0,"
@ -127,14 +149,14 @@ TEST_F(JsonUtilTest, TestDefaultValues) {
"\"repeatedBytesValue\":[],"
"\"repeatedEnumValue\":[],"
"\"repeatedMessageValue\":[]"
"}",
ToJson(m, options));
"}"));
options.always_print_primitive_fields = true;
m.set_string_value("i am a test string value");
m.set_bytes_value("i am a test bytes value");
EXPECT_EQ(
"{\"boolValue\":false,"
EXPECT_THAT(
ToJson(m, options),
IsOkAndHolds("{\"boolValue\":false,"
"\"int32Value\":0,"
"\"int64Value\":\"0\","
"\"uint32Value\":0,"
@ -155,14 +177,14 @@ TEST_F(JsonUtilTest, TestDefaultValues) {
"\"repeatedBytesValue\":[],"
"\"repeatedEnumValue\":[],"
"\"repeatedMessageValue\":[]"
"}",
ToJson(m, options));
"}"));
options.preserve_proto_field_names = true;
m.set_string_value("i am a test string value");
m.set_bytes_value("i am a test bytes value");
EXPECT_EQ(
"{\"bool_value\":false,"
EXPECT_THAT(
ToJson(m, options),
IsOkAndHolds("{\"bool_value\":false,"
"\"int32_value\":0,"
"\"int64_value\":\"0\","
"\"uint32_value\":0,"
@ -183,20 +205,19 @@ TEST_F(JsonUtilTest, TestDefaultValues) {
"\"repeated_bytes_value\":[],"
"\"repeated_enum_value\":[],"
"\"repeated_message_value\":[]"
"}",
ToJson(m, options));
"}"));
}
TEST_F(JsonUtilTest, TestPreserveProtoFieldNames) {
TEST(JsonUtilTest, TestPreserveProtoFieldNames) {
TestMessage m;
m.mutable_message_value();
JsonPrintOptions options;
options.preserve_proto_field_names = true;
EXPECT_EQ("{\"message_value\":{}}", ToJson(m, options));
EXPECT_THAT(ToJson(m, options), IsOkAndHolds("{\"message_value\":{}}"));
}
TEST_F(JsonUtilTest, TestAlwaysPrintEnumsAsInts) {
TEST(JsonUtilTest, TestAlwaysPrintEnumsAsInts) {
TestMessage orig;
orig.set_enum_value(proto3::BAR);
orig.add_repeated_enum_value(proto3::FOO);
@ -205,20 +226,20 @@ TEST_F(JsonUtilTest, TestAlwaysPrintEnumsAsInts) {
JsonPrintOptions print_options;
print_options.always_print_enums_as_ints = true;
std::string expected_json = "{\"enumValue\":1,\"repeatedEnumValue\":[0,1]}";
EXPECT_EQ(expected_json, ToJson(orig, print_options));
auto printed = ToJson(orig, print_options);
ASSERT_THAT(printed,
IsOkAndHolds("{\"enumValue\":1,\"repeatedEnumValue\":[0,1]}"));
TestMessage parsed;
JsonParseOptions parse_options;
ASSERT_TRUE(FromJson(expected_json, &parsed, parse_options));
ASSERT_OK(FromJson(*printed, &parsed));
EXPECT_EQ(proto3::BAR, parsed.enum_value());
EXPECT_EQ(2, parsed.repeated_enum_value_size());
EXPECT_EQ(proto3::FOO, parsed.repeated_enum_value(0));
EXPECT_EQ(proto3::BAR, parsed.repeated_enum_value(1));
EXPECT_EQ(parsed.enum_value(), proto3::BAR);
EXPECT_EQ(parsed.repeated_enum_value_size(), 2);
EXPECT_EQ(parsed.repeated_enum_value(0), proto3::FOO);
EXPECT_EQ(parsed.repeated_enum_value(1), proto3::BAR);
}
TEST_F(JsonUtilTest, TestPrintEnumsAsIntsWithDefaultValue) {
TEST(JsonUtilTest, TestPrintEnumsAsIntsWithDefaultValue) {
TestEnumValue orig;
// orig.set_enum_value1(proto3::FOO)
orig.set_enum_value2(proto3::FOO);
@ -228,20 +249,20 @@ TEST_F(JsonUtilTest, TestPrintEnumsAsIntsWithDefaultValue) {
print_options.always_print_enums_as_ints = true;
print_options.always_print_primitive_fields = true;
std::string expected_json =
"{\"enumValue1\":0,\"enumValue2\":0,\"enumValue3\":1}";
EXPECT_EQ(expected_json, ToJson(orig, print_options));
auto printed = ToJson(orig, print_options);
ASSERT_THAT(
printed,
IsOkAndHolds("{\"enumValue1\":0,\"enumValue2\":0,\"enumValue3\":1}"));
TestEnumValue parsed;
JsonParseOptions parse_options;
ASSERT_TRUE(FromJson(expected_json, &parsed, parse_options));
ASSERT_OK(FromJson(*printed, &parsed));
EXPECT_EQ(proto3::FOO, parsed.enum_value1());
EXPECT_EQ(proto3::FOO, parsed.enum_value2());
EXPECT_EQ(proto3::BAR, parsed.enum_value3());
EXPECT_EQ(parsed.enum_value1(), proto3::FOO);
EXPECT_EQ(parsed.enum_value2(), proto3::FOO);
EXPECT_EQ(parsed.enum_value3(), proto3::BAR);
}
TEST_F(JsonUtilTest, TestPrintProto2EnumAsIntWithDefaultValue) {
TEST(JsonUtilTest, TestPrintProto2EnumAsIntWithDefaultValue) {
protobuf_unittest::TestDefaultEnumValue orig;
JsonPrintOptions print_options;
@@ -250,111 +271,105 @@ TEST_F(JsonUtilTest, TestPrintProto2EnumAsIntWithDefaultValue) {
print_options.always_print_primitive_fields = true;
// The result should be an int rather than a string.
std::string expected_json = "{\"enumValue\":2}";
EXPECT_EQ(expected_json, ToJson(orig, print_options));
auto printed = ToJson(orig, print_options);
ASSERT_THAT(printed, IsOkAndHolds("{\"enumValue\":2}"));
protobuf_unittest::TestDefaultEnumValue parsed;
JsonParseOptions parse_options;
ASSERT_TRUE(FromJson(expected_json, &parsed, parse_options));
ASSERT_OK(FromJson(*printed, &parsed));
EXPECT_EQ(protobuf_unittest::DEFAULT, parsed.enum_value());
EXPECT_EQ(parsed.enum_value(), protobuf_unittest::DEFAULT);
}
TEST_F(JsonUtilTest, ParseMessage) {
TEST(JsonUtilTest, ParseMessage) {
// Some random message but good enough to verify that the parsing wrapper
// functions are working properly.
std::string input =
"{\n"
" \"int32Value\": 1234567891,\n"
" \"int64Value\": 5302428716536692736,\n"
" \"floatValue\": 3.4028235e+38,\n"
" \"repeatedInt32Value\": [1, 2],\n"
" \"messageValue\": {\n"
" \"value\": 2048\n"
" },\n"
" \"repeatedMessageValue\": [\n"
" {\"value\": 40}, {\"value\": 96}\n"
" ]\n"
"}\n";
JsonParseOptions options;
TestMessage m;
ASSERT_TRUE(FromJson(input, &m, options));
EXPECT_EQ(1234567891, m.int32_value());
EXPECT_EQ(5302428716536692736, m.int64_value());
EXPECT_EQ(3.402823466e+38f, m.float_value());
ASSERT_EQ(2, m.repeated_int32_value_size());
EXPECT_EQ(1, m.repeated_int32_value(0));
EXPECT_EQ(2, m.repeated_int32_value(1));
EXPECT_EQ(2048, m.message_value().value());
ASSERT_EQ(2, m.repeated_message_value_size());
EXPECT_EQ(40, m.repeated_message_value(0).value());
EXPECT_EQ(96, m.repeated_message_value(1).value());
ASSERT_OK(FromJson(R"json(
{
"int32Value": 1234567891,
"int64Value": 5302428716536692736,
"floatValue": 3.4028235e+38,
"repeatedInt32Value": [1, 2],
"messageValue": {
"value": 2048
},
"repeatedMessageValue": [
{"value": 40},
{"value": 96}
]
}
)json",
&m));
EXPECT_EQ(m.int32_value(), 1234567891);
EXPECT_EQ(m.int64_value(), 5302428716536692736);
EXPECT_EQ(m.float_value(), 3.402823466e+38f);
ASSERT_EQ(m.repeated_int32_value_size(), 2);
EXPECT_EQ(m.repeated_int32_value(0), 1);
EXPECT_EQ(m.repeated_int32_value(1), 2);
EXPECT_EQ(m.message_value().value(), 2048);
ASSERT_EQ(m.repeated_message_value_size(), 2);
EXPECT_EQ(m.repeated_message_value(0).value(), 40);
EXPECT_EQ(m.repeated_message_value(1).value(), 96);
}
TEST_F(JsonUtilTest, ParseMap) {
TEST(JsonUtilTest, ParseMap) {
TestMap message;
(*message.mutable_string_map())["hello"] = 1234;
JsonPrintOptions print_options;
JsonParseOptions parse_options;
EXPECT_EQ("{\"stringMap\":{\"hello\":1234}}", ToJson(message, print_options));
auto printed = ToJson(message);
ASSERT_THAT(printed, IsOkAndHolds("{\"stringMap\":{\"hello\":1234}}"));
TestMap other;
ASSERT_TRUE(FromJson(ToJson(message, print_options), &other, parse_options));
EXPECT_EQ(message.DebugString(), other.DebugString());
ASSERT_OK(FromJson(*printed, &other));
EXPECT_EQ(other.DebugString(), message.DebugString());
}
TEST_F(JsonUtilTest, ParsePrimitiveMapIn) {
TEST(JsonUtilTest, ParsePrimitiveMapIn) {
MapIn message;
JsonPrintOptions print_options;
print_options.always_print_primitive_fields = true;
JsonParseOptions parse_options;
EXPECT_EQ("{\"other\":\"\",\"things\":[],\"mapInput\":{},\"mapAny\":{}}",
ToJson(message, print_options));
auto printed = ToJson(message, print_options);
ASSERT_THAT(
printed,
IsOkAndHolds(
"{\"other\":\"\",\"things\":[],\"mapInput\":{},\"mapAny\":{}}"));
MapIn other;
ASSERT_TRUE(FromJson(ToJson(message, print_options), &other, parse_options));
EXPECT_EQ(message.DebugString(), other.DebugString());
ASSERT_OK(FromJson(*printed, &other));
EXPECT_EQ(other.DebugString(), message.DebugString());
}
TEST_F(JsonUtilTest, PrintPrimitiveOneof) {
TEST(JsonUtilTest, PrintPrimitiveOneof) {
TestOneof message;
JsonPrintOptions options;
options.always_print_primitive_fields = true;
message.mutable_oneof_message_value();
EXPECT_EQ("{\"oneofMessageValue\":{\"value\":0}}", ToJson(message, options));
EXPECT_THAT(ToJson(message, options),
IsOkAndHolds("{\"oneofMessageValue\":{\"value\":0}}"));
message.set_oneof_int32_value(1);
EXPECT_EQ("{\"oneofInt32Value\":1}", ToJson(message, options));
EXPECT_THAT(ToJson(message, options),
IsOkAndHolds("{\"oneofInt32Value\":1}"));
}
TEST_F(JsonUtilTest, TestParseIgnoreUnknownFields) {
TEST(JsonUtilTest, TestParseIgnoreUnknownFields) {
TestMessage m;
JsonParseOptions options;
options.ignore_unknown_fields = true;
EXPECT_TRUE(FromJson("{\"unknownName\":0}", &m, options));
EXPECT_OK(FromJson("{\"unknownName\":0}", &m, options));
}
TEST_F(JsonUtilTest, TestParseErrors) {
TEST(JsonUtilTest, TestParseErrors) {
TestMessage m;
JsonParseOptions options;
// Parsing should fail if the field name cannot be recognized.
EXPECT_FALSE(FromJson("{\"unknownName\":0}", &m, options));
EXPECT_THAT(FromJson(R"json({"unknownName": 0})json", &m),
StatusIs(util::StatusCode::kInvalidArgument));
// Parsing should fail if the value is invalid.
EXPECT_FALSE(FromJson("{\"int32Value\":2147483648}", &m, options));
EXPECT_THAT(FromJson(R"json({"int32Value": 2147483648})json", &m),
StatusIs(util::StatusCode::kInvalidArgument));
}
TEST_F(JsonUtilTest, TestDynamicMessage) {
// Some random message but good enough to test the wrapper functions.
std::string input =
"{\n"
" \"int32Value\": 1024,\n"
" \"repeatedInt32Value\": [1, 2],\n"
" \"messageValue\": {\n"
" \"value\": 2048\n"
" },\n"
" \"repeatedMessageValue\": [\n"
" {\"value\": 40}, {\"value\": 96}\n"
" ]\n"
"}\n";
TEST(JsonUtilTest, TestDynamicMessage) {
// Create a new DescriptorPool with the same protos as the generated one.
DescriptorPoolDatabase database(*DescriptorPool::generated_pool());
DescriptorPool pool(&database);
@@ -363,218 +378,223 @@ TEST_F(JsonUtilTest, TestDynamicMessage) {
std::unique_ptr<Message> message(
factory.GetPrototype(pool.FindMessageTypeByName("proto3.TestMessage"))
->New());
EXPECT_TRUE(FromJson(input, message.get()));
EXPECT_OK(FromJson(R"json(
{
"int32Value": 1024,
"repeatedInt32Value": [1, 2],
"messageValue": {
"value": 2048
},
"repeatedMessageValue": [
{"value": 40},
{"value": 96}
]
}
)json",
message.get()));
// Convert to generated message for easy inspection.
TestMessage generated;
EXPECT_TRUE(generated.ParseFromString(message->SerializeAsString()));
EXPECT_EQ(1024, generated.int32_value());
ASSERT_EQ(2, generated.repeated_int32_value_size());
EXPECT_EQ(1, generated.repeated_int32_value(0));
EXPECT_EQ(2, generated.repeated_int32_value(1));
EXPECT_EQ(2048, generated.message_value().value());
ASSERT_EQ(2, generated.repeated_message_value_size());
EXPECT_EQ(40, generated.repeated_message_value(0).value());
EXPECT_EQ(96, generated.repeated_message_value(1).value());
EXPECT_EQ(generated.int32_value(), 1024);
ASSERT_EQ(generated.repeated_int32_value_size(), 2);
EXPECT_EQ(generated.repeated_int32_value(0), 1);
EXPECT_EQ(generated.repeated_int32_value(1), 2);
EXPECT_EQ(generated.message_value().value(), 2048);
ASSERT_EQ(generated.repeated_message_value_size(), 2);
EXPECT_EQ(generated.repeated_message_value(0).value(), 40);
EXPECT_EQ(generated.repeated_message_value(1).value(), 96);
JsonOptions options;
EXPECT_EQ(ToJson(generated, options), ToJson(*message, options));
auto message_json = ToJson(*message);
ASSERT_OK(message_json);
auto generated_json = ToJson(generated);
ASSERT_OK(generated_json);
EXPECT_EQ(*message_json, *generated_json);
}
TEST_F(JsonUtilTest, TestParsingUnknownAnyFields) {
std::string input =
"{\n"
" \"value\": {\n"
" \"@type\": \"type.googleapis.com/proto3.TestMessage\",\n"
" \"unknown_field\": \"UNKNOWN_VALUE\",\n"
" \"string_value\": \"expected_value\"\n"
" }\n"
"}";
TEST(JsonUtilTest, TestParsingUnknownAnyFields) {
StringPiece input = R"json(
{
"value": {
"@type": "type.googleapis.com/proto3.TestMessage",
"unknown_field": "UNKNOWN_VALUE",
"string_value": "expected_value"
}
}
)json";
TestAny m;
JsonParseOptions options;
EXPECT_FALSE(FromJson(input, &m, options));
EXPECT_THAT(FromJson(input, &m),
StatusIs(util::StatusCode::kInvalidArgument));
JsonParseOptions options;
options.ignore_unknown_fields = true;
EXPECT_TRUE(FromJson(input, &m, options));
EXPECT_OK(FromJson(input, &m, options));
TestMessage t;
EXPECT_TRUE(m.value().UnpackTo(&t));
EXPECT_EQ("expected_value", t.string_value());
EXPECT_EQ(t.string_value(), "expected_value");
}
TEST_F(JsonUtilTest, TestParsingUnknownEnumsProto2) {
std::string input =
"{\n"
" \"a\": \"UNKNOWN_VALUE\"\n"
"}";
TEST(JsonUtilTest, TestParsingUnknownEnumsProto2) {
StringPiece input = R"json({"a": "UNKNOWN_VALUE"})json";
protobuf_unittest::TestNumbers m;
JsonParseOptions options;
EXPECT_FALSE(FromJson(input, &m, options));
EXPECT_THAT(FromJson(input, &m, options),
StatusIs(util::StatusCode::kInvalidArgument));
options.ignore_unknown_fields = true;
EXPECT_TRUE(FromJson(input, &m, options));
EXPECT_OK(FromJson(input, &m, options));
EXPECT_FALSE(m.has_a());
}
TEST_F(JsonUtilTest, TestParsingUnknownEnumsProto3) {
TEST(JsonUtilTest, TestParsingUnknownEnumsProto3) {
TestMessage m;
{
JsonParseOptions options;
ASSERT_FALSE(options.ignore_unknown_fields);
std::string input =
"{\n"
" \"enum_value\":\"UNKNOWN_VALUE\"\n"
"}";
m.set_enum_value(proto3::BAR);
EXPECT_FALSE(FromJson(input, &m, options));
ASSERT_EQ(proto3::BAR, m.enum_value()); // Keep previous value
StringPiece input = R"json({"enum_value":"UNKNOWN_VALUE"})json";
m.set_enum_value(proto3::BAR);
EXPECT_THAT(FromJson(input, &m),
StatusIs(util::StatusCode::kInvalidArgument));
ASSERT_EQ(m.enum_value(), proto3::BAR); // Keep previous value
JsonParseOptions options;
options.ignore_unknown_fields = true;
EXPECT_TRUE(FromJson(input, &m, options));
EXPECT_EQ(0, m.enum_value()); // Unknown enum value must be decoded as 0
EXPECT_OK(FromJson(input, &m, options));
EXPECT_EQ(m.enum_value(), 0); // Unknown enum value must be decoded as 0
}
// Integer values are read as usual
{
JsonParseOptions options;
std::string input =
"{\n"
" \"enum_value\":12345\n"
"}";
m.set_enum_value(proto3::BAR);
EXPECT_TRUE(FromJson(input, &m, options));
ASSERT_EQ(12345, m.enum_value());
TEST(JsonUtilTest, TestParsingUnknownEnumsProto3FromInt) {
TestMessage m;
StringPiece input = R"json({"enum_value":12345})json";
m.set_enum_value(proto3::BAR);
EXPECT_OK(FromJson(input, &m));
ASSERT_EQ(m.enum_value(), 12345);
JsonParseOptions options;
options.ignore_unknown_fields = true;
EXPECT_TRUE(FromJson(input, &m, options));
EXPECT_EQ(12345, m.enum_value());
EXPECT_OK(FromJson(input, &m, options));
EXPECT_EQ(m.enum_value(), 12345);
}
// Trying to pass an object as an enum field value is always treated as an
// error
{
TEST(JsonUtilTest, TestParsingUnknownEnumsProto3FromObject) {
TestMessage m;
StringPiece input = R"json({"enum_value": {}})json";
JsonParseOptions options;
std::string input =
"{\n"
" \"enum_value\":{}\n"
"}";
options.ignore_unknown_fields = true;
EXPECT_FALSE(FromJson(input, &m, options));
options.ignore_unknown_fields = false;
EXPECT_FALSE(FromJson(input, &m, options));
}
// Trying to pass an array as an enum field value is always treated as an
// error
{
JsonParseOptions options;
std::string input =
"{\n"
" \"enum_value\":[]\n"
"}";
EXPECT_FALSE(FromJson(input, &m, options));
options.ignore_unknown_fields = true;
EXPECT_FALSE(FromJson(input, &m, options));
}
EXPECT_THAT(FromJson(input, &m, options),
StatusIs(util::StatusCode::kInvalidArgument));
EXPECT_THAT(FromJson(input, &m),
StatusIs(util::StatusCode::kInvalidArgument));
}
TEST_F(JsonUtilTest, TestParsingEnumIgnoreCase) {
TEST(JsonUtilTest, TestParsingUnknownEnumsProto3FromArray) {
TestMessage m;
{
StringPiece input = R"json({"enum_value": []})json";
EXPECT_THAT(FromJson(input, &m),
StatusIs(util::StatusCode::kInvalidArgument));
JsonParseOptions options;
std::string input =
"{\n"
" \"enum_value\":\"bar\"\n"
"}";
options.ignore_unknown_fields = true;
EXPECT_THAT(FromJson(input, &m, options),
StatusIs(util::StatusCode::kInvalidArgument));
}
TEST(JsonUtilTest, TestParsingEnumCaseSensitive) {
TestMessage m;
StringPiece input = R"json({"enum_value": "bar"})json";
m.set_enum_value(proto3::FOO);
EXPECT_FALSE(FromJson(input, &m, options));
EXPECT_THAT(FromJson(input, &m),
StatusIs(util::StatusCode::kInvalidArgument));
// Default behavior is case-sensitive, so keep previous value.
ASSERT_EQ(proto3::FOO, m.enum_value());
ASSERT_EQ(m.enum_value(), proto3::FOO);
}
{
JsonParseOptions options;
options.case_insensitive_enum_parsing = false;
std::string input =
"{\n"
" \"enum_value\":\"bar\"\n"
"}";
TEST(JsonUtilTest, TestParsingEnumIgnoreCase) {
TestMessage m;
StringPiece input = R"json({"enum_value":"bar"})json";
m.set_enum_value(proto3::FOO);
EXPECT_FALSE(FromJson(input, &m, options));
ASSERT_EQ(proto3::FOO, m.enum_value()); // Keep previous value
}
{
JsonParseOptions options;
options.case_insensitive_enum_parsing = true;
std::string input =
"{\n"
" \"enum_value\":\"bar\"\n"
"}";
m.set_enum_value(proto3::FOO);
EXPECT_TRUE(FromJson(input, &m, options));
ASSERT_EQ(proto3::BAR, m.enum_value());
}
EXPECT_OK(FromJson(input, &m, options));
ASSERT_EQ(m.enum_value(), proto3::BAR);
}
typedef std::pair<char*, int> Segment;
// A ZeroCopyOutputStream that writes to multiple buffers.
class SegmentedZeroCopyOutputStream : public io::ZeroCopyOutputStream {
public:
explicit SegmentedZeroCopyOutputStream(std::list<Segment> segments)
: segments_(segments),
last_segment_(static_cast<char*>(NULL), 0),
byte_count_(0) {}
explicit SegmentedZeroCopyOutputStream(
std::vector<StringPiece> segments)
: segments_(segments) {
// absl::c_* functions are not cloned in OSS.
std::reverse(segments_.begin(), segments_.end());
}
bool Next(void** buffer, int* length) override {
if (segments_.empty()) {
return false;
}
last_segment_ = segments_.front();
segments_.pop_front();
*buffer = last_segment_.first;
*length = last_segment_.second;
byte_count_ += *length;
last_segment_ = segments_.back();
segments_.pop_back();
// TODO(b/234159981): This is only ever constructed in test code, and only
// from non-const bytes, so this is a valid cast. We need to do this since
// OSS proto does not yet have absl::Span; once we take a full Abseil
// dependency we should use that here instead.
*buffer = const_cast<char*>(last_segment_.data());
*length = static_cast<int>(last_segment_.size());
byte_count_ += static_cast<int64_t>(last_segment_.size());
return true;
}
void BackUp(int length) override {
GOOGLE_CHECK(length <= last_segment_.second);
segments_.push_front(
Segment(last_segment_.first + last_segment_.second - length, length));
last_segment_ = Segment(last_segment_.first, last_segment_.second - length);
byte_count_ -= length;
GOOGLE_CHECK(length <= static_cast<int>(last_segment_.size()));
size_t backup = last_segment_.size() - static_cast<size_t>(length);
segments_.push_back(last_segment_.substr(backup));
last_segment_ = last_segment_.substr(0, backup);
byte_count_ -= static_cast<int64_t>(length);
}
int64_t ByteCount() const override { return byte_count_; }
private:
std::list<Segment> segments_;
Segment last_segment_;
int64_t byte_count_;
std::vector<StringPiece> segments_;
StringPiece last_segment_;
int64_t byte_count_ = 0;
};
// This test splits the output buffer and also the input data into multiple
// segments and checks that the implementation of ZeroCopyStreamByteSink
// handles all possible cases correctly.
TEST(ZeroCopyStreamByteSinkTest, TestAllInputOutputPatterns) {
static const int kOutputBufferLength = 10;
static constexpr int kOutputBufferLength = 10;
// An exhaustive test takes too long, skip some combinations to make the test
// run faster.
static const int kSkippedPatternCount = 7;
static constexpr int kSkippedPatternCount = 7;
char buffer[kOutputBufferLength];
for (int split_pattern = 0; split_pattern < (1 << (kOutputBufferLength - 1));
split_pattern += kSkippedPatternCount) {
// Split the buffer into small segments according to the split_pattern.
std::list<Segment> segments;
std::vector<StringPiece> segments;
int segment_start = 0;
for (int i = 0; i < kOutputBufferLength - 1; ++i) {
if (split_pattern & (1 << i)) {
segments.push_back(
Segment(buffer + segment_start, i - segment_start + 1));
StringPiece(buffer + segment_start, i - segment_start + 1));
segment_start = i + 1;
}
}
segments.push_back(
Segment(buffer + segment_start, kOutputBufferLength - segment_start));
segments.push_back(StringPiece(buffer + segment_start,
kOutputBufferLength - segment_start));
// Write exactly 10 bytes through the ByteSink.
std::string input_data = "0123456789";
@@ -593,7 +613,7 @@ TEST(ZeroCopyStreamByteSinkTest, TestAllInputOutputPatterns) {
}
byte_sink.Append(&input_data[start], input_data.length() - start);
}
EXPECT_EQ(input_data, std::string(buffer, input_data.length()));
EXPECT_EQ(std::string(buffer, input_data.length()), input_data);
}
// Write only 9 bytes through the ByteSink.
@@ -613,8 +633,8 @@ TEST(ZeroCopyStreamByteSinkTest, TestAllInputOutputPatterns) {
}
byte_sink.Append(&input_data[start], input_data.length() - start);
}
EXPECT_EQ(input_data, std::string(buffer, input_data.length()));
EXPECT_EQ(0, buffer[input_data.length()]);
EXPECT_EQ(std::string(buffer, input_data.length()), input_data);
EXPECT_EQ(buffer[input_data.length()], 0);
}
// Write 11 bytes through the ByteSink. The extra byte will just
@@ -641,29 +661,29 @@ TEST(ZeroCopyStreamByteSinkTest, TestAllInputOutputPatterns) {
}
}
TEST_F(JsonUtilTest, TestWrongJsonInput) {
const char json[] = "{\"unknown_field\":\"some_value\"}";
io::ArrayInputStream input_stream(json, strlen(json));
TEST(JsonUtilTest, TestWrongJsonInput) {
StringPiece json = "{\"unknown_field\":\"some_value\"}";
io::ArrayInputStream input_stream(json.data(), json.size());
char proto_buffer[10000];
io::ArrayOutputStream output_stream(proto_buffer, sizeof(proto_buffer));
std::string message_type = "type.googleapis.com/proto3.TestMessage";
TypeResolver* resolver = NewTypeResolverForDescriptorPool(
auto* resolver = NewTypeResolverForDescriptorPool(
"type.googleapis.com", DescriptorPool::generated_pool());
auto result_status = util::JsonToBinaryStream(resolver, message_type,
&input_stream, &output_stream);
EXPECT_THAT(JsonToBinaryStream(resolver, message_type, &input_stream,
&output_stream),
StatusIs(util::StatusCode::kInvalidArgument));
delete resolver;
EXPECT_FALSE(result_status.ok());
EXPECT_TRUE(util::IsInvalidArgument(result_status));
}
TEST_F(JsonUtilTest, HtmlEscape) {
TEST(JsonUtilTest, HtmlEscape) {
TestMessage m;
m.set_string_value("</script>");
JsonPrintOptions options;
EXPECT_EQ("{\"stringValue\":\"\\u003c/script\\u003e\"}", ToJson(m, options));
EXPECT_THAT(ToJson(m, options),
IsOkAndHolds("{\"stringValue\":\"\\u003c/script\\u003e\"}"));
}
} // namespace

@@ -1 +0,0 @@
Subproject commit 8c6e53ef3adb1227fffa442c50349dab134a54bc

@@ -1 +1 @@
Subproject commit 5b7683f49e1e9223cf9927b24f6fd3d6bd82e3f8
Subproject commit 0baacde3618ca617da95375e0af13ce1baadea47


@@ -1,195 +1,8 @@
#!/bin/bash -u
# This script copies source file lists from src/Makefile.am to cmake files.
# This script generates file lists from Bazel, e.g., for cmake.
get_variable_value() {
local FILENAME=$1
local VARNAME=$2
awk "
BEGIN { start = 0; }
/^$VARNAME =/ { start = 1; }
{ if (start) { print \$0; } }
/\\\\\$/ { next; }
{ start = 0; }
" $FILENAME \
| sed "s/^$VARNAME =//" \
| sed "s/[ \\]//g" \
| grep -v "^\\$" \
| grep -v "^$" \
| LC_ALL=C sort | uniq
}
get_header_files() {
get_variable_value $@ | grep '\.h$'
}
get_source_files() {
get_variable_value $@ | grep "cc$\|inc$"
}
get_proto_files() {
get_variable_value $@ | grep "pb.cc$" | sed "s/pb.cc/proto/"
}
sort_files() {
for FILE in $@; do
echo $FILE
done | LC_ALL=C sort | uniq
}
MAKEFILE=src/Makefile.am
[ -f "$MAKEFILE" ] || {
echo "Cannot find: $MAKEFILE"
exit 1
}
# Extract file lists from src/Makefile.am
GZHEADERS=$(get_variable_value $MAKEFILE GZHEADERS)
LIBPROTOBUF_HEADERS=$(get_variable_value $MAKEFILE nobase_include_HEADERS | grep -v /compiler/)
LIBPROTOBUF_HEADERS=$(sort_files $GZHEADERS $LIBPROTOBUF_HEADERS)
LIBPROTOBUF_LITE_SOURCES=$(get_source_files $MAKEFILE libprotobuf_lite_la_SOURCES)
LIBPROTOBUF_SOURCES=$(get_source_files $MAKEFILE libprotobuf_la_SOURCES)
LIBPROTOC_SOURCES=$(get_source_files $MAKEFILE libprotoc_la_SOURCES)
LIBPROTOC_HEADERS=$(get_variable_value $MAKEFILE nobase_include_HEADERS | grep /compiler/)
LITE_PROTOS=$(get_proto_files $MAKEFILE protoc_lite_outputs)
PROTOS=$(get_proto_files $MAKEFILE protoc_outputs)
WKT_PROTOS=$(get_variable_value $MAKEFILE nobase_dist_proto_DATA)
COMMON_TEST_SOURCES=$(get_source_files $MAKEFILE COMMON_TEST_SOURCES)
COMMON_LITE_TEST_SOURCES=$(get_source_files $MAKEFILE COMMON_LITE_TEST_SOURCES)
TEST_SOURCES=$(get_source_files $MAKEFILE protobuf_test_SOURCES)
NON_MSVC_TEST_SOURCES=$(get_source_files $MAKEFILE NON_MSVC_TEST_SOURCES)
LITE_TEST_SOURCES=$(get_source_files $MAKEFILE protobuf_lite_test_SOURCES)
LITE_ARENA_TEST_SOURCES=$(get_source_files $MAKEFILE protobuf_lite_arena_test_SOURCES)
TEST_PLUGIN_SOURCES=$(get_source_files $MAKEFILE test_plugin_SOURCES)
################################################################################
# Update cmake files.
################################################################################
CMAKE_DIR=cmake
EXTRACT_INCLUDES_BAT=cmake/extract_includes.bat.in
[ -d "$CMAKE_DIR" ] || {
echo "Cannot find: $CMAKE_DIR"
exit 1
}
[ -f "$EXTRACT_INCLUDES_BAT" ] || {
echo "Cannot find: $EXTRACT_INCLUDES_BAT"
exit 1
}
set_cmake_value() {
local FILENAME=$1
local VARNAME=$2
local PREFIX=$3
shift
shift
shift
awk -v values="$*" -v prefix="$PREFIX" "
BEGIN { start = 0; }
/^set\\($VARNAME/ {
start = 1;
print \$0;
len = split(values, vlist, \" \");
for (i = 1; i <= len; ++i) {
printf(\" \");
if (vlist[i] !~ /^\\\$/) {
printf(\"%s\", prefix);
}
printf(\"%s\\n\", vlist[i]);
}
next;
}
start && /^\\)/ {
start = 0;
}
!start {
print \$0;
}
" $FILENAME > /tmp/$$
cp /tmp/$$ $FILENAME
}
# Replace file lists in cmake files.
CMAKE_PREFIX="\${protobuf_SOURCE_DIR}/src/"
set_cmake_value $CMAKE_DIR/libprotobuf-lite.cmake libprotobuf_lite_files $CMAKE_PREFIX $LIBPROTOBUF_LITE_SOURCES
set_cmake_value $CMAKE_DIR/libprotobuf.cmake libprotobuf_files $CMAKE_PREFIX $LIBPROTOBUF_SOURCES
set_cmake_value $CMAKE_DIR/libprotoc.cmake libprotoc_files $CMAKE_PREFIX $LIBPROTOC_SOURCES
set_cmake_value $CMAKE_DIR/libprotoc.cmake libprotoc_headers $CMAKE_PREFIX $LIBPROTOC_HEADERS
set_cmake_value $CMAKE_DIR/tests.cmake lite_test_protos "" $LITE_PROTOS
set_cmake_value $CMAKE_DIR/tests.cmake tests_protos "" $PROTOS
set_cmake_value $CMAKE_DIR/tests.cmake common_test_files $CMAKE_PREFIX '${common_lite_test_files}' $COMMON_TEST_SOURCES
set_cmake_value $CMAKE_DIR/tests.cmake common_lite_test_files $CMAKE_PREFIX $COMMON_LITE_TEST_SOURCES
set_cmake_value $CMAKE_DIR/tests.cmake tests_files $CMAKE_PREFIX $TEST_SOURCES
set_cmake_value $CMAKE_DIR/tests.cmake non_msvc_tests_files $CMAKE_PREFIX $NON_MSVC_TEST_SOURCES
set_cmake_value $CMAKE_DIR/tests.cmake lite_test_files $CMAKE_PREFIX $LITE_TEST_SOURCES
set_cmake_value $CMAKE_DIR/tests.cmake lite_arena_test_files $CMAKE_PREFIX $LITE_ARENA_TEST_SOURCES
# Generate extract_includes.bat
echo "mkdir include" > $EXTRACT_INCLUDES_BAT
for INCLUDE in $LIBPROTOBUF_HEADERS $LIBPROTOC_HEADERS $WKT_PROTOS; do
INCLUDE_DIR=$(dirname "$INCLUDE")
while [ ! "$INCLUDE_DIR" = "." ]; do
echo "mkdir include\\${INCLUDE_DIR//\//\\}"
INCLUDE_DIR=$(dirname "$INCLUDE_DIR")
done
done | sort | uniq >> $EXTRACT_INCLUDES_BAT
for INCLUDE in $(sort_files $LIBPROTOBUF_HEADERS $LIBPROTOC_HEADERS) $WKT_PROTOS; do
WINPATH=${INCLUDE//\//\\}
echo "copy \"\${PROTOBUF_SOURCE_WIN32_PATH}\\..\\src\\$WINPATH\" include\\$WINPATH" >> $EXTRACT_INCLUDES_BAT
done
################################################################################
# Update bazel BUILD files.
################################################################################
set_bazel_value() {
local FILENAME=$1
local VARNAME=$2
local PREFIX=$3
shift
shift
shift
awk -v values="$*" -v prefix="$PREFIX" "
BEGIN { start = 0; }
/# AUTOGEN\\($VARNAME\\)/ {
start = 1;
print \$0;
# replace \$0 with indent.
sub(/#.*/, \"\", \$0)
len = split(values, vlist, \" \");
for (i = 1; i <= len; ++i) {
printf(\"%s\\\"%s%s\\\",\n\", \$0, prefix, vlist[i]);
}
next;
}
start && /\]/ {
start = 0
}
!start {
print \$0;
}
" $FILENAME > /tmp/$$
cp /tmp/$$ $FILENAME
}
BAZEL_BUILD=./BUILD
BAZEL_PREFIX="src/"
if [ -f "$BAZEL_BUILD" ]; then
set_bazel_value $BAZEL_BUILD protobuf_lite_srcs $BAZEL_PREFIX $LIBPROTOBUF_LITE_SOURCES
set_bazel_value $BAZEL_BUILD protobuf_srcs $BAZEL_PREFIX $LIBPROTOBUF_SOURCES
set_bazel_value $BAZEL_BUILD protoc_lib_srcs $BAZEL_PREFIX $LIBPROTOC_SOURCES
set_bazel_value $BAZEL_BUILD lite_test_protos "" $LITE_PROTOS
set_bazel_value $BAZEL_BUILD well_known_protos "" $WKT_PROTOS
set_bazel_value $BAZEL_BUILD test_protos "" $PROTOS
set_bazel_value $BAZEL_BUILD common_test_srcs $BAZEL_PREFIX $COMMON_LITE_TEST_SOURCES $COMMON_TEST_SOURCES
set_bazel_value $BAZEL_BUILD test_srcs $BAZEL_PREFIX $TEST_SOURCES
set_bazel_value $BAZEL_BUILD non_msvc_test_srcs $BAZEL_PREFIX $NON_MSVC_TEST_SOURCES
set_bazel_value $BAZEL_BUILD test_plugin_srcs $BAZEL_PREFIX $TEST_PLUGIN_SOURCES
else
echo "Skipped BUILD file update."
fi
set -e
bazel build //pkg:gen_src_file_lists
cp -v bazel-bin/pkg/src_file_lists.cmake src/file_lists.cmake