JSON/HTTP transcoding to gRPC using Envoy

Arindam Mukherjee
5 min read · Sep 22, 2019


Using your gRPC APIs to serve HTTP requests with JSON.

As more and more businesses run their applications as ensembles of microservices, gRPC is finding increasing popularity as a protocol for data interchange between services. Using protobuf over HTTP/2, gRPC provides a compact wire format on top of an industry-standard protocol. However, gRPC isn't going to replace all HTTP 1.1 applications and clients (including browsers and JSON-gulping services) any time soon. For microservice authors this means that, in some cases, they have to support JSON over HTTP 1.1 as an option, which translates to additional boilerplate code for serving HTTP and generating / consuming JSON.

This article looks at one particular way of doing this with almost no code and a little bit of configuration using Envoy, the popular proxy/reverse proxy that is the new favorite of cloud infrastructure writers.

The Method

The way to serve JSON over HTTP 1.1 with just gRPC endpoints and Envoy is an extension of the usual process of writing gRPC APIs. It can be summarized as the following sequence of steps.

  1. In your gRPC service definitions in the proto file, add a small amount of metadata for each RPC you want to expose as an HTTP endpoint. To use the directives for doing this, import the file google/api/annotations.proto in your service proto file. To make this proto file available, clone googleapis/googleapis.
  2. Build your language stubs from the proto file and write your service implementation, just as you would for any gRPC service. The regular protobuf toolchain (protoc and its plugins) is used for this; see the sketch after this list.
  3. Also using protoc, generate the protocol descriptor set for your services. This is the information a translation layer uses to map JSON to protobuf and vice versa.
  4. Finally, run your gRPC server at some designated port, and use Envoy as a reverse proxy to route to that port. In doing so, configure Envoy to use the gRPC-JSON transcoder filter to translate between gRPC and JSON. This configuration includes the path to the protocol descriptor set generated in step 3, so that Envoy knows the rules of translation.
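
For step 2, the stubs are generated with the usual protoc plugins for your language of choice. Here is a minimal sketch, assuming a Go implementation with the protoc-gen-go plugin installed (output and include paths are illustrative):

# Generate Go types and gRPC service stubs from person.proto (requires protoc-gen-go)
protoc -I<googleapi-dir> -I<other-dirs> \
    --go_out=plugins=grpc:. person.proto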

Steps 1, 3, and 4 above are the additional actions you need in order to transcode JSON to gRPC. Let's look at these three steps in more detail.

Using google.api.annotations to map gRPC services to REST endpoints

Let's suppose you have defined a message type Person and a gRPC service called PersonRegistry with RPCs Lookup and Create. You would want to expose Lookup as a GET API, and Create as a POST API. Normally your service definition in a proto file would look like the following:

service PersonRegistry {
  rpc Lookup(Person) returns (Person);
  rpc Create(Person) returns (Person);
}
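
The Person message itself isn't shown here; a minimal sketch, assuming the pkg.name package referenced in the Envoy config below and the fields used in the curl examples later, could be:

syntax = "proto3";

package pkg.name;

// Minimal Person message; field names match the curl examples later in this article.
message Person {
  string name = 1;
  int32 age = 2;
}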

To expose an RPC as a REST endpoint, however, we must annotate it. So we add the following annotations (and import the proto file that defines the annotation options).

import "google/api/annotations.proto";
...
service PersonFinder {
rpc Lookup(Person) returns(Person) {
option (google.api.http) = {
get: "/lookup"
};

}
rpc Create(Person) returns(Person) {
option (google.api.http) = {
post: "/create"
};

}
}
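
The google.api.http annotation can also bind request fields from the URL path. As an illustration (not used in the rest of this article), a hypothetical variant of Lookup could carry the name field in the path:

rpc Lookup(Person) returns (Person) {
  option (google.api.http) = {
    get: "/person/{name}"
  };
}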

In addition, for GET requests, query parameters map to individual fields of the request message. For POST requests, the JSON body maps to the request message (this is what the body: "*" clause in the annotation above enables). In both cases, the standard proto3 JSON mapping applies: lower snake_case proto field names are exposed as lowerCamelCase names in JSON.
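
As an illustration of the naming, a hypothetical snake_case field on Person would surface under its camelCase JSON name:

message Person {
  // ... existing fields ...
  string first_name = 3;  // hypothetical; appears as "firstName" in JSON bodies
}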

Generating protocol descriptor sets

The protocol descriptor set is generated from the same proto file that defines the message types and services. Since we imported google/api/annotations.proto, we must make sure that this file is available and that protoc knows where to find it.

git clone https://github.com/googleapis/googleapis <googleapi-dir>
...
protoc -I<googleapi-dir> -I<other-dirs> --include_imports \
    --descriptor_set_out=<path>/person.pb person.proto

The descriptor set is stored in a binary file with the .pb extension.

Envoy for bridging JSON/HTTP to gRPC

Finally, we must write an Envoy configuration that makes Envoy listen for HTTP requests on a particular port, translate them to gRPC requests using the descriptor set, and forward them to our real gRPC server. On the response path, Envoy must convert the protobuf response back to JSON. Here is the configuration; the essential elements are the envoy.grpc_json_transcoder filter and the grpcserver cluster it routes to:

static_resources:
  listeners:
  - name: restlistener
    address:
      socket_address: { address: 0.0.0.0, port_value: 7778 }
    filter_chains:
    - filters:
      - name: envoy.http_connection_manager
        config:
          stat_prefix: grpc_json
          codec_type: AUTO
          route_config:
            name: local_route
            virtual_hosts:
            - name: local_service
              domains: ["*"]
              routes:
              - match: { prefix: "/" }
                route: { cluster: grpcserver, timeout: { seconds: 60 } }
          http_filters:
          - name: envoy.grpc_json_transcoder
            config:
              proto_descriptor: "<path>/person.pb"
              services: ["pkg.name.PersonRegistry"]
              print_options:
                add_whitespace: true
                always_print_primitive_fields: true
                always_print_enums_as_ints: false
                preserve_proto_field_names: false
          - name: envoy.router

  clusters:
  - name: grpcserver
    connect_timeout: 1.25s
    type: logical_dns
    lb_policy: round_robin
    dns_lookup_family: V4_ONLY
    http2_protocol_options: {}
    hosts:
    - socket_address:
        address: 0.0.0.0
        port_value: 7777

Here it is assumed that your gRPC server is listening on port 7777, while Envoy listens for HTTP requests on port 7778 and forwards them to port 7777. On both the request and response paths, the envoy.grpc_json_transcoder filter does the real translation work, using the proto_descriptor and the list of service names in the services attribute. Note that the service names must be fully qualified with their package names.
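
Assuming the configuration above is saved as envoy.yaml and the gRPC server is already listening on port 7777, Envoy can then be started against it:

# Start Envoy with the transcoding configuration
envoy -c envoy.yaml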

The Result

It will now be possible to access the gRPC endpoints via the proxied HTTP endpoints, using an HTTP client such as curl. Here is a command line to create a Person entry.

curl -X POST -d '{"name": "Sean", "age": 21}' \
    -H "Content-Type: application/json" \
    http://0.0.0.0:7778/create

The JSON data passed with the -d option is mapped to the request message type of the RPC. And here is a curl command to look this entry up by the Person's name:

curl -H "Content-Type: application/json" \
    "http://0.0.0.0:7778/lookup?name=Sean"
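
If the server has stored the entry created above, the lookup should return the Person rendered as JSON by the transcoder, along the lines of:

{
 "name": "Sean",
 "age": 21
}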

You can see the actual implementation of PersonRegistry, and an Envoy config that does the gRPC to JSON translation, in action at amukherj/envoy-grpc-json. The repository README has more information on how to try this configuration.

In conclusion, your microservice writers can focus on writing their services in terms of RPCs and protobufs. A little bit of annotation in the service definitions, and a few tweaks to your Envoy config, get you all the flexibility of serving REST + JSON. This is a useful technique, especially if you already happen to have Envoy handling your API requests, or if you're planning a move to Envoy.
