Intro to gRPC: The REST alternative

gRPC is a high-performance binary protocol for remote procedure calls on virtually any platform. Here's a hands-on intro to using gRPC with Node.

gRPC is a binary protocol for remote procedure calls, open sourced by Google. It is an alternative to REST that has gained traction for its performance and simplicity. gRPC gives you a full-duplex connection over HTTP/2. At the moment, it requires an adapter when used in the browser, and it remains most popular in back-end services. gRPC is especially popular for microservices, where its strong API contracts and method abstraction help tame complexity. Although gRPC demands more upfront work than a typical REST stack, it can be appealing for larger projects and organizations that benefit from a more structured approach.

Intro to gRPC

gRPC is a viable alternative to REST and represents the state of the art in remote procedure calls. It is polyglot, offering server and client libraries for all the major back-end languages and platforms. There is also the gRPC-web project for use in the browser.

In this article, we'll dig right into the code and build a Node.js client that talks to a Node.js service. This will give us a look at both sides of the communication channel. gRPC runs on HTTP/2 and uses a binary protocol that supports four combinations of unary and streaming requests: request/response, client streaming, server streaming, and full-duplex streaming. For our simple example, we’ll stick to a request/response interaction (what gRPC calls a “unary” interaction, versus streaming).
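
To make those four styles concrete, here is a small, illustrative proto3 sketch showing how each one is declared. The service, method, and message names are hypothetical and are not part of the example we build below; the only difference between the styles is where the stream keyword appears.


syntax = "proto3";

message Req { int32 value = 1; }
message Res { int32 value = 1; }

// Illustrative only: one method per gRPC interaction style
service ExampleService {
  rpc Unary(Req) returns (Res) {}                     // request/response
  rpc ServerStream(Req) returns (stream Res) {}       // one request, a stream of responses
  rpc ClientStream(stream Req) returns (Res) {}       // a stream of requests, one response
  rpc BidiStream(stream Req) returns (stream Res) {}  // both sides stream (full duplex)
}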

Create the Node service API definitions

gRPC works by creating a service definition in plain text, then compiling that into “stubs.” Stubs are code libraries that your application imports and relies on to define servers and clients. Essentially, the text definition lets you describe the endpoints and their communication in a human-readable format, then compile that into code libraries that you use inside your application to interact with those endpoints. The stubs do all the actual I/O work, while your code gets access to the API definition and layers its business logic on top.

This overall flow is similar to other RPC frameworks. For learning purposes, the important thing to know is that the service definition is the hinge point between all your clients and services. The definitions are the API surface, and the in-code servers and clients run off the compiled versions of them.

In theory, having the extra step of maintaining the service definitions and migrating them along with code changes adds overhead. That is also true in practice; however, integrating the definition step into the build process, along with IDE support, makes it much less onerous.
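
As an example of what that build integration can look like in Node, the grpc-tools package can generate static stubs ahead of time with a command along these lines (the output path here is illustrative). The examples in this article skip this step and load the definition at runtime instead:


$ npm install --save-dev grpc-tools
$ npx grpc_tools_node_protoc \
    --js_out=import_style=commonjs,binary:./generated \
    --grpc_out=grpc_js:./generated \
    --proto_path=. \
    service.proto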

Let’s start by creating the API definition for a Node-based number squaring service, as shown in Listing 1.

Listing 1. The API definition for the Node service


syntax = "proto3";

package myservice;

// Define the service
service NumberSquarer {
  // Unary RPC method for squaring a number
  rpc SquareNumber(Number) returns (Number) {}
}

// Define the message types
message Number {
  int32 value = 1;
}

Listing 1 begins by telling gRPC which version of the Protocol Buffers language to use; currently, the latest version is proto3. After that, we define a package called myservice.

Then, we have two blocks, one with the service and the other with the data structure. The service is named NumberSquarer and it has a single remote procedure on it, SquareNumber, which accepts a Number type. That type is defined in the message block at the bottom of Listing 1, and SquareNumber also uses Number as its return type.

Set up the Node project

To start a new Node project, you can use the steps shown in Listing 2, which will include the necessary dependencies. (Note that you will need Node/NPM installed.)

Listing 2. Start a new gRPC Node project


$ mkdir iw-grpc
$ cd iw-grpc
$ npm init -y
$ npm install @grpc/proto-loader async google-protobuf @grpc/grpc-js

Use the definition to create a server

Some platforms, like Java, require a compile step in the build process, but in JavaScript we can compile at runtime. In the /iw-grpc directory, create the service.proto file with the contents previously shown in Listing 1.

Next, use that definition to start a Node server that exposes the squaring endpoint, as shown in Listing 3. Save this file as server.js.

Listing 3. Node gRPC server


const grpc = require('@grpc/grpc-js');
const protoLoader = require('@grpc/proto-loader');

// Load the protobuf definition dynamically
const packageDefinition = protoLoader.loadSync('service.proto');
const { myservice } = grpc.loadPackageDefinition(packageDefinition);

// Implement the NumberSquarer service
const server = new grpc.Server();

function squareNumber(call, callback) {
  const number = call.request.value;
  const squaredValue = number * number;

  callback(null, { value: squaredValue });
}

server.addService(myservice.NumberSquarer.service, {
  SquareNumber: squareNumber,
});

// Start the gRPC server
const port = 50051;
server.bindAsync(`0.0.0.0:${port}`, grpc.ServerCredentials.createInsecure(), (err, port) => {
  if (err) {
    console.error(`Failed to bind gRPC server: ${err}`);
    return;
  }
  console.log(`gRPC server is listening on port ${port}`);
  server.start();
});

The three main chunks (after the imports) in Listing 3 are: load the service definition, implement the endpoint logic, and start the server. Since we defined our service inside a package, we begin by loading the package and then grab the service from that:


 const { myservice } = grpc.loadPackageDefinition(packageDefinition);
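
Note that loadSync also accepts an options object as a second argument that controls how the definition is translated into JavaScript. The settings below are the commonly used ones from the @grpc/proto-loader documentation:


const packageDefinition = protoLoader.loadSync('service.proto', {
  keepCase: true,  // keep field names as-is instead of converting to camelCase
  longs: String,   // represent 64-bit numbers as strings
  enums: String,   // represent enum values by name
  defaults: true,  // fill in default values for missing fields
  oneofs: true,    // include virtual oneof properties
});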

After that, we create a new gRPC server, define our squaring method, and use the server object to bind the new method to the SquareNumber endpoint.

Finally, we start the server listening on port 50051. We use the createInsecure method, which just means the server accepts plain, unencrypted connections rather than using TLS.
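
If you wanted transport security instead, you'd swap in createSsl with your own key and certificate. Here is a minimal sketch that slots into Listing 3; the file names are placeholders:


const fs = require('fs');

// Placeholder file names; point these at your actual key and certificate.
const credentials = grpc.ServerCredentials.createSsl(
  null, // root CA certificate, only needed when verifying client certificates
  [{
    private_key: fs.readFileSync('server.key'),
    cert_chain: fs.readFileSync('server.crt'),
  }],
  false // do not require client certificates
);

server.bindAsync(`0.0.0.0:${port}`, credentials, (err, boundPort) => {
  // same callback logic as in Listing 3
});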

Now you can start this server by typing: $ node server.js.

The Node client

Once the Node server is running, we can use the same service definition to build our client, which is shown in Listing 4. Save it as client.js.

Listing 4. The Node client


const grpc = require('@grpc/grpc-js');
const protoLoader = require('@grpc/proto-loader');

// Load the protobuf definition dynamically
const packageDefinition = protoLoader.loadSync('service.proto');
const { myservice } = grpc.loadPackageDefinition(packageDefinition);

// Create a gRPC client
const client = new myservice.NumberSquarer('localhost:50051', grpc.credentials.createInsecure());

// Read the number from the command line argument
const number = parseInt(process.argv[2]);

// Create the request message
const request = { value: number };

// Make the gRPC unary RPC call
client.SquareNumber(request, (err, response) => {
  if (err) {
    console.error(`Error: ${err.message}`);
    return;
  }
  console.log(`Squared number: ${response.value}`);
});

The client accepts a command-line argument for a single number (process.argv[2]) and uses the server to output its square. It follows the same process as the server for loading the service definition. Instead of creating a server, though, it creates a client stub that points at the server we have running:


const client = new myservice.NumberSquarer('localhost:50051', grpc.credentials.createInsecure());

Calling the remote procedure is simple and looks like calling an asynchronous method on the client object. The RPC accepts two arguments: the request and the response callback handler.
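
If you prefer promises over callbacks, you can wrap the generated stub method with Node's util.promisify. Here is a small sketch that reuses the client from Listing 4; the wrapper name is just illustrative:


const { promisify } = require('util');

// Wrap the callback-style stub method so it can be awaited
const squareNumber = promisify(client.SquareNumber.bind(client));

async function main() {
  const response = await squareNumber({ value: 7 });
  console.log(`Squared number: ${response.value}`);
}

main().catch((err) => console.error(`Error: ${err.message}`));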

To run the client, make sure the server is still running, then type $ node client.js 7. You should receive a response like this one:


Squared number: 49

Other platforms

On other platforms, you include the service compilation as part of the build process, and then use the compiled interfaces to write your endpoint implementations. A good way to see how this works is to clone the examples from the gRPC repository on GitHub.

For example, you could start with one of the Java examples. You can find similar examples and quickstarts for all the supported platforms in the official gRPC documentation. If you're interested in using gRPC from the browser, check out the grpc-web project. There is also an Android project for gRPC.

Listing 5 shows how you’d integrate the gRPC compile step into a Gradle build for Java.

Listing 5. Java Gradle gRPC build sample


protobuf {
    protoc { artifact = "com.google.protobuf:protoc:${protocVersion}" }
    plugins { grpc { artifact = "io.grpc:protoc-gen-grpc-java:${grpcVersion}" } }
    generateProtoTasks {
        all()*.plugins { grpc {} }
    }
}

Essentially, you apply the Protobuf Gradle plugin and add a protobuf block like this one, including the generateProtoTasks section, to the build.

Conclusion

This article gives you just a taste of using gRPC. Although it is different from the ubiquitous REST formula, it's not too hard to shift mental gears. The biggest change is having to account for the extra service-definition and compilation step. As you've seen in the case of Node, that step is fairly painless.

It’s fair to ask whether migrating to gRPC is worth it when REST is so common and familiar to developers. The answer is that gRPC shows its value in high-intensity microservices environments, where many services interact at high volume. In that case, a large organization will benefit from the extra rigor afforded by explicit RPC contracts, and the binary wire protocol will net performance benefits that are magnified across the entire microservices architecture.

This is not to say gRPC is a non-starter for fast-moving startups and the like. It’s just that you’ll want to make sure you are willing to adopt the extra configuration, are convinced you need the performance benefits, and understand the additional work required to find and train developers. 

Of course, you might just really like the gRPC approach, which is another perfectly legitimate reason to start experimenting with it today.