Specification-Driven Development (and why it’s the future of software development)

TLDR: I generated code for a REST API server and a microservice using a single Swagger spec file.

Every once in a while, I take time to completely re-evaluate how I approach software engineering. Design patterns are cool, but they aren’t game-changers when it comes to productivity. New dev tools go out of style as soon as you get comfortable with them (I like to wait until they’re mature). And no single programming language can be proven objectively better than another.

But after my recent exploration of what I call “specification-driven development”, my approach to backend development is significantly different today than it was last week.

At my last company, we had started a new project and decided to try go-swagger, a tool that generates the code for an entire Golang API server from a Swagger spec file. I was reluctant because it reminded me of the Adobe Dreamweaver days. But after the workflow finally clicked, it was exhilaratingly efficient.

I learned how to write a REST API schema according to the Swagger spec. In the Swagger file, I was not only able to specify API endpoints, but also add models (YAML can use references to keep it DRY), write descriptions for everything (which would be used for the generated documentation), and even add example values so that the documentation looked realistic. Then I ran a simple command and BOOM, all the server code was there. Another command resulted in my browser popping up with pretty docs and example requests/responses.
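For the curious, the docs part is go-swagger’s built-in serve subcommand; something along these lines should pop the documentation up in a browser (exact flags may vary by version, and swagger.yml stands in for your own spec file):

swagger serve ./swagger.yml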

So recently I started a new project with the same formula: a code-generated API server. But I took it a step further by also having microservices that were code-generated from the same Swagger spec. It took a bit of shell-script kung fu, but along the way I got to know Google’s Protocol Buffers (Protobuf), and now I see what the future of software development will look like…

Backend developers have always dictated data models. Going forward, a single declarative language will be used to define the endpoints and data models of a backend API, and code-generation tools will turn that definition into usable server and client code for all major languages. It’s already happening: there are a plethora of Swagger code generators. That saves a tremendous amount of time when you’re iterating quickly and trying to keep the backend and frontend clients in sync. I’ve always believed that when working with data, it’s important to have a “single source of truth”, and now that’s possible.

So now to the details. I started off with a swagger spec file:
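(The real spec is much longer, so here’s a trimmed, hypothetical sketch with a single /users endpoint, just to show the shape of it.)

swagger: "2.0"
info:
  title: myapp
  version: "1.0"
basePath: /api/v1
paths:
  /users:
    get:
      summary: List users
      description: Returns all registered users.
      operationId: listUsers
      produces:
        - application/json
      responses:
        200:
          description: A list of users
          schema:
            type: array
            items:
              $ref: '#/definitions/User'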

At the end of the Swagger file is the list of definitions that are referenced in the endpoints:
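(Again a trimmed, hypothetical sketch; the User model below is what the $ref in the endpoint above points to, complete with the example values that make the generated docs look realistic.)

definitions:
  User:
    type: object
    required:
      - id
      - email
    properties:
      id:
        type: string
        example: usr_123
      email:
        type: string
        example: jane@example.com
      name:
        type: string
        example: Jane Doe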

With the endpoints and models defined, there would now be enough information to generate the server with go-swagger:

swagger generate server -t ./generated

That’s it. Then all of a sudden, you see code show up in the generated directory.

The cmd directory contains the main function/entry point. You can go run that.
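For example (the directory name is derived from the spec’s title, so myapp-server here is just a placeholder; the generated server accepts flags such as --port):

go run ./generated/cmd/myapp-server/main.go --port 8080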

The models directory contains… models.

And the restapi directory contains the endpoint handlers. The configure_myappname.go file contains hooks so that you can plug in your logic.
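Here’s a rough sketch of what plugging in a handler looks like (the type and function names are generated from the spec, so MyappAPI and the ListUsers names below are placeholders tied to the hypothetical /users example from earlier):

// Excerpt from generated/restapi/configure_myapp.go.
// The operations and models packages are generated by go-swagger;
// swag is github.com/go-openapi/swag.
func configureAPI(api *operations.MyappAPI) http.Handler {
	// Swap the generated "not implemented" stub for real logic.
	api.ListUsersHandler = operations.ListUsersHandlerFunc(
		func(params operations.ListUsersParams) middleware.Responder {
			users := []*models.User{
				{ID: swag.String("usr_123"), Email: swag.String("jane@example.com")},
			}
			return operations.NewListUsersOK().WithPayload(users)
		})

	return setupGlobalMiddleware(api.Serve(setupMiddlewares))
}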

So with the API and models generated, I now wanted to generate a microservice.

After looking at various microservice frameworks, I chose go-micro: it had lots of stars, the example code looked simple enough, and the readme had cool buzzwords like “protobuf”, “service discovery” and “load balancing”. So why not?

The thing is, I wanted to generate code from my Swagger file, whereas the go-micro framework uses the protobuf language to describe models and interfaces. So before moving forward, I’d need to convert my Swagger file to a proto file. To do so, I found a tool called openapi2proto, maintained by The New York Times. There wasn’t really another viable alternative, so I crossed my fingers… and it worked. I now had a myapp.proto file that contained all my Swagger declarations, but in proto format, like so:
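(What follows is a trimmed, illustrative fragment rather than the real file; the message and service names mirror the hypothetical /users example from earlier.)

syntax = "proto3";

package myapp;

message User {
  string id = 1;
  string email = 2;
  string name = 3;
}

message ListUsersRequest {}

message ListUsersResponse {
  repeated User users = 1;
}

service MyappService {
  rpc ListUsers(ListUsersRequest) returns (ListUsersResponse);
}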

From there, the protoc tool was used to generate the microservice code in Go. I’ll be honest. There was a lot of tinkering required to create a shell script that generated working code, because Golang is very picky about directory/package structure, and code generation tools rely on that stuff to do their job. But after chaining together a few of the tools and adding additional hacks to tweak the generated files, I ended up with an ugly script that generated lovely results.
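The real script has too many project-specific hacks to be worth sharing, but stripped down to its core it just chains the two generators, roughly like this (paths and output directories are placeholders, and you’ll need protoc, protoc-gen-go, protoc-gen-micro and openapi2proto installed):

#!/bin/sh
set -e

# Convert the Swagger spec into a proto definition.
openapi2proto -spec swagger.yml > proto/myapp.proto

# Generate Go types and go-micro service stubs from the proto file.
protoc --proto_path=proto \
       --go_out=services/myapp/proto \
       --micro_out=services/myapp/proto \
       proto/myapp.proto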

I ran into deep frustrations. I was on the cusp of saying “screw this” and building the microservices manually. But thankfully, with a bit more fiddling, I was able to automate the tedious parts. I can now create a new microservice in minutes, plug it in, and start implementing features. And if model changes are required, I just update the swagger spec, re-run the scripts and tweak a bit of code.

After about two days of foundation work, I now have an MVP backend. I try to keep things as simple as possible in the beginning, because once I add Docker and nginx to the mix, the complexity will only grow.

But for now, I’m going to jump into some React Native code. I’ll try to generate JS classes and maybe a JS client from the swagger spec to have generated code across the full stack.
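I haven’t picked the tooling yet, but swagger-codegen can produce a JavaScript client from the same spec, so my starting point will probably be something like:

swagger-codegen generate -i swagger.yml -l javascript -o ./client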
