The problem might not be COBOL

At the beginning of the COVID crisis, the governor of New Jersey called for COBOL developers to help the state fix its unemployment insurance systems. The systems were crashing under overwhelming demand, and the culprit seemed to be the decades-old code underpinning the application for benefits. Predictably, critics jumped all over the government for continuing to operate vital systems using a programming language invented more than 50 years ago.

News flash: the C programming language is consistently among the most popular languages in the world, and it’s nearly as old as COBOL. SQL is the backbone of database programming, and it was invented in the 1970s. The age of a technology is no indicator of its viability or appropriateness for solving mission-critical problems. The fact that something is old may just mean that it has stood the test of time.

As it turns out, New Jersey ultimately didn’t hire any COBOL developers because the problem wasn’t really COBOL; it was something more quotidian and less dramatic: the ability to process forms on the internet in a way that scales. The problem New Jersey and other governments have isn’t that they’re using old code or older programming languages. The problem is often that they’re using code that wasn’t designed to meet the needs of the people who use it.

The frustrated users of New Jersey’s unemployment insurance system shouldn’t necessarily blame the errors they encountered on COBOL. They ought to blame them on a system architecture that wasn’t built to handle increased demand, or on an interface that was copied directly from a PDF instead of being designed based on intensive testing with users. Blame them on systems that can’t talk to each other behind the scenes, so they keep asking people to re-enter their information. Blame them on a complex authentication process for tasks that can be done without requiring someone to create an account or sign in.

Getting good at online forms

Governments need to see themselves as digital services providers, which means investing in the product, user experience, and research skills necessary to understand the problems of people using their services, and the software engineering and operations skills that make the services reliable and available when people need them. Simply putting out a procurement for a new software project, without requiring these specific capabilities from the delivery team, will not guarantee that the result is easy to use or able to stand up to a huge increase in demand.

To start, governments need to develop competency at putting forms on the internet and having them scale to meet demand, while integrating smoothly with backend systems and office processes. This skill is at the core of nearly all modern government digital services, yet many agencies continue to operate online forms that are the front door to vital services but can’t compete with the user experience and reliability of common consumer apps.
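As a sketch of what a form that scales can mean in practice, here’s a minimal, stateless submission handler. The endpoint shape, field names, and queue are hypothetical illustrations, not any agency’s actual system: the handler validates input fast and hands accepted records to a queue, so the web tier stays stateless and can scale horizontally while a slower backend drains submissions at its own pace.

```python
import json
import queue

# Hypothetical field names for a benefits form; a real application's
# schema would come from user research with the agency.
REQUIRED_FIELDS = {"name", "ssn_last4", "last_employer"}

# In production this would be a durable message queue rather than an
# in-memory one, so no submission is lost if a web server restarts.
submission_queue = queue.Queue()

def handle_submission(raw_body: str) -> tuple[int, dict]:
    """Validate a form submission and enqueue it; return (status, body)."""
    try:
        form = json.loads(raw_body)
    except json.JSONDecodeError:
        return 400, {"error": "request body must be JSON"}

    missing = sorted(REQUIRED_FIELDS - form.keys())
    if missing:
        # Reject quickly with a specific message instead of a generic
        # crash page, so users know exactly what to fix.
        return 422, {"error": f"missing fields: {', '.join(missing)}"}

    submission_queue.put(form)  # backend processes it asynchronously
    return 202, {"status": "queued"}
```

Returning 202 Accepted here is the key design choice: the public-facing tier acknowledges the submission immediately instead of making the user wait on (or time out against) a backend that was never built for web-scale traffic.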

Creating online forms that are easy for the public to fill out and easy for office staff to process is absolutely within reach for government agencies if they’re willing to follow the best practices of consumer technology in designing and operating these services.

At Ad Hoc, we start by bringing in product management and user research experts to work closely with agencies to investigate the main problem to solve and define the outcomes agencies are seeking. We take in all the constraints of government, from privacy to security to accessibility, and balance those with the researched needs of real users, informing agency stakeholders of the tradeoffs and empowering them to make effective choices. We conduct usability research with people who come to government websites looking for services to find out where they get stuck and what could be done to make the experience better.

Our research team also spends time with the civil servants who process incoming requests from the public to see where the bottlenecks are in their process and how we can ensure any new public-facing application integrates with their backend systems. That often means we have to integrate new applications with old systems and “old” code like the COBOL in New Jersey’s unemployment insurance system.
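Integrating a new application with an old one often comes down to translating between data formats. As a hedged sketch, the adapter below renders a modern JSON-style submission as the kind of fixed-width record that mainframe batch systems typically consume. The field names and widths are hypothetical; a real integration would take the record layout from the agency’s own documentation (in COBOL systems, typically a copybook).

```python
# Hypothetical fixed-width record layout: (field name, width in chars).
# A real layout would be dictated by the legacy system's documentation.
LAYOUT = [
    ("claimant_name", 30),
    ("ssn_last4", 4),
    ("weekly_wage_cents", 9),
]

def to_fixed_width(form: dict) -> str:
    """Render a dict as one fixed-width record for a legacy batch job."""
    parts = []
    for field, width in LAYOUT:
        value = str(form.get(field, ""))
        if len(value) > width:
            # Fail loudly rather than silently truncating someone's data.
            raise ValueError(f"{field} exceeds {width} characters")
        # Left-justify and pad with spaces, as fixed-width formats expect.
        parts.append(value.ljust(width))
    return "".join(parts)
```

An adapter like this lets the public-facing service evolve independently while the legacy system keeps receiving exactly the records it has always processed.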

User research, both at the beginning of a project and throughout development, informs how our product managers set “north stars,” or shared visions of future success, and outcome metrics with the government customer. As our engineers and designers create these new services, they’re always working towards outcomes for real people. We ask: How many people were able to enroll in a benefit? How often did the site go down and deny access to the application? We prioritize outcome-based metrics over how many features we shipped, how many story points we completed, or whether we’re using the latest technology to get the job done.

Choose what’s right over what’s old

Yes, agencies should evaluate whether their continued investment in COBOL and mainframes makes sense given their unique constraints of available budget and technology resources. By the same token, they should continuously examine any system, regardless of its technology stack or the age of particular components, to see whether it’s delivering the outcomes they want within the constraints they have.

Agencies should also fix the errors of the past and invest in new and more efficient technologies. No civil servant should have to print out forms submitted online and re-enter the information into another system. But we must be precise about what needs updating and why, and not be dismissive of technologies merely because of their age.

The so-called “von Neumann architecture” describes how nearly every computer on the planet is designed and operates. It dates back to the mid-1940s and is, for all intents and purposes, unchanged to this day. We would hardly blame New Jersey’s woes on von Neumann computers, just as we shouldn’t blame forms crashing on COBOL code just because it’s old.