The writing of software is only a small part of being a software developer. The other things that surround it — the human factors, the acceptance of business leadership and decisions that lie outside the developer’s control, the grasp of the backstory of how a code base came to be precisely what it is, and the comprehension of why a change needs to be made and of how and where to make it — all of these combine to describe what it means to be a software developer. The actual writing and shipping of code is the very last step in an important process involving many people within a company, all of whom possess varying degrees of leadership and influence. This upstream process, though the developer may deem it tedious and plagued with inefficiencies, is crucial; it’s what separates startups that one day hope to be relevant from established companies that pay their way, turn a profit, and drive industry.

Any intelligent punk can write new code from scratch. That is what is taught and learned in university. That is the fun part, the easy part. Knowing how to implement change in a preexisting, and therefore antiquated,1 project that’s already being used in production is the part that brings home the bacon. Knowing how to do this within the confines, constraints, and challenges of a technical team, whilst striving to maintain the integrity of the code, is the job of the software developer. It’s a high calling, and the biggest mistake a developer can make is to presume that it’s all about the code. The code is a means, not an end. The code needs to be the best it can possibly be, of course; but the code is not the purpose for which the company exists. It’s not about the code.


  1. Any software that has managed to survive long enough to see the light of production is by definition antiquated.