Kady M.
6 min read · Feb 9, 2019


Advocates for this “learn to code” movement argue that all jobs in the future will somehow be impacted by coding.

Well, true. But for some reason, people seem to not understand the history of this profession.

“Coding” is simply a new word for “programming,” and in this article I’ll use the traditional terminology. My goal here is to explain to the programming newbie what drives the need for programmers, and why, if you’re a student-to-30-something learning to code, you’ll be switching jobs again before you’re 50. (Which will also explain why the basic premise of the OP’s post is correct; adding coding to the base high school curriculum would be a mistake.)

So, just for fun, let’s outline the major shifts in computing technology to date.

NOTE: I am just winging this off the top of my head. Don’t get anal with me by pointing out what I missed, or what you disagree with. It’s a quick summary of a complex process, and we’re just going to use it for illustration, not anything specific.

  1. Early development (1950s–1960s). Mainframe computing, starting with many different vendors entering the marketplace, though only the largest corporations could afford the machines. Computing languages differed from system to system, but the period ends with the IBM System/370 as the de facto computing standard and COBOL as the de facto programming language.
  2. 1970s. Minicomputers enter the scene. FORTRAN sees a resurgence alongside COBOL, as scientific and digital-automation computing had slightly different needs. Small business systems start to appear (e.g., the IBM System/32 and System/34), driven by the programming language RPG.
  3. Personal computing enters the scene in the 1980s. Again, a plethora of competitors using different computing languages resolves down to the IBM PC/Microsoft combination. Program development is primarily done in BASIC.
  4. Client/server computing (1990s). The need for more compute power on the desktop leads to companies such as Sun Microsystems providing that power with UNIX and RISC technology. BASIC proves unsuitable for high-performance computing, as do COBOL and FORTRAN; C becomes the workhorse. Object-oriented programming begins to appear, with C++ being the most prevalent language. Programming tools such as IBM’s VisualAge also appear, and 4th-generation languages (4GLs) are developed.
  5. In the 2000s, internet technologies take hold. HTML becomes the internet’s common language. This period brings rapid growth for high-performance UNIX vendors such as Sun, but that UNIX/RISC era is quickly eclipsed by ever-higher performance from Intel chips running on standard PCs and Windows. Java emerges as the core programming language of the internet.
  6. In the late 2010s (i.e., today), the core internet technologies begin to be supplemented by cloud computing and microservices: single-purpose programs that run in the cloud to do very specific things for programmers. Java remains the primary internet programming language, but demand jumps for individuals who can script key pieces of this new infrastructure, such as Kubernetes and Kafka, along with programmers in R and Python.

Now, let’s drill down a bit on what happens to programmers during each of these periods. I’ll use two of them as examples, but the same pattern recurs in every period: at the beginning there is a large but temporary increase in the number of programmers needed, and by the end the number employed per unit of output has dropped sharply.

During the beginning of period (1), programming was literally done by wiring: the programmer physically arranged wires inside the computer (or on plugboards) to make it do what he or she wanted. That was obviously labor intensive and limited sales of the systems, so almost immediately the designers went to work making programming easier. First they built facilities that let a programmer write instructions for the machine directly (assembler); when THAT proved unproductive, they developed techniques that would take a program written in some semblance of natural language (COBOL) and have another program (the compiler) translate it into something the computer could execute.

Bottom line: in the beginning of the period you needed a LOT of programmers to produce a given output; at the end of the period, you needed far fewer (like, maybe ONE) to do a job that previously took maybe a hundred people.
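Just to make that concrete, here’s a tiny sketch of the same job written at two levels of abstraction. It’s in Python purely for illustration (the real history involved assembler and COBOL, not Python); the point is how much bookkeeping the higher-level form takes off the programmer’s plate.

```python
# Illustrative only: the same task at two levels of abstraction.
# (The real history involved assembler and COBOL, not Python; this just
# shows why higher-level tools need far less programmer labor.)

payroll = [1200.00, 950.50, 1100.25, 875.00]

# "Assembler mindset": every step is spelled out by hand.
total = 0.0
i = 0
while i < len(payroll):
    total = total + payroll[i]
    i = i + 1

# "Compiler mindset": one statement expresses the whole job, and the
# language implementation does the bookkeeping for you.
total_hl = sum(payroll)

assert total == total_hl
print(total_hl)
```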

Now, let’s skip up to period 5. Same thing. When the internet began to gain popularity, it was because of a simple language (HTML) that could easily be interpreted by another program (a browser) to provide a visual layer, while at the same time accessing data in other locations on the ‘net and bringing that data back to the browser.

HTML is easy to write but labor intensive and kind of a pain in the arse to maintain; so the next thing the smart people did was create other programs to write the HTML for you; all you needed to do was work graphically on the screen (NetObjects, Dreamweaver) and the system would output the HTML. Suddenly, you needed many fewer HTML programmers.
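If you want a feel for what “a program that writes the HTML for you” means, here’s a toy sketch in Python (not how NetObjects or Dreamweaver actually worked internally, just the general idea): the designer supplies a simple description of the page, and the tool emits the markup.

```python
# Toy illustration of tool-generated HTML: the "designer" supplies a
# title and a list of items, and a program writes the markup so nobody
# hand-edits HTML page by page.

from html import escape

def render_page(title, items):
    """Emit a complete HTML page from a title and a list of bullet points."""
    bullets = "\n".join(f"    <li>{escape(item)}</li>" for item in items)
    return (
        "<!DOCTYPE html>\n"
        f"<html>\n<head><title>{escape(title)}</title></head>\n"
        f"<body>\n  <h1>{escape(title)}</h1>\n  <ul>\n{bullets}\n  </ul>\n"
        "</body>\n</html>"
    )

if __name__ == "__main__":
    print(render_page("Our Products", ["Widgets", "Gadgets", "Gizmos"]))
```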

Same bottom line. In the beginning of the internet period, you needed a lot more HTML programmers to write and maintain a website of X pages than you do today. A LOT more.

Today, we are just starting a new period in the history of computing, in which companies are seeing economies of scale by moving computing functions off their own premises and into the cloud. As mentioned in (6), this will require a load of new programmers (aka “coders”) fluent in microservice technologies over the next decade or so, as existing programs are replatformed and modernized.

But, coders, know this from computer history:

  1. Software companies are already working overtime to put you on the unemployment line; both they and your employer want to decrease how labor intensive the replatforming to microservices is. To that end, they are creating other software programs that will provide simple, higher-level user interfaces for these microservices. These are traditionally known as 4th-generation languages, although the term “agile tooling” is a modern synonym. If you’re writing scripts to drive and link these services today (there’s a sketch of what that looks like after this list), you won’t be doing that in a couple of years. Somebody with much less skill than you, but who understands the higher-level programming tool, will be doing your job.
  2. Also, there is a new “player” that we haven’t seen before, one that makes the above even more drastic in terms of shortening coding careers: AI. Until now, creating a higher-level visual programming tool was simply a matter of building an interface that a user could use to write a program efficiently; now, AI is going to make those tools even MORE productive and “agile” than previous generations of 4GLs.
  3. The need for programmers during a new period is largely driven by the replatforming of existing programs onto the new technologies. This is much like the employment estimates for, say, the Keystone pipeline: Keystone creates 20,000 jobs during construction, but only 2,000 of those jobs survive after construction is complete. Same thing here. Once the replatforming period winds down at the end of the next decade, only about a tenth of the programmers needed today will be required for ongoing maintenance of the software assets.
  4. Finally, and depressingly, it must be mentioned that coding is an easily outsourced profession. US programmers are always in competition with those in China and India (and I expect even more developing nations, including in Africa, to come online over the next decade), which at best holds down US wages and at worst allows the US coder to be replaced by an overseas counterpart living in a nation where $15 an hour buys you a house, two cars, a maid, a driver, and private schools for your kids. (I do not exaggerate, btw; I’ve lived in these places.)
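To make point 1 above concrete, here is roughly what “writing a script to drive these services” looks like today. This is a minimal sketch that assumes the third-party kafka-python client and a broker at localhost:9092 (both assumptions; your stack will differ), and it’s exactly the kind of glue code that higher-level tooling is being built to generate for you.

```python
# Minimal sketch of the kind of "glue" scripting described in item 1:
# publishing an event to Kafka. Assumes the third-party kafka-python
# package and a broker reachable at localhost:9092 (adjust for your
# own environment). Illustration only, not a production pattern.

import json
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Each "event" is the kind of small, single-purpose message that
# microservices pass between themselves.
producer.send("orders", {"order_id": 42, "status": "shipped"})
producer.flush()  # block until the message is actually delivered
```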

To conclude, I will simply say that the coding “craze” I see today disturbs me, because from history we know it’s not a long-term employment trend. To stay in it for an entire career takes agility on the programmer’s part (they must constantly be learning the “next new” technology to stay employed), luck (your employer is always trying to find somebody overseas to replace you), and an ability to manage money, since compensation from job to job may NOT always move in a positive direction.

My $.02.
