In Which The Author Conducts A Technical Interview On The Shoulders Of Giants
I was in the gig economy before the gig economy was cool, you know. In the past two decades I've been a consultant, a third-party service provider, a contributor, a self-employed technocrat, and just a plain old temp. Sometimes it's been remarkably lucrative and often it's been remarkably depressing. Sometimes both at once.
Last month I said goodbye to the last of my technical support customers and took a new full-time contract. I'm surrounded by very nice people there. The commute is short and the parking is free. It's a little hard not to get depressed when I'm sitting in my cube and seeing all my various journo-friends and journo-foes traveling the world and enjoying all the perks that the business can provide. The best I can tell myself is that Hawthorne started his adult career in the Custom-House and Melville spent his final years there. At the end of every day I do not worry about whether I sold my integrity cheaply or whether I failed to fight for the truth on any given subject. Such things have no meaning or relevance in the eternal late-fall twilight of that seventy-four-degree fluorescent office building.
I make no decision on any subject beyond the technical. There are a thousand dreams and ambitions in that building and there are people who are compensated to a truly stunning degree and there are people who spend their lunch hours in worried dialogue about bills and childcare but it matters not to me. I show up in the morning and I do my work and I leave and by the time the oil is warm in my CB1100 on the road home all thoughts of the job have slipped from my mind with that same light viscosity.
This past Wednesday I assisted one of the fellows in the office in conducting a short technical interview. It was my intention to ask a couple of bland questions and shut up, but things got a bit out of hand.
My first thought as I read the applicant's resume: I really am older than I realize. He had more than a decade's worth of experience in the business and two degrees from our decidedly non-prestigious local quasi-universities but he was still fresh-faced and he was sufficiently excited about "neat tech stuff" to include a link to his personal tech blog in the header, right below his address but above the link to his programming repository on GitHub.
For those of you with no interest in modern computing beyond that of the consumer --- and I know that's most of you here --- I should give a brief description of how it works. Once upon a time, every company had a "mainframe". This was a massive computer that was "time-shared" into thousands of tasks. You accessed the mainframe through a terminal, which was like a typewriter that could also type back. Eventually video terminals became common. The last place to get rid of video terminals was probably the ticketing counter at your local airport, by the way, because speed and direct connectivity are important when you have thousands of reps making changes in the system at once.
Mainframes were specialized beasts that required round-the-clock care and feeding by a group of specialized operators. They could be remarkably difficult people --- turn down the volume on your computer and check out The Bastard Operator From Hell to get a fictional, but reality-based, idea of how it used to be.
From mainframes we went to VAX and UNIX systems. From there we went to so-called personal computers which became bigger and more powerful as the years went on. Eventually we got "servers", which are computers that share a design with your laptop or your desktop computer but which are vastly more capable. Then we got "enclosures", which have sixteen or so servers plugged into a single network and storage backbone. When I worked for Honda I was tasked with keeping the enclosures alive, which was remarkably difficult because they were hugely flawed and massively fussy machines.
The current trend in computing is virtualization. Each server runs simulations of multiple servers inside itself. When you hear about "cloud computing" and stuff like that, we are talking about virtual computers that are actually programs running on a real computer. There are pros and cons to this method.
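For the technically curious, the trick is less mysterious than it sounds: a virtual machine is just another program running on the real machine. Here's a minimal sketch, assuming the open-source QEMU hypervisor is installed; the disk image name is a placeholder of mine, not a real file.

```python
import subprocess

# A minimal sketch: the "virtual server" is just a process on the real host.
# QEMU, a common open-source hypervisor, boots a guest operating system from
# a disk image. "guest.qcow2" is a placeholder path.
subprocess.run([
    "qemu-system-x86_64",
    "-m", "1024",            # give the guest one gigabyte of memory
    "-smp", "2",             # and two virtual CPUs
    "-hda", "guest.qcow2",   # the guest's "hard drive" is just a file on the host
    "-nographic",            # no video window; the guest's console uses this terminal
])
```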
For a while, people were satisfied with virtual computing. But then somebody had the bright idea of adding another layer of abstraction, called "containers". Each virtual computer can run many "containers" and each container is a partial computer with a single task. Maybe it's a Web server. Maybe it's a game host. Maybe it's a database query engine. But the idea is that each container is separate from the others and cannot "contaminate" the other containers.
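To show how little ceremony is involved, here's a minimal sketch using the Python Docker SDK against a local Docker daemon; the image and the port number are illustrative choices of mine, nothing more.

```python
import docker  # the docker-py SDK; assumes a local Docker daemon is running

client = docker.from_env()

# One container, one job: a throwaway web server built from a stock image.
web = client.containers.run(
    "nginx:alpine",            # a standardized, pre-built image; nobody hand-tunes it
    detach=True,               # run it in the background
    ports={"80/tcp": 8080},    # expose the container's port 80 on the host's 8080
)

print(web.short_id)  # the container is up, walled off from its neighbors
web.stop()
web.remove()
```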
So. Each enclosure hosts many servers. Each server hosts many virtual servers. Each virtual server hosts many containers. You've seen this all before: it's a nesting doll.
The question is: why? Why do all of this? What's changed over the past thirty years to require the nesting-doll configuration? If you ask the average marketing-bot jerkoff from LinkedInLand he will tell you a bunch of crap about scalability and convenience and whatnot but the real reason is this:
There are no more skilled operators, and even if there were skilled operators, nobody would hire them.
The operators, and their descendants known as "professional sysadmins", were unpleasant people with strong signs of autism spectrum disorder who rarely had any respect for any other aspect of the company, particularly anything having to do with sales, HR, and marketing types. I know because I've been one of them. We viewed our job as preserving the environment. Period. Point blank.
A good team of operators could accomplish miracles. They could run airlines and banks with computers that couldn't hold a candle to the iPhone in your hand in terms of power or speed. But woe betide the company executive who made plans and expected that the operators would bend to his whim the way the people in the marketing department or the sales department or the cafeteria did. We did not work that way. We served the machine, and we knew what the machine needed. Simple as that.
This philosophy was efficient and it was cost-effective but it did not account for the American corporate ego. So it was replaced by virtualization. With virtualization, and with containers, nothing has to be done right. You just throw hundreds or thousands of identical containers at the problem. Is your website slow because the code is lousy and the database is junk? Add containers. Do your agile scrum programmers write spaghetti that destroys systems and sucks resources? Just press a button and a thousand extra containers will solve the problem. The containers are standardized images, knocked out with no thought as to the efficiency or elegance of their operation. They solve problems the way Grant won the Civil War --- through attrition.
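I am not exaggerating about the button. Here's a rough sketch of what "add containers" amounts to in practice, using the official Kubernetes Python client; the deployment name is hypothetical.

```python
from kubernetes import client, config  # the official Kubernetes Python client

config.load_kube_config()  # use whatever cluster your kubeconfig points at
apps = client.AppsV1Api()

# "Press the button": ask for a thousand copies of a hypothetical deployment.
# Nothing about the code running inside the containers gets any better;
# there is simply more of it.
apps.patch_namespaced_deployment_scale(
    name="shop-api",
    namespace="default",
    body={"spec": {"replicas": 1000}},
)
```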
The young man sitting in front of me for this interview was a demonstrated master of containers. He had recent and significant experience with all the trends and fads in containers, all the tools to manipulate them, all the environments in which they could be deployed. He'd worked with every kind of container-related tool or enhancement you could imagine. When he spoke, it was with an effortless flow of concepts and buzzwords related to containers. I was pretty sure that I could learn a lot about containers just by hearing him respond to difficult interview questions, but it seemed only fair to start with the easy ones. He said that he "loved Linux", so that's the direction I took.
"What's the difference between Open Source and Free Software?" This central distinction is the most critical aspect of any discussion regarding Linux. But he didn't know. He didn't know the canonical definition of Open Source, which is that the programming code is available. He thought it meant that there was a community for a given program. Regarding Free Software, he'd never even heard the term.
"Okay... What are the seven OSI layers?" Every sysadmin knows what they are. You need to understand them to perform even the simplest network troubleshooting. But he didn't know.
"Okay... Can you give me a sense of what's in a packet?" He smiled and said, "Gosh, someday I'd like to learn that." Ten years ago, if you couldn't at least describe a packet in generic terms you couldn't get a job as a $10/hr PC tech --- but this kid was about to earn seven or eight times that.
"Can you explain the difference between SIGHUP and SIGKILL?" There was a time when you weren't allowed to even use a video terminal without knowing those signals. But he gave me a genial response about how he'd always been confused by that.
I decided to switch up a bit and ask him a question about "pods" in the container program known as "Kubernetes". He was articulate, knowledgeable, and correct, offering an explanation that surpassed my own understanding of the products. A few more questions showed that when it came to modern computing, he was spot-on. What bothered me was that he had no, and I mean zero, understanding about how computers actually work.
No, scratch that.
What bothered me was that he had no understanding about how computers work --- and it doesn't matter. I recommended him for the job without hesitation. I think he'll be better at this particular gig than I am.
When I started programming a TRS-80 in 1980, it was expected that any potential or aspiring computer user would acquaint himself with the innards of the machine, from the assembler language used for programming it to the individual microprocessors under the hood. As one of the early Linux sysadmins and advocates in the nineties, I had a working knowledge of the C programming language that was used to build UNIX. Without it, I would have been unable to understand and troubleshoot many of the problems I encountered. Even as recently as 2012, my mildly intimate knowledge of networking was absolutely crucial to solving some very thorny problems that were costing my client millions of dollars an hour.
In 2017, that no longer matters. Turns out that while the so-called geeks and nerds weren't looking, there was a war being fought. The purpose of that war was to take control of the environment away from technical people and put it into the hands of non-technical people. Nowadays, you order computing resources the same way you order any other bulk product for your corporation. It costs more than it used to, and it doesn't work quite as well, but what matters is that you've freed yourself from the tyranny of the operators in the basement.
Instead of having mysterious computers controlled by people who often literally referred to themselves as "wizards", you have IaaS, Infrastructure As A Service. You turn a crank and the computing stuff comes out, like meat. It's flavorless and characterless and it's only just as good as it needs to be, but you can adjust the volume of the delivery and it's all done through control panels on a webpage. Yes, there are still wizards out there, but they work at the infrastructure companies and they are sourced from overseas if possible so the balance of power is on the side of the contracting company that holds their visa.
Sitting at that interview table, I had a flash of self-understanding. I realized that the reason I'd always cared so much about hand-crafting in suits and shoes and bicycles and everything else was because I saw myself as an artisan of sorts, crafting delicate and masterful solutions in code or language. Most of it would go unnoticed by my users or my readers, but every once in a while a fellow sysadmin or classically-educated reader would recognize what I'd done and I would have that thrill of knowing that my efforts to provide the best work possible had been recognized.
Well, those days are mostly over. Nobody gives a shit about optimizations or elegance in computing anymore. Not when ten thousand dollars a month can get you more processing power via IaaS than the entire world had in 1980. We stand on the shoulders of giants, at the apex of the work done by a million brilliant men and women that, in the end, allows us to deploy an insurance-shopping app more quickly by pressing an infrastructure button a few more times.
The end is likely coming for excellence in writing in general, and autowriting in particular, as well. AutoWeek recently did a "30 under 30" issue that showcased some utterly brilliant photography and some mind-numbingly poor articles that wouldn't have passed muster in a high-school composition class circa 1985. If this is the best of what the next generation has to offer then we might as well all just call up YouTube and let the various idiots there just defecate directly into our brains. I remain utterly gobsmacked by just how bad these Millennial writers are. The vast majority of them are functional illiterates whose internal library of learning, wit, and allusion contains nothing but an endless vista of dust-caked empty shelves.
There isn't much I can do about the first of these two situations. Whatever chance I had to alter the direction of computing fell apart when Spindletop fell apart after September 11, 2001. I was affiliated with those fellows and I thought we were going to make a difference but in the years that followed I decided I'd rather make a buck, a decision by which I continue to stand.
With regards to my latter complaint, however, I believe I shall stop cursing the darkness long enough to light a candle. It's my intention to conduct an online workshop for young would-be autowriters over the upcoming winter. There's plenty of talent out there; it only requires a bit of direction, a touch of guidance. I do not yet know how thorough or extensive such a workshop should be, nor the precise form which it should take, but something must be done and I'm just the man to do it. Confined though I may be to the daily grind, there is no reason why I cannot grind exceeding fine, so to speak. Stay tuned.