Coders

I’ll be really clear from the beginning. I could not code my way out of a virtual, damp, recycled, organically-grown hemp fiber bag if I was given the rest of time – not just my time but ALL time. Even if you enclosed me in said virtual bag with a stack of coding manuals and an appropriate set of computing gear, I would remain in the sack, alone, damp, and surrounded by pesticide, herbicide, and chemical fertilizer-free hemp forever.

But I have the greatest admiration for what coders have done with their various languages, from ones and zeroes up through the (seemingly) ever-expanding hierarchy of higher-level languages. I can’t list them all, nor would it be interesting to do so. But consider everything out there beyond my Firefox browser and WordPress website: out through my monitor, the HDMI cable that connects it to my video card, through a PCI Express expansion bus into the motherboard, BIOS, OS, machine language and whatever else it takes so I can see my own typing, and out through my I/O port and LAN cable to my cable modem/router and relatively efficient ISP. What the folks who have coded this whole improbable mess into a stable, functioning way to entertain myself and, occasionally, some of you, dear readers, have done is nothing short of miraculous. Well, I say miraculous, but it was all a product of human intelligence and our incessant drive to solve problems that many of our fellow creatures didn’t even know existed.

Programming – coding – started eking its way into existence in 1801 with the invention of the Jacquard loom, which was run by punchcards (fondly remembered to this day by anyone who took upper-level science, engineering, or computer science courses into the ’80s, and perhaps beyond). Charles Babbage, who had conceived his Difference Engine as early as 1821, intended to use this punchcard method to feed programs to his later Analytical Engine; neither machine was completed during his lifetime. His thinking about the Analytical Engine was remarkable and to a significant extent foreshadowed computing into the 21st century. Ada Lovelace (more properly Augusta Ada King, Countess of Lovelace, née Byron; 10 December 1815 – 27 November 1852), aside from being the only legitimate child of George Gordon, Lord Byron (Romantic poet, close friend of Percy and Mary Shelley, adventurer, and noted profligate), is credited with using her friend Babbage’s ideas to create the first computer algorithm.
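
Her program, the famous “Note G,” laid out the steps for computing Bernoulli numbers on the Analytical Engine. Purely for flavor, here is a minimal modern sketch of that same calculation in Python; the names and structure are mine, not hers, and the Engine, of course, was meant to run on gears rather than an interpreter.

```python
# A sketch of the calculation Lovelace's Note G performed: Bernoulli numbers,
# computed with the standard recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0.
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions."""
    B = [Fraction(1)]                      # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-acc / (m + 1))           # solve the recurrence for B_m
    return B

print(bernoulli(8))   # B_1 = -1/2, B_2 = 1/6, the odd ones after B_1 are 0, ...
```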

Countess Ada Lovelace (not your typical neck-beard)

Although not directly related, it is interesting to read about the development of automata (mechanical birds, toys, etc.) and devices like the music box and player piano, as these creations were proto-computers in a limited sense: they only did the one thing they were built to do, rather than perform a range of calculations. But onward!

In the 1880s Herman Hollerith began experimenting with punchcards (Hollerith cards) and paper tape as methods of feeding instructions to instruments he designed. In 1896 he started the Tabulating Machine Company, which later merged into what became International Business Machines (IBM). Punchcard tabulation eventually gave way to electronic machines: the Atanasoff-Berry Computer (ABC) was built to solve systems of linear equations, and the Bombe (1939) and Colossus (1943-1945) led to valuable breakthroughs in decrypting German communications.

Herman Hollerith, whose Tabulating Machine Company grew into IBM, noted soup strainer

Having enlisted in the U.S. Naval Reserve in 1943, Grace Murray Hopper worked in computing throughout WWII and, in 1947, while working at the Harvard Computation Laboratory, found the first literal computer bug: a moth trapped in a relay of the Mark II Aiken Relay Calculator, probably one of the simplest debugging jobs ever. She retired as a Rear Admiral in August 1986 and received the Defense Distinguished Service Medal.

Grace Murray Hopper at the UNIVAC keyboard, c. 1960. Grace Brewster Murray Hopper: American mathematician and rear admiral in the U.S. Navy who was a pioneer in developing computer technology, helping to devise UNIVAC I, the first commercial electronic computer, and naval applications for COBOL (common business-oriented language). Credit: Unknown (Smithsonian Institution)

We zoom ahead to the late ’50s and early ’60s: FORTRAN was developed starting in 1954 and released to the public in ’57, and COBOL followed in 1959, building heavily on Grace Hopper’s earlier compiler work. One of the first computer games, SPACEWAR!, appeared in 1962. Steve Russell created it in what he estimated to be about 200 hours; he never profited from his work, but it was later adapted into an arcade game (1977).

The internet started taking shape in the mid-’70s, and by 1992 the World Wide Web was opening up to limited, but ever-expanding, public use. I think I started using internal email on an HP minicomputer around 1983, but I could be off by a year. My academic advisor bought a TRS-80 from Radio Shack in 1982. It occupied a semi-sanctified place in his office and could be used only by those writing up their dissertations and the like. I bought my first computer in 1984, a skinny Macintosh, for about $2,500; it had a 7.8336 MHz CPU, 128 KB of RAM, and 64 KB of ROM. I bought the 400 KB 3.5″ floppy disk drive as a peripheral.

I played a game called Wizardry: Proving Grounds of the Mad Overlord. To play it I had to map my progress on graph paper as I moved down hallways and turned right or left (if there was a wall, I drew a wall; if there wasn’t, it was a turn or a room that needed exploring). You didn’t know what was going to happen until a text box popped up and told you to open something, cast a spell, smite something (and what that something was), and that kind of thing. To say it was rudimentary is to do it a kindness, but I was hooked. I just found out you can download various versions of the game to this day and play it, if you have an inexplicable nostalgia for doing things the old-fashioned way. I also found some YouTube videos, but they were in Japanese; you can go look at those if you’re interested. As far as I can recall, my game was in black and white.

I didn’t play anything else until Doom and Quake came out in the mid-’90s. Both were kind of terrifying; I introduced Quake to my workgroup (I was the manager) and we would play it on the company network after 5 PM. It felt like good team-building (all of them were better than I was so I got killed a bunch) and a little naughty (we were using company assets for our amusement).

Let’s jump again and visit the early ’00s, when The Elder Scrolls III: Morrowind came out for PC. For the time, it was a beautiful and innovative fantasy role-playing game (RPG). You started off at level 1 and went around trying to avoid all creatures and evil non-player characters (NPCs), as virtually all of them would kill you if you didn’t have the armor, weapons and skills to defend yourself (I use the word “kill” loosely, of course, as you could always reload a saved game and get out there again; annoyingly, I spent a good deal of time forgetting to save and losing a bunch of progress through absent-mindedness). You would go around collecting herbs, flowers, wood, a bunch of stuff that you could sell to a local merchant to upgrade your armor; all the while you were earning experience and gaining skills, making it more satisfying to wander off the beaten path and encounter bad NPCs who would try their best to send you back to your last save. I played Oblivion and Skyrim, both for unhealthy amounts of time. Then Fallout 3 and Fallout: New Vegas. I didn’t enjoy the Fallout games as much, as they are quite bleak in my view, but they were truly incredible pieces of modern storytelling at the same time.

In each case, I became increasingly fascinated by what was happening beyond the screen, in the world where lines of programming interact with ones and zeroes and become these astonishing worlds full of beauty: characters that talk to you, environments that go through diurnal and nocturnal shifts, in which weather occurs, in which the grasses on a hill and the flowers in a meadow blow in the wind. And in which little pieces of code sense that you are approaching, get really upset with you, and do everything in their programmed power to make you reload a game. You can almost hear them chortle fiendishly when you die, particularly if you forgot to save your game in the past hour or more of play (usually they just walk away and go back to whatever they were doing before you interrupted, which may include farm work, millwork, blacksmithery, marching and patrolling, buying stuff from local merchants, talking to each other, and other bits of coding brilliance).
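
For a sense of what those little pieces of code might look like, here is a toy sketch in Python. The radius, names and two-state “brain” are my own invention, not any particular game engine’s code, but the basic idea (check the player’s distance every update, then switch behavior) is the gist of it.

```python
import math

AGGRO_RADIUS = 20.0   # hypothetical detection range, in game units


def distance(a, b):
    """Straight-line distance between two (x, y) positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])


def update_npc(npc, player_pos):
    """Called every game tick: notice the player, or go back to chores."""
    if distance(npc["pos"], player_pos) <= AGGRO_RADIUS:
        npc["state"] = "attack"   # chortle fiendishly, draw weapon
    else:
        npc["state"] = "idle"     # farm work, millwork, patrolling...


guard = {"pos": (0.0, 0.0), "state": "idle"}
update_npc(guard, (5.0, 3.0))   # you wandered too close
print(guard["state"])           # -> attack
```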

But the folks who code games are up to much more than just making a game. They are also pushing the limits of what current commercially available computing will do and driving further innovations in central processing units (CPUs), graphics processing units (GPUs), operating systems (OSes, like Windows 10 or OS X El Capitan), audio processing (Creative Labs sound cards and the like), and so on. They are pushing for better home computing, better console computing (PlayStation, Xbox, Wii U, older models, etc.), more stable server and internet protocol technology, and more secure computing (getting your account hacked after you’ve leveled up and saved up a bunch of armor, weapons, gold, etc. is a horrible thing!). They are redefining the limits of technology for everyone in the world; as the gaming platforms increase in power, all other computing increases in power as well. While top-end “power” in computing might cost a bit more, entry-level computers fall in price and deliver that power to increasing portions of the world’s population. The whole human race lurches forward in its ability to (1) learn computing skills, even if they go no further than word-processing and spreadsheet applications, and (2) make a larger percentage of the world capable of getting 21st-century employment.

I play a massively multiplayer online role-playing game (MMORPG). I’ve been doing this in my spare time for the last 3.5 years, again for a huge number of hours. About 50,000 players from around the world play this game on any given day (it is hard to pin down a number, but even if you put ±20% error brackets around 50k, a whole bunch of people are playing this game, interacting with each other, chatting in the game’s text chat, and helping each other over a voice communication application). It is visually beautiful, a huge world with many environments, tough “boss” monsters, new play mechanics (gliding, updrafts, bouncing), new weapon and armor classes, and all of it pushes a good computer pretty hard.

There are a bunch of serious jobs for coders around the world as well. I wish, for instance, that the battle between hackers/virus-and-malware-writers and people who are just trying to use computers would get managed in some reliable way. It is unacceptable that personal bank and credit card accounts are hacked into. It is absolutely frightening that there is some probability that electricity/utility grids will be hacked, damaged and/or crashed. While I personally wish that governments around the world were more transparent in their information-sharing with their citizens, I also think it is dangerous to have troves of classified documents hacked and shared; my hope is that (1) systems will become more secure (although I don’t know how, short of disconnecting them from the internet) and (2) hacking will drive governments to be better at sharing without the hack.

I don’t know for a certainty how many jobs there will be for coders in the future – and “coders”, of course, is a catch-all term covering many different languages and specialties, so there is no single number for how many will be needed. I do know that there is an increasing push to get more women to learn coding and more young people to start coding at early ages.

https://girlswhocode.com/

https://code.org/

edx.org and coursera.org have a bunch of free courses in a variety of languages, whatever your particular interest (some charge for certificates, so be aware of that).

There is also sage advice out there if you’re thinking of a career in coding. This article from TechCrunch summarizes some issues you should know about (the title overstates the argument for the sake of rhetoric, but pay attention to what the author says):

https://techcrunch.com/2016/05/10/please-dont-learn-to-code/

But be aware that this is difficult, painstaking work that requires creativity in a language with many hard-and-fast rules. Every word, every punctuation mark, every number has a special and inflexible meaning that, if mistyped or misunderstood, may lead to hours or days of painful debugging (we’re not talking about fishing a moth out of the machine this time). Programmers, system administrators, database administrators, server administrators, security administrators, etc. may work long hours. Not every job is an adventure-in-wonderland gig like the ones available at Google or the other tech giants; some are just sweaty, long days hunched in a cubicle, staring into a monitor, slashing away at lines of code that may come to seem infinite in length and complexity, and full of unintended, perhaps insoluble, consequences. Some of these people have rich lives and many interests; some have normal personalities and live healthy lives. Some live on a steady diet of salty chips, caffeinated soda and stimulants, grow beyond their belt loops and amazing neck beards, and fail to understand humanspeak any longer because they have become extensions of the code in which they live. It’s all up to you whether you become a designer of infinite beauty or an updater of cell phone apps. It’s all up to you whether you learn enough, persistently, to stay employed or whether you fall by the wayside when new hot languages emerge to rule the problem sets.
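
To make that point about punctuation and numbers concrete, here is a tiny, entirely hypothetical Python example (not from any real project): one stray “- 1” quietly produces the wrong answer, and nothing crashes to tell you so.

```python
prices = [10, 20, 30, 40]


def total(prices):
    """Sum a list of prices (or so the author intended)."""
    t = 0
    for i in range(len(prices) - 1):   # bug: the "- 1" silently drops the last item
        t += prices[i]
    return t


print(total(prices))   # 60, not the 100 the rest of the program expects
```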

On the other hand, you’re helping define tomorrow – or might be if you’re good enough.

And so I bow humbly in all directions, so that all coders, whatever their mission, feel the respect I have for them. Keep pushing the limits and making new stuff. I can’t wait to see what you’ve been up to!

P.S. I am not a coder or a computing historian; I apologize for any liberties or oversights (which are many) that I have taken with the contents herein.

Featured image attributed to Duncan Hull @ https://www.flickr.com/photos/dullhunk/4833512699 under Creative Commons requirements for attribution.


Author: makingsenseofcomplications

I have an academic background in literature and, separately, science. My career has been in industry in positions of increasing responsibility assisting in the drug development process - one of the most amazing intellectual pursuits of the human mind, among many other amazing intellectual pursuits. I am interested in films, philosophy, history, art, music, science (obviously), literature (also obviously), some video gaming, human behavior, and many other topics. I wish there was more time in every day because we have a world that is full of amazing phenomena that are considered too superficially by too many. Although my first and last names are fictional, I think I believe in all of the stuff you read here, although I retain the right in perpetuity of changing my thoughts about anything written herein.
