How The Computer Lost Its Sparkle

Just before we launch into another boring ramble-a-thon about arcane and obscure pieces of hardware I just want to stop for a second and ask you to look at the actual machine that you're using (if you can). Now, unless you're really, really lucky, you'll probably be viewing this on a PC (or maybe a Mac). Just take a few seconds to look at that box sat either on top of, underneath or by the side of your desk (by the way, if you're not viewing this on a PC then please try to picture a PC otherwise the next couple of minutes are going to be very dull).

Chances are it's a rectangular box with hard lines, plastic and/or metal. Possibly it's silver or black or maybe, if you're unlucky, beige. There might be lights, fans and jazzy 'quirks' but it's a box which has pretty much everything neatly tucked away inside of it.

While it's safe to say that 'modern computers' (especially PCs) have certainly undergone something of an overhaul in the past couple of years, managing to ditch the beige and become far easier on the eye, the image of computing as a whole has changed massively, shifting from the 'obscure and mysterious' to the 'white box commodity' that is now epitomised by an all-in-one cardboard box that you can pick up from pretty much anywhere for just a few hundred of your hard-earned $s or £s - even supermarkets now sell systems.

It wasn't always like this and, if you've read any of my other ramblings on the subject (and, if not, why not?), you'll no doubt appreciate that I have something of a love of old computing. That's kind of the focus of today's waffle-a-rama: why and when did the computer lose its sparkle?

Why It's The Size of a House
It's safe to say that computing has never exactly been glamorous (and, for the benefit of useless so-called-pop-star Fergie, you'll notice that the word 'glamourous' has two 'u's and two 'o's in it). While teenage girls may swoon at pop stars, footballers and various other pretty boys, never have the likes of Bill Gates and Steve Jobs been pin-ups. Men (and women) may admire the sleek curves of a Ford Mustang or a Lamborghini, or marvel and coo at the technical brilliance of a V8 engine but, by and large, don't get excited about CPUs or the GFLOP rating of a 'muscle'-computer. We're not going to go into the psychology of this, the whys and wherefores, or even try to explain it in the slightest - it's just a fact.

In the early years of digital electronic computing (the 1940's for the purposes of this article) computing was all new and shiny. The public didn't know anything about this strange field of science. Yes, it had seen (or at least heard of) analogue computing in the shape of differential analysers, and the likes of the Hollerith (later IBM) tabulating machines had revolutionised offices and statistical work the world over. Digital electronic computing though was radically new to 99.99% of the population.

It's well known that ENIAC received massive public exposure and some people may even be able to cite it as the first machine of its kind (it wasn't by the way, but even finding someone who's heard of Eckert and Mauchly's behemoth is a rarity these days). Why did ENIAC receive any form of attention though? It was primarily designed to help the war effort (by calculating shell trajectories) so surely it should have been shrouded in secrecy. It should have been hidden away from prying eyes...shouldn't it?

We're not going to try to figure out the rationale behind the decision to reveal ENIAC to the public (there's little point - propaganda would be high on my list but I can't be bothered to investigate as I have far more interesting ways to spend my days) and simply accept that it happened. If you can though, try to picture the poor reporter assigned the job of reporting this...thing. Looked at from a purely objective perspective, everything from ENIAC right through to the shiny dual-core Intel- and AMD-hearted offerings under your desk and the obscenely powerful supercomputers is not exciting. They're machines that sit there crunching numbers. Faceless boxes filled with wires and circuits that don't 'feel', get excited or really do...anything.

Selling the idea of computing to the public was never going to be an easy task (it still isn't) but ENIAC... Here's a machine the size of a house, uses more electricity than X-thousand homes, never makes mistakes (yeah right) and can do calculations infinitely faster than a human can. That last point is the hook that did much to save computing when it came to the man-in-the-street and would, for many years, be the perception of computing adopted by the masses.

Mankind Is Redundant
What people saw for the first time in ENIAC (if we ignore the Zuse machines, the ABC, Colossus and a handful of others) was a machine that was smarter than a man. Mankind had, for the first time in history, managed to build something that was 'smarter'. We were sold visions of ENIAC (and its successors) solving all of the world's problems. The advertising men and spin doctors had to come up with a way to make computing interesting and they doggedly latched onto the 'smarter' aspect, conveniently ignoring the fact that computers, even to this day, are as dumb as fence posts and really aren't 'smart' in the slightest.

Even at its ridiculously slow pace though, ENIAC was faster than a man or woman at calculating and processing numbers. There's no question about this but that doesn't make it 'smart' or, if we use a more accurate term, intelligent. News articles and reports though painted a picture of 'electronic brains' that could 'think', answer riddles, defeat us at chess and run our lives far more efficiently than our pathetic human brains ever could. That was the vision, that was what the media was telling us.

Given this 'vision' it's easy to see why some people did get excited by it all. Computing (in the digital electronic sense) was new and it did open up a whole new world of possibilities but it wasn't what the media led the man-in-the-street to believe. The idea that somewhere there was a machine that was making decisions, processing vast quantities of data and 'thinking' was the stuff of science fiction but it was what the public tapped into.

It wasn't just the 'electronic brain' that captured people's imaginations though. In the age of ENIAC and UNIVAC computers weren't just rare but almost mythical. Nowadays computers are a part of everyday life with millions of them rolling off the production line like sausages. In the 1940's and 50's though the computer was a rare beast. Most people had barely heard of them and had certainly never seen them. There was an air of mystery about computers, like anything rare, that made them elusive and exciting. And when people did get to see them, computers were behemoths. Here was a machine that towered over you, stretched off into the distance and was physically impressive. It had stature, presence and made you feel small and insignificant by comparison. You would quake and bow down before the electronic monster that had granted you audience.

The Unknown Becomes Known
Although it's easy to look back and assume that computer science in the 50's and 60's leapt forward at breakneck speed and that every day heralded a new concept and advance, the truth is that things progressed just a little more sedately. That's not to say that progress wasn't rapid - science had a lot to do to get us ready for the 'push-button age' that the media had promised us.

Massive advances were made though and not only did machines become ever more powerful, reliable and compact (relatively speaking - the switch from valves to transistors brought machine size down but machines could still easily fill a whole room) but they were also becoming more accessible. With players like IBM having joined the industry (well, with IBM practically becoming the industry) the sheer number of computers installed around the world increased and exposure to computers became ever so slightly greater. With more people being exposed to what the computer could do, the mystique and air of mystery surrounding the computer was starting to erode. This became even more apparent when computers started to escape from the research and scientific community and appear in the business world.

That computers were useful was never in question but to increase sales, the likes of IBM had to make sure that their machines were ever easier to use so that more and more people could access (and therefore demand) the power of computing. From a user point of view this was great. From a sales point of view this was great. From the computer's point of view it wasn't so great.

The mysterious monolith that was once treated with total reverence was becoming more subservient as the years went by. The armies of white coated boffins that treated every component and connection with the greatest respect were being replaced by people who didn't know what a transistor was, how a logic gate worked or the intricate details of how to communicate in the machine's raw binary language. With the advances made in the industry they didn't need to know though. The power of computing had been harnessed so that it could help them to do what they needed to do.

Computers stopped being objects of awe and became tools that allowed people to do 'things' without needing to know how the computer actually did those 'things'. People didn't grab a screwdriver when the computer screwed up as a) the computer didn't screw up very often any more and b) if it did screw up, there was a support technician standing by to fix it. Users didn't need advanced degrees in computing or mathematics any more. The computer was becoming a tool that just did what it was supposed to do.

Let's Get Personal
Up until the early 1960's computers were still massive machines that, typically, users didn't interact with. Data would get taken away, fed in and the results passed back. There was very little in the way of interaction. There had been some advances in this area and DEC's PDP-1 (with its CRT display) had been a massive step forward but, by and large, computing meant 'hands off'.

It was work on concepts such as 'time-sharing' that really made the computer more 'personal' and advances in human-computer interfaces had allowed people to interact with the computer in real time. Unlike in the days of punch cards and paper tape, where users would create their instructions, send them off and then wait for the results to come back, the advent of displays and keyboards meant that they could enter commands directly into the machine and see the results there and then. This further downplayed the computer's power - it responded to users rather than users responding to it. For the first time users felt as though they were in charge rather than as though they had to bow down before their computing God and offer sacrifices in the form of their puny little programs.

With their remote terminals users could feel as though they were in charge but they still rarely got to see the beast that they were in charge of. It was the introduction of a new class of computer, the mini-computer, that allowed more users to glimpse the physical side of computing.

The mini-computer, while smaller than its mainframe big brother, was still an expensive and powerful piece of equipment but it further exposed more people to computing and computers. Despite still costing tens (often hundreds) of thousands of dollars, a mini-computer was often a financially viable option for firms that were nowhere near big enough to consider a mainframe machine.

While predominantly the domain of DEC, Honeywell and other smaller manufacturers, the mini-computer and the untapped sales market that it reached soon drew the attention of IBM and the other big players. IBM especially paid attention and, realising that firms didn't just come in one shape and size, developed a new concept in computing - the range. Previously machines had come in a one-size-fits-all form but IBM's new System 360 Series changed all of that as it offered a range of machines each of different specifications, capabilities and, most importantly, prices that could all still run the same software.

This 'range' set IBM apart and, with its famously formidable sales team, the adoption of the System 360 Series ensured that ever more people came into contact with the wonderful world of computing and that delving about inside the machine became ever less challenging as parts could be swapped faster and more easily than ever (the days of hand soldering components had been phased out years earlier).

Once again the status of the computer fell slightly. Not only was it no longer a demi-God the size of a house that ruled its subjects, but now it was easier than ever to configure the machine to march to the tune of the users and the men with the magic bags of money.

The Smallest Revolution of All
By the 1970's the computer had evolved from being a research project into an 'electronic brain' that would make life easier for us, and then into a tool used by millions of ordinary people around the world day in, day out. It was the arrival of the microprocessor that would finally strip away any mystique that the box of tricks once had.

Even in the form of a mini-computer the computer was still a big, expensive piece of kit that was, at the very least, the domain of business. The invention of the microprocessor changed all of that though. Machines now became minuscule in size, prices plummeted and people who used computers in business and education wanted to harness that power for themselves. It was from this that the micro-computer emerged and the computer finally fell that last few feet down to Earth.

Early micro-computers were almost a throwback to the likes of ENIAC and users had to not only be scientifically minded but, more often than not, had to be handy with a soldering iron. While it was true that micro-computers were cheap, they still carried a price tag and that price tag was no longer being picked up by a business but by ordinary people, and most ordinary people still couldn't justify or see the need to pay out hundreds of $s or £s for something that they didn't understand.

Apple went a long way to address this and the release of the Apple ][ created the first computer that not only included everything that users needed (an easy form of input and output via the in-built keyboard and the user's existing TV, and a means to easily load and save data via regular audio cassettes) but packaged it into a simple plastic all-in-one box. There were no wires snaking about, no precariously connected boxes and no spit-and-chicken-wire monstrosities that threatened to burn the house down at any given moment.

The computer had become something that you could walk into a shop and pick up from the shelf. Its stature had been lost and here was the power of 'computing' in a box that you could hold in your hands. It was small, compact, cheap(ish), neat and, in a sense, cute. A far cry from the towering room-sized machines, here was a computer that, while minuscule and woefully weak in technical terms (when compared to its mainframe and mini-computer cousins), was still a computer that could do things. The machine had lost its God-like stature and was now a commodity item that came in a cardboard box and shipped by the million.

The Beige Revolution
Once Apple opened the consumer floodgates seemingly every man and his dog released a micro-computer that stuck to the all-in-one design. It was what consumers wanted as it was (mostly) hassle free. Yes many machines soon had a swathe of peripherals and add-ons to plug in but they had all quickly acknowledged the keep-it-simple-stupid approach. Even when venerable IBM waded into the micro-computer market they went with the same 'simple' approach (albeit with an external keyboard) in the shape of the IBM PC.

Despite the different technologies under the case, the different shapes and sizes, keyboard layouts, connection sockets etc. just about every machine shared one thing: colour. Sadly that colour was beige and it would become the de facto standard for nearly two decades. The mighty computer, the towering brute that had seemingly appeared in every colour under the sun from the cool blues of the PDP-1 to the fiery reds of the System 360, and even the black monolithic ENIAC was now nothing more than a beige box. Yes the computers of the 1970's had slowly turned a lighter shade of brown too but while the mainframes and mini-computers could make up for it by sheer size, the poor micro-computer just looked plain and bland.

The computer became almost uniformly soul-less. A bland beige box that sat on, under or next to a desk - a servant with no stature, impact or presence to speak of. It didn't make the world quake, didn't contain any precious wonders or puzzles of intricate wiring (little more than a circuit board in many cases) and it didn't even look good while it was doing it. Some companies tried to buck the trend and while the Silicon Graphics machines looked resplendent in their purples, teals, crimsons and even blacks, they were easily cancelled out by the millions of bland, beige boxes that littered offices and homes around the world.

By the start of the 1990's if the man in the street wanted a computer then he could easily find one...just so long as it was beige. The computer had truly passed from a thing of wonder to another example of the throwaway society. The computer had once been an irreplaceable tool that would run for 20 or 30 years but now it was considered obsolete in just two or three and would happily be binned in favour of the latest model. Machines that had once inspired and astounded were now available to the masses and the computer lost its magic once and for all. Or had it?

Despite having done more to bring the micro-computer to the level of the man-in-the-street than anyone else through the Apple ][, Apple had quickly lost out to the seemingly unstoppable IBM PC (or rather the IBM PC compatibles) and by the 1990's was little more than a beige box builder. Yes Apple had retained their individuality via the Macintosh and their refusal to go down the PC route but their boxes were just the same bland, boring beige efforts that were so indicative of the state of the computer in general. They didn't stand out from the crowd, they were helping to choke the life out of the computer...and then came the iMac.

The iMac wasn't anything brilliant technically and it ultimately led further down the path of killing the mystery of the computer but it was a breath of fresh air nonetheless. Instead of beige the iMac brought colour and interest back to the computer. It made the computer chunky and funky and coo-el, and made it a point of interest again, kick-starting a whole new generation of designers who also dared to say no to beige (or maybe it just kick-started a whole new generation of customers who said no to beige...and the rest of the industry merely jumped on the bandwagon). All admirable points, but while re-invigorating the computer and making it fashionable and even artistic, the iMac pushed the computer even further into servitude.

As I write this PCs and Macs come in all shapes, sizes and colours. They have flashing lights, spinning fans, decals, artwork and all manner of baubles and bangles to separate them from the rest of the computer crowd. Strip away the paint jobs and chrome-effect plastic though and the modern computer is still bland. If anything, the addition of the baubles and bangles has actually turned the computer from being a commodity into being a fashion accessory. From an imposing monolith of infinite power with users queuing up to use it to a pink bauble sat on a coo-el girl's desk begging for attention.

It Ain't Got Soul
Don't get me wrong, I love computers and I think that making the computer more and more accessible to more and more people is a good thing. Modern machines are hundreds, thousands and sometimes millions or even billions of times more powerful than the likes of ENIAC and co. They have seemingly infinite amounts of memory (by comparison), they can do amazing things, they can connect us to nearly every point on the planet, furnish us with every piece of human knowledge, help us in just about every walk of life, catalogue, organise, record and store almost anything and everything. But they haven't got soul anymore.

I may look at something like a PowerMac G5 or a jet black IBM desktop, but they don't have soul. Yes they're stylistically and aesthetically stunning and I'll happily ogle them and yearn for them to be sat on/under/next to my desk...but they're lifeless. They rolled off a production line like millions of identical machines.

I'm too young to remember the days of ENIAC and even the System 360 but they excite me in a way that my HP desktop will never be able to. I once stood in front of the Colossus rebuild and was awestruck. The sheer size and complexity of it astounded me. Even to this day I have no idea what it was doing but I was absolutely smitten with this piece of computing history. Yes my G4 iMac might hammer it into the ground in terms of performance, aesthetics, ease of use and seemingly everything, but it doesn't have soul.

Machines don't have souls - they're inanimate objects. That's the logical conclusion...but logic doesn't explain why seeing a machine that takes up three filing cabinets is more rewarding than seeing a warehouse full of PCs. Modern machines are built by machines and, in many ways, designed by machines. The early computers weren't. They were designed by men and built by men. Modern machines may be more reliable, better manufactured and, ultimately, far 'better'...but they don't have soul in them. The computers built by men somehow capture the spirit of the people who built them.

As computers have become more widespread and produced by production lines, the magic (and soul) has diminished. The last machine to really have soul was, in my opinion, the Macintosh. With its hand-wrapped circuit board (at the design phase anyway) the early Macs seemed to be almost proud of what they could do and just shone 'quality'. Every curve, every button click, every noise, beep, icon and dimple on the case felt as though someone had deliberately thought about it. The Mac managed to capture the soul of the people who designed and built it (even if it too came off a production line).

Maybe I'm talking gibberish and you can't understand what I'm yammering on about but the early computers (up to about 1970) had that certain 'something', took a little bit of their builders' souls with them and were far more 'human' than they have any right to be. When I finish writing this I can take a screwdriver and take my PC apart. I'll know what all the parts do and how to put them back together. I'll also know that if I screw anything up, there's a swathe of shops that'll have replacement parts, and, if they don't, so what. Where's the fun in that?

The rush and excitement that the computer should give you is opening the case and not knowing what you're going to find inside. Popping a panel as big as a door and finding miles of wiring, actual components, bulbs, switches and things that you can actually identify and work on. Being baffled, mystified, confused and getting a sense of genuine fear about screwing something up (and not just being able to drive down to PC World to buy a replacement). There's a thrill in getting a machine to do something, even flash a light or make a beep, and know that it was a damn sight harder than popping a Windows XP CD into the drive and hitting the 'on' switch.

Using a computer should be an adventure, something that gets you buzzing. It should be a combination of excitement and fear. To take a machine the size of a house and make it do your bidding is exhilarating, it's inspiring. A machine that big feels like it can do anything (even though it probably can't) and it radiates real power. The hundreds and thousands of flashing lights, the clicking of circuits, the sound of tapes spinning, punch cards flowing. It might all be a lie but the 'electronic brain' is there to do your bidding. You command it and it will obey. It will level cities, solve the riddles of the universe, cross check your bank account details and still have time to play eighteen games of chess...all in the twinkling of an eye.

The modern computer doesn't have any of that. The modern computer doesn't have soul.

Site designed and maintained by TheNeil. While all content is checked and updated regularly, the author cannot be held responsible for any broken links, incorrect information or damage caused to hardware or software. Comments, contributions and criticism always gratefully received.

Site Last Updated: 10/02/2016 15:35:29