
From Pascal to point & click

In the technological world, the abacus held sway for most of the millennium; now the latest, greatest computer system gives way to a later, greater one every 18 months



Hardly anyone back in 1000 A.D. could have imagined that people would buy plug-in electronic brains in the next millennium. Imagine explaining Web browsers and buddy lists to someone who lived back then. Yet computers have conquered our society in just 50 years, replacing our dreams of airplanes and cars and even space flights with visions of silicon and fiber-optic cables.

A thousand years ago, what some people had was the abacus. By clicking beads around, they could perform math problems and manage a few business transactions. It was simple, easily replicated, easily learned, and it lasted for centuries. The descendants of the abacus are far more difficult to design and far less likely to survive more than a few years.

After a fit of brainstorms and ideas over the ages, Blaise Pascal gave the world an early taste of processing power. In the 1640s he built a device called the Pascaline, an early mechanical calculator that could add numbers up to eight digits, to help his father do taxes. It looked like a rectangular brass box with a series of dials running across it. A few models actually were sold by the early 1650s.

The Pascaline was big news, but hard to use. One problem was that French currency was not decimal, which made calculating difficult. Only about 50 were made before lack of demand brought the project to a halt; a few were preserved and still exist. Pascal went on to do theology and pioneering work in geometry; his Pensées are still widely read today. Those who followed his religious work and those who followed his technological work wound up moving in separate directions, and few of those who helped develop the PC were Christians.

Over the years, others tried to tinker with the Pascaline, but a new wave of innovation came in the 1800s. Incandescent light bulbs, telegraph lines, and typewriters were all in the works, adding pieces to the puzzle. Early in the 19th century, Joseph-Marie Jacquard invented the punch card to control silk looms. By punching the card with holes, operators could send instructions to a machine; the card was similar to the roller that controls a player piano.

Punch cards were big news, since data could be stored and brought back later. Someone could write a set of instructions to perform the same operations on different sets of cards. They were slow, tedious, and a real pain if spilled on the floor. But they made computer programming possible.

The first to have grand plans for the punch card was English math professor Charles Babbage. In 1822, he started building a steam-powered, locomotive-sized invention called the difference engine. It was intended to generate mathematical tables without using teams of number-crunchers with pen, paper, or early calculators. After 10 years, he gave up and designed the analytical engine, the first stab at a programmable computer. Babbage's death in 1871 put a stop to the construction.

Introducing Big Blue

The next milestone was IBM, the big blue name that dominated the industry until Microsoft outmaneuvered and overpowered it. MIT lecturer Herman Hollerith used punch cards to store census data. He entered and won the competition for a new tabulating scheme for the 1890 census. His machines cut a decade's worth of head counting down to a year of work. By sorting the cards, statisticians could compile population data easily for the first time. Hollerith saved the government $5 million, and soon he had a hit on his hands, with orders coming in from as far away as Canada, Norway, and Russia.
In 1896 he founded the Tabulating Machine Company, which in 1924 would become International Business Machines. Decades later, IBM was using punch cards to control giant mainframes, and for years its 80-column format was an icon of technological power in the days before floppies and CD-ROMs. That first $5 million was the birth of a new idea: that information could be used to save and make money. In Hollerith's time it was just one cog in the industrial revolution. The lesson would take a long time to learn because the technology was big and expensive.

As processing slowly developed, a wave of other technological changes hit: cars, planes, and household appliances rocked the world, leaving people wondering what could come next. An important bit of speculation came in 1921, when Czech author Karel Capek produced the play R.U.R., which popularized the idea of robots. These were machines programmed to do boring tasks. Naturally, they all went nuts and destroyed humanity. This launched the ever-present debate about people being enslaved by technology and computers replacing people.

The early Internet

After World War II came the explosion. Harvard's Howard Aiken, working with IBM engineers, completed the Mark I, a giant electromechanical calculator, in 1944. Often considered the first large-scale digital computer, it weighed five tons and made a terrible racket. The first real commercial computer, UNIVAC, arrived in 1951. For the next few decades, computers were huge, expensive monstrosities thought valuable only to science, industry, and academia.

In the 1960s, the people behind these big machines wanted to share resources and information while the government searched for ways to create a nuke-proof computer network. Al Gore's assertions to the contrary, in 1969 the Defense Department's Advanced Research Projects Agency (ARPA) linked four universities in ARPANET, an ancestor of today's Internet.

Two years later, Intel created the microprocessor that would bring computers out of the realm of the elite and onto the desktop. The 4004 chip, a direct ancestor of today's Pentium, was originally developed for business calculators. It was small, fast, and relatively cheap. Thousands of transistors were packed onto a single integrated circuit on a silicon wafer. Suddenly the microchip was big business and smaller computers were affordable.

The late 1970s and early 1980s saw an amazing array of home computers from Commodore, Radio Shack, Sinclair, Apple, Atari, and others. Hard to use and weak by today's standards, they gave the world a feel for what was possible. Once the first major spreadsheet program, VisiCalc, arrived in 1979, more businesses wanted computers. IBM was lured into the market in 1981 with the first PC, and the real hardball began.

The PC era

Before IBM's invasion, the slew of short-lived home computers gave average people a taste of what this technology could do. They were slow, quaint, and clunky by today's standards, and many of them plugged into a standard TV set instead of using their own monitors. Millions got their first taste of programming, gaming, and even online surfing, and hungered for more.

Big Blue did not understand the PC revolution until too late; it created the standard used by over 90 percent of desktop computers today, then lost control of the market. IBM wanted the original Personal Computer to ship in a year, so it built a Frankenstein contraption of parts from various manufacturers, notably Intel's processor. Enter Bill Gates and Microsoft, which scored a contract to supply an operating system called MS-DOS for the PC.
IBM didn't take exclusive rights to the product, leaving Microsoft free to sell DOS to other vendors, and making Mr. Gates the richest man in the world. IBM's PC was a big hit, and other manufacturers took notice. Clone makers popped up, tossing together similar parts running Intel's chips and Microsoft's operating systems. Originally these were low-budget affairs that didn't always work, but over the years upstarts like Compaq, Dell, and Gateway took on IBM at its own game.

By the late 1980s, the way the computer user ran his machine had changed. Punch cards had given way to command-line interfaces, where people typed instructions on a keyboard and the computer responded. Now manufacturers were turning toward the graphical user interface, the point-and-click system made famous by the original Macintosh. Originally developed by Xerox's Palo Alto Research Center, the new system replaced hard-to-remember instructions with drop-down menus and icons (like the Windows Recycle Bin) that represent programs and files.

Me and my machine

Innovation has never stopped, largely because developers are constantly coming up with more ways to put more power into the same size box. The rule of thumb known as Moore's Law says that computer speeds double and prices drop every 18 months. This means that each quarter's PC models are a leap above the last, but every machine purchased has a bad case of entropy. The typical automobile can last indefinitely as long as it is well maintained and driven safely, but computers become dinosaurs in just three or four years.

Everything about computers is constantly changing. The collection of odd-shaped plugs on the back of most PCs is being phased out in favor of a standard jack called USB, or Universal Serial Bus. The modem will eventually be replaced by network hookups for high-speed connections through cable and phone lines. Sleek, skinny flat screens with better pictures are overtaking today's big, bulky monitors. Nothing ever stays the same.

The biggest change in the computer world this decade, of course, is the overnight rise of the Internet. The invention of the World Wide Web by Tim Berners-Lee and the birth of Mosaic, the first popular graphical browser, led millions to spend hundreds and thousands of dollars just to get free stuff off the Net. The mass marketing of the Internet has changed the way people look at computers. The PC has gone from being a cranky servant in the den to being a replacement for the TV set.

People can get practically anything they want over a dial-up connection. If it can't be downloaded, it can be charged to a credit card and shipped. This has bureaucrats and tax collectors wringing their hands about losing the sales tax normally collected on a trip to the mall. Media outlets from TV networks to newspapers to book dealers are wringing their hands about losing their dominance to an ever-changing enigma. If everything is at one's fingertips, who needs the choices at B. Dalton, Tower Records, or Family Bookstores? That's why billions are being spent building so-called e-commerce outlets to sell goods over the Net.

Friendship, love affairs, and even employment are being carried on over the Net through the ever-growing pile of chat software that lets people type at each other across the miles. People can run into a lot of mentally deranged weirdos this way, but they can also find that kindred spirit who shares their love of gourmet pizza, rubber stamping, or the philosophy of Gordon Clark.
This endless silicon-based Turkish bazaar extends to the intellect, as everyone on the Net can publish his innermost thoughts. Anyone with a little learning can turn an idea into a website or a virtual community. Every possible opinion can be found out there, since there are no gatekeepers and no laws (at least in the United States) controlling what can be said.

What it means

Computers are like Legos, Tinkertoys, and Play-Doh. Users can fiddle with them and push and pull until they have something really cool. The cultural mandate possibilities of all this technology have barely been explored. The lack of gatekeepers, for example, means there is nobody who can stop the Christian worldview from getting out to the world. If Gutenberg's revolution put Bibles in every house, just think what computerization can do. A few people with the right ideas, good web design, and a little marketing savvy can have great success in this medium. The often-discussed dangers of this medium can be overcome by a resurgence of Christian culture.

The Internet will only become more powerful over the next decade as it begins to absorb the usual functions of phone service and cable TV. Microsoft executives have long dreamed of hooking up every home appliance through one computer in the den. Already, America is being wired for broadband, better wiring for faster connections. Computers are speeding up people's day-to-day tasks, then slowing them down as they deal with all the new problems created by the system. This media revolution will both globalize and tribalize the people who use the technology. That makes the need for Christians to be salt and light to this new universe all the more urgent.

-Chris Stamper is a national correspondent for WORLD currently working on a book about Christianity and technology.


