The PC & Tech Authority team picks the computing-related inventions that have revolutionised our lives. What would you add to the list?
We may curse slow downloads on our phones more than a caravan on a bendy dirt road, but history will show that third-generation networks did for mobile broadband what ADSL did for its fixed-line equivalent. According to Cisco, the current 150% year-on-year mobile data traffic growth mirrors that of fixed-line internet traffic a decade ago, but on a much vaster scale: mobile data traffic in 2010 was three times the size of the entire internet in 2000. Without 3G, remote working would be greatly diminished, even if it still isn’t great in regional areas of Australia.
Deployment over existing phone lines made Asymmetric Digital Subscriber Line the starting point for the broadband revolution in Australia. Sadly, for many years it’s also been a sticking point, as anyone living or working too far from the telephone exchange still suffers low bandwidth, despite much of the industry moving to “up to” 24Mbits/sec ADSL2+. At least prices have dropped – in 2000 you could happily expect to pay around $75 a month for 512k speeds and an enormous 3GB cap.
Invented by a company subsequently acquired by Yahoo, Pay-Per-Click (PPC) advertising is an example of Google transforming someone else’s great idea into a $28-billion-a-year cash cow. Traditional advertising, online or off, involves parting with cash upfront and hoping that sales from the ad campaign outweigh its cost. PPC allows firms to create ads that appear in the search results for specific keyword phrases, a system Google calls AdWords. Advertisers pay only when their ad is clicked on by a punter who visits their site. This means anyone can compete in the global marketplace by carefully choosing the right keywords and optimising their site and ads. Of course, they have to convert visitors to customers, or they could end up pouring money into the infamous AdWords Black Hole.
In 2008, Apple launched the iPhone App Store, to partner the iPhone 3G (it may seem inconceivable that the original iPhone didn’t offer third-party apps, but it’s true). It was the first major portal for smartphone software, and it focused on making it as simple as possible to find, buy and download apps. The store was hugely popular, with iPhone and iPod touch users downloading apps in vast numbers. Android and Windows Phone 7 have since copied the App Store model and seen similar success. It’s no exaggeration to say that the App Store has changed the way people use smartphones. Can the miracle be replicated elsewhere? Apple launched the Mac App Store earlier this year, offering direct downloads of full-sized desktop games and applications for OS X, and Microsoft has confirmed that Windows 8 will include its own app store. It remains to be seen whether the one-stop impulse-buy model will shift full-price desktop apps as effectively as it has iPhone games.
Established in 1963, the American Standard Code for Information Interchange was the first standard system for encoding alphanumeric characters. ASCII made it easy to move data between systems, but, being American, it didn’t originally support accented letters, nor any currency symbols other than the dollar. International computer manufacturers had to make do with fudges until the more advanced Unicode system was established in 1991.
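The encoding gap the article describes is easy to demonstrate. A minimal sketch in Python (the interpreter's built-in codecs stand in here for the historical systems):

```python
# ASCII assigns each character a 7-bit code in the range 0-127.
print(ord('A'), ord('a'), ord('$'))   # 65 97 36

# Characters outside that range, such as accented letters or the
# pound sign, simply have no ASCII code at all...
try:
    '£'.encode('ascii')
except UnicodeEncodeError:
    print('no ASCII code for £')

# ...whereas Unicode (here serialised as UTF-8) handles them fine.
print('£'.encode('utf-8'))
```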
US engineers Joe Woodland and Bernard Silver devised the barcode in 1949 as a way of automating the checkout process at grocery stores – and the system is still going strong today. It’s estimated that Universal Product Code (UPC) symbols are scanned more than five billion times a day. The system now enables self-checkout and even live price comparison via smartphones. Quite an achievement for a collection of boring black and white lines.
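Part of the system's robustness comes from a simple self-check: the final digit of a 12-digit UPC-A code is computed from the other eleven, so a misread usually fails validation instead of ringing up the wrong item. A quick sketch of the standard calculation (the sample number is a commonly cited illustration, not tied to the article):

```python
def upc_check_digit(digits11):
    """Check digit for an 11-digit UPC-A body: digits in odd positions
    count triple, and the check digit rounds the total up to a
    multiple of 10."""
    odd = sum(int(d) for d in digits11[0::2])   # 1st, 3rd, ... 11th
    even = sum(int(d) for d in digits11[1::2])  # 2nd, 4th, ... 10th
    return (10 - (3 * odd + even) % 10) % 10

# The sample code 036000291452 ends in its check digit, 2
print(upc_check_digit("03600029145"))  # 2
```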
Developed as a teaching language in the mid-1960s, BASIC ignited the careers of a whole generation. In fact, it’s one of the few programming languages to have been taught in schools, most having neglected coding in favour of spreadsheets and word processing. Providing a version of BASIC in 1974 gave Micro-Soft [sic] its big break, and Visual Basic 3 put programming into the hands of the masses.
One of the few technologies that’s made the transition from sci-fi to reality, biometrics are now used to log in to laptops, scan tickets and even buy school dinners. With iris recognition planned for the next generation of passports, and research being conducted into behavioural scans to measure typing rhythm and voice inflection, this very personal technology will only become more pervasive in the years to come.
It may sound like something from the space age, but a blade server’s principal function is rather mundane: to save space. Blade servers are modular systems that house many server “blades” – essentially, much thinner equivalents of the more prevalent units housed in standard 19in-wide racks, known as 1U, 2U and so on. Blade servers were developed in the 1990s in response to the need of datacentres to increase performance and availability without having to expand the physical space in which servers were stored.
When does a fad become a force to be reckoned with? Maybe it was when The Huffington Post received $5 million of investment in 2006, Perez Hilton was given a TV show, or when Gizmodo’s leaked iPhone provoked the ire of the world’s biggest technology company. Blogs used to be personal, but now they’re global powerhouses that challenge traditional media as well as governments – and they’re here to stay.
In Nineteen Eighty-Four, George Orwell wrote about a terrifying future in which Big Brother kept watch over people’s every move. Some would say that nightmare has arrived, with millions of spy cams littering our roadsides and city centres. CCTV isn’t necessarily malign: coupled with the power of the computer it enables all sorts of services, from cheap security for small businesses, through tracking dangerous criminals using numberplate recognition, to helping drivers avoid the worst traffic snarl-ups. We may not have learned to love Big Brother, but at least he can be helpful.
Not so long ago, technical drawing was a painstaking job for a professional, and if you wanted to see how an idea would work in the real world you had to build a prototype. With computer-aided design, any type of plan can be assembled and tweaked in a jiffy. With physics simulation, engineers can even slam their ideas into virtual walls to see how they survive – safely, and without wasting a penny.
The first computer-generated images to hit screens in 1969 were of a dancing triangle on Sesame Street. From such inauspicious beginnings, CGI has taken over the film industry – although it’s taken some time, with James Cameron famously waiting more than a decade for the technology to catch up with his imagination to create Avatar. While some actors, such as Harrison Ford, have complained that computer effects are overused, for every badly done green-screen scene, there are the raptors in Jurassic Park, the T-1000 in Terminator 2 and everything Pixar’s ever made.
The CCD – a small photoactive piece of silicon at the back of your digital camera – has indisputably changed the world. It’s destroyed the photographic film industry in little over a decade, provided news reports with near-instant photos from almost any breaking story, eliminated the cost of photo paper and processing, and – perhaps most importantly – allowed people to share the fruits of their camera work instantly, by using the screen on the back of a camera or by publishing online.
The ability to access computing resources from anywhere, and from any kind of device, frees users and applications from the desktop, enabling us to carry our applications and data around with us. Even in pubs and restaurants we can stay in contact with our social networks and work projects, where previous generations had to make do with talking to their colleagues.
The original database could be said to have originated in the libraries of ancient Greece, the first society to gather knowledge methodically in one accessible place. These days, databases are digital and pervasive, driving a range of applications. They can take the form of simple repositories of data about individuals or provide the building blocks and content for some of the world’s most popular websites.
The advent of desktop publishing software rendered the art of typesetting (and phototypesetting) entirely redundant. The first breakthrough DTP package, Aldus PageMaker, made its debut in 1985 on the Apple Macintosh, with a Windows version following two years later. WYSIWYG allowed users to preview the printed page accurately before going to press, and made it easy to combine text and images in complex page layouts.
Diffie-Hellman Key Exchange
There are two types of encryption: private-key encryption, where data is encrypted and decrypted with the same key; and public-key encryption, where the two operations use different keys. Private-key encryption is fast, but how do you agree on a key with someone you’ve never met before? Diffie and Hellman’s answer, published in 1976, was an exchange that lets two strangers derive the same secret key over a public channel without the key itself ever being transmitted. Key agreement of this kind is the basis of virtually all encryption on the internet. It’s what makes e-commerce secure, and it’s why we owe Whitfield Diffie and Martin Hellman a debt of more than mere gratitude.
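The exchange Diffie and Hellman described boils down to a few lines of modular arithmetic. A toy sketch – the numbers here are deliberately tiny for illustration, whereas real deployments use primes thousands of bits long and randomly generated private exponents:

```python
p, g = 23, 5        # public parameters: a prime modulus and a generator
a, b = 6, 15        # each party's private exponent (random in practice)

A = pow(g, a, p)    # Alice publishes g^a mod p
B = pow(g, b, p)    # Bob publishes g^b mod p

# Each side combines its own secret with the other's public value...
alice_key = pow(B, a, p)
bob_key = pow(A, b, p)

# ...and both arrive at the same shared key, g^(ab) mod p,
# without that key ever crossing the wire.
print(alice_key == bob_key)  # True
```

An eavesdropper sees p, g, A and B, but recovering the shared key from those alone is the discrete logarithm problem, which is believed to be intractable for large primes.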
The phenomenon of the projected image has inspired mankind for millennia – the principles of the camera obscura are documented as far back as 470BC – but the move from film projection to digital projection was an important milestone. Where projectors were once huge, noisy devices, the arrival of LCD and DLP-based projectors allowed compact, affordable devices to proliferate, supplanting the old overhead projectors in schools and businesses and making home projectors more attractive than the huge, expensive CRT projectors formerly adopted by home-cinema enthusiasts. Now they’re even built into smartphones and cameras.
Along with Amazon, eBay was one of the pioneers of online shopping. It now boasts sales of $60 billion a year, more from professional retailers than clutter-clearers. During its heyday, eBay drove people online with tales of outrageous auctions – virginity and dead fairies both made perfect marketing. Issues with fraud and false feedback failed to stop the company’s progress.
It might be the Etch-A-Sketch of display technologies, but despite being slow, monochrome and restricted to low resolutions, electronic ink offers two huge advantages over the LCD panels used by tablets and laptops. Its pigment-based technology doesn’t require continuous electrical power, so real-world battery life is measured in weeks. Its true genius, however, is that eBook readers using this technology are as readable as books – even in direct sunlight. Just not in the bath, okay? Stay tuned for eInk advances in terms of colour and animation – yes, both are possible and will be winging their way to your eReader soon.
Once upon a time, kids were drilled in their times tables and complex mathematics required a slide rule and an agile mind. The electronic calculator changed all that: suddenly anybody could perform advanced calculations instantly. It has liberated engineers and scientists from mathematical drudge work and freed them to focus on new applications. Some might say it’s also given the man in the street one more excuse to let his mental arithmetic go rusty.
Like it or not, email changed the world. Not only has it revolutionised the workplace, it was one of the main forces that led to the creation of the internet. Originally invented in the early 1960s for people logged into a single system, it wasn’t until 1970 that Ray Tomlinson sent the first email across a network using the “@” symbol to specify an address. Thanks, Ray.
Many people claim to hate Facebook, but almost everyone you know is on it. With more than 600 million users, the ultimate social network has pushed – for good and bad – the boundaries of online privacy, and changed how we waste time at work, communicate with friends and find and share interesting bits of information. One in six referrals to news sites now comes from Mark Zuckerberg’s out-of-control college project.
Flash is lambasted by defensive Apple fans – and Steve Jobs – as a buggy, resource-intensive piece of software, but millions of websites can’t be wrong. Introduced in 1996 as Macromedia Flash and acquired by Adobe in 2005, it helped the fledgling web break away from the narrow confines of HTML and enabled developers to incorporate audio, video and interactive elements into their pages. Its eventual replacement by HTML5 seems probable, but there’s no denying Flash’s historical importance.