Saturday, February 28, 2009

15 Hot New Technologies That Will Change Everything

Memristor circuits lead to ultrasmall PCs. Intel and AMD unleash massively multicore CPUs. Samsung TVs respond to your every gesture. These and other developing technologies will fundamentally change the way you think about--and use--technology.

Christopher Null


Illustration: Randy Lyhus

The Next Big Thing? The memristor, a microscopic component that can "remember" electrical states even when turned off. It's expected to be far cheaper and faster than flash storage. A theoretical concept since 1971, it has now been built in labs and is already starting to revolutionize everything we know about computing, possibly making flash memory, RAM, and even hard drives obsolete within a decade.

The memristor is just one of the incredible technological advances sending shock waves through the world of computing. Other innovations in the works are more down-to-earth, but they also carry watershed significance. From the technologies that finally make paperless offices a reality to those that deliver wireless power, these advances should make your humble PC a far different beast come the turn of the decade.

In the following sections, we outline the basics of 15 upcoming technologies, with predictions on what may come of them. Some are breathing down our necks; some advances are still just out of reach. And all have to be reckoned with.

§  Memristor: A Groundbreaking New Circuit

§  32-Core CPUs From Intel and AMD

§  Nehalem and Swift Chips Spell the End of Stand-Alone Graphics Boards

§  USB 3.0 Speeds Up Performance on External Devices

§  Wireless Power Transmission

§  64-Bit Computing Allows for More RAM

§  Windows 7: It's Inevitable

§  Google's Desktop OS

§  Gesture-Based Remote Control

§  Radical Simplification Hits the TV Business

§  Curtains for DRM

§  Use Any Phone on Any Wireless Network

§  Your Fingers Do Even More Walking

§  Cell Phones Are the New Paper

§  Where You At? Ask Your Phone, Not Your Friend

§  25 Years of Predictions

The Future of Your PC's Hardware

Memristor: A Groundbreaking New Circuit


This simple memristor circuit could soon transform all electronic devices. Courtesy of HP


Since the dawn of electronics, we've had only three types of circuit components--resistors, inductors, and capacitors. But in 1971, UC Berkeley researcher Leon Chua theorized the possibility of a fourth type of component, one whose resistance would depend on the history of the current that had flowed through it: the memristor. Now, just 37 years later, Hewlett-Packard has built one.

What is it? As its name implies, the memristor can "remember" how much current has passed through it. And by altering the amount of current that passes through it, a memristor can also become a one-element circuit component with unique properties. Most notably, it can save its electronic state even when the current is turned off, making it a great candidate to replace today's flash memory.

Memristors will theoretically be cheaper and far faster than flash memory, and allow far greater memory densities. They could also replace RAM chips as we know them, so that, after you turn off your computer, it will remember exactly what it was doing when you turn it back on, and return to work instantly. This lowering of cost and consolidating of components may lead to affordable, solid-state computers that fit in your pocket and run many times faster than today's PCs.

Someday the memristor could spawn a whole new type of computer, thanks to its ability to remember a range of electrical states rather than the simplistic "on" and "off" states that today's digital processors recognize. By working with a dynamic range of data states in an analog mode, memristor-based computers could be capable of far more complex tasks than just shuttling ones and zeroes around.
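The charge-dependent resistance described above can be sketched with a toy model. Everything here (the class name, the parameter values, the linear M(q) law) is an illustrative textbook simplification, not HP's actual device physics:

```python
class Memristor:
    """Toy charge-controlled memristor.

    Uses the common linear model M(q) = R_off - (R_off - R_on) * q/Q_max;
    all parameter values are illustrative.
    """
    R_ON, R_OFF, Q_MAX = 100.0, 16000.0, 1e-2

    def __init__(self):
        self.q = 0.0  # total charge that has flowed through (the "memory")

    def apply(self, current, dt):
        # Charge accumulates with current; clamp at the device's limits.
        self.q = min(max(self.q + current * dt, 0.0), self.Q_MAX)

    @property
    def resistance(self):
        frac = self.q / self.Q_MAX
        return self.R_OFF - (self.R_OFF - self.R_ON) * frac

m = Memristor()
r_fresh = m.resistance                # high-resistance "off" state
for _ in range(100):
    m.apply(current=1e-3, dt=0.05)    # write: drive current through it
r_written = m.resistance              # resistance has dropped
m.apply(current=0.0, dt=3600.0)       # "power off" for an hour...
assert m.resistance == r_written      # ...and the state is retained
```

The final assertion is the whole point for storage: with no current flowing, the state doesn't change, which is why no power is needed to hold data.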

When is it coming? Researchers say that no real barrier prevents implementing the memristor in circuitry immediately. But it's up to the business side to push products through to commercial reality. Memristors made to replace flash memory (at a lower cost and lower power consumption) will likely appear first; HP's goal is to offer them by 2012. Beyond that, memristors will likely replace both DRAM and hard disks in the 2014-to-2016 time frame. As for memristor-based analog computers, that step may take 20-plus years.

32-Core CPUs From Intel and AMD


8-core Intel and AMD CPUs are about to make their way onto desktop PCs everywhere. Next stop: 16 cores. Courtesy of Intel


If your CPU has only a single core, it's officially a dinosaur. In fact, quad-core computing is now commonplace; you can even get laptop computers with four cores today. But we're really just at the beginning of the core wars: Leadership in the CPU market will soon be decided by who has the most cores, not who has the fastest clock speed.

What is it? With the gigahertz race largely abandoned, both AMD and Intel are trying to pack more cores onto a die in order to continue to improve processing power and aid with multitasking operations. Miniaturizing chips further will be key to fitting these cores and other components into a limited space. Intel will roll out 32-nanometer processors (down from today's 45nm chips) in 2009.

When is it coming? Intel has been very good about sticking to its road map. A six-core CPU based on the Penryn design should be out imminently; Intel will then shift focus to a brand-new architecture called Nehalem, to be marketed as Core i7. Core i7 will feature up to eight cores, with eight-core systems available in 2009 or 2010. (And an eight-core AMD project called Montreal is reportedly on tap for 2009.)

After that, the timeline gets fuzzy. Intel reportedly canceled a 32-core project called Keifer, slated for 2010, possibly because of its complexity (the company won't confirm this, though). That many cores requires a new way of dealing with memory; apparently you can't have 32 brains pulling out of one central pool of RAM. But we still expect cores to proliferate when the kinks are ironed out: 16 cores by 2011 or 2012 is plausible (when transistors are predicted to drop again in size to 22nm), with 32 cores by 2013 or 2014 easily within reach. Intel says "hundreds" of cores may come even farther down the line.
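One way to see why more cores don't automatically mean proportionally more speed is Amdahl's law, which caps speedup by the serial fraction of a workload. The numbers below are illustrative, not Intel's projections:

```python
def speedup(cores, parallel_fraction):
    """Amdahl's law: ideal speedup for a workload that is only partly
    parallelizable (ignores memory contention entirely)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even a 90%-parallel workload tops out well below 32x on 32 cores.
for n in (4, 8, 16, 32):
    print(f"{n:2d} cores -> {speedup(n, 0.90):.2f}x")
```

With a 90-percent-parallel workload, 32 cores deliver only about a 7.8x speedup, which is why the memory architecture and the software's parallelism matter at least as much as the core count.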

Nehalem and Swift Chips Spell the End of Stand-Alone Graphics Boards

When AMD purchased graphics card maker ATI, most industry observers assumed that the combined company would start working on a CPU-GPU fusion. That work is further along than you may think.

What is it? While GPUs get tons of attention, discrete graphics boards are a comparative rarity among PC owners, as 75 percent of laptop users stick with good old integrated graphics, according to Mercury Research. Among the reasons: the extra cost of a discrete graphics card, the hassle of installing one, and its drain on the battery. Putting graphics functions right on the CPU eliminates all three issues.

Chip makers expect the performance of such on-die GPUs to fall somewhere between that of today's integrated graphics and stand-alone graphics boards--but eventually, experts believe, their performance could catch up and make discrete graphics obsolete. One potential idea is to devote, say, 4 cores in a 16-core CPU to graphics processing, which could make for blistering gaming experiences.

When is it coming? Intel's soon-to-come Nehalem chip includes graphics processing within the chip package, but off of the actual CPU die. AMD's Swift (aka the Shrike platform), the first product in its Fusion line, reportedly takes the same design approach, and is also currently on tap for 2009.

Putting the GPU directly on the same die as the CPU presents challenges--heat being a major one--but that doesn't mean those issues won't be worked out. Intel's two Nehalem follow-ups, Auburndale and Havendale, both slated for late 2009, may be the first chips to put a GPU and a CPU on one die, but the company isn't saying yet.

USB 3.0 Speeds Up Performance on External Devices

The USB connector has been one of the greatest success stories in the history of computing, with more than 2 billion USB-connected devices sold to date. But in an age of terabyte hard drives, the once-cool throughput of 480 megabits per second that a USB 2.0 device can realistically provide just doesn't cut it any longer.

What is it? USB 3.0 (aka "SuperSpeed USB") promises to increase performance by a factor of 10, pushing the theoretical maximum throughput of the connector all the way up to 4.8 gigabits per second--roughly the equivalent of moving an entire CD-R's worth of data every second. USB 3.0 devices will use a slightly different connector, but USB 3.0 ports are expected to be backward-compatible with current USB plugs, and vice versa. USB 3.0 should also greatly enhance the power efficiency of USB devices, while increasing the juice available to them (up to 900 milliamps per device, versus 500 for USB 2.0). That means faster charging times for your iPod--and probably even more bizarre USB-connected gear like the toy rocket launchers and beverage coolers that have been festooning people's desks.
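A quick back-of-the-envelope check of the "CD-R per second" figure (theoretical bus peak only; real devices fall well short of the maximum):

```python
usb3_bits_per_sec = 4.8e9                  # SuperSpeed theoretical peak
bytes_per_sec = usb3_bits_per_sec / 8      # 600 MB/s
cd_r_bytes = 700 * 1024 * 1024             # a 700MB CD-R
discs_per_sec = bytes_per_sec / cd_r_bytes
print(f"{discs_per_sec:.2f} CD-Rs per second")
```

At about 0.8 discs per second, "roughly the equivalent of an entire CD-R every second" holds up as a headline figure.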

When is it coming? The USB 3.0 spec is nearly finished, with consumer gear now predicted to come in 2010. Meanwhile, a host of competing high-speed plugs--DisplayPort, eSATA, and HDMI--will soon become commonplace on PCs, driven largely by the onset of high-def video. Even FireWire is looking at an imminent upgrade of up to 3.2 Gbps. The port proliferation may make for a baffling landscape on the back of a new PC, but you will at least have plenty of high-performance options for hooking up peripherals.

Wireless Power Transmission

Wireless power transmission has been a dream since the days when Nikola Tesla imagined a world studded with enormous Tesla coils. But aside from advances in recharging electric toothbrushes, wireless power has so far failed to make significant inroads into consumer-level gear.

What is it? This summer, Intel researchers demonstrated a method--based on MIT research--for throwing electricity a distance of a few feet, without wires and without any dangers to bystanders (well, none that they know about yet). Intel calls the technology a "wireless resonant energy link," and it works by sending a specific, 10-MHz signal through a coil of wire; a similar, nearby coil of wire resonates in tune with the frequency, causing electrons to flow through that coil too. Though the design is primitive, it can light up a 60-watt bulb with 70 percent efficiency.
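The "resonant" part is the standard LC resonance relation f = 1/(2π√(LC)): a receiving coil tuned to the same frequency as the transmitter picks up energy efficiently. Intel hasn't published its coil values, so the inductance below is a made-up illustration:

```python
import math

TARGET_HZ = 10e6        # the 10-MHz signal used in Intel's demo
L = 25e-6               # coil inductance in henries (illustrative value)

# Capacitance that makes an LC circuit resonate at the target frequency.
C = 1.0 / ((2 * math.pi * TARGET_HZ) ** 2 * L)
f = 1.0 / (2 * math.pi * math.sqrt(L * C))   # sanity check: back to 10 MHz
print(f"{C * 1e12:.1f} pF -> resonates at {f / 1e6:.1f} MHz")
```

Only a coil tuned this way responds strongly to the transmitter, which is why nearby untuned objects (and bystanders) absorb little of the energy.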

When is it coming? Numerous obstacles remain, the first of which is that the Intel project uses alternating current. To charge gadgets, we'd have to see a direct-current version, and the size of the apparatus would have to be considerably smaller. Numerous regulatory hurdles would likely have to be cleared in commercializing such a system, and it would have to be thoroughly vetted for safety concerns.

Assuming those all go reasonably well, such receiving circuitry could be integrated into the back of your laptop screen in roughly the next six to eight years. It would then be a simple matter for your local airport or even Starbucks to embed the companion power transmitters right into the walls so you can get a quick charge without ever opening up your laptop bag.

The Future of Your PC's Software

64-Bit Computing Allows for More RAM

In 1985, Intel introduced its first 32-bit CPU, the 80386. It wasn't until 1993 that the first fully 32-bit Windows OS--Windows NT 3.1--followed, officially ending the 16-bit era. Now 64-bit processors have become the norm in desktops and notebooks, though Microsoft still won't commit to an all-64-bit Windows. But it can't live in the 32-bit world forever.

What is it? 64-bit versions of Windows have been around since Windows XP, and 64-bit CPUs have been with us even longer. In fact, virtually every computer sold today has a 64-bit processor under the hood. At some point Microsoft will have to jettison 32-bit altogether, as it did with 16-bit when it launched Windows NT, if it wants to induce consumers (and third-party hardware and software developers) to upgrade. That isn't likely with Windows 7: The upcoming OS is already being demoed in 32-bit and 64-bit versions. But limitations in 32-bit's addressing structure will eventually force everyone's hand; it's already a problem for 32-bit Vista users, who have found that the OS won't access more than about 3GB of RAM because it simply doesn't have the bits to access additional memory.
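The 3GB ceiling falls straight out of the arithmetic of a 32-bit address space; the device-reserved size below is a typical figure and varies by machine:

```python
addressable = 2**32                  # bytes a 32-bit pointer can name: 4 GiB
reserved_for_devices = 1 * 2**30     # PCI, video RAM, etc. (~1 GiB, varies)
usable_ram_gib = (addressable - reserved_for_devices) / 2**30
print(usable_ram_gib)
```

A 64-bit pointer, by contrast, can name 2**64 bytes (16 exbibytes), so address space stops being the bottleneck for decades.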

When is it coming? Expect to see the shift toward 64-bit accelerate with Windows 7; Microsoft will likely switch over to 64-bit exclusively with Windows 8. That'll be 2013 at the earliest. Meanwhile, Mac OS X Leopard is already 64-bit, and some hardware manufacturers are currently trying to transition customers to 64-bit versions of Windows (Samsung says it will push its entire PC line to 64-bit in early 2009). And what about 128-bit computing, which would represent the next big jump? Let's tackle one sea change at a time--and prepare for that move around 2025.

Windows 7: It's Inevitable


Will Windows 7 finally push PC software into the 64-bit world for good? We can only hope.


Whether you love Vista or hate it, the current Windows will soon go to that great digital graveyard in the sky. After the tepid reception Vista received, Microsoft is putting a rush on Vista's follow-up, known currently as Windows 7.

What is it? At this point Windows 7 seems to be the OS that Microsoft wanted to release as Vista, but lacked the time or resources to complete. Besides continuing refinements to the security system of the OS and to its look and feel, Windows 7 may finally bring to fruition the long-rumored database-like WinFS file system. Performance and compatibility improvements over Vista are also expected.

But the main thrust of Windows 7 is likely to be enhanced online integration and more cloud computing features--look for Microsoft to tie its growing Windows Live services into the OS more strongly than ever. Before his retirement as Microsoft's chairman, Bill Gates suggested that a so-called pervasive desktop would be a focus of Windows 7, giving users a way to take all their data, desktop settings, bookmarks, and the like from one computer to another--presumably as long as all those computers were running Windows 7.

When is it coming? Microsoft has set a target date of January 2010 for the release of Windows 7, and the official date hasn't slipped yet. However, rumor has the first official beta coming out before the end of this year.

Google's Desktop OS


The independently created gOS Linux is built around Google Web apps. Is this a model for a future Google PC OS?


In case you haven't noticed, Google now has its well-funded mitts on just about every aspect of computing. From Web browsers to cell phones, soon you'll be able to spend all day in the Googleverse and never have to leave. Will Google make the jump to building its own PC operating system next?

What is it? It's everything, or so it seems. Google Checkout provides an alternative to PayPal. Street View is well on its way to taking a picture of every house on every street in the United States. And the fun is just starting: Google's early-beta Chrome browser earned a 1 percent market share in the first 24 hours of its existence. Android, Google's cell phone operating system, is hitting handsets as you read this, becoming the first credible challenger to the iPhone among sophisticated customers.

When is it coming? Though Google seems to have covered everything, many observers believe that logically it will next attempt to attack one very big part of the software market: the operating system.

The Chrome browser is the first toe Google has dipped into these waters. A browser is how users interact with most of Google's products--which makes the underlying operating system almost irrelevant--but Chrome itself still needs an OS to run on.

To make Microsoft irrelevant, though, Google would have to work its way through a minefield of device drivers, and even then the result wouldn't be a good solution for people who have specialized application needs, particularly most business users. But a simple Google OS--perhaps one that's basically a customized Linux distribution--combined with cheap hardware could be something that changes the PC landscape in ways that smaller players who have toyed with open-source OSs so far haven't been quite able to do.

Check back in 2011, and take a look at the not-affiliated-with-Google gOS in the meantime.

The Future of Entertainment

Gesture-Based Remote Control


Soon you'll be able to simply point at your television and control it with hand gestures. Courtesy of Reactrix


We love our mice, really we do. Sometimes, however, such as when we're sitting on the couch watching a DVD on a laptop, or when we're working across the room from an MP3-playing PC, it just isn't convenient to drag a hockey puck and click on what we want. Attempts to replace the venerable mouse--whether with voice recognition or brain-wave scanners--have invariably failed. But an alternative is emerging.

What is it? Compared with the intricacies of voice recognition, gesture recognition is a fairly simple idea that is only now making its way into consumer electronics. The idea is to employ a camera (such as a laptop's Webcam) to watch the user and react to the person's hand signals. Holding your palm out flat would indicate "stop," for example, if you're playing a movie or a song. And waving a fist around in the air could double as a pointing system: You would just move your fist to the right to move the pointer right, and so on.
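Once a recognizer has classified the hand signal, dispatching it to a playback or pointer action is the easy part. Here is a sketch of the palm-means-stop, fist-as-pointer mapping described above; every name is ours, not Toshiba's API, and the computer-vision step is assumed to happen upstream:

```python
def handle_gesture(gesture, playing):
    """Map a recognized hand signal (a string from some upstream
    recognizer) to a media-control action."""
    if gesture == "open_palm":              # flat palm: the "stop" signal
        return "pause" if playing else "play"
    if gesture.startswith("fist_at:"):      # fist doubles as a pointer
        x, y = map(float, gesture.split(":")[1].split(","))
        return f"move_pointer({x:.2f}, {y:.2f})"
    return "ignore"                         # unrecognized signals do nothing

print(handle_gesture("open_palm", playing=True))
print(handle_gesture("fist_at:0.25,0.75", playing=True))
```

The hard engineering problem is entirely in producing that `gesture` string reliably from a noisy webcam frame, which is exactly where current systems still struggle.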

When is it coming? Gesture recognition systems are creeping onto the market now. Toshiba, a pioneer in this market, has at least one product out that supports an early version of the technology: the Qosmio G55 laptop, which can recognize gestures to control multimedia playback. The company is also experimenting with a TV version of the technology, which would watch for hand signals via a small camera atop the set. Based on our tests, though, the accuracy of these systems still needs a lot of work.

Gesture recognition is a neat way to pause the DVD on your laptop, but it's probably still a ways off from being sophisticated enough for broad adoption. All the same, its successful development would excite tons of interest from the "can't find the remote" crowd. Expect to see gesture recognition technology make some great strides over the next few years, with inroads into mainstream markets by 2012.

Radical Simplification Hits the TV Business

The back of most audiovisual centers looks like a tangle of snakes that even Medusa would turn away from. Similarly, the bowl of remote controls on your coffee table appeals to no one. The Tru2way platform may simplify things once and for all.

What is it? Who can forget CableCard, a technology that was supposed to streamline home A/V installations but that ultimately went nowhere despite immense coverage and hype? CableCard just didn't do enough--and what it managed to do, it didn't do very well. Enter Tru2way.

Tru2way is a set of services and standards designed to pick up the pieces of CableCard's failure by upgrading what that earlier standard could do (including support for two-way communications features like programming guides and pay-per-view, which CableCard TVs couldn't handle), and by offering better compatibility, improved stability, and support for dual-tuner applications right out of the box. So if you have a Tru2way-capable TV, you should need only to plug in a wire to be up and running with a full suite of interactive cable services (including local search features, news feeds, online shopping, and games)--all sans additional boxes, extra remotes, or even a visit from cable-company technicians.

When is it coming? Tru2way sets have been demonstrated all year, and Chicago and Denver will be the first markets with the live technology. Does Tru2way have a real shot? Most of the major cable companies have signed up to implement it, as have numerous TV makers, including LG, Panasonic, Samsung, and Sony. Panasonic began shipping two Tru2way TVs in late October, and Samsung may have sets that use the technology available in early to mid-2009.

Curtains for DRM


RealDVD's one-click copying makes taking flicks on the road easier. This is the future of entertainment.


Petrified of piracy, Hollywood has long relied on technical means to keep copies of its output from making the rounds on peer-to-peer networks. It hasn't worked: Tools to bypass DRM on just about any kind of media are readily available, and feature films often hit BitTorrent even before they appear in theaters. Unfortunately for law-abiding citizens, DRM is less a deterrent to piracy than a nuisance that gets in the way of enjoying legally obtained content on more than one device.

What is it? It's not what it is, it's what it isn't--axing DRM means no more schemes to prevent you from moving audio or video from one form of media to another. The most ardent DRM critics dream of a day when you'll be able to take a DVD, pop it in a computer, and end up with a compressed video file that will play on any device in your arsenal. Better yet, you won't need that DVD at all: You'll be able to pay a few bucks for an unprotected, downloadable version of the movie that you can redownload any time you wish.

When is it coming? Technologically speaking, nothing is stopping companies from scrapping DRM tomorrow. But legally and politically, resistance persists. Music has largely made the transition already--Amazon and iTunes both sell DRM-free MP3s that you can play on as many devices as you want.

Video is taking baby steps in the same direction, albeit slowly so far. One recent example: RealNetworks' RealDVD software (which is now embroiled in litigation) lets you rip DVDs to your computer with one click, but they're still protected by a DRM system. Meanwhile, studios are experimenting with bundling legally rippable digital copies of their films with packaged DVDs, while online services are tiptoeing into letting downloaders burn a copy of a digital movie to disc.

That's progress, but ending all DRM as we know it is still years off. Keep your fingers crossed--for 2020.

The Future of Mobile Phones

Use Any Phone on Any Wireless Network

The reason most cell phones are so cheap is that wireless carriers subsidize them so you'll sign a long-term contract. Open access could change the economics of the mobile phone (and mobile data) business dramatically as the walls preventing certain devices from working on certain networks come down. We could also see a rapid proliferation of cell phone models, with smaller companies becoming better able to make headway into formerly closed phone markets.

What is it? Two years is an eternity in the cellular world. The original iPhone was announced, introduced, and discontinued in less than that time, yet carriers routinely ask you to sign up for two-year contracts if you want access to their discounted phones. (It could be worse--in other countries, three years is normal.) Verizon launched the first volley late last year when it promised that "any device, any application" would soon be allowed on its famously closed network. Meanwhile, AT&T and T-Mobile like to note that their GSM networks have long been "open."

When is it coming? Open access is partially here: You can use almost any unlocked GSM handset on AT&T or T-Mobile today, and Verizon Wireless began certifying third-party devices for its network in July (though to date the company has approved only two products). But the future isn't quite so rosy, as Verizon is dragging its feet a bit on the legal requirement that it keep its newly acquired 700-MHz network open to other devices, a mandate that the FCC agreed to after substantial lobbying by Google. Some experts have argued that the FCC provisions aren't wholly enforceable. However, we won't really know how "open" is defined until the new network begins rolling out, a debut slated for 2010.

Your Fingers Do Even More Walking

Last year Microsoft introduced Surface, a table with a built-in monitor and touch screen; many industry watchers have seen it as a bellwether for touch-sensitive computing embedded into every device imaginable. Surface is a neat trick, but the reality of touch devices may be driven by something entirely different and more accessible: the Apple iPhone.

What is it? With the iPhone, "multitouch" technology (which lets you use more than one finger to perform specific actions) reinvented what we knew about the humble touchpad. Tracing a single finger on most touchpads looks positively simian next to some of the tricks you can do with two or more digits. Since the iPhone's launch, multitouch has found its way into numerous mainstream devices, including the Asus Eee PC 900 and a Dell Latitude tablet PC. Now all eyes are turned back to Apple, to see how it will further adapt multitouch (which it has already brought to its laptops' touchpads). Patents that Apple has filed for a multitouch tablet PC have many people expecting the company to dive into this neglected market, finally bringing tablets into the mainstream and possibly sparking explosive growth in the category.

When is it coming? It's not a question of when multitouch will arrive, but how quickly the trend will grow. Fewer than 200,000 touch-screen devices were shipped in 2006. iSuppli analysts have estimated that a whopping 833 million will be sold in 2013. The real guessing game is figuring out when the old "single-touch" pads become obsolete, possibly taking physical keyboards along with them in many devices.

Cell Phones Are the New Paper


Next year, you'll be able to drop paper boarding passes and event tickets and just flash your phone at the gate. Courtesy of TSA (left); courtesy of Tickets.com (right)


Log in to your airline's Web site. Check in. Print out your boarding pass. Hope you don't lose it. Hand the crumpled pass to a TSA security agent and pray you don't get pulled aside for a pat-down search. When you're ready to fly home, wait in line at the airport because you lacked access to a printer in your hotel room. Can't we come up with a better way?

What is it? The idea of the paperless office has been with us since Bill Gates was in short pants, but no matter how sophisticated your OS or your use of digital files in lieu of printouts might be, they're of no help once you leave your desk. People need printouts of maps, receipts, and instructions when a computer just isn't convenient. PDAs failed to fill that need, so coming to the rescue are their replacements: cell phones.

Applications to eliminate the need for a printout in nearly any situation are flooding the market. Cellfire offers mobile coupons you can pull up on your phone and show to a clerk; Tickets.com now makes digital concert passes available via cell phone through its Tickets@Phone service. The final frontier, though, remains the airline boarding pass, which has resisted this next paperless step since the advent of Web-based check-in.

When is it coming? Some cell-phone apps that replace paper are here now (just look at the ones for the iPhone), and even paperless boarding passes are creeping forward. Continental has been experimenting with a cell-phone check-in system that lets you show an encrypted, 2D bar code on your phone to a TSA agent in lieu of a paper boarding pass. The agent scans the bar code with an ordinary scanner, and you're on your way. Introduced at the Houston Intercontinental Airport, the pilot project became permanent earlier this year, and Continental rolled it out in three other airports in 2008. The company promises more airports to come. (Qantas will be doing something similar early next year.)

Where You At? Ask Your Phone, Not Your Friend


Right now, only a handful of devices sport GPS service. In the near future, it will be the norm.


GPS is taking off, as phone makers, carriers, and service providers have realized that consumers generally have no idea where they are, ever. A location-based service (LBS) takes raw GPS data that pinpoints your location and enhances this information with additional services, from suggesting nearby restaurants to specifying the whereabouts of your friends.

What is it? LBS was originally envisioned as simply using old-school cell-phone signal triangulation to locate users' whereabouts, but as the chips become more common and more sophisticated, GPS is proving to be not only handy and accurate but also the basis for new services. Many startups have formed around location-based services. Want a date? Never mind who's compatible; who's nearby? MeetMoi can find them. Need to get a dozen people all in one place? Both Whrrl and uLocate's Buddy Beacon tell you where your friends are in real time.
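"Who's nearby?" ultimately reduces to computing the great-circle distance between two GPS fixes, conventionally done with the haversine formula (function name and coordinates here are ours, purely for illustration):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in kilometers."""
    R = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Is a friend within a 1-km "nearby" radius? Two spots in San Francisco:
d = haversine_km(37.7749, -122.4194, 37.7793, -122.4193)
print(f"{d:.2f} km away")
```

An LBS backend runs essentially this test against every friend's last reported fix, then layers services (restaurant suggestions, buddy maps) on top of the matches.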

Of course, not everyone is thrilled about LBS: Worries about surreptitious tracking or stalking are commonplace, as is the possibility of a flood of spam messages being delivered to your phone.

When is it coming? LBS is growing fast. The only thing holding it back is the slow uptake of GPS-enabled phones (and carriers' steep fees to activate the function). But with iPhones selling like Ben & Jerry's in July, that's not much of a hurdle to overcome. Expect to see massive adoption of these technologies in 2009 and 2010.

25 Years of Predictions:

Our Greatest Hits

Predicting the future isn't easy. Sometimes PC World has been right on the money. At other times, we've missed it by a mile. Here are three predictions we made that were eerily prescient--and three where we may have been a bit too optimistic.

1983 What we said: "The mouse will bask in the computer world limelight... Like the joystick before it, though, the mouse will fade someday into familiarity."

We hit that one out of the park. Mice are so commonplace that they're practically disposable.

1984 What we said: "Microsoft Windows should have a lasting effect on the entire personal computer industry."

"Lasting" was an understatement. Windows has now amassed for Microsoft total revenues in the tens of billions of dollars and is so ubiquitous and influential that it has been almost perpetually embroiled in one lawsuit or another, usually involving charges of monopoly or of trademark and patent infringements.

1988 What we said: "In the future you'll have this little box containing all your files and programs... It's very likely that eventually people will always carry their data with them."

For most people, that little box is now also their MP3 player or cell phone.

And Biggest Misses

1987 What we said: "When you walk into an office in 1998, the PC will sense your presence, switch itself on, and promptly deliver your overnight e-mail, sorted in order of importance."

When we arrive in our office, the computer ignores us, slowly delivers the overnight e-mail, and puts all the spam on top.

1994 What we said: "Within five years... batteries that last a year, like watch batteries today, will power [PDAs]."

Perhaps our biggest whiff of all time. Not only do these superbatteries not exist (nor are they even remotely in sight), but PDAs are pretty much dead too.

2000 What we said: We wrote about future "computers that pay attention to you, sensing where you are, what you're doing, and even what your vital signs are... Products incorporating this kind of technology...could hit the market within a year."

While many devices now feature location-sensing hardware, such a PC has yet to come to pass. And frankly, we'd be glad to be wrong about this one. 

Source: http://www.pcworld.com/article/152683/15_hot_new_technologies_that_will_change_everything.html

Monday, January 5, 2009

Bushisms over the years
By The Associated Press


President George W. Bush will leave behind a legacy of Bushisms, the label stamped on the commander in chief's original speaking style. Some of the president's more notable malaprops and mangled statements:
___
• "I know the human being and fish can coexist peacefully." — September 2000, explaining his energy policies at an event in Michigan.
• "Rarely is the question asked, is our children learning?" — January 2000, during a campaign event in South Carolina.
• "They misunderestimated the compassion of our country. I think they misunderestimated the will and determination of the commander in chief, too." — Sept. 26, 2001, in Langley, Va. Bush was referring to the terrorists who carried out the Sept. 11 attacks.
• "There's no doubt in my mind, not one doubt in my mind, that we will fail." — Oct. 4, 2001, in Washington. Bush was remarking on a back-to-work plan after the terrorist attacks.
• "It would be a mistake for the United States Senate to allow any kind of human cloning to come out of that chamber." — April 10, 2002, at the White House, as Bush urged Senate passage of a broad ban on cloning.
• "I want to thank the dozens of welfare-to-work stories, the actual examples of people who made the firm and solemn commitment to work hard to embetter themselves." — April 18, 2002, at the White House.
• "There's an old saying in Tennessee — I know it's in Texas, probably in Tennessee — that says, fool me once, shame on — shame on you. Fool me — you can't get fooled again." — Sept. 17, 2002, in Nashville, Tenn.
• "Our enemies are innovative and resourceful, and so are we. They never stop thinking about new ways to harm our country and our people, and neither do we." — Aug. 5, 2004, at the signing ceremony for a defense spending bill.
• "Too many good docs are getting out of business. Too many OB/GYNs aren't able to practice their love with women all across this country." — Sept. 6, 2004, at a rally in Poplar Bluff, Mo.
• "Our most abundant energy source is coal. We have enough coal to last for 250 years, yet coal also prevents an environmental challenge." — April 20, 2005, in Washington.
• "We look forward to hearing your vision, so we can more better do our job." — Sept. 20, 2005, in Gulfport, Miss.
• "I can't wait to join you in the joy of welcoming neighbors back into neighborhoods, and small businesses up and running, and cutting those ribbons that somebody is creating new jobs." — Sept. 5, 2005, when Bush met with residents of Poplarville, Miss., in the wake of Hurricane Katrina.
• "It was not always a given that the United States and America would have a close relationship. After all, 60 years we were at war 60 years ago we were at war." — June 29, 2006, at the White House, where Bush met with Japanese Prime Minister Junichiro Koizumi.
• "Make no mistake about it, I understand how tough it is, sir. I talk to families who die." — Dec. 7, 2006, in a joint appearance with British Prime Minister Tony Blair.
• "These are big achievements for this country, and the people of Bulgaria ought to be proud of the achievements that they have achieved." — June 11, 2007, in Sofia, Bulgaria.
• "Mr. Prime Minister, thank you for your introduction. Thank you for being such a fine host for the OPEC summit." — September 2007, in Sydney, Australia, where Bush was attending an APEC summit.
• "Thank you, Your Holiness. Awesome speech." — April 16, 2008, at a ceremony welcoming Pope Benedict XVI to the White House.
• "The fact that they purchased the machine meant somebody had to make the machine. And when somebody makes a machine, it means there's jobs at the machine-making place." — May 27, 2008, in Mesa, Ariz.
• "And they have no disregard for human life." — July 15, 2008, at the White House. Bush was referring to enemy fighters in Afghanistan.
• "I remember meeting a mother of a child who was abducted by the North Koreans right here in the Oval Office." — June 26, 2008, during a Rose Garden news briefing.
• "Throughout our history, the words of the Declaration have inspired immigrants from around the world to set sail to our shores. These immigrants have helped transform 13 small colonies into a great and growing nation of more than 300 people." — July 4, 2008, in Virginia.
• "The people in Louisiana must know that all across our country there's a lot of prayer — prayer for those whose lives have been turned upside down. And I'm one of them. It's good to come down here." — Sept. 3, 2008, at an emergency operations center in Baton Rouge, La., after Hurricane Gustav hit the Gulf Coast.
• "This thaw — took a while to thaw, it's going to take a while to unthaw." — Oct. 20, 2008, in Alexandria, La., as he discussed the economy and frozen credit markets.

Source: http://news.yahoo.com/s/ap/20090103/ap_on_go_pr_wh/bushisms

New Impacts on Outsourcing in 2009

By Kathleen Goolsby

New impacts on outsourcing in 2009 include service-oriented architecture (SOA), service provider "DNA," green IT, the changing role of physicians, and what the future holds because of the convergence of technology and business process. This article looks at what you need to know about each of these impacts.
Service-Oriented Architecture
"I'm really excited about SOA," says Gianni Giacomelli, head of BPO Strategy and Marketing, SAP. "Conceptually, it's a revolution in outsourcing that will take it to the next level."
Software implementations today are constrained by yesterday's way of writing code. As Giacomelli explains, software developers wrote hundreds of thousands of lines of code that, together, handle a business process (such as finance and accounting). But the code is not clearly segmented into functions or subprocesses (such as accounts payable, accounts receivable, collections, general ledger, and fixed assets). It's often difficult to take out the lines of code for subprocesses and give them to another company. And, at times, companies have to implement an entire system even if they only want to use one segment of the code.
In contrast, SOA makes the code accessible in pieces, so to speak, that are very easy to map to business subprocesses. So if a company only wants to implement a system for collections or a system for the general ledger, for example, SOA enables that option.
"By being able to do that, you enable one simple thing: specialization," explains Giacomelli. He compares it to automobile manufacturers that use subcontractors to build almost all of the components that make up a car. "Those components became a natural breeding ground for organizations that are specialized in doing specific things such as making brakes or transmissions. Without specialization, we wouldn't be able to have cars that cost what they do today. Cars were expensive and extremely rudimentary decades ago because there was no specialization in the components in the car."
SOA has the potential to generate that specialization in the outsourcing industry because it enables providers to take much more granular pieces of a process and concentrate on them. "By concentrating only on one piece or on a few pieces, service providers can actually choose the ones in which they are really, really good, the ones in which they really can create significant economies of scale for 100 or 200 customers," says the SAP exec.
That's a value proposition that Giacomelli points out is still sometimes lacking in BPO today. "Many providers are not bringing to the fore significantly different economies of scale that the client can't replicate because many providers have at best only a handful of clients running on the same platform."
What are the implications for buyers of outsourcing services? The risk and difficulty of outsourcing subprocesses will be much lower. "The connection points between the piece the buyer moved out and gave to the provider and the rest of the retained subprocesses are going to be very clear because they are mapped into the software. It's almost like taking a Lego piece out of a structure; it still recombines fairly well with the rest because the connection points are very regular. SOA is also great for making new and improved pieces fit with the rest of the structure; there's less pain with enhancement, upgrades, and ultimately innovation."
The ideal scenario is one where both the buyer and provider have SOA so that they can communicate in the best way and so there is a minimum amount of "stranded assets" on the client side. "But the reality is most clients don't have SOA in their landscape today for most of the processes. It's changing, and there's a wave of adoption today; however, broader adoption will follow the rhythm of upgrades, so it will take 10 years," says Giacomelli. "This said, the fact that the provider is already able to use SOA on its end to build very focused 'droplets' of subfunctions is game-changing."
The big advantage of SOA in outsourcing is a win-win for buyers and providers. Giacomelli points out, "With SOA, the BPO provider needs to implement and run only a specific piece of the entire application landscape (such as the collections piece of the accounts receivable process). Therefore, the implementation will be much less complex, less lengthy (and costly) than traditional implementations."
Provider DNA
"I think that the biggest thing in the outsourcing landscape over the next year or two is going to be the expectation of the customers of a much different DNA in the suppliers that they work with in the outsourcing space." That's the belief of Keith Higgins, vice president of Worldwide Marketing, at Aricent, a global innovation, technology, and outsourcing company focused exclusively on the communications industry.
In an age where user experience and consumer demands dictate product development, companies are under pressure to innovate and get to market a lot faster than ever before.
"We're moving from cost arbitrage to skills arbitrage," claims Higgins. "This is different from the DNA required for just being the recipient of a client's to-do list and doing it globally at lower cost. Outsourcing providers are now moving up the value chain and product life cycle all the way to the whiteboard."
As clients ask for innovation, industry domain expertise will be "paramount to selecting the right outsourcing partner." It will enable more streamlined expertise for the buyer. Higgins believes the trillion-dollar outsourcing market will soon fragment into players focused on domain expertise.
"It will be the death of the mile-wide inch-deep outsourcing deals," he says. "You can't be a jack of all trades in the outsourcing space." He predicts that domain expertise will be a self-fulfilling prophecy; the more customers a provider has in one domain, the better the provider "gets it," and the more customers the provider will gain.
Neeraj Bhargava, CEO of WNS Global Services, agrees. "Successful providers are going to have greater industry specialization." He says the DNA of offshore providers will also change. "The successful offshore companies will add more value by combining their talent with technologies." According to Bhargava, offshore providers like WNS have the momentum of growth at 40 percent per year for the past five years. Now they're adding higher value-added areas such as research and analytics to their DNA. "Areas such as financial research, marketing analysis, and procurement analysis are growing rapidly in the offshore market," says Bhargava.
Debra Kops, chief marketing officer, WNS Global Services, also lists the changing provider DNA as a new impact on outsourcing in the coming year. "What's driving the increased focus on vertical domain expertise is the need for the provider to understand the buyer's industry challenges and changes in business volumes. An example is knowing the context of billing in the utilities industry along with conversion rates and need for accuracy of meter reading."
Changing role of physicians
Look for a new spin on clinical help desks next year. New opportunities for outsourcing are developing in the physician community, according to Greg Baugh, senior director of operations, Siemens. Business processes in hospitals are changing, and physicians' roles are changing, requiring them to do more things in hospitals. For instance, in an effort to reduce medical errors, hospitals are implementing systems that require physicians to take accountability and place their orders themselves instead of having other clinicians do it for them.
"Outsourcers will need to change the way they provide help desk services and on-site services to physicians. We need to help the physicians do what's now being required of them. Physicians can't delay their work while they're held up with IT issues. They need support right away and expect answers immediately."
Physicians are also getting more involved with the electronic medical records (EMR). As companies sell them ambulatory products to handle the EMR, it will create new opportunities for outsourcing services in support of those products.
Green IT
"Companies are really taking up the charge of responsibility to the environment and to society at large," says Arthur Mazor, senior vice president, Offering Management & Marketing, Fidelity HR Services. Fidelity is finding that most companies seeking to create outsourcing engagements are now including interest in and requirements around environmental sustainability contributions in their evaluation criteria for service providers.
"We're finding that this is a significant impact on the way that outsourcing providers must think about and execute their business strategies, solutions, infrastructure footprint, and usage of resources that are environmentally friendly."
According to Mazor, many buyers' RFIs, RFPs, and questions from analysts and sourcing advisors guiding clients are now requiring providers to demonstrate their positions and environmental contributions. The environmental issue is starting to manifest itself in companies requiring electronic distribution and collection of RFPs.
Mazor says the "big question" is to what degree companies will weight the RFP questions related to environmental sustainability compared to the rest of the provider evaluation. "I think that's something that companies are wrestling with," he says.
Bob Pryor, senior vice president, Sales and Marketing, HP Outsourcing Services, agrees that the influence of environmentalism in terms of "green" IT is a significant trend shaping the industry today. He ties it together with pressures on data centers for reducing costs of energy and cooling. "These two issues are tightly connected now."
"We're seeing very significant trends in this past year about what customers are asking for and the issues they are facing regarding their data centers not being able to handle the higher demands for power and cooling, especially in higher density environments," says Pryor. Customers are asking about solutions for energy management, conservation, preservation, and alternative energy sources as well as seeking understanding on whether they should build solutions with their own capital, outsource, or do a combination of both.
Convergence of technology and business process
"Although it's happening in pockets, the trend around the convergence of technology and business process hasn't quite taken hold yet. But it's ultimately the new area in outsourcing," predicts Pryor.
In this emerging model, customers move away from doing everything in an isolated pocket (for example, buy an application from one company, outsource computing capacity from another, and outsource accounting to another company). In the model Pryor favors, customers demand and expect that one company "could provide them all of their business process services with all of the people, expertise, and enabling technology and all bundled back to them at a price however they want it (per customer, per volume, per certain service units, the way they bill their customers, etc.)."
"Combining all of these pieces is an early step in offering on-demand services over the Internet," says Pryor. While cloud computing (including the SaaS model as one component) is "a profound trend in the marketplace today," he believes it will take a while before suppliers can deliver everything as a service from an outsourcing standpoint.
"I think you'll see aspects or components of it in outsourcing within the next three to five years," predicts Pryor. "The question is, how prominent will that be? The answer depends on how advanced the enabling network and computing environments become." Building the model will shift the risk to providers, along with the significant capital investment.
"It won't be a small undertaking for providers," he adds. "So we'll see it first in niche kinds of services and with early adopters. But as the demand grows, you'll see the investment and the growth curve that says it's truly a big trend in the industry."
Lessons from the Outsourcing Journal:
Service-oriented architecture (SOA) will enable outsourcing service providers to specialize in certain processes and thus create more significant economies of scale for the buyers' benefit; this will create a value proposition that is often still lacking in BPO to date.
SOA will reduce the complexity, time, and costs involved in traditional software implementations.
The outsourcing market is beginning to fragment into providers focused on domain expertise, enabling them to better meet buyers' needs around industry challenges and changes in business volumes.
The influence of environmentalism in terms of "green" IT is now tightly connected with pressures to reduce energy and cooling costs in data centers. Many companies are now starting to include requirements around environmental sustainability contributions in their evaluation criteria for service providers.
Physicians' roles in hospitals are changing as is their use of IT. Accordingly, service providers need to change the way they provide help desk and on-site services to physicians.
Outsourcing will begin moving away from doing work in isolated pockets (buying an application from one provider, computing capacity from another, and outsourcing a business process to another) and move toward providers that can deliver all such aspects in one bundled offering at a pricing structure that suits the buyer's needs. Combining all these aspects is necessary for offering on-demand outsourced services. This movement is beginning to happen now and will increase in niche areas over the next three to five years.