Nehalem and Swift Chips Spell the End of Stand-Alone Graphics Boards
When AMD purchased graphics card maker ATI, most industry observers
assumed that the combined company would start working on a CPU-GPU
fusion. That work is further along than you may think.
What is it? While GPUs get tons of attention,
discrete graphics boards are a comparative rarity among PC owners, as 75
percent of laptop users stick with good old integrated graphics,
according to Mercury Research. Among the reasons: the extra cost of a
discrete graphics card, the hassle of installing one, and its drain on
the battery. Putting graphics functions right on the CPU eliminates all
three issues.
Chip makers expect the performance of such on-die GPUs to fall
somewhere between that of today's integrated graphics and stand-alone
graphics boards--but eventually, experts believe, their performance
could catch up and make discrete graphics obsolete. One potential idea
is to devote, say, 4 cores in a 16-core CPU to graphics processing,
which could make for blistering gaming experiences.
When is it coming? Intel's soon-to-come Nehalem chip includes graphics processing within the chip package, but off of the actual CPU die. AMD's Swift
(aka the Shrike platform), the first product in its Fusion line,
reportedly takes the same design approach, and is also currently on tap
for 2009.
Putting the GPU directly on the same die as the CPU presents
challenges--heat being a major one--but that doesn't mean those issues
won't be worked out. Intel's two Nehalem follow-ups, Auburndale and
Havendale, both slated for late 2009, may be the first chips to put a
GPU and a CPU on one die, but the company isn't saying yet.
USB 3.0 Speeds Up Performance on External Devices
The USB connector has been one of the greatest success stories in
the history of computing, with more than 2 billion USB-connected devices
sold to date. But in an age of terabyte hard drives, the once-cool
480 megabits per second that USB 2.0 tops out at just doesn't cut it any longer.
What is it? USB 3.0 (aka "SuperSpeed USB")
promises to increase performance by a factor of 10, pushing the
theoretical maximum throughput of the connector all the way up to 4.8
gigabits per second, or roughly the equivalent of an entire CD-R disc's
worth of data every second. USB 3.0 devices will use a slightly different
connector, but USB 3.0 ports are expected to be backward-compatible with
current USB plugs, and vice versa. USB 3.0 should also greatly enhance
the power efficiency of USB devices, while increasing the juice available
to them (up to 900 milliamps, versus 500 milliamps for USB 2.0). That means faster
charging times for your iPod--and probably even more bizarre
USB-connected gear like the toy rocket launchers and beverage coolers
that have been festooning people's desks.
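How fast is 4.8 gigabits per second in practice? Here's a quick back-of-the-envelope check of the CD-R comparison; it ignores protocol overhead, so real-world transfers will be slower.

```python
# Rough sanity check of the "one CD-R per second" comparison.
usb3_bits_per_sec = 4.8e9                 # SuperSpeed theoretical maximum
usb2_bits_per_sec = 480e6                 # Hi-Speed theoretical maximum
cdr_bytes = 700 * 1024 * 1024             # a standard 700MB CD-R

usb3_bytes_per_sec = usb3_bits_per_sec / 8    # 600 MB/s
usb2_bytes_per_sec = usb2_bits_per_sec / 8    # 60 MB/s

print(f"USB 3.0: {cdr_bytes / usb3_bytes_per_sec:.1f} seconds per CD-R")   # ~1.2
print(f"USB 2.0: {cdr_bytes / usb2_bytes_per_sec:.1f} seconds per CD-R")   # ~12.2
```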
When is it coming? The USB 3.0 spec is nearly
finished, with consumer gear now predicted to come in 2010. Meanwhile, a
host of competing high-speed plugs--DisplayPort, eSATA, and HDMI--will
soon become commonplace on PCs, driven largely by the onset of high-def
video. Even FireWire is in line for an imminent upgrade to 3.2-gbps
performance. The port proliferation may make for a baffling landscape
on the back of a new PC, but you will at least have plenty of
high-performance options for hooking up peripherals.
Wireless Power Transmission
Wireless power transmission has been a dream since the days when
Nikola Tesla imagined a world studded with enormous Tesla coils. But
aside from advances in recharging electric toothbrushes, wireless power
has so far failed to make significant inroads into consumer-level gear.
What is it? This summer, Intel researchers
demonstrated a method--based on MIT research--for throwing electricity a
distance of a few feet, without wires and without any dangers to
bystanders (well, none that they know about yet). Intel calls the
technology a "wireless resonant energy link,"
and it works by sending a specific, 10-MHz signal through a coil of
wire; a similar, nearby coil of wire resonates in tune with the
frequency, causing electrons to flow through that coil too. Though the
design is primitive, it can light up a 60-watt bulb with 70 percent
efficiency.
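The physics behind the demo is the ordinary resonance of a tuned coil: a coil and capacitor resonate at f = 1 / (2 * pi * sqrt(L * C)). The sketch below works through that arithmetic with an assumed coil inductance (the 25-microhenry value is illustrative, not Intel's), plus the input power implied by lighting a 60-watt bulb at 70 percent efficiency.

```python
import math

# What capacitance tunes a receiving coil to the 10-MHz drive signal?
f_drive = 10e6                            # Hz, the drive frequency in the demo
L = 25e-6                                 # henries (assumed coil inductance)
C = 1 / ((2 * math.pi * f_drive) ** 2 * L)
print(f"tuning capacitance needed: {C * 1e12:.1f} pF")      # ~10 pF

# Power the transmitter must supply to light a 60 W bulb at the demo's
# reported 70 percent end-to-end efficiency.
bulb_watts, efficiency = 60.0, 0.70
print(f"input power required: {bulb_watts / efficiency:.0f} W")   # ~86 W
```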
When is it coming? Numerous obstacles remain, the
first of which is that the Intel project uses alternating current. To
charge gadgets, we'd have to see a direct-current version, and the size
of the apparatus would have to be considerably smaller. Numerous
regulatory hurdles would likely have to be cleared in commercializing
such a system, and it would have to be thoroughly vetted for safety
concerns.
Assuming those all go reasonably well, such receiving circuitry
could be integrated into the back of your laptop screen in roughly the
next six to eight years. It would then be a simple matter for your local
airport or even Starbucks to embed the companion power transmitters
right into the walls so you can get a quick charge without ever opening
up your laptop bag.
The Future of Your PC's Software
64-Bit Computing Allows for More RAM
In 1985, Intel introduced the 386, its first 32-bit x86 CPU. It wasn't until
1993 that the first fully 32-bit Windows OS--Windows NT 3.1--followed,
officially ending the 16-bit era. Now 64-bit processors have become the
norm in desktops and notebooks, though Microsoft still won't commit to
an all-64-bit Windows. But it can't live in the 32-bit world forever.
What is it? 64-bit versions of Windows have been
around since Windows XP, and 64-bit CPUs have been with us even longer.
In fact, virtually every computer sold today has a 64-bit processor
under the hood. At some point Microsoft will have to jettison 32-bit
altogether, as it did with 16-bit when it launched Windows NT, if it
wants to induce consumers (and third-party hardware and software
developers) to upgrade. That isn't likely with Windows 7: The upcoming
OS is already being demoed in 32-bit and 64-bit versions. But
limitations in 32-bit's addressing structure will eventually force
everyone's hand; it's already a problem for 32-bit Vista users, who have
found that the OS won't access more than about 3GB of RAM because it
simply doesn't have the bits to access additional memory.
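That 3GB figure isn't arbitrary; it falls straight out of the arithmetic of a 32-bit address. A quick sketch follows; the amount reserved for devices is a typical example and varies from PC to PC.

```python
# Why 32-bit Windows tops out around 3GB: a 32-bit pointer can name only
# 2**32 distinct byte addresses, and part of that range is claimed by
# memory-mapped hardware (video RAM, PCI devices) rather than system RAM.
address_space = 2 ** 32
reserved_for_devices = 1 * 2 ** 30        # roughly 1GB on many PCs

print(f"32-bit address space: {address_space / 2**30:.0f} GB")                           # 4
print(f"usable for RAM:       {(address_space - reserved_for_devices) / 2**30:.0f} GB")  # ~3
print(f"64-bit address space: {2**64 / 2**60:.0f} exbibytes")                            # 16
```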
When is it coming? Expect to see the shift toward 64-bit accelerate with Windows 7;
Microsoft will likely switch over to 64-bit exclusively with Windows 8.
That'll be 2013 at the earliest. Meanwhile, Mac OS X Leopard is already
64-bit, and some hardware manufacturers are currently trying to
transition customers to 64-bit versions of Windows (Samsung says it will
push its entire PC line to 64-bit in early 2009). And what about
128-bit computing, which would represent the next big jump? Let's tackle
one sea change at a time--and prepare for that move around 2025.
Windows 7: It's Inevitable
Whether you love Vista or hate it, the current Windows will soon go to that
great digital graveyard in the sky. After the tepid reception Vista
received, Microsoft is putting a rush on Vista's follow-up, known
currently as Windows 7.
What is it? At this point Windows 7 seems to be the OS that Microsoft wanted to release as Vista,
but lacked the time or resources to complete. Besides continuing
refinements to the security system of the OS and to its look and feel,
Windows 7 may finally bring to fruition the long-rumored database-like
WinFS file system. Performance and compatibility improvements over Vista
are also expected.
But the main thrust of Windows 7 is likely to be enhanced online integration and more cloud computing features--look
for Microsoft to tie its growing Windows Live services into the OS more
strongly than ever. Before his retirement as Microsoft's chairman, Bill
Gates suggested that a so-called pervasive desktop would be a focus of
Windows 7, giving users a way to take all their data, desktop settings,
bookmarks, and the like from one computer to another--presumably as long
as all those computers were running Windows 7.
When is it coming? Microsoft has set a target date
of January 2010 for the release of Windows 7, and the official date
hasn't slipped yet. However, rumor has the first official beta coming
out before the end of this year.
Google's Desktop OS
In case you haven't noticed, Google now has its well-funded mitts on just about every aspect of computing. From Web browsers to cell phones,
soon you'll be able to spend all day in the Googleverse and never have
to leave. Will Google make the jump to building its own PC operating
system next?
What is it? It's everything, or so it seems. Google
Checkout provides an alternative to PayPal. Street View is well on its
way to taking a picture of every house on every street in the United
States. And the fun is just starting: Google's early-beta Chrome browser
earned a 1 percent market share in the first 24 hours of its existence.
Android, Google's cell phone operating system, is hitting handsets as
you read this, becoming the first credible challenger to the iPhone
among sophisticated customers.
When is it coming? Though Google seems to have
covered everything, many observers believe that logically it will next
attempt to attack one very big part of the software market: the
operating system.
The Chrome browser is the first toe Google has dipped into these
waters. While a browser is how users interact with most of Google's
products, making the underlying operating system somewhat irrelevant,
Chrome nevertheless needs an OS to operate.
To make Microsoft irrelevant, though, Google would have to work its
way through a minefield of device drivers, and even then the result
wouldn't be a good solution for people who have specialized application
needs, particularly most business users. But a simple Google OS--perhaps
one that's basically a customized Linux distribution--combined with
cheap hardware could be something that changes the PC landscape in ways
that smaller players who have toyed with open-source OSs so far haven't
been quite able to do.
The Future of Entertainment
Gesture-Based Remote Control
What is it? Compared with the intricacies of voice
recognition, gesture recognition is a fairly simple idea that is only
now making its way into consumer electronics. The idea is to employ a
camera (such as a laptop's Webcam) to watch the user and react to the
person's hand signals. Holding your palm out flat would indicate "stop,"
for example, if you're playing a movie or a song. And waving a fist
around in the air could double as a pointing system: You would just move
your fist to the right to move the pointer right, and so on.
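To make the idea concrete, here is a minimal sketch of camera-based gesture control (not Toshiba's actual implementation): it uses the OpenCV library to watch a webcam and toggles playback whenever a large share of the image moves at once, as a crude stand-in for a hand wave. The thresholds are arbitrary illustrative values.

```python
# Minimal sketch of camera-based gesture control -- NOT Toshiba's system.
# Toggles play/pause whenever a large share of the webcam image moves at
# once (a crude proxy for a hand wave). Requires OpenCV
# (pip install opencv-python); thresholds are illustrative values.
import cv2

MOTION_THRESHOLD = 25      # per-pixel brightness change that counts as motion
TRIGGER_FRACTION = 0.25    # fraction of pixels that must move to fire
COOLDOWN_FRAMES = 30       # ignore further gestures for ~1 second afterward

cap = cv2.VideoCapture(0)                        # default webcam
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
paused, cooldown = False, 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    moving = (cv2.absdiff(gray, prev) > MOTION_THRESHOLD).mean()
    if cooldown == 0 and moving > TRIGGER_FRACTION:
        paused = not paused
        print("pause" if paused else "play")
        cooldown = COOLDOWN_FRAMES
    cooldown = max(0, cooldown - 1)
    prev = gray
    cv2.imshow("gesture demo", frame)
    if cv2.waitKey(30) & 0xFF == ord("q"):       # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```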
When is it coming?
Gesture recognition systems
are creeping onto the market now. Toshiba, a pioneer in this market,
has at least one product out that supports an early version of the
technology: the Qosmio G55 laptop, which can recognize gestures to
control multimedia playback. The company is also experimenting with a TV
version of the technology, which would watch for hand signals via a
small camera atop the set. Based on my tests, though, the accuracy of
these systems still needs a lot of work.
Gesture recognition is a neat way to pause the DVD on your laptop,
but it's probably still some way off from being sophisticated enough for
broad adoption. All the same, its successful development would excite
tons of interest from the "can't find the remote" crowd. Expect to see
gesture recognition technology make some great strides over the next few
years, with inroads into mainstream markets by 2012.
Radical Simplification Hits the TV Business
The back of most audiovisual centers looks like a tangle of snakes
that even Medusa would turn away from. Similarly, the bowl of remote
controls on your coffee table appeals to no one. The Tru2way platform
may simplify things once and for all.
What is it? Who can forget CableCard,
a technology that was supposed to streamline home A/V installations but
that ultimately went nowhere despite immense coverage and hype?
CableCard just didn't do enough--and what it managed to do, it didn't do
very well. Enter Tru2way.
Tru2way is a set of services and standards designed to pick up the
pieces of CableCard's failure by upgrading what that earlier standard
could do (including support for two-way communications features like
programming guides and pay-per-view, which CableCard TVs couldn't
handle), and by offering better compatibility, improved stability, and
support for dual-tuner applications right out of the box. So if you have
a Tru2way-capable TV, you should need only to plug in a wire to be up
and running with a full suite of interactive cable services (including
local search features, news feeds, online shopping, and games)--all sans
additional boxes, extra remotes, or even a visit from cable-company
technicians.
When is it coming? Tru2way sets have been
demonstrated all year, and Chicago and Denver will be the first markets
with the live technology. Does Tru2way have a real shot? Most of the
major cable companies have signed up to implement it, as have numerous
TV makers, including LG, Panasonic, Samsung, and Sony. Panasonic began
shipping two Tru2way TVs in late October, and Samsung may have sets that
use the technology available in early to mid-2009.
Curtains for DRM
What is it? It's not what it is, it's what it
isn't--axing DRM means no more schemes to prevent you from moving audio
or video from one form of media to another. The most ardent DRM critics
dream of a day when you'll be able to take a DVD, pop it in a computer,
and end up with a compressed video file that will play on any device in
your arsenal. Better yet, you won't need that DVD at all: You'll be able
to pay a few bucks for an unprotected, downloadable version of the
movie that you can redownload any time you wish.
When is it coming? Technologically speaking, nothing
is stopping companies from scrapping DRM tomorrow. But legally and
politically, resistance persists. Music has largely made the transition
already--Amazon and iTunes both sell DRM-free MP3s that you can play on as many devices as you want.
Video is taking baby steps in the same direction, albeit slowly so
far. One recent example: RealNetworks' RealDVD software (which is now
embroiled in litigation) lets you rip DVDs to your computer with one
click, but they're still protected by a DRM system. Meanwhile, studios
are experimenting with bundling legally rippable digital copies of their
films with packaged DVDs, while online services are tiptoeing into
letting downloaders burn a copy of a digital movie to disc.
That's progress, but ending all DRM as we know it is still years off. Keep your fingers crossed--for 2020.
The Future of Mobile Phones
Use Any Phone on Any Wireless Network
The reason most cell phones are so cheap is that wireless carriers
subsidize them so you'll sign a long-term contract. Open access could
change the economics of the mobile phone (and mobile data) business
dramatically as the walls preventing certain devices from working on
certain networks come down. We could also see a rapid proliferation of
cell phone models, with smaller companies becoming better able to make
headway into formerly closed phone markets.
What is it? Two years is an eternity in the cellular
world. The original iPhone was announced, introduced, and discontinued
in less than that time, yet carriers routinely ask you to sign up for
two-year contracts if you want access to their discounted phones. (It
could be worse--in other countries, three years is normal.) Verizon
launched the first volley late last year when it promised that "any
device, any application" would soon be allowed on its famously closed network. Meanwhile, AT&T and T-Mobile like to note that their GSM networks have long been "open."
When is it coming? Open access is partially here:
You can use almost any unlocked GSM handset on AT&T or T-Mobile
today, and Verizon Wireless began certifying third-party devices for its
network in July (though to date the company has approved only two
products). But the future isn't quite so rosy, as Verizon is dragging
its feet a bit on the legal requirement that it keep its newly acquired
700-MHz network open to other devices, a mandate that the FCC agreed to
after substantial lobbying by Google. Some experts have argued that the
FCC provisions aren't wholly enforceable. However, we won't really know
how "open" is defined until the new network begins rolling out, a debut
slated for 2010.
Your Fingers Do Even More Walking
Last year Microsoft introduced Surface,
a table with a built-in monitor and touch screen; many industry
watchers have seen it as a bellwether for touch-sensitive computing
embedded into every device imaginable. Surface is a neat trick, but the
reality of touch devices may be driven by something entirely different
and more accessible: the Apple iPhone.
What is it? With the iPhone, "multitouch" technology
(which lets you use more than one finger to perform specific actions)
reinvented what we knew about the humble touchpad. Tracing a single
finger on most touchpads looks positively simian next to some of the
tricks you can do with two or more digits. Since the iPhone's launch,
multitouch has found its way into numerous mainstream devices, including
the Asus Eee PC 900 and a Dell Latitude tablet PC. Now all eyes are
turned back to Apple, to see how it will further adapt multitouch (which
it has already brought to its laptops' touchpads). Patents that Apple
has filed for a multitouch tablet PC have many people expecting the
company to dive into this neglected market, finally bringing tablets
into the mainstream and possibly sparking explosive growth in the
category.
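The math underneath a multitouch gesture such as pinch-to-zoom is refreshingly simple: track the distance between two fingertips and use its ratio over time as the zoom factor. Here's a toy sketch with made-up touch coordinates.

```python
import math

# The zoom factor of a pinch gesture is the ratio of the current distance
# between two fingertips to the distance when the gesture began.
# Coordinates are made-up example touch points in screen pixels.
def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

start_a, start_b = (100, 300), (200, 300)     # fingers at gesture start
now_a, now_b = (60, 300), (260, 300)          # fingers after spreading apart

zoom = distance(now_a, now_b) / distance(start_a, start_b)
print(f"zoom factor: {zoom:.1f}x")            # 2.0x -- twice as far apart
```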
When is it coming? It's not a question of when
multitouch will arrive, but how quickly the trend will grow. Fewer than
200,000 touch-screen devices were shipped in 2006. iSuppli analysts have
estimated that a whopping 833 million will be sold in 2013. The real
guessing game is figuring out when the old "single-touch" pads become
obsolete, possibly taking physical keyboards along with them in many
devices.
Cell Phones Are the New Paper
What is it? The idea of the paperless office has
been with us since Bill Gates was in short pants, but no matter how
sophisticated your OS or your use of digital files in lieu of printouts
might be, they're of no help once you leave your desk. People need
printouts of maps, receipts, and instructions when a computer just isn't
convenient. PDAs failed to fill that need, so coming to the rescue are
their replacements: cell phones.
Applications to eliminate the need for a printout in nearly any situation are flooding the market. Cellfire offers mobile coupons you can pull up on your phone and show to a clerk; Tickets.com
now makes digital concert passes available via cell phone through its
Tickets@Phone service. The final frontier, though, remains the airline
boarding pass, which has resisted this next paperless step since the
advent of Web-based check-in.
When is it coming? Some cell-phone apps that replace paper are here now (just look at the ones for the iPhone), and even paperless boarding passes are creeping forward. Continental has been experimenting with a cell-phone check-in system
that lets you show an encrypted, 2D bar code on your phone to a TSA
agent in lieu of a paper boarding pass. The agent scans the bar code
with an ordinary scanner, and you're on your way. Introduced at the
Houston Intercontinental Airport, the pilot project became permanent
earlier this year, and Continental rolled it out in three other airports
in 2008. The company promises more airports to come. (Qantas will be doing something similar early next year.)
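The underlying trick is to pack the passenger record into a 2D bar code that a gate scanner can read off the phone's screen. Here's a hedged sketch using the third-party qrcode Python package; the payload format below is invented for illustration, whereas real airline passes follow the IATA bar-coded boarding pass standard and carry a cryptographic signature.

```python
# Pack a passenger record into a 2D bar code a gate scanner can read from
# a phone screen. Payload format is invented for illustration; real passes
# follow the IATA bar-coded boarding pass spec and are signed.
# Requires the third-party "qrcode" package (pip install qrcode pillow).
import qrcode

payload = "DOE/JOHN|CO1234|IAH>EWR|SEAT 21C|2009-02-14"
img = qrcode.make(payload)           # returns a PIL image of the bar code
img.save("boarding_pass.png")        # show this image at the gate
print("wrote boarding_pass.png")
```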
Where You At? Ask Your Phone, Not Your Friend
What is it? Location-based services (LBS) were originally envisioned as simply
using old-school cell-phone signal triangulation to locate users'
whereabouts, but as the chips become more common and more sophisticated,
GPS is proving to be not only handy and accurate but also the basis for
new services. Many startups have formed around location-based services.
Want a date? Never mind who's compatible; who's nearby? MeetMoi can find them. Need to get a dozen people all in one place? Both Whrrl and uLocate's Buddy Beacon tell you where your friends are in real time.
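Under the hood, a "who's nearby?" feature boils down to comparing GPS fixes. This sketch (not how MeetMoi or Whrrl actually do it; their implementations aren't public) uses the standard haversine great-circle formula with made-up names and coordinates.

```python
import math

# "Who's nearby?" reduces to comparing GPS fixes with the haversine
# great-circle formula. Names, coordinates, and the 15-km threshold are
# made up for illustration.
def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0                                   # Earth's mean radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

me = (37.7749, -122.4194)                        # downtown San Francisco
friends = {"alice": (37.8044, -122.2712), "bob": (34.0522, -118.2437)}

for name, (lat, lon) in friends.items():
    d = haversine_km(me[0], me[1], lat, lon)
    status = "nearby" if d < 15 else "too far away"
    print(f"{name}: {d:.1f} km -- {status}")
```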
Of course, not everyone is thrilled about LBS: Worries about
surreptitious tracking or stalking are commonplace, as is the
possibility of a flood of spam messages being delivered to your phone.
When is it coming? LBS is growing fast. The only
thing holding it back is the slow uptake of GPS-enabled phones (and
carriers' steep fees to activate the function). But with iPhones selling
like Ben & Jerry's in July, that's not much of a hurdle to
overcome. Expect to see massive adoption of these technologies in 2009
and 2010.
The Future of Your PC's Hardware
Memristor: A Groundbreaking New Circuit

Photograph: Courtesy of HP
Since
the dawn of electronics, we've had only three types of circuit
components--resistors, inductors, and capacitors. But in 1971, UC
Berkeley researcher Leon Chua theorized the possibility of a fourth type
of component, one that would be able to measure the flow of electric
current: the memristor. Now, just 37 years later, Hewlett-Packard has
built one.
What is it? As its name implies, the memristor can "remember" how much current has passed through it. And by altering the amount of current that passes through it, a memristor can also become a one-element circuit component with unique properties. Most notably, it can save its electronic state even when the current is turned off, making it a great candidate to replace today's flash memory.
Memristors will theoretically be cheaper and far faster than flash memory, and allow far greater memory densities. They could also replace RAM chips as we know them, so that, after you turn off your computer, it will remember exactly what it was doing when you turn it back on, and return to work instantly. This lowering of cost and consolidating of components may lead to affordable, solid-state computers that fit in your pocket and run many times faster than today's PCs.
Someday the memristor could spawn a whole new type of computer, thanks to its ability to remember a range of electrical states rather than the simplistic "on" and "off" states that today's digital processors recognize. By working with a dynamic range of data states in an analog mode, memristor-based computers could be capable of far more complex tasks than just shuttling ones and zeroes around.
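HP's researchers have published a simple mathematical description of the device, the linear ion-drift model from their 2008 Nature paper (Strukov et al.), and it captures the "remembers how much charge has passed through" behavior in a few lines. The component values in this sketch are illustrative, not HP's actual device parameters.

```python
import math

# Sketch of the linear ion-drift memristor model (Strukov et al., Nature
# 2008). Component values are illustrative, not HP's device parameters.
R_ON, R_OFF = 100.0, 16_000.0     # ohms: fully doped vs. undoped film
D = 10e-9                         # meters: thickness of the TiO2 film
MU_V = 1e-14                      # m^2/(V*s): dopant drift mobility
Q_MAX = D ** 2 / (MU_V * R_ON)    # charge that fully switches the device

def memristance(q):
    """Resistance as a function of the net charge that has flowed through."""
    return R_OFF - (R_OFF - R_ON) * (q / Q_MAX)

# Drive the device with a slow sine wave and integrate the current: the
# resistance drifts with accumulated charge -- the device's "memory".
dt, q, resistances = 1e-5, 0.0, []
for step in range(100_000):                              # one simulated second
    v = 5.0 * math.sin(2 * math.pi * 2.0 * step * dt)    # 2 Hz, 5 V drive
    m = memristance(q)
    q = min(max(q + (v / m) * dt, 0.0), Q_MAX)           # state stays in its window
    resistances.append(m)

print(f"resistance swung between {min(resistances):.0f} and {max(resistances):.0f} ohms")
```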
When is it coming? Researchers say that no real barrier prevents implementing the memristor in circuitry immediately. But it's up to the business side to push products through to commercial reality. Memristors made to replace flash memory (at a lower cost and lower power consumption) will likely appear first; HP's goal is to offer them by 2012. Beyond that, memristors will likely replace both DRAM and hard disks in the 2014-to-2016 time frame. As for memristor-based analog computers, that step may take 20-plus years.
32-Core CPUs From Intel and AMD
Photograph: Courtesy of Intel
If your CPU has only a single core, it's officially a dinosaur. In fact, quad-core computing
is now commonplace; you can even get laptop computers with four cores
today. But we're really just at the beginning of the core wars:
Leadership in the CPU market will soon be decided by who has the most
cores, not who has the fastest clock speed.
What is it? With the gigahertz race largely abandoned, both AMD and Intel are trying to pack more cores onto a die in order to continue to improve processing power and aid with multitasking operations. Miniaturizing chips further will be key to fitting these cores and other components into a limited space. Intel will roll out 32-nanometer processors (down from today's 45nm chips) in 2009.
When is it coming? Intel has been very good about sticking to its road map. A six-core CPU based on the Itanium design should be out imminently; after that, Intel will shift its focus to a brand-new architecture called Nehalem, to be marketed as Core i7. Core i7 will feature up to eight cores, with eight-core systems available in 2009 or 2010. (And an eight-core AMD project called Montreal is reportedly on tap for 2009.)
After that, the timeline gets fuzzy. Intel reportedly canceled a 32-core project called Keifer, slated for 2010, possibly because of its complexity (the company won't confirm this, though). That many cores requires a new way of dealing with memory; apparently you can't have 32 brains pulling out of one central pool of RAM. But we still expect cores to proliferate when the kinks are ironed out: 16 cores by 2011 or 2012 is plausible (when transistors are predicted to drop again in size to 22nm), with 32 cores by 2013 or 2014 easily within reach. Intel says "hundreds" of cores may come even farther down the line.
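As a reminder of why extra cores matter at all, here is a minimal, vendor-agnostic sketch of splitting independent work across however many cores a machine reports; the prime-counting workload is just illustrative busywork, not anything Intel- or AMD-specific.

```python
# Independent chunks of CPU-bound work can run in parallel across however
# many cores the machine reports. The prime-counting job is busywork.
import multiprocessing as mp

def count_primes(limit):
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    jobs = [50_000] * 8                                   # eight independent tasks
    with mp.Pool(processes=mp.cpu_count()) as pool:       # one worker per core
        results = pool.map(count_primes, jobs)
    print(f"{mp.cpu_count()} cores available; prime counts: {results}")
```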
25 Years of Predictions:
Our Greatest Hits
Predicting the future isn't easy. Sometimes PC World has
been right on the money. At other times, we've missed it by a mile. Here
are three predictions we made that were eerily prescient--and three
where we may have been a bit too optimistic.
1983
What we said: "The mouse will bask in the computer world
limelight... Like the joystick before it, though, the mouse will fade
someday into familiarity."
We hit that one out of the park. Mice are so commonplace that they're practically disposable.
1984
What we said: "Microsoft Windows should have a lasting effect on the entire personal computer industry."
"Lasting" was an understatement. Windows has now amassed for
Microsoft total revenues in the tens of billions of dollars and is so
ubiquitous and influential that it has been almost perpetually embroiled in one lawsuit or another, usually involving charges of monopoly or of trademark and patent infringements.
1988
What we said:"In the future you'll have this little box
containing all your files and programs... It's very likely that
eventually people will always carry their data with them."
For most people, that little box is now also their MP3 player or cell phone.
And Biggest Misses
1987
What we said: "When you walk into an office in 1998, the PC
will sense your presence, switch itself on, and promptly deliver your
overnight e-mail, sorted in order of importance."
When we arrive in our office, the computer ignores us, slowly delivers the overnight e-mail, and puts all the spam on top.
1994
What we said: "Within five years... batteries that last a year, like watch batteries today, will power [PDAs]."
Perhaps our biggest whiff of all time. Not only do these
superbatteries not exist (nor are they even remotely in sight), but PDAs
are pretty much dead too.
2000
What we said: We wrote about future
"computers that pay attention to you, sensing where you are, what you're
doing, and even what your vital signs are... Products incorporating
this kind of technology...could hit the market within a year."
While many devices now feature location-sensing hardware, such a PC
has yet to come to pass. And frankly, we'd be glad to be wrong about
this one.