That actor--who was he? My augmented-reality contact lenses pick up the unique eye motion I make when I have a query, which I then enter on a virtual keyboard that appears in the space in front of me. Suddenly my field of vision is covered with a Web page showing a list of the actor's movies, along with some embedded video clips.
These technologies will come to life in the distant future, right? Future, yes. Distant, no.
Speed and content (much of it video) will be paired consistently across mobile, laptop, desktop, and home-entertainment systems. New ways of using video--including adding 3D depth or artificial visual overlays--will require more speed, storage, and computational power.
In our preview of technologies that are well on their way to reality, we look at the connective tissue of USB 3.0, 802.11ac, and 802.11ad for moving media--especially video--faster; at HTML5 for displaying video and content of all kinds consistently across all our devices; at augmented reality to see how the digital world will stretch into our physical reality by overlaying what we see with graphics and text; and at 3D TV, which will add image depth and believability to the experience of watching TV.
USB 3.0
The new USB 3.0 standard preserves backward compatibility by allowing older cables to plug into newer jacks; but newer cables like this one have extra pins that boost the data rate to 4.8 gbps.
Before you leave work, you need to back up your computer. You push a button, and 5 minutes later, while you're still packing up, your system has dumped 150GB of data onto an encrypted 512GB superfast solid-state drive, which you eject to take with you for offsite backup. On your way home, you stop at a movie kiosk outside a fast-food restaurant and buy a feature-length 3D video download on sale. You plug in your drive, the kiosk reads your credentials, and while you watch a 90-second preview of coming attractions, the 30GB video transfers onto your SSD. You pull out the drive and head home.

USB may be one of the least-sexy technologies built into present-day computers and mobile devices, but speed it up tenfold, and it begins to sizzle. Cut most of the other cables to your computer, and the standard ignites. Bring in the potential of uncompressed video transfer, and you have a raging fire.
Any task that involves transferring data between your PC and a peripheral device--scanning, printing, or transferring files, among others--will be far faster with USB 3.0. In many cases, the transfer will be complete before you realize it has started.
The 3.0 revision of USB, dubbed SuperSpeed by the folks who control testing and licensing at the USB Implementers Forum (USB-IF), is on track to deliver more than 3.2 gigabits per second (gbps) of actual throughput. That transfer rate will make USB 3.0 five to ten times faster than other common desktop peripheral standards, except some flavors of DisplayPort and the increasingly out-of-favor eSATA.
In addition, USB 3.0 can shoot full-speed data in both directions at the same time, an upgrade from 2.0's "half duplex" (one direction at a time) rates. USB 3.0 jacks will accept 1.0 and 2.0 plug ends for backward compatibility, but 3.0 cables will work only with 3.0 jacks.
This technology could be a game-changer for device connectivity. A modern desktop computer may include jacks to accommodate ethernet, USB 2.0, FireWire 400 or 800 (IEEE 1394a or 1394b) or both, DVI or DisplayPort or both, and--on some models--eSATA. USB 3.0 could eliminate all of these except ethernet. In their place, a computer may have several USB 3.0 ports, delivering data to monitors, retrieving it from scanners, and exchanging it with hard drives. The improved speed comes at a good time, as much-faster flash memory drives are in the pipeline.
USB 3.0 is fast enough to allow uncompressed 1080p video (currently our highest-definition video format) at 60 frames per second, says Jeff Ravencraft, president and chair of the USB-IF. That would enable a camcorder to forgo video compression hardware and patent licensing fees for MPEG-4. The user could either stream video live from a simple camcorder (with no video processing required) or store it on an internal drive for later rapid transfer; neither of these methods is feasible today without heavy compression. Citing 3.0's versatility, some analysts see the standard as a possible complement--or even alternative--to the consumer HDMI connection found on today's Blu-ray players.
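That uncompressed-video claim is easy to check with a little arithmetic. Below is a rough sketch (in TypeScript) of the bandwidth that uncompressed 1080p at 60 frames per second requires, assuming 24 bits per pixel; real camcorder formats may use different color sampling.

    // Back-of-envelope bandwidth for uncompressed 1080p video at 60 frames per
    // second, assuming 24 bits per pixel (8 bits each for red, green, and blue).
    const width = 1920;
    const height = 1080;
    const bitsPerPixel = 24;
    const framesPerSecond = 60;

    const bitsPerSecond = width * height * bitsPerPixel * framesPerSecond;
    const gbps = bitsPerSecond / 1e9;

    // Roughly 2.99 gbps: just under USB 3.0's projected ~3.2 gbps of usable
    // throughput, but far more than a single Wi-Fi channel carries today.
    console.log(`Uncompressed 1080p60: ~${gbps.toFixed(2)} gbps`);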
The new USB flavor could also turn computers into real charging stations. Whereas USB 2.0 can produce 100 milliamperes (mA) of trickle charge for each port, USB 3.0 ups that quantity to 150mA per device. USB 2.0 tops out at 500mA for a hub; the maximum for USB 3.0 is 900mA.
With mobile phones moving to support USB as the standard plug for charging and syncing (the movement is well underway in Europe and Asia), and with U.S. carriers having recently committed to doing the same, the increased amperage of USB 3.0 might let you do away with wall warts (AC adapters) of all kinds.
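To put those limits in rough perspective, here is a minimal charging-time sketch. The 1500mAh battery capacity is an assumption chosen for illustration (the article gives no figure), and the math ignores charging losses and the slow taper at the end of a charge.

    // Hypothetical charging-time comparison at USB 2.0's 500mA hub limit versus
    // USB 3.0's 900mA. Battery capacity is assumed; losses are ignored.
    const batteryCapacityMah = 1500;              // assumed handset battery
    const hoursAtUsb2 = batteryCapacityMah / 500; // ~3.0 hours
    const hoursAtUsb3 = batteryCapacityMah / 900; // ~1.7 hours
    console.log(`USB 2.0 hub: ~${hoursAtUsb2.toFixed(1)} hours`);
    console.log(`USB 3.0 hub: ~${hoursAtUsb3.toFixed(1)} hours`);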
In light of the increased importance and use of USB in its 3.0 version, future desktop computers may very well have two internal hubs, with several ports easily accessible in the front to act as a charging station. Each hub could have up to six ports and support the full amperage. Meanwhile, laptop machines could multiply USB ports for better charging and access on the road. (Apple's Mac Mini already includes five USB 2.0 ports on its back.)
The higher speed of 3.0 will accelerate data transfers, of course, moving more than 20GB of data per minute. This will make performing backups (and maintaining offsite backups) of increasingly large collections of images, movies, and downloaded media a much easier job.
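The arithmetic behind that figure, as a rough sketch (actual rates will depend on the drive and host controller):

    // Transfer-rate arithmetic at USB 3.0's projected ~3.2 gbps of actual
    // throughput. Real-world rates depend on the drive and controller.
    const usableGbps = 3.2;
    const gigabytesPerSecond = usableGbps / 8;          // ~0.4 GB per second
    const gigabytesPerMinute = gigabytesPerSecond * 60; // ~24 GB per minute

    // The 30GB movie from the kiosk scenario above:
    const movieSeconds = 30 / gigabytesPerSecond;       // ~75 seconds

    console.log(`~${gigabytesPerMinute.toFixed(0)}GB per minute`);
    console.log(`30GB movie in ~${movieSeconds.toFixed(0)} seconds`);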
Possible new applications for the technology include on-the-fly syncs and downloads (as described in the case study above). The USB-IF's Ravencraft notes that customers could even download movies at the gas pump of a filling station. "With high-speed USB [2.0], you couldn't have people waiting in line at 15 minutes a crack to download a movie," Ravencraft says.
Manufacturers are poised to take advantage of USB 3.0, and analysts predict mass adoption of the standard on computers within a couple of years. The format will be popular in mobile devices and consumer electronics, as well. Ravencraft says that manufacturers currently sell more than 2 billion devices with built-in USB each year, so there's plenty of potential for getting the new standard out fast.
Video Streaming Over Wi-Fi
Today's Wi-Fi will be left in the dust by 802.11ac and 802.11ad, both of which will be capable of carrying multiple video streams and of operating at far higher data rates.
When you get home--with your high-def, 3D movie stored on a flash drive--you plug the drive into your laptop and transfer it to your network file server over a gigabit Wi-Fi connection. A couple of minutes later, the movie is ready to stream via a 60GHz wireless link from your networked entertainment center to your wall-mounted HDTV.

Wired ethernet has consistently achieved higher data speeds than Wi-Fi, but wireless standards groups are constantly trying to figure out ways to help Wi-Fi catch up. By 2012, two new protocols--802.11ac and 802.11ad--should be handling over-the-air data transmission at 1 gbps or faster.
As a result, future users can have multiple high-definition video streams and gaming streams active across a house and within a room. Central media servers, Blu-ray players, and other set-top boxes can sit anywhere in the home, streaming content to end devices in any location. For example, an HD video display, plugged in with just a power cord, can stand across the room from a Blu-ray player, satellite receiver, or computer--no need for expensive, unsightly cables.
The 802.11ac and 802.11ad standards should be well suited for home use, though their applications will certainly extend far beyond the home. The names reflect the internal method of numbering that the engineering group IEEE uses: 802 for networking, 11 for wireless, and one or more letters in sequence for specific task groups (that's how we got 802.11a, b, g, h, n, and others).
The 802.11ac standard will update 802.11n, the latest and greatest of a decade's worth of wireless local area networking (WLAN) technology that began with 802.11b. With 802.11ac, wireless networking performance will leap from a theoretical top speed of 600 mbps to a nominal maximum of more than 1 gbps. In practice, the net data rate of 802.11ac will likely be between 300 mbps and 400 mbps--up from 160 mbps or so for a good real-world 802.11n setup, and more than enough capacity to carry multiple compressed video streams simultaneously over a single channel. Alternatively, users may assign individual streams, each running on its own frequency, to a number of separate channels. Like 802.11n, 802.11ac will use multiple antennas for receiving and sending data wirelessly.
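How many streams is "multiple"? Here's a back-of-envelope sketch; the per-stream bit rate is an assumption chosen for illustration, since compressed HD video runs anywhere from under 10 mbps to well over 20 mbps.

    // Rough capacity estimate for a real-world 802.11ac link.
    const linkMbps = 350;   // midpoint of the 300-400 mbps estimate above
    const streamMbps = 15;  // assumed bit rate for one compressed 1080p stream
    console.log(`Roughly ${Math.floor(linkMbps / streamMbps)} simultaneous HD streams`);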
The 802.11ac flavor still won't have the capacity to carry lossless high-definition video (video that retains the full fidelity and quality of the raw source), however. Today, lossless video is common over wired connections after decompression or decoding of a data stream from a satellite, cable, or disc. The right hardware will be able to take the 802.11ac compressed data stream and send it directly to a decoder in an HDTV set; some HD sets already have this capability today. But when uncompressed video has to stream at a rate faster than 1 gbps, a speedier format must be used.
That's where 802.11ad comes in. It abandons the 2.4GHz and 5GHz bands of the spectrum (where today's Wi-Fi works) in favor of the newly available 60GHz band. Because the 60GHz spectrum has an ocean of frequencies available in most countries--including the United States--you'll be able to use multiple distinct channels to carry more than 1 gbps of uncompressed video each.
Unfortunately, the millimeter-scale waves that make up 60GHz signals penetrate walls and furniture poorly, and oxygen readily absorbs the waves' energy. So 802.11ad is best suited for moving data across short distances between devices in the same room. Apart from supporting fast video transfers, 802.11ad will permit you to move files or sync data between devices at speeds approaching those of USB 3.0--and some 1000 times faster than Bluetooth 2.
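Those two comparisons line up arithmetically if you assume Bluetooth 2.0+EDR's nominal rate of roughly 3 mbps (a figure the article doesn't give):

    // Sanity check: 1000 times Bluetooth 2.0+EDR's nominal ~3 mbps lands at
    // about 3 gbps, in the same neighborhood as USB 3.0's usable throughput.
    const bluetooth2Mbps = 3; // assumed nominal Bluetooth 2.0+EDR rate
    const impliedGbps = (bluetooth2Mbps * 1000) / 1000;
    console.log(`~${impliedGbps} gbps, versus ~3.2 gbps for USB 3.0`);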
The 802.11ad spec is one of three competing ideas for using the 60GHz band of the spectrum. The WirelessHD trade group, a consortium of consumer electronics firms, is focusing on video uses of the 60GHz band, while the Wireless Gigabit Alliance (WiGig) is looking at networking and consumer uses. Membership in the various groups overlaps, making an interoperable and perhaps unified spec possible. Though 802.11ad doesn't specifically address video, it will be a generic technology that can accommodate many kinds of data. At a minimum, the groups will work to keep their technologies from interfering with one another.
The combination of 802.11ac and 802.11ad, coupled with USB 3.0, will allow you to position clusters of computer equipment and entertainment hardware around your home. USB 3.0 and gigabit ethernet might connect devices located in a cabinet or on a desk; 802.11ac will link clusters across a home; and 802.11ad will carry data to mobile devices, displays, and other gear within a room.
Allen Huotari, the technical leader at Cisco Consumer Products (which now includes Linksys products and ships millions of Wi-Fi and ethernet devices each year), says that the change in home networks won't result from "any one single technology in the home, but rather a pairing of technologies or a trio of technologies--wired and/or wireless--for the backbone and the wireless on the edges."
This means fewer wires and cables, better speeds, and higher-quality video playback than anything possible today. By 2012, both specifications should be readily available.
3D TV
Panasonic and other high-definition TV makers are looking to faux 3D technology to provide stereoscopic depth--and a reason for consumers to buy a newer set.
Disconnecting your active-shutter 3D glasses from a charger, you slip them on, eager to check out your downloaded copy of Hulk VI: Triumph of the Stretch Fabrics, the latest entrant in the green antihero's film franchise. You drop into a comfy chair, tell the kids it's time for a movie, and twist the heat pouch on a bag of popcorn to start it popping. The kids grab their own glasses and sit down to watch the Hulk knock the Predator practically into their laps!

When television makers introduced HDTVs, it was inevitable that they would figure out a way to render the technology obsolete not long after everyone bought a set. And they have. The next wave in home viewing is 3DTV--a 2D picture with some stereoscopic depth.
As 3D filmmaking and film projection technology have improved, Hollywood has begun building a (still small) library of depth-enhanced movies. The potential to synthesize 2D movies into 3D could feed demand, however--the way colorizing technology increased interest in black-and-white films in some circles in the 1980s. For movies based on computer animation--such as Toy Story 3D, a newly rendered version of the first two movies in the series--it's already happening.
The promise of 3D is a more immersive, more true-to-life experience, and substantively different from almost anything you've watched before. In commercial theaters, 3D projection typically involves superimposing polarized or distinctly colored images on each frame and then having viewers wear so-called "passive" glasses that reveal different images to each eye. The brain synthesizes the two images into a generally convincing notion of depth.
In contrast, 3D at home will almost certainly rely on alternating left and right views for successive frames. HDTVs that operate at 120Hz (that is, 120 cycles of refresh per second) are broadly available, so the ability to alternate left and right eye images far faster than the human eye can follow already exists. Fundamental industry standards are in place to allow such recording, says Alfred Poor, an analyst with GigaOm and the author of the Web site HDTV Almanac.
Viewing 3DTV displays will require "active" glasses that use rapidly firing shutters to alternate the view into each eye. Active glasses are expensive today, but their price will drop as 3D rolls out. Meanwhile, designers are in the development phase of producing a 3D set that doesn't require the glasses.
Sony and Panasonic have announced plans to produce 3D-capable displays, and Panasonic recently demonstrated a large-screen version that the company expects to ship in 2010. As happened when HDTVs rolled out, premium 3DTVs will appear first, followed by progressively more-affordable models.
Creating and distributing enough 3D content to feed consumers' interest may be more of a challenge. Poor notes that filmmakers are currently making or adapting only a handful of features each year for 3D. But techniques to create "synthetic 3D" versions of existing films (using various tracking, focus, and pattern cues for splitting images) could fill the gap.
Existing terrestrial cable and IPTV networks should be able to distribute 3D content. The bandwidth that such networks use to deliver typical HD broadcasts will be adequate for delivering 3D video once the networks upgrade to newer video compression techniques. Satellite may face a more difficult road, since such systems already use the best levels of compression.
For physical media playback, Blu-ray can store the data needed, and 3D Blu-ray players are already on the drawing board. No fundamental changes to Blu-ray will be necessary, so the trade group that created the standard is focusing on compatibility--such as ensuring that a 3D disc will still play on a 2D set.
Standards issues might not end up being very troublesome, so long as the 3DTVs are flexible enough. An industry group is working on setting some general parameters, much as digital TV was broken up into 480, 720, and 1080 formats, along with progressive and interlaced versions. A 3DTV may need to support multiple formats, but all will involve alternating images and a pair of shutter-based glasses.
Poor expects that 3DTV will be but a minor upgrade to existing HDTV sets. The upgraded sets will need a modified display controller that alternates images 60 times per second for each eye (120 images per second in total, a pace that today's 120Hz sets can already handle), as well as an infrared or wireless transmitter to send synchronization information to the 3D glasses.
"Augmented Reality" in Mobile Devices
Babak Parviz, a professor at the University of Washington specializing in nanotechnology, is working on a bionic contact lens that would paint imagery and information directly on the eye to augment reality.
You enjoyed Hulk VI so much on your home theater setup that you decide to see it on the big screen, but you're not sure which theater is showing it. In the old days, you might have printed out directions from MapQuest; nowadays you don't need to do anything so primitive. Instead, you dock your smartphone on the dashboard as you slip into your car, and instantly driving directions to the theater are superimposed on your car's windshield. As you approach your destination, you see a group of tall buildings. Superimposed on the windshield over one of them are the building's name, the name of the movie theater inside it, the title Hulk VI, and a countdown to show time. "Turn left in 100 yards," the navigator announces through your stereo as a large turning arrow appears, guiding you into the parking structure.

In Neal Stephenson's novel Snow Crash, "gargoyles" are freelance intelligence gatherers who have wired themselves to see (through goggles that annotate everything they experience) a permanent overlay of data on top of the physical world. In less immersive fashion, we may all become gargoyles as "augmented reality" becomes an everyday experience.
Augmented reality is a catchall term for overlaying what we see with computer-generated contextual data or visual substitutions. The point of the technology is to enhance our ability to interact with things around us by providing us with information immediately relevant to those things.
At work, you might walk around the office and see the name and department of each person you pass painted on them--along with a graphical indicator showing what tasks you owe them or they owe you. Though many such scenarios involve "heads-up" displays embedded in windshields or inside eyeglasses, the augmented reality we have today exists primarily on the "heads-down" screens of smartphones.
Several companies have released programs that overlay position- and context-based data onto a continuous video camera feed. The data comes from various radios and sensors built into modern smartphones, including GPS radios (for identifying position by satellite data), accelerometers (for measuring changes in speed and orientation), and magnetometers (for finding position relative to magnetic north).
In an application called Nearest Places, the names and locations of subway stops, parks, museums, restaurants, and other places of interest are shown on top of an iPhone's video feed. As you walk or turn, the information changes to overlay your surroundings.
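Under the hood, such apps combine the GPS position fix with the magnetometer's compass heading to decide where on the screen each label belongs. Here is a simplified sketch of that placement logic; the place name, coordinates, and field-of-view figure are invented for illustration, and the bearing math is the standard great-circle formula rather than anything taken from a particular app.

    // Simplified sketch of label placement in an augmented-reality overlay:
    // compute the compass bearing from the phone to a point of interest, then
    // compare it with the direction the camera is facing.
    interface PlaceOfInterest {
      name: string;
      lat: number;
      lon: number;
    }

    // Bearing from the user's position to a target, in degrees clockwise from north.
    function bearingTo(userLat: number, userLon: number, poi: PlaceOfInterest): number {
      const toRad = (d: number) => (d * Math.PI) / 180;
      const dLon = toRad(poi.lon - userLon);
      const y = Math.sin(dLon) * Math.cos(toRad(poi.lat));
      const x =
        Math.cos(toRad(userLat)) * Math.sin(toRad(poi.lat)) -
        Math.sin(toRad(userLat)) * Math.cos(toRad(poi.lat)) * Math.cos(dLon);
      return (Math.atan2(y, x) * 180) / Math.PI;
    }

    // Decide whether a point of interest falls inside the camera's field of view,
    // and if so, how far from the center of the screen its label should sit.
    function labelOffset(
      headingDeg: number,     // from the magnetometer
      fieldOfViewDeg: number, // horizontal field of view of the camera
      bearingDeg: number
    ): number | null {
      let delta = bearingDeg - headingDeg;
      delta = ((delta + 540) % 360) - 180; // normalize to -180..180 degrees
      if (Math.abs(delta) > fieldOfViewDeg / 2) return null; // off screen
      return delta / (fieldOfViewDeg / 2); // -1 = left edge, 0 = center, 1 = right edge
    }

    const museum: PlaceOfInterest = { name: "City Museum", lat: 47.61, lon: -122.34 };
    const offset = labelOffset(270, 60, bearingTo(47.6097, -122.3331, museum));
    console.log(offset === null ? "off screen" : `draw label at x offset ${offset.toFixed(2)}`);

A real application would also fold in the accelerometer, compensating for tilt and keeping labels steady as the phone moves.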
"Smartphones and the related apps are the trailblazers for augmented reality," says Babak Parviz, a professor at the University of Washington who specializes in nanotechnology. "In the short to medium term, my guess is that they will dominate the field."
Other prototype applications display information dropped at particular coordinates as 3D models that the user can walk around, or as animations whose details update in 3D relative to the user's position. But the technology for those apps isn't ripe yet; handhelds require a more-precise positioning mechanism in order to handle that kind of data insertion. Fortunately, each smartphone generation seems to include more and better sensors.
In other realms, augmented reality may serve to provide not just additional information but enhanced vision. One day, infrared cameras mounted on the front of a car could pick out a faraway object and render it as a bright-as-day image on an in-windshield display. Radar signals and wireless receivers will detect and display cars that are out of sight, and one piece of glass will host GPS navigation and traffic reporting.
Leaping past handheld screens, Parviz and his team are working on ways to put the display directly on the eyeball. They're trying to develop a technology for embedding video circuitry into wearable contact lenses. While wearing such contact lenses, you would see a continuous, context-based data feed overlaid on your field of vision.
Before Parviz's lenses become a reality, augmented reality is likely to become a routine navigation and interaction aid on mobile devices. In addition, game developers may use the technology to overlay complete digital game environments over the reality that gamers see around them.
HTML5
Web pages built with HTML5 will display the same on any browser--desktop or mobile.
Hulk VI was great, but what should you watch this evening? Before heading off to work in the morning, you click through a few trailers on a movie Website, but you don't have time to watch many. So you use your mobile phone to snap a picture of the 2D barcode on one of the videos; the phone's browser then takes you to the same site. On the commuter train to the office, you watch the previews over a 4G cell phone connection. A few of the movies have associated games that you try out on your phone, too.
Remember when every Website had a badge that read "optimized for Netscape Navigator" or "requires Internet Explorer 4"? In the old days, people made Web pages that worked best with--or only with--certain browsers. To some extent, they still do.
The new flavor of HTML--the standard language for writing Web pages--is called HTML5 (Hypertext Markup Language version 5), and it aims to put that practice to bed for good.
Specifically, HTML5 may do away with the need for audio, video, and interactive plug-ins. It will allow designers to create Websites that work essentially the same on every browser--whether on a desktop, a laptop, or a mobile device--and it will give users a better, faster, richer Web experience.
Instead of leaving each browser maker to rely on a combination of its in-house technology and third-party plug-ins for multimedia, HTML5 requires that the browser have built-in methods for audio, video, and 2D graphics display. Patent and licensing issues cloud the question of which audio and video formats will achieve universal support, but companies have plenty of motivation to work out those details.
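Here's a sketch of how a page script might probe for those built-in capabilities in today's browsers; the codec strings are illustrative examples, since, as noted above, which formats win universal support is still unsettled.

    // Feature-detection sketch: check whether the browser exposes HTML5's
    // built-in video, audio, and 2D canvas support without any plug-in.
    const video = document.createElement("video");
    const audio = document.createElement("audio");
    const canvas = document.createElement("canvas");

    const hasVideo = typeof video.canPlayType === "function";
    const hasAudio = typeof audio.canPlayType === "function";
    const has2dGraphics = !!(canvas.getContext && canvas.getContext("2d"));

    console.log(`video element: ${hasVideo ? "yes" : "no"}`);
    console.log(`audio element: ${hasAudio ? "yes" : "no"}`);
    console.log(`2D canvas: ${has2dGraphics ? "yes" : "no"}`);

    if (hasVideo) {
      // canPlayType returns "", "maybe", or "probably" for a given format.
      console.log(`H.264/MP4: ${video.canPlayType('video/mp4; codecs="avc1.42E01E"') || "no"}`);
      console.log(`Ogg Theora: ${video.canPlayType('video/ogg; codecs="theora"') || "no"}`);
    }

Pages can use checks like these to decide whether to rely on the built-in player or fall back to a plug-in while the transition plays out.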
In turn, Website designers and Web app developers won't have to deal with multiple incompatible formats and workarounds in their efforts to create the same user experience in every browser.
This is an especially valuable advance for mobile devices, as their browsers today typically have only limited multimedia support. The iPhone’s Safari browser, for example, doesn't handle Adobe Flash--even though Flash is a prime method of delivering video content across platforms and browsers.
"It'll take a couple of years to roll out, but if all the browser companies are supporting video display with no JavaScript [for compatibility handling], just the video tag and no plug-in, then there's no downside to using a mobile device," says Jeffrey Zeldman, a Web designer and leading Web standards guru. "Less and less expert users will have better and better experiences."
Makers of operating systems and browsers appear to be falling into line behind HTML5. Google Chrome, Apple Safari, Opera, and WebKit (the browser engine that underlies many mobile and desktop programs), among others, are all moving toward HTML5 support.
For its part, Microsoft says that Internet Explorer 8 will support only parts of HTML5. But Microsoft may not want to risk having its Internet Explorer browser lose more market share by resisting HTML5 in the face of consensus among the other OS and browser makers.
HTML5 is now completing its last march toward a final draft and official support by the World Wide Web Consortium.
Glenn Fleishman, PC World