IBM Said Devising Video Archive and Search Technology

IBM is reported to be developing a way to archive video footage, along with a search engine to retrieve clips that cannot be found easily on the Internet as it exists today.

The IBM research team is aiming for a technology, reportedly code-named “Marvel,” that would let users click on a sample shot of a scene, or simply describe one, and retrieve the relevant clips from the thousands, if not millions, of hours of audio and video generated each year by filmmakers, broadcasters, and individuals, KeralaNext.com said September 30.
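The article does not describe how Marvel matches a sample shot to archived footage, but the general content-based approach can be sketched as follows: each clip is reduced to a numeric feature vector, and the archive is ranked by similarity to the query shot's vector. The feature names and values below are illustrative assumptions, not details from IBM's system.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical archive: clip name -> feature vector extracted from its frames
# (e.g., color, texture, motion statistics -- the exact features are an assumption).
archive = {
    "city_traffic":     [0.9, 0.1, 0.8],
    "studio_interview": [0.1, 0.9, 0.1],
    "harbor_sunset":    [0.8, 0.2, 0.3],
}

def query_by_example(query_vector, archive, top_k=2):
    """Rank archived clips by similarity to the query shot's features."""
    ranked = sorted(archive.items(),
                    key=lambda item: cosine_similarity(query_vector, item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

# A sample outdoor shot should surface the visually similar outdoor clips first.
print(query_by_example([0.85, 0.15, 0.7], archive))  # ['city_traffic', 'harbor_sunset']
```

The same ranking step would serve a text query as well, once descriptive terms have been mapped onto the clips' feature space.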

The research team is reportedly working on the project with libraries and with selected news organizations such as CNN, and showed an early prototype at a conference at Cambridge University in late August. That prototype, KeralaNext.com said, scanned a database of over 200 hours of broadcast news footage and used 100 descriptive terms to classify and identify selections.

Based on the MPEG-7 standard, Marvel is being designed to search any standard video format, and while IBM has yet to talk in detail about how to turn the technology into a marketable product, KeralaNext.com said, Big Blue may be planning to release it to the television industry first, before considering consumer promotion.

“Though current search engines like Google and Yahoo can serve up video clips or images, they really aren’t searching on the images contained in the files,” the publication said. “Instead, they rely on the text attached to the bottom of the files, and thus they search only the small number of files that have been properly identified.”

The Marvel research team, KeralaNext.com said, is aiming for automatic categorizing and subsequent retrieval of clips using such descriptive terms as “outdoor,” “indoor,” “cityscape,” “engine noise,” and others that describe the action or the sights within the clips. The project is said to lean heavily on support vector machine technology – a machine-learning technique in which a computer learns to assign the equivalent of a yes or no value to data – pioneered by AT&T’s Vladimir Vapnik a decade earlier.

“In other words,” KeralaNext.com translated, “if the computer is supposed to distinguish between an indoor or outdoor scene, trees in a shot could well prompt the computer to put the clip in the outdoor bucket.”
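That yes-or-no decision can be illustrated with a simple linear rule over a clip's features. A real support vector machine learns its weights from labeled training clips by maximizing the margin between the two classes; in this sketch the weights, the threshold, and the feature names are all hand-picked assumptions chosen only to mirror the article's indoor-versus-outdoor example.

```python
# Hypothetical feature order: (greenery, artificial_light, sky_area), each in [0, 1].
def outdoor_score(features, w=(1.5, -2.0, 0.5), b=-0.2):
    """Linear score; positive means the 'outdoor' side of the decision boundary."""
    return sum(wi * xi for wi, xi in zip(w, features)) + b

def classify(features):
    """Answer the yes/no question 'is this clip outdoor?' as a bucket label."""
    return "outdoor" if outdoor_score(features) > 0 else "indoor"

print(classify((0.8, 0.1, 0.9)))   # trees and sky in the shot -> "outdoor"
print(classify((0.05, 0.9, 0.0)))  # studio lighting, no sky  -> "indoor"
```

Running one such classifier per descriptive term (“outdoor,” “cityscape,” “engine noise,” and so on) would yield the kind of automatic labeling the article describes, with each clip landing in whichever buckets its scores support.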