
With CES Ahead, It’s Partner-Up Time Again

11/08/2010 12:01 AM Eastern

By the looks of last week’s news noise around online video, it’s partner-up time again!

Walmart, owner of Vudu (which streams movies in 1080p), hitched up with Disney to stunt a free electronic copy of Toy Story 3 with Blu-ray Disc purchase. (More deal talk: Vudu’s player recently became a Boxee feature.)

Yamaha added Netflix, Blockbuster and YouTube to the feature list of its new Blu-ray player. Amazon streaming is on Panasonic’s Blu-ray players. LG players sport CinemaNow (under Best Buy’s tattered wing) and Vudu.

Consider this CES foreshadowing. A big part of the buzz at the upcoming Consumer Electronics Show is likely to be about whose video player/online movie library/Internet doohickey is in whose Blu-ray player, game console or HDTV.

And then there’s the surge of the online movie vendors. Amazon upped its title count to 10,000 last week, from 300, via its deal with Disc Plus. (Buy a DVD, get an electronic copy.)

Netflix grew its subscriber count to 16.9 million, a 52% leap from last year. (Engineering banter at the recent SCTE Cable-Tec Expo put Netflix streaming traffic at 15%, and skyrocketing.)

And then there’s the UltraViolet camp, with its everybody-but-Apple-and-Disney digital-locker service. The intent: for people to trust that when they buy an invisible copy of something, it’s as easy to use as the DVD version.

In cable, these conversations tend to beeline toward hierarchical storage and “content delivery networks,” or CDNs. It’s the new black: everybody either built one, is building one or is renting one.

The thinking goes like this: If you’re the guy offering on-demand services over the VOD network you built, across all of your systems, over the last dozen or so years, you may have 100 or more different storage servers scattered about, all holding pretty much the same stuff.

Why not centralize that, and leave those 100 end points as caches for more popular content? Put the popular stuff (the “hot content,” in CDN-speak) out in the caches. Leave the cold content on the big, centralized library servers.
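
For the code-inclined: the hot/cold split boils down to a cache check at the edge, with a fallback to the central library on a miss. Here’s a minimal Python sketch of that logic; the names (EdgeCache, fetch_title) are illustrative, not anything a vendor actually ships.

    from collections import OrderedDict

    class EdgeCache:
        """Toy edge cache: keeps the hottest titles, evicts the least-recently-used."""
        def __init__(self, capacity=100):
            self.capacity = capacity
            self.titles = OrderedDict()  # title_id -> video data (stand-in)

        def get(self, title_id):
            if title_id in self.titles:
                self.titles.move_to_end(title_id)  # refresh recency
                return self.titles[title_id]
            return None  # miss: caller has to go to the library

        def put(self, title_id, data):
            self.titles[title_id] = data
            self.titles.move_to_end(title_id)
            if len(self.titles) > self.capacity:
                self.titles.popitem(last=False)  # the coldest title falls out

    def fetch_title(title_id, cache, library):
        """Serve from the edge if hot; otherwise pull from central storage and warm the cache."""
        data = cache.get(title_id)
        if data is None:
            data = library[title_id]   # the long haul to the big library servers
            cache.put(title_id, data)  # now it's "hot" out at the edge
        return data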

Simple, right? On the surface, maybe. Underneath, though, there’s a lot of engineering and architecting going on. First of all, what’s being stored? Is an encoder needed at the front, to chop each title into smaller, two-second chunks? Yes or no on storing three separate versions, in high, medium and low resolutions, to suit available bandwidth?
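
That chop-and-store question is the heart of what’s now called adaptive streaming: every title becomes thousands of little files, times the number of resolution rungs. A back-of-the-envelope sketch, assuming two-second chunks and a made-up three-rung ladder (the bitrates here are illustrative, not anybody’s spec):

    # Illustrative bitrate ladder; real ladders vary by operator and encoder.
    LADDER = {"high": 8_000_000, "medium": 3_000_000, "low": 1_000_000}
    CHUNK_SECONDS = 2

    def chunk_manifest(title_id, duration_seconds):
        """List every (resolution rung, chunk index) pair a streamer could request."""
        chunk_count = -(-duration_seconds // CHUNK_SECONDS)  # ceiling division
        return [
            {"title": title_id, "rung": rung, "bitrate": bps, "chunk": i}
            for rung, bps in LADDER.items()
            for i in range(chunk_count)
        ]

    # A two-hour movie: 3,600 two-second chunks per rung, 10,800 objects to store.
    manifest = chunk_manifest("toy_story_3", duration_seconds=7200)
    print(len(manifest))  # 10800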

Caches also work to reduce network load, but how much depends on how quickly usage patterns shift. (Turns out, they shift a lot.)
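
Why the churn matters: every shift in what’s popular turns cache hits into trips back to the library. A quick simulation, reusing the EdgeCache sketch above with made-up request numbers, shows the idea: a 100-title cache in front of a 1,000-title library stays useful only as long as demand stays concentrated.

    import random

    cache = EdgeCache(capacity=100)
    library = {f"title_{i}": b"..." for i in range(1000)}  # 1,000-title library

    hits = misses = 0
    for _ in range(10_000):
        # Zipf-ish demand: a handful of hot titles draw most of the requests.
        title = f"title_{min(int(random.paretovariate(1.2)) - 1, 999)}"
        if cache.get(title) is not None:
            hits += 1
        else:
            misses += 1
            cache.put(title, library[title])

    print(f"hit rate: {hits / (hits + misses):.0%}")  # sinks when tastes scatter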

Figuring out how to know when something’s about to fail is a design biggie. And if a title is coming out of cold storage, what’s the best way to handle trick-play features, like fast-forward, pause and rewind?
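
Trick-play is trickier than it sounds once titles live as two-second chunks: a 4x fast-forward can’t fetch and decode every frame. One common approach, sketched here as an assumption rather than any particular vendor’s method, is to walk the chunk index at the trick-play speed. (Pause is the easy one; the client just stops asking for chunks.)

    def trick_play_chunks(current_chunk, total_chunks, speed=4):
        """Yield the chunk indices a fast-forward at `speed`x would fetch.

        Each chunk still covers two seconds of content, so requesting every
        `speed`-th chunk advances the timeline `speed` times faster.
        """
        i = current_chunk
        while 0 <= i < total_chunks:
            yield i
            i += speed  # a negative speed walks backward: rewind

    # Fast-forward at 4x from the 10th chunk of a 3,600-chunk movie:
    for chunk in trick_play_chunks(10, 3600, speed=4):
        pass  # request the chunk from the edge cache (hot) or the library (cold)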

Cable, arguably the biggest server-upper of on-demand titles, is hard at work on all of this. All that’s missing are those big-name, partner-up headlines. Well, beyond the obvious.


Stumped by gibberish? Visit Leslie Ellis at www.translationplease.com or www.multichannel.com/blog.

 
