[guide.chat] google upgrades youtubes engines

  • From: vanessa <qwerty1234567a@xxxxxxxxx>
  • To: "GUIDE CHAT" <guide.chat@xxxxxxxxxxxxx>
  • Date: Thu, 21 Mar 2013 22:05:16 -0000

Google upgrades YouTube's 'engines'
Serving up billions of hours of video every month would put huge strain on most 
websites. Social networks such as Twitter and Tumblr still occasionally 
struggle with much smaller numbers of people viewing just text and pictures.
But at YouTube, which delivers more bandwidth-intensive videos, there is no 
"Fail Whale" - the mascot that Twitter shows when it has collapsed under too 
many tweets. When Google acquired the scrappy Californian start-up for $1.65bn 
in 2006, it plugged YouTube into its own vast server farm to spread the load.
In recent months, Google has been investing in new technology to streamline 
both video uploading and its streaming player, to improve quality for viewers - 
and increase efficiencies for YouTube.
 
While Google is tight-lipped on whether YouTube, which now has 1bn regular 
visitors, has ever turned a profit, these technical changes are commercially 
significant. Studies show that the faster a video loads, the more likely people 
are to watch it in full, and to return to the site again soon. It also means 
they are more likely to watch the ads that increasingly precede the videos.
In an interview last week, Andy Berkheimer, engineering manager at YouTube and 
a six-year veteran of the site, told the Financial Times the recent changes 
were akin to "full body swap-outs" in technology.
"It's like changing an airplane's engines while the airplane is in flight," 
adds his YouTube colleague, software engineer Rushabh Doshi.
"One of the things we see [changing] is the amount of content coming in that is 
in HD quality," says Mr Berkheimer. "It's a feedback cycle - as soon as we 
started delivering in HD, a lot more people started wanting to give us HD 
content."
An average of 72 hours of video is uploaded to YouTube every minute, and as 
people use better cameras, resource demands increase. Christmas 
creates a particular spike, says Mr Doshi, as people try out their new GoPro 
cameras or digital SLRs.
"Every time you do a [video] resolution increase, the tax on your systems is 
roughly scaling as to the square of the resolution," explains Mr Doshi. "It's a 
pretty heavy tax. What we do to make it better is what Google does best - throw 
lots and lots of machines at it. It's a truly stunning number of machines, 
scattered all over the world."
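To put that "square of the resolution" remark in rough numbers: the pixel count 
per frame grows with the square of the linear resolution, and so, roughly, does 
the transcoding work. The short Python sketch below is purely illustrative and 
assumes processing cost is proportional to pixels per frame.

# Rough illustration of the quadratic scaling Mr Doshi describes.
# Assumption (for illustration only): transcoding cost is proportional
# to the number of pixels per frame.
frames = {"480p": (854, 480), "720p": (1280, 720), "1080p": (1920, 1080), "4K": (3840, 2160)}
base = 854 * 480  # 480p pixel count used as the baseline workload
for name, (w, h) in frames.items():
    print(f"{name}: {w * h:>9,} pixels per frame (~{w * h / base:.1f}x the 480p workload)")

Stepping up from 480p to 1080p roughly quintuples the per-frame work, and 4K 
multiplies it by about twenty.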
As soon as a video file begins to upload, it is "transcoded" into a 
YouTube-friendly video format that can play on any device. The process uses a 
technique called "pipelining", meaning transcoding begins even before the 
initial file upload is complete.
"Each video transcoding process is not just a simple file transfer of bits," Mr 
Doshi says. "What is happening is you are taking information, chopping it into 
bits, and sending each one to different clusters of machines. Then you have to 
take all these pieces back and create one video."
Splitting and reuniting the chunks of video without making playback jumpy, and 
without losing seconds of footage if a transcoding server unexpectedly fails, 
is a huge challenge in storage, infrastructure and distributed computing.
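The chunk-and-reassemble idea can be sketched in a few lines of Python. 
Everything below - the function names, the placeholder "transcode" step and the 
use of a local process pool rather than clusters of machines - is an 
illustrative assumption, not YouTube's actual code.

from concurrent.futures import ProcessPoolExecutor

def transcode_chunk(chunk: bytes) -> bytes:
    """Stand-in for the real per-chunk transcode step."""
    return chunk.swapcase()  # placeholder transformation

def transcode(video: bytes, chunk_size: int = 4) -> bytes:
    # Split the upload into pieces; in the real pipeline, pieces are handed
    # off as they arrive, before the upload has finished.
    chunks = [video[i:i + chunk_size] for i in range(0, len(video), chunk_size)]
    with ProcessPoolExecutor() as pool:
        # map() preserves input order, so the pieces reassemble correctly
        # even though they finish on different workers at different times.
        return b"".join(pool.map(transcode_chunk, chunks))

if __name__ == "__main__":
    print(transcode(b"raw upload bytes"))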
On the viewers' side, the latest changes to YouTube's player were codenamed 
"Sliced Bread" - because the player slices a video up so YouTube's back-end can 
make second-by-second decisions about streaming quality based on network speed, 
rather than making one "loaf-sized" choice about resolution at the outset.
This requires changes both on the server side and in the software running on 
the viewer's own device. Sliced Bread is almost completely rolled out on the 
desktop and YouTube is now bringing it to smartphones and tablets.
"We are moving beyond our comfort zone and saying for a lot of that 
[improvement], we need to be mobile first," says Mr Berkheimer.
A quarter of YouTube's views now come via mobile devices, where connectivity 
can be particularly erratic. A neighbour turning on a microwave might slow down 
a wireless connection and make the video stop midway through.
"That is one of the key things that makes a user go away quicker than anything 
else," says Mr Berkheimer. "So when that happens we need to seamlessly adjust 
[the stream] rather than stall."
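In miniature, the per-segment decision Sliced Bread makes might look something 
like the sketch below. The rendition table, the safety margin and the 
throughput samples are all hypothetical; the point is simply that quality is 
chosen slice by slice rather than once per video.

RENDITIONS_KBPS = {"240p": 400, "360p": 750, "480p": 1000, "720p": 2500, "1080p": 4500}

def pick_rendition(measured_kbps, safety_factor=0.8):
    """Choose the highest rendition whose bitrate fits within the measured
    throughput, leaving headroom so a brief dip does not cause a stall."""
    budget = measured_kbps * safety_factor
    viable = [(kbps, name) for name, kbps in RENDITIONS_KBPS.items() if kbps <= budget]
    return max(viable)[1] if viable else "240p"

# Simulated per-second throughput, e.g. a Wi-Fi dip when the neighbour's
# microwave switches on; one quality decision is made per video segment.
for second, kbps in enumerate([5200, 4800, 900, 1200, 3000, 5000]):
    print(f"t={second}s measured={kbps} kbps -> serve {pick_rendition(kbps)}")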
One idea to improve mobile quality, for users who are signed in to a YouTube 
account, is to preload videos overnight that its algorithms predict they will 
be interested in, based on previous viewing behaviour.
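A toy version of that preloading idea, assuming the only available signal is a 
list of tags from the user's watch history - the scoring rule and every name 
below are invented purely for illustration.

def score(candidate_tags, history_tags):
    """Score a candidate video by how many of its tags the user has watched."""
    return len(set(candidate_tags) & set(history_tags))

def pick_preloads(candidates, history_tags, limit=5):
    # Rank candidates by overlap with viewing history and keep the top few
    # to fetch while the device is idle, charging and on Wi-Fi.
    ranked = sorted(candidates, key=lambda c: score(c["tags"], history_tags), reverse=True)
    return [c["id"] for c in ranked[:limit]]

history = ["gopro", "surfing", "travel"]
candidates = [
    {"id": "vid1", "tags": ["surfing", "gopro"]},
    {"id": "vid2", "tags": ["cooking"]},
    {"id": "vid3", "tags": ["travel", "vlog"]},
]
print(pick_preloads(candidates, history, limit=2))  # ['vid1', 'vid3']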
Storing videos on servers closest to where they are being viewed reduces 
bandwidth costs and increases speeds. But YouTube has to cater both to "purely 
global" content, such as Hollywood movies, and to regional audiences.
That often requires doing deals with local telecoms providers as well as Google 
building out its own data centres all over the world.
"You need to be close by," Mr Berkheimer explains. "Once you start talking 
about HD video, the speed of light becomes an issue. You can't serve it all 
from one data centre in the US."
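A rough back-of-the-envelope calculation shows why distance matters. It assumes 
light travels at about 200,000 km/s in optical fibre and that a sender is 
limited to a classic 64 KiB TCP window without window scaling - both 
simplifications chosen just to make the point.

FIBRE_KM_PER_S = 200_000          # approximate speed of light in glass
WINDOW_BITS = 64 * 1024 * 8       # classic 64 KiB TCP window, in bits

for label, km in [("same metro area", 50), ("across the US", 4_000), ("US to India", 13_000)]:
    rtt_s = 2 * km / FIBRE_KM_PER_S          # round-trip propagation delay
    max_mbps = WINDOW_BITS / rtt_s / 1e6     # window-limited throughput
    print(f"{label:>16}: RTT ~{rtt_s * 1000:5.1f} ms, throughput cap ~{max_mbps:6.1f} Mbit/s")

At intercontinental distances the cap drops to a few megabits per second, 
marginal for 1080p video, which is one reason content ends up cached close to 
viewers rather than served from a single US data centre.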
Although he does not put a figure on how much Sliced Bread reduced the cost of 
streaming, Mr Berkheimer says it brought a 20 per cent reduction in "buffering" 
- when users see a spinning wheel while the video loads.
"Our primary metric is to make the user experience better," he says. "A lot of 
that stuff helps improve efficiency as well. A lot of things that improve the 
user experience almost pay for themselves."

from
Vanessa The Google Girl.
my skype name is rainbowstar123
