Quote:
Originally Posted by AustinChief (Post 9905494)
No offense, but you really don't know what you're talking about. You were 100% correct on Netflix and 100% wrong on Google. Google has been preparing for this (or something close enough) for quite a while. I don't anticipate the flipping of a switch and magically they have a working delivery mechanism for this... but if you don't think they could be ready to deploy by 2015 then you haven't been paying attention.
EDIT: you're probably right about there being "last mile" issues, but I see Google taking its standard "not our problem" stance on that.
What I do right now is build a next-generation system for delivering high-quality, high-definition video over mobile networks. While my problem is somewhat different (and I specifically don't do live, because of the technology involved), it touches on many of the areas in play here.
The biggest issue is that live TV is fundamentally like any other real-time system: data needs to be delivered as it is generated, and if it arrives late it has no value.
The challenge is that the internet was never architected to be a real-time system. Broadcast TV (whether via cable, satellite, or over the air) was architected from the outset to deliver reliable real-time video (well, most of the time :p).
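To make that concrete, here's a minimal sketch of the hard deadline a live player faces (hypothetical names, grossly simplified model, not any real player's code):

```python
import time

PLAYOUT_DELAY = 0.5  # total capture-to-screen delay budget, in seconds

def classify_frame(pts, stream_epoch):
    """Decide whether an arriving video frame is still useful.

    pts          -- presentation timestamp: seconds after stream start
                    at which this frame is due on screen
    stream_epoch -- time.monotonic() reading taken when playback began

    A frame due at `pts` must arrive by stream_epoch + pts + PLAYOUT_DELAY.
    Past that deadline the player has already moved on, so the frame is
    worthless no matter how intact the data is.
    """
    deadline = stream_epoch + pts + PLAYOUT_DELAY
    return "play" if time.monotonic() <= deadline else "drop"
```

Every frame either makes its deadline or gets thrown away; there's no "try again later" like there is for a web page or a file download.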
They do this by 'owning' the network links. In the early days they would literally lay telecom lines between cities to get a reliable, dedicated connection for real-time TV.
Today most of that reliable transmission network runs over satellite links, and they spend major money to make sure both the uplink and downlink are solid. Some of them may be shifting over to landline links, but you still see the big satellite dishes at cable offices and local stations handling the up/downlinks.
Now, do you have to have hyper-reliable network hardware to make a real-time network work? No, of course not; you can use software to get high 'system reliability' out of unreliable components.
With on-demand TV they can tolerate variable network performance by aggressively caching data. If your link is noisy, you prefetch more and more data while the link is working to cover the stretches when it isn't. If you prefetch enough, you can completely mask a noisy link from the end user; the only thing they may notice is longer initial buffering.
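As a rough sketch of that idea (hypothetical numbers, not any real player's logic), the buffer target simply grows with how flaky the link has been:

```python
def prefetch_target(worst_outage_secs, base_buffer_secs=10):
    """Pick how many seconds of video to keep buffered ahead.

    An on-demand player can hide a noisy link by buffering enough
    video to ride out the longest stall it expects: the flakier the
    link has been, the bigger the buffer. The only user-visible cost
    is a longer initial buffering spinner.
    """
    safety_margin = 2.0  # keep 2x the worst stall seen so far
    return max(base_buffer_secs, worst_outage_secs * safety_margin)

print(prefetch_target(0))   # clean link: 10 seconds ahead
print(prefetch_target(30))  # link that has stalled for 30s: 60.0 seconds ahead
```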
With real-time TV you can't aggressively prefetch the way you can with on-demand TV, because you can't prefetch data that hasn't been created yet.
The other issue with live TV is timing. With on-demand TV, a marginal connection just means you buffer more before you start; if your movie is running 5 minutes behind, that's probably not a big deal. But if you're watching the Chiefs 5 minutes behind everyone else in the game thread, that's going to piss you off really quickly.
Does this mean you can't build a reliable real-time network out of unreliable components (like the internet)? No, you still can, but you have to over-provision your resources to tolerate the variance in those components. For example, if internet backbone providers A and B each drop or delay your traffic more often than your requirements allow, the simple version is to transmit simultaneously over both networks and hope at least one copy arrives at the client on time. This approach can work, but you generally need to significantly over-provision your system to hit your real-time requirements, and that gets very expensive very quickly.
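The math behind the redundant-path trick is simple: if the paths fail independently, failure probabilities multiply, but so does your bandwidth bill. A quick sketch (illustrative numbers):

```python
def effective_miss_rate(path_miss_rates):
    """Probability that *every* redundant copy of a packet is late or
    lost, assuming the paths fail independently of each other.

    Sending the same packet down N unreliable paths means you only
    lose it when all N copies fail -- but you also pay for N times
    the bandwidth.
    """
    combined = 1.0
    for rate in path_miss_rates:
        combined *= rate
    return combined

# Two backbones that each miss 2% of packets on their own:
print(effective_miss_rate([0.02]))        # one path:  0.02   (2%)
print(effective_miss_rate([0.02, 0.02]))  # two paths: ~0.0004 (0.04%)
```

Note the independence assumption is doing a lot of work here; if both providers share a failure mode (say, the same flooded conduit), the real-world numbers are worse.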
Yes, I know what Google is capable of; my poker group generally consists of 5+ PhDs working at Google, give or take whoever has left or joined since our last game.
Even with all of that brain trust, what is Google actually doing in this space? In some cases they are leasing full fiber lines between datacenters (i.e., owning the network to create a more reliable system), but that is really expensive and not always matched to their core business (at least at the level live TV would require).
The other approach they are taking is the 'cable TV' approach: controlling direct access to the consumer via Google Fiber. This helps with some of the last-mile issues, and when paired with dedicated telecom links between data centers it gives them the ability to deliver some real-time content. But again, building out all of that infrastructure is hugely expensive.
Plus it's really not clear this is a good long-term strategy. As the world goes more and more mobile, landline links become more expensive to maintain than they are worth. After Sandy, Verizon actually didn't rebuild all of the landline phone links in NJ that were destroyed; instead, they mounted a mobile phone link on the outside of the house. The house still had a 'landline,' but it was actually connected to a mobile network.
Right now mobile links are bandwidth-saturated, but if someone finds a way to radically increase mobile bandwidth, radically decrease video bandwidth (video consumes 50%+ of all bandwidth, and that share is growing), or ideally both, then landline networks will likely start to go the way of landline phones.
I could keep going, but the point is that delivering live, real-time video is a whole lot more complicated than delivering traditional on-demand video. The options are to either build a dedicated network and look very much like a cable company, or spend a lot of money to hugely over-provision your network so software can create a 'reliable network' on top of it. Both likely require massive capital expense on infrastructure.
While Google might have the billions to spend to roll out nationwide networks, will they get the return on investment to make such an expenditure worth it? That is very much in doubt.
As always, the issue here is scale. Imagine an event with a 1% chance of ****ing up your game watching. With 1 million customers watching, on average 10,000 of them get hit by that 1% event. Small-scale things tend to work; at massive scale, all the really unlikely things start happening to 'someone' all the time.
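You can see this with back-of-the-envelope arithmetic (illustrative numbers):

```python
def expected_affected(failure_prob, audience):
    """Expected number of viewers hit by a failure of given probability."""
    return failure_prob * audience

# A 1-in-100 glitch is invisible in a living-room demo but hits ten
# thousand people when a million are watching the same game.
print(expected_affected(0.01, 100))        # 1.0
print(expected_affected(0.01, 1_000_000))  # 10000.0

# And the chance that *nobody* hits the glitch is effectively zero:
print((1 - 0.01) ** 1_000_000)  # underflows to 0.0
```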