Software runs my life

Tag: google

Is there money in producing content?

[Chart: Yankee Group — online ad market and internet access growth, 2006–2011]

From MarketingCharts.com

Online media is growing up. All the big media players (News, Fairfax etc.) are currently fighting it out with the new kids on the block, online pure plays (Google, Microsoft, Realestate.com.au etc.). The prize is the rapidly growing pool of online advertising revenue, predicted to pass the US$50 billion mark next year. Historically the provider with the most content has attracted the most consumers, in turn attracting the most advertisers. Eventually this network effect led to breakaway market leaders establishing dominance and gradually raising the market's barriers to entry. Holding all the content was a licence to print money.

Slowly, general search tools like Google and Bing, as well as vertical-specific search sites like Zillow, started gaining momentum. They established themselves as "middle men", generating advertising revenue while helping people find the content they were looking for more efficiently. They were not interested in hosting or contributing content, but rather focused on its delivery. They realised that the money is in front-end distribution, not in back-end content creation. Google in particular understands this, and the publishers do not. The publishers hate that Google News provides a beautiful user interface for accessing their content easily and for free, yet despite their threats they do not block Google's bots, because they need a strong online delivery channel and half their traffic comes from search engines.

This style of reluctant symbiotic relationship also appears outside news content; it is extending into real estate and videos, to name just a few. Microsoft are attempting to flip the relationship by making Bing Video index Google's YouTube content, and Google Maps is indexing real estate content.

The big media content creators have recognised one thing at least: for the partnership to work, each participant has to have a stake in its success (or failure). Licensing deals, share stakes and other structures are occurring left, right and centre as the various players align themselves. This "sorting out" period has amusing side effects, like media companies being on both sides of the legal fence. Eventually the flurry of deals will subside and the media companies will realise that YouTube is no different to their old printing press and delivery operation: it is a necessary distribution channel that takes a commission. If your printing press operator decided to take your boring black and white rag and turn it into a glossy high-end publication that successfully retailed at twice the price (despite having the same content), then good luck to them; in the end you benefit from a more valuable distribution channel.

For now we are faced with more sabre rattling by the media companies, constant partnership renegotiations and declining print revenues. As with any market forces, the digital media market will eventually reach an uneasy equilibrium: some sort of duopoly with Google and Microsoft as the distribution channels, and the old media companies aligned behind them as the content creators. It is unlikely that the print rivers of gold will be seen in one place again, but sharing these rivers over a wider and more competitive landscape will benefit consumers. Sooner or later content producers will realise that revenue is a balance between consumption price and volume; withholding content only encourages piracy and other forces that undermine their progress towards a fair and efficient new distribution channel.

Islands of Computing Power

Amit Mital kicked off TechEd Australia 2008 today with a keynote presentation on Microsoft's view of how software and services will develop in the future, particularly in relation to their new Live Mesh offering. There is a good summary of his presentation on the TechEd New Zealand site; it seems they got an identical opening keynote. For someone who loves networks, he sure doesn't seem to like professional networks!

There was one line of reasoning in his speech that struck me. Moore's law is still holding true: computer hardware continues to double in processing power every 18 months, and that power is appearing in more and more locations. But when was the last time your network doubled in speed? What about the bandwidth to each additional node? This rapid increase in processing power has had two consequences that are obvious even today:

  1. Computers are islands of computing power – There is no seamless transfer of data between your devices. You work on a file at work, email it home, download it at home, work on it and send it back.
  2. Deploying local machines is too hard – Each branch office needs a rack, servers, backup, redundancy, configuration, support, licensing…

[Slide: Behind the Mesh]

Microsoft's solution at a high level is the Mesh stack, the structure of which can be seen in the slide shown here. The fundamentals are that local software is fast and hosted services are convenient, so let's tie them together with an API and get the best of both worlds. The trick is getting the balance right: where does a local application end and the service begin? How do you split the business logic? How do you provide offline access and quick sign-on to new devices? Hmmm…
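That fast-local, convenient-hosted trade-off can be sketched in a few lines of JavaScript. To be clear, everything here (`LocalStore`, `hostedFetch`, the cache-then-fallback split) is invented for illustration; it is not Live Mesh's actual API, just the shape of the balancing act:

```javascript
// Hypothetical sketch: a local cache fronting a hosted service.
// Invented names throughout -- not any real Mesh interface.

class LocalStore {
  constructor() { this.cache = new Map(); }
  get(key) { return this.cache.get(key); }
  put(key, value) { this.cache.set(key, value); }
}

// Stand-in for a network call to the hosted side of the stack.
async function hostedFetch(key) {
  return { key, value: "fresh-from-service" };
}

// Read locally when possible (fast, works offline); otherwise fall
// back to the service and cache the result for next time.
async function read(store, key) {
  const local = store.get(key);
  if (local !== undefined) return local;   // local software is fast
  const remote = await hostedFetch(key);   // hosted services are convenient
  store.put(key, remote);
  return remote;
}
```

The open questions in the paragraph above are exactly the parts this sketch glosses over: what happens when the cached copy and the service disagree, and which side owns the business logic.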

Microsoft's current practical solution is to rewrite most of its server packages to allow hosted delivery. Hosted Exchange is an obvious flagship for this. Google have taken a different approach. They believe that all you should need on your desktop is Chrome: essentially an all-purpose thin client, rather than a thick client on a drip feed.

So who is right? Well, I am betting things will converge on a middle-of-the-road approach. Implementing with current technology, I would say that JavaScript, a web browser and some sort of XML interface would be the best way to go. A few things need to develop from here:

  1. APIs need to be standardised and built into the browser (or OS, as these merge). Something like JavaScript libraries, but compiled, lightning fast and highly reusable. Chrome is getting there.
  2. Data transfer needs to be better than XML. Think highly compressed, encrypted on the fly, but quickly decoded into a human-readable format if necessary. Microsoft's MeshFX is getting there because it has authentication and other services built in, but it needs to be open like SOAP.

So I guess the race is on! Google will take JavaScript to its limits; Microsoft will try to blow us away with its feature set. When will they sit down and standardise on the next generation of JavaScript and data formats?

