Here is a high-level overview of how Linius technology could apply to sports content personalisation.
You are in your connected car and have just input your destination – home – into the car’s navigation system.
The system estimates a travel time of 17 minutes.
‘Great,’ you think. ‘That gives me a chance to catch up on the game I missed last night.’
You switch to your in-car entertainment system and simply order your content. Here’s how you do it…
‘Siri, please give me a version of last night’s game in video. I have 17 minutes to my destination and I want the game to focus on my favourite 3 players, Alvarado, Stevens, and Peterson, and focus only on the touchdowns, penalties and any fights or major clashes.’
Instantly, a personalised stream of content starts to play, tailored to your input.
How is this possible?
Today, content cannot be produced ‘on the fly’ like that. Creating such a package would require multiple edits followed by the rendering of a new file.
All of this takes time and processing power. The edits are complicated to make and cannot happen ‘on the fly’ – particularly not for a 17-minute file.
Even if that were possible, surely you could never switch camera angles mid-stream.
Is it possible?
With digital content 2.0, it is.
Digital content 2.0 treats digital content in a completely new way.
This methodology treats digital media files as data blocks, not as video or audio.
Right now, treating files as digital media content (audio and video) requires processes like rendering and transcoding, which generate multiple copies of every file for every edit – and copies of those copies to suit different devices.
Treating content files as data blocks allows new media files to be built simply by reorganising the sequence of the media structure.
We can assemble multiple files, and edits from those files, into a single file by reorganising data – and it happens while the content is streaming.
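To make the idea concrete, here is a minimal sketch of that data-block assembly in Python. All names and structures below are illustrative assumptions, not the Linius API: each segment of a source file is described by a byte range plus metadata tags, and a personalised ‘virtual file’ is just an ordered list of pointers into the original media – no rendering, no transcoding, no new copy.

```python
# Hypothetical sketch: build a personalised stream by reordering
# references to existing data blocks, rather than re-encoding video.
from dataclasses import dataclass

@dataclass
class Segment:
    source_file: str   # original media file, never modified
    byte_offset: int   # where this block starts in the source
    byte_length: int   # size of the block in bytes
    duration_s: float  # playback duration of the block
    tags: frozenset    # e.g. {"touchdown", "Alvarado"}

def assemble_virtual_stream(segments, wanted_tags, time_budget_s):
    """Pick matching segments in original order until the time budget is full.

    Returns a 'virtual file': an ordered list of (file, offset, length)
    pointers that a server could stream directly via byte-range reads.
    """
    playlist, used = [], 0.0
    for seg in segments:
        if seg.tags & wanted_tags and used + seg.duration_s <= time_budget_s:
            playlist.append((seg.source_file, seg.byte_offset, seg.byte_length))
            used += seg.duration_s
    return playlist

# Example request: a 17-minute cut focused on three players and key events.
game = [
    Segment("game.mp4", 0,         4_000_000, 45.0, frozenset({"touchdown", "Alvarado"})),
    Segment("game.mp4", 4_000_000, 2_500_000, 30.0, frozenset({"commentary"})),
    Segment("game.mp4", 6_500_000, 3_000_000, 40.0, frozenset({"penalty", "Stevens"})),
]
wanted = frozenset({"touchdown", "penalty", "fight",
                    "Alvarado", "Stevens", "Peterson"})
stream = assemble_virtual_stream(game, wanted, time_budget_s=17 * 60)
# stream now references blocks of game.mp4 in a new order, ready to serve.
```

Because the output is only a sequence of pointers, it can be produced in milliseconds and even re-assembled mid-stream – which is what makes the in-car scenario above plausible.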
This is digital content 2.0: a new world of content possibilities.