
Digital delivery: your questions answered


January 23, 2006 – LAST WEEK the WAEA’s Digital Content Management Working Group (DCMWG) delivered the first draft of the specification that will shape the emergence of end-to-end, all-digital delivery of IFE content from creator to consumer in the coming years (Inflight Online, January 20).

The document – WAEA Specification 0403, Standard Methodology for the Delivery of Digital Content for Airline In-Flight Entertainment – is the result of more than five years’ work by DCMWG and will eventually supplant the current Specification 0395. In the meantime, its implications – particularly in relation to the codecs needed for the digital encoding of content – are sure to be hotly debated as it covers the last mile towards WAEA approval.

Michael Childers, senior associate consultant on content management at IMDC, has been a co-chair of DCMWG since the beginning. Here he answers Inflight Online’s questions on the importance of WAEA 0403 and shines more light into an acronym-packed corner of the inflight industry.

Q: A formal specification is at last in sight. Are you satisfied with the outcome?

A: For the most part. I'm not certain that even half of the industry truly understands that what we have developed is a specification for a digital delivery supply chain - a migration path from the world of physical delivery media to a network delivery world without cassettes and discs. A lot of people still think that all we were doing was selecting a specific codec, and some were unhappy when we declined to do that.

Q: Is it the case that you were the principal advocate of this “requirements-based” versus “standards-based” approach to writing the specification?

A: It has been obvious to me for some time that the probable standards on which we would build our specification - namely MPEG-4, MPEG-7 and MPEG-21 - were progressing slowly and would not be ready for full implementation for some time. At the same time, demand for products based on these standards would probably see them entering the marketplace before the standards were finished. We needed a way to evaluate those products and to begin using them rather than waiting for the final standards. I advocated a requirements-based approach and letting the marketplace decide. Some of these standards may never be completed because the marketplace will offer interoperable products that do the job.

Q: Is the same thing happening in portable IFE?

A: Yes. I have recommended Windows Media Video, the VC1 codec and Microsoft digital rights management (DRM) for use in portables because they form a good, workable suite of technologies based on MPEG-4, MPEG-7 and MPEG-21 but ready to be implemented now. Repurposed and customised, they make a successful solution.

Q: What are the key requirements – rather than specific standards - contained in the new specification?

A: One is for object-based rather than frame-based codecs. This leaves MPEG-1 and MPEG-2 behind and opens the door to MPEG-4-based codecs - not just ISO/IEC MPEG-4 Part 2 Visual and Part 10 Advanced Video Coding (AVC) but also VC1, which Microsoft has submitted to the Society of Motion Picture and Television Engineers (SMPTE) for ratification.

Q: Why object-based rather than frame-based?

A: Object-based codecs accommodate interactive multimedia and expand content applications. Frame-based codecs treat the entire frame as a unit, while the object-based codecs support the identification and manipulation of specific objects within the frame. Object-based codecs like MPEG-4 can accommodate video, audio, metadata, text and CGI Web applications in the same stream, while MPEG-1 and 2 are limited to audio and video.

Q: How does the specification approach standards for metadata?

A: There are two kinds of metadata – one describes the content for programme guides, the other identifies data components in the file for the computer.

The eventual metadata standard is likely to be MPEG-7, and we know that MPEG-7 will be based on XML, the Extensible Markup Language. So if we specify XML as the data interchange language for IFE, the work we do in XML will either be interoperable with, or convertible to, the tools that emerge in the future.

Q: What does XML do?

A: Like an object-based codec, XML can manipulate data within a file. With HTML, the entire document is a single unit, so that it's impossible to distinguish between the content and the display of the document. XML tags and identifies individual data elements so that they can be read and acted upon by a computer, supporting automation and "trusted system" security. An example is the use of XML metadata tags to identify the duration of a content licence so that it can be read and enforced electronically.
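The licence-duration example can be sketched in a few lines of Python. The element names and dates below are purely illustrative; they are not drawn from WAEA 0403, MPEG-7 or any real licence agreement:

```python
import xml.etree.ElementTree as ET
from datetime import date

# Hypothetical licence metadata; element names are illustrative only,
# not taken from WAEA 0403 or MPEG-7.
LICENCE_XML = """
<content id="mov-001">
  <title>Example Feature</title>
  <licence>
    <start>2006-01-01</start>
    <end>2006-03-31</end>
  </licence>
</content>
"""

def licence_is_valid(xml_text: str, today: date) -> bool:
    """Read the licence window from the XML metadata and enforce it."""
    root = ET.fromstring(xml_text)
    start = date.fromisoformat(root.findtext("licence/start"))
    end = date.fromisoformat(root.findtext("licence/end"))
    return start <= today <= end

print(licence_is_valid(LICENCE_XML, date(2006, 2, 15)))  # True: inside window
print(licence_is_valid(LICENCE_XML, date(2006, 4, 1)))   # False: licence expired
```

Because the start and end dates are tagged as individual data elements rather than buried in display markup, a playout system can enforce the licence window automatically - the "trusted system" behaviour described above.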

Q: This suggests that the business rules of the licence agreement, identified by metadata, will be included in the content file.

A: Correct. We are moving toward an overall media environment in which this is necessary. In IFE we must ensure that our future digital delivery systems have that capability too: a fully automated and fully secured supply chain demands it.

Q: What other key requirements are there in the specification?

A: The use of object-level rather than file-level encryption, and end-to-end rather than transport-level encryption. If you use a scheme that encrypts the content while leaving the header and metadata in the clear, the content itself can go through the integration process without decryption.
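A toy sketch of that idea, assuming a made-up container format of our own devising: the metadata header travels in the clear, so an integrator can read it without ever decrypting the payload. A real system would use a standard cipher such as AES; the XOR routine here is only a stand-in.

```python
import json

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy cipher standing in for a real algorithm such as AES."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def pack(metadata: dict, payload: bytes, key: bytes) -> bytes:
    """Object-level encryption: only the payload is enciphered;
    the header/metadata stay in the clear."""
    header = json.dumps(metadata).encode()
    return len(header).to_bytes(4, "big") + header + xor_bytes(payload, key)

def read_metadata(container: bytes) -> dict:
    """An integrator can read the metadata without any decryption keys."""
    hlen = int.from_bytes(container[:4], "big")
    return json.loads(container[4:4 + hlen])

key = b"secret"
box = pack({"title": "Example Feature", "codec": "AVC"}, b"movie-bits", key)
print(read_metadata(box)["title"])  # metadata readable; payload still encrypted
```

The point of the design is visible in `read_metadata`: it never touches the key, so content can pass through the integration process end-to-end without being exposed.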

Q: And all these requirements can be met without insisting on specific standards?

A: Yes. Having identified the requirements, we can say that the solutions are likely to be found in MPEG-4, MPEG-7 and MPEG-21. We can say that we will probably meet the requirements with a data delivery rate of 1.0Mbit/sec and a screen resolution of 720x480. And we can say that XML-based solutions and standards offer the best hope for eventual automation and interoperability in the supply chain. Windows Media will do a great job in one application, and MPEG-4 Part 10 AVC will do a great job in another. Individual users don't have to pick one or the other - or either. Chipsets are coming that can decode across all of these encodes: all that matters is that they meet the requirements we have laid out.
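As an illustration of the requirements-based approach, a candidate encode could be screened against the figures quoted above rather than against a mandated codec. The codec list and the treatment of 1.0Mbit/sec as a delivery budget are assumptions made for this sketch, not text from the specification:

```python
# Hypothetical requirements check in the spirit of a "requirements-based"
# specification: any codec passes as long as the encode meets the
# requirements. Names and thresholds follow the figures quoted in the
# interview, not the spec document itself.
OBJECT_BASED_CODECS = {"MPEG-4 Part 2", "MPEG-4 Part 10 AVC", "VC1"}

def meets_requirements(codec: str, bitrate_mbps: float,
                       width: int, height: int) -> bool:
    """True if the encode is object-based, fits a 1.0Mbit/sec delivery
    budget and matches the 720x480 target resolution."""
    return (codec in OBJECT_BASED_CODECS
            and bitrate_mbps <= 1.0
            and (width, height) == (720, 480))

print(meets_requirements("VC1", 1.0, 720, 480))     # passes
print(meets_requirements("MPEG-2", 1.0, 720, 480))  # frame-based: fails
```

The design choice mirrors the interview's argument: the function tests properties, so a future codec that satisfies them passes without the specification being rewritten.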

Q: The specification can be expected to promote profound change in the form of a move from sneakernet delivery of cassettes to high-speed Internet/VPN delivery of digital content. Are there any other big changes afoot?

A: There are at least three:

1. Migration from portable to embedded data loaders

2. Transfer of content integration away from hardware to an independent function

3. Delivery of content in the cabin to both airline-owned and passenger-owned devices

Q: How are embedded data loaders superior to portable units?

A: Digital delivery requires the transfer of increasing amounts of data through the network and into the aircraft’s server. Currently, technicians carry portable units to the flight line for manual loading while the aircraft is on the tarmac. Embedded loaders are permanently installed in the aircraft - in the video control centre (VCC), for example - and content is supplied to the loader via removable hard drives, AIT tapes, USB memory sticks, DVDs or CDs, or wirelessly at the gate via links such as IEEE 802.11, GSM or CDMA. Loading is automatic – there is no need for a technician to be present. This is a much more efficient and secure process which can also encompass the use of two-way communications for the reporting of load status, passenger usage and BITE data.
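The two-way reporting described above might look something like this hypothetical load-status message sent from the embedded loader back to the ground system. Every field name here is invented for illustration; no WAEA message format is implied:

```python
import json
from dataclasses import dataclass, asdict, field

# Hypothetical report an embedded data loader might send over the
# two-way link (802.11, GSM or CDMA) after a content load completes.
@dataclass
class LoadReport:
    tail_number: str
    content_cycle: str
    status: str                      # e.g. "complete", "failed"
    bytes_loaded: int
    bite_faults: list = field(default_factory=list)  # built-in test equipment data

def to_wire(report: LoadReport) -> str:
    """Serialise the report for transmission to the ground system."""
    return json.dumps(asdict(report))

msg = to_wire(LoadReport("N123AA", "cycle-2006-02", "complete",
                         12_884_901_888))
print(msg)
```

Automating this return channel is what removes the technician from the loop: the ground system learns the load status without anyone walking to the flight line.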

Q: Is anyone doing this today?

A: American Airlines, British Airways and Virgin America have all opted for embedded data loading.

Q: Do you really expect to see content being delivered from the aircraft IFE server to passenger-owned devices?

A: Yes - this is inevitable. DCMWG is about to address the matter, and there are products already in development to make it happen.
