

tech-i

INSIGHT FROM EBU TECHNICAL


Issue 07 March 2011

Contents
03 Viewpoint: Lieven Vermaele
04 3DTV Study Group
06 Eurovision Network History
07 HbbTV In France
10 Broadband Requirements
11 HTML5
12 In the Spotlight
13 Out of Sync
14 Seminar News
15 Diary

Will Ultra High Definition Television Bloom?

UHDTV - Television's Renaissance?

PAGE 8

http://tech.ebu.ch

leading the evolution in news and sports delivery


Ericsson's Voyager II
News gathering is one of the most competitive areas in the broadcast market, demanding the delivery of high-quality video by the most cost-efficient method available. Traditionally, DSNG solutions have been based on MPEG-2 and DVB-S, but more recently operators have moved towards MPEG-4 AVC for delivery over DVB-S2 satellite networks. However, IP-based satellite communications, delivering improved reliability, are now also a valid alternative for cost-effective lower-bandwidth news gathering links. As cellular networks mature and increase their transport capacity, they too will be able to reliably deliver real-time or near real-time video streams. In a dynamic and fast-moving market, future DSNG solutions must reliably deliver high-quality video in multiple formats over all modern transmission technologies and networks. Voyager II is Ericsson's fifth-generation DSNG and is the result of many years' experience delivering solutions in this most demanding of markets. To enable operators to migrate from one compression technology or format to another, Voyager II supports all the major compression formats, MPEG-2 and MPEG-4 AVC, in both standard and high definition resolutions, as either 4:2:0 or 4:2:2. For improved video quality, Voyager II is software upgradeable to support 10-bit precision and 1080p50/60. Built on a revolutionary modular chassis in a space-saving 1RU form factor, Voyager II represents the most advanced DSNG on the market, offering broadcasters and service providers the level of integration and flexibility required to future-proof any operational investment.

Ericsson Television Limited Strategic Park, Comines Way Hedge End, Southampton Hampshire SO30 4DA, UK Tel: +44 (0) 23 8048 4666 www.ericsson.com/televisionary

viewpoint from lieven vermaele

Will 2011 be the year of tablets and the open internet?


A whole host of manufacturers are making tablet PC models now, and the public is clearly warming to them. They come in many sizes, and with different functionalities. Perhaps the cornerstone of their functionality today though is internet browsing.

With improvements in streaming technology, and higher broadband speeds, tablet PCs will provide ever more opportunity to enjoy rich media. These devices could also eventually and easily include digital terrestrial television and radio receivers. They may eventually become all-media devices, the embodiment of the anytime, anywhere and anyplace dream. So, one of the questions that broadcasters need to consider is whether the tablet PC will ever replace the television set. If so, what should we do about it now, to be well prepared? The answer to the first is probably "not completely", but there is no simple answer to these questions. Certainly the use of tablet PCs will continue to explode: they are elegant and easy to use. Young people may prefer their go-anywhere portability to watching TV on a large screen, as they do now with laptops. They may also be used in conjunction with television sets. The seated viewer may use his tablet PC to check out other channels, for social networks, or to interact with or deepen the content of the TV show running on the large family screen. The tablet PC is also likely to become an extension of the TV set.

One of the most attractive features of smartphones and tablet PCs, for the public, is the apps: the possibility of calling up, rapidly and very simply, computer programmes that run on the API of the tablet PC or smartphone. For the iPhone, iPad, and other devices, there are now tens of thousands of apps available for download and installation. The idea of providing these from an open market of programmers was an inspiration, and a massive success. We will have to be good to be used. So, in our new world of tablet PCs, if we want to be part of the revolution we will need to think seriously about what great apps we could provide, and how they can complement broadcast programming. We also need to think about what quality, style of programming, and programme length will best suit the viewer with a tablet PC in his hand, taking programmes on demand via our websites. Our websites today are often designed for "lean forward" use. The tablet PC situation is probably halfway between the "lean forward" and "lean back" worlds. The EBU is setting up an activity to bring creative and technical people together to exchange best practices. One of the elements will probably be to work with tablet PC makers to ensure that receivers for the digital broadcasting standards (DVB-T, DVB-T2 and DAB) are included in them. Including them in all tablet PCs would have very little impact on their price: most of the cost of a tablet PC is tied to the cost of the screen itself, not the electronics behind it. And, in case we can't achieve it, we are already defining alternative ways to bring this content to the user in our EBU strategic programme on the future of terrestrial broadcasting.

However, as manifested on the open market today, such tablets and apps aren't completely open to all. Likewise, certain services offered apparently at no cost, like social network services, aren't exactly free. At the EBU, we have heard stories from Members whose Facebook page was one day removed without them being informed first. Similar stories are heard about the difficulties certain apps have in passing the app commission, sometimes on a country-by-country basis. It shows that the net-neutrality discussion is not only relevant for bandwidth, quality of service and investment issues, but will become even more relevant for the openness of the internet in the world of apps and personal data ownership. There is to be an EBU group on net neutrality and related issues. One important thing to say, though, is about fingerprints on tablets. At our local multimedia store here, a member of the store's staff is permanently assigned to the continuous cyclic cleaning of the screens. If there is someone out there who knows a way to avoid fingerprints when you poke at the screen, they are going to be rich.

Visit http://tech.ebu.ch to subscribe to tech-i free of charge


tech-i © 2011. All rights reserved. The reproduction of articles in tech-i is authorised only with the written permission of the publishers. The responsibility for views expressed in tech-i rests solely with the authors. Published by EBU Technical, European Broadcasting Union, 17a L'Ancienne-Route, CH-1218 Le Grand-Saconnex, Switzerland. Editor-in-Chief: Lieven Vermaele E-mail: [email protected] Tel: +41 22 717 2111 Editors: William Daly, Harold Bergin Production: WHD PR For editorial & advertising enquiries contact: WHD PR E-mail: [email protected] Tel: +44 20 7799 3100 Printing: New Goff n.v.


in focus

What can the EBU 3DTV Group do for you?


The BBC's Andy Quested, Chairman of the EBU Project Group on 3DTV, sets out the Group's objectives and offers some advice to those seeking to make good 3DTV programmes.

As chair of the Group I thought it was about time I gave an update on our thoughts and our work, and maybe made a comment or two about the third dimension. The Group was formed about six months ago and much has happened in that comparatively short period. Figure 1 illustrates the state of 3D broadcasting in July 2010, just after our first meeting. Figure 2 shows the position at the beginning of 2011 and just how fast 3D is growing. The number of 3D channels is still growing, but is there enough content, how good is that content and, more importantly, how good is the "3Dness" of the content? If you look at the maps again, the really sad thing is the lack (or even complete absence) of the Public Service Broadcaster. As part of the process of setting up the 3D Group we sent out a survey asking about the relevance, concerns and plans of each Member. We also asked if the Members wanted the EBU to do something, such as provide information, recommendations, tech notes, etc. There was an overwhelming response: the survey was sent out on a Friday and by the following Monday morning there were over 90 responses! There is obviously a demand for information, but I don't believe there is a desire to do anything but observe. The survey did allow us to set some simple but clear objectives for the Group: provide objective information on the different 3DTV technologies; inform general managers about the anticipated 3DTV developments over the next 2-4 years and their implications; provide an overview of the worldwide standardisation activities and the areas where EBU Members' input is required; and collect EBU Members' requirements (technical, psycho-visual, business related) and ensure their adoption. We are gathering information about the current issues and techniques for 3D programme making at the moment, but we have already delivered the second objective, which is available at http://tech.ebu.ch/3dtv.

The Group wanted to provide the answer to any PSB who asks, "If I had to start from scratch, what do I need to know to make a good 3D programme?" The answer is: take some advice, then take some more, and keep taking advice through the whole process. But where, and what advice? Here are some early thoughts.
Advice 1. Will it work in 3D? - What really makes a programme work in 3D is not the technology; just like 2D, it's the story and the context. So you could ask, "Can the story use or benefit from 3D?" Can it play to the medium to tell the story, or is 3D just a box-office effect? You should be able to watch a 3D programme and almost not notice it's there, but at the same time miss it if it were not there. That doesn't mean that it's too subtle to see, but that it's a natural part of the story.
Advice 2. Can you capture it in 3D? - There is a very high probability that traditional 2D camera positions, especially at large arena venues, will not be good for 3D. This is where there's no substitute for experience and expertise; or, if you are doing it on your own, be prepared to accept the fact that you will make a lot of mistakes early on that will never see the light of transmission. Just because you may need new camera positions for good 3D, it doesn't mean you need to keep all the old 2D positions. The 2D coverage may well be different, but not necessarily unsatisfactory.
Advice 3. Planning - Call in the experts early and make sure there's someone leading the project whose role is to maintain an overview of the whole process from beginning to end. There are new jobs to consider, but right at the top is the stereographer. Phil Streather, the stereographer involved in the EBU's 3D training, cannot emphasise enough the importance of planning and preparation. Chris Johns, who is leading BSkyB's 3D efforts, cannot get through a presentation without those words appearing on every slide.
Advice 4. Communication - Keeping everyone fully informed and up to date with the inevitable changes is vital, especially for live programmes where there is not usually the chance to correct issues during the event.

Fig.1 - 3D Broadcast July 2010: USA, UK, Spain, Germany, Portugal, Brazil, Australia, Middle East, Finland, Poland, Japan, Korea, France (map © 2011 Futuresource Consulting Ltd)


3DTV Study Group Survey on Stereoscopic 3D (S3D)

| Option | Response Percent | Response Count |
| Basic information on the strategic impact of S3D | 77.9% | 74 |
| Training on S3D production grammar and techniques | 49.5% | 47 |
| S3D production technology advice aimed at production and operational staff | 49.5% | 47 |
| Advice for technology on S3D standards for acquisition, production, contribution and distribution | 69.5% | 66 |
| Guidelines and help on how to understand and evaluate S3D image quality (objective and subjective) | 50.5% | 48 |
| Influence S3D standards bodies to ensure open and interoperable technology | 66.3% | 63 |
| Information on terminology surrounding S3D | 42.1% | 40 |
| Other (please specify) | 7.4% | 7 |
| Answered questions | | 95 |


Even what you think is obvious should be communicated; does everyone know you are doing 3D? Does everyone know the left from the right? Does everyone know there are two video streams? A good programme plan with contact details is just as important as the cameras.
Advice 5. Cameras and Rigs - It is tempting to save money here because rigs can be incredibly expensive. Conversely, it is easy to over-specify. Setting up a top-end rig to capture a shot that a simple side-by-side rig could deliver just as well is a waste of money. Then again, trying to do a money shot with inadequate equipment will produce poor results and bad 3D. You will end up wasting money trying to correct the 3D in post or, if it's live, you will just upset the audience. The moral here is to take advice.
Advice 6. Contribution Circuits & Signal Processing - This is where advice has to be backed up by tests and experience. How will the signal from a live event be sent back to base? What compression will be used, and will it handle the two vision streams in the same way? How do you standards-convert two image streams identically? Is the infrastructure capable of handling two streams - dual-link SDI? 3G-DS? Or is it 3-DL? There are too many options to discuss in this short article. Remember, no matter how good the 3D is on site, if you can't get it back to base and on-air, all the work is for nothing.
Advice 7. Recording & Post Production Compression - How will NLE compression handle 3D? What about 3D EVS, or must you use 880 Mbit/s HDCAM SR? Long-term storage formats for the archive are really important but often overlooked. The best advice for the archive is to store two image streams (left eye and right eye) clean. That means: do not squeeze to Side by Side or Top & Bottom. This is essential to maintain quality as the 3D distribution technology evolves. Plan your post production short-term storage and contribution to deliver both left and right eye images at the highest possible quality.
Advice 8. Editing & Audio - Speaking as an ex-editor, once you are into post production you can only work with what you're given, the tools you have, and your imagination and skill. Seriously though, on a TV budget for post, you can only do so much. This is where any lack of planning or communication will catch you out. However, if you have planned and communicated, the post will be smooth; but just in case, the 3D tools now available from the main NLE manufacturers can at least help in tricky situations.
Advice 9. Subtitles, Graphics and Captions - Where to put them? Titles have to be carefully placed so they don't become divorced from the image they are associated with.

Fig.2 - 3D Broadcast January 2011: UK, Finland, USA, Spain, Canada, France, Brazil, Portugal, Italy, Poland, Russia, Japan, Germany, Korea, Australia, Middle East (legend includes FTA-terrestrial trials for World Cup 2010)
Too far forward and the audience feel their eyes crossing trying to read them. Too far back and they appear on top of the objects or performers they are supposed to be behind. The advice could go on for a lot longer, but we need more knowledge to help. So, is this what the EBU 3D Group can do? Talk to the manufacturers, the experts and the audience, so that the EBU can provide the guidance, recommendations and expertise its Members expect. Coming so soon after the pressure to go HD, many PSBs are asking if 3D is a dimension too far. But the results of the survey sent out by the 3D Group demonstrated there is a real concern about the impact 3D subscription channels will have on the PSBs' position and reputation for innovation in each territory. At the EBU Technology Seminar the last session was on 3D, with three very good presentations. I took the opportunity to ask Members to support the work of the Group with practical help. Please take some time to look at the presentation on the EBU Technical website, and come and join us. The Group's next, and most important, task is to disseminate all the information on the processing, technology and techniques required to make good 3D high definition programmes for a public service audience, with the quality (editorial and technical) our audiences expect. As part of this process, I have taken on the role of joint ITU special rapporteur looking at the issues surrounding 3D production. I am looking forward to the next few months. It's going to be an interesting, exciting and worrying journey, and I hope some of you will join me.


a real concern about the impact 3D subscription channels will have on the PSBs' position and reputation for innovation


in focus

Dreams become reality


Marc Lambreghs looks back at defining moments in the Eurovision network's satellite broadcasting history.

When I began working at the Belgian broadcaster RTBF in the mid-1970s, preparations were under way for the frequency planning conference for satellite broadcasting services. I was put in charge of looking into this technology and reporting back on the opportunities and potential benefits it provided for the RTBF. The transmission system that was due to be discussed at the conference was analogue based, which would allow a 27 MHz television channel to be broadcast in the 12 GHz frequency band. This conference, WARC-1977, drew up the plan for the satellite broadcasting service for the general public. At around the same time, broadcasters also started taking an interest in how satellite communications could be used to meet their own needs, e.g., contribution links. Transportable or portable stations that could relay events directly from a remote site through a satellite was one idea that was put forward, though it was dismissed as fantastical, given that satellite communications were still in their infancy. Bear in mind that, in those days, satellites were used almost exclusively for intercontinental telephone transmissions and earth stations were huge, sporting 30 metre satellite dishes, equipped with a cooled parametric amplifier for reception, a klystron amplifier for transmission and a complete waveguide system between the various components. The whole set-up ran on C-band (4-6 GHz). Furthermore, these stations had to be placed in special locations free from radio-frequency interference (Goonhilly, Raisting, Pleumeur-Bodou, Fucino, Loche, Lessive, and so forth). The dream became reality with the launch of the OTS (Orbital Test Satellite) experiment, which determined the system's features, establishing signal propagation in the 12 GHz frequency band.

Yet the defining moment for Europe was the founding of Eutelsat in 1977 to manage the continent's satellite-communication resources; its first satellite was launched in 1983. A capacity-leasing agreement between Eutelsat and the EBU paved the way for the first satellite-borne transmissions on behalf of Eurovision. Since these first transmissions, the Eurovision satellite network has grown, thanks to deregulation, which made it possible for Members to acquire their own earth stations. Euroradio channels were also added. Transportable stations (e.g., for news gathering) started to be used widely. A key event in the history of Eurovision was the digitisation of the network's satellite component in the summer of 1998. This changeover significantly extended available capacity, allowing for 16 wideband digital television channels instead of the 6 analogue channels, all within the same space segment. As a result, satellite became the main transmission technology across the Eurovision network, and many terrestrial links fell into disuse until the advent of optical fibre communications. The network continued to incorporate new technology in keeping with economic constraints. In particular, transmission bit rates could be tailored to the needs at hand, endowing the system with flexibility. Within the direct broadcast satellite reception domain, analogue satellite transmissions had only been of interest to real amateurs. However, with the arrival of inexpensive digital receivers, satellite TV became all the rage and dishes popped up everywhere. Part of the attraction was that the services on offer generally had pan-European reach, which gave satellite TV an international dimension. The challenge facing broadcasting satellite systems today is how to fit in with a telecommunications industry that is dominated by IP protocols. Even though IP requires increased transmission capabilities to carry radio and television signals, IP networks are now used widely. In professional broadcasting, integration has made it possible to handle real-time transmission and file transfer on the same platform, as well as adding support for informational, control and administrative data. In conclusion, during these nearly 35 years of my professional life I have found that it was good to believe in utopias, as technological evolution has in general made them reality; we just need to wait a little.

Marc Lambreghs, recently retired EBU Senior Engineer

01. First SECAM transmission on Eurovision in 1968  02. Uplinks, 1980s style  03. First PAL transmission on Eurovision in 1967



case study

HbbTV In France
TF1's Frédéric Tapissier gives a brief history of HbbTV and its planned launch.

In French, DTT shares its abbreviation with Alfred Nobel's explosive invention, TNT, and stands for Télévision Numérique Terrestre, the exact translation of Digital Terrestrial Television. When it was launched in 2005 the number of free-to-air channels exploded. However, there was no consensus to include an interactive TV format, such as in the UK with MHEG-5 or in Italy with MHP. Broadcasters were anticipating a generalisation of IP connectivity and were not convinced by the existing solutions. They then started a working group to study an alternative solution based on HTML that would work with both broadcast and online. The group, led by the major French broadcasters (TF1, France Télévisions, M6) and helped by middleware expert OpenTV, decided to create specifications by making a selection within the existing standards (mainly W3C and DVB) and to complement this selection with a few extra APIs when there was no other means to fulfil a critical need. When communicating the results of their work, they realised that a very similar initiative was well advanced in Germany, by IRT, Astra, Philips and ANT. Both teams decided to work together to create a new standard for hybrid TV receivers: HbbTV was born.

HbbTV
The shared objectives of HbbTV were to create a standard that: addresses both broadcast and online content distribution; is hybrid, with broadcast content and online content that are complementary to each other; uses existing standards, for ease of implementation and no IPR issues; and aims for HTML as a development environment mastered by broadcasters. The team that finally led the project included Samsung and Sony as key representatives of the CE industry and the EBU as the representative of the European broadcasters, all with vast experience in standardisation. A website was created and very soon the project found numerous supporters ready to contribute to specification work and promotion. On July 1, 2010, ETSI approved the HbbTV specification and made it an official standard.

HbbTV in France & the TNT 2.0 project
French broadcasters have expressed on several occasions their will to start hybrid services over DTT. With more than 50% of the population relying on TV aerials to get pictures on their screens, terrestrial broadcasting is the primary means of receiving television in France. About half of these households have a broadband connection, which makes DTT households a sizable market opportunity of several million. Of course, other markets are also targeted, such as free-to-air satellite (TNTSat, Fransat), satellite pay TV, cable and IPTV. In order to make it a mass-audience success, the HD Forum France plans to create a new brand that will heighten consumer awareness and help create a rich and comprehensive offer. The project name is currently TNT 2.0, which would refer not only to HbbTV, but to a full set of features giving access to specific services that would range from premium video and a new-generation EPG to synchronised applications. The launch is planned for spring 2012.

Frédéric Tapissier, TF1, Head of CE Partnerships and Innovation & President of the Technical Committee, HD Forum France


in focus

More pixels = More immersive television experience


Dr. Hans Hoffmann provides a status update on the development of Ultra High Definition Television.

Whilst many countries are only just starting or increasing their HDTV services, even more immersive media systems, called in general Ultra High Definition Television (UHDTV), are under development in several parts of the world. A particular form is termed Super-Hi-Vision (SHV). Just to add to the acronyms, these form a subset of a wider class of video systems called EHRI (Extremely High Resolution Imaging), which covers systems with uses outside and inside television. The NHK (Nippon Hoso Kyokai) Laboratories in Japan are leading the research in SHV and promoting this new and very immersive image format with 32 million pixels per picture, which also includes a revolutionary 3-dimensional 22.2-channel sound system. Immersive in the context of SHV means that the viewer can have a much wider viewing angle than with normal HDTV. The assumption is that the viewing angle will be up to 100 degrees (HDTV is assumed to have only a mere 30 degrees). The higher resolution allows the viewer to enjoy stunning images at much closer distances to the screen. The degree of reality perceived with UHDTV images is dramatic. UHDTV would allow us to enhance the value of the content, generated in UHDTV, because of the greater reality it captures. UHDTV image formats come in two levels.

Level 1 (UHDTV1) has 3840 x 2160 pixels (termed the 4K system), and Level 2 (UHDTV2) has 7680 x 4320 pixels (termed the 8K system), corresponding to 4 and 16 times the resolution of a progressive HDTV 1920 x 1080 pixel picture. Both levels have an aspect ratio of 16:9, progressive scanning, and currently 24, 50 or 60 frames per second.¹ The UHDTV image format and parts of the sound system have already undergone standardisation in the SMPTE and in the ITU. The following summarises the current standards work:

Image Format
- ITU-R Recommendation BT.1201-1 (1995-2004): Extremely high resolution imagery (tiered image formats based on multiples of 1920 x 1080)
- ITU-R Recommendation BT.1769 (2006): Parameter values for an expanded hierarchy of LSDI (large screen digital imagery) image formats for production and international programme exchange
- SMPTE 2036-1-2009: Ultra High Definition Television - Image Parameter Values for Program Production

Audio
- SMPTE 2036-2-2008: Ultra High Definition Television - Audio Characteristics and Audio Channel Mapping for Program Production

Interfacing
- SMPTE 2036-3-2010: Ultra High Definition Television - Mapping into Single-link or Multi-link 10 Gb/s Serial Signal/Data Interface

In addition, the SMPTE is undertaking further work on carrying UHDTV image formats in multi-link SDI interfaces. For many years NHK has shown SHV using 8K projectors at various conventions (IBC, NAB), with impressive UHDTV images and 22.2 audio in its theatres, attracting thousands of spectators. Aside from the impressive image resolution, the sound system produces an unforgettable experience. Among the first applications of UHDTV/SHV will be the public viewing of the London Olympics in 2012, and this may play a key role in the future of UHDTV. Some argue that UHDTV will be much better suited to an event like the Olympic Games than 3DTV, because the action has such a wide canvas and degree of detail. Could it be that one UHDTV camera alone, pointing into the vast stadium, will be all you need for a great telecast, and the same experience as being there? The trends in the consumer domain for displays are also important, and this is a crucial element for the future of UHDTV. 4K LCD and plasma displays in the range of 52 inches and larger, corresponding to UHDTV Level 1, have already been developed and are commercially available.

BBC/NHK shooting at London's City Hall for the IBC 2008 UHDTV demonstration

Images of SHV production and projection courtesy of NHK


UHDTV1 at RAI
Developments on consumer displays for 8K can be expected. Two fundamental technological challenges are in focus. The first is the image capture technology needed, with new cameras equipped with sensors that provide the appropriate spatial and temporal resolution with low noise. The second is how to manage the huge data rates that come with up to 33.1 megapixels per picture, corresponding to, for example, 16 HD-SDI links at 1.5 Gbit/s each, uncompressed. In addition to these challenges, compression technologies for production and for distribution have to be developed or optimised. For example, demonstrations between NHK, BBC, EBU and RAI for IBC 2008 in Amsterdam have already proven the concept of compressed UHDTV delivery, with a real-time transmission via IP from London to Amsterdam (compressed at 600 Mbit/s) and also satellite distribution from Turin to Amsterdam (120 Mbit/s in the 22 GHz band). Both bitrates are very high compared with the bitrates used in today's HDTV applications. An ongoing question is also how the current 3D stereoscopic developments and two-dimensional UHDTV should be put into context. Are the systems competing, or can they complement each other? Whilst glasses-based displays for 3D (2-view stereoscopy) are the hype now, future multi-view 3D could benefit from developments in UHDTV (sharing pixels between views). On the other hand, everybody who has ever seen well-made 2D UHDTV will agree on the immersive experience it provides to the viewer. Consequently the jury might still be out for some years to come on which of them will provide the best experience. There is a strong likelihood that both systems, when technology and business cases mature, can complement each other. We can observe that, for Level 1 UHDTV at least, technical developments have partly left the research domain and are now in the hands of the supplier and consumer industries. At the same time, there are increasing efforts to agree new standards for interfaces and compression systems that take into account the needs of UHDTV, so that we can optimistically look into the future of immersive media.
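To put those figures in perspective, the back-of-envelope sketch below estimates the uncompressed payload of the formats discussed above. The 10-bit, 4:2:2 sampling assumption is ours (the article only quotes the 16 x HD-SDI arrangement), so the exact numbers depend on the sampling structure chosen; the point is simply that an 8K signal lands in the tens of Gbit/s, which is why multi-link interfaces are being standardised.

```typescript
// Rough uncompressed payload estimate for the image formats discussed above.
// Assumption (ours): 10-bit samples, 4:2:2 sampling, i.e. 20 bits per pixel on average.
function uncompressedGbps(width: number, height: number, fps: number): number {
  const bitsPerPixel = 10 * 2; // 4:2:2 -> two 10-bit samples per pixel
  return (width * height * fps * bitsPerPixel) / 1e9;
}

console.log(uncompressedGbps(1920, 1080, 50).toFixed(1)); // HDTV 1080p50    ~  2.1 Gbit/s
console.log(uncompressedGbps(3840, 2160, 50).toFixed(1)); // UHDTV1 (4K) 50p ~  8.3 Gbit/s
console.log(uncompressedGbps(7680, 4320, 60).toFixed(1)); // UHDTV2 (8K) 60p ~ 39.8 Gbit/s
```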

Last year, the EBU Technical Assembly was given an impressive demonstration of a programme made by RAI in UHDTV1, the system with about 2000 lines and progressive scanning. The programme was about a day in the life of the city of Turin. It included a largely previously unseen self-portrait drawing by Leonardo da Vinci, and was a real feast for human eyes. The UHDTV1 programme gave an "almost there" experience, coupled with sight of an artwork usually left unseen for fear that the light may diminish it. The programme was shot with a 4K camera of the kind sometimes used for movie production, running at 24 pictures per second. Not only was the programme itself impressive, the demonstration included compressing the programme into a bit rate which can be carried by a DVB-T2 digital terrestrial broadcast channel. The tests demonstrated that a 4K UHDTV1 system, with about 8 million pixels per picture, could be broadcast terrestrially in less than 40 Mbit/s. Can we conclude that such things will be done in the decades ahead? The demonstration highlighted the need for broadcasters to take seriously the need for adequate terrestrial broadcast spectrum to allow the natural development of terrestrial broadcasting quality. It also highlights what the public will be deprived of if, as some suggest, much of the currently used broadcast bands are taken away from broadcast use. If Leonardo himself had seen the demonstration, looking back from the Renaissance, he would surely have applauded it, and suggested that UHDTV may indeed be the new Renaissance for television, with images that are closer to the truth. He would have helped us to maintain the broadcast bands for broadcasting. One of the other issues about UHDTV that the Turin demonstration brought to light is the differences in perceived quality the viewer is likely to experience between HDTV, UHDTV1 and UHDTV2.

The quality we experience will be affected by several aspects of the format, such as the aspect ratio, the picture rate, the colorimetry, and the amount of detail in the picture. How big is (or should be) the quality jump between HDTV, UHDTV1 and UHDTV2? In terms of the amount of detail in the picture, our perception is probably logarithmic, which means that the quality increases with (say) the square root of the amount of detail in the picture. In rough and ready terms, this means that the quality jump we will experience in moving from SDTV to HDTV will be about the same as moving from HDTV to UHDTV1, and from UHDTV1 to UHDTV2. This may be approximately 1.5 ITU quality grades per jump. So, having a two-phase progression may make sense.
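As a quick sanity check on that rule of thumb, the sketch below applies the square-root relationship to the pixel counts quoted in the article (the 720 x 576 figure used for SDTV is our assumption); each step comes out at roughly a factor of two, i.e. three comparably sized jumps.

```typescript
// Apply the sidebar's rough "quality ~ sqrt(detail)" rule to the pixel counts above.
const pixels: Record<string, number> = {
  SDTV: 720 * 576,      // assumed 576-line SD raster
  HDTV: 1920 * 1080,
  UHDTV1: 3840 * 2160,
  UHDTV2: 7680 * 4320,
};

const steps: Array<[string, string]> = [["SDTV", "HDTV"], ["HDTV", "UHDTV1"], ["UHDTV1", "UHDTV2"]];

for (const [from, to] of steps) {
  const jump = Math.sqrt(pixels[to] / pixels[from]);
  console.log(`${from} -> ${to}: sqrt(detail ratio) = ${jump.toFixed(1)}`);
}
// SDTV -> HDTV: 2.2, HDTV -> UHDTV1: 2.0, UHDTV1 -> UHDTV2: 2.0
```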

everybody who has ever seen well-made 2D UHDTV will agree on the immersive experience it provides to the viewer

¹ Higher frame rates are under research.


in focus

Broadcaster requirements for broadband - Part 2


Dr. ir. E.M. Verharen, Manager R&D at Nederlandse Publieke Omroep, examines the challenges surrounding broadcasting to mass online audiences in Part 2 of this two-part series.

In Part 1 we looked at distribution issues. Here we describe the formats and protocols needed to deliver PSBs' linear and on-demand programmes over the internet. For people to watch online video or listen to online audio, a media file or stream has to be delivered to the end-user's device. The broadcaster has to choose whether to support a myriad of devices or a specific set of them, along with the associated delivery mechanisms, so as to reach as large an audience as possible. In any case, choices have to be made on: image size (resolution) and bit depth, frame/sampling rate, bitrate, format¹ (often referred to as codec), container or wrapper², delivery method (streaming, progressive download or file download) and protocol. In choosing the best mix, you need to predict your audience, the type of connection available, and what device is used for viewing/listening, and its features, e.g., in scaling. For instance, to determine the optimal bitrate, consider the percentage of broadband connections in your service area. For over-the-top linear services the preferred display device is a TV, so choose a larger resolution and frame rate. For on-demand mobile TV, a smaller resolution and a frame rate as low as 12 fps can help save bandwidth and prevent artefacts and buffer underruns. You should choose HTTP if already-deployed content acceleration and distribution technology can be used. Unfortunately, only certain combinations of codec and container are supported by today's hardware and software players. The goal is to pick the right flavour-of-the-day combination, supporting the devices your users have now and taking into account their replacement cycle. The most popular formats and containers today are still determined by the first streaming technology providers; listing the most used video format, audio format, container and delivery protocol respectively, they are: Apple, with MPEG-4, AAC, QuickTime and RTSP; Microsoft, with WMV/VC-1, WMA, AVI, and MMS and HTTP; and Adobe, with the VPx video codecs, MP3, Flash and RTMP.
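One way those trade-offs might be written down is sketched below. The specific numbers are illustrative assumptions chosen to match the reasoning in the text (big screens get a higher resolution and frame rate, on-demand mobile can drop to 12 fps to save bandwidth); they are not any broadcaster's actual settings.

```typescript
// Illustrative encoding choices per target device class (assumed values only).
interface EncodingProfile {
  width: number;
  height: number;
  fps: number;
  videoKbps: number;
  delivery: "adaptive-http-streaming" | "progressive-download";
}

function profileFor(target: "ott-tv" | "desktop" | "mobile"): EncodingProfile {
  switch (target) {
    case "ott-tv": // over-the-top linear service on a TV screen
      return { width: 1280, height: 720, fps: 25, videoKbps: 3500, delivery: "adaptive-http-streaming" };
    case "desktop":
      return { width: 854, height: 480, fps: 25, videoKbps: 1500, delivery: "adaptive-http-streaming" };
    case "mobile": // on-demand mobile TV: low frame rate and bitrate to avoid buffer underruns
      return { width: 480, height: 270, fps: 12, videoKbps: 400, delivery: "adaptive-http-streaming" };
  }
}

console.log(profileFor("mobile"));
```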

For a while the world seemed to converge on using H.264 as the main video format, MP3 as the main audio format, MP4 as the container, and delivery through RTSP and HTTP. However, recent developments and the entrance of new giants have shown that the codec and container wars are far from over. For instance, (at the time of writing) Google has withdrawn support for H.264 in its browser and favours WebM as the container, with the associated VP8 video and Vorbis audio codecs. And, although adaptive HTTP streaming is supported by Apple, Microsoft and Adobe, unfortunately they all have different format and implementation details. The EBU has prepared a statement on the requirement for standardisation in the codec and delivery fields. From the broadcaster's side, today, the optimal mix of formats and protocols to deliver linear and on-demand audio and video services to as many devices as possible is: H.264 as the video format, AAC+ and MP3 as the audio formats, MP4 and MPEG-2 TS as the container formats, and adaptive streaming using HTTP as the delivery protocol.

In the Netherlands, NPO recently supported the development of software that takes one H.264-encoded stream and rewraps it on playout into the appropriate Apple, Silverlight and Flash container formats, delivering it through open-source (Apache and Lighttpd) HTTP servers to a broad spectrum of online players and devices, including Silverlight, Flash, Apple and other mobile devices using their native players. It can now reduce its costs by simplifying the playout platform, i.e., getting rid of Windows Media, QuickTime and Flash server farms, as well as the encoding platform, since only H.264 in different bitrates is needed, phasing out WMV/VC-1 and specific Apple, 3GP and VPx encoding. The EBU ECP (Expert Community on Platforms and Services) looks into the developments on the format/codec and container side on the one hand, and the broadcasters' requirements on the other.
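The core of that "encode once, rewrap per client" idea can be sketched as below. This is a simplified illustration, not NPO's actual software: the client detection and wrapper names are assumptions, and a real playout system would repackage the single H.264 mezzanine on the server side.

```typescript
// Sketch: pick a container/delivery wrapper per client for one H.264 mezzanine stream.
type Wrapper = "hls-mpeg2ts" | "smooth-streaming-mp4" | "flash-http-or-rtmp";

function wrapperFor(userAgent: string): Wrapper {
  if (/iPhone|iPad|iPod/i.test(userAgent)) return "hls-mpeg2ts";            // Apple devices expect HLS segments
  if (/Windows Phone|Xbox/i.test(userAgent)) return "smooth-streaming-mp4"; // Silverlight-based clients
  return "flash-http-or-rtmp";                                              // generic desktop fallback
}

console.log(wrapperFor("Mozilla/5.0 (iPad; CPU OS 4_3 like Mac OS X)")); // "hls-mpeg2ts"
```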

¹ http://en.wikipedia.org/wiki/Video_codec and http://en.wikipedia.org/wiki/Audio_codec list the codecs.
² http://en.wikipedia.org/wiki/Container_format_(digital)


standardisation and interoperability update

HTML5 - A game changer?


Franc Kozamernik ponders the future of HTML5.

HTML (HyperText Markup Language) is the language of the code that sits behind every web page displayed by a browser. You can find it on any web page by right-clicking your mouse and selecting "View Page Source". Compared with HTML4, which was introduced in 1997, HTML5 introduces many interesting new elements. For example, the HTML5 dictionary includes <canvas>, which allows the insertion of moving graphics that can be used in games and animations. The HTML5 specification enables the browser to store 1000 times more data than is currently possible, so that it is possible to use web pages even when there is no connection to the internet. For broadcasters and content providers, the most useful feature is a new capability for the native support of audio and video playback. HTML5 is not yet fully developed and still lacks support for many features that are critically important for the content provider: adaptive streaming, digital rights management, advertising and monetisation. In spite of that, it has already been implemented by all major browsers, e.g., Mozilla Firefox, Apple Safari, Google Chrome, Opera and lately Microsoft Internet Explorer 9.

Before the advent of HTML5, in order to get video to play, websites added proprietary programmes (e.g., Adobe Flash and Microsoft Silverlight) and required users to download plug-ins to play them. That made websites more complex and dependent on the presence of a plug-in in the client device. HTML5 provides a new <video> tag to play video directly (natively) in the browser itself, so no third-party plug-in is required. Unlike a plug-in, where the video is locked away and trapped in a black box, the <video> element can be manipulated flexibly: it can be styled with CSS, resized on hover using CSS transitions, tweaked and redisplayed onto <canvas> with JavaScript, and so on. The <video> tag itself is codec-agnostic and leaves the browser developer free to support whatever codec they wish. This leaves the door open to a situation where each browser could use a codec of its choice, which could potentially lead to market fragmentation and indeed to a return to proprietary plug-ins. The table below shows which video codecs (embedded in the appropriate containers) are currently supported by the most recent browsers.

| Browser | Ogg container (Theora video, Vorbis audio) | MPEG-4 container (H.264 video, AAC & MP1L2 audio) | WebM container (VP8 video, Vorbis audio) |
| Internet Explorer 9.0+ | | Y | Y, Note 1 |
| Mozilla Firefox 4.0+ | Y | | Y |
| Apple Safari 3.0+ | | Y | |
| Google Chrome 5.0+ | Y | Y, Note 2 | Y |
| Opera 11.0+ | Y | | Y |
| Apple iPhone 3.0+ | | Y | |
| Android 2.0+ | | Y | Note 3 |

"+" stands for "or later". Note 1: IE9 will only support WebM if the user has installed a VP8 codec. Note 2: Google has decided to remove its support for H.264 from Chrome. Note 3: Google has committed to supporting WebM in Android.

The EBU Expert Community on Platforms and Services has been conducting some studies on using different browsers for HTML5 content. An example of an HTML5 player developed by SRG-SSR/SwissInfo is shown (courtesy Ayar Alazzawi, SwissInfo). Today H.264 is the most widely used video codec in digital broadcasting. On the internet several codecs are being used (see table), the most popular ones being H.264 and WebM/VP8. In today's convergent environments, where the IT, consumer electronics (including mobile) and broadcast worlds are coming together and the borderline between them is blurring, it would be advantageous to consider common audio and video coding for the internet and broadcasting. Not surprisingly, broadcasters prefer using H.264 not only for broadcasting but also for internet distribution of video files and streams. The H.264 licence issues have been successfully resolved: MPEG LA announced in August 2010 that H.264 will be royalty-free forever, so long as video encoded with the standard is free to end users and delivered via the internet. This means that no royalties are required for H.264 web videos that are delivered free of charge (as is the case with most EBU public broadcasters). However, Google recently decided to discontinue supporting H.264, as it only intends to use open-source, licence-free codecs such as WebM/VP8. Many experts, however, fear that unresolved WebM submarine patent issues might later hit those who have implemented this codec. Google's decision may force content providers wishing to target the most popular browsers to produce two video versions, one in H.264 and the other in VP8. As things stand today, Apple and Microsoft will probably continue to support the H.264 codec, whereas Mozilla Firefox, Opera and now Google are likely to support only WebM/VP8. It is unlikely that these two camps will ever agree on a common approach. Although some optimists believe that HTML5 signifies the web's rebirth, many sceptics share the opinion that HTML5 may rise or fall depending on whether or not the browsers are able to reach a consensus on a common native video and audio codec. Unfortunately, the prospects of reaching such a consensus seem meagre.
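One practical consequence of that fragmentation, noted above, is offering the same clip in both H.264/MP4 and WebM/VP8 and letting the browser choose. A minimal browser-side sketch (file names are placeholders, and it assumes a DOM environment) might look like this:

```typescript
// Build a <video> element that offers both codec flavours and keeps only those the browser can play.
const video = document.createElement("video");
video.controls = true;

const candidates: Array<{ src: string; type: string }> = [
  { src: "clip.mp4",  type: 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"' },  // H.264 + AAC
  { src: "clip.webm", type: 'video/webm; codecs="vp8, vorbis"' },            // VP8 + Vorbis
];

for (const c of candidates) {
  // canPlayType returns "", "maybe" or "probably" depending on codec support
  if (video.canPlayType(c.type) !== "") {
    const source = document.createElement("source");
    source.src = c.src;
    source.type = c.type;
    video.appendChild(source);
  }
}

document.body.appendChild(video);
```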


industry news

New SMPTE VP
In January 2011, Dr. Hans Hoffmann, currently Programme Manager in the EBU Technology and Development Department at the EBU headquarters in Geneva, became the first European for nearly 40 years to be elected as Engineering Vice President of the Society of Motion Picture and Television Engineers. This is SMPTE's most senior engineering post. SMPTE is acknowledged to be the most important and active organisation in the world developing standards for programme-making technology. SMPTE has a worldwide membership, and has been responsible for developing standards for the television production and cinema industries for over 80 years. SMPTE has been responsible for many of the technical standards used for programme making today, including the MXF file formats and many others. The broadcasting world looks to SMPTE to bring together programme makers and equipment manufacturers to agree formats which make it possible for different manufacturers' equipment to work together. The EBU has a long and close relationship with SMPTE, and formed a joint Task Force in 1996 which was largely seen as kick-starting the use of information technology in programme production. Hans Hoffmann led part of the work of the Task Force, and has been working in SMPTE on behalf of the EBU ever since that time. He has served in many posts in SMPTE, and is currently a Board Member as Regional Director for Europe, Middle East, Africa and Central & South America. He relinquishes this post, taking up SMPTE's senior engineering post for an elected term. Lieven Vermaele, Director of EBU Technology and Development, said: "The EBU is honoured by the confidence that SMPTE members have shown in one of the team. I believe it demonstrates our competence, and reflects our commitment to standards." Dr. Hoffmann remarked: "There are many important standardisation challenges facing the broadcast community; my job as Engineering Vice President will be to identify them and encourage SMPTE members to work together to solve them."

Dr. Hans Hoffmann

member spotlight

In the Spotlight
issue, the spotlight is Petr Vitek In thisCzech Television. PetronCommittee from has been a member of the EBU Technical since 2000. His role includes strategic adviser on many interdisciplinary levels relating to new technologies. The EBU Technical Committee consists of 13 elected members, who represent the interests of the EBU membership as a whole. They are asked to consider themselves elected as individuals rather than organisation representatives, and thus to speak for other members in similar circumstances. and to share with him the remaining 350 km or so to Prague. We spent an unforgettable three days together cycling around the country. I also like listening to music as a means of intense relaxation. 3. What do you consider as your finest achievement so far in your career? Undoubtedly, it is the correct setting of Czech Televisions strategy in the field of digital broadcasting. It is something we started working on as early as 19992002 and, with hindsight, we can now fully appreciate the stabilising and innovative elements we incorporated in it. On this occasion, I recall the useful EBU Technical Seminar held in Czech Television in 1995 dedicated to digital broadcasting and DVB standards as well as the meeting of the EBU Technical Committee chaired by Professor Messerschmidt in 1998 when, in the role of observer, I was acquiring knowledge of strategic decisions made within TC EBU. 4. Why did you step forward as a candidate for the EBU Technical Committee? For me, the EBU Technical Committee is an environment where I can make use of my long term experience in the field of media and further develop my capacity on an international level for the benefit of all EBU Members. Plurality of opinions and diversity have always been a challenge to me and as one of the most senior members of the Technical Committee I am always willing to share my experience acquired within the EBU. Years ago, it was the other way around and it was me on the receiving end.

1. Can you tell us something of your current responsibilities at Czech Television? I am responsible for international cooperation of Czech Television in the field of television technology strategy as well as for solutions of issues of international and national importance pertaining to Czech Television. 2. Its always interesting to hear about outside interests - what are yours? I am an avid cyclist and I love hiking and almost all winter sports. I should say, however, that I am not the only keen cyclist in the EBU Technical Committee; I was very pleased that when my colleague from the Netherlands, Mr. Jan Doeven, Chairman of the Broadcast Technology Management Committee, was about to leave his activities in the EBU after his long and distinguished work for the EBU, he decided to make a cycling trip from Amsterdam to Prague to pay me a visit. My natural reaction was to meet him at the Czech border in western Bohemia

Petr Vitek, Ph.D., Czech Television Currently, a great challenge for me is New Media. Another reason for submitting my candidature is the fact that as a member of the CEE (Central Eastern European) EBU Group I am somewhat of a representative of the Group in TC EBU or, so to speak, a gateway for the Group Members to the TC EBU. Needless to say, however, without the strong support from Czech Television and EBU Members, my candidature would not be possible or even thinkable. 5. What are for you the most important challenges facing EBU Members, particularly those with circumstances similar to Czech TV, today? Future development of terrestrial digital broadcasting including HD distribution, online media including net neutrality, hybrid TV environment, utilisation of digital archive, interoperability in production area, seeking to unify approaches to online media.


standardisation and interoperability update

David Wood asks the question: "Are we paying enough lip-service to television?"

Out of Sync
One of the irritations for viewers in the age of digital television is changes in loudness between programmes and commercial breaks, or between TV channels, or packaged media, etc. It's irritating to keep jumping up and down to adjust the volume of the TV sound. We have a solution here (see page 15), as reported previously in tech-i. Broadcasters only need to use the same kind of loudness meter (an EBU-mode ITU-R BS.1770 meter) and use a single target loudness (it's called -23 LUFS). The other irritation is the potential lack of synchronisation between the pictures we see and the voices we hear from the TV set. The really bad case is when the sound of the voice is noticeably ahead of the movement of the lips of the person speaking. Can we fix this one just as we have done with loudness? This is occupying standardisation groups worldwide. We don't have a universal solution yet, but at least we understand the problem, and that's a start. The first thing is our human tolerance for lack of synchronisation. The results of subjective evaluations are well known. In tests with SDTV, 50% of viewers at normal viewing distances (6H) notice a lack of synchronisation but are not annoyed by it with delays of +45 ms (advance) and -125 ms (delay).

We are at least twice as tolerant of delay as we are of advance, because we are quite used to delay in our normal lives. For instance, the sound from people laughing takes about 20 ms to travel 6 metres to you, but their joyous faces are there instantly. Beyond that, tests show that about 50% of listeners find the lack of synchronisation annoying when the delays are about +90 ms and -185 ms. If we cannot better these limits we will be failing as broadcasters. EBU Recommendation R37 (from artist to transmitter) is for tolerances of +40 ms and -60 ms. These are within the human end-to-end tolerances, to allow for the delay introduced by the transmitter encoder, the display type, and the viewing distance. Plasma displays introduce delays of about 40-90 ms, and LCDs of 30-80 ms, and of course viewing distances vary in the home. Is there a means to automatically adjust the sound/vision delay so that, artist to armchair, the advance is always less than +40 ms? Today we search for automatic and manual systems to remove delay up to the transmitter. An important idea being studied is termed an A/V fingerprint. A fingerprint is a digital signal extracted from video or audio which unambiguously identifies the image or sound over a given period, and which is always valid no matter what happens to the sound or vision quality.

The idea is to derive, at a point where we know the sound and video are exactly in sync (say, the camera output), fingerprints for both the audio and video separately. These are then compared and metadata is created about their relationship. We now have our "true" A/V fingerprint. At any later point in the chain, we can derive a new A/V fingerprint, and adjust the delay until the downstream fingerprint matches the source (true) fingerprint. We will need to keep carrying the true A/V fingerprint metadata wherever the programme goes in the production process. A SMPTE ad-hoc group is working on several aspects of this matter, such as user requirements. Though there are different ways of achieving our artist-to-armchair objectives, one way may in future be to carry the fingerprint metadata over to the home receiver. Another may be to use MPEG time stamps for codec delay removal. Could it be that the smartphone you will be using to control the TV set will be able to set the delay, spot on, from your armchair, via an app? Let's hope that before too long we can remove the second major irritation for viewers of digital television.
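The correction step implied by that fingerprint comparison is simple once the offsets have been measured. The sketch below assumes the fingerprint matching has already yielded an audio/video offset at the source and again downstream (the fingerprint extraction itself is the hard part and is not shown); it returns the audio delay to insert and checks the result against the EBU R37 window.

```typescript
// Sketch of the re-alignment step only; the fingerprint measurement is assumed done upstream.
interface AvRelationship {
  audioLeadMs: number; // positive = audio ahead of (in advance of) the video
}

// Extra audio delay to insert downstream so the source relationship is restored.
function audioDelayToInsertMs(source: AvRelationship, downstream: AvRelationship): number {
  return downstream.audioLeadMs - source.audioLeadMs;
}

// EBU R37 tolerance window (artist to transmitter): +40 ms advance to -60 ms delay.
function withinR37(audioLeadMs: number): boolean {
  return audioLeadMs <= 40 && audioLeadMs >= -60;
}

const source = { audioLeadMs: 0 };      // true fingerprint: in sync at the camera output
const downstream = { audioLeadMs: 70 }; // measured later: audio 70 ms ahead of vision
const fix = audioDelayToInsertMs(source, downstream);
console.log(fix, withinR37(downstream.audioLeadMs - fix)); // 70, true -> delay audio by 70 ms
```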


seminar news

CPM11-2
What is CPM11-2?
The Conference Preparatory Meeting (CPM11-2) gathers all interested ITU member administrations. It is tasked by the ITU with the preparation of a consolidated report to be used in support of the work of the next World Radiocommunication Conference in 2012 (WRC12). The preparation of this report was started three years ago, just after the previous WRC07, and has involved many ITU working groups, each tasked with the preparation of a part of the report with proposals related to the WRC12 agenda items. The proposals are called "methods" to solve the issues related to each agenda item. There are usually several proposed methods for each item. The CPM11-2 took place from February 14-25, 2011 in Geneva to finalise and approve the report.

[ITU poster: 2nd Session of the 2011 Conference Preparatory Meeting for the 2012 World Radiocommunication Conference, Geneva, 14-25 February 2011, www.itu.int/ITU-R/go/rcpm]

Broadcasters and the CPM11-2
Broadcasters, on a regional or global basis, are concerned by several agenda items of the WRC12 and have participated in the preparation of the corresponding parts of the CPM report during the last three years. They contributed to this CPM11-2 to make sure that the CPM report contains methods that are fair to the broadcasting service interests. The main subjects of interest for broadcasters in WRC12 are:
- the need for sufficient frequency resources for electronic news gathering systems;
- the possible constraints on FM broadcasting services from the introduction of new aeronautical mobile (R) service (AM(R)S) systems in the FM upper adjacent band;
- the impact on broadcasting from the mobile services using the 790-862 MHz band;
- the future of the international spectrum regulatory framework, in particular regarding the definition of radiocommunication services;
- the future of software-defined radio and cognitive radio systems in the ITU;
- the protection of broadcasting from emissions from short-range devices;
- the preparation of the agenda for the next WRC, expected in 2016, particularly with regard to expectations around further allocations to the mobile service in the UHF band.

What comes after the CPM11-2?
The preparation by broadcasters for the WRC12 will continue, with the aim of having their favoured methods of the CPM report supported by administrations at the conference in January 2012. A detailed article is expected for publication in the EBU Technical Review later in 2011.
Walid Sami

Welcome to the Hybrid Age


An EBU General Assembly event

Hybrid Broadcast Broadband (HBB) and connected television remain a key strategic topic for EBU Members, with several initiatives underway among them. The EBU's General Assembly is a biannual gathering of senior executives from EBU Members to discuss such key topics: thus an ideal opportunity to raise awareness of HBB and discuss its implications. As part of its remit, EBU Technical organised a conference immediately prior to the EBU General Assembly, gathering the main players in the hybrid space. The highlight was a speech by Anthony Rose (ex-CTO, YouView), a key figure in the development of the BBC's online strategy. Anthony explained the motivation and proposed roll-out of YouView, highlighting the importance of the initiative and the commitment of the shareholders to ensuring its success. It is hoped that YouView will sit alongside the very successful free-to-air Freeview terrestrial platform. EBU General Assembly delegates also heard country updates from France, Germany, Italy and Spain. Different markets at different stages of development mean that different business and technical choices are being made by these countries. Nonetheless, there is a wish to adopt open standards and embrace the opportunities afforded by hybrid services as quickly as possible. Enhanced data services replacing analogue teletext, and transferring the popular catch-up TV services to the television, appear to be two key drivers. A highlight of the event was a keynote presentation by Andreas Weiss (ARD) outlining a vision of a common EBU-wide HBB initiative. Andreas stressed the importance for EBU Members of taking a common stance on hybrid services and platforms and, most importantly, of being proactive in launching services. He pointed out the synergies between Members' linear and online services and the opportunity hybrid affords to really enhance the viewers' experience of public service media. The next steps will be to bring together all those who expressed an interest in working with the EBU to bring hybrid to their markets. The EBU and its Members are working with industry stakeholders to capitalise on the mutual benefits that hybrid services promise.
Peter MacAvock

pointed out the synergies between Members linear and online services and the opportunity hybrid afforded to really enhancing the viewers experience of public service media. The next steps will be to bring together all those who expressed an interest in working with the EBU to bring hybrid to their markets. EBU and its Members are working with industry stakeholders to capitalise on the mutual benefits that hybrid services promise. Peter MacAvock


Riding the loudness wave


With the EBU Loudness Recommendation R 128 and related test signals published, broadcasters across Europe (and beyond) are now organising loudness workshops to learn how best to implement the EBU loudness solution. In January, 120 media professionals attended the loudness workshop in Munich, organised by the ARD-ZDF Media Academy and the IRT. The workshop featured live mixing demonstrations. Using an audio-video link to an IRT studio, the participants could follow the faders live, to see how loudness normalisation actually works in practice. The conclusion: it is not at all hard to do.

Riding the faders
So what is the main difference for the audio engineer? According to operational staff who have already made the switch to loudness measurement, loudness normalisation offers a more relaxed way of mixing. Instead of watching peak meters dance up and down, audio engineers rely more on their ears and occasionally check if the loudness meter agrees. So loudness metering is a liberating (audio) revolution! Loudness is also very relevant for situations where there actually are no faders, such as in automated QC (Quality Control). With the move to file-based production facilities, the choice of good batch-file analyser software, including the right loudness measurement algorithm, is gaining importance.

More events
Besides the loudness workshop at the IRT, the first two months of the year have already seen several other events devoted to EBU R 128, including workshops at Technicolor in Hilversum and the EBU in Geneva, and an RTVE-AES seminar in Madrid. More loudness sessions are planned for the coming months, for example during the AES Convention in London (13-16 May). There truly is a wave of loudness going through Europe.
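For file-based workflows of the kind mentioned above, the normalisation arithmetic itself is trivial once a batch analyser has produced an ITU-R BS.1770 integrated loudness figure: the static gain to apply is simply the distance to the R 128 target of -23 LUFS (1 LU corresponding to 1 dB). A minimal sketch:

```typescript
// R 128 programme-loudness normalisation, assuming the integrated loudness has
// already been measured with an ITU-R BS.1770 / EBU-mode meter.
const TARGET_LUFS = -23;

function normalisationGainDb(measuredIntegratedLufs: number): number {
  return TARGET_LUFS - measuredIntegratedLufs; // 1 LU of error = 1 dB of correction
}

console.log(normalisationGainDb(-18)); // -5 -> attenuate a loud programme by 5 dB
console.log(normalisationGainDb(-27)); //  5 -> lift a quiet programme by 5 dB
```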
http://tech.ebu.ch/loudness

Frans de Jong
01. Live mixing demonstration at the IRT in Munich, January 2011  02. Loudness workshop, January 2011  03. Peter Rafailov (front), Cornelius Behrens and Askan Siegfried (right) at the workshop


diary 2011
Subtitling in XML/MXF - Webinar: 15 Mar 2011, 14:00 (CET) / Online / No fee. In this webinar, Larissa Grüner (IRT) will introduce the features of the EBU-DFXP format, its use in current workflows and the replacement of EBU STL.
MXF Masterclass 2011: 24-25 Mar 2011 / Geneva (CH) / Fee. This course provides expert knowledge in MXF technology to better understand how to migrate to a file-based workflow system.
EBU BroadThinking Seminar: 29-30 Mar 2011 / Geneva (CH) / Fee. This seminar will highlight the EBU's and outside experiences of online, internet and hybrid broadcast broadband technologies and related services and applications over the last two years.
Technical Assembly 2011: 2-3 Jun 2011 / Tromsø (NO) / Members only. The Technical Assembly analyses current technology, future prospects for production, broadcast and broadband delivery, and spectrum management.
Further details and up-to-date information can be found at http://tech.ebu.ch/events


Where Creativity, Technology and Business meet.

Making broadcast pay in today's connected world requires unprecedented creativity. With a decades-long legacy of technology leadership, Harris is the only company that offers an end-to-end workflow engineered to address all tomorrow's standards and the unique challenges faced when baseband meets broadband. HDTV. Mobile TV. 3D. Connected TV. Wherever you and your viewers converge, Harris has the technology to transform your creativity into a thriving business.

To learn more, visit broadcast.harris.com


UK, Israel, Africa: +44 118 964 8200 | North, Central, Eastern Europe: +49 89 149 049 0 | Southern Europe: +33 1 47 92 44 00 | Middle East: +971 4 433 8250
