(transcript of a talk at On Borders and Edges 29th May 2004)
Good evening. I have this odd title, “production technologist”, which I occasionally find very difficult to explain to people. I'm sure you know the BBC is a large organisation, about 30,000 people, and we use technology for production. But in such a big organisation it can be very difficult to use things ad hoc; it's not just a matter of one production team here using something and another team using something else. When you try to be creative and innovative as an organisation, there has to be a strategy about how you go about doing things in the short, medium and longer term. That is the team I'm working in. Our group is called 'technology direction', and it provides advice and direction on the technology the BBC is using to continue to be creative in its productions. We also do some of the boring stuff, which has to do with things like what computers and what kind of software the BBC is using, and we also look at what kinds of technologies are coming out of labs and experimental research around the world, and whether we can use some of them to be more innovative. So that is what I will be talking about tonight: mainly the field of mixed reality, and how we chose some parts of this technology to use in television production.
I'll start with a technology which is already on your screens; it's called “virtual studios”. Whenever you watch a weather forecast you are seeing this technology: the weather map is not in fact behind the person, it is a virtual map or scene which has to be mixed with the real presenter using chroma-keying. If you go a step further, you will notice that the camera is usually static in a weather forecast, and if you start moving the camera, mixing the real video image of the presenter with the virtual image is not so straightforward anymore: if the camera moves, the person moves with it but the graphics stay the same. The only way to create the illusion that the presenter is part of the virtual scene behind them is to go to an expensive technology, a “virtual studio”. If you have seen these productions, it is not obvious that such complex technology is behind them. This technology, which merges video and graphics together and allows the camera to move freely, involves a very cumbersome setup: it involves putting targets on the ceiling in a particular way, being in a specially lit room with uniform colouring, either blue or green, and having sensors on the cameras. Because the room is uniform, the presenter has no clue as to what is behind them, so the only way to see what is happening is through a monitor. So it works fine, you can see it on your television screen, but it's not easy for the presenter and it can be quite difficult and expensive. What's more, when the virtual graphics are behind or around the presenter, they don't know where the graphics are and so don't really interact with them, so it can look quite fake. In the end it is not a very intuitive environment for the presenter. So it's expensive, it's not intuitive, and it can't be taken outside.
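To give a rough idea of what the keying step does, here is a minimal sketch of chroma-key compositing in Python with OpenCV and NumPy. The green thresholds and file names are illustrative assumptions, not values from any BBC system.

```python
# Minimal chroma-key compositing sketch (illustrative only; the colour
# thresholds and file names are assumptions, not broadcast values).
import cv2
import numpy as np

foreground = cv2.imread("presenter_on_green.png")   # presenter shot against green
background = cv2.imread("virtual_weather_map.png")  # rendered virtual scene
background = cv2.resize(background, (foreground.shape[1], foreground.shape[0]))

# Find the "green screen" pixels in HSV space.
hsv = cv2.cvtColor(foreground, cv2.COLOR_BGR2HSV)
green_mask = cv2.inRange(hsv, (40, 80, 80), (80, 255, 255))

# Wherever the frame is green, show the virtual scene; elsewhere keep the presenter.
composite = np.where(green_mask[..., None] > 0, background, foreground)
cv2.imwrite("composite.png", composite)
```

The hard part in a virtual studio is not this keying step but tracking the camera, so that the virtual scene can be re-rendered from the matching viewpoint as the camera moves.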
So what we have done to solve these problems is go back to the field of mixed reality, and here is the spectrum Maja was talking about earlier.
(slide)
This continuum is what we call 'mixed reality'. It starts with virtual environments on one side, where you step into a totally virtual world, completely immersed. On the other side is the real environment, into which you bring graphics out. So let's say the virtual studio is more on the virtual side, since it puts the presenter in a completely synthetic world, but what we really want to do is bring the virtual world out into the light, into the live studio with the presenters. So the technology I decided to look into was 'augmented reality', which usually involves looking through funny glasses or goggles at a mixed image that combines graphics and video.
(video)
This is a project from the HIT Lab at the University of Washington, called the “ARToolkit”. It's open-source software, but I won't bore you with the details; I'll just show you.
(video)
Imagine now that I am using glasses to look at this black and white pattern, and what I see through my glasses is my hands, a black and white tile, and a virtual object attached to the tile. So as I move the physical object and watch it through the glasses, what I see is a virtual character and the top of my hand. This is an augmentation of the real space with virtual images. It doesn't have to be only graphics; here is an example with 3D video (of some dancers) using real-time tracking. It senses how I am tilting the marker and puts the dancers there. The best thing about this is that it works with a laptop and a webcam, or it can be run on a home computer; it doesn't need too much complexity. There are some technical issues in getting from this resolution to the quality required for television. Now, what does this mean for television production? I showed it to a lot of production teams, who mostly thought it was a nice toy that couldn't really be used for television. So we compared it to the virtual studio, and it finally clicked.
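The demos here were built on the ARToolkit's own library; as a rough modern analogue of the same idea, detecting a printed square marker in the webcam image and recovering its pose so a virtual object can be attached to it, here is a sketch using OpenCV's ArUco markers. The dictionary, marker size and camera intrinsics below are assumptions for illustration, and the calls follow the opencv-contrib-python 4.6-era ArUco API (it changed in later versions).

```python
# Rough analogue of the marker-tracking idea using OpenCV's ArUco module
# (dictionary, marker size and camera intrinsics are illustrative assumptions).
import cv2
import numpy as np

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
camera_matrix = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
dist_coeffs = np.zeros(5)
marker_side = 0.08  # printed marker size in metres

cap = cv2.VideoCapture(0)  # the webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _ = cv2.aruco.detectMarkers(frame, dictionary)
    if ids is not None:
        # One pose (rotation + translation) per detected marker: this is what
        # lets a renderer attach a virtual character to the physical tile.
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, marker_side, camera_matrix, dist_coeffs)
        for rvec, tvec in zip(rvecs, tvecs):
            cv2.drawFrameAxes(frame, camera_matrix, dist_coeffs, rvec, tvec, 0.05)
    cv2.imshow("augmented view", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
cap.release()
```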
(video of interview)
So this is what the presenter made of the technology: it meant he could bring virtual objects into the studio without any special setup, and he could go outside and do the same thing. What I figured out after that production was that I should not become an actress, but go back into the lab and improve this technology. When I say improve the technology, that means using professional cameras, and images that need to run at 25 frames per second and be fantastic in terms of how the graphics are mixed. So this experimental technology was great for helping to create ideas and understand how we could use it, but it was not ready to be used in television production. It took about a year to improve it so that it could work with studio cameras and sense the zoom and focus as well as the movement of the camera. We incorporated chroma-keying technology, since no one really wanted to see a black and white pattern on the TV screen, so we used green patterns and changed the algorithms a little so we could superimpose scenes. It also includes video walls within the graphics, particularly for putting remote correspondents into virtual landscapes so the presenter can talk to them.
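One way to picture why zoom and focus matter: the virtual scene has to be rendered with the same lens model as the real camera on every frame, so the focal length fed to the graphics has to follow the zoom ring at 25 frames per second. Here is a toy sketch of that idea; the numbers and the linear zoom-to-focal-length mapping are assumptions, where a real broadcast lens would need a proper calibration table.

```python
import numpy as np

def intrinsics_for_zoom(zoom_setting, sensor_width_mm=9.6,
                        image_width_px=720, image_height_px=576,
                        min_f_mm=7.5, max_f_mm=75.0):
    """Build a pinhole intrinsic matrix for a normalised (0..1) zoom setting.

    The linear interpolation is a toy stand-in for the lens calibration
    tables a real virtual-studio system would use; square pixels and a
    centred principal point are also simplifying assumptions.
    """
    focal_mm = min_f_mm + zoom_setting * (max_f_mm - min_f_mm)
    fx = focal_mm * image_width_px / sensor_width_mm   # focal length in pixels
    cx, cy = image_width_px / 2, image_height_px / 2
    return np.array([[fx, 0, cx],
                     [0, fx, cy],
                     [0,  0,  1]])

# Read the zoom sensor and update the graphics camera every 40 ms (25 fps)
# so the virtual objects stay locked to the real scene.
K_wide = intrinsics_for_zoom(0.0)
K_tele = intrinsics_for_zoom(1.0)
```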
We improved the technology a lot and reached the point where we could test it again with another production, to see what the production teams could do with it. A producer came along who was doing “Body Hit”, a BBC Three series about the human body. They had done two previous series and were about to go into the third, thinking about what they could do. Up until then they had used graphics to explain how the body works inside, and real people to explain it, and they were often shooting on location. By the third series they realised that a molecule is a molecule is a molecule, there is only so much you can do to visualise it in an interesting way, and the audience would probably get bored seeing the same thing over and over. The other problem was that they wanted to make a program about the effects of drugs on the body, and there is very little experimenting with real people and drugs that is legal and can be broadcast. So they thought that with this technology they could have a virtual character they could pump with as many drugs as they wanted and then explain how it works. So they had the idea that this man could help bring the graphics and molecules to life.
(video of the series)
So you can see the mix of real-time graphics and post-production graphics. This gives you some idea of how these productions could use mixed reality to make programs in ways they couldn't before. In a way, it allows them to play with graphics as something integrated with their programs in new ways.
I'd like to go quickly through some other platforms, because television is not the only thing the BBC does, surprisingly enough. It's also involved in education, so here are some videos of the reactions of kids using this type of technology.
(video)
What you see here is similar to what Kristina was saying: this girl is quite amazed. There is only a screen, through which they can see the virtual objects in front of them. What is equally important is that this is collaborative; it's one of the first times you can see more than one kid in front of a computer without a keyboard or mouse, using no more interaction than moving around black and white patterns on card. In this video they are assembling a little rocket, something like a puzzle: if three patterns are connected in the right way, a spaceship appears and then flies off. When we took this into a classroom situation, as we did here, we understood that kinaesthetic learners, that is, children who learn by doing, are almost ignored by the current curriculum, which is all about presenting information verbally. So this proved to be quite a powerful tool for teachers, since it's easy to use, as well as for the kids who are kinaesthetic learners.
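The rocket puzzle is essentially a check on the relative positions of the tracked markers. Here is a minimal sketch of that logic, assuming the tracker already gives each marker's 3D position; the marker IDs, distances and tolerances are made up for illustration.

```python
import numpy as np

# Hypothetical marker IDs for the three rocket parts (illustrative only).
NOSE, BODY, ENGINE = 1, 2, 3
SPACING = 0.10    # expected gap between stacked parts, in metres
TOLERANCE = 0.03  # slack allowed in each joint

def rocket_assembled(positions):
    """positions maps marker id -> 3D position (metres) from the tracker.

    The spaceship animation would be triggered only when nose, body and
    engine are stacked in the right vertical order and roughly touching.
    """
    try:
        nose, body, engine = positions[NOSE], positions[BODY], positions[ENGINE]
    except KeyError:
        return False  # at least one part is not visible to the camera yet
    stacked = nose[1] > body[1] > engine[1]  # correct top-to-bottom order
    touching = (np.linalg.norm(nose - body) < SPACING + TOLERANCE and
                np.linalg.norm(body - engine) < SPACING + TOLERANCE)
    return stacked and touching

# Example: three parts stacked 10 cm apart -> the rocket can launch.
parts = {NOSE: np.array([0.0, 0.2, 0.5]),
         BODY: np.array([0.0, 0.1, 0.5]),
         ENGINE: np.array([0.0, 0.0, 0.5])}
print(rocket_assembled(parts))  # True
```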
There are two other spaces where we looked at integrating the technology with content from the BBC. One is the 'interactive window', which involves a window where the public just passes by and can interact if they want to. The other is in the Science Museum in London, where we used the same kind of technology in a number of ways, with another range of experiments.
So we have homes, classrooms and public spaces where this can be used, and I could even carry it in my pocket as well, since it's possible to use augmented reality on a PDA or mobile phone. If you are interested, more information can be found on our website, and as you can see, a number of different divisions within the BBC brought their strengths together to be able to do this.