Pixotope Snapshot: David Stroud
Pixotope Snapshot is our new series shining a light on the many talented individuals who make the Pixotope team so great – a group of industry experts with vast experience. In this edition, we sat down with Pixotope Chief Product Officer David Stroud to discuss his career history, his role at Pixotope, and his thoughts on the rapid uptake of XR (eXtended Reality) technology within the live events and broadcast market.
Tell us about your career leading up to joining Pixotope?
I’ve been working in the graphics industry since 1987, which certainly makes me one of the oldest team members. My first role was at an AI company called Symbolics, working with some incredibly intelligent individuals from MIT (Massachusetts Institute of Technology). We produced an integrated 2D and 3D animation suite very early on in the industry’s development, building our own software and hardware. In 1994, I joined the UK software company Parallax, where the Matador and Media Illusion software packages were originally created. ILM (Industrial Light & Magic) loved using Matador – it was groundbreaking at the time and was used on films such as Forrest Gump and Jurassic Park. We were then bought by Avid, and I spent five years there, ending up as Engineering Manager.
By the year 2000, I had joined 5D, helping to produce 5D Cyborg, 5D Commander, and 5D Colossus – a very early digital color corrector. Unfortunately, this venture ended abruptly in 2002. Shortly after, I joined FilmLight as Product Manager – a role I stayed in until 2012! During my time there I worked extensively on the TrueLight color management system as well as on BaseLight. We changed the digital intermediate process for filmmaking and had all the major labs and studios in LA using TrueLight color management. In 2012, FilmLight entered into a joint venture with Nic Hatch, giving birth to Ncam. I joined the JV and, from ground zero, we designed, developed, and brought Ncam to market. At the time, people didn’t believe you could do free-form camera tracking, and implementing AR or VS (virtual studio) work was a difficult and complex business. I worked with Ncam for seven years before joining Pixotope in 2019.
Tell us about your role at Pixotope?
Certainly. Joining Pixotope came at the perfect time for me to return to the VFX world, something I knew and loved. I started at Pixotope in Operations – at the time, we were still producing a lot of our own creative projects, so despite being UK-based, I spent a lot of time working in Norway. It was an exciting year, working on the LPL (Tencent League of Legends Pro League), Eurovision, the Tour de France, Wimbledon, and a range of incredible projects in China too.
But in December 2019, we decided to refocus our efforts away from our own creative events and towards enabling our customers and assisting them as technical advisors on productions. This was the best way to leverage and scale our powerful Pixotope technology in new and exciting ways for bigger markets. It was soon after this that I switched departments into Product Management. Since then, I have been leading the charge on developments across broadcast, eSports, and live events. We’ve got big plans for this year, which leads us nicely to our new developments in XR.
How much of an impact is XR having on the media and entertainment industry?
XR is happening now. All around the world, people are building LED walls to implement virtual production technology into their pipelines or to use for live events. The problem is that many studios are asking: “How do I use it?” People are struggling to make it successful. There is a lot of complexity regarding the setup, the accuracy of camera tracking, and timing. Now there are requirements to use multiple cameras and also to blend in XR displays for live audiences as well. We are delivering practical solutions that people can not only use, but that also maximize the true potential of LED walls – and this is where our production experience comes in to help.
How was Pixotope XR conceived and how will it enhance studios’ experience with LED walls?
We started off with a single plane of geometry – this technique has been around for a number of years and has been used by TV studios that want to create a window into another world. You would use a projection screen or an LED wall in your TV studio and drive it with the off-axis projection from your camera. Essentially, you take your camera viewpoint and you create a panel – you can see your 3D world behind it. As you move your camera, it’s like looking through a window, and you get the depth of your 3D scene to augment your physical studio. Those are the basics, and this technique has been used successfully on many projects around the world.
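The single-plane “window” effect David describes rests on a well-known graphics technique: an off-axis (asymmetric-frustum) projection, where the LED wall itself is the projection plane and the tracked camera position defines the frustum. As a purely illustrative sketch – this is the standard generalized perspective projection, not Pixotope’s actual implementation – it can be computed like this in NumPy, assuming the wall is given by three of its corners:

```python
import numpy as np

def off_axis_projection(pa, pb, pc, pe, near, far):
    """Off-axis projection matrix for a planar screen (e.g. an LED wall)
    seen from eye/camera position pe.
    pa: lower-left, pb: lower-right, pc: upper-left screen corners.
    Illustrative only; names and conventions are this sketch's own."""
    pa, pb, pc, pe = (np.asarray(p, dtype=float) for p in (pa, pb, pc, pe))
    # Orthonormal basis of the screen plane
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal (toward eye)
    # Vectors from the eye to the screen corners
    va, vb, vc = pa - pe, pb - pe, pc - pe
    d = -np.dot(va, vn)                               # eye-to-screen distance
    # Frustum extents projected onto the near plane
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d
    # Standard asymmetric frustum matrix (OpenGL-style)
    P = np.array([
        [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0]])
    # Rotate world into screen space, then translate the eye to the origin
    M = np.eye(4); M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
    T = np.eye(4); T[:3, 3] = -pe
    return P @ M @ T
```

Moving `pe` (the tracked camera) skews the frustum so that the rendered image stays registered to the physical wall – which is exactly why the wall reads as a window into the 3D scene.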
The problem for studios arises when they want multiple walls. Traditionally, a three-plane configuration – two walls and a floor – requires multiple engines before blending the images together, and using three engines becomes logistically very tough. We started looking at what studios are aiming to achieve and began building the functionality to map multiple planes of geometry with XR projections from a single camera viewpoint. We’ve achieved this with the Pixotope 1.5 release, delivering XR at HD or UHD output. We’re currently using video outputs that go straight into a scaler or the LED processors to drive the wall. Pixotope handles multiple planes with a single output, and can map those onto a curve or an arbitrarily aligned plane of geometry in a scene.
What could be coming next in XR?
There’s no prescribed way to do things, and everyone is searching for best practices. So we’re constantly communicating with studios using XR, and with those taking their first forays into the technology, so we can remain at the forefront of development and add features that directly enhance production. Every time we solve a problem, we are already considering the next move – what we produce today will be eclipsed in six months, and that’s the best way to ensure Pixotope continues to grow.