I got a call asking if we could send an analogue camera feed out to some on-set CRT monitors... Easy, right? Kind of.
The first part of the project was fairly standard: set up 3 wireless transmitters so we could get things moving. The rental company also supplied some RCA to HDMI converters, which would convert the analogue signal into a standard digital format. More on those later.
With all jobs like this, the moment you make something sound simple, they bring you more challenges. The client asked if we could also livestream the feed to YouTube. Not a problem, I thought. "I can just run the feed into OBS using NDI and that'll be that."
I only had a few hours to set everything up and have it all ready for the show at 2pm. Like all jobs involving analogue, I brought along my magical mystery box of all things cables and converters. Before the job I looked up the monitors we had and saw a SCART input. SCART gives slightly better quality than composite because it can carry RGB: three individual signals for red, green and blue, which allows a higher overall bandwidth than composite alone (it also gets rid of some of the nasty dot crawl and flickering inherent in composite video). Seeing the SCART socket, I decided it would be the way to get the best possible picture into our monitors, so I purchased some HDMI to SCART converters and hoped they would do the job 🤞🏻 It's also important to remember that chasing good picture quality from analogue video is a fool's game. We only like it BECAUSE of its flaws.
Normally I'd just run a BNC cable over to the 3 monitors, but because the space was extremely tight and would be full of people, I decided to run all 3 monitors entirely wirelessly.
Now, it might be a bit crazy to set up 4 transmitters in a small room, but I've been using my Dwarf Connection and DJI transmission systems for over a year now, and with good signal management I've never had any issues. I bit the bullet and decided that having 3 DJI receivers would be better than having 3 BNC cables taped to the ground with SDI to HDMI converters at the end.
I got everything plugged in, but disaster... the picture going into the monitor looked fucked.
Panic set in. What do I do now? We have an hour until the show and all these monitors with no picture; the entire '90s shopping channel vibe is riding on my shoulders and it's all failed because of some bad converters!!!! In these situations you do what you have to do: stay calm... and turn it on and off again. Still nothing. I reseated the cables and did what any Game Boy-playing kid from the '90s would... I blew straight into that SCART converter. Sure as day, the CRT lit into action with its reassuring whine and displayed the camera feed perfectly. PHEW. The latency from the DJIs, combined with the immaculate response time of CRTs, gave us an imperceptible input lag of no more than a frame or two. More than adequate for what we needed to do.
Another issue with using these types of adapters is aspect ratio. The DV cameras shoot in 4:3, but the RCA to HDMI adapters coming off the cameras automatically scale the image (with no option to turn it off) to 16:9. So we had a 16:9 image, which the monitors would automatically squeeze back into 4:3. Not a problem then, right?! Wronggg. Because of the livestream, I'd need to scale the image back into 4:3 in software, but I couldn't do that, because by the time it was back on the monitors it would be DOUBLE squeezed. More quick thinking needed... What if I send the OBS feed from the monitors to my laptop using NDI over Thunderbolt and livestream from that using a separate instance of OBS? Hmm, that could work, but then the graphics, which I've already squeezed in software, would be double squeezed! Nightmare! New solution: I'll just send the original NDI camera feed instead of the final output, so the graphics won't be embedded.
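If you want to see the maths behind the double squeeze, here's a rough Python sketch. The 4:3 and 16:9 ratios are the only real numbers in it, the variable names are just labels for the kit in my chain, and it's an illustration rather than anything I actually ran on the day:

```python
# Rough sketch of the aspect ratio chain (illustrative only).
# The RCA to HDMI adapters stretch the cameras' 4:3 picture to fill a 16:9 frame,
# and the 4:3 CRTs squeeze whatever they receive back down, so the monitor chain
# sorts itself out. The livestream has no CRT on the end, so it needs the
# correction applied in software instead.

DV_AR = 4 / 3      # native display aspect ratio of the DV cameras
HDMI_AR = 16 / 9   # the converters force everything to 16:9, no option to disable

converter_stretch = HDMI_AR / DV_AR   # ~1.33x horizontal stretch from the adapters
crt_squeeze = DV_AR / HDMI_AR         # ~0.75x squeeze from the 4:3 CRTs

print(f"Converter stretch:        {converter_stretch:.2f}x")
print(f"CRT squeeze:              {crt_squeeze:.2f}x")
print(f"Monitor chain net:        {converter_stretch * crt_squeeze:.2f}x (already correct)")

# If the software-corrected livestream feed also fed the monitors,
# the CRTs would squeeze it a second time:
print(f"Corrected feed on a CRT:  {converter_stretch * crt_squeeze**2:.2f}x (double squeezed)")
```

Which is why only the YouTube side got the software correction, and the monitors were left to do the squeezing themselves.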
Essentially, what had been a simple "use QTake to capture some pictures and put them on monitors" had turned into 2 separate streams of video, with me having to apply separate overlay graphics to each one. Were there better ways of doing what I'd done? Probably, there always are. But the most important thing is to plan what you can and be ready for everything to go wrong. The simplest of things can really throw a spanner in the works. And even though I tested everything before the job, you can never fully test everything in a full production environment. But hey, if you're not always learning, what's the point!
It also always helps if you're working with a great team, like Toby Leary, Raffi Chipperfield and the rest of the great camera crew.
Here's the link to the full piece! https://www.youtube.com/watch?v=AwPKuCyoHAM
Miscredited as DIT too, btw. There was no DITing involved with this job, just capturing in the DV cameras.
