In our continuing series on remote collaboration, Cirina Catania speaks with Bob Caniglia, Director of Sales Operations, North America for Blackmagic Design. Blackmagic has several solutions that make production and post-production life easier when teams are in different locations. We take a deep dive under the hood and talk about DaVinci Resolve 17 and the Cut page for next-gen editing, Fairlight, Crew in a Box, how to get better sound with Zoom interviews, the Pocket Cinema Camera 6K and ATEM Mini Pro switchers, and more!

Twenty-six of this year’s Emmy Award nominees were streamed into the telecast using kits built around the Pocket Cinema Camera 6K and ATEM Mini Pro switchers. Blackmagic solutions are also being used by filmmakers featured in major film festivals. Senior Colorist Andrew Francis, for example, used DaVinci Resolve Studio to grade the 2021 Sundance documentary “Cusp.” He relied on a home setup powered by a DaVinci Resolve Mini Panel, a Teranex Mini SDI to DisplayPort 8K HDR converter, and an UltraStudio 4K capture and playback device as part of his remote workflow.

For more information about our amazing sponsor, Other World Computing, go to MacSales.com or OWCDigital.com, where you’ll find hardware and software solutions and tutorial videos that will get you up and running in no time.

For more about our host, filmmaker, tech maven and co-founder of the Sundance Film Festival, Cirina Catania, visit cirinacatania.com.

If you enjoy our podcast, please subscribe and tell all your friends about us! We love our listeners. And, if you have ideas for segments, write to OWCRadio@catania.us. Cirina is always up for new ideas!


In This Episode

  • 0:30 – Cirina introduces Bob Caniglia, the Director of Sales Operations for North America at Blackmagic Design. Bob’s been working in the film and television industry for many years and he is an expert in video production and post-production.
  • 1:58 – Bob talks about DaVinci Resolve and its new features.
  • 8:25 – Bob shares a non-destructive way of saving your project versions through the Cut page.
  • 11:40 – Cirina asks Bob about the audio editing tools that Blackmagic offers.
  • 14:32 – Bob shares some tips for better audio quality when recording remote interviews and live streams.
  • 20:04 – Bob talks about the Blackmagic equipment you need to build a recording studio from scratch.
  • 26:29 – Cirina asks Bob about Crew in a Box and how it works.
  • 31:31 – Bob explains the uses of fiber converters in video production. 
  • 34:15 – Bob recommends a camera to use as the tent pole for video production.
  • 36:38 – Bob encourages listeners to check out Blackmagic Design’s website to get some tips on how to be better at remote collaboration.



Transcript

Bob, thank you for doing this. It’s nice to hear your voice again, and I appreciate you spending the time to do this because we’re talking about remote collaboration. There is so much available through Blackmagic Design, and I really want to share that with all of our listeners. Welcome, and let’s get started.

All right. Well, thank you for having me.

Always. Let’s start with the darling, DaVinci Resolve. Everyone seems to be moving towards that. More and more people are editing with it, they’ve always colored with it. You have some new remote collaboration features that I would love you to talk to us about.

Sure. It’s interesting that the software itself has grown by leaps and bounds over the last few years. We have the Cut page, which is a streamlined editing page, then we have the full Edit page, then we have Fairlight audio, we have Fusion visual effects, the DaVinci color tools that everybody knows and loves, and of course our media management.

One of the things with what’s going on with the pandemic and whatnot is that a lot of people are using the collaborative workflow, and remote grading as well. People are connected to machines in the main facility that they work in, but using panels and monitors at home, and being able to access hardware in different locations. The collaborative workflow is something that has been used a lot more often now that we are in the situation we’re in, and it can be done in a few ways.

If a lot of people have access to shared storage, then they can really work in parallel. If someone’s color grading, then another person could be editing. Once either one of them has something for the other to see, they can just send a little message in Resolve, like you would with Skype or something, to say, “Hey, you might want to pick up.” Then you can update your timeline to reflect the changes that were done by the other person.

It eliminates the reconforming type stuff, which takes a lot of time, obviously. It really has gotten to the point where people forget that they’re not next to each other in the same building. They’re wherever they want to be, and it’s really gotten a lot of people to continue working even though they’re not in the buildings. 

What were some of the other features of DaVinci Resolve 17 that were announced? Can you talk about some of those?

There are a few things that we had to do in order to adjust to some of the newer hardware. Apple has their new M1 processor, so we needed to be able to adjust for that. We added a whole bunch of new features in the Cut page and the Edit page. Of course, we have the new hardware, the new Speed Editor, which is a little jog shuttle knob with some buttons on it that works with DaVinci Resolve. That hardware is fully used in DaVinci Resolve 17, too. We’ve definitely done a lot of work on multicam editing and just trying to make it easier.

Even with our ATEMs, we now can do ISO recording in the new ATEM Mini Pro ISO and the very new ATEM Mini Extreme ISO, where they will ISO record each of the inputs. Also, it will create an XML file of all of the switching that you do during live recording. Then you can open up that project in Resolve and it just populates the timeline. Every cut and transition that you did during the program just shows up, and then you can go in and adjust.

A lot of that multicam stuff is really making it easier for people to do their own home recording, do a stream at the same time, and then go back, finish it, fix it maybe, but easy to do. This has been something that has been coming for a while, and it’s great to see it. 

The collaborative workflow is something that has been used a lot more often now, and it can be done in a few ways.

I know the editors are loving it, the Sync Bin, particularly. Talk a little bit about that. 

We’ve done a lot of work on being able to get access to the files that you want without it feeling like you’re solving a Rubik’s Cube. I worked back in the tape days. Everything shot on film was transferred to tape, and then we had a bunch of tapes lying around. I sometimes feel like that was easier to keep track of than bins full of files. That’s what we’re trying to do there: make it easier to search for things and be able to get to the audio synced with the video that was shot.

We also see that with the Blackmagic RAW workflow—the codec that we use for our Blackmagic cameras—you can use the one file for editing and color grading, so you’re not making proxies. That’s another thing that can delay the whole process: if you’re shooting on set and then you decide, all right, we’re going to make proxies for the editor to work with, but the colorist will work with the uncompressed files, then at the end, we’ll do a reconform and all that.

Blackmagic RAW helps with that because you can put a LUT on it so that the editor can work with a decent-looking picture. When the colorist finishes the grading, he just has to send a basic XML file or a DaVinci Resolve project—it’s just metadata—and then he just updates the color and boom, it’s finished. You don’t have to do reconforming.

That also has been helpful in this type of situation, because if someone is doing your color grading somewhere else and you’re not connected to shared media, at least all you have to do is send those files. My daughter and I do this. She has been recording these videos, and she’ll do the first edit on it, then she’ll just send me the DaVinci Resolve project, and I’ll go in and tweak the color or the key or something like that—she’s shooting a lot of green screen—and it’s just very easy when one of us has a change. We just have to go in and pull that file down, and it’s not a lot of data to do that. It’s been a big help for us, that’s for sure.

Where are you storing your media? How are you accessing the media? By a central server or you’re using something like a Jellyfish?

In the small things that my daughter and I do, we make sure that we each have it. Most of the time, actually, she’ll change the file name to something and throw it on Dropbox. Now, a lot of people are using central storage, or they’re using Frame.io, or those kinds of things. The larger facilities obviously have the bandwidth to be able to access the storage in their buildings remotely, and sometimes even the hardware, too.

That’s the beauty, I think, of Resolve: it will grow as you grow, and you can do it as basic sharing of Resolve projects or literally sharing the same storage at the same time even though you’re not in the same place. That is obviously the holy grail of being able to do it.

I’m thinking of a solution like PostLab—possibly people are using that. There’s a phrase on your website; when I was looking at the Cut page, it says “Source Overwrite.” Whenever I see overwrite, I always worry about versions. Can you talk about how the Cut page handles versions, and whether or not we’re protected and it’s non-destructive? Reassure me.

Everything is non-destructive. That’s always an easy way to do it. There are a lot of good ways to do versioning in the Cut Page and the Edit Page for that matter. I often copy timelines and things like that. Then we have the ability to actually compare timelines, which is the better way to do it, especially when you have people that are working on the same project at the same time. 

Sometimes you want to see what the other one’s doing, and you can compare two timelines. It will show you not only graphically but also, of course, on the screen, where the changes are, and then you can decide which version you want. I often do that myself. I’ll have a version and then I’ll do a comparison to see what I think between the two. I always find it fun when you’re the editor and you’re trying to be the producer at the same time. My big favorite is when I’m acting and directing at the same time, and I have to yell at myself.

The funny thing about the Cut page is it reminds me of the old days of offline editing, where you had a giant reel with all the takes on there, and you’re just moving it from one three-quarter-inch tape over to another one. It’s that kind of thing when you have the source reel, where you put all of your clips into one giant timeline, basically, that you can scroll through and go through everything, and then you just drop onto the timeline that little chunk that you see. It’s film-like in a way, and also like that whole old-school offline approach, but it’s just a fast way to work.

That’s also why we developed the Speed Editor, for the same thing: to give yourself access to those reels with a jog shuttle knob, which those of us that worked in the old linear days are very accustomed to. There are a lot of people, obviously, that came into editing at non-linear with Final Cut or whatever and never experienced the linear days, so they didn’t really understand the whole jog shuttle knob or why you would want these things. But as they get their hands on it, they realize this is pretty cool, and they can see why you would use it.

It’s the same with people that start coloring with a mouse: once they get onto the trackballs of the DaVinci Resolve panel, they say, now I understand. There are good uses of hardware in software editing. When you have the two working together, it really can become very powerful.

I like the idea. Is the Cut page scalable, too? Because you never know. I mean, sometimes I’m in the back room working on two monstrous 35-inch curved monitors, then I go into the other room and I’m on my laptop. That seems like it would be something pretty nice to have. I haven’t tried it yet. I have to try it.

It does work well because in the Cut Page, you have two timelines at the same time. You have the overall timeline of your project and then the zoomed-in version for where you are in the timeline. You always have a reference between the two and that does work well with the laptop. I find that to be very useful. The other thing about the laptop and the way that we do it, especially the Blackmagic RAW files, is that people are using their laptop to edit 12K footage, which is insane when you really think about it. 

That’s true. Who knew? Oh, my goodness.

It wasn’t that long ago when a laptop would fall over with DV footage, and now suddenly they’re doing 12K? How did that happen?

That’s amazing. Talk to me about the audio tools. Obviously, right now we’re recording a podcast, so audio editing is something I live with every day.

The thing about our audio is that we have a lot of different tools. In the Cut page, there are some tools. There are some tools in the Edit page. Obviously, the Fairlight audio page is where the real tools are—the big boy toys. I think what I like the most about all of them is that a lot of people come to Resolve from different software. Even with the keyboard itself, you can map it to mimic other NLEs. The audio is the same thing. You can use it like Pro Tools if you set it up that way. We did that with the color grading, too.

The DaVinci guys knew the way DaVinci worked before Resolve, but as other people came in, we had to accommodate their way of working. That’s why there are a lot of different choices in terms of color wheels versus the different views that we have of the color channels and color. Audio is similar. We’ve tried to accommodate a lot of those things.

I just like the idea that we’re able to basically shrink and grow what you need to see. If you’re doing a music video, for instance, you’re not going to worry about the audio. You’re just trying to hit the beats. You want to be able to expand those tracks so you can see where they are, because you want to be able to hit certain points, but you’re not actually looking for a whole lot of tools to do anything. 

But then there are other, more complex projects where you have multiple channels of audio and maybe not that many channels of video, so you want to be able to expand those things. I think the way that we’re able to expand and contract timelines, and how you can hide tracks if you don’t want to see them, are all things that just speed up the process so that people can work the way they want to work. I think that’s really the key that we’ve introduced over the last couple of years with some of the different changes. Instead of trying to make everybody work one way, you should be able to work the way that you’re most comfortable. I mean, the Cut page is a perfect example of that.

If your audio is not good, it doesn’t matter how great your video is.

Absolutely. Thank you. Life is tough enough without having to learn a whole new language in terms of your navigation. While we’re talking about audio, I do want to talk about the setup for live streaming. What tips can you give us for better audio when we’re recording remote interviews?

The thing that often happens is that, for example, Zoom is compressed. The audio that you get out of Zoom is compressed. If you’re using redundant recording devices or software, sometimes the local recording sounds different from the remote one. I’m talking specifically about Zoom-type audio because we wanted to hit that in this interview with you. Can you give us some tips?

It’s funny about all of this live streaming and people doing Zooms and whatnot. I’m amazed now that we’re a year into it and some people totally get it and understand how to do it right and some people just don’t do it very well. They sound like they’re in some hollow cave. I’m talking about people that are the CEOs of companies or the sports guy in my local television area. They sound terrible and it could be fixed so easily with just some simple things. 

Even if you are just using the microphone that’s built into the computer that you’re speaking to, whether it’s a laptop or an iMac or any of those, if that’s all you have for a microphone, okay fine, but here’s the way to make that sound better. You hang a coat or a blanket behind that device, whether it’s the laptop or an iMac or whatever it is. You put blankets down on the desk or foam squares. I don’t know how many of them I have on my desk right now. I have plenty of foam.

Those little 12×12 audio squares?

Those 12×12 squares—you can buy a package of six, cheap. If you do that and place them down on the desk, then the sound isn’t bouncing off the wood of the desk or some other hard surface. When it doesn’t bounce off the hard surface, it stays there. Your closet at home is the quietest place in your house; it’s the best place to record an audio interview.

Before I built this room in the back, I actually recorded a couple of interviews in my dining room with a blanket over my head and the laptop. Literally, under the blanket. I have a confession to make. Do you want to know what I have on my desk right now? I have a camping pad. One of those foam pads that you sleep on in a tent. I took that out and I put that on my desk. It’s the perfect size and that’s what I have on my desk.

It is the way to go. When I do my Zoom calls, I have a microphone that I plug into the camera, that’s out of camera sight. Everybody always says, “How come you sound so good? I don’t even see a microphone.” I said, “Well, I put it just out of the camera but I have a ton of foam all over the place to try and make it sound better.” 

What I always say about the Zoom calls is, if your audio is no good, it doesn’t matter how good your video is. Now, on the video side, I happen to use a green screen because I don’t really have anything great in this office with a bunch of bare walls. I have a green screen. I put a picture of my family room from home behind me, which I still have, but it’s out of focus. 

Most of the time, people don’t even know that that’s not where I am. I have it well lit, because when you look and sound good, people will pay attention to you longer than if you’re doing something that’s annoying. If your camera is facing a window, it stops down the image so that you’re dark and silhouetted and people can’t see you, and then they’re less inclined to listen to you.

Absolutely. You sound great. What microphone are you using?

I have a Sennheiser Shotgun Mic right now that’s plugged in. I also have a Lavalier that I occasionally wear because I’m using a Blackmagic camera as my main camera—not that you can see it on this audio interview—I plug it directly into that. I’m Italian, so sometimes I use my hands, so I try to make sure that I don’t hit things.

I do that too. I hit the mic all the time. I forget that I’m talking and having a great time and then I hit the mic.

That’s the other tip I’ll give about audio. When you’re on Skype or Zoom or any of these things, you don’t hear yourself the way that the other person hears you. You don’t get that coming back, because of the way they record. Both Skype and Zoom have the ability to check your microphone. You hit the test-the-mic button, you speak, and then it plays back the test so that you can hear what other people hear. I think people need to do that more often, because if they hear a hollow sound on the way back, they’ll realize they’re hollow and that they should put some foam squares down, or a towel, or a camping pad, or whatever.

You know what? It works. 

Absolutely. I know people that use furniture blankets. I know audio guys that say that some of the soundproof padding that you buy is more expensive than packing blankets, so they buy packing blankets and use those.

I spent a lot of money on some sound blankets. When they got delivered, I looked at them and I went, I’ve got furniture pads in the garage that would’ve worked just as well, and they’re pretty cheap. What you have to do is look at your room, think about how the sound might be bouncing, and just pad it wherever you can. I actually built—using piping that I got at Home Depot—a system inspired by Alex Lindsay, by the way. I hang room divider curtains up on four sides. That ultimately is what I ended up with, and that’s all I need here.

I want you to look around and talk to us about what Blackmagic has if we wanted to start from scratch and build a studio that we could do either live streaming or recording local and remote, or just even film and television productions being shot remote. What kind of equipment does Blackmagic have, for example, the ATEM Mini, the cameras, et cetera? 

The ATEM Minis—there are now five of them, and we keep adding new ones. The ATEM Mini, the original one, cost $295. It’s a four-input HDMI switcher, and originally it was designed to provide a switcher for people who didn’t even really know what a switcher was.

By that, I mean that you could plug in four cameras. It didn’t matter what their output resolution was. They would turn into HD. You take the USB-C output of it, put it into your computer, it’s now your webcam. When I go into Skype and Zoom, it just says, are you using the Blackmagic webcam? Which is not really a webcam but for that input, that’s the one I use. 

There are good uses of hardware and software in editing. When you have the two working together, it really can become very powerful.

Each of the HDMI inputs will take embedded audio. I take the microphone, plug it into camera number one, and leave that up as my audio, but you can also take audio from files or videos. Say I’m on camera but I also want to play some videos off of the laptop. I can take the output of the laptop into the switcher and get the audio from there as well, because it embeds the audio.

It’s a basic $295 webcam accelerator, for lack of a better term, and it has a great chroma keyer built into it. If you have a camera and a green screen, you can put anything behind you, whether it’s a moving source from a laptop or a still frame that you can put in our media player. That’s all very easy to do and very high quality. Just get yourself a light and you can really make yourself look pretty good for a reasonable amount of money. It will take the HDMI out of any camera that has HDMI out, but with a Blackmagic camera—the Blackmagic Pocket Cinema Camera 4K or 6K—you’re able to get the HDMI out of it and also have control over the camera.

When your video looks and sounds good, people will pay more attention to it.

I have an Olympus lens on it, and I can actually zoom and focus from the ATEM software, which is helpful when you’re the talent and the technical director, because this way I don’t have to get up and guess at the focus and things like that. I can do it while I’m sitting here, so that helps.

As you move up the line, you have things like a multiviewer and the ability to stream directly out of the built-in encoder. We just introduced software for the ATEM Mini Pro, the ATEM Mini Pro ISO, the ATEM Mini Extreme, and the ATEM Mini Extreme ISO. You can actually tether a cell phone and get internet access from it to do the streaming directly.

You can do that with the new Web Presenter HD as well. The Web Presenter HD allows you to take any SDI source, maybe out of a larger ATEM switcher into that, and then you can stream directly out of that box. You can also tether it to a phone to get the internet and pump it off from there.

Talk to me about the new Extremes.

That’s what I’m using here as my switcher now. I needed a slightly larger desk, but it has eight HDMI inputs, and it has six channels of DVE. I have SuperSource, where I can have up to four boxes, say, over a background, and then I have two additional DVE channels because I have four upstream keyers.

At this point, I could basically do an award show with this thing, which is amazing to me. It will work with the software, it works with the hard panel that comes with it—it has a lot of buttons on it—or you can even plug one of our other ATEM advanced panels into it and control it that way.

One of the other things: with our original ATEMs, you could use our larger cameras, the URSA Mini Pro or URSA Broadcast, those kinds of cameras. With the new bidirectional micro converter, we’re now able to plug those into these ATEMs and get full control of those cameras, because you take the SDI in and out of those cameras and the HDMI into the ATEM, and it does the back-and-forth talking. You’re able to control any Blackmagic camera in studio mode from any of the ATEM switchers, regardless. You could use the pocket cameras with the larger switchers. You could use the larger cameras with the ATEM Mini. It really has completed the cycle to be able to have full control.

Now, on the ATEM Mini Extreme ISO, which is the one I have, I can plug in a USB-C drive, like a Samsung hard drive, and when I go to record, it will record all of the inputs as well as the program output. Then, because it creates a DaVinci Resolve project, when I populate my timeline in DaVinci Resolve, I have a timeline of my full shoot with all the effects and all of the ISO camera recordings of all the inputs. It’s really cool.

Wow. You’ve got how many HDMI outs and ins? How many connections do you have?

I have eight HDMI inputs, two HDMI outputs, and two USB-C outputs. That becomes helpful because, with the USB-C connections, I can have one connected to a computer to be my webcam, and I can have the other connected so that I can tether it to a phone, or I can use it for recording. I also have an Ethernet port, so I can use that to encode the signal and livestream directly from the ATEM Mini Extreme or Extreme ISO. It’s very cool.

With the two HDMI outputs, I can take one that has the program output, and I have the other one showing the multiview. The multiview is super flexible because I can put up to 16 different boxes up there and put anything in them—program, preview, any of the inputs. Anything that I want to see, I can see up there, and I can change it. Even with a macro, I could change it from one to another. It’s really amazing. It’s better for me because now I can have my program output showing the full camera, but I can also see my multiview on a monitor nearby, so I can see everything working.

If you have an engineer working remotely to help you so that you don’t have to be an engineer and host, that would be even better, wouldn’t it?

It is, and you can do that, because I can actually talk to someone else’s machine. That’s often done during the Olympics. They would put an ATEM on site but have tie lines back to the US and have somebody controlling it. That’s all possible.

Unbelievable. All this engineering ourselves is really hard.

I agree. Being the host and the tech person at the same time is not the easiest thing to do in the world. 

Bob, tell me, what is Crew in a Box and how does that work?

Crew in a Box, I guess, is a company. It’s a website that provides an all-in-one solution, and I know they use some Blackmagic products in it. The idea is that it’s a box that gets delivered. Either you set it up or they’ll send somebody to set it up so that you don’t have to worry about it. It’s got the camera, it’s got the switcher, it’s got the ability to encode and send out, it has lights, and everything that you need. There are a lot of similar solutions out there now.

It was funny when I watched the Emmys this year. We’re watching live and I started to comment that some people’s setups looked really good and others, not so good. My wife even said, “Well, maybe they’re using Blackmagic stuff.” I said, “Maybe, I don’t know.” 

It turns out that they had sent out 150 kits to people. Not all of them were used, but they sent them out with the idea that all anyone had to do was plug in their local Ethernet cable to get it up and running, which they did, and they were able to control them. They had a Blackmagic Pocket 6K camera and an ATEM Mini Pro. They had all the little tools.

What we’re seeing is a lot of people doing these Crew in a Box–type deployments, because you can get a lot done in a small footprint. I think about it now: if I were a news crew going out to do an interview—usually they send two cameras out to do an interview and whatnot—they may as well have somebody there who could actually do a line cut while the interview is going, very easily with an ATEM Mini, record it, and then you would have that timeline already populated. Fixing anything you wanted to fix later would be very easy. People have been doing that before, but without being able to do the ISO recording, which you can do now with these.

I think we’re getting people to rethink some of the ways they’re doing remote stuff. If you really think about it, right now, if they have a live shot for a news show, they have one camera. But they could easily have a second camera now, have somebody switch, and stream it directly back to the station to one of our ATEM Streaming Bridges—I like to call it the ATEM streaming receiver—which will receive the stream no matter where it’s coming from and then turn it back into SDI or HDMI. That would give them better quality than some of the things they’re already doing, like Zoom and whatnot.

Obviously, there is security, because you’re only giving the one box access to the other box, so no one else is picking up on it. They would be able to do it for relatively few dollars and have two cameras out there, because oftentimes when they’re there, you’ll see the reporter say, well, swing it around and show them that. Well, you could do both at the same time. You could show the person speaking but also show another angle at the same time.

Again, it wouldn’t take much to be able to do that, because you could just leave the one camera framed up on the reporter while the cameraman swung around and shot with another camera, and they’d be able to send both out. I just think it’s time to step up the game with some of these remote live shots, where they could do a lot more than they’re currently doing by trying to send one camera out there.

It’s painful to watch some of these so-called “professional” productions. I mean, it’s Academy Awards time. We start getting all the screeners, and we start getting invited to all of these meetings online starring producers, actors, and—well, I don’t want to name names. You’re watching these meetings and, like you said, everything looks terrible, the lighting is terrible, the sound is terrible. We’ll have to send everybody over to the Blackmagic page, and maybe Crew in a Box.

It’s funny, because I did a Zoom meeting with some people about the camera. The guy I was talking to is a high-end VFX guy. He looked good, and he and I were both commenting about how bad people look and sound. Aren’t we in this business? Why aren’t people paying attention? He said, I get it with maybe actors—they don’t pay attention, they don’t do the technical stuff. But when you talk to a technical person and they look and sound terrible, it’s like, dude, what are you doing?

As we were talking, I said to him, “You know what’s funny? Today I’m not in my normal setup. I had to improvise and set up a green screen somewhere else.” He just said, “Wait a minute.” I could tell he was zooming in, and he’s like, “Wow, I didn’t know you were using a green screen.” I said, “Well, that’s a compliment, because the VFX guy didn’t notice.”

Absolutely. You didn’t have a halo. 

Right. That is part of it. Now, if you have a well-lit green screen, even if you use Zoom for your keyer, you can put your own image in the background. I’ve done that. It doesn’t look quite as good as my ATEM, but it looks a lot better than the people who lose their heads when they move side to side.

Oh, my gosh. Yeah, especially women with longer hair. Well, you have long hair now too, Bob. I noticed. 

Yeah.

You’ve let your hair grow out. I want to ask you a question about these fiber converters, because I love the fact that you can connect to live switchers that are up to two kilometers away. That just sounds pretty awesome. Talk to us about what those are used for.

Sure. We have a camera fiber converter that goes on the back of an URSA Mini Pro or URSA Broadcast, and then you have the SMPTE fiber connection. The SMPTE fiber, as you say, can run up to two kilometers, which is far. At the other end, the receiver turns it back into SDI to go into your switcher, but it also provides the return feed. You can actually send up to three different feeds back to the camera.

The camera operator could watch the program output, or maybe the preview. If you want to tell them, here, look at the preview and line up your shot before we take you, that kind of thing. That’s really cool. It also has a two-channel intercom, and it’s just one of those things that’s relatively inexpensive compared to other solutions.

You’re talking about a camera body plus the two converter ends, minus the cable, for around $10,000, and that’s probably one-tenth the cost of some other solutions. The fact that you’re able to use SMPTE fiber cables and get that length is really cool.

One thing I would tell you about HDMI, because it’s another one of those things: the problem with HDMI is that you usually can’t go very far, but there are new HDMI fiber cables that go up to 300 feet. That’s about the same length an SDI cable will go. They’re very inexpensive, and they work very well with the ATEM Minis and Blackmagic cameras.

It’s interesting that the length of HDMI cables, which used to be prohibitive over 25 to 30 feet, is now extended by these fiber optic cables to 100 meters, or about 330 feet, so that’s pretty cool too. Fiber is helping on both ends.

Thank heavens for Blackmagic. They’ve sponsored my films, and they have been amazing at helping me get things filmed. I have to say thank you to Blackmagic. I’m admitting to everyone listening that I am a little bit biased toward Blackmagic, for very good reasons.

I notice that the URSA Mini Pro has a small mini jack on it. I’m just thinking about cabling, and I’m not an engineer. I’m a producer and a host. What do you have as a solution that I can use? For example, if I shoot with one of these URSAs, say the URSA Mini Pro, how would I cable it to some of this equipment you’re talking about?

The fiber converter actually mounts on the back of the URSA Mini Pro, and when it does, then the three little SDI connectors on the back would connect to that. That’s the way that would work. 

That’s awesome. Most people are doing Zoom or Skype, or some variation of that, for their meetings. Which of the cameras Blackmagic makes would you recommend as the tent pole of your production, so to speak?

When you use an external camera, you need a way to interface with the computer, and there are a couple of ways to do it. If you have a computer, you can add a capture card to it; we make DeckLink cards. But of course, the ATEM Mini is really the easiest way, because it just plugs in with USB-C, and then any of the four inputs can be your webcam output.

You can switch between them live as you’re doing it, which is really cool. You can use one of our cameras, like the Pocket Cinema Camera 4K, which is our least expensive camera for that type of use but really delivers a great image. You run its HDMI out into the little ATEM Mini, and then into your computer.

That also gives you the choice to switch between other sources, maybe the output of a laptop where you have your Keynote or PowerPoint presentation. You can switch between them, or put yourself up in the corner while you’re talking through it. All those things are available there, and that can really step up your game on Zoom. As they say, if your video and audio look and sound good, people will want to watch and listen longer.

Absolutely. You mentioned, when we were talking earlier, that the Emmy nominees were live-streamed using Blackmagic equipment. Can you tell us about that?

They used Blackmagic Pocket Cinema Camera 6Ks with EF lenses to get a little bit of depth of field. They also used the Blackmagic URSA Mini Pro, and they had a laptop and a light on that road kit. The way they worked it, they built it all on a tripod with wheels so they could just wheel it in, because they were trying to be safe, too. Somebody could wheel it in, plug the ethernet into the ATEM, and away they went. You could tell who was using the cameras and who wasn’t.

What have I forgotten to ask you about that you want to tell people to help them get better at remote collaboration? That’s a huge subject.

That is a huge subject, but I’ll tell them this: they can find out more information about all of our products at blackmagicdesign.com. That website’s great. We have a lot of videos from our product announcements that explain a lot of these things. Our website has gotten very feature-rich and provides all kinds of cool little videos and animations that explain a lot of what we were talking about today. I think it’s something people should poke around and take a look at.

Instead of trying to make everybody work one way, you should be able to work the way that you’re most comfortable.

We have a very active and passionate fan base that has made a lot of videos, because they want to show off some of the features we offer. I’m always amazed at some of the work that gets done putting those together, and I thank them all because I think it’s great. So yeah, I would look for those. Google Blackmagic Design workflow or ATEM or any of those, but our website obviously has a lot of that stuff too.

There’s a section of your website that I love going to. It’s called Splice and it’s great for inspiration. There are stories there about creators who do amazing work. I love browsing around on Splice once in a while just to see who’s out there doing great work. 

Bob, I know you’re swamped and I really appreciate you taking the time to talk to us. Everybody, go to blackmagicdesign.com and I’m Cirina Catania. I’m speaking with Bob Caniglia of Blackmagic Design. I thank everybody for listening. Remember what I tell you all the time, get up off your chair and go do something wonderful today even if it’s in your own home. Have a great day and thank you.



Checklist

  1. Design a workflow for remote collaboration. The team must have a collaborative system that will make the project easier to finish.
  2. Practice clear and effective communication between team members. Communicate your expectations for how collaboration should be conducted, and ask your collaborators about theirs.
  3. Make copies of your project’s timelines. This is a non-destructive way to save versions of your project, especially when several people are working on it at the same time.
  4. Use a shared project management space for all tasks, products, and vital information.
  5. Invest in equipment that will make your audio and video recording better. If your audio is not good, it doesn’t matter how great your video is.
  6. Always test your microphone before you do an interview or live stream. This will help you hear what other people will hear in your recording. 
  7. Plug your microphone into your camera and treat the room you’re recording in to absorb sound. This will make your recording sound better.
  8. Be in the quietest place in your house or office when you are recording an interview or doing a live stream. You can use foam and packing blankets to make your recording studio soundproof.
  9. Make sure that your place is well lit when recording. When you look and sound good in your video, people will pay more attention to it.
  10. Check out Blackmagic Design’s website to know more about their products.
