Part One: Getting off the Ground
…is hard enough
Our streamlined HD/VR production crew meets for the first time at JFK. We are on our way to Africa to capture the essence of “The Bright Continent”, a book by Dayo Olopade that attempts to correct the narrative that Africa needs handouts from America and Europe in order to succeed. Inspired by Dayo, we will be exploring the ingenuity and creativity of communities in Nigeria, Rwanda, and Kenya. Along the way we are filming both a traditional HD documentary and a VR documentary for The Nantucket Project with the generous support of Harbers Studios.
There are four of us (from L to R):
Jennifer Hobdy – Sound Op / Production Manager / Organization Guru
Daniel Honan – Director, Producer / Fearless Leader
Me – VR Director / Technical Solutionist
Aaron Neu – DP for the HD doc / Production MacGyver / Comedian
Look at those smiles!
We have no idea what we’re in for.
What did we get ourselves into?
Besides the fantastic crew involved, this shoot is a unique challenge for several reasons:
1. we are severely limited in the kind of equipment we can bring
2. there will be little to no opportunity to location scout, test, or film the VR sequences
3. we will be operating in places that provide very little or no production support
4. we’ve had almost no pre-production time
5. three days before departure, our primary camera took a 2-foot drop to the floor
4 360° cameras: a Ladybug3, two 360flys, and a Ricoh Theta
2 DSLR cameras: a 5D and a 60D with a slew of lenses and grip items
2 GoPro Hero3s with clamps, suction cups, and mounts
a helmet with a headmount
a portable battery to charge the Ladybug
1 wireless lav mic
2 ambisonic microphones: TetraMic with Tascam recorder and the Brahma
Everything we have is packed into 8 carry-on items. No checked luggage.
As I go through security at JFK, TSA confiscates the portable battery straight off the bat. This is a key piece of equipment because the Ladybug3 does not have its own power supply. It will be powered off the FireWire connection on our MacBook Pro, a workflow that was set up just minutes before the gear was en route to the airport and which no one has thoroughly tested yet.
The TSA officer is looking right past me, ignoring me as I plead with him to let us keep our power supply. Apparently the clamps on the side that could start a car could also potentially be used to electrocute someone. I guess I don’t think about things like that. We are past GO, the plane is at the gate, and there is no time to get back through security to check the battery. So we leave it and run for the plane.
Rule #1 in remote shoots: always have a backup
In mid-sprint Jenn, our rockstar production manager, is emailing our Nigerian producer to find out power options when we land. My backup plan is the inverter in my backpack. It plugs into a car cigarette lighter and has a US power outlet that I can plug the computer into to re-up. I can only hope that it will be enough to power up the laptop between locations, or else we aren’t going to be able to get many shots off. The travel plan places us in remote locations where we are not going to be able to run cables, so the portable power was going to give us the juice we need to set up our shots and execute multiple takes.
I lay eyes on the Ladybug3 for the first time under the dim cabin lights of a flight from New York to Paris. She’s a beauty. We had looked into several options before deciding to work with the Ladybug. The Bublcam is a great-looking device. Unfortunately, it took a 2-foot drop to the floor while we were testing. The backup got held up in Canada/US customs and never made it to us. Massive Red rigs were impractical. The multi-GoPro rigs are fine if you have hours to set up, plenty of production support, and the ability to shoot multiple takes – none of which we would have. I wasn’t willing to risk losing a shot with a finicky setup. Other options like Samsung’s Beyond weren’t available yet.
And then there was the Ladybug3. Taylor Swift had used it in a recent interactive video. She had used 16 of them, actually. Point Grey Research has a kick-ass, supportive team that was on the ball, expediting a camera to us and answering dozens of questions. While the Ladybug solved some problems for us, it came with its own set of problems.
Technical moment for those so inclined
The Ladybug3 is a camera in the sense that it has lenses. It does not have its own power supply. It does not have a battery. It does not have its own internal storage. It does not capture sound. To control the lenses, you have to be connected to the SDK. Oh, and did I mention that the SDK only runs on PC? Power can be supplied over FireWire – on a Mac. We forwarded Point Grey’s optimal configuration settings to potential computer suppliers and no one could come up with one that met their specifications in less than 24 hours. So also in the backpack is a MacBook Bootcamped to run Windows 8. It’s been a while since I used Windows, so all the ridiculously huge icons that I imagine are intended to make it more user-friendly just feel like obstacles to efficiency.
I’ve also brought 2 setups for ambisonic sound. The most complicated one is the TetraMic, which requires a setup involving the XLR outputs being converted to Cat5 and back again into a Tascam recorder. The second microphone is the last piece of hardware that I hadn’t laid eyes on until this plane ride – the Brahma. The Brahma is a re-outfitted Zoom H2n with the ambisonic mics inside. Both mics have windscreens. That’s all I know. I’m not really a sound person, other than having sat in many a post-production sound booth watching the engineers work or testing sound in a movie theatre before a screening. I know the basics of directional vs. omni sound, but I’ve read up on everything I can and this is going to be learning on the fly.
The Ladybug is for our key shots. For supplemental footage and backup (see Rule #1) we have two 360flys. Those babies are single-fisheye-lens cameras that store to the camera itself and can be triggered from an app on your phone. In that app you can also preview what you’re seeing, and there’s a desktop application where you can review the footage. Plus it records sound. And it’s waterproof. The 360fly already feels like the winner for a shoot where we will be going to some uncontrolled environments, but we have concerns about the resolution and low-light capabilities. Also, its lens only allows for a 240-degree view – which means you can’t look down when you’re inside the VR experience, which personally drives me crazy.
The Ricoh Theta is going to be more of a communication device than a tool for shooting footage. It has two fisheye lenses that capture full spherical images and up to three minutes of video with sound. The iPhone app does a great job of auto-stitching the two lenses together for images, but you can’t review video through the app. As with the 360flys, there are low-light and resolution limitations that prevent it from being an option for key material.
Still in the box is a G-Tech 4TB external hard drive that hasn’t even been formatted yet. This drive will be the backup that I download content to each night after the day’s shooting.
The moment of truth
Because of the time difference, I know that I need to try to sleep on this leg of the flight to switch my body clock closer to Lagos time. The plane from Paris to Lagos is when I should test the setup. After all, we’re already mid-air. If the setup doesn’t work, then we’re screwed whether I get the Ladybug up and running now or on the next flight. We all down a couple glasses of wine and annoy the stewardesses by gathering in the walkways to talk excitedly about the adventures ahead. Back at my seat, I open my printed-out copy of the Ladybug Technical Manual to ramp up my learning curve, and boom – right to sleep.
On the plane, Aaron and I pull out the laptop and fire up the Ladybug. Have we hauled all this equipment across the Atlantic only to find it won’t work?
It will have to stay connected to the laptop in order to function. And my plan, inspired by the US TSA, is now to see how long the computer will run the SDK until it is completely out of battery.
The SDK fires up, the preview capture begins and we are up and running.
The Ladybug SDK interface is simple enough, with options for various camera uses. We’ll be using video almost exclusively, so I skip straight to camera settings, which pops up a huge menu covering 50% of my screen real estate – right where I’m trying to see how the setting changes are affecting the picture. There’s no way to minimize it, and when I slide it off the screen, I can’t access the controls. A function of running off Bootcamp? That’s going to be annoying.
Aaron is holding up the camera while I play around with the SDK. What he doesn’t realize at this moment on this plane is that his role as Key Grip has just been established. The software offers individual control over each of the 6 lenses – shutter, exposure, gain. The auto settings don’t seem to get us very far, so each of these is going to need to be hand set. Another drag on the SDK is that for certain functions, like shutter, you can’t manually type in a value. Instead you have to click up or down to change the value, which can take a long time. Our setups aren’t going to be as fast as I would have liked. I notice that the capture streams are chunked into 2GB files. I assume that’s for backwards compatibility with Windows drive-format limitations, but I need to confirm that.
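For those so inclined, the chunking has a practical consequence for data wrangling: every take fragments into multiple files. Here is a back-of-envelope sketch of the arithmetic – the 30 MB/s stream rate is an illustrative assumption on my part, not a measured figure from our rig or a Point Grey spec.

```python
import math

# Planning math for 2 GB capture chunks.
# ASSUMPTION: the stream rate passed in is illustrative, not measured.
CHUNK_BYTES = 2 * 1024**3  # 2 GiB per chunk file

def seconds_per_chunk(rate_mb_per_s: float) -> float:
    """How many seconds of capture fit in one 2 GB chunk."""
    return CHUNK_BYTES / (rate_mb_per_s * 1024**2)

def chunks_for_take(take_seconds: float, rate_mb_per_s: float) -> int:
    """How many chunk files a take of the given length produces."""
    return math.ceil(take_seconds / seconds_per_chunk(rate_mb_per_s))
```

At an assumed 30 MB/s, each chunk holds just over a minute of footage, so even a five-minute take spans a handful of files – worth knowing before tallying up drive space for the nightly backups.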
The people sitting next to us on the plane start to cover themselves up when they realize they can see themselves on the laptop showing the preview. The steward says he doesn’t want to be in a film and stares at the panoramic image as he tries to dodge the camera. We continue to test out various settings and frame rates.
In just over 75 minutes I completely drain the laptop battery. I’m not going to have the 2 hours I thought. Tomorrow is going to be rough.