May 9, 2022
Alumni Spotlight: Ari Karczag '15
By Claire Wong
After a 15-year career in information technology (IT), Ari Karczag '15 craved something more creative and interactive. He returned to school for a degree in Film and Television Production and discovered motion capture through various courses. His love for creative tech was nurtured. Today, Ari works as a motion capture and performance capture artist at Scanline VFX. His most notable works are with Sony Innovation Studios, Jim Henson Studios and various other Warner Brothers films. Ari joins us to share his story and how the SCA community has helped shape his career.
You discovered your love for motion capture while at USC School of Cinematic Arts. Can you talk a little bit more about the classes, professors and your overall education that helped shape your career?
The Physical Film Production track at USC was an interesting and challenging experience. They really throw you into the fire of filmmaking, having students literally do every role required in pre-production, production, and post-production. It's a lot to take on and learn, but by going through the process you're able to figure out what you're good at and what you enjoy. It helps you navigate all the potential career choices this industry has to offer. In my first semester we had to create four short films that were 4-8 minutes long; it was rough. For this first set of short films, I focused on epic stories. I built in a series of technical challenges so I could learn the hardware and software side of things. This process helped me learn all the different tools of filmmaking.
For example, one of my shorts, Sucker Punched, started with an extended first-person POV one-shot, where I learned it was really hard to hide the cameraman. It also takes place in the near future, so the character has a heads-up display that notifies him of a head injury and a full bladder. Since he was hit in the head, all of the sounds he hears are off just a bit, as his audio implant had been damaged. I ended up not using any of the on-set audio and instead had to design and layer every sound effect in post. In other shorts, I experimented with shooting at night, pre-filming shots that could be played back later and interacted with for in-camera VFX, camera tracking to add digital effects, etc. By building these simple challenges into my early projects, I was able to take a deep dive into complex camera operation, Avid, and Pro Tools early on. This made technical challenges on future projects much easier to handle.
My first semester teachers, Miles Watkins and Tony Cucchiari, were really great at playing Bad Cop/Good Cop. Miles is a veteran director and could immediately spot if a student was working towards improving their potential. If Miles thought you were slacking and should do better he would call it out. This pushed students to work outside their comfort zones and improve as storytellers and filmmakers. Tony was a bit softer and more tactful with his constructive criticism and let you know that he understood what you were trying to do, then showed you how to improve your work. I really enjoyed learning from both of them and feel lucky to have had them at the very start of my schooling.
What interested you in motion capture?
Prior to attending USC, I had a 12-year career in IT and technical support as well as acting. Tech always paid the bills, while performance in theater and commercials fed the creative monster. When I stepped onto the mocap stage at the Robert Zemeckis Center for Digital Arts, I felt at home. For me, mocap is the perfect blend of technology, performance, and creative problem solving.
For those not familiar with it, can you describe Virtual Production and your role in it, including motion capture?
The 1st Edition of the Virtual Production Field Guide produced by Epic Games states:
"Virtual production is a broad term referring to a spectrum of computer-aided production and visualization filmmaking methods. To begin, here are a few notable definitions about virtual production. According to the team at Weta Digital, 'Virtual production is where the physical and digital worlds meet.' Moving Picture Company (MPC) adds to this definition with more technical detail: 'VP combines virtual and augmented reality with CGI and game-engine technologies to enable production crews to see their scenes unfold as they are composed and captured on set.'"
In simpler terms, we're replacing the greenscreens and bluescreens with giant LED walls that display digital backgrounds in real time, which helps immerse the actors and filmmakers in the world of the story. The role of motion capture in virtual production is, for the most part, two-fold: we track the film camera as it moves so that the digital background displays the correct parallax in real time, and we capture human movement in order to turn performers into digital characters or creatures.
How does motion capture differ from traditional film? Are there aspects when shooting motion capture you have to consider differently?
Every shoot has unique requirements and the size and budget of each project can vary widely. Are you only shooting body mocap or is it full performance capture with body, fingers, facial capture, and audio? How many performers are being captured at one time? Is this for a cinematic scene, in-game player controlled movements, or a live-streamed performance? Some shoots don't require a physical set, while others need to have set pieces built and placed to match the location in the 3D scene.
It takes A LOT of imagination to both perform and direct in motion capture. As a director, you need to provide your talent with as many visual resources as possible so they understand the character they're embodying and the world they inhabit. For performers, your costume is a skin-tight suit with reflective markers velcroed all over and a helmet clamped on your head in order to hold a tiny camera 12 inches away from your nose. Sometimes, there's a lot of tech that's velcroed around your waist to record or transmit your facial video. All of this could end up being distracting, but it's the job of the mocap team to make things as comfortable as possible for the actors.
One of the great things about shooting mocap is that you're capturing everything in 360 degrees all at once. There's no need to have different setups for wide, close-ups, over the shoulder, etc. You can really focus on getting the best performance, then worry about lighting and shooting camera coverage another day.
Collaboration is vital in motion capture as there are many performers, technicians and creatives working within the same space. How do you balance collaborating with others of varying expertise?
It quite often depends on the position I'm hired for. As a mocap or performance capture technician, my job is to suit up the talent and make sure all the hardware is functioning properly. Then, I need to make sure the data or video is captured to the highest standard. The work is less collaborative and more of a technical role. However, if I'm brought on as a technical director or mocap supervisor, I tend to work with the director or movement director in order to get the best possible performance from the actor. Most of the time, we don't have an elaborate set or necessarily the right props, so the challenge becomes "How do we get the skeletal performance to sell the shot?" Sometimes, when a director is having trouble with a scene, especially when they're remote-directing, I'll step in to offer suggestions that help to clarify or inform the performance.
Virtual Production and real-time technology are constantly improving and innovating. Many creatives who aren't working with the technology on a regular basis may not know what it's entirely capable of. Quite often, I find writers, directors, and actors asking me great questions about how the tech works. I provide answers as best I can during brainstorming sessions.
It seems that due to the nature of motion capture, on-set production is critical. How did the pandemic impact the manner in which motion and performance capture is filmed? Has it changed the way it is filmed today?
Most projects shut down from April to October 2020. Studios began to open up slowly, and we were looking to fill several key roles while keeping the number of on-stage crew to a minimum. Luckily, over the course of my career, I've become familiar with most of the hardware and software for performance capture. I build and operate OptiTrack and Vicon motion capture volumes. I run Dynamixyz, Faceware, and Standard Deviation headcams, and Xsens suits with StretchSense or Manus gloves for inertial motion capture. These skill sets allowed me to take responsibility for specific tasks, while being able to help in other areas as needed.
On-stage crews have definitely gotten smaller. On several occasions, even the director is working remotely. We stream reference camera video and the mocap previews via video call. Most of the creative and animation supervisors work remotely, listening in to the conference call when necessary. We'll see over the next year or so whether on-set crews grow back to pre-pandemic numbers or remain a smaller, specialized team.
How do you think virtual production and motion capture technology will impact the future of entertainment?
As this technology becomes more widely used and new hardware and software tools are designed for filmmakers, I believe it will drastically reduce the budgets required for fantastical stories and shorten turnaround times for VFX shots, with more and more being done in-camera rather than in post-production. This will also give actors the ability to see the digital world on the LED walls around them, enabling them to get in 'the zone' for their performance.
What advice do you have for current students looking to pursue a career in virtual production?
Take the "Intro to VFX" and "Motion Capture" courses provided at USC in order to learn the history of Visual Effects and get hands-on experience in the mocap volume. Motion capture and virtual production are skill sets that require hands-on learning. You can't learn if you don't have access to the hardware and software. Learn how to navigate Unreal Engine and/or Unity. Build a solid understanding of what can be done with the game engine. Go through as many of the Unreal online courses as possible. Take an introduction to IT course and learn how to do basic computer hardware, software, and network troubleshooting.
What is your ultimate career goal, and what’s next for you?
In the industry, I'd like to push real-time digital performance and live-streamed immersive theater. Imagine Whose Line Is It Anyway?, but the improv actors are in full performance capture gear, live-driving digital characters in a digital scene shown on a giant LED wall on stage. Additionally, I'd like to direct more digital shorts, like Love, Death + Robots. I think the short-story format forces the director to 'trim the fat' and really make the most of every second of screen time.
My wife and I recently welcomed our beautiful daughter, Lila (thank you, thank you). I'm discovering the importance of a good work-life balance. Maintaining that balance while navigating our careers is quite a high priority. My wife, Yuchao, is a budding stand-up comedian, and I really enjoy helping film her sketch comedy shorts.