
Opening MOFAC VP Studio

In May 2018, around the time post-production of <Monstrum> was wrapping up, I saw an article and a video of the Star Wars demo from GDC 2018, which showcased real-time retargeted motion capture, virtual scouting, and a virtual camera in Unreal Engine. I sensed that this could become a new way of filmmaking.

"Reflections" Demo from GDC 2018

I got permission to lead R&D independently as a task force team. In an empty office room, with a single workstation, one Rokoko Smartsuit, and a Vive Pro VR kit, I studied and tested virtual production solutions. I also built a custom virtual camera rig and a facial capture rig.

After two months, I put together a video reel of my work, 'Introducing Virtual Production'. MOFAC's executives saw the potential of virtual production and decided to create a new department to advance this technology.

As a result, the Virtual Production Center was founded in late September, and at the same time a new chroma-key-based virtual studio was built in the company.

The first thing I did on this new team was introduce a camera tracking and real-time compositing solution. After extensive research and online demos, we chose Ncam. The company held a showcase event and invited many guests from the film industry; I studied the solution, prepared assets for the demonstration, and operated it.

I introduced the VICON system, and the optical motion capture volume studio opened in July 2019. Because it allows capturing multiple actors on one virtual stage, the main production of <The Life of Our Lord> and <Hansan> could proceed.
