NILL Co., Ltd.
NILL and Opus Studio provide technical cooperation in the development of NTT Konokyu's XR live system "Matrix Stream"
NILL Co., Ltd. (Headquarters: Chofu City, Tokyo; CEO: Yusuke Shimizu) and Opus Studio Co., Ltd. (Headquarters: Shinjuku-ku, Tokyo; CEO: Tomohide Yanagisawa) provided technical cooperation in the development of NTT Konokyu's XR live system "Matrix Stream".
"Matrix Stream" will be exhibited at Makuhari Messe for three days, from October 26th to 28th.
1. Live video experience in vision and VR
You can experience live video of Tacitly, a character from "Intuition x Algorithm ♪ 3rd Season", both on a wall-mounted vision display and through standalone VR goggles.
2. Experience participating in a virtual live together with many others
By implementing functions that recognize individual items such as T-shirts and penlights, users wearing standalone VR goggles at the same time can experience watching the live performance together.
3. Experience multi-viewpoint camera switching while wearing VR goggles
Wearing standalone VR goggles, you can switch camera viewpoints yourself while watching Tacitly's live performance.
About “Matrix Stream”
1. System overview
This system generates XR content in real time, driving the movement of characters in the virtual space from performer data such as motion capture recorded in the XR studio. The generated XR content can be distributed simultaneously in multiple formats (distribution to screens at real venues, VR distribution, and online distribution) as music lives and programs that utilize XR technology.
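The flow described above (performer data in, simultaneous fan-out to several distribution formats) can be illustrated with a minimal sketch. Every class and function name here is hypothetical and only stands in for the actual system:

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical representation of one frame of performer motion-capture data.
@dataclass
class MocapFrame:
    bone_rotations: dict  # bone name -> rotation data
    timestamp: float

def render_xr_frame(frame: MocapFrame) -> str:
    # Stand-in for the real-time renderer: apply the performer's motion
    # to the virtual character and produce one frame of XR content.
    return f"xr-frame@{frame.timestamp}"

def distribute(frame: MocapFrame, sinks: list[Callable[[str], None]]) -> None:
    content = render_xr_frame(frame)  # generate the content once ...
    for sink in sinks:                # ... then fan out to every format
        sink(content)

# Three distribution formats receiving the same generated content.
venue_screen, vr_stream, online_stream = [], [], []
distribute(
    MocapFrame(bone_rotations={}, timestamp=0.0),
    [venue_screen.append, vr_stream.append, online_stream.append],
)
```

The point of the sketch is that the content is rendered once per frame and delivered to all formats at the same time, rather than being produced separately for each channel.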
(1) Simultaneous distribution in multiple formats, both in Japan and overseas
In addition to real venues and online distribution, viewers will be able to participate in virtual venues as avatars, experiencing live performances and watching programs in the same space as the virtual characters. *Images show the system under development.
(2) Real-time, high-quality virtual live performances realized with a proprietary shader
To express virtual characters and spaces attractively and realize more emotionally engaging XR music lives, we developed and introduced a proprietary shader (program) that pursues both real-time performance and expressive power.
(3) Interactive staging by linking with distribution sites' own functions
By linking with functions such as comments and tips on various distribution sites, the staging of the XR space can be changed interactively in real time. For example, a virtual live performance can be streamed simultaneously on video distribution sites in Japan and China, and when viewers post supportive comments in their own languages, fireworks go off in the virtual space at the same moment, letting people from various countries enjoy the virtual live together across borders. The staging can also be flexibly adapted to the image of the content and the worldview of the character.
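The comment-to-effect linkage described above can be sketched as a simple event handler. The keyword table, effect name, and function names are illustrative assumptions, not the actual implementation:

```python
# Supportive phrases in several languages all map to the same in-space effect.
SUPPORT_KEYWORDS = {"がんばれ", "加油", "go", "fight"}

effects_log = []

def trigger_effect(name: str) -> None:
    # Stand-in for spawning a real-time staging effect in the XR space.
    effects_log.append(name)

def on_comment(text: str) -> None:
    # Called for each viewer comment arriving from a distribution site.
    if any(keyword in text.lower() for keyword in SUPPORT_KEYWORDS):
        trigger_effect("fireworks")

on_comment("加油！")      # Chinese: "go for it!"
on_comment("Go go go!")   # English
```

Because the mapping from comment to effect is a small lookup, the same mechanism can serve multiple distribution sites and languages at once, which is what allows viewers in different countries to trigger shared moments in the virtual space.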
(4) Labor savings in virtual live operation
In general virtual live systems, the avatar specifications needed to move a virtual character, such as bone structure and facial expressions, are defined separately by each system, so the character model may need to be set up, modified, and adjusted for every system used. In this system, by developing tools that optimize the data structure when introducing a virtual character model, whether our own or another company's, the conversion from the avatar model format, previously done by hand, can be automated.
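One core piece of such a conversion tool is remapping an avatar's bone names to the convention a target system expects. This is a minimal sketch under that assumption; the mapping table and bone names are illustrative, not the actual tool:

```python
# Hypothetical mapping from a source model's bone names to the
# target system's naming convention.
BONE_NAME_MAP = {
    "hips": "Hips",
    "spine": "Spine",
    "head": "Head",
    "leftUpperArm": "LeftArm",
    "rightUpperArm": "RightArm",
}

def convert_skeleton(bones: dict) -> dict:
    # Rename every bone the map covers; leave unknown bones untouched so a
    # human can review anything the tool could not convert automatically.
    return {BONE_NAME_MAP.get(name, name): data for name, data in bones.items()}

source = {"hips": [0, 1, 0], "leftUpperArm": [0.2, 1.4, 0.1], "tail": [0, 0, -0.3]}
converted = convert_skeleton(source)
```

Passing unknown bones through unchanged, rather than dropping them, keeps the conversion safe to automate: the common cases need no manual work, while anything unexpected remains visible for adjustment.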
In addition, in live operation we will automate the show's progression, streamline camera work and lighting operations, optimize the number of operators this system requires, and improve operating-cost efficiency.
Details about this release: