
/vt/ - Virtual Youtubers



File: 63 KB, 930x772, 1597188642568.jpg
No.769661

What hardware are the different agencies using? What about standalones?
Pics of the rigs? I know Holo uses iPhones and does in-studio shit for 3D, but it seems that Coco and others have their own 3D rigs set up.

>> No.769901

All of Cover's and Ichikara's tracking tech is in-house. I'm not sure what Japanese indies use, but a decent number of western indies use VTube Studio, which is also officially partnered with Tsunderia, MyHoloTV, and Asagiri Yua.

>> No.770365

>>769661
Both Cover (Hololive) and Ichikara (Nijisanji) use proprietary apps they developed themselves that leverage the iPhone's facial tracking tech.
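
Neither app is public, so nobody outside really knows how they're wired up, but the general shape is simple: the iPhone's ARKit spits out a few dozen blendshape coefficients per frame (jawOpen, eyeBlinkLeft and so on, each 0 to 1), and the app remaps and smooths them onto the model's parameters. A rough Python sketch of that idea - the mapping names, weights and smoothing here are made up, not Cover's or Ichikara's actual code:

# Hypothetical mapping from ARKit-style blendshape names to avatar parameters.
# The real agency apps are closed source; this only illustrates the idea.
BLEND_TO_PARAM = {
    "jawOpen":       ("ParamMouthOpenY", 1.0),
    "eyeBlinkLeft":  ("ParamEyeLOpen", -1.0),   # negative weight: blink closes the eye
    "eyeBlinkRight": ("ParamEyeROpen", -1.0),
    "browInnerUp":   ("ParamBrowLY", 0.8),
}

def remap_frame(blendshapes, prev_params, smoothing=0.5):
    """Turn one frame of 0..1 blendshape coefficients into smoothed model parameters."""
    params = dict(prev_params)
    for blend_name, (param_name, weight) in BLEND_TO_PARAM.items():
        raw = blendshapes.get(blend_name, 0.0) * weight
        if weight < 0:
            raw = 1.0 + raw            # e.g. eye openness = 1 - blink amount
        old = prev_params.get(param_name, raw)
        params[param_name] = old + (raw - old) * smoothing   # cheap low-pass filter
    return params

# Example: one frame of tracker output
frame = {"jawOpen": 0.6, "eyeBlinkLeft": 0.1, "eyeBlinkRight": 0.9}
print(remap_frame(frame, prev_params={}))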

In fact, both agencies started out as glorified tech demos for their in-house 2D and 3D tech (in Cover's case, you can count the original Sakura Miko project as their 3D body tracking demo). All the crazy antics you see the Niji boys doing, like playing table tennis in 3D, are basically Ichikara showing off their tech; Ichikara also has Yumenographia, which is their VR tech demo being sold as a virtual hostess service.

Nijisanji's Live2D camera app used to be available to the public, but they took it down almost two years ago. They may still offer it on a business-to-business basis (I remember someone mentioning that Animare is contracted with them for the tech, and their partners in China who run VirtuaReal certainly use the same tech).

A lot of other companies and standalones in the Tokyo area who do 3D also hire the services of VRLive (dotLive/.Live's parent company, whose tech demos are Shoujo Denno Shiro and Idol Club) for their 3D body tracking and streaming services.

As for indies and smaller agencies, when it comes to 2D, most will use PrprLive or FaceRig with the Live2D module, but there are many other options. For 3D, if it's just the face, 3tene and VSeeFace are used a lot, FaceRig is also used, and there are others - many of the 3D programs also have Leap Motion support, so you can have the hand movement animated too.
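
VSeeFace and most of these standalone trackers can also talk to each other over OSC, usually via the VMC protocol - that's how a lot of phone trackers and add-ons feed data into whatever program is doing the rendering. If I remember the addresses right, pushing blendshape values into a receiver looks roughly like this with the python-osc package (the port and OSC addresses are from memory, so check the VMC spec before trusting them):

from pythonosc.udp_client import SimpleUDPClient

# VMC-protocol receivers usually listen on UDP 39539 or 39540; the port and
# the OSC addresses below are from memory, so verify them against the spec.
client = SimpleUDPClient("127.0.0.1", 39540)

def send_blendshapes(values):
    """Send one frame of named blendshape values, then tell the receiver to apply them."""
    for name, value in values.items():
        client.send_message("/VMC/Ext/Blend/Val", [name, float(value)])
    client.send_message("/VMC/Ext/Blend/Apply", [])

send_blendshapes({"A": 0.7, "Blink": 0.0})   # "A" is the open-mouth vowel shape on VRM models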

For full-body 3D, they use VRChat-like tracking equipment with VTube Studio. Some who have voice-activated animations instead of actual tracking use DaredemoVtuber.
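
The voice-activated approach is way dumber than real tracking: you just take the mic volume envelope and drive the mouth-open parameter with it. A minimal sketch using the sounddevice and numpy packages (the threshold and gain are made-up numbers you would tune by ear):

import numpy as np
import sounddevice as sd

MOUTH_GAIN = 8.0     # made-up scaling, tune by ear
NOISE_FLOOR = 0.01   # ignore quiet background hiss

def audio_callback(indata, frames, time, status):
    # RMS volume of this audio block, mapped straight onto a 0..1 mouth-open value
    rms = float(np.sqrt(np.mean(indata ** 2)))
    mouth_open = 0.0 if rms < NOISE_FLOOR else min(1.0, rms * MOUTH_GAIN)
    print(f"mouth_open = {mouth_open:.2f}")   # a real app would push this to the model

# 16 kHz mono mic input in ~50 ms blocks
with sd.InputStream(channels=1, samplerate=16000, blocksize=800,
                    callback=audio_callback):
    sd.sleep(5000)   # listen for five seconds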

The tech-inclined Vtubers, however... god knows how many ways they do it. Some home-made solutions I know of: NoraCat used self-programmed tracking built on a Microsoft Kinect, and the Omega Sisters use VR equipment to capture body movement and cameras for face tracking, with Unity or Unreal 4 rendering everything in real time.
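
Home-made Kinect tracking like that lives or dies on filtering, because raw Kinect joints jitter like crazy. The usual trick is to exponentially smooth every joint before retargeting it onto the model's bones; a tiny illustration of that (not NoraCat's actual code, obviously):

# Exponential smoothing of noisy joint positions - the simplest fix for Kinect jitter.
# (Illustrative only; NoraCat's actual pipeline isn't public.)

def smooth_joints(raw_frame, prev_frame, alpha=0.3):
    """Blend each new (x, y, z) joint position with the previous smoothed one."""
    if prev_frame is None:
        return dict(raw_frame)
    return {
        joint: tuple(alpha * new + (1 - alpha) * old
                     for new, old in zip(pos, prev_frame[joint]))
        for joint, pos in raw_frame.items()
    }

# Two noisy frames of one joint from some skeleton tracker
frame1 = {"hand_right": (0.52, 1.10, 2.01)}
frame2 = {"hand_right": (0.58, 1.02, 1.95)}

smoothed = smooth_joints(frame2, smooth_joints(frame1, None))
print(smoothed)   # lands between the raw positions, so the avatar's hand stops shaking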

>> No.770374
File: 45 KB, 680x451, images (26).jpg

>>769661
Mocap setups range from an extremely basic rig with a Kinect, Leap Motion and so on up to complex systems built around Xsens and other proper mocap gear. The more complex systems can accurately track multiple people dancing, while the basic rigs will look terrible.

https://youtu.be/FXZ4mDJovzM
This is likely the best mocap I've seen besides Kizuna AI's.
https://youtu.be/vd_8sx-7XrQ
This would be an example of someone making the most of bad mocap. Very SOVL.

>> No.770506

>>770374
Jesus, that was a godly mocap. Pity the face tracking wasn't as good.

>> No.772500

>>770506
I don't think there was any face tracking. It was either generic movement based on voice tones, or, more likely, it was pasted in after the fact.

>> No.772790

Weta studios vtuber live when?

>> No.774512

>>772790
we need that ILM vtuber model first
