
/vt/ - Virtual Youtubers

>> No.60701730
File: 1.95 MB, 1024x1280, 28208-3534163128-best quality, (nanashi mumei, hololive, brown hair, brown eyes, ponytail, hair feathers, feather hair ornament,_1.1), (pentagram.png

>>60700161
OFT is mostly about reducing overfitting and better preserving the base model, and it supposedly trains faster too. Rough sketch of the idea below.
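
Not the actual OFT code, just a minimal PyTorch sketch so it's clear why it preserves the base model (names are made up, and the real implementation uses block-diagonal R for efficiency): instead of adding a low-rank delta like LoRA does, you learn an orthogonal matrix R and rotate the frozen weight, W' = RW, which keeps the angles between neurons intact.

import torch
import torch.nn as nn

class OFTLinear(nn.Module):
    # Wraps a frozen Linear layer and learns an orthogonal R via the
    # Cayley transform: R = (I - Q)(I + Q)^-1 with Q skew-symmetric.
    # At init Q = 0, so R = I and the layer starts as the base model.
    def __init__(self, base: nn.Linear):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)  # base weights stay frozen
        d = base.out_features
        self.q = nn.Parameter(torch.zeros(d, d))
        self.register_buffer("eye", torch.eye(d))

    def forward(self, x):
        q = self.q - self.q.T  # enforce skew-symmetry
        r = (self.eye - q) @ torch.linalg.inv(self.eye + q)  # orthogonal
        w = r @ self.base.weight  # rotate the frozen weight, don't overwrite it
        return nn.functional.linear(x, w, self.base.bias)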

Pivotal tuning means training embeddings and the model at the same time. Some anon was talking about it a few days ago.
Sounds stupid, but there are examples of it working:
https://civitai.com/articles/2494/making-better-loras-with-pivotal-tuning
An HLL made with this method would have about 800 embeddings packed in. Not sure if it will work at that scale, but it would be interesting to try. Sketch of what one training step looks like below.
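
Hedged sketch of one pivotal-tuning step in PyTorch (all names are placeholders, not the article's code): one optimizer covers both the new token embeddings and the LoRA weights, and the embedding gradient is masked so only the newly added tokens move.

import torch

def pivotal_tuning_step(text_encoder, batch, new_token_ids, optimizer, loss_fn):
    emb = text_encoder.get_input_embeddings()  # token embedding table
    loss = loss_fn(batch)  # the usual diffusion noise-prediction loss
    optimizer.zero_grad()
    loss.backward()
    # Zero the gradient on every pretrained token so only the new
    # embeddings get updated; the rest of the vocabulary stays pinned.
    mask = torch.zeros_like(emb.weight)
    mask[new_token_ids] = 1.0
    emb.weight.grad *= mask
    optimizer.step()
    return loss.item()

# Both parameter groups go in one optimizer, embeddings usually hotter.
# weight_decay must be 0 for the embedding group, otherwise AdamW would
# decay the frozen vocabulary rows even with zeroed gradients.
# optimizer = torch.optim.AdamW([
#     {"params": [text_encoder.get_input_embeddings().weight],
#      "lr": 5e-4, "weight_decay": 0.0},
#     {"params": unet_lora_params, "lr": 1e-4},
# ])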
