
/vt/ - Virtual Youtubers

>> No.48012030
File: 1.47 MB, 2048x512, catbox_wpd0fp.png

This last image shows the 4 checkboxes. I had all of them turned on for the other tests, and turned one of them off for each of these tests. From left to right:

>Use Random Perturbation unchecked
No noticeable effect.
>Merge Attention
If I toggle this off, I lose practically all of the it/s gains that token merging provides.

>Merge Cross Attention
If I turn this off, I lose a small amount of speed (~0.3-0.5 it/s), with nothing noticeable happening in the image.

>Merge MLP Layers
Basically the same as above.
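A rough cost model (my assumption, not a benchmark) explains why Merge Attention dominates the gains: self-attention cost grows with the square of the token count, so merging away a fraction r of the tokens scales it by (1-r)^2, while the MLP and cross-attention terms scale roughly linearly in the merged tokens.

```python
# Sketch of why merge_attn matters most: self-attention is ~O(N^2) in
# token count, so removing a fraction `ratio` of tokens shrinks its
# cost quadratically, while per-token (MLP-like) work shrinks linearly.
def attn_cost_factor(ratio):
    """Fraction of original self-attention work left after merging."""
    return (1 - ratio) ** 2

def mlp_cost_factor(ratio):
    """Fraction of original per-token (MLP) work left after merging."""
    return 1 - ratio

# At the 0.7 ratio used in these tests:
print(round(attn_cost_factor(0.7), 2))  # 0.09 -> attention does ~9% of the work
print(round(mlp_cost_factor(0.7), 2))   # 0.3  -> MLP still does ~30%
```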

In summary, from my own testing with token merging, I'll be using these settings:
Token Merging Ratio: 0.7
Max Downsample: 1

Use Random Perturbation, Merge Attention, Merge Cross Attention, and Merge MLP Layers all toggled on.
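If you're applying this outside the webui, the same settings map onto the tomesd library's apply_patch call. The keyword names below follow the tomesd README; how the webui checkboxes map onto them is my assumption, so treat this as a sketch:

```python
# Settings from the tests above, expressed as tomesd keyword arguments.
# The checkbox-to-keyword mapping is an assumption on my part.
tome_settings = dict(
    ratio=0.7,             # Token Merging Ratio
    max_downsample=1,      # Max Downsample
    use_rand=True,         # Use Random Perturbation
    merge_attn=True,       # Merge Attention (where nearly all the it/s gain is)
    merge_crossattn=True,  # Merge Cross Attention (~0.3-0.5 it/s)
    merge_mlp=True,        # Merge MLP Layers (similar small gain)
)

# With `pip install tomesd` and a loaded Stable Diffusion model, the
# patch would be applied roughly like this:
# import tomesd
# tomesd.apply_patch(model, **tome_settings)
```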
