
/vt/ - Virtual Youtubers


>> No.40208124
File: 445 KB, 640x640, {{{{ZIP folder icon}}}}, an extremely and ahoy, houshou marine, {{{spoken exclam s-1130796852.png

novelAI's clean.py errors out:

Traceback (most recent call last):
  File "R:\ckpt\clean.py", line 13, in <module>
    dict_keys = list(model_dict["state_dict"].keys())
KeyError: 'state_dict'

prune.py does too, same error:

Traceback (most recent call last):
  File "R:\ckpt\prune.py", line 58, in <module>
    prune_it(file)
  File "R:\ckpt\prune.py", line 25, in prune_it
    sd = nsd["state_dict"].copy()
KeyError: 'state_dict'

a lazy google search led to this bit of code...
https://github.com/d8ahazard/sd_dreambooth_extension/issues/9#issuecomment-1311197488

I used it to write a "pre-prune" script: borrowed prune.py's imports and file-handler arguments, then pasted in said code block, which...I think does nothing more than write the weights back under a "state_dict" key for any dict that's missing one? grug does not understand how the magic picture box works. grug doesn't understand python either. grug just grugs
https://pastebin.com/w1bdwXe1
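for anyone who wants the gist without the pastebin: as far as I can tell (this is my sketch of the idea, not the exact pastebin contents — the function name is mine), the whole fix is that some merged checkpoints get saved as a bare weight dict, while clean.py/prune.py expect the weights nested under a "state_dict" key. so you just wrap it:

```python
def ensure_state_dict(ckpt):
    """Wrap a bare checkpoint dict under a 'state_dict' key if it isn't already.

    clean.py/prune.py index model_dict["state_dict"], so a checkpoint saved as
    a bare weight dict raises KeyError: 'state_dict' until it's wrapped.
    """
    if "state_dict" in ckpt:
        return ckpt                    # already in the layout the scripts expect
    return {"state_dict": ckpt}        # wrap the bare weights
```

in the actual script you'd load and re-save the .ckpt around that, something like `torch.save(ensure_state_dict(torch.load(path, map_location="cpu")), out_path)`.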

and then prune.py...
saving pruned checkpoint at: abyss-hll-30-70-wsum-pre-pruned-ema-pruned.ckpt
New ckpt size: 2.55 GB. Saved 3.44 GB by removing optimizer states and non-EMA weights
uh. holy shit?

novelaileak's clean.py goes even fucking harder? output was <2 GB, kek

I didn't know if non-EMA weights are needed or not, or if they'd already been tossed out of e.g. animesfw-final-pruned.ckpt. so I tried it again with line 11 changed to read:
def prune_it(p, keep_only_ema=False)
and got a 5.98 GB file at the other end with an identical file checksum (not model hash). so pruning is tossing whatever the hell non-EMA weights are to save all that space

if anyone wants to test them out...

https://drive.google.com/file/d/1JDATXnV-dMEE4Gny0ccqLg-mIVKl2SI-/view?usp=sharing
abyss-hll-30-70-wsum-nai-clean-py.ckpt - 1.99 GB
https://drive.google.com/file/d/1zrQ3289Bkc9sDH4XIyrOL2MwXrcXZEnk/view?usp=share_link
abyss-hll-30-70-wsum-ema-pruned.ckpt - 2.37 GB

do not consider these links permanent. if one or both of them actually produces the same images as the full 6 GB abyss-hll-30-70-wsum.safetensors, I'll convert whichever one is smaller and still works back to .safetensors and upload it to pixeldrain or something
