
/3/ - 3DCG



File: 2.09 MB, 1541x867, Screen Shot 2020-10-22 at 4.25.51 PM.png
No.777702

For sharing 3D animated music videos

>> No.777703

I don't know very many, but I can start. The screenshot is from a video by Sage Suede:
https://youtu.be/tK6Zw56cz0Y

>> No.777705
File: 827 KB, 1520x874, Screen Shot 2020-10-22 at 4.29.27 PM.png

J Balvin definitely has a few on colors too. Who else?
https://youtu.be/IcBmnstOR64

>> No.777766
File: 2.98 MB, 3840x2160, 4477.jpg

>>777702
Been working on one for a musician for the past few months. Posted it a few times in the wip thread.
Doing the final 4k renders for it now, release date is November 4th.

It was originally just a lyric video, but it's kind of taken on a life of its own and turned into more of a dedicated music video, though the lyrics are still part of it.

>> No.777902

Ooh, that sounds cool. I'm definitely curious to see it.

>> No.778138
File: 358 KB, 1920x1080, spectroUV.jpg

So I wanted to use spectral data from the song so my visualizer would reflect real data from the audio. I used a greyscale version for the displacement and took the coloration from the spectrograph. I animated the UV using AnimAll, but the timing was off. You could probably get away with it if you could snap to pixels in the UV, and if you had a 1:1 time/pixel mapping with the spectrograph. I used a baked f-curve for the icosphere in the middle.

Animation nodes has a system built explicitly for this kinda stuff, though. I strongly recommend it.

>> No.778171

>>778138
>Timing was off, though.
I hope your frame rate was a factor of the BPM.
If not, things can drift out of sync really easily.
For example, if your song is 120bpm, you'd want to aim for 24, 30, 48 or 60fps. A tempo like 140bpm won't divide nicely into normal frame rates though, so you end up with something like 28 or 35fps. In that case, you're better off baking things to a curve from the song itself to keep things in sync, so you can use whatever frame rate you want.
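The arithmetic behind that can be sketched quickly (a hypothetical helper, nothing to do with Blender's API): a frame rate "fits" a tempo when frames-per-beat comes out to a whole number.

```python
# Hypothetical helper: check whether a frame rate lines up with a tempo.
# Frames per beat = fps * 60 / bpm; syncing is painless when that's a whole number.

def frames_per_beat(fps, bpm):
    return fps * 60 / bpm

def is_synced(fps, bpm):
    # True when every beat lands exactly on a frame boundary
    return frames_per_beat(fps, bpm) == int(frames_per_beat(fps, bpm))

# 120bpm divides cleanly into all the common frame rates...
print([fps for fps in (24, 30, 48, 60) if is_synced(fps, 120)])
# ...but 140bpm only fits oddballs like 28 or 35fps
print([fps for fps in (24, 28, 30, 35, 60) if is_synced(fps, 140)])
```

That's why the post suggests 28 or 35fps for 140bpm: they're the only small frame rates where a beat is a whole number of frames.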

I take stems of the song (each instrument track soloed out) and bake the sound to an f-curve. Then I just limit its values to the 0-1 range and use it as a driver for other elements. Keeping it at 0-1 means I can multiply it by exact values in the driver to set the max for whatever I need.
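The bake-and-clamp idea above can be sketched in plain Python (the function names and sample values here are made up; in Blender this would be a baked f-curve feeding a driver expression):

```python
# Hypothetical sketch: take amplitude samples from a stem, normalize them to
# the 0-1 range, then scale by whatever max the driven property needs.

def normalize(samples):
    # rescale so the quietest sample maps to 0 and the loudest to 1
    lo, hi = min(samples), max(samples)
    return [(s - lo) / (hi - lo) for s in samples]

def drive(samples, max_value):
    # analogous to a driver expression like: var * max_value
    return [s * max_value for s in normalize(samples)]

kick = [0.0, 0.8, 0.1, 1.6, 0.4]   # made-up per-frame amplitudes from a stem
print(drive(kick, 2.5))            # e.g. scale a lamp's energy up to 2.5
```

Because the baked curve stays in 0-1, swapping the multiplier is the only change needed to retarget the same stem at a different property.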

>Animation nodes has a system built explicitly for this kinda stuff, though. I strongly recommend it.
Yeah, I need to get on that too. I've just been putting it off for the longest time since it seems like a real hassle to learn, and it's been so long since I've actually had to sit down and learn something from scratch that I feel like I've forgotten how (if that makes sense).

>> No.778173
File: 122 KB, 769x1024, 6o1JZ69.jpg

>>777702
https://youtu.be/fhlz0-lCahM

>> No.778174
File: 82 KB, 750x573, 89cd958eb6be37bf79fa59ca21aa2f90.jpg

>>778171
https://youtu.be/hQ49zz49MxM

>> No.778256

>>778171
I think the timing issue comes down to an 'interpolation' error. The spectrograph I generated is 1280px wide and the song is 5:05.568 (305.568s), which means I need to scroll at 4.1889px/s. That was the basis I was working from initially, but I got better results from just translating the UV on the x axis over the length of the song. Something still went awry, though: it could be my UV translation not originating at 0 on the x axis or not ending at 1280, or it might just be the viewport lagging, since the framerate is around 19. It could also be what you're explaining, but I can't even begin to figure out the BPM.
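The scroll-rate figure in the post checks out; as a sanity-check sketch:

```python
# A 1280px-wide spectrograph panned over a 305.568s song has to scroll at
# width / duration pixels per second.

def scroll_rate(width_px, duration_s):
    return width_px / duration_s

duration = 5 * 60 + 5.568                      # 5:05.568 in seconds
print(round(scroll_rate(1280, duration), 4))   # ≈ 4.1889
```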

I'm working with Animation Nodes now; it's a far better method, way more extensible.

>> No.778260

>>778256
Oh, I never thought about scrolling a spectrograph like that. Are you doing it in single-pixel slices and then sliding it at the 4px/s?
It's probably too late now, but if you do the slide with stepped interpolation instead of linear, you might be able to find a step rate that syncs up per pixel. That way you should only need to align the UV to a pixel once, and the stepped interpolation should naturally keep it aligned.
Pair that up with a BPM-friendly frame rate and it might fix your syncing issues.
Also, try to end things on whole numbers if you can. If a song fades out, sometimes it's not worth using the actual length of the audio file; instead, find a spot to end things manually.
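The stepping idea amounts to quantizing the UV offset to whole pixels, so every frame shows an exact spectrograph column instead of a blend of two. A rough sketch (names are illustrative, not Blender API):

```python
import math

def uv_offset_px(time_s, px_per_s):
    # stepped rather than linear: snap down to the last whole pixel column
    return math.floor(time_s * px_per_s)

# at ~4.19 px/s the offset only changes when a new pixel column is due,
# so the texture never sits between two columns
print([uv_offset_px(t, 4.1889) for t in (0.0, 0.1, 0.24, 0.25, 0.5)])
```

With linear interpolation the offset at t=0.1s would be a fractional 0.419px; flooring it keeps the alignment exact once the UV is snapped to a pixel a single time.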

As far as finding the BPM, is it a crazy song with different time signatures that you can't tap out? If you can tap it out, I'd just pull up an online metronome or something and try to sync it, starting with common BPMs like 120, 135, 140, 160. Generally if it's got a dance beat it's around 130-140.
The lag could be causing problems too. If you can have your viewport drop frames to keep playback speed, that might be a way to keep things performant.
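The tap-it-out approach is just averaging the intervals between taps and converting to beats per minute; a purely illustrative sketch:

```python
# Rough BPM-from-taps: record tap timestamps in seconds, average the gaps,
# and convert to beats per minute (60s / mean interval).

def bpm_from_taps(taps):
    intervals = [b - a for a, b in zip(taps, taps[1:])]
    return 60 / (sum(intervals) / len(intervals))

# taps roughly every 0.46s -> about 130 bpm, in the dance-beat range
taps = [0.0, 0.46, 0.92, 1.38, 1.84]
print(round(bpm_from_taps(taps)))
```

More taps average out human jitter, which is why tapping along for a while beats timing a single beat.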

>> No.778263

>>778260
It's not good, I have 3k layers in my dope sheet.

I just /tried/ to isolate a single pixel, but Blender's UV editor lets you zoom in and select below a pixel, so I don't know that it's actually 1px. Then I used linear interpolation from "1" to "1280". I switched to constant interpolation just now and it seems to default to more appropriate values, but I also lowered the displace multiplier from 16 down to 1.9; I think it was introducing distortions that made it less readable. It seems like the response is just too long and there's a hair's breadth of hang time. No biggie though, it kinda worked, neat experiment.

https://www.youtube.com/watch?v=E_kA0gP0VlU

>> No.778368
File: 48 KB, 747x265, Image 001.jpg

>>778263
Yeah I did my best to tap it out, and got somewhere around 130 for the tempo.
You weren't kidding about it being difficult to figure out though.
As far as zooming into a pixel and snapping to it, looks like there's a "snap to pixel" option in the UV menu that I overlooked.
For the stepping, I'd keep the curve linear and add a modifier to it (like you would a noise modifier), but use the "Stepped Interpolation" modifier instead. That way you can edit the decay of the steps.

Then again, you'd somehow need to get the spectrograph at a large enough resolution to last the entire song. Or maybe not. At this point I'm just thinking about what could work, not things that actually would.

>> No.778937
File: 1.59 MB, 1080x1920, spec.png

Made a spectrograph + shader with Animation Nodes. It's way better than the UV pan-and-displace method.

>> No.778938
File: 140 KB, 1920x1920, chaldi.jpg

>>778937
Also made a "Chladni plate", but the pattern really isn't even at all.

>> No.778983

>>778937
Neato. What's it look like in action?

>> No.779071
File: 986 KB, 324x576, output.webm

>>778983
The encode is shot, but it matches up super well with the song.

>> No.779200

>>779071
That's pretty cool actually.
Did you just dive into AN? Seems like it didn't take you very long to figure it out. Might be easier to learn than I thought.

>> No.779253
File: 204 KB, 1682x824, shader.png

>>779200
I just copy-pasted Chris P's node group. I didn't do the second slice that he did, since my song of choice is bass-heavy. I don't find it particularly intuitive to work with, but I suppose it'll come with time.

https://www.youtube.com/user/chrisprenn
https://www.youtube.com/watch?v=8PkdH_GXpQE