Announcing #stablediffusion #ControlNet: TemporalNet 2.0 (+ TemporalKit 1.5 with TemporalNet support).
TemporalNet 2.0 builds on the original TemporalNet concept: instead of providing only the last frame, you now provide the last frame *and* an optical flow map.
huggingface.co/CiaraRowles/Te
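For illustration only (this is not TemporalNet's actual preprocessor): an optical flow map encodes per-pixel motion between two frames as a color image. A minimal pure-Python sketch of the common visualization scheme, direction mapped to hue and magnitude to brightness, where `flow_to_rgb` and `flow_map` are hypothetical helper names:

```python
import math
import colorsys

def flow_to_rgb(dx, dy, max_mag=1.0):
    """Map one flow vector to an RGB pixel: direction -> hue,
    magnitude -> brightness. Zero motion renders as black."""
    hue = (math.atan2(dy, dx) + math.pi) / (2 * math.pi)  # normalize angle to 0..1
    val = min(math.hypot(dx, dy) / max_mag, 1.0)          # clamp magnitude to 0..1
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, val)
    return (round(r * 255), round(g * 255), round(b * 255))

def flow_map(flow):
    """Convert a 2-D grid of (dx, dy) vectors into an RGB image
    (a list of rows of pixel tuples), scaled by the peak magnitude."""
    peak = max((math.hypot(dx, dy) for row in flow for dx, dy in row), default=1.0) or 1.0
    return [[flow_to_rgb(dx, dy, peak) for dx, dy in row] for row in flow]
```

In practice the flow field itself would come from an optical flow estimator (e.g. OpenCV's Farneback method); this only shows the encoding step that turns motion into the map image.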
In order to install this with the webui, you'll need to switch to this branch of ControlNet: github.com/CiaraStrawberr
and either use the Hugging Face temporalvideo.py script, or use TemporalKit 1.5, found here:
github.com/CiaraStrawberr
it's wip, expect to have to modify...
TemporalNetRun.py, it's wip.
And make sure to put the ckpt file in your models folder.
I'll be updating the repo with the dif safetensors file over the next few days.
None of the examples above were done with EbSynth: just 10 fps renders, interpolated with Flowframes, to show the benefits of TemporalNet. You could get much better results using EbSynth and other tricks.
Super cool! You mentioned one could get better results with EbSynth. Would it make sense to combine these, or are they redundant?
They should hopefully work well together! The maximum consistency would probably be reached by combining EbSynth + TemporalNet + TemporalKit plate generation + reference-only mode; however, I ran into some issues with reference-only mode and TemporalKit, so it's hard to tell.
I'm really confused about how to use TemporalKit. Can you give a bit of information about sides / frames per frame / batch settings / EbSynth... briefly, if you have time?
Hi, I've kinda been putting off doing a full up-to-date tutorial, but you can find some on YouTube. In terms of each term:
sides: the number of images along each edge of a generated plate, so 2 would be a 2x2 grid of 4 images per plate.
frames per frame would be the number of interpolated/EbSynth-ed frames per SD...
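To make the arithmetic above concrete, here's a hypothetical helper (not part of TemporalKit itself), under the assumption that each plate is a sides x sides grid and each SD frame expands into frames-per-frame output frames:

```python
import math

def temporalkit_counts(total_frames, sides, frames_per_frame):
    """Rough accounting sketch for TemporalKit-style plate generation.
    Assumption: one plate is a sides x sides grid, so each SD generation
    covers sides**2 source keyframes; each SD keyframe is then expanded
    into frames_per_frame interpolated/EbSynth-ed output frames."""
    frames_per_plate = sides * sides
    plates = math.ceil(total_frames / frames_per_plate)
    sd_frames = plates * frames_per_plate
    output_frames = sd_frames * frames_per_frame
    return plates, sd_frames, output_frames
```

For example, under these assumptions a 16-keyframe clip with sides = 2 and frames per frame = 4 would need 4 plates and yield 64 output frames.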
Not at this time, although it shouldn't be too hard; I'll have a look into it.
Pure curiosity... why do stationary things like the window (and the whole wall, really) behind the singer constantly morph? Is that an intentional setting or just how it works?
The background is moving in the video, so it's usually less morphy, but there will generally always be an element of that due to the way Stable Diffusion handles image generation. You can also use things like EbSynth, and generate the foreground and background separately.
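The "generate the foreground and background separately" trick boils down to masked compositing: keep the stylized foreground where a mask is set and the separately handled (e.g. static or EbSynth-ed) background everywhere else. A toy sketch, where `composite` is a hypothetical helper, not a real extension API:

```python
def composite(foreground, background, mask):
    """Per-pixel select: mask truthy -> foreground pixel, else background.
    All three arguments are 2-D grids (lists of rows) of the same shape."""
    return [
        [fg if m else bg for fg, bg, m in zip(f_row, b_row, m_row)]
        for f_row, b_row, m_row in zip(foreground, background, mask)
    ]
```

Because the background pixels are taken from a source that isn't re-diffused every frame, they can't morph frame to frame the way fully regenerated pixels do.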