-
I'd be interested in something like this as well, although I might explore building it programmatically: given two workflows, combine them into one automatically.
There would be a lot of assumptions, but as long as the author of the workflows (and of the tool combining them) was aware of them, it might be doable. If I end up building something that can do this, I'll share it with you! Most posts I've found online just say copy-paste, but if you ever need to update or modify the workflows separately, you then have to redo the copy-pasting in every combined workflow, which is a pain to manage.
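For what it's worth, here is a minimal sketch of what that combining tool might look like, assuming both workflows are exported with "Save (API Format)" so each file is a plain node-id → node dict. The file names are placeholders, and it deliberately does nothing clever: it only remaps node IDs so the two graphs can coexist in one prompt, and leaves any cross-wiring between them to the caller.

```python
import json

def merge_api_workflows(path_a, path_b, out_path):
    """Naively merge two ComfyUI API-format workflow JSONs into one prompt.

    Node IDs from workflow B are offset so they don't collide with workflow A.
    Wiring the graphs together (e.g. feeding A's output image into B's input)
    is left to the caller.
    """
    with open(path_a) as f:
        a = json.load(f)
    with open(path_b) as f:
        b = json.load(f)

    offset = max(int(k) for k in a) + 1
    remap = {old: str(int(old) + offset) for old in b}

    merged = dict(a)
    for old_id, node in b.items():
        node = json.loads(json.dumps(node))  # cheap deep copy
        for name, value in node.get("inputs", {}).items():
            # Links look like ["<node_id>", <output_index>]; rewrite the id.
            if isinstance(value, list) and len(value) == 2 and str(value[0]) in remap:
                node["inputs"][name] = [remap[str(value[0])], value[1]]
        merged[remap[old_id]] = node

    with open(out_path, "w") as f:
        json.dump(merged, f, indent=2)

# Hypothetical file names, just for illustration:
merge_api_workflows("turbo_compo.json", "detailer.json", "combined.json")
```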
-
Got a question for you experts. I've been messing around and I know my way around, but I keep hitting a problem of scale: what are the best solutions today for reconciling two types of workflow that feed into each other? So far it seems like we have to manually manage chosen images and import them across separate workflows.
So in the course of generating images:
First, we want to use a high-speed workflow (SDXL Turbo & LCM, or 1.5 with an LCM LoRA) to generate many images at multiple images per second.
Then, I would like some sort of picker node where I can click any outputs I like in the gallery and spend more compute on them: detailer flows, or maybe an img2img pass through a non-LCM checkpoint, then optionally detailers again, optionally upscalers, etc.
Basically we have separate workflows going on. There's the initial composition-bashing step (mode A: rapid generation), where we can generate hundreds of outputs per minute in large batches.
Then there are detailed, potentially extremely complex flows that work off the compositions we found and liked as starting points. These (mode B: detailed rendering) flows take much longer to run.
The conflict is that I have not seen an effective way to switch a Comfy workflow from mode A into mode B.
And once I've screwed around and gotten a good result, I then need to switch back from mode B to mode A.
Now that I have sketched out how this would work in the ideal case, maybe I'm blowing the manual overhead out of proportion. But switching back and forth really does feel tedious, and I'm starting to think the behavior of the Generate button, and how it manipulates the seed, needs some work. Basically, I think we need the notion that some nodes (such as the hypothetical gallery image picker I described above) can auto-trigger execution of their downstream nodes when manipulated. I'm not sure, but I suspect this isn't possible today because the Generate button must be clicked before any node evaluation proceeds.
This way, we could keep the Generate button dedicated to compo bashing, while the magic gallery image picker becomes the entry point for the detailer flows hanging off it. It would be a custom node: an image picker that also carries its own generate button applying only to the flows downstream of it.
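To make that concrete, a bare-bones sketch of the Python side of such a node might look like the following. Everything here is made up for illustration (the name GalleryImagePicker, the category, the plain string input), and the actual clickable gallery plus per-node generate button would need a JavaScript front-end extension on top, which this doesn't cover.

```python
import numpy as np
import torch
from PIL import Image

class GalleryImagePicker:
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                # In the imagined node the UI would fill this in when you
                # click a thumbnail; here it's just a plain file path.
                "picked_image": ("STRING", {"default": ""}),
            }
        }

    RETURN_TYPES = ("IMAGE",)
    FUNCTION = "load_picked"
    CATEGORY = "image/pickers"

    def load_picked(self, picked_image):
        img = Image.open(picked_image).convert("RGB")
        arr = np.asarray(img).astype(np.float32) / 255.0
        # ComfyUI IMAGE tensors are batched float tensors: [B, H, W, C]
        return (torch.from_numpy(arr)[None, ...],)

NODE_CLASS_MAPPINGS = {"GalleryImagePicker": GalleryImagePicker}
NODE_DISPLAY_NAME_MAPPINGS = {"GalleryImagePicker": "Gallery Image Picker"}
```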
The alternative to all of this would be to keep them as separate workflows instead of trying to do it all in one. I just don't know how to make that efficient, but the idea is to make larger compo-bashing batches, choose the ones we want to enhance further using an out-of-band tool, and then, by script or by hand, load the chosen ones as inputs to a completely separate workflow (a rough sketch of that follows below). It's probably the more straightforward path to efficiency, but I think it requires far more discipline than I'll have: I'm going to be too curious about what I can achieve with some gens as soon as I see them. Since it's such a creative process, the order of operations matters too much here to dismiss out of hand.
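As a rough illustration of that out-of-band route (not something I've built), a script could patch the LoadImage node in the saved mode-B workflow and queue it once per chosen image over ComfyUI's HTTP API. The node id, file names, and server address below are assumptions; it also assumes the picked files already sit in ComfyUI's input folder and the detail workflow was exported via "Save (API Format)".

```python
import json
import urllib.request

COMFY_URL = "http://127.0.0.1:8188/prompt"      # default local ComfyUI server
LOAD_IMAGE_NODE_ID = "10"                       # id of the LoadImage node in detailer.json
PICKED = ["compo_0042.png", "compo_0117.png"]   # chosen mode-A outputs

with open("detailer.json") as f:
    prompt = json.load(f)

for filename in PICKED:
    # Point the detail workflow's LoadImage node at this picked image.
    prompt[LOAD_IMAGE_NODE_ID]["inputs"]["image"] = filename
    req = urllib.request.Request(
        COMFY_URL,
        data=json.dumps({"prompt": prompt}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(filename, "->", resp.read().decode("utf-8"))
```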
Another alternative is to have automations or custom nodes that can collapse all those mode-switching steps down into one trigger somehow. So we could have special noodles, or special pairs of nodes, that effectively implement "automated noodles".