Generate Character and Environment Textures for 3D Renders using Stable Diffusion | Studio Sessions

In the third event of our live stream series, Invoke Studio Sessions, Kent covers how to use various techniques and AI tools like node-based workflows, depth mapping, ControlNet, and Blender exports to generate textures and apply them to environment and character renders.

00:00 Introduction
01:54 Starting with 3D Models from Blender
15:12 First attempt, then exploring Depth Anything
18:55 Second attempt
20:48 Using a 3D Model of a Character
30:00 Creating a custom node-based workflow
54:00 Testing the workflow
56:22 Seamless tiling
57:10 Closing

Invoke Studio Sessions are live events recorded regularly on Invoke’s Discord channel. We take a design challenge from our community and spend an hour working through it while community members follow along, ask questions, and design with us.

Join our community: 👉 Get started free at , and check out our open-source, locally hosted Community Edition.
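The seamless-tiling step covered near the end of the session can also be sanity-checked outside of any particular tool. A minimal sketch (assuming only NumPy, not part of the stream's workflow) that wraps a texture by half its width and height, so the original edges meet in the center of the image, where any visible seam is easy to spot:

```python
import numpy as np

def wrap_texture(texture: np.ndarray) -> np.ndarray:
    """Shift a texture by half its height and width so the original
    edges meet in the center of the image. Visible lines there mean
    the texture does not tile seamlessly."""
    h, w = texture.shape[:2]
    return np.roll(texture, shift=(h // 2, w // 2), axis=(0, 1))

# A horizontally repeating toy pattern: every row is [0, 1, 2, 3].
tile = np.tile(np.arange(4), (4, 1))
wrapped = wrap_texture(tile)
print(wrapped[0].tolist())  # -> [2, 3, 0, 1]
```

Viewing the wrapped image (for example with an image viewer after saving it) is a quick manual check before applying a generated texture to a 3D model.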