Dev Guide: AI for generating shaders

deepglugs · September 25, 2021

Guide Level: Advanced

AI can add uniqueness and depth to your game, from visuals to animation to voice and sound. One newer way AI can help is by generating textures and bump/normal maps. This guide will walk you through using an AI project called VQGAN+CLIP to generate unique images that can serve as shader textures or bump/normal maps for Daz or Blender. We will focus on Blender here, but the concepts should adapt to Daz and other 3D software platforms. We will use Linux commands to generate the images; Windows users can instead run one of the web-based Colab notebooks. You will also need a CUDA-compatible video card to generate the images locally.
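Before going further, it's worth confirming that PyTorch (which VQGAN+CLIP runs on) can actually see your GPU. A quick check, assuming you already have PyTorch installed:

    import torch

    # Prints True plus your GPU's name if a usable CUDA device is present.
    print(torch.cuda.is_available())
    if torch.cuda.is_available():
        print(torch.cuda.get_device_name(0))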

VQGAN+CLIP

VQGAN+CLIP is one of the latest methods for generating images. It takes an input, usually text, and generates a unique image that matches that text. It combines two models: VQGAN, from the Taming Transformers project, and CLIP, from OpenAI. In a nutshell, VQGAN generates candidate images while CLIP scores how well each one matches the text you supply, called a “prompt”, and steers the generation toward it. This refinement happens over many steps called “iterations”.

Prompt: A woman with black hair
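Under the hood, the loop looks roughly like the sketch below. The tiny linear layers and random prompt embedding are stand-ins I've made up so the snippet runs on its own; the real project uses VQGAN's decoder and CLIP's image/text encoders in their place.

    import torch
    import torch.nn.functional as F

    # Stand-ins: the real project uses VQGAN's decoder and CLIP's encoders here.
    decode = torch.nn.Linear(256, 3 * 64 * 64)        # latent -> image pixels
    encode_image = torch.nn.Linear(3 * 64 * 64, 512)  # image -> embedding
    prompt_embedding = torch.randn(1, 512)            # stand-in for CLIP's text encoding

    z = torch.randn(1, 256, requires_grad=True)       # the latent we optimize
    opt = torch.optim.Adam([z], lr=0.05)

    for step in range(100):                           # the "iterations"
        image = decode(z)                             # VQGAN role: latent -> image
        img_emb = encode_image(image)                 # CLIP role: image -> embedding
        # CLIP guidance: pull the image embedding toward the prompt embedding.
        loss = -F.cosine_similarity(img_emb, prompt_embedding).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()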

The VQGAN-CLIP project is open source so you can clone it from GitHub here:

git clone https://github.com/nerdyrodent/VQGAN-CLIP.git

Open that link in a browser and follow the setup instructions in the README.md file. Setup involves cloning a couple of other repositories, including CLIP, and downloading a model checkpoint (probably the ImageNet one). Once you have everything set up, you can start generating images to use as assets in your 3D modeling.

Generating Textures

The cool (and hard) part about this method is that it takes free text as a prompt. Your imagination is the limit, but it can take multiple prompt attempts to get what you want. Here’s a simple example of generating a “reptile scale” texture:

python3 generate.py -p "a close-up photo of reptile scales"

Now we can drop the result into Blender and set up our shader to use the texture:
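If you prefer scripting over clicking, here's a minimal sketch of that node setup using Blender's Python API (bpy); the material name and image path are placeholders:

    import bpy

    mat = bpy.data.materials.new(name="ReptileScales")
    mat.use_nodes = True
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links

    # Placeholder path: wherever generate.py saved its output image.
    tex = nodes.new("ShaderNodeTexImage")
    tex.image = bpy.data.images.load("/path/to/output.png")

    # "Principled BSDF" is created automatically when use_nodes is enabled.
    links.new(tex.outputs["Color"], nodes["Principled BSDF"].inputs["Base Color"])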

We can use the same texture with a color ramp to make a normal/bump map:
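Scripted, that extra step might look like this, continuing from the sketch above (node names are the defaults Blender assigns):

    import bpy

    mat = bpy.data.materials["ReptileScales"]    # the material from the sketch above
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links

    tex = nodes["Image Texture"]                 # the texture node added earlier
    ramp = nodes.new("ShaderNodeValToRGB")       # the Color Ramp node
    bump = nodes.new("ShaderNodeBump")
    bump.inputs["Strength"].default_value = 0.4  # arbitrary starting strength

    # Brightness from the texture drives surface relief via the Bump node.
    links.new(tex.outputs["Color"], ramp.inputs["Fac"])
    links.new(ramp.outputs["Color"], bump.inputs["Height"])
    links.new(bump.outputs["Normal"], nodes["Principled BSDF"].inputs["Normal"])

Tightening the Color Ramp's black and white stops increases the contrast of the height signal, which sharpens the bump effect.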

Let’s try to generate a fabric texture for a character’s skirt:

python3 generate.py -p "red and black plaid fabric texture"

After hooking it up and playing with the scale and location of the UV mapping, it’s not terrible:
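That scale-and-location fiddling can be scripted too, with Texture Coordinate and Mapping nodes. A sketch, assuming a material named “PlaidSkirt” that already has an Image Texture node (both placeholders):

    import bpy

    mat = bpy.data.materials["PlaidSkirt"]       # placeholder material name
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links

    coord = nodes.new("ShaderNodeTexCoord")
    mapping = nodes.new("ShaderNodeMapping")
    mapping.inputs["Scale"].default_value = (4.0, 4.0, 1.0)     # tile 4x across the UVs
    mapping.inputs["Location"].default_value = (0.1, 0.0, 0.0)  # nudge the pattern over

    links.new(coord.outputs["UV"], mapping.inputs["Vector"])
    links.new(mapping.outputs["Vector"], nodes["Image Texture"].inputs["Vector"])

The Scale socket tiles the texture across the UVs and Location slides it, matching what you'd do by dragging those values on the Mapping node in the shader editor.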

Bump/Normal Maps

By adding the text “a black and white photo of” to our prompt, we can generate images that work well as bump/normal map additions. I have a dungeon scene that I want to enhance with some demonic runes on the floor stones.

python3 generate.py -p "a black and white photo of demonic runes"

runes.png

If we import the texture and mix it with our existing stone floor shader, we turn this:

Into this:

It looks as if the runes from the generated images are etched into the flooring.
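Here's a rough bpy sketch of that mix, assuming an existing floor material named “StoneFloor” that uses a Principled BSDF (both placeholders) and using a MixRGB node to blend the runes into the bump height:

    import bpy

    mat = bpy.data.materials["StoneFloor"]  # placeholder: the existing floor material
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links

    runes = nodes.new("ShaderNodeTexImage")
    runes.image = bpy.data.images.load("/path/to/runes.png")

    mix = nodes.new("ShaderNodeMixRGB")
    mix.blend_type = 'MULTIPLY'             # dark runes cut into the height
    mix.inputs["Fac"].default_value = 0.8

    bump = nodes.new("ShaderNodeBump")
    links.new(runes.outputs["Color"], mix.inputs["Color2"])
    # Wire the floor's existing height texture into mix.inputs["Color1"], then
    # drive the shader's normal from the combined result:
    links.new(mix.outputs["Color"], bump.inputs["Height"])
    links.new(bump.outputs["Normal"], nodes["Principled BSDF"].inputs["Normal"])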

Conclusion

The sky is the limit when it comes to using AI-generated images to enhance your work. I only spent a few minutes on the above examples but with a bit more work, you can make something truly unique and interesting. What are other applications for AI-generated textures? Post your ideas below and I may try them out.
