Normal and bump mapping are techniques used in 3D graphics to fake additional surface detail on an object without adding any extra polygons. Whilst bump mapping uses simple greyscale images (where light areas appear raised), normal mapping, used by SceneKit, requires RGB images where the red, green and blue channels define the x, y and z components of the surface normal at each point.

Core Image includes tools that are ideal candidates to create bump maps: gradients, stripes, noise and my own Voronoi noise are examples. However, they need an additional step to convert them to normal maps for use in SceneKit. Interestingly, Model I/O includes a class that will do this, but we can take a more direct route with a custom Core Image kernel. 
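
As an aside, the Model I/O route would look something like the sketch below. This isn't the approach the demo project takes, `bumpTexture` is a hypothetical MDLTexture loaded elsewhere, and the smoothness and contrast values are arbitrary:

    import ModelIO

    // Generate a normal map directly from an existing greyscale bump
    // texture. `bumpTexture` is assumed to be an MDLTexture; the
    // smoothness and contrast values here are arbitrary.
    let normalMap = MDLNormalMapTexture(byGeneratingNormalMapWithTexture: bumpTexture,
                                        name: "normalMap",
                                        smoothness: 0.5,
                                        contrast: 1.0)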

The demonstration project for this blog post is SceneKitProceduralNormalMapping.

Creating a Source Bump Map


The source bump map is created using a CIGaussianGradient filter which is chained to a CIAffineTile:


    // A white-to-black radial gradient centred in a 100 x 100 tile
    let radialFilter = CIFilter(name: "CIGaussianGradient", withInputParameters: [
        kCIInputCenterKey: CIVector(x: 50, y: 50),
        kCIInputRadiusKey: 45,
        "inputColor0": CIColor(red: 1, green: 1, blue: 1),
        "inputColor1": CIColor(red: 0, green: 0, blue: 0)
        ])

    // Crop the gradient to a single tile, repeat it with CIAffineTile
    // (a nil parameter dictionary gives an identity transform), then
    // crop the infinite tiled result to 500 x 500
    let ciCirclesImage = radialFilter?
        .outputImage?
        .imageByCroppingToRect(CGRect(x: 0, y: 0, width: 100, height: 100))
        .imageByApplyingFilter("CIAffineTile", withInputParameters: nil)
        .imageByCroppingToRect(CGRect(x: 0, y: 0, width: 500, height: 500))

Creating a Normal Map Filter

The kernel to convert the bump map to a normal map is fairly simple: for each pixel, the kernel compares the luminance of the pixels immediately to its left and right and, for the red output channel, returns the difference of those two values added to one and divided by two, which maps a slope in the range [-1, 1] to a colour value in [0, 1]. The same is done with the pixels immediately above and below for the green channel. The blue and alpha channels of the output pixel are both set to 1.0:


        // Returns the luminance of the source pixel at the given offset
        // from the current coordinate, using Rec. 709 luma coefficients
        float lumaAtOffset(sampler source, vec2 origin, vec2 offset)
        {
            vec3 pixel = sample(source, samplerTransform(source, origin + offset)).rgb;
            float luma = dot(pixel, vec3(0.2126, 0.7152, 0.0722));
            return luma;
        }

        kernel vec4 normalMap(sampler image)
        {
            vec2 d = destCoord();

            float northLuma = lumaAtOffset(image, d, vec2(0.0, -1.0));
            float southLuma = lumaAtOffset(image, d, vec2(0.0, 1.0));
            float westLuma = lumaAtOffset(image, d, vec2(-1.0, 0.0));
            float eastLuma = lumaAtOffset(image, d, vec2(1.0, 0.0));

            float horizontalSlope = ((westLuma - eastLuma) + 1.0) * 0.5;
            float verticalSlope = ((northLuma - southLuma) + 1.0) * 0.5;

            return vec4(horizontalSlope, verticalSlope, 1.0, 1.0);
        }

Wrapping this up in a Core Image filter and bumping up the contrast returns a normal map.
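
Here's a minimal sketch of that wrapper filter. The `normalMapKernelSource` constant is assumed to hold the kernel source above as a single string, and registering the filter under the name "NormalMap" (so that imageByApplyingFilter can find it) is omitted:

    class NormalMap: CIFilter
    {
        var inputImage: CIImage?

        // Compile the Core Image Kernel Language source shown above
        let normalMapKernel = CIKernel(string: normalMapKernelSource)

        override var outputImage: CIImage?
        {
            guard let inputImage = inputImage,
                kernel = normalMapKernel else
            {
                return nil
            }

            // Grow the region of interest by a pixel in each direction,
            // since the kernel samples its immediate neighbours
            return kernel.applyWithExtent(inputImage.extent,
                roiCallback: { (index, rect) in CGRectInset(rect, -1, -1) },
                arguments: [inputImage])
        }
    }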




Implementing the Normal Map

A SceneKit material's normal content can be populated with a CGImage instance, so we can update the code above to chain the tiled radial gradients to the new filter and, should we want to, a further color controls filter to tweak the contrast:


    let ciCirclesImage = radialFilter?
        .outputImage?
        .imageByCroppingToRect(CGRect(x: 0, y: 0, width: 100, height: 100))
        .imageByApplyingFilter("CIAffineTile", withInputParameters: nil)
        .imageByCroppingToRect(CGRect(x: 0, y: 0, width: 500, height: 500))
        .imageByApplyingFilter("NormalMap", withInputParameters: nil)
        .imageByApplyingFilter("CIColorControls", withInputParameters: ["inputContrast": 2.5])

    // Render the chained CIImage to a CGImage for use as material contents
    let context = CIContext()

    let cgNormalMap = context.createCGImage(ciCirclesImage!,
                                            fromRect: ciCirclesImage!.extent)

Then, simply define a material with the normal map:


    let material = SCNMaterial()
    material.normal.contents = cgNormalMap
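
To actually see the effect, the material needs attaching to some geometry in the scene. A minimal sketch, assuming a simple sphere and an existing `scene`:

    // The sphere and the `scene` constant here are illustrative; the
    // demo project's geometry may differ
    let sphere = SCNSphere(radius: 1)
    sphere.materials = [material]

    let node = SCNNode(geometry: sphere)
    scene.rootNode.addChildNode(node)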

Core Image for Swift

All the code to accompany this post is available from my GitHub repository. However, if you'd like to learn more about how to wrap Core Image Kernel Language code in a Core Image filter, or to explore the awesome power of custom kernels, may I recommend my book, Core Image for Swift.
Core Image for Swift is available from Apple's iBooks Store or, as a PDF, from Gumroad. IMHO, the iBooks version is better, especially as it contains video assets that the PDF version doesn't.






It's been a fairly busy few months at my "proper" job, so my recreational Houdini tinkering has taken a bit of a back seat. However, when I saw my Swarm Chemistry hero, Hiroki Sayama tweeting a link to How a life-like system emerges from a simple particle motion law, I thought I'd dust off Houdini to see if I could implement this model in VEX.

The paper discusses a simple particle system, named Primordial Particle Systems (PPS), that leads to life-like structures through morphogenesis. Each particle in the system is defined by its position and heading and, with each step in the simulation, alters its heading based on the PPS rule and moves forward at a defined speed. The heading is updated based on the number of neighbors to the particle's left and right. 
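
Concretely, the heading update rule from the paper is:

    Δφ = α + β · N · sign(R − L)

where L and R are the neighbor counts within the reaction radius to the particle's left and right, N = L + R, and α and β are fixed rotation angles.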

The project setup is super simple:



Inside a geometry node, I create a grid and randomly scatter 19,000 points across it. An attribute wrangle node assigns a random initial heading to @angle:

    @angle = 2 * PI * rand(@ptnum);
The real magic happens inside another attribute wrangle inside the solver.

In a nutshell, my VEX code iterates over each point's neighbors and counts how many lie to its left and how many to its right. To figure out which side a neighbor is on, I use some simple trigonometry: I rotate the offset vector from the current particle to the neighbor by minus the particle's heading, then check the sign of the rotated vector's angle.
    // Open a point cloud around the current point. The search radius and
    // maximum neighbor count here are assumptions, exposed as channels.
    int maxParticles = chi("max_particles");
    int pointCloud = pcopen(0, "P", @P, ch("radius"), maxParticles);

    int L = 0; // neighbors to the left
    int R = 0; // neighbors to the right

    while (pciterate(pointCloud)) {

        vector otherPosition;
        pcimport(pointCloud, "P", otherPosition);

        // Offset from the current particle to the neighbor, in the XZ plane...
        vector2 offsetPosition = set(otherPosition.x - @P.x, otherPosition.z - @P.z);

        // ...rotated into the particle's local frame by its heading
        float xx = offsetPosition.x * cos(-@angle) - offsetPosition.y * sin(-@angle);
        float yy = offsetPosition.x * sin(-@angle) + offsetPosition.y * cos(-@angle);

        float otherAngle = atan2(yy, xx);

        if (otherAngle >= 0) {
            L++;
        }
        else {
            R++;
        }
    }
After iterating over the nearby particles, I update the angle based on the PPS rule (alpha and beta are the model's fixed rotation angles, defined elsewhere in the wrangle):

    float N = float(L + R);
    @angle += alpha + beta * N * sign(R - L);
...and, finally, I can update the particle's position based on its angle and speed:

    vector velocity = set(cos(@angle) * @speed, 0.0, sin(@angle) * @speed);
    @P += velocity;
Not quite finally, because to make things pretty, I update the color, using the number of neighbors to control hue:

    @Cd = hsvtorgb(N / maxParticles, 1.0, 1.0);
Easy!

Solitons Emerging from a Tweaked Model



I couldn't help tinkering with the published PPS math by making the speed a function of the number of local neighbors:

    @speed = 1.5 * (N / maxParticles);
In the video above, alpha is 182° and beta is -13°.

References

Schmickl, T. et al. How a life-like system emerges from a simple particle motion law. Sci. Rep. 6, 37969; doi: 10.1038/srep37969 (2016).

