Last night, I gave a talk at NSLondon on Apple's Core Image framework. Speaking to attendees afterwards, there were two frequent comments: "Core Image is far easier than I thought" and "Core Image is far more powerful than I thought". 

One of the topics I covered was creating a custom filter to create a variable blur effect - something that may initially sound quite complex but is actually fairly simple to implement. I thought a blog post discussing the custom filter in detail may help spread the word on Core Image's ease of use and awesome power.

This post will discuss the kernel code required to create the effect and the Swift code required to wrap up the kernel in a CIFilter. It's worth noting that Core Image includes its own CIMaskedVariableBlur filter which we'll be emulating in this post.


Core Image Kernels

At the heart of every built-in Core Image filter is a kernel function. This is a small program that's executed against every single pixel in the output image and is written in a dialect of OpenGL Shading Language called Core Image Kernel Language. We'll be looking at Core Image's CIKernel class which manages a kernel function and expects code that looks a little like:


    kernel vec4 kernelFunction(sampler image)
    {
        return sample(image, samplerCoord(image));
    }

In this example, the function accepts a single argument, image, which is of type sampler and contains the pixel data of the source image. The sample function returns the pixel color of the source image at the co-ordinates of the pixel currently being computed, and simply returning that value gives us a destination image identical to the source image. 

Variable Blur Kernel

Our filter will be based on a simple box blur. The kernel function will sample neighbouring pixels in a square centred on the pixel currently being computed and return the average color of those pixels. The size of that square is a function of the blur image - the brighter the corresponding pixel in the blur image, the bigger the square.

The code is a little more involved than the previous "pass-through" filter:


    kernel vec4 lumaVariableBlur(sampler image, sampler blurImage, float blurRadius)
    {
        vec3 blurPixel = sample(blurImage, samplerCoord(blurImage)).rgb;
        float blurAmount = dot(blurPixel, vec3(0.2126, 0.7152, 0.0722));

        int radius = int(blurAmount * blurRadius);

        vec3 accumulator = vec3(0.0, 0.0, 0.0);
        float n = 0.0;

        for (int x = -radius; x <= radius; x++)
        {
            for (int y = -radius; y <= radius; y++)
            {
                vec2 workingSpaceCoordinate = destCoord() + vec2(x, y);
                vec2 imageSpaceCoordinate = samplerTransform(image, workingSpaceCoordinate);
                vec3 color = sample(image, imageSpaceCoordinate).rgb;
                accumulator += color;
                n += 1.0;
            }
        }

        accumulator /= n;

        return vec4(accumulator, 1.0);
    }

In this kernel we:

  • Calculate the blur amount based on the luminosity of the current pixel in the blur image and use that to define the blur radius.
  • Iterate over the surrounding pixels in the input image, sampling and accumulating their values.
  • Return the average of the accumulated pixel values with an alpha (transparency) of 1.0.
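To make that per-pixel logic concrete, here's the same algorithm sketched in plain Python (purely illustrative: the function names and the nested-list image representation are invented for this sketch, not part of Core Image):

```python
def luma(rgb):
    # Rec. 709 luma weights, matching the kernel's dot() call
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def variable_blur_pixel(image, blur_image, x, y, blur_radius):
    # The blur radius for this pixel scales with the mask's brightness
    radius = int(luma(blur_image[y][x]) * blur_radius)
    accumulator = [0.0, 0.0, 0.0]
    n = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            # Clamp sampling to the image bounds
            sy = min(max(y + dy, 0), len(image) - 1)
            sx = min(max(x + dx, 0), len(image[0]) - 1)
            for c in range(3):
                accumulator[c] += image[sy][sx][c]
            n += 1
    return [component / n for component in accumulator]
```

A black mask pixel yields a radius of zero, so the source pixel passes through unchanged; brighter mask pixels grow the sampled square.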

To use that code in a filter, it needs to be passed in the constructor of a CIKernel:


    let maskedVariableBlur = CIKernel(string:"kernel vec4 lumaVariableBlur....")


Implementing as a Core Image Filter

With the CIKernel created, we can wrap up that GLSL code in a Core Image filter. To do this, we'll subclass CIFilter:


    class MaskedVariableBlur: CIFilter
    {
    }

The filter needs three parameters: the input image which will be blurred, the blur image which defines the blur intensity, and a blur radius which, as we saw above, controls the size of the sampled area:


    var inputImage: CIImage?
    var inputBlurImage: CIImage?
    var inputBlurRadius: CGFloat = 5

The execution of the kernel is done inside the filter's overridden outputImage getter by invoking applyWithExtent on the Core Image kernel:


    override var outputImage: CIImage!
    {
        guard let
            inputImage = inputImage,
            inputBlurImage = inputBlurImage else
        {
            return nil
        }
        
        let extent = inputImage.extent

        let blur = maskedVariableBlur?.applyWithExtent(
            inputImage.extent,
            roiCallback:
            {
                (index, rect) in
                return rect
            },
            arguments: [inputImage, inputBlurImage, inputBlurRadius])
        
        return blur!.imageByCroppingToRect(extent)
    }

The roiCallback is a function that answers the question, "to render the region, rect, in the destination, what region do I need from the source image?". My book, Core Image for Swift, takes a detailed look at how this can affect both performance and the effect of the filter. 
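Returning rect unchanged works here, but for a blur the tight answer is the destination rect grown on all sides by the maximum blur radius, since every output pixel may read neighbours up to that distance away. The geometry, sketched with a plain (x, y, width, height) tuple rather than a CGRect (illustrative only):

```python
def blur_roi(rect, max_radius):
    # Grow the requested region on all sides so the returned source
    # region covers every pixel the box blur kernel might sample
    x, y, width, height = rect
    return (x - max_radius, y - max_radius,
            width + 2 * max_radius, height + 2 * max_radius)
```

In Swift this is the equivalent of insetting the rect by the negative blur radius before returning it from the callback.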

Note that the arguments array mirrors the kernel function's declaration. 

Registering the Filter

To finish creating the filter, it needs to be registered. This is done with CIFilter's static registerFilterName method:


    CIFilter.registerFilterName("MaskedVariableBlur",
        constructor: FilterVendor(),
        classAttributes: [kCIAttributeFilterName: "MaskedVariableBlur"])

The filter vendor, which conforms to CIFilterConstructor, returns a concrete instance of our filter class based on a string:


    class FilterVendor: NSObject, CIFilterConstructor
    {
        func filterWithName(name: String) -> CIFilter?
        {
            switch name
            {
            case "MaskedVariableBlur":
                return MaskedVariableBlur()

            default:
                return nil
            }
        }
    }

The Filter in Use

Although the filter can accept any image as a blur image, it might be neat to create a radial gradient procedurally (this could even be controlled by a Core Image detector to centre itself on the face!). 


    let monaLisa = CIImage(image: UIImage(named: "monalisa.jpg")!)!


    let gradientImage = CIFilter(
        name: "CIRadialGradient",
        withInputParameters: [
            kCIInputCenterKey: CIVector(x: 310, y: 390),
            "inputRadius0": 100,
            "inputRadius1": 300,
            "inputColor0": CIColor(red: 0, green: 0, blue: 0),
            "inputColor1": CIColor(red: 1, green: 1, blue: 1)
        ])?
        .outputImage?
        .imageByCroppingToRect(monaLisa.extent)

Core Image's generator filters, such as this gradient, create images with infinite extent - hence a crop at the end. Our radial gradient looks like:


We can now call our funky new filter using that gradient and an image of the Mona Lisa:



    let final = monaLisa
        .imageByApplyingFilter("MaskedVariableBlur",
            withInputParameters: [
                "inputBlurRadius": 15,
                "inputBlurImage": gradientImage!])

This yields a result where the blur applied to the source image is greater where the blur image is lighter:




Core Image for Swift

My book, Core Image for Swift, takes a detailed look at custom filters and covers the high performance warp and color kernels at length. Hopefully, this article has demonstrated that with very little code, even the simplest GLSL can provide some impressive results.

Core Image for Swift is available from Apple's iBooks Store or, as a PDF, from Gumroad. IMHO, the iBooks version is better, especially as it contains video assets which the PDF version doesn't.



Comments

  1. If someone wants to try using this example, here's how you can build the full Core Image Kernel string (with a small bug corrected, removed extra "+" from code):

    let maskedVariableBlur = CIKernel(string:("kernel vec4 lumaVariableBlur(sampler image, sampler blurImage, float blurRadius)" +
    "{ \n" +
    "vec3 blurPixel = sample(blurImage, samplerCoord(blurImage)).rgb;" +
    "float blurAmount = dot(blurPixel, vec3(0.2126, 0.7152, 0.0722));" +
    "\n" +
    "int radius = int(blurAmount * blurRadius);" +
    "\n" +
    "vec3 accumulator = vec3(0.0, 0.0, 0.0);" +
    "float n = 0.0;" +
    "\n" +
    "for (int x = -radius; x <= radius; x++)" +
    "\n{" +
    "for (int y = -radius; y <= radius; y++)" +
    "\n{" +
    "vec2 workingSpaceCoordinate = destCoord() + vec2(x,y);" +
    "vec2 imageSpaceCoordinate = samplerTransform(image, workingSpaceCoordinate);" +
    "vec3 color = sample(image, imageSpaceCoordinate).rgb;" +
    "accumulator += color; " +
    "n += 1.0;" +
    "}\n" +
    "}\n" +
    "accumulator /= n;" +
    "\n" +
    "return vec4(accumulator, 1.0);" +
    "\n" +
    "}"))

  2. I am getting:
    failed because 'lumaVariableBlur', the first kernel in the string, does not conform to the calling conventions of a CIColorKernel.
    What does this mean?

  3. Spent several hours with no working filter from this tutorial until I declared the filter's inputs as @objc dynamic:

    ```
    @objc dynamic var inputImage: CIImage?
    ...
    ```



It's been a fairly busy few months at my "proper" job, so my recreational Houdini tinkering has taken a bit of a back seat. However, when I saw my Swarm Chemistry hero, Hiroki Sayama, tweet a link to How a life-like system emerges from a simple particle motion law, I thought I'd dust off Houdini to see if I could implement this model in VEX.

The paper discusses a simple particle system, named Primordial Particle Systems (PPS), that leads to life-like structures through morphogenesis. Each particle in the system is defined by its position and heading and, with each step in the simulation, alters its heading based on the PPS rule and moves forward at a defined speed. The heading is updated based on the number of neighbors to the particle's left and right. 

The project setup is super simple: 



Inside a geometry node, I create a grid and randomly scatter 19,000 points across it. An attribute wrangle node assigns a random value to @angle:

    @angle = $PI * 2 * rand(@ptnum);

The real magic happens inside another attribute wrangle inside the solver.

In a nutshell, my VEX code iterates over each point's neighbors and sums the neighbor count to its left and right. To figure out the chirality, I use some simple trigonometry to rotate the vector defined by the current particle and the neighbor by the current particle's angle, then calculate the angle of the rotated vector. 
    // pointCloud is assumed to come from an earlier pcopen() call, e.g.
    // int pointCloud = pcopen(0, "P", @P, searchRadius, maxParticles);
    int L = 0;
    int R = 0;

    while (pciterate(pointCloud)) {
        vector otherPosition;
        pcimport(pointCloud, "P", otherPosition);

        // Offset from the current particle to the neighbour (XZ plane)
        vector2 offsetPosition = set(otherPosition.x - @P.x, otherPosition.z - @P.z);

        // Rotate the offset by -@angle so the particle's heading lies along +x
        float xx = offsetPosition.x * cos(-@angle) - offsetPosition.y * sin(-@angle);
        float yy = offsetPosition.x * sin(-@angle) + offsetPosition.y * cos(-@angle);

        float otherAngle = atan2(yy, xx);

        if (otherAngle >= 0) {
            L++;
        }
        else {
            R++;
        }
    }
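The rotate-then-atan2 chirality test is easy to verify in isolation. Here's the same trigonometry in Python (a standalone sketch of the VEX logic, with invented function and argument names):

```python
import math

def is_left(px, pz, angle, ox, oz):
    # Offset from the current particle (px, pz) to its neighbour (ox, oz)
    dx, dz = ox - px, oz - pz
    # Rotate the offset by -angle so the particle's heading lies along +x
    xx = dx * math.cos(-angle) - dz * math.sin(-angle)
    yy = dx * math.sin(-angle) + dz * math.cos(-angle)
    # A non-negative angle for the rotated vector puts the neighbour on the left
    return math.atan2(yy, xx) >= 0
```

A particle at the origin heading along +x sees a neighbour at (0, 1) on its left; flip the heading by 180° and the same neighbour lands on the right.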
After iterating over the nearby particles, I update the angle based on the PPS rule:

    float N = float(L + R);
    @angle += alpha + beta * N * sign(R - L);

...and, finally, I can update the particle's position based on its angle and speed:

    vector velocity = set(cos(@angle) * @speed, 0.0, sin(@angle) * @speed);
    @P += velocity;

Not quite finally, because to make things pretty, I update the color using the number of neighbors to control hue:

    @Cd = hsvtorgb(N / maxParticles, 1.0, 1.0);

Easy!
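For anyone without Houdini to hand, one full update step amounts to very little code. Here's a minimal Python sketch of the rule (the function name and signature are invented for illustration; alpha and beta are in radians):

```python
import math

def pps_step(x, z, angle, left, right, alpha, beta, speed):
    # PPS rule: turn by a constant alpha plus a term scaled by
    # neighbour density and the side holding the majority
    n = left + right
    sign = (right > left) - (right < left)  # sign(R - L)
    angle += alpha + beta * n * sign
    # Move forward along the updated heading (XZ plane, as in the VEX)
    return x + math.cos(angle) * speed, z + math.sin(angle) * speed, angle
```

With no neighbours the particle simply turns by alpha each step and walks in a circle, which is exactly the behaviour the paper describes for isolated particles.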

Solitons Emerging from Tweaked Model



I couldn't help tinkering with the published PPS math by making the speed a function of the number of local neighbors:

    @speed = 1.5 * (N / maxParticles);

In the video above, alpha is 182° and beta is -13°.

References

Schmickl, T. et al. How a life-like system emerges from a simple particle motion law. Sci. Rep. 6, 37969; doi: 10.1038/srep37969 (2016).


Comments

  1. ok. I've got to finish current job, then crash course in programming, and ... this is very inspirational!
