I'm a big fan of Apple's Core Image technology: my Nodality application is based entirely around Core Image filters. However, for new users, the code for adding a simple filter to an image is a little oblique and the implementation is very "stringly" typed.

This post looks at an alternative, GPUImage from Brad Larson. GPUImage is a framework containing a rich set of image filters, many of which aren't in Core Image. It has a far simpler and more strongly typed API and, in some cases, is faster than Core Image.

To kick off, let's look at the code required to apply a Gaussian blur to an image (inputImage) using Core Image:


        let inputImage = UIImage()
        let ciContext = CIContext(options: nil)
        
        let blurFilter = CIFilter(name: "CIGaussianBlur")
        blurFilter.setValue(CIImage(image: inputImage), forKey: "inputImage")
        blurFilter.setValue(10, forKey: "inputRadius")
        
        let outputImageData = blurFilter.valueForKey("outputImage") as! CIImage
        let outputImageRef: CGImage = ciContext.createCGImage(outputImageData, fromRect: outputImageData.extent())

        let outputImage = UIImage(CGImage: outputImageRef)!

...not only do we need to explicitly define a context, but both the filter name and its parameter keys are strings, and we need a few extra steps to convert the filter's output into a UIImage.

Here's the same functionality using GPUImage:


        let inputImage = UIImage()
        
        let blurFilter = GPUImageGaussianBlurFilter()
        blurFilter.blurRadiusInPixels = 10
        
        let outputImage = blurFilter.imageByFilteringImage(inputImage)

Here, both the filter and its blur radius parameter are properly typed and the filter returns a UIImage instance.

On the flip side, there is some setting up to do. Once you've got a local copy of GPUImage, drag the framework project into your application's project. Then, under the application target's build phases, add GPUImage as a target dependency, add a reference to GPUImage.framework under "Link Binary With Libraries" and add a "Copy Files" phase for the framework.

Your build phases screen should look like this:



Then, by simply importing GPUImage, you're ready to roll.

To show off some of the funkier filters contained in GPUImage, I've created a little demonstration app, GPUImageDemo.

The app demonstrates Polar Pixellate, Polka Dot, Sketch, Threshold Sketch, Toon, Smooth Toon, Emboss, Sphere Refraction and Glass Sphere - none of which are available in Core Image. 
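
Each of these is used in exactly the same way as the Gaussian blur above: instantiate the filter, set its strongly typed properties and ask it for a filtered image. As a minimal sketch (the intensity value here is just an arbitrary choice for illustration), applying the emboss filter looks like this:


        let embossFilter = GPUImageEmbossFilter()
        embossFilter.intensity = 2

        let embossedImage = embossFilter.imageByFilteringImage(inputImage)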

The filtering work is all done in my GPUImageDelegate class where a switch statement declares a GPUImageOutput variable (the class that includes the imageByFilteringImage() method) and sets it to the appropriate concrete class depending on the user interface.

For example, if the picker is set to threshold sketch, the following case statement is executed:


       case ImageFilter.ThresholdSketch:

            gpuImageFilter = GPUImageThresholdSketchFilter()
            
            if let gpuImageFilter = gpuImageFilter as? GPUImageThresholdSketchFilter
            {
                if values.count > 1
                {
                    gpuImageFilter.edgeStrength = values[0]
                    gpuImageFilter.threshold = values[1]
                }
            }
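
Pulling that together, the overall shape of the delegate code looks something like the sketch below. I've trimmed the enum to two hypothetical cases and simplified the parameter handling, so treat the names other than gpuImageFilter and values as my own placeholders rather than the actual GPUImageDemo code:


        import UIKit
        import GPUImage

        enum ImageFilter
        {
            case ThresholdSketch
            case Emboss
        }

        func filteredImage(image: UIImage, filter: ImageFilter, values: [CGFloat]) -> UIImage
        {
            // GPUImageOutput is the common superclass that provides imageByFilteringImage()
            var gpuImageFilter: GPUImageOutput

            switch filter
            {
            case ImageFilter.ThresholdSketch:
                let thresholdSketch = GPUImageThresholdSketchFilter()

                if values.count > 1
                {
                    thresholdSketch.edgeStrength = values[0]
                    thresholdSketch.threshold = values[1]
                }

                gpuImageFilter = thresholdSketch

            case ImageFilter.Emboss:
                gpuImageFilter = GPUImageEmbossFilter()
            }

            return gpuImageFilter.imageByFilteringImage(image)
        }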

If you build this project, you may encounter a build error on the documentation target. I've simply deleted this target on affected machines.

GPUImage is fast enough to filter video. I've taken my recent two million particles experiment and added a post processing step that consists of a cartoon filter and an emboss filter. These are packaged together in a GPUImageFilterGroup:


        let toonFilter = GPUImageSmoothToonFilter()
        let embossFilter = GPUImageEmbossFilter()

        let filterGroup = GPUImageFilterGroup()

        toonFilter.threshold = 1
        embossFilter.intensity = 2

        filterGroup.addFilter(toonFilter)
        filterGroup.addFilter(embossFilter)

        // chain the filters: the toon filter's output feeds the emboss filter...
        toonFilter.addTarget(embossFilter)

        // ...and the group treats the toon filter as its input and the emboss filter as its output
        filterGroup.initialFilters = [ toonFilter ]
        filterGroup.terminalFilter = embossFilter

Since GPUImageFilterGroup extends GPUImageOutput, I can take the output from the Metal texture, create a UIImage instance from it and pass it to the composite filter:


        self.imageView.image = self.filterGroup.imageByFilteringImage(UIImage(CGImage: imageRef)!)

On my iPad Air 2, the final result of 2,000,000 particles with a two-filter post process on a 1,024 x 1,024 image still runs at around 20 frames per second. Here's a real time screen capture:



The source code for my GPUImageDemo is available at my GitHub repository here and GPUImage lives here.


It's been a fairly busy few months at my "proper" job, so my recreational Houdini tinkering has taken a bit of a back seat. However, when I saw my Swarm Chemistry hero, Hiroki Sayama tweeting a link to How a life-like system emerges from a simple particle motion law, I thought I'd dust off Houdini to see if I could implement this model in VEX.

The paper discusses a simple particle system, named Primordial Particle Systems (PPS), that leads to life-like structures through morphogenesis. Each particle in the system is defined by its position and heading and, with each step in the simulation, alters its heading based on the PPS rule and moves forward at a defined speed. The heading is updated based on the number of neighbors to the particle's left and right. 
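
Written out explicitly, with L and R the neighbor counts on the particle's left and right and N = L + R, each particle's heading changes per time step by:

Δφ = α + β · N · sign(R − L)

...and the particle then moves forward by its speed along the new heading. This is exactly the rule implemented in the VEX further down.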

The project setup is super simple: 



Inside a geometry node, I create a grid, and randomly scatter 19,000 points across it. An attribute wrangle node assigns a random value to @angle:
@angle = $PI * 2 * rand(@ptnum); 
The real magic happens inside another attribute wrangle inside the solver.

In a nutshell, my VEX code iterates over each point's neighbors and counts how many lie to its left and how many to its right. To figure out the chirality, I use some simple trigonometry to rotate the vector defined by the current particle and the neighbor by the current particle's angle, then calculate the angle of the rotated vector. 
// open a point cloud of the current particle's neighbors; the search radius and
// neighbor limit are channels on the wrangle (the channel names here are my own)
int maxParticles = chi("max_particles");
int pointCloud = pcopen(0, "P", @P, chf("radius"), maxParticles);

int L = 0;
int R = 0;

while(pciterate(pointCloud)) {

    vector otherPosition;
    pcimport(pointCloud, "P", otherPosition);

    vector2 offsetPosition = set(otherPosition.x - @P.x, otherPosition.z - @P.z);
    float xx = offsetPosition.x * cos(-@angle) - offsetPosition.y * sin(-@angle);
    float yy = offsetPosition.x * sin(-@angle) + offsetPosition.y * cos(-@angle);
    
    float otherAngle = atan2(yy, xx); 

    if (otherAngle >= 0) {
        L++;
    } 
    else {
        R++;
    }   
}
After iterating over the nearby particles, I update the angle based on the PPS rule:
float N = float(L + R);
// alpha and beta are the PPS parameters, exposed in degrees as channels on the wrangle (again, my channel names)
float alpha = radians(chf("alpha"));
float beta = radians(chf("beta"));
@angle += alpha + beta * N * sign(R - L);
...and, finally, I can update the particle's position based on its angle and speed:
vector velocity = set(cos(@angle) * @speed, 0.0, sin(@angle) * @speed);  
@P += velocity ;
Not quite finally, because to make things pretty, I update the color using the number of neighbors to control hue:
@Cd = hsvtorgb(N / maxParticles, 1.0, 1.0); 
Easy!

Solitons Emerging from Tweaked Model



I couldn't help tinkering with the published PPS math by making the speed a function of the number of local neighbors:
@speed = 1.5 * (N / maxParticles);
In the video above, alpha is 182° and beta is -13°.

References

Schmickl, T. et al. How a life-like system emerges from a simple particle motion law. Sci. Rep. 6, 37969; doi: 10.1038/srep37969 (2016).

