I'm a big fan of Apple's Core Image technology: my Nodality application is based entirely around Core Image filters. However, for new users, the code for adding a simple filter to an image is a little oblique and the implementation is very "stringly" typed.

This post looks at an alternative, GPUImage from Brad Larson. GPUImage is a framework containing a rich set of image filters, many of which aren't in Core Image. It has a far simpler and more strongly typed API and, in some cases, is faster than Core Image.

To kick off, let's look at the code required to apply a Gaussian blur to an image (inputImage) using Core Image:


        let inputImage = UIImage()
        let ciContext = CIContext(options: nil)
        
        let blurFilter = CIFilter(name: "CIGaussianBlur")
        blurFilter.setValue(CIImage(image: inputImage), forKey: "inputImage")
        blurFilter.setValue(10, forKey: "inputRadius")
        
        let outputImageData = blurFilter.valueForKey("outputImage") as CIImage!
        let outputImageRef: CGImage = ciContext.createCGImage(outputImageData, fromRect: outputImageData.extent())

        let outputImage = UIImage(CGImage: outputImageRef)!

...not only do we need to explicitly create the context, but both the filter name and its parameter keys are strings, and it takes a few extra steps to convert the filter's output into a UIImage.

Here's the same functionality using GPUImage:


        let inputImage = UIImage()
        
        let blurFilter = GPUImageGaussianBlurFilter()
        blurFilter.blurRadiusInPixels = 10
        
        let outputImage = blurFilter.imageByFilteringImage(inputImage)

Here, both the filter and its blur radius parameter are properly typed and the filter returns a UIImage instance.

On the flip side, there is some setting up to do. Once you've got a local copy of GPUImage, drag the framework project into your application's project. Then, under the application target's build phases, add a target dependency, a reference to GPUImage.framework under Link Binary With Libraries, and a Copy Files phase.

Your build phases screen should look like this:



Then, by simply importing GPUImage, you're ready to roll.
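For example, a one-line smoke test confirms everything is wired up (the sepia filter and image name here are just for illustration and aren't part of the demo app):


        import GPUImage

        let sepiaImage = GPUImageSepiaFilter().imageByFilteringImage(UIImage(named: "test.jpg")!)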

To show off some of the funkier filters contained in GPUImage, I've created a little demonstration app, GPUImageDemo.

The app demonstrates Polar Pixellate, Polka Dot, Sketch, Threshold Sketch, Toon, Smooth Toon, Emboss, Sphere Refraction and Glass Sphere - none of which are available in Core Image. 

The filtering work is all done in my GPUImageDelegate class, where a switch statement declares a GPUImageOutput variable (the class that includes the imageByFilteringImage() method) and sets it to the appropriate concrete filter class depending on the selection in the user interface.

For example, if the picker is set to Threshold Sketch, the following case statement is executed:


       case ImageFilter.ThresholdSketch:

            gpuImageFilter = GPUImageThresholdSketchFilter()
            
            if let gpuImageFilter = gpuImageFilter as? GPUImageThresholdSketchFilter
            {
                if values.count > 1
                {
                    gpuImageFilter.edgeStrength = values[0]
                    gpuImageFilter.threshold = values[1]
                }
            }
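
Once the switch has configured the filter, applying it is a single call. As a sketch, assuming the delegate holds the source image as inputImage and displays the result in an imageView:


        imageView.image = gpuImageFilter.imageByFilteringImage(inputImage)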

If you build this project, you may encounter a build error on the documentation target. I've simply deleted this target on affected machines.

GPUImage is fast enough to filter video. I've taken my recent two million particles experiment and added a post processing step that consists of a cartoon filter and an emboss filter. These are packaged together in a GPUImageFilterGroup:


        let toonFilter = GPUImageSmoothToonFilter()
        let embossFilter = GPUImageEmbossFilter()

        let filterGroup = GPUImageFilterGroup()

        toonFilter.threshold = 1
        embossFilter.intensity = 2

        // Register both filters with the group...
        filterGroup.addFilter(toonFilter)
        filterGroup.addFilter(embossFilter)

        // ...chain the toon filter's output into the emboss filter...
        toonFilter.addTarget(embossFilter)

        // ...and declare the group's entry and exit points.
        filterGroup.initialFilters = [ toonFilter ]
        filterGroup.terminalFilter = embossFilter

Since GPUImageFilterGroup extends GPUImageOutput, I can take the output from the Metal texture, create a UIImage instance from it, and pass that to the composite filter:


        self.imageView.image = self.filterGroup.imageByFilteringImage(UIImage(CGImage: imageRef)!)

On my iPad Air 2, the final result of 2,000,000 particles with a two filter post process on a 1,024 x 1,024 image still runs at around 20 frames per second. Here's a real time screen capture:



The source code for my GPUImageDemo is available at my GitHub repository here and GPUImage lives here.


Comments

  1. There are errors when I try to run the application. How can I solve it? I'm using Xcode 7.2.

  2. I'm using an old version of GPUImage. I'll try to find time to update it.

  3. Same problem. We would all be thankful if you could update it. Thanks.

  4. Just updated that project - runs fine now under Xcode 7.2. Cheers!
