As part of a project to create a GPU-based reaction diffusion simulation, I started to look at using Metal in Swift this weekend.

I've done similar work in the past targeting the Flash Player and using AGAL. Metal is a far higher level language than AGAL: it's based on C++ with a richer syntax and includes compute functions. In AGAL, to run cellular automata, I'd create a rectangle out of two triangles with a vertex shader and execute the reaction diffusion functions in a separate fragment shader. A compute shader is more direct: it can get and set textures and operate on individual pixels of those textures without the need for a vertex shader.

The Swift code I discuss in this article is based heavily on two articles at Metal By Example: Introduction to Compute Programming in Metal and Fundamentals of Image Processing in Metal. Both include Objective-C source code, so hopefully my Swift implementation will help some readers.

My application has four main steps: initialise Metal, create a Metal texture from a UIImage, apply a kernel function to that texture, then convert the newly generated texture back into a UIImage and display it. I'm using a simple example shader that changes the saturation of the input image, so I've also added a slider that changes the saturation value.
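
Before diving in, here's roughly how the slider drives the filter. This is just a sketch: sliderChange() and applyFilter() are hypothetical names standing in for the wiring in my project, with applyFilter() wrapping the encode and dispatch code discussed below:

    // hypothetical slider handler: store the new saturation value
    // and re-run the compute shader over the source image
    func sliderChange(slider: UISlider)
    {
        saturationFactor = slider.value
        applyFilter()
    }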

Let's look at each step one by one:

Initialising Metal

Initialising Metal is pretty simple: inside my view controller's overridden viewDidLoad(), I create a pointer to the default Metal device:

    var device: MTLDevice! = nil
    [...]
    device = MTLCreateSystemDefaultDevice()

I also need to create a library and command queue:

    defaultLibrary = device.newDefaultLibrary()
    commandQueue = device.newCommandQueue()

Finally, I get a reference to my Metal function from the library and synchronously create and compile a compute pipeline state:

    let kernelFunction = defaultLibrary.newFunctionWithName("kernelShader")
    pipelineState = device.newComputePipelineStateWithFunction(kernelFunction!, error: nil)
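
I'm passing nil for the error parameter to keep things brief but, while developing shaders, it's worth capturing the error instead. Here's a sketch of the same call with error checking (pipelineError is just a local variable I've introduced for illustration):

    var pipelineError: NSError?
    pipelineState = device.newComputePipelineStateWithFunction(kernelFunction!, error: &pipelineError)

    if pipelineState == nil
    {
        println("Failed to create pipeline state: \(pipelineError)")
    }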

The name kernelShader refers to the saturation image processing function, written in Metal, that lives in my Shaders.metal file:

    kernel void kernelShader(texture2d<float, access::read> inTexture [[texture(0)]],
                             texture2d<float, access::write> outTexture [[texture(1)]],
                             constant AdjustSaturationUniforms &uniforms [[buffer(0)]],
                             uint2 gid [[thread_position_in_grid]])
    {
        float4 inColor = inTexture.read(gid);
        float value = dot(inColor.rgb, float3(0.299, 0.587, 0.114));
        float4 grayColor(value, value, value, 1.0);
        float4 outColor = mix(grayColor, inColor, uniforms.saturationFactor);
        outTexture.write(outColor, gid);
    }
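
The AdjustSaturationUniforms type in the shader's argument list is declared in the same Shaders.metal file, and its layout needs to match the Swift struct used to populate the buffer (shown later). It looks something like this:

    struct AdjustSaturationUniforms
    {
        float saturationFactor;
    };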

Creating a Metal Texture from a UIImage

There are a few steps in converting a UIImage into an MTLTexture instance. I create an array of UInt8 to hold the image's raw pixel data, create a bitmap context backed by that array, then use CGContextDrawImage() to copy the image into it:

    let image = UIImage(named: "grand_canyon.jpg")
    let imageRef = image.CGImage
        
    let imageWidth = CGImageGetWidth(imageRef)
    let imageHeight = CGImageGetHeight(imageRef)

    let bytesPerRow = bytesPerPixel * imageWidth
        
    var rawData = [UInt8](count: Int(imageWidth * imageHeight * 4), repeatedValue: 0)
  
    let bitmapInfo = CGBitmapInfo(CGBitmapInfo.ByteOrder32Big.toRaw() | CGImageAlphaInfo.PremultipliedLast.toRaw())

    let context = CGBitmapContextCreate(&rawData, imageWidth, imageHeight, bitsPerComponent, bytesPerRow, rgbColorSpace, bitmapInfo)
        
    CGContextDrawImage(context, CGRectMake(0, 0, CGFloat(imageWidth), CGFloat(imageHeight)), imageRef)
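
The code above also relies on a few constants that I define elsewhere in the class and haven't shown: bytesPerPixel, bitsPerComponent and rgbColorSpace (plus bitsPerPixel, used later when converting back). Assuming 32-bit RGBA pixels with 8 bits per component, they look along these lines:

    let bytesPerPixel = UInt(4)
    let bitsPerComponent = UInt(8)
    let bitsPerPixel = UInt(32)
    let rgbColorSpace = CGColorSpaceCreateDeviceRGB()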

Once all of those steps have executed, I can create a new texture and use its replaceRegion() method to write the image into it:

    let textureDescriptor = MTLTextureDescriptor.texture2DDescriptorWithPixelFormat(MTLPixelFormat.RGBA8Unorm, width: Int(imageWidth), height: Int(imageHeight), mipmapped: true)
        
    texture = device.newTextureWithDescriptor(textureDescriptor)

    let region = MTLRegionMake2D(0, 0, Int(imageWidth), Int(imageHeight))
    texture.replaceRegion(region, mipmapLevel: 0, withBytes: &rawData, bytesPerRow: Int(bytesPerRow))

I also create an empty texture which the kernel function will write into:

    let outTextureDescriptor = MTLTextureDescriptor.texture2DDescriptorWithPixelFormat(texture.pixelFormat, width: texture.width, height: texture.height, mipmapped: false)
    outTexture = device.newTextureWithDescriptor(outTextureDescriptor)

Invoking the Kernel Function

The next block of work is to set the textures and another variable on the kernel function and execute the shader. The first step is to instantiate a command buffer and a command encoder:

    let commandBuffer = commandQueue.commandBuffer()
    let commandEncoder = commandBuffer.computeCommandEncoder()

...then set the pipeline state (which we got from device.newComputePipelineStateWithFunction() earlier) and the textures on the command encoder:

    commandEncoder.setComputePipelineState(pipelineState)
    commandEncoder.setTexture(texture, atIndex: 0)
    commandEncoder.setTexture(outTexture, atIndex: 1)

The filter requires an additional parameter that defines the saturation amount. This is passed into the shader via an MTLBuffer. To populate the buffer, I've created a small struct:

    struct AdjustSaturationUniforms 
    {
        var saturationFactor: Float
    }

Then I use newBufferWithBytes() to pass in my saturationFactor float value:

    var saturationFactor = AdjustSaturationUniforms(saturationFactor: self.saturationFactor)
    var buffer: MTLBuffer = device.newBufferWithBytes(&saturationFactor, length: sizeof(AdjustSaturationUniforms), options: nil)
    commandEncoder.setBuffer(buffer, offset: 0, atIndex: 0)

This is now accessible inside the shader as an argument to its kernel function:

    constant AdjustSaturationUniforms &uniforms [[buffer(0)]]

Now I'm ready to invoke the function itself. Metal kernel functions use threadgroups to break up their workload into chunks. In my example, each threadgroup covers an 8 × 8 block of 64 pixels, and I dispatch enough threadgroups to span the entire texture, then send the work off to the GPU:

    let threadGroupCount = MTLSizeMake(8, 8, 1)
    let threadGroups = MTLSizeMake(texture.width / threadGroupCount.width, texture.height / threadGroupCount.height, 1)

    commandEncoder.dispatchThreadgroups(threadGroups, threadsPerThreadgroup: threadGroupCount)
    commandEncoder.endEncoding()
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
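
One caveat with this arithmetic: the integer division truncates, so a texture whose dimensions aren't exact multiples of eight would leave a strip of unprocessed pixels along its right and bottom edges. A sketch of rounding up instead (the kernel would then also want a bounds check against the texture's size before writing):

    let threadGroups = MTLSizeMake((texture.width + threadGroupCount.width - 1) / threadGroupCount.width,
                                   (texture.height + threadGroupCount.height - 1) / threadGroupCount.height,
                                   1)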

Converting the Texture to a UIImage

Finally, now that the kernel function has executed, we need to do the reverse of the above and get the image held in outTexture into a UIImage so it can be displayed. Again, I use a region to define the size and the texture's getBytes() to populate an array of UInt8:

    let imageSize = CGSize(width: texture.width, height: texture.height)
    let imageByteCount = Int(imageSize.width * imageSize.height * 4)
        
    let bytesPerRow = bytesPerPixel * UInt(imageSize.width)
    var imageBytes = [UInt8](count: imageByteCount, repeatedValue: 0)
    let region = MTLRegionMake2D(0, 0, Int(imageSize.width), Int(imageSize.height))
        
    outTexture.getBytes(&imageBytes, bytesPerRow: Int(bytesPerRow), fromRegion: region, mipmapLevel: 0)

Now that imageBytes holds the raw data, it's a few lines to create a CGImage:

    let providerRef = CGDataProviderCreateWithCFData(
            NSData(bytes: &imageBytes, length: imageBytes.count * sizeof(UInt8))
        )
        
    let bitmapInfo = CGBitmapInfo(CGBitmapInfo.ByteOrder32Big.toRaw() | CGImageAlphaInfo.PremultipliedLast.toRaw())
    let renderingIntent = kCGRenderingIntentDefault
        
    let imageRef = CGImageCreate(UInt(imageSize.width), UInt(imageSize.height), bitsPerComponent, bitsPerPixel, bytesPerRow, rgbColorSpace, bitmapInfo, providerRef, nil, false, renderingIntent)
        
    imageView.image = UIImage(CGImage: imageRef)

...and we're done! 
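
One thing to be aware of: waitUntilCompleted() blocks the calling thread until the GPU finishes, which is fine for a demo but less than ideal while dragging a slider. MTLCommandBuffer also offers addCompletedHandler(), which could be used to read the result back asynchronously; a sketch:

    commandBuffer.addCompletedHandler
    {
        (buffer: MTLCommandBuffer!) in

        // the handler runs off the main thread: read the texture's
        // bytes here, then hop to the main queue to update the UI
        dispatch_async(dispatch_get_main_queue())
        {
            // update imageView.image here
        }
    }

    commandBuffer.commit()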

Metal requires an A7 or A8 processor and this code has been built and tested under Xcode 6. All the source code is available in my GitHub repository.


