Bokeh is an optical effect where out-of-focus regions of an image take on the shape of a camera's iris, often a polygon such as a hexagon or octagon. This effect can be simulated with both vImage and the Metal Performance Shaders (MPS) framework using the morphology operator, dilate. In this post, I'll look at simulating bokeh with MPS.
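Before reaching for MPS, it's worth seeing the operator in miniature. Here's a toy grayscale dilation in Python, not the MPS API, just the idea: every output pixel takes the maximum input value under the probe's footprint, so small bright highlights grow into the probe's shape. I treat a probe value of 0 as "include this tap" and anything else as "exclude it", which matches how the hexagonal probe is built later for normalized pixel values.

```python
def dilate(img, probe):
    # Toy grayscale dilation: each output pixel is the max of the input
    # over the probe's footprint. A probe value of 0 includes the tap,
    # anything else excludes it (for pixel values normalised to 0...1).
    ih, iw = len(img), len(img[0])
    r = len(probe) // 2
    out = [[0.0] * iw for _ in range(ih)]
    for y in range(ih):
        for x in range(iw):
            best = 0.0
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    if probe[dy + r][dx + r] != 0.0:
                        continue
                    sy, sx = y + dy, x + dx
                    if 0 <= sy < ih and 0 <= sx < iw:
                        best = max(best, img[sy][sx])
            out[y][x] = best
    return out
```

A single bright pixel dilated with an all-zero 3x3 probe grows into a 3x3 block, which is exactly how specular highlights become polygons in simulated bokeh.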

Creating a Hexagonal Probe

Dilate operators use a probe (also known as a kernel or structuring element) to define the shape that bright areas of an image expand into. In my recent talk, Image Processing for iOS, I demonstrated an example using vImage to create a starburst effect. In this project, I'll create a hexagonal probe using the technique I used recently to create lens flare.

MPSImageDilate accepts a probe in the form of an array of floats that is treated as a two-dimensional grid. Much like a convolution kernel, the grid's width and height must be odd numbers. So, the declaration of my MPS dilate operator is:


    lazy var dilate: MPSImageDilate =
    {
        var probe = [Float]()
        
        let size = 45
        let v = Float(size / 4)   // quarter of the probe size (integer division)
        let h = v * sqrt(3.0)     // the hexagon extends ±h horizontally and ±2v vertically
        let mid = Float(size) / 2
        
        for i in 0 ..< size
        {
            for j in 0 ..< size
            {
                let x = abs(Float(i) - mid)
                let y = abs(Float(j) - mid)
                
                // 0 inside the hexagon (full dilation weight), 1 outside
                let element = Float((x > h || y > v * 2.0) ?
                    1.0 :
                    ((2.0 * v * h - v * x - h * y) >= 0.0) ? 0.0 : 1.0)
                
                probe.append(element)
            }
        }
        
        let dilate = MPSImageDilate(
            device: self.device!,
            kernelWidth: size,
            kernelHeight: size,
            values: probe)
   
        return dilate
    }()
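The same loop ports directly to Python if you want to sanity-check the probe geometry outside of Xcode (print the grid and the hexagon pops out):

```python
import math

def hexagonal_probe(size=45):
    # Mirrors the Swift loop above: 0.0 inside the hexagon, 1.0 outside
    v = float(size // 4)
    h = v * math.sqrt(3.0)
    mid = size / 2.0
    probe = []
    for i in range(size):
        for j in range(size):
            x = abs(i - mid)
            y = abs(j - mid)
            outside = x > h or y > v * 2.0 or (2.0 * v * h - v * x - h * y) < 0.0
            probe.append(1.0 if outside else 0.0)
    return probe
```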

Executing the Dilate

Metal Performance Shaders work with Metal textures, so to apply the dilate to a UIImage, I use MetalKit's texture loader to convert the image to a texture. The syntax is pretty simple:


    lazy var imageTexture: MTLTexture =
    {
        let textureLoader = MTKTextureLoader(device: self.device!)
        let imageTexture:MTLTexture
        
        let sourceImage = UIImage(named: "DSC00773.jpg")!
        
        do
        {
            imageTexture = try textureLoader.newTextureWithCGImage(
                sourceImage.CGImage!,
                options: nil)
        }
        catch
        {
            fatalError("unable to create texture from image")
        }
        
        return imageTexture
    }()

Because Metal's coordinate system is upside-down compared to Core Graphics', the texture needs to be flipped. I use another MPS shader, MPSImageLanczosScale, with a y scale of -1:


    lazy var rotate: MPSImageLanczosScale =
    {
        let scale = MPSImageLanczosScale(device: self.device!)
        
        var tx = MPSScaleTransform(
            scaleX: 1,
            scaleY: -1,
            translateX: 0,
            translateY: Double(-self.imageTexture.height))
        
        withUnsafePointer(&tx)
        {
            scale.scaleTransform = $0
        }
        
        return scale
    }()
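Whatever the sign conventions of MPSScaleTransform, the net effect of the -1 y scale plus the height translation is a simple vertical flip: row r ends up at row height - 1 - r. A throwaway Python sketch of the same idea:

```python
def flip_vertically(rows):
    # Equivalent effect to scaling y by -1 and translating by the image
    # height: row r moves to row (height - 1 - r).
    return rows[::-1]
```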

The result of the dilation benefits from a slight Gaussian blur, which is also an MPS shader:


    lazy var blur: MPSImageGaussianBlur =
    {
        return MPSImageGaussianBlur(device: self.device!, sigma: 5)
    }()

Although MPS supports in-place filtering, I use intermediate textures between the scale, dilate, and blur passes. newTexture(width:height:) simplifies the process of creating them:


    func newTexture(width width: Int, height: Int) -> MTLTexture
    {
        let textureDescriptor = MTLTextureDescriptor.texture2DDescriptorWithPixelFormat(
            MTLPixelFormat.RGBA8Unorm,
            width: width,
            height: height,
            mipmapped: false)
        
        let texture = device!.newTextureWithDescriptor(textureDescriptor)

        return texture
    }

...which is used to create the destination textures for the scale and blur shaders:


    let rotatedTexture = newTexture(width: imageTexture.width, height: imageTexture.height)

    let dilatedTexture = newTexture(width: imageTexture.width, height: imageTexture.height)

To begin using MPS shaders, a command queue and a buffer need to be created:


    let commandQueue = device!.newCommandQueue()
    let commandBuffer = commandQueue.commandBuffer()

...and now I'm ready to execute the dilate and pass its result to the blur. The blur targets the MTKView's current drawable:


    rotate.encodeToCommandBuffer(
        commandBuffer,
        sourceTexture: imageTexture,
        destinationTexture: rotatedTexture)

    dilate.encodeToCommandBuffer(
        commandBuffer,
        sourceTexture: rotatedTexture,
        destinationTexture: dilatedTexture)
    
    blur.encodeToCommandBuffer(
        commandBuffer,
        sourceTexture: dilatedTexture,
        destinationTexture: currentDrawable.texture)

Finally, the command buffer is told to present the MTKView's drawable and is committed for execution:


    commandBuffer.presentDrawable(imageView.currentDrawable!)

    commandBuffer.commit()

There's a demo of this code available here.

Bokeh as a Core Image Filter

The demo is great, but to use my bokeh filter in more general contexts, I've wrapped it up as a Core Image filter that can be used like any other. You can find this implementation in Filterpedia:

It's worth noting that I've also created a masked variable bokeh filter, which applies this effect with a variable intensity based on the brightness of a separate mask image. You can read about this here.

If you'd like to learn more about wrapping Metal code up as Core Image filters, may I suggest my book, Core Image for Swift. Although I don't discuss MPS filters explicitly, I do discuss using Metal compute shaders for image processing.

Core Image for Swift is available from Apple's iBooks Store or, as a PDF, from Gumroad. IMHO, the iBooks version is better, especially as it contains video assets that the PDF version doesn't.



It's been a fairly busy few months at my "proper" job, so my recreational Houdini tinkering has taken a bit of a back seat. However, when I saw my Swarm Chemistry hero, Hiroki Sayama, tweeting a link to How a life-like system emerges from a simple particle motion law, I thought I'd dust off Houdini to see if I could implement this model in VEX.

The paper discusses a simple particle system, named Primordial Particle Systems (PPS), that leads to life-like structures through morphogenesis. Each particle in the system is defined by its position and heading and, with each step in the simulation, alters its heading based on the PPS rule and moves forward at a defined speed. The heading is updated based on the number of neighbors to the particle's left and right. 

The project setup is super simple:



Inside a geometry node, I create a grid and randomly scatter 19,000 points across it. An attribute wrangle node assigns a random value to @angle:

    @angle = $PI * 2 * rand(@ptnum);
The real magic happens inside another attribute wrangle inside the solver.

In a nutshell, my VEX code iterates over each point's neighbors and sums the neighbor count to its left and right. To figure out the chirality, I use some simple trigonometry to rotate the vector defined by the current particle and the neighbor by the current particle's angle, then calculate the angle of the rotated vector. 

    int L = 0;
    int R = 0;

    // assumed setup: search radius and maximum neighbour count as channel parameters
    float radius = ch("radius");
    int maxParticles = chi("maxParticles");
    int pointCloud = pcopen(0, "P", @P, radius, maxParticles);

    while (pciterate(pointCloud)) {

        vector otherPosition;
        pcimport(pointCloud, "P", otherPosition);

        vector2 offsetPosition = set(otherPosition.x - @P.x, otherPosition.z - @P.z);
        float xx = offsetPosition.x * cos(-@angle) - offsetPosition.y * sin(-@angle);
        float yy = offsetPosition.x * sin(-@angle) + offsetPosition.y * cos(-@angle);

        float otherAngle = atan2(yy, xx);

        if (otherAngle >= 0) {
            L++;
        }
        else {
            R++;
        }
    }
After iterating over the nearby particles, I update the angle based on the PPS rule (alpha and beta are parameters, in radians):

    float N = float(L + R);
    @angle += alpha + beta * N * sign(R - L);
...and, finally, I can update the particle's position based on its angle and speed:

    vector velocity = set(cos(@angle) * @speed, 0.0, sin(@angle) * @speed);
    @P += velocity;
Not quite finally, because to make things pretty, I update the color using the number of neighbors to control hue:

    @Cd = hsvtorgb(N / maxParticles, 1.0, 1.0);
Easy!
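For readers without Houdini, the whole update condenses into a few lines of Python. This is a naive O(n²) sketch of one PPS step in the plane; the parameter names are mine, angles are in radians, and unlike the point-cloud version it skips the particle itself:

```python
import math

def sign(x):
    return (x > 0) - (x < 0)

def pps_step(particles, alpha, beta, radius, speed):
    # particles: list of [x, y, angle]; returns the next state.
    out = []
    for px, py, a in particles:
        L = R = 0
        for qx, qy, _ in particles:
            ox, oy = qx - px, qy - py
            if (ox == 0.0 and oy == 0.0) or ox * ox + oy * oy > radius * radius:
                continue
            # rotate the offset into the particle's frame to find its side
            xx = ox * math.cos(-a) - oy * math.sin(-a)
            yy = ox * math.sin(-a) + oy * math.cos(-a)
            if math.atan2(yy, xx) >= 0:
                L += 1
            else:
                R += 1
        n = L + R
        # the PPS rule: turn by alpha plus beta scaled by crowding and chirality
        a += alpha + beta * n * sign(R - L)
        out.append([px + math.cos(a) * speed, py + math.sin(a) * speed, a])
    return out
```

An isolated particle simply advances along its heading; structure only emerges once neighborhoods overlap and the sign(R - L) term starts steering particles toward or away from each other.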

Solitons Emerging from a Tweaked Model



I couldn't help tinkering with the published PPS math by making the speed a function of the number of local neighbors:

    @speed = 1.5 * (N / maxParticles);
In the video above, alpha is 182° and beta is -13°.

References

Schmickl, T. et al. How a life-like system emerges from a simple particle motion law. Sci. Rep. 6, 37969; doi: 10.1038/srep37969 (2016).

