Whilst Core Image has an amazing range of image filters, the vImage API in the iOS SDK's Accelerate framework includes some impressive histogram functions that Core Image lacks. These include histogram specification, pictured above, which applies a histogram (e.g. one calculated from another image) to an image, and contrast stretching, which normalises an image's values across the full range of intensities.
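
Conceptually, a contrast stretch finds the darkest and lightest values in each channel and linearly remaps everything between them, so a value v becomes v′ = (v − min) × 255 / (max − min): the darkest pixel maps to 0 and the lightest to 255.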

vImage's API isn't super friendly to Swift developers, so this blog post demonstrates how to use these functions. The examples originate from a talk I gave at ProgSCon, and you can see the slide show for the talk here.

Converting Images to vImage Buffers and Vice Versa

Much like Core Image has its own CIImage format, vImage uses its own format for image data: vImage_Buffer. Buffers can easily be created from a Core Graphics CGImage in a few steps. First of all, we need to define a format: this will be 8 bits per channel and four channels per pixel: red, green, blue and, lastly, alpha:


// Four 8-bit channels per pixel, with alpha last
let bitmapInfo: CGBitmapInfo = CGBitmapInfo(
    rawValue: CGImageAlphaInfo.Last.rawValue)

var format = vImage_CGImageFormat(
    bitsPerComponent: 8,
    bitsPerPixel: 32,
    colorSpace: nil,
    bitmapInfo: bitmapInfo,
    version: 0,
    decode: nil,
    renderingIntent: .RenderingIntentDefault)

Given a UIImage, we can pass its CGImage to vImageBuffer_InitWithCGImage(). This function also needs an empty buffer, which it will populate with the bitmap data of the image:


let sky = UIImage(named: "sky.jpg")!


var inBuffer = vImage_Buffer()

// Populates inBuffer with the image's bitmap data; this returns a
// vImage_Error that production code should check
vImageBuffer_InitWithCGImage(
  &inBuffer,
  &format,
  nil,
  sky.CGImage!,
  UInt32(kvImageNoFlags))

Once we're done filtering, converting a vImage buffer back to a UIImage is just as simple. This code is nice to implement as a convenience initialiser in an extension to UIImage. Here we accept a buffer as an argument, create a mutable copy and pass it to vImageCreateCGImageFromBuffer to create a CGImage:


extension UIImage
{
  convenience init?(fromvImageOutBuffer outBuffer: vImage_Buffer)
  {
    // vImageCreateCGImageFromBuffer() requires a mutable buffer,
    // so we work on a copy of the immutable argument
    var mutableBuffer = outBuffer
    var error = vImage_Error()
    
    let cgImage = vImageCreateCGImageFromBuffer(
      &mutableBuffer,
      &format,
      nil,
      nil,
      UInt32(kvImageNoFlags),
      &error)
    
    // Honour the failable initialiser if vImage reports a problem
    guard error == kvImageNoError else
    {
      return nil
    }
    
    self.init(CGImage: cgImage.takeRetainedValue())
  }
}

Contrast Stretching

vImage filters generally accept a source image buffer and write the result to a destination buffer. The technique above creates the source buffer, but we also need an empty buffer with the same dimensions as the source to act as the destination:


let sky = UIImage(named: "sky.jpg")!

let imageRef = sky.CGImage

// Allocate memory for the destination bitmap: bytes-per-row × height
let pixelBuffer = malloc(CGImageGetBytesPerRow(imageRef) * CGImageGetHeight(imageRef))

var outBuffer = vImage_Buffer(
  data: pixelBuffer,
  height: UInt(CGImageGetHeight(imageRef)),
  width: UInt(CGImageGetWidth(imageRef)),
  rowBytes: CGImageGetBytesPerRow(imageRef))

With the inBuffer and outBuffer in place, executing the stretch filter is as simple as:


vImageContrastStretch_ARGB8888(
  &inBuffer,
  &outBuffer,
  UInt32(kvImageNoFlags))

...and we can now use the initialiser above to create a UIImage from the outBuffer:


let outImage = UIImage(fromvImageOutBuffer: outBuffer)

Finally, the pixel buffer created to hold the bitmap data for outBuffer needs to be freed:


free(pixelBuffer)
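
To avoid repeating this boilerplate for every filter, the steps above can be rolled into a single function. Here's a minimal sketch (contrastStretch() is my own convenience, not part of vImage), assuming the format constant and the UIImage extension defined earlier are in scope:


func contrastStretch(image: UIImage) -> UIImage?
{
  let imageRef = image.CGImage
  
  // Source buffer, populated with the image's bitmap data
  var inBuffer = vImage_Buffer()
  
  vImageBuffer_InitWithCGImage(
    &inBuffer,
    &format,
    nil,
    image.CGImage!,
    UInt32(kvImageNoFlags))
  
  // Destination buffer with the same dimensions as the source
  let pixelBuffer = malloc(CGImageGetBytesPerRow(imageRef) * CGImageGetHeight(imageRef))
  
  var outBuffer = vImage_Buffer(
    data: pixelBuffer,
    height: UInt(CGImageGetHeight(imageRef)),
    width: UInt(CGImageGetWidth(imageRef)),
    rowBytes: CGImageGetBytesPerRow(imageRef))
  
  // Free both bitmaps on exit: vImageCreateCGImageFromBuffer copies
  // the data, so neither allocation is needed once the UIImage exists
  defer
  {
    free(inBuffer.data)
    free(pixelBuffer)
  }
  
  vImageContrastStretch_ARGB8888(
    &inBuffer,
    &outBuffer,
    UInt32(kvImageNoFlags))
  
  return UIImage(fromvImageOutBuffer: outBuffer)
}

With that in place, contrastStretch(sky) performs the whole round trip in one call.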

Contrast stretching can give some great results. This rather flat image:



Now looks like this:




Conclusion

vImage, although suffering from a rather unfriendly API, is a tremendously powerful framework. Along with a suite of histogram operations, it has functions for de-convolving images (e.g. de-blurring) and some interesting morphology functions (e.g. dilating with a kernel which can be used for lens effects such as star burst and bokeh simulation). 
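
As a taster, here's a minimal morphology sketch of my own (not from the original talk), reusing the inBuffer and outBuffer from earlier. Dilating with an all-zero kernel reduces vImageDilate_ARGB8888() to a plain maximum filter that spreads bright regions:


// A 7×7 kernel of zeroes: each output pixel is simply the maximum
// of its 7×7 neighbourhood
let kernel = [UInt8](count: 49, repeatedValue: 0)

vImageDilate_ARGB8888(
  &inBuffer,
  &outBuffer,
  0,
  0,
  kernel,
  7,
  7,
  UInt32(kvImageNoFlags))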

My Image Processing for iOS slide deck explores these and other filters, and the companion Swift project contains demonstration code, including everything in this post. I've also wrapped up this vImage code in a Core Image filter wrapper, which is available in my Filterpedia app.

It's worth noting that the Metal Performance Shaders framework also includes these histogram operations.

To learn more about histogram specification, see vImage Histogram Functions Part II: Specification.


Addendum - Contrast Stretching with Core Image

Many thanks to Simonas Bastys of Pixelmator who pointed out that Core Image does indeed have contrast stretching through its auto-adjustment filters.

With the sky image, Core Image can create a tone curve filter with the necessary values using this code:


let sky = CIImage(image: UIImage(named: "sky.jpg")!)!

let toneCurve = sky
    .autoAdjustmentFiltersWithOptions(nil)
    .filter({ $0.name == "CIToneCurve" })
    .first
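
To see which points the auto-adjustment picked, the tone curve's five input points can be read back with key-value coding (this loop is just my own quick diagnostic):


if let toneCurve = toneCurve
{
    for i in 0 ... 4
    {
        print("inputPoint\(i):", toneCurve.valueForKey("inputPoint\(i)"))
    }
}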

The points create a curve that looks like this:



Which, when applied to the sky image:


if let toneCurve = toneCurve
{
    toneCurve.setValue(sky, forKey: kCIInputImageKey)
    
    let final = toneCurve.outputImage
}

...yields this image:



Thanks again Simonas!

FYI, the histograms for the images look like this (thanks to Pixelmator). vImage stretches the most!


Original image:



vImage contrast stretch:



Core Image contrast stretch:






It's been a fairly busy few months at my "proper" job, so my recreational Houdini tinkering has taken a bit of a back seat. However, when I saw my Swarm Chemistry hero, Hiroki Sayama, tweeting a link to How a life-like system emerges from a simple particle motion law, I thought I'd dust off Houdini to see if I could implement this model in VEX.

The paper discusses a simple particle system, named Primordial Particle Systems (PPS), that leads to life-like structures through morphogenesis. Each particle in the system is defined by its position and heading and, with each step in the simulation, alters its heading based on the PPS rule and moves forward at a defined speed. The heading is updated based on the number of neighbors to the particle's left and right. 
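
The rule itself is beautifully compact: with L and R the neighbor counts to the particle's left and right, N = L + R, and constant parameters α and β, each step turns the particle's heading φ by Δφ = α + β · N · sign(R − L).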

The project setup is super simple: 



Inside a geometry node, I create a grid and randomly scatter 19,000 points across it. An attribute wrangle node assigns a random value to @angle:
@angle = $PI * 2 * rand(@ptnum); 
The real magic happens inside another attribute wrangle inside the solver.

In a nutshell, my VEX code iterates over each point's neighbors and counts how many fall to its left and to its right. To figure out the chirality (which side a neighbor is on), I use some simple trigonometry: rotate the vector from the current particle to the neighbor by the current particle's angle, then take the sign of the rotated vector's angle. 
// My assumption: the search radius and maximum neighbor count are
// channel sliders on the wrangle
float radius = ch("radius");
int maxParticles = chi("maxParticles");

int L = 0;
int R = 0;

// Gather the neighbors within radius of the current particle
int pointCloud = pcopen(0, "P", @P, radius, maxParticles);

while(pciterate(pointCloud)) {

    vector otherPosition;
    pcimport(pointCloud, "P", otherPosition);

    // Offset to the neighbor in the XZ plane...
    vector2 offsetPosition = set(otherPosition.x - @P.x, otherPosition.z - @P.z);

    // ...rotated by the particle's heading
    float xx = offsetPosition.x * cos(-@angle) - offsetPosition.y * sin(-@angle);
    float yy = offsetPosition.x * sin(-@angle) + offsetPosition.y * cos(-@angle);

    float otherAngle = atan2(yy, xx);

    // A positive rotated angle means the neighbor sits to the left
    if (otherAngle >= 0) {
        L++;
    }
    else {
        R++;
    }
}
After iterating over the nearby particles, I update the angle based on the PPS rule:
// My assumption: alpha and beta are channel sliders in degrees
// (e.g. 182 and -13), converted here to radians
float alpha = radians(ch("alpha"));
float beta = radians(ch("beta"));

float N = float(L + R);
@angle += alpha + beta * N * sign(R - L);
...and, finally, I can update the particle's position based on its angle and speed:
vector velocity = set(cos(@angle) * @speed, 0.0, sin(@angle) * @speed);
@P += velocity;
Not quite finally, because to make things pretty, I update the color using the number of neighbors to control hue:
@Cd = hsvtorgb(N / maxParticles, 1.0, 1.0); 
Easy!

Solitons Emerging from a Tweaked Model



I couldn't help tinkering with the published PPS math by making the speed a function of the number of local neighbors:
@speed = 1.5 * (N / maxParticles);
In the video above, alpha is 182° and beta is -13°.

References

Schmickl, T. et al. How a life-like system emerges from a simple particle motion law. Sci. Rep. 6, 37969; doi: 10.1038/srep37969 (2016).

