After reading Paul Hudson's excellent What's new in Swift 3.0 article, I thought I'd take an early dive into Swift 3.0 myself to see how it affects Core Image development. The first step is to install the latest Swift 3.0 snapshot and, again, Hacking with Swift has a great blog post explaining how to do this.

To illustrate the changes, I've created a small demo project which is available here. The repository has two branches: master, a working Swift 2 version, and Swift3, a working Swift 3 version.

The project is pretty simple stuff, but does include some important Core Image features: creating and registering a new filter, using a custom kernel and rendering a filter's output to a CGImage using a Core Image context. 

An important note: Swift 3.0 is currently under development - what's correct today may well change before final release! 

A Simple Threshold Filter

Let's kick off by looking at a custom Core Image filter: a simple threshold filter that returns black for pixels with a luminance value below a given value and white for pixels above. 

The first steps might well be to define some filter attributes:


    class ThresholdFilter: CIFilter    
    {
        var inputImage : CIImage?
        var inputThreshold: CGFloat = 0.75
    
        override var attributes: [String : AnyObject]
        {
            return [
                kCIAttributeFilterDisplayName: "Threshold Filter",
                "inputImage": [kCIAttributeIdentity: 0,
                    kCIAttributeClass: "CIImage",
                    kCIAttributeDisplayName: "Image",
                    kCIAttributeType: kCIAttributeTypeImage],
                "inputThreshold": [kCIAttributeIdentity: 0,
                    kCIAttributeClass: "NSNumber",
                    kCIAttributeDefault: 0.75,
                    kCIAttributeDisplayName: "Threshold",
                    kCIAttributeMin: 0,
                    kCIAttributeSliderMin: 0,
                    kCIAttributeSliderMax: 1,
                    kCIAttributeType: kCIAttributeTypeScalar]
            ]
        }

Pretty simple stuff, but it won't compile in Swift 3.0. It appears that, as part of proposal SE-0072, Fully Eliminate Implicit Bridging Conversions from Swift, the string constants kCIAttributeTypeImage and kCIAttributeTypeScalar no longer implicitly bridge to AnyObject. Furthermore, the dictionaries used to define the attribute properties for inputImage and inputThreshold also fail to bridge to AnyObject.

To fix this, we need to cast both the string constants and the attribute property dictionaries to something that does conform to AnyObject. For the strings, I've written a small extension:


    extension String
    {
        var nsString: NSString
        {
            return NSString(string: self)
        }
    }

...and updated the overridden attributes getter:


    override var attributes: [String : AnyObject]
    {
        return [
            kCIAttributeFilterDisplayName: "Threshold Filter",
            "inputImage": [kCIAttributeIdentity: 0,
                kCIAttributeClass: "CIImage",
                kCIAttributeDisplayName: "Image",
                kCIAttributeType: kCIAttributeTypeImage.nsString] as AnyObject,
            "inputThreshold": [kCIAttributeIdentity: 0,
                kCIAttributeClass: "NSNumber",
                kCIAttributeDefault: 0.75,
                kCIAttributeDisplayName: "Threshold",
                kCIAttributeMin: 0,
                kCIAttributeSliderMin: 0,
                kCIAttributeSliderMax: 1,
                kCIAttributeType: kCIAttributeTypeScalar.nsString] as AnyObject
        ]
    }
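
The kernel itself isn't listed above, but a minimal sketch of a thresholding CIColorKernel looks something like this (the Rec. 709 luma weights are my assumption; the demo project contains the definitive kernel source):

    // Returns black for pixels whose luminance falls below the threshold, white otherwise.
    // Luma weights are an assumption; see the demo project for the actual kernel.
    let thresholdKernel = CIColorKernel(string:
        "kernel vec4 thresholdKernel(__sample pixel, float threshold)" +
        "{" +
        "    float luma = dot(pixel.rgb, vec3(0.2126, 0.7152, 0.0722));" +
        "    return (luma < threshold) ? vec4(0.0, 0.0, 0.0, 1.0) : vec4(1.0, 1.0, 1.0, 1.0);" +
        "}")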

After creating a color kernel to do the thresholding, the filtering work is typically done inside a filter's overridden outputImage method:


    override var outputImage: CIImage!
    {
        guard let inputImage = inputImage,
            thresholdKernel = thresholdKernel else
        {
            return nil
        }
    
        let extent = inputImage.extent
        let arguments = [inputImage, inputThreshold]
    
        return thresholdKernel.applyWithExtent(extent, arguments: arguments)
    }

However, the same change to implicit bridging prevents this code from compiling. Recall from above that the inputThreshold attribute is of type CGFloat, which doesn't conform to AnyObject. The same is true of the other Swift numeric types, and the resolution is to declare numeric Core Image filter parameters as NSNumber:


    var inputThreshold: NSNumber = 0.75

As part of the renaming of methods in Swift 3.0, applyWithExtent has also changed, so the getter needs to look like:


    override var outputImage: CIImage!
    {
        guard let inputImage = inputImage,
            thresholdKernel = thresholdKernel else
        {
            return nil
        }
        
        let extent = inputImage.extent
        let arguments = [inputImage, inputThreshold]
        
        return thresholdKernel.apply(withExtent: extent, arguments: arguments)
    }

Registering the Filter

Now that the filter compiles, we need to register it and create a filter vendor to instantiate it by name. In Swift 2, the code would be:


    let CategoryCustomFilters = "Custom Filters"

    class CustomFiltersVendor: NSObject, CIFilterConstructor
    {
        static func registerFilters()
        {
            CIFilter.registerFilterName(
                "ThresholdFilter",
                constructor: CustomFiltersVendor(),
                classAttributes: [
                    kCIAttributeFilterCategories: [CategoryCustomFilters]
                ])
        }
        
        func filterWithName(name: String) -> CIFilter?
        {
            switch name
            {
            case "ThresholdFilter":
                return ThresholdFilter()
                
            default:
                return nil
            }
        }
    }

SE-0072 crops up again: CategoryCustomFilters is a Swift String, which doesn't conform to AnyObject, so it gets the same nsString treatment. Both registerFilterName and filterWithName have also been renamed. The updated code is:


    class CustomFiltersVendor: NSObject, CIFilterConstructor
    {
        static func registerFilters()
        {
            CIFilter.registerName(
                "ThresholdFilter",
                constructor: CustomFiltersVendor(),
                classAttributes: [
                    kCIAttributeFilterCategories: [CategoryCustomFilters.nsString]
                ])
        }
        
        func filter(withName name: String) -> CIFilter?
        {
            switch name
            {
            case "ThresholdFilter":
                return ThresholdFilter()
                
            default:
                return nil
            }
        }
    }
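
Registration has to happen before the filter category is queried or the filter instantiated by name; in the demo that's a single call, which (an assumption on my part about placement) could simply be the first thing viewDidLoad does:

    // Register the custom filter vendor once, before querying or creating the filter.
    CustomFiltersVendor.registerFilters()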

Querying and Executing the Filter

With the filter code in place and compiling, it's time to execute it. In a view controller's overridden viewDidLoad, we'll start by setting the background color:


    view.backgroundColor = UIColor.grayColor()

In this case, the "Color" part of grayColor is a bit spurious, and as part of "omit needless words" it's dropped, so the correct code is:


    view.backgroundColor = UIColor.gray()

This code makes the slightly dubious assumption that the first filter in the "Custom Filters" category is our threshold filter (it is just a demo!):


    guard let filterName = CIFilter.filterNamesInCategory(CategoryCustomFilters).first else
    {
        return
    }

However, that method name has changed to:


    guard let filterName = CIFilter.filterNames(inCategory: CategoryCustomFilters).first else
    {
        return
    }

Next, we'll define a threshold value and pass it to the filter in its constructor:


    let threshold = 0.5
    let mona = CIImage(image: UIImage(named: "monalisa.jpg")!)!
        
    let filter = CIFilter(
        name: filterName,
        withInputParameters: [kCIInputImageKey: mona, "inputThreshold": threshold])

Failed again! threshold is inferred as a Double which, as you may have guessed, doesn't conform to AnyObject. This is fixed by declaring it as an NSNumber:


    let threshold: NSNumber = 0.5
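
One step not shown above: the demo unwraps the filter's outputImage before rendering. Since both CIFilter(name:withInputParameters:) and outputImage are optional, something along these lines does the job (a sketch rather than the project's exact code):

    // Bail out if the filter couldn't be created or produced no output.
    guard let outputImage = filter?.outputImage else
    {
        return
    }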

The next step is to create a Core Graphics image from the filter's output:


    let context = CIContext()
    
    let final: CGImageRef = context.createCGImage(
        outputImage,
        fromRect: outputImage.extent)

Almost working. CGImageRef is no longer available in Swift, and the fromRect: argument label has been shortened, so the call becomes:


    let final: CGImage = context.createCGImage(
        outputImage,
        from: outputImage.extent)

Finally, we'll use the dimensions of final to construct a UIImageView of the correct size and populate its image property with the filter's output:


    let frame = CGRect(
        x: Int(view.bounds.midX) - CGImageGetWidth(final) / 2,
        y: Int(view.bounds.midY) - CGImageGetHeight(final) / 2,
        width: CGImageGetWidth(final),
        height: CGImageGetHeight(final))
    
    let imageView = UIImageView(frame: frame)
    
    imageView.image = UIImage(CGImage: final)

There are some nice changes here: no more calls to CGImageGetWidth and CGImageGetHeight, because the image now has width and height properties, and the CGImage label in the UIImage initializer is now cgImage:


    let frame = CGRect(
        x: Int(view.bounds.midX) - final.width / 2,
        y: Int(view.bounds.midY) - final.height / 2,
        width: final.width,
        height: final.height)
    
    let imageView = UIImageView(frame: frame)
    
    imageView.image = UIImage(cgImage: final)

Conclusion

If, like me, you haven't used NSNumber as the type for scalar Core Image filter attributes, now might be the time to start changing them. Of course, Swift 3.0 is still evolving and the changes made as part of SE-0072 may well change, but a close look at the Core Image documentation does suggest these should be NSNumber. Looks like this has changed in the May 31st version! 

I am halfway through updating Filterpedia to Swift 3.0, however there's a known issue in Swift 3.0 with UITableViewDataSource which is temporarily holding me back. 



It's been a fairly busy few months at my "proper" job, so my recreational Houdini tinkering has taken a bit of a back seat. However, when I saw my Swarm Chemistry hero, Hiroki Sayama, tweeting a link to How a life-like system emerges from a simple particle motion law, I thought I'd dust off Houdini to see if I could implement this model in VEX.

The paper discusses a simple particle system, named Primordial Particle Systems (PPS), that leads to life-like structures through morphogenesis. Each particle in the system is defined by its position and heading and, with each step in the simulation, alters its heading based on the PPS rule and moves forward at a defined speed. The heading is updated based on the number of neighbors to the particle's left and right. 
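
The rule itself, exactly as implemented in the wrangle below, is pleasingly terse: at each step the heading changes by alpha + beta * N * sign(R - L), where N is the total neighbor count and L and R are the neighbor counts to the particle's left and right.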

The project setup is super simple:



Inside a geometry node, I create a grid and randomly scatter 19,000 points across it. An attribute wrangle node assigns a random value to @angle:

    @angle = $PI * 2 * rand(@ptnum);

The real magic happens in another attribute wrangle, this time inside the solver.

In a nutshell, my VEX code iterates over each point's neighbors and sums the neighbor count to its left and right. To figure out the chirality, I use some simple trigonometry to rotate the vector defined by the current particle and the neighbor by the current particle's angle, then calculate the angle of the rotated vector. 

    while (pciterate(pointCloud)) {

        vector otherPosition;
        pcimport(pointCloud, "P", otherPosition);

        vector2 offsetPosition = set(otherPosition.x - @P.x, otherPosition.z - @P.z);
        float xx = offsetPosition.x * cos(-@angle) - offsetPosition.y * sin(-@angle);
        float yy = offsetPosition.x * sin(-@angle) + offsetPosition.y * cos(-@angle);

        float otherAngle = atan2(yy, xx);

        if (otherAngle >= 0) {
            L++;
        }
        else {
            R++;
        }
    }

After iterating over the nearby particles, I update the angle based on the PPS rule:

    float N = float(L + R);
    @angle += alpha + beta * N * sign(R - L);

...and, finally, I can update the particle's position based on its angle and speed:

    vector velocity = set(cos(@angle) * @speed, 0.0, sin(@angle) * @speed);
    @P += velocity;

Not quite finally, because to make things pretty, I update the color using the number of neighbors to control hue:

    @Cd = hsvtorgb(N / maxParticles, 1.0, 1.0);

Easy!

Solitons Emerging from Tweaked Model



I couldn't help tinkering with the published PPS math by making the speed a function of the number of local neighbors:

    @speed = 1.5 * (N / maxParticles);

In the video above, alpha is 182° and beta is -13°.

References

Schmickl, T. et al. How a life-like system emerges from a simple particle motion law. Sci. Rep. 6, 37969; doi: 10.1038/srep37969 (2016).

