While writing Core Image for Swift, I've created quite a few custom Core Image filters, all of which have been added to my Filterpedia app. To demonstrate the potential of filters based on custom kernel routines, here's a little roundup of some of the recent ones I've written.

If you're writing a photo editing app or want to add a post effect to your SceneKit- or SpriteKit-based game or app, writing custom kernels can really help your product stand out from the crowd.

Color Directed Blur



This filter borrows a technique from the Kuwahara filter: it creates four square "windows" around the current pixel, in the north-east, north-west, south-east and south-west directions, and calculates the average color of each. The kernel then returns the averaged color closest to the current pixel, measuring the distance in a three-dimensional configuration space based on red, green and blue.

Color Directed Blur's final result is quite "painterly" - areas of similar colors are blurred, but edges are preserved. 
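To give a flavor of the approach, here's a much simplified sketch of the idea as a general Core Image kernel wrapped in Swift. The three-by-three sampling within each window, the radius parameter and the names are illustrative assumptions rather than the actual Filterpedia code:

import CoreImage

let colorDirectedBlurSource = """
vec3 windowAverage(sampler image, vec2 origin, vec2 direction, float radius)
{
    vec3 accumulator = vec3(0.0);

    // Average a three-by-three window offset towards the given quadrant.
    for (int i = 0; i <= 2; i++)
    {
        for (int j = 0; j <= 2; j++)
        {
            vec2 offset = direction * vec2(float(i), float(j)) * radius / 2.0;
            accumulator += sample(image, samplerTransform(image, origin + offset)).rgb;
        }
    }

    return accumulator / 9.0;
}

kernel vec4 colorDirectedBlur(sampler image, float radius)
{
    vec2 d = destCoord();
    vec3 thisColor = sample(image, samplerTransform(image, d)).rgb;

    // Average the four quadrant windows: NE, NW, SE and SW.
    vec3 ne = windowAverage(image, d, vec2( 1.0,  1.0), radius);
    vec3 nw = windowAverage(image, d, vec2(-1.0,  1.0), radius);
    vec3 se = windowAverage(image, d, vec2( 1.0, -1.0), radius);
    vec3 sw = windowAverage(image, d, vec2(-1.0, -1.0), radius);

    // Return whichever average is closest to this pixel in RGB space.
    vec3 result = ne;
    if (distance(nw, thisColor) < distance(result, thisColor)) { result = nw; }
    if (distance(se, thisColor) < distance(result, thisColor)) { result = se; }
    if (distance(sw, thisColor) < distance(result, thisColor)) { result = sw; }

    return vec4(result, 1.0);
}
"""

func colorDirectedBlur(image: CIImage, radius: CGFloat) -> CIImage? {
    guard let kernel = CIKernel(source: colorDirectedBlurSource) else { return nil }

    return kernel.apply(
        extent: image.extent,
        roiCallback: { _, rect in rect.insetBy(dx: -radius, dy: -radius) },
        arguments: [image, radius])
}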

Difference of Gaussians



My Difference of Gaussians filter subtracts one Gaussian blurred version of the input image from another. Although the radius of both blurs is the same, the sigma differs between the two. The blurring weight matrices are generated dynamically, and the blur filters are implemented as separable horizontal and vertical nine-pixel Core Image convolution kernels.

This filter uses existing Core Image filters in a composite filter rather than a custom kernel.
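As a rough sketch of how such a composite might be put together (the sigma values, the helper names and the use of CIDifferenceBlendMode, which takes an absolute difference, are assumptions of mine rather than necessarily how the Filterpedia filter does it):

import CoreImage
import Foundation

// Normalized weights for a nine-tap, one-dimensional Gaussian of the given sigma.
func gaussianWeights(sigma: Double) -> CIVector {
    let weights = (-4...4).map { exp(-Double($0 * $0) / (2 * sigma * sigma)) }
    let sum = weights.reduce(0, +)
    let normalized = weights.map { CGFloat($0 / sum) }
    return CIVector(values: normalized, count: 9)
}

// Separable blur: a horizontal then a vertical nine-pixel convolution.
func blurred(_ image: CIImage, sigma: Double) -> CIImage {
    let weights = gaussianWeights(sigma: sigma)
    return image
        .applyingFilter("CIConvolution9Horizontal", parameters: ["inputWeights": weights])
        .applyingFilter("CIConvolution9Vertical", parameters: ["inputWeights": weights])
}

func differenceOfGaussians(_ image: CIImage, sigmaA: Double = 0.75, sigmaB: Double = 3.0) -> CIImage {
    let narrow = blurred(image, sigma: sigmaA)
    let wide = blurred(image, sigma: sigmaB)

    // Take the (absolute) difference of the two blurred images.
    return narrow.applyingFilter("CIDifferenceBlendMode",
                                 parameters: [kCIInputBackgroundImageKey: wide])
}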

Polar Pixellate 




This filter uses code written by Jake Gundersen and Brad Larson from GPUImage. Rather than creating square pixels, it creates arc-shaped pixels surrounding an arbitrary center.
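The idea translates naturally to a Core Image warp kernel: quantize each destination coordinate's polar radius and angle around the center, then sample from the result. The sketch below illustrates that idea rather than being a direct port of the GPUImage shader, and the parameter names and defaults are my own:

import CoreImage

let polarPixellateSource = """
kernel vec2 polarPixellate(vec2 center, vec2 pixelArc)
{
    vec2 d = destCoord() - center;

    float radius = length(d);
    float angle = atan(d.y, d.x);

    // Quantize radius and angle to create arc-shaped 'pixels'.
    radius = radius - mod(radius, pixelArc.x) + 0.5 * pixelArc.x;
    angle = angle - mod(angle, pixelArc.y) + 0.5 * pixelArc.y;

    return center + vec2(radius * cos(angle), radius * sin(angle));
}
"""

func polarPixellate(image: CIImage, center: CGPoint,
                    arcWidth: CGFloat = 30, arcAngle: CGFloat = 0.1) -> CIImage? {
    guard let kernel = CIWarpKernel(source: polarPixellateSource) else { return nil }

    return kernel.apply(
        extent: image.extent,
        roiCallback: { _, _ in image.extent },
        image: image,
        arguments: [CIVector(cgPoint: center), CIVector(x: arcWidth, y: arcAngle)])
}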

Compound Eye



My Compound Eye filter simulates the eye of an insect or crustacean by creating multiple reflected images in a hexagonal grid. The lens size, lens refractive index and the background color can all be controlled. 

This filter is discussed at length in Core Image for Swift.
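Purely to give a flavor of the technique, here's a heavily reduced sketch that snaps each pixel to the nearest center of a hexagonal lattice and samples a scaled copy of the scene around that center. The lattice math, the simple magnification factor standing in for true refraction, and the parameter names are all assumptions rather than the filter's real kernel:

import CoreImage

let compoundEyeSource = """
// Nearest center of a hexagonal lattice, treated as two interleaved rectangular grids.
vec2 nearestHexCenter(vec2 point, float lensSize)
{
    vec2 spacing = vec2(3.0 * lensSize, sqrt(3.0) * lensSize);

    vec2 a = floor(point / spacing + 0.5) * spacing;
    vec2 b = floor((point - spacing * 0.5) / spacing + 0.5) * spacing + spacing * 0.5;

    return distance(point, a) < distance(point, b) ? a : b;
}

kernel vec4 compoundEye(sampler image, float lensSize, float magnification, vec4 backgroundColor)
{
    vec2 d = destCoord();
    vec2 center = nearestHexCenter(d, lensSize);
    vec2 offset = d - center;

    // Outside the circular lens inscribed in each facet, show the background color.
    if (length(offset) > lensSize * 0.85)
    {
        return backgroundColor;
    }

    // Each facet shows a scaled (and, for negative magnification, inverted) copy of the scene.
    return sample(image, samplerTransform(image, center + offset * magnification));
}
"""

A CIKernel built from this source needs an ROI callback that covers the whole input extent, because each facet can sample a long way from the destination pixel.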

Refracted Text


My Refracted Text filter accepts a string parameter which it renders as a virtual lens over a background image. It uses Core Image's Height Field from Mask filter, whose output is passed to a custom kernel that uses the height field's slope at each point to calculate a refraction vector. Both the refracted image and the background image can be individually blurred.

This filter is discussed at length in Core Image for Swift.
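The refraction step might look something like this sketch: a general kernel estimates the height field's slope with central differences and offsets the background lookup by that slope. The one-pixel sampling step, the scale parameter and the use of the red channel for height are my assumptions, not the book's implementation:

import CoreImage

let refractionSource = """
kernel vec4 refractBackground(sampler heightField, sampler background, float refractionScale)
{
    vec2 d = destCoord();

    // Estimate the height field's slope with central differences.
    float left  = sample(heightField, samplerTransform(heightField, d + vec2(-1.0,  0.0))).r;
    float right = sample(heightField, samplerTransform(heightField, d + vec2( 1.0,  0.0))).r;
    float below = sample(heightField, samplerTransform(heightField, d + vec2( 0.0, -1.0))).r;
    float above = sample(heightField, samplerTransform(heightField, d + vec2( 0.0,  1.0))).r;

    vec2 slope = vec2(right - left, above - below);

    // Use the slope as a refraction vector to offset the background lookup.
    return sample(background, samplerTransform(background, d + slope * refractionScale));
}
"""

func refractedText(textMask: CIImage, background: CIImage, refractionScale: CGFloat) -> CIImage? {
    // The height field comes from Core Image's built-in Height Field from Mask filter.
    let heightField = textMask.applyingFilter("CIHeightFieldFromMask",
                                              parameters: ["inputRadius": 15])

    guard let kernel = CIKernel(source: refractionSource) else { return nil }

    return kernel.apply(
        extent: background.extent,
        roiCallback: { _, _ in background.extent },
        arguments: [heightField, background, refractionScale])
}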

CMYK Registration Mismatch


Some color printing is based on four ink colors: cyan, magenta, yellow and black. If the registration of these colors isn't accurate, the final result displays blurred color ghosting. My CMYK Registration Mismatch filter simulates this effect by sampling cyan, magenta, yellow and black from surrounding pixels and merging them back to RGB colors for the final output.
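A sketch of that per-pixel logic as a general kernel might look like the following, using the simple naive RGB-to-CMYK conversion; the offset directions and parameter names are illustrative:

import CoreImage

let cmykMismatchSource = """
// Naive RGB to CMYK and back; good enough to illustrate the idea.
vec4 rgbToCMYK(vec3 rgb)
{
    float k = 1.0 - max(max(rgb.r, rgb.g), rgb.b);
    vec3 cmy = (1.0 - rgb - k) / max(1.0 - k, 0.0001);
    return vec4(cmy, k);
}

vec3 cmykToRGB(vec4 cmyk)
{
    return (1.0 - cmyk.xyz) * (1.0 - cmyk.w);
}

kernel vec4 cmykRegistrationMismatch(sampler image, float offset)
{
    vec2 d = destCoord();

    // Take each ink's value from a slightly different position.
    float c = rgbToCMYK(sample(image, samplerTransform(image, d + vec2( offset, 0.0))).rgb).x;
    float m = rgbToCMYK(sample(image, samplerTransform(image, d + vec2(-offset, 0.0))).rgb).y;
    float y = rgbToCMYK(sample(image, samplerTransform(image, d + vec2(0.0,  offset))).rgb).z;
    float k = rgbToCMYK(sample(image, samplerTransform(image, d + vec2(0.0, -offset))).rgb).w;

    return vec4(cmykToRGB(vec4(c, m, y, k)), 1.0);
}
"""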

CMYK Levels


My CMYK Levels filter applies individual multipliers to the cyan, magenta, yellow and black channels of an RGB image converted to CMYK. 
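Because this only needs the current pixel, it suits a color kernel. Here's a sketch using the same naive CMYK conversion, with the four multipliers packed into a vector; the names are mine rather than the Filterpedia filter's:

import CoreImage

let cmykLevelsSource = """
kernel vec4 cmykLevels(__sample pixel, vec4 multipliers)
{
    // Naive RGB to CMYK conversion.
    float k = 1.0 - max(max(pixel.r, pixel.g), pixel.b);
    vec3 cmy = (1.0 - pixel.rgb - k) / max(1.0 - k, 0.0001);

    // Apply the per-channel multipliers.
    cmy = clamp(cmy * multipliers.xyz, 0.0, 1.0);
    k = clamp(k * multipliers.w, 0.0, 1.0);

    // ...and back to RGB.
    return vec4((1.0 - cmy) * (1.0 - k), pixel.a);
}
"""

func cmykLevels(image: CIImage, cyan: CGFloat, magenta: CGFloat,
                yellow: CGFloat, black: CGFloat) -> CIImage? {
    guard let kernel = CIColorKernel(source: cmykLevelsSource) else { return nil }

    return kernel.apply(
        extent: image.extent,
        arguments: [image, CIVector(x: cyan, y: magenta, z: yellow, w: black)])
}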

Core Image for Swift

If you want to learn more about creating custom Core Image filters, may I recommend my book, Core Image for Swift. It discusses the different kernel types (color, warp and general) and is a great introduction to Core Image Kernel Language. It also includes template filters with all the boilerplate code you need to start writing custom kernels straight away.
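As a taste of what that boilerplate looks like, here's a minimal skeleton for a color filter; the class name, the no-op kernel body and the single input are placeholders rather than one of the book's templates verbatim:

import CoreImage

class CustomColorFilter: CIFilter {
    @objc dynamic var inputImage: CIImage?

    // The kernel is compiled once and shared by every instance of the filter.
    private static let kernel = CIColorKernel(source: """
        kernel vec4 customColor(__sample pixel)
        {
            // Replace this with your own per-pixel color logic.
            return pixel;
        }
        """)

    override var outputImage: CIImage? {
        guard let inputImage = inputImage,
              let kernel = CustomColorFilter.kernel else { return nil }

        return kernel.apply(extent: inputImage.extent, arguments: [inputImage])
    }
}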

If you're already familiar with writing shader code, you may want to take a look at my open source Sweetcorn app: it creates kernel routines using a node-based user interface and now supports both color and warp kernels.



Core Image for Swift is available from Apple's iBooks Store and, as a PDF, from Gumroad. IMHO, the iBooks version is better, especially as it contains video assets which the PDF version doesn't.





It's been a fairly busy few months at my "proper" job, so my recreational Houdini tinkering has taken a bit of a back seat. However, when I saw my Swarm Chemistry hero, Hiroki Sayama, tweeting a link to How a life-like system emerges from a simple particle motion law, I thought I'd dust off Houdini to see if I could implement this model in VEX.

The paper discusses a simple particle system, named Primordial Particle Systems (PPS), that leads to life-like structures through morphogenesis. Each particle in the system is defined by its position and heading and, with each step in the simulation, alters its heading based on the PPS rule and moves forward at a defined speed. The heading change is driven by the number of neighbors to the particle's left (L) and right (R): Δφ = α + β · N · sign(R − L), where N = L + R.

The project setup is super simple:



Inside a geometry node, I create a grid, and randomly scatter 19,000 points across it. An attribute wrangle node assigns a random value to @angle:
@angle = $PI * 2 * rand(@ptnum); 
The real magic happens inside another attribute wrangle inside the solver.

In a nutshell, my VEX code iterates over each point's neighbors and counts the neighbors to its left and right. To figure out the chirality, I use some simple trigonometry to rotate the vector defined by the current particle and the neighbor by the current particle's angle, then calculate the angle of the rotated vector.
// Open a point cloud around the current particle. The search radius and
// maximum neighbor count come from channels (the names here are assumed).
float radius = chf("radius");
int maxParticles = chi("max_particles");
int pointCloud = pcopen(0, "P", @P, radius, maxParticles);

int L = 0;
int R = 0;

while (pciterate(pointCloud)) {

    vector otherPosition;
    pcimport(pointCloud, "P", otherPosition);

    // Vector from the current particle to the neighbor, in the XZ plane.
    vector2 offsetPosition = set(otherPosition.x - @P.x, otherPosition.z - @P.z);

    // Rotate it by the particle's heading so that left and right are measured
    // relative to the direction of travel.
    float xx = offsetPosition.x * cos(-@angle) - offsetPosition.y * sin(-@angle);
    float yy = offsetPosition.x * sin(-@angle) + offsetPosition.y * cos(-@angle);

    float otherAngle = atan2(yy, xx);

    // In this convention, positive rotated angles count as left, negative as right.
    if (otherAngle >= 0) {
        L++;
    }
    else {
        R++;
    }
}
After iterating over the nearby particles, I update the angle based on the PPS rule (alpha and beta come from channels; the channel names and the degrees-to-radians conversion are assumed here):
float alpha = radians(chf("alpha")), beta = radians(chf("beta"));
float N = float(L + R);
@angle += alpha + beta * N * sign(R - L);
...and, finally, I can update the particle's position based on its angle and speed:
vector velocity = set(cos(@angle) * @speed, 0.0, sin(@angle) * @speed);  
@P += velocity;
Not quite finally, because to make things pretty, I update the color using the number of neighbors to control hue:
@Cd = hsvtorgb(N / maxParticles, 1.0, 1.0); 
Easy!

Solitons Emerging from Tweaked Model



I couldn't help tinkering with the published PPS math by making the speed a function of the number of local neighbors:
@speed = 1.5 * (N / maxParticles);
In the video above, alpha is 182° and beta is -13°.

References

Schmickl, T. et al. How a life-like system emerges from a simple particle motion law. Sci. Rep. 6, 37969; doi: 10.1038/srep37969 (2016).

