I've spent some time working on my iPad reaction diffusion simulation app, written in Swift using Apple's Metal technology. As well as adding two more models, Gray-Scott and Belousov-Zhabotinsky, I've tweaked the code so that I'm getting around 1,000 solver iterations per second on an iPad Air 2.
The applyFilter() function inside my view controller now adds multiple compute commands to the command encoder in each call, and only after committing them all does it write the resulting texture to a UIImage instance and then to the UIImageView. Each reaction diffusion model defines its own number of iterations per frame, so the solver part of applyFilter() now looks like this:
let commandBuffer = commandQueue.commandBuffer()
let commandEncoder = commandBuffer.computeCommandEncoder()
commandEncoder.setComputePipelineState(pipelineState)
let buffer: MTLBuffer = device.newBufferWithBytes(&reactionDiffusionModel.reactionDiffusionStruct, length: sizeof(ReactionDiffusionParameters), options: nil)
commandEncoder.setBuffer(buffer, offset: 0, atIndex: 0)
for _ in 0 ..< reactionDiffusionModel.iterationsPerFrame
{
if useTextureAForInput
{
commandEncoder.setTexture(textureA, atIndex: 0)
commandEncoder.setTexture(textureB, atIndex: 1)
}
else
{
commandEncoder.setTexture(textureB, atIndex: 0)
commandEncoder.setTexture(textureA, atIndex: 1)
}
commandEncoder.dispatchThreadgroups(threadGroups, threadsPerThreadgroup: threadGroupCount)
useTextureAForInput = !useTextureAForInput
}
commandEncoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()
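The threadGroups and threadGroupCount values used in the dispatch call are set up once, outside applyFilter(). As a rough sketch for the 640x640 textures, assuming a 16x16 threadgroup size (my guess, not necessarily what the app uses):
// One-off threadgroup set up for a 640 x 640 texture. The 16 x 16
// threadgroup size here is an assumption, not necessarily the app's value.
let threadGroupCount = MTLSizeMake(16, 16, 1)
let threadGroups = MTLSizeMake(640 / threadGroupCount.width,
    640 / threadGroupCount.height,
    1)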
Since the application hovers around 50fps and two of the models request 20 iterations per frame, that's 1,000 solver iterations per second on a 640x640 grid, which, IMHO, is pretty impressive. I've even touched 60fps or 1,200 iterations per second!
The first new model I added is a simulation of the famous Belousov-Zhabotinsky chemical reaction, with its characteristic spiral wavefronts:
The Metal kernel function for this is pretty simple and demonstrates the use of for loops inside GPU code:
kernel void belousovZhabotinskyShader(texture2d<float, access::read> inTexture [[texture(0)]],
texture2d<float, access::write> outTexture [[texture(1)]],
constant ReactionDiffusionParameters &params [[buffer(0)]],
uint2 gid [[thread_position_in_grid]])
{
float3 accumColor = inTexture.read(gid).rgb;
for (int j = -1; j <= 1; j++)
{
for (int i = -1; i <= 1; i++)
{
uint2 kernelIndex(gid.x + i, gid.y + j);
accumColor += inTexture.read(kernelIndex).rgb;
}
}
accumColor.rgb = accumColor.rgb / 9.0f;
float a = accumColor.r + accumColor.r * ((params.alpha * accumColor.g) - (params.gamma * accumColor.b));
float b = accumColor.g + accumColor.g * ((params.beta * accumColor.b) - (params.alpha * accumColor.r));
float c = accumColor.b + accumColor.b * ((params.gamma * accumColor.r) - (params.beta * accumColor.g));
float4 outColor(a, b, c, 1);
outTexture.write(outColor, gid);
}
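On the Swift side, each model supplies the iterationsPerFrame value used in the applyFilter() loop above and the array of field names used by the parameter editor described later. I haven't reproduced the app's actual class here, but a minimal sketch of a Belousov-Zhabotinsky model, assuming the base class exposes those two properties as variables, could look like:
// A sketch only: class, property and enum case names are assumptions based
// on the kernel above and the editor code later in this post.
class BelousovZhabotinskyModel: ReactionDiffusionBase
{
    override init()
    {
        super.init()

        iterationsPerFrame = 20                 // compute dispatches encoded per frame
        fieldNames = [.alpha, .beta, .gamma]    // one ParameterWidget per field
    }
}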
The second model I've added is Gray-Scott, which is probably the best known of all reaction diffusion models and can exhibit phenomena such as solitons, bifurcation and coral-like patterns:
Again, the kernel function contains little more than some pretty basic arithmetic:
kernel void grayScottShader(texture2d<float, access::read> inTexture [[texture(0)]],
texture2d<float, access::write> outTexture [[texture(1)]],
constant ReactionDiffusionParameters &params [[buffer(0)]],
uint2 gid [[thread_position_in_grid]])
{
const uint2 northIndex(gid.x, gid.y - 1);
const uint2 southIndex(gid.x, gid.y + 1);
const uint2 westIndex(gid.x - 1, gid.y);
const uint2 eastIndex(gid.x + 1, gid.y);
const float3 northColor = inTexture.read(northIndex).rgb;
const float3 southColor = inTexture.read(southIndex).rgb;
const float3 westColor = inTexture.read(westIndex).rgb;
const float3 eastColor = inTexture.read(eastIndex).rgb;
const float3 thisColor = inTexture.read(gid).rgb;
const float2 laplacian = (northColor.rb + southColor.rb + westColor.rb + eastColor.rb) - (4.0 * thisColor.rb);
const float reactionRate = thisColor.r * thisColor.b * thisColor.b;
const float u = thisColor.r + (params.Du * laplacian.r) - reactionRate + params.F * (1.0 - thisColor.r);
const float v = thisColor.b + (params.Dv * laplacian.g) + reactionRate - (params.F + params.K) * thisColor.b;
const float4 outColor(u, u, v, 1);
outTexture.write(outColor, gid);
}
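Both kernels read their constants from the same ReactionDiffusionParameters buffer that applyFilter() copies across with newBufferWithBytes(). The Swift side of that struct isn't shown here, but it presumably looks something like this sketch; the field names come from the two kernels, the values are placeholders, and the field order and types have to match the Metal struct's memory layout:
// A sketch of the shared parameters struct. The placeholder values are not
// the app's defaults, and the real struct has further fields (timestep, a0,
// a1, epsilon, ...) used by other models.
struct ReactionDiffusionParameters
{
    // Belousov-Zhabotinsky
    var alpha: Float = 1.0
    var beta: Float = 1.0
    var gamma: Float = 1.0

    // Gray-Scott
    var F: Float = 0.05
    var K: Float = 0.06
    var Du: Float = 0.15
    var Dv: Float = 0.05
}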
The parameter editor panel on the right is fully dynamic. Each model contains an array of its field names, which is used inside the ReactionDiffusionEditor's createUserInterface() method to add a new ParameterWidget (basically a slider and a label) for each parameter:
parameterWidgets = [ParameterWidget]()
for fieldName in reactionDiffusionModel.fieldNames
{
let widget = ParameterWidget(frame: CGRectZero)
parameterWidgets.append(widget)
widget.minimumValue = reactionDiffusionModel.getMinMaxForFieldName(fieldName).min
widget.maximumValue = reactionDiffusionModel.getMinMaxForFieldName(fieldName).max
widget.value = reactionDiffusionModel.getValueForFieldName(fieldName)
widget.reactionDiffusionFieldName = fieldName
widget.addTarget(self, action: "widgetChangeHandler:", forControlEvents: UIControlEvents.ValueChanged)
addSubview(widget)
}
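The widgetChangeHandler: wired up in the addTarget() call just pushes the slider's new value back into the model via the field name stored on the widget. As a sketch, assuming the editor holds a reference to the current model and that ParameterWidget exposes its value as a Float:
// Sketch of the change handler: writes the widget's new value back into the
// model using the field name stored on the widget when it was created.
func widgetChangeHandler(widget: ParameterWidget)
{
    reactionDiffusionModel.setValueForFieldName(widget.reactionDiffusionFieldName,
        value: widget.value)
}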
Currently, my implementation of these three models could do with improvement. Swift doesn't support the kind of reflection that would let me get and set values in the models via square brackets (e.g. x = model["propertyName"]). This has led to some pretty hefty switch statements inside my ReactionDiffusionBase class:
func setValueForFieldName(fieldName: ReactionDiffusionFieldNames, value: Float)
{
switch(fieldName)
{
case .timestep:
reactionDiffusionStruct.timestep = value
case .a0:
reactionDiffusionStruct.a0 = value
case .a1:
reactionDiffusionStruct.a1 = value
case .epsilon:
reactionDiffusionStruct.epsilon = value
[...]
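The matching getValueForFieldName() and getMinMaxForFieldName() methods used by the editor need the same treatment, so every new parameter means touching several of these switch statements. The getter, for example, presumably mirrors the setter above (a sketch, elided in the same way):
func getValueForFieldName(fieldName: ReactionDiffusionFieldNames) -> Float
{
    switch(fieldName)
    {
    case .timestep:
        return reactionDiffusionStruct.timestep
    case .a0:
        return reactionDiffusionStruct.a0
    [...]
    }
}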
This application will probably form my first Swift-based submission to the App Store. Since Ready, the giant on whose shoulders I stand, has its code available under GNU's General Public License, I plan to do the same with mine.
However, there's still lots to do. My next step, which is something I've been planning to look at for months, is to implement Core Data for persistence and saving parameter settings. In the meantime, my source code is all available in my GitHub repository.