AppGameKit/AppGameKit Studio Showcase / Compute Shaders Plugin

Lucas Tiridath
AGK Developer
Joined: 28th Sep 2008
Location: Kings Langley, UK
Posted: 24th Mar 2018 20:47
Hi. I had fun making my last plugin, so I thought I'd try my hand at adding support for compute shaders to AppGameKit. You can download the initial release from GitHub.

The release download includes binaries for Windows and Linux, as well as documentation of the plugin commands and a couple of examples showing how the plugin can be used. To use the plugin, you'll need a GPU and driver supporting OpenGL 4.3 or better. As such, I'm not able to port the plugin to macOS, as Apple stopped developing its OpenGL driver at OpenGL 4.1. If there's enough interest in this kind of functionality though, I will consider doing an OpenCL plugin, which would be able to target macOS too.

In case you're not aware, compute shaders are general-purpose, standalone shaders designed to make it easier to write arbitrary programs that run on the GPU. This can be particularly useful for games, which can often benefit from the massive power of the GPU in ways beyond simply rendering graphics. For example, computationally expensive tasks like AI, physics, audio processing, or collision detection could all be offloaded onto the GPU using compute shaders.
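
To give a feel for what one looks like, here's a minimal compute shader in standard OpenGL 4.3 GLSL (a generic sketch, not specific to the plugin) that inverts the colours of an image:

#version 430

// Each work group processes a 16x16 tile of the image.
layout(local_size_x = 16, local_size_y = 16) in;

// The image to read and write, bound by the host program.
layout(rgba8, binding = 0) uniform image2D img;

void main()
{
    // gl_GlobalInvocationID identifies this invocation across all work groups.
    ivec2 texel = ivec2(gl_GlobalInvocationID.xy);

    // Invert the colour, leaving alpha untouched.
    vec4 colour = imageLoad(img, texel);
    imageStore(img, texel, vec4(1.0 - colour.rgb, colour.a));
}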

The most important part of the plugin is the ability to load (with Compute.LoadShader) and run (with Compute.RunShader) compute shaders from AppGameKit. As with normal AppGameKit shaders, you can set uniforms as inputs, and can bind AppGameKit images as inputs or outputs. In addition, the plugin adds the concept of a buffer, which can also be handed to the shader to operate on as either an input or an output. Buffers interact nicely with AppGameKit memblocks, so you can create buffers from memblocks, memblocks from buffers, and copy data between the two.
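
In AGK script, using it looks roughly like this. Compute.LoadShader and Compute.RunShader are the real commands; the plugin alias and the image-binding command below are illustrative placeholders, so check the bundled documentation for the exact names and parameters:

// Import the plugin (the plugin folder name here is a guess).
#import_plugin ComputeShaders as Compute

// Load the compute shader source and bind an AGK image for it to work on.
shader = Compute.LoadShader("invert.glsl")
img = LoadImage("photo.png")
Compute.SetShaderImage(shader, img, 0)  // hypothetical binding command; see the docs

// Dispatch enough 16x16 work groups to cover the whole image
// (the exact RunShader signature may differ; see the docs).
Compute.RunShader(shader, GetImageWidth(img) / 16, GetImageHeight(img) / 16, 1)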

This is just a first release, so of course I would be happy to hear any feedback or suggestions you have for the plugin. Hopefully you'll find it useful!
BatVink
Moderator
Joined: 4th Apr 2003
Location: Gods own County, UK
Posted: 15th Apr 2018 17:43
Sounds intriguing.
Maybe one day I will understand how to take advantage of this!
I'm just starting to work with machine learning, and it sounds like the amount of computation required in an ML scenario would benefit from this.
george++
AGK Tool Maker
Joined: 13th May 2007
Location: Thessaloniki, Hellas
Posted: 16th Apr 2018 04:49 Edited at: 16th Apr 2018 04:51
Your plugin is really interesting. Personally I don't need it, since my programs are simple, but your post gives an idea of where PC architecture is heading. So far we have a GPU dedicated to graphics; why don't we have another chip dedicated to general computation?
Richard_6
Joined: 3rd Feb 2017
Posted: 17th Apr 2018 13:01 Edited at: 17th Apr 2018 13:02
I like the idea, George! It reminds me of the Amiga, with its several independent custom chips. Unfortunately, it seems everything now is too centralized in the CPU, even though we have powerful GPUs.
Lucas Tiridath
AGK Developer
Joined: 28th Sep 2008
Location: Kings Langley, UK
Posted: 21st Apr 2018 10:06
Thanks for the feedback!

Yes machine learning would be an ideal application for GPGPU programming. I think you're probably right that in the future we'll see ever more diverse chipsets in our machines, in addition to the traditional CPU and GPU. We're already seeing lots of dedicated ASICs for things like cryptocurrency mining and artificial neural network acceleration, and I wouldn't be surprised if we got more in the years to come. Of course even on the CPU, we've now got lots of mobiles with heterogeneous cores, so you have a few big cores for powering games and stuff, and then some smaller cores with lower power consumption for background tasks. It's a really exciting area.

An advantage of an OpenCL plugin would be that OpenCL can target many different types of devices, rather than just GPUs, whereas I believe compute shaders are strictly GPU only.
BatVink
Moderator
Joined: 4th Apr 2003
Location: Gods own County, UK
Posted: 22nd Apr 2018 08:34
Does anyone remember the revolution that was the 486 DX CPU?
It was the first x86 chip to integrate the maths co-processor on the die, rather than requiring a separate x87 chip. At the time, it was a huge leap forward, especially for games that could offload intensive floating-point calculations onto it.
Richard_6
Joined: 3rd Feb 2017
Posted: 22nd Apr 2018 13:30
I remember that I had a 286 and bought a math co-processor separately to be able to work with 3D Studio.
Lucas Tiridath
AGK Developer
Joined: 28th Sep 2008
Location: Kings Langley, UK
Posted: 22nd Apr 2018 20:56
Quote: "Does anyone remember the revolution that was the 486 DX CPU?"

Sorry, too young to remember that

<tangent>I did have some fun doing some bare metal programming on my first generation Raspberry Pi the other day though. It turns out the base ARMv6 ISA doesn't include an FPU, and there was no way I was writing my own floating-point math library, so I ended up using good old-fashioned fixed point instead.</tangent>

If I get time, I'll have a look into doing an OpenCL plugin too. Do you think it would help to provide more examples for the plugin? I've added a couple already, showing how it can be used for image processing and general purpose computation, but if machine learning and AI are areas people are interested in, then maybe an ANN example or something similar would be useful?
blink0k
Moderator
Joined: 22nd Feb 2013
Location: the land of oz
Posted: 25th Apr 2018 06:06
This would be super handy for logic that requires recursion.
Lucas Tiridath
AGK Developer
Joined: 28th Sep 2008
Location: Kings Langley, UK
Posted: 26th Apr 2018 17:45
How do you mean? I'm afraid shader languages like GLSL and OpenCL C don't typically support recursion, as many older GPUs don't have the capability to do it. Or were you thinking of somehow recursively dispatching compute kernels from the CPU side?
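
For what it's worth, the usual workaround is to restructure the recursion as a loop with an explicit stack. As a rough GLSL sketch (hypothetical code, nothing plugin-specific), here's how you might sum a binary tree stored in a buffer without any recursive calls:

// Nodes of a binary tree flattened into a buffer; -1 means "no child".
struct Node { int left; int right; float value; };
layout(std430, binding = 0) buffer Tree { Node nodes[]; };

float sumTree(int root)
{
    // A small fixed-size array stands in for the call stack GLSL doesn't have;
    // 32 entries is an assumed bound on the tree depth.
    int stack[32];
    int top = 0;
    stack[top] = root;

    float total = 0.0;
    while (top >= 0)
    {
        // Pop a node and accumulate its value.
        Node n = nodes[stack[top]];
        top--;
        total += n.value;

        // Push any children so they get visited later.
        if (n.left >= 0)  { top++; stack[top] = n.left; }
        if (n.right >= 0) { top++; stack[top] = n.right; }
    }
    return total;
}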
blink0k
Moderator
Joined: 22nd Feb 2013
Location: the land of oz
Posted: 27th Apr 2018 09:18
Oh, OK. I just assumed that because it's a C-like language, it supported recursion. Too bad then.
Santman
Joined: 15th Sep 2011
Location: Inverness
Posted: 9th May 2018 14:24
My eyes lit up at the concept of having the GPU render to a buffer and copying that to a memblock. Sounds really interesting; I'll need to check it out. Awesome potential.
Lucas Tiridath
AGK Developer
Joined: 28th Sep 2008
Location: Kings Langley, UK
Posted: 9th May 2018 18:01
Thanks Santman, I do think that compute has a lot of potential. Do post here if you think of any improvements or make anything interesting with it!
