Note: If you are reading this on GitHub, be aware that this is only a mirror repository; the newest code is hosted in my Mercurial repository at https://hg.basx.dev/games/semicongine/.
Hi there
This is a little game engine that mainly wraps Vulkan and the operating system's windowing, input and audio systems. I am using the last programming language you will ever need: Nim.
The (incomplete, autogenerated) API documentation is hosted at https://semicongine.diademgames.com/.
The engine currently features the following:
- No dependencies outside of this repo (except zip/unzip on Linux). All dependencies are included (`libs` for library dependencies, `tools` for binaries/scripts, `semicongine/thirdparty` for code dependencies)
- Low-level, Vulkan-based rendering system
- All vertex/uniform/descriptor/shader formats can and must be defined "freely". The only restriction at the moment is that vertex data is non-interleaved.
- A ton of compile-time checks to ensure that the defined mesh data and shaders are compatible for rendering (see the sketch right after this list)
- Simple audio mixer that should suffice for most things
- Simple input system; no controller support at this time
- Resource packaging of images, audio and 3D files, either as folders, as zip files, or embedded in the executable
- Simple font and text rendering
- A few additional utils like a simple storage API, a few algorithms for collision detection, noise generation and texture packing, and a simple settings API with hot-reloading
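To illustrate the compile-time compatibility checks mentioned in the list above, here is a sketch. The declarations mirror the style of the full example further down; the mismatching `BadMesh` type and the exact point at which compilation fails are assumptions for illustration, not verbatim engine behavior:

```nim
import semicongine

type
  # shader interface declaring two Vec3f vertex attributes,
  # in the same style as the full example below
  CheckedShader = object
    position {.VertexAttribute.}: Vec3f
    color {.VertexAttribute.}: Vec3f
    # (shader code fields omitted in this sketch)

  # a mesh whose `color` element type does not match the shader's declaration
  BadMesh = object
    position: GPUArray[Vec3f, VertexBuffer]
    color: GPUArray[Vec4f, VertexBuffer] # shader expects Vec3f here

# Pairing a BadMesh with a pipeline created from CheckedShader should be
# rejected at compile time (per the feature list above) rather than surfacing
# as a runtime Vulkan error.
```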
Planned:

- [ ] Macro-based internal DSL to convert Nim code into GLSL/slang at compile time
- [ ] Better memory management
  - Simple buffer resizing
  - Mechanism to mark unused buffers
  - Use mapped GPU buffers without copying (implement seq with pointers to GPU memory)
  - Do not keep a copy of the content of un-mapped buffers around (only pass data on creation or update)
Attention: this project is not optimized for "hello world" scenarios, so you have to write quite a few lines to get something on screen:
```nim
import semicongine

# required
InitVulkan()

# set up a simple render pass to render the displayed frame
var renderpass = CreateDirectPresentationRenderPass(depthBuffer = false, samples = VK_SAMPLE_COUNT_1_BIT)

# the swapchain needs to be attached to the main renderpass
SetupSwapchain(renderpass = renderpass)

# render data is used for memory management on the GPU
var renderdata = InitRenderData()

type
  # define a push constant, to have something moving
  PushConstant = object
    scale: float32

  # This is how we define shaders: the interface needs to be "typed",
  # but the shader code itself can be written freely in GLSL
  Shader = object
    position {.VertexAttribute.}: Vec3f
    color {.VertexAttribute.}: Vec3f
    pushConstant {.PushConstantAttribute.}: PushConstant
    fragmentColor {.Pass.}: Vec3f
    outColor {.ShaderOutput.}: Vec4f
    # code
    vertexCode: string = """void main() {
      fragmentColor = color;
      gl_Position = vec4(position * pushConstant.scale, 1);}"""
    fragmentCode: string = """void main() {
      outColor = vec4(fragmentColor, 1);}"""

  # We also need to define our mesh, which describes the vertex layout
  TriangleMesh = object
    position: GPUArray[Vec3f, VertexBuffer]
    color: GPUArray[Vec3f, VertexBuffer]

# instantiate the mesh and fill it with data
var mesh = TriangleMesh(
  position: asGPUArray([NewVec3f(-0.5, -0.5), NewVec3f(0, 0.5), NewVec3f(0.5, -0.5)], VertexBuffer),
  color: asGPUArray([NewVec3f(0, 0, 1), NewVec3f(0, 1, 0), NewVec3f(1, 0, 0)], VertexBuffer),
)

# this allocates GPU memory, uploads the data to the GPU and flushes anything that is host-cached
# this is a shortcut version, more fine-grained control is possible
AssignBuffers(renderdata, mesh)
renderdata.FlushAllMemory()

# Now we need to instantiate the shader as a pipeline object that is attached to a renderpass
var pipeline = CreatePipeline[Shader](renderPass = vulkan.swapchain.renderPass)

# the main render-loop will exit if we get a kill-signal from the OS
while UpdateInputs():

  # starts the drawing of the next frame and provides the necessary framebuffer and commandbuffer objects in this scope
  WithNextFrame(framebuffer, commandbuffer):

    # start the main (and only) renderpass we have; it needs to know the target framebuffer and a commandbuffer
    WithRenderPass(vulkan.swapchain.renderPass, framebuffer, commandbuffer, vulkan.swapchain.width, vulkan.swapchain.height, NewVec4f(0, 0, 0, 0)):

      # now activate our shader-pipeline
      WithPipeline(commandbuffer, pipeline):

        # and finally, draw the mesh and set a single parameter
        # more complicated setups with descriptors/uniforms are of course possible
        RenderWithPushConstant(commandbuffer = commandbuffer, pipeline = pipeline, mesh = mesh, pushConstant = PushConstant(scale: 0.3))

# cleanup
checkVkResult vkDeviceWaitIdle(vulkan.device)
DestroyPipeline(pipeline)
DestroyRenderData(renderdata)
vkDestroyRenderPass(vulkan.device, renderpass.vk, nil)
DestroyVulkan()
```
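The example above draws the triangle with a fixed scale of 0.3. As a small variation, the render loop below recomputes the push constant every frame to make the triangle pulse. This is only a sketch: it reuses the setup and the calls already shown plus Nim's standard library, and the animation values are arbitrary:

```nim
import std/[math, monotimes, times]

# ... same setup as above (InitVulkan, swapchain, mesh, pipeline, ...) ...

let start = getMonoTime()
while UpdateInputs():
  # seconds since program start, used to animate the scale value
  let t = float32((getMonoTime() - start).inMilliseconds) / 1000'f32
  # pulse the triangle's size between 0.1 and 0.5
  let scale = 0.3'f32 + 0.2'f32 * sin(t)

  WithNextFrame(framebuffer, commandbuffer):
    WithRenderPass(vulkan.swapchain.renderPass, framebuffer, commandbuffer, vulkan.swapchain.width, vulkan.swapchain.height, NewVec4f(0, 0, 0, 0)):
      WithPipeline(commandbuffer, pipeline):
        # same draw call as above, but with a per-frame push constant value
        RenderWithPushConstant(commandbuffer = commandbuffer, pipeline = pipeline, mesh = mesh, pushConstant = PushConstant(scale: scale))
```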
For now, all features that I need are implemented. I will gradually add more stuff that I need, based on the games that I am developing. Here are a few things that I consider integrating at a later point, once I have gathered some more experience with what can/should be used across different projects:
- More support for glTF format (JPEG textures, animations, morphing)
- Some often used utils like camera-controllers, offscreen-rendering, shadow-map rendering, etc.
- Some UI-stuff
- Controller support