Frequently Asked Questions
- Can I run raylib on my old PC?
- What are the default paths?
- What are the dependencies on Windows?
- How do I remove the log?
- How do I stop ESC from closing the window?
- How do I remove the console window?
- How do I make a timer?
- What do all the fields in Camera2D mean?
- How do I center text on the screen?
- Why does calling LoadTexture() crash my program?
- How can I draw a texture flipped?
- Why is my render texture upside down?
- How do I create a depth texture?
- Why is my font blurry?
- How do I load my 3D model animations in raylib?
- Does raylib support Vulkan? Will raylib support it?
- Why are my assets not loading?
- Why are my sound files not working?
- How do I set up a custom icon for my executable?
- How do I deal with UTF-16 strings?
- Why does DrawMeshInstanced not draw anything?
- Why is raylib using integrated graphics when a dedicated GPU is available?
Yes, you can. raylib is compiled by default for the OpenGL 3.3 backend, but it can be recompiled for older OpenGL versions: OpenGL 1.1 (1997) and OpenGL 2.1 (2006). To do that using Notepad++, just follow these steps:
- Open the scripts execution window (F6)
- Execute the Notepad++ script raylib_source_compile, choosing the desired OpenGL API version:
SET GRAPHIC_API=GRAPHICS_API_OPENGL_11
SET GRAPHIC_API=GRAPHICS_API_OPENGL_21
SET GRAPHIC_API=GRAPHICS_API_OPENGL_33
- Open any raylib example or your raylib game
- Execute the Notepad++ script raylib_compile_execute
Note that the above process only needs to be done once; if at some point you need to change the OpenGL version again, just repeat it.
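If you are not using the Notepad++ scripts, the same choice can be made when building raylib from the command line. A sketch, assuming a checkout of the raylib sources and the GRAPHICS variable as defined in raylib's src/Makefile (check your version's Makefile if it differs):

```
cd raylib/src
make clean
make PLATFORM=PLATFORM_DESKTOP GRAPHICS=GRAPHICS_API_OPENGL_21
```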
Like any other library or project, a raylib project requires its dependencies to be placed correctly for compilation: external library headers and, in some cases, static libraries. Dependencies vary by platform, so every platform requires some libraries installed in a specific folder to be accessible to the compiler when compiling raylib or a raylib example/game. Dependency installation is described for every platform on this Wiki. In practice, raylib has very few external dependencies, only a couple (GLFW3, OpenAL Soft), and only on some platforms.
On Linux, both the make and CMake build systems install libraylib and raylib.h to /usr/local/lib and /usr/local/include, respectively. If you want to customize these locations, inspect the appropriate Makefile or CMakeLists.txt.
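With the library installed in those default locations, a typical Linux command-line build looks like the following sketch; the exact linker flags vary by distribution and raylib version:

```
cc game.c -o game -lraylib -lGL -lm -lpthread -ldl -lrt -lX11
```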
For the Windows platform, raylib is distributed with a self-contained installer that includes not only raylib but also a raylib-ready free text editor (Notepad++) and a C/C++ compiler with tools (MinGW-w64).
Notepad++ comes with a set of pre-created scripts for compiling a raylib project and the raylib source for different platforms. Those scripts assume that raylib is installed in the portable folder path: C:\raylib
If you decide to move that folder somewhere else, the raylib paths need to be reconfigured by updating the RAYLIB_PATH variable at the beginning of every script.
As a recommendation, if you move the C:\raylib folder somewhere else, avoid spaces and special characters in the new path; they can cause compilation errors.
For command-line compilation and custom pipeline configuration, check the Compile for Windows wiki.
This page will go over some of the common questions new users have when starting out using raylib.
Call SetTraceLogLevel(LOG_NONE) before InitWindow().
Call SetExitKey(KEY_NULL) after InitWindow().
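As a minimal sketch, both calls in context (SetTraceLogLevel() goes before InitWindow(), while SetExitKey() goes after it, since InitWindow() sets the default exit key):

```c
#include "raylib.h"

int main(void)
{
    SetTraceLogLevel(LOG_NONE);     // silence raylib's trace log; must precede InitWindow()
    InitWindow(800, 450, "quiet window");
    SetExitKey(KEY_NULL);           // ESC no longer closes the window

    while (!WindowShouldClose())    // now only the close button ends the loop
    {
        BeginDrawing();
        ClearBackground(RAYWHITE);
        EndDrawing();
    }

    CloseWindow();
    return 0;
}
```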
It can be removed with a linker parameter; the details depend on the platform and compiler. gcc supports the -Wl,--subsystem,windows and -mwindows options. On Visual Studio, in Configuration Properties > Linker > System > SubSystem, choose Windows (/SUBSYSTEM:WINDOWS) to avoid the console. Other compilers have similar options.
raylib has no built-in timer system; you are expected to keep track of time in your own code. You can do this with the GetTime() and GetFrameTime() functions. Below is an example of a simple timer struct and functions to use it.
typedef struct Timer {
double startTime; // Start time (seconds)
double lifeTime; // Lifetime (seconds)
} Timer;
void StartTimer(Timer *timer, double lifetime)
{
timer->startTime = GetTime();
timer->lifeTime = lifetime;
}
bool TimerDone(Timer timer)
{
return GetTime() - timer.startTime >= timer.lifeTime;
}
double GetElapsed(Timer timer)
{
return GetTime() - timer.startTime;
}
You can see a short video on the concept here: https://youtu.be/vGlvTWUctTQ
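For example, the struct above could drive a message that disappears after a few seconds; the 3-second lifetime and the message text here are illustrative:

```c
// Somewhere in initialization, or when the event happens:
Timer messageTimer = { 0 };
StartTimer(&messageTimer, 3.0);     // show the message for 3 seconds

// Inside the drawing part of the game loop:
if (!TimerDone(messageTimer))
{
    double remaining = messageTimer.lifeTime - GetElapsed(messageTimer);
    DrawText(TextFormat("Nice! %.1fs left", remaining), 10, 10, 20, RED);
}
```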
Video explaining Camera2D: https://youtu.be/zkjDU3zmk40
The Camera2D structure is used by BeginMode2D()/EndMode2D() to define a 2D transformation. This is very useful for games that want to draw a 2D world in a fixed coordinate system and have the ability to pan, zoom, and rotate the view without the need to change drawing code.
typedef struct Camera2D {
Vector2 offset; // Camera offset (displacement from target)
Vector2 target; // Camera target (rotation and zoom origin)
float rotation; // Camera rotation in degrees
float zoom; // Camera zoom (scaling), should be 1.0f by default
} Camera2D;
Offset
The offset is used to shift the window origin away from its default of the upper-left corner. It is a value defined in screen space and is not affected by rotation or zoom. It is very common to set this to half the window width/height in order to make the window origin the center of the screen.
Target
This is a point in world space that the camera will follow; it defines the zoom and rotation pivot point. It is very common to set this value to the thing you want to track in your world, such as your player's position.
Rotation
This is the rotation angle that the view will be rotated by. It will rotate around the target point in world coordinates.
Zoom
This is the scale factor applied to the view. A value of 1 will draw the world at its original scale; a value of 2 will be a 2x zoom and draw everything twice as large on screen.
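Putting the fields together, a common setup sketch that keeps an assumed playerPosition (a Vector2 you maintain) centered on screen:

```c
Camera2D camera = { 0 };
camera.target = playerPosition;     // world-space point to follow
camera.offset = (Vector2){ GetScreenWidth()/2.0f, GetScreenHeight()/2.0f };  // center of screen
camera.rotation = 0.0f;             // degrees
camera.zoom = 1.0f;                 // 1.0f = original scale

BeginMode2D(camera);
    // Everything drawn here uses world coordinates
    DrawRectangleV(playerPosition, (Vector2){ 32, 32 }, RED);
EndMode2D();
```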
There will be times when you need to convert from screen coordinates into world coordinates, such as wanting to find out what the mouse's screen position is in the world.
GetScreenToWorld2D()
Vector2 GetScreenToWorld2D(Vector2 position, Camera2D camera);
This will convert a screen point into a world point for a camera, accounting for the camera's offset, rotation, and zoom. It is very common to use this to get the mouse position in world coordinates to do collisions or picking.
Vector2 mouseInWorld = GetScreenToWorld2D(GetMousePosition(), MyCamera);
GetWorldToScreen2D()
Vector2 GetWorldToScreen2D(Vector2 position, Camera2D camera);
This function gets a screen point for a world point, again accounting for the camera's transformation. It is useful for computing the location of HUD elements that should not be scaled or rotated with the world view, such as player names, health bars, or other labels.
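For example, to pin a label above an assumed world-space playerPosition while drawing in screen space (outside BeginMode2D()); the offsets are illustrative:

```c
Vector2 labelPos = GetWorldToScreen2D(playerPosition, camera);
DrawText("Player 1", (int)labelPos.x - 20, (int)labelPos.y - 30, 10, BLACK);
```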
raylib does not offer any text formatting functions, so you need to compute the starting point for all text that you draw. The starting point for text is always the upper left corner.
You can compute the center of the screen by dividing the screen width and height in half.
int screenCenterX = GetScreenWidth() / 2;
int screenCenterY = GetScreenHeight() / 2;
Next you need to compute how large the text is, and offset the center by half the text size.
const char text[] = "Congrats! You created your first window!";
int fontSize = 20;
int textWidth = MeasureText(text, fontSize);
int textStartX = screenCenterX - textWidth / 2;
int textStartY = screenCenterY - fontSize / 2;
DrawText(text, textStartX, textStartY, fontSize, LIGHTGRAY);
MeasureText() only measures the width of the text, but takes fewer arguments. It is often acceptable to use the font size as the total text height, but for some fonts this may not be accurate. MeasureTextEx() measures both the width and height of the text, but takes more arguments; for this reason it is used less often.
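A sketch of the same centering done with MeasureTextEx(), which returns a Vector2 containing both width and height; the spacing value is illustrative:

```c
Font font = GetFontDefault();
const char text[] = "Congrats! You created your first window!";
float fontSize = 20.0f;
float spacing = 2.0f;

Vector2 textSize = MeasureTextEx(font, text, fontSize, spacing);
Vector2 position = {
    GetScreenWidth()/2.0f - textSize.x/2.0f,   // center horizontally
    GetScreenHeight()/2.0f - textSize.y/2.0f   // center vertically using the measured height
};
DrawTextEx(font, text, position, fontSize, spacing, LIGHTGRAY);
```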
You are likely calling LoadTexture() before calling InitWindow(). Loading textures requires a valid OpenGL context, which InitWindow() sets up; therefore, textures cannot be loaded before calling InitWindow().
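The fix is simply a matter of ordering; the file path here is a placeholder:

```c
InitWindow(800, 450, "texture example");                 // creates the OpenGL context first
Texture2D texture = LoadTexture("resources/sprite.png"); // safe: a context now exists
```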
Drawing a texture flipped requires either the DrawTextureRec() or the DrawTexturePro() function. Every texture drawing function calls DrawTexturePro() internally to do its work, but only these two expose the parameter necessary to flip a texture: the source rectangle.
To flip the texture, simply pass a source rectangle with a negative width or height. Note that this behavior is specific to raylib.
Rectangle source = { 0, 0, -texture.width, texture.height };
DrawTextureRec(texture, source, (Vector2){ 0, 0 }, WHITE);
The above code will draw the texture flipped horizontally (negative width); a negative height flips it vertically.
All OpenGL textures have their origin in the lower-left corner by default, while the screen origin is configured in the upper-left corner. When you load a normal texture, raylib flips the image data for you; however, this cannot be done with a render texture. The solution is to draw your render texture vertically flipped, as described in the previous answer.
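Concretely, when presenting a render texture, a negative source-rectangle height draws it the right way up:

```c
// target is a RenderTexture2D previously drawn to with BeginTextureMode()/EndTextureMode()
DrawTextureRec(target.texture,
               (Rectangle){ 0, 0, (float)target.texture.width, (float)-target.texture.height },
               (Vector2){ 0, 0 }, WHITE);
```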
By default, LoadRenderTexture() uses a RenderBuffer for the depth attachment. This is an optimization: supposedly the GPU can work faster when it knows the depth buffer does not need to be read back. But a RenderBuffer cannot be drawn on the screen like a regular Texture2D.
Here is some code to load a RenderTexture2D that uses a Texture2D for the depth:
RenderTexture2D LoadRenderTextureWithDepthTexture(int width, int height)
{
RenderTexture2D target = {0};
target.id = rlLoadFramebuffer(width, height); // Load an empty framebuffer
if (target.id > 0)
{
rlEnableFramebuffer(target.id);
// Create color texture (default to RGBA)
target.texture.id = rlLoadTexture(NULL, width, height, PIXELFORMAT_UNCOMPRESSED_R8G8B8A8, 1);
target.texture.width = width;
target.texture.height = height;
target.texture.format = PIXELFORMAT_UNCOMPRESSED_R8G8B8A8;
target.texture.mipmaps = 1;
// Create depth texture
target.depth.id = rlLoadTextureDepth(width, height, false);
target.depth.width = width;
target.depth.height = height;
target.depth.format = 19; //DEPTH_COMPONENT_24BIT?
target.depth.mipmaps = 1;
// Attach color texture and depth texture to FBO
rlFramebufferAttach(target.id, target.texture.id, RL_ATTACHMENT_COLOR_CHANNEL0, RL_ATTACHMENT_TEXTURE2D, 0);
rlFramebufferAttach(target.id, target.depth.id, RL_ATTACHMENT_DEPTH, RL_ATTACHMENT_TEXTURE2D, 0);
// Check if fbo is complete with attachments (valid)
if (rlFramebufferComplete(target.id)) TRACELOG(LOG_INFO, "FBO: [ID %i] Framebuffer object created successfully", target.id);
rlDisableFramebuffer();
}
else TRACELOG(LOG_WARNING, "FBO: Framebuffer object can not be created");
return target;
}
Now the RenderTexture's depth texture can be drawn as a regular texture... but note that the information it contains is not normalized!
If you call LoadFont(), the font will be loaded at a fixed size (32 pixels in height). If you draw text at anything other than this fixed size (32 pixels in height for .ttf/.otf fonts), the font texture will be scaled. Scaling introduces artifacts; scaling up introduces blur. For the best possible quality, load your font with LoadFontEx() and specify the size you want to use it at. You can pass NULL for fontChars and 0 for glyphCount to load the default character set.
LoadFontEx("myfont.ttf", 50, NULL, 0);
raylib supports several 3D file formats that support animations:
- M3D: Animation support was implemented recently (after raylib 4.2 release). It works great and there is a Blender exporter for .m3d files with animations.
- IQM: There is a Blender exporter but there are multiple ways to create the animations and export the IQM file. Depending on the process used, raylib may or may not be able to load the animations. Several users reported having problems with IQM animations.
- GLTF: This file format supports animation and raylib can load them, but with some limitations.
raylib is built on top of OpenGL, and there are currently no plans to support any other graphics API. In any case, raylib uses rlgl as its abstraction layer to support different OpenGL versions. Theoretically, a Vulkan equivalent of that layer could be developed to map raylib's calls to a Vulkan backend, but creating it would take a considerable amount of work.
File loading functions such as LoadTexture() check the current working directory for the file. Usually the working directory is where the executable is located, but depending on the OS and build system this is not always the case. A quick fix is to call ChangeDirectory(GetApplicationDirectory()); before loading your assets.
If you're getting Failed to open file, see the previous question. If you're getting warnings like Failed to get frame count for format conversion, there are two possible causes:
- The audio device is not initialized. It must be initialized with InitAudioDevice() before any sounds can be loaded.
- The audio file is corrupted or not supported by raylib. Try converting it to another format or using a different file.
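A minimal ordering sketch (the file path is a placeholder):

```c
InitWindow(800, 450, "audio example");
InitAudioDevice();                          // must come before loading or playing any sound
Sound fx = LoadSound("resources/coin.wav");
PlaySound(fx);
```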
Executable icons are only supported in PE executables, i.e. usually on Windows systems. The icon can be embedded as a resource at compile time, when creating the executable, or it can be changed at runtime.
To embed the icon as a resource at compile time, a resource.rc file should be created and compiled into an object file to be linked like a regular code file. To compile that resource file, MinGW provides a tool called windres.exe:
windres resource.rc -o resource.rc.data --target=pe-x86-64
With Visual Studio, adding the resource.rc file to the project should be enough.
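A minimal resource.rc can be a single line; the resource id and icon filename below are placeholders:

```
// resource.rc: Windows typically uses the ICON resource with the lowest id as the executable's icon
100 ICON "my_icon.ico"
```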
To change the icon at runtime, raylib provides the function SetWindowIcon(Image image). This is the main way to change the icon on other operating systems. Just make sure that image.format is PIXELFORMAT_UNCOMPRESSED_R8G8B8A8; it's a requirement. Here is an example:
Image icon = LoadImage("path/to/icon.png");
ImageFormat(&icon, PIXELFORMAT_UNCOMPRESSED_R8G8B8A8);
SetWindowIcon(icon);
UnloadImage(icon);
raylib supports UTF-8 strings by default; in fact, the text drawing functions expect UTF-8 strings as input. Sometimes, however, source text is provided as UTF-16. Here is a handy conversion library.
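If you would rather avoid a dependency, here is a minimal conversion sketch (a hypothetical helper, not part of raylib) that turns a null-terminated UTF-16 string into UTF-8:

```c
#include <stddef.h>
#include <stdint.h>

// Convert a null-terminated UTF-16 string to UTF-8.
// Returns the number of bytes written (excluding the terminator), or -1 if out is too small.
// Unpaired surrogates are encoded as-is (producing invalid UTF-8) rather than rejected.
static int Utf16ToUtf8(const uint16_t *in, char *out, size_t outSize)
{
    size_t n = 0;
    while (*in)
    {
        uint32_t cp = *in++;
        // Combine a high/low surrogate pair into a single code point
        if (cp >= 0xD800 && cp <= 0xDBFF && *in >= 0xDC00 && *in <= 0xDFFF)
            cp = 0x10000 + ((cp - 0xD800) << 10) + (*in++ - 0xDC00);

        if (cp < 0x80)          // 1-byte sequence (ASCII)
        {
            if (n + 1 >= outSize) return -1;
            out[n++] = (char)cp;
        }
        else if (cp < 0x800)    // 2-byte sequence
        {
            if (n + 2 >= outSize) return -1;
            out[n++] = (char)(0xC0 | (cp >> 6));
            out[n++] = (char)(0x80 | (cp & 0x3F));
        }
        else if (cp < 0x10000)  // 3-byte sequence
        {
            if (n + 3 >= outSize) return -1;
            out[n++] = (char)(0xE0 | (cp >> 12));
            out[n++] = (char)(0x80 | ((cp >> 6) & 0x3F));
            out[n++] = (char)(0x80 | (cp & 0x3F));
        }
        else                    // 4-byte sequence
        {
            if (n + 4 >= outSize) return -1;
            out[n++] = (char)(0xF0 | (cp >> 18));
            out[n++] = (char)(0x80 | ((cp >> 12) & 0x3F));
            out[n++] = (char)(0x80 | ((cp >> 6) & 0x3F));
            out[n++] = (char)(0x80 | (cp & 0x3F));
        }
    }
    out[n] = '\0';
    return (int)n;
}
```

The result can then be passed directly to raylib text functions such as DrawText().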
You are likely using the default material when calling the function. DrawMeshInstanced() does not work with the default shader.
To create a vertex shader that works with instancing, you must provide an input for the model transform matrix (the transform matrix for the instance) as a vertex attribute, and link its location to SHADER_LOC_MATRIX_MODEL. By default, SHADER_LOC_MATRIX_MODEL binds as a uniform input in your shader; you must use GetShaderLocationAttrib() to bind it as an attribute.
Vertex Shader Example
#version 330
// Input vertex attributes
in vec3 vertexPosition; // vertex position relative to origin
in vec2 vertexTexCoord; // texture coord of vertex
in mat4 instanceTransform; // model transformation matrix
// Input uniform values
uniform mat4 mvp; // model-view-projection
// Output vertex attributes (to fragment shader)
out vec2 fragTexCoord;
void main()
{
// Pass texture coord
fragTexCoord = vertexTexCoord;
// Compute MVP for current instance
mat4 mvpi = mvp*instanceTransform;
// Calculate final vertex position
gl_Position = mvpi*vec4(vertexPosition, 1.0);
}
Fragment Shader Example
#version 330
// Input vertex attributes (from vertex shader)
in vec3 fragPosition;
in vec2 fragTexCoord;
// Input uniform values
uniform sampler2D texture0;
uniform vec4 colDiffuse;
// Output fragment color
out vec4 finalColor;
void main()
{
// Texel color fetching from texture sampler
vec4 texelColor = texture(texture0, fragTexCoord);
// colorize texture with diffuse color
finalColor = colDiffuse*texelColor;
}
Full C Example
const char fs[] =
"#version 330 \n"
" \n"
"// Input vertex attributes (from vertex shader) \n"
"in vec3 fragPosition; \n"
"in vec2 fragTexCoord; \n"
" \n"
"// Input uniform values \n"
"uniform sampler2D texture0; \n"
"uniform vec4 colDiffuse; \n"
" \n"
"// Output fragment color \n"
"out vec4 finalColor; \n"
" \n"
"void main() \n"
"{ \n"
" // Texel color fetching from texture sampler \n"
" vec4 texelColor = texture(texture0, fragTexCoord); \n"
" // colorize texture with diffuse color \n"
" finalColor = colDiffuse*texelColor; \n"
"} \n";
const char vs[] =
"#version 330 \n"
" \n"
"// Input vertex attributes \n"
"in vec3 vertexPosition; // vertex position relative to origin \n"
"in vec2 vertexTexCoord; // texture coord of vertex \n"
"in mat4 instanceTransform; // model transformation matrix \n"
" \n"
"// Input uniform values \n"
"uniform mat4 mvp; // model-view-projection \n"
" \n"
"// Output vertex attributes (to fragment shader) \n"
"out vec2 fragTexCoord; \n"
" \n"
"void main() \n"
"{ \n"
" // Pass texture coord \n"
" fragTexCoord = vertexTexCoord; \n"
" // Compute MVP for current instance \n"
" mat4 mvpi = mvp*instanceTransform; \n"
" // Calculate final vertex position \n"
" gl_Position = mvpi*vec4(vertexPosition, 1.0); \n"
"} \n";
Shader shader = LoadShaderFromMemory(vs, fs);
shader.locs[SHADER_LOC_MATRIX_MODEL] = GetShaderLocationAttrib(shader, "instanceTransform");
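With the shader loaded and the attribute bound as above, a usage sketch follows; it assumes raymath.h is included for MatrixTranslate(), and the instance count and grid spacing are illustrative:

```c
#define MAX_INSTANCES 1000

Mesh cube = GenMeshCube(1.0f, 1.0f, 1.0f);
Material material = LoadMaterialDefault();
material.shader = shader;                   // the instancing shader from above

Matrix transforms[MAX_INSTANCES];           // one model matrix per instance
for (int i = 0; i < MAX_INSTANCES; i++)
{
    // Lay instances out on a 100-wide grid
    transforms[i] = MatrixTranslate((float)(i%100)*2.0f, 0.0f, (float)(i/100)*2.0f);
}

// Inside BeginMode3D(camera)/EndMode3D():
DrawMeshInstanced(cube, material, transforms, MAX_INSTANCES);
```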
The GPU used is selected by the user through the GPU vendor's control panel. If it is crucial that your program runs on a dedicated GPU, you can add the following declarations anywhere in your code to request a dedicated AMD or NVIDIA GPU (this is a Windows-specific mechanism):
#ifdef __cplusplus
extern "C" {
#endif
__declspec(dllexport) unsigned long NvOptimusEnablement = 1;
__declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
#ifdef __cplusplus
}
#endif