Horus-Hidden-Passage
Driven by curiosity, Hamza, an unsuspecting explorer, ventures into the depths of an ancient Egyptian tomb, unaware of its dark secret. His exploration turns sinister when he becomes trapped within its ancient confines. With each breath, his air supply diminishes, and as darkness descends, Hamza realizes that the shadows conceal more than his confinement: he is not alone in this suffocating darkness, and survival becomes his only goal.
- The player is trapped inside a tomb and must find 5 hidden pieces to escape.
- Multiple obstacles will get in the player's way while searching, so the player needs to light all the torches for better visibility.
- Elements within the tomb, such as light and visual distortions, will hinder the player.
- The player must manage their air supply and escape as quickly as possible before suffocating.
- If the player finds all 5 pieces and escapes before running out of air, the tomb's ceiling will rise and the door will open; otherwise, the player loses.
The following diagram shows the relationship between the classes (and structs) in the project:
In our engine, the shader program defines the vertex and fragment shaders that run on the GPU during rendering. It will contain an OpenGL program object.
The mesh is a collection of vertices and faces. In OpenGL, we define a mesh using 3 OpenGL objects:

- Vertex Buffer: a buffer that contains the data of all the vertices.
- Element Buffer: a buffer that contains the indices of the 3 vertices needed to draw each triangle (or 2 for lines, or 1 for points).
- Vertex Array: an object that defines how the data in the vertex buffer is interpreted and sent to the vertex shader as attributes.
To draw different instances of a mesh at different positions, orientations, and/or scales, we send a transformation matrix to the vertex shader as a uniform. The transformation matrix is used to transform the vertex positions from the object's local space to the world space.
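As a sketch of what this transformation does, the snippet below builds a simple local-to-world matrix (uniform scale followed by translation) and applies it to a vertex position on the CPU. The helper names are illustrative, not the engine's API; in the real engine the matrix would come from a math library such as GLM and be uploaded to the vertex shader as a uniform.

```cpp
#include <array>

using Mat4 = std::array<std::array<float, 4>, 4>;
using Vec4 = std::array<float, 4>;

// Build a matrix that scales by s and translates by (tx, ty, tz).
// (Illustrative helper; an engine would typically use GLM for this.)
Mat4 makeTransform(float tx, float ty, float tz, float s) {
    Mat4 m{{
        {{s, 0, 0, tx}},
        {{0, s, 0, ty}},
        {{0, 0, s, tz}},
        {{0, 0, 0, 1}}
    }};
    return m;
}

// Transform a position from local space to world space: world = M * local.
Vec4 transformPoint(const Mat4& m, const Vec4& v) {
    Vec4 out{};
    for (int r = 0; r < 4; ++r)
        for (int c = 0; c < 4; ++c)
            out[r] += m[r][c] * v[c];
    return out;
}
```

In the vertex shader, the same multiplication is written as `gl_Position = transform * vec4(position, 1.0)` after the matrix is sent with `glUniformMatrix4fv`.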
OpenGL is a state machine: the options we pick are stored in the OpenGL context and affect the upcoming draw calls. Since each object may require different options while drawing (e.g. transparent objects require blending while opaque objects don't), we need to store the options for each object in a data structure and set the OpenGL options to match them before drawing. This is where we use the `PipelineState` structure, which stores the depth testing, face culling, blending, and color/depth mask options. The `setup` function of the `PipelineState` sets the OpenGL options to match the ones stored in the corresponding `PipelineState` instance.
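The idea can be sketched as follows. Since real `glEnable`/`glDisable` calls need a live GL context, this sketch uses a toy map as a stand-in for the context; the field names are assumptions, not the engine's exact layout.

```cpp
#include <map>
#include <string>

// Toy stand-in for the OpenGL context: capability name -> enabled.
// In a real engine, setup() would call glEnable/glDisable, glCullFace,
// glBlendFunc, glColorMask, and glDepthMask instead.
std::map<std::string, bool> glContext;

struct PipelineState {
    bool depthTesting = false;
    bool faceCulling  = false;
    bool blending     = false;
    bool depthMask    = true;   // allow writing to the depth buffer
    bool colorMask    = true;   // allow writing to the color buffer

    // Apply this object's options to the (mock) context before drawing.
    void setup() const {
        glContext["GL_DEPTH_TEST"] = depthTesting;
        glContext["GL_CULL_FACE"]  = faceCulling;
        glContext["GL_BLEND"]      = blending;
        glContext["DEPTH_MASK"]    = depthMask;
        glContext["COLOR_MASK"]    = colorMask;
    }
};
```

Each object carries its own `PipelineState`, so an opaque object can enable depth testing and disable blending while a transparent one enables blending, and calling `setup()` right before the draw call puts the context in the correct state either way.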
A 2D texture is a sampleable storage containing a 2D array of pixels. By "sampleable", we mean that we can sample a color from it in the shaders. There is more than one way to sample a texture: for example, we can choose between nearest and linear filtering, and we can select between different wrapping options. A sampler is an OpenGL object that stores the sampling options to use while sampling a texture.
We will combine all the previous parts using one class which we will call `Material`. Since there are lots of different types of materials, we chose to create a base class for all materials (which we call `Material`) and inherit from it to create more specific materials (e.g. `TintedMaterial` and `TexturedMaterial`).
An entity-component-system (ECS) framework consists of 3 parts:
- Entities: containers that hold a set of components. An entity does nothing more than being a simple container and, ideally, should contain no logic of its own.
- Components: data objects that can be added to entities. The roles and the data of an entity are defined by its components. For example, if an entity contains a camera component, then it is a camera, and we should be able to find camera-specific data (e.g. the field of view angle) inside the camera component.
- Systems: these define the logic. For example, if we want to render a set of entities, we implement a renderer system that draws the entities every frame.
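The three parts above can be sketched minimally as follows. The class and member names are illustrative, not the engine's exact API: an entity is nothing but a type-keyed container of data-only components.

```cpp
#include <memory>
#include <typeindex>
#include <typeinfo>
#include <unordered_map>

// Base class for all components (pure data, no logic).
struct Component { virtual ~Component() = default; };

// Example component: camera-specific data lives in the camera component.
struct CameraComponent : Component { float fovY = 60.0f; };

// An entity is just a container of components, keyed by component type.
struct Entity {
    std::unordered_map<std::type_index, std::unique_ptr<Component>> components;

    template <typename T> void add(std::unique_ptr<T> c) {
        components[typeid(T)] = std::move(c);
    }
    template <typename T> T* get() {
        auto it = components.find(typeid(T));
        return it == components.end() ? nullptr : static_cast<T*>(it->second.get());
    }
};
```

A system (e.g. a renderer) would then iterate over all entities each frame and act only on the ones that hold the components it cares about, such as a mesh-renderer component.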
The forward renderer handles transparent objects by sorting them by their distance along the camera's forward direction and drawing them from far to near.
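The sorting step can be sketched like this: project each object's position (relative to the camera) onto the camera's forward vector, and order the objects so that larger projections (farther objects) come first. The function name and the use of raw positions are illustrative simplifications of what the renderer would do with its command list.

```cpp
#include <algorithm>
#include <array>
#include <vector>

using Vec3 = std::array<float, 3>;

float dot(const Vec3& a, const Vec3& b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Sort transparent objects far-to-near along the camera's forward direction,
// so nearer objects are blended on top of farther ones.
void sortTransparent(std::vector<Vec3>& positions,
                     const Vec3& cameraPos, const Vec3& cameraForward) {
    std::sort(positions.begin(), positions.end(),
              [&](const Vec3& a, const Vec3& b) {
                  Vec3 da{a[0] - cameraPos[0], a[1] - cameraPos[1], a[2] - cameraPos[2]};
                  Vec3 db{b[0] - cameraPos[0], b[1] - cameraPos[1], b[2] - cameraPos[2]};
                  // Larger projection onto forward = farther away = drawn first.
                  return dot(da, cameraForward) > dot(db, cameraForward);
              });
}
```

Far-to-near order matters because alpha blending is order-dependent: each transparent fragment must blend against everything already drawn behind it.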
Here, we will modify the `ForwardRenderer` system class to draw a sky sphere around the camera.
Sometimes, we want to apply an image effect to the rendered scene. This is called postprocessing: it is done by rendering the scene to a texture and then rendering that texture to the screen using the effect shader.
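Conceptually, once the scene is in a texture, the effect shader is just a per-pixel function over that image. The CPU sketch below shows a grayscale effect over a pixel buffer as an analogy; a real effect would be a fragment shader sampling the scene texture, and this function name and data layout are purely illustrative.

```cpp
#include <array>
#include <vector>

using Pixel = std::array<float, 3>; // RGB channels in [0, 1]

// CPU analogy of a postprocessing fragment shader: the scene has already been
// rendered to a texture (here, a pixel buffer), and the effect maps each pixel
// independently to produce the final image shown on screen.
std::vector<Pixel> grayscale(const std::vector<Pixel>& scene) {
    std::vector<Pixel> out;
    out.reserve(scene.size());
    for (const Pixel& p : scene) {
        // Standard luma weights for perceived brightness.
        float l = 0.299f * p[0] + 0.587f * p[1] + 0.114f * p[2];
        out.push_back({l, l, l});
    }
    return out;
}
```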
Name | E-mail |
---|---|
Fares Atef | faresatef553@gmail.com |
Ghaith Mohamed | gaoia123@gmail.com |
Amr ElSheshtawy | Sheshtawy321@gmail.com |
Amr Magdy | amr4121999@gmail.com |