Part 1: TouchGFX
TouchGFX is a user-friendly graphical C++ tool for creating powerful embedded touch applications, made by STMicroelectronics.
It is primarily built for creating graphical applications with a retained-mode GUI and is optimized to redraw only the parts of the screen that have changed. We will create a game in which we redraw the screen every frame, which is not the primary use case; however, I have chosen TouchGFX because it abstracts away a lot of boilerplate work.
When you create a new TouchGFX project, the necessary screen drivers, the HAL (Hardware Abstraction Layer), and the CubeMX project are all generated automatically. This includes setting up the required pin configurations and device clocks. Additionally, the TouchGFX Engine manages screen updates, user input events, and timing, so we don't need to handle these manually or write any assembly code.
TouchGFX also takes care of writing interrupt handlers and invoking event handlers, which we can override in our GUI to implement the desired behavior. This greatly reduces the amount of low-level code we need to write, which is also very dependent on the board chosen.
All main loops in renderers have a structure like:
while (true)
{
    collect(); // Collect events from outside
    update();  // Update the application UI model
    render();  // Render new updated graphics to the framebuffer
    wait();    // Wait for 'go' from display
}
In the collect phase, TouchGFX collects click, drag, and gesture touch events and exposes them to input handlers, which we can later override with custom methods.
In the update phase, we will move the player based on the received touch events. We will also recalculate all the pixels of the first-person view.
In the render phase, we invalidate the pixels of the game view - we flag them to the TouchGFX Engine as needing to be redrawn. Those pixels from our framebuffer will be scheduled to be sent to the display.
In the wait phase, we wait for the signal from the display that it is ready for framebuffer transmission. This will happen at a fixed rate (usually 60 Hz).
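For example, touch events gathered in the collect phase reach us through virtual input handlers such as handleClickEvent, which we override in our generated view class. The sketch below uses a simplified stand-in for touchgfx::ClickEvent so it stays self-contained; the class and member names are illustrative, not from the project:

```cpp
#include <cassert>

// Simplified stand-in for touchgfx::ClickEvent; the real type carries
// more state and lives in touchgfx/events/ClickEvent.hpp.
struct ClickEvent
{
    enum Type { PRESSED, RELEASED };
    Type type;
    int x;
    int y;
};

// Hypothetical view: pressing anywhere starts the player moving,
// releasing stops it. In a real project this override would live in
// the generated view class (e.g. Screen1View).
class GameView
{
public:
    void handleClickEvent(const ClickEvent& evt)
    {
        moveForward = (evt.type == ClickEvent::PRESSED);
    }

    bool moveForward = false;
};
```

In an actual TouchGFX project the handler is inherited from touchgfx::Screen as `virtual void handleClickEvent(const ClickEvent& evt)`.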
The rendering is synchronized with the display. As mentioned above, some displays require that the framebuffer is transmitted repeatedly. While this transmission is ongoing, it is not advisable to render to the framebuffer arbitrarily. The graphics engine therefore waits for a short time after the transmission has started before it begins rendering. Other displays send a signal to the microcontroller when the framebuffer should be transmitted; the graphics engine waits for that signal.
Frames are rendered at a fixed rate. This is often beneficial for the application, as it makes it easier to create animations that last a specific time. For example, on a 60 Hz display, a two-second animation should be programmed to complete in 120 frames.
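That frame arithmetic can be sketched with a simple tick counter. The helper names below are illustrative, not TouchGFX API; in a real view the counter would be advanced once per frame from handleTickEvent():

```cpp
#include <cassert>

// Frames needed for a fixed-rate animation (illustrative helper).
constexpr int framesFor(float seconds, int displayHz)
{
    return static_cast<int>(seconds * displayHz + 0.5f);
}

// Minimal tick-driven animation: step() is called once per frame and
// returns progress in [0, 1].
struct Animation
{
    int tick = 0;
    int duration;

    explicit Animation(int frames) : duration(frames) {}

    float step()
    {
        if (tick < duration)
            ++tick;
        return static_cast<float>(tick) / static_cast<float>(duration);
    }
};
```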
Delta-time refers to the time elapsed between two consecutive frames. When handling movement, animations, physics, or any actions that occur each frame, we need to ensure they progress at the same speed regardless of the frame rate. To achieve this, we multiply these calculations by the delta-time, allowing them to adapt to the varying time between frames.
For example, to move a player forward consistently across frames:
❌ Wrong:
// 30 fps
player.y += 0.10 // Will move 0.10 * 30 = 3 units / second
// 60 fps
player.y += 0.10 // Will move 0.10 * 60 = 6 units / second
✅ Correct:
// 30 fps
player.y += 6 * (1/30) // Will move 6 * (1/30) * 30 = 6 units / second
// 60 fps
player.y += 6 * (1/60) // Will move 6 * (1/60) * 60 = 6 units / second
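The same idea in C++, with the movement update parameterized by delta time (the struct and function names are illustrative, not from the project):

```cpp
#include <cassert>
#include <cmath>

struct Player
{
    float y = 0.0f;
};

// Move at a fixed speed in units/second, regardless of frame rate.
void update(Player& p, float unitsPerSecond, float deltaSeconds)
{
    p.y += unitsPerSecond * deltaSeconds;
}
```

Simulating one second at 30 fps (30 steps of 1/30 s) and at 60 fps (60 steps of 1/60 s) moves the player the same 6 units, up to floating-point error.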
We are making the assumption that our frame rate will stay fixed at 60 FPS, and thus I only implemented delta-time multiplication on the desktop platform. However, if this proves problematic, we would need to activate one of the hardware timers and access it through the HAL to measure the time elapsed between frames, since TouchGFX has no such built-in function.
void Raycaster::movePlayer(float speedMultiplier)
{
    // making assumption of fixed 60fps, otherwise need to multiply by delta time
    constexpr float baseSpeed = 0.03f;
    Vec2f newPos = { playerPos.x + Raycaster::dir.x * baseSpeed * speedMultiplier,
                     playerPos.y + Raycaster::dir.y * baseSpeed * speedMultiplier };
    // add collision detection - player can move only into empty space...
TouchGFX does, however, offer us a way to count lost frames, which occur when the rendering time exceeds the display refresh time.
HAL::getInstance()->getLCDRefreshCount() // if > 1, frames are being dropped
We can compensate for lost frames either by waiting them out (artificially extending the animation duration) or by skipping frames (accurate, but choppy).
void setFrameRateCompensation(bool enabled)
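The frame-skipping strategy can be sketched as advancing the game state by as many ticks as the display actually refreshed since we last rendered. The helper below is illustrative; in a real project the argument would come from HAL::getInstance()->getLCDRefreshCount():

```cpp
#include <algorithm>
#include <cassert>

// How many logical ticks to advance this frame: 1 normally, more when
// the display refreshed several times while we were still rendering.
int ticksToAdvance(int lcdRefreshCount)
{
    return std::max(1, lcdRefreshCount);
}
```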
I also used this to display the FPS and render time on screen, based on the following table, which is not entirely accurate but close enough:
Dropped frames | FPS | Max rendering time
---|---|---
0 | 60 | 16.67 ms
1 | 30 | 33.34 ms
2 | 20 | 50.00 ms
3 | 15 | 66.67 ms
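The table follows from the 60 Hz refresh rate: a frame that misses n refreshes is shown after n + 1 refresh periods, so the effective FPS is 60 / (n + 1) and the render budget is (n + 1) / 60 seconds. A small sketch (the helper names are mine, not TouchGFX API):

```cpp
#include <cassert>
#include <cmath>

// Effective frame rate and render budget on a 60 Hz display when a
// frame misses 'dropped' refreshes (illustrative helpers).
float effectiveFps(int dropped)
{
    return 60.0f / static_cast<float>(dropped + 1);
}

float maxRenderMs(int dropped)
{
    return 1000.0f * static_cast<float>(dropped + 1) / 60.0f;
}
```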
The most barebones solution without any unnecessary overhead would certainly be to create a raw framebuffer (ideally double buffers) and manually synchronize transmitting data to the display.
This would, however, mean manually setting up a CubeMX project: configuring the appropriate clock speeds, pins, display drivers, the HAL, and other hardware settings. It is very board-specific (and thus non-portable) and requires a lot of digging through reference manuals and trial and error, which is not beginner friendly.
Event handlers would also need to be implemented manually, and VSYNC display synchronization with DMA transfers needs to be handled with care. Also, what happens if the framebuffer is not ready for transfer yet? We would need to define and implement a strategy (TouchGFX simply resends the old buffer) and figure out a way to measure dropped frames.
- TouchGFX: Basic Concepts -> Main Loop
- TouchGFX: Basic Concepts -> Performance
- Introduction to LCD-TFT display controller (LTDC) on STM32 MCUs
TouchGFX has excellent documentation and is a recommended read.