

Game & UI Design Philosophy
For this project, the goal was to treat the UI as part of the gameplay rather than a layer on top of it. Every interaction was designed to feel responsive, readable, and expressive, matching the comic-book tone while clearly communicating player intent and feedback.
From the start, I evaluated each design decision with cross-platform use in mind. Whenever a new interaction or system was introduced, I considered how it would work on mouse and keyboard as well as controller, including PS5 input. Navigation flow, confirm/back behavior, focus handling, and visual prompts were built to work consistently across platforms so no input method felt secondary.
Performance was treated as a design constraint rather than something to fix later. Visual clarity and feedback were prioritized, but never at the cost of responsiveness or stability. This guided decisions around layout structure, animation use, and system reuse, helping the UI stay lightweight and reliable throughout development.
By treating UI, interaction design, and technical implementation as one connected system, the result was an interface that supports gameplay flow, adapts naturally to different input methods, and stays consistent across platforms without losing personality.
Download the game here.
Interaction Design, Navigation & Feedback
I designed and implemented the full UI flow in Unreal Engine’s UMG system, working closely with one programmer on the team. Our collaboration covered the main menu, lobby, pause menu, settings, tutorials, and end screens. My focus was to make navigation intuitive and predictable so players always understood where they were and how to move forward without friction.
A consistent visual hierarchy was used across all menus to reduce cognitive load. Buttons, layouts, and navigation patterns were reused so that learning one screen naturally translated to the rest of the interface. Clear visual and audio feedback, such as hover states, animations, and sound cues, confirmed player input and reinforced responsiveness across both mouse/keyboard and controller.
From a technical and UX perspective, performance and scalability were ongoing considerations. Widgets were primarily structured using overlays instead of canvas panels to reduce draw calls while keeping layouts flexible across resolutions and platforms. Shared button systems, reusable animations, and centralized hover and sound logic reduced duplication and made the UI easier to maintain as the project grew.
Controller navigation and platform-specific prompts were treated as core features. Focus handling, confirm/back behavior, and prompt bars were tuned to ensure smooth interaction on both PC and console, resulting in a UI that feels consistent regardless of input method.
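The confirm/back behavior described above can be sketched as a simple menu stack. This is a minimal, engine-agnostic illustration of the pattern, not the project's actual classes: confirm pushes a submenu, back pops to the previous screen, and the root menu can never be popped, which is one way to avoid the soft-locks that inconsistent back handling can cause.

```cpp
#include <cassert>
#include <stack>
#include <string>

// Illustrative menu stack (names are hypothetical, not UMG classes).
// "Confirm" pushes a submenu; "Back" returns to the previous screen.
class MenuStack {
public:
    explicit MenuStack(std::string root) { screens.push(std::move(root)); }

    void Confirm(const std::string& next) { screens.push(next); }

    // Back never pops the root screen, so there is always a valid focus target
    // regardless of the order in which inputs arrive.
    void Back() {
        if (screens.size() > 1) screens.pop();
    }

    const std::string& Current() const { return screens.top(); }

private:
    std::stack<std::string> screens;
};
```

Because both mouse clicks and controller "cancel" presses route through the same `Back()` call, the two input methods cannot drift apart in behavior.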
The UI systems were implemented together with programmer Linus Sellstedt. His portfolio can be found here:
Iteration & Optimization
The UI was continuously iterated through playtesting and team feedback. Early versions were treated as prototypes, allowing me to test ideas quickly, identify friction points, and refine layout and interaction as the project evolved. This approach helped the UI mature alongside the rest of the game instead of being locked too early.
As the system grew, I improved structure and performance by optimizing widget hierarchies and removing unnecessary hit-testing. Repeated Blueprint logic was refactored into reusable components, reducing duplication and making the system easier to maintain. These changes allowed the UI to scale late into production without becoming brittle or difficult to modify.
By the end of the project, the UI had evolved from individual solutions into a cohesive, production-ready system that was performant, readable, and flexible enough to support last-minute changes without introducing instability.

Technical Problem-Solving & Stability
Beyond visual design, a large part of my role involved identifying and resolving technical UI issues that affected stability and usability. This included tracking down focus bugs, input conflicts, and menu state issues that could lead to soft-locks or unintended behavior, especially when switching between input methods.
Maintainability was an early challenge. Hover animations were initially implemented separately for each button, which quickly became difficult to manage. I refactored this into a parent–child animation setup so hover behavior could be reused across all UI elements. This simplified future changes and made it easier for others on the team to work within the system.
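The parent–child refactor above can be illustrated with a small, engine-agnostic sketch (the class names are hypothetical, not the project's Blueprints): the parent owns the hover state and the feedback hook, so every derived button inherits the same behavior instead of duplicating it, and a child that needs custom feedback overrides one method rather than rebuilding the system.

```cpp
#include <cassert>
#include <string>

// Illustrative parent class: hover handling lives here once,
// and all derived UI elements reuse it.
class HoverableWidget {
public:
    virtual ~HoverableWidget() = default;

    void OnHoverEnter() { hovered = true; PlayHoverFeedback(); }
    void OnHoverExit()  { hovered = false; }
    bool IsHovered() const { return hovered; }

    std::string lastFeedback; // exposed for illustration only

protected:
    // Shared default; children override to customize, not to reimplement.
    virtual void PlayHoverFeedback() { lastFeedback = "default_hover"; }

private:
    bool hovered = false;
};

// A derived button only specifies what differs.
class ConfirmButton : public HoverableWidget {
protected:
    void PlayHoverFeedback() override { lastFeedback = "confirm_hover"; }
};
```

The payoff is the one described in the text: a timing or easing change made in the parent propagates to every button, so teammates can adjust one place instead of dozens.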
I also worked with animation curves in Unreal to fine-tune timing and easing, giving hover interactions a more responsive, “floaty” feel that matched the game’s tone. These adjustments improved both player feedback and internal workflow by reducing iteration time while increasing overall polish and reliability.
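The kind of curve that produces that "floaty" overshoot can be sketched as a standard ease-out-back function. This is a generic easing formula (with the commonly used back-easing constants), shown here only to illustrate the shape being tuned, not the project's actual curve assets:

```cpp
#include <cassert>
#include <cmath>

// Ease-out-back: starts fast, overshoots slightly past the target (values
// briefly exceed 1.0), then settles back, which reads as a "floaty" hover.
double EaseOutBack(double t) {
    const double c1 = 1.70158;      // overshoot amount (standard value)
    const double c3 = c1 + 1.0;
    const double u = t - 1.0;
    return 1.0 + c3 * u * u * u + c1 * u * u;
}
```

In Unreal, the same shape would typically live in a curve asset or a UMG animation track, where the overshoot amount can be tuned visually rather than in code.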









