16 posts tagged with "Game Event System"

Game Event System related articles

Layering Visual & API Events: A Best Practices Guide for Scalable Projects

TinyGiants
GES Creator & Unity Games & Tools Developer

As projects grow, one of the most common questions I hear is: "Should I use the visual tools or the scripting API?" The answer is both — but knowing where each approach shines is what separates a clean, scalable event architecture from one that collapses under its own weight.

After seeing how teams of all sizes use GES, I want to share a layered approach that keeps your project maintainable whether you have 50 events or 500.

The Two Worlds of Game Objects

Every Unity project has two fundamentally different categories of objects:

Scene-Resident Objects — things that exist in the Hierarchy at edit time. UI canvases, level geometry, cameras, persistent managers, environment triggers. You can see them, select them, drag references to them.

Runtime-Spawned Objects — prefab instances created through Instantiate() at runtime. Enemies, projectiles, loot drops, pooled VFX, dynamically generated UI elements. They don't exist until the game is running.

This distinction is the foundation of how you should layer your event usage.

Layer 1: Visual Configuration for Scene-Resident Objects

Use the Editor, Behavior Window, and Flow Graph for anything that lives in the scene.

Why? Because scene-resident objects have stable Inspector references. A health bar sitting in your UI Canvas, a door trigger in your level, a background music controller — these objects are right there in the Hierarchy. You can:

  • Open the Game Event Editor, find the event, and click the Behavior button
  • Configure Action Conditions visually (e.g., only trigger the damage flash when health < 30%)
  • Set Schedule timing (0.2s delay before the hit sound, screen shake repeating 3x at 0.1s intervals)
  • Wire up UnityEvent actions by dragging the target object straight from the Hierarchy
  • Use the Flow Graph to orchestrate complex sequences (fade screen → load scene → reposition player → fade in)

This is where designers, artists, and audio engineers thrive. They can tweak game feel — adjust a delay from 0.2s to 0.35s, add a condition that skips the effect on low-HP enemies, reorder a chain sequence — without touching a single line of code and without waiting for a recompile.

Ideal For

  • UI responses — button clicks, panel transitions, HUD updates
  • Level scripting — door opens, traps activate, cutscene triggers
  • Audio events — play/stop/crossfade based on game state
  • Camera behaviors — shake, zoom, follow target switches
  • Environment reactions — lighting changes, particle effects, weather transitions
  • Tutorial sequences — step-by-step chains with conditions

Example

Your OnPlayerDeath event needs to: dim the screen, show a "You Died" panel, play a sound, and disable player input. All four responses are wired to UI and scene objects that already exist in the Hierarchy. This is a textbook case for the Behavior Window — four actions, one event, zero code. A designer can later add a 0.5s delay before the panel appears, or add a condition that skips the sound if the player is underwater, without filing a single code change request.

Layer 2: Scripting API for Runtime-Spawned Instances

Use AddListener(), RemoveListener(), and Raise() for prefab instances that are created at runtime.

Why? Because when you instantiate a prefab, there is no Inspector to drag references into. That enemy you just spawned from an object pool needs to listen for OnPauseGame to freeze its AI. That projectile needs to raise OnEnemyHit when it collides with a target. These bindings must happen in code, at the moment the object comes to life.

using UnityEngine;
using UnityEngine.AI;

public class Enemy : MonoBehaviour
{
    [GameEventDropdown] public GameEvent onPauseGame;
    [GameEventDropdown] public GameEvent onResumeGame;

    NavMeshAgent agent;   // cached so Freeze/Unfreeze can stop the agent

    void Awake() => agent = GetComponent<NavMeshAgent>();

    void OnEnable()
    {
        onPauseGame.AddListener(Freeze);
        onResumeGame.AddListener(Unfreeze);
    }

    void OnDisable()
    {
        onPauseGame.RemoveListener(Freeze);
        onResumeGame.RemoveListener(Unfreeze);
    }

    void Freeze() => agent.isStopped = true;
    void Unfreeze() => agent.isStopped = false;
}

Notice something important: the event assets themselves are still assigned via the [GameEventDropdown] attribute on the prefab. The event reference is visual — it's a drag-and-drop field on the prefab asset. Only the listener registration is code, because the instance doesn't exist at edit time.

Ideal For

  • Spawned enemies/NPCs reacting to global events (pause, slow-motion, area effects)
  • Projectiles and VFX raising events on collision or lifetime expiry
  • Pooled objects that need to subscribe/unsubscribe as they activate/deactivate
  • Dynamically generated UI elements (inventory slots, chat messages, leaderboard rows)
  • Any system where the listener count is unknown at edit time

Example

You're building a tower defense game. Towers are placed at runtime. Each tower needs to listen for OnWaveStarted to begin targeting and OnWaveEnded to enter idle state. Since towers are instantiated dynamically, each one registers its own listeners in OnEnable() and cleans up in OnDisable(). Meanwhile, the wave manager that raises OnWaveStarted might be a scene-resident singleton with its timing configured entirely through the Behavior Window.

Layer 3: The Hybrid — Where the Magic Happens

The real power of GES emerges when you combine both layers intentionally:

Programmers define the event architecture and write the raise/listen code for runtime systems. They decide what events exist, what data they carry, and when they fire.

Designers and artists configure what happens in response using the Behavior Window, Flow Graph, and Condition Trees. They control the game feel, the timing, the conditions, and the visual/audio polish.

Here's a concrete hybrid workflow:

[Programmer writes]
- EnemyHealth.cs: raises OnEnemyDamaged(int damage) when hit
- EnemyHealth.cs: raises OnEnemyDeath(GameObject enemy) when HP <= 0
- WaveManager.cs: dynamically adds/removes listeners as enemies spawn/despawn

[Designer configures in Behavior Window]
- OnEnemyDamaged -> flash the damage number UI, shake the camera (condition: damage > 20)
- OnEnemyDeath -> play death VFX, add score to counter, check wave completion

[Designer configures in Flow Graph]
- OnLastEnemyDeath -> triggers OnWaveComplete
- OnWaveComplete -> chains: show reward panel -> wait 3s -> spawn next wave

The programmer never has to adjust a screen shake duration. The designer never has to write a listener registration. Each person works in their domain of expertise, and the event assets are the shared contract between them.

Practical Guidelines for Scaling

As your project grows, keep these principles in mind:

1. Let Object Lifetime Guide Your Choice

If it's in the scene at edit time → visual. If it's instantiated at runtime → API. This single rule resolves 90% of decisions.

2. Keep Event References Visual, Even in Code

Always use [GameEventDropdown] on your MonoBehaviour fields instead of hardcoding event lookups. This gives you type-safe, searchable dropdowns on prefabs and lets you swap events without code changes.

3. Use the Behavior Window for Response Tuning, Code for Response Logic

If the response is "play this sound after 0.3 seconds when health is below 50%," that's configuration — put it in the Behavior Window. If the response is "calculate damage reduction based on armor type, elemental resistance, and buff stacks," that's logic — write it in code.
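As a sketch of that split, the damage math below belongs in code, while the "flash the UI after 0.3s" side stays in the Behavior Window. The class, method, and formula here are illustrative, not part of GES:

// Hypothetical damage-reduction logic — this is "response logic,"
// so it lives in code, not in a condition tree. The formula and
// the 5%-per-stack numbers are made up for illustration.
public static class DamageRules
{
    public static int Reduce(int raw, float armor, float resist, int buffStacks)
    {
        // Each buff stack shaves 5% off incoming damage, capped at 50%.
        float buffFactor = Mathf.Max(0.5f, 1f - 0.05f * buffStacks);
        float mitigated = raw * (1f - armor) * (1f - resist) * buffFactor;
        return Mathf.Max(0, Mathf.RoundToInt(mitigated));
    }
}

The raiser computes the final value with DamageRules.Reduce(...) and raises the event with the result; designers then tune the presentation (delay, sound, condition thresholds) on that same event visually.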

4. Clean Up Runtime Listeners Religiously

OnEnable() → subscribe. OnDisable() → unsubscribe. No exceptions. This is the single most important habit for preventing memory leaks and ghost listeners in projects with object pooling or frequent scene loads.

5. Use Priority Listeners When Execution Order Matters

When multiple systems respond to the same event, don't rely on registration order. Use AddPriorityListener() with explicit priority values. Save data at priority 1000, update game state at 100, refresh UI at 0, play audio at -100. This makes your execution order self-documenting.
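A minimal sketch of that ordering, using the priority values above. The handler names are illustrative, and the exact AddPriorityListener/removal signatures are an assumption — check the GES API reference:

// Hypothetical bindings that make execution order self-documenting.
// Assumed signature: AddPriorityListener(listener, priority),
// with higher priorities running first.
public class PlayerDeathBindings : MonoBehaviour
{
    [GameEventDropdown] public GameEvent onPlayerDeath;

    void OnEnable()
    {
        onPlayerDeath.AddPriorityListener(SaveGame,     1000); // persist first
        onPlayerDeath.AddPriorityListener(UpdateState,   100); // then mutate state
        onPlayerDeath.AddPriorityListener(RefreshHud,      0); // then redraw UI
        onPlayerDeath.AddPriorityListener(PlayDeathSfx, -100); // audio last
    }

    void OnDisable()
    {
        onPlayerDeath.RemoveListener(SaveGame);
        onPlayerDeath.RemoveListener(UpdateState);
        onPlayerDeath.RemoveListener(RefreshHud);
        onPlayerDeath.RemoveListener(PlayDeathSfx);
    }

    void SaveGame()     { /* write save data */ }
    void UpdateState()  { /* mark player dead */ }
    void RefreshHud()   { /* show death UI */ }
    void PlayDeathSfx() { /* play audio */ }
}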

6. Use the Flow Graph to Make Invisible Relationships Visible

When an event triggers other events (via Triggers or Chains), always model it in the Flow Graph. Six months from now, no one will remember that OnDoorOpened triggers OnLightActivated, OnMusicChanged, and OnTutorialStep3. The Flow Graph makes these relationships discoverable at a glance.

7. Organize by Domain, Not by Type

Structure your event databases around game domains (Combat, UI, Audio, Progression) rather than technical categories (Void Events, Int Events). When your combat designer needs to tune something, they should find everything combat-related in one place.

The Decision at a Glance

Layer               Approach                                Who                         When
Scene objects       Visual (Editor, Behavior, Flow Graph)   Designers, Artists, Audio   Objects exist in Hierarchy at edit time
Runtime instances   Scripting API (AddListener, Raise)      Programmers                 Prefabs instantiated during gameplay
Hybrid              Events as shared contracts              Everyone                    Programmers raise, designers respond

The Takeaway

The goal is not to pick one approach over the other. The goal is to let each team member work with the tools that match their expertise, while the event system acts as the clean boundary between disciplines.

Build the architecture in code. Polish the experience in the editor. Ship the game together.

Cross-Scene Events: The Persistence Problem Nobody Talks About

Your AudioManager plays background music. It subscribes to OnLevelStart to change tracks when the player enters a new area. You put the AudioManager on a DontDestroyOnLoad object so it persists across scene loads. Everything works during development because you're always testing in the same scene.

Then someone loads Level 2 from Level 1 for the first time. The music stops changing. The AudioManager is still alive — DontDestroyOnLoad did its job — but the event subscription didn't survive the transition. Or worse: the OLD subscription is still there, pointing at the destroyed Level 1 version of the event raiser, and the next time something tries to invoke it you get a MissingReferenceException in the middle of gameplay.

This is the persistence problem, and every Unity project with more than one scene hits it eventually.

Debugging the Invisible: Why Event Systems Need Their Own Observability Layer

A QA tester files a bug: "The door doesn't open when the player picks up the key."

Simple, right? Probably a missing reference or a wrong condition. You open the project, pick up the key, and... the door opens fine. Works on your machine. So you ask the tester for reproduction steps, and they say "it happens about 30% of the time, usually after a save/load cycle."

Now you're in debugging hell. Somewhere in the chain between the key pickup event, the inventory update, the quest progress check, and the door's unlock condition, something is failing intermittently. But which link? Was the event not raised? Was it raised but the listener wasn't subscribed? Was the listener subscribed but the condition evaluated to false? Was the condition correct but the door's state was stale after the load?

Event System Pitfalls: Memory Leaks, Data Pollution, and Recursive Traps That Ship in Production

You've been testing your game for 5 minutes at a time. It runs great. Then QA files a report: "Memory usage grows steadily over a 30-minute play session. Frame rate degrades from 60 to 40 after loading 6 scenes." You profile it. There are 847 listeners registered to an event that should have 12. Each scene load added new subscriptions but never removed the old ones. The objects were destroyed, but their delegate references live on, pinning dead MonoBehaviours in memory where the garbage collector can't touch them.

Or this one: "Health values are wrong on the second Play Mode session. First run works fine." You hit Play, test combat, stop. Hit Play again. The player starts with 73 HP instead of 100. ScriptableObject state from the last session bled through because nobody reset it.

Or the classic: the game hangs for 3 seconds, then Unity crashes. Event A's listener raised Event B. Event B's listener raised Event A. Stack overflow. Except sometimes it doesn't crash — it just hangs, eating CPU in an infinite loop that produces no visible error.

These aren't hypothetical. These are bugs I've seen ship in production games. And they all have the same root cause: event system patterns that look correct in isolation but fail at scale.

When Visual Editors Aren't Enough: Building Event Flows at Runtime for Procedural and Dynamic Systems

Your procedural dungeon generator just created a room with three pressure plates and a spike trap. The next room has a lever puzzle connected to a locked door. The room after that is a boss arena where environmental hazards activate based on the boss's health phase. None of these event relationships existed at edit time. The dungeon layout was determined by a seed that the player entered 30 seconds ago.

How do you wire up the events?

With a traditional approach, you write an enormous switch statement. For each room type, manually subscribe and unsubscribe event handlers. For each AI difficulty, manually chain different attack patterns. For each mod-created content piece, manually parse a config file and translate it into event connections. The "manual" part is the problem — you're reimplementing event wiring logic every time the topology changes at runtime.

Visual node editors are fantastic for flows you know at design time. But they fundamentally can't handle flows that don't exist until the game is running. And increasingly, the most interesting game systems are exactly the ones where the event graph is dynamic.

Execution Order Bugs: The Hidden Danger of 'Who Responds First' in Event-Driven Systems

TinyGiants
GES Creator & Unity Games & Tools Developer

The player takes 25 damage. The health system subtracts it from the current HP. The UI updates the health bar. Except the health bar shows 100 instead of 75. You stare at your code for 20 minutes before you realize: the UI listener executed BEFORE the health system listener. The UI read the old HP value, rendered it, and then the health system updated. By the time the data was correct, the frame was already drawn.

You've just discovered execution order bugs, and if you've shipped anything with event-driven architecture, you've probably shipped a few of these without knowing it. They're the kind of bug that works fine in testing because your scripts happened to initialize in the right order, then breaks in production because Unity decided to load things differently.

This isn't a rare edge case. It's a structural flaw in how most event systems work — including Unity's UnityEvent and standard C# event delegates. And once you understand why, you can't unsee it.

Time-Based Events in Unity: Why Coroutines Are the Wrong Tool for Delays, Repeats, and Cancellation

You need to delay an explosion by 2 seconds after a grenade lands. Simple enough. You write a coroutine. IEnumerator DelayedExplosion(), yield return new WaitForSeconds(2f), call the explosion logic. Maybe 10 lines if you're tidy. You feel good about it.

Then your designer says "the player should be able to defuse the bomb." Okay, now you need to store the Coroutine reference so you can call StopCoroutine(). But wait — what if the player defuses it before the coroutine starts? You need a null check. What if the game object gets destroyed mid-wait? Another null check. What if the player defuses it at the exact frame the coroutine completes? Race condition. Your 10 lines are now 25, and you haven't even handled the "show defused message vs. show explosion" branching yet.
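The coroutine version of that defusable grenade might look like the sketch below — names are illustrative, and this is exactly the bookkeeping being described: most of the class is lifecycle management, not gameplay logic.

using System.Collections;
using UnityEngine;

// Illustrative grenade with a defuse path. Note how the guards
// outnumber the actual gameplay lines.
public class Grenade : MonoBehaviour
{
    Coroutine fuse;
    bool defused;

    public void Arm()
    {
        fuse = StartCoroutine(DelayedExplosion());
    }

    public void Defuse()
    {
        if (defused) return;       // already defused
        defused = true;
        if (fuse != null)          // guard: Arm() may never have run
        {
            StopCoroutine(fuse);
            fuse = null;
        }
        // show "defused" feedback here
    }

    IEnumerator DelayedExplosion()
    {
        yield return new WaitForSeconds(2f);
        fuse = null;
        if (defused) yield break;  // defused on the final frame
        Explode();
    }

    void Explode() { /* damage, VFX, sound */ }
}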

This is the story of every time-based event in Unity. The first implementation is clean. The second requirement doubles the code. The third makes you question your career choices.

Parallel vs Sequential: The Two Execution Patterns Every Event System Needs (And Most Don't Have)

Player dies. Death sound and death particles should start at the same instant — no reason to wait for one before starting the other. But the screen fade absolutely MUST finish before the respawn point loads. And the respawn MUST finish before the player teleports. And the teleport MUST finish before the screen fades back in.

That's parallel AND sequential execution in the same flow, triggered by a single event. And here's the uncomfortable truth: most event systems in Unity give you exactly one pattern. Fire an event, all listeners respond, done. Whether those responses should happen simultaneously or in strict sequence? Your problem.

So you solve it. With coroutines. And callbacks. And booleans named _hasFadeFinished. And before you know it, you've built an ad-hoc state machine scattered across six files that nobody — including future-you — can follow.

Invisible Event Chains: Why You Can't Debug What You Can't See

Your player dies. A death sound plays. A ragdoll activates. A UI popup shows "You Died." The game auto-saves. An analytics event fires. A respawn timer starts counting down. That's six different systems, all responding to one event: OnPlayerDeath. But here's my question — WHERE is that documented?

Not in your code. Not in your project management tool. Not in any diagram. It exists in one place: inside the head of whoever originally set it up. And if that person left the team six months ago, it exists nowhere.

This is the dirty secret of event-driven architecture. We adopt it because it decouples our systems. We celebrate that our AudioManager doesn't need a reference to our UIManager. But we never talk about the cost: the flow of execution becomes invisible. And invisible things are, by definition, impossible to debug visually.

Escape if-else Hell: Visual Conditional Logic That Scales

Every game is basically a giant pile of conditions. "Only deal fire damage if the enemy isn't immune AND the player has a fire buff AND a random crit check passes." When you're prototyping, you throw an if-statement into a callback and move on. Thirty seconds. Works. You feel productive.

Then the prototype ships into production. Those thirty-second if-statements start breeding. One becomes five. Five becomes fifty. Fifty becomes "where the hell is the condition that controls the loot drop rate for the second boss?" And now your designer is standing behind you asking if they can change a damage threshold from 0.3 to 0.25, and you're explaining that it'll take a recompile.

Welcome to if-else hell. Population: every Unity project that lasted more than three months.