The Mortality Sequence

  • Writer: michaeldiazcompsci
  • Dec 25, 2024
  • 9 min read

Updated: Jan 27

Game Genre

Immersive Simulation

  • The Mortality Sequence was a project I joined as part of my Ventures class during my last semester at the Florida Interactive Entertainment Academy; I wanted to gain experience with more complex systems and code architectures in games, and felt this project would provide exactly that

    • Given that the intent of the project was to create an immersive sim demo, I worked with the Design Lead to establish a narrative about a humanoid-hyena scavenger unearthing the history of a dead Dyson swarm in their solar system

      • We supplemented the narrative mechanically through a custom "Plug and Socket" system, allowing the player to reconfigure tools, enemies, and the environment to find new approaches to combat and environment traversal


Game Engine

Unity v6

  • The project initially began in Unreal, but we quickly found that learning enough C++ to get more out of the engine than Blueprints allow would take too long for the team; since our artist was already familiar with Unity's graphics and rendering pipelines, we switched engines early in The Mortality Sequence's lifetime

    • Additionally, as Unity 6 had just released at the time and our artist wanted to experiment with its new rendering technologies, it was the perfect opportunity to make the switch


Responsibilities

Technical Designer

  • At its core, we wanted the player in The Mortality Sequence to feel first and foremost like an engineer: more specifically, the stereotypical "redneck engineer", someone who creates new machines by reconfiguring existing ones for some novel purpose

    • To achieve this fantasy in gameplay, we began with the concept of being able to break apart objects into components which could then be 'plugged' into other objects to override or bypass certain functionalities

      • In order to create an intuitive understanding of the system for players, and to abstract it easily for us as developers, we conceptualized a Data class that would use the bridge design pattern, giving each object a type of data to be carried from its plug to a receiving socket (a minimal sketch follows below)

        • The Data class on each plug object could carry a boolean, a float, a 3D vector, or one of our custom data types for lore cartridges or projection reels; each plug and socket would be recognizable to the player in-game via its associated shape and color

        • To allow players to interact with this system during gameplay, we created plug and socket prefabs that could be attached to any mesh and that would interface with the object's script to pass or receive data; the plug and socket scripts made use of the physics engine, letting players move these objects just by grabbing them and bringing them near a compatible socket (even throwing them at a socket would work!)
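
To make the bridge pattern concrete, here is a minimal sketch of how such a Data class hierarchy might look in Unity C#. All names here (PlugData, Plug, Socket, PlugShape) are illustrative stand-ins rather than our actual class names:

```csharp
using UnityEngine;

// Sketch: the "implementor" side of the bridge -- each payload type knows
// how to deliver itself to a socket.
public abstract class PlugData
{
    public abstract void DeliverTo(Socket socket);
}

public class BoolData : PlugData
{
    public bool Value;
    public override void DeliverTo(Socket socket) => socket.ReceiveBool(Value);
}

public class FloatData : PlugData
{
    public float Value;
    public override void DeliverTo(Socket socket) => socket.ReceiveFloat(Value);
}

public class VectorData : PlugData
{
    public Vector3 Value;
    public override void DeliverTo(Socket socket) => socket.ReceiveVector(Value);
}

public enum PlugShape { Bool, Float, Vector, LoreCartridge, ProjectionReel }

// The "abstraction" side: a plug carries some payload and hands it to any
// compatible socket, without knowing the concrete payload type.
public class Plug : MonoBehaviour
{
    public PlugData Payload;       // set by the owning object's script
    public PlugShape Shape;        // shape/color communicate compatibility in-game

    public void ConnectTo(Socket socket)
    {
        if (socket.Shape == Shape) // only matching shapes couple
            Payload.DeliverTo(socket);
    }
}

// A socket forwards received values to whatever behaviour owns it.
public abstract class Socket : MonoBehaviour
{
    public PlugShape Shape;
    public virtual void ReceiveBool(bool value) { }
    public virtual void ReceiveFloat(float value) { }
    public virtual void ReceiveVector(Vector3 value) { }
}
```

The payoff of the bridge is that Plug and Socket never need to know which concrete payload they carry; adding a new data type means adding one subclass, not touching every plug.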

      • For prototyping, I created a variety of objects with isolated functions that could be applied across many circumstances; any component found on an enemy should also be usable, to some degree, by the player (a sketch of two of these sensors follows this list)

        • The camera detects any sentient entity (the player, other enemies) for which the dot product between the camera's forward vector and the normalized vector from the camera to that entity exceeds a threshold, sending the entity's position as a 3D vector

        • The microphone listens for 'noise' events from audio sources within a given radius, performing a raycast between itself and the audio source; each collision with level geometry subtracts from the source's 'volume', and the event is ignored if the volume falls below a threshold, mimicking the sound being absorbed

        • The shield generator emits a field that protects all sockets within a radius from being decoupled, absorbing damage from all weaponry until its battery is depleted, requiring a recharge

        • The ball-and-socket arms on enemies would take in a 3D vector to orient themselves toward a position, and output a boolean to trigger whatever was attached to the end, usually a weapon of some kind

        • The keypad had a series of number keys the player could press to generate a 4-digit code; if the code matched the one preset in the inspector, it would send a "true" boolean value to whatever was attached, followed by a "false" value after a short delay; anything else would prompt the player to try again
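
As a rough illustration of the first two sensors, here is how the camera's view-cone check and the microphone's occlusion check might be implemented; the field names and threshold values are my own placeholders, not values from the project:

```csharp
using UnityEngine;

// Hypothetical sketch of the camera sensor: an entity is detected when the
// dot product of the forward vector and the normalized direction to the
// entity exceeds a threshold (the cosine of half the view angle).
public class CameraSensor : MonoBehaviour
{
    [Range(-1f, 1f)] public float FovThreshold = 0.5f;
    public float Range = 20f;

    public bool TryDetect(Transform entity, out Vector3 position)
    {
        position = entity.position;
        Vector3 toEntity = entity.position - transform.position;
        if (toEntity.magnitude > Range) return false;
        return Vector3.Dot(transform.forward, toEntity.normalized) > FovThreshold;
    }
}

// Hypothetical sketch of the microphone: each surface the ray crosses
// absorbs some of the source's volume before it reaches the sensor.
public class MicrophoneSensor : MonoBehaviour
{
    public float HearingRadius = 15f;
    public float MinVolume = 0.1f;
    public float AbsorptionPerHit = 0.25f; // volume lost per occluding surface

    // Called by the audio-event system when a 'noise' occurs nearby.
    public bool CanHear(Vector3 source, float volume)
    {
        Vector3 toSource = source - transform.position;
        if (toSource.magnitude > HearingRadius) return false;

        RaycastHit[] hits = Physics.RaycastAll(
            transform.position, toSource.normalized, toSource.magnitude);
        volume -= hits.Length * AbsorptionPerHit;
        return volume >= MinVolume;
    }
}
```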

      • Out of all the objects I created, the most complex was the "Cyberdeck", a portable computer the player could carry and input commands into via a custom, diegetic command line
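
A diegetic command line like this is often built as a dictionary dispatch from command names to handlers; here is a minimal, hypothetical sketch (the commands shown are placeholders, not our real command set):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch of a diegetic command line: commands dispatch through
// a dictionary, so a new cyberdeck ability is just another registered entry.
public class Cyberdeck
{
    readonly Dictionary<string, Action<string[]>> _commands = new();

    public Cyberdeck()
    {
        _commands["ping"]  = args => Print("pong");
        _commands["spoof"] = args => Print(
            args.Length > 0 ? $"spoofing core '{args[0]}'..." : "usage: spoof <core>");
    }

    public void Execute(string input)
    {
        string[] parts = input.Split(' ', StringSplitOptions.RemoveEmptyEntries);
        if (parts.Length > 0 && _commands.TryGetValue(parts[0], out var handler))
            handler(parts[1..]);
        else
            Print("unknown command");
    }

    void Print(string line) { /* write the line to the in-game terminal screen */ }
}
```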

  • A large part of the game involved what was essentially an ecosystem of robotic lifeforms finding new purpose after the downfall of their creators; each one fills a specific niche within its mechanical ecosystem, and that niche served as the basis of its design, from the model's visuals to how its 'brain' processed information

    • One core aspect that we knew we wanted as part of the enemy design in The Mortality Sequence was to have the enemies be configurable with the various objects we made

      • Narratively, this made sense as a civilization living on a Dyson swarm would have limited access to resources, and would have to find as many ways to recycle and repurpose the things they did spend those resources on

      • Mechanically, this allowed the plug and socket system to play a large role in how the player interacted with the enemies, giving them multiple options for approaching combat; this in turn fed into the immersive sim genre and a "play how you want" approach

        • In order to code the AI itself, we utilized Goal Oriented Action Planning (GOAP), as its structure had convenient parallels to how the enemies were designed (a minimal sketch follows this list):

          • Each robot would have been built with a singular purpose in mind: "kill a target", "heal allies", "remove hazards", "fix the environment", and "send alerts", to name a few examples. These become the goals for the enemies: a single, fairly ambiguous state that can be expanded based on the context the enemy is in, alongside some more generic goals common to every enemy (retreating, seeking cover, moving to a point, etc.). In-game, these are stored in a swappable "AI Core" on the robot's person

          • To infer that context, each robot spawns with a variety of objects that take in its environment, such as the camera or microphone. These serve as the AI's sensors, periodically updating the variables that make up the world state so it continuously has the information it needs to act

          • In order to act in accordance with its programming, each robot is equipped with a set of objects intended to alter other entities or the environment: flamethrowers, gravity guns, laser cutters, and so forth. These objects all have effects and costs built into the methods in their scripts, allowing the GOAP system to weigh an action's cost against how close it would bring the robot to its current goal

          • Finally, the main body of the robot provides its means of locomotion, allowing us to implement and consider multiple methods of pathfinding, from bipedal humanoids to flying drones
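
Here is a minimal sketch of how those GOAP pieces fit together: world state as named booleans, actions with preconditions, effects, and costs, and a selector that weighs cost against goal progress. The names and the one-step selector are my own simplification, not our actual planner:

```csharp
using System.Collections.Generic;
using System.Linq;

// World state: named facts kept fresh by the sensor objects.
public class WorldState : Dictionary<string, bool> { }

// An action contributed by a plugged-in object (taser, gravity gun, ...).
public class GoapAction
{
    public string Name;
    public float Cost;                                // built into the object's script
    public Dictionary<string, bool> Preconditions = new();
    public Dictionary<string, bool> Effects = new();

    public bool IsRunnable(WorldState state) =>
        Preconditions.All(p => state.TryGetValue(p.Key, out bool v) && v == p.Value);
}

public class GoapAgent
{
    public WorldState State = new();                  // updated by sensor objects
    public List<GoapAction> Actions = new();          // supplied by plugged-in objects
    public Dictionary<string, bool> Goal = new();     // loaded from the swappable AI Core

    // Choose the cheapest runnable action whose effects move the world state
    // toward an unmet goal condition.
    public GoapAction ChooseAction() =>
        Actions
            .Where(a => a.IsRunnable(State))
            .Where(a => a.Effects.Any(e =>
                Goal.TryGetValue(e.Key, out bool wanted) && wanted == e.Value &&
                (!State.TryGetValue(e.Key, out bool current) || current != wanted)))
            .OrderBy(a => a.Cost)
            .FirstOrDefault();
}
```

A production planner would chain actions into multi-step plans (commonly with A* over world states); the one-step selector above just shows how goals, sensors, and object-supplied actions slot together.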

        • The composition of the robot not only allows the player to easily incapacitate it by removing parts until it has no information to use or no means to harm them, but also lets the player turn its existing AI to their own benefit:

          • Consider a statically mounted guard robot: under normal circumstances, if the player enters the view of the guard's camera, it fires a stun at the player using a taser; should the player replace the taser with a healing gun, the robot believes it is still firing on the player in an attempt to kill them, but now consistently heals the player as long as they are in its line of sight

          • Consider a cargo robot programmed to move a heavy box from one QR code to another using a gravity gun: by moving the QR codes themselves, the player can make the robot deliver anything to a location of their choosing; perhaps even an explosive crate to a locked door, blowing it open instead of searching the level for the passcode

          • Consider a drone with an alarm whose task is to follow the player and alert sentries to their position: when its AI core is replaced with one from a robot meant to prevent fires on the Dyson swarm, the player can use a flamethrower to light an area on fire and flee; the drone then senses the fire and acts with the only object it has, the alarm. Sentries flock to the open flame, their sensors break in the heat, and the player walks by undetected

        • We even took it a step further, designing functionality for the cyberdeck that lets the player "spoof" the AI core, overriding the GOAP decision-making framework so the player can perceive through the robot's sensors and act using its objects

          • In doing so, the player not only gains a better understanding of how the robots and their parts work, but also gains access to areas of the level they couldn't reach under normal circumstances: access vents, assembly lines, maintenance channels, etc.

          • We even changed how the game renders to the screen to reflect each sensor's input: IR sensors to see in the dark, movement detection to perceive otherwise imperceptible changes, and even 3D sonar to represent audio-only sensors

      • To prevent the player from abusing the plug and socket system to build a single overpowered robot that follows them throughout the game, we decided a sort of day/night cycle would help reset the environment and undo some of the smaller acts of destruction the player had committed in the level

        • During the 'day', robots operated as normal, running whatever functions they were programmed to do within their local mechanical ecology

        • At 'night', many robots would seek out charging areas for a short duration, while a special niche of "cleaner" robots entered, patrolling the area to fix destroyed infrastructure and collect disassembled, destroyed, or modified robots, repairing them and reintroducing them to the level at the start of the next 'day'
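
Structurally, a reset loop like this can be as simple as a timer that flips phases and broadcasts events the robots subscribe to; a hypothetical sketch (the phase length and event names are mine, not the project's):

```csharp
using System;
using UnityEngine;

// Hypothetical day/night driver: a timer flips phases and raises events that
// robots (chargers, cleaners) subscribe to.
public class DayNightCycle : MonoBehaviour
{
    public float PhaseLength = 300f;     // seconds per phase, illustrative
    public static event Action OnDayStart;
    public static event Action OnNightStart;

    float _timer;
    bool _isDay = true;

    void Update()
    {
        _timer += Time.deltaTime;
        if (_timer < PhaseLength) return;

        _timer = 0f;
        _isDay = !_isDay;
        if (_isDay) OnDayStart?.Invoke();   // cleaners reintroduce repaired robots
        else OnNightStart?.Invoke();        // robots seek chargers, cleaners patrol
    }
}
```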


Duration

4 Months

  • While the design of The Mortality Sequence encompassed many aspects of the game that would be present throughout its entirety (narrative, core mechanics, level concepts), we knew that we only had a short time to implement features and systems for the final presentation

    • As such, given our programmer/tech designer-heavy team, we opted to focus primarily on gameplay through scripting first and foremost, integrating whatever assets we could from our 3D artist as they came in

    • I spent the majority of my time on the project working on the enemy AI and prototyping various objects for the sake of testing the AI and the systems the objects relied on (LoS, Battery Charge, Audio Events)

      • By the time we reached our final deadline, we had a fully modular AI that could be expanded fairly easily by creating new game objects or scriptable objects with baseline AI configurations, along with a variety of interactable objects for levels

        • These included sliding doors, a monorail spline system, a universal coupling wrench for remotely working with sockets, and even a monorail base that the player can drive throughout the level


Lessons Learned

  • At the start of this project I had only a conceptual understanding of how GOAP worked; when initially working with the design lead to conceptualize an AI framework for our enemies, I realized GOAP would serve as a near-perfect parallel to our plug and socket system, and I devoted a large chunk of time to researching GOAP implementations

    • Development was very slow at first, integrating one aspect at a time: I started with the goals, then sensors and pathfinding, before finally integrating a robot's knowledge of the objects attached to it and their methods

      • Once I had a test robot working under specific conditions, I began expanding the system to be more general, compartmentalizing the different sections of code and using techniques such as dependency injection to decrease coupling across classes

      • After the code was fairly independent, I isolated the portions of the AI setup that could become a general framework and created a ScriptableObject template for basic traits, such as movement/turning speed, melee/ranged attack radius, and attack speed, to quickly create archetypes across differing enemy types (sketched below)
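 
A minimal sketch of what such an archetype template might look like; the trait names mirror the ones listed above, but the class and menu names are illustrative:

```csharp
using UnityEngine;

// Designers create one asset per enemy type; the AI reads its traits at startup.
[CreateAssetMenu(menuName = "AI/Enemy Archetype")]
public class EnemyArchetype : ScriptableObject
{
    [Header("Locomotion")]
    public float MoveSpeed = 3f;
    public float TurnSpeed = 120f;   // degrees per second

    [Header("Combat")]
    public float MeleeRadius = 1.5f;
    public float RangedRadius = 12f;
    public float AttackSpeed = 1f;   // attacks per second
}

public class EnemyAI : MonoBehaviour
{
    public EnemyArchetype Archetype; // assigned in the inspector

    void Update()
    {
        // Locomotion and combat code read Archetype.MoveSpeed,
        // Archetype.MeleeRadius, etc., instead of hard-coded values.
    }
}
```

Assigning the asset through the inspector is itself a lightweight form of dependency injection: the AI code depends only on the archetype data, never on a concrete enemy type.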

  • Another challenging technical aspect was that not every enemy could use Unity's default NavMesh system for pathfinding, as some of our enemy types were capable of flight/hovering

    • Given that we wanted a dynamic system that would react to geometric changes in the environment, we knew our solution needed to be as optimized as possible

      • In the end, we took inspiration from Unity 6's light probe system, creating a sort of 3D grid of arbitrary points and running a variation on the A* pathfinding algorithm over it to build a path even when vertical traversal was needed

        • To make the movement more natural, we utilized some techniques from procedural animation, giving the flying robots growing inertia when flying in a straight line and running a continuous linear interpolation on their rotational angles as they turned; the result was a smoother, less point-to-point motion
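
Here is a hypothetical sketch of A* over such a 3D point grid; the details of our actual variation (probe placement, blocked-cell baking, heap choice) are elided, and all names are illustrative:

```csharp
using System.Collections.Generic;
using UnityEngine;

// A* over a 3D grid of points for flying enemies. Blocked cells would be
// baked from level geometry (e.g. with Physics.CheckBox), elided here.
public class FlightGrid
{
    public HashSet<Vector3Int> Blocked = new();

    static readonly Vector3Int[] Steps =
    {
        new(1, 0, 0), new(-1, 0, 0), new(0, 1, 0),
        new(0, -1, 0), new(0, 0, 1), new(0, 0, -1),
    };

    public List<Vector3Int> FindPath(Vector3Int start, Vector3Int goal)
    {
        var open = new List<Vector3Int> { start };
        var cameFrom = new Dictionary<Vector3Int, Vector3Int>();
        var gScore = new Dictionary<Vector3Int, float> { [start] = 0f };
        var fScore = new Dictionary<Vector3Int, float> { [start] = Heuristic(start, goal) };

        while (open.Count > 0)
        {
            // Linear scan for the lowest f-score; a real implementation
            // would use a binary heap instead.
            Vector3Int current = open[0];
            foreach (var node in open)
                if (fScore[node] < fScore[current]) current = node;
            open.Remove(current);

            if (current == goal) return Reconstruct(cameFrom, current);

            foreach (Vector3Int step in Steps)
            {
                Vector3Int next = current + step;
                if (Blocked.Contains(next)) continue;

                float tentative = gScore[current] + 1f; // uniform edge cost
                if (!gScore.TryGetValue(next, out float g) || tentative < g)
                {
                    gScore[next] = tentative;
                    fScore[next] = tentative + Heuristic(next, goal);
                    cameFrom[next] = current;
                    if (!open.Contains(next)) open.Add(next);
                }
            }
        }
        return null; // goal unreachable from start
    }

    // Straight-line distance underestimates the unit-cost grid distance,
    // so the heuristic stays admissible.
    static float Heuristic(Vector3Int a, Vector3Int b) => Vector3Int.Distance(a, b);

    static List<Vector3Int> Reconstruct(
        Dictionary<Vector3Int, Vector3Int> cameFrom, Vector3Int current)
    {
        var path = new List<Vector3Int> { current };
        while (cameFrom.TryGetValue(current, out current)) path.Add(current);
        path.Reverse();
        return path;
    }
}
```

The inertia and rotation smoothing described above would then consume these grid points as waypoints rather than snapping between them.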

  • Finally, while I have extensive experience working with inheritance in scripts, the sheer scale of making an immersive simulation meant we'd have many objects that utilized multiple systems

    • For example, the shield generators on robots needed to interact with the physics system to be held/unplugged, the plug and socket system for data transfer, and the battery system for usage

      • Rather than attempting to have each object inherit from master classes for each of these systems, we made use of Unity's existing component system, writing self-contained scripts that let an object interact with their respective systems and then adding those scripts as components to the object in question

        • In doing so, we sped up our iteration process when prototyping objects by avoiding a refactor of the code for every specific variation of a given object, such as regular cameras vs. motion detectors vs. IR sensors
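
Using the shield generator as an example, here is a hypothetical sketch of this composition approach (class names are illustrative):

```csharp
using UnityEngine;

// Each system gets its own self-contained component; an object like the
// shield generator is just the sum of the components attached to it,
// with no master-class inheritance.
public class Battery : MonoBehaviour
{
    public float Charge = 100f;

    public bool TryDrain(float amount)
    {
        if (Charge < amount) return false;
        Charge -= amount;
        return true;
    }
}

public class Grabbable : MonoBehaviour // physics system: hold / throw / unplug
{
    Rigidbody _body;
    void Awake() => _body = GetComponent<Rigidbody>();
    public void Throw(Vector3 impulse) => _body.AddForce(impulse, ForceMode.Impulse);
}

public class ShieldGenerator : MonoBehaviour // gameplay logic, reusing the above
{
    Battery _battery;
    void Awake() => _battery = GetComponent<Battery>();

    void Update()
    {
        // The field stays up only while the battery can pay the upkeep cost.
        bool fieldActive = _battery.TryDrain(1f * Time.deltaTime);
        // ... protect sockets in radius from decoupling while fieldActive ...
    }
}
```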

