
Trappist Landing

3D First-Person Narrative-Driven Exploration Game

Tool: Unity Engine (C#)

Team Size: 16

My Role: Gameplay Programmer

Development Time: 8 Months

      

About The Project

  • Trappist Landing is a 3D atmospheric narrative experience where the player takes the role of Sarah, an astronaut who has crash-landed on the planet Trappist 1-e. Work with your partner to explore the strange alien life on the planet, avoid hostile creatures, and find a way off the planet.

  • This game is my senior game project at DigiPen, which I joined as a gameplay programmer. It has been a challenging but fun experience working with five other programmers to support a full art and sound design team.

Role in the team

  • As a gameplay programmer, I work closely on everything related to the player controller: the controller implementation itself, player-object interaction, the arm interface controller, the scanning mechanic, the player breath system, the headbob system, and even plant behaviors that react to the player's footsteps, such as the Lightbulb.

  • Another part of my role on the team is game design, where I work with the other designers to work through and solve problems such as the automatic cover system, the stealth mechanic, UX design, controller feel, and button mapping.

Contributions:

  • Communicate extensively with programmers, designers, and artists to come up with solutions to various gameplay system, optimization, and design problems.

  • Implement a custom NavMesh-based first-person controller that can switch to rigidbody-based movement at run-time.

    • Highly customizable settings for all states, including idle/walk/run/peek speeds, transition curves and durations, cover detection range, run FOV effect, and jump.

    • Fine-tune and balance the character controller to give a sense of wearing a heavy spacesuit.

    • Set up an alien encounter that uses the run-time NavMesh/rigidbody switch.

      

  • Automatic cover system for the player controller, used in the in-game stealth encounters.

  • Expand the player controller so it can switch from navigation-mesh to physics-based movement at run-time (a minimal sketch of this switch appears after this list).

  • Dive into the tightly coupled interaction detection code and re-implement it from scratch.

  • Implement the scanning and touch mechanics that let the player detect and interact with objects in the game (sketched after the captions below).
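
The controller code itself isn't shown on this page, but here is a minimal sketch of the NavMesh-to-rigidbody switch mentioned above, assuming a NavMeshAgent and a Rigidbody on the same player object. The class and member names are illustrative, not the project's actual code.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Minimal sketch of a first-person motor that normally moves on the NavMesh but can
// hand control over to the Rigidbody at run-time (e.g. for the alien encounter).
// Class and member names are illustrative, not the project's actual code.
[RequireComponent(typeof(NavMeshAgent))]
[RequireComponent(typeof(Rigidbody))]
public class HybridPlayerMotor : MonoBehaviour
{
    [SerializeField] private float moveSpeed = 2.5f;   // slow pace to sell the heavy spacesuit

    private NavMeshAgent agent;
    private Rigidbody body;
    private Vector3 wishVelocity;
    private bool usePhysics;

    private void Awake()
    {
        agent = GetComponent<NavMeshAgent>();
        body = GetComponent<Rigidbody>();
        body.isKinematic = true;                       // start constrained to the NavMesh
    }

    // Called by gameplay events to swap movement back-ends at run-time.
    public void SetPhysicsMode(bool enabled)
    {
        usePhysics = enabled;
        agent.enabled = !enabled;                      // only one system drives the transform
        body.isKinematic = !enabled;
    }

    private void Update()
    {
        Vector3 input = new Vector3(Input.GetAxis("Horizontal"), 0f, Input.GetAxis("Vertical"));
        wishVelocity = transform.TransformDirection(input) * moveSpeed;

        if (!usePhysics)
            agent.Move(wishVelocity * Time.deltaTime); // stays glued to the NavMesh surface
    }

    private void FixedUpdate()
    {
        if (usePhysics)                                // physics-driven movement keeps gravity on Y
            body.velocity = new Vector3(wishVelocity.x, body.velocity.y, wishVelocity.z);
    }
}
```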

      

 

Scanning mechanic to collect information

Touch mechanic to interact with objects
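
The re-implemented interaction detection isn't reproduced here; below is a rough sketch of one camera-raycast approach to scanning and touching objects. The IInteractable interface, the input button names, and the other identifiers are assumptions for illustration, not the project's actual API.

```csharp
using UnityEngine;

// Illustrative interface an in-game object could implement to respond to scan/touch.
// This is an assumed shape, not the project's actual interaction API.
public interface IInteractable
{
    void OnScanned();   // reveal collected information
    void OnTouched();   // trigger the physical interaction
}

// Rough sketch of camera-based interaction detection: a single raycast from the
// view centre decides what the arm interface can currently scan or touch.
public class InteractionDetector : MonoBehaviour
{
    [SerializeField] private Camera playerCamera;
    [SerializeField] private float interactRange = 3f;
    [SerializeField] private LayerMask interactMask = ~0;

    private void Update()
    {
        Ray ray = playerCamera.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));
        if (!Physics.Raycast(ray, out RaycastHit hit, interactRange, interactMask))
            return;

        IInteractable target = hit.collider.GetComponentInParent<IInteractable>();
        if (target == null)
            return;

        if (Input.GetButtonDown("Scan"))      // hypothetical button names; the real mapping differed
            target.OnScanned();
        if (Input.GetButtonDown("Touch"))
            target.OnTouched();
    }
}
```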

Player controller profile serialized as a .asset file
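
In Unity, settings serialized as a .asset file are typically a ScriptableObject. Here is a sketch of what such a controller profile might contain, with field names guessed from the settings listed earlier rather than taken from the project.

```csharp
using UnityEngine;

// Sketch of a movement profile asset: designers can create one from the Create menu,
// tune it in the Inspector, and the controller reads from it at run-time.
// Field names are guesses based on the settings listed above.
[CreateAssetMenu(fileName = "PlayerControllerProfile", menuName = "Trappist/Player Controller Profile")]
public class PlayerControllerProfile : ScriptableObject
{
    [Header("Speeds")]
    public float idleSpeed = 0f;
    public float walkSpeed = 2.0f;
    public float runSpeed = 4.5f;
    public float peekSpeed = 1.0f;

    [Header("Transitions")]
    public AnimationCurve transitionCurve = AnimationCurve.EaseInOut(0f, 0f, 1f, 1f);
    public float transitionDuration = 0.35f;

    [Header("Cover / Camera")]
    public float coverDetectionRange = 1.5f;
    public float runFovBoost = 8f;          // extra FOV while running

    [Header("Jump")]
    public float jumpHeight = 1.0f;
}
```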

  • Create a breath system that determines the player's breathing and heartbeat SFX, volume, and pitch based on the player's action state, breathing state, fear intensity, and heartbeat intensity.

  • Implement a headbob system that uses Perlin noise to create natural head motion and dispatches a footstep-down event for playing SFX and driving gameplay-reactive plants such as the Lightbulb (sketched after this list).

  • Implement the Lightbulb's emissive behavior, which takes the player's speed, distance, and footstep events into account to simulate the plant reacting to a sound wave (sketched further below).

  • Create a generic script for animating individual UI properties (translation, scale, color, text size, opacity, etc.) and grouping them into sequenced animations (a stripped-down sketch appears at the end of this page).
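
As a rough illustration of the headbob bullet above, here is a condensed sketch that combines Perlin noise with a footstep-down event. The event name, thresholds, and values are mine, not the project's.

```csharp
using System;
using UnityEngine;

// Condensed sketch of a Perlin-noise headbob that also raises a footstep event
// when the bob passes its low point. Event name and thresholds are illustrative.
public class HeadbobSketch : MonoBehaviour
{
    public static event Action<Vector3> FootstepDown;     // SFX and reactive plants can subscribe

    [SerializeField] private Transform cameraPivot;
    [SerializeField] private float bobAmplitude = 0.06f;
    [SerializeField] private float bobFrequency = 1.8f;   // scaled by movement speed in practice

    private float bobTime;
    private bool wasDown;

    private void Update()
    {
        bobTime += Time.deltaTime * bobFrequency;

        // Perlin noise gives a smooth but non-repeating vertical offset,
        // which reads as more organic than a pure sine wave.
        float noise = Mathf.PerlinNoise(bobTime, 0.37f);       // roughly 0..1
        float offset = (noise - 0.5f) * 2f * bobAmplitude;     // -amp..+amp
        cameraPivot.localPosition = new Vector3(0f, offset, 0f);

        // Treat the lower part of the bob as the foot hitting the ground
        // and fire the event once per dip.
        bool isDown = noise < 0.35f;
        if (isDown && !wasDown)
            FootstepDown?.Invoke(transform.position);
        wasDown = isDown;
    }
}
```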

 

      

Lightbulb plant triggered by the player's footstep events from the headbob system
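
As a rough sketch of the behavior in the caption above, the plant below turns footstep events and distance into an emissive pulse. It subscribes to the footstep event from the headbob sketch earlier; the material property, falloff, and other values are assumptions, and the player-speed factor mentioned above is omitted for brevity.

```csharp
using UnityEngine;

// Rough sketch of a Lightbulb-style plant that pulses its emissive colour when a
// footstep lands nearby, with the pulse strength falling off with distance.
public class LightbulbPlantSketch : MonoBehaviour
{
    [SerializeField] private Renderer bulbRenderer;
    [SerializeField] private Color glowColor = Color.cyan;
    [SerializeField] private float hearingRange = 8f;
    [SerializeField] private float fadeSpeed = 2f;

    private float glow;                          // current pulse intensity, 0..1
    private MaterialPropertyBlock block;

    private void Awake() => block = new MaterialPropertyBlock();

    private void OnEnable() => HeadbobSketch.FootstepDown += OnFootstep;
    private void OnDisable() => HeadbobSketch.FootstepDown -= OnFootstep;

    private void OnFootstep(Vector3 stepPosition)
    {
        // Closer (louder) footsteps push the glow higher, as if reacting to a sound wave.
        float distance = Vector3.Distance(transform.position, stepPosition);
        float loudness = Mathf.Clamp01(1f - distance / hearingRange);
        glow = Mathf.Max(glow, loudness);
    }

    private void Update()
    {
        glow = Mathf.MoveTowards(glow, 0f, fadeSpeed * Time.deltaTime);   // decay back to dark
        block.SetColor("_EmissionColor", glowColor * glow);
        bulbRenderer.SetPropertyBlock(block);
    }
}
```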

Hint UI appearance animation made with the generic animation script
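
The generic UI animation script isn't shown on this page; below is a stripped-down sketch of the idea, animating a single property along a curve and chaining steps into a sequence with coroutines. All names are illustrative, and in practice each MonoBehaviour would live in its own file.

```csharp
using System.Collections;
using UnityEngine;

// Stripped-down sketch of a generic UI tween: scale one RectTransform along a curve.
// The real script handled position, colour, text size, opacity, and more.
public class UiTweenSketch : MonoBehaviour
{
    [SerializeField] private RectTransform target;
    [SerializeField] private AnimationCurve curve = AnimationCurve.EaseInOut(0f, 0f, 1f, 1f);
    [SerializeField] private float duration = 0.25f;
    [SerializeField] private Vector3 fromScale = Vector3.zero;
    [SerializeField] private Vector3 toScale = Vector3.one;

    public IEnumerator Play()
    {
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            float k = curve.Evaluate(t / duration);
            target.localScale = Vector3.LerpUnclamped(fromScale, toScale, k);
            yield return null;                          // wait one frame
        }
        target.localScale = toScale;                    // snap to the final value
    }
}

// Plays a list of tweens one after another, e.g. for the hint UI appear animation.
public class UiSequenceSketch : MonoBehaviour
{
    [SerializeField] private UiTweenSketch[] steps;

    private IEnumerator Start()
    {
        foreach (UiTweenSketch step in steps)
            yield return StartCoroutine(step.Play());   // each step finishes before the next starts
    }
}
```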

©2017 by Rittikorn Tangtrongchit.