Drug Runner


Technical Design Document

 

 

Final Draft

November 23rd 2001

 

 

 

 

 

Andrew Nealen, Alan Woo, Connie Fung, Cynthia Lim, James Haslam, Serif Askin

Table of Contents

1 Architectural Design and Implementation

1.1 Model-View-Controller Architecture

1.1.1 Basic logging and printing through CObject’s print methods

1.1.2 Main Game Loop

1.1.3 Timer

1.1.4 Object Factory

1.1.5 Kernel (or ‘Mini Kernel’)

1.1.6 Controllers and the State Machine Pattern

1.1.7 Model(s)

1.1.8 Model Database

1.1.9 Type Database

1.1.10 Triggers

1.1.11 Spatial Index

1.1.12 Gateway

1.1.13 View

1.1.14 Manager(s)

1.1.15 Event Queue

1.2 Single Root Class Hierarchy

2 World Representation

2.1 Overview

2.2 Rendering Representation

2.2.1 Models

2.3 AI / Game-Logic Representation

3 Behavioral Patterns

3.1 Overview

3.2 Dynamic Entities

3.2.1 Player Vehicle

3.2.2 Police Vehicles

3.2.3 Boss Vehicle

3.2.4 Other Vehicles

3.2.5 Mission Critical Entities

3.3 Static Entities

3.3.1 World ‘Behavior’

3.3.2 HUD

3.3.3 Streetlights / Lampposts

3.3.4 (Burning) Trashcans

3.3.5 Traffic Lights

3.3.6 Trees

3.3.7 Decals

3.3.8 Missions

4 Game-State Management (Game Flow) For Player Vehicle

4.1 Overview

4.2 Game-State Class

5 AI / Path-finding

6 Collision Detection/Resolution

7 Vehicular Physics

8 Sound

8.1 Overview

8.2 Background Music

8.3 Predominant Sound

8.4 Player Vehicle Sounds

8.5 Sound Events

8.5.1 Events Relevant to Sound

8.5.2 Sound Event Priorities

8.6 Recording and Storing Sounds

8.7 Sampling Sounds

8.8 DSManager Class

8.9 CSoundManager Class

8.10 Header File For Wave Files Used

9 Input

9.1 Overview

9.2 Player Input Events

9.3 Transforming Input into Data

9.4 The CEventManager Class

9.5 CInput Class

9.6 Input Manager Class

10 Camera Logic

11 Rendering

11.1 Use of Graphics API

11.2 Animation and Textures

11.3 Access to World and Static Entities

11.4 Access to Dynamic Entities

11.5 Camera

11.6 Special Effects/Particles

11.6.1 Particle System

11.7 Use of Cel-Shading

11.7.1 1D Texture Dynamic Cel-Shading

12 Art Requirements, Resources, Definition and Pipeline

12.1 Required Art Assets

12.1.1 Vehicle Art Requirements

12.1.2 Static Object Art Requirements

12.1.3 Particle System Art Requirements

12.1.4 Animation Object Art Requirements

12.1.5 Fonts

12.1.6 Miscellaneous Art Requirements

12.2 Definition Files

12.2.1 World Representation File

12.2.2 Object Representation Files

12.2.3 Vehicle Representation Files

12.2.4 Animated Object Representation Files

12.2.5 Particle System Definition Files

12.2.6 Directory Structure of Art Assets

12.3 Art Pipeline: Introducing New Art, Modifying Existing Art

12.3.1 Creating New Models

12.3.2 Creating a Model Definition File

12.3.3 Populating the Directories

12.3.4 Including New Models in the World Representation

12.3.5 Adjusting Model Parameters

12.3.6 Populating the Type Database

12.4 External Resources

12.5 The Wavefront .obj File Format

13 The Prototype

 

 

  1 Architectural Design and Implementation

    1.1 Model-View-Controller Architecture

      The very high-level diagram of the Model-View-Controller architecture is displayed in figure 1.1.1.

       

       

       

      Fig 1.1.1: High Level MVC Overview

       

       

      The game engine of ‘Drug Runner’ will be built according to the MVC paradigm. This cleanly decouples the model from the view by assigning controllers to each model (or group of closely associated models); these controllers are the only elements of the engine with read/write access to the world objects. To demonstrate this architecture, all relevant components of a prototype engine are described in detail in the following sections. This overview of the core elements is not completely exhaustive, as the later sections of this document on world objects and their behaviour describe each game entity in more detail and also give an idea of how those entities tie into the overall game engine. What this section does describe is the method by which these entities are created, how they are placed in the so-called ‘model database’, how their controllers are instantiated, how the links between the two are set up, and so on. The complete source of the prototype and an executable are included with this document. Note that the prototype implements only the CSphereModel, so all descriptions refer to this model.

       

      But first we present a slightly more detailed view of the major subsystems of the game engine and their interactions, depicted by the arcs connecting them (see figure 1.1.2).

       

      Figure 1.1.2: Components of the MVC Game Engine of Drug Runner

       

       

      The descriptions of the functionality of each of these systems, and of how they interact, are as follows.

       

      1.1.1 Basic logging and printing through CObject’s print methods

      Most classes in the game are derived from a base class:

         

         

        class CObject {

        public:

        CObject();

        virtual void ToString();

        virtual void ToString(std::string _indent);

        virtual void LogToFile(CLogfile* _file);

        virtual void LogToFile(std::string _indent, CLogfile* _file);

         

        protected:

        std::string m_objectID; // unique identifier for each object (for debug)

        private:

        static int m_newID; // counts already used (object) id's

        };

         

         

        This class has methods to print to stdout or to a dedicated logfile, which can (and should!) be overridden by subclasses to make debugging much easier. The logfile is a globally accessible singleton object, so any subclass of CObject can write debug output to it. Also, as the base constructor is always called, this class assigns each object a unique ID (stored as an STL string) to identify it among all other objects in the world, which can also be helpful in tracking bugs during development.
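        A minimal sketch of how the base constructor can hand out those unique IDs from the static counter. The `GetObjectID()` accessor and the `"obj_"` prefix are illustrative additions, not part of the prototype's interface:

```cpp
#include <string>

// Pared-down re-statement of CObject: the base constructor builds a
// unique string ID from a static counter shared by all instances.
class CObject {
public:
    CObject() : m_objectID("obj_" + std::to_string(m_newID++)) {}
    virtual ~CObject() {}
    const std::string& GetObjectID() const { return m_objectID; } // illustrative accessor
protected:
    std::string m_objectID; // unique identifier for each object (for debug)
private:
    static int m_newID;     // counts already used (object) id's
};

int CObject::m_newID = 0;
```

Because every subclass runs this constructor, each object in the world gets a distinct ID with no extra effort in the derived classes.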

      1.1.2 Main Game Loop

        The main game loop looks like this (in CDemoEngine.cpp):

         

        bool CDemoEngine::RunFrame() {

        GetInput(); // get input

        if (m_timer->HasTickPassed()) { // for each frame

        m_colMgr->resolveCollisions(); // resolve collisions

        m_kernel->RunProcesses(m_timer->FrameRatio()); // move stuff

        m_renderer->RenderScene(); // render stuff

        }

        return true;

        }

         

         

        So we need a timer class which knows when a game tick has passed (see class CTimer in the next section). GetInput() takes care of the input queue and processes its events by propagating them to the controllable CControllers (mainly the player's vehicle). Then collisions are detected and resolved, and the appropriate models and controllers are informed and updated in case of a collision. Now comes the heart of the ‘AI’: the RunProcesses() method of the CKernel class. Here, all processes (all controllers, as they implement the CProcess interface) are updated once per game tick. If the desired framerate cannot be maintained, the frame ratio is scaled accordingly, which keeps the physics/game-logic simulation at a constant real-time speed independent of the current framerate. After all processes are updated, the renderer object (the singleton CRenderer) renders all world geometry and static/dynamic world entities by accessing the model database via a spatial index (grid). The header file associated with the main loop, CDemoEngine.h, is:
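        The frame-ratio scaling mentioned above can be sketched as follows. The `Vec3`/`Integrate` names are illustrative, not engine code; the point is that per-tick displacement is multiplied by the frame ratio (1.0 = target frame time reached):

```cpp
// Framerate-independent movement: velocity is expressed in units per
// target frame, so a frame that took twice as long (frameRatio == 2.0)
// advances the object twice as far, keeping real-time speed constant.
struct Vec3 { float x, y, z; };

Vec3 Integrate(const Vec3& pos, const Vec3& vel, float frameRatio) {
    Vec3 p = pos;
    p.x += vel.x * frameRatio;
    p.y += vel.y * frameRatio;
    p.z += vel.z * frameRatio;
    return p;
}
```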

         

         

        class CDemoEngine {

        public:

        CDemoEngine(COpenGLWindow* glWind);

        ~CDemoEngine();

         

        void Init(); // some initialization

        bool RunFrame(); // run the simulation

        void GetInput(); // get the players input (todo)

        void Close(); // stuff to do before closing

        void DebugOutput(); // some debug output and tests

        void DumpKernelToFile(); // dump contents of the kernel

         

        private:

        CKernel* m_kernel;

        CModelDB* m_modelDB;

        CLogfile* m_logFile;

        CPhysics* m_physics;

        CCollisionMgr* m_colMgr;

        CObjectFactory* m_objFactory;

        COpenGLWindow* m_glWind;

        CTimer* m_timer;

        CRenderer* m_renderer;

         

        float m_timeRunning;

        };

         

      1.1.3 Timer

      In the main loop above we use a timer to query whether a game tick has passed. The header of the implementing class is:

         

         

         

        class CTimer : public CObject {

        public:

        CTimer() {}

        ~CTimer() {}

        void Init(double _targetFrameTime);// must be called to pass targetFrameTime

        // and start the timer

        bool HasTickPassed(); // asks if one targetFrameTime has passed and

        // updates the timer accordingly

        void UpdateTimer(); // alternative to 'hasTickPassed', simply updates

        // all values and doesn't impose a fixed framerate

        float FrameRatio(); // returns the timers current frameRatio

        // where 1.0 means the targetFrameTime is reached

        // < 1.0 means the frameTime is smaller

        // > 1.0 means the frameTime is larger

        // the frameRatio should be used to 'scale' the

        // movement of the AI to make it framerate

        // independent

        double FrameTime(); // returns the frameTime with 1/frameTime =

        // framesPerSec (FPS)

        void ToString(); // CTimer's version of ToString() and LogToFile()

        void ToString(std::string _indent);

        void LogToFile(CLogfile* _file);

        void LogToFile(std::string _indent, CLogfile* _file);

         

        private:

        // four abstractions used by the operating system

        void QueryTicksPerSec(); // queries the win32 counter for ticksPerSecond

        void TimeStart(); // queries for the start ticks (m_startTime)

        void TimeEnd(); // queries for the current ticks (m_currentTime)

        double TimeValue(); // returns the frameTime based on m_startTime and

        // m_currentTime (see os-specific code below)

        float m_frameRatio;

        double m_frameTime;

        double m_targetFrameTime;

        #ifdef _WIN32 // some more platform specific stuff

        LARGE_INTEGER m_startTime;

        LARGE_INTEGER m_currentTime;

        LARGE_INTEGER m_ticksPerSecond;

        #else

        struct timeval m_startTime;

        struct timeval m_currentTime;

        struct timezone m_tz;

        #endif

        };

         

         

        Note that the timer was designed with both linux and win32 operating systems in mind. This interface offers all the functionality necessary to time the main loop (given at least millisecond precision, which is not a problem under either win32 or linux). It will still need a stopwatch function to pause the simulation when the game state changes to the frontend or the main game is in a ‘paused’ state (see game-state management).
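        A portable sketch of the tick logic, written against std::chrono purely for illustration (the real CTimer queries QueryPerformanceCounter on win32 and gettimeofday on linux, as the header shows):

```cpp
#include <chrono>

// Sketch of HasTickPassed()/FrameRatio(): once at least one target frame
// time has elapsed, the frame ratio is the actual elapsed time divided by
// the target, and the timer restarts for the next tick.
class TickTimer {
public:
    void Init(double targetFrameTime) {
        m_targetFrameTime = targetFrameTime;
        m_frameRatio = 1.0f;
        m_start = Clock::now();
    }
    bool HasTickPassed() {
        double elapsed =
            std::chrono::duration<double>(Clock::now() - m_start).count();
        if (elapsed < m_targetFrameTime)
            return false;                 // tick not yet complete
        m_frameRatio = static_cast<float>(elapsed / m_targetFrameTime);
        m_start = Clock::now();           // restart for the next tick
        return true;
    }
    float FrameRatio() const { return m_frameRatio; }
private:
    using Clock = std::chrono::steady_clock;
    Clock::time_point m_start;
    double m_targetFrameTime = 0.0;
    float m_frameRatio = 1.0f;
};
```

A frame ratio of exactly 1.0 means the target frame time was hit; larger values mean the frame ran long and movement must be scaled up accordingly.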

         

      1.1.4 Object Factory

When the game initializes, all entities must be created and their controllers registered with the kernel (which processes the controllers via their Update() function). To achieve this, the prototype has a class CObjectFactory, which is responsible for creating models and controllers and placing them in the model database and kernel, respectively. The interface is:

 

class CObjectFactory : public CObject {

public:

~CObjectFactory();

static CObjectFactory* Instance(); // singleton access

CModel* Create(int _type); // create an object of type '_type'

CModel* Create(int _type, vector3 pos);// create an object of type '_type'

// at position 'pos'

void ToString(); // CObjFactory's ToString() and

void ToString(std::string _indent); // LogToFile()

void LogToFile(CLogfile* _file);

void LogToFile(std::string _indent, CLogfile* _file);

 

////////////////////

// some object types

////////////////////

static const int PLAYER_VEHICLE_OBJECT;

static const int OBSTACLE_VEHICLE_OBJECT;

static const int STATIC_OBJECT;

static const int SPHERE_OBJECT;

static const int CELL_OBJECT;

 

private:

CObjectFactory();

static CObjectFactory* m_instance; // the singleton instance

 

//////////////////

// factory methods

//////////////////

CModel* CreatePlayerVehicle(vector3 _pos);

CModel* CreateObstacleVehicle(vector3 _pos);

CModel* CreateStaticObject(vector3 _pos);

CModel* CreateSphere(vector3 _pos);

CModel* CreateCell(vector3 _pos);

};

 

 

This is a general concrete factory for various game objects. Note (as mentioned above) that only the sphere model is implemented, even though other symbolic constants are present. The factory can access the kernel and the model database through their unique instances within the application, as they are implemented as singletons (described in more detail below). A Create() factory method basically does the following (see the source code for the implementation):
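A heavily simplified sketch of those steps, using minimal stand-ins for CModel, CController, CModelDB and CKernel (passed in directly here rather than accessed as singletons):

```cpp
#include <list>

// Sketch of a Create() factory method: build the model, build its
// controller, wire the two together, then register the model with the
// model database and the controller with the kernel.
struct Model;
struct Controller {
    std::list<Model*> models;                 // the m_cModelList of the prototype
    void AddModel(Model* m) { models.push_back(m); }
};
struct Model {
    Controller* controller = nullptr;         // back link, as in CModel
};
struct ModelDB {
    std::list<Model*> models;
    void Push(Model* m) { models.push_back(m); }
};
struct Kernel {
    std::list<Controller*> procs;
    void AddProcess(Controller* c) { procs.push_back(c); }
};

Model* CreateSphere(ModelDB& db, Kernel& kernel) {
    Model* model = new Model();               // 1. create the model
    Controller* contr = new Controller();     // 2. create its controller
    contr->AddModel(model);                   // 3. link controller -> model
    model->controller = contr;                //    and model -> controller
    db.Push(model);                           // 4. store model in the database
    kernel.AddProcess(contr);                 // 5. schedule controller updates
    return model;
}
```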

 

 

As this engine has merely prototype status, a few important issues are not yet included (they will be in the final engine): (1) No object caching is performed, so when a model dies it is simply deleted, and a new one is allocated when needed. Dead models will be cached in the final engine to reduce allocation overhead, which is especially significant when many small objects are created, such as in a particle shower (see the section on particles for a detailed description of memory caching for particles). (2) In the prototype, each model has a unique controller. This is not always the case, as we can see from the particle example. The final engine will incorporate a 1:n relation between controllers and models, so tightly bound models can all be attached to the same controller.

 

      1.1.5 Kernel (or ‘Mini Kernel’)

      The singleton class CKernel is the main processing unit for all controllers (it aggregates controllers). Once per game tick, all controllers in the kernel are updated via the Update(float _frameRatio) method. See the following interface definition:

         

         

        typedef std::list<CProcess*> ProcessList;

        typedef std::list<CProcess*>::iterator ProcessListItor;

         

        class CKernel : public CObject {

        public:

        ~CKernel();

        static CKernel* Instance(); // singleton access

        void RunProcesses(float frameRatio); // run all processes in the list

        void AddProcess(CProcess* process); // add a new process to the list

        void ToString(); // CKernel's version of ToString()

        void ToString(std::string _indent); // and LogToFile()

        void LogToFile(CLogfile* _file);

        void LogToFile(std::string _indent, CLogfile*);

        private:

        CKernel();

        static CKernel* m_instance; // the singleton instance

        ProcessList m_procList; // STL list of processes

        };

         

         

        RunProcesses() simply iterates over all currently active processes, which are defined by the following interface (implemented by all controller classes):

         

         

        class CProcess : public CObject {

        public:

        CProcess();

         

        virtual void Update(float frameRatio) = 0;

        virtual int GetType() = 0;

        virtual bool IsAlive() = 0;

        virtual std::string GetProcID() = 0;

        protected:

        std::string m_procID; // unique identifier for each object (for debug)

        private:

        static int m_newID; // counts already used (process) id's

        };

         

         

        Notice that each process has its own unique identifier, in addition to the unique ID of the object itself. The processes (controllers) themselves are state machines of varying complexity. The kernel can be dumped to the logfile at any time to help in debugging the process queue.
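        A sketch of what RunProcesses() amounts to. The dead-process removal shown here is an assumption about the final engine (the interface exposes IsAlive() for this purpose); the prototype only iterates and updates:

```cpp
#include <list>

// Minimal stand-in for CProcess with the members RunProcesses() needs.
struct Process {
    virtual ~Process() {}
    virtual void Update(float frameRatio) = 0;
    virtual bool IsAlive() = 0;
};

typedef std::list<Process*> ProcessList;

// Update every live process once per game tick; drop dead processes.
void RunProcesses(ProcessList& procs, float frameRatio) {
    for (ProcessList::iterator it = procs.begin(); it != procs.end(); ) {
        if ((*it)->IsAlive()) {
            (*it)->Update(frameRatio);
            ++it;
        } else {
            it = procs.erase(it); // erase() is safe with std::list iterators
        }
    }
}

// Tiny demo process that counts its updates (for illustration only).
struct CountingProcess : Process {
    int updates = 0;
    bool alive = true;
    void Update(float) { ++updates; }
    bool IsAlive() { return alive; }
};
```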

         

      1.1.6 Controllers and the State Machine Pattern

      Every entity in the world that is involved in per-frame updates has an associated controller with read/write access to the model state. Each controller is a more or less complex state machine maintaining the model’s behavioural state (the world state, such as position, color, etc., is held in the model itself, as the renderer needs access to this information). This state information is maintained by an implementation of the state pattern (see below for more details). The CController interface definition is:

         

         

        typedef std::list<CModel*> ControllerModelList;

        typedef std::list<CModel*>::iterator ControllerModelListItor;

        typedef std::vector<CState*> StateList;

         

        class CController : public CProcess {

        public:

        static CController* Create(int _type); // factory method

        // must be given a type

        ~CController();

         

        void Update(float frameRatio); // called once per frame

        int GetType(); // dummy function

        bool IsAlive(); // don't update it anymore, delete it

        void Kill(CModel* _model); // accessed by model to notify of

        // deletion controller only removed

        // from processlist if the number of

        // controlled objects == 0

        std::string GetProcID(); // unique (process) identifier:

        // (a static member of CProcess)

        void AddModel(CModel* _model); // add a model to the list of

        // models controlled from here

        void RemoveModel(CModel* _model);// remove a model from the list

         

        void ToString(); // CControllers's version of ToString()

        void ToString(std::string _indent); // and LogToFile()

        void LogToFile(CLogfile* _file);

        void LogToFile(std::string _indent, CLogfile* _file);

         

        // some controller types

        static const int PLAYER_VEHICLE_CONTROLLER;

        static const int OBSTACLE_VEHICLE_CONTROLLER;

        static const int STATIC_OBJECT_CONTROLLER;

        static const int SPHERE_MOTION_CONTROLLER;

         

        // so state objects can access everything the controller can

        // TODO: should find a way to make the base class a friend

        // but this didn't work yet

        friend class CNotIntersectedState;

        friend class CIntersectedState;

         

        protected:

        CController(int _type);

        int m_type; // the type (types listed in controller.cpp)

        bool m_isAlive; // for deletion from the kernel

        // not yet implemented: TODO

        ControllerModelList m_cModelList;// the associated model(s) to update

        CState* m_state; // current controller state

        StateList m_stateList; // all possible states this controller can be in

        };

         

        class CSphereController : public CController {

        public:

        ~CSphereController();

        void Update(float frameRatio); // override the Update() function

        friend class CController;

        protected:

        CSphereController(int _type);

        };

         

         

        Controllers are objects that can be associated with multiple models that behave identically (that are in the same state at the same time), as described in the previous section, so the controller has an m_cModelList member, which holds all controlled models. On each update, the controller iterates over all models and updates them according to the current behavioural state.

         

        The state the controller is in is defined by the member m_state, which points to a CState object. CState is an abstract interface for all states a controller can be in; it is implemented by all concrete subclasses of CState, such as CIntersectedState in the prototype. On instantiation, each controller builds a list of states it can be in during execution (this is created in CController’s factory Create() method). After each update (in each frame), the current state object checks if a state change is necessary. If so, it sets the member variable m_state of CController to the next state. The state pattern implements the state transitions, completely removing any switch-case statements from the controller. Expanding the behaviour of any controller is made easy this way. Admittedly, state changes will not be as visible as in a switch-case statement, but the overall implementation is cleaner. The CState interface is straightforward:

         

         

        class CState : public CObject{

        public:

        virtual void Update(float frameRatio) = 0; // called once per frame

        void AttachController(CController* _contr);

        protected:

        CController* m_controller; // the controller this state is attached to

        };

         

        class CIntersectedState : public CState {

        public:

        void Update(float frameRatio); // override the Update() function

        };

         

        class CNotIntersectedState : public CState {

        public:

        void Update(float frameRatio); // override the Update() function

        };

         

         

        The two heirs of CState are the possible states the sphere controller can be in. The Update() method in CSphereController then simply does this:

         

        void CSphereController::Update(float frameRatio) {

        m_state->Update(frameRatio);

        }

         

         

        In the current version of the prototype engine, this pattern is used to reduce the speed of two spheres while they intersect. It also raises the damage level of the spheres during intersection, but this is currently not displayed in the view. To activate it, there is some code in CSphereModel’s Draw() method which is currently commented out; simply remove the comments, comment out the red-blue indexing and recompile.
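        A compressed sketch of this behaviour. Unlike the prototype (where states store a back pointer via AttachController()), the controller is passed as a parameter here, and the speed/damage handling is reduced to a single illustrative `speedScale` field; the transition logic is the point:

```cpp
// State pattern sketch: each state's Update() does its work and then
// decides whether to swap the controller's m_state to another state,
// replacing any switch-case in the controller itself.
struct Controller;

struct State {
    virtual ~State() {}
    virtual void Update(Controller& c, float frameRatio) = 0;
};

struct Controller {
    State* state;      // current behavioural state (m_state)
    bool intersected;  // set externally, e.g. by the collision manager
    float speedScale;  // illustrative stand-in for slowing the spheres
};

struct NotIntersectedState : State { void Update(Controller& c, float); };
struct IntersectedState    : State { void Update(Controller& c, float); };

// one shared instance per state, reused by all controllers of this type
static NotIntersectedState g_notIntersected;
static IntersectedState    g_intersected;

void NotIntersectedState::Update(Controller& c, float) {
    c.speedScale = 1.0f;                          // full speed
    if (c.intersected) c.state = &g_intersected;  // transition
}
void IntersectedState::Update(Controller& c, float) {
    c.speedScale = 0.5f;                          // slow down while intersecting
    if (!c.intersected) c.state = &g_notIntersected;
}
```

Adding a new behaviour then means adding a new CState subclass and wiring its transitions, with no changes to the controller's Update().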

         

      1.1.7 Model(s)

      The model system keeps track of all objects in the game, including vehicles, bosses, walls, buildings, etc. Each of these objects has a location, orientation, appearance and other state within the game. Although the state of the object is kept by the model (or, more specifically, the model database), it is the job of the controller to change this state depending on the conditions applied to the object.

         

        Models will be implemented hierarchically. For example, a vehicle class may be the superclass of the boss vehicle, player vehicle and pedestrian vehicle classes, allowing for more specific state information.

        A typical CModel header file (taken from the prototype):

         

        typedef std::vector<CGeometry*> AppearanceMap;

        typedef std::vector<CGeometry*>::iterator AppearanceMapItor;

         

        class CModel : public CObject {

        public:

        static CModel* Create(int _type);

        ~CModel();

         

        int GetType(); // get model type (not used)

        virtual void Draw(); // render the model (accessed from

        // the CRenderer singleton object

        void AddAppearanceRep(CGeometry* _rep); // add representation to list

        void ToString(); // CModel's version of ToString()

        void ToString(std::string _indent); // and LogToFile()

        void LogToFile(CLogfile* _file);

        void LogToFile(std::string _indent, CLogfile* _file);

         

        static const int PLAYER_VEHICLE;

        static const int OBSTACLE_VEHICLE;

        static const int STATIC_OBJECT;

        static const int SPHERE;

        static const int CELL;

         

        vector3 pos;

         

        // allow model database and object factory to

        // access private and protected members of

        // all models (for setting up connections and 'pushing' models

        // into the model database)

        friend class CModelDB;

        friend class CObjectFactory;

         

        protected:

        CModel(int _type);

        int m_type;

        CController* m_controller;

        CGeometry* m_geometry; // points to currently rendered representation

        // in the appearance map

        AppearanceMap m_appMap; // stores the different rendering rep's

        };

         

        ///////////////////////

        // a derived demo model

        ///////////////////////

         

        class CSphereModel : public CModel {

        public:

        ~CSphereModel();

         

        virtual void Draw(); // render the sphere

        void ToString(); // CSphereModel's ToString()

        void ToString(std::string _indent);

        void LogToFile(CLogfile* _file);

        void LogToFile(std::string _indent, CLogfile* _file);

         

        float radius; // spheres radius

        vector3 vel; // speed vector

        float color[3]; // 3 floats from 0.0 - 1.0 for rgb color

        float damage; // damage level of sphere

        bool isIntersected;// flags intersection for damage increase

         

        // so the factory method in CModel can create a type CSphereModel

        // CModel must be declared a friend of this class

        friend class CModel;

         

        protected:

        CSphereModel(int _type);

        };

         

         

        Again, to showcase the prototype, the CSphereModel is fully implemented as a subclass of CModel. During the coding of the prototype it became painfully apparent that, very often, a downcast is needed to access CSphereModel-specific functionality. This is a known problem and will have to be resolved in the next engine iteration. A possible solution exists in ‘Effective C++’ by Scott Meyers.
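        The problem can be illustrated as follows. This sketch uses dynamic_cast as the type-safe stopgap; it is not the book's solution, which centers on avoiding the need for the cast altogether (e.g. by widening the base interface):

```cpp
// Illustration of the downcast problem: sphere-specific members are
// invisible through a CModel*, so code holding base pointers must cast.
struct CModel {
    virtual ~CModel() {}         // polymorphic base, required for dynamic_cast
};
struct CSphereModel : CModel {
    float radius = 1.0f;         // sphere-specific state
};

// Returns the radius if (and only if) the model really is a sphere.
float RadiusOrZero(CModel* m) {
    if (CSphereModel* s = dynamic_cast<CSphereModel*>(m))
        return s->radius;        // sphere-specific access after the downcast
    return 0.0f;                 // not a sphere
}
```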

         

        Furthermore, to decouple the controller completely from the rendering representation, each model has an AppearanceMap (an STL typedef), which stores all possible appearances of the model. This can be used for geometry animation effects, but also for animating textures. In the prototype it is used for the different colors of the spheres, which is overkill in this context but paves the way for more powerful and easy-to-implement ideas. The appearance map aggregates CGeometry objects, which are defined as follows:

         

         

        class CGeometry : public CObject {

        public:

        CGeometry();

        ~CGeometry();

         

        virtual void Draw(); // render the geometry

        void ToString(); // CGeometry's version of ToString()

        void ToString(std::string _indent); // and LogToFile()

        void LogToFile(CLogfile* _file);

        void LogToFile(std::string _indent, CLogfile* _file);

        };

         

        class CSphereGeometry : public CGeometry {

        public:

        CSphereGeometry();

        ~CSphereGeometry();

        virtual void Draw(); // draw regular sphere

        };

         

        class CSphereGeometryWithDamage : public CSphereGeometry {

        public:

        CSphereGeometryWithDamage(float _r, float _g, float _b);

        ~CSphereGeometryWithDamage();

        virtual void Draw(); // draw 'damaged' sphere

        private:

        float m_color[3];

        };

         

         

        CModel’s Draw() method delegates to CGeometry’s Draw() method, which then performs the drawing (using OpenGL in this case). CModel’s Draw() method is responsible for setting up the modelview matrix and deciding on the proper representation based on the model’s current state. In the prototype, the sphere controller flags the model through the boolean member variable isIntersected, so the model knows when to switch representations.

         

         

      1.1.8 Model Database

      Since many models must be kept track of simultaneously, the models will be stored in a model database. This will aid in model creation/destruction, serialization (loading and saving of the models) and model lookup by the spatial index. The current implementation uses an STL <map> to store the models, with the unique ID as the key into the map. This yields O(log n) lookup and insertion time. We expect this to be sufficient, but it could be optimised to O(1) using a hash table.

         

        At the most basic level, the model database is one or more data structures that stores pointers to object instances.

         

        The interface is as follows:

         

        typedef std::map<std::string, CModel*> ModelList;

        typedef std::map<std::string, CModel*>::iterator ModelListItor;

        typedef std::map<std::string, CModel*>::const_iterator ConstModelListItor;

         

        class CModelDB : public CObject {

        public:

        ~CModelDB();

        static CModelDB* Instance(); // singleton access

        void Push(CModel* _model);// add model to database

        CModel* Pop(CModel* _model);// remove model from database

        CModel* Pop(std::string _modelID);// remove model from database

        const ModelList GetModels(); // returns a list of models

        void ToString(); // CModelDB's version of ToString()

        void ToString(std::string _indent); // and LogToFile()

        void LogToFile(CLogfile* _file);

        void LogToFile(std::string _indent, CLogfile* _file);

         

        friend class CCollisionMgr;

        private:

        CModelDB(); // singleton pattern: make contructor private

        // and hold a pointer to the unique instance

        static CModelDB* m_instance; // singleton instance

        ModelList m_modelList; // the models stored here

        int m_size; // num models stored here

        };

         

         

        This is the data structure used by the renderer to iterate over all renderable models (including the static world geometry) and draw them to screen using each model’s Draw() method, which delegates to the associated CGeometry (see the interface definition in the previous section).
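        The render iteration just described can be sketched as below, with CModel pared down to what the loop needs (the `drawCalls` counter is illustrative instrumentation, not engine state):

```cpp
#include <map>
#include <string>

// Minimal stand-in for CModel: Draw() would set up the modelview matrix
// and delegate to the current CGeometry; here it just counts calls.
struct CModel {
    virtual ~CModel() {}
    virtual void Draw() { ++drawCalls; }
    int drawCalls = 0;
};

typedef std::map<std::string, CModel*> ModelList;

// The renderer walks the model database (keyed by unique object ID)
// and delegates to each model's Draw() method.
void RenderScene(const ModelList& models) {
    for (ModelList::const_iterator it = models.begin();
         it != models.end(); ++it)
        it->second->Draw();
}
```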

        Sub-Databases

         

        Terrain information is completely independent of mobile model data and sound data, and the different kinds of data will have different memory and lookup optimization needs. It therefore makes sense to store them in separate structures. This is especially relevant when using the spatial index.

         

        Terrain and building information will be indexed by a 2-dimensional hash table. The hashes will be city blocks centered about the middle of the block. Other models will be indexed again by a 2-dimensional hash table, but this time hashes will be city blocks centered at the intersections of the roads.

      1.1.9 Type Database

Models share many common components. For instance, all vehicles, regardless of whether they are police or player vehicles, may use the same artwork for the wheels. Different buildings may use the same brickwork texture. Furthermore, this type of information is usually loaded from disk and remains static throughout the game. A type database is therefore used to store all static information of this kind; the models then refer to the type database.

 

Benefits of a type database include reduced memory usage, since shared data is stored only once, and faster loading, since static data is read from disk a single time.

 

Within the game, all information loaded from disk will be stored in the type database. The database will basically consist of CGeometry objects to which the models point. The current prototype omits this data structure, but it will be a substantial part of the final engine.
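A sketch of the load-once, share-everywhere idea. The key type, the Get() API and the `.obj` placeholder are assumptions for illustration, not the final interface:

```cpp
#include <map>
#include <string>

// Stand-in for the static, disk-loaded data the type database holds.
struct Geometry { std::string meshFile; };

// Type database sketch: the first request for a type creates (loads) the
// entry; every later request returns the same shared instance, so models
// of the same type never duplicate their static data.
class TypeDB {
public:
    Geometry* Get(const std::string& type) {
        std::map<std::string, Geometry*>::iterator it = m_types.find(type);
        if (it != m_types.end())
            return it->second;           // already loaded: share it
        Geometry* g = new Geometry();
        g->meshFile = type + ".obj";     // placeholder for a real disk load
        m_types[type] = g;
        return g;
    }
private:
    std::map<std::string, Geometry*> m_types;
};
```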

      1.1.10 Triggers

Triggers are a type of dummy object. They can be either activated or deactivated. When activated, their job is to change the game state. Triggers have locations and "trigger conditions". They are models insofar as they have state and location, and can therefore be stored in and accessed through a model database.

 

In Drug Runner, a trigger will have a location, which will appear as a dot in the HUD. In the game world, a trigger will appear as a billboarded pistol. Activated and deactivated triggers will appear differently, both in the HUD and in the world.

 

Triggers will be activated in one of several ways:

 

A typical use of a trigger is a mission object that has a direct influence on the game state.

      1. Spatial Index

The spatial index keeps track of the position of models within the world and their corresponding location in the model database.

 

Queries that the spatial index responds to:

 

Models in different databases are indexed differently by the spatial index.

Terrain and building information will be stored in the terrain database. A 2-dimensional hash table will index these models. The hashes will be city blocks centered about the middle of the block.

Other models stored in the mobile model database will be indexed again by a 2-dimensional hash table, but this time hashes will be city blocks centered at the intersections of the roads.

      1. Gateway

It is critical that all spatial indices be synchronized with their mobile model database. When models need to be changed, the gateway will update both model and spatial index simultaneously. The gateway also provides a simplistic interface to the models themselves.

 

The gateway will implement the following commands:

 

 

To keep all models synchronized, any updates are done outside the game world itself. This gives rise to two states that every model can be in: "in world" and "in void". Objects are "pushed" from "in world" to "in void", all updates are applied "in void", and the object is then "popped" back "into world". Models that are "in void" do not get rendered.

 

Note that this pattern is not implemented in the prototype, but will be added to the final engine.
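A minimal sketch of the push/pop protocol described above, with illustrative class and member names (the document does not fix a gateway interface):

```cpp
#include <cassert>

// Hypothetical model record; only the fields needed for the sketch.
struct Model {
    int   id;
    bool  inVoid;   // true while the model is "in void"
    float x, z;     // world position
};

class Gateway {
public:
    // "Push" a model from "in world" to "in void"; while in void it is
    // neither rendered nor visible to spatial-index queries.
    void Push(Model& m) { m.inVoid = true; }

    // Apply pending updates, then "pop" the model back "into world",
    // at which point the spatial index would be re-synchronized.
    void Pop(Model& m) {
        m.inVoid = false;
        // ...update the spatial index with m.x, m.z here...
    }
};
```

The renderer would simply skip any model whose inVoid flag is set.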

      1. View
      2. This is an abstraction of the renderer. It implements commands such as RenderScene(). In the prototype, all the renderer (a singleton) does is iterate over all models and call their Draw() methods. This is possible in the prototype because we know that there are only spheres. In future versions, though, the model database will maintain multiple lists of objects, categorizing them as static entities (hydrants, lampposts, patches of grass), dynamic entities (obstacle vehicles, police cars) and world geometry (buildings, trees). The current (basic) rendering interface is:

         

         

        class CRenderer {
        public:
            ~CRenderer();
            static CRenderer* Instance();     // singleton access
            void Init(COpenGLWindow* glwind); // set up OpenGL
            void RenderScene();               // renders all models in the
                                              // model database
        private:
            CRenderer();
            static CRenderer* m_instance;     // the singleton instance
            COpenGLWindow* m_glWind;          // the context to render to
        };

         

         

        The COpenGLWindow class is fairly straightforward and not included in this technical document. See the provided source code for the class interface and its implementation.

      3. Manager(s)

An interface between the view, user inputs, models and the controllers. There are many different managers in the system, which will be implemented in a hierarchy. There will be a generic CEvent class defining the common interface for all kinds of events, with methods such as getSource() and getType(). For more information on how a manager operates, see section 1.2, Event-based architecture.

 

The different managers include:

 

Managers (and those that get managed) will be implemented using an observer pattern. The sequence in which a manager normally performs its duties is as follows:

 

 

Some events, such as user requests, behave differently, however, and may involve bypassing this normal process. It is up to the manager to decide what to do in these cases.

 

The only manager implemented in the prototype is the CCollisionMgr, which iterates over all models and detects collisions without resolving them. The algorithm currently used is a non-optimised O(n²) algorithm, meant for demo purposes only. Optimization using spatial partitioning and plane sweeping will be implemented in the final engine.

      1. Event Queue

Event queues ensure that user requests, requests to resolve collisions, and sound requests are processed in the correct order and do not get lost. The queue is implemented as a prioritized queue storing CEvent types. Every manager will have an event queue.

 

Therefore the event queues in the system will be:

 

Events of greater priority, such as a "pause game" event, may get put higher up in the priority queue than other events. Note that none of these modules are implemented in the prototype yet.
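A minimal sketch of such a prioritized queue over CEvent, using the STL; the priority values, the event type strings, and the comparator layout are assumptions for illustration.

```cpp
#include <cassert>
#include <queue>
#include <string>
#include <vector>

// Simplified stand-in for the generic CEvent class.
struct CEvent {
    int priority;        // higher value = processed sooner (assumed)
    std::string type;    // e.g. "pause_game", "collision"
};

// Comparator producing a max-heap on priority.
struct ByPriority {
    bool operator()(const CEvent& a, const CEvent& b) const {
        return a.priority < b.priority;
    }
};

typedef std::priority_queue<CEvent, std::vector<CEvent>, ByPriority> EventQueue;
```

With this layout, a "pause game" event pushed with a high priority value surfaces ahead of ordinary events regardless of insertion order.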

 

Figure 1.1.3: One possible View of the Model-View-Controller Architecture (MVC)

 

 

Figure 1.1.3 shows just one possible view of the high-level software architecture used in Drug Runner (MVC). Basically, every model (CModel) in the world database (CWorldDatabase, omitted here for clarity) is associated with a controller. The controllers implement all game logic/AI for their associated models. These are simple processes (CProcess, which is also the base class of CController; not visible in figure 1.1.3). The singleton manager (CKernel) steps through all processes and runs the Update() method on each one in the list. The controller is the only object allowed to write to the model database. The renderer only has read access, via a spatial index which culls non-visible grid cells before rendering. The controllers are also the state machines of their associated models. This is implemented through a state pattern: the controller maintains a list of possible states it can be in and a single pointer to the current state (abstract class/interface CState). All other methods that rely on state-related behavior delegate to the appropriate virtual function in the dynamically bound CConcreteState.
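The state pattern described above can be sketched as follows; the concrete state names are illustrative assumptions, not the final engine's states.

```cpp
#include <cassert>
#include <string>

// Abstract state interface; the controller delegates to this.
class CState {
public:
    virtual ~CState() {}
    virtual std::string Update() = 0; // dynamically bound behavior
};

// Two hypothetical concrete states for illustration.
class CPatrolState : public CState {
public:
    std::string Update() { return "patrolling"; }
};

class CChaseState : public CState {
public:
    std::string Update() { return "chasing"; }
};

// The controller holds a pointer to its current state and delegates
// all state-dependent behavior to it.
class CController {
public:
    CController(CState* s) : m_current(s) {}
    void SetState(CState* s) { m_current = s; }
    std::string Update() { return m_current->Update(); }
private:
    CState* m_current;
};
```

Swapping the current-state pointer changes the controller's behavior without any switch statements in the controller itself.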

    1. Single Root Class Hierarchy

 

Figure 1.3.1: Basic Schema of a Single Root Class Hierarchy

 

 

Like in Java, each class used in the game engine will inherit from a base class named CObject. This class defines a basic interface for debug output (logging) and other methods such as ToString(). This will simplify the debugging process tremendously. It will also allow grouping any kind of game-related entity, algorithm or data structure into unified containers. Further down the inheritance graph we will see entities such as CVehicle, CBossVehicle, and so on. The figure above is by no means complete; it is a general scheme by which the engine will be implemented.
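A minimal sketch of the single-root idea; only ToString() is shown here, and the intermediate class names are taken from the examples above.

```cpp
#include <cassert>
#include <string>

// Root of the entire engine hierarchy; every class derives from it.
class CObject {
public:
    virtual ~CObject() {}
    virtual std::string ToString() const { return "CObject"; }
};

class CVehicle : public CObject {
public:
    std::string ToString() const { return "CVehicle"; }
};

class CBossVehicle : public CVehicle {
public:
    std::string ToString() const { return "CBossVehicle"; }
};
```

Since everything is a CObject, heterogeneous containers (e.g. a list of CObject pointers) can hold any engine entity, and logging code can call ToString() uniformly through the base pointer.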

 

  1. World Representation
    1. Overview
    2. There will exist two independent indexes into the world database from the so-called spatial index, implemented as a 2D array of grid cells, each containing a hashtable or STL map of dynamic and static entities (or, to be more precise, pointers to entities in the world database). One representation will be used mainly for view-frustum culling: the renderer needs (read-)access to the world database on every frame to update the positions of all dynamic objects in the world (vehicles, decals, particles, etc.). To minimize the data transferred to the graphics hardware, the spatial index will implement queries on the grid cells inside the view volume. The other representation will be used for proximity tests by the AI/game-logic module: both the physics/obstacle-vehicle simulation and the collision detection engine will query the spatial index to receive the ‘area of interest’ and so minimize necessary processing. This is only a first iteration; perhaps the two indexes can be merged into one large index and queried differently depending on usage (AI or rendering).

    3. Rendering Representation

For the purposes of rendering, objects are indexed into cells within the world databases. Each cell represents one street block, including half of each of the roads around it (i.e. one side of the street), and contains references to all C++ objects representing entities (things in the real world) that are located in this world space. The underlying world databases, which store the objects themselves, differentiate between them based on the type of entity a particular object represents:

 

Dynamic entities have AI controllers or are controlled by the player, and can move themselves.

 

Static entities cannot move themselves once they are placed into the world, but can be moved or destroyed by contact with dynamic entities. Their state may also be changed, either by an AI controller (as in the case of traffic lights) or by contact with dynamic objects. For example, when a car (dynamic entity) runs over a fire hydrant (static entity), the hydrant becomes a spray of water shooting from the ground (a particle system – a static entity with a controller).

 

World entities cannot be moved once placed at startup, and although they may sustain "damage" caused by contact with dynamic entities, the damage is superficial (rendered with a decal) and does not affect the underlying world entity.

 

      1. Models

The renderer should have a "read-only" view of the objects. The C++ objects that represent entities (the models in our model-view-controller architecture) provide read access to their position, state, and geometry. However, the controllers manipulate these objects too, and so the models must also expose functionality that allows the caller to change position and state – in other words, provide write access. Also, there are some aspects of the models, such as the lists of polygons that make up their geometry, or the textures that they incorporate, that should be hidden from the controllers.

 

A way to handle this is to separate each (dynamic) model into two objects – an AI object and a rendering object. The renderer will deal only with rendering objects, and the controllers only with AI objects. A rendering object knows about its position and orientation, animation state, and the geometry (in the type database) that describes both its triangle list and any associated textures. An AI object knows about its position and orientation, AI state (possibly), and bounding volume (for collision), among other things.

 

However, there are a couple of reasons to avoid this scheme:

 

Instead, every entity (dynamic or otherwise) will be represented by a single model. Since the triangles and textures in a model’s geometry (and its bounding volume) are actually kept in the type database, the model only needs to store animation state information on top of the information needed by an AI object.

 

This way of doing things does sacrifice elegance by making the model fully read- and writeable to the renderer, so as a compromise the model has a non-mutative Draw method that performs all the calls necessary to render the model.

 

class CModel : public CObject {
public:
    vector3 position;
    vector3 orientation;

    // Animation state fields
    // Will need n of each if there are n > 1 animations.
    // Customize and hard code for each model type, or replace each field
    // with an n-size dynamic array, indexed by order of animation (material)
    // appearance in the geometry.
    bool isAnimOn;
    double animStartTime;
    // ...AI state is maintained by the controller

    // Modifies modelview matrix, then delegates to geometry->Draw()
    void Draw();

private:
    CGeometry *geometry; // the type in the type database
};

 

For more details on all aspects of rendering, please refer to the Rendering chapter.

 

 

    1. AI / Game-Logic Representation

For AI purposes, a grid cell is defined as a square region centred on an intersection. There are 21 by 11 cells in the first level: 19 by 9 intersection cells, 4 corner cells, and the remainder are edge cells. The AI only knows which objects are located in a given cell, but not the positions of the objects within the cell. The objects themselves hold the location information.

 

Each cell contains a list of static objects and a list of dynamic objects. No static objects will be inserted into a cell, but some will be removed. Dynamic objects will be both inserted into and removed from cells. The AI view of the world has a 2-dimensional array of cell pointers and the size of each cell. It provides methods to transfer objects from one cell to another. An object is defined to be inside a given cell if the centre point of the object is within the boundary of the cell. As an object's centre crosses a boundary, it is removed from the current cell and inserted into the new cell. The AI is focused on the player's car, and therefore holds a pointer to it. The AI also provides a function that returns the 9 (or more) cells surrounding the player, since these cells are within the radius of interest. Simulation and collision detection will be performed on the resulting cells.

 

The classes for the AI view and cell are defined as follows:

 

typedef std::list<CModel*> ModelList; // a list of object models

 

 

 

 

// CCell represents a cell in the AI view

class CCell : public CObject {
public:
    ~CCell();
    void insert(CModel* _model);
    // insert an object into the list of dynamic objects
    void removeStatic(CModel* _model);
    // remove an object from the list of static objects
    void removeDynamic(CModel* _model);
    // remove an object from the list of dynamic objects

protected:
    CCell();              // constructor initializes both lists to empty
    ModelList m_static;   // the list of static objects
    ModelList m_dynamic;  // the list of dynamic objects
};

 

// CAIindex is the AI representation of the world

class CAIindex : public CObject {
public:
    ~CAIindex();
    static CAIindex* instance(); // return singleton instance
    void move(CModel* _model, int& _cell, float _pos[3], int _direction);
    // move an object to a neighbour cell in the
    // direction given
    int insert(CModel* _model, float _pos[3]);
    // insert an object into a cell that contains
    // the given world coordinate
    CCell* getNeighbor(); // return the 9 cells around the player’s car

    static const int RIGHT; // possible directions for the move function
    static const int LEFT;
    static const float m_cellSize; // the height and width of each cell in
                                   // world coordinates

private:
    CAIindex(ModelList* _models); // constructor, places all models on the
                                  // list into the correct cells
    static CAIindex* m_instance;  // pointer to singleton instance
    CCell* m_grid[21][11];        // 2-dimensional array of pointers to cells
    CModel* m_playerCar;          // pointer to the player’s car object
};

 

The initial creation of the AI index will take in a pointer to the list of all objects in the world and place them efficiently into the corresponding cells.

 

Each dynamic object has a controller, which acts as the brain of the object. The controller stores AI-related information and is responsible for running the physics simulation on the object. Based on the result, the controller will modify the state of the object, such as changing its position or setting a flag. The controller’s behavior depends on its current state. Depending on the type of controller, the number and nature of the states varies. For example, an obstacle vehicle may have the following states: driving straight, right turn, left turn, acceleration and braking. The physics required in each state is different from any other state. The states are created as the controller is created. The transition between states is triggered by some game event. Continuing the obstacle vehicle example, as the vehicle approaches an intersection, it randomly chooses a direction to head. If it chooses to go straight, then the controller state should be set to driving straight. Similarly, if it chooses to turn right, then the controller state should be set to right turn. The physics for turning right should only run until the vehicle has turned 90 degrees. Once the turn angle reaches 90 degrees, the state of the controller should be reset to driving straight.

 

The controller has a pointer to the AI index of the world, the cell number that the object is located in, and its x and y position with respect to the cell’s origin. This information is needed to keep the AI index consistent with the actual positions of world objects. At each time tick, after the physics simulation, the controller has to check whether the object has moved outside its current cell. If it has, a call to the AI index’s move function will relocate the object to the correct cell. The cell number and x, y position then need to be updated to keep things consistent.

 

CAIindex* m_AIindex; // pointer to the AI representation
std::vector<int> m_cell; // cell number that each model is located in
std::vector<float*> m_cellCoordinate; // x and y position of each object with
                                      // respect to its cell origin

if (m_cellCoordinate[i][0] > m_AIindex->getCellSize())
    m_AIindex->move(m_cModelList[i], m_cell[i],
                    m_cellCoordinate[i], CAIindex::RIGHT);

 

Please refer to the prototype for full declaration of controller.

 

Collision resolution is where the deletion of objects happens. When two objects collide, one or both of them may explode and be removed from the object database. The object also needs to be removed from the corresponding cell. Since a pointer to the cell is passed to the collision manager as the return value of the getNeighbor function, the collision manager can invoke the remove function of the cell to achieve the desired effect.

 

Dynamic objects such as decals will be created during the simulation and are placed in the correct cells by calling the insert function of the AI index. The insert function takes the location of the object in world coordinates and translates it into a cell number using the cell size. The insert function of the cell is then called to perform the actual insertion.
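The world-coordinate-to-cell-number translation performed by the insert function might look like the following sketch. The cell size value, the grid origin, and the flattened numbering are assumptions for illustration; only the 21-wide grid dimension comes from the class above.

```cpp
#include <cassert>
#include <cmath>

// Assumed cell size in world units (the real value is m_cellSize).
const float CELL_SIZE = 50.0f;
const int   GRID_W    = 21;   // grid width from CAIindex::m_grid

// Translate a world coordinate into a flattened cell number,
// assuming the grid origin coincides with the world origin.
int CellNumberFor(float wx, float wz) {
    int col = (int)std::floor(wx / CELL_SIZE);
    int row = (int)std::floor(wz / CELL_SIZE);
    return row * GRID_W + col;
}
```

The cell's own insert function would then be called with the resulting number to perform the actual insertion.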

 

 

  1. Behavioral Patterns
    1. Overview
    2. In the following section, the behavior of all static and dynamic game entities is described. The requirements for each entity are split into a primary and secondary (possibly tertiary) priority scheme, where the secondary (tertiary) issues are features of the behavior we would like to have in the game (time permitting), but that are not necessary for the initial game-play experience. Each behavior list has an estimated time associated with it, guessing at the time necessary to implement the behavior/state machine that simulates this pattern. If alternate design decisions were considered, they are discussed in each subsection, along with the advantages and disadvantages that led to their rejection.

       

      We will try to design the complete system in a data-driven way, using configuration scripts that are parsed at runtime, either for initialization or possibly even for state machine behavior. In some cases, though, a hard-coded approach might be necessary due to time constraints. The configuration scripts will be stored in a subdirectory labeled ‘scripts’, so that the game engine has easy access to them and the file structure is less cluttered.

       

      To wrap up each subsection, a code sample is given, demonstrating how implementation of this part of the AI/game-logic can be achieved. The goal of the code samples is to define a narrow but complete (as far as that is possible at this time) public interface, so other team members can lay out their work based on this information. It is crucial to the project that these interfaces, should they be modified in any way, are updated in this document and that each team member has access to the newest version of this document at all times.

       

    3. Dynamic Entities
      1. Player Vehicle
      2. The player vehicle is the main dynamic entity in the world. The player controls it. The player can move the player vehicle forward or in reverse, and steer it right or left.

        1. General Behaviour and States
        2. Initially, when the game first starts, the player vehicle will be placed on a random location in the city. The vehicle will be placed on the right most lane, facing the proper direction. At any time of the game, depending on the play mode chosen, the player vehicle can be in three different states: 1) browsing state, 2) mission state, and 3) racing state.

          In single mission mode, when the player is simply driving around the city to accept missions, the player vehicle is in the browsing state. The game’s timer will allow the player vehicle to stay in the browsing state for a maximum of 15 seconds. In other words, the player must accept a mission within 15 seconds once he/she starts the game, or has completed another mission.

          In single time trial mode, the player vehicle must drive to the destination within a fixed time frame. During this racing state, the player vehicle

        3. General Physics
        4. The physics for these vehicles will not exactly mimic real-world physics. Rather, it will be simple. The physics will make use of the current velocity, current acceleration, current position, current direction, and vehicle type of the player vehicle, along with the road condition (i.e. friction factor) and the game’s timer, to determine how the player vehicle is operated on the road in the world. Collision detection will assume that the player vehicle is a sphere. There will be a 2D collision volume in front of the player vehicle for sensing collisions with other objects in the world. Upon collision with other objects in the world, the player vehicle will be damaged. Depending on the damage level, the player vehicle’s performance will be affected, and the effects will be reflected in the steering, speed, and noise produced by the vehicle’s engine.
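As a sketch of this simple physics, a one-dimensional Euler step using velocity, acceleration and a friction factor could look like the following; all names and constants are illustrative assumptions, not the final vehicle model.

```cpp
#include <cassert>
#include <cmath>

// Minimal vehicle state for the sketch; the real model also carries
// position as a vector, direction, vehicle type and damage level.
struct VehicleState {
    float pos;    // position along the current heading
    float vel;    // current velocity
    float accel;  // current acceleration (throttle/brake input)
};

// One timer tick: integrate acceleration, apply the road's friction
// factor, then advance the position.
void StepVehicle(VehicleState& v, float friction, float dt) {
    v.vel += v.accel * dt;            // throttle or brake
    v.vel *= (1.0f - friction * dt);  // road condition slows the car
    v.pos += v.vel * dt;              // move along the heading
}
```

Damage effects could then scale the acceleration and steering inputs before this step runs.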

        5. The Road
        6. There will be four lanes on the road, two on each side. Obstacle vehicles will drive in the centre of the lane at all times, unless struck by other vehicles (in which case they would have blown up anyway).

        7. Driving Rules
        8. The player doesn’t have to follow the traffic lights: running red lights and driving in the wrong direction are allowed. Basically, the player can attempt all kinds of wild things with the player vehicle. But doing illegal things on the road will alert police vehicles near the player vehicle, which will chase after it.

        9. Player Vehicle Class
        10. player.h

          struct Player {
              int level;
              MISSION mission;
              int missionTimeElapsed;
              int money;
          };

           

        11. CPlayerModel.h

        class CPlayerModel : public CModel {

        public:

        protected:

        Player m_player_info;

        };

         

      3. Police Vehicles
      4. The police vehicles are the pursuit vehicles in the world. They are controlled by the AI to chase after the player vehicle.

        1. General Behaviour and States
        2. Initially, police vehicles will be placed randomly in the world, so that they are distributed evenly. When first placed, they will always be in the rightmost lane on the road, facing the proper direction. At any time of the game, a police vehicle can be in two different states: 1) patrolling state, and 2) chasing state.

          If the player vehicle has not done anything illegal, or is in the browsing state, none of the police vehicles will chase after it. In other words, all police vehicles shall be in the patrolling state. In the patrolling state, police vehicles behave like obstacle vehicles (see the section on Obstacle Vehicles for details).

          However, if the player vehicle is in mission mode (i.e. taking on an illegal task, speeding, or destroying other objects in the city), it will be tagged by any police vehicle that can see it within its ‘line of sight’. In chasing mode, each of these police vehicles will increase speed to get to the last point at which the player vehicle was seen. During the chase, the police vehicle will not avoid other dynamic objects in the world. Obstacle vehicles will blow up if hit by a police car. In either state, every police vehicle in the world is driving around at all times.

           

          The following state-flow diagram outlines the different state transitions for a police vehicle.

        3. General Physics
        4. The physics for these vehicles is similar to that for obstacle vehicles. Again, the physics will make use of the current velocity, current acceleration, current position, current direction, and the police vehicle type, along with the road condition (i.e. friction factor) and the game’s timer, to determine how a police vehicle is operated on the road in the world. Collision detection will assume that a police vehicle is a sphere. There will be a 2D collision volume in front of a police vehicle for sensing collisions with other objects in the world. Collision detection will be turned off for a cop vehicle if it is out of sight of the player vehicle, so the collision detection will be a tunable parameter. If a cop vehicle hits another vehicle, it will blow up.

          A police vehicle cannot see objects that are behind another block in the city.

        5. Sound

        The only sounds for these vehicles are engine sounds and siren sounds. Engine sounds will reflect the current acceleration and velocity of these vehicles. Sirens will be played while these vehicles are chasing the player vehicle.

      5. Boss Vehicle
        1. Boss Vehicle Behavior

 

Note: (*) indicates a tunable parameter

The boss vehicle (in each level) is not present in the game level until the player has qualified for the "boss mission" by finishing all other missions on that level. At that point, the boss vehicle appears some distance (* - say a few blocks) away from the player, and the icon representing the boss vehicle blips into existence on the HUD map. Also, at this time all police vehicles disappear from the level. Somehow the boss has gotten wind that the player intends to take him down, so he starts driving away from the player immediately.

Boss vehicles run in two modes: loafing and flight.

Loafing: in this mode, the boss vehicle behaves just like a police vehicle on patrol (or perhaps like an obstacle vehicle), making random turn choices at each intersection and basically playing by the traffic rules (right side of the road, et cetera).

Flight: in this mode, the boss vehicle is driving away from the player’s vehicle. The algorithm involved should be very similar to that of a police vehicle in pursuit mode, except that instead of following the player while there is line of sight between the two, the boss vehicle should move away from the player. Some more complicated options are possible – the boss vehicle may try to evade the player by making turns whenever possible, for example. (All of this of course relies on the presence of a suitable graph that represents the possible paths leaving each intersection.)

In flight mode, the boss vehicle should still stay on the same side of the road, drive in straight lines, and make proper turns, in order to facilitate the transition back to loafing mode.

The boss vehicle starts in flight mode upon its appearance in the game world. Transition between the two modes is governed by the same rules as for police vehicles. When the distance between the player and boss vehicles is less than a certain threshold (* - sight radius), and the player has line of sight to the boss (this should be calculated by the controlling logic for the boss, not the player), then the boss vehicle is in flight mode. If line of sight is lost or the distance between vehicles exceeds the sight radius, then the boss goes back to loafing.
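The transition rule above can be sketched as a small decision function; the mode names follow the text, and the sight radius and line-of-sight test stand in for the tunable parameters marked with (*).

```cpp
#include <cassert>

// The two boss modes described in the text.
enum BossMode { LOAFING, FLIGHT };

// Flight when the player is within the sight radius AND the boss has
// line of sight to the player; loafing otherwise. The LOS test itself
// (blocked by city blocks) is computed elsewhere by the boss's logic.
BossMode UpdateBossMode(float distToPlayer, bool lineOfSight,
                        float sightRadius) {
    if (distToPlayer < sightRadius && lineOfSight)
        return FLIGHT;
    return LOAFING;
}
```

Calling this every tick from the boss controller reproduces the rule that losing line of sight or exceeding the sight radius drops the boss back to loafing.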

Another major point about boss vehicles is that they sustain damage differently than obstacle vehicles and police vehicles. They do not blow up immediately upon impact, nor do they escape damage entirely. Instead, boss vehicles have a low damage factor (perhaps similar to the player’s own but somewhat higher, so that the boss vehicle gets the worst of any collision between the two). A boss vehicle takes damage upon collision with other cars as well, allowing the player to inflict damage on the boss vehicle by forcing it into other cars. However, boss vehicles do not attempt to avoid collisions with obstacle cars.

Special sound events are triggered by boss vehicle collision, namely voice clips of the boss cursing.

When damage to the boss vehicle reaches 100%, the boss vehicle explodes spectacularly, leaving a smoking charred "hole" (decal) in the street and/or walls. Then the camera should yield to a congratulatory narration screen, followed by a new sequence starting with the boss character running away (on foot) from the scene of the mess.

        1. Boss Character Behavior

The boss character appears in the middle of the smoking mess and starts moving in a straight line down the street, away from the player’s vehicle. The boss is represented by a 2-frame billboarded animation, with a control point at the center and a cylindrical volume (diameter equal to the width of the billboard) for collision detection. The boss character runs frantically (the frames should animate quickly) but slowly, of course, compared to the player’s vehicle. Screaming or cursing voice clips should also be playing. All other vehicles disappear when the boss character emerges.

The boss character runs in a straight line. If it should happen to run into a wall at the opposite end of a 3-way intersection, it will yell in panic, and run back. If the player manages to intersect with its collision volume (run the boss over), the billboarded picture of the boss is then pasted on the ground and the camera should take a bird’s eye view in order to gloat over the scene.

 

The following state-flow diagram illustrates the state transitions for a boss vehicle.

      1. Other Vehicles
      2. The world also contains obstacle vehicles, which are there to populate and decorate the world. These vehicles drive around the world, following predetermined paths. Upon collision with other objects, they blow up into pieces.

        1. General Behaviour
        2. Initially, these vehicles will be assigned a predetermined set of instructions for driving in the world. They will not be very smart. However, they will try to avoid colliding with buildings, and other objects in the world. There will be an algorithm of path finding for determining how these vehicles must behave in order to reach their destinations.

        3. General Physics
        4. The physics for these vehicles will not exactly mimic real-world physics. Rather, it will be simple. The physics will make use of the current velocity, current acceleration, current position, current direction, and vehicle type of the vehicles, along with the road condition (i.e. friction factor) and the game’s timer, to determine how these vehicles are operated on the road in the world. Collision detection will assume that these vehicles are spheres. There will be a 2D collision volume in front of each vehicle for sensing collisions with other objects in the world. A vehicle will slow down when driving close to other vehicles.

        5. The Road
        6. There will be four lanes on the road, two on each side. Obstacle vehicles will drive in the centre of the lane at all times, unless struck by other vehicles (in which case they would have blown up anyway).

        7. Traffic Lights
        8. There will be traffic lights on the road. AI will control how these traffic lights will operate. There are four directions in consideration for the operation of the traffic lights: north, east, south, west. The north and south traffic lights will have the same color of lights on at all times, similarly for the east and west traffic lights.
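A sketch of the paired operation, with illustrative cycle durations (assumed, not specified by the document); the east/west pair is simplified to show only the complementary green/red relationship.

```cpp
#include <cassert>
#include <cmath>

enum LightColor { GREEN, YELLOW, RED };

// North and south always share a color; this returns it for a given
// game time. Cycle durations are illustrative assumptions.
LightColor NorthSouthColor(float t) {
    const float GREEN_T = 10.0f, YELLOW_T = 3.0f, RED_T = 13.0f;
    float phase = std::fmod(t, GREEN_T + YELLOW_T + RED_T);
    if (phase < GREEN_T) return GREEN;
    if (phase < GREEN_T + YELLOW_T) return YELLOW;
    return RED;
}

// East and west share the complementary color: green while
// north/south is red, red otherwise (yellow omitted for brevity).
LightColor EastWestColor(float t) {
    return NorthSouthColor(t) == RED ? GREEN : RED;
}
```

An AI controller would call these per tick to update the light models at each intersection.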

        9. Basic Maneuvering on the Road
        10. When reaching an intersection, these vehicles might either go straight, turn right, or turn left. Going straight is simple. The vehicles stay on the centre of the lane, and drive through the intersection if the traffic light is green or yellow. If the traffic light is red, an obstacle vehicle will slow down and stop in front of the stop line or behind the vehicle in front of it.

          When making a turn, the AI will assume that the centre of the car will be the point of focus for making the turn. The AI will make use of the radius of the curb of the road to determine how an obstacle vehicle will turn. Obstacle vehicles may only turn right on green light, and turn left on yellow light.

        11. Turn Right
        12. The curbs at intersections are shaped as quarter circles, which eases the turning of vehicles in the world. When turning right, the obstacle vehicle must be in the rightmost lane, facing the correct direction. Once the vehicle reaches the start of the curb, it begins its turn by following the curb on its right: the centre of the vehicle (its focus point) is kept at a fixed distance from the curb’s centre, namely the curb radius plus a fixed offset. Given that distance, we then calculate the direction the vehicle must face at each step of the turn. Once it has finished turning into the rightmost lane, the vehicle continues straight again.

        13. Turn Left
        14. Turning left is similar to turning right. When turning left, the obstacle vehicle must be in the leftmost lane, facing the correct direction. Instead of the curb on its right, the vehicle uses the radius of the curb on its left to determine the fixed distance. Again, we then calculate the direction the vehicle must face so that its focus point stays that fixed distance from the left curb’s centre. Once it has finished turning into the leftmost lane, it continues straight again.
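        The quarter-circle turning described in the last two sections can be sketched as follows, keeping the focus point at a fixed turn radius from the curb centre and facing the vehicle along the circle’s tangent (the names and the sign convention are illustrative, not taken from the engine):

```cpp
#include <cassert>
#include <cmath>

// Advance a vehicle one step along a quarter-circle turn. The focus point is
// kept at turnRadius (= curb radius + fixed lane offset) from the curb's
// centre; the angle advances by arclength / radius each tick.
struct TurnState {
    float centreX, centreY;  // centre of the curb's quarter circle
    float turnRadius;        // curb radius + fixed lane offset
    float angle;             // current angle on the circle (radians)
};

// 'sign' is +1 for counter-clockwise motion, -1 for clockwise. Outputs the
// new focus-point position and the heading (tangent) direction.
void stepTurn(TurnState& t, float speed, float dt, float sign,
              float& outX, float& outY, float& headX, float& headY) {
    t.angle += sign * (speed * dt) / t.turnRadius;  // arclength to angle
    outX = t.centreX + t.turnRadius * std::cos(t.angle);
    outY = t.centreY + t.turnRadius * std::sin(t.angle);
    headX = sign * -std::sin(t.angle);  // tangent of the circle at 'angle'
    headY = sign *  std::cos(t.angle);
}
```

        Right and left turns differ only in which curb supplies the circle centre and in the sign of the angular motion.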

        15. Sound

        The only sounds for these vehicles are engine sounds and horn sounds. Engine sounds will reflect the vehicle’s current acceleration and velocity. Horns may be played when a vehicle gets close to another dynamic object, such as the player’s car or other vehicles in the world.

      3. Mission Critical Entities

Basically, each mission critical entity has three mission critical states: 1) trigger new mission state, 2) mission progressing state, and 3) end of mission state. When the player vehicle drives through a mission trigger volume that belongs to one of those mission critical entities, that mission critical entity will switch its current state to the trigger new mission state.

 

 

 

    1. Static Entities
      1. World ‘Behavior’
      2. The world has a state corresponding to the current weather condition and a timer responsible for time-related state transitions and triggers. The timer is incremented every second.

         

        class CWorldModel : public CModel {

        public:
            ~CWorldModel();
            int m_weather;
            CTimer m_gameTime;

            static const int SUNNY;
            static const int RAINY;

            friend class CModel;

        protected:
            CWorldModel(int _type);
        };

         

      3. HUD
      4. The HUD contains a list of mission locations, a list of police car locations, the player's car location and the boss' car location. Mission locations are displayed in different colors according to difficulty level. Police car locations are displayed as blue squares, and the player's car location as a blinking dot; both are updated each frame. When a mission is activated, its destination is displayed and the original mission location is hidden. When the mission is completed, the destination is hidden as well and the mission is no longer displayed thereafter. When a police car is bribed, its color changes, reverting to the original color after some time interval. If the boss mission is activated, all police cars and missions are hidden, and the location of the boss' car is displayed and updated each frame. No mission or police car is ever deleted from the list.

         

        typedef std::list<CModel*> ModelList;

        class CHud : public CObject {

        public:
            ~CHud();
            void display();

        protected:
            CHud();
            ModelList m_missions;
            ModelList m_policeCars;
            CModel* m_playerCar;
            CModel* m_bossCar;
        };

         

        The display method gets the necessary information from the models and, based on their state, displays the appropriate color. For example,

        if (m_missions[i]->m_state == CMissions::ACTIVE) // display destination
            drawGreenDot(m_missions[i]->pos);

      5. Streetlights / Lampposts
      6. All lampposts appear the same; they differ only by location within the world. A lamppost is a static object that does not emit light; it is not considered a light source. All lampposts refer to the same art model and have the same collision volume: a circle with a tunable radius. Upon collision, a lamppost explodes, causing damage and leaving burn marks on the ground. All lampposts are created at the start of the game and are kept in a list; no new lampposts are created afterward. When a lamppost explodes, it is removed from the list.

         

        Declaration of lampposts is similar to the CDefaultModel in the prototype.

      7. (Burning) Trashcans
      8. The controllers of trashcans have two states: burning and not burning. A collision with the rectangular collision volume of a trashcan in the not-burning state causes it to start burning; fire particles are placed at the top of the trashcan to signify this. A burning trashcan eventually stops burning after some time interval and transitions from the burning state back to the not-burning state. A collision with a trashcan in the burning state causes it to explode, at which point both the trashcan and the particles are removed from the database.

         

        class CTrashcanModel : public CModel {

        public:
            ~CTrashcanModel();
            CTimer m_burningTimer;

            friend class CModel;

        protected:
            CTrashcanModel(int _type);
        };

      9. Traffic Lights
      10. The controllers of traffic lights have three different states: red, yellow and green. The transition of states occurs at a fixed time interval. States cycle from green to yellow to red and back to green; the amount of time between state transitions is tunable. Traffic lights at the same intersection are dependent on each other: a transition from yellow to red on one traffic light triggers a transition from red to green on the other. Traffic lights have a circular collision volume and will not deform or explode upon collision.

         

        class CTrafficLightModel : public CModel {

        public:
            ~CTrafficLightModel();
            float color[3];
            CTimer m_transitionTimer;
            CTrafficLightModel* m_sameDirection;
            CTrafficLightModel* m_oppositeDirection[2];

            friend class CModel;

        protected:
            CTrafficLightModel(int _type);
        };

        void CRedLightState::Update() {
            if (m_controller->m_model->m_transitionTimer.HadTickPassed()) {
                color[0] = 0.0; color[1] = 1.0; color[2] = 0.0; // turn green
                m_controller->m_state = m_controller->m_stateList[0];
                m_controller->m_model->m_sameDirection->setGreen();
                m_controller->m_model->m_oppositeDirection[0]->setRed();
                m_controller->m_model->m_oppositeDirection[1]->setRed();
            }
        }

      11. Trees
      12. Trees are represented by two intersecting planes. They have a circular collision volume, but no interaction occurs upon collision. Trees appear in blocks, with 5 to 10 trees per block.

         

        Declaration of trees is similar to the CDefaultModel in the prototype.

      13. Decals
      14. There are two different types of decal: burn marks and skid marks. Burn marks are created upon explosions in the world, and skid marks are created by hard braking and tight turns. Both types are created dynamically and inserted into the database. Each type has a different life span and goes through a number of stages; the transition between stages occurs at a fixed time interval. At each transition, the decal becomes more transparent, until it completely disappears and is removed from the database.

         

        class CDecalModel : public CModel {

        public:
            ~CDecalModel();
            int m_typeOfMark;
            CTimer m_lifeTimer;

            static const int SKID;
            static const int BURN;

            friend class CModel;

        protected:
            CDecalModel(int _type);
        };
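        The stage-by-stage fade described above could be computed as follows (a linear fade over an assumed number of stages; both the linearity and the stage count are illustrative, not fixed by the design):

```cpp
#include <cassert>

// Alpha value for a decal: at each life-stage transition the decal becomes
// more transparent, until it is fully gone and can be removed from the
// database. A linear fade with 'stagesTotal' stages is assumed here.
float decalAlpha(int stagesElapsed, int stagesTotal) {
    if (stagesElapsed >= stagesTotal) return 0.0f;  // remove from database
    return 1.0f - static_cast<float>(stagesElapsed) / stagesTotal;
}
```

        Skid and burn marks would simply use different stage counts to get their different life spans.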

         

      15. Missions

Missions are an important aspect of the game. They are scattered around the world, waiting to be activated. Each mission has a corresponding destination and a difficulty level. Missions have a collision volume stored in their type, and become activated when collided with by the player. An active mission becomes complete when the player reaches the destination, which is defined as colliding with the same collision volume centered on the destination coordinate. This is achieved by setting the mission's position to the destination coordinate upon activation. Completed missions do not change state and are not collidable.

 

class CMissions : public CModel {

public:
    ~CMissions();
    int m_state;
    float m_destination[3];
    int m_level;
    float m_timeLimit;

    static const int ACTIVE;
    static const int NOTACTIVE;
    static const int COMPLETED;

    friend class CModel;

protected:
    CMissions(int _type);
};

 

A mission manager controls all the missions in the world. Since only one mission can be active at any given time, the manager maintains a pointer to the active mission and a list of the other missions. The mission manager is responsible for changing the state of individual missions. The manager itself has a few states, including mission active, mission completed and boss mission active; the return values of its member functions depend on which state the manager is in. It has a collidable function, called by the collision manager, that returns a list of missions the player can potentially collide with. If there is an active mission, it is the only mission returned; if there is no active mission, all missions that are not yet completed are returned. When a mission is activated, m_missionTime is set to the mission’s m_timeLimit; the player loses the game if the mission is not completed before m_missionTime expires.

 

typedef std::list<CMissions*> MissionList;
typedef std::list<CState*> StateList;

class CMissionMgr : public CObject {

public:
    ~CMissionMgr();
    static CMissionMgr* instance();
    void setActive(CMissions* _mission);
    void complete();
    MissionList collidable();

protected:
    CMissionMgr();
    static CMissionMgr* m_instance;
    MissionList m_missions;
    CMissions* m_active;
    CTimer m_missionTime;
    CState* m_state;
    StateList m_stateList;
};

 

void CMissionMgr::complete() {
    m_state->complete();
}

void CMissionActiveState::complete() {
    m_manager->m_active->m_state = CMissions::COMPLETED;
    m_manager->m_state = m_stateList[1]; // assume 1 is the mission completed state
}

 

 

  1. Game-State Management (Game Flow) For Player Vehicle
    1. Overview
    2. The purpose of game-state management is to monitor the current status of the player in Drug Runner. In sum, possible game states are split into four categories: 1) game-related state events, 2) mission-related state events, 3) menu-related state events, and 4) game-mode-related state events. Furthermore, there are several global variables for storing the player’s current game-state information.

       

       

       

       

      gamestate.h

       

      enum GAMESTATE {
          GAME_PAUSED,
          GAME_RESUMED,
          GAME_OVER,
          GAME_PLAYING,
          GAME_EXIT,
          MISSION_ACCOMPLISHED,
          MISSION_FAILED,
          MISSION_IN_PROGRESS,
          MENU_INTRO,
          MENU_MAIN,
          MENU_OPTIONS,
          MENU_CONTROL,
          MENU_SOUND,
          MENU_DISPLAY,
          MENU_HELP,
          MENU_PLAYMODE,
          SCREEN_CREDITS,
          SCREEN_PLAY_AGAIN,
          GAMEMODE_SINGLE_RACING,
          GAMEMODE_MULTI_RACING,
          GAMEMODE_SINGLE_MISSION,
          GAMEMODE_MULTI_MISSION
      };

       

       

      Game-State Flow Chart

    3. Game-State Class

    Each state is an object that is attached to a controller for a particular model in the game. It has a function for updating itself to another state and a function for attaching itself to a controller. For example, in the previous figure, Game-State Flow Chart, MISSION_SEEK is a state object that is attached to the player vehicle’s controller. Depending on the actions performed by the player, MISSION_SEEK will update itself to GAME_PAUSED, MISSION_IN_PROGRESS, or SCREEN_PLAY_AGAIN. All of the states described in the Game-State Flow Chart will belong to the player vehicle controller’s m_stateList.

  2. AI / Path-finding
  3.  

    This section presents an algorithm for finding the shortest path between any two points A and B in a graph G=(V, E). This algorithm is needed to determine the best (shortest) path for the player to get from its present position to its target location (the mission accomplishment point), or for a police car to reach the player’s car as quickly as possible. The world is represented as a 2D grid (see section 2.1), which forms the graph G, and the grid cell intersection points make up the required vertices V. The algorithm also requires weights w between pairs of vertices, which are the distances between them and are fixed except between the last couple of pairs. Each edge in E is represented as a vertex pair with a weight w. There may be more than one shortest path with the same minimum value, in which case one of them is chosen at random.

     

    The algorithm we’ll use here is the so-called Dijkstra’s algorithm with relaxation (see Cormen et al., Introduction to Algorithms, 1997); it requires all weights to be non-negative, w >= 0, for each edge (u, v) of E, which is always true in our case. The algorithm maintains a set S of vertices whose final shortest-path weights from the source s have already been determined. The running time of the entire algorithm is O(V^2), but implementing the priority queue Q with a binary heap reduces this to O(E lg V); the heap version will be implemented at a later stage, once a simple first version like the one given below works well.

     

    For illustrative purposes, simple working code is given here (a slightly modified version of that in Data Structures and Algorithms in Java, 1998, by R. Lafore). We’ll either customize this code later to make it more general and/or efficient, or write new code along the same lines. Although the Java version of this program has been run, the C++ version has not been tested, so it may still contain bugs.

     

    #include <iostream>
    using std::cout;

    // an auxiliary class to hold distance and parent of a vertex
    class CPair
    {
    public:
        // items stored in the sPath array
        int distance;  // distance from start to this vertex
        int parent;    // current parent of this vertex

        CPair(int p = 0, int d = 0) { // constructor
            distance = d;
            parent = p;
        }
    };

    class CVertex
    {
    public:
        char label;    // label (e.g. 'A'), but not really needed for our game program
        bool isInTree;

        CVertex(char lab = '?') { // constructor
            label = lab;
            isInTree = false;
        }
    };

    class CGraph
    {
    private:
        static const int MAXVERTS = 100;
        static const int INF = 999999;   // any number > our grid distances will do
                                         // (named INF: INFINITY is a standard macro)
        CVertex vList[MAXVERTS];         // list of vertices
        int adjMat[MAXVERTS][MAXVERTS];  // adjacency matrix (an adjacency list would also work)
        int nVerts;                      // current number of vertices
        int nTree;                       // number of vertices in the tree
        CPair sPath[MAXVERTS];           // array for shortest-path data
        int currentVert;                 // current vertex
        int startToCurrent;              // distance to currentVert

    public:
        CGraph() { // constructor
            nVerts = 0;
            nTree = 0;
            for (int j = 0; j < MAXVERTS; j++)      // set adjacency matrix to infinity
                for (int k = 0; k < MAXVERTS; k++)
                    adjMat[j][k] = INF;
        }

        void addVertex(char lab) {
            vList[nVerts++] = CVertex(lab);
        }

        void addEdge(int start, int end, int weight) {
            adjMat[start][end] = weight; // (directed)
        }

        void path() { // find all shortest paths
            int startTree = 0;                // start at vertex 0
            vList[startTree].isInTree = true;
            nTree = 1;                        // put it in the tree

            // transfer the row of distances from adjMat to sPath
            for (int j = 0; j < nVerts; j++)
                sPath[j] = CPair(startTree, adjMat[startTree][j]);

            // until all vertices are in the tree
            while (nTree < nVerts) {
                int indexMin = getMin();      // get minimum from sPath
                int minDist = sPath[indexMin].distance;

                if (minDist == INF) {         // if all remaining are infinite or in tree
                    cout << "There are unreachable vertices\n";
                    break;                    // sPath is complete
                }
                else {                        // reset currentVert to the closest vertex;
                    currentVert = indexMin;   // the minimum distance from startTree
                    startToCurrent = sPath[indexMin].distance; // is to currentVert
                }
                vList[currentVert].isInTree = true; // put current vertex in the tree
                nTree++;
                adjustPath();                 // update the sPath[] array
            }

            displayPaths(); // display sPath[] contents; will be replaced with savePaths()

            nTree = 0;      // clear the tree
            for (int j = 0; j < nVerts; j++)
                vList[j].isInTree = false;
        }

        int getMin() { // get entry from sPath with minimum distance
            int minDist = INF;  // assume minimum
            int indexMin = 0;
            for (int j = 1; j < nVerts; j++) { // for each vertex not yet in the tree,
                if (!vList[j].isInTree &&      // if smaller than the old minimum
                    sPath[j].distance < minDist) {
                    minDist = sPath[j].distance;
                    indexMin = j;              // update minimum
                }
            }
            return indexMin; // return index of minimum
        }

        void adjustPath() { // adjust values in the shortest-path array sPath
            int column = 1; // skip the starting vertex
            while (column < nVerts) { // go across columns
                // if this column's vertex is already in the tree, skip it
                if (vList[column].isInTree) {
                    column++;
                    continue;
                }
                // calculate the distance for one sPath entry:
                // the edge from currentVert to column, plus the distance from start
                int currentToFringe = adjMat[currentVert][column];
                int startToFringe = startToCurrent + currentToFringe;
                // compare the distance from start with the current sPath entry
                if (startToFringe < sPath[column].distance) { // if shorter, update sPath
                    sPath[column].parent = currentVert;
                    sPath[column].distance = startToFringe;
                }
                column++;
            }
        }

        // NOT NEEDED: used only for testing purposes
        void displayPaths() {
            for (int j = 0; j < nVerts; j++) { // display contents of sPath[]
                cout << vList[j].label << "=";
                if (sPath[j].distance == INF)
                    cout << "inf";
                else
                    cout << sPath[j].distance;
                cout << "(" << vList[sPath[j].parent].label << ") ";
            }
            cout << "\n";
        }
    };

     

     

    It should be noted, though, that (a) the game may not need any path-finding algorithm at all, as the police vehicles will perform a straight-line pursuit, and (b) if path-finding is needed, we will probably modify the solution above to use a heuristic A* search instead of a single-source shortest-path algorithm such as Dijkstra’s.
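    The O(E lg V) variant mentioned earlier can be sketched with std::priority_queue serving as the binary heap over an adjacency list; the representation and function name below are illustrative, not part of the engine:

```cpp
#include <cassert>
#include <functional>
#include <queue>
#include <utility>
#include <vector>

typedef std::pair<int, int> IntPair; // (distance, vertex) in the heap,
                                     // (vertex, weight) in the adjacency list
const int INF = 999999;

// Single-source shortest paths with the priority queue Q implemented as a
// binary heap. Returns the distance from 'src' to every vertex (INF means
// unreachable).
std::vector<int> dijkstraHeap(const std::vector<std::vector<IntPair> >& adj,
                              int src) {
    std::vector<int> dist(adj.size(), INF);
    std::priority_queue<IntPair, std::vector<IntPair>,
                        std::greater<IntPair> > pq; // min-heap on distance
    dist[src] = 0;
    pq.push(IntPair(0, src));
    while (!pq.empty()) {
        IntPair top = pq.top(); pq.pop();
        int d = top.first, u = top.second;
        if (d > dist[u]) continue;           // stale heap entry, skip it
        for (size_t i = 0; i < adj[u].size(); i++) {
            int v = adj[u][i].first, w = adj[u][i].second;
            if (dist[u] + w < dist[v]) {     // relaxation step
                dist[v] = dist[u] + w;
                pq.push(IntPair(dist[v], v));
            }
        }
    }
    return dist;
}
```

    The stale-entry check replaces the decrease-key operation that a textbook binary heap would provide.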

     

     

  4. Collision Detection/Resolution
  5.  

    Figure 7.1: Collision detection scheme

     

     

    All methods needed to detect and resolve collisions will be embedded within a CollisionManager (CCollisionMgr) singleton object. On every game tick, the CollisionManager queries the spatial index for a list of models in the model database that need to be tested against each other (currently O(n^2) per cell; this can be optimized using OBB trees or other data structures). If a collision occurs between model 1 and model 2, the collision manager obtains a handle to the two associated controllers, as the models own these. If a model is collidable, the associated controller will implement the ICollisionListener interface, so that the collision manager can dispatch a CCollisionEvent to the controllers involved to resolve the collision. (In the prototype engine this is currently implemented in an unclean way, by allowing the CCollisionMgr to directly manipulate the CModel’s position and velocity, i.e. its state.) See section 8 for a more detailed description of vehicular physics and the underlying architecture for the game engine. Each model has associated with it a pointer to its bounding volume, which is used for collision detection. Bounding volumes will mainly consist of oriented bounding boxes (OBBs, for the vehicles), axis-aligned bounding boxes (AABBs, for static, box-shaped objects such as lampposts or walls) and bounding spheres. The algorithms needed to efficiently compute these intersection tests are openly accessible on the world wide web, specifically at www.magic-software.com and www.realtimerendering.com, from where we will draw all necessary resources to implement the needed intersection tests. The collision manager has access to the bounding volume of each model and implements efficient collision detection algorithms for all possible pairs of volume types.
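    As an illustration of the simplest of these tests: two bounding spheres intersect exactly when the distance between their centres is at most the sum of their radii, which can be checked without a square root by comparing squared quantities (a sketch only; the engine’s actual bounding-volume classes are not shown here):

```cpp
#include <cassert>

// Sphere-sphere intersection test, the cheapest of the listed bounding-volume
// tests. Comparing squared distance against the squared radius sum avoids
// the square root entirely.
bool spheresIntersect(float x1, float y1, float z1, float r1,
                      float x2, float y2, float z2, float r2) {
    float dx = x2 - x1, dy = y2 - y1, dz = z2 - z1;
    float rsum = r1 + r2;
    return dx * dx + dy * dy + dz * dz <= rsum * rsum;
}
```

    Sphere tests like this one are also useful as a cheap early-out before the more expensive OBB and AABB tests.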

     

  6. Vehicular Physics
  7.  

    Vehicle physics in Drug Runner will be based on a mix of real-world physical behavior, paired with the ability to tweak parameters to make the game fun. Currently, we are researching various car models from which we will select a suitable implementation, using Euler integration to simulate the timesteps. A good and simple car model can be found at http://home.planet.nl/~monstrous/tutcar.html, which includes all equations necessary to implement it. The model suffers from some mathematical singularities, which can be caught by properly checking the car’s attributes. Another good reference is http://www.marketgraph.nl/gallery/racer/, where the so-called ‘Pacejka magic formula’ is used to model tire slip. In any case, our model will definitely implement the notion of longitudinal and lateral forces to simulate slip. The initial model will be 2D only and have no suspension. Suspension might be added at a later date (see the section on scope cutbacks in the design document), but it is not on the primary feature list.

     

    There exists a CPhysics singleton class in the application scope which is primarily meant to store physics constants such as friction and gravity, but there will also exist static helper methods to do physics calculations if they are needed. Mostly the physics computations will reside inside dedicated controllers associated with the models that are in need of physics simulation (the cars).
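    As a rough illustration of the longitudinal/lateral split mentioned above, an Euler timestep can project the velocity onto the car’s heading and its normal, damping only the lateral part to model tyre grip. All names and constants below are illustrative tuning values, not taken from the referenced car models:

```cpp
#include <cassert>
#include <cmath>

// One Euler integration step for a 2D car. Velocity is decomposed into a
// longitudinal component (along the heading) and a lateral component (along
// the heading's normal); engine thrust acts on the former, tyre grip bleeds
// off the latter, producing controllable slip.
struct CarState {
    float vx, vy;        // world-space velocity
    float headX, headY;  // unit heading vector
};

void carStep(CarState& c, float engineForce, float mass,
             float lateralGrip, float dt) {
    float vLong = c.vx * c.headX + c.vy * c.headY;  // along the heading
    float nX = -c.headY, nY = c.headX;              // left normal of heading
    float vLat = c.vx * nX + c.vy * nY;             // sideways slip
    vLong += (engineForce / mass) * dt;             // thrust along heading
    vLat *= (1.0f - lateralGrip * dt);              // grip damps the slip
    c.vx = c.headX * vLong + nX * vLat;             // recombine into world space
    c.vy = c.headY * vLong + nY * vLat;
}
```

    Setting lateralGrip high makes the car track its heading tightly; lowering it lets the car slide, which is where the skid-mark decals would come in.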

     

  8. Sound
    1. Overview
    2. The sound API used in the game will be DirectSound.

    3. Background Music
    4. The normal background sounds of life going on in our game. Normal here means normal for the environment we have created within the game. This is ambient sound that is not ordinarily used to focus attention; it simply makes the gaming experience more lifelike.

    5. Predominant Sound
    6. At all times during the game, there should be only one predominant sound. That sound can be a single sound or a cacophony of other sounds (multiple explosions, screams, etc.); either serves to focus the player’s attention.

    7. Player Vehicle Sounds
    8. At all times (except when voices are playing), any sounds related to the physical behaviour of the player vehicle will be played. These sounds include engine noise produced during startup, acceleration and deceleration, and collisions with other dynamic objects.

    9. Sound Events
    10. All sound events are triggered by the dynamic objects in the world. Each event invoked by the action of one of these objects is put into a priority queue; different actions have different priorities.

      At each frame of the game, we retrieve sound events from the main event queue. First, we check whether a predominant sound is currently playing and what priority it has. Next, we take the first item off the sound event queue. If a sound is currently playing and the next sound has a higher priority, the current sound is stopped and the next sound is played; otherwise, the current sound is allowed to finish.
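      The pre-emption rule above can be sketched as a single predicate (illustrative names; a lower rank number means higher priority, matching the priority table in section 8.5.2):

```cpp
#include <cassert>

// Decide whether the next sound event should interrupt the current one.
// Rank 1 is the highest priority, so "higher priority" means a smaller rank.
struct SoundEvent {
    int rank;     // priority rank, 1 = highest
    int waveKey;  // key of the wave file to play
};

bool shouldInterrupt(bool somethingPlaying, int currentRank,
                     const SoundEvent& next) {
    if (!somethingPlaying) return true;  // nothing to interrupt, just play
    return next.rank < currentRank;      // strictly higher priority pre-empts
}
```

      Equal-priority events (e.g. two rank-2 voices) therefore never cut each other off; the current one plays to completion.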

      1. Events Relevant to Sound
      2. Event Identifier

        What to do?

        EVENT_INPUT_ACCEL

        Play ACCEL_LOW, ACCEL_MEDIUM, or ACCEL_HARD

        EVENT_INPUT_BRAKE

        Play BRAKE_LOW, BRAKE_MEDIUM , or

        BRAKE_HARD

        EVENT_INPUT_TURN_LEFT

         

        EVENT_INPUT_TURN_RIGHT

         

        EVENT_INPUT_PAUSE

        Stop playing all sounds

        EVENT_INPUT_RESUME

        N/A

        EVENT_COLLISION

        Play CRASH_LOW, CRASH_MEDIUM, or CRASH_HARD

        EVENT_COP_ALERT

        Play POLICE_SIREN

           
      3. Sound Event Priorities

      Priority Rank

      Sound Event

      1

      Mission announcement

      2

      Boss voice

      2

      Cop’s voice

      2

      Cop’s sirens

      3

      Player vehicle engine sound

       

    11. Recording and Storing Sounds
    12. Very few people can hear frequencies below about 16 Hz or above about 22 kHz (thousand cycles per second). The Nyquist criterion states that the sampling rate of a sound must be at least twice the frequency of the highest sound to be sampled; to capture sounds up to 22 kHz, for example, a sampling rate of at least 44 kHz is required (hence the common 44.1 kHz CD rate).

    13. Sampling Sounds
    14. All sounds (except for voices) will be taken from free-license sound samples, which we will edit using Sound Forge.

    15. DSManager Class
    16. Basically, this class initializes DirectSound in the background, setting it up for use by CSoundManager.

       

      DSManager.h

      #include "dsound.h"

      class DSManager
      {
      private:
          HWND hMainWnd;             // handle to the game window
          static LPDIRECTSOUND lpDS; // pointer to the DirectSound object

      public:
          DSManager();
          ~DSManager();

          /* Initializes DirectSound.
             Returns TRUE if successful, FALSE otherwise */
          BOOL Sound_Init(HWND gameWinHwnd);

          /* The following functions manipulate the quality of the sound being played */
          void SetVolumeMode(VOLUMEMODE volmode);
          VOLUMEMODE GetVolumeMode(void);
          void SetPlayMode(PLAYMODE playmode);
          PLAYMODE GetPlayMode(void);
          BOOL GetVolume(LONG* pVol);
          BOOL SetVolume(LONG vol);
          BOOL IncrementVolume(void);
          BOOL DecrementVolume(void);
          BOOL SetDefaultVolume(void);
          BOOL GetFrequency(DWORD* pFreq);
          BOOL SetFrequency(DWORD dwFreq);
          BOOL GetPan(LONG* pGain);
          BOOL SetPan(LONG lGain);
          HRESULT StartDSBPlay(void);
          HRESULT StopDSBPlay(void);
          BOOL IsPlaying(void);

          /* Reads in a wave file and stores it in a buffer */
          BOOL Load_WaveFile(LPSTR lpzFileName);

          /* Shuts down DirectSound */
          void Sound_Exit(void);
      };

       

    17. CSoundManager Class
    18. CSoundManager is the main interface through which the sounds invoked by the game events are to be played or stopped.

       

      CSoundManager.h

      class CSoundManager
      {
      private:
          DSManager dsManager;
          DSoundArray m_arrayDSnd;  // sound data array
          DWORD m_dwRefKey;         // reference key to sound data in the array

          LPDIRECTSOUND m_pIDS;
          LPDIRECTSOUNDBUFFER m_pPrimaryDSB;
          HWND m_hWndPlay;
          CEventManager* m_eventManager;
          int m_currentSound;       // current sound played
          int m_currentBackground;

      public:
          CSoundManager();
          ~CSoundManager();
          BOOL LoadWaveFile(LPSTR szFileName, DWORD* pKey);
          BOOL AddSound(CSoundData* pSnd, DWORD* pKey);
          BOOL StartPlay(DWORD dwKey);
          BOOL StopPlay(DWORD dwKey);
          BOOL RemoveSound(DWORD dwKey);

          /* The main function called by the game: it goes through the
             sound event queue and decides what sound will be played */
          void PlaySoundEvents();

          void StopAllSound(void);
          void RemoveAllSound(void);

          BOOL GetVolume(DWORD dwKey, LONG* pVol);
          BOOL SetVolume(DWORD dwKey, LONG vol);
          BOOL GetFrequency(DWORD dwKey, DWORD* pFreq);
          BOOL SetFrequency(DWORD dwKey, DWORD dwFreq);
          BOOL GetPan(DWORD dwKey, LONG* pGain);
          BOOL SetPan(DWORD dwKey, LONG lGain);
          BOOL GetVolumeMode(DWORD dwKey, VOLUMEMODE* pVMode);
          BOOL SetVolumeMode(DWORD dwKey, VOLUMEMODE VMode);
          BOOL GetPlayMode(DWORD dwKey, PLAYMODE* pPMode);
          BOOL SetPlayMode(DWORD dwKey, PLAYMODE PMode);
          BOOL GetDSBStatus(DWORD dwKey, DWORD* pStatus);
          BOOL IsPlaying(DWORD dwKey, BOOL* pPlay);
      };

       

    19. Header File For Wave Files Used

    waves.h

    // note: backslashes in C string literals must be escaped
    #define BRAKE_LOW "c:\\temp\\music\\brake_low.wav"
    #define BRAKE_MEDIUM "c:\\temp\\music\\brake_med.wav"
    #define BRAKE_HARD "c:\\temp\\music\\brake_hard.wav"
    #define ACCEL_LOW "c:\\temp\\music\\accel_low.wav"
    #define ACCEL_MEDIUM "c:\\temp\\music\\accel_med.wav"
    #define ACCEL_HARD "c:\\temp\\music\\accel_hard.wav"
    #define CRASH_LOW "c:\\temp\\music\\crash_low.wav"
    #define CRASH_MEDIUM "c:\\temp\\music\\crash_med.wav"
    #define CRASH_HARD "c:\\temp\\music\\crash_hard.wav"
    #define POLICE_SIREN "c:\\temp\\music\\siren.wav"
    #define MISSION_1 "c:\\temp\\music\\mission_1.wav"
    #define MISSION_2 "c:\\temp\\music\\mission_2.wav"
    …etc.

     

  9. Input
    1. Overview
    2. Player input, such as pressing keyboard keys, clicking mouse buttons, or moving the mouse, generates events. We will only be covering keyboard and mouse input. DirectInput is the interface through which input events will be captured and handled.

    3. Player Input Events
    4. Input events are related to the actions performed by the player.

      Event Name

      Event Used by

      EVENT_INPUT_ACCEL

      CSoundManager, AI

      EVENT_INPUT_BRAKE

      CSoundManager, AI

      EVENT_INPUT_TURN_LEFT

      CSoundManager, AI

      EVENT_INPUT_TURN_RIGHT

      CSoundManager, AI

      EVENT_INPUT_PAUSE

      CSoundManager,AI,Graphics

      EVENT_INPUT_RESUME

      CSoundManager,AI,Graphics

      EVENT_INPUT_EXIT

      AI,Graphics

      EVENT_INPUT_NEXT_SCREEN

      AI,Graphics

      EVENT_INPUT_PREV_SCREEN

      AI,Graphics

    5. Transforming Input into Data
    6. The player can navigate throughout the game using either the mouse and/or the keyboard.

      By default, the following is the mapping of keys to the input events described in the previous section:

      Event Identifier           Key Mapping / Mouse Click

      EVENT_INPUT_ACCEL          DIK_UP (up arrow)
      EVENT_INPUT_BRAKE          DIK_DOWN (down arrow)
      EVENT_INPUT_TURN_LEFT      DIK_LEFT (left arrow)
      EVENT_INPUT_TURN_RIGHT     DIK_RIGHT (right arrow)
      EVENT_INPUT_PAUSE          ‘p’ (key ‘p’)
      EVENT_INPUT_RESUME         ‘r’ (key ‘r’)
      EVENT_INPUT_EXIT           ‘q’ (key ‘q’)
      EVENT_INPUT_NEXT_SCREEN    left mouse click on the ‘next’ button
      EVENT_INPUT_PREV_SCREEN    left mouse click on the ‘prev’ button

       

      The player can change the keyboard mappings in the Options screen: all keyboard-related input mappings will be stored globally and can be changed at run-time.
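      As a sketch of how such a globally stored, run-time remappable table might look (all names here are illustrative, not taken from the codebase; the DIK_* values are written as plain integers so the sketch is self-contained):

```cpp
#include <map>

// Hypothetical input event IDs, mirroring the table above
enum InputEvent { EVENT_INPUT_ACCEL, EVENT_INPUT_BRAKE, EVENT_INPUT_PAUSE };

// Global event-to-key table; the Options screen rewrites entries at run-time
std::map<InputEvent, int> g_keyMap = {
    { EVENT_INPUT_ACCEL, 200 /* e.g. DIK_UP */ },
    { EVENT_INPUT_BRAKE, 208 /* e.g. DIK_DOWN */ },
    { EVENT_INPUT_PAUSE, 'p' },
};

// Remap an event to a new key, returning the key it replaced
int Remap(InputEvent ev, int newKey)
{
    int old = g_keyMap[ev];
    g_keyMap[ev] = newKey;
    return old;
}
```

      For example, Remap(EVENT_INPUT_PAUSE, ' ') would move pause onto the spacebar while leaving the other bindings untouched.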

    7. The CEventManager Class
    8. The CEventManager is responsible for handling all events associated with the player vehicle.

      CEventManager.h

       

      class CEventManager
      {
      private:
          // main list for storing events relevant to the player vehicle
          EventList m_eventList;

      public:
          CEventManager();
          ~CEventManager();

          // called by other event managers to retrieve relevant events
          EventList getEventList(EventManagerType);
      };

    9. CInput Class
    10. The CInput class is responsible for initializing DirectInput 7 for use by CInputManager.

       

      #include <dinput.h>

      class CInput
      {
      public:
          CInput();
          ~CInput();

          bool InitDirectInput(void); // inits DirectInput
          bool InitKeyboard(void);    // inits the keyboard device
          bool InitMouse(void);       // inits the mouse device
          bool Update(void);          // updates keyboard status
          bool KeyPressed(int);       // query whether a given key is pressed

      private:
          LPDIRECTINPUT7 m_DirectInputObject;               // DirectInput object
          LPDIRECTINPUTDEVICE7 m_DirectInputKeyboardDevice; // keyboard device
          char KeyBuffer[256];
      };

    11. Input Manager Class

    The role of the input event manager is to define input callback functions and to flush the events in the input queue each frame.

     

    CInputManager.h

     

    class CInputManager : public CEventManager
    {
    private:
        CEventManager* m_eventManager;
        CInput* m_cInput;

        void UpdateEventQueue();

    public:
        CInputManager();
        ~CInputManager();

        LRESULT CALLBACK KeyboardProc(int code, WPARAM wParam, LPARAM lParam);
        void MouseProc(); // needs investigation; handled differently from the keyboard
    };

     

     

  10. Camera Logic

    Drug Runner uses a basic gameplay camera located behind the player’s car. The camera has only one view, an overhead chase view: it is always located outside and behind the car. It can, however, take three positions: close-up, normal, and far. The position is tunable and can be selected on the fly with an extra button (a keyboard or mouse press that pops up an input dialog box), as well as at the start of the game in the front-end menu. The camera will simply snap into the new position (no smooth transition, for simplicity). The optimum initial values of the close-up, normal, and far distances will be determined after a few runs of the game prototype (since they are tunable in the configuration file, this will not be a problem at all).

     

    The camera will be aware of its environment in that it will avoid passing through objects, for example when the player’s car is going around a corner and an object obscures the camera’s view of the car, or when the car is reversing and there is an object in close proximity behind it. In such situations the camera’s position is changed by sliding it away from the obscuring object. The camera and the player’s car form two points on the circumference of a circle (or the surface of a sphere); if any object falls inside this circle (sphere) of interest and intersects the imaginary line from the car to the camera, then the camera slides along the circumference, changing its distance from the car. This move is horizontal when going around corners and vertical when the car is reversing. Once the obstruction clears, the camera resumes its original distance from the car.

     

     

    To create realistic scenes and a better game experience, the camera will vibrate during an explosion after a collision of the player’s car with an obstacle. The level of vibration will be proportional to the level of the explosion. The vibration effect is simply a small position change over a small amount of time in the vertical direction (for instance, a sinusoidal motion). The camera will also be able to rotate if the player’s car, for example, spins out.
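    A minimal sketch of such a sinusoidal vibration (the function name, the linear fade-out, and the parameters are illustrative assumptions, not part of the design):

```cpp
#include <cmath>

// Vertical camera offset for a vibration effect: a sinusoid whose
// amplitude fades linearly over the effect's duration.
float VibrationOffset(float amplitude, float frequencyHz,
                      float t, float duration)
{
    if (t >= duration) return 0.0f;   // effect over
    float fade = 1.0f - t / duration; // linear damping toward zero
    return amplitude * fade * std::sin(6.2831853f * frequencyHz * t);
}
```

    Each frame, the returned offset would simply be added to the camera’s vertical position; scaling the amplitude by the size of the explosion gives the proportional effect described above.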

     

     

  11. Rendering
    1. Use of Graphics API
    2. The renderer will use OpenGL.

       

    3. Animation and Textures
    4. All animation in the game will be restricted to texture animation only. The lack of geometric animation frees the renderer from having to keep track of and manipulate the "animatable" parts of complicated hierarchical models. Also, to avoid frequent texture swapping, all frames for an animation will be placed onto a single larger texture.

       

      Each animation will be stored in the type database as a material. The texture image itself will be loaded into a GL texture object at startup. Textures will thereafter be globally accessible through the use of a singleton texture manager CTextureMgr (the idea is described in detail on www.gamedev.net and omitted here). A material type stores the name of its texture object and the texture coordinates that the renderer will use. An animation type, which is a specialization of a material, will also store the number of frames in the animation. In addition, for each of its frames, the animation type contains a list of texture coordinates to be used for that frame, and the finish time of that frame. We calculate the active frame like so: given the current "world" time and the start time of the animation,

       

      elapsed = (current time – start time) mod (finish time of the last frame)
      while (elapsed > finish time of the current frame)
          advance frame mod (number of frames in animation)

       

      Note that the current time used above is a parameter available to the renderer and corresponds to the time of the last simulation step. The variable timestep between frames, implied by giving each frame its own finish time, avoids redundant frames in animations like traffic lights, where one frame may be active for much more time than another.
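      The frame lookup described above could be sketched as a free-standing function like this (illustrative names; in the design this logic would live inside CAnimation::CalculateCoords):

```cpp
#include <cmath>

// Find the active frame given the current "world" time and the start
// time of the animation. frameFinishTimes[i] is the finish time of frame
// i, measured from the start of the animation; the last entry is the
// animation's total length.
int ActiveFrame(double currentTime, double startTime,
                const double *frameFinishTimes, int numFrames)
{
    // wrap the elapsed time around the animation's total length
    double elapsed = std::fmod(currentTime - startTime,
                               frameFinishTimes[numFrames - 1]);
    int frame = 0;
    while (elapsed > frameFinishTimes[frame])
        frame = (frame + 1) % numFrames; // advance mod numFrames
    return frame;
}
```

      With frame finish times {1.0, 3.0, 4.0}, an elapsed time of 3.5 falls in the last frame, and an elapsed time of 5.0 wraps back into the first.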

       

       

      class CMaterial : public CObject {

      public:

      bool isAnimation;

       

      // load the texture object. we want to avoid calling this as much as possible.

      void Setup();

       

      vertex2 *coords; // Texture coordinates

       

      private:

      GLint name;

      };

       

      class CAnimation : public CMaterial {

      public:

      // this uses frame finish times with timeElapsed (the amount of time

      // elapsed since the last start of the animation) to find the right frame.

      // If it is a different frame, then this will grab the texture coordinates

      // for that frame and store them in coords.

      void CalculateCoords(double timeElapsed);

       

      private:

      int numFrames;

      vertex2 ** frameCoords;

      double * frameFinishTimes;

       

      // Avoid iterating through the entire list when calculating frame coordinates.

      int lastFrame;

      };

       

      This is how we would use the two classes:

       

      material->Setup();

       

      if (material->isAnimation) {
          CAnimation *anim = (CAnimation *) material;
          anim->CalculateCoords(animTimeElapsed);
      }

      ...Calls to glTexCoord(material->coords[x]) and glVertex(vertices[x]) for each face...

      Because we are designing our renderer as a highly delegated system in which each entity is responsible for drawing itself, we will end up swapping textures every time we draw another entity. We can get around this by sorting our entities before rendering (see the next section), so that we render all entities of the same type at once, and making only one call to CMaterial::Setup() before drawing all the objects for that type. (We could even remove the texture object name from the CMaterial and store it in the type instead.)

       

      To take full advantage of this, we must use the same texture for a model whether or not its animation is active. We can do this simply by requiring the first frame of every animation to be the texture that should appear when the animation is inactive (for instance, lights off on a police car siren animation). Then, when the animation is inactive, timeElapsed should be set to 0, so that the calculated texture coordinates are the ones that refer to the first frame.

       

      Here, a PrimGroup is a set of polygons that share a common texture (or have no texture):

       

      class CPrimGroup : public CObject {

      public:

      // run the snippet of code above, but exclude the call to CMaterial::Setup();
      // if material is null, then make glVertex calls only
      void Draw(double animElapsedTime);

       

      private:

      CMaterial *material; // if null, then this is to be cel-shaded

      vector3 * vertices;

      vector3 ** faces;

      };

       

      Sorting the entities by type is no help if we have any models in our system with multiple animations and/or multiple textures, and have to swap textures just to render a single model. We can deal with this by placing all of a model’s textures, and frames for all of the model’s animations, onto a single texture. There would be multiple CMaterial objects that stored the same GL texture object name, but they would have different texture coordinates to map to different parts of the compound texture.
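      As an illustration, the texture coordinates for one sub-rectangle of such a compound texture could be computed like this (a hypothetical helper, not one of the classes above):

```cpp
struct vertex2 { float u, v; };

// Compute normalized texture coordinates for a w-by-h pixel
// sub-rectangle at (x, y) inside an atlasW-by-atlasH compound texture.
void AtlasCoords(int atlasW, int atlasH,
                 int x, int y, int w, int h, vertex2 out[4])
{
    float u0 = (float)x / atlasW,       v0 = (float)y / atlasH;
    float u1 = (float)(x + w) / atlasW, v1 = (float)(y + h) / atlasH;
    out[0] = { u0, v0 }; // quad corners, counter-clockwise
    out[1] = { u1, v0 };
    out[2] = { u1, v1 };
    out[3] = { u0, v1 };
}
```

      Each CMaterial sharing the compound texture would simply store a different set of these coordinates while referring to the same GL texture object name.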

       

      Finally, rendering a complete model is done by CGeometry.

       

      class CGeometry : public CObject {

      public:

      // Calls Draw() on each CPrimGroup.
      void Draw();

       

      // one value per animation in the model.

      // set these before calling Draw().

      double * animTimeElapsedArray;

      // ..and bounding volume too.

       

      private:

      CPrimGroup * primgroups;

      };

       

      A last note on textures: we would like to use light maps, but the precomputation will be a lower priority.

    5. Access to World and Static Entities
    6. Access to these entities is controlled by the spatial index and gateway. The spatial index organizes models first by position (the cells that contain them, as described in section 2.2), and secondly by type, right down to the point where all models in the same bucket have pointers to the same CGeometry object.

       

      The gateway not only allows controllers to change model positions without worrying about updating the spatial index (moving the model from one cell to another as necessary), but can also help the renderer perform culling. Given the current camera position, orientation, and FOV/aspect ratio, the gateway can calculate its view frustum and return the set of cells that fall partially or completely within this frustum. (Culling will only be implemented if renderer performance becomes an issue).

       

      Given this set of cells, the renderer will look at the instances of each type of entity across all the cells (note: the cost of the union operation is a factor here and we will probably have to investigate various hashtable implementations), and draw them in that order, with world entities coming before static entities. In fact, the renderer will probably use the order of model type occurrences in the type database as the most convenient [non-hardcoded] guide for the order in which to render the corresponding models, so we must make sure that the ordering of model types in the type database is one that makes sense.

       

      Note that in our pipelines (both art and rendering), roads are implicitly represented by the position of the other world entities. So the first thing the renderer will draw should be a large plane (since road elevation will not be included in the game) textured with the road surface; after that, all the parks / buildings and surrounding sidewalks/grass will be drawn above; last, the static entities will be drawn.

       

    7. Access to Dynamic Entities
    8. The renderer’s access to and use of dynamic entities is exactly the same as for world and static entities. After drawing the dynamic entities, the renderer draws the HUD map and player information.

       

       

    9. Camera
    10. The CCamera class abstracts the camera so that the AI component can control it easily. Several camera classes exist in the literature. The interface given here was taken from Hill, F.S. (Computer Graphics, 2nd Ed., 2001) and modified to include the vibrations (see section 11). Since most of the method definitions are already given in the book and are quite lengthy, we do not include them here.

       

      #define CLOSEUP 3 // Globally tunable initial camera distances (units are in meters)
      #define NORMAL 10
      #define FAR 50

      class CCamera {
      private:
          Point3 eye;      // a point given with x,y,z coordinates
          Vector3 u, v, n; // orientation of the camera at the eye point
          double viewAngle, aspect, nearDist, farDist; // view volume shape
          Vibrate T;       // a vibration object with Tx, Ty, Tz components

          void setModelViewMatrix(); // tell OpenGL where the camera is located

      public:
          CCamera(); // default constructor
          void set(Point3 eye, Point3 look, Vector3 up); // similar to gluLookAt()
          void roll(float angle);  // camera rotation around n
          void pitch(float angle); // camera rotation around u
          void yaw(float angle);   // camera rotation around v

          // camera translation in the u, v, and n directions
          void slide(float delU, float delV, float delN);

          // set the viewing frustum
          void setShape(float ang, float asp, float nearD, float farD);

          void vibrate(float tx, float ty, float tz); // vibrate the camera
      };

       

       

    11. Special Effects/Particles
      1. Particle System

Particle systems will be used in Drug Runner to add a little visual flair to some game events. The following are candidate events:

        1. Particle system overview

In Drug Runner, the following overall attributes describe a particle system:

 

A particle is described in code as follows:

 

class PARTICLE

{

private:

PARTICLE_SYSTEM* Parent; // Complex class controls all particles

 

public:

VERTEX3D prev_location; // The Particle's Last Position

VERTEX3D location; // The Particle's Current Position

VECTOR3D velocity; // The Particle's Current Velocity

 

float color[4]; // The Particle's Color

float color_counter[4]; // The Color Counter

 

float alpha; // The Particle's Current Transparency

float alpha_counter; // Adds/Subtracts Transparency Over Time

 

float size; // The Particle's Current Size

float size_counter; // Adds/Subtracts Size Over Time

 

float age; // The Particle's Current Age (decay)

float dying_age; // The Age At Which The Particle DIES!

 

void Set_ParentSystem(PARTICLE_SYSTEM* parent);

void Create(PARTICLE_SYSTEM* parent, float time_counter);

bool Update(float time_counter);

 

PARTICLE();

~PARTICLE();

};

 

This particular system, obtained from nehe.gamedev.net, will morph a particle’s size, colour, and alpha transparency over time. This should prove sufficient for our uses, but we may choose to be more creative later on, firing around engine parts and implementing a more complex range of motion than the simple trajectory/gravity model. In that case, support for different texture maps and geometry can be implemented on top of this base.

 

 

The class that controls all the particles is given below. This class defines the particle system, or "emitter". As well as producing particles, the particle system actually exists as an entity in the world: it has a single controller and attributes such as world coordinates.

 

class PARTICLE_SYSTEM

{

private:

bool attracting; // Is The System Attracting Particles Towards Itself?

bool stopped; // Have The Particles Stopped Emitting?

 

unsigned int texture; // The Particle's Texture

 

unsigned int particles_per_sec; // Particles Emitted Per Second

unsigned int particles_numb_alive; // The Number Of Live Particles

float age; // The System's Current Age (In Seconds)

 

float last_update; // The Last Time The System Was Updated

 

float emission_residue;// Helps Emit Very Precise Amounts Of Particles

 

public:

PARTICLE particle[MAX_PARTICLES]; // All Of Our Particles

 

VERTEX3D prev_location; // The Last Known Location Of The System

VERTEX3D location; // The Current Known Position Of The System

VECTOR3D velocity; // The Current Known Velocity Of The System

 

float start_size; // The Starting Size Of The Particles

float size_counter; // Adds/Subtracts Particle Size Over Time

float end_size; // The Particle's End Size

 

float start_alpha; // The Starting Transparency Of The Particle

float alpha_counter; // Adds/Subtracts Particle's Transparency

float end_alpha; // The End Transparency

 

VECTOR3D start_color; // The Starting Color

VECTOR3D color_counter; // The Color That We Interpolate Over Time

VECTOR3D end_color; // The Ending Color

 

float speed; // The System's Speed

float speed_counter; // The System's Speed Counter

 

float life; // The System's Life (In Seconds)

float life_counter; // The System's Life Counter

 

float angle; // System's Angle

 

int spread_min; // Used For Random Positioning Around The Emitter

int spread_max;

float spread_factor; // Used To Divide Spread

 

VECTOR3D gravity; // Gravity For The X, Y, And Z Axis

float attraction_percent;

 

bool Update(float time, int flag, float num_to_create);

void Render(GLvoid);

unsigned int Active_Particles(void);

float Get_Location(int coordinate);

};

 

        1. Integration of Particle System into Architecture
        2. Unlike the regular world-object case, where there is a one-to-one correspondence between a world object and its AI controller, a common "emitter" controls all particles in the system. More than simply a controller, the emitter also has a location in the world, from which all particles emerge. This implies that, in the model-view-controller architecture, the controller for the emitter is also responsible for all of its particles. Furthermore, when the emitter is destroyed, so are all the particles.

           

          There can potentially be a great many particles being created and destroyed in rapid succession. To help speed this up, a memory pool will be used. Please refer to sec 11.3.1.4.

           

        3. Particle Behaviour
          1. Motion and decay
          2. The motion of the particles will initially be hardcoded into the system. They will follow a trajectory under gravity, where the only randomness occurs in the initial trajectory. This will be useful for sparks and explosions. Once completed to our satisfaction, the particle controller class will be subclassed to include a fire particle controller and a burning particle controller. The difference is in the equations of motion that a particle can take: the equations will be more complex, introducing randomness as the particles move, for a random drifting effect. Furthermore, the decay rates will differ, again with a random element, so that the particle fades as it travels out from the emitter.

          3. Time to Live

          Particles have a defined lifetime. The aging of a particle is called decay. The age of the particle continuously determines attributes such as its size, colour, and alpha blend factor. This will suffice for our first iteration of the particle system. Once it has been established that the particle system works well, we may introduce "decay frames" and nonlinear decay. Decay frames are points in the decay after which the texture and geometry of the particle are changed; these frames will be defined in a .def file. By adding randomness to the decay rate, the particle frames will not all change at the same time.
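          A sketch of one decay step, mirroring the counter fields of the PARTICLE class above (the function and its simple linear counters are illustrative assumptions):

```cpp
// One simulation step of a particle's decay: age advances, and the
// counters adjust size and alpha continuously until the dying age is
// reached, at which point the particle dies.
bool DecayStep(float &age, float dyingAge,
               float &size, float sizeCounter,
               float &alpha, float alphaCounter, float dt)
{
    age += dt;
    if (age >= dyingAge) return false; // the particle dies
    size  += sizeCounter * dt;         // grow or shrink over time
    alpha += alphaCounter * dt;        // fade in or out over time
    if (alpha < 0.0f) alpha = 0.0f;
    return true;
}
```

          Nonlinear decay or decay frames would replace the linear counter updates with a lookup keyed on the particle’s age.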

        4. Particle System Memory Pool

The Drug Runner particle system will use a memory pool to allocate and release memory for each particle object in order to increase efficiency. There will be many particle objects, and we believe we can increase the program’s speed by acquiring and freeing the memory for each new particle object from its own memory pool. To do this, we override the C++ new and delete operators. The following class, CPool, allows pool objects to be created, to perform allocation and deallocation operations, and to be destroyed. All the code given here is taken (with some modifications) from the book ‘Effective C++, 2nd Edition’.

 

class CPool
{
   public:
      CPool(size_t n);               // Create an allocator for objects of size n
      void* alloc(size_t n);         // Allocate enough memory for one object
      void  free(void *p, size_t n); // Return to the pool the memory pointed to by p
      ~CPool();                      // Deallocate all memory in the pool

   private:
      void *m_rawBlock;              // the contiguous block backing the pool
};

 

class CParticle

// the definition of this class is given just to show how new and delete are used

{

public:

... // usual CParticle functions

static void* operator new (size_t size);

static void operator delete (void *p, size_t size);

 

private:

static CPool memPool; // memory pool for Particles

static const int BLOCKS; // number of blocks (or number of particles)

//pointer to the head of the linked list that keeps free blocks

static CParticle *headOfFreeList; ... // usual CParticle private member variables

};

 

// these definitions go in an implementation file, not a header file

// create a new pool for Particle objects

CPool CParticle::memPool(sizeof(CParticle));

CParticle* CParticle::headOfFreeList; // implicitly initialized to null

const int CParticle::BLOCKS = 512;

 

CPool::CPool(size_t objectSize) // constructor of the CPool class
{
    // The free list is empty. Allocate one block of memory big enough
    // to hold BLOCKS CParticle objects.
    CParticle *newBlock = static_cast<CParticle *>(
        ::operator new(BLOCKS * objectSize));
    m_rawBlock = newBlock; // remember the block so the destructor can release it

    // form a free list by linking the memory chunks together
    for (int i = 0; i < BLOCKS - 1; ++i)
        newBlock[i].next = &newBlock[i + 1];

    // terminate the linked list with a null pointer
    newBlock[BLOCKS - 1].next = 0;

    headOfFreeList = newBlock;
}

 

CPool::~CPool() // destructor of the CPool class
{
    // The pool was allocated as one contiguous block, so it must be released
    // in one operation; deleting individual chunks off the free list would be
    // undefined behavior. m_rawBlock is the pointer saved by the constructor.
    ::operator delete(m_rawBlock);
}

 

void* CPool::alloc(size_t size)

// CHANGE LATER: void* to CParticle*, since that is always the case

{
    // send requests of the "wrong" size to ::operator new()
    if (size != sizeof(CParticle)) return ::operator new(size);

    CParticle *p = headOfFreeList; // p now points to the head of the free list
    // if p is valid, move the list head to the next element in the free list
    if (p) headOfFreeList = p->next;

    return p;
}

 

// CPool::free is passed a memory chunk which, if it is the right size, is just
// added to the front of the list of free chunks
void CPool::free(void *objectToRemove, size_t size)
{
    if (objectToRemove == 0) return;

 

    if (size != sizeof(CParticle)) {
        ::operator delete (objectToRemove);
        return;
    }

 

    CParticle *carcass = static_cast<CParticle*>( objectToRemove);

 

     carcass->next = headOfFreeList;
     headOfFreeList = carcass;
}

 

inline void * CParticle::operator new(size_t size)

// doesn’t have to be inline

{

return memPool.alloc(size);

}

inline void CParticle::operator delete(void *p, size_t size)

// doesn’t have to be inline

{

memPool.free(p, size);

}

 

 

    12. Use of Cel-Shading
      1. 1D Texture Dynamic Cel-Shading

All dynamic entities will be dynamically cel-shaded using a technique described in more detail on nehe.gamedev.net. The method uses tricks of the hardware (and the OpenGL API) to simulate thick-line rendering by drawing a back-facing wireframe of the model on top of the filled model. The exact procedure for the line drawing goes something like:

 

glEnable(GL_BLEND); // Enable Blending

// Set The Blend Mode

glBlendFunc(GL_SRC_ALPHA ,GL_ONE_MINUS_SRC_ALPHA);

glPolygonMode(GL_BACK, GL_LINE); // Draw Backfacing Polygons As Wireframes

glLineWidth(outlineWidth); // Set The Line Width

glCullFace(GL_FRONT); // Don't Draw Any Front-Facing Polygons

glDepthFunc(GL_LEQUAL); // Change The Depth Mode

glColor3fv(&outlineColor[0]); // Set The Outline Color

 

glBegin(GL_TRIANGLES); // Tell OpenGL What We Want To Draw

for (i = 0; i < polyNum; i++) // Loop Through Each Polygon

{

for (j = 0; j < 3; j++) // Loop Through Each Vertex

{

// Send The Vertex Position

glVertex3fv(&polyData[i].Verts[j].Pos.X);

}

}

glEnd();

 

glDepthFunc(GL_LESS); // Reset The Depth-Testing Mode

glCullFace(GL_BACK); // Reset The Face To Be Culled

glPolygonMode(GL_BACK, GL_FILL); // Reset Back-Facing Polygon Drawing Mode

glDisable(GL_BLEND); // Disable Blending

 

 

Additionally (and possibly even more importantly), all dynamic models will be shaded using a 1D lightmap texture and a directional light (sun). The 1D shade ramp is stored in a simple .txt file for access by the texture manager. Drawing the shaded polygons is managed by OpenGL in the following way (initialization of the texture omitted for clarity):

 

 

glTranslatef(0.0f, 0.0f, -2.0f); // Move 2 Units Away From The Screen

glRotatef(modelAngle, 0.0f, 1.0f, 0.0f);// Rotate The Model On Its Y-Axis

glGetFloatv(GL_MODELVIEW_MATRIX, TmpMatrix.Data);// Get The Generated Matrix

glEnable(GL_TEXTURE_1D); // Enable 1D Texturing

glBindTexture(GL_TEXTURE_1D, shaderTexture[0]); // Bind Our Texture

glColor3f(1.0f, 1.0f, 1.0f); // Set The Color Of The Model

 

glBegin(GL_TRIANGLES); // Tell OpenGL That We're Drawing Triangles

for (i = 0; i < polyNum; i++) // Loop Through Each Polygon

{

for (j = 0; j < 3; j++) // Loop Through Each Vertex

{

// Fill Up The TmpNormal Structure With The

// Current Vertices' Normal Values

TmpNormal.X = polyData[i].Verts[j].Nor.X;

TmpNormal.Y = polyData[i].Verts[j].Nor.Y;

TmpNormal.Z = polyData[i].Verts[j].Nor.Z;

 

// Rotate This By The Matrix

RotateVector(TmpMatrix, TmpNormal, TmpVector);

Normalize(TmpVector); // Normalize The New Normal

 

// Calculate The Shade Value

TmpShade = DotProduct (TmpVector, lightAngle);

if (TmpShade < 0.0f)

TmpShade = 0.0f; // Clamp The Value to 0 If Negative

 

glTexCoord1f (TmpShade);// Set The Texture Co-ordinate As The

// Shade Value

// Send The Vertex Position

glVertex3fv (&polyData[i].Verts[j].Pos.X);

}

}

glEnd (); // Tell OpenGL To Finish Drawing

glDisable (GL_TEXTURE_1D); // Disable 1D Textures

 

 

The following two diagrams show different positions of the directional light source.

 

 

This section of the technical design is just meant to give a brief overview of the technology used. See the mentioned websites for much more detailed information.

 

  12. Art Requirements, Resources, Definition and Pipeline
    1. Required Art Assets

Even though the primary objective of this game is an exercise in gameplay, a certain amount of artwork is required to aid immersion and thus the overall entertainment value of the game. Objects in the "world" must, at bare minimum, be recognizable to the player as intended by the designers. The player should recognize a bank as a bank, a police car as a police car, a road as a road, etc. As an added bonus, it is our hope that objects will appear pleasing to the player’s eye as well.

The following is a list of world objects that artwork will be required for. A further breakdown of what is required of each of these assets can be found in the subsections.

 

      1. Vehicle Art Requirements

To render any sort of vehicle, the following is required:

 

Information stored in .obj file form:

 

Information stored in separate files:

 

All the textures required for one car will be stored in as few texture files as possible, probably a single uncompressed .tif file. This will allow for faster rendering, as the renderer will not have to flush the pipeline in order to switch textures.

      2. Static Object Art Requirements

For any building, trash can, grass, road, sky, mission object or road marking, the following art is required

 

Information stored in .obj file form:

 

Information stored in separate files:

 

As in the vehicle case, the textures required for each object will be put in as few files as possible. In fact, two unrelated textures may share the same .tif file (just with different u,v coordinates) in order to speed up the rendering process. It is important to note that these objects are considered "static" objects in the world: they will not change shape or colour. As such, they may also be loaded into display lists in the renderer for increased performance.

      3. Particle System Art Requirements

The particle system will be employed in three ways in Drug Runner: "burning" particles, "spark" particles, and "explosion" particles. Loosely, particles behave in the following way: each particle has a trajectory and a decay rate, and an animation associated with its decay rate. The animation will be displayed as two simple flat polygons arranged crosswise. It is important to note that both trajectory and decay have a random element associated with them. The art requirements are therefore:

      4. Animation Object Art Requirements

In Drug Runner, all animations will take the form of a texture that changes over time, placed on a 2D surface. There are no 3-dimensional animation objects. Some animations, such as the "boss character", may be billboarded, whereas others, such as traffic lights and police car "cherries", will be decaled onto a static object. In either case, the art requirements are the same; they are the following:

 

      5. Fonts
      6. In-game pause menus and the entire front end will require a set of fonts to communicate menu options and other messages to the player. Rendering (non-kerned) fonts to the screen using OpenGL’s orthographic projection matrix is no problem at all, but the fonts must still be acquired. We will gather information on this from the internet, specifically from the webpages listed on our own links page at gamedev.nealen.com. These pages contain a slew of good information and also some nice, freely available font textures.

         

      7. Miscellaneous Art Requirements

There are other textures needed by the system that are not as easily categorized. In all cases they are simple textures stored as uncompressed .tif files. The textures are the following:

 

    2. Definition Files

Each of the art requirements listed in section 12.1 and its subsections has a definition file (with a .def extension) associated with it. The header file for each world object will contain the path/filename of this .def file.

 

The characteristics of a .def file will be different depending upon the object it is specifying, but globally, the following will hold true:

Lines beginning with a "# " will be comment lines and will always be ignored

The first "valid" (unignored) line in a def file, defines the object type that is being defined. The possible objects are:

 

 
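The comment and first-valid-line rules above can be sketched as a small parsing helper. The function name is an assumption; the real parser may be structured differently:

```cpp
// Sketch of the global .def parsing rules: skip "#" comment lines and
// blank lines, and treat the first valid line as the object type.
#include <sstream>
#include <string>

// Returns the first non-comment, non-empty line of a .def stream,
// i.e. the object type being defined; "" if no valid line is found.
std::string ReadObjectType(std::istream& def)
{
    std::string line;
    while (std::getline(def, line))
    {
        if (line.empty() || line[0] == '#')
            continue;           // comment lines are always ignored
        return line;
    }
    return "";
}
```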

      1. World Representation File
      2. This file describes the landscape and the initial placement of all objects in the world.

         

        The most crucial aspect to understanding the world representation file is the concept of a city block. A city block (herein simply a "block") is a square area in the world. The block extends from the middle of one intersection to the middle of the intersection kitty-corner to it. All objects in the world are placed inside a block. The block is, in turn, located at some point in the city; the city is simply a grid of city blocks. These same city blocks are used in the spatial index.

         

        The world representation can be found in the file worldrep.def

         

        The following diagram describes the general outline of a block, and some of its distinguishing characteristics.

         

         

        fig. 12.2.1 – 1 World representation overview

         

         

        The following keywords are also found in the world representation file:

         

        world_dimensions = 10 5

        This specifies the number of blocks in the city in the x direction and the y direction respectively. Blocks are named by their block coordinate, which ranges from (0 -> nx-1, 0 -> ny-1) inclusive. In this example, the world is ten blocks across by five blocks deep.

         

        block_scale = 100m

        This specifies the size of each block in the world. All models will be scaled against the block size; this is the mechanism by which models eventually get scaled to real-world coordinates.

         

        vehicle_size

        Amount to scale vehicles relative to the block size.

         

        road_texture_file = roadtexture.tif

        sidewalk_texture_file = sidewalktexture.tif

        grass_texture_file = grasstexture.tif

        sky_texture_file = skytexture.tif

        These are the filenames of the world-relevant textures. These textures are common to all blocks in the system.

         

        block_definition_begin 0 0

        block specifications

        block_definition_end

        Between these two statements are all definitions related to the block whose coordinate follows the block_definition_begin statement; they identify which block the contained statements apply to, and exist for file-parsing purposes. The statements that specify road widths are described below. Please refer to the diagram for a better understanding of their meaning.

         

        road_width_x = 0.05

        The half-width of the road on both sides along the x-axis. This is relative to the block size.

         

        road_width_y = 0.05

        The half-width of the road on both sides along the y-axis. This number is relative to the block size.

         

        building_type = bank

        This specifies the type of building that will be placed in the center of the block. Possible buildings may include bank, apartment, courthouse, park and restaurant.

         

        building_size = 0.50

        All buildings will be roughly square. This number is how large the square gets relative to the size of the block.

         

        place_object type size location_x location_y orientation_x orientation_y orientation_z

        There will be many of these lines within a block definition. They place all the other objects in the world, such as cars, streetlights and lampposts.
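
As a sketch of how these keywords tie together, the following maps a block coordinate to world units. The kBlockScale constant mirrors the block_scale = 100m example above; the layout convention (block (0,0) at the grid origin) is an assumption:

```cpp
// Sketch of block-coordinate to world-coordinate conversion, assuming
// blocks are laid out on a regular grid scaled by block_scale.
struct BlockCoord { int bx, by; };     // (0 -> nx-1, 0 -> ny-1)
struct WorldPos   { float x, y; };

const float kBlockScale = 100.0f;      // "block_scale = 100m"

// Center of a block in world units; block (0,0) sits at the grid origin.
WorldPos BlockCenter(BlockCoord b)
{
    WorldPos p;
    p.x = (b.bx + 0.5f) * kBlockScale;
    p.y = (b.by + 0.5f) * kBlockScale;
    return p;
}
```

With the example world_dimensions = 10 5, block (9, 4) is the far corner block, centered at (950, 450) in world units.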

      3. Object Representation Files
      4. All world object types that are neither special effects nor composite objects are represented the same way in the art pipeline. Buildings, trash cans and streetlights will be defined by a .def file. Vehicles are represented the same way in the artwork whether they are player vehicles, police vehicles or pedestrian vehicles; only their logic distinguishes them in the game.

         

        There will be several vehicle representation files, however, depending on the type of vehicle being described. The filenames are as such: policevehicle.def, playervehicle.def, ped_car.def, ped_bus.def, ped_moped.def.

         

        Buildings will be defined by filenames such as bank.def, apartment.def, park.def and courthouse.def.

         

         

        vehicle, building, streetlight, trashcan

        This line indicates to the parser that a vehicle, streetlight, etc. is about to be loaded. The game should already be aware of what type of vehicle (i.e. police, pedestrian, etc.) or building (apartment, bank, etc.) is being loaded.

         

        geometry = playervehicle.obj

        This is the filename of the geometry of the vehicle. The file will contain information about face and vertex normals, and textures as well. The filenames of any textures are located in the object file. Please review the discussion of the .obj file format that follows.

         

        scaling_factor = .01

        The vertices listed in the geometry file must all be scaled to lie between 0 and 1 so that they may be handled appropriately. Since the actual scale of the vertices is set by the artist and can’t be relied upon (especially if the "artist" is a random art website somewhere), it makes sense to find a factor that works.

         

        cel_gradients = 0.0 0.0 0.0 0.0 0.1 0.1 0.5 0.5 0.6 0.7 0.7 0.9 1.0

        There may be an arbitrary number of coefficients on this line. The cel_gradients specify how the cel-shading on this object will behave. It is important to note that not all objects will have a cel_gradient line; for instance, buildings will not be cel-shaded, so they will not have this line.
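
A minimal sketch of how the cel_gradients table might be applied at render time. The lookup scheme here (quantizing a diffuse intensity through the table) is an assumption, not the prototype's exact method:

```cpp
// Quantize a raw diffuse intensity in [0,1] through the cel_gradients
// table, producing the stepped shading values of cel-shading.
#include <vector>

float CelShade(float intensity, const std::vector<float>& gradients)
{
    if (gradients.empty()) return intensity;   // object is not cel-shaded
    if (intensity < 0.0f) intensity = 0.0f;    // clamp to [0,1]
    if (intensity > 1.0f) intensity = 1.0f;
    int idx = (int)(intensity * (gradients.size() - 1));
    return gradients[idx];
}
```

Because nearby intensities map to the same table entry, the shading falls into flat bands instead of a smooth ramp.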

         

        bounding_rectangle = (-1.0 -1.0 -1.0) (1.0 -1.0 -1.0) (-1.0 1.0 -1.0) (1.0 1.0 -1.0) (-1.0 -1.0 1.0) (1.0 -1.0 1.0) (-1.0 1.0 1.0) (1.0 1.0 1.0)

        This indicates a bounding volume about the object, for collision detection. Other bounding volumes may include:

         

        bounding_sphere = (.1 .1 .1) 0.9

        It is important to note that all bounding volumes are relative to the vertices after scaling_factor has been applied to them.

         

      5. Vehicle representation files

This file is broken down exactly like the world object representation file, but with the following additional parameters, which should be self-explanatory. Non-player vehicles may contain all this information, but not all of it may be used.

 

      1. Animated Object Representation files
      2. An animation is composed of: a surface to apply the animation texture to, a single texture file containing all the frames of the animation, the (u,v) coordinates of each frame in the sequence, the number of milliseconds that each frame is to be displayed for, and the scaling factor of the animation.

        The breakdown of the file is as follows:

        animation

        This first line of the file tells the parser that animation commands follow.

        geometry

        This is the filename of the .obj file that defines the geometry that the animation will be applied to.

        animation_texture = myanimation.tif

        An uncompressed .tif file that contains every frame of the animation sequence.

        numframes 3

        This defines the number of frames that the animation has.

         

        framedef 0 0 0 750

        framedef 1 10 0 250

        framedef 2 0 10 750

        The definition of each frame in the animation. There are numframes framedef lines, with indices (first number) ranging from 0 to numframes - 1. The first number defines the frame index, the next two numbers are the (u,v) coordinates of the frame in the texture, and the last number is the number of milliseconds that the frame is displayed before switching to the next frame.
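
The framedef timing described above could be driven like this at runtime (hypothetical helper names; assumes the animation loops):

```cpp
// Sketch of advancing an animation from its framedef lines: accumulate
// elapsed milliseconds and advance (wrapping) when the current frame's
// display time is exceeded.
#include <vector>

struct FrameDef { int u, v, millis; };  // one entry per framedef line

struct CAnimation
{
    std::vector<FrameDef> frames;
    int current = 0;                    // current frame index
    int elapsed = 0;                    // ms spent on the current frame

    void Update(int deltaMillis)
    {
        elapsed += deltaMillis;
        while (elapsed >= frames[current].millis)
        {
            elapsed -= frames[current].millis;
            current = (current + 1) % (int)frames.size();  // loop
        }
    }
};
```

With the three example framedef lines (750, 250, 750 ms), an Update(800) call would land on frame 1 with 50 ms already elapsed on it.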

      3. Particle System Definition files
      4. Particle systems may be defined with filenames such as sparks.def and explosions.def.

         

        Like an animation, a particle in a particle system is composed of: surfaces to apply a texture to, a texture file containing all the "decay frames" of the ‘animation’, their corresponding (u,v) texture coordinates, the number of "decay ticks" that each decay frame is to be displayed for before moving on to the next frame, and finally the scaling factor of the geometry.

        Certain other factors of particles will be hardcoded into the system, in the interest of overall simplicity. These factors are the equation of the particle trajectory (including random elements) and the equation by which the "decay ticks" elapse (again, including random elements).

        The breakdown of the particle definition file is as follows:

        Particle_system

        This first line of the file tells the parser that particle system commands follow.

        numframes 3

        This defines the number of decay frames that this particular particle system has.

        geometry

        This is the filename of the .obj file that defines the geometry of all decay frames that the particles can take. The decay frames are numbered from 0 to numframes - 1.

        In the .obj file format, different surfaces can have different materials set for them. Sets of polygons can be made of the same material. This way, all the polygons that are set "material 0" can be considered to be of decay frame 0. Therefore, when on decay frame 0, we only render those faces tagged material 0. It may also be possible to use "element grouping" in the .obj format to accomplish the same task.
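
The material-group selection can be sketched as follows. The types are illustrative; the real renderer works directly on .obj face data:

```cpp
// Sketch of the material-group idea: faces carry the material index they
// were tagged with in the .obj file, and only faces whose material
// matches the current decay frame are selected for rendering.
#include <vector>

struct Face { int v0, v1, v2; int material; };

// Collect the faces belonging to one decay frame.
std::vector<Face> FacesForDecayFrame(const std::vector<Face>& all, int frame)
{
    std::vector<Face> out;
    for (const Face& f : all)
        if (f.material == frame)
            out.push_back(f);
    return out;
}
```

In practice the faces would be bucketed once at load time rather than filtered every frame; the filter is shown here only to make the selection rule explicit.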

         

        animation_texture = myanimation.tif

        This is an uncompressed .tif file that contains every "animation" frame of the decay sequence. Often this will be just a single frame, and the image will change by morphing the geometry, stretching the texture to fit. By allowing several animation frames in the decay sequence, we increase the flexibility of the particle system. An important fact of our implementation is that a decay frame can only refer to a single (u,v) coordinate in the animation texture; therefore, all surfaces on the particle will have the same texture. This has been done for simplicity’s sake.

         

        framedef 0 0 0 750

        framedef 1 10 0 250

        framedef 2 0 10 750

        The definition of each decay frame in the animation. The first number defines the frame index, the next two numbers are the (u,v) coordinates of the frame in the animation_texture, and the last number is the number of "decay ticks" that the decay frame is displayed before switching to the next frame. The tick counts are additive. It is possible to skip a frame if the decay somehow gets too fast.

         

      5. Directory structure of art assets

The following directory structure will be used.

art_assets

    sound_assets

    texture_assets

    geometry_assets

    definitions

 

art_assets

The place where all art assets go (duh!). This is divided into the following subdirectories:

sound_assets

All the .wav files for sound events will go here.

 

texture_assets

All .tif files, including those for animation.

 

geometry_assets

All .obj files that define the geometry. If the .obj files refer to other textures inside, their path is assumed to be texture_assets.

definitions

All the .def definition files go here.

    1. Art pipeline. Introducing new art, modifying existing
    2. To introduce new art into the game, the following steps must be accomplished. When modifying existing art, the same sequence must be used, though the files produced need not be made from scratch; this has already been done. It is assumed here that the code supports the new model / model type being created, and that there is a class appropriate for this model in the model database.

      1. Creating new models.
      2. This can be done by using a free low-poly tool such as Milkshape 3D, a commercial application such as Maya or 3D Studio Max, or even by grabbing an existing model from some other source. The important part to remember here is that the geometry is to be saved as .obj files and the textures as .tif files.

      3. Creating a model definition file

The file can be commented with a hash mark as the first line for clarity and future revisions. Use section 12.2 as a reference on how to create this file. The important points to note are:

      1. Populating the directories.
      2. Make sure that the files are named clearly and uniquely and put in their proper places.

      3. Include new model in world representation
      4. This step will usually involve adding a place_object line into a block stanza in the world_rep.def file. It is wise to include the model in the same block as where the player starts in the game; that way, the artist can instantly see how the model looks without having to actually play the game.

      5. Adjusting model parameters.

The game will have to be restarted now. No re-compiling is necessary, however. The new model should be located in the block specified in the world representation. To tweak the model, one of two files may need to be modified:

      1. Populating the type database.

All art resources should be loaded into the type database on game initialization. This involves parsing each object definition file, loading the information into the type database, creating an instance of that object and inserting it into the world according to the specifications of the world_rep.def file.

 

The type database will contain the following raw information:

 

After the raw data of a world object is loaded from disk, a single instance of the object is created. This object will have all the correct pointers into the raw data, but will not actually be popped into the world; instead, it will exist permanently in the type database as a prototype.

 

Whenever a new object needs to be created, its private and public data are populated from the prototype using a copy() method. The new world object is then populated with state data, such as position and orientation, its controller is instantiated (if necessary), and finally it is popped into the world.
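
The prototype-and-copy() mechanism might look like this. The class names (CWorldObject, CTypeDatabase) are illustrative, not the prototype's actual classes:

```cpp
// Sketch of the prototype mechanism: the type database keeps one
// fully-loaded instance per object type, and new world objects are
// produced by copy() and then given their own state data.
#include <map>
#include <string>

struct CWorldObject
{
    std::string type;
    float x = 0.0f, y = 0.0f;   // state data, set after copying

    // Clone this prototype; shared raw data (geometry, textures) would
    // be carried over by pointer, while state data is set by the caller.
    CWorldObject copy() const { return *this; }
};

struct CTypeDatabase
{
    std::map<std::string, CWorldObject> prototypes;

    // Create a new world object of the given type at (x, y).
    CWorldObject Create(const std::string& type, float x, float y)
    {
        CWorldObject obj = prototypes.at(type).copy();
        obj.x = x;              // populate state data
        obj.y = y;
        return obj;             // caller pops it into the world
    }
};
```

Keeping one permanent prototype per type means the expensive parsing and loading happens exactly once, no matter how many instances are spawned.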

    1. External resources.
    2. There are a variety of tools available for model creation, such as Milkshape 3D, 3D Studio Max and Maya. Provided that the tool can save geometry data in .obj format and texture data in .tif format, you’re good to go.

       

      It is also within the scope of this game to use models already created and provided for public use on the internet. If these are used, it is important to note that they probably won’t be usable in the form found; they tend to be modeled with more polygons than is practical. The .obj file format can also specify complex shapes such as Bezier curves and cubic B-spline surfaces. These complexities must be removed from the original using the tool of your choice.

       

      Regardless of source, credit for the original model and the tool that it was created/modified in must be given. The appropriate spot for this is in the comments of that model’s .def file.

       

    3. The Wavefront .obj file format

The .obj file format is the geometry file format that this game will be using. Its original intent is to define the geometry and other properties of objects in Wavefront’s Advanced Visualizer. This format was chosen because it is a readable ASCII format, and a file loader for this format will graciously be provided for us.

 

Items that this format specifies that are of interest to us are:

 

The fine details of this file format can be found at the following website:

http://www.dcs.ed.ac.uk/home/mxr/gfx/3d/OBJ.spec
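
As a rough illustration of the subset we care about, a reader for plain "v" (vertex) and "f" (face) lines might look like this. Real .obj faces may use "v/vt/vn" index triples, which this sketch does not handle; it is not the loader being provided to us:

```cpp
// Minimal sketch of reading "v" vertex lines and "f" face lines
// (plain 1-based indices only) from a Wavefront .obj stream.
#include <sstream>
#include <string>
#include <vector>

struct Vec3 { float x, y, z; };
struct Tri  { int a, b, c; };       // 0-based after conversion

void ParseObj(std::istream& in, std::vector<Vec3>& verts, std::vector<Tri>& tris)
{
    std::string line;
    while (std::getline(in, line))
    {
        std::istringstream ls(line);
        std::string tag;
        ls >> tag;                  // "#" comments fall through harmlessly
        if (tag == "v")
        {
            Vec3 v;
            ls >> v.x >> v.y >> v.z;
            verts.push_back(v);
        }
        else if (tag == "f")
        {
            Tri t;
            ls >> t.a >> t.b >> t.c;
            --t.a; --t.b; --t.c;    // .obj indices start at 1
            tris.push_back(t);
        }
    }
}
```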

 

 

  1. The Prototype

As mentioned in the very first chapter, this technical design document comes with a prototype engine, which demos the feasibility of the game’s architecture. The prototype executable and source can be downloaded at gamedev.nealen.com on the ‘docs’ page. Note that a username/password combination must be entered to retrieve the file (source of prototype 2). I will email this information to Dave Forsey’s email account today (Friday, November 23, 2001).