3D Physics System



For my master's thesis I implemented a 3D physics engine based on an iterative model and rigidbodies. It extends my 2D physics implementation, but it also turned out to differ greatly from it due to the mathematical complexity of working in 3D.


  • Role: Programmer
  • Engine: Personal engine
  • Language/Tool: C++, OpenGL, GLSL
  • Dev Time: 4 months


  • Shapes
    • Primitives: sphere, box and plane
    • Convex hull
  • Body
    • Particle physics with linear dynamics 
    • Rigidbody physics with linear and angular dynamics
  • Integration
    • Euler integration
    • Verlet integration
  • Model
    • Iterative model with coherency
    • Impulse model for contacts
    • Force model for soft constraints
  • Collision detection
    • Among primitives
    • Among convex hulls
  • Collision resolutions among primitives
  • Broadphase: Bounding Volume Hierarchy (BVH)
  • Restitution and friction
  • Debug draws (texts, wireframes, etc)
  • Continuous collision detection (CCD)
  • Assimp integration for model import
  • Math infrastructure: Quaternion, Matrix, Transform, closest-point algorithms, intersection tests, geometry Features, Minkowski difference


  • The pipeline follows the pattern: update forces, update shapes, update contacts
  • Updates of the debug section and the UI are also part of the pipeline due to their frequent use
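The update order above can be sketched roughly as follows. All names here (PhysicsScene, applyForces, and so on) are illustrative placeholders, not the engine's actual API:

```cpp
#include <cassert>

// Minimal sketch of the per-frame update order described above.
struct PhysicsScene {
    int forceUpdates = 0, shapeUpdates = 0, contactUpdates = 0;

    void applyForces(float dt)    { ++forceUpdates; }   // integrate forces -> velocities
    void updateShapes(float dt)   { ++shapeUpdates; }   // integrate velocities -> transforms
    void updateContacts(float dt) { ++contactUpdates; } // generate + resolve contacts

    void step(float dt) {
        applyForces(dt);
        updateShapes(dt);
        updateContacts(dt);
        // debug draws and UI refresh would follow here each frame
    }
};
```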

Particle Physics


  • A particle is a good starting point for linear dynamics, as we don’t need to care about rotations
  • One application is fireworks


  • Different integration schemes were experimented with
  • Euler integration drives the green particle, while Verlet integration drives the other
  • Due to its symplectic nature, Verlet integration conserves the internal energy of the system it is applied to
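The two schemes can be contrasted on a 1D particle under constant acceleration; this is a minimal sketch with illustrative names, not the engine's actual integrator classes:

```cpp
#include <cassert>
#include <cmath>

// Semi-implicit (symplectic) Euler: update velocity first, then position.
struct EulerParticle {
    double x = 0.0, v = 0.0;
    void step(double a, double dt) {
        v += a * dt;   // velocity from acceleration
        x += v * dt;   // position from the *new* velocity
    }
};

// Position Verlet: no explicit velocity, positions carry the state.
struct VerletParticle {
    double x = 0.0, xPrev = 0.0;
    void step(double a, double dt) {
        double xNext = 2.0 * x - xPrev + a * dt * dt;
        xPrev = x;
        x = xNext;
    }
};
```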



  • The scene handles the update of rigidbodies, from dynamics and contact generation to resolution

Collision Zoo

  • The collision zoo provides collision tests between all primitives and convex hulls
  • I used it to set up and validate test cases before integrating them into the resolution pipeline

Force Registry

  • The registry is used for soft constraints, where we want to track forces across frames. This is conceptually different from an impulse model.

  • The anchored spring, for example, updates forces used in the integration step, where the frame duration is taken into account
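An anchored spring force generator boils down to Hooke's law; this is a hedged sketch with illustrative names (Vec3, anchoredSpringForce), not the project's actual registry types:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

Vec3 sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double length(const Vec3& v) { return std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z); }

// Force on a body tied to a fixed anchor by a spring (Hooke's law):
// F = -k * (|d| - restLength) * dHat, where d = position - anchor.
// A force registry would call this each frame before integration.
Vec3 anchoredSpringForce(const Vec3& position, const Vec3& anchor,
                         double k, double restLength) {
    Vec3 d = sub(position, anchor);
    double len = length(d);
    if (len < 1e-12) return {0.0, 0.0, 0.0};  // degenerate: body at the anchor
    double magnitude = -k * (len - restLength);
    return { d.x / len * magnitude, d.y / len * magnitude, d.z / len * magnitude };
}
```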

Contact Resolution

  • I resolved velocity both linearly and angularly based on the impulse model.
  • The pipeline: cache coherent resolution data, resolve penetration, and finally resolve velocity

  • Taking velocity resolution as an example, the steps are: compute the impulse (frictional or frictionless, depending on the materials) in local contact coordinates; transform the impulse into world coordinates; compute the linear and angular velocity changes due to that impulse; apply those changes
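The core of the frictionless case can be sketched as below. This is a simplified, linear-only version with illustrative names; the full resolver also folds the inverse inertia tensors into the denominator to account for the angular velocity change:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };
double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Impulse magnitude j along contact normal n (pointing from B to A).
// e is restitution; invMassA/invMassB are 1/mass of each body.
// Afterward one would apply: vA += j * invMassA * n; vB -= j * invMassB * n.
double contactImpulse(const Vec3& vA, const Vec3& vB, const Vec3& n,
                      double e, double invMassA, double invMassB) {
    Vec3 vRel { vA.x - vB.x, vA.y - vB.y, vA.z - vB.z };
    double sepVel = dot(vRel, n);   // closing speed along the normal
    if (sepVel > 0.0) return 0.0;   // already separating: nothing to do
    return -(1.0 + e) * sepVel / (invMassA + invMassB);
}
```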


  • Note that we cached some data for the resolution step above to avoid recomputing it every frame
  • I did the same for the generation step: I potentially generated more than one contact and kept them in a storage.
  • When generating contacts in a frame, I checked whether the contact was already in that storage. If so, I updated it; otherwise I added it to the storage.

Bounding Volume Hierarchy

  • BVH is a broadphase technique that filters out collision pairs we know cannot collide, saving us from computing contact data for them
  • The choice of bounding volume is up to you. I used bounding spheres to demonstrate dynamic updates of the hierarchy on the right.

To allow flexibility in the choice of bounding volume, BVH is a good use case for templates:

template <class T>
class BVHNode
{
public:
    BVHNode* m_children[2];
    BVHNode* m_parent;
    T m_volume;
    Entity3* m_body;    // if leaf this is not nullptr
};
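A core operation when a node's volume must grow to enclose its children is merging two bounding volumes. Here is a hedged sketch for bounding spheres; Vec3, BoundingSphere, and merge are illustrative names, not the engine's actual types:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };
struct BoundingSphere { Vec3 center; double radius; };

// Smallest sphere enclosing two spheres: used when a BVH parent node is
// (re)computed from its two children.
BoundingSphere merge(const BoundingSphere& a, const BoundingSphere& b) {
    Vec3 d { b.center.x - a.center.x, b.center.y - a.center.y, b.center.z - a.center.z };
    double dist = std::sqrt(d.x*d.x + d.y*d.y + d.z*d.z);
    // One sphere already contains the other.
    if (dist + b.radius <= a.radius) return a;
    if (dist + a.radius <= b.radius) return b;
    double radius = 0.5 * (dist + a.radius + b.radius);
    double t = (radius - a.radius) / dist;  // slide the center toward b
    return { { a.center.x + d.x * t, a.center.y + d.y * t, a.center.z + d.z * t },
             radius };
}
```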

Continuous Collision Detection

CCD solves the tunneling effect. I used the Minkowski difference to reduce the problem to a raycast in a particular direction determined by the velocities. The effect looks like this:

CCD sphere vs plane

To allow tunneling I made the sphere small. There are two spheres in the scene: a normal discrete one elevated to the middle of the screen, and a continuous one near the bottom. While moving, the spheres leave a purple trail along their paths. Note how the sphere near the bottom gets blocked by the plane, while the normal one tunnels through it every time.
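For the sphere-vs-plane case the continuous test reduces to solving for a time of impact along this frame's motion. A hedged sketch, with illustrative names and a plane given as dot(n, p) = d with unit normal n:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };
double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Sphere: center c, radius r, moving with velocity v over the frame [0, dt].
// Returns the time of impact in [0, dt], or a negative value if no hit occurs
// this frame. Solves dist(t) = dot(n, c + v*t) - d = r for t.
double sphereVsPlaneTOI(const Vec3& c, double r, const Vec3& v,
                        const Vec3& n, double d, double dt) {
    double dist  = dot(n, c) - d;  // signed distance of the center to the plane
    double speed = dot(n, v);      // closing speed (< 0 means approaching)
    if (dist > r && speed < 0.0) {
        double t = (r - dist) / speed;
        return (t <= dt) ? t : -1.0;  // hit only if it happens within this frame
    }
    return -1.0;
}
```

A discrete test would only check overlap at t = 0 and t = dt, which is exactly how a fast, small sphere tunnels through a thin plane.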

Convexity Problems


So far all shapes have been primitives. I wanted to extend the system to allow collisions among general convex shapes. This started with generating the convex shape itself, for which I used the Quickhull algorithm. The presentation by Dirk Gregorius from Valve Software provided invaluable information on this topic.

Essentially we need a definition of the so-called HalfEdge, which I defined as:

struct HalfEdge
{
	HalfEdge* m_prev;
	HalfEdge* m_next;
	HalfEdge* m_twin;
	Vector3 m_tail;
	QHFace* m_parentFace = nullptr;		// weak pointer to the owning face

	Mesh* m_body_mesh;			// debug meshes
	Mesh* m_arrow_mesh[3];
};

The important information is the prev, next and twin pointers: the prev/next pointers chain a face's half-edges into a circular list of 3 or 4, which defines the face, while the twin pointer links each half-edge to its counterpart on the neighboring face.

With the type defined, we walk through the faces of the hull, step through their half-edges, and record as the horizon those edges that separate the visible region of the hull from the invisible region, as seen from a point we call the eye. Visibility is determined with plane tests.
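The plane test and the horizon condition can be sketched as below. faceVisibleFrom and isHorizonEdge are illustrative names, not the project's actual functions; the assumption is that a half-edge is on the horizon exactly when its own face is visible from the eye but its twin's face is not:

```cpp
#include <cassert>

struct Vec3 { double x, y, z; };
double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Plane test: the eye sees a face if it lies on the positive side of the
// face's supporting plane.
bool faceVisibleFrom(const Vec3& faceNormal, const Vec3& pointOnFace, const Vec3& eye) {
    Vec3 toEye { eye.x - pointOnFace.x, eye.y - pointOnFace.y, eye.z - pointOnFace.z };
    return dot(faceNormal, toEye) > 0.0;
}

// A half-edge lies on the horizon when it borders one visible and one
// invisible face.
bool isHorizonEdge(bool faceVisible, bool twinFaceVisible) {
    return faceVisible && !twinFaceVisible;
}
```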

Once the set of halfedges for the horizon is confirmed, we can form new faces with each of them and the eye. The result looks like the following (with some debug draws to show directions of halfedges):

Gilbert-Johnson-Keerthi Algorithm

With convex hulls generated, we need a way to produce contact info. The GJK algorithm helps us do this: it tells us whether a collision happens and lays the foundation for obtaining the complete contact info.

Three concepts are important for GJK. First, the Minkowski Difference. The idea remains the same as in 2D: if the Minkowski shape contains the origin, then the two convex hulls collide.

Second, the Supporting Point. For convenience, I defined it in this case as the point with the largest extent in some direction, slightly different from the 2D case.

Third, the Simplex. As the algorithm gradually finds points approaching the origin, the Simplex is where we decide that approach direction. It is no more complex than a vector of points defining the simplex vertices:

std::vector<Vector3> simplex_verts;

For example, the convex shape above is a Minkowski shape. Point D is the supporting point of one iteration, and shape ADB is the simplex of that iteration. The video presented by Casey Muratori provides great insights into this algorithm.
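The supporting point on the Minkowski difference can be sketched as follows. The representation (hulls as plain vertex lists) and all names are simplifying assumptions, not the project's actual types:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3 { double x, y, z; };
double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Vertex of a hull farthest along direction d (hull assumed non-empty).
Vec3 farthestAlong(const std::vector<Vec3>& verts, const Vec3& d) {
    Vec3 best = verts.front();
    double bestDot = dot(best, d);
    for (const Vec3& v : verts) {
        double s = dot(v, d);
        if (s > bestDot) { bestDot = s; best = v; }
    }
    return best;
}

// Supporting point of the Minkowski difference A - B in direction d:
// farthest point of A along d minus farthest point of B along -d.
Vec3 support(const std::vector<Vec3>& A, const std::vector<Vec3>& B, const Vec3& d) {
    Vec3 pA = farthestAlong(A, d);
    Vec3 pB = farthestAlong(B, { -d.x, -d.y, -d.z });
    return { pA.x - pB.x, pA.y - pB.y, pA.z - pB.z };
}
```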

Altogether, this is what the process looks like:

But again, GJK only answers the yes-or-no question and lacks the contact details. We need EPA to finish up.

Expanding Polytope Algorithm

EPA is an algorithm to get the full contact info: the normal, the penetration depth and the point of contact. It starts from the simplex we ended up with in GJK and expands that polytope into a convex hull until a threshold is met. Along the way, it maintains a supporting point P in the direction of the closest point from the origin to the hull. When the algorithm stops, P gives us all the contact info.

One may notice that this is very similar to the Quickhull problem, as it requires deleting old faces and adding new ones. The good thing is that we know the horizon beforehand, given the simplex. So instead of a half-edge, EPA only requires an edge definition with a face reference count, telling us whether the edge is completely new or still part of the shape because some face retains it:

struct sEPAEdgeRef
{
	std::vector<Vector3> ends;	// the two endpoints of the edge
	int ref_count = 0;		// number of faces still referencing this edge
};
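One way such a reference count can be used, sketched with illustrative names: while EPA removes the faces visible from the new supporting point, every edge of a removed face is funneled through one function. An edge seen once is on the horizon and seeds a new face; an edge seen twice is interior and can be dropped.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3 { double x, y, z; };

struct EdgeRef {
    Vec3 a, b;
    int refCount;  // how many removed faces referenced this edge
};

bool sameVec(const Vec3& u, const Vec3& v) {
    return std::fabs(u.x - v.x) < 1e-9 &&
           std::fabs(u.y - v.y) < 1e-9 &&
           std::fabs(u.z - v.z) < 1e-9;
}

// Record one edge of a removed face, treating (a,b) and (b,a) as the same edge.
void addEdge(std::vector<EdgeRef>& edges, const Vec3& a, const Vec3& b) {
    for (EdgeRef& e : edges) {
        bool same = (sameVec(e.a, a) && sameVec(e.b, b)) ||
                    (sameVec(e.a, b) && sameVec(e.b, a));
        if (same) { e.refCount += 1; return; }  // shared by two faces: interior
    }
    edges.push_back({a, b, 1});  // seen once so far: potential horizon edge
}
```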

The article by Gino van den Bergen has a detailed look into EPA.

The process looks like this:

Note the contact info on the upper left when EPA finishes.

Some Math


The Feature in computational geometry distinguishes the primitives that make up a shape. It is important because in convex problems we now need to differentiate points from edges, edges from faces and faces from points.

Take Quickhull for example. The hierarchy of features looks like:

class QHFeature
{
public:
    eQHFeature m_id;
    Mesh* m_mesh = nullptr;
};

class QHVert : public QHFeature
{
public:
    Vector3 vert;
};

class QHEdge : public QHFeature
{
public:
    Vector3 v1;
    Vector3 v2;
};

class QHFace : public QHFeature
{
public:
    std::vector<Vector3> verts;         // 3 or 4 vertices of the face (triangle or quad)
    std::vector<QHVert*> conflicts;     // conflict list of the face, not face vertices
    HalfEdge* m_entry = nullptr;        // linked-list entry for half-edges
};

One application of this is the general closest-point algorithm. Previously, to compute the distance from a point to a triangle, we treated the triangle as a plane and computed the distance from the point to that plane. But for Quickhull, if the projection of the point lies outside the triangle, the closest distance is now formed with an edge or a vertex, not with the face/plane.

The way we encode this sense of "outside" is the so-called Voronoi Region.

Voronoi Region

The Voronoi Region of a point is defined as the Feature of the shape that the point projects onto.

  • In the image, we say the Voronoi Region of P is the edge AC
  • Not any of the vertices A, B or C, nor the face ABC
  • We code this with dot products plus barycentric coordinates
  • If the signed barycentric area of PAC is <= 0, we know P lies outside edge AC at the very least
  • This way, I can correctly return a QHEdge rather than a QHVert or QHFace
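The Voronoi-region walk above is essentially the classic closest-point-on-triangle test described by Ericson. A sketch, with minimal illustrative Vec3 helpers rather than the project's actual math types:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };
Vec3 operator-(const Vec3& a, const Vec3& b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
Vec3 operator+(const Vec3& a, const Vec3& b) { return {a.x+b.x, a.y+b.y, a.z+b.z}; }
Vec3 operator*(double s, const Vec3& v) { return {s*v.x, s*v.y, s*v.z}; }
double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Classifies p against the Voronoi regions of triangle abc and returns the
// closest point on the triangle (vertex, edge, or face region).
Vec3 closestPtPointTriangle(const Vec3& p, const Vec3& a, const Vec3& b, const Vec3& c) {
    Vec3 ab = b - a, ac = c - a, ap = p - a;
    double d1 = dot(ab, ap), d2 = dot(ac, ap);
    if (d1 <= 0.0 && d2 <= 0.0) return a;                      // vertex region A
    Vec3 bp = p - b;
    double d3 = dot(ab, bp), d4 = dot(ac, bp);
    if (d3 >= 0.0 && d4 <= d3) return b;                       // vertex region B
    double vc = d1 * d4 - d3 * d2;
    if (vc <= 0.0 && d1 >= 0.0 && d3 <= 0.0)                   // edge region AB
        return a + (d1 / (d1 - d3)) * ab;
    Vec3 cp = p - c;
    double d5 = dot(ab, cp), d6 = dot(ac, cp);
    if (d6 >= 0.0 && d5 <= d6) return c;                       // vertex region C
    double vb = d5 * d2 - d1 * d6;
    if (vb <= 0.0 && d2 >= 0.0 && d6 <= 0.0)                   // edge region AC
        return a + (d2 / (d2 - d6)) * ac;
    double va = d3 * d6 - d5 * d4;
    if (va <= 0.0 && (d4 - d3) >= 0.0 && (d5 - d6) >= 0.0)     // edge region BC
        return b + ((d4 - d3) / ((d4 - d3) + (d5 - d6))) * (c - b);
    double denom = 1.0 / (va + vb + vc);                       // face region
    return a + (vb * denom) * ab + (vc * denom) * ac;
}
```

Note how the sequence of sign tests is exactly a walk through the Voronoi regions: vertices first, then edges, with the face region as the fall-through.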

With the triangle problem solved, the tetrahedron case follows naturally: take the triangle feature with the shortest distance.

The book Real-Time Collision Detection by Christer Ericson is a treasure trove of the math behind the huge topic of collision detection, and it will benefit anyone interested in the topic, like me :).

Post Mortem

What went well

  • Improved a lot on the 3D math and physics infrastructure of my engine
  • Improved a lot on the debug render system, which benefits many projects
  • Developed a mindset of minimal assumptions between modules as the project grew large

What went wrong

  • Spent too much time validating math. It was a necessary process, but I could do it better if I did it again.
  • Was blocked by the renderer due to legacy code issues and had to clear those blocks first
  • Too many techniques, too little time. A physics engine is such an interesting but huge topic; there are so many methods I wish to implement, but 4 months would not allow it.

Action Plan

  • Before going for any physics feature, plan a math infrastructure testing pipeline and commit to it
  • Keep the renderer integrated and updated
  • Definitely extend this project in the future: explore and improve the models, and continue reading literature on advanced game physics topics