This simple 3D engine loads 3D models and renders them using the "2d" context of the <canvas> element.
It seems that when we restrict ourselves to a few hundred faces, static models can be rendered without too much trouble.
Needless to say, textures are even more demanding than plain filled faces, so expect pretty bad performance on high-polygon models.
When textures are enabled, only the simpler models can be rotated in something resembling real-time, since the frame rate drops rather drastically.
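To make the basic pipeline concrete, here is a minimal sketch (the names and the projection formula are my assumptions, not the engine's actual code) of how a face could end up on the "2d" context: each 3D vertex is projected to screen space with a simple perspective divide, and the resulting path is filled.

```javascript
// Project a 3D point {x, y, z} to 2D canvas coordinates.
// fov acts as a focal length; points with larger z shrink toward the center.
function project(p, width, height, fov) {
  const scale = fov / (fov + p.z); // simple perspective divide; assumes p.z > -fov
  return { x: p.x * scale + width / 2, y: p.y * scale + height / 2 };
}

// Fill one face (an array of 3D vertices) on a 2D canvas context.
function fillFace(ctx, face, width, height, fov) {
  const pts = face.map((p) => project(p, width, height, fov));
  ctx.beginPath();
  ctx.moveTo(pts[0].x, pts[0].y);
  for (let i = 1; i < pts.length; i++) ctx.lineTo(pts[i].x, pts[i].y);
  ctx.closePath();
  ctx.fill();
}
```

Sorting faces back-to-front and filling them like this is roughly all there is to the untextured case, which is why a few hundred faces stay manageable.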
Models are stored as JSON.
Textures are just regular JPEG images.
Before rendering can begin, they are chopped up into small pieces (one per face) according to the UV coordinates of the mesh. These pieces are then rotated and scaled onto a canvas for easier rendering, which may take a few seconds. This step will probably be reworked so that the "baking" of textures happens ahead of time, to speed up loading (or at least to allow both raw and baked textures).
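The chopping step can be sketched like this (the function names and the v-axis flip are assumptions on my part, not the engine's actual API): convert a face's UV coordinates into pixel coordinates on the texture image, and compute the bounding box of the piece that gets copied onto a small per-face canvas.

```javascript
// Map a UV coordinate {u, v} (0..1) to pixel coordinates on the texture.
// Canvas y grows downwards, while v conventionally grows upwards, hence the flip.
function uvToPixel(uv, texWidth, texHeight) {
  return { x: uv.u * texWidth, y: (1 - uv.v) * texHeight };
}

// For one face's UVs, return its texture-space points and bounding box.
function facePatch(uvs, texWidth, texHeight) {
  const pts = uvs.map((uv) => uvToPixel(uv, texWidth, texHeight));
  const xs = pts.map((p) => p.x);
  const ys = pts.map((p) => p.y);
  const x = Math.min(...xs);
  const y = Math.min(...ys);
  return { points: pts, x, y, width: Math.max(...xs) - x, height: Math.max(...ys) - y };
}
```

A per-face canvas of `width` × `height` pixels could then be filled with `ctx.drawImage(texture, x, y, width, height, 0, 0, width, height)`, giving each face its own small baked texture.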
The chopped up triangular texture parts are then rendered by skewing, rotating and scaling them into place so they fit the triangle to be rendered. A clipping path ensures that only the relevant part of the baked texture is rendered.
The skewing is done by a rotate -> scale -> another rotate, inspired by this function.
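An equivalent way to get the same result (not the rotate/scale/rotate decomposition described above, just the general solution) is to solve directly for the six coefficients of the affine transform that maps the texture triangle onto the screen triangle, and hand them to `setTransform`. A sketch, with names of my own choosing:

```javascript
// Solve for [a, b, c, d, e, f] such that, for each point pair (s, d):
//   d.x = a*s.x + c*s.y + e
//   d.y = b*s.x + d*s.y + f
// Returned in the argument order of ctx.setTransform(a, b, c, d, e, f).
function triangleTransform(s0, s1, s2, d0, d1, d2) {
  const det = s0.x * (s1.y - s2.y) + s1.x * (s2.y - s0.y) + s2.x * (s0.y - s1.y);
  const a = (d0.x * (s1.y - s2.y) + d1.x * (s2.y - s0.y) + d2.x * (s0.y - s1.y)) / det;
  const b = (d0.y * (s1.y - s2.y) + d1.y * (s2.y - s0.y) + d2.y * (s0.y - s1.y)) / det;
  const c = (d0.x * (s2.x - s1.x) + d1.x * (s0.x - s2.x) + d2.x * (s1.x - s0.x)) / det;
  const d = (d0.y * (s2.x - s1.x) + d1.y * (s0.x - s2.x) + d2.y * (s1.x - s0.x)) / det;
  const e = d0.x - a * s0.x - c * s0.y;
  const f = d0.y - b * s0.x - d * s0.y;
  return [a, b, c, d, e, f];
}

// Clip to the destination triangle, apply the transform, draw the baked piece.
function drawTexturedTriangle(ctx, image, src, dst) {
  const [a, b, c, d, e, f] = triangleTransform(src[0], src[1], src[2], dst[0], dst[1], dst[2]);
  ctx.save();
  ctx.beginPath();
  ctx.moveTo(dst[0].x, dst[0].y);
  ctx.lineTo(dst[1].x, dst[1].y);
  ctx.lineTo(dst[2].x, dst[2].y);
  ctx.closePath();
  ctx.clip(); // only pixels inside the destination triangle are touched
  ctx.setTransform(a, b, c, d, e, f);
  ctx.drawImage(image, 0, 0);
  ctx.restore();
}
```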
As you might have figured out, this does not give entirely correct results. This approach is called affine texture mapping, and it means that, for any textured face, all the texture pixels are rendered as if they were at the same distance from the viewer.
I don't think the problem is very visible on the models provided here (especially with my shoddy texturing job!), but it can certainly be a problem in some cases. It seems like an OK compromise for now, though, since perspective-correct texturing would be too slow for any real-time use.
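For contrast, here is roughly what the correct approach would involve: perspective-correct mapping interpolates u/z, v/z and 1/z linearly in screen space and divides back at each sample. Doing that per pixel is exactly what the 2D canvas can't do cheaply. A sketch for one sample along a span:

```javascript
// Perspective-correct interpolation of UVs between two endpoints.
// a, b: {u, v, z} at the endpoints; t: 0..1 position along the span.
function perspectiveCorrectUV(a, b, t) {
  const invZ = (1 - t) / a.z + t / b.z;
  const u = ((1 - t) * (a.u / a.z) + t * (b.u / b.z)) / invZ;
  const v = ((1 - t) * (a.v / a.z) + t * (b.v / b.z)) / invZ;
  return { u, v };
}
```

At the midpoint between z = 1 and z = 3, this gives u = 0.25 where affine mapping would give 0.5: the nearer half of the face should cover more texture, and that per-pixel divide is the cost the affine shortcut avoids.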