I only released the source; it was written in Blitzbasic. I was playing with the idea of porting it to JS as a test for my BB2JS project. Then I'd have used the imageData object to render the frame and "put" it onto the canvas, which seems pretty straightforward. But then I got stuck on WebGL ^^
There's really not much to see in that source code. But what I would suggest is implementing mipmapping: just auto-downscale the textures to a set of 256, 128, 64, 32, 16, 8, 4, 2 and 1 pixel sizes, and read from the one selected by the distance to the camera. It's cheap in terms of CPU load, and it prevents blinking details at a distance. The additional UV calculation can be done quickly by shifting the texel coordinates to the right.
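To make the idea concrete, here's a minimal JS sketch of software mipmapping as described above: build a chain of power-of-two downscales, pick a level from camera distance, and index the smaller map by right-shifting the base-resolution texel coordinates. All function names and the distance-to-level mapping are my own assumptions, not from the original source.

```javascript
// A texture is { size, data } where data is a flat grayscale Uint8Array
// of size*size texels (single channel, to keep the sketch short).

// Build the mip chain by repeated 2x2 box-filter downscaling,
// e.g. 256 -> 128 -> ... -> 1.
function buildMipChain(base) {
  const levels = [base];
  let cur = base;
  while (cur.size > 1) {
    const size = cur.size >> 1;
    const data = new Uint8Array(size * size);
    for (let y = 0; y < size; y++) {
      for (let x = 0; x < size; x++) {
        // Average the 2x2 block from the finer level.
        const i = (y * 2) * cur.size + (x * 2);
        data[y * size + x] =
          (cur.data[i] + cur.data[i + 1] +
           cur.data[i + cur.size] + cur.data[i + cur.size + 1]) >> 2;
      }
    }
    cur = { size, data };
    levels.push(cur);
  }
  return levels;
}

// Pick a mip level from camera distance (ad-hoc mapping, an assumption;
// tune the scale to taste).
function mipLevelForDistance(dist, maxLevel) {
  const level = Math.floor(Math.log2(Math.max(1, dist)));
  return Math.min(level, maxLevel);
}

// The UV trick from the post: texel coordinates at base resolution
// are shifted right by the level to address the smaller map.
function sampleMip(levels, level, u, v) {
  const m = levels[level];
  return m.data[(v >> level) * m.size + (u >> level)];
}
```

The shift works because each level halves the resolution, so dividing the texel coordinates by 2 per level is exactly `>> level`, with no per-pixel multiply or divide.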