Tags: c#, rendering, raycasting

Why does my C# ray casting rendering engine produce curved lines when they should be straight?


Currently writing a simplistic ray casting engine in C#, based on the Wolfenstein 3D engine. It takes a string[] as a map and renders what the camera would see. This is done by casting 960 rays, one for each vertical slice of pixels in the output window, and measuring the distance that each ray travels. That value is then run through a function and converted into a height for the one-pixel-wide rectangle rendered for that vertical slice. My issue is that, for very close walls, the view bends the straight edges of the wall into a curve, similar in shape to a y = ln(x) graph.

Map (c is camera, facing due East):

# = wall, - = empty, c = camera

Output (with comments):

render output with non-straight lines

It just seems strange to me that such a system would incorrectly render this small part when the rest renders fine. I have already added a Euclidean distance correction to the ray cast length, shown below:

float adjusted_dist = (float)Math.Cos(angle) * length;

This fixed a similar fisheye effect but did not resolve this issue.
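For reference, a minimal sketch of how that correction is typically applied per ray. The names `rayAngle` and `cameraAngle` are assumptions (not taken from the original code); the key point is that the angle passed to Cos should be the ray's angle relative to the camera's view direction, not its absolute angle in the map.

```csharp
using System;

static class FisheyeFix
{
    // Project each ray's length onto the camera's view direction.
    // This removes the fisheye distortion caused by rays at the screen
    // edges travelling farther than rays near the centre.
    public static float PerpendicularDistance(float rayLength, float rayAngle, float cameraAngle)
    {
        return (float)Math.Cos(rayAngle - cameraAngle) * rayLength;
    }
}
```

For example, a ray 10 units long cast 60° off the view axis contributes a perpendicular distance of 5 units.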

My raycasting code works fine as shown by this top-down debug image I rendered:

top-down view of ray cast paths

With a screen size of 960px by 540px, I convert the distance of the ray into a height for the screen slice according to this formula:

int pixels = (int)(540 - (540 * heights[i] / max_dist));

Where max_dist is a constant with a value of 150. My reasoning was that this would accurately convert the length of the ray from the range (0-150) into the range (0-540), as there is nothing currently in my scene that is more than 150 units away from the camera. (Each character in the map is a 10-unit by 10-unit square.) This is the line I suspect the error originates from, but the exact cause evades me. I would love some input and potential alternative approaches to converting ray length to pixel slice height.
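To make that mapping concrete, here is the same formula evaluated at a few sample distances (a standalone sketch, not part of the engine). Note the mapping is linear in distance: full screen height at 0 units, zero height at 150 units.

```csharp
using System;

const float max_dist = 150f;
foreach (float d in new[] { 0f, 10f, 75f, 150f })
{
    // The mapping from the question: slice height shrinks linearly with distance.
    int pixels = (int)(540 - (540 * d / max_dist));
    Console.WriteLine($"distance {d,3} -> {pixels} px"); // 540, 504, 270, 0
}
```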

I have excluded most of the rendering code as it works fine, but I can provide more snippets if needed.


Solution

  • I think the main problem here is that this:

    int pixels = (int)(540 - (540 * heights[i] / max_dist));

    will only be at its maximum (540) when the length of the ray is 0. First of all, that will never happen (judging from the debug render), and second: I think you would want the maximum pixel height to be drawn already at distances farther away than no distance at all. Looking at the output, about a third of the right side of the screen should already be at the maximum pixel height.

    I'm not sure how you could do your approach better. One idea I had is subtracting some amount from the ray length (with a minimum of 0, of course). That probably wouldn't fix the core issue, and everything would just look as if it were closer by, but maybe it is worth a shot.
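That idea could be sketched like this, where `near_offset` is a hypothetical tuning constant (not from the original code) and the clamp keeps very close walls at full height:

```csharp
using System;

const float near_offset = 10f; // hypothetical tuning constant
const float max_dist = 150f;

// In the engine this would iterate over heights[i]; sample lengths used here.
foreach (float len in new[] { 5f, 50f })
{
    // Shift the ray length down, clamped at 0, then apply the original mapping.
    float shifted = Math.Max(len - near_offset, 0f);
    int pixels = (int)(540 - (540 * shifted / max_dist));
    Console.WriteLine($"length {len} -> {pixels} px"); // 540, 396
}
```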

    I also had an idea for a different approach, based on a more "modern" camera:

    int wallheight = 20; // pretty much just a guess
    float horizontal_view = angle; // your horizontal field of view, in radians
    float vertical_view = (horizontal_view / screen_width) * screen_height; // scale the FOV to the screen's aspect ratio
    
    // 2 * tan(0.5 * vertical_view) * ray_length is the total height visible at that
    // distance; the wall's fraction of it, times screen_height, is the slice height,
    // clamped to the screen height.
    int pixels = (int)Math.Round(Math.Min((wallheight / (2 * Math.Tan(0.5f * vertical_view) * ray_length)) * screen_height, screen_height), 0);
    

    Here, angle is basically your field of view, the same one you are using for the raycasts. Then, by using 2 * tan(0.5 * vertical_view) * length, we calculate roughly the maximum height that we can see at that distance. If we divide the wall height by that, we should get the fraction of the visible height that the wall occupies. Multiplying that by your 540 gives the number of pixels to draw vertically, which can of course be higher than 540, so I added the Min(); maybe that won't even be necessary for you.
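    As a quick sanity check of that projection (with made-up values for the wall height and the vertical field of view): the slice height roughly halves every time the distance doubles, which is the 1/distance behaviour that keeps straight walls looking straight.

```csharp
using System;

int wallheight = 20;        // same guess as above
float vertical_view = 0.8f; // hypothetical vertical FOV in radians
int screen_height = 540;

foreach (float ray_length in new[] { 40f, 80f, 160f })
{
    // Perspective projection: slice height is proportional to 1 / ray_length,
    // clamped to the full screen height.
    int pixels = (int)Math.Round(
        Math.Min((wallheight / (2 * Math.Tan(0.5f * vertical_view) * ray_length)) * screen_height,
                 screen_height), 0);
    Console.WriteLine($"distance {ray_length} -> {pixels} px"); // 319, 160, 80
}
```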

    I hope this works, since I can't really test it.