Shading problems with cubes next to each other?

Hello Community,
I ran into a very odd problem a few weeks ago and haven’t managed to fix it so far.
I’m making a 2D side-scrolling jumping game where players can build their levels out of blocks.
The problem is that whenever the player moves, odd flashing artifacts (black lines) appear between some of the cubes. Since this only happens while the camera’s position is changing, I suppose the shading messes things up.

Here is a screenshot

I’m combining the meshes of the blocks in order to reduce drawcalls.
I tried changing the rendering path from forward to the other options; that didn’t work.
I’ve spent hours tweaking the light to minimize the effect, but unless I turn shaders off completely it remains.
I’m slowly running out of ideas; maybe some of you can help?
Thanks in advance!

What you’re seeing is basically due to floating point imprecision, combined with (as Olly says) non-welded vertices. This can occur even if you’re sure there are no imprecisions in your own measurements: Your blocks may be exactly 1m square, and you may have used grid snapping to position them exactly 1m apart, but Unity’s engine (and most other 3d engines) will be storing the object positions and vertex positions as floating point values.

Because they’re separate objects and not welded vertices, Unity has no idea that these two vertices are supposed to be in exactly the same spot, and the calculations used to compute the on-screen position may result in tiny differences in value.

Consider the vertex shown in yellow for object A and B:

[11545-rendering+gap+artefact.jpg|11545]

If this mesh were welded, Unity would calculate one position for the shared vert. But since they’re not welded, and two separate objects each have a vert in that position, Unity performs two separate floating point calculations - one for object A and one for object B.

Most of these blocks appear to line up exactly, from most camera positions. But from the occasional unfortunate position, the exact value for A’s position plus its vertex at (0.5,0.5,-0.5) might be slightly different to object B’s position plus its vertex at (-0.5,0.5,-0.5) . The result is that Unity shows a tiny gap, within which you can see the shadowed side of cube A. If these were planes instead of cubes, you’d be able to see through the pixel-wide gap.
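To make the rounding concrete, here is a minimal Python sketch (double precision rather than the GPU’s 32-bit floats, but the effect is the same): computing the shared edge once from A’s centre and once from B’s centre can land on two different floating point values.

```python
a = 0.1        # centre of block A (an off-grid position makes the effect easy to hit)
b = a + 1.0    # centre of block B, one grid unit to the right

edge_from_a = a + 0.5  # A's centre plus its local vertex at +0.5
edge_from_b = b - 0.5  # B's centre plus its local vertex at -0.5

print(edge_from_a)                 # 0.6
print(edge_from_b)                 # 0.6000000000000001
print(edge_from_a == edge_from_b)  # False: the "same" edge, two values
```

The mismatch here is about 1e-16; in the 32-bit floats a GPU vertex shader uses it is on the order of 1e-7 - still tiny, but enough to let an occasional screen pixel fall between the two edges.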

A workaround might be to make these blocks a tiny amount larger than your grid size, so they overlap slightly. For 1x1x1 bricks, a scale of (1.001, 1.001, 1.001) would be enough. As a side effect, you might be able to see the slight overlap when looking very closely, but judging from your screenshot it should be acceptable for your purposes.
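A quick sanity check of that workaround, assuming 1x1x1 blocks on a 1-unit grid: scaling by 1.001 grows each half-extent from 0.5 to 0.5005, so neighbouring faces interpenetrate by roughly 0.001 units - several orders of magnitude more than any rounding error, so no gap can open.

```python
a = 0.1               # centre of block A
b = a + 1.0           # centre of block B
half = 0.5 * 1.001    # half-extent after scaling = 0.5005

overlap = (a + half) - (b - half)
print(overlap > 0)    # True: A's right face now sits past B's left face
```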

These seams are most likely the result of adjacent, non-welded vertices (e.g. minute gaps between your user-placed blocks). The shadow calculations pick up those minute gaps in the geometry and display artefacts along them. The only sure way to prevent this is to make the geometry welded, i.e. watertight - which I appreciate is difficult to do with dynamic content.
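For completeness, here is a sketch of what welding amounts to (plain Python, not Unity’s API; `weld` and its parameters are illustrative names): snap each vertex to a tolerance grid, keep one copy per grid cell, and remap the triangle indices to the surviving copies.

```python
def weld(vertices, triangles, tol=1e-4):
    """Merge vertices that lie within `tol` of each other and
    remap triangle indices to the surviving copies."""
    seen = {}     # quantised position -> index into merged list
    merged = []   # de-duplicated vertex list
    remap = []    # old index -> new index
    for v in vertices:
        key = tuple(round(c / tol) for c in v)
        if key not in seen:
            seen[key] = len(merged)
            merged.append(v)
        remap.append(seen[key])
    return merged, [remap[i] for i in triangles]

# Two triangles meant to share an edge, but with a 1e-5 mismatch:
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0),
         (1.00001, 0.0, 0.0), (0.0, 1.00001, 0.0), (1.0, 1.0, 0.0)]
tris = [0, 1, 2, 3, 5, 4]
merged, new_tris = weld(verts, tris)
print(len(merged))   # 4 - the near-duplicate edge vertices were merged
```

After welding, both triangles reference literally the same vertex, so every calculation produces literally the same on-screen position and the seam cannot open.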

To minimise it you can try things like angling the direction of the light, increasing the shadow quality settings, and increasing the accuracy with which your blocks snap to each other. Obviously you also want to make sure the blocks have consistent scales.
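On the snapping point, a minimal sketch (a hypothetical helper, not a Unity API): rounding each coordinate to the nearest grid multiple keeps placed blocks exactly one grid step apart, instead of at whatever position the drag happened to end on.

```python
def snap(value, grid=1.0):
    """Round a coordinate to the nearest multiple of the grid size."""
    return round(value / grid) * grid

print(snap(0.97))        # 1.0
print(snap(2.43, 0.5))   # 2.5
```

Note this only removes placement error; the floating point rounding described in the other answer can still occur between perfectly snapped, non-welded blocks.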

I have encountered a similar problem.

I played with position, scale and shaders with no luck.

Then I started adjusting project Quality settings but that didn’t help either.

Finally, I looked at my camera settings.

Changing Occlusion Culling had no effect.

But setting HDR and MSAA to “Use Graphics Settings” did the trick for me!

I think it is the MSAA that did it, but why not just enable them both?

I hope this saves someone a bit of time playing with settings.

178736-camerasettings.png