# Spherical Light

One of the things that’s typically very difficult to model in a rendering engine is an area light source. In raytracing this becomes relatively straightforward (although computationally expensive!).

I’ve already covered a type of global area light source in an earlier article on global lighting, effectively creating a light that exists all around the scene and calculating how much of it falls on an object. This gives some nice results, but ultimately the area of the light source is so great that it’s difficult to get a noise-free image. Importance sampling improves this, but it’s still really only useful for outdoor scenes.

Once your scene is inside an object, like a Cornell box, then you need a localised area light source instead. To start with I picked a spherical light source because it’s also the first type of raytracing primitive most people work with!

The principle for using a spherical light source is very simple. With a typical point light source we calculate the ray to the light using the formula:

`ray_to_light = light_position - intersection_point`

For a sphere light we offset `light_position` by a random vector that places it somewhere within a sphere. The code for this is pretty simple:

```cpp
// Rejection sampling: pick a random point in the cube spanning
// [-1,1] on each axis, and retry until it falls inside the unit sphere.
do
{
    randVec[0] = scene.GetRandom()*2 - 1;
    randVec[1] = scene.GetRandom()*2 - 1;
    randVec[2] = scene.GetRandom()*2 - 1;
    mag = randVec.Magnitude();
}
while (mag > 1);

light_position += randVec;
```

We then randomly select locations inside the sphere, and resample the lighting multiple times. With enough iterations we start to build up soft shadows around object boundaries where only some of the sphere is visible.
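As a sketch of that sampling loop, the soft shadow factor is just the fraction of jittered light positions that can see the surface point. The `Vec3` type, `randomInUnitSphere` and the visibility callback here are my own stand-ins, not the engine’s actual API:

```cpp
#include <cmath>
#include <cstdlib>
#include <functional>

struct Vec3 {
    double x, y, z;
    double Magnitude() const { return std::sqrt(x*x + y*y + z*z); }
};

// Rejection-sample a point inside the unit sphere, as in the do/while above.
Vec3 randomInUnitSphere() {
    Vec3 v;
    do {
        v = { std::rand() / (double)RAND_MAX * 2 - 1,
              std::rand() / (double)RAND_MAX * 2 - 1,
              std::rand() / (double)RAND_MAX * 2 - 1 };
    } while (v.Magnitude() > 1);
    return v;
}

// Average many shadow tests against jittered light positions. The
// returned fraction in [0,1] scales the light's contribution, and it's
// this partial visibility that produces the soft penumbra.
double softShadowFactor(const Vec3& lightCentre, double radius, int samples,
                        const std::function<bool(const Vec3&)>& visibleFrom) {
    int unoccluded = 0;
    for (int i = 0; i < samples; ++i) {
        Vec3 r = randomInUnitSphere();
        Vec3 samplePos = { lightCentre.x + r.x * radius,
                           lightCentre.y + r.y * radius,
                           lightCentre.z + r.z * radius };
        if (visibleFrom(samplePos)) ++unoccluded;
    }
    return unoccluded / (double)samples;
}
```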

Spiral of spheres lit with an area light

## The Visible Light

Note that in the image above you can see the light itself. Sadly this doesn’t come for free. All we’ve done so far is allow lighting to be calculated from positions within a defined sphere. This can make it very hard to understand why a scene looks how it does, and the eye generally expects to see a light source when it’s inside the scene.

To achieve this I’ve added a couple of pieces of functionality to the raytracer so I can render a visible sphere in the scene.

First off, an object being used to represent the light is no good if it casts a shadow. With all of the light originating inside the sphere, the shadow would be cast across the whole scene, and we’d end up with an entirely dark render. So I’ve added a `do-not-light` flag to the rendering engine which indicates that an object should be ignored when detecting shadows during lighting calculations.
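In sketch form, the shadow test just skips any object carrying the flag. `SceneObject`, `inShadow` and the intersection test are stand-in names, not the engine’s real API:

```cpp
#include <vector>

struct Ray { /* origin, direction, etc. */ };

struct SceneObject {
    bool doNotLight = false;  // true for the visible sphere marking the light
    bool blocks = false;      // stand-in for a real ray/object intersection
    bool intersectsRay(const Ray&) const { return blocks; }
};

// A point is in shadow only if some object *other than* a light marker
// blocks the ray to the light; without the skip, the light's own shell
// would shadow the entire scene from its interior.
bool inShadow(const Ray& rayToLight, const std::vector<SceneObject*>& objects) {
    for (const SceneObject* obj : objects) {
        if (obj->doNotLight) continue;  // ignore the light's visible shell
        if (obj->intersectsRay(rayToLight)) return true;
    }
    return false;
}
```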

The next thing I’ve had to implement is an emissive surface type. An emissive surface has a specific colour in the scene regardless of whether or not it’s being lit. i.e.

`diffuse = emissive_colour + other_lighting_contributions`
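As a minimal sketch of that formula (the `Colour` type and `shade` name are assumptions, not the engine’s real code), the emissive term is simply added on top of whatever other lighting the point receives, so the surface never renders black:

```cpp
struct Colour {
    double r, g, b;
    Colour operator+(const Colour& o) const { return {r + o.r, g + o.g, b + o.b}; }
};

// Emissive surfaces keep their colour regardless of scene lighting:
// the emissive term is an unconditional additive contribution.
Colour shade(const Colour& emissive, const Colour& otherLighting) {
    return emissive + otherLighting;
}
```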

Emissive isn’t quite the right word here, because in real life an emissive surface also emits light into the scene itself. In this case the emissive surface only appears to emit light, because it’s placed directly over the top of an actual light source. ho-hum..

One aspect has had me pretty stumped though. What colour should the emissive surface of the sphere be? In the raytracing engine I typically use light colours of 100,100,100. Given attenuation this tends towards a value of 1.0 over 10-15 units. This gives a nice balance in most scenes, but what colour should the surface of my lightsource be?

If I make it 100,100,100 then it’s hugely bright compared to the rest of the scene, and also compared to white objects which it’s illuminating in close proximity. While pondering this, and eating some cod and chips, I realised that the surface of the sphere could use a level of brightness equivalent to being illuminated by a point light source at the centre of the sphere. That should be similar to the lighting contribution a white surface receives while sitting next to the sphere.
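A quick sketch of that idea, assuming straightforward inverse-square attenuation (the engine’s actual falloff may well differ):

```cpp
// Brightness a white surface would receive from a point light of the
// given intensity at distance sphereRadius, used as the emissive level
// of the visible shell. Inverse-square falloff is an assumption here.
double emissiveBrightness(double lightIntensity, double sphereRadius) {
    return lightIntensity / (sphereRadius * sphereRadius);
}
```

With the scene’s usual intensity of 100 and a sphere of radius 10 this gives a surface brightness of 1.0, which lines up with the 10-15 unit figure above.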

This is the result…

Two spheres, the one on the right is plain white diffuse, the one on the left is emissive and maps to a spherical light source.

I’m pretty happy with this result, although it still doesn’t feel quite right. Hmm, answers on a postcard if you’ve got a better idea!

## Sphere or Spherical Shell

The next aspect of the spherical light I’ve been playing with is whether the light should come from anywhere within the sphere itself, or only from its surface.

I’ve tried both as I felt the latter would be slightly more realistic although tougher to calculate precisely.
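Sampling the shell instead of the interior is a small change: normalise the rejection-sampled interior point, pushing it out to the surface. Because the accepted interior sample is uniform in direction, the result is uniform over the shell (again a sketch with assumed names):

```cpp
#include <cmath>
#include <cstdlib>

struct Vec3 {
    double x, y, z;
    double Magnitude() const { return std::sqrt(x*x + y*y + z*z); }
};

// Uniform point on the surface of the unit sphere: rejection-sample the
// interior as before, then divide by the magnitude to project onto the shell.
Vec3 randomOnUnitShell() {
    Vec3 v;
    double mag;
    do {
        v = { std::rand() / (double)RAND_MAX * 2 - 1,
              std::rand() / (double)RAND_MAX * 2 - 1,
              std::rand() / (double)RAND_MAX * 2 - 1 };
        mag = v.Magnitude();
    } while (mag > 1 || mag == 0);  // also reject the degenerate origin
    return { v.x / mag, v.y / mag, v.z / mag };
}
```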

So here we go, sphere light…

Sphere light with light origins throughout the interior of the sphere

And now for the spherical shell light source…

Spherical light source with light rays starting only at the surface of the sphere

The summary of days of messing about then… not a hell of a lot of difference. :-/ That’s just the way the cookie crumbles sometimes. There were two aspects of the render that I was particularly interested in seeing. Firstly, how different do the shadows look? From what I can tell they’re generally a little broader with a more defined umbra.

Secondly, is the noise in the image much reduced with the shell light? I would hope so, but I struggle to see much difference in the test renders I’ve been doing so far!

For a final render I thought I’d do the classic RGB lights (one red, one green, one blue) with some basic elements to see how it looks.

Red, green and blue area lights in a raytraced scene