Ten Ways You and Your Camera See Differently

In brief:

  • The human eye and the camera have broadly analogous components. The cornea and the lens of the eye do the work of the glass in the camera lens, while the iris acts as the aperture. The retina, on which the image is focused inside the eye, can be seen as the film or the digital sensor. And the brain, which makes sense of the images projected onto the retina, compares to the computer algorithms used to process images in modern cameras.
  • Despite the similarities, the eye and the camera differ in some fundamental ways, which I’ll be listing in this post. The main difference between the two is the level of complexity: human vision is light years ahead of even the most advanced camera technology.
  • Understanding the dissimilarities between the eye and the camera, and trying to imagine how the camera sees the world, helps us hone our skills and vision as photographers.
  • Some of the differences between our eyesight and the way the camera sees the world have to do with the limitations of our biology, while others stem from the superior capabilities of human vision.

Most people imagine that what they see with their eyes is what their camera sees.

The truth is, however, that the lens in your eyes is very different from the lens of your camera.

Your eyes are a precision instrument that no man-made object can replicate.

As photographers, it might be worth our while to have an understanding of how the eye works, and compare that to the way the camera and the lens do.

How the Eye Works

The human eye is connected to the brain. Light reflected from the outside world enters through the cornea and the pupil, the opening controlled by the iris, and is focused by the lens of the eye onto an area at the back of the eye called the retina. The image lands on the retina upside down.

From the retina, the image is relayed to the brain via the optic nerve. In the brain, the picture is then turned the right way up.

The brain also interprets the information relayed by the eye, allowing us to know what we are looking at.

All of this is happening constantly and in real time.

How the Camera Works

The camera works in much the same way as the eye, only in a far simpler fashion.

When you take a photo, the light goes through the lens and enters the light-tight box of the camera. The image is focused at the back of the camera. As in the eye, the image is formed upside down.

The camera has two alternative ways of recording the image.

Traditional film photography uses a chemical process to fix the image onto a piece of plastic or other material coated with light-sensitive silver halide crystals.

Digital photography works by converting light into an electrical signal, which is then recorded by the silicon image sensor inside the camera.
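
If you’re curious about what “recording” means in practice, here is a toy sketch in Python of the basic idea: each photosite on the sensor collects light, and that analogue signal is quantised into discrete pixel values. This is my own simplification, not how any particular sensor actually works.

    import numpy as np

    # A toy model of the image sensor: each photosite accumulates light,
    # and the analogue signal is quantised into discrete 8-bit pixel values.
    # Real sensors add colour filters, noise and much more; this is a sketch.
    light = np.random.rand(4, 6)                     # relative exposure per photosite, 0..1
    pixels = np.round(light * 255).astype(np.uint8)  # quantised digital values
    print(pixels)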

The camera, too, has a primitive brain of sorts: the computer program built into most modern cameras.

This computer is more complex in digital cameras than in film cameras, where its function is mainly to ensure that the exposure of the image is more or less correct.

In digital cameras, the computer algorithm performs a range of other functions, besides automated exposure control. These may include quite sophisticated preprogrammed image editing as well as lens corrections.

I personally find it nothing short of magical that we can point a device at a scene and somehow get a realistic representation of it, with the added bonus of optional image editing, all done in a fraction of a second!

That said, we are still operating on a very basic technological level, compared to nature. As brilliant as the brain of the camera is, it remains inferior to the one inside our heads.

The following, then, is a list of ten ways in which the eye and the camera see the world differently.

1. Detailed View

We’ll start with a human limitation.

Unlike the camera, we don’t actually see the whole view in front of us at once.

The human eye-brain combination takes in the world by concentrating on a small part of the scene at a time. The eye moves from one detail to another in rapid succession. From these consecutive impressions our brain then constructs a larger scene.

The whole process is very fast and completely automatic.

The camera, on the other hand, sees and records everything at once.

This is the reason we are sometimes surprised to see in our photos details that we didn’t notice when taking them.

Our brain gives us a stylised version of the world by letting us perceive what it considers important and making us ignore what it deems nonessential.

In normal circumstances, it makes a lot of sense for the brain to censor our view. The amount of detail and information in everything would certainly jam our system if it didn’t. Seeing the world selectively helps us function in it.

For photographers, the normal way of seeing the world can be a hindrance, however.

Learning to see details in a shot, before taking the photo, is something that will serve you well.

Ask yourself: Where are the limits of my photo? What are the details included in my photo? What is the background behind my subject like? Is there anything in the foreground that clashes with the subject or the background?

2. Conscious Focus

The eye focuses on a subject automatically and, in most cases, without thinking. Everything it sees appears sharp (assuming you have healthy eyesight).

As opposed to the eye, the camera lens has to be focused on a target consciously.

The focusing process can, of course, be made at least semi-automatic, and in beginner photography it almost always is.

But, for anyone photographing at a slightly more advanced level, the focusing is done consciously and deliberately.

The camera can also render some parts of the view sharp and others blurred. The zone that remains sharp is called the depth of field, and this kind of selective focus is something your eye is normally not capable of.

As we saw in the first item on our list, the eye darts rapidly from one spot to another, which makes us see everything sharply. We can experience the blurring effect – or bokeh, as it’s also called – with our eyes if we focus our gaze on a nearby object while also paying attention to the background.

This is not a normal way of viewing the world, however, and it requires deliberate concentration.

The way the camera focuses selectively on one or more objects at a time can be used for artistic purposes.

The portrait and wedding photography industry is more or less built on taking photos of beautiful people with gorgeous bokeh behind them.

In landscape and architectural photography, on the other hand, we strive to make everything sharp or at least limit the area of defocus in the image.

To achieve a uniformly sharp image, we have three options.

We can use a wide-angle lens, which naturally increases the depth of field and limits the blurred areas in the photo.

We can use a smaller aperture which, again, increases the sharp areas in the image.

We can also take a succession of photographs, each with a different area in focus, and combine them into one photo in which all parts are sharp, a technique known as focus stacking.
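
If you want to put numbers on the focal length and aperture options, the standard depth-of-field formulas do the job. The little Python sketch below is my own illustration, assuming a full-frame camera and a circle of confusion of 0.03mm; it is an approximation, not a substitute for checking the scene through your camera.

    # Rough depth-of-field estimate via the hyperfocal-distance approximation.
    # Assumes a full-frame circle of confusion of 0.03 mm; illustrative only.
    def depth_of_field(focal_mm, f_number, subject_m, coc_mm=0.03):
        f = focal_mm
        s = subject_m * 1000.0                # subject distance in mm
        h = f * f / (f_number * coc_mm) + f   # hyperfocal distance in mm
        near = h * s / (h + (s - f))
        far = h * s / (h - (s - f)) if s < h else float("inf")
        return near / 1000.0, far / 1000.0    # back to metres

    # A 24mm wide-angle at f/11 keeps far more of the scene sharp than
    # an 85mm portrait lens at f/2, both focused at 3 metres:
    print(depth_of_field(24, 11, 3))  # roughly (1.1 m, infinity)
    print(depth_of_field(85, 2, 3))   # roughly (2.9 m, 3.1 m)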

3. Framing It

The world you see with your eyes has no visible limits. The scenery you’re looking at continues indefinitely in all directions. The only “frames” your perception has are the physical limits of your eyesight: you know you can’t see behind your back without turning your head.

The photograph that we take is always confined between the four edges of the image frame.

The frame is the limitation that allows us to pick out details from what we see with our eyes and organise them in artistically pleasing ways.

Your subject will take on a different look depending on where you place it within your photograph.

You can juxtapose or balance out various elements in your photo to create either a dynamic or harmonious composition.

You can convey the relative size of items in your photograph by shooting them from different angles or placing them next to objects that are either larger or smaller.

There are a million and one things you can do within a picture frame.

The only limits are the ones imposed by your imagination, or the lack of it.

4. 2D vs. 3D

Photography is realised on a two-dimensional surface, whether it’s a computer screen or a piece of photographic paper.

The real world shows itself to us in at least three dimensions. We can physically walk inside our everyday environment.

We can only enter the photograph with our minds.

The depth perception in a photograph is created by the relative size of objects as well as by the tones of colours and greys.

Our brain reads a photo of two adults standing at different distances from the camera as showing people of similar height, even though, measured on the flat surface of the photo itself, they are different sizes.

The brain interprets the size difference, quite correctly, as perspective.

A perspective can also be observed in a photo depicting a mountain range.

The nearby hills in the image are darker than the ones farther away, which turn light blue and slightly hazy.

In both cases, the perspective is an illusion. The whole image is on a flat surface with no physical depth at all.
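
The size cue can be put into numbers with a simple pinhole-camera model: the projected size of an object shrinks in proportion to its distance from the camera. The figures below are a made-up example of my own, just to show the ratio at work.

    # Pinhole-camera illustration of perspective: projected size equals
    # (real size * focal length) / distance. Example numbers are hypothetical.
    def projected_height_mm(real_height_m, distance_m, focal_mm=35):
        return real_height_m * 1000.0 * focal_mm / (distance_m * 1000.0)

    # Two adults of the same height, one at 3 m and one at 9 m from a 35mm lens:
    print(projected_height_mm(1.8, 3))  # 21.0 mm on the sensor plane
    print(projected_height_mm(1.8, 9))  # 7.0 mm, three times smaller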

5. Blink of an Eye

You can’t freeze time with your eyes, like the camera does, unless you open and close them very rapidly.

In normal life, our eyes are like video cameras. They record the succession of events without ever stopping the motion.

Because of photography’s unique ability to stop time in its tracks, our understanding and perception of the natural world have expanded significantly.

Prior to the invention of cameras, we were not able to observe the wings of a hummingbird in mid-flight, for example.

We were not even sure of the finer details of a horse’s gallop. It was in 1878 that the photographer Eadweard Muybridge took his series of photographs of a running horse. These photos showed in minute detail the sequence of the equine gallop.

We could now observe that the horse does, in fact, lift all four feet off the ground simultaneously at certain points during its gait.

The Horse in Motion by Eadweard Muybridge. “Sallie Gardner,” owned by Leland Stanford; running at a 1:40 gait over the Palo Alto track, 19th June 1878. Frames 1-11 used for animation, frame 12 not used.

The series of photos taken by Muybridge was traced onto a glass disk by an artist and then viewed on the zoöpraxiscope, an early predecessor of the movie projector. This and other similar experiments eventually led to the birth of the cinema, with its moving images projected onto the big screen.

6. The Range of Light

The dynamic range of a camera refers to its ability to record detail in both the brightest and the darkest areas of a single image.

Even the most outstanding cameras today fall short of the capability of the human eye to detect detail in the shadows and highlights at the same time. The gap seems to be narrowing, however.

The dynamic range of the eye is about twenty stops. The best cameras tested by DxOMark Laboratories reach around 14-15 stops.

The Nikon D850, the Nikon D810 and the medium format Hasselblad X1D-50c all have a dynamic range of 14.8 stops. The Pentax 645Z, a slightly older medium format camera, trails the top trio by a tenth of a stop at 14.7. The Sony A7R III, which came out this year, is neck and neck with the Pentax, also at 14.7 stops.

The Canon 5DS R, launched in 2015, has a dynamic range of 12.4 stops, which suggests to me that the contrast-reading ability of top cameras is on an upward curve.
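
Stops are a base-two scale, so each additional stop doubles the brightest-to-darkest ratio a sensor can hold. The quick conversion below is my own arithmetic based on the figures above, not something published by DxOMark.

    # One stop = a doubling of light, so a dynamic range of n stops
    # corresponds to a contrast ratio of roughly 2 ** n to 1.
    for name, stops in [("Human eye (approx.)", 20),
                        ("Nikon D850", 14.8),
                        ("Canon 5DS R", 12.4)]:
        print(f"{name}: about {2 ** stops:,.0f}:1")

    # Human eye (approx.): about 1,048,576:1
    # Nikon D850: about 28,526:1
    # Canon 5DS R: about 5,405:1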

For a photographer, the inferior dynamic range of cameras in comparison to the human eye is something to keep in mind.

Because the camera often cannot record both the brightest and the darkest areas in a contrasty scene, you as a photographer will have to make an artistic decision: will you expose for the highlights (saving the detail in the brightest parts of the image), or will you try to keep the shadow detail and risk clipping the highlights?

7. White Balance

White balance in photography refers to the overall colour temperature of an image. You can make the colours in your photograph warmer or cooler. You can also tweak the tone of the image by adding either green or pink to it.

All light sources project a different colour onto the objects they illuminate. The colour of these objects changes slightly, or sometimes drastically, depending on the colour of the light.

Because our vision is a product of our eyes and our brain, in combination, we hardly ever notice the colour changes. To us, indoor lighting with its yellowish colour tone looks the same as the bluer sunshine outdoors.

Again, the reason we can’t see these colour differences is that we don’t really see the world around us as it is. Our brain translates the world to us, and shows us what it thinks we need to see.

Cameras are not like that.

They will always capture the light as it appears. Whether the light in the photo seems correct to us depends on our camera settings.

All professional digital cameras have settings for auto, daylight, cloudy, shade, tungsten, fluorescent, flash and custom white balance.
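
Under the hood, a white balance setting largely boils down to multiplying the red, green and blue channels by different gains. The sketch below shows a simple “gray world” correction in Python; it estimates the gains from the image itself and is only an illustration, not how any particular camera’s presets are implemented.

    import numpy as np

    # Gray-world white balance: assume the scene averages out to neutral grey
    # and scale each channel so its mean matches the overall mean.
    def gray_world_white_balance(image):
        # image: float array of shape (height, width, 3), values 0..1
        channel_means = image.reshape(-1, 3).mean(axis=0)
        gains = channel_means.mean() / channel_means
        return np.clip(image * gains, 0.0, 1.0)

    # A warm, tungsten-lit image has a strong red-yellow cast; the red gain
    # comes out below 1 and the blue gain above 1, pulling it back to neutral.
    warm_image = np.random.rand(4, 4, 3) * np.array([1.0, 0.8, 0.5])
    balanced = gray_world_white_balance(warm_image)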

What I’ve noticed, in practice, is that the auto setting works very well in most situations. I do edit the white balance of my images in post-processing, but I’d have to do this in any case – unless I was using custom white balance and adjusting it separately for each image.

So my advice would be to just use the auto white balance for everyday photography. No need to complicate things unnecessarily.

8. Black and White

This is a difference that needs no explanation.

Cameras can “see” in black and white. We can’t.

Actually, that’s not true. Some of us can.

There is a rare condition called achromatopsia which affects about one person in 40,000. People with full-blown achromatopsia only see shades of grey, black and white.

If your vision is normal, however, you will have to learn to see without colour if you wish to become a successful black and white photographer.

The absence of colour in photography requires a different set of aesthetics compared to the one utilised when taking colour photographs.

Black and white photography uses shades, tones and textures in its aesthetic language.
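
If you’re wondering what dropping the colour actually involves, the usual method is a weighted sum of the red, green and blue channels. The weights below follow the common Rec. 601 luminance formula; other converters, and in-camera monochrome modes, may weight the channels differently.

    import numpy as np

    # Black and white conversion as a weighted sum of the colour channels,
    # using the widely used Rec. 601 luminance weights.
    def to_grayscale(image):
        # image: float array of shape (height, width, 3) in RGB order
        weights = np.array([0.299, 0.587, 0.114])
        return image @ weights

    # Pure green reads much brighter than pure blue in black and white,
    # which is why colour filters change the tones so dramatically.
    print(to_grayscale(np.array([[[0.0, 1.0, 0.0]]])))  # about 0.59
    print(to_grayscale(np.array([[[0.0, 0.0, 1.0]]])))  # about 0.11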

Often the images taken without colour are more striking than colour photos. With the chromatic information missing, it’s easier to simplify the photographic message. Whether our point gets across to the viewer depends entirely on our skill as photographers.

Sometimes the circumstances are not ideal for black and white photography. I’ve noticed that days with dull, grey light produce boring black and white images.

Look for light with drama.

This could be a beam of light breaking through thunder clouds. Or the horizontal light of the morning illuminating a row of trees in the forest.

Black and white seems to work for photos taken at night also.

If you can’t see anything interesting happening in terms of light, stick to colour.

Or just put the camera away and wait for the next opportunity.

9. Distorted View

The visual field of the human eye is about 180 degrees. That’s how wide we can see when we’re looking forward without moving our eyes.

Our field of view is divided into two parts.

The one which allows us to see details clearly is called the cone of visual attention. The width of this part is about sixty degrees.

The other part of our eyesight is called peripheral vision. It’s located on the fringes of our vision and mainly just allows us to detect motion.

Peripheral vision extends from about sixty degrees all the way out to 180 degrees, and so comprises the largest part of our field of view.

The focal length of our eye, incidentally, is about 25mm, the distance from the cornea to the retina.

If our eye were a camera lens, it would be either a fisheye lens, with a focal length of 8-10mm, or a 30mm lens, with a field of view of 61.9 degrees.

The fisheye lens is equivalent to our total field of view combining the cone of visual attention and the peripheral vision.

The 30mm lens corresponds to the cone of visual attention on its own. Its field of view is approximately the same width as the area in which we can see details with our eyes.
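
The 61.9-degree figure follows from the standard angle-of-view formula. Here it is as a small Python check, assuming a full-frame sensor 36mm wide; the numbers are my own arithmetic.

    import math

    # Horizontal angle of view = 2 * atan(sensor width / (2 * focal length)),
    # assuming a rectilinear lens on a full-frame sensor 36 mm wide.
    def angle_of_view_deg(focal_mm, sensor_width_mm=36.0):
        return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

    print(angle_of_view_deg(30))  # about 61.9 degrees, the cone of attention
    print(angle_of_view_deg(8))   # about 132 degrees for a rectilinear 8mm lens;
                                  # real fisheyes bend the projection to reach ~180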

One difference between the optical qualities of our eyes and some lenses is in the amount of distortion they create.

Our eyes give us an undistorted, balanced view of the world.

Many camera lenses – such as the fisheye or telephoto lenses – distort the view in one way or another.

Ultra-wide angle lenses produce a curvilinear image with strong barrel distortion.

Telephoto lenses, on the other hand, tend to compress the image, making objects seem closer to one another than they are in reality.

Both of these kinds of lens distortion are impossible for our eyes to reproduce, which makes such specialty lenses an interesting choice for artistic experimentation.
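
For the curious, lens-correction software typically models this kind of distortion as a radial displacement that grows with distance from the image centre. The one-coefficient sketch below is a simplification of that idea, and the coefficient is an arbitrary value I picked for illustration.

    # Simplified one-coefficient radial distortion model. Mapping ideal
    # (rectilinear) coordinates to where the lens records them, a negative k
    # pulls edge points towards the centre, the look of barrel distortion,
    # while a positive k pushes them outwards (pincushion). k here is arbitrary.
    def distort_point(x, y, k=-0.2):
        # x, y are normalised coordinates with (0, 0) at the image centre
        r_squared = x * x + y * y
        factor = 1 + k * r_squared
        return x * factor, y * factor

    print(distort_point(0.1, 0.0))  # near the centre: barely moved
    print(distort_point(0.9, 0.0))  # near the edge: pulled in noticeably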

10. Post Processing

Finally, the “images” produced by the human eye differ from photographic images in that they cannot be modified by post-processing.

When you take a photo with your camera, in most cases, you haven’t finished with the process of creating the image.

All photographs, whether they are of the digital or the film variety, are usually manipulated in some way after their capture.

You can change the colours in your image, or get rid of them completely.

You can recrop your photo.

You can add elements to it or remove them.

The possibilities are endless.
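
As a small taste of what this looks like in practice, the Pillow library in Python can crop an image, drop its colour and brighten it in a few lines. The file name is just a placeholder, and the crop box and brightness factor are arbitrary examples.

    from PIL import Image, ImageEnhance

    # A few basic post-processing steps with the Pillow library.
    img = Image.open("photo.jpg")                   # placeholder file name
    cropped = img.crop((200, 100, 1400, 900))       # left, top, right, bottom
    mono = cropped.convert("L")                     # drop the colour
    brighter = ImageEnhance.Brightness(mono).enhance(1.2)  # +20% brightness
    brighter.save("photo_edited.jpg")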

The images we “take” with our eyes are what they are. You can’t change what you’ve seen and can’t unsee things.

The best we can do with our eyes is learn to see more clearly and try to imagine how the camera would render each particular view.

If we can do that, we’re already on our way to mastering the art of photography.


Sources:

Paul Petzold: Focal Book of Practical Photography, published by Focal Press, 1980.

Digital camera imaging sensor: What is it and how does it work?: http://www.whatdigitalcamera.com/technology_guides/digital-camera-imaging-sensor-work-60565

How Does a Camera Work?: https://wonderopolis.org/wonder/how-does-a-camera-work

Sallie Gardner at a Gallop: https://en.m.wikipedia.org/wiki/Sallie_Gardner_at_a_Gallop

Achromatopsia: https://en.m.wikipedia.org/wiki/Achromatopsia

Canon 5DS R: Tests and Reviews: https://www.dxomark.com/Cameras/Canon/EOS-5DS-R

Sony A7R III: Tests and Reviews: https://www.dxomark.com/Cameras/Sony/A7R-III

Landscape: https://www.dxomark.com/Cameras/Ratings/Landscape

The Camera Versus the Human Eye: https://petapixel.com/2012/11/17/the-camera-versus-the-human-eye/

Field of view: https://en.m.wikipedia.org/wiki/Field_of_view

What is the maximum angle a human eye can see?: https://www.quora.com/What-is-the-maximum-angle-a-human-eye-can-see

Fisheye lens: https://en.m.wikipedia.org/wiki/Fisheye_lens

Fisheye Lenses: https://www.photographymad.com/pages/view/fisheye-lenses


All photographs in this post are copyright (c) Markus Jaaskelainen Photography 2018.
