Why Do Computer Screens Look Different In Pictures Than In Real Life?

I’m going to go ahead and put this picture right here to show you what I’m talking about:

(Image: a screenshot of a laptop screen beside a smartphone photograph of the same screen)

See the difference between the two pictures of the same screen?

The picture on the left is a screenshot of a page on my laptop, while the one on the right is a photograph of the same page, taken with my smartphone.

Go ahead and try it right now. Take a picture of your laptop screen, and the photo will very likely be covered in weird ‘rainbow patterns’ that aren’t visible to the naked eye when you look at the screen directly.

The same phenomenon is also observed in movies and videos when they show a TV or computer screen that’s up and running. This is especially true for old CRT screens.

So, what’s going on there? Why do pictures of computer and TV screens look so different than they do in real life?

The brain’s talent for image processing

To understand this, we first need to appreciate what the brain does when the eyes feed it an image. Your eyes merely transmit the pattern of photons falling on them; it’s the brain that processes that raw input and derives some sense from it, allowing you to actually ‘see’ things.

When you see a motion picture, the picture itself is not moving; rather, it’s a collection of multiple pictures that appear incredibly fast on the screen, which the brain then smooths out, making you think that something is actually moving on the screen. This is where something known as the frame rate comes in.

Simply put, it’s the number of images that appear on a screen per second. The higher the frame rate, the more convincing the motion looks on the screen.

Almost all Hollywood movies are shot at 24 fps, which means that when you watch a Hollywood movie, you basically see 24 still images projected on the screen in a single second! Action movies and video games have higher frame rates, to make the motion look even smoother!
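To put those numbers in perspective, here is a tiny, purely illustrative Python sketch (the helper name `frame_duration_ms` is my own) that converts a frame rate into how long each still image actually stays on screen:

```python
# Illustrative only: how long each still image stays on screen
# at a few common frame rates.
def frame_duration_ms(fps: float) -> float:
    """Milliseconds each frame is displayed at a given frame rate."""
    return 1000 / fps

for fps in (24, 30, 60):
    print(f"{fps} fps -> each frame lasts {frame_duration_ms(fps):.1f} ms")
```

At 24 fps, each frame lingers for roughly 42 milliseconds, which is brief enough for the brain to blend consecutive stills into smooth motion.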


The moiré pattern

A moiré pattern is an interference pattern that appears when a ruled pattern of opaque lines with transparent gaps is overlaid on another, similar pattern. Note that the two patterns must be similar but not identical.


Moiré pattern. (Photo Credit: Fibonacci/Wikimedia Commons)

A photograph of a computer screen looks odd because the screen is made of an array of tiny red, green and blue dots, which happen to be similar in size to the red, green and blue sensor elements in the camera. Overlaying these two fine grids forms a moiré pattern, which is why the photograph of a computer/TV screen looks as if it’s filled with arbitrary rainbow patterns (which are not really there on the screen).
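The size of those rainbow bands can be estimated with the classic moiré formula for two parallel line gratings: the moiré pitch is p1·p2 / |p1 − p2|. Here’s a short Python sketch of it; the pitch values are invented for illustration, not real hardware specs:

```python
# Classic moiré-pitch formula for two overlaid line gratings.
# p1, p2: line spacings of the screen's pixel grid and the camera
# sensor's photosite grid (made-up values, not real hardware specs).
def moire_pitch(p1: float, p2: float) -> float:
    """Spacing of the moiré bands produced by overlaying two gratings."""
    if p1 == p2:
        return float("inf")  # identical gratings produce no bands
    return (p1 * p2) / abs(p1 - p2)

# Two grids that differ by just 5% produce bands about 21x coarser
# than either grid -- coarse enough to be glaringly visible.
print(moire_pitch(1.00, 1.05))
```

This is why the rainbow swirls look so large and obvious in photos, even though the pixel and sensor grids themselves are far too fine to see.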

Refresh rate

Another reason behind that weird-looking picture of a computer screen is the refresh rate of the screen. For the uninitiated, the refresh rate is the number of times per second that a digital screen (desktop/laptop monitor, TV, etc.) redraws its image.

The higher the refresh rate of a screen, the more uniform the display looks and the less flicker you notice.


Modern televisions have very high refresh rates. (Photo Credit : Pixabay)

A digital screen is refreshed multiple times per second. Our eyes don’t catch this process (because the brain smooths it out to make the screen look consistent), but cameras do. That’s why any picture of a computer screen looks very different from the real thing.

Older screens were updated or ‘refreshed’ line by line, i.e., a scanning line swept across the entire breadth of the screen many times per second to build up an image. Of course, the scanning worked so fast that the naked eye couldn’t actually see an image being drawn. Fun fact: with a high-speed camera, you can actually capture the scanning lines as they paint the picture on the screen.

moire upward-movement

Line moiré with slow movement of the revealing layer upward. Isn’t this how old computer screens look in pictures and videos? (Photo Credit: Public Domain/Wikimedia Commons)

If you take a picture of a CRT screen, the camera captures only the part of the screen lit by the scanning line at that instant (whereas the brain does a lot of smoothing to make the screen look completely normal and uniform to us). That’s yet another reason why the picture of a screen looks nothing like the real thing.
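As a rough sketch of that effect, assuming a hypothetical CRT that scans 480 lines per 1/60-second refresh (invented numbers, not real camera code), here’s how a camera’s exposure time limits how much of the screen it catches lit:

```python
# Rough model: a CRT paints its lines top to bottom, once per refresh.
# A camera exposure only catches the lines painted during its window.
# All numbers here are invented for illustration.
TOTAL_LINES = 480       # scan lines per full refresh
REFRESH_S = 1 / 60      # one full top-to-bottom scan takes 1/60 s

def lines_captured(exposure_s: float) -> int:
    """Number of scan lines lit during a single camera exposure."""
    fraction = min(exposure_s / REFRESH_S, 1.0)
    return round(TOTAL_LINES * fraction)

print(lines_captured(1 / 1000))  # a 1 ms exposure catches only a thin band
print(lines_captured(1 / 60))    # a full-refresh exposure catches them all
```

A fast shutter freezes just a bright band, while a shutter as slow as the refresh interval catches the whole frame; that is why photographers deliberately match shutter speed to refresh rate when they want a screen to photograph cleanly.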


About the Author:

Ashish is a Science graduate (Bachelor of Science) from Punjabi University (India). He spends a lot of time watching movies, and an awful lot more time discussing them. He likes Harry Potter and the Avengers, and obsesses over how thoroughly Science dictates every aspect of life… in this universe, at least.
