Why Do Moiré Patterns Appear in Photos? A Scientific Explanation

Morie Team · Published on 1/8/2026 · 10 min read

What Is the Moiré Effect?

If you've ever taken a photo of a computer screen, a striped shirt, or a brick wall, you've likely encountered the strange, wavy, rainbow-colored patterns known as the Moiré effect. These patterns don't exist in the real world—they are an optical artifact created by the interaction between the subject and your camera. It can turn a perfect shot into a psychedelic mess that is distracting and unprofessional.

The term "Moiré" (pronounced mwahr-ay) originates from the French textile industry, describing a fabric with a watered or rippled appearance. While silk weavers created this effect intentionally for aesthetic beauty, in digital photography, it is almost always an unwanted error that photographers fight to avoid.

The Science: Interference Patterns

At its core, a Moiré pattern is an interference pattern. It occurs when two grid-like patterns overlap but aren't perfectly aligned. In photography, these two grids are:

  • The Subject's Pattern: The fine detail in the object you're photographing (e.g., the pixels on a screen, the threads in fabric, the lines of a skyscraper).
  • The Sensor's Pattern: The fixed grid of photosites (pixels) on your camera's digital sensor.

When the frequency of the subject's pattern is close to or exceeds the frequency of the sensor's pixel grid, the camera fails to resolve the detail accurately. This phenomenon is known as spatial aliasing. Instead of capturing the true fine lines, the sensor creates a new, lower-frequency wave pattern on top of the image—this is Moiré.
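The "new, lower-frequency wave" is easy to see numerically. The sketch below (all figures hypothetical) overlaps two square-wave gratings with slightly different pitches, 60 lines and 55 lines per unit, and finds the dominant coarse structure in the result: a slow 5-cycle beat that neither grid contains on its own. That beat is the Moiré pattern.

```python
import numpy as np

x = np.linspace(0, 1, 2000, endpoint=False)

# Two overlapping grids (square-wave gratings), slightly different pitch:
subject_grid = (np.sin(2 * np.pi * 60 * x) > 0).astype(float)  # 60 lines
sensor_grid = (np.sin(2 * np.pi * 55 * x) > 0).astype(float)   # 55 lines

# Light passes only where both grids are "open":
overlap = subject_grid * sensor_grid

# Find the dominant coarse structure in the combined pattern:
spectrum = np.abs(np.fft.rfft(overlap - overlap.mean()))
freqs = np.fft.rfftfreq(len(x), d=x[1] - x[0])
coarse = freqs < 20  # ignore the original fine gratings at 55 and 60
beat = freqs[coarse][np.argmax(spectrum[coarse])]
print(beat)  # 5.0: a new, slow 5-cycle pattern appears out of nowhere
```

The beat frequency is simply the difference between the two grid frequencies (60 − 55 = 5), which is why even tiny pitch mismatches produce large, highly visible ripples.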

The Nyquist-Shannon Sampling Theorem

To understand why this happens, we must look at signal processing theory. The Nyquist-Shannon sampling theorem states that to accurately reproduce a signal (in this case, visual detail), you must sample it at a rate at least twice the highest frequency present in the signal; this minimum rate is called the Nyquist rate.

In simple terms: if you want to capture 100 lines per inch of detail, your camera sensor needs to sample at least 200 times per inch. When the detail is finer than what the sensor can capture (violating the Nyquist limit), the "extra" detail doesn't just disappear; it folds back into lower spatial frequencies as false patterns: Moiré.
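This rule can be checked directly. The sketch below (numbers hypothetical) samples a 100-line-per-inch pattern twice: at 250 samples per inch, comfortably above the 200-per-inch Nyquist rate, the detail is recorded faithfully; at 150 samples per inch, the detail folds back and is recorded as a false 50-line pattern.

```python
import numpy as np

def recorded_frequency(f_detail, samples_per_inch, inches=1.0):
    """Frequency the sampled pattern appears to have (strongest FFT peak)."""
    positions = np.arange(0, inches, 1 / samples_per_inch)
    samples = np.sin(2 * np.pi * f_detail * positions)
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(positions), d=1 / samples_per_inch)
    return freqs[np.argmax(spectrum)]

print(recorded_frequency(100, 250))  # 100.0: above the Nyquist rate, captured correctly
print(recorded_frequency(100, 150))  # 50.0: under-sampled, aliases to 150 - 100
```

Note that the aliased result (50) is the mirror of the true frequency around the sampling rate, which is why aliased detail shows up as plausible-looking but entirely false coarse structure.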

Color Moiré vs. Luminance Moiré

There are two main types of Moiré you might see, and they often appear together:

  • Luminance Moiré: Appears as curved, wavy lines of light and dark. This affects the details and texture of the image. It looks like ripples in a pond and is often seen on architectural details or black-and-white fabrics.
  • Color Moiré: Appears as rainbow-colored banding (often purple, green, or yellow). This happens because of the Bayer filter array used on most digital sensors to capture color. The red, green, and blue pixels are arranged in a specific mosaic grid. When fine details interfere with this color grid, the camera "guesses" the wrong color for certain pixels, generating false color artifacts.
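The false-color mechanism can be shown with a toy Bayer example (layout and values are simplified and hypothetical). Below, a neutral-gray scene of stripes exactly one photosite wide happens to land so that every red photosite sees a bright stripe and every blue photosite sees a dark one. No demosaicing algorithm can recover the true gray from those samples, so the output is tinted.

```python
import numpy as np

# Neutral-gray scene: vertical stripes exactly one photosite wide.
height, width = 8, 8
scene = np.tile([1.0, 0.0], (height, width // 2))  # columns alternate bright/dark

# RGGB Bayer mosaic: each photosite records only one colour channel.
red = scene[0::2, 0::2]     # even rows, even columns -> always on a bright stripe
green_a = scene[0::2, 1::2] # green sites land on dark columns here
green_b = scene[1::2, 0::2] # ...and on bright columns here
blue = scene[1::2, 1::2]    # odd rows, odd columns -> always on a dark stripe

print(red.mean(), blue.mean())  # 1.0 0.0: the gray subject looks strongly coloured
```

In a real camera the stripes drift in and out of phase with the mosaic across the frame, which is why color Moiré appears as shifting rainbow bands rather than one uniform tint.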

The Role of Optical Low-Pass Filters (OLPF)

For many years, camera manufacturers fought Moiré in hardware by installing an Optical Low-Pass Filter (OLPF), also known as an anti-aliasing filter, directly in front of the sensor.

This filter works by intentionally blurring the incoming image just slightly—enough to smooth out the finest details that would cause aliasing, but (hopefully) not enough to ruin the photo's sharpness. It effectively cuts off the high frequencies before they reach the sensor.
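The principle can be sketched numerically (all values hypothetical, and a moving-average blur is only a crude stand-in for a real birefringent OLPF): pre-blurring a too-fine pattern with a kernel about one sensor pixel wide removes most of the energy that would otherwise alias when the sensor samples it.

```python
import numpy as np

scene_rate = 1000  # finely resolved "scene", samples per unit
step = 20          # sensor reads every 20th point -> 50 samples per unit
f_detail = 48.0    # fine detail beyond the sensor's Nyquist limit of 25

x = np.arange(0, 4, 1 / scene_rate)
scene = np.sin(2 * np.pi * f_detail * x)

def strongest_recorded_peak(signal):
    """Magnitude of the largest FFT peak after the sensor samples the signal."""
    samples = signal[::step]
    return np.abs(np.fft.rfft(samples)).max()

# Crude stand-in for an OLPF: a moving-average blur about one sensor pixel wide.
kernel = np.ones(21) / 21
blurred = np.convolve(scene, kernel, mode="same")

print(strongest_recorded_peak(scene))    # large peak: a false 2-cycle Moiré pattern
print(strongest_recorded_peak(blurred))  # tiny: the detail was removed before sampling
```

The trade-off in the real filter is the same as in the sketch: the blur that suppresses the 48-cycle detail also slightly softens legitimate detail just below the Nyquist limit.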

However, in the quest for maximum sharpness and resolution, many modern high-resolution cameras (several Nikon Z and Sony Alpha R models, for example) omit or weaken the OLPF, relying on high megapixel counts to out-resolve fine patterns. While this yields sharper images, it means Moiré is making a comeback, especially in fashion and architectural photography.

Why It's Hard to Fix

Moiré patterns are baked into the raw data of the image. Because the sensor literally "saw" the wrong pattern, removing it requires complex mathematical reconstruction of what the image should have looked like. It's not as simple as "removing a blemish." You are trying to separate the false pattern from the real texture.

This is why traditional editing tools often struggle—they tend to blur the texture to hide the pattern. AI-powered solutions like Morie are becoming the standard because they can "hallucinate" or reconstruct the likely original texture while subtracting the geometric interference pattern.
