
Why low-light events, like the Northern Lights, often look better through your phone’s camera


This article originally appeared on The Conversation.

Smartphone cameras have improved significantly in recent years. Computational photography and AI allow these devices to capture stunning images that can surpass what we see with the naked eye. Pictures of the Northern Lights, or aurora borealis, provide a particularly striking example.

If you saw the Northern Lights during the geomagnetic storms in May 2024, you may have noticed that your smartphone made the photos look even more vivid than the reality.

Auroras, known as the Northern Lights (aurora borealis) or Southern Lights (aurora australis), occur when the solar wind disturbs Earth's magnetic field. They appear as streaks of color across the sky.

The left side shows the aurora as seen with the naked eye. The right side shows how a smartphone camera can capture brighter and more colorful light. Douglas Goodwin

What makes photos of these events even more striking than the events appear to the naked eye? As a professor of computational photography, I've seen how the latest smartphone features overcome the limitations of human vision.

Your eyes in the dark

Human eyes are remarkable. They allow you to spot footprints in a sun-drenched desert and to drive vehicles at high speeds. However, in low light your eyes perform less impressively.

Human eyes contain two types of cells that respond to light – rods and cones. Rods are more numerous and far more sensitive to light. Cones process color but require more light to function. As a result, our night vision is highly rod-dependent and lacks color.

Rods and cones in your eyes are photoreceptors that handle black-and-white and color vision, respectively. Blume, C., Garbazza, C. & Spitschan, M., CC BY-SA

The result is like wearing dark sunglasses when watching a movie. At night, the colors look washed out and muted. Likewise, under a starry sky, the vibrant hues of the Northern Lights are present, but often too faint for your eyes to see clearly.

In low light, your brain prioritizes motion detection and shape recognition to help you navigate. This trade-off means that the ethereal colors of the aurora are often invisible to the naked eye. Technology is the only way to bring them out clearly.

Taking the perfect photo

Smartphones have revolutionized the way people capture the world. These compact devices use multiple cameras and advanced sensors to collect more light than the human eye can, even in dim conditions. They achieve this through longer exposure times – how long the camera records light – larger apertures and higher ISO, the sensor's sensitivity to light.
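As a rough illustration of how those three settings trade off, the light a camera records scales with exposure time, aperture area (proportional to 1/f²) and ISO gain. The sketch below is a simplified model with illustrative numbers, not any phone's actual metering logic; the `relative_exposure` helper and the specific shutter, f-number and ISO values are assumptions for the example.

```python
def relative_exposure(shutter_s: float, f_number: float, iso: float) -> float:
    """Relative light recorded: exposure time x aperture area (~1/f^2) x ISO gain."""
    return shutter_s * (1.0 / f_number**2) * iso

# A typical handheld daytime shot vs. a hypothetical multi-second night-mode exposure.
daytime = relative_exposure(shutter_s=1 / 120, f_number=1.8, iso=100)
night = relative_exposure(shutter_s=3.0, f_number=1.8, iso=1600)

print(f"Night mode records about {night / daytime:.0f}x more signal")
```

Because the aperture is the same in both shots, it cancels out of the ratio: a 360-times-longer shutter combined with 16 times the ISO gain yields roughly 5,760 times more recorded signal.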

But smartphones do more than adjust these settings. They also use computational photography to enhance your images using digital techniques and algorithms. Image stabilization reduces camera shake and exposure settings optimize the amount of light the camera captures.

Image stacking builds a better photo by combining multiple exposures of the same scene. A feature called night mode can balance colors in low light, while LiDAR capabilities on some phones keep your images sharply in focus.

Image stacking aligns and combines several noisy photos to improve the quality of the final image. Averaging these images together suppresses random sensor noise, resulting in a clearer and more detailed image than any single photo alone. Douglas Goodwin
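The averaging step described above can be sketched in a few lines of NumPy. This is a toy example with synthetic noise on a uniform "glow", not any phone's actual pipeline, and it skips the alignment step by generating frames that are already registered: averaging N frames reduces random noise by roughly the square root of N.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((64, 64), 0.2)  # a faint, uniform aurora-like glow
# Simulate 16 noisy exposures of the same (pre-aligned) scene.
frames = scene + rng.normal(0.0, 0.1, size=(16, 64, 64))

stacked = frames.mean(axis=0)  # combine by averaging

noise_single = (frames[0] - scene).std()   # about 0.10
noise_stacked = (stacked - scene).std()    # about 0.10 / sqrt(16) = 0.025
print(f"noise reduced {noise_single / noise_stacked:.1f}x")
```

With 16 frames the noise drops by about a factor of four, which is why night modes capture bursts of exposures rather than one long one.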

LiDAR stands for light detection and ranging, and phones with this setting emit laser pulses to quickly calculate distances to objects in the scene in any kind of light. LiDAR generates a depth map of the environment to improve focus and make objects stand out in your photos.
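The ranging arithmetic behind LiDAR is straightforward: a pulse travels to an object and back at the speed of light, so the distance is the round-trip time multiplied by c, divided by two. A minimal sketch (the 20-nanosecond round trip is an illustrative value, not a measurement from any real device):

```python
SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to an object from a laser pulse's round-trip travel time."""
    return SPEED_OF_LIGHT * t_seconds / 2

# A pulse that returns after 20 nanoseconds bounced off something ~3 m away.
d = distance_from_round_trip(20e-9)
print(f"{d:.2f} m")
```

Repeating this measurement across a grid of pulses is what produces the depth map described below.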

Smartphone cameras not only capture flat images, they also collect depth information. The left side shows a regular photo, while the right side illustrates the depth map, with lighter pixels closer to the camera and darker pixels further away. Normally hidden, this depth data allows smartphones to apply effects such as artificial background blur to recreate the appearance of the Northern Lights against a night sky. Douglas Goodwin

Artificial intelligence tools in your smartphone camera can further enhance your photos by optimizing settings, adjusting lighting and using super-resolution techniques to bring out fine details. They can even identify faces in your photos.

AI processing in your smartphone’s camera

While you can do a lot with a smartphone camera, dedicated cameras have larger sensors and superior optics, giving you more control over the images you take. Camera manufacturers such as Nikon, Sony and Canon generally avoid tampering with the image; instead, they let the photographer retain creative control.

These cameras offer photographers the flexibility of shooting in raw format, which retains more data from each image for editing and often produces higher-quality results.

Unlike dedicated cameras, modern smartphone cameras use AI both while and after you take a photo to improve its quality. As you take a photo, AI tools analyze the scene you're pointing the camera at and adjust settings such as exposure, white balance and ISO, while recognizing the subject you're photographing and stabilizing the image. These adjustments ensure that you get a great photo when you press the button.

You can often find features that use AI, such as high dynamic range, night mode and portrait mode, enabled by default or accessible through your camera settings.

AI algorithms further enhance your photos by refining details, reducing blur, and applying effects like color correction after you take the photo.

All these features help your camera take photos in low light, and they contributed to the stunning aurora photos you may have captured with your phone camera.

While the human eye struggles to fully appreciate the otherworldly hues of the Northern Lights at night, modern smartphone cameras overcome this limitation. Using AI and computational photography techniques, your devices allow you to see the bold colors of solar storms in the atmosphere, amplifying the color and capturing otherwise invisible details that even the sharpest eye will miss.

Disclosure: Douglas Goodwin receives funding from the Fletcher Jones Foundation through Scripps College.