I've been thinking and writing a lot about instant photography as paranormal evidence over the last week or so, and in that time I've come across a number of articles about how digital photography, particularly smartphone photography, is beginning to feel less and less reliable. Two news stories in particular have broken that suggest, in some circumstances, you can't trust the pictures you take on your phone.
The first story is about Samsung's photo "enhancements" (much more on that down below). The second is about how people thought they were cropping out or redacting sensitive information on screenshots on their Google Pixel phones, because the images looked cropped or redacted. Years later, it's been revealed that the "redacted" data was still present in the file and could be retrieved, meaning that credit card numbers, names, addresses, and other sensitive information have been compromised. (By the way, a second "acropalypse" bug has now been found on Windows devices, as well—another strike against feeling like you can trust the images that you see on your devices. Images aren't quite what they seem.)
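The mechanism behind the bug is mundane: the crop tool reportedly wrote the new, smaller image over the original file without truncating it, so the tail end of the original data stayed on disk. This is not Google's actual code, just a minimal sketch of that kind of file-handling mistake, using placeholder byte strings instead of real PNG data:

```python
# Minimal sketch (NOT Google's code) of the file-handling mistake
# behind the "acropalypse" bug: writing a smaller "cropped" file over
# the original without truncating leaves the old bytes recoverable.
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "screenshot.png")

# Original "screenshot" with sensitive data past the visible area.
original = b"PNG_HEADER|visible_part|SECRET_CARD_NUMBER"
with open(path, "wb") as f:
    f.write(original)

# Buggy "crop": open the existing file without truncating ("r+b"),
# then write only the smaller, cropped image from the start.
cropped = b"PNG_HEADER|visible_part"
with open(path, "r+b") as f:
    f.write(cropped)  # the old trailing bytes are NOT removed

with open(path, "rb") as f:
    on_disk = f.read()
print(len(on_disk) > len(cropped))       # True: file is still full length
print(b"SECRET_CARD_NUMBER" in on_disk)  # True: "redacted" data survives
```

An image viewer reading only the cropped portion shows a properly cropped picture, which is why the bug went unnoticed for so long; the leftover bytes only matter to someone who inspects the raw file.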
Before I get into the Samsung controversy, I want to elaborate a bit more on some reasons why instant photography feels so trustworthy, particularly in contrast to the mysterious ways our phones can twist reality in the photographs we take.
Instant photos feel real
Like I mentioned when I wrote about instant photos as paranormal evidence, Polaroid photos are extremely physical. They take a moment and immediately give you a keepsake of it, a physical reminder of where you just were, who you were with, or what just occurred. I love bringing my Polaroid or Fujifilm camera on trips, because it's nice to have that physical souvenir of a place, rather than just a bunch of smartphone photos.
So when trying to create a record of something as insubstantial as a ghost, of course it makes sense to want to do that through a physical means. Because, again, instant photographs allow you to take a particular moment in time—something that can only be experienced by being there physically—and turn it into an artifact immediately.
I think there's something in the desire to try to capture a non-corporeal entity like a ghost in an incredibly physical and immediate form of media. It almost feels like a way to "prove" the existence of a ghost.
In addition to being harder to fake than digital photography, a Polaroid of a ghost or paranormal phenomenon translates an insubstantial thing into a very real-feeling photograph. It literally takes the image of the ghost from the theoretical, invisible, untouchable realm of the unknown and turns it into a physical photograph that you can hold. The desire to catalog your paranormal experiences using Polaroids makes complete sense. If you see a potentially paranormal anomaly in your instant photo, the phenomenon feels more real because it was captured in the picture.
Fakery in smartphone photography
On the other end of the spectrum, there's computational photography, which can modify the images we take with smartphone cameras in various ways. An article in The Verge sums up the recent Samsung controversy well:
This week, Samsung drew criticism for the technology its newer phones use to “enhance” photos of the Moon. A user on Reddit, ibreakphotos, conducted an experiment by creating a blurred photo of the Moon and then taking a picture of it using their Galaxy S23 Ultra. Even though the photo was completely blurry, their Samsung device appeared to add details to the image that weren’t there before, like craters and other marks, calling into question whether the highly detailed Moon photos people have been taking with their Galaxy devices really are photos of the Moon.
The Verge article is a fascinating read; not only does it document the Samsung moon-augmentation scandal, but it also talks about how many, many images of the moon that we see have been modified. Part of that is because it's so easy to do these days:
And while faking the night sky once involved “sandwiching negatives, doing things in the darkroom,” as Nordgren says, it’s become far easier and more prevalent in the age of Photoshop.
“One of the biggest things people do is sky replacements,” Lynsey Schroeder, a professional astrophotographer tells The Verge. “They’ll take the Milky Way from a different photo and Photoshop it in so that it looks like it was there.” An expert would immediately know that it’s fake. “But to the general public, they don’t know.”
As someone who's reworked plenty of photos in Photoshop, I can say that this sort of photo manipulation is trivially easy. Like I've mentioned before, as popular apps like Facetune allow people to modify photos on their mobile devices, people have learned to trust digital photography less and less.
But Samsung's wholesale replacement of the moon in photos—using a "deep-learning-based AI detail enhancement engine"—strikes me as a step beyond that. (Samsung has apparently been using AI in their cameras since the Galaxy S10, and their "Scene Optimizer" technology since the Galaxy S21 series. Though I can tell you that pictures of the moon on my Galaxy S22+ still look like garbage. So they've clearly made some major changes for their latest devices. Either that, or I guess I gotta try using my phone's 100x zoom, which I had no idea existed.)
It's one thing for someone to decide to modify their own photographs; it's another for apps themselves to rework images in the process of capturing them.
In the case of someone photographing the moon and getting a completely different image, there was never a "real," unedited version of the image. You can't revert to an original; the edited version is the only one that exists.
Samsung isn't the only company that has introduced "computational photography" into its cameras. Apple's Live Photos and Portrait mode could be considered computational photography, but as AppleInsider points out, "users are beginning to ask where to draw the line between these algorithms and something more intrusive, like post-capture pixel alteration."
There are so many questions that this raises, but the question of memory resonates most with me. Many people (myself included) use smartphone photos as an aide-mémoire. I'll often take pictures not because something is beautiful or because I'm expressing myself artistically, but because I want to remember something. I'm not going to post that image to Instagram, but I will scroll back in my phone, see the timestamped, unaesthetic mirror selfie in a venue bathroom, and think "oh, right, that's the day that I went to that concert."
For me, the visual information that I collect in the form of photos is more for constructing and preserving my memories than anything else. So my question is: If our everyday smartphone photos help us remember reality and our pasts, what happens when, unbeknownst to us, our cameras are modifying the images? In that case, it becomes a form of memory modification. At that point, you aren't the arbiter of your memories; the images on your phone can override your recollections. As AppleInsider eloquently puts it, "the final image doesn't represent what the sensor detected and the algorithm processed. It represents an idealized version of what might be possible but isn't because the camera sensor and lens are too small."
There's something truly chilling about that.
The AppleInsider article goes on:
By changing how the moon appears using advanced algorithms without alerting the user, that image is forever altered to fit what Samsung thinks is ideal. Sure, if users know to turn the feature off, they could, but they likely won't.
So here we are, in a place where large tech corporations have the power to override reality—and perhaps even our very memories. No wonder instant photography, despite its limitations, can feel like a more reliable way to access paranormal realities.
If smartphone cameras are increasingly depicting "idealized" images of the world, smoothing out anomalies and removing variations from what a computer might consider "normal," what does that mean for paranormal photography? Is it possible that phone cameras might capture paranormal phenomena, but the AI in the phone's camera wipes that out, replacing it with "expected" reality? Or could strangeness seep in anyway, through synchronicity and glitches?