Every time we have a UAP thread, more people come forward with their sightings. We need high-resolution cameras pointed up, not down at us, with machine-vision software to detect the characteristic movements, and high-magnification telephoto lenses on gimbal mounts to automatically zoom in and get detailed pictures. Under citizen control, not government, so the data doesn't get hidden. We have the technology, we just need to do it.
Why do pictures of UAPs taken at night tend not to reflect what your naked eye can see, even with a sophisticated modern camera? Is it because they're spiritual entities? Some sort of fancy cloaking? No, it's down to exposure and light metering.
Cameras have a uniform exposure value across the frame. Three variables determine exposure: aperture, shutter speed, and ISO. In film cameras, ISO is the actual chemical light sensitivity of the film. In digital cameras, ISO is the degree of amplification of the signal coming off the sensor. This isn't the best or the only way to design a camera; it's just simple.
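The trade-off between the three variables can be sketched with the standard exposure-value formula (the specific f-stop/shutter/ISO numbers below are just illustrative):

```python
import math

def exposure_value(aperture_f, shutter_s, iso):
    """Exposure value referenced to ISO 100 (the standard EV formula):
    EV = log2(N^2 / t) - log2(ISO / 100).
    Two settings with the same EV produce the same overall image brightness.
    """
    return math.log2(aperture_f ** 2 / shutter_s) - math.log2(iso / 100)

# Doubling the shutter time (more light) while halving the ISO (less gain)
# lands on the same exposure value:
ev_a = exposure_value(2.0, 1/60, 100)
ev_b = exposure_value(2.0, 1/30, 50)
```

But whichever combination you pick, it's one EV applied uniformly to every pixel in the frame, which is exactly the limitation at issue here.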
The human eye does not have that limitation. The pupil aperture is fixed over short periods, but the retina has several mechanisms which adjust its sensitivity to light, and it doesn't have to be uniform.
Now if you want to photograph UAPs at night, you need to practice on similar subjects: high-dynamic-range scenes, where a bright object sits against a dark background. The moon, a streetlight, the exposed filament of an incandescent bulb. One of my favorites for testing is a multi-LED safety light propped on a shelf in a dark room. It's not that bright, but it still illustrates the problem perfectly.
With your eye, you can see every LED and the construction of the light in detail, and also the background of the room illuminated by it. The camera just sees a flare. No detail, you can't even see the individual LEDs.
Unless you are shooting in full manual mode, the camera is performing light metering, and determining the three exposure variables algorithmically based on that. By default, it's using a metering mode which attempts to optimize exposure over the entire field, which means the small bright object is overexposed and you can't make out any detail. Switch to spot metering mode, put the spot on the brightest part of an LED, and try again. Now you can see all the LEDs, but the background is underexposed and completely black, and the detail on the light still isn't what you can see with your naked eye.
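The difference between the two metering modes can be shown with a toy simulation (pure Python, made-up luminance numbers; real meters are far more sophisticated):

```python
def expose(scene, metered_luminance, midgray=0.18):
    """Scale the whole scene so the metered luminance lands on mid-gray,
    then clip to the sensor's 0..1 range. One uniform gain for every pixel,
    just like a real sensor."""
    gain = midgray / metered_luminance
    return [min(1.0, lum * gain) for lum in scene]

# Toy scene: one tiny bright LED, 1000x brighter than the dark room around it.
scene = [0.001] * 99 + [1.0]

# Average metering: the dark background dominates the meter reading, the gain
# comes out huge, and the LED clips to pure white -- a detail-free flare.
avg = expose(scene, sum(scene) / len(scene))

# Spot metering on the LED: the LED lands on mid-gray with detail intact,
# but the background is crushed to near-black.
spot = expose(scene, max(scene))
```

Either way, one gain value has to serve the whole frame, so something gets sacrificed.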
Now you see why the human eye works the way it does. For actually finding your way around the environment and extracting maximum usable detail from high-dynamic-range scenes, varying gain across the sensor is the way to go. There is no reason why we can't design digital cameras to do this.
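A crude sketch of what "varying gain across the sensor" could mean in software: per-pixel gain, loosely like the retina's local adaptation. (This is local tone mapping in its most naive form; a real implementation would smooth the gain over neighborhoods to avoid artifacts, and the numbers here are illustrative.)

```python
def local_gain(scene, midgray=0.18, max_gain=200.0):
    """Per-pixel gain: push every pixel toward mid-gray, with the gain capped
    so noise in the darkest areas isn't amplified without limit."""
    out = []
    for lum in scene:
        gain = min(max_gain, midgray / max(lum, 1e-6))
        out.append(min(1.0, lum * gain))
    return out

# Same toy scene as before: a bright LED against a dark room.
scene = [0.001] * 99 + [1.0]
mapped = local_gain(scene)
# Now both the LED and the background land near mid-gray:
# nothing clips to white, nothing crushes to black.
```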
These cameras already have so many modes intended to optimize for specific subjects. Pet mode, portrait mode, sunset mode, gourmet mode. And auto-tracking autofocus for sports action photography. They need a UAP mode. Display a circle in the center of the viewfinder. Lock onto the bright object within the circle and use that exclusively for metering and autofocus. Then automatically shoot a second picture with full-frame light metering to capture the context. This is similar to what photographers already have to do to capture the moon and the landscape together. It's done by shooting two frames on a tripod and combining them in Photoshop, since you can't expose both properly at once.
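The two-frame trick can be sketched as a naive merge (hypothetical and simplified; real HDR merging weights pixels by estimated radiance and aligns the frames first):

```python
def merge_exposures(background_frame, subject_frame, clip=0.99):
    """Naive two-exposure merge: take each pixel from the frame exposed for
    the background, but fall back to the frame exposed for the bright subject
    wherever the background frame has clipped to white."""
    return [s if b >= clip else b for b, s in zip(background_frame, subject_frame)]

# Frame metered for the background (the bright subject blows out) plus
# a frame spot-metered on the subject (detail preserved, background dark):
background_exposed = [0.4, 0.5, 1.0, 1.0]
subject_exposed    = [0.0, 0.0, 0.6, 0.7]
merged = merge_exposures(background_exposed, subject_exposed)
# merged == [0.4, 0.5, 0.6, 0.7]
```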
You will also need a lot of zoom to see any detail. But again, automated telephoto lenses on a gimbal mount. We already have the technology, it could be capturing them all night long.
The human eye, as usual, isn't evolved to give you absolute data; it's evolved for object recognition and usability. You have no idea how wide the scene's dynamic range actually is, because your visual system is applying flexible dynamic-range compression on the fly.
(And it may be out of focus, too. I usually walk around with center autofocus selected, so I can be sure it'll focus on what I want. "Smart" full-field autofocus has a tendency to pick out-of-center objects you aren't remotely interested in. Set up a custom mode with center autofocus and spot metering, and leave that as the default, or bind it to a back-panel button so you can call it up instantly.)
This post has been edited by Necromusume: Feb 16 2023, 23:07