Comment by ge96 5 days ago

That fake zoom with AI is gross ugh

If I'm taking a picture of something I want it to be real light-to-pixel action not some made up wambo-jambo

hatthew 5 days ago

I find it kinda scary that this is marketed as "zoom" and "recovering details", when the reality is that it quite literally makes stuff up and hopes you won't notice the difference. You and I know that it's completely fake, but we (or at least I) don't even know how much is faked, and probably 99% of people won't even know that it's fake at all.

How long until someone gets arrested because an AI invented a face that looks like theirs? Hopefully lawyers will know to throw out evidence like that, but the social media hivemind will happily ruin someone's life based on AI hallucinations.

6thbit 5 days ago

It becomes misleading to even keep calling it "Zoom".

More like "Interpolation" with a pinch of hallucination. I can see this becoming a thing though; it is, after all, the mythical 'zoom & enhance' from CSI...

  • buu700 5 days ago

    I actually think it's a cool feature, but it shouldn't be called "zoom". "Zoom & Enhance" would make sense. The UI should also have a clear visual indicator of which modes are pure optical zoom, which (if any) are substantially just cropping the image, and which are using genAI.

    • 6thbit 5 days ago

      agree. allow toggling between the blurry pixels and enhanced version and we're golden.

    • mycall 4 days ago

      Also, over time when all of these enhanced photos are posted online and used for future model training, Pixel 13 Enhance might do even worse.

epolanski 5 days ago

> If I'm taking a picture of something I want it to be real light-to-pixel action not some made up wambo-jambo

Then don't take pictures with phones, because it's been like that for more than half a decade at this point, even on midrange phones.

cameronh90 5 days ago

Digital has never been light-to-pixel.

At a minimum, you have demosaicing, dark frame subtraction, and some form of tone mapping just to produce anything you'd recognise as a photo. Then producing a half-way acceptable image involves denoising, sharpening, dewarping, chromatic aberration correction - and that just gets us up to what was normal at the turn of the millennium. Nowadays, without automatic bracketing and stacking, digital image stabilisation, rolling shutter reduction, and much more, you're going to have pretty disappointing phone pics.
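
Just to make the demosaicing point concrete, here's a toy bilinear demosaic in NumPy (assuming an RGGB Bayer layout; the function name and kernels are my own, and real pipelines use far smarter edge-aware filters, so treat this as a sketch rather than what any actual camera does):

    import numpy as np
    from scipy.ndimage import convolve

    def demosaic_bilinear(raw):
        # Naive bilinear demosaic of an RGGB Bayer mosaic (2D float array).
        h, w = raw.shape
        r_mask = np.zeros((h, w))
        b_mask = np.zeros((h, w))
        r_mask[0::2, 0::2] = 1          # red sensor sites
        b_mask[1::2, 1::2] = 1          # blue sensor sites
        g_mask = 1 - r_mask - b_mask    # green sensor sites

        # Fill in each colour's missing samples by averaging its neighbours.
        k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], float) / 4
        k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]], float) / 4

        r = convolve(raw * r_mask, k_rb)
        g = convolve(raw * g_mask, k_g)
        b = convolve(raw * b_mask, k_rb)
        return np.dstack([r, g, b])

Two thirds of the colour values in the output never came off the sensor at all; they're interpolated from neighbours before any of the fancier processing even starts.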

I suspect you're trying to draw a distinction between the older, predictable techniques for turning sensor data into an image and the modern impenetrable ones that can hallucinate. I know what you're getting at, but there's not really a clear point where one becomes the other. You can consider demosaicing and "super-res zoom" both to be super-resolution techniques intended to convert large amounts of raw sensor data into an image that's closer to the ground truth. I've even seen some pretty crazy stuff introduced by an old-fashioned Lanczos-resampling based demosaicing filter. Albeit not Ryan Gosling[0].
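
To show what I mean about classical filters introducing stuff, here's a rough 1D Lanczos-3 resampling sketch (again just illustrative code of my own, not any camera's pipeline) that overshoots on a hard edge because of the kernel's negative lobes:

    import numpy as np

    def lanczos_kernel(x, a=3):
        # Windowed sinc: sinc(x) * sinc(x/a) for |x| < a, else 0.
        x = np.asarray(x, float)
        return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)

    def lanczos_resample(samples, positions, a=3):
        # Interpolate integer-indexed samples at fractional positions.
        out = np.empty(len(positions))
        for i, pos in enumerate(positions):
            taps = np.arange(int(np.floor(pos)) - a + 1, int(np.floor(pos)) + a + 1)
            w = lanczos_kernel(pos - taps, a)
            vals = samples[np.clip(taps, 0, len(samples) - 1)]
            out[i] = np.dot(w, vals) / w.sum()
        return out

    edge = np.array([0.0] * 8 + [1.0] * 8)              # hard dark-to-bright edge
    fine = np.linspace(0, len(edge) - 1, 8 * len(edge))
    up = lanczos_resample(edge, fine)
    print(up.min(), up.max())  # dips below 0 and overshoots past 1: ringing

Those over- and undershoots show up as halos around high-contrast edges - "detail" that was never in the scene, just from a fixed filter rather than a model.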

Of course, if you don't like any of this, you can configure phones to produce RAW output, or even pick up a mirrorless and take full control of the processing pipeline. I've been out of the photography world for a while so I'm probably out of date now, but I don't think DNGs can even store all of the raw data that is now used by Apple/Google in their image processing pipelines. Certainly, I never had much luck turning those RAW files into anything that looked good. Apple have ProRAW, which I think is some sort of hybrid format, but I don't really understand it.

[0] https://petapixel.com/2020/08/17/gigapixel-ai-accidentally-a...

  • hatthew 5 days ago

    By my understanding, demosaicing almost always just "blurs" the photo slightly, reducing high-frequency information. Tone mapping is unavoidable, invisible to most people, and usually doesn't change the semantic information within an image (the famous counterexample is of course The Dress). Phone cameras in recent years do additional processing to saturate, sharpen, apply HDR, etc., and I find those distasteful and will happily argue against them. But AI upscaling/enhancement is a step further, and to me feels like a very big step further. It's the first time that an automatic processing step has a very high risk of introducing new (and often incorrect) semantic information that is not available in the original image, the classic example being the Samsung moon.

  • ge96 5 days ago

    It's just crazy, that demo they show. Imagine the vehicle is actually a truck, but you zoom in and it becomes a Porsche...

    Conspiracy tangent: try to take a picture of something you're not supposed to and your phone won't let you, ha. Money could be an example, and I get the reason for that one (it's printers that do it, but same idea).

  • atomicthumbs 5 days ago

    the car looks mutated and slimy. most stuff that used computational photography before now didn't invent things from whole cloth

amcgivern 4 days ago

Agreed, I was recently looking through photos from my Pixel 5 and realized that it had "helpfully" modified an image of a bunny, making the fur look wavy and adding a weird faded-out eye on the body. On top of that, it applies skin smoothing to every photo, even with the beauty filter disabled. It irked me enough that I'm looking for a new third-party camera app. Despite using exclusively Pixel/Nexus phones for the last 12 years, I might abandon them entirely once this phone dies.

racktash 5 days ago

Agreed, not a fan. The world has enough fakery in it already without people accidentally generating even more (I assume quite a lot of casual users will mistake this zoom for zoom in the traditional sense).

gdbsjjdn 5 days ago

It's giving "Samsung fake moon". If generative AI is going to make up details, why bother zooming in? You could just ask it to make up a whole AI slop picture.

  • Spivak 5 days ago

    This does seem to actually, ya know, do the upscaling, though, instead of clumsily faking it. Like yeah, it's AI with AI failure modes, but upscaling models are quite good and take fewer 'weird artistic' liberties than imagegen models.

  • ge96 5 days ago

    The other thing that was annoying me with a cheap phone I bought: it was applying this generic surface to your face so your face didn't have pores, but it looked wrong.