janeway 2 hours ago

This topic is fascinating to me. The Toy Story film workflow is a perfect illustration of intentional compensation: artists pushed greens in the digital master because 35 mm film would darken and desaturate them. The aim was never neon greens on screen, it was colour calibration for a later step. Only later, when digital masters were reused without the film stage, did those compensating choices start to look like creative ones.

I run into this same failure mode often. We introduce purposeful scaffolding in the workflow that isn’t meant to stand alone, but exists solely to ensure the final output behaves as intended. Months later, someone is pitching how we should “lean into the bold saturated greens,” not realising the topic only exists because we specifically wanted neutral greens in the final output. The scaffold becomes the building.

In our work this kind of nuance isn’t optional, it is the project. If we lose track of which decisions are compensations and which are targets, outcomes drift badly and quietly, and everything built after is optimised for the wrong goal.

I’d genuinely value advice on preventing this. Is there a good name or framework for this pattern? Something concise that distinguishes a process artefact from product intent, and helps teams course-correct early without sounding like a semantics debate?

  • _bent an hour ago

    I know you're looking for something more universal, but in modern video workflows you'd apply a chain of color transformations on top of the final composited image to compensate for the display you're working with.

    So I guess try separating your compensations from the original work, and create a workflow that automatically applies them.
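    A minimal sketch of that separation, assuming a Python/numpy pipeline (the transforms, names, and factors here are invented for illustration): keep the master neutral, and keep every compensation as a labelled step in an ordered chain that is applied only at delivery.

```python
import numpy as np

def gamma_compensation(gamma):
    """Return a transform compensating for a display's gamma."""
    def apply(img):
        return np.clip(img, 0.0, 1.0) ** (1.0 / gamma)
    return apply

def boost_green(factor):
    """Return a transform scaling the green channel."""
    def apply(img):
        out = img.copy()
        out[..., 1] = np.clip(out[..., 1] * factor, 0.0, 1.0)
        return out
    return apply

# The master stays neutral; compensations live in a named, ordered chain
# that is applied only when rendering a deliverable.
compensations = [
    ("film-print green boost", boost_green(1.2)),
    ("review-monitor gamma", gamma_compensation(2.2)),
]

def deliver(master, chain):
    out = master
    for name, transform in chain:  # each step is labelled, so intent stays traceable
        out = transform(out)
    return out
```

    Because each compensation is named, nobody later mistakes "film-print green boost" for a creative target.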

  • ilamont 14 minutes ago

    There’s an analog analogue: mixing and mastering audio recordings for the devices of the era.

    I first heard about this when reading an article or book about Jimi Hendrix making choices based on what the output sounded like on AM radio. Contrast that with the contemporary recordings of The Beatles, in which George Martin was oriented toward what sounded best in the studio and home hi-fi (which was pretty amazing if you could afford decent German and Japanese components).

    Even today, after digital transfers and remasters and high end speakers and headphones, Hendrix’s late 60s studio recordings don’t hold a candle to anything the Beatles did from Revolver on.

  • davidalayachew 33 minutes ago

    Isn't the entire point of "reinventing the wheel" to address this exact problem?

    This is one of the tradeoffs of maintaining backwards compatibility and stewardship -- you are required to keep track of each "cause" of that backwards compatibility. And since the number of "causes" can quickly become innumerable, that's usually what prompts people to reinvent the wheel.

    And when I say reinvent the wheel, I am NOT describing what is effectively a software port. I am talking about going back to ground zero, and building the framework from the ground up, considering ONLY the needs of the task at hand. It's the most effective way to prune these needless requirements.

  • vodou an hour ago

    Do you have some concrete or specific examples of intentional compensation or purposeful scaffolding in mind (outside the topic of the article)?

  • Gravityloss an hour ago

    Theory: Everything is built on barely functioning ruins with each successive generation or layer mostly unaware of the proper ways to use anything produced previously. Ten steps forward and nine steps back. All progress has always been like this.

  • snarfy 36 minutes ago

    It seems pretty common in software - engineers not following the spec. Another thing that happens is the pivot: you realize the scaffolding is what everyone wants and sell that instead. The scaffold becomes the building, and also the product.

  • layer8 30 minutes ago

    Chesterton’s Fence is a related notion.

  • RedNifre an hour ago

    "Cargo cult"? As in, "Looks like the genius artists at Pixar made everything extra green, so let's continue doing this, since it's surely genius."

  • pbronez an hour ago

    That’s a great observation. I’m hitting the same thing… yesterday’s hacks are today’s gospel.

    My solution is decision documents. I write down the business problem, background on how we got here, my recommended solution, alternative solutions with discussion about their relative strengths and weaknesses, and finally an executive summary that states the whole affirmative recommendation in half a page.

    Then I send that doc to the business owners to review and critique. I meet with them and chase down ground truth. Yes it works like this NOW but what SHOULD it be?

    We iterate until everyone is excited about the revision, then we implement.

    • randallsquared 26 minutes ago

      There are two observations I've seen in practice with decision documents: the first is that people want to consume the bare minimum before getting started, so such docs have to be very carefully written to surface the most important decision(s) early, or otherwise call them out for quick access. This often gets lost as word count grows and becomes a metric.

      The second is that excitement typically falls with each iteration, even while everyone agrees that each is better than the previous. Excitement follows more strongly from newness than rightness.

KaiserPro 4 hours ago

Aha! I used to work in film and was very close to the film scanning system.

When you scan in a film you need to dust bust it, and generally clean it up (because there are physical scars on the film from going through the projector. There's also a shit ton of dust that needs to be physically or digitally removed, i.e. "busted")

Ideally you'd use a non-real time scanner like this: https://www.filmlight.ltd.uk/products/northlight/overview_nl... which will collect both colour and infrared. This can help automate dust and scratch removal.

If you're unlucky you'll use a telecine machine, https://www.ebay.co.uk/itm/283479247780 which runs much faster, but has less time to dustbust and properly register the film (so it'll warp more)

However! That doesn't affect the colour. Those colour changes are deliberate and are a result of grading. I.e., a colourist has gone through and made changes to make each scene feel more effective. Ideally they'd alter the colour for emotion, but that depends on who's making the decision.

the mechanics are written out here: https://www.secretbatcave.co.uk/film/digital-intermediary/

  • xattt 2 hours ago

    How much of the colour change is also dependent on the film printer and also film scanner/telecine?

    It just seems like there’s a lot of variability in each step to end up with an unintended colour that will be taken as the artist’s intent.

  • tomcam 2 hours ago

    How did you dust bust it? Wipe it by hand with a microfiber cloth or something?

cbolton 2 hours ago

There's a similar issue with retro video games and emulators: the screens on the original devices often had low color saturation, so the RGB data in those games were very saturated to compensate. Then people took the ROMs to use in emulators with modern screens, and the colors are over-saturated or just off. That's why you often see screenshots of retro games with ridiculously bright colors. Thankfully now many emulators implement filters to reproduce colors closer to the original look.
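The idea behind such correction filters can be sketched in a few lines of numpy; the coefficients below are illustrative guesses, not taken from any particular emulator: blend each pixel toward its luma to mimic a low-saturation panel, and dim slightly.

```python
import numpy as np

def desaturate_filter(img, saturation=0.6, brightness=0.92):
    """Blend each pixel toward its luma to mimic a low-saturation panel.

    img: float RGB array in [0, 1], shape (..., 3).
    """
    luma = img @ np.array([0.299, 0.587, 0.114])  # Rec. 601 luma weights
    luma = luma[..., np.newaxis]
    out = luma + saturation * (img - luma)        # pull colors toward grey
    return np.clip(out * brightness, 0.0, 1.0)
```

Real emulator shaders go further (per-channel mixing matrices, gamma), but the principle is the same: undo on output what the artists baked in to compensate for the original screen.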

Some examples:

https://www.reddit.com/r/Gameboy/comments/bvqaec/why_and_how...

https://www.youtube.com/watch?v=yA-aQMUXKPM

  • forgotoldacc an hour ago

    With the GBA, the original GBA screen and the first-gen GBA SP had very washed out colors, not saturated at all. The Mario ports to the GBA looked doubly washed out, since they desaturated their colors and were shown on a desaturated screen. I've heard that the real reason the colors were desaturated was because the first GBA model didn't have a backlight, so the colors were lightened to be more visible, but I'm not quite sure that's the case. Lots of other games didn't do that.

    And with the second version of the GBA SP and the GB Micro, colors were very saturated. Particularly on the SP. If anything, cranking up the saturation on an emulator would get you closer to how things looked on those models, while heavily desaturating would get you closer to the look on earlier models.

  • zeta0134 2 hours ago

    Ah yes, we often get folks in the nesdev community bickering over which "NES Palette" (sourced from their favorite emulator) is the "best" one. The reality is extraordinarily complicated and I'm barely qualified to explain it:

    https://www.nesdev.org/wiki/PPU_palettes#2C02

    In addition to CRTs having variable properties, it turns out a lot of consoles (understandably!) cheat a little bit when generating a composite signal. The PPU's voltages are slightly out of spec, its timing is weird to work around a color artifact issue, and it generates a square wave for the chroma carrier rather than an ideal sine wave, which produces even more fun problems near the edges. So we've got all of that going on, and then the varying properties of how each TV chooses to interpret the signal. Then we throw electrons at phosphors and the pesky real world and human perception gets involved... it's a real mess!

nappy 8 hours ago

It's a surprisingly common error where someone picks up an old 35mm print and assumes it is somehow canonical... Besides whatever the provenance of these prints is (this gets complicated), the reality is that these were also made to look as good as they could for typical movie theater projector systems in the 90s. These bulbs were hot and bright and there were many other considerations around what the final picture would look like on the screen. So yeah, if you digitize 35mm film today, it will look different, and different from how it's ever been displayed in a movie theater.

  • johngossman 7 hours ago

    Agreed. It's a fine article but leaves half the story on the table. It is supposedly comparing what these movies looked like in the theater to the modern streaming and Blu-ray versions, but is actually comparing what a film scan (scanner settings unspecified) projected on a TV (or other unspecified screen) looks like compared to the digital versions on (presumably) the same screen. And then we can ask: how were the comparison images captured and rendered to jpeg for the web, before we, the readers, view them on our own screens? I'm not arguing Catmull and company didn't do a great job of rendering to film, but this comparison doesn't necessarily tell us anything.

    Don't believe me? Download the comparison pictures in the article to your device and play with filters and settings. You can get almost anything you want and the same was true at every step in the render pipeline to your TV.

    Ps - and don't get me started on how my 60-year-old eyes see color compared to what they perceived when I saw this in the theater

  • jama211 7 hours ago

    It’s an interesting and valid point that the projectors of the time would mean current scans of 35mm will be different too. However, taking for example the Aladdin screenshot in particular, the sky is COMPLETELY the wrong colour in the modern digital edition, so it seems to me at least that these 35mm scans, whilst not a perfect match to the 90s look, are closer to correct than their digital counterparts.

    • sersi 7 hours ago

      And as someone who is part of those conservation communities that scan 35mm with donations to keep the existing look, a lot of the people doing those projects are aware of this. They do some color adjustment to compensate for print fading, for the type of bulbs that were used in movie theatres back then (using a LUT), etc...
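      Mechanically, that kind of LUT correction is just a per-channel lookup. A toy numpy sketch, with invented values (real ones would come from measuring a reference print):

```python
import numpy as np

def apply_1d_lut(img, lut):
    """Map values in [0, 1] through a 1D LUT with linear interpolation."""
    xs = np.linspace(0.0, 1.0, len(lut))
    return np.interp(img, xs, lut)

# Toy LUT lifting shadows slightly, as fade compensation might;
# real values would come from measuring a reference print.
toy_lut = np.array([0.05, 0.30, 0.55, 0.78, 1.00])
```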

      I do find that often enough commercial releases like Aladdin or other movies like Terminator 2 are done lazily and have completely different colors than what was historically shown. I think part of this is the fact that studios don't necessarily recognise the importance of that legacy and don't want to spend money on it.

      • faeyanpiraat 5 hours ago

        What's wrong with Terminator 2?

        Are there like multiple digital releases, one with better colour than the other?

    • davidferguson 2 hours ago

      See my top level comment for more info on this, but the Aladdin scan used in the article was from a 35mm trailer that's been scanned on an unknown scanner, and had unknown processing applied to it. It's not really possible to compare anything other than resolution and artefacts in the two images.

  • davidferguson 2 hours ago

    And it was made by a lab that made choices on processing and developing times, which can quite easily affect the resulting image. You hope that labs are reasonably standard across the board and calibrate frequently, but even processing two copies of the same material in a lab, one after the other, will result in images that look different if projected side by side. This is why it's probably impossible to make new prints of 3-strip Cinerama films now; the knowledge and the number of labs that can do this are near zero.

  • postalcoder 7 hours ago

    I think the entire premise of the article should be challenged. Not only is 35mm not meant to be canonical, but the 35mm scans the author presented are not what we saw, at least for Aladdin.

    I've watched Aladdin more than any other movie as a child and the Blu-ray screenshot is much more familiar to me than the 35mm scan. Aladdin always had the Velvia look.

    > Early home releases were based on those 35 mm versions.

    Here's the 35mm scan the author presents: https://www.youtube.com/watch?v=AuhNnovKXLA

    Here's the VHS: https://www.youtube.com/watch?v=dpJB7YJEjD8

    • sersi 6 hours ago

      Famously, CRT TVs didn't show as much magenta, so in the 90s home VHS releases compensated by cranking up the magenta so that it would be shown correctly on the TVs of the time. It was a documented practice at the time.

      So, yes the VHS is expected to have more magenta.

      Anecdotally, I remember watching Aladdin at the movie theatre when it came out and later on TV multiple times and the VHS you saw doesn't correspond to my memories at all.

      • postalcoder 6 hours ago

        The author here is asserting that VHS releases were based on the 35mm scans, and that the oversaturation is a digital phenomenon. Clearly, that's not true.

        I can't challenge the vividness of your memory. That's all in our heads. I remember it one way, and you remember it another.

  • dehrmann 7 hours ago

    This reminds me of how pre-LCD console games don't look as intended on modern displays, or how vinyl sounds different from CDs because mixing and mastering targeted physical media with limitations.

    • Ekaros 7 hours ago

      Wasn't CD more a case of cheaping out? The mastering was done once, mostly for radio, where the assumed listening scenario was the car or background noise, and the reduced dynamic range allowed it to be louder on average.

      CD itself can replicate the same dynamic range and more, but, well, that doesn't sell extra copies.

      • projektfu 3 hours ago

        The loudness war was a thing in all media. In the 80s most of us didn't have CD players but our vinyl and tapes of pop and rock were all recorded overly loud. Compared to the classical and jazz recordings, or perhaps the heyday of audiophile 70s rock, it was annoying and sad.

  • beAbU 7 hours ago

    > It's a surprisingly common error where someone picks up an old 35mm print and assumes it is somehow canonical

    Same applies for people buying modern vinyl records believing them to be more authentic than a CD or (god-forbid) online streaming.

    Everything comes from a digital master, and arguably the vinyl copy adds artefacts and colour to the sound that are not part of the original recording. Additionally, the vinyl is not catching more overtones because it's analogue; there is no true analogue path in modern music any more.

    • vanderZwan 6 hours ago

      I don't know if this is still true, but I know that in the 2000s the vinyl releases were usually mastered better than the CDs. There even was a website comparing CD vs vinyl releases, where the person hosting it was lamenting this fact because objectively CDs have a much higher dynamic range than vinyl, although I can't find it now. CDs were a victim of the loudness war[0].

      Allegedly, for a lot of music that is old enough the best version to get (if you have the kind of hifi system that can make use of it) is an early 80s CD release, because it sits in a sweet spot predating the loudness war, when producers were actually using the dynamic range of the CD.

      [0] https://en.wikipedia.org/wiki/Loudness_war

      • jorvi 2 hours ago

        The loudness wars were mostly an artifact of the 90s-2010s, because consumers were listening on horrible plasticky iPod earbuds or cheap Logitech speakers and the music had to sound good on those.

        Once better monitors became more commonplace, mastering became dynamic again.

        This is most clear with Metallica's Death Magnetic, which is a brickwalled monstrosity on the 2008 release but was fixed on the 2015 release[0]. And you can see this all over, where albums from the 90s had a 2000s "10-year anniversary" remaster that is heavily compressed, but then a 2010s or 2020s remaster that is dynamic again.

        [0] Interestingly enough between those dates, fans extracted the non-brickwalled Guitar Hero tracks and mastered them as well as they could. Fun times :).

    • Cthulhu_ 4 hours ago

      I dunno about authentic but for a while (as another commenter pointed out) they didn't have the loudness maxed out and / or had better dynamic range. That said, music quality aside, vinyls have IMO better collectability value than CDs. They feel less fragile, much more space for artwork and extras, etc.

  • ForHackernews 4 hours ago

    It sounds like in the case of Toy Story, the Pixar team were working toward a 35mm print as the final product, so that probably should be considered canonical: it's what the creative team set out to make.

  • NewsGotHacked 8 hours ago

    [flagged]

    • findyoucef 8 hours ago

      You literally added nothing to this conversation by making this comment. Some people love to hear themselves talk.

timenotwasted 9 hours ago

This makes so much more sense now. After having kids I've been watching my fair share of Pixar, and everything looked flat and bland compared to what I remembered, but I would always chalk it up to my brain misremembering how it looked at the time. Good to know I guess that it wasn't just entirely nostalgia, but sad that we continue to lose some of this history, and so soon.

  • behringer 9 hours ago

    Things like this are being preserved, you just have to sail the high seas.

    • phantasmish 7 hours ago

      Yeah I clicked this link going “oh god it’s because they printed to film, I bet, and man do I hope it looks worse so I don’t have to hunt down a bunch of giant 35mm scans of even more movies that can’t be seen properly any other way”

      But no, of course it looks between slightly and way better in every case. Goddamnit. Pour one out for my overworked disk array.

      And here I was thinking it was just my imagination that several of these look kinda shitty on Blu-ray and stream rips. Nope, they really are worse.

      Piracy: saving our childhoods one frame at a time.

      • actionfromafar 5 hours ago

        When it comes to Star Wars, people are literally spotting them in Photoshop frame by frame. :)

    • qingcharles 8 hours ago

      I'm not sure why you're getting downvoted. What you're hinting at is that a lot of original 35mms are now getting scanned and uploaded privately, especially where all the commercial releases on Blu-ray and streaming are based on modified versions of the original movies, or over-restored versions.

      These can be especially hard to find as the files are typically enormous, with low compression to keep things like grain. I see them mostly traded on short-lived gdrives and Telegram.

      • squigz 8 hours ago

        > I see them mostly traded on short-lived gdrives and Telegram.

        Someone tell this community to share over BT. Ain't nobody got time to keep up with which platform/server everyone is on and which links are expired and yuck.

        • sersi 6 hours ago

          The main reason they are not shared as widely is that there's a bit of conflict within the community between those that really want to stay under the radar and not risk being targeted by copyright owners (and so try to keep things very much private between the donors who funded the 600-900 usd cost of the scans) and those who want to open up a bit more and so use telegram, reddit and upload to private trackers.

      • eviks 7 hours ago

        > with low compression to keep things like grain.

        But you have algorithmic grain in modern codecs, so no need to waste so much space for noise?

    • diogenescynic 8 hours ago

      You can’t trust corporations to respect or protect art. You can’t even buy or screen the original theatrical release of Star Wars. The only option is as you say. There are many more examples of the owners of IP altering it in subsequent editions/iterations. This still seems so insane to me that it’s not even for sale anywhere…

      • haunter 3 hours ago

        > You can’t even buy or screen the original theatrical release of Star Wars

        You can actually: the 2006 Limited Edition DVD is a double-disc release, with one disc being the original version.

        However they are not DVD quality because they were transferred from LaserDisc and not the original film stock

        • phantasmish an hour ago

          Even those aren’t accurate to the 1977 film.

          To pick an arguably-minor but very easy to see point: the title’s different.

      • 4gotunameagain 6 hours ago

        I don't understand why you're getting downvoted. So many beautiful things have been lost to perpetual IP, e.g. old games that could be easily ported by volunteers given source code, which can never be monetised again.

        Sometimes people create things that surpass them, and I think it is totally fair for them to belong to humanity after the people that created them generated enough money for their efforts.

    • squigz 9 hours ago

      What sort of terms might one search for?

      • behringer 9 hours ago

        "toy story film scan" on Kagi led me to a reddit page that may or may not contain links that might help you, but don't dawdle; those links may not work forever.

        Another one that's been hard to find is the 4k matrix original color grading release. Ping me if you have it! (Not the 1080p release)

    • dmonitor 9 hours ago

      Would be annoying, but I suppose you could also recalibrate your display to turn down the greens?

hekkle 9 hours ago

I'm surprised they can't just put a filter on the digital versions to achieve a similar look and feel to the 35mm version.

It is clear that the animators factored in the colour changes from the original media to 35mm, so it seems a disservice to them to re-release their works without honouring how they intended the films to be seen.

  • etempleton 9 hours ago

    They could, but it would require some work to get it right. This is very similar to conversations that happen regularly in the retro game scene regarding CRT monitors vs modern monitors for games of a certain era. The analog process was absolutely factored in when the art was being made, so if you want similar visuals on a modern screen you will need some level of thoughtful post-processing.

    • Torn 9 hours ago

      Disney 100% has access to colorists and best in class colour grading software. It must have been a business (cost cutting) decision?

      • afavour 9 hours ago

        I’m reminded of the beginning of the movie Elf, where the book publisher is informed that a printing error means their latest book is missing the final two pages. Should they pulp and reprint? He says,

        > You think a kid is going to notice two pages? All they do is look at the pictures.

        I’m quite sure bean counters look at Disney kids movies the exact same way, despite them being Disney’s bread and butter.

        With Star Wars you have a dedicated adult fan base that’ll buy up remasters and reworkings. Aladdin? Not so much. Especially in the streaming era, no one is even buying any individual movie any more.

      • etempleton 9 hours ago

        The vast majority of people will not care nor even notice. Some people will notice and say, hey, why is it "blurry." So do you spend a good chunk of time and money to make it look accurate or do you just dump the file onto the server and call it a day?

      • aidenn0 9 hours ago

        Just dialing down the red and blue channels a bit makes it much closer for several of the early '90s releases (look at that Aladdin example from TFA)

      • ZiiS 6 hours ago

        Disney do pay for industry-leading colorists. They chose to favour a more saturated look for Aladdin et al. It is reasonable to prefer either. I can't imagine what happened to the greens in the Toy Story examples if they are accurate.

    • philistine 9 hours ago

      And ultimately, what you need to achieve acceptable CRT effects is resolution. Only now, with 4K and above, can we start to portray the complex interactions between the electron beam and the produced image by your console. But the colour banding that caused the hearts of The Legend of Zelda to show a golden sheen is still unreachable.

  • mer_mer 8 hours ago

    You can, that's what Pixar did while creating the film. From the article:

    > During production, we’re working mostly from computer monitors. We’re rarely seeing the images on film. So, we have five or six extremely high-resolution monitors that have better color and picture quality. We put those in general work areas, so people can go and see how their work looks. Then, when we record, we try to calibrate to the film stock, so the image we have on the monitor looks the same as what we’ll get on film.

    But they didn't do a perfect job (the behavior of film is extremely complex), so there's a question: should the digital release reflect their intention as they were targeting these calibrated monitors, or should it reflect what was actually released? Also, this wouldn't include other artifacts like film grain.

    • happymellon 7 hours ago

      > Then, when we record, we try to calibrate to the film stock, so the image we have on the monitor

      Except, as they say, the high grade monitors were calibrated to emulate the characteristics of film.

      If we can show that D+ doesn't look like the film, then we can point out that it probably doesn't look like the calibrated monitors either. Those army men are not that shade of slime green in real life, and you'll have a hard time convincing me that, after all the thought and effort that went into the animation, they allowed that putrid pea shade to go through.

    • Asooka 3 hours ago

      The best option for me would be to release it in whatever format preserves the most of the original colour data without modification, then let the viewer application apply colour grading. Give me the raw renders in a linear 16bpc colour space with no changes. Sadly, I don't think we have digital movie formats that can handle that.
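      For reference, the display encoding a viewer application would then apply on top of such linear data is well defined; here is a minimal numpy sketch of the standard sRGB transfer function (IEC 61966-2-1):

```python
import numpy as np

def linear_to_srgb(linear):
    """Encode linear-light values in [0, 1] with the sRGB transfer curve."""
    linear = np.clip(linear, 0.0, 1.0)
    return np.where(
        linear <= 0.0031308,
        12.92 * linear,                               # linear toe near black
        1.055 * np.power(linear, 1.0 / 2.4) - 0.055,  # gamma segment
    )
```

      Shipping linear data and leaving this step (or a film-emulation LUT) to the player is exactly the separation of master and compensation being asked for.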

  • belZaah 8 hours ago

    It is doable and you can get presets designed to mimic the look of legendary photography film stock like Velvia. But what they did back then was very much an analog process and thus also inherently unstable. Small details start to matter in terms of exposure times, projectors used etc. There’s so many frames and it took so much time, that it’s almost guaranteed there’d be noticeable differences due to process fluctuations.

  • [removed] 9 hours ago
    [deleted]
heldrida 22 minutes ago

Film is magical. We should preserve and incentivise it.

sja 9 hours ago

Neat! The Youtube channel Noodle recently did a related deep dive into the differences in the releases of The Matrix [0]. The back half of the video also touches on the art of transferring from film/video to digital.

[0]: https://www.youtube.com/watch?v=lPU-kXEhSgk

  • larusso 9 hours ago

    I always felt the old Matrix had a colder blue, and it changed drastically when the second and third hit cinemas. At least that was my memory, because I watched a double feature when the second one hit the theatres and complained then that The Matrix somehow looked weird. But it could also be my memory, since I also own the Blu-ray release.

    Another movie with the same/similar problem is the home release of the Lord of the Rings Extended editions, on both the Blu-ray and 4K versions. As far as I remember, they fixed it for the theatrical version in 4K but not the extended.

coopierez 5 hours ago

I'm stunned so many people here can remember details as fine as the colour grading of a film. I couldn't remember specifics like that from 6 months ago, let alone 30 years ago when I was a child and wouldn't have had the thought to watch for cinematographic touches.

Side note - I wonder if it's a millennial thing that our memories are worse due to modern technology, or perhaps we are more aware of false memories due to the sheer availability of information like this blog post.

  • Telaneo 4 hours ago

    I doubt many people 'remember' this to any significant extent, but there are probably many cases of media giving the 'wrong' vibe with a new release, where you just assume it's because you've gotten older; then, when you get access to the original you experienced, the 'good' vibe is back, and you can easily compare the two.

    Although some people do in fact remember the differences, I'd guess a lot of those incidents are caused by people experiencing them in fairly quick succession. It's one thing to remember the difference between a DVD 20 years ago and a Blu-ray you only watched today, and another between a DVD 15 years ago and a Blu-ray 14 years ago.

  • sunaookami 4 hours ago

    Different people just remember different things. I bet most people don't remember either, and are only going "ah yes, of course!" after reading this blogpost (which means they didn't remember at all).

    • bspammer 2 hours ago

      Anecdata here, but I played Zelda Ocarina of Time on CRT when I was a child, and have since replayed it many times via emulator. The game never looked quite as good as I remembered it, but of course I chalked it up to the fact that graphics have improved by several orders of magnitude since then.

      Then a few years ago I was throwing out my parents' old CRT and decided to plug in the N64 one last time. Holy crap, was it like night and day. It looked exactly as I remembered it, so much more mysterious and properly blended than it does on an LCD screen.

      I don't see why the same wouldn't apply to films, sometimes our memories aren't false.

pavlov 7 hours ago

Film weave could also be worth mentioning.

Movies projected on film look different not only because of the color and texture, but also a constant spatial jitter over time. When the film moves through the projector, each frame locks into a slightly different position vertically. That creates a wobble that's called "film weave."

(If you want to create truly authentic-looking titles for a 1980s B-grade sci-fi movie, don't forget to add that vertical wobble to your Eurostile Extended Bold layout that reads: "THE YEAR IS 2025...")

bsimpson 8 hours ago

Same is true of home video hardware:

If you plug a Nintendo system's RCA cables into a modern TV, it will look like garbage. Emulated games on LCDs look pixelated.

Those games were designed for a CRT's pixel grid. They don't look right on LCDs, and the upscalers in home theater equipment don't respect that. There are hardware upscalers and software shaders that are specifically designed to replicate a CRT's quirks, to let you better approximate how those games were designed to be played.

Related - someone recently built a CRT dock for his Switch, so he could play Nintendo Switch Online's emulated games as originally intended:

https://www.youtube.com/watch?v=wcym2tHiWT4

yCombLinks 9 hours ago

The texture of the film grain really makes Mulan and Aladdin look better. The large, simple filled sections look like they have so much more to them.

  • kemayo 9 hours ago

    The one frame they showed from the Lion King really stood out. The difference in how the background animals were washed out by the sunlight makes the film version look significantly better.

    • saghm 9 hours ago

      I'm not sure if I'm just young enough to be on the other side of this despite seeing all three of those Disney movies as a millennial kid (Lion King and Aladdin were VHS mainstays in my house, and I remember seeing Mulan in theaters), but I honestly don't find the film grain to look better at all and think all three of those bottom images are much more appealing. For the Toy Story ones, I think I'm mostly indifferent; I can see why some people might prefer the upper film images but don't really think I'd notice which one I was watching. I'd definitely think I'd notice the difference in the 2D animation though and would find the film grain extremely distracting.

    • charcircuit 8 hours ago

      To me it's much worse. You can't see all of the detail the artists drew, and there is noise everywhere, even specks of dust. Whenever I watch a film-based movie my immersion always gets broken by all the little specks that show up. Digital is a much more immersive experience for me.

      • autoexec 3 hours ago

        > To me it's much worse. You can't see all of the detail the artists drew, and there is noise everywhere, even specks of dust.

        In the Lion King example you weren't meant to see all of the detail the artists drew. In the army men example the color on the digital version is nothing like the color of the actual toys.

        They originally made those movies the way they did intentionally because what they wanted wasn't crystal clear images with unrealistic colors, they wanted atmosphere and for things to look realistic.

        Film grain and dust can be excessive and distracting. It's a good thing when artifacts added by dirt and age get cleaned up for transfers so we can have clear images, but the result of that cleanup should still show what the artists originally intended, and that's where Disney's digital versions really miss the mark.

      • opello 7 hours ago

        This is an interesting take when you look at the gas station Toy Story example and consider the night sky. In the digital version the stars are very washed out but in the film version the sky is dark and it's easy to appreciate the stars. Perhaps it's unrealistic when you realize the setting is beneath a gas station canopy with fluorescent lights, but that detail, along with some of the very distinct coloring, stuck out to me.

  • Cthulhu_ 4 hours ago

    Which is of course highly subjective; you could argue that film grain is an unwanted but unavoidable side-effect from the medium used, just like other artifacts from film - vertical alignment issues, colour shifting from "film breath", 24 frames per second, or the weird sped-up look from really old films.

    I don't believe these were part of the filmmaker's vision at the time, but unavoidable. Nowadays they are added again to films (and video games) on purpose to create a certain (nostalgic) effect.

  • Arn_Thor 4 hours ago

    It does, but much more important to me is the color grading. The white point in the film versions is infinitely better.

dabinat 36 minutes ago

I don’t doubt that the colors of the digital-digital releases are objectively worse, but even if they output colors that were faithful to the creators’ vision, that doesn’t get around the fact that most TVs ship with god-awful factory defaults that make everything look like a video game. So much so that a bunch of people in the industry lobbied TV manufacturers to create a special mode that would display movies more accurately: https://filmmakermode.com/faqs/

But there is some tension, because a number of people complain that Filmmaker Mode is too dark to their eye, even if it is faithful to the creators' vision.

opello 7 hours ago

> He [David DiFrancesco] broke ground in film printing — specifically, in putting digital images on analog film.

> Their system was fairly straightforward. Every frame of Toy Story’s negative was exposed, three times, in front of a CRT screen that displayed the movie.

While I have no doubt that this hadn't been done at that scale and resolution before, it struck me that I'd heard about this concept in a podcast episode [1] in which very early (1964) computer animation was discussed alongside the SC4020 microfilm printer, which used a Charactron CRT that could display text for exposure to film or plot lines.

[1] https://adventofcomputing.libsyn.com/episode-88-beflix-early...

jtolmar 9 hours ago

Is it possible to replicate the digital->film transition with tone mapping? (I assume the answer is yes, but what is the actual mapping?)
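In principle, yes. A real film emulation would use a measured 3D LUT for the specific print stock; as a very rough illustration only (every function name and constant below is invented, nothing is measured from actual film), the general shape is an S-curve for contrast plus a saturation and green pull-down:

```python
# Toy sketch of a "digital -> film print" style tone mapping.
# All curve shapes and constants are invented for illustration; a real
# emulation would use measured stock response curves (a 3D LUT).

def smoothstep_curve(x, gamma=1.15):
    """Mild S-curve: deepen shadows, soften highlights."""
    x = max(0.0, min(1.0, x))
    s = x * x * (3 - 2 * x)   # smoothstep adds contrast
    return s ** gamma          # gamma > 1 darkens midtones slightly

def film_look(rgb, saturation=0.85, green_pull=0.92):
    """Map one RGB triple (0..1 floats) toward a hypothetical 'print' look."""
    r, g, b = (smoothstep_curve(c) for c in rgb)
    g *= green_pull            # knock back over-saturated greens
    # Rec. 709 luma, then pull channels toward it to desaturate
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    mix = lambda c: luma + saturation * (c - luma)
    return tuple(round(mix(c), 4) for c in (r, g, b))

print(film_look((0.2, 0.8, 0.3)))
```

The honest answer to "what is the actual mapping" is that it isn't a closed-form curve at all: print stocks have per-channel, interacting responses, which is why production tools ship them as LUTs rather than formulas.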

alex-moon 4 hours ago

So it's fascinating reading this and looking at the screengrabs of the "original" versions... not so much because they are "how I remember them," but because they have a certain nostalgic quality I can't quite name: they "look old". Presumably this is because, back in the day, when I was watching these films on VHS tapes, they had come to tape from 35mm film. I fear I will never again be able to look at "old-looking" footage with the same nostalgia, now that I understand why it looks that way, and, indeed, that it isn't supposed to look that way!

bee_rider 9 hours ago

How well does 35mm hold up over time? Could these movies be said to “no longer exist” in some sense, if the scans have decayed noticeably?

  • JKCalhoun an hour ago

    Hollywood does store original prints in underground salt mines (at least I am aware of a place in Kansas where they do this). Of course, who knows where the frames we are being shown from the 35mm film version are coming from; likely not these copies, which are probably still in halite storage.

  • phantasmish 7 hours ago

    Playing them, handling them, and poor storage all degrade them. Most original prints will have been played many times, and often haven’t been consistently stored well.

    The 4K77 etc. fan scans of the original Star Wars trilogy, which aimed to get as close as possible to what one would have seen in a theater the year of release, used multiple prints to fill in e.g. bad frames, used references like (I think) magazine prints of stills and well-preserved fragments or individual frames to fix the color grading and contrast (always faded, sometimes badly), and had to extensively hand-correct things like scratches, with some reels or parts of reels requiring much more of that kind of work than others. Even Jedi required a lot of that sort of work, and those reels would have been only something like 30-35 years old when they started working on them.

aidenn0 9 hours ago

Beauty and the Beast on Blu-ray looks completely different from what I remember; I had assumed they had just regraded it, but given that it was developed with CAPS, maybe this is part of the effect?

Tabular-Iceberg 5 hours ago

I call it the Newgrounds animation effect. Digital-to-digital always looked a bit unserious to me.

I’m sure many young people feel the exact opposite.

ErroneousBosh 5 hours ago

> "Even so, it’s a little disquieting to think that Toy Story, the film that built our current world, is barely available in the form that wowed audiences of the ‘90s."

Load it up in DaVinci Resolve, knock the saturation and green curve down a bit, and boom, it looks like the film print.

Or you could slap a film-look LUT on, but you don't need to go that far.

vinhnx 9 hours ago

What an excellent piece! I thoroughly enjoyed reading it, brought my childhood memories flooding back. I have so many fond recollections of that 90s era, including "A Bug's Life." I remember gathering with my cousins at my grandmother's house to watch these films on VHS. Time flies.

albertzeyer 6 hours ago

It reminds me also of the 24 FPS discussion. As far as I know, 24 FPS is still the standard for cinema, even though 48 or 60 FPS are pretty standard for series; 24 FPS gives film a more cinematic feeling.

https://www.vulture.com/2019/07/motion-smoothing-is-ruining-... https://www.filmindependent.org/blog/hacking-film-24-frames-...

  • Cthulhu_ 4 hours ago

    To add, when it comes to video games sometimes people go "baww but 24 fps is enough for film". However, pause a film and you'll see a lot of smearing, not unlike cartoon animation frames I suppose, but in a video game every frame is discrete, so a low frame rate becomes a lot more visually apparent.

    I think it was The Hobbit that had a 48 fps version, and people just... weren't having it. It's technologically superior, I'm sure (as higher frame rates would be), but it just becomes too "real" then. IIRC they also had to really update their make-up game, because at higher frame rates and/or resolutions people can see everything.

    Mind you, watching older TV shows nowadays is interesting; I think they were able to scan the original film for e.g. The X-Files and make an HD or 4K version of it, and unlike back in the day, nowadays you can make out all the fine details of the actors' skin and the like. Part high definition, part watching it on a 4K screen instead of a CRT TV.

BolexNOLA 9 hours ago

It’s fascinating to me how many of these discussions boil down to dialing in dynamic range for the medium in question.

As the Aladdin still with its wildly altered colors shows, other aspects clearly matter and are at play. But the analog/digital discussions always seem, at least to me, to hinge heavily on DR. It's just so interesting to me.

Many of us remember the leap from SD->HD. Many of us also can point out how 4K is nice and even noticeably better than FHD, but man…getting a 4K OLED TV with (and this is the important part) nice DR was borderline another SD->HD jump to me. Especially with video games and older films shot and displayed on film stock from start to finish. The difference is incredibly striking.

OJFord 5 hours ago

I remember it grainier and with occasional 'VHS zebra stripes'(?) too, though.

wilg 9 hours ago

If you're interested in these 35mm film scans, I recommend watching this excellent YouTube video "SE7EN & How 35mm Scans Lie to You" https://www.youtube.com/watch?v=uQwQRFLFDd8 for some more background on how this works, and especially how these comparisons can sometimes be misleading and prey on your nostalgia a bit.

If you're interested in making digital footage look exactly like film in every possible way, I'll shill our product Filmbox: https://videovillage.com/filmbox/

afiori 5 hours ago

Happy to have set my television to less than half saturation

CGMthrowaway 6 hours ago

Now there is the problem where many of my friends will take one look at a movie I've started on the TV and say, "ew, I don't like this movie, it's old." They don't realize that the reason they feel that way, viscerally, is that it's shot on film. How do I get people to watch film movies with me? On average they're far better than many modern movies anyway (from a storytelling and moviemaking point of view, to say nothing of the picture quality).

  • Cthulhu_ 4 hours ago

    Make them into a drinking game. We watched The Princess Bride the other day (never watched it), I think it's aged pretty well but then I'm old. But if they get bored, make it a game to have a drink or get a point or whatever for every sexual innuendo, lol.

    Some films didn't age well though.

    And finally, accept it and move on, ultimately it's their loss.

[removed] 9 hours ago
[deleted]
charcircuit 8 hours ago

>Computer chips were not fast enough, nor disks large enough, nor compression sophisticated enough to display even 30 minutes of standard-definition motion pictures.

This is not true at all. Being compatible with outdated, film based projectors was much more important for being able to show it in as many theaters as possible. If they wanted to do a digital screening it would have been technologically possible.

  • opello 8 hours ago

    I bumped on this too, since 1994-1995 was about the time when multi-gigabyte hard drives were readily available and multiple full motion video codecs were being used in games, albeit for cut scenes. Theater projector compatibility makes complete sense.

    • toast0 7 hours ago

      In 1994-1995, all the pieces for digital cinema were there, but they weren't integrated, and there were no installed projectors. The Phantom Menace was shown digitally... on two screens. By the end of 2000, there were 31 digital cinema screens in theaters.

      Digital cinema went with Motion JPEG2000 with high quality settings, which leads to very large files, but also much better fidelity than likely with a contemporary video codec.

      https://en.wikipedia.org/wiki/Digital_cinema

      • opello 6 hours ago

        > In 1994-1995, all the pieces for digital cinema were there, but they weren't integrated, and there were no installed projectors.

        I agree with that. The article's quote from Pixar's "Making The Cut at Pixar" book was that the technology wasn't there (computer chips fast enough, storage media large enough, compression sophisticated enough) and I--along with the comment I replied to--disagree with that conclusion.

    • Theodores 4 hours ago

      At the time I was somewhat in charge of the render queue at a small animation company. I had to get rendered images onto tape, as in Sony Digibeta or better. Before that I had to use film.

      We had an incredible amount of fancy toys with no expense spared, including those SGI Onyx InfiniteReality boxes with the specialist video breakout boards that did digital video, or analogue with genlock. Disks were 2 GB SCSI and you needed a stack of them in RAID formations to play video. This wasn't even HD; it was 720 x 576 interlaced PAL.

      We also had to work within a larger post production process, which was aggressively analogue at the time with engineers and others allergic to digital. This meant tapes.

      Note that a lot of this was bad for tape machines. These cost £40k upwards, and advancing the tape by one frame to record it, then rewinding to reposition the tape for the next frame, for hours on end, was a sure way to wreck a tape machine, so we just hired them.

      Regarding 35mm film, I also babysat the telecine machines where the film bounces up and down on the sprockets, so the picture is never entirely stable. These practical realities of film just had to be worked with.

      The other fun aspect was moving the product around. This meant hopping on a train, plane or bicycle to get tapes to where they needed to be. There was none of this uploading malarkey although you could book satellite time and beam your video across continents that way, which happened.

      Elsewhere in broadcasting, there was some progress with glorified digital video recorders. These were used in the gallery and contained the programming that was coming up soon. These things had quite a lot of compression and their own babysitting demands. Windows NT was typically part of the problem.

      It was an extremely exciting time to be working in tech but we were a long way off being able to stream anything like cinema resolution at the time, even with the most expensive tech of the era.

      Pixar and a few other studios had money and bodies to throw at problems, however, there were definitely constraints at the time. The technical constraints are easy to understand but the cultural constraints, such as engineers allergic to anything digital, are hard to imagine today.

dabluecaboose 9 hours ago

Those comparisons were strangely jarring. It's odd to see (on the internet awash with "Mandela Effect" joke conspiracies) direct photo/video evidence that things we remember from our childhood have indeed been changed; sometimes for the worse!

d--b 7 hours ago

I just showed Toy Story to my kids. It looked really bad. Mostly textures and lighting.

I wonder if artificial grain would actually make it look better.

Like when the game Splinter Cell was released, there were two additional 'views' simulating infrared and thermal cameras. Those had heavy noise added to them and felt so real compared to the main view.

dyauspitr 7 hours ago

Interesting, I think the film versions feel like they have more gravitas, especially the Lion king and Mulan scenes.

globular-toast 7 hours ago

I find a lot of the stuff I remember from decades ago looks worse now. Toy Story in particular I watched when I got a projector after I'd seen Toy Story 4 and it looked bad, almost to the point I wish I hadn't tarnished my memory of it. Similar things have happened with N64 games that I cherished when I was little.

I don't buy that it's a real degradation due to different presentation methods. I'm sorry, but no matter what film stock you lovingly transfer Toy Story to, it's never going to look like it does in your memory. Same with CRTs. Sure, it's a different look, but my memory still looks better.

It's like our memories get automatically upgraded when we see newer stuff. It's jarring to go back and realise it didn't actually look like that in the 90s. I think this is just the unfortunate truth of CGI. So far it hasn't reached the point of producing something timeless. I can watch a real film from the 80s and it will look just as "good" as one from today. Of course the colours will be different depending on the transfer, but what are we hoping for? To get the exact colours the director saw in his mind's eye? That kind of thing has never really interested me.

  • theshackleford 7 hours ago

    > Same with CRTs. Sure, it's a different look, but my memory still looks better.

    I don’t have this issue and never have. For whatever reason I’ve never “upgraded” them in my mind, and they look today exactly as I remember them when played on period hardware.

tobr 5 hours ago

The changes in the Aladdin and Lion King stills surely can’t be accidental side effects? The Aladdin shot especially looks like they deliberately shifted it to a different time of day. Could there have been a continuity reason?

bpiroman 9 hours ago

wtf happened to Simpsons on Disney+? looks like it's zoomed in.

  • 6581 an hour ago

    There's an option to switch back to the original 4:3 ratio.

  • MangoToupe 8 hours ago

    The Simpsons was originally made in 4:3. Many people don't like watching with large black bars to the left and right, so they show a cropped 16:9 version. People complained because the cropping occasionally ruins a joke, so I believe you can now opt into either.

    • redwall_hp 8 hours ago

      A similar crime against taste as the pan-and-scan "fullscreen" DVDs of the early 2000s. If I want to pay to watch something, don't crop out a chunk of what the cinematographer wanted me to see...

      • frou_dh 5 hours ago

        There's a (much less severe) instance of that peeve with computer video player apps that have slightly rounded corners on the windows.

      • opello 7 hours ago

        David Simon talked about this for the HD release of The Wire:

        https://davidsimon.com/the-wire-hd-with-videos/

        It seems like the video examples are unfortunately now unavailable, but the discussion is still interesting and it's neat to see the creative trade-offs and constraints in the process. I think those nuances help evoke generosity in how one approaches re-releases or other versions or cuts of a piece of media.

      • toast0 7 hours ago

        Pan and scan wasn't a DVD innovation. Most VHS releases were pan and scan too; DVDs at least commonly had widescreen available (many early discs came with widescreen on one side and full screen on the other; good luck guessing whether "widescreen" on the hub means the side you're reading is widescreen, or that the other side is widescreen and you should have the widescreen label facing up in your player).

squigz 9 hours ago

Wow. Based on those comparisons they really do feel completely different. Really remarkable how such relatively simple changes in lighting and whatnot can drastically change the mood.

And here I was thinking of re-watching some old Disney/Pixar movies soon :(

davidferguson 3 hours ago

TL;DR: Linking to YouTube trailer scans as comparisons for colour is misleading and not accurate.

---

> see the 35 mm trailer for reference

The article makes heavy use of scans of trailers to show what colours, grain, sharpness, etc. looked like. This is quite problematic, because you are relying on a scan done by someone on the Internet to accurately depict what something looked like in a commercial cinema. Now, I am not a colour scientist (far from it!), but I am a motion picture film hobbyist and so can speak a bit about some of the potential issues.

When projected in a movie theatre, light is generated by a short-arc xenon lamp. This has a very particular output light spectrum, and the entire movie process is calibrated and designed to work with this. The reflectors (mirrors) in the lamphouse are tuned to it, the films are colour graded for it, and then the film recorders (cameras) are calibrated knowing that this will be how it is shown.

When a film is scanned, it is not lit by a xenon short-arc lamp; various other illumination methods are used depending on the scanner, with CRTs and LEDs being common. Commercial scanners are, on the whole, designed to scan negative film (it's where the money is), and so they are set up to work with that, which is very different to positive movie release film stock. Scanners therefore have different profiles to try to capture the different film stocks, but in general, today's workflow involves scanning something in and then colour correcting post-scan to meet an artist's expectations/desires.

Scanning and accurately capturing what is on a piece of film is something that is really quite challenging, and not something that any commercial scanner today does, or claims to do.

The YouTube channels referenced are FT Depot, and 35mm Movie Trailers Scans. FT Depot uses a Lasergraphics 6.5K HDR scanner, which is a quite high end one today. It does have profiles for individual film stocks, so you can set that and then get a good scan, but even the sales brochure of it says:

> Many common negative film types are carefully characterized at Lasergraphics to allow our scanning software to compensate for variation. The result is more accurate color reproduction and less time spent color grading.

Note that it says less time is spent colour grading; it is still not expected that it will accurately capture exactly what was on the film. It also specifies negative; I don't know whether it has positive stock profiles, as I am not lucky enough to have worked with one, so for this I will assume it does.

The "scanner" used by 35mm Movie Trailers Scans is a DIY, homemade film scanner that (I think, at least the last time I spoke to them) uses an IMX-183 sensor. They have both a colour sensor and a monochrome sensor; I am not sure which was used to capture the scans linked in the video. Regardless, in such a scanner, which doesn't have the benefit of film stock profiles and the like, there is no way to create a scan that accurately captures what was on the film without some serious calibration and processing, which isn't being done here. At best, you can make a scan and then manually adjust it by eye afterwards to what you think looks good, or what you think the film looks like, but without doing this on a colour-calibrated display with the original projected side by side for reference, this is not going to be that close to what it actually looked like.

Now, I don't want to come off as bashing a DIY scanner - I have made one too, and they are great! I love seeing the scans from them, especially old adverts, logos, snipes, etc. that aren't available anywhere else. But, it is not controversial at all to say that this is not colour calibrated in any way, and in no way reflects what one actually saw in a cinema when that trailer was projected.

All this is to say that statements like the following in the article are pretty misleading - as the differences may not be attributable to the direct-digital-release process at all, and could just be that a camera white balance was set wrong, or some post processing to what "looked good" came out different to the original:

> At times, especially in the colors, they’re almost unrecognizable

> Compared to the theatrical release, the look had changed. It was sharp and grainless, and the colors were kind of different

I don't disagree with the premise of the article - recording an image to film, and then scanning it in for a release _will_ result in a different look to doing a direct-digital workflow. That's why major Hollywood films spend money recording and scanning film to get the "film look" (although that's another can of worms!). It's just not an accurate comparison to put two images side by side, when one is of a trailer scan of unknown accuracy.

GenericDev 9 hours ago

Damn. I wish we could get the release of the 35mm colors in the way they look in the comparisons. The Aladdin one specifically looks so good! It makes me feel like we're missing out on so much from the era it was released.

kazinator 6 hours ago

> These companies, ultimately, decide how Toy Story looks today.

LOL, what? Anyone with a Blu-Ray rip file and FFmpeg can decide how it looks to them.

  • wiseowise 3 hours ago

    And how many people will have that? Eventually they'll just go "eat ze bug" and you'll have to eat shit they give you.