nadermx 5 days ago

Tangential, but I practically owe my life to this guy. He wrote the flask mega tutorial, which I followed religiously to launch my first website. Then, right before launch, I got stuck on the most critical part of my entire application: piping a fragged file in flask. He answered my stackoverflow question, I put his fix live, and the site went viral. Here's the link for posterity's sake https://stackoverflow.com/a/34391304/4180276

  • miguelgrinberg 5 days ago

    You have made my day, sir. :)

    • LostMyLogin 5 days ago

      When I was in college I discovered the flask mega tutorial and fell in love with programming. Switched from an economics degree to software engineering and now work in the industry.

      Thank you for the work you put in.

      • barrenko 4 days ago

        Economist here, started to learn to code as an elaborate way to procrastinate on my master's thesis after I quit playing videogames.

    • xp84 5 days ago

      Absolutely love seeing like a dozen people piling on Mr Grinberg to show gratitude for his work, and indeed even the little things he does to help uplift others in the field. It’s a good reminder that a small helpful contribution, or bit of teaching given at the right time, can be so valuable!

      • mathattack 4 days ago

        Please note the Buy Me Coffee button at the bottom of the post.

    • pablopudding 5 days ago

      I also want to say thank you for the Flask Mega Tutorial.

      When I started my first job as a Data Scientist, it helped me deploy my first model to production. Since then, I’ve focused much more on engineering.

      You’ve truly started an amazing journey for me.

      Thank you. :)

    • hangonhn 5 days ago

      Whoa! You're here! Well, I think a lot of us owe you a debt of gratitude. Thank you for all you've done for the Python and Flask community.

    • nessad 5 days ago

      I also want to chime in and say how you changed my life. I did the same Flask megatutorial, and that led me to leaving helpdesk and becoming a support engineer. Years later, I'm now in big tech. Thanks Miguel!

    • c0balt 5 days ago

      Thank you for the Flask Tutorial, it got me started in web development and down the line into systems development.

    • indigodaddy 5 days ago

      I came way late to the game, so I went more toward the video side, and I have the same feelings about Pretty Printed; love his stuff.

      But just now checking out the Mega Flask Tutorial, wow looks pretty awesome.

    • pkphilip 4 days ago

      Amazing to see all of the people thanking you! Great to see that gratitude is still alive and well. You seem to have touched a lot of lives through that mega tutorial! Wow!

    • jaza 4 days ago

      I learnt a lot from your numerous Flask blog posts over the years. Your blog is often better than the official Flask docs. Kudos to you, Miguel!

    • frakkingcylons 5 days ago

      I also used your tutorial to get started with web development, and it helped me get my first job about 11 years ago. Thanks a lot!

    • mmasu 4 days ago

      I too started with your tutorial - thanks a million

    • Celeo 5 days ago

      I also got started in webdev and built a few sites from your tutorial. Thank you!

    • naldb 5 days ago

      I also learnt a lot from your Flask tutorial. Thank you.

  • wiseowise 5 days ago

    > flask

    Off-topic, but I absolutely loathe the new Flask logo. The old one[0] has this vintage, crafty feel. And the new one[1] looks like it was made by a starving high schooler experimenting with WordArt.

    [0] - https://upload.wikimedia.org/wikipedia/commons/3/3c/Flask_lo...

    [1] - https://flask.palletsprojects.com/en/stable/_images/flask-na...

    • Stratoscope 5 days ago

      I hope they go Full Cracker Barrel on this:

      1. Original logo has country charm and soul.

      2. Replaced with a modern soulless logo.

      3. Customer outrage!

      4. Company (or open source project) comes to its senses and returns to old logo.

      https://media.nbcboston.com/2025/08/cracker-barrel-split.jpg

      (n.b. The Cracker Barrel Rebellion is sometimes associated with MAGA. I am very far from that, but I have to respect when people of any political stripe get something right.)

      • b00ty4breakfast 4 days ago

        the funny thing about the Cracker Barrel brouhaha is that the new one still looked like something you'd find on a pack of matches from a hotel bar in the 70s.

        • janc_ 3 days ago

          It looked like Cracker Barrel's own logo from the 1960s/1970s, IIRC.

      • UltraSane 5 days ago

        The Cracker Barrel "controversy" seems to have largely been fueled by bots.

      • swyx 5 days ago

        ah, the New Coke Gambit

    • BreakingProd 5 days ago

      I was unaware of the new logo… and I am just realizing for the first time after many many Flask apps… that the logo is not a chili pepper.

      • w-ll 5 days ago

        This logo is bad... not even talking about the mark, the fonts are wtf. The uppercase 'F' is shorter than the lowercase 'l' and 'k', the 'a' and the 'k' are bad, and even the angle of the lower bar on the 'f' is just... eww. And then the mark. I don't get any of this.

      • nkozyra 5 days ago

        Using a chili pepper as a flask could work, though it's not necessarily recommended.

      • jonpurdy 4 days ago

        I was going to post the same thing; glad I searched for 'chili' and found your comment.

      • doctaj 4 days ago

        I feel dumb - I thought it was a chili pepper, too.

    • Imustaskforhelp 5 days ago

      I didn't know they had a new logo before reading your comment. It's been 2 years since I last searched for flask, but yeah, the old logo was vintage and I preferred it; the new logo feels mid/sucks.

      The old logo is much better.

      • actionfromafar 5 days ago

        New logo is instantly forgettable. Would disappear as an app icon on a phone home screen, forever mistaken for a bank app.

      • hackernewds 5 days ago

        The old logo is impossible to resize and present on any assets that aren't rectangular. Flask isn't a podunk country restaurant.

    • echelon 5 days ago

      Oh God, that's not it.

      The old logo is classic and bespoke. I could recall it from memory. It makes an impression.

      The new one looks like an unfunded 2005-era dorm room startup. XmlHttpRequests for sheep herders.

      • cap11235 5 days ago

        No, it looks like a Disney Channel show from 2008 that had one season.

      • [removed] 5 days ago
        [deleted]
    • thaumasiotes 5 days ago

      Huh. What most stands out to me about the logo, old and new, is that it clearly depicts a drinking horn instead of a flask.

    • foresto 5 days ago

      The old logo would seem at home on a shelf of classic O'Reilly books. :)

    • saltcured 5 days ago

      I think it should not have a logo, so it is left to interpretation.

      Thinking about hand-rolled web services, I usually imagine either a stealth alcoholic's metal flask or a mad scientist's Erlenmeyer flask.

    • zestyping 4 days ago

      Goodness gracious, that font in the new logo is the most hideous font I've seen in a very long time.

    • varispeed 5 days ago

      The new logo looks like a device some tribesmen use to cover their member.

    • WD-42 5 days ago

      What the…? I guess I’ve been reaching for FastAPI instead of flask these days because I had no idea this happened. Didn’t all the pallets projects have the old timey logos? I wonder what happened.

    • parlortricks 5 days ago

      yikes, that is not a great logo. it has also lost its essence

      • Stratoscope 5 days ago

        In fact, when I saw the new logo, the first thing that came to my mind was Brigadier General Jack D. Ripper in Dr. Strangelove saying "I deny them my essence."

      • travisgriggs 5 days ago

        But this seems to me to be the gestalt of modern design. Less less less. Until it is no more.

        I also hate the new ones. And most of what modern design pumps out nowadays.

    • guywithahat 4 days ago

      Counterpoint: The old logo looks like it's for a piece of software that stopped being maintained 15 years ago

    • Terretta 4 days ago

      For [1] they picked clip-art of a crown molding cross section.

    • callamdelaney 4 days ago

      Yeah I yearn to go back to flask but the logo is giving me the ick.

    • coldtea 5 days ago

      The usual crap when either some "business" or some "designer" types come in

    • AlienRobot 4 days ago

      Is it just me, or has there never been a single logo update in history that actually improved a logo?

      A once-whimsical corner of web development has lost its charm due to arbitrary trends.

  • svieira 5 days ago

    Nice story! My guess is that the site was https://yout.com/ given your profile. Does it still run Flask?

    • nadermx 5 days ago

      It's all grown up now. Runs on Django for the admin panel. Not that flask ever failed. Just became easier to manage the user base that way.

      • swyx 5 days ago

        because of Django admin? any downsides/notable warnings for people considering Flask v Django? any migration guide that's helpful?

  • signalblur 5 days ago

    Thanks for sharing this story. It goes to show how much of a difference being kind and helping a stranger can make.

    Hope I'm able to do the same for someone one day :)

  • lucb1e 5 days ago

    For anyone else wondering whether to click to find out what "fragged file" means: no, it's not about Quake, and the linked page does not mention 'frag' at all. The question asks how to stream a file to the client in Flask, as opposed to reading it all into memory at once and then sending it on. I figured as much (also because of e.g. IP fragmentation), but this is the first time I've heard this alternative term for streaming.

  • jbs789 5 days ago

    Similar story here. Pleasant to work with too.

    The accessibility of this material, and of the broader Python ecosystem, is truly incredible. After reflecting on this recently, I started finding ways to give back/donate/contribute.

  • rnikko 4 days ago

    Same here with following the mega tutorial. Truly one of the goats.

  • ohduran 4 days ago

    Same happened to me; I owe a career to having gone through his Mega Tutorial. Miguel if you're reading this, thank you from the bottom of my heart.

  • Izikiel43 4 days ago

    When I saw you were using readlines to read a binary file I thought wtf at first; seems like he noticed as well.

  • robertlagrant 4 days ago

    The other answer to your question there is why Flask is so good. One short file and you have a backend and a frontend!

  • pietroppeter 4 days ago

    Yet another appreciation story for Miguel’s mega tutorial. In 2017 I used it to create our wedding site and learn a bit of web dev (my background is in data science). To motivate myself to actually do it, I used the strategy of funding the then-ongoing refactoring of the tutorial. I am still very fond and proud of that first time I actually went and funded an open source effort; it gives you back more than you might expect.

  • tomhow 5 days ago

    We fixed the typo in the first sentence: ow -> owe. Hope that's okay!

    Edit: corrected typo in "typo".

  • Y_Y 5 days ago

    Cool story, but was your life really at risk in that situation?

    • nadermx 5 days ago
      • Y_Y 5 days ago

        > Brazil Advances Criminal Prosecution of American Yout.com Operator

        Touché! I see sibling comments assuming I was being sarcastic (without mandatory sarcasm tag!), but what I was really hoping for was more backstory like this. I guess it depends on how you read things in your head.

    • jryb 5 days ago

      Not all statements should be interpreted literally.

      • shoobiedoo 5 days ago

        You just took the wind right out of his sails

    • [removed] 5 days ago
      [deleted]
    • DANmode 3 days ago

      At risk of not being programming?

      Seemingly.

  • AtlasBarfed 5 days ago

    Did you throw any money his way?

sroussey 5 days ago

Please don’t write benchmarks that take timings inside the loop and accumulate a sum. Just time the whole loop and divide by the number of iterations. Getting the time itself has overhead, and the jitter can mess with results.
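
For illustration, a minimal sketch of that approach (the workload and iteration count here are made up): call the clock once around the whole loop and divide by the iteration count, rather than timing each iteration and summing.

    import time

    def work():
        # stand-in for whatever is being measured
        return sum(i * i for i in range(1_000))

    N = 10_000
    start = time.perf_counter()
    for _ in range(N):
        work()
    elapsed = time.perf_counter() - start
    print(f"avg per call: {elapsed / N * 1e6:.2f} microseconds")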

  • mjevans 4 days ago

    The real-world benchmark is measuring it from invocation, both for cold launches and 'hot' ones (data cached from the last run).

    Interestingly, I might have only ever used the shell builtin time command. GNU's time command prints a bunch of other performance stats as well.

    • rocqua 4 days ago

      I'm annoyed every time I have to write $(which time), but the stats GNU time gives with -v are just so much more valuable.

      • Tom1380 4 days ago

        Wouldn't it also work with "env time" if that's easier to type?

didip 5 days ago

Every time I hear news about the Python language itself, it saddens me that, in 2025, PyPy is still a separate, distinct track from mainline Python.

That said, I wonder if GIL-less Python will one day enable GIL-less C FFI? That would be a big win that Python needs.

  • taleinat 4 days ago

    The biggest thing PyPy adds is JIT compilation. This is precisely what the project to add JIT to CPython is working on these days. It's still early days for the project, but by 3.15 there's a good chance we'll see some really great speedups in some cases.

    It's worth noting that PyPy devs are in the loop, and their insights so far have been invaluable.

  • petters 5 days ago

    > That said, I wonder if GIL-less Python will one day enable GIL-less C FFI?

    What do you mean exactly? C FFI has always been able to release the GIL manually.
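
    For illustration, a minimal Python-side sketch (assuming a POSIX system where ctypes can locate libc): ctypes releases the GIL around foreign calls, which is why blocking work done in C can overlap across threads even on a GIL build.

        import ctypes
        import ctypes.util
        import threading
        import time

        # load libc; fall back to the global symbol namespace on POSIX if find_library fails
        libc = ctypes.CDLL(ctypes.util.find_library("c") or None)

        def blocking_call():
            libc.sleep(1)  # blocks in C; ctypes drops the GIL for the duration

        start = time.perf_counter()
        threads = [threading.Thread(target=blocking_call) for _ in range(4)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        # roughly 1 second, not 4, because the C-level sleeps ran concurrently
        print(f"elapsed: {time.perf_counter() - start:.2f}s")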

  • nu11ptr 5 days ago

    > That said, I wonder if GIL-less Python will one day enable GIL-less C FFI? That would be a big win that Python needs.

    I'm pretty sure that is what freethreading is today? That is why it can't be enabled by default AFAIK, as several C FFI libs haven't gone "GIL-less" yet.

  • 8organicbits 5 days ago

    Can you clarify the concern? Coming from C, I've come to expect many dialects across many compiler implementations. It seems healthy and encourages experimentation. Is it not a sign of a healthy language ecosystem?

    PyPy's compatibility gap with CPython seems very minor in comparison: https://pypy.org/compat.html

  • natdempk 5 days ago

    Well, they added an experimental JIT, so that is one step closer to PyPy? Though I would assume the trajectory is to build a new JIT rather than merge in PyPy; hopefully people learned a lot from PyPy.

  • freddie_mercury 5 days ago

    How do you see that changing?

    Python introducing another breaking change that also randomly affects performance, making it worse for large classes of users?

    Why would the Python organisers want to do that?

  • ActorNightly 5 days ago

    I don't understand why C FFI is that popular.

    The amount of time spent writing all the cffi stuff is about the same as it takes to write an executable in C and call it from Python.

    The only time cffi is useful is if you want to have that code be dynamic, which is a very niche use case.

    • Too 4 days ago

      You write the FFI once and let hundreds or thousands of other developers use it. For one-off executables it rarely makes sense.

      Mixing in other libraries provided by the Python ecosystem is another scenario. Do you really want to do HTTP in C, or do you prefer requests?

    • eternauta3k 4 days ago

      Could you go into more detail? How would you build e.g. numpy without FFI?

      • jononor 4 days ago

        These days you could probably build a pretty performant numpy-like using shared memory with the Arrow format and IPC for control. Though it would be considerably more complex and not at all easier than FFI...

    • KeplerBoy 4 days ago

      We need the FFI to share memory in-process with C functions?

  • cap11235 5 days ago

    [flagged]

    • [removed] 5 days ago
      [deleted]
    • freddie_mercury 5 days ago

      Guido stepped down over 7 years ago. How out of touch are you?

    • rowanG077 5 days ago

      Who are you talking about? Python hasn't had a dictator for ages now.

amelius 5 days ago

I hope it doesn't get stuck at 3.14, like TeX.

https://www.reddit.com/r/RedditDayOf/comments/7we430/donald_...

  • feoren 5 days ago

    You hope it doesn't?

    > [Donald Knuth] firmly believes that having an unchanged system that will produce the same output now and in the future is more important than introducing new features

    This is such a breath of fresh air in a world where everything is considered obsolete after like 3 years. Our industry has a disease, an insatiable hunger for newness over completeness or correctness.

    There's no reason we can't be writing code that lasts 100 years. Code is just math. Imagine having this attitude with math: "LOL loser you still use polynomials!? Weren't those invented like thousands of years ago? LOL dude get with the times, everyone uses Equately for their equations now. It was made by 3 interns at Facebook, so it's pretty much the new hotness." No, I don't think I will use "Equately", I think I'll stick to the tried-and-true idea that has been around for 3000 years.

    Forget new versions of everything all the time. The people who can write code that doesn't need to change might be the only people who are really contributing to this industry.

    • kibwen 5 days ago

      > There's no reason we can't be writing code that lasts 100 years. Code is just math.

      In theory, yes. In practice, no, because code is not just math, it's math written in a language with an implementation designed to target specific computing hardware, and computing hardware keeps changing. You could have the complete source code of software written 70 years ago, and at best you would need to write new code to emulate the hardware, and at worst you're SOL.

      Software will only stop rotting when hardware stops changing, forever. Programs that refuse to update to take advantage of new hardware are killed by programs that do.

      • KK7NIL 5 days ago

        This is a total red herring; x86 has over 30 years of backwards compatibility, and the same goes for the basic peripherals.

        The real reason for software churn isn't hardware churn, but hardware expansion. It's well known that software expands to use all available hardware resources (or even more, according to Wirth's law).

      • aj_hackman 4 days ago

        The bare minimum cost of software churn is the effort of one human being, which is far less than hardware churn (multiple layers of costly design and manufacturing). As a result, we see hardware change gradually over the years, while software projects can arbitrarily deprecate, change, or remove anything at a whim. The dizzying number of JS frameworks, the replacement of X with Wayland or init with systemd, removal of python stdlib modules, etc. etc. have nothing to do with new additions to the x86 instruction set.

      • 9rx 4 days ago

        > and computing hardware keeps changing.

        Only if you can't reasonably buy a direct replacement. That might have been a bigger problem in the early days of computing where people spread themselves around, leaving a lot of business failures and thus defunct hardware, but nowadays we all usually settle on common architectures that are very likely to still be around in the distant future due to that mass adoption still providing strong incentive for someone to keep producing it.

      • disentanglement 5 days ago

        TeX is written in a literate programming style which is more akin to a math textbook than ordinary computer code, except with code blocks instead of equations. The actual programming language in the code blocks and the OS it runs on matters a lot less than in usual code where at best you get a few sparse comments. Avoiding bit rot in such a program is a very manageable task. In fact, iirc the code blocks which end up getting compiled and executed for TeX have been ported from Pascal to C at some point without introducing any new bugs.

        • Quekid5 4 days ago

          The C version of TeX is also terrible code in the modern day (arbitrary limits, horrible error handling, horrible macro language, no real Unicode support, etc. etc), hence LuaTeX (et al.) and Typst and such.

          The backward-compat story is also oversold because, yes, baseline TeX is backward compatible, but I bet <0.1% of "TeX" documents don't use some form of LaTeX plus any number of packages... which sometimes break, at which point the stability of base TeX doesn't matter for actual users. It certainly helps LaTeX package maintainers, but that doesn't matter to users.

          Don't get me wrong, TeX was absolutely revolutionary and has been used for an insane amount of scientific publishing, but... it's not great code (for modern requirements) by any stretch.

      • api 5 days ago

        This is correct when it comes to bare metal execution.

        You can always run code from any time with emulation, which gives the “math” the inputs it was made to handle.

        Here’s a site with a ton of emulators that run in browser. You can accurately emulate some truly ancient stuff.

        https://www.pcjs.org/

      • 7952 5 days ago

        Given how mature emulation is now why couldn't that just continue to be possible into the future?

    • dieggsy 5 days ago

      Are you by chance a Common Lisp developer? If not, you may like it (well, judging only by your praise of stability).

      Completely sidestepping any debate about the language design, ease of use, quality of the standard library, size of community, etc... one of its strengths these days is that standard code basically remains functional "indefinitely", since the standard is effectively frozen. Of course, this requires implementation support, but there are lots of actively maintained and even newer options popping up.

      And because extensibility is baked into the standard, the language (or its usage) can "evolve" through libraries in a backwards compatible way, at least a little more so than many other languages (e.g. syntax and object system extension; notable example: Coalton).

      Of course there are caveats (like true, performant async programming) and it seems to be a fairly polarizing language in both directions; "best thing since sliced bread!" and "how massively overrated and annoying to use!". But it seems to fit your description decently at least among the software I use or know of.

      • feoren 5 days ago

        I respect and understand the appeal of LISP. It is a great example of code not having to change all the time. I personally haven't had a compelling reason to use it (post college), but I'm glad I learned it and I wouldn't be averse to taking a job that required it.

        While writing "timeless" code is certainly an ideal of mine, it also competes with the ideals of writing useful code that does useful things for my employer or the goals of my hobby project, and I'm not sure "getting actual useful things done" is necessarily LISP's strong suit, although I'm sure I'm ruffling feathers by saying so. I like more modern programming languages for other reasons, but their propensity to make backward-incompatible changes is definitely a point of frustration for me. Languages improving in backward-compatible ways is generally a good thing; your code can still be relatively "timeless" in such an environment. Some languages walk this line better than others.

        • lycopodiopsida 4 days ago

          I think, the "useful" part is more covered by libraries than everything else, and the stability and flexibility of the core language certainly helps with that. Common Lisp is just not very popular (as every lisp) and does not have a very big ecosystem, that's it.

          Another point for stability is about how much a runtime can achieve if it is constantly improved over decades. Look where SBCL, a low-headcount project, is these days.

          We should be very vigilant and ask for every "innovation" whether it is truly one. I think it is fair to assume for every person working in this industry for decades that the opinion would be that most innovations are just fads, hype and resume-driven development - the rest could be as well implemented as a library on top of something existing. The most progress we've had was in tooling (rust, go) which does not require language changes per se.

          I think the frustrating part about modern stacks is not the overwhelming amount of novelty, it is just that it feels like useless churn and the solutions are still as mediocre as, or even worse than, what we've had before.

    • psychoslave 5 days ago

      Stability is for sure a very seductive trait. I can also totally understand the fatigue of chasing the next, already-almost-obsolete new thing.

      >There's no reason we can't be writing code that lasts 100 years.

      There are many reasons this is most likely not going to happen. Code, despite best efforts to achieve separation of concerns (in the best case), is a highly contextual piece of work. Even for a simple program with no external libraries, there is a full compiler/interpreter ecosystem that forms a huge dependency. And the hardware platforms they abstract from are also a moving target. Change is the only constant, as we say.

      >Imagine having this attitude with math: "LOL loser you still use polynomials!? Weren't those invented like thousands of years ago?

      Well, that might surprise you, but no, they weren't. At least, they were not dealt with as they are taught and understood today in their most common contemporary presentation. When the Babylonians (c. 2000 BCE) solved quadratic equations, they didn't have anything near Descartes' algebraic notation connected to geometry, and there is a long series of evolutions in between, continuing to this day.

      Mathematicians actually do make a lot of fancy innovative things all the time. Some fundamentals stay stable over millennia, yes. But also some problems stay unsolved for millennia until some outrageous move outside the standard is made.

      • zenmac 5 days ago

        Don't know about 100 years, but old static web pages from the late 90s with JS still work on the Wayback Machine. There might be something to this static HTML/CSS approach for archiving content, maybe even little programs.

        • psychoslave 5 days ago

          Yes, and we only need a browser to achieve that, the kind of software well known to be small, light, and having only sporadic changes introduced into it. :D

          That's actually a good moment to wonder at what amazing things browsers are, really.

    • 0xDEAFBEAD 5 days ago

      To be fair, if math did have version numbers, we could abandon a lot of hideous notational cruft / symbol overloading, and use tau instead of pi. Math notation is arguably considerably worse than perl -- can you imagine if perl practically required a convention of single-letter variable names everywhere? What modern language designer would make it so placing two variable names right next to each other denotes multiplication? Sheer insanity.

      Consider how vastly more accessible programming has become from 1950 until the present. Imagine if math had undergone a similar transition.

      • vovavili 4 days ago

        Math personally "clicked" for me when I started to use Python and R for mathematical operations instead of the conventional arcane notation. It did make me wonder why we insist on forcing kids and young adults to struggle through particularly counter-intuitive ways of expressing mathematical concepts just because of historical baggage, and I am glad to hear now that I am not the only one who thinks this way.

      • tyg13 4 days ago

        What in the Hacker News in this comment?

        Mathematical notation evolved to its modern state over centuries. It's optimized heavily for its purpose. Version numbers? You're being facetious, right?

      • hansvm 5 days ago

        If the compiler forbade syntactic ambiguity from implicit multiplication and had a sensible LSP allowing it to be rendered nicely, I don't think that'd be such a bad thing. Depending on the task at hand you might prefer composition or some other operation, but when reducing character count allows the pattern recognition part of our brain to see the actual structure at hand instead of wading through character soup it makes understanding code much easier.

        • 0xDEAFBEAD 4 days ago

          Yep, this explains why the APL programming language was so ridiculously successful.

      • [removed] 4 days ago
        [deleted]
    • sacado2 4 days ago

      > There's no reason we can't be writing code that lasts 100 years. Code is just math. Imagine having this attitude with math: "LOL loser you still use polynomials!? Weren't those invented like thousands of years ago? LOL dude get with the times, everyone uses Equately for their equations now. It was made by 3 interns at Facebook, so it's pretty much the new hotness." No, I don't think I will use "Equately", I think I'll stick to the tried-and-true idea that has been around for 3000 years.

      Not sure this is the best example. Mathematical notation evolved a lot in the last thousand years. We're not using roman numerals anymore, and the invention of 0 or of the equal sign were incredible new features.

      • feoren 4 days ago

        > Mathematical notation evolved a lot in the last thousand years

        That is not counter to what I'm saying.

            Mathematical notation <=> Programming Languages.
        
            Proofs <=> Code.
        
        When mathematical notation evolves, old proofs do not become obsolete! There is no analogy to a "breaking change" in math. The closest we came to this was Godel's Incompleteness Theorem and the Cambrian Explosion of new sets of axioms, but with a lot of work most of math was "re-founded" on a set of commonly accepted axioms. We can see how hostile the mathematical community is to "breaking changes" by seeing the level of crisis the Incompleteness Theorem caused.

        You are certainly free to use a different set of axioms than ZF(C), but you need to be very careful about which proofs you rely on; just as you are free to use a very different programming language or programming paradigm, but you may be limited in the libraries available to you. But if you wake up one morning and your code no longer compiles, that is the analogy to one day mathematicians waking up and realizing that a previously correct proof is now suddenly incorrect -- not that it was always wrong, but that changes in math forced it into incorrectness. It's rather unthinkable.

        Of course programming languages should improve, diversify, and change over time as we learn more. Backward-compatible changes do not violate my principle at all. However, when we are faced with a possible breaking change to a programming language, we should think very hard about whether we're changing the original intent and paradigms of the programming language and whether we're better off basically making a new spinoff language or something similar. I understand why it's annoying that Python 2.7 is around, but I also understand why it'd be so much more annoying if it weren't.

        Surely our industry could improve dramatically in this area if it cared to. Can we write a family of nested programming languages where core features are guaranteed not to change in breaking ways, and you take on progressively more risk as you use features more to the "outside" of the language? Can we get better at formalizing which language features we're relying on? Better at isolating and versioning our language changes? Better at time-hardening our code? I promise you there's a ton of fruitful work in this area, and my claim is that that would be very good for the long-term health and maturation of our discipline.

    • ants_everywhere 5 days ago

      > There's no reason we can't be writing code that lasts 100 years. Code is just math

      Math is continually updated, clarified and rewritten. 100 years ago was before the Bourbaki group.

      • feoren 4 days ago

        > Math is continually updated, clarified and rewritten

        And yet math proofs from decades and centuries ago are still correct. Note that I said we write "code that lasts", not "programming languages that never change". Math notation is to programming languages as proofs are to code. I am not saying programming languages should never change or improve. I am saying that our entire industry would benefit if we stopped to think about how to write code that remains "correct" (compiling, running, correct behavior) for the next 100 years. Programming languages are free to change in backward-compatible ways, as long as once-correct code stays correct. And it doesn't have to be all code, but you know what they say: there is nothing as permanent as a temporary solution.

    • AceJohnny2 5 days ago

      > an insatiable hunger for newness over completeness or correctness.

      I understand some of your frustration, but often the newness is in response to a need for completeness or correctness. "As we've explored how to use the system, we've found some parts were missing/bad and would be better with [new thing]". That's certainly what's happening with Python.

      It's like the Incompleteness Theorem, but applied to software systems.

      It takes a strong will to say "no, the system is Done, warts and missing pieces and all. Deal With It". Everyone who's had to deal with TeX at any serious level can point to the downsides of that.

    • cess11 4 days ago

      If you look at old math treatises from important historical people you'll notice that they use very different notation from the one you're used to. Commonly concepts are also different, because those we use are derived over centuries from material produced without them and in a context where it was traditional to use other concepts to suss out conclusions.

      But you have a point, and it's not just "our industry"; society at large has abandoned the old in favour of incessant forgetfulness and distaste for tradition and history. I'm by no means a nostalgic, but I still mourn the harsh disjoint between contemporary human discourse and the historical. Some nerds still read Homer and Cicero and Goethe and Ovid and so on, but if you use a trope from any of those that would have been easily recognisable as such by Europeans for much of the last millennium, you can be quite sure that it won't generally be recognised today.

      This also means that a lot of early and mid-modern literature is partially unavailable to contemporary people, because it was traditional to implicitly use much older motifs and riff on them when writing novels and making arguments, and unless you're aware of that older material you'll miss out on it. For e.g. Don Quixote most would need an annotated version which points out and makes explicit all the references and riffing, basically destroying the jokes by explaining them upfront.

    • dhosek 5 days ago

      Worth noting that few people use the TeX executable as specified by Knuth. Even putting aside the shift to pdf instead of dvi output, LaTeX requires an extended TeX executable with features not part of the Knuth specification from 1988.

      Btw, while equations and polynomials are conceptually old, our contemporary notation is much younger, dating to the 16th century, and many aspects of mathematical notation are younger still.

    • stingraycharles 4 days ago

      This philosophy may have its place in some communities, but Python is definitely not one of them.

      Even C/C++ introduces breaking changes from time to time (after decades of deprecation though).

      There’s no practical reason why Python should commit to a 100+ year code stability, as all that comes at a price.

      Having said that, Python 2 -> 3 is a textbook example of how not to do these things.

      • procaryote 4 days ago

        Python is pretty much on the other extreme: 3.x → 3.y should be expected to break things, there's no "compatibility mode" to avoid breaking things, and the reasons for the breakage can be purely aesthetic bikeshedding.

        C, in contrast, generally versions the breaking changes in the standard, and you can keep targeting an older standard on a newer compiler if you need to, and many do.

    • AlphaSite 5 days ago

      While I think LaTeX is fantastic, I think there is plenty of low-hanging fruit to improve upon... the ergonomics of the language and its macros aren't great. If nothing else there should be a better investment in tooling and ecosystem.

    • pooyamo 5 days ago

      Some stuff like LAPACK and BLAS fit your bill. They are math libraries written decades ago and still in use.

    • __alexs 4 days ago

      Mathematical notation has changed over the years. Is Diophantus' original system of polynomials that legible to modern mathematicians? (Even if you ignore the literally-being-written-in-ancient-Greek part.)

    • OisinMoran 4 days ago

      I agree somewhat with your sentiment and have some nostalgia for a time when software could be finished, but the comment you're replying to was making a joke that I think you may have missed.

    • lxgr 4 days ago

      > There's no reason we can't be writing code that lasts 100 years. Code is just math.

      The weather forecast is also “just math”, yet yesterday’s won’t be terribly useful next April.

      • feoren 4 days ago

        No, weather forecasting models are "just math". The forecast itself is an output of the model. I sure hope our weather forecasting models are still useful next year!

            weather forecasting models <=> code <=> math
        
            weather forecast <=> program output <=> calculation results
        
        So all you're saying is that we should not expect individual weather forecasts, program output, and calculation results to be useful long-term. Nobody is arguing that.
        • lxgr 4 days ago

          That's why I said "[yesterday's] weather forecast" and not "weather forecast models".

          But my larger point actually also stands: Weather forecast models also, in the end, incorporate information about geography, likely temperature conditions etc., and might not be stable over 100 years.

          The more interesting question is probably: Is Python more like the weather or a weather forecasting model? :)

    • nurettin 5 days ago

      My C++ from 2005 still compiles! (I used boost 1.32)

      Most of my python from that era also works (python 3.1)

      The problem is not really the language syntax, but how libraries change a lot.

    • denzil 5 days ago

      Kinda related question, but is code really just math? Is it possible to express things like user input, timings, interrupts, error handling, etc. as math?

      • CableNinja 5 days ago

        I would slightly sort of disagree that code is just math when you really boil it down. However, if you take a simple task, say, printing hello world to the output, you could actually break that down into a mathematical process. You can mathematically say that at time T the value of O will be the value of index N of input X, so over a period of time you eventually get "hello world" as the final output.

        Howeveeerrr... it's not quite math when you break it down to the electronics level, unless you go really wild (wild meaning physics math). Take a breakdown of Python to assembly to the binary that flips the transistors doing the thing. You can mathematically define that each transistor will be Y when the value of O is X(N); btw sorry, I can't think of a better way to define such a thing from mobile here. And go further by defining the voltages to be applied, when and where, all mathematically.

        In reality it's done in sections. At the electronic level, math defines your frequency, voltage levels, timing, etc.; at the assembly level, math defines what comparisons of values to make or what address to shift a value to and how to determine your output; lastly, your interpreter determines what assembly to use based on the operations you give it. Based on those assembly operations, an "if A == B then C" statement in code is actually a binary comparator that checks if the value at address A is the same as the value at address B.

        You can get through a whole stack with math, but much of it has been abstracted away into easy building blocks that don't require solving a huge math equation in order to actually display something.

        You can even find mathematical data among the datasheets for electronic components. They say (for example) that over period T you can't exceed V volts or W watts, or that to trigger a high value you need voltage V for period T but cannot exceed current I. You can define all of your components and operations as an equation, but I don't think it's really done as a practice anymore; the complexity of doing so (for someone not building a CPU or any IC) isn't useful unless you're working on a physics paper or quantum computing, etc.

      • api 5 days ago

        Isn’t it possible to express anything as math? With sufficient effort that is.

    • [removed] 5 days ago
      [deleted]
    • Razengan 4 days ago

      > This is such a breath of fresh air in a world where everything is considered obsolete after like 3 years.

      I dunno man, there's an equal amount of bullshit that still exists only because that's how it was before we were born.

      > Code is just math.

      What?? No. If it was there'd never be any bugs.

      • feoren 4 days ago

        > > Code is just math.

        > What?? No. If it was there'd never be any bugs.

        Are you claiming there is no incorrect math out there? Go offer to grade some high-school algebra tests if you'd like to see buggy math. Or Google for amateur proofs of the Collatz Conjecture. Math is just extremely high (if not all the way) on the side of "if it compiles, it is correct", with the caveat that compilation only can happen in the brains of other mathematicians.

        • Razengan 4 days ago

          That's human error. "Correctness vs. mistakes" applies to all human languages too, English etc.

          In math, `a - b` doesn't occasionally become `b - a` if one CPU/thread/stream finishes before an other, just to give one example.

          Or, if you write `1 + 2` it will forever be `1 + 2`, unlike code where it may become `3 / 4 - 5 + 6 ^ 7 + 1 + 2` or whatever junk gets appended before or after your expression tomorrow (analogy for the OS/environment your code runs in)

          I guess to put it simply: code is affected by its environment, math isn't.

    • bitwize 5 days ago

      Except uh, nobody uses infinitesimals for derivatives anymore, they all use limits now. There's still some cruft left over from the infinitesimal era, like this dx and dy business, but that's just a backwards compatibility layer.

      Anyhoo, remarks like this are why the real ones use Typst now. TeX and family are stagnant, difficult to use, difficult to integrate into modern workflows, and not written in Rust.

      • feoren 5 days ago

        > the real ones use Typst now

        Are you intentionally leaning into the exact caricature I'm referring to? "Real programmers only use Typstly, because it's the newest!". The website title for Typst when I Googled it literally says "The new foundation for documents". Its entire appeal is that it's new? Thank you for giving me such a perfect example of the symptom I'm talking about.

        > TeX and family are stagnant, difficult to use, difficult to integrate into modern workflows, and not written in Rust.

        You've listed two real issues (difficult to use, difficult to integrate), and two rooted firmly in recency bias (stagnant, not written in Rust). If you can find a typesetting library that is demonstrably better in the ways you care about, great! That is not an argument that TeX itself should change. Healthy competition is great! Addiction to change and newness is not.

        > nobody uses infinitesimals for derivatives anymore, they all use limits now

        My point is not that math never changes -- it should, and does. However, math does not simply rot over time, like code seems to (or at least we simply assume it does). Math does not age out. If a math technique becomes obsolete, it's only ever because it was replaced with something better. More often, it forks into multiple different techniques that are useful for different purposes. This is all wonderful, and we can celebrate when this happens in software engineering too.

        I also think your example is a bit more about math pedagogy than research -- infinitesimals are absolutely used all the time in math research (see Nonstandard Analysis), but it's true that Calculus 1 courses have moved toward placing limits as the central idea.

      • AAAAaccountAAAA 5 days ago

        Even if Typst was going to replace TeX everywhere right now, about half a century would still be a respectable lifespan for a software project.

      • erichocean 5 days ago

        > nobody uses infinitesimals for derivatives anymore

        All auto-differentiation libraries today are built off of infinitesimals via Dual numbers. Literally state of the art.

  • rich_sasha 4 days ago

    It did previously get stuck on 2.7; it might have an affinity for mathematical constants.

  • ForceBru 5 days ago

    LMAO that actually fits really well given all the πthon jokes

wbolt 5 days ago

More than 300 comments here and still no convincing answer. Why does the community waste time trying to make CPython faster when there is PyPy, which is already much faster? I understand PyPy lacks libraries and feature parity with up-to-date CPython. But… can’t everyone refocus the efforts and just move to PyPy, add all the missing bits, and then continue with PyPy as the “official Python”? Are there any serious technical reasons not to do it?

  • ActorNightly 4 days ago

    > Are there any serious technical reasons not to do it?

    Yes.

    First is startup time. A fast REPL cycle is a big advantage for development. From a business perspective, dev time is more expensive than compute time by orders of magnitude. Every time you make a change, you have to recompile the program. Meanwhile, with regular Python, you can literally develop during execution.

    Second is compatibility. Numpy and pytorch are ever evolving, and those are written as C extensions.

    Third is LLMs. If you really want speed, Gemma27bqat running on a single 3090 can translate a Python codebase into C/C++ pretty easily. No need to have any additional execution layer. My friend at Amazon pretty much writes Java code this way - prototypes a bunch of stuff in Python, and then has an LLM write the Java code that's compatible with existing intra-Amazon Java templates.

    • procaryote 4 days ago

      I really hope I'll never need to touch code written by people who code in Python and throw it at a plausible randomiser to get Java or C.

      If you do this for some reason, please keep the Python around so I can at least look at whatever the human was aiming at. It's probably also wrong, given they picked this workflow, but there's a chance it has something useful.

      • ActorNightly 4 days ago

        LLMs are there to get the meat of the software in. Fine tuning it is easy when you already have all the syntax written for you. With enough prompting on how you want the code laid out, the modern models do a really good job of getting it right with very minor things you have to tweak.

      • mystifyingpoi 4 days ago

        I get the "old man yells at cloud" vibes from your comment. Who cares how he got the result? I thought our job is to create working software. If this flow works for him and creates code that meets company standards, then more power to him.

        However, if the output quality is crap, then well, maybe his creativity should not be rewarded. I've seen a hefty amount of Map<Object, Object> in Java, written primarily by JS developers.

    • wbolt 4 days ago

      The REPL, I get it. Possibly a valid point. Yet I guess the same issues apply to Node.js, which seems much faster in many cases and still has a fine dev experience.

      C compatibility / extension compatibility - nope. First, it is an issue of limited resources. Add more devs to the PyPy team and compatibility bugs get fixed. Second, aren't people writing C extensions because Python is slow? Make Python fast - like PyPy - and for some cases native code won't be that crucial.

      So I don’t see a real issue with pypy that could not be solved by simply moving all the dev efforts from CPython.

      So are there political, personal or business issues?

      • ActorNightly 4 days ago

        >C compatibility / extension compatibility - nope. First, it is an issue of limited resources.

        No, it's an issue of reinventing the wheel. Native code is native code. Numpy stuff isn't going to be faster running in PyPy, and neither is any of the ML stuff. Stuff like FastAPI or Uvicorn isn't going to see much of a speed increase.

        In the modern world, there is basically no need for middle ground performance. In the past, when you had single core processors, making things go fast was advantageous. Now, if you need to go fast, you most likely need to go REALLY fast, at which point just go full native. Otherwise, you are going to be slowed down by network calls and other factors.

        So while PyPy is a cool project that can be an optimization on top of regular Python, it's not worthwhile trying to make Python into something it will never be.

  • selcuka 5 days ago

    > can’t everyone refocus the efforts

    You have answered your own question.

    Seriously, though. PyPy is 2-3 versions behind CPython (3.11 vs 3.14) and it's not even 100% compatible with 3.11. Libraries such as psycopg and lxml are not fully supported. It's a hard sell.

    • og_kalu 4 days ago

      Pypy only has a handful of devs. If it had the PSF's official blessing, it wouldn't lag behind CPython so much.

    • wbolt 5 days ago

      But this is exactly my point. The resources PyPy has are much smaller. And still, for years they have managed to keep up, staying just 2-3 versions behind on features while well ahead on performance.

      So why not move all the resources from CPython to close the gap with features faster and replace CPython entirely?

      Since this is not happening, I expect there to be serious reasons, but I fail to see them. That is what I'm asking about.

  • ModernMech 4 days ago

    > Are there any serious technical reasons not to do it?

    Forget technical reasons, how would you ever do it? It feels like the equivalent of cultural reprogramming "You must stop using your preferred interpreter and halt all your efforts contrary to the one true interpreter". Nah, not going to happen in a free and open source language. Who would have the authority and control to make such a directive?

    Yes, there may be technical reasons, but the reason it doesn't happen more than any other is that programming languages are languages spoken by people, and therefore they evolve organically at no one's direction. Even in languages like Python with a strong bent for cultural sameness and a BDFL type direction, they still couldn't control it. Often times, dialects happen for technical reasons, but it's hard to get rid of them on technical grounds.

  • otabdeveloper4 5 days ago

    > pypy which is already much faster

    It isn't.

    • bjoli 4 days ago

      For all my applications, going to PyPy was an instant 2x improvement.

      Not only that, it is a lot easier to hack on. I might be biased, but the whole implementation idea of PyPy seems a lot more sane.

    • MobiusHorizons 4 days ago

      I think for pure python performance it is significantly faster at least on all the benchmarks I have seen. That said a lot of what people actually do in python calls into libraries that are written in C++ or C, which I believe has a similar performance (when it works) on pypy.

anaccount342 5 days ago

I don't know how realistic it is to only use a benchmark built around tight loops and integer operations. Something with hashmaps and strings more realistically represents everyday CPU code in Python; most Python users offload numeric code to external calls.
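
For illustration, a minimal sketch of the kind of dict-and-string workload meant here (the size and key format are arbitrary), written so it could drop into the same sort of timing harness:

    import time

    def dict_string_workload(n=200_000):
        counts = {}
        for i in range(n):
            key = f"user:{i % 1000}:{i % 7}"   # string formatting + hashing
            counts[key] = counts.get(key, 0) + 1
        return sum(counts.values())

    start = time.perf_counter()
    total = dict_string_workload()
    print(f"{total} dict updates in {time.perf_counter() - start:.3f}s")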

  • miguelgrinberg 5 days ago

    There is no "realistic" benchmark, all benchmarks are designed to measure in a specific way. I explain what my goals were in the article, in case you are curious and want to read it.

    • hshdhdhehd 4 days ago

      Running a Django app and throwing traffic at it wouldn't be bad.

  • e-khadem 5 days ago

    I agree with you; this is not an in-depth look and could have been much more rigorous.

    But then I think in some ways it's a much more accurate depiction of my use case. I mainly write monte-carlo simulations or simple scientific calculations for a diverse set of problems every day. And I'm not going to write a fast algorithm or use an unfamiliar library for a one-off simulation, even if the sim is going to take 10 minutes to run (yes I use scipy and numpy, but often those aren't the bottlenecks). This is for the sake of simplicity as I might iterate over the assumptions a few times, and optimized algorithms or library impls are not as trivial to work on or modify on the go. My code often looks super ugly, and is as laughably unoptimized as the bubble sort or fib(40) examples (tail calls and nested for loops). And then if I really need the speed I will take my time to write some clean cpp with zmq or pybind or numba.

  • procaryote 4 days ago

    It's still interesting though. If the most basic thing isn't notably faster, it makes it pretty likely the more complex things aren't either.

    If your actual load is 1% Python and 99% offloaded, the effect of a faster Python might not matter a lot to you, but to measure Python you kinda have to look at Python.

  • gsibble 5 days ago

    Or have it run some super common use case like a FastAPI endpoint or a numpy calculation. Yes, they are not all python, but it's what most people use Python for.

    • miguelgrinberg 5 days ago

      FastAPI is a web framework, which by definition is (or should be!) an I/O bound process. My benchmark evaluates CPU, so it's a different thing. There are a ton of web framework benchmarks out there if you are interested in FastAPI and other frameworks.

      And numpy is a) written in C, not Python, and b) is not part of Python, so it hasn't changed when 3.14 was released. The goal was to evaluate the Python 3.14 interpreter. Not to say that it wouldn't be interesting to evaluate the performance of other things as well, but that is not what I set out to do here.

      • KeplerBoy 4 days ago

        That's the thing with Python: A lot of things should be bound by all kinds of limitations, but are in practice often limited by the Python interpreter if not done carefully.

        Fundamentally for example, if you're doing some operations on numpy arrays like: c = a + b * c, interpreted numpy will be slower than compiled numba or C++ just because an eager interpreter will never fuse those operations into an FMA.
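
        For illustration, a minimal sketch (assuming numpy and numba are installed; the array size is arbitrary): the eager numpy expression materializes the b * c temporary in a separate pass, while a jitted loop does a single pass per element and gives the compiler a chance to contract the multiply-add.

            import numpy as np
            from numba import njit

            @njit
            def fused(a, b, c):
                out = np.empty_like(c)
                for i in range(c.size):
                    out[i] = a[i] + b[i] * c[i]  # one multiply-add per element, no temporary
                return out

            a, b, c = (np.random.rand(1_000_000) for _ in range(3))
            eager = a + b * c        # allocates an intermediate array for b * c
            assert np.allclose(eager, fused(a, b, c))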

      • xmcqdpt2 4 days ago

        Numpy is partly written in C but includes a lot of Python code. If you include scipy or scikit-learn or pandas, most of the code is Python calling primitive numpy C operations. I'd expect many semi-complex data science programs to benefit from improvements in the Python interpreter, especially if they weren't written in super tight numpy code.

t43562 4 days ago

For me the "criminal" thing is that Pypy exists on a shoestring and yet delivers the performance and multithreading that others gradually try to add to cpython.

Its problem is, IMO, compatibility. Long ago I wanted to run it on Yocto but something or other didn't work. I think this problem is gradually disappearing, but it could probably be solved far more rapidly with a bit of money and effort.

  • dec0dedab0de 4 days ago

    PyPy still has the GIL so the multithreading stuff is the same problem.

    However, the JIT does make things much faster

    • __alexs 4 days ago

      What happened to the STM version of PyPy with no GIL?

      • kbd 4 days ago

        I was soo excited when they announced this, but I've heard almost nothing since.

veber-alex 5 days ago

The most interesting part for me is that PyPy is faster than free threaded CPython even on multi threaded code.

Havoc 5 days ago

Really pleasing to see how smooth the non-GIL transition has been. Compared to the Python 2->3 transition, this was positively glorious.

And that it gets within spitting distance of the standard build so fast is really promising too. That hopefully means the parts not compatible with it get flushed out soon-ish.

  • jabl 4 days ago

    AFAIU the GIL is still the default, and no-GIL is a build option; you can't select it at runtime.

    The big issue is all those C extension modules; some of them might require a lot of changes to work properly in a no-GIL world.
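
    For what it's worth, a minimal sketch (using APIs that exist in CPython 3.13+) of checking at runtime whether you're on a free-threaded build and whether the GIL is currently active; note that even a free-threaded build may re-enable the GIL, for example when an extension module that doesn't declare free-threading support gets imported:

        import sys
        import sysconfig

        # 1 on free-threaded builds (configured with --disable-gil), 0/None otherwise
        free_threaded_build = bool(sysconfig.get_config_var("Py_GIL_DISABLED"))

        # sys._is_gil_enabled() exists on 3.13+; assume the GIL is on elsewhere
        gil_active = getattr(sys, "_is_gil_enabled", lambda: True)()

        print(f"free-threaded build: {free_threaded_build}, GIL enabled: {gil_active}")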

  • defraudbah 4 days ago

    It still has the GIL; likely a few more versions until we get rid of it.