fmajid 11 hours ago

Microsoft laid off the Faster CPython lead, Mark Shannon, and ended support for the project. Where does this leave the Verona project?

  • pjmlp 10 hours ago

    They belong to Microsoft Research, not DevDiv, so while that doesn't protect them from layoffs, it certainly gives them some protection being under different management.

    Microsoft Research sites tend to be based in collaborations with university research labs.

  • sitkack 3 hours ago

    Boycott Microsoft. Don't work there, don't use their products.

    • smt88 2 hours ago

      How does using their Python tools help Microsoft?

      • lucianbr 32 minutes ago

        It's a large corporation. I'm certain someone asked that question and got an answer before starting to produce Python tools. It's management's job to ask that question and get answers, you know.

      • akkad33 an hour ago

        At the least, telemetry and recognition in the software community. At worst, training their AI.

zenkey 10 hours ago

I've been programming with Python for over 10 years now, and I use type hints whenever I can because of how many bugs they help catch. At this point, I'm beginning to form a rather radical view. As LLMs get smarter and vibe coding (or even more abstract ways of producing software) becomes normalized, we'll be less and less concerned about compatibility with existing codebases because new code will be cheaper, faster to produce, and more disposable. If progress continues at this pace, generating tests with near 100% coverage and fully rewriting libraries against those tests could be feasible within the next decade. Given that, I don't think backward compatibility should be the priority when it comes to language design and improvements. I'm personally ready to embrace a "Python 4" with a strict ownership model like Rust's (hopefully more flexible), fully typed, with the old baggage dropped and all the new bells and whistles. Static typing should also help LLMs produce more correct code and make iteration and refactoring easier.
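A tiny illustrative sketch (hypothetical names) of why type hints pay off: the annotation below tells a checker like mypy that the lookup may return None, so forgetting the guard is caught before runtime.

```python
from typing import Optional

def find_age(ages: dict[str, int], name: str) -> Optional[int]:
    # .get returns None on a missing key; the annotation makes that visible
    return ages.get(name)

age = find_age({"alice": 30}, "bob")
if age is not None:   # without this guard, mypy flags `age + 1`
    print(age + 1)
```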

  • anon-3988 4 hours ago

    > I'm personally ready to embrace a "Python 4" with a strict ownership model like Rust's (hopefully more flexible), fully typed, with the old baggage dropped and all the new bells and whistles. Static typing should also help LLMs produce more correct code and make iteration and refactoring easier.

    So... a new language? I get it, except for the borrow checking: just make it GC'ed.

    But this doesn't work in practice: if you break compatibility, you also break compatibility with decades and decades of Python code in the training data.

    Interestingly, I think as we use LLMs more and more, types get even more important, since they're basically a hint to the model as well.

  • _ZeD_ 6 hours ago

    You think of code as an asset, but you're wrong: code is a cost.

    Features are what you want, along with performance, correctness, and robustness; not code.

    Older code is tested code that is known to work, with known limitations and known performance characteristics.

    • shiandow 4 hours ago

      A corollary is that, if at all possible, you should try to solve problems without code or, failing that, with less code.

      • Beltiras 4 hours ago

        Given that you want to solve problems with a computer, what is the alternative to code?

    • abirch 2 hours ago

      I agree; older code is also subject to survivorship bias. We don't see all of the code that was written alongside it and later removed or replaced (without a code repository, at least).

  • pjmlp 9 hours ago

    I think people are still fooling themselves about the relevance of 3GL languages in an AI dominated future.

    It is similar to how Assembly developers thought about their relevance until optimising compiler backends turned that into a niche activity.

    It is a matter of time, maybe a decade who knows, until we can produce executables directly from AI systems.

    Most likely we will still need some kind of formalisation tools to tame natural language uncertainties, however most certainly they won't be Python/Rust like.

    We are moving into another abstraction layer, closer to the 4GL, CASE tooling dreams.

    • dragonwriter an hour ago

      > I think people are still fooling themselves about the relevance of 3GL languages in an AI dominated future.

      I think, as happens in the AI summer before each AI winter, people are fooling themselves about both the shape and proximity of the “AI dominated future”.

      • brookst 30 minutes ago

        It will be approximately the same shape and proximity as “the Internet-dominated future” was in 2005.

    • albertzeyer 8 hours ago

      4GL and 5GL are already taken. So this is the 6GL.

      https://en.wikipedia.org/wiki/Programming_language_generatio...

      But speaking more seriously, how do you make this deterministic?

      • pjmlp 8 hours ago

        Fair enough, I should have taken a look; I stopped counting when the computer-magazine buzz about 4GLs faded away.

        Probably some kind of formal methods inspired approach, declarative maybe, and less imperative coding.

        We should take an Alan Kay and Bret Victor point of view about where AI-based programming is going to be a decade from now, not where it is today.

    • Wowfunhappy 8 hours ago

      Assemblers and compilers are (practically) deterministic. LLMs are not.

      • traverseda 6 hours ago

        LLMs are deterministic. So far every vendor feeds them random noise in addition to your prompt, though. They don't have free will or a soul or anything: feed them exactly the same tokens, and exactly the same tokens will come out.
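        A toy sketch of this point (not a real LLM): with greedy, temperature-zero decoding, the output is a pure function of the input tokens; nondeterminism comes from sampling noise added on top.

```python
# Toy "model": next-token scores depend only on the context so far.
def greedy_decode(logits_fn, prompt, steps):
    tokens = list(prompt)
    for _ in range(steps):
        logits = logits_fn(tuple(tokens))
        # argmax: always pick the highest-scoring next token
        tokens.append(max(range(len(logits)), key=logits.__getitem__))
    return tokens

model = lambda ctx: [(i * len(ctx)) % 7 for i in range(5)]
a = greedy_decode(model, [1, 2], steps=3)
b = greedy_decode(model, [1, 2], steps=3)
assert a == b   # same tokens in, same tokens out
```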

      • pjmlp 8 hours ago

        Did you miss this part?

        > Most likely we will still need some kind of formalisation tools to tame natural language uncertainties, however most certainly they won't be Python/Rust like

    • zenkey 9 hours ago

      Yes I agree this is likely the direction we're heading. I suppose the "Python 4" I mentioned would just be an intermediate step along the way.

      • sanderjd 5 hours ago

        I think the question is: What is the value of that intermediate step? It depends on how long the full path takes.

        If we're one year away from realizing a brave new world where everyone is going straight from natural language to machine code or something similar, then any work to make a "python 4" - or any other new programming languages / versions / features - is rearranging deck chairs on the Titanic. But if that's 50 years away, then it's the opposite.

        It's hard to know what to work on without being able to predict the future :)

    • sitkack 3 hours ago

      > It is a matter of time, maybe a decade who knows, until we can produce executables directly from AI systems.

      They already can.

    • krembo 4 hours ago

      Wild thought: maybe coding is a thing of the past? Given that an LLM can produce fast, deterministic results when needed, maybe a backend, for instance, could be a set of functions that are all textual specifications; by following them it can perform actions (validations, calculations, etc.), call APIs, connect to databases, and then produce output. The LLM could then auto-refine the specifications to avoid bugs and roll out the changes in real time for the next calls. Like a brain which doesn't need predefined coding instructions to fulfil a task, but just understands its scope, knows how to approach it, and learns from the past.

      • TechDebtDevin 4 hours ago

        I really want to meet these people that are letting an LLM touch their db.

        • krembo 3 hours ago

          Fast forward to the near future: why wouldn't it, with the correct restrictions? For instance, would you let it run SELECT queries today? As Hemingway supposedly said, "we've already established what you are; now we're just haggling over the price".

  • kryptiskt 10 hours ago

    I'd think LLMs would be more dependent on compatibility than humans, since they need training data in bulk. Humans can adapt with a book and a list of language changes, and a lot of grumbling about newfangled things. But an LLM isn't going to produce Python++ code without having been trained on a corpus of such code.

    • johnisgood 9 hours ago

      It should work if you feed it the data yourself, or at the very least the documentation. I do this with niche languages and it seems to work more or less, but you have to pay attention to your context length, and of course if you start a new chat, you are back to square one.

    • energy123 7 hours ago

      I don't know if that's a big blocker now that we have abundant synthetic data from an RL training loop, where language-specific things like syntax can be learned without any human examples. Human code may still be relevant for learning best practices, but even then it's not clear that can't happen via transfer learning from other languages; it might even emerge naturally if the synthetic problems and rewards are designed well enough. It's still very early days (7-8 months since the o1 preview), so drawing conclusions from current difficulties over a 2-year time frame would be questionable.

      Consider a language designed only FOR an LLM, and a corresponding LLM designed only FOR that language. You'd imagine there'd be dedicated single tokens for common things like "class" or "def" or "import", which allows more efficient representation. There's a lot to think about ...

      • jurgenaut23 6 hours ago

        It’s just as questionable to declare victory because we had a few early wins and to assume that time will fix everything.

        Lots of people predicted that we wouldn't have a single human-driven vehicle by now. But many issues turned out to be a lot more difficult to solve than previously thought!

      • LtWorf 4 hours ago

        How would you debug a programming language made for LLMs? And why not make an LLM that can output gcc intermediate representation directly then?

  • fulafel 6 hours ago

    > embrace a "Python 4" with a strict ownership model like Rust

    Rust only does this because it targets low-level use cases without automatic memory management, and makes a conscious tradeoff against ease of programming.

  • adsharma 4 hours ago

    You described the thinking behind py2many.

    Code in the spirit of rust with python syntax and great devx. Give up on C-API and backward compat with everything.

    Re: lifetimes

    Py2many has a mojo backend now. You can infer lifetimes for simple cases. See the bubble sort example.

  • procaryote 10 hours ago

    100% coverage won't catch 100% of bugs of course

  • mountainriver 4 hours ago

    At the point you describe, we could easily write Rust or even just C.

  • pseudony 9 hours ago

    Ownership models like Rust's require a greater ability for holistic refactoring; otherwise a change in one place causes a lifetime issue elsewhere. This is exactly what LLMs are currently worst at.

    Beyond that, a Python with something like lifetimes implies doing away with garbage collection; there really isn't any need for lifetimes otherwise.

    What you are suggesting has nothing to do with Python and completely misses the point of why python became so widely used.

    The more general point is that garbage collection is very appealing from a usability standpoint and it removes a whole class of errors. People who don't see that value should look again at the rise of Java vs c/c++. Businesses largely aren't paying for "beautiful", exacting memory management, but for programs which work and hopefully can handle more business concerns with the same development budget.

    • vlovich123 2 hours ago

      Rust lifetimes are generally fairly local and don’t impact refactoring too much unless you fundamentally change the ownership structure.

      Also a reminder that Rc, Arc, and Box are garbage collection. Indeed, Rust is a garbage-collected language unless you drop to unsafe. It's best to say "tracing GC", which is what I think you meant.

    • pjmlp 9 hours ago

      While I go in another direction in a sibling comment, lifetimes do not imply not needing garbage collection.

      On the contrary, having both allows the productivity of automatic resource management, while providing the necessary tooling to squeeze the ultimate performance when needed.

      No need to worry about data structures not friendly to affine/linear types, Pin and Phantom types and so forth.

      It is no accident that while Rust has been successful bringing modern lifetime type systems into mainstream, almost everyone else is researching how to combine linear/affine/effects/dependent types with classical automatic resource management approaches.

  • oivey 10 hours ago

    I mean, why not just write Rust at that point? Required static typing is fundamentally at odds with the design intent of the language.

    • trealira 10 hours ago

      A lot of people want a garbage collected Rust without all the complexity caused by borrow checking rules. I guess it's because Rust is genuinely a great language even if you ignore that part of it.

      • Elucalidavah 9 hours ago

        > a garbage collected Rust

        By the way, wouldn't it be possible to have a garbage-collecting container in Rust, where all the various objects are owned by the container and available for as long as they are reachable from a borrowed object?

      • logicchains 9 hours ago

        Isn't garbage collected Rust without a borrow checker just OCaml?

      • spookie 7 hours ago

        D and Go exist.

        There are alternatives out there

    • tgv 5 hours ago

      Not only that: Rust is considerably faster and more reliable. Since you're not writing the code yourself, Rust would be an objectively better choice.

      Who are we trying to fool?

  • qaq 5 hours ago

    You pretty much described Mojo

  • zzzeek 3 hours ago

    > As vibe coding becomes normalized

    Just want you to know: this heart monitor we gave you was engineered with vibe coding; that's why your insurance was able to cover it. Nobody really knows how the software works (because... vibes), but the AI of course surpasses humans on all current (human-created) benchmarks like the SAT and bar exams, so there's no reason to think its software isn't superior to crusty old human-coded (non-"vibe coded") software as well. You should be able to resume activity immediately! Good luck.

    • brookst 24 minutes ago

      What percent of applications require that level of reliability?

        Vibe coding will be normalized because the vast, vast majority of code is not life or death. That's literally what "normal" means.

      Exceptional cases like pacemakers and spaceflight will continue to be produced with rigor. Maybe even 1% of produced code will work that way!

froh 6 hours ago

I'd rather love to see confluent persistence in Python, i.e. a git-like management of an object tree.

So when you create a new call stack (generator, async something, thread), you can create a twig/branch, and that is modified in place, copy-on-write.

And you decide when and how to merge a data branch. There are support frameworks for this, even defaults, but in general merging data is a deliberate operation, like with git.

Locally, a Python with this option looks and feels single-threaded: no brain knots. Sharing and merging intermediate results becomes a deliberate operation, with synchronisation points that you can reason about.
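A minimal sketch of what such a branchable object tree could look like (all names hypothetical): each twig is copy-on-write over its base, and merging is an explicit operation with a caller-supplied conflict resolver, git-style.

```python
class Branch:
    """A git-like 'twig' over a base mapping: copy-on-write, explicit merge."""

    def __init__(self, base=None):
        self._base = dict(base) if base else {}
        self._local = {}                      # copy-on-write overlay

    def __getitem__(self, key):
        return self._local[key] if key in self._local else self._base[key]

    def __setitem__(self, key, value):
        self._local[key] = value              # the base is never mutated

    def fork(self):
        # New twig that sees the current state; its future writes stay local.
        return Branch({**self._base, **self._local})

    def merge_into(self, other, resolve=lambda ours, theirs: theirs):
        # Deliberate merge: conflicting keys go through `resolve`.
        for key, value in self._local.items():
            if key in other._local:
                other._local[key] = resolve(other._local[key], value)
            else:
                other._local[key] = value

trunk = Branch({"x": 1})
twig = trunk.fork()
twig["x"] = 2               # invisible to `trunk` until merged
assert trunk["x"] == 1
twig.merge_into(trunk)
assert trunk["x"] == 2
```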

kubb 11 hours ago

Sounds like a fun job, I’d love to do something like this in my 9 to 5.

It’s also amazing how much work goes into making Python a decent platform because it’s popular. Work that will never be finished and could have been avoided with better design.

Get users first, lock them in, fix problems later seems to be the lesson here.

  • materielle 8 hours ago

    Python is about 35 years old at this point. At some point in time it was the better language, with the better design, that fixed the existing problems.

    Sure, maybe a committee way back in 1990 could have shaved off of some the warts and oopsies that Guido committed.

    I’d imagine that said committee would have also shaved off some of the personality that made Python an enjoyable language to use in the first place.

    People adopted Python because it was way nicer to use compared to the alternatives in, say, 2000.

    • zahlman 3 hours ago

      I would say it was closer to 2005 that Python really took off. Coincidentally around when I started using it, but I remember a noticeable increase in "buzz".

    • rixed 7 hours ago

      Yes, writing CGI with Python the configuration language was so much better than with Perl the shell replacement!

  • darkwater 10 hours ago

    > Get users first, lock them in, fix problems later seems to be the lesson here.

    Or with a less cynical spin: deliver something that's useful and solves a problem for your potential users, and iterate over that without dying in the process (and Python suffered a lot already in the 2 to 3 transition)

    • kubb 9 hours ago

      The 2-to-3 transition was possible precisely because of user lock-in and sunk cost. That kind of global update was unprecedented, and it could have been totally avoided with better design.

      • exe34 7 hours ago

        > could have been totally avoided with better design

        This is why taxi drivers should run the country!

        • [removed] 39 minutes ago
          [deleted]
  • fastball 10 hours ago

    Imo it is less about locking anyone in (in this case) and more about what Python actually enables: exceedingly fast prototyping and iteration. Turns out the ability to ship fast and iterate is actually more useful than performance, especially in a web context where the bottlenecks are frequently not program execution speed.

    • procaryote 10 hours ago

      Python has compounding problems that make it extremely tricky though.

      If it was just slow because it was interpreted they could easily have added a good JIT or transpiler by now, but it's also extremely dynamic so anything can change at any time, and the type mess doesn't help.

      If it was just slow one could parallelise, but it has a GIL (although they're finally trying to fix it), so one needs multiple processes.

      If it just had a GIL but was somewhat fast, multiple processes would be OK; but as it is also terribly slow, any single process can easily hit its performance limit if one request or task is slow. If you make the code async to fix that, you either get threads or extremely complex cooperative multitasking code that keeps breaking whenever there's some bit of slow or blocking code you missed.

      If the problem was just the GIL, but it was OK fast and had a good async model, you could run enough processes to cope, but it's slow so you need a ridiculous number, which has knock-on effects on needing a silly number of database/api connections
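      The async failure mode described above is easy to reproduce: one accidental blocking call stalls every other coroutine on the event loop (a sketch, with an exaggerated sleep standing in for slow code).

```python
import asyncio
import time

async def blocking_handler():
    time.sleep(0.5)            # blocks the whole loop; should be asyncio.sleep

async def fast_handler():
    await asyncio.sleep(0.01)  # only needs ~10 ms

async def demo():
    start = time.perf_counter()
    await asyncio.gather(blocking_handler(), fast_handler())
    return time.perf_counter() - start

elapsed = asyncio.run(demo())
assert elapsed >= 0.5          # the 10 ms handler had to wait for the block
```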

      I've tried very hard to make this work, but when you can replace 100 servers struggling to serve the load on python with 3 servers running Java (and you only have 3 because of redundancy as a single one can deal with the load), you kinda give up on using python for a web context

      If you want a dynamic web backend language that's fast to write, typescript is a much better option, if you can cope with the dependency mess

      If it's a tiny thing that won't need to scale or is easy to rewrite if it does, I guess python is ok

      • pjmlp 5 hours ago

        > If it was just slow because it was interpreted they could easily have added a good JIT or transpiler by now, but it's also extremely dynamic so anything can change at any time, and the type mess doesn't help.

        See Smalltalk, Common Lisp, Self.

        Their dynamism, image-based development, break-edit-compile-redo.

        Want to change everything in Smalltalk with a single call?

        a becomes: b

        Now every single instance of a in a Smalltalk image has been replaced by b.
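        Python's nearest (much weaker) analogue is reassigning a live object's __class__; this is a sketch, not a full becomes:, since it retargets one instance rather than every reference in the image.

```python
class Old:
    def greet(self):
        return "old"

class New:
    def greet(self):
        return "new"

obj = Old()
assert obj.greet() == "old"
obj.__class__ = New      # every future method lookup now goes through New
assert obj.greet() == "new"
```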

        Just one example; there is hardly anything that one can do in Python that those languages don't do as well.

        Smalltalk and Self are the genesis of JIT research that eventually gave birth to Hotspot and V8.

      • johnisgood 9 hours ago

        Or Elixir / Erlang instead of Java / Kotlin, and Go instead of Python, for this use case.

    • kubb 9 hours ago

      I agree that fast iteration and the „easy to get something working” factor is a huge asset in Python, which contributed to its growth. A whole lot of things were done right from that point of view.

      An additional asset was the friendliness of the language to non-programmers, and features enabling libraries that are similarly friendly.

      Python is also unnecessarily slow: 50x slower than Java, 20x slower than Common Lisp and 10x slower than JavaScript. Its iterative development is worse than Common Lisp's.

      I’d say that the biggest factor is simply that American higher education adopted Python as the introductory learning language.

      • beagle3 5 hours ago

        For American higher education, it was Pascal ages ago, and then Java for quite a while.

        But Java is too bureaucratic to be an introductory language, especially for would-be non-programmers. Python won on “introductoriness” merits: capable of getting everything done in every field (bio, chem, stats, humanities) while still being (relatively) friendly. I remember days when it was frowned upon for being a “script language” (thus not a real language). But it won on merit.

  • [removed] an hour ago
    [deleted]
throwaway81523 8 hours ago

I wish Python had moved to the BEAM or something similar as part of the 2 to 3 transition. This other stuff makes me cringe.

  • toast0 an hour ago

    I'm a big BEAM person, but python 3.0.0 was released december 2008. At that time, I believe OTP R12 was current, and it only gained SMP support in R11. [1] In 2008, I don't know that it would have been clear that the BEAM would be a good target. And I don't know how switching to BEAM then would have addressed what I think is the core issue python 3 was working on, unicode strings; BEAM didn't start taking on unicode until R13 and IMHO, is kind of on the slow end of unicode adoption (which isn't always bad... being late means adopting industry consensus with less of the intermediate false steps)

    [1] https://erlang.org/euc/08/euc_smp.pdf

  • pansa2 8 hours ago

    Python’s core developers don’t even seem to care about other Python implementations (only about CPython).

    There’s no way they would move to, say, PyPy as the official implementation - let alone to a VM designed for a completely different language.

    • throwaway81523 7 hours ago

      At the time of the original Py3 release, PyPy was not ready for wide use. Otherwise maybe there could have been a chance of it replacing CPython. They were in too big a hurry to ship Py3 though. Tragedy.

    • pjmlp 5 hours ago

      Which is a pity; Python ends up being the only major dynamic language where, for all practical purposes, there is no JIT support, because while there are alternative implementations with great JIT achievements, the community behaves as if all that effort was for nothing other than helping PhD students do their theses.

pjmlp 10 hours ago

This looks like a pivot on the Project Verona research, as there have not been many other papers since the initial announcement regarding the programming language itself.

bgwalter 6 hours ago

This is the true Python concurrency effort! I know, I have followed many! (Life of Brian)

So they sounded out the Faster CPython team, which has now been fired (was van Rossum fired, too?):

"Over the last two years, we have been engaging with the Faster CPython team at Microsoft as a sounding board for our ideas."

And they revive the subinterpreter approach yet again.

  • zem 3 hours ago

    This will work very well with free-threaded Python; you don't need subinterpreters. I agree that it's the most promising approach I've seen yet.

[removed] 9 hours ago
[deleted]
OutOfHere 4 hours ago

Microsoft just fired 3% of its staff, more than it ever has before. I would stick with type-checked, free-threaded Python with locks and queues. Someone should be able to enhance the type checker to also check for unsafe mutation of variables.
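The locks-and-queues style suggested here can stay quite small; a minimal sketch (hypothetical worker code) where threads pull jobs from a Queue and every shared-state mutation happens under a Lock.

```python
import queue
import threading

jobs: queue.Queue = queue.Queue()
results: dict[int, int] = {}
lock = threading.Lock()

def worker() -> None:
    while True:
        n = jobs.get()
        if n is None:          # sentinel: shut this worker down
            break
        square = n * n
        with lock:             # never touch shared state without the lock
            results[n] = square

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for n in range(10):
    jobs.put(n)
for _ in threads:
    jobs.put(None)             # one sentinel per worker
for t in threads:
    t.join()

assert results == {n: n * n for n in range(10)}
```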

  • surajrmal 4 hours ago

      Only 3%. Hard to say whether this effort was affected.

    • pansa2 3 hours ago

      3% of the whole company - but a lot of Python specialists were in that 3%. Including, apparently, the entire "Faster CPython" team.

akkad33 3 hours ago

"fearless concurrency" reminds of the buzzword for another language