Swizec 2 days ago

I’ve seen Piccalilli’s stuff around and it looks extremely solid. But you can’t beat the market. You either have what they want to buy, or you don’t.

> Landing projects for Set Studio has been extremely difficult, especially as we won’t work on product marketing for AI stuff, from a moral standpoint, but the vast majority of enquiries have been for exactly that

The market is speaking. Long-term you’ll find out who’s wrong, but the market can usually stay irrational for much longer than you can stay in business.

I think everyone in the programming education business is feeling the struggle right now. In my opinion this business died 2 years ago – https://swizec.com/blog/the-programming-tutorial-seo-industr...

  • arthurfirst 2 days ago

    I get the moral argument and even agree with it, but we are a minority, and of course we expect to be able to sell our professional skills -- but if you are 'right' and out of business nobody will know. Is that any better than 'wrong' and still in business?

    You might as well work on product marketing for AI, because that is where the client dollars are allocated.

    If it's hype, at least you stayed afloat. If it's not, maybe you find a new angle if you can survive long enough? Just survive and wait for things to shake out.

    • order-matters 2 days ago

      Yes, actually - being right and out of business is much better than being wrong and in business when it comes to ethics and morals. I am sure you could find a lot of moral values you would simply refuse to compromise on for the sake of business. The line between a moral value and a heavy preference, however, is blurry - and that is probably where most people have AI placed on the moral spectrum right now. Being out of business shouldn't be a death sentence, and if it is then maybe we are overlooking something more significant.

      I am in a different camp altogether on AI, though, and would happily continue to do business with it. I genuinely do not see the difference between it and the computer in general. I could even argue it's the same as the printing press.

      What exactly is the moral dilemma with AI? We are all reading this message on devices built off of far more ethically questionable operations. That's not to say two things can't both be bad, but it just looks to me like people are using the moral argument as a way to avoid learning something new while virtue signalling how ethical they are about it -- yet when they learn more about the things they are already accustomed to, they refuse to sacrifice those for ethical reasons. It just all seems rather convenient.

      The main issue I see discussed is unethical model training, but let me know of others. Personally, I think you can separate the process from the product. A product isn't unethical just because unethical processes were used to create it. The creator/perpetrator of the unethical process should be held accountable and all benefits taken back so as to kill any perceived incentive to perform the actions, but once the damage is done why let it happen in vain? For example, should we let people die rather than use medical knowledge gained unethically?

      Maybe we should be targeting these AI companies if they are unethical: stop them from training any new models with the same unethical practices, hold them accountable for their actions, and distribute the intellectual property and profits gained from existing models to the public. But models that are already trained can actually be used for good, and I personally see it as unethical not to use them.

      Sorry for the ramble, but it is a very interesting topic that should probably have as much discussion around it as we can get.

      • arthurfirst 2 days ago

        > Yes, actually - being right and out of business is much better than being wrong and in business when it comes to ethics and morals.

        Yes, but since you are out of business you no longer have an opportunity to fix that situation or adapt it to your morals. It's final.

        Turning the page is a valid choice though. Sometimes a clean slate is what you need.

        > Being out of business shouldn't be a death sentence, and if it is then maybe we are overlooking something more significant.

        Fair point! It feels like a death sentence when you put so much into it though -- a part of you IS dying. It's a natural reflex to revolt at the thought.

        > For example, should we let people die rather than use medical knowledge gained unethically?

        Depends on whether you are doing it 'for their own good' or not.

        Also, the ends do not justify the means in the world of morals we are discussing -- that is pragmatism / utilitarianism and belongs to the world of the material, not the ideal.

        Finally - who determines what is ethical, beyond the 'golden rule'? This is the most important factor. I'm not implying ethics are ALL relative, but beyond the basics they are, and who determines that is more important than the context or the particulars.

      • int_19h 17 hours ago

        > What exactly is the moral dilemma with AI? We are all reading this message on devices built off of far more ethically questionable operations.

        The main difference is that for those devices, the people negatively affected by the operations are far away in another country, and we're already conditioned to accept their exploitation as "that's just how the world works" or "they're better off that way". With AI, the people affected - those whose work was used for training, and those who lose their jobs because of it - are much closer. For software engineers in particular, these are often colleagues and friends.

      • sambuccid a day ago

        >> The creator/perpetrator of the unethical process should be held accountable and all benefits taken back so as to kill any perceived incentive to perform the actions, but once the damage is done why let it happen in vain?

        That's very similar to other unethical processes (child labour, for example), and we see that governments are often either too slow to move or just not interested, and that's why people try to influence the market by changing what they buy.

        It's similar for AI: some people don't use it so that they don't pay the creators (in money or in personal data) to train the next model, and at the same time they signal to the companies that they wouldn't be future customers of the next model.

        (I'm not necessarily in the group of people avoiding AI, but I can see their point.)

      • derangedHorse a day ago

        > but once the damage is done why let it happen in vain?

        Because there are no great ways to leverage the damage without perpetuating it. Who do you think pays for the hosting of these models? And what do you mean by distribute the IP and profits to the public? If this process will be facilitated by government, I don’t have faith they’ll be able to allocate capital well enough to keep the current operation sustainable.

    • johnnyanmac 2 days ago

      >but if you are 'right' and out of business nobody will know. Is that any better than 'wrong' and still in business?

      Depends. Is it better to be "wrong" and burn all your goodwill for any future endeavors? Maybe, but I don't think the answer is clear cut for everyone.

      I also don't fully agree with us being the "minority". The issue is that the majority of investors are simply not investing anymore. Those remaining are playing high stakes roulette until the casino burns down.

  • jimbokun 2 days ago

    Has anyone considered that the demand for web sites and software in general is collapsing?

    Everyone and everything has a website and an app already. Is the market becoming saturated?

    • oldandboring 2 days ago

      I know a guy who has this theory, in essence at least. Businesses use software and other high-tech to make efficiency gains (fewer people getting more done). The opportunities for developing and selling software were historically in digitizing industries that were totally analog. Those opportunities are all but dried up and we're now several generations into giving all those industries new, improved, but ultimately incremental efficiency gains with improved technology. What makes AI and robotics interesting, from this perspective, is the renewed potential for large-scale workforce reduction.

    • Aperocky 2 days ago

      The demand is massively increasing, but it is being filled by fewer people and more GPUs.

    • HeyLaughingBoy 2 days ago

      And new companies are created every day, and new systems are designed every day, and new applications are needed every day.

      The market is nowhere close to being saturated.

      • jimbokun 17 hours ago

        You're just begging the question.

        What are examples of these "new applications" that are needed every day? Do consumers really want them? Or are software and other companies just creating them because it benefits those companies?

        • HeyLaughingBoy 17 hours ago

          Most of the software written worldwide is created for internal company usage. Consumers don't even know that it exists.

          I've worked (still do!) for engineering services companies. Other businesses pay us to build systems for them to either use in-house or resell downstream. I have to assume that if they're paying for it, they see profit potential.

  • xnx 2 days ago

    > In my opinion this business died 2 years ago

    It was an offshoot bubble of the bootcamp bubble, which was inflated by ZIRP.

  • kagevf 2 days ago

    I think your post pretty well illustrates how LLMs can and can't work. Favoriting this so I can point people to it in the future. I see so many extreme opinions on it, ranging from "LLMs are basically AGI" to "it's total garbage", but this is a good, balanced - and concise! - overview.

  • jillesvangurp 2 days ago

    This is the type of business that's going to be hit hard by AI. And the type of businesses that survive will be the ones that integrate AI into their business the most successfully. It's an enabler, a multiplier. It's just another tool, and those wielding the tools best tend to do well.

    Taking a moral stance against AI might make you feel good but doesn't serve the customer in the end. They need value for money. And you can get a lot of value from AI these days; especially if you are doing marketing, frontend design, etc. and all the other stuff a studio like this would be doing.

    The expertise and skill still matter. But customers are going to get a lot further without such a studio and the remaining market is going to be smaller and much more competitive.

    There's a lot of other work emerging though. IMHO the software integration market is where the action is going to be for the next decade or so. Legacy ERP systems, finance, insurance, medical software, etc. None of that stuff is going away or at risk of being replaced with some vibe coded thing. There are decades worth of still widely used and critically important software that can be integrated, adapted, etc. for the modern era. That work can be partly AI assisted of course. But you need to deeply understand the current market to be credible there. For any new things, the ambition level is just going to be much higher and require more skill.

    Arguing against progress as it is happening is as old as the tech industry. It never works. There's a generation of new programmers coming into the market and they are not going to hold back.

    • wartywhoa23 2 days ago

      > Taking a moral stance against AI might make you feel good but doesn't serve the customer in the end. They need value for money. And you can get a lot of value from AI these days; especially if you are doing marketing, frontend design, etc. and all the other stuff a studio like this would be doing.

      So let's all just give zero fucks about our moral values and just multiply monetary ones.

      • simianwords 2 days ago

        >So let's all just give zero fucks about our moral values and just multiply monetary ones.

        You are misconstruing the original point. They are simply suggesting that the moral qualms about using AI are not that high - neither to the vast majority of consumers nor to the government. There are a few people who might exaggerate these moral issues for self-serving reasons, but they won't matter in the long term.

        That is not to suggest there are absolutely no legitimate moral problems with AI, but they will pale in comparison to what the market needs.

        If AI can make things 1000x more efficient, humanity will collectively agree in one way or the other to ignore or work around the "moral hazards" for the greater good.

        You could start by explaining what specific moral value of yours goes against AI use. It might bring clarity to whether these values are that important at all to begin with.

      • ako 2 days ago

        AI is just a tool; like most other technologies, it can be used for good and bad.

        Where are you going to draw the line? Only if it affects you? Or maybe we should go back to using coal for everything, so the mineworkers have their old life back? Or maybe follow the Amish guidelines and ban all technology that threatens the sense of community?

        If you are going to draw a line, you'll probably have to start living in small communities, as AI as a technology is almost impossible to stop. There will be people and companies using it to its fullest; even if you have laws to ban it, other countries will allow it.

      • tjwebbnorfolk 2 days ago

        What the parent is saying is that what works is what will matter in the end. That which works better than something else will become the method that survives in competition.

        You not liking something on purportedly "moral" grounds doesn't matter if it works better than something else.

      • idiotsecant 2 days ago

        That's how it works. You can be morally righteous all you want, but this isn't a movie. Morality is a luxury for the rich. Conspicuous consumption. The morally righteous poor people just generally end up righteously starving.

        • dripdry45 2 days ago

          This seems rather black and white. Defining the morals probably makes sense, then evaluating whether they can be lived by or whether we can compromise in the face of other priorities?

    • senordevnyc 2 days ago

      Totally agree, but I’d state it slightly differently.

      This type of business isn’t going to be hit hard by AI; this type of business owner is going to be hit hard by AI.

    • alextingle 2 days ago

      I think it's just as likely that businesses which have gone all-in on AI are going to be the ones that get burned. When that hose-pipe of free compute gets turned off (as it surely must), any business that relies on it is going to be left high and dry. It's going to be a massacre.

      • simonw 2 days ago

        The latest DeepSeek and Kimi open weight models are competitive with GPT-5.

        If every AI lab were to go bust tomorrow, we could still hire expensive GPU servers (there would suddenly be a glut of those!) and use them to run those open weight models and continue as we do today.

        Sure, the models wouldn't ever get any better in the future - but existing teams that rely on them would be able to keep on working with surprisingly little disruption.

    • classified 2 days ago

      > And the type of businesses that survive will be the ones that integrate AI into their business the most successfully.

      I am an AI skeptic, and until the hype is supplanted by actual tangible value, I will prefer products that don't cram AI everywhere it doesn't belong.

    • BenGosub 2 days ago

      I understand that website studios have been hit hard, given how easy it is to generate good enough websites with AI tools. I don't think human potential is best utilised when dealing with CSS complexities. In the long term, I think this is a positive.

      However, what I don't like is how little the authors are respected in this process. Everything that the AI generates is based on human labour, but we don't see the authors getting the recognition.

      • b3ing 2 days ago

        Website building started dying off when Squarespace launched and Wix came around. WordPress copied that, and it's been building blocks for the most part since then. There are few unique sites around these days.

      • classified 2 days ago

        > we don't see the authors getting the recognition.

        In that sense AI has been the biggest heist that has ever been perpetrated.

    • balamatom 2 days ago

      Sure, and it takes five whole paragraphs to have a nuanced opinion on what is very obvious to everyone :-)

      >the type of business that's going to be hit hard by AI [...] will be the ones that integrate AI into their business the most

      There. Fixed!

    • skeeter2020 2 days ago

      It is totally valid to NOT play the game - Joshua taught us this way back in the '80s.

    • rob74 2 days ago

      I don't know about you, but I would rather pay some money for a course written thoughtfully by an actual human than waste my time trying to process AI-generated slop, even if it's free. Of course, programming language courses might seem outdated if you can just "fake it til you make it" by asking an LLM every time you face a problem, but doing that won't actually lead to "making it", i.e. developing a deeper understanding of the programming environment you're working with.

      • ako 2 days ago

        But what if the AI-generated course was actually good, maybe even better than the human-generated course? Which one would you pick then?

    • otabdeveloper4 2 days ago

      AI is not a tool; it is an oracle.

      Prompting isn't a skill, and praying that the next prompt finally spits out something decent is not a business strategy.

      • classified 2 days ago

        Seeing how many successful businesses are a product of pure luck, using an oracle to roll the dice is not significantly different.

      • rob74 2 days ago

        Do you remember the times when "cargo cult programming" was something negative? Now we're all writing incantations to the great AI, hoping that it will drop a useful nugget of knowledge in our lap...

      • danielbln 2 days ago

        Hot takes from 2023, great. Working with AI has changed since then; maybe catch up? Look up how agentic systems work, how to keep them on task, how they can validate their work, etc. Or don't.

        • otabdeveloper4 2 days ago

          > if you combine the Stone Soup strategy with Clever Hans syndrome you can sell the illusion of not working for 8 billable hours a day

          No thanks, I'm good.

      • tonyhart7 2 days ago

        "praying that the next prompt finally spits out something decent is not a business strategy."

        Well, you've just described what ChatGPT is: one of the fastest-growing user bases in history.

        As much as I agree with your statement, the real world doesn't respect that.

        • otabdeveloper4 2 days ago

          > one of the fastest-growing user bases in history

          By selling a dollar of compute for 90 cents.

          We've been here before, it doesn't end like you think it does.

    • Towaway69 2 days ago

      > Arguing against progress as it is happening is as old as the tech industry. It never works.

      I'm still wondering why I'm not doing my banking in Bitcoin. My blockchain database was replaced by Postgres.

      So some tech can just be hypeware. The OP has a legitimate standpoint given some technologies' track records.

      And the jury is still out on the effects of social media on children -- why else are some countries banning it for children?

      Not everything that comes out of Silicon Valley is automatically good.

  • skeeter2020 2 days ago

    Markets are not binary though, and this is also what it looks like when you're early (unfortunately, it looks similar when you're late too). So they may totally be able to carve out a valid & sustainable market exactly because they're not doing what everyone else is doing right now. I'm currently taking online Spanish lessons with a company that uses people as teachers, even though this area is under intense attack from AI. There is no comparison, and what's really great is using many tools (including AI) to enhance a human product. So far we're a long way from the AI tutor that my boss keeps envisioning. I actually doubt he's tried to learn anything deep lately, let alone validated his "vision".

  • tonyhart7 2 days ago

    What happens if the market is right and this is the "new normal"?

    Same as StackOverflow being down today, and it seems like not everyone cares anymore; back then it would have totally caused a breakdown because SO is vital.

    • lmm 2 days ago

      > What happens if the market is right and this is the "new normal"?

      Then there's an oversupply of programmers, salaries will crash, and lots of people will have to switch careers. It's happened before.

      • _DeadFred_ 2 days ago

        Some people will lose their homes. Some marriages will fail from the stress. Some people will choose to exit life because of it all.

        It's happened before, and there's no way we could have learned from that and improved things. It has to be just life-changing, life-ruining, career-crippling. Absolutely no other way for a society to function than this.

      • gloosx 2 days ago

        It's not as simple as putting all programmers into one category. There can be an oversupply of web developers but at the same time an undersupply of COBOL developers. If you are a very good developer, you will always be in demand.

      • yungwarlock 2 days ago

        I'm young -- please, when was that and in what industry?

    • indemnity 2 days ago

      I haven’t visited StackOverflow for years.

      • throwaway2037 2 days ago

        I don't get these comments. I'm not here to shill for SO, but it is a damn good website, if only for the archive. Can't remember how to iterate over the entries in a JavaScript dictionary (object)? SO can tell you, usually much better than W3Schools can, which attracts so much scorn. (I love that site: so simple for the simple stuff!)

        When you search programming-related questions, what sites do you normally read? For me, it is hard to avoid SO because it appears in so many top results from Google. And I swear that Google AI just regurgitates most of SO these days for simple questions.
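
        A minimal sketch of the kind of answer being described, using a plain JavaScript object (the scores object and its keys are just an illustrative example; Object.entries and for...of are standard JavaScript):

          // Iterate over the entries of a plain JavaScript object
          const scores = { alice: 3, bob: 7 };

          for (const [name, score] of Object.entries(scores)) {
            console.log(name, score); // logs "alice 3", then "bob 7"
          }

          // Older style: iterate the keys directly with for...in
          for (const name in scores) {
            console.log(name, scores[name]);
          }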

      • ido 2 days ago

        I've honestly never intentionally visited it (as in, gone to the root page and started following links) - it was just where Google sent me when searching for answers to specific technical questions.

    • m463 2 days ago

      Buggy whips are having a temporary setback.

      • throwaway2037 2 days ago

        I had a "milk-up-the-nose" laughter moment when I read this comment.

      • vkou 2 days ago

        The coach drivers found other work, their horses got turned into glue.

      • balamatom 2 days ago

        leaded gasoline is making a killing, though

    • _zoltan_ 2 days ago

      you mixed up "is dead" with "is vital" :-)

  • jimmydddd 2 days ago

    Not wanting to help the rich get richer means you'll be fighting an uphill battle. The rich typically have more money to spend. And as others have commented, not doing anything AI related in 2025-2026 is going to further limit the business. Good luck though.

    • Aurornis 2 days ago

      Rejecting clients based on how you wish the world would be is a strategy that only works when you don’t care about the money or you have so many clients that you can pick and choose.

      Running a services business has always been about being able to identify trends and adapt to market demand. Every small business I know has been adapting to trends or trying to stay ahead of them from the start, from retail to product to service businesses.

      • bluGill 2 days ago

        Rejecting clients when you have enough is a sound business decision. Some clients are too annoying to serve. Some clients don't want to pay. Sometimes you have more work than you can do... It is easy to think when things are bad that you must take any and all clients (and when things are bad enough you might be forced to), but that is not a good plan and should be avoided. You should be choosing your clients. It is very powerful when you can afford to tell someone "I don't need your business."

    • satvikpendem 2 days ago

      People who make products with AI are not necessarily rich; often it's solo "vibe coders."