87 comments

  • alexgotoi 19 hours ago
    The thing people miss in these “replace juniors with AI” takes is that juniors were never mainly about cheap hands on keyboards. They’re the only people in the org who are still allowed to ask “dumb” questions without losing face, and those questions are often the only signal you get that your abstractions are nonsense.

    What AI does is remove a bunch of the humiliating, boring parts of being junior: hunting for the right API by cargo-culting Stack Overflow, grinding through boilerplate, getting stuck for hours on a missing import. If a half-decent model can collapse that search space for them, you get to spend more of their ramp time on “here’s how our system actually fits together” instead of “here’s how for-loops work in our house style”.

    If you take that setup and then decide “cool, now we don’t need juniors at all”, you’re basically saying you want a company with no memory and no farm system – just an ever-shrinking ring of seniors arguing about strategy while no one actually grows into them.


    • rsanek 18 hours ago
      "without losing face"? What culture are you referring to? The Western companies I have worked at do not discourage such questions -- in fact, it's often the sign of someone very senior when they ask a seemingly 'dumb' question that others have taken for granted.
      • Kiro 2 minutes ago
        It doesn't matter if the culture encourages it, you still don't want to ask dumb questions.
      • socketcluster 14 hours ago
        Yep, I fully agree with this view, and I find that it's seniors who ask the 'dumb' questions. Everyone is worried about losing face in this precarious economy... But seniors are able to ask really smart questions as well, so even their dumb questions sound smart... They can usually spin a dumb question into a smart one by going one level deeper and bringing nuance into the discussion. This may be difficult for a junior to do.

        My experience as a team lead working with a lot of juniors is that they are terrified of losing face and tend to talk a big game. As a team lead, I try to use language which expresses any doubts or knowledge gaps I have so that others in my team feel comfortable doing it as well. But a key aspect is that you have to really know your stuff in certain areas because you need to inspire others to mirror you... They won't try to mirror you if they don't respect you, based on your technical ability.

        You need to demonstrate deep knowledge in some areas, and excellent reasoning abilities, before you can safely ask dumb questions IMO. I try to find the specific strengths and weaknesses of my team members. I give constructive criticism for weaknesses but always try to identify and acknowledge each person's unique superpower; what makes them really stand out within the team. If people feel secure in their 'superpower', then they can be vulnerable in other areas and still feel confident. It's important to correctly identify the 'superpower' though, because you don't want a junior honing a skill that they don't naturally possess, and you don't want them calling the shots when they should be asking for help.

        • tharkun__ 11 hours ago

              My experience as a team lead working with a lot of juniors is that they are terrified of losing face
          
          So much this! Both from my experience as Junior very many years ago and also with Juniors (and not so Juniors) today.

              tend to talk a big game
          
          Very big game. Claude does too. The kind of BS it spews in very confident language is amazing.

              As a team lead, I try to use language which expresses any doubts or knowledge gaps I have so that others in my team feel comfortable doing it as well
          
          Agree. I also often literally say "Dumb idea: X" to try and suss out areas that may have been left by the wayside and under-explored or where assumptions have been made without verifying them. It's amazing how often even "Seniors"+ will spew assumptions as fact without verification. It's very annoying actually.

              superpower
          
          How do you actually do this though? I would love to, but it seems hard to find an actual "superpower". Where does a "super" power start, versus "yeah, they're better at this than others, but definitely not as good as me or person X, who definitely does have that superpower"? Like, when can you start encouraging, so to speak?
          • socketcluster 4 hours ago
            The fact that you mentioned Claude (LLMs) is interesting! I definitely feel like there is a parallel; maybe because AI sometimes hallucinates, people have built up a tolerance for this kind of speculative use of language from people as well.

            About finding the superpower of each team member; after working with someone for a few months, I start to notice certain patterns in how they think. Sometimes there might be something they say once or a question they ask which makes me view them more favorably than before. Maybe they're fast, good at assembling stuff or slow but good at building stable core components. Maybe they're similar to me or maybe they have a skill I don't have or a certain interest/focus or way to approach problems that is different from me but is beneficial to the project and/or team.

            It's a bit like playing a weird game of chess where you can't see the pieces properly at the beginning, so everyone looks like a pawn initially. But then over time you discover that one person is a knight, another is a bishop, another is a queen... And you adapt your strategy as your visibility improves.

      • andrewmutz 18 hours ago
        Completely agree with this. I got to work closely with an IBM fellow one summer and I was impressed by his willingness to ask "dumb questions" in meetings. Sometimes he was just out of the loop but more often he was just questioning some of the assumptions that others in the room had left unquestioned.
        • boston_clone 18 hours ago
          Unfortunately, I found that the culture of "think." at IBM is not matched at many other organizations. Most days, I miss it.

          But forced RTO and only 10 days off per year is enough to keep me away ;)

      • lvspiff 17 hours ago
        100% agree, and I'd give more credit if I could!

        Even as a lead, I ask the dumb question when no one else does, because when I can see the look on people's faces, or realize no one is chiming in, the dumb question is needed to make sure the point drives home for everyone. I've never been met with any sort of looking down upon, nor do I discourage any of my staff - quite the opposite - I champion them for being willing to speak up.

        • tpoacher 15 hours ago
          in fairness, these do not sound like "dumb" questions.

          Some questions really are dumb and bring no value to the table.

          The key is knowing which is which, and that is the part that comes with experience.

          • noisy_boy 13 hours ago
            > Some questions really are dumb and bring no value to the table.

            They do tell you that the person asking them either isn't getting it, which is valuable information, or that they are trying to ask questions for the sake of it, which is also valuable information.

            • TeMPOraL 3 hours ago
              Which is exactly what the OP was saying - these are the kind of questions that are often needed, but that seniors won't ask because it'll make them lose face. Juniors are the ones allowed to "not get it".
      • TeMPOraL 3 hours ago
        There are kinds of questions that you can ask to signal your seniority and maturity. There are other kinds of questions that, should you ask them, will leave people wondering what the hell you've been doing for the past N years and why they're paying you a senior-level salary.

        A lot of early signs of problems, such as critical information becoming tribal knowledge instead of being documented, are revealed when asking the second kind of questions.

      • darth_avocado 14 hours ago
        It depends on the company and the people around you. At one company, my quarterly feedback was that I don’t ask too many questions in meetings, which was mostly due to the fact that the project was pretty straightforward and requirements were hammered down on a document. In another company, asking questions got me the feedback that I was maybe not experienced enough to manage the project by myself, which I was completely capable of. It’s a double edged sword.

        But yes on a personal level, being senior enough in my career, I’d rather be thought of as less skilled by asking questions before the s hits the fan, than execute and mismanage a project that I didn’t ask enough questions on. The latter has more consequences tbh.

        • nopurpose 3 hours ago
          > my quarterly feedback was that I don’t ask too many questions

          Sounds like your manager felt that he needed to provide at least some feedback and that was the best/safest he could come up with.

      • reactordev 18 hours ago
        Company culture. Some companies I worked for would fire you for questioning decisions. Others, welcomed criticism. You don’t really know which environment you’re in until someone says something. Are you going to take the risk and be the first?
        • JohnFen 15 hours ago
          > Are you going to take the risk and be the first?

          Absolutely. If the company I work for happens to be one that's so crappy that I'd get fired for questioning things, it's better to find that out as soon as possible. That's not a company that's worth my time and attention.

        • Xelbair 18 hours ago
          Yes, because i would rather not work at such company and go somewhere else.
          • reactordev 17 hours ago
            In this market you might not have a choice but to stick it out a while
      • rukuu001 17 hours ago
        Think of highly competitive environments where looking foolish can be weaponised against you. They definitely exist here (my experience in UK and Australia)
        • shakna 11 hours ago
          IBM Aus discouraged it, Accenture, Concentrix, EY, CommBank, ANZ, and others all encouraged it, for myself.

          I wouldn't say discouraging it will be the norm across most places in Australia.

        • awesome_dude 15 hours ago
          My entire career - New Zealand, and Australia - asking questions is weaponised (as I stated above)

          Graduate, Junior, Senior, Team Lead, - my title hasn't mattered to the response

          • vkou 15 hours ago
            You're working in toxic workplaces. Most of them aren't like this.

            (I believe you when you say that most of yours are like this.)

        • globalnode 15 hours ago
          some of the descriptions above of thoughtful supportive work places where people are free to explore different ideas and question assumptions sound like paradise.
          • awesome_dude 15 hours ago
            Or judged via rose coloured spectacles.
      • baxtr 17 hours ago
        I am bit more senior nowadays.

        Whenever I don’t understand something I say something like: "Uh, I’m probably the only one here, but I don’t get it…"

        • Rodeoclash 17 hours ago
          I'm the CTO and probably one of the most common things I'll say is "help me understand X"
          • darth_avocado 15 hours ago
            There are power dynamics that come into play when you're a C-level executive. You're allowed to ask questions. For entry-level employees, asking questions almost always comes with a judgement of lower skills/experience. This is often what senior and experienced folk forget: there's a certain amount of assumed competence when you ask questions that doesn't get assigned to newer people.
          • tracker1 15 hours ago
            My favorite CTO/CIO acronym is "PFM". Such as, "and then we run through the PFM process, and it comes out and does..."

            PFM - Pure Fucking Magic

            I've only once ever had anyone actually ask what it means... essentially it's used as an abstraction for a complex process that would be excessive to explain in context.

            I asked, after the meeting.

      • Spooky23 16 hours ago
        There are a lot of bad places to work, and those are the types of places that do things like replace junior devs with AI.

        The place I work at is in the middle of a new CEO’s process of breaking the company. The company can’t go out of business, but we’ll set stuff on fire for another 12-18 months.

      • xeromal 17 hours ago
        Yup, my SR Director boss often says "I'm an idiot. Can you tell me what 'X' means?" when most of us probably wanted to know but were too afraid to ask.
      • tracker1 15 hours ago
        That's been my experience as someone who tends to ask them regularly. I don't have a lot of hubris when it comes down to it, so I'll almost always ask. Especially acronyms or industry/insider business terms... I'll usually do a quick search in a browser, but if the results don't make sense, will simply ask.

        Asking stupid questions almost goes hand in glove with "it's easier to ask forgiveness than permission." A lot of times, you're better off just doing something. Asking a simple question or making something happen saves a lot of grief more often than not. Understanding is important.

      • simsla 17 hours ago
        I don't think that's the same. I spitball crazy ideas, but my core knowledge/expertise is sound, and I try not to talk out of my ass. (Or I am upfront when I'm outside my area of expertise. I think it's important to call that out once your word starts carrying some weight.)

        A product manager can definitely say things that would make me lose a bit of respect for a fellow senior engineer.

        I can also see how juniors have more leeway to weigh in on things they absolutely don't understand. Crazy ideas and constructive criticism are welcome from all corners, but at some level I also start expecting some more basic competence.

        • lanstin 16 hours ago
          In general there are so many different sub-fields of knowledge that it's extremely confining to stay in one area of expertise; the slow uneducated person that has been working to keep some giant build farm running and migrating projects and helping fix tickets for 5 years will have a lot of expertise you don't have if you have a more casual experience of the system.
      • ben_w 14 hours ago
        Company culture != national culture != personal culture. Such things can be all over the place.

        I've worked with people from Korea who took me 100% seriously when I said the architecture was too complex and hard to work with, slowing down velocity, and although they did end up keeping it, there was no outward indication of lost face and they did put in the effort to justify their thinking.

        I've also worked with some British, Greek, and Russian people who were completely unwilling to listen to any feedback from coworkers, only order them around.

        Even within a person: I know a self-identified communist, who is very open minded about anything except politics.

      • tpoacher 15 hours ago
        My cousin, when he got his first job, managed to wipe the database clean on his first day at work. (Classic, I know.)

        The seniors were very understanding, and more importantly it raised important questions about backups, dev vs prod pipelines, etc.

        But you can bet my cousin was super embarrassed by it, and saving face was a big part of it.

        • hdgvhicv 7 hours ago
          Seniors should be the ones embarrassed for giving anyone, let alone a new junior, such level of access
        • zingar 15 hours ago
          I totally did this! Why would I need a database named “mysql” inside my MySQL database? Delete that, good job on day one!
      • mv4 15 hours ago
        I found this varies.

        Meta? Ask questions anytime.

        Amazon? Not so much.

      • doctaj 12 hours ago
        There’s also the benefit of being naive - like, juniors can be seriously audacious when they haven’t been burned a million times. I miss having excitement and optimism.
        • laz 2 hours ago
          The Facebook "little red book" had a quote in it along these lines:

          When you don't realize what you can't do, you can do some pretty cool stuff

        • aut0m8d 12 hours ago
          This is huge and quite underrated I think. There are some angles that are really hard to see through a weathered lens.
        • marcus_holmes 12 hours ago
          the flip side of this is having juniors create some wildly unrealistic expectations in other business units if we're not careful ;)
      • apercu 15 hours ago
        Culture varies.
      • kelipso 15 hours ago
        I don't know... this seems like one of those things that is admired on HN but that I don't see in any of the multiple US companies I've worked at. People are definitely concerned with looking dumb. "Losing face" may be something people here attribute to Asian cultures, but in practice it works similarly here too.
      • sharkweek 10 hours ago
        Best VP I’ve ever had would stop meetings with regular frequency and say, “maybe I’m the dumbest person here, but I don’t understand [insert something being discussed], can you help me get a better understanding?”

        It was anybody’s guess if they really didn’t understand the topic or if they were reading the room, but it was always appreciated.

      • awesome_dude 15 hours ago
        My Entire career

        "Why the f*ck are you asking, you should know this"

        or

        "Why the f*ck can you not look that up"

        edit: There's an entire chapter of the internet with "LMGTFY" responses because people ask questions.

        or

        "Isn't it f*cking obvious"

        or

        "Do I have to f*cking spell it out for you"

        There's a strong chance that I am autistic, which means, yes, I need people to be (more) explicit.

        AI has done a hell of a good job making it easier for me to search for subtexts that I typically miss. And I receive less of the negative feedback when I have something to ask that does help.

        • versteegen 10 hours ago
          A completely flipped perspective:

          > "Why the f*ck are you asking, you should know this"

          Because you mentioned NZ: my father, a toolmaker, said there was a huge difference between Europe and NZ. In Germany/Netherlands, he'd be working under a more senior toolmaker. When he took a job in NZ and asked the boss something, as would have been the proper thing to do in Europe, he got a response just like that: because he was the expert, and his NZ boss was just a manager.

        • jamblewamble 14 hours ago
          Asking questions is a good thing, but that doesn't mean ALL questions. It doesn't include questions you could answer with a Google search or by reading documentation, obviously.
          • awesome_dude 14 hours ago
            Got it - ask questions, but not ones that you already know where the answer is.

            edit: well, except when you search the documentation and get (literally) 70+ results because you don't know the exact phrasing used in the self hosted wiki...

            Or, when it's a question that is domain specific (meaning that the SME is supposed to know it, which you only know if you are... an SME...)

            etc

            • mmh0000 10 hours ago
              Provide context when asking.

              "Hey Bob, I looked here and here and here and didn't find the correct information. Can you show me where to look, or tell me the answer so I can document it?"

              Because most people don’t bother doing the tiniest amount of their own research before asking dumb questions it becomes a huge headache to answer the same thing a million times.

              However, if you can show that you did put in the effort to look up the answer first people will be much more willing to help.

              • awesome_dude 10 hours ago
                So far I have two examples (in this thread) of people making (potentially toxic) judgements about the fact that someone asked a question

                Can you show why you assumed that what you are asking for wasn't provided?

                Can you also explain why your response is to make rather harsh judgements rather than work out what was going on in the first place?

            • watwut 5 hours ago
              > Got it - ask questions, but not ones that you already know where the answer is.

              More like: do the quick Google search or check the doc before asking that question. If the quick search or a look into the doc doesn't turn up the answer, ask.

      • lanstin 17 hours ago
        I have worked at a place where people were routinely criticized for asking basic questions on a big all-dev DL (which was archived and searchable, so the questions actually added to a growing record). The preferred solution was to ask a co-worker on the same team. People who answered a lot of questions were also criticized for being helpful. In neither case did the criticism come so much from devs as from managers, delivered directly in boss feedback. Also, it had a problem with spreading a good culture and common technical vision to new people, for some reason ( /s )
    • giancarlostoro 18 hours ago
      > They’re the only people in the org who are still allowed to ask “dumb” questions without losing face

      I strongly disagree: a senior who cannot ask a "dumb question" is a useless developer to me. Ask away, we want to figure things out, we're not mind readers. Every good senior developer I know asks questions and admits when they don't know things. This is humility, and in my eyes it's a strong marker of a good senior developer. The entire point of our job is to ask questions (most often to ourselves) and figure out the answers (the code).

      Juniors could or should ask more questions, and by the time you're a Senior, you're asking key questions no matter how dumb they sound.

      • cloudfudge 17 hours ago
        You're making the same point as the person you're responding to. They're saying seniors are allowed to ask dumb questions. It's juniors who are often afraid to do so.
    • johnfn 16 hours ago
      > The thing people miss in these “replace juniors with AI” takes is that juniors were never mainly about cheap hands on keyboards. They’re the only people in the org who are still allowed to ask “dumb” questions without losing face, and those questions are often the only signal you get that your abstractions are nonsense.

      This seems almost entirely wrong to me? That anyone, at any level of seniority, can ask "dumb questions" and give signal about "nonsense abstractions" seems a property of any healthy organization. That only juniors can do this doesn't just seem wrong, it seems backwards. I would expect seniors to have the clearest idea on whether abstractions make sense, not juniors.

      • t43562 2 hours ago
        People who are new to the business should be able to challenge the assumptions that the business has built up over time and ceased to question.

        They are the most insecure, however, not knowing who will be annoyed, shown up, or embarrassed by that question if it suggests that some past decisions were wrong.

    • tyre 19 hours ago
      This is a really good and under appreciated point. My recommendation to mid-level, senior, and staff engineers is to keep questioning decisions and create a culture where that’s encouraged.

      Junior devs do that naturally (if you have the culture) because they don’t know anything already. It’s great

      • 47928485 19 hours ago
        > My recommendation to mid-level, senior, and staff engineers is to keep questioning decisions and create a culture where that’s encouraged.

        Tell me you've never worked at FAANG without telling me you've never worked at FAANG...

        • lkjdsklf 16 hours ago
          I’ve worked at a couple FAANG and questioning things is pretty much always viewed as a positive when it’s done in a productive and professional manner

          What isn’t viewed positively is when you refuse to accept a decision after it’s been made and your concerns have been heard. People get pissed if you keep relitigating the same points over and over again

          • kyralis 7 hours ago
            My advice to engineers is always:

            Your job is to make sure that the decision makers, when they're not you, have the information needed to make competent decisions. You should keep arguing when (a) there is credible reason to believe that important information has not been heard or understood or (b) when new information has come to light that you credibly believe might change the decision. In the absence of those two, you should accept that you have done your job and let your managers do theirs, even if you disagree with them. Bring it back up when (a) or (b) changes, and not until.

        • jauer 14 hours ago
          I've worked in various teams on the infrastructure side of a FAANG from early career/L4 to sr staff eng/L7 and have always been encouraged and rewarded for asking questions, even when those questions have led to unexpected multimillion dollar costs and in one case a loss of ~1% of fleetwide compute capacity.

          I think this comes down to how you go about asking. You have to take the time to understand what is there and how it's seen by others, by being curious, reading docs, etc., instead of rolling in making assertions disguised as questions to assert authority, like so many are wont to do.

          I suppose it's possible that I'm the designated court jester and that's why I can get away with questioning, but I don't think that's the case :)

          • iyihz 13 hours ago
            How did the questions lead to costs? Like your questions highlighted issues that already existed that you all then had to fix?
        • ebiester 18 hours ago
          Some of us intentionally avoid FAANG for that reason.
        • tyre 15 hours ago
          I’ve never worked at a FAANG and have no intention to.

          I did work at Stripe, which in places did this pretty well. It still felt like a huge company (this was back in 2022) that had lost part of that spirit depending on org leadership. I had to pull that out of engineers who had been scared out of that level of vulnerability. But building that trust is part of leadership and great people tend to want to question and improve things.

        • kyralis 7 hours ago
          As a staff engineer at FAANG... tell me you've never worked at FAANG without telling me you've never worked at FAANG.

          I've given talks on work/life balance -- and I stand by those talks enough to argue with directors and above when needed, though it rarely is -- and an important part of that talk is about how much better it can look when you can intelligently describe the limits of your knowledge, skills, and estimation.

          If you get penalized for that, you're just in a shit role with a shit manager. Don't project that on the rest of us.

        • iwontberude 18 hours ago
          Usually the people who question decisions are shot down because they don't have a holistic understanding of the decision and (respectfully) don't have good arguments. This is only because they are focused on some narrow aspect of the business, which distorts or reduces their visibility and understanding.
          • wkat4242 18 hours ago
            One of the most regulated industries, aviation requires crews to go through crew management training where it's explicitly trained for lower level staff to raise concerns in spite of perceived superior knowledge.

            Some of the biggest accidents have happened precisely because this didn't happen. Like Tenerife, where the flight engineer had been listening to the radio and raised doubts about the runway being clear, but was ignored by the overconfident captain.

          • lanstin 11 hours ago
            This thinking pattern exactly illustrates how a group of very intelligent people can make disastrously bad decisions without anyone challenging them. Don't look for holes in the arguments of people saying you are making a bad decision; look for the information they have that you do not have or have not explicitly analyzed. If you think you have all the information that the org possesses, go right ahead and make your choices without others; you might be lucky and be Steve Jobs post-2000.
          • mikrl 18 hours ago
            > This is only because they are focused on some narrow aspect of the business

            Is this a bad thing though? If some technical decision has downside risk, I’d reasonably expect:

            - the affected stakeholder to bring it up

            - the decision maker to assuage the stakeholder’s concern (happy path) or triage and escalate

            • iwontberude 18 hours ago
              I think you are right. It's still worth encouraging people to question decisions even though most of the time it won't be the right compromise for the business.
          • fn-mote 16 hours ago
            This is definitely a best case scenario.

            As important as I think questioning is, there’s another side of it where people push their own agenda with questions on topics that were decided by other/more senior people hashing it out. At some point this does need to be dealt with. All I see is the yapping questions wasting meeting time, though.

          • watwut 5 hours ago
            > Usually the people who question decisions are shot down because they don't have a holistic understanding of the decision and (respectfully) don't have good arguments.

            It takes months of dysfunction until the customer says "I do not want to work with you anymore", or until the "over time and over budget" thing suddenly becomes too large and the problems show up in the numbers. Or until a key team suddenly falls apart. Every single time I have seen that, multiple people had tried to communicate about the issue and were shot down.

            It is not like management was always "holistically" right and everyone below just didn't see the big picture or had bad arguments - they usually just do not know what is going on at the lower levels. Failure to actually listen, whether because it feels bad or because it would take time, is quite common.

          • AdrianB1 18 hours ago
            I don't know how to calculate "usually", in my experience people who question decisions in my company are shot down because the decision makers are usually (no pun intended) very incompetent and the questions make that visible, even if not intended. Many companies that I know are so corrupt that competent people are considered to be dangerous for the status quo.
        • MrChoke 10 hours ago
          Most of us haven't, good for you
    • ebiester 18 hours ago
      So, I think there are two models.

      One is a "one junior per team" model. I endorse this for exactly the reasons you speak.

      Another, as I recently saw, was a 70/30 model of juniors to seniors. You make your seniors task delegators and put all implementation on the junior developers. This puts an "up or out" pressure and gives very little mentorship opportunity. If 70% of your engineers are under 4 years of experience, it can be a rough go.

      • jorvi 18 hours ago
        That second model is basically the hospital model.

        You have 1 veteran doctor overseeing 4 learning doctors. Operating rooms do this, for example: they will have 4 operating rooms with 4 less experienced anesthesiologists, and then 1 very experienced anesthesiologist who rotates between the 4 and is on call for when shit hits the fan.

        Honestly, I think everyone here is missing the forest for the trees. Juniors' main purpose isn't to "ask questions", it's to turn into capable seniors.

        That's also why the whole "slash our junior headcount by 3/4ths" we are seeing across the industry is going to massively, massively backfire. AI / LLMs are going to hit a wall (well, they already hit it a while ago), and suddenly everyone will be scrambling for seniors, but there will be none, because no one wanted to bear the 'burden' of training juniors into seniors. You thought dev salaries were insane now? Wait until 4-5 years from now.

        • p_l 2 hours ago
          Fred Brooks proposed the "surgical team" structure, where, as people gain experience, they "bud out" new teams - i.e. the most senior person after "the surgeon" ultimately leaves the team to become "surgeon" of a new team
        • marcosdumay 17 hours ago
          I guess Peopleware couldn't get every single thing right.

          A hospital model may be a good idea. One where you have a senior programmer and many junior ones doing most tasks isn't. IMO, something closer to a real hospital team, where you have experts of different disciplines, and maybe a couple of juniors composing the team has much higher chances of success.

          • jorvi 17 hours ago
            > something closer to a real hospital team, where you have experts of different disciplines

            That is not how hospitals work. The surgery department won't have a crack team of different disciplines teaching budding surgeons everything. They'll only have veteran surgeons guiding less-experienced surgeons.

            What you will have is interdepartmental cooperation / hand-off, and you'll have multi-discipline roles like surgical oncologist.

            In the same way, you won't have devops seniors training front-end juniors.

            • marcosdumay 17 hours ago
              A surgery team has a surgeon, an anesthesiologist, a nurse specialized in material handling who oversees the material usage in the procedure, and maybe a nurse specialized in equipment handling. None of those people are junior or subordinate to the others.
      • tetha 14 hours ago
        In my operational team, I'm following a third model, inspired by German trade workers. You have juniors, journeymen and masters. Juniors are generally clueless and need to be told what to do, specifically. This is very much the level of "Here are 28 marks that need bolts placed in concrete, make it so, I can explain why". Journeymen should figure out a plan for how to solve a project and challenge it with the master to see if it fits the master's quality requirements.

        And practically, you can have one or two journeymen per master. However, these 2-3 people can in turn support 3-4 more juniors to supply useful work.

        This also establishes a fairly natural growth of a person within the company. First you do standard things as told. Then you start doing projects that mostly follow a standard that has worked in the past. And then you start standardizing projects.

      • lanstin 10 hours ago
        My first big job was the 1 junior per team; those years were extremely good for learning how to design and write high performance services. Since then, I've mostly been at the 70/30 places where I'm considered senior. Occasionally I just sit down and blast out a big software project, just to feel I am still able, but mostly I tend the garden hoping that a few of the fragile stems will survive and grow into mighty oaks.
      • AdrianB1 18 hours ago
        With the subjective view of what a junior is, I think the 70-30 - or higher - model is used in every company I have ever interacted with. For this evaluation I consider a junior someone with fewer skills than needed to do the job autonomously, who requires direction and supervision most of the time, and a senior someone who can work autonomously.
    • jappgar 2 hours ago
      I generally agree with you but AI confusion is also a good signal your abstractions are nonsense.

      One problem there is that people would rather believe the AI is "dumb" than face the facts.

    • veunes 4 hours ago
      I agree, the routine is gone, but at what cost? Understanding "how our system fits together" means solving problems, reading code, and debugging. If those fundamental skills aren't built through "humiliating and boring" tasks, how will a junior understand how the system actually works, not just how it appears to work?
    • sailfast 18 hours ago
      I think the biggest challenge now becomes how more seasoned engineers teach juniors. The AI makes the ramp a lot easier but you still do best when you understand the whole stack and make mistakes.

      It’s damned near impossible to figure out where to spend your time wisely to correct an assumption a human made vs. an AI on a blended pull request. All of the learning that happens during PR review is at risk in this way and I’m not sure where we will get it back yet. (Outside of an AI telling you - which, to be fair, there are some good review bots out there)

      • startupsfail 18 hours ago
        Junior engineers now learn from AIs. And AIs now learn from RL cost functions. And RL cost functions are being set by PhDs, with little to no production grade engineering experience ;)

        The result is interesting. First, juniors are miserable. What used to be a good experience - coding and debugging in a state of flow - is now anxiously waiting to see whether an AI could do it or not.

        And senior devs are also miserable. Having apprentices used to be fun and profit; working with someone young is uplifting, and now it is gone.

        The code quality is going down, Zen cycle interrupted, with the RL cost functions now at the top.

        The only ones who are happy are the hapless PhDs ;)

    • HeavyStorm 17 hours ago
      I sense a lot of hate/baggage in the subtext of this post.

      Really, juniors are only important because they ask "dumb" questions that can help remove useless abstractions? That's your take?

    • never_inline 8 hours ago
      If that was the case I would be chilling during my junior years.

      Juniors are usually given either grunt or low priority work while seniors get more "important" work.

      OTOH, it takes a lot to get your questions to the RIGHT EARS when you're a junior, so I wouldn't agree with your characterization at all.

      • int_19h 6 hours ago
        It really depends on the workplace. Some places will give juniors serious work items specifically to grow them.
    • PunchyHamster 12 hours ago
      I don't think many orgs learn all that much from coaching their juniors, at least after the first few.

      Juniors are just... necessary for the balance: have too few of them and the mid and senior devs will get more and more expensive. So you hire a bunch of juniors, they get trained on the job, and it balances out.

      Hell, if the company does it right they might underpay a junior-turned-senior for a decade before they notice and look at how industry pay compares!

    • TheGRS 14 hours ago
      I don't agree that this is the central value juniors provide. It's a nice tertiary value, but not why one hires them. I think the value is the latter part: farming new talent and just growing your team.

      I still think the central issue is the economy. There are more seniors available to fill roles, so filling out junior roles is less desirable. And perhaps "replacing juniors with AI" is just the industry's way of clumsily saving face.

    • marcosdumay 17 hours ago
      Yes, the most helpful thing AIs do is guide people into popular environments they don't know very well.

      Or in other words, the people that get the most value from AI are junior devs, since they still don't know very well plenty of popular environments. It's also useful for seniors that are starting something in a new environment, but those only get 1 or 2 novel contexts at a time, while everything is new for a junior.

      Or, in yet another set of words, AI enables juniors to add more value more quickly. That makes them more valuable, not less.

    • Waterluvian 16 hours ago
      That's a perplexing take based on how I've experienced the past 15 years: the more senior someone is, the more questions they tend to ask.
    • frtime2025 18 hours ago
      AI will replace jobs. People are putting their IT/dev budget into something, which means something else will be cut.

      I also don’t believe juniors, mids, seniors, staff, principals, or distinguished/fellows should be replaced by AI. I think they WILL be, but they shouldn’t be. AI at the Gemini 3 Flash / Claude Opus 4.5 level is capable, with help and review, of doing a lot of what a lot of devs currently do. It can’t do everything and will fail, but if the business doesn’t care, they’ll cut jobs.

      Don’t waste time trying to argue against AI to attempt to save your job. Just learn AI and do your job until you’re no longer needed to even manage it. Or, if you don’t want to, do something else.

      • marcosdumay 17 hours ago
        > People are putting their IT/dev budget into something, which means something else will be cut.

        That's not how things work in normal times.

        But normal times require minimally capable managers, a somewhat competitive economy, and some meritocracy in hiring. I can believe that's how things will work this time, but it's still a stupid way to do it.

    • epgui 15 hours ago
      I’m a senior engineer on a staff track, I am proud to ask “dumb” questions all the time, and I don’t want to work somewhere where I don’t feel safe pursuing knowledge openly and candidly.
    • dejj 18 hours ago
      Agree.

      > cargo-culting Stack Overflow

      What do you mean by this? I understand “cargo-culting” as building false idols, e.g. wooden headphones and runways to attract airplanes that never come.

      • kjellsbells 18 hours ago
        It means to copy code or instructions from a site into your own project without having any comprehension of how or why it works.

        example: you have a Windows problem. You search and read that "sfc /scannow" seems a popular answer to Windows problems. You run it without ever understanding what sfc does, whether the tool is relevant to your problem, etc. You are cargo culting a solution.

      • PaulStatezny 17 hours ago
        I think the idea is copy-pasting code snippets from StackOverflow without comprehension of whether (and how) the code fixes the problem.
      • kunley 17 hours ago
    • citizenpaul 14 hours ago
      The real thing people miss is not AI replacing Juniors. Its that senior management soured on hiring juniors even a few years before AI, almost across all industries.

      AI is now just the scapegoat for an economy-wide problem. Execs found "one neat trick": piling junior work on seniors until they quit, while not hiring replacements, in order to goose short-term profits. Now every company is in the same position, where hiring a senior really means hiring 5 seniors to replace the one who had 5 jobs layered on over a few years. This is of course impossible for any mortal to jump into. Now they also don't even have juniors to train up to senior levels.

    • flyinglizard 15 hours ago
      Good juniors are also great at just working. They usually have no family, so they are able to put a lot of attention into work, and they have that innocent curiosity and can-do attitude which brings a lot of positive energy.
      • never_inline 8 hours ago
        > put in a lot of attention into work and they have that innocent curiosity

        They're also good at putting company code into ChatGPT.

        /snark

    • protocolture 8 hours ago
      >They’re the only people in the org who are still allowed to ask “dumb” questions without losing face

      This is the only role of executives, sales people, account managers. They usually do it with complete and utter confidence too. Vibe-questioning and vibe-instructing other people without a care in the world.

    • jppope 18 hours ago
      This is actually a super power I have after spending my first part of my career in sales.

      I was never formally trained, so I just keep asking "why" until someone proves it all the way. Sales itself is also a lot about asking questions that wouldn't otherwise come up, to find the heart of the thing people actually want... which is just the other side of the coin.

    • aerhardt 18 hours ago
      I mean, that’s an interesting take, but “having people around to ask dumb questions” is not why most orgs hire juniors.
      • lingrush4 17 hours ago
        In my experience, juniors are absolutely terrified of asking any sort of question at all during a meeting. Senior engineers are far more likely to ask interesting, useful questions.

        We hire juniors so that we can offload easy but time-consuming work on them while we focus on more important or more difficult problems. We also expect that juniors will eventually gain the skills to solve the more difficult problems as a result of the experience they gain performing the easy tasks.

        If we stop hiring juniors now, then we won't have any good senior engineers in 5-10 years.

    • throwaway894345 8 hours ago
      I think the main benefit of junior devs is that it’s the only pipeline for getting senior devs. The other benefit over AI is that most of software engineering is not writing the code, but doing all of the other stuff like working out what to build, flagging concerns, operating the software once it is running, etc.
    • SecretDreams 9 hours ago
      I'll also add the obvious answer in that real companies constantly have seniors leaving/retiring. Juniors are meant to be trained up to be the future seniors of the company. You should consistently feed and grow this pipeline unless you think you won't exist in the future or AI will replace all jobs, period.
    • hiddencost 19 hours ago
      Yikes. Sounds like you work for a toxic company. The mid and senior level engineers I know all go out of their way to ask the dumb questions. Every junior employee I've mentored, I've told them the main way they can fail is not asking questions early and often. Gotta build a culture that supports questions.
    • iwontberude 18 hours ago
      [flagged]
    • deepGem 18 hours ago
      What will eventually pan out is that senior devs will be replaced with junior devs powered by AI assistants. Simply because of the reasons you stated. They will ask the dumb important questions and then after a while, will even solve for them.

      Now that their minds are free from routine and boilerplate work, they will start asking more 'whys' which will be very good for the organization overall.

      Take any product - nearly 50% of the features are unused, and it's a genuine engineering waste to maintain them. A junior dev spending 3 months on the code base with Claude Code will figure out these hidden unwanted features and cull them, or ask for them to be culled.

      It'll take a while to navigate the hierarchy but they'll figure it out. The old guard will have no option but to move up or move out.

      • throwway120385 18 hours ago
        Why would Claude code help you find unused features? The end customer uses features, not the AI. I would want to know from the end customer whether a feature is unused, and a Junior with an LLM assistant is not going to be able to tell me that without adding new features to the code base.
        • deepGem 14 hours ago
          I'm using Claude Code as an approximation here. 2 years down the line, the tooling around analytics will get integrated into AI assistants, and they will absolutely be able to figure out unused features.
      • alwa 18 hours ago
        How do you suppose the old guard are filling their days now?

        At some level, aren’t you describing the age-old process of maturing from junior to mid level to senior in most lines of work, and in most organizations? Isn’t that what advancing in responsibility boils down to: developing subtlety and wisdom and political credibility and organizational context? Learning where the rakes are?

        I wish 3 months, or even 3 years, were long enough to fully understand the whys and wherefores and politics of the organizations I cross paths with, and the jungle of systems and code supporting all the kinds of work that happen inside…

  • simonw 21 hours ago
    Relevant post by Kent Beck from 12th Dec 2025: The Bet On Juniors Just Got Better https://tidyfirst.substack.com/p/the-bet-on-juniors-just-got...

    > The juniors working this way compress their ramp dramatically. Tasks that used to take days take hours. Not because the AI does the work, but because the AI collapses the search space. Instead of spending three hours figuring out which API to use, they spend twenty minutes evaluating options the AI surfaced. The time freed this way isn’t invested in another unprofitable feature, though, it’s invested in learning. [...]

    > If you’re an engineering manager thinking about hiring: The junior bet has gotten better. Not because juniors have changed, but because the genie, used well, accelerates learning.

    • beAbU 20 hours ago
      Isn't the struggling with docs and learning how and where to find the answers part of the learning process?

      I would argue a machine that short circuits the process of getting stuck in obtuse documentation is actually harmful long term...

      • chaos_emergent 19 hours ago
        Isn't the struggle of sifting through a labyrinth of physical books and learning how and where to find the right answers part of the learning process?

        I would argue a machine that short-circuits the process of getting stuck in obtuse books is actually harmful long term...

        • sfpotter 19 hours ago
          It may well be. Books have tons of useful expository material that you may not find in docs. A library has related books sitting in close proximity to one another. I don't know how many times I've gone to a library looking for one thing but ended up finding something much more interesting. Or to just go to the library with no end goal in mind...
          • calepayson 19 hours ago
            Speaking as a junior, I’m happy to do this on my own (and do!).

            Conversations like this are always well intentioned, and friction truly is super useful to learning. But the ‘…’ in these conversations always seems to imply that we should inject friction.

            There’s no need. I have peers who aren’t interested in learning at all. Adding friction to their process doesn’t force them to learn. Meanwhile adding friction to the process of my buddies who are avidly researching just sucks.

            If your junior isn’t learning it likely has more to do with them just not being interested (which, hey, I get it) than some flaw in your process.

            Start asking prospective hires what their favorite books are. It’s the easiest way to find folks who care.

            • weakfish 19 hours ago
              I’ll also make the observation that the extra time spent is very valuable if your objective solely is learning, but often the Business™ needs require something working ASAP
            • sfpotter 19 hours ago
              You're reading a lot into my ellipsis that isn't there. :-)

              Please read it as: "who knows what you'll find if you take a stop by the library and just browse!"

            • alwa 17 hours ago
              I admire your attitude and the clarity of your thought.

              It’s not as if today’s juniors won’t have their own hairy situations to struggle through, and I bet those struggles will be where they learn too. The problem space will present struggles enough: where’s the virtue in imposing them artificially?

          • bee_rider 19 hours ago
            This should be possible online; it would be, if more journals were open access.
            • sfpotter 19 hours ago
              Disagree, actually. Having spent a lot of time publishing papers in those very journals, I can tell you that just browsing a journal is much less conducive to discovering a new area to dive into than going to a library and reading a book. IME, books tend to synthesize and collect important results and present them in an understandable (pedagogical?!) way that most journals do not, especially considering that many papers (nowadays) are written primarily to build people's tenure packets and secure grant funding. Older papers aren't quite so bad this way (say, pre-2000).
              • bee_rider 15 hours ago
                I think I don’t disagree. Only, it would at least be easier to trace the research concept you are interested in up to a nice 70’s paper or a textbook.
          • usefulcat 19 hours ago
            You could make much the same observation about online search results.
        • GeoAtreides 19 hours ago
          When I first opened QBasic, <N> years ago, when I was a wee lad, the online QBasic help didn't replace my trusty qbasic book (it supplemented it, maybe), nor did it write the programs for me. It was just there, doing nothing, waiting for me to press F1.

          AI, on the other hand...

        • beAbU 4 hours ago
          I'm not sure what you are trying to say here, or if you are trying to diminish my statement by somehow claiming that online documentation causes the same magnitude of harm as using a book?

          Two things:

          1 - I agree with you. A good printed resource is incredibly valuable and should be perfectly valid in this day and age.

          2 - many resources are not in print, e.g. API docs, so I'm not sure how books are supposed to help here.

        • deepsquirrelnet 11 hours ago
          It’s an interesting question, isn’t it? There are obvious benefits to being able to find information quickly and precisely. However, the search becomes much narrower, and what must inevitably result is a homogeneity of outcomes.

          Eventually we will have to somehow convince AI of new and better ways of doing things. It’ll be propaganda campaigns waged by humans to convince God to deploy new instructions to her children.

          • 627467 10 hours ago
            > inevitably result is a homogeneity of outcomes

            And this outcome will be obvious very quickly to most observers, won't it? So the magic will occur by pushing AI beyond another limit, or by just having people go back to specializing in what will eventually become boring and procedural, until AI catches up.

        • ori_b 19 hours ago
          Well, yes -- this is why I still sit down and read the damn books. The machine is useful to refresh my memory.
        • mplewis 17 hours ago
          You posted this in jest but it's literally true. You need to read the whole book to get the context. You SHOULD be reading the manuals and the docs. They weren't written because they're fun.
        • thinkingemote 19 hours ago
          learning to learn
      • Aurornis 19 hours ago
        I recall similar arguments being made against search engines: People who had built up a library of internal knowledge about where and how to find things didn't like that it had become so easy to search for resources.

        The arguments were similar, too: What will you do if Google goes down? What if Google gives the wrong answer? What if you become dependent on Google? Yet I'm willing to bet that everyone reading this uses search engines as a tool to find what they need quickly on a daily basis.

        • CharlieDigital 19 hours ago
          I argue that there is a strong, strong benefit to reading the docs: you often pick up additional context and details that would be missing in a summary.

          Microsoft docs are a really good example of this where just looking through the ToC on the left usually exposes me to some capability or feature of the tooling that 1) I was not previously aware of and 2) I was not explicitly searching for.

          The point is that the path to a singular answer can often include discovery of unrelated insight along the way. When you only get the answer to what you are asking, you lose that process of organic discovery of the broader surface area of the tooling or platform you are operating in.

          I would liken AI search/summaries to visiting only the well-known, touristy spots. Sure, you can get shuttled to that restaurant or that spot that everyone visits and posts on socials, but in traveling that way, you will miss all of the other amazing food, shops, and sights along the way that you might encounter by walking instead. Reading the docs is more like exploring the random nooks and crannies and finding experiences you weren't expecting and ultimately knowing more about the place you visited than if you had only visited the major tourist destinations.

          As a senior dev, I generally have a good idea of what to ask for because I have built many systems and learned many things along the way. A junior dev? They may not know what to ask for, and therefore may never discover those "detours" that would yield additional insights to tuck into the manifolds of their brains for future reference. For the junior dev, it's as if the only trip they will ever take is one where they just go to the well-known tourist traps instead of exploring and discovering.

        • raw_anon_1111 10 hours ago
          I have been online since 1993 on Usenet. That was definitely not a widespread belief. We thought DejaNews was a godsend.
        • Cpoll 18 hours ago
          It's possible those arguments are correct. I wouldn't give up Google and SO, but I suspect I was learning faster when my first stop was K&R or a man page. There's a lot of benefit in building your own library of knowledge instead of cribbing from someone else's.

          Of course no-one's stopping a junior from doing it the old way, but no-one's teaching them they can, either.

      • rafaelmn 19 hours ago
        No, trying stuff out is the valuable process. How I search for information changed (dramatically) in the last 20 years I've been programming. My intuition about how programs work is still relevant - you'll still see graybeards saying "there's a paper from 70s talking about that" for every "new" fad in programming, and they are usually right.

        So if AI gets you iterating faster and testing your assumptions/hypothesis I would say that's a net win. If you're just begging it to solve the problem for you with different wording - then yeah you are reducing yourself to a shitty LLM proxy.

      • tencentshill 19 hours ago
        The naturally curious will remain naturally curious and be rewarded for it, everyone else will always take the shortest path offered to complete the task.
        • macintux 19 hours ago
          > The naturally curious will remain naturally curious and be rewarded for it

          Maybe. The naturally curious will also typically be slower to arrive at a solution due to their curiosity and interest in making certain they have all the facts.

          If everyone else is racing ahead, will the slowpokes be rewarded for their comprehension or punished for their poor metrics?

          • rgoulter 13 hours ago
            > If everyone else is racing ahead, will the slowpokes be rewarded for their comprehension or punished for their poor metrics?

            It's always possible to go slower (with diminishing benefits).

            Or I think putting it in terms of benefits and risks/costs: I think it's fair to have "fast with shallow understanding" and "slower but deeper understanding" as different ends of some continuum.

            I think what's preferable somewhat depends on context & attitude of "what's the cost of making a mistake?". If making a mistake is expensive, surely it's better to take an approach which has more comprehensive understanding. If mistakes are cheap, surely faster iteration time is better.

            The impact of LLM tools? LLM tools increase the impact of both cases. It's quicker to build a comprehensive understanding by making use of LLM tools, similar to how stuff like autocompletion or high-level programming languages can speed up development.

      • marcosdumay 17 hours ago
        > learning how and where to find the answers part of the learning process?

        Yes. And now you can ask the AI where the docs are.

        The struggling is not the goal. And rest assured there are plenty of other things to struggle with.

      • PaulKeeble 19 hours ago
        The thing is, you need both. You need periods where you are reading through the docs, learning random things, and just expanding your knowledge - but the time to do that is not when you are trying to work out how to get a string into the right byte format and saved in the database as a blob (or whatever it is). Documentation has always had lots of different uses, and the one that gets you answers to direct questions has improved a bit, but it's not really reliable yet, so you are still going to have to check it.
      • fireflash38 16 hours ago
        The problem isn't that AI makes obtuse documentation usable. It's that it makes good documentation unread.

        There's a lot of good documentation where you learn more about the context of how or why something is done a certain way.

      • supersour 19 hours ago
        I think if this were true, then individualized mastery learning wouldn't prove to be so effective

        https://en.wikipedia.org/wiki/Mastery_learning

        • amrocha 18 hours ago
          Except none of us have a master teaching and verifying our knowledge on how to use a library. And AI doesn’t do that either.
      • throwaway613745 19 hours ago
        The best part is when the AI just makes up the docs
      • pizza234 19 hours ago
        It really depends on what's being learned. For example, take writing scripts based on the AWS SDK. The API documentation is gigantic (and poorly designed, as it takes ages to load the documentation for each entry), and one uses only a tiny fraction of the APIs. I don't find "learning to find the right APIs" valuable knowledge; rather, I find "learning to design a (small) program/script starting from a basic example" valuable, since I waste less time on menial tasks (i.e. textual search).
        • Terr_ 19 hours ago
          > It really depends on what's being learned.

          Also the difference between using it to find information versus delegating executive-function.

          I'm afraid there will be a portion of workers who crutch heavily on "Now what do I do next, Robot Soulmate?"

      • Ifkaluva 19 hours ago
        No :)

        Any task has “core difficulty” and “incidental difficulty”. Struggling with docs is incidental difficulty, it’s a tax on energy and focus.

        Your argument is an argument against the use of Google or StackOverflow.

        • skydhash 19 hours ago
          Not really. There’s a pattern to reading docs, just like there’s a pattern to reading code. Once you’ve grasped it, your speed increases a lot. The slowness juniors have comes from a lack of understanding.

          Complaining about docs is like complaining that a research article is not written like an elementary school textbook.

      • tikhonj 19 hours ago
        Struggling with poorly organized docs seems entirely like incidental complexity to me. Good learning resources can be both faster and better pedagogically. (How good today's LLM-based chat tools are is a totally separate question.)
        • amrocha 18 hours ago
          Nobody said anything about poorly organized docs. Reading well structured and organized complex material is immensely difficult. Anyone who’s read Hegel can attest to that.

          And yet I wouldn’t trust a single word coming out of the mouth of someone who couldn’t understand Hegel so they read an AI summary instead.

          There is value in struggling through difficult things.

      • jimbokun 19 hours ago
        Why?

        If you can just get to the answer immediately, what’s the value of the struggle?

        Research isn't coding time. So it's not making the developer less familiar with the code base she's responsible for. Which is the usual worry with AI.

      • schainks 19 hours ago
        Disagree. While documentation is often out of date, the threshold for maintaining it properly has been lowered, so your team should be doing everything it can to surface effective docs to devs and AIs looking for them. This, in turn, also lowers the barrier to writing good docs since your team's exposure to good docs increases.

        If you read great books all the time, you will find yourself more skilled at identifying good versus bad writing.

      • jaapbadlands 19 hours ago
        Feel free to waste your time sifting through a dozen wrong answers. Meanwhile the rest of us can get the answers, absorb the right information quickly then move on to solving more problems.
        • bigstrat2003 18 hours ago
          And you will have learned nothing in the process. Congratulations, you are now behind your peer who "wasted his time" but actually knows stuff which he can lean on in the future.
          • twosdai 13 hours ago
            This is a wrong take. People learn plenty while using AI; it's how you use it that matters. The same issue happened years ago if you just copied Stack Overflow without understanding what you were doing.

            It's no different now, just the level of effort required to get the code copy is lower.

            Whenever I use AI I sit and read and understand every line before pushing. It's not hard. I learn more.

      • bigstrat2003 18 hours ago
        Yes, it is. And yes, it absolutely is harmful.
      • tayo42 19 hours ago
        If the docs are poorly written then you're not learning anything except how to control frustration
      • seanmcdirmid 19 hours ago
        1965: learning how to punch your own punch cards is part of the learning process

        1995: struggling with docs and learning how and where to find the answers is part of the learning process

        2005: struggling with stackoverflow and learning how to find answers to questions that others have asked before quickly is part of the learning process

        2015: using search to find answers is part of the learning process

        2025: using AI to get answers is part of the learning process

        ...

        • lifeformed 18 hours ago
          Has the quality of software been improving all this time?
          • seanmcdirmid 17 hours ago
            The volume of software that we have produced with new tools has increased dramatically. The quality has remained at a level that the market can accept (and it doesn't want to bother paying for more quality for the cost of it).
          • tester756 17 hours ago
            Sure, people were writing terrible code 25 years ago

            XML oriented programming and other stuff was "invented" back then

        • wizzwizz4 18 hours ago
          This is both anachronistic and wrong.

          To the extent that learning to punch your own punch cards was useful, it was because you needed to understand the kinds of failures that would occur if the punch cards weren't punched properly. However, this was never really a big part of programming, and often it was off-loaded to people other than the programmers.

          In 1995, most of the struggling with the docs was because the docs were of poor quality. Some people did publish decent documentation, either in books or digitally. The Microsoft KB articles were helpfully available on CD-ROM, for those without an internet connection, and were quite easy to reference.

          Stack Overflow did not exist in 2005, and it was very much born from an environment in which search engines were in use. You could swap your 2005 and 2015 entries, and it would be more accurate.

          No comment on your 2025 entry.

          • seanmcdirmid 18 hours ago
            > To the extent that learning to punch your own punch cards was useful, it was because you needed to understand the kinds of failures that would occur if the punch cards weren't punched properly. However, this was never really a big part of programming, and often it was off-loaded to people other than the programmers.

            I thought all computer scientists heard about Dijkstra making this claim at one time in their careers. I guess I was wrong? Here is the context:

            > A famous computer scientist, Edsger Dijkstra, did complain about interactive terminals, essentially favoring the disciplined approach required by punch cards and batch processing.

            > While many programmers embraced the interactivity and immediate feedback of terminals, Dijkstra argued that the "trial and error" approach fostered by interactive systems led to sloppy thinking and poor program design. He believed that the batch processing environment, which necessitated careful, error-free coding before submission, instilled the discipline necessary for writing robust, well-thought-out code.

            > "On the Cruelty of Really Teaching Computing Science" (EWD 1036) (1988 lecture/essay)

            Seriously, the laments I hear now have been the same my entire career as a computer scientist. Let's just look forward to 2035, when someone on HN will complain that some old way of doing things is better than the new way because it's harder and wearing hair shirts is good for building character.

            • wizzwizz4 28 minutes ago
              Dijkstra did not make that claim in EWD1036. The general philosophy you're alluding to is described in EWD249, which – as it happens – does mention punchcards:

              > The naive approach to this situation is that we must be able to modify an existing program […] The task is then viewed as one of text manipulation; as an aside we may recall that the need to do so has been used as an argument in favour of punched cards as against paper tape as an input medium for program texts. The actual modification of a program text, however, is a clerical matter, which can be dealt with in many different ways; my point is […]

              He then goes on to describe what today we'd call "forking" or "conditional compilation" (in those days, there was little difference). "Using AI to get answers", indeed. At least you had the decency to use blockquote syntax, but it's tremendously impolite to copy-paste AI slop at people. If you're going to ingest it, do so in private, not in front of a public discussion forum.

              The position you've attributed to Dijkstra is defensible – but it's not the same thing at all as punching the cards yourself. The modern-day equivalent would be running the full test suite only in CI, after you've opened a pull request: you're motivated to program in a fashion that ensures you won't break the tests, as opposed to just iterating until the tests are green (and woe betide there's a gap in the coverage), because it will be clear to your colleagues if you've just made changes willy-nilly and broken some unrelated part of the program and that's a little bit embarrassing.

              I would recommend reading EWD1035 and EWD1036: actually reading them, not just getting the AI to summarise them. While you'll certainly disagree with parts, the fundamental point that E.W.Dijkstra was making in those essays is correct. You may also find EWD514 relevant – but if I linked every one of Dijkstra's essays that I find useful, we'd be here all day.

              I'll leave you with a passage from EWD480, which broadly refutes your mischaracterisation of Dijkstra's opinion (and serves as a criticism of your general approach):

              > This disastrous blending deserves a special warning, and it does not suffice to point out that there exists a point of view of programming in which punched cards are as irrelevant as the question whether you do your mathematics with a pencil or with a ballpoint. It deserves a special warning because, besides being disastrous, it is so respectable! […] And when someone has the temerity of pointing out to you that most of the knowledge you broadcast is at best of moderate relevance and rather volatile, and probably even confusing, you can shrug out your shoulders and say "It is the best there is, isn't it?" As if there were an excuse for acting like teaching a discipline, that, upon closer scrutiny, is discovered not to be there.... Yet I am afraid, that this form of teaching computing science is very common. How else can we explain the often voiced opinion that the half-life of a computing scientist is about five years? What else is this than saying that he has been taught trash and tripe?

              The full text of much of the EWD series can be found at https://www.cs.utexas.edu/~EWD/.

        • linksnapzz 19 hours ago
          Unironically, yes.

          Now get back to work.

    • lokar 20 hours ago
      For an experienced engineer, working out the syntax, APIs, type issues, understanding errors, etc is the easy part of the job. Larger picture issues are the real task.

      But for many Jr engineers it’s the hard part. They are not (yet) expected to be responsible for the larger issues.

      • bdangubic 20 hours ago
        what is a larger issue? lacking domain knowledge? or lacking the deeper understanding of years of shit in the codebase that seniors may have? where I work, there is no issue that is "too large" for a junior to take on; it is the only way that "junior" becomes "non-junior" - by doing, not by delegating to so-called seniors (I am one of them)
        • dclowd9901 20 hours ago
          "Larger issue" is overall technical direction and architecture, making decisions that don't paint you into a corner, establishing maintainability as a practice, designing work around an organization's structure and habit and so on.

          But these are the things people learn through experience and exposure, and I still think AI can help by at least condensing the numerous books out there around technology leadership into some useful summaries.

        • lokar 15 hours ago
          Just curious, are you mostly FE? I could see this there (but there is still a lot of browser esoterica, etc.)

          Doing backend and large distributed systems, it seems to me, goes much deeper. Types of consistency and their tradeoffs in practice, details like implementing and correctly using Lamport clocks, good API design, endless details about reworking, on and on.

          And then for both, a learned sense of what approaches to system organization will work in the long run (how to avoid needing to stage a re-write every 5 years).
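          To give a sense of how small one of those details looks on paper versus how subtle it is to use correctly, here's a generic Lamport clock sketch (class and method names are mine, not from any particular codebase):

```python
class LamportClock:
    """Minimal Lamport logical clock: orders events across processes."""

    def __init__(self):
        self.time = 0

    def local_event(self):
        # Any internal event advances the local counter.
        self.time += 1
        return self.time

    def send(self):
        # A send is itself an event; the message carries this timestamp.
        self.time += 1
        return self.time

    def receive(self, msg_time):
        # On receive, jump past both our own history and the sender's.
        self.time = max(self.time, msg_time) + 1
        return self.time


# Two processes: a sends to b, so b's receive is ordered after a's send.
a, b = LamportClock(), LamportClock()
sent = a.send()         # a.time = 1
b.local_event()         # b.time = 1
recv = b.receive(sent)  # b.time = max(1, 1) + 1 = 2
```

          The implementation is trivial; knowing when its partial order is enough, and when you actually need vector clocks or stronger consistency, is the experience part.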

          • ehnto 10 hours ago
            I still agree more or less that the best way for a junior to succeed is to jump in the deep end, not without guidance though. Mentorship is really important in distributed systems where the inner machinations can be quite obtuse. But I find you can't just explain it all and expect it to stick, mentoring someone through a task is the best way.
          • phist_mcgee 10 hours ago
            >Just curious, are you mostly FE

            Gatekeeping?

            Why couldn't a backend team have all tasks be junior compatible, if uncoupled from deadlines and time constraints?

        • ekkeke 20 hours ago
          You can't give a junior tasks that require experience and nuance that have been acquired over years of development. If you babysit them, then perhaps, but then what is the point? By its nature, "nuance" is something hard to describe concretely, but as someone who has mentored a fair few juniors, most of them don't have it. AI generally doesn't have it either. Juniors need tasks at the boundary of their capability, but not far beyond it, to be able to progress. Simply allowing them to make a mess of a difficult project is not a good way to get there.

          There is such a thing as software engineering skill and it is not domain knowledge, nor knowledge of a specific codebase. It is good taste, an abstract ability to create/identify good solutions to a difficult problem.

          • ehnto 10 hours ago
            > If you babysit them, then perhaps but then what is the point

            In a long term enterprise the point is building up a long term skillset in the community. Bolstering your team's hive mind on a smaller scale also.

            But work has evolved and the economy has become increasingly hostile to long term building, making it difficult to get buy in for efforts that don't immediately get work done or make money.

          • lokar 14 hours ago
            Much of the job of the Sr is to understand where the Jr is, and give them tasks that are challenging but achievable, and provide guidance.
          • bdangubic 17 hours ago
            you work(ed) in some shitty places if you believe this to be true
            • ekkeke 15 hours ago
              Perhaps, I don't consider them shitty myself but palates differ. Is engineering nirvana a place where tasks are such that any can be done by a junior engineer, and the concept of engineering skill developed through experience is non-existent?
              • bdangubic 12 hours ago
                > Is engineering nirvana a place where tasks are such that any can been done by a junior engineer, and the concept of engineering skill developed through experience is non-existent?

                how does one junior acquire engineering skills except through experience as you said?

        • Aperocky 19 hours ago
          Unnecessary complexity, completely arbitrary one-off designs, over-emphasis on one part of the behavior while ignoring others. Using design patterns where they shouldn't be used, coding once and forgetting that operations exist, using languages and frameworks that are familiar but unfit for the purpose. The list goes on, and I see it happen all the time; AI only makes it worse because it tends to affirm all of it with "You're absolutely correct!".

          Good luck maintaining that.

          • bdangubic 17 hours ago
            this can only happen in a shitty places with incompetent team
            • jason_oster 15 hours ago
              Every team has incompetence at some level. If every team was perfect, there would be no more work left to do, because they would always get the right product built correctly the first time. No bug fix releases, no feature refreshes, no version 2.

              Beware, your ego may steer you astray.

              • bdangubic 12 hours ago
                been hacking 31 years with the same ego, but you never know. and if I learned anything in these years, it is to get the heck out of any place that treats people not by their skills but by how long ago their Mom gave birth to them
        • butwhyth0oo 20 hours ago
          [dead]
      • atomicnumber3 17 hours ago
        This is honestly what I (staff engineer) find AI the most useful for. I've been around the block enough that I typically know in general what I want, but I often find myself wanting it in a new framework or paradigm or similar, and if I could just ASK a person a question, they'd understand it. But not knowing the exact right keywords, especially in frameworks with lots of jargon, can still make it annoying. I can often get what I want by just sitting down and reading approximately 6 screen-heights of text out of the official docs on the general topic in question to find the random sentence 70% of the way down that answered my question.

        But do you know what's really great at taking a bunch of tokens and then giving me a bunch of probabilistically adjacent tokens? Yeah, exactly! So often, even if the AI is giving me something totally bonkers semantically, just knowing all those tokens are adjacent enough gives me a big leg up in knowing how to phrase my next question, and of course sometimes the AI is also accidentally semantically correct too.

      • never_inline 7 hours ago
        When I joined I could do all this.
    • ChuckMcM 19 hours ago
      And this is always my question: "... because the genie, used well, accelerates learning." Does it though?

      How are we defining "learning" here? The example I like to use is that a student who "learns" what a square root is can calculate the square root of a number on a simple 4-function calculator (×, ÷, +, −), albeit iteratively. Whereas the student who "learns" that the √ key gives them the square root is "stuck" when presented with a 4-function calculator. So did they 'learn' faster when the "genie" surfaced a key that gave them the answer? Or did they just become more dependent on the "genie" to do the work required of them?
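      For the curious, one such iterative scheme (the comment doesn't name a specific one) is the Babylonian, or Heron's, method, which needs nothing beyond the four basic operations; a minimal Python sketch:

```python
def sqrt_four_function(n, iterations=30):
    """Approximate sqrt(n) using only +, -, *, / (Babylonian method)."""
    if n < 0:
        raise ValueError("square root of a negative number")
    if n == 0:
        return 0.0
    guess = n  # any positive starting guess converges
    for _ in range(iterations):
        # Average the guess with n / guess; the error shrinks quadratically.
        guess = (guess + n / guess) / 2
    return guess

print(sqrt_four_function(2))  # ~1.4142135623731
```

      The point of the exercise is that the student who knows this can reconstruct √ from first principles; the student who only knows the key cannot.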

      • pests 16 hours ago
        Some random musings this reminded me of.

        I graduated HS in mid 2000s and didn't start using a calculator for math classes until basically a junior in college. I would do every calculation by hand, on paper. I benefited from a great math teacher early on that taught me how to properly lay out my calculations and solutions on paper. I've had tests I've turned in where I spent more paper on a single question than others did on the entire test.

        It really helped my understanding of numbers and how they interacted, and helped teachers/professors narrow down on my misunderstandings.

        • jacquesm 13 hours ago
          Not only that: I suspect you already have an inkling of the expected range of outcomes in your head just from looking through the problem, and any answer that falls outside it will cause you to pause and re-check your work.

          This aspect is entirely missing when you use an oracle.

      • sailfast 18 hours ago
        You still need to be curious. I learn a ton by asking questions of the LLMs when I see new things. “Explain this to me - I get X but why did you do Y?”

        It's The Diamond Age and a half - you just need to continue to be curious, and perhaps slow your shipping speed sometimes to make sure you budget time for learning as well.

        • simonw 18 hours ago
          I think that's the "used well" in "because the genie, used well, accelerates learning".
    • almosthere 20 hours ago
      We had 3 interns this past summer - with AI I would say they were VERY capable of generating results quickly. Some of the code and assumptions were not great, but it did help us push out some releases quickly to alleviate customer issues. So there is a tradeoff with juniors. May help quickly get features out, may also need some refactoring later.
      • fastball 14 hours ago
        What makes them more capable than a senior engineer with three LLM agents?
      • turnsout 20 hours ago
        Interesting how similar this is to the tradeoff of using AI coding agents
    • golly_ned 13 hours ago
      > Instead of spending three hours figuring out which API to use, they spend twenty minutes evaluating options the AI surfaced

      This really isn't the case from what I've seen. It's that they use Cursor or other code generation tools integrated into their development environment to generate code, and if it's functional and looks from a fuzzy distance like 'good' code (in the 'code in the small' sense), they send an oversized PR, and it's up to the reviewer to actually do the thinking.

      • simonw 37 minutes ago
        That's bad and those juniors should be taught to do better or be "managed out of the company".

        Their job is to deliver code that they have proved to work.

      • rustystump 13 hours ago
        This. I have seen MRs with generated OpenCV LUT mapping in them because a junior didn't understand that what they needed was a simple interpolation function.

        The crux is always that you don't know what you don't know. AI doesn't fix this.

    • sharemywin 19 hours ago
      I think the big win with AI is being able to work around jargon. Don't know what that word means? Ask AI. Want the history on it? No problem. Don't understand a concept? Have it explained at a high school reading level.
    • xp84 15 hours ago
      > the genie, used well, accelerates learning.

      Ehh... 'used well' is doing some very heavy lifting there. And the incentive structure at 90% of companies does not optimize for 'using it well.'

      The incentive is to ship quickly, meaning aim the AI-gun at the codebase for a few hours and grind out a "technically working" solution, with zero large-scale architecture thought and zero built-up knowledge of how the different parts of the application are intended to work together (because there was no "intention"). There will be tests, but they may not be sensible and may be very brittle.

      Anyway, deploying a bunch of fresh grads armed not with good mentorship but with the ability to generate thousands of LOC a day is a recipe for accelerating the collapse I usually see in startup codebases about 6-8 years old. This is the point where the list of exceptions to every supposed pattern is longer than the list of things that follow the patterns, and where each bug, when properly pursued, leads to a long chain of past bad decisions, each of which would take days of effort to properly unwind (and that unwinding will also have a branching effect on other things). Also, coincidentally, this is the point where an AI agent is the most useless, because they really don't expect all the bizarre quirks in the codebase.

      Am I saying AI is useless? No, it's great for prototyping and getting to PMF, and great in the hands of someone who can read its output with a critical eye, but I wouldn't combine it with inexperienced users who haven't had the opportunity to learn from all the many mistakes I've made over the years.

    • zahlman 14 hours ago
      Not to disagree with Kent Beck's insights on juniors using AI, but the effect of AI on his own writing is palpably negative. His older content is much more enjoyable to read. And so is his recent non-post "activity" on Substack. For example, compare a "note" preceding this article (https://substack.com/@kentbeck/note/c-188541464), on the same topic, to the actual content.
    • SkyPuncher 19 hours ago
      *Some juniors have gotten better.

      I hate to be so negative, but one of the biggest problems junior engineers face is that they don't know how to make sense of or prioritize the glut of new-to-them information to make decisions. It's not helpful to have an AI reduce the search space, because they still can't narrow down the last step effectively (or possibly independently).

      There are junior engineers who seem to inherently have this skill. They might still be poor in finding all necessary information, but when they do, they can make the final, critical decision. Now, with AI, they've largely eliminated the search problem so they can focus more on the decision making.

      The problem is it's extremely hard to identify who is what type. It's also something that senior level devs have generally figured out.

    • GeoAtreides 19 hours ago
      >but because the genie, used well, accelerates learning.

      This is "the kids will use the AI to learn and understand" level of cope

      no, the kids will copy and paste the solution then go back to their preferred dopamine dispenser

      • CuriouslyC 19 hours ago
        I've learned a lot of shit while getting AI to give me the answers, because I wanted to understand why it did what it did. It saves me a lot of time trying to fix things that would have never worked, so I can just spend time analyzing success.

        There might be value in learning from failure, but my guess is that there's more value in learning from success, and if the LLM doesn't need me to succeed my time is better spent pushing into territory where it fails so I can add real value.

        • veunes 4 hours ago
          Understanding "why it works" is one thing, understanding "why it should work this way and not another, and what the alternatives are" is entirely different. AI shows you just one of countless correct implementations. You might understand that single implementation, but not the entire theory behind it
        • jplusequalt 18 hours ago
          >I've learned a lot of shit while getting AI to give me the answers

          I would argue you're learning less than you might believe. Similarly to how people don't learn math by watching others solve problems, you're not going to learn to become a better engineer/problem solver by reading the output of ChatGPT.

          • CuriouslyC 15 hours ago
            If I know what I want to do and how I want to do it, and there's plumbing to make that a reality, am I not still solving problems? I'm just paying less attention to stuff that machines can successfully automate.

            Regarding leveling up as an engineer, at this point in my career it's called management.

        • skydhash 19 hours ago
          Do you honestly think that’s how people learn?

          This is an example of a book on Common Lisp

          https://gigamonkeys.com/book/practical-a-simple-database

          What you usually do is follow the book instructions and get some result, then go to do some exploration on your own. There’s no walk in the dark trying to figure your own path.

          Once you learn what works, and what does not, you'll have a solid foundation to tackle more complex subjects. That's the benefit of having a good book and/or a good teacher to guide you on the path to mastery. Using a slot machine is more tortuous than that.

          • CuriouslyC 19 hours ago
            I don't find it to be more tortuous than that. In fact, if I were to go back and learn Lisp again, I think I'd be a lot more motivated seeing how to build something interesting out of the gate rather than the toy programs I learned in my Racket course.

            Also, for a lot of things, that is how people learn because there aren't good textbooks available.

            • skydhash 18 hours ago
              Define interesting.

              I was helping a few people get started with an Android development bootcamp, and just being able to run the default example and get their bearings around the IDE was interesting to them. And I remember when I was first learning Python: just doing basic variable declaration and arithmetic was interesting. Same with learning C and being able to write tic-tac-toe.

              I think a lot of harm is being done by giving beginners expectations that would befit people with years of experience. Like telling someone who doesn't even know Linux exists, or has never encountered the word POSIX, that they can learn Docker in 2 months.

              Please do read the following article: https://www.norvig.com/21-days.html

      • switchbak 17 hours ago
        Some might (most might?), those aren't the ones we're interested in.

        Just as some might pull the answers from the back of the textbook, the interesting ones are the kids who want to find out why certain solutions are the way they are.

        Then again I could be wrong, I try hard to stay away from the shithose that is the modern social media tech landscape (TikTok, Insta, and friends) so I'm probably WAY out of touch (and I prefer it that way).

      • simonw 18 hours ago
        Right, and they won't get hired beyond their internship.
    • ares623 18 hours ago
      The cynic in me sees it as using juniors as a vehicle for driving up AI metrics. The seniors will be less critical reviewing AI output with a human shield/messenger.
    • lanfeust6 20 hours ago
      Search is easily the best feature of AI/LLMs.
      • alpha_squared 20 hours ago
        I kind of agree here. The mental model that works for me is "search results passed through a rock tumbler". Search results without attribution and mixed-and-matched across reputable and non-reputable sources, with a bias toward whatever source type is more common.
      • sublinear 20 hours ago
        That's arguably all it ever was. Generating content using AI is just finding a point in latent space.
      • gosub100 19 hours ago
        Which was trained on a pre-AI internet. What's going to happen in coming years when new tech comes out but perhaps isn't documented the same way anymore? It's not an unsolvable problem, but we could see unintended consequences like, say where you must pay the AI provider to ingest your data. Similar to buying poll space or AdSense or whatever they call it for search engines
        • simonw 18 hours ago
          If you release a new piece of technology from 2025 onwards and don't invest a decent amount of effort into producing LLM-friendly documentation (with good examples) that a user can slurp into their coding agent you're doing your new technology a disservice.
          • nothrabannosir 7 hours ago
            I thought this was always true? What’s new about documentation being important?
            • simonw 7 hours ago
              If your technology has competition that's already in the training data, the only way to make it equally accessible to LLM users is to ensure there is concise, available documentation that can be fed directly into those LLMs.

              That's why "copy page" buttons are increasingly showing on manual pages eg. https://platform.claude.com/docs/en/get-started

            • gosub100 2 hours ago
              If LLMs get more popular, fewer people will actually "browse the web" which could reduce the need for it to be published. At the least, fewer people will ask stack overflow questions for the LLM to learn from. So there could be an island of knowledge where LLMs excel at topics that had mass volume published before AI, and be much less useful for new tech developed after.
    • gnerd00 20 hours ago
      first response from me "let me mention how the real business world actually works" .. let's add a more nuanced slice to that however

      Since desktop computers became popular, there have been thousands of small to mid-size companies that could benefit from software systems.. A thousand thousand "consultants" marched off to their nearest accountant, retailer, small manufacturer or attorney office, to show off the new desktop software and claim ability to make new, custom solutions.

      We know now, this did not work out for a lot of small to mid-size business and/or consultants. Few could build a custom database application that is "good enough" .. not for lack of trying.. but pace of platforms, competitive features, stupid attention getting features.. all of that, outpaced small consultants .. the result is giant consolidation of basic Office software, not thousands of small systems custom built for small companies.

      What now, in 2025? "junior" devs do what? design and build? no. Cookie-cutter procedures at AWS lock-in services far, far outpace small and interesting designs of software.. Automation of AWS actions is going to be very much in demand.. is that a "junior dev" ? or what?

      This is a niche insight and not claiming to be the whole story.. but.. ps- insert your own story with "phones" instead of desktop software for another angle

      • mlloyd 20 hours ago
        One thing I'd point out is that there are only so many ways to write a document or build a spreadsheet. There are a ton of business processes that are custom enough to that org that they have to decide to go custom, change their process, or deal with the inefficiency of not having a technical solution that accomplishes the goal easily.

        Lotus Notes is an example of that custom software niche that took off and spawned a successful consulting ecosystem around it too.

        • dylan604 20 hours ago
          > Lotus Notes is an example

          TIL Notes is still a thing. I had thought it was dead and gone some time ago.

      • nateglims 20 hours ago
        I'm a little confused by this analysis. Are you saying that all enterprise software has been replaced with MS word and AWS?
        • gnerd00 20 hours ago
          certainly not -- not "all software" of anything. Where is the word "enterprise" in the post you replied to? "Enterprise" means the very largest companies and institutions..

          I did not write "all software" or "enterprise software" but you are surprised I said that... hmmm

    • imiric 18 hours ago
      I'm not swayed by appeals to authority, but this is a supremely bad take.

      "AI" tools are most useful in the hands of experienced developers, not juniors. It's seniors who have the knowledge and capability to review the generated output, and decide whether the code will cause more issues when it's merged, or if it's usable if they tweak and adapt it in certain ways.

      A junior developer has no such skills. Their only approach will be to run the code, test whether it fulfills the requirements, and, if they're thorough, try to understand and test it to the best of their abilities. Chances are that because they're pressured to deliver as quickly as possible to impress their colleagues and managers, they'll just accept whatever working solution the tool produces the first time.

      This makes "AI" in the hands of junior developers risky and counterproductive. Companies that allow this type of development will quickly grind to a halt under the weight of technical debt, and a minefield of bugs they won't know how to maneuver around.

      The unfortunate reality is that with "AI" there is no pathway for junior developers to become senior. Most people will gravitate towards using these tools as a crutch for quickly generating software, and not as a learning tool to improve their own skills. This should concern everyone vested in the future of this industry.

      • versteegen 7 hours ago
        > A junior developer has no such skills. Their only approach will be to run the code, test whether it fulfills the requirements, and, if they're thorough, try to understand and test it to the best of their abilities.

        This is also a supremely bad take... well, really it's mainly the way you worded it that's bad. Juniors have skills, natural aptitudes, as much intelligence on average as other programmers, and often even some experience; what they lack is work history. They sure as hell are capable of understanding code rather than just running it. Yes, of course experience is immensely useful, most especially for understanding how to achieve a maintainable and reliable codebase in the long term, which is obviously of special importance, but long experience is not a hard requirement. You can reason about trade-offs, learn from advice, learn quickly, etc.

        • imiric 6 hours ago
          You're right, that was harshly worded. I meant to contrast it with the capability of making a quality assessment of the generated output, and understanding how and what to change, if necessary. This is something that only experts in any field are capable of. I didn't mean to imply that people lacking experience are incapable of attaining these skills, let alone that they're less intelligent. It's just that the field is positioned against them in a way that they might never reach this level. Some will, but it will be much harder for most. This wouldn't be an issue if these new tools were infallible, but we're far from that stage.
    • amrocha 18 hours ago
      I disagree. In my experience AI does most of the work and the junior's already-poor skills atrophy. Then a senior engineer has to review AI slop and tell the junior to roll the AI dice again.
      • snarf21 17 hours ago
        Agreed, this is like AI doing your homework. A select few will use it to learn but most will copy/pasta, let it create their PR and slack the rest of the day. But at least they are "trying" so they won't get fired. And it requires strong senior engineers to walk them through the changes they are trying to check in and see why they chose them.
      • Yodel0914 16 hours ago
        I’m so sick of getting “but copilot said…” responses on PR comments.
      • thinkingtoilet 16 hours ago
        I've seen it go both ways. As usual, a good manager should be able to navigate this.
        • amrocha 5 hours ago
          Ok, but not all managers are good and not all situations are navigable.
    • ivape 20 hours ago
      Don’t confuse this with this person’s ability to hide their instincts. He is redefining “senior” roles as junior, but words are meaningless in a world of numbers. The $$$ translation is that something that was worth $2 should now be worth $1.

      Because that makes the most business sense.

    • irishcoffee 19 hours ago
      The amount of copium in the replies to this is just amazing. It’s amazing.
    • bgwalter 20 hours ago
      How would a person who describes himself as a "full time content producer" know what is actually going on in the industry?

      https://substack.com/@kentbeck

      What software projects is he actively working on?

      • helsinkiandrew 20 hours ago
        • dlisboa 20 hours ago
          To be fair, even if I appreciate Beck, some people do get too famous and start to inhabit a space that is far removed from the average company. Many of these guys tend to give out advice that is applicable to a handful of top earning companies but not the rest.
        • umanwizard 20 hours ago
          Doesn't this back up the point? From his wiki it seems like he is mostly famous as a programming influencer, not as a programmer.
        • bgwalter 20 hours ago
          So? His bio is literally one latest fad after the other. Now he joins the "AI" fad, what a surprise.
          • repler 20 hours ago
            That is a very cynical take which completely ignores his contributions through the decades.

            In many cases he helped build the bandwagons you're implying he simply jumped onto.

            • exasperaited 20 hours ago
              > In many cases he helped build the bandwagons you're implying he simply jumped onto.

              The fact that I cannot tell if you mean this satirically or not (though I want to believe you do!) is alarming to me.

      • psunavy03 20 hours ago
        The dude literally invented Extreme Programming and was the first signer of the Agile Manifesto. He's forgotten more about software development than most people on this site ever knew.
        • bigfishrunning 16 hours ago
          Seems to me that his core competency is in managing a software team, not developing software.
        • imiric 18 hours ago
          Someone's accomplishments don't make them incapable of having bad opinions and being wrong. Cults of personality are harmful to progress. Opinions should hold the same weight and be held to the same scrutiny regardless of who voiced them.
          • psunavy03 18 hours ago
            That wasn't the question being asked. The question being asked was literally "what are this guy's accomplishments," and Kent Beck is a tech industry OG with a laundry list of them.

            Of course he can be wrong; he's human. That wasn't my point.

            • bgwalter 17 hours ago
              No, that wasn't the question.
              • switchbak 17 hours ago
                When you're so out of touch as to not know who Kent Beck is, these questions hardly matter.

                The thrust of the issue is this: when used suitably, AI tools can increase the rate of learning such that it changes the economics of investing in junior developers - in a good way, contrary to how these tools have been discussed in the mainstream. That is an interesting take, and worthy of discussion.

                Your appeal to authority is out of place here and clearly uninformed, thus the downvotes.

                • bgwalter 17 hours ago
                  I know who Kent Beck is and I'm not impressed by Agile and Extreme Programming.

                  What I did not know and what the Wikipedia page revealed is that he worked for a YCombinator company. Thus the downvotes.

  • neilv 15 hours ago
    > “Number one, my experience is that many of the most junior folks are actually the most experienced with the AI tools. So they're actually most able to get the most out of them.”

    Would that experience be from cheating on their homework? Are you sure that's the skill you want to prioritize?

    > “Number two, they're usually the least expensive because they're right out of college, and they generally make less. So if you're thinking about cost optimization, they're not the only people you would want to optimize around.”

    Hahaha. Sounds like a threat. Additional context: Amazon has a history of stack ranking and per-manager culling quotas, and not as much of a reputation for caring about employees as Google once had.

    > “Three, at some point, that whole thing explodes on itself. If you have no talent pipeline that you're building and no junior people that you're mentoring and bringing up through the company, we often find that that's where we get some of the best ideas.”

    I thought the tech industry had given up on training and investing in juniors for the long term, since (the thinking goes) most of them will job-hop in 18 months no matter how well you nurture them. Instead, most companies hire for the near-term productivity they can get, very transactionally.

    Does AWS have good long-term retention of software engineers?

    • rossdavidh 15 hours ago
      A big, and little-discussed, problem across many industries is that there is no "pipeline" inside any company. Since the 1980s, the idea that you develop your own talent has fallen by the wayside; you hire it from other companies. The issue may be bigger inside software, but it exists in many other industries as well.
      • neilv 15 hours ago
        Does AWS intend to have that pipeline within the company, starting with juniors, like this talk implies?
        • simgt 5 hours ago
          They don't; that's why their CEO needs everyone else to believe they need to keep hiring juniors aggressively, so AWS can poach them a couple of years down the line if needed
    • ygouzerh 2 hours ago
      I feel the majority of junior job-hopping happens because they are often hired at a really low salary and then offered just an incremental raise after two years. If they change companies instead, they get a big jump.

      At least, that's what I saw happening here in Hong Kong for juniors I worked with, not sure for other areas.

    • byzantinegene 10 hours ago
      if it isn't obvious already, his plan is to get other companies to train juniors so AWS can poach them when they become seniors
      • zelphirkalt 4 hours ago
        And only seniors! No measly mid-levels here please!
  • orliesaurus 20 hours ago
    Interesting take... I'm seeing a pattern... People think AI can do it all... BUT I see juniors often are the ones who actually understand AI tools better than seniors... That's what AWS CEO points out... He said juniors are usually the most experienced with AI tools, so cutting them makes no sense... He also mentioned they are usually the least expensive, so there's little cost saving... AND he warned that without a talent pipeline you break the future of your org... As someone who mentors juniors, I've seen them use AI to accelerate their learning... They ask the right questions, iterate quickly, and share what they find with the rest of us... Seniors rely on old workflows and sometimes struggle to adopt new tools... HOWEVER the AI isn't writing your culture or understanding your product context... You still need people who grow into that... So I'm not worried about AI replacing juniors... I'm more worried about companies killing their own future talent pipeline... Let the genies help, but don't throw away your apprentices.
    • orliesaurus 20 hours ago
      ON TOP OF IT ALL, juniors are the ones who bring novel tools to the desk MOST times...i.e. I had no clue the Google IDE gave you free unlimited credits despite the terrible UI...but a young engineer told me about it!!
      • bluGill 20 hours ago
        I've seen seniors and juniors bring novel tools in. Seniors do it less often perhaps - but only because we have seen this before under a different name and realize it isn't novel. (Sometimes the time has finally come; sometimes it fails again for the same reason it failed last time.)
        • JKCalhoun 19 hours ago
          I've seen seniors bring up novel ideas to juniors—well, novel to the juniors anyway.

          Just an example. I've been in so many code bases over the years… I had a newer engineer come aboard who, when he saw some code I recently wrote with labels (!) he kind of blanched. He thought "goto == BAD". (We're talking "C" here.)

          But this was code that dealt with Apple's CoreFoundation. More or less every call to CF can fail (which means returning NULL in the CF API). And (relevant here) passing NULL to a CF call, like when trying to append a CF object to a CF array, is a hard crash. CF does no param checking. (Why, that would slow it down—you, dear reader, are to do all the sanity checking.)

          So you might have code similar to:

          CFMutableDictionaryRef dict = CFDictionaryCreateMutable(
              kCFAllocatorDefault, 0,
              &kCFTypeDictionaryKeyCallBacks,
              &kCFTypeDictionaryValueCallBacks);

          if (!dict)
              goto bail;
          You would likely continue to create arrays, etc—insert them into your dictionary, maybe return the dictionary at the end. And again, you checked for NULL at every call to CF, goto bail if needed.

          Down past 'bail' you could CFRelease() all the non-null instances that you do not return. This was how we collected our own garbage. :-)

          In any event, goto labels made the code cleaner: your NULL-checking if-statements did not have to nest crazy deep.

          The new engineer admitted surprise that there might be a place for labels. (Or, you know, CF could have been more NULL-tolerant and simply exited gracefully.)

      • al_borland 13 hours ago
        I work at a place with lots of rules around what can and can’t be used. When someone new starts, we end up spending a lot of time policing to make sure they aren’t using stuff they shouldn’t be.

        A very basic example were the interns who constantly tried to use Google Docs for everything, their personal accounts no less. I had to stop them and point them back to MS Office at least a dozen times.

        In other situations, people will try and use free tools that don’t scale well, because that’s what they used in college or as a hobby. It can take a lot of work to point them to the enterprise solution that is already approved and integrated with everything. A basic example of this would be someone using Ansible from their laptop when we have Ansible Automation Platform, which is better optimized for running jobs around the globe and automatically logs to Splunk to create an audit trail.

      • salawat 20 hours ago
        I'm just shocked people aren't clueing into the fact that tech companies are trying to build developer dependence on these things to secure a "rent" revenue stream. But hey, what do I know. It's just cloud hyper scaling all over again. Don't buy and drive your own hardware. Rent ours! Look, we built the metering and everything!
        • SketchySeaBeast 19 hours ago
          I'd hope people are. It's painfully obvious this entire AI push is rent-seeking half hidden by big tech money. At some point the free money is going to go away, but the rent for every service will remain.
        • positr0n 18 hours ago
          I'm clued in to that but at this point who cares. All the models are fungible for the coding assistance use case.
        • orliesaurus 18 hours ago
          you will be able to run code assistants on your own machine soon, just like how you can run intense graphical simulations (videogames?) thanks to GPUs
          • agoodusername63 17 hours ago
            idk about that with the rising cost of hardware. But I guess if your IT dept is doing the purchasing, that's not really your problem.
    • codegeek 20 hours ago
      "BUT I see juniors often are the ones who actually understand AI tools better than seniors"

      Sorry, what does that mean exactly ? Are you claiming that a junior dev knows how to ask the right prompts better than a Senior dev ?

      • __s 20 hours ago
        Their implication is that junior devs have more likely built up their workflow around the use of AI tooling, likely because, being younger, they've had more plasticity in their process to adopt it

        Overall I don't quite agree. Personally this applies to me, I've been using vim for the last decade so any AI tooling that wants me to run some electron app is a non starter. But many of my senior peers coming from VS Code have no such barriers

        • citrin_ru 19 hours ago
          Speaking of vim - adding and configuring the Copilot plugin for vim is easy (it runs a nodejs app in the background, but if you have a spare 500 MB of RAM it's invisible).
          • YetAnotherNick 19 hours ago
            Copilot in vim is not the same as cursor. e.g. There is no multiline tab autocomplete.
        • sailfast 18 hours ago
          All the major players offer a CLI, for what it’s worth.

          You won’t need Vim except to review changes and tweak some things if you feel like it.

      • francisofascii 18 hours ago
        Speaking as a senior dev, anecdotally juniors may indeed understand AI tools better, because they spend more hours a day coding and working with the tools, and they need the tools to understand the codebase or to be productive. Seniors have more hours stuck in meetings, developing specs/tickets for the juniors, code reviewing, etc. Seniors are likely to not bother with a prompt for simple changes in codebases they already understand.
      • perfmode 19 hours ago
        Some old dogs resist learning new tricks.
      • bongodongobob 19 hours ago
        If AI is just prompts to you, you fall into the "don't know how to use it" group.
    • whazor 20 hours ago
      Amazon has an internal platform for building software. The workflows are documented and have checks and balances. So the CEO wants more junior developers who are proficient with AI, and proportionally fewer senior developers. Also, product context comes from (product) managers and UX designers.

      For medium or small companies, these guardrails or documentation can be missing. In that case you need experienced people to help out.

    • WestCoader 20 hours ago
      Sorry but what the heck is up with all the ellipses in this comment?
      • Mountain_Skies 20 hours ago
        It's a sort of stream of consciousness. That style of writing goes in and out of style from time to time but some people use it consistently.
      • raincole 20 hours ago
        They have an emacs package that triples their . automatically!
      • butwhyth0oo 20 hours ago
        [dead]
      • debo_ 20 hours ago
        They're trying really hard to make sure you know they didn't write their post with an LLM? /s
        • red-iron-pine 20 hours ago
          honestly i think that'll be a thing in the future

          "bespoke, hand generated content straight to your best readers"

          • DiscourseFan 20 hours ago
            Nah, models can be fine-tuned and trained on anything. Common consumer products like ChatGPT and Gemini have particular styles, very polite and helpful, but there are models trained to be combative, models trained to write in the style of Shakespeare, all sorts of things. Someone could train a model to reply to posts in the style of HN comments and you’d probably never know.
    • JKCalhoun 19 hours ago
      > I'm seeing a pattern...

      Me too. Fire your senior devs. (Ha ha, not ha ha.)

      • Ancalagon 19 hours ago
        No no, fire them.

        Cannot wait for the 'Oh dear god everything is on fire, where is the senior dev?' return pay packages.

    • 9rx 15 hours ago
      > He said juniors are usually the most experienced with AI tools, so cutting them makes no sense.

      While anyone is free to define words as they so please, most people consider those with the most experience to be seniors. I am pretty sure that has been the message around this all along: Do not cut the seniors. The label you choose isn't significant. Whether you want to call them juniors or seniors, it has always been considered to make no sense to cut those with the most experience.

      • dragonwriter 15 hours ago
        No, he’s saying that juniors, while having less experience in development in general, have more experience with AI tools. (This may be true broadly; certainly less experienced devs generally, IME, seem more enthusiastic about adopting and relying heavily on AI tooling.)
        • 9rx 15 hours ago
          While, again, anyone can define words as they see fit, most people consider the "junior" and "senior" labels to apply to the activity being conducted, not something off to the side. As the job is to use AI tools, these most experienced people would be considered "seniors" by most. Nobody was ever suggesting that you should cut good help because they're juniors in knitting or dirt biking.
          • dragonwriter 15 hours ago
            No, the job is to develop software. Using AI tools is one piece of the job. Having less experience with the job overall and more experience with one piece is a thing that happens.
            • 9rx 13 hours ago
              The job is never to develop software. The job is always to solve problems for customers. Developing software is just a tool in the toolbox. As is, increasingly, using AI. As such, it is valuable to have those who are experienced in using AI on staff.

              Which is nothing new. It has always been understood that it is valuable to have experienced people on board. The "cut the juniors" talk has never been about letting those who offer value go. Trying to frame it as being about those who offer experiential value — just not in the places you've arbitrarily chosen — is absurd.

          • zahlman 13 hours ago
            > As the job is to use AI tools

            Aside from the absurdity of this claim, consider how many years of experience a "senior" is typically expected to have, and then consider how long even ChatGPT has been available to the public, never mind SOTA coding agents.

            • 9rx 21 minutes ago
              > consider how many years of experience a "senior" is typically expected to have

              That entirely depends on what the experience is towards. If it is something like farming where you only get to experience a different scenario once per year due to worldly constraints, then one would expect many years — decades, even — before considering someone "senior".

              But when the domain allows experiencing a new scenario every handful of milliseconds, you can shorten that tremendously. In that case, a couple of years is more than enough time to become a "senior" even with only a modicum of attention given to it. If you haven't "seen it all" after a couple of years in that kind of environment, you're never going to become "senior" as you are hardly engaging with it at all.

    • lvl155 19 hours ago
      7/10 senior devs (usually fellas 50+) will get mad at you for trying to use Claude Code. Me: “dude it writes better code than crap you write in your mush middle-age brain.” Also me: “I also have mush brain.”

      I think an LLM is a reflection of human intelligence. If we humans become dumber as a result of LLMs, LLMs will also become dumber. I’d like to think that in some dystopian world, LLMs trained on pre-2023 data will be sought after.

      • thunky 12 hours ago
        > 7/10 senior devs (usually fellas 50+) will get mad at you for trying to use Claude Code

        Ironic because the junior has much more to lose. The 50+ can probably coast across the finish line.

    • ch2026 20 hours ago
      is this just a janky summary cause you added zero new viewpoints
    • yieldcrv 20 hours ago
      you're right but my opinion about this has changed

      I would have agreed with you 100% one year ago. Basically senior engineers are too complacent to look at AI tools, as well as ego-driven about it, all while corporate policy disincentivizes them from using anything at all, with maybe a forced Copilot subscription. Junior engineers, meanwhile, will take the risk that corporate monitoring of cloud AI tools isn't that robust.

      But now, although many of those organizations are still the same - with more contrived Copilot subscriptions - I think senior engineers are skirting corporate policy too and becoming more familiar with the tools.

      I'm also currently in an organization that is a total free for all with as many AI coding and usage tools as necessary to deliver faster. So I could be out of touch already.

      Perhaps more complacent firms are the same as they were a year ago.

    • kakacik 19 hours ago
      Maybe, but you make it sound like juniors are more valuable to companies than seniors. Then fire most/all seniors and good luck with the resulting situation.

      Coding in any sufficiently large organization is never the main part of a senior's time, unless it's some code sweatshop. Juniors can do little to none of all that remaining glue that makes projects go from a quick brainstorming meeting to a live, well-functioning and supported product.

      So as for worth - companies can, in non-ideal fashion obviously, work without juniors. I can't imagine them working without seniors, unless it's some sort of quick churn of CRUDs or eshops from templates.

      Also, there is this little topic that resonates recently across various research - knowledge gained fast via LLMs is shallow; it doesn't last as long or go as deep. One example out of many: any time I had to do some more sophisticated regex-based processing, I dived deep into specs, implementations etc. and a few times pushed regex to its limits (or realized the task is beyond what regex can do), instead of just being given the result, copy-pasting it and moving along because some basic test succeeded. Spread this approach across many other complex topics. That's also a view on the long-term future of companies.

      I get what you say and I agree partially, but it's a double-edged sword.

  • pnathan 20 hours ago
    I - senior - can patch an application in an unknown language and framework with the AI. I know enough to tell it to stop the wildly stupid ideas.

    But I don't learn. That's not what I'm trying to do- I'm trying to fix the bug. Hmm.

    I'm pretty sure AI is going to lead us to a deskilling crash.

    Food for thought.

    • omnimus 19 hours ago
      I think the temptation to use AI is so strong that it will be those who keep learning who will be valuable in the future. Maybe by asking AI to explain/teach instead of asking for the solution directly. Or by not using AI at all.
      • thunky 11 hours ago
        > I think the temptation to use AI is so strong that it will be those who will keep learning who will be valuable in future.

        AI is an excellent teacher for someone that wants to learn.

    • zahlman 13 hours ago
      > But I don't learn. That's not what I'm trying to do- I'm trying to fix the bug. Hmm. I'm pretty sure AI is going to lead us to a deskilling crash.

      Nothing is preventing you from studying how the bugfix works once it's in place.

      Nor is there any reason this use of AI should cause you to lose skills you already have.

      • golly_ned 13 hours ago
        I haven't seen things work like this in practice, where heavy AI users generate a solution, then later grasp it and learn from it with any kind of effectiveness or deep understanding.

        It's like reading the solution to a math proof instead of proving it yourself. Or writing a summary of a book compared to reading one. The effort towards seeing the design space and choosing a particular solution doesn't exist; you only see the result, not the other ways it could've been. You don't get a feedback loop to learn from either, since that'll be AI generated too.

        It's true there's nothing stopping someone from going back and trying to solve it themselves to get the same kind of learning, but learning the bugfix (or whatever change) by studying it once in place just isn't the same.

        And things don't work like that in practice, any more than promises like "we'll add tests later" end up being followed through with any regularity. If you fix a bug, the next thing for you to do is to fix another bug, build another feature, write another doc, etc., not dwell on work that was already 'done'.

        • hyperadvanced 11 hours ago
          Ironically, AI is really good at the adding tests later thing. It can really help round out test coverage for a piece of code and create some reusable stuff that can inspire you to test even more.

          I’m not a super heavy AI user but I’ve vibe-coded a few things for the frontend with it. It has helped me understand a little better how you lay out React apps and how the Lego pieces React gives you fit together. Probably far less than if I had done it from scratch and read a book, but sometimes a working prototype is so much more valuable to a product initiative than learning a programming language that you would be absolutely burning time and value not to vibe-code the prototype.

        • rudnevr 12 hours ago
          that's true, and nice comparison with tests.
      • Karliss 5 hours ago
        Often it's less about learning from the bugfix itself and more about the journey: learning how various pieces of software operate and fit together, and learning the tools you tried for investigating and debugging the problem.
    • deepspace 19 hours ago
      > I'm pretty sure AI is going to lead us to a deskilling crash.

      That's my thought too. It's going to be a triple whammy:

      1. Most developers (Junior and Senior) will be drawn in by the temptation of "let the AI do the work", leading to less experience in the workforce in the long term.

      2. Students will be tempted to use AI to do their homework, resulting in new grads who don't know anything. I have observed this happen first hand.

      3. AI-generated (slop) code will start to pollute GitHub and other sources used for future LLM training, resulting in a quality collapse.

      I'm hoping that we can avoid the collapse somehow, but I don't see a way to stop it.

    • JeremyNT 20 hours ago
      I think seniors know enough to tell whether they need to learn or not. At least that's what I tell myself!

      The thing with juniors is: those who are interested in how stuff works now have tools to help them learn in ways we never did.

      And then it's the same as before: some hires will care and improve, others won't. I'm sure that many juniors will be happy to just churn out slop, but the stars will be motivated on their own to build deeper understanding.

    • BeFlatXIII 18 hours ago
      On the other hand, if it's a one-off, you'll have forgotten what you learned by the time you'd need to use that skill again.
      • PaulStatezny 17 hours ago
        But without AI, there are neural connections formed while determining the correct one-off solution.

        The neural connections (or lack of them) have longer term comprehension-building implications.

        • pnathan 16 hours ago
          This is it. Without the AI, I'd know more about $that_technology than I do now.
    • pphysch 19 hours ago
      On the contrary, being able to access (largely/verifiably) correct solutions to tangible & relevant problems is an extremely great way to learn by example.

      It should probably be supplemented with some good old RTFM, but it does get us somewhat beyond the "blind leading the blind" StackOverflow paradigm of most software engineering.

  • frostiness 21 hours ago
    I can't help but feel this is backpedaling after the AI hype led to people entering university avoiding computer science or those already in changing their major. Ultimately we might end up with a shortage of developers again, which would be amusing.
    • mjr00 20 hours ago
      I went to university 2005-2008 and I was advised by many people at the time to not go into computer science. The reasoning was that outsourced software developers in low-cost regions like India and SEA would destroy salaries, and software developers should not expect to make more than $50k/year due to the competition.

      Even more recently we had this with radiologists, a profession that was supposed to be crushed by deep learning and neural networks. A quick Google search says an average radiologist in the US currently makes between $340,000 to $500,000 per year.

      This might be the professional/career version of "buy when there's blood in the streets."

      • avgDev 20 hours ago
        I went for CS in my late 20s, always tinkered with computers but didn't get into programming earlier. College advisor told me the same thing, and that he went for CS and it was worthless. This was 2012.

        I had a job lined up before graduating. Now make high salary for the area, work remotely 98% of the time and have flexible schedule. I'm so glad I didn't listen to that guy.

        • dylan604 20 hours ago
          The one thing I learned in college is that the advisors are worthless. There's how many students? And you are supposed to expect they know the best thing for you? My advisor told me that all incoming freshmen must take a specific math class, a pre-calculus course, totally ignoring all of my AP exams that showed I was well beyond that. Wasted my time and money.
        • realityfactchex 17 hours ago
          The single most costly mistake I ever made, in hindsight, was talking myself out of a CS trajectory and into something more "practical" circa 2003.
      • filoleg 18 hours ago
        You nailed it on the head, down to the exact examples.

        I was still in high school in 2010, and was told the same thing about outsourcing to India/SEA/etc. making a CS degree/career (in the US) a terrible choice. It wasn't just random people saying this either, I was reading about it in the news, online, had some family acquaintances with alleged former software dev career, etc. I didn't listen, and I am glad I didn't.

        As I was graduating from college, and deep learning was becoming a new hot thing, I heard the same thing about radiologists, and how they are all getting automated away in the next 5 years. I had no plans to go to med school, and I didn't know anyone at the time who went through it, so I didn't know much about the topic. On the surface, it seemed like a legitimate take, and I just stored it in my head as "sounds about right."

        Cut to now: I know more than a few people who went through med school, and am in general more attuned to the market. Turns out, all of that was just another round of genpop hype. Those news articles about "omg radiologists are all getting replaced by computers" stopped showing up in any of my news feeds, and not a single radiology-specialized med school graduate I know had any issues getting a job (one that paid significantly more than an entry-level position at a FAANG).

        I have zero idea what point I was trying to make with this comment, but your examples mirror my personal experience with the topic really well.

      • codegeek 20 hours ago
        My take is that these are not binary issues. With outsourcing, it is true that you can hire someone cheaper in Asian countries but it cannot kill all jobs locally. So what happens is that the absolute average/mediocre get replaced by outsourcing and now with AI while the top talent can still command a good salary because they are worth it.

        So I think that a lot of juniors WILL get replaced by AI not because they are junior necessarily but because a lot of them won't be able to add great value compared to a default AI and companies care about getting the best value from their workers. A junior who understands this and does more than the bare minimum will stand out while the rest will get replaced.

        • mjr00 18 hours ago
          > So I think that a lot of juniors WILL get replaced by AI not because they are junior necessarily but because a lot of them won't be able to add great value compared to a default AI and companies care about getting the best value from their workers. A junior who understands this and does more than the bare minimum will stand out while the rest will get replaced.

          Again this is what people said about outsourced developers. 2008 logic was, why would anyone hire a junior for $50k/year when you could hire a senior with 20 years experience for $10k/year from India?

          Reality: for 5+ years you could change careers by taking a 3-6 month JavaScript bootcamp and coming out the other end with a $150k job lined up. That's just how in demand software development was.

        • symlinkk 18 hours ago
          If you have to be “top talent” to survive it’s not a good field to get into anymore.
          • codegeek 18 hours ago
            To survive against Outsourcing/cheaper labor and AI, I would agree.
      • hrimfaxi 20 hours ago
        > Even more recently we had this with radiologists, a profession that was supposed to be crushed by deep learning and neural networks. A quick Google search says an average radiologist in the US currently makes between $340,000 to $500,000 per year.

        At the end of the day, radiologists are still doctors.

        • symlinkk 18 hours ago
          Yep, the only reason their pay is high is artificial barriers to entry.
      • sublinear 20 hours ago
        Yup hearing big talk about competition and doom is a strong signal that there is plenty of demand.

        You can either bet on the new unproven thing claiming to change things overnight, or just do the existing thing that's working right now. Even if the new thing succeeds, an overnight success is even more unrealistic. The insight you gain in the meantime is valuable for you to take advantage of what that change brings. You win either way.

        • bluGill 20 hours ago
          When there is no competition that is a sign there is no demand.

          There can sometimes be too much competition, but often there is only the illusion of too much if you don't look at quality. You can find a lot of cheap engineers in India, but if you want a good quality product you will have to pay a lot more.

    • ravenstine 20 hours ago
      Can anyone really blame the students? If I were in their shoes, I probably wouldn't bother studying CS right now. From their perspective, it doesn't really matter whether AI is bullshit in any capacity; it matters whether businesses who are buying the AI hype are going to hire you or not.

      Hell, I should probably be studying how to be a carpenter given the level at which companies are pushing vibe coding on their engineers.

      • bonzini 17 hours ago
        Three-four years is a lot of time for these companies to face the harsh reality.
    • simonw 20 hours ago
      "after the AI hype led to people entering university avoiding computer science or those already in changing their major"

      That's such a terrible trend.

      Reminds me of my peers back in ~2001 who opted not to take a computer science degree even though they loved programming because they thought all the software engineering jobs would be outsourced to countries like India and there wouldn't be any career opportunities for them. A very expensive mistake!

    • roncesvalles 20 hours ago
      Certainly, I even know of experienced devs switching out of tech entirely. I think the next couple of decades are going to be very good for software engineers. There will be an explosion of demand yet a contraction in supply. We are in 2010 again.
      • DiscourseFan 20 hours ago
        There will be programmers of the old ways, but AI is basically code 2.0, there are now a lot of things that are AI specific that those with traditional software development skills can’t do.
        • omnimus 19 hours ago
          Like what exactly?
          • platevoltage 6 hours ago
            It's really good at writing Brainfuck code.
    • andrewl-hn 15 hours ago
      Perhaps their own hiring pipeline is suffering, too. With most companies out there cutting internships and hiring of people with no experience "because AI will replace them" for the past 2-3 years, we probably have a large dip in the number of prospective candidates with 2-3 years of experience today.

      Historically, these candidates have been the hiring sweet spot: less risky than brand-new engineers, still inexperienced enough to efficiently mold into your bespoke tools and processes and turn into long-term employees, and still very cheap.

    • Nextgrid 20 hours ago
      It's backpedaling but I don't think it's planning ahead to prevent a developer shortage - rather it's pandering to the market's increasing skepticism around AI and that ultimately the promised moonshot of AI obsoleting all knowledge work didn't actually arrive (at least not in the near future).

      It's similar to all those people who were hyping up blockchain/crypto/NFTs/web3 as the future, and now that it has all blown over they've adapted to the next grift (currently it's AI). He is now toning down his messaging in preparation for a cooldown of the AI hype, to appear rational and relevant to whatever comes next.

      • seg_lol 20 hours ago
        "We were against this all along"
        • mattgreenrocks 19 hours ago
          The party line will be: “we always advised using it only as long as it helps productivity.”

          Pointing out that it wasn’t always that will make you seem “negative.”

          • seg_lol 19 hours ago
            You are right: the perfect amount of false humility and balance. The wage suppression is an accidental byproduct and not the intent. Collateral damage, if you will.
    • ay 14 hours ago
      Reading this article is especially amusing since this bit just hit the news as well:

      https://www.business-standard.com/amp/world-news/amazon-euro...

    • fullshark 20 hours ago
      Or maybe they realize the AI needs humans in the loop for the foreseeable future for enterprise use cases and juniors (and people from LCL areas) are cheaper and make the economics make some sort of sense.
    • burningChrome 20 hours ago
      Agreed.

      Considering the talk on HN lately that there are way too many junior devs, it would indeed be amusing.

    • raincole 20 hours ago
      > changing their major

      To what?

  • Sheeny96 2 hours ago
    I, as an experienced engineer, am not afraid of the power of AI to take my job. I'm afraid of midwits who think it can, that hold the purse strings.
  • ok123456 20 hours ago
    So he's saying we should be replacing the seniors with fresh grads who are good at using AI tools? Not a surprising take, given Amazon's turnover rate.
  • bespokedevelopr 17 hours ago
    Some of my friends who are senior/staff engs at various fang companies are basically convinced their jobs are at risk over the next few years due to how good the llms have gotten this year.

    I switched over to consulting/contracting so I don’t have the visibility like they do, but my work is heavily dependent on llms. However I don’t see it wiping out the industry but rather making people more efficient.

    They have much more robust tooling though around their llms and internal products that have automated much of their workflows which is I believe where the concern is coming from. They can see first hand how much of their job has turned into reviewing outputs and feeding outputs into other tools. A shift in skills but not fully automated solution yet.

    It’s hard to gauge where things are going and where we’ll be in 5 years. If we only get incremental improvements there’s still huge gains to be made in building out tooling ecosystems to make this all better.

    What does that look like for new college grads though? How much of this is really computer science if you are only an llm consumer?

    • jakub_g 15 hours ago
      Staff+ work is not that much (exclusively) coding anymore, but identifying the correct big things to work on and keeping focus on them, making Bob and Steve from different teams talk to each other instead of building the same stuff twice, making opinionated decisions on things, blocking harmful initiatives, finding the elephant in the room and saying out loud the things no one wants to say, etc.

      It's not really the work that LLMs currently do. I mean sure, maybe if you plug an LLM to read all emails and slacks and zoom transcripts of the entire company, it could do it at some point in the future. But would it have the same amount of influence compared to an industry & company veteran who has the company specific knowledge and experience that is nowhere written down?

    • deadbabe 15 hours ago
      Computer science isn’t even about code. Finding the most efficient way to pack boxes into a limited space is computer science; code is just the language. Could LLMs solve such real-world problems in all their forms? If it’s in their training data, maybe.
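      The box-packing problem mentioned above is classic bin packing, which is NP-hard in general. A minimal sketch (not any commenter's code) of the standard first-fit decreasing heuristic, which is guaranteed to use at most roughly 11/9 of the optimal number of bins:

      ```python
      def first_fit_decreasing(item_sizes, bin_capacity):
          """Pack items into as few bins as possible, heuristically."""
          bins = []  # each bin is a list of item sizes
          # Place largest items first; big items are the hardest to fit.
          for size in sorted(item_sizes, reverse=True):
              # Put the item into the first bin with enough remaining space.
              for b in bins:
                  if sum(b) + size <= bin_capacity:
                      b.append(size)
                      break
              else:
                  bins.append([size])  # no existing bin fits: open a new one
          return bins

      packed = first_fit_decreasing([7, 5, 4, 3, 2, 2, 1], bin_capacity=10)
      print(len(packed))  # -> 3 bins, e.g. [7, 3], [5, 4, 1], [2, 2]
      ```

      The point stands either way: knowing that the exact problem is intractable and when a heuristic like this is good enough is the computer science; the dozen lines of code are incidental.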
    • user3939382 16 hours ago
      If you’re a master of the syntax fog you’re done. If you understand computation from first principles to tcp frames and transformer architecture you’re golden.
  • epolanski 20 hours ago
    My experience is that juniors have an easier time ramping up, but never get better at proper engineering (analysis) or development processes (debugging). They also struggle to read and review code.

    I fear that unless you heavily invest in them and mentor them, they may be condemned to decades of junior experience.

    • veunes 4 hours ago
      If a quick start with AI is inevitable, then mentorship and review programs need to be re-evaluated. Seniors shouldn't just check for functionality; they should actively ask juniors to explain why the AI suggested a particular solution, what the alternatives are, and what risks it entails. The focus must shift to understanding, not just generation
    • tayo42 19 hours ago
      > but never get better at proper engineering (analysis) and development processes (debug). They also struggle to read and review code.

      You can describe pre-ai developers and like that too. It's probably my biggest complaint about some of my Co workers

      • epolanski 16 hours ago
        To some extent, you're right, but I'd still say that pre-AI you had to, at some point, write some notes, come up with a plan, and read more code.
    • PartiallyTyped 20 hours ago
      I have the same experience.

      In my view there's two parts to learning, creation and taste, and both need to be balanced to make progress. Creation is, in essence, the process of forming pathways that enable you to do things, developing taste is the process of pruning and refining pathways to doing things better.

      You can't become a chef without cooking, and you can't become a great one without cultivating a taste (pun intended) for what works and what it means for something to be good.

      From interactions with our interns and new grads, they lack the taste, and rely too much on the AI for generation. The consequence is that when you have conversations with them, they struggle to understand the concepts and tools they are using, because they lack the familiarity that comes with creation, and they lack the skills to refine the produced code into something good.

  • kachapopopow 8 hours ago
    I think the bigger problem is that if you eliminate jr devs you kind of just extend the amount of time required in college — effectively requiring students to role-play being at a real company doing real jobs until they gain the experience to go straight into a more senior job. Doctors have roughly the same problem: there is no such thing as entry level, because when you are responsible for human lives, anything other than a senior role is often not enough.
  • benjismith 16 hours ago
    I think the biggest injury to the hiring of junior devs happened after COVID made remote-work ubiquitous. It's a lot harder for a junior dev to get real mentorship, including the ambient kind of mentorship-by-osmosis, when everyone works alone in a sad dark room in their basement, rather than in an office with their peers and mentors.

    The advent of agentic coding is probably punch #2 in the one-two punch against juniors, but it's an extension of a pattern that's been unfolding for probably 5+ years now.

  • veunes 4 hours ago
    The main problem Garman overlooks is skill degradation. AI is excellent at helping a junior quickly draft boilerplate or find the right API, but it doesn't teach the essentials: debugging, systems thinking, and reading complex code. A junior who grows up on AI "crutches" risks never learning to solve the complex, ambiguous problems that distinguish a senior engineer.
  • siliconc0w 16 hours ago
    "AI will replace software developers"

    "If your seniors are resisting AI and saying it doesn't work, replace them with AI-native engineers!"

    "AI will replace all junior software developers"

    "AI will be a tool to help junior software developers"

    Eventually we will get to:

    "AI requires, and will likely continue to require, pretty heavy hand-holding, and is not a substitute for building and maintaining independent subject matter expertise"

  • alecco 20 hours ago
    Meanwhile:

    "Amazon announces $35 billion investment in India by 2030 to advance AI innovation, create jobs" https://www.aboutamazon.com/news/company-news/amazon-35-bill... (Dec 9 2025)

    • la64710 20 hours ago
      This is for data locality.
    • reop2whiskey 18 hours ago
      Oh, so the CEO is probably saying that using AI to replace foreign junior devs is a bad idea. Domestic junior dev replacement is OK, though.
  • xp84 15 hours ago
    This is written without any acknowledgement of how short-term thinking has poisoned the entire (world of work? Capitalist system? IDK).

    Yes, killing your talent pipeline is a horrible idea. But that's Future CEO's problem. When we need new seniors to backfill natural attrition, we can poach them from competitors.

    And juniors don't make that much less money, either. Sure, there are people who do light frontend work on Wordpress sites and stuff, who make a lot less. But at my place of work, when we had junior SWEs (we either developed them into seniors in the past 3 years or let them attrition), they were making about ¾ of what seniors make. So, you can pay 4 juniors or you can pay 2-3 seniors. Arguably 1 senior using AI will be a lot more sustainable than 4 juniors burning tokens all day trying to get Cursor to do things they don't really even understand and can't evaluate effectively.

    Anyway I completely agree that all of this, especially eliminating the bottom 2 steps of the career ladder for engineers, is horrible for our entire industry. But our incentive structure will richly reward companies for doing this. Stock price go up. Let Future CEO worry about it.

  • cowsandmilk 21 hours ago
    He said the same thing four months ago: https://news.ycombinator.com/item?id=44972151
  • kunley 17 hours ago
    Btw, it's funny that certain kind of people tend to believe in common sense only when it comes from the mouth of the AWS CEO or a similar persona, and one can be ridiculed when saying the same thing as an anynomous commenter.
    • mrcsharp 17 hours ago
      It's always been this way with any hype cycle. This one is just the latest iteration.
  • israrkhan 19 hours ago
    1. Replacing junior engineers with AI of course breaks the talent pipeline. Seniors will retire one day; who is going to replace them? Are we taking the bet that we won't need any engineers at that time? Sounds dangerous.

    2. Junior engineers' heavy reliance on AI tools is a problem in itself. AI tools learn from existing code written by senior engineers. Too much use of AI by junior engineers will result in a deterioration of engineering skills, and it will eventually result in AI learning from AI-generated code. This is true for most other content as well, as more and more content on the internet is AI generated.

  • jr-throw 20 hours ago
    I recently pair-worked with two junior developers (on their first job, but still with like 2+ years with the company) in order to transfer the know-how of something.

    I realized that they are shockingly bad at most basic things. Still, their PRs look really good (on the surface). I assume they use AI to write most of the code.

    What they do excel in is a) cultural fit for the company and b) providing long-term context to the AIs for what needs to be done. They are essentially human filters between product/customers and the AI. They QA the AI models' output (to some extent).

  • harshaw 17 hours ago
    Full disclosure: I am pretty sour on the current Amazon/AWS leadership, as I think, well, they couldn't lead a company out of a paper bag (former manager at AWS). Is there data that Amazon/AWS is still hiring junior devs? I've heard it's very hard to get into student programs these days, but I don't have the data. My grumpy position would be that Garman is saying one thing and doing another.
  • aposm 20 hours ago
    We frequently get juniors or interns who are perfectly capable of pumping out many LoC with the use of AI in various forms - the issue is that they _don't_ actually ever learn how to think for themselves, and can't fix problems when something goes wrong or the LLM paints itself into a corner. I have found myself doing a lot more shepherding and pairing with juniors when they can't figure something out recently, because they just have not had the space to build their own skills.
  • twostorytower 21 hours ago
    Well, yeah. Then who will become the senior engineers in 10-15 years?
    • jsheard 21 hours ago
      You think the people deciding whether to hire more juniors are planning more than one or two quarters ahead? 10-15 years is someone else's problem.
      • steveBK123 20 hours ago
        Yes, unfortunately 10-15 years is 5 sequential someones else's problems.
    • azemetre 21 hours ago
      These people are working on destroying the planet to make more money, they absolutely do not care. Our society isn't set up to punish them, but encourage such behavior to even more extremes (see datacenter build outs causing water shortages, electricity hikes, and cancer in poor communities; nearly every politician capitulating on such actions because they don't know better).
      • riskable 20 hours ago
        I wish people would get off the "AI is the worst thing for the environment" bandwagon. AI and data centers as a whole aren't even in the top 100 emitters of pollution and never will be.

        If you want to complain about tech companies ruining the environment, look towards policies that force people to come into the office. Pointless commutes are far, far worse for the environment than all data centers combined.

          Complaining about the environmental impact of AI is like plastic manufacturers putting recycling labels on plastic that is inherently not recyclable, making it seem like plastic pollution is everyday people's fault for not recycling enough.

        AI's impact on the environment is so tiny it's comparable to a rounding error when held up against the output of say, global shipping or air travel.

        Why don't people get this upset at airport expansions? They're vastly worse.

        • cakealert 6 hours ago
          The answer to that is simple: They hate AI and the environment angle is just an excuse, much like their concern over AI art. Human psychology is such that many of these people actually believe the excuse too.

          It helps when you put yourself in the shoes of people like that and ask yourself, if I find out tomorrow that the evidence that AI is actually good for the environment is stronger, will I believe it? Will it even matter for my opposition to AI? The answer is no.

          • jraph 1 hour ago
            > The answer is no.

            You don't know that. I don't know about you (and whatever you wrote possibly tells more about yourself than anyone else), but I prefer my positions strong and based on reality, not based on lies (to myself included).

            And the environment is far from being the only concern.

            You are attacking a straw man. For you, being against GenAI, simply because it happens to be against your beliefs, is necessarily irrational. Please don't do this.

        • azemetre 17 hours ago
          People are allowed to reject whatever they want. I'm sorry that democracy is failing your quest to make slightly more money while the rest of society suffers.

          I'm glad people are grabbing the reins of power back from some of the most evil people on the planet.

        • jnd-cz 20 hours ago
          Of course they aren't polluters in the sense of generating some kind of smoke themselves. But they do consume megawatts upon megawatts of power that has to be generated somewhere. It's not often that you have the luxury of building near a nuclear power plant. And in the end you're still releasing those megawatts as heat into the atmosphere.
        • jraph 20 hours ago
          > Why don't people get this upset at airport expansions?

          We do too, don't worry.

    • ghc 21 hours ago
      Pretty sure Anthropic is hoping the answer is Claude.
      • marcosdumay 21 hours ago
        Pretty sure Antropic knows their hopes won't come true. They just won't tell you that.
    • phyzix5761 21 hours ago
      10-15 years? Sadly, most people are declared senior after like 2 years of work.
    • tastyfreeze 21 hours ago
      Obviously after 10-15 years of experience working as a developer AI will be a senior dev. Probably will get promoted to management with all that experience.
      • simonw 21 hours ago
        Promoting your best engineers to management sometimes gets you a great manager, but often gets you a mediocre or just-about-competent manager at the cost of a great engineer.

        I'm a big fan of the "staff engineer" track as a way to avoid this problem. Your 10-15 year engineers who don't vibe with management should be able to continue earning managerial salaries and having the biggest impact possible.

        https://staffeng.com/about/

        I'm also a fan of leadership without management. Those experienced engineers should absolutely be taking on leadership responsibilities - helping guide the organization, helping coach others, helping build better processes. But they shouldn't be stuck in management tasks like running 1-1s and looking after direct reports and spending a month every year on the annual review process.

        • ryandrake 20 hours ago
          This is a general problem that corporations have trouble with: the struggle to separate leadership and people management. Why does the person who tells you what to do also need to be the same person who does your annual review, who also has to be the same person who leads the technical design of the project, approves your vacation, assists with your career development, and gives feedback or disciplinary correction when you mess up? Why do we always seem to bundle all these distinct roles together under "Manager"?
        • RevEng 20 hours ago
          This is exactly where I find myself. I've been asked several times to take on management, but I have no interest in it. I got to be a principal after 18 years of experience by being good at engineering, not management. Like you said, I can and do help with leadership through mentorship, offering guidance and advice, giving presentations on technical topics, and leading technical projects.
        • tastyfreeze 20 hours ago
          Absolutely agree. Regardless, my org keeps trying to get me to take a management role after 15 years dev experience. I love my job and don't like managing people. You couldn't pay me enough to become a manager.
        • grogenaut 20 hours ago
          I still spend a week on annual reviews but you make great points all around.
    • otikik 20 hours ago
      Anything besides next quarter does not exist.
    • nextworddev 20 hours ago
      AI will be the senior engineer
    • DiscourseFan 20 hours ago
      You can’t pretend you know where technology will take us.
    • philipwhiuk 21 hours ago
      Stolen from your competitors, obviously.
  • rippeltippel 18 hours ago
    > In fact, 30% of companies that laid off workers expecting savings ended up increasing expenses, and many had to rehire later.

    Such as (cough...) Amazon?

    • turtletontine 17 hours ago
      Source..? What exactly are you referring to?
  • stargrazer 12 hours ago
    Will these intro AI systems then mature to be senior devs who can then mentor more junior AIs? Then we won't need any devs? Isn't that the end goal, AI runs the company and we can all go fishing?
  • retinaros 22 minutes ago
    He is saying that while doing exactly that. If you look at who gets laid off and who gets hired, guess what? L7-L6 are fired, L4-L5 hired. They most likely think replacing a senior with a junior plus AI is worth it financially.
  • geodel 20 hours ago
    I have heard this quite a few times over the last few months, each time from Amazon or AWS CEOs. Maybe this time he wants to replace senior engineers. That would be more useful for them, as with each passing year they have more and more of them, and in times like these they are not looking to leave Amazon on their own.
  • JanickSpielmann 4 hours ago
    If you don't invest in juniors, it will only be like 20 years until you're out of seniors.
  • klipklop 19 hours ago
    I believe the idea is to not stop hiring juniors. Instead it's to replace anybody that commands a high salary with a team of cheaper juniors armed with LLM's. The idea is more about dragging down average pay than never hiring anybody. At least for now.
    • stockresearcher 19 hours ago
      And then all those unemployed seniors with extensive domain knowledge use AI to speedrun the creation of competition and you need to spend $$$$ to buy them out and shut them down. Solid idea.
      • klipklop 17 hours ago
        They have lawyers, patents, non-competes, collusion and regulatory capture to prevent this from happening.

        We can also assume once these coding models get good enough they will not be shared with the general public or competitors.

  • jaredcwhite 19 hours ago
    The level of cynicism here is astronomical. After discovering the strategy of "fire juniors and let a few seniors manage autonomous agents" was an abject failure, now the line is "actually juniors are great because we've brainwashed them into thinking AI is cool and we don't have to pay them so much". Which makes me want to vomit.

    The only relevant point here is keeping a talent pipeline going, because well duh. That it even needs to be said like it's some sort of clever revelation is just another indication of the level of stupid our industry is grappling with.

    The. Bubble. Cannot. Burst. Soon. Enough!!

  • focusgroup0 21 hours ago
    Which is a less dumb idea: replacing new grad junior devs with AI or H1Bs?
    • ThrowawayTestr 21 hours ago
      What about new grad H1Bs?
      • toast0 20 hours ago
        Hire new grads on OPT and transition to H-1B.
    • platevoltage 18 hours ago
      Better idea, replace everyone with AI, and when that doesn't work, replace the AI with H1Bs.
    • nxor 21 hours ago
      [dead]
  • welliebobs 14 hours ago
    Do we really think that we're going to see all developers morph into one archetype where we all have exactly the same effective skills? Many engineers already have an area of interest where they focus, be that performance optimisation or high level architecture.

    It's my prediction that more specialised skill sets will become commonplace. We'll have developers who can effectively use AI to bootstrap PoCs, developers who use AI in well-established code bases to increase velocity (think asking Cursor to implement another set of REST endpoints for a new type), and developers who might choose to exclude AI from their workflows.

    Eventually (I hope, at least) it'll be expected that it's another tool that developers can use in their day to day and less of the Omnissiah that has come to replace us as developers.

  • HeavyStorm 17 hours ago
    Thank God someone still has a functioning brain.

    You should replace devs vertically, not horizontally; otherwise, who'll be your senior dev tomorrow?

    Jokes aside, AI has the potential to reduce workforce across the board, but companies should strive to retain all levels staffed with humans. Also, an LLM can't fully replace even a junior, not yet at least.

  • twelvechess 21 hours ago
    Most of the apps that I use regularly fail at least once a day nowadays. I think this is a direct consequence of putting AI code into production without review/QA.
    • RevEng 20 hours ago
      While I have no particular love for AI generated code, I think this has nothing to do with AI. Software has been unreliable for over a decade. Companies have been rushing out half baked products and performing continual patches for many years. And it's our fault because we have just come to accept it.
      • rented_mule 20 hours ago
        > Software has been unreliable for over a decade

        The "over" deserves a lot of emphasis. To this day, I save my code at least once per line that I type because of the daily (sometimes hourly) full machine crashes I experienced in the 80s and 90s.

        • daedrdev 20 hours ago
          Same, I think I should just turn on autosave at this point to save my fingers
          • rightbyte 1 hour ago
            I have this fear autosave might corrupt the file by trying to save while the program has hung or whatever.

            I don't remember which app made me think that. Maybe some old version of Matlab cleared unsaved files when it hung with autosave enabled.

    • XenophileJKO 20 hours ago
      My prediction is that this will actually get better, because with AI the time cost to find and fix issues is so much lower.
      • RussianCow 20 hours ago
        The problem is human, not technical. Companies and managers need to start caring about the details instead of crossing items off a list. Until we see that culture shift in the industry, which might never happen, AI isn't going to help—if anything, it'll make the problem worse as devs rush to deliver on arbitrary deadlines.
        • XenophileJKO 18 hours ago
          Well the reason I think it might be different is that I am noticing a material change in my behavior.

          I have always cared a lot about quality and craftsmanship. Now when I am working and notice something wrong, I just fix it. I can code it entirely with AI in the time it would've taken me to put it on an eternal backlog somewhere.

        • joshribakoff 20 hours ago
          Plus, if you are skipping tests, or telling yourself you wrote them when they don't actually verify anything in the first place, then buying into a hype cycle of "the AI writes perfect code" is unlikely to break the pattern.
      • otabdeveloper4 18 hours ago
        AI doesn't ever fix anything, it just breaks stuff and adds technical debt.
  • stillworks 6 hours ago
    And yet, they won't like it if aspiring SDEs use AI to "assist" when interviewing :D

    Just yesterday had a coding interview (not any FAANG) and the interviewer wanted a screen share and also checked my IDE settings to make sure "AI" was turned off.

    Not that I intended to or even intend to use LLM based tooling for interviews.

    Although having said that, if I intended to, interviewers won't find out. Interviews should always be done in person. (That took a different tangent... sorry)

  • KnuthIsGod 13 hours ago
    A team at a bank I know went from 13 members to 2. The remaining two are likely to be outsourced. They are trying to transition to the business side.

    Folks in Hyderabad can run LLMs too and data centre and infrastructure costs are lower in India.

  • nevir 18 hours ago
    Juniors are also more likely to be the MOST proficient/comfortable with AI tooling.

    Pair them with a senior so they can learn engineering best practices.

    And now you've also just given your senior engineers some extra experience/insights into how to more effectively leverage AI.

    It accelerates the org to have juniors (really: a good mix of all experience levels).

    • goosejuice 12 hours ago
      > Juniors are also more likely to be the MOST proficient/comfortable with AI tooling.

      Why? That seems unlikely to me. That's like saying juniors are likely the most comfortable with jj, zed, or vscode.

  • par 19 hours ago
    I've been managing and supporting teams for a long time and I'm sorry, but junior and mid-level devs do the majority of the heavy lifting when it comes to work output in big corps. I don't think AI will replace them. I don't think all these IC5 and IC6 engineers are going to be putting up 400-500 diffs a year anytime soon.
  • itissid 19 hours ago
    I gave Opus an "incorrect" research task (using this slash command [1]) in my REST server: researching whether SQLite + the Litestream VFS could be used to create read replicas for the REST service itself. This is obviously a dangerous use of the VFS [2], and of a system like SQLite in general (stale reads and isolation-wise). Of course it happily went ahead and used Django's DB router feature, implementing `allow_relation` to return true if `obj._state.db` was the `replica` or `default` master db.

    Now Claude had access to this link [2] and it pulled the data into the research prompt using the web-searcher. But that's not the point. Any junior worth their salt (distributed systems 101) would know _what_ the obvious failure was; the model failed to pay attention to the _right_ thing. While there are ideas on prompt optimization out there [3][4], the issue is how many tokens it can burn thinking about these things, and coming up with an optimal prompt and corrections to it is a very hard problem to solve.

    [1] https://github.com/humanlayer/humanlayer/blob/main/.claude/c... [2] https://litestream.io/guides/vfs/#when-to-use-the-vfs [3] https://docs.boundaryml.com/guide/baml-advanced/prompt-optim... [4] https://github.com/gepa-ai/gepa
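    To make the footgun concrete, here is a minimal sketch (hypothetical names, plain Python with no real Django imports) of the kind of naive DB router described above: it routes reads to a stale replica and lets `allow_relation` wave through any mix of replica/default objects, ignoring replication lag and isolation entirely.

```python
# Sketch of a naive Django-style DB router (hypothetical names).
# A Litestream read replica can lag the primary, so relating a fresh
# "default" row to a stale "replica" row is unsafe -- this router
# allows it anyway.
from types import SimpleNamespace

class NaiveReplicaRouter:
    """Routes reads to the replica and naively allows cross-db relations."""

    def db_for_read(self, model, **hints):
        return "replica"

    def db_for_write(self, model, **hints):
        return "default"

    def allow_relation(self, obj1, obj2, **hints):
        # The dangerous part: any combination of replica/default is
        # permitted, with no thought about staleness or isolation.
        dbs = {obj1._state.db, obj2._state.db}
        return dbs <= {"replica", "default"}

# Fake model instances standing in for Django objects:
fresh = SimpleNamespace(_state=SimpleNamespace(db="default"))
stale = SimpleNamespace(_state=SimpleNamespace(db="replica"))

router = NaiveReplicaRouter()
print(router.allow_relation(fresh, stale))  # prints: True
```

    The router is syntactically fine, which is exactly the problem: nothing in the code looks wrong unless you already know the distributed-systems context the model ignored.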

    • NewJazz 19 hours ago
      I'm not sure a junior would immediately understand the risks of what you described. Even if they did well in dist sys 101 last year.
  • reassess_blind 7 hours ago
    Junior devs become senior devs. No more junior devs = no more senior devs.
  • shevy-java 16 hours ago
    Yet the big corporations all do it. So something in the meta-explanation chain here does not work. You can't go around saying "this is a dumb idea" but then do it anyway; that just doesn't add up.
  • danans 19 hours ago
    Junior vs senior is the wrong framing. It's "can use LLMs effectively" vs "can't use LLMs effectively".

    It's like expecting someone to know how to use source control (which at some point wasn't table stakes like it is today).

  • gehsty 18 hours ago
    A Junior Dev is not just for Christmas, it’s for life.

    But more seriously, are there CEOs out there who think they can replace the people starting off in their industry with AI? Who do they think will be the senior devs in 5-10 yrs?

  • sputr 2 hours ago
    Replacing juniors with AI is a fantastic idea.

    Let’s also replace mids while we’re at it.

    — A senior developer

  • jacquesm 13 hours ago
    Obviously. If you don't have junior devs you will never have senior ones either. It's implicit.
    • testing22321 13 hours ago
      I guess the theory is that in 5 or 10 years when that becomes a problem AI will replace the seniors too.
      • jacquesm 13 hours ago
        Magical and wishful thinking in my opinion.
  • zkmon 19 hours ago
    But I think the actual reason was not addressed: the work of junior devs is exactly what can be replaced by AI, as opposed to the more complex abilities senior developers possess.
  • fire2dev 19 hours ago
    > A company that relies solely on AI to handle tasks without training new talent could find itself short of people.

    I kind of agree with this point from the perspective of civilisation.

  • Jerry2 18 hours ago
    Same story from 4 months ago: https://news.ycombinator.com/item?id=44972151
  • gaptoothclan 18 hours ago
    Old Zuckerberg said 80% of junior developers would be cut in 2026; I say 80% of CEOs who replace their software engineers will be cut in 2026.
  • exabrial 15 hours ago
    Cheaper in the short term until executives turn over, then it's the next guy's problem.
  • zkmon 19 hours ago
    The third point applies to general demographic survival as well: countries with the highest birth rates would ultimately win.
  • echelon_musk 21 hours ago
    > AWS CEO says using AI to replace junior staff is 'Dumbest thing I've ever heard' (theregister.com) 1697 points by JustExAWS 3 months ago

    https://news.ycombinator.com/item?id=44972151

    Does this story add anything new?

    • geodel 20 hours ago
      This CEO is a sales guy. He's gonna repeat the same thing every few months, claiming a "brand new and improved AI prediction".
  • fastball 14 hours ago
    Is that still true if all the junior engineers are using AI?
  • add2 13 hours ago
    By the way, junior devs are using AI for their work.
  • rudnevr 12 hours ago
    I don't know, I've always thought the junior problem was mostly non-technical, kids' issues: overconfidence, love of shortcuts, a sense of entitlement, arrogance, lack of communication and respect for colleagues (including fellow juniors and seniors), aversion to holy wars, lack of compromise and team discipline, disrespect for existing solutions, laziness in following up post-delivery, negligent edge-case checking, being opinionated about tooling, languages and whatnot. Very little of this can be fixed with AI, and many of these things can easily be amplified. I mean, one junior with AI vs one senior with AI might yield comparable results, but seven juniors with AI vs seven seniors with AI should fail pretty fast.
  • enigma101 9 hours ago
    first the junior then the senior and finally the ceo...
  • gchokov 17 hours ago
    Watch what they do, not what they say.
  • wolfi1 18 hours ago
    So what is the takeaway message? Fire only the senior devs because they cost too much and can't use AI?
  • tasqyn 19 hours ago
    Some companies are doing the opposite: firing senior devs and hiring juniors with AI experience.
  • rvz 20 hours ago
    Now with AI, I expect junior developers to learn much quicker and progress to senior very quickly. I'd now rather hire at least one of each to begin with, a "junior" and a "senior" developer, and then additionally hire more juniors to quickly turn them into "seniors".

    We no longer need to hire outside senior developers who have to be trained on the codebase; with AI, the junior developers catch up so quickly that they have already replaced the need to hire a senior.

    Therefore replacing them with AI agents was quite premature, if not completely silly. In fact it makes more sense to hire far fewer senior developers and instead turn juniors directly into seniors, saving lots of money and onboarding time.

    Problem solved.

    • platevoltage 18 hours ago
      If only everyone thought this way.
  • 0xdeadbeefbabe 21 hours ago
    4) Junior devs have an incomparably superior context window.
  • dwa3592 20 hours ago
    on a separate note- If AI eats the SaaS, what will happen to AWS?
    • geodel 20 hours ago
      Umm... more business. Because AI runs in the cloud and consumes even more resources.
      • dwa3592 19 hours ago
        but which business?
  • gorbachev 18 hours ago
    Somebody gets it.
  • testemailfordg2 13 hours ago
    Is or Was?
  • aurizon 15 hours ago
    Move fast with AI, and break things, many, many things really fast
  • d--b 16 hours ago
    I find it a little weird that junior devs are considered not good in general.

    When I started working, I think I was fairly competent technically, and usually the people I hired were also pretty good straight out of uni.

  • AtNightWeCode 17 hours ago
    What a complete moron. This is why every senior dev hates these idiots in mid management. The kids know better? Yeah right. The new kids know nothing. Recognizing how things work is not knowledge.
  • mberning 19 hours ago
    To me the more insidious problem is that we have juniors now that aren’t learning much because they lean on AI for everything. They are behind the curve.
  • johnwheeler 20 hours ago
    …proceeds to replace junior devs with AI
  • oulipo2 20 hours ago
    They know AI is inefficient and mostly just a glorified "template filler" at this point...
  • MSKJ 16 hours ago
    I would think you need Juniors to get Seniors. Or is there another way?
  • Nextgrid 21 hours ago
    This is performative bullshit pandering to the increased skepticism around AI. He wouldn't be saying that if AI investment was still in full swing.

    I do agree with him about AI being a boon to juniors and pragmatic usage of AI is an improvement in productivity, but that's not news, it's been obvious since the very beginnings of LLMs.

    • grogenaut 20 hours ago
      So it's performative when the head of AWS says it and not news. But it's not performative when you say it and people should have listened to you in the comments?
      • Nextgrid 20 hours ago
        It's performative when you say whatever the market wants to hear rather than sticking to an opinion (no matter how flawed). This behavior reminds me of the cryptobros who were hailing NFTs/web3 as the best thing since sliced bread, and when that didn't come to pass, quietly moved on to the next grift (AI) with the same playbook.

        (also I’m just talking out of my ass on a tech forum under a pseudonym instead of going to well-publicized interviews)

  • butterisgood 16 hours ago
    correct.
  • oytis 18 hours ago
    Am I going crazy, or did he already say this several months ago?
  • Madmallard 11 hours ago
    I don't really understand this comment from the CEO.

    Does he not understand the people making millions or billions off AI literally do not care?

    They fully are committed to seeing if they can do away with having to employ people all together.

    They want techno-feudalism.

    Sam Altman and his ilk seem so anti-humanity in interviews that it's really disgusting we allow them to be in a position of power at all.

  • tonyhart7 20 hours ago
    shareholder wouldn't like this
  • ubicomp 20 hours ago
    FINALLY.
  • panny 13 hours ago
    Isn't it weird how nobody here cared about Silicon Valley ageism until it hurt junior developers? No, I don't have schadenfreude. I just see a bunch of people, even in this very thread, now claiming juniors somehow understand AI tools better than seniors, even though we've all had the same amount of time to use them.

    At some point, admit you have a problem, maybe? Maybe that will only happen after you've spent 20 years staying on top of the latest tools and tech only to be told you're out of style because your hair started to grey.

  • mocha_nate 13 hours ago
    and yet...
  • coolThingsFirst 15 hours ago
    Yes, they decapitated careers right as the bubble is about to burst. Well, sorry.
  • mxkopy 19 hours ago
    They must've forgotten who created the first tech hype bubbles in the first place, because I'm about to replace some of these companies if I don't get hired soon.
  • dogemaster2032 19 hours ago
    [flagged]
  • elmean 21 hours ago
    [flagged]
    • msdrigg 21 hours ago
      I think you might have terminal brain rot
    • jjmarr 20 hours ago
      Week 1: 64 interns

      Week 2: 32 interns

      Week 3: 16 interns

      Week 4: 8 interns

      Week 5: 4 interns

      Week 6: 2 interns

      Week 7: 1 intern

      Week 8: 0.5 interns

      Is it possible to make it to the end of the summer without getting sliced in half?

      • cobbal 20 hours ago
        If they don't agree to be turned into a cyborg, are they really using enough AI?
      • geodel 20 hours ago
        If these interns think a bit, both of them should run away by the end of week six.
    • linuxhansl 20 hours ago
      I think you made that up.
    • glitchc 20 hours ago
      You must really not trust your hiring process.
      • elmean 17 hours ago
        This is the hiring process...
    • nerdsniper 21 hours ago
      How much do these interns get paid per hour?
      • elmean 17 hours ago
        I don't understand the question, they are interns
    • happymellon 21 hours ago
      Wow, what a shit working environment.
    • otikik 20 hours ago
      > each week we slim down to group by half

      What the hell.

      Consider making them fight each other in an arena, you could monetize that.

      • elmean 17 hours ago
        Hunger games style I like it
    • nehal3m 20 hours ago
      Am I missing some irony or sarcasm here? Aren't internships meant to spend some time teaching people the ropes in return for free hands? This sounds like a weird Jack Welch circlejerk.
      • elmean 17 hours ago
        The crème rises to the top
  • smurda 21 hours ago
    This sounds like a comment from someone who doesn't have visibility into how good the models are getting and how close they are to fully autonomous, production-grade software development.
    • Nextgrid 20 hours ago
      This is an easy theory to test: if AI were anywhere close to a senior engineer, we'd see the cost of software development drop by a corresponding amount, or quality going up. Not to mention delivery would become faster. With LLMs accessible to the general public, I'd also expect to see this in the open-source world.

      I see none of that happening - software quality is actually in freefall (but AI is not to blame here, this began even before the LLM era), delivery doesn't seem to be any faster (not a surprise - writing code has basically never been the bottleneck and the push to shove AI everywhere probably slows down delivery across the board) nor cheaper (all the money spent on misguided AI initiatives actually costs more).

      It is a super easy bet to take with money - software development is still a big industry and if you legitimately believe AI will do 90% of a senior engineer you can start a consultancy, undercut everyone else and pocket the difference. I haven’t heard of any long-term success stories with this approach so far.

      • throw1235435 3 hours ago
        TL;DR: Code is the easy part, and at least in the last few years it was rarely the bottleneck, so even if we get rid of coding we don't deliver infinitely more software. The "what to build" usually takes longer than the building. Output will only go up where coding was holding things up or was the main portion of the time spent delivering software (hint: in my experience it usually isn't even 20% of delivery time). There are many other stages to the SDLC, and lots of process even before that for large-scale systems.

        On your point about a consultancy: many software dev consultancies will dry up w.r.t. work. There won't be the success you describe; after all, if your consultancy can do it, so can an LLM, so why do I need you as the middleman? People will just get Claude/Gemini/etc. to do the small things; you are already seeing this effect in graphic design, copywriting and other small creative skills. For large things with a lot of complexity and judgement you still need domain experts and guardrails and other non-coding roles; that slows things down considerably, but it's still better to be in those jobs than in anything that mainly requires intelligence now.

        As a result, coding could easily be automated entirely and we might only see, for example, a 20% increase in total "large" software velocity. As I mentioned in another comment, it will be the people in the chain who produce little value but are required for other reasons (e.g. compliance, due diligence, sales, consultants, etc.) who remain and become the bottleneck. The people techies thought offered little value, made up inefficiencies and didn't contribute at all: they have the last laugh in the end, and they have AI to thank for that.

        Personally, in my team we are seeing significant improvement, to the point where hiring is no longer considered; I'm even worried about our senior staff. With anything that is labor, rather than deciding what to do, I feel I no longer need nearly as much help. This spans many components in a large public org. I feel like I only need two staff now, and that's more to understand the problem and decide what to do than for the doing itself, plus a backup for accountability. If I hire more it will only be because we can't keep up with the AI and I'm burning out, and I won't, because I don't want to "hire to fire" later on if we run out of product work. It makes me anxious, and I can't honestly recommend that anyone make this their career anymore; anything else feels like false hope at this point.

    • joshribakoff 20 hours ago
      This sounds like a comment from someone who has tested it in a limited capacity such as small blog sites or side projects that did not need to be maintained
    • JackSlateur 20 hours ago
      Yes, you are right

      I have yet to see the production-grade code written by these production-grade models.