Is the software industry seeing a fundamental shift?
May 4, 2024 2:13 PM

Myself and others who I’d consider very senior have been wholesale laid off. These aren’t lower-level developers (though that’s happening too); these are people who created, or were at least part of the teams that created, the modern software SDLC. The question isn’t whether software engineering is going away, but whether it’s moving to lower cost of living areas because, for most software, the “hard” problems have been solved. Similar to textile manufacturing?

I hope this question isn’t too broad, and it isn’t as if all jobs everywhere are leaving. But the argument was that with cloud services, modern toolchains and increased automation, people are willing to put up with what we’d traditionally consider bad software, to live with “quirks,” as software becomes broadly fast fashion or so reliable it doesn’t need to be updated. There’s generally no data loss or downtime, and creating your average business app (it’s always been a dream that a business user could do it themselves, which may never happen completely) can at least be moved to increasingly lower cost of living countries. The idea is that running a software project or company is like running an apparel company: the business and marketing matter more, and making a shoe translates roughly to making a couch.

These aren’t “tech bros” or neckbeard types. We’ve seen layoffs since the 90s, but this seems fundamentally different. The conversation was a bit of a midlife crisis: we aren’t worker bees anymore, but we still code; we used to make the decisions that are now being overruled; and we have diverse backgrounds.

The conclusion was that most problems now aren’t what we had before: projects often don’t outright fail, and the quality issues we do see have entered a phase of no longer mattering. The pioneering work by us and others means we don’t see memory segmentation faults anymore, but maybe the analytics/telemetry aren’t done right (go to Chanel’s site for a simple example, and count the errors from some analytics package not being fired off correctly).
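
(To make that concrete: the failure mode is usually a telemetry call that throws or silently rejects with nobody watching. Here is a minimal TypeScript sketch of the defensive wrapper that often gets skipped; vendorTrack is a hypothetical stand-in for whatever analytics SDK a site actually loads, not any real vendor’s API:)

    // Hypothetical stand-in for a real analytics SDK call; real ones
    // POST to a collector endpoint, this stub just simulates that.
    async function vendorTrack(
      name: string,
      props?: Record<string, unknown>
    ): Promise<void> {
      console.log("sent", name, props);
    }

    // Fire-and-forget with a guard: telemetry should never break the
    // page, but failures should still surface somewhere a human looks.
    async function track(
      name: string,
      props?: Record<string, unknown>
    ): Promise<void> {
      try {
        await vendorTrack(name, props);
      } catch (err) {
        console.warn(`analytics event "${name}" failed`, err);
      }
    }

    track("page_view", { path: location.pathname });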

I guess, given the mass layoffs of core units at even the tech giants (some of which is overreaction or overstaffing, I’m sure), I remember my parents complaining that the globalism and analytics-driven culture of the 80s turned their jobs into generic, report-driven work, where unless something impacted the bottom line that quarter it wasn’t a concern. Or is this just a recession, and there will be a new normal? Keep in mind this is a group of people passionate about software, not latest-HackerNews-new-tech types, and with a strong sense of business strategy.


Again, the objectives of business seem nothing fundamentally new, in that business always wants buy over build, wants the lowest quality it can get away with before it impacts someone’s bonus, etc. But did we hit a tipping point where we got too good? Where the problems moved from obvious things like “our software product isn’t working” to hard-to-quantify things (semantics wrong? are they even needed?), and we were only brought into the decision-making process because upper management simply didn’t know whether they were the problem?
posted by geoff. to Computers & Internet (33 answers total) 17 users marked this as a favorite
 
Response by poster: (Sorry about the typos, I’m on my phone, ha. Which underlines my point: I chose the easier-to-use tech with quality issues over walking over to my laptop.)
posted by geoff. at 2:15 PM on May 4 [1 favorite]


Well, I was in Software Development in the Banking Industry from 1976 until my recent retirement near the end of 2023. In my (non-HR wonk) opinion, the momentum of all those years was driven by industry consolidation (i.e., "conversions" and on-boarding), technology advancement (e.g., IDMS or IMS => DB2), pivoting to the Cloud (believe it or not, COBOL applications running on an IBM mainframe *can* be ported to the Cloud) and pivoting from less to more profitable industry niches (e.g., retail Banking to Syndications or Capital Markets, which requires a decent amount of insight into accounting rules, Fed regs, security, etc.). Also, once during my career the Bank I was working at out-sourced its Systems Development to a famous (though not very sharp) consulting firm.

Like I said, I'm not an expert, but my perspective is more longitudinal, as opposed to just looking at recent events, trends or fads. On the other hand, it is focused on my experience in FinTech. FWIW.
posted by forthright at 2:39 PM on May 4 [4 favorites]


This piece by Baldur Bjarnason seems relevant? “My theory is fairly straightforward: The long-term popularity of any given tool for software development is proportional to how much labour arbitrage it enables. The more effective it is at enabling labour arbitrage, the more funding and adoption it gets from management.”
posted by migurski at 2:58 PM on May 4 [13 favorites]


My software engineer acquaintances who have learned new technologies are still extremely in demand. Currently, the ones who can create AI innovations are more in-demand than ever. Previously, it was the ones who learned the technologies powering self-driving cars or social networks. Most of their software skills are transferable, but they have to be willing to put in the long hours to learn the newest areas.

In software, the job skills that are in demand (and the ones that are obsolete) shift quickly. In medical research, if someone researched polio or smallpox, they were at one point the most highly sought-after researchers. But in 2024, it would be harder to be employed as a polio researcher. Software just shifts much faster than medical research, and a software engineer might experience multiple shifts each decade.
posted by vienna at 3:28 PM on May 4 [3 favorites]


First, my condolences. Layoffs suck.

I think Bjarnason has it. It's late capitalism coming for the software industry.

I'm out of the industry now, thank the gods, but I think there's a stark contrast in the jobs I've had: when I was working for a guy who actually wrote code, it was fun; when I wasn't, it wasn't.

The whole idea of an MBA is that industry knowledge isn't needed: managers are interchangeable, and so are workers. The boss who's a coder may be a jerk, but at least he values good code. The MBA doesn't know what good code is, so all he cares about is doing it cheaply. The dev who's been there ten years and knows why things go wrong is now an obstacle.
posted by zompist at 3:29 PM on May 4 [22 favorites]


Perhaps I’m bullish, but I wouldn’t count software out. Projects fail, quality matters, security matters. Ask any company that has experienced a high-profile information breach or security incident: you cannot race to the bottom. Having personnel and data in the country of the customer matters, especially for the public sector. Offshoring is not the answer.

Tech is down because money is expensive and big tech has bad real estate on the books. When these conditions are corrected and investing in businesses beats inflation again, I think we will see more jobs. There has been a lot of stupid money in tech that will take time to wash out of the system. Sure, the jobs may be different, but you always had to learn new things to stay relevant.
posted by shock muppet at 4:02 PM on May 4 [7 favorites]


I don't think software is in a recession, but talking to friends who are still working it sounds like the pendulum has swung towards offshoring again.

A number of years from now someone will realize that the problems of managing all of these teams (and the fact that they hate getting up at 3AM to talk to them) could be solved by bringing it all into a single local concern. And a number of years after that...

I think I went through this cycle 4 or 5 times in my career.

Not that we should complain. "Decentralize everything that's centralized, centralize everything that is decentralized" is a motto that has kept software systems architects in business for years.
posted by Tell Me No Lies at 4:22 PM on May 4 [4 favorites]


I don't think there's been a time since the outset when software hasn't been undergoing a major shift. It's never been easier to write in a new language, the languages are better designed for testing and for mirroring the requirements you're building towards, CPU power has increased so you don't have to be a specialist in efficiency, and CI exists now when it didn't before. And even with those examples I think I'm just scratching the surface of how the world has changed.

I think the change in the job market has a lot more to do with available money than with differences in need. When money was cheap we just did more development. Now money is expensive, people are spending their pennies more wisely (or, perhaps, trying to). If it turns out that they cut their costs by choosing what to do more wisely, then we will see a shift, but as it is, that remains to be seen.
posted by How much is that froggie in the window at 4:49 PM on May 4 [2 favorites]


My dad's career started and ended a few years earlier than forthright's, but periodically companies would go through a period of laying off more experienced programmers and hiring a smaller number of newer ones at much lower pay. Eventually this would backfire, but it would take long enough to look like profit for the people who benefited from the appearance of "numbers go up." (Early on he worked for what we'd call tech companies or startups now, but they weren't stable then either, and he spent most of his career at major non-tech corporations.)
posted by wintersweet at 6:30 PM on May 4 [6 favorites]


20 years in industry; my observation is close to wintersweet's - I feel a lot of companies have managed to establish their major products and platforms in the last few years and are shifting to hiring junior maintainers over senior product people. This works for a while - the new guard are typically OK at keeping the lights on and adding incremental features, but things eventually fall apart when it comes to major updates and pursuing new product capabilities.

Bit like the Boeing story - they ousted and outsourced, and this was fine for keeping their old designs around, but as soon as they hit new competition and had to innovate again they flew apart at the seams.
posted by muppetkarma at 8:17 PM on May 4 [1 favorite]


Response by poster: Oddly, I am in AI, but I found out quickly that a CTO doesn’t see it as “a product recommendation engine will increase revenue” but as AI-as-a-product, like ChatGPT, and wants a number for how much it will bring down head count to decrease costs. Very few companies have a direct need for AI, as you could probably say about a lot of new technology. It feels like explaining why linting is important while being asked how it adds revenue or decreases costs. That’s very difficult to do. I was replaced by someone who spouts platform AI (generally not AI that helps, but SAP has an AI product that won’t do what it says while checking the “we invested in an AI platform” box).

Similarly, CI/CD would catch errors and deploy code automatically, but nowadays you rarely see GC or segfault errors. It’s more nuanced than that, yet CI/CD gets treated as an advanced version of FTP.

So I’ve changed disciplines multiple times in my career, and I guess being in consulting is even worse. It’s weird: we had all known each other for years, some knew the business better than others, but generally we’re all in “report to executives” kinds of roles, not churning out code all day. It just feels different, especially when Google laid off its core Python and Flutter/Dart teams.

Tech advancements are rarely quarter-to-quarter, and seeing large, profitable companies do whatever they can to spend less made us wonder if we solved things not to the point where we could do more, but to the point where no one would want to do more. It just being a money thing due to the end of ZIRP makes sense too.

When I see Fortune top-10 companies measure the productivity of AI by the amount of code change from a certain group using a tool, I’m like, what? That’s a complete misunderstanding of AI, and not really how it should be utilized.

That’s one example, but at last night’s dinner it felt like: well, we aren’t losing money and we are profitable, so let’s keep cutting things.

Again, I guess until ZIRP is back it might be worth looking into other industries.
posted by geoff. at 8:21 PM on May 4


Response by poster: Oh wow, that article is perfect. For a long time I argued that web components or simple HTML/JS were all that was needed. But there were no certs for that, and everyone was coming out of React bootcamps. You’d have simple web pages built as a mess of React code, and as long as there was no squiggly red underline and it “built,” it wasn’t their fault.
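
(For reference, by “web components or simple HTML/JS” I mean roughly this kind of thing: a custom element written against the standard Custom Elements API, no framework needed. A minimal TypeScript sketch; the element name and markup are just illustrative:)

    // A plain web component using the standard Custom Elements API.
    class HelloBadge extends HTMLElement {
      connectedCallback(): void {
        const name = this.getAttribute("name") ?? "world";
        // Reuse the shadow root if the element is re-attached;
        // shadow DOM scopes the component's markup and styles.
        const root = this.shadowRoot ?? this.attachShadow({ mode: "open" });
        root.innerHTML = `<span>Hello, ${name}!</span>`;
      }
    }

    customElements.define("hello-badge", HelloBadge);
    // Usage in plain HTML: <hello-badge name="MetaFilter"></hello-badge>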
posted by geoff. at 8:45 PM on May 4


periodically companies would go through a period of laying off more experienced programmers and hiring a smaller number of newer ones at much lower pay

This happened to my dad. This happens to every industry that lacks union density. Experience only means something if management thinks it does.
posted by eustatic at 9:04 PM on May 4 [5 favorites]


My experience is that software companies, even the ones rolling in money, have laid off a lot of senior people in the past two years. Like you, I'm in AI and have decades of experience. Aside from the very high-flying AI unicorns, I have not seen many openings, and most of those are a serious pay cut from what I was seeing in ~2021.

Anecdotally, my previous gig purged a lot of senior technical staff and managers and brought in a lot of remote Eastern European contractors.
posted by zippy at 9:23 PM on May 4


I’m seeing layoffs too, but at my company they’re for revenue shortfalls. Still, I have about 150 working days until retirement, and if it happens sooner, that’s fine with me.
posted by billsaysthis at 3:33 AM on May 5


Chances are that, at present, companies are more interested in profit than in competing in new markets.

During my time in industry, there was a point when it was thought that SQL and natural language processing were going to make all programming quick and easy. And another when it was all going to move offshore to India. Well into the 90s, HTML was written "by hand".
posted by SemiSalt at 4:46 AM on May 5 [1 favorite]


is it moving to lower cost of living areas

FWIW, some of it has always been there, even within the US.

The less exciting the software, the more likely. Epic, for instance. Or Diebold.

And then there's the persistence of remote work post-Covid: you can choose to be in a HCOL area, but unless your skills are especially sought after you may not be compensated any better than a peer willing to live in South Dakota.

Not sure even the return of ZIRP will change that, though it might soften it.
posted by snuffleupagus at 6:16 AM on May 5


Are you sure this is unique to software? In every white collar company I know it’s standard practice to lay off the more experienced, expensive older employees and replace them with people straight out of college.
posted by CMcG at 6:51 AM on May 5 [4 favorites]


This doesn't answer your question, but why would you lay off the people that wrote your software? You couldn't possibly get more de facto expertise.
posted by nTeleKy at 7:17 AM on May 5


I think a few trends have converged.

- software development is now a fully mainstream career. It’s been an option for at least 60 years. It’s no longer a niche field.
- as the field matured, roles were created: QA, designers, analysts. These fields lagged behind developers but are mainstream now too. Each of those roles both takes some work from the developer and allows the dev to code more efficiently (hopefully, at least.)
- tools are easier to use. I’m not a developer but I stood up a coherent Salesforce app at a job by being medium smart and using their documentation.
- offshore talent is getting better.
posted by punchtothehead at 7:30 AM on May 5 [2 favorites]


I guess I’ll go slightly against the grain and say that I think the answer is yes, it is. I’m not a programmer but I’ve worked in tech and know a lot of people in tech-adjacent roles. When I was in college ~10 years ago, computer science was the hot thing that everyone knew guaranteed you a fun, wildly well-paid tech job with a bunch of perks. When that happens it’s just a matter of time until everyone floods CS programs and there are too many software engineers. I think the ‘disruption’ shine has also finally worn off on tech and I have a feeling tech jobs are going to be a lot less like they were and a lot more like normal jobs going forward.

There’s a ton of other factors that play into this so who knows really! I’m not an expert by any means, but that’s my gut feeling based on my observations.
posted by caitcadieux at 8:28 AM on May 5


This doesn't answer your question, but why would you lay off the people that wrote your software?

maintenance mode on feature-complete and/or legacy products, perhaps after an acquisition
posted by snuffleupagus at 8:35 AM on May 5


Response by poster: So, uh, what does a director of software / principal architect transition to? There are always niche roles, but I’ve found even those tend to be cliquey. We are all a ways from retirement, and we were never money-motivated. I feel like working with large multinationals and navigating those politics gave me experience; I learned to talk not about the tech but in consulting language. Is there a LinkedIn for someone who wants to get into a startup or somewhere like that? Covid destroyed networking. I’m thinking a higher-up role at a somewhat established startup would fit me: less about the money, more about people who are smart and have passion. No interest in being a tycoon here, but I’m still at least 10 years from full retirement.
posted by geoff. at 2:12 PM on May 5


Discords, sadly. The essence of cliquey.
posted by snuffleupagus at 3:58 PM on May 5


With your experience, maybe you'd be suited to working internally for larger enterprises. In my recent experience there is a huge appetite there for transformation related to data and automation, and for implementing any AI use cases that are worthwhile, but a shortage of talent that has a sophisticated technical background and an appreciation of the complexities of enterprise software, along with a degree of maturity; your ability to speak consultant might be valued. I have heard anecdotally that, given the current state of the software development market, more companies that didn't traditionally hire developers are looking into it. I know non-anecdotally that this is the case for e.g. large law firms. There will be opportunities, I expect.
posted by lookoutbelow at 6:13 PM on May 5


Oh, and if you do look for jobs like that, LinkedIn is the place. I'd go figure out who has the kind of job you might want or who would hire such a person and go interact with their posts and share your expertise. E.g. in a law firm environment it would be directors of innovation or legal operations who might be involved in hiring software developers. With the appetite for generative AI there is a lot of internal product development happening where they need people who can figure out the details of building the architecture and integrations with legacy software. Maybe this might seem dull from the outside but you might also find working with innovation-oriented people outside of tech refreshing.
posted by lookoutbelow at 6:20 PM on May 5


Legal tech is forever, to my dismay, a backwater.

Discord. [The man, Theo's YT.]

And the Discords of whichever similar figures appeal to you, though of the whippersnappers he feels the most MeFi-ish. As it were.
posted by snuffleupagus at 6:27 PM on May 5


It's not all bad. We all miss BBSes, right? Tiring as it may be, it's BBSes writ large.
posted by snuffleupagus at 6:37 PM on May 5


There's a lot in your post about what you and your colleagues have done but not a lot about what you bring going forward. I think that you need to be able to define your value-add going forward to both technical and non-technical stakeholders now more than ever, especially if you are relatively senior and expensive.
posted by Kwine at 6:41 PM on May 5


Response by poster: That’s a great point. I have emphasized anchor points and delivering an MLOps methodology to bring the idea into the organization, along with long-term plans based on assets under returns, with a schedule. Unfortunately, while I can cluster industry productivity, I can’t say “you’ll save 12% overall.” So value-add turns into operational efficiency and profitability gains on a branching decision-making tree, whose branches split off further, etc. I’ve developed schedules and detailed estimates. This is standard 5S/7S: developing a methodology and defining a structured problem-solving approach for complex and uncertain risks. That leads to a Minto Pyramid. The value-add is unknown, which is why you establish anchor points on the branching tree, aggregate whatever data you have to back it up, and develop an action plan/working team.

A less academic explanation is here: https://www.mckinsey.com/capabilities/strategy-and-corporate-finance/our-insights/how-to-master-the-seven-step-problem-solving-process

Sorry for the McKinsey link, but it is a complex problem that needs breaking down. There are risks involved for sure, but this approach once worked and does not now. I can’t promise a 1.5% quarterly reduction in costs, but I can provide a methodology, based on research, that shows long-term gains. All of this without mentioning technology at all, or only in the abstract.

Were you thinking of something different?
posted by geoff. at 8:21 PM on May 5


Mod note: OP, please note that Ask Metafilter is about getting help and solutions for your problem or answers to your question, and not for having a back and forth discussion. If you'd like to discuss the topic with fellow members, please make a related Metafilter post for that conversation. Thank you!
posted by taz (staff) at 1:54 AM on May 6


I've worked in tech for 20+ years, and we are finally hiring internally again after not hiring for 10+ years. I've not seen any slowdowns over that time period, though a lot of our actual product development has always been outsourced; the in-house developers create reports and write troubleshooting tools, analysis tools, etc.
posted by The_Vegetables at 8:25 AM on May 6


I think OP is noticing something real, although it's more complex than "software is dying" or "everyone is offshoring".

First, software doesn't really exist as an independent industry in the way I feel like it existed 20 or 30 years ago. The lack of software experience and skills inside many companies back then necessitated bringing in outside talent—in the form of shrink-wrapped software that required customization, or true custom development work—from outside firms. These firms employed a lot of people, including me for a period of time, and were often pretty nice places to work!

But the flip side of software "eating the world" and becoming critical to nearly all competitive businesses is that more organizations have in-housed some degree of software development. Even if it doesn't look like software development in the classic sense, the 22-year-old analyst who knows just enough VBA to write some Excel macros, or someone who goes to a week-long training course on a "no-code" business process framework, is likely doing tasks that a dedicated software developer would have done just a decade or two ago. So a more technically adept workforce in general, combined with better tools, means fewer dedicated programmers.

I don't think it's the case that the hard problems in software have been solved, not at all. But I think a lot of the common problems in business software have been solved, many times over, and now people have found ways to productize those solutions in a way that doesn't require much customization.

All those low-code/no-code SAAS startups, which are ubiquitous across many industries today (if you run a dentist office, there are like 30 off-the-shelf "cloud" software solutions to choose from; same for accounting, auto repair shops, ad infinitum) have—by design—taken away work from traditional custom-code development houses. Whether they create more jobs for people-sitting-at-computers-writing-code (by whatever term you call that job: programmer, developer, s/w engineer, etc.) than they eliminate is a question I've often wondered about, but have never seen a clear answer to.
posted by Kadin2048 at 6:16 PM on May 6 [1 favorite]

