Congress or the president should pause H-1B visas or hike the fee to $200-500K so that only truly exceptional talent is allowed in. Right now it's just a giveaway to corporations that are laying people off by the tens of thousands.
There's lies, damned lies, and then: there's statistics.
You have to weigh the growth in jobs against how many new people there are to take them, where those jobs are located, and, somewhat weirdly, against other jobs.
Plenty of people feel so dejected at the current state of things that they leave computer work entirely, creating "openings" where there isn't actually any growth.
Like all things you try to understand: a single datapoint, when averaged, is like trying to calculate the heat of the sun by looking through a telescope at Jupiter. It will give you a tiny, far-removed facet of the data that only makes sense when coalesced with a hundred others.
This makes sense given both automation and the US's role in the global economy, but it runs somewhat contrary to standard ideas of class and inequality.
Apparently "top executive" median pay is $105,350 per year: https://www.bls.gov/ooh/management/top-executives.htm
Now just think of the comp levels in sectors like government, education, etc.
If you click the link it mentions "general and operations managers". They're tossing a lot of different roles into the category.
It's the combination of tech and big or fast growing companies.
People who operate in FAANG or Silicon Valley bubbles (or who spend too much time on Blind) can lose track of what salaries look like in the rest of the world.
I often share Buffer's open salary page because their compensation is actually pretty normal from all of the data I've seen and hiring I've done: https://buffer.com/salaries
Every time it gets posted there are comments from people aghast that the software engineers "only" make $200K and in disbelief that the CEO's salary is "only" $300K.
Can you elaborate?
Chief Executives is actually a specific sub-category of it and is, obviously, much smaller.
(don't forget to "allow pasting" in [chrome] console first)
It's a lot more inconvenient for others to have to pick colors that satisfy all potential sight issues, which is primarily why I think it should be an OS-level solution rather than an individual creator's responsibility. It's not that I don't care about those with the sight issue; it's purely about who is responsible for creating a reasonable solution. And honestly, there's no way every creator is going to study accessibility, so it's just a never-ending uphill battle. If you had a tool in your system already that could help, why wouldn't you use it?
I think AI outcomes distribute to the contexts where it is used and produce a change in how we work and what work we take on. Competition takes care of capturing those surpluses and investing them in new structure, which becomes load-bearing, and then we can't do without it anymore.
In the end it looks like we are treading water, just like when computers got a million times faster in a couple of decades but we felt very little improvement in earnings or reduction in work.
Surplus becomes structure, and the changed structure is something you can't function without. Like the cell and the mitochondrion: after they merged, they can't be apart or pay their costs individually anymore. Surplus is absorbed into the baseline cost.
The 1%'s pockets: that's where the vast majority of the extra productivity from computers, the internet, and automation has gone for the last 50 years: https://www.epi.org/productivity-pay-gap/
1) The salaries of corporate employees 2) Shareholders and capital owners
Regarding number 2: "Shareholders" would include anyone who owns any stock at all, including a lot of middle class people with a simple S&P 500 ETF in their portfolio.
And the increase in productivity allowed more people to become capital owners, AKA entrepreneurs. The explosion in software entrepreneurs, for example.
Because no matter what fairy tales you want to believe in, your $20 "invested" in Palantir won't make you a "shareholder" lmao
Lots of middle class people have graduated into upper-middle class: https://www.aei.org/research-products/report/the-middle-clas...
Wealth inequality is still a problem. But it's not just the people at the very top benefitting.
https://images.seattletimes.com/wp-content/uploads/2017/12/9...
https://www.peoplespolicyproject.org/wp-content/uploads/2020...
https://datawrapper.dwcdn.net/CvQar/full.png
https://static.guim.co.uk/ni/1415721490539/Wealth_line-chart...
Upper-middle class is people making ~$200k/year.
A lot of people have moved from middle class to upper middle class over the last decade. Both those categories are outside the 1%.
For a business, the question is whether you can make more money by doing more ambitious things.
Agriculture is a good example of that: http://www.johnhearfield.com/History/Breadt.htm
But given that the stock market hasn't panicked, this must mean at least one of these premises is false:
1. Economic activity is relatively flat.
2. AI makes us a million billion zillion times more productive than we used to be.
3. The stock market is rooted in reality.
This was already obvious, the more important question is what are we (collectively, society & our governments) going to do about it?
We (should have) already known most of our jobs were bullshit jobs, especially white collar jobs. The difference is now we might have something coming that will eliminate the bullshit jobs.
But society will always need bullshit jobs or the whole system collapses. Not everyone can go dig ditches, so what do we do?
I think this is a very important point. The hedonic treadmill means real gains are discounted. The novelty information cycle is like an Osborne Effect for improvements, like the semi-annual Popular Mechanics flying-car covers, where there is an enticing future perpetually nearly here and at the same time disappointingly never materialized.
This time the jobs most in the crosshairs of AI are the ones that constitute the paper-pushing overhead of modern society. Instead of $1 widgets from China replacing $2 domestic widgets, it's gonna be $1 AI services replacing $2 services that require a real human.
This is hard to reason about because people tend to consume these kinds of services in big multi-hundred or multi-thousand-dollar increments, but in practice it means that when you have to engage an accountant or an engineer, or have something planned out in accordance with some standard, it will be substantially cheaper because of the reduced professional-labor component.
And of course, as usual, the string-pulling and investor class will get fabulously wealthy along the way.
BLS forward looking guidance means nothing when technology revolutionizes the nature of work.
Putting aside the slop facade placed atop the data... why would we trust the data?
Yay!
>Computer Programmers: -6%
Oh no
(Source: https://www.bls.gov/ooh/computer-and-information-technology/...)
Computer Programmers median pay according to BLS: $98,670 per year
(Source: https://www.bls.gov/ooh/computer-and-information-technology/...)
Software developers typically do the following:
- Analyze users’ needs and then design and develop software to meet those needs
- Recommend software upgrades for customers’ existing programs and systems
- Design each piece of an application or system and plan how the pieces will work together
- Create a variety of models and diagrams showing programmers the software code needed for an application
- Ensure that a program continues to function normally through software maintenance and testing
- Document every aspect of an application or system as a reference for future maintenance and upgrades
(Source: https://www.bls.gov/ooh/computer-and-information-technology/...)
Computer programmers typically do the following:
- Write programs in a variety of computer languages, such as C++ and Java
- Update and expand existing programs
- Test programs for errors and fix the faulty lines of computer code
- Create, modify, and test code or scripts in software that simplifies development
(Source: https://www.bls.gov/ooh/computer-and-information-technology/...)
A programmer is like a translator: somebody else came up with what to do, and you're doing the mechanical work of converting the words into C++.
Developer involves coming up with what to do.
Hence programmer is the lower-paid position.
There's no functional difference between a 'software developer' and a 'programmer'. They're just synonyms that sometimes pay differently.
It feels like the intent was that "Programmers" were the ones doing the routine / lower skill tasks while the Developers were the ones that did the specification and architecture.
Those got juggled around, and largely the count of people listed as "Computer Programmer" is going down as companies relist them as Software Developers.
This is also part of the confusion of "Web Developer" which is also in there.
It reflects what the government thought management thought titles and roles were some years ago.
---
Edit: From days of old: https://web.archive.org/web/20110616142157/https://www.bls.g...
15-1132 Software Developers, Applications
Develop, create, and modify general computer applications software or specialized utility programs. Analyze user needs and develop software solutions. Design software or customize software for client use with the aim of optimizing operational efficiency. May analyze and design databases within an application area, working individually or coordinating database development as part of a team. May supervise computer programmers.
https://web.archive.org/web/20110531043521/http://www.bls.go... 15-1133 Software Developers, Systems Software
Research, design, develop, and test operating systems-level software, compilers, and network distribution software for medical, industrial, military, communications, aerospace, business, scientific, and general computing applications. Set operational specifications and formulate and analyze software requirements. May design embedded systems software. Apply principles and techniques of computer science, engineering, and mathematical analysis.
https://web.archive.org/web/20110925005933/http://www.bls.go... 15-1131 Computer Programmers
Create, modify, and test the code, forms, and script that allow computer applications to run. Work from specifications drawn up by software developers or other individuals. May assist software developers by analyzing user needs and designing software solutions. May develop and write computer programs to store, locate, and retrieve specific documents, data, and information.
Note that the specifying part isn't done by the programmers but by the other roles... And for completeness:
https://web.archive.org/web/20130624010204/http://www.bls.go...
15-1134 Web Developers
Design, create, and modify Web sites. Analyze user needs to implement Web site content, graphics, performance, and capacity. May integrate Web sites with other computer applications. May convert written, graphic, audio, and video components to compatible Web formats by using software designed to facilitate the creation of Web and multimedia content. Excludes "Multimedia Artists and Animators" (27-1014).

Reason for hope
They're saying that programmers will be declining, while Developers and, crucially, Testers and QA people will be increasing. That testers and QA become more important sounds plausible to me in a hypothetical future world of ubiquitous AI.
All of that doesn't necessarily imply that the Developer class of employees will grow at the same rate as the Tester and QA classes of employees.
I'd like to use this on my website and also see if I can create variations for some of the major EU markets.
My friends and I who have a bachelor's degree in CS make more money than my friends who have or are working towards master's degrees in CS, because the former are working in the private sector and the latter are in academia making peanuts.
Edit: Another possible reason is that Master's degrees were less common in the past, so the Bachelor's pay statistics skew toward people with more work experience in their higher-earning years, whereas the Master's pay statistics skew toward younger people with less work experience.
Apple, a very successful company, makes ~$300B/year in revenue (ish).
~10% is all you need to be Apple.
And, it can work by taking all of 10% of the jobs and collecting the whole salary (the AI employee -- dubious proposition),
or by taking 10% of everyone's salary and automating part of everyone's job (the AI "tool" -- much more plausible).
If "part" being automated is >10%, we all win in the long run, every company gets productivity growth without cost growth, etc etc.
If you add in data center costs, and multiple competing AI companies, and then expand the TAM to all white-collar work worldwide, you can make everyone successful beyond their wildest dreams with a "20% of the work for 20% of the cost" model. Again, how you distribute that 20% remains to be seen (20% new unemployment, or 0% unemployment with "tools").
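The "20% of the work for 20% of the cost" arithmetic can be sketched as a back-of-envelope calculation. Every input below is an illustrative assumption, not a sourced figure:

```python
# Back-of-envelope sketch of the "X% of the work for Y% of the cost" model.
# All inputs are illustrative assumptions, not sourced figures.

white_collar_wage_bill = 20e12   # assumed global white-collar wages, $/year
work_automated = 0.20            # AI takes over 20% of the work...
price_fraction = 0.20            # ...and charges 20% of what that work cost

displaced_value = white_collar_wage_bill * work_automated  # labor value displaced
ai_revenue = displaced_value * price_fraction              # what AI vendors capture
surplus = displaced_value - ai_revenue                     # value freed up for someone

print(f"AI vendor revenue: ${ai_revenue / 1e12:.1f}T/year")
print(f"Surplus to distribute: ${surplus / 1e12:.1f}T/year")
```

Whether that surplus shows up as higher margins, lower prices, or unemployment is exactly the distribution question above.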
I formalized my thoughts here: https://jodavaho.io/posts/ai-jobpocolypse.html
It's also understated, because the real value of AI is not in replacing work, but making new products possible either because it's finally cheap enough to make them, or because -- AI.
Given the state of AI (LLMs), they still need a very skilled human (driver) to operate them.
Potable water is far more important than AI or iPads ever will be, but the world's most valuable water company only does about 5B/year in revenue: https://en.wikipedia.org/wiki/American_Water_Works
Frequently seen as a big fun number in pitch decks. "The TAM for our new Coca-Cola killer is $1.6T: all humans who imbibe liquids on a regular basis. You simply MUST invest."
On second thought, client service folks might do extremely well here!
What you mention here is the exact reason my earlier relationship went bust, because I didn't have any of these; then the children arrived :-X
> Rate the occupation's overall AI Exposure on a scale from 0 to 10.
Are LLMs good at scoring? In my experience, using an LLM to score things usually produces arbitrary results. I'm surprised to see Karpathy employ it.
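One cheap way to probe that suspicion is to re-score the same item several times and measure the spread. `llm_score` below is a hypothetical placeholder for whatever call produces the 0-10 exposure rating, not a real API:

```python
from statistics import mean, pstdev

def llm_score(occupation: str) -> float:
    """Hypothetical stand-in: ask an LLM to rate AI exposure from 0 to 10."""
    raise NotImplementedError("wire up your LLM call here")

def score_stability(occupation: str, trials: int = 5, scorer=llm_score):
    """Score the same occupation repeatedly; a large spread means the
    ratings are closer to arbitrary than to measurement."""
    scores = [scorer(occupation) for _ in range(trials)]
    return mean(scores), pstdev(scores)
```

If the standard deviation rivals the scale itself, the single-shot numbers behind the chart are mostly noise.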
What's the outlook like?
Thank you!
If you turn on the color filters in accessibility settings in macOS you can see what the contrast could look like to a colorblind person.
The general trick is you can rely on differences in color lightness, patterns, text and icons, but not differences in color hue. The page should be usable in grayscale.
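The "usable in grayscale" rule can be checked mechanically with the WCAG relative-luminance and contrast-ratio formulas. A minimal sketch (the red/green pair is just an illustrative bad choice):

```python
# Compute WCAG relative luminance and the contrast ratio between two colors.
# Two hues with nearly equal luminance are hard to distinguish for
# colorblind readers, no matter how different they look to everyone else.

def relative_luminance(rgb):
    def lin(c):
        c /= 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (lin(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(c1, c2):
    hi, lo = sorted((relative_luminance(c1), relative_luminance(c2)), reverse=True)
    return (hi + 0.05) / (lo + 0.05)

print(contrast_ratio((255, 0, 0), (0, 128, 0)))    # pure red vs. green: ~1.3, poor
print(contrast_ratio((0, 0, 0), (255, 255, 255)))  # black vs. white: 21, ideal
```

Colors that pass this check in grayscale stay distinguishable regardless of how the reader perceives hue.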
Needs
- [utility] add filter by keyword / substring match, e.g. the majority of visualized reports are unlabeled, requiring hovering with a mouse pointer
- [improve discovery] add sort by demographic / population impact, e.g. the largest block is 7M ('Hand laborers and movers') and is default-sorted to the bottom-left
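The requested keyword filter is straightforward; a minimal sketch with a made-up row shape (the real visualization's data model may differ):

```python
def filter_occupations(rows, keyword):
    """Case-insensitive substring match on occupation labels."""
    kw = keyword.lower()
    return [row for row in rows if kw in row["label"].lower()]

# Illustrative rows only; field names are assumptions.
rows = [
    {"label": "Hand Laborers and Movers", "employment": 7_000_000},
    {"label": "Software Developers", "employment": 1_900_000},
]
print(filter_occupations(rows, "labor"))  # matches only the first row
```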
Stand in front with a gun while mobs come to burn down the data center that took their jobs.
(I think I'm half joking).
https://apnews.com/article/trump-jobs-firing-f00e9bf96d01105...
> Taxi Drivers, Shuttle Drivers, and Chauffeurs
> Overall employment of taxi drivers, shuttle drivers, and chauffeurs is projected to grow 9 percent from 2024 to 2034, much faster than the average for all occupations.
...word?
A -4.0% hit to cashiers may have less of an impact than -4.0% to lawyers or another category that is propping up the middle of the economy with spending.
I guess that was to be expected...
Started my career in the decade of offshoring and didn't think we'd have anything close to an "AI" taking our jobs before we potentially unionized or had a government that would protect its labor force from being replaced by literal robots.
2020-2022 felt like the usa tech ship was finally growing into something really great. All gone now.
When I worked in devops I always worried that my job was automating away other engineers, it definitely had a "when will this come for me" feeling, because it really was, now the dev and ops are both getting automated away.
This is my first time looking at HN in practically a year. Tech is just so uninteresting to me now. Nobody is hiring SDE/SWE/SREs except for the problem makers, like Anthropic, Meta, etc. Anthropic has pages and pages of $300k-$600k roles open right now. But do you go help the rest of your colleagues lose their jobs?
I guess lets talk about kubernetes or something...
In addition, little work is done to separate the classes. He has probation officers in the same node as teachers, completely separate from law enforcement.
Here are some much better examples:
- https://www.washingtonpost.com/nation/2022/05/04/abortion-nu...
- https://flowingdata.com/2015/04/02/how-we-spend-our-money-a-...
It's not great for them, but it's a definite advantage for people who are already in the mindset of distinguishing and discriminating information and sources on merit, instead of running an "AI bad" rubric as part of their filter.
AI has already won. It's taking over. It might be a year or two, or five, or ten, but AI isn't slowing down, nobody is going to pause, and there's a whole shit ton of work people do that won't be meaningful or economically relevant in the very near term. Jevons paradox isn't relevant to cognitive surplus - you need a very different model to capture what's going to happen.
It's time to surf or drown, because it doesn't look like any of the people in charge have the slightest clue about how to handle what's coming.
Maybe it was linked from a comment somewhere on HN but just today I saw a post saying “Microwaves are the future of all food: if you don’t think so, you better get out of the kitchen”
Microwaves have already won. There will be a microwave in every home over the next few years.
It’s time to start microwave cooking or drown
It's annoying that the dishes still have some pooled water in them when the cycle finishes; it doesn't always get everything perfectly clean; I have to know not to put the knives or the wooden stuff or anything fancy in it. But in spite of all of that, I use it every day, it's a huge productivity boost, and I'd hate to be without it.
I can tell you that I didn't observe a single hand-wash-only holdout.
Perhaps such holdouts existed at a point, but a restaurant can only flatter the ego of their performatively-unproductive seniors for so long. Competition exists.
Hand-washing dishes also, from what I understand, uses more energy and water than the dishwasher does.
Correct, more energy, detergent, and water. Dishwashers are more efficient than what you can do by hand because they effectively manage their water usage.
A modern dishwasher will use 3 to 4 gallons on a run. By comparison, my kitchen sink holds about 10 gallons of water on each side. When I wash by hand, I'll fill one side with soapy water and rinse each dish individually. Easily more than 10 gallons of water get used in the whole process.
Dishwashers are so efficient because they rinse everything off the dishes with about a gallon of water, drain it, then use detergent in a second pass (another gallon) to get off the tougher food stains. Then they rinse with a final gallon of water.
Dishwashers maximize getting food particulates into dirty water in a way that you can't really sanely do by hand.
If I hand wash, I wash as I go. It takes maybe 5 minutes to wash up dishes from breakfast or lunch, maybe a little more for a big dinner, maybe not.
Dishwashers let you accumulate dirty dishes for a day or two which is the real advantage in water savings. But I've noticed a lot of people pre-wash by hand and then load the dishwasher. I don't understand that, if I'm going to "pre-wash" anything I'll just wash it completely and put it away.
5 minutes of most sinks running is 10 gallons of water. (Most kitchen sinks are 2 gallons per minute).
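Putting the thread's figures side by side, a rough sketch (every input is one of the assumptions stated above, not a measurement):

```python
# Rough water-use comparison using the numbers quoted in this thread.

faucet_gpm = 2.0          # typical kitchen faucet flow, gallons per minute
handwash_minutes = 5.0    # one wash-as-you-go session with the tap running
meals_per_load = 3        # assumed meals accumulated into one dishwasher load
dishwasher_gallons = 4.0  # full cycle on a modern machine

handwash_total = faucet_gpm * handwash_minutes * meals_per_load
print(f"Hand-washing {meals_per_load} meals: {handwash_total:.0f} gallons")
print(f"One full dishwasher load:   {dishwasher_gallons:.0f} gallons")
```

A stoppered-sink technique changes the hand-washing number considerably, which is why the comparison is so sensitive to how you wash.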
> Dishwashers let you accumulate dirty dishes for a day or two which is the real advantage in water savings.
I agree. If you aren't filling the dishwasher then you are probably wasting water. However, a full dishwasher is going to be a real water/energy saver. Especially if you aren't washing the dishes before putting them in the dishwasher. (I know a decent number of people do that. It's a hard habit to break).
My wife and her family :D. Water conservation mentality is a battle.
I'm pro-dishwasher, but you could use much less water handwashing.
If I don't have a dishwasher, my normal method is to stopper one side of my sink, squirt some dish soap on the first few dishes, and run just enough water to wet the dishes. Then I scrub some dishes, run the water (into the stoppered sink) just to rinse them as I transfer to the dish rack, then turn off the water and repeat. The dirtiest dishes that have the most food stuck on get done last so they get the most time soaking in the soapy rinse water from the rest of the dishes. I can do a full dishwasher load with one side of my sink maybe 1/4 full of water.
(The "normal" cycle is specced for 11.0-27.7 litres but uses more electricity, which is more expensive than water.)
Modern high-efficiency dishwashers probably beat the most efficient humans now, but that's relatively recent and not a huge margin (and may not get the same results).
I use the time I spend to hand-wash my dishes as a time to pause and to let my mind wander. Having the hands in water is soothing.
And its a pleasant feeling, where cleaning is part of the food workflow : I cook, I eat, I clean (the kitchen, the dishes, my teeth).
I hate home dishwashers: you have to play Tetris after each meal to fill them, trying not to get your hands/arms dirty, then you have to let it do the work, and now you have to spend a few minutes to get the dishes out and store them where they should be, even though most of them are not linked to a meal you just had. Maybe worse, you could unload the dishwasher at a time completely unrelated to food, so that breaks the link.
On the other hand, having worked in restaurants, industrial dishwashers are awesome.
Fridge OTOH, not so much.
LLMs require a lot more effort.
This is an incredible self-report. If you consider microwaved meals to be your default method of cooking and not something primarily for reheating leftovers or defrosting frozen meat, I sincerely hope you've gotten your cholesterol and blood pressure checked recently. That is not normal.
This is nuts! I use an oven every day, dude. So it's a special occasion, is it?
The default method for cooking is using an oven or using a stove. Microwaving is for heating up left-overs for the most part.
One of the dangers of people who are too close to programming is that they think of life as binary.
I’d also say that while I like my air fryer oven, I would prefer to do some of the bigger things like a whole bird in the oven. It’s cheaper to buy a whole bird for meal prep.
Or you're batch cooking
I’m from northern Europe. I might use the micro to heat up leftovers or a cup of water for tea or whatever in a pinch, but in this household (and at all my friends’), the stove and the oven cooks the food. I know literally no-one who could say they cook most meals in the micro.
I didn’t have a microwave oven before we bought a house. It took up too much space to justify, for such a relatively rarely-used appliance.
I think OP is just an outlier.
Although, the analogy seems sort of useless, in that the food preparation ecosystem is really not any less complex than the program creation ecosystem, so it doesn’t offer any simplification.
I've lived without a microwave for a long time and it's only a little bit inconvenient because things take longer to reheat.
Thankfully there is real data if we want to know how microwaves are used. Survey below says they are used a bit more than ovens, but half as much as cooktops/stoves. Varies by cohort and meal.
Source: https://indoor.lbl.gov/publications/residential-cooking-beha...
Ovens are a special occasion thing in my house because our oven is huge and I can usually do the same thing in the air fryer, which is just a small convection oven.
That really only makes sense for households with a toaster oven, single adults, childless couples, and retired people. A toaster oven makes a lot more sense for small meals, in part because it heats up much faster than a full oven.
Otherwise, a daily family meal isn't a special occasion.
The food has been cooked in industrial ovens at the factory.
Not true in my household, in my parent's, in my in-laws, or any of my closest friends'. And none of us are cooks, so it's not a niche thing.
I'm sure in a lot of households the microwave oven is the primary form of cooking, but it's important to look outside the bubble before reporting trends.
You think "there's a whole shit ton of work people do that won't be meaningful or economically relevant in the very near term" is wrong?
(The original phrase was not just made up, it was sourced from actual news articles and marketing about microwave ovens, that’s why it feels relevant to a hype cycle like this)
You also see this kind of naive optimism if you go look at illustrations from the early 1900s. People believed everything would eventually be a machine: that a machine would feed you, wake you up in the morning, physically move everything within your home etc. And yeah those things are possible to do, but in reality they aren’t practical and we do not actually use machines to do everything because it has costs
So, you know how looking at one pattern and just saying "this one will be like that one" without considering the similarities and differences is similar to what people complain about AIs doing?
Consider: Unlike my Microwave, Claude can work on Claude. Unlike my Microwave, Claude gets better at more things. Unlike my microwave, we do not know what causes Claude to work so well. My Microwave cannot improve the process that makes my microwave.
Also, um.
I'm not sure if you noticed?
But machines are everywhere.
I'm typing on one while another one (a microwave, in fact!) heats my breakfast, while another one washes my clothes, while another one vacuums my floor, while another one purifies the air in my room, while another one heats the air in my room, while another one monitors my doors and windows for unauthorized entry and another one keeps my food cool and another one pumps the Radon gas out of my basement and another one scoops my cat's poop.
You’re kind of missing the point a bit. Yes, machines are everywhere but the details are very different.
The machines don’t magically do that stuff for you. You have to buy them, plug them in, turn them on and off. Lots of people don’t have any at all. They can’t do most things unsupervised. There are still lots and lots of tasks for which a machine exists, that people will still do entirely manually
There is a naivety to these predictions that is chipped away by the mundane details of having to exist in the real world. Cost, effort etc
No, AI has not "already" won. And phrasing it as you do, "It's taking over. It might be a year or two, or five, or ten" is an admission of that.
People may indeed not pause, but there's never any guarantee that the next step of progress is possible; whatever we reach may be all we can do, and we'll only find out when we get there. Or it might go hyperbolic and give us everything.
I'm not certain, but I suspect Jevons paradox is probably the wrong thing to bring up here, that's about cheaper stuff revealing more latent demand, and sure, that's possible and it may reveal a latent demand for everyone to build their own 1:1 scale model of the USS Enterprise (any of them) as a personal home, but we may also find that AI ends the economic incentives for consumerism which in turn remove a big driver to constantly have more stuff and demand goes down to something closer to a home being a living yurt made out of genetically modified photovoltaic vines that also give us unlimited free food.
(I mean, if we're talking about the AI future, why not push it?)
What I do think is worth bringing up is comparative advantage: Again, this is just an "I think", I'm absolutely not certain here, but if AI can supply all demand at unlimited volumes*, I think the assumptions behind comparative advantage, break.
> It's time to surf or drown, because it doesn't look like any of the people in charge have the slightest clue about how to handle what's coming.
Yes, and I think they've also not even managed to figure out the internet yet.
* and AI may well be able to, even if all models collectively "only" reach the equivalent of a fully-rounded human of IQ 115; and yes I know IQ tests are dodgy, but we all know what they approximate, by "fully rounded" I mean that thing their steel-man form tries to approach, not test passing itself which would have the AI already beat that IQ score despite struggling with handling plates in a dishwasher.
Ah, the classic, forever-untestable "it's just around the corner" hypothesis.
I've lived through multiple "it's gonna be over in 12-18 months" arguments since November 2022. It's a truism for any technology to say that it's going to get better over time. But if you're convinced that "AI has already won", why not make a specific prediction? What jobs are going to be obsolete by when?
Jevons paradox was never relevant to cognitive surplus. That isn't what it's about.
Cognitive surplus only strengthens Jevons paradox. Humans are a competitive advantage for businesses in a world dominated by human needs
OP comment is not clever
i think i need more patience -- i seem to fall into a certain tone due to my low expectations, and it's likely a self-fulfilling process which i am complicit in
1. Brick and mortar is dead.
2. The internet will die.
3. What is the business model? (this one still seems to exist to this day to some extent, lol)
Reality fell between 1 and 2.
Just because it was wrong once doesn't mean it's never wrong. And was it really that wrong? The internet is great, but would it be the worst thing in the world if we didn't live our lives around it?
You'd probably put me into that bucket, although I'd disagree. I'm not at all against using AI to do something like: type up a high level summary of a product featureset for an executive that doesn't require deep technical accuracy.
What I AM against is: "summarize these million datapoints into an output I can consume".
Why? Because the number of times I've already witnessed in the last year: someone using AI to build out their QBR deck or financial forecast, only to find out the AI completely hallucinated the numbers - makes my brain break. If I can't trust it to build an accurate graph of hard numbers without literally double checking all of its work, why would I bother in the first place?
In the same way, if you tell me you've got this amazing dataset that AI has built for you, my first thought is: I trust that about as much as the Iraqi Information Minister, because I've seen first hand the garbage output from supposedly the best AI platforms in the world.
*And to be clear: I absolutely think businesses across the board are replacing people with AI, and they can do so. And I also think it'll take 18+ months for someone to start asking questions only for them to figure out they've been directing the future of their company on garbage numbers that don't reflect reality.
If I were in need of hard analytics you can be damn sure I'd have it build a tool with a solid suite of tests following a rigorous process to ensure the outputs are sound. That's the difference between engineering and vibing.
Published AI generated code is a mild negative signal for quality, but certainly not a fatal one.
Published AI generated English writing is worthless and should be automatically ignored.
Could you elaborate on this? Is it just a claim, or is there some consensus out there based on something that it doesn't/shouldn't apply?
a. "Has already won"
b. "Might be a year or two, or five, or ten"
So... What exactly are you talking about?
Whether people are adopting AI or not, everybody doing the same kind of job gets the same number for exposure to AI.
You can claim that AI is creating a Jevons paradox situation and making companies hire like crazy the very people it nominally replaces. But then you would have to point to an instance of that happening, because it's clearly not there either.
Over the past year (where Opus has supposedly changed the game), we're seeing ~10% more job postings for software developers compared to this time last year [1,2].
A huge amount of our work is not easily verifiable, therefore it's extremely hard to actually train an LLM to be better at it. It doesn't magically get better across the board.
AI HAS WON. SURF OR DROWN. YOU DONT KNOW WHATS COMING!!!?!?!
Stop with this doomer drivel. It's sick. It's not based in reality and all it does is stress innocent people out for no reason.
However I was completely unimpressed with this tool when I saw it this weekend for two reasons:
The first is directly related to how this is built:
> These are rough LLM estimates, not rigorous predictions.
This visualization is neat (well except for reason number two), but it's pretty much just AI slop repackaged. There's no substance behind any of these predictions. Now I'm perfectly open to the critique that normal BLS predictions are also potentially slop, but I don't see how this is particularly valuable.
And the second: like 8% of the male population, I'm colorblind, so I can't read this chart.
For the record, I do agentic coding pretty much every day, have shipped AI products, done work in AI research, etc.
Ironically, it's comments like yours that keep me the most skeptical. The fact that an attack on a strawman is the top comment really makes me feel like there is some sort of true mania here that I might even be a bit caught up in.
AI is great for searching, I'll give you that. And that itself is a big deal. In software development, there is also real value in using AI for code reviews. But I'm not sure how worthwhile it would be if you have to retrain a model on new information just to get better search results and code reviews.
Maybe that will be subsidized by all the people like you who want everything to be done by AI, so the rest of us can use it as a better search tool and for quick reviews... who knows!
I think AI is not going anywhere.
I also don't think the future will play out as you envision. AI is a very poor replacement for humans.
And I say this as a misanthrope who doesn't have a particular beef against AI.
I use AI every day as part of my work, and it's very unclear to me where it's going; we have no idea if we're on an exponential or an S-curve. Normally, people talk with conviction because they have more data. But one of the breakthroughs of crypto was the social convention of just holding very strong opinions based on nothing. A lot of that culture has come over to AI.
Your comment typifies this: it's all "I need to get on board, AI has already won, you've got an advantage over me because you realise this."
Go back and look at the actual article you're commenting on. Did the AI analysis of job exposure provide anything of value? I'm not totally convinced it did, and you didn't even think about it. What critical thinking did you do about the data that came out of this dashboard?
I could understand if all the naysayers doing old fashioned stuff like work all of a sudden have no more work to do. But the AI Embracers will have what, in comparison? Five years of experience manipulating large language models that are smarter than them by a thousand fold?
brainbroken by chatbots lmao
Man.. I suggest you touch some grass. You are living in a bubble.
This cuts both ways...
> there's a whole shit ton of work people do that won't be meaningful or economically relevant in the very near term
What work do you think AI is going to replace? There are whole categories of people who are going to drown in the hubris of "AI being able to do the job" when it can't.
The moment one stops pretending that we're getting AGI and instead views AI as another tool, the perspective changes. Strip away the hype and there is a LOT there... The walls of the garden are gonna get ripped down (agents force the web open, and create security issues). They end lots of dark patterns: you can't make your crappy service hard to cancel, because an agent is more persistent than that. One-size-fits-all software is going to face a reckoning (how many things are jammed sideways into Salesforce... that don't have to be?). These things are existential threats to how our industry works TODAY, and no one seems to be talking about the impact on existing business models when the overhead of building software gets cut in half (and how it leads to more software, not less).
It's been several years and nothing has changed except the AI grift is crumbling as we get out of the post-covid slump.
Companies Are Laying Off Workers Because of AI’s Potential - Not Its Performance - https://news.ycombinator.com/item?id=47401368 - March 2026
> Some companies that announced large headcount reductions because of AI have since revised their talent strategies or have faced public criticism. Klarna, for example, the Swedish fintech that offers “buy now, pay later” e-commerce loans, reduced its human workforce by 40% between December 2022 and December 2024 as it invested in AI. (The company used a hiring freeze and natural attrition, not layoffs to achieve this cut.) But in 2025 the company’s CEO told Bloomberg that Klarna was reinvesting in human support, explaining that prioritizing lower costs had also led to “lower quality.” A spokesman told HBR that the company has hired about 20 people to deal with customer service cases the AI assistant can’t handle, and that the use of AI “changes the profile of the human agents you need in the customer support role.” The language-learning company Duolingo announced that AI would be used to replace many human contractors, and it faced considerable criticism on social media.
> For one, AI typically performs specific tasks and not entire jobs. As an example, Nobel laureate Geoffrey Hinton stated in 2016 that it was “completely obvious” that AI would outperform human radiologists within five years. A decade later, there is no evidence that a single radiologist has lost a job to AI—in part because radiologists perform many tasks other than reading scan images. Indeed, there is a substantial shortage of them.
The 'AI-Washing' of Job Cuts Is Corrosive and Confusing - https://news.ycombinator.com/item?id=47401499 - March 2026
* Companies are "AI washing" layoffs, blaming artificial intelligence for workforce reductions they would have made anyway, according to OpenAI CEO Sam Altman.
* A Resume.org survey found that 59% of hiring managers say they emphasize AI's role in layoffs because it "is viewed more favorably by stakeholders than saying layoffs or hiring freezes are driven by financial constraints".
* The stated reason for the layoff matters more than the fact of the layoff, and framing cuts as proactive restructuring around AI can result in a valuation boost, even if the technology doesn't actually work.
> The AI premium isn’t even reliable. By late 2025, Goldman Sachs group Inc. found that investors were actually punishing AI-attributed layoffs, with shares falling an average of 2%. The analysts concluded that investors simply didn’t believe the companies. But Block’s surge shows the incentive hasn’t vanished. It’s just a lottery instead of a sure thing. And executives keep buying tickets.
> The broader data confirms the gap between narrative and reality. A National Bureau of Economic Research study published in February surveyed thousands of C-suite executives across the US, UK, Germany and Australia. Almost 90% said AI had zero impact on employment over the past three years. Challenger, Gray & Christmas tracked 1.2 million layoffs in 2025, and AI was cited in fewer than 55,000 of them. That’s 4.5%. Plain old “market and economic conditions” accounted for four times as many.
So! Sophisticated capital market participants don't believe this; why do people here?
AI is making CEOs delusional [video] - https://www.youtube.com/watch?v=Q6nem-F8AG8
> Rate the occupation's overall AI Exposure on a scale from 0 to 10.
The sad part isn't that this is low-effort AI slop, but that intelligent people and policy makers are going to see it and probably make important decisions impacting themselves and others based on these numbers.
All the "research" on the site comes from a single LLM prompt.
And for whatever reason a lot of people in startup/tech seem to have a huge Dunning-Kruger effect blind spot where they believe knowing a lot about one thing makes them an expert in everything.
This used to just be funny, but when it started to intersect with politics it began to actively contribute to destroying society. It isn't funny anymore.
(I don't think Karpathy's job data here is destroying society, this is a more generalized observation).
This is the equivalent of telling a designer that they can't create infographics on anything but principled design subjects -- or else they're out of line. Any research or data they might use isn't relevant because they're not experts? lol?
It is a website that visualizes the output of an LLM prompt and passes it off as data. Big difference between the two.
It's especially(!) very common for people who made an exit and are now "wealthy" - sure, they can afford to have an opinion on everything, but very often they are just talking bullshit, thinking: "hey, I made it in field X, so why not try field Y".
Especially the "MBA crowd" is famous for this: for whatever reason they think they are more intelligent than an engineer who, e.g., filed a patent (while most of the MBA bobos would fail just at acquiring all the documents required for it).
Other example: if you once wrote a book and it got traction, you will be invited to television shows etc., even if you are not a proven expert (and MORE than the people who are real experts with a proven track record).
There is definitely an impact on software engineering jobs at the moment: interns/juniors are struggling to find jobs, and companies are squeezing every bit of dev slack time to produce more stuff with AI.
Is that notion supported by this content? The BLS Outlook for most software engineering jobs is in the "much faster than average" growth range.
* Yes, software engineering jobs can grow - through increased demand for custom software, thanks to what coding agents unlock
* AI can still impact those jobs - by turning software engineers into LLM code approvers
What would be useful is tracking the change in minimum pay per hour from legitimate job listings, now that there are quite a few states that require posting pay ranges on job listings.
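A rough sketch of what that tracking could look like - the listing format and field names here are made up for illustration, not from any real job board's API:

```python
# Sketch: given job listings that post pay ranges (as several states now
# require), track how the floor of advertised hourly pay moves month to month.

def min_pay_by_month(listings):
    """listings: iterable of dicts with 'month' and 'pay_min' (hourly USD).

    Returns the lowest posted minimum pay seen in each month."""
    floor = {}
    for job in listings:
        month, pay = job["month"], job["pay_min"]
        if month not in floor or pay < floor[month]:
            floor[month] = pay
    return floor

listings = [
    {"month": "2026-01", "pay_min": 42.0},
    {"month": "2026-01", "pay_min": 55.0},
    {"month": "2026-02", "pay_min": 40.0},
]
print(min_pay_by_month(listings))  # {'2026-01': 42.0, '2026-02': 40.0}
```

In practice you'd probably want the median of posted minimums rather than the absolute floor, since a single lowball listing shouldn't dominate the trend.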