Jobs used to laugh at Microsoft for all manner of inconsistencies in behaviour and user experience in Windows, but now Apple is contending with the same problem, partly through sheer exposure: macOS has never been so popular and prevalent, and an ever-growing number of eyes is calling out the inconsistencies that have been appearing more and more frequently without Jobs' leadership style.
The one thing that distinguished Jobs from everyone since is that he was Apple's greatest fanboy. Watch the iTunes introduction: Jobs sits there and showcases every feature and function for around two hours. He was so into the product that this keynote is, for me, the nerdiest he ever gave.
His other keynotes likewise show him as the company's No. 1 fan, hosting every feature there is.
Imagine having a boss like that. He set the standard for product development in every regard.
And this is what slipped. Consistency is lacking, and according to biographies of Cook, he has a very strong focus on himself as a person. This is always wrong. It is about the product, nothing else.
There will never be a Jobs again. And it is getting worse from here: the old guard is mostly gone. Even the myth of Steve Jobs is nothing Gen Z cares about.
We live in the post-Jobs phase and Cook seems to be overshadowing Jobs, as sad as this is. All the innovations except the headphones date back to Jobs; all the scale Apple has reached is due to Cook.
I bet Jobs would rather have a way smaller scale with great products. This luxury lifestyle is nothing Jobs liked.
Sad, but true.
And the updates to Music (formerly iTunes) are so bad the entire team should be dressed down, Steve Jobs style.
If the biggest flaw of an OS is the border radius of its windows, you've got yourself a pretty decent OS!
It's not gonna make me leave my darling Linux, ofc, but i think this whole debacle can only be interpreted as praise.
On second thought, it might also be considered a meditation on people's tendency to bike-shed.
Or to say it another way: if we see shit like this, then we know the whole thing is a hack.
Not because they necessarily cared, but because it functions as an easy-to-verify proxy for whether the venue actually read the contract.
I never said that
As a related anecdote, my friend said my car was ugly. I asked him what cars he thought looked good. He said “I don’t like cars”. As a result I realized his opinion was worthless
I guess you are only interested in the desktop looks part which on Linux is done by different window managers (like KDE, Gnome, Sway, ...) which can compete with MacOS in my view.
I was recently forced to switch from Gnome to MacOS Tahoe and the UX is so bad it's frustrating. Mission Control has no features apart from switching windows, it seems (you can't close windows or change dock icons, both of which work on Gnome). Password fields often have no option to view the cleartext entered. This is especially confusing because symbols that I used daily are suddenly not printed on my keyboard anymore and I have to memorize shortcuts to enter them. In Finder I see no way to go to the parent folder; isn't that something people on Macs do? It just feels like it's years behind open source alternatives...
Concerning your car story: have you tried other operating systems? Otherwise your opinion might be worthless here...
Currently, MacOS has the worst window management compared to Windows and (all) the Linux desktop environments. I mean, where else do you have such problems with resizing windows or just switching between windows, not to mention the inconsistent feature sets when you want to work with virtual desktops...
For example, there is not much you could do to Finder to make it worse.
This argument would also make Windows 11 a pretty decent OS by extension via "If the biggest flaw of an OS is the position of the start menu, you've got yourself a pretty decent OS".
In general I could use any minor nuisance as proof of decency - or, as a manufacturer, inject some on purpose to manufacture this argument.
People don't like it when their environment changes in minor unsolicited ways. There's always gonna be fuss about these things, and that means the fuss itself can't be used to make any strong argument whatsoever.
That’s way more than just the “position of the start menu”
As someone who works on Windows, Mac, and Linux; Windows stands alone in my opinion as the "stepping on legos with no socks on" of operating systems.
Apple design is only different on release, after a few months I start getting force fed apple-isms in programs that don't have anything to do with them.
To my designer's eye it was the first thing I saw; to him it was nothing.
I still think it's bad and a sign of a change in Apple's focus/style, but it's clearly not an issue at all for a lot of people.
Said colleague did get cross when he struggled to resize a window though. Turns out inconsistent corners means inconsistent handles. And that is a real problem.
There are loads of other flaws with the OS. It just so happens that people care a lot about the design of Apple's products, so people talk about these details.
MacOS has been shit for as long as I've used it (8 years) and probably for much longer than that. There are many lists available of MacOS problems (https://old.reddit.com/r/MacOS/comments/12rw1sn/a_long_list_... for example), it's just that there's not much point making a new article about the Finder that's been shit, and unchanged, for a decade.
My computer was running so slowly that I had to minimize transparency in system preferences somewhere. I think I also turned off opening every app in its own space. And I hid the icons on the Desktop in Finder settings somehow, which helped a lot. There are countless other little tweaks that are worth investigating.
I also highly recommend App Tamer (no affiliation). It lets you jail background apps at 10% cpu or whatever. It won't help with WindowServer or kernel_task (which also often runs at 100+% cpu), but it's something.
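If you don't want a third-party tool, the built-in taskpolicy command might get you partway there. A minimal sketch, assuming a hypothetical PID of 1234 and that background QoS (rather than a hard percentage cap) is good enough:

taskpolicy -b -p 1234   # demote PID 1234 to background scheduling (throttles CPU and IO)
taskpolicy -B -p 1234   # undo, restoring normal scheduling

It won't pin a process to an exact 10% the way App Tamer does, and like App Tamer it won't help with WindowServer or kernel_task.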
I can't help but feel that there's nobody at the wheel at Apple anymore. When I have to wait multiple seconds to open a window, to switch between apps, to go to my Applications folder, then something is terribly wrong. Computers have been running thousands of times slower than they should be for decades, but now it's reaching the point where daily work is becoming difficult.
I'm cautiously optimistic that AI will let us build full operating systems using other OSs as working examples. Then we can finally boot up with better alternatives that force Apple/Microsoft/Google to try again. I could see Finder or File Explorer alternatives replacing the native ones.
I've been hearing this complaint for decades and I'll never understand it. The suggestion seems completely at odds with my own experience. Regardless of OS, they all seem extremely fast, and feel faster and faster as time goes on.
I remember a time when I could visually see the screen repaint after minimizing a window, or waiting 3 minutes for the OS to boot, or waiting 30 minutes to install a 600mb video game from local media. My m2 air with 16gb of memory only has to reboot for updates, I haphazardly open 100 browser tabs, run spotify, slack, an IDE, build whatever project I'm working on, and the machine occasionally gets warm. Everything works fine, I never have performance issues. My linux machines, gaming pc, and phone feel just as snappy. It feels to me that we are living in a golden age of computer performance.
Now in iOS 26, you can just be typing in Notes or the Safari address bar, for example, and the keyboard will randomly lag behind and freeze, likely because it is waiting on some autocomplete task to run on the keyboard process itself. And this is on top-of-the-line, modern hardware.
A lot of the fundamentals that were focused on in the past to ensure responsiveness to user input have now been lost. And lost for no good reason, other than lazy development practices, unnecessary abstraction layers, and other modern developer conveniences.
Nowadays seems like half of Apple’s own software blocks on their main thread, like you said things like keyboard lock up for no reason. God forbid you try to paste too much text into a Note - the paste will crawl to a halt. Or, on my M4 max MacBook, 128GB ram, 8tb ssd, Photos library all originals saved locally - I try to cmd-R to rotate an image - the rotation of a fully local image can sometimes take >10 seconds while showing a blocking UI “Rotating Image…”, it’s insane how low the bar has dropped for Apple software.
10% of the time, WindowServer takes off and burns 150% CPU. Or I develop keystroke lag. Or I can't get a terminal open because Time Machine has the backup volume in a half-mounted state.
It's thousands of times faster than the Ultra 1 that was once on my desk. And I can certainly run workloads that fundamentally take thousands of times more cycles. But I usually spend a greater proportion of this machine's speed on the UI, and responsiveness doesn't always beat what I had 30 years ago.
Spotlight doesn't make sense either... caches get evicted, but there's no logic that prevents it from building them back up immediately.
Log processes are fine, but they should never be able to use 100% of the CPU, or run at the same priority (CPU + IO) as interactive work.
One analogy: the distance between two places in the world hasn't changed, but we're not arriving significantly faster than we did once modern jetliners were introduced. There was a period of new technology, followed by rapid incremental progress toward shortened travel times, until it leveled off.
However, the number of people able to consistently travel between more places in the world has continued to increase. New airports open regularly, and airliners have been optimized to fit more people, at the cost of passenger comfort.
Similarly, computers, operating systems, and their software aren't aligned in optimizing for user experience. Up to a certain point, user interactions on MacOS took highest priority, which is why a single- or dual-core Mac felt more responsive than today's machines, despite the capabilities and total work capacity of new Macs being orders of magnitude higher.
So we're not really even asking for the equivalent of faster jet planes, here, just wistfully remembering when we didn't need to arrive hours early to wait in lines and have to undress to get through security. Eventually all of us who remember the old era will be gone, and the next people will yearn for something that has changed from the experiences they shared.
Photons travel about 1 foot per nanosecond ... so the CPU can execute MANY instructions between the time photons leave your screen and the time they reach your eyes. (At a two-foot viewing distance that's about 2 ns; a wide modern core running at 4 GHz can retire on the order of a few dozen instructions in that window.)
Now, on Windows start Word (on a Mac start Writer) ... come on ... I'll wait.
Still with me? Don't blame the SSD: quit it and load it again, this time from the cache.
Weep.
It is one where the reaction is under a single frame from the action. EDIT: and a frame is 1/60 s, that is 16.(6) ms. I feel bad that I have to mention this basic fact.
This was possible on 1980s hardware. I witnessed that, I used that. Why is it not possible now?
This very much depends on what hardware you have and what you're doing on it (how much spare capacity you have).
Back in university I had a Techbite Zin 2, it had a Celeron N3350 and 4 GB of LPDDR4. It was affordable for me as a student (while I also had a PC in the dorm) and the keyboard was great and it worked out nicely for note taking and some web browsing when visiting parents in the countryside.
At the same time, the OS made a world of difference and it was anything but fast. Windows was pretty much unusable and it was the kind of hardware where you started to think whether you really need XFCE or whether LXDE would be enough.
I think both of the statements can be true: that Wirth's law is true and computers run way, way slower than they should due to bad software... and that normally you don't really feel it due to us throwing a lot of hardware at the problem to make us able to ignore it.
It's largely the same as you get with modern video game graphics and engines like UE5, where only now we are seeing horrible performance across the board that mainstream hardware often can't make up for and so devs reach for upscaling and framegen as something they demand you use (e.g. Borderlands 4), instead of just something to use for mobile gaming.
It's also like running ESLint and Prettier on your project and having a full build and formatting iteration take like 2 minutes without cache (though faster with cache), HOWEVER then you install Oxlint and Oxfmt and are surprised to find out that it takes SECONDS for the whole codebase. Maybe the "rewrite it in Rust" folks had a point. Bad code in Rust and similar languages will still run badly, but a fast runtime will make good code fly.
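If anyone wants to try that comparison themselves, a rough sketch, assuming a Node project with both tools available via npx (oxlint doesn't honor every ESLint rule, so treat the numbers as ballpark):

time npx eslint .   # baseline lint over the whole tree, often minutes without cache
time npx oxlint .   # same tree with the Rust-based linter, typically seconds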
I could also probably compare the old Skype against modern Teams, or probably any split between the pre-Electron and modern day world.
Note: runtime in the loose sense, e.g. compiled native executables, vs the kind that also have GC, vs something like JVM and .NET, vs other interpreters like Python and Ruby and so on. Idk what you'd call it more precisely, execution model?
The modern throughput is faster by far. However, what some people mean when they talk about "slower" is the latency snappiness that characterizes early microcomputer systems. That has definitely gotten way worse in an empirically measurable fashion.
Dan Luu's article explains this very well [1].
It is difficult today to go through that lived experience of low latency, because you don't appreciate it until you've lived with it for years. Few people have access to an Apple ][ rig with a composite monitor for years on end any longer. The hackers who experienced that low latency never forgot it, because the responsiveness feels like a fluid extension of your thoughts in a way higher-latency systems cannot match.
Some apps do respect it, but sometimes it's hardcoded, and OS settings don't seem to override it. Even the OS doesn't respect it in some cases, but I think it used to. Flutter apps? Forget about it.
That's because some app is spamming window updates.
It's been an ongoing problem for many releases. AFAICT, WindowServer 100% CPU is a symptom, not a cause.
FWIU there's really no backpressure mechanism for apps delegating compositing (via CoreAnimation / CALayers) to WindowServer which is the real problem IMO.
If Apple gave insight into this, the developers would get bug reports and complaints.
Similar to the electron shit
Why? No one has shown that LLMs produce particularly good code. You can get a lot of useful shit done with what is still slop, but there is no reason to believe there's any evolutionary improvement.
But when RAPL and similar mechanisms are used to throttle the CPU, the CPU time gets reported as kernel_task - on Linux it would show up similarly as one of the kernel threads.
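A quick way to watch for this on macOS, sketched with stock tools (two top samples because the first sample's CPU numbers are unreliable):

top -l 2 -o cpu -n 10 -stats pid,cpu,command   # second sample: is kernel_task at the top?
pmset -g therm                                 # on some machines, shows CPU_Speed_Limit while throttled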
Even if that would be possible, you can't run commercial software. And for many people, the software they run is more important than the OS.
If you use SIP and use package managers (npm, cargo, pip, etc) outside of a VM you are substantially more vulnerable to attack than someone who doesn't use SIP and doesn't use package managers.
So if you want to fix your corners, you can do it guilt-free by adopting some better security practices around the malware delivery systems / package managers that you have installed on your computer.
SIP guarantees that you will be able to turn on your computer in safe mode and remove the malware, whereas without it your OS is toast.
If I had malware then the fate of the hardware is at the bottom of my priority list, I'm probably going to be replacing it anyway. I'd be more concerned that someone is going to steal my AWS credentials to run a cryptominer and I get a bill for hundreds of thousands of dollars!
The only solution to malware is to not install it in the first place. By the time SIP is useful you are already very screwed. SIP makes you safer in the same way that having a parachute on a plane makes you safer, technically yes but the difference in safety is marginal.
Do you have a system in mind that prevents the user from doing this?
Sure, macOS could adopt an iPad-style security system that refuses to run all software outside the App Store. It works on iPhone and iPad just fine, all the prosumers love it.
It's not like native Darwin triples are a popular compilation target. There wouldn't be any vast tragedy if the macOS shellutil authors were told to use zsh in a VM instead; it would separate the parts of macOS that Apple cares about from the parts they don't seriously support. WSL and Crostini achieve this on vastly weaker hardware with great results.
https://news.ycombinator.com/item?id=47282085#47310011
Probably my least favorite redesign in the whole update. Why is everything an oval? It's just bizarre.
> Now at least everything is consistently bad. #Programming
My neurodivergency makes me feel actual distress over those corners. I am not being dramatic. It sucks.
Does anyone actually do this? Especially for heavy-duty applications like my web browser and IDE, this has always felt like a bizarre assumption to me.
IMO, this has been their assumption for years, and it actually turned me off when I tried getting used to Mac circa 2006-2007. Coming from Windows at the time, I just couldn't get over a weird anxiety that my application window wasn't maximized, because it didn't look like it completely snapped into the screen corners.
Now, using 34-inch ultrawide monitors almost exclusively, I never maximize anything... it'd be unusable.
Browsers only ever get maximized to the left/right half screen for me too
Which is something macos should really improve on though, the ux is pretty bad compared to Windows and Linux there
Obnoxiously, it's part of the recent trend of overloading the Globe/Fn key, so it's hard to do with third-party keyboards.
Hover over the green button in the top left of the window. I recently found out about that menu for moving a window between screens, which is also an option it has. (I also just found them in the Window menu if you prefer that. I don't; the options take an extra level of hovering to get to.)
Weirdly it still doesn't quite do what I want. It leaves a gap around the edge of the window for some reason.
This is the biggest reason I love Linux. I can choose my own desktop, or even forsake the desktop entirely for a simpler window manager, without changing operating systems. Some are hyper focused on a tailored experience (gnome) while others let you configure to your heart's content (kde).
There are sacrifices to be made, of course, but not having to live under the oppression of Apple's benevolent-dictator designers is absolutely worth it for me.
Every MacOS app has a menu item explicitly made for this exact thing. It's often the third item in the menu:
File Edit View
But they refuse to put these viewing options under the View menu item. Why? Why would you not put these really great viewing options under View?

I maximize windows of graphics and video editors.
All the rest I'd prefer to just summon as-needed and then dismiss without navigating away from the windows I care about.
sway/niri want me to tile every window into some top-level spot.
Took me a while to admit it, but the usual Windows/macOS/DE "stacking" method is what I want + a few hotkeys to arrange the few windows I care about.
It sounds like the scratchpad may be especially close to what you want.
[1]: https://github.com/esjeon/krohnkite
[2]: https://github.com/paulmcauley/klassy
Apple then made things go full screen, but in a special full screen mode, so macOS worked more like the iPad.
By the time they added a way to maximize windows in the way Windows does, the idea of maximizing an app has largely worked its way out of my workflow. It was always too much trouble, and I find very few apps where it provides much benefit. Web browsers, for example, often end up with a lot of useless whitespace on the sides of the page, so they work better as a smaller window on a widescreen display. In an IDE, it really depends on what’s being worked on and if text wrapping is something I want. Ideally lines wouldn’t get so long that this is a problem.
With the way macOS manages windows, I often find it easiest to have my windows mostly overlapped with various corners poking out, so I can move between app windows in one click. The alternative is bringing every window of an app to the front (with the Dock or cmd+tab), or using Mission Control for everything, neither of which feels efficient.
I could install some 3rd party window management utility, I suppose, but in the long run, it felt easier to just figure out a workflow that works on the stock OS, so I can use any system without going through a setup process to customize everything. It’s the same reason I never seriously got into alternative keyboard layouts.
Full screen one. Switch to the other. Now, use just cmd-tab and cmd-` to get to the full screen safari window (cmd-` switches between windows in the same application, which is literally never the right thing, but I digress).
For what it's worth, the third party tool 'altTab' mostly fixes this.
Bonus MacOS UI bug: I had to exit altTab to confirm they still hadn't fixed cmd-`. When I re-opened it using cmd-space, finder defaulted to the version in ~/Downloads instead of /Applications, then read me the riot act about untrusted software trying to change accessibility settings.
One more thing: I'm still not using MacOS 26, so all my complaints are about the "last known good" release.
Except Safari, which just fills out the window's height vertically. Kinda weird to make an exception like that but I don't hate it, because I generally use Safari for reading, and shrinking the browser's width forces lines of text to not get too long if the website's styling isn't setting that manually.
When I use the Window menu, Zoom replicates what double-clicking the top title bar does, while Fill maximizes the window. This holds true with the behavior you describe in Safari as well.
It just seems like a lot of apps treat Zoom and Fill the same now (I tried Calendar, Notes, TextEdit, and NetNewsWire), which adds to the confusion.
After I got used to working in windows instead of full screen all the time, I can't really go back. Even on Windows I find myself working the way I do on macOS. Full screening every app made more sense on a 1024x768 screen (or smaller). Once I moved to a widescreen display (which happened to coincide with getting my first mac) running full screen felt like the wrong move most of time.
Web pages would look something like this:
| <- whitespace -> | <- content -> | <- whitespace -> |
| | Lorem ipsum | |
| | dolor sit amet, | |
| | consectetur | |
| | adipiscing | |
| | elit. Morbi | |
| | convallis ante | |
Making the window smaller meant less wasted space and less blinding white space. Once I got used to that idea, it carried over to most other apps.

Sorry if this comes across as disrespectful, but it smells like Stockholm syndrome. You are choosing not to use the full extent of your screen real estate, and that is your fine choice, but that is no excuse for making it hard. If you compound the whitespace, the thick borders, and the generally oversized UI controls, not much "productive space" remains available to get the work done. I am not interested in macOS as a content-consumption-first vehicle, though that's clearly where Apple is steering.
I am currently running a 16" display at a similar fractional scaled resolution (because Apple stopped understanding DPI after shipping the first LaserWriter, apparently).
Over that time, my eyes have not gotten better to match display DPI, so I'd rather have web sites just adjust the font size so that there are a reasonable number of words per line instead of rendering whitespace.
Non-full-screen windows would make more sense if Apple supported tiling properly, like most Linux WMs and also modern Windows.
MacOS sort of supports tiling in a "program manager shipped it + got promoted" sort of way, but you have to hover over the window manager buttons, which is slower than just manually arranging stuff. If there are any keyboard shortcuts to invoke tiling, or a way to change the WM buttons to not suck, I have not found them.
As for tiling in macOS...
You can use the mouse to drag windows into tiled positions. Grab a window and when your cursor hits the side, corner, or top edge of the screen, it will indicate the tiling position, much like AeroSnap on Windows from some years back. You can also hold the Option key while holding the window to get the tiling regions to show up without moving all the way to the edge.
Keyboard shortcuts exist as well. Go to Settings -> Keyboard -> Keyboard Shortcuts... In the dialog that opens, go to Windows. There you can see all the options and customize them if you'd like. Or set shortcuts for things that might not have one yet.
If for some reason dragging the windows around doesn't work, go to Settings -> Desktop & Dock -> the Windows heading. There are toggles to enable or disable dragging to tile, and the Option key trick. You can also turn off the margins on tiled windows, which you'd probably want to do.
I've never been a big fan of window tiling myself. There was a time when I needed a lot of different windows visible at all times, but that hasn't been the case in a long time. I find tiling makes things too big or small, it's never what I actually want. I drag the window up to the top of the screen to invoke Fill from time to time, but that's about it.
That was the Mac in the 1990s. It was designed for, and highly usable with, a one-button mouse. It didn't have hidden context menus or obscure keyboard shortcuts. Everything was visible in the menu bar and discoverable. The Finder was spatially aware with a high degree of persistence that allowed you to develop muscle memory for where icons would appear onscreen every time you opened a folder.
There was almost nothing hidden or lurking in the background, unlike today (my modern Mac system has 500 running processes right now, despite having only 15 applications open). We've had decades of feature creep since the classic Mac OS, which has made modern Macs extremely hard to use (relatively speaking).
Why is it that some of the most useful features in Apple products are impossible to find on your own? I recently also learned about "three finger swipe to undo" in iOS instead of shaking the damn thing like it owes me money.
It works well for me, makes it easy to get two things side by side without wasting space.
Full Screen Mode was their answer to maximize, going back many years now (10.7).
Obviously all of that works better if Finder windows don't usually fill the screen, but it's not a hard requirement.
(IMO the spatial Finder was designed around floppies and small folders and didn't work so well with hierarchical folder views, so no big loss...)
I've never found a setup with multiple desktops or similar that lets me switch between the apps I'm using more quickly than "editor slightly more left, browser slightly more right, ..." and just clicking on a border I know brings that app to the front. I'm sure many think I'm crazy. That's ok. :)
That said, I generally hate the new OSX UI. Every UI element that isn't usable just became larger and wastes space I should be able to utilize. Likewise, it made some operations insanely frustrating (here's looking at you, corner drag resize!).
I haven’t maximized a window in years. They look ridiculous like that. Especially web pages with their max width set so the content is 1/4 the screen and 3/4 whitespace.
If I ever accidentally full screen a window, and it’s not in night mode, I am instantly blinded by a wall of mostly white empty background!
I frequently use macOS on a projector, it doesn't quite fill my wall floor to ceiling but it comes close. I don't use full screen often, but I do it occasionally as a focusing strategy, and it's fine.
You're shining a bright light on a wall, which you are looking at.
With a monitor you are shining a bright light at your face, while staring directly at the lightbulb!
If you're using a monitor in the dark the way you use a projector, you should turn the backlight down. If you're using it in a well lit room, the brighter backlight should have less of an effect.
It sounds to me like you've never actually watched a monitor display large swaths of white before; it's brighter than light hitting a wall for sure, even with the brightness down, extra so when the ambient lighting is dark too.
The fact that it's bright outside when the sun is up might help, but it's nowhere near enough to compensate!
It’s probably a me problem, but I’m going to open stuff and then leave it scattered around all day. It’s fine.
I don’t use more than a couple of virtual desktops either. Just one for current tasks and one for background apps.
My actual biggest pet peeve with this setup is the vast number of web sites that deliberately choose to limit their content to a tiny column centered horizontally in my browser, with 10cm of wasted whitespace on each side.
I sometimes maximize something - other than video calls: those are always full-size - on the laptop screen, but otherwise not at all.
I can see how a full-screen IDE makes sense, but I don't use one, so I always want a couple of terminal sessions running alongside my editor.
There are vanishingly few contexts in which I find full-screen helpful. Not criticizing anyone else, or recommending my way of working, but it's what works for me.
[0] I would like better support for desktop management: naming and shortcutting, particularly. Years ago I tried some (I think it was Alfred, or a predecessor) add-on that promised that, but it was super flaky. Does anything exist that works well?
It’s so ingrained I tend to get frustrated on other desktops, which are nearly all built around the Windows mentality of keeping displays filled to the brim with tiled or maximized windows.
Even on the handful of times with maximize/tile on macOS, it’s with a gap of a few pixels of desktop peeking through so it doesn’t feel as “boxed in” and claustrophobic.
I think there's a conflict between the users who use it on studio displays and users who use it on 13-inch laptops. The Mac team at Apple won't pick a side or come up with two solutions.
That's not completely true, they've been pushing swipe between fullscreen apps for a while.
But that doesn't make any sense on an iMac.
So the recommendation from pro users is to use Alfred to manage windows.
The other day I was explaining to one that their design's fixed width looks silly once it gets up towards 4k resolutions. But the designer's main concern was whether people actually used full screen browsers on 4k monitors and whether there was any point in thinking about the design at that resolution.
There are plenty of times I enjoy having 2 browsers side by side, or even 4 browsers in a square, and being able to do that is one of the benefits of having a 4k monitor. But without a doubt the majority of my time is spent with a full-size browser window open, and I observe the same from all the other Windows/Linux users I manage who use a 4k monitor.
In the interest of keeping this post simple, I've ignored system display/UI scaling. But still... the question/assumption from the Mac designer completely blew my mind.
1. On a screen share support call with a mac user
2. Asked them to pull up a webpage
3. They pull up a super tiny ass browser window to the point I can't really see anything
4. I ask them to full screen the browser so we can actually read shit
5. The Mac user just straight up panics or acts like I've spoken an alien language to them.
The same process happens when I need a Mac user to get to an app's settings that on a Windows/Linux computer would normally be under something like File > Preferences/Settings. They have no idea what I'm talking about, or know just barely enough to know they don't remember how to do it, and panic. Then I have to go google it and remember that CMD+comma (⌘+,) exists and reveal it to the Mac user like it's actual black magic. And then I immediately forget about it until 6 months later, when I need to support a Mac user again and the whole cycle repeats.
I can’t tell if this is a serious comment or humor.
eta: I'm just saying, if I had a glowing half-drunk beer or partially eaten pizza on my laptop in a business meeting, I am getting weird looks. Just because you all normalized glowing fruit doesn't mean the rest of us take you seriously.
The assumption is that the window should be the size of the content of the document inside.
It turns out that this approach works well for many applications, especially what the mac was designed for in the 80s and 90s. And it's horrid for modern "pro" applications.
However, after the internship I went right back to fullscreen/window tiling in linux, so I can't say I really preferred it. Even now as a Gnome user with a big monitor and magic trackpad on my desk - which gives me ~equal access to either approach - I fullscreen everything.
Another component is how the ability to overlap windows is emphasized, allowing the currently relevant portion of them to be visible without taking center stage or stealing any space from your main window(s).
Both are part of a larger difference in mentality and workflow style.
But for other apps where interactions tend to be brief like Finder, Messages, Notes, Music, etc - yeah I don't usually expand them to full screen.
I use cmd+tab and cmd+~ a ton also as I have multiple browser profiles and windows open and usually a few instances of ide with different projects.
And always close tabs with cmd+w and apps with cmd+q to avoid running apps with no visible windows.
I feel super productive with this workflow, never need to fiddle with manual resize.
When someone is screen sharing and they have a bunch of random sized windows it drives me crazy.
On large external monitors, I think it makes total sense not to have every window maximized, though. Probably less usable that way.
I do have Rectangle installed, so apps generally get at most the left or right half of the screen, with a shortcut for badly behaved websites that need 2/3 to look right. Apps are usually pretty good about remembering window positions, so mostly you futz with it once and you're done.
Also just want to be 100% clear: Tahoe is bad and I hate the changes and I don't think the OS should prefer one way of working over the other. I just hope it's helpful to explain my perspective.
Personally I try and work with that as much as possible, though it’s not always ideal.
I have a 39" ultrawide and I keep every window maximized. I have OCD about this. I can't stand things all layered on top of each other. I like to focus on one screen at a time.
Chromium browsers have been rolling out split tabs and I use that on a couple of tasks where I'm constantly cutting/pasting between sites, but that's about it.
That said, I am a huge fan of manual window management.
Somewhat relatedly, we use Windows at work, and it drives me crazy when I hop on a computer after someone's been using it and they have every single thing maximized, even Windows Explorer, on 27" monitors. A maximized browser, I get... I don't do it myself but I understand how it can be useful, but maximizing Windows Explorer is just insane to me, and yet a lot of my coworkers do it.
A lot of the stupid things about Mac window management stem from the mistake of forcing all applications to share a single menu that's glued to the top of the screen. This essentially turns your entire desktop into ONE application's window, within which its actual windows float around.
Historically this led to the Mac's penchant for apps that spawned an irritating flotilla of windows that you had to herd around your screen. Not only did this deny users a way to minimize the whole app at once, but it also sucked because you could see everything on your desktop (or other apps' UIs) THROUGH the UI of the application you were trying to use. A dysfunctional mess.
Around 15 years ago, I estimate, the huge advantage of a single application window finally permeated the Apple mindset, and things have gotten much better in that regard. But Apple should have abandoned the single menu in the transition to OS X, and put menus where they belong: on applications' main frames. That would make the desktop a truly unlimited workspace and eliminate the daily irritation of the menu changing its contents behind the user's back because they clicked on another application's window (perhaps just to move it).
Multiple times a day I minimize an application and then attempt to do something in the application that's now filling the screen... only to find that the menu still belongs to the application that isn't even shown. It's just so dumb.
But then... this is the GUI that, for decades, would only let you resize windows from ONE corner and NO edges. Apple grudgingly, half-assedly, and unreliably addressed that in the 2000s, only now to make it even less reliable in the shambolic Tahoe UI.
In general my browser is dead center or slightly to the right so I can access my other windows (terminal, throw away text editor, etc) easily where command tab is insufficient (when I have multiple terminal windows, eg)
Strange, I constantly get annoyed by how slow and unresponsive the Mac's tiling is when dragging windows to the edge. At the top it has at least half a second delay for no reason. But at least the newest version now has caught up with Windows 7.
But I do use apps fullscreen when I'm traveling. The laptop screen is too small to use Chrome or VSCode any other way.
As you said, browser and IDE are the big exceptions, plus things like Lightroom or my 3d printer's slicer.
Even VS Code usually lives as a smaller window when I'm using more a text editor rather than as an IDE.
I have been using it for years and I just gave up entirely on managing anything; if I zoom out to see all my windows, it looks like the freaking Milky Way, full of windows I forgot about.
Yes (but not for a browser). My terminal windows are 80x24, pretty much always. I do this today on Linux, I've done it through multiple versions of Windows, and I did it in my childhood on a 9" B&W "luggable" Mac screen.
I just like it, okay?
so in response I changed my windowing strategy to having a set of windows floating around at exactly the size I want them, and then the advantage of the enormous screen is just how many windows I can have open at once
that being said, I use KDE not MacOS, and 90% of Mac users I'd guess are on laptops, so using this strategy sounds completely insane to me. On laptops I still default to fullscreening or "half-screening" most apps.
I would in fact say that the culture of not maximizing windows was a small reason why I switched to Mac OS X in the early 2000s.
Or just use the taskbar, which is literally made for switching between windows. Or it was, before Microsoft forgot its purpose.
Meanwhile, I want to use my graphical, multi-window, preemptive-multitasking operating system to, you know, use multiple applications at the same time.
Trying to maximize a window, even 23 years ago when I first moved to OS X, was a completely manual process. It was designed around windows, not walls. And screens were much smaller and lower res back then.
This goes towards something that I've felt for a little while: at some point in time around the early 2000s, operating system vendors abdicated their responsibility to innovate on interaction metaphors.
What I mean is, things like tabbed interfaces got popularized by Web browsers, not operating systems. Google Chrome and Firefox had to go out of their way to render tabs; there was no support built into the OS.
The OS interfaces we have now are not appreciably different from what we had in the early 2000s. It seems absurd that there has been almost no progress in the last 25 years. What change there has been feels like it could have been accomplished in user-space, plus it doesn't get applied consistently across applications, thus making it feel like not a core part of the OS.
MacOS in particular was supposed to put an emphasis on the desktop environment being the space for window- and document-level manipulation, as exemplified by the fact that applications did not have their own menu bars. All application menu bars were integrated together at the top of the screen. Why should it be any different with any other UI organizational feature? Shouldn't apps merely be a single window pane, accomplishing a single thing, so that you combine multiple apps together to get something akin to an IDE out of them?
Well, I don't know if they should be. But they can't. Because OS vendors never provided a good means to do it. Even after signalling they wanted it.
Look at how older versions of Word, Excel, and Visual Studio worked. The tool trays stay consistent as you move between document windows. The entire application is minimizable and quittable together as one.
Photoshop still uses this metaphor. In the early and mid-2000s, Photoshop on Windows had a window for the application separate from the documents, but on Apple OS 9 and OS X, the only representation of the application itself was in the menu bar. Document windows and tool-tray windows both floated in the same desktop space as every other window.
I haven't checked on the GNU Image Manipulation Program, but I seem to remember it retained the same "no application window, tooltrays and doc windows exist in the DE" metaphor for much longer than Photoshop.
There is also a difference in the way that Chrome renders tabs in the window title area. That's a part of the UI chrome that one would expect to be in the purview of the UI toolkit, but Google took it on themselves.
https://en.wikipedia.org/wiki/Tab_(interface)
Don Hopkins himself can enlighten us about it (NeWS) better than literally anyone in this thread, just wait.
I suppose you could splurge for a Mac desktop and then get the cheapest, smallest screen possible, but I hope it’s rare.
Any space not used for the task I'm focused on is wasted. For me the actual problem is that switching apps/windows is too slow because of UI animations.
I'd like to be able to snap things to the middle third, especially on the ultrawides.
Only little calculator widgets, property panels, and modal dialogs that get immediately closed after use don't get maximized or at least docked to fill some region. I hate the cluttered, layered feeling of having a bunch of non-full-screen windows overlapping, I want to have a dozen apps open and making optimal use of the available display area.
Lots of native applications also pop up multiple windows with the expectation that they kind of just float around. But at least in Mac you can scroll on an app that isn’t in focus…
Same as in Windows. It just makes sense.
There are apps that they need to run in the background, sure. They have a spot in the menubar.
Oh no, I forgot, you can only have 5 of them. Not 6. Why? Because FU. Go buy a third party app (Bartender) that records your entire screen to do basic app management that the OS should do.
I hate MacOS.
When Adobe suite was de facto standard for designing and coding interfaces (you know, Flash) their own software was so immensely bad that there was enough material for a guy to make fun of them on a daily basis for a good couple of years.
they tried to do something with remembering "how you left things" between sessions, and even when disabled things are still weird...
Also, some power-management-related hooks are not working as well as before. Like if you put the computer to sleep at night and wake it up in the morning, the automatic dark-to-light theme switch doesn't trigger. At least not always.
Still the best system to work with though!
There are things which definitely do bother me like the Liquid Glass, but the window corners really don't bother me. And I'm into design and constantly inspect parts of ui with Digital Color Meter app.
I used to roll my eyes at the complaints until I actually had one of these, and it is appallingly bad engineering. Especially since the previous design, which was functionally identical, just needed a 10-second battery swap.
defaults write -g com.apple.SwiftUI.DisableSolarium -bool YES
Makes things bearable... note: you have to log out and back in to see the change.
You still need to then turn transparency off via Settings; see this:
https://tidbits.com/2025/10/09/how-to-turn-liquid-glass-into...
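If you'd rather script the transparency part too, the long-standing accessibility default should map to the same switch (an assumption on my part that it still holds on Tahoe; like the command above, it may need a log-out to fully apply):

defaults write com.apple.universalaccess reduceTransparency -bool true   # same as Accessibility > Display > Reduce transparency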
What does it say?
It doesn't render for me either, but is in the HTML at path...
.../html/body/div/div/main/div[3]/div[6]/div/div[2]/div/p
Edit: SIP has a series of control bits for a diverse set of protections. You can see what these control (and which bits "csrutil disable" toggles) in this include file: https://github.com/apple-oss-distributions/xnu/blob/f6217f89...
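You can inspect the current state from a normal boot (changing it still requires Recovery):

csrutil status   # prints e.g. "System Integrity Protection status: enabled."
                 # with a per-protection breakdown if it's only partially disabled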
The platform would aggregate by major/minor version, and you could see in totality whether the current version of macOS/iOS would make Steve proud or miserable.
Ultimately I decided against it, for defamation/cease-and-desist reasons, and not wanting to find out. But it needs to exist.
Apple traditionally burned out its talent, and is no longer structured to follow Jobs' original vision. There is a lot of goodwill with the users, but just like Sony/HP/IBM/Microsoft/Sun, it can't last forever. The process-people entrench themselves, and ruin everything... just as Jobs predicted. =3
this is actually one of the reasons i ended up going all in on a tiling wm (aerospace). once you're tiling, windows are edge to edge so the corner radius thing mostly disappears. the trade off is giving up floating windows.
the DYLD_INSERT_LIBRARIES approach is clever though. making everything consistently rounded is way more pragmatic than fighting apple's design decisions or disabling SIP.
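for anyone curious, the launch side of that approach looks roughly like this; the library path and app name here are made up, and SIP makes dyld ignore DYLD_* variables for Apple-signed binaries, so it only works on third-party apps that don't enforce library validation:

DYLD_INSERT_LIBRARIES=/usr/local/lib/roundcorners.dylib \
    /Applications/SomeApp.app/Contents/MacOS/SomeApp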
True, the "blessing" of forced online accounts, telemetry and advertisement didn't arrive to MacOS, yet. But, I wonder how long it will take us to get there.
Ads in a start menu can die in a fire though.
If you want ads in Spotlight or Launchpad, telling people to tolerate "opinionated, and likely worse but also not breaking" features is exactly how you get it. It's how Windows got there.
Now they sell expensive but nice hardware and they have mediocre software.
It seems you can only choose one out of three, nice hardware, nice software, good price. Apple is always choosing high price, and they either gave customers nice hardware or nice software, but not both.
Rounded corners are just... bizarre. Just because the laptop casing is physically rounded!? (Yet the menu bar squares it off at the top, and the bezel squares it off at the bottom...)
Not really; if you have malware with root access on your system, I think you're already pretty screwed, especially considering that you don't even need root to read all your saved passwords and personal files: https://xkcd.com/1200/
The number of times I have noticed the corner of my windows is precisely zero because each important application gets its own workspace, so the window frame doesn't get rendered. Sometimes I'll tile two windows side by side on my external monitor but even then this is a complete non issue for me.
Are you guys just running everything on the one desktop workspace in windowed mode? That seems like madness.
That means there are exactly two of us.
I get the UI consistency thing, but it's okay to transition to new UI elements gradually rather than making radical changes all at once. If this is still an issue 2 years from now, it will be more of a concern about their commitment.