Not only is this an insanely cool project, the writeup is great. I was hooked the whole way through. I particularly love this part:
> At this point, the system was trying to find a framebuffer driver so that the Mac OS X GUI could be shown. As indicated in the logs, WindowServer was not happy - to fix this, I’d need to write my own framebuffer driver.
I'm surprised by how well abstracted MacOS is (was). The I/O Kit abstraction layers seemed to actually do what they said. A little kudos to the NeXT developers for that.
I felt similarly. The learning curve was a tad steep, especially since I had never written a driver before, but once I figured out how to structure things and saw the system come alive, I grew to appreciate the approach IOKit takes.
With that said, I haven't developed drivers for any other platforms, so I really can't say if the abstraction is good compared to what's used by modern systems.
IOKit was actually built from the ground up for OS X! NeXT had a different driver model called DriverKit. I've never coded against either, but my understanding was they're pretty different beasts. (I could be wrong)
That said, the abstraction layer here is indeed delightful! I know that some NetBSD devs managed to get PPC Darwin running under a Mach/IOKit compatibility layer back in the day, up to running XQuartz on NetBSD, with NetBSD translating the IOKit calls. :-)
There’s a great video of a NeXT-era Steve Jobs keynote floating around—I think the one where he announces the x86 port as NeXT was transitioning to a software-only company—where he specifically calls out DriverKit and how great it is.
Steve was not a developer but he made it his business to care about what they cared about.
Yeah - even from the start, I remember NeXT marketing spending a disproportionate amount of their time selling NeXT's "object technology": AppKit and Interface Builder, DPS as an advanced graphics model. It was a good hunch from Steve, given how modern NeXTSTEP feels in retrospect.
For some reason, though, it means that people overlook how NeXT’s hardware was _very_ far from fast. You weren’t going to get SGI level oomph from m68k and MO disks.
Yes, you're right! I'm just a dolt who's never checked what a .kext on OS X actually is.
I had been under the impression that DriverKit drivers were quite a different beast, but they're really not. Here's the layout of a NeXTSTEP ".config" bundle:
The driver itself is a Mach-O MH_OBJECT image, flagged with MH_NOUNDEFS (except for the _reloc images, which are MH_PRELOAD - no clue how these two files relate/interact!).
OS X added a dedicated image type (MH_KEXT_BUNDLE) and they cleaned up a bit, standardized on plists instead of the "INI-esque" .table files, but yeah, basically the same.
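If you want to poke at these images yourself, here's a quick C sketch that classifies a 32-bit Mach-O by its header filetype. The constants come from <mach-o/loader.h>, redefined here so it builds anywhere; error handling is minimal.

```c
#include <stdio.h>
#include <stdint.h>

#define MH_MAGIC       0xfeedface  /* 32-bit Mach-O, host byte order */
#define MH_CIGAM       0xcefaedfe  /* byte-swapped (e.g. a PPC file read on x86) */
#define MH_OBJECT      0x1         /* .o-style image: NeXT DriverKit drivers */
#define MH_PRELOAD     0x5         /* preloaded executable: the _reloc images */
#define MH_KEXT_BUNDLE 0xb         /* OS X kernel extension */
#define MH_NOUNDEFS    0x1         /* flag: no undefined symbol references */

static uint32_t swap32(uint32_t v) {
    return (v >> 24) | ((v >> 8) & 0xff00) | ((v << 8) & 0xff0000) | (v << 24);
}

int main(int argc, char **argv) {
    /* mach_header layout: magic, cputype, cpusubtype, filetype, ncmds, sizeofcmds, flags */
    uint32_t hdr[7];
    if (argc < 2) return 1;
    FILE *f = fopen(argv[1], "rb");
    if (!f || fread(hdr, sizeof hdr, 1, f) != 1) return 1;

    int swapped = (hdr[0] == MH_CIGAM);
    if (!swapped && hdr[0] != MH_MAGIC) { puts("not a 32-bit Mach-O"); return 1; }

    uint32_t filetype = swapped ? swap32(hdr[3]) : hdr[3];
    uint32_t flags    = swapped ? swap32(hdr[6]) : hdr[6];

    printf("filetype=0x%x%s%s%s flags=0x%x%s\n", filetype,
           filetype == MH_OBJECT      ? " (MH_OBJECT)"      : "",
           filetype == MH_PRELOAD     ? " (MH_PRELOAD)"     : "",
           filetype == MH_KEXT_BUNDLE ? " (MH_KEXT_BUNDLE)" : "",
           flags, (flags & MH_NOUNDEFS) ? " (MH_NOUNDEFS)" : "");
    fclose(f);
    return 0;
}
```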
IOKit was almost done in Java; C++ was the engineering plan to stop that from happening.
Remember: there was a short window of time where everyone thought Java was the future and Java support was featured heavily in some of the early OS X announcements.
Also DriverKit's Objective-C model was not the same as userspace. As I recall the compiler resolved all message sends at compile time. It was much less dynamic.
Mostly because they thought Objective-C wasn't going to land well with the Object Pascal / C++ communities, given those were the languages on Mac OS previously.
Worth noting that Android Things did indeed use Java for writing drivers. And on Android, since Project Treble and the new userspace driver model in Android 8, drivers are a mix of C++, Rust, and some Java, all talking to the kernel via Android IPC.
Yes, and that's also the reason Java was originally introduced: Apple was afraid that the developer community, educated in Object Pascal / C++, wouldn't be keen on learning Objective-C.
When those fears proved unfounded and devs actually welcomed Objective-C, that's when they dropped Java and the whole Java/Objective-C runtime interop.
And there are enough parallels to Linux's stack that I'm thinking about looking through the Linux on Wii project and comparing how it handles framebuffer issues. I loved reading this whole post - crazy how many OSes have now been run on the humble Wii!
I’d say it’s more about how much explanation is needed. There are cool abstractions that require explanation because they aren’t intuitive at first, and then it clicks. But usually if I find endless explanations of why indirection is better because it aligns with someone’s conceptual model, that’s to me a bad abstraction. Not because it’s leaky, but because it resists understanding.
Excellent project! This is one of the topics that keeps Hacker News ever refreshing: work that feels like real hacking, in the positive sense of the word.
The author has mentioned earlier attempts to port other OSes to the Wii, but it appears those attempts didn't get much traction here on HN, except for Windows:
Lastly, since we are in the context of turning the Wii into a computer, an honorable mention: Hosting a blog on the Wii (622 pts, 104 comments): (https://news.ycombinator.com/item?id=43754953)
This is the most incredible part. I cannot even use a laptop adequately in an economy class seat, I cannot position the screen so that I could see it, and the keyboard so that I could type on it, at the same time. (To say nothing of connecting a Wii.)
I struggle so much to even comfortably play a handheld video game system on a plane, let alone use a laptop (I have tried that too), that I've mostly given up on even trying, and just line up a few albums on my phone to listen to and close my eyes as much as possible.
I can't imagine trying to program on a laptop with an external device, even something as portable and small as a phone, on a plane. I expect my frustration and frequently bumping things about would mean I'd get nothing done aside from having a bad time.
In one of the pictures, the laptop is on his tray, and the wii is on the tray of the seat next to him, and that seat looks empty. So the wii got its own airplane seat?
Huh - I know Apple’s first PowerBook 100 had an ad with Shaq on a plane, and then later one with Yao Ming… I guess Apple really wanted to crack the “I’m working on a plane damn it” market?
I can't imagine concentrating on a complicated project like that on the go, but I went back to stare in awe at said picture, and I think it's a train or bus. Still a flex.
Because of the mix of boredom, very shoddy internet that drops constantly, and ANC earbuds removing distractions, I often find myself getting in the zone while riding the train back home from the office. As the kids say, I lock in.
"I've now received my Cease and Desist letter from Nintendo over their Wii trademark. Feeling encouraged, I've written a full seven-world Super Mario Brothers sequel for macOS on the Wii that I've titled 'Newer Super Mario Brothers Wii Subsystem for macOS'"
I mean, you need WiFi, and that's definitely a roll of the die on flights. But the last flight I had had WiFi, and the gal who sat next to me was vibe coding something.
Meanwhile I was taking photos of the seat back infotainment system's map, which showed our ETA as being before we left. Sadly, we did not time travel.
What's flex-worthy about this? There's a lot of dev work that goes on in economy class airplane seats. Or are VC valley programmers so rich they fly business everywhere?
It's uncomfortable and awkward (the Wii was on his leg in the first shot), and often you need to break concentration and pack things up to let someone out of or into their seat.
So what you're saying is if it's flexing, it's entirely performative. gotcha.
I don't think that's a healthy way to look at it - dude was just getting some work done, but maybe I'm a broken human being who's churned out more code than I would ever admit while sitting in 32C on a cross-country flight.
Back in the day I was a hardcore Mac nerd and became a professional at it too. My best reverse-engineering trophy was building one of the first "iOS" apps back when there was no official App Store for the iPhone.
But man, this is way ahead of what I could do. What this dude accomplished blew my mind. Not only the output (running MacOS on a Wii), but the detailed post itself. A-MA-ZING.
As the author of the NetBSD Wii and Wii U ports, congrats! I’m looking forward to seeing how you solved some of the problems that I faced along the way.
Many thanks for all the code! The second-hand market is currently flooded with Wii U hardware, cheap enough to buy a lifetime's stock for peanuts. It would be amazing fun for PowerPC development, and in an alternate timeline where I went into low-level programming, I would love to push for OpenBSD support inspired by your work.
This reminds me of the 2008-2009 era, when Mac OS X Leopard ran as a Hackintosh on the Dell Mini 9 and some other netbooks.
At $349, it was almost a fully functional laptop running Mac OS X (compared to $1000+ MacBooks or $1599 MacBook Pros).
Two friends of mine were literally working remotely on a trip through Africa with Dell Mini 9s and mobile hotspots, doing video conferencing with Skype (over Wi-Fi).
Another funny historical fact: Windows Vista released in North America the same month as the Wii. People were so upset that the minimum requirement for Vista said 512 MB (already more than the average existing home PC of the time had without an upgrade), but it ran like crap unless you had more.
We truly had to get by with less back then. These days it feels like there is a bit more headroom: 8 GB is on the downtrend, 16 GB is becoming the most common, and the user's apps are enjoying the extra fat.
Recently I was ranting about this very thing: my work machine running Windows 11 has 16 GB of RAM, and Windows just sits at 8 GB on idle. My first laptop had 1 GB of RAM... and until I got my Mac (16 GB M1 Air), I managed with 4 GB of RAM while serving clients. Optimization seems to have been forgotten these days.
The reason it sits at 8GB on idle is... optimization. The memory is there to be used, so the OS will use it to improve performance until it's needed for more important tasks.
That's a blatant simplification, and does not match reality as far as I've seen.
The OS only has one large source of memory it uses "optimistically" - the file/buffer cache. But that's tracked separately, so it doesn't show up in normal memory usage numbers (unless you don't know how to read them).
The other source of "extra" memory usage is memory mapped executable files, which can be unmapped and then read back on demand. That's quite small though.
Everything else (mostly) is actual memory usage caused by actual drivers and programs (though it can be swapped, but that's a major perf hit).
The major reason for Windows memory bloat is the hundreds of inefficient services from both Microsoft and hardware vendors that run at startup. The "optimization" pool (let's not call it that) is way smaller than that.
E.g. pre-loading an application is a pessimization if there's not enough memory: not only does it permanently eat away a portion of the total memory due to the intricacies of Windows working-set memory management, it will need to be swapped out when actual demand for memory arises, competing with other disk access.
The only actual "optimization" in Windows is Superfetch, and that barely works these days.
I really wish this meme would die. Every modern operating system - macOS, Linux, and Windows - uses available memory for certain performance optimizations. It doesn't mean that memory isn't available for other applications when needed, or that your computer just starts swapping.
Well, Microsoft pioneered that earlier. Win98, or was it 95b, merged the filesystem Explorer with Internet Explorer and came up with Active Desktop.
Nintendo uses web UI a lot. The Switch eShop is notoriously a web app running on WebKit without JIT. The Action Guide in Super Mario Odyssey is web despite everything else being native.
Before figuring out how to tackle this project, I needed to know whether it would even be possible. According to a 2021 Reddit comment:
There is a zero percent chance of this ever happening.
Feeling encouraged, I started with the basics: what hardware is in the Wii, and how does it compare to the hardware used in real Macs from the era.
I almost think such projects are worth it just to immortalize comments like these. There's a whole psychology of wrongness that centers on declaring that not-quite-impossible things will definitely never happen, because it feels like principled skepticism.
That used to be my thing: whenever our ops manager declared something was impossible, I'd put my mind to proving her wrong. Even though we both knew she might declare something impossible prematurely just to motivate me.
My favorite was “it’s impossible to know which DB is failing from a stack trace”. I created STAIN (stack traces and instance names): a ruby library that would wrap an object in a viral proxy (all returns from all methods are themselves proxies) that would intercept all exceptions and annotate the call stack with the “stain”ed tag.
I've seen more than one half-joke-half-serious chunk of code that would "encode" arbitrary info into stack traces simply by recursively calling `fn_a`, then `fn_s`, `fn_d`, and `fn_f` before continuing with the actual intended call, giving you a stack trace with (effectively) "asdf" in it.
They've also been useful more than once, e.g. you can do that to know which iteration of a loop failed. There are of course other ways to do this, but it's hard to beat "stupid, simple, and works everywhere" when normal options (e.g. logs) stop working.
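A toy version of the trick, sketched in C rather than the Ruby of the original (all names here are made up for illustration): before calling into code that might crash, recurse through marker frames so the backtrace itself carries the information.

```c
/* Compile with -O0 so the marker frames aren't optimized away. */
#include <stdlib.h>

static void do_risky_work(int i) {
    if (i == 3) abort();  /* stand-in for the real failure */
}

/* One marker frame per loop iteration: a crash backtrace showing
   iteration_marker N times tells you which iteration died. */
static void iteration_marker(int depth, int i) {
    if (depth > 0)
        iteration_marker(depth - 1, i);
    else
        do_risky_work(i);
}

int main(void) {
    for (int i = 0; i < 10; i++)
        iteration_marker(i, i);  /* the backtrace now encodes i */
    return 0;
}
```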
Well, you're doing god's work as far as I'm concerned. Conflating difficulty in practice with impossibility in principle is, to my mind, a source of so much unnecessary cognitive error.
Similarly, one of the great things about Python (less so JS, with the ecosystem's habit of shipping minified bundles) is that you can just edit source files in your site-packages once you know where they are. I've done things like add print statements around obscure Django errors as a poor imitation of instrumentation. Gets the job done!
I'm reminded of my favorite immortalized comment: "No wireless. Less space than a Nomad. Lame." - Rob Malda of Slashdot, 2001, dunking on the iPod when it debuted.
Funny enough, the Dropbox comment caught so much flak that it's gone full circle: I've often found people defending it, saying what the guy said made sense at the time, etc.
They're kinda like high-effort shitposts. Which are my absolute favorite kind. The worse the effort/reward payoff, and the more it makes you ask "WHY??!!?", the better.
I got the idea of writing an emulator in JavaScript in the pre-Chrome era, circa 2007. I remember searching around trying to find whether somebody had done it before. It seemed not, and somebody on a forum declared “that’s not possible”.
To me, it was obviously possible, and I was determined to prove them wrong.
Wasn't the old Linux joke, don't ask "how do I do X with Linux" (because you'd get ridiculed for not reading the docs) but instead, just state "X isn't possible with Linux" and then someone would show you how it's done?
It's a great motivator - it happened to me too. I once asked a question about getting the original camera working on a custom ROM and got this as a response [1].
This led to a 2-year-long project [2] and an awesome time full of learning and collaboration.
Debugging kernel panics on a Wii in an economy seat is a level of focus I can't even imagine. Most people can't read a book on a plane without losing their place every 5 minutes.
What's not to love? A small and beautiful PowerPC Unix workstation, something IBM hasn't done in a long, long time. How far does MacPorts go on PPC?
Neat, and kudos! Reminds me of my young hobbyist days. I wish low level dev work was that approachable now.
Back in the old days, it was REALLY easy to initialize VGA and throw pixels around in ASM, C, or C++. The 6502 and related chips were relatively easy chips to build stuff for, even though tooling was non-existent. Shoot, you could do some really awesome things on a Tandy CoCo2 and BASIC of all things.
It feels like engineering has made this type of thing inaccessible. Most systems require a ton of knowledge and expertise. There is no easy 'in' for someone with a special interest in development. Even worse, AI is artificially dumbing things down, while making things even more inaccessible.
As someone who's been trying to do something VERY similar (port Mac OS 9 to the Nintendo Wii U), all I can say is I'm 1) absolutely impressed, and 2) absolutely encouraged, as my project keeps telling me "this is impossible" at every opportunity.
You are at a slight disadvantage without XNU and Darwin sources, but you do have the leaked System 7.1 source, Ghidra, and MCP to help make up the difference.
I hope OP is still reading comments. I noticed that the project was written in Xcode (the repo even has the xcodeproj folder) but in some screenshots I see CLion. Did you switch at some point or were you using both throughout the development simultaneously?
Amazing writeup - I love these types of blog posts, and I hope the Hawaii trip was enjoyable.
A side note: you embedded .mov videos inside <img> tags. This is not compatible with all browsers (notably Chrome and Firefox), which won't load the videos.
Minor usability comment: the screenshots are too small to be readable. Whenever that's the case in my blog posts, I make those screenshots clickable and add "(Click to enlarge)" below them, to make it easier for readers to see the image at its original resolution. In markdown, I do that like this:
[![Screenshot (click to enlarge)](image_url.png)](image_url.png)
(Of course, I can also right-click and do "Open image in new tab", but that's one click extra...)
Congrats on the awesome project, BTW! You were lucky that I wasn't sitting next to you on the plane. I would have wasted so much of your time asking dumb questions.
What stood out to me is how much of this worked because of strong abstraction boundaries.
It’s interesting because we don’t often think about OS-level abstractions in the same way anymore — but projects like this really show how powerful they are when they’re done right.
Makes me wonder how feasible something like this would be with modern systems, where things feel more tightly coupled and security constraints are much stricter.
I wonder if you can place an A18 from a Neo onto an iPhone board, and then make that work somehow... You wouldn't be able to use the one originally from the iPhone because it's differently fused to only accept iOS images.
Is it possible that a jailbreak is found that could allow a “kexec” kind of thing to load a new OS? Of course it would be a huge amount of work even if theoretically possible
marcan once said this was not possible on M1 macs. It was possible before, as coolbooter demonstrated, but it seems now that the hardware cannot be completely reinitialized without being power cycled (it was on Mastodon in 2024, he has since deleted his account so I cannot give you the exact quote). But you can do wizardry to load macOS' userspace on top of iOS' kernel [0] with a jailbreak.
You can't reinitialize the hardware, but if whatever you are trying to load is compatible with what's going on, then it should work. In a sense you could consider kexec to be like booting on a kind of weird machine where your interface to talking to the hardware is whatever macOS initialized the devices to.
Had a very similar issue porting a hypervisor to ARM S-EL2. Writes would succeed, there were no faults, and everything looked reasonable in GDB, but the other side never saw the data. The root cause was that Secure and Non-Secure physical address spaces were backed by different memory even at the same address, and a single PTE bit selected between them. That took me much longer to understand than I’d like to admit.
> In the end, I learned (and accomplished) far more than I ever expected - and perhaps more importantly, I was reminded that the projects that seem just out of reach are exactly the ones worth pursuing.
Couldn't agree more. I've had my own experience porting something that seemed like an intractable problem (https://news.ycombinator.com/item?id=31251004), and when it finally comes together the feeling of accomplishment (and relief!) is great.
The one that really bugs me is the Apple TV. It would be a great little box for terminal/thin-client style work, and there are a ton of old cheap ones. Having a $50 used box that was low power and could run OS X would be great.
The original one does run a modified OS X Tiger. I jailbroke it a while ago to run custom stuff, but didn't do much with that. Just remember being able to VNC or SSH into it.
hand-rolled iokit drivers and a bootloader to get xnu running on 88mb of ram with cpu-bound yuv-to-rgb conversion at 60fps, all because the wii's powerpc 750cl is close enough to a g3 imac that darwin mostly just worked. solid systems work and a genuinely useful writeup but might try on a dreamcast personally. rom burns
This was an incredible read! Especially for what looks like the first post to this blog too? I wanted to subscribe to the RSS feed but unfortunately it gives a 404 error.
Chiming in as well to say to the author when the victory lap here is over: please consider adding the RSS feed! I want to see whatever you do next, regardless of how long it takes.
This is extraordinary, not only pushing the limit but documenting everything so clearly to show people what can be accomplished with time and dedication. Thank you for such thorough documentation, and congrats on getting it done!
YUV appears to be a PAL-specific color space. I wonder how off an NTSC Wii would be. Presumably it would have the wrong color space until an equivalent conversion scheme was devised for NTSC.
I was surprised to see regional color spaces leak into the project, but I presume that Nintendo's iOS (the coincidentally-named system this is replacing) could handle that abstraction for game developers.
Some of this is just really widespread imprecise usage of terms: what really should be called "YCbCr" in digital contexts is frequently called "YUV." So-called "YUV" digital formats for video are really really common, and they're used for both NTSC and PAL. "YUV420," YCbCr using 4:2:0 chroma subsampling so the two color components are half the resolution in each dimension vs. the luma, in particular is super-common.
The Wii seems to actually use "YUV422" internally, so 4:2:2 chroma subsampling, where the chroma is only halved in one dimension. The conversion to analog NTSC or PAL signals happens later in the process. The repository here actually looks like it sets up the Wii's video interface to output NTSC progressive by default, but lets you configure for PAL with a config file.
That's true in broad strokes, but looking into it, it turns out NTSC's variant of YUV is called YIQ, and SECAM's variant is called YDbDr. They are however all more or less the same thing, and the digital YUV used by the Wii hardware in this case is presumably independent of the video standard.
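To make the "4:2:2" part concrete, here's the textbook full-range BT.601 RGB-to-YCbCr math with 4:2:2 packing, where each horizontal pixel pair shares one Cb/Cr sample. This is a generic sketch, not the author's actual rgbrgb16toycbycr routine, and it assumes arithmetic right shift of negative ints:

```c
#include <stdint.h>

static uint8_t clamp_u8(int v) { return v < 0 ? 0 : v > 255 ? 255 : (uint8_t)v; }

/* Two RGB888 pixels in, one Y0 Cb Y1 Cr group out. */
void rgb_pair_to_ycbycr(const uint8_t rgb[6], uint8_t out[4]) {
    int y[2], cb = 0, cr = 0;
    for (int p = 0; p < 2; p++) {
        int r = rgb[p*3], g = rgb[p*3+1], b = rgb[p*3+2];
        /* Integer BT.601 (full range): coefficients scaled by 256 */
        y[p] = ( 77*r + 150*g +  29*b) >> 8;
        cb  += ((-43*r -  85*g + 128*b) >> 8) + 128;
        cr  += ((128*r - 107*g -  21*b) >> 8) + 128;
    }
    out[0] = clamp_u8(y[0]);
    out[1] = clamp_u8(cb / 2);  /* chroma averaged across the pair */
    out[2] = clamp_u8(y[1]);
    out[3] = clamp_u8(cr / 2);
}
```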
This is incredible. I wonder when an LLM will pull this knowledge out to help someone down the line who would never have had the craft to pull this off, as it requires so much depth and broad skill. Admirable.
Highly respectable project. My hat's off to you. I'm just curious, what computer programming language did you do most of this in and what do you think was the most challenging part of porting Mac OS X on to the Wii console?
Thanks! The project was mostly C for the bootloader and C++ for the drivers.
As for which part was the most challenging... probably understanding the IOKit driver model. I really would have benefitted from having an expert explain some of the concepts to me, and give me some advice about how to structure my own drivers.
The Wii is very moddable. I've modded my Wii in the past just for playing modded versions of Super Smash Bros. Melee (mainly training packs, like flashing a red light when I miss an L-cancel).
I wrote that L-cancel training code! Funny to see it come up out of nowhere. I too have always adored the Wii and its moddability. It'd be my go-to choice if I someday ever get the itch to write console homebrew software of my own.
Yes - this project (and countless others) would not have been possible without the incredible work to hack the Wii from Team Twiizers (now fail0verflow) back in the day. The work they did was a huge inspiration for me getting into computer science when I was a teenager.
Just keep doing stuff and gaining experience. Sometimes you'll find that you don't know how to do something; at that point, don't just reach for an LLM. Do your best to try and understand it, google around, and if all else fails, put it down and maybe come back to it later with fresh eyes.
Given that the original Apple TV ran a modified version of macOS, what are the chances one could turn an old Wii into an Apple TV..?
EDIT: also, I just noticed on a second pass that the system is addressing 78 MB of RAM, potentially meaning the RAM spans the GDDR3 and the SRAM. I'm amazed this works as well as it does with seemingly heterogeneous memory.
I'd say there is a zero percent chance of this ever happening :D
The original Apple TV was an Intel Core Solo with 256 MB of RAM and an nVidia GPU, running a modified Mac OS X 10.4 that booted into something similar to Front Row instead of Finder.
Oh interesting, it looks like that GeForce had an entire 64 MB of GDDR3 too. It'd still be fun to see if one could limbo that low, though I agree it would take upgrading the Wii's BGA GDDR3 to something more like what the dev kit had (128 MB GDDR3).
They are successfully porting Mac OS onto every kind of modern computer over at the Hackintosh subreddit, and I can't understand why there is so little interest in this stuff in the "hacker" sphere.
Surely it must be a better option than Linux if you want to get the most out of a PC? At least for 10 more years.
I'm not sure why it would. Why would anyone want to hack on different proprietary software with no supplier support and whose days are clearly numbered (Apple's move to ARM)?
For usability I mean. It's clearly an interesting technical feat.
Because it has all the advantages of Linux + all the advantages of Windows + many advantages of its own.
So, to have a fully fledged and more usable computer, for those who don't want to purchase Apple hardware.
And the latest macOS still supports Intel, so you'll get many more years out of a machine. From what I know, the last 10 versions of macOS are still very usable.
Have you ever tried daily-driving a Hackintosh? It's nothing like using a Mac, or even a Windows/Linux machine for that matter.
I dual-booted Mojave on 2 Wintel machines back during the Clover bootloader days, I could only tolerate it for ~2 weeks before giving up. Spoofing OEM Apple hardware is basically impossible, even with configuration-matched CPUs your motherboard will still mismatch the ACPI overrides that macOS expects. Any variety of modern GPU is basically forfeit, hardware acceleration is flaky, Metal is inhumane (with or without Hackintosh), CUDA is unsupported, Vulkan is MIA, filesystem support is a joke, and OTA updates have to be disabled or else your system volume will overwrite the partition table and erase everything that's installed. Reinstalling from scratch can take multiple days if you don't back up your EFI configuration, so you really want to avoid bricking your install while you tweak the configs to stop being broken.
Even as a developer, using a Hackintosh was a waste of my time back in 2018 when "everything was supported". In 2026, I cannot comprehend a single objective reason why you would use an x86 Hackintosh instead of a better-supported and more fully-featured Linux or WSL installation. x86_64-apple-darwin is a partially deprecated target triple that's not suitable for any macOS or Linux development work, and for prosumers the architecture is already unsupported by many professional apps. Hackintosh is a museum piece now; even OpenCore can't save it: https://blog.greggant.com/posts/2025/07/16/open-core-is-dead...
You didn't answer my question - have you tried making a Hackintosh yourself?
I'm writing from the perspective of a tech-savvy Windows user that had triple-boot working on my desktop and laptop. I'm willing to deal with some system configuration, but it took upwards of 12 hours to configure my EFI for each unique device I wanted to Hackintosh. And it still didn't fix my iCloud or get my laptop trackpad working.
That is an entirely unacceptable process for someone who isn't a developer. I cannot recommend anyone use an OS that blocks OTA security updates, let alone people that can't/don't/won't program.
No I haven't, if I had I wouldn't have needed to make follow up questions.
But if you follow a guide from somebody who has hackintoshed the exact same device you have, then it shouldn't take that long, or am I missing something? The posts in the hackintosh subreddit generally details what will work and not work.
Exceptional work. While it may not mean much, I am truly impressed. I like to toy with reverse engineering here and there, but such a port like this would take me multiple lifetimes.
Not to distract too much from the main topic, but what do you think about the Hopper disassembler? I have only used Radare2, IDA Pro, and Ghidra. Though, I haven't used the latter two on MacOS. What do you prefer about Hopper? I have been hesitant to purchase a license because I was never sure if it was worth the money compared to the alternatives.
I like using it for disassembling UIKit (for my day job working on iOS apps), and overall, I like the UI/UX and how it feels like a native Mac app.
I've tried Ghidra, and while extremely impressive and capable, it might be the most Java-feeling app I've ever used. I'd love for someone to whip up an AppKit + SwiftUI shell for it.
I've had it on my list to do for a very long time but unfortunately it has never gotten much effort. Although at this point I'm not super happy with the design (I feel like it's built to be slow…) and I might build on top of something more modern like Binary Ninja instead.
There are bugs and undocumented behaviors that need to be understood in order to be worked around - I wish it wasn't the case but such is life developing for closed-source platforms.
> I like using it for disassembling UIKit (for my day job working on iOS apps), and overall, I like the UI/UX and how it feels like a native Mac app.
You are correct about the UI/UX. I do think Hopper is ahead of the others in that regard, though Radare2 being a CLI tool is nice as well. I haven't attempted to use Radare2 for macOS/iOS disassembly, though. I must ask: why are you disassembling UIKit? Looking for private API behavior, or working around bugs? I've been learning more about iOS in my spare time, because despite my love for Swift, I have never used it for iOS. I have only used Swift for macOS automation, i.e., as an AppleScript replacement via the Accessibility, Core Foundation, AppKit, etc. APIs.
> Ghidra, and while extremely impressive and capable, it might be the most Java-feeling app
I chuckled while reading this because I had the exact same thought when I first used Ghidra. I haven't tried Ghidra on macOS because I will not taint my machines with the impurities of Java, and I also do not want to enable Rosetta, so that was another obstacle. In Ghidra's defense, using Java was a pragmatic choice: the "write once, run anywhere" promise is likely a near-necessity for a disassembler built for government operations.
I bet if me-20-years-ago knew that current me would have no fucking clue how to even begin to tackle a problem like this, me-20-years-ago would be very disappointed. Very jealous of your expertise. Awesome work!
What I would like to see is a full OS reimplementation à la AROS m68k.
There are Mini vMac ports for 9front. Executor is written in C++, so there's no way to compile it with NPE (the micro-POSIX compat layer for 9front). If anyone got that running under Mini vMac, it could run everywhere.
As for Advanced Mac Substitute, since it has an SDL2 interface, it could almost be done, unless it's written in C++. If it's ANSI C or C99, it might run under 9front.
I probably have rose colored glasses, but this is what I associate with Hacker News when I first started coming to this site. Truly absurd projects for no reason other than the love of the game and detailed write ups.
I'm not an LLM post hater, but it definitely has been a bit draining lately. This is exactly what I love to see here.
yeah - I wish the 'Hacker' part of 'HackerNews' got more attention. Last few years it often feels more like "VC-Buzzword-of-the-day-News" - AI is just the most recent cycle.
I’m SOO happy but also wistfully sad when I open a post like this that I am desperately excited to read and it’s not muddled-thinking- and LinkedInese-riddled slop.
The post is the work of an actual hacker who knows what they're doing. Zero mention of "I used Claude" or "used AI" to understand what was needed to accomplish this task.
This is exceptional work. Unlike the low-effort slop posts I see here on "Show HN".
I used plenty of non-agentic AI to help understand the XNU codebase, and also research various topics. It wasn't always correct, but it certainly helped at times! My philosophy for this project was to use it as a learning tool - since that was kind of the whole point of me attempting this :)
Absolutely nothing. People protest Github to virtue signal their anti-Microsoft sentiment and fill a hole in their personality.
Similarly, "zero mention of AI" is just a surface-level observation that says nothing about how the project was completed and everything about your own insecurities defining the word hacker.
Presumably the Wii has some SIMD that you could use to accelerate the conversion (the 750CL has paired singles rather than AltiVec)? Did you happen to look into the code the compiler generates for your loop around rgbrgb16toycbycr?
This solution's CPU cost can be significantly improved by using memory protection. You protect the framebuffer from writes. The first time it is written, you take a fault, start refreshing at 60 Hz, and leave it writable. After some number of refreshes, you protect it again, the idea being that the UI may now be quiescent. I do this in my Palm OS port for the same reason.
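For the curious, a minimal user-space sketch of that scheme, assuming POSIX mprotect/SIGSEGV (in the real port this would live in the framebuffer driver, and the buffer size and quiescence threshold here are made up):

```c
#include <signal.h>
#include <stdint.h>
#include <string.h>
#include <sys/mman.h>

#define FB_SIZE (640 * 480 * 2)   /* hypothetical 16bpp framebuffer */

static uint8_t *fb;
static volatile sig_atomic_t fb_dirty;

static void on_fault(int sig, siginfo_t *si, void *ctx) {
    (void)sig; (void)ctx;
    uint8_t *addr = (uint8_t *)si->si_addr;
    if (addr >= fb && addr < fb + FB_SIZE) {
        /* First write since re-protection: unprotect so the write
           retries and succeeds, and flag the buffer as dirty. */
        mprotect(fb, FB_SIZE, PROT_READ | PROT_WRITE);
        fb_dirty = 1;
    }
}

/* Called at 60 Hz: convert/copy only while dirty, then re-arm. */
void refresh_tick(int *quiet_ticks) {
    if (fb_dirty) {
        /* ...YUV conversion + copy to the hardware would go here... */
        fb_dirty = 0;
        *quiet_ticks = 0;
    } else if (++*quiet_ticks == 120) {   /* ~2s with no writes: assume quiescent */
        mprotect(fb, FB_SIZE, PROT_READ); /* next CPU write faults again */
    }
}

int main(void) {
    struct sigaction sa;
    memset(&sa, 0, sizeof sa);
    sa.sa_sigaction = on_fault;
    sa.sa_flags = SA_SIGINFO;
    sigemptyset(&sa.sa_mask);
    sigaction(SIGSEGV, &sa, NULL);

    fb = mmap(NULL, FB_SIZE, PROT_READ, MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    fb[0] = 0xff;  /* faults once, then writes flow freely until re-protected */
    return 0;
}
```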
This would work for an idle screen but I can't imagine this would be very efficient for real use, unless the hardware cursor is put in some other framebuffer (I assume it's not otherwise the author would have probably mentioned it?)