It's also a con. You get worse sustained performance and a hotter device. There's a reason the base-model M-series MBPs consistently bench higher in things like Cinebench than MBAs with the exact same chip: the fan.
As I’ve pointed out in my other comments, the Nintendo Switch and Switch 2 are perfect devices to dispel this whole “no fan is better” narrative.
Clearly it’s not a challenge to make a compact, performant device with a nearly silent fan. And clearly customers don’t mind fans, even in devices meant to be held in hand for hours that weigh less than a pound.
I can buy a handheld from Nintendo for $450 that plays new AAA games with great performance, while the Neo struggles with 5-year-old titles like Cyberpunk despite likely having better overall hardware. A MacBook Neo with a fan would get 15-30% better overall performance and +50% framerates in games, as multiple tinkerers on YouTube have demonstrated.
You can also dump GB Camera photos with a GBxCart RW. It can also dump ROMs/saves and reprogram flash carts. The cheap GBC games off AliExpress are, in reality, flash carts.
Oh yeah, I forgot to mention that. It was the first option I considered, but getting it shipped to Sweden was super expensive, which didn't make sense considering I just wanted the camera-dump feature. Buying the PCB and port, on the other hand, only cost me about 10 bucks since I already had an Arduino lying around, and it also served as some necessary soldering practice :)
You go on eBay or a similar site and pay for a used copy on floppy or CD-ROM. Then, using the appropriate tool, you back those files up and use them for OpenCiv 1. Cheap? No. Convenient? No. But legal.
If you're lucky you stumble across it in a thrift store that wasn't paying particular attention and assumed it was a puzzle or a board game.
Huh, guess I’ve never worked at a Mac shop big enough to suffer Mac-ruining software. My biggest shop only had about 15,000 employees, so maybe it’s only the large companies enduring that.
You've never had GlobalProtect take a multi-Gbps connection down to <20Mbps due to all that userland processing of packets, thanks to Apple no longer allowing vendor kexts?
The GPU in the Neo isn't particularly fast... nor is the storage. The Neo makes loads of compromises to hit $600 with some of its features. Even for $400 you can get Windows PCs with TWO whole USB 3.0 ports. Past $400 you quickly hit diminishing-returns territory.
Twice the storage, twice the RAM, comparable GPU. The CPU is slower in single-core but comparable in multi-core. Faster storage. USB 4, HDMI, multiple USB-A ports. Supports more than one external monitor. Yep, the chassis and screen are worse, but it's better in many other ways.
So for $100 less, you get a markedly lower-DPI screen that's 40% dimmer, a slower CPU, hotter running, and a worse chassis. Almost no one's going to be slapping multiple external monitors on either of these. If they did, they might run into the problem where the Acer is often limited to 640x480: https://community.acer.com/en/discussion/733442/have-a-new-a...
That is not remotely in the same category as the Neo.
You get twice as much RAM, twice as much storage. 4x faster storage too. You get a full sized HDMI port. You can do multiple monitors if you need to. It has a fan for better sustained performance. You can plug in a flash drive, mouse, monitor or other external peripheral without a dongle. Oh, and it's actually COOLER running than the Neo.
The Neo costs $100 more, needs a $30 dongle to connect to 90% of the stuff people have, has half the RAM, half the storage, and slower storage. It has considerably worse I/O. But it has a better screen, and build quality comparable to a MacBook Pro from 2007.
It's different compromises. Personally I'd rather have more RAM, storage and IO than a prettier case and better screen.
The quibbling about RAM is strange, if only because Apple is much better positioned to utilize RAM, being vertically integrated. I produce music and occasionally compile Haskell on my 2016 MacBook with an i3 and 8 gigs of RAM. So I’m a 99th-percentile power user, and a 10-year-old machine works great. I bet the new Mac would be even better.
It doesn’t have 8 gigs of RAM to cheat the consumer. It’s because this company can do 10,000 hours of user testing to see what people need to do their normal-people things.
No, they're not "better positioned" to utilize memory.
NT has a far better VMM than macOS does and handles OOM significantly better than macOS (and Linux, for that matter).
Look no further than the various Mac subreddits for applications such as TextEdit, Calculator, Safari, and other first- and third-party applications leaking like a sieve to the point of OOM, across multiple versions of macOS at this point.
Not to mention, on Macs the GPU is sharing that precious memory with the CPU; on those 8GiB machines, that leaves 7.5GiB or less (depending on what you're doing) for the kernel to use for non-graphics services.
> NT has a far better VMM than macOS does and handles OOM significantly better than macOS (and Linux, for that matter).
That's one of my great frustrations with Windows. NT is a fine kernel. The userspace on top of that is fucking terrible though.
When people compare "operating systems," they're not comparing kernels. At their most technical, they're comparing the userspace tools shipped with that kernel; at their most general, the "ethos" of the developers who build the ecosystem. The terrible Windows experience of every program having an installer that pokes around god-knows-where in the registry is just as much part of the Windows operating system as piping curl into bash is on Linux.
> NT has a far better VMM than macOS does and handles OOM significantly better than macOS (and Linux, for that matter).
All of them handle OOM the same way: paging to disk with subsequent thrashing. How can any OS be better than any other in that respect?
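Whether paging is actually happening (as opposed to everything fitting in RAM) is at least observable. One rough way on Unix-like systems is the major-fault counter, which counts page faults that required disk I/O. A minimal sketch, nothing OS-specific beyond the stdlib `resource` module:

```python
import resource

def major_faults() -> int:
    # ru_majflt counts page faults that needed disk I/O (real paging),
    # as opposed to ru_minflt (faults satisfied from memory).
    return resource.getrusage(resource.RUSAGE_SELF).ru_majflt

before = major_faults()
# Touch a modest buffer one page at a time. On a machine under real
# memory pressure, re-reading cold pages shows up as major faults.
buf = bytearray(64 * 1024 * 1024)  # 64 MiB
for i in range(0, len(buf), 4096):
    buf[i] = 1
after = major_faults()
print(f"major faults during run: {after - before}")
```

On an idle machine with free RAM the delta is usually zero; a steadily climbing `ru_majflt` across processes is the thrashing everyone is describing.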
If your computing experience leaves much to be desired, it’s more often than not because more and more applications are eschewing (admittedly neglected) efficient native platforms in favor of Electron/WebViews.
…looking at you, Balena Etcher. No-one needs a 200MB front-end for `dd`.
Completely agree. In my current role I work with a lot more "normal" computer users, and it's helped me better understand many consumer technologies from different perspectives.
I have seen the survey results and work studies for our large enterprise of Mac users: most (not all, but most) report zero change in satisfaction or in perceived or objective work performance with 8GB vs 16GB MacBooks. Most users are swapping between Outlook, Teams, and Chrome; anything more than an M2 8GB MacBook Pro would be a waste for these users. Disk performance is a similar story: anything in the M line is more than good enough for 75%+ of our users. Mac screens and keyboards have very high customer satisfaction in our org. Like 16GB of RAM, that doesn't translate to a measurable increase in work performance, but subjectively people report higher satisfaction.
As for cost, the MacBook has a lower total cost of ownership in our organization than a Windows PC at a similar purchase price because: 1) the longer OS support timeline from Apple means they can be used longer, and 2) at the end of their lifespan with us, they have much higher resale value than comparable Windows hardware.
Just a different perspective as to why 8GB MacBooks make sense for some users.
I can see exactly one, and it's niche: the ability to safely leave tiny USB-A peripherals like flash drives, wireless dongles, and SFF YubiKeys connected while not in use (not that I'm recommending a YubiKey be left connected to a laptop when not in use).
Hubs are mostly only relevant for docking or increasing the number of ports; USB-A-to-C adapters are so cheap (assuming one isn't bundled with the peripheral in the first place) that you can reasonably leave them permanently attached to larger-form-factor USB-A peripherals.
As for full-sized HDMI (assuming you're not proposing the hellish mini or micro HDMI as alternatives), I'll take USB-C, or even Mini DisplayPort, over full HDMI: both have decent connectors and offer more and better inexpensive options for display connectivity. Admittedly, finding good active DisplayPort-to-HDMI dongles can be harder than it should be, because chroma subsampling is a thing that's rarely touched upon in product descriptions.
You're proving the point. The computer you found wins on the spec sheet, for sure. But the proof is in the pudding: Apple makes money hand over fist because they focus on reasonable specs and quality. The thing that kills a modern laptop is not a slow CPU or RAM on the chip; it's a cheap chassis that breaks. That's what makes people replace their computer.
> Apple doesn’t load your computer up with crapware and ads from the five different companies in the supply chain.
No, Apple prefers to have a monopoly on the ads and crapware, but they're still there.
The internet is filled with annoyed apple customers who want to debloat their systems:
You didn't read any of those, did you? They're asking about things like, literally: How can I delete the Chess app? How do I disable Spotlight? How do I remove Siri?
Those are not in any way comparable to ads or Candy Crush in the start menu.
I still haven't figured out how to remove Microsoft Store apps from the Start menu in recent non-LTSC versions of Windows 11, even on Enterprise with the Enterprise-only "disable consumer experiences" Group Policy key set.
Suggestion for any Microsofties listening: give me an easy way to override Windows key press-and-release to open the PowerToys Command Palette, and I'll never complain about the Start menu again.
What makes it horrifying? Plastic? Is the only thing that's important the material it's made out of? I think there's many use cases where the Acer would be less horrifying to use than the Neo. Which device would be better for running a Linux VM for CS class homework for example?
Why bother with a VM for Linux on the Acer? Just run it natively. There's almost nothing that actually requires Microsoft anymore, and you'll get better performance.
A vanishingly small number of end users (both PC and Mac) care about how much RAM they have. I'd be willing to bet that at least 75% of PC and Mac laptop owners couldn't even tell you how much RAM they have, or they mistake hard disk storage for RAM or vice versa.
> "What I will say is that in recent years, Apple has really accelerated the performance of their SSDs. And this has been a key part of the argument as to why PCs are absolute trash."
Umm, for the past 5+ years PC SSDs have generally been as fast as or faster than what Apple has been shipping. When Apple moved to NVMe, they did so before most of the PC industry and had some advantage, but they got eclipsed.
Not knowing these devices personally, I'll just say I find most of these sorts of SSD performance summaries completely useless.
Too often, specs or even shallow benchmarks report little more than some theoretical peak speed from the system to the SSD controller's RAM buffers, without any real information about reads and writes that actually go all the way to the solid-state storage cells. And even when they do go all the way, they fail to highlight performance variance across different realistic workloads...
As a general rule: any SSD benchmark that gives you a result of over 1GB/s is not measuring what's actually most important for day to day interactive use. And anything that's within a factor of two of the SSD's marketing numbers is probably relevant only to copying a single file to or from another SSD.
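What's usually missing is the latency distribution for small random reads, which is what interactive use actually feels like. A rough sketch of that kind of measurement (the file path and sizes are arbitrary, and this doesn't bypass the OS page cache, so a warm cache inflates the numbers in exactly the way described above; use a file much larger than RAM for honest results):

```python
import os
import random
import tempfile
import time

BLOCK = 4096            # 4 KiB reads approximate interactive I/O
FILE_SIZE = 64 << 20    # 64 MiB scratch file (illustrative; real runs need >> RAM)
N_READS = 1000

# Create a scratch file to read from.
path = os.path.join(tempfile.gettempdir(), "ssd_probe.bin")
with open(path, "wb") as f:
    f.write(os.urandom(FILE_SIZE))

lat_us = []
fd = os.open(path, os.O_RDONLY)
try:
    for _ in range(N_READS):
        off = random.randrange(0, FILE_SIZE - BLOCK)
        t0 = time.perf_counter()
        os.pread(fd, BLOCK, off)          # one small read at a random offset
        lat_us.append((time.perf_counter() - t0) * 1e6)
finally:
    os.close(fd)
    os.unlink(path)

lat_us.sort()
print(f"p50 {lat_us[len(lat_us) // 2]:.1f} us, "
      f"p99 {lat_us[int(len(lat_us) * 0.99)]:.1f} us")
```

Tail latencies (p99 and up) are where cheap drives fall apart under sustained load, and they never show up in a single peak-throughput number.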
A Dell Pro Max with the Nvidia GB10, 128GB RAM, and 2TB storage is about $4200; a Mac Studio with 128GB RAM and 2TB storage is $4100. So pretty comparable. I think Dell's pricing has also been rocked more by the RAM shortage.
Unfortunately the GB10 is incredibly bandwidth-starved. You get 128GB of RAM, but only 270GB/s of bandwidth. The M3 Ultra Mac Studio gets you 820GB/s. (The M4 Max is at 410GB/s.) I'm not aware of any workload that gets the GB10 to its theoretical peak FLOPS.
From the spec sheets I’m looking at, it is not. I’m seeing models of the Dell Pro Max with 128 GB of DDR5-6400 as CAMM2, plus separate memory of up to 24 GB on the GPU. CAMM2 does not make the memory unified.
You're not looking at the right thing. Dell's naming is horrible. Dell Pro Max with GB10 (https://www.dell.com/en-us/shop/cty/pdp/spd/dell-pro-max-fcm...). It's a very different computer than what you're looking at and has 128GB LPDDR5X unified memory.
AFAIK the unified-memory bandwidth depends mostly on the CPU: the M4 Max (I think it's the default today?) does ~550 GB/s, while the GB10 does ~270 GB/s, so about a 2x difference between the two. For comparison, the RTX Pro 6000 does 1.8 TB/s, pretty much the same as a 5090, which is probably the fastest/best GPU a prosumer could reasonably get.
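For local LLM use (the usual reason people buy a GB10), those bandwidth numbers translate almost directly into token rates, because decode is memory-bandwidth bound: each generated token streams essentially the whole set of weights through the memory bus. A back-of-envelope sketch (the 70GB model size is an illustrative assumption, not a spec):

```python
# Rough upper bound for memory-bandwidth-bound decode:
# tokens/sec <= bandwidth / bytes read per token (~ total weight size).
def max_tok_per_sec(bandwidth_gb_s: float, model_gb: float) -> float:
    return bandwidth_gb_s / model_gb

MODEL_GB = 70.0  # e.g. a ~70B-parameter model at 8-bit quantization
for name, bw in [("GB10", 270), ("M4 Max", 550), ("M3 Ultra", 820)]:
    print(f"{name}: <= {max_tok_per_sec(bw, MODEL_GB):.1f} tok/s")
```

So even though the GB10 can *fit* big models in its 128GB, the ~270GB/s bus caps it at a few tokens per second on them, while the M3 Ultra's 820GB/s triples that ceiling. Real throughput lands below these bounds.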
For N64 and back, a MiSTer is a good option these days. Because of the MiSTer Pi and QMTech clones (and clones of the clones), prices have dropped a fair bit.
Cheap generic upscalers add significant latency, and worse, inconsistent latency. That might be OK for a turn-based RPG like Persona 3, but it will drive you mad in games like DDR, Guitar Hero, or anything heavily action-based.
As I said, the results were "ok". Obviously you'd need to buy a RetroTINK or something similar for the best results, but all but the most expensive models are often out of stock.
The biggest issue, IMO, is that some stretch the picture. The latency is greatly exaggerated IME. I just used an older TV with a good built-in upscaler (newer TVs have worse upscalers).
IDK, when I tried one, DKC was completely unplayable. For some games it's very important. This is a well-documented problem with the cheap scalers.
They have other issues too and RetroRGB has a good video going over the problems:
Not all TVs support 240p over component either, especially ones from the late 2000s/early 2010s, which means you can run into problems with some PS1 games on a PS2 or with earlier consoles.
I am aware of all this. This is why I said that depending on the box it can be either total garbage or "ok". At this point any emulation is always going to be easier.
Eh, while they do need a Wii U GamePad, once you're past setup you don't really need it except for a handful of games, or maybe the occasional settings change. If you're just using it as a Wii, you can also press B on the Wii Remote while booting to boot straight into Wii Mode.
I wouldn't recommend it for non-technical people, but people here would probably be fine.