Apple exec doesn’t actually understand how computers work and thinks that might actually be a reasonable argument.
It doesn’t matter how good your processor is: if you can only fit 8 GB into memory, it’s going to be slow. The only way an 8 GB device would beat a 16 GB device is if the 16 GB device had the world’s slowest processor. Like something from 2005. Having to page stuff out of RAM is about the slowest thing you can make a system do, short of loading everything from a hard drive in the first place.
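To put rough numbers on that (a minimal sketch, not a rigorous benchmark; on a machine with plenty of free RAM the OS page cache will flatter the storage side considerably):

```python
# Illustrative only: touching data that is already resident in RAM vs. pulling
# the same amount back off storage, which is what swapping forces you to do.
import os
import tempfile
import time

SIZE = 256 * 1024 * 1024  # 256 MiB

data = bytearray(SIZE)  # sits in RAM, assuming no memory pressure

t0 = time.perf_counter()
sum(data[::4096])  # touch roughly one byte per 4 KiB page
t_ram = time.perf_counter() - t0

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(data)
    path = f.name

t0 = time.perf_counter()
with open(path, "rb") as f:
    while f.read(1024 * 1024):  # stream the same 256 MiB back in 1 MiB chunks
        pass
t_disk = time.perf_counter() - t0
os.unlink(path)

print(f"RAM pass: {t_ram:.3f}s, storage pass: {t_disk:.3f}s")
```

Even with a fast NVMe SSD, the storage pass loses badly once the data genuinely has to come off the drive, and that is the situation an 8 GB machine under load is in constantly.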
Apple exec doesn’t actually understand how computers work and thinks that might actually be a reasonable argument.
I think a lot of Apple users fit this bill too, so it doesn’t matter much if this is the messaging; a fair amount of people will believe it.
Scammers!
16gb is always better, and I usually recommend it to people looking to buy a Mac, but they aren’t wrong about Macs handling RAM more efficiently. They still sound arrogant af when using that as their excuse, though.
they aren’t wrong about Macs handling RAM more efficiently.
More efficiently than what other system? How did you come to that conclusion? If you open tabs in your browser, do you think MacOS will allow you to open more tabs than other operating systems?
Just from my observations owning a 2015 MBP with 8GB of memory, it’s easy to be fooled into thinking memory management is much better on macOS, because you can effectively have more open than you would on an equivalent Windows laptop with 8GB of memory.
From what I understand, though, the SSD is used for swap a lot more than on Windows, and I believe that’s causing a lot of e-waste: the M1 Macs in particular are effectively being binned because their SSDs are worn out from swapping and they’re soldered.
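A back-of-envelope on the wear side, where every number is an assumption rather than a measurement:

```python
# Back-of-envelope only; both figures below are assumptions, not measurements.
tbw_rating_tb = 300           # assumed endurance rating for a ~256 GB class SSD
swap_writes_gb_per_day = 50   # assumed heavy swap traffic on an 8 GB machine

days = tbw_rating_tb * 1000 / swap_writes_gb_per_day
print(f"~{days / 365:.1f} years of that swap load to hit the rated TBW")
# ~16 years at these numbers; double the swap traffic, or start from a drive
# that is nearly full (which hurts wear levelling), and the margin shrinks fast.
```

The rating isn’t a hard cliff either way, but on a soldered drive there’s no cheap recovery once it is worn out.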
Ridiculous.
deleted by creator
Instead I feel it’s the opposite, because that memory is shared with the GPU. So if you’re gaming, even with some old game, it’s like having 4 GB for the system and 4 GB for the GPU. They might claim that their scheduler is magic and can predict memory usage with perfect accuracy, but even then it would be something like 6+2 GB. If a game has heavy textures, it will steal memory from the system. Maybe you want to have a browser open for watching a tutorial on YouTube during gaming, or a chat. That’s another 1-2 GB stolen from the CPU and GPU.
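As a rough budget (every number here is made up purely to illustrate the squeeze):

```python
# Illustrative only; all of these figures are assumptions.
total_gb = 8
os_and_background_gb = 2.0   # OS plus background services
gpu_textures_gb = 3.0        # GPU share of unified memory for a texture-heavy game
browser_and_chat_gb = 1.5    # YouTube tutorial plus a chat app

left_for_the_game_itself = total_gb - os_and_background_gb - gpu_textures_gb - browser_and_chat_gb
print(f"{left_for_the_game_itself} GB left")  # 1.5 GB before compression and swap kick in
```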
Their pricing for the ram is ridiculous, they’re charging $300 for just 8gb of additional memory! We’re not in the 2010s anymore!
Maybe you want to have a browser open for watching a tutorial on YouTube during gaming, or a chat. That’s another 1-2 GB stolen from the CPU and GPU.
Or five times that amount if you’re running Chrome
The most expensive 8GB DDR5 stick I can find on Amazon is about US$35. There are 64GB sets that are under US$200!
Apple should be ashamed.
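Per gigabyte, using the figures quoted in this thread (the $300-for-8GB upgrade mentioned above and these retail prices), the gap looks like this:

```python
# Price per gigabyte, using the numbers quoted in this thread rather than any
# official price list.
apple_upgrade = 300 / 8   # the +8 GB build-to-order option mentioned above
retail_stick = 35 / 8     # a single 8 GB DDR5 retail stick
retail_kit = 200 / 64     # a 64 GB retail kit

print(f"Apple: ${apple_upgrade:.2f}/GB, stick: ${retail_stick:.2f}/GB, kit: ${retail_kit:.2f}/GB")
# ~ $37.50 vs $4.38 vs $3.13 per GB. On-package LPDDR5 isn't the same part as a
# desktop DIMM, but the gap is still roughly an order of magnitude.
```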
16 GB OptiPlexes on sale for 85 dollars on eBay. Don’t come with Windows, but neither do Macs :P
Install Linux and this is the way.
Yeah yeah, I don’t think I even like Windows 11 these days. I’m a Debian with KDE guy.
My GeoTIFFs do not agree.
Seems fair, you pay 1000 for the logo and 600 for the hardware.
It’s a very nice logo. And it lights up. Hard to argue with their pricing, really.
It’s actually just the display backlight, which is why I had to cover it with aluminium tape instead of just disconnecting the wire. Not only do I not want an ad on my computer, I especially don’t want an illuminated one.
Apple laptops and gaming headphones, keeping it classy
It actually doesn’t light up anymore…
For $375 you can get an iFlashlight to point at the logo
Ordered the iFleshlight. Looking forward to seeing the jealous looks I get at the coffee shop.
Just upgrade the RAM yourself.
Oh wait, you can’t because it’s 2023 and it’s become inexplicably acceptable to solder it to the motherboard.
It’s not “inexplicable”.
DIMM mounting brackets introduce significant limitations to maximum bandwidth. SoC RAM offers huge benefits in bandwidth improvement and latency reduction. Memory bandwidth on the M2 Max is 400 GB/s, compared to a max of 64 GB/s for DDR5 DIMMs.
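The per-DIMM figure falls out of a simple formula (the speeds below are examples, not a claim about any particular machine):

```python
# Peak theoretical bandwidth = transfer rate (MT/s) x bus width in bytes.
def ddr_bandwidth_gb_s(mt_per_s: int, bus_bits: int = 64) -> float:
    return mt_per_s * (bus_bits / 8) / 1000

print(ddr_bandwidth_gb_s(4800))  # ~38.4 GB/s for one DDR5-4800 channel
print(ddr_bandwidth_gb_s(8000))  # ~64 GB/s, the per-DIMM ceiling quoted above
# The M2 Max's ~400 GB/s comes from a much wider on-package LPDDR5 bus
# (effectively many channels in parallel), not from faster individual chips.
```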
It may not be optimizing for the compute problem that you have, and that’s fine. But it’s definitely optimizing for compute problems that Apple believes to be high priority for its customers.
Not even soldered, it’s part of the CPU/GPU die now.
Ah yes, it’s the SSD that’s soldered.
Just 300 of your English pounds to upgrade from 512GB to 1TB.
Meanwhile, a 2TB drive at PS5 speeds is under £100.
For unupgradable kit, the pricing is grotesque.
Apple has put a lot of effort into (successfully) creating a customer base that thinks overpriced goods and different colored texts make them part of a special club, so I’m not surprised that an exec thought this excuse would fly.
It’s a bit more complex than that (and you probably know it).
When you enter the Apple ecosystem you basically sign a contract with them: they sell you overpriced goods, but in exchange you get a consistent, coherent and well thought-out experience across the board. Their UX is excellent. Their support is good. Things work well, applications are easy to use and pretty stable and well built. And if they violate your privacy like the others do, at least they don’t make the free-for-all sale of your data their fucking business model (wink wink Google).
Of course there’s a price to pay: overpriced products, limited UI/UX options, no interoperability, little control over your data. And when there’s that one thing that doesn’t work, you’re out of luck. But your day-to-day life within the Apple ecosystem IS enjoyable. It’s a nice golden cage with soft pillows.
I used to be a hardcore PC/Linux/Android user. Over the last few years I gradually switched to a full Apple environment: MacBook, iPhone, iPad… I just don’t have time to “manage” my hardware anymore. Nor the urge to do it. I need things to work out of the box in a predictable way. I don’t want a digital mental load. Just a simple UX, consistency across my devices and good apps (and no Google, fuck Google). Something I wouldn’t have with an Android + PC setup. :)
The whole “special club” argument is bullshit, and I hope we grow out of it. Neither the Apple nor the Google/Microsoft environments are satisfactory. Not even speaking of Linux and FOSS. We must aim higher.
Yea sorry but I disagree with the vast majority of things you said. Consistent, coherent, and well thought out experiences are pretty par for the course at this point, regardless of what flagship phone company you’re buying from. The UX is great for people who grow up with the UX, just like android UX is great for people who grow up with it. Android users who switch to Apple generally think Apple’s UX is dogshit, and vice versa. The UX argument is trash and the reality of it is that people just think whatever UX they’re used to is the more intuitive one. Support is pretty much the same in my experience between the two, and I have an android personal phone and an apple work phone. The major difference is the image that Apple support airs is better. The vast majority of popular Android applications are just as stable and usable as Apple apps, and the ones that aren’t are often niche enough that similar apps aren’t even on the apple appstore. So it’s a question of basically the same service for most apps, and then either no service or degraded service for lesser apps.
The one thing they have over android is the security argument, which is fair to an extent, but not this bulwark that a lot of people like to pretend it is. The Fappening, for example, still got plenty of explicit images from Apple phones.
All in all, apple is an advertising company first, and a tech company second, with admittedly improved security, but also admittedly security that isn’t good enough to justify the price hike.
I’m gonna have to argue against a few of these points:
When you enter the Apple ecosystem you basically sign a contract with them: they sell you overpriced goods, but in exchange you get a consistent, coherent and well thought-out experience across the board.
Consistent: yes. Every Apple device leverages a functionally very similar UI. That said, the experience is, in my opinion, not very coherent or well thought out. Especially if you are attempting to leverage their technology from the standpoint of someone like a Linux power user. The default user experience is frustratingly warped around the idea that the end user is an idiot who has no idea how to use a terminal and who only wants access to the default applications provided with the OS.
Things work well
Things work…okay. But try installing, uninstalling, and then reinstalling a MySQL DB on a macbook and then spend an hour figuring out why your installation is broken. Admittedly, that’s because you’re probably installing it with Homebrew, but that’s the other point: if you want to do anything of value on it, you have to use a third party application like Homebrew to do it. The fact that you have to install and leverage a third party package manager is unhinged for an ecosystem where everything is so “bundled” together by default.
Of course there’s a price to pay: overpriced products, limited UI/UX options, no interoperability, little control over your data. And when there’s that one thing that doesn’t work, you’re out of luck. But your day-to-day life within the Apple ecosystem IS enjoyable. It’s a nice golden cage with soft pillows.
I guess the ultimate perspective is one in which you have to be happy surrendering control over so much to Apple. But then again, you could also just install EndeavourOS with KDE Plasma or any given flavor of Debian distribution with any DE of your choice, install KDE Connect on your PC and phone, and get 95 percent of the experience Apple offers right out of the box, with about 100x the control over your system.
I used to be a hardcore PC/Linux/Android user. Over the last few years I gradually switched to a full Apple environment: MacBook, iPhone, iPad… I just don’t have time to “manage” my hardware anymore.
I don’t know of anyone who would describe themselves as a hardcore “PC/Linux user,” or what this means to you. I’m assuming by PC you mean Windows. But people who are really into Linux generally don’t like MacOS or Windows, and typically for all the same reasons. I tolerate a Windows machine for video game purposes, but if I had to use it for work I’d immediately install Virtualbox and work out of a Linux VM. For the people who are really into Linux, the management of the different parts of it is, while sometimes a pain in the ass, also part of the fun. It’s the innate challenge of something that can only be mastered by technical proficiency. If that’s not for you, totally fine.
The whole “special club” argument is bullshit, and I hope we grow out of it.
It’s less argument and more of a general negative sentiment people hold towards Apple product advocates. You can look up the phenomenon of “green bubble discrimination.” It’s a vicious cycle in which the ecosystem works seamlessly for people who are a part of it, but Apple intentionally makes leaving that ecosystem difficult and intentionally draws attention to those who interact with the people inside of it who are not part of it. Apple products also often are associated with a higher price tag: they’re status symbols as much as they are functional tools. People recognize a 2000 dollar Macbook instantly. Only a few people might recognize a comparably priced Thinkpad. In a lot of cases, they’ll just assume the Macbook was expensive and the non-Macbook was cheap. And you might say, “yeah, but that’s because of people, not because of Mac.” But it would be a lie to say that Apple isn’t a company intensely invested in brand recognition and that it doesn’t know it actively profits from these perceptions.
Everything you say is what past me would have answered ten years ago, thinking current me is an idiot. Yet here we are. ;)
You are right and make good points. But you are not 99% of computer users. Just considering installing a linux distro puts you in the top 1% most competent.
(Speaking of which, I still have a laptop running EndeavourOS + i3. Three months in, my system is half broken because of infrequent updates. I could fix it, I just don’t have the motivation to do so. Or the time. I’ll probably just reinstall Mint.)
Everything you say is what past me would have answered ten years ago, thinking current me is an idiot. Yet here we are. ;)
Wow. Talk about coincidences…
you are not 99% of computer users. Just considering installing a linux distro puts you in the top 1% most competent.
I’m a dumbass and if I can do it anyone can. But, yes, technology is a daunting thing to most people. Intuition and experience go far. That said, it’s literally easier today than it ever has been. You put in the installation usb, click next a whole bunch, reboot, and you have a working machine. Is it sometimes more complicated than that and you have to do BIOS/UEFI bullshit? Sure, but past that hurdle it’s smooth sailing.
(Speaking of which, I still have a laptop running EndeavourOS + i3. Three months in, my system is half broken because of infrequent updates. I could fix it, I just don’t have the motivation to do so. Or the time. I’ll probably just reinstall Mint.)
Ah, the joys of rolling release distros. Endeavour has been stable for me so far. I’m running it on an X1 Thinkpad. Generally works more reliably than my own vanilla Arch installs and more low-profile tiling window managers. I’ve found myself sticking to KDE Plasma for a DE because it’s so consistent and has enough features to keep me happy without having to spend all my time fine-tuning my own UX, which I just don’t care about. My realization has been that Arch distros are best suited for machines running integrated graphics and popular DEs, rather than ones with separate cards and more niche or highly customizable DEs. Prevents you from having to futz about with things like Optimus, with graphics drivers being the primary cause of headaches for that distro, per my experience. That said, I used to run an old Acer laptop with Arch and a tiling window manager called Qtile. Qtile was great, but every other update completely altered the logic and structure of how it read the config file, so the damn thing broke constantly. I’m like…just decide how you want the config to look and keep that. Or at least allow for backwards compatibility. But they didn’t.
I mean, the NAND chips can be replaced fairly effectively if you know what you’re doing
Actually no. There’s some pairing trickery going on at the SoC level, so if you swap the NAND chips for higher-capacity ones without Apple’s special sauce, you’ll just get an unbootable system.
And paging in & out of RAM frequently is probably one of the quickest ways to wear out the NAND.
Put it all together and you have a system that breaks itself and can’t be repaired. The less RAM you buy the quicker the NAND will break.
I was under the impression that had been solved by third parties? Or is chip cloning not enough?
Unbelievable …
MacRumors just released an article talking about how the 8GB is a bottleneck in the new M3 models lol
deleted by creator
deleted by creator
deleted by creator
deleted by creator
Why did you decide to go back to Apple instead of giving Linux a try? It’s free so it literally would have cost nothing to try, and you could keep your other OS(s).
deleted by creator
By mobile app integration, do you mean a connection between your mobile phone and your computer? KDE Connect is pretty good from my experience. It has more features than the Windows alternative at least (and I think there’s even a Windows version oddly enough).
If you mean running a mobile app in the system, I have no experience with that.
deleted by creator
Good grief, I had a lady behind the counter try to browbeat me into signing up for the store’s rewards card and she wasn’t as pushy as this comment.
It’s pushy to ask why someone made a large purchase when there’s a free alternative they might not have tried that they may or may not like better? Unlike buying an Apple product, it takes little effort and no cost to just boot up Linux and give it a shot. Some people won’t like it and that’s fine. It’d be pushy to say you will like it better, which is not what I said.
deleted by creator
I doubt it’s the last time. Also, while “PC” means personal computer, it was a very specific brand name from IBM, not a general-purpose term. Their computers (and the clones that came later) became synonymous with x86 Windows machines.
Even Apple themselves have always distanced themselves from the term (“I’m a Mac, and I’m a PC”…).
deleted by creator
deleted by creator
deleted by creator
Can you run that outside of a virtual box?
It’s not virtualization. It’s actually booted and runs on bare metal, same as the way Windows runs on a normal Windows computer: a proprietary closed UEFI firmware handles the boot process but boots an OS from the “hard drive” portion of non-volatile storage (usually an SSD on Windows machines). Whether you run Linux or Windows, that boot process starts the same.
Asahi Linux is configured so that Apple’s firmware loads a Linux bootloader instead of booting MacOS.
And wouldn’t it be a lot cheaper to just build your own PC rather than pay the premium for the apple logo?
Apple’s base configurations are generally cheaper than similarly specced competitors, because their CPU/GPUs are so much cheaper than similar Intel/AMD/Nvidia chips. The expense comes from exorbitant prices for additional memory or storage, and the fact that they simply refuse to use cheaper display tech even in their cheapest laptops. The entry level laptop has a 13 inch 2560x1600 screen, which compares favorably to the highest end displays available on Thinkpads and Dells.
If you’re already going to buy a laptop with a high quality HiDPI display, and are looking for high performance from your CPU/GPU, it takes a decent amount of storage/memory for a Macbook to overtake a similarly specced competitor in price.
deleted by creator
Except the boot process on a non apple PC is open software.
For the most part, it isn’t. The typical laptops you buy from the major manufacturers (Lenovo, HP, Dell) have closed-source firmware. They all end up supporting the open UEFI standard, but the implementation is usually closed source. Having the ability to flash new firmware that is mostly open source but with closed-source binary blobs (like coreboot) or fully open source (like libreboot) gets closer to the hardware at startup, but still sits on proprietary implementations.
There’s some movement to open source more and more of this process, but it’s not quite there yet. AMD has the OpenSIL project and has publicly committed to open sourcing a functional firmware for those chips by 2026.
Asahi uses the open source m1n1 bootloader to load U-Boot, which in turn loads desktop Linux bootloaders like GRUB (which generally expect UEFI compatibility), as described here:
- The SecureROM inside the M1 SoC starts up on cold boot, and loads iBoot1 from NOR flash
- iBoot1 reads the boot configuration in the internal SSD, validates the system boot policy, and chooses an “OS” to boot – for us, Asahi Linux / m1n1 will look like an OS partition to iBoot1.
- iBoot2, which is the “OS loader” and needs to reside in the OS partition being booted to, loads firmware for internal devices, sets up the Apple Device Tree, and boots a Mach-O kernel (or in our case, m1n1).
- m1n1 parses the ADT, sets up more devices and makes things Linux-like, sets up an FDT (Flattened Device Tree, the binary devicetree format), then boots U-Boot.
- U-Boot, which will have drivers for the internal SSD, reads its configuration and the next stage, and provides UEFI services – including forwarding the devicetree from m1n1.
- GRUB, booting as a standard UEFI application from a disk partition, works like GRUB on any PC. This is what allows distributions to manage kernels the way we are used to, with grub-mkconfig and /etc/default/grub and friends.
- Finally, the Linux kernel is booted, with the devicetree that was passed all the way from m1n1 providing it with the information it needs to work.
If you compare the role of iBoot (proprietary Apple code) to the closed source firmware in the typical Dell/HP/Acer/Asus/Lenovo booting Linux, you’ll see that it’s basically just line drawing at a slightly later stage, where closed-source code hands off to open-source code. No matter how you slice it, it’s not virtualization, unless you want to take the position that most laptops can only run virtualized OSes.
I think you mean that Apple uses its own memory more effectively than a Windows PC does.
No, I mean that when you spec out a base model Macbook Air at $1,199 and compare to similarly specced Windows laptops, whose CPUs/GPUs can deliver comparable performance on benchmarks, and a similar quality display built into the laptop, the Macbook Air is usually cheaper. The Windows laptops tend to become cheaper when you’re comparing Apple to non-Apple at higher memory and storage (roughly 16GB/1TB), but the base model Macbooks do compare favorably on price.
deleted by creator
deleted by creator
deleted by creator
I think the history is such that a “PC” is a computer compatible with the “IBM PC” which Macs were historically not and modern ones aren’t either.
But I still like “Windows computer”, we can abbreviate that to “WC”.
Another complication was that DOS-using machines weren’t always running Windows at one point in time.
I agree with you, but you know how Apple operates, slapping a shiny new name on an already existing concept and making it sound premium.
True by the letter but not really by practice. PC is synonymous with a computer running Windows, or Linux at a push. I don’t know whether that’s because of Microsoft’s early market dominance or because Apple enjoys marketing itself as a totally different entity, or some combination of the two. But yeah, usage determines meaning more than what the individual words mean in a more literal sense.
Originally, “PC” meant IBM PC or PC compatible (as in compatible with IBM’s machines without using their trademark). An IBM PC could have run DOS, Windows or even OS/2.
deleted by creator
poop. it’s poop.
Lol. My personal iMac has 32GB, and I’m happy with it. My POS work MBP has only 8GB, and I wanna frisbee the fucken thing out the window pretty much every day.
My research disproves this clown’s hypothesis.
If I wasn’t so broke, my 8GB MBP would enter the frisbee competition…
deleted by creator
Yeah, I gave Apple a try over the last two years, largely because I was annoyed with Google and wanted to ditch Android. I’ve been fed up since about 6 months in, but gave it some more time, which led to an eventual waiting game to get the replacements I want.
I just picked up a Thinkpad P14s g4 AMD with a 7840u, 64GB of RAM, and a 3 year onsite warranty for $1270 after taxes. I added a 4TB Samsung 990 Pro for another $270. I can’t imagine spending more than that and only getting 8GB RAM (and less warranty), which is what I have assigned to the GPU. Plus I get to run Linux, which I really didn’t realize how much MacOS would leave me wanting.
The thing I’ll miss is the iPhone 13 Mini size. I found iOS to be absolute trash, but there’s just not an Android phone that’s a reasonable size. But at least I can run Calyx/Graphene on a Pixel and get a decent OS without spying.
I do like the M1 MBA form factor, too, but I’ll grab the Thinkpad X13s successor for portability and get a better keyboard. I don’t need top end performance out of that, I really just want battery life and passive cooling.
And don’t even get me started on the overpriced mess that are the Airpods Max. I much prefer the Audeze Maxwell and Sennheiser Momentum 4 I replaced them with.
deleted by creator
deleted by creator
When you’re nearing the terabytes range of RAM you should consider moving your workload to a server anyway.
deleted by creator
The Apple M series is not ARM based. It’s Apple’s own RISC architecture. They get their performance in part from the proximity of the RAM to the GPU, yes. But not only that. Unlike ARM, which has become quite bloated after decades of building upon the same instruction set (and adding new instructions to drive adoption even if that’s contrary to RISC’s philosophy), the M series started anew with no technological debt. Apple also controls both the hardware and the software, as well as the languages and frameworks used by third-party developers for their platform. They therefore have 100% compatibility between their chips’ instruction set, their system and third-party apps. That allows them to make CPUs with excellent efficiency. Not to mention that speculative execution, a big driver of performance nowadays, works better on RISC, where all the instructions have the same size.
You are right that they do not cater to power users who need a LOT of power though. But 95% of the users don’t care, they want long battery life, light and silent devices. Sales of desktop PCs have been falling for more than a decade now, as have the investments made in CISC architectures. People don’t want them anymore. With the growing number of manufacturers announcing their adoption of the new open-source RISC-V architecture I am curious to see what the future of Intel and AMD is. Especially with China pouring billions into building their own silicon supply chain. The next decade is going to be very interesting. :)
The Apple M series is not ARM based. It’s Apple’s own RISC architecture.
M1s through M3s run ARMv8-A instructions. They’re ARM chips.
What you might be thinking of is that Apple has an architectural license, that is, they are allowed to design their own logic to implement the ARM instruction set, not just permission to etch existing designs into silicon. Qualcomm, NVidia, Samsung, AMD and Intel all hold such a license. How much use they actually make of that is a different question: AMD, for example, doesn’t currently ship any ARM designs of their own as far as I know, the platform processor that comes in every Ryzen etc. is a single “barely not a microprocessor” (Cortex-A5) core straight off ARM’s design shelves, and K12 never made it to the market.
You’re right about the future being RISC-V, though; ARM pretty much fucked themselves with that Qualcomm debacle. Android and Android apps by and large don’t care what architecture they run on, RISC-V already pretty much ate the microcontroller market (unless you need backward compatibility for some reason, heck, there’s still new Z80s getting etched) and Android devices are a real good spot to grow. Still going to take a hot while before RISC-V appears on the desktop proper, though – performance-wise, server loads will come first, and for machines people actually sit in front of, office thin clients will come first. Maybe, maybe, GPUs. That’d certainly be interesting, the GPU being simply vector cores with a slim insn extension for some specialised functionality.
Thanks for the clarification. I wonder if/when Microsoft is going to hop on the RISC train. They did a crap job trying their hand at an ARM version a few years back and gave up. A RISC Surface with a compatible Windows 13 and a proper binary translator (like Apple did with Rosetta) would shake the PC market up real good!
The whole “Apple products are great because they control both software and hardware” always made about as much sense to me as someone claiming “this product is secure because we invented our own secret encryption”.
Here’s an example of that: Apple needed to ship an x86_64 emulator for the transition, but emulation is slow and would thus make the new machines appear much slower than their older Intel-based ones. So what they did was come up with their own private instructions that an emulator needs to do its job quickly, and added them to the chip. Now most people don’t even know whether they’re running native or emulated programs, because the difference in performance is so minimal.
deleted by creator
Interesting, I thought they had ditched the ARM license completely, my mistake.
deleted by creator
The Apple M series is not ARM based.
Very confident it is ARM based.
The Mac Pro is a terrible deal even compared to their own Mac Studio. It has the same specs but costs almost $1000 extra. Yes, the cheese-grater aluminum case is cool, but $1000 cool?
Also, one of these days AMD or Intel will bolt 8GB on their CPUs too, and then they’ll squash M.
I can’t remember who it is, but somebody is already doing this. It’s primarily marketed as an AI training chip, though, so basically only Microsoft and Google are able to buy them; even if you had the money, there isn’t any stock left.
a toy for professional workloads
[rant]
I think this is one of those words which has lost its meaning in the personal computer world. What are people doing with computers these days? Every single technology reviewer is, well, a reviewer - a journalist. The heaviest workload that computer will ever see is Photoshop, and 98% of the time will be spent in word processing at 200 words per minute or on a web browser. A mid-level phone from 2016 can do pretty much all of that work without skipping a beat. That’s “professional” work these days.
The heavy loads Macs are benchmarked to lift are usually video processing. Which, don’t get me wrong, is compute intensive - but modern CPU designers have recognized that they can’t lift that load in general purpose registers, so all modern chips have secondary pipelines which are essentially embedded ASICs optimized for very specific tasks. Video codecs are now, effectively, hardcoded onto the chips. Phone chips running at <3W TDP are encoding 8K60 in realtime and the cheapest i series Intel x64 chips are transcoding a dozen 4K60 streams while the main CPU is idle 80% of the time.
Yes, I get bent out of shape a bit over the “professional” workload claims because I work in an engineering field. I run finite element models and, while sparse matrix solutions have gotten faster over the years, it’s still a CPU-intensive process and general (non-video) matrix operations aren’t really gaining all that much speed. Worse, I work in an industry with large, complex 2D files (PDFs with hundreds of 100MP images and overlain vector graphics) and the speed of rendering hasn’t appreciably changed in several years because there’s no pipeline optimization for it. People out there doing CFD and technical 3D modeling, as well as other general compute-intensive tasks on what we used to call “workstations”, are the professional applications which need real computational speed, and they’re/we’re just getting speed ratio improvements and the square root of the number of cores, when the software can even parallelize at all (there’s a toy sketch of that kind of workload right after this rant). All these manufacturers can miss me with the “professional” workloads of people surfing the web and doing word processing.
[/rant]
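For the curious, a toy stand-in for the kind of CPU-bound work I mean (assumes NumPy/SciPy are installed; it’s a sketch, not my actual workload):

```python
# Assemble a large sparse system (a 2D Laplacian, the classic sparse test
# problem) and solve it directly on the CPU. No video-codec ASIC helps here.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 300  # 300 x 300 grid -> 90,000 unknowns
main = 4.0 * np.ones(n * n)
off = -1.0 * np.ones(n * n - 1)
off[np.arange(1, n * n) % n == 0] = 0.0  # no coupling across grid-row boundaries
far = -1.0 * np.ones(n * n - n)
A = sp.diags([main, off, off, far, far], [0, -1, 1, -n, n], format="csc")

b = np.ones(n * n)
x = spla.spsolve(A, b)  # the sparse factorisation is where the CPU time goes
print(x[:3])
```

Nothing in that touches the media engines, which is the point: the hardware blocks that make the video benchmarks look heroic sit idle for this class of work.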
deleted by creator
Indeed! It makes the benchmarks that much more disingenuous since pros will end up CPU crunching. I find video production tedious (it’s a skill issue/PEBKAC, really) so I usually just let the GPU (nvenc) do it to save time. ;-)