That’s fair. I’ve put it there as more of a possible use case rather than something you should be consistently doing.
Although iGPU can perform quite well when given a lot of RAM, afaik.
For games, modding can use a lot of RAM. It can even push past 32 GB, but rarely so.
Usually, you’d want 64 GB or more for things like video editing, 3D modeling, running simulations, LLMs, or virtual machines.
I work in IT as a PM, and you’re pretty close.
Modern technology is glued together, NOT random shit that somehow works.
Everything created has been built with a purpose, which is why it’s not random. However, the longer you go on, the more rigid the architecture becomes, so you start creating workarounds, because doing otherwise takes too much time you don’t have, since there are a dozen other, more important tasks at hand.
When you glue those solutions together, they work because they’ve been built to work for a specific use case. But everything also becomes more convoluted each time, so you really have to dig to fix something you didn’t account for.
Eventually it becomes so rigid and so convoluted that to fix some issues properly, you’d have to rebuild everything, starting from the architecture. And if you can’t make any more workarounds to satisfy the demand? You start all over again.
I think you’ll appreciate #ffcc66