@aurynn@cloudisland.nz Yeah, none of these has really "won" the battle to be THE Windows package manager unfortunately, but I should try them more.
I've learned a lot more about the makeup of the typical Windows laptop now too and it's just not the right platform for heavy dev workloads. I am surprised how heavy the node stack is though.
For the moment, the laptop manages better if I remove a monitor from it.
@aurynn@cloudisland.nz Which, IMHO, gives away how unsuitable it is.
@aly@mastodon.nz that's … not a good sign. Is it only an integrated graphics chip? Might need to force Windows to use the accelerated GPU instead, if there is one.
Sounds like it's just not a good laptop overall.
@aurynn@cloudisland.nz Haha... oh the sadness. Yes, you can disable the integrated GPU and force it to use the NVIDIA one. The minor tradeoff is that this prevents you from using any external monitors.
"good" is always subjective depending on the context right? It's light, nice enough, has good battery life... it's great for taking to meetings.
I don't want my life optimized for going to meetings though.
@aly@mastodon.nz it prevents using the external display!? What? How? What? How!?
@aurynn@cloudisland.nz OK, I'm no expert here but, from what I understand...
Systems may have a mux, which allows routing of graphics signals between cards and ports. In that case you have options.
Lighter, more portable laptops are likely muxless. In this particular setup, with integrated Intel graphics and an NVIDIA Optimus discrete card, it's probably wired so that the discrete card passes its signals through the integrated card to display things. So when you disable the integrated GPU you lose access to the extra ports. This keeps things cheap, light, and efficient.
Windows will choose for you which card to use for which task, and it uses the integrated graphics for basically everything unless something like a game engine specifically asks otherwise. This saves lots of power since it doesn't need to run the discrete card.
Again, great for going to meetings and taking notes, but my burning need is to compile so so many tiny JS packages.
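(For what it's worth, the per-app choice Windows makes can be overridden per executable. This is a sketch using the registry key behind Settings > System > Display > Graphics; the node.exe path is just an example, substitute whatever binary you actually want on the discrete card.)

```shell
:: Per-app GPU preference lives under this key. The value name is the full
:: path to the executable; "GpuPreference=2;" means high performance (the
:: discrete GPU), 1 means power saving, 0 means let Windows decide.
:: The node.exe path here is an assumption -- use your own install location.
reg add "HKCU\Software\Microsoft\DirectX\UserGpuPreferences" ^
    /v "C:\Program Files\nodejs\node.exe" /t REG_SZ /d "GpuPreference=2;" /f
```

Note this only steers which GPU renders for that app; on a muxless laptop the display output still goes through the integrated card either way.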
@aurynn@cloudisland.nz I haven't done the numbers, but running two 4K monitors definitely feels like it hurts the laptop's performance in a big way.
@aly@mastodon.nz Okay so I understand all that about a mux, and that makes sense.
But afaik you can force Windows to use the accelerated card for everything? Like there's a hardware GPU scheduling setting, and the power profile stuff as well? So this... should be able to force Windows to not be quite so bad.
@aurynn@cloudisland.nz If you can tell me how to do that that would be great. I have tried, I promise.
@aurynn@cloudisland.nz It's kind of a deep rabbit hole, see e.g.
https://dchambers.github.io/articles/driving-multiple-monitors-on-an-optimus-laptop/
@aurynn@cloudisland.nz Additionally, this is a problem that money solves fairly easily. Alas...
@aly@mastodon.nz Money solves many things, alas...
Trying to find the setting I remember for you now. Wouldn't surprise me if it got removed and you need something like gpedit to bring it back...
@aly@mastodon.nz okay I'm feeling quite gaslit by Windows right now because I am dead certain I remember settings like forcing the high-performance GPU to be enabled at all times, and I can't find it. I also can't find this hardware-accelerated graphics scheduling setting that's supposed to exist in Windows.
I might need to use a second GPU to find this.
In any case, the high performance power plan is where I remember being able to force the good GPU?
Or the nvidia control panel?
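(The hardware-accelerated GPU scheduling toggle does also exist as a registry value when the Settings UI hides it. A sketch; it needs an elevated prompt, a GPU/driver combination that supports HAGS, and a reboot to take effect.)

```shell
:: Hardware-accelerated GPU scheduling ("HAGS"): HwSchMode = 2 enables it,
:: 1 disables it. Requires admin rights, a supported GPU/driver, and a reboot.
reg add "HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers" ^
    /v HwSchMode /t REG_DWORD /d 2 /f
```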
@aurynn@cloudisland.nz "feeling gaslit by Windows" is familiar. Part of why I want off it. :/
@aly@mastodon.nz I do not blame you there. I'm sorry, that's very frustrating.