@aly@mastodon.nz M3/M4 Pro machines are stonking fast, yes. Their engineering is so far beyond what anyone else is doing it's kind of absurd.
As far as lots of small files, sounds like you don't need to be syncing those across to Windows, and VSCode, if that's what you use, can connect into WSL2 to read and edit the project.
If you do end up jumping to macOS, PowerShell is available there.
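Rough sketch of the WSL2 route, assuming the VS Code WSL extension is installed and a made-up project path — run from inside the WSL shell:
  cd ~/projects/my-app   # files stay on the Linux filesystem, nothing synced to Windows
  code .                 # launches VS Code on Windows, connected back into WSL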
@aurynn@cloudisland.nz To be clear: I use PowerShell because I am on Windows... not because I like it!
.NET has such good cross platform support now (I've had a good time on linux) that if I was to do a new dev machine today I think I would go Mac, and keep Windows for the gaming.
@aly@mastodon.nz dunno if you know about Scoop or Chocolatey for Windows, they might solve your woes of installing things like bash or nodejs to the base machine.
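Roughly like this — package names from memory, so they may differ slightly:
  scoop install git nodejs-lts    # Scoop; git also brings along Git Bash
  choco install nodejs-lts        # Chocolatey equivalent, from an elevated shell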
@aurynn@cloudisland.nz Mm yes. I think the general point is that everything works or has a workaround... but workarounds are needed... vs it being a more works-by-default experience on non-Windows. It used to be that I was constrained by Visual Studio etc since that's what I was paid to do. I haven't opened that in years now though.
@aly@mastodon.nz macOS has the same constraint, you'll need Brew or MacPorts to get much of the same package management and installing other software experience.
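e.g. something like:
  brew install node    # Homebrew; MacPorts has an equivalent port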
@aurynn@cloudisland.nz The constraint I mean was that VS is Windows only (not VSCode). These days I have Rider.
Package management really does seem to be where linux is a mile ahead though.
@aly@mastodon.nz yeah, package management on Linux is pretty good.
However, an environment manager (I use mise) is basically necessary to ensure that version pinning of tools is possible, since distro packages are Not what people are going to actually want.
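A minimal sketch of what that looks like — versions purely illustrative:
  # .mise.toml, committed at the repo root so every dev pins the same tools
  [tools]
  node = "20.12.2"
  python = "3.12"

  # each developer then just runs:
  mise install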
@aurynn@cloudisland.nz Oh they can't help but add more layers, huh.
@aly@mastodon.nz well being able to have consistency between all the devs and production itself is impossible to maintain without some layer like this.
@aurynn@cloudisland.nz We don't really face this problem on the C# side, but then the project files build in the SDK versions etc. Obviously yes I'm familiar with nvm and its like.
One of the pros of having a more "overlord" type management of the ecosystem I guess.
Yes there are so many cons. You don't get choices, but on the other hand you don't get choices.
@aly@mastodon.nz Ensuring that everyone is on the same compiler version and has pulled in the same 3rd-party library versions and SDK point releases is all handled by something, though. Might just be VS that handles it, but something has to.
@aurynn@cloudisland.nz It is, yes, by the .NET stack. It has its package.json equivalent. There are not really significant choices like e.g. nvm/fnm, or npm/yarn/bun though. You just use NuGet, which is built in. Also the way it installs, you just get the latest dotnet on the system and let it work out which language version to wire up based on your per-project settings.
So yes it's handled, but most of the choices are made for you. I would say it's a more streamlined user experience, which naturally comes with restrictions. Backwards compat etc is generally very good. Until you step out of bounds...
Moving into other dev worlds can seem like chaos because there is choice, which is powerful.
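For anyone following along, the per-project wiring is roughly this — versions illustrative:
  global.json (pins which installed SDK the repo builds with):
    { "sdk": { "version": "8.0.100", "rollForward": "latestFeature" } }
  MyApp.csproj (each project declares its own target framework and language version):
    <TargetFramework>net8.0</TargetFramework>
    <LangVersion>12.0</LangVersion>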
@aly@mastodon.nz Just getting the latest dotnet does sound like living dangerously, to me
The environment managers are meant to lock down choice, so that unexpected changes don't happen
@aurynn@cloudisland.nz Haha, true, but the release cycle on that is rather more constrained and "professional". I'm not sure how to express how... boring... the .NET package ecosystem feels compared to node.
It has the weight of Microsoft on it and I think probably the better parts of Microsoft.
* "professional" is used here to mean enterprisey and slow and communicated. Whether you consider that a good thing or not is debatable.
@aly@mastodon.nz I like stable and knowable upgrade cycles, which is why I quite like version pinning.
But also, I dislike vendor-provided packages because I can't enforce a global-to-all-developers set of versions for things.
But also, that then requires I use something like Docker to contain everything for production.
It gets complicated.
@aurynn@cloudisland.nz dotnet still has its package lock for version pinning. That covers all of the MS-provided SDK packages etc. We do also publish in Docker on minimal Linux images.
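Roughly, for the curious — details from memory:
  in the .csproj:  <RestorePackagesWithLockFile>true</RestorePackagesWithLockFile>   (generates packages.lock.json)
  in CI:           dotnet restore --locked-mode   (fails the build if dependencies drift from the lock file)
  Dockerfile:      FROM mcr.microsoft.com/dotnet/aspnet:8.0-alpine   (one of the slimmed runtime images)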
@aly@mastodon.nz Pin all the things!
@aurynn@cloudisland.nz also, the .NET stack has been around in some form since 2002. The core libraries are really well baked and don't need that much change. The number of Dependabot alerts for tiny packages that come through on node vs C# is a little alarming.
@aurynn@cloudisland.nz I think I'd like to see node get more boring, and dotnet get less boring. A best of both worlds.
@aly@mastodon.nz Confusing computing for everyone?
@aurynn@cloudisland.nz I don't know, I've been a pro MS dev for so long and I don't much like MS so there's always tension.
@aly@mastodon.nz Fortunately, macOS will be different in new and exciting ways.
@aurynn@cloudisland.nz Oh I know, I've been there.
My honest preference would be a well specified tower PC at home running Linux, and not needing to travel for work.
@aly@mastodon.nz let me guess, a Linux workstation does not have corporate popularity?
@aurynn@cloudisland.nz It deeply does not.
- I do work on site occasionally so need to be portable and to be able to present at meetings etc.
- Compliance requires certain things to be installed (brick on demand etc) which are not compatible.
I think the first one should be solved with good remote access these days, but the second is a bit of a showstopper.
@aly@mastodon.nz To pick this back up, I learned about WinGet yesterday: a built-in package manager with community repos, which could install things like nodejs onto the root machine instead of trying to use WSL, and might alleviate some of your performance pains.
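Something like this, though the exact package IDs may differ:
  winget search nodejs
  winget install --id OpenJS.NodeJS.LTS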
@aurynn@cloudisland.nz Yeah, none of these has really "won" the battle to be THE Windows package manager unfortunately, but I should try them more.
I've learned a lot more about the makeup of the typical Windows laptop now too and it's just not the right platform for heavy dev workloads. I am surprised how heavy the node stack is though.
For the moment, the laptop manages better if I remove a monitor from it.
@aurynn@cloudisland.nz Which, IMHO, gives away how unsuitable it is.
@aly@mastodon.nz that's … not a good sign. Is it only an integrated graphics chip? Might need to force Windows to use the accelerated GPU instead, if there is one.
Sounds like it's just not a good laptop overall.
@aurynn@cloudisland.nz Haha... oh the sadness. Yes you can disable the integrated GPU and force it to use the NVidia one. The minor tradeoff is that this prevents you from using any external monitors.
"good" is always subjective depending on the context right? It's light, nice enough, has good battery life... it's great for taking to meetings.
I don't want my life optimized for going to meetings though.
@aly@mastodon.nz it prevents using the external display!? What? How? What? How!?
@aurynn@cloudisland.nz OK, I'm no expert here but, from what I understand...
Systems may have a mux, which allows routing of graphics signals between cards and ports. In that case you have options.
Lighter, more portable laptops are likely muxless. In this particular setup, with integrated Intel graphics and an NVidia Optimus discrete card, it is probably wired so that the discrete card passes its signals through the integrated card to display things. So when you disable the integrated one you lose access to the extra ports. This is cheap, light, and efficient.
Windows will choose for you which card to use for which task, and it uses the integrated graphics for basically everything unless something like a game engine specifically asks otherwise. This saves lots of power since it doesn't need to run the discrete card.
Again, great for going to meetings and taking notes, but my burning need is to compile so so many tiny JS packages.
@aurynn@cloudisland.nz I haven't done numbers, but running two 4K monitors definitely feels like it hurts the laptop's performance in a big way.
@aly@mastodon.nz Okay so I understand all that about a mux, and that makes sense.
But afaik you can force Windows to use the accelerated card for everything? Like there's a hardware GPU scheduling setting, and the power profile stuff as well? So this ... should be able to force Windows to not be quite so bad.
@aurynn@cloudisland.nz If you can tell me how to do that that would be great. I have tried, I promise.
@aurynn@cloudisland.nz It's kind of a deep rabbit hole, see e.g.
https://dchambers.github.io/articles/driving-multiple-monitors-on-an-optimus-laptop/
@aurynn@cloudisland.nz Additionally, this is a problem that money solves fairly easily. Alas...
@aly@mastodon.nz Money solves many things, alas...
Trying to find the setting I remember for you now. Wouldn't surprise me if it got removed and you need something like gpedit to bring it back...
@aly@mastodon.nz okay I'm feeling quite gaslit by Windows right now because I am dead certain I remember settings like forcing the high-performance GPU to be enabled at all times, and I can't find it. I also can't find this hardware-accelerated graphics scheduling setting that's supposed to exist in Windows.
I might need to use a second GPU to find this.
In any case, the high performance power plan is where I remember being able to force the good GPU?
Or the nvidia control panel?
@aurynn@cloudisland.nz "feeling gaslit by Windows" is familiar. Part of why I want off it. :/
@aly@mastodon.nz I do not blame you there. I'm sorry, that's very frustrating.