The failure of npm for Visual Studio in the Enterprise

January 15, 2017

Modern application development is hard. There are simply so many things you have to think about when you are developing, and over time, more and more features are created and many of those need to be integrated into your applications.

This is not without cost. Early on in my career, we used to talk about DLL Hell: the problem of having so many versions of a DLL installed in your environment that you never knew which one an application was actually using.

The modern version of this I now call Package Hell. When I open a modern enterprise Visual Studio application, such as an Angular 2 application, I have a minimum of around 40 packages installed under the Dependencies/npm branch just to get a simple application up and running, and many of those packages have dependencies of their own. And that’s only one of the package managers. Other package managers available include nuget and Bower.

What is supposed to happen these days is that anything that is Microsoft Dot Net related may be found in nuget, while anything JavaScript related will be in npm. npm is the Node package manager. It is essentially an online repository for JavaScript packages, not just in the Microsoft world, but for any environment that wants access to those packages over the web. It enables developers to find, share, and reuse packages of code from hundreds of thousands of developers, and to assemble them in powerful new ways. Microsoft didn’t invent npm. Microsoft decided it was the tool everyone was using, that it did the job, and so decided to get on board. They needed to do this to keep up, to stay competitive in the development space.

To easily explain the problems with npm, I will compare this to nuget. Why? Because nuget works! Nuget Package Manager is simple, it is visual, it keeps you informed, it’s easy to find packages and keep them up to date, and it’s easy to change versions of packages if you need to. You don’t need to focus on the tooling – you can install packages and focus instead on integrating with your business logic and providing business value.

npm Problem 1: The proxy.

If you want npm to work in an enterprise environment, you will most likely have to go through a proxy server.

With nuget, you open up the package manager screen, type in package names, and it gives you a list of candidate packages. It automatically handles the proxy for you. You don’t have to configure it to work. You don’t have to go to everyone’s machine, modify a configuration file, just to ensure that their login has the credentials to authenticate through the proxy to get to the nuget repository. It is automatic.

npm, in this regard, is a complete failure. In the environment I was in, we couldn’t even get that configuration right; even with all the correct settings, it still failed. The workaround is to install a third-party tool called cntlm, a service that opens a local port and automatically authenticates through the proxy. All you then have to do is point npm at that port. “Install what?” I hear you say. Yep, exactly. That’s a major fail in a large environment.
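For what it’s worth, once a local forwarder such as cntlm is running, pointing npm at it is a one-off configuration step. A minimal sketch, assuming cntlm is listening on its default local port 3128 (the port is an example, not a given):

```shell
# Assumes cntlm (or a similar local proxy forwarder) is already
# running and listening on 127.0.0.1:3128 -- adjust to your setup.
npm config set proxy http://127.0.0.1:3128
npm config set https-proxy http://127.0.0.1:3128

# Confirm the values npm will actually use:
npm config get proxy
npm config get https-proxy
```

These settings are written to the user’s .npmrc file, so in theory they could be rolled out centrally rather than hand-edited on every developer’s machine, but that is still work that nuget never asks of you.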

Note: you could also use Fiddler, but it’s the same issue. Developers shouldn’t have to spend time configuring or using third-party tools for something that should just work out of the box. It works for nuget. It needs to work for npm.

npm Problem 2: Finding new packages.

When using nuget, you type a keyword into the search bar, and you can see a list of packages come up. Most of the time, this is because you were googling and found a reference to a package that might solve a problem, or you might have come across some cool new feature and want to try it out. In the process of doing that, you might also discover other packages that do the job, because you can easily scroll down the list and see what else is on offer. Nuget makes discovery of new interesting packages easy.

But not with npm. Sure, you might find out about a package by googling, but the exploration in npm just isn’t there. In npm, discovery involves opening up the package.json file and typing a double quote, at which point you get your list of choices in a nine-item scrollable tooltip. It’s rubbish. Not to mention that some packages aren’t even discoverable. That’s right, you can’t actually find any package in the registry that starts with an “@” symbol, such as @angular, because there are special rules for scoped packages.
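For anyone who hits this, the workaround is to know the full scoped name in advance: scoped packages can still be inspected and installed directly, they just don’t show up in search. A small sketch (the package name is only an example):

```shell
# Scoped packages live under a namespace ("@scope/name") and, at the
# time of writing, do not appear in npm's search results. They can
# still be queried and installed by their full name:
npm view @angular/core version
npm install @angular/core --save
```

Which rather proves the point: discovery is replaced by already knowing the answer.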

npm Problem 3: Version control.

With nuget, when you open up the package manager, it looks up the list of installed packages and compares their version number with what’s available on the net. If one of the packages has been upgraded, it shows you in an Updates tab. You can then choose to upgrade if you want. It’s entirely up to you. But at least it has that feature.

With npm, on the other hand, you might have 40+ packages, but there’s nowhere near as much control. Compared to nuget, it really sucks. To tell npm that you want it to upgrade continually, you have to manage it in a configuration file, for example:

"jquery": "^2.2.1",

The caret (^) character tells npm that you are happy for it to install any compatible newer version it finds (in this case anything from 2.2.1 up to, but not including, 3.0.0). Um. Wrong. You should be the one to decide when you want upgrades. Part of the problem is finding out when something needs to be upgraded, and npm fails at that. The second problem is that not every upgrade is a success. In a corporate environment, you don’t upgrade a major package automatically, because it will break stuff and then your whole application is unusable. But you still want the option. You still want to know if a package upgrade is available, so in practice the npm way is to pin a particular version and lose that visibility. Never mind that you would have at least liked the option to upgrade. The whole concept is flawed.
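To be fair to npm, the caret is opt-in rather than compulsory: exact pins and narrower ranges are also supported, and there is a command that reports available upgrades without installing anything. A brief sketch (package names and versions are illustrative):

```shell
# Common version specifiers in package.json (illustrative):
#   "jquery": "2.2.1"    exact pin  - only 2.2.1 is ever installed
#   "jquery": "~2.2.1"   patch-only - >=2.2.1 and <2.3.0
#   "jquery": "^2.2.1"   minor      - >=2.2.1 and <3.0.0

# Make npm save exact versions by default instead of the caret:
npm config set save-exact true

# List installed packages that have newer versions available,
# without upgrading anything:
npm outdated
```

`npm outdated` only reports; nothing changes until you explicitly install a newer version, which is the behaviour a corporate environment wants. But it is a command you have to know about and remember to run, not an Updates tab staring you in the face.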

npm Problem 4: Configuration files and the command line.

Ok, so we have somehow reverted back to using the command line and fiddling with configuration files. It’s all very 1990s. I mean seriously, who has an ego massive enough to require people to fiddle with JSON configuration files? Is there some hugely nerdy boffin who still believes they are better than everyone else because they can memorise a bunch of command line arguments?

This is the 21st century. I want my people focusing on business logic and producing business value, not working out the correct command they need to type to get some package installed on their machine. Not when a visual tool will provide everything they need to get on with their core function, which is to provide business value.

Those are the four major failures, but now for a couple of quirks.

npm: The quirks.

Firstly, when I do an npm restore packages, it’s often quite difficult to figure out what’s going on or whether it has finished its work. The user interface is still interactive too, and you can right-click and install individual npm packages and click restore even though a global restore is in progress. Huh?

Secondly, my Dependencies folder is almost permanently set to “Dependencies – not installed”, even though all my packages are installed. What is the point of showing this if the message isn’t helpful? It makes people lose confidence in the tooling.

In our environment, like most corporate environments, introducing new technologies can be quite difficult. It’s a typical catch-22 situation. You can’t introduce a new technology until it’s proven, but on the other hand, you can’t prove it until you’re allowed to introduce it. It’s why so many corporates bypass the architecture teams and build a silo to enable innovation and gain a competitive advantage. It becomes even harder when the tools are problematic.

I was able to get an application up and running within the corporate environment. It was an Angular 2 application running on Dot Net Core with webpack. Because of my skill level, I could get it going, but to expect others with less experience to fight configuration files, do stuff from the command line, configure the proxy, and install third-party tools just to be able to start their job is ridiculous. “It’s all experience,” I hear you say. Well, no. I don’t buy it. It’s hard enough to move to new technologies without the added complexity of dealing with problems that should not exist.

The result was that after a week of the team fighting (mostly) npm and all the new technologies, we decided to fail early. The rest of the team was continually struggling with the development infrastructure, and it became a productivity killer. So we have gone back to our old and working development environment. The downside is that there are certain packages that aren’t available on nuget, such as Angular 2. But the upside is that everything else works.

I have to say, I’m disappointed. For all its supposed benefits, the new environment just felt half-baked. The impediment to getting a team running smoothly was just too high. For this to work, npm needs a user interface, and it needs to work automatically through the proxy, much like the far superior experience of nuget. This needs to be fixed for us to be able to move forward; otherwise, teams like mine will be just as happy to stay on an existing tech stack that runs smoothly and virtually without a hitch.

Edit: I have since found out that there is, indeed, a GUI for npm package management. The problem is that it is only available in Node.js applications and not standard ASP.NET applications. What’s also disappointing is that the GUI isn’t really very good. It certainly isn’t up to the standard of nuget; it feels very much like a hack.