March 20, 2017 By Paul Webster 4 min read

Integrate your Nexus-hosted npm registry into your Node.js toolchain

By now, you probably know that with toolchains in IBM® Bluemix® Continuous Delivery, you can build a Node.js app in the Delivery Pipeline and push the app to Bluemix. But did you know that you can integrate a Nexus server with Delivery Pipeline by using a toolchain? You might use a Nexus server to manage node modules that only your apps consume and that don't need to be pushed to the global npm registry. You can use Nexus to both store and provide node modules during your Delivery Pipeline builds.

For example, you might have a GitHub repo with an echo node module and a GitHub repo with an express app that consumes your node module. With Nexus, you can keep the builds of those node modules separate.

The process to set up the Nexus tool integration has two main steps:

  1. Add Nexus to your toolchain.

  2. Add an npm build job to your Delivery Pipeline.

Note: You must run your own Nexus server in order to integrate it with toolchains.

Adding Nexus to your toolchain

From your toolchain’s Overview page, click Add a Tool. On the Nexus configuration page, enter these details:

  • A name for the integration, such as npm-nexus

  • The console URL for your Nexus server

  • The repository type, such as npm registry

  • An email address

  • Your npm registry authentication token, which is the base64 encoding of your Nexus user ID and password; for example, you can generate it with

    echo -n nexus_user_id:nexus_password | base64 -w0 -

  • The URL of your private npm registry

  • The URL of your public npm registry

For more information about the Nexus server setup, see Nexus server configuration details.

Adding an npm build job to your pipeline

Add a build job to your pipeline and for the builder type, select NPM Build. For the tool integration type, make sure that Nexus is selected. If you use the default selection for the tool instance name, the npm build job will retrieve its configuration information from the only Nexus integration in your toolchain.

Building your npm module

The npm build job preconfigures your .npmrc file with the public registry and authentication information that you provided in your Nexus integration. An npm install or npm run build will use your Nexus server to retrieve required modules.
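
As an illustration only, the preconfigured .npmrc might look something like the following; the registry URL and token here are placeholders, not values that the build job actually writes:

    ; Sketch of a generated .npmrc (all values are placeholders)
    registry=https://nexus.example.com/repository/npm-group/
    email=you@example.com
    _auth=<base64 of nexus_user_id:nexus_password>
    always-auth=true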

The npm build job also provides environment variables that you can use when you run your build command (a sample build script follows this list):

  • NPM_NAME: The name of the service instance

  • NPM_USER_ID: The email address for the npm registry

  • NPM_TOKEN: The token or password for the npm registry

  • NPM_RELEASE_URL: The private npm registry

  • NPM_MIRROR_URL: The proxy npm registry, which is used as the default npm registry

  • NPM_MODULE_NAME: The module name of the package.json file that is at the root of the workspace, if available
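
As a sketch, a build script in the npm build job could use these variables like this; the echo statements are just for illustration:

    #!/bin/bash
    # NPM_MODULE_NAME and NPM_MIRROR_URL are provided by the npm build job.
    echo "Building module: $NPM_MODULE_NAME"
    echo "Resolving dependencies through the proxy registry: $NPM_MIRROR_URL"

    # Dependencies are resolved through the preconfigured .npmrc.
    npm install
    npm run build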

Publishing your npm module

To publish your module to your private registry, create another build job that uses NPM Build as the builder type. You must provide extra information to the publish command line because the .npmrc registry points to your public registry. To publish the module to your private registry, use the NPM_RELEASE_URL environment variable.
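
For example, a minimal publish script might pass the private registry explicitly on the command line; the --registry option overrides the registry that is set in the .npmrc for this one command:

    #!/bin/bash
    # Publish to the hosted (private) registry instead of the public one.
    npm publish --registry "$NPM_RELEASE_URL"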

Support Continuous Delivery in your pipeline by using Nexus

Unlike Maven, an npm registry has no native concept of snapshots. Snapshots are pre-release versions of modules that can be repeatedly pushed to a repository. To work with a pipeline that builds the module and publishes it, each commit would need to increase the module version. But with a little help, NPM Build can support snapshots of a node module that can be used with another module or app without relying on a developer to update the module version on every commit.

Configure your module version in the package.json file to a pre-release version; for example, 1.0.6-SNAPSHOT.0. The -SNAPSHOT.0 suffix represents a pre-release version of 1.0.6. For details about node module versions, see the semantic versioner for npm.
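
For instance, the version field in the package.json of a hypothetical echo module might look like this:

    {
      "name": "my-echo-module",
      "version": "1.0.6-SNAPSHOT.0"
    }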

In your publish build job, select the Increment snapshot module version check box.

Before you run your npm publish command, the NPM Build job selects the highest node module version from the package.json file and your NPM_MIRROR_URL registry. Then, the NPM Build job increments that version by using the semantics that are defined by npm semver. For example, 1.0.6-SNAPSHOT.5 becomes 1.0.6-SNAPSHOT.6 and is written into the local package.json file. The new version is used during the npm publish, but the NPM Build job does not deliver this change to the SCM repository.
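
If you want to verify the increment semantics yourself, the semver package applies the same rule; this quick check assumes that the semver module is installed in your workspace:

    # Assumes the semver package is available (npm install semver).
    node -e "console.log(require('semver').inc('1.0.6-SNAPSHOT.5', 'prerelease'))"
    # Prints: 1.0.6-SNAPSHOT.6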

The app that depends on the module sets the dependency version to ^1.0.6-SNAPSHOT.0 to always pick up the latest snapshot of that module.
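
In the consuming app's package.json, that dependency might be declared along these lines (the module name is hypothetical):

    {
      "dependencies": {
        "my-echo-module": "^1.0.6-SNAPSHOT.0"
      }
    }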

Nexus server configuration details

Storing your node modules in a private registry during your builds is a common use case. In Nexus, you can support this use case by creating three registries:

  1. A hosted npm registry to store your private modules

  2. A proxy npm registry, which caches modules that are downloaded from the global registry, https://registry.npmjs.org/

  3. A group npm registry, which acts as an aggregator for the other two registries

You also need your npm registry credentials. The user ID is your email address, which doesn't need to be known to the Nexus server, and your auth token is the base64 encoding of your Nexus user ID and password. For example, you can generate the token with

    echo -n nexus_user_id:nexus_password | base64 -w0 -

Try it

You can check out an example of a toolchain that builds two node modules by clicking the Create Toolchain button. You'll need to enter your Nexus server configuration information. Also, be sure to select a reasonable org and space for Delivery Pipeline.

The resulting toolchain includes two GitHub repos, a Nexus integration, and two Delivery Pipelines, one for the module and one for the app.
