Build a simple notification service with Node.js and MongoDB

27 June 2014

Kevin Williams

Cloud, Automation, and Test Architect for IBM Software



Do you ever find yourself trying to orchestrate the behavior of a growing set of disparate tools to create some larger system? In my case, our development team needs to assemble a continuous delivery pipeline and sequence their operation. One option is to employ a notification service that supports creating, signaling, and subscribing to events.

Such a service might be applied to a build system to announce the availability of a new build. Downstream components of the pipeline can then receive notifications and take action based on the presence of a new build. The action might include proactively provisioning a new test system and running a regression suite.

A notification service doesn't have to be exciting to be useful.

I developed this simple notification service using the Node.js runtime because it supports rapid development of HTTP servers with a REST-like API. I also chose MongoDB for the back end: its document orientation is well suited to rapid prototyping, and I didn't need strict support for ACID (atomicity, consistency, isolation, and durability) properties.

Architecture of the app

To see the notification service in action, click Run the app to retrieve a log of the five most recent event signals processed by the notification service. Most signals you'll see were generated by our product build system and are JSON formatted.

What you'll need to build a similar app



After you install Node and MongoDB, you'll be able to use Node's package manager (npm) to load the required dependencies. See the package.json file in the code you downloaded from DevOps Services for the specific versions of the modules and frameworks that enable this application.

Let's get started!

Step 1. Create the APIs


The API is REST-like in that resources are accessed and modified by using unique URLs that apply HTTP verbs: GET, PUT, POST, and DELETE. Any application with an HTTP messaging capability can access the service. This approach provides a clean interface for future, browser-based tools.
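As a sketch of this style (my illustration, not the service's actual code), the mapping from HTTP verb plus URL to a CRUD action can be modeled like this; the `route` function, in-memory `store`, and payloads are all stand-ins:

```javascript
// Sketch only: how HTTP verbs and resource URLs map to CRUD actions.
// The store and routing logic here are illustrative stand-ins.
const store = { 1: { title: 'build-complete' } };

function route(method, url) {
  const parts = url.split('/');          // '/events/1' -> ['', 'events', '1']
  const resource = parts[1], id = parts[2];
  if (resource !== 'events') return { status: 404 };
  if (method === 'GET' && id) return { status: 200, body: store[id] };
  if (method === 'GET') return { status: 200, body: Object.values(store) };
  if (method === 'POST') return { status: 201 };                       // create
  if (method === 'PUT' && id) return { status: 200 };                  // update
  if (method === 'DELETE' && id) { delete store[id]; return { status: 204 }; }
  return { status: 405 };
}

console.log(route('GET', '/events/1').body.title);  // -> build-complete
console.log(route('DELETE', '/events/1').status);   // -> 204
```

Because the interface is just verbs and URLs, any HTTP-capable client (curl, a browser tool, another service) can drive it without a client library.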

The primary resources managed by the service are events and subscriptions. An API is available for each of the create, read, update, and delete actions.

I based this API on the Express framework because it includes a robust set of features for web application development and because it's super easy to use if you follow the conventions. Take a look at the code from the main server.js module of the application that sets up routing for incoming requests directed at events:

console.log ('registering event routes with express');
app.get('/events', event.findAll);
app.get('/events/:id', event.findById);'/events', event.addEvent);
app.put('/events/:id', event.updateEvent);
app.delete('/events/:id', event.deleteEvent);

This same basic pattern is then repeated for subscriptions:

console.log ('registering subscription routes with express');
app.get('/subscriptions', sub.findAll);
app.get('/subscriptions/:id', sub.findById);'/subscriptions', sub.addSubscription);
app.put('/subscriptions/:id', sub.updateSubscription);
app.delete('/subscriptions/:id', sub.deleteSubscription);

In addition to these basic actions for events and subscriptions, there are two more important APIs:

  • An API to signal an event:'/signals', signal.processSignal);
  • An API to retrieve a log of recent signals:
    app.get('/signallog', signallog.findRecent);
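To make the signaling API concrete, a payload posted to /signals might look like the following. The field names (eventTitle, instancedata) come from the code later in this article; the values are illustrative:

```json
{
  "eventTitle": "product-build-complete",
  "instancedata": "Build 1234 is available for download"
}
```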

Step 2. Work with the back end


MongoDB provides a persistent store for the service. I work directly with back-end collections of documents rather than use an object mapper. The Node.js native driver provides all the needed capability.

The two primary resources, events and subscriptions, map directly to database collections. Access to them is straightforward. For example, a GET request to the events resource results in a call to event.findAll as defined in the previous code section. The findAll function is defined in the events.js module:

exports.findAll = function(req, res) {
   mongo.Db.connect(mongoUri, function (err, db) {
      db.collection('events', function(err, collection) {
         collection.find().toArray(function(err, items) {
            res.send(items);
            db.close();
         });
      });
   });
};

The findAll function simply connects to the database, obtains a handle to the events collection, returns all elements, and then closes the connection. All API requests are handled in this same direct way. In the following example, the delete event request is handled by the deleteEvent function from the events.js module:

exports.deleteEvent = function(req, res) {
   var id =;
   mongo.Db.connect(mongoUri, function (err, db) {
      db.collection('events', function(err, collection) {
         collection.remove({'_id': new BSON.ObjectID(id)}, {safe: true}, function(err, result) {
            res.send(result);
            db.close();
         });
      });
   });
};

As before, a connection is established and a collection handle is obtained. The document matching the provided ID is then removed, and the connection is closed.

Step 3. Send notifications


In addition to performing the create, read, update, and delete operations against the managed resources, this service also sends messages to appropriate subscription endpoints when events are signaled. This first version of the service provides email notifications. The next few code sections trace the entire process, end to end.

A signal is received through the signals route. You can see from the server.js module that we handle this route with the processSignal function in the signals module:'/signals', signal.processSignal);

The processSignal function does the work:
exports.processSignal = function(req, res) {
   var signal = req.body;
   console.log('Processing Signal: ' + JSON.stringify(signal));

   mongo.Db.connect(mongoUri, function (err, db) {
      db.collection('subscriptions', function(err, collection) {
         collection.find().toArray(function(err, items) {
            matches = _.filter(items, function(sub) {return sub.eventTitle == signal.eventTitle;});
            _.each(matches, function (sub) {processMatch(sub, signal);});
            res.send(signal);
            db.close();
         });
      });
   });
};

As before, a connection to the database is established and a handle is obtained for the associated collection. We then filter to get all subscriptions with a matching event title. Each element of this subset of matching subscriptions is then processed by the processMatch function.

Did you notice my use of _.filter and _.each? These are both from a very cool library called Underscore, which provides slick functional helpers that will feel familiar to users of languages such as Ruby.
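If you'd rather not pull in Underscore, the same matching step can be written with JavaScript's built-in array methods. This sketch uses made-up subscription data; only the field names (eventTitle, alertEndpoint, instancedata) match the service's code:

```javascript
// Sketch: the subscription-matching step using built-in Array methods
// instead of _.filter and _.each. The data below is illustrative.
var signal = { eventTitle: 'build-complete', instancedata: 'Build 42 ready' };
var subscriptions = [
  { eventTitle: 'build-complete',  alertEndpoint: '' },
  { eventTitle: 'deploy-complete', alertEndpoint: '' }
];

// Keep only subscriptions whose event title matches the incoming signal
var matches = subscriptions.filter(function (sub) {
  return sub.eventTitle === signal.eventTitle;
});

// Notify each matching subscriber
matches.forEach(function (sub) {
  console.log('notify: ' + sub.alertEndpoint);  // -> notify:
});
```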

The processMatch function simply assigns values appropriate for an email message and calls the mailer.sendMail function:

function processMatch(subscription, signal) {
   var opts = {
      from: 'Simple Notification Service',
      to: subscription.alertEndpoint,
      subject: subscription.eventTitle + ' happened at: ' + new Date(),
      body: signal.instancedata
   };
   // Send alert
   mailer.sendMail(opts);
}

The sendMail function is also straightforward because I use another library named NodeMailer. You can see that my sendMail function just makes use of a few NodeMailer capabilities to send the email notification. It basically initializes a transport object with authentication values, then specifies message content (subject, body, address, and so on), and initiates a send:

exports.sendMail = function (opts) {
   var mailOpts, smtpTransport;

   console.log('Creating Transport');

   smtpTransport = nodemailer.createTransport('SMTP', {
      service: 'Gmail',
      auth: {
         user: config.username,
         pass: config.password
      }
   });

   // mailing options
   mailOpts = {
      from: opts.from,
      replyTo: opts.from,
      subject: opts.subject,
      html: opts.body
   };

   console.log('mailOpts: ', mailOpts);

   console.log('Sending Mail');
   // Send mail
   smtpTransport.sendMail(mailOpts, function (error, response) {
      if (error) {
         console.log('Error sending mail: ' + error);
      } else {
         console.log('Message sent: ' + response.message);
      }
      console.log('Closing Transport');
      smtpTransport.close();
   });
};

Step 4. Test the app


I have covered the major functionality of the notification service. Next, I’ll quickly address automated testing.

Another slick framework in the Node.js ecosystem is Mocha for testing. Combining Mocha with add-ons such as supertest for HTTP goodies and should for assertions provides readable functional tests for my API. Here is a snippet from the eventtests.js module that verifies the readbyid Event API:

it('should verify revised News flash 6 event', function(done) {
   request(url)
      .get('/events/' + newsFlash6id)
      .end(function(err, res) {
         if (err) {
            throw err;
         }
         res.body.title.should.equal('News flash 6 - revised');
         done();
      });
});

This syntax is very readable, and I expect the tests will be relatively easy to maintain.

My automated functional regression test suite includes a test for each API, all placed in a test folder, a convention that Mocha recognizes. To run the test suite, issue the mocha command from the root of the project.
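If you also want the suite to run via npm test, the scripts section of package.json can point at Mocha. This is a common convention, not necessarily this project's exact package.json:

```json
{
  "scripts": {
    "test": "mocha"
  }
}
```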

As shown in the following figure, the output from the test suite is terse if there are no failures:

Screen capture of test output

My current process is to run this regression suite locally (and often) as I make changes. I always run it before pushing changes to source control. An instance of this service is running on the Bluemix platform, and if you’ve clicked Run the app, you have interacted with this instance and you see a log of recent event signals.

I essentially have two environments today: my local development system where I make changes and perform tests, and the production system on Bluemix. This arrangement works fine for now, but a next step is to create another hosted staging environment on Bluemix. A new staging environment can enable me to run the regression suite in a more production-like environment and open up options like automatic test and deployment into production after source changes. When I take that step, I’ll be sure to blog about it.

Consider building a similar, simple notification service to manage events across a broad set of tools. Give it a try and let us know how it goes!

