Scheduling Tasks in the MEAN stack

As I was still learning my way around the MEAN stack, I couldn’t wrap my mind around how to schedule future tasks (such as sending an email). That’s because MEAN seems so stateless and transactional. But recall that Express and Node.js are in the picture, and they are always running on the server side.

Given this, the solution turns out to be very simple. You can schedule tasks in Node using the node-schedule package. To install node-schedule, run the command:

npm install node-schedule

Then in ExpressJS you can set up tasks like so:

var schedule = require("node-schedule");

// Fire at 00:00:00 every day
var dailyRule = new schedule.RecurrenceRule();
dailyRule.second = 0;
dailyRule.minute = 0;
dailyRule.hour = 0;

var onceADay = schedule.scheduleJob(dailyRule, function() {
  // Do something here, e.g. send the scheduled email
});

I’ve found that it’s easy to put this in ExpressJS’s app.js (or wherever your server startup script is), since that runs when the MEAN application’s server side starts up.
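If you’d rather not pull in a dependency for a single daily job, the same effect can be approximated with a plain setTimeout. This is just a sketch; msUntilNextMidnight and runDailyTask are my own helper names, not part of node-schedule:

```javascript
// Milliseconds from `now` until the next local midnight.
function msUntilNextMidnight(now) {
  var next = new Date(now.getTime());
  next.setHours(24, 0, 0, 0); // start of the next day, local time
  return next.getTime() - now.getTime();
}

// Re-arm a timer for every midnight; roughly what the
// RecurrenceRule above does with hour/minute/second set to 0.
function runDailyTask(task) {
  setTimeout(function () {
    task();
    runDailyTask(task); // schedule the following midnight
  }, msUntilNextMidnight(new Date()));
}
```

node-schedule handles edge cases (restarts, cron-style rules, cancellation) that this sketch does not, which is why the package is still the better choice.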

MEAN application, getting all sockets during a request

In my MEAN application I had a need to get the list of all open websockets during a particular request. Turns out with socket.io this is easily doable if you can save a reference to the socketio variable during initialization.

For a typical socket.io init in Express, the app.js would contain the following code:

var app = express();
var server = require('http').createServer(app);
var socketio = require('socket.io')(server, {
    path: '/socket.io-client'
});
require('./config/socketio')(socketio);

So you need to save a reference to the “socketio” variable, so that it can later be retrieved from the request Express passes down and used to get all sockets. You can set it as an application-wide variable in app.js like so:

app.set('socketio', socketio);

Next, a typical Express function handling a request looks like so:

function(req, res) {
}

From your request (the “req” variable) you can get a reference to the Express app, which lets you retrieve the application-wide socketio variable you set, and from it the list of sockets. Like so:

function(req, res)  {
    var sockets = req.app.get('socketio').sockets.sockets;
    //...
}

And now you can loop through all the sockets like below, and do whatever with them you’d like:

for (var socketId in sockets) {
    var socket = sockets[socketId];
}

As easy as that!

Mongoose model hasOwnProperty() workaround

When using a Mongoose model, if you have a reference to a queried document, you’ll notice that you cannot call hasOwnProperty() to check if it has a certain property. For example, say you have a User model whose schema has a property called “verified”: if you call user.hasOwnProperty('verified'), it will not return true!

This is because the object you get back from a mongoose query is a mongoose document: its data lives in an internal object, and the schema properties are exposed through getters on the document’s prototype, so they are not “own” properties.
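A dependency-free sketch of why this happens: when a property is served by a getter on the prototype (as mongoose does), it is not an “own” property of the instance. FakeDoc below is my own stand-in, not mongoose:

```javascript
// Minimal stand-in for a mongoose document: the raw data lives in
// _doc, and 'verified' is exposed through a prototype getter.
function FakeDoc(data) { this._doc = data; }
Object.defineProperty(FakeDoc.prototype, 'verified', {
  get: function () { return this._doc.verified; }
});

var user = new FakeDoc({ verified: true });
user.verified;                   // true -- the getter works
user.hasOwnProperty('verified'); // false -- not an own property
```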

An easy and quick workaround is like so:

var userObject = user.toObject();
if(userObject.hasOwnProperty('verified'))
  //...

And now hasOwnProperty() will return true.

Mongoose save() to update value in array

This had me scratching my head for a while until I figured it out. Let’s say you have a Mongoose model with an array field in it. If you’ve got a reference to a document which you pulled using Mongoose, and you update a value in the array, then call document.save(), the array will not get updated!

Turns out you need to call the document.markModified(arrayName) function before calling document.save().

Here’s an example. Let’s say your model looks like this:

var mongoose = require('mongoose');
var Schema = mongoose.Schema;

var UserSchema = new Schema({
  //...
  emailSubscriptions: {
    type: Array,
    'default': ["all"]
  }
});
exports.model = mongoose.model('User', UserSchema);

So after you get a user and modify an array (in this case emailSubscriptions), you’ll need to mark it as modified before saving it:

User.findById(userId, function(err, user) {
  if(err) { /* handle the error */ }
  user.emailSubscriptions = ['none'];
  user.markModified('emailSubscriptions'); // without this, save() won't persist the change
  user.save(function(err, user) {
    //...
  });
});

socket.io: Getting all connected sockets in Node/Express

Sometimes you need a list of all connected sockets in your server side API. For example, you may want to loop through all the sockets and emit something when a certain event happens, such as a data model updating.

As of socket.io version 1.3, you have access to all sockets programmatically through “socketio.sockets.sockets”.

So, for example, if you needed all sockets in an Express controller running on node.js, you could access them this way.

In your server side app.js, socket.io is usually configured in express this way:

var app = express();
//...
var socketio = require('socket.io')(server, {
  //...
});

And then, you can save the reference to socketio for later using Express’s app.set() and app.get(). In app.js you would do:

app.set('socketio', socketio);

Then in an express controller, you can access socketio like so:

exports.whateverController = function(req, res, next) {
  var socketio = req.app.get('socketio');
  //...
};

And then you can loop through the sockets:

var sockets = socketio.sockets.sockets;
for(var socketId in sockets) {
  var socket = sockets[socketId]; // do whatever with each connected socket
  //...
}

Simple, right?!

Paginating documents/items in MEAN

If you’ve ever scrolled through the Facebook newsfeed, you’ve noticed that the topmost stories are the most recent ones, and as you scroll to the bottom, older ones get loaded over and over as you keep scrolling.

This feature is firstly kind of “cool”, and fits in perfectly in a single page application. It’s also pretty useful from a performance standpoint, since not all of the documents (items your page is displaying, such as Facebook news stories, classified ads, search results, etc.) need to be loaded up front all at once when the user first lands on the page.

Paginating your documents in a MEAN application can be accomplished fairly easily, though it isn’t necessarily obvious. So I thought I’d write about the process I took, and the code I wrote, to get it done.

Let’s start with the AngularJS side of things. I used the ngInfiniteScroll module (https://github.com/sroze/ngInfiniteScroll) to accomplish the continuous scrolling effect. It’s pretty simple to configure, so please read up on the documentation. Essentially it can just be wrapped around an Angular ng-repeat directive, and be configured with a function to call to fetch more documents when the bottom of the page is reached (ngInfiniteScroll does all the calculations internally). Here is an example of what it would look like for getting more “classifieds” from the database to add them to the view:

<div infinite-scroll="getMorePosted()" infinite-scroll-disabled="loadingClassifieds || noMoreClassifieds">
  <div ng-repeat="classified in postedClassifieds">
    <!-- render each classified here -->
  </div>
</div>

So in the example above, the getMorePosted() function in your controller is called whenever ngInfiniteScroll detects that the user is at the bottom of the page. Note that ngInfiniteScroll will most likely trigger right when the user lands on the page, unless you pre-load some documents in your controller. I elected to have getMorePosted() fetch both the initial set of documents and every successive set as well. Depending on how you set things up, this may or may not make a difference, but it did for me.

My getMorePosted() function in the controller looks like this (note: it uses a factory called Classified, defined later, to do the actual fetching of classifieds from the API, i.e. Express/MongoDB on the server side of MEAN):

$scope.initialLoadDone = false;
$scope.loadingClassifieds = false;
$scope.getMorePosted = function() {
  if($scope.loadingClassifieds) return;

  $scope.loadingClassifieds = true;

  if(!$scope.initialLoadDone) {
    Classified.getPosted(function (postedClassifieds) {
      $scope.postedClassifieds = postedClassifieds;
      $scope.loadingClassifieds = false;
      $scope.initialLoadDone = true;
    });
  } else {
    Classified.getMorePosted(function(err,numberOfClassifiedsGotten) {
      $scope.loadingClassifieds = false;
      if(numberOfClassifiedsGotten==0)
        $scope.noMoreClassifieds=true;
    });
  }
}

A couple things to note here. When the classifieds are being loaded, the $scope.loadingClassifieds flag is set to true. This stops ngInfiniteScroll from attempting to load even more classifieds when the bottom is reached, and it can also be used to show the user a message that loading is underway (in case it doesn’t happen near instantly due to a slow connection). Furthermore, getMorePosted() tracks, through the $scope.noMoreClassifieds flag, when the end has been reached (if ever, depending on how many thousands or millions of documents are in your database, and how far down the user scrolls). It does this by checking the number of documents returned; if that number is zero, the end of pagination has been reached.

This is how getPosted() and getMorePosted() look in the Classified factory:

app.factory('Classified', function Classified(ClassifiedResource, ...) {
      var postedClassifieds = [];
      var postedClassifiedsLoaded = false;
      //...
      getPosted: function(callback) {
          var cb = callback || angular.noop;
          if (postedClassifiedsLoaded) {
            //console.log("Sending already-loaded postedClassifieds");
            return cb(postedClassifieds);
          } else {
            return ClassifiedResource.Posted.query(
              function(_postedClassifieds) {
                //console.log("Loading postedClassifieds from webservice");
                postedClassifieds = _postedClassifieds;
                postedClassifiedsLoaded = true;
                return cb(postedClassifieds);
              },
              function(err) {
                return cb(err);
              }).$promise;
          }
        },
        getMorePosted: function(callback) {
          var cb = callback || angular.noop;
          if (!postedClassifiedsLoaded)
            cb();
          else {
            return ClassifiedResource.Posted.query({
                startTime: new Date(postedClassifieds[postedClassifieds.length - 1].posted).getTime()
              },
              function(_postedClassifieds) {
                //console.log("Loading more postedClassifieds from webservice, from before startTime="+postedClassifieds[postedClassifieds.length-1].posted);
                for (var i = 0; i < _postedClassifieds.length; i++)
                  replaceOrInsertInArray(postedClassifieds, _postedClassifieds[i], true);
                return cb(null, _postedClassifieds.length);
              },
              function(err) {
                return cb(err);
              }).$promise;
          }
        },
        //...

And this is how ClassifiedResource looks:

app.factory('ClassifiedResource', function ($resource) {
  return {
    Posted: $resource(
      '/api/classified/getPosted/:startTime',
      {
      },
      {
      }
    )
    //...
  };
});

So note that in my setup, the service loads and maintains the list of documents (postedClassifieds) in memory. getPosted() fetches the first set of documents, and returns the in-memory list if it has already been loaded. getMorePosted() is where the magic happens. It takes the timestamp of the last classified and transmits it to the API (server side, Express), which then loads the next “page” of documents (classifieds in this case) posted before that timestamp.

Before we continue on to examine the server side, it’s important to note that you’ll need a field to sort by in descending order (or ascending if you want the oldest documents up front). A timestamp value works great. A MongoDB ObjectId could work too, since those are incremental. It will depend on your data. In my case, a timestamp called “posted” was available in my data, and very consistent: documents could be removed from the past, but never inserted at a past timestamp (and even then, it wouldn’t be a huge problem). So that works just fine with this pagination approach.
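The core idea, independent of MongoDB, is keyset pagination: sort descending by the timestamp and take the next N documents strictly older than the last one you already have. A dependency-free sketch (nextPage is my own name, not the actual Mongoose query):

```javascript
// One "page" of keyset pagination: at most `limit` documents
// strictly older than startTime, newest first. A null startTime
// means "start from the newest document".
function nextPage(docs, startTime, limit) {
  return docs
    .filter(function (d) { return startTime == null || d.posted < startTime; })
    .sort(function (a, b) { return b.posted - a.posted; })
    .slice(0, limit);
}
```

Each request passes the posted value of the last document it received as the next startTime; that is exactly the contract between getMorePosted() and the server side API.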

Here is what the server side looks like in Express/NodeJS:

var Classified = require('./classified.model');
exports.getPosted = function(req, res) {
  var startTime = req.params.startTime ? req.params.startTime : null;

  var query = Classified.find(
    {posted: { $ne: null }}
  );
  query.sort('-posted -_id');
  query.limit(20);
  if(startTime)
    query.where({posted: {$lt: new Date(parseInt(startTime, 10))}});
  query.exec(function (err, classifieds) {
    if(err) { /* handle the error */ }
    return res.status(200).json(classifieds);
  });
};

Note that “Classified” is my model, which is queried using Mongoose. I limit the number of documents returned to 20, which works well for my application. The query is sorted in descending order by the “posted” field, which is a timestamp. You’ll notice a where clause which gets only the classifieds posted before the time sent in from the UI (“startTime”); working in conjunction with the sort and limit, it returns the next 20 classifieds from before “startTime”. Also note that I send the timestamp in milliseconds, which gives a nice clean number that can be passed from the UI down to the API.

And, that’s it!

Something I want to add: on the client side (in AngularJS), if you end up loading too many documents/items in your ng-repeat, application performance will greatly degrade. With ngInfiniteScroll, all items on the page are kept once they’re loaded, even if they’re not currently in the view. There’s another module, https://github.com/angular-ui/ui-scroll, which will destroy and re-create items as they scroll in and out of the user’s view. This will vastly improve performance when a lot of documents are loaded.

Giving the user a message before resizing images through ng-file-upload

This had been bothering me for a while until I stumbled upon an answer that led me to a solution.

With ng-file-upload (https://github.com/danialfarid/ng-file-upload) for AngularJS, you have the capability to resize images using the ngf-resize directive. It’s very handy, since it offloads the CPU burden of resizing giant images onto the user’s browser, rather than putting it on your own server (as you would if you resized the images after they were uploaded).

HOWEVER, the problem with ngf-resize is that when resizing starts, especially if the user selects multiple images at once, the user’s browser hangs while the images resize. The bigger the image, the longer it takes. This is irritating and also confusing, leaving the user to wonder what is going on. For the longest time I was trying to figure out how to give the user a message before the resizing actually starts.

I eventually stumbled upon the ngf-before-model-change directive, part of ng-file-upload. It lets you define a function that is called before the model changes (and the images start resizing). This is a perfect place to put up a message telling the user that their images are about to be resized, and to sit tight for the next few seconds.

Then the ngf-select directive can be used to define a function which is called AFTER the resizing is complete, and this is where you can remove the message to the user.

Full example follows like this. In your JavaScript side of things (in your AngularJS controller) you would do:

$scope.beforeResizingImages = function(images)
{
  blockUI.start($translate.instant('FORMATTING_IMAGES')+"...");
  $scope.$apply();
}

$scope.afterResizingImages = function(images)
{
  blockUI.stop();
}

And then in HTML:

<div ngf-before-model-change="beforeResizingImages($files)" ngf-select="afterResizingImages($files)"></div>

And that’s it! beforeResizingImages() and afterResizingImages() will be called in the correct order, putting the message up before resizing images (and before the browser hangs for a few seconds for the CPU intensive process), and taking it off after resizing. Note that I use angular-block-ui (https://github.com/McNull/angular-block-ui) to block the UI and put the message up, and of course angular-translate to translate the text for the block message.

Simple AngularJS directive

So I found many articles on using AngularJS directives, which go into great detail for very complex directives. I couldn’t find any on just a relatively simple directive, so I thought I’d write about it.

First of all, an AngularJS directive defines a reusable HTML tag that you can use in your HTML files. It’s very useful if you want to create some complicated custom tags, or even just simple ones, that you can reuse everywhere.

In my case, I just wanted to give my users a message asking them to add the system email to their spam filter allow list, so no system-generated emails get blocked. I wanted to show this everywhere an operation involved a system-generated email to the user. I wanted to style the message in a particular way and translate it according to the user’s preferences, and I didn’t want to copy/paste it into those several places: if I ever wanted to change the styling or wording, I’d have to do it in many places. This is where an AngularJS directive becomes useful. I can define the directive in one place, and use it in many places.

This is how the directive is defined in JavaScript:

.directive('emailSpamMessage', function() {
  return {
    template: '<span class="label label-danger">{{"NOTE" | translate | uppercase}}</span> {{"SPAM_MESSAGE" | translate:{HOSTING_NAME:CONSTANTS.HOSTING_NAME,SYSTEM_EMAIL:CONSTANTS.SYSTEM_EMAIL} }}'
  };
})

And this is how it would look in HTML:

<email-spam-message></email-spam-message>

And that’s it! Note that the directive in JavaScript is defined as emailSpamMessage, but in HTML as “email-spam-message”. AngularJS does the conversion for you.
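The normalization AngularJS performs on directive names can be pictured with a small helper (a rough sketch of the idea, not Angular's actual implementation):

```javascript
// 'emailSpamMessage' -> 'email-spam-message'
function toDashCase(name) {
  return name.replace(/[A-Z]/g, function (letter) {
    return '-' + letter.toLowerCase();
  });
}
```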

Also note that the controller associated with the HTML containing the custom directive needs to have the correct includes for everything to work fully. For my particular example, the controller needs $translate (angular-translate) and a CONSTANTS variable available.

Limiting upload of number of files in ng-file-upload

ng-file-upload is a wonderful module to help you manage uploading files through AngularJS. Head on over to https://github.com/danialfarid/ng-file-upload to check it out.

One thing that isn’t obvious is how to limit the number of files that a user can upload. (This of course only applies if you are allowing multiple file uploads).

One way to limit the number of files a user can upload is through the “ngf-validate-fn” angular directive. This directive can be used to call a custom function defined in your controller that validates the file.

In this custom validation function, you can check the number of files that already exist in the files model, and return true (meaning validation passed and the file should be allowed) or false or an error name (meaning validation failed because the maximum number of files has been reached).

Let’s say you want to limit the maximum number of files uploaded to 10. It would look like this in your html:

<div ngf-select ngf-multiple="true" ng-model="files" ngf-validate-fn="validateFile($file)"></div>

And in your controller:

$scope.validateFile = function(file)
{
  if($scope.files.length>=10)
    return "TOO_MANY_FILES";
  return true;
}

And that’ll do it.

HOWEVER, big caveat: this will only work if the user selects one file at a time. If the user selects multiple files all at once (from the file-selection dialog box their browser presents), then this limitation trick will not work, and more files will get through. I am currently either going to implement this feature as a native directive in the ng-file-upload module myself, or wait until someone else implements it; I’ve posted it as an enhancement request on the module’s github page.
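One possible stopgap until that lands, assuming you are fine with silently dropping the extras, is to truncate the model yourself after selection (capFiles is my own helper, not part of ng-file-upload):

```javascript
// Keep at most `max` files, dropping anything selected beyond that.
function capFiles(files, max) {
  if (!files) return [];
  return files.length > max ? files.slice(0, max) : files;
}

// In the controller's ngf-select callback you could then do:
// $scope.files = capFiles($scope.files, 10);
```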

MEAN stack: associating a socket with a user

I’m using the MEAN stack for an application I’m working on. The project was seeded using the Angular fullstack yeoman generator (https://github.com/DaftMonk/generator-angular-fullstack/).

Out of the box the project has support for websockets (using socket.io) and users (using passportjs). However, sockets on the server side (in express running on node) are not tied to users out of the box.

For several reasons the application likely needs to know what user a socket belongs to. For example, if there’s a change made to a model that needs to be emitted, you may need to emit it to only users with a certain role.

To get around this, I made a bunch of modifications which I’ll detail below. Essentially, the user object will get saved within the socket object. So when a socket is being processed, say through a model level trigger (i.e. “save” or “delete”) using mongoose for example, the user object will be in the socket and can be used in whatever processing logic.

The MEAN project seeded from the angular fullstack generator uses a token generated through jwt, stored in a cookie, to authenticate a user. So when a user login occurs, an event can be emitted over the socket with the jwt token to register the user with the socket. Furthermore, in your socketio.on('connection',…) function in express, you can read the cookie to get the jwt token, then fetch the user and put it in the socket. This is essential: if a user who is already logged in returns to your web application (or opens it in a new tab) and a new websocket is created, the cookie can be used to associate the socket with the user, since no new login event will be emitted at that point.

First, let’s define a function that can take a token either directly as a parameter, or read it from the cookie in a socket, and get the user. This same function can be called from a login emit event with a jwt token as the payload over the socket, or from socketio.on(‘connection’,…).

var auth = require('../auth/auth.service');
function setupUserInSocket(socket, inputToken)
{
  var tokenToCheck = inputToken;
  if(!tokenToCheck && socket && socket.handshake && socket.handshake.headers && socket.handshake.headers.cookie)
  {
    socket.handshake.headers.cookie.split(';').forEach(function(x) {
      var arr = x.split('=');
      if(arr[0] && arr[0].trim()=='token') {
        tokenToCheck = arr[1];
      }
    });
  }
  if(tokenToCheck)
  {
    auth.getUserFromToken(tokenToCheck, function (err, user) {
      if(user) {
        console.info('[%s] socket belongs to %s (%s)', socket.address, user.email, user._id);
        socket.user = user;
      }
    });
  }
}

Note that the cookie is in socket.handshake.headers.cookie. Also note that I call auth.getUserFromToken, which is another function I created that decrypts the user ID from the jwt token, queries the user from the model, and returns it. The function looks like this:

var jwt = require('jsonwebtoken');
var User = require('../api/user/user.model');
var async = require('async');
var config = require('../config/environment');
function getUserFromToken(token, next)
{
  async.waterfall
  (
    [
      function(callback)
      {
        jwt.verify(token, config.secrets.session, function(err, decoded) {
          callback(err, decoded);
        });
      },
      function(decoded, callback)
      {
        if(!decoded || !decoded._id)
          callback(null, null);
        else {
          User.findById(decoded._id, function (err, user) {
            callback(null, user);
          });
        }
      },
    ],
    function(err, user)
    {
      next(err, user);
    }
  );
}

Next, let’s use socketio.on(‘connection’,…) to call the function with the socket. If the jwt token is already in the cookies, meaning the user already logged in previously, the user will be associated with the socket:

socketio.on('connection', function (socket) {
  setupUserInSocket(socket);
  //...
});

And that’s it for that particular scenario! Next, let’s worry about when a user actually logs in. Within socketio.on(‘connection’, …) we can listen for login emits from the client over the socket like so:

socket.on("login", function(token,next) {
  setupUserInSocket(socket,token);
  next({data: "registered"});
});

And on the client side, we emit the login event over the socket when a successful login occurs. This can be done in a number of ways, but I decided to do it in login.controller.js. After Auth.login() is called, I call socket.login():

angular.module('classActApp')
  .controller('LoginCtrl', function ($scope, Auth, socket, ...) {
    //...
        Auth.login({
          email: $scope.user.email,
          password: $scope.user.password
        })
        .then( function() {
                  //...
                  socket.login();
//...

And in the client side socket.service.js, the login() function does the following:

angular.module('classActApp')
  .factory('socket', function(socketFactory, $location, CONSTANTS, Auth) {
    //...
    return {
      //...
      login: function () {
        socket.emit("login", Auth.getToken(), function(data) {
        });
      },

Note that you also need to worry about logouts. If the user logs out from your web application, but sticks around on your web application, the socket for that session will remain associated to the user they were logged in as previously. This could be undesirable for several reasons. So in your express side of things (server side), you want to listen for logout events and clear out the user from the socket like so (note, this is added within socketio.on(‘connection’,…)):

socket.on("logout", function(next) {
  if(socket && socket.user) {
    console.info('[%s] socket being disassociated from %s (%s)', socket.address, socket.user.email, socket.user._id);
    socket.user = null;
  }
  next({data: "un-registered"});
});

And in your angular side of things (client side), when the user logs out, you want to emit a logout event over the socket. In the angular fullstack seeded project I’m using, this happens in navbar.controller.js which has the logout function.

angular.module('classActApp')
  .controller('NavbarCtrl', function ($scope, $location, socket, ...) {
    //...
    $scope.logout = function() {
      //...
            socket.logout();

And in socket.service.js:

angular.module('classActApp')
  .factory('socket', function(socketFactory, $location, CONSTANTS, Auth) {
    //...
    return {
      //...
      logout: function () {
        socket.emit("logout", function(data) {
        });
      },

And that’s it! Now all your sockets have the user they belong to (if any) associated with them, accessible from socket.user on the server side. When emitting events from the server side over a socket, or reading events emitted from the client side, we now know which user the socket belongs to!
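With socket.user in place, the role-based emit mentioned earlier becomes a simple filter over socketio.sockets.sockets. A sketch with plain objects standing in for sockets (emitToRole is my own helper name, and I'm assuming your user documents carry a role field):

```javascript
// Emit an event only to sockets whose associated user has the given
// role; returns how many sockets were emitted to.
function emitToRole(sockets, role, event, payload) {
  var count = 0;
  for (var socketId in sockets) {
    var socket = sockets[socketId];
    if (socket.user && socket.user.role === role) {
      socket.emit(event, payload);
      count++;
    }
  }
  return count;
}
```

This could be called, for example, from a mongoose "save" trigger to notify only admins of a model change.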