This framework sets up an organized NodeJS project in an easier and 'classier' manner. Since it is used internally, we added our own way of standardizing the code structure.
This framework wraps around ExpressJS with the following additions:
- App class - A big 'motherboard'
- A router
- some default middlewares
- Dockerfile
- project folder structure
This framework has an App class which acts as the 'motherboard' of the whole application. It controls the phases of the app in the following sequence:
- load configurations from the /config folder
- load framework-related models and export them to a context (defaults to global)
- import the services folder and export it to the context
- import the models folder and export it to the context
- import the viewModels folder and export it as context.ViewModels
- import the controllers folder and export it to the context
- connect to dependent services (like mongo, redis)
- start the service (defaults to starting an express server)
- stop the service (defaults to stopping an express server)
- disconnect from dependent services
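The sequence above can be sketched as follows. This is NOT the real sl-express App class — just a minimal, self-contained illustration (with hypothetical method names based on the phase list above) of how a 'motherboard' runs its phases in order:

```javascript
// A minimal sketch of the phase sequence, NOT the real sl-express App class.
// Method names are illustrative, based on the phase list above.
class AppSketch {
  constructor() {
    this.phases = [] // records the order the phases ran in
  }
  prepare() {
    // load configs, models, services, viewModels, controllers...
    this.phases.push('prepare')
  }
  async connectDependencies() {
    // connect to dependent services (mongo, redis, ...)
    this.phases.push('connectDependencies')
  }
  async startService() {
    // start the express server by default
    this.phases.push('startService')
  }
  async start() {
    this.prepare()
    await this.connectDependencies()
    await this.startService()
  }
  async stopService() {
    // stop the express server by default
    this.phases.push('stopService')
  }
  async disconnectDependencies() {
    // disconnect from dependent services
    this.phases.push('disconnectDependencies')
  }
  async stop() {
    await this.stopService()
    await this.disconnectDependencies()
  }
}
```

Each phase can be customized by overriding the corresponding method, as shown in the customization sections of this document.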
The Router is a simple component of the framework intended to make routing easier. It takes in a big object, parses it, and calls the Express router to do the routing.
In most cases, the big routing object will be config/routes.js.
Middlewares can be called before and after a request. The name of a middleware is the same as its filename.
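As a sketch (the filename and logic here are made up, not part of the framework), a middleware is just an Express-style function; since a middleware's name follows its filename, this one would be referenced as `requestLogger` in the routing object:

```javascript
// api/middlewares/requestLogger.js (hypothetical example)
// An Express-style middleware: it runs around the controller action
// and must call next() to pass control on.
function requestLogger(req, res, next) {
  // log the incoming request before the controller handles it
  console.log(`${req.method} ${req.url}`)
  next()
}

module.exports = requestLogger
```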
The framework provides a Dockerfile for building a nodejs application image.
TBC
- Install NodeJS on your machine
- Install @shopline/sl-express on your machine: `npm i -g @shopline/sl-express`
- Clone this repo to your machine
- Go into the `example/basic` directory on your terminal and execute `npm install`
- Run `sl-express start` on your terminal
- Open your browser and go to http://localhost:3000
- You should also see a log on the terminal showing that the route has been requested
- Run `sl-express console` on your terminal and this will bring you into the express console
- Type `app.id` on your express console to see the time when you started your console
- Run `sl-express asyncConsole` on your terminal and this will bring you into the express console in async mode
- Type `app.id` on your express console to see the time when you started your console
`example/basic/Dockerfile` has already been configured.
- Install Homebrew on your machine
- Install Docker via Homebrew: `brew cask install docker`
- Go into the `example/basic` directory on your terminal and build a docker image with `docker build --tag=test-app .`
- Create a docker container with `docker run -p 3000:3000 test-app`
- Open your browser and go to http://localhost:3000
- Press `ctrl+c` to stop the container. You can check all the containers you have with `docker ps -a`
- Run `docker rm <CONTAINER_ID>` to remove the container
`example/basic/docker-compose` has already been configured.
- Go into the `example/basic` directory on your terminal and build the images with `docker-compose build`
- Create docker containers with `docker-compose up`
- Open your browser and go to http://localhost:3000
- Press `ctrl+c` to stop the containers, or run `docker-compose stop`
There are a few phases you can customize by overriding their methods. Always remember to call the super method.
This phase does the actual loading of the app. In most cases, you would do:
prepare() {
super.prepare()
/* your extra loading here */
}
This phase handles the connections to other services like mongo, redis and rabbitmq. By default, it connects to no services. Our suggestion would be:
async connectAwesomeService() {
/* connect... connect... connect... */
}
async disconnectAwesomeService() {
/* disconnect... disconnect... disconnect... */
}
async connectDependencies() {
await super.connectDependencies()
/* your other connections here */
await this.connectAwesomeService()
}
This phase handles the disconnection from other services like mongo, redis and rabbitmq. By default, it disconnects from no services. Our suggestion would be:
async connectAwesomeService() {
/* connect... connect... connect... */
}
async disconnectAwesomeService() {
/* disconnect... disconnect... disconnect... */
}
async disconnectDependencies() {
/* the best practice is to disconnect in the reverse order of the connections */
/* your other disconnections here */
await this.disconnectAwesomeService()
await super.disconnectDependencies()
}
The phase that actually starts the service. By default, it starts the express server. You can customize it by condition:
async startService() {
if (this.role === 'SERVER') { /* e.g. a role you define yourself */
await this.startExpress()
return
}
if (this.role === 'WORKER') {
/* start consuming queue */
return
}
}
The phase to stop the service. By default, it stops nothing. You can also customize it by condition:
async stopService() {
if (this.role === 'SERVER') { /* e.g. a role you define yourself */
await this.stopExpress()
return
}
if (this.role === 'WORKER') {
/* stop consuming queue */
return
}
}
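The `role` used above is not defined by the framework in this doc. One simple approach (an assumption, matching the APP_ROLE env var set in the docker-compose example of the queueTask section) is to derive it from the environment:

```javascript
// A sketch: derive the app role from the APP_ROLE env var
// (the same variable the docker-compose example sets).
// The default value is an assumption for illustration.
function resolveRole(env = process.env) {
  // fall back to SERVER when no role is given
  return env.APP_ROLE || 'SERVER'
}

module.exports = resolveRole
```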
To add a route, simply add a string to `routes: []`. It will be split by spaces.
The pattern is:
HTTP_METHOD URI middleware middleware Controller.action
Sometimes you may not want to insert middlewares one by one. In that case, you can use `preMiddlewares`. Please check: https://expressjs.com/en/guide/using-middleware.html
The pattern is:
REGEX middleware middleware
Please also reference: https://expressjs.com/en/guide/routing.html
module.exports = {
preMiddlewares: [
'* middleware middleware'
],
routes: [
'GET /index PublicController.index'
],
postMiddlewares: [],
}
This is how a controller should be added to the api/controllers directory.
In this example, config/routes.js can reference the index controller by PublicController.index.
class PublicController {
async index(req, res) {
return res.send('hello world')
}
}
module.exports = PublicController
First, you will need to create a class under the api/services directory.
api/services/AwesomeService.js
let _theLibYouUse = null;
let _sharedAwesomeService = null;
class AwesomeService {
/* A lazy-loading singleton. It ensures the lib is not required if the service is not used. It may seem a bit dirty to require the lib inside functions, but it makes this service able to move into the core framework some day.
*/
static get theLibYouUse() {
if (!_theLibYouUse) {
_theLibYouUse = require('theLibYouUse');
}
return _theLibYouUse;
}
/* A singleton. In most cases you will just need to init one service instance. It is still better to use a singleton pattern so that you can stub it easily when unit testing Model methods that make use of this service */
static get sharedAwesomeService() {
if (!_sharedAwesomeService) {
_sharedAwesomeService = new AwesomeService();
}
return _sharedAwesomeService;
}
/* As a singleton is used, it is hard to pass the config when initializing the service. That's why we use init instead of the constructor. Besides, we may not want to set the config or directly get the global config inside this class, because it's better to keep it with fewer dependencies. The config should be passed to the singleton in the motherboard */
init(config) {
this.endpoint = config.endpoint;
this.abc = config.abc;
}
}
module.exports = AwesomeService;
config/awesomeService.js
module.exports = {
endpoint: process.env.AWESOME_SERVICE_ENDPOINT
};
.env
/* all env-dependent variables should be put in the .env file */
AWESOME_SERVICE_ENDPOINT=http://awesomeservice.com/api
app.js
async connectAwesomeService() {
AwesomeService.sharedAwesomeService.init(this.config.awesomeService)
await AwesomeService.sharedAwesomeService.connect()
}
async disconnectAwesomeService() {
await AwesomeService.sharedAwesomeService.disconnect()
}
async connectDependencies() {
await super.connectDependencies()
await this.connectAwesomeService()
}
async disconnectDependencies() {
await this.disconnectAwesomeService()
await super.disconnectDependencies()
}
Most frameworks like to use a structure like:
- config/
- env/
- development.js
- production.js
- config1
- config2
And those frameworks will first gather config1 and config2, then override them with the specified environment config. This framework WON'T do this.
All environment-related config should be controlled by the .env file.
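For example (the file and variable names here are illustrative, not part of the framework), a config file just reads from process.env, and the per-environment values live only in the .env file:

```javascript
// config/cache.js (hypothetical example)
// Each config file reads its values from the environment,
// so switching environments only means switching the .env file,
// not maintaining per-environment config files.
module.exports = {
  host: process.env.CACHE_HOST,
  // parse numeric ENVs, with a default for local development
  ttlSec: parseInt(process.env.CACHE_TTL_SEC || '60', 10),
}
```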
This framework uses log4js wrapped in a plugin called logger. Things can be configured in config/logger.js.
Please config your config/app.js
{
plugins: ['logger']
}
There is no magic for configuring the Logger. Please visit: https://www.npmjs.com/package/log4js
In most cases, you just need to add categories like 'broadcast' or 'queueHandling', based on which features you want to log.
Besides, as we are using CloudWatch, we just append our logs to stdout at this moment.
- debug
- trace
- warn
- error
- info
- trace: In most cases we add trace logs everywhere, as we should be able to investigate problems in a black-box system in production
- warn: errors that are not exactly exceptional, but weird behaviour you want to keep track of
- error: every exception should be logged with an error log, no matter whether it breaks the process or not
- info: system-wide logs are assigned to the info level, like 'connected mongo'
must-have:
- logCategory: a category to group logs. In most cases it is designed by feature, like 'broadcast' or 'notificationMessage'.
- logLevel: one of the levels listed in the section above.
- obj: an object to be passed through JSON.stringify. WE HIGHLY RECOMMEND YOU ADD THE FOLLOWING: 1. action (a string describing the process), 2. traceId
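A sketch of what such a log line can look like. This helper is hypothetical, not the framework's logger API; it only illustrates the three must-have parameters and the recommended action/traceId fields:

```javascript
// Hypothetical helper illustrating the must-have log parameters.
// A real setup would route this through log4js categories instead of console.
function log(logCategory, logLevel, obj) {
  // obj should carry at least an action and a traceId
  const line = JSON.stringify({ category: logCategory, level: logLevel, ...obj })
  console.log(line)
  return line
}
```

A call like `log('broadcast', 'trace', { action: 'sendMessage', traceId: 'abc-123' })` then produces one stdout line that CloudWatch can pick up and that can be filtered by category, action, or traceId.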
There is a built-in model called MongooseModel. This model aims to:
- allow model declaration using class syntax instead of prototypes
- handle mixing the mongoose schema and the class via mongoose.model
- provide a more user-friendly way to use the mongoose pre and post hooks.
class AwesomeModel extends MongooseModel {
static schema() {
/* you can always access the mongoose library with this.mongoose */
return {
ownerId: { type: String, required: true }
};
}
static beforeSave(obj, next) {
//do something
return next();
}
}
module.exports = AwesomeModel;
Mongo should be connected when MongooseModel is used. There is a static getter in both MongooseModel and App: the one in MongooseModel returns the mongoose lib, and the one in App returns the mongoose in MongooseModel. Most of the time, they are actually the same.
There are built-in functions for connecting to mongo. All you need to do is add the ENVs to your .env file. The framework has a config/mongoose.js that maps a mongo endpoint to ENVs, so you just need to add these ENVs:
MONGODB_USER
MONGODB_PASS
MONGODB_HOST
MONGODB_PORT
MONGODB_DATABASE
Please config your config/app.js
{
plugins: ['mongoose']
}
Add the following to your docker-compose.yml
version: '3'
services:
# your app build
# ...
mongo:
image: 'mongo'
ports:
- '27017:27017' # configure your port
volumes:
- 'mongodb:/data/db'
# ...
# your other services (rabbitmq, redis)
volumes:
mongodb:
driver: local
By default, we have a config file in the framework mapping ENVs to the redis config:
REDIS_USER
REDIS_PASS
REDIS_HOST
REDIS_PORT
REDIS_DATABASE
REDIS_TIMEOUT_MS
Please config your config/app.js
{
plugins: ['redis']
}
Add the following to your docker-compose.yml
version: '3'
services:
# your app build
# ...
redis:
image: 'redis'
ports:
- '6379:6379' # configure your port
# ...
# your other services (rabbitmq, mongo)
By default, we have a config file in the framework mapping ENVs to the rabbitmq config:
RABBITMQ_USER
RABBITMQ_PASS
RABBITMQ_HOST
RABBITMQ_PORT
RABBITMQ_PREFETCH_COUNT
RABBITMQ_QUEUE_PREFIX
Please config your config/app.js
{
plugins: ['messageQueue']
}
Add the following to your docker-compose.yml
version: '3'
services:
# your app build
# ...
rabbitmq:
image: 'rabbitmq:3-management'
ports:
- '5672:5672' # configure your port
- '15672:15672'
# ...
# your other services (redis, mongo)
You need to connect to both Redis and RabbitMQ for this feature. By default, we store the payload of a task in Redis and only send the task id to RabbitMQ. This design avoids sending overly large payloads to RabbitMQ. QueueTask, as a model, handles all of this for you.
Please refer to the using redis section and the using messageQueue section.
To get it set up, you need to add the following code:
Add queueTask plugin to config/app.js
{
plugins: ['queueTask', '...your other plugins']
}
Add a config file config/queueTask.js
module.exports = [
{
type: 'TEST', // an identifier for your task
queue: 'test_queue', // the CONSUMER_QUEUE_ID or consumerQueueId to handle the queue
messageExpireTimeSec: 3600 * 24, // message timeout in seconds, defaults to (3600 * 24)s. After the timeout expires, the message is automatically discarded.
handler: 'Test.dequeue', // the handler for tasks of this type
description: 'any remarks you want to add'
}
];
Add a Test.js file at api/models/Test.js to handle enqueueing and dequeueing:
class Test {
static async enqueue(test) {
//do something to make a payload
let payload = {
firstName: test.firstName,
sex: test.sex
};
await QueueTask.queue({
taskType: 'TEST',
payload: payload
});
}
static async dequeue(queueTask) {
let payload = queueTask.payload;
// handle the payload
console.log(payload.firstName);
}
}
module.exports = Test;
To test the queueTask feature, you will have to start your app with both a consumer and a publisher role. To do so, you can use docker-compose to instantiate two containers of the app, one with the consumer role and the other with the publisher role.
Modify your docker-compose.yml
services:
test-app-publisher:
build: .
ports:
- '3000:3000'
volumes:
- .:/app
- /app/node_modules
environment:
- APP_ROLE=PUBLISHER # to start the container using a publisher role
command: bash -c "chmod +x ./wait-for-it.sh && ./wait-for-it.sh rabbitmq:5672 -- nodemon server.js"
test-app-consumer:
build: .
volumes:
- .:/app
- /app/node_modules
environment:
- APP_ROLE=CONSUMER # to start the container using a consumer role
- CONSUMER_QUEUE_ID=test_queue # set the id the same as the queue attr in config/queueTask.js
command: bash -c "chmod +x ./wait-for-it.sh && ./wait-for-it.sh rabbitmq:5672 -- nodemon server.js"
#
#
# your other services (redis, mongo, rabbitmq)
You can build any plugin you like using the Plugin feature. sl-express will:
- read app.config.plugins
- read the /plugins folder of YOUR application and import a plugin ONLY if its key exists in the config
- for any keys in the config that cannot be imported from there, try to import them from sl-express (so your plugins override the defaults)
// config/app.js
module.exports = {
plugins: [
'helloWorld',
'drinkTea',
]
}
The plugin must fulfill the following directory structure:
// plugins
- helloWorld
  - index.js
- drinkTea
  - index.js
The export of index.js must provide the following interfaces:
- prepare(app) { }
- async connectDependencies(app) { }
- async disconnectDependencies(app) { }
- async willStartService(app) { }
- async didStartService(app) { }
These interfaces are related to specific App phases; check the App class for details.
`app` means the App instance. You can get properties through this app instance. In most cases, you will need app.config.
// plugins/
- helloWorld
- lib/
- ModelA.js
- ModelB.js
- HelloWorldService.js
- HelloWorldPlugin.js
- index.js
- README.md
// index.js
const HelloWorldPlugin = require('./HelloWorldPlugin')
module.exports = new HelloWorldPlugin()
// HelloWorldPlugin.js
const HelloWorldService = require('./lib/HelloWorldService')
class HelloWorldPlugin {
prepare(app) {
this.service = new HelloWorldService()
}
async connectDependencies(app) { }
async disconnectDependencies(app) { }
async willStartService(app) { }
async didStartService(app) { }
}
module.exports = HelloWorldPlugin
We can organise our code structure with five kinds of components:
- Controller: its actions receive API calls
- AppService: services provided by our sl-express application to complete use cases by making use of models
- Model: controls the queries, data structure, and data changes of itself
- Service: services that are taken into the application
- Plugin: a structure composed of components 2, 3 and 4 above. It can abstract the whole service you want to provide, and it conforms to the interface for integrating with the sl-express app
### More about plugins
In a simple way, a plugin is a connector between the app and the library / modules you write.
Sometimes your application needs to solve use cases that involve complicated logic flows. They may involve many models and intaken services which only concern this part of the logic, not the whole app. In this case, you would like to "hide" these models and services and wrap them into a module. You may also need some setup before plugging it into the application. Using the integration feature of the plugin layer, your modules can simply focus on the logic.