- nodejs (v10)
- Kafka (v2)
Configuration for the notification server is at config/default.js.
The following parameters can be set in config files or in environment variables (see the example after the list):
- DISABLE_LOGGING: whether to disable logging
- LOG_LEVEL: the log level
- KAFKA_URL: comma separated Kafka hosts
- KAFKA_GROUP_ID: the Kafka group id
- KAFKA_CLIENT_CERT: Kafka connection certificate, optional; if not provided, SSL is not used and a direct insecure connection is made; if provided, it can be either the path to the certificate file or the certificate content
- KAFKA_CLIENT_CERT_KEY: Kafka connection private key, optional; if not provided, SSL is not used and a direct insecure connection is made; if provided, it can be either the path to the private key file or the private key content
- TOPICS: Kafka topics to listen to
- CORONA_TOPIC: corona topic to send event result to
- AUTH0_URL: Auth0 URL, used to get TC M2M token
- AUTH0_AUDIENCE: Auth0 audience, used to get TC M2M token
- AUTH0_PROXY_SERVER_URL: Auth0 proxy server URL, used to get TC M2M token
- AUTH0_CLIENT_ID: Auth0 client id, used to get TC M2M token
- AUTH0_CLIENT_SECRET: Auth0 client secret, used to get TC M2M token
- GET_CHALLENGE_DETAILS_URL: URL to get challenge details
- GET_USER_DETAILS_URL: URL to get user details
- GET_USER_DETAILS_BY_HANDLE_URL: URL to get user details by handle
- BUSAPI_URL: Bus API URL
- KAFKA_ERROR_TOPIC: Kafka error topic used by bus API wrapper
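For illustration only, a node-config style config/default.js typically maps these environment variables to config keys with fallback defaults. The sketch below is an assumption about the shape of that file, and the default values shown are placeholders, not the project's actual defaults:
// Illustrative sketch of a node-config style config/default.js.
// The defaults shown are placeholders, not the project's actual values.
module.exports = {
  DISABLE_LOGGING: process.env.DISABLE_LOGGING === 'true',
  LOG_LEVEL: process.env.LOG_LEVEL || 'debug',
  KAFKA_URL: process.env.KAFKA_URL || 'localhost:9092',
  KAFKA_GROUP_ID: process.env.KAFKA_GROUP_ID || 'notification-server-group', // assumed group id
  KAFKA_CLIENT_CERT: process.env.KAFKA_CLIENT_CERT,
  KAFKA_CLIENT_CERT_KEY: process.env.KAFKA_CLIENT_CERT_KEY,
  TOPICS: (process.env.TOPICS || 'challenge.notification.events').split(','),
  CORONA_TOPIC: process.env.CORONA_TOPIC || 'corona.saturate.create',
  BUSAPI_URL: process.env.BUSAPI_URL,
  KAFKA_ERROR_TOPIC: process.env.KAFKA_ERROR_TOPIC
}
Any key can then be overridden at run time by exporting the corresponding environment variable.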
Test config is at test/testConfig.js; you don't need to change it.
The following test parameters can be set in test config files or in env variables:
- WAIT_MS: the time in milliseconds to wait for some processing completion
- USE_MOCK: flag to use mocked requests in tests, avoiding issues with real APIs and making tests run faster (see the example below)
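The same override pattern applies when running the tests, for example (the values here are only illustrative):
WAIT_MS=2000 USE_MOCK=true npm run test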
http://kafka.apache.org/quickstart
contains details on how to set up and manage a Kafka server. The steps below cover setting up a Kafka server on Mac (Windows users should use the corresponding .bat commands in bin/windows instead):
- download Kafka from
http://kafka.apache.org/downloads
- extract the downloaded archive
- go to the extracted directory
- start ZooKeeper server:
bin/zookeeper-server-start.sh config/zookeeper.properties
- in another terminal, go to the same directory and start the Kafka server:
bin/kafka-server-start.sh config/server.properties
- note that the ZooKeeper server is at localhost:2181 and the Kafka server is at localhost:9092
- in another terminal, go to the same directory and create the required topics:
bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic challenge.notification.events
bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic submission.notification.create
bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic submission.notification.update
bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic submission.notification.delete
bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic notifications.autopilot.events
- verify that the topics are created:
bin/kafka-topics.sh --list --zookeeper localhost:2181
it should list the created topics (see also the optional smoke test below)
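Optionally, you can smoke-test the broker with the standard Kafka console producer and consumer; anything typed into the producer terminal should appear in the consumer terminal:
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic challenge.notification.events
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic challenge.notification.events --from-beginning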
- setup Kafka as above
- install dependencies
npm i
- run code lint check
npm run lint
- run code lint fix
npm run lint:fix
- run tests
npm run test
- run tests with coverage
npm run cov
the coverage report can be found in the coverage folder
- start app
npm start
it starts a Kafka consumer that listens to the configured topics (see the note below on overriding configuration at start time)
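Any of the configuration parameters listed above can be overridden through environment variables when starting the app, for example (the values are illustrative):
KAFKA_URL=localhost:9092 LOG_LEVEL=debug npm start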
You don't need to run git init if the folder is already a git repo.
- git init
- git add .
- git commit -m init
- heroku create
- heroku config:set KAFKA_URL=... TOPICS=topic1,topic2
- git push heroku HEAD:master
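The remaining configuration parameters (Auth0 credentials, API URLs, etc.) can be set on Heroku in the same way with heroku config:set, and heroku logs can be used to watch the consumer output; the values below are intentionally elided:
- heroku config:set AUTH0_URL=... AUTH0_CLIENT_ID=... AUTH0_CLIENT_SECRET=...
- heroku logs --tail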
- setup Kafka as above
- see above for details to run tests
- start app
- to do manual verification for the Kafka consumer, go to the Kafka folder
- run a Kafka producer for topic challenge.notification.events:
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic challenge.notification.events
- input message of user registration to producer:
{ "topic": "challenge.notification.events", "originator": "test-originator", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "type": "USER_REGISTRATION", "data": { "challengeId": 30049360, "userId": 23124329 } } }
- watch the app console output, below is shown:
info: It is user registration (unregistration) message.
- input message of add resource to producer:
{ "topic": "challenge.notification.events", "originator": "test-originator", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "type": "ADD_RESOURCE", "data": { "challengeId": 30049360, "request": { "roleId": 14, "resourceUserId": 23124329, "phaseId": 0, "addNotification": true, "addForumWatch": true, "checkTerm": false, "studio": false } } } }
- watch the app console output, below is shown:
info: It is add resource message.
- input message of update draft challenge to producer:
{ "topic": "challenge.notification.events", "originator": "test-originator", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "type": "UPDATE_DRAFT_CHALLENGE", "data": { "id": 30049360, "confidentialityType": null, "technologies": [], "subTrack": null, "name": null, "reviewType": "COMMUNITY", "billingAccountId": 80000632, "milestoneId": 1, "prizes": [10], "projectId": 18693 }, "userId": 22838965 } }
- watch the app console output, below is shown:
info: It is update draft or activate challenge message.
- input message of activate challenge to producer:
{"topic":"challenge.notification.events","originator":"originator","timestamp":"2018-01-02T00:00:00","mime-type":"application/json","payload":{"type":"ACTIVATE_CHALLENGE","data":{"id":30049360,"confidentialityType":null,"technologies":[],"subTrack":null,"name":"test name","reviewType":"COMMUNITY","billingAccountId":123,"milestoneId":1,"detailedRequirements":null,"submissionGuidelines":null,"registrationStartsAt":"2018-01-02T00:11:22.001Z","registrationEndsAt":"2018-01-02T00:11:22.001Z","checkpointSubmissionStartsAt":null,"checkpointSubmissionEndsAt":null,"submissionEndsAt":"2018-01-02T00:11:22.001Z","round1Info":null,"round2Info":null,"platforms":[],"numberOfCheckpointPrizes":0,"checkpointPrize":0,"finalDeliverableTypes":"test type","prizes":[10],"projectId":123,"submissionVisibility":false,"maxNumOfSubmissions":0,"task":null,"assignees":null,"failedRegisterUsers":null,"copilotFee":null,"copilotId":null,"codeRepo":null,"environment":null,"fixedFee":null,"percentageFee":null}}}
- watch the app console output, below is shown:
info: It is update draft or activate challenge message.
- input message of close task to producer:
{ "topic": "challenge.notification.events", "originator": "test-originator", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "type": "CLOSE_TASK", "data": { "challengeId": 30049360, "userId": 23124329, "winnerId": 22678451 } } }
- watch the app console output, below is shown:
info: It is close task message.
- input message that cannot be handled:
{ "topic": "challenge.notification.events", "originator": "test-originator", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "type": "OTHER", "data": { "challengeId": 30049360, "userId": 23124329, "winnerId": 22678451 } } }
- watch the app console output, below is shown:
info: No processor can recognize and handle the message, it will be ignored.
- in the Kafka producer, write some invalid messages:
invalid message [{
{ "topic": "challenge.notification.events", "originator": "test-originator", "timestamp": "abc", "mime-type": "application/json", "payload": { "key1": "value1" } }
{ "topic": "challenge.notification.events", "originator": "test-originator", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json" }
- watch the app console output, error details are shown
- run another Kafka producer for topic submission.notification.create:
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic submission.notification.create
- input message of contest submission to producer:
{ "topic": "submission.notification.create", "originator": "test-originator", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "resource": "submission", "id": "slkdf", "type": "Contest Submission", "url": "http://demo.com", "memberId": 23124329, "challengeId": 30049360, "created": "2018-01-02T00:11:22.000Z", "updated": "2018-01-02T00:11:22.000Z", "createdBy": "Amith", "updatedBy": "Amith", "submissionPhaseId": 961198, "fileType": "zip", "isFileSubmission": false } }
- watch the app console output, below is shown:
info: It is contest submission message.
- run another Kafka producer for topic notifications.autopilot.events:
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic notifications.autopilot.events
- input message of auto pilot event to producer:
{ "topic": "notifications.autopilot.events", "originator": "test-originator", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "date": "2018-01-02T00:11:22.000Z", "projectId": 30049360, "phaseId": 953326, "phaseTypeName": "Submission", "state": "END", "operator": "22841596" } }
- watch the app console output, below is shown:
info: It is auto pilot event message.
- go to Lauscher, log in with your user name (the password is appirio123), choose the corona.saturate.create topic and view the event results.
- a notable enhancement: the app uses a Kafka group consumer instead of a simple consumer, so that messages produced while the app was down are still handled once it restarts (see the sketch below)
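For illustration, a minimal sketch of a group consumer is shown below using the kafkajs client; this is an assumption for demonstration only (the project may use a different Kafka client), and the client id and fallback values are placeholders. Because a group consumer commits offsets per group id, messages produced while the app is down are delivered once it reconnects with the same KAFKA_GROUP_ID.
// Illustrative sketch only: a Kafka group consumer, which tracks committed
// offsets per group so messages produced while the app was down are delivered
// after it restarts. The project itself may use a different Kafka client.
const { Kafka } = require('kafkajs')

const kafka = new Kafka({
  clientId: 'notification-server', // hypothetical client id
  brokers: (process.env.KAFKA_URL || 'localhost:9092').split(',')
})

const consumer = kafka.consumer({ groupId: process.env.KAFKA_GROUP_ID || 'notification-server-group' })

async function run () {
  await consumer.connect()
  // subscribe to the configured topics; consumption resumes from committed offsets
  for (const topic of (process.env.TOPICS || 'challenge.notification.events').split(',')) {
    await consumer.subscribe({ topic })
  }
  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      // hand the raw message to the app's processors (placeholder logging here)
      console.log(`received from ${topic}[${partition}]: ${message.value.toString()}`)
    }
  })
}

run().catch(console.error)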