This project is built with React 18, which means it has bleeding-edge support for the latest version of the framework. To get started in a few clicks, follow the Docker deployment guide below, or follow the coder's guide (advanced).
⚠ All data used in the project and portrayed in the screenshots is for example purposes only. The author/company owns no rights to the data, and the data is not at all meant to be used at a production level. This was a training project and is to be used for knowledge-delivery purposes only.
The mandatory features are compulsory tasks; the optional features are for extra credit points, which will give you an added advantage.
Mandatory Features | Optional Features |
---|---|
1. UI Creation | 1. Predict Button activation with Grid Data |
2. Grid Creation | 2. Shortcut search button on Grid for Customer Id |
3. Grid Data Loading | 3. Sorting columns |
4. CRUD Operations (ADD/EDIT/DELETE) | 4. View - Analytics |
5. Pagination | |
6. Advanced Search | |
- MySQL (Database)
- JDBC w/ Servlets (Java to Database Connectivity/API/ORM)
- Maven (Dependency Management)
- Tomcat 10 (Server for Servlets)
- Python3
- Flask (Server for Machine Learning Model)
- React 18 ⚛️ (Frontend)
- NodeJS (Server)
- Axios (API Communicator)
- Redux Toolkit w/ Redux thunk
- Docker 🐳 (w/ Dockerfiles and docker-compose)
- IntelliJ IDEA Ultimate (Ultimate is necessary for Java EE projects and Tomcat server configuration.)
- Raspberry Pi
- Gitpod OpenVSCode Server (For an air-gapped development environment.)
- Postman for API Testing
- Please visit the Docker folder to have a glance at the setup files
You should get a folder structure similar to the one shown in the image below.
Now take a look at the docker-compose.yaml file, which lists the services that are referenced while running the backend with Docker.
version: "3"
services:
highradiustraining-servlet:
container_name: highradiustraining-servlet
deploy:
replicas: 1
restart_policy:
condition: unless-stopped
ports:
- "280:8080"
image: ankurpaul19/highradiustraining-servlet
highradiustraining-flask:
container_name: highradiustraining-flask
deploy:
replicas: 1
restart_policy:
condition: unless-stopped
ports:
- "5000:5000"
image: ankurpaul19/highradiustraining-flask
db:
image: mysql:5.7
command: --default-authentication-plugin=mysql_native_password
volumes:
- /var/lib/mysql:/var/lib/mysql
restart: always
environment:
- MYSQL_ROOT_PASSWORD=mysql
- MYSQL_DATABASE=grey_goose
- MYSQL_USER=mysql
- MYSQL_PASSWORD=mysql
ports:
- "3306:3306"
db_seeder:
image: mysql:latest
volumes:
- ./Database/db.sql:/db.sql
environment:
- MYSQL_ALLOW_EMPTY_PASSWORD=true
entrypoint:
[
"bash",
"-c",
"sleep 10 && mysql --user=mysql --password=mysql --host=db --port=3306 grey_goose < /db.sql && exit",
]
depends_on:
- db
phpmyadmin:
image: phpmyadmin:latest
restart: unless-stopped
ports:
- 8080:80
environment:
# we specify that we connect to an arbitrary server with the flag below
# "arbitrary" means you're able to specify which database server to use on login page of phpmyadmin
- PMA_ARBITRARY=1
depends_on:
- db_seeder
We can see that it uses the following services to run the backend:
- highradiustraining-servlet (Tomcat server for servlets)
- highradiustraining-flask (Flask service for AI predictions in the dashboard)
- db (MySQL Container)
- db_seeder (Seeder service for initial setup of the database)
- phpmyadmin (Database monitoring) (optional, remove if not needed)
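If you only want to bring up the database layer (for example, while developing the servlets locally), a minimal sketch like the one below starts just those services; the service names come from the docker-compose.yaml above, and `depends_on` pulls in `db` automatically.

```sh
# Start only the database-related services (db is started automatically via depends_on).
docker-compose up -d db_seeder phpmyadmin

# Follow the seeder's output to confirm that Database/db.sql was imported into grey_goose.
docker-compose logs -f db_seeder
```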
Before you can use the docker-compose.yaml file, you need to build the images first. Go to the Docker folder and execute the commands below to build the images.
docker build -t ankurpaul19/highradiustraining-servlet -f Dockerfile-servlet.dockerfile .
docker build -t ankurpaul19/highradiustraining-flask -f Dockerfile-flask.dockerfile .
Once the images are built, you can start the containers from the docker-compose.yaml file with:
docker-compose up
(add -d to detach from the console and keep the containers running in the background)
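Once the stack is up, a quick sanity check could look like the sketch below (the service names are taken from the docker-compose.yaml above).

```sh
# List the running containers and their port mappings.
docker-compose ps

# Tail the Tomcat and Flask logs if an endpoint does not respond.
docker-compose logs -f highradiustraining-servlet highradiustraining-flask

# Stop and remove the containers when you are done.
docker-compose down
```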
- Navigate to the Frontend folder.
- Install the dependencies for the web project using your preferred package manager (yarn / npm).
- Perform a build of the project. Make modifications to the `.env` file by copying the default variables from the `.env.example` file, then build the project or download the build from the releases (see the sketch after this list).

  `yarn build` or `npm run build`

  The build time will be around ~3 minutes ⌚ depending on your PC specs.
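Put together, the build steps above could look like the following sketch (assuming yarn is installed and the Frontend folder sits at the repository root):

```sh
cd Frontend

# Copy the default variables and adjust them for your setup.
cp .env.example .env

# Install the dependencies and build (use npm install / npm run build if you prefer npm).
yarn install
yarn build
```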
- Open the backend project in your favorite IDE, such as IntelliJ IDEA or Eclipse. You should get a similar folder structure for the backend project.
- Bootstrap/prepare Tomcat as per your IDE.
- In IntelliJ IDEA, you can see the build configuration here and perform the build-server setup by following the images below.
- Folder Structure
- Install the dependencies.
- Copy `.env.example` -> `.env.local` and tweak the env variables as per your setup.
- Run `yarn serve` in the terminal (the script source is in `package.json`). See the sketch after this list.
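For reference, the development-server steps above could be run roughly as follows (a sketch assuming yarn; run the commands from the folder whose `package.json` defines the `serve` script, which is assumed here to be the Frontend folder):

```sh
cd Frontend

# Copy the example env file and tweak the variables for your setup.
cp .env.example .env.local

# Install the dependencies and start the dev server (the "serve" script comes from package.json).
yarn install
yarn serve
```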
Please import the following for Postman to auto-import the API endpoints.
API Route Table
Service | Request Name | Request Type | Request Endpoint | URL Parameters | JSON Body | Description | CURL Equivalent |
---|---|---|---|---|---|---|---|
Servlets | GET ROWS | POST | http://192.168.0.118:280/getrows | ?start=0&limit=10 | | Gets limit rows starting from start | curl --location --request POST 'http://192.168.0.118:280/getrows?start=0&limit=10' --header 'Content-Type: application/json' |
Servlets | GET ANALYTICS | POST | http://192.168.0.118:280/getanalytics | | { "clear_date": [ "2019-01-01", "2020-12-31" ], "due_in_date": [ "2019-01-01", "2020-12-31" ], "baseline_create_date": [ "2019-01-01", "2020-12-31" ], "invoice_currency": "INR" } | Returns an array with analytics data. Refer to the JSON body. | curl --location --request POST 'http://192.168.0.118:280/getanalytics' --header 'Content-Type: application/json' --data-raw '{ "clear_date": [ "2019-01-01", "2020-12-31" ], "due_in_date": [ "2019-01-01", "2020-12-31" ], "baseline_create_date": [ "2019-01-01", "2020-12-31" ], "invoice_currency": "INR" }' |
Servlets | ADD ROWS | POST | http://192.168.0.118:280/addRow | | { "sl_no": 48558, "business_code": null, "cust_number": "200020431", "name_customer": "test company", "clear_date": "2022-3-1", "business_year": null, "doc_id": null, "posting_date": "2022-3-1", "document_create_date": "2022-3-2", "document_create_date1": null, "due_in_date": "2022-3-19", "invoice_currency": "", "document_type": null, "posting_id": null, "area_business": null, "total_open_amount": null, "baseline_create_date": "2022-3-15", "cust_payment_terms": null, "invoice_id": null, "isOpen": null, "aging_bucket": null } | Adds the row that is passed as the raw JSON body | curl --location --request POST 'http://192.168.0.118:280/addRow' --header 'Content-Type: application/json' --data-raw '{ "sl_no": 48558, "business_code": null, "cust_number": "200020431", "name_customer": "test company", "clear_date": "2022-3-1", "business_year": null, "doc_id": null, "posting_date": "2022-3-1", "document_create_date": "2022-3-2", "document_create_date1": null, "due_in_date": "2022-3-19", "invoice_currency": "", "document_type": null, "posting_id": null, "area_business": null, "total_open_amount": null, "baseline_create_date": "2022-3-15", "cust_payment_terms": null, "invoice_id": null, "isOpen": null, "aging_bucket": null }' |
Servlets | ADVANCED SEARCH | POST | http://192.168.0.118:280/advancedSearch | | { "doc_id": 1929873765, "invoice_id": 1929873765, "cust_number": 200792734, "business_year": 2019 } | Performs an advanced search with the fields passed. Refer to the PRS and the JSON body for an example. | curl --location --request POST 'http://192.168.0.118:280/advancedSearch' --header 'Content-Type: application/json' --data-raw '{ "doc_id": 1929873765, "invoice_id": 1929873765, "cust_number": 200792734, "business_year": 2019 }' |
Servlets | GET BUSINESSES | GET | http://192.168.0.118:280/getbusinesses | | | Returns the businesses with their codes in the database. | curl --location --request GET 'http://192.168.0.118:280/getbusinesses' |
Servlets | GET CUSTOMERS | GET | http://192.168.0.118:280/getcustomers | | | Returns the customers with their codes in the database. | curl --location --request GET 'http://192.168.0.118:280/getcustomers' |
Servlets | EDIT ROW | POST | http://192.168.0.118:280/editRow | ?serialNumber=2&tableName=winter_internship | { "invoice_currency": "USD", "cust_payment_terms": "NAA8" } | Edits a row by sending a raw JSON body | curl --location --request POST 'http://192.168.0.118:280/editRow?serialNumber=2&tableName=winter_internship' --header 'Content-Type: application/json' --data-raw '{ "invoice_currency": "USD", "cust_payment_terms": "NAA8" }' |
Servlets | DELETE ROW | DELETE | http://192.168.0.118:280/deleteRow | ?sl_no=48568,48569,48570 | | Deletes a row or a range of rows | curl --location --request DELETE 'http://192.168.0.118:280/deleteRow?sl_no=48568,' --header 'Content-Type: application/json' --data-raw '' |
Flask | GET PREDICTION | POST | http://192.168.0.118:5000/get_prediction | | { "data": [ 1929646410, 1929873765, 1930147974, 1930083373, 1930659387, 1929439637, 1928819386, 1930610806, 1928550622, 1929151655, 1930022117, 1930788296, 1930817482, 1930052739, 1930209407, 1930153511, 1930438462, 1991837617, 1929773400, 1930676042, 1929626925, 1930431304, 1928620435, 1930592246, 1929194820, 1929170780, 1929907681, 1929847863, 1929541405, null ] } | Runs a prediction on the array of doc ids passed to it. | curl --location --request POST 'http://192.168.0.118:5000/get_prediction' --header 'Content-Type: application/json' --data-raw '{ "data": [ 1929646410, 1929873765, 1930147974, 1930083373, 1930659387, 1929439637, 1928819386, 1930610806, 1928550622, 1929151655, 1930022117, 1930788296, 1930817482, 1930052739, 1930209407, 1930153511, 1930438462, 1991837617, 1929773400, 1930676042, 1929626925, 1930431304, 1928620435, 1930592246, 1929194820, 1929170780, 1929907681, 1929847863, 1929541405, null ] }' |
Flask | PREDICT 200 ROWS | POST | http://192.168.0.118:5000/all | | | Gets the prediction for the first 200 rows from Final.csv. This is for debugging purposes only, to verify that the Flask service works. | curl --location --request POST 'http://192.168.0.118:5000/all' |
Dear Student,
Finally, the wait is over! The day has come when we are about to start the internship program. So, brace yourselves for the upcoming roller coaster ride. The starting date for the Internship is 27-Jan 2022. The tenure of the Internship will be 11 weeks, wherein you'll be responsible for building an AI Enabled Fintech B2B Invoice Management Application.
Please read the PRS Document to get in-depth knowledge about the project.
Hope you have a pleasant journey ahead!
Regards,
HighRadius Corporation