204 changes: 169 additions & 35 deletions README.md
@@ -1,12 +1,10 @@
# Yape Code Challenge :rocket:

Our code challenge will let you marvel us with your Jedi coding skills :smile:.

Don't forget that the proper way to submit your work is to fork the repo and create a PR :wink: ... have fun !!

- [Problem](#problem)
- [Tech Stack](#tech_stack)
- [API Resources](#api_resources)
- [High Volume Scenario Considerations Implemented](#considerations)
- [How to Run the Project](#how_to_run_the_project)

# Problem

@@ -30,53 +28,189 @@ Every transaction with a value greater than 1000 should be rejected.
Transaction -- Update transaction Status event--> transactionDatabase[(Database)]
```

## Tech Stack

- NestJS
- Prisma ORM
- PostgreSQL
- Redis
- GraphQL
- Kafka for messaging

## API Resources

1. Create a transaction:

Request:

```gql
mutation CreateTransaction($input: CreateTransactionInput!) {
  createTransaction(input: $input) {
    transactionExternalId
    value
    createdAt
    accountExternalIdCredit
    accountExternalIdDebit
  }
}
```

Response:

```json
{
  "data": {
    "createTransaction": {
      "transactionExternalId": "81a08e52-b288-4f93-a2db-55b29892480a",
      "value": 999,
      "createdAt": "2025-08-12T15:37:00.863Z",
      "accountExternalIdCredit": "2d13c348-4209-439f-8c9f-bcbf72fe23c1",
      "accountExternalIdDebit": "1a368a89-09d5-4177-a581-990bce3110cb"
    }
  }
}
```

2. Retrieve a transaction

Request:

```gql
query GetTransaction($transactionExternalId: String!) {
  getTransaction(transactionExternalId: $transactionExternalId) {
    transactionExternalId
    createdAt
    value
    transactionStatus {
      name
    }
    transactionType {
      name
    }
  }
}
```

Response:

```json
{
  "data": {
    "getTransaction": {
      "transactionExternalId": "81a08e52-b288-4f93-a2db-55b29892480a",
      "createdAt": "2025-08-12T15:37:00.863Z",
      "value": 999,
      "transactionStatus": {
        "name": "Approved"
      },
      "transactionType": {
        "name": "Standard"
      }
    }
  }
}
```
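Both operations take their arguments as GraphQL variables. As an illustrative example only — the input field names are inferred from the request/response shapes above (and `tranferTypeId` from the original challenge description), not read from the actual schema — the variables panel for the mutation might look like:

```json
{
  "input": {
    "accountExternalIdDebit": "1a368a89-09d5-4177-a581-990bce3110cb",
    "accountExternalIdCredit": "2d13c348-4209-439f-8c9f-bcbf72fe23c1",
    "tranferTypeId": 1,
    "value": 999
  }
}
```

The `GetTransaction` query only needs `{ "transactionExternalId": "<uuid returned by the mutation>" }`.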

## High Volume Scenario Considerations Implemented

In this project, several strategies were applied to ensure robustness and scalability under high concurrency and volume:

- **Caching:**
To improve performance and reduce load on the primary database, Redis was used as a caching layer. Frequently accessed data, such as transaction details and statuses, is cached to serve reads faster, and the cache is invalidated or updated whenever the underlying data changes to maintain consistency.

- **Event-Driven Architecture:**
Kafka was used as the messaging backbone to enable asynchronous communication between microservices. This decouples services and enhances scalability and fault tolerance.

- **Idempotency:**
A basic idempotency check was implemented by verifying whether the transaction status to be applied was already set; if so, the update is skipped to avoid duplicate processing. This could be extended with explicit idempotency keys for stronger guarantees.

- **Optimistic Concurrency Control (OCC):**
Transactions include version numbers to detect concurrent modifications. Updates verify that the version matches the current one in the database, preventing lost updates and ensuring data integrity.

- **CQRS Pattern:**
Write and read models are separated to optimize for scalability and performance, enabling independent scaling and simpler read operations.

These considerations help the system handle high-throughput scenarios efficiently and reliably. The sketches below show, in simplified form, how some of them could look in code.
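To make the caching point concrete, here is a minimal cache-aside sketch — not the actual implementation; the model, relation, and key names (`transaction:`, `transactionStatus`, `transactionType`) are assumptions:

```ts
// Minimal cache-aside sketch (illustrative names; not the actual implementation)
import Redis from 'ioredis';
import { PrismaClient } from '@prisma/client';

const redis = new Redis({
  host: process.env.REDIS_HOST ?? 'localhost',
  port: Number(process.env.REDIS_PORT ?? 6379),
});
const prisma = new PrismaClient();

export async function getTransactionCached(transactionExternalId: string) {
  const cacheKey = `transaction:${transactionExternalId}`; // assumed key format

  // 1. Serve hot reads from Redis to keep load off PostgreSQL
  const cached = await redis.get(cacheKey);
  if (cached) return JSON.parse(cached);

  // 2. Cache miss: read from the database (model/field names are assumptions)
  const transaction = await prisma.transaction.findUnique({
    where: { transactionExternalId },
    include: { transactionStatus: true, transactionType: true },
  });

  // 3. Populate the cache with a short TTL; the write path overwrites or deletes
  //    this key whenever the status changes, so readers never see stale data for long
  if (transaction) {
    await redis.set(cacheKey, JSON.stringify(transaction), 'EX', 60);
  }
  return transaction;
}
```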
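Likewise, a sketch of how the idempotency check and optimistic concurrency control could combine when a status update arrives — the `version` and `statusId` columns are assumptions used for illustration:

```ts
// Illustrative status update combining the idempotency check and OCC
import { PrismaClient } from '@prisma/client';

const prisma = new PrismaClient();

export async function applyAntiFraudVerdict(
  transactionExternalId: string,
  newStatusId: number, // e.g. approved or rejected status id
) {
  const current = await prisma.transaction.findUnique({
    where: { transactionExternalId },
  });
  if (!current) return;

  // Idempotency: if this status was already applied, skip the update (duplicate event)
  if (current.statusId === newStatusId) return;

  // OCC: only update if the row still has the version we read; if another writer
  // got there first, the update count will be 0 and nothing is overwritten
  const result = await prisma.transaction.updateMany({
    where: { transactionExternalId, version: current.version },
    data: { statusId: newStatusId, version: { increment: 1 } },
  });

  if (result.count === 0) {
    // Lost the race: re-read and retry, or surface a conflict to the caller
  }
}
```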
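And for the event-driven flow, the anti-fraud service could consume the created-transaction event and emit a verdict along these lines — topic names, the injection token, and the payload shape are assumptions; the over-1000 rejection rule comes from the problem statement:

```ts
// Illustrative NestJS Kafka handler in ms-anti-fraud (topic names are assumed)
import { Controller, Inject } from '@nestjs/common';
import { ClientKafka, EventPattern, Payload } from '@nestjs/microservices';

@Controller()
export class AntiFraudController {
  constructor(@Inject('KAFKA_CLIENT') private readonly kafka: ClientKafka) {}

  @EventPattern('transaction.created')
  handleTransactionCreated(
    @Payload() message: { transactionExternalId: string; value: number },
  ) {
    // Business rule from the challenge: any transaction over 1000 is rejected
    const status = message.value > 1000 ? 'rejected' : 'approved';

    this.kafka.emit('transaction.status.updated', {
      transactionExternalId: message.transactionExternalId,
      status,
    });
  }
}
```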

## How to Run the Project

You can run the project in two ways:

### 1. Using Docker Compose (Recommended)

This method will build and start all necessary services including the microservices, database, Kafka, and Redis.

Run the following command in the root folder:

```bash
docker-compose up --build
```

By default, the transactions microservice is exposed on port 3001, so you can access the GraphQL playground at:

```bash
http://localhost:3001/graphql
```

### 2. Running Services Locally (Manual Setup)

If you prefer to run only specific services locally, follow these steps:

#### 2.1 Set environment variables for the transactions and anti-fraud microservices, for example:

```env
# ms-transactions
PORT=3000
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/yape
KAFKA_CONSUMER_GROUP="transaction-service-consumer"
KAFKA_BROKER="localhost:9093"
REDIS_HOST=localhost
REDIS_PORT=6379
```

```env
# ms-anti-fraud
KAFKA_CONSUMER_GROUP="anti-fraud-consumer"
KAFKA_BROKER="localhost:9093"
```

#### 2.2 Install dependencies and prepare the database:

```bash
cd ms-transactions
npm install
npm run prisma:generate
npm run prisma:migrate:dev
npm run prisma:seed
```

```bash
cd ms-anti-fraud
npm install
```

#### 2.3 Start the microservices in development mode:

```bash
cd ms-transactions
npm run start:dev
```

```bash
cd ms-anti-fraud
npm run start:dev
```

Make sure Kafka, Redis, and PostgreSQL are running locally and accessible at the configured hosts and ports before starting the microservices. You can start just the infrastructure services with the provided docker-compose.yml:

```bash
docker-compose up -d postgres cache zookeeper kafka
```

Finally, access the GraphQL playground on port 3000:

```bash
http://localhost:3000/graphql
```
52 changes: 49 additions & 3 deletions docker-compose.yml
@@ -1,12 +1,19 @@
version: "3.7"
services:
postgres:
image: postgres:14.3
ports:
- "5432:5432"
environment:
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=postgres
- POSTGRES_DB=yape
cache:
image: redis
ports:
- 6379:6379
volumes:
- redis:/data
zookeeper:
image: confluentinc/cp-zookeeper:5.5.3
environment:
@@ -16,10 +23,49 @@ services:
depends_on: [zookeeper]
environment:
KAFKA_ZOOKEEPER_CONNECT: "zookeeper:2181"
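# Two listeners: other containers reach the broker at kafka:9092; the host uses localhost:9093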
KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:9093
KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:9092,PLAINTEXT_HOST://0.0.0.0:9093
KAFKA_BROKER_ID: 1
KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
KAFKA_JMX_PORT: 9991
ports:
- 9092:9092
- 9093:9093
ms-transactions:
build:
context: ./ms-transactions
dockerfile: Dockerfile
container_name: ms-transactions
ports:
- "3001:3000"
depends_on:
- postgres
- cache
- kafka
environment:
REDIS_HOST: cache
REDIS_PORT: 6379
KAFKA_BROKER: kafka:9092
PORT: 3000
DATABASE_URL: postgresql://postgres:postgres@postgres:5432/yape
KAFKA_CONSUMER_GROUP: transaction-service-consumer
volumes:
- ./ms-transactions:/app
ms-anti-fraud:
build:
context: ./ms-anti-fraud
dockerfile: Dockerfile
container_name: ms-anti-fraud
ports:
- "3002:3000"
depends_on:
- kafka
environment:
KAFKA_BROKER: kafka:9092
PORT: 3000
KAFKA_CONSUMER_GROUP: anti-fraud-consumer
volumes:
- ./ms-anti-fraud:/app
volumes:
redis:
driver: local
3 changes: 3 additions & 0 deletions ms-anti-fraud/.dockerignore
@@ -0,0 +1,3 @@
# dotenv environment variables file
.env
.env.test
2 changes: 2 additions & 0 deletions ms-anti-fraud/.env.example
@@ -0,0 +1,2 @@
KAFKA_CONSUMER_GROUP="transaction-service-consumer"
KAFKA_BROKER="localhost:9093"
56 changes: 56 additions & 0 deletions ms-anti-fraud/.gitignore
@@ -0,0 +1,56 @@
# compiled output
/dist
/node_modules
/build

# Logs
logs
*.log
npm-debug.log*
pnpm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*

# OS
.DS_Store

# Tests
/coverage
/.nyc_output

# IDEs and editors
/.idea
.project
.classpath
.c9/
*.launch
.settings/
*.sublime-workspace

# IDE - VSCode
.vscode/*
!.vscode/settings.json
!.vscode/tasks.json
!.vscode/launch.json
!.vscode/extensions.json

# dotenv environment variable files
.env
.env.development.local
.env.test.local
.env.production.local
.env.local

# temp directory
.temp
.tmp

# Runtime data
pids
*.pid
*.seed
*.pid.lock

# Diagnostic reports (https://nodejs.org/api/report.html)
report.[0-9]*.[0-9]*.[0-9]*.[0-9]*.json
4 changes: 4 additions & 0 deletions ms-anti-fraud/.prettierrc
@@ -0,0 +1,4 @@
{
"singleQuote": true,
"trailingComma": "all"
}
15 changes: 15 additions & 0 deletions ms-anti-fraud/Dockerfile
@@ -0,0 +1,15 @@
FROM node:22-alpine3.21

WORKDIR /app

COPY package*.json ./

# Dev dependencies are needed for the Nest build step
RUN npm install

COPY . .

RUN npm run build

# Keep the runtime image lean once the build is done
RUN npm prune --production

EXPOSE 3000

CMD ["node", "dist/main.js"]