Feat/separate givback verfied #1770
Conversation
Thanks @MohammadPCh LGTM
(Just please run migrations on your local machine before merging this PR)
Walkthrough
The recent changes update properties and parameters related to project verification and eligibility for the Givback program throughout the application. This shift clarifies the distinction between verification and eligibility criteria and improves the logical flow of donation processing. The modifications span multiple files and apply the updated terminology consistently, which affects how projects and donations are managed within the system. Additionally, new error handling and schema modifications have been introduced for improved robustness.

Sequence Diagram(s)
sequenceDiagram
participant User
participant DonationService
participant ProjectRepo
participant DonationRepo
User->>DonationService: Create Donation
DonationService->>ProjectRepo: Check isProjectGivbackEligible
ProjectRepo-->>DonationService: Eligibility Status
DonationService->>DonationRepo: Save Donation
DonationRepo-->>User: Donation Created
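To make the flow above concrete, here is a minimal TypeScript sketch of a donation path that checks Givback eligibility before saving. The function and repository names are illustrative assumptions, not the actual impact-graph API; only the isGivbackEligible / isProjectGivbackEligible field names come from this PR.

// Hypothetical types and helpers for illustration; not the real impact-graph services.
interface Project {
  id: number;
  verified: boolean;
  isGivbackEligible: boolean;
}

interface DonationInput {
  projectId: number;
  amount: number;
}

interface Donation extends DonationInput {
  // Snapshot of the project's eligibility at donation time.
  isProjectGivbackEligible: boolean;
}

async function createDonation(
  input: DonationInput,
  findProjectById: (id: number) => Promise<Project | null>,
  saveDonation: (donation: Donation) => Promise<Donation>,
): Promise<Donation> {
  const project = await findProjectById(input.projectId);
  if (!project) {
    throw new Error('Project not found');
  }
  // Record eligibility (rather than "verified") on the donation, per the walkthrough above.
  return saveDonation({
    ...input,
    isProjectGivbackEligible: project.isGivbackEligible,
  });
}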
Actionable comments posted: 0
Outside diff range, codebase verification and nitpick comments (1)
migration/1725260193333-projectGivbackRankView.ts (1)
Lines 7-60: LGTM! The SQL query for creating the materialized view is well-structured and follows best practices. It uses appropriate joins and aggregations to calculate the total power for each project and creates suitable indexes on the materialized view for better query performance.
A few additional suggestions:
- Consider adding comments to explain the purpose of each subquery and the overall purpose of the materialized view.
- Consider using more descriptive names for the subqueries and aliases to improve readability.
- Consider adding a comment to explain the purpose of the CASE statement and the conditions used to calculate the total power (a commented sketch follows below).
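To illustrate the first suggestion, a commented version of such a migration could look like the sketch below. The view name, source tables, and columns here are placeholders chosen for the example; they do not reproduce the actual projectGivbackRankView SQL from this PR.

import { MigrationInterface, QueryRunner } from 'typeorm';

// Placeholder migration showing how comments and descriptive aliases could be added;
// it does not reproduce the real project givback rank view definition.
export class ExampleCommentedRankView0000000000000 implements MigrationInterface {
  public async up(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(`
      CREATE MATERIALIZED VIEW IF NOT EXISTS example_project_rank_view AS
        -- "boosted_power" aggregates the power boosted to each project (placeholder table/columns)
        WITH boosted_power AS (
          SELECT "projectId", SUM("boostedPower") AS total_power
          FROM project_power_snapshot
          GROUP BY "projectId"
        )
        SELECT
          p.id AS "projectId",
          -- CASE: only Givback-eligible projects accumulate rankable power
          CASE WHEN p."isGivbackEligible" THEN COALESCE(bp.total_power, 0) ELSE 0 END AS "totalPower"
        FROM project p
        LEFT JOIN boosted_power bp ON bp."projectId" = p.id;
    `);
    // Index the join key so refreshes and lookups stay fast
    await queryRunner.query(
      `CREATE UNIQUE INDEX IF NOT EXISTS example_project_rank_view_project_id
         ON example_project_rank_view ("projectId");`,
    );
  }

  public async down(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(`DROP MATERIALIZED VIEW IF EXISTS example_project_rank_view;`);
  }
}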
Actionable comments posted: 1
for (const project of projects) {
  newProjectCampaignCache[project.id]
    ? newProjectCampaignCache[project.id].push(campaign.slug)
    : (newProjectCampaignCache[project.id] = [campaign.slug]);
Refactor assignment within expression.
The assignment within the expression at line 72 should be refactored for better readability and to avoid potential confusion. Here's a suggested refactor:
- newProjectCampaignCache[project.id] ? newProjectCampaignCache[project.id].push(campaign.slug) : (newProjectCampaignCache[project.id] = [campaign.slug]);
+ if (!newProjectCampaignCache[project.id]) {
+ newProjectCampaignCache[project.id] = [];
+ }
+ newProjectCampaignCache[project.id].push(campaign.slug);
This change separates the conditional logic from the assignment, making the code clearer and easier to understand.
Committable suggestion (review carefully before committing to ensure it accurately replaces the highlighted code and passes tests):
if (!newProjectCampaignCache[project.id]) {
  newProjectCampaignCache[project.id] = [];
}
newProjectCampaignCache[project.id].push(campaign.slug);
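If the codebase targets ES2021 or later, another option is the nullish-coalescing assignment operator, which keeps the initialization and the push in one statement without hiding an assignment inside a ternary. A small sketch, assuming newProjectCampaignCache is a plain record from project id to campaign slugs (the helper name is hypothetical):

// Assumed shape: a plain record mapping project ids to campaign slugs.
const newProjectCampaignCache: Record<string, string[] | undefined> = {};

function addCampaignSlug(projectId: string, slug: string): void {
  // ??= creates the array on first use; subsequent calls just push onto it.
  (newProjectCampaignCache[projectId] ??= []).push(slug);
}

// Example usage with made-up values:
addCampaignSlug('42', 'example-campaign-slug');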
Tools
Biome
[error] 72-72: The assignment should not be in an expression.
The use of assignments in expressions is confusing.
Expressions are often considered as side-effect free. (lint/suspicious/noAssignInExpressions)
Actionable comments posted: 1
Outside diff range, codebase verification and nitpick comments (1)
migration/1716367359560-add_base_chain_tokens.ts (1)
Lines 54-56: Clarify the purpose of the empty down method.
The down method is empty, which might be intentional, but it's best practice to include a comment explaining why this method is left unimplemented. This helps other developers understand the decision, especially in a collaborative environment. Consider adding a comment like:
// Down migration not implemented due to the irreversible nature of the operation
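As a sketch of what that could look like in the migration class (the up body is elided here; only the class name comes from this PR, the comment wording is illustrative):

import { MigrationInterface, QueryRunner } from 'typeorm';

export class AddBaseChainTokens1716367359560 implements MigrationInterface {
  public async up(_queryRunner: QueryRunner): Promise<void> {
    // Token-seeding logic shown elsewhere in this PR.
  }

  public async down(_queryRunner: QueryRunner): Promise<void> {
    // Down migration not implemented due to the irreversible nature of the operation:
    // removing the seeded Base tokens and their organization links is not safely reversible.
  }
}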
try {
  const environment = config.get('ENVIRONMENT') as string;

  const networkId =
    environment === 'production'
      ? NETWORK_IDS.BASE_MAINNET
      : NETWORK_IDS.BASE_SEPOLIA;

  await queryRunner.manager.save(
    Token,
    seedTokens
      .filter(token => token.networkId === networkId)
      .map(token => {
        const t = {
          ...token,
        };
        t.address = t.address?.toLowerCase();
        delete t.chainType;
        return t;
      }),
  );
  const tokens = await queryRunner.query(`
    SELECT * FROM token
    WHERE "networkId" = ${networkId}
  `);
  const givethOrganization = (
    await queryRunner.query(`SELECT * FROM organization
        WHERE label='giveth'`)
  )[0];

  const traceOrganization = (
    await queryRunner.query(`SELECT * FROM organization
        WHERE label='trace'`)
  )[0];

  for (const token of tokens) {
    // Add all Base tokens to Giveth organization
    await queryRunner.query(`INSERT INTO organization_tokens_token ("tokenId","organizationId") VALUES
      (${token.id}, ${givethOrganization.id}),
      (${token.id}, ${traceOrganization.id})
    ;`);
  }
} catch (e) {
  // tslint:disable-next-line: no-console
  console.log('Error in migration AddBaseChainTokens1716367359560', e);
Enhance error handling and logging practices.
The addition of the try-catch block is a good practice for robust error handling in migrations. However, using console.log for error logging is not recommended in production environments. Consider using a more sophisticated logging framework that can handle log levels and is configurable based on the environment.
Additionally, the use of the delete operator on line 26 is flagged by static analysis tools for potential performance issues. It's recommended to set the property to undefined instead of deleting it to maintain performance, especially in a loop or frequent operations.
Consider the following changes:
- console.log('Error in migration AddBaseChainTokens1716367359560', e);
+ // Consider using a more sophisticated logging framework
+ logger.error('Error in migration AddBaseChainTokens1716367359560', e);
- delete t.chainType;
+ t.chainType = undefined;
Committable suggestion (review carefully before committing to ensure it accurately replaces the highlighted code, contains no missing lines, and passes tests):
try {
  const environment = config.get('ENVIRONMENT') as string;
  const networkId =
    environment === 'production'
      ? NETWORK_IDS.BASE_MAINNET
      : NETWORK_IDS.BASE_SEPOLIA;
  await queryRunner.manager.save(
    Token,
    seedTokens
      .filter(token => token.networkId === networkId)
      .map(token => {
        const t = {
          ...token,
        };
        t.address = t.address?.toLowerCase();
        t.chainType = undefined;
        return t;
      }),
  );
  const tokens = await queryRunner.query(`
    SELECT * FROM token
    WHERE "networkId" = ${networkId}
  `);
  const givethOrganization = (
    await queryRunner.query(`SELECT * FROM organization
        WHERE label='giveth'`)
  )[0];
  const traceOrganization = (
    await queryRunner.query(`SELECT * FROM organization
        WHERE label='trace'`)
  )[0];
  for (const token of tokens) {
    // Add all Base tokens to Giveth organization
    await queryRunner.query(`INSERT INTO organization_tokens_token ("tokenId","organizationId") VALUES
      (${token.id}, ${givethOrganization.id}),
      (${token.id}, ${traceOrganization.id})
    ;`);
  }
} catch (e) {
  // Consider using a more sophisticated logging framework
  logger.error('Error in migration AddBaseChainTokens1716367359560', e);
Tools
Biome
[error] 26-26: Avoid the delete operator which can impact performance.
Unsafe fix: Use an undefined assignment instead.
(lint/performance/noDelete)
GitHub Check: test
[failure] 53-53: Unexpected console statement
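If the suggested logger.error call is adopted, the migration would also need to import the shared logger; the module path and wrapper below are assumptions for illustration, not the repo's exact layout.

// Assumed import path; adjust to wherever the shared logger module actually lives.
import { logger } from '../src/utils/logger';

export async function runMigrationStepSafely(run: () => Promise<void>): Promise<void> {
  try {
    await run();
  } catch (e) {
    // Same message as the migration, but routed through the configurable logger.
    logger.error('Error in migration AddBaseChainTokens1716367359560', e);
  }
}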
Actionable comments posted: 1
try {
  const environment = config.get('ENVIRONMENT') as string;

  const networkId =
    environment === 'production'
      ? NETWORK_IDS.BASE_MAINNET
      : NETWORK_IDS.BASE_SEPOLIA;

  try {
    await queryRunner.manager.save(
      Token,
      seedTokens
        .filter(token => token.networkId === networkId)
        .map(token => {
          const t = {
            ...token,
          };
          t.address = t.address?.toLowerCase();
          delete t.chainType;
          return t;
        }),
    );
  } catch (e) {
    // eslint-disable-next-line no-console
    console.log(
      'Error in migration AddBaseChainTokens1716367359560, saving tokens',
      e,
    );
  }
  const tokens = await queryRunner.query(`
    SELECT * FROM token
    WHERE "networkId" = ${networkId}
  `);
  const givethOrganization = (
    await queryRunner.query(`SELECT * FROM organization
        WHERE label='giveth'`)
  )[0];

  const traceOrganization = (
    await queryRunner.query(`SELECT * FROM organization
        WHERE label='trace'`)
  )[0];

  for (const token of tokens) {
    // Add all Base tokens to Giveth organization
    await queryRunner.query(`INSERT INTO organization_tokens_token ("tokenId","organizationId") VALUES
      (${token.id}, ${givethOrganization.id}),
      (${token.id}, ${traceOrganization.id})
    ;`);
  }
} catch (e) {
  // eslint-disable-next-line no-console
  console.log('Error in migration AddBaseChainTokens1716367359560', e);
Enhanced Error Handling with Try-Catch Blocks
The addition of try-catch blocks around critical operations is a good practice for robust error handling. This change ensures that any exceptions thrown during the migration process are caught and logged, preventing the migration from failing silently.
However, the use of console.log for error logging is not recommended in production environments due to its lack of flexibility and inability to handle different log levels. Consider using a more sophisticated logging framework that can handle log levels and is configurable based on the environment.
Additionally, the use of the delete operator on line 27 is flagged by static analysis tools for potential performance issues. It's recommended to set the property to undefined instead of deleting it to maintain performance, especially in a loop or frequent operations.
Consider the following changes:
- console.log('Error in migration AddBaseChainTokens1716367359560', e);
+ // Consider using a more sophisticated logging framework
+ logger.error('Error in migration AddBaseChainTokens1716367359560', e);
- delete t.chainType;
+ t.chainType = undefined;
These changes will improve the performance and maintainability of the migration script.
Committable suggestion (review carefully before committing to ensure it accurately replaces the highlighted code, contains no missing lines, and passes tests):
try {
  const environment = config.get('ENVIRONMENT') as string;
  const networkId =
    environment === 'production'
      ? NETWORK_IDS.BASE_MAINNET
      : NETWORK_IDS.BASE_SEPOLIA;
  try {
    await queryRunner.manager.save(
      Token,
      seedTokens
        .filter(token => token.networkId === networkId)
        .map(token => {
          const t = {
            ...token,
          };
          t.address = t.address?.toLowerCase();
          t.chainType = undefined;
          return t;
        }),
    );
  } catch (e) {
    // Consider using a more sophisticated logging framework
    logger.error(
      'Error in migration AddBaseChainTokens1716367359560, saving tokens',
      e,
    );
  }
  const tokens = await queryRunner.query(`
    SELECT * FROM token
    WHERE "networkId" = ${networkId}
  `);
  const givethOrganization = (
    await queryRunner.query(`SELECT * FROM organization
        WHERE label='giveth'`)
  )[0];
  const traceOrganization = (
    await queryRunner.query(`SELECT * FROM organization
        WHERE label='trace'`)
  )[0];
  for (const token of tokens) {
    // Add all Base tokens to Giveth organization
    await queryRunner.query(`INSERT INTO organization_tokens_token ("tokenId","organizationId") VALUES
      (${token.id}, ${givethOrganization.id}),
      (${token.id}, ${traceOrganization.id})
    ;`);
  }
} catch (e) {
  // Consider using a more sophisticated logging framework
  logger.error('Error in migration AddBaseChainTokens1716367359560', e);
Tools
Biome
[error] 27-27: Avoid the delete operator which can impact performance.
Unsafe fix: Use an undefined assignment instead.
(lint/performance/noDelete)
Actionable comments posted: 1
@@ -10,6 +10,7 @@ export class createTokensTable1646295724658 implements MigrationInterface {
        name text COLLATE pg_catalog."default" NOT NULL,
        symbol text COLLATE pg_catalog."default" NOT NULL,
        address text COLLATE pg_catalog."default" NOT NULL,
        "isQR" BOOLEAN DEFAULT FALSE NOT NUL,
Correct the typographical error in the SQL statement.
The column definition for "isQR" incorrectly uses NOT NUL instead of NOT NULL, which will cause the SQL execution to fail with a syntax error.
Please apply this correction:
- "isQR" BOOLEAN DEFAULT FALSE NOT NUL,
+ "isQR" BOOLEAN DEFAULT FALSE NOT NULL,
Committable suggestion (review carefully before committing to ensure it accurately replaces the highlighted line):
"isQR" BOOLEAN DEFAULT FALSE NOT NULL,
Actionable comments posted: 2
await queryRunner.query(
  `ALTER TABLE token ADD COLUMN IF NOT EXISTS "isQR" BOOLEAN DEFAULT FALSE NOT NULL`,
);
Optimize Token Handling
The use of the delete operator on t.chainType can lead to performance issues, especially in loops or frequent operations. Consider setting the property to undefined instead to maintain optimal performance.
Apply this diff to optimize the token handling:
- delete t.chainType;
+ t.chainType = undefined;
Committable suggestion was skipped due to low confidence.
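Another way to satisfy the same lint rule is to omit chainType when the token object is copied, instead of deleting it afterwards. A minimal sketch with a simplified token shape (the real Token entity has more fields; the helper name is hypothetical):

// Simplified token shape for illustration; the real Token entity has more fields.
interface SeedToken {
  networkId: number;
  symbol: string;
  address?: string;
  chainType?: string;
}

function toInsertableToken(token: SeedToken): Omit<SeedToken, 'chainType'> {
  // Destructuring chainType away avoids both `delete` and an explicit undefined assignment.
  const { chainType: _chainType, ...rest } = token;
  return { ...rest, address: rest.address?.toLowerCase() };
}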
await queryRunner.query(
  `ALTER TABLE token ADD COLUMN IF NOT EXISTS "isQR" BOOLEAN DEFAULT FALSE NOT NULL`,
);
Enhance Error Handling and Logging
Consider wrapping the migration operations within a try-catch block to handle potential exceptions gracefully. Additionally, replace console.log with a more sophisticated logging framework to provide better control over logging levels and formats.
Here's a suggested implementation:
+ try {
    await queryRunner.query(
      `ALTER TABLE token ADD COLUMN IF NOT EXISTS "isQR" BOOLEAN DEFAULT FALSE NOT NULL`,
    );
    // Additional operations...
+ } catch (e) {
+   logger.error('Error in migration AddBaseChainTokens1716367359560', e);
+ }
Committable suggestion was skipped due to low confidence.
@CarlosQ96 Can you review it?
LGTM! Thanks for the tests and the work!
Just make sure the migrations ran well locally, and then we can merge.
Thanks man, will merge it after our release.
Notes: changed status from verified -> givbacksEligible
@mateodaza These are the changes you will need to apply on the FE.
It seems we just have these few changes in our API, but if you have any questions please let me know. cc: @divine-comedian
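For reference, on the FE the change mostly surfaces as the renamed/added eligibility field when querying projects. The query below is only a hypothetical shape to illustrate that; the query name and arguments are assumptions, and only the verified and isGivbackEligible field names come from this PR.

// Hypothetical GraphQL document for the frontend; the query name and argument
// types are assumptions, not the exact document used by the dapp.
export const PROJECT_GIVBACK_FIELDS = /* GraphQL */ `
  query ProjectGivbackFields($id: Float!) {
    projectById(id: $id) {
      id
      verified
      isGivbackEligible
    }
  }
`;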
* fix: remove memo for project verification managing funds
* fix: remove memo for project verification managing funds
* add projectId and qfRoundId to qf data export
* fix: getDraftDonationById bug (toWalletMemo can be null)
* fix: add memo for stellar project address uniqueness
* fix: add memo for manage address validation
* fix: add duplicate address error message for stellar
* fix: linter error
* add index for project stellar address
* eslint error
* fix: case when owner donate to his own peoject (Stellar chain)
* fix: add calculateGivbackFactor to Stellar cron job
* onlyEndaement option added to donationResolvers to get only endaoment projects
* chore: implementing coderabbitai suggestion to remove string literal
* feat: register secondary donation
* running migration to set project banners appropriately for endaoment … (#1778)
* running migration to set project banners appropriately for endaoment projects
* chore: correcting tab spaces for syntax
* fix: linter errors
* Modify add banner to endaoment projects migration (#1791) related to #1600
* Fix lint errors
* Fix running tests
* Fix projectResolver test cases
* Fix donationResolver test cases
* skip should renew the expiration date of the draft donation test case
---------
Co-authored-by: Hrithik Sampson <hrithiksampson@Hrithiks-MacBook-Air.local>
Co-authored-by: mohammadranjbarz <mranjbar.z2993@gmail.com>
* improve adminjs to import qfround matching and better filters
* fix eslint
* fix: remove adding secondary donation logic
* fix minor form issues
* order middleware in bootstrap file
* test: add test cases to fetch only Endaoment projects
* chore: change the second Project to first Project
* chore: change the second Project to first Project
* chore: change the second Project to first Project
* chore: change the second user to new user since it is interfering with the pre-existing test cases
* delete previous_round_rank when deleting a project (#1809)
* Implement allocatedGivbacks function (#1808)
* WIP Implement allocatedGivbacks function related to Giveth/giveth-dapps-v2#4678 Giveth/giveth-dapps-v2#4679
* allocatedGivbacks() endpoint implemented and works related to Giveth/giveth-dapps-v2#4678 Giveth/giveth-dapps-v2#4679
* Fix allocatedGivbacksQuery test cases
* migration: project banners for endaoment projects need to have the correct banners
* chore: underscore before unused variable in add_endaoment_project_banners
* Use Gnosis giv token for getting price of GIV
* Use superfluid mock adapter for test cases
* Use superfluid adapter on test env again
* Feat/separate givback verfied (#1770)
* add isGivbackEligible field
* add AddIsGivbackEligibleColumnToProject1637168932304
* add UpdateIsGivbackEligibleForVerifiedProjects1637168932305 migration
* add migration to rename isProjectVerified to isProjectGivbackEligible
* change isProjectVerified tp isProjectGivbackEligible
* update octant donation
* add approve project
* treat project.verified and project.isGivbackEligible equally on sorting
* remove reset verification status on verify
* check isGivbackEligible on create ProjectVerificationForm
* add ProjectInstantPowerViewV3 migration
* use verifiedOrIsGivbackEligibleCondition
* Use different materialized view for givback factor related to #1770
* Fix build error
* Fix build error
* Fix project query for isGivbackEligible and verified
* Fix add base token migration
* Fix eslint errors
* Fix add base token migration
* Fix add base token migration
* Fix add base token migration
* Fix donation test cases related to isGivbackEligible
* Fix build error
---------
Co-authored-by: Mohammad Ranjbar Z <mranjbar.z2993@gmail.com>
* Fix test cases related to isProjectVerified
* add isImported And categories to project tab
* fix isProjectGivbackEligible Migration in wrong folder
* add chaintype and solana networks to tokenTab
* update branch
* add environment and energy image mapping
* add categories to show and edit forms in adminjs for projects
* fix eslint
* add best match sort option
* update addSearchQuery to prioritize the title
* Add Stellar to QFRound
* run linter
* remove eager from project categories in entity
* Add isGivbackEligible filter
* Hotfix automatic model score sync (#1849)
* add user mbdscore sync workers and cronjob
* add active env var for syncing score
* add tests to the user sync worker and cronjob
* prevent duplicate tokens being added in adminJS
* Ensure correct emails are sent for project status changes related to decentralized verification
* fix test
* fix test cases
* fix test cases
---------
Co-authored-by: Meriem-BM <barhoumi.meriem1@gmail.com>
Co-authored-by: Carlos <carlos.quintero096@gmail.com>
Co-authored-by: HrithikSampson <hrithikedwardsampson@gmail.com>
Co-authored-by: HrithikSampson <56023811+HrithikSampson@users.noreply.github.com>
Co-authored-by: Hrithik Sampson <hrithiksampson@Hrithiks-MacBook-Air.local>
Co-authored-by: CarlosQ96 <92376054+CarlosQ96@users.noreply.github.com>
Co-authored-by: Cherik <Pourcheriki@gmail.com>
Co-authored-by: Ramin <raminramazanpour@gmail.com>
Summary by CodeRabbit
New Features
Bug Fixes
Documentation
Refactor
Tests
Chores