
Feat/planning area grid custom piece importer #887

Merged
merged 2 commits into develop from feat/planning-area-grid-custom-piece-importer on Mar 3, 2022

Conversation

angelhigueraacid (Contributor)

This PR adds a planning area grid custom piece importer.

@angelhigueraacid added the API label on Mar 3, 2022
@angelhigueraacid self-assigned this on Mar 3, 2022
vercel bot commented on Mar 3, 2022

This pull request is being automatically deployed with Vercel (learn more).
To see the status of your deployments, click below or on the icon next to each commit.

marxan – ./app

🔍 Inspect: https://vercel.com/vizzuality1/marxan/6ZXGCDVLAqwt67zBK9HAQDx6c7J1
✅ Preview: https://marxan-git-feat-planning-area-grid-custom-pi-40e9a4-vizzuality1.vercel.app

marxan-storybook – ./app

🔍 Inspect: https://vercel.com/vizzuality1/marxan-storybook/AUhzpcpxb6x4LLpKi3fcBo2yA5Gz
✅ Preview: https://marxan-storybook-git-feat-planning-area-grid-f86de8-vizzuality1.vercel.app

@aciddaute merged commit 8b9af04 into develop on Mar 3, 2022
@aciddaute deleted the feat/planning-area-grid-custom-piece-importer branch on Mar 3, 2022 at 11:21
```ts
const buffers = geomPUs.map((pu) => Buffer.from(pu.geom));
await this.geoprocessingEntityManager.query(
  `
    INSERT INTO planning_units_geom (type, project_id, the_geom)
```
Member

Quick note on this, and in general as you progress through imports of geo data, which in some real cases may span tens of thousands of records times a few columns: let's consider both PostgreSQL's limits on query parameters (IIRC it depends on the version, but easy to hit at the 10k scale, I think) and the efficiency of TypeORM at composing queries with "high" (for some value of "high") numbers of params.

My general suggestion would be to open a transaction (if not in one already) and split the inserts into batches, setting the batch size according to the number of parameters needed per row so that each statement stays safely within PostgreSQL's limits.

I also suggest always adding a short note about this in a comment next to the batching logic, so that whoever updates the query next (for example, to add more params) can take it into account and check whether the batch size needs to be reduced accordingly (e.g. if doubling the number of params needed for each row, halve the batch size).
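
A minimal sketch of the batching approach suggested here, assuming TypeORM's `EntityManager` and PostgreSQL's cap of 65535 bind parameters per statement; the `insertInBatches` helper, the `GeomRow` shape, and the plain-`Buffer` geometry parameter are illustrative assumptions, not code from this PR (the real insert would also need the appropriate geometry encoding/cast):

```ts
import { EntityManager } from 'typeorm';

// PostgreSQL allows at most 65535 bind parameters in a single statement.
// NOTE: if more columns are added to this INSERT, PARAMS_PER_ROW must be
// updated so that the derived batch size shrinks accordingly (e.g. doubling
// the params per row halves the batch size).
const MAX_PG_PARAMS = 65535;
const PARAMS_PER_ROW = 3; // type, project_id, the_geom
const BATCH_SIZE = Math.floor(MAX_PG_PARAMS / PARAMS_PER_ROW);

// Hypothetical row shape, for illustration only.
interface GeomRow {
  type: string;
  projectId: string;
  geom: Buffer; // real code would cast/encode this into a geometry value
}

async function insertInBatches(
  em: EntityManager,
  rows: GeomRow[],
): Promise<void> {
  await em.transaction(async (tx) => {
    for (let offset = 0; offset < rows.length; offset += BATCH_SIZE) {
      const batch = rows.slice(offset, offset + BATCH_SIZE);
      // Build "($1,$2,$3),($4,$5,$6),…" placeholders for this batch only,
      // so each statement stays under the parameter cap.
      const placeholders = batch
        .map((_, i) => {
          const p = i * PARAMS_PER_ROW;
          return `($${p + 1}, $${p + 2}, $${p + 3})`;
        })
        .join(', ');
      await tx.query(
        `INSERT INTO planning_units_geom (type, project_id, the_geom)
         VALUES ${placeholders}`,
        batch.flatMap((r) => [r.type, r.projectId, r.geom]),
      );
    }
  });
}
```

Deriving `BATCH_SIZE` from the params-per-row constant means the maintenance note and the batch size change together when columns are added, which is the point of the comment suggested above.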

Contributor

Wrapping the insert operations inside a transaction makes total sense 👍. However, since we are already working with streams, the inserts are being made in batches (chunks), right?
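
For context on that question: the chunk sizes a stream emits are governed by its highWaterMark (bytes, or an object count in object mode), not by the SQL parameter budget, so the two don't automatically line up. A hypothetical sketch of a re-batching step that makes the insert batch size explicit regardless of how the upstream reader chunks its data (the `Rebatcher` name and generic row type are assumptions, not code from this PR):

```ts
import { Transform, TransformCallback } from 'stream';

// Hypothetical re-batching transform: regroups whatever chunk sizes the
// upstream reader emits into fixed-size arrays, so the downstream INSERT
// batch size is controlled explicitly rather than by the stream's
// highWaterMark.
class Rebatcher<T> extends Transform {
  private buffer: T[] = [];

  constructor(private readonly batchSize: number) {
    super({ objectMode: true });
  }

  _transform(row: T, _enc: BufferEncoding, done: TransformCallback): void {
    this.buffer.push(row);
    if (this.buffer.length >= this.batchSize) {
      this.push(this.buffer);
      this.buffer = [];
    }
    done();
  }

  _flush(done: TransformCallback): void {
    if (this.buffer.length > 0) {
      this.push(this.buffer);
    }
    done();
  }
}
```

A pipeline could then do something like `source.pipe(new Rebatcher(BATCH_SIZE)).pipe(insertWriter)` (both `source` and `insertWriter` hypothetical here), with the batch size derived from the parameter limit as in the sketch above.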
