
Respect archiver limits on BatchAction download #9055

Merged · 2 commits · May 19, 2023

Conversation

pascalwengerter (Contributor)
Related Issue

Types of changes

  • New feature (non-breaking change which adds functionality)

Checklist:

  • Code changes
  • Needs clarification on fileSize calculation base (see comment)
  • Needs fix for action label reactivity
  • Unit tests added

Open tasks:

  • ...


const selectedFilesAmount = resources.length

// TODO: selectedFilesSize doesn't traverse into folders, so currently limited to directly selected resources
pascalwengerter (Contributor, Author):

@kulmann I don't suppose there's an easy way to do this?

Contributor:

I don't get your TODO comment. 🙈

  • We can check the accumulated size of the selected files and folders against the archiver capability. A folder size always reflects the size of all its children; tree-size aggregation happens backend-side. That is what the reduce statement below does, and it seems correct.
  • We can't check the accumulated number of files in the selection (that would require a PROPFIND with depth infinity). That is what the line above attempts, but it is incomplete: it only counts the directly selected resources, while you would need to count all children in the full tree of the selection.

To be honest I would just not look at the number of files. Doing the file size check already helps and is by far the more important check I think.
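The size-only check described above can be sketched roughly as follows. This is a minimal illustration, not the actual code in owncloud/web; `Resource`, `selectionSize`, and `exceedsArchiverSizeLimit` are illustrative names:

```typescript
// Minimal sketch of the size-only check.
// Folder sizes already include all children (aggregated backend-side),
// so summing the directly selected resources is sufficient.
interface Resource {
  size: number | string
}

const selectionSize = (resources: Resource[]): number =>
  resources.reduce(
    (total, r) => total + (typeof r.size === 'string' ? parseInt(r.size) : r.size),
    0
  )

const exceedsArchiverSizeLimit = (resources: Resource[], maxSizeBytes: number): boolean =>
  selectionSize(resources) > maxSizeBytes
```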

pascalwengerter marked this pull request as ready for review May 16, 2023 21:06
ownclouders (Contributor) commented May 16, 2023 with acceptance test results for oCIS:

  • 💥 oCISSharingPublic2 pipeline failed, build cancelled: https://drone.owncloud.com/owncloud/web/35735/66/1
  • 💥 oCISFiles3 pipeline failed, build cancelled: https://drone.owncloud.com/owncloud/web/35735/56/1
  • 💥 oCISFiles5 pipeline failed, build cancelled: https://drone.owncloud.com/owncloud/web/35735/58/1
  • 💥 oCISFavorites pipeline failed, build cancelled: https://drone.owncloud.com/owncloud/web/35735/53/1
  • 💥 oCISSharingBasic pipeline failed, build cancelled: https://drone.owncloud.com/owncloud/web/35735/52/1
  • 💥 oCISFiles2 pipeline failed, build cancelled: https://drone.owncloud.com/owncloud/web/35735/55/1
  • 💥 oCISFiles1 pipeline failed, build cancelled: https://drone.owncloud.com/owncloud/web/35735/54/1

}

const selectedFilesAmount = resources.length

Contributor:

Couldn't we return early here, so we don't need to run the accumulator if the limit is already reached?

Contributor (Author):

Sure, good idea :)
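The early return could look roughly like this (a hypothetical sketch: a plain loop replaces the reduce so accumulation stops as soon as the limit is hit):

```typescript
interface Resource {
  size: number | string
}

// Early-return variant: stop summing once the limit is exceeded,
// instead of always accumulating over the whole selection.
const exceedsLimit = (resources: Resource[], maxSizeBytes: number): boolean => {
  let total = 0
  for (const r of resources) {
    total += typeof r.size === 'string' ? parseInt(r.size) : r.size
    if (total > maxSizeBytes) {
      return true // no need to look at the remaining resources
    }
  }
  return false
}
```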

@@ -17,6 +17,8 @@ export interface ArchiverCapability {
formats: string[]
// eslint-disable-next-line camelcase
archiver_url: string
max_num_files: string
Contributor:

Why are those strings?


Contributor:

Couldn't we parse them beforehand?

Contributor:

No, this only reflects the raw capabilities coming from the backend. Parsing makes sense in the capability composables though.
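Parsing in a capability composable could then look roughly like this. This is a hypothetical sketch; the actual composable and field names in owncloud/web may differ:

```typescript
// Raw capability as delivered by the backend: numeric limits as strings.
interface ArchiverCapability {
  max_num_files: string
  max_size: string
}

// Hypothetical composable that exposes the raw string limits as numbers,
// so consumers never have to parse them again.
const useArchiverLimits = (capability: ArchiverCapability) => ({
  maxNumFiles: parseInt(capability.max_num_files),
  maxSizeBytes: parseInt(capability.max_size)
})
```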

@@ -68,6 +90,12 @@ export const useFileActionsDownloadArchive = ({ store }: { store?: Store<any> }
label: () => {
return $gettext('Download')
},
disabledTooltip: ({ resources }) => {
return archiverLimitsExceeded(resources)
? $gettext('Unable to download that many files at once')
Contributor (Author):

Open for suggestions on the final text here btw 😄

Contributor:

What about something like "The selection exceeds the allowed archive size (max. xyz MB)"? We should let the user know which limits they are working against. I also wouldn't mention the number of files here, since we can't check that.

By the way, I believe the backend behaviour is faulty: it should not offer an archive for download at all if the archive is incomplete. What's your take on that? If you agree, we should open an issue in ocis / reva.

Contributor (Author):

Not sure the average non-technical user knows how to parse "allowed archive size", but let's roll with it, since the wording can easily be adjusted later based on feedback
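The suggested tooltip, with the concrete limit formatted in MB, could be sketched like this (illustrative only; `formatLimitTooltip` is not a real function in the codebase):

```typescript
// Format the byte limit into MB so users know what they are working against.
const formatLimitTooltip = (maxSizeBytes: number): string => {
  const maxMegabytes = Math.round(maxSizeBytes / (1024 * 1024))
  return `The selection exceeds the allowed archive size (max. ${maxMegabytes} MB)`
}
```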

@@ -0,0 +1,8 @@
Enhancement: Respect archiver limits

The archiver service can now announce a limit for both the amount of resources and
Contributor:

Suggested change
The archiver service can now announce a limit for both the amount of resources and
The archiver service can now announce a limit for both the number and


const selectedFilesAmount = resources.length

// TODO: selectedFilesSize doesn't traverse into folders, so currently limited to directly selected resources

const selectedFilesSize = resources.reduce(
(accumulator, currentValue) =>
accumulator +
(typeof currentValue.size === 'string' ? parseInt(currentValue.size) : currentValue.size),
Contributor:

Suggested change
(typeof currentValue.size === 'string' ? parseInt(currentValue.size) : currentValue.size),
parseInt(`${currentValue.size}`),
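Both variants agree for plain integers and numeric strings, which a quick check illustrates (assumption: sizes are always integers or integer strings, never floats or malformed values):

```typescript
// Suggested one-liner: coerce to string, then parse.
const suggested = (size: number | string): number => parseInt(`${size}`)

// Original branching version.
const original = (size: number | string): number =>
  typeof size === 'string' ? parseInt(size) : size
```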

@@ -59,6 +59,28 @@ export const useFileActionsDownloadArchive = ({ store }: { store?: Store<any> }
})
}

const archiverLimitsExceeded = (resources: Resource[]) => {
Contributor:

Suggested change
const archiverLimitsExceeded = (resources: Resource[]) => {
const areArchiverLimitsExceeded = (resources: Resource[]) => {


sonarqubecloud:

SonarCloud Quality Gate failed.

  • Bugs: 0 (rated A)
  • Vulnerabilities: 0 (rated A)
  • Security Hotspots: 0 (rated A)
  • Code Smells: 1 (rated A)
  • Coverage: 20.0%
  • Duplication: 0.0%

kulmann (Contributor) left a comment:

Love it 🚀

kulmann (Contributor) commented May 19, 2023:

For the record: this is easier to test if one decreases the maximum file size allowed by the archiver, e.g. as follows:

FRONTEND_ARCHIVER_MAX_SIZE: "4194304"

kulmann merged commit 4141738 into owncloud:master May 19, 2023
pascalwengerter added a commit to pascalwengerter/web that referenced this pull request May 22, 2023
pascalwengerter deleted the feature/8456 branch July 13, 2023 16:15
micbar mentioned this pull request Jul 24, 2023
Successfully merging this pull request may close this issue: Limit batch download to a certain selection size