An automated check may be the best way to give depositors quick feedback about easily detected and correctable accessibility issues. However, we know that we likely won't be able to run an automated accessibility check on every type of file uploaded to ScholarSphere (in fact, we may only be able to do so for PDF files). We also know that a passing automated check does not guarantee that a file actually meets all WCAG standards. The only way to guarantee that a given file meets the standards is for a human reviewer to carefully inspect it and remediate any issues.
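As a minimal sketch of gating the check by file type (assuming, hypothetically, that PDF is the only format the checker supports):

```python
# Hypothetical sketch: gate the automated accessibility check by MIME type,
# since PDF may be the only format we can check automatically.
AUTO_CHECKABLE_TYPES = {"application/pdf"}

def can_auto_check(mime_type: str) -> bool:
    """Return True if the uploaded file is a type the autochecker supports."""
    return mime_type in AUTO_CHECKABLE_TYPES
```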
To ensure that all works deposited in ScholarSphere are actually accessible, we would need each work (regardless of whether it was automatically checked) to enter a queue for manual review by a ScholarSphere curator. After a curator has inspected every file attached to the current work version and worked with the depositor to remediate any issues, they would need a way to manually mark the work as reviewed for accessibility and thus remove it from the queue. If a new version of the work is created in the course of remediating accessibility issues, the work should not re-enter the queue; if a new version is created for any other reason, it should re-enter the queue for another manual accessibility review.
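To sketch the re-entry rule (the model and flag names below are hypothetical, not ScholarSphere's actual schema):

```python
from dataclasses import dataclass

@dataclass
class WorkVersion:
    work_id: int
    # Hypothetical flag, set when this version was created while
    # remediating accessibility issues with a curator.
    created_for_remediation: bool

def reenters_review_queue(new_version: WorkVersion) -> bool:
    """A new version sends the work back into the manual accessibility
    review queue unless it was created during remediation."""
    return not new_version.created_for_remediation
```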
This queue for manual review would be separate from the queue of works whose depositors actively requested an accessibility review from a curator prior to publication, and it should exclude works where curation was requested so that the two queues can be prioritized independently. The mechanism for resolving/removing works from both queues could probably be the same.
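One way the two queues and the shared resolution step could be expressed (the field names here are assumptions for illustration, not actual ScholarSphere attributes):

```python
from datetime import datetime, timezone

def manual_review_queue(works):
    # Works awaiting the default manual accessibility check, excluding
    # those whose depositors explicitly requested a review.
    return [w for w in works
            if w.needs_accessibility_review
            and not w.accessibility_remediation_requested]

def requested_review_queue(works):
    # Works whose depositors requested an accessibility review prior
    # to publication; prioritized separately.
    return [w for w in works
            if w.needs_accessibility_review
            and w.accessibility_remediation_requested]

def resolve_review(work, curator):
    # Shared resolution mechanism: a curator marks the work reviewed,
    # removing it from whichever queue it was in.
    work.accessibility_reviewed_at = datetime.now(timezone.utc)
    work.accessibility_reviewed_by = curator
    work.needs_accessibility_review = False
```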
Open questions:
Again, could/should this list be tracked in Airtable?
Is this actually a reasonable thing to do? In other words, will there be enough curators to handle this work? Or will the to-do list just get hopelessly long?
Based on the conversation on other tickets for this workflow, I think it's safe to say we'll just want to add a 'Needs accessibility review' label during the existing export to Airtable. The label only needs to be added if the files have not already gone through remediation (i.e., `accessibility_remediation_requested` is not true). #1608 will handle adding the autochecker results, but even if everything passes, we should still add the label so the files are at least briefly checked.
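A sketch of what that could look like in the export step (the record shape and field names are assumptions for illustration, not the actual Airtable integration):

```python
def airtable_fields(work) -> dict:
    """Build the record fields for the existing Airtable export,
    labeling works whose files haven't been remediated."""
    fields = {"Title": work.title}  # ...plus the other existing fields
    # Add the label unless remediation already happened. We add it even
    # when the #1608 autochecker passes, so a curator still gives the
    # files at least a brief look.
    if not work.accessibility_remediation_requested:
        fields["Labels"] = ["Needs accessibility review"]
    return fields
```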
From the epic (#1589)
Part of #1589