Brief description of problem/feature

We continue to have an issue with the size of the Seattle dataset and our current inability to reliably serve that data. This is a particularly big issue for CV applications, since our documentation assumes that the APIs will actually work. The API we run into the most issues with is /adminapi/labels/cvMetadata; requests to it often fail with a proxy error like the following:
Proxy Error
The proxy server received an invalid response from an upstream server.
The proxy server could not handle the request GET /adminapi/labels/cvMetadata.
Reason: Error reading from remote server
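For reference, the failure should be reproducible with a plain GET against the endpoint. Below is a minimal sketch using Java's built-in HTTP client; the base URL is a placeholder, and the timeouts are just guesses at something generous enough for the full Seattle response:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

public class CvMetadataCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder base URL; substitute the server actually being tested.
        String url = "https://example.org/adminapi/labels/cvMetadata";

        HttpClient client = HttpClient.newBuilder()
                .connectTimeout(Duration.ofSeconds(30))
                .build();
        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .timeout(Duration.ofMinutes(5))  // the full Seattle response is large and slow
                .GET()
                .build();

        // When the server falls over, the proxy returns an HTML error page instead of JSON.
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Status: " + response.statusCode());
        System.out.println("Body length: " + response.body().length());
    }
}
```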
I've tried to increase Java's heap size, but it's surprisingly difficult to tell whether the changes I'm making are even taking effect. There are multiple places where a max heap size can be set, and not all of them actually work. It's even harder to know whether the changes take effect on the production servers.
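One way to confirm what the JVM actually ended up with, regardless of where the setting was made, is to log the effective max heap and the arguments that reached the process. A minimal sketch in plain Java; the equivalent calls could be dropped into the server's startup code so the answer shows up in the production logs:

```java
import java.lang.management.ManagementFactory;
import java.util.List;

public class HeapReport {
    public static void main(String[] args) {
        // The max heap the JVM is actually willing to use, wherever it was configured.
        long maxHeapBytes = Runtime.getRuntime().maxMemory();
        System.out.printf("Effective max heap: %d MB%n", maxHeapBytes / (1024 * 1024));

        // The JVM arguments that actually reached this process; if the -Xmx we set
        // doesn't show up here, that setting is not taking effect.
        List<String> jvmArgs = ManagementFactory.getRuntimeMXBean().getInputArguments();
        System.out.println("JVM args: " + jvmArgs);
    }
}
```

Alternatively, running `java -XX:+PrintFlagsFinal -version` on the server and looking at MaxHeapSize shows the default the JVM would choose when no explicit flag is passed.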
Potential solution(s)
@michaelduan8 had previously suggested adding parameters that let users select a range of label IDs. That way they could query for only 50k labels at a time, for example.
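As an illustration of how that might look from the client side, here is a sketch that pages through labels in fixed-size ID ranges. The startId/endId parameters are hypothetical (nothing like them exists on the endpoint today), and the batch size and max ID are placeholder values:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class PagedCvMetadataClient {
    public static void main(String[] args) throws Exception {
        // Hypothetical parameters and bounds, purely for illustration.
        String base = "https://example.org/adminapi/labels/cvMetadata";
        int batchSize = 50_000;
        int maxLabelId = 1_000_000;  // would ideally come from a separate count/max-id endpoint

        HttpClient client = HttpClient.newHttpClient();
        for (int start = 0; start <= maxLabelId; start += batchSize) {
            int end = start + batchSize - 1;
            URI uri = URI.create(base + "?startId=" + start + "&endId=" + end);
            HttpRequest request = HttpRequest.newBuilder(uri).GET().build();
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.printf("IDs %d-%d: status %d, %d bytes%n",
                    start, end, response.statusCode(), response.body().length());
        }
    }
}
```

Server-side, each range would map to a bounded database query, so no single response would need to hold the whole Seattle dataset in memory at once.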
We could also put more effort into figuring out how to correctly configure Java's max heap size. We would also want to talk to CSE Support about how much memory is too much for us to take up: are we close to the limit of what is reasonable, or have we just failed to set an appropriate heap size for a real production website?