
OutOfMemoryError when downloading #6068

Closed
Helium314 opened this issue Dec 31, 2024 · 6 comments

@Helium314 (Collaborator)

When downloading large areas with high element density, the app may crash with an OOM error.
I got a bunch of error reports for SCEE since the release of 59.0, and also reproduced it in SC. I have never seen it with older versions.

How to Reproduce

  1. go to a well mapped city, e.g. https://www.openstreetmap.org/#map=15/25.00673/121.31786
  2. zoom out as much as possible, but to a level where you can still download data
    note that the area doesn't actually need to be close to the 12 km² limit; I got reports for 7.4 km² and even one coming from auto-downloading a 3.5 km² area
  3. manually download data
  4. see crash

Expected Behavior
Download completes, like it did on older versions.
I think it's related to the download itself, so probably not connected to MapLibre, as the stacktrace contains MapDataApiClient.getMap. #5686 was tested for performance, but it looks like increased memory use was not a consideration.

Stacktrace for 60.0
java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Failed to allocate a 38910912 byte allocation with 25165824 free bytes and 29MB until OOM, target footprint 195379296, growth limit 201326592
    at androidx.work.impl.utils.futures.AbstractFuture.getDoneValue(SourceFile:515)
    at androidx.work.impl.utils.futures.AbstractFuture.get(SourceFile:474)
    at androidx.work.impl.WorkerWrapper$2.run(SourceFile:316)
    at androidx.work.impl.utils.SerialExecutorImpl$Task.run(SourceFile:96)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1167)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:641)
    at java.lang.Thread.run(Thread.java:919)
Caused by: java.lang.OutOfMemoryError: Failed to allocate a 38910912 byte allocation with 25165824 free bytes and 29MB until OOM, target footprint 195379296, growth limit 201326592
    at java.lang.StringFactory.newStringFromChars(StringFactory.java:260)
    at java.lang.StringBuilder.toString(StringBuilder.java:413)
    at io.ktor.utils.io.charsets.EncodingKt.decode(SourceFile:102)
    at io.ktor.utils.io.core.StringsKt.readText(SourceFile:246)
    at io.ktor.utils.io.core.StringsKt.readText$default(SourceFile:245)
    at io.ktor.client.plugins.HttpPlainText.read$ktor_client_core(SourceFile:155)
    at io.ktor.client.plugins.HttpPlainText$Plugin$install$2.invokeSuspend(SourceFile:137)
    at io.ktor.client.plugins.HttpPlainText$Plugin$install$2.invoke(SourceFile:0)
    at io.ktor.client.plugins.HttpPlainText$Plugin$install$2.invoke(SourceFile:0)
    at io.ktor.util.pipeline.SuspendFunctionGun.loop(SourceFile:131)
    at io.ktor.util.pipeline.SuspendFunctionGun.proceed(SourceFile:89)
    at io.ktor.client.HttpClient$4.invokeSuspend(SourceFile:177)
    at io.ktor.client.HttpClient$4.invoke(SourceFile:0)
    at io.ktor.client.HttpClient$4.invoke(SourceFile:0)
    at io.ktor.util.pipeline.SuspendFunctionGun.loop(SourceFile:131)
    at io.ktor.util.pipeline.SuspendFunctionGun.proceed(SourceFile:89)
    at io.ktor.util.pipeline.SuspendFunctionGun.proceedWith(SourceFile:99)
    at io.ktor.client.plugins.HttpCallValidator$Companion$install$2.invokeSuspend(SourceFile:142)
    at io.ktor.client.plugins.HttpCallValidator$Companion$install$2.invoke(SourceFile:0)
    at io.ktor.client.plugins.HttpCallValidator$Companion$install$2.invoke(SourceFile:0)
    at io.ktor.util.pipeline.SuspendFunctionGun.loop(SourceFile:131)
    at io.ktor.util.pipeline.SuspendFunctionGun.proceed(SourceFile:89)
    at io.ktor.util.pipeline.SuspendFunctionGun.execute$ktor_utils(SourceFile:109)
    at io.ktor.util.pipeline.Pipeline.execute(SourceFile:77)
    at io.ktor.client.call.HttpClientCall.bodyNullable(SourceFile:89)
    at de.westnordost.streetcomplete.data.osm.mapdata.MapDataApiClient.getMap(SourceFile:156)
    at de.westnordost.streetcomplete.data.osm.mapdata.MapDataApiClient$getMap$1.invokeSuspend(Unknown Source:12)
    at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(SourceFile:33)
    at kotlinx.coroutines.DispatchedTask.run(SourceFile:101)
    at kotlinx.coroutines.internal.LimitedDispatcher$Worker.run(SourceFile:113)
    at kotlinx.coroutines.scheduling.TaskImpl.run(SourceFile:89)
    at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(SourceFile:589)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(SourceFile:823)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(SourceFile:720)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(SourceFile:707)
Versions affected
Tested on 59.0-alpha1 and 60.0.
Does not happen on 58.2.

Helium314 added the bug label on Dec 31, 2024
@Helium314 (Collaborator, Author)

If the memory use can't be reduced, maybe the max download area could be made dependent on the memoryClass, i.e. how many MB of heap the app is allowed to use. I got the crash with 192 MB, and also saw it in a crash report with 256 MB.
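
A minimal sketch of that idea, assuming a hypothetical maxDownloadAreaInSqKm helper (the thresholds are illustrative, the 12 km² value is the current limit mentioned above, and ActivityManager.memoryClass reports the approximate per-app heap limit in MB):

    import android.app.ActivityManager
    import android.content.Context

    // Hypothetical helper, not existing app code: scale the maximum manual
    // download area with the per-app heap limit reported by the system.
    fun maxDownloadAreaInSqKm(context: Context): Double {
        val activityManager =
            context.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
        val memoryClassMb = activityManager.memoryClass // approximate heap limit in MB
        return when {
            memoryClassMb >= 512 -> 12.0 // plenty of headroom: keep the current limit
            memoryClassMb >= 256 -> 8.0  // illustrative threshold
            else -> 4.0                  // e.g. the 192 MB devices from the reports
        }
    }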

@westnordost (Member) commented Dec 31, 2024

I think what changed in v59 is that the XML response of the OSM API is no longer parsed from the stream, but is first read entirely into a string and then parsed from there. This naturally requires more RAM. But this is known, and there are TODOs in the code to change that to stream parsing as soon as it is possible to do so with the HTTP library used. See #5686 (comment)
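
Roughly, the difference looks like this (a sketch assuming Ktor 2.x; feedChunkToParser is a hypothetical parser hook, and none of this is the actual StreetComplete code):

    import io.ktor.client.HttpClient
    import io.ktor.client.request.get
    import io.ktor.client.request.prepareGet
    import io.ktor.client.statement.bodyAsChannel
    import io.ktor.client.statement.bodyAsText
    import io.ktor.utils.io.ByteReadChannel

    // Reading the whole response into one String before parsing: a dense area
    // easily produces tens of MB of XML that must be held as one contiguous
    // String (the allocation that fails in the stacktrace above).
    suspend fun fetchAsString(client: HttpClient, url: String): String =
        client.get(url).bodyAsText()

    // Stream-parsing sketch: consume the response channel in small chunks and
    // feed them to an incremental parser, keeping only a fixed-size buffer.
    suspend fun fetchStreaming(
        client: HttpClient,
        url: String,
        feedChunkToParser: (ByteArray, Int) -> Unit, // hypothetical parser hook
    ) {
        client.prepareGet(url).execute { response ->
            val channel: ByteReadChannel = response.bodyAsChannel()
            val buffer = ByteArray(64 * 1024)
            while (!channel.isClosedForRead) {
                val read = channel.readAvailable(buffer, 0, buffer.size)
                if (read > 0) feedChunkToParser(buffer, read)
            }
        }
    }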

What devices are affected?

@Helium314 (Collaborator, Author)

I'd need to search, but from what I remember looking up, it was a Samsung A15 (2023) and an A5 (2017).
Didn't you get any OOM reports? I would assume you'd get a bunch more than I do.

@westnordost (Member)

As a matter of fact, I didn't (in the last 60 days). The most frequent error at the moment is #3858.

@Helium314 (Collaborator, Author)

Oh, I absolutely didn't expect that, but...
I just re-checked, and for some reason looking only at the logs hid an important fact that was literally in plain sight: SC doesn't actually crash, it's only the download worker that fails. So there is no visible error message, and the user just sees an incomplete download (i.e. only notes are displayed).

So I'm not sure how I actually got crash reports from this (though I have a suspicion related to SCEE-specific stuff).
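
That matches how WorkManager behaves: an exception or OutOfMemoryError thrown from a coroutine worker only fails that job, the process keeps running, and the failure stays invisible unless some code observes the WorkInfo. A minimal sketch with a hypothetical DownloadWorker, not the actual StreetComplete worker:

    import android.content.Context
    import androidx.work.CoroutineWorker
    import androidx.work.WorkInfo
    import androidx.work.WorkManager
    import androidx.work.WorkerParameters
    import java.util.UUID

    // Hypothetical stand-in for the real download worker.
    class DownloadWorker(context: Context, params: WorkerParameters) :
        CoroutineWorker(context, params) {

        override suspend fun doWork(): Result {
            downloadMapData() // assumed to end up in MapDataApiClient.getMap
            return Result.success()
        }

        private suspend fun downloadMapData() {
            // placeholder for the real download logic
        }
    }

    // An OOM thrown from doWork() fails only this job; the app keeps running.
    // The failure stays invisible unless something observes the work state:
    fun observeDownload(context: Context, workId: UUID) {
        WorkManager.getInstance(context)
            .getWorkInfoByIdLiveData(workId)
            .observeForever { info ->
                if (info?.state == WorkInfo.State.FAILED) {
                    // this is where a visible error message could be surfaced
                }
            }
    }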

@westnordost (Member)

Hmm, I think I'd rather not try some workaround then, i.e. close this as won't fix:

  • It is not (really) visible to the user
  • It likely mostly affects only old or low-end devices
  • A proper fix can be done as soon as the dependencies used support proper stream parsing, which is already a TODO in the code. Anything else is just bricolage that can at most reduce how often it occurs

westnordost closed this as not planned (won't fix) on Jan 1, 2025