This repository has been archived by the owner on Jun 20, 2023. It is now read-only.

Overhaul of DiagnosisKey related Download and Caching Mechanisms (EXPOSUREAPP-2469) (#1136)

* Rework of keyfile download and caching.
* Supports interop
* More modular for better testing and build-flavour-based behavior adjustments
* More resilient handling of failed downloads
* Preparations for future hourly downloads and server-side checksums

TODO: Finish unit tests, keycache migration and cache health check

* First batch of unit tests and some fixes for incorrect behavior that the tests surfaced.

* Added unit tests for the KeyCacheRepository

TODO: Tests for downloader and migration.

* Implemented POC for migrating old key files.

* Fixed legacy file migration and cleanup, improved logging.

* Added unit tests for legacy key file migration.

* Add fallback for different file hashes in the header.

* Yes, ktlint, we know it's a long method, but for this one it's better to read it as a single block than to jump between extra methods.

* Fixed more linting issues; adjusted the project code style to prevent a few of these in the future.

* Added missing unit tests for `KeyFileDownloader` and fixed faulty behavior that was noticed during testing.

* CRUD (instrumentation) test for `KeyCacheDatabase`

* Remove unused `FileStorageHelper` and related constants+tests.

* Fix last3Hours unit test in deviceRelease mode; we need to explicitly enable debug for these tests.

* Until we have more information about the hash sum's format in the header, default to the `ETag` header.
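
The `ETag` fallback can be sketched roughly as follows. This is an illustration only: the dedicated checksum header name and the function names are assumptions, not something this commit confirms.

```kotlin
// Sketch: prefer a dedicated checksum header if the server ever sends one,
// otherwise fall back to the ETag (stripped of weak-validator prefix and quotes).
data class ValidationResult(val expectedMD5: String?)

fun resolveExpectedChecksum(headers: Map<String, String>): ValidationResult {
    val dedicated = headers["cwa-hash"] // hypothetical header name
    val fallback = headers["ETag"]?.removePrefix("W/")?.trim('"')
    return ValidationResult(expectedMD5 = dedicated ?: fallback)
}
```

Note that per RFC 7232 an `ETag` is an opaque validator, so treating it as an MD5 only works as long as the server is known to populate it that way.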

* Split app config server API from diagnosis key download API,
and reintroduce caching for the app config download.

* Add test to check that the cache is used on flaky connections.

* Code changes based on PR comment, part #1.

* Code fluff, formatting.

* Handle download errors correctly.

* Refactoring:
* Remove unnecessary `currentDate`; we always start with the newest date from the server's index.
* Make a specialised class for header validation

* Let legacy cache migration abort early, depending on whether the key dir exists.

* If we can't create the base directory for the key repo, throw an exception.
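
The two behaviors above can be sketched like this (names are illustrative, not the actual signatures in the PR):

```kotlin
import java.io.File
import java.io.IOException

// 1. Legacy migration aborts early if the old key directory doesn't exist.
fun migrateLegacyKeyCache(legacyKeyDir: File) {
    if (!legacyKeyDir.exists()) return // nothing to migrate, abort early
    // ... otherwise import legacy files into the new cache, then clean up ...
}

// 2. Failing to create the cache base directory throws instead of limping on.
fun requireStorageDir(baseDir: File): File {
    if (!baseDir.exists() && !baseDir.mkdirs()) {
        throw IOException("Can't create base directory for key repo: $baseDir")
    }
    return baseDir
}
```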

* Delete cache entry if a download fails.

* Fixed test regression due to refactoring.

* Consolidate staleness check into `getStale`

* Consolidate cleanup for failed downloads into the download method.
Added tests to check that we delete the key cache entry if the download fails (which we didn't for hours :O!)
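
Consolidating the failure cleanup into the download path itself might look roughly like this (hypothetical helpers; the real `KeyFileDownloader` differs):

```kotlin
// Sketch: any failure during download removes the cache entry, so a
// half-written file can never be treated as a valid day/hour package later.
suspend fun downloadKeyFile(entry: CachedKeyEntry): java.io.File? = try {
    performDownload(entry) // assumed helper that fetches and stores the file
} catch (e: Exception) {
    deleteCacheEntry(entry) // assumed helper; removes DB row and file
    null
}
```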

* Because the hour-mode uses caching too, we add an explicit button to the test menu that clears the cache.

* Add comment with reference to ticket regarding follow up on the other headers.

* Move expected storage size per country into a named constant.

Co-authored-by: Matthias Urhahn <matthias.urhahn@sap.com>
d4rken authored Sep 11, 2020
1 parent 0530e63 commit f7f185a
Showing 77 changed files with 4,188 additions and 1,413 deletions.
2 changes: 2 additions & 0 deletions .idea/codeStyles/Project.xml


2 changes: 2 additions & 0 deletions Corona-Warn-App/build.gradle
Expand Up @@ -276,6 +276,8 @@ dependencies {
testImplementation "io.kotest:kotest-runner-junit5:4.2.0"
testImplementation "io.kotest:kotest-assertions-core-jvm:4.2.0"
testImplementation "io.kotest:kotest-property-jvm:4.2.0"
androidTestImplementation "io.kotest:kotest-assertions-core-jvm:4.2.0"
androidTestImplementation "io.kotest:kotest-property-jvm:4.2.0"

// Testing - Instrumentation
androidTestImplementation 'androidx.test:runner:1.2.0'
Expand Down
@@ -0,0 +1,76 @@
{
"formatVersion": 1,
"database": {
"version": 1,
"identityHash": "c4ef5f7d4d9672d11c8eb97a63d4a3c5",
"entities": [
{
"tableName": "keyfiles",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` TEXT NOT NULL, `type` TEXT NOT NULL, `location` TEXT NOT NULL, `day` TEXT NOT NULL, `hour` TEXT, `createdAt` TEXT NOT NULL, `checksumMD5` TEXT, `completed` INTEGER NOT NULL, PRIMARY KEY(`id`))",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "location",
"columnName": "location",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "day",
"columnName": "day",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "hour",
"columnName": "hour",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "createdAt",
"columnName": "createdAt",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "checksumMD5",
"columnName": "checksumMD5",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "isDownloadComplete",
"columnName": "completed",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": false
},
"indices": [],
"foreignKeys": []
}
],
"views": [],
"setupQueries": [
"CREATE TABLE IF NOT EXISTS room_master_table (id INTEGER PRIMARY KEY,identity_hash TEXT)",
"INSERT OR REPLACE INTO room_master_table (id,identity_hash) VALUES(42, 'c4ef5f7d4d9672d11c8eb97a63d4a3c5')"
]
}
}
@@ -0,0 +1,82 @@
{
"formatVersion": 1,
"database": {
"version": 1,
"identityHash": "03f6e8cba631c8ef60e506006913a1ad",
"entities": [
{
"tableName": "keyfiles",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` TEXT NOT NULL, `type` TEXT NOT NULL, `location` TEXT NOT NULL, `day` TEXT NOT NULL, `hour` TEXT, `sourceUrl` TEXT NOT NULL, `createdAt` TEXT NOT NULL, `checksumMD5` TEXT, `completed` INTEGER NOT NULL, PRIMARY KEY(`id`))",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "location",
"columnName": "location",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "day",
"columnName": "day",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "hour",
"columnName": "hour",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "sourceUrl",
"columnName": "sourceUrl",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "createdAt",
"columnName": "createdAt",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "checksumMD5",
"columnName": "checksumMD5",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "isDownloadComplete",
"columnName": "completed",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": false
},
"indices": [],
"foreignKeys": []
}
],
"views": [],
"setupQueries": [
"CREATE TABLE IF NOT EXISTS room_master_table (id INTEGER PRIMARY KEY,identity_hash TEXT)",
"INSERT OR REPLACE INTO room_master_table (id,identity_hash) VALUES(42, '03f6e8cba631c8ef60e506006913a1ad')"
]
}
}
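
The schema above corresponds roughly to a Room entity like the following. This is a sketch reconstructed from the `createSql` string, with Kotlin types inferred from the test file's usage; the actual `CachedKeyInfo` class may differ in annotations and type converters.

```kotlin
import androidx.room.ColumnInfo
import androidx.room.Entity
import androidx.room.PrimaryKey
import org.joda.time.Instant
import org.joda.time.LocalDate
import org.joda.time.LocalTime

// Reconstructed from the "keyfiles" schema JSON above (sketch only).
@Entity(tableName = "keyfiles")
data class CachedKeyInfoSketch(
    @PrimaryKey @ColumnInfo(name = "id") val id: String,
    @ColumnInfo(name = "type") val type: String,
    @ColumnInfo(name = "location") val location: String,
    @ColumnInfo(name = "day") val day: LocalDate,
    @ColumnInfo(name = "hour") val hour: LocalTime?,
    @ColumnInfo(name = "sourceUrl") val sourceUrl: String,
    @ColumnInfo(name = "createdAt") val createdAt: Instant,
    @ColumnInfo(name = "checksumMD5") val checksumMD5: String?,
    @ColumnInfo(name = "completed") val isDownloadComplete: Boolean
)
```

Since all columns have `TEXT`/`INTEGER` affinity, the Joda-Time fields and the boolean would need Room `@TypeConverter`s mapping them to strings and integers.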
@@ -0,0 +1,79 @@
package de.rki.coronawarnapp.diagnosiskeys.storage

import android.content.Context
import androidx.test.core.app.ApplicationProvider
import androidx.test.ext.junit.runners.AndroidJUnit4
import de.rki.coronawarnapp.diagnosiskeys.server.LocationCode
import io.kotest.matchers.shouldBe
import kotlinx.coroutines.runBlocking
import org.joda.time.Instant
import org.joda.time.LocalDate
import org.joda.time.LocalTime
import org.junit.Test
import org.junit.runner.RunWith

@RunWith(AndroidJUnit4::class)
class KeyCacheDatabaseTest {
private val database = KeyCacheDatabase.Factory(
ApplicationProvider.getApplicationContext<Context>()
).create()
private val dao = database.cachedKeyFiles()

@Test
fun crud() {
val keyDay = CachedKeyInfo(
type = CachedKeyInfo.Type.COUNTRY_DAY,
location = LocationCode("DE"),
day = LocalDate.now(),
hour = null,
createdAt = Instant.now()
)
val keyHour = CachedKeyInfo(
type = CachedKeyInfo.Type.COUNTRY_HOUR,
location = LocationCode("DE"),
day = LocalDate.now(),
hour = LocalTime.now(),
createdAt = Instant.now()
)
runBlocking {
dao.clear()

dao.insertEntry(keyDay)
dao.insertEntry(keyHour)
dao.getAllEntries() shouldBe listOf(keyDay, keyHour)
dao.getEntriesForType(CachedKeyInfo.Type.COUNTRY_DAY.typeValue) shouldBe listOf(keyDay)
dao.getEntriesForType(CachedKeyInfo.Type.COUNTRY_HOUR.typeValue) shouldBe listOf(keyHour)

dao.updateDownloadState(keyDay.toDownloadUpdate("coffee"))
dao.getEntriesForType(CachedKeyInfo.Type.COUNTRY_DAY.typeValue).single().apply {
isDownloadComplete shouldBe true
checksumMD5 shouldBe "coffee"
}
dao.getEntriesForType(CachedKeyInfo.Type.COUNTRY_HOUR.typeValue).single().apply {
isDownloadComplete shouldBe false
checksumMD5 shouldBe null
}

dao.updateDownloadState(keyHour.toDownloadUpdate("with milk"))
dao.getEntriesForType(CachedKeyInfo.Type.COUNTRY_DAY.typeValue).single().apply {
isDownloadComplete shouldBe true
checksumMD5 shouldBe "coffee"
}
dao.getEntriesForType(CachedKeyInfo.Type.COUNTRY_HOUR.typeValue).single().apply {
isDownloadComplete shouldBe true
checksumMD5 shouldBe "with milk"
}

dao.deleteEntry(keyDay)
dao.getAllEntries() shouldBe listOf(
keyHour.copy(
isDownloadComplete = true,
checksumMD5 = "with milk"
)
)

dao.clear()
dao.getAllEntries() shouldBe emptyList()
}
}
}

This file was deleted.

