4. The AppTask Model
An Application Task – or AppTask for short – is a special type of task that is triggered by the user via the application.
The PulmonaryMonitor app provides a good example of how the AppTask model works and we shall use this as the running example here.
An AppTask is typically shown to the user, e.g. in a list of tasks as shown in Figure 1.
Figure 1 - User interface of the PulmonaryMonitor app with a list of tasks for the user to do. Each card represents an AppTask.
The task list in Figure 1 is created from the different `AppTask`s defined in the `study_protocol_manager.dart` file. For example, the sensing app task at the bottom of the list is created by this configuration:
```dart
StudyProtocol protocol = StudyProtocol(
  name: 'Pulmonary Monitor',
  ownerId: 'alex@uni.dk',
);

// Add an app task that once per hour asks the user to
// collect air quality and weather data - and notify the user.
//
// Note that in order for this to work, the air_quality and weather
// services need to be defined and added as connected devices to this
// phone.
protocol.addTriggeredTask(
    IntervalTrigger(period: Duration(hours: 1)),
    AppTask(
      type: BackgroundSensingUserTask.ONE_TIME_SENSING_TYPE,
      title: "Weather & Air Quality",
      description: "Collect local weather and air quality data",
      notification: true,
    )..addMeasures([
        Measure(type: ContextSamplingPackage.WEATHER),
        Measure(type: ContextSamplingPackage.AIR_QUALITY)
      ]),
    phone);
```
As can be seen from the above code, an `AppTask` follows the general Trigger-Task-Measure model in CAMS. Hence, an `AppTask` is triggered by one of the available triggers and contains a set of measures to be collected. In contrast to a `BackgroundTask`, an `AppTask` can also be configured with user-facing properties (see the sketch after the list), such as:
- `type` - e.g., sensing, survey, or audio
- `title` - a short title for the task card
- `description` - a short description shown on the task card
- `instructions` - more detailed instructions
- `time to complete` - the estimated time for the user to complete this task
- `notification` - should a notification on a new app task be sent to the user?
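To illustrate, the sketch below configures a survey app task using all of these properties. It is a hypothetical example; the `instructions` and `minutesToComplete` parameter names are assumptions for the "instructions" and "time to complete" properties, so check the `AppTask` API documentation for the exact names.

```dart
// A hypothetical survey app task using the user-facing properties above.
// NOTE: `instructions` and `minutesToComplete` are assumed parameter names.
AppTask(
  type: SurveyUserTask.SURVEY_TYPE,
  title: "Daily Survey",
  description: "A short survey on your daily wellbeing",
  instructions: "Find a quiet place and answer all questions before bedtime.",
  minutesToComplete: 5,
  notification: true,
);
```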
The above code adds an `IntervalTrigger` with an `AppTask` of type `ONE_TIME_SENSING_TYPE` to the study. This app task contains the two measures `WEATHER` and `AIR_QUALITY`.
The result of this sensing configuration is that when the trigger triggers, the app task is added to the task queue, which can be shown in a user interface like the one in Figure 1. Later, the app task can be activated by the user (e.g., by pushing the PRESS HERE TO FINISH TASK button), thereby resuming the collection of the measures (i.e., weather and air quality) exactly once. When the measures have been collected, the app task is marked as "done" in the task queue, illustrated by a green check mark as shown in Figure 1 (right).
The README file of the PulmonaryMonitor app contains a more detailed description of this app and how the different AppTasks are configured. I recommend reading it before continuing here.
When an `AppTask` is triggered, it is executed by an `AppTaskExecutor`. The runtime behaviour is the following:
- Based on the `AppTask` configuration, a `UserTask` is created. This user task embeds a `BackgroundTaskExecutor` which later (when the user wants) can be used to execute the defined measures.
- This user task is enqueued in the `AppTaskController`. The `AppTaskController` is a singleton and is core to the handling of AppTasks, including creating notifications.
- The `UserTask` is now available for the app to be shown in any custom manner.
- When the user wants to execute the task, the app calls back to the `UserTask` for execution, using the embedded `BackgroundTaskExecutor`.
The code examples below show how this is used in the PulmonaryMonitor app.
Access to the enqueued user tasks is handled in the `sensing_bloc.dart` BLoC of the PulmonaryMonitor app. The queue of user tasks can be accessed via the `userTaskQueue` property of the `AppTaskController` singleton:
```dart
List<UserTask> get tasks => AppTaskController().userTaskQueue;
```
An app can also listen to events on the user task queue:
```dart
AppTaskController().userTaskEvents.listen((event) {
  switch (event.state) {
    case UserTaskState.initialized:
      // the user task has been created
      break;
    case UserTaskState.enqueued:
      // the user task has been put on the task queue
      break;
    case UserTaskState.dequeued:
      // the user task has been removed from the task queue
      break;
    case UserTaskState.started:
      // the user has started the task
      break;
    case UserTaskState.done:
      // the user task is done
      break;
    case UserTaskState.undefined:
      // the state of the user task is unknown
      break;
    case UserTaskState.canceled:
      // the user has canceled the task
      break;
    case UserTaskState.expired:
      // the user task has expired
      break;
  }
});
```
The enqueued user tasks are now available to be rendered in the app in whatever manner fits the specific app. Again, here are examples of how it is done in the Pulmonary Monitor app. The user interface code for the task list in Figure 1 is in the `task_list_page.dart` file.
For example, to build the scrollable list view of cards, the following `StreamBuilder` is used:
```dart
body: StreamBuilder<UserTask>(
  stream: AppTaskController().userTaskEvents,
  builder: (context, AsyncSnapshot<UserTask> snapshot) {
    return Scrollbar(
      child: ListView.builder(
        itemCount: tasks.length,
        padding: EdgeInsets.symmetric(vertical: 8.0),
        itemBuilder: (context, index) =>
            _buildTaskCard(context, tasks[index]),
      ),
    );
  },
),
```
And to render the UI of the cards representing a user task, the following `StreamBuilder` is used:
```dart
Widget _buildTaskCard(BuildContext context, UserTask userTask) {
  return Center(
    child: Card(
      elevation: 10,
      shape: RoundedRectangleBorder(
        borderRadius: BorderRadius.circular(15.0),
      ),
      child: StreamBuilder<UserTaskState>(
        stream: userTask.stateEvents,
        initialData: UserTaskState.initialized,
        builder: (context, AsyncSnapshot<UserTaskState> snapshot) => Column(
          mainAxisSize: MainAxisSize.min,
          children: <Widget>[
            ListTile(
              leading: taskTypeIcon[userTask.type],
              title: Text(userTask.title),
              subtitle: Text(userTask.description),
              trailing: taskStateIcon[userTask.state],
            ),
            (userTask.state == UserTaskState.enqueued ||
                    userTask.state == UserTaskState.onhold)
                ? ButtonBar(
                    children: <Widget>[
                      FlatButton(
                          child: const Text('PRESS HERE TO FINISH TASK'),
                          onPressed: () => userTask.onStart(context)),
                    ],
                  )
                : Text(""),
          ],
        ),
      ),
    ),
  );
}
```
Note the callback to the user task when the button is pressed and the task has to be started - `userTask.onStart(context)`.
A `UserTask` has four callback methods to use from the app (see the sketch after the list):
- `onStart` - will start the task, i.e., resume the measures defined in it.
- `onCancel` - will cancel the task.
- `onDone` - will mark the task as done and typically pause the measures defined in it.
- `onNotification` - is called by the OS when the user clicks the notification linked to this user task.
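For illustration, a minimal sketch of wiring these callbacks to buttons on a task card could look like the following. This is hypothetical UI code (not from the Pulmonary Monitor app), and it assumes that `onCancel` and `onDone` - like `onStart` - take the build context as argument.

```dart
// Hypothetical buttons wiring the UserTask callbacks to the UI.
// ASSUMPTION: onCancel and onDone take the build context, like onStart.
Widget _buildTaskButtons(BuildContext context, UserTask userTask) => ButtonBar(
      children: <Widget>[
        FlatButton(
            child: const Text('START'),
            onPressed: () => userTask.onStart(context)),
        FlatButton(
            child: const Text('CANCEL'),
            onPressed: () => userTask.onCancel(context)),
        FlatButton(
            child: const Text('DONE'),
            onPressed: () => userTask.onDone(context)),
      ],
    );
```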
An app task can be configured to provide a `notification`. In this case, a local notification containing the `title` and `description` of the task will be sent to the phone's notification system. This will look as shown below.
This notification setup uses the `awesome_notifications` Flutter package and needs some configuration in your app to work. Please check the notes on the Best Practice on Notifications page.
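As a rough sketch, initializing the `awesome_notifications` plugin in an app typically looks something like the code below. The channel key and settings shown here are generic placeholders, not the configuration CAMS expects - see the Best Practice on Notifications page for that.

```dart
import 'package:awesome_notifications/awesome_notifications.dart';

// A generic initialization sketch for the awesome_notifications plugin.
// NOTE: the channel key/name here are placeholders; use the configuration
// described in the CAMS notification documentation.
void initializeNotifications() {
  AwesomeNotifications().initialize(
    null, // use the default app icon
    [
      NotificationChannel(
        channelKey: 'basic_channel',
        channelName: 'Task notifications',
        channelDescription: 'Notifications for new app tasks',
      ),
    ],
  );
}
```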
Currently, CAMS supports four types of AppTasks:
- `BackgroundSensingUserTask.SENSING_TYPE` - a sensing user task which can be resumed and paused by calling `onStart` and `onDone`, respectively (see the sketch below the list).
- `BackgroundSensingUserTask.ONE_TIME_SENSING_TYPE` - a sensing user task which can be resumed once by calling `onStart`.
- `SurveyUserTask.SURVEY_TYPE` - an app task that starts a survey from the carp_survey_package.
- `SurveyUserTask.COGNITIVE_ASSESSMENT_TYPE` - an app task that handles a cognitive assessment from the carp_survey_package.
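To contrast the two sensing types, a sketch of a `SENSING_TYPE` app task is shown below; unlike the `ONE_TIME_SENSING_TYPE` task from the earlier example, it keeps collecting data from `onStart` until `onDone` is called. The `ImmediateTrigger` and the `MOBILITY` measure are chosen for illustration only.

```dart
// A sketch of a SENSING_TYPE app task that collects mobility data from
// when the user starts it (onStart) until it is done (onDone).
// NOTE: the trigger and measure are chosen for illustration only.
protocol.addTriggeredTask(
    ImmediateTrigger(),
    AppTask(
      type: BackgroundSensingUserTask.SENSING_TYPE,
      title: "Mobility",
      description: "Collect mobility data while this task is active",
    )..addMeasures([
        Measure(type: ContextSamplingPackage.MOBILITY),
      ]),
    phone);
```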
As CAMS is made to be extensible, it is also possible to extend the AppTask model and make custom app tasks. This is done by creating a new type of `UserTask` and ensuring it is available. This model is very similar to how new probes/measures are added to CAMS, as explained in Chapter 4.
The audio tasks used in the PulmonaryMonitor app are an example of a custom AppTask made specifically for this app. This custom audio user task is defined in the `audio_user_task.dart` file:
```dart
/// A user task handling audio recordings.
/// When started, creates an [AudioMeasurePage] and shows it to the user.
class AudioUserTask extends UserTask {
  static const String AUDIO_TYPE = 'audio';

  StreamController<int> _countDownController = StreamController.broadcast();
  Stream<int> get countDownEvents => _countDownController.stream;

  /// Duration of audio recording in seconds.
  int recordingDuration = 10;

  AudioUserTask(AppTaskExecutor executor) : super(executor);

  void onStart(BuildContext context) {
    super.onStart(context);
    Navigator.push(
      context,
      MaterialPageRoute(
          builder: (context) => AudioMeasurePage(audioUserTask: this)),
    );
  }

  /// Callback when recording is to start.
  void onRecord() {
    executor.resume();
    Timer.periodic(new Duration(seconds: 1), (timer) {
      _countDownController.add(--recordingDuration);
      if (recordingDuration == 0) {
        timer.cancel();
        _countDownController.close();
        executor.pause();
        state = UserTaskState.done;
      }
    });
  }
}
```
In order to know how to create and enqueue the right user task, CAMS uses a `UserTaskFactory`, which has to be defined for this `AudioUserTask`:
```dart
class PulmonaryUserTaskFactory implements UserTaskFactory {
  List<String> types = [
    AudioUserTask.AUDIO_TYPE,
  ];

  UserTask create(AppTaskExecutor executor) {
    switch (executor.appTask.type) {
      case AudioUserTask.AUDIO_TYPE:
        return AudioUserTask(executor);
      default:
        return SensingUserTask(executor);
    }
  }
}
```
This user task factory has to be registered in the `AppTaskController`, like this:
```dart
AppTaskController().registerUserTaskFactory(PulmonaryUserTaskFactory());
```
This typically takes place where the app registers the different sampling packages. In the Pulmonary Monitor app, this takes place when the `Sensing` class is created in the `sensing.dart` file.
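To illustrate, the registration could be placed together with the sampling package registration, roughly as sketched below. The `Sensing` class body here is a hypothetical simplification; the real `sensing.dart` file contains considerably more.

```dart
// A hypothetical, simplified Sensing class showing where the user task
// factory registration could live, next to the sampling package
// registration. The real sensing.dart file contains more than this.
class Sensing {
  Sensing() {
    // register the sampling packages used in the study protocol
    SamplingPackageRegistry().register(ContextSamplingPackage());

    // register the custom user task factory
    AppTaskController().registerUserTaskFactory(PulmonaryUserTaskFactory());
  }
}
```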