
4. The AppTask Model

Jakob E. Bardram edited this page Oct 13, 2024 · 7 revisions

An Application Task – or AppTask for short – is a special type of task that is started by the user via the application. The PulmonaryMonitor app provides a good example of how the AppTask model works, and we shall use it as the running example here.

Configuring and Using AppTasks

An AppTask is typically shown to the user, e.g. in a list of tasks as shown in Figure 1.


Figure 1 - The user interface (UI) of the PulmonaryMonitor app with a task list for the user to do. Each card represents an AppTask.

The task list in Figure 1 is created from the different AppTasks defined in the study_protocol_manager.dart file. For example, the sensing app task at the bottom of the list is created by this configuration:

    var protocol = StudyProtocol(
      name: 'Pulmonary Monitor',
      ownerId: 'alex@uni.dk',
    );

    // Add an app task that once per hour asks the user to
    // collect air quality and weather data - and notifies the user.
    //
    // Note that for this to work, the air_quality and weather services need
    // to be defined and added as connected devices to this phone.
    protocol.addTaskControl(
        PeriodicTrigger(period: const Duration(hours: 1)),
        AppTask(
            type: BackgroundSensingUserTask.ONE_TIME_SENSING_TYPE,
            title: "Weather & Air Quality",
            description: "Collect local weather and air quality data",
            notification: true,
            measures: [
              Measure(type: ContextSamplingPackage.WEATHER),
              Measure(type: ContextSamplingPackage.AIR_QUALITY),
            ]),
        phone);

As can be seen from the above code, an AppTask follows the general Trigger-Task-Measure domain model in CAMS. Hence, an AppTask is triggered by one of the available triggers and contains a set of measures to be collected. In comparison to a BackgroundTask, an AppTask can also be configured with user-facing properties, such as:

  • type - e.g., sensing, survey, or audio
  • title - a short title for the task card
  • description - a short description shown on a task card
  • instructions - more detailed instructions
  • time to complete - the estimated time for the user to complete this task
  • notification - should a notification on a new task be sent to the user?
  • expire - when does this task expire and become unavailable?
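
As a sketch, an AppTask using all of these properties could be configured like this. The parameter names instructions, minutesToComplete, and expire follow the CAMS AppTask API but may differ between versions:

    // A sketch of an AppTask using the user-facing properties above.
    // The parameter names (instructions, minutesToComplete, expire)
    // follow the CAMS AppTask API, but may differ between versions.
    var task = AppTask(
        type: BackgroundSensingUserTask.ONE_TIME_SENSING_TYPE,
        title: "Weather & Air Quality",
        description: "Collect local weather and air quality data",
        instructions: "Make sure you are outdoors when starting this task.",
        minutesToComplete: 2,
        expire: const Duration(days: 1),
        notification: true,
        measures: [
          Measure(type: ContextSamplingPackage.WEATHER),
          Measure(type: ContextSamplingPackage.AIR_QUALITY),
        ]);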

The above code adds a PeriodicTrigger with an AppTask of type ONE_TIME_SENSING_TYPE to the study. This app task contains two measures: WEATHER and AIR_QUALITY. The result of this sensing configuration is that when triggered, the app task is added to the task queue, which can be shown in a user interface like the one in Figure 1. Later, the app task can be activated by the user (e.g., by pushing the PRESS HERE TO FINISH TASK button), thereby starting the collection of the measures (i.e., weather and air quality) exactly once. When the measures have been collected, the app task is marked as "done" in the task queue, illustrated by a green check mark as shown in Figure 1 (right).

The README file of the PulmonaryMonitor app contains a more detailed description of this app and how the different AppTasks are configured. It is recommended to read it before continuing here.

AppTask Execution

When an AppTask is triggered, it is executed by an AppTaskExecutor. The runtime behavior is the following:

  1. Based on the AppTask configuration, a UserTask is created. This user task embeds a BackgroundTaskExecutor, which can later (when the user wants) be used to execute the defined measures.
  2. This user task is enqueued in the AppTaskController. The AppTaskController is a singleton and is core to the handling of AppTasks, including creating notifications.
  3. The UserTask is now available to the app and can be shown in any custom manner.
  4. When the user wants to execute the task, the app calls back to the UserTask for execution, using the embedded BackgroundTaskExecutor.

The code examples below show how this is used in the PulmonaryMonitor app.

The AppTaskController and User Task Queue

Access to the enqueued user tasks is handled in the sensing_bloc.dart BLoC of the PulmonaryMonitor app. The queue of user tasks can be accessed by the userTaskQueue property of the AppTaskController singleton:

List<UserTask> get tasks => AppTaskController().userTaskQueue;

An app can also listen to events on the user task queue:

 AppTaskController().userTaskEvents.listen((event) {
   switch (event.state) {
     case UserTaskState.initialized:
       // the user task has been created
       break;
     case UserTaskState.enqueued:
       // the user task has been added to the task queue
       break;
     case UserTaskState.dequeued:
       // the user task has been removed from the task queue
       break;
     case UserTaskState.started:
       // the user has started the task
       break;
     case UserTaskState.done:
       // the user has completed the task
       break;
     case UserTaskState.undefined:
       // the state of the task is unknown
       break;
     case UserTaskState.canceled:
       // the user has canceled the task
       break;
     case UserTaskState.expired:
       // the task expired before the user completed it
       break;
   }
 });

Using the User Task in the User Interface of the App

The enqueued user tasks are now available to be rendered in the app in any manner that fits the specific app. Again, here are examples of how this is done in the Pulmonary Monitor app. The user interface code for the task list in Figure 1 is in the task_list_page.dart file.

For example, to build the scrollable list view of cards, the following StreamBuilder is used:

      body: StreamBuilder<UserTask>(
        stream: AppTaskController().userTaskEvents,
        builder: (context, AsyncSnapshot<UserTask> snapshot) {
          return Scrollbar(
            child: ListView.builder(
              itemCount: tasks.length,
              padding: const EdgeInsets.symmetric(vertical: 8.0),
              itemBuilder: (context, index) =>
                  _buildTaskCard(context, tasks[index]),
            ),
          );
        },
      ),

To render the UI of the cards representing a user task, the following StreamBuilder is used:

  Widget _buildTaskCard(BuildContext context, UserTask userTask) {
    return Center(
      child: Card(
        elevation: 10,
        shape: RoundedRectangleBorder(
          borderRadius: BorderRadius.circular(15.0),
        ),
        child: StreamBuilder<UserTaskState>(
          stream: userTask.stateEvents,
          initialData: UserTaskState.initialized,
          builder: (context, AsyncSnapshot<UserTaskState> snapshot) => Column(
            mainAxisSize: MainAxisSize.min,
            children: <Widget>[
              ListTile(
                leading: taskTypeIcon[userTask.type],
                title: Text(userTask.title),
                subtitle: Text(userTask.description),
                trailing: taskStateIcon[userTask.state],
              ),
              (userTask.availableForUser)
                  ? ButtonBar(
                      children: <Widget>[
                        TextButton(
                            child: const Text('PRESS HERE TO FINISH TASK'),
                            onPressed: () {
                              userTask.onStart();
                              if (userTask.hasWidget) {
                                Navigator.push(
                                  context,
                                  MaterialPageRoute<Widget>(
                                      builder: (context) => userTask.widget!),
                                );
                              }
                            }),
                      ],
                    )
                  : const Text(""),
            ],
          ),
        ),
      ),
    );
  }

When a user clicks on the "PRESS HERE TO FINISH TASK" button, the onPressed method is called. Here, the user task is started via a callback (userTask.onStart()), and the widget for the user task (userTask.widget) is pushed onto the UI navigation stack.

A UserTask has five callback methods that can be called by the app:

  • onStart - will start the task and start collecting the measures defined in it.
  • onCancel - will cancel the task.
  • onDone - will mark the task as done and typically stop collecting the measures defined in it.
  • onExpired - called when this user task expires, which will remove it from the task queue.
  • onNotification - is called by the OS when a user clicks the notification linked to this user task.
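
As a minimal sketch of using these callbacks beyond the onStart example above, a task card could also offer cancel and done buttons. This is not from the PulmonaryMonitor app, and it assumes that onCancel and onDone take no arguments, like onStart in the example above:

  // A sketch (not from the PulmonaryMonitor app) of wiring the
  // onCancel and onDone callbacks to buttons; assumes they take no
  // arguments, like onStart in the example above.
  Widget _buildTaskButtons(UserTask userTask) => Row(
        mainAxisSize: MainAxisSize.min,
        children: <Widget>[
          TextButton(
            child: const Text('CANCEL'),
            onPressed: () => userTask.onCancel(),
          ),
          TextButton(
            child: const Text('DONE'),
            onPressed: () => userTask.onDone(),
          ),
        ],
      );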

Notifications

An app task can be configured to send a notification. In this case, a local notification containing the title and description of the task is sent to the phone's notification system, as shown below.

[Screenshot of a task notification]

This notification setup uses the flutter_local_notifications Flutter package and needs some configuration in your app to work. Please check the notes on Best Practice on Notifications.

Types of App Tasks

Currently, CAMS supports four types of AppTasks:

  • BackgroundSensingUserTask.SENSING_TYPE - a sensing app task that can be started and stopped by calling onStart and onDone, respectively.
  • BackgroundSensingUserTask.ONE_TIME_SENSING_TYPE - a sensing app task that is started exactly once by calling onStart.
  • SurveyUserTask.SURVEY_TYPE - an app task that starts a survey from the carp_survey_package.
  • SurveyUserTask.COGNITIVE_ASSESSMENT_TYPE - an app task that handles a cognitive assessment from the carp_survey_package.
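
As a sketch, a survey app task could be added to the protocol like this. It assumes the RPAppTask class and its rpTask property from the carp_survey_package, and a survey object (who5Survey) defined elsewhere in the app; these names are illustrative and may differ between versions:

    // A sketch of a survey app task. RPAppTask and rpTask are assumed
    // to come from the carp_survey_package; the survey object
    // (who5Survey) is an illustrative placeholder.
    protocol.addTaskControl(
        PeriodicTrigger(period: const Duration(days: 1)),
        RPAppTask(
            type: SurveyUserTask.SURVEY_TYPE,
            title: "Daily Well-Being",
            description: "A short daily well-being survey",
            notification: true,
            rpTask: who5Survey),
        phone);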

Extending the App Task Model

As CAMS is designed to be extensible, it is possible to extend the app task model and make custom app tasks. This is done by creating a new type of UserTask and making it available to CAMS. This model is similar to how new measures and corresponding probes are added to CAMS, as explained in Chapter 4.

The audio task used in the PulmonaryMonitor app is an example of a custom AppTask made specifically for this app. This custom audio user task is defined in the audio_user_task.dart file:

/// A user task handling audio recordings.
///
/// The [widget] returns an [AudioMeasurePage] that can be shown on the UI.
///
/// When the recording is started (calling the [onRecord] method),
/// the background task collecting sensor measures is started.
class AudioUserTask extends UserTask {
  static const String AUDIO_TYPE = 'audio';

  final StreamController<int> _countDownController =
      StreamController.broadcast();
  Stream<int> get countDownEvents => _countDownController.stream;
  Timer? _timer;

  /// Duration of audio recording in seconds.
  int recordingDuration = 10;

  AudioUserTask(super.executor, [this.recordingDuration = 10]);

  @override
  bool get hasWidget => true;

  @override
  Widget? get widget => AudioMeasurePage(audioUserTask: this);

  /// Callback when recording is to start.
  /// When recording is started, background sensing is also started.
  void onRecord() {
    backgroundTaskExecutor.start();

    // start the countdown, once tick pr. second.
    _timer = Timer.periodic(const Duration(seconds: 1), (_) {
      _countDownController.add(--recordingDuration);

      if (recordingDuration <= 0) {
        _timer?.cancel();
        _countDownController.close();

        // stop the background sensing and mark this task as done
        backgroundTaskExecutor.stop();
        super.onDone();
      }
    });
  }
}

To create and enqueue the right type of user task, CAMS uses a UserTaskFactory, which has to be defined for each type of user task, including this AudioUserTask:

/// A factory that can [create] a [UserTask] based on the type of app task.
/// In this case an [AudioUserTask].
class PulmonaryUserTaskFactory implements UserTaskFactory {
  @override
  List<String> types = [
    AudioUserTask.AUDIO_TYPE,
  ];

  @override
  UserTask create(AppTaskExecutor executor) {
    switch (executor.task.type) {
      case AudioUserTask.AUDIO_TYPE:
        return AudioUserTask(executor);
      default:
        return BackgroundSensingUserTask(executor);
    }
  }
}

This user task factory has to be registered with the AppTaskController, like this:

 AppTaskController().registerUserTaskFactory(PulmonaryUserTaskFactory());

This typically occurs when the app's sensing part is initialized. In the Pulmonary Monitor app, this happens when the Sensing class is created in the sensing.dart file.