
Refactoring

  • Focus on a stronger semantic model for Spark configuration, baked into Fubu by conventions that are aware of that semantic model.

  • Find the proper place in Fubu to allow Spark to be opted in as a package.

  • Push knowledge from the DSL (the extension methods on FubuRegistry, SparkSettings, etc.) into a real model. We rely too much on extension methods to add basic functionality, e.g. view folders (a sketch of such a model follows this list).
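
A minimal sketch of what such a model might look like. SparkConfiguration and its members are hypothetical names for illustration, not existing FubuMVC.Spark types:

```csharp
using System.Collections.Generic;

// Hypothetical semantic model for Spark configuration. Instead of extension
// methods mutating FubuRegistry directly, the DSL would populate an instance
// of this model, and conventions would consume it later.
public class SparkConfiguration
{
    private readonly IList<string> _viewFolders = new List<string>();
    private readonly IList<string> _sharedFolders = new List<string>();

    public IEnumerable<string> ViewFolders { get { return _viewFolders; } }
    public IEnumerable<string> SharedFolders { get { return _sharedFolders; } }

    public void AddViewFolder(string path) { _viewFolders.Add(path); }
    public void AddSharedFolder(string path) { _sharedFolders.Add(path); }
}
```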

Details

  • The model must be consumed through an activation mechanism that translates it down to Fubu, and the DSL needs to let the user define package-specific information.

  • Strip all Spark types out of the DSL, build up a true configuration model, and register it in the services. The activation mechanism would then spin up, pull all the registered Spark configuration instances, and configure the Fubu side.

  • Package registries (even the primary registry) could build up the Spark semantic model, which we then consume and bake into the Fubu model. This implies keeping the Spark settings pieces isolated and "merging" the Spark configuration expressions by having the activation mechanism bake each one into Fubu at the proper point. The SparkFactory is our end result.

  • Spark defaults (provided by Spark as a package) should ship "default conventions" that get baked into Fubu via the same activation mechanism.

Note: Timing is the tricky part of all of this. We need to get the model into the FubuRegistry before the graph gets built. IActivator does not seem to be the proper solution, as activators run after the graph is built; a sketch of a graph-time alternative follows.
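
One possible direction, sketched under the assumption that FubuMVC's IConfigurationAction hook runs while the BehaviorGraph is being built (and thus before IActivator fires). BakeSparkModel and the SparkConfiguration model it consumes are hypothetical names from the sketch above:

```csharp
using System.Collections.Generic;
using FubuMVC.Core.Registration;

// Hypothetical convention that consumes the Spark semantic model at
// graph-construction time, i.e. before IActivator runs. The configuration
// instances would be the ones registered into the services by the DSL.
public class BakeSparkModel : IConfigurationAction
{
    private readonly IEnumerable<SparkConfiguration> _configurations;

    public BakeSparkModel(IEnumerable<SparkConfiguration> configurations)
    {
        _configurations = configurations;
    }

    public void Configure(BehaviorGraph graph)
    {
        // "Merge" each registered Spark configuration into the Fubu model.
        foreach (var configuration in _configurations)
        {
            // Attach view outputs / settings derived from this piece of
            // the model to the graph at the proper point.
        }
    }
}
```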

View Discovery in Packages

  • Start by looking at folders under the package that contains the action call (see the sketch after this list).

  • Shared folders apply to all descriptors in all packages (convention).

  • Shared folders scoped to individual packages. This implies a policy/DSL that scopes a package.

  • No hard-coded package folders; they should be pulled from the packages themselves.

  • Packaging debug mode does not have enough information to find views.
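
A rough sketch of the discovery order described above. The type and member names (SparkViewLocator, FindCandidateFolders) and the "Shared" folder layout are illustrative assumptions:

```csharp
using System.Collections.Generic;
using System.IO;

// Hypothetical view locator illustrating the discovery order above.
public class SparkViewLocator
{
    public IEnumerable<string> FindCandidateFolders(
        string actionPackageRoot, IEnumerable<string> allPackageRoots)
    {
        // 1. Folders under the package containing the action call.
        yield return actionPackageRoot;

        // 2. Shared folder scoped to that individual package.
        yield return Path.Combine(actionPackageRoot, "Shared");

        // 3. Shared folders contributed by all packages (convention);
        //    the roots come from the packaging infrastructure rather
        //    than being hard-coded.
        foreach (var root in allPackageRoots)
        {
            yield return Path.Combine(root, "Shared");
        }
    }
}
```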

Diagnostics

  • Push diagnostics into ISparkPolicy.

View Rendering

  • Implement caching of Spark compilation output (ISparkViewEntry), as sketched below.
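
A minimal caching sketch, assuming compiled views are represented by Spark's ISparkViewEntry and keyed by view path. The cache type and its members are hypothetical, and entry invalidation is left out:

```csharp
using System;
using System.Collections.Concurrent;
using Spark;

// Hypothetical cache for compiled Spark views. The factory delegate is
// expected to perform the (expensive) compilation on a cache miss.
public class SparkViewEntryCache
{
    private readonly ConcurrentDictionary<string, ISparkViewEntry> _entries =
        new ConcurrentDictionary<string, ISparkViewEntry>();

    public ISparkViewEntry Get(string viewPath, Func<ISparkViewEntry> compile)
    {
        // GetOrAdd compiles only on a cache miss; note the factory may run
        // more than once under contention, which is acceptable here.
        return _entries.GetOrAdd(viewPath, key => compile());
    }
}
```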

Ideas

  • Multiple views for the same output model, useful for mobile clients and the like. This implies a strategy that conditionally selects a view depending on context (sketched below).
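
A rough sketch of such a selection strategy. Everything here is hypothetical: the IViewSelectionStrategy interface, the mobile flag, and the "Mobile" suffix convention are illustrative assumptions only:

```csharp
using System;

// Hypothetical strategy that picks a view for an output model based on
// request context, e.g. a "Mobile" variant for mobile user agents.
public interface IViewSelectionStrategy
{
    string SelectView(Type outputModel, bool isMobileClient);
}

public class SuffixViewSelectionStrategy : IViewSelectionStrategy
{
    public string SelectView(Type outputModel, bool isMobileClient)
    {
        // e.g. OrderSummary -> "OrderSummaryMobile" for mobile clients,
        //      "OrderSummary" otherwise.
        var baseName = outputModel.Name;
        return isMobileClient ? baseName + "Mobile" : baseName;
    }
}
```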