
Caching redux: move as much attribute/association caching code into the serializer as possible, minimize caching code in the adapter #1638

Merged
bf4 merged 3 commits into rails-api:master from bf4:caching_redux on Apr 15, 2016

Conversation

bf4 (Member) commented on Mar 31, 2016

Purpose

Caching is broken and creates too many objects (#1586)

This PR begins consolidating the caching code, both so we can understand it better and so we can improve it.

Changes

Removed CachedSerializer and FragmentSerializer, along with most of the cache logic in the Attributes adapter, and moved that logic into the Serializer::Caching mixin.
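For orientation, here is a rough sketch of the shape this consolidation moves toward: the adapter no longer branches on CachedSerializer/FragmentSerializer itself, but asks the serializer-side Caching mixin to wrap serialization. The method names (cache_check, cache_enabled?, cached_fields) come from the diffs quoted later in this thread; the body is simplified, not the shipped implementation.

# Simplified sketch only; see the caching.rb diff later in this thread for the
# real method, which also handles the fragment-cache branch and _cache_options.
module ActiveModel
  class Serializer
    module Caching
      # The serializer decides whether and how to cache, so the adapter no
      # longer needs CachedSerializer/FragmentSerializer wrapper objects.
      def cache_check(adapter_instance)
        if self.class.cache_enabled?
          self.class.cache_store.fetch(cache_key(adapter_instance)) { yield }
        else
          yield
        end
      end
    end
  end
end

# Hypothetical call site in the Attributes adapter, shown only to illustrate
# the division of responsibility:
#   serializer.cache_check(adapter_instance) { serializer.cached_fields(fields, adapter_instance) }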

Caveats

It does reduce the number of allocated objects, but it doesn't fix the issue yet.

Related GitHub issues

Additional helpful information

Apparently, the main culprit is the fragment cache.

Changing

diff --git a/test/benchmark/fixtures.rb b/test/benchmark/fixtures.rb
index bdba919..4a6c87b 100644
--- a/test/benchmark/fixtures.rb
+++ b/test/benchmark/fixtures.rb
@@ -34,7 +34,7 @@ end
 Rails.configuration.serializers << PostSerializer

 class CachingAuthorSerializer < AuthorSerializer
-  cache key: 'writer', only: [:name], skip_digest: true
+  cache key: 'writer', skip_digest: true
 end
 Rails.configuration.serializers << CachingAuthorSerializer

leads to a speed-up of the benchmark, so that caching is slightly faster (for this in-memory, non-ActiveRecord benchmark). Running BENCH_STRESS=true bin/bench -r 3:

With fragment-cached serializer (only: [:name] present):
Benchmark results:
{
  "commit_hash": "2b66951",
  "version": "0.10.0.rc4",
  "rails_version": "4.0.13",
  "benchmark_run[environment]": "2.2.2p95",
  "runs": [
    {
      "benchmark_type[category]": "caching on: caching serializers: gc off",
      "benchmark_run[result][iterations_per_second]": 156.594,
      "benchmark_run[result][total_allocated_objects_per_iteration]": 10093
    },
    {
      "benchmark_type[category]": "caching off: caching serializers: gc off",
      "benchmark_run[result][iterations_per_second]": 147.682,
      "benchmark_run[result][total_allocated_objects_per_iteration]": 10093
    },
    {
      "benchmark_type[category]": "caching on: non-caching serializers: gc off",
      "benchmark_run[result][iterations_per_second]": 227.205,
      "benchmark_run[result][total_allocated_objects_per_iteration]": 6695
    },
    {
      "benchmark_type[category]": "caching off: non-caching serializers: gc off",
      "benchmark_run[result][iterations_per_second]": 196.208,
      "benchmark_run[result][total_allocated_objects_per_iteration]": 6695
    }
  ]
}
Without the fragment cache (only: removed):
Benchmark results:
{
  "commit_hash": "2b66951",
  "version": "0.10.0.rc4",
  "rails_version": "4.0.13",
  "benchmark_run[environment]": "2.2.2p95",
  "runs": [
    {
      "benchmark_type[category]": "caching on: caching serializers: gc off",
      "benchmark_run[result][iterations_per_second]": 166.582,
      "benchmark_run[result][total_allocated_objects_per_iteration]": 9850
    },
    {
      "benchmark_type[category]": "caching off: caching serializers: gc off",
      "benchmark_run[result][iterations_per_second]": 151.88,
      "benchmark_run[result][total_allocated_objects_per_iteration]": 9850
    },
    {
      "benchmark_type[category]": "caching on: non-caching serializers: gc off",
      "benchmark_run[result][iterations_per_second]": 228.776,
      "benchmark_run[result][total_allocated_objects_per_iteration]": 6695
    },
    {
      "benchmark_type[category]": "caching off: non-caching serializers: gc off",
      "benchmark_run[result][iterations_per_second]": 209.399,
      "benchmark_run[result][total_allocated_objects_per_iteration]": 6695
    }
  ]
}
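Why removing only: [:name] changes which path the benchmark exercises: per the cache_enabled? and fragment_cache_enabled? predicates quoted in the caching.rb diff further down, only:/except: route a serializer through the fragment cache rather than whole-object caching. A hypothetical helper to make the routing visible (the predicate names are real class methods; the helper itself is illustrative):

# Illustrative helper only; the predicates come from ActiveModel::Serializer::Caching.
def caching_path_for(serializer_class)
  if serializer_class.fragment_cache_enabled?
    :fragment_cache      # declared with `cache ..., only:` or `except:`
  elsif serializer_class.cache_enabled?
    :whole_object_cache  # plain `cache key: ...` with no only/except
  else
    :no_cache
  end
end

# caching_path_for(CachingAuthorSerializer)
# => :fragment_cache with `only: [:name]`, :whole_object_cache once it is removed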

bf4 (Member, Author) commented on Apr 1, 2016

Coverage dropped below the threshold on Windows. Ha.

# @return [Array] all cache_key of collection_serializer
def object_cache_keys(collection_serializer, adapter_instance, include_tree)
  cache_keys = []

bf4 (Member, Author) commented on this hunk:

look at e.g. #1276 in 0.8 for how we expand cache keys
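For context, here is a minimal sketch of the kind of key expansion being discussed, assuming object_cache_keys walks the collection and its included associations and that cache_read_multi then hands the keys to the store in one call; the traversal details are glossed over.

# Minimal sketch, not the implementation under review; assumes every member of
# the collection (and each included association) contributes one key.
def object_cache_keys(collection_serializer, adapter_instance, include_tree)
  cache_keys = []
  collection_serializer.each do |serializer|
    cache_keys << serializer.cache_key(adapter_instance)
    serializer.associations(include_tree).each do |association|
      # NOTE: a has_many association yields a collection serializer here, whose
      # members would each need their own key; glossed over in this sketch.
      cache_keys << association.serializer.cache_key(adapter_instance)
    end
  end
  cache_keys.compact.uniq
end

# cache_read_multi can then hit the store in a single round trip, e.g.:
#   cache_store.read_multi(*object_cache_keys(collection_serializer, adapter_instance, include_tree))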

bf4 (Member, Author) commented on Apr 14, 2016

I'm doing some exploratory work in https://github.com/bf4/active_model_serializers/tree/wip/caching_redux right now. Interestingly, comparing AMS caching to a 'virtual' (manual) caching approach shows there is room for improvement:

no caching:       598.9890084759535/ips; 2348 objects
caching:          251.90341731442047/ips; 5879 objects
virtual (manual): 89169.87612473704/ips; 16 objects
ActiveModelSerializers::SerializableResource.new(
  model, adapter: :json, serializer: PostSerializer
).as_json

ActiveModelSerializers::SerializableResource.new(
  model, adapter: :json, serializer: CachingPostSerializer
).as_json
This should be using read_multi/fetch_multi. It needs more parity, especially for fragment caching. In any case, the manual version looks like:
parts = []
parts << 'ams_blog'
parts << 'attributes'
cache_key = parts.join('/')
cache_store.fetch(cache_key) do
  include_tree = ActiveModel::Serializer::IncludeTree.from_include_args('*')
  post_serializer = CachingPostSerializer.new(model)
  json = { post: post_serializer.attributes }
  post_serializer.associations(include_tree).each do |association|
    # FIXME: yields each association twice
    json[:post][association.key] ||=
      case association.key
      when :comments
        cache_store.fetch(['ams_comments', 'attributes'].join('/')) do
          association.serializer.map(&:attributes)
        end
      when :blog
        association.serializer.attributes
      when :author
        cache_store.fetch(['ams_author', 'attributes'].join('/')) do
          association.serializer.attributes
        end
      else
        raise ArgumentError, "unexpected association #{association}"
      end
  end
  json
end
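A hedged sketch of the read_multi/fetch_multi direction mentioned above, assuming a Rails 4.1+ cache store (ActiveSupport::Cache::Store#fetch_multi). The keys mirror the manual version, and the serializer variables (post_serializer, comments_serializer, author_serializer) are assumed to be in scope:

# Illustrative only: compute each entry on a cache miss, in one store round trip.
entries = {
  'ams_blog/attributes'     => -> { post_serializer.attributes },
  'ams_comments/attributes' => -> { comments_serializer.map(&:attributes) },
  'ams_author/attributes'   => -> { author_serializer.attributes }
}

cached = cache_store.fetch_multi(*entries.keys) do |missed_key|
  entries.fetch(missed_key).call
end
# => { 'ams_blog/attributes' => {...}, 'ams_comments/attributes' => [...], ... }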

bf4 (Member, Author) commented on Apr 14, 2016

I also did some very simple (fuzzy, not precise) performance diffing of the caching behavior. Read from the bottom up; the diff at the bottom shows the starting point.
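For orientation, the comparisons below follow roughly this shape: a sketch using benchmark-ips, which the benchmarking_support harness in this repo wraps. The real bin/bench runs also toggle perform_caching and count allocated objects per iteration; model, PostSerializer, and CachingPostSerializer are the benchmark fixtures.

require 'benchmark/ips'

# Rough shape of the measurement only; the real harness lives in test/benchmark.
Benchmark.ips do |x|
  x.report('caching serializers') do
    ActiveModelSerializers::SerializableResource.new(
      model, adapter: :json, serializer: CachingPostSerializer
    ).as_json
  end
  x.report('non-caching serializers') do
    ActiveModelSerializers::SerializableResource.new(
      model, adapter: :json, serializer: PostSerializer
    ).as_json
  end
  x.compare!
end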


back to normal, -340
caching on: caching serializers: gc off 667.6587301587302/ips; 1802 objects
caching on: non-caching serializers: gc off 1016.6666666666663/ips; 1256 objects
caching off: caching serializers: gc off 618.0555555555555/ips; 1802 objects
caching off: non-caching serializers: gc off 929.7619047619046/ips; 1256 objects

include fragment cache in 'cache' strategy, -260
caching on: caching serializers: gc off 790.4761904761903/ips; 1546 objects
caching on: non-caching serializers: gc off 1050.0000000000005/ips; 1256 objects
caching off: caching serializers: gc off 761.9047619047617/ips; 1546 objects
caching off: non-caching serializers: gc off 1025.0000000000002/ips; 1256 objects

enabled cache_store.fetch, -190
caching on: caching serializers: gc off 814.2857142857141/ips; 1513 objects
caching on: non-caching serializers: gc off 1025.0/ips; 1256 objects
caching off: caching serializers: gc off 830.9523809523802/ips; 1513 objects
caching off: non-caching serializers: gc off 1024.9999999999995/ips; 1256 objects

resource_object_for with cached_attributes.fetch and cached_fields, -160
caching on: caching serializers: gc off 864.2857142857141/ips; 1500 objects
caching on: non-caching serializers: gc off 1024.9999999999995/ips; 1256 objects
caching off: caching serializers: gc off 835.7142857142851/ips; 1500 objects
caching off: non-caching serializers: gc off 1024.9999999999995/ips; 1256 objects

enabled cache_read_multi (uses object_cache_keys), -160
caching on: caching serializers: gc off 885.7142857142859/ips; 1470 objects
caching on: non-caching serializers: gc off 1041.6666666666667/ips; 1256 objects
caching off: caching serializers: gc off 854.7619047619042/ips; 1470 objects
caching off: non-caching serializers: gc off 1008.3333333333334/ips; 1256 objects

fragment cache enabled -120
caching on: caching serializers: gc off 921.4285714285713/ips; 1389 objects
caching on: non-caching serializers: gc off 1041.666666666667/ips; 1235 objects
caching off: caching serializers: gc off 933.3333333333334/ips; 1389 objects
caching off: non-caching serializers: gc off 1024.9999999999998/ips; 1235 objects


cache enabled -120
caching on: caching serializers: gc off 950.0/ips; 1389 objects
caching on: non-caching serializers: gc off 1041.6666666666667/ips; 1235 objects
caching off: caching serializers: gc off 921.428571428571/ips; 1389 objects
caching off: non-caching serializers: gc off 1033.3333333333326/ips; 1235 objects

cache attributes -120
caching on: caching serializers: gc off 958.3333333333333/ips; 1389 objects
caching on: non-caching serializers: gc off 1083.3333333333333/ips; 1235 objects
caching off: caching serializers: gc off 941.6666666666666/ips; 1389 objects
caching off: non-caching serializers: gc off 1049.9999999999998/ips; 1235 objects

commented out
caching on: caching serializers: gc off 963.0952380952382/ips; 1387 objects
caching on: non-caching serializers: gc off 1058.3333333333337/ips; 1234 objects
caching off: caching serializers: gc off 938.0952380952383/ips; 1387 objects
caching off: non-caching serializers: gc off 1000.0000000000003/ips; 1234 objects



commented out
diff --git a/lib/active_model/serializer/caching.rb b/lib/active_model/serializer/caching.rb
index 60c7ef7..a149f9e 100644
--- a/lib/active_model/serializer/caching.rb
+++ b/lib/active_model/serializer/caching.rb
@@ -147,10 +147,12 @@ module ActiveModel
         end

         def cache_enabled?
+          return false
           perform_caching? && cache_store && !_cache_only && !_cache_except
         end

         def fragment_cache_enabled?
+          return false
           perform_caching? && cache_store &&
             (_cache_only && !_cache_except || !_cache_only && _cache_except)
         end
@@ -158,6 +160,7 @@ module ActiveModel
         # Read cache from cache_store
         # @return [Hash]
         def cache_read_multi(collection_serializer, adapter_instance, include_tree)
+          return {}
           return {} if ActiveModelSerializers.config.cache_store.blank?

           keys = object_cache_keys(collection_serializer, adapter_instance, include_tree)
@@ -210,15 +213,16 @@ module ActiveModel
       end

       def cache_check(adapter_instance)
-        if self.class.cache_enabled?
-          self.class.cache_store.fetch(cache_key(adapter_instance), self.class._cache_options) do
-            yield
-          end
-        elsif self.class.fragment_cache_enabled?
-          fetch_fragment_cache(adapter_instance)
-        else
-          yield
-        end
+        yield
+        # if self.class.cache_enabled?
+        #   self.class.cache_store.fetch(cache_key(adapter_instance), self.class._cache_options) do
+        #     yield
+        #   end
+        # elsif self.class.fragment_cache_enabled?
+        #   fetch_fragment_cache(adapter_instance)
+        # else
+        #   yield
+        # end
       end

       # 1. Create a CachedSerializer and NonCachedSerializer from the serializer class
diff --git a/lib/active_model_serializers/adapter/attributes.rb b/lib/active_model_serializers/adapter/attributes.rb
index 50e958f..8a69290 100644
--- a/lib/active_model_serializers/adapter/attributes.rb
+++ b/lib/active_model_serializers/adapter/attributes.rb
@@ -55,19 +55,19 @@ module ActiveModelSerializers

       # Set @cached_attributes
       def cache_attributes
-        return if @cached_attributes.present?
-
-        @cached_attributes = ActiveModel::Serializer.cache_read_multi(serializer, self, @include_tree)
+        # return if @cached_attributes.present?
+        #
+        # @cached_attributes = ActiveModel::Serializer.cache_read_multi(serializer, self, @include_tree)
       end

       def resource_object_for(options)
-        if serializer.class.cache_enabled?
-          @cached_attributes.fetch(serializer.cache_key(self)) do
-            serializer.cached_fields(options[:fields], self)
-          end
-        else
+        # if serializer.class.cache_enabled?
+        #   @cached_attributes.fetch(serializer.cache_key(self)) do
+        #     serializer.cached_fields(options[:fields], self)
+        #   end
+        # else
           serializer.cached_fields(options[:fields], self)
-        end
+        # end
       end
     end
   end
diff --git a/test/benchmark/benchmarking_support.rb b/test/benchmark/benchmarking_support.rb
index 79fb663..4422ca0 100644
--- a/test/benchmark/benchmarking_support.rb
+++ b/test/benchmark/benchmarking_support.rb
@@ -14,7 +14,7 @@ module Benchmark
       end
     end
     module FakeIps
-      ITERATIONS = 500
+      ITERATIONS = 50

       # rubocop:disable Metrics/AbcSize
       def ips(*args)
@@ -142,8 +142,8 @@ module Benchmark
   end


-  require 'benchmark/ips'
-  # extend Benchmark::ActiveModelSerializers::FakeIps
+  # require 'benchmark/ips'
+  extend Benchmark::ActiveModelSerializers::FakeIps
   extend Benchmark::ActiveModelSerializers
 end
 # puts "memory usage after large string creation %d MB" %

bf4 changed the title from "[WIP] Caching redux" to "Caching redux: move as much attribute/association caching code into the serializer as possible, minimize caching code in the adapter" on Apr 15, 2016
bf4 removed the "C: Feature" label on Apr 15, 2016
bf4 merged commit 63e9337 into rails-api:master on Apr 15, 2016
bf4 deleted the caching_redux branch on April 15, 2016 at 19:08
bf4 added a commit to bf4/active_model_serializers that referenced this pull request Apr 18, 2016
Caching redux: move as much attribute/association caching code into the serializer as possible, minimize caching code in the adapter
bf4 added a commit to bf4/active_model_serializers that referenced this pull request Apr 18, 2016
bundle exec bin/bench_regression a5eaf6cd7a7fed42d9e64777753a1762e187eadc 1033b71 --pattern bm_caching

["perf/only_calc_associations_once", "a5eaf6cd7a7fed42d9e64777753a1762e187eadc", "1033b711c7d7c231bb5b832e7dfe7f99389f22c4", "a5eaf6c"]
  "version": "0.10.0.rc5",
  "rails_version": "4.2.6",
  "benchmark_run[environment]": "2.2.2p95",

Note: checking out 'a5eaf6cd7a7fed42d9e64777753a1762e187eadc'.

HEAD is now at a5eaf6c... Nicer debug; compare caching by serializer, grouped by caching on/off

caching on: caching serializers: gc off 783.6956866669746/ips; 1355 objects
caching on: non-caching serializers: gc off 798.8629770532652/ips; 1257 objects
caching off: caching serializers: gc off 682.3661326140281/ips; 1355 objects
caching off: non-caching serializers: gc off 721.2175067555897/ips; 1257 objects

HEAD is now at 1033b71... Merge pull request rails-api#1638 from bf4/caching_redux

caching on: caching serializers: gc off 570.6905948477781/ips; 1803 objects
caching on: non-caching serializers: gc off 822.8418206976623/ips; 1257 objects
caching off: caching serializers: gc off 523.4174806572001/ips; 1803 objects
caching off: non-caching serializers: gc off 747.6026493097758/ips; 1257 objects