Add caching to the token introspection policy #656

Merged: 8 commits from add-cache-token-introspection into master on Jun 4, 2018

Conversation

davidor (Contributor) commented Mar 15, 2018

This PR adds caching to the token introspection policy.

Caching is configurable: it can be disabled, capped with a max TTL, or set to always respect the TTL returned by the token introspection endpoint.
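For illustration, those configurations could look like this (expressed as the Lua config table the policy receives; the property names come from the schema in this PR, the values are just examples):

-- caching disabled (also the default behaviour when neither property is set)
local no_caching = { max_cached_tokens = 0 }

-- cache tokens and respect the TTL returned by the introspection endpoint
local endpoint_ttl = { max_cached_tokens = 10000 }

-- cache tokens, but never for longer than 60 seconds
local capped_ttl = { max_cached_tokens = 10000, max_ttl_tokens = 60 }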

@davidor davidor requested a review from mikz March 15, 2018 14:08
@davidor davidor force-pushed the add-cache-token-introspection branch from 932efe7 to 1dcbced on March 15, 2018 14:11
-- If there's no 'max_ttl_tokens', the ttl will be the one returned by the
-- token introspection endpoint. To disable caching, max_ttl_tokens needs to
-- be set to 0.
self.caching_enabled = not config.max_ttl_tokens or config.max_ttl_tokens > 0
mikz (Contributor) commented:

What about:

local max_cached_tokens = tonumber(config.max_cached_tokens) or 0

self.caching_enabled = max_cached_tokens > 0

davidor (Contributor, Author) replied:

The advantage of your solution is that when neither of the two new config params is specified, the policy behaves as it does now, that is, it does not cache anything.
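For illustration, with an empty config (both expressions are quoted from this thread):

local config = {} -- neither max_ttl_tokens nor max_cached_tokens specified

-- current code in the PR: caching ends up enabled even with an empty config
local caching_enabled_current = not config.max_ttl_tokens or config.max_ttl_tokens > 0
print(caching_enabled_current)    -- true

-- suggested code: caching is enabled only when max_cached_tokens is set and > 0
local max_cached_tokens = tonumber(config.max_cached_tokens) or 0
print(max_cached_tokens > 0)      -- false, i.e. the policy behaves as it does today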

davidor (Contributor, Author) replied:

Changed 👍

return self
end

--- OAuth 2.0 Token Introspection defined in RFC7662.
-- https://tools.ietf.org/html/rfc7662
local function introspect_token(self, token)
if self.caching_enabled then
mikz (Contributor) commented:

What about using a noop cache instead? I wonder how this affects LuaJIT.

If this file were cached, there would be just one instance of this function shared by all policies. If some of them had a cache and others did not, LuaJIT could not compile this.

But because each policy instance requires the file again, every policy has its own JITable function.

So having a "no cache" implementation would be just for readability, not for performance: polymorphism over conditionals. LuaJIT will then inline those functions to nothing if they are empty.
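A minimal sketch of the "polymorphism over conditionals" idea (the names are illustrative, not the PR's actual implementation):

-- A no-op cache exposes the same interface as the real cache, so callers
-- never branch on whether caching is enabled.
local noop_cache = {
  get = function() return nil end,
  set = function() end,
}

-- 'store' is assumed to be something like a resty.lrucache instance.
local function new_cache(store)
  return {
    get = function(_, key) return store:get(key) end,
    set = function(_, key, value, ttl) return store:set(key, value, ttl) end,
  }
end

-- The constructor picks one of the two implementations once, so
-- introspect_token() can always call self.cache:get(token) and
-- self.cache:set(token, response, ttl) without an 'if self.caching_enabled'
-- conditional; LuaJIT can inline the empty no-op functions to nothing.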

davidor (Contributor, Author) replied:

OK. I didn't think about LuaJIT while doing this.

davidor (Contributor, Author) replied:

Changed this to get rid of the conditionals, although in this particular case I don't see a big difference.

-- If the config does not contain a max ttl and the token introspection did
-- not return one, we cannot cache the token.
if ttl and ttl > 0 then
-- It's important that we do not allow setting keys with ttl = 0, because
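A hedged sketch of the TTL logic this hunk is about (the function and field names are assumed for illustration, not the PR's exact code): the cache TTL comes from the introspection response, is optionally capped by max_ttl_tokens, and nothing is cached when the result is missing or not positive.

-- Sketch only: derive a cache TTL from the introspection response,
-- capped by the configured max. ngx.time() is OpenResty's current time.
local function cache_ttl(config, token_info)
  local max_ttl = tonumber(config.max_ttl_tokens)
  -- 'exp' is the expiration claim visible in the Keycloak response later
  -- in this thread; using it here is an assumption.
  local endpoint_ttl = token_info.exp and (token_info.exp - ngx.time())

  if max_ttl and endpoint_ttl then
    return math.min(max_ttl, endpoint_ttl)
  end

  return max_ttl or endpoint_ttl -- nil means "do not cache"
end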
mikz (Contributor) commented Mar 15, 2018.

davidor (Contributor, Author) replied:

You're right. I was confused, but this only affects the comment.

davidor (Contributor, Author) replied:

Fixed.

mikz (Contributor) left a comment:

👍 Nothing I've commented on is really critical.

We should hook it up to Keycloak and see it in action before merging.

@davidor davidor force-pushed the add-cache-token-introspection branch 2 times, most recently from 56ca046 to ed6d30a on March 15, 2018 19:47
},
"max_ttl_tokens": {
"description": "Max TTL for cached tokens",
"type": "integer"
mikz (Contributor) commented:

This probably should be > 0 and also have some upper limit.

We should think about how to enforce the same limits for everyone on SaaS.

davidor (Contributor, Author) replied:

I added a max and a min both for the token TTL and for the max number of tokens to cache. I've chosen some numbers that sounded reasonable to me; let me know if you agree. It's in the latest commit.

@davidor davidor force-pushed the add-cache-token-introspection branch from ed6d30a to 957aee9 on May 29, 2018 12:31
davidor (Contributor, Author) commented May 29, 2018:

Rebased on master to resolve conflicts and see test coverage.

mikz (Contributor) commented Jun 4, 2018:

So I checked how Keycloak 4.0 works.

Token Introspection endpoint

The Token Introspection endpoint requires authentication, and it can't be the token itself. It can, for example, use Basic auth with the client_id+secret of a client that has the correct permissions.

POST /auth/realms/APIcast/protocol/openid-connect/token/introspect HTTP/1.1
Authorization: Basic aW5zb21uaWE6ZGRjODQ1OGEtODFjYi00YmNkLTg3ZWYtZDYzNjMzYmJhZWI1
Content-Type: application/x-www-form-urlencoded; charset=utf-8
Host: localhost:8000
Connection: close
User-Agent: Paw/3.1.7 (Macintosh; OS X/10.13.4) GCDHTTPRequest
Content-Length: 1241

token=eyJhbGciOiJSUzI1NiIsInR5cCIgOiAiSldUIiwia2lkIiA6ICJ6S1FEQWpkZGNWXzc3YlN5eE5oMmxfdXZlQVdhOGNmR3NnZlFUTkxPQXdnIn0.eyJqdGkiOiJmYTlkNzU5Mi04MzM4LTQ5MDAtYjI4ZC03ZTExMjRmMzU2N2YiLCJleHAiOjE1MjgxMjIyNTksIm5iZiI6MCwiaWF0IjoxNTI4MTIxOTU5LCJpc3MiOiJodHRwOi8vbG9jYWxob3N0OjgwMDAvYXV0aC9yZWFsbXMvQVBJY2FzdCIsImF1ZCI6Imd1eSIsInN1YiI6IjYyZGFmM2FhLTA4M2EtNDJjMy05ZGYwLTI4YzRkNjgzOGRjNCIsInR5cCI6IkJlYXJlciIsImF6cCI6Imd1eSIsImF1dGhfdGltZSI6MTUyODEyMTMwNSwic2Vzc2lvbl9zdGF0ZSI6ImM4M2VlNWM1LWQ1ZWYtNDc1MC1iMjBlLTQ4ZmE1NTA0N2ZlNyIsImFjciI6IjAiLCJhbGxvd2VkLW9yaWdpbnMiOltdLCJyZWFsbV9hY2Nlc3MiOnsicm9sZXMiOlsidW1hX2F1dGhvcml6YXRpb24iXX0sInJlc291cmNlX2FjY2VzcyI6eyJhY2NvdW50Ijp7InJvbGVzIjpbIm1hbmFnZS1hY2NvdW50IiwibWFuYWdlLWFjY291bnQtbGlua3MiLCJ2aWV3LXByb2ZpbGUiXX19LCJuYW1lIjoiRm9vIEJhciIsInByZWZlcnJlZF91c2VybmFtZSI6Imluc29tbmlhIiwiZ2l2ZW5fbmFtZSI6IkZvbyIsImZhbWlseV9uYW1lIjoiQmFyIn0.LqVIQFL1rPzIgFRof0N-NqADxe3rGlNM0ibztOb9lXxZZ4MYFie2vvOkAJ3EZHIrD3LFWTw5MO90thf0ruIdCU2jyt1ybs3Wu_AXi5CiCM7bPnMKDIu0r05wrGA9DzK15Ss49Hw8aUIC099utL8nbx3YBAPMa9TPVd3fxcqiFFrcYI7cQ07m37sdqOfVAzDzWWOIqUGc98aFbXAJGqLt2e4MQdp225zqnwPPXgGXDye_5czR_qM2S6xTjibqy5o-u_iGqRlknN778kzdsO45-yTuxFShqQQgwMEj-G5yOtzNlidNbMdh7iWs3zptlisXbjQ4VVKoNmEOIdFMShsLLg&token_type_hint=access_token
HTTP/1.1 200 OK
Connection: close
Content-Type: application/json
Content-Length: 617
Date: Mon, 04 Jun 2018 14:19:29 GMT

{
  "jti": "fa9d7592-8338-4900-b28d-7e1124f3567f",
  "exp": 1528122259,
  "nbf": 0,
  "iat": 1528121959,
  "iss": "http://localhost:8000/auth/realms/APIcast",
  "aud": "guy",
  "sub": "62daf3aa-083a-42c3-9df0-28c4d6838dc4",
  "typ": "Bearer",
  "azp": "guy",
  "auth_time": 1528121305,
  "session_state": "c83ee5c5-d5ef-4750-b20e-48fa55047fe7",
  "name": "Foo Bar",
  "given_name": "Foo",
  "family_name": "Bar",
  "preferred_username": "insomnia",
  "acr": "0",
  "allowed-origins": [],
  "realm_access": {
    "roles": [
      "uma_authorization"
    ]
  },
  "resource_access": {
    "account": {
      "roles": [
        "manage-account",
        "manage-account-links",
        "view-profile"
      ]
    }
  },
  "client_id": "guy",
  "username": "insomnia",
  "active": true
}

When the client is invalidated using "not before", the client is disabled, or the user logs out, the endpoint responds with:

HTTP/1.1 200 OK
Connection: close
Content-Type: application/json
Content-Length: 16
Date: Mon, 04 Jun 2018 14:21:15 GMT

{"active":false}

Userinfo endpoint

The Userinfo endpoint does not require extra authorization, and it returns the user information for the token.

GET /auth/realms/APIcast/protocol/openid-connect/userinfo HTTP/1.1
Authorization: Bearer eyJhbGciOiJSUzI1NiIsInR5cCIgOiAiSldUIiwia2lkIiA6ICJ6S1FEQWpkZGNWXzc3YlN5eE5oMmxfdXZlQVdhOGNmR3NnZlFUTkxPQXdnIn0.eyJqdGkiOiI1YmQ3M2EyMS1hMjg4LTRmMDctYTIxMS05YWMxNmY0YjIyYmQiLCJleHAiOjE1MjgxMjI1ODgsIm5iZiI6MCwiaWF0IjoxNTI4MTIyMjg4LCJpc3MiOiJodHRwOi8vbG9jYWxob3N0OjgwMDAvYXV0aC9yZWFsbXMvQVBJY2FzdCIsImF1ZCI6Imd1eSIsInN1YiI6IjYyZGFmM2FhLTA4M2EtNDJjMy05ZGYwLTI4YzRkNjgzOGRjNCIsInR5cCI6IkJlYXJlciIsImF6cCI6Imd1eSIsImF1dGhfdGltZSI6MTUyODEyMjI4Nywic2Vzc2lvbl9zdGF0ZSI6ImJhNWZiZTY5LTE2YWMtNGNkNy1iOGRjLWNiZGM4ODQ0ODUxZCIsImFjciI6IjEiLCJhbGxvd2VkLW9yaWdpbnMiOltdLCJyZWFsbV9hY2Nlc3MiOnsicm9sZXMiOlsidW1hX2F1dGhvcml6YXRpb24iXX0sInJlc291cmNlX2FjY2VzcyI6eyJhY2NvdW50Ijp7InJvbGVzIjpbIm1hbmFnZS1hY2NvdW50IiwibWFuYWdlLWFjY291bnQtbGlua3MiLCJ2aWV3LXByb2ZpbGUiXX19LCJuYW1lIjoiRm9vIEJhciIsInByZWZlcnJlZF91c2VybmFtZSI6Imluc29tbmlhIiwiZ2l2ZW5fbmFtZSI6IkZvbyIsImZhbWlseV9uYW1lIjoiQmFyIn0.NsItiiICsyM2qhi5ihIuKeXu9PZ7CmvXW0ot4t--w_mrSBW1BoSyd4WwdJ61h0sm4apKhde84_bzAPX35wdfkoXt_Nzn4HrTelgaY9R5gGzg_2-kkZhh8gSxrbkvzdY6hQXif_CTkXsRr3CnN7iR0hDiN9l0scE24z7c2ltqmYMpY-FHgmRu12cY8g61MkyvwN_WsDJy08FQ7EJTy_S5eP5XItIn5JM6g_VV5kUBoRlTkriH2aLRE-t2SOoApJ8JP7nPTHWJqQ12MFJyRNCVrq4SvJgbuZnzSqCHNsL7pvJKsi9n3LpXCIUws0eJZW1E6UxUOG7N2edUW20cT7OqHw
Host: localhost:8000
Connection: close
User-Agent: Paw/3.1.7 (Macintosh; OS X/10.13.4) GCDHTTPRequest
HTTP/1.1 200 OK
Connection: close
Cache-Control: no-cache
Content-Type: application/json
Content-Length: 134
Date: Mon, 04 Jun 2018 14:25:19 GMT

{"sub":"62daf3aa-083a-42c3-9df0-28c4d6838dc4","name":"Foo Bar","preferred_username":"insomnia","given_name":"Foo","family_name":"Bar"}

And when the client is disabled:

HTTP/1.1 400 Bad Request
Connection: close
Content-Type: application/json
Content-Length: 65
Date: Mon, 04 Jun 2018 14:26:40 GMT

{"error":"invalid_request","error_description":"Client disabled"}

And when the user logs out:

HTTP/1.1 401 Unauthorized
Connection: close
Content-Type: application/json
Content-Length: 110
Date: Mon, 04 Jun 2018 14:27:12 GMT

{"error":"invalid_request","error_description":"User session not found or doesn't have client attached on it"}

Client "not before" revocation has no effect on this endpoint.

"description": "Max number of tokens to cache",
"type": "integer",
"minimum": 0,
"maximum": 10000
mikz (Contributor) commented Jun 4, 2018:

Probably should have "default": 0?

Edit: maybe not needed when it is not a required property.

davidor (Contributor, Author) replied:

Right. It already defaults to 0 in the code but should be here too.

mikz (Contributor) replied:

@davidor probably not needed. Only if we want to populate it in the UI.

mikz (Contributor) left a comment:

Based on #656 (comment) this should work.

I'd love to have an integration test, but that can wait for some bigger chunk of work regarding OIDC tests.

@davidor davidor force-pushed the add-cache-token-introspection branch from 957aee9 to 393fc62 on June 4, 2018 14:40
@davidor davidor added this to the 3.3 milestone Jun 4, 2018
@davidor davidor merged commit b473531 into master Jun 4, 2018
@davidor davidor deleted the add-cache-token-introspection branch June 4, 2018 15:01