
Get rid of store_type/1 in auth backends #2254

Merged: 4 commits merged into master from no-store-type on Apr 11, 2019
Conversation

chrzaszcz (Member) commented Apr 4, 2019

Before the change

Each auth backend had a store_type/1 callback, which could return
'external', 'plain' or 'scram'.

There was also a store_type/1 function in ejabberd_auth.
It traversed all backends to determine the store type
for the whole host, with logic that e.g. favored 'external' over
'plain' (see the implementation for details).

Each SASL mechanism had its own password type, which was stored
in the sasl_mechanism ETS table on mechanism registration.
The possible password types were 'plain', 'digest', 'scram' and 'cert'.

To determine supported mechanisms in cyrsasl:listmech/1,
the store type for the current host was used to filter the SASL
mechanisms with the following rules:

Auth store_type     Supported password types
--------------      ------------------------
external            plain
scram               plain, scram, cert
plain               plain, digest, scram
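
The rule table above can be sketched as a small Erlang module. This is a
hypothetical illustration of the old filtering rule only: the module and
function names below are made up and are not the actual MongooseIM code.

```erlang
%% Illustrative sketch of the old store_type-based filtering rule.
%% Module and function names are assumptions, not MongooseIM code.
-module(store_type_rules).
-export([allowed_password_types/1]).

%% Map the store type determined for a host to the password types
%% whose SASL mechanisms remain enabled for that host.
-spec allowed_password_types(external | plain | scram) -> [atom()].
allowed_password_types(external) -> [plain];
allowed_password_types(scram)    -> [plain, scram, cert];
allowed_password_types(plain)    -> [plain, digest, scram].
```

Note how coarse this is: a single store type decides the fate of every
mechanism on the host, which is exactly what the change removes.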

After the change

There is no notion of store types anymore - auth backends have a new
supports_password_type/1 callback, which returns a boolean indicating
support for individual password types.

To determine support for a particular password type for the whole host,
the corresponding function in ejabberd_auth checks if any backend
supports it.

To determine supported mechanisms in cyrsasl:listmech/1,
for each mechanism it is enough to check if the current host supports
its password type.
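
The two checks described above can be modeled with a short standalone
Erlang sketch. This is not the actual MongooseIM code: backends are
represented here as funs standing in for their supports_password_type/1
callbacks, and mechanisms as {Name, PasswordType} pairs.

```erlang
%% Standalone model of the new scheme (illustrative names only; each
%% backend is modeled as a fun playing the role of its
%% supports_password_type/1 callback).
-module(auth_model).
-export([host_supports/2, listmech/2]).

%% A password type is supported by the host if any configured backend
%% supports it - the per-password-type check done in ejabberd_auth.
host_supports(PasswordType, Backends) ->
    lists:any(fun(Supports) -> Supports(PasswordType) end, Backends).

%% cyrsasl:listmech/1 analogue: a mechanism is advertised if and only
%% if the host supports its password type.
listmech(Mechanisms, Backends) ->
    [Name || {Name, PwType} <- Mechanisms, host_supports(PwType, Backends)].
```

For example, with a PKI-like backend supporting only 'cert' and an
RDBMS-like backend supporting 'plain' and 'scram', EXTERNAL and PLAIN
mechanisms would be advertised while DIGEST-MD5 would not.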

Furthermore, the 'cert' password type is now supported only by the 'pki'
auth backend and no longer matches mechanisms supporting SCRAM.

Configuration is simplified,
e.g. the HTTP auth backend no longer disables SASL EXTERNAL for the whole host.

mongoose-im (Collaborator):
6293.1 / Erlang 19.3 / small_tests / d197d55
Reports root / small

@chrzaszcz chrzaszcz force-pushed the no-store-type branch 2 times, most recently from 3a43817 to e57d1e3 Compare April 4, 2019 13:11
mongoose-im (Collaborator) commented Apr 4, 2019

6295.1 / Erlang 19.3 / small_tests / b09e9f3
Reports root / small


6295.2 / Erlang 19.3 / internal_mnesia / b09e9f3
Reports root/ big
OK: 1247 / Failed: 2 / User-skipped: 65 / Auto-skipped: 0

sm_SUITE:parallel:subscription_requests_are_buffered_properly
{error,{{badmatch,false},
    [{escalus_session,stream_management,2,
              [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_session.erl"},
               {line,227}]},
     {escalus_connection,connection_step,2,
               [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_connection.erl"},
                {line,134}]},
     {lists,foldl,3,[{file,"lists.erl"},{line,1263}]},
     {escalus_connection,start,2,
               [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_connection.erl"},
                {line,118}]},
     {sm_SUITE,'-subscription_requests_are_buffered_properly/1-fun-3-',6,
           [{file,"sm_SUITE.erl"},{line,848}]},
     {escalus_story,story,4,
            [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_story.erl"},
             {line,72}]},
     {test_server,ts_tc,3,[{file,"test_server.erl"},{line,1529}]},
     {test_server,run_test_case_eval1,6,
            [{file,"test_server.erl"},{line,1045}]}]}}

Report log

sm_SUITE:parallel:subscription_requests_are_buffered_properly
{error,{{badmatch,false},
    [{escalus_session,stream_management,2,
              [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_session.erl"},
               {line,227}]},
     {escalus_connection,connection_step,2,
               [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_connection.erl"},
                {line,134}]},
     {lists,foldl,3,[{file,"lists.erl"},{line,1263}]},
     {escalus_connection,start,2,
               [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_connection.erl"},
                {line,118}]},
     {sm_SUITE,'-subscription_requests_are_buffered_properly/1-fun-3-',6,
           [{file,"sm_SUITE.erl"},{line,848}]},
     {escalus_story,story,4,
            [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_story.erl"},
             {line,72}]},
     {test_server,ts_tc,3,[{file,"test_server.erl"},{line,1529}]},
     {test_server,run_test_case_eval1,6,
            [{file,"test_server.erl"},{line,1045}]}]}}

Report log


6295.6 / Erlang 19.3 / elasticsearch_and_cassandra_mnesia / b09e9f3
Reports root/ big
OK: 469 / Failed: 0 / User-skipped: 8 / Auto-skipped: 0


6295.5 / Erlang 19.3 / ldap_mnesia / b09e9f3
Reports root/ big
OK: 1168 / Failed: 0 / User-skipped: 100 / Auto-skipped: 0


6295.3 / Erlang 19.3 / mysql_redis / b09e9f3
Reports root/ big
OK: 3074 / Failed: 0 / User-skipped: 232 / Auto-skipped: 0


6295.4 / Erlang 19.3 / odbc_mssql_mnesia / b09e9f3
Reports root/ big
OK: 3076 / Failed: 0 / User-skipped: 230 / Auto-skipped: 0


6295.8 / Erlang 20.0 / pgsql_mnesia / b09e9f3
Reports root/ big / small
OK: 3121 / Failed: 1 / User-skipped: 198 / Auto-skipped: 0

mod_global_distrib_SUITE:mod_global_distrib:test_pm_with_ungraceful_reconnection_to_different_server
{error,
  {timeout_when_waiting_for_stanza,
    [{escalus_client,wait_for_stanza,
       [{client,<<"eve1.58679@localhost/res1">>,escalus_tcp,
          <0.23273.3>,
          [{event_manager,<0.23264.3>},
           {server,<<"localhost">>},
           {username,<<"eve1.58679">>},
           {resource,<<"res1">>}],
          [{event_client,
             [{event_manager,<0.23264.3>},
            {server,<<"localhost">>},
            {username,<<"eve1.58679">>},
            {resource,<<"res1">>}]},
           {resource,<<"res1">>},
           {username,<<"eve1.58679">>},
           {server,<<"localhost">>},
           {host,<<"localhost">>},
           {port,5222},
           {auth,{escalus_auth,auth_plain}},
           {wspath,undefined},
           {username,<<"eve1.58679">>},
           {server,<<"localhost">>},
           {password,<<"password">>},
           {port,5222},
           {stream_management,true},
           {stream_id,<<"9D04CAB3EBC5078A">>}]},
        10000],
       [{file,
          "/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_client.erl"},
        {line,138}]},
     {mod_global_distrib_SUITE,
       '-test_pm_with_ungraceful_reconnection_to_different_server/1-fun-0-',
       4,
       [{file,"mod_global_distrib_SUITE.erl"},{line,610}]},
     {escalus_story,story,4,
       [{file,
          "/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_story.erl"},
        {line,72}]},
     {test_ser...

Report log


6295.9 / Erlang 21.0 / riak_mnesia / b09e9f3
Reports root/ big / small
OK: 1438 / Failed: 0 / User-skipped: 63 / Auto-skipped: 0



6295.2 / Erlang 19.3 / internal_mnesia / b09e9f3
Reports root/ big
OK: 1203 / Failed: 0 / User-skipped: 65 / Auto-skipped: 0

codecov bot commented Apr 5, 2019

Codecov Report

Merging #2254 into master will increase coverage by 1.76%.
The diff coverage is 87.5%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master    #2254      +/-   ##
==========================================
+ Coverage   77.07%   78.84%   +1.76%     
==========================================
  Files         333      333              
  Lines       28802    28796       -6     
==========================================
+ Hits        22200    22703     +503     
+ Misses       6602     6093     -509
Impacted Files Coverage Δ
src/sasl/cyrsasl_external.erl 91.83% <ø> (ø) ⬆️
src/auth/ejabberd_auth_pki.erl 16.66% <100%> (ø) ⬆️
src/sasl/cyrsasl.erl 88.23% <100%> (+3.23%) ⬆️
src/auth/ejabberd_auth_jwt.erl 89.13% <100%> (ø) ⬆️
src/auth/ejabberd_auth_external.erl 32.07% <100%> (ø) ⬆️
src/auth/ejabberd_auth_http.erl 73.68% <100%> (+4.93%) ⬆️
src/auth/ejabberd_auth.erl 60.67% <100%> (+0.96%) ⬆️
src/auth/ejabberd_auth_ldap.erl 59.25% <100%> (ø) ⬆️
src/auth/ejabberd_auth_riak.erl 80.76% <75%> (+80.76%) ⬆️
src/auth/ejabberd_auth_rdbms.erl 40.14% <75%> (-0.29%) ⬇️
... and 29 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 7ff43e8...360bac0.

Review thread on big_tests/tests/oauth_SUITE.erl (outdated, resolved)
mongoose-im commented Apr 11, 2019

6307.1 / Erlang 19.3 / small_tests / 332f032
Reports root / small


6307.5 / Erlang 19.3 / ldap_mnesia / 332f032
Reports root/ big
OK: 1168 / Failed: 0 / User-skipped: 100 / Auto-skipped: 0


6307.3 / Erlang 19.3 / mysql_redis / 332f032
Reports root/ big
OK: 3073 / Failed: 1 / User-skipped: 232 / Auto-skipped: 0

mam_SUITE:rsm04:pagination_last5
{'EXIT',tcp_close_timeout}

Report log


6307.2 / Erlang 19.3 / internal_mnesia / 332f032
Reports root/ big
OK: 1203 / Failed: 0 / User-skipped: 65 / Auto-skipped: 0


6307.6 / Erlang 19.3 / elasticsearch_and_cassandra_mnesia / 332f032
Reports root/ big
OK: 469 / Failed: 0 / User-skipped: 8 / Auto-skipped: 0


6307.4 / Erlang 19.3 / odbc_mssql_mnesia / 332f032
Reports root/ big
OK: 3076 / Failed: 0 / User-skipped: 230 / Auto-skipped: 0


6307.8 / Erlang 20.0 / pgsql_mnesia / 332f032
Reports root/ big / small
OK: 3108 / Failed: 0 / User-skipped: 198 / Auto-skipped: 0


6307.9 / Erlang 21.0 / riak_mnesia / 332f032
Reports root/ big / small
OK: 1438 / Failed: 0 / User-skipped: 63 / Auto-skipped: 0


6307.3 / Erlang 19.3 / mysql_redis / 332f032
Reports root/ big
OK: 3090 / Failed: 2 / User-skipped: 232 / Auto-skipped: 0

pubsub_SUITE:dag+collection:discover_top_level_nodes_test
{error,
  {timeout_when_waiting_for_stanza,
    [{escalus_client,wait_for_stanza,
       [{client,<<"alicE62.756642@localhost/res1">>,escalus_tcp,
          <0.10472.3>,
          [{event_manager,<0.10436.3>},
           {server,<<"localhost">>},
           {username,<<"alicE62.756642">>},
           {resource,<<"res1">>}],
          [{event_client,
             [{event_manager,<0.10436.3>},
            {server,<<"localhost">>},
            {username,<<"alicE62.756642">>},
            {resource,<<"res1">>}]},
           {resource,<<"res1">>},
           {username,<<"alicE62.756642">>},
           {server,<<"localhost">>},
           {host,<<"localhost">>},
           {port,5222},
           {auth,{escalus_auth,auth_plain}},
           {wspath,undefined},
           {username,<<"alicE62.756642">>},
           {server,<<"localhost">>},
           {password,<<"matygrysa">>},
           {stream_id,<<"4B24A5189F076CA5">>}]},
        5000],
       [{file,
          "/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_client.erl"},
        {line,138}]},
     {pubsub_tools,receive_response,3,
       [{file,"pubsub_tools.erl"},{line,456}]},
     {pubsub_tools,receive_and_check_response,4,
       [{file,"pubsub_tools.erl"},{line,447}]},
     {pubsub_SUITE,'-discover_top_level_nodes_test/1-fun-0-',2,
       [{file,"pubsub_SUITE.erl"},{line,1461}]},
     {escalus_story,story,4,
       [{file,
          "/home/travis/build/esl/MongooseIM/big_tests/_build/def...

Report log

pubsub_SUITE:tree+hometree_specific:disco_node_children_by_path_prefix
{error,
  {timeout_when_waiting_for_stanza,
    [{escalus_client,wait_for_stanza,
       [{client,<<"alicE83.621322@localhost/res1">>,escalus_tcp,
          <0.12148.3>,
          [{event_manager,<0.12139.3>},
           {server,<<"localhost">>},
           {username,<<"alicE83.621322">>},
           {resource,<<"res1">>}],
          [{event_client,
             [{event_manager,<0.12139.3>},
            {server,<<"localhost">>},
            {username,<<"alicE83.621322">>},
            {resource,<<"res1">>}]},
           {resource,<<"res1">>},
           {username,<<"alicE83.621322">>},
           {server,<<"localhost">>},
           {host,<<"localhost">>},
           {port,5222},
           {auth,{escalus_auth,auth_plain}},
           {wspath,undefined},
           {username,<<"alicE83.621322">>},
           {server,<<"localhost">>},
           {password,<<"matygrysa">>},
           {stream_id,<<"C71D041324D21219">>}]},
        5000],
       [{file,
          "/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_client.erl"},
        {line,138}]},
     {pubsub_tools,receive_response,3,
       [{file,"pubsub_tools.erl"},{line,456}]},
     {pubsub_tools,receive_and_check_response,4,
       [{file,"pubsub_tools.erl"},{line,447}]},
     {pubsub_SUITE,'-disco_node_children_by_path_prefix/1-fun-0-',2,
       [{file,"pubsub_SUITE.erl"},{line,1792}]},
     {escalus_story,story,4,
       [{file,
          "/home/travis/build/esl/MongooseIM/big_tests/_buil...

Report log

chrzaszcz requested a review from fenek on April 11, 2019 12:30
chrzaszcz merged commit 78e72e0 into master on Apr 11, 2019
chrzaszcz deleted the no-store-type branch on April 11, 2019 14:16
fenek added this to the MongooseIM 3.3.0++ milestone on Apr 11, 2019
michalwski mentioned this pull request on Jan 17, 2020