Removal of personal data from mod_mam #2328

Merged: 19 commits, Jun 25, 2019
Conversation

zofpolkowska (Contributor)

Only PM conversations are removed from the archive.


codecov bot commented May 31, 2019

Codecov Report

Merging #2328 into master will increase coverage by 2.38%.
The diff coverage is 93.65%.


@@            Coverage Diff             @@
##           master    #2328      +/-   ##
==========================================
+ Coverage   75.76%   78.14%   +2.38%     
==========================================
  Files         335      335              
  Lines       29229    29287      +58     
==========================================
+ Hits        22146    22887     +741     
+ Misses       7083     6400     -683
Impacted Files Coverage Δ
src/mam/mod_mam_rdbms_async_pool_writer.erl 63.2% <ø> (+0.59%) ⬆️
src/mam/mod_mam_rdbms_user.erl 76.05% <0%> (-1.09%) ⬇️
src/mam/mod_mam_cassandra_arch.erl 85.47% <100%> (+64.64%) ⬆️
src/pubsub/mod_pubsub.erl 71.9% <100%> (+0.01%) ⬆️
src/mam/mod_mam_rdbms_arch.erl 52.61% <100%> (+0.77%) ⬆️
src/mam/mod_mam_riak_timed_arch_yz.erl 64.95% <100%> (-11.11%) ⬇️
src/mam/mod_mam_elasticsearch_arch.erl 91.07% <100%> (+85.56%) ⬆️
src/mam/mod_mam_mnesia_prefs.erl 94.56% <100%> (+1.23%) ⬆️
src/mam/mod_mam.erl 90.51% <100%> (+0.29%) ⬆️
src/mam/mod_mam_cassandra_prefs.erl 82.82% <100%> (+82.82%) ⬆️
... and 26 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update d40f82a...bce17c8.

@fenek force-pushed the gdpr-mam-remove branch from 89909e3 to 691dff8 on June 12, 2019 13:05

@DenysGonchar (Collaborator) left a comment:

see the comment.


AliceU = escalus_utils:jid_to_lower(escalus_client:username(Alice)),
AliceS = escalus_utils:jid_to_lower(escalus_client:server(Alice)),
mongoose_helper:wait_until(
Collaborator:

why do we have to wait here? is removal async?
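The polling in the snippet above makes sense if removal is asynchronous (e.g. with the async pool writer). A hypothetical completion of that truncated test code follows; the `mam_helper:archive_size/2` helper and the `wait_until/2` arity are illustrative assumptions, not taken from the PR diff:

```erlang
%% Hypothetical completion of the truncated snippet above.
%% Assumes a mam_helper:archive_size(Server, User) helper and a
%% mongoose_helper:wait_until(Fun, ExpectedValue) shape.
AliceU = escalus_utils:jid_to_lower(escalus_client:username(Alice)),
AliceS = escalus_utils:jid_to_lower(escalus_client:server(Alice)),
%% With async writers, the archive shrinks only after pending
%% operations are flushed, so the test polls until the size is 0:
mongoose_helper:wait_until(
    fun() -> mam_helper:archive_size(AliceS, AliceU) end, 0).
```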

@@ -0,0 +1,5 @@
-module(ejabberd_gen_mam_user).
Collaborator:

This behaviour is absolutely unneeded; let's not increase the complexity of MAM. Only the RDBMS backend uses the archive ID, the other backends don't.

end end, [mod_mam_cache_user | Backends]).

try_archive_id_int(Host, #jid{ luser = LUser } = _ArcJID) ->
UserIDBackends = mongoose_lib:find_behaviour_implementations(ejabberd_gen_mam_user),
Collaborator:

It's only one module, and I don't think we will have any other in the future. Also, this code doesn't support multiple ejabberd_gen_mam_user implementations. Just remove it and call mod_mam_rdbms_user:get_archive_id/2.
Another option: remove the UserID parameter from remove_archive/3 and extract it internally in those backends that really need it.
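A rough sketch of that second option, in which the backend resolves the ID itself. Only mod_mam_rdbms_user:get_archive_id/2 comes from the comment above; the two-argument remove_archive and the delete_archive_rows/3 helper are hypothetical names for illustration:

```erlang
%% Hypothetical shape of the second option: the UserID parameter is
%% dropped from the hook, and the RDBMS backend resolves it internally.
remove_archive(Host, #jid{luser = LUser, lserver = LServer} = UserJID) ->
    %% Only the RDBMS backend needs the integer archive ID:
    UserID = mod_mam_rdbms_user:get_archive_id(LServer, LUser),
    %% delete_archive_rows/3 stands in for the backend's actual
    %% row-deletion code:
    delete_archive_rows(Host, UserID, UserJID).
```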

Member:

I think that the latter option makes more sense.

@@ -18,6 +18,7 @@
-export([archive_size/4,
archive_message/9,
lookup_messages/3,
remove_archive/3,
Collaborator:

Add the interface as an optional callback in the ejabberd_gen_mam_archive behaviour. Better to call it remove_mam_gdpr_data, so it's easier to grep.
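In Erlang, optional behaviour callbacks are declared with the -optional_callbacks module attribute; a minimal sketch of what the suggestion could look like inside ejabberd_gen_mam_archive (the spec types are assumptions):

```erlang
%% Sketch: declaring the proposed callback as optional in the
%% ejabberd_gen_mam_archive behaviour module. Spec types are guesses.
-callback remove_mam_gdpr_data(jid:lserver(), jid:luser()) -> ok.
-optional_callbacks([remove_mam_gdpr_data/2]).
```

Callbacks listed in -optional_callbacks do not trigger an undefined_behaviour_func warning when a backend module omits them.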


remove_archive(Host, _UserID, UserJID) ->
ensure_params_loaded(Host),
PoolName = mod_mam_cassandra_arch_params:pool_name(),
Collaborator:

Either get rid of the pool_name/1 interface entirely, or bring it back; keep the code in the module consistent.

@fenek (Member) commented Jun 18, 2019:

Improving the consistency of the pool_name/1 behaviour is not really within the scope of this PR.

Member:

Aaaaand I decided to remove it after all. :D

Member:

Actually I misunderstood your comment at first. :P

@@ -18,6 +18,7 @@
-export([get_behaviour/5,
get_prefs/4,
set_prefs/7,
remove_archive/3,
Collaborator:

Add the interface into the ejabberd_gen_mam_prefs behaviour as optional, and rename it into something like remove_mam_gdpr_data.

@@ -28,6 +28,7 @@
-export([archive_size/4,
archive_message/9,
lookup_messages/3,
remove_archive/3,
Collaborator:

please remove it

@@ -10,11 +10,14 @@
%%%-------------------------------------------------------------------
-module(mod_mam_rdbms_user).

-behaviour(ejabberd_gen_mam_user).
Collaborator:

Don't remove this, it's required for the MAM MUC DB as well!!!

Member:

Do we care about this ID after the user is removed? It's used only for retrieval, correct?

Collaborator:

Yes, right now it's required only for MAM MUC GDPR retrieval, but I would prefer to keep the data in RDBMS consistent.

mongoose-im (Collaborator) commented Jun 21, 2019

6705.1 / Erlang 20.3 / small_tests / df5789c
Reports root / small


6705.2 / Erlang 20.3 / internal_mnesia / df5789c
Reports root/ big
OK: 1271 / Failed: 3 / User-skipped: 168 / Auto-skipped: 0

pubsub_SUITE:dag+collection:subscribe_unsubscribe_collection_test
{error,{{badmatch,false},
    [{pubsub_tools,check_response,2,
             [{file,"pubsub_tools.erl"},{line,468}]},
     {pubsub_tools,receive_response,3,
             [{file,"pubsub_tools.erl"},{line,458}]},
     {pubsub_tools,receive_and_check_response,4,
             [{file,"pubsub_tools.erl"},{line,448}]},
     {escalus_story,story,4,
            [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_story.erl"},
             {line,72}]},
     {test_server,ts_tc,3,[{file,"test_server.erl"},{line,1546}]},
     {test_server,run_test_case_eval1,6,
            [{file,"test_server.erl"},{line,1062}]},
     {test_server,run_test_case_eval,9,
            [{file,"test_server.erl"},{line,994}]}]}}

Report log

pubsub_SUITE:tree+node_config:disable_payload_test
{error,{{badmatch,false},
    [{pubsub_tools,check_response,2,
             [{file,"pubsub_tools.erl"},{line,468}]},
     {pubsub_tools,receive_response,3,
             [{file,"pubsub_tools.erl"},{line,458}]},
     {pubsub_tools,receive_and_check_response,4,
             [{file,"pubsub_tools.erl"},{line,448}]},
     {escalus_story,story,4,
            [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_story.erl"},
             {line,72}]},
     {test_server,ts_tc,3,[{file,"test_server.erl"},{line,1546}]},
     {test_server,run_test_case_eval1,6,
            [{file,"test_server.erl"},{line,1062}]},
     {test_server,run_test_case_eval,9,
            [{file,"test_server.erl"},{line,994}]}]}}

Report log

mod_global_distrib_SUITE:mod_global_distrib:test_pm_with_ungraceful_reconnection_to_different_server
{error,
  {timeout_when_waiting_for_stanza,
    [{escalus_client,wait_for_stanza,
       [{client,<<"eve98.134421@localhost/res1">>,escalus_tcp,
          <0.20709.1>,
          [{event_manager,<0.20700.1>},
           {server,<<"localhost">>},
           {username,<<"eve98.134421">>},
           {resource,<<"res1">>}],
          [{event_client,
             [{event_manager,<0.20700.1>},
            {server,<<"localhost">>},
            {username,<<"eve98.134421">>},
            {resource,<<"res1">>}]},
           {resource,<<"res1">>},
           {username,<<"eve98.134421">>},
           {server,<<"localhost">>},
           {host,<<"localhost">>},
           {port,5222},
           {auth,{escalus_auth,auth_plain}},
           {wspath,undefined},
           {username,<<"eve98.134421">>},
           {server,<<"localhost">>},
           {password,<<"password">>},
           {port,5222},
           {stream_management,true},
           {stream_id,<<"F33CB34415CB983A">>}]},
        10000],
       [{file,
          "/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_client.erl"},
        {line,138}]},
     {mod_global_distrib_SUITE,
       '-test_pm_with_ungraceful_reconnection_to_different_server/1-fun-0-',
       4,
       [{file,"mod_global_distrib_SUITE.erl"},{line,610}]},
     {escalus_story,story,4,
       [{file,
          "/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_story.erl"},
        {line,72}]},
    ...

Report log


6705.3 / Erlang 20.3 / odbc_mssql_mnesia / df5789c
Reports root/ big
OK: 3143 / Failed: 0 / User-skipped: 278 / Auto-skipped: 0


6705.4 / Erlang 20.3 / ldap_mnesia / df5789c
Reports root/ big
OK: 1199 / Failed: 0 / User-skipped: 206 / Auto-skipped: 0


6705.5 / Erlang 20.3 / elasticsearch_and_cassandra_mnesia / df5789c
Reports root/ big
OK: 532 / Failed: 0 / User-skipped: 55 / Auto-skipped: 0


6705.8 / Erlang 21.3 / mysql_redis / df5789c
Reports root/ big
OK: 3157 / Failed: 2 / User-skipped: 280 / Auto-skipped: 0

pubsub_SUITE:dag+collection:discover_top_level_nodes_test
{error,
  {timeout_when_waiting_for_stanza,
    [{escalus_client,wait_for_stanza,
       [{client,<<"alicE2.492401@localhost/res1">>,escalus_tcp,
          <0.15554.3>,
          [{event_manager,<0.15475.3>},
           {server,<<"localhost">>},
           {username,<<"alicE2.492401">>},
           {resource,<<"res1">>}],
          [{event_client,
             [{event_manager,<0.15475.3>},
            {server,<<"localhost">>},
            {username,<<"alicE2.492401">>},
            {resource,<<"res1">>}]},
           {resource,<<"res1">>},
           {username,<<"alicE2.492401">>},
           {server,<<"localhost">>},
           {host,<<"localhost">>},
           {port,5222},
           {auth,{escalus_auth,auth_plain}},
           {wspath,undefined},
           {username,<<"alicE2.492401">>},
           {server,<<"localhost">>},
           {password,<<"matygrysa">>},
           {stream_id,<<"D1804794D104B51F">>}]},
        5000],
       [{file,
          "/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_client.erl"},
        {line,138}]},
     {pubsub_tools,receive_response,3,
       [{file,"pubsub_tools.erl"},{line,457}]},
     {pubsub_tools,receive_and_check_response,4,
       [{file,"pubsub_tools.erl"},{line,448}]},
     {pubsub_SUITE,'-discover_top_level_nodes_test/1-fun-0-',2,
       [{file,"pubsub_SUITE.erl"},{line,1461}]},
     {escalus_story,story,4,
       [{file,
          "/home/travis/build/esl/MongooseIM/big_tests/_build/default/...

Report log

pubsub_SUITE:tree+hometree_specific:can_create_node_with_existing_parent_path
{error,
  {timeout_when_waiting_for_stanza,
    [{escalus_client,wait_for_stanza,
       [{client,<<"alicE23.518992@localhost/res1">>,escalus_tcp,
          <0.17213.3>,
          [{event_manager,<0.17187.3>},
           {server,<<"localhost">>},
           {username,<<"alicE23.518992">>},
           {resource,<<"res1">>}],
          [{event_client,
             [{event_manager,<0.17187.3>},
            {server,<<"localhost">>},
            {username,<<"alicE23.518992">>},
            {resource,<<"res1">>}]},
           {resource,<<"res1">>},
           {username,<<"alicE23.518992">>},
           {server,<<"localhost">>},
           {host,<<"localhost">>},
           {port,5222},
           {auth,{escalus_auth,auth_plain}},
           {wspath,undefined},
           {username,<<"alicE23.518992">>},
           {server,<<"localhost">>},
           {password,<<"matygrysa">>},
           {stream_id,<<"4A0D0E0A157A875B">>}]},
        5000],
       [{file,
          "/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_client.erl"},
        {line,138}]},
     {pubsub_tools,receive_response,3,
       [{file,"pubsub_tools.erl"},{line,457}]},
     {pubsub_tools,receive_and_check_response,4,
       [{file,"pubsub_tools.erl"},{line,448}]},
     {pubsub_SUITE,'-can_create_node_with_existing_parent_path/1-fun-0-',
       1,
       [{file,"pubsub_SUITE.erl"},{line,1768}]},
     {escalus_story,story,4,
       [{file,
          "/home/travis/build/esl/MongooseIM/...

Report log


6705.7 / Erlang 21.3 / pgsql_mnesia / df5789c
Reports root/ big / small
OK: 3175 / Failed: 0 / User-skipped: 246 / Auto-skipped: 0


6705.9 / Erlang 21.3 / riak_mnesia / df5789c
Reports root/ big / small
OK: 1489 / Failed: 0 / User-skipped: 149 / Auto-skipped: 0



@DenysGonchar (Collaborator) left a comment:

looks ok.

@@ -15,6 +15,7 @@

%% ejabberd handlers
-export([archive_id/3,
remove_archive/3,
Collaborator:

no need to export this one

mongoose-im (Collaborator) commented Jun 24, 2019

6715.1 / Erlang 20.3 / small_tests / 48a91aa
Reports root / small


6715.2 / Erlang 20.3 / internal_mnesia / 48a91aa
Reports root/ big
OK: 1260 / Failed: 1 / User-skipped: 167 / Auto-skipped: 0

sm_SUITE:parallel:subscription_requests_are_buffered_properly
{error,{{badmatch,false},
    [{escalus_session,stream_management,2,
              [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_session.erl"},
               {line,240}]},
     {escalus_connection,connection_step,2,
               [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_connection.erl"},
                {line,154}]},
     {lists,foldl,3,[{file,"lists.erl"},{line,1263}]},
     {escalus_connection,start,2,
               [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_connection.erl"},
                {line,138}]},
     {sm_SUITE,'-subscription_requests_are_buffered_properly/1-fun-3-',6,
           [{file,"sm_SUITE.erl"},{line,848}]},
     {escalus_story,story,4,
            [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_story.erl"},
             {line,72}]},
     {test_server,ts_tc,3,[{file,"test_server.erl"},{line,1546}]},
     {test_server,run_test_case_eval1,6,
            [{file,"test_server.erl"},{line,1062}]}]}}

Report log


6715.3 / Erlang 20.3 / odbc_mssql_mnesia / 48a91aa
Reports root/ big
OK: 3156 / Failed: 1 / User-skipped: 278 / Auto-skipped: 0

mod_global_distrib_SUITE:mod_global_distrib:test_pm_with_ungraceful_reconnection_to_different_server
{error,
  {timeout_when_waiting_for_stanza,
    [{escalus_client,wait_for_stanza,
       [{client,<<"eve75.146306@localhost/res1">>,escalus_tcp,
          <0.27508.3>,
          [{event_manager,<0.27499.3>},
           {server,<<"localhost">>},
           {username,<<"eve75.146306">>},
           {resource,<<"res1">>}],
          [{event_client,
             [{event_manager,<0.27499.3>},
            {server,<<"localhost">>},
            {username,<<"eve75.146306">>},
            {resource,<<"res1">>}]},
           {resource,<<"res1">>},
           {username,<<"eve75.146306">>},
           {server,<<"localhost">>},
           {host,<<"localhost">>},
           {port,5222},
           {auth,{escalus_auth,auth_plain}},
           {wspath,undefined},
           {username,<<"eve75.146306">>},
           {server,<<"localhost">>},
           {password,<<"password">>},
           {port,5222},
           {stream_management,true},
           {stream_id,<<"D3FD5A295E7F9CD2">>}]},
        10000],
       [{file,
          "/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_client.erl"},
        {line,136}]},
     {mod_global_distrib_SUITE,
       '-test_pm_with_ungraceful_reconnection_to_different_server/1-fun-0-',
       4,
       [{file,"mod_global_distrib_SUITE.erl"},{line,610}]},
     {escalus_story,story,4,
       [{file,
          "/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_story.erl"},
        {line,72}]},
    ...

Report log


6715.4 / Erlang 20.3 / ldap_mnesia / 48a91aa
Reports root/ big
OK: 1199 / Failed: 0 / User-skipped: 206 / Auto-skipped: 0


6715.5 / Erlang 20.3 / elasticsearch_and_cassandra_mnesia / 48a91aa
Reports root/ big
OK: 532 / Failed: 0 / User-skipped: 55 / Auto-skipped: 0


6715.8 / Erlang 21.3 / mysql_redis / 48a91aa
Reports root/ big
OK: 3141 / Failed: 0 / User-skipped: 280 / Auto-skipped: 0


6715.7 / Erlang 21.3 / pgsql_mnesia / 48a91aa
Reports root/ big / small
OK: 3182 / Failed: 1 / User-skipped: 246 / Auto-skipped: 0

sm_SUITE:parallel_manual_ack_freq_1:resume_session_state_stop_c2s
{error,{{assertion_failed,assert_many,true,
              [is_presence,#Fun<sm_SUITE.16.116926647>],
              [{xmlel,<<"presence">>,
                  [{<<"from">>,
                  <<"alicE48.336288@localhost/escalus-default-resource">>},
                   {<<"to">>,
                  <<"alice48.336288@localhost/escalus-default-resource">>},
                   {<<"xml:lang">>,<<"en">>}],
                  []},
               {xmlel,<<"presence">>,
                  [{<<"from">>,
                  <<"alicE48.336288@localhost/escalus-default-resource">>},
                   {<<"to">>,
                  <<"alice48.336288@localhost/escalus-default-resource">>},
                   {<<"xml:lang">>,<<"en">>}],
                  []}],
              "   <presence from='alicE48.336288@localhost/escalus-default-resource' to='alice48.336288@localhost/escalus-default-resource' xml:lang='en'/>   <presence from='alicE48.336288@localhost/escalus-default-resource' to='alice48.336288@localhost/escalus-default-resource' xml:lang='en'/>"},
    [{escalus_new_assert,assert_true,2,
               [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_new_assert.erl"},
                {line,84}]},
     {sm_SUITE,resume_session_state_stop_c2s,1,
           [{file,"sm_SUITE.erl"},{line,581}]},
     {test_server,ts_tc,3,[{file,"test_server.erl"},{line,1562}]},
     {test_server,run_test_case_eval1,6,
            [{file,"test_server.erl"},{line,1080}]},
     ...

Report log


6715.9 / Erlang 21.3 / riak_mnesia / 48a91aa
Reports root/ big / small
OK: 1531 / Failed: 4 / User-skipped: 149 / Auto-skipped: 0

mam_SUITE:rsm03:pagination_first0
{'EXIT',tcp_close_timeout}

Report log

pubsub_SUITE:tree+basic:retract_test
{error,{{badmatch,false},
    [{pubsub_tools,check_response,2,
             [{file,"pubsub_tools.erl"},{line,468}]},
     {pubsub_tools,receive_response,3,
             [{file,"pubsub_tools.erl"},{line,458}]},
     {pubsub_tools,receive_and_check_response,4,
             [{file,"pubsub_tools.erl"},{line,448}]},
     {pubsub_SUITE,'-retract_test/1-fun-0-',2,
             [{file,"pubsub_SUITE.erl"},{line,625}]},
     {escalus_story,story,4,
            [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_story.erl"},
             {line,72}]},
     {test_server,ts_tc,3,[{file,"test_server.erl"},{line,1562}]},
     {test_server,run_test_case_eval1,6,
            [{file,"test_server.erl"},{line,1080}]},
     {test_server,run_test_case_eval,9,
            [{file,"test_server.erl"},{line,1012}]}]}}

Report log

pubsub_SUITE:tree+basic:subscribe_options_test
{error,{{badmatch,false},
    [{pubsub_tools,check_response,2,
             [{file,"pubsub_tools.erl"},{line,468}]},
     {pubsub_tools,receive_response,3,
             [{file,"pubsub_tools.erl"},{line,458}]},
     {pubsub_tools,receive_and_check_response,4,
             [{file,"pubsub_tools.erl"},{line,448}]},
     {escalus_story,story,4,
            [{file,"/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_story.erl"},
             {line,72}]},
     {test_server,ts_tc,3,[{file,"test_server.erl"},{line,1562}]},
     {test_server,run_test_case_eval1,6,
            [{file,"test_server.erl"},{line,1080}]},
     {test_server,run_test_case_eval,9,
            [{file,"test_server.erl"},{line,1012}]}]}}

Report log

mod_global_distrib_SUITE:mod_global_distrib:test_pm_with_ungraceful_reconnection_to_different_server
{error,
  {timeout_when_waiting_for_stanza,
    [{escalus_client,wait_for_stanza,
       [{client,<<"eve10.901827@localhost/res1">>,escalus_tcp,
          <0.31562.1>,
          [{event_manager,<0.31553.1>},
           {server,<<"localhost">>},
           {username,<<"eve10.901827">>},
           {resource,<<"res1">>}],
          [{event_client,
             [{event_manager,<0.31553.1>},
            {server,<<"localhost">>},
            {username,<<"eve10.901827">>},
            {resource,<<"res1">>}]},
           {resource,<<"res1">>},
           {username,<<"eve10.901827">>},
           {server,<<"localhost">>},
           {host,<<"localhost">>},
           {port,5222},
           {auth,{escalus_auth,auth_plain}},
           {wspath,undefined},
           {username,<<"eve10.901827">>},
           {server,<<"localhost">>},
           {password,<<"password">>},
           {port,5222},
           {stream_management,true},
           {stream_id,<<"8E320982C0022268">>}]},
        10000],
       [{file,
          "/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_client.erl"},
        {line,136}]},
     {mod_global_distrib_SUITE,
       '-test_pm_with_ungraceful_reconnection_to_different_server/1-fun-0-',
       4,
       [{file,"mod_global_distrib_SUITE.erl"},{line,610}]},
     {escalus_story,story,4,
       [{file,
          "/home/travis/build/esl/MongooseIM/big_tests/_build/default/lib/escalus/src/escalus_story.erl"},
        {line,72}]},
    ...

Report log


6715.9 / Erlang 21.3 / riak_mnesia / 48a91aa
Reports root/ big / small
OK: 1489 / Failed: 0 / User-skipped: 149 / Auto-skipped: 0



@fenek fenek merged commit 07a2fb5 into master Jun 25, 2019
@fenek fenek deleted the gdpr-mam-remove branch June 25, 2019 09:16
@fenek fenek added this to the 3.4.0 milestone Jun 26, 2019