
Include TICKLESS stack size increase even without LPTICKER_DELAY_TICKS #10781

Merged
merged 1 commit into ARMmbed:master on Jun 27, 2019

Conversation

@LMESTM (Contributor) commented on Jun 6, 2019

Description

The LPTICKER_DELAY_TICKS extra stack size was introduced with the LP
ticker C++ wrapper. Now we've removed the wrapper and the LPTICKER_DELAY_TICKS
definition, but we still need the extra stack size for the low-level wrapper.

This change is needed for #10701

As such, the change would impact more targets than expected, so maybe there is a better way to express the condition - feedback welcome.
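
For context, the condition under discussion lives in rtos/TARGET_CORTEX/mbed_rtx_conf.h. Below is a minimal sketch of the pre-change gating, not the verbatim file contents; the OS_IDLE_THREAD_STACK_SIZE and MBED_CONF_RTOS_* macro names follow the usual RTX and Mbed config conventions but are an assumption here:

```c
/* Sketch (not verbatim mbed_rtx_conf.h): before this PR, the extra
 * idle-thread stack for tickless was only added when the LP ticker
 * C++ wrapper's delay was configured. */
#if defined(MBED_TICKLESS) && defined(LPTICKER_DELAY_TICKS) && (LPTICKER_DELAY_TICKS > 0)
#define OS_IDLE_THREAD_STACK_SIZE (MBED_CONF_RTOS_IDLE_THREAD_STACK_SIZE \
                                   + MBED_CONF_RTOS_IDLE_THREAD_STACK_SIZE_TICKLESS_EXTRA)
#else
#define OS_IDLE_THREAD_STACK_SIZE MBED_CONF_RTOS_IDLE_THREAD_STACK_SIZE
#endif
/* With the wrapper and LPTICKER_DELAY_TICKS removed, this condition can
 * never be true, so no target gets the extra stack anymore. */
```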

Pull request type

[x] Fix
[ ] Refactor
[ ] Target update
[ ] Functionality change
[ ] Docs update
[ ] Test update
[ ] Breaking change

Reviewers

@mprse

Release Notes

@ciarmcom (Member) commented on Jun 6, 2019

@LMESTM, thank you for your changes.
@mprse @ARMmbed/mbed-os-core @ARMmbed/mbed-os-maintainers please review.

@adbridge (Contributor) commented

@mprse @ARMmbed/mbed-os-core could we have a review, please?

@bulislaw (Member) left a review comment

Maybe we could set idle-thread-stack-size-tickless-extra to 0 by default and set a different value in targets.json for the platforms that need it?
@mprse do you know if ST platforms would be the only ones for which the old condition would hold true right now? In any case it should be easy to check.
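
A hedged sketch of the C side of that suggestion, assuming the config value surfaces through the usual generated MBED_CONF_RTOS_* macro (the targets.json key itself would be idle-thread-stack-size-tickless-extra, per the comment above):

```c
/* Sketch: default the tickless-extra stack to 0, so only targets that
 * override idle-thread-stack-size-tickless-extra in targets.json get
 * a larger idle-thread stack. */
#ifndef MBED_CONF_RTOS_IDLE_THREAD_STACK_SIZE_TICKLESS_EXTRA
#define MBED_CONF_RTOS_IDLE_THREAD_STACK_SIZE_TICKLESS_EXTRA 0
#endif
```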

@mprse (Contributor) left a review comment

This is generic code for all targets - it should not be removed.

(Review thread on rtos/TARGET_CORTEX/mbed_rtx_conf.h - resolved)
@LMESTM (Contributor, Author) commented on Jun 25, 2019

Hi @0xc0170 - I think we would benefit from your help and guidance on which direction to take for this PR :-)

@0xc0170 (Contributor) commented on Jun 25, 2019

Where is this coming from? Looking at #10701, I don't see where this extra stack size is needed. If delay ticks are set to 0, what else needs extra stack in the idle loop?

@LMESTM (Contributor, Author) commented on Jun 25, 2019

> Where is this coming from? Looking at #10701, I don't see where this extra stack size is needed. If delay ticks are set to 0, what else needs extra stack in the idle loop?

The HAL LP ticker driver's code has grown in size because it performs roughly the same checks the C++ wrapper was doing, so it most probably uses more stack now, and it is called from the idle thread's stack.
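
To illustrate the claim: in tickless builds the idle thread programs its own wakeup through the LP ticker HAL, so the driver's internal checks run on the idle thread's stack. A simplified sketch follows; lp_ticker_set_interrupt() and lp_ticker_read() are real Mbed OS HAL calls, but the function itself is illustrative, not the actual idle handler:

```c
#include "hal/lp_ticker_api.h"

/* Simplified sketch of the tickless sleep path; it only shows which
 * HAL calls land on the idle thread's stack. */
static void tickless_sleep_sketch(timestamp_t wakeup_time)
{
    /* The HAL driver's bounds/rescheduling checks (formerly done by
     * the C++ wrapper) execute here, consuming idle-thread stack. */
    lp_ticker_set_interrupt(wakeup_time);

    /* ... enter sleep, then on wakeup read the current tick ... */
    uint32_t now = lp_ticker_read();
    (void)now;
}
```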

@0xc0170 (Contributor) commented on Jun 25, 2019

Based on that, it looks like the extension to the idle stack size should be provided per target, rather than removing this delay - something like @mprse proposed above: if a target sets extra idle stack, add it to the idle stack calculation.

@LMESTM force-pushed the Idle_Stack_Size_For_Tickless branch from c7de96d to 5789f01 on June 25, 2019 14:59
@LMESTM (Contributor, Author) commented on Jun 25, 2019

PR updated with a new proposal - @mprse @0xc0170 something like that?

Commit "…size":

Adding a check to let the application or target force-increase the idle thread
stack size.
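
A hedged sketch of the shape of that check (the merged diff to rtos/TARGET_CORTEX/mbed_rtx_conf.h is authoritative; the macro names here follow the same assumed conventions as the earlier sketches): keep the legacy LPTICKER_DELAY_TICKS trigger, but also honor a non-zero tickless-extra value set by a target or the application:

```c
/* Sketch of the updated check: add the extra idle-thread stack when
 * tickless is enabled and either the legacy LP ticker delay is
 * configured, or a target/application sets a non-zero extra value. */
#if defined(MBED_TICKLESS) && \
    ((defined(LPTICKER_DELAY_TICKS) && (LPTICKER_DELAY_TICKS > 0)) || \
     (MBED_CONF_RTOS_IDLE_THREAD_STACK_SIZE_TICKLESS_EXTRA > 0))
#define OS_IDLE_THREAD_STACK_SIZE (MBED_CONF_RTOS_IDLE_THREAD_STACK_SIZE \
                                   + MBED_CONF_RTOS_IDLE_THREAD_STACK_SIZE_TICKLESS_EXTRA)
#else
#define OS_IDLE_THREAD_STACK_SIZE MBED_CONF_RTOS_IDLE_THREAD_STACK_SIZE
#endif
```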
@LMESTM force-pushed the Idle_Stack_Size_For_Tickless branch from 5789f01 to 9e88719 on June 25, 2019 15:01
@0xc0170 (Contributor) commented on Jun 26, 2019

CI started.

@mbed-ci commented on Jun 26, 2019

Test run: SUCCESS

Summary: 11 of 11 test jobs passed
Build number: 1
Build artifacts

@0xc0170 merged commit eee8d0a into ARMmbed:master on Jun 27, 2019