
Function executing in parallel despite maxConcurrentCalls=1 #2667

Closed
rogersmj opened this issue Apr 12, 2018 · 15 comments
@rogersmj

Investigative information

Please provide the following:

  • Timestamp: 2018-04-12T13:49:33.128Z
  • Function App version (1.0 or 2.0-beta): 1.x
  • Function App name:
  • Function name(s) (as appropriate):
  • Invocation ID: 04082261-42bd-402b-9b3c-2e1522cae883 and eb424b52-7334-4a0d-930d-41951f821831
  • Region: East US

Repro steps

Provide the steps required to reproduce the problem:

  1. Set up a long-running Service Bus topic-triggered function. In the function's host.json, set maxConcurrentCalls to 1 as described in the host.json reference documentation.
  2. Post a message to the topic that will trigger your function.
  3. Observe the function start to execute.
  4. Post another message to the topic with different content.
  5. Observe another instance of the function begin to execute in parallel with the first.
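For reference, the host.json used in step 1 looks roughly like this (a minimal v1-runtime fragment; only the setting relevant here is shown):

```json
{
  "serviceBus": {
    "maxConcurrentCalls": 1
  }
}
```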

Expected behavior

With maxConcurrentCalls=1, I would expect another instance of the function to not kick off until the first one has completed and released its lock on the topic message.

Actual behavior

Multiple instances of the function are executing in parallel.

Known workarounds

?

Related information

  • Function written in node.js
  • Uses message content from topic to make several API calls to other services.
  • I am experiencing this behavior both in a locally run Functions host on Windows 10 as well as in Azure.
  • I am currently trying to limit this to 1 execution at a time in my dev environment to test the general concept of limiting parallel execution. In production, I will need to limit this to 5 or 6 parallel executions.
@paulbatum
Member

The host.json settings apply per instance of your app, not globally across all instances. So if you were running this function app on a dedicated app service plan with two VMs, both of them would run one execution (for a total of two concurrently). If you're running on the consumption plan, then the total number of concurrent executions will vary as the system scales your app up and down based on load.

If you're on the consumption plan, the WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT app setting documented here will help you limit how many VMs your app runs on (though it is not a 100% guaranteed limit, due to how the system behaves when it runs into capacity constraints).
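To make the arithmetic concrete: because host.json applies per instance, the effective ceiling is the per-instance setting multiplied by the instance count. A trivial sketch with illustrative numbers (the two-VM dedicated-plan example above):

```javascript
// host.json limits apply per instance, so the total concurrency ceiling
// is maxConcurrentCalls multiplied by the number of instances running the app.
const maxConcurrentCalls = 1; // from host.json (per instance)
const instanceCount = 2;      // e.g. a dedicated plan with two VMs
const totalConcurrent = maxConcurrentCalls * instanceCount;
console.log(totalConcurrent); // 2
```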

This issue tracks improving the overall experience in this area, but there is no ETA:
#1207

paulbatum added this to the Active Questions milestone Apr 12, 2018
@rogersmj
Author

Oh, interesting. I didn't realize that, thanks for the clarification.

It is important for the function we're building that we limit it to a certain number of parallel operations, because the service it is hitting (Microsoft Power BI to perform dataset refreshes, actually) will choke if more than 6 at a time are happening, yet it has no internal limiting. I will try WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT and if it works most of the time that should be sufficient. Otherwise it seems the only option would be to have some sort of state tracker with an intermediary.
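As a stopgap for the in-process part of that limiting, a promise-based semaphore can cap how many downstream calls run at once. A minimal sketch (note this only limits concurrency within a single Functions instance, not across scaled-out instances; the name callPowerBI is a placeholder):

```javascript
// Minimal promise-based semaphore: at most `limit` callers hold it at once.
class Semaphore {
  constructor(limit) {
    this.limit = limit;
    this.active = 0;
    this.queue = [];
  }
  async acquire() {
    if (this.active < this.limit) {
      this.active++;
      return;
    }
    // Wait until a release() wakes us up.
    await new Promise(resolve => this.queue.push(resolve));
    this.active++;
  }
  release() {
    this.active--;
    const next = this.queue.shift();
    if (next) next();
  }
}

// Usage: allow at most 6 concurrent downstream calls per instance.
const limiter = new Semaphore(6);
async function callPowerBI(work) {
  await limiter.acquire();
  try {
    return await work();
  } finally {
    limiter.release();
  }
}
```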

@MisinformedDNA

@paulbatum Any updates on when V2 and WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT will be "go-live"?

@paulbatum
Member

On the topic of V2, I can't really say much right now (is it relevant to what is being discussed here?).

For any updates on the status of WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT I would defer to @cgillum and @tohling

@MisinformedDNA

V2 is relevant since WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT is a V2 feature, right?

@paulbatum
Member

No, it applies for both versions.

@MisinformedDNA

Oh. Awesome. I didn't know that. Very good to know!

@markusfoss

Are there any workarounds for this item? We are reading a subscription on a message bus topic and writing to a Cosmos DB collection, and need to limit the concurrency (some sort of bulkheading).

@MisinformedDNA

@markusfoss Did you look into this: WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT?

@markusfoss

markusfoss commented Aug 20, 2018

We have discovered the following when using Azure Functions 2.0 with a Service Bus topic trigger:

[FunctionName("DirectHitMessageFunc")]
public static void Message(
    [ServiceBusTrigger("contentfeed", "directhitfunc", Connection = "MessageBusConnection")]
    string messageBody,
    TraceWriter logger,
    ExecutionContext context)
{
    Invoke(messageBody, logger, context);
}

We observe that 16 concurrent invocations are run on the instance despite the following host.json:
{
  "serviceBus": {
    "maxConcurrentCalls": 1,
    "prefetchCount": 0,
    "autoRenewTimeout": "00:05:00"
  }
}

We log to an external system (Serilog --> Seq) and record the server IP - it is always the same (unless we restart the function app in Azure).

@MisinformedDNA Also, I have not had success with setting WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT higher than "1" - execution always happens on only one instance (deduced from the same server IP being logged to Seq). Several thousand messages have been processed, and I have introduced a Thread.Sleep of 7 seconds to try to make the Azure Functions runtime spread the load across more instances.

To sum up:

  1. maxConcurrentCalls is not respected when using a ServiceBus Topic trigger in Azure Functions v2
  2. Unable to conclude if WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT is respected in Azure Functions v2

From our point of view, we cannot go to production when we cannot control, through concurrency limits, the load put on downstream resources (the reason we went for a Pub/Sub architecture to begin with).

Have others reached different conclusions?

@christian-vorhemus
Member

@markusfoss - I experienced the same behaviour; WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT in the app settings and maxConcurrentCalls in host.json do not seem to have the desired effect on the function (see also #912).

@paulbatum
Member

There's a bug in V2 right now that means the maxConcurrentCalls value is not being used correctly:
#2828

This is one of several issues tracked as blockers for V2 going into general availability.

As for the issue you reported about the lack of scale-out, that would require more investigation. You'd need to fill out the details in the issue template (invocation ID, region, etc.).

@markusfoss

markusfoss commented Aug 21, 2018 via email

@NSTA1

NSTA1 commented Jan 21, 2019

As V2 is now in GA, can I treat this issue as resolved? (I'm contemplating upgrading a V1 Event Hub consumer application to V2...)

@paulbatum
Member

So a few different things were discussed in this issue:

  1. MaxConcurrentCalls was not working as expected ("ServiceBus Config Options have changed", #2828) - this is now fixed.
  2. The platform has a feature for controlling how many instances your function app runs on, but it does not work in some edge cases, so it is still a preview feature ("Set a max degree of parallelism", #1207).

At this point, I agree that this issue can be closed because everything it touched on is covered by another issue.

ghost locked as resolved and limited conversation to collaborators Jan 1, 2020