
[MLA-1724] Reduce use of IEnumerable during inference #4887

Merged
merged 4 commits into master from MLA-1724-reduce-IEnumerable on Jan 27, 2021

Conversation

chriselion (Contributor) commented on Jan 27, 2021

Proposed change(s)

This makes several changes to decrease the amount of garbage generated during inference. Most of the reduction comes from changing IEnumerables to ILists and iterating over them by index (a minimal sketch of that pattern follows the list), but there are a few other changes:

  • Reuse a Dictionary and List in ModelRunner
  • Rearrange the ProfileMarkers in ModelRunner to not do any string manipulation
  • Add an overload to VectorSensor that uses an IList (we don't actually use either the old or new form in any examples)
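
A minimal sketch of the IEnumerable-to-IList pattern, using a hypothetical helper rather than the actual ML-Agents code: a foreach over an interface-typed IEnumerable<float> boxes the collection's enumerator and allocates on every call, while indexing an IList<float> allocates nothing.

```csharp
using System.Collections.Generic;

static class IterationSketch
{
    // foreach over an interface-typed sequence calls GetEnumerator() through
    // IEnumerable<float>, which returns a heap-allocated (boxed) enumerator
    // for List<float> and float[] alike.
    public static float SumEnumerable(IEnumerable<float> data)
    {
        var sum = 0f;
        foreach (var f in data)
        {
            sum += f;
        }
        return sum;
    }

    // Indexing an IList<float> needs no enumerator at all, so repeated calls
    // generate no garbage.
    public static float SumList(IList<float> data)
    {
        var sum = 0f;
        for (var i = 0; i < data.Count; i++)
        {
            sum += data[i];
        }
        return sum;
    }
}
```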

For a test scene (3DBall with decisions requested every step), this reduces the GC allocation in ModelRunner.DecideAction for a typical frame from 102.2 KB to 74.4 KB. The vast majority of the remainder comes from Barracuda (and will be addressed by them separately).

Before: [profiler screenshot of per-frame GC allocation before the change]

After: [profiler screenshot of per-frame GC allocation after the change]

Note that tensor generation and application (for continuous actions) now generate no garbage.

Useful links (GitHub issues, JIRA tickets, ML-Agents forum threads, etc.)

https://jira.unity3d.com/browse/MLA-1724
#4883

Types of change(s)

  • Bug fix

Checklist

Other comments

/// <summary>
/// Adds a list or array of float observations to the vector observations of the agent.
/// </summary>
/// <param name="observation">Observation.</param>
public void AddObservation(IList<float> observation)
Contributor Author

What are the overload resolution rules in C# for this? If the caller passes a List<float> or float[], how do I know whether the IList or the IEnumerable overload will be chosen? I can give this a different name to make it less ambiguous...
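
For reference, a self-contained sketch of how C# resolves this (hypothetical Add methods, not the sensor API): when both overloads are applicable, a List<float> or float[] argument binds to the IList<float> overload, because IList<float> converts implicitly to IEnumerable<float> but not the other way around, making it the more specific parameter type. Only sequences that are not ILists, such as iterator results, fall back to the IEnumerable<float> overload.

```csharp
using System;
using System.Collections.Generic;

static class OverloadResolutionSketch
{
    static void Add(IEnumerable<float> data) => Console.WriteLine("IEnumerable<float> overload");
    static void Add(IList<float> data) => Console.WriteLine("IList<float> overload");

    static void Main()
    {
        Add(new List<float> { 1f, 2f }); // IList<float> overload
        Add(new[] { 1f, 2f });           // IList<float> overload (arrays implement IList<T>)
        Add(Generate());                 // IEnumerable<float> overload
    }

    // An iterator result is only IEnumerable<float>, so it can't bind to the IList overload.
    static IEnumerable<float> Generate()
    {
        yield return 1f;
        yield return 2f;
    }
}
```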

@@ -144,6 +145,27 @@ public void AddRange(IEnumerable<float> data, int writeOffset = 0)
}
}

public void AddList(IList<float> data, int writeOffset = 0)
Contributor Author

Same comment about overload resolution, but I made this one unambiguous (and marked the old version obsolete).
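
Roughly, the resulting API shape looks like this (signatures paraphrased from the diff; the class name, buffer field, and method bodies are illustrative only):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical stand-in for the tensor-writing type touched by the diff.
public class TensorDataSketch
{
    readonly float[] m_Buffer = new float[128];

    // Old entry point: kept for compatibility, but callers are steered away.
    [Obsolete("Use AddList() for better performance")]
    public void AddRange(IEnumerable<float> data, int writeOffset = 0)
    {
        var index = writeOffset;
        foreach (var f in data)      // enumerator allocation on every call
        {
            m_Buffer[index++] = f;
        }
    }

    // New, unambiguously named entry point: indexing avoids the enumerator.
    public void AddList(IList<float> data, int writeOffset = 0)
    {
        for (var i = 0; i < data.Count; i++)
        {
            m_Buffer[writeOffset + i] = data[i];
        }
    }
}
```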

@@ -122,6 +122,7 @@ internal void SetTarget(TensorProxy tensorProxy, int batchIndex, int channelOffs
/// </summary>
/// <param name="data"></param>
/// <param name="writeOffset">Optional write offset.</param>
[Obsolete("Use AddList() for better performance")]
Contributor Author

This is in the public API so we can't remove it, but we can strongly steer users to AddList instead.


Profiler.BeginSample($"MLAgents.{m_Model.name}.GenerateTensors");
Contributor Author

These were allocating due to the string interpolation, and m_Model.name also allocates, so it's cached once at the start.
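
A sketch of that fix under assumed names (the field, constructor parameter, and class here are illustrative, not the actual ModelRunner): build the sample strings once, then hand the cached strings to Profiler.BeginSample so per-frame calls allocate nothing.

```csharp
using UnityEngine.Profiling;

public class ProfilerNameCacheSketch
{
    // The interpolated string and the model-name lookup are paid once here,
    // instead of on every inference call.
    readonly string m_GenerateTensorsSampleName;

    public ProfilerNameCacheSketch(string modelName)
    {
        m_GenerateTensorsSampleName = $"MLAgents.{modelName}.GenerateTensors";
    }

    public void GenerateTensors()
    {
        Profiler.BeginSample(m_GenerateTensorsSampleName); // no per-frame allocation
        // ... tensor generation work ...
        Profiler.EndSample();
    }
}
```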

{
var inputs = new Dictionary<string, Tensor>();
foreach (var inp in infInputs)
m_InputsByName.Clear();
Contributor Author

Changed this and FetchBarracudaOutputs to reuse their results between frames.
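
A sketch of the reuse pattern with hypothetical names (the real code maps input names to Barracuda Tensors): keep the Dictionary as a field, then Clear() and refill it each call rather than constructing a new one per frame.

```csharp
using System.Collections.Generic;

// Illustrative only: reusing one dictionary across inference calls instead
// of allocating a new Dictionary<string, T> every frame.
public class InputReuseSketch<T>
{
    // Allocated once for the lifetime of the runner.
    readonly Dictionary<string, T> m_InputsByName = new Dictionary<string, T>();

    public Dictionary<string, T> PrepareInputs(IList<KeyValuePair<string, T>> inputs)
    {
        // Clear() keeps the dictionary's internal capacity, so refilling it
        // with a similar number of entries allocates nothing.
        m_InputsByName.Clear();
        for (var i = 0; i < inputs.Count; i++)
        {
            m_InputsByName[inputs[i].Key] = inputs[i].Value;
        }
        return m_InputsByName;
    }
}
```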

chriselion changed the title from [MLA-1724] Reduce use of IEnumerable during infernce to [MLA-1724] Reduce use of IEnumerable during inference on Jan 27, 2021
chriselion merged commit 4cb9384 into master on Jan 27, 2021
delete-merged-branch bot deleted the MLA-1724-reduce-IEnumerable branch on Jan 27, 2021