
Conversation

@jameswex (Contributor) commented Nov 9, 2018

Implementation for issue #1593.

  • Changed the PD plots controls UI to have separate buttons for the global PD plot vs. the individual-datapoint PD plot
  • example_index is set to -1 for global plots; when the Python code sees this, it runs PD plot inference on all examples and averages the results to produce the line/bar chart values to plot (see the sketch after this list)
  • For global PD plots, don't show a point on the line for the selected example, since a selected example's value is unrelated to the global PD plot values
  • Updated all demos to correctly calculate global PD plots in JavaScript
  • Updated the image demo to not offer any features for PD plots, as they don't make sense for image inputs
  • Added a message in the PD plots display when a model has no input features to show PD plots for (as in the image demo)
  • Updated demo code to correctly dispose of tensors after inference for better memory management (necessary for global PD plot generation in JS in the demos)
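
As a rough illustration of the averaging idea, a minimal JavaScript sketch (the runInference callback and the plain-object example/mutant representation are hypothetical stand-ins, not the actual WIT or demo code):

// Minimal sketch of global PD averaging; runInference and the plain-object
// example format are hypothetical stand-ins.
async function globalPdValues(examples, featureName, featureValues, runInference) {
  // Accumulate the score for each candidate feature value across all examples.
  const sums = new Array(featureValues.length).fill(0);
  for (const example of examples) {
    // One mutant per candidate feature value, copied from this example.
    const mutants = featureValues.map(
        value => ({...example, [featureName]: value}));
    const scores = await runInference(mutants);
    scores.forEach((score, i) => { sums[i] += score; });
  }
  // Each global PD plot point is the mean score over all examples.
  return sums.map(sum => sum / examples.length);
}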

jameswex and others added 30 commits March 19, 2018 19:38
@jameswex (Contributor Author) commented Nov 9, 2018

@tolga-b please review

@tolga-b (Contributor) left a comment

Something seems to be broken. In the UCI census demo, when I click on Global PDP, the browser locks up for a while and then throws:
WebGL: CONTEXT_LOST_WEBGL: loseContext: context lost    demo.html:5303
Uncaught (in promise) Error: Failed to compile fragment shader.
    at createFragmentShader (demo.html:5303)
    at e.createProgram (demo.html:5303)
    at compileProgram (demo.html:5303)
    at demo.html:5303
    at e.getAndSaveBinary (demo.html:5303)
    at e.compileAndRun (demo.html:5303)
    at e.concat (demo.html:5303)
    at ENV.engine.runKernel.a (demo.html:5303)
    at e.runKernel (demo.html:5303)
    at concat2Tensors (demo.html:5303)

If this is about the number of mutants we have to spawn for all examples at once, I wonder if we can somehow do it in the background and make the "Global" button available once all samples are returned, instead of locking up the GUI.

@tolga-b (Contributor) left a comment

some more comments

<div class="main-vertical">
<div class="main-bottom-bar">
<div class="datapoint-left-controls-holder">
<div class="button-prefix-label">Partial Depdependence Plots:</div>
Contributor

Partial Depdependence -> Partial Dependence

Contributor Author

done

const inferenceAbsErrorStr = ' Inference absolute error';
const inferenceSquaredErrorStr = ' Inference squared error';
const inferenceScoreStr = ' Inference score ';
const inferenceScoreStr = ' Inference score';
Contributor

add back the space

Contributor Author

done

# allow for computation of global PD plots when not all examples have
# the same number of feature values for a feature.
mutant_examples.append(copied_example)
pass
Contributor

is this pass necessary?

Contributor Author

removed it

mutant_features, inference_result_proto.result.classifications):
for idx, classification in enumerate(
inference_result_proto.result.classifications):
mutant_feature = mutant_features[idx % len(mutant_features)]
Contributor

Could you explain the modulo here?

Contributor Author

Added explanatory comment

Contributor

don't see it

Contributor Author

ok now really done :)

@jameswex (Contributor Author)

Thanks @tolga-b. Fixed the perf issues with global PD plots in the demos by batching the tf.js requests and constructing smaller tensors. Also made the generation async so the UI isn't frozen and the spinner appears for the PD plots while they're generating, just as it already does in the non-demo version.
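
A rough sketch of that batching pattern, assuming a tf.js LayersModel and a hypothetical convertExToTensor helper (the batch size and the [n, 105] shape are illustrative, not the actual demo code):

import * as tf from '@tensorflow/tfjs';

// Rough sketch of batched inference with prompt tensor disposal; the
// convertExToTensor helper and tensor shapes are illustrative assumptions.
async function batchedPredict(model, examples, convertExToTensor, batchSize = 128) {
  const results = [];
  for (let i = 0; i < examples.length; i += batchSize) {
    const batch = examples.slice(i, i + batchSize);
    const tensorArr = batch.map(convertExToTensor);
    const input = tf.concat(tensorArr);   // one smaller per-batch tensor
    tensorArr.forEach(t => t.dispose());  // free the per-example tensors
    const res = model.predict(input, {batchSize});
    results.push(...await res.data());    // pull the values out
    input.dispose();                      // free intermediates right away
    res.dispose();
  }
  return results;
}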

series[key] = {}
if not mutant_feature.mutant_value in series[key]:
series[key][mutant_feature.mutant_value] = []
series[key][mutant_feature.mutant_value].append(classification_class.score)
Contributor

this line seems to be over the width limit

Contributor Author

fixed

mutant_features, inference_result_proto.result.classifications):
for idx, classification in enumerate(
inference_result_proto.result.classifications):
mutant_feature = mutant_features[idx % len(mutant_features)]
Contributor

don't see it

@tolga-b (Contributor) left a comment

Looks good to me.

@jameswex jameswex requested a review from stephanwlee November 21, 2018 14:01
@jameswex (Contributor Author)

@stephanwlee tolga has approved the changes, can you take a look so I can merge? Thanks!

ex.features.feature[e.detail.feature_name].bytesList.value[0] =
btoa(this.$.dash.partialDepPlotEligibleFeatures[
featureMapping[e.detail.feature_name]].samples[i]);
const method = async () => {
Contributor

I don't think this method has to be async since it seems to be just a callback to setTimeout. Just curious, what is the intention?

Contributor Author

It is async because it uses await inside, and it's all wrapped in a setTimeout so that the UI doesn't freeze and the display correctly shows a spinning loader while the chart is generated, as it already does when not in demo mode.
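
For reference, a minimal sketch of that pattern; generatePdPlotData and renderCharts are hypothetical stand-ins for the real chart-generation code:

// Defer the heavy work with setTimeout so the already-visible spinner can
// render a frame first; the callback is async because it awaits the
// (hypothetical) chart generation.
function refreshPdPlots(generatePdPlotData, renderCharts) {
  setTimeout(async () => {
    const chartData = await generatePdPlotData();
    renderCharts(chartData);
  }, 0);
}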

const input = tensors.reshape([indices.length, 105])
const res = this.model.predict(input, {batchSize: 128});
const predValues = await res.data();
tensors.dispose();
Contributor

tensors is null when indices.length is 0.

Contributor Author

added if check around this
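
A sketch of how such a guard might look, reusing indices, tensors, and this.model from the snippet above (this sits inside the existing async method; the actual fix may differ):

// Skip inference entirely when there is nothing to run it on.
let predValues = [];
if (indices.length > 0) {
  const input = tensors.reshape([indices.length, 105]);
  const res = this.model.predict(input, {batchSize: 128});
  predValues = await res.data();
  input.dispose();
  res.dispose();
  tensors.dispose();
}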

inferences.indices = indices;
inferences.results = [{regressionResult: {regressions: []}}];
const tensors = [];
let tensors = null;
Contributor

Correct me if I am wrong but it seems easier to parse and read to create an array and use tf.concat like this:

const tensorArr = indices.map(idx => this.convertExToTensor(this.data[idx]));
const tensors = tf.concat(tensorArr);
// Does concatting create a new tensor? Is it okay to dispose it here?
tensorArr.forEach(tensor => tensor.dispose());
const input = tensors.reshape([indices.length, 105]);

Contributor Author

good suggestion. changing.

}
}
const input = tf.concat(tensors).reshape([tensors.length, 105])
const input = tensors.reshape([indices.length, 105])
Contributor

nit: add semi-colon at the end for consistency

Contributor Author

done here and in iris demo as well

for (let idx = i;
idx < Math.min(i + BATCH_SIZE, examples.length);
idx++) {
const tensor =
Contributor

this scoped variable is not being used.

Contributor Author

fixed in all demos

idx < Math.min(i + BATCH_SIZE, examples.length);
idx++) {
const tensor =
tlist.push(this.convertExToTensor(examples[idx]));
Contributor

idea: it would be an awesome addition if we could add this.convertExamplesToTensors(iterator), where the iterator returns an example, to avoid some of these repetitions.

Contributor Author

I have a plan to consolidate the logic in the demos to reduce repeated code, as it is getting to be a problem. Was planning on saving it for a clean-up PR.
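
For what it's worth, a hedged sketch of the kind of shared helper suggested above, using tf.tidy to clean up intermediates; convertExToTensor and the [n, 105] shape are assumptions borrowed from the demo snippets, not an actual WIT API:

import * as tf from '@tensorflow/tfjs';

// Hypothetical shared helper: take any iterable of examples, stack them into
// one input tensor, and let tf.tidy dispose the per-example tensors and the
// intermediate concat result.
function convertExamplesToTensors(examples, convertExToTensor) {
  return tf.tidy(() => {
    const perExample = [];
    for (const example of examples) {
      perExample.push(convertExToTensor(example));
    }
    return tf.concat(perExample).reshape([perExample.length, 105]);
  });
}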


.pd-no-features-text {
font-size: 18px;
color: #777;
Contributor

stylistic nit: css property names should be ordered alphabetically a-z here and below

Contributor Author

do you have a tool you use to auto-sort your CSS rules inside of polymer elements?

},
// True if showing global PD plots. False if showing individual PD
// plots.
globalPdPlots: Boolean,
Contributor

should this be a readonly property?

Contributor Author

done

<div class="button-prefix-label">Partial Dependence Plots:</div>
<div>
<paper-button class="control-button" disabled$="[[!areExamplesEditable_(modelName, inferenceAddress)]]"
on-click="showGlobalPartialDependencePlots_" alt="show global partial dependence plots">
Contributor

consider using on-tap which works also with touch devices :)

Contributor Author

didn't realize this. did a global search and replace

Args:
example_proto: The example to mutate.
example_proto: The examples to mutate.
Contributor

example_proto -> example_protos

Contributor Author

done

@jameswex jameswex merged commit 100a2a7 into tensorflow:master Nov 21, 2018
@jameswex jameswex deleted the globalpd branch November 21, 2018 20:10