Add set_data_types_from_keras_model() and set_accum_from_keras_model() for automatic precision inference #321
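
For intuition, the "automatic precision inference" in the title amounts to choosing fixed-point types (ap_fixed<W,I>) wide enough for each layer's weights and activations. The toy sketch below illustrates the general idea only; it is not this PR's find_optimal_a_b(), which per the commit log also handles non-standard weight suffixes, QKeras-inferred types, and a max_bits cap (default 15 as of commit f5c6649):

import numpy as np

def infer_fixed_type(values, max_bits=15):
    # Hypothetical toy helper, not the PR's find_optimal_a_b(): pick
    # ap_fixed<W, I> so the integer part (I, sign included) covers the
    # largest magnitude and the remaining bits go to the fraction.
    amax = float(np.max(np.abs(values)))
    magnitude_bits = int(np.floor(np.log2(amax))) + 1 if amax >= 1 else 0
    int_bits = magnitude_bits + 1          # extra sign bit (cf. commit ab9c8b5)
    frac_bits = max(0, max_bits - int_bits)
    return int_bits + frac_bits, int_bits  # (W, I)

print(infer_fixed_type(np.array([-0.75, 0.5, 1.25])))  # -> (15, 2)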

Open: wants to merge 50 commits into base: main. Changes shown from 26 commits.

Commits (50):
14869c6 Add set_data_types_from_keras_model() to config.py (Apr 9, 2021)
f6862b6 Implement find_optimal_a_b() in set_data_types_from_keras_model() (Apr 9, 2021)
41ce599 Add non-standard suffix processing to set_data_types_from_keras_model() (Apr 9, 2021)
ab9c8b5 Make find_optimal_a_b() account for an extra sign bit (Apr 9, 2021)
4d09be5 Make set_data_types_from_keras_model() not change types inferred from… (Apr 9, 2021)
842a6f5 Refactor args and add docstring to set_data_types_from_keras_model() (Apr 9, 2021)
f792fd6 Add a_final and b_final null checks to find_optimal_a_b() (Apr 15, 2021)
33da4a3 Add extra suffix check to the weight loop in set_data_type_from_keras… (Apr 15, 2021)
0ec6d42 Add info about not accounting for optimizations to set_data_types_fro… (Apr 15, 2021)
f84aaf5 Remove "set_data_types_from_keras_model(config, model, max_bits=10)" (Apr 15, 2021)
167d9b7 Base weights_keras()'s weight and suffix processing on a map (Apr 17, 2021)
6cf4e2b Update suffix_map in set_data_type_from_keras_model() (Apr 17, 2021)
e4af098 Add extra dict key checks to activation processing in set_data_types_… (Apr 17, 2021)
2e941d5 Add *_linear layer processing to set_data_types_from_keras_model() (Apr 17, 2021)
da3691e Add "except ..." guards around matplotlib, pandas, seaborn imports in… (Apr 20, 2021)
931384f Add best_type_algorithm parameter to set_data_types_from_keras_model() (Apr 19, 2021)
3d867fb Make best_type_algorithm differentiate between layer types (Apr 19, 2021)
e168ed1 set_data_types_from_keras_model(): Ignore layers not present in config (Apr 21, 2021)
5139f33 Add set_data_types_from_keras_model to utils/__init__.py imports (Apr 21, 2021)
1ffec18 Add 'accum' to 'Precision' in config_from_keras_model() (Apr 23, 2021)
0eebc2b Refactor set_data_types_from_keras_model() and make it also set 'accum' (Apr 23, 2021)
79aa528 Add set_accum_from_keras_model() (Apr 23, 2021)
9b1da01 Make data type search error message in set_data_types...() more clear (Apr 23, 2021)
4881218 Add docstring to set_accum_from_keras_model() (Apr 23, 2021)
24d1d76 Add set_accum_from_keras_model import to utils/__init__.py (Apr 23, 2021)
c6cbef6 Merge remote-tracking branch 'upstream/master' into auto-type-setup (Apr 26, 2021)
3138ade Add fixes and support of Conv-type layer and non-fixed precision type (Apr 26, 2021)
4ea529f Fix going to previous layer for obtaining input precision in set_accu… (Apr 26, 2021)
55603d9 Add FixedPrecisionType and IntegerPrecisionType imports (Apr 26, 2021)
f43ad94 Add AveragePooling1D/2D support to set_accum_from_keras_model() (Apr 26, 2021)
7af9d00 Update accum_t profiling "ignoring layer" message (Apr 26, 2021)
f4df75b Add 'result' to 'Precision' in QKeras layer config generation (Apr 28, 2021)
3137179 Make set_data_types_from_...() set input precision as well (Apr 28, 2021)
0cf9a37 Add auto data type inference args to config_from_keras_model() (Apr 28, 2021)
5d6cf7e Fix typo in docstring for config_from_keras_model() (Apr 28, 2021)
83100dd Add 'auto_accum_only' as data_type_mode to config_from_ker...() (Apr 28, 2021)
44bca78 Add accum inference clarification to doc for set_data_...() (Apr 28, 2021)
b5d0528 Change the way QKeras-inferred data types are flagged (Apr 29, 2021)
66038c5 Add change_flagged_types arg to set_data_types_from_...() (Apr 29, 2021)
70b7afa Add 'result' to 'Precision' for norm_layers (Apr 29, 2021)
2a4b249 Merge remote-tracking branch 'origin/master' into auto-type-setup (May 17, 2021)
853165f Merge remote-tracking branch 'upstream/master' into auto-type-setup (May 17, 2021)
054bf77 Improve input layer detection in set_accum_from_keras_model() (May 19, 2021)
2115d9c Add 'LayerType' key existence check to line 134 in config.py (May 19, 2021)
c241f3e Make config_from_keras_model() check for 'name' granularity when data… (Jun 4, 2021)
f7f84e1 Add explicit input layer presence check to set_data_types_... (Jun 7, 2021)
9adab45 Add extra implicit activation layer checks to set_data_types_... (Jun 7, 2021)
f5c6649 set_data_types_from_keras_model(): Make max_bits 15 by default and ad… (Jun 30, 2021)
e269d9b Replace set_data_types args with **set_data_types_kwargs in config_fr… (Jun 30, 2021)
15a1497 Add mitigation suggestions to error messages in set_data_types_from_k… (Jun 30, 2021)
115 changes: 86 additions & 29 deletions hls4ml/model/profiling.py
@@ -1,16 +1,30 @@
from hls4ml.model.hls_model import HLSModel
from hls4ml.model.hls_layers import IntegerPrecisionType, FixedPrecisionType
import matplotlib.pyplot as plt
import numpy as np
import pandas
import seaborn as sb
import uuid
import os
import shutil
from collections import defaultdict

from hls4ml.model.hls_model import HLSModel

try:
    import matplotlib.pyplot as plt
    __plt_present__ = True
except ImportError:
    __plt_present__ = False

try:
    import pandas
    __pandas_present__ = True
except ImportError:
    __pandas_present__ = False

try:
    import seaborn as sb
    __seaborn_present__ = True
except ImportError:
    __seaborn_present__ = False

try:
    from tensorflow import keras
    import qkeras
@@ -25,6 +39,24 @@
    __torch_profiling_enabled__ = False


def _check_plt():
    if not __plt_present__:
        raise RuntimeError("matplotlib could not be imported. Have you installed it by installing hls4ml[profiling] "
                           "or otherwise?")


def _check_pandas():
    if not __pandas_present__:
        raise RuntimeError("pandas could not be imported. Have you installed it by installing hls4ml[profiling] "
                           "or otherwise?")


def _check_seaborn():
    if not __seaborn_present__:
        raise RuntimeError("seaborn could not be imported. Have you installed it by installing hls4ml[profiling] "
                           "or otherwise?")


def get_unoptimized_hlsmodel(model):
    from hls4ml.converters import convert_from_config

@@ -61,6 +93,9 @@ def array_to_summary(x, fmt='boxplot'):
    return y

def boxplot(data, fmt='longform'):
    _check_plt()
    _check_seaborn()

    if fmt == 'longform':
        f = plt.figure() #figsize=(3, 3))
        hue = 'layer' if 'layer' in data.keys() else None
@@ -93,6 +128,9 @@ def boxplot(data, fmt='longform'):
    return None

def histogram(data, fmt='longform'):
    _check_plt()
    _check_seaborn()

    f = plt.figure()
    from matplotlib.ticker import MaxNLocator
    n = len(data) if fmt == 'summary' else len(data['weight'].unique())
@@ -115,6 +153,8 @@ def histogram(data, fmt='longform'):
         'histogram' : histogram}

def types_boxplot(data, fmt='longform'):
    _check_plt()

    from matplotlib.patches import PathPatch
    from matplotlib.patches import Rectangle
    ax = plt.gca()
@@ -140,6 +180,9 @@ def types_boxplot(data, fmt='longform'):
        ax.add_patch(rectangle)

def types_histogram(data, fmt='longform'):
    _check_plt()
    _check_seaborn()

    ax = plt.gca()
    layers = np.array(ax.get_legend_handles_labels()[1])
    colors = sb.color_palette("husl", len(layers))
@@ -160,6 +203,8 @@ def ap_fixed_WIF(dtype):
    return W, I, F

def types_hlsmodel(model):
    _check_pandas()

    suffix = ['w', 'b']
    data = {'layer' : [], 'low' : [], 'high' : []}
    # Plot the default precision
@@ -183,6 +228,8 @@ def types_hlsmodel(model):
    return data

def activation_types_hlsmodel(model):
    _check_pandas()

    data = {'layer' : [], 'low' : [], 'high' : []}
    # Get the default precision
    default_precision = model.config.model_precision['default']
@@ -225,10 +272,35 @@ def weights_hlsmodel(model, fmt='longform', plot='boxplot'):
            data[-1]['weight'] = l

    if fmt == 'longform':
        _check_pandas()
        data = pandas.DataFrame(data)
    return data


def activations_hlsmodel(model, X, fmt='summary', plot='boxplot'):
    if fmt == 'longform':
        raise NotImplementedError
    elif fmt == 'summary':
        data = []

    _, trace = model.trace(np.ascontiguousarray(X))

    if len(trace) == 0:
        raise RuntimeError("HLSModel must have tracing on for at least 1 layer (this can be set in its config)")

    for layer in trace.keys():
        print(" {}".format(layer))

        if fmt == 'summary':
            y = trace[layer].flatten()
            y = abs(y[y != 0])

            data.append(array_to_summary(y, fmt=plot))
            data[-1]['weight'] = layer

    return data


def _keras_batchnorm(layer):
    weights = layer.get_weights()
    epsilon = layer.epsilon
@@ -255,30 +327,6 @@ def _keras_layer(layer):
    })


def activations_hlsmodel(model, X, fmt='summary', plot='boxplot'):
    if fmt == 'longform':
        raise NotImplementedError
    elif fmt == 'summary':
        data = []

    _, trace = model.trace(np.ascontiguousarray(X))

    if len(trace) == 0:
        raise RuntimeError("HLSModel must have tracing on for at least 1 layer (this can be set in its config)")

    for layer in trace.keys():
        print(" {}".format(layer))

        if fmt == 'summary':
            y = trace[layer].flatten()
            y = abs(y[y != 0])

            data.append(array_to_summary(y, fmt=plot))
            data[-1]['weight'] = layer

    return data


def weights_keras(model, fmt='longform', plot='boxplot'):
    if fmt == 'longform':
        data = {'x' : [], 'layer' : [], 'weight' : []}
@@ -305,6 +353,7 @@ def weights_keras(model, fmt='longform', plot='boxplot'):
            data[-1]['weight'] = l

    if fmt == 'longform':
        _check_pandas()
        data = pandas.DataFrame(data)
    return data

@@ -332,6 +381,7 @@ def activations_keras(model, X, fmt='longform', plot='boxplot'):
            data[-1]['weight'] = layer.name

    if fmt == 'longform':
        _check_pandas()
        data = pandas.DataFrame(data)
    return data

@@ -364,6 +414,7 @@ def weights_torch(model, fmt='longform', plot='boxplot'):
            data[-1]['weight'] = l

    if fmt == 'longform':
        _check_pandas()
        data = pandas.DataFrame(data)
    return data

@@ -392,6 +443,7 @@ def activations_torch(model, X, fmt='longform', plot='boxplot'):
            data[-1]['weight'] = lname

    if fmt == 'longform':
        _check_pandas()
        data = pandas.DataFrame(data)
    return data

@@ -423,6 +475,7 @@ def numerical(model=None, hls_model=None, X=None, plot='boxplot'):
    respectively. (Optimizations are applied to an HLSModel by hls4ml;
    a post-optimization HLSModel is the final model.)
    """
    _check_plt()
    wp, wph, ap, aph = None, None, None, None

    hls_model_present = hls_model is not None and isinstance(hls_model, HLSModel)
@@ -579,6 +632,8 @@ def get_ymodel_keras(keras_model, X):

def _norm_diff(ymodel, ysim):
    """Calculate the square root of the sum of the squares of the differences"""
    _check_plt()

    diff = {}

    for key in list(ysim.keys()):
@@ -601,6 +656,7 @@ def _dist_diff(ymodel, ysim):
    meaning "very different".
    If difference < original value then the normalized difference would be difference/original.
    """
    _check_plt()

    diff = {}

@@ -662,7 +718,8 @@ def compare(keras_model, hls_model, X, plot_type = "dist_diff"):
    matplotlib figure
        plot object of the histogram depicting the difference in each layer's output
    """
    _check_plt()

    # Take in output from both models
    # Note that each y is a dictionary with structure {"layer_name": flattened output array}
    ymodel = get_ymodel_keras(keras_model, X)
2 changes: 1 addition & 1 deletion hls4ml/utils/__init__.py
@@ -1,5 +1,5 @@
from __future__ import absolute_import

from hls4ml.utils.config import config_from_keras_model
from hls4ml.utils.config import config_from_keras_model, set_data_types_from_keras_model, set_accum_from_keras_model
from hls4ml.utils.plot import plot_model
from hls4ml.utils.example_models import fetch_example_model, fetch_example_list
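
After this change, both helpers are importable from the package's utils namespace alongside config_from_keras_model, e.g.:

from hls4ml.utils import set_data_types_from_keras_model, set_accum_from_keras_model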