Issue591 add semantic tags #1

Merged: 81 commits, May 20, 2024
Changes from all commits

Commits (81)
61faa5d
Merge pull request #24 from ibpsa/master
EttoreZ Mar 12, 2023
b534e0d
Change scenario tests to 2 days and update ref results
dhblum Nov 3, 2023
5e6871c
Run scenario tests using 1 day steps
dhblum Nov 3, 2023
0b380aa
Merge pull request #25 from ibpsa/master
EttoreZ Dec 6, 2023
27cc738
Update JModelica Docker address in testing
EttoreZ Dec 7, 2023
035a974
Merge pull request #592 from EttoreZ/issue590_updateJModelicaImage
EttoreZ Dec 7, 2023
be123ce
Merge branch 'master' into issue239_extendForecast
EttoreZ Dec 7, 2023
345a67f
update the env for running the js examples
Dec 11, 2023
b879536
remove temp file
Dec 11, 2023
e9cc8b9
Merge pull request #595 from SenHuang19/issue594_javascript_update
dhblum Dec 11, 2023
f384f8a
update release note
Dec 12, 2023
c2dc112
typo fix
Dec 12, 2023
510687d
Merge pull request #597 from SenHuang19/issue594_javascript_update
dhblum Dec 12, 2023
fc742c9
Merge pull request #596 from ibpsa/issue594_javascript_update
dhblum Dec 12, 2023
9ede925
Merge branch 'master' into issue590_UpdateJModelicaImage
dhblum Dec 12, 2023
2dc3472
Correct releasenotes.md placement from #597
dhblum Dec 12, 2023
2299ea6
Extend data_manager get_data reading beyond one year
EttoreZ Dec 12, 2023
df18e24
Merge pull request #26 from ibpsa/master
EttoreZ Dec 12, 2023
dff59a4
Added unit tests for simulation and forecast across the year
EttoreZ Dec 13, 2023
34cf952
Merge pull request #598 from EttoreZ/issue239_extendForecast
EttoreZ Dec 13, 2023
0dc0a8b
Updated release notes
EttoreZ Dec 13, 2023
759fe5d
Merge pull request #600 from EttoreZ/issue239_extendForecast
EttoreZ Dec 13, 2023
ab8396b
Merge pull request #593 from ibpsa/issue590_UpdateJModelicaImage
dhblum Dec 14, 2023
e583645
Merge pull request #28 from ibpsa/master
EttoreZ Dec 14, 2023
7a46bd2
Merge branch 'master' into issue239_extendForecast
EttoreZ Dec 14, 2023
ef82e60
Merge pull request #602 from EttoreZ/issue239_extendForecast
EttoreZ Dec 14, 2023
87b26dd
Reduce size of reference result files
dhblum Jan 10, 2024
f5710c4
Clean docs
dhblum Jan 10, 2024
90f20de
Updated forecasts documentation
EttoreZ Jan 16, 2024
4646ef6
Update design text and image
dhblum Jan 19, 2024
3583585
Merge pull request #599 from ibpsa/issue239_extendForecast
dhblum Jan 20, 2024
1b05f4b
Pin Dockerfile image to `linux/x86_64`
mattrobmattrob Jan 21, 2024
1332199
Add release notes and contributor entry
mattrobmattrob Jan 22, 2024
3c04cc0
Merge pull request #607 from mattrobmattrob/patch-1
dhblum Jan 23, 2024
34f7fdf
Fix typo
dhblum Jan 25, 2024
9db3b35
Update release notes
dhblum Jan 25, 2024
494bc9f
Update data= to json= in some tests
dhblum Jan 26, 2024
8c733d5
Merge pull request #609 from ibpsa/issue601_typoDesDoc
dhblum Jan 27, 2024
6e6c6a0
Install and initialize Git LFS in `testing/Dockerfile`
mattrobmattrob Feb 3, 2024
77ffad3
Add release notes entry
mattrobmattrob Feb 3, 2024
f3d0124
Update releasenotes.md
dhblum Feb 5, 2024
8382663
Corrected typo in documentation
Feb 6, 2024
f1c71b0
Merge pull request #614 from mattrobmattrob/mr/correct.compilation.gi…
dhblum Feb 6, 2024
5c90c3b
Fixed typo in revision notes and update release notes
Feb 7, 2024
e7bd1cf
Merge branch 'master' into issue605_fixTypoDocMulZonSimOffAir
dhblum Feb 7, 2024
a2a2583
Merge pull request #615 from ibpsa/issue605_fixTypoDocMulZonSimOffAir
dhblum Feb 8, 2024
4b88f5f
Merge branch 'master' into issue576_reduceTestTime
dhblum Feb 9, 2024
8adcbc1
Update releasenotes
dhblum Feb 10, 2024
2b4328c
Merge pull request #584 from ibpsa/issue576_reduceTestTime
dhblum Feb 10, 2024
43a2b62
Remove use of travis-specific make targets for testcases (for #620)
dhblum Feb 14, 2024
fc1be6d
Add check on loaded logs and print log list
dhblum Feb 15, 2024
a93ceb5
Update unit test results for time ratio
dhblum Feb 15, 2024
0ebf621
Update tim_rat kpis for multizone_residential_hydronic
dhblum Feb 16, 2024
928e6fe
Update release notes [ci skip]
dhblum Feb 17, 2024
c7c5fbe
Merge pull request #621 from ibpsa/issue620_travisJobFailure
dhblum Feb 17, 2024
be4913b
Update get_html_IO script to print activate and new total file
EttoreZ Feb 21, 2024
c52c2f9
Update testcases documentation
EttoreZ Feb 21, 2024
cd2c03b
Updated release notes
EttoreZ Feb 21, 2024
e3316d7
Address review comments
EttoreZ Feb 22, 2024
50396f7
Update documentation
EttoreZ Feb 22, 2024
fd39c7a
Update documentation
EttoreZ Feb 22, 2024
8afd1f3
first implementation
HWalnum Feb 27, 2024
5780811
fixed get_results call
HWalnum Feb 27, 2024
ee92c06
fixed results quiery as some outputs are removed from output_names
HWalnum Feb 27, 2024
525a5e9
Update README and release notes
EttoreZ Feb 28, 2024
8b510d3
Edits to readme text
dhblum Mar 5, 2024
cf4a164
Merge pull request #624 from ibpsa/issue555_missingActivateDocumentation
dhblum Mar 6, 2024
3d4099d
created _get_test_results() to avoid duplicate code
HWalnum Mar 8, 2024
f50b7b4
updated releasenotes.md
HWalnum Mar 8, 2024
fd67865
Merge pull request #629 from HWalnum/issue626_storeResults
dhblum Mar 11, 2024
9987e2d
Make time as index for csv
dhblum Mar 11, 2024
74314ea
Use points instead of parameters
dhblum Mar 11, 2024
485301c
Doc formatting and edits
dhblum Mar 11, 2024
ed371c0
Revert back to forecastParameters without other points
dhblum Mar 13, 2024
2d70472
Merge branch 'master' into issue626_storeResults
dhblum Mar 13, 2024
edc0c3c
Update dict in python2,3 compatible way for unit tests to pass
dhblum Mar 13, 2024
4b81091
Update releasenotes.md [ci skip]
dhblum Mar 14, 2024
841c803
Merge pull request #632 from ibpsa/issue626_storeResults
dhblum Mar 14, 2024
a27544b
Add docs, model annotations, ttl file and notebook postprocess
EttoreZ Mar 27, 2024
8ccbfa7
Add final semantic model
EttoreZ Mar 27, 2024
30ca5a2
Update semantic model
EttoreZ Apr 15, 2024
12 changes: 6 additions & 6 deletions .travis.yml
@@ -74,22 +74,22 @@ jobs:
script: cd testing && make build_jm_image && make test_twozone_apartment_hydronic
- python: 3.9
install: pip install --upgrade pip && pip install pandas==1.2.5 numpy==1.20.2 matplotlib==3.3.4 requests==2.25.1
script: cd testing && make build_jm_image && travis_wait 90 make test_multizone_office_simple_air_travis ARG="-s test_peak_heat_day,test_typical_heat_day"
script: cd testing && make build_jm_image && travis_wait 90 make test_multizone_office_simple_air ARG="-s test_peak_heat_day,test_typical_heat_day"
- python: 3.9
install: pip install --upgrade pip && pip install pandas==1.2.5 numpy==1.20.2 matplotlib==3.3.4 requests==2.25.1
script: cd testing && make build_jm_image && travis_wait 90 make test_multizone_office_simple_air_travis ARG="-s test_peak_cool_day,test_typical_cool_day,test_mix_day"
script: cd testing && make build_jm_image && travis_wait 90 make test_multizone_office_simple_air ARG="-s test_peak_cool_day,test_typical_cool_day,test_mix_day"
- python: 3.9
install: pip install --upgrade pip && pip install pandas==1.2.5 numpy==1.20.2 matplotlib==3.3.4 requests==2.25.1
script: cd testing && make build_jm_image && travis_wait 90 make test_multizone_office_simple_air_travis ARG="-s API"
script: cd testing && make build_jm_image && travis_wait 90 make test_multizone_office_simple_air ARG="-s API"
- python: 3.9
install: pip install --upgrade pip && pip install pandas==1.2.5 numpy==1.20.2 matplotlib==3.3.4 requests==2.25.1
script: cd testing && make build_jm_image && travis_wait 90 make test_multizone_residential_hydronic_travis ARG="-s test_peak_heat_day,test_shoulder"
script: cd testing && make build_jm_image && travis_wait 90 make test_multizone_residential_hydronic ARG="-s test_peak_heat_day,test_shoulder"
- python: 3.9
install: pip install --upgrade pip && pip install pandas==1.2.5 numpy==1.20.2 matplotlib==3.3.4 requests==2.25.1
script: cd testing && make build_jm_image && travis_wait 90 make test_multizone_residential_hydronic_travis ARG="-s test_typical_heat_day,test_summer"
script: cd testing && make build_jm_image && travis_wait 90 make test_multizone_residential_hydronic ARG="-s test_typical_heat_day,test_summer"
- python: 3.9
install: pip install --upgrade pip && pip install pandas==1.2.5 numpy==1.20.2 matplotlib==3.3.4 requests==2.25.1
script: cd testing && make build_jm_image && travis_wait 90 make test_multizone_residential_hydronic_travis ARG="-s API"
script: cd testing && make build_jm_image && travis_wait 90 make test_multizone_residential_hydronic ARG="-s API"
- python: 2.7
install: pip install --upgrade pip && pip install pandas==0.24.2 numpy==1.16.6 matplotlib==2.1.1 requests==2.18.4
script: cd testing && make test_python2
2 changes: 1 addition & 1 deletion Dockerfile
@@ -1,4 +1,4 @@
FROM ubuntu:20.04
FROM --platform=linux/x86_64 ubuntu:20.04

# Install required packages
RUN apt-get update && \
2 changes: 1 addition & 1 deletion README.md
@@ -68,7 +68,7 @@ Example RESTful interaction:

| Interaction | Request |
|-----------------------------------------------------------------------|-----------------------------------------------------------|
| Advance simulation with control input and receive measurements. | POST ``advance`` with optional json data "{<input_name>:<value>}" |
| Advance simulation with control input and receive measurements. | POST ``advance`` with optional arguments ``<input_name_u>:<value>``, and corresponding ``<input_name_activate>:<0 or 1>``, where 1 enables value overwrite and 0 disables (0 is default) |
| Initialize simulation to a start time using a warmup period in seconds. Also resets point data history and KPI calculations. | PUT ``initialize`` with required arguments ``start_time=<value>``, ``warmup_period=<value>``|
| Receive communication step in seconds. | GET ``step`` |
| Set communication step in seconds. | PUT ``step`` with required argument ``step=<value>`` |
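As a minimal sketch of the request patterns in the table above (not part of this diff): the base URL, the overwrite point name ``oveTSetHea_u``, and the use of JSON request bodies are assumptions for illustration; actual input names come from GET ``inputs`` on the deployed test case.

```python
# Hedged sketch of the RESTful interaction documented above. The base URL and
# the overwrite point name (oveTSetHea_u / oveTSetHea_activate) are placeholders;
# use the names returned by GET /inputs for the deployed test case.
import requests

url = 'http://127.0.0.1:5000'

# Initialize the simulation to a start time with a warmup period (seconds).
requests.put('{0}/initialize'.format(url),
             json={'start_time': 0, 'warmup_period': 7 * 24 * 3600})

# Set the communication step (seconds), then check it.
requests.put('{0}/step'.format(url), json={'step': 3600})
step = requests.get('{0}/step'.format(url)).json()['payload']

# Advance one step, overwriting one input: *_activate = 1 enables the overwrite
# and *_u supplies the value; omitting both leaves the baseline control active.
y = requests.post('{0}/advance'.format(url),
                  json={'oveTSetHea_u': 294.15,
                        'oveTSetHea_activate': 1}).json()['payload']
```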
1 change: 1 addition & 0 deletions contributors.md
@@ -19,6 +19,7 @@ Thank you to all who have provided guidance on the development of this software.
- Robert Lutes, Pacific Northwest National Laboratory
- Kefei Mo, Pacific Northwest National Laboratory
- Erik Paulson, Independent
- Matt Robinson, University of Colorado - Boulder
- Jermy Thomas, National Renewable Energy Laboratory
- Christian Veje, University of Southern Denmark
- Draguna Vrabie, Pacific Northwest National Laboratory
67 changes: 55 additions & 12 deletions data/data_manager.py
@@ -333,20 +333,36 @@ def get_data(self, horizon=24*3600, interval=None, index=None,
# the closest possible point under stop will be the end
# point in order to keep interval unchanged among index.
index = np.arange(start,stop+0.1,interval).astype(int)
else:
if not isinstance(index, np.ndarray):
index = np.asarray(index)

# Reindex to the desired index
data_slice_reindexed = data_slice.reindex(index)

for key in data_slice_reindexed.keys():
# Use linear interpolation for continuous variables
if key in self.categories['weather']:
f = interpolate.interp1d(self.case.data.index,
self.case.data[key], kind='linear')
# Use forward fill for discrete variables
else:
f = interpolate.interp1d(self.case.data.index,
self.case.data[key], kind='zero')
data_slice_reindexed.loc[:,key] = f(index)
start = index[0]
# 1 year in (s)
year = 31536000
# Starting year
year_start = int(np.floor(start/year))*year
# Normalizing index with respect to starting year
index_norm = index - year_start
stop_norm = index_norm[-1]
# If stop happens across the year divide df and interpolate separately
if stop_norm > data_slice.index[-1]:
idx_year = (np.abs(index_norm - year)).argmin() + 1
# Take previous index value if index at idx_year > year
if index_norm[idx_year - 1] - year > np.finfo(float).eps:
idx_year = idx_year -1
df_slice1 = data_slice.reindex(index_norm[:idx_year])
df_slice1 = self.interpolate_data(df_slice1,index_norm[:idx_year])
df_slice2 = data_slice.reindex(index_norm[idx_year:] - year)
df_slice2 = self.interpolate_data(df_slice2,index_norm[idx_year:] - year)
df_slice2.index = df_slice2.index + year
data_slice_reindexed = pd.concat([df_slice1,df_slice2])
else:
data_slice_reindexed = data_slice.reindex(index_norm)
data_slice_reindexed = self.interpolate_data(data_slice_reindexed,index_norm)
# Add starting year back to index desired by user
data_slice_reindexed.index = data_slice_reindexed.index + year_start

if plot:
if category is None:
@@ -504,6 +520,33 @@ def get_data_metadata(self):

return data_metadata

def interpolate_data(self,df,index):
'''Interpolate testcase data.

Parameters
----------
df: pandas.DataFrame
Dataframe that needs to be interpolated
index: np.array()
Index to use to get interpolated data

Returns
-------
df: pandas.DataFrame
Interpolated dataframe

'''
for key in df.keys():
# Use linear interpolation for continuous variables
if key in self.categories['weather']:
f = interpolate.interp1d(self.case.data.index,
self.case.data[key], kind='linear')
# Use forward fill for discrete variables
else:
f = interpolate.interp1d(self.case.data.index,
self.case.data[key], kind='zero')
df.loc[:,key] = f(index)
return df

if __name__ == "__main__":
import sys
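For orientation only, the year-end index handling added to ``get_data`` above can be reproduced as a standalone numpy sketch (not part of the PR); the numbers are arbitrary:

```python
# Standalone sketch of the year-end index handling introduced in get_data above.
# Pure numpy; mirrors the normalization and split-point logic in the diff.
import numpy as np

year = 31536000                                    # 1 year in seconds
index = np.arange(year - 3600, year + 3600 + 1, 123).astype(int)  # crosses year-end

start = index[0]
year_start = int(np.floor(start / year)) * year    # offset of the starting year
index_norm = index - year_start                    # normalize to the starting year

# Position of the first point that belongs to the next year.
idx_year = (np.abs(index_norm - year)).argmin() + 1
if index_norm[idx_year - 1] - year > np.finfo(float).eps:
    idx_year = idx_year - 1                        # that point is already past year-end

part1 = index_norm[:idx_year]          # interpolated against the first-year data
part2 = index_norm[idx_year:] - year   # wrapped to the start of the data, shifted back later
```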
29 changes: 22 additions & 7 deletions data/get_html_IO.py
@@ -7,10 +7,9 @@
2. Run BOPTEST test case on localhost:5000
3. Run this script

Outputs:
"inputs.txt": html code documenting the inputs
"measurements.txt": html code documenting the outputs

Output:
"inputs_measurements_forecasts.html" html code documenting inputs, outputs and
forecasts together
"""

# GENERAL PACKAGE IMPORT
@@ -40,24 +39,40 @@ def run():

# GET TEST INFORMATION
# --------------------
# Create single I/O file
# Inputs available
inputs = requests.get('{0}/inputs'.format(url)).json()['payload']
with open('inputs.txt', 'w') as f:
with open('inputs_measurements_forecasts.html', 'w') as f:
f.write('<h3>Model IO\'s</h3>\n')
f.write('<h4>Inputs</h4>\n')
f.write('The model inputs are:\n')
f.write('<ul>\n')
for i in sorted(inputs.keys()):
if 'activate' not in i:
f.write('<li>\n<code>{0}</code> [{1}] [min={2}, max={3}]: {4}\n</li>\n'.format(i,inputs[i]['Unit'],inputs[i]['Minimum'], inputs[i]['Maximum'], inputs[i]['Description']))
else:
f.write('<li>\n<code>{0}</code> [1] [min=0, max=1]: Activation signal to overwrite input {1} where 1 activates, 0 deactivates (default value)\n</li>\n'.format(i,i.replace('activate','')+'u'))
f.write('</ul>\n')
# Measurements available
measurements = requests.get('{0}/measurements'.format(url)).json()['payload']
with open('measurements.txt', 'w') as f:
with open('inputs_measurements_forecasts.html', 'a') as f:
f.write('<h4>Outputs</h4>\n')
f.write('The model outputs are:\n')
f.write('<ul>\n')
for i in sorted(measurements.keys()):
if 'activate' not in i:
f.write('<li>\n<code>{0}</code> [{1}] [min={2}, max={3}]: {4}\n</li>\n'.format(i,measurements[i]['Unit'],measurements[i]['Minimum'], measurements[i]['Maximum'], measurements[i]['Description']))
f.write('</ul>\n')
# Forecasts available
forecast_points = requests.get('{0}/forecast_points'.format(url)).json()['payload']
with open('forecast_points.txt', 'w') as f:
with open('inputs_measurements_forecasts.html', 'a') as f:
f.write('<h4>Forecasts</h4>\n')
f.write('The model forecasts are:\n')
f.write('<ul>\n')
for i in sorted(forecast_points.keys()):
if 'activate' not in i:
f.write('<li>\n<code>{0}</code> [{1}]: {2}\n</li>\n'.format(i,forecast_points[i]['Unit'],forecast_points[i]['Description']))
f.write('</ul>\n')
# --------------------

if __name__ == "__main__":
34 changes: 32 additions & 2 deletions docs/design/source/forecasts.rst
100755 → 100644
@@ -3,10 +3,13 @@
Forecast Generation
===================

Forecaster Module
-----------------

A forecaster module is developed to retrieve forecast from a BOPTEST
testcase, which is needed for predictive controllers. This module uses the
data_manager object of a test case to read the deterministic forecast from
the testcase wrapped.fmu. In future developments it will be possible to
:code:`data_manager` object of a test case to read the deterministic forecast from
the testcase :code:`wrapped.fmu`. In future developments it will be possible to
request stochastic forecast with a predefined distribution over the
deterministic forecast for research purposes. This distribution will be
added on the top of the deterministic forecast mentioned before.
@@ -15,3 +18,30 @@ The controller developer can choose the prediction horizon and interval of
the forecast from the actual simulation time. The controller developer may
also filter the forecast for a specific data category or request all data
variables and filter it afterwards.
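A hedged usage sketch of the forecaster module follows; the import path, class name, and :code:`get_forecast` signature are assumptions made for illustration and should be checked against :code:`forecast/forecaster.py` in the repository.

```python
# Hedged sketch: retrieving a deterministic forecast through the forecaster
# module. Module path, class name, and the get_forecast signature are assumed;
# `case` is an already-instantiated test case object.
from forecast.forecaster import Forecaster  # assumed import path

forecaster = Forecaster(case)

# Choose the prediction horizon and interval (seconds) from the current
# simulation time, and optionally filter by data category (e.g. 'weather').
forecast = forecaster.get_forecast(horizon=24 * 3600,
                                   interval=1800,
                                   category='weather')
```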

Getting Weather Forecasts Across Year-End
-----------------------------------------

The data in TMY weather files used in BOPTEST test cases are discontinuous
at the end of the year relative to the start of the year.
Therefore, so is the weather data in the .csv files supplied for weather
forecast generation. As an example, see the relative humidity in the
figure below (orange line). If weather forecasts are requested that cross
year-end, the :code:`data_manager` object used by the forecaster module splits the
data at year-end into one portion that is inclusive of the last data point
at the end of the year (midnight), and one portion after the end of the
year that is not inclusive of the first data point at the start of the
year (midnight). The implementation is done this way so that the forecast
is more consistent for any interval through the full first year if a user
only intends to simulate one year. The relative humidity plot below shows
the interpolation behavior of the implementation graphically for forecast
intervals of 1800s and 123s (intervals used in the unit tests), compared
to the reference weather boundary condition data at 1800s intervals
in the .csv file.

.. figure:: images/relative_humidity_over_year_sim.png
:scale: 50 %

Forecast retrieved across one year for intervals of 1800s (blue) and
123s (yellow) compared to the reference data in the boundary condition
.csv file (orange), shown for relative humidity.
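A hedged sketch of exercising this behavior directly through the :code:`data_manager` follows; the instance :code:`man`, its attachment to a running test case, and the :code:`category` argument are assumptions for illustration.

```python
# Hedged sketch: retrieve weather data across year-end, similar to the requests
# behind the figure above (intervals of 1800 s and 123 s). `man` is an assumed,
# already-constructed data manager attached to a test case (see data/data_manager.py);
# the category argument is assumed from the get_data signature.
import numpy as np

year = 31536000               # 1 year in seconds
start = year - 12 * 3600      # 12 hours before year-end

for interval in (1800, 123):
    index = np.arange(start, start + 24 * 3600 + 1, interval).astype(int)
    weather = man.get_data(index=index, category='weather')
    # The returned frame spans year-end: the first portion is interpolated
    # against the first-year data, the wrapped portion against the start of
    # the data, and both are re-joined on the originally requested index.
```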
17 changes: 15 additions & 2 deletions docs/design/source/testcasedev.rst
@@ -54,7 +54,8 @@ following structure:
| | | |--prices.csv // Energy pricing schedules
| | | |--emissions.csv // Energy emission factor schedules
| | | |--setpoints.csv // Thermal and IAQ comfort region schedules
| |--bacnet.ttl // BACnet object definition file
| |--bacnet.ttl // BACnet object definition file
| |--semantic_model.ttl
|--doc // Documentation directory
| |--doc.html // Copy of .mo file documentation
| |--images // Image directory
@@ -146,7 +147,7 @@ The second function is to export a wrapper FMU that utilizes the signal exchange

4. Add one output for every Read block found named :code:`<block_instance_path>_y`. Assign :code:`<block_instance_path>_y` the unit, descriptions, min/max, and other signal attribute data specified by the Read block.

5. For Overwrite blocks, connect :code:`<block_instance_path>_u` to :code:`<block.instance.path>.u`, :code:`<block_instance_path>_activate` to :code:`<block.instance.path>.activate`, and :code:`<block_instance_path>_y` to :code:`<block.instance.path>.y`.
5. For Overwrite blocks, connect :code:`<block_instance_path>_u` to :code:`<block.instance.path>.uExt.y`, :code:`<block_instance_path>_activate` to :code:`<block.instance.path>.activate.y`, and :code:`<block_instance_path>_y` to :code:`<block.instance.path>.y`.

6. For Read blocks, connect :code:`<block_instance_path>_y` to :code:`<block.instance.path>.y`.

@@ -307,6 +308,18 @@ The file can be created automatically using the script
:code:`bacnet/create_ttl.py` located in the IBPSA GitHub repository.


Semantic model .ttl generation
--------------------------------------------
In order to enable a semantic application to read and write BOPTEST points, a
:code:`semantic_model.ttl` file is available in the testcase :code:`models` directory. This
model can be generated by parsing the semantic annotations present in the testcase model
with :code:`modelica-json` (https://github.com/lbl-srg/modelica-json) and finalizing the .ttl file
with the Python script present in the testcase folder.
The semantic tagging in the model is done on the :code:`read` and :code:`overwrite` blocks,
where the points are assigned classes from the Brick schema (https://brickschema.org/) and the model
equipment is automatically inferred using modelica-json and the custom Python script.
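As a hedged illustration of how a semantic application might consume the generated file (rdflib is not a BOPTEST dependency and is used here only as an example; the query assumes the points are typed with Brick classes):

```python
# Hedged sketch: list the Brick-typed points declared in semantic_model.ttl.
# rdflib is used only for illustration and is not a BOPTEST dependency.
import rdflib

g = rdflib.Graph()
g.parse('models/semantic_model.ttl', format='turtle')

# Select every subject typed with a class from the Brick namespace.
query = """
SELECT ?point ?cls WHERE {
    ?point a ?cls .
    FILTER(STRSTARTS(STR(?cls), "https://brickschema.org/schema/Brick#"))
}
"""
for point, cls in g.query(query):
    print(point, cls)
```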


Data Generation and Collection Module
-------------------------------------

7 changes: 5 additions & 2 deletions examples/javascript/Dockerfile
@@ -4,18 +4,21 @@ ARG testcase

RUN apt-get update && \
apt-get install -y \
python3.7 \
python3-pip \
firefox \
wget

RUN pip3 install splinter urllib3 pandas selenium
RUN python3.7 -m pip install --upgrade pip

RUN python3.7 -m pip install splinter urllib3 pandas selenium

ENV PATH $PATH:/home

WORKDIR /home

RUN cd /home && \
wget https://github.com/mozilla/geckodriver/releases/download/v0.24.0/geckodriver-v0.24.0-linux64.tar.gz && \
wget https://github.com/mozilla/geckodriver/releases/download/v0.33.0/geckodriver-v0.33.0-linux64.tar.gz && \
tar -xvzf geckodriver* && \
rm geckodriver-*

3 changes: 2 additions & 1 deletion examples/javascript/makefile
@@ -16,7 +16,7 @@ remove-image:

run:
$(COMMAND_RUN) && \
docker exec -it ${IMG_NAME} /bin/bash -c "python3 wrapper.py ${Script}" && \
docker exec -it ${IMG_NAME} /bin/bash -c "python3.7 wrapper.py ${Script}" && \
make copy_from_container ARGS=. && \
docker stop ${IMG_NAME}

@@ -25,3 +25,4 @@ copy_from_container:

stop:
docker stop ${Script}

11 changes: 11 additions & 0 deletions releasenotes.md
@@ -7,6 +7,17 @@ Released on xx/xx/xxxx.
**The following changes are backwards-compatible and do not significantly change benchmark results:**

- Add materials for RLEM23 workshop at ``docs/workshops/RlemWorkshop_20231112``. This is for [#585](https://github.com/ibpsa/project1-boptest/issues/585).
- Change JModelica docker container address in ``testing/Dockerfile``. This is for [#590](https://github.com/ibpsa/project1-boptest/issues/590).
- Specify the Python version (3.7) used for building the wrapper to execute the example JavaScript controllers in the unit test. This is for [#594](https://github.com/ibpsa/project1-boptest/issues/594).
- Allow simulations and forecasts to work across the end of the year into the next year. This is for [#239](https://github.com/ibpsa/project1-boptest/issues/239).
- Pin base Docker image to ``linux/x86_64`` platform. This is for [#608](https://github.com/ibpsa/project1-boptest/issues/608).
- Correct typo in design documentation about connecting inputs to overwrite blocks in wrapper model. This is for [#601](https://github.com/ibpsa/project1-boptest/issues/601).
- Add Git LFS configuration in the ``testing/Dockerfile`` image used in tests and compilation. This is for [#613](https://github.com/ibpsa/project1-boptest/issues/613).
- Correct typo in documentation for ``multizone_office_simple_air``, cooling setback temperature changed from 12 to 30. This is for [#605](https://github.com/ibpsa/project1-boptest/issues/605).
- Modify unit tests for test case scenarios to only simulate two days after warmup instead of the whole two-week scenario. This is for [#576](https://github.com/ibpsa/project1-boptest/issues/576).
- Fix unit tests for possible false passes in certain test cases. This is for [#620](https://github.com/ibpsa/project1-boptest/issues/620).
- Add ``activate`` control inputs to all test case documentation and update ``get_html_IO.py`` to print one file with all inputs, outputs, and forecasts. This is for [#555](https://github.com/ibpsa/project1-boptest/issues/555).
- Add storing of scenario result trajectories, kpis, and test information to simulation directory within test case docker container. This is for [#626](https://github.com/ibpsa/project1-boptest/issues/626).


## BOPTEST v0.5.0