This repository has been archived by the owner on May 16, 2023. It is now read-only.

Commit

Merge pull request #18 from daniel-thom/suppress-logging
Suppress logging
claytonpbarrows authored Jan 16, 2021
2 parents d7304c8 + 256fa32 commit 4392959
Showing 15 changed files with 966 additions and 1,488 deletions.
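
The change is the same in every example script: send log records to a file and keep the console quiet. A minimal sketch of that pattern, assembled from the diffs below (configure_logging comes from InfrastructureSystems and, as these scripts assume, is re-exported through PowerSystems; the Error and Info level constants come from the Logging stdlib):

using PowerSystems
using Logging

# Only Error and above reach the console; Info and above go to ex.log.
logger = configure_logging(console_level = Error, file_level = Info, filename = "ex.log")

@info "downloading data..."      # recorded in ex.log, hidden from the console
println("downloading data...")   # progress messages switch to println so they stay visible
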
10 changes: 2 additions & 8 deletions Manifest.toml
@@ -543,12 +543,6 @@ git-tree-sha1 = "81690084b6198a2e1da36fcfda16eeca9f9f24e4"
 uuid = "682c06a0-de6a-54ab-a142-c8b1cf79cde6"
 version = "0.21.1"
 
-[[JSON2]]
-deps = ["Dates", "Parsers", "Test"]
-git-tree-sha1 = "66397cc6c08922f98a28ab05a8d3002f9853b129"
-uuid = "2535ab7d-5cd8-5a07-80ac-9b1792aadce3"
-version = "0.3.2"
-
 [[JSON3]]
 deps = ["Dates", "Mmap", "Parsers", "StructTypes", "UUIDs"]
 git-tree-sha1 = "f17f647d78ade849298039b75bbd48c05da77900"
@@ -843,9 +837,9 @@ version = "1.3.4+2"
 
 [[OpenBLAS32_jll]]
 deps = ["Artifacts", "CompilerSupportLibraries_jll", "JLLWrappers", "Libdl", "Pkg"]
-git-tree-sha1 = "a459bb2c511b679d726cfd6c4c370149c6837fe3"
+git-tree-sha1 = "ba4a8f683303c9082e84afba96f25af3c7fb2436"
 uuid = "656ef2d0-ae68-5445-9ca0-591084a874a2"
-version = "0.3.12+0"
+version = "0.3.12+1"
 
 [[OpenBLAS_jll]]
 deps = ["CompilerSupportLibraries_jll", "Libdl", "Pkg"]
2 changes: 1 addition & 1 deletion Project.toml
@@ -12,7 +12,7 @@ Dates = "ade2ca70-3891-5945-98fb-dc099432e06a"
 DisplayAs = "0b91fe84-8a4c-11e9-3e1d-67c38462b6d6"
 InfrastructureSystems = "2cd47ed4-ca9b-11e9-27f2-ab636a7671f1"
 Ipopt = "b6b21f68-93f8-5de0-b562-5493be1d77c9"
-JSON2 = "2535ab7d-5cd8-5a07-80ac-9b1792aadce3"
+JSON3 = "0f8b85d8-7281-11e9-16c2-39a750bddbf1"
 Literate = "98b081ad-f1c9-55d3-8b20-4c87d4299306"
 Logging = "56ddb016-857b-54e1-b83d-db4d58db5568"
 Pkg = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f"
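
For context, a dependency swap like this one is normally produced by Pkg rather than edited by hand; a sketch of the REPL steps that would generate the JSON2 to JSON3 change in both Project.toml and Manifest.toml (not taken from this PR's history):

using Pkg

Pkg.activate(".")   # the SIIPExamples project environment
Pkg.rm("JSON2")     # removes the entry from Project.toml and prunes Manifest.toml
Pkg.add("JSON3")    # adds the replacement and resolves the manifest
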
412 changes: 215 additions & 197 deletions notebook/2_PowerSystems_examples/add_forecasts.ipynb

Large diffs are not rendered by default.

1,969 changes: 706 additions & 1,263 deletions notebook/4_PowerSimulationsDynamics_examples/03_inverter_model.ipynb

Large diffs are not rendered by default.

4 changes: 3 additions & 1 deletion script/2_PowerSystems_examples/PowerSystems_intro.jl
@@ -42,7 +42,9 @@ Pkg.status()
 using SIIPExamples;
 using PowerSystems;
 using D3TypeTrees;
-IS = PowerSystems.IS
+using Logging
+
+logger = configure_logging(console_level = Error, file_level = Info, filename = "ex.log")
 
 # ## Types in PowerSystems
 # PowerSystems.jl provides a type hierarchy for specifying power system data. Data that
15 changes: 9 additions & 6 deletions script/2_PowerSystems_examples/US_system.jl
@@ -23,11 +23,14 @@ using Dates
 using TimeZones
 using DataFrames
 using CSV
+using Logging
+
+logger = configure_logging(console_level = Error, file_level = Info, filename = "ex.log")
 
 # ### Fetch Data
 # PowerSystems.jl links to some test data that is suitable for this example.
 # Let's download the test data
-@info "downloading data..."
+println("downloading data...")
 datadir = joinpath(dirname(dirname(pathof(SIIPExamples))), "US-System")
 siip_data = joinpath(datadir, "SIIP")
 if !isdir(datadir)
@@ -57,8 +60,8 @@ initial_time = ZonedDateTime(DateTime("2016-01-01T00:00:00"), timezone)
 #
 # First, PowerSystems.jl only supports parsing piecewise linear generator costs from tabular
 # data. So, we can sample the quadratic polynomial cost curves and provide PWL points.
-@info "formatting data ..."
-!isnothing(interconnect) && @info "filtering data to include $interconnect ..."
+println("formatting data ...")
+!isnothing(interconnect) && println("filtering data to include $interconnect ...")
 gen = DataFrame(CSV.File(joinpath(datadir, "plant.csv")))
 filter!(row -> row[:interconnect] == interconnect, gen)
 gencost = DataFrame(CSV.File(joinpath(datadir, "gencost.csv")))
@@ -171,7 +174,7 @@ timeseries = []
 ts_csv = ["wind", "solar", "hydro", "demand"]
 plant_ids = Symbol.(string.(gen.plant_id))
 for f in ts_csv
-    @info "formatting $f.csv ..."
+    println("formatting $f.csv ...")
     csvpath = joinpath(siip_data, f * ".csv")
     csv = DataFrame(CSV.File(joinpath(datadir, f * ".csv")))
     (category, name_prefix, label) =
@@ -239,7 +242,7 @@ end
 # describing the column names of each file in PowerSystems terms, and the PowerSystems
 # data type that should be created for each generator type. The respective "us_decriptors.yaml"
 # and "US_generator_mapping.yaml" files have already been tailored to this dataset.
-@info "parsing csv files..."
+println("parsing csv files...")
 rawsys = PowerSystems.PowerSystemTableData(
     siip_data,
     100.0,
@@ -253,7 +256,7 @@ rawsys = PowerSystems.PowerSystemTableData(
 # time series, we also need to specify which time series we want to include in the `System`.
 # The `time_series_resolution` kwarg filters to only include time series with a matching resolution.
 
-@info "creating System"
+println("creating System")
 sys = System(rawsys; config_path = joinpath(config_dir, "us_system_validation.json"));
 sys
 
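
The comments in US_system.jl above note that the tabular parser only accepts piecewise linear (PWL) generator costs, so quadratic cost curves get sampled into PWL points. A small self-contained sketch of that idea, with made-up coefficients and breakpoints:

a, b, c0 = 0.02, 15.0, 100.0            # hypothetical quadratic cost coefficients
pmin, pmax = 20.0, 100.0                # hypothetical generator output limits (MW)
points = range(pmin, pmax, length = 4)  # sample a handful of output levels
pwl = [(a * p^2 + b * p + c0, p) for p in points]   # (cost, MW) pairs for a PWL curve
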
15 changes: 7 additions & 8 deletions script/2_PowerSystems_examples/add_forecasts.jl
@@ -11,6 +11,10 @@
 # ### Dependencies
 # Let's use the 5-bus dataset we parsed in the MATPOWER example
 using SIIPExamples
+using PowerSystems
+using Logging
+
+logger = configure_logging(console_level = Error, file_level = Info, filename = "ex.log")
 pkgpath = dirname(dirname(pathof(SIIPExamples)))
 include(joinpath(pkgpath, "test", "2_PowerSystems_examples", "parse_matpower.jl"))
 
@@ -22,14 +26,9 @@ include(joinpath(pkgpath, "test", "2_PowerSystems_examples", "parse_matpower.jl"))
 FORECASTS_DIR = joinpath(base_dir, "forecasts", "5bus_ts")
 fname = joinpath(FORECASTS_DIR, "timeseries_pointers_da.json")
 open(fname, "r") do f
-    for line in eachline(f)
-        println(line)
-    end
+    JSON3.@pretty JSON3.read(f)
 end
 
-# ### Read the pointers
-ts_pointers = PowerSystems.IS.read_time_series_file_metadata(fname)
-
-# ### Read and assign time series to `System` using the `ts_pointers` struct
-add_time_series!(sys, ts_pointers)
+# ### Read and assign time series to `System` using these parameters.
+add_time_series!(sys, fname)
 sys
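
The rewritten add_forecasts.jl prints the pointer file with JSON3 instead of echoing it line by line, then hands the file path straight to add_time_series!. A self-contained sketch of the JSON3 pretty-printing half, using an inline string with made-up fields rather than the real timeseries_pointers_da.json:

import JSON3

raw = """{"simulation": "DAY_AHEAD", "resolution": 3600}"""   # made-up pointer-like fields
JSON3.@pretty raw   # prints an indented view of the JSON to the console
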
@@ -10,6 +10,10 @@
 using SIIPExamples
 using PowerSystems
 const PSY = PowerSystems
+using Logging
+
+logger = configure_logging(console_level = Error, file_level = Info, filename = "ex.log")
+
 
 # # Step 1: System description
 
2 changes: 2 additions & 0 deletions script/2_PowerSystems_examples/network_matrices.jl
@@ -13,6 +13,8 @@
 # ### Dependencies
 # Let's use a dataset from the [tabular data parsing example](../../notebook/2_PowerSystems_examples/parse_matpower.ipynb)
 using SIIPExamples
+using Logging
+logger = configure_logging(console_level = Error, file_level = Info, filename = "ex.log")
 pkgpath = dirname(dirname(pathof(SIIPExamples)))
 include(joinpath(pkgpath, "test", "2_PowerSystems_examples", "parse_matpower.jl"))
 
3 changes: 3 additions & 0 deletions script/2_PowerSystems_examples/parse_matpower.jl
@@ -16,6 +16,9 @@ Pkg.status()
 using SIIPExamples
 using PowerSystems
 using TimeSeries
+using Logging
+
+logger = configure_logging(console_level = Error, file_level = Info, filename = "ex.log")
 
 # ### Fetch Data
 # PowerSystems.jl links to some test data that is suitable for this example.
3 changes: 3 additions & 0 deletions script/2_PowerSystems_examples/parse_psse.jl
@@ -16,6 +16,9 @@ Pkg.status()
 using SIIPExamples
 using PowerSystems
 using TimeSeries
+using Logging
+
+logger = configure_logging(console_level = Error, file_level = Info, filename = "ex.log")
 
 # ### Fetch Data
 # PowerSystems.jl links to some test data that is suitable for this example.
3 changes: 3 additions & 0 deletions script/2_PowerSystems_examples/parse_tabulardata.jl
@@ -19,6 +19,9 @@ using SIIPExamples
 using PowerSystems
 using TimeSeries
 using Dates
+using Logging
+
+logger = configure_logging(console_level = Error, file_level = Info, filename = "ex.log")
 
 # ### Fetch Data
 # PowerSystems.jl links to some test data that is suitable for this example.
6 changes: 4 additions & 2 deletions script/2_PowerSystems_examples/serialize_data.jl
@@ -10,17 +10,19 @@
 # ### Dependencies
 # Let's use a dataset from the [tabular data parsing example](../../notebook/2_PowerSystems_examples/parse_matpower.ipynb)
 using SIIPExamples
+using Logging
+logger = configure_logging(console_level = Error, file_level = Info, filename = "ex.log")
 pkgpath = dirname(dirname(pathof(SIIPExamples)))
 include(joinpath(pkgpath, "test", "2_PowerSystems_examples", "parse_matpower.jl"))
 
 # ### Write data to a temporary directory
 
 folder = mktempdir()
 path = joinpath(folder, "system.json")
-@info "Serializing to $path"
+println("Serializing to $path")
 to_json(sys, path)
 
-filesize(path) / 1000000 #MB
+filesize(path) / (1024 * 1024) #MiB
 
 # ### Read the JSON file and create a new `System`
 sys2 = System(path)
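
A small correctness fix rides along in serialize_data.jl: the reported size now divides by 1024 * 1024, so the value matches the MiB label rather than decimal megabytes. A tiny sketch of the same calculation on a throwaway file:

path, io = mktemp()
write(io, "{}")                  # stand-in for the serialized system JSON
close(io)
filesize(path) / (1024 * 1024)   # size in MiB
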
2 changes: 2 additions & 0 deletions script/3_PowerSimulations_examples/01_operations_problems.jl
@@ -22,6 +22,7 @@ using D3TypeTrees
 # ### Data management packages
 using Dates
 using DataFrames
+using Logging
 
 # ### Optimization packages
 using Cbc #solver
@@ -30,6 +31,7 @@ using Cbc #solver
 # This data depends upon the [RTS-GMLC](https://github.com/gridmod/rts-gmlc) dataset. Let's
 # download and extract the data.
 
+logger = configure_logging(console_level = Error, file_level = Info, filename = "ex.log")
 rts_dir = SIIPExamples.download("https://github.com/GridMod/RTS-GMLC")
 rts_src_dir = joinpath(rts_dir, "RTS_Data", "SourceData")
 rts_siip_dir = joinpath(rts_dir, "RTS_Data", "FormattedData", "SIIP");
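
For comparison only, the Logging stdlib can silence routine records globally with disable_logging; this is not what the PR uses, since configure_logging keeps an Info-level record in ex.log while quieting the console:

using Logging

disable_logging(Logging.Info)   # drop every record at Info level or below, process-wide
@info "hidden everywhere"       # no longer emitted at all
@warn "still shown"             # Warn sits above the disabled threshold
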
4 changes: 2 additions & 2 deletions src/SIIPExamples.jl
@@ -4,7 +4,7 @@ export print_struct
 
 # using Weave
 using Literate
-using JSON2
+import JSON3
 
 repo_directory = dirname(joinpath(@__DIR__))
 
@@ -24,7 +24,7 @@ end
 
 function read_json(filename)
     return open(filename) do io
-        JSON2.read(io, Dict)
+        JSON3.read(io, Dict)
     end
 end
 
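
A quick way to exercise the updated read_json helper from src/SIIPExamples.jl; this sketch redefines it locally and feeds it a temporary file with made-up contents:

import JSON3

read_json(filename) = open(io -> JSON3.read(io, Dict), filename)   # mirrors the new implementation

path, io = mktemp()
write(io, """{"name": "five_bus", "base_power": 100.0}""")   # made-up JSON contents
close(io)
read_json(path)   # returns a Dict with the "name" and "base_power" keys
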
