Rework structure for 8bit -> 24bit mode #238

Merged: 16 commits merged into JuliaPlots:master from the 8_24bit branch on Mar 23, 2022

Conversation

t-bltg
Member

@t-bltg t-bltg commented Mar 20, 2022

  • use UInt32 instead of Union{Nothing,UInt8} for the canvas color (allows packing the r, g, b channels for truecolor or 256 colors, blending rgb colors, ...); see the sketch after this list
  • make greater use of Crayons: reduces the test reference files by about 25%
  • switch between 24bit and 8bit mode at runtime, based on the COLORTERM environment variable (generic) or UP_COLORMODE
  • these changes pave the way for a follow-up PR saving a UnicodePlots plot as a png image (Save Plot as png #239)
  • get rid of printstyled from Base and delegate colors to Crayons (more flexible for style changes, e.g. bold, italic, ...)
  • this PR allows a finer (increased resolution) color range in 24bit mode for heatmaps or when using colorbars:
    [heatmap screenshot]
  • to ease the transition, USE_LUT is set to false by default, hence 256 colors are used when passing named colors or integers in 24bit mode.
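
As a minimal sketch of the first and third bullets (packing r, g, b into a UInt32 and switching color depth at runtime), assuming hypothetical helper names pack_rgb, unpack_rgb and colormode that do not necessarily match the PR's internals:

# illustrative sketch only -- these helpers are hypothetical, not the PR's actual internals

# pack 8-bit r, g, b channels into a single UInt32 (0x00RRGGBB),
# replacing the former Union{Nothing,UInt8} per-cell color
pack_rgb(r::Integer, g::Integer, b::Integer) =
    (UInt32(r) << 16) | (UInt32(g) << 8) | UInt32(b)

# unpack the channels again, e.g. for blending or for downsampling to 256 colors
unpack_rgb(c::UInt32) = ((c >> 16) % UInt8, (c >> 8) % UInt8, c % UInt8)

# choose the color depth at runtime: 24bit when the terminal advertises truecolor
# support via COLORTERM, with UP_COLORMODE acting as an explicit override
function colormode()
    mode = get(ENV, "UP_COLORMODE", get(ENV, "COLORTERM", ""))
    mode in ("24bit", "truecolor") ? 24 : 8
end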

perf regression

using BenchmarkTools, UnicodePlots

main() = begin
  @show ENV["COLORTERM"]
  io = IOContext(PipeBuffer(), :color=>true)
  @btime show($io, surfaceplot(0:0.5:(2π), 0:0.5:(2π), (x, y) -> sin(x) + cos(y), color = :yellow))
  @btime show($io, lineplot([cos, sin, tan], -π / 2, 2π))
  @btime show($io, barplot(["I need a\nbreak", "I dont but could", :Hello, :Again], [30, 40, 20, 10]))
  @btime show($io, surfaceplot(-2:2, -2:2, (x, y) -> 15sinc((x^2 + y^2) / π), zscale=z -> 0, lines=true, colormap=:jet))
  @btime show($io, heatmap(collect(0:30) * transpose(collect(0:30)), xfact=.1, yfact=.1, xoffset=-1.5, colormap=:inferno))
end

main()
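
The two timing blocks below were presumably produced by running main() once per color mode, toggling COLORTERM beforehand, along these lines:

for mode in ("truecolor", "nothing")  # any value other than "truecolor"/"24bit" presumably falls back to 8bit
    ENV["COLORTERM"] = mode
    main()
end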

pr

ENV["COLORTERM"] = "truecolor"
  300.874 μs (2411 allocations: 127.09 KiB)
  368.014 μs (2886 allocations: 127.16 KiB)
  83.622 μs (584 allocations: 25.94 KiB)
  592.752 μs (5182 allocations: 221.33 KiB)
  1.348 ms (17366 allocations: 792.30 KiB)
ENV["COLORTERM"] = "nothing"
  242.127 μs (1635 allocations: 90.72 KiB)
  323.428 μs (2034 allocations: 87.22 KiB)
  73.201 μs (496 allocations: 21.81 KiB)
  561.642 μs (5662 allocations: 214.48 KiB)
  1.127 ms (14799 allocations: 668.06 KiB)

master

  706.757 μs (9052 allocations: 502.95 KiB)
  795.607 μs (9363 allocations: 496.50 KiB)
  73.877 μs (601 allocations: 35.06 KiB)
  1.271 ms (13516 allocations: 915.23 KiB)
  1.268 ms (14826 allocations: 1021.61 KiB)

visual inspection

# compare the local 8bit reference files (fork) against the current references on master,
# showing the corresponding 24bit references whenever they differ
root="$HOME/.julia/dev/UnicodePlots.jl"  # local fork checkout

for f in $(find "$root/test/references_8" -name '*.txt'); do
  tmp=$(mktemp)
  fm=${f/references_8/references}  # map to the reference path used on master
  url="https://raw.githubusercontent.com/JuliaPlots/UnicodePlots.jl/master/${fm#$root/}"
  wget -q -O "$tmp" "$url"
  if ! diff "$f" "$tmp" > /dev/null; then
    echo; echo
    echo "== remote($url) =="
    cat "$tmp"; echo
    echo "== local8($f) =="
    cat "$f"; echo
    f24=${f/references_8/references_24}
    echo "== local24($f24) =="
    cat "$f24"; echo
  else
    echo "== same($f) =="
  fi
  rm "$tmp"
done

@t-bltg t-bltg changed the title from "8 24bit" to "Rework structure for 8bit -> 24bit mode" on Mar 20, 2022
@codecov-commenter

codecov-commenter commented Mar 20, 2022

Codecov Report

Merging #238 (10eacb8) into master (406f5cc) will increase coverage by 0.00%.
The diff coverage is 100.00%.

@@           Coverage Diff           @@
##           master     #238   +/-   ##
=======================================
  Coverage   99.93%   99.93%           
=======================================
  Files          28       28           
  Lines        1501     1572   +71     
=======================================
+ Hits         1500     1571   +71     
  Misses          1        1           
Impacted Files                Coverage Δ
src/canvas.jl                 100.00% <100.00%> (ø)
src/canvas/asciicanvas.jl     100.00% <100.00%> (ø)
src/canvas/blockcanvas.jl     100.00% <100.00%> (ø)
src/canvas/braillecanvas.jl   98.11% <100.00%> (ø)
src/canvas/densitycanvas.jl   100.00% <100.00%> (ø)
src/canvas/dotcanvas.jl       100.00% <100.00%> (ø)
src/canvas/heatmapcanvas.jl   100.00% <100.00%> (ø)
src/canvas/lookupcanvas.jl    100.00% <100.00%> (ø)
src/colormaps.jl              100.00% <100.00%> (ø)
src/common.jl                 100.00% <100.00%> (ø)
... and 4 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@t-bltg t-bltg force-pushed the 8_24bit branch 2 times, most recently from 2db58e3 to b65cf5c, on March 21, 2022 10:41
@t-bltg t-bltg requested a review from johnnychen94 March 21, 2022 18:57
@t-bltg
Member Author

t-bltg commented Mar 21, 2022

@johnnychen94, this PR is massive (due to the changes and addition of reference files), and I don't know if you have the time or the will to review it. If not, no worries, tell me so and I'll merge it later this week.

@t-bltg t-bltg mentioned this pull request Mar 22, 2022
@johnnychen94
Collaborator

I don't know if you have the time or the will to review it.

Not really... sorry. I'm starting to put more time into my own work here, to make my PhD career not that much of a failure :sad:

For the long-term goal, which might never be achieved, I'm expecting to work closely with Term.jl and eventually build a terminal-based IDE for Julia. This is also why I was writing a decoder in JuliaImages/ImageInTerminal.jl#62. But I have to confess that I haven't looked into Term.jl at all.

@johnnychen94 johnnychen94 removed their request for review March 22, 2022 17:08
@t-bltg t-bltg merged commit ee91545 into JuliaPlots:master Mar 23, 2022
@t-bltg t-bltg deleted the 8_24bit branch March 23, 2022 12:06