Add CI job to test memory consumption on large files
Bodigrim committed Dec 19, 2023
1 parent d94a988 commit dfecfa1
Showing 2 changed files with 45 additions and 0 deletions.
29 changes: 29 additions & 0 deletions .github/workflows/large-files.yml
@@ -0,0 +1,29 @@
name: large-files
on:
- push
- pull_request

defaults:
  run:
    shell: bash

jobs:
  build:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: ['ubuntu-latest']
        ghc: ['latest']
    steps:
      - uses: actions/checkout@v3
      - uses: haskell/actions/setup@v2
        id: setup-haskell-cabal
        with:
          ghc-version: ${{ matrix.ghc }}
      - name: Update cabal package database
        run: cabal update
      - uses: actions/cache@v3
        name: Cache cabal stuff
        with:
          path: ${{ steps.setup-haskell-cabal.outputs.cabal-store }}
          key: ${{ runner.os }}-${{ matrix.ghc }}
      - name: Test
        run: htar/test-large-files.sh
16 changes: 16 additions & 0 deletions htar/test-large-files.sh
@@ -0,0 +1,16 @@
#!/bin/sh
# Test that htar can pack and unpack large files
# without materialising them in memory in full.

set -eux
cabal build htar
HTAR=$(cabal list-bin htar)
cd "$(mktemp -d)"
mkdir input
for i in $(seq 0 4); do
  dd if=/dev/zero of="input/$i.txt" bs=1M count=2048
done
"$HTAR" --create --verbose --file input.tar.gz input +RTS -s -M50M
rm -rf input
"$HTAR" --extract --verbose --file input.tar.gz +RTS -s -M50M
ls -l input
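
For context, the +RTS -s -M50M options ask the GHC runtime to print memory statistics and to cap the heap at 50 MB (accepting these flags requires the executable to be linked with -rtsopts), so packing five 2 GiB files can only succeed if htar streams the data rather than loading whole files into memory. Below is a minimal sketch of such a streaming pipeline, assuming the high-level Codec.Archive.Tar API from the tar package and Codec.Compression.GZip from zlib; it is an illustration of the technique being tested, not the actual htar source.

import qualified Codec.Archive.Tar as Tar
import qualified Codec.Compression.GZip as GZip
import qualified Data.ByteString.Lazy as BL

main :: IO ()
main = do
  -- Pack the "input" directory: entries are read lazily and flow through
  -- gzip compression chunk by chunk, keeping peak residency small.
  entries <- Tar.pack "." ["input"]
  BL.writeFile "input.tar.gz" (GZip.compress (Tar.write entries))
  -- Unpack the archive into "unpacked" without loading it whole.
  archive <- BL.readFile "input.tar.gz"
  Tar.unpack "unpacked" (Tar.read (GZip.decompress archive))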
