storj/scripts/test-uplink.sh
paul cannon 1d78ddc3df
Test that network stalls don't cause indefinite hangs in uplink (#1530)
* test-network-stalls tests... network stalls!

in particular, right now, it just tests whether an uplink correctly
times out after some amount of time when one of the nodes it's talking
to suddenly goes offline.

This tool is meant to be run under `storj-sim network test`.

Also included here:

* fix storj-sim-related test scripts on Mac

the default storj config dir on Mac has a space in it
('~/Library/Application Support/Storj'), which breaks everywhere it
shows up in an unquoted variable in a sh/bash script. easy enough to fix
as a one-off, but quoting bash vars avoids a dozen other potential
problems too.
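
for illustration, the failure mode looks roughly like this (CONFIG_DIR is just
an example variable name; the path is the real Mac default):

    CONFIG_DIR="$HOME/Library/Application Support/Storj"
    ls $CONFIG_DIR     # unquoted: word-splits into ".../Application" and "Support/Storj", then fails
    ls "$CONFIG_DIR"   # quoted: the whole path is passed as a single argument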

change a few places that were using `head -c` to use `dd` instead. `head -c`
works, but it is not as widely understood (as evidenced by each of these
scripts getting through code review, one at a time, with the comments not
matching the numbers actually used).
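
roughly, the swap looks like this (the sizes shown are only examples):

    head -c 2048 /dev/urandom > testfile                 # needs the literal byte count spelled out
    dd if=/dev/urandom of=testfile bs=2x1024 count=1     # dd multiplies 2x1024 itself, so the size reads as written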

* storj-sim reports PIDs of worker processes to the tests

so that the tests can cause unfortunate "accidents" to befall the worker
processes in the course of the test, and find out whether everything
reacts correctly.
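
a hedged sketch of the kind of "accident" such a test could stage; the
STORAGENODE_0_PID name is only illustrative here, not necessarily the variable
storj-sim actually exports:

    # freeze one storage node to simulate a network stall (illustrative variable name)
    kill -STOP "$STORAGENODE_0_PID"
    # the transfer should now fail or time out instead of hanging forever
    # (GNU coreutils timeout bounds the test externally)
    timeout 90 uplink --config-dir "$GATEWAY_0_DIR" cp "$SRC_DIR/big-upload-testfile" "sj://$BUCKET/" \
        || echo "uplink gave up instead of hanging"
    # resume the node so cleanup can still proceed
    kill -CONT "$STORAGENODE_0_PID"
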
2019-03-20 08:58:07 -06:00

#!/bin/bash
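# Basic end-to-end test of the uplink CLI: upload, download, and remove files
# through a test bucket. Expects GATEWAY_0_DIR to be set in the environment;
# this script is meant to run under `storj-sim network test`.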
set -ueo pipefail
TMPDIR=$(mktemp -d -t tmp.XXXXXXXXXX)
cleanup(){
    rm -rf "$TMPDIR"
    echo "cleaned up test successfully"
}
trap cleanup EXIT
BUCKET=bucket-123
SRC_DIR=$TMPDIR/source
DST_DIR=$TMPDIR/dst
mkdir -p "$SRC_DIR" "$DST_DIR"
random_bytes_file () {
    size=$1
    output=$2
    dd if=/dev/urandom of="$output" count=1 bs="$size" >/dev/null 2>&1
}
random_bytes_file 2x1024      "$SRC_DIR/small-upload-testfile" # create a 2 KiB file of random bytes (inline)
random_bytes_file 5x1024x1024 "$SRC_DIR/big-upload-testfile"   # create a 5 MiB file of random bytes (remote)
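# exercise the basic uplink workflow: make the bucket, upload both files,
# download them again, remove them, list, and remove the bucket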
uplink --config-dir "$GATEWAY_0_DIR" mb "sj://$BUCKET/"
uplink --config-dir "$GATEWAY_0_DIR" cp "$SRC_DIR/small-upload-testfile" "sj://$BUCKET/"
uplink --config-dir "$GATEWAY_0_DIR" cp "$SRC_DIR/big-upload-testfile" "sj://$BUCKET/"
uplink --config-dir "$GATEWAY_0_DIR" cp "sj://$BUCKET/small-upload-testfile" "$DST_DIR"
uplink --config-dir "$GATEWAY_0_DIR" cp "sj://$BUCKET/big-upload-testfile" "$DST_DIR"
uplink --config-dir "$GATEWAY_0_DIR" rm "sj://$BUCKET/small-upload-testfile"
uplink --config-dir "$GATEWAY_0_DIR" rm "sj://$BUCKET/big-upload-testfile"
uplink --config-dir "$GATEWAY_0_DIR" ls "sj://$BUCKET"
uplink --config-dir "$GATEWAY_0_DIR" rb "sj://$BUCKET"
if cmp "$SRC_DIR/small-upload-testfile" "$DST_DIR/small-upload-testfile"
then
echo "small upload testfile matches uploaded file"
else
echo "small upload testfile does not match uploaded file"
fi
if cmp "$SRC_DIR/big-upload-testfile" "$DST_DIR/big-upload-testfile"
then
echo "big upload testfile matches uploaded file"
else
echo "big upload testfile does not match uploaded file"
fi
# check if all data files were removed
# FILES=$(find "$STORAGENODE_0_DIR/../" -type f -path "*/blob/*" ! -name "info.*")
# if [ -z "$FILES" ]
# then
#     echo "all data files removed from storage nodes"
# else
#     echo "not all data files removed from storage nodes:"
#     echo "$FILES"
#     exit 1
# fi