storj/scripts/test-sim-aws.sh
paul cannon 1d78ddc3df
Test that network stalls don't cause indefinite hangs in uplink (#1530)
* test-network-stalls tests... network stalls!

in particular, right now, it just tests whether an uplink correctly
times out after some amount of time when one of the nodes it's talking
to suddenly goes offline.

This tool is meant to be run under `storj-sim network test`.

Also included here:

* fix storj-sim-related test scripts on Mac

the default storj config dir on Mac has a space in it
('~/Library/Application Support/Storj'), which breaks everywhere it
shows up in an unquoted variable in a sh/bash script. easy enough to fix
as a one-off, but quoting bash vars avoids a dozen other potential
problems too.
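A minimal sketch of the failure mode, using a made-up path with a space in it (not the real Storj config dir):

```shell
#!/bin/bash
# Hypothetical demo of why unquoted variables break on paths with spaces.
dir="/tmp/has a space"      # contains a space, like '~/Library/Application Support/Storj'
mkdir -p "$dir"

ls "$dir"    # quoted: ls receives one argument and succeeds
ls $dir      # unquoted: word-splits into "/tmp/has", "a", "space" -> fails
```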

change a few things using `head -c` to use `dd`. `head -c` works,
but is not as widely understood (as evidenced by each of these scripts
getting through code review, one at a time, with the comments not
matching the numbers actually used).
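For reference, the two forms write the same bytes; the `dd` invocation just states the size more explicitly, and both GNU and BSD `dd` accept the `x` multiplier syntax (file paths here are illustrative):

```shell
#!/bin/bash
# Both commands emit exactly 1 MiB of random bytes; dd's count=1 bs=SIZE
# spells the size out, and bs=1x1024x1024 means 1*1024*1024 = 1048576.
head -c 1048576 /dev/urandom > /tmp/head-testfile
dd if=/dev/urandom of=/tmp/dd-testfile count=1 bs=1x1024x1024 >/dev/null 2>&1

wc -c < /tmp/head-testfile   # both files are 1048576 bytes
wc -c < /tmp/dd-testfile
```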

* storj-sim reports PIDs of worker processes to test

so that the tests can cause unfortunate "accidents" to befall the worker
processes in the course of the test, and find out whether everything
reacts correctly.
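The shape of that pattern, reduced to plain shell (the worker here is a stand-in `sleep`, not an actual storj-sim process, and the variable name is made up):

```shell
#!/bin/bash
# Hypothetical sketch: a supervisor starts a worker, hands its PID to the
# test, and the test kills the worker to see whether everything reacts.
sleep 60 &          # stand-in for a storage node / worker process
WORKER_PID=$!       # storj-sim would report this PID to the test

kill "$WORKER_PID"  # the "unfortunate accident"
if wait "$WORKER_PID" 2>/dev/null; then
    echo "worker exited cleanly (unexpected)"
else
    echo "worker was terminated, as the test intended"
fi
```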
2019-03-20 08:58:07 -06:00


#!/bin/bash
set -ueo pipefail
# Set up a tmpdir for test files, and clean it up on exit.
TMPDIR=$(mktemp -d -t tmp.XXXXXXXXXX)
cleanup() {
    rm -rf "$TMPDIR"
}
trap cleanup EXIT
SRC_DIR=$TMPDIR/source
DST_DIR=$TMPDIR/dst
mkdir -p "$SRC_DIR" "$DST_DIR"
aws configure set aws_access_key_id "$GATEWAY_0_ACCESS_KEY"
aws configure set aws_secret_access_key "$GATEWAY_0_SECRET_KEY"
aws configure set default.region us-east-1
random_bytes_file() {
    size="$1"
    output="$2"
    dd if=/dev/urandom of="$output" count=1 bs="$size" >/dev/null 2>&1
}
random_bytes_file 1x1024x1024 "$SRC_DIR/small-upload-testfile"     # 1 MiB of random bytes (inline)
random_bytes_file 5x1024x1024 "$SRC_DIR/big-upload-testfile"       # 5 MiB of random bytes (remote)
random_bytes_file 5x1024      "$SRC_DIR/multipart-upload-testfile" # 5 KiB of random bytes (remote)
echo "Creating Bucket"
aws s3 --endpoint="http://$GATEWAY_0_ADDR" mb s3://bucket
echo "Uploading Files"
aws configure set default.s3.multipart_threshold 1TB # high threshold keeps these uploads single-part
aws s3 --endpoint="http://$GATEWAY_0_ADDR" cp "$SRC_DIR/small-upload-testfile" s3://bucket/small-testfile
aws s3 --endpoint="http://$GATEWAY_0_ADDR" cp "$SRC_DIR/big-upload-testfile" s3://bucket/big-testfile
# Wait 5 seconds to trigger any error related to one of the different intervals
sleep 5
echo "Uploading Multipart File"
aws configure set default.s3.multipart_threshold 4KB # low threshold forces a multipart upload for the 5 KiB file
aws s3 --endpoint="http://$GATEWAY_0_ADDR" cp "$SRC_DIR/multipart-upload-testfile" s3://bucket/multipart-testfile
echo "Downloading Files"
aws s3 --endpoint="http://$GATEWAY_0_ADDR" ls s3://bucket
aws s3 --endpoint="http://$GATEWAY_0_ADDR" cp s3://bucket/small-testfile "$DST_DIR/small-download-testfile"
aws s3 --endpoint="http://$GATEWAY_0_ADDR" cp s3://bucket/big-testfile "$DST_DIR/big-download-testfile"
aws s3 --endpoint="http://$GATEWAY_0_ADDR" cp s3://bucket/multipart-testfile "$DST_DIR/multipart-download-testfile"
aws s3 --endpoint="http://$GATEWAY_0_ADDR" rb s3://bucket --force
if cmp "$SRC_DIR/small-upload-testfile" "$DST_DIR/small-download-testfile"; then
    echo "small-upload-testfile matches downloaded file"
else
    echo "small-upload-testfile does not match downloaded file" >&2
    exit 1
fi
if cmp "$SRC_DIR/big-upload-testfile" "$DST_DIR/big-download-testfile"; then
    echo "big-upload-testfile matches downloaded file"
else
    echo "big-upload-testfile does not match downloaded file" >&2
    exit 1
fi
if cmp "$SRC_DIR/multipart-upload-testfile" "$DST_DIR/multipart-download-testfile"; then
    echo "multipart-upload-testfile matches downloaded file"
else
    echo "multipart-upload-testfile does not match downloaded file" >&2
    exit 1
fi