For backups, you sync a local folder to a server in the cloud. Since only the changed files are copied, you can run the command automatically via cron, daily or even hourly. The same approach works for encrypted files if you use eCryptfs.
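As a rough sketch of the cron setup (all paths, the hostname, and the folder names here are placeholders you would replace with your own), a crontab entry for a nightly backup at 2 am could look like:

```shell
# m h dom mon dow  command
# Sync ~/documents up to the cloud server every day at 02:00.
# Only changed files are transferred, so repeated runs are cheap.
0 2 * * * rsync -e "ssh -i /home/yourusername/yourkey.pem" -a /home/yourusername/documents ubuntu@your-ec2-host:backups
```

Edit it into your crontab with `crontab -e`; cron then runs it without any further attention.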
For BitTorrent downloads into the cloud, you can automatically sync the finished downloads to your local drive. With automatic sync it's as good as downloading directly, without the hassle of running a BitTorrent client on your own computer. But total bandwidth usage increases by about 50%, because each file crosses the cloud's network three times instead of two: in once when it downloads, then out once for seeding and once more for the transfer to your computer.
The easiest setup is to run Ubuntu both in EC2 and locally, which is what I do. First, to copy a file or a whole folder from EC2 to your hard drive: assume you have a folder called downloads in EC2, and your default username there is ubuntu.
scp -r -i /home/yourusername/yourkeyfile.pem ubuntu@your-ec2-host:downloads/foldername /home/yourusername/downloads
The advantage is that scp shows a percentage counter for each file, and you can pick the destination file and folder names yourself.
If you sync the files, a single command is all you ever need, and you can put it into cron and forget about it.
rsync -e "ssh -i /home/yourusername/yourkey.pem" -av --exclude '*.part' ubuntu@your-ec2-host:downloads ~
For BitTorrent downloads, excluding the partial files means you never waste bandwidth on them. The disadvantage is that the command is completely silent: you have no way of knowing the progress unless you use other means, such as the system network monitor. Also, the partially transferred file is hidden away under a temporary name, so you cannot take a peek at it first.