Optimize Your Synology NAS for Downloading
By Kevin van Zonneveld (@kvz)
I recently bought a NAS so my data is safe & available, with the benefit of low power draw, noise and heat. I considered Netgear and QNAP, but decided to go for a Synology as it was affordable and still had a big community, decent reviews & Time Machine support.
I wanted 4 bays so that I could use RAID5 and lose only 25% of the space to fault tolerance
instead of the 50% that RAID1 costs. Synology has 2 offerings in the 4-bay home-user range this year:
the DS411+ (fast) and the DS411j (slow).
I figured that as long as it could push enough bandwidth for 1080p over my network,
I'd save myself some money (300), heat and power consumption compared to the more powerful + version.
However, now that it's here I want it to download from newsgroups, and I'm running into
performance issues with my junior edition.
No worries. With a little bit of hacking you can squeeze just enough performance out of this thing to make sense of it all.
Here's how I turned my budget NAS that's mediocre at 8 things into a more powerful one that's good at 3 things: downloading / file serving / backups.
Warning
This article assumes you're somewhat skilled in Linux. By applying these suggestions you could seriously mess up your Disk Station.
I'm doing this on a DS411j running DSM 3.0. Your mileage may vary.
Downloading
In an earlier article I
described how to install SABnzbd. After test-driving it for a while I was never able to
get it to download above 3MB/s (2 average), whereas nzbget (the program used by Synology's
own Download Station) peaks at 8MB/s (6 average).
Although I really like that SABnzbd automatically unpacks your downloads, these speed differences
made me decide to go back to nzbget. The j is just not powerful enough to run SABnzbd at these
speeds, and I can write auto-unpackers myself.
Optimal Config
I found that optimal speeds can be reached by letting your Synology download with 8 connections on 1 single download. With these settings the load reaches 11, so don't expect your NAS to do anything else while it's busy. But at least you're saturating your connection.
If you want it to multitask, limit it to 1 connection on 1 single download at any time, but you won't see it peak beyond 2MB/s.
If you use it for torrents as well, you don't want 1 slow torrent blocking the rest of the queue. In that case, set it to 2 to 3 connections with 2 to 4 threads each for optimal downloading.
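Download Station uses nzbget under the hood, but it doesn't expose every knob. If you end up running nzbget directly or editing its configuration, the per-server connection count is the setting that matters; the sketch below uses option names from a stock nzbget.conf (older releases spell them in lowercase) and a placeholder news server:
Server1.Level=0
Server1.Host=news.yourprovider.example
Server1.Port=119
Server1.Username=user
Server1.Password=secret
# 8 connections saturate the line at the cost of a load of ~11; use 1 if the NAS must multitask
Server1.Connections=8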
Turn Off Unused Protocols
Decide on 1 file-sharing protocol (I chose Mac File Service because all my systems speak it and use Time Machine, but SMB/Windows is typically the right choice). Disable the rest in your configuration panel, saving a few precious MBs of RAM.
This is all just done from your web-interface.
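Once you're logged in over SSH (next section), you can double-check which file-sharing daemons are actually gone; the process names below are the usual suspects and may vary per DSM version:
$ ps w | grep -E 'smbd|nmbd|afpd|nfsd' | grep -v grep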
SSH Access
Before you can do any hacking on your Synology, turn on SSH access in the web-interface's
control panel.
You can now log in with:
$ ssh root@<nas ip>
followed by sh. The root password is the same as the admin password.
AppStore :)
Get your hands on ipkg, which is like your Synology's secret AppStore. From here on, it's much easier to install cool additional software.
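Once ipkg is bootstrapped, day-to-day use boils down to a few commands (htop here is just an example package; it also appears in the Tools list below):
$ ipkg update              # refresh the package lists
$ ipkg list | grep -i htop # search for a package
$ ipkg install htop        # install it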
Turn Off Media Indexers to Free Up CPU & Memory
When I logged in to see what was eating up my NAS' resources, I saw a lot of processes running that I didn't need,
such as thumbnail generators and media indexers (ffmpeg & convert).
They were endlessly consuming 100% CPU, leaving nothing for my other tasks.
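You can check this yourself over SSH; the stock busybox top and ps are enough to spot the culprits:
$ top                                                            # watch what is pegging the CPU
$ ps w | grep -E 'ffmpeg|convert|synoindexd|synomkthumbd' | grep -v grep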
Any currently available NAS is a terrible media streamer. And that's OK; just get yourself an AC Ryan (250) to do that instead and dedicate your NAS to fewer tasks.
In my case that meant killing off all these wannabe media processes that were eating up the poor handheld-class CPU and its 128MB of RAM (every MB we save from this point forward counts toward faster download speeds :)
So if you don't use the Photo/Media/iTunes station and would like more power for other tasks, consider turning off indexers:
- Turn off all services in the bottom configuration panel (iTunes, everything except Download Station, unless you're going to use SABnzbd for this)
- Login as root via SSH and stop all indexing by pasting:
/usr/syno/etc/rc.d/S??synoindexd.sh stop
/usr/syno/etc/rc.d/S??synomkflvd.sh stop
/usr/syno/etc/rc.d/S??synomkthumbd.sh stop
killall -9 convert
killall -9 ffmpeg
# If you don't use Download Station (but e.g. SABnzbd instead):
# /usr/syno/etc/rc.d/S??pgsql.sh stop
- Make sure they won't restart on your next reboot by pasting:
chmod a-x /usr/syno/etc/rc.d/S??synoindexd.sh
chmod a-x /usr/syno/etc/rc.d/S??synomkflvd.sh
chmod a-x /usr/syno/etc/rc.d/S??synomkthumbd.sh
# If you don't use Download Station (but e.g. SABnzbd instead):
# chmod a-x /usr/syno/etc/rc.d/S??pgsql.sh
Hint: after a DSM firmware upgrade, you need to repeat these steps.
Custom Cleanup & Rename Script Cause SAB Is Too Slow
Building your own cleanup scripts can be fun (and risky). If you want to get into it, you'll need some system tools at your disposal.
Here's what I cooked up to take care of my downloads:
#!/opt/bin/bash
# @todo: Don't delete parent dir if Dir == Root
# @todo: Root = $1 - But what about series!
set +x
export PATH="/opt/bin:/opt/sbin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/syno/bin:/usr/syno/sbin:/usr/local/bin:/usr/local/sbin"
# Locking
LockFile="/volume1/downloads/nas_is_unpacking.lock"
if [ -f "${LockFile}" ]; then
echo "Lockfile still exists: ${LockFile}. Aborting"
exit 0
fi
trap "{ rm -f ${LockFile} ; exit 255; }" EXIT
date > ${LockFile}
echo "Running ${0} on $(date)"
Root="/volume1/downloads"
Home="$(pwd)"
Purged=""
# Downloadstation
echo "Looking for downloadstation tasks..."
Prevdir=""
find ${Root}/_queue -mmin +5 \( -iname '*.nzb' -o -iname '*.torrent' \) |sort | while read File; do
Dir="$(dirname "$File")"
if [ "${Prevdir}" != "${Dir}" ]; then
cd "${Dir}"
echo ""
echo "= $(pwd)"
echo "================================================================================================"
fi
# Add this nzb/torrent to Download Station
/opt/bin/downloadstation add "${File}"
if [ $? -eq 0 ]; then
echo "Successfully added ${File}; purging file"
rm -f "${File}"
else
echo "Unable to add ${File}"
fi
Prevdir=$Dir
done
cd "${Home}"
# PAR
echo "Looking for files to repair..."
Prevdir=""
find ${Root} -mmin +5 -iname '*.par2' |sort | while read File; do
Dir="$(dirname "$File")"
if [ "${Prevdir}" != "${Dir}" ]; then
cd "${Dir}"
echo ""
echo "= $(pwd)"
echo "================================================================================================"
# Process the first par file in this directory thanks to |sort
par2 r "${File}"
if [ $? -eq 0 ]; then
echo "Successfully repaired; purging par files"
rm -f *.par2
rm -f *.PAR2
else
echo "Unable to repair; purging entire directory"
Purged="${Purged}${Dir}\n"
cd ..
rm -rf "${Dir}"
fi
fi
Prevdir=$Dir
done
cd "${Home}"
# RAR
echo "Looking for rar files to unpack..."
Prevdir=""
find ${Root} -mmin +5 -iname '*.rar' |sort | while read File; do
Dir="$(dirname "$File")"
if [ "${Prevdir}" != "${Dir}" ]; then
cd "${Dir}"
echo ""
echo "= $(pwd)"
echo "================================================================================================"
# Process the first rar file in this directory thanks to |sort
unrar e -y -o+ -p- "${File}"
if [ $? -eq 0 ]; then
echo "Successfully unpacked; purging rar files"
rm -f *.rar
rm -f *.r[0-9][0-9]
rm -f *.s[0-9][0-9]
rm -f *.t[0-9][0-9]
else
echo "Unable to unpack; purging entire directory"
Purged="${Purged}${Dir}\n"
cd ..
rm -rf "${Dir}"
fi
fi
Prevdir=$Dir
done
cd "${Home}"
# 7zip
echo "Looking for 7zip files to unpack..."
Prevdir=""
find ${Root} -mmin +5 -iname '*.7z.001' |sort | while read File; do
Dir="$(dirname "$File")"
if [ "${Prevdir}" != "${Dir}" ]; then
cd "${Dir}"
echo ""
echo "= $(pwd)"
echo "================================================================================================"
# Process the first 7zip file in this directory thanks to |sort
7z x "${File}"
if [ $? -eq 0 ]; then
echo "Successfully unpacked; purging rar files"
rm -f *.7z.[0-9][0-9][0-9]
else
echo "Unable to unpack; purging entire directory"
Purged="${Purged}${Dir}\n"
cd ..
rm -rf "${Dir}"
fi
fi
Prevdir=$Dir
done
cd "${Home}"
# Move 1 Dir Up & Rename to Parent Dir
echo "Looking for files to clean..."
Prevdir=""
find ${Root} -mmin +5 \( -iname '*.mkv' -o -iname '*.avi' \) |sort | while read File; do
Dir="$(dirname "$File")"
Parent="$(dirname "$Dir")"
if [ "${Prevdir}" != "${Dir}" ]; then
cd "${Dir}"
echo ""
echo "= $(pwd)"
echo "================================================================================================"
rm -f *.1 2> /dev/null
rm -f *.2 2> /dev/null
rm -f *.nzb 2> /dev/null
rm -f *.nfo 2> /dev/null
rm -f *.par2_hellanzb_dupe0 2> /dev/null
rm -f *.sfv 2> /dev/null
rm -f *.srr 2> /dev/null
rm -f *.segment000[0-9] 2> /dev/null
# in the middle
rm -f *[.-][Ss][Aa][Mm][Pp][Ll][Ee][.-]*.{mkv,avi,mpg,srs} 2> /dev/null
# at the end
rm -f *[.-][Ss][Aa][Mm][Pp][Ll][Ee].{mkv,avi,mpg,srs} 2> /dev/null
# at the beginning
rm -f [Ss][Aa][Mm][Pp][Ll][Ee][.-]*.{mkv,avi,mpg,srs} 2> /dev/null
# complete
rm -f [Ss][Aa][Mm][Pp][Ll][Ee].{mkv,avi,mpg,srs} 2> /dev/null
# Synology media thumbs
rm -rf @eaDir
fi
Prevdir=$Dir
done
cd "${Home}"
# Move Lonely Files 1 Dir Up & Rename to Parent Dir
echo "Looking for lonely files to promote 1 directory up..."
Prevdir=""
find ${Root} -mmin +5 \( -iname '*.mkv' -o -iname '*.avi' -o -iname '*.ts' \) |sort | while read File; do
Dir="$(dirname "$File")"
Parent="$(dirname "$Dir")"
if [ "${Prevdir}" != "${Dir}" ]; then
cd "${Dir}"
if [ "$(ls -l |grep -v 'total ' |wc -l)" = "1" ]; then
Basedir="$(basename "${Dir}")"
Newname="$(echo "${Basedir}")"
Ext=${File##*.}
Newname="${Newname}.${Ext}"
#cmd="mv \"${File}\" \"${Parent}/${Newname}\" && rmdir \"${Dir}\""
mv "${File}" "${Parent}/${Newname}" && rmdir "${Dir}"
echo "promoted: ${Parent}/${Newname}"
fi
fi
Prevdir=$Dir
done
cd "${Home}"
## TV Episodes
# Please Use FileBot Instead. Much Better Results.
#if [ "${1}" = "tvnamer" ]; then
# echo "Looking for tv episodes to rename..."
# tvnamer -r --batch /volume1/video/series
#fi
# REPORT
if [ -n "${Purged}" ]; then
echo ""
echo "Had to purge these directories cause they were damaged beyond repair:"
echo -e "${Purged}"
fi
echo "Done"
It runs every 15 minutes via cron and will remove broken downloads, unpack complete downloads, move lonely files 1 directory up,
delete a bunch of unwanted extensions, etc.
It makes a few assumptions (e.g. downloads must be in /volume1/downloads), so be sure to only use it for inspiration.
It's a work in progress, and improvements are more than welcome.
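Before wiring it into cron, you can run it once by hand to see what it does; the path below just matches the crontab entry further down, put it wherever you like:
$ chmod +x /volume1/video/unpacker.sh
$ /volume1/video/unpacker.sh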
Downloadstation CLI
To have your Synology scan a directory for new download tasks, you can use Downloadstation CLI.
$ ipkg install python24 py-pgsql py24-mx-base
$ curl https://downloadstation.jroene.de/downloadstation -ko /opt/bin/downloadstation \
&& chmod a+x $_
With the command
$ downloadstation add $nzbfile
the download will be added to the queue. If you use an adaptation of my unpacker script, it will already automatically scan /volume1/downloads/_queue for any new torrent or nzb task.
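For example, to add a torrent that is sitting in the queue directory by hand (the filename is just a placeholder):
$ /opt/bin/downloadstation add /volume1/downloads/_queue/example.torrent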
Tools
These programs may take up a little bit of space, but won't be active in
memory until you call upon them (except for cron), so feel free to
install without performance loss:
$ ipkg install vim bash bash-completion less rsync mtr \
sudo tshark htop openssl mlocate perl ack hdparm sysstat dstat \
bzip2 unrar unzip zlib p7zip wget
$ curl https://raw.github.com/timkay/solo/master/solo -ko /usr/bin/solo \
&& chmod a+x $_
Optionally do ipkg install clamav so you can run clamscan on freshly downloaded files
and check them for viruses (I decided not to).
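If you do install it, a scan of fresh downloads could look like this (assuming they live in /volume1/downloads):
$ freshclam                      # update the virus definitions first
$ clamscan -r /volume1/downloads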
Renaming Files
There's a neat program called tvnamer that will rename all your TV series files.
Install:
$ ipkg install python25 py25-setuptools git \
&& cd /volume1/@tmp \
&& git clone https://github.com/dbr/tvnamer.git \
&& cd tvnamer \
&& python setup.py install \
&& ln -s /opt/local/bin/tvnamer /usr/bin/tvnamer \
Use:
$ tvnamer -r /volume1/video/tv
FileBot is even better but requires a GUI.
Crontab
Crontab works slightly differently than on more high-level operating systems.
Here's how to edit your crontab:
$ $EDITOR /etc/crontab
Every job needs a user prefix, e.g. root:
*/15 * * * * root /usr/bin/solo -port=1111 /volume1/video/unpacker.sh > /volume1/@tmp/unpacker.log 2>&1
When you're done editing the new crontab, reload it by executing:
$ /usr/syno/etc.defaults/rc.d/S??crond.sh stop
$ /usr/syno/etc.defaults/rc.d/S??crond.sh start
Tmux or Screen
If you start programs from within tmux, you can
close your SSH session without killing them. You can check back on them later
with tmux attach || tmux.
This makes it perfect to run cleanup/rename scripts in while you're still experimenting and need to check up on them regularly.
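For example, to start (or re-attach to) a session and kick off the unpacker inside it:
$ tmux attach || tmux
$ /volume1/video/unpacker.sh    # run inside the session; detach with Ctrl-b d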
Tmux is similar to screen,
but I think it's a bit easier to deal with (just tmux attach || tmux is all).
However, screen is a lot easier to install thanks to ipkg, so pick your poison.
Screen
$ ipkg install screen
Tmux
$ ipkg install libevent optware-devel ncurses-dev
# https://forum.synology.com/enu/viewtopic.php?f=90&t=30132
$ mkdir /opt/arm-none-linux-gnueabi/lib_disabled \
&& mv /opt/arm-none-linux-gnueabi/lib/libpthread* /opt/arm-none-linux-$ gnueabi/lib_disabled \
&& cp /lib/libpthread.so.0 /opt/arm-none-linux-gnueabi/lib/ \
&& cd /opt/arm-none-linux-gnueabi/lib/ \
&& ln -s libpthread.so.0 libpthread.so \
&& ln -s libpthread.so.0 libpthread-2.5.so
$ cd /volume1/@tmp \
&& wget https://sunet.dl.sourceforge.net/project/tmux/tmux/tmux-1.4/tmux-1.4.tar.gz \
&& tar -zxvf tmux-1.4.tar.gz \
&& cd tmux-1.4 \
&& export CC=gcc \
&& export CFLAGS="-L /opt/lib -I /opt/include/ncurses" \
&& ./configure --prefix=/opt # prefix is not supported. So we'll need some symlinks \
&& make # This will take a while \
&& make install \
&& ln -s /opt/lib/libevent-1.4.so.2 /usr/lib/libevent-1.4.so.2 \
&& ln -s /opt/share/terminfo/* /usr/share/terminfo/
Legacy Comments (26)
These comments were imported from the previous blog system (Disqus).
Hi,
thank you for your great tutorial, but I cannot get the crontab part working.
I am somewhat of a newbie to Synology and bad at using the command line, but is it possible that there is a mistake in the crontab edit and /video/ should be /downloads/?
Anyway, my crontab refuses to do what I'm trying; I have to do it manually...
Any ideas?
best regards
Hello,
Thank you for the tutorial. I have only one issue where you may be able to help me with.
When I use: downloadstation add <nzbfile>, the download appears in the download queue, but a few seconds later it is marked as 'Broken link' and the download fails.
When I add the same nzbfile to the queue through the Synology Download Station app it does work as expected.
Did you run into that problem? I use a DS211, by the way.
kind regards
@ uninb: Did you add the root user? Did you restart cron after making changes?
@ René: Yeah, turns out CLI has no support for nzb and never will get that either :/
Still looking for a good solution for that..
The script is a really great idea for the auto check / unrar. Do you know how I could check only the real par2 files (not the .vol files), so that file.par2 is taken but file.vol000+01.par2 is not?
It's interesting because sometimes there is more than one file in an nzb, so there is more than one par2 file that is needed (file.par2, test.par2 and Newtest.par2). There will be 3 rars and 3 different par2 sets. So instead of removing all the par2 files it would be great to remove only the files that are being tested. For example, when file.par2 is done, only files starting with file.vol*.par2 (and file.par2) are deleted, not the par2 files for test.par2.
Thank you
@Kevin: I might have found kind of a workaround for loading NZB files.
It appears that Downloadstation CLI expects a URL to load the NZB file from. That's why the link appears as 'broken' or 'unknown': downloadstation cannot find the path to the NZB file.
What I have done now is move the NZB file to a directory on the internal webserver of the diskstation. I then use the URL to this file, e.g. http://diskstation/nzb/filename.nzb, as a parameter for Downloadstation CLI. So the command line becomes:
/opt/bin/downloadstation add http://diskstation/nzb/filename.nzb
I don't delete the NZB file after adding it to downloadstation, because it would then again turn into a broken link.
Instead, I let the unpacker script remove the NZB file when the download is completed and unpacked in the cleanup portion of the script.
Excellent article... I'm using SABnzbd at the moment but I'm sorely tempted to switch usenet downloading to Synology's own Download Station after reading this. I've had problems updating SABnzbd to the newest versions on occasion.
I created a script to download torrents from RSS feeds (made before I was using a DiskStation). I modified it to use jroene's DownloadStation-CLI to load the torrents; it could easily be modified to download nzbs via RSS. Using the comments from René, they could be dropped into a directory on the internal webserver of the DiskStation and loaded from there.
Just need to see if there's a way to generate a custom RSS feed to select the nzbs that I want now :D
I followed your guide, but cannot seem to get my full connection speed. I have 120Mbit, but the speed is always around 6MB/s, which is not even half of what I have. Is this really the max? This is rather disappointing for a NAS marketed as a download box.
FileBot will support a CLI with the next release. There is a test version available right now; check the forums on SourceForge for details.
Thank you very much for the script.
I had the problem that after downloads finished, the whole volume1/downloads folder was disappearing. After a bit of poking around, I found out that the problem was some of the downloads ended on ".par2", so the find command tried to repair those. The command "-type f" limits the search to files, and everything works just fine. Thanks again!
I am getting the following error message when I try running the unpacker.sh script... any idea what I am doing wrong? :)
[CODE="Javascript"]
: command not found:
: invalid optione 5: set: +
set: usage: set [--abefhkmnptuvxBCHP] [-o option] [arg ...]
: command not found:
unpacker.sh: line 41: syntax error near unexpected token `fi'
'npacker.sh: line 41: ` fi
re: "After testdriving it for a while I was never able to get it to download above 3MB/s (2 average)." and "The j is just not powerful enough to do SABnzbd at these speeds"
After installing SABnzbd and before it is restarted I can only get 3 Mbps, and shutting down other services did not help. Only after restarting SABnzbd can it max out my bandwidth at 6 Mbps. Note that the Synology box may also have to be restarted. Mine is restarted daily because I schedule it to power down on workdays during peak electricity rates.
I only found out about this several days after installation, after restarting SABnzbd in an attempt to start the queue. Restarting SABnzbd did not fix the queuing problem (it turned out to be caused by a schedule to pause all activities during that period), but it did speed up my downloads.
Software used:
DSM 3.2-1922; Build Date: 2011/09/06.
SABnzbd 0.6.9 spk is from synoblog.superzebulon.org/2...
Hardware:DS-211j
Settings:
Download threads: 12 SSL connections, from Astraweb.
Hi Kevin,
Great article, makes me want to buy a DS411. I already have fun_plugged a CH3MNAS to be an automatic RSS torrent scanner/downloader, but I now have the cash to take it a step further; I want my own PHP/MySQL server and do the download thingy at the same location.
So my question is: when I go berserk on inserting neat little ipkg files in that NAS, will I compromise the server functionality? I mean, you literally state: "Here's how I turned my budget NAS that's mediocre at 8 things into a more powerful one that's good at 3 things: downloading / file serving / backups."
Best,
Vince
Just a small modification (I just had to make the adjustments again because of the DSM update): The URL for the curl command to download "solo" should be https://raw.github.com/timk... if you don't want to download a redirected info page. Took me a moment to figure out why that can't be executed. ;-)
FileBot has an awesome CLI, working very well for a few months now.
Is it possible to install the FileBot CLI on a Synology NAS with a PPC processor as well?
the section "turning off media indexers" really speed things up, thanks for the tips.
still waiting for Synology to fix HTTP downloads in Download Station, I can only get 30-150KB/s with HTTP while I can get 600-900KB/s with torrents.
I'm really having difficulty editing or inputting a new line in crontab.
I've gone into the file using vim /etc/crontab,
but I can't enter a new line. I have messed around and managed to put a line between the two which are already in there, but when I try to :quit it won't let me.
Hi ! Thanks a lot for your script.
I'm having an issue with folders that have multiple episodes in them, for example a folder containing all the rar/par files for a whole season.
The script will actually process the first file, then remove all the remaining ones. Is there any way you could look into improving your script?
Thanks !
Hello, thanks for your advice. I have a DS211j which creates thumbnails for my 50000 pictures very, very slowly.
Do you know a way to create the thumbnails on the PC and then upload them to the NAS?
Thanks for your attention. Bye
Hi Kevin,
Same issue as frey when launching the script: syntax errors are found...
Any idea?
Marcello, upload the image files using Synology Assistant. That way, the PC on which Assistant runs will do the computing for thumbnail generation.
Great introduction to running your Syno at full speed. Thanks!
The redirect in the crontab file is not working (on my Syno). I changed this:
1>&2 >
into:
2>&1 >
(meaning stderr is redirected to the same file as stdout)
Regards,
Hugo
The problem was in solo, not in the redirection of stdout/stderr: I had a piece of HTML as the program instead of the perl code... See also comment #13.
Gr. Hugo
Hello,
I was very interested in the Download CLI section. Your article gives the impression it can handle nzb files, which it cannot.
Email received from Matthias Radig:
The script does not support NZB downloads and I do not plan to add this feature myself.
However, as the script is licensed under the GPL, someone else might in
the future.
Have I missed something?
Thanks to René's post on 30 April 2011. It seems even Matthias Radig (the creator of Download CLI) is unaware of its ability to handle nzb files.
Hello all.
First of all, thanks for a great blog. Very useful.
But I'm having a problem with SAB and my Synology NAS 207+ which I hope you guys could help out with.
I'm experiencing that the SAB program stops regularly on my NAS. If I download at 200kbps it can stay running for half a day; if I turn the speed up to 800kbps it normally shuts down after 2-3 hours. I'm running SAB, Couch Potato, Python and Sickbeard on my NAS. They are all from Superzebulon.
I have followed your blog, and I have stopped the processes that I don't use. But it doesn't seem to make a difference. I also tried following this tutorial (crontab job):
http://thanatosblog.wordpre...
And under "trouble with sab", I tried following the part where I entered a line in crontab.
But my problem is that I can't access crontab. Even with the commands:
crontab -e
or
vim crontab
Putty tells me: crontab: file not found
And I can't, for the life of me, figure out what to do next. So even with the tweaks on this great blog, I'm still experiencing that SAB is shutting down. Could someone please help me figure out how to avoid this? And how do I get access to the crontab as described in this blog?
Sorry for partially referring to another blog, but I just don't know where else to ask.
Daniel