Question Good bittorrent client?


So I found this thread and thought I'd stop by for help. I have been running Safihre's Deluge client on my DSM but the problem is the WebUI is constantly timing out and I can barely get a connection, and whenever I tried to set it up with my local PC Deluge and connect via Connection Manager it never worked. So I'm kinda getting sick of Deluge and looking for other options.

I have tried Docker in the past and damned if I can figure out how to get from an Image to a working piece of software I can use. Even if I manage to create a Container, I have no idea what the URL is or how to config. I have no plans to connect externally, I would access everything from my LAN via PC, I just don't want to keep a client running on my desktop all the time to seed stuff on the NAS.

I've put the linuxserver.io versions of Deluge and qbittorrent images on Docker but can't figure out where to next, and as oRBIT has mentioned, they don't make it noob-friendly.

Any quick and painless tips on how I actually progress beyond an Image or access the application after I create a Container?
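Since the question is how to get from an image to something usable: not an exact recipe for your setup, but a minimal docker-compose sketch for the linuxserver qbittorrent image — every path, ID, and port below is an example value, not something taken from this thread:

```yaml
version: "3"
services:
  qbittorrent:
    image: linuxserver/qbittorrent
    environment:
      - PUID=1026        # numeric uid of your DSM user (run `id yourname` to find it)
      - PGID=100         # numeric gid (DSM's "users" group is often 100)
      - TZ=Europe/London
      - WEBUI_PORT=8080
    volumes:
      - /volume1/docker/qbittorrent/config:/config
      - /volume1/downloads:/downloads
    ports:
      - 8080:8080        # WebUI
      - 6881:6881        # torrent traffic
      - 6881:6881/udp
    restart: unless-stopped
```

After `docker-compose up -d` on the NAS, the "URL" is simply http://<NAS-IP>:8080 from any browser on the LAN.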
 
Transmission
Deluge
Download Station

Not much left :)

How about qbittorrent in Docker?
 
With what error exactly?

There was no error at all. The files just remained where they were.

This is what I've set:
[attached screenshots: 2020-07-05_12-24-50.jpg, 2020-07-05_12-24-00.jpg]
 
A horrible option I have written off many times before. I've also tried Transmission.
Download Station is Transmission with a Synology web UI over it. If you need a hand running qbittorrent or deluge inside a container, please start a new topic describing what image you used and what exact steps you took, or PM me and I will help you with it in private.
 
There was no error at all. The files just remained where they were.
This sounds like the problem I had (and still have): Sonarr couldn't move the files. I played with permissions and PUID/PGID all day and nothing was accomplished. My half-baked solution was to create a folder...
/docker/sonarr/tv series
for the transferred files to reside. That worked! QBT had no issue. Yet, it would not deliver files to
/video/tv series.
 
Can you verify that the container has indeed created that gmovies folder, by using a bash shell inside it and browsing to the folder?

I bashed inside the container and I see the gmovies folder. And I can also browse inside that folder that is actually the movies folder on my NAS.

I actually just decided to re-check the settings inside the application itself. Perhaps I misread:

[attached screenshot: 1595010005709.png — qBittorrent download settings, with red and yellow highlights]


The setting highlighted in red appears to only move a .torrent file to the specified location, not the actual downloaded data.
So I changed the 2 settings highlighted in yellow. I would now expect it to perform the download and keep the data inside the /docker/ volume. Only once it finishes can it then move the file into my regular share with movies, which happens to also sync with 2 other NASes.

I'll test this out.
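For anyone following along: that scheme only works when the container-side paths used in those two settings match the volume mappings. A sketch of the idea (host paths here are examples, not taken from the screenshot):

```yaml
volumes:
  - /volume1/docker/qbittorrent/downloading:/incomplete   # in-progress data stays on the docker volume
  - /volume1/movies:/movies                               # finished files land in the synced share
```

Inside qBittorrent you would then point the incomplete-downloads setting at /incomplete and the final save path at /movies.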
 
There was no error at all. The files just remained where they were.
To follow up on this (as I've made some headway)... Originally I wanted sonarr to transfer files to /video/tv shows/* (the shared folder created by Video Station). I could never get the transfer to work, so I added a /tv shows folder under /docker/sonarr/tv shows (that worked).

But I never liked that... so this week I created a shared folder "media" and moved my files from /video/tv shows (subsequently I uninstalled Video Station and deleted the "video" shared folder).

I assigned "user1" as the owner of /media/tv shows/* and updated the sonarr docker to this path. Success... files transferred... BUT... I was using PUID/PGID for "admin1" under sonarr... and I edited this to PUID/PGID for "user1"

Guess what? No transfer.

Next, I added PUID/PGID for "user1" to the qbittorrentvpn docker... same result, no transfer.

Then I did an odd thing... I changed sonarr's PGID to the administrator's (leaving its PUID as "user1")... and success... downloaded files transferred to /media/tv shows/*

When I checked the ownership of the transferred file, it said "admin1". Huh? So both qbittorrentvpn and sonarr dockers used PUID for "user1", but the file ended up with "admin1" as the owner. That seems illogical to me, but I am no Linux master.
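A possible explanation for the "wrong owner" surprise (my reading, not something verified against your setup): PUID/PGID are plain numbers, and `ls -l` translates a file's numeric owner back through the NAS's user database, so the name you see depends on which DSM account happens to hold that uid. Checking the raw numbers removes the ambiguity:

```shell
# id prints the numeric uid/gid behind an account; these numbers are what
# PUID/PGID actually carry into the container. Shown here for the current user:
id -u "$(whoami)"
id -g "$(whoami)"
# ls -ldn prints a folder's raw numeric owner and group, skipping name lookup,
# so you can compare it directly against the container's PUID/PGID values:
ls -ldn "$HOME"
```

On the NAS you would run `id user1` and `id admin1` and compare those numbers with what the dockers are configured with.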

The only variable I've yet to try is UMASK, which the qbittorrentvpn docker allows to be set (another variable I have no understanding of), so I will tinker some more.
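On UMASK, since it came up: it simply masks permission bits off whatever the process creates, which decides whether a different account (like the one Sonarr runs as) can read or move the files afterwards. A quick illustration outside any container:

```shell
# UMASK strips bits from the default creation mode: files start at 666, dirs at 777.
# With umask 022, a new file gets 666 & ~022 = 644 (rw-r--r--).
f=$(mktemp -d)/demo
(
  umask 022     # the value you would pass as UMASK=022 to the container
  touch "$f"
)
stat -c '%a' "$f"   # prints 644 on Linux
```

So UMASK=022 leaves downloads group- and world-readable, while something like 077 locks them to the owner alone — which could easily reproduce a "no transfer" symptom.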

But back to @Shadow's situation... changing sonarr's PGID to the administrators' allowed the transfer to my newly created shared folder. (I have no idea whether the system-created folder "video" had any peculiarities that prevented the sonarr transfers.) Just a thought (or a ramble).

All that said, this seems somewhat academic, as there are no new TV episodes to look forward to in our current times, and my sonarr calendar is devoid of future downloads.
 
Long-time DS209 owner (tricked into thinking it was a DS211j for DSM versions), I have 2x 3TB drives with 75k+ power-on hours under my belt (8.5 YEARS). I just got a DS918+, migrated all of my data over, and started playing with Docker, which was not an option before.

I have read the instructions here and on the linuxserver/docker-deluge page carefully. I set up the package carefully via docker compose as suggested with default values and logical volume mapping.

I have the container up and running and can get access to the webUI via default port 8112. I have zero issues accessing the docker container via webUI. What I am trying to do is get the daemon access set up for remote login going. The goal here is to kick off the deluge docker process via a local machine daemon logged in instance, then be able to shut that machine down and let the DS918+ do its thing. I had the DS209/fake DS211j set up to do this.

On my Windows 10 machine with the Deluge client, I have tried to connect to my docker instance using 192.168.1.101:58846 with the default login credentials, admin/deluge. I've also done my best to recreate the instructions here: [Solved] Trying to setup a client/server connection on my LAN - Deluge Forum, but I cannot get a valid connection to the daemon to save my life.

Any advice? I am willing to pay someone to remote in and look at this, it is driving me NUTS. I need a client that can be installed on the DS918+ that has SOCKS5 auth proxy capability and can connect with my Windows 10 machine to initiate torrent downloads.
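One assumption worth ruling out (your compose file isn't shown, so this is a guess): a working WebUI on 8112 doesn't prove the daemon port is reachable. If the container runs in bridge networking, 58846 must be published explicitly alongside the WebUI port, e.g.:

```yaml
ports:
  - 8112:8112     # WebUI (already working for you)
  - 58846:58846   # deluged daemon port, needed for thin-client connections
```

With host networking this mapping is unnecessary, but then it's worth confirming nothing else on the NAS already occupies 58846.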
 
On my Windows 10 machine with the Deluge client, I have tried to connect to my docker instance using 192.168.1.101:58846 with the default login credentials, admin/deluge. I've also done my best to recreate the instructions here: [Solved] Trying to setup a client/server connection on my LAN - Deluge Forum, but I cannot get a valid connection to the daemon to save my life.
Have you tried creating the auth and core.conf files as explained in the link? Of course, you will have to make these changes at the container level, not in the locations mentioned in the Deluge help, since those refer to a Package Center (bare-metal) install of Deluge.
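To sketch the auth entry itself: it's a single user:password:level line appended to the daemon's auth file, which in the linuxserver image sits at /config/auth. The credentials below are placeholders, and AUTH_FILE defaults to a throwaway path so the snippet can be tried outside the container:

```shell
# Append a thin-client user to deluged's auth file.
# Inside the container the real file is /config/auth.
AUTH_FILE="${AUTH_FILE:-/tmp/auth}"
printf '%s:%s:%s\n' "remoteuser" "changeme" "10" >> "$AUTH_FILE"   # 10 = full admin access level
grep '^remoteuser:' "$AUTH_FILE"
```

Restart the container afterwards so deluged rereads the file.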
 
Have you tried creating the auth and core.conf files as explained in the link? Of course, you will have to make these changes at the container level, not in the locations mentioned in the Deluge help, since those refer to a Package Center (bare-metal) install of Deluge.
That's where I'm running into the problem. Searching inside the running docker container returns 2 results for auth:
Bash:
root@synologyfZ:/# find / -name "auth" 2> /dev/null
/var/lib/pam/auth
/config/auth

On a package center install of deluge you see:
Bash:
/volume1/@appstore/deluge/var/auth
cat /volume1/@appstore/deluge/var/auth
localclient:3e65eff431a1cb44685abf534dfdb4bf13e4d898ddd:10
(No, this is not the actual hashed password; it's the example copied from the link.) Then you use localclient as the username and the 3e65eff431a1cb44685abf534dfdb4bf13e4d898ddd hash as the password.

I also tried using the credentials from /config/auth (not sure this is the right auth for daemon access) and can't get connected.

The core.conf change was made:
Bash:
"allow_remote": true,

So my issue is with auth for the docker instance. Is the /config/auth string the correct one to use for daemon access or do I need another auth?
 

SynoForum.com is an unofficial Synology forum for NAS owners and enthusiasts.