Hey guys, new to this forum and looking to confirm the bad news. Is this thing bricked?
I've had a DS718+ for the last 4 yrs or so, with two 4TB WD Red drives in SHR, Btrfs.
This is primarily storage for family pics. I also ran Pi-hole and UniFi in Docker containers. Since day 1, I've had an additional 8GB of RAM installed:
Crucial RAM 8GB DDR3 1600 MHz CL11 CT102464BF160B
I know it's unsupported, but it's been solid all these years with 10GB total.
I've always kept my NAS up to date over the years. On May 18 I said yes to DSM 7.1, and the update failed. Now the NAS won't boot: I just get the status light blinking orange.
I've gone through tech support level 1 and am now on level 2. I've run extended tests on both drives (WD Data Lifeguard Diagnostics); each took nearly 8 hours, and both drives are fine.
I've attempted a reinstall onto a blank SSD, and it always fails with: "Failed to install the file. The file is probably corrupt. (13)"
I've verified the MD5 hash on every download and tried every version imaginable; it always fails the same way.
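For anyone following along, this is roughly the check I mean (run on my desktop, not on the NAS); the .pat filename below is only an example, so substitute whichever build you downloaded:

md5sum DSM_DS718+_42962.pat
# compare the printed hash against the MD5 listed on Synology's Download Center page

The hashes have matched every time, so the downloads themselves aren't the problem.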
There's a support article, "What can I do to troubleshoot DSM update problems?", that lists 3 possible reasons for this failure: bad RAM, bad sectors on the drives, or a system partition that's too small:
www.synologythailand.com
I confirmed that after formatting a spare SSD and letting the DSM installation fail, the SSD contains two system partitions: 2.38GB and 2.00GB.
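If anyone wants to double-check the same thing on their unit, the partition sizes are also visible from the telnet shell; a rough sketch, assuming the install drive shows up as /dev/sda (the device name may differ on your box):

cat /proc/partitions    # sizes of all partitions, in 1K blocks
fdisk -l /dev/sda       # partition layout of the install drive

Either command should show the two small system partitions mentioned above.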
I also tried all of the above with only the original 2GB module installed, and again with only the 8GB module installed.
I can also telnet to the NAS and log in as root, though the BusyBox ash shell has a limited set of commands; I don't think I can get anything useful out of this.
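In case it matters, this is the sort of thing that shell can still run; these are standard BusyBox applets, though the log path at the end is a guess on my part:

dmesg | tail -n 50       # recent kernel messages from the failed install attempt
cat /proc/mdstat         # state of the md (RAID) arrays, including the system partition
cat /var/log/messages    # system log, if it even exists at this path in recovery mode

So far I haven't gotten anything useful out of it, but maybe someone here knows what to look for.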
I'm seriously questioning sticking with a Synology box. Anyone ever see this issue?