General NAS-Central Forums

PostPosted: Thu Sep 18, 2014 6:29 pm 

Joined: Thu Sep 18, 2014 6:14 pm
Posts: 2
I have the original 5big ver 1. I have lost the username/password to access the drive. I have tried resetting the drive as per the instructions on the LaCie website, without luck. I connected each drive to a PC to verify that none of the drives had failed, but they all pass HD Tune. Does anyone have any idea how to reset this unit? Could something in the firmware be corrupt and preventing the reset? Any ideas would be appreciated.

Thanks


PostPosted: Sat Sep 20, 2014 5:19 pm 

Joined: Mon Jun 16, 2008 10:45 am
Posts: 6086
A factory reset is actually an erase of the raid array on /dev/sd[abcde]9 and an erase of /dev/sd[abcde]10. This is done by the initial rootfs on /dev/sda7, which is never changed. On boot /dev/sda7 is mounted read-only, and it assembles a layered filesystem from the raid arrays on /dev/sd[abcde]8 and /dev/sd[abcde]9, where the array on --8 contains the original firmware and the one on --9 all modifications (user settings, firmware updates, ...). After assembling, this new filesystem is used as root.
Further, the rootfs on /dev/sda7 also handles the factory reset. As this filesystem never changes, it's highly unlikely that a firmware error could prevent a factory reset.
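Roughly, what the initial rootfs does at boot looks like this. This is only an illustration: the partition numbers are as described above, but the exact union mount syntax depends on which layered filesystem the firmware actually uses:
Code:
# assemble the two raid arrays (illustration only)
mdadm --assemble /dev/md8 /dev/sd[abcde]8   # original firmware
mdadm --assemble /dev/md9 /dev/sd[abcde]9   # user settings, firmware updates, ...
mount /dev/md8 /mnt/md8
mount /dev/md9 /mnt/md9
# stack the writable --9 layer over the read-only --8 layer
# (assuming a unionfs-style mount; aufs or another union fs may be used instead)
mount -t unionfs -o dirs=/mnt/md9=rw:/mnt/md8=ro unionfs /newroot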

I can think of two reasons why it doesn't work for you:
  • A hardware error. If the button or the red LED is defective, it's hard to trigger a reset.
  • (Maybe) a damaged filesystem on /dev/sd[abcde]9. I can't remember whether this filesystem is recreated or merely cleared. In the latter case it has to be mounted first, and a filesystem error could prevent that (see the sketch after this list).
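If you suspect the second case, you can check that filesystem from a PC before doing anything destructive. A sketch, assuming the disks show up as sdb..sdf and the array holds an ext3 filesystem (the actual filesystem type may differ):
Code:
mdadm --assemble /dev/md0 /dev/sd[bcdef]9
fsck -n /dev/md0   # -n: only report problems, don't change anything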

The only workaround I can think of is a manual reset. Can you connect all disks to a PC simultaneously?


PostPosted: Tue Sep 30, 2014 3:10 pm 

Joined: Thu Sep 18, 2014 6:14 pm
Posts: 2
Currently I do not, but I can get a SATA card in order to do so. What is the process once they are all connected?


PostPosted: Tue Sep 30, 2014 3:45 pm 

Joined: Mon Jun 16, 2008 10:45 am
Posts: 6086
Boot the system into Linux and find the 5 disks:
Code:
cat /proc/partitions
I'll assume the 5 disks are sdb, sdc, sdd, sde and sdf.
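If it isn't obvious which entries belong to the NAS, remember that each LaCie disk carries the partitions 7 through 10 mentioned above, so the tenth partitions give them away:
Code:
ls /dev/sd*10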
Assemble the raid array on partitions 9:
Code:
su
mdadm --assemble /dev/md0 /dev/sdb9 /dev/sdc9 /dev/sdd9 /dev/sde9 /dev/sdf9
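You can verify that the array came up with all five members before mounting it:
Code:
cat /proc/mdstat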
Mount the array:
Code:
mkdir -p /tmp/mountpoint
mount /dev/md0 /tmp/mountpoint
and delete the contents:
Code:
cd /tmp/mountpoint
rm -fr *
Unmount it and stop the array:
Code:
cd /
umount /tmp/mountpoint
mdadm --stop /dev/md0

Then erase all partitions 10:
Code:
dd if=/dev/zero of=/dev/sdb10
dd if=/dev/zero of=/dev/sdc10
dd if=/dev/zero of=/dev/sdd10
dd if=/dev/zero of=/dev/sde10
dd if=/dev/zero of=/dev/sdf10
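Each dd will run until its partition is full and then exit with a "No space left on device" message; here that is expected, not an error. The same wipe as a loop, purely as a compact alternative:
Code:
for d in sdb sdc sdd sde sdf; do dd if=/dev/zero of=/dev/${d}10; done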


Oh, and mark the disks before you remove them from the NAS. The sequence is not important as long as everything works fine, but if you ever need to repair the data raid array, it can be important.
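One way to record which disk is which is to note each serial number and write the slot number on the disk itself. Assuming hdparm is installed (smartctl -i from smartmontools shows the same information):
Code:
hdparm -I /dev/sdb | grep 'Serial Number'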

