Author Topic: 3TB Disk Woes  (Read 3260 times)
madmike
Newbie
Karma: 0
Posts: 13

« on: March 26, 2013, 09:56:36 AM »

I am having a nightmare getting a new external 3TB USB disk to work under Linux.

Sheevaplug (Linux hardy 2.6.30.2 #11 PREEMPT Wed Jul 22 19:53:31 MDT 2009 armv5tel GNU/Linux)

When I bought it, it arrived with a single 3TB NTFS partition, which I tested successfully under Windows without a problem. When I tried to reformat it into two partitions under Linux using fdisk or cfdisk, it only showed up as an 800GB disk. After some reading I learned the difference between MBR and GPT partition tables, and that these two utilities do not support the required GPT. So I tried to use parted, but ran into the following:

Warning: /dev/sda contains GPT signatures, indicating that it has a GPT table.  However, it does not
have a valid fake msdos partition table, as it should.  Perhaps it was corrupted -- possibly by a
program that doesn't understand GPT partition tables.  Or perhaps you deleted the GPT table, and are
now using an msdos partition table.  Is this a GPT partition table?
Yes/No? Yes                                                               
Error: Invalid argument during seek for read on /dev/sda                 
Retry/Ignore/Cancel? I                                                   
Error: The backup GPT table is corrupt, but the primary appears OK, so that will be used.
OK/Cancel? O                                                             
Backtrace has 0 calls on stack:
                                                                         

You found a bug in GNU Parted! Here's what you have to do:
...   
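
For reference, once the kernel sees the disk at its full size, creating the two partitions with parted would look roughly like this. This is only a sketch: the device name /dev/sda, the ext3 filesystem type, and the 50/50 split are assumptions, and the exact unit syntax varies between parted versions.

parted /dev/sda
(parted) mklabel gpt
(parted) mkpart first ext3 0% 50%
(parted) mkpart second ext3 50% 100%
(parted) print
(parted) quit

Note that mklabel gpt destroys the existing partition table, and that on GPT disks the first argument to mkpart is a partition name rather than primary/extended.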


Then someone recommended that I use gdisk, but I cannot work out how to install it. Can someone please help? Is there a way to install it with apt-get?
I have downloaded the source and tried to compile it, but I get several errors, which I have put at the end.

Could someone please help get this disk up and working on my Sheevaplug?

Many thanks in advance,

Madmike.


Make error messages:

hardy:~/gptfdisk-0.8.1# make
g++ -Wall -D_FILE_OFFSET_BITS=64 -D USE_UTF16   -c -o gptpart.o gptpart.cc
gptpart.cc:19:28: error: unicode/ustdio.h: No such file or directory
In file included from gptpart.h:22,
                 from gptpart.cc:27:
parttypes.h:7:29: error: unicode/ustream.h: No such file or directory
In file included from gptpart.h:22,
                 from gptpart.cc:27:
parttypes.h:59: error: ‘UnicodeString’ does not name a type
In file included from gptpart.cc:27:
gptpart.h:57: error: ‘UnicodeString’ does not name a type
gptpart.h:64: error: ‘UnicodeString’ does not name a type
gptpart.h:78: error: ISO C++ forbids declaration of ‘UnicodeString’ with no type
gptpart.h:78: error: expected ‘,’ or ‘...’ before ‘&’ token
gptpart.cc:57: error: ‘UnicodeString’ does not name a type
gptpart.cc:73: error: ‘UnicodeString’ does not name a type
gptpart.cc: In member function ‘void GPTPart::SetType(PartType)’:
gptpart.cc:100: error: ‘GetDescription’ was not declared in this scope
gptpart.cc:100: error: ‘class PartType’ has no member named ‘UTypeName’
gptpart.cc: In member function ‘void GPTPart::SetName(const std::string&)’:
gptpart.cc:110: error: ‘UnicodeString’ was not declared in this scope
gptpart.cc: At global scope:
gptpart.cc:115: error: ISO C++ forbids declaration of ‘UnicodeString’ with no type
gptpart.cc:115: error: expected ‘,’ or ‘...’ before ‘&’ token
gptpart.cc: In member function ‘void GPTPart::SetName(int)’:
gptpart.cc:116: error: ‘theName’ was not declared in this scope
gptpart.cc:120: error: ‘UChar’ was not declared in this scope
gptpart.cc:120: error: expected primary-expression before ‘)’ token
gptpart.cc: In member function ‘void GPTPart::ShowSummary(int, uint32_t)’:
gptpart.cc:174: error: ‘UnicodeString’ was not declared in this scope
gptpart.cc:174: error: expected ‘;’ before ‘description’
gptpart.cc:196: error: ‘GetDescription’ was not declared in this scope
gptpart.cc:196: error: ‘description’ was not declared in this scope
gptpart.cc: In member function ‘void GPTPart::ShowDetails(uint32_t)’:
gptpart.cc:228: error: ‘GetDescription’ was not declared in this scope
gptpart.cc: In member function ‘void GPTPart::ChangeType()’:
gptpart.cc:274: error: ‘GetDescription’ was not declared in this scope
gptpart.cc:274: error: ‘GetUTypeName’ was not declared in this scope
make: *** [gptpart.o] Error 1
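
A note on the build failure above: the missing unicode/ustdio.h header and the UnicodeString errors come from gptfdisk's optional UTF-16 partition-name support, which the Makefile enables with -D USE_UTF16 and which requires the ICU library's development headers. Two possible fixes, sketched here assuming a Debian/Ubuntu-style system (the exact package name on hardy is an assumption and may differ):

# Option 1: install the ICU development files, then rebuild
apt-get install libicu-dev
make

# Option 2: build without Unicode partition-name support:
# edit the Makefile, remove "-D USE_UTF16" from the compiler flags, then
make

The README shipped with the gptfdisk source describes this choice.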

 

madmike
Newbie
Karma: 0
Posts: 13

« Reply #1 on: March 27, 2013, 09:41:40 AM »

OK, so I fully read the included README file ;) and was able to compile gdisk, but even gdisk reports the disk as 800GB (where MacOS and Windows see it correctly as a 3TB disk). Can someone please help me get my Sheevaplug to recognise this disk correctly?
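
Since fdisk, parted, and now gdisk all report the same wrong size, the truncation is probably happening below the partitioning tools, in the kernel or the USB layer. A quick way to check what the kernel itself believes the disk size to be (a sketch, assuming the drive appears as /dev/sda):

# size in bytes according to the block layer
blockdev --getsize64 /dev/sda
# size in 512-byte sectors
cat /sys/block/sda/size
# what the usb-storage driver logged when the disk was plugged in
dmesg | grep -i sda

If these also come back as roughly 800GB, no partitioning tool will help: the limit is in the kernel or the enclosure rather than in the partition table.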


madmike
Newbie
Karma: 0
Posts: 13

« Reply #2 on: April 04, 2013, 06:26:13 AM »

So I am now booting into Debian Squeeze from an image kindly given to me by NewIT, which is said to be fully GPT aware. I have downloaded parted and the GPT disk tools, and they still see the disk as 800GB. Help! Please! Someone!

jkwilborn
Newbie
Karma: 0
Posts: 4

« Reply #3 on: April 16, 2013, 11:31:26 AM »

Madmike, I don't know how useful this may be, but not long ago I read somewhere that there is still a limit on disk size on some computers and software. Do you think that could be the problem? I run Debian Linux on my home machine and on my Dreamplug, and somewhere I ran into a discussion that the OS was about at its limit for disk size, and that this had been fixed on some systems. Unfortunately I don't remember where or when I read this, or what it applied to, as I only have one 1TB disk around here. It wasn't long ago.

Just a thought: you might check whether that size of disk is really supported or not; it may be a quick diagnosis. Again, I don't know how useful this may be, but I would check if you haven't already.

Jack

apemberton
Newbie
Karma: 1
Posts: 31

« Reply #4 on: April 20, 2013, 05:15:34 AM »

I may be wrong, but with a 2.6.30 kernel, I think the size limit is 2.2TB. Is this possibly a cause?
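
The numbers in this thread are consistent with that. Sketching the arithmetic, on the assumption of 512-byte sectors and a sector count truncated to 32 bits:

2^32 sectors x 512 bytes = 2,199,023,255,552 bytes, i.e. about 2.2TB
a 3TB disk has about 5,860,533,168 sectors
5,860,533,168 mod 2^32 = 1,565,565,872 sectors, i.e. about 801GB

So a 3TB drive seen through a 32-bit sector count would show up as roughly 800GB, which is exactly what every tool on the plug has been reporting.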

Tony Pemberton
