• Defrag nightmare

    #492254

    Please recommend a good (must be safe) hard drive defrag program.

    Had a defrag nightmare the day after Turkey dinner/Black Friday.
    Making it worse, the drive was a 2TB all-data drive (SATA-connected), with only about 400GB of space left.
    Story:
    The frequently accessed drive was badly fragmented.
    Defragged it with a well-known, free, ‘reliable’ program … [be ‘gentle’ re ‘free’]
    It took too long (over 24 hours), and I had to stop it for urgent work. I stopped it properly, per the application. No more defragging after that.
    More than 2 weeks later, the hard drive suddenly became unavailable. Symptom: not accessible by Windows; the drive letter disappeared.
    Diagnosis:
    Hardware-wise, OK.
    Found the file table corrupted, including the twin copy! Hence, unable to recover.

    Had to do lengthy data recovery … on a 2TB drive. I had a backup, but the newer data had to be recovered.
    Hard drive reformatted; it still works.

    A defrag software recommendation would be highly appreciated.

    • #1426523

      A few years ago, I bought Perfect Disk, from Raxco Software. It’s pretty good, but I always image before letting it run – it’s never good if something goes wrong, and I prefer to play it safe.

      P.S.: They are offering a 35% discount as a cyber week promotion.

      • #1426971

        A few years ago, I bought Perfect Disk, from Raxco Software. It’s pretty good, but I always image before letting it run – it’s never good if something goes wrong, and I prefer to play it safe.

        P.S.: They are offering a 35% discount as a cyber week promotion.

        This is the defrag software that I use on my few non-SSD installs, and it has performed wonderfully.

        Raxco does offer a 15-day trial on most of their software. It will likely take 3 defrag passes to fix this, plus a boot-time defrag to begin with. The system & page files cannot be defragmented while Windows is running.

        After install, it will likely run an auto analysis to determine the fragmentation status, then suggest which operation to perform first, usually a Boot Time defrag on new installs or badly fragmented ones.

        Then normally a basic defrag, followed by SmartPlacement, then a Consolidate Free Space pass. Sometimes it’ll repeat an operation if needed. It may be best to allow this auto operation to take place while sleeping or at work, as it will likely take several hours on a heavily fragmented drive.

        Was going to take/post a screenshot of mine, but Print Screen works differently on this HP w/7 Pro than on my newer Dell.

        EDIT: Believe I found it!

        35686-Capture

        Cat

    • #1426560

      If you are using an OS later than XP, and unless you have a specific use case with very, very large files, I recommend you just let Windows do the job. Microsoft has done extensive research on disk usage and defragging, and has optimized the internal Windows defrag program to work well for most people.
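
      If you want to drive it by hand rather than wait for the schedule, a minimal example from an elevated prompt (W7 defrag.exe switches; the drive letter is illustrative):

      defrag D: /A /V
      defrag D: /U /V

      The first line only analyzes and prints a verbose fragmentation report; the second actually defragments, showing progress as it goes.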

      Joe

      --Joe

    • #1426564

      I use MyDefrag. It utilizes Windows’ own defrag API. I’ve used it for years and have never had an issue of any kind.

      And it’s free.

      Always create a fresh drive image before making system changes/Windows updates; you may need to start over!
      We all have our own reasons for doing the things that we do with our systems; we don't need anyone's approval, and we don't all have to do the same things.
      We were all once "Average Users".

    • #1426597

      Thanks for the suggestions from ruirib, JoeP517, and bbearren.
      Yes, the files are big data files, over 2.2GB; some are over 4GB.
      Hmm, for some reason I never thought of using Windows’ own built-in defrag. My old experience is that it is much slower than most commercial free and non-free programs, so the ‘logic’ was to use outside programs. Speed kills, I guess.
      Not sure if I’m going to experiment with trying to repeat it. The darn thing is that it failed more than 2 weeks later, and corrupted both file tables as well. If the copy of the file table were intact, it’d be a walk in the park.
      I double-checked the hard drive hardware. No sign of error. After formatting, it runs fine under stress tests.
      This delayed total failure is really a nightmare.
      The drive was so fragmented (48% reported), hence my urge to defrag.
      Maybe NEVER defrag a huge drive?

      Light bulb lights up!
      I should have just made a file copy to another hard drive (aka the lowest form of backup), quick-formatted the drive, then copied the files back. Voila! A completely defragged 2TB drive, 100% perfect. In much less than 24 hours as well.

      The heck with defrag.
      Thanks, guys.

      • #1435614

        Light bulb lights up!
        I should have just made a file copy to another hard drive (aka the lowest form of backup), quick-formatted the drive, then copied the files back. Voila! A completely defragged 2TB drive, 100% perfect. In much less than 24 hours as well.

        You’re right; a simple file copy to another drive, then quick-formatting the original drive, then copying everything back, will give you a perfect and quick defrag.

        Best of all, it is extremely simple. And you have a backup copy of your files on the other drive.

        Often we do things the hard way; it’s usually better to keep it simple, in my opinion.

        Another simple way to defrag, which should work if you have enough drive space:

        Create a folder on your drive, and call it “defrag”. Copy everything on the drive into that folder. Erase the original copy of everything. Move the copy back to the original location. The copy process will reassemble all of the fragments into whole files, so I don’t see why this method wouldn’t also work just as well as using a separate drive.

        Group "L" (Linux Mint)
        with Windows 10 running in a remote session on my file server
        • #1435619

          Create a folder on your drive, and call it “defrag”. Copy everything on the drive into that folder. Erase the original copy of everything. Move the copy back to the original location. The copy process will reassemble all of the fragments into whole files, so I don’t see why this method wouldn’t also work just as well as using a separate drive.

          Jim,

          I don’t think this will work as I understand disk usage. See illustration.
          36095-DiskCOpy
          Note: on the “New Folder copy” line, the blank space is the space still occupied by the originals; on the “Copy back” line, the empty space is occupied by the new folder until it is erased.

          Also note that, for simplicity, I didn’t show a fragmented file, only a fragmented disk. However, the principle should still apply.

          HTH :cheers:

          May the Forces of good computing be with you!

          RG

          PowerShell & VBA Rule!
          Computer Specs

    • #1426624

      Windows built-in defrag runs in the background.

      There are those who say that, with the size of contemporary spinning drives and typical utilization, defragging is unnecessary.

      Joe

      --Joe

    • #1426655

      All disk defrag programs will use the applicable Microsoft APIs to do the defragging operation. Not to use them would be sheer madness!

      Moving large amounts of data around a large hard disk is going to take an enormous amount of time. The amount of time taken is probably out of all proportion to the benefit obtained, as JoeP517 says.
      (I’m secretly sceptical whether Microsoft actually does any defragmenting of hard disks “in the background”, since it probably realises that this action is largely pointless, but doesn’t want to say so…)

      BATcher

      Plethora means a lot to me.

    • #1426702

      The background defragging by MS does indeed take place. It is scheduled by default as a task in Task Scheduler. One can check Task Scheduler History.
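
      For example, you can query it directly from a command prompt (this is the task path on a default W7 install):

      schtasks /Query /TN "\Microsoft\Windows\Defrag\ScheduledDefrag" /V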

      As for large drives not needing defragging, physical disc sizes have not changed (3.5″ and 2.5″), nor has the mechanism by which they operate changed. The only change is in the size of the magnetic bits. On larger and larger drives, the bits have gotten smaller and smaller. But none of that changes the mechanism by which a file is retrieved or written – it’s still a head moving back and forth above the surface of a spinning disc.

      All else being equal, the more fragmented a file, the longer it takes to retrieve the complete file from disc, and the less fragmented a file, the less time it takes to retrieve the complete file from the disc. I continue to defrag.

      Always create a fresh drive image before making system changes/Windows updates; you may need to start over!
      We all have our own reasons for doing the things that we do with our systems; we don't need anyone's approval, and we don't all have to do the same things.
      We were all once "Average Users".

    • #1426798

      Now this gets me thinking.
      The physical size of the platter in a hard drive is always the same. Bit density increases 3.7X or more, comparing 270GB vs 1TB. By design from the drive makers, access time is about the same (from position x to position y). (They would be out of business if access time increased linearly with bit density!)
      The physical travel time is the same! In reality, accessing a same-size file would be faster on the high-bit-density platter, all things being equal, because the physical distance is shorter.
      Do a thought experiment:
      1TB vs 100,000,000TB, same size platter. The design spec is that they have the same access time (physical travel time from position X to Y). Fragmentation seems to add little time to it. The overwhelming majority of the time, the bits of the entire file are so physically close together that access is actually much faster. Statistically, the AVERAGE seek time will be dominated by these short distances, so fragmentation has less and less influence.
      Say we access a 100GB file. On the smaller drive, the linear distance is d micro-inches. For the huge drive, it is d/10^9 micro-inches. Fast access! The chance of fragmentation is much lower too, because the file only needs d/10^9 micro-inches of linear space, vs d micro-inches.
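
      (One caveat on my own numbers: capacity tracks areal density, which is two-dimensional. If the density gain is split evenly between track pitch and along-track bit density, the linear distance for a given file shrinks only as the square root of the capacity ratio:)

      capacity ∝ areal density = (tracks per inch) × (bits per inch along the track)
      linear distance ≈ d / √k  for a k× capacity increase, so k = 3.7 gives ≈ 1.9X shorter, not 3.7X

      (The direction of the argument still holds; only the factor is smaller.)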

      I see it now. Yes, fragmentation is less important on high-bit-density hard drives.
      My defrag nightmare was self-inflicted.

      • #1436851

        I have 2-terabyte and 3-terabyte drives on my rig. It is still a good idea to defrag, because oftentimes defragmentation cleans up some of the mess in the file system, such as the reporting of file space. The difference with larger drives is that you do not have to defrag as often, but they still need to be defragged at some point; once a month is not extreme. Mechanical hard drives have increased in capacity, but mechanically they still work more or less the same way. The problem you will experience with larger hard drives is not only the heat; finding a way to back up 3TB can also be quite a chore. If you have an external 3TB drive, that would be okay, but at some point it is going to get “strange”. However, there is no way around this. In fact, the more stuff you put on a large hard drive, the greater the need for imaging and file backups. Having a large hard drive means having more stuff to lose!

    • #1426886

      Hey Y’all,

      Just my 2 cents worth but if I remember correctly, and that’s a big IF, there is a major difference in the way files are stored/allocated between FAT and NTFS file systems thus making defragmentation much less necessary with NTFS. Of course all us guys who got started with computing in the Dark Ages had the defrag habit drilled into us and we all know about old habits! 😆 HTH :cheers:

      May the Forces of good computing be with you!

      RG

      PowerShell & VBA Rule!
      Computer Specs

    • #1426955

      Dear Mr R Geek: when’s the last time you saw something which used the FAT file system?! Certainly something which needed defragging?

      BATcher

      Plethora means a lot to me.

    • #1426961

      BATcher,

      I have one or two … right here next to my punch cards, 8″ Floppies, 5.25″ floppies, etc.! 😆 :cheers:

      May the Forces of good computing be with you!

      RG

      PowerShell & VBA Rule!
      Computer Specs

    • #1426972

      You’re only a real old geek if you have punch cards in your top shirt pocket……

      I have a vague memory that there were even bigger floppies than 8″ – is that memory correct?

      Eliminate spare time: start programming PowerShell

    • #1427023

      Those 8″ floppies went into IBM 3274 terminal controllers, from which to load the firmware, configured for the attached terminals.

      And of course 8″ floppies went into the 6360 diskette unit of DisplayWriters.

      How many boxes of unpunched punch cards would you like?!

      BATcher

      Plethora means a lot to me.

      • #1436716

        If you are using any third-party defragger, you should turn off the (default) Windows scheduled defragger first. If they use different algorithms, they may be working at cross-purposes and waste a lot of computing time, not to mention drive life.
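
        A quick way to do that from an elevated prompt (default W7/W8 task path; run it again with /Enable to turn it back on):

        schtasks /Change /TN "\Microsoft\Windows\Defrag\ScheduledDefrag" /Disable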

        Your defragging principles and software should match the technology of the drive and the software. Drive technology is getting smarter all the time (SMART and all that). If you replace an existing drive with a later drive, you should do the homework and you and your software should adapt to the newer technology. (Do as I say and not as I do.) Check the reviews; I ran across a few recently (and may append links if I can find them).

        There has been no mention in this thread of so-called optimising, which is to say arranging the relative locations of files on the drive into related groups, allowing faster access to the most-likely-needed files relative to the others, irrespective of fragmentation. MyDefrag, mentioned in the thread, is capable of this. So, for that matter, is System Mechanic.

        I think tape was before 8-inch drives. You can see it on any Sci-Fi TV program with a mainframe in the background, and the spools never budge by a degree, which suggests there is no reading or writing going on. I have no comment on certain other historical matters.

        Editing already: BBearren did allude to the ability of MyDefrag to optimize (without using that word). The ‘Go Sideways’ reference reminded me to remark on drive errors, which are one quick way for everything to Go South for no apparent reason, and which should be the first thing to check before messing things up beyond repair. I have before me a one GB Verbatim flash drive for which the native format is FAT.

        • #1436728

          Editing already: BBearren did allude to the ability of MyDefrag to optimize (without using that word).

          Actually, I did use that word:

          It uses the Windows prefetch logs to optimize file placement. If it finds files written within the MFT, it moves them out of that reserved area. For more information, check MyDefrag.

          From the MyDefrag site (emphasis mine):

          “MyDefrag organizes files into zones, such as directories, Windows files, files used while booting, regular files, and rarely used files. The most accessed files are placed at the beginning of the harddisk, and files that are commonly used together are placed in close proximity to each other. This results in a dramatic speed increase, and is in fact more important than defragmentation. The program comes with scripts with a zone organization suitable for most users, power users can customize the zones through scripts.”

          Another neat optimization trick that MyDefrag performs is to allocate plenty of empty space before and after $Logfile so that, even though the file is immovable, it can grow without becoming fragmented. Each run of MyDefrag will analyze and adjust that empty space around $Logfile, so that there’s always room to grow without fragmentation of the file.

          Similarly, MyDefrag puts empty space between its zones to allow room for temporary files to be written in close proximity to the program/utility/function that is creating them.

          FWIW, I just timed a fresh start on my 3-year-old Windows 7 Ultimate dual boot; from boot manager OS selection to lock screen was 30 seconds. I don’t have a youtube video for comparison, but to the best of my recollection, that’s about what it was in January 2010, give or take a second or two.

          I used Windows\System32\Defrag.exe to analyze my Windows 7 Ultimate OS partition (MyDefrag uses the Windows defrag API to do all its file manipulations) and got the following report, which I think speaks for itself as to how well MyDefrag does its job for me.

          36153-defrag-analysis

          And as always, YMMV.

          Always create a fresh drive image before making system changes/Windows updates; you may need to start over!
          We all have our own reasons for doing the things that we do with our systems; we don't need anyone's approval, and we don't all have to do the same things.
          We were all once "Average Users".

        • #1436838

          If you are using any third-party defragger, you should turn off the (default) Windows scheduled defragger first. If they use different algorithms, they may be working at cross-purposes and waste a lot of computing time, not to mention drive life.

          Your defragging principles and software should match the technology of the drive and the software. Drive technology is getting smarter all the time (SMART and all that). If you replace an existing drive with a later drive, you should do the homework and you and your software should adapt to the newer technology. (Do as I say and not as I do.) Check the reviews; I ran across a few recently (and may append links if I can find them).

          There has been no mention in this thread of so-called optimising, which is to say arranging the relative locations of files on the drive into related groups, allowing faster access to the most-likely-needed files relative to the others, irrespective of fragmentation. MyDefrag, mentioned in the thread, is capable of this. So, for that matter, is System Mechanic.

          I think tape was before 8-inch drives. You can see it on any Sci-Fi TV program with a mainframe in the background, and the spools never budge by a degree, which suggests there is no reading or writing going on. I have no comment on certain other historical matters.

          Editing already: BBearren did allude to the ability of MyDefrag to optimize (without using that word). The ‘Go Sideways’ reference reminded me to remark on drive errors, which are one quick way for everything to Go South for no apparent reason, and which should be the first thing to check before messing things up beyond repair. I have before me a one GB Verbatim flash drive for which the native format is FAT.

          When PerfectDisk is installed, the installer options offer 3 different ways for PerfectDisk to handle “Windows Optimization”. One of the methods is to continue allowing Windows to perform its own “Optimization” process undisturbed. The second is to allow PerfectDisk to “take over” these operations. The third is to have the system not perform “Optimization” at all. The user has the choice of selecting whichever pattern they wish.

          If the user opts to allow PerfectDisk to perform this optimization, the PerfectDisk Defrag process will use the data gathered by Windows for its own “Optimizer” – to perform a similar optimization process – except that PerfectDisk normally consolidates this data to the very front of the disk – where the data transfer rate is fastest. This allows Windows to start up in the shortest amount of time possible. PerfectDisk also marks the blocks on the disk involved with startup in a special colour in its disk map – so the user can see how fragmentation affects the bootup process. Similar options are offered with Ultimate Defrag – the difference being the user has full control of placement and priority for these files – rather than following the defragger’s “pattern” without the ability to customize as desired.

          Note: Every month when Patch Tuesday rolls around – and every time the user installs/updates a program on their machine – the “Windows Optimization” information becomes obsolete. Thus the need for a Defrag pass to regain startup-efficiency. This is normal – and a standard consequence of how Windows operates. All modern Defraggers are aware of this limitation – and automatically react to reconsolidate the startup-fileset when the fileset is disturbed by changes which occur due to updates. There is nothing special about PerfectDisk, Ultimate Defrag or MyDefrag in this situation – even the built-in Windows Defragger performs this operation. What does change with various different defraggers is where that reconsolidated information is repositioned on the Hard Disk.

          As bbearren pointed out, an important part of modern Defragmenter operations is to “zone allocate” the total fileset on the Hard Disk – such that typical file-operations not only work with contiguous files – but also have those files situated close to each other on the Hard Disk. As he mentioned – this markedly improves the system’s speed-of-response – both when loading programs and when loading datafiles. Pretty well all modern Defraggers do this. Competition in the Defragger field has led to improvements in Defragger efficiency – including the use of “zone allocation” – becoming standard “checkbox items” in any modern Defragger’s feature list.

          The intent of Boot-Defrag in Defraggers that offer this option – is to avoid the need to do a copy-out/reformat/copy-in operation (or an OS reinstall) every 6-months to a year – in order to force the reconsolidation of $Logfile data.

          Note: One of the reasons a fresh install of Windows tends to be faster than one which has been in use for a while – is precisely because the $Logfile data is less fragmented. Defraggers that offer Boot Optimization allow the user to avoid the need to perform a copy-out/reformat/copy-in operation (or an OS reinstall) to regain that performance.

          Both PerfectDisk and Ultimate Defrag are Hard Disk “SMART” aware. PerfectDisk has an explicit set of screens which report the Hard Disk SMART attributes and allow the user to view the SMART data if desired – so the user can “keep an eye” on their Hard Disk as it operates – rather than just rely on the SMART system to tell the user the disk is in imminent danger of failure. However, there are many other programs which also allow the user to view this information (such as SpeedFan) – so I don’t consider the presence/absence of this option to be a dealbreaker.

          My experience is that Defraggers (either built into Windows or third-party) divide themselves into two groups. The first group does all the things mentioned by bbearren in his post about myDefrag. The second group does all the things mentioned by bbearren in his post about myDefrag – as well as the items I’ve mentioned which are performed by Defraggers that support Boot-Defrag operations.

          IMO, the use of a Defragger that supports a Boot-Defrag option is sort of like deciding to go to the Dentist for an annual cleaning. Is it absolutely necessary? No. But in the same way the cleaning also allows the Dentist the opportunity to check for “tooth rot” – the regular use of a Defragger with a Boot-Defrag option offers the user the same opportunity to prevent Hard Disk “bit rot” – without having to go through the procedure of removing all the user’s teeth and then allowing them to grow in again (copy-out/reformat/copy-in operation).

          In Computer terms – it’s not as if the choice to use one or the other of the Defragger types mentioned above is mandatory. It’s that one method requires a tedious procedure to be performed every once in a while (copy-out/reformat/copy-in or OS reinstall). The other obviates that need – at the cost of a one-time expense to purchase a Defragger with Boot-Time Defrag capability.

          With full knowledge of the advantages and limitations of either approach, the user can then choose which option to exercise.

          Hope this helps.

          • #1436840

            In Computer terms – it’s not as if the choice to use one or the other of the Defragger types mentioned above is mandatory. It’s that one method requires a tedious procedure to be performed every once in a while (copy-out/reformat/copy-in or OS reinstall). The other obviates that need – at the cost of a one-time expense to purchase a Defragger with Boot-Time Defrag capability.

            In my experience, I have found no need (nor justification) whatsoever for “copy-out/reformat/copy-in or OS reinstall”. For me, it’s the-monster-under-the-bed/windows-needs-a-yearly-reinstall boogeyman – it simply doesn’t line up with my reality and experience accumulated over many years and several systems, including a few DIY rigs, and a great many client systems. I routinely install MyDefrag on client systems (it’s free, remember) and set it up in Task Scheduler. Then when I get a call from a client, “My computer is slow”, it actually means their internet is slow, and I always find numerous toolbars, BHO’s, etc. plugged into their browser. I uninstall/disable all the extras, and suddenly the computer is fast again.

            When Windows is first installed, there are files scattered all over the hard drive; the installation is by no means contiguous. Hiberfil.sys is often in the middle of the drive. Can’t shrink your OS partition below 500GB? It may well be that you have a system file (unmovable) sitting at the 500GB position on the platter. And no, it doesn’t make a lot of sense. But don’t take my word for it; try it on a system, and immediately run your favorite defragger to get a GUI display of fragmentation and which files are where.

            On a new install of Windows, $Logfile is almost always contiguous, and close to the beginning of the disk. A first run of MyDefrag will put lots of free space before and after $Logfile, to give it plenty of room to grow, yet remain contiguous. Every subsequent run of MyDefrag will monitor this free space around $Logfile, and add to it as needed. The end result is that $Logfile doesn’t become fragmented, and remains contiguous.

            I have no need for a boot-defrag, because I dual boot. If I felt the need, I could defrag one OS from the other and move system files that are not in use. Then when I’m booted back into that OS, the optimization algorithms of MyDefrag take care of any tidying up running under Task Scheduler. MyDefrag can be installed on a Rescue Disk for those who don’t dual boot, and accomplish the same thing.

            My bottom line remains the fact that my systems continue to perform as crisply and flawlessly and error-free today as they did when I first set them up; this particular one beginning its fourth year. What I found that worked for me years ago still works for me the same way today.

            Always create a fresh drive image before making system changes/Windows updates; you may need to start over!
            We all have our own reasons for doing the things that we do with our systems; we don't need anyone's approval, and we don't all have to do the same things.
            We were all once "Average Users".

    • #1427108

      “… all us guys who got started with computing in the Dark Ages had the defrag habit drilled into us and we all know about old habits!”
      Sigh! That string of white hair is a reminder every morning …
      Now, the ‘urge’ to defrag too?
      Seriously, I’ll not defrag a big-volume hard drive again. Not worth the risk.
      Don’t wanna fix something that is not broken (or shows slow-down).

    • #1435072

      Please recommend a good (must be safe) hard drive defrag program.

      Had a defrag nightmare the day after Turkey dinner/Black Friday.
      Making it worse, the drive was a 2TB all-data drive (SATA-connected), with only about 400GB of space left.
      Story:
      The frequently accessed drive was badly fragmented.
      Defragged it with a well-known, free, ‘reliable’ program … [be ‘gentle’ re ‘free’]
      It took too long (over 24 hours), and I had to stop it for urgent work. I stopped it properly, per the application. No more defragging after that.
      More than 2 weeks later, the hard drive suddenly became unavailable. Symptom: not accessible by Windows; the drive letter disappeared.
      Diagnosis:
      Hardware-wise, OK.
      Found the file table corrupted, including the twin copy! Hence, unable to recover.

      Had to do lengthy data recovery … on a 2TB drive. I had a backup, but the newer data had to be recovered.
      Hard drive reformatted; it still works.

      A defrag software recommendation would be highly appreciated.

      First question – are the two events related? From what you’ve told us, I don’t think so. There is no way a Hard Disk can survive for 2 weeks working properly – and then suddenly “go away” – from an event that was 2 weeks previous. The MFT on NTFS drives simply doesn’t work that way.

      The highest probability is that something went sideways on that drive just shortly before it “disappeared”. And that something had nothing to do with the defrag operation 2 weeks previous.

      Now, does this mean all defrag operations are perfection incarnate? Of course not. However, the problem is of a different kind. Each defrag software package has its strengths and weaknesses. The defrag software packaged with Windows is an obsolete and Microsoft-sanitized version of Diskeeper (sanitizing means it contains no references to Diskeeper itself – you’d think it was Microsoft-developed software from the look – but it’s not). This software (because it’s old) has a bunch of limitations. The most important one is that it does not work well with only a small amount of free space available. Anything less than 10%-20% of the drive free – and it’s as slow as molasses. The other thing is it does not properly handle defragmentation of NTFS Metadata – and thus leaves this info scattered all over your hard disk – causing almost-immediate refragmentation after a defrag run.

      If you look through the feature-list of the “free” Third-Party Defragmentation Utilities – you will find they usually don’t support Metadata defragmentation. Without this, I don’t think there’s much point in doing defrag as a preventative measure – as the drive refragments so quickly I consider it almost pointless.

      Note: This is one of the reasons W7/W8 do a “background defrag” every 3 days – using the Microsoft-supplied Defrag Utility. Nice try – no cigar.

      Thus we come to “commercial” Third-Party Defragmentation Utilities. When I researched these – I found many of them also did not defragment the NTFS Metadata. There were two standouts in this regard – PerfectDisk and Ultimate Defrag. I own the latest licences for both.

      Of the two, PerfectDisk is by far the faster defragger. It also has the ability to continuously detect when defragmentation is required – and “keep out of the way” otherwise. It also has a feature that attempts to ensure drive writes go in the proper places to prevent quick refragmentation. I use this software on my W7 boxes.

      Ultimate Defrag has a unique feature I really appreciate. It has the ability to move the NTFS MFT and its associated Metadata to any desired location on the Hard Disk. Thus, you can move the entire piece of the Hard Disk associated with “file housekeeping” somewhere out of the way of all the data on the machine – so the creation of new metadata when new files are created is the only thing that can cause refragmentation. I wish PerfectDisk had this feature. The problem with Ultimate Defrag is it’s really slow. What PerfectDisk does in 20 minutes takes 2 Hours with Ultimate Defrag. However, Ultimate Defrag does a much more thorough defragmentation job than PerfectDisk when it comes to the “technical definition” of defragmentation – albeit with the consequent real-world performance penalty.

      Both PerfectDisk and Ultimate Defrag properly announce the delay inherent in stopping a defrag operation in progress. Both programs take several seconds to continue the defrag to the point where it can safely be stopped – and this shows in their GUI Windows during the “stop” operation. Neither program will let you “do anything” while the “stop” is in progress – which is proper and correct operation. Both PerfectDisk and Ultimate Defrag use the Windows API to do defragmentation operations while Windows is running – which is the safest way to perform defragmentation. They both have “Boot Defrag” capabilities that perform the MFT and/or Metadata “housekeeping” – which cannot be performed while Windows is running when using the current Windows defragmentation API.

      If I had to pick one – PerfectDisk. I haven’t found anything else on the market that touches it – when it comes to the comprehensiveness of its disk-housekeeping prowess. It has a compatibility issue with Zentimo (another piece of software I use) – but only with certain Hard Disk controllers (Silicon Image) and under certain conditions. This prompted a search for alternatives for the machines I own which use SIL hardware – which was how I found Ultimate Defrag. Therefore, I use Ultimate Defrag on machines where PerfectDisk gives me trouble – and PerfectDisk everywhere else.

      Hope this helps.

    • #1435526

      Thanks for the response, twixt.
      Good info; if it does not defrag metadata, why defrag?

      Is there a free trial version of PerfectDisk?

      Re the 2 events – the incomplete defrag, and the HDD dying 2 weeks later – and your comment that the 2 events may not be related:

      There was no concrete CSI-style proof. The drive is still spinning; if I want, I can reformat it to as good as new.
      Maybe we should never stop a defrag before it completes; the manuals warn about this.

      The PC and all other devices continue to work as usual. The 2TB was the only device that died. The MFT was corrupted, as well as its copy. The drive is still spinning, and files on it are recoverable by software means.
      Based on my investigation, it points to the incomplete defrag.

      I recognize my guess is as good as yours. But the evidence, though it seems impossible, points to that outcome.
      Moreover, terabyte drives are new territory. The MFT could be huge and hard to manage.

      I could make one statement on the incident though: heed the warning, and do NOT stop the defrag process.
      Or do no defrag at all (for huge drives).

      Does anyone know a way to back up the MFT?

      • #1436088

        Thanks for the response, twixt.
        Good info; if it does not defrag metadata, why defrag?

        Is there a free trial version of PerfectDisk?

        Re the 2 events – the incomplete defrag, and the HDD dying 2 weeks later – and your comment that the 2 events may not be related:

        There was no concrete CSI-style proof. The drive is still spinning; if I want, I can reformat it to as good as new.
        Maybe we should never stop a defrag before it completes; the manuals warn about this.

        The PC and all other devices continue to work as usual. The 2TB was the only device that died. The MFT was corrupted, as well as its copy. The drive is still spinning, and files on it are recoverable by software means.
        Based on my investigation, it points to the incomplete defrag.

        I recognize my guess is as good as yours. But the evidence, though it seems impossible, points to that outcome.
        Moreover, terabyte drives are new territory. The MFT could be huge and hard to manage.

        I could make one statement on the incident though: heed the warning, and do NOT stop the defrag process.
        Or do no defrag at all (for huge drives).

        Does anyone know a way to back up the MFT?

        Hi. I’ve been away. To answer your questions:

        1. Yes, PerfectDisk has a 30-day free trial version. I used this myself to validate the product when I first started using PerfectDisk.

        2. Manually stopping defrag operations in the middle of a PerfectDisk defrag has always terminated safely for me. I have tested this many, many times. The GUI tells you processing is continuing while it is finalizing – before giving you back your machine. I have never lost data. Even in those situations where I ran into compatibility problems which bluescreened the two machines with SIL Add-On PCI SATA Controller cards (which are notoriously flaky under certain conditions) – everything came back cleanly after the bluescreen reboots. I’ve really beaten this program up – it hasn’t led to data loss when I’ve had things happen unexpectedly.

        That doesn’t mean I don’t do complete Image backups on a regular basis. But so far at least – I haven’t had a “gotcha” from PerfectDisk. Still gonna do my backups…

        3. Terabyte disks are old hat for PerfectDisk. They’ve been dealing with large disk support for years now – through multiple generations of product – using both MBR and GPT. See the “About Us” page at the PerfectDisk website for some history on Raxco as a company. Lots of firsts regarding large disk support – therefore the longest real-world experience with supporting large disks in both Corporate and Consumer environments. BTW, PerfectDisk does not charge extra for large disk support.

        Note: Both PerfectDisk and Ultimate Defrag consolidate the MFT during Defrag – and add sufficient free space after the MFT to allow the MFT to “grow” without refragmenting due to file-create operations. No “bits and pieces” of the MFT strewn all over the disk. Ultimate Defrag even allows you to manually adjust the size of the MFT “free block” if you wish. If you have some idea of the future of that drive and the number of files you expect to put on it (Eg: Backup drive with only a small number of large files) you can tell Ultimate Defrag to shrink the size of the MFT “free block” down to nearly nothing – because you know the number of files on that drive is always going to be small – which is exactly correct for backup drives.

        4. The MFT is automatically “mirrored” as part of normal NTFS operations. Using the Microsoft Defragmentation API – it is “supposed to be impossible” to corrupt both the main copy of the MFT and its mirror – concurrently. As far as I am aware – it takes two failed write operations sequentially – one which corrupts the main copy of the MFT and another which corrupts the mirror – with no automatic recognition of the original MFT write failure in between – to cause the circumstance you describe.
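
        If you want to look at where the MFT and its mirror sit on a volume, the built-in fsutil will report both – look for the “Mft Start Lcn” and “Mft2 Start Lcn” lines in the output (elevated prompt; the drive letter is illustrative):

        fsutil fsinfo ntfsinfo D: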

        Things to check: Intel’s OROM and Driver support for the ICH5R and later has been notoriously unstable – and has led to data loss with older versions of the OROM and the drivers. This is especially true for the ICH5R through ICH9R controllers – with their associated OROMs and drivers. Early versions of the ICH10R OROM and driver are also suspect – but failures here more commonly cause spontaneous lockup without data loss. It’s still annoying. It is highly recommended to upgrade the ICHxxR OROM and the IAA/Matrix/RST Driver to the latest version supported for your particular Intel Hard Disk Controller Chipset – above and beyond what is shown at the Motherboard Manufacturer’s website.

        Note: I have personal experience with the above regarding Intel’s own D875PBZ motherboard and the Asus P5Q and P5Q3 motherboards. The drivers on the Intel/Asus websites for those specific board models are completely out-of-date – compared to the OROMs and Drivers available from the dedicated Intel websites for those chipsets and drivers. The same situation applies for Marvell, JMicron and SIL Hard Disk Controllers – and their ESATA/RAID support.

        On the Asus Systems – I updated the BIOS insert for the Intel OROM in the latest unmodified ASUS BIOS to Version 10.1.0.1008 (later versions than this also need investigation). I also updated the Marvell OROM from Version 1.2.0.L70d in the latest unmodified ASUS BIOS to Version 1.2.0.L73 (L75 needs investigation). I am using these with the Intel RST 11.7.0.1013 Drivers (latest I’ve found so far for the ICH10R Chipset) which replace the stock Asus-provided 8.6.0.1023 drivers. I also use the Marvell 1.2.0.8300 Drivers (8400 requires investigation) which replace the stock Asus-provided 1.2.0.68 drivers.

        If you are using older versions of these BIOS inserts and drivers – look at the long history of spontaneous disk (especially SSD) corruption due to stability problems with earlier BIOS inserts (OROM) and earlier driver versions. Those manufacturers didn’t go through all those dozens and dozens of revs – with all that associated embarrassment – for no reason. Google: intel RST bug

        Hope this helps.

    • #1435537

      I prefer to use Auslogics Disk Defrag, and Auslogics is an MS partner.

      http://www.auslogics.com/en/software/disk-defrag/

    • #1435627

      Yeah, you’re right. A fragmented drive will have the fragments spread all over the drive, so there may not be a big enough contiguous block of space to use as the work space.

      And even if there was a big enough contiguous block of space, Windows would first fill in the empty spaces, then the contiguous block of space.

      You will have to use an empty drive as the target for the initial copy operation.

      Group "L" (Linux Mint)
      with Windows 10 running in a remote session on my file server
    • #1435628

      Ya, it doesn’t really copy the data when you move files between folders on the same drive/partition; it just changes the location index. Another drive or partition would suffice, but then how would one ensure contiguous space for the files/folders when copied back (even a defrag insertion may not suffice)? So I think the only way is a different drive/partition AND formatted free space on the return destination.

    • #1435656

      In my view, if you have only data on your drive, and you want to defragment that drive, there is no better, safer, and simpler way than copying all of the data to a different, clean drive, then wiping (formatting) the original drive, then copying all of the data back to it.

      Group "L" (Linux Mint)
      with Windows 10 running in a remote session on my file server
      • #1435717

        “In my view, if you have only data on your drive, and you want to defragment that drive, there is no better, safer, and simpler way than copying all of the data to a different, clean drive, then wiping (formatting) the original drive, then copying all of the data back to it. ”

        I did that! Fragmentation check: 0%. The hard drive is 2TB, with 1.3TB of data. Performance gain? Not numerically significant. But I ‘feel’ it’s faster! Science vs. psych! Nothing better than a psychological uplift.
        Yes, it must be a wholesale copy. Then quick-format the entire HDD. Lastly, copy everything back; see the sketch below.
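
        Roughly the three steps, from an elevated prompt (D: is the data drive and E: the empty spare here – illustrative letters, so adjust to suit; these are stock W7 robocopy/format switches):

        robocopy D:\ E:\D-copy /MIR /COPYALL /R:1 /W:1
        format D: /FS:NTFS /Q
        robocopy E:\D-copy D:\ /MIR /COPYALL /R:1 /W:1

        Step 1 mirrors the whole tree out (/MIR), step 2 quick-formats the original (it destroys everything on D:, so double-check the letter!), and step 3 copies everything back into one big empty space, so the files land contiguous.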

    • #1435999

      When you copy files to a clean, empty hard drive, the copy process will reassemble the file pieces into complete files as it writes them to the target drive. This happens because there are no scattered empty spaces on the target drive to fill in; the whole thing is empty. If there were data on the target drive, Windows would try to fill in the empty spaces with the incoming data, refragmenting your files. But since the target drive is empty, there are no little spaces to fill up; the whole drive is one big empty space.

      So that step alone defragments your files.

      In order to prevent fragmentation when you copy them back to the original drive, you must first empty the drive, to eliminate the empty spaces between pieces of files. With one big chunk of empty space, Windows can write the files as single pieces, because there are no spaces to fill up.
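
      (On recent Windows versions – Windows 8 and later, if I remember right – you can spot-check the result with fsutil; a file that comes back as a single extent is contiguous. The file name here is just an example:)

      fsutil file queryextents D:\somebigfile.iso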

      Group "L" (Linux Mint)
      with Windows 10 running in a remote session on my file server
    • #1436427

      @Twixt, thanks for the valuable info re outdated drivers. The related motherboard *does* have an Intel chipset.
      I’m keeping the corrupted hard drive untouched. I want to learn more about the failure in the near future.
      As mentioned before, both copies of the MFT were corrupted. I was like you: “Not possible.”
      Now that you mention the Intel/Marvell drivers, it gives me a pointer to start the investigation.
      I have friends working at Marvell. If it is them, I’ll give them an earful.
      For those interested:
      RST: Intel Rapid Storage Technology
      OROM: Option ROM

    • #1436438

      My preference for Intel SATA chipset drivers is the default Microsoft driver on W7 and later; it does use some of the Intel drivers, allows TRIM on SSDs, etc., and can be faster than the Intel RST drivers (which may not be the ‘correct’ or best drivers anyway, RST being primarily for SSD caching, if my understanding is correct).

      As for the Marvell/JRaid etc. chipsets, I don’t connect anything to them and disable them in the BIOS.

      I’ve seen PerfectDisk triggering BSODs when set to defrag at boot time, and Intel RST drivers have also been implicated in causing BSODs.
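
      (For anyone wanting to check which storage driver is actually loaded, driverquery works on W7 – iaStor/iaStorV are the Intel RST/Matrix driver services, msahci the default Microsoft AHCI one:)

      driverquery /V | findstr /I "iaStor msahci"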

      • #1436555

        My preference for Intel SATA chipset drivers is the default Microsoft driver on W7 and later; it does use some of the Intel drivers, allows TRIM on SSDs, etc., and can be faster than the Intel RST drivers (which may not be the ‘correct’ or best drivers anyway, RST being primarily for SSD caching, if my understanding is correct).

        As for the Marvell/JRaid etc. chipsets, I don’t connect anything to them and disable them in the BIOS.

        I’ve seen PerfectDisk triggering BSODs when set to defrag at boot time, and Intel RST drivers have also been implicated in causing BSODs.

        I haven’t seen PerfectDisk trigger a BSOD at boot defrag. I have seen PerfectDisk trigger a BSOD at shutdown – which was not data-destructive. The only other thing I have seen regarding PerfectDisk at startup is what happens if the affected Hard Disk needs to run Chkdsk. PerfectDisk’s Boot Defrag has a built-in integrity checker – which runs before the Boot-Defrag runs. If this integrity check fails – an abbreviated Boot-Defrag runs instead of the “full-monty” version.

        If this happens (and it did to me several times on Intel D875PBZ motherboards with ICH5R controllers when running WXP) – running a proper Chkdsk /r on the affected partition – to diagnose the disk and schedule a Chkdsk /r boot-run if necessary – cleaned up the problem. After that, a PerfectDisk boot-defrag ran properly and all was well.

        Note: All the above experience references PerfectDisk Version 12.5 – Build 312 with Hotfix 4 applied – OR – PerfectDisk Version 13.0 – Build 776 (latest of which I am aware). I have not had extensive personal experience with versions of PerfectDisk earlier than that. However my research before purchasing the product indicates that users trying to get by with using earlier versions of PerfectDisk or Ultimate Defrag – when using Intel or 3rd-party Hard Disk Controllers/Drivers – are taking avoidable risks.

        There is absolutely nothing wrong with using the Marvell controller on the Asus P5Q/P5Q3 series motherboards. I have mine set up as a RAID1 Array for my OS Drive – which is partitioned into 3 pieces (one for the OS, one for Data, one as a Scratch Drive for things like Photoshop). If you are avoiding the use of these extra SATA Ports because the “jungle drums network” thinks they’re evil – you’re missing out on a perfectly-acceptable opportunity to add extra Hard Disks to your system – on the basis of horror stories from people who have problems with these chipsets. IMO, these people have chosen to blame the chipset – instead of admitting they didn’t do their homework regarding the required OROM and Driver Support. Google this. There’s lots of info out there on how to make this stuff work properly – with people sharing their experience with these chipset OROMs and Drivers.

        Note: You seem perfectly willing to accept the risks inherent with the Intel Chipset OROMs – and even accept the risk of using Intel RST Drivers in some circumstances. Yet you categorically reject the need to perform the same due-diligence in regards to the Marvell/JMicron chipset OROM and Driver categories. This doesn’t make sense.

        Simply apply the same logic required for the Intel RST Drivers to the Marvell/JMicron chipsets – and they become available to you as well. Yes, getting this stuff working initially is a pain – there is a learning curve. But that’s what Motherboard Manuals, Motherboard Support Forums and Norton System Recovery (the replacement for Ghost) are for. I’ve played with this extensively – had endless problems with Windows-startup-failures after Marvell Driver updates – as well as return-from-sleep problems with the various Marvell Drivers available. Regardless, 10 minutes with NSR gets me out of whatever jam I get into while experimenting – and each time something goes wonky I learn more about the “bigger picture” in regards to Intel/Marvell OROM/Driver integration.

        Note: I’ve also found that Microsoft dip their fingers into this mix as well – since “things that didn’t work before” in regards to Intel/Marvell Hard Disk Controller Driver integration – suddenly “work properly” after Patch Tuesday some months. Those NT Kernel updates buried in the “security updates” solve more problems than Microsoft admit publicly. Ditto for the various “DotNet” fixes.

        Then – on a related topic – there’s the whole USB3 firmware/driver debacle with the various USB3 chipsets – and their impact on External Hard Disk compatibility when used for Backup purposes. This affects many many many Laptops using USB3 – as well as Desktops with native USB3 or USB3 Add-on PCI or PCI-E cards. Again it’s almost always solvable – when the correct firmware updates for the USB3 chipset are installed – and the correct updated drivers are used. The exact same due-diligence routine is required here – as for the Marvell/JMicron Hard Disk Controller Drivers mentioned above.

        My $0.02 :^_^:

        • #1436563

          … snip …

          “jungle drums network”, interesting.

          I only use 3 internal drives; I have no use for more. Minidumps don’t take up much drive space.

          36144-dmps

    • #1436481

      What is wrong with the Microsoft version?

      Why not just let it run till it is finished??
      I let mine run all weekend once doing a similar task.
      Better to avoid problems than to fix them.

      Add another HD and move a lot of the data and less used programs first.
      Then defrag the original disk.
      Then move the data back if you can’t use it on the second one.

      Please recommend a good (must be safe) hard drive defrag program.

      Had a defrag nightmare the day after Turkey dinner/Black Friday.
      Making it worse, the drive was a 2TB all-data drive (SATA-connected), with only about 400GB of space left.
      Story:
      The frequently accessed drive was badly fragmented.
      Defragged it with a well-known, free, ‘reliable’ program … [be ‘gentle’ re ‘free’]
      It took too long (over 24 hours), and I had to stop it for urgent work. I stopped it properly, per the application. No more defragging after that.
      More than 2 weeks later, the hard drive suddenly became unavailable. Symptom: not accessible by Windows; the drive letter disappeared.
      Diagnosis:
      Hardware-wise, OK.
      Found the file table corrupted, including the twin copy! Hence, unable to recover.

      Had to do lengthy data recovery … on a 2TB drive. I had a backup, but the newer data had to be recovered.
      Hard drive reformatted; it still works.

      A defrag software recommendation would be highly appreciated.

    • #1436667

      FWIW

      Metadata in NTFS is stored in the MFT, which is a reserved space on the HDD. Files themselves can sometimes be written within the MFT. Is the metadata defragged when the HDD is defragged? The MFT knows the filename and attributes and where all the file fragments are stored on disk. If a file has 13 fragments, there are 26 entries concerning those locations; beginning of fragment/end of fragment. If the file is defragged into a single contiguous file, there are only two entries concerning the file location; beginning of file/end of file. 24 now useless bits of information have been eliminated from the MFT on one file.

      I’ve been using MyDefrag since it was JKDefrag, and I’m still using it. MyDefrag does more than just re-write files into contiguous space. It can be run in the background while you’re using your PC, or it can be scheduled to run with Windows Task Scheduler. It can be safely stopped at any time. It uses the Windows prefetch logs to optimize file placement. If it finds files written within the MFT, it moves them out of that reserved area. For more information, check MyDefrag.
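
      (As an aside, the prefetch placement log it builds on is just a text file on XP/W7; you can peek at the first few lines from an elevated PowerShell prompt:)

      Get-Content "$env:windir\Prefetch\Layout.ini" | Select-Object -First 10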

      Here is some more information on file systems, with some more things to worry about. Another way I look at defragging my HDD is that I’m actively refreshing my files little by little.

      I have the Windows background defragger and prefetch updater disabled, and use MyDefrag scheduled tasks to take care of that for me. I run a “daily defrag” (actually nightly), and a “monthly defrag”. Do I notice any increase in system performance? No. Do I notice the so-called “Windows bloat” and performance slowdown? Nope. I have a youtube video of this PC booting Windows 8. In the video, I’m booting to the lock screen of Windows 8. You’ll notice the date is January 5 – that was 2013. The video runs from the time my boot manager loads (immediately after the BIOS) and the OS to boot is selected. The video runs 31 seconds, OS selection is about 3 – 5 seconds of that.

      I just timed a fresh boot to the lock screen from OS selection in my boot manager – 26 seconds. Has defragging speeded up my PC? No, not noticeably. But it certainly hasn’t slowed it down. Programs launch and everything works as crisply now as it always has. YMMV

      Always create a fresh drive image before making system changes/Windows updates; you may need to start over!
      We all have our own reasons for doing the things that we do with our systems; we don't need anyone's approval, and we don't all have to do the same things.
      We were all once "Average Users".

      • #1436706

        FWIW

        Metadata in NTFS is stored in the MFT, which is a reserved space on the HDD. Files themselves can sometimes be written within the MFT. Is the metadata defragged when the HDD is defragged? The MFT knows the filename and attributes and where all the file fragments are stored on disk. If a file has 13 fragments, there are 26 entries concerning those locations; beginning of fragment/end of fragment. If the file is defragged into a single contiguous file, there are only two entries concerning the file location; beginning of file/end of file. 24 now useless bits of information have been eliminated from the MFT on one file.

        I’ve been using MyDefrag since it was JKDefrag, and I’m still using it. MyDefrag does more than just re-write files into contiguous space. It can be run in the background while you’re using your PC, or it can be scheduled to run with Windows Task Scheduler. It can be safely stopped at any time. It uses the Windows prefetch logs to optimize file placement. If it finds files written within the MFT, it moves them out of that reserved area. For more information, check MyDefrag.

        Here is some more information on file systems, with some more things to worry about. Another way I look at defragging my HDD is that I’m actively refreshing my files little by little.

        I have the Windows background defragger and prefetch updater disabled, and use MyDefrag scheduled tasks to take care of that for me. I run a “daily defrag” (actually nightly), and a “monthly defrag”. Do I notice any increase in system performance? No. Do I notice the so-called “Windows bloat” and performance slowdown? Nope. I have a youtube video of this PC booting Windows 8. In the video, I’m booting to the lock screen of Windows 8. You’ll notice the date is January 5 – that was 2013. The video runs from the time my boot manager loads (immediately after the BIOS) and the OS to boot is selected. The video runs 31 seconds, OS selection is about 3 – 5 seconds of that.

        I just timed a fresh boot to the lock screen from OS selection in my boot manager – 26 seconds. Has defragging speeded up my PC? No, not noticeably. But it certainly hasn’t slowed it down. Programs launch and everything works as crisply now as it always has. YMMV

        Some things to note:

        There are many pieces of Metadata that are not carried in the MFT.

        The complete list is as follows: http://ntfs.com/ntfs-system-files.htm

        Of the total list, there are some which cannot be defragmented while the OS is up and running. For these a Boot-Defrag is necessary. Any Defragger which does not offer a Boot-Defrag option does not defrag the affected metadata files – which end up strewn all over the Hard Disk as Windows adds/deletes files.

        Yes, the $Mft entries can be defragged “on the fly”. No, these are not all the items each file “messes with” when created/deleted.

        Metadata “quirks”:

        The most common metadata file you will see Chkdsk fiddling with is the $Bitmap file. When “lost clusters” are found, this is the file that must be rewritten so that the free space NTFS “thinks is there” corresponds with what “actually is there”. Systems where this value is corrupt will give “out of disk space” errors when there is lots of free space available – because NTFS queries the $Bitmap metadata rather than the actual free space. Running Chkdsk /f brings theory and reality back into sync – along with fixing a bunch of other “sins”.
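
        You can see for yourself that free space really is just bits in $Bitmap. Here is a small C sketch – the drive letter is a placeholder, and it must be run elevated – that samples the start of the allocation bitmap through the documented FSCTL_GET_VOLUME_BITMAP ioctl and counts the clear (free) bits:

            #include <windows.h>
            #include <winioctl.h>
            #include <stdio.h>

            int main(void)
            {
                /* Placeholder volume - \\.\C: - requires administrator rights. */
                HANDLE hVol = CreateFileW(L"\\\\.\\C:", GENERIC_READ,
                                          FILE_SHARE_READ | FILE_SHARE_WRITE, NULL,
                                          OPEN_EXISTING, 0, NULL);
                if (hVol == INVALID_HANDLE_VALUE) return 1;

                STARTING_LCN_INPUT_BUFFER in = {0};   /* start at logical cluster 0 */
                static ULONGLONG raw[1024 + 16];      /* header + 8 KB of bitmap    */
                VOLUME_BITMAP_BUFFER *vb = (VOLUME_BITMAP_BUFFER *)raw;
                DWORD bytes;

                /* One bit per cluster: 1 = allocated, 0 = free. ERROR_MORE_DATA
                   just means the volume has more bitmap than our buffer holds -
                   fine, since we only want a sample.                              */
                if (!DeviceIoControl(hVol, FSCTL_GET_VOLUME_BITMAP, &in, sizeof in,
                                     raw, sizeof raw, &bytes, NULL)
                    && GetLastError() != ERROR_MORE_DATA) return 1;

                LONGLONG sampleBits = 8 * 1024LL * 8; /* bits that fit in raw */
                if (vb->BitmapSize.QuadPart < sampleBits)
                    sampleBits = vb->BitmapSize.QuadPart;

                LONGLONG freeClusters = 0;
                for (LONGLONG i = 0; i < sampleBits; i++)
                    if (!(vb->Buffer[i / 8] & (1 << (i % 8))))
                        freeClusters++;

                printf("%lld of the first %lld clusters are free\n",
                       freeClusters, sampleBits);
                CloseHandle(hVol);
                return 0;
            }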

        Furthermore, when any new file-entry is created, the journaling system in NTFS creates a $LogFile entry for each transaction. This allows system crashes to be stably recovered in situations where FAT would fall over completely. The presence of the NTFS journaling system is the main reason an OS crash on a system running NTFS tends to recover transparently to the user on restart – more so than on systems running the FAT file system. In fact, this was one of the prime design criteria for NTFS as the replacement for FAT.

        See the following for more info: http://www.ntfs.com/transaction.htm

        Journal entries are usually created at the beginning of a segment of the hard disk where a set of file-operations is going to occur. However – once created – that $LogFile entry will stay at that location forever – until the Hard Disk is either reformatted or a Boot-Defrag is performed.
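
        The principle behind $LogFile is ordinary write-ahead logging. The sketch below is conceptual only – it is not NTFS’s actual $LogFile format, and the record layout and file names are invented for illustration – but it shows the invariant that makes crash recovery possible: the intent record reaches stable storage before the in-place change does.

            #include <stdio.h>
            #include <string.h>

            /* Conceptual write-ahead logging, NOT the real NTFS on-disk format. */
            typedef struct {
                unsigned long txn_id;
                long          offset;        /* where the change lands       */
                char          old_data[64];  /* kept for undo (rollback)     */
                char          new_data[64];  /* kept for redo (roll forward) */
            } log_record;

            static void apply_change(FILE *journal, FILE *data, const log_record *rec)
            {
                /* 1. Record the intent in the journal and force it out.
                      (A real system would fsync here, not just fflush.)    */
                fwrite(rec, sizeof *rec, 1, journal);
                fflush(journal);

                /* 2. Only now modify the data in place. A crash between
                      steps 1 and 2 is recoverable: recovery replays
                      new_data from the journal.                            */
                fseek(data, rec->offset, SEEK_SET);
                fwrite(rec->new_data, 1, sizeof rec->new_data, data);
                fflush(data);

                /* 3. Finally a commit record would mark txn_id complete,
                      so recovery knows this transaction needs no work.     */
            }

            int main(void)
            {
                FILE *journal = fopen("journal.log", "ab");
                FILE *data    = fopen("data.bin", "w+b");
                if (!journal || !data) return 1;

                log_record rec = { 1, 0, "", "hello, journaled world" };
                apply_change(journal, data, &rec);

                fclose(journal);
                fclose(data);
                return 0;
            }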

        To see an example of this, use the old Speed Disk found in Norton SystemWorks on an NTFS-based WXP system which has been around for a while. After a full defrag using Speed Disk or any other Windows-based defragger without a Boot-Defrag option, you will notice a whole bunch of little orange squares strewn all over the Speed Disk map. These are $LogFile fragments – which Speed Disk, Windows Defrag, and all other defraggers that lack a Boot-Defrag option cannot consolidate. Thus, every time you write new files to an NTFS volume, you refragment your files as they “skip around” those $LogFile fragments. This is utterly unavoidable – it is inherent to the design of NTFS.

        Note: If your Defragger is not showing you this – it is lying to you in order to “look good” – while not actually doing a “full-pull” job.

        Thus, the only way to fully defragment an NTFS volume is to properly consolidate all the $Metadata files on the Hard Disk – which currently can only be done during a Boot-Defrag operation.

        Note: Even copying-out/reformatting/copying-in does not truly solve the problem – as the $LogFile data is recreated during the copy-in process and refragmentation starts anew as soon as file system operations occur during normal Windows operation. This can be verified by running PerfectDisk or Ultimate Defrag after a copy-out/format/copy-in operation. Even after a full defrag of the newly-restored system – a Boot-Defrag will still find lots of things to consolidate on its first run – due to the nature of how NTFS stores its metadata – and how it handles metadata capacity-expansion requirements.

        Visualizing how Metadata and the MFT interact:

        Ultimate Defrag allows the user to reposition not only the consolidated MFT/Metadata “block” – but also to sort the consolidated Metadata inside the “block” into whatever order the user wishes. This is a great way to visualize the interaction between the MFT and its associated Metadata.

        Note: Both PerfectDisk and Ultimate Defrag also allow the user to see the individual Metadata “blocks” – that reappear spontaneously during normal NTFS operations after a PerfectDisk or Ultimate Defrag pass – which cannot be reconsolidated back into the consolidated MFT/Metadata block until a Boot-Defrag is rerun. Such is the tao of an honest defragger. 🙂

        Final observations:

        It is no accident that people running Exchange Server – which is especially prone to NTFS fragmentation problems bringing Email performance to its knees – commonly use a special form of PerfectDisk to ensure their Corporate Email Systems run with acceptable speed. IMO, the above-described limitations in NTFS make that requirement self-evident.

        TANSTAAFL.

        • #1436717

          Final observations:

          Exchange Server is not an issue for me. Running Windows 8 for a full year on a modest, low-end system (Dell Inspiron 580 with an Intel Core i3 CPU) with absolutely no loss in performance, no BSODs, and no issues of any kind tells me a little bit. My system has three 1TB drives with 19 partitions (two are hidden) spread across them.

          Prior to that year of flawless, faultless performance with Windows 8, I ran the same system for two years dual booting two versions of Windows 7 with the same flawless, faultless performance and no fall-off. Windows 7 Ultimate has been running for 3 years on this machine (I upgraded Windows 7 Home Premium to Windows 8). That sort of experience tells me that the tools I’m using and what I’m doing are working just fine for me. I see no viable reason to change anything that I’m doing.

          YMMV

          Always create a fresh drive image before making system changes/Windows updates; you may need to start over!
          We all have our own reasons for doing the things that we do with our systems; we don't need anyone's approval, and we don't all have to do the same things.
          We were all once "Average Users".

    • #1436769

      Thanks for all the input. Very educational, and lots of information to digest.
      Re: copy-out/reformat/copy-in and yet still having something to defrag afterward – I’m confused.
      I thought that after a format the hard drive is ‘raw’: no data. Is NTFS formatting different? Is there still a ghostly trace left behind to defrag? What if I first format to FAT/exFAT, then format again to NTFS? Or even partition differently first, then redo the partitioning? Would a ghostly trace still be left behind?
      Please discuss and advise. Much appreciated.

      • #1436803

        I’m confused.
        I thought that after a format the hard drive is ‘raw’: no data.

        Before a format, the partition is ‘raw’. Formatting lays down the file system: it writes the boot sector, reserves space for the MFT, creates the $Bitmap that records which clusters are in use, seeds the $LogFile and the other metadata files, and takes care of a few other odds and ends. That’s what formatting does. (The MBR and partition table come from partitioning, and the sectors and tracks were marked off at the factory.)

        A housing development begins with the streets and utilities and surveyed lots before the house building ever starts.
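
        You can also inspect what a format laid down from code. A short C sketch, with a placeholder drive letter, using two standard Win32 calls to report the file system and the cluster geometry the format chose:

            #include <windows.h>
            #include <stdio.h>

            int main(void)
            {
                WCHAR fsName[MAX_PATH + 1];
                DWORD serial, maxComp, flags;
                DWORD sectorsPerCluster, bytesPerSector, freeClusters, totalClusters;

                /* Which file system did the format create? (Placeholder drive.) */
                if (GetVolumeInformationW(L"C:\\", NULL, 0, &serial, &maxComp,
                                          &flags, fsName, MAX_PATH + 1))
                    wprintf(L"file system: %s\n", fsName);

                /* The cluster size the format chose - the allocation unit that
                   $Bitmap tracks and that defragmenters shuffle around.         */
                if (GetDiskFreeSpaceW(L"C:\\", &sectorsPerCluster, &bytesPerSector,
                                      &freeClusters, &totalClusters))
                    wprintf(L"cluster size: %lu bytes; %lu of %lu clusters free\n",
                            (unsigned long)(sectorsPerCluster * bytesPerSector),
                            (unsigned long)freeClusters, (unsigned long)totalClusters);

                return 0;
            }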

        Always create a fresh drive image before making system changes/Windows updates; you may need to start over!
        We all have our own reasons for doing the things that we do with our systems; we don't need anyone's approval, and we don't all have to do the same things.
        We were all once "Average Users".

    • #1436928

      To reply to the original question, I recommend Diskeeper. It is a paid but affordable product (the home version is thirty bucks and is good for three computers), it does the job in real time, and it gives you a place to go if you run into trouble with it. Set it and forget it, although an occasional visit to see if it reports any problems is in order. (One problem that I don’t know if they’ve cured is the ‘program not found – skipping autocheck’ error message that can turn up at boot time in certain circumstances. You can safely ignore it, but it may drive you crazy if you don’t go through whatever it takes to get rid of it.)

      I suggest you view the SpinRite video, which is interesting, and see if there is any freeware (or SpinRite itself) that interests you. SpinRite is not a defragmenter, but a disk maintenance program.

      I mentioned System Mechanic in a post, and if you already have it you may know that they have gotten on the optimization bandwagon in a big way in recent years, complete with an alarm to tell you it’s time you optimized. It is very good at detecting drive errors early in the game, and yes, it has a defragmenter. A number of other things in the suite work well – I just turn off everything I can find that goes on behind my back and run it manually, using what is useful to me. Prices are all over the map, so shop around if buying.

      MyDefrag is something I keep on the computer(s), and these days I just use that for defragging external data drives. Running it (or any other) on a daily schedule if you are set up for it is probably the secret to success, and keeping your system partition optimized makes a real difference in performance.

    • #1436936

      Assuming most defrag software is safe, defragging is still time-consuming on a huge drive. Copy-out/reformat/copy-in, in my 2TB-drive case, is faster than using MyDefrag.
      With the understanding of $LogFile fragmentation, after the copy-out/in I can elect to do a quick defrag (for the $LogFile) instead of reinstalling Windows. This two-step approach seems faster, and safer too. With a backup to boot.
      @MQG1023
      Too huge a drive risks losing huge amounts of data… Good point. Given my nightmarish experience with the 2TB drive, limiting in-the-field hard drives to 1TB is a good compromise. Only use huge drives for backup storage.
