-
WSgadget
AskWoody Lounger
March 13, 2015 at 12:15 pm in reply to: Data recovery for 2TB external HD after partition wiped all data? #1495029

Chung,
Thanks for the very well written post. It is much appreciated.
For the record, both PhotoRec and Recuva are available in portable form. They can be found at the Portable Freeware collection.
It should also be noted that if the original photos were on the C: drive prior to copying to the portable, then some of those deleted originals may actually be recovered from the C: drive. I once ran Recuva on my C: drive looking for a certain deleted file. Since I didn’t know the file name, I looked for everything. It was shocking how many K’s of files it presented for recovery.
Michael

Thanks Michael. The ability to recover old files from the C: drive is what makes me want to use a file shredder. :B):
The very first time I needed data recovery I didn’t know where to start. There are so many tools that really only handle “undeletes” (which are actually very easy to do), while the good ones don’t get enough attention. Windows Secrets is likely to be around for a long time, so hopefully other forum users will find the review helpful.
Speaking of portable apps, the PortableApps.com suite uses Wise Data Recovery (http://www.wisecleaner.com/wise-data-recovery.html). It’s also available for free so I’ll run the same tests and update my earlier post.
Chung
-
WSgadget
AskWoody Lounger
March 13, 2015 at 11:50 am in reply to: Data recovery for 2TB external HD after partition wiped all data? #1495025

A lot of expertise on this forum! I feel for anyone losing otherwise irreplaceable data.
I have a closely related question that might save me from being in that position. I have a 4 TB, 2-disk Western Digital “MyCloud” system set up with RAID 1. My understanding is that RAID 1 stripes the data in such a way that if one of the disks goes belly up, all the data will be recoverable from the other disk. And further, that if the system itself dies, the disks can be easily removed and put in a PC or other external chassis for data recovery.
Am I correct in my understanding?
(I also have a 1 TB drive attached to the WD system via USB to duplicate the most important data – as if I had anything so important it needed so much redundancy.)
Hi wiiiindy,
Which My Cloud model is it? (I’m wondering if it’s a pair of 4 TB disks or a pair of 2 TB disks.) It sounds like you have one of the NAS units because the more basic units don’t support backups to a USB drive.
RAID 1 sets up a mirror, so disk 2 is a clone of disk 1. Writes go to both disks so they stay in sync; usually one disk is considered the “primary” where all reads are directed, and the “secondary” takes over if the primary fails. Because the disks are just duplicates, either one can be removed and plugged into a PC to access the data. Unless there’s corruption and/or damage to the disks, no data recovery tool is needed.
The two-disk My Cloud units also support RAID 0 and JBOD…
RAID 0 stripes data across both disks so the total storage capacity is the sum of both disks. Because reads and writes go to both disks, it provides faster speeds than RAID 1. The downside is that if either drive fails, the entire array can be lost. Data recovery is complicated because files can be split across both disks (it’s like file fragmentation on a single disk, but worse). Adding to the difficulty is that different brands of RAID controllers each have their own algorithms for storing data — a pair of disks formatted in one RAID 0 enclosure might not be compatible with another enclosure from another company (this also applies to RAID 5, RAID 6 and RAID 10 arrays).
JBOD (just a bunch of disks) treats each drive separately. If the My Cloud supports direct USB connections, it behaves like a USB hub with each disk appearing as a separate USB drive. If the My Cloud offers network access, then each disk is just a separate volume on the server.
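To make the striping vs. mirroring difference concrete, here’s a toy Python sketch (my own simplified model; real controllers use much larger stripe units and proprietary, often incompatible, layouts):

```python
# Toy model of RAID 1 (mirroring) vs RAID 0 (striping).
# STRIPE is tiny purely for illustration; real arrays use 64 KB+ units.

STRIPE = 4  # bytes per stripe unit

def raid1_write(data: bytes):
    """Mirror: both disks hold a full, identical copy."""
    return data, data  # disk1, disk2

def raid0_write(data: bytes):
    """Stripe: alternate units between the two disks."""
    disk1, disk2 = bytearray(), bytearray()
    for i in range(0, len(data), STRIPE):
        chunk = data[i:i + STRIPE]
        (disk1 if (i // STRIPE) % 2 == 0 else disk2).extend(chunk)
    return bytes(disk1), bytes(disk2)

def raid0_read(disk1: bytes, disk2: bytes) -> bytes:
    """Reassembly needs BOTH disks; lose one and the data is gone."""
    out = bytearray()
    i = j = turn = 0
    while i < len(disk1) or j < len(disk2):
        if turn == 0:
            out.extend(disk1[i:i + STRIPE]); i += STRIPE
        else:
            out.extend(disk2[j:j + STRIPE]); j += STRIPE
        turn ^= 1
    return bytes(out)
```

In the RAID 1 case either returned disk is readable on its own; in the RAID 0 case each disk holds only every other chunk of the file, which is why losing one drive (or mismatched striping algorithms between enclosures) makes recovery so hard.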
Chung
-
WSgadget
AskWoody Lounger
March 12, 2015 at 8:23 pm in reply to: Data recovery for 2TB external HD after partition wiped all data? #1494894

I found plenty of reviews on data recovery tools that compared cost, user-friendliness, features and other factors, but none that covered the recovery rates. So, taking a few of the most promising choices recommended on this thread, I ran tests to see which ones had the best rate of success recovering lost files.
Test Environment
Oracle VirtualBox virtual machine running Windows 7. This was done so that the tests could be repeated consistently with each recovery program.
“Damaged” Disk
A 100 MB virtual hard drive is partitioned and formatted as NTFS (1 partition, “Simple Disk” layout). In preparing the “damaged” disk, the following steps were taken:
-
[*]Various JPEG images are copied onto the disk until Windows says it is full (~50 files between 2.5 and 5.5 MB).
[*]A few random images are deleted to make space for new images.
[*]More images are copied onto the disk until it is full again.
[*]Steps 2 & 3 are repeated three times in order to deliberately create empty gaps between disk clusters.
[*]One 1920 x 1080 PNG image is copied to the disk (“A.png”).
[*]Two JPEG images are copied to the disk (“B.jpg” and “C.jpg”).
[*]All of the JPEG images except B.jpg and C.jpg are deleted.
[*]The disk is formatted via Windows Explorer’s “Quick Format” option.
[*]Windows is shut down.
[*]The disk is disconnected from the virtual machine, marked “immutable” and then reconnected.

Step #10 was done to ensure that each recovery program got exactly the same disk. Because the disk was essentially read-only, no unexpected changes could be made.
Piriform’s Defraggler program was used to analyze the disk for fragmented files, specifically making sure that A.png, B.jpg and C.jpg were fragmented. The PNG image A.png was stored in 4 fragments, B.jpg was in 4 fragments, and C.jpg was in 2 fragments.
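For anyone curious why the fill/delete/refill cycle (steps 2 through 4) reliably produces fragmented files, here’s a toy first-fit allocator in Python (my own illustration of the general mechanism, not how NTFS actually allocates clusters):

```python
# Toy first-fit allocator on a 24-block "disk": filling the disk,
# deleting files, and writing a bigger file shows how the new file
# gets scattered across the freed gaps.

DISK_BLOCKS = 24
disk = [None] * DISK_BLOCKS  # each entry holds a file name or None

def allocate(name, nblocks):
    """First-fit: grab the first free blocks, wherever they are."""
    placed = []
    for i, owner in enumerate(disk):
        if owner is None:
            disk[i] = name
            placed.append(i)
            if len(placed) == nblocks:
                return placed
    raise IOError("disk full")

def delete(name):
    for i, owner in enumerate(disk):
        if owner == name:
            disk[i] = None

# Fill the disk with four 6-block files, then delete two of them.
for n in range(4):
    allocate(f"img{n}.jpg", 6)
delete("img0.jpg")   # frees blocks 0-5
delete("img2.jpg")   # frees blocks 12-17

# A new 10-block file now lands in two separate gaps -> 2 fragments.
blocks = allocate("B.jpg", 10)
```

The new file ends up in blocks 0-5 plus 12-15, i.e. two non-contiguous runs, which is exactly the kind of layout Defraggler reported for A.png, B.jpg and C.jpg.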
Test Results
(Although I posted about Adroit Photo Recovery, I didn’t include it in the test because there wasn’t a free version that could save the results, which I needed in order to run checksums to verify the recovered files. Previewing images that can be recovered doesn’t guarantee that they’ll be 100% intact.)
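For reference, the checksum verification amounts to something like this Python sketch (the directory layout and function names are my own; any strong hash works, since the point is matching recovered files to originals by content rather than by name):

```python
# Verify recovered files against the originals using SHA-256.
# A recovered file only "counts" if its content hashes identically
# to an original, since carvers assign their own file names.

import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(original_dir: Path, recovered_dir: Path) -> dict:
    """Map each original file name to True if a byte-identical copy
    exists anywhere in the recovered directory."""
    recovered_hashes = {sha256_of(p)
                        for p in recovered_dir.iterdir() if p.is_file()}
    return {p.name: sha256_of(p) in recovered_hashes
            for p in original_dir.iterdir() if p.is_file()}
```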
[INDENT]Lazesoft Recovery Suite
[/INDENT]
[INDENT=2]URL: http://lazesoft.com/lazesoft-recovery-suite.html
supported operating systems: Windows installer, standalone bootable WinPE or Linux live system[/INDENT]
[INDENT=2]version: 4.0.1
installed size: 96 MB
files recovered:[/INDENT]
[INDENT=3]Fast Scan: 0[/INDENT]
[INDENT=3]Undelete: 0
Unformat: 20 JPEG
Deep Scan: 22 JPEG[/INDENT]
[INDENT]
PhotoRec[/INDENT]
[INDENT=2]URL: http://www.cgsecurity.org/wiki/PhotoRec
supported operating systems: portable app for DOS, Windows, OS X, Linux[/INDENT]
[INDENT=2]version: 6.14
installed size: 5.5 MB
files recovered: 42 JPEG, 1 PNG[/INDENT]
[INDENT]
Recuva
[/INDENT]
[INDENT=2]
URL: http://www.piriform.com/recuva[/INDENT]
[INDENT=2]supported operating systems: Windows installer or portable version (http://www.piriform.com/recuva/builds)[/INDENT]
[INDENT=2]version: 1.51.0.1063
installed size: 3.6 MB
files recovered:
[/INDENT]
[INDENT=3]Fast Scan: 0
Deep Scan: 22 JPEG[/INDENT]
[INDENT]Wise Data Recovery
[/INDENT]
[INDENT=2]
URL: http://www.wisecleaner.com/wise-data-recovery.html
supported operating systems: Windows installer[/INDENT]
[INDENT=2]version: 3.51.188
installed size: 3.3 MB
files recovered: 18 JPEG[/INDENT]LRS, Recuva and WDR weren’t able to recover any fragment of the PNG file. PhotoRec wasn’t able to recover the entire file, but was able to get the first fragment (60% of the image = 1920 x 649).
For the JPEGs, the results were much more varied:
-
[*]LRS’ “Unformat” tool recovered 20 of the deleted images, while its “Deep Scan” tool was able to recover two additional images. Neither tool was able to completely recover the fragmented B.jpg and C.jpg files — both recovered 20% of B.jpg and 0% of C.jpg.
[*]Recuva, interestingly, recovered the same number of images, but instead of recovering just a fragment of B.jpg and C.jpg, it was able to recover the thumbnail image that is often created by digital cameras and stored with the full size image. It skipped over any image that could only be partially recovered.
[*]PhotoRec was able to recover the most files overall, but half the files were thumbnails that matched the larger full size images, so technically it recovered 21 unique images. Not sure why it skipped over the fragmented B.jpg and C.jpg files while recovering at least part of A.png.
[*]WDR’s results were also very interesting. Although it recovered only 18 JPEG files, it was the only one that also recovered the correct file names. Considering that the files on the test disk were deleted and then the disk was reformatted, that’s impressive.

Each time a partition is formatted as NTFS, the MFT (master file table) can end up in a different location in the partition. The more times a partition is reformatted (even with just a “Quick Format”), the greater the chances of valuable data being overwritten.
As a final test, the disk was reformatted with the “Quick Format” option disabled. A full format writes zeros to every cluster in the partition. As expected, none of the recovery tools were able to extract even a fragment of a single image file. (One of the many great advantages of virtualization… because the test disk was set as immutable in VirtualBox, a shutdown is all it takes to revert any changes.)
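To see why a zero-filled partition defeats every carver, here’s a minimal Python sketch of the signature scan that PhotoRec-style tools perform (heavily simplified; real carvers also parse the file structure to find where each file ends):

```python
# Minimal file-carving signature scan: look for the JPEG
# start-of-image marker (FF D8 FF) in raw disk bytes.
# After a full format writes zeros everywhere, no marker survives.

JPEG_SOI = b"\xff\xd8\xff"

def find_jpeg_offsets(raw: bytes):
    """Return every byte offset where a JPEG header begins."""
    hits, start = [], 0
    while (pos := raw.find(JPEG_SOI, start)) != -1:
        hits.append(pos)
        start = pos + 1
    return hits

# Quick format: file table gone, but old file data still in place.
quick_formatted = b"\x00" * 512 + b"\xff\xd8\xff\xe0leftover jpeg" + b"\x00" * 512
# Full format: every cluster zeroed, nothing left to match.
full_formatted = b"\x00" * 1024
```

No pattern to match against = no recovered file, exactly as the full-format test showed.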
Summary
So, it turned out to be close as far as the number of images recovered (18 to 22 each). No single recovery tool clearly outperformed the others.
If the goal was to get back as many pictures as possible (regardless of resolution), then a combination of PhotoRec and Recuva did the best job. Combined, they were able to recover at least part of the fragmented files along with complete thumbnails (A.png, B.jpg, C.jpg).
LRS paired up with PhotoRec or Recuva wasn’t quite as good as PhotoRec + Recuva, and combining all of them didn’t increase the number of images recovered.
Thoughts…

As the test results confirmed, when magnetic media is overwritten even just once, software recovery tools alone are often not enough. More advanced (and expensive) techniques using magnetic force microscopy (MFM), atomic force microscopy (AFM) and/or scanning tunneling microscopy (STM) are required. With the right equipment and software, even data from a disk with holes drilled through it can sometimes still be recovered. (I used to handle IT support at a place that had quite a bit of state-of-the-art equipment, including laser scanning microscopes, electron microscopes, and atomic force microscopes… cool stuff!)
One advantage LRS has over the others is the ability to create a bootable CD and/or USB flash drive for standalone operation (it uses a choice of Linux or WinPE for the operating system). This makes it ideal if you only have one computer and the lost data is on the internal drive.
It was nice that WDR was able to recover the file names, but it’s really not that important considering that most people don’t actually rename their photos. Look on just about anyone’s digital camera and/or smartphone and you’re likely to find hundreds or thousands of files with names like “100_5678.jpg” and “2015-03-13 14.15.16.jpg”. A much better option is to use one of the many free Exif tools to update the file names and correct the time stamps using the internal metadata.
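The renaming step is trivial once the Exif date has been extracted. Here’s a Python sketch of just that step (the function name and naming scheme are my own choices; reading the DateTimeOriginal tag itself is left to an Exif tool or library):

```python
# Turn an EXIF DateTimeOriginal value (stored as "YYYY:MM:DD HH:MM:SS")
# into a sortable file name of the "2015-03-13 14.15.16.jpg" style
# mentioned above.

from datetime import datetime

def name_from_exif(datetime_original: str, ext: str = ".jpg") -> str:
    dt = datetime.strptime(datetime_original, "%Y:%m:%d %H:%M:%S")
    return dt.strftime("%Y-%m-%d %H.%M.%S") + ext
```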
Given the size of each program and the simpler interfaces (i.e. fewer options to select from), I’d use Recuva first, followed by PhotoRec if I wanted to try to recover more files, and then maybe LRS for any file types not covered by Recuva and PhotoRec.
That’s all for now,
Chung
-
WSgadget
AskWoody Lounger
March 11, 2015 at 7:09 pm in reply to: Data recovery for 2TB external HD after partition wiped all data? #1494683

Yes, that was the program I’d passed on, and it may be a while before ladybugamk gets it because she’s waiting for the machine to come back.
As it says it’s not for use on HDDs, does that mean it won’t do the job for ladybugamk?
Hi Sudo15,
I took a closer look at the specs for RescuePro and it unfortunately won’t work for ladybugamk. RescuePro will only scan devices that are of a certain type and capacity. The standard edition is limited to media up to 64 GB and the deluxe edition tops out at 128 GB. The URL for the comparison chart:
[INDENT]http://www.lc-tech.com/documents/rescueprocomparison3.pdf[/INDENT]
I can’t think of a technical reason why RescuePro couldn’t work on a hard drive other than what appears to be built-in restrictions.
Even if RescuePro supported hard drives, it wouldn’t be practical for ladybugamk’s needs. I just noticed one of the system requirements:
[INDENT]Minimum of free hard disk space twice as large as the media you wish to recover
[/INDENT]

So ladybugamk would need to buy/borrow a 4 TB drive to try and recover her 2 TB drive… yikes!
The company’s FILERECOVERY software will handle hard drives. The downside is that the license is via an annual subscription. For the standard edition, it’s £53.58 (about $80 USD) per year. The licensing cost makes Recuva and Adroit better choices for ladybugamk.
Chung
-
WSgadget
AskWoody Lounger
March 11, 2015 at 4:49 pm in reply to: Data recovery for 2TB external HD after partition wiped all data? #1494667

Hi ladybugamk,
If Recuva wasn’t able to recover the pictures that you’re looking for, I found another program that might have better success:
[INDENT]Adroit Photo Recovery – http://photo-recovery.info/[/INDENT]
It’s a commercial program that charges $19.99 for a 7-day license or $49.99 without a time limit. The tool scans and shows all recoverable image files it finds for free, so you only pay the license fee to save the results.
Adroit uses a recovery technique that’s more adaptive than Recuva, PhotoRec and RescuePro: it applies heuristics and statistics about file fragmentation and image formats to try and reconstruct the images. The authors who created the algorithm, published in an IEEE forensics research paper back in 2006, founded the company (Digital Assembly) that develops Adroit.
Good luck,
Chung
-
WSgadget
AskWoody Lounger
March 11, 2015 at 3:12 pm in reply to: Data recovery for 2TB external HD after partition wiped all data? #1494647

What about RescuePRO Deluxe http://www.lc-tech.co.uk/pc/sandisk-rescuepro-and-rescuepro-deluxe/ – perhaps it won’t do the job I thought it would as I’ve just come across this link.
Hi Sudo15,
Is RescuePro the one that you PM’d to ladybugamk?
RescuePro looks very interesting. There aren’t a lot of details about how it works, but it appears to use the same techniques as Recuva and PhotoRec. When I get a chance I’ll run some tests. It would be interesting to see how they compare.
At the hardware level, flash media stores data very differently than magnetic media so not all tools will work on both media types. It makes sense that RescuePro’s webpage has the following note:
[INDENT]RescuePRO Standard and RescuePRO Deluxe are not for use on hard drives or RAIDs
[/INDENT]
Data scrubbing/shredding tools also have problems with flash media. Except perhaps in the cheapest flash memory, wear-leveling is a standard feature: the flash controller moves data around to even out wear across all available blocks. The operating system has no control over when and where data is moved, so even data that hasn’t changed in a while gets moved around to balance the number of write cycles. A data scrubber can’t reliably overwrite a particular block multiple times because the block’s address could be remapped at any time by the flash controller. At any given moment, there could be two identical copies of a block of data — one block is in use, the other is marked as “free”. A recovery tool could still access the old data that’s stored in the “free” block. It’s great for data recovery, but bad for data privacy.

Recovering data from a RAID is a whole other problem. With luck, I’ll never have to deal with that again because it’s a drag.
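Here’s a toy flash translation layer in Python to illustrate the problem (grossly simplified; real controllers manage erase blocks, pages, and garbage collection):

```python
# Toy flash translation layer: every write of a logical block goes to
# a fresh physical cell for wear-leveling, and the old physical cell
# is merely marked free -- its contents stay readable until reused.

class ToyFlash:
    def __init__(self, nblocks=8):
        self.phys = [b""] * nblocks        # raw cell contents
        self.free = list(range(nblocks))   # erased/free physical cells
        self.map = {}                      # logical block -> physical cell

    def write(self, logical, data: bytes):
        new = self.free.pop(0)             # wear-leveling: always a new cell
        self.phys[new] = data
        if logical in self.map:
            self.free.append(self.map[logical])  # old copy "freed", not erased
        self.map[logical] = new

    def read(self, logical) -> bytes:
        return self.phys[self.map[logical]]

flash = ToyFlash()
flash.write(0, b"secret v1")
flash.write(0, b"overwritten!")  # the OS believes v1 is gone...
```

After the second write, the OS only sees the new data, yet the old `b"secret v1"` is still sitting in a cell the controller marked free — which is exactly what a shredder can’t control and a recovery tool can exploit.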
Personally, PhotoRec is my go-to choice followed by Recuva because I use Windows, Macs and Linux so cross-platform support is important (PhotoRec even supports DOS :D). I carry a bootable FAT-32 formatted USB flash drive with PhotoRec and Recuva. With a few gigabytes of free space on the flash drive, there’s enough room to recover and hold at least a few important files in an emergency.
PhotoRec has a huge list of patterns that cover all of the most common file formats and then some (http://www.cgsecurity.org/wiki/File_Formats_Recovered_By_PhotoRec). It looks like RescuePro also supports hundreds of file types but I couldn’t find a list to compare with.
Chung
-
WSgadget
AskWoody Lounger
March 10, 2015 at 7:07 pm in reply to: Data recovery for 2TB external HD after partition wiped all data? #1494501

Look, sudo15 has already generously offered a valid subscription to a recovery service. The OP will load it and examine what the recovery app can see for recovery. Likely everything. Once the product key is supplied, the recovery app will recover.
So the rest is pretty much moot, other than to say the first step without such an offer is to see what the free Recuva sees, which is often a lot.
As for recovering partitions, etc., if it came to that I’d use the free TestDisk (conveniently found on the bootable SystemRescueCD).
(@Fascist Nation: Your post above sounded like a reply to my earlier post #15, but I wasn’t sure. If you found my post unhelpful and/or upsetting in some way, I apologize. I was only trying to provide some technical info to go along with the other tips that were already provided by other posters.)
SystemRescueCD is excellent. TestDisk is also really great, and so is its sibling PhotoRec by the same developer.
Recuva is definitely worth trying since the free version covers the basics without limitations (PhotoRec might be too overwhelming for anyone more accustomed to a graphical interface).
Recuva, PhotoRec and other similar tools use a pattern-based file carving technique for recovering files. They are excellent tools, but they can’t recover files whose data blocks have been overwritten (the file metadata can be lost, but the data blocks must be intact): no pattern to match against = no recovered file.
Compressed image formats like JPEG and PNG make the recovery more difficult because small errors result in big data corruption. In those cases, if part of the file can be recovered, using a program like IrfanView might help to save at least part of a picture because it’s a lot better at dealing with corrupted images than the simple viewer built into Windows Explorer.
A catch with recovering partitions arises when the partitions have been deleted and reformatted more than once, which unfortunately happened to ladybugamk. An MBR disk keeps no backup copy of the partition table, so a recovery tool has to scan the disk for recognizable file systems. If the markers that identify the ends of a file system and/or other metadata are corrupted, there might not be enough clues to reconstruct the original partition table with an intact NTFS volume.
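As a rough illustration of that file-system scan, here’s a Python sketch that hunts for the NTFS boot-sector fingerprint (the OEM ID "NTFS    " at byte offset 3 and the 0x55AA signature at offset 510); tools like TestDisk validate much more than this before proposing a partition:

```python
# Scan a raw disk image sector by sector for NTFS boot sectors,
# the way a partition-recovery tool locates lost volumes.

SECTOR = 512

def find_ntfs_boot_sectors(image: bytes):
    hits = []
    for off in range(0, len(image) - SECTOR + 1, SECTOR):
        s = image[off:off + SECTOR]
        if s[3:11] == b"NTFS    " and s[510:512] == b"\x55\xaa":
            hits.append(off)
    return hits

# Fake 4-sector image with one NTFS boot sector at sector 2.
boot = bytearray(SECTOR)
boot[3:11] = b"NTFS    "
boot[510:512] = b"\x55\xaa"
image = bytes(SECTOR) * 2 + bytes(boot) + bytes(SECTOR)
```

If reformatting corrupted or overwrote those marker bytes, the scan simply finds nothing, which is the failure mode described above.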
Chung

Links:
[INDENT]Recuva (http://piriform.com/recuva)[/INDENT]
[INDENT]SystemRescueCD (http://sysresccd.org/)[/INDENT]
[INDENT]TestDisk (http://www.cgsecurity.org/wiki/TestDisk)[/INDENT]
[INDENT]PhotoRec (http://www.cgsecurity.org/wiki/PhotoRec)[/INDENT]
[INDENT]IrfanView (http://irfanview.com/)[/INDENT]
-
WSgadget
AskWoody Lounger
March 10, 2015 at 4:13 pm in reply to: Data recovery for 2TB external HD after partition wiped all data? #1494478

A couple of things to watch out for when dealing with data recovery:
Partition layout …
When trying to restore the original partitions on a drive that’s been inadvertently repartitioned, several factors affect the partition layout: the capacity and vintage of the drive, the operating system used to run the repair tool(s), the repair tools themselves, and the BIOS.
When drive capacities were smaller (2TB or less), using MBR was fine. For drives larger than 2TB, the newer GPT (GUID Partition Table) is used instead of MBR. For example, on a 4TB drive, only the first 2TB can be used if the partitioning scheme is MBR.
(For the geeks among us, here’s the math… most drives natively use 512-byte sectors, or have 4096-byte sectors presented in 512-byte chunks for compatibility reasons. MBR uses a 32-bit value to address sectors, which allows ~4.29 billion possible addresses. 2^32 sectors x 512 bytes per sector = 2,199,023,255,552 bytes, which is exactly 2 TiB, or about 2.2 TB in decimal units.)
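Or, done directly in Python:

```python
# The MBR ~2 TB ceiling, computed from first principles:
# 32-bit LBA sector addresses x 512-byte sectors.

SECTOR_BYTES = 512
max_sectors = 2 ** 32                    # 32-bit sector addresses
max_bytes = max_sectors * SECTOR_BYTES   # 2,199,023,255,552 bytes = 2 TiB
```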
GPT is great, except that older versions of Windows don’t fully support it: 32-bit Windows XP can’t use GPT disks at all, and booting from a GPT disk requires UEFI firmware. So a 2TB drive could end up being partitioned using either MBR or GPT depending on the computer that’s being used. From a user’s point of view via a file manager, the drive partitions might look the same, but under the covers the partition scheme and sector alignment can be very different.
GPT also stores partition info outside the first sector, including a backup copy of the partition table at the end of the disk. With MBR, removing the partitions from a disk and then restoring the same partition layout won’t destroy any of the data blocks, but converting from MBR to GPT will end up overwriting some existing data even if the new partitions aren’t reformatted.
Assuming that I followed ladybugamk’s timeline correctly, her external 2TB USB drive has gone through the following changes since the problem started:
-
[*]A 2TB USB hard drive with files already on it was used as the target drive for Windows 8 to create a bootable recovery disk.
(Ladybugamk said that the drive was 32GB after the process completed, so we know that up to the first 32GB of the disk was overwritten. From personal experience, a barebones Windows 8 installation can create around 8GB worth of files for a recovery disk. OEMs will often include drivers and other software so it’s likely a minimum of 10GB of lost data assuming that only a quick format was done after the 32GB partition was created. Recovery programs often do full formats to check for bad blocks so it’s more likely that the entire 32GB was wiped.)
[*]After realizing that the 2TB USB hard drive had been repartitioned from 2TB to 32GB, the 32GB partition was deleted.
[*]Personal files were then manually copied from the laptop to the 2TB USB hard drive.
(This meant the drive had to be repartitioned and reformatted again before the files could be copied.)
So we’re looking at repartition to 32GB -> format -> write recovery files -> delete 32GB partition -> repartition to 2TB -> format -> copy personal files from the laptop. That’s two repartitions, two reformats and two file copy sessions. Newer versions of Windows default to a “quick format” so the data on the remaining 2TB might still be there.
File fragmentation …
Most users don’t defragment external drives and also tend to work directly off of them, so the chances of heavy file fragmentation are high.
NTFS defaults to 4K blocks, which works out to 256 blocks per megabyte. So for every MB that’s newly written, potentially up to 256 lost files could be partially overwritten.
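The cluster math in Python, including the worst-case bound (the helper name is my own; the worst case assumes every overwritten block belonged to a different deleted file):

```python
# NTFS default 4 KB clusters: blocks per megabyte, and the worst-case
# number of deleted files a newly written chunk of data can damage.

CLUSTER = 4 * 1024       # 4 KB default NTFS cluster size
MB = 1024 * 1024

clusters_per_mb = MB // CLUSTER   # 256

def max_files_damaged(bytes_written: int, cluster: int = CLUSTER) -> int:
    """Worst case: each overwritten cluster held a different lost file."""
    return -(-bytes_written // cluster)  # ceiling division
```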
In her first post, ladybugamk said, “[…] So it runs the real scan to recover the files, and instead of seeing pictures in the folders, all I could see was that generic thumbnail graphic for pictures. Attempting to open the pictures yielded a ‘this file does not exist’ type of error. I got back maybe a handful of actual pictures.” Windows assigns a generic icon to images that have a valid file extension but contain errors that prevent Windows from generating thumbnails. This is often a sign of file fragmentation: some of the files’ data blocks were lost to the earlier reformatting and file copies.
Based on the timeline and symptoms so far, it doesn’t sound promising, but I still hope that ladybugamk is able to get her photos back. :unsure:
Chung
-
WSgadget
AskWoody Lounger
March 10, 2015 at 1:57 pm in reply to: Data recovery for 2TB external HD after partition wiped all data? #1494434

that’s what I meant, any specific cluster containing user data which is over-written later by new info — old info gone forever. The making and unmaking of partitions rarely interfere with user data clusters. Just how does NTFS work compared to FAT32? Does NTFS have things similar to FAT32’s FAT/File Allocation Table and/or DIR tables?
Yup, it’s called the MFT (master file table) in NTFS. In Windows NT and higher, running the built-in defragmenter or another similar tool like Defraggler will show the blocks that are holding the MFT.
There are some basic similarities between MFT and FAT, but MFT is also able to store file permissions beyond just basic read-only and read/write.
Unlike FAT, NTFS is a journaling file system so a separate transaction log is used to hold changes to the MFT, data blocks, etc. until they are completely written to disk. If a write request is interrupted (e.g. power outage, OS crash, drive goes offline), a “dirty bit” isn’t cleared. This triggers an automatic recovery where the journal is scanned for incomplete transactions and the bad data blocks are rolled back to their previous state. With FAT, corrupted blocks can build up over time until the file system becomes unstable unless regular checks are done.
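A toy write-ahead journal in Python showing the dirty-bit/rollback idea (vastly simplified compared to NTFS’s $LogFile; the class and key names are my own):

```python
# Toy journaling store: every change is logged before it is applied,
# and an interrupted (uncommitted) write is rolled back on mount --
# the same recovery idea NTFS uses when the dirty bit is still set.

class JournaledStore:
    def __init__(self):
        self.blocks = {}       # committed on-disk data
        self.journal = []      # entries of [key, old_value, committed]
        self.dirty = False     # "dirty bit": a transaction is in flight

    def write(self, key, value, crash_midway=False):
        old = self.blocks.get(key)
        self.journal.append([key, old, False])  # log intent first
        self.dirty = True
        self.blocks[key] = value                # data hits the disk...
        if crash_midway:
            return                              # ...power cut before commit
        self.journal[-1][2] = True              # mark committed
        self.dirty = False

    def mount(self):
        """If the dirty bit is set, roll back incomplete transactions."""
        if self.dirty:
            for key, old, committed in reversed(self.journal):
                if not committed:
                    if old is None:
                        self.blocks.pop(key, None)
                    else:
                        self.blocks[key] = old  # restore previous state
            self.dirty = False

fs = JournaledStore()
fs.write("mft:5", b"file A metadata")
fs.write("mft:5", b"half-written", crash_midway=True)  # interrupted
fs.mount()                                             # automatic recovery
```

After the simulated crash, mounting rolls the record back to its last committed state instead of leaving half-written data behind, which is the stability edge NTFS has over FAT.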
-
WSgadget
AskWoody Lounger

A new user doesn’t choose XP or Vista. Most new users would even find it difficult to buy Windows 7 now.
Obviously, someone buying a new computer right now wouldn’t have to decide between Windows XP and Vista (at least we hope not :eek:), but they still exist and not every new user buys the latest model, so with millions of XP and Vista computers out there they are likely to run across one.
Windows 7 is still readily available in-store and online — a quick check on BestBuy.com lists more than 400 models of Windows 7 laptops to choose from. Even with Windows 10 just over the horizon, Windows 7 will still be in demand for at least another year or two. Dell, HP, Lenovo and others aren’t going to ignore a sale just because a customer doesn’t want Windows 8.
A new user buying a Windows 8 computer has a choice of four editions: “consumer”, Pro, Enterprise and RT. Besides the included software, each edition also has differences in functionality (e.g. consumer editions don’t support Active Directory logins) and one of them (RT) doesn’t run existing desktop Windows software. A lot of high-end Windows software won’t work on home/consumer editions of Windows.
Between Windows, Mac and Linux, the Mac OS is the least fragmented. Mac users don’t have to choose between home and professional editions. Even compatibility (outside of gaming) is less of an issue because there’s Microsoft Office for Mac. There used to be Internet Explorer for the Mac, but it didn’t have feature parity with the Windows version because it didn’t support ActiveX, which is still heavily used by business applications.
The point is that two of the most frequently quoted reasons why Windows is better than Linux are “fragmentation” and “compatibility”. The reality is that both OSes suffer from them. Just as Linux has built up different distributions/editions over the years, so too has Windows during the same time frame. Windows compatibility was a big selling point, but web-based applications, better cross-platform frameworks, software containers (e.g. Docker) and other technologies (Wine, AMIDuOS – http://amiduos.com/) are saving users from having to think about the operating system as much as they used to. We’re past the time of everything being about the operating system. It’s all about the apps.
Chung
-
WSgadget
AskWoody Lounger

Had the source code for OpenSSL been closed-source, the buggy code may have been reviewed by more than one person and so discovered in far less than the two years it actually took the open-source community to find it. Even the guy who made the mistake said it should have had greater scrutiny before release. Heartbleed was an open-source failure which may have cost $500,000,000 to put right.
Bruce
Hi Bruce,
I have to respectfully disagree with you.
[INDENT]- $500 million certainly isn’t pocket change, but at the same time it’s a relatively small price considering how many people and how much equipment and software worldwide were factored into that total. Based on Facebook’s user base alone, there are conservatively over 1 billion users who depend on OpenSSL. That’s $0.50/person. Target spent more than $100 million to replace the point-of-sale terminal network in its stores plus an estimated $400 million more in other costs related to a data breach in late 2013. With 110 million or so customer accounts affected, that’s about $4.50/person, or about 9x the cost of Heartbleed. Target’s point-of-sale terminals were running closed-source software and it didn’t make them any more secure.[/INDENT]
[INDENT]- Following similar logic, the newly discovered FREAK (http://en.wikipedia.org/wiki/FREAK) exploit that dates back to the 1990’s should have been discovered much sooner. It impacts just about everything that relies on SSL/TLS and will likely cost more and take longer to fix than Heartbleed because it can’t be fixed by a simple software update.[/INDENT]
[INDENT]- Then there’s Superfish. Komodia’s software has been used in dozens of products since at least 2011, but it took almost 4 years before widespread news broke of what it does (on purpose) to the SSL/TLS security on Windows. It affected Internet Explorer, Google Chrome and Mozilla Firefox — the first is closed-source, the second is a closed/open source hybrid, and the last is completely open-source. Internet Explorer being closed-source didn’t provide any extra protection. The person credited with the discovery had to watch the network traffic in order to see what Komodia’s software was doing because the source code wasn’t available for review. How much time and money will be spent to clean up the mess?[/INDENT]
Just because software is closed-source in no way increases the odds that it will be sufficiently reviewed by more than one person. Think about how much Windows, Mac and Linux software is written by individuals or small groups. Toss in mobile development and we’re now talking about hundreds of thousands of apps that are each potentially reviewed for security by perhaps one person at best. In the business world, scheduling code reviews for security issues often takes a back seat to product release schedules.
And how many private app developers have staff who are well-versed in computer security? The programmer who made the coding error in OpenSSL was a PhD student and the programmer (with a PhD in Mathematics and a background in cryptography) who peer reviewed the code is one of the four core developers of OpenSSL. Programmers at all skill levels can, and will, make mistakes.
Having a large amount of resources (e.g. staff, money) still doesn’t mean that code is free of security bugs. OpenSSL is used not only by open-source projects, but also by companies who spend millions of dollars on R&D (e.g. Google, Cisco, IBM, Dell, HP, Amazon, Red Hat, etc.) — any one of them could have reviewed the source code before incorporating it into their smartphones, network routers, firewalls, web services and hundreds of other products. Although Windows, iOS and OS X weren’t directly impacted by Heartbleed, they were still indirectly affected because software that relies on OpenSSL is commonly used on those platforms. One of Google’s security team members was credited with discovering the Heartbleed bug. Had OpenSSL been closed-source, we’d all be relying on a single entity to audit the code and alert us to any problems. A fix for Heartbleed was rolled out within a couple of days of the announcement. There is a lot of closed-source software with security problems that isn’t fixed that quickly, or ever fixed at all.
Heartbleed wasn’t a failure of open-source, or a celebration for closed-source, but rather a collective “dropping of the ball” with regards to a widespread assumption that a critical piece of software was perfect without enough people actually validating it. It was a wake-up call that resulted in a who’s who list of companies jumping in to fund the OpenSSL project and other widely used software. Everyone wins.
Chung
-
WSgadget
AskWoody LoungerI have long been interested in some flavor of Linux, but not to complicate things by running Windows under Linux. And I depend on a program which works with a modem and one phone line, listens to an incoming call for a fax tone, if a fax tone, sends back the handshake, and receives the fax digitally. If there is no fax tone, it answers, sends an outgoing message, and records the voice mail. I started with WinFax Pro until it dropped voice, then HotFax Message Center until Smith Micro dropped it, and then FaxTalk Messenger Pro.
I have checked several times and never found a Linux program that will perform that function. Does anyone know of such a program which will run under Linux?
Hi meadi8r,
Because fax and voice are different telephony protocols, you’ll need two programs. The first two that come to mind are HylaFAX (http://hylafax.org/) and Asterisk (http://asterisk.org/). They can be installed on just about any Linux distribution, and for your particular use case, an old computer will work fine. If you’d prefer something more unified, check out Elastix (http://elastix.org/), which is built on HylaFAX and Asterisk. I’ve seen Asterisk used for large groups (30,000+ people) and also know someone who uses it at home.
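To give a feel for the fax/voice split on the Asterisk side, here’s a rough dialplan sketch for `extensions.conf`. This is not a drop-in config: it assumes fax detection is enabled in the channel driver (e.g. `faxdetect=incoming` in `chan_dahdi.conf`), and the mailbox number and spool path are placeholders.

```ini
; Hedged sketch only. With faxdetect enabled in the channel driver,
; Asterisk jumps calls that present a fax (CNG) tone to the special
; "fax" extension; everything else falls through to voicemail.
[incoming]
exten => s,1,Answer()
exten => s,n,Wait(4)                      ; give fax detection time to hear a tone
exten => s,n,Goto(voice,s,1)              ; no fax tone: treat as a voice call

exten => fax,1,ReceiveFAX(/var/spool/asterisk/fax/${UNIQUEID}.tif)

[voice]
exten => s,1,VoiceMail(100@default)       ; plays the greeting, records the message
```

That’s essentially the WinFax Pro/FaxTalk call flow you described — answer, listen for the fax handshake, and either receive the fax digitally or play the outgoing message and record voicemail.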
Chung
-
WSgadget
AskWoody LoungerThanx for the informative article. However, it and the responses illustrate the reason I (and most users) stay away from Linux in droves. I don’t want to work with an OS that comes with dozens of different major versions, options and choices. The Linux community has identified the fragmentation of their OS as a main barrier to its widespread adoption and has often tried to encourage some sort of unification. But its answer seems to be yet more versions, options and choices. Yes, Windows is a pain, but a unified, consistent, and supported one.
It’s a very interesting point…
With Linux, a new user typically picks a “distro” (http://distrowatch.com) after trying out a few of them and seeing which has the most appeal and/or works best with his/her existing hardware. Having so many choices is certainly overwhelming.
With Windows, a new user first chooses between the various editions (since XP there have been more than 20, and not all of them run x86 Windows software):
[INDENT]http://en.wikipedia.org/wiki/Windows_XP_editions
http://en.wikipedia.org/wiki/Windows_Vista_editions
http://en.wikipedia.org/wiki/Windows_7_editions
http://en.wikipedia.org/wiki/Windows_8_editions[/INDENT]After the initial boot, the user then spends the next several days, weeks and/or months tweaking the look-and-feel with all kinds of additional software to change the Windows toolbars, add desktop widgets, virtual desktops, firewall and so on (e.g. Start10, Fences, NeXus, ZoneAlarm). Once that’s done, it’s still running Windows, but it no longer looks or functions exactly like the Windows computer across the street. The loss of the Start Menu in Windows 8 only made customizing even more appealing. With the exception of company computers managed by IT staff, just about every Windows user heavily customizes his/her workspace to suit personal tastes and workflow.
So when it comes to fragmentation, really the only difference between Linux and Windows is when the fragmentation happens.
Having used both Windows and Linux for years, I’ve found that backwards compatibility for commercial software is definitely better in Windows, though not for all Windows software. Where Linux has the advantage is how long hardware and software stay supported — for example, I’ve got an old (but nice) flatbed scanner and an equally old webcam that stopped working starting with Windows Vista/7 but still work fine in the latest bleeding-edge releases of 32-bit and 64-bit Linux. I can also run games from the early ’90s using DOSEMU (http://dosemu.org/) and Wine (http://winehq.com/) while the same games hang or crash in newer versions of Windows.
-
WSgadget
AskWoody LoungerThe article is right on. Linux works on older machines and it frequently updates. I am bound to Windows only because Quicken doesn’t make a Linux version. I’ve been toying with Wine, a windows emulator, which should run Quicken just fine. If Microsoft doesn’t offer free upgrades to Windows 10 from the awful Windows 8, I’ll probably make the switch on my desktop. On the road, I already use a Linux laptop with Libre Office and Skype and, for the cloud (e.g. One Drive, Google Drive), all you need is a browser.
Why don’t you just run Windows in a virtual machine on Linux? Plenty of ways to do that. Here’s one way: http://www.howtogeek.com/howto/18768/run-windows-in-ubuntu-with-vmware-player/
VMware Player and Oracle’s VirtualBox (http://virtualbox.org/) are both great options, but Wine can be better in cases when there are…
-
[*]Older computers — Wine creates a “compatibility layer” that translates Windows library calls into corresponding Linux library calls so it doesn’t require partial and/or full virtualization. Without the need to run a complete copy of Windows, it requires a lot less RAM and CPU power.
[*]Budget considerations — No need to purchase additional Windows licenses. OEM copies of Windows can’t legally be transferred to a different computer so reusing Windows from a Dell, HP or other OEM system that’s being scrapped isn’t authorized by Microsoft. At $100 a pop for a Windows 7 Home OEM license, it might not be financially practical to run a Windows program that’s used just every now and then.
Wine isn’t a perfect solution, but it’s definitely worth trying first before going with virtualization even on a fast computer (a Windows 7 or 8 installation can easily eat up 16GB of disk space and a couple gigabytes of RAM even before any other software is installed).
Chung
-
WSgadget
AskWoody LoungerThe article is right on. Linux works on older machines and it frequently updates. I am bound to Windows only because Quicken doesn’t make a Linux version. I’ve been toying with Wine, a windows emulator, which should run Quicken just fine. If Microsoft doesn’t offer free upgrades to Windows 10 from the awful Windows 8, I’ll probably make the switch on my desktop. On the road, I already use a Linux laptop with Libre Office and Skype and, for the cloud (e.g. One Drive, Google Drive), all you need is a browser.
Hi rms800,
If you use Wine a lot, take a look at PlayOnLinux (http://playonlinux.com/) and CodeWeavers’ CrossOver (http://codeweavers.com/). They both use Wine but with custom GUIs that make installing and managing programs a lot easier.
Chung