Why the need to reboot after updating Windows?
This topic has 141 replies, 18 voices, and was last updated 15 years, 2 months ago.
WSStephanie Small
AskWoody Lounger, January 13, 2010 at 6:31 pm #465753

TOP STORY
Why the need to reboot after updating Windows?
By Susan Bradley

Not so long ago, Microsoft promised that fewer Windows patches would require restarting the system to complete their installation. Microsoft clearly hasn’t delivered on that promise, so PC users need to take steps to ensure that they don’t lose data due to unexpected post-update reboots.

The full text of this column is posted at WindowsSecrets.com/2010/01/14/01 (opens in a new window/tab). Columnists typically cannot reply to comments here, but do incorporate the best tips into future columns.
WSaggletonm
AskWoody Lounger, January 14, 2010 at 1:48 am #1198260
WSdbneeley
AskWoody Lounger, January 14, 2010 at 2:15 am #1198263

Originally, back in the earliest days of Windows, the entire GUI ran as a process on top of a DOS kernel. When people started to use DR-DOS or IBM’s PC DOS instead of MS-DOS underneath, Microsoft, to keep control of the entire software stack, drove the GUI code down into kernel space for Win 95. After that, the multiple reboots were required.
Still later, with the introduction of the excessively obtuse and fragile Registry, even more was “baked in” and required this kind of shenanigans.
It is worth noting that UNIX and its variants such as Linux still separate all the GUI code from the operating system kernel and only require rebooting when an update is to the kernel itself. Even if a misbehaving graphical app takes down the GUI, the kernel and associated services keep running, and only the GUI needs to be restarted, which is blessedly seldom with modern iterations of any of these systems.
When I am in Linux, therefore, and find some programs need updating, I usually simply go on working while the updates are downloaded and installed. Then too, the configuration files in Linux are text files and can be edited with any text editor; none of this “registry” foolishness to contend with.
Windows has many good features, but architecturally it is a nightmare. How often have you had a “fix” that merely broke other things on the system? I know I have, too many times to count. Generally, that comes from far too much complexity mixed together, which makes patches very difficult.
It would be very good if Microsoft bit the bullet and ripped out much of the underlying plumbing and replaced it, as Apple did when they went to OS X. That would be painful, granted, but in the end it could result in a far better system for everyone.
Remember, for example, the “MinWin” version they showed off in a user group meeting or two, which had far less code to run the basic system? That was extremely promising, but so far as I am aware it never saw the light of day in an actual product. Windows remains as bloated as ever.
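The contrast is easy to check from a script. Below is a minimal Python sketch that reports whether a reboot is pending on either kind of system. The Debian-style flag file and the two Windows registry keys are the commonly documented markers, so treat the exact paths as assumptions to verify on your own machine.

    import os
    import platform

    def linux_reboot_required() -> bool:
        # Debian-style package tools touch this flag file when an
        # installed update (typically a kernel) needs a restart.
        return os.path.exists("/var/run/reboot-required")

    def windows_reboot_required() -> bool:
        import winreg  # standard library, Windows only
        pending = [
            # Windows Update creates this key while a patch is waiting
            # on a restart to finish installing.
            r"SOFTWARE\Microsoft\Windows\CurrentVersion\WindowsUpdate\Auto Update\RebootRequired",
            # Component Based Servicing (Vista and later) uses this one.
            r"SOFTWARE\Microsoft\Windows\CurrentVersion\Component Based Servicing\RebootPending",
        ]
        for subkey in pending:
            try:
                winreg.CloseKey(winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, subkey))
                return True  # the key exists only while a reboot is pending
            except OSError:
                continue  # key absent: nothing pending here
        return False

    if __name__ == "__main__":
        if platform.system() == "Windows":
            print("Reboot pending:", windows_reboot_required())
        else:
            print("Reboot pending:", linux_reboot_required())

On Linux the answer is almost always “no” unless the kernel itself changed; on Windows it is routinely “yes” after Patch Tuesday, which is the whole complaint.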
WSdeepsand
AskWoody Lounger, January 14, 2010 at 7:58 pm #1198482

dbneeley wrote: Originally, back in the earliest days of Windows, the entire GUI ran as a process on top of a DOS kernel. When people started to use DR-DOS or IBM’s PC DOS instead of MS-DOS underneath, Microsoft, to keep control of the entire software stack, drove the GUI code down into kernel space for Win 95.

Actually, the GUI ran atop DOS for the entire Win9x series.
Few consumers ran any of the NT series, which continued through Win 2K (NT 5 renamed), so they did not experience a true Windows OS until the release of XP.
WSdbneeley
AskWoody Lounger, January 15, 2010 at 3:09 am #1198525

deepsand wrote: Actually, the GUI ran atop DOS for the entire Win9x series. Few consumers ran any of the NT series, which continued through Win 2K (NT 5 renamed), so they did not experience a true Windows OS until the release of XP.

Sorry, but that’s just silly. What constituted “Windows” certainly changed greatly, but it is Microsoft, and not you, who determined what “Windows” was and is. They named Windows 1.0 “Windows,” after all, and defined it when they did. Later, that definition obviously changed.
The basic NT architecture was done by a gentleman who had previously been in charge of Digital Equipment Corporation’s operating system development (Dave Cutler), and many programmers referred to it while it was under development as “Portable VMS,” because various features were simply re-implementations of the VMS design.
Thus, it could with some justification be said that the later versions were a “true VMS experience” in some ways.
It is true that the Windows 9x series used many parts of DOS in its underpinnings, but as I said, the kernel was no longer replaceable for the reasons I stated. In fact, in that same Win 9x series, a surprising number of processes were reduced to 16-bit operations through a process known as “thunking.” I remember a Microsoft representative giving sales reps a talk about the Win 95 OS and claiming it was a “32-bit operating system”; she was somewhat embarrassed when I asked her about thunking and the various limitations that 16-bit processes imposed upon it.
The fact remains that having the GUI connected so deeply to the operating system kernel is what causes the need for frequent reboots more than any other single thing. Otherwise, many software updates would not interrupt the running system. All the details about file system operation and such are significant precisely because Windows has so much cruft being accessed at any given time. That, in turn, has serious consequences in updates not playing nicely together, as too much else must be considered at any given time through a nearly infinite combination of drivers, other running programs, etc.
Finally, the number of processes most systems have running at startup is yet another complication that hogs memory, often for no more than to have a particular application start a few seconds faster. For many reasons, it is often best to remove as many of these elements from startup as possible; they slow down the reboot process and can suck a surprising amount of memory from smaller systems such as netbooks and computers with limited RAM.
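As a starting point for that cleanup, here is a minimal Python sketch that lists what is registered to launch at every logon. Assumptions: it runs on the Windows box in question, and it covers only the classic Run keys, not services or the Startup folder.

    import winreg

    # The per-user and per-machine auto-start lists live under the Run
    # keys; every entry here is launched at logon.
    RUN_KEY = r"Software\Microsoft\Windows\CurrentVersion\Run"

    def list_startup_entries(hive, label):
        try:
            key = winreg.OpenKey(hive, RUN_KEY)
        except OSError:
            return  # key absent on this system
        with key:
            index = 0
            while True:
                try:
                    name, command, _type = winreg.EnumValue(key, index)
                except OSError:
                    break  # no more values to enumerate
                print(f"{label}: {name} -> {command}")
                index += 1

    list_startup_entries(winreg.HKEY_CURRENT_USER, "HKCU")
    list_startup_entries(winreg.HKEY_LOCAL_MACHINE, "HKLM")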
WSdeepsand
AskWoody Lounger, January 16, 2010 at 11:59 pm #1204840

dbneeley wrote: Sorry, but that’s just silly. What constituted “Windows” certainly changed greatly, but it is Microsoft, and not you, who determined what “Windows” was and is. They named Windows 1.0 “Windows,” after all, and defined it when they did. Later, that definition obviously changed.

What the product was named is wholly irrelevant to the fact that the GUI ran atop DOS throughout the entire Win 9x series. That some real-time functionality of DOS was crippled in the later versions does not alter that fact.
The NT series was also named “Windows”; yet it was built from the ground up as a true multitasking system, something which 9x and its predecessors most definitely were not.
WStomtells
AskWoody Lounger, January 14, 2010 at 2:33 am #1198265

A major premise of the article is that users are inconvenienced when required to restart their computers to complete installation of an update. Other than the time lost, the major inconvenience would be to have the computer restart automatically before open documents can be saved, leading to lost work. A related inconvenience is having to set up a work space all over again, leading to more lost time.
I don’t see how these inconveniences are mitigated by the author’s suggestion. Doesn’t this strategy simply lead to the same lost work and lost time, but without the benefit of completing the update? It seems to me the only way to truly control which updates are installed (and when) is to follow the author’s earlier suggestion. I prefer “notify me when updates are available,” to avoid losing bandwidth at inopportune moments.
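For anyone who wants to confirm what their machine is actually set to do, here is a minimal Python sketch that reads the Automatic Updates policy values from the registry. The key path and the AUOptions meanings are the commonly documented ones for XP through Windows 7; the key exists only if a policy has been set, so treat its presence as an assumption to check.

    import winreg

    # Group Policy home for Automatic Updates; absent unless an admin
    # (or a tweaking tool) has configured it.
    AU_POLICY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU"

    # Documented AUOptions meanings for the Automatic Updates policy.
    AU_OPTIONS = {
        2: "Notify before download",
        3: "Auto download, notify before install",
        4: "Auto download, scheduled install",
        5: "Local admin chooses the setting",
    }

    def read_policy(name):
        try:
            with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, AU_POLICY) as key:
                value, _type = winreg.QueryValueEx(key, name)
                return value
        except OSError:
            return None  # policy not configured

    option = read_policy("AUOptions")
    print("AUOptions:", AU_OPTIONS.get(option, "not configured"))

    # A value of 1 tells Windows not to force a restart while anyone is
    # logged on; it prompts instead of rebooting under you.
    print("NoAutoRebootWithLoggedOnUsers:", read_policy("NoAutoRebootWithLoggedOnUsers"))

NoAutoRebootWithLoggedOnUsers set to 1 is the policy most directly aimed at the lost-work problem: Windows nags rather than restarting while someone is logged on.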
WSchrisgreaves
AskWoody LoungerJanuary 14, 2010 at 5:11 am #1198290Why the need to reboot after updating Windows?
Susan Bradley states ” … Second, configure your applications to save files automatically.” and describes a process in Office 2007 & 2003.
My understanding is that AutoSave in Word 2003 and earlier versions does not actually save the file; it only records AutoRecover information so the document can be recovered after a crash.
That is, there is no substitute for manually saving every 10 minutes (or after a major change, etc.) or installing a third-party auto-save utility.
See also How can I make Word save or back up my document automatically? and also This Thread
dbm1rxb
AskWoody PlusJanuary 14, 2010 at 6:48 am #1198303One thing I have repeatedly seen on Vista Home Premium (Gateway, Q6600, 3 GB RAM, no ReadyBoost) is that Restart appears not to flush memory completely: either the restart fails outright, or after boot some software fails or doesn’t work correctly. Now I always do a complete shut down and then a start up, which always works without any complications to the software, even after a restart has failed. I don’t know why this happens, but I no longer trust Restart.
WSOllieJones
AskWoody LoungerJanuary 14, 2010 at 6:55 am #1198306The root cause of this Windows reboot-on-update situation is a basic limitation of the file systems used in Windows. An NTFS or FAT32 file that’s “in use” can’t be removed from a directory (folder) and replaced with another.
The other major line of operating systems in the world (starting with the Bell Labs stuff) keeps directory entries and files separate; open files (for example, files containing running code) can be removed (“unlinked”) from their directories and replaced, without having to be overwritten.
The Bell Labs patent on that kind of file system is long expired. Maybe the Redmond gang will figure out how to use that technology.
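To see the difference concretely, here is a minimal sketch of that unlink behavior against the POSIX file API (demo.bin is just an illustrative name):

#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    /* Create a file and keep a descriptor to it open. */
    int fd = open("demo.bin", O_CREAT | O_RDWR, 0644);
    if (fd < 0) { perror("open"); return 1; }

    /* Remove the directory entry. The file's storage survives,
       reachable through fd, until the last open descriptor closes. */
    if (unlink("demo.bin") != 0) { perror("unlink"); return 1; }

    /* A replacement file can now be created under the same name while
       the old, "in use" contents remain readable and writable via fd. */
    if (write(fd, "still alive\n", 12) != 12) { perror("write"); return 1; }

    close(fd); /* only now is the old file's storage reclaimed */
    return 0;
}

That is why an updater on those systems can swap in a new binary while the old one is still running; Windows instead queues in-use replacements for the next boot (MoveFileEx with the MOVEFILE_DELAY_UNTIL_REBOOT flag), which is one reason the reboots pile up.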
cubbage@pobox.com
AskWoody PlusJanuary 14, 2010 at 10:49 am #1198352Susan you said: “However, even Windows 7 fails to live up to Jim Allchin’s no-reboot promise.”
I reread your column, and Jim did not promise no reboots; he just said that Windows would require fewer of them, and my experience with Win7 has borne this out: fewer reboots.
As to lost work, apply the patches when the workers are not there. I have my PCs set to download the patches, and once I have verified that the updates are OK, I have them installed during off hours.
WSfasteddie26
AskWoody LoungerJanuary 14, 2010 at 11:09 am #1198359Greetings-
Here’s a registry change which permits you to manage auto rebooting after Windows Updates:
http://support.microsoft.com/kb/555444
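If you’d rather script that change than edit by hand, here is a minimal sketch using the Win32 registry API. The key path and value name below (NoAutoRebootWithLoggedOnUsers under the Automatic Updates policy key) are my reading of that KB article for XP — double-check against the article, and see the caveat in the next post before using this on anything newer:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Run as an administrator; link with Advapi32.lib. */
    HKEY key;
    DWORD value = 1; /* 1 = don't auto-restart while a user is logged on */
    LONG rc;

    /* Automatic Updates policy key, as described in the KB article (XP). */
    rc = RegCreateKeyEx(HKEY_LOCAL_MACHINE,
             TEXT("SOFTWARE\\Policies\\Microsoft\\Windows\\WindowsUpdate\\AU"),
             0, NULL, 0, KEY_SET_VALUE, NULL, &key, NULL);
    if (rc != ERROR_SUCCESS) {
        fprintf(stderr, "RegCreateKeyEx failed: %ld\n", rc);
        return 1;
    }

    rc = RegSetValueEx(key, TEXT("NoAutoRebootWithLoggedOnUsers"), 0,
                       REG_DWORD, (const BYTE *)&value, sizeof value);
    if (rc != ERROR_SUCCESS)
        fprintf(stderr, "RegSetValueEx failed: %ld\n", rc);

    RegCloseKey(key);
    return rc == ERROR_SUCCESS ? 0 : 1;
}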
Cheers
-
WSfasteddie26
AskWoody LoungerJanuary 14, 2010 at 11:23 am #1198361Oops- That’s for XP (sorry need more coffee…)
Run GPEDIT.MSC and navigate to
Computer Config > Admin Templates > Windows Components > Windows Update, and look for “No auto restart with logged on users for scheduled automatic updates installations” – change this to “Enabled”. (Please use GPEDIT here, as the registry key may not be the same as it is in XP!)
Greetings-
Here’s a registry change which permits you to manage auto rebooting after Windows Updates:
http://support.microsoft.com/kb/555444
Cheers
-
WSal35763
AskWoody LoungerJanuary 14, 2010 at 1:18 pm #1198383The easiest solution to this is to do all of your Windows Updates manually! I find Automatic Updates extremely annoying. First, there is the sudden “reboot in 15 seconds” message that pops up, and if you don’t see it, you’re dead! Second, if you’ve missed a scheduled update because your computer was shut down, the first thing it does upon firing up is run the updates, at the expense of everything else. On a computer running XP with limited resources (memory), it comes almost to a screeching halt and you can’t do anything until it’s done.
If you turn off Automatic Updates in XP, you’re going to get annoying messages out of the system tray constantly. To stop them, go to Run, type msconfig, then go to Services and disable Security Center. In Windows 7 things are much improved: a console comes up that lets you turn those messages off.
Remember that you need to do all your updates manually from this point forward, so don’t forget to do them; they are important in many cases. That is how I solved the problem of Windows Update causing me to lose data due to a forced reboot.
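If you need to make that Security Center change on more than one machine, it can also be done programmatically. A minimal sketch with the Win32 service API follows — the internal service name “wscsvc” is my assumption for Security Center on XP/Vista, so confirm it in the Services console first, and remember this disables the service outright:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Run as an administrator; link with Advapi32.lib. */
    SC_HANDLE scm = OpenSCManager(NULL, NULL, SC_MANAGER_CONNECT);
    if (!scm) { fprintf(stderr, "OpenSCManager: %lu\n", GetLastError()); return 1; }

    /* "wscsvc" is assumed to be Security Center's service name. */
    SC_HANDLE svc = OpenService(scm, TEXT("wscsvc"), SERVICE_CHANGE_CONFIG);
    if (!svc) {
        fprintf(stderr, "OpenService: %lu\n", GetLastError());
        CloseServiceHandle(scm);
        return 1;
    }

    /* Change only the start type (to Disabled); leave the rest as-is. */
    if (!ChangeServiceConfig(svc, SERVICE_NO_CHANGE, SERVICE_DISABLED,
                             SERVICE_NO_CHANGE, NULL, NULL, NULL,
                             NULL, NULL, NULL, NULL))
        fprintf(stderr, "ChangeServiceConfig: %lu\n", GetLastError());

    CloseServiceHandle(svc);
    CloseServiceHandle(scm);
    return 0;
}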
-
WSdeepsand
AskWoody LoungerJanuary 14, 2010 at 8:01 pm #1198483The easiest solution to this is to do all of your Windows Updates manually!
Not only is this the easiest, but it’s the only truly foolproof method of ensuring that no tasks are interrupted at an inopportune time.
And if you want any of the optional software/hardware updates, you’re still going to have to visit MS manually.
TXWizard_2018
AskWoody PlusJanuary 14, 2010 at 1:27 pm #1198390Notwithstanding the excellent points made by others about the architectural deficiencies in Windows, I think we can all agree that the situation has improved somewhat.
I have a general idea why those bulletins use the word may, rather than will. The reason is related to delay loading of dynamic link libraries. Most of the working code in a large application lives in dynamic link libraries, which would consume a staggering amount of memory and drastically delay startup if all of them loaded the instant you started the program. There are two ways to load a DLL.
- Static linking and immediate loading is the simplest way to load. This may sound like a contradiction, and it is, until you get your head around it. The main program contains a DLLImport structure, which tells the program loader how to link the program to its DLL when the program is initially loaded, before the loader hands control of the CPU to it.
- Delay loading dispenses with the DLLImport structure, and the library is not loaded at startup. Instead, when the main program needs code from the library, it calls the LoadLibrary API function to load it, followed by GetProcAddress to get the address of the desired function. Finally, the function is called through the pointer returned by GetProcAddress. (A code sketch appears at the end of this post.)
If a DLL is loaded by the first method, it loads at startup, whether or not the main program needs it, and it stays loaded until the main program shuts down and is unloaded. This is fine for libraries of utility functions that are used frequently and throughout the lifetime of the program, and for system libraries, which are usually already loaded.
In a large application, such as Internet Explorer, Word, or Excel, there are dozens, if not hundreds, of specialized functions that are seldom used, and whose code may be safely discarded when the main program is finished with them. A good hypothetical example for Microsoft Word is mail merge. Most people use this feature infrequently, most sessions with Word never use it, and some people never use it, period. For Internet Explorer, the same goes for SSL and print preview.
Since many features are implemented as delay-loaded DLLs, nobody, not even Microsoft, can say for sure whether a restart will be needed to update the code behind one of them. For example, if the module that implements SSL is to be updated, a reboot won’t be required unless you have actually used SSL and the DLL is still loaded; if the group that implemented SSL took care to unload the DLL once the program is no longer doing SSL, even that case can avoid a restart.
As an aside, if you load a program into Depends.exe in static mode, it lists only the first type of external links. Delay-loaded DLLs don’t show up as dependencies unless you attach Depends to an active process.
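To make the delay-loading case concrete, here is a minimal sketch of the LoadLibrary / GetProcAddress / FreeLibrary dance. The library name feature.dll and its Add export are hypothetical stand-ins; the point is that unloading via FreeLibrary is what would let an updater replace the file without waiting for the whole process to exit.

#include <windows.h>
#include <stdio.h>

/* Hypothetical signature of a function exported by feature.dll. */
typedef int (WINAPI *ADD_FN)(int, int);

int main(void)
{
    /* Load the library only when the feature is first needed. */
    HMODULE lib = LoadLibrary(TEXT("feature.dll"));
    if (lib == NULL) {
        fprintf(stderr, "LoadLibrary failed: %lu\n", GetLastError());
        return 1;
    }

    /* Resolve the entry point by name at run time. */
    ADD_FN add = (ADD_FN)GetProcAddress(lib, "Add");
    if (add == NULL) {
        fprintf(stderr, "GetProcAddress failed: %lu\n", GetLastError());
        FreeLibrary(lib);
        return 1;
    }

    printf("2 + 3 = %d\n", add(2, 3));

    /* Unloading promptly is what makes an in-place update possible:
       once no process holds the DLL, the file can be replaced. */
    FreeLibrary(lib);
    return 0;
}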
David A. Gray
Designing for the Ages, One Challenge at a Time
WSkfrantzen
AskWoody Lounger
January 14, 2010 at 2:31 pm #1198407

I spent 30 years in Bell Labs organizations developing switching systems where downtime requirements translated to uptime in the five-nines range: 99.999% uptime, which works out to an average downtime of roughly five minutes per year. That is a very difficult problem to solve for fix updates or package updates, and we controlled all of the software in our beasts. Most PCs have a lot of non-MS products loaded on them, making it impractical to test all products with a new MS release and impossible to test the combinations of products; the combination problem might require a few hundred million years of test time.

We tried to get annual releases down to zero downtime, but gave up! The update software added to applications and to the OS became very large, impractical to test, and buggy because of its complexity. Basically, it would work two-thirds of the time, and you'd end up with a system restart the other third. We quit trying.

Now, our application required continuous availability, and transaction reliability was not important. Banking systems would reverse those priorities. I wouldn't even think of designing a banking-system update mechanism where both continuous availability and transaction reliability approached "high"; you would use a different technique involving multiple processors. Do you want to design your own PC network, i.e., have multiple desktops, just to work through an update scenario?

Another headache, even when the zero-downtime update worked, was: what happens when the system eventually needs a boot? For a very small percentage of updates, the next boot (six months later?) may fail. Now what? Of course we designed for some of that, but the problem grows exponentially with system software growth.

The simple point is that Microsoft's OSes are used in a wide range of applications with different reliability requirements. If you selected the MS platform, you knew that (I hope) and designed your system to what MS could do, not what you hoped for (hope is not a plan!). Many of our systems were proprietary (you had some control), and many were Unix-based. The proprietary systems allowed you to design to your requirements; Unix was Unix. If every application designer followed the rules, things MIGHT be nice. But you never knew, and there was always someone! And old and new versions of a process may not work when both are in execution, communicating with a common program in who knows what version.

My advice is to be happy. Boot your Windows box after an update to make sure it will work after an unexpected boot, and accept the reboot for an update, because then you won't have to worry about the unexpected one. If you have specific reliability requirements, your choice of operating system and system architecture must be based on those requirements. It is really kind of simple: document the requirements, then architect. Commercial off-the-shelf boxes will seldom meet high reliability requirements unless your system architecture is designed to deal with that.

Long and boring … but I hate it when authors imply that system updates, fixes, whatever, are a simple problem to solve. They are exposing their ignorance!
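Editor's note: a quick back-of-the-envelope check of the five-nines figure, added here as a sketch (not part of the original post). Five nines allows a bit over five minutes of downtime per year:

    #include <stdio.h>

    /* Allowed downtime per year for a given availability target.
       Five nines (99.999%) comes out to roughly 5.3 minutes/year. */
    int main(void)
    {
        const double minutes_per_year = 365.25 * 24.0 * 60.0;
        const double targets[] = { 0.999, 0.9999, 0.99999 };
        for (int i = 0; i < 3; i++)
            printf("%.3f%% uptime -> %7.2f minutes downtime/year\n",
                   targets[i] * 100.0,
                   (1.0 - targets[i]) * minutes_per_year);
        return 0;
    }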
WSgordonwoolf
AskWoody Lounger
January 14, 2010 at 7:54 pm #1198480

Windows Secrets wrote:
> Office 2007’s “AutoRecover” function autosaves open files every 10
> minutes

Does it seem obvious only to me that the autosave in Office, and in programs from other software companies, should also automatically save when the program is closed by the system? It could surely be programmed to do this regardless of the frequency set for autosaves.

I know that we often want to look at something, make changes, then close without saving, but that would still be a normal choice. If the program is being closed by the system, the occasional warning dialog, if it appears, is very brief and sometimes hidden behind the current window.
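For what it's worth, Windows does give applications a hook for exactly this: WM_QUERYENDSESSION and WM_ENDSESSION arrive when the system, rather than the user, is closing the program. A minimal sketch of the idea follows; SaveRecoveryFile() is a hypothetical stand-in for whatever autosave routine the application already has.

    #include <windows.h>

    static void SaveRecoveryFile(void)
    {
        /* Hypothetical stand-in: a real application would flush its
           unsaved document state to an AutoRecover-style file here. */
    }

    static LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
    {
        switch (msg)
        {
        case WM_QUERYENDSESSION:
            SaveRecoveryFile();   /* the system wants to end the session */
            return TRUE;          /* TRUE = we consent to shutting down */
        case WM_ENDSESSION:
            if (wp)               /* the session really is ending */
                SaveRecoveryFile();
            return 0;
        case WM_DESTROY:
            PostQuitMessage(0);
            return 0;
        }
        return DefWindowProc(hwnd, msg, wp, lp);
    }

    int WINAPI WinMain(HINSTANCE inst, HINSTANCE prev, LPSTR cmd, int show)
    {
        WNDCLASSA wc = {0};
        wc.lpfnWndProc   = WndProc;
        wc.hInstance     = inst;
        wc.lpszClassName = "AutosaveDemo";
        RegisterClassA(&wc);
        CreateWindowA("AutosaveDemo", "Autosave demo", WS_OVERLAPPEDWINDOW,
                      CW_USEDEFAULT, CW_USEDEFAULT, 300, 200,
                      NULL, NULL, inst, NULL);
        /* The window stays hidden; the message loop still receives
           WM_QUERYENDSESSION when Windows shuts the session down. */
        MSG m;
        while (GetMessageA(&m, NULL, 0, 0) > 0)
            DispatchMessageA(&m);
        return 0;
    }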
WSmy2cents
AskWoody Lounger
January 15, 2010 at 8:27 am #1204338

Another issue I come across from time to time is this: after an update (XP), Windows will give the familiar instruction to restart the machine. It will start to shut down and then remain in that state, sometimes forever. After 15 or 20 minutes a hard shutdown is required, but I am always left wondering whether the machine will boot back up correctly. Shutting down this way is a last resort, but what is the alternative?
WSDaRam
AskWoody Lounger
January 15, 2010 at 3:38 pm #1204544

For non-critical patches/updates, I usually use the WhyReboot utility to judge if a reboot is really warranted.

It is a freeware utility, available here: http://exodusdev.com/products/whyreboot
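Editor's note: utilities of this sort typically inspect the pending-operations state an installer leaves behind, most famously the PendingFileRenameOperations registry value, which lists file replacements deferred until the next boot. A rough sketch of that single check follows; WhyReboot itself may consult more signals than this.

    /* Report whether any file renames/replacements are queued for the
       next boot. Only one of the signals a reboot-checking tool may
       use. Link with advapi32. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        HKEY key;
        if (RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                "SYSTEM\\CurrentControlSet\\Control\\Session Manager",
                0, KEY_QUERY_VALUE, &key) != ERROR_SUCCESS)
            return 1;

        DWORD size = 0;
        LONG rc = RegQueryValueExA(key, "PendingFileRenameOperations",
                                   NULL, NULL, NULL, &size);
        RegCloseKey(key);

        if (rc == ERROR_SUCCESS && size > 0)
            printf("Pending file operations found (%lu bytes): "
                   "a reboot is still owed.\n", (unsigned long)size);
        else
            printf("No pending file operations: this update may not "
                   "need a reboot.\n");
        return 0;
    }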
TXWizard_2018
AskWoody Plus
February 6, 2010 at 7:08 pm #1208378

For non-critical patches/updates, I usually use the WhyReboot Utility to judge if a reboot is really warranted.

It is a freeware utility available here http://exodusdev.com…ducts/whyreboot

In my experience, the safest approach to patch installation is to follow it immediately with a restart, whether or not one is suggested. Before I adopted this practice, on too many occasions my machine became unstable after an update, most likely because the update was only partially installed and some of the new code got loaded and linked against old code that the complete update would have replaced.
David A. Gray
Designing for the Ages, One Challenge at a Time
Stokersson
AskWoody Lounger
January 18, 2010 at 4:26 pm #1205260

AutoUpdate will STILL INSTALL AND REBOOT under either of these settings if the update is flagged as critical (and far too many are). There have been several series of threads about this on various boards, and Microsoft's MVPs and employees cannot seem to agree whether it actually happens. However, in an actual developer's blog, he admitted to a "forced update" flag that was only supposed to be used when the update system itself was being updated, but which was being used to force other updates. Sorry I cannot quote a link, as my last foray into this was about a year ago.

Well, my XP system is set to download and notify, and last update Tuesday I just caught it 5 seconds before a reboot!

To prevent interruptions to TV recordings, my Windows 7 system is switched to MANUAL, and everyone agrees that updates do not get actioned until deliberately checked for. In the past (when I was still running Vista), some updates left the PC powered off rather than restarting, which was not very good when I was away for a few days.
TomC
AskWoody Lounger
January 21, 2010 at 2:14 pm #1205996

The fact that Windows wants to reboot after an update is not nearly so much of an aggravation to me as its incessant nagging every few minutes to 'Reboot Now?' Most of the flaws being patched have been around for quite some time, and the way I see it, waiting a few more hours until I go to lunch or the end of my workday is not a big deal. When I make the mistake of loading a patch that ultimately requires a reboot, those nags impact my ability to keep working and ultimately force me to stop what I'm doing and wait for the reboot. I would really appreciate it if Windows would accept 'Reboot Later' as really meaning later and allow me to wait 1 hour, 2 hours, or 4 hours without constantly interrupting my work.

Adobe can be even worse: in addition to its nags, Acrobat often will not accept a system shutdown followed by a startup the next day in place of the 'Restart' it wants.
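Editor's note: XP-era Automatic Updates does have a documented policy pair for this gripe: RebootRelaunchTimeoutEnabled and RebootRelaunchTimeout (minutes between restart prompts, up to 1440). A hedged sketch that stretches the nag interval to four hours; it needs administrator rights, and domain Group Policy can override it.

    /* Lengthen the Automatic Updates restart re-prompt interval via
       the documented RebootRelaunchTimeout policy value. Requires
       administrator rights; link with advapi32. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        HKEY key;
        DWORD enabled = 1, minutes = 240;  /* re-prompt every 4 hours */
        LONG rc = RegCreateKeyExA(HKEY_LOCAL_MACHINE,
            "SOFTWARE\\Policies\\Microsoft\\Windows\\WindowsUpdate\\AU",
            0, NULL, 0, KEY_SET_VALUE, NULL, &key, NULL);
        if (rc != ERROR_SUCCESS)
        {
            fprintf(stderr, "open failed: %ld\n", rc);
            return 1;
        }
        RegSetValueExA(key, "RebootRelaunchTimeoutEnabled", 0, REG_DWORD,
                       (const BYTE *)&enabled, sizeof enabled);
        RegSetValueExA(key, "RebootRelaunchTimeout", 0, REG_DWORD,
                       (const BYTE *)&minutes, sizeof minutes);
        RegCloseKey(key);
        puts("Restart re-prompt interval set to 240 minutes.");
        return 0;
    }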

Plus Membership
Donations from Plus members keep this site going. You can identify the people who support AskWoody by the Plus badge on their avatars.
AskWoody Plus members not only get access to all of the contents of this site -- including Susan Bradley's frequently updated Patch Watch listing -- they also receive weekly AskWoody Plus Newsletters (formerly Windows Secrets Newsletter) and AskWoody Plus Alerts, emails when there are important breaking developments.
Get Plus!
Welcome to our unique respite from the madness.
It's easy to post questions about Windows 11, Windows 10, Win8.1, Win7, Surface, Office, or browse through our Forums. Post anonymously or register for greater privileges. Keep it civil, please: Decorous Lounge rules strictly enforced. Questions? Contact Customer Support.
Search Newsletters
Search Forums
View the Forum
Search for Topics
Recent Topics
-
two pages side by side land scape
by
marc
16 hours, 1 minute ago -
Deleting obsolete OneNote notebooks
by
afillat
18 hours, 6 minutes ago -
Word/Outlook 2024 vs Dragon Professional 16
by
Kathy Stevens
18 hours, 16 minutes ago -
Security Essentials or Defender?
by
MalcolmP
18 hours, 20 minutes ago -
April 2025 updates out
by
Susan Bradley
20 minutes ago -
Framework to stop selling some PCs in the US due to new tariffs
by
Alex5723
6 hours, 27 minutes ago -
WARNING about Nvidia driver version 572.83 and 4000/5000 series cards
by
Bob99
10 minutes ago -
Creating an Index in Word 365
by
CWBillow
9 hours, 39 minutes ago -
Coming at Word 365 and Table of Contents
by
CWBillow
1 hour, 8 minutes ago -
Windows 11 Insider Preview Build 22635.5170 (23H2) released to BETA
by
joep517
1 day, 13 hours ago -
Has the Microsoft Account Sharing Problem Been Fixed?
by
jknauth
1 day, 16 hours ago -
W11 24H2 – Susan Bradley
by
G Pickerell
1 day, 18 hours ago -
7 tips to get the most out of Windows 11
by
Alex5723
1 day, 16 hours ago -
Using Office apps with non-Microsoft cloud services
by
Peter Deegan
1 day, 9 hours ago -
I installed Windows 11 24H2
by
Will Fastie
1 hour, 18 minutes ago -
NotifyIcons — Put that System tray to work!
by
Deanna McElveen
1 day, 21 hours ago -
Decisions to be made before moving to Windows 11
by
Susan Bradley
41 minutes ago -
Port of Seattle says ransomware breach impacts 90,000 people
by
Nibbled To Death By Ducks
2 days, 6 hours ago -
Looking for personal finance software with budgeting capabilities
by
cellsee6
1 day, 14 hours ago -
ATT/Yahoo Secure Mail Key
by
Lil88reb
1 day, 14 hours ago -
Devices with apps using sprotect.sys driver might stop responding
by
Alex5723
2 days, 23 hours ago -
Neowin – 20 times computers embarrassed themselves with public BSODs and goofups
by
EP
3 days, 7 hours ago -
Slow Down in Windows 10 performance after March 2025 updates ??
by
arbrich
2 days, 9 hours ago -
Mail from certain domains not delivered to my outlook.com address
by
pumphouse
2 days, 16 hours ago -
Is data that is in OneDrive also taking up space on my computer?
by
WShollis1818
3 days, 2 hours ago -
Nvidia just fixed an AMD Linux bug
by
Alex5723
4 days, 18 hours ago -
50 years and counting
by
Susan Bradley
1 day, 16 hours ago -
Fix Bluetooth Device Failed to Delete in Windows Settings
by
Drcard:))
1 day, 19 hours ago -
Licensing and pricing updates for on-premises server products coming July 2025
by
Alex5723
5 days, 5 hours ago -
Edge : Deprecating window.external.getHostEnvironmentValue()
by
Alex5723
5 days, 5 hours ago
Recent blog posts
Key Links
Want to Advertise in the free newsletter? How about a gift subscription in honor of a birthday? Send an email to sb@askwoody.com to ask how.
Mastodon profile for DefConPatch
Mastodon profile for AskWoody
Home • About • FAQ • Posts & Privacy • Forums • My Account
Register • Free Newsletter • Plus Membership • Gift Certificates • MS-DEFCON Alerts
Copyright ©2004-2025 by AskWoody Tech LLC. All Rights Reserved.