• Why the need to reboot after updating Windows?


    #465753

    TOP STORY

    Why the need to reboot after updating Windows?

    By Susan Bradley

    Not so long ago, Microsoft promised that fewer Windows patches would require restarting the system to complete their installation.

    Microsoft clearly hasn’t delivered on that promise, so PC users need to take steps to ensure that they don’t lose data due to unexpected post-update reboots.


    The full text of this column is posted at WindowsSecrets.com/2010/01/14/01 (opens in a new window/tab).

    Columnists typically cannot reply to comments here, but do incorporate the best tips into future columns.

    • #1198260

      In a meeting with Microsoft UK around the time of the Vista release candidate, I was told that reboots were done because the developers were specialists in one area each: they didn’t know what effect their changes would have on other developers’ code, so they used a “better safe than sorry” regime.

    • #1198263

      Originally, back in the earliest days of Windows, the entire GUI ran as a process on top of a DOS kernel. When people started to use DR-DOS or IBM’s PC DOS instead of MS-DOS underneath, Microsoft, to keep control of the entire software stack, drove the GUI code down into kernel space for Windows 95. After that, the multiple reboots were required.

      Still later, with the introduction of the excessively obtuse and fragile Registry, even more was “baked in”, requiring this kind of shenanigans.

      It is worth noting that UNIX and its variants such as Linux still separate all the GUI code from the operating-system kernel, and require rebooting only when an update touches the kernel itself. Even if a misbehaving graphical app takes down the GUI, the kernel and associated services are still running, and only the GUI needs to be restarted–which is blessedly seldom with modern iterations of any of these systems.

      When I am in Linux, therefore, and find that some programs need updating, I usually just go on working while the updates are downloaded and installed. Then too, the configuration files in Linux are text files and can be edited with any text editor–none of this “registry” foolishness to contend with.

      Windows has many good features, but architecturally it is a nightmare. How often have you had a “fix” that merely broke other things on the system? I have, too many times to count. Generally, that comes from far too much complexity mixed together, which makes patches very difficult.

      It would be very good if Microsoft bit the bullet and ripped out much of the underlying plumbing and replaced it–as Apple did when they went to OS X. That would be painful, granted, but in the end it could result in a far better system for everyone.

      Remember, for example, the “MinWin” version they showed off in a user group meeting or two, with far less code to run the basic system? That was extremely promising–but so far as I am aware it never saw the light of day in an actual product. Windows remains as bloated as ever.

      • #1198482

        Originally, back in the earliest days of Windows, the entire GUI ran as a process on top of a DOS kernel. When people started to use DR-DOS or IBM’s PC DOS instead of MS-DOS underneath, Microsoft, to keep control of the entire software stack, drove the GUI code down into kernel space for Windows 95.

        Actually, the GUI ran atop DOS for the entire Win9x series.

        Consumers, few of whom ran any of the NT series (which continued through Win 2K, essentially NT 5 renamed), did not experience a true Windows OS until the release of XP.

        • #1198525

          Actually, the GUI ran atop DOS for the entire Win9x series.

          Consumers, few of whom ran any of the NT series (which continued through Win 2K, essentially NT 5 renamed), did not experience a true Windows OS until the release of XP.

          Sorry, but that’s just silly. What constituted “Windows” certainly changed greatly–but it is Microsoft, and not you, who determined what “Windows” was and is. They named Windows 1.0 “Windows” after all, and defined it when they did. Later, that definition obviously changed.

          The basic NT architecture was done by Dave Cutler, a gentleman who had previously been in charge of Digital Equipment Corporation’s operating-system development, and many programmers at the time it was under development referred to it as “Portable VMS”, because various features were simply re-implementations of the VMS design.

          Thus, it could with some justification be said that the later versions were a “true VMS experience” in some ways.

          It is true that the Windows 9x series used many parts of DOS in its underpinnings, but as I said, the kernel was no longer replaceable, for the reasons I stated. In fact, in that same Win 9x series, a surprising number of operations were reduced to 16-bit code through a process known as “thunking”. I remember a Microsoft representative giving sales reps a talk about the Win 95 OS and claiming it was a “32-bit operating system”…she was somewhat embarrassed when I asked her about “thunking” and the various limitations that 16-bit processes imposed on it.

          The fact remains that having the GUI wired so deeply into the operating-system kernel is, more than any other single thing, what causes the need for the frequent reboots. Otherwise, many software updates would not interrupt the running system. All the details about file-system operation and such matter precisely because Windows has so much cruft being accessed at any given time. That, in turn, has serious consequences in updates not playing nicely together, since too much else must be considered at any given time, across a nearly infinite combination of drivers, other running programs, and so on.
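
          To make the mechanics concrete: when an installer cannot swap out a file that is currently loaded, Windows records the replacement under the PendingFileRenameOperations registry value and carries it out during the next boot, which is exactly why the reboot gets demanded. Here is a minimal sketch of that documented mechanism via Python’s ctypes; both file paths are invented for illustration, and the call needs administrator rights:

              import ctypes

              MOVEFILE_REPLACE_EXISTING = 0x1
              MOVEFILE_DELAY_UNTIL_REBOOT = 0x4

              # Ask Windows to replace an in-use file at the next boot. The request
              # is stored under HKLM\SYSTEM\CurrentControlSet\Control\
              # Session Manager\PendingFileRenameOperations, and the session manager
              # applies it early in boot, before the replaced code can be loaded.
              ok = ctypes.windll.kernel32.MoveFileExW(
                  r"C:\staging\newversion.dll",      # hypothetical staged copy
                  r"C:\Windows\System32\inuse.dll",  # hypothetical in-use target
                  MOVEFILE_REPLACE_EXISTING | MOVEFILE_DELAY_UNTIL_REBOOT,
              )
              if not ok:
                  raise ctypes.WinError()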

          Finally, the number of processes most systems have running at startup is yet another complication that hogs memory, often for nothing more than having a particular application start a few seconds faster. For many reasons, it is often best to remove as many of these startup elements as possible–they slow down the reboot process and can suck a surprising amount of memory from smaller systems such as netbooks and computers with limited RAM.

          • #1204840

            Sorry, but that’s just silly. What constituted “Windows” certainly changed greatly–but it is Microsoft, and not you, who determined what “Windows” was and is. They named Windows 1.0 “Windows” after all, and defined it when they did. Later, that definition obviously changed.

            What the product was named is wholly irrelevant to the fact that the GUI ran atop DOS throughout the entire Win 9x series. That some real-time functionality of DOS was crippled in the later versions does not alter that fact.

            The NT series was also named “Windows”; yet it was built from the ground up as a true multitasking system, something that 9x and its predecessors most definitely were not.

    • #1198265

      A major premise of the article is that users are inconvenienced when required to restart their computers to complete installation of an update. Beyond the time lost, the major inconvenience is having the computer restart automatically before open documents can be saved, leading to lost work. A related inconvenience is having to set up a work space all over again, leading to more lost time.
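
      One small mitigation is simply knowing, before you walk away from an open document, whether a restart is already queued. Here is a minimal sketch in Python, assuming the two commonly documented pending-reboot marker keys; it only reads the registry, so no administrator rights are needed:

          # Probe the well-known "reboot pending" registry markers (read-only).
          import winreg

          PENDING_KEYS = [
              r"SOFTWARE\Microsoft\Windows\CurrentVersion\WindowsUpdate\Auto Update\RebootRequired",
              r"SOFTWARE\Microsoft\Windows\CurrentVersion\Component Based Servicing\RebootPending",
          ]

          def reboot_pending() -> bool:
              for path in PENDING_KEYS:
                  try:
                      # KEY_WOW64_64KEY reads the native view even from 32-bit Python.
                      key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path, 0,
                                           winreg.KEY_READ | winreg.KEY_WOW64_64KEY)
                      winreg.CloseKey(key)
                      return True  # each key exists only while a reboot is outstanding
                  except FileNotFoundError:
                      continue
              return False

          print("Reboot pending:", reboot_pending())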

      I don’t see how these inconveniences are mitigated by the author’s suggestion. Doesn’t this strategy simply lead to the same lost work and lost time, but without the benefit of completing the update? It seems to me the only way to truly control which updates are installed (and when) is to follow the author’s earlier suggestion: “I prefer ‘notify me when updates are available’ to avoid losing bandwidth at inopportune moments.”
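
      If you want to pin that choice down so that nothing quietly changes it back, the same setting can be written directly as an Automatic Updates policy. A minimal sketch in Python, assuming the commonly documented policy key and value meanings (run as Administrator):

          import winreg

          AU_KEY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU"

          with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, AU_KEY, 0,
                                  winreg.KEY_SET_VALUE) as key:
              # 2 = notify before downloading (the "notify me" behavior above)
              winreg.SetValueEx(key, "AUOptions", 0, winreg.REG_DWORD, 2)
              # 1 = never auto-restart while a user is logged on
              winreg.SetValueEx(key, "NoAutoRebootWithLoggedOnUsers", 0,
                                winreg.REG_DWORD, 1)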

    • #1198290

      Why the need to reboot after updating Windows?

      Susan Bradley states, “… Second, configure your applications to save files automatically,” and describes the process in Office 2007 and 2003.
      My understanding of AutoSave in Office 2003 and earlier versions is that it does not actually save the file.
      That is, there is no substitute for manually saving every 10 minutes (or after a major change, etc.) or using a third-party auto-save utility.
      See also How can I make Word save or back up my document automatically? and also This Thread.
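      If you do want to tune what Office offers from a script, here is a minimal pywin32 sketch (assuming Word and the pywin32 package are installed) that sets the AutoRecover interval through COM, the same setting the column reaches through the Options dialog. Per the above, AutoRecover is still not a real save:

          import win32com.client

          # Attach to (or start) Word and tighten the AutoRecover interval.
          word = win32com.client.Dispatch("Word.Application")
          word.Options.SaveInterval = 5        # write AutoRecover info every 5 minutes
          word.Options.BackgroundSave = True   # let saves happen while you keep typing
          word.Quit()                          # skip this if Word was already open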


    • #1198303

      One thing I have repeatedly seen on Vista Home Premium (Gateway, Q6600, 3 GB RAM, no ReadyBoost) is that restart appears not to flush memory completely: either the restart fails outright, or after boot some software fails or doesn’t work correctly. Now I always do a complete manual shutdown and then start up, which always works without any complications, even after a restart has failed. I don’t know why this happens, but I no longer trust restart.


    • #1198306

      The root cause of this Windows reboot-on-update situation is a basic limitation of the file systems used in Windows. An NTFS or FAT32 file that’s “in use” can’t be removed from a directory (folder) and replaced with another.

      The other major line of operating systems in the world (starting with the Bell Labs work) keeps directory entries and files separate; open files (for example, files containing running code) can be removed (“unlinked”) from their directories and replaced, without having to be overwritten.

      The Bell Labs patent on that kind of file system is long expired. Maybe the Redmond gang will figure out how to use that technology.
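      To see the difference in action, here is a small Python sketch; nothing beyond the standard library is assumed. On a POSIX system the unlink succeeds and the open handle keeps working; on NTFS/FAT32 the remove is refused because the file is in use:

          import os
          import tempfile

          # Create a file, keep it open, then try to unlink it while open.
          fd, path = tempfile.mkstemp()
          with os.fdopen(fd, "w+") as f:
              f.write("still readable after unlink on POSIX\n")
              f.flush()
              try:
                  os.remove(path)  # removes the directory entry only
                  f.seek(0)
                  print("unlinked while open:", f.read().strip())
              except PermissionError:
                  print("Windows refused to delete an in-use file")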


    • #1198352

      Susan, you said: “However, even Windows 7 fails to live up to Jim Allchin’s no-reboot promise.”

      I reread your column, and Jim did not promise no reboots; he just said that Windows would require fewer of them, and my experience with Win7 has borne this out: fewer reboots.

      As for lost work, apply the patches when the workers are not there. I have my PCs set to download the patches, and once I have verified that the updates are OK, I have them installed during off hours.
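      For what it’s worth, here is a rough Python sketch of the Automatic Updates policy values behind that kind of schedule. The key path and value names below are the documented AU policy entries for XP/Vista/7, but treat them as something to verify on your own version, and run it from an elevated prompt:

          import winreg

          AU = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU"

          with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, AU) as key:
              # 4 = download automatically and install on the schedule below.
              winreg.SetValueEx(key, "AUOptions", 0, winreg.REG_DWORD, 4)
              winreg.SetValueEx(key, "ScheduledInstallDay", 0, winreg.REG_DWORD, 0)   # 0 = every day
              winreg.SetValueEx(key, "ScheduledInstallTime", 0, winreg.REG_DWORD, 3)  # hour 3 = 3 a.m.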


    • #1198359

      Greetings-

      Here’s a registry change which permits you to manage auto rebooting after Windows Updates:

      http://support.microsoft.com/kb/555444

      Cheers

      • #1198361

        Oops, that’s for XP (sorry, need more coffee…).

        Run GPEDIT.MSC and navigate to
        Computer Config > Admin Templates > Windows Components > Windows Update, and look for “No auto restart with logged on users for scheduled automatic updates installations” – change this to “Enabled”.

        (PLEASE use GPEDIT, as the registry key may or may not be the same as in XP!)
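        For reference, this appears to be the registry value behind that policy setting: it matches the XP value from KB 555444, and as far as I know it is the same on Vista and 7, but that is exactly the assumption the warning above is about. A minimal Python sketch, to be run elevated:

            import winreg

            AU = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU"

            with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, AU) as key:
                # 1 = never auto-restart while someone is logged on; Windows
                # waits for that user to restart the machine manually.
                winreg.SetValueEx(key, "NoAutoRebootWithLoggedOnUsers", 0, winreg.REG_DWORD, 1)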



    • #1198383

      The easiest solution to this is to do all of your Windows updates manually! I find Automatic Updates extremely annoying. First, there is the sudden “reboot in 15 seconds” message that pops up; if you don’t see it, you’re dead. Second, if you’ve missed a scheduled update because your computer was shut down, the first thing it does upon firing up is run the updates, at the expense of everything else. On a computer running XP with limited resources (memory), it comes almost to a screeching halt and you can’t do anything until it’s done.

      If you turn off Automatic Updates in XP, you’ll get constant nagging messages from the system tray. To stop them, go to Run, type msconfig, then go to Services and disable Security Center. In Windows 7 things are much improved: a console comes up that lets you turn those messages off. Remember that you need to do all your updates manually from this point forward, so don’t forget to do them; they are important in many cases. That is how I solved the problem of Windows Update losing my data through a forced reboot.
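      If you would rather script that last part than click through msconfig, something like the sketch below should work. It assumes the XP Security Center service name is wscsvc (its display name is “Security Center”); check with sc query first, and run from an elevated prompt:

          import subprocess

          # Stop the Security Center service, then keep it from starting at boot.
          subprocess.run(["sc", "stop", "wscsvc"], check=False)  # may already be stopped
          subprocess.run(["sc", "config", "wscsvc", "start=", "disabled"], check=True)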

      • #1198483

        The easiest solution to this is to do all of your Windows Updates manually!

        Not only is this the easiest, it’s the only truly foolproof method of ensuring that no tasks are interrupted at an inopportune time.

        And if you want any of the optional software/hardware updates, you’re still going to have to visit Microsoft manually.


    • #1198390

      Notwithstanding the excellent points made by others about the architectural deficiencies in Windows, I think we can all agree that the situation has improved somewhat.

      I have a general idea why those bulletins use the word may, rather than will. The reason is related to delay loading of dynamic link libraries. Most of the working code in a large application lives in dynamic link libraries, which would consume a staggering amount of memory and drastically delay startup if all of them loaded the instant you started the program. There are two ways to load a DLL.

        1. Static linking with immediate loading is the simplest way to load. This may sound like a contradiction, and it is, until you get your head around it. The main program contains an import table, which tells the program loader how to link the program to its DLLs when the program is initially loaded, before the loader hands control of the CPU to it.
        2. Delay loading dispenses with the import-table entry, and the library is not loaded at startup. Instead, when the main program needs code from the library, the LoadLibrary API function is invoked to load the library, followed by GetProcAddress to get the address of the desired function. Finally, the function is called through the address returned by GetProcAddress.

      If a DLL is loaded by the first method, it loads at startup, whether or not the main program needs it, and it stays loaded until the main program shuts down and is unloaded. This is fine for libraries of utility functions that are used frequently and throughout the lifetime of the program, and for system libraries, which are usually already loaded.

      In a large application, such as Internet Explorer, Word, or Excel, there are dozens, if not hundreds, of specialized functions that are seldom used and whose code may be safely discarded when the main program is finished with them. A good hypothetical example for Microsoft Word is mail merge: most people use this feature infrequently, most sessions with Word never use it, and some people never use it, period. For Internet Explorer, the same goes for SSL and print preview.

      Since many features are implemented as delay-loaded DLLs, nobody, not even Microsoft, can say for sure whether a restart will be needed to update the code used by one of them. For example, if the module that implements SSL is to be updated, a reboot won’t be required unless you have used SSL and its DLL is still loaded; if the group that implemented SSL took care to unload the DLL once the program is no longer doing SSL, the file can be replaced without a restart.

      As an aside, if you load a program into Depends.exe, in static mode, it lists only the first type of external links. Delay loaded DLLs don’t show up as dependencies unless you attach Depends to an active process.
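      For illustration only (this is not the update mechanism itself), here is a minimal ctypes sketch of the LoadLibrary / GetProcAddress / FreeLibrary dance described above, calling the real user32 MessageBeep function:

          import ctypes

          kernel32 = ctypes.WinDLL("kernel32")
          kernel32.FreeLibrary.argtypes = [ctypes.c_void_p]  # HMODULE is pointer-sized

          def delayed_beep() -> None:
              user32 = ctypes.WinDLL("user32")   # LoadLibrary, at the moment of need
              beep = user32.MessageBeep          # GetProcAddress, behind the scenes
              beep.argtypes = [ctypes.c_uint]
              beep(0)                            # call through the resolved address
              # Unload, so the DLL file is no longer "in use" and an updater
              # could replace it without a reboot.
              kernel32.FreeLibrary(user32._handle)

          delayed_beep()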

      David A. Gray

      Designing for the Ages, One Challenge at a Time

    • #1199481

      Notwithstanding the excellent points made by others about the architectural deficiencies in Windows, I think we can all agree that the situation has improved somewhat.

      I have a general idea why those bulletins use the word may, rather than will. The reason is related to delay loading of dynamic link libraries. Most of the working code in a large application lives in dynamic link libraries, which would consume a staggering amount of memory and drastically delay startup if all of them loaded the instant you started the program. There are two ways to load a DLL.

        [*]Static linking and immediate loading is the simplest way to load. This may sound like a contradiction, and it is, until you get your head around it. The main program contains a DLLImport structure, which tells the program loader how to link the program to its DLL when the program is initially loaded, and before the loader hands off control of the CPU to it.[*]Delay loading dispenses with the DLLImport structure, and the library is not loaded at startup. Instead, when the main program needs to use code from the library, the LoadLibrary API function is invoked to load the library, followed by GetProcAddress, to get the address of the desired function. Finally, the function is called by pointing at the address returned by GetProcAddress.

      If a DLL is loaded by the first method, it loads at startup, whether or not the main program needs it, and it stays loaded until the main program shuts down and is unloaded. This is fine for libararies of utility functions that are used frequently, and throughout the lifetime of the program, and system libraries, which are usually already loaded.

      In a large application, such as Internet Explorer, Word, or Excel, there are dozens, if not hundreds, of specialized functions that are seldom used, and whose code may be safely discarded when the main program is finished with them. A good hypothetical example for Microsoft Word is mail merge. Most people use this feature infrequently, most sessions with Word never use it, and some people never use it, period. For Internet explorer, the same goes for SSL and print preview.

      Since many features are implemented as delay loaded DLLs, nobody, not even Microsoft, can say for sure whether a restart will be needed to update code used by one of them. For example, if the module that implements SSL is to be updated, a reboot won’t be required unless you used SSL, and the group that implemented SSL took care to unload the DLL when the program is no longer doing SSL.

      As an aside, if you load a program into Depends.exe, in static mode, it lists only the first type of external links. Delay loaded DLLs don’t show up as dependencies unless you attach Depends to an active process.

      David A. Gray

      Designing for the Ages, One Challenge at a Time

    • #1199943

      Notwithstanding the excellent points made by others about the architectural deficiencies in Windows, I think we can all agree that the situation has improved somewhat.

      I have a general idea why those bulletins use the word may, rather than will. The reason is related to delay loading of dynamic link libraries. Most of the working code in a large application lives in dynamic link libraries, which would consume a staggering amount of memory and drastically delay startup if all of them loaded the instant you started the program. There are two ways to load a DLL.

        [*]Static linking and immediate loading is the simplest way to load. This may sound like a contradiction, and it is, until you get your head around it. The main program contains a DLLImport structure, which tells the program loader how to link the program to its DLL when the program is initially loaded, and before the loader hands off control of the CPU to it.[*]Delay loading dispenses with the DLLImport structure, and the library is not loaded at startup. Instead, when the main program needs to use code from the library, the LoadLibrary API function is invoked to load the library, followed by GetProcAddress, to get the address of the desired function. Finally, the function is called by pointing at the address returned by GetProcAddress.

      If a DLL is loaded by the first method, it loads at startup, whether or not the main program needs it, and it stays loaded until the main program shuts down and is unloaded. This is fine for libararies of utility functions that are used frequently, and throughout the lifetime of the program, and system libraries, which are usually already loaded.

      In a large application, such as Internet Explorer, Word, or Excel, there are dozens, if not hundreds, of specialized functions that are seldom used, and whose code may be safely discarded when the main program is finished with them. A good hypothetical example for Microsoft Word is mail merge. Most people use this feature infrequently, most sessions with Word never use it, and some people never use it, period. For Internet explorer, the same goes for SSL and print preview.

      Since many features are implemented as delay loaded DLLs, nobody, not even Microsoft, can say for sure whether a restart will be needed to update code used by one of them. For example, if the module that implements SSL is to be updated, a reboot won’t be required unless you used SSL, and the group that implemented SSL took care to unload the DLL when the program is no longer doing SSL.

      As an aside, if you load a program into Depends.exe, in static mode, it lists only the first type of external links. Delay loaded DLLs don’t show up as dependencies unless you attach Depends to an active process.

      David A. Gray

      Designing for the Ages, One Challenge at a Time

    • #1200699

      Notwithstanding the excellent points made by others about the architectural deficiencies in Windows, I think we can all agree that the situation has improved somewhat.

      I have a general idea why those bulletins use the word may, rather than will. The reason is related to delay loading of dynamic link libraries. Most of the working code in a large application lives in dynamic link libraries, which would consume a staggering amount of memory and drastically delay startup if all of them loaded the instant you started the program. There are two ways to load a DLL.

        [*]Static linking and immediate loading is the simplest way to load. This may sound like a contradiction, and it is, until you get your head around it. The main program contains a DLLImport structure, which tells the program loader how to link the program to its DLL when the program is initially loaded, and before the loader hands off control of the CPU to it.[*]Delay loading dispenses with the DLLImport structure, and the library is not loaded at startup. Instead, when the main program needs to use code from the library, the LoadLibrary API function is invoked to load the library, followed by GetProcAddress, to get the address of the desired function. Finally, the function is called by pointing at the address returned by GetProcAddress.

      If a DLL is loaded by the first method, it loads at startup, whether or not the main program needs it, and it stays loaded until the main program shuts down and is unloaded. This is fine for libararies of utility functions that are used frequently, and throughout the lifetime of the program, and system libraries, which are usually already loaded.

      In a large application, such as Internet Explorer, Word, or Excel, there are dozens, if not hundreds, of specialized functions that are seldom used, and whose code may be safely discarded when the main program is finished with them. A good hypothetical example for Microsoft Word is mail merge. Most people use this feature infrequently, most sessions with Word never use it, and some people never use it, period. For Internet explorer, the same goes for SSL and print preview.

      Since many features are implemented as delay loaded DLLs, nobody, not even Microsoft, can say for sure whether a restart will be needed to update code used by one of them. For example, if the module that implements SSL is to be updated, a reboot won’t be required unless you used SSL, and the group that implemented SSL took care to unload the DLL when the program is no longer doing SSL.

      As an aside, if you load a program into Depends.exe, in static mode, it lists only the first type of external links. Delay loaded DLLs don’t show up as dependencies unless you attach Depends to an active process.

      David A. Gray

      Designing for the Ages, One Challenge at a Time

    • #1201618

      Notwithstanding the excellent points made by others about the architectural deficiencies in Windows, I think we can all agree that the situation has improved somewhat.

      I have a general idea why those bulletins use the word may, rather than will. The reason is related to delay loading of dynamic link libraries. Most of the working code in a large application lives in dynamic link libraries, which would consume a staggering amount of memory and drastically delay startup if all of them loaded the instant you started the program. There are two ways to load a DLL.

        [*]Static linking and immediate loading is the simplest way to load. This may sound like a contradiction, and it is, until you get your head around it. The main program contains a DLLImport structure, which tells the program loader how to link the program to its DLL when the program is initially loaded, and before the loader hands off control of the CPU to it.[*]Delay loading dispenses with the DLLImport structure, and the library is not loaded at startup. Instead, when the main program needs to use code from the library, the LoadLibrary API function is invoked to load the library, followed by GetProcAddress, to get the address of the desired function. Finally, the function is called by pointing at the address returned by GetProcAddress.

      If a DLL is loaded by the first method, it loads at startup, whether or not the main program needs it, and it stays loaded until the main program shuts down and is unloaded. This is fine for libararies of utility functions that are used frequently, and throughout the lifetime of the program, and system libraries, which are usually already loaded.

      In a large application, such as Internet Explorer, Word, or Excel, there are dozens, if not hundreds, of specialized functions that are seldom used, and whose code may be safely discarded when the main program is finished with them. A good hypothetical example for Microsoft Word is mail merge. Most people use this feature infrequently, most sessions with Word never use it, and some people never use it, period. For Internet explorer, the same goes for SSL and print preview.

      Since many features are implemented as delay loaded DLLs, nobody, not even Microsoft, can say for sure whether a restart will be needed to update code used by one of them. For example, if the module that implements SSL is to be updated, a reboot won’t be required unless you used SSL, and the group that implemented SSL took care to unload the DLL when the program is no longer doing SSL.

      As an aside, if you load a program into Depends.exe, in static mode, it lists only the first type of external links. Delay loaded DLLs don’t show up as dependencies unless you attach Depends to an active process.

      David A. Gray

      Designing for the Ages, One Challenge at a Time

    • #1202389

      Notwithstanding the excellent points made by others about the architectural deficiencies in Windows, I think we can all agree that the situation has improved somewhat.

      I have a general idea why those bulletins use the word may, rather than will. The reason is related to delay loading of dynamic link libraries. Most of the working code in a large application lives in dynamic link libraries, which would consume a staggering amount of memory and drastically delay startup if all of them loaded the instant you started the program. There are two ways to load a DLL.

        [*]Static linking and immediate loading is the simplest way to load. This may sound like a contradiction, and it is, until you get your head around it. The main program contains a DLLImport structure, which tells the program loader how to link the program to its DLL when the program is initially loaded, and before the loader hands off control of the CPU to it.[*]Delay loading dispenses with the DLLImport structure, and the library is not loaded at startup. Instead, when the main program needs to use code from the library, the LoadLibrary API function is invoked to load the library, followed by GetProcAddress, to get the address of the desired function. Finally, the function is called by pointing at the address returned by GetProcAddress.

      If a DLL is loaded by the first method, it loads at startup, whether or not the main program needs it, and it stays loaded until the main program shuts down and is unloaded. This is fine for libararies of utility functions that are used frequently, and throughout the lifetime of the program, and system libraries, which are usually already loaded.

      In a large application, such as Internet Explorer, Word, or Excel, there are dozens, if not hundreds, of specialized functions that are seldom used, and whose code may be safely discarded when the main program is finished with them. A good hypothetical example for Microsoft Word is mail merge. Most people use this feature infrequently, most sessions with Word never use it, and some people never use it, period. For Internet explorer, the same goes for SSL and print preview.

      Since many features are implemented as delay loaded DLLs, nobody, not even Microsoft, can say for sure whether a restart will be needed to update code used by one of them. For example, if the module that implements SSL is to be updated, a reboot won’t be required unless you used SSL, and the group that implemented SSL took care to unload the DLL when the program is no longer doing SSL.

      As an aside, if you load a program into Depends.exe, in static mode, it lists only the first type of external links. Delay loaded DLLs don’t show up as dependencies unless you attach Depends to an active process.

      David A. Gray

      Designing for the Ages, One Challenge at a Time

    • #1198407

      I spent 30 years in Bell Labs organizations developing switching systems where downtime requirements translated to uptime in the five-nines range: 99.999% uptime, which works out to an average downtime of roughly five minutes per year. That is a very difficult problem to solve for fix updates or package updates, and we controlled all of the software in our beasts. Most PCs have a lot of non-MS products loaded on them, making it impractical to test all products with a new MS release and impossible to test the combinations of products; the combination problem might require a few hundred million years of test time.

      We tried to get annual releases down to zero downtime, but gave up! The update software added to applications and to the OS became very large, impractical to test, and buggy because of its complexity. Basically, it would work two-thirds of the time, and you’d end up with a system restart the other third. We quit trying. Now, our application required continuous availability, while transaction reliability was not important; banking systems would reverse those priorities. I wouldn’t even think of designing a banking-system update mechanism where both continuous availability and transaction reliability approached “high”; you would use a different technique involving multiple processors. Do you want to design your own PC network, i.e., have multiple desktops, to work through an update scenario?

      Another headache, even when the zero-downtime update worked, was: what happens when the system eventually needs a boot? For a very small percentage of the updates, the next boot (six months later?) may fail! Now what? Of course, we designed for some of that, but the problem grows exponentially with system software growth.

      The simple point is that Microsoft’s OSs are used in a wide range of applications with different reliability requirements. If you selected the MS platform, you knew that (I hope) and designed your system to what MS could do, not what you hoped for (hope is not a plan!). Many of our systems were proprietary (you had some control) and many were Unix-based. The proprietary systems allowed you to design to your requirements; Unix was Unix. If every application designer followed the rules, things MIGHT be nice. But you never knew, and there was always someone! And old and new versions of a process may not work when both are executing and communicating with a common program in who knows what version.

      My advice is to be happy. Boot your Windows box after an update to make sure it will work after an unexpected boot, and accept the reboot for an update, because then you won’t have to worry about the unexpected one! If you have specific reliability requirements, your selection of operating system and system architecture must be based on those requirements. It is really kind of simple: document the requirements, then architect. Commercial off-the-shelf boxes will seldom meet high reliability requirements unless your system architecture is designed to deal with them.

      Long and boring … but I hate it when authors imply that system updates, fixes, whatever, are a simple problem to solve. They are exposing their ignorance!
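
      A quick check of the five-nines arithmetic, for anyone who wants to verify that figure:

      #include <stdio.h>

      int main(void)
      {
          const double minutes_per_year = 365.25 * 24.0 * 60.0;  /* ~525,960 */
          const double availability = 0.99999;                   /* five nines */

          /* The allowed downtime is the complement of availability. */
          printf("Allowed downtime: %.2f minutes/year\n",
                 minutes_per_year * (1.0 - availability));
          return 0;
      }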

    • #1198480

      Windows Secrets wrote:
      > Office 2007’s “AutoRecover” function autosaves open files every 10
      > minutes

      Is it obvious only to me that the autosave in Office, and in programs from other software companies, should also save automatically when the program is closed by the system? It could surely be programmed to do this regardless of the autosave frequency the user has set.

      I know that we often want to look at something, make changes, then close without saving, and that should remain a normal choice. But when the program is being closed by the system, the occasional warning dialog, if it appears at all, is very brief and sometimes hidden behind the current window.
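
      For what it’s worth, Windows does give programs a hook to do exactly that. Here is a minimal sketch in C of a window procedure that saves when the session is ending; SaveOpenDocuments is a hypothetical stand-in for whatever autosave logic the application already has.

      #include <windows.h>

      /* Hypothetical stand-in for the application's autosave logic. */
      static void SaveOpenDocuments(void)
      {
          /* ... write unsaved work to disk ... */
      }

      static LRESULT CALLBACK WndProc(HWND hWnd, UINT msg,
                                      WPARAM wParam, LPARAM lParam)
      {
          switch (msg) {
          case WM_QUERYENDSESSION:
              /* The system (e.g. a post-update restart) wants to end the
                 session; save now, before the process is torn down. */
              SaveOpenDocuments();
              return TRUE;            /* consent to ending the session */
          case WM_ENDSESSION:
              if (wParam)             /* TRUE: the session really is ending */
                  SaveOpenDocuments();
              return 0;
          case WM_DESTROY:
              PostQuitMessage(0);
              return 0;
          default:
              return DefWindowProc(hWnd, msg, wParam, lParam);
          }
      }

      int WINAPI WinMain(HINSTANCE hInst, HINSTANCE hPrev, LPSTR lpCmd, int nShow)
      {
          WNDCLASS wc = {0};
          wc.lpfnWndProc   = WndProc;
          wc.hInstance     = hInst;
          wc.lpszClassName = TEXT("AutosaveDemo");
          RegisterClass(&wc);

          HWND hWnd = CreateWindow(TEXT("AutosaveDemo"), TEXT("Autosave demo"),
                                   WS_OVERLAPPEDWINDOW, CW_USEDEFAULT, CW_USEDEFAULT,
                                   300, 200, NULL, NULL, hInst, NULL);
          ShowWindow(hWnd, nShow);

          MSG msg;
          while (GetMessage(&msg, NULL, 0, 0) > 0) {
              TranslateMessage(&msg);
              DispatchMessage(&msg);
          }
          return 0;
      }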

    • #1204338

      Another issue I come across from time to time is this: after an update (XP), Windows will give the familiar instruction to restart the machine. It will start to shut down and remain in that state, sometimes forever. After 15 or 20 minutes a hard shutdown is required, but I am always left wondering whether the machine will boot back up correctly. Shutting down this way is a last resort, but what is the alternative?

    • #1204544

      For non-critical patches/updates, I usually use the WhyReboot utility to judge whether a reboot is really warranted.
      It is a freeware utility, available here: http://exodusdev.com/products/whyreboot
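
      I can’t say exactly which checks WhyReboot performs, but one well-known reboot-pending indicator is the PendingFileRenameOperations registry value, which lists files that Windows will rename or delete at the next startup. A minimal sketch in C that probes just that one value (an illustration of the idea, not a description of the tool):

      #include <windows.h>
      #include <stdio.h>

      int main(void)
      {
          HKEY hKey;
          LONG rc = RegOpenKeyEx(HKEY_LOCAL_MACHINE,
              TEXT("SYSTEM\\CurrentControlSet\\Control\\Session Manager"),
              0, KEY_QUERY_VALUE, &hKey);
          if (rc != ERROR_SUCCESS) {
              fprintf(stderr, "RegOpenKeyEx failed: %ld\n", rc);
              return 1;
          }

          /* The value's mere presence usually means file replacements
             are queued for the next startup, i.e. a reboot is pending. */
          rc = RegQueryValueEx(hKey, TEXT("PendingFileRenameOperations"),
                               NULL, NULL, NULL, NULL);
          RegCloseKey(hKey);

          if (rc == ERROR_SUCCESS)
              printf("Reboot pending: file renames are queued for next startup.\n");
          else
              printf("No pending file rename operations found.\n");
          return 0;
      }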

      • #1208378

        For non-critical patches/updates, I usually use the WhyReboot utility to judge whether a reboot is really warranted.
        It is a freeware utility, available here: http://exodusdev.com/products/whyreboot

        IME, the safest approach to patch installation is to follow it immediately with a restart, whether or not one is suggested. Before I adopted this practice, on too many occasions my machine became unstable after an update, most likely because the update was only partially applied: some of the new code got loaded and linked against old code that the rest of the update was supposed to replace.

        David A. Gray

        Designing for the Ages, One Challenge at a Time

    • #1205260


      AutoUpdate will STILL INSTALL AND REBOOT under either of the notify-only settings if the update is flagged as critical (and far too many are). There have been several series of threads about this on various boards, and Microsoft’s MVPs and employees cannot seem to agree on whether it actually happens. However, in an actual developer’s blog, he admitted to a “forced update” flag that was only supposed to be used to update the update system itself, but was being used to force other updates as well. Sorry, I cannot quote a link, as my last foray into this was about a year ago.

      Well, my XP system is set to download and notify, and last update Tuesday I caught it just 5 seconds before a reboot!

      To prevent interruptions to TV recordings, my Windows 7 system is switched to manual, and everyone agrees that updates do not get actioned until deliberately checked for. In the past (when I was still running Vista), some updates left the PC powered off rather than restarting, which was not very good when I was away for a few days.

    • #1205996

      The fact that Windows wants to reboot after an update is not nearly as much of an aggravation to me as its incessant nagging, every few minutes, to ‘Reboot Now?’ Most of the flaws being patched have been around for quite some time, and the way I see it, waiting a few more hours until I go to lunch or reach the end of my workday is not a big deal. When I make the mistake of loading a patch that ultimately requires a reboot, those nags impact my ability to keep working and ultimately force me to stop what I’m doing and wait for the reboot. I would really appreciate it if Windows would accept ‘Reboot Later’ as really meaning later, and allow me to wait 1 hour, 2 hours, or 4 hours without constantly interrupting my work.

      Adobe can be even worse: in addition to its nags, Acrobat often will not accept a system shutdown followed by a startup the next day in place of the ‘Restart’ it wants.
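
      As it happens, the Automatic Updates client does have a re-prompt interval you can lengthen, though through policy rather than the dialog itself. A sketch in C, assuming the WSUS-era policy values RebootRelaunchTimeoutEnabled and RebootRelaunchTimeout (in minutes, 1–1440); it must run as administrator, and domain Group Policy can override it:

      #include <windows.h>
      #include <stdio.h>

      int main(void)
      {
          HKEY  hKey;
          DWORD enabled = 1;
          DWORD minutes = 240;   /* re-prompt every 4 hours, not every few minutes */

          /* Policy location used by the Automatic Updates client
             (XP/Vista/7 era). Requires administrator rights. */
          if (RegCreateKeyEx(HKEY_LOCAL_MACHINE,
                  TEXT("SOFTWARE\\Policies\\Microsoft\\Windows\\WindowsUpdate\\AU"),
                  0, NULL, 0, KEY_SET_VALUE, NULL, &hKey, NULL) != ERROR_SUCCESS) {
              fprintf(stderr, "Could not open the WindowsUpdate\\AU policy key.\n");
              return 1;
          }

          RegSetValueEx(hKey, TEXT("RebootRelaunchTimeoutEnabled"), 0, REG_DWORD,
                        (const BYTE *)&enabled, sizeof enabled);
          RegSetValueEx(hKey, TEXT("RebootRelaunchTimeout"), 0, REG_DWORD,
                        (const BYTE *)&minutes, sizeof minutes);
          RegCloseKey(hKey);

          printf("Restart re-prompt interval set to %lu minutes.\n", minutes);
          return 0;
      }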
