• Perplexing laptop battery life issue with Mint 19 Cinnamon


    #209059

    Ever since I bought my Acer Swift 1 laptop at the beginning of June, it’s been running Linux Mint Cinnamon, initially 18.3, but now 19.  I’ve been using it on battery power quite a bit lately (that’s what I bought it for… I have my Core 2 Duo laptop for semi-portable duty when an electrical outlet is near).  I’ve noticed that the run time while watching videos is pretty poor, with the claimed 10 hour battery life dropping to only four when continuously playing 1080p video (h.264).  That’s with hardware acceleration confirmed to be working and with Laptop Mode Tools installed for power saving; Intel’s Powertop confirms that all of the power-saving options are active other than autosuspend for the external mouse (which makes the mouse turn off completely after only two seconds and refuse to respond until I press a button).
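
    (For anyone who wants to poke at the same knobs, here’s a rough sketch of how the USB autosuspend state can be inspected and overridden from sysfs.  The 1-2 device path below is only a placeholder; find the right one for your own mouse with lsusb and the sysfs tree.)

        # Show the runtime power-management ("autosuspend") state of each USB device;
        # "auto" means autosuspend is enabled, "on" keeps the device always powered.
        for d in /sys/bus/usb/devices/*/power/control; do
            printf '%s: %s\n' "$d" "$(cat "$d")"
        done

        # Keep a troublesome device (e.g. the external mouse) always powered:
        echo on | sudo tee /sys/bus/usb/devices/1-2/power/control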

    I noticed that one of the laptop review sites used Big Buck Bunny as a test video for this kind of thing, so I downloaded it for this test (same specs as cited above).  I confirmed the 4 hour run time in Mint Cinnamon using the configuration as above, then booted Windows 10, which is still on the mostly unused internal eMMC drive inside the Swift (Linux gets the big, beautiful Samsung SSD), and tried it there using Windows Media Player.

    The video ran continuously on loop for seven hours on the nose before the laptop ran out of juice.  I had the wifi enabled, bluetooth enabled, sound muted, and an external mouse plugged in the whole time.  Screen brightness was 40%, with the timer-based screen blanking and sleep functions turned off.  Otherwise, all power savings options were at their defaults.

    It’s understood that Linux power saving has some catching up to do compared to Windows.  Drivers for many of the components within a PC are often much more optimized in Windows than in Linux.  Still, that’s a BIG difference in run time.

    I tried using Powertop to figure out what was using all the extra power, but it seemed to be malfunctioning.  It was claiming that the unused fingerprint reader was using 2 watts and that the also unused SD card reader was using 1.5 watts.  Both devices showed as “good” in the power saving options, indicating that autosuspend was enabled for both of them.

    In total, it estimated a baseline power use of about 8 watts, even though, when on battery, it also reported that the actual discharge rate (while not playing video) was 5 watts.  The actual power use shouldn’t be less than the baseline, so that’s no help.
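
    (If you want to sanity-check Powertop’s numbers, the kernel reports the discharge rate directly.  BAT0 is an assumption; some machines call the battery BAT1 or expose current_now/voltage_now instead of power_now.)

        # Instantaneous discharge rate in microwatts, straight from the kernel:
        cat /sys/class/power_supply/BAT0/power_now

        # Or a human-readable figure via upower:
        upower -i "$(upower -e | grep -m1 BAT)" | grep energy-rate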

    I saw in the system monitor (like Task Manager) that Cinnamon consistently was on top of the CPU use rankings while a video was playing– more than the media player (VLC, at that moment).  Could that be part of why the battery life was so much worse in Linux?
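
    (From a terminal, the same thing can be watched with top or pidstat; a sketch, assuming the Cinnamon shell process is simply named "cinnamon" and that the sysstat package is installed for pidstat.)

        # Biggest CPU consumers while the video plays, refreshed every 2 seconds:
        top -d 2 -o %CPU

        # Or sample just the Cinnamon process every 5 seconds
        # (pgrep -xo picks the oldest matching process if there is more than one):
        pidstat -u -p "$(pgrep -xo cinnamon)" 5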

    I backed up the Linux installation with Macrium and again with Timeshift, then installed Mint Xfce 19 over Mint Cinnamon 19.  After getting the hardware acceleration driver for the Intel iGPU and a few other things installed, I installed TLP (another Linux power-saving tool).  I didn’t try laptop mode tools this time because it had introduced an annoying screen blanking after 2 minutes even though I had the option set in the power options to not do it at all.
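
    (For reference, getting TLP going is only a couple of commands; a sketch, with package names as they appear in the Ubuntu/Mint repos.)

        # Install TLP, start it, and confirm it is active and applying settings:
        sudo apt install tlp tlp-rdw
        sudo tlp start
        sudo tlp-stat -s     # prints a summary of TLP's state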

    I thought maybe TLP would be better in that way, saving me the trouble of figuring it out, and it was.  Laptop Mode Tools had set all of the options reported in Powertop to “good” except two, and I checked these manually and confirmed they were both actually enabled, just not with the settings Powertop was looking for.  TLP did the same.  Regardless of the tool used to turn the options on or off, it’s the same options, so it’s still a valid test to me.

    I began the test, and it’s still going now.  It’s been going about three hours, and the battery just passed 50%.  It’s on pace for about a six hour run time (the estimates of remaining run time were very accurate in Mint and Windows in the previous tests)– not as good as the seven in Windows, but far better than the four in the same OS (Linux Mint) with Cinnamon instead of Xfce.

    EDIT: The final run time was 5 hours, 50 minutes on Mint Xfce, quite close to the estimate of 6 hours.

    It would appear that whatever Cinnamon is doing to use that CPU time during the playback of the video is causing the battery to run down faster.  Not only that, but the laptop is noticeably cooler during the Xfce and Windows runs than it was in Mint.

    I prefer Cinnamon’s look and feel to any other Linux desktop environment, but cutting the battery run time during video playback by 33% is brutal.  On a desktop, or on a laptop while plugged in, it’s pretty trivial (Cinnamon does not feel slow even on my slowest PC, my Dell laptop), but it’s not trivial on battery.

    I think I will post about this over on the Mint forum too.  If there’s a way to fix this, I would prefer to use Cinnamon, but Xfce isn’t bad at all either.  I would be curious to know the results other people have if they are running Linux on a laptop.

    The 4.17 kernel is supposed to have some noticeable improvements in power saving on laptops.  It’s not (yet?) available as an optional kernel in the official Mint/Ubuntu repos, but it is available from a PPA.  I’m thinking of giving it a try!
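
    (For anyone who wants to try it before it lands in the repos, one route is the Ubuntu mainline kernel builds rather than a repo package; a rough sketch only, and note these are unsigned, unsupported test kernels.)

        # Download the amd64 headers, image, and modules .debs for v4.17 from
        # https://kernel.ubuntu.com/~kernel-ppa/mainline/ into an empty directory,
        # then install the lot and reboot:
        sudo dpkg -i ./*.deb
        sudo reboot
        # Afterwards, confirm which kernel is running:
        uname -r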

    I think Linux will get close to Windows eventually in power use, and maybe sooner than later.

     

    Dell XPS 13/9310, i5-1135G7/16GB, KDE Neon 6.2
    XPG Xenia 15, i7-9750H/32GB & GTX1660ti, Kubuntu 24.04
    Acer Swift Go 14, i5-1335U/16GB, Kubuntu 24.04 (and Win 11)

    2 users thanked author for this post.
    • #209067

      Ascaris: “After getting the hardware acceleration driver for the Intel iGPU and a few other things installed, I installed TLP (another Linux power-saving tool).”

      I had understood that to save power and, consequently, also hear the fan less often, one disables hardware acceleration, at least with Windows (which, until last year when I got the Mac laptop, had been the sum total of my direct, hands-on experience with modern PC operating systems, other than using Linux already installed and set up for me, and not for watching videos, on someone else’s machine).  Perhaps I am wrong about this?

      Ex-Windows user (Win. 98, XP, 7); since mid-2017 using also macOS. Presently on Monterey 12.15 & sometimes running also Linux (Mint).

      MacBook Pro circa mid-2015, 15" display, with 16GB 1600 MHz DDR3 RAM, 1 TB SSD, a Haswell architecture Intel CPU with 4 Cores and 8 Threads model i7-4870HQ @ 2.50GHz.
      Intel Iris Pro GPU with Built-in Bus, VRAM 1.5 GB, Display 2880 x 1800 Retina, 24-Bit color.
      macOS Monterey; browsers: Waterfox "Current", Vivaldi and (now and then) Chrome; security apps. Intego AV

      • #209098

        Oscar,

        Hardware acceleration, if it is done correctly, is a very good thing, in terms of speed and of efficiency, but there can be exceptions.  (Wow, this is harder to describe than I thought it would be).

        In this case, the hardware in question, the Intel Apollo Lake SoC, is built throughout to be a very power-efficient design (it doesn’t even need a fan… it is passively cooled).  It has hardware built into its onboard iGPU (integrated GPU) that is made to be very effective at decoding certain types of video streams, like h.264, without it having to work very hard at all.  If it’s not done in hardware, it has to be done in software, which is a lot less efficient.  The CPU has to work much harder than if the iGPU were doing it, and that pulls a lot more power.  It’s about matching the task with the hardware that is best suited to that specific kind of work.
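
        (If you ever want to see whether that decode hardware is actually being used on a given machine, here is a quick sketch; it assumes the Intel VA-API driver plus the vainfo and intel_gpu_top utilities are installed, and the file name is just a placeholder.)

            # List the codecs the iGPU can decode in hardware via VA-API:
            vainfo

            # Play a file with hardware decoding forced on in mpv (VLC has an
            # equivalent setting in its Input / Codecs preferences):
            mpv --hwdec=vaapi some-video.mp4

            # While it plays, intel_gpu_top should show the video engine busy
            # while the CPU stays mostly idle.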

        Sometimes the payoff of the hardware acceleration isn’t a reduction in power consumption.  When you’re talking about something like a discrete GPU used for gaming, it is about being able to deliver more performance than a CPU acting alone ever would.  GPUs can draw a ton of power, very often more than the CPU (and sometimes much more).  When you get into that kind of situation, the GPU is not saving power compared to doing the same work on the CPU, because the CPU flat out is not capable of handling what the GPU is doing.

        Sometimes hardware acceleration is not well-implemented, and it can cause all kinds of glitches and performance issues, but that’s not the way it’s supposed to be.  Firefox for Linux still ships by default (in other words, the default settings it sets on an initial installation) with all hardware acceleration off– it’s a very choppy, un-smooth, tearing-filled, really lousy experience, but it works with nearly everything.  Forcing on all of the hardware acceleration is one of the first things I do when I install Firefox or a relative of Firefox, and so far it has always worked beautifully.
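
        (For reference, the usual switch for forcing compositor acceleration on, at least in the Firefox versions of that era, is the layers.acceleration.force-enabled preference in about:config.  A sketch of setting it from a user.js file; the profile directory name is a placeholder, since yours will differ.)

            # Append the force-enable pref to the profile's user.js:
            echo 'user_pref("layers.acceleration.force-enabled", true);' >> ~/.mozilla/firefox/xxxxxxxx.default/user.js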

        The hardware acceleration in Firefox in Linux is considered (by Mozilla) to be incomplete, so in some cases enabling it will crash Firefox or cause it to perform worse than having it off.  It’s not because hardware acceleration is bad, but because they haven’t gotten it finished yet, or sometimes because there are bugs or unfinished bits in some other library or process.  Linux often gets the short end of the stick because so few people use it relative to Windows, but it is steadily improving, just a little belatedly.

        I am not sure about what you read that suggests that turning hardware acceleration off would reduce the heat output.  It might end up saving power if parts of the task aren’t done (like showing a 60 fps movie at 30 fps; only half of the work is being done), and that is sometimes a valid approach, as long as you are happy with the results.

        Dell XPS 13/9310, i5-1135G7/16GB, KDE Neon 6.2
        XPG Xenia 15, i7-9750H/32GB & GTX1660ti, Kubuntu 24.04
        Acer Swift Go 14, i5-1335U/16GB, Kubuntu 24.04 (and Win 11)

        4 users thanked author for this post.
        • #209103

          Ascaris, thank you so much for your answer.

          Probably this has to do with how old the hardware we are considering here is.  My 7-year old Windows 7 machine has separate GPU and CPU chips, and the heat was coming from the GPU when using hardware acceleration.

          Turning off the GPU to avoid several unwanted effects of having it on was more or less standard online advice some years ago, when I was streaming with IE11 using “Silverlight.” So I turned it off and had no issues when streaming in HD video (TV shows, movies) from Netflix. By the way: the only issues I ever had with screen tearing were caused by the use of the “Classic” view. Switching to glassy “Aero” ended that (although I got stuck with a busybody of a GUI that does things on its own without my asking for them; but this is neither here nor there.)

          Also: am I guessing correctly here that the problem with hardware acceleration being off by default in FF and FF-derived browsers (WF, PM) is restricted to their Linux versions?

          I have WF on both my old Windows and my new Mac machines, have made no adjustments to its settings, and have had no noticeable issues concerning heating or laptop battery usage.

          Ex-Windows user (Win. 98, XP, 7); since mid-2017 using also macOS. Presently on Monterey 12.15 & sometimes running also Linux (Mint).

          MacBook Pro circa mid-2015, 15" display, with 16GB 1600 MHz DDR3 RAM, 1 TB SSD, a Haswell architecture Intel CPU with 4 Cores and 8 Threads model i7-4870HQ @ 2.50GHz.
          Intel Iris Pro GPU with Built-in Bus, VRAM 1.5 GB, Display 2880 x 1800 Retina, 24-Bit color.
          macOS Monterey; browsers: Waterfox "Current", Vivaldi and (now and then) Chrome; security apps. Intego AV

          • #209120

            Oscar,

            The GPU will (of course) generate some heat when it is used, but so will the CPU if it is asked to do the same level of work that the GPU would have done (often times, it’s not… the frame rate is reduced, so it doesn’t have to do as much work).  The GPU fan is often more annoying than the CPU fan in tone/volume.

            I never tried using Netflix with Silverlight.  I’ve only used it in the last year or so, using HTML5 under Firefox natively.  I don’t know what it was like before that.  I do know it works perfectly well under Linux, which was a pleasant surprise.

            The Windows Aero themes (which include the non-transparent themes in Windows 8 and 10, even though “Aero” originally was about the transparency) are GPU-accelerated, whether that GPU is discrete or integrated.  I think we talked about this before, how the Aero themes use the Desktop Window Manager (DWM), while the Classic and basic themes use the old-style GDI (CPU, not hardware accelerated) to draw the display.  The Windows 7 GDI was horrendous in terms of tearing, something I had not noticed much in XP (which only used GDI).

            Aero was introduced with Vista, and not all integrated GPUs of the day worked with Aero themes.  They had to be WDDM (Windows Display Driver Model) compatible, and I think there may also have been a minimum score they had to get on the WEI (Windows Experience Index) in order to enable Aero.

            This was a source of controversy back then, as Microsoft had initially intended to only certify PCs as “Vista Premium Ready” if they had the chops to handle Aero.  Intel had a ton of 915 chipsets to sell, though, and they didn’t have what it takes to run Aero.  Intel pushed MS to create another category to let people know that the 915 could indeed run Vista– just without Aero.  Microsoft (in a series of correspondence that leaked to the public later on) resisted at first, but eventually gave in to Intel and created “Vista Capable,” which meant it could run Vista, but not at the “Premium Ready” level.

            In a completely predictable way, the public confused “Vista Premium Ready” and “Vista Capable,” perhaps not even realizing that they were two separate labels, and a lot of people ended up with new PCs that didn’t perform as they had thought they would, and they ended up blaming Microsoft.  It’s part of the reason Vista was so disliked at first… it had real performance issues that were present even on higher-end machines (famously, copying files was far faster on XP), and the Vista Capable thing had just made it worse.

            The unfortunate thing was that the WDDM and Aero were legitimate leaps in quality (zero tearing, no artifacts if a program stopped responding, Aero Peek effects if you wanted them), but they got dragged down along with Vista, until 7 arrived.  A lot of people chafed at the idea of the “fancy” Aero consuming more power than Classic, which was sometimes true, but they interpreted that to mean it was slower.  The additional power consumed was by the GPU, though, which was typically almost idle and underutilized while the PC was running desktop applications.  Games and videos were a different story, but the rest of the time, the GPU just sat there while the CPU had to draw the display AND handle all the operation of the system AND running all of the programs.  Offloading some of the work to the GPU made sense… it freed up the CPU for the kinds of things a CPU is meant to do, while allowing the bits that were meant to handle graphics to do what they were designed for too.  Of course, that depended on everything working as it should, and in those early days of Vista, that was not a given.

            Things are a little different with most discrete GPUs compared with integrated ones.  A discrete GPU isn’t meant to save power… generally, they consume a lot more power than an integrated one, but they deliver a lot more performance.  With the stock blower-style fans on desktop GPUs, they were often quite loud, and since they vented the hot air out of the back of the PC, the sound they made was not muffled by the PC’s case as it would have been with a GPU that vented internally.

            One of the first things I did with my discrete GPUs (starting with the nVidia Fermi, my first really hot GPU) was put on an aftermarket heat sink and fan (Arctic Cooling Accelero Twin Turbo II), which was able to keep the GPU much cooler than the stock fan, and with far, far less noise.  It added cost and bulk (I always made a bracket of some kind to hold the thing up, as it was far too heavy for me to be comfortable letting it dangle from the card), but the difference in sound level was night and day.  I’ve used that one cooler on three different GPUs, just moving it from one to the next when I upgraded.  It’s in my PC I am using to write this now!

            Dell XPS 13/9310, i5-1135G7/16GB, KDE Neon 6.2
            XPG Xenia 15, i7-9750H/32GB & GTX1660ti, Kubuntu 24.04
            Acer Swift Go 14, i5-1335U/16GB, Kubuntu 24.04 (and Win 11)

            1 user thanked author for this post.
            • #209124

              Ascaris, Thanks.

              Yes, you have explained this to me already (tearing, Aero).  The thing is, after I turned off hardware acceleration on the Win 7 Pro SP1 machine (with an i7 “Sandy Bridge” CPU), I never turned it back on.  Unless it turned itself back on, it should still be off, but the quality of streaming video is still very good all the same (I can’t check right now, because I am writing this on the Mac laptop, a machine that always stays as cool as a cucumber no matter what I’m doing on it: streaming HD video + HiFi sound or heavy number-crunching data analysis, and I’ve no idea how that works).

              Ex-Windows user (Win. 98, XP, 7); since mid-2017 using also macOS. Presently on Monterey 12.15 & sometimes running also Linux (Mint).

              MacBook Pro circa mid-2015, 15" display, with 16GB 1600 MHz DDR3 RAM, 1 TB SSD, a Haswell architecture Intel CPU with 4 Cores and 8 Threads model i7-4870HQ @ 2.50GHz.
              Intel Iris Pro GPU with Built-in Bus, VRAM 1.5 GB, Display 2880 x 1800 Retina, 24-Bit color.
              macOS Monterey; browsers: Waterfox "Current", Vivaldi and (now and then) Chrome; security apps. Intego AV

            • #209173

              Oscar,

              It depends on what you mean by turning off hardware acceleration after installing SP1.  If you selected a non-basic/classic theme, that automatically turns GPU acceleration on for the desktop.  If you mean in Firefox… well, I don’t have any experience with the Silverlight plugin, so if it has to do with that, I don’t really know.  I know in my case, I could not get Firefox to stop tearing in Windows 7 until I switched to a non-Classic theme.

              Dell XPS 13/9310, i5-1135G7/16GB, KDE Neon 6.2
              XPG Xenia 15, i7-9750H/32GB & GTX1660ti, Kubuntu 24.04
              Acer Swift Go 14, i5-1335U/16GB, Kubuntu 24.04 (and Win 11)

    • #209086

      During my testing of Mint 19 (Cinnamon and Mate) on my desktop with a separate nVidia GPU, I did find that Cinnamon generated higher CPU and graphics fan speeds than Mate.  This was monitored by ear, and by the temps and fan speeds shown in Windows after immediately shutting down the liveUSB image and rebooting to Windows 7-64Pro.

      For both tests, I did install the distro’s nVidia GPU drivers.  In fact, both versions’ temps and fan speeds were higher than in Windows, but that was because under Linux I did not have the ability to set custom fan profiles like I run in Windows.

      On my wife’s Lenovo T420 i7-2620M laptop, Ubuntu 16.04 generated higher temps and fan speeds than Mint 18.3 Mate (based upon the Ubuntu 16.04 LTS core).  That laptop has the Intel graphics.  The laptop body by the CPU also felt warmer under Ubuntu, but nowhere near really warm.

      Either way, I have to say an honest 4 hours on just battery sounds very good. I can only dream of that with my Lenovo E440 Thinkpad i5-4100M, under Windows 7-64Pro. Then again I have it set to the middle of the battery settings so it is not turning off every 5 minutes. Turning on airplane mode does extend it.

      2 users thanked author for this post.
      • #209104

        About Lenovo: I wonder what you know about this? #209090

        Thanks.

        Ex-Windows user (Win. 98, XP, 7); since mid-2017 using also macOS. Presently on Monterey 12.15 & sometimes running also Linux (Mint).

        MacBook Pro circa mid-2015, 15" display, with 16GB 1600 MHz DDR3 RAM, 1 TB SSD, a Haswell architecture Intel CPU with 4 Cores and 8 Threads model i7-4870HQ @ 2.50GHz.
        Intel Iris Pro GPU with Built-in Bus, VRAM 1.5 GB, Display 2880 x 1800 Retina, 24-Bit color.
        macOS Monterey; browsers: Waterfox "Current", Vivaldi and (now and then) Chrome; security apps. Intego AV

        • #209131

          When that came out I checked my PC (a Lenovo E440 Thinkpad with Win7-64Pro_SP1) for Superfish, and it did not have that issue (it was new, but outside the period of the Superfish installs; I still checked anyway); nor did my wife’s Thinkpad T420, which is her daily driver.  On her laptop, she wanted Ubuntu like her previous machine, so I totally updated Windows (during the real Win7 WU nightmare period), popped out the HDD, and installed an SSD for Ubuntu 16.04 LTS.  It was later changed to Linux Mint-Mate 18.3 LTS.  I later did re-insert the Windows 7-64Pro HDD, and it was not on that HDD either.  That laptop under Linux Mint is very, very fast, with decent battery life for a fast i7 CPU.

          On mine I did remove some of the Solution Center programs due to intrusiveness or not being easy to use or configure, but kept a few.  That was during the period when Intel Bluetooth was being killed by WU, and I needed the system to check for the Lenovo patches after a bad install of Intel drivers.  That laptop model number was said to be affected by the Intel ME vulnerability, but the ME software or drivers were not in my machine, as it was targeted at a non-corporate user.  I believe a subsequent UEFI release closed the vulnerability at the firmware level.  It was a fast laptop, but took a hit with the Spectre/Meltdown patches.  The Spectre one is disabled with InSpectre.

          I find the Lenovo business line to be the sturdiest and most user-serviceable laptops.  Updating RAM and changing the SSD or WiFi card is very easy.  The later E440 also does not need an HDD caddy like the earlier T420 model.  The wife’s was an off-lease refurb; mine was a new Win7 closeout during the initial hoopla over Win10.  I jumped on it to avoid Win 8.1 and Win10.  It will eventually become a Linux Mint-Cinnamon laptop, but for now remains Win7 for those very rare times that an iPad is not enough – like for onsite image editing.

          2 users thanked author for this post.
          • #209384

            I find the Lenovo business line to be the sturdiest and most user-serviceable laptops.

            That’s from their IBM legacy. IBM made good, solid computers.

            Group "L" (Linux Mint)
            with Windows 10 running in a remote session on my file server
            1 user thanked author for this post.
    • #209217

      You may want to try out MX 17 Linux, (https://mxlinux.org/)

      (here’s a review: https://www.dedoimedo.com/computers/mx-17-lenovo.html)

      It has great battery life, and is very attractive.  May be worth a spin using a live USB!

       

    • #209382

      I prefer Cinnamon’s look and feel to any other Linux desktop environment, but cutting the battery run time during video playback by 33% is brutal.

      Bummer! I just installed Mint Cinnamon 18.3 on my laptop!

      Group "L" (Linux Mint)
      with Windows 10 running in a remote session on my file server
    • #213372

      MrJimPhelps,

      Have you had a chance to test the battery life yet on that laptop?

      This battery life issue is weird.  One person on the Mint forum reported having the issue once he upgraded to Mint 19; Mint 18.3 had been better.  He didn’t say what desktop environment he was using, but I suspect Cinnamon.  I replied with my own experiences, but I still don’t know exactly what is going on.

      I decided to use my Dell Inspiron 11 laptop for testing.  I backed up its internal eMMC drive using Macrium Reflect (on USB), then used Macrium again to roll it back to a fresh Mint 18.3 state (backup image taken right after I installed Mint).  I looked in the System Monitor, and Cinnamon was behaving– it was idling at 0% CPU, just as it should.  Whenever the screen updated, it briefly surged (which makes sense, since the Muffin compositor that is integrated with Cinnamon is probably inside the Cinnamon process space), but dropped back down.

      I decided to test it.  I left wifi on, bluetooth on, TLP installed and left with default settings, screen at 40% brightness (more or less, since the slider has no markings), screen set to never dim or go off.  I let it idle until it ran out of juice, and it went 11 hours, 20 minutes– an impressive showing.
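
      (If anyone wants to reproduce these idle-down runs, a minimal way to log the drain is a loop like this left running in a terminal; BAT0 is an assumption and the interval is arbitrary.)

          # Append a timestamped battery reading every 5 minutes:
          while true; do
              printf '%s %s%%\n' "$(date +%H:%M)" \
                  "$(cat /sys/class/power_supply/BAT0/capacity)" >> ~/battery-log.txt
              sleep 300
          done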

      I had previously tested it with Xfce (Mint 19) under the same conditions.  While Cinnamon is generally regarded as a resource-heavy DE, Xfce is the opposite.  Surprisingly, the Xfce run had only gone a shade under 11 hours… so Cinnamon had bested Xfce.  Not by much, though, and I suspect that more repetitions with each DE would make the difference evaporate.

      I decided to experiment with the Mint 18.3 setup and see if I could recreate the “Cinnamon always using 1-2% CPU” thing.  The first thing I tried was changing the desktop effects settings (they were still on default).  I tried putting them all on, and sure enough, Cinnamon began using 1-2% CPU at all times, no longer dropping at all to 0%.  I tried putting the desktop effects all OFF now, and Cinnamon still used 1-2% always.

      I repeated the idle-down battery test and got a shocking 5 hours, 40 minutes… half of what it had been.

      I have not been able to get Cinnamon to stop using 1-2%, even by putting the settings back to their defaults.
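
      (One harder reset that could be tried, though I haven’t confirmed it helps: wiping the whole Cinnamon dconf tree rather than toggling options back by hand.  Note that this discards every Cinnamon customization, so dump a backup first.)

          # Back up the current Cinnamon settings, then reset them all to defaults:
          dconf dump /org/cinnamon/ > ~/cinnamon-settings-backup.dconf
          dconf reset -f /org/cinnamon/
          # Log out and back in afterwards.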

      Cinnamon remains my favorite DE in terms of usability, features, look and feel, etc., but it has this Jekyll and Hyde thing with battery life.  If Cinnamon is behaving, it performs well in the idle-down test; if not, it’s truly awful.

      Note that even when Cinnamon is behaving, any time the screen updates or changes in any way, Cinnamon surges right up, and that could mean that other tests (like my continuous Big Buck Bunny movie test) that involve lots of screen updating will still show Cinnamon as being a poor choice when battery life matters.  I was unable to test this because I could not get Cinnamon to behave before I started the test… I already know that when Cinnamon is misbehaving, battery life is bad.

      So, much to my chagrin, I have settled on Xfce for my laptops that are going to be used with battery (the Core 2 Duo laptop still has Cinnamon… its battery life is so poor that it is more of a portable than a laptop in terms of how I use it). I did install Cinnamon’s file manager, Nemo, and set that as the default with Xfce.  Xfce is lacking in a number of areas, and its included file manager (Thunar) is one of them.  It’s incredibly fast, but it has no integrated search function or per-folder settings– meaning that if you set one folder to details view, sorted by modified date, all folders will be viewed according to those same settings.  It won’t remember them per-folder, which is a shocking omission for any modern file manager (as is the lack of integrated search, which I use in Nemo frequently).
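
      (For anyone wanting to do the same swap, it is roughly the following; the xdg-mime line is what makes folder-open actions use Nemo, and the gsettings line keeps Nemo from trying to draw the desktop, which is Xfce’s job here.)

          sudo apt install nemo

          # Make Nemo the default handler for directories:
          xdg-mime default nemo.desktop inode/directory

          # Keep Nemo from managing the desktop (leave that to Xfce):
          gsettings set org.nemo.desktop show-desktop-icons false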

      Xfce also lacks a built-in ability to use color profiles, and with LCD panels often having terrible default color balance, that’s important.  My Swift has a decidedly blue cast by default, while the Dell takes it much further… it’s way, way too blue.  I have the correct ICC profiles for both of them, but Xfce doesn’t have a place to put them, and simply installing the color manager from Cinnamon (and GNOME) didn’t work.

      I was able to do it by installing DisplayCal from the Ubuntu repo, which comes with Argyll as a dependency.  Argyll has a series of tools, and one of them does allow setting of the ICC profile by command line.  It’s not as convenient as gnome-color-manager (preinstalled with Cinnamon) to browse dozens of profiles to see which one looks the best, if that is what you wish to do, but I’ve already got the ones I want to use lined up, so it’s easy to use the built-in applet in the Xfce settings to install the profile at boot.
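
      (The Argyll piece that does this is dispwin, if memory serves; roughly as follows, with the profile path as a placeholder.)

          # Load the calibration curves from an ICC profile into the video LUT now:
          dispwin ~/profiles/my-panel.icc

          # Or install it as the display's default profile; "dispwin -L" then
          # reloads whatever profile is installed as the default:
          dispwin -I ~/profiles/my-panel.icc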

      Of course, I could just have installed Argyll by itself, but some of the DisplayCal tools are nice to have, like one that can show the graph of the color correction profile currently loaded into the video card (to make sure it’s working properly), even though I don’t have a display calibrator (which is what DisplayCal is really meant for).

      The Xfce applet for touchpad settings is also woefully inadequate, offering only the most basic settings.  Cinnamon’s settings for mouse/touchpad are based on GNOME, which is a lot more advanced than Xfce in that way (and KDE is better still).  It’s not a deal-breaker either, though, as the touchpad can be configured via .conf file, and once it’s set, it’s set.  I rolled back to the older synaptics driver (which is not just for Synaptics touchpads), as the libinput one still has some rather annoying bugs/errors/whatever you want to call them.  Synaptics is no longer being maintained, as is so common in the open source world, so the distros all say to use libinput if possible, but it’s just not there yet for touchpads (it’s still in use for everything other than the touchpad).  IMO, of course.
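
      (The .conf route looks roughly like this; the options shown are illustrative rather than my exact settings, and the old synaptics driver package has to be installed for the “synaptics” driver line to mean anything.)

          # Install the (unmaintained but still packaged) synaptics driver:
          sudo apt install xserver-xorg-input-synaptics

      Then drop the overrides into a file such as /etc/X11/xorg.conf.d/70-synaptics-overrides.conf (created with sudo; the directory may need to be created first), containing, for example:

          Section "InputClass"
              Identifier "touchpad overrides"
              Driver "synaptics"
              MatchIsTouchpad "on"
              Option "TapButton1" "1"
              Option "VertTwoFingerScroll" "1"
          EndSection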

      Kubuntu, or Ubuntu with KDE, managed to slightly best Xfce in the continuous Big Buck Bunny video test, though I suspect that this also would come out to it being equal to Xfce if more repetitions were performed.  KDE managed to go a little bit above 6 hours, while Xfce went 5:50.  KDE’s not the heavyweight that most people still think it is anymore… the devs have done a lot to lighten it up, and it is now not much heavier than Xfce in memory footprint, and about equal to Cinnamon.

      The KDE System Monitor shows it as having a far lighter footprint than Xfce… it consistently showed 0.39 GB of RAM used at first bootup, while Xfce is about 0.7 GB, nearly twice as much.  That made no sense to me, so I tried installing gnome-system-monitor (the default system monitor in Cinnamon, which I had also used in Xfce, as I found Xfce’s to be lacking) in KDE, and the truth was revealed… it showed KDE as using 0.9 GB, while the KDE system monitor was onscreen at the same time showing 0.39 GB.  They’re using different definitions of what counts as memory used, apparently, so if you ever want to compare two dissimilar Linux setups in this way, make sure you’re using the same memory reporting tool on both of them!
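
      (The practical takeaway: run the same tool on both setups.  The disagreement is mostly over whether reclaimable cache counts as “used,” which plain old free makes visible.)

          # Compare like with like on both machines:
          free -h
          # The "available" column already discounts reclaimable cache/buffers,
          # which some monitors lump into "used"; that is where the numbers diverge.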

      So, as it stands, I am using Xfce, but with some Cinnamon bits sprinkled in (as much as I can get to work).  If Cinnamon ever manages to tame that power usage, I will happily return on my laptops, for it really is my favorite other than the battery life issue.

      KDE could be the go-to desktop now that it’s light and fast, but it still has some annoyances that push me away, and now that Mint has dropped the KDE version, it just isn’t clicking with me.  Mate might be a good choice instead of Xfce, but I do not like the UI for Caja at all (the Mate file manager), so I would still be substituting Nemo.  It might fix a few of the minor issues (no color manager, not enough touchpad settings), or it might not… I have not tried, since I was able to get acceptable workarounds in Xfce.

      That’s one thing that’s so neat about Linux.  Not only do you get choices, but you get choices within the choices… you can mix elements from different desktop environments.  Some Linux sites advise against this (particularly in the case of file managers), but using Nemo has been flawless in Xfce.  I do still have Thunar at the ready in case Nemo falls down on the job, but so far, it hasn’t.

      Dell XPS 13/9310, i5-1135G7/16GB, KDE Neon 6.2
      XPG Xenia 15, i7-9750H/32GB & GTX1660ti, Kubuntu 24.04
      Acer Swift Go 14, i5-1335U/16GB, Kubuntu 24.04 (and Win 11)

      2 users thanked author for this post.
      • #213389

        Try Manjaro XFCE: https://manjaro.org/get-manjaro/

        Much to lose?

        greynad

      • #213430

        Have you had a chance to test the battery life yet on that laptop?

        No I haven’t, because the battery on the laptop is defective – totally dead. As soon as I unplug the AC adapter, the laptop instantly dies.

        Group "L" (Linux Mint)
        with Windows 10 running in a remote session on my file server
    • #213424

      Mint is using an Ubuntu base for Tara (unless  you’re using LMDE), and I think the battery life issue is a result of that. Supposedly, Ubuntu is working on better battery life in 18.10, but that’s not an LTS (though it may be backported to Tara at some point).

      If you like Cinnamon, it’s worth a try to setup the laptop with Manjaro Cinnamon to compare. I prefer XFCE, but I’m liking MX 17 Linux XFCE vs Mint XFCE. MX 17 XFCE seems to have very good battery life, as you can see from these reviews:

      MX Linux MX-17 Horizon – Shaping up beautifully

      MX Linux MX-17 Horizon – Second test, top notch.

      1 user thanked author for this post.
      • #214510

        Mint is using an Ubuntu base for Tara (unless you’re using LMDE), and I think the battery life issue is a result of that.

        The problem is present in Mint 19 Cinnamon, but not Mint 19 Xfce or Kubuntu 18.04 (same Ubuntu base as Cinnamon 19).  There’s something with Cinnamon.

        If you like Cinnamon, it’s worth a try to setup the laptop with Manjaro Cinnamon to compare. I prefer XFCE, but I’m liking MX 17 Linux XFCE vs Mint XFCE.

        I could do that, but it would be somewhat non-trivial.  The Cinnamon live sessions I’ve tried did not show the errant Cinnamon CPU usage at idle, frustratingly.  I could try installing it in a VM, but while that might allow me to see if Cinnamon refuses to drop to 0% CPU, it wouldn’t allow me to test for battery life.  I’d have to actually set it up on the device in question, and I’m not really familiar with non-Ubuntu derivatives.  I will keep it in mind, though.

        I did try installing Kubuntu on a separate partition within my Swift laptop.  As has always been my experience with KDE, its extreme configurability was appealing (although the “you can’t run Dolphin in root” change before its functional replacement was checked in remains exceedingly frustrating), but there were little niggling issues that pushed me away.  One of them (which seems trivial, and I guess it is, but I’m very particular about how things should be in my PCs) is the continued inferiority KDE showed in laying out the icons and their captions on the desktop.  They are too widely spaced, which matters to me because I have a particular layout that won’t work without making the icons really tiny on the desktop if they’re not spaced tightly enough.

        In addition, the padding around the “cell” where the icon and the caption live doesn’t allow the text of that caption to use that padding for anything useful, so a caption that is a single long word like “Cinnamon” would be “Cinnam” and then on the next line “on”, even though there is plenty of room to put it all on one line if it would just use that huge space between it and the neighboring “cell.”

        KDE, for all of its customizability, has no easy way to tighten up the layout… no editing a dconf entry or two as in Cinnamon as I have in the past.  One has to go through the actual interpreted plasmoid code and change the parameters, and none of the canned answers about “how to” really sufficed for me.  They tightened up the icons, but the cells the captions had to fit within got even smaller, while that padding between the icons remained unusable.

        I could learn enough about the syntax of the code to try to figure it out, but… that’s a lot of effort to fix a problem that would be trivial in any other desktop environment.

        I decided to try the latest KDE Plasma version, 5.13.4, by pointing Kubuntu’s updater to the KDE Neon repos.  Neon started (as I understand) as repos that were intended to be used to modify Kubuntu (so that it could have the latest and greatest KDE Plasma versions), but now it is a distinct derivative of Ubuntu in its own right, and KDE devs don’t advise the repo switch trick now.  I knew I might mess it up, but it was an experimental installation anyway, so I went for it.

        To my surprise, the desktop layout issues with the icons (and their captions, more specifically) were completely fixed to my satisfaction. Unfortunately, the installation did get messed up in a rat’s nest of dependencies, which was made all the more likely by the difference in Ubuntu bases (Neon is still on Ubuntu 16.04 as its base).

        I decided to wipe that installation and try Neon itself.  It’s really a barebones, stripped down installation, and like Mint, it’s based on Ubuntu… so my familiarity with Mint meant that I knew how to find most of the packages I would need to get it up to speed.

        With this bit fixed, KDE is more appealing, and while there are still annoyances, I have to say that Xfce has its own frustrations too, and between the two, I am now kinda leaning KDE– but only in its 5.13 version.  The 5.12 release in Kubuntu had the bad desktop layout, and while I could probably locate and transplant just the plasmoid containing the fix, I elected to keep fixing up Neon.  It was kind of fun getting everything situated (and this is in stark contrast to how it would be with a non apt-based, Ubuntu-repositoried distro that I am not so familiar with.  It’s not that Ubuntu is necessarily better, but it’s what I know!).

        So far, so good with Neon.  Now I have to wait and see how the upgrade to Ubuntu 18.04 will go, eventually.  I still have Mint Xfce installed too, so I can go back if I change my mind, but so far, Neon looks really good.

        I wish Mint still supported KDE.  The Ubuntu update process is really the pits compared to Mint. I wonder if I can install the Mint updater in Neon…

         

        Dell XPS 13/9310, i5-1135G7/16GB, KDE Neon 6.2
        XPG Xenia 15, i7-9750H/32GB & GTX1660ti, Kubuntu 24.04
        Acer Swift Go 14, i5-1335U/16GB, Kubuntu 24.04 (and Win 11)

        1 user thanked author for this post.
        • #214688

          Okay, some more interesting results, if you’ve been following this long-winded narrative.

          I got KDE Neon to a good point where it felt good and seemed to have all of my toys installed (at least the ones I remember).  It came with the Intel i965 vaapi driver installed, so I went ahead and tried my looping “Big Buck Bunny” test again in VLC to make sure the battery performance I’d seen in Kubuntu was still there.

          It wasn’t.

          I let it play from 100% battery down to 90%, and according to the OS estimate of remaining battery time (which has proven accurate in the past), it was reporting a remaining battery run time of 4.5 hours, which would represent a total run time of only around 5 hours when the half hour that had already elapsed in the run from 100% to 90% was added.  The math worked too… 10 percent battery life went for 30 minutes, so ten times 10% (the entire battery) would be 300 minutes, or 5 hours.

          I had gotten nearly six hours in Xfce and a bit over six in Kubuntu, so this surprised me.  I checked over all the things I could think of… volume and mic were muted, brightness of the LCD backlight at 40%, TLP installed and working properly, bluetooth and wifi were on but idle, and the mouse was plugged in, with nothing else running in the background.  These were all the same as when I got 6+ hours in Kubuntu playing the same video.

          Either the 16.04 base of Neon wasn’t as good as 18.04 in battery life, or the Neon team had removed or changed something that was important to get the longer run time.  Or was it possible that I’d made a mistake before when I had recorded a 6+ hour run time? I didn’t think so, but I was now in doubt.

          I tried putting Kubuntu 18.04 back on to verify its performance (I had Neon backed up, so I could go back easily if I wanted to), and when I played the Big Buck Bunny video again, I noticed one thing right off the bat: The CPU utilization (shown by the little panel widget) was nearly zero.  It had been low in Neon, so I had thought that hardware acceleration was on and working as expected, but now that I had tried again with Kubuntu immediately after, the difference was very obvious.  The CPU utilization was much lower in Kubuntu than in Neon.

          So, having discovered this, I had a choice to make.  I wanted the good desktop layout of Neon, but the long battery run-time of Kubuntu. Which way I got there (adding the good battery life to Neon or adding the good desktop layout to Kubuntu) didn’t much matter, as long as I got to the goal.  So here I was on Kubuntu… Could I really do as I suggested before and transplant the bit of 5.13 code, the little plasmoid that controls the desktop icons from Neon to Kubuntu?

          Well, as I learned… yes!

          I simply copied /usr/share/plasma/plasmoids/org.kde.desktopcontainment/ (the entire directory, which contains a bunch of plasmoids) from the Neon installation to the Kubuntu one.  After a log out/in, the desktop looked just as I had hoped!

          I had not expected it to work.  It hadn’t at first, when I transplanted just the one plasmoid that apparently controls the icon spacing, according to some stuff I read while searching for an answer to this issue.  Copying just the one plasmoid resulted in nothing showing on the desktop at all… so I went back and tried copying the whole directory containing that file.  And it actually worked!
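
          (In concrete terms the copy amounts to something like this; the source path is a placeholder for wherever the Neon file tree happens to be reachable from the Kubuntu install.)

              # Copy the whole desktop-containment plasmoid directory across:
              sudo cp -a /mnt/neon/usr/share/plasma/plasmoids/org.kde.desktopcontainment \
                         /usr/share/plasma/plasmoids/
              # Then log out and back in so plasmashell picks up the change.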

          KDE still has the annoyance where it arrogantly refuses to allow the file manager to be started with root (admin) privileges, since the KDE devs have decided it’s a security risk, and have chosen to make that choice for all of us too.  Eventually, they will be adding functionality similar to how Windows does it, where you simply do what you need to do, like drag and drop the file you want moved, and if it is something that requires admin privs, it will prompt you for elevation at that moment for just that task.

          When implemented, it will be easier than ever to get admin stuff done, but we’re not there yet.  We’ve been using desktop Linux the old, “insecure” way for 20 years, and all of the other desktop environments still do it that way, but somehow it was so critical to fix this issue in KDE right now that they couldn’t even wait for the “better way” to be implemented before adding new code whose only purpose is to thwart users who try to use the file manager with admin privileges. For their own good, of course.

          The people who were concerned about this issue always had the option of not opening the file manager with root privileges– it’s not like this change had to be made to allow the elite users like those KDE devs to have their way on their own systems.  If you think running the file manager as root is insecure, you always had the option of not doing it!  The difference is that now, the rest of us no longer have the option.  It’s “I don’t do this because it’s not the best practice, and now, neither will you.”

          If I wanted to have unpleasant, functionality-destroying choices made for me “for my own good,” I would not have left Windows behind.

          At least there are plenty of workarounds.  Some distros have modded Dolphin, the KDE file manager, to again allow root access, and third parties are passing around fixes to accomplish the same thing (some security fix that turned out to be!).  It’s also possible to use other file managers while in root, like the excellent Krusader (which I would not have discovered if not for this).

           

          Dell XPS 13/9310, i5-1135G7/16GB, KDE Neon 6.2
          XPG Xenia 15, i7-9750H/32GB & GTX1660ti, Kubuntu 24.04
          Acer Swift Go 14, i5-1335U/16GB, Kubuntu 24.04 (and Win 11)

          1 user thanked author for this post.
    • #213608

      XFCE, KDE, Gnome are the official Manjaro editions. Cinnamon (& several others) are community editions.

      greynad
