tl;dr: looking for an energy-efficient desktop that can still game

I live off-grid, so every watt counts. I have a gaming desktop with an i7-2600 and a GTX 980 that draws 100-120 watts at idle/browsing the internet. I also have a laptop that draws ~10-15 watts at idle/browsing the internet.

Question: is there a desktop mobo/CPU that has the power efficiency of a laptop but also an x16 PCIe slot to house my 980? Has anyone else dealt with this problem?

I've read a few articles on the subject that seem to think 60-80 watts is somehow acceptable for a computer to use at idle. Is that really the best desktops can do?

Attached: RVElectricalProblems7.jpg (256x220, 8K)

Other urls found in this thread:

tomshardware.com/news/ryzen-2400ge-2200ge-apus-leaked,36497.html
downloadcenter.intel.com/en/download/24075/Intel-Extreme-Tuning-Utility-Intel-XTU-
amd.com/en/technologies/ryzen-master
techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_PCI_Express_Scaling/24.html

You could use one of the new Ryzen 2400GE low-power chips.
You could also try to undervolt and underclock any modern CPU, and replace that power-wasting 980 ASAP when Nvidia's 11 series launches.
You could also use Intel's/AMD's tools to run a heavily downclocked profile normally and add a second, higher-clocking profile for your gayming shit (I don't think anybody would be stupid enough to try to use the GPU for compute if he's off the grid).
tomshardware.com/news/ryzen-2400ge-2200ge-apus-leaked,36497.html
Last but not least, you could use a laptop with an external GPU enclosure.

What the first post said, plus calculate how much power you're going to use so you can match the peak of your PSU's efficiency curve.

Also get an RPI or a cheap laptop for idle shitposting.

Attached: 80.jpg (500x354, 56K)
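Roughly, the math being pointed at: draw at the wall (or off the battery/inverter) is the DC load divided by PSU efficiency. A minimal sketch in Python, with the efficiency figures assumed purely for illustration; read the real ones off your unit's 80 Plus curve.

# Wall draw = DC load / PSU efficiency.
# Efficiency values below are assumed for illustration, not measurements.
def wall_draw(dc_load_w, efficiency):
    """Power pulled from the wall for a given DC load at a given efficiency."""
    return dc_load_w / efficiency

# Example: a ~110W idle desktop at 75% efficiency (light load, far from the peak)
# vs. 90% efficiency (near the middle of the curve).
print(round(wall_draw(110, 0.75)))  # ~147 W from the wall
print(round(wall_draw(110, 0.90)))  # ~122 W from the wall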

Good point user, completely forgot about the PSU.

Unironically, if your goal is maximum performance per watt, just go with the highest-end Intel NUC and undervolt it. Go with low-power SODIMMs.

You are talking about the new Intel NUCs with AMD Vega inside, aren't you?

Yep. Their GPUs sit pretty firmly between an RX 560 and 570 (4GB editions) when overclocked with the stock cooler, while only drawing ~100-150W total (including the CPU). It's not 980-level performance, but it's also 130 watts less than the 980 alone for the whole package.

Note: the benchmarks are dumb. They max out VRAM usage, which thrashes main memory, making them effectively useless. If you cap VRAM usage at 4GB in any given game, it'll hit damn good 1080p performance.

You forgot to replace the default PSU with a Titanium one; you can probably find a server-intended model that'll be an OK size. You might have to splice the wires a bit, I don't know how NUCs are constructed. Probably saves 10% off the total power use.
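Ballpark of that 10% figure, assuming ~80% efficiency for a mediocre stock supply at light load versus the ~90% the 80 Plus Titanium spec requires at 10% load; the load itself is a made-up example.

# Assumed efficiencies: ~80% stock at light load vs. ~90% Titanium at 10% load.
load_w = 100                              # hypothetical DC load
stock_wall = load_w / 0.80                # ~125 W at the wall
titanium_wall = load_w / 0.90             # ~111 W at the wall
print(round(stock_wall - titanium_wall))  # ~14 W saved, roughly 10% of the total draw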

The 980 draws 15 watts at idle. It's fine. I think my problem is more the mobo/CPU. I tried underclocking it, but the savings were within the margin of error. I wonder if the newer chips would actually see a decrease.

I don't use a standard PSU, and I agree with the thought that having a desktop/laptop combo is going to be the best solution, even if it's inconvenient.

Does undervolting the CPU save that much?

>Does undervolting the CPU save that much?
It depends. If your CPU is the biggest power hog, it will probably have a bigger impact. But if you want power that low, you'd need to go below a GHz at something like 0.7V, which is what my i5-8250U does, for example.
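For intuition on why the voltage matters more than the clock: dynamic CPU power scales roughly with frequency times voltage squared. A minimal sketch with assumed example values, not measurements from any particular chip.

# Dynamic power scales roughly as P ~ f * V^2 (the capacitance term cancels in the ratio).
# The frequencies and voltages below are assumptions for illustration.
def relative_power(f_ghz, volts, f0_ghz, v0):
    return (f_ghz * volts ** 2) / (f0_ghz * v0 ** 2)

# e.g. dropping from 3.4 GHz at 1.2V down to 0.8 GHz at 0.7V
print(relative_power(0.8, 0.7, 3.4, 1.2))  # ~0.08, i.e. roughly 8% of the original dynamic power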

Can the newer stuff throttle voltage based on demand or is it still a reboot to change?

Variable voltage according to demand has been a thing for over 10 years, hasn't it? Stuff like SpeedStep.

But for variable voltage while overclocking/underclocking, that's more recent; it's only been readily accessible for the last two years or so, since Skylake came out?

downloadcenter.intel.com/en/download/24075/Intel-Extreme-Tuning-Utility-Intel-XTU-

amd.com/en/technologies/ryzen-master

>Is there a desktop mobo/CPU that has the power efficiency of a laptop but also an x16 PCIe slot to house my 980?
Not laptop-tier efficiency, since it's like 40W under load, but the new 2700 is pretty efficient.

Attached: efficiency-multithread[1].png (500x1130, 55K)

Find a decently specced laptop with a good battery, and buy multiple batteries. Boom. Take out the charger while gaming or whatever, and if the battery dies, pop in an already-charged backup.

That does look like a good CPU. Beats even the i8 in perf/watt.

I don't think you understand the question.

Attached: taskenergyscatter.png (620x288, 7K)

Raising the power draw to get a 2% increase in efficiency is still consuming more power, you dolt.

>gaming off the grid
What is your energy collection setup? Solar panels into lead-acids?

A GTX 1050 Ti external GPU into a modern laptop; full load will be about 110W with both the CPU and GPU on a torture test. You won't get the same class of CPU performance as the Hades Canyon NUC, but it'll be way more power efficient and you won't need to worry about the screen.

>That does look like a good CPU. Beats even the i8 in perf/watt.
2700 isn't on that chart btw. The 2700X is less efficient than the 2700 but it is a bit faster.

Nobody said anything about raising power consumption. You can get an oversized power supply to ride the efficiency peak: for example, if your system needs 300W, get a 600W PSU. You'll only be loading it to half of its rated capacity, which is where efficiency peaks, so the draw at the wall stays closer to the 300W the system actually needs.

Attached: aho.png (543x435, 128K)

600 watts of non-tracking solar that averages 200-300 watts when sunny, and a 600-watt wind turbine that averages probably around 150-200ish, going into four 4D and a couple of odd batteries adding up to around a kAh.

It's not the gaming drawing 200+ watts that is killing me, it's the 100+ watt idle when I'm watching cat videos and shitposting. If that were 40 watts my life would be good.
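Back-of-the-envelope on what that idle gap costs per day; the hours are an assumption, the wattages are the ones quoted above.

# Daily energy cost of idle draw. browse_hours is assumed; wattages are from the thread.
browse_hours = 8                 # assumed hours of idle/browsing per day
current_wh = 110 * browse_hours  # ~880 Wh/day at ~110 W idle
target_wh = 40 * browse_hours    # ~320 Wh/day at the hoped-for 40 W
print(current_wh - target_wh)    # 560 Wh/day saved, i.e. a couple of sunny hours of the solar array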

Closest I could find comparing perf/watt. Seems AMD is catching up with Intel. That's nice to see.

I'm going to need a source on that. I don't think that's true: the 2700X is binned better, so at the same frequency/voltage the 2700X should be more efficient. The 2700X only appears less efficient on paper because XFR2 tends to boost it much higher (2-core boost up to 4.3GHz), while the efficiency sweet spot is around 3.7GHz, per The Stilt if I recall correctly.

>I'm going to need a source on that.
Literally every benchmark shows it drawing 100W while getting like 20% more performance.

This is a good thread

... Are you daft? If you're worried about power efficiency that much, you'll be willing to undervolt the processor. You will very likely be able to undervolt the 2700X further while maintaining the same frequency as the 2700 because it is better binned; therefore, the 2700X will be more efficient, you neanderthal.

Well, the laptop alone will draw 40W at the absolute maximum; with some power management, an SSD, and the brightness turned down a bit, you'll probably be able to get to around 15W if you unplug the GPU.

Do you know anything about those external graphics card adapters? How badly do they impact the card's performance? There's no way a laptop can export a whole x16 bus of throughput, right?

It depends on how much you're willing to spend. The poverty versions will use mPCIe (x1 PCIe 2.0 in modern laptops), ExpressCard (x1 PCIe 1.0, i.e. 250MB/s, if your laptop is only EC 1.0, though theoretically EC 2.0 can deliver up to PCIe 1.0 at x4 or PCIe 2.0 at x1), or NGFF/M.2, which should be rated for x4 at PCIe 3.0. The latter option is the best if you're destitute: M.2 offers a peak of ~4GB/s, and you can take a look at the TPU benchmark here:

techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_PCI_Express_Scaling/24.html
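For a rough sense of the gap between those interfaces, here are the theoretical one-direction link rates (per-lane figures after encoding overhead; real eGPU throughput is lower, and the Thunderbolt entry is nominal since the tunnel shares the link with DisplayPort).

# Approximate per-lane PCIe bandwidth in GB/s after encoding overhead.
per_lane_gbs = {"PCIe 1.0": 0.25, "PCIe 2.0": 0.5, "PCIe 3.0": 0.985}

links = {
    "ExpressCard (PCIe 1.0 x1)": ("PCIe 1.0", 1),
    "mPCIe (PCIe 2.0 x1)": ("PCIe 2.0", 1),
    "M.2 (PCIe 3.0 x4)": ("PCIe 3.0", 4),
    "Thunderbolt 3 (PCIe 3.0 x4, nominal)": ("PCIe 3.0", 4),
    "Desktop slot (PCIe 3.0 x16)": ("PCIe 3.0", 16),
}

for name, (gen, lanes) in links.items():
    print(f"{name}: ~{per_lane_gbs[gen] * lanes:.2f} GB/s")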

I think using a mainstream GPU and a relatively weak CPU would probably bottleneck on the CPU before starving the GPU of bus bandwidth in most games, especially since most of these laptops will be near 1080p. The main disadvantages of the eGPU setup here are that you have to leave the bottom plate of your laptop off at all times (assuming you can even access the M.2 slot conveniently), you lose that M.2 slot, and in order to play things on the internal screen you have to use Optimus (Nvidia only, and it probably won't play nice with AMD's drivers if you're using a Ryzen with their IGP).

The other option is Thunderbolt. TB is only really offered on Intel CPUs; although you can get USB-C eGPU enclosures, you can only really get full eGPU bandwidth on Intel laptops that support TB, since that USB-C port is carrying TB3. I am also not sure how the internal display really works here; I've only read about these boxes. I think there's some fuckery with display emulation and headless display adapters to accomplish it, although it might just be doable with hybrid graphics anyway. I think if you use XConnect (AMD's eGPU technology), it can be done painlessly using their application.