Threads

What a comfy amount of threads. Let's see your threads, user. No shilling allowed.

Attached: 2799k.jpg (481x453, 81K)

Other urls found in this thread:

newegg.com/Product/Product.aspx?Item=9SIA4M54R44548
ebay.com/p/Intel-Core-2-Extreme-Qx9770-3-2-GHz-SLAWM-LGA-775-Prozessor-CPU-Quad-core/2233144679?iid=264311782064&rt=nc

these are the threads i have now but i want more. more threads. more power. more everything.

Attached: wantmorethreads.png (1346x1352, 100K)

Nice thread user.

I prefer real cores over SMT threads.

Attached: 2019-05-06 11_31_55.png (432x473, 37K)

s

I like big boy threads too. I need more threads.

Nothing I do needs more than 8, so I prefer 8 cores and a high clock speed over a massive number of cores at a 2-3GHz clock speed.

OP here, I hear you. 9900K at 5GHz. But I like all threads. Give me some Threadripper.

Attached: spooky.jpg (641x577, 177K)

Give me some 2500K threads, give me Phenom threads, give me an Athlon 64 single thread. I don't care.

Overclock that bad boy to 5GHz, user?

i mean, whatever gets you going I guess.

I want to see it user......

I had it OC'd for a short while at 5Ghz, but the extra idle power draw was a bit high and I wasn't really doing anything that NEEDED the performance.

I saved the OC profile in the BIOS though, so I can re-activate it whenever I feel I need it.

Whoever has the oldest thread wins.

You do know about adaptive voltage and overclocking, right? You can still hit 5GHz and idle at 800MHz if you set it right in the BIOS. Just set it to adaptive, or whatever it's called for your motherboard manufacturer.
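As an illustration of the idle/boost behavior being described, here's a minimal sketch that samples per-core clocks. It assumes a Linux box with the standard cpufreq sysfs interface; the Windows setups in this thread would use HWiNFO or similar instead.

```python
# Sample each core's current clock via the Linux cpufreq sysfs interface.
# Illustration only: assumes /sys/devices/system/cpu/cpuN/cpufreq exists;
# returns an empty list on systems without it.
import glob

def current_freqs_mhz():
    freqs = []
    paths = sorted(glob.glob(
        "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq"))
    for path in paths:
        with open(path) as f:
            freqs.append(int(f.read()) / 1000)  # sysfs reports kHz
    return freqs

# At idle with adaptive settings you'd expect mostly low clocks (~800MHz);
# under load, the configured boost clock.
print(current_freqs_mhz())
```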

I don't get it, why would you want more than one thread? People keep complaining about having one core at 100% usage while others do nothing, just have one core so you can use 100% of your cpu power always.

Well, a lot of programs nowadays use more than one thread. It's more of a luxury to have more than 4 these days. There's always something that will use them.

Yeah, it doesn't work. At least not anywhere near what you're saying.

It would idle at 800MHz and then shoot straight to 5GHz with literally ANY load; my idle power draw on average was easily 50W higher.

That's exactly what it's supposed to do, user. When you're sitting there reading a hentai post it will stay at 800MHz; that's normal. You play a game, it will stick to 5GHz. Personally I stick my all-core to 5GHz because I'm an idiot. But what you're describing is what it's supposed to do.

Nigger, the problem is just moving my mouse is enough for it to spike to 5ghz, it's retarded.

It fluctuates a lot. It's hard to get a good readout on wattage while it's bouncing around, but trust me, you're going to run a lot cooler and have the potential for more life in your computer, user.

It might fluctuate halfway but yeah, I know what you mean. It's still better than having it at 5GHz all-core constantly when you aren't doing anything.

Can't wait for Ampere.
RTX is too over-priced for features I don't need.

Attached: HWiNFO64_2019-05-06_17-57-16.png (1280x1154, 156K)

Are you just not reading what i'm saying?

I use a UPS that keeps track of my power draw over time, I can visually see the difference between stock settings and overclocked 5ghz with adaptive voltage.

It's really up to you and doesn't matter; it will still last you 20 years at a high all-core.

Yeah, but again, I do nothing that requires 5GHz anyway; I'm not CPU-bottlenecked in any game I play. It would just be a waste of energy and would create more heat under load.

Hell yeah man, hopefully it's not going to be just another filler gen. The RTX series was very bland to me.

2080 Ti is a good upgrade for me, but I'm not paying Quadro-tier prices for a card that can't do 30-bit (10-bit color) outside of DirectX.

Once you go 10-bit it's hard to look at the color grading that 8-bit graphic cards show you.

>Once you go 10-bit it's hard to look at the color grading that 8-bit graphic cards show you.
I disagree personally.

I have a 4k 60hz 10-bit display right next to a 27" 1440p 144hz 8-bit display, I have no issues using the 8-bit display.

Is 10-bit nice to have? Sure. But outside of niche use-cases it's hardly going to be a huge difference. Especially in any sort of high-movement use-case like gaming or videos.


If you're doing production work, then sure. But besides that? Meh.
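For scale, the step from 8-bit to 10-bit per channel works out like this:

```python
# Per-channel levels and total distinct colors for 8-bit vs 10-bit output.
for bits in (8, 10):
    levels = 2 ** bits      # shades per channel (R, G, B)
    total = levels ** 3     # distinct RGB colors
    print(f"{bits}-bit: {levels} levels/channel, {total:,} colors")
# 8-bit: 256 levels/channel, 16,777,216 colors
# 10-bit: 1024 levels/channel, 1,073,741,824 colors
```

Sixty-four times more colors, but as the post above says, mostly visible in smooth gradients and production work rather than fast-moving games.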

Well, believe it or not, you're benefiting a lot from having a 5GHz CPU. Games use 1-8 cores, and with that kind of clock speed you should notice it if you experiment with your chip. I've clocked mine all the way down to 2GHz and turned HT off, and yeah, huge difference in single-core games.

i9-7900X @ 4.8GHz

Attached: Schermafdruk van 2019-05-06 18-06-42.png (919x703, 184K)

It's nice to have, but 99% wouldn't notice any difference desu. I do hope they'll start releasing 10-bit soon. I can see the difference and would love for it to be mainstream.

Lmao, shut the fuck up moron.

It turbos to 4.6GHz all-core at stock settings. It's not a massive difference from the 5GHz all-core that an OC gives me, but it uses far less power and produces less heat on average, especially for normal desktop usage.

So you have a FirePro or Quadro?

If not, you're not using 10-bit outside of DirectX-based games.

You know this trash they call HDR? That's what 10-bit really is, but without the over-contrasted colors that ruin the image.

Damn nice dude. Did you delid? No offence lol.

>So you have a FirePro or Quadro?
Yup, cheap quadro just for that purpose.

I also have a gaming GPU for games though.

threads in a thread...

>x58 master race

Attached: count.jpg (985x849, 115K)

ye boi. rendering some CAD with ray tracing

Attached: threads.png (666x593, 38K)

Yes, delidded. Before the delid I couldn't get 4.6 to run stable, so I'm pretty happy I went with it.

>37 days of uptime
impressive with windows 10

Yes, I know this. After 4.5GHz, Coffee Lake needs an exponential amount of voltage to carry the clock speed. Just saying 5GHz does give higher framerates overall, especially without HT.

>Just saying 5ghz does give higher framerates overall
What do higher frame rates give me? I am already at 144fps+ in all the games I play.

Like I said, I have the OC profile saved if I actually think I need it, currently I do not.

>Yup, cheap quadro just for that purpose.
Just wanted to make sure.

Same here.
Bought a Quadro 4000 for $50.

Attached: s-l640.jpg (640x480, 27K)

Based Broadwell... Nice man.

Isn't Ampere going to be for compute GPUs just like Volta?

Shit dude, Got those maxed out. At least you know how to use it properly.

I got a similar deal, K420 for like $50-60.

Awesome. You're going to be good for a while.

Better future proofing? Shit i dunno. You might need that extra power in the future.

>You might need that extra power in the future.
>Like I said, I have the OC profile saved if I actually think I need it, currently I do not

HMMMMMMM

Attached: yeet.png (666x562, 40K)

Supposedly, I'm sure they'll have a cut variant of it.

Right on.

Nice threads bro.

I want that........

Where are the old school Windows ME threads at? Any P3s out there?

>K420
Thought the Kepler ones were much faster, but at least you have DX12; I only have DX11.

dx12 still isn't that great. You're fine for older games for sure.

Yeah, at least we have Vulkan.
But funnily enough, being limited to a certain DirectX version and the whole 10-bit vs 8-bit thing is interesting, because GeForce cards seem to have 10-bit in Linux, so it's an intentional limit on Windows that Nvidia made.

Is it a third party video driver thing or just how they made them for linux? Seems odd.

No idea desu.
I haven't used Linux since Red Hat back in 2005.

poorfag here

Attached: 2500k.png (571x579, 37K)

Fucking based user. How long you been rocking that?

because of side channel attacks right?

No, because they're actual cores instead of pretending to be one.

it's actually running at 3.45

Attached: q93.png (482x494, 23K)

He's probably a legitimate gamer. Who do you know that has ever been hacked through the HT exploit, or any exploit for that matter?

reboot your machine bro

Winner so far. How long?

Holy shit, Just realized 41 days. Yeah it needs a rebootin'.

I have recently switched to it; I used to run a Q9300 at 3.26GHz.

Is this legit? Holy crap.
newegg.com/Product/Product.aspx?Item=9SIA4M54R44548

2011

Woops, accidentally tagged you.

I paid $14 for my Q9450, soo...

How far do those overclock?

I run a tight winchad install. I don't need to reboot 'just because'. WTF.

Pretty far, but I can't say, since this one has the multiplier locked to 8 and when raising the FSB above 435 the memory starts to become really unstable. I have 800MHz sticks and it overclocks them to 870.

You usually get slowdown and memory leaks from it being on that long.

Is it possible to underclock the sticks and then overclock the FSB?

No, you can't adjust them manually; they both get overclocked with the FSB. If I had an unlocked Extreme QX-series CPU I could raise the multiplier instead, and that would kinda fix it.

Attached: juuu.png (657x690, 44K)
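The FSB math in the posts above works out like this. A sketch: the locked 8x multiplier, the 435MHz FSB ceiling, and the memory scaling with the FSB are from the posts; the stock 400MHz FSB / DDR2-800 pairing is an assumption.

```python
# Rough Core 2 FSB-overclocking arithmetic for a multiplier-locked chip.
# Assumption: memory runs in lockstep with the FSB (1:1 ratio), so
# raising the FSB overclocks the sticks too.

def core2_clocks(fsb_mhz, multiplier=8, stock_fsb=400, stock_mem=800):
    core_ghz = fsb_mhz * multiplier / 1000            # CPU core clock in GHz
    mem_mhz = stock_mem * (fsb_mhz / stock_fsb)       # effective memory speed
    return core_ghz, mem_mhz

core, mem = core2_clocks(435)
print(core, mem)  # 3.48 GHz core, 870.0 MHz memory, matching the posts
```

An unlocked QX multiplier would let the core clock rise while the FSB (and thus the memory) stays at stock, which is exactly the fix described above.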

ebay.com/p/Intel-Core-2-Extreme-Qx9770-3-2-GHz-SLAWM-LGA-775-Prozessor-CPU-Quad-core/2233144679?iid=264311782064&rt=nc

Cheapest one i found. I can't believe they're still so expensive....

Nice man. What kind of work are you doing currently?

Yeah, it's pretty fucking stupid that I had to pay even over $10 for a 2008 CPU.

Attached: 2019-05-06-105446_1837x228_scrot.png (1837x228, 265K)

All things considered, yeah, you got a good deal lol.

no load right now, maybe i'll post later when I have things running. probably not though

Threadripper by chance?

Sure.

Attached: 234.png (1250x1051, 59K)

Damn man, What do you do if you don't mind my asking?

Attached: 145.jpg (255x197, 9K)

Primarily Android and backend development. I also need to run a couple of Android emulators at the same time.

>7970
heh

currently hovering around 2900 threads

Attached: file.png (963x709, 84K)

They're logical processors, not "threads."

It's 7950 actually. I will upgrade it, eventually. There is just no need right now.

Attached: muh_threads.png (1887x738, 171K)

been using this PC since 2009

Attached: threads.png (821x593, 45K)

Attached: muh_GPU.png (890x103, 17K)

My cores are fewer, but they are more robust, and turgid with sheer throughput. I laugh at corelets who think their 16 meager cores do anything but prove that their beloved savior's entire business model is built around not even bothering to compete in single-thread performance.

Indeed, I have no competition with Threadripper, because anyone looking for superior single-thread performance simply would never buy an AMD chip.