2 HOURS LEFT
STREAM IS HERE:
youtube.com
U REDY, BOYZ & GURLZ? LISA SU IS IN DA HOUSE!
AMD CES 2019 OFFICIAL THREAD, PRE-SHOW EDITION
NOOOOOOOO
inb4 intlel damage control
just wait for sunny cope
In before disappointment.
LET THE SHITKIKE BUTTHURT FLOW
working on a deep fake of lisa su pegging intel. brb
>9am PT
Why doesn't anyone use UTC?
If they use UTC, I know what time it is; people around the world know their local time zone's relation to UTC. No one has heard of "PT"
Recent developments:
>Steve from Hardware Unboxed murders Nvidia with a 12 minute long video (no way he would burn the Nvidia bridge unless he had knowledge of Navi and what it will perform like)
youtube.com
1700Z
youtube.com
OW RATE /g/UI-Z, HOUSE IT GO IN
Nobody gives a shit, third-worlder. The event is in the greatest city in the greatest country in the world and you'll work to our time.
fucking this and FUCK daylight savings time
GMT+ 2 a shit
here's a countdown timer (livestream starts in a little under 2 hours)
timeanddate.com
>Shitty Cope
EU will prob kill dst in a couple years.
Quick, is this going to be a popcorn or chips&dip type of an event?
>the prophecy will come true
just put the fucking time on google you absolute dumbtard
Will the GPU prices be finally normal-ish?
Will Intel go bankrupt?
Will little Timmy fall down the well?
Tune in to CES 2019 to find out
I have prepared so many fucking screencaps from deluded AMDrones, it's going to be so fun to laugh at you fuckers in a couple of hours.
i like steve he is beautiful
>just google the time whenever there's an event
How about saving a couple billion Google lookups every year by using UTC as the standard?
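For what it's worth, the "9am PT" and "1700Z" given above do agree; a quick sketch of the conversion (the keynote date of 2019-01-09 is my assumption, not stated in the thread):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# 9:00 AM Pacific on the (assumed) keynote date, 2019-01-09
local = datetime(2019, 1, 9, 9, 0, tzinfo=ZoneInfo("America/Los_Angeles"))

# Convert to UTC; January is PST (UTC-8), so 09:00 PT -> 17:00 UTC
utc = local.astimezone(ZoneInfo("UTC"))
print(utc.strftime("%H%MZ"))  # -> 1700Z
```

Using an IANA zone name like `America/Los_Angeles` rather than a fixed offset also handles daylight saving automatically.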
SOPA
DE COBRA
right now? of course not, amd will offer competitive prices to current hardware (so nvidia-$50 for the same performance)
ask again in a year
>nvidia is so retarded they BTFO themselves after AMD gives them the high end GPU segment
How can they fuck everything up so fucking badly
>I've wasted time saving screen caps that I will just have to delete later today
damn, pack it in fellow amdrones we've been btfod
Intoddlers BTFO real soon now.
Give me ryzen 12c 24t 5ghz momma su
kek
Is Vega 2 going to make my 2080 obsolete?
>debiru
Every American has
I've heard that AMD has a final solution to Spectre in their upcoming CPUs, am I correct?
It's going to make your 1080 obsolete
BLEASE G-GET RID OF THIS A-ANTI-SEMITIC THREAD. B-BUY AMD IF YOU ARE R-RACIST!
What's wrong Phoenix? Eat your hambaagaa
Look at stock Vega 64. Added 20% to its performance.
That's 7nm Vega.
I'm also sure it's VII, as in 7, not Vega 2.
>nvidia -$50
I hope you mean the last line's pricing, because there's no way AMD is going as full retard as nvidia went with rtx
Why would you buy AMD? Do you have a problem with your reptilian satanic jewish baby-eating overlords?
No. Vega 2 will be all about shrinking dies and reducing power consumption/increasing performance per Watt.
debiru is engrish for devil
Will probably compete with the rtx 2070. That is even if it exists
What if Vega 2 is actually just a rebranded Navi?
At the Next Horizon event AMD only compared it at equal power to current Vega 64. They increased clocks up to 1800 MHz at the same 295 W. No reduction in power consumption there.
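Sanity-checking the numbers being thrown around here (+20% performance at the same 295 W board power; both figures are claims from this thread, not official specs), perf/W at fixed power scales directly with performance:

```python
# Thread's claimed 7nm Vega numbers: +20% perf vs. stock Vega 64,
# identical 295 W board power. Both are thread claims, not official specs.
base_perf, power_w = 1.00, 295.0   # stock Vega 64 as the baseline
new_perf = base_perf * 1.20        # the claimed 7nm uplift

base_ppw = base_perf / power_w
new_ppw = new_perf / power_w
print(f"perf/W gain: {new_ppw / base_ppw - 1:.0%}")  # -> 20%
```

In other words, at equal power the entire efficiency gain is just the performance gain; there's no separate power reduction hiding in those slides.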
It does neither, it's a dedicated HPC chip.
And the same goes for debile. The irony.
Maybe, it doesn't make sense, but amd gpu marketing for like the last 2 years doesn't either
everyone will be wrong
based
Nah, Navi is actual new arch iteration.
The shit posting will be amazing either way
Actually I don't recall any info on this, just rumors. What AMD specified on their slides was "next gen memory" which is obviously gddr6, that's about everything we have on Navi. Might as well be another GCN iteration.
They touted scalability, what that would be is anyone's guess.
>Might as well be another GCN iteration.
Everything will be another GCN iteration until they switch the ISA, which is never, RISC SIMT just werks.
Navi is GCN, user. It isn't new arch family.
>Might as well be another GCN iteration
That's exactly what it is. Anyone expecting a miracle is deluded.
P6 was a miracle and it was x86.
Intel numba 1
AMD numba 4
So should I ejaculate right as Lisa is walking out to the stage, or when she announces the 3700X?
Despite being called "Pentium Pro", it certainly wasn't another Pentium iteration, though.
Both. Do it for Lisa.
It was still x86 and executed code compiled for x86.
You don't ditch the ISA if it just works, Intel learned it the HARD way.
Right as she walks out, if the announcements are good enough you can work up for a 2nd.
And that was about all it had in common with the Pentium.
Honestly, CPUs should ditch 32 bit support and any other legacy feature from 2005 or earlier. Use the die space for something else, like more transistors or better front ends.
I dunno, both P5 and P6 were superscalar, had Int and FP units, had x86 decoders and a lot of other shit.
Legacy stuff costs nothing given the modern xtor counts.
>i have absolutely no conception of how little space the real mode circuitry takes up
Breaks things (every motherboard on Earth assumes that an x86 starts in real mode, just for starters), and makes enough room for another 16KB of L1 cache.
Forget it, user.
Why can't GCN scale past 64 compute units?
Do you mean Vega?
It can, they just have no reason to, they're already sitting at power limits.
"4 triangles per clock front end" seems to be the meme.
1 hour anons,
1 hour
my cock is fully erect.
i need release.
mommy su
give me release
GIVE PRAISE FOR AMADA!!!
That's unrelated to them just slapping more ALUs.
They just don't.
You better start edging for mommy, right now, don't you dare disappoint her
Somebody call in a wellness check for Tim.
She'll release all that cum from her pussy as Intel is done raping her.
People sure are excited for some computing units.
Prepare for disappointment AMDrones.
But I guess you are used to it at this point.
>/g/ - Technology
It seems like Vega 64 can't even feed all the ALUs
It costs in debugging and validation time, but his idea is retarded and CPUs aren't fast enough to do 32-bit in microcode, and likely won't be for another 10 years.
Inshills OUT
In games.
Area and power aren't free anyway.
>It costs in debugging and validation time
Yes but given modern design complexity, it's peanuts.
Can't wait until PooMD fanboys' dreams are crushed.
Checked, and also the temperature of the oven as Intelaviv goes in.
How the fuck does g stand for technology.
The Vega 64 already competes with that though
/g/entoo is technolo/g/y.
Brb making some popcorn, I want to be ready for when the AMD bois start the mass suicide.
Not my dreams. I've been saying right from the get-go that 30% IPC increase and 5GHz are unrealistic.
However, 10% IPC lift and 4.6GHz is enough for me to buy into the Ryzen party.
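Back-of-envelope check on that scenario (single-thread performance ∝ IPC × clock; the 4.3 GHz baseline, roughly a 2700X boost clock, is my assumption):

```python
# How much uplift does "+10% IPC at 4.6 GHz" actually buy over the
# previous gen? Baseline of 4.3 GHz (~2700X boost) is an assumption.
base_ipc, base_clk = 1.00, 4.3
new_ipc, new_clk = 1.10, 4.6

# Single-thread performance scales as IPC * clock
uplift = (new_ipc * new_clk) / (base_ipc * base_clk) - 1
print(f"~{uplift:.0%} single-thread uplift")  # -> ~18%
```

So even the "modest" scenario lands near +18% single-thread, which is a bigger gen-on-gen jump than Intel has shipped in years.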
What's gentoo?
a kernel
A meme to be ignored. Like Core i9.