Message from @SchloppyDoggo
Discord ID: 424766182210404352
and it ran nice and cool
1270 base
relative to stock card boost, you get just over 17% higher freq with 1425mhz
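Quick sanity check on that 17% figure, assuming the baseline is the reference GTX 980 boost clock of 1216 MHz (the chat never states which stock clock it's measured against):

```python
# Verify the "just over 17%" claim, assuming the reference GTX 980
# boost clock of 1216 MHz as the baseline (not stated in the chat).
stock_boost = 1216   # MHz, reference GTX 980 boost clock (assumed)
oc_boost = 1425      # MHz, the overclock from the chat

gain = (oc_boost - stock_boost) / stock_boost * 100
print(f"{gain:.1f}% over stock boost")  # → 17.2% over stock boost
```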
just leave it there and be done with it
and 3600 on mem
I will, just gotta figure out what voltage it's happy at
you're manually changing voltage for your GPU?
yeah, Precision XOC
If you're going over the highest auto-voltage, then you're degrading the card faster, it'll barely do shit for performance, and it also voids the warranty
what's the max voltage increase it allows?
don't do more than like 8% over max auto voltage
and find a stable spot there
I did like 50mV over lmao
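For context on how the "8% over max auto voltage" rule compares to a +50mV bump, a rough sketch assuming a max auto-voltage of about 1.2125 V (typical for Maxwell cards; the chat doesn't give the actual value):

```python
# Sketch: the "~8% over max auto-voltage" ceiling vs a +50 mV overclock,
# assuming a max auto-voltage of 1.2125 V (typical Maxwell value; assumed).
max_auto_v = 1.2125          # volts (assumed baseline)
ceiling = max_auto_v * 1.08  # the "don't go more than ~8% over" line
actual = max_auto_v + 0.050  # the "+50mV" overclock from the chat

print(f"8% ceiling: {ceiling:.3f} V")  # ceiling is roughly 1.31 V
print(f"+50 mV:     {actual:.3f} V")   # roughly 1.26 V, well inside the ceiling
```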
I don't care, gonna run this baby into the ground
@Polygon what card are u tryin to oc?
trying?
I already did, and billy's just being a mom
make and model
Gigabyte GTX 980 gaming, three fan version
ive never oc'd anything under 1060
didn't need help
just wanted others' scores
when will the fucking cryptocurrency rape end?
i remember when GTX 1*** series was cheap
and RX 400/500s were like the best deals out there
It's already gotten much better, but is still cancerous
@Wayne Quantum computing is the only way "Artificial Intelligence" will ever come to fruition.
Depends on their productivity. If we can make a quantum computer that allows us to create true AI, then what's to say we can't use conventional architecture at an enormous scale to achieve the same goal? Of course there would need to be improvements in many areas to match what quantum processors could do, but who's to say quantum computing is required for it? We already have some intense deep learning performance between GPUs, CPUs, and more intelligent software. At the rate of progress we're seeing, we may be able to create neural networks as complex as some multicellular organisms by the next decade, but that's just speculation, I can't personally back that up. Time will tell. Just remember, quantum processors can pack in more, down to atomic-scale transistors with quantum properties, which makes them less straightforward than binary processing, and it will likely prove difficult to take advantage of those properties, so the software side of quantum computing alone could easily set us back.
Also, the current CPU manufacturing process is as fine as 10nm, and the number of transistors packed into one confined area is incredible. So the benefit we'd get from quantum transistors, especially given our current progress, is debatable until it's actually observable.
hey nerds
gaming desktop or gaming laptop
why not both
poorfag
@GirlBeGood depends, are you in college?
because in all reality, get a chromebook for on the go, and invest most of your money into a nice ass desktop
unless you really need portability desktop is way better value
you're a degenerate...but I agree
@Wayne going over the highest auto-voltage doesn't necessarily kill the card fast enough for it to matter lmao
of course you need to know what you're doing