Message from @Autistic Dog
Discord ID: 423199648748929045
Yea he's right u don't need it unless ur case is a baby
Buy a 1080 bundle on Massdrop or something
Prices aren't complete ass
just because you can run the cards at 80c with the fans on max doesnt mean its optimal
and single cards arent gonna run games maxed out in 4k or 144hz+
even the fastest cards on the market
@Bearchoyboi there should be no reason to water cool those 1070s unless you are hardcore clocking them for mining. running them at 75c will not hurt them.
and the king of autism is right about 4k - even a 1080ti will struggle with it
two 1080tis could handle it tho
75c wont hurt them but that increases heat for the whole system
heat and noise are a big problemo
sli in particular
Here's the thing. Maxed-out 4K above 60fps isn't worth striving for until after Volta imo. The best thing you can do is run a 1440p (2K) monitor at 120Hz+ with Gsync. My monitor (Acer XB271HU) is 2K 165Hz IPS Gsync @27", and once you go to the higher refresh rate (especially with Gsync, the difference is dramatic), you'll never want to go back, and 4K likely wouldn't be as enjoyable. You can also run 2K comfortably above a 60fps average with a single 1070, so with a 1080Ti or a newer Volta card it'll be a while before you need an upgrade. That's the best route imo, and the people I've discussed this with agree. Also, an ACX 2.0/3.0 cooler card nowadays won't likely run the fan past 60% under heavy load and will still keep the card under 75C, no biggie. As long as you have intake fans in the front and exhausts on the top/back (or a blower cooler exhausting out the back), you should get most of that excess heat out of the case instead of trapped in it. 1070 SLI, again, is less practical than a standalone 1080Ti: it costs more for less performance in most games, and many games lack SLI optimization entirely. If the game isn't optimized for SLI, you can only use one card.
I have a 1070 and I easily run games at very high 60fps at 4K 🤷🏻‍♂️🤷🏻‍♂️
Except for rainbow six siege, gotta use TAA at half res there
But 4K 60fps is much less enjoyable than 2K 120fps (especially with Gsync). At least that's what a couple of forum polls and the people I've talked to said, and from personal experience it's true. Even with Gsync, 60fps is a noticeable drop, especially for competitive games. For example, Dark Souls 3 has an engine cap of 60fps. I enjoy the gameplay, but the jump from 60fps to 120+ matters a lot during movement, especially once you're used to it. Even with Gsync evening out the frametimes, with no stutter or tearing whatsoever, I could still see choppiness during movement, even movement that wasn't drastic, purely because of the frame rate. Higher frame rates matter for precision in games like that, and all those filler frames really do get more information to your brain. Makes competitive, action and fast-paced gaming much better. So although 4K looks a little better, the framerate versatility and performance headroom of 2K is truly "optimal", especially if you have the budget for it.
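Fwiw the 60 vs 120fps gap is easy to put in numbers: the time between frames is just 1000/fps. A quick Python sketch (plain arithmetic, not from any real benchmark):

```python
# Time between frames (in milliseconds) at a given frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# Going from 60 to 120fps halves the wait between frames,
# which is the "filler frames" effect described above.
print(f"{frame_time_ms(60):.2f} ms")   # 16.67 ms
print(f"{frame_time_ms(120):.2f} ms")  # 8.33 ms
print(f"{frame_time_ms(165):.2f} ms")  # 6.06 ms
```

So each frame at 120fps arrives a full 8ms sooner than at 60fps, which is the gap you notice in fast movement.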
And Rainbow 6 isn't very well optimized tbh, a little unfair as a comparison. Better to use something like Battlefield, Doom, a newer COD game, Crysis 3 (still relevant), PUBG, GTA V, Metro: Last Light, Tomb Raider, and some others I'm not gonna waste time naming.
4K is really only practical for movies and laid-back games. If you're the typical gamer, the faster 2K is the best experience. Sorry if I'm getting repetitive.
nice meme but 4k will never be a thing on a single card
When the 2080Ti comes out, it should be more than sufficient for maxed 4K gaming on current games at 60fps (assuming at least a 35% performance advantage over the 1080Ti, which it almost certainly will have), but even then, 2K 120Hz+ with Gsync is still the best way to go for gaming. Games will get more demanding, but also better optimized with new engines and APIs. So as games get more demanding and better optimized, you'll still have the option to play at a sweet spot based on your needs. That's why I still favor 2K 120Hz+ Gsync panels for versatility and long-term enjoyment.
I'd be willing to say relative to the 1080Ti, the 2080Ti would be in many cases even overpowered for my 165hz Gsync 1440p panel.
it wont be sufficient idk why ppl keep repeating that
ppl have been saying that _____ card is overkill for 1080p since 6 series gtx
gl running high refresh rates on ultra even today
Uhm, I think I'd know what my limitations are, since I have Metro: Last Light, GTA V, Battlefield 4 and Tomb Raider. If I downscale to 1080p and run those games on max settings (with optimal AA for the best picture without getting excessive), I run comfortably over 100fps consistently... and that's just with a GTX 1070; the 1080Ti is over 60% faster. There's no excuse to say that 1080p nowadays isn't demanding relative to the higher-end cards we have. Also, even though 2K is a 78% jump in resolution over 1080p, the performance loss doesn't scale linearly, so you don't usually lose more than 40% performance — though AA takes a slightly more drastic hit (not that you need more than 4x MSAA or 2x SSAA on a good 2K monitor anyway). So the 1080Ti is more sufficient for 2K than the 1070 is for 1080p, and that is more than "sufficient" for high-refresh-rate 1440p. Given the likely 40%+ performance jump of the 2080Ti over the 1080Ti (possibly as much as 60%, based on history and architectural expectations), 2K @165Hz with Gsync should run without a hitch on the upcoming flagship. Yes, on a single GPU; from experience, SLI has way too many downsides. Lots of games lack support entirely, meaning the second card sits unused, and optimization is poor in most games that do support it. Cost-to-performance versus the flagship card also isn't worth it: 2x 1070s in SLI on an optimized game will run barely better (if at all) than a single 1080Ti. This is from experience and from benchmarks you can look up yourself.
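Quick sanity check on that 78% number — it's just pixel counts (a Python sketch; the ~40% performance-loss figure above is the poster's estimate, not something computed here):

```python
# Pixel counts for the two resolutions being compared.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels per frame
pixels_1440p = 2560 * 1440   # 3,686,400 pixels per frame

# Relative increase in pixels the GPU must render per frame.
jump = pixels_1440p / pixels_1080p - 1
print(f"{jump:.0%}")  # 78%
```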
I know what I'm talking about.
Also the 600 series GTX graphics cards had predominantly 720p users, I'm talking about arguments based on numbers and speculation, not assumptions and dreams.
which magical system are you using
1080ti cant even run witcher 3 at 1080p with hairworks enabled above 100hz
2k is a nice meme for ppl who have never had high res monitors
1080p has half the latency and can run a *consistent* 144hz
There is no latency difference with 2K; response latency (such as GtG) depends on the panel type, such as IPS, VA or TN
This proves you don't know what you're talking about
there is def a latency
look up tests
the bigger the screen + res the more latency
And my Gsync 165Hz monitor has the same latency for frametimes as a 165Hz Gsync 1080p monitor at the same framerate.
youre going by the latency on the box