gaming_tech
Discord ID: 356288574485954561
9,508 total messages. Viewing 250 per page.
Page 17/39
the max performance gain from OC that you'll get is 13%, when maxwell cards would get over 20% gain
blower style is good for SLI but not for single card use
i thought blower was worse
because the top would just blow hot air at the bottom
so the slight voltage headroom with FE isn't gonna make a major difference
ppl use blower + aftermarket combos in sli to blow out one card and aftermarket cool the other
which is pretty autistic
lol
trust me, just go for whatever card you like best, don't worry about power phases and shit, just cooler, looks, model and price
i like 1080 but im too poor for it right now
<:GWfroggyFacepalm:398569573865226240>
what is the point of buying a 165hz monitor to run it at 80hz
im prob not gonna oc because ive had bad experience with it
80fps*
i really dont understand this mindset
just buy a 60hz monitor
retard I explained this
there are actually 80 or 90hz monitors at 1440p
i think lg makes them
I bought the monitor early, ahead of upgrading my GPU in the future
gpus dont matter buy integrated graphics <:wesmart:359946049588166657>
and there are FEW versatility options for monitors
I got a 1080 for $550 and I'm real happy with it
how
there are no 90hz gsync IPS monitors...
craigslist?
Nope
Newegg
used?
so you really don't understand the market and are looking from such an ignorant perspective
or before cryptos
Evga gtx 1080 ftw
New
Before half the crypto
note: ftw is the model
lucky son of a bitch
Like August
i didnt have a job till december
and by then
bITcOIn was big
Rip
I got my watercooled MSI GTX 1070 Sea Hawk X
for $450 in december 2016
yall can piss off
Not bad
the watercooling is amazing, I can run synthetic load for 15 minutes and never hit over 61C and that's without the radiator fan working hard at all
like 60%
thats a really high temp for that cooler
brb gotta piss which involves me going upstairs
should be sub 50
especially with low TDP 1070s
you need to understand HVAC to know what to expect
you dont need fans or liquid cooling
If I'm drawing over 100W, I can measure how much heat is being output, and with math can determine BTU transfer relative to ambient
just blow very hard at the gpu
so given the paste and all is good, I know my cooler's limits
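The BTU math mentioned above is just a unit conversion: essentially all of a GPU's electrical draw ends up as heat, and 1 W equals about 3.412 BTU/hr. A minimal sketch (the 150 W figure is a made-up example, not anyone's measured draw):

```python
# Convert GPU power draw (watts) to heat output (BTU/hr).
# Nearly all electrical power a GPU draws is dissipated as heat.
WATTS_TO_BTU_PER_HR = 3.412  # 1 watt = 3.412 BTU/hr

def heat_output_btu_per_hr(power_watts: float) -> float:
    return power_watts * WATTS_TO_BTU_PER_HR

# Hypothetical example: a GPU drawing 150 W under load
print(heat_output_btu_per_hr(150))  # ~511.8 BTU/hr
```

That heat load, against the cooler's dissipation capacity at a given ambient temperature, is what sets the steady-state temps being argued about here.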
i know yall have experience blowing things
just buy a portable ac and hook up a hose to the case
and that 61C is synthetic load over long periods; in games, where the GPU is realistically stressed, I never get over 55C long term
brb
evga uses the same coolers, they shouldnt get above 50c. is your ambient temp really high?
theyre all made by asetek and the pump dies after 1-3 years
whos gonna use it for 3 years
my builds never had a part stay for longer than 2
my ambient was raised during those tests because I had two other GPU's in my PC
but it's completely irrelevant
it stays cool as shit for a GPU
way better than ACX
also, some games I can easily get over 165 fps, some only 80, so the advantage of that monitor is versatility. When I have lower framerate but higher than 60fps, I'm not limited to only 60. Also, gsync makes frametimes even, so my 100fps is visually superior to 140fps with no gsync. Even with games like CS:GO, I can see a DRASTIC difference with Gsync on, I'm not unnecessarily exaggerating. I used the monitor before I bought it because a friend had one.
I was so impressed with Gsync that I had to have it, that's why I got it
worth the extra $ by far, have no regrets
also I got the monitor over half a year after the GPU, and bought the monitor expecting to upgrade GPU
since the monitor was a relatively good price at the time and I wanted it, and I was playing a competitive game that heavily benefitted from high fps and gsync, I bought it earlier on partly for that as well
but I never run a 60fps average, so I am getting benefit either way, and honestly saying that getting a 165Hz monitor means you should be able to run that on everything is retarded. The disparity between 100fps and 165fps is noticeable with very fast scenery/fast mouse movement, but in games, even competitive ones, it's almost completely irrelevant. From my experience, an average of 90fps+ with a 99th percentile no lower than 60fps is not very far off in experience from 165hz with a 90fps 99th percentile. BUT this is from experience with Gsync on and extremely even frametimes, so maybe there's a slightly more exaggerated disparity without it, but you don't exactly have it to compare. If you don't have Gsync, you should go try a Gsync monitor like mine, you WILL notice it, and after going to it you will have no desire for a non-gsync monitor in future upgrades @Autistic Dog
also you said the 2080Ti will likely be "$1500", well I'm gonna say it won't be more than $850 MSRP, likely $799. That $1500 figure seems to be relative to the Titan V, since the past 980Ti and 1080Ti have been priced a little above 50% of the Titan cost. First off, the Titan V is not a gamer-directed card and won't reflect the performance we should expect of the 2080Ti; Nvidia has taken on the arguments and demand for Titan to be a developer/specialist oriented card like the Quadro and Tesla cards, and the Titan V is almost a consumer-practicality experiment. Titans from now on will likely perform close to or slightly better than the gaming flagship cards but have more tensor cores and such for certain applications that aren't for the average gaming consumer. It's very likely there will be another Titan release around the time the 2080Ti comes out, but it may keep the higher price/non-gamer orientation and won't set a relative price point for the 2080Ti itself. So, it's safe to say the 2080Ti will be around $800 MSRP at first launch.
So the speculation for $1500 is seemingly ignorant to the actual circumstances. Good day/night. @Autistic Dog
its not
That's just speculation on inflated prices because of mining.
There is supposedly huge innovation with the Titan V and Tensor cores for deep learning and other arithmetic.
Some of these benches show huge performance jumps over the Titan Xp, multiple times the performance in many scenarios. Just as a support for my argument. I think the 2080 will be $700 and the 2080Ti will be $800, maybe $850.
It's possible because of slower progression with making more complex microprocessors that it's as much as $900 but right now all we have is speculation.
i think mining will just shift to use the new hardware
can i sli gigabyte gtx 1070 windforce 2x with zotac founders edition
im probably not i just wanna know
yes
it will just run at the speeds of the slowest card
ok
I'm assuming that nvidia is going to change the contract details with AIB partners, something along the lines of cards not being X% higher than MSRP, $1500 prices out 90% of users, unless this thing is literally double the performance of the 1080, then I'm not expecting it to sell. No matter what, though, this is just speculation still
@Bearchoyboi you can SLI any card of the same model. For the most part, you can match the same stable frequency with almost any version of the same Pascal model. That said, given poor optimization and a complete lack of support in many games, I don't recommend SLI. The best thing is a single 1080Ti. If you are still within the return/exchange period with Microcenter, the single 1080Ti is likely cheaper than both cards collectively as well, and if you truly do ever need more performance it gives you way more headroom in the future.
So let's say new Volta cards come out and the 2080 (not Ti) beats the 1080Ti, but at the time you already have 1x1080Ti, then you can get a cheaper second 1080Ti because of people reselling or reseller price drop.
theyre going to be 1500$ either way unless something happens to stop mining
as I said, nvidia is likely going to renegotiate contracts with vendors when volta releases
^ and aside from that, there is potential that they will make cards specifically for mining, and limit the gaming cards' firmware for mining. It makes sense for them to do that because it gets them the crypto miner and gamer markets at the same time.
mining GPUs aren't practical
a large reason why people like them is because of the resale value afterwards, you get to mine thousands of dollars worth of bitcoin, then you sell them at maybe 20-30% under what you bought them for
actually mining gpus dont get as good a hashrate as regular gpus
so they arent even a profitable decision
best mining gpus right now are the 1070tis
makes sense
THE BIRD IS THE NEST
@.PunishedShlomo gibs stream link immediately
biz wont give it
my god hes even wearing "THE SHIRT"
memes
$1000 eoy confirmed
link is $.45 rn
perfect time for the proper investors to buy in
smart money like me got in at .16
hodl
ive been enjoying this dip - buyin moar
repubs like crypto
The 1070Ti is far from the best mining GPU, lol. For ETH, XMR, ETN, BTC, LCN and way more, the best singular GPU is the Vega 64. Especially for Cryptonight algorithm based coins.
And if these mining GPU's had drastic advantages for hashing algorithms, the resell would be lower but the demand would be extravagant.
so that argument saying mining GPU's aren't practical is wrong. Especially because companies that make ASIC miners don't have anywhere near the capability of making compact, highly efficient processors for specific functions. So if there were miner-designated cards, they could probably destroy current GPU's for hashing.
The only problem for all of us that are speculating, we don't know shit for details about what Nvidia actually intends on doing...
@Wayne you dont even have a mining rig faggot. the 1070ti is the most efficient when you put power cost into the equation. ie the biggest baddest gpus are not power efficient lil brainlet.
and if you look at the hashrates on mining gpus, (if you actually researched before you talked shit) youll quickly discover that non-mining gpus out clock them majorly
and asic mining is for dumbfags
not to mention that you dont even understand mining at all
asic miners can only mine sha-256 coins / etc. they cant mine coins that gpus can exclusively
with gpu mining you can mine literally almost EVERYTHING with asic you can only mine ASIC coins
never talk to me about mining again
roasted and toasted
look at me, I don't even have a pascal GPU, but hey, I know everything about the 1070 Ti!
the 1070 Ti is most definitely the most efficient, it has the same heatsink as the 1080, it's space efficient, and most importantly, high yields. I wish I had gone with a couple 1070 Ti's rather than my 6 1060's
the best 1070ti ive found is the zotac mini's - just 1 8pin instead of 2 and they clock better than others
probably shit for overclocking, but that's alright
who do you go through @SchloppyDoggo?
@3laa are they your videos?
I mine and have plenty of knowledge as to the most cost effective cards for different hashrates, lol. Don't try and debunk my argument and say I don't know what I'm talking about without numbers. @SchloppyDoggo Also, there are ASIC miners for other algorithms than just SHA256, so that statement is flat out wrong. Also, saying you found a more efficient version of the same GPU is retarded; the only thing that draws power is the GPU itself, and the particular model (evga ftw, MSI aero, etc) has no effect on power efficiency. If it had a higher factory base clock, you could adjust down to account for power and the voltage will automatically regulate back down. Also, a flashed Vega 64 can get 2000h/s for cryptonight. The 1070ti can't get more than 700. Even if the Vega is drawing 2x as much wattage it is way more cost effective, especially given relative market cost.
So don't shit talk me lol
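The cost-effectiveness claim above can be sanity-checked as hashes per watt. The hashrates are the ones quoted in the chat (2000 H/s vs 700 H/s on CryptoNight); the wall-power figures are rough assumptions for illustration, not measurements:

```python
# Hashes-per-watt comparison for CryptoNight.
def hashes_per_watt(hashrate: float, watts: float) -> float:
    return hashrate / watts

# Hashrates from the chat; 290 W / 180 W are assumed board-power figures.
vega64 = hashes_per_watt(2000, 290)
gtx1070ti = hashes_per_watt(700, 180)
print(f"Vega 64 (flashed): {vega64:.2f} H/s per watt")
print(f"GTX 1070 Ti:       {gtx1070ti:.2f} H/s per watt")
```

Under these assumed wattages the Vega still comes out ahead per watt even while drawing considerably more power, which is the point being made.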
you are one ignorant faggot
And note, if you did GPU mining, you'd know you get a mining performance advantage from overclocking the VRAM. If you have a Vega 64 with Samsung HBM, you've fuckin scored
dont ever @ me again. thanks.
Calling me an ignorant faggot but can't prove me wrong with numbers, can you @SchloppyDoggo ?
Thought not
>calls me ignorant >everything said by them is wrong
The only thing you said that was accurate was that GPU's are more versatile because they can mine most any algorithm. Congrats, you stated common knowledge for an entry level miner.
Please don't argue against me, thanks. I only argue with people who actually know about the subject and rather than get petty and try insulting the other person, has a respectable discussion.
Also don't make arguments in an open discussion that insult other people if you can't handle being argued against; it makes you look like a jerkwad. You don't have the balls to respond with an actual fact-based counter-argument because you don't have one.
Bitmain's ASICs are great, but they're hot, they're expensive, and they have no resale value
About 1 year ago, something like the Antminer S9 would've been a great investment, especially for the return you'd have by now.
If there was a really well priced cryptonight ASIC, I'd probably be on it asap.
you spend 1200-2000 dollars, depending on if it's a preorder, or if you buy it afterwards. They take 1200W of power, and they're good for about a year and a half, the S9 doesn't make enough money for it to be profitable anymore, in most cases
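A quick sketch of the power-cost side of that: the 1200 W draw is the figure quoted above, while the $0.12/kWh electricity rate is an assumption for illustration:

```python
# Rough monthly electricity cost for a miner drawing constant power.
# 1200 W is the S9 draw quoted in the chat; the $/kWh rate is assumed.
def monthly_power_cost(watts: float, usd_per_kwh: float = 0.12) -> float:
    kwh_per_month = watts / 1000 * 24 * 30
    return kwh_per_month * usd_per_kwh

print(round(monthly_power_cost(1200), 2))  # ~$103.68/month at $0.12/kWh
```

That monthly power bill is what the miner's revenue has to clear before the hardware even starts paying itself off.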
the difficulty of cryptocurrency mining has increased exponentially
Yeah it doesn't, but it used to not too long ago. Around new year, that's when there was a huge drop in mining profitability all around.
I *almost* bought the S9 7 months ago
I used to get as much as $11.50 a day with 1x1070 and 2x760's on ETN with cryptonight algorithm.
kinda glad I didn't, with how the market went
I got abouttttttt 6 bucks on my dual xeons and a 980
Get about 700h/s with 1070 and 350 for each 760
right now I get 50 cents a day per GTX 1060, of which I have 6
back when I first built it, I got about 1.25 per
Between my full rig, I'd probably get no more than 3.5 a day in ETN
That's 1575h/s on cryptonight for ETN
what miner do you use?
Actually I'd probably get less than that, that's why I quit mining it and gave up on mining.
I use a shell, custom set up.
Xmr-stak
I use nicehash, but it seems to take a bit of my money
ah
It's called xmr-stak because XMR is monero and it can function as a monero miner
But also for ETN
Same algo and I use a mining pool
Or technically used
I started a bit late and without great optimization/constant running for mining, so I only have 2400 ETN and that ain't worth jack.
Especially because of price drop. According to how the market reacts though, it could jump up and turn a pretty great profit. Especially if it goes back to around 20C
Then that's like $480
I'm hoping to hold and watch it, hoping for more adoption in the U.K. To influence a spike and that's when I'll cash out. They just got on more exchanges and have made some deals that could bring the currency back up again.
The best way to do it is find out how much you get a day on average each month, then calculate the cost of electricity, and divide the electricity cost by the amount of coins gained. That gives you the equivalent cost per coin relative to buying the coin on the market (more accurate if you average it over each day of the month)
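That per-coin break-even calculation can be sketched like this; the rig numbers (coins/day, wattage, electricity rate) are placeholders for illustration, not anyone's actual figures:

```python
# Effective cost per mined coin: electricity spent divided by coins earned.
def cost_per_coin(coins_per_day: float, rig_watts: float,
                  usd_per_kwh: float, days: int = 30) -> float:
    electricity_cost = rig_watts / 1000 * 24 * days * usd_per_kwh  # kWh * rate
    coins_mined = coins_per_day * days
    return electricity_cost / coins_mined

# Hypothetical: 80 coins/day on a 600 W rig at $0.12/kWh over 30 days
print(round(cost_per_coin(80, 600, 0.12), 4))
```

If the result is below the coin's market price, mining beats buying; otherwise you were better off just buying the coin.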
So I did all of the math and everything to know my position, but since it's a personal computer and I want it for personal needs rather than to mine all of the time, and the cost of GPU's was getting excessively high at the time I was considering maybe investing in rx580's/vega64's, I gave up on that idea because of shitty mining performance and shitty market on hardware. Which is bullshit, there's no way that the GPU's have this much demand anymore with such shitty fucking payoff for crypto mining. Seems like distributors are just fucking with us at this rate.
Or miners are just joining in a fad rather than doing math to find out how stupid it is to invest in crypto rn.
*role Minecraft
how does it work
*roleMinecraft
the fuck
@Matvey there is no minecraft role
why does the heading say so
Hwot
HWOT
Hi
fixed srry
oh
hm
tell him he's a fuck
I just bought SNES
Mini version
Fortnite has made its way to iOS
guys
what are the parts of a desktop where warranty is absolutely necessary
power supply
if that thing dies, and takes anything with it, they'll generally take care of it all
so don't cheap out on a PSU
wrong
that only goes for battery backups/surge protectors, PSU's usually don't get you that coverage, unless maybe you sue the manufacturer
uhhhhh
EVGA covers it
so
I have an EVGA SuperNova 850G2, still never heard of that happening with a PSU
never heard of it failing?
show me an article on it and I'll believe you
Or never heard of it being covered?
I've never heard of other damaged parts being covered
other than the PSU alone
I guess I must've misread
although, Corsair's flagship PSU does
the platinum efficiency or high wattage?
Prove it
I really don't care enough to
it's not like he's gonna buy the AX1500i
or the now 1600i
I don't even think they have that kind of coverage
tbh
YOOOOOOOOOOOOOO
somebody ask me some shit, I'll likely have the answer
hardware related I mean
not the fuckin powerhouse of the cell
i have a software question
after you download drivers is that it?
can you just leave it in the download folder
Yes. drivers usually get assigned a random name and are copied into sys32 after install
ok
for codecs and whatnot
otherwise some third party software needs to run to act as the driver.
just installed my motherboard drivers
i got a cd for it
motherboard drivers?
but i didnt get optical drive
yeah
usually that's just GPU oc software and benchmark crap
usually irrelevant crap
this is a motherboard driver right?
yeah so
i didnt have an optical drive and the only external one i have is this superdrive
and it only works when it's with a mac
blackwidow chroma?
cynosa chroma
relatable
cynosa?
new series?