Message from @Tervy
Discord ID: 643726539967692810
idk how plex works exactly because I stopped using it long ago in favor of jellyfin
but the settings that matter are on the client
the first one is only used as a default for external (outside LAN) connections
the second one is usually not even used because you want to stream original files
IIRC transcoding only kicks in when, for whatever reason, the client device can't play the original format
_Which shouldn't ever really happen these days lel_
that's why i said the settings that matter are on the client
there's 9900 xeon versions on tao right now, the E-2278G and E-2288G, both turbo boost to 5GHz.
NOOO INTEL BROS
at what point do you cross from the threadripper environment into milan/epyc idgi
2S+ / higher than 32 core requirements
are modern games really sufficiently parallelized to see benefit from this many cores? what about desktop performance? last time I saw real benefit for desktop/development use was when I switched to SSDs. It's always nice to have files decompress faster, and intense encoding and the like go faster (but sometimes these things only run on one core), but it doesn't compare to the bang-for-buck I got from going from HDD to SSD...
games are not this heavily parallelized. but depending on what you do with your desktop, definitely.
these processors are targeted at 3d modelers (3ds max, blender), video editors (premiere, resolve, vegas) and developers whose workloads are sufficiently multithreaded to take advantage of the chip's pure horsepower
i can't speak for the other workloads, but premiere's newer releases take advantage of as many cores as you can feed it during video encode and render
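The point about encode/render workloads scaling with core count can be sketched like this. This is a hypothetical stand-in for an embarrassingly parallel encode pipeline, not Premiere's actual API; the function names and the busy-loop "work" are illustrative only.

```python
import os
import time
from concurrent.futures import ProcessPoolExecutor

def encode_chunk(n: int) -> int:
    """Stand-in for a CPU-bound encode of one chunk: pure busy arithmetic."""
    total = 0
    for i in range(n):
        total = (total + i * i) % 1_000_003
    return total

def run(workers: int, chunks: int = 16, work: int = 200_000) -> float:
    """Encode `chunks` independent chunks on `workers` processes, return wall time."""
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # chunks share no state, so they scale with cores (up to chunk count)
        list(pool.map(encode_chunk, [work] * chunks))
    return time.perf_counter() - start

if __name__ == "__main__":
    cores = os.cpu_count() or 1
    print(f"1 worker: {run(1):.2f}s, {cores} workers: {run(cores):.2f}s")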
Imagine a game engine that feeds on as many cores as it can get
<:woah:333623269674713098>
I am not so sure how much it might change the game
also you get huge benefits if you run any kind of virtual environment, especially in cases where all your home PCs get served and processed on a single homeserver (or a few)
and ofc CAD and fluid sims benefit a lot from the extra parallelization (and the beef behind it)
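Why fluid sims parallelize so well can be shown with a toy 1D heat-diffusion step: the grid splits into slabs (domain decomposition), and each slab only needs one "halo" cell from its neighbours, so each slab could run on its own core. This is a minimal illustrative sketch, not any real solver.

```python
def diffuse_step(u, alpha=0.1):
    """One explicit finite-difference diffusion step over the whole grid.
    Boundary cells are held fixed."""
    return [u[0]] + [
        u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
        for i in range(1, len(u) - 1)
    ] + [u[-1]]

def diffuse_step_decomposed(u, alpha=0.1, slabs=4):
    """Same step, computed slab by slab -- each slab is independent work."""
    n = len(u)
    out = [u[0]]
    bounds = [1 + (n - 2) * k // slabs for k in range(slabs + 1)]
    for lo, hi in zip(bounds, bounds[1:]):
        # each slab reads halo cells u[lo-1] and u[hi] but writes only lo..hi-1
        out.extend(
            u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
            for i in range(lo, hi)
        )
    out.append(u[-1])
    return out
```

Both functions compute the identical result; the decomposed version just makes the parallelism explicit, which is exactly what CAD/CFD codes exploit on many-core chips.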
Mini pcie for WiFi on x99 <:pepesmug:589101277879992339>
what is the pepe part
that's quite normal
especially on older shit like x99
100 grams
huh
those are always neat news
but i always wonder what they sacrifice in process
heat dissipation? lifetime? brightness? color gamut?
i mean the technology boosts brightness uniformity too
but does it tie it all to the 250 nits ?
tldr: intel's mitigation, which hits performance heavily in some cases, is flawed and the attack still kind of works
Not only full of these bugs but also more expensive than AMD
They're not playing their cards well
still waiting for AMD to get a big enough market share where security researchers will actually care enough to investigate them
Implying they don't already do it