GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Example: the old Intel stock CPU coolers had a solid copper core, then they went to a copper/aluminum hybrid, and finally an all-aluminum core... even as the price went up.

This was one of the reasons AMD was so compelling with the Ryzen 1000 series. It only roughly matched the Intel offerings of the time (better in multi-thread, worse in single-thread), but the stock coolers they came with were genuinely good. The Wraith Max / Wraith Prism could even handle some fairly significant overclocking on a suitable board.

Ideally you'd go for something like the 3080 with 10GB, but that can cost over a thousand right now thanks to miners.

More like a grand and a half. And the new RTX 3080 Ti is knocking on two grand.
 
It's not like my active Scythe cooler is that loud to begin with; my keyboard with brown switches is louder.
Which Scythe cooler do you have? I'm currently running the Ninja 5 and it more than adequately keeps my 3800X cool even with high ambient temps. My HDDs are louder than the cooler, it's hilarious.
 
More like a grand and a half. And the new RTX 3080 Ti is knocking on two grand.
What about the RTX 3080?


And thinking of it in terms of upgrading from an Nvidia GeForce GTX 770, I'm guessing that's a significant boost in performance, especially when working in a program such as After Effects?

Thank you for contributing to the conversation.
 
What about the RTX 3080?


And thinking of it in terms of upgrading from an Nvidia GeForce GTX 770, I'm guessing that's a significant boost in performance, especially when working in a program such as After Effects?

Thank you for contributing to the conversation.

Coming from a GTX 770, that's beyond "significant boost" and into space-rocket-vs-paper-plane territory. The GTX 770 is what, eight years old? Nine? It has fucking DVI ports! (That's a really old thing.)

But note that your link says "Out of Stock". If you can find a 3080 for the price on that page, grab it, but it would be a small miracle if you do. And watch out for scams if you resort to eBay for one. I hate having to tell you all this, but you're about to see why everyone who wanted a graphics card in the past however many months has come to hate crypto miners. The AMD 6xxx series launched what, four months back? And I've never seen one in real life, nor do I know anyone who got one.
 
What about the RTX 3080?


And thinking of it in terms of upgrading from an Nvidia GeForce GTX 770, I'm guessing that's a significant boost in performance, especially when working in a program such as After Effects?

Thank you for contributing to the conversation.
The jump will be absolutely huge for After Effects, and it would still be a massive leap if he got a last-gen 5700 XT or a last-last-gen GTX 1080. Heavy CUDA and OpenCL workloads really benefit from more VRAM, and he's got almost none.
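If you want to see the squeeze for yourself, here's a minimal sketch (CUDA, so it assumes an Nvidia card with the CUDA toolkit installed; this is just an illustration, not anything After Effects itself does) that asks the runtime how much VRAM is actually free:

```cpp
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t free_b = 0, total_b = 0;
    cudaSetDevice(0);                   // first GPU in the system
    cudaMemGetInfo(&free_b, &total_b);  // free vs. total device memory, in bytes
    printf("VRAM: %.2f GiB free of %.2f GiB total\n",
           free_b / (1024.0 * 1024 * 1024),
           total_b / (1024.0 * 1024 * 1024));
    return 0;
}
```

On a 2GB GTX 770 the total line alone tells the story: a single 4K frame in 16-bit-per-channel RGBA is already about 66MB (3840 x 2160 x 4 channels x 2 bytes), before the app caches anything at all.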
 
AMD has somewhat better drivers for Linux, so I guess that's something.

Not somewhat, hand-over-fist superior. Nvidia's proprietary drivers barely work on Linux and can't advance because they're closed source. There's nouveau, which is open source, but with no technical documentation for any of the GPUs it can't go very far either.
 
Coming from a GTX 770, that's beyond "significant boost" and into space-rocket-vs-paper-plane territory. The GTX 770 is what, eight years old? Nine? It has fucking DVI ports! (That's a really old thing.)
The 770 is still a decent card. It might not play all the new games at the highest settings and resolutions, but if people are fine dropping to 720p low, or running older games at more modest settings, it'd serve perfectly well.
But yeah, the 3080 would blow that thing out of the water by a couple hundred miles and then some.
 
The 770 is still a decent card. It might not play all the new games at the highest settings and resolutions, but if people are fine dropping to 720p low, or running older games at more modest settings, it'd serve perfectly well.
But yeah, the 3080 would blow that thing out of the water by a couple hundred miles and then some.

Agreed. Wasn't meaning to write off 720p gaming - honestly, I'm only really interested in cheap strategy games when I can even find the time to game, so it's all good with me. Just highlighting what nearly a decade of intense research and development has achieved compared to back then.

Anyway, somewhat more helpfully to the OP, I'd say a 1080 would be sufficient. Not nearly as nice as a 3080 if you could get one at a reasonable price, but you can't. Not that getting hold of a 1080 is easy or cheap either, but if it's all that can be had, it's still a good card.
 
Coming from a GTX 770, that's beyond "significant boost" and into space-rocket-vs-paper-plane territory. The GTX 770 is what, eight years old? Nine? It has fucking DVI ports! (That's a really old thing.)
You know that if you use a CRT, you are going to have an awesome experience.

The main reasons we left the CRT field behind:

1. Weight.
2. Fucking Greed.

As stated before, I had access to a refurbishment company, so I know exactly why video monitor technology went the way it did.


Let me give you a hint. It was not out of the goodness of their heart.

So if you want to screw around a bit and go retro, keep your card, build a Win2000/XP machine, and have fun.

Or just say fuck it and do what you want.
 
The main reasons we left the CRT field behind:

1. Weight.
2. Fucking Greed.
I would add power consumption and regulations to that. Look at plasma, it went the way of incandescent lightbulbs.

SED was the true successor to CRT and it looked really spectacular (never seen one, just read the specs), but it was dropped when the other pipe dream, OLED, looked like it could finally happen.
 
I would add power consumption and regulations to that. Look at plasma, it went the way of incandescent lightbulbs.

SED was the true successor to CRT and it looked really spectacular (never seen one, just read the specs), but it was dropped when the other pipe dream, OLED, looked like it could finally happen.
Heh. The CRT hybrid was great, but around 2007 the bean counters got together and decided the following:
That 1920x1080 was going to be the de facto size. The excuse was, quote, "for best entertainment value". Everything at a higher resolution was considered commercial/business/artistic.

By doing this they increased the prices of everything above 1080p by 50 to 100%.

This meant they were going to sell shit, literal poor-quality shit, at whatever prices they could get, because marketing said so. From late 2008 to 2014 most monitors had a life expectancy of 2 to 3 years and the market was stuck at 1080p @ 60Hz. Want anything outside that spec and you got your ass handed to you.

Man, I was there at the repair factory talking to the owner when it happened. I bought a Soyo 24" 1920x1200 for $250. A few months later the price was stupid high. That Soyo lasted me six years.

https://bjorn3d.com/2008/07/soyo-24-pearl-series-lcd/

Then came the Korean wave of cheap, high-refresh-rate monitors... and they were good monitors overall.

I purchased the 27" QNIX 2710 in late 2014 for $320, and it lasted me until 2019.

From there, my current monitor is the 32" Pixio PX329. It's a nice monitor, but it'll probably only last another year or two, putting it in the 2023 range; the power brick will probably give out sooner, though.

Because things these days are made to break... so that consuoooomers will buy the greatest and bestest things that they really don't need.

And finally, I won't spend more than around $350 on a monitor. I just find that nothing is really made well anymore. It's easier to slap a stamp of quality on something than to actually do R&D.
 
And finally, I won't spend more than around $350 on a monitor. I just find that nothing is really made well anymore. It's easier to slap a stamp of quality on something than to actually do R&D.

I find the historical background interesting and I agree with most of what you said, but not this last part. It's true that most electronic goods are designed to fail these days; fridges and washing machines fail within literal months of the warranty expiring with remarkable frequency. But high-end monitors are well worth it if you can afford them. There are 5K2K ones now which cost a LOT, but working on them is a joy compared to something in the $350 region.
 
I find the historical background interesting and I agree with most of what you said, but not this last part. It's true that most electronic goods are designed to fail these days; fridges and washing machines fail within literal months of the warranty expiring with remarkable frequency. But high-end monitors are well worth it if you can afford them. There are 5K2K ones now which cost a LOT, but working on them is a joy compared to something in the $350 region.
That depends on your parameters. IMHO, and it's what I tell my clients: look at 1440p @ 144Hz around the $350 mark and wait two or more years until the price of 4K comes down. I'm not sold on it, the same way I wasn't sold on past gimmicks such as G-Sync from Ngreedia.

I saw no real degradation playing the usual host of video games (on a borrowed monitor) that would justify the $100+ price increase at the time. The same goes for ray tracing. Neat concept, except 80% of the world doesn't have bleeding-edge tech to make full use of it, and not all game companies make full use of it either. But it sure as hell drove up the price of the video cards.

1440p at a high refresh rate was incredibly expensive prior to 2014, just like 4K is now. I'll wait a few more years until the price comes down to where it's affordable.

My 32in" Pixio329 @165mz got good reviews, works well with my 5700XT. But what I have said before is what is going to go out is probably going to be your transformer block. I've already have an extra and will be automatically replacing it at the end of this year.

Oh, that's another thing: you really have to pair the monitor with what your video card can do. Of course, if you're willing to sell a kidney (or really do have the money) and got a 3070 or higher... then okay, I see where you're coming from. It would be better to take advantage of your video card and get a high-performance monitor.

In my case, by the time the Pixio hits four years I'll be looking for a replacement, and I'll retire the old ViewSonic (that I got for free) that's on my backup computer.
 
1. Weight.
2. Fucking Greed.
There's more to it. I know @Smaug's Smokey Hole mentions power consumption, but the biggest issue with CRTs is that they're environmental landmines due to how much lead is in the glass of a single CRT monitor. Recycling warehouses have to take proper precautions to keep it from getting out.
 

Thieves Hit Internet Cafe and Make Off With GPU Stash

Chinese news outlet 浙样红TV has reported that thieves have stolen over $7,000 worth of high-end graphics cards from an Internet cafe in the city of Hangzhou.


I'm surprised that we haven't heard about this happening more often. Unlike other relatively small, high-value items like laptops, a graphics card won't be geofenced, brick itself, phone home, or any of that stuff. Phones can be IMEI-blocked or locked as well. A functioning graphics card will remain a functioning graphics card, and it can be used to bolster someone's personal mining efforts or be sold at the current ridiculous scalper prices. From previous pictures it looks like some miners rent an empty house and just place everything on the floor in the empty rooms, and I'm surprised nobody is raiding them. Maybe they booby-trap the place and thieves don't want to risk it? The Chinese collectors who gather and sell birds' nests supposedly booby-trap the caves they take them from to keep competition out, a hazard for tourists who like to explore off the beaten path, and I've heard that houses rented to serve as grow houses for weed can be booby-trapped too.
 

Thieves Hit Internet Cafe and Make Off With GPU Stash

Chinese news outlet 浙样红TV has reported that thieves have stolen over $7,000 worth of high-end graphics cards from an Internet cafe in the city of Hangzhou.


I'm surprised that we haven't heard about this happening more often. Unlike other relatively small, high-value items like laptops, a graphics card won't be geofenced, brick itself, phone home, or any of that stuff. Phones can be IMEI-blocked or locked as well. A functioning graphics card will remain a functioning graphics card, and it can be used to bolster someone's personal mining efforts or be sold at the current ridiculous scalper prices. From previous pictures it looks like some miners rent an empty house and just place everything on the floor in the empty rooms, and I'm surprised nobody is raiding them. Maybe they booby-trap the place and thieves don't want to risk it? The Chinese collectors who gather and sell birds' nests supposedly booby-trap the caves they take them from to keep competition out, a hazard for tourists who like to explore off the beaten path, and I've heard that houses rented to serve as grow houses for weed can be booby-trapped too.

I heard about scalpers in Toronto being robbed of their cards. And in Scotland, a PS5 scalper was conned into driving 120 miles at his own expense before the purported buyer revealed he'd done it to waste the scalper's time (cash on delivery, you see). The scalper had a shit fit and said "I hope your kids are disappointed." Chappie replied, "lol they already have one fuckface."

But yeah, these are reasons not to scalp, frankly. If I didn't have such a finely tuned moral compass ("lol, what moral compass, Piglet's a lawyer") I would be sorely tempted to list my 3080 on greedbay with pictures of it and its original box and suchlike, and when the money came in, send the box with a bag of sand in it, carefully filled to the same weight as the card. And then torch my account and run for the hills. I'm also surprised scalpers haven't been jumped, and the money retrieved, after giving out their address for cash-on-collection card sales.
 
I heard about scalpers in Toronto being robbed of their cards. And in Scotland, a PS5 scalper was conned into driving 120 miles at his own expense before the purported buyer revealed he'd done it to waste the scalper's time (cash on delivery, you see). The scalper had a shit fit and said "I hope your kids are disappointed." Chappie replied, "lol they already have one fuckface."

But yeah, these are reasons not to scalp, frankly. If I didn't have such a finely tuned moral compass ("lol, what moral compass, Piglet's a lawyer") I would be sorely tempted to list my 3080 on greedbay with pictures of it and its original box and suchlike, and when the money came in, send the box with a bag of sand in it, carefully filled to the same weight as the card. And then torch my account and run for the hills. I'm also surprised scalpers haven't been jumped, and the money retrieved, after giving out their address for cash-on-collection card sales.

Scalpers are annoying but I read an article about them and it said they were mostly college kids. Or at least the ones who aren't part of organized groups are.

Idk how mad I can get at some 19 year old who's already financially fucked by student loans and the generally shitty state of the economy tryna make a few bucks off PS5s.
 
Idk how mad I can get at some 19 year old who's already financially fucked by student loans and the generally shitty state of the economy tryna make a few bucks off PS5s.
Great moments in scalper history: the PS3 launch. In a sea of listings with insane prices, the scalpers started desperately pimping out their wives, girlfriends and sisters to get people to click on their listings, because nobody was buying. Some had bought multiple systems on credit.
New PS3s were soon selling below retail on eBay.
 
Well, nice stuff from AMD, if you can get it. They've just announced their new line-up of RDNA 2-based Pro graphics cards. Their W6800 has a very comfy 32GB of RAM. Supposed to be a lot faster than last gen as well.


They've got a W6600 which has 8GB. Supposed to be about $600, so I'd be up for one if you could actually get one for that. With the big crypto slump of the past few weeks, we might actually be able to get cards soon.

They've got their new mobile version out too, which I don't care so much about, but if you need a workstation laptop I guess it's pretty swell.

EDIT: Has anyone discussed FSR yet? What I'm hearing is that it's not quite competitive with Nvidia's DLSS, but it's a lot easier to implement and has big studios already on board. It's also backwards compatible with previous generations, and it's an open standard, so it looks like they're trying to (and very likely will) pull a FreeSync-vs-G-Sync move all over again.

 
EDIT: Has anyone discussed FSR yet? What I'm hearing is that it's not quite competitive with Nvidia's DLSS, but it's a lot easier to implement and has big studios already on board. It's also backwards compatible with previous generations, and it's an open standard, so it looks like they're trying to (and very likely will) pull a FreeSync-vs-G-Sync move all over again.

Oh, I mentioned that. It's available on consoles, which is a huge boon for AMD and for anything on PC that touches consoles. AMD isn't as strong on the RT front at this point, but as expected they've made considerable gains now that games that also ship on consoles are using their hardware implementation. Nvidia was the only game in town for a while.

Unreal Engine 5 reportedly performs way better with raytracing on AMD hardware because it (by default) relies on a software/shader mode, ultimately ordinary compute work rather than dedicated RT hardware, presumably so they don't exclude the vast majority of people who don't have RT cards, because that would be dumb. That's the nature of the PC graphics market: new hardware features need to see new hardware adoption, and as of now it's not the time to make RT mandatory.
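For anyone wondering what "raytracing in software" even means: just marching rays in ordinary compute code instead of using RT cores. Here's a toy CUDA sketch of the idea (my own illustration, not Epic's code; my understanding is Lumen's software mode marches signed distance fields in its own shaders): one thread per pixel sphere-traces a trivial SDF scene and writes a grayscale image.

```cpp
#include <cstdio>
#include <cuda_runtime.h>

// Signed distance to a unit sphere at the origin: the whole "scene".
__device__ float sdf_sphere(float x, float y, float z) {
    return sqrtf(x * x + y * y + z * z) - 1.0f;
}

// One thread per pixel: march along the ray, stepping by the distance to the
// nearest surface, until we hit something or give up. No RT cores involved.
__global__ void trace(unsigned char* img, int w, int h) {
    int px = blockIdx.x * blockDim.x + threadIdx.x;
    int py = blockIdx.y * blockDim.y + threadIdx.y;
    if (px >= w || py >= h) return;

    // Ray from a camera at (0, 0, -3) through a simple image plane.
    float dx = (px - 0.5f * w) / h, dy = (py - 0.5f * h) / h, dz = 1.0f;
    float inv = rsqrtf(dx * dx + dy * dy + dz * dz);
    dx *= inv; dy *= inv; dz *= inv;

    float t = 0.0f;
    unsigned char shade = 0;
    for (int i = 0; i < 64; ++i) {
        float d = sdf_sphere(dx * t, dy * t, dz * t - 3.0f);
        if (d < 1e-3f) {                  // hit: shade by distance travelled
            shade = (unsigned char)(255.0f - 60.0f * t);
            break;
        }
        t += d;                           // safe step: classic sphere tracing
        if (t > 10.0f) break;             // flew past the scene
    }
    img[py * w + px] = shade;
}

int main() {
    const int w = 256, h = 256;
    unsigned char* img;
    cudaMallocManaged(&img, w * h);
    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    trace<<<grid, block>>>(img, w, h);
    cudaDeviceSynchronize();

    FILE* f = fopen("sphere.pgm", "wb");  // grayscale PGM, easy to eyeball
    fprintf(f, "P5\n%d %d\n255\n", w, h);
    fwrite(img, 1, w * h, f);
    fclose(f);
    cudaFree(img);
    return 0;
}
```

That's the trade-off being described: this kind of loop runs on any GPU (it ports straight to a plain compute shader), while the hardware path needs dedicated RT units that most people don't have yet.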
 