GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

One tip if you are calibrating by feel or using a guide like Lagom: tweak for a little while and WALK AWAY.
This is, btw., good general life advice that helps with a lot of stuff like this. The brain getting used to a particular pattern can be a bitch.

How fucked am I?
Not at all, just replace the cooler. The power outage wasn't so much the culprit as the thing that finally pushed it over the edge.
 
Be nice if mice were easier to take apart - no more of this shit where the case has plastic tabs you have to nearly break to get it open. If you could unscrew the entire mouse with 6-10 screws, mine would last a lot longer.

Even the gamer mice have improved lately though. I had an original G502 that eventually fell apart and had slight nail marks (from 1mm long 'nails' on my right thumb) on the padded portion, but I replaced it with a G502 Hero that has some kind of scratch-resistant padding, so it has held up better. Still would be nice if you could take the whole thing apart more easily via screws.

Another area I have to say isn't quite as good as the 'old' versions is mechanical keyboards. What happens with every one I have had is that eventually the keys get very easy to bump off, e.g. by my hand brushing against them. Right now my arrow keys are like this, and to properly clean the board you have to pull off a portion of the keys.

I am probably going to pay for a 1980s beast keyboard soon instead. They had their own set of problems - short cords, an obvious lack of hotkeys or volume controls, since no one had conceived of those that early - but I am tired of having to replace modern mechanicals.
Fuck, but I just bought a G502 Lightspeed. Did I make a stupid decision? It was 50% off, so I thought it could be a steal.
 
Heatsinks on RAM chips are a pure marketing gimmick. They don't need them.

I fantasize now and then about putting my AMD APU in one of these tiny cases; the Fractal Design Node 202, for example, looks nice. The last time I tried something like that, though, it was a disaster. (Constant heat problems - you just can't cool these tiny spaces.)
Fans on SSDs are the new hotness:


I don't see why you can't cool a 65W APU in a small case, with no GPU. I haven't tried it but that would probably be my next build.
 
I can literally hear the fan in that first picture there.

Some of those M.2 drives might actually need it to stay at their top speeds. I can't imagine it would be necessary in a normal usage scenario though, and I also just wouldn't do it to myself, speeds be damned.

I don't see why you can't cool a 65W APU in a small case, with no GPU. I haven't tried it but that would probably be my next build.
Hot air had literally nowhere to go. It was a pretty case, but not built for something like this. It'd probably work in the Node 202 because the fan can pull in air from the side and blow it out the top, but a bigger case with some fans creating a slight airflow will always be better. That airflow doesn't even have to be particularly noticeable, but there will be an impact.

That said, I actually might end up buying that case and trying. The idea of a proper desktop build, screen on top, keyboard in front and everything, just won't quite let me go.
 
If those numbers are correct about max power consumption going from like, 6-10W in current generations, to 14W in gen 5, 20+ in gen 6s, it stands to reason that something has to be done if they're going to be able to perform as well as they're supposed to without running big temperature differentials with the stuff they're plugged into.

What seems more likely... a redesign of the NVMe interface to dump heat into the motherboard better? Or a choice between bigger and bigger heatsinks and active cooling, and just getting your SSD more spread out on a real PCIe card?
 
If those numbers are correct about max power consumption going from like, 6-10W in current generations, to 14W in gen 5, 20+ in gen 6s, it stands to reason that something has to be done if they're going to be able to perform as well as they're supposed to without running big temperature differentials with the stuff they're plugged into.

What seems more likely... a redesign of the NVMe interface to dump heat into the motherboard better? Or a choice between bigger and bigger heatsinks and active cooling, and just getting your SSD more spread out on a real PCIe card?
I'm not sure if this heat ramp up is just the laws of physics coming into play as you try to move data faster, or if some node shrinks and better designs can take care of it.

Here you can see 28nm memory controllers in mainstream PCIe 4.0 drives and 12nm only at the high end:

As with most SSD controllers aiming for the high end PCIe 4.0 product segment, the SM2264 is fabbed on a smaller node: TSMC's 12nm FinFET process, which allows for substantially better power efficiency than the 28nm planar process used by the preceding generation of SSD controllers.

These new SSD controllers roughly double the performance available from PCIe 4.0 SSDs, meaning sequential read throughput hits 14 GB/s and random read performance of around 2M IOPS. To reach this level of performance while staying within the power and thermal limits of common enterprise SSD form factors, Marvell has had to improve power efficiency by 40% over their previous generation SSD controllers. That goes beyond the improvement that can be gained simply from smaller fab process nodes, so Marvell has had to significantly alter the architecture of their controllers. The Bravera SC5 controllers still include a mix of Arm cores (Cortex-R8, Cortex-M7 and a Cortex-M3), but now includes much more fixed-function hardware to handle the basic tasks of the controller with high throughput and consistently low latency.

Don't be an early adopter of PCIe 5.0 SSDs and other new technologies.
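Napkin math on the figures in that quote, by the way (the 2x throughput and 40% efficiency numbers are straight from the article; the conclusion is just arithmetic):

```python
# If "power efficiency" means throughput per watt, then power = throughput / efficiency.
# Doubling throughput while only improving efficiency by 40% still raises
# absolute power draw, which is exactly why the thermals keep getting worse.
perf_ratio = 2.0        # Bravera SC5 vs. previous gen, per the quote
efficiency_ratio = 1.4  # "improve power efficiency by 40%"

power_ratio = perf_ratio / efficiency_ratio
print(f"absolute power: ~{power_ratio:.2f}x the previous generation")  # ~1.43x
```

So even with a big efficiency win, the faster drive runs hotter in absolute terms.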
 
If those numbers are correct about max power consumption going from like, 6-10W in current generations, to 14W in gen 5, 20+ in gen 6s, it stands to reason that something has to be done if they're going to be able to perform as well as they're supposed to without running big temperature differentials with the stuff they're plugged into.

What seems more likely... a redesign of the NVMe interface to dump heat into the motherboard better? Or a choice between bigger and bigger heatsinks and active cooling, and just getting your SSD more spread out on a real PCIe card?
How about separating it from the motherboard and connecting it with a riser cable so it can be mounted in the case?

The riser cable needs to be sturdy and will probably look something like this:
f2f983ff-982e-425d-936e-0b0eddf47b0a.JPG
If the NVMe drive is placed in a 3.5" heatsink it will be super easy to mount in the case.
 
I learned a new thing today.

A few weeks ago I bought a 14" portable screen for various reasons, and I absolutely fell in love with it because of its size and high pixel density. It's basically a notebook panel some crafty chinaman stuck into a fancy brushed aluminum case with some ports on it. So far so good. Most of what I do at the computer is text-based, text looks absolutely brilliant on it, and sitting close to a screen this small is not as bad for your eyes and posture as sitting close to a 20"+ screen, so I actually started using it for a lot of what I do, and that went perfectly fine. It was basically like sitting at a notebook without all the disadvantages of a notebook.

Emboldened by this, I bought another portable screen, a 15.6" 4K one. With even higher pixel density than the 14" screen and only a little bigger, it must look absolutely brilliant, right? Wrong. Immediately after connecting it I noticed things looked off. The colors were absolute shit, and even the fonts somehow looked worse than on the 14" screen.

And that's the day I learned there's such a thing as fake 4K screens. If you try to draw a checkerboard pattern of 1 pixel white/1 pixel black, this screen just cannot do it. It turns out there are lower-end 4K screens with different pixel layouts to lower the cost. Didn't know that either. So there you go, in case you didn't know like me. As a side effect, this screen also has very poor viewing angles.
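If anyone wants to reproduce the checkerboard test themselves, here's a quick sketch (plain Python, no libraries; it writes a PGM file most image viewers can open - the 3840x2160 resolution is just my assumption for a 4K panel, adjust to your native res and view it at 1:1):

```python
# 1-pixel black/white checkerboard: viewed 1:1 on a true 4K panel it resolves
# into a uniform grey-looking field; on a "fake" 4K panel with fewer real
# subpixels it blurs or shows moiré instead.
W, H = 3840, 2160  # assumed native resolution

pixels = bytearray()
for y in range(H):
    # alternate 0/255 per pixel, offset by one each row
    pixels += bytes(255 * ((x + y) % 2) for x in range(W))

with open("checkerboard.pgm", "wb") as f:
    f.write(b"P5\n%d %d\n255\n" % (W, H))  # binary greyscale PGM header
    f.write(pixels)
```

Make sure the viewer does no scaling at all, otherwise you're testing the scaler instead of the panel.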
 
I learned a new thing today.

A few weeks ago I bought a 14" portable screen for various reasons, and I absolutely fell in love with it because of its size and high pixel density. It's basically a notebook panel some crafty chinaman stuck into a fancy brushed aluminum case with some ports on it. So far so good. Most of what I do at the computer is text-based, text looks absolutely brilliant on it, and sitting close to a screen this small is not as bad for your eyes and posture as sitting close to a 20"+ screen, so I actually started using it for a lot of what I do, and that went perfectly fine. It was basically like sitting at a notebook without all the disadvantages of a notebook.

Emboldened by this, I bought another portable screen, a 15.6" 4K one. With even higher pixel density than the 14" screen and only a little bigger, it must look absolutely brilliant, right? Wrong. Immediately after connecting it I noticed things looked off. The colors were absolute shit, and even the fonts somehow looked worse than on the 14" screen.

And that's the day I learned there's such a thing as fake 4K screens. If you try to draw a checkerboard pattern of 1 pixel white/1 pixel black, this screen just cannot do it. It turns out there are lower-end 4K screens with different pixel layouts to lower the cost. Didn't know that either. So there you go, in case you didn't know like me. As a side effect, this screen also has very poor viewing angles.
Sounds like you got gypped with a TN screen. What happens if you display thin concentric circles or something similar - do they blur, show a moiré pattern, or both? I've attached an image with circles if you'd like to check.

Panel technology can be interesting. (The RGBW technique was a new thing to me.)
monitor-panel-types.jpg
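If you'd rather generate your own circles pattern than use the attachment, here's a stdlib-only Python sketch (the 1024x1024 size is arbitrary):

```python
import math

# Thin concentric rings, alternating black/white per pixel of radius:
# a classic moiré/aliasing test pattern for panels and scalers.
SIZE = 1024
cx = cy = SIZE / 2  # center of the image

pixels = bytearray()
for y in range(SIZE):
    for x in range(SIZE):
        r = math.hypot(x - cx, y - cy)      # distance from center
        pixels.append(255 if int(r) % 2 else 0)

with open("circles.pgm", "wb") as f:
    f.write(b"P5\n%d %d\n255\n" % (SIZE, SIZE))  # binary greyscale PGM
    f.write(pixels)
```

Same caveat as any test pattern: display it 1:1 with no scaling.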
 

Attachments

  • radial-circles-design-element-converge.jpg (597.2 KB)
Can't really test the screen anymore; I brought it to the post office this morning to send it back. In Germany, anything you buy online can be sent back within 14 days without so much as an explanation and they have to refund you. No way I'm going to keep a crappy, wildly falsely advertised screen like this. But yes, you'd get moiré patterns on pictures like this; it looked exactly as if the number of pixels needed to display them 1:1 correctly just wasn't there.

Sounds like you got gypped with a TN screen
I thought so too at first, because of the poor viewing angles. The colors were actually pretty strong though, not "foggy" like the TN panels I'm familiar with - they were just off (e.g. oranges were yellow, but still strong!), and if you sat right in front of it with a somewhat dark-colored but not black background, colors and brightness in the corners would shift like crazy depending on how you held your head. Maybe it was a TN after all, because there was no right angle to look at it from, just like with a TN. (Still didn't have the fogginess of one, though.) From my googling around I found the occasional post from people with 4K-screened notebooks complaining about the same thing, leading me to believe they had a similar panel. From what I read, some PenTile matrix displays have real viewing-angle problems. When they came around, the blowback from dissatisfied customers was apparently so strong that some brands stopped using them completely. There are others like Asus who use them exclusively from 1080p up, though. (Maybe later ones that are much better.)

There are 4K portable screens that cost less and some that inexplicably cost double, although they advertise roughly the same stats. My guess is the more expensive ones are the ones with actual proper RGB-stripe panels - that is, if they didn't do a sudden switcharoo in panel technology on the same model number to cut costs. That's the chinese for you. I'm not willing to pay that much to play panel lottery on a workflow experiment, and for a little more in total I could already get an OLED screen of the same size. I do have to note, though, that not even the shitty 4K panel has any backlight bleed, and both have good, even brightness, making even blacks quite good. (For IPS.) I guess it's easier the smaller the screen is.

Second time I got burned by 4K; I guess 1080p still reigns supreme. I'm going to keep the 14" for now - buying screens is always a PITA for me because of my fussiness and high standards, it's rare that I find something I actually like, and this one was really cheap.

EDIT: Oh also, for whom it may concern - Linux handles HiDPI situations really well now, even outside of DEs.
 
Sorry for the doublepost, but is a GTX Titan X worth it for $250? Saw a Craigslist ad and don't have enough info on whether it's worth the extra bucks to go for the 6600 XT instead.
 
Sorry for the doublepost, but is a GTX Titan X worth it for $250? Saw a Craigslist ad and don't have enough info on whether it's worth the extra bucks to go for the 6600 XT instead.
For that price I'm assuming it's a Maxwell Titan X. If so, I'd rather go with the 6600 XT, because if I remember correctly it's about on par with a 1080 Ti, and it's also much, much newer. If it's somehow a Pascal Titan X, then I'd probably grab one of those for $250.
 
For that price I'm assuming it's a Maxwell Titan X. If so, I'd rather go with the 6600 XT, because if I remember correctly it's about on par with a 1080 Ti, and it's also much, much newer. If it's somehow a Pascal Titan X, then I'd probably grab one of those for $250.
maxwell.jpg
From what I've seen on the internet, this is probably a Maxwell.
 
How about separating it from the motherboard and connecting it with a riser cable so it can be mounted in the case?

The riser cable needs to be sturdy and will probably look something like this:
View attachment 3271362
If the NVMe drive is placed in a 3.5" heatsink it will be super easy to mount in the case.
Full circle back to mid-'90s hard drives, are we?
 