The original post: /r/pcmasterrace by /u/JetAbyss on 2024-05-27 13:20:15.

So I just built my first gaming desktop after years of using laptops. Been loving my 4060 Ti build (hey, don't judge, got it on a Memorial Day sale) for around $1,000, but naturally I somewhat cheaped out on the monitor.

I focused on getting the biggest screen (27in) I could get for around $100 and got myself an MSI MP251 for $80. Apparently it's a 100 Hz monitor, but it still fulfilled my main requirement of being at least 1080p (I usually play High/Ultra at 1080p on casual games like Fortnite and Helldivers 2, alongside some older single-player games).

I never really thought about monitor refresh rates until just now.

My last gaming laptop had a 144 Hz screen, but I barely got to make the most of that refresh rate since, naturally, a laptop has trash performance, plus I didn't even really notice it anyway.

Now I'm looking it up, and apparently 100 Hz is trash or something? I played a match of HD2 and watched some 1080p music videos and didn't really notice much lol

I'll play more games later to test it out, but I wonder what others think, so I'm posting this before I sleep.

The reason I'm coping and sneeding right now is that I can't return this monitor. ;_;

I threw away the box RIP.

But yeah, to keep my use cases in mind, repeating them:

I play games casually for fun, and the only 'competitive' multiplayer games I play (casually anyway) are Fortnite and Helldivers 2. Aside from that it's usually single-player games.

  • @[email protected]
    14 months ago

    I think people say that because they base monitor refresh rates on the power cycle coming out of the wall, which is 60 Hz. So basically anything that divides evenly by 60 is good, i.e. 60, 75, 120, 240. Even if it's true that that matters, Europe has a 50 Hz power system… so 100 Hz would be great.
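
    The divisibility idea above is easy to sketch (a minimal illustration, not a claim about how displays actually sync): a refresh rate "lines up" with a mains frequency only if it is an integer multiple of it. Note that by this test, 75 from the list above doesn't actually qualify on 60 Hz power.

    ```python
    def divides_evenly(refresh_hz: int, base_hz: int) -> bool:
        """True if the refresh rate is an integer multiple of the base rate."""
        return refresh_hz % base_hz == 0

    COMMON_RATES = (60, 75, 100, 120, 144, 240)

    # 60 Hz mains (US): which common refresh rates line up?
    us = [hz for hz in COMMON_RATES if divides_evenly(hz, 60)]
    # 50 Hz mains (Europe): 100 Hz is an exact multiple.
    eu = [hz for hz in COMMON_RATES if divides_evenly(hz, 50)]

    print(us)  # [60, 120, 240]
    print(eu)  # [100]
    ```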