
PC Hardware Thread


ALPHA17

1 hour ago, roun90 said:

@pixeljunkie Why do you think 1080p 60 fps is bad? The Maxwell cards (especially the x70, x80 and x80 Ti variants) still give better performance than any of the base consoles and, in some cases, better than the PS4 Pro as well.

 

These cards were made for people with 1080p 60 fps monitors who could not, or did not, move to 1440p/4K/144 Hz monitors. That is the majority of PC owners currently, most of whom are not hardware enthusiasts. Maxwell is still a very viable series for the average gamer, not a dead one.

 

1080p/60 fps has been around for far too long; so long, in fact, that it has actually been detrimental to IQ within games. In the present 4K era, i.e. RIGHT NOW, if the average game developer isn't abusing the available computational horsepower to implement over-the-top PBR or some inefficient combination of post-processing effects and passing it off as photorealism, then they depend too much on AA, not only to sharpen/improve IQ but also to eliminate in-game artifacts. To make matters worse, game developers often implement AA so poorly that it actually saps performance.

Edited by pixeljunkie

1 hour ago, pixeljunkie said:

1080p/60 fps has been around for far too long; so long, in fact, that it has actually been detrimental to IQ within games. In the present 4K era, i.e. RIGHT NOW, if the average game developer isn't abusing the available computational performance to implement over-the-top PBR or some inefficient combination of post-processing effects and passing it off as photorealism, then they depend too much on AA, not only to sharpen/improve IQ but also to eliminate in-game artifacts. To make matters worse, game developers often implement AA so poorly that it actually saps performance.

 

 

Dude, we are barely getting cards capable of pushing 1440p at 60 fps consistently these days; sorry, I must have missed the memo on the 4K/60 fps revolution. Also, since when did PC gaming devolve to the level of simple resolution wars?

 

Computational performance does not directly equate to real-world performance. I would love to live in a perfect physical dimension where a single factor is traceable to causation. If that were the case, an RX 480 would be performing right at the heels of the GTX 1070, which is not the case.

 

I agree with the Anti-Aliasing filter being the biggest sapper of performance. 

2 hours ago, HEMAN said:

And what do they do? Nvidia asks retailers not to sell to miners and instead sell to gamers. Genius people. (sarcastic voice)

 

That is the company's problem. Not yours, not mine. 

 

The easiest way to prevent misuse of cards is to cap purchases at a sane limit. No private individual should need more than two nVidia or four AMD GPUs (the highest limits of SLI and CrossFire).

 

But hey, free market rules work both ways. 

 

Edited by ALPHA17

3 minutes ago, pixeljunkie said:

@ALPHA17 The problem with this topic is that at its heart there is too much room for controversy, because game developers, GPU manufacturers and display manufacturers are simply unwilling to acknowledge their part in all of this. Please remember that I've kept this in mind while typing out a response.


 

Simple answer: if you want to look for conspiracies everywhere, you will find them. The point remains that, due to poor standardisation and different mitigation technologies/methods from different companies, there is simply no way this issue gets adequately resolved unless someone like Microsoft steps in with a 'solution'.

5 minutes ago, pixeljunkie said:

But with technologies like adaptive VRR, 30FPS won't be any less enjoyable than 60FPS. Please continue reading below.

 

 

Except that most of the games played on PC are competitive, and no amount of VRR will change the fact that lower frame rates mean higher input latency, poorer response and further knock-on effects. And this is true of all titles, single- or multi-player. Higher frame rates are beneficial at almost all levels.

 

The only time I will think about locking my frame rate is if I am playing a title like Banished or Rise to Ruins, something which by default does not require a high frame rate; even then I am hamstrung by the fact that a low floor like 30 fps means that every time there is a drop in performance, the frame rate gets kicked into the 20s and late teens, which is just as aggravating as coming down from a stable 60 fps to ~50.

7 minutes ago, pixeljunkie said:

I'm so glad you brought this up! :) No really, I am! :)

 

Doesn't it surprise you in the least that gamers haven't made a fuss about higher resolutions, which bring free AA along with them? We gamers got so caught up with frames per second, frame latency and refresh rate that we never thought to ask for higher resolution and free AA. What should've happened is that, alongside higher, synchronised refresh rates, we gamers should've also demanded the free AA that comes with higher resolutions; which basically didn't happen.

 

 

No, it does not surprise me, considering that the idea that anti-aliasing is not required at 4K resolutions is a myth, and one that has been deflated. Yes, higher resolutions negate aggressive use of the filter, but they do not outright remove the need; shimmer and jaggies are still present.

 

This article still rings true even though it is close to three years old.

 

Again, higher frame rates are not just about the number; your overall experience and your edge improve as you hit a consistently higher frame rate. And that is before you consider that pixel density currently drops off a cliff with higher-resolution displays; this is more of a problem because there simply is not enough demand for a 24", 27" or even 30" 4K display panel. And low pixel density is one of the reasons why we have issues like aliasing in the first place.

20 minutes ago, pixeljunkie said:

True. But there's another reason. Refer to the very beginning of this post.

 

 

 

Again, there is not a whole lot you can do if there are no base standards to govern the myriad of systems on offer.

 

And again, single factors are rarely the be-all and end-all of causation in this world.


11 minutes ago, pixeljunkie said:

I'm not the one spinning a conspiracy. Unfortunately, this is the one damned excuse that different companies will exploit to avoid working together towards a common goal that will benefit the industry as well as the consumer.

 

 

Why don't you broker a peace between nVidia, AMD and Intel then, since you are so motivated by the righteousness of the cause?

12 minutes ago, pixeljunkie said:

Lower frame rates do not immediately equal higher latency. Further knock-on effects? I don't follow.

 

If game developers actually understood how a tool like PIX can benefit development, it would actually help them make better games.

 

 

A lower frame rate is the prime reason for latency. Your belief that a lower frame rate does not equate to higher latency is just that: a belief.

 

Do the simple maths: what is quicker, 1/1000 of a second or 1/100 of a second? Apply it here as well: 1/60 of a second (or less) per frame versus 1/30.
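
To put rough numbers on that, here is a quick back-of-the-envelope sketch in Python (frame time only, not the full input-to-photon latency chain):

```python
# At a steady frame rate, each frame takes 1/fps seconds, so the soonest a
# new input can be reflected on screen scales the same way.
for fps in (30, 60, 120, 144):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {frame_time_ms:4.1f} ms per frame")
```

A 30 fps cap means every frame already sits on screen for ~33 ms before the next one can even arrive, double the ~16.7 ms you get at 60 fps, and any performance drop from there only makes it worse.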

15 minutes ago, pixeljunkie said:

Please ask yourself this question. Shimmer and jaggies as a result of what?

 

The article you've mentioned only considers the jump in resolution from 1080p to 2160p. Surely, you must've noticed that?!

 

 

 

The maths scales the same way irrespective of the jump in resolution. 1440p was just appearing on the horizon then; the same is true of the current 4K implementation.

 

Shimmer and jaggies are caused by the perennial motion of graphics on a display of relatively low pixel density. Because, at the end of the day, a game uses poly-meshes which can only be resolved to so much detail.

18 minutes ago, pixeljunkie said:

Could you please elaborate on this?

 

 

Take a small display; the best example I can give is Apple's Retina displays. Why are they so sharp?

 

It is primarily a function of the fact that, for the given panel size, the pixel density is above a certain threshold, so detail is retained. Take the same resolution, put it on even a slightly larger panel, and you lose that sharpness.

 

Mobile devices are replete with such examples. SONY's 4K panels are infinitely sharper vis-à-vis other comparably sized panels with lower resolution. And that is the trend now: relatively small panels coupled with super-high pixel density. This is not the case with regular displays; a 27" monitor will typically top out at 1440p, because there is simply no demand for a 4K display at that size.

 

You also have to understand: more pixels require a comparably stronger GPU to drive them and draw more power.
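
Purely as a back-of-the-envelope illustration of the pixel-density point (the panel sizes and resolutions below are examples I picked, not figures quoted in this thread):

```python
import math

# Pixels per inch (PPI) along the diagonal of a panel.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

panels = [
    ('5.5" 1080p phone',  1920, 1080,  5.5),
    ('24" 1080p monitor', 1920, 1080, 24.0),
    ('27" 1440p monitor', 2560, 1440, 27.0),
    ('32" 4K monitor',    3840, 2160, 32.0),
]
for name, w, h, d in panels:
    print(f"{name:<20} {ppi(w, h, d):6.1f} ppi")
```

The phone-sized panel lands around 400 ppi while the desktop panels sit in the 90-140 ppi range, which is why the same resolution looks far sharper on a small 'Retina'-class screen held close than it does on a monitor.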


5 minutes ago, pixeljunkie said:

P.S. Numbers and simple math simply cannot accurately explain human perception.

 

 

The point is there is zero motivation for the companies to standardise. It reduces their lock-in.

 

Numbers and maths do not accurately explain human perception, yet there is a push for certain features because they are noted and 'perceived' to better the experience. 

 

Realise that not everyone wants 4K/30 fps graphics on PC; I am biased towards the highest possible IQ with a 60 fps or higher frame rate.


I really don't know anything about the cryptocurrency scenario other than that miners use GPUs. Won't the problem be solved if both companies offer mining-specific cards? I know it's easier said than done, but to some extent it could stabilize the market.


5 minutes ago, HEMAN said:

I really don't know anything about the cryptocurrency scenario other than that miners use GPUs. Won't the problem be solved if both companies offer mining-specific cards? I know it's easier said than done, but to some extent it could stabilize the market.

And we are back to the first suggestion I made, of offering mining-specific cards and blocking the others. The problem is that the firms are not doing it. They are too lax with their gamers and are enjoying the current situation too much to do anything about the future. I think Nvidia will do something around the time of the Volta release, if the mining market has not crashed by then.


 

4 minutes ago, pixeljunkie said:

I think you're forgetting the extremely effective marketing tactics that some companies employ to ensure that certain features are a commercial success.

 

Had we gamers demanded higher resolutions, then you and I wouldn't be having this discussion.


 

I have done some time in the industry; they are about as enlightened about their concepts as a bunch of baboons are about driving cars.

 

Also, gamers have been demanding resolution jumps for some time now; it is the other factors which have lagged. '1080p' and '4K' are far easier catchwords than 'consistent 60 fps' or '144 Hz', otherwise we would have seen dirt-cheap high-refresh-rate panels in today's market. You can get a good UHD high-refresh-rate monitor at the same price you will pay for a (30"+) 4K display.

7 minutes ago, pixeljunkie said:

Who said anything about limiting high-resolution gaming to 30 FPS? Technology will never stop evolving; unless certain companies are deliberately uncooperative and break away from the original vision, chasing after short-term goals.


 

8 hours ago, pixeljunkie said:

But with technologies like adaptive VRR, 30FPS won't be any less enjoyable than 60FPS. Please continue reading below.

 

 

Technology never stops evolving, agreed. That does not mean that it is also made affordable /available to all on the day of conceptualisation. 

 

I do not think any company is out there looking only at short-term goals, but I will say that they all have a different vision and different ways of achieving it. Hence the divergent paths taken. There is also the fact that nVidia has to diversify its market because, out of the big three, it has only one viable product/service to offer currently.

 

If you want to understand how a company goes about developing a product, check out this deep dive on AMD's HD 4xxx and HD 5xxx series GPUs at AnandTech.


9 minutes ago, pixeljunkie said:

I never meant to imply otherwise. To whom are you referring? I'm assuming it's the guys who are downplaying the need for higher-res displays.


 

No. I am not downplaying anyone. I am just saying that marketing puts its money where it sees the trends going.

 

Also, on PC the case for a super-high-resolution panel is hampered by the fact that we use the display right in front of our faces, so a large panel cannot be used; conversely, there is not enough demand for small panels with high pixel density.

20 minutes ago, pixeljunkie said:

I'm not aware of these factors. What are they??

 

 

A high, consistent frame rate. We should be targeting ~75+ Hz refresh on almost all monitors, plus HDR support (although that is again a chicken-and-egg problem, as well as a technical issue, with HDR inducing lag). Higher-density panels: 24" 1440p should be standard. ~1 ms grey-to-grey should have been standardised. But all of these, apart from HDR, are really hard to market to the end user.

22 minutes ago, pixeljunkie said:

This is something we have to live with. What we need to do is differentiate companies by drawing a line that separates the ones who have the consumer's best interests at heart from the ones who deliberately hold back development which regresses the industry because they're chasing after short term goals.

 

 

 

How do you plan to achieve this? 

 

Intel scalped us for six years when AMD was in the doldrums; now that Ryzen is here, should everyone just drop Intel like a hot potato?

 

Or should this be how we look at it every time RTG releases a compelling product vis-à-vis nVidia?

 

Please do not go into vague, semantics-driven arguments, because there is no end to them.

25 minutes ago, pixeljunkie said:

I'm pretty sure in the context of this discussion that there are some companies who've lost sight of the big picture even if it was never their intention to hold back the industry.

 

 

Like? How? Why? 

27 minutes ago, pixeljunkie said:

Can you elaborate a bit more on this please? I'm not sure what to expect from the article you mentioned. Or do you want me to read the article with little context or zero expectations?

 

 

Please read the article; if you had read it, you would not have asked this question. It shows how two companies that want to achieve the same end goal look at the issue in a different light and thus take different routes to it. Moreover, it shows how certain features are included in, or removed from, the product because of an individual's will to have that in place.

28 minutes ago, pixeljunkie said:

But how does the increased pixel density encourage aliasing?

 

 

Increased pixel density reduces aliasing; I do not know how you inferred the opposite.

28 minutes ago, pixeljunkie said:

I need to research more into Sony display panels. Thanks for the info anyways!

 

 

They are not SONY panels; it is a collaborative concern called JDI (composed of SONY, Toshiba and Hitachi) which manufactures LCD panels for a myriad of industries.

29 minutes ago, pixeljunkie said:

Again, which is something we must learn to live with. So, do you agree that we need to differentiate the companies who care less about lock-ins because they only have the best interest of the industry (which in turn means the best possible product for the end-user) in mind from the ones that don't?

 

 

How do you discern this? This is going way too far into the direction of semantics. 

 

How do you know which company is doing something purely out of best interests while the other is doing it for long-term gain?

 

Do you consider having to sign into GeForce Experience for enhanced demographic data sharing with nVidia a good thing or a bad thing, considering this might allow them to deliver better patches?

 

Do you consider AMD including the toggle to optimise GPU performance under different usage scenarios a good thing considering it has a mining profile and a gaming profile? 

 

Is G-Sync better or FreeSync? 

 

Is Intel HyperThreading better or AMD's SMT?

 

And these are all surface-level questions about products/features that are built for the same end result by competing companies.


3 hours ago, pixeljunkie said:

So, can we both agree that high-resolution displays are now a firmly established trend? If so, would you also agree with me that 4K is the new 1080p and, henceforth, 8K is the new 1440p?


 

No, I cannot. What is the most common display paired with an average gaming PC sale these days? A 24" panel; if the buyer has some extra money, they might get a UHD panel or something with ~75 Hz or ~120 Hz refresh.

 

No one goes out and purchases a 4K monitor on a whim, since:

  1. They are not cheap.
  2. They are large (~30" and up).
  3. Single GPUs still cannot drive a consistent 4K experience.
3 hours ago, pixeljunkie said:

 This is the reason why manufacturers have been experimenting with curved displays.

 

 

Again, how many people do you see purchasing curved monitors? Please note that, among comparable monitors (~24", 1080p), the curved variant is more expensive and is generally the poorer option thanks to budget IPS panel limitations.

3 hours ago, pixeljunkie said:

At this juncture, that is the case. But going forward, the trend can change.

 

 

... So. Do trends always change for the better or for the more pragmatic choice?

3 hours ago, pixeljunkie said:

 For one, you can take Microsoft as an example with the launch of their latest, powerful Xbox One X premium console. They've openly acknowledged that they're losing money on every system that they sell.

 

 

 

I do not know of any console that makes a sizeable profit on its hardware; it is always the games, the accessories and the add-on ancillaries that run the business.

 

Profit on hardware is the PC side of the story. So Microsoft saying that they do not make any money on the Xbox One X is not my concern; it will not change my opinion on consoles, and it does not fundamentally change anything, apart from current market circumstances dictating that it is a better value-for-money purchase than a gaming PC. And no, it will not make droves of people quit PC gaming and go pick up an Xbox.

3 hours ago, pixeljunkie said:

Maybe you're speaking from personal experience, but how has Intel been scalping you for the last six years? In the past six years, I've owned only two desktop Intel processors and one mobile processor. They have served me very well. So, personally speaking, I don't think Intel has been scalping me. Are you implying that Intel should apologise for AMD being in the doldrums? Are you saying that AMD isn't making more money than they should with the Ryzen uarch in its present state?


 

Dude, this is the main reason everyone just sours at the prospect of having any engagement with you: the absolutely banal amount of spoon-feeding and condescension.

 

But hey, since I have nothing better on my plate right now, I will indulge you. 

 

Intel f**ked over the entire market once Bulldozer flubbed. We got Sandy Bridge --> Ivy Bridge --> Haswell --> Broadwell --> Skylake --> Kaby Lake, all with ~5-10% increments in benchmark performance. IRL performance was not affected in the slightest. The moment Ryzen dropped, it showed how terrible the market turns sans competition.

 

What did Intel do for the almost six years in which it had a 'lead'? It rested on its laurels, made a disastrous foray into mobile and burnt itself. It scalped a user base which had zero choice of competing offerings. And then, within nine months of Ryzen's launch, it dropped three new product SKUs, two of which were essentially EoL on launch day.

 

Now, make whatever brilliant deductions you want out of this and let me know whether Intel is a 'good' company or a 'bad' company, and whether it looks at short-term gains or has a long-term vision for the market.

3 hours ago, pixeljunkie said:

I don't see how I'm the one arguing semantics. I think Microsoft deserves credit for pushing out a premium product like the Xbox One X which they aren't making any money off.


 

I am sure Microsoft released the Xbox One X because of its magnanimous heart. 

3 hours ago, pixeljunkie said:

 Why don't you explain what you meant when you wrote that:

1) Intel was scalping everyone when AMD was in the doldrums

2) Everyone should drop Intel and go the way of Ryzen

 

 

I have already explained it, because people are averse to using Google/DuckDuckGo/Bing.

3 hours ago, pixeljunkie said:

  Weren't you the one who wrote that pixel density drops off a cliff at high resolutions?

 

 

No. Pixel density drops off a cliff on large panels, even at high resolutions, and large panels are 90% of the monitor/TV market. Are there any 24" 4K displays? Yes. Are they priced competitively vis-à-vis larger 4K panels? No.

 

Why? Because

  • No demand
  • Why would anyone want a 4K panel stuck to their face at a distance of ~3-5 ft?
  • Okay, why should I spend more and get a smaller display? 
3 hours ago, pixeljunkie said:

Why is it so hard for you to understand that Microsoft cares about the industry and the end user, which they've proved by releasing a high-tech, premium console that they're not making much money off? They're certainly not doing this because they believe they're special. Working with T10, they've gathered decades of experience and expertise to steer the hardware, software and gaming industry in the right direction.

 

 

So why did Microsoft dick over its entire PC user base when they had studios developing games for them with established IPs?

 

Why was Microsoft so desperate to say that PC gaming was dying, until the Xbox One got so thoroughly trounced that they now consider 'Xbox' a service on all their devices?

 

Okay, forget PCs for a moment: they collaborated with NOKIA, bought out NOKIA's smartphone arm, proceeded to destroy the company and then shut down the entire Windows Phone lineup. Most devices (even flagships) are now EoL.

 

Is that some kind of grand vision that I do not understand, or is it a company doing what is required to get by in the short term so that it can stay in the race for the long run?

 

Do you understand why I keep bringing up semantics? Because these are semantics. Every company has to play the short-term game, lock in customers and build a walled portfolio so that it can stay relevant in the long term.

 

The moment Microsoft is in the lead again, it will make mistakes like it did in the Windows 95 era, the Windows XP era, the Xbox 360 era, ad nauseam.

3 hours ago, pixeljunkie said:

I don't understand what any of this has to do with what I wrote about Microsoft going out on a limb to steer the industry in the right direction. Think about this, especially when taking into consideration that naysayers who paint Microsoft as an evil corporate empire would've expected them to only release the hardware equivalent of a PS4 Pro. But I'll indulge you anyway.

 

Are either Nvidia or AMD losing tonnes of money on GeForce Experience or the Adrenalin/mining drivers, respectively?

 

Are either AMD or Nvidia losing boatloads of money on G-Sync or FreeSync?

 

Did either Intel or AMD misstep with hyperthreading technology? If so, then are they losing insane amounts of money because of hyperthreading?


 

Yeah! AMD loses a ton of money on FreeSync by licensing the technology for free. G-Sync makes money; compare the prices of the two kinds of monitor. And guess which brand is in the lead?

 

AMD and nVidia both stand to lose money from a cryptocurrency mining run dominating the market, because the cards burn out faster and have to be replaced at an accelerated rate. The additional revenue generated by a card is not all passed back to them; it is generally pocketed by the seller. And if the bubble bursts and these cards are all dumped onto the second-hand market, it will make more sense for people to just pick up dirt-cheap GTX 1070s over a GTX 1160/GTX 1150 or whatever.

 

And why does a company losing money make them the good guys? Is it that Microsoft can afford to lose money, or that they are on a righteous crusade here? I am cynical enough to consider the former, having seen how they operate. The 'Xbox as a service' tag serves them well currently; I would like to see how it holds up going forward if Xbox sales pick up.


Intel has been slacking for some generations now because there was no threat from AMD. AMD's Ryzen was the shake-up that the market really needed.

As soon as there is a single GPU that can do 4K@60 fps comfortably in modern games, 4K monitor sales will go up. Since current GPUs can't do that but can do 1080p@144 Hz or 120 Hz, people prefer to buy higher-refresh monitors rather than 4K ones.

 

