PC Hardware Thread


ALPHA17

6 hours ago, pixeljunkie said:

Ok. Having reread this discussion and given it some more thought, I realise that what we both agree on is that GPUs are still underpowered for driving higher resolutions at higher frame rates. From what I've read online since January 2017, RTG architected Vega to drive higher resolutions. While they were on the right track, I don't think they shifted gears properly. Could it be that they prioritised higher frame rates just below higher resolutions, when the former should have been prioritised equally alongside the latter, if not given the highest priority?

 

 

RTG has bigger problems. They currently do not have the revenue to push anywhere. I hope Navi is a hard reset of their base GPU architecture, the way Ryzen was for their processor side of things.

 

Also, AMD has had to resort to making larger-die GPUs to remain competitive with Nvidia's GPUs, which have been getting smaller and more efficient since the Maxwell days.

6 hours ago, pixeljunkie said:

If this were a list of priorities, then would you say that No. 3 should receive the highest priority, i.e., immediate attention?

 

 

 

The list was in no particular order. 

6 hours ago, pixeljunkie said:

That's why I used the word 'experiment'. I may not have conveyed my opinion in so many words, but in my defense I was confident that the word 'experiment' was self-explanatory.


 

An 'experiment' would imply only niche availability of the product. But these displays are quite widespread, especially from Samsung and LG, companies which produce their own panels.

6 hours ago, pixeljunkie said:

So, the pragmatic choice would be to immediately develop GPUs capable of achieving stable frame rates in the upper range of 120–160 fps. And I'm using the word 'immediately' to describe the upcoming uarchs, namely Volta/Ampere and Vega v2/Navi from Nvidia and AMD respectively?


 

Games and applications become more complex as time goes on, so it is not a question of 'immediate' returns or not.

 

Also, you have to understand that companies like to improve the efficiency of their products too, so a generation might not show a very high performance uplift, but the relative efficiency at the same performance level makes all the difference. This also dovetails into the fact that all companies are focussing on using the same chips in their mobile portfolios.

6 hours ago, pixeljunkie said:

This is too damned controversial. And trust me when I say that there's a lot more backstory to this than the larger public is actually aware of. So I won't engage you.

 

Is Intel's tick-tock model really all that controversial? Hmmm... I wonder why no one has ever tried spinning it into something controversial.


 

I do not have a problem with Intel's tick-tock model. What I do have a problem with is the six years of relative stagnation in performance (real-world performance, not benchmarks), thanks to their only competition being hamstrung by one major release gone bad.

 

There is nothing controversial about calling a spade a spade. Also, there is no spin angle to this. 

7 hours ago, pixeljunkie said:

If you've paid attention to any of my posts on IVG, then you'll realise that I'll be the last person on the face of this planet to think along the lines of 'good vs. bad'. ;)

 

 

Whatever, man. Explain how you define which company is our friend, when I have clearly shown that none of them is out here looking out for us or has some grand strategic future vision.


2 hours ago, pixeljunkie said:

Hopefully, it will keep the other competitors in the GPU manufacturing industry competitive. I wonder: who do you think designed Navi?

 

 

Cannot answer. Navi may have been in the pipeline since the Fury days for all we know, so Raja could have had a hand in it. Maybe, maybe not.

2 hours ago, pixeljunkie said:

But if you factor in the current adoption rates among Samsung's and LG's monitor/TV customer bases, then I wouldn't be wrong in saying that manufacturers are still experimenting with the idea.

 

 

You do realise Samsung and LG make panels for the majority of other OEMs as well. So to them, it does not matter apart from a direct sales figure statistic.

2 hours ago, pixeljunkie said:

If only game developers were actually honest with themselves about the quality of their respective codebases...

 

 

Heh! You would think open APIs would be really popular, but they are not. Proprietary middleware is super popular.

 

Case in point, 

2 hours ago, pixeljunkie said:

Ok. Having thought about what you said from a different angle, don't you think I would be right if I said that AMD has Intel to thank for not losing a lot more mindshare?


 

Why does AMD have any say in it? Did they have an MVP at that time? No.

 

Was Intel really doing anything to promote AMD's product line even by surrogacy? No. 

 

The point is, Intel was too busy buggering everyone who, quite honestly, did not have any choice in the matter of purchase. And it did not change its tack after Ryzen's launch.

2 hours ago, pixeljunkie said:

It's not that I never think along the lines of 'good vs. bad' or 'friend or foe'. But when I do, I immediately make it a point to delve deeper into the subject.

 

Since we're discussing friends and friendship, I was stunned by how AMD treated Raja Koduri. Ask any veteran in the GPU industry and they'll tell you that, in addition to his brilliance as an engineer, he's also a genuinely good person.


 

So... again, does it not reinforce the point I have been trying to get across?

 

Rick Bergman and Carrell Killebrew were ejected by AMD after their decisions turned AMD's GPU division around and changed how it operated.

 

Dirk Meyer, the guy who literally saved the company from going under immediately after the ATi acquisition and the subsequent bloodying against Intel's Conroe family of processors, was ejected on the eve of the Bulldozer launch.

 

Jim Keller, the man behind AMD's most iconic processor generations, left the company every time once he was done with his work, sometimes more than a year ahead of the design's commercial launch.

2 hours ago, pixeljunkie said:

P.S. I'm still waiting for your response: if and when GPU manufacturers come up with a GPU architecture roadmap that exploits compute performance to drive extremely high resolutions at even more extreme frame rates, will you then agree that 4K is the new 1080p and 8K (or even 10K) is the new 1440p?

 

 

That will have to wait until we have viable hardware released for it.

 

For all we know, VR or AR might pick up and that is a whole different ball game and requires different hardware capabilities. 


Going through the last few pages of this discussion, I feel you guys are really underestimating the benefits of a larger screen size and overrating pixel density too much. I have a 27" 1440p monitor. The larger screen size really adds to the immersion in both games and desktop usage; the difference was immediately felt when I switched over from a 24" 1080p monitor.

A friend of mine has a 34" ultrawide 3440x1440 monitor. Seeing games at his place is just too awesome to behold. He also has a 27" 4K side monitor (which has very high pixel density) which he uses for work. Comparing side by side, frankly, games on it were very underwhelming compared to the ultrawide monitor (not to mention the approx. 15–20 fps immediate cost in Assassin's Creed Origins compared to the ultrawide monitor with AA turned off). 4K was not that great of an improvement to compensate for the larger-screen immersion and the lost fps.

 

Frankly, I feel every resolution has a sweet spot for maximum effectiveness: 24" is the ideal for 1080p, 27" is perfect for 1440p, and if you are going 4K, a 32" monitor might be ideal.
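Those pairings track with simple pixel-density arithmetic: PPI is just the diagonal pixel count divided by the diagonal size in inches. Here is a minimal Python sketch (my own illustration, not from any vendor spec) that computes it for the sizes mentioned above:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# The pairings suggested above, plus the 27" 4K side monitor for comparison.
for name, w, h, d in [
    ('1080p @ 24"', 1920, 1080, 24.0),
    ('1440p @ 27"', 2560, 1440, 27.0),
    ('4K @ 32"',    3840, 2160, 32.0),
    ('4K @ 27"',    3840, 2160, 27.0),
]:
    print(f"{name}: {ppi(w, h, d):.0f} PPI")
```

The first three land in a fairly narrow ~92–138 PPI band, while 4K at 27" jumps to about 163 PPI, which lines up with the 27" 4K monitor reading as very dense but not adding much at normal desktop viewing distances.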


46 minutes ago, roun90 said:

~snip~

 

 

I was talking strictly in the sense of anti-aliasing. 

 

Your points simply corroborate why we do not have small 4K-panel-based displays.


On 2/4/2018 at 9:19 PM, pixeljunkie said:

It remains to be seen if AR/VR actually requires different GPU hardware capabilities aside from extra input devices.

 

I agree to wait as well, though I'm leaning more towards an efficient architecture that avoids dedicated functional units for a very specific purpose.


@ALPHA17 @Joe Cool Edited all of my posts. I don't know how Samsung ASICs, AMD CPUs, and Nvidia/RTG GPUs are inter-related. I understand Bitcoin/blockchain technology at a very high level but don't know squat about where the bitcoin market (networking, hardware, including the latest news & developments) starts and ends. I also do not know how Bitcoin is inter-related with the development of the aforementioned industry manufacturers and their respective roadmaps.


With the assumption that malware is behind this, get control of your account and edit these posts.

 

@ALPHA17 please keep an eye on the thread and user. Peace.

 

On 2/5/2018 at 1:52 PM, pixeljunkie said:

@ALPHA17 Just downloaded some malware onto my computer. Will wipe my system clean and get back to you tomorrow.

 

 

14 hours ago, pixeljunkie said:

Nothing at all. Successfully restored the OS from a two-day-old backup. No critical data lost whatsoever.

 

6 hours ago, pixeljunkie said:

BUMP!

 

5 hours ago, pixeljunkie said:

I understand Bitcoin/blockchain technology at a very high level but don't know squat about where the bitcoin market (networking, hardware, including the latest news & developments) starts and ends.

 

5 hours ago, pixeljunkie said:

BUMP!

 

3 hours ago, pixeljunkie said:

 

Why don't I force you to remain unemployed for two years, promising employment only on the condition that you solve an impossible puzzle without the necessary pieces, while I keep changing the rules and the time period, only to move the finishing line at the end?

 

P.S. Even then you'd have an enormous advantage, because not only do you actually know what you're involved in, but there's motivation for you because I've thrown in an incentive!

 

2 hours ago, pixeljunkie said:

 

Sh*t hasn't hit the fan. Not yet. Not until we start arguing semantics! :wallbash:

 

David

 

1 hour ago, pixeljunkie said:

I can't do anything because LinkedIn charges for membership in dollars! I can't change my display name on IVG. I don't know if the stipulated timezone is IST or EST!

 

I'm pissed and can barely think straight! Yet I'm expected to solve this puzzle blindfolded? Change the rules on a whim and keep moving the f**king finishing line!

 

Hasn't she proved her point? How much more convincing do you guys need? :wallbash:

 

David

 

1 hour ago, pixeljunkie said:

Or am I E*****?!

 

Are we two different people? Or are we one and the same?

 

1 hour ago, pixeljunkie said:

Two years of my life wasted! :wallbash:

 

1 hour ago, pixeljunkie said:

 

EDIT: Had to add the quotes. Here's something nice for the next prophecy!

 


6 hours ago, piper said:

please keep an eye on the thread and user. Peace.

 

 

I take one day to chill and this happened. 

 

Will be on priority. 

 

EDIT: Next time, please use the report function, folks, instead of post padding.

 

Thread cleaned. 

[nothing-to-see-here GIF]


2 hours ago, pixeljunkie said:

@ALPHA17 I never said anything racist or sexist, nor did I insult anyone. So how is it that you take pleasure in posting that meme? Do you think it's nice to mock someone for having a panic attack?

 

Dude... just stop! I mean, as much as I enjoy your content... please stop.


2 hours ago, pixeljunkie said:

@ALPHA17 I never said anything racist or sexist, nor did I insult anyone. So how is it that you take pleasure in posting that meme? Do you think it's nice to mock someone for having a panic attack?

 

Dude, just stop. You need certified medical attention, not random outbursts on a gaming forum.

 

I am suspending your posting rights temporarily. Take a break, go out, detox, and you are welcome to contribute once you are better. Cheerio!


2 minutes ago, ALPHA17 said:

Dude, just stop. You need certified medical attention, not random outbursts on a gaming forum.

 

I am suspending your posting rights temporarily. Take a break, go out, detox, and you are welcome to contribute once you are better. Cheerio!

 

Go ahead. I'm not here to rebel against authority. It's normal for people to end up in a bad place. But please remember that I did not say anything racist or sexist.


38 minutes ago, pixeljunkie said:

 

Go ahead. I'm not here to rebel against authority. It's normal for people to end up in a bad place. But please remember that I did not say anything racist or sexist.


 

Might actually have made more sense.


 

RUMOR: NVIDIA Ampere GA104 GPU Powering ‘GeForce GTX 2080’ and ‘GeForce GTX 2070’ Launching In April, Mass Production Has Begun And GP102 is EOL

 

The leak season for NVIDIA's upcoming Ampere architecture is officially open for business. Fresh from 3DCenter comes a rumor stating that NVIDIA's next-generation lineup will be based not on Volta but on the Ampere micro-architecture. Not only that: if the rumor is to be believed, GP102 has already transitioned to EOL (End of Life) status and will be replaced by the GA series soon.

NVIDIA’s Ampere micro-architecture will succeed Pascal, GA104 GPU allegedly launching on April 12, GP102 production has halted

Let me begin by saying that this news is nothing but a rumor at this point – it's not even a leak. There is a very good reason this is tagged as 'rumor' and not a leak or an exclusive, so if unverified rumors aren't your cup of tea, now would be the time to get off. For those who are still here, let's start with the GP102 chip. The GP102 Pascal chip is the flagship product for the commercial segment, and according to the report it has already been shifted to EOL status, which basically means that production of the GPU has stopped and no more will be made. According to the report, GP102 entered EOL status sometime at the beginning of November last year.

 

https://wccftech.com/nvidia-ampere-ga104-gpu-geforce-gtc-2070-gtx-2080-launching-april/

