
Nvidia Reveals GTX 1080 Ti, Faster than Titan X at Half the Cost


posted on Mar, 1 2017 @ 04:02 PM

originally posted by: Bleeeeep
a reply to: Aldakoopa

So you think the mindset / work ethic is more like "make an improvement we can sell"? And they somehow do every 3 to 6 months?

eeeeeeeeeeeeeh... maybe, but it's doubtful.


You know what just occurred to me: New smells like everything you've ever wanted, and they know it - they know how intoxicating the aroma is. Really, they're not even selling technology - they're selling something new - that's their business model. I don't know why I didn't see it before.

I sucked all the fun out of it, didn't I? I'm sorry.


You're not understanding me. They DO spend years developing the architecture. The architecture meaning the design of the core: how it's laid out, how many billions of transistors to use, what instructions it can handle. Once all that is done, they have a groundwork to build up from for years to come while they work on their next architecture. The underlying architecture doesn't change much. They do change the number of cores or pipelines depending on the processor and what sort of tier they're aiming for (scientific/industrial applications, enthusiast-level 'bragging rights', high-end gaming, mid-range gaming, low-end basic, and everything in between). Aside from that, they make smaller tweaks to improve that architecture between generations. A lot of times they release 'rebrands' with these minor revisions as opposed to a whole new product lineup, perhaps with only a new flagship card to keep pushing the boundaries. Usually that new flagship (as is the case with this 1080 Ti) is nothing but the same architecture as the last flagship, but with more cores, more VRAM, and more bragging rights.



posted on Mar, 1 2017 @ 04:11 PM
I'm sitting at just barely below VR capable and trying to decide if it's worth upgrading mine just yet.

We did pop a new card in the husband's machine and it looks really, really nice ... but it was also lots of dollars of really, really nice too.



posted on Mar, 1 2017 @ 05:26 PM
a reply to: Aldakoopa


Usually that new flagship (as is the case with this 1080ti) is nothing but the same architecture as the last flagship, but with more cores, more VRAM, and more bragging rights.


You're making my point for me. If all they're doing is adding a bit more to the box of cereal, then it's not new cereal; they've just been holding back. Think about it: do you think they wouldn't have the mind to push the architecture to its limit on day 1? So why didn't they? Why not quadruple everything in this new build? Because in 6 months to a year they're going to need something "new and improved" to sell.

Blitting/Tiled memory has been around since forever. Just saying.



posted on Mar, 1 2017 @ 06:00 PM
a reply to: Bleeeeep

Because it takes time to work out the kinks and iron everything out, that's why. And designing a new processor based on existing architecture (Oh my god I'm tired of typing that, but that's what it's called. lol) isn't something they do overnight, but it certainly doesn't take years like designing the architecture (GAAAAAAHHHHHH!!!!!!) does in the first place. When the framework is in place they take the time to work with it. They're just adding onto the house they've already built.



posted on Mar, 1 2017 @ 07:07 PM
You guys don't really think, especially being on ATS, that we plebe consumers get the latest and greatest tech...do you?

Having been an avid PC builder for a number of years now and watching how the competition stacks up, it is easy to see that Nvidia has no reason to play their full hand.



posted on Mar, 1 2017 @ 09:41 PM
a reply to: JinMI

The question I always have to ask when that comes up is, what purpose does it serve to hold back computer technology? Who benefits? And for what reason?

If you say it's to capitalize on it, they could just as easily capitalize on more advanced hardware. Then the games and programs developed would take advantage of the increase in performance, and they'd have to improve their product more to please the masses just like... oh, I dunno. Exactly what they've been doing for the past few decades!



posted on Mar, 1 2017 @ 09:53 PM
a reply to: Aldakoopa

Proven business model comes to mind. Keep the tech for graphics behind the software. Carrot on a stick if you will.

The opposite is true for CPUs. How long have we had 8+ core CPUs without software that properly utilizes them?
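The 8+ core point is easy to demonstrate in code, by the way: extra cores just sit idle unless the software is explicitly written to divide its work among them. A rough Python sketch, purely for illustration (the function names are mine, not from any real program):

```python
# Software only uses multiple cores when it explicitly splits work
# across them; a plain loop runs on one core no matter how many exist.
import os
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum one chunk of the range; each worker process runs this."""
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers):
    """Split range(n) into one chunk per worker and sum the pieces."""
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)  # last chunk absorbs the remainder
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    cores = os.cpu_count() or 1  # cores available, used or not
    total = parallel_sum(1_000_000, min(cores, 4))
```

Unless someone writes that chunking logic (or a game engine does it for them), the seventh and eighth cores contribute nothing, which is exactly why octo-core CPUs have outrun most software.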



posted on Mar, 1 2017 @ 10:27 PM
I have used products from both nVidia and AMD over the years and do not favour one over the other. To me, the individual product is more important than the reputation of the manufacturer. Therefore, I have no qualms about changing from one to the other to best suit my needs.

For those with a strong itch to upgrade, I would advise you, at least, to wait for non-reference versions of the card to come out before buying anything. In the past, they have overwhelmingly had better cooling and performance than the Founders Edition.

More importantly, I would advise you to wait for AMD's Vega release before even doing that. Unless you are an Nvidia fanboy, there is no reason to rush out and buy a GTX 1080 Ti without waiting for Vega to compare the value for money between the two cards.





posted on Mar, 1 2017 @ 10:35 PM
a reply to: Dark Ghost

It's been almost a year, I believe, since the announcement of Vega. I bought an RX 480 in the meantime. I'm thinking we may see a 1070 Ti before we see Vega, and that doesn't bode well for AMD, especially if it has increased functionality with Ryzen.

I'm not a fanboy of either but from my perspective, Nvidia has a whole year of product ahead of AMD at the moment.

Eagerly awaiting a shift, however.



posted on Mar, 1 2017 @ 10:45 PM
a reply to: JinMI

I agree.

All I'm saying is that it really is in the consumer's best interest to wait until Vega is released before rushing to get a 1080 Ti. The potential advantages of waiting far outweigh the temporary inconvenience of exercising a little patience.

Of course, if you are fortunate enough to be wealthy and want the best as soon as you can get it, go out and buy the 1080 Ti right away. Likewise, if you have a faulty/broken card, need an upgrade urgently, have the money for a 1080 Ti, and are happy with its specs, then go ahead and buy it.





posted on Mar, 1 2017 @ 10:56 PM
a reply to: Dark Ghost

For all intents and purposes you are correct. The only hitch is waiting on AMD. Not to mention Ryzen chips are pretty pricey, but compared to an i7 octo-core, they're not bad.



posted on Mar, 1 2017 @ 11:23 PM
a reply to: JinMI

Then why does AMD excel at DX12? They're behind Nvidia on all other fronts, and don't have anywhere near the R&D budget that Nvidia has. They knew DX12 was coming. If Nvidia is so far advanced beyond the software, then why do they fall behind when it comes to new software? On the other hand, even AMD's older HD 7000 cards support DX12. So even with their limited budget and falling behind in performance/power consumption, AMD somehow had something that DID exceed the software? It just doesn't make much sense to think that they're holding onto something. The only thing Nvidia has managed to do is re-release a modified Titan for a much cheaper price with less VRAM every time AMD has practically caught up with them. That's not releasing something more advanced that they're sitting on. That's repurposing what they've already got.



posted on Mar, 1 2017 @ 11:30 PM
a reply to: Aldakoopa

If the goal is to hit a bar that software sets, then they will hit it every time. Hardware drives software. I had a DX11 card a year before a game came out that used it.

I'm not saying AMD is ahead, in anything at the moment. I was simply highlighting that bleeding edge tech doesn't come to consumers first.



posted on Mar, 2 2017 @ 12:04 AM
a reply to: JinMI

That's easy to explain. You got a card that was capable of handling the DX11 API. When Microsoft was developing DX11, they were working alongside AMD and Nvidia to ensure compatibility with their products and so the GPU makers could see what new features they would need to support. Once DX11 was ready, the GPU manufacturers knew which of their products were compatible with it, and they could start marketing them as such. Games that were already in development before DX11 was finished had to decide whether or not to use the new API. Switching to a new API requires a lot of rework and rewriting, and since most games are already on a time constraint that the publishers set, they may decide to just go ahead with the old API. Only after that will developers consider making a game that uses the new API, and then you'll get DX11 games. A year seems like an appropriate amount of time for all that to happen before you actually see the first game using the new API.

The same thing happened with DX12. There's still not many games that use it, and those that do came out at least a year after the API was finished.

But the argument you're trying to make, if I'm understanding correctly, is that tech companies sit around on new, advanced technology before releasing it to the public. Is that true? To a certain degree, yes. They have to test it internally, make sure everything is working properly, and see if there are any last-minute improvements they can make to the product before release. They also have to write drivers for the new hardware and make it compatible with as many games as possible, especially new AAA titles. But beyond that, there's no reason for them to hold it back. It doesn't make sense in any regard, especially when people are demanding the latest and greatest for their entertainment purposes. Waiting around before releasing it won't help their profits, only hurt them, and the bottom line is what matters.



posted on Mar, 2 2017 @ 12:15 AM
a reply to: Aldakoopa




But the argument you're trying to make, if I'm understanding correctly, is that tech companies sit around on new, advanced technology before releasing it to the public. Is that true? To a certain degree, yes.


Not that they sit on it, but that it doesn't make a beeline to consumers first. That's all. Also, you're speaking about an end-user product, and all of your information applies to that product. The tech itself (the chip and the memory it utilizes) is not, IMO, as new as we are led to believe.



