What's with Google? I had to go four pages deep to get to the proper GFX card benchmarking sites like Wccftech/Tom's/Techpowerup/Eurogamer/notebookcheck/kitguru/guru3d.
It was all trash forums and crap benchmarks until then. Maybe they're being held for ransom by SEO.
FSR 3.0 in theory copies some of the DLSS 3.0 tech but isn't bound to the latest-gen cards.
I still don't understand AI-generated frames and frame interpolation. It was considered horseshit when it was first attempted, but now that it's done by 'AI', people like it.
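As far as I can tell, the old-school version was literally just blending adjacent frames. Here's a toy sketch of that (plain NumPy, purely illustrative; the frame sizes and square positions are made up, and real DLSS 3 / FSR 3 generation uses motion vectors and learned models rather than a dumb blend):

CODE
# Toy frame "interpolation": average two frames to fake an in-between one.
# Purely illustrative -- real frame generation shifts moving objects along
# motion vectors instead of blending, which is why it doesn't ghost like this.
import numpy as np

def blend_frames(frame_a, frame_b, t=0.5):
    """Linear blend of two HxWx3 uint8 frames at time t in [0, 1]."""
    mixed = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return mixed.round().astype(np.uint8)

# A white square moves 8 px to the right between two otherwise black frames.
a = np.zeros((64, 64, 3), dtype=np.uint8)
b = np.zeros((64, 64, 3), dtype=np.uint8)
a[28:36, 20:28] = 255
b[28:36, 28:36] = 255

mid = blend_frames(a, b)
# Instead of one square halfway along its path, the fake frame has two
# half-bright squares (ghosting) -- the classic naive-interpolation artefact.
print(mid[32, 24], mid[32, 32])   # both come out around [128 128 128]

That ghosting is basically why the old interpolation had such a bad reputation; whether the motion-vector/AI version really fixes it is the part I'm still not sold on.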
This post has been edited by EsotericSatire: Aug 30 2023, 15:29
QUOTE(EsotericSatire)
FSR 3.0 in theory copies some of the DLSS 3.0 tech but isn't bound to the latest-gen cards. I still don't understand AI-generated frames and frame interpolation. It was considered horseshit when it was first attempted, but now that it's done by 'AI', people like it.
Well, I am probably people and I don't.
QUOTE(-terry- @ Aug 30 2023, 11:08)
I doubt that FSR 3.0 interpolation will be close to NVIDIA's implementation, like with most things.
Nooo, you have to treat it as if it's superior even when it isn't, otherwise nvidia wins.
I think CUDA has been their ace in the hole for a while now. And I hate it. Moreso, I hate that people still voluntarily choose to use it for things instead of any other GPGPU technique.
AI is the latest hot advertising buzzword, in the vein of terms like 'the cloud' or 'internet of things' or (in web dev circles) 'javascript framework.'
This post has been edited by dragontamer8740: Aug 30 2023, 18:29
Looking at a few smaller reviewers, I think they are correct. The problem with frame generation is that it gives game devs an excuse to ship poorly optimized engines.
Games are coming out that assume you will use frame generation and upscaling, for no reason other than shit engine/game optimization.
edit: Starfield may be the latest 'victim'. People are saying that at native resolution the game barely hits 60 fps at 1080p low on the new RTX 4060 without upscaling, and that a GTX 1060 barely hits 30 fps at 1080p low with 50% resolution scale plus upscaling.
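For context on what "1080p low with 50% resolution" actually means: the scale is applied per axis (at least that's how these render-scale sliders usually work), so the engine only shades a quarter of the native pixels and the upscaler has to invent the rest. Quick arithmetic sketch, numbers purely illustrative:

CODE
# Back-of-envelope internal render resolution for a given output and scale.
# Assumes the scale applies to each axis, as render-scale sliders usually do.
def internal_res(width, height, scale):
    return round(width * scale), round(height * scale)

for scale in (1.0, 0.67, 0.5):
    w, h = internal_res(1920, 1080, scale)
    share = (w * h) / (1920 * 1080)
    print(f"{int(scale * 100):3d}% scale -> {w}x{h} internal ({share:.0%} of native pixels)")

# 100% scale -> 1920x1080 internal (100% of native pixels)
#  67% scale -> 1286x724 internal (45% of native pixels)
#  50% scale -> 960x540 internal (25% of native pixels)

So a GTX 1060 "running Starfield at 1080p" in that mode is really rendering at 960x540 and letting the upscaler fill in the gap.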
This post has been edited by EsotericSatire: Sep 2 2023, 01:04
QUOTE(dragontamer8740)
Well, I am probably people and I don't. Nooo, you have to treat it as if it's superior even when it isn't, otherwise nvidia wins.
I think CUDA has been their ace in the hole for a while now. And I hate it. Moreso, I hate that people still voluntarily choose to use it for things instead of any other GPGPU technique.
AI is the latest hot advertising buzzword, in the vein of terms like 'the cloud' or 'internet of things' or (in web dev circles) 'javascript framework.'
It can be frustrating to see CUDA dominate the GPGPU space, especially when there are other viable options. Artificial Intelligence really seems to be the latest buzzword, similar to how Cloud and Internet of Things have been used.
Speaking of IoT, I recently took a deep dive into [www.cogniteq.com] iot companies. It's interesting to see how they use different technologies to create interconnected devices. This is a rapidly growing field with great potential, much like artificial intelligence.
I'll leave this thread open since the topic is on video card matters in general, but please remember that necroposting is frowned upon in these forums.
Here I am sticking to 1080p gaming with my 1080ti. It's really not that bad tbh, I still think 1440p or 4k gaming is overrated.
Thank you! At least some people still appreciate the old. Everything is supposed to be the newest: newest OS, newest games, newest resolutions, newest PC. Everything has to be the newest. Nobody says it outright, but it feels that way. Where does it come from? Demand, or the companies, or both? Anyway, why does the world always have to dump the old and only keep the new? To keep Wall Street happy? To make money? I don't know, but I don't like that tendency. Even now Windows 10 is being dumped for Windows 11. So it goes, and it will continue in the future: Windows 12 will be dumped for Windows 13, the PS5 will be dumped for the PS6, and so on. Man... jeez. Companies or demand? Why not keep some of the old that's at least good enough? Even if it's not great, people might still want to use it for whatever reason.
This post has been edited by Striferuka: Jul 9 2024, 09:01
Please tell me 28 GB of VRAM for the 5090 is a joke. I desperately want to believe it's just a rumor.
Why? VRAM is important for machine learning, and that's what the high-end cards are designed for. Unless you mean you expected more than 28 GB. I kinda did, considering the 3090 is four years old and had 24 GB; 32 GB would sound like a big upgrade from a marketing perspective, 28 doesn't.
For gaming, obviously, none of this makes any difference. Maybe NVIDIA doesn't want to add more VRAM per card because they know the real AI companies with money to spend will just buy more cards if they want more VRAM, or spring for the much more expensive A-series. Or maybe they're afraid of scaring away gamers who don't want to "pay for wasted VRAM"? Maybe a bit of both.
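For a sense of scale on the ML side: just holding a model's weights costs parameters x bytes per parameter, before activations, KV cache or framework overhead. Rough ballpark below (the model sizes and precisions are just illustrative examples, not anything NVIDIA has said):

CODE
# Ballpark VRAM needed just to hold model weights (ignores activations,
# KV cache and framework overhead, so real usage is noticeably higher).
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def weights_gib(params_billions, precision):
    return params_billions * 1e9 * BYTES_PER_PARAM[precision] / 2**30

for name, params in [("7B", 7), ("13B", 13), ("70B", 70)]:
    line = ", ".join(f"{p}: {weights_gib(params, p):.1f} GiB" for p in ("fp16", "int8", "int4"))
    print(f"{name:>3} -> {line}")

#  7B -> fp16: 13.0 GiB, int8: 6.5 GiB, int4: 3.3 GiB
# 13B -> fp16: 24.2 GiB, int8: 12.1 GiB, int4: 6.1 GiB
# 70B -> fp16: 130.4 GiB, int8: 65.2 GiB, int4: 32.6 GiB

Which is roughly why 24 GB vs 28 GB vs 32 GB actually matters to that crowd and barely matters for games.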
This post has been edited by kotitonttu: Aug 14 2024, 19:42
They probably want to avoid another export ban from the US government with their consumer cards. Making them too beefy triggers the aggro, and some gamers *really* hate not buying the newest and *best* RGB-powered gaming card on the market.
QUOTE
They probably want to avoid another export ban from the US government with their consumer cards. Making them too beefy triggers the aggro, and some gamers *really* hate not buying the newest and *best* RGB-powered gaming card on the market.
Bans like this are fucking pointless anyway. Nvidia is very much a profit-oriented business and couldn't give a flying fuck about who wins Cold War 2.0.
They will keep coming up with workarounds. I doubt the US government can do much to stop that if Nvidia gets really fed up with them.
On VRAM: still happy enough with 2 GB here. Wouldn't mind 4 GB, but I'm in no rush to upgrade.
I have 12 GB and that's kinda overkill, but it doesn't hurt. I could settle for 8. 4 definitely stopped being enough a few years ago, and even 6 was pushing it.
I was planning to buy the 5070 unless the 5080 turns out to be surprisingly cheap (I doubt it). Now they're saying the 5070 will have 8 GB and the 5080 will have 16 GB. I really, really would not want to buy a new graphics card that's supposed to serve me for 5+ years and go a third DOWN in VRAM from my current card, which was always meant to be a temporary stepping stone while I upgraded the rest of my rig (too powerful for my old PC, not powerful enough for my current PC).
e. Apparently some sources are claiming the 5070 will have 12 GB, which would be perfectly fine by me. Maybe it's not worth worrying about this stuff until the cards are actually on store shelves.
This post has been edited by kotitonttu: Aug 15 2024, 04:28