Video card talk, damn graphics card

 
post Aug 25 2023, 04:09
Post #41
ramenpunch



Newcomer
Group: Members
Posts: 19
Joined: 5-June 16


The GPU market has really stalled, hasn't it?

 
post Aug 25 2023, 06:45
Post #42
EsotericSatire



Look, Fat.
Group: Catgirl Camarilla
Posts: 12,563
Joined: 31-July 10
Level 500 (Ponyslayer)


What's with Google? I had to go four pages deep to get to the proper GFX card benchmarking sites like Wccftech/Tom's/Techpowerup/Eurogamer/notebookcheck/kitguru/guru3d.

It was all trash forums and crap benchmarks till then. Maybe they're being held for ransom by SEO.

 
post Aug 25 2023, 07:29
Post #43
Moonlight Rambler



Let's dance.
Group: Gold Star Club
Posts: 6,427
Joined: 22-August 12
Level 372 (Dovahkiin)


It's almost like all this AI-based search algorithm crap is... making people dumber.



This post has been edited by dragontamer8740: Aug 25 2023, 07:34

 
post Aug 30 2023, 15:28
Post #44
EsotericSatire



Look, Fat.
Group: Catgirl Camarilla
Posts: 12,563
Joined: 31-July 10
Level 500 (Ponyslayer)





FSR 3.0 in theory copies some of the DLSS 3.0 tech but is not bound to the latest-gen cards.

I still don't understand AI-generated frames and frame interpolation. It was considered horseshit when it was first attempted, but now that it's done by 'AI', people like it.
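
The basic idea, as far as I can tell: synthesize an extra frame between two real ones. A toy Python sketch of the naive version (plain per-pixel blending, no motion vectors; this is roughly what the old interpolation did, while DLSS 3 / FSR 3 add motion vectors and optical flow on top):

CODE
import numpy as np

def blend_frame(prev_frame, next_frame, t=0.5):
    """Linear blend: t=0 gives prev_frame, t=1 gives next_frame."""
    mixed = (1.0 - t) * prev_frame.astype(np.float32) + t * next_frame.astype(np.float32)
    return mixed.clip(0, 255).astype(np.uint8)

# Two fake 1080p RGB frames; inserting the blended frame between them
# halves the effective frame time (the "free fps" being sold).
prev_frame = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
next_frame = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
middle = blend_frame(prev_frame, next_frame)
print(middle.shape, middle.dtype)  # (1080, 1920, 3) uint8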

This post has been edited by EsotericSatire: Aug 30 2023, 15:29

 
post Aug 30 2023, 17:08
Post #45
-terry-



Veteran Poster
Group: Global Mods
Posts: 2,623
Joined: 9-August 19
Level 500 (Ponyslayer)


I doubt that FSR 3.0 interpolation will be close to NVIDIA's implementation, like with most things.

 
post Aug 30 2023, 18:16
Post #46
Moonlight Rambler



Let's dance.
Group: Gold Star Club
Posts: 6,427
Joined: 22-August 12
Level 372 (Dovahkiin)


QUOTE(EsotericSatire @ Aug 30 2023, 09:28) *
FSR 3.0 in theory copies some of the DLSS 3.0 tech but is not bound to the latest-gen cards.
I still don't understand AI-generated frames and frame interpolation. It was considered horseshit when it was first attempted, but now that it's done by 'AI', people like it.
Well, I am probably people and I don't.
QUOTE(-terry- @ Aug 30 2023, 11:08) *
I doubt that FSR 3.0 interpolation will be close to NVIDIA's implementation, like with most things.
Nooo, you have to treat it as if it's superior even when it isn't, otherwise nvidia wins.

I think CUDA has been their ace in the hole for a while now. And I hate it. Even more, I hate that people still voluntarily choose to use it instead of any other GPGPU technique.
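
And "any other GPGPU technique" isn't even hard to get into. A vendor-neutral vector add in PyOpenCL is about this much code (minimal sketch, assuming a working OpenCL driver; runs on AMD, Intel, or NVIDIA hardware):

CODE
import numpy as np
import pyopencl as cl

a = np.random.rand(1 << 20).astype(np.float32)
b = np.random.rand(1 << 20).astype(np.float32)

ctx = cl.create_some_context()   # picks any available OpenCL device
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags
a_g = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_g = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_g = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
""").build()

prg.add(queue, a.shape, None, a_g, b_g, out_g)  # one work-item per element
out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_g)
assert np.allclose(out, a + b)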

AI is the latest hot advertising buzzword, in the vein of terms like 'the cloud' or 'internet of things' or (in web dev circles) 'javascript framework.'

This post has been edited by dragontamer8740: Aug 30 2023, 18:29

 
post Sep 1 2023, 23:42
Post #47
EsotericSatire



Look, Fat.
Group: Catgirl Camarilla
Posts: 12,563
Joined: 31-July 10
Level 500 (Ponyslayer)


Looking at a few smaller reviewers, I think they are correct. The problem with frame generation is that it gives game devs an excuse to ship poorly optimized engines.

Games are coming out that assume you will use frame generation and upscaling, for no reason other than shit engine / game optimization.

edit: Starfield may be the latest 'victim'. People are saying that at native resolution the game barely hits 60 fps at 1080p low on the new RTX 4060 without up-scaling, and a GTX 1060 barely hits 30 fps at 1080p low with 50% resolution scale and up-scaling.
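
For scale: at 50% resolution the card only shades a quarter of the pixels, and the up-scaler fills in the rest. That's the whole trick (quick arithmetic sketch):

CODE
# 50% resolution scale: render at half width and half height,
# then upscale back to the display resolution.
display_w, display_h = 1920, 1080
scale = 0.5

render_w, render_h = int(display_w * scale), int(display_h * scale)
pixel_ratio = (render_w * render_h) / (display_w * display_h)

print(f"render target: {render_w}x{render_h}")        # 960x540
print(f"shaded pixels: {pixel_ratio:.0%} of native")  # 25% of native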





This post has been edited by EsotericSatire: Sep 2 2023, 01:04

 
post Jun 14 2024, 17:06
Post #48
Maress



Lurker
Group: Lurkers
Posts: 3
Joined: 4-June 24


QUOTE(dragontamer8740 @ Aug 30 2023, 19:16) *

Well, I am probably people and I don't.
Nooo, you have to treat it as if it's superior even when it isn't, otherwise nvidia wins.

I think CUDA has been their ace in the hole for a while now. And I hate it. Even more, I hate that people still voluntarily choose to use it instead of any other GPGPU technique.

AI is the latest hot advertising buzzword, in the vein of terms like 'the cloud' or 'internet of things' or (in web dev circles) 'javascript framework.'

It can be frustrating to see CUDA dominate the GPGPU space, especially when there are other viable options. Artificial Intelligence really seems to be the latest buzzword, similar to how Cloud and Internet of Things have been used.

Speaking of IoT, I recently took a deep dive into [www.cogniteq.com] IoT companies. It's interesting to see how they use different technologies to create interconnected devices. This is a rapidly growing field with great potential, much like artificial intelligence.

 
post Jun 14 2024, 17:20
Post #49
meow_pao



It's Always Tea-Time
Group: Global Mods
Posts: 5,347
Joined: 20-September 09
Level 457 (Godslayer)


I'll leave this thread open since the topic is on video card matters in general, but please remember that necroposting is frowned upon in these forums.

 
post Jul 9 2024, 09:00
Post #50
Striferuka



Ryuuou
Group: Catgirl Camarilla
Posts: 461
Joined: 12-August 13
Level 416 (Dovahkiin)


QUOTE(Agoraphobia @ Jul 19 2023, 11:12) *

Here I am sticking to 1080p gaming with my 1080ti. It's really not that bad tbh, I still think 1440p or 4k gaming is overrated.

Thank you! At least some people still appreciate the old. Everything is supposed to be the newest: newest OS, newest games, newest resolutions, newest PC. Everything has to be the newest. Nobody says it outright, but it feels that way. Where does it come from? Demand, or the companies, or both? Anyway, why does the world always have to dump the old and keep only the new? To keep Wall Street happy? To make money? I don't know, but I don't like the tendency. Even now Windows 10 is being dumped for Windows 11. So it goes, and it will continue: Windows 12 will be dumped for Windows 13, the PS5 will be dumped for the PS6, and so on. Man... jeez. Companies or demand? Why not keep some of the old stuff that's at least good enough? Even if it's not good, people might still want to use it for whatever reason.

This post has been edited by Striferuka: Jul 9 2024, 09:01

 
post Jul 22 2024, 01:31
Post #51
NeXT97



Lurker
Group: Recruits
Posts: 4
Joined: 4-November 16
Level 25 (Apprentice)


Waiting for 4090 to get below $700 :(

 
post Jul 22 2024, 10:01
Post #52
EsotericSatire



Look, Fat.
Group: Catgirl Camarilla
Posts: 12,563
Joined: 31-July 10
Level 500 (Ponyslayer)


QUOTE(NeXT97 @ Jul 21 2024, 13:31) *

Waiting for 4090 to get below $700 :(


RTX 2080 Tis are now hitting below that price range.

3080s/3090s briefly hit the 700-1000 USD mark before the issues with the 4090 became widespread.

You might have to wait until the 6000 series, unless you get lucky while people are dumping their 4090s to upgrade.

Also, it's worth noting the 4090 has an end-of-life power connector, so you will need a PSU with a compatible connector that is already obsolete.

The 5000 series is launching with a new version of the power connector that will set a new standard for PSUs.

This post has been edited by EsotericSatire: Jul 22 2024, 10:03

 
post Jul 30 2024, 00:49
Post #53
kotitonttu



Custom member title
Group: Members
Posts: 705
Joined: 11-April 16
Level 338 (Dovahkiin)


Getting tired of waiting for the 5000 series, but since I've waited this long it'd be extra retarded to buy a 4000-series card now.

 
post Aug 14 2024, 16:38
Post #54
DearGoddess



Newcomer
Group: Members
Posts: 50
Joined: 3-May 23
Level 211 (Destined)


Please tell me 28 GB of VRAM for the 5090 is a joke. I desperately want to believe it's just a rumor.

 
post Aug 14 2024, 18:39
Post #55
kotitonttu



Custom member title
Group: Members
Posts: 705
Joined: 11-April 16
Level 338 (Dovahkiin)


QUOTE(DearGoddess @ Aug 14 2024, 17:38) *

Please tell me 28 GB of VRAM for the 5090 is a joke. I desperately want to believe it's just a rumor.

Why? VRAM is important for machine learning, and that's what the high-end cards are designed for.
Unless you mean you expected more than 28 GB. I kinda did, considering the 3090 is four years old and had 24 GB. 32 GB would sound like a big upgrade from a marketing perspective; 28 doesn't.

For gaming, obviously, none of this makes any difference. Maybe NVIDIA doesn't want to add more VRAM per card because they know the real AI companies with money to spend will just buy more cards if they want more VRAM, or get the much more expensive A-series. Or maybe they're afraid of scaring away gamers who don't want to "pay for wasted VRAM"? Maybe a bit of both.
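
The oddly specific 28 also just falls out of bus-width arithmetic, if the 448-bit bus rumor is true: GDDR chips sit on 32-bit channels, so capacity comes in steps of (bus width / 32) times the chip size. Quick sketch (the 5090 line is rumor, not spec):

CODE
# VRAM capacity = (bus width / 32 bits per GDDR chip) * capacity per chip.
# (Clamshell boards double this by putting two chips per channel.)
def vram_gb(bus_width_bits, chip_gb):
    chips = bus_width_bits // 32  # one GDDR chip per 32-bit channel
    return chips * chip_gb

print(vram_gb(384, 2))  # RTX 4090: 384-bit bus, 2 GB chips -> 24 GB
print(vram_gb(448, 2))  # rumored 5090: 448-bit bus, 2 GB chips -> 28 GB
print(vram_gb(512, 2))  # a full 512-bit bus would give the "nice" 32 GB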

This post has been edited by kotitonttu: Aug 14 2024, 19:42

 
post Aug 14 2024, 21:51
Post #56
Kagoraphobia



✝️ Ascension of Angel ✝️
Group: Global Mods
Posts: 11,741
Joined: 12-August 19
Level 500 (Ponyslayer)


They probably want to avoid another export ban from the US government with their consumer cards. Making them too beefy triggers the aggro, and some gamers *really* hate not buying the newest and *best* RGB-powered gaming card on the market.

 
post Aug 14 2024, 21:56
Post #57
Moonlight Rambler



Let's dance.
Group: Gold Star Club
Posts: 6,427
Joined: 22-August 12
Level 372 (Dovahkiin)


I'd put """best""" in quotes rather than asterisks, because to me what's best is what does the most with the least.

On VRAM: still happy enough with 2 GB here. Wouldn't mind 4 GB, but I'm in no rush to upgrade.

 
post Aug 14 2024, 23:12
Post #58
-terry-



Veteran Poster
Group: Global Mods
Posts: 2,623
Joined: 9-August 19
Level 500 (Ponyslayer)


QUOTE(Kagoraphobia @ Aug 14 2024, 21:51) *

They probably want to avoid another export ban from the US government with their consumer cards. Making them too beefy triggers the aggro, and some gamers *really* hate not buying the newest and *best* RGB-powered gaming card on the market.

[videocardz.com] https://videocardz.com/newz/nvidia-geforce-...cloud-computing

 
post Aug 15 2024, 01:17
Post #59
Kagoraphobia



✝️ Ascension of Angel ✝️
Group: Global Mods
Posts: 11,741
Joined: 12-August 19
Level 500 (Ponyslayer)


Bans like this are fucking pointless anyway. Nvidia is very much a profit-oriented business and couldn't give a flying fuck about who wins Cold War 2.0.

They will keep coming up with workarounds. I doubt the US government can do much to stop that if Nvidia gets really fed up with them.

 
post Aug 15 2024, 04:08
Post #60
kotitonttu



Custom member title
Group: Members
Posts: 705
Joined: 11-April 16
Level 338 (Dovahkiin)


QUOTE(Moonlight Rambler @ Aug 14 2024, 22:56) *

On VRAM: still happy enough with 2 GB here. Wouldn't mind 4 GB, but I'm in no rush to upgrade.


I have 12 GB and that's kinda overkill, but it doesn't hurt. I could settle for 8.
4 definitely started not being enough a few years ago. Even 6 was pushing it.

I was planning to buy the 5070 unless the 5080 is surprisingly cheap (I doubt it). Now they're saying the 5070 will have 8 GB and the 5080 will have 16 GB.
I really, really would not want to buy a new graphics card that's supposed to serve me for 5+ years and go 50% DOWN in VRAM from my current card, which was always meant to be a temporary stepping stone while I upgraded the rest of my rig. (Too powerful for my old PC, not powerful enough for my current PC.)

e. Apparently some sources are claiming the 5070 will have 12 GB, which would be perfectly fine by me. Maybe it's not worth worrying about this stuff until they're on store shelves.

This post has been edited by kotitonttu: Aug 15 2024, 04:28

