
PC Suggested Configs

shreeux

Movie Buff
Intel keeps putting out new CPUs every 6-7 months - I don't see what the point of this is. It means effectively all of their products, especially processors, become obsolete a year or two after release, as motherboards for these CPUs are no longer available after that time span.

For instance, I bought an i5 9400F in 2019, and despite the fact that it's just around 2 years old, decent mobos for these 8th/9th gen CPUs have already become really scarce in the market (not available with any popular online retailers either, like Vedant, MD etc.). So if the existing mobo goes for a toss and I can't find a suitable replacement, I will have to sell off the CPU and be forced to buy a new one even though I didn't really need it.
Compared to Intel... is AMD best?
 

whitestar_999

Super Moderator
Staff member
At this point, AM4 mobos are going away soon as well; let's see how things look a year after Ryzen 7000. Intel's 13th gen launching later this year will be LGA 1700 as well, so I don't see that point being an advantage: LGA 1700 mobos should be easily available for the next 2 years or so. If the AM5 socket were out, maybe you could have said it might have an advantage, but AMD is vague about AM5's long-term support as well.
I think AMD's licensing contract is more flexible, allowing mobo manufacturers to keep producing older-gen mobos for a longer time. Maybe AMD knows that their typical customer base likes to upgrade/replace the mobo more often than the processor, and it also keeps customers on their side instead of switching over to Intel in case the price/performance ratio at that time isn't in their favour (like the current situation).
 
For longevity, AMD is definitely better. I myself know 2 people who jumped from Ryzen 2000 to 5000 while upgrading their GPU.
 
The RTX 3060 Ti was supposed to be priced at 35k
The 3050 is basically an 8GB 1660 Super with RT cores and DLSS support

Agreed, but given how the situation has evolved, there is no guarantee it will ever drop back to those levels. Only time will tell how GPUs are priced; a crypto crash would help.
 
Yes, the RTX 3060 12GB version should be priced around 35 to 40k, not more than this, but even that is overpriced.

30k is fine for it, as it is about 10% slower than the older 2070S, which was 40k. Sadly, no one knows if prices will drop further or by how much. Currently the 3060 costs 50k, the 6600XT 55k, and the 6600 46k.
 

quicky008

Technomancer
I asked someone the price of a 3060; he quoted 57k. Apparently that's the lowest price one can get a 3060 for right now.


If cryptomining doesn't stop, it might very well kill off PC gaming, at least for budget and casual gamers.
 

TheSloth

The Slowest One
^It is not just the crypto mining. The chip shortage is another very important issue here. If Nvidia were able to manufacture and push their FE cards to market like normal, the prices of the rest of the cards would fall immediately. Demand is too high against the supply; scalping is just the situation created because of this. As per online articles, it will be around 2024 before the shortage issue is resolved. As for mining, I have no idea where this is going; it goes down but comes back up again. A crash can only happen when the long-awaited new Ethereum launches.
The real issue, according to me, is that these companies now know that regardless of the cost of their product, they will be able to earn a profit. Now AMD is turning out to be as evil as Nvidia with their recent launches.
 

Extreme Gamer

僕はガンダム!
Vendor
How does a vintage GPU like the GTX 780 compare against Intel's Alder Lake iGPUs?

I'm debating whether to stick my old Zotac reference-design GTX 780 into an HTPC box with an i5 12400, or to get an i5 12500/12600 depending on availability. AFAIK the GPU doesn't support hardware HEVC decoding, and its HDMI output is restricted to at most 4K@24Hz/UHD@30Hz.

My confusion is over whether I can push a DisplayPort 4K/UHD@60Hz signal to my Denon X6700H via a DP-to-HDMI converter, and whether my GPU will struggle to decode HEVC without a hardware ASIC.

Another thing: between the UHD 730 and the UHD 770, what are the practical differences for my use case? My Vero 4K+ struggles to handle fancy animated subtitles and even full UHD Blu-rays in certain film scenes (No Time to Die), so I really want to make sure my hardware can handle it.
 

whitestar_999

Super Moderator
Staff member
GTX 780 hands down. With a latest-gen i5 I don't think you even need hardware HEVC decoding (a GPU only decodes HEVC via hardware; software decoding of any video codec is done by the processor). Fancy animated subtitles (aka .ssa/.ass) on certain anime videos can certainly consume a lot of processor power, but I still don't think they will pose any issue for a latest-gen i5. The processor in the Vero 4K is quite weak compared to any latest quad-core, not to mention it has only 2GB RAM - years ago on Win 7, I couldn't get a typical good-quality 720p anime clip with fancy sub effects in the OP/ED to play smoothly with the madVR renderer even with 4GB RAM in some cases.
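The hardware-vs-software decode split described above can be checked directly with ffmpeg; a minimal sketch, assuming ffmpeg is installed and using a hypothetical sample file name (a GTX 780 predates Nvidia's HEVC-capable NVDEC block, so only the software/CPU path applies to it):

```shell
# List the HEVC decoders this ffmpeg build offers: "hevc" is the
# software (CPU) decoder, "hevc_cuvid" would be Nvidia's NVDEC path.
ffmpeg -hide_banner -decoders | grep -i hevc

# Benchmark pure software decode of a sample clip (filename is a
# placeholder). "-f null -" discards output, so the reported time
# reflects CPU decode speed alone - if it runs faster than realtime,
# the processor can handle playback without a hardware ASIC.
ffmpeg -hide_banner -benchmark -i sample_4k_hevc.mkv -f null -
```

If the benchmark run keeps up with the clip's duration, software decode on the i5 should be fine for playback.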
 

Extreme Gamer

僕はガンダム!
Vendor
Is there a difference in image quality between Intel's decoding and Nvidia's?

And does the difference between UHD 730 and UHD 770 matter?
 