Any interesting consumer uses or is this mainly relegated to dumb AI shit?
I think HBM is really beneficial for numerical simulations as well.
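Most of those workloads are bandwidth-bound rather than compute-bound: stencil updates, sparse matrix-vector products, and the like touch each value once and do very little math per byte moved. A minimal sketch of the idea, a STREAM-style triad kernel in CUDA (illustrative only; the array size, launch configuration, and timing approach are my own assumptions, not anything from this thread):

```cuda
// STREAM-style triad: the classic bandwidth-bound pattern behind many
// numerical simulations. Each element is read/written once with almost
// no arithmetic, so throughput is limited by memory bandwidth -- which
// is exactly where HBM's very wide bus pays off.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void triad(const float *a, const float *b, float *c,
                      float scalar, size_t n) {
    size_t i = (size_t)blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + scalar * b[i];  // 2 loads + 1 store per element
}

int main() {
    const size_t n = 1 << 26;                // ~67M floats, ~268 MB per array (arbitrary)
    const size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    cudaMalloc(&a, bytes);
    cudaMalloc(&b, bytes);
    cudaMalloc(&c, bytes);

    dim3 block(256), grid((unsigned)((n + 255) / 256));
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start);
    triad<<<grid, block>>>(a, b, c, 2.0f, n);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    // Three arrays cross the memory bus per launch; report effective GB/s.
    double gbps = 3.0 * bytes / (ms * 1e6);
    printf("effective bandwidth: %.1f GB/s\n", gbps);

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

Run that on a GDDR card and an HBM card of a similar generation and the gap in effective bandwidth is roughly the gap you'd see in a bandwidth-bound simulation step.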
All AI/ML compute.
I believe the last time we saw HBM memory in consumer hardware was one of the Radeon series by AMD. Don’t think they did much with it though.
Yup, the Fury X. It was the only one to feature it, and consumers weren't very impressed given the enthusiast price point paired with a general inability to overclock the RAM.
It wasn't a bad idea. The card did pretty well for 1440p and 4K gaming at the time, but it didn't offer any truly solid advantage or any innovative use of the memory. I owned two of these (bought separately over the course of a year) and don't regret the purchases, but I certainly wouldn't repeat that decision if I could go back in time.
I think the higher bandwidth was better put to use in data centers, primarily those of CloudFlare for DDoS mitigation.
Vega cards had HBM as well, for better or worse.
I wasn’t aware of that! Thanks for bringing that up. Those had mixed reviews, didn’t they?
I think the popularity of cryptocurrency mining just happened to save some of these cards, since their compute power was very useful for it.
Back then the prices of graphics cards were a lot more reasonable, so the extra cost of HBM stood out and made the endeavor less worthwhile. Now, with how crazy prices are (and hopefully HBM has gotten cheaper to integrate since then), I think it could be incorporated with less of an impact on price.