AMD to Retire the ATI Brand Later this Year (anandtech.com)
63 points by spcmnspff on Aug 30, 2010 | 13 comments



Is this smart? I think AMD means "CPU" in people's minds and ATI means "GPU". And both seem to have a "second-best" label attached to them (compare with nVidia and Intel). Perhaps an entirely new name would have been a better choice.

Are there any examples of similar rebrandings in the past that were successful?


Actually, ATI is, from time to time, quite strong against nVidia. But as far as I know, AMD has been behind Intel since around the Core Duo days.

So really, AMD is dropping the brand that's actually competitive and branding everything the same as its 80386 knockoffs. Interesting.

(AMD died to me when they bought ATI and continued to refuse to provide decent Linux drivers. Now I don't consider either brand as even existing, and spend my money accordingly.)


> AMD died to me when they bought ATI and continued to refuse to provide decent Linux drivers. Now I don't consider either brand as even existing, and spend my money accordingly.

I ran nVidia for about 12 years (since I built my first PC in high school), and in the past few months I've switched my home and work PCs to ATI HD4650 cards due to nVidia's refusal to make a halfway-decent 2D driver. Intel's always been good, and I was holding out for their discrete line, but once that was cancelled I gave ATI a shot, and in 2D land, they work really well. 3D is functional, but probably slow compared with what the card should be capable of. Anyhow, I'm a lot happier with ATI's Linux drivers than I was with nVidia's.


I always preferred ATI; GeForce was always more expensive for the equivalent model (at least when I was keeping up on PC gaming), and it had a wholly painful user interface, which it still does. ATI's user interface was always simple, and best of all, 95% of what you needed to do could be done through the Windows interface to begin with.

I think ATI started going downhill when AMD took over. It's strange that a fairly on-par brand managed to fall to a solid second place when the second-place CPU manufacturer purchased it.


I'm the opposite. There are entire UI concepts that only exist inside ATI's config GUIs. I had to do real research to find out how to disable HDMI overscan in Catalyst Control Center: the setting is there, but some of the buttons aren't drawn as buttons.


Are you serious? Catalyst must be one of the worst, most bloated, and downright obtuse pieces of configuration software ever made.


So what about the configuration tool? I've never once needed to use it. What matters is the actual performance of the hardware during actual use. Not liking Catalyst is a stupid reason not to buy damn good performing parts at a very good price.


I never even used Catalyst; like I said, its integration into Windows meant I never needed it. The majority of the important things were integrated into Display Settings in XP, and anything immediate was accessible through the right-click menu.

My problem with GeForce has always been that its performance is asymmetric; it always has been, and it seems like it always will be. It's, simply put, always average. With Radeons, if your card wasn't performing well, you simply dropped the resolution and performance increased dramatically; with GeForce it only increased mildly. With GeForce you have to drop from 1920x1200 all the way to 1440x900, while with Radeon you only had to drop to 1680x1050 to get good performance again.
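
For scale, a quick back-of-the-envelope (my own numbers, not from the parent; assuming performance scales roughly with the number of pixels drawn):

    # Rough sketch, assuming fragment load scales roughly with pixel count.
    resolutions = {
        "1920x1200": 1920 * 1200,  # 2,304,000 px
        "1680x1050": 1680 * 1050,  # 1,764,000 px
        "1440x900":  1440 * 900,   # 1,296,000 px
    }
    base = resolutions["1920x1200"]
    for name, px in resolutions.items():
        print("%s: %9d px (%.0f%% of 1920x1200)" % (name, px, 100.0 * px / base))

So the Radeon-style drop to 1680x1050 sheds about a quarter of the pixels, while the GeForce-style drop to 1440x900 has to shed almost half to get the same effect.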

One thing Radeon always performed poorly at is anti-aliasing; however, IMO anti-aliasing has always made graphics look like shit. I don't want blur, I want crispness. Yes, anti-aliasing was good when watching 480i TV on a big screen. I don't want to be bullshitted with anti-aliasing when I can't actually see pixels well enough to notice any blocking on rounded shapes.


Since the DirectX 11 parts (October last year), ATI has certainly not carried the second-best label.

And with the upcoming Fusion, the clean distinction between a CPU brand and a GPU brand no longer holds.


Maybe they believe that the long-term benefits outweigh the short-term damage? If ATI keeps being competitive, it will now make AMD look better. Plus, the fact that people looking into powerful GPUs keep hearing the word AMD may unconsciously make AMD's CPUs seem faster too.


Their market research shows it's fine.


One wonders if:

- they bothered to do it at all;
- it wasn't of the "confirm my thesis" type.


Good. With their upcoming Fusion product line, the ATI name on AMD CPUs would only have caused confusion.





