
I guess for a few years there might have been a market for hardware MP3 decoders. I faintly remember back when playing MP3s took a sizeable chunk of CPU time. But after a Moore's-law cycle or two the cost was negligible, and today Spotify probably spends far more CPU drawing its interface than decoding the audio stream…


Not really on PCs, IMHO. With well-optimized software, you could decode in realtime on an i486 @ 100 MHz, and a Pentium 75 could run Winamp and mIRC simultaneously, IIRC. If you didn't have enough CPU to decode MP3, it probably made more sense to upgrade the CPU rather than buy an MP3 decoder card (if those even existed). MPEG-1/2 (video) decoder cards were more necessary.


Thanks for the flashback. I too remember when playing an MP3 took a good chunk of processor utilization, maybe 40%, and doing too many other things at the same time would cause it to stutter. I also remember when 1080p, and later 4K, video were similarly taxing.


And anything I can think of that I'd want a sound card to do these days is being handled by GPUs - like NVIDIA's AI noise cancellation, where your kids can scream 5 feet away and no one can hear them on Zoom.




