You can do somewhat better if you host it on a server with gzip compression. Since PNG's DEFLATE compression only has a 32 KB sliding window, it can't compress out all the redundancy. But because the left-over redundancy itself forms a repeating pattern (since the black is the same all over the image), gzip shrinks it even further.
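To see the effect without building a 32512×32512 monster, here is a small self-contained sketch that writes a solid-black PNG from scratch and then gzips it. The 1024×1024 size and compression levels are just for illustration, not the original setup:

```python
import struct, zlib, gzip

def chunk(tag, data):
    """Build one PNG chunk: length, tag, data, CRC over tag+data."""
    return (struct.pack(">I", len(data)) + tag + data
            + struct.pack(">I", zlib.crc32(tag + data)))

def solid_black_png(w, h):
    """A minimal 8-bit RGB PNG, all pixels black, no interlacing."""
    ihdr = struct.pack(">IIBBBBB", w, h, 8, 2, 0, 0, 0)
    # Each scanline: one filter byte (0 = none) followed by w black RGB pixels.
    raw = (b"\x00" + b"\x00\x00\x00" * w) * h
    return (b"\x89PNG\r\n\x1a\n"
            + chunk(b"IHDR", ihdr)
            + chunk(b"IDAT", zlib.compress(raw, 9))
            + chunk(b"IEND", b""))

png = solid_black_png(1024, 1024)  # ~3 MB of raw pixel data
gz = gzip.compress(png, 9)         # gzip squeezes the PNG a second time
print(len(png), len(gz))
```

The raw pixel data is a few megabytes, the PNG is a few kilobytes, and because the DEFLATE stream inside the IDAT chunk is itself a repeating pattern, the gzipped PNG comes out smaller still.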
I got slightly better results even by doing this with a JPG image, probably because it's based on 8x8 blocks. I used the colour red, but I don't think that matters much.
Correction: looking back at my results, it seems the PNG was smaller after all: png32512.png.gz is 36,077 bytes (a 32000x32000 JPG gzips to about 41k). I forget how I arrived at the 32512x32512 limit; maybe it was by trial & error, the largest size a browser would still open (probably tested on Firefox and Opera, I didn't use Chrome at the time).
I also asked some friends with powerful (lots of memory) computers to try out a webpage that loaded this image many times, with unique GET parameters to prevent caching. But apart from lots of hard disk access and a maxed-out CPU until they closed the tab, nothing crashy happened (and of course I informed them beforehand of what could happen and told them to save any work).
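A minimal sketch of such a test page. The repeat count of 50 is made up for illustration, and the image name just follows the one mentioned above; the point is that the unique query string on each URL defeats the browser cache, so every copy has to be fetched and decoded separately:

```python
# Generate a page that loads the same image many times, cache-busted.
imgs = "\n".join(
    f'<img src="png32512.png?nocache={i}">'  # unique GET parameter per copy
    for i in range(50)
)
page = f"<!DOCTYPE html>\n<html><body>\n{imgs}\n</body></html>"
print(page)
```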
Reliably crashing a browser on a sufficiently high-end (say, gaming) PC is something I haven't managed in at least five years or so. I might have done better if I owned a high-end computer myself, of course :) I remember it used to be as easy as making a webpage with 200 full-page DIV layers stacked at 1% opacity :-P