Because we're human, there are other factors that appear to be 'speed' but aren't. A lot of it comes down to perception...
If a page replies instantly and keeps rendering as data arrives, it may *appear* faster. However, a page that fully downloads first, THEN renders, actually finishes sooner.
So if you have a rather large page, you may want to send it out in increments with the PHP "flush()" command or the CFML "<cfflush>" tag.
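To make the idea concrete, here's a rough Python sketch of the same incremental-flush technique (PHP's "flush()" / CFML's "&lt;cfflush&gt;"). The chunk strings and the sleep are just stand-ins for real page-building work; this isn't a real web handler, just an illustration of "send a piece, flush, build the next piece":

```python
import sys
import time

def render_page(chunks):
    """Send a large page out in increments, flushing after each chunk.

    A rough analog of PHP's flush() / CFML's <cfflush>: the client starts
    receiving (and rendering) output before the whole page is finished.
    """
    for chunk in chunks:
        sys.stdout.write(chunk)
        sys.stdout.flush()  # push this piece toward the client right away
        time.sleep(0.01)    # stand-in for the work of building the next piece

render_page(["<html><body>", "<p>part one</p>", "<p>part two</p>", "</body></html>"])
```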
So why does this actually take longer?
* You're forcing the parser to stop what it's doing to send out parts.
* The web server must send out the parts.
* Your computer must get the parts.
* Your browser re-renders the page with every new part.
... in other words, you're wasting a lot of CPU all over the dang place. But it's worth it - because we are serving to humans, not stopwatches.
Other factors include complex tables, opacity, image types, image compression ratios, and of course, file sizes.
A long time ago, it was a tough game. You'd compress your images heavily so they'd be smaller and transmit faster. But you had to think of the slow computers back then: it might take longer to decompress the image and draw it to screen than it would to simply download a larger, less-compressed one!
Times have changed... CPUs are monsters compared to back then.
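You can see the same smaller-payload-vs-more-CPU tradeoff with any compressor. Here's a small Python sketch using the stdlib `zlib` module as a stand-in for the old image-compression game: a higher compression level spends more CPU to produce a smaller payload. (The sample data is made up; real images behave differently, but the tradeoff is the same shape.)

```python
import zlib

# Made-up, highly repetitive sample data -- just something compressible.
data = b"the quick brown fox jumps over the lazy dog " * 2000

# Higher compression levels trade CPU time for smaller output --
# the same tradeoff the old image-compression game was about.
fast = zlib.compress(data, level=1)   # quick to compress, bigger result
tight = zlib.compress(data, level=9)  # slower to compress, smaller result

print(len(data), len(fast), len(tight))
```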
Complex tables take more CPU to render. So does a pile of DIVs with mucho CSS. Tableless designs are not automatically the answer.
Using opacity in CSS or IE's alpha filter takes a performance hit as well. Looks cool, though.
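For reference, the two syntaxes in question look like this (class names here are made up for illustration):

```css
/* Standard CSS opacity -- the element gets composited translucently,
   which costs extra CPU at render time */
.faded {
    opacity: 0.5;
}

/* Old IE's proprietary equivalent: the alpha filter */
.faded-ie {
    filter: alpha(opacity=50);
}
```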
Some image types take longer to render than others. GIFs are the quickest. The difference probably isn't perceptible to the naked eye, but if you have lots of images on a page it might be a consideration.