There are plenty of things you can measure, and a huge amount of detail you can go into. Historically, the aim was to keep a page under 100kB, which equated to a rough download time of 30 seconds on a 28.8kbps dial-up connection.
That 100kB limit is still a great rule of thumb. Yes, broadband is commonplace now, and yes, many pages are substantially over it. But dial-up is still in use, mobile phones are being used more often (and are not as fast as desktop browsers), and people usually get broadband so they can get the same content faster, not so publishers can stuff pages with more pictures. Another important factor is that Google now uses page speed in its search ranking algorithm.
We needed a set of metrics that went beyond the simple 100kB limit, took into account the effect of caching on subsequent page loads, and showed us where the bottlenecks were. One number just wasn't enough, so we worked through a long list of options and settled on the following measures of site performance:
Total page size (kilobytes) for homepage
(Assets included, uncached.) This indicates the volume of data a first-time visitor must download, and gives you a rough idea of how long a first page view landing on your homepage will take.
You can use a variety of sources for this, but I quite like the Web Page Analyzer - it gives you the total page size, and also breaks down what's being loaded by category (images, scripts, stylesheets etc.).
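If you just want a quick number from the command line, curl can report the size of a single response. Note this covers the HTML document only, not the images, scripts and stylesheets it pulls in, so it understates the full page weight (www.example.com is a placeholder - swap in your own homepage):

```shell
# %{size_download} is the number of bytes curl actually received
# for this one response.
curl -s -o /dev/null -w '%{size_download} bytes\n' 'https://www.example.com/'
```

To approximate the full figure you would repeat this for each asset the page references and sum the results, which is exactly the tedium the tools above save you from.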
Total page size (kilobytes) for a product page
The easiest way I found to measure this was the "Net" panel in Firefox's Firebug extension. Clear your cache, load the website homepage (to cache the shared assets), and then load a product page - you will see the total page size at the bottom of the Net panel.
Page load time (seconds) for homepage HTML
(Uncached.) Unlike the others, this metric gives you an idea of how quickly the web server can build and send out a page. It can also be measured with Firebug, and should be sampled several times, at different times of day, to ensure a reasonably accurate value.
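You can also sample this from the command line with curl's timing variables - time_starttransfer approximates the server's page-build time plus network latency, while time_total includes the download itself (www.example.com is again a placeholder):

```shell
# Take three samples; repeat at different times of day for a fairer picture.
for run in 1 2 3; do
  curl -s -o /dev/null \
       -w 'first byte: %{time_starttransfer}s  total: %{time_total}s\n' \
       'https://www.example.com/'
done
```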
Together, these metrics give us an idea of how a page will load for first-time visitors - a hugely important factor in any ecommerce shop - as well as for subsequent page views. They identify whether slowness is caused by the server or by the volume of data, and so help show where improvements might have the greatest impact.
And what can you do if your page is too slow?
Reduce the number of files
It seems obvious, but it's not always easy to do. Remove files where possible and concatenate the rest. If you can, ensure that files which are not needed on every page are not loaded on every page. Every extra connection a browser needs to make to your server increases the load and the time it takes for your site to appear.
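Concatenation can be as simple as joining the files during your build step - the filenames here are made up for illustration:

```shell
# Two hypothetical scripts that would otherwise be two HTTP requests.
printf 'var carousel = {};\n' > carousel.js
printf 'var checkout = {};\n' > checkout.js

# Join them into one file so the browser makes a single request.
cat carousel.js checkout.js > combined.js
wc -c carousel.js checkout.js combined.js
```

The same idea applies to stylesheets - just make sure the concatenation order preserves any dependencies between the files.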
Use a CDN
Using jQuery? Great, but get Google to serve it for you - if your user has visited another site that uses Google's content delivery network, jQuery will already be cached. Even if not, Google's servers are dead fast, and it lightens the load on your own server. (Link will follow, but the Google CDN introduction appears to be down at the time of writing.)
Cache everything you can
A dynamic site can do lots and lots of repeated work. You can cache almost everything on a website, from includes to database query results to full web pages. I've written a guide to caching output in PHP.
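The underlying idea is the same in any language: if a stored copy of the output is fresh enough, serve it instead of rebuilding. A minimal sketch in shell (build_page is a hypothetical stand-in for the expensive page generation, and GNU stat is assumed):

```shell
build_page() {                        # stand-in for expensive page generation
  echo "<html>expensive page</html>"
}

CACHE=/tmp/page.cache
TTL=300                               # cache lifetime in seconds

now=$(date +%s)
if [ -f "$CACHE" ] && [ $(( now - $(stat -c %Y "$CACHE") )) -lt "$TTL" ]; then
  cat "$CACHE"                        # cache hit: serve the stored copy
else
  build_page > "$CACHE"               # cache miss: regenerate and store
  cat "$CACHE"
fi
```

Picking the right TTL is the trade-off: longer means less server work but staler content.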
Enable GZip compression
GZip is a simple method of compressing output before sending it to the user. It's built into most servers, so just switch it on for an instant speed boost.
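HTML compresses very well because markup is repetitive. You can get a feel for the saving by gzipping a locally generated sample:

```shell
# Generate a repetitive HTML fragment, as product listings tend to be.
for i in $(seq 1 200); do
  echo '<li class="product"><a href="/item">Product name</a></li>'
done > sample.html

gzip -c sample.html > sample.html.gz   # -c writes to stdout, keeps original

echo "uncompressed: $(wc -c < sample.html) bytes"
echo "compressed:   $(wc -c < sample.html.gz) bytes"
```

On a real server you would enable compression in configuration (mod_deflate on Apache, the gzip directive on nginx) rather than compressing files by hand.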
Want to know more? Try these resources: