Prevent Page Caching in Browsers - Surprisingly, there isn't an easy, 100% reliable method, and success depends a bit on the type of content you are targeting (Flash, video, images, text). Here are some approaches to try...
<META HTTP-EQUIV="expires" CONTENT="0">
<META HTTP-EQUIV="pragma" CONTENT="NO-CACHE">
The second line is for Internet Explorer (IE), which doesn't always recognize the code/tag in the first line.
However, this approach doesn't always work by itself. To improve your chances, put a copy of the second line at the bottom of your content, just before the closing </BODY> tag:
<META HTTP-EQUIV="PRAGMA" CONTENT="NO-CACHE"> </BODY>
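Putting it together, here is a minimal page sketch showing where the tags go (placement only; the title and comments are mine, the META tags are the ones shown above):

```html
<HTML>
<HEAD>
<TITLE>No-cache example</TITLE>
<!-- Both tags go in the HEAD; the second is mainly for IE -->
<META HTTP-EQUIV="expires" CONTENT="0">
<META HTTP-EQUIV="pragma" CONTENT="NO-CACHE">
</HEAD>
<BODY>
<!-- page content here -->
<!-- Repeat the pragma tag just before the closing BODY tag -->
<META HTTP-EQUIV="PRAGMA" CONTENT="NO-CACHE">
</BODY>
</HTML>
```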
Even with all this, you may not get the anti-caching behavior you seek.
To be 100% guaranteed of not serving cached content, you need to create a unique URL each time the page is served. You can do this by programmatically appending a "?" character (or "&" if the URL already has a query string) to the end of your URL, followed by any random string of characters.
The random query string does not affect the target page--the same page will be served in each URL example above. The server simply treats it as a query-string parameter and, for a static page, ignores it. But to the browser each URL is unique, so the page will not be rendered from the browser cache. Try clicking the URLs above to confirm.
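This cache-busting step can be sketched in client-side JavaScript. A minimal helper, assuming a plain browser or Node environment; the function name cacheBust, the nocache parameter name, and the timestamp-plus-random suffix are my choices, not from the original article:

```javascript
// Append a unique, meaningless query string so the browser never
// finds the URL in its cache.
function cacheBust(url) {
  // Use "?" for the first parameter, "&" if a query string already exists.
  const separator = url.includes("?") ? "&" : "?";
  // A timestamp plus a random suffix is effectively unique per request.
  return url + separator + "nocache=" + Date.now().toString(36) +
         Math.random().toString(36).slice(2);
}

// Each call yields a URL the browser has not cached before.
console.log(cacheBust("https://example.com/page.html"));
console.log(cacheBust("https://example.com/page.html?id=7"));
```

The server still serves the same page; only the browser's cache lookup is defeated.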
The question I have is whether that random string affects search engine indexing, so that each URL is also considered unique. I don't think so, but it's something to test.
Source - Prevent Pages From Caching