automated caching?


Recommended Posts

OK, given how dynamic websites are, I want to set up a program/script to automatically run through a list of pre-determined websites so my proxy can cache them before the work day starts. I have the timing mechanisms in place already; I just need the right script to run on Linux that opens up everything on a page (images, styles, etc.) so it gets cached as a normal visit would. Any suggestions? Using a Linux server, btw.


Dude, why don't you just get a faster internet connection? ;)  Caching the internet is dead and pointless since everything is dynamic these days. We have gone over this ;)

 

A wget with link-following should be all that is needed for the proxy to cache whatever it was asked to get.
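A minimal sketch of that idea, assuming squid is listening on 127.0.0.1:3128 and using a placeholder site list (both are assumptions to adjust); it prints each command as a dry run so the list and flags can be checked before anything is actually fetched:

```shell
#!/bin/sh
# Dry-run sketch of warming a caching proxy with wget.
# Assumptions: squid listens on 127.0.0.1:3128; the site list below is a
# placeholder. Drop the leading `echo` to actually run the fetches.

PROXY="http://127.0.0.1:3128"
SITES="http://example.com http://example.org"

for url in $SITES; do
    # -p pulls page requisites (images, stylesheets, scripts);
    # --delete-after discards the local copy, since the proxy's cached
    # copy is the only one we care about; -q keeps cron mail quiet.
    echo http_proxy="$PROXY" wget -q -p --delete-after "$url"
done
```

Run it from cron shortly before the work day; each request passes through the proxy, so squid holds the page and its requisites for the first real visitor.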


You don't know what your users are going to browse each day; it could vary. It's pointless caching pages they may not use. If you use something like Bloxx as your proxy, for example, it would cache the pages users actually visit.



I told you, budman ;) you were pretty close with the desert, ya know :P It's impossible to get a bigger pipe; this is just something I have to do.

 

OK, I'll look into writing a fancy wget script. I don't want it to go too deep, or only one site will get done :P
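To keep the recursion from going too deep, wget's own limits can do the fencing. A sketch, where the depth, quota, and timeout values are purely assumed starting points to tune, not recommendations:

```shell
#!/bin/sh
# Sketch of a depth-capped recursive pre-fetch, so one big site can't eat
# the whole pre-cache window. All flag values here are assumptions to tune.

build_cmd() {
    # -r -l 2        recurse, but no deeper than two link levels
    # -np            never ascend to the parent directory
    # -p             still grab each page's images and stylesheets
    # -Q 50m         stop after roughly 50 MB for this site
    # -T 10 -t 2     10-second timeouts, two retries, so one dead site
    #                does not stall the whole run
    # --delete-after discard local copies; the proxy cache is the point
    echo wget -q -r -l 2 -np -p -Q 50m -T 10 -t 2 --delete-after "$1"
}

# Dry run: print the command instead of fetching. Remove the `echo` inside
# build_cmd (and export http_proxy to point at squid) to run it for real.
build_cmd "http://example.com"
```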

 

Why would you want to do this? All it could possibly help with is the initial load of the website; as soon as that's done, the next user would read it from the cache.

UX 



Yeah, I know; it's done on analytics, so it's more of an educated guess. It would be silly to load every web page, or random ones. I work in a niche market, really, really niche, so whatever I can do to improve UX is great. Squid caches the sites people visit; I just want a bit of a preemptive strike :)

 

All in all, I have justifications for why I need to do this, and I wish I could share them, because then everyone would understand :P


I don't buy the statement that you can't get a fatter pipe; it doesn't matter where you are on the planet. It might not be cheap, but fatter is always available if you're willing to pay the cost ;)

So from before, you said you had 2 Mbps/512 Kbps; can you not just get more of those and then load-share across them? You might not be able to get faster pipes, but you can always bundle more together to get fatter ;)

Is cell available in the middle of your BFE Hell?

I am with Decrypt on this: the first person hitting a site will cache it for the rest of the users. Preloading "might" save the first user some time, but it's more likely to just waste your bandwidth if nobody hits that site that day.



No cell, and no room for another kind of connection. The max I can sort out is 4 Mb/1 Mb, but that's it; honestly, there is no higher bandwidth possible.



If it wastes the bandwidth, that's fine; we have unlimited data, just limited bandwidth, and it will not be used by anyone or anything during a small window. That window is the time to do whatever I can, even if it does little, to try and improve UX. Right now, though, I need to use Squid to build an analytics setup to see what sites people visit. Should be fun... to Google! (Calamaris looks like a possibility.)
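Before reaching for a full analyzer like Calamaris, a rough "top visited hosts" list can be pulled straight out of squid's access.log. This sketch assumes squid's default native log format (request URL in field 7) and the common log path; both are assumptions to check against your squid.conf:

```shell
#!/bin/sh
# Sketch: count the most-visited hosts in squid's access.log to seed the
# pre-cache list. Assumes squid's default "native" log format (request URL
# in field 7) and the usual log path; verify both for your install.

LOG="/var/log/squid/access.log"

top_hosts() {
    # Strip the scheme and the path from each logged URL, leaving the
    # hostname, then count occurrences and rank the top 20.
    awk '{ u = $7; sub(/^[a-z]+:\/\//, "", u); sub(/\/.*/, "", u); print u }' "$1" \
        | sort | uniq -c | sort -rn | head -20
}

# Only run if the log is actually readable on this machine.
if [ -r "$LOG" ]; then
    top_hosts "$LOG"
fi
```

The ranked output maps directly onto the pre-cache site list: feed the top hosts into the morning wget run.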


Satellite internet is provided in the Sahara desert at up to 60 Mb/s... I don't know what desert you are in, but there are providers that offer faster service.

http://www.skycasters.com/broadband-satellite-compare/compare.html

*throws on the MIB shades* If only that were possible where I am, son, but trust me, REALLY trust me... it is not.


Even if you were 2 miles deep into the earth, it would be possible. Hell, we can get communications to a sub, to a certain depth anyway. :shiftyninja:

You are getting closer to my situation xD :P


This topic is now closed to further replies.