App to suck info out of website?

Owen (Guest)
Hi there,

Does anyone know of an application that can 'suck' all the data out of a web site so it can be used offline at a later date? There's a site with hundreds of articles on it that I'd like to store as an offline library. All the articles are separate links (not downloadable PDFs) and, as I said, there are literally hundreds of them.
 
Joined Oct 30, 2004 · San Antonio, Texas · PowerMac G4 Cube 450MHz, 832MB
The easiest way is to use a great browser called Firefox. In Firefox, Control-click or right-click the page and choose Save Page As; in the options, make sure it says "Web Page, complete". This gives you any and all audio, video, images, etc.
 
Owen (Guest, OP)
Thanks for that.

The problem is I'd need to save hundreds of pages manually that way, as there may be 50+ articles on each contents page (of which there are many). Do you know of a way to automatically save each of these linked articles as well (basically, the whole site)?
 
Joined Jun 11, 2003 · Mount Vernon, WA · MacBook Pro 2.6 GHz Core 2 Duo, 4GB RAM, OS 10.5.2
If you have Tiger, try Automator — I hear it should be able to do that. Or use SiteSucker, which works really well for me. You can find it on http://www.macupdate.com/
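If you're comfortable with a script instead of an app, the same job (follow every same-site link from a contents page and save each article) can be sketched in Python using only the standard library. The start URL, output folder, and page limit below are placeholders — point them at the real site. This is a minimal sketch, not a polished mirroring tool like SiteSucker:

```python
# Minimal site crawler: saves the start page, then follows links on the
# same host breadth-first, saving each page it visits.
# NOTE: "example.com", out_dir, and max_pages are illustrative placeholders.
import os
import urllib.parse
import urllib.request
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html, base_url):
    """Return absolute URLs for every link on the page."""
    parser = LinkParser()
    parser.feed(html)
    return [urllib.parse.urljoin(base_url, href) for href in parser.links]

def crawl(start_url, out_dir="site_copy", max_pages=500):
    """Breadth-first crawl, restricted to the start page's host."""
    host = urllib.parse.urlparse(start_url).netloc
    os.makedirs(out_dir, exist_ok=True)
    seen, queue = set(), [start_url]
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip pages that fail to download
        # URL-quote the whole address so it is a safe flat filename
        fname = urllib.parse.quote(url, safe="") + ".html"
        with open(os.path.join(out_dir, fname), "w", encoding="utf-8") as f:
            f.write(html)
        for link in extract_links(html, url):
            if urllib.parse.urlparse(link).netloc == host:
                queue.append(link)

if __name__ == "__main__":
    crawl("http://example.com/articles/index.html")
```

Saved pages keep their HTML but not images or stylesheets; for a complete offline copy with assets rewritten to local paths, a dedicated tool like SiteSucker is still the easier route.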
 
