Thread: App to suck info out of website?
05-01-2005, 07:47 AM #1
Owen (Guest)
Does anyone know of an application that can 'suck' all the data out of a web site so it can be used offline at a later date? There's a site with hundreds of articles on it that I'd like to store as an offline library. All the articles are separate links (not downloadable PDFs) and, as I said, there are literally hundreds of them.
05-01-2005, 08:51 AM #2
- Member Since
- Oct 30, 2004
- San Antonio, Texas
- PowerMac G4 Cube 450 MHz, 832 MB
The easiest way is to use a great browser called Firefox. In Firefox, Control-click or right-click the page > Save Page As, and in the options make sure it says "Web Page, complete". This saves any and all audio, video, images, etc.
05-01-2005, 10:38 AM #3
Owen (Guest)
Thanks for that.
The problem is that I'd need to save hundreds of pages manually that way, as there may be 50+ articles on each contents page (of which there are many). Do you know of a way to automatically save each of these linked articles as well (basically, the whole site)?
05-01-2005, 12:52 PM #4
- Member Since
- Jun 11, 2003
- Mount Vernon, WA
- MacBook Pro 2.6 GHz Core 2 Duo, 4 GB RAM, OS X 10.5.2
If you have Tiger, try Automator; I hear it should be able to do that. Or use SiteSucker, which works really well for me. You can find it on http://www.macupdate.com/
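For anyone curious what a site-ripper like SiteSucker actually does under the hood, here is a minimal Python sketch of the first step: parse a contents page and collect the article links so each one can be fetched and saved. The URL and HTML used here are hypothetical placeholders, and the actual download step (which needs network access) is only described in a comment, not implemented.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collect href targets from <a> tags on a contents page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the contents page URL
                    self.links.append(urljoin(self.base_url, value))

def collect_article_links(html, base_url):
    """Return absolute URLs of every link found on a contents page."""
    parser = LinkCollector(base_url)
    parser.feed(html)
    return parser.links

# A real ripper would then loop over these links and save each page,
# e.g. with urllib.request.urlretrieve(url, filename), and recurse
# into further contents pages the same way.
```

Tools like SiteSucker and GNU wget's recursive mode automate exactly this loop, plus rewriting the saved links so the pages work offline.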