Website archiving software

Status
Not open for further replies.

kl_ravi

Journeyman
Hi Friends,

Can you suggest a few website archiving programs that can archive entire websites? :)

Note: I use WebZip, but I want to find some other software that does the job well. Please suggest only software you have tried yourself.
 

koolbluez

Šupər♂ - 超人
WebReaper

HTTrack Website Copier

Teleport Pro

BlackWidow

Website Extractor

The first two are free... the rest, you know what...
 

Saharika

In the zone
Well, website copying is an old story by now. I recently came across software that could copy only the pictures from a site. I know some web copiers with the appropriate filters can do that too, but this one is quite nice. Anyway, I hope even newer technology will come along soon.
 

ctrl_alt_del

A Year Closer To Heaven
Can anyone tell me how long it takes to archive a decent-sized website?

I had been trying to copy the official "Band of Brothers" website with BlackWidow, but it was taking ages. Can anyone tell me how long it took them to copy a similarly sized website?
 

Saharika

In the zone
ctrl_alt_del said:
Can anyone tell me how long it takes to archive a decent-sized website?

I had been trying to copy the official "Band of Brothers" website with BlackWidow, but it was taking ages. Can anyone tell me how long it took them to copy a similarly sized website?
It depends. I once asked the same question when I was trying to download a section of the Digit site.
The software WebCopier used to show how many files were pending and how many there were in total. If it had already downloaded some 500 files, the total could climb to maybe 10,000 files still remaining, so that count can sometimes help you estimate.
But I mostly gave up on the idea of downloading the site after 3 days of downloading. :eek:
Anyway, try your luck.
 

ctrl_alt_del

A Year Closer To Heaven
@Saharika: The same was happening with me. I thought the damn program was trying to download the whole net onto my HDD! :lol:
 

khattam_

Fresh Stock Since 2005
MetaProducts Offline Explorer is a very good program for this. It is fast, and its Enterprise edition lets you convert a downloaded website into a single CHM file, which is compressed and portable.
 

enoonmai

Cyborg Agent
@cody: You can always specify the "depth" of the site to leech, i.e. the number of links the parser is allowed to follow and download. Usually you want to keep the depth below 6 links from any single page, so that it doesn't go off and literally try to download every single HTML file it can reference.
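None of these tools publish their internals, but the depth limit described above can be sketched in plain Python. This is only an illustration, not any real archiver's code: the `crawl` helper, the `fetch` callable, and the toy in-memory "site" are all made up for the example.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags on one page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(fetch, start, max_depth):
    """Depth-limited crawl: visit pages at most max_depth hops from start.

    fetch is any callable mapping a URL to its HTML (a real archiver
    would do an HTTP download here). Returns the set of URLs visited.
    """
    seen = set()
    frontier = [start]
    for depth in range(max_depth + 1):
        next_frontier = []
        for url in frontier:
            if url in seen:
                continue
            seen.add(url)
            parser = LinkExtractor()
            parser.feed(fetch(url))
            next_frontier.extend(parser.links)
        # Links found at this depth are only followed on the next pass,
        # so nothing beyond max_depth hops is ever fetched.
        frontier = next_frontier
    return seen

# Toy in-memory "site" standing in for real HTTP fetches.
site = {
    "/": '<a href="/a">A</a><a href="/b">B</a>',
    "/a": '<a href="/deep">deep</a>',
    "/b": '',
    "/deep": '<a href="/deeper">deeper</a>',
    "/deeper": '',
}

visited = crawl(lambda u: site.get(u, ""), "/", max_depth=1)
print(sorted(visited))  # "/deep" is 2 hops from "/", so it is not fetched
```

With `max_depth=1` only `/`, `/a`, and `/b` are fetched; raising the depth pulls in `/deep` and `/deeper`, which is exactly why an unbounded depth can balloon into thousands of pending files.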
 