Download Static Version of Website With Content
- dosa
- Thread is marked as Resolved.
-
Hi team. Does anyone know how to download a STATIC copy of a given Runway website? I'd like to have a local copy of all the pages that won't need the database or any live connections, for viewing locally for reference/archiving. How can this be done?
-
Something like Wget? https://en.wikipedia.org/wiki/Wget
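For reference, a typical Wget mirroring command looks something like this (the URL below is just a placeholder - --mirror recurses through the whole site, --convert-links rewrites links so the pages work offline, --page-requisites pulls in images/CSS/JS, and --adjust-extension saves pages with .html extensions):

    wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://www.example.com/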
Otherwise, for smaller sites, or for an example set of pages, I've done it manually.
-
Thanks Clive Walker - Wget looks pretty "demo" style. I'll hunt around for something more robust and user-friendly; I'm not sure I want to learn terminal commands just to make a static backup. I'm hoping there's a 'crawl'-style tool that can save it, since the site has hundreds of pages.
So what's your manual method? I imagine all that's missing is the actual content, so is it just a case of saving pages directly from the browser into the right directory, or is there a smoother way?
-
File -> Save Page As -> Web Page, Complete is what I would normally do.
-
Right. Times 1000. I wonder if there is a browser plug-in that can crawl directories and batch-save them...
Thanks Clive. Cheers!
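As a rough illustration of what a crawl-and-save tool does under the hood, here is a minimal sketch using only the Python standard library. The start URL and output folder are placeholders, and unlike a dedicated tool it doesn't download images/CSS or rewrite links - it just walks same-domain links and saves each page as a local .html file:

    # Minimal crawl-and-save sketch (Python 3, standard library only).
    # START_URL and OUT_DIR are placeholders - point them at the site you
    # want to archive. It follows same-domain links only; it does NOT
    # fetch CSS/images or rewrite links the way SiteSucker or
    # wget --mirror would.
    import os
    import urllib.parse
    import urllib.request
    from html.parser import HTMLParser

    START_URL = "https://www.example.com/"   # placeholder homepage
    OUT_DIR = "site-archive"                 # placeholder output folder

    class LinkParser(HTMLParser):
        """Collects href values from <a> tags on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def local_path(url):
        """Map a page URL to a file path inside OUT_DIR."""
        path = urllib.parse.urlparse(url).path
        if path == "" or path.endswith("/"):
            path += "index.html"
        elif not path.endswith(".html"):
            path += ".html"
        return os.path.join(OUT_DIR, path.lstrip("/"))

    def crawl(start_url):
        domain = urllib.parse.urlparse(start_url).netloc
        queue, seen = [start_url], set()
        while queue:
            url = queue.pop()
            if url in seen:
                continue
            seen.add(url)
            try:
                with urllib.request.urlopen(url) as resp:
                    html = resp.read().decode("utf-8", errors="replace")
            except Exception as exc:
                print("skipped", url, exc)
                continue
            dest = local_path(url)
            os.makedirs(os.path.dirname(dest), exist_ok=True)
            with open(dest, "w", encoding="utf-8") as fh:
                fh.write(html)
            print("saved", url, "->", dest)
            # Queue any links that stay on the same domain.
            parser = LinkParser()
            parser.feed(html)
            for href in parser.links:
                absolute = urllib.parse.urljoin(url, href)
                if urllib.parse.urlparse(absolute).netloc == domain:
                    queue.append(absolute.split("#")[0])

    if __name__ == "__main__":
        crawl(START_URL)

In practice a GUI tool like SiteSucker (mentioned below) or wget's --mirror mode handles all of this, plus the asset handling and link rewriting, for you.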
-
Something like SiteSucker may work for you.
-
+1 for SiteSucker - I've used it on a Perch site previously and it worked pretty well.
And a link checker like Integrity really helps to check that everything has gone well.
-
+1 for SiteSucker