Want to Automatically Save HTML/MAFF of Every Page I Visit with Depth 1

Started by commissarmo, June 26, 2015, 01:33:46 AM


commissarmo

Hello,

I've been searching FOREVER for a way to automatically save the HTML of every page I visit across multiple browsers. 

The Firefox Shelve add-on does much of what I want: it saves an HTML copy of every page I visit in Firefox to a folder, at depth 1, without following links (though I might also like saving pages up to one link out).

But of course it only works in Firefox.  Further, I would like to use MAFF, a single-file web page archive format, if possible.
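For reference, my understanding of MAFF (I may have details wrong) is that it is simply a ZIP archive containing one folder per saved page, each folder holding the page's files plus an index.rdf metadata file. Here is a minimal Python sketch of wrapping an already-saved HTML file that way; all names, the folder name "0001", and the RDF layout are my own best guess, purely for illustration:

```python
# Sketch only: wrap a saved HTML file into a .maff archive, assuming
# MAFF = ZIP containing <folder>/index.html plus <folder>/index.rdf.
import time
import zipfile

RDF_TEMPLATE = """<?xml version="1.0"?>
<RDF:RDF xmlns:MAF="http://maf.mozdev.org/metadata/rdf#"
         xmlns:RDF="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
  <RDF:Description RDF:about="urn:root">
    <MAF:originalurl RDF:resource="{url}"/>
    <MAF:title RDF:resource="{title}"/>
    <MAF:archivetime RDF:resource="{when}"/>
    <MAF:indexfilename RDF:resource="index.html"/>
  </RDF:Description>
</RDF:RDF>
"""


def make_maff(html_path, url, title, maff_path):
    # Record the archive time in the usual HTTP date style.
    when = time.strftime("%a, %d %b %Y %H:%M:%S GMT", time.gmtime())
    with zipfile.ZipFile(maff_path, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(html_path, "0001/index.html")
        zf.writestr("0001/index.rdf",
                    RDF_TEMPLATE.format(url=url, title=title, when=when))


make_maff("page.html", "https://www.reuters.com/", "Reuters", "page.maff")
```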

1.  I just discovered WebCopy and it seems very sophisticated.  I'm wondering if it can be set up to do what I want.

2.  Can it run in the background and auto-save every webpage I visit (it's OK if it can only do HTML format), regardless of browser?

3.  I assume there is a way to tell it to save only the exact webpage visited (so, depth 1)?  I couldn't figure out how to do this.  I tried to save www.reuters.com and it started saving the entire website!  (A rough sketch of the behaviour I mean is below, after these questions.)

4.  Is there a way to automate it?

5.  If not, I have investigated the possibility of getting a list of every URL visited on my computer and then feeding it to something like WebCopy.  Getting this URL list has proven very difficult, though; apparently I need a logging web proxy, and I haven't figured that out yet.

6.  Can WebCopy somehow be configured to solve my problem?
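To make points 2 and 3 concrete, here is a rough Python sketch of the behaviour I'm after. It has nothing to do with WebCopy and all the names are mine; it saves the exact page visited and, optionally, each page one link out, but never crawls deeper than that:

```python
# Illustrative "depth 1" saver: save the visited page, optionally each
# page one link out, and stop there. Standard library only.
import os
import re
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkParser(HTMLParser):
    """Collects the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def save_page(url, folder):
    """Fetch one URL, write its HTML to folder, and return the HTML."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    # Build a crude filename from the URL so pages don't overwrite each other.
    name = re.sub(r"[^A-Za-z0-9]+", "_", urlparse(url).netloc + urlparse(url).path)
    with open(os.path.join(folder, name + ".html"), "w", encoding="utf-8") as f:
        f.write(html)
    return html


def save_with_depth_one(url, folder, follow_links=False):
    """Save the exact page visited; if follow_links, also save each page
    one link out -- and never recurse any further."""
    os.makedirs(folder, exist_ok=True)
    html = save_page(url, folder)
    if follow_links:
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            try:
                save_page(urljoin(url, href), folder)  # depth 1: no recursion
            except Exception:
                pass  # a broken link shouldn't stop the archive


save_with_depth_one("https://www.reuters.com/", "archive", follow_links=False)
```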

Richard Moss

Hello,

Thanks for your interest in WebCopy, although I'm afraid that WebCopy is never going to be able to do what you want. The program is designed to copy parts of web sites or entire sites, yes, but it is not designed to follow you as you browse the internet, copying pages as you go.

Might I suggest something like the OneNote Clipper? It can be used to copy web pages into a synchronized notebook, and I understand it works in all browsers, so it may be a good fit for you.

Even though WebCopy can't do what you want, I might as well flesh out the answers to your questions.

1. No, it can't do what you want, nor will it ever be able to: sniffing around a user's browser history is not appropriate to the program's design.

2. No, as per point 1, although I do get the occasional request for a URL watcher that triggers downloads when a URL is copied to the clipboard (which I haven't got around to implementing yet).

3. Not at present. WebCopy does actually support depth limiting internally, but it is hacked into the crawler and not exposed for general copying; exposing it properly is on hold for the major rewrite in v2.

4. Yes, recent builds include a CLI version of the program, although this is limited in functionality at the moment. It will be expanded over time.

5. While this isn't something I can help with directly, you are on the right track. A program I used for monitoring HTTP/S traffic is Fiddler, although I haven't used it since it was bought out by Telerik. If you do want to pursue the logging-proxy route, see the sketch after this list.

6. No. As I said, it's way beyond what WebCopy is designed to do.
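As promised above, here is the kind of thing I mean by a logging proxy for your point 5. This sketch uses mitmproxy, a free scriptable proxy (an alternative to Fiddler); it has nothing to do with WebCopy, and the output file name is just an example:

```python
# log_urls.py -- a minimal mitmproxy addon that appends every page the
# browser requests to a text file. Illustrative sketch only.
from mitmproxy import http

LOG_FILE = "visited-urls.txt"  # example output path


def request(flow: http.HTTPFlow) -> None:
    # Crudely record only top-level page loads (not images, scripts,
    # etc.) by checking the Accept header; adjust to taste.
    accept = flow.request.headers.get("Accept", "")
    if "text/html" in accept:
        with open(LOG_FILE, "a", encoding="utf-8") as f:
            f.write(flow.request.pretty_url + "\n")
```

You would run it with mitmdump -s log_urls.py and point each browser at the proxy (for HTTPS you also need to trust the mitmproxy certificate). The resulting URL list could then be fed to whatever downloader you settle on.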

Sorry I can't help!

Regards,
Richard Moss
Read "Before You Post" before posting (https://forums.cyotek.com/cyotek-webcopy/before-you-post/). Do not send me private messages. Do not expect instant replies.

All responses are hand crafted. No AI involved. Possibly no I either.