Web Capture

Started by Ken on 12/18/2008
Ken 12/18/2008 5:15 am
I know that Surfulater can capture a complete web page, its URL, and page title, as well as give you a field for your own comments. But if I just want to capture the URL and page title and have a field for comments, are there other programs that can easily capture this data and automatically populate the correct fields like Surfulater? I would like to start capturing this data, but I do not necessarily want to capture the web pages themselves. Can Surfulater be made to do this? Or should I be looking at alternate programs? I should also add that I am looking for a program that has a tree-like organizational structure and tagging so I can organize my data. I am trying to expand my PIM tools to my web surfing.

Thanks,

--Ken
Pierre Paul Landry 12/18/2008 1:01 pm
I believe that URp will do it. My own InfoQube does it nicely too, but currently only for Firefox (the IE extension will be completed in Jan 2009).

In both cases, you can copy the page or just link to it.
Alexander Deliyannis 12/18/2008 3:42 pm
From what I gather, what you are looking for (URL, page title, and a field for comments) should be offered by any 'bookmark manager' out there. I personally use LinkStash ( http://www.xrayz.co.uk/linkstash/ ), but there are many more, some working as Firefox add-ins.

Surfulater can also do this (just right-click in the page and select Bookmark this Page), but I would suggest a dedicated program is better. In addition, some bookmark managers offer cross-platform support, as well as web backup/sync.

An additional feature I would suggest you look for is tags; it's easy to accumulate thousands of bookmarks but not so easy to search for or manage them. I find that tags are quick to add when entering a new bookmark, and they can help significantly when searching.

alx

Ken 12/18/2008 4:58 pm
Hi PPL,

I was wondering if this could be done in IQ. Is there a specific tutorial or article that I should look at on the web site that further explains how this is done in IQ? I am hoping that the capture would be easy and somewhat automatic. Also, would it work with the portable version of Firefox?

Thanks,

--Ken
Ken 12/18/2008 5:02 pm
Alexander Deliyannis wrote:
From what I gather, what you are looking for (URL, page title, and a field for comments) should be offered by any 'bookmark manager' out there. I personally use LinkStash ( http://www.xrayz.co.uk/linkstash/ ), but there are many more, some working as Firefox add-ins.

Surfulater can also do this (just right-click in the page and select Bookmark this Page), but I would suggest a dedicated program is better. In addition, some bookmark managers offer cross-platform support, as well as web backup/sync.

An additional feature I would suggest you look for is tags; it's easy to accumulate thousands of bookmarks but not so easy to search for or manage them. I find that tags are quick to add when entering a new bookmark, and they can help significantly when searching.

alx


Thank you for the link, Alexander. I would agree with your observations. I believe that I did mention tagging in my original post. Web capture is a new adventure for me, as I always thought that I could keep my web research findings in my head, but the volume has increased substantially, and bookmarks just do not work well enough for me. I will take a look at LinkStash.

--Ken
Pierre Paul Landry 12/18/2008 5:22 pm
Ken wrote:
Hi PPL,

I was wondering if this could be done in IQ. Is there a specific tutorial or article that I should look at on the web site that further explains how this is done in IQ? I am hoping that the capture would be easy and somewhat automatic. Also, would it work with the portable version of Firefox?

This would be a good start:
http://www.sqlnotes.net/drupal5/index.php?q=node/120

I have not tried it with the portable version of Firefox. If extensions can be installed, I don't see why not. IQ is also portable.
Ken 12/18/2008 6:05 pm
Thank you for the prompt reply and the link, Pierre. I will review the article and see if both IQ and FF Portable can do what I need. Unfortunately, unlike my traditional PIM needs, web capture is an evolving issue for me. So, I expect that everything will be in flux for some period of time while I try to better understand my needs.

--Ken
Ken 12/18/2008 6:06 pm
Alexander,

Do you use LinkStash? Is it stable? It looks like a great program, and it is portable as well.

--Ken
Alexander Deliyannis 12/18/2008 9:27 pm
Yes, I've been using it for many years. It's extremely stable and also keeps 3 rolling backups of each bookmark file that you use. I personally use only one file, which is actually maintained on a virtual drive (Nomadesk) and is thus synced across my computers.

LinkStash development is not very regular (the developer seems to be focused on his other product, ClipCache, at the moment), but the program is mature. In fact, I am always impressed at how fast it runs on my machines; it's as optimised as it gets. It makes one wonder why many more 'modern' applications seem to take ages.

By the way, you did mention tagging in your original post, but I missed it somehow. LinkStash has supported tagging since version 2.0 (I think), and I find it very useful.

alx

Derek Cornish 12/19/2008 7:33 am


Ken wrote:
I know that Surfulater can capture a complete web page, its URL, and page title, as well as give you a field for your own comments. But if I just want to capture the URL and page title and have a field for comments, are there other programs that can easily capture this data and automatically populate the correct fields like Surfulater? I would like to start capturing this data, but I do not necessarily want to capture the web pages themselves. Can Surfulater be made to do this? Or should I be looking at alternate programs? I should also add that I am looking for a program that has a tree-like organizational structure and tagging so I can organize my data. I am trying to expand my PIM tools to my web surfing.


I think that Zoot might be able to do what you want. It's pretty flexible. Why not try asking over on their Yahoo group?

Derek
Cassius 12/19/2008 8:33 pm
Ken wrote:
I know that Surfulater can capture a complete web page, its URL, and page title, as well as give you a field for your own comments. But if I just want to capture the URL and page title and have a field for comments, are there other programs that can easily capture this data and automatically populate the correct fields like Surfulater? I would like to start capturing this data, but I do not necessarily want to capture the web pages themselves. Can Surfulater be made to do this? Or should I be looking at alternate programs? I should also add that I am looking for a program that has a tree-like organizational structure and tagging so I can organize my data. I am trying to expand my PIM tools to my web surfing.


With myBase + WebCollect, you can capture all of a page or just the part you highlight. In either case, the URL appears at the top. There is a two-tab interface: one tab shows the Web page (or the portion captured); the other displays a blank RTF page on which you can write comments, paste graphics, etc.

-c
Ken 12/21/2008 12:55 am
I think that I am going to try LinkStash for now. It seems to do exactly what I think I need. It's portable and very easy to use. And it requires no installation, so I can use it at work. I will give IQ a go, but for now I am going to wait because it is tied to Firefox, while I am using IE at work and trying out Chrome on my netbook. Thanks for the recommendation, Alexander.

--Ken
Alexander Deliyannis 12/21/2008 6:32 am
Glad to have been of help :-)

By the way, regarding Chrome (which I also flirted with for a while), you might want to check the following stories suggesting that it is nothing short of a resource hog, as is Internet Explorer 8:

http://www.infoworld.com/article/08/09/03/36TC-browsers_1.html

and
http://news.softpedia.com/news/Resource-Hogs-Google-Chrome-and-IE8-Beta-2-Compared-to-Firefox-3-0-1-92927.shtml

Interestingly, this was far from apparent during my own experience, because Chrome creates a multitude of processes, making it difficult for the Task Manager to 'add up' its total footprint; i.e. it looks lighter than it is.
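Adding up that multi-process footprint by hand is easy enough to script. The sketch below is one way to do it in Python, assuming a Linux-style /proc filesystem (on Windows you would instead sum the chrome.exe rows in Task Manager); it totals the resident memory of every process with a given command name:

```python
import os

def total_rss_kib(name: str) -> int:
    """Sum resident memory (VmRSS, in KiB) across all processes whose
    command name matches `name` -- the by-hand version of what Task
    Manager won't do for a multi-process browser."""
    total = 0
    for pid in os.listdir("/proc"):
        if not pid.isdigit():  # skip non-process entries like /proc/meminfo
            continue
        try:
            with open(f"/proc/{pid}/status") as f:
                fields = dict(line.split(":", 1) for line in f if ":" in line)
        except OSError:  # process exited or is inaccessible; skip it
            continue
        if fields.get("Name", "").strip() == name and "VmRSS" in fields:
            total += int(fields["VmRSS"].split()[0])  # value reads "<n> kB"
    return total

# e.g. total_rss_kib("chrome") while Chrome is running
```

The process name "chrome" is just an example; whatever the browser's processes are called on your system is what you would pass in.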

alx

Michal 12/21/2008 5:05 pm


Ken wrote:
I know that Surfulater can capture a complete web page, its URL, and page title, as well as give you a field for your own comments. But if I just want to capture the URL and page title and have a field for comments, are there other programs that can easily capture this data and automatically populate the correct fields like Surfulater? I would like to start capturing this data, but I do not necessarily want to capture the web pages themselves. Can Surfulater be made to do this? Or should I be looking at alternate programs? I should also add that I am looking for a program that has a tree-like organizational structure and tagging so I can organize my data. I am trying to expand my PIM tools to my web surfing.

Thanks,

--Ken

Hi,

My preferred web clipper is Surfulater. I used Macropool's Web Research for a while, but switched.

As for managing bookmarks, here are a few bookmark managers I have tried:
Link Commander http://www.resortlabs.com/bookmark-manager/linkcommander.php - supposed to be good, but crashed repeatedly under my Vista.
Linkman http://www.outertech.com/index.php?_charisma_page=product&id=5
LinkyCat http://linkycat.com/ - crashed under Vista

LinkStash was the quickest and the most stable, and is therefore the only one still installed on my Vista laptop. I like its stability and simplicity. Too bad it doesn't support favicons (yet), and I wish it had better Firefox 3 integration. But it's the best of the lot, in my humble opinion :)

I'm debating whether to try the following:
Advanced URL Catalog http://www.jordysoft.com/aucatalog/advanced-url-catalog.aspx (um...20 Euro...)
Eluma - freeware! - a sort of combined RSS reader + bookmark manager http://www.eluma.com/site/Default.aspx ... Here's a product demo: http://www.eluma.com/site/Product/ProductDemo.aspx

Michal

Alexander Deliyannis 12/22/2008 9:10 am
Michal, thanks for the heads-up on Eluma. It looks like an interesting tool, especially since it provides synchronisation between computers.

alx

Michal 1/2/2009 9:19 pm
There's another web capture tool, but it seems to be overkill if used only as a bookmark manager:
Check&Get http://activeurls.com/
"Manage and validate bookmarks, Track Changes and Archive Web Pages"...
Some folks at donationcoder.com seem to like it. Here is a recent discussion about bookmark managers: http://www.donationcoder.com/Forums/bb/index.php?topic=15658.0


Daly de Gagne 1/4/2009 6:42 pm
Surfulater does a really good job of doing exactly what you're looking for, Ken. I like Surfulater for bookmarks over other programs because I have a unified collection of both bookmarks and pages, so I don't have to make my way through two separate collections of info when looking for something.

Surfulater's tagging and other sorting mechanisms allow me to easily show all my bookmarks together if that is what I wish, or in the context of their respective topics.

Surfulater is so much more versatile than many people realize.

And it keeps getting better.

As well, Neville is the kind of developer who stays on top of any user concerns or suggestions.

www.surfulater.com

Daly

Ken wrote:
I know that Surfulater can capture a complete web page, its URL, and page title, as well as give you a field for your own comments. But if I just want to capture the URL and page title and have a field for comments, are there other programs that can easily capture this data and automatically populate the correct fields like Surfulater? I would like to start capturing this data, but I do not necessarily want to capture the web pages themselves. Can Surfulater be made to do this? Or should I be looking at alternate programs? I should also add that I am looking for a program that has a tree-like organizational structure and tagging so I can organize my data. I am trying to expand my PIM tools to my web surfing.

Thanks,

--Ken
Ken 1/5/2009 4:32 am
Daly de Gagne wrote:
Surfulater does a really good job of doing exactly what you're looking for, Ken. I like Surfulater for bookmarks over other programs because I have a unified collection of both bookmarks and pages, so I don't have to make my way through two separate collections of info when looking for something.

Surfulater's tagging and other sorting mechanisms allow me to easily show all my bookmarks together if that is what I wish, or in the context of their respective topics.

Surfulater is so much more versatile than many people realize.

And it keeps getting better.

As well, Neville is the kind of developer who stays on top of any user concerns or suggestions.

www.surfulater.com

Daly

Hi Daly,

I looked at Surfulater, and I like both the program and Neville. However, I think that for the time being I am going to try LinkStash, as it is portable. I may eventually get to the point where I want to capture full pages, but I am not there yet. I am also interested in seeing what PPL does with IQ as it matures, but LinkStash just seems like the right program for me now.

--Ken