
And even better, if I needed www.website.com/page=x where x is a variable, can I have a range of pages fetched by changing that variable and collected into a single file?

I guess I need a script, but I'm not sure.

Thanks much!

2007-01-31 05:56:15 · 4 answers · asked by Lobo Mike 2 in Computers & Internet Programming & Design

4 answers

If you want to automate it, get this framework: WatiN. You can create an IE object and use its GoTo() method to navigate to a URL; the IE object has an Html property that holds the document content. Oh, btw, this is a .NET framework, so use C# or VB.net or whatever floats your boat.

2007-01-31 06:08:50 · answer #1 · answered by Pfo 7 · 0 0

HTML can't pull information from Excel, but going the other way is straightforward. If you have a table of data in a single sheet, you can convert it to an HTML file just by saving the workbook as HTML: go to the File menu, click Save As, choose Web Page as the file type, and click Save.

2016-12-03 07:02:53 · answer #2 · answered by ? 4 · 0 0

Try wget or curl; they can grab pages from the command line and save the files you tell them to 'get'.

Curl and wget are Linux tools, but there are Windows versions of them too.

If you're into scripting, like Perl, you could use the LWP package for getting pages and saving them to disk.

Also, if you have ActiveState Perl installed on your Windows box, you can use its 'get' command, just like you would use curl/wget.
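If you want the "range of pages" part from the question, here is a minimal Python sketch along the same lines as the LWP approach. The base URL, page range, and output filename are placeholders taken from the question, not real values:

```python
import urllib.request

def page_url(base, x):
    # Build the per-page URL for a given value of the variable x
    return f"{base}/page={x}"

def save_pages(base, start, end, outfile):
    # Fetch each page in [start, end] and append its HTML
    # to a single output file
    with open(outfile, "w", encoding="utf-8") as out:
        for x in range(start, end + 1):
            with urllib.request.urlopen(page_url(base, x)) as resp:
                out.write(resp.read().decode("utf-8", errors="replace"))

if __name__ == "__main__":
    # Placeholder site and range from the question
    save_pages("http://www.website.com", 1, 10, "pages.html")
```

Adjust the range and the URL pattern to match the real site; some sites use ?page=x as a query string instead of /page=x.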

2007-01-31 06:25:21 · answer #3 · answered by fixedinseattle 4 · 0 0

Yes. Go to the webpage, go to FILE, down to SAVE AS, and then drop down to txt.

2007-01-31 05:59:31 · answer #4 · answered by Toms777 3 · 0 1
