As rudolpho suggested, I download the URL to a file and then read that into a memblock or string and parse it. I've written a few scrapers for game websites to collect data for a database, and this seems the easiest way. The only improvement I'd like is to do away with the `middle-man` of downloading to a file and then loading that file back in; it would be much nicer to have a `download url to memblock` command.
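For what it's worth, the `download straight to memory` idea is trivial in languages with a built-in HTTP library. Here's a minimal Python sketch, purely to illustrate the concept (this is not DBP and not part of any plugin; the function name is my own):

```python
# Illustrative only: fetch a URL straight into memory, no temp file.
# In DBP terms, page_bytes plays the role of the memblock and the
# returned string is what download_url() builds up line by line.
from urllib.request import urlopen

def download_url_to_string(url):
    """Fetch url and return the response body as text, never touching disk."""
    with urlopen(url) as response:
        page_bytes = response.read()  # raw bytes held in memory
    return page_bytes.decode("utf-8", errors="replace")
```

No temp file means no leftover `c:\downloadfile.txt` to clean up and no clash if two downloads run at once.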
The plugin I use is GOGA's tools. Below is the function I use for simple downloads:
function download_url(url$)
   `Download the passed url and return the page contents as a string
   `(returns an empty string if the download fails)
   if file exist("c:\downloadfile.txt") then delete file "c:\downloadfile.txt"
   success = downloadfile(url$, "c:\downloadfile.txt") `try to download the url
   txt$ = ""
   if success > 0 and file exist("c:\downloadfile.txt")
      open datafile to read 1, "c:\downloadfile.txt" `load the downloaded text
      repeat
         txt$ = txt$ + datafile string$(1)
      until datafile end(1)
      close datafile 1
   endif
endfunction txt$
It also needs IanM's Matrix Utils for the datafile commands.
Edit: If you are trying to read the data from BlueGUI's web browser window, then I would like to be able to do that too! I had a quick try but couldn't find any easy way to extract the window's contents.