Thanks for your quick answer.<div><br>However, I am not sure that it answers my need exactly: what I am looking for is a way to do that FROM Scilab (that is, with a Scilab command encapsulated in a script or a function, for instance), and if I have understood your suggestion correctly, your proposal goes through downloading the page as a text file onto my computer.</div>
<div><br></div><div>Eric.</div><div><br></div><div><br><div class="gmail_quote">2012/3/14 Adrien Vogt-Schilb <span dir="ltr"><<a href="mailto:vogt@centre-cired.fr">vogt@centre-cired.fr</a>></span><br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div bgcolor="#FFFFFF" text="#330000"><div><div class="h5">
On 14/03/2012 21:58, Eric Dubois wrote:
<blockquote type="cite">Hello
<div><br>
</div>
<div>Does anyone know how to recover the (text) content of a web
page from Scilab (as is possible with the function download.file
in the R software)?</div>
<div><br>
</div>
<div>Thanks for your answer!</div>
<div>
<br>
</div>
<div>Eric. </div>
</blockquote></div></div>
hi<br>
<br>
if your machine runs on Linux, you can use unix("wget -O myfile.txt
<a href="http://www.url.com" target="_blank">http://www.url.com</a>")<br>
and then read "myfile.txt" from Scilab (e.g. with mgetl)<br>
<br>
(you may want to call deletefile("myfile.txt") once you are done)<br>
<br>
if you are using Windows, you can always install wget for Windows
(<a href="http://gnuwin32.sourceforge.net/packages/wget.htm" target="_blank">http://gnuwin32.sourceforge.net/packages/wget.htm</a>) and use it the
same way<br>
<br>
on Mac I guess you can use wget natively too (otherwise curl, which ships with the system, works the same way with "curl -o myfile.txt").<br>
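<br>
the whole thing can be wrapped in a small Scilab function, something like the sketch below (the name geturl and the use of TMPDIR are my own; it assumes wget is installed and on the PATH):<br>
<br>
<pre>
// geturl: fetch the text content of a web page into a vector of strings
// (hypothetical helper name; assumes wget is available on the PATH)
function txt = geturl(url)
    tmp = fullfile(TMPDIR, "page.txt");     // temporary file in Scilab's temp dir
    unix("wget -q -O " + tmp + " " + url);  // download the page to the temp file
    txt = mgetl(tmp);                       // read it back, one string per line
    deletefile(tmp);                        // clean up the temporary file
endfunction

// usage:
// lines = geturl("http://www.url.com");
</pre>
<br>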
<br>
<br>
hope this helps<span class="HOEnZb"><font color="#888888"><br>
<br>
Adrien Vogt-Schilb<br>
</font></span></div>
</blockquote></div><br></div>