I'm currently using:
$file_contents = file($url);
This takes about 10 seconds for a file with about 20,000 lines.
I've tried file_get_contents() as well as fgets() without much of a change.
10 seconds isn't a huge deal since I'll be saving the data locally and only doing the URL read once every few hours. However, my gut feeling is that the 10 seconds could be dramatically reduced.
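For reference, a minimal sketch of one common alternative to file(): fetch with file_get_contents() through a stream context so a slow remote server can't stall the page indefinitely, then split into lines. The function name, URL, and timeout are placeholders, not anything from the thread.

```php
<?php
// Sketch: like file(), but with an explicit timeout on the HTTP read.
// Note: unlike file(), the trailing newlines are stripped from each line.
function fetch_lines($url, $timeout = 15) {
    $ctx = stream_context_create([
        'http' => ['timeout' => $timeout],
    ]);
    $contents = file_get_contents($url, false, $ctx);
    if ($contents === false) {
        return false; // network error or timeout
    }
    return preg_split('/\r\n|\r|\n/', $contents);
}
```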
- gss
PHP - need efficient way to read file over http
- Tank Program
- Forum & Project Admin, PhD
- Posts: 6711
- Joined: Thu Dec 18, 2003 7:03 pm
Tank Program wrote:I'd use fopen & co as well. Or to be really cheeky (if you're on linux) you can use system to wget the file to a temp location in the background while you do other things for those 10 seconds.
Just played around a bit with 'wget' from the command line (not Linux, but macOS has it as well). It still took 6 or 7 seconds to get the file. The file is about 200k, and it retrieved it at 44k/s. I'm not sure where the extra second or two went, but given protocol overhead and typical upstream rates from a cable modem or DSL line, I'm probably unlikely to see much of an improvement.
This shows, I guess, that 10 seconds isn't all that inefficient.
I suspect anybody playing on the server as it sends out the files sees a nice lag-spike. Maybe I'll limit the transfers to once or twice a day.
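Tank Program's background-download idea might look something like this in PHP (a sketch only: it assumes a Unix-like host with the fetch command on the PATH, and the URL and path are placeholders). The trailing '&' plus redirected output lets PHP return immediately instead of waiting out the download:

```php
<?php
// Sketch: run a shell command in the background so the current page
// request doesn't block on it.
function fetch_in_background($cmd) {
    shell_exec($cmd . ' > /dev/null 2>&1 &');
}

// Hypothetical usage: refresh the cached copy while this request is
// still served from the old one.
// fetch_in_background('wget -q -O ' . escapeshellarg('/tmp/data.txt')
//                   . ' ' . escapeshellarg('http://example.com/data.txt'));
```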
- gss
- wrtlprnft
- Reverse Outside Corner Grinder
- Posts: 1679
- Joined: Wed Jan 04, 2006 4:42 am
- Location: 0x08048000
- Contact:
Run a cronjob on the arma server that bzip2-compresses the file once in a while and have the PHP script download that (and uncompress it locally), that should shorten the download.
Anyways, if you're worrying about lag spikes you don't want the file to be downloaded as fast as possible but really slow. Try wget's --limit-rate parameter.
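The client side of the bzip2 suggestion could be sketched like this (it assumes PHP's bz2 extension is loaded, and the URL is a placeholder for wherever the cron job would publish the compressed file):

```php
<?php
// Sketch: download a bzip2-compressed copy and decompress it locally.
function fetch_bz2($url) {
    $compressed = file_get_contents($url);
    if ($compressed === false) {
        return false; // download failed
    }
    $data = bzdecompress($compressed);
    return is_string($data) ? $data : false; // bzdecompress errors return an int
}

// Hypothetical usage:
// $lines = explode("\n", fetch_bz2('http://example.com/data.txt.bz2'));
```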
There's no place like ::1
wrtlprnft wrote:Run a cronjob on the arma server that bzip2-compresses the file once in a while and have the PHP script download that (and uncompress it locally), that should shorten the download.
Anyways, if you're worrying about lag spikes you don't want the file to be downloaded as fast as possible but really slow. Try wget's --limit-rate parameter.
I'm more worried about the site's page load time than the lag spike, so faster is better in this case.
A cron job isn't a bad idea, though the server from which I'm pulling the files isn't my own; so I wouldn't want to ask the admin to do that. Instead, I might put a cron job on my own machine to wget the file every so often and always have my php script parse the local files. This could actually solve both issues, page load time AND lag spikes on remote server.
EDIT: I was wrong about MacOS having wget (I was playing with wget on my Linux box); it comes with 'curl' instead. I can either download and install wget or use curl.
EDIT #2: I tested using curl to get the 200k file at 1k/s while playing on the server and saw no effects. So this is likely the best route... curl and cron.
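The curl-and-cron setup can be sketched as a single crontab entry (the schedule, rate, paths, and URL here are placeholders, not anything confirmed in the thread). Writing to a temp name first means the PHP script never reads a half-downloaded file:

```shell
# Hypothetical crontab entry: fetch the file at 1 KB/s every 6 hours,
# then move it into place atomically once the download completes.
0 */6 * * * curl -s --limit-rate 1k -o /tmp/data.txt.part http://example.com/data.txt && mv /tmp/data.txt.part /tmp/data.txt
```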
Thanks for the ideas.
- Tank Program
- Forum & Project Admin, PhD
- Posts: 6711
- Joined: Thu Dec 18, 2003 7:03 pm
Lucifer wrote:You can use scp to copy through an ssh connection instead. Don't know if you can limit the transfer like you can with wget.
You can.
Code:
-l limit
Limits the used bandwidth, specified in Kbit/s.
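A hypothetical invocation, with the host, path, and 64 Kbit/s cap chosen purely for illustration:

```shell
# Copy the file over ssh, capped so players on the server don't see a
# lag spike. -l takes Kbit/s, not KB/s.
scp -l 64 user@server:/path/to/data.txt /tmp/data.txt
```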