large uploads via HTTP
2006-08-25 16:58:23.368485+00 by
Dan Lyke
7 comments
Okay, I know that Google Video and YouTube must have a solution, and I just checked out YouSendIt and it does a reasonable job of actually giving user feedback, but...
We've got a simple PHP app that needs to do uploads from the browser, presumably via HTTP. It does small uploads just fine. It does not do large (> 10MB) uploads.
I've done everything I can to tell PHP not to restrict my upload size. I don't have access to the system-wide php.ini, but I've added the lines:
> php_value upload_max_filesize 2000000000
> php_value post_max_size 2000000000
to my .htaccess. When I try to do the upload, my browser just sits there waiting for the response.
If PHP can't handle this I could pass the upload over to a Perl script or whatever, but we need to be able to upload animations and such.
Anyone done this? I can probably even wheedle a small budget for it.
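For the record, the upload side is nothing exotic, just the usual multipart form posting to a PHP script. A stripped-down sketch of the sort of thing I mean (field and path names here are made up, not our actual code):

    <form action="upload.php" method="post" enctype="multipart/form-data">
      <input type="file" name="animation">
      <input type="submit" value="Upload">
    </form>

and on the receiving end:

    <?php
    // upload.php (hypothetical sketch): PHP has already spooled the POSTed
    // file to a temp file by the time this runs; we just move it somewhere permanent.
    if (isset($_FILES['animation']) && $_FILES['animation']['error'] == UPLOAD_ERR_OK) {
        move_uploaded_file($_FILES['animation']['tmp_name'],
                           '/var/uploads/' . basename($_FILES['animation']['name']));
        echo 'got ' . $_FILES['animation']['size'] . ' bytes';
    } else {
        echo 'upload failed, error code ' . $_FILES['animation']['error'];
    }
    ?>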
[ related topics: Web development ]
comments in ascending chronological order:
#Comment Re: large uploads via HTTP made: 2006-08-25 18:02:27.93522+00 by:
Jim S
You might also want memory_limit and max_execution_time. I use these in a system that takes multi-megabyte uploads:
php_value memory_limit "21M"
php_value max_execution_time 3601
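Folded together with the two lines Dan already has, the .htaccess would look something like this (values are only illustrative, tune to taste):

    php_value upload_max_filesize 2000000000
    php_value post_max_size 2000000000
    php_value memory_limit "64M"
    php_value max_execution_time 3601

memory_limit mainly has to cover whatever the script does with the file after it arrives, since the upload itself gets spooled to a temp file rather than held in RAM.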
#Comment Re: made: 2006-08-25 18:11:31.927598+00 by:
Dan Lyke
Thanks, trying that now...
The YouSendIt solution does some JavaScript or somesuch magic to do a progress bar, which would be really nice, but if this works it'll at least take a distraction away.
#Comment Re: made: 2006-08-25 18:19:13.001159+00 by:
meuon
Note: There may also be Apache POST limits and timeouts involved. And in one case, I set up temporary FTP accounts for such uploads, but the clients were using MSIE so uploading through the browser was easy.
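The Apache knobs I have in mind are along these lines, in httpd.conf or a vhost (LimitRequestBody can also go in .htaccess; TimeOut can't). The numbers are just for illustration; check what the host already has them set to:

    # 0 means no limit on the request body size; some hosts set this low
    LimitRequestBody 0
    # seconds Apache will wait on network I/O before giving up on the request
    TimeOut 1200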
#Comment Re: made: 2006-08-25 18:20:06.119746+00 by:
aiworks
You know, Dan, I built an app in Lotus Domino that did this (and I regularly sent 250+ MB files through). As I recall, the trick is to not get the HTTP stack to keep the file data in memory as it's doing its thing; it has to be written to a temporary file.
I'd be curious to watch Apache/lighttpd/etc... memory usage as your upload is happening to see if that's what PHP is doing.
#Comment Re: made: 2006-08-25 18:36:49.822775+00 by:
Dan Lyke
I've been trying to not learn PHP, but I believe that it's spooling to disk.
And I know the Perl CGI module goes to disk.
I may have to implement this on my own server so that I can watch it more closely as it happens.
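If I do set it up locally, a quick sketch like this (field name is hypothetical) should at least show where the spool file lands and how big it got:

    <?php
    // Runs only after the whole POST body has been received and written to disk.
    echo "upload_tmp_dir setting: '" . ini_get('upload_tmp_dir') . "'\n";  // empty = system temp dir
    if (isset($_FILES['animation'])) {
        echo 'tmp file: ' . $_FILES['animation']['tmp_name'] . "\n";
        echo 'on disk: ' . filesize($_FILES['animation']['tmp_name']) . " bytes\n";
        echo 'error code: ' . $_FILES['animation']['error'] . "\n";  // 1 means it hit upload_max_filesize
    }
    ?>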
#Comment Re: made: 2006-08-25 19:21:52.741922+00 by:
spc476
Does it have to be through the web? Can it not be done through FTP? I know most browsers support browsing via FTP, and some even allow uploading through it (and I think there's an extension for Firefox to allow this).
#Comment Re: made: 2006-08-25 20:57:59.837456+00 by:
Shawn
What meuon said (regarding web server timeouts, etc.). I've run up against exactly this problem (didn't find out that the client wanted to upload 100+ MB files until *after* the product had been delivered - there's a lesson there). BTW, PHP uploading just provides (or used to just provide) a wrapper around an HTTP POST. (Or was it PUT? It's been a while.)
I wound up recommending the Java applet UUpload, but left the company before it was actually implemented. It provides an easy web interface for the user, but uses FTP for the actual upload.
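(It's a POST, for what it's worth; a browser file upload is the RFC 1867 multipart/form-data kind, which on the wire looks roughly like this, with the field name and file here purely illustrative:

    POST /upload.php HTTP/1.1
    Host: www.example.com
    Content-Type: multipart/form-data; boundary=----boundary
    Content-Length: ...

    ------boundary
    Content-Disposition: form-data; name="animation"; filename="clip.mov"
    Content-Type: video/quicktime

    ...file bytes...
    ------boundary--

so whatever handles it server-side has to cope with the whole body arriving in one request.)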