GoFuckYourself.com - Adult Webmaster Forum

GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   Best way to copy from server to server? (https://gfy.com/showthread.php?t=1010781)

camperjohn64 02-17-2011 10:02 AM

Best way to copy from server to server?
 
I have a video converter on www.*.com. After converting, it sends the file to img.*.com, and I'd like your ideas on the best way to send the file across. They are on different servers.

Currently, on www.*.com, I use something like this:

Code:

        $fp = fsockopen('img1.domain.com', 80);
        fputs($fp, "POST /image_upload.php HTTP/1.1\r\n");
        etc etc etc... (also sends MD5 and password)

On the receiving end (img1.*.com), I have image_upload.php, which takes the POSTed file and saves it to disk...

Code:

        if (MD5 security check) {
            $fp = fopen(IMAGE_PATH . '/' . $filename, "w");
            fwrite($fp, $uploaded);
            fclose($fp);
        }

This has been working fine between several servers for a couple of years, but it's failing on 100 MB+ video files.

Does anyone think there is a better method of doing this?

JM
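For reference, the hand-rolled socket POST above can be sketched with curl from the shell. The hostname, endpoint, and token scheme below are placeholders (the OP's exact security scheme isn't shown), and the actual network call is left commented out:

```shell
# Sketch of the same push using curl instead of hand-rolled fsockopen writes.
# Hostname, endpoint, and token scheme are placeholders, not the real ones.
FILE=video.mp4
printf 'stand-in payload' > "$FILE"        # dummy file for the sketch
TOKEN=$(md5sum "$FILE" | cut -d' ' -f1)    # one possible shared-secret token
# curl -F streams the file as multipart/form-data, so the whole file is
# never held in PHP's memory on the sending side:
# curl -fsS -F "file=@${FILE}" -F "token=${TOKEN}" http://img1.domain.com/image_upload.php
echo "$TOKEN"
```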

Jakez 02-17-2011 10:06 AM

Try cURL?

Klen 02-17-2011 10:13 AM

Try the PHP CLI?

CYF 02-17-2011 05:07 PM

scp possibly?

rowan 02-17-2011 05:21 PM

Looks like you're loading the entire temp upload file into memory in order to save a copy? PHP is probably hitting the configured maximum memory limit for a script. Why don't you just use php's copy() function?

If it's still failing then try increasing the value of upload_max_filesize in php.ini

blackmonsters 02-17-2011 05:26 PM

Quote:

Originally Posted by rowan (Post 17923077)
Looks like you're loading the entire temp upload file into memory in order to save a copy? PHP is probably hitting the configured maximum memory limit for a script. Why don't you just use php's copy() function?

If it's still failing then try increasing the value of upload_max_filesize in php.ini

:2 cents::2 cents::2 cents::2 cents::2 cents:

rowan 02-17-2011 05:42 PM

Pre-empting an "I need to load the file into memory in order to md5 it" reply... use md5_file() :)

Dirty D 02-17-2011 05:56 PM

rsync or perhaps wget

camperjohn64 02-17-2011 06:23 PM

Quote:

Originally Posted by Dirty D (Post 17923135)
rsync or perhaps wget

- It is a push, not a pull, so I can't use wget. The sending server must initiate the call.

Quote:

Originally Posted by rowan (Post 17923112)
Pre-empting an "I need to load the file into memory in order to md5 it" reply... use md5_file() :)

- The md5 is just a security measure - it's not against the file itself. Since the destination server is listening for incoming files from multiple servers, I wanted to make sure a hacker could never stumble upon the URL call and send files to the server without some sort of checksum / security.

Quote:

Originally Posted by rowan (Post 17923077)
Looks like you're loading the entire temp upload file into memory in order to save a copy? PHP is probably hitting the configured maximum memory limit for a script. Why don't you just use php's copy() function?

If it's still failing then try increasing the value of upload_max_filesize in php.ini

- Duh!! Of course. copy() is exactly what I need (rather than load/save). Thanks! And since it's a copy of a tmp file, I could actually just use move!

Amazing how a different pair of eyes saw that immediately!

CYF 02-17-2011 06:36 PM

Quote:

Originally Posted by camperjohn64 (Post 17923167)
- It is a push, not a pull, so I can't use wget. The sending server must initiate the call.

- The md5 is just a security measure - it's not against the file itself. Since the destination server is listening for incoming files from multiple servers, I wanted to make sure a hacker could never stumble upon the URL call and send files to the server without some sort of checksum / security.

check out scp :2 cents:

woj 02-17-2011 07:34 PM

send a url to the 2nd server...
then from the 2nd server initiate a GET request with curl to grab the file from the first server...

POSTing large files is usually problematic; there are more than a dozen things that can go wrong... GET is 10x more robust...

or just use scp or rsync....
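The pull variant can be sketched like this; a local file:// URL stands in for the real http://www.firstserver.com/files/... so the example runs anywhere:

```shell
# Receiving server pulls the file with a GET instead of accepting a POST.
# A local file:// URL stands in for http://www.firstserver.com/files/video.mp4
SRC=$(mktemp)
printf 'converted video data' > "$SRC"
curl -fsS -o pulled.mp4 "file://$SRC"    # in production: curl -fsS -o ... http://...
cmp -s "$SRC" pulled.mp4 && echo "pull ok"
```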

Cyandin 02-17-2011 10:12 PM

Quote:

Originally Posted by rowan (Post 17923077)
Looks like you're loading the entire temp upload file into memory in order to save a copy? PHP is probably hitting the configured maximum memory limit for a script. Why don't you just use php's copy() function?

If it's still failing then try increasing the value of upload_max_filesize in php.ini

:thumbsup </thread>

facialfreak 02-18-2011 02:19 AM

you can keep using the method you have ... it may not be the most efficient way, but it does the trick ...

just run it from within GNU screen (Linux/Unix), so that even if your connection burps or hits a bump in the road, the process will keep going ...

http://magazine.redhat.com/2007/09/2...to-gnu-screen/
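A minimal sketch of that; the session name and transfer command are placeholders, and the screen calls are commented out since they need an interactive terminal:

```shell
# GNU screen keeps a long transfer alive even if the SSH session drops.
SESSION=filepush
# screen -dmS "$SESSION" rsync -a --partial /videos/ user@img1.domain.com:/var/www/images/
# screen -ls               # list detached sessions
# screen -r "$SESSION"     # reattach later to watch progress
echo "$SESSION"
```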

k0nr4d 02-18-2011 03:37 AM

Quote:

Originally Posted by camperjohn64 (Post 17923167)
- It is a push, not a pull, so I can't use wget. The sending server must initiate the call.

Do it the other way then:
file_get_contents('http://www.otherserver.com/get.php?filename=whatever');

get.php:
shell_exec("wget http://www.firstserver.com/files/" . $_GET['filename']);

That's it in a nutshell, but add sanitization on the inputs, check the server IP in get.php, etc...

PowerCum 02-18-2011 03:44 AM

use rsync. It handles the MD5/checksum verification for you and ensures that the content on both servers is always the same.

