Split big files using PHP


Problem description

I want to split huge files (to be specific, tar.gz files) into multiple parts from PHP code. The main reason for this is PHP's 2 GB limit on 32-bit systems.

So I want to split the big files into multiple parts and process each part separately.

Is this possible? If yes, how?

Recommended answer

A simple method (if you are using a Linux-based server) is to use the exec command and run the split command:

exec('split -b 4096k Large.tar.gz SmallParts'); // 4MB parts
/*    |     |______| |            |
      |        |     |            |
      App      |     |            Output filename prefix
               |     The source file
               The split size
*/

See here for more details: http://www.computerhope.com/unix/usplit.htm
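On the PHP side, a minimal sketch of the whole flow could look like this (the file name Large.tar.gz, the SmallParts_ prefix and the per-chunk processing step are assumptions for illustration, not fixed names):

<?php
// Shell out to split(1), then walk over the generated chunks one by one.
$source = 'Large.tar.gz';   // assumed source archive
$prefix = 'SmallParts_';    // assumed output prefix
$output = [];
$status = 0;

exec('split -b 4096k ' . escapeshellarg($source) . ' ' . escapeshellarg($prefix), $output, $status);

if ($status !== 0) {
    die('split failed with exit code ' . $status);
}

// Each chunk stays well below the 2 GB limit, so PHP can handle it normally.
foreach (glob($prefix . '*') as $part) {
    echo $part, ' => ', filesize($part), " bytes\n";
    // ... process the chunk here ...
}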

Or you can use csplit: http://www.computerhope.com/unix/ucsplit.htm (note that csplit splits on line patterns rather than by size, so it expects one or more pattern operands after the file name):

exec('csplit -k -s -f part_ -n 3 LargeFile.tar.gz');


PHP runs within a single thread, and the only way to increase that count is to create child processes using the fork call (pcntl_fork).
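For completeness, a bare-bones pcntl_fork() sketch (CLI only, pcntl extension required; the SmallParts_ file names are the assumed output of the split call above):

<?php
// Fork one child per chunk; the parent waits for all of them.
$children = [];
$status = 0;

foreach (glob('SmallParts_*') as $part) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die('fork failed');
    } elseif ($pid === 0) {
        // Child process: handle a single chunk, then exit.
        echo 'child ' . getmypid() . " handling $part\n";
        exit(0);
    }
    $children[] = $pid;   // parent: remember the child PID
}

// Parent: wait for every child to finish.
foreach ($children as $pid) {
    pcntl_waitpid($pid, $status);
}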

This is not resource friendly. What I would suggest is to look into a language that can do this quickly and effectively, such as node.js.

Just install node on the server and then create a small script, called node_split for instance, that can do the job on its own for you.
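From PHP you would then only hand the work off, for example like this (node_split.js, its arguments and the 4 MB chunk size are hypothetical, following the node_split idea above):

<?php
// Delegate the splitting to the (hypothetical) Node helper script.
$status = 0;
exec('node node_split.js ' . escapeshellarg('Large.tar.gz') . ' 4194304', $out, $status);

if ($status !== 0) {
    die('node_split failed with exit code ' . $status);
}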

But I do strongly advise that you do not use PHP itself for this job; use exec and let the host operating system do it.
