file_put_contents too slow when handling large files or multiple files


Problem Description

I am using file_put_contents to create a video file. The problem is speed and performance: it takes about 30 to 60 minutes to create a single file of roughly 50 MB. I am decoding a byte array to create the file. How can I improve the speed and performance?

// read the entire request body into memory at once
$json_str = file_get_contents('php://input');
$json_obj = json_decode($json_str);
$Video = $json_obj->Video;
$CAF = $json_obj->CAF;
$Date = $json_obj->Date;
$CafDate = date("Y-m-d", strtotime($Date));

// decode the full base64 payload into another in-memory copy, then write it out
$video_decode = base64_decode($Video);
$video_filename = __DIR__ . '/uploads/'. $CAF . '_'.$CafDate.'_VID.mp4';
$video_dbfilename = './uploads/'. $CAF . '_'.$CafDate.'_VID.mp4';
$save_video = file_put_contents($video_filename, $video_decode);

Recommended Answer

You should not load an entire file into memory when you can't foresee its size or when it may be huge. In your code the raw request body, the decoded JSON object, and the base64_decode result each hold a full copy of the ~50 MB payload at the same time, which is what kills performance. It's better to read the input in chunks and process it chunk by chunk.

Here's a quick and dirty example of how to achieve it:

<?php
// open the request body for reading in binary mode
$handle = fopen("php://input", "rb");

// open a handle for saving the file, also in binary mode
$local_file = fopen("path/to/file", "wb");

// loop until the end of the input
while (!feof($handle)) {
  // read the next chunk (up to 8 KB)
  $chunk = fread($handle, 8192);

  // here you do whatever you want with $chunk
  // (i.e. append it to the local file)
  fwrite($local_file, $chunk);
}

// close both handles
fclose($handle);
fclose($local_file);
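One follow-up for your specific case: the chunked copy above assumes the video arrives as the raw request body, but your client wraps the base64 video inside a JSON document, and core PHP's json_decode has no streaming mode. One option is to have the client send the base64 payload as the raw body (with CAF and Date moved to headers or the query string) and let PHP's built-in convert.base64-decode stream filter decode it on the fly. A minimal sketch, assuming that change on the client side (the output filename here is illustrative, not from the original question):

<?php
// open the request body as a stream instead of buffering it all
$input = fopen('php://input', 'rb');

// attach PHP's built-in base64 decode filter so each chunk is
// decoded as it is read, never holding the whole payload in memory
stream_filter_append($input, 'convert.base64-decode', STREAM_FILTER_READ);

// open the destination file in binary-write mode
// (illustrative path and name)
$output = fopen(__DIR__ . '/uploads/video.mp4', 'wb');

// copy the decoded stream to disk in internal chunks
stream_copy_to_stream($input, $output);

fclose($input);
fclose($output);

stream_copy_to_stream performs the same chunked read/write loop as the example above, just implemented in C, so it is usually a little faster and keeps the script shorter.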
