1 million or more files in one folder, for include (cache)


Problem description



I have a 'theoretical' question, to see if a solution I'm planning makes sense or not:

I have a script that reads a lot of data (settings, configuration, etc.) out of the database and builds it together, for every registered user. I won't go into too much detail about why or what exactly.

My idea was that I could actually do that only once and create a .inc file, named with the ID of the user, to cache it. If the user changes something, the file will of course be recreated.

But now, let's suppose I do that with 1,000,000 or even more files. Will I encounter issues while including those files? (Always one specific file, never every file at once.) Is that generally a good idea, or am I just stressing the server even more with this?

And I'm planning to put everything in the same cache folder. Will I see performance improvements if I split that folder up into multiple ones?
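For illustration, the per-user cache scheme described above could be sketched like this. This is a minimal sketch, not the asker's actual code: Python with JSON stands in for PHP's `.inc` files and `include()`, and all function and file names are assumptions.

```python
import json
import os

def write_user_cache(user_id, settings, root="cache"):
    """Write one user's assembled settings to a per-user cache file.

    Sketch of the idea in the question: build the data once, store it
    keyed by user ID, and rewrite the file only when the user changes
    something. JSON stands in for the PHP .inc format here.
    """
    os.makedirs(root, exist_ok=True)
    path = os.path.join(root, "user_%d.inc" % user_id)
    with open(path, "w", encoding="utf-8") as f:
        json.dump(settings, f)
    return path

def read_user_cache(user_id, root="cache"):
    """Load the cached settings for one specific user (the role
    played by include() in the original PHP plan)."""
    path = os.path.join(root, "user_%d.inc" % user_id)
    with open(path, encoding="utf-8") as f:
        return json.load(f)
```

Reading one specific file by a known name is cheap; the open question is only how the directory holding a million such files behaves.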

Thanks for the help.

Solution

You will be limited by the file system: most file systems become very slow when that many files sit in a single folder. You can do something like this:

  1. Hash the filename: file1.php becomes 3305d5836bea089f2a1758d8e89848c8
  2. Split the hash into several parts: 3/3/0/5/d/5836bea089f2a1758d8e89848c8
  3. Done
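The steps above can be sketched as a small helper that maps an ID to a sharded path. This is an illustrative sketch (Python rather than PHP, with a shallower split than the answer's one-character-per-level example); the function name, the `depth` parameter, and the `cache` root are assumptions, not part of the answer.

```python
import hashlib
import os

def cache_path(user_id, depth=2, root="cache"):
    """Map an ID to a sharded cache file path.

    Hashes the ID and uses the first `depth` hex characters of the
    digest as nested directory names, so files spread across
    16**depth folders instead of piling up in one. The answer's
    example splits more aggressively (one level per character).
    """
    digest = hashlib.md5(str(user_id).encode("utf-8")).hexdigest()
    shards = list(digest[:depth])                # e.g. ['3', '3']
    return os.path.join(root, *shards, digest[depth:] + ".inc")

# Same input always maps to the same shard, so lookups stay O(1).
print(cache_path("file1.php"))
```

Because the hash is uniform, each shard directory holds roughly `N / 16**depth` files, which keeps per-directory listings and lookups fast even at a million entries.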

