bfitech/zapchupload

Server-side chunk upload.

2.3.0 2020-07-15 09:09 UTC



README

Server-side chunk uploader.


This package gives PHP the ability to upload potentially very large files by splitting them into chunks.

But why do it this way? What's wrong with a standard HTTP form?

Some major considerations

Pros

  • No need to worry about RAM and other inherent server limitations.
  • If configured properly, no need to worry about the upload_max_filesize and corresponding max_execution_time directives. File size limits may still come from disk space or filesystem constraints.
  • Each chunk can be processed, e.g. fingerprinted, so the upload can fail early when a file is corrupted in transit.
  • Easy to configure to work alongside other web application routers.
  • Runs on pure PHP. No adjustment to the web server is needed.
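The per-chunk processing idea above can be sketched in a few lines. This is only an illustration of the concept, not chupload's actual API; the sha256 digest and the function name are assumptions of mine:

```python
import hashlib


def fingerprint_chunk(chunk: bytes) -> str:
    """Return a hex digest that a server could compare against a
    client-supplied fingerprint before accepting the chunk."""
    return hashlib.sha256(chunk).hexdigest()


# a corrupted chunk yields a different digest, so the upload can
# be rejected as soon as the mismatch is seen, without waiting
# for the remaining chunks
good = fingerprint_chunk(b'chunk payload')
bad = fingerprint_chunk(b'chunk payl0ad')
print(good != bad)  # True
```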

Cons

  • Network overhead increases sharply, since multiple requests must be made to upload a single file.
  • A special client must be built. A standard HTTP upload form will not work.
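To put the overhead in numbers, here is a rough sketch using the library's default 100 KiB chunk size and the zero-based indexing scheme of the sample client below; `request_count` is a hypothetical helper of mine, not part of chupload:

```python
CHUNK_SIZE = 1024 * 100  # ChunkUpload default chunk size, 100 KiB


def request_count(size, chunk_size=CHUNK_SIZE):
    """POST requests needed for `size` bytes, following the
    zero-based chunk indexing used by the sample client."""
    return size // chunk_size + 1


# a 1 GiB file takes more than ten thousand requests at the
# default chunk size, versus a single request for a plain form
print(request_count(1024 ** 3))  # 10486
```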

Installation

$ composer require bfitech/zapchupload
$ vim index.php

Tutorial

Server Side

A quick index.php setup

<?php

require_once __DIR__ . '/vendor/autoload.php';

use BFITech\ZapCore\Router;
use BFITech\ZapCore\Logger;
use BFITech\ZapChupload\ChunkUpload;

// create a logging service
$log = new Logger;

// create a router
$core = (new Router)->config('logger', $log);

// instantiate the chunk uploader class
$chup = new ChunkUpload(
    $core, '/tmp/tempdir', '/tmp/destdir',
    null, null, null, $log);

// uploader route
$core->route('/upload', [$chup, 'upload'], 'POST');

// downloader route for testing
$core->route('/', function($args) use($core) {
    $file = $args['get']['file'] ?? null;
    if ($file)
        $core->static_file('/tmp/destdir/' . $file);
    $core::halt('HELLO WORLD');
});

// that's it

You can run it on your local machine with the PHP built-in server

$ php -S 0.0.0.0:9999 &

Client Side

Here's a simple client, written in Python

#!/usr/bin/env python3

# chupload-client.py

import os
import sys
import stat
import hashlib

# use pip3 to install this
import requests

# upload endpoint of the toy service above
UL_URL = 'http://localhost:9999/upload'

# download endpoint of the toy service above
DL_URL = 'http://localhost:9999/?file='

# ChunkUpload default prefix
PREFIX = "__chupload_"

# ChunkUpload default chunk size
CHUNK_SIZE = 1024 * 100


def upload(path):
    try:
        fst = os.stat(path)
    except FileNotFoundError:
        sys.exit(2)
    if stat.S_ISDIR(fst.st_mode):
        sys.exit(3)
    size = fst.st_size
    base = os.path.basename(path)

    # chupload must have this as _POST data
    data = {
        PREFIX + 'index': 0,
        PREFIX + 'size': size,
        PREFIX + 'name': base,
    }

    # calculate max number of chunks for test
    chunk_max = divmod(size, CHUNK_SIZE)[0]

    index = 0
    with open(path, 'rb') as fhn:

        while True:

            # chupload must have this as uploaded chunk, where the
            # filename doesn't matter
            files = {
                PREFIX + 'blob': ('noop', fhn.read(CHUNK_SIZE)),
            }

            # make request on each chunk
            resp = requests.post(UL_URL, files=files, data=data).json()
            assert(resp['errno'] == 0)
            rdata = resp['data']

            # upload complete
            if rdata['done']:
                assert(index == chunk_max == rdata['index'])
                print("UPLOAD: OK")
                return "%s%s" % (DL_URL, rdata['path'])

            # increment index for next chunk
            index += 1
            data[PREFIX + 'index'] = index

    raise Exception("Oops! Something went wrong.")


def compare(path, url):
    resp = requests.get(url)
    assert(resp.status_code == 200)
    print("DOWNLOAD: OK")

    # compare download with local file
    rhash = hashlib.sha256(resp.content)
    lhash = hashlib.sha256(open(path, 'rb').read())
    assert(rhash.hexdigest() == lhash.hexdigest())
    print("COMPARE: OK")


if __name__ == '__main__':
    try:
        path = sys.argv[1]
    except IndexError:
        sys.exit(1)
    compare(path, upload(path))

You can then run it from the CLI

$ python3 chupload-client.py ~/some-file.dat || echo FAIL
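For a feel of what the client actually sends, the sketch below slices an in-memory payload into the same (data, files) pairs that the upload loop above POSTs; `chunk_requests` is a hypothetical helper of mine, not part of chupload:

```python
PREFIX = "__chupload_"        # ChunkUpload default prefix
CHUNK_SIZE = 1024 * 100       # ChunkUpload default chunk size


def chunk_requests(name, payload, chunk_size=CHUNK_SIZE):
    """Yield (data, files) pairs matching what chupload-client.py
    POSTs for each chunk of `payload`."""
    index = 0
    while True:
        blob = payload[index * chunk_size:(index + 1) * chunk_size]
        data = {
            PREFIX + 'index': index,
            PREFIX + 'size': len(payload),
            PREFIX + 'name': name,
        }
        yield data, {PREFIX + 'blob': ('noop', blob)}
        if len(blob) < chunk_size:
            # short (or empty) blob marks the final chunk
            break
        index += 1


reqs = list(chunk_requests('demo.bin', b'x' * 250000))
print(len(reqs))  # 3: two full 100 KiB chunks plus the remainder
```

Note that a payload whose size is an exact multiple of the chunk size still produces one extra, empty final chunk, matching the `size // CHUNK_SIZE + 1` request count implied by the client's `chunk_max` check.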

To see how it works in a browser, run the demo

$ php -S 0.0.0.0:9998 -t ./demo &
$ x-www-browser localhost:9998

Documentation

Full documentation can be generated and viewed with

$ doxygen
$ x-www-browser docs/html/index.html