Azure Function (Python) with a Storage upload trigger fails on large file uploads
Problem description
I have an Azure Function (Python) triggered by file uploads to Azure Storage. The function works fine for files up to ~120 MB, but when I load tested it with a 2 GB file, it failed with the error Stream was too long.
- Where is this limit documented?
- How can I work around it in Python?
I'm using the boto3 library to PUT files to AWS S3:
import logging
import pathlib

import azure.functions as func
import boto3

def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob\n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")

    # Read the entire blob into memory before uploading
    myblobBytes = myblob.read()
    fileName = pathlib.Path(myblob.name).name

    s3 = boto3.resource(
        's3',
        aws_access_key_id="youguessedit",
        aws_secret_access_key="noyoudidnt",
    )

    # bucketName and md5Checksum are defined elsewhere in the function app
    response = s3.Bucket(bucketName).put_object(Key="folder/" + fileName,
                                                Body=myblobBytes,
                                                ContentMD5=md5Checksum)
    response.wait_until_exists()
Solution
Changed boto3 from put_object to upload_fileobj and set up a TransferConfig with multipart_threshold=1024*25, max_concurrency=10, multipart_chunksize=1024*25, use_threads=True. A sketch of the change is below.
It rips now! Able to transfer 2 GB in 89 seconds. Not bad.