PhpMyAdmin data import performance issue
Originally, my question was related to the fact that PhpMyAdmin's SQL section wasn't working properly. As suggested in the comments, I realized that the amount of input was impossible to handle. However, this didn't provide me with a valid solution for how to deal with files (in my case, 35 thousand record lines) in CSV format:
...
20120509,126,1590.6,0
20120509,127,1590.7,1
20120509,129,1590.7,6
...
The Import option in PhpMyAdmin struggles just as the basic copy-paste input in the SQL section does. This time, same as previously, it takes 5 minutes until the max execution time is reached and then it stops. What is interesting, though, is that it adds about 6-7 thousand records to the table. So the input actually goes through and almost succeeds. I also tried halving the amount of data in the file; nothing changed, however.
There is clearly something wrong here. It is pretty annoying to have to massage the data in a PHP script when a simple data import doesn't work.
Solution: Change your php upload max size.
Do you know where your php.ini file is?
First of all, try putting this file into your web root:
phpinfo.php
( see http://php.net/manual/en/function.phpinfo.php )
containing:
<?php
phpinfo();
?>
Then navigate to http://www.yoursite.com/phpinfo.php
Look for "php.ini".
To upload large files you need to raise max_execution_time, post_max_size, and upload_max_filesize.
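As an illustration, the relevant php.ini directives might be raised like this (the values below are only examples; tune them to your file sizes and server):

```ini
; illustrative values only -- adjust to your environment
max_execution_time = 300      ; seconds before a script is killed
post_max_size = 64M           ; must be >= upload_max_filesize
upload_max_filesize = 64M     ; largest single uploaded file
```

Remember to restart your web server (or PHP-FPM) after editing php.ini so the new values take effect.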
Also, do you know where your error.log file is? It would hopefully give you a clue as to what is going wrong.
EDIT:
Here is the query I use for the file import:
$query = "LOAD DATA LOCAL INFILE '$file_name' INTO TABLE `$table_name`
          FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
          LINES TERMINATED BY '$nl'";
Where $file_name is the temporary filename from the PHP superglobal $_FILES, $table_name is the table already prepared for import, and $nl is a variable for the CSV line endings (defaulting to Windows line endings, but I have an option to select Linux line endings).
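A minimal sketch of assembling that statement in PHP, with the inner double quote escaped inside the double-quoted string (the file path and table name below are hypothetical stand-ins for the variables described above; the connection step is shown only as a comment because it needs a live MySQL server):

```php
<?php
// Hypothetical inputs standing in for the answer's variables.
$file_name  = '/tmp/php_upload_csv';  // e.g. $_FILES['csv']['tmp_name']
$table_name = 'quotes';               // table prepared before the import
$nl         = "\r\n";                 // Windows line endings by default

// Note the escaped \" -- an unescaped " would end the PHP string early.
$query = "LOAD DATA LOCAL INFILE '$file_name' INTO TABLE `$table_name` "
       . "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' "
       . "LINES TERMINATED BY '$nl'";

// To actually run it, LOCAL INFILE must be enabled on the client, e.g.:
//   $mysqli = mysqli_init();
//   $mysqli->options(MYSQLI_OPT_LOCAL_INFILE, true);
//   $mysqli->real_connect('localhost', 'user', 'pass', 'db');
//   $mysqli->query($query);
echo $query;
?>
```

Note that `local_infile` must also be permitted on the MySQL server side, or the statement is rejected regardless of the client setting.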
The other thing is that the table ($table_name) in my script is prepared in advance by first scanning the csv to determine column types. After it determines appropriate column types, it creates the MySQL table to receive the data.
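The column-type scan could look something like the following; this is my own minimal version of the idea, not the answerer's actual code, and the widths chosen (DECIMAL(12,4), VARCHAR(255)) are arbitrary assumptions:

```php
<?php
// Scan a CSV and guess a MySQL column type per column: integers become
// INT, other numerics DECIMAL, everything else VARCHAR. If rows disagree,
// the "wider" type wins.
function infer_column_types(string $csv_path): array {
    $rank  = ['INT' => 0, 'DECIMAL(12,4)' => 1, 'VARCHAR(255)' => 2];
    $types = [];
    $fh = fopen($csv_path, 'r');
    while (($row = fgetcsv($fh)) !== false) {
        if ($row === [null]) {
            continue; // skip blank lines
        }
        foreach ($row as $i => $value) {
            if (ctype_digit($value)) {
                $t = 'INT';
            } elseif (is_numeric($value)) {
                $t = 'DECIMAL(12,4)';
            } else {
                $t = 'VARCHAR(255)';
            }
            if (!isset($types[$i]) || $rank[$t] > $rank[$types[$i]]) {
                $types[$i] = $t; // widen the column if needed
            }
        }
    }
    fclose($fh);
    return $types;
}

// Example with the record format from the question:
$tmp = tempnam(sys_get_temp_dir(), 'csv');
file_put_contents($tmp, "20120509,126,1590.6,0\n20120509,127,1590.7,1\n");
print_r(infer_column_types($tmp)); // INT, INT, DECIMAL(12,4), INT
unlink($tmp);
?>
```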
I suggest you try creating the MySQL table definition first, to match what's in the file (data types, character lengths, etc). Then try the above query and see how fast it runs. I don't know how much of a factor the MySQL table definition is on speed.
Also, I have no indexes defined in the table until AFTER the data is loaded. Indexes slow down data loading.
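If indexes are needed, they can be added once the bulk load has finished; something like the following (table and column names here are hypothetical):

```sql
-- Build indexes only AFTER LOAD DATA completes; maintaining them during
-- the bulk insert slows every inserted row down.
ALTER TABLE quotes ADD INDEX idx_date (trade_date);
```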