
PHP processes TXT files to import massive data into the database

WBOY
Release: 2016-07-14 10:10:08

There is a TXT file containing 100,000 records in the following format:

Column 1 Column 2 Column 3 Column 4 Column 5
a 00003131 0 0 adductive#1 adducting#1 adducent#1
a 00003356 0 0 nascent#1
a 00003553 0 0 emerging#2 emergent#2
a 00003700 0.25 0 dissilient#1

… (about 100,000 lines like these follow) …


The requirement is to import it into the database. The structure of the data table is:

word_id: auto-increment

word: taken from column 5; an entry such as [adductive#1 adducting#1 adducent#1] has to be split up, so that one TXT line becomes 3 SQL records

value: column 3 minus column 4; if the result is 0, the line is skipped and not inserted into the data table
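
For reference, a minimal sketch of the words_sentiment table implied by the INSERT statement in the script below. Only word_id (auto-increment) and the column names come from the article; the column types and sizes shown here are assumptions.

CREATE TABLE words_sentiment (
  word_id INT UNSIGNED NOT NULL AUTO_INCREMENT, -- auto-increment, as described above
  word VARCHAR(100) NOT NULL,                   -- one word per record, split from column 5
  senti_type TINYINT NOT NULL,                  -- hardcoded to 1 in the script below
  senti_value DECIMAL(6,4) NOT NULL,            -- column 3 minus column 4
  word_type TINYINT NOT NULL,                   -- hardcoded to 2 in the script below
  PRIMARY KEY (word_id)
);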

<?php
$file = 'words.txt'; // TXT source file with 100,000 records
ini_set('memory_limit', '-1'); // do not limit the memory size here, otherwise an error is reported
$lines = file_get_contents($file);
$line  = explode("\n", $lines);
$i   = 0;
$sql = "INSERT INTO words_sentiment (word,senti_type,senti_value,word_type) VALUES ";

foreach ($line as $key => $li) {
    $arr = explode("\t", $li);        // columns 1-5 are tab-separated
    $senti_value = $arr[2] - $arr[3]; // column 3 minus column 4
    if ($senti_value != 0) {          // skip lines whose value is 0
        if ($i >= 20000 && $i < 25000) { // import in batches to avoid failure
            $mm = explode(" ", $arr[4]); // column 5 holds space-separated words
            foreach ($mm as $m) { // [adductive#1 adducting#1 adducent#1] becomes 3 SQL records
                $nn = explode("#", $m);
                $word = $nn[0];
                // word may contain a single quote (such as jack's), so the value is wrapped
                // in double quotes (note the escaping)
                $sql .= "(\"$word\",1,$senti_value,2),";
            }
        }
        $i++;
    }
}
//echo $i;
$sql = substr($sql, 0, -1); // remove the trailing comma
//echo $sql;
// One batch of 5000 lines takes about 40 seconds; importing too many at once
// exceeds max_execution_time, resulting in failure.
file_put_contents('20000-25000.txt', $sql);
?>
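
The script above only writes the assembled INSERT statement into 20000-25000.txt; the article does not show the actual import step. One possible way to run it is a small mysqli script such as the sketch below (the host, user, password and database name are placeholders, not from the article):

<?php
// Hypothetical import step: read the generated statement and execute it once.
// Connection parameters are placeholders.
$mysqli = new mysqli('localhost', 'user', 'password', 'test');
if ($mysqli->connect_error) {
    die('Connect failed: ' . $mysqli->connect_error);
}
$sql = file_get_contents('20000-25000.txt'); // the batch file produced above
if (!$mysqli->query($sql)) {
    echo 'Import failed: ' . $mysqli->error;
}
$mysqli->close();
?>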


1. When importing massive amounts of data, pay attention to PHP's runtime limits. You can adjust them temporarily in the script; otherwise an error like the following is reported:

Allowed memory size of 33554432 bytes exhausted (tried to allocate 16 bytes)
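
A minimal sketch of those temporary adjustments: ini_set() for memory_limit (as in the script above) and set_time_limit() for the execution-time limit. The concrete values are only examples.

<?php
// Temporary, per-script adjustments (values are examples)
ini_set('memory_limit', '-1'); // or e.g. '512M'; '-1' removes the memory limit
set_time_limit(0);             // 0 removes the max_execution_time limit for this run
?>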

2. PHP reads and writes the TXT files with:

file_get_contents()

file_put_contents()
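
A minimal usage sketch of the two functions (file names are just examples). Note that file_get_contents() loads the whole file into memory, which is why the memory_limit adjustment above is needed for a 100,000-line file; a line-by-line read is a lighter alternative that the article does not use.

<?php
// Read everything at once (what the script above does):
$text  = file_get_contents('words.txt');
$lines = explode("\n", $text);

// Write a string back out to a file:
file_put_contents('batch.txt', "INSERT INTO ...");

// Memory-friendlier alternative (not used in the article): read line by line.
$fh = fopen('words.txt', 'r');
while (($li = fgets($fh)) !== false) {
    // process $li here
}
fclose($fh);
?>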

3. When importing a large amount of data, it is best to import in batches to reduce the chance of failure.
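
The script above hardcodes one batch (lines 20000 to 25000) and is presumably re-run with different bounds for each batch. A sketch of driving those bounds from a loop instead; the $batchSize variable and the generate_batch_sql() helper mentioned in the comments are hypothetical, not from the article.

<?php
// Sketch only: drive the hardcoded 20000/25000 bounds from a loop.
$batchSize = 5000;
for ($start = 0; $start < 100000; $start += $batchSize) {
    $end = $start + $batchSize;
    // Run the generation logic from the script above with $i >= $start && $i < $end,
    // e.g. $sql = generate_batch_sql('words.txt', $start, $end);   // hypothetical helper
    //      file_put_contents("$start-$end.txt", $sql);
}
?>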

4. Before the mass import, test the script several times on a small sample, for example on 100 records.
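
One simple way to do such a test run (an assumption, not shown in the article) is to cut the line array down before the loop:

<?php
// Test run: process only the first 100 lines of the source file,
// then run the same foreach loop as in the script above.
$line = explode("\n", file_get_contents('words.txt'));
$line = array_slice($line, 0, 100);
?>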

5. If, even after these adjustments, PHP's memory_limit is still not high enough, the script will not be able to run to completion.

(It is recommended to increase memory_limit by modifying php.ini rather than with temporary statements in the script.)
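
For example, the relevant php.ini directives could look like the following; the values shown are only examples, not from the article.

; in php.ini (example values)
memory_limit = 256M
max_execution_time = 300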
