
Loading massive log files into the database

WBOY
Release: 2016-06-13 12:53:52
Original

A log directory contains 10 log files. Each one is roughly 60 MB after compression and has a .gz suffix, e.g. a.gz, b.gz, and so on. Each line in a file looks like this (plus other fields):

id=2112112,email=xxx@163.com,...
id=2112112,email=xxx@163.com,...
id=2112112,email=xxx@163.com,...

Now I want to insert the entire contents of every file in this directory into the database. The tables are sharded by email, roughly log_1 through log_1000. Could someone suggest a detailed solution — specifically, how to get each file loaded into the database quickly, so the script runs efficiently?
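For reference, the sharding rule used later in this thread (crc32 of the email, mod 1000) can be sketched as a small helper. The function name `shard_table` is mine, not from the original post, and note that `% 1000` actually yields shard ids 0–999 rather than 1–1000:

```php
<?php
// Map an email address to its shard table name, assuming the
// crc32-mod-1000 rule from the script below. The helper name
// shard_table is illustrative, not from the original code.
function shard_table(string $table_pre, string $email): string
{
    $table_id = abs(crc32($email) % 1000);   // yields 0..999
    return $table_pre . $table_id;
}

echo shard_table('log_', 'xxx@163.com'), "\n";
```

Because crc32 is deterministic, every log line for the same email always lands in the same shard table.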
Here is the current code:

	<?php
	error_reporting(E_ALL & ~E_NOTICE);
	// Connection parameters (the mysql_* API is used here to match the
	// original; mysqli/PDO is the modern replacement)
	$mysql_host  = 'XX.XX.XX.XX';
	$mysql_user  = 'XXX';
	$mysql_pass  = 'XX';
	$mysql_port  = 3306;
	$mysql_db    = 'test';
	$table_pre   = 'log_';
	$gz_log_file = 'a.gz';
	// Script execution log
	$exec_log = '/data_log/record.txt';
	file_put_contents($exec_log, '*****************************************START***********************************'."\r\n", FILE_APPEND);
	file_put_contents($exec_log, 'param is mysql_host='.$mysql_host.' mysql_user='.$mysql_user.' mysql_pass='.$mysql_pass.' mysql_port='.$mysql_port.' mysql_db='.$mysql_db.' table_pre='.$table_pre.' gz_log_file='.$gz_log_file.' start_time='.date("Y-m-d H:i:s")."\r\n", FILE_APPEND);
	// Read the log file and insert into the database
	$z_handle = gzopen($gz_log_file, 'r');
	$time_start = microtime(true);   // microtime_float() was never defined; microtime(true) is built in
	// Connect to the database
	$conn = mysql_connect("$mysql_host:$mysql_port", $mysql_user, $mysql_pass);
	if (!$conn) {
		file_put_contents($exec_log, 'Could not connect database error, error='.mysql_error()."\r\n", FILE_APPEND);
		exit;
	}
	$select_db = mysql_select_db($mysql_db);
	if (!$select_db) {
		file_put_contents($exec_log, 'select database error, database='.$mysql_db."\r\n", FILE_APPEND);
		exit;
	}
	while (!gzeof($z_handle)) {
		$each_gz_line = gzgets($z_handle, 4096);
		$line_to_array = explode("\t", $each_gz_line);
		// Skip invalid log lines
		if (!empty($line_to_array[3]) && !empty($line_to_array[2]) && !empty($line_to_array[4])) {
			// Pick the shard table BEFORE building the SQL: in the original
			// code $table_name was used before it was assigned
			$table_id   = abs(crc32($line_to_array[2]) % 1000);
			$table_name = $table_pre.$table_id;
			$insert_value = "('".$line_to_array[3]."','".$line_to_array[2]."','".$line_to_array[1]."','".$line_to_array[4]."','".$line_to_array[0]."')";
			// NOTE: five values are built above but only four columns are
			// listed below; the original post has the same mismatch, and the
			// column list must be made to match the values
			$insert_sql = "insert into $table_name (uid,email,ip,ctime) values $insert_value";
			$result = mysql_query($insert_sql);
			if (!$result) {
				// Log failed inserts
				file_put_contents($exec_log, 'table_name='.$table_name.' email='.$line_to_array[2]."\r\n", FILE_APPEND);
			}
		}
	}
	$time_end = microtime(true);
	$diff = $time_end - $time_start;
	file_put_contents($exec_log, 'success to insert database,log_file is '.$gz_log_file.' time-consuming is='.$diff."s \r\n", FILE_APPEND);
	file_put_contents($exec_log, '*******************************************END***********************************'."\r\n", FILE_APPEND);
	gzclose($z_handle);

The code above runs unbearably slowly. Could an expert please help?

Massive logs? Log analysis? Scripts? Efficiency
------Solution--------------------
Change the table type to InnoDB, then wrap the inserts in a transaction.
If that is still not enough, switch to LOAD DATA INFILE.
Source: php.cn