Export the slow_log Table to a Slow Query Log File to Use pt-query-digest
Working on customer cases always brings challenges. This week I got a case where the customer provided a dump of the mysql.slow_log table and needed it analyzed. Unfortunately, pt-query-digest does not work with this format.
Imagine a common scenario: you have an AWS RDS instance running and the slow query log file is not available to perform the analysis.
Fun fact: Azure allows you to download the file; you can check it here.
To keep the story short, the query below is the hero of the day. It extracts the data from the slow_log table and formats it the same way the slow query log file does.
$ mysql -uroot -pmsandbox -h 127.0.0.1 -P 3306 -D mysql -s -r -e \
    "SELECT CONCAT(
        '# Time: ', DATE_FORMAT(start_time, '%y%m%d %H%i%s'), '\n',
        '# User@Host: ', user_host, '\n',
        '# Query_time: ', TIME_TO_SEC(query_time),
        ' Lock_time: ', TIME_TO_SEC(lock_time),
        ' Rows_sent: ', rows_sent,
        ' Rows_examined: ', rows_examined, '\n',
        sql_text, ';'
    ) FROM mysql.slow_log" > ./slow_log.log
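To give an idea of the result, here is an illustrative (made-up) entry showing the shape each row takes in the generated file. Note that TIME_TO_SEC() truncates the timings to whole seconds:

```
# Time: 190101 120000
# User@Host: root[root] @ localhost []
# Query_time: 2 Lock_time: 0 Rows_sent: 10 Rows_examined: 1000
SELECT * FROM t;
```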
Please note that the options above can be customized. From mysql --help:
-r, --raw           Write fields without conversion. Used with --batch.
-s, --silent        Be more silent. Print results with a tab as separator,
                    each row on new line.
-P, --port=#        Port number to use for connection or 0 for default to, in
                    order of preference, my.cnf, $MYSQL_TCP_PORT,
                    /etc/services, built-in default (3306).
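If you already have the slow_log rows client-side (for example, fetched with a MySQL connector), the same transformation can be sketched in Python. This is a hypothetical helper, not part of any tool; the column order is assumed from the query above, and it mirrors the TIME_TO_SEC() truncation to whole seconds:

```python
from datetime import datetime, timedelta

def format_slow_log_entry(start_time, user_host, query_time, lock_time,
                          rows_sent, rows_examined, sql_text):
    """Render one mysql.slow_log row in slow query log file format.

    Mirrors the CONCAT() expression above: DATE_FORMAT's '%y%m%d %H%i%s'
    maps to strftime's '%y%m%d %H%M%S', and the timings are truncated
    to whole seconds, as TIME_TO_SEC() does.
    """
    return (
        "# Time: {}\n".format(start_time.strftime("%y%m%d %H%M%S"))
        + "# User@Host: {}\n".format(user_host)
        + "# Query_time: {} Lock_time: {} "
          "Rows_sent: {} Rows_examined: {}\n".format(
              int(query_time.total_seconds()),
              int(lock_time.total_seconds()),
              rows_sent, rows_examined)
        + sql_text + ";"
    )

# Example row in the slow_log column order used by the query above
row = (datetime(2019, 1, 1, 12, 0, 0), "root[root] @ localhost []",
       timedelta(seconds=2), timedelta(seconds=0), 10, 1000,
       "SELECT * FROM t")
entry = format_slow_log_entry(*row)
print(entry)
```

Writing one such entry per row, separated by newlines, produces a file that pt-query-digest can consume just like the shell one-liner does.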
Finally, it is possible to use the pt-query-digest tool to create the analysis:
$ pt-query-digest slow_log.log > out
Hope it helps! See you next time!