Hey there, fellow data wranglers and database enthusiasts! If you've ever found yourself knee-deep in the labyrinth of MySQL commands, you know the struggle is real. Whether you're a seasoned database wizard or just dipping your toes into the SQL sea, having a trusty arsenal of commands at your disposal is like having a treasure map in a vast ocean of data.
Intro
Over the years, I've amassed a collection of MySQL commands that have been my trusty companions through thick and thin. They've rescued me from database disasters, helped me unravel complex queries, and even brought a little joy to those long nights of debugging. And now, dear reader, I'm thrilled to share this treasure trove with you!
So grab your mug of coffee (or tea, I don't judge), cozy up in your favorite coding nook, and let's embark on a journey through some of the most useful MySQL commands you'll ever need. Whether you're looking to streamline your workflow, troubleshoot like a pro, or simply impress your peers at the next database party (hey, we all need goals), you're in the right place.
Let's dive in and unlock the power of MySQL, one command at a time!
Basic commands
The basics: simple querying and modifying of data in tables, included mainly for completeness.
-- SELECT, the most fundamental command in MySQL - used to retrieve data.
SELECT * FROM CommonWords; -- Retrieve all columns ('*') from table CommonWords
SELECT COUNT(*) FROM CommonWords; -- Returns the number of rows that exist in the table
SELECT CommonWord FROM CommonWords LIMIT 0,10; -- Retrieve column CommonWord, only 10 rows (Not guaranteed to be in any specific order though)
SELECT * FROM CommonWords ORDER BY CommonWord ASC LIMIT 0,10; -- Add sort order to guarantee LIMIT works as expected (the ORDER BY column should have an index for performance)
SELECT DISTINCT CommonWord FROM CommonWords; -- Retrieve column CommonWord with any duplicates removed (you should avoid duplicate data in the table, but you don't always have control over that)
-- INSERT, used to add new rows of data into a table.
INSERT INTO CommonWords (CommonWordID, CommonWord) VALUES (1, 'the'); -- Adds a new row in table CommonWords
INSERT INTO CommonWords VALUES (1, 'the'); -- Can be used if you are adding values for ALL the columns of the table (I would avoid this; it can break after future schema changes)
INSERT IGNORE INTO CommonWords VALUES (1, 'the'); -- If the table already contains the value in a UNIQUE index or PRIMARY KEY column, the row is silently skipped instead of producing an error
-- UPDATE, used to modify existing data in a table.
-- Make sure to always include a WHERE clause unless you want ALL rows to be updated.
UPDATE CommonWords SET CommonWord='the' WHERE CommonWordID=1;
-- DELETE, used to delete rows from a table.
-- Make sure to always include a WHERE clause unless you want ALL rows to be deleted.
DELETE FROM CommonWords WHERE CommonWordID=1; -- Deletes a specific row
DELETE FROM CommonWords WHERE CommonWord='the'; -- Same as above, but COULD delete multiple rows if the column does not have a UNIQUE index and is not the PRIMARY KEY
Administration
A few commands that have to do with administrative tasks.
Show and switch databases
SHOW DATABASES; -- Yup, this will list all databases on the server. A good place to start.
USE my_database; -- Switch to another database 'my_database'
SHOW TABLES; -- Shows all tables (and views) in the selected database
TimeZones
SHOW VARIABLES LIKE '%time_zone%';
SELECT @@system_time_zone AS SystemTimeZone, @@time_zone AS TimeZone;
SELECT @@GLOBAL.time_zone, @@SESSION.time_zone;
DB Data Sizes
SELECT table_schema AS "Database", ROUND(SUM(data_length + index_length) / 1024 / 1024, 1) AS "Size (MB)" FROM information_schema.tables GROUP BY table_schema;
SELECT table_schema AS "Database", SUM(data_length + index_length) / 1024 / 1024 / 1024 AS "Size (GB)" FROM information_schema.tables GROUP BY table_schema;
Table Statistics
ANALYZE TABLE CommonWords; -- Generates / updates table statistics
SHOW TABLE STATUS LIKE 'CommonWords'; -- About 'Rows' in the statement result: Some storage engines, such as MyISAM, store the exact count. For other storage engines, such as InnoDB, this value is an approximation, and may vary from the actual value by as much as 40% to 50%. In such cases, use SELECT COUNT(*) to obtain an accurate count.
SELECT * FROM INFORMATION_SCHEMA.STATISTICS WHERE table_name='commonwords' AND table_schema='my_db_name_here';
Host Blocked Error
I ran into this issue when a new export function to a statistics database was set up. The root cause was that the new solution hammered the database with slow-running queries until the server eventually said 'enough is enough'. This was the error message we received:
Host 'XXX.XXX.XXX.XXX' is blocked because of many connection errors; unblock with 'mysqladmin flush-hosts'
If you already have access to the database (through MySQL Workbench, for example), the following commands might help you identify and resolve the issue:
-- Read more about the error here: https://stackoverflow.com/questions/36272953/error-1129-host-blocked-because-of-many-connection-errors-unblock-with-mys
SELECT * FROM performance_schema.host_cache LIMIT 100; -- Check host cache to confirm that this is the issue
SHOW VARIABLES LIKE "max_connections"; -- Shorter version of same query: SELECT @@max_connections;
SHOW VARIABLES LIKE "max_connect_errors"; -- Shorter version of same query: SELECT @@max_connect_errors;
FLUSH HOSTS; -- Run after the issue has been resolved to remove the blocked hosts
Performance
EXPLAIN
Prefixing a query with EXPLAIN gives you a lot of useful data that can help you identify bottlenecks and other performance issues. As of MySQL 5.7 it works with SELECT, DELETE, INSERT, REPLACE, and UPDATE statements. It does not yet work on a CALL to a Stored Procedure, but you can use the EXPLAIN prefix on the statements inside the Stored Procedure.
EXPLAIN SELECT * FROM CommonWords ORDER BY WordLength DESC; -- Returns 'Using filesort' in the Extra column, which is not great and could cause performance issues. Column WordLength is not indexed -> slow.
EXPLAIN SELECT * FROM CommonWords ORDER BY CommonWordID; -- Returns '' in the Extra column, but we can see that the primary key index is being used -> fast.
EXPLAIN SELECT * FROM CommonWords ORDER BY CommonWordID DESC; -- Returns 'Backward index scan' in the Extra column. Primary key index is being used -> fast.
Profiling Stored Procedures
Here are a couple of queries that can be useful when profiling Stored Procedures. I never use these on production servers (profiling can be resource-intensive), but I do use them on staging servers that contain a big dataset from production, so the environment is as close to production as possible.
UPDATE performance_schema.setup_consumers SET ENABLED="YES" WHERE NAME = "events_statements_history_long"; -- Activate the profiling
CALL BlogPostGet('useful_mysql_commands'); -- The Stored procedure that I want to profile.
UPDATE performance_schema.setup_consumers SET ENABLED="NO" WHERE NAME = "events_statements_history_long"; -- Disable the profiling (OPTIONAL if you want to keep the profiling open for multiple Stored Procedures in a row)
SELECT event_id, SQL_TEXT, CONCAT(TIMER_WAIT/1000000000,"ms") AS Time FROM performance_schema.events_statements_history_long WHERE event_name="statement/sql/call_procedure"; -- Review the result data, use the event_id for the query below.
SELECT EVENT_NAME, SQL_TEXT, CONCAT(TIMER_WAIT/1000000000,"ms") AS Time FROM performance_schema.events_statements_history_long WHERE nesting_event_id=1234 ORDER BY event_id; -- Change 1234 to the id from the previous query.
TRUNCATE TABLE performance_schema.events_statements_history_long; -- Removes all rows in events_statements_history_long table if you need a fresh start.
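The division by 1,000,000,000 in the queries above is not arbitrary: Performance Schema timer columns such as TIMER_WAIT are reported in picoseconds, and there are 10^9 picoseconds in a millisecond. A small Python sketch of the conversion (the helper name is mine, not part of MySQL):

```python
def picoseconds_to_ms(timer_wait: int) -> float:
    """Convert a Performance Schema TIMER_WAIT value (picoseconds) to milliseconds.

    Mirrors the CONCAT(TIMER_WAIT/1000000000, "ms") expression used above.
    """
    return timer_wait / 1_000_000_000

# A statement that waited 2,500,000,000,000 ps took 2500 ms (2.5 s).
print(picoseconds_to_ms(2_500_000_000_000))  # 2500.0
```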
Date and time
A random collection of queries related to Date and Time.
SELECT now(); -- date and time
SELECT curdate(); -- date
SELECT curtime(); -- time in 24-hour format
-- The one true date format
SELECT DATE_FORMAT(now(), '%Y-%m-%d %H:%i:%s') AS TheOneTrueDateFormat; -- Result: '2024-05-07 14:44:29'
-- First and Last day of the month (replace now() and curdate() with an actual date if needed)
SELECT DATE_ADD(curdate(), INTERVAL -DAY(curdate())+1 DAY) AS first_day_of_month;
SELECT DATE_ADD(DATE_ADD(LAST_DAY(now()), INTERVAL 1 DAY), INTERVAL -1 MONTH) AS first_day_of_month_v2;
SELECT LAST_DAY(now()) AS last_day_of_month;
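The DATE_ADD trick above works because stepping back (day-of-month - 1) days from any date always lands on the 1st. The same arithmetic sketched in Python's datetime, as a sanity check (the function names are mine):

```python
from datetime import date, timedelta
import calendar

def first_day_of_month(d: date) -> date:
    # Mirrors DATE_ADD(d, INTERVAL -DAY(d)+1 DAY): step back (day-of-month - 1) days.
    return d - timedelta(days=d.day - 1)

def last_day_of_month(d: date) -> date:
    # Mirrors LAST_DAY(d): monthrange() returns (weekday of day 1, number of days).
    return d.replace(day=calendar.monthrange(d.year, d.month)[1])

print(first_day_of_month(date(2024, 5, 7)))  # 2024-05-01
print(last_day_of_month(date(2024, 5, 7)))   # 2024-05-31
```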
-- First and Last day of the week
SELECT DATE_ADD(curdate(), INTERVAL(-WEEKDAY(curdate())) DAY); -- First day, if week starts on Monday - the normal way ;)
SELECT DATE_ADD(curdate(), INTERVAL(1-DAYOFWEEK(curdate())) DAY); -- First day, if week starts on Sunday
SELECT DATE_ADD(curdate(), INTERVAL(6-WEEKDAY(curdate())) DAY); -- Last day, if week starts on Monday - the normal way ;)
SELECT DATE_ADD(curdate(), INTERVAL(7-DAYOFWEEK(curdate())) DAY); -- Last day, if week starts on Sunday
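The two functions use different numbering, which is why the offsets differ: WEEKDAY() counts Monday = 0 through Sunday = 6, while DAYOFWEEK() counts Sunday = 1 through Saturday = 7. The week-start arithmetic sketched in Python as a sanity check (function names are mine; Python's weekday() happens to match WEEKDAY()):

```python
from datetime import date, timedelta

def week_start_monday(d: date) -> date:
    # Python's weekday() matches MySQL's WEEKDAY(): Monday = 0.
    # Mirrors DATE_ADD(d, INTERVAL -WEEKDAY(d) DAY).
    return d - timedelta(days=d.weekday())

def week_start_sunday(d: date) -> date:
    # MySQL's DAYOFWEEK(): Sunday = 1 ... Saturday = 7.
    dayofweek = d.isoweekday() % 7 + 1
    # Mirrors DATE_ADD(d, INTERVAL 1-DAYOFWEEK(d) DAY).
    return d + timedelta(days=1 - dayofweek)

# 2024-05-07 is a Tuesday.
print(week_start_monday(date(2024, 5, 7)))  # 2024-05-06
print(week_start_sunday(date(2024, 5, 7)))  # 2024-05-05
```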
Misc
Find Duplicates
Sometimes you find yourself in a situation where you have to find duplicates in a table. I would prefer to have the tables created with unique keys (or some other constraint) to avoid duplicates altogether, but sometimes that is not an option.
-- In this case, the CommonWordID column is a Primary Key and the CommonWord column is where we suspect that a duplicate exists.
SELECT
CommonWord, COUNT(CommonWordID)
FROM
CommonWords
GROUP BY
CommonWord HAVING COUNT(CommonWordID) > 1;
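The same GROUP BY / HAVING logic sketched in Python on some hypothetical sample rows, in case the SQL feels opaque (the data is made up for illustration):

```python
from collections import Counter

# Hypothetical rows of (CommonWordID, CommonWord), with 'the' duplicated.
rows = [(1, 'the'), (2, 'of'), (3, 'and'), (4, 'the')]

# GROUP BY CommonWord -> count occurrences per word;
# HAVING COUNT(...) > 1 -> keep only words that appear more than once.
counts = Counter(word for _, word in rows)
duplicates = {word: n for word, n in counts.items() if n > 1}
print(duplicates)  # {'the': 2}
```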
Find Specific Column Name
If you are working with a large or confusing database and simply cannot find the column that the legacy code uses, this is a simple search for exactly that.
SELECT
table_name,
column_name
FROM
information_schema.columns
WHERE
column_name LIKE '%CommonWord%';
JSON Result
JSON output directly from a query can be very useful.
SELECT
JSON_ARRAYAGG(JSON_OBJECT('index', cwSubquery.CommonIndex, 'word', cwSubquery.CommonWord))
FROM
    (SELECT * FROM CommonWords ORDER BY CommonIndex LIMIT 0,3) AS cwSubquery;
-- Result: '[{"word": "the", "index": 1}, {"word": "of", "index": 2}, {"word": "and", "index": 3}]'
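The equivalent aggregation sketched in Python with json.dumps, using made-up rows that match the subquery (note that MySQL normalizes JSON object key order, which is why 'word' appears before 'index' in the result above; the Python version keeps insertion order instead):

```python
import json

# Hypothetical rows matching the subquery: the first three words by CommonIndex.
rows = [(1, 'the'), (2, 'of'), (3, 'and')]

# Same shape as JSON_ARRAYAGG(JSON_OBJECT('index', ..., 'word', ...)).
result = json.dumps([{'index': i, 'word': w} for i, w in rows])
print(result)
```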
The End
Well, that was a lot of fun looking through some old scripts that I had saved in long-forgotten folders. I'm sure I will find more over time and will add to this list when I do.
I hope you found some of these commands useful.