All of these results that we've been getting back from EXPLAIN are estimates. They're good estimates, but they're estimates nonetheless. If you want an actual report of how the query ran, you can add ANALYZE to your EXPLAIN.
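As a quick point of reference, this is roughly what the estimate-only form looks like; the users table and the email filter here are just stand-ins, not anything from the lesson:

```sql
-- Plain EXPLAIN: Postgres plans the query and shows estimated costs and rows,
-- but never actually executes it.
EXPLAIN SELECT * FROM users WHERE email = 'aaron@example.com';
```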
With a huge red flashing caveat, it's going to run the query.
It's going to run the query, even though it discards the results. It is going to run the query even if it is an update, a delete, or an insert. It is going to run the query. You have been warned, be very careful.
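If you do need real numbers for a write, one common way to stay safe is to wrap the EXPLAIN ANALYZE in a transaction and roll it back. The statement still genuinely runs, but its changes are thrown away when the transaction ends. The table and the update below are placeholders, a sketch of the pattern rather than anything from this lesson:

```sql
BEGIN;

-- This UPDATE really executes inside the transaction, so ANALYZE can report
-- actual times and row counts for it.
EXPLAIN (ANALYZE)
UPDATE users
SET is_active = false
WHERE last_login < now() - interval '1 year';

-- Undo the changes. (Side effects like sequence advances still stick.)
ROLLBACK;
```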
Here's our estimated one that just shows the cost estimates, and if you want to actually run the query, and I'm gonna keep warning you, you can put ANALYZE there. You don't have to put it in parentheses, but you can; that's fine, we'll just run it that way. You'll see we get a lot more information. We get a planning time and an execution time, which is very helpful. We get a little bit more detail, which we're not gonna go into right now, but then we also get this extra set of parentheses out here. We get the actual time of 28.8 and then 42 there, which is, again, startup time and total time. We get the actual rows, which we can compare against the estimate, and it was pretty close, if not exactly right. And then some loops.
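For reference, the ANALYZE form looks like this; again, the table and filter are hypothetical stand-ins:

```sql
-- EXPLAIN ANALYZE actually runs the query, then reports the estimates
-- alongside the real numbers: actual time, actual rows, loops, plus
-- planning and execution time. Parentheses are optional for ANALYZE alone.
EXPLAIN (ANALYZE) SELECT * FROM users WHERE email = 'aaron@example.com';
```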
Now, we'll talk about loops in a second, but the cost and the actual time are fundamentally different units. The cost is in Postgres' arbitrary cost units; the time is in milliseconds. If you're looking at those two and thinking, "Ah, those should line up," and that something's terribly wrong when they don't: they're just different units. They're not gonna line up. If that annoys you, you can turn the costs off. The way you do that is open the parentheses and, after a comma, add COSTS OFF, and that will just show you the actual time.
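That option looks like this (same hypothetical query as above):

```sql
-- COSTS OFF hides the estimated cost units, leaving the actual millisecond
-- timings and row counts in the output.
EXPLAIN (ANALYZE, COSTS OFF) SELECT * FROM users WHERE email = 'aaron@example.com';
```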
The one thing we have remaining here is the loops. Each node can be run multiple times. Depending on where it is in the query and what else is going on, it could be run once for every row in the result set; it could be run many, many times. What you'll end up seeing there is the average. If you see an actual time of 19.604 on a node, that is a per-loop number based on an average. So if a node shows 19 milliseconds but it ran a million times, that node is a pretty big offender. You might want to check into that and see if you can get that number down. That's what the loops means there.
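A typical place to see high loop counts is a nested loop join, where the inner side (often an index scan) runs once per outer row, so its real contribution is roughly actual time multiplied by loops. The join below is a hypothetical illustration; the tables are made up, and the planner may well pick a different join strategy on real data:

```sql
-- If the planner chooses a nested loop here, the inner index scan on users
-- may show loops equal to the number of orders rows. Its "actual time" is the
-- per-loop average, so the real time spent there is about actual time * loops.
EXPLAIN (ANALYZE)
SELECT *
FROM orders o
JOIN users u ON u.id = o.user_id;
```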
EXPLAIN ANALYZE is a hugely helpful tool when you have exhausted all of the EXPLAINs. If you can't solve it with an EXPLAIN, you might reach for an EXPLAIN ANALYZE, with the giant caveat that it's going to actually run your query, so be very careful out there.