
Mistake #3: Writing Database Queries Inside Loops (Killer of Speed)
March 29, 2025 · About 2 min
Why This is a Problem
This is one of the biggest slow-code mistakes beginners (and even intermediates) make. It happens because loops feel natural, and database queries feel straightforward. But mix the two together, and you’ve got a performance disaster.
Every time you query the database inside a loop, you make a separate round trip. Each query adds network latency, processing overhead, and unnecessary load on your system.
Example
Imagine you’re fetching user details for a list of `user_ids` like this:
```python
user_ids = [1, 2, 3, 4, 5]

for user_id in user_ids:
    user = db.query(f"SELECT * FROM users WHERE id = {user_id}")
    print(user)  # Do something with the user
```
What's wrong here?
- You're hitting the database multiple times instead of once.
- Each call has network overhead (database queries aren’t instant); the timing sketch below makes this concrete.
- Performance tanks when user_ids gets large.
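
To see how much those round trips cost, here’s a toy sketch that simulates a fixed network latency per query (the `fake_query` helper and the 10 ms figure are illustrative assumptions, not measurements from a real database):

```python
import time

SIMULATED_LATENCY = 0.01  # assume roughly 10 ms of network round trip per query

def fake_query(sql):
    """Stand-in for a real database call; the sleep models the round trip."""
    time.sleep(SIMULATED_LATENCY)
    return sql

user_ids = list(range(1, 101))  # 100 users

start = time.perf_counter()
for user_id in user_ids:
    fake_query(f"SELECT * FROM users WHERE id = {user_id}")
loop_time = time.perf_counter() - start

start = time.perf_counter()
fake_query(f"SELECT * FROM users WHERE id IN ({','.join(map(str, user_ids))})")
bulk_time = time.perf_counter() - start

print(f"loop: {loop_time:.2f}s, bulk: {bulk_time:.2f}s")  # roughly 1.00s vs 0.01s
```

The loop pays the latency 100 times; the bulk query pays it once. Against a real database the gap is often wider still, because every query also carries parsing and planning overhead.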
How to Fix It: Use Bulk Queries
Instead of making 5 separate queries, make one:
```python
user_ids = [1, 2, 3, 4, 5]

users = db.query(f"SELECT * FROM users WHERE id IN ({','.join(map(str, user_ids))})")

for user in users:
    print(user)  # Process users efficiently
```
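
One caveat: joining ids into the SQL string is only safe because these ids are plain integers. If the values ever come from user input, use query placeholders so the driver handles escaping. Here’s a hedged sketch using Python’s built-in sqlite3 module (the in-memory database and its users table are illustrative assumptions, not the article’s setup):

```python
import sqlite3

# Set up a throwaway in-memory database just for the demo
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO users (id, name) VALUES (?, ?)",
    [(i, f"user{i}") for i in range(1, 6)],
)

user_ids = [1, 2, 3, 4, 5]
placeholders = ",".join("?" for _ in user_ids)  # -> "?,?,?,?,?"

# Still a single round trip, but the driver binds the values safely
users = conn.execute(
    f"SELECT * FROM users WHERE id IN ({placeholders})", user_ids
).fetchall()

for user in users:
    print(user)
```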
Why this is better:
- One database call instead of many, which is dramatically faster.
- Less network overhead, so your app feels snappier.
- It scales: the same approach works even when `user_ids` has 10,000+ entries (see the batching note below).
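
One practical note: some databases limit how many values an IN clause can hold, so for very large id lists it’s common to split the ids into batches. The helper below is a generic sketch (the `fetch_users` name and the 1,000-id chunk size are hypothetical), not something from the article:

```python
def chunked(items, size):
    """Yield successive fixed-size slices of a list."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

def fetch_users(db, user_ids, chunk_size=1000):
    """Fetch users in a handful of bulk queries instead of one query per id."""
    users = []
    for chunk in chunked(user_ids, chunk_size):
        id_list = ",".join(map(str, chunk))
        users.extend(db.query(f"SELECT * FROM users WHERE id IN ({id_list})"))
    return users
```

Even with 10,000 ids, that is ten queries instead of ten thousand.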
A More Scalable Approach
If you're using an ORM (like SQLAlchemy in Python or Sequelize in JavaScript), use batch fetching instead of looping:
```python
users = db.query(User).filter(User.id.in_(user_ids)).all()
```
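
For context, here’s a self-contained sketch of the same idea in SQLAlchemy. The `User` model and the in-memory SQLite engine are illustrative assumptions, not the article’s actual setup:

```python
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    name = Column(String)

# Throwaway in-memory database for the demo
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add_all([User(id=i, name=f"user{i}") for i in range(1, 6)])
    session.commit()

    user_ids = [1, 2, 3, 4, 5]
    # The ORM emits one SELECT ... WHERE id IN (1, 2, 3, 4, 5)
    users = session.query(User).filter(User.id.in_(user_ids)).all()
    for user in users:
        print(user.id, user.name)
```

You keep the readability of the model layer and still avoid the query-per-row loop.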