How to stop Django caching view results
It works properly. The problem is in caching only.
No, it doesn't work properly.
If you face some caching issues, you probably have a scope issue somewhere and you're not aware of it.
For example, if you have something like this in your module:
# somewhere in your module
queryset = SomeModel.objects.all()

def my_function():
    return list(queryset)  # list() will force the QuerySet evaluation
If you call my_function(), it will return a list of instances. Add or remove some instances and you won't see the change. The reason is that queryset is evaluated once and stays evaluated.
If you change the code to:

def my_function():
    return list(queryset.all())

you'll see the list change. This is because .all() returns a new queryset, which means it is not yet evaluated.
Most of the time, I prefer to be explicit and create the queryset directly in the function:

def my_function():
    return list(SomeModel.objects.all())
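The scope issue above can be mimicked in plain Python, no Django required. The FakeQuerySet class below is a hypothetical stand-in (not Django's QuerySet) that is lazy and caches its results on first evaluation, which is enough to reproduce the stale-module-level-queryset behavior and show why .all() refreshes it:

```python
class FakeQuerySet:
    """Toy stand-in for a Django QuerySet: lazy, and cached once evaluated."""

    def __init__(self, db):
        self._db = db          # the "table" the queryset reads from
        self._cache = None     # filled in on first evaluation

    def all(self):
        # Like QuerySet.all(): returns a NEW, unevaluated queryset.
        return FakeQuerySet(self._db)

    def __iter__(self):
        if self._cache is None:        # evaluate once, then reuse
            self._cache = list(self._db)
        return iter(self._cache)


db = [1, 2]                  # pretend database table
queryset = FakeQuerySet(db)  # module-level "queryset", as in the example above

stale = list(queryset)            # first evaluation: [1, 2]
db.append(3)                      # another process adds a row
still_stale = list(queryset)      # cached result: still [1, 2]
fresh = list(queryset.all())      # .all() -> new queryset, re-evaluated: [1, 2, 3]
```

The same shape explains the fix: `queryset.all()` builds a fresh, unevaluated object each call, so the database is consulted again.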
How to disable Django query cache?
I came across behavior that I thought was some kind of caching, but it turned out to be database transactions fooling me.
I had a situation where, in another process, items were being added to the database. I wanted to monitor the progress of that other process, so I opened up a Django shell and issued the following:
>>> MyData.objects.count()
74674
>>> MyData.objects.count()
74674
The value wasn't changing, even though it actually was changing in the database. I realized that, at least with the way I had MySQL and Django set up, I was in a transaction and would only see a "snapshot" of the database taken at the time I opened the transaction.
Since I had autocommit behavior defined for views in Django, this was fine: each view only sees a snapshot, because the next time a view is called it runs in a different transaction. But a piece of code that does not automatically commit will not see any changes in the db except those made within its own transaction.
Just thought I would toss this answer in for anyone who may come upon this situation.
To solve it, commit your transaction, which can be done manually like so:
>>> from django.db import transaction
>>> transaction.enter_transaction_management()
>>> transaction.commit() # Whenever you want to see new data
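The snapshot effect described above is easy to reproduce outside Django. The sketch below uses SQLite in WAL mode (via the standard-library sqlite3 module) as a stand-in for MySQL's REPEATABLE READ: a reader inside an open transaction keeps seeing its original snapshot until it commits, no matter what other connections write.

```python
import os
import sqlite3
import tempfile

# A file-backed database is needed for WAL mode (":memory:" won't do).
path = os.path.join(tempfile.mkdtemp(), "demo.db")

writer = sqlite3.connect(path, isolation_level=None)  # autocommit
writer.execute("PRAGMA journal_mode=WAL")
writer.execute("CREATE TABLE mydata (id INTEGER PRIMARY KEY)")
writer.execute("INSERT INTO mydata VALUES (1)")

reader = sqlite3.connect(path, isolation_level=None)
reader.execute("PRAGMA journal_mode=WAL")
reader.execute("BEGIN")  # open a transaction, like the idle Django shell

# First read pins the reader's snapshot.
count_before = reader.execute("SELECT COUNT(*) FROM mydata").fetchone()[0]

# Another connection inserts and (auto)commits -- like the other process.
writer.execute("INSERT INTO mydata VALUES (2)")

# Still inside the old transaction: the new row is invisible.
count_stale = reader.execute("SELECT COUNT(*) FROM mydata").fetchone()[0]

# Ending the transaction gives the next read a fresh snapshot.
reader.execute("COMMIT")
count_fresh = reader.execute("SELECT COUNT(*) FROM mydata").fetchone()[0]
```

Here `count_before` and `count_stale` are both 1, and only `count_fresh` sees the second row, which is exactly the "unchanging count" symptom from the shell session above.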
Django caching queries (I don't want it to)
The queryset is not cached 'within a session'.
The Django documentation: Caching and QuerySets mentions:
Each QuerySet contains a cache to minimize database access. Understanding how it works will allow you to write the most efficient code.
In a newly created QuerySet, the cache is empty. The first time a QuerySet is evaluated – and, hence, a database query happens – Django saves the query results in the QuerySet’s cache and returns the results that have been explicitly requested (e.g., the next element, if the QuerySet is being iterated over). Subsequent evaluations of the QuerySet reuse the cached results.
Keep this caching behavior in mind, because it may bite you if you don’t use your QuerySets correctly.
(emphasis mine)
For more information, refer to the Django documentation on when querysets are evaluated.
If it is critical for your application that the querysets get updated, you have to evaluate them each time, whether within a single view function or via Ajax.
It is like running a SQL query again and again, as in the old days when no querysets were available and you kept the data in some structure that you had to refresh yourself.
Django Cache (Redis) auto refresh when data is changed
So, after a year, this is what I came up with.
We can register a custom post_save receiver for the models in the app.py of each app where you want to update your cache, or add the post_save listener for all the models while the project is initializing.
These posts helped me come up with a receiver function that clears the cache for the Redis key.
In my application I used "{}-{}-{}".format(slug, model_name, username) as my Redis key, and I delete this key every time the model data for that user is changed.
Here is the post I found helpful:
Blog
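The invalidation pattern described above can be sketched in pure Python, with no Django or Redis needed. Everything here is a hypothetical mimic: post_save_receivers stands in for Django's post_save signal, a plain dict stands in for Redis, and save_model stands in for a model save; only the key scheme comes from the answer itself.

```python
post_save_receivers = []  # stand-in for Django's post_save signal


def receiver(func):
    """Register a function to run after every 'save' (mimics @receiver)."""
    post_save_receivers.append(func)
    return func


cache = {}  # stand-in for Redis


def cache_key(slug, model_name, username):
    # Same "{}-{}-{}" key scheme as in the answer above.
    return "{}-{}-{}".format(slug, model_name, username)


@receiver
def clear_user_cache(slug, model_name, username):
    # Drop the cached entry so the next read repopulates it from the db.
    cache.pop(cache_key(slug, model_name, username), None)


def save_model(slug, model_name, username):
    # After "saving", fire the signal, as Django's post_save would.
    for fn in post_save_receivers:
        fn(slug, model_name, username)


cache[cache_key("home", "Article", "alice")] = "cached page"
save_model("home", "Article", "alice")
invalidated = cache_key("home", "Article", "alice") not in cache  # True
```

In a real project the receiver would call your Redis client's delete method inside a function connected to post_save, but the control flow is the same: save fires the signal, the signal clears the per-user key.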
Django with tastypie how to refresh queryset?
Django will cache the results within a single queryset, but not across unrelated querysets.
The following code will result in two database queries:
list(SomeTable.objects.all()) # First db query.
list(SomeTable.objects.all()) # Second db query unrelated to first.
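That distinction can be sketched with a toy lazy query class (hypothetical, not Django's) that counts how often it actually hits the "database": iterating one queryset twice costs one query, while two unrelated querysets cost one query each.

```python
class CountingQuery:
    """Toy lazy query: hits the 'database' once, then serves its cached copy."""

    hits = 0  # class-wide counter of simulated database round-trips

    def __init__(self, table):
        self._table = table
        self._cache = None

    def __iter__(self):
        if self._cache is None:
            CountingQuery.hits += 1       # simulated database query
            self._cache = list(self._table)
        return iter(self._cache)


table = ["a", "b"]

qs = CountingQuery(table)
list(qs)
list(qs)                                  # same queryset: served from its cache
one_queryset_hits = CountingQuery.hits    # 1

list(CountingQuery(table))                # unrelated queryset: queries again
list(CountingQuery(table))                # and again
total_hits = CountingQuery.hits           # 3
```

This mirrors the two-line example above: each fresh SomeTable.objects.all() is a new, empty-cached queryset, so each list() call costs its own database query.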
The symptoms and your connection.close() solution are consistent with a MySQL transaction issue. A new connection on each query will result in a different transaction.
It seems as though a transaction is being started in your Django code, but never committed or rolled back. When the transaction starts, further reads/writes in that transaction will only see the state of the database at the start of the transaction, plus any updates made within that transaction. If another process starts a session (e.g. your data-mining script) and inserts data, it will not be seen by the Django transaction until the existing transaction is closed.
Without seeing your code it's impossible to tell why there would be a transaction started but not completed. To verify this assumption, turn on the MySQL query log and examine its contents. You should see a start transaction query without a corresponding commit or rollback.