This only happens under certain circumstances: iterating over the queryset directly (`for entry in queryset`) triggers the failure, whereas materializing it first with `list(queryset)` does not.
I'm using theatlantic's fork with Redis, but I've verified that this problem also exists in jbalogh/master with memcached.
Given the following model, which has an ImageWithThumbsField (provided by django-thumbs):
```python
class Debug(caching.base.CachingMixin, models.Model):
    text = models.CharField(max_length=512, blank=True, null=True)
    img = ImageWithThumbsField(
        verbose_name='image',
        upload_to='uploads/projects/',
        styles=({'w': 100, 'h': 100},),
        thumb_method=generate_thumb_square_top,
        blank=True,
        null=True
    )

    objects = caching.base.CachingManager()
```
and the following debug management command:
```python
from django.core.management.base import NoArgsCommand


class Command(NoArgsCommand):
    def __init__(self):
        pass

    def handle_noargs(self, **options):
        from django.core.cache import cache
        from atizo.apps.debug.models import Debug

        def _add_objects(count):
            print "add %d" % count
            n = 0
            while n < count:
                Debug().save()
                n += 1

        def _fetch_via_list():
            print "via list()"
            cache.clear()
            # populate cache
            query = list(Debug.objects.all())
            for entry in query:
                img = entry.img.url_100x100
            # fetch from cache
            query = list(Debug.objects.all())
            for entry in query:
                img = entry.img.url_100x100

        def _fetch_via_iterator():
            print "via iterator"
            cache.clear()
            # populate cache
            query = Debug.objects.all()
            for entry in query:
                img = entry.img.url_100x100
            # fetch from cache
            query = Debug.objects.all()
            try:
                for entry in query:
                    img = entry.img.url_100x100  # <- will fail if there are more than 100 obj's
            except:
                print "fail!"

        Debug.objects.all().delete()
        cache.clear()

        _add_objects(1)
        _fetch_via_list()      # success
        _fetch_via_iterator()  # success

        _add_objects(98)       # total 99
        _fetch_via_list()      # success
        _fetch_via_iterator()  # success

        _add_objects(1)        # total 100
        _fetch_via_list()      # success
        _fetch_via_iterator()  # fail

        _add_objects(270)      # total 370
        _fetch_via_list()      # success
        _fetch_via_iterator()  # fail
```
`django.db.models.query` has a constant `CHUNK_SIZE = 100`. If it is changed to 200, the problem only appears with collections of 200 objects or more.
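To illustrate why the boundary sits exactly at 100 objects, here is a minimal plain-Python sketch of chunked iteration. This is not Django's actual implementation, and `chunked_iterator` is an invented name; it only mimics the behaviour of `django.db.models.query`, which hands rows out one at a time but fetches them from the cursor in `CHUNK_SIZE` batches:

```python
CHUNK_SIZE = 100  # same value as django.db.models.query.CHUNK_SIZE


def chunked_iterator(rows, chunk_size=CHUNK_SIZE):
    """Yield rows one at a time, but pull them from the source in
    chunk_size batches, mimicking queryset iteration."""
    it = iter(rows)
    while True:
        # Fetch the next batch of up to chunk_size rows.
        chunk = []
        for _ in range(chunk_size):
            try:
                chunk.append(next(it))
            except StopIteration:
                break
        if not chunk:
            return
        # Hand the batch out row by row.
        for row in chunk:
            yield row


# With 99 rows everything fits in a single chunk; at 100 rows a second
# fetch is triggered -- exactly the boundary at which the failure appears.
assert len(list(chunked_iterator(range(99)))) == 99
assert len(list(chunked_iterator(range(100)))) == 100
```

`list(queryset)` consumes the whole iterator in one go, while lazy iteration crosses chunk boundaries mid-loop, which matches the observed difference between the two access patterns.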
There is also a difference in the pickled cache entry:

- cache entry populated with `query = list(Debug.objects.all())`: http://pastie.org/3446872
- cache entry populated with `for entry in query: img = entry.img.url_100x100`: http://pastie.org/3446860
I have the same issue. Any update on this?