Block writing cache in `cached` decorator if custom condition is not satisfied (#649)
Sounds plausible. Can you give an example use case of what you'd use this for?
The real setup is quite a bit more complex, so I'll describe a simpler one. Suppose you need to call an external API that performs a computationally intensive calculation and returns the result as JSON. You want to hit that external API as rarely as possible, so you use caching:

```python
@aiocache.cached(
    alias="local-cache",
    key_builder=some_key_builder,
    ttl=ttl,
)
async def ask_ranking_worker(*args, **kwargs) -> tuple[int | None, Any]:
    # <request-external-api-and-get-the-response>
    return err, result


async def main() -> None:
    ...
    err, result = await ask_ranking_worker(*args, **kwargs)
    ...
    return
```

In some cases you want a failed call to be retried rather than cached, so you raise instead of returning the error:

```python
@aiocache.cached(
    alias="local-cache",
    key_builder=some_key_builder,
    ttl=ttl,
)
async def ask_ranking_worker(*args, **kwargs) -> tuple[int | None, Any]:
    # <request-external-api-and-get-the-response>
    if err is not None:
        raise RuntimeError(f"Worker failed with an error: {err}")
    return err, result
```

That workaround prevents the result from being cached and makes retries possible.
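Whatever aiocache's internals do, the effect of the raise-to-skip-caching workaround can be illustrated with a toy in-memory async cache (a sketch; `toy_cached` and `ask_worker` are illustrative names, not aiocache's API):

```python
import asyncio
import functools


def toy_cached(func):
    """Toy in-memory cache keyed by positional args. If the wrapped
    coroutine raises, nothing is written, so a later retry re-invokes it."""
    cache = {}

    @functools.wraps(func)
    async def wrapper(*args):
        if args in cache:
            return cache[args]
        result = await func(*args)  # an exception here skips the cache write
        cache[args] = result
        return result

    return wrapper


calls = 0


@toy_cached
async def ask_worker(x):
    global calls
    calls += 1
    if calls == 1:
        raise RuntimeError("worker failed")  # simulated transient failure
    return x * 2


async def main():
    try:
        await ask_worker(21)
    except RuntimeError:
        pass  # first attempt failed; nothing was cached
    return await ask_worker(21)  # retry re-invokes the worker and caches 42


print(asyncio.run(main()))  # → 42
```

The retry only works because the failed attempt never reached the cache; had the error tuple been stored, every subsequent call would return it until the TTL expired.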
Sure, you can do it with direct …
Hmm, it looks a bit unpythonic. Wouldn't it make more sense to raise an exception? Generally, Python code does not follow the design of returning an error state (that's more common in C or Go, for example). So it feels like your second example would be better, but without returning the error status.
Well, yeah, this code is not pythonic at all, I just tried to demonstrate the issue as simply as possible 😄 I can rewrite it without the error status:

```python
@aiocache.cached(
    alias="local-cache",
    key_builder=some_key_builder,
    ttl=ttl,
)
async def ask_ranking_worker(*args, **kwargs) -> Any:
    # <request-external-api-and-get-the-response>
    if not result:
        raise RuntimeError("Worker failed, no result received")
    return result


async def main() -> None:
    ...
    result = await ask_ranking_worker(*args, **kwargs)
    ...
    return
```

The problem remains the same: an empty `result` must not be written to the cache.
Right, I'm just trying to understand a real-world use case, to see what the best solution would be. I can think of a trivial example, but I'm not sure what the real use case is currently. In that example, it might work as you'd want, because I think the cache is not saved if the result is `None`.
I guess I'm wondering if there's a reason you'd not want to cache a result which is not …
In some cases the empty `result` …
Well, if it's only a question of whether it is falsy or not, we could just have a simple …
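A simple falsy-check option could look as follows; this is only a sketch with a hypothetical `cached_unless_empty` decorator, not anything aiocache provides:

```python
import asyncio
import functools


def cached_unless_empty(func):
    """Hypothetical sketch: cache the result only when it is truthy,
    so empty responses stay retryable instead of being pinned in cache."""
    cache = {}

    @functools.wraps(func)
    async def wrapper(*args):
        if args in cache:
            return cache[args]
        result = await func(*args)
        if result:  # the proposed check: don't store falsy values
            cache[args] = result
        return result

    return wrapper


responses = [[], ["post-1"]]  # first call returns nothing, then data appears
call_count = 0


@cached_unless_empty
async def get_posts(user_id):
    global call_count
    call_count += 1
    return responses[call_count - 1]


async def demo():
    first = await get_posts(1)   # empty → not cached
    second = await get_posts(1)  # re-invoked, result now cached
    third = await get_posts(1)   # served from cache
    return first, second, third


print(asyncio.run(demo()))  # → ([], ['post-1'], ['post-1'])
```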
Well, it's an interesting question :) I've started thinking about that feature. In my opinion, …
I can give a few examples where a callable is useful:

```python
# 1. Cache only reasonably small responses.
def checker(value: list[Post]) -> bool:
    return len(value) < 100


@aiocache.cached(
    alias="cache-alias",
    key_builder=some_key_builder,
    value_checker=checker,
    ttl=ttl,
)
async def get_posts(user: User) -> list[Post]:
    ...
```

```python
# 2. Skip caching when any post opts out of it.
def checker(value: list[Post]) -> bool:
    return not any(post.skip_cache for post in value)


@aiocache.cached(
    alias="cache-alias",
    key_builder=some_key_builder,
    value_checker=checker,
    ttl=ttl,
)
async def get_posts(user: User) -> list[Post]:
    ...
```

```python
# 3. Skip caching for specific users, based on the call arguments.
USERS_WITHOUT_CACHE = [1, 2, 3]


def checker(result: Any, args, kwargs) -> bool:
    return kwargs["user"].id not in USERS_WITHOUT_CACHE


@aiocache.cached(
    alias="cache-alias",
    key_builder=some_key_builder,
    value_checker=checker,
    ttl=ttl,
)
async def get_posts(*, user: User) -> list[Post]:
    ...
```

The last example can be replaced with:

```python
USERS_WITHOUT_CACHE = [1, 2, 3]


@aiocache.cached(
    alias="cache-alias",
    key_builder=some_key_builder,
    ttl=ttl,
)
async def _get_posts_cache(user: User) -> list[Post]:
    return await _get_posts(user)


async def _get_posts(user: User) -> list[Post]:
    ...


async def get_posts(user: User) -> list[Post]:
    if user.id in USERS_WITHOUT_CACHE:
        return await _get_posts(user)
    return await _get_posts_cache(user)
```

Much uglier than if aiocache provided the ability to control the caching condition.
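For illustration, the proposed `value_checker(result, args, kwargs)` hook from example 3 could be sketched with a toy decorator (hypothetical names and a plain dict cache, not aiocache's API):

```python
import asyncio
import functools

USERS_WITHOUT_CACHE = {1, 2, 3}


def cached_if(value_checker):
    """Hypothetical sketch: value_checker(result, args, kwargs) decides
    whether the result gets written to the cache at all."""
    def decorator(func):
        cache = {}

        @functools.wraps(func)
        async def wrapper(*args, **kwargs):
            key = (args, tuple(sorted(kwargs.items())))
            if key in cache:
                return cache[key]
            result = await func(*args, **kwargs)
            if value_checker(result, args, kwargs):
                cache[key] = result
            return result

        return wrapper
    return decorator


call_count = 0


@cached_if(lambda result, args, kwargs: kwargs["user_id"] not in USERS_WITHOUT_CACHE)
async def get_posts(*, user_id):
    global call_count
    call_count += 1
    return [f"post-for-{user_id}"]


async def demo():
    await get_posts(user_id=1)  # user in skip list: computed every time
    await get_posts(user_id=1)
    await get_posts(user_id=9)  # cacheable user: computed once
    await get_posts(user_id=9)
    return call_count


print(asyncio.run(demo()))  # → 3
```

Because the checker also receives the call arguments, one hook covers both result-based conditions (examples 1 and 2) and argument-based ones (example 3) without the split-function workaround above.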
Hmm, I'd probably expect 1 to use paging rather than return such large results (though if you're using a memory cache you might only want to cache small responses, so maybe it does work). And 3 could easily be done with an explicit cache bypass (e.g. …). 2 seems more reasonable, though it's not clear exactly why you'd skip the cache; I can perhaps imagine something to avoid sensitive data being stored in the cache, or specific privileged accounts that always need up-to-date results... though I'm still struggling to think of an example where I wouldn't know this information before making the call. So, I think the callable solution is probably fine. Anybody want to work on making a PR to add it?
I'd like to try. Could you please describe your git flow?
Just opening a PR against master is fine. Once a PR is open, it's best to avoid rebases (they make it harder to review); the PR will be squash-merged anyway.
@VitalyPetrov, if you are looking for something that already supports that kind of functionality in a more advanced way, check here. Sorry for the advertisement, but I don't see why aiocache can't be as flexible as possible.
Original issue description:

I'm using the `aiocache.cached` decorator to wrap a certain processing function. The setup is as follows: … The problem is: can I pass any argument to the `cached` decorator in order to avoid writing to the cache for certain `processing(...)` return values? It seems the only way to do this is to throw an exception inside the `processing(...)` function if the result is not an appropriate one. I'm looking for a more generic way to do the same thing. For example, passing some `Callable` object to `cached.__call__()` that returns a bool indicating whether to cache the result. Something similar to: …