Adding a simple cache on Python: My Wholesome Guide to Learning Python (Part 5)

Nepal Brothers
Oct 2, 2021

Alright. You may have heard about caching for a long time, and perhaps you have even implemented one already. If you already know the concept, let's refresh it; for everyone else, here is a quick intro.

Why do we need a cache?

Imagine we are making an API call to a server that returns a list of posts.

We are using the requests library to make the call, which simply sends a GET request for the posts. Let's set a timer and see how long this request takes.
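A minimal sketch of such a timed request (the jsonplaceholder URL and the make_call/timed_call names are placeholder assumptions, not necessarily the exact code from this series):

```python
import time

import requests

# Placeholder endpoint; substitute the API you are actually calling.
POSTS_URL = "https://jsonplaceholder.typicode.com/posts"

def make_call(post_id):
    # GET a single post from the server and decode the JSON body.
    return requests.get(f"{POSTS_URL}/{post_id}").json()

def timed_call(post_id):
    # Wrap the request with a timer and print the elapsed seconds.
    start = time.time()
    result = make_call(post_id)
    print("TimeTaken", time.time() - start)
    return result
```

Calling `timed_call(1)` makes the request and prints a TimeTaken line like the ones shown below.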

How do I read the above code?

Let me write that code in a different way, which might be easier to understand.

So, make_call_1, make_call_2, and make_call_3 do not take any parameters; they are simply zero-argument functions. We then pass each of these functions to the calculate_time function.
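Sketched out, that version could look like this. To keep the snippet runnable offline, make_call here simulates the network request with a short sleep; the real version would use requests.get as above:

```python
import time

def make_call(post_id):
    # Stand-in for the real HTTP request: simulate ~60 ms of network latency.
    time.sleep(0.06)
    return {"id": post_id}

def make_call_1():
    # Zero-argument wrapper that always fetches post 1.
    return make_call(1)

def make_call_2():
    return make_call(2)

def make_call_3():
    return make_call(3)

def calculate_time(func):
    # Measure how long the zero-argument function func takes to complete.
    start = time.time()
    result = func()
    print("TimeTaken", time.time() - start)
    return result

calculate_time(make_call_1)
calculate_time(make_call_2)
calculate_time(make_call_3)
```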

It is the same concept there too. These wrapper functions are repetitive, and we can shorten the code by creating a function on the fly. That is where lambda comes in.

So, calculate_time is a function that measures how long it takes for the function func to complete, and the lambda creates an anonymous function that simply executes make_call(1).
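With a lambda, the three named wrappers collapse into one-liners (same simulated make_call as before, an assumption made so the sketch runs without a network):

```python
import time

def make_call(post_id):
    # Stand-in for the real HTTP request.
    time.sleep(0.06)
    return {"id": post_id}

def calculate_time(func):
    # Measure how long the zero-argument function func takes to complete.
    start = time.time()
    result = func()
    print("TimeTaken", time.time() - start)
    return result

# lambda: make_call(1) builds a zero-argument function on the fly,
# so no named make_call_1 wrapper is needed.
calculate_time(lambda: make_call(1))
calculate_time(lambda: make_call(2))
calculate_time(lambda: make_call(3))
```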

I will write a separate blog post about the different ways functions can be created in Python and link it here soon for reference.

Here are the results of the three calls above:

TimeTaken 0.05901312828063965
TimeTaken 0.09686493873596191
TimeTaken 0.07380008697509766

These results are in seconds. Now, let's make each of the three calls 5 times.

TimeTaken 0.06667900085449219
TimeTaken 0.06765198707580566
TimeTaken 0.056787967681884766
TimeTaken 0.06363892555236816
TimeTaken 0.06034588813781738
TimeTaken 0.06716513633728027
TimeTaken 0.06261396408081055
TimeTaken 0.056501150131225586
TimeTaken 0.05838608741760254
TimeTaken 0.08418893814086914
TimeTaken 0.06550979614257812
TimeTaken 0.06678986549377441
TimeTaken 0.07350707054138184
TimeTaken 0.08434700965881348
TimeTaken 0.0632479190826416

Let's add a cache now

It is as simple as decorating the function with @lru_cache(maxsize=<any>) from functools, and you get a simple in-memory cache. For the same parameters you pass, you will always get the same result, served straight from memory instead of the network.
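A minimal sketch, again with a simulated slow call standing in for the real request; lru_cache comes from the standard functools module:

```python
import time
from functools import lru_cache

@lru_cache(maxsize=32)  # maxsize bounds how many distinct results are kept
def make_call(post_id):
    # Stand-in for the slow network request; only runs on a cache miss.
    time.sleep(0.06)
    return {"id": post_id}

def calculate_time(func):
    # Measure how long the zero-argument function func takes to complete.
    start = time.time()
    result = func()
    print("TimeTaken", time.time() - start)
    return result

calculate_time(lambda: make_call(1))  # first call: slow, does the real work
calculate_time(lambda: make_call(1))  # repeat call: near-instant, served from the cache
```

Note that the decorated function's arguments must be hashable, and the cache lives only in the current process: restart the program and it starts empty.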
