Data and caching go hand in hand for performance
The more applications we build, the greater the need for performance. One way to achieve performance is through caching. Caching is an important strategy to take advantage of in web applications because loading web pages costs time and resources. When we reduce that time and those resources, our applications become faster and more responsive. This, in turn, improves our users' experience.
User experience matters, and by making use of caching you can achieve cool things like improving your search result rankings.
In this post, we’ll go through powerful ways caching can improve the way you think about and write code.
When your site is indexed in Google search results, Googlebot takes a snapshot each time it crawls your website. When it does, it keeps a “backup” of your website. This is called the Google Cache.
For example, if your website has been hacked or taken down for some strange reason, Google compares what your current site looks like to what it looked like the last time it was crawled. If your website responds with a 404, Google can serve your website’s last “working” state from its cache, effectively saving your website from provoking users to bounce elsewhere.
We can learn from this by applying similar techniques, such as caching our app’s resources to provide an offline experience when the user is disconnected from the internet.
In the modern world, offline experiences for users are more important than ever.
For example, when airplane mode is on, we can provide users with an offline experience so they can still use our apps.
We can make use of caching for this. One way is to save the last used state of your DOM by registering a listener as shown below:
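A minimal sketch of that idea, assuming we persist the body’s markup to `localStorage` (the `saveDomSnapshot`/`restoreDomSnapshot` names and the `last-dom` key are illustrative):

```javascript
// Hypothetical sketch: persist the last rendered state of the page so it can
// be shown again on the next visit, even when the user is offline. The
// storage key and event names here are illustrative.

function saveDomSnapshot(storage, key, html) {
  storage.setItem(key, html);
}

function restoreDomSnapshot(storage, key) {
  return storage.getItem(key); // null when nothing has been cached yet
}

// Browser-only wiring; this is a no-op outside a browser environment.
if (typeof window !== 'undefined' && typeof localStorage !== 'undefined') {
  window.addEventListener('beforeunload', () => {
    saveDomSnapshot(localStorage, 'last-dom', document.body.innerHTML);
  });
  window.addEventListener('load', () => {
    const cached = restoreDomSnapshot(localStorage, 'last-dom');
    if (cached !== null && !navigator.onLine) {
      document.body.innerHTML = cached; // show the last known state offline
    }
  });
}
```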
This is a general example meant to get you started; a production solution would need to be more formal and secure.
We can take advantage of service workers (which live in a context separate from the DOM) to cache resources and intercept requests so that they receive faster response times. This is also safer because the work is performed on a separate thread.
There are many other caching strategies you can use from service workers to provide a more robust experience for your users.
If you’re not familiar with workers, here is a quick rundown of how they fit together: service workers and shared workers both run in a context separate from the page’s, so their code must be isolated in its own files.
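As a rough sketch, registration might look like this (the file paths `/sw.js` and `/shared-worker.js` and the helper `supportsServiceWorker` are assumptions for illustration):

```javascript
// main.js — a hypothetical registration sketch. The calls below are
// browser-only and guarded, so this file is a no-op elsewhere.

function supportsServiceWorker(nav) {
  // Feature-detect before attempting to register.
  return Boolean(nav) && 'serviceWorker' in nav;
}

if (typeof navigator !== 'undefined' && supportsServiceWorker(navigator)) {
  // Service worker: registered from the page, runs in its own file/context.
  navigator.serviceWorker.register('/sw.js');
}

if (typeof SharedWorker !== 'undefined') {
  // Shared worker: a single instance shared by all tabs on the same origin.
  const shared = new SharedWorker('/shared-worker.js');
  shared.port.postMessage('hello from the page');
}

// sw.js — the worker itself lives in a separate file. A minimal cache-first
// fetch handler could look like this (shown as comments because it must be
// in its own file):
//
//   self.addEventListener('install', (event) => {
//     event.waitUntil(
//       caches.open('v1').then((cache) => cache.addAll(['/', '/styles.css']))
//     );
//   });
//
//   self.addEventListener('fetch', (event) => {
//     event.respondWith(
//       caches.match(event.request).then((hit) => hit || fetch(event.request))
//     );
//   });
```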
You can learn more about workers here.
Some great examples of offline experiences are Spotify and Netflix. Most of the time we hardly notice when our phone has no signal while listening to Spotify, thanks to its buffering.
The power of caching shines well at “faking” things. The word “faking” has a very bad connotation when used in the real world, but in the context of the web it is a positive! The first goal (arguably) of web applications is to delight our users, and by faking good performance you are actually helping users feel that they are in good hands.
Here are some of the ways caching can help fake good performance for users:
Users should not be forced to wait for images to load in order to feel at home on our pages. By using placeholder images, we can give the user an instantly pleasant experience by substituting something lighter (such as a traced SVG) for the larger image while the real download finishes. We can either cache the actual image so that it loads immediately the next time the user visits, or we can cache the image’s dimensions, which enables us to instantly display something like a silhouette at the expected size while it loads.
If you’re not familiar with what a traced SVG is, this is how it might look, with the left being the fully loaded image and the right being the partial (traced-SVG) placeholder:
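A sketch of the dimension-cache idea, with hypothetical helpers (`rememberDimensions`, `placeholderStyle`, and the `data-cache-dims` attribute are all illustrative names):

```javascript
// Sketch of a "dimension cache": remember an image's natural size the first
// time it loads so the next visit can reserve correctly sized placeholder
// space before the real file arrives.

function rememberDimensions(dimensionCache, src, width, height) {
  dimensionCache.set(src, { width, height });
}

function placeholderStyle(dimensionCache, src) {
  const dims = dimensionCache.get(src);
  // Fall back to a generic fluid box when we have never seen this image.
  return dims
    ? { width: dims.width + 'px', height: dims.height + 'px' }
    : { width: '100%', height: 'auto' };
}

// Browser-only wiring: record dimensions whenever a marked image loads.
if (typeof document !== 'undefined') {
  const dimensionCache = new Map();
  document.querySelectorAll('img[data-cache-dims]').forEach((img) => {
    img.addEventListener('load', () => {
      rememberDimensions(
        dimensionCache, img.src, img.naturalWidth, img.naturalHeight
      );
    });
  });
}
```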
Lazy loading is a common term in web development. It is a useful strategy that requests data for the user’s client only when it is needed. Caching can help speed things up here if we cache resources that will be requested when the time comes. This is an important concept because much of the data in web applications is never actually requested by the user agent. Lazy loading helps you think about what should be cached so that it is only consumed when asked for. You have likely interacted with an element on a web page that uses this concept.
For example, any web page that offers some form of pagination is an implementation of this approach.
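A minimal sketch of lazy loading with a cache, assuming a hypothetical `fetchPage` function that returns one page of results: each page is requested only on demand, and repeat requests come straight from the cache.

```javascript
// Lazy loading + caching sketch: pages of results are fetched only when the
// user actually asks for them, and never fetched twice.
function createPageLoader(fetchPage) {
  const cache = new Map(); // pageNumber -> page of results
  return function loadPage(pageNumber) {
    if (cache.has(pageNumber)) {
      return cache.get(pageNumber); // cache hit: no request is made
    }
    const page = fetchPage(pageNumber); // lazy: only fetched on demand
    cache.set(pageNumber, page);
    return page;
  };
}
```

If `fetchPage` returns a promise, the promise itself gets cached, which also deduplicates in-flight requests for the same page.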
We previously talked about service workers, and it is worth noting that this is where they really shine in faking a great-performing app for your users.
At the company I work for, I was amazed at the positive feedback I received after implementing cached responses via service workers. Regular users (and by regular I mean people who aren’t developers) have no idea that their pages are simply being served from a previous session of their browser during the “instant” response times they get when service workers cache their visits.
Another important thing that can be learned from this is that we also skip extra requests when we provide cached responses (assuming the content won’t change much) which is an added advantage in saving bandwidth and resources.
We can make our functions process data faster over time by caching their results which can effectively become a performance boost in our applications.
As engineers, it is necessary to consider which parts of our data should be cached. It is also important to know the best caching strategy to use depending on the situation.
In particular, we must take into account:
- When to update or invalidate data in our cache. It is not uncommon in web development to mishandle caching when it comes to keeping a cache up to date.
- Which frequently accessed data to save so that we can retrieve it later, when time-consuming or costly operations are no longer necessary. Users matter, so we also need to consider refreshing the cache periodically so that they see the latest information.
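One simple way to handle the refresh concern above is a TTL (time to live). This is a sketch under that assumption; `createTtlCache` and the injectable `now` parameter are illustrative:

```javascript
// TTL-based invalidation sketch: cached entries are served until they
// expire, then fetched fresh again so users eventually see new data.
function createTtlCache(fetchFresh, ttlMs) {
  const cache = new Map(); // key -> { value, storedAt }
  return function get(key, now = Date.now()) {
    const entry = cache.get(key);
    if (entry && now - entry.storedAt < ttlMs) {
      return entry.value; // still fresh: no fetch needed
    }
    const value = fetchFresh(key); // expired or missing: refresh the cache
    cache.set(key, { value, storedAt: now });
    return value;
  };
}
```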
For example, when we have a function responsible for retrieving and returning data, we can use a strategy called memoization to return data from the cache when it has already been fetched before:
By memoizing this function, the expensive work is done only the first time; subsequent calls return the cached data immediately:
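A minimal memoization sketch (the `memoize` helper below is illustrative; it keys the cache on the stringified arguments, which assumes JSON-serializable arguments):

```javascript
// Memoization: cache a function's result by its arguments so repeated
// calls with the same input return instantly.
function memoize(fn) {
  const cache = new Map();
  return function (...args) {
    const key = JSON.stringify(args); // assumes JSON-serializable arguments
    if (cache.has(key)) return cache.get(key);
    const result = fn.apply(this, args);
    cache.set(key, result);
    return result;
  };
}

// Usage: the expensive body runs only on the first call per input.
let computations = 0;
const slowSquare = (n) => {
  computations += 1;
  return n * n;
};
const fastSquare = memoize(slowSquare);
fastSquare(4); // computes 16
fastSquare(4); // cached 16 — computations is still 1
```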
This strategy becomes known in part as the cache-aside pattern once an invalidation strategy, such as a TTL (time to live), is in place.
- Gatsby caches JSON objects that are picked up in later builds in order to make the build process faster
- SWR takes advantage of an HTTP cache invalidation strategy called stale-while-revalidate to provide fast responses along with the promise of fresh data
- The Chrome V8 engine caches compiled code so that subsequent loads are fast and efficient
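As a rough illustration of the stale-while-revalidate idea: serve what is already cached while storing a revalidated value for the next call. In a real implementation (like SWR’s) the revalidation happens asynchronously in the background; this sketch is synchronous to stay short, and `createSwrCache` is an illustrative name.

```javascript
// Stale-while-revalidate sketch: answer instantly with the stale value,
// and refresh the cache so the *next* call gets newer data.
function createSwrCache(fetchFresh) {
  const cache = new Map();
  return function get(key) {
    const stale = cache.get(key); // what we can serve immediately
    const fresh = fetchFresh(key); // revalidate (async/background in real life)
    cache.set(key, fresh);
    // Serve the stale value when we have one; fall back to fresh on a miss.
    return stale !== undefined ? stale : fresh;
  };
}
```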
And that ends this post! I hope you found it helpful, and look forward to more in the future!