Redis scenario problems

Session sharing in cluster

Session sharing: multiple Tomcat servers do not share session storage, so session data is lost whenever a request is routed to a different Tomcat instance.
The alternative to session should meet the following requirements:
• data sharing
• memory storage
• key and value structures

So we implement a shared login session based on Redis.

Flow chart


  • The token here replaces the session. Tomcat manages the session's lifetime for us, so for the token we must handle that ourselves:
  1. The token should expire 30 minutes after the last request, so its TTL must be reset to 30 minutes on every request.
  2. Configure an interceptor that intercepts all user requests, refreshes the token's TTL to 30 minutes, and then releases the request.
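The token logic above can be sketched without any framework. In this minimal illustration a ConcurrentHashMap of expiry timestamps stands in for Redis, and the class and method names are purely illustrative:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.TimeUnit;

// Sketch of the interceptor's token handling: reject expired tokens,
// otherwise reset the TTL back to 30 minutes on every request.
class TokenStore {
    private static final long TTL_MILLIS = TimeUnit.MINUTES.toMillis(30);
    private final ConcurrentHashMap<String, Long> expiryByToken = new ConcurrentHashMap<>();

    // Called at login: store the token with a 30-minute TTL.
    public void save(String token) {
        expiryByToken.put(token, System.currentTimeMillis() + TTL_MILLIS);
    }

    // Called on every request: unknown or expired tokens are intercepted;
    // valid tokens get their TTL refreshed and the request is released.
    public boolean checkAndRefresh(String token) {
        Long expiry = expiryByToken.get(token);
        if (expiry == null || expiry < System.currentTimeMillis()) {
            expiryByToken.remove(token);
            return false; // intercept: not logged in
        }
        expiryByToken.put(token, System.currentTimeMillis() + TTL_MILLIS);
        return true; // release
    }
}
```

In the real application the same check-and-refresh step lives in a request interceptor and writes to Redis via StringRedisTemplate instead of a map.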

Cache

What is caching

A cache (pronounced [kæʃ]) is a buffer for data exchange: a temporary place to store data, generally with high read and write performance.

Role of cache:

    • Reduce back-end load
    • Improve reading and writing efficiency and reduce response time

Cost of caching:

    • Data consistency cost
    • Code maintenance cost
    • Operations and maintenance (O&M) cost

Add cache

@Override
public Result queryShopList() {
    String listShop = "shop:list";
    // Query the cache
    List&lt;Object&gt; values = stringRedisTemplate.opsForHash().values(listShop);
    // Cache hit
    if (!values.isEmpty()) {
        List&lt;ShopType&gt; shopTypes = new ArrayList&lt;&gt;();
        for (Object str : values) {
            shopTypes.add(JSONUtil.toBean((String) str, ShopType.class));
        }
        return Result.ok(shopTypes);
    }
    // Cache miss: query the database
    List&lt;ShopType&gt; typeList = query().orderByAsc("sort").list();
    // No such records in the database (an empty object could be cached here
    // to prevent cache penetration, which is covered later)
    if (typeList == null || typeList.isEmpty()) {
        return Result.fail("No shop");
    }
    // Write the queried data to the cache
    for (ShopType temp : typeList) {
        stringRedisTemplate.opsForHash().put(listShop, temp.getId().toString(), JSONUtil.toJsonStr(temp));
    }
    return Result.ok(typeList);
}

Cache update policy

  • Business scenarios:
    • Low consistency requirements: rely on Redis's memory eviction mechanism. Example: the shop-type query cache
    • High consistency requirements: actively update the cache, with TTL-based expiration as a fallback. Example: the shop-detail query cache
When updating the database, deleting the cache entry is generally better than updating it: updating the cache on every write produces many useless writes, while deletion defers the work until the next read.


To invalidate by deletion, should we delete the cache first or update the database first?

Scenario 1: delete the cache, then update the database

  • If another request arrives after the cache is deleted but before the database is updated, it misses the cache, reads the old value from the database, and writes it back to the cache. The stale value is now cached again and is returned on every subsequent read.

Scenario 2: update the database first, then delete the cache

  • A reading thread misses the cache and reads the old value from the database; before it writes that value back, a writing thread updates the database and deletes the cache; the reader then overwrites the cache with the stale value.
    However, this interleaving is very unlikely in practice, because the read-and-cache-write step is much faster than a database update.
Best practices for cache update policy:
  1. Low consistency requirements: use Redis's own memory eviction mechanism
  2. High consistency requirements: actively update the cache, with TTL-based expiration as a fallback
    • Read path:
      • On a cache hit, return the cached value directly
      • On a cache miss, query the database, write the result to the cache, and set a TTL
    • Write path:
      • Update the database first, then delete the cache
      • Ensure the database and cache operations happen atomically (e.g. within one transaction)
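The read and write paths above can be sketched without any framework. In this illustration two ConcurrentHashMaps stand in for the database and Redis, and all names are illustrative; in the real service the two write steps run inside one transaction:

```java
import java.util.concurrent.ConcurrentHashMap;

// Cache-aside sketch: reads populate the cache on a miss; writes update the
// database first and then delete the cached copy.
class ShopWriter {
    final ConcurrentHashMap<Long, String> db = new ConcurrentHashMap<>();
    final ConcurrentHashMap<Long, String> cache = new ConcurrentHashMap<>();

    // Read path: a hit returns directly; a miss loads from the db and caches it.
    public String query(Long id) {
        return cache.computeIfAbsent(id, db::get);
    }

    // Write path: 1. write the database  2. delete the cache entry,
    // so the next read repopulates it with fresh data.
    public void update(Long id, String shopJson) {
        db.put(id, shopJson);
        cache.remove(id);
    }
}
```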

Cache penetration

Cache penetration means the data requested by the client exists neither in the cache nor in the database. The cache can never take effect for such requests, so every one of them hits the database.

There are two common solutions:

  • Cache empty objects
    • Advantages: simple implementation and convenient maintenance
    • Disadvantages:
      • Additional memory consumption
      • May cause short-term inconsistency
  • Bloom filter
    • Advantages: low memory footprint, no extra keys stored
    • Disadvantages:
      • More complex to implement
      • False positives are possible


Flow chart for resolving cache penetration

public Shop queryWithPassThrough(Long id) {
    String shopKey = CACHE_SHOP_KEY + id;
    String info = stringRedisTemplate.opsForValue().get(shopKey);
    if (StrUtil.isNotBlank(info)) {
        // Cache hit with real data
        return JSONUtil.toBean(info, Shop.class);
    }
    // Blank but non-null means a cached empty string: this id was previously
    // confirmed absent, so return without hitting the database
    if (info != null) {
        return null;
    }
    // Cache miss: query the database
    Shop shop = getById(id);
    if (shop == null) {
        // Cache an empty value to prevent cache penetration
        stringRedisTemplate.opsForValue().set(shopKey, "", CACHE_NULL_TTL, TimeUnit.MINUTES);
        return null;
    }
    // Write the result to the cache with a TTL
    stringRedisTemplate.opsForValue().set(shopKey, JSONUtil.toJsonStr(shop), CACHE_SHOP_TTL, TimeUnit.MINUTES);
    return shop;
}

What causes cache penetration?

  • The data requested by the user exists neither in the cache nor in the database; repeatedly issuing such requests puts great pressure on the database

What are the solutions for cache penetration?

  • Cache null values
  • Bloom filter
  • Make ids harder to guess (e.g. avoid sequential id schemes)
  • Validate the basic format of request parameters
  • Strengthen user permission checks
  • Apply rate limiting to hot-spot parameters
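To make the Bloom filter option concrete, here is a minimal sketch (double hashing over a single BitSet; the sizes and class name are illustrative). A "not present" answer is definite, so such requests can be rejected before reaching the database; a "present" answer may be a false positive:

```java
import java.util.BitSet;

// Minimal Bloom filter: k probe positions per key over one bit array.
// No false negatives; false positives possible.
class BloomFilter {
    private final BitSet bits;
    private final int size;
    private final int hashes;

    public BloomFilter(int size, int hashes) {
        this.bits = new BitSet(size);
        this.size = size;
        this.hashes = hashes;
    }

    // Derive the i-th probe index from two base hashes (double hashing).
    private int probe(String key, int i) {
        int h1 = key.hashCode();
        int h2 = (h1 >>> 16) | 1; // force odd so probes differ
        return Math.floorMod(h1 + i * h2, size);
    }

    public void add(String key) {
        for (int i = 0; i < hashes; i++) bits.set(probe(key, i));
    }

    public boolean mightContain(String key) {
        for (int i = 0; i < hashes; i++)
            if (!bits.get(probe(key, i))) return false;
        return true;
    }
}
```

In production one would normally use a library implementation (e.g. the Bloom filter in Redisson or Guava) rather than hand-rolling the hashing.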

Cache avalanche

Cache avalanche means that a large number of cache keys expire at the same moment, or the Redis service goes down, so a flood of requests reaches the database and puts it under great pressure.

Solution:
  • Add random jitter to the TTLs of different keys
  • Using Redis cluster to improve service availability
  • Add a degraded flow restriction policy to the cache service
  • Add multi-level cache to business
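The first countermeasure (randomized TTLs) can be sketched as a tiny helper; the class and method names are illustrative:

```java
import java.util.concurrent.ThreadLocalRandom;

// Spread out expirations: base TTL plus up to `jitterMinutes` extra minutes,
// chosen at random per key, so keys written together don't all expire together.
class CacheTtl {
    public static long withJitter(long baseMinutes, long jitterMinutes) {
        return baseMinutes + ThreadLocalRandom.current().nextLong(jitterMinutes + 1);
    }
}
```

It would then be used where the TTL is set, e.g. `stringRedisTemplate.opsForValue().set(key, json, CacheTtl.withJitter(30, 10), TimeUnit.MINUTES);`.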

Cache breakdown

The cache breakdown problem, also called the hot-key problem, occurs when a key that is accessed with high concurrency and is expensive to rebuild suddenly expires: countless requests hit the database at once and can overwhelm it in an instant.

There are two common solutions:

  • Mutex lock
  • Logical expiration

Mutex:

Logical Expiration:

Advantages and disadvantages:


Mutexes: the mutex here differs from a mutex in Java. With a Java lock, a thread that fails to acquire it blocks and waits; here, a thread that fails to acquire the lock should continue executing and return the old data instead. In Redis the lock can be built on setnx key val (the key is written only if it does not already exist). The corresponding StringRedisTemplate call is:

Boolean flag = stringRedisTemplate.opsForValue()
        .setIfAbsent(key, "1", 10, TimeUnit.SECONDS);
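The whole pattern can be sketched without Redis: ConcurrentHashMap.putIfAbsent plays the role of setIfAbsent/SETNX. This is a minimal illustration with invented names, not the production code:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Mutex-based cache rebuild: the thread that wins the "SETNX" rebuilds the
// cache; losers do NOT block — they return whatever is cached (possibly stale),
// matching the "continue and return old data" behaviour described above.
class MutexCache {
    private final Map<String, String> cache = new ConcurrentHashMap<>();
    private final Map<String, Boolean> locks = new ConcurrentHashMap<>();

    public String get(String key, Function<String, String> loadFromDb) {
        String cached = cache.get(key);
        if (cached != null) return cached;                  // cache hit
        if (locks.putIfAbsent(key, Boolean.TRUE) == null) { // "SETNX" won
            try {
                String fresh = loadFromDb.apply(key);       // rebuild from DB
                cache.put(key, fresh);
                return fresh;
            } finally {
                locks.remove(key);                          // release the lock
            }
        }
        // Lock not obtained: don't wait; fall back to the cached value (may be null).
        return cache.get(key);
    }
}
```

With Redis the lock would also carry a TTL (as in the setIfAbsent call above) so a crashed holder cannot leave the key locked forever.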

Logical expiration: an expiration time is stored alongside the information in Redis. On every query, first check whether that stored time has passed; if it has, rebuild the cache (typically in a separate thread while returning the old data). This solves the cache breakdown problem.
The original Shop class has no expiration-time attribute. How do we add one? There are two ways:

  1. Create a class with an expiration-time attribute and let the Shop class inherit from it. This is simple and later reads need no changes, but it modifies the target class (Shop).
  2. Create a wrapper class with an expiration-time attribute and a data (Object) attribute that holds the target instance. This leaves the target class untouched, but reads from the cache must unwrap the data, for example:
String shopJson = stringRedisTemplate.opsForValue().get(key);
// ...
RedisData redisData = JSONUtil.toBean(shopJson, RedisData.class);
// getData() deserializes as a JSONObject, so convert it to the target class
JSONObject data = (JSONObject) redisData.getData();
Shop shop = JSONUtil.toBean(data, Shop.class);
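The wrapper class from option 2 can be sketched as follows. The name RedisData matches the snippet above; the isExpired helper is an illustrative addition:

```java
import java.time.LocalDateTime;

// Wrapper that carries an explicit logical expiration time alongside the
// cached object, leaving the target class (e.g. Shop) untouched.
class RedisData {
    private LocalDateTime expireTime;
    private Object data;

    public LocalDateTime getExpireTime() { return expireTime; }
    public void setExpireTime(LocalDateTime expireTime) { this.expireTime = expireTime; }
    public Object getData() { return data; }
    public void setData(Object data) { this.data = data; }

    // Logical-expiration check: the Redis key itself never expires; we only
    // compare this timestamp and trigger an async rebuild once it has passed.
    public boolean isExpired() {
        return expireTime.isBefore(LocalDateTime.now());
    }
}
```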

Tags: Java Redis Cache

Posted by Gordicron on Thu, 02 Jun 2022 23:06:36 +0530