What is a Cache

A cache is a type of high-speed memory used to supplement the work of the central processing unit and the physical disk storage. The cache acts as a buffer when the CPU tries to access data from the disk, so that data traveling between the CPU and the physical disk can move at a synchronized speed. Reading from and writing to disk is generally much slower than CPU operation.

In computer science theory, a cache is any collection of data that duplicates original values stored elsewhere in the computer. The original data may be expensive to fetch because of the disparity in access times between components, so a cache acts as temporary storage where the most frequently accessed data are kept for fast processing. On later accesses, the CPU can simply read the duplicated copy instead of retrieving it from the slower physical disk storage, where performance would suffer.
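The idea of keeping a duplicated copy close at hand can be sketched in a few lines of Python. This is a minimal illustration, not a real system: `expensive_fetch` is a hypothetical stand-in for a slow original source such as a disk read, and the cache is just a dictionary.

```python
import time

def expensive_fetch(key):
    # Hypothetical stand-in for a slow operation, e.g. a disk read.
    time.sleep(0.01)
    return key * 2

cache = {}

def cached_fetch(key):
    # Return the duplicated copy if we already have it; otherwise
    # fetch the original value and remember it for next time.
    if key not in cache:
        cache[key] = expensive_fetch(key)
    return cache[key]

cached_fetch(21)   # first call takes the slow path and stores the value
cached_fetch(21)   # second call is served from the cache
```

Every later call with the same key skips the slow fetch entirely, which is exactly the saving a hardware or disk cache provides.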

A cache can either be a reserved section of the computer's memory or a separate storage device with very high speed. In personal computers, there are two common types of caching: memory caching and disk caching.

A memory cache is sometimes known as a RAM cache or cache store. This is a portion of random access memory (RAM) made of high-speed static RAM (SRAM), which is faster than dynamic RAM (DRAM). While a computer is executing, most programs access the same data or instructions repeatedly, so keeping these data or instructions in the memory cache improves performance.

Other memory caches are built directly into the body of the microprocessor. For instance, the old Intel 80486 microprocessor had 8K of memory cache while the Pentium had 16K. Caches of this kind are also called Level 1 (L1) caches. Modern computers also come with external cache memory called the Level 2 (L2) cache, which is situated between the CPU and the DRAM.

On the other hand, disk caching works much like memory caching, except that the disk cache uses conventional main memory rather than high-speed SRAM. Frequently accessed data from the disk storage device are kept in a memory buffer, and a program first checks the disk cache before reading from the hard disk. This method significantly increases performance because access speeds in RAM can be thousands of times faster than access speeds on hard disks.
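Because the memory buffer is much smaller than the disk, a disk cache must also decide what to throw away when it fills up. A common policy is to evict the least recently used block. The sketch below assumes a toy in-memory `DISK` dictionary in place of real disk blocks and uses a small, bounded buffer:

```python
from collections import OrderedDict

# Hypothetical stand-in for slow disk storage: block number -> contents.
DISK = {n: f"block-{n}" for n in range(100)}

class DiskCache:
    """Tiny read cache: keeps the most recently used blocks in RAM."""

    def __init__(self, capacity=4):
        self.capacity = capacity
        self.buffer = OrderedDict()   # insertion order tracks recency

    def read(self, block):
        if block in self.buffer:                 # found in RAM: fast path
            self.buffer.move_to_end(block)       # mark as recently used
            return self.buffer[block]
        data = DISK[block]                       # miss: slow "disk" read
        self.buffer[block] = data
        if len(self.buffer) > self.capacity:     # buffer full: evict the
            self.buffer.popitem(last=False)      # least recently used block
        return data
```

A real operating system's disk cache is far more elaborate, but the shape is the same: check the fast buffer first, fall back to the disk, and evict old entries to make room.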

A cache hit is the term used when requested data is found in the cache, and a cache's effectiveness is determined by its hit rate. Many cache systems also use a technique known as smart caching, which recognizes certain types of frequently used data and caches them automatically.
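Hits, misses, and the hit rate are easy to observe in practice. Python's standard-library `functools.lru_cache` decorator counts them for us; here `lookup` is a hypothetical expensive function, and the hit rate is simply hits divided by total accesses:

```python
from functools import lru_cache

@lru_cache(maxsize=32)
def lookup(n):
    # Hypothetical expensive computation being cached.
    return n * n

# Access pattern: 1 and 2 are requested again after their first use.
for n in [1, 2, 1, 3, 1, 2]:
    lookup(n)

info = lookup.cache_info()                       # counts hits and misses
hit_rate = info.hits / (info.hits + info.misses)
```

Of the six accesses, three find their answer already in the cache, so the hit rate is 0.5. The higher this ratio, the more effective the cache.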

Another kind of cache is the DNS cache kept by daemons such as BIND, which map domain names to IP addresses. Caching these lookups allows numeric IP addresses to be matched with their corresponding domain names much faster.
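DNS answers are only valid for a limited time, so a DNS cache pairs each stored answer with an expiry. The sketch below is not how BIND is implemented; it only illustrates the pattern, with `resolve_uncached` as a hypothetical stand-in for a real DNS query and a placeholder address for `example.com`:

```python
import time

def resolve_uncached(name):
    # Hypothetical stand-in for a real DNS query; the address is a placeholder.
    return {"example.com": "93.184.216.34"}.get(name)

dns_cache = {}   # domain name -> (address, expiry timestamp)
TTL = 300        # seconds a cached answer stays valid

def resolve(name):
    entry = dns_cache.get(name)
    if entry and entry[1] > time.time():        # still fresh: answer from cache
        return entry[0]
    address = resolve_uncached(name)            # expired or unknown: query again
    dns_cache[name] = (address, time.time() + TTL)
    return address
```

Repeated lookups of the same name within the TTL are answered from memory instead of going back to the network.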

Web browsers also employ a caching system for recently viewed web pages. With this caching system, a user does not have to wait for data from remote servers because the latest pages are already in the computer's web cache. Many internet service providers use proxy caches for their clients to save bandwidth on their networks.

Some search engines keep indexed pages in their cache, so when links to these pages appear in search results and the actual website is temporarily offline or inaccessible, the search engine can serve the cached pages to the user.

Editorial Team at Geekinterview is a team of HR and Career Advice members led by Chandra Vennapoosa.

