Cache (pronounced "cash") memory is extremely fast memory that is built into a computer's central processing unit (CPU), or located next to it on a separate chip. The CPU uses cache memory to store instructions and data that are repeatedly required to run programs, improving overall system speed.
The advantage of cache memory is that the CPU does not have to use the motherboard's system bus for data transfer. Whenever data must pass over the system bus, the transfer slows to the speed the motherboard can support.
The CPU can process data much faster by avoiding the bottleneck created by the system bus. When the processor needs to read from or write to a location in main memory, it first checks whether a copy of that data is in the cache. If so, the processor immediately reads from or
writes to the cache, which is much faster than reading from or
writing to main
memory.
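The check-the-cache-first behavior described above can be sketched in a few lines. This is a simplified software analogy, not a real CPU's hardware logic; the names (`MAIN_MEMORY`, `read`, `write`) are illustrative.

```python
# Stand-in for slow main memory: a table of addresses and values.
MAIN_MEMORY = {addr: addr * 2 for addr in range(1024)}
cache = {}  # address -> value; the fast copy

def read(addr):
    """Return the value at addr, filling the cache on a miss."""
    if addr in cache:              # cache hit: fast path
        return cache[addr]
    value = MAIN_MEMORY[addr]      # cache miss: slow trip over the system bus
    cache[addr] = value            # keep a copy for next time
    return value

def write(addr, value):
    """Write-through policy: update the cache and main memory together."""
    cache[addr] = value
    MAIN_MEMORY[addr] = value
```

The `write` function here models a write-through policy; real CPUs may instead use write-back, where main memory is updated later.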
Disadvantage of the cache: if the cache is full, or the data or instructions a program needs do not follow the principle of locality (that is, they are not near recently accessed locations), access time increases. The data must first be copied from main memory into the cache and only then processed, so a cache miss can take longer than reading main memory directly would have. Furthermore, because cache is an expensive type of memory, it is very small, which forces data and applications to be moved in and out of it frequently.
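Because the cache is small, inserting new data forces older entries out. A minimal sketch of one common replacement policy, least recently used (LRU), follows; the class and its names are illustrative, not a description of any real CPU's eviction hardware.

```python
from collections import OrderedDict

class LRUCache:
    """Tiny cache that evicts the least recently used entry when full."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()  # address -> value, oldest first

    def get(self, addr):
        if addr not in self.entries:
            return None                 # miss: caller must go to main memory
        self.entries.move_to_end(addr)  # mark as recently used
        return self.entries[addr]

    def put(self, addr, value):
        if addr in self.entries:
            self.entries.move_to_end(addr)
        self.entries[addr] = value
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # evict least recently used

cache = LRUCache(capacity=2)
cache.put(0, "a")
cache.put(1, "b")
cache.put(2, "c")  # capacity exceeded: address 0 is evicted
```

After the three `put` calls, a lookup of address 0 misses while addresses 1 and 2 still hit, illustrating the relocation cost the text describes.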
Despite this, caching allows commonly accessed data to be stored in full and referenced faster than re-fetching or recomputing it each time.
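The same "store once, reuse many times" idea appears at the software level as memoization. As a sketch, Python's standard `functools.lru_cache` turns repeated calls with the same argument into fast lookups instead of recomputations:

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def fib(n):
    """Naive Fibonacci, made fast by caching previously computed results."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))  # each subproblem is computed once, then served from the cache
```

Without the cache, this naive recursion recomputes the same subproblems exponentially many times; with it, each `fib(k)` is computed once.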