The difference between buffer and cache

A piece of information taken from Wikipedia to clarify the difference between a buffer and a cache:

The terms “buffer” and “cache” are not mutually exclusive and the functions are frequently combined; however, there is a difference in intent.
A buffer is a temporary memory location that is traditionally used because CPU instructions cannot directly address data stored in peripheral devices. Thus, addressable memory is used as an intermediate stage. Additionally, such a buffer may be feasible when a large block of data is assembled or disassembled (as required by a storage device), or when data may be delivered in a different order than that in which it is produced. Also, a whole buffer of data is usually transferred sequentially (for example to hard disk), so buffering itself sometimes increases transfer performance or reduces the variation or jitter of the transfer’s latency as opposed to caching where the intent is to reduce the latency. These benefits are present even if the buffered data are written to the buffer once and read from the buffer once.
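The throughput benefit described above can be sketched in code. The following is a minimal illustration (not from the original article, and `WriteBuffer` is a hypothetical name): small writes are collected in memory and flushed to the slow "device" as large sequential blocks, so the device sees a few big transfers instead of many tiny ones.

```python
# Sketch of a write buffer: assumes a list stands in for a slow block device.
class WriteBuffer:
    def __init__(self, device_writes, block_size=4096):
        self.device_writes = device_writes  # list simulating the device
        self.block_size = block_size
        self.pending = bytearray()

    def write(self, data: bytes):
        """Accumulate small writes; flush whenever a full block is ready."""
        self.pending += data
        while len(self.pending) >= self.block_size:
            block = bytes(self.pending[:self.block_size])
            del self.pending[:self.block_size]
            self.device_writes.append(block)  # one large sequential transfer

    def flush(self):
        """Push any remaining partial block to the device."""
        if self.pending:
            self.device_writes.append(bytes(self.pending))
            self.pending.clear()

writes = []
buf = WriteBuffer(writes, block_size=8)
for _ in range(4):
    buf.write(b"ab")   # eight 2-byte writes from the producer...
    buf.write(b"cd")
buf.flush()
print(len(writes))     # ...reach the "device" as only 2 block-sized transfers
```

Note that each byte is written into the buffer once and read out once, yet the device still benefits, which matches the point above that buffering helps even without repeated access.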
A cache also increases transfer performance. A part of the increase similarly comes from the possibility that multiple small transfers will combine into one large block. But the main performance-gain occurs because there is a good chance that the same datum will be read from cache multiple times, or that written data will soon be read. A cache’s sole purpose is to reduce accesses to the underlying slower storage. Cache is also usually an abstraction layer that is designed to be invisible from the perspective of neighboring layers.
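By contrast, a cache pays off when the same datum is read repeatedly. Here is a minimal read-through cache sketch (again not from the article; `ReadCache` and `slow_read` are illustrative names): only the first access to a key touches the slow backing store, and later reads are served from memory.

```python
# Sketch of a read-through cache in front of a slow backing store.
class ReadCache:
    def __init__(self, backing_read):
        self.backing_read = backing_read  # function simulating slow storage
        self.store = {}
        self.misses = 0

    def read(self, key):
        if key not in self.store:               # miss: fetch from slow layer
            self.store[key] = self.backing_read(key)
            self.misses += 1
        return self.store[key]                  # hit: served from memory

def slow_read(key):
    return key.upper()  # stands in for a disk or network read

cache = ReadCache(slow_read)
results = [cache.read("block0") for _ in range(5)]
print(cache.misses)  # only the first of the five reads hit the backing store
```

Because the cache returns exactly what the backing store would, callers cannot tell it is there, which illustrates the "invisible abstraction layer" point above.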

To put it simply: a buffer is used to smooth out write behavior, such as writes from the CPU to disk, while a cache is used to speed up read behavior. (In short: buffers buffer writes; caches accelerate reads.)

For details, see: http://en.wikipedia.org/wiki/Cache#The_difference_between_buffer_and_cache