Disclosed is a central cache that is updated without the overhead of locking. Updates are "atomic" in that they cannot be interrupted part way through. Applications are always free to read data in the cache, accessing the data through a reference table. Applications do not directly update the cache; instead, they send update requests to a service routine. To update the cache, the service routine proceeds in two phases. In the first phase, the service routine prepares the new data and adds them to the cache without updating the reference table. During this phase, an application accessing the cache cannot "see" the new data because the reference table has not yet been updated. After the first phase is complete, the service routine performs the second phase of the update process: atomically updating the reference table. The two-phase update process leaves the cache, at all times, in a consistent state.
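The two-phase update described above can be sketched in Python. This is an illustrative model, not the patented implementation: the class and method names are invented, and the "atomic" step is modeled as a single reference assignment, which readers observe either entirely before or entirely after the swap. A single service thread is assumed to perform all updates.

```python
class TwoPhaseCache:
    """Illustrative sketch of a lock-free, two-phase cache update.

    Readers look up data only through `self.refs`, the reference
    table. The service routine publishes updates by replacing that
    table with one atomic reference assignment, so readers always
    see either the old table or the new one, never a partial state.
    """

    def __init__(self):
        self._store = {}   # backing storage for cached data objects
        self.refs = {}     # reference table; readers use this only

    def lookup(self, key):
        # A reader snapshots the reference table with one atomic read;
        # the snapshot stays consistent even if an update lands later.
        table = self.refs
        return table.get(key)

    def update(self, key, value):
        # Phase 1: prepare the new data and add it to the cache.
        # Readers cannot "see" it yet, because the reference table
        # has not been touched.
        self._store[key] = value

        # Phase 2: build a fresh reference table, then publish it with
        # a single atomic reference assignment (the "atomic" update).
        new_refs = dict(self.refs)
        new_refs[key] = value
        self.refs = new_refs
```

In this sketch the cache is never locked for reading: an in-flight update only becomes visible at the instant the `self.refs` reference is replaced, which mirrors the consistency guarantee claimed for the two-phase process.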

 