Pandey A.K.; Dutta M.; Thomas A. (2022). Decentralized Coded Caching for Shared Caches using Erasure Coding. Conference paper.
DOI: http://dx.doi.org/10.1109/ITW54588.2022.9965862
Handle: https://idr.iitbbs.ac.in/handle/2008/4234
Record date: 2025-02-17
Language: en
Keywords: Coded caching; decentralized caching; erasure precoding; shared caching

Abstract: Caching has emerged as a potential way to reduce the latency of content delivery and decrease network traffic during peak hours. In this paper, decentralized caching is considered, where caches are filled with random contents of the files. The shared caching problem is considered, in which more than one user can access a cache. A precoding technique using erasure codes is applied to the files before caching. It is shown that this precoding improves the delivery rate compared with the rate when no erasure precoding is employed. Moreover, it is established that the rate of the proposed decentralized scheme matches that of the optimal centralized scheme for certain cache sizes; hence, for those specific cache sizes, the proposed scheme is optimal. © 2022 IEEE.
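The decentralized placement described in the abstract (caches filled with random contents of the files, with more than one user able to access each cache) can be sketched as follows. This is a minimal illustration of random uncoordinated placement only, not the paper's erasure-precoding scheme; all function names and parameters here are hypothetical.

```python
import random

def decentralized_placement(num_files, bits_per_file, num_caches, mem_fraction, rng):
    """Decentralized placement: each cache independently stores each bit of
    each file with probability mem_fraction (= M/N, cache memory over library size),
    with no coordination between caches."""
    caches = []
    for _ in range(num_caches):
        stored = {
            f: {b for b in range(bits_per_file) if rng.random() < mem_fraction}
            for f in range(num_files)
        }
        caches.append(stored)
    return caches

def missing_bits(cache, demanded_file, bits_per_file):
    """Bits of the demanded file that the server must still deliver to the
    users sharing this cache (the uncached remainder)."""
    return set(range(bits_per_file)) - cache[demanded_file]

rng = random.Random(0)
N, F, C, frac = 4, 1000, 3, 0.5  # files, bits per file, shared caches, M/N
caches = decentralized_placement(N, F, C, frac, rng)

# All users attached to cache 0 demand file 2: only its uncached bits
# need to be sent over the shared link.
need = missing_bits(caches[0], 2, F)
print(f"cache 0 already holds {F - len(need)} of {F} bits of file 2")
```

On average each cache holds a fraction M/N of every file, so roughly a (1 - M/N) fraction remains to be delivered per demanded file; the coded-delivery and erasure-precoding gains studied in the paper come on top of this baseline.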