Hacker News
KV Cache Transform Coding for Compact Storage in LLM Inference (arxiv.org)
2 points by walterbell 8 hours ago | 0 comments