
Posted by tosh 2 days ago

Caveman: Why use many token when few token do trick (github.com)
874 points | 358 comments
SamuelBraude 16 hours ago|
We give spearheads to caveman

Call it Ix

Help caveman save even more tokens

https://github.com/ix-infrastructure/Ix

samus 2 days ago||
There's a linguistic term for this kind of speech: isolating grammar. Isolating languages don't decline words; they rely on high context and the bare minimum of words to get the meaning across. Chinese is such a language, btw. Don't know what Chinese speakers think about their language being regarded as a caveman language...
adrian_b 1 day ago||
Whether a language is isolating or not is independent of how redundant the language is.

All languages must have means for marking the syntactic roles of the words in a sentence.

The roles may be marked with prepositions or postpositions in isolating languages, or with declensions in fusional languages, or there may be no explicit markers when the word order is fixed (i.e. the same distinction as between positional arguments and arguments marked by keywords, in programming languages). The most laconic method for both programming languages and natural languages is to have a default word order where role markers are omitted, but to also allow any other word order if role markers are present.
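The analogy to programming languages can be made concrete with a minimal Python sketch (the function name and the agent/patient roles are hypothetical, chosen only to mirror the example above):

```python
def hit(agent, patient):
    """The parameter slots play the part of syntactic roles."""
    return f"{agent} hits {patient}"

# Default "word order": positional arguments, no role markers needed.
print(hit("Og", "mammoth"))                # Og hits mammoth

# Any other order is allowed once explicit role markers (keywords) appear.
print(hit(patient="mammoth", agent="Og"))  # Og hits mammoth
```

As in the laconic natural-language case, the markers are omitted when the default order is used and only spelled out when the order deviates from it.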

Besides the mandatory means for marking syntactic roles, many languages have features that add redundancy without being necessary for understanding, i.e. which repeat already known information, for instance by repeating a noun's gender and number on all of its attributes. Whether a language requires such redundancy is independent of whether it is an isolating language or a fusional language.

English has somewhat fewer syntactic role markers than other languages because it has a rigid word order, but for roles other than the most frequent ones (agent, patient, beneficiary) it has a lot of prepositions.

Despite being more economical in role markers, English also has many redundant words that could be dropped, e.g. subjects or copulative verbs, which are omitted in many languages. Thus in English it is possible to speak "like a caveman" without losing much information, but this is independent of the fact that modern English is a mostly isolating language with few remnants of its old declensions.

akdor1154 2 days ago|||
I thought the term for those was 'sane languages', and I say that as a native English speaker :)
samus 1 day ago||
As a non-native English speaker I think English is actually not that bad. Just the orthography is beyond awful :)
sfink 1 day ago||
English is diarrhea mouth language. Which is worse?
samus 1 day ago||
What's your point?
andai 2 days ago||
No articles, no pleasantries, and no hedging. He has combined the best of Slavic and Germanic culture into one :)
samus 2 days ago|
Both Slavic languages and German have complex declension systems for nouns, verbs, and adjectives, which is unlike stereotypical caveman speech.
iammjm 2 days ago|||
I speak German, Polish, and English fluently, and my take is: German is very precise, almost mathematical; there is little room to be misunderstood, but it also requires the most letters. English is the quickest, get-things-done kind of language, very compressible, but it also risks misunderstanding. Polish is the most fun, with endless possibilities for twisting and bending its structures, but it lacks the ease of use of English and the precision of German. But that's clearly just my subjective take.
fissible 1 day ago||
I have always been annoyed at the verbosity of ChatGPT and (to a lesser degree) Claude. I am aware of the long-term costs associated with trading that bloated context back and forth all the time.
wktmeow 20 hours ago||
So this is really weird, I was using OpenClaw with GPT 5.4 via Codex on I think Friday of last week, and I noticed what looked like thinking tokens spilling to the main chat, and it sounded a lot like this trick! Couple of examples of what I was seeing in the output:

"Need resume task. No skill applies clearly. Need maybe memory? prior work yes need memory_search." "Need maybe script content from history. Search specific."

Possible that OpenAI has come up with something very similar here?

Edit: looks like not only me, https://github.com/openclaw/openclaw/issues/25592#issuecomme...

indiantinker 1 day ago||
It speaks like Kevin from The Office (US) https://youtube.com/shorts/sjpHiFKy1g8?is=M0H4G2o0d6Z-pBAC
stronglikedan 18 hours ago||
I feel justified! I've been prompting (not-agenting) like this for a while, and some of my colleagues have ribbed me for it. Now who laugh, JEFF!
vivid242 2 days ago||
Great idea! If the person who made it is reading: is this based on the board game "poetry for cavemen"? (You explain things using only single-syllable words; it even comes with an inflatable wooden log for hitting each other!)
stared 2 days ago||
I would prefer to talk like Abathur (https://www.youtube.com/watch?v=pw_GN3v-0Ls). Same efficiency but smarter.
rschiavone 2 days ago|
This trick reminds me of "OpenAI charges by the minute, so speed up your audio"

https://news.ycombinator.com/item?id=44376989

vntok 2 days ago|
Which worked great. Also, cut off silences.

> One half interesting / half depressing observation I made is that at my workplace any meeting recording I tried to transcribe in this way had its length reduced to almost 2/3 when cutting off the silence. Makes you think about the efficiency (or lack of it) of holding long(ish) meetings.
