Posted by futurisold 4 days ago

SymbolicAI: A neuro-symbolic perspective on LLMs(github.com)
221 points | 59 comments (page 2)
nickysielicki 4 days ago|
I spent some time toying around with LLM-guided "symbolic regression", basically having an LLM review documents in order to come up with primitives (aka operators) that could be fed into github.com/MilesCranmer/PySR

I didn't get very far because I had difficulty piping it all together, but with something like this I might give it another go. Cool stuff.
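A minimal sketch of the glue step that gave me trouble: turning an LLM's free-text answer into the operator lists PySR expects. Everything here (the function name, the known-operator sets) is illustrative, not part of PySR's or SymbolicAI's API; the output dict just mirrors the `unary_operators`/`binary_operators` keyword arguments that `PySRRegressor` takes.

```python
# Hypothetical helper: parse an LLM's comma/newline-separated list of
# suggested primitives into PySR-style operator lists, dropping anything
# we don't recognize.
KNOWN_UNARY = {"sin", "cos", "exp", "log", "sqrt", "abs"}
KNOWN_BINARY = {"+", "-", "*", "/", "^", "max", "min"}

def parse_primitives(llm_output: str) -> dict:
    tokens = [t.strip() for line in llm_output.splitlines()
              for t in line.split(",") if t.strip()]
    return {
        "unary_operators": sorted({t for t in tokens if t in KNOWN_UNARY}),
        "binary_operators": sorted({t for t in tokens if t in KNOWN_BINARY}),
    }

ops = parse_primitives("cos, sin\n*, +, frobnicate")
# unknown tokens like "frobnicate" are silently discarded
```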

futurisold 3 days ago|
Oh, definitely. I recommend you go for contracts. I've used something similar for a contract that iteratively "stitched together" a broken ontology graph. Here are some of the data models for inspiration -- you could have something similar for your ops, and write the contract to solve for one op, then apply the op, etc.

---

    class Merge(LLMDataModel):
        indexes: list[int] = Field(description="The indices of the clusters that are being merged.")
        relations: list[SubClassRelation] = Field(
            description="A list of superclass-subclass relations chosen from the existing two clusters in such a way that they merge."
        )
    
        @field_validator("indexes")
        @classmethod
        def is_binary(cls, v):
            if len(v) != 2:
                raise ValueError(
                    f"Binary op error: Invalid number of clusters: {len(v)}. The merge operation requires exactly two clusters."
                )
            return v
    
    
    class Bridge(LLMDataModel):
        indexes: list[int] = Field(description="The indices of the clusters that are being bridged.")
        relations: list[SubClassRelation] = Field(
            description="A list of new superclass-subclass relations used to bridge the two clusters from the ontology."
        )
    
        @field_validator("indexes")
        @classmethod
        def is_binary(cls, v):
            if len(v) != 2:
                raise ValueError(
                    f"Binary op error: Invalid number of clusters: {len(v)}. The bridge operation requires exactly two clusters."
                )
            return v
    
    
    class Prune(LLMDataModel):
        indexes: list[int] = Field(description="The indices of the clusters that are being pruned.")
        classes: list[str] = Field(description="A list of classes that are being pruned from the ontology.")
    
        @field_validator("indexes")
        @classmethod
        def is_unary(cls, v):
            if len(v) != 1:
                raise ValueError(
                    f"Unary op error: Invalid number of clusters: {len(v)}. The prune operation requires exactly one cluster."
                )
            return v
    
    
    class Operation(LLMDataModel):
        type: Merge | Bridge | Prune = Field(description="The type of operation to perform.")
---
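To make the "solve for one op, then apply the op" loop concrete, here's a minimal sketch using plain dataclasses (not the `LLMDataModel` classes above); `apply_merge` and the cluster-as-set representation are my own stand-ins, not SymbolicAI's API.

```python
from dataclasses import dataclass

@dataclass
class Merge:
    indexes: list  # exactly two cluster indices, as in the validator above

def apply_merge(clusters, op):
    """Apply one solved Merge op: union two clusters, keep the rest."""
    if len(op.indexes) != 2:
        raise ValueError("Binary op error: merge requires exactly two clusters.")
    i, j = sorted(op.indexes)
    merged = clusters[i] | clusters[j]  # union of the two class sets
    rest = [c for k, c in enumerate(clusters) if k not in (i, j)]
    return rest + [merged]

# one iteration of the loop: contract proposes an op, we apply it
clusters = [{"Animal", "Dog"}, {"Pet"}, {"Vehicle"}]
clusters = apply_merge(clusters, Merge(indexes=[0, 1]))
```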
futurisold 3 days ago||
Thank you to everyone for participating in the discussion and for your overall support! As I said, I didn't expect this. I'm always just an email or tweet away, so you know how to reach me. It was great talking to you all!
bjt12345 4 days ago||
How did you sort out mapping python constructs to their semantic equivalents?

I hope you keep at this, you may be in the right place at the right time.

It's getting to the point where some of the LLMs are immediately just giving me answers in Python, which is a strong indication of what the future will look like with Agents.

futurisold 4 days ago|
I'm struggling to understand the question. I'll revisit this when I wake up since it's quite late here.
nbardy 4 days ago||
I love the symbol LLM first approaches.

I built a version of this a few years ago as a LISP

https://github.com/nbardy/SynesthesiaLisp

futurisold 4 days ago|
Very nice, bookmarked for later. Interestingly enough, we share the same timeline: ~2 years ago is when a lot of interesting work spawned, as many started to tinker.
ian_medusa_ceo 3 days ago||
I already created a neurosymbolic code generator. Based on benchmarks, it scored 99 percent.
ian_medusa_ceo 3 days ago||
I did it for python and will come out with other languages afterwards
Aynur4 3 days ago||
Did you use symbolicai for that?
xpitfire 4 days ago||
We've been working on some exciting things with SymbolicAI, and here are a few things that might interest the HN community.

Two years ago, we built a benchmark to evaluate multistep reasoning, tool use, and logical capabilities in language models. It includes a quality measure to assess performance and is built on a plugin system we developed for SymbolicAI.

- Benchmark & Plugin System: https://github.com/ExtensityAI/benchmark

- Example Eval: https://github.com/ExtensityAI/benchmark/blob/main/src/evals...

We've also implemented some interesting concepts in our framework:

- C#-style Extension Methods in Python: using GlobalSymbolPrimitive to extend functionalities.

    - https://github.com/ExtensityAI/benchmark/blob/main/src/func.py#L146

- Symbolic <> Sub-symbolic Conversion: used for quality metrics, like a reward signal from the path integral of multistep generations.

    - https://github.com/ExtensityAI/benchmark/blob/main/src/func....
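One plausible reading of "reward signal from the path integral of multistep generations" is scoring the whole trajectory rather than only the final answer. The sketch below is my own simplification, not the benchmark's actual metric; `step_scores` stands in for whatever per-step quality measure is used.

```python
def trajectory_reward(step_scores):
    """Mean per-step score in [0, 1]: one bad intermediate step drags the
    whole multistep run down instead of being masked by a good final answer."""
    if not step_scores:
        return 0.0
    return sum(step_scores) / len(step_scores)
```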

For fun, we integrated LLM-based tools into a customizable shell. Check out the Rick & Morty-styled rickshell:

- RickShell: https://github.com/ExtensityAI/rickshell

We were also among the first to generate a full research paper from a single prompt and continue to push the boundaries of AI-generated research:

- End-to-End Paper Generation (Examples): https://drive.google.com/drive/folders/1vUg2Y7TgZRRiaPzC83pQ...

- Recent AI Research Generation:

    - Three-Body Problem: https://github.com/ExtensityAI/three-body_problem  

    - Primality Test: https://github.com/ExtensityAI/primality_test 

    - Twitter/X Post: https://x.com/DinuMariusC/status/1915521724092743997 
Finally, for those interested in building similar services, we've had an open-source, MCP-like API endpoint service available for over a year:

- SymbolicAI API: https://github.com/ExtensityAI/symbolicai/blob/main/symai/en...

jaehong747 4 days ago||
great job! it reminds me of GenAIScript. https://microsoft.github.io/genaiscript/

    // read files
    const file = await workspace.readText("data.txt");

    // include the file content in the prompt in a context-friendly way
    def("DATA", file);

    // the task
    $`Analyze DATA and extract data in JSON in data.json.`;

futurisold 4 days ago||
Thank you! I'm not familiar with that project, will take a look
krackers 4 days ago||
Some of this seems a bit related to Wolfram Mathematica's natural language capabilities.

https://reference.wolfram.com/language/guide/FreeFormAndExte...

It can (in theory) do very similar things, where natural-language input is a first-class citizen of the language and can operate on other objects. The whole thing came out almost a decade before LLMs; I'm surprised they haven't revamped it to make it really shine.

futurisold 4 days ago|||
> I'm surprised that they haven't revamped it

No worries! I can't find it right now, but Wolfram had a stream (or short?) where he introduced "Function". We liked it so much we implemented it after one day. Usage: https://github.com/ExtensityAI/symbolicai/blob/main/tests/en...

futurisold 4 days ago|||
Wolfram's also too busy running his theory-of-everything (TOE) experiments to focus on LLMs (quite sadly, if you ask me).
GZGavinZhao 4 days ago||
not to be confused with symbolica.ai
futurisold 4 days ago|
+
b0a04gl 4 days ago|
this works like functional programming: every symbol is a pure value and operations compose into clean, traceable flows. when you hit an ambiguous step, the model steps in. just like IO in FP, the generative call is treated as a scoped side effect. this way your reasoning graph stays deterministic by default and only defers to the model when needed. crazy demo though, love it
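A tiny illustration of that idea (the names here are made up, not SymbolicAI's API): deterministic rules handle the unambiguous cases, and the generative call is a scoped fallback invoked only when they fail.

```python
def resolve(value, model):
    """Deterministic rules first; defer to `model` only on ambiguous input."""
    rules = {"yes": True, "no": False}
    if value.lower() in rules:
        return rules[value.lower()]   # pure, traceable path
    return model(value)               # scoped "IO" side effect

# stand-in for an LLM call, so the sketch stays self-contained
stub_model = lambda text: text.startswith("prob")
```

With this shape, swapping the model (or logging every deferral) touches exactly one line, which is the traceability the parent comment is pointing at.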
futurisold 4 days ago|
Yes, pretty much. We wanted it to be functional from the start. Even at the low level, everything's functional (it's even called functional.py/core.py). We're using decorators everywhere. This helped a lot with refactoring, extending the framework, containing bugs, etc.