Posted by stagas 6 days ago

How to Synthesize a House Loop(loopmaster.xyz)
167 points | 65 comments
Slow_Hand 7 hours ago|
I've watched a lot of live coding tools out of interest for the last few years, and as much as I'd like to adopt them in my music making it's not clear to me what they can add to my production repertoire compared to the existing tools (DAWs, hardware instruments, playing by hand, etc).

The coding aspect is novel I'll admit, and something an audience may find interesting, but I've yet to hear any examples of live coded music (or even coded music) that I'd actually want to listen to. They almost always take the form of some bog-standard house music or techno, which I don't find that enjoyable.

Additionally, the technique is fun for demonstrating how sound synthesis works (like in the OP article), but anything more complex or nuanced is never explored or attempted. Sequencing a nuanced instrumental part (or multiple parts) requires a lot of moment-to-moment detail, dynamics, and variation, something that is tedious to sequence and simply doesn't play to this format's strengths.

So again, I want to integrate this skill into my music production tool set, but aside from the novelty of coding live, it doesn't appear well-suited to making interesting music in real time. And for offline sequencing there are better, more sophisticated tools, like DAWs or trackers.

MomsAVoxell 5 hours ago||
Every generation of musicians for the past 8 decades has had the same thoughts. What live coding tools for synthesis offer you is an understanding of the nature of generational technology.

Consider this: there are teenagers today, out there somewhere, learning to code music. Remember when synthesisers were young and cool and there was an explosion of different engines and implementations?

This is happening for the kids, again.

Try to use this new technology to replicate the modern, and then the old sound, and then discover new sounds. Like we synth nerds have been doing for decades.

anigbrowl 3 hours ago||
Music coding technology has been around a long time - think of tools like Csound and pd and Max/MSP. They're great for coding synthesizers. Nobody uses them to do songs. Even Strudel has tools for basic GUI components, because once you get past the novelty of 'this line of code is modulating the filter wowow', typing in numeric values for frequency or note duration is the least efficient way to interact with the machine.

Pro developers who really care about the sound variously write in C/C++ or use cross compilers for pd or Max. High quality oscillators, filters, reverb etc are hard work, although you can certainly get very good results with basic ones given today's fast processors.

Live coding is better for conditionals like 'every time [note] is played increment [counter], when [counter] > 15 reset [counter] to 0 and trigger [something else]'. But people who are focused on the result rather than the live coding performance tend to either make their own custom tooling (Autechre) or programmable Eurorack modules that integrate into a larger setup, eg https://www.perfectcircuit.com/signal/the-programmable-euror...
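The kind of conditional described above can be sketched in plain Python (a toy illustration, independent of any particular live coding environment; the names are made up):

```python
def make_counter_trigger(limit=15, on_trigger=None):
    """Return a per-note callback: increments a counter on every note,
    and once the counter exceeds `limit`, resets it and fires `on_trigger`."""
    state = {"count": 0}
    def on_note(note):
        state["count"] += 1
        if state["count"] > limit:
            state["count"] = 0
            if on_trigger:
                on_trigger(note)
    return on_note

# Count how many times the extra event fires over 64 notes.
fired = []
on_note = make_counter_trigger(limit=15, on_trigger=lambda n: fired.append(n))
for i in range(64):
    on_note(60)  # MIDI note 60
print(len(fired))  # fires on every 16th note -> 4 times in 64 notes
```

In a live coding session you would redefine `limit` or `on_trigger` on the fly, which is exactly the kind of interaction that is awkward in a conventional DAW.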

It's not that you can't get great musical results via coding, of course you can. But coding as performance is a celebration of the repl, not of the music per se.

alisonatwork 37 minutes ago|||
I see it as a neat way for nerds to nerd out about nerd stuff in an experiential way. Like, this is not going to headline a big time rave or festival or anything, but in a community of people who like math or programming or science, sure, why not introduce this kind of performance as another little celebration of their hobby?

Years ago I went to a sci-fi convention for the first time, because I had moved to a new town and didn't know anyone, and I like sci-fi. I realized when I was there that despite me growing up reading Hugo and Nebula award winners, despite watching pretty much every sci-fi show on TV, despite being a full-time computer nerd, the folks who go to sci-fi conventions are a whole nother subculture again. They have their own heroes, their own in-jokes, their own jargon... and even their own form of music! It's made by people in the community for the community and it misses the point to judge it by "objective" standards from the outside, because it's not about trying to make "interesting music" or write the best song of all time. The music made in that context is not being made as an end in itself, or even as the focus of the event, it's just a mechanism to enable a quirky subculture to hang out and bond in a way that's fun for them. I see this kind of live coded music as fulfilling a similar role in a different subculture. Maybe it's not for you, but that's fine.

ofalkaed 1 hour ago|||
I find trackers to be in the same category you put live coding into, probably DAWs as well, but many people do some amazing things with all three. In the more academic computer music world there is a fair amount of diversity in live coding where it is generally combined with algorithmic/generative techniques to rapidly create complexity and nuance. SuperCollider seems to have the most interesting output here, for me at least; I have seen little that really grabs me but they do show the capabilities of the process and I find that quite interesting. Improvisation and jamming is just not my thing, so live coding falls a bit short for me.
solomonb 6 hours ago|||
100% agree.

I think this format of composition is going to encourage a highly repetitive structure to your music. Good programming languages constrain and prevent the construction of bad programs. Applying that to music is effectively going to give you quantization of every dimension of composition.

I'm sure its possible to break out of that but you are fighting an uphill battle.

rbn3 2 hours ago|||
Quite the opposite, actually. Certain live coding languages give you the tools to create extremely complex patterns in a very controlled manner, in ways you simply wouldn't be able to via any other method. The most popular artist exploring these ideas is Kindohm, who is sort of an ambassador figure for the TidalCycles language. Having used TidalCycles myself, the language lends itself particularly well to this kind of stuff as opposed to more traditional song/track structures. And yet it also constrains and prevents the construction of bad programs in a very strict manner via its type system and compiler.

It's also notable for being probably the only Haskell library used almost exclusively by people with no prior knowledge of Haskell, which is an insane feat in itself.

solomonb 1 hour ago||
> Quite the opposite actually. certain live coding languages give you the tools to create extremely complex patterns

I think I must not be expressing myself well. These tools seem to be optimized for parametric pattern manipulation. You essentially declare patterns, apply transformations to them, and then play them back in loops. The whole paradigm is going to encourage a very specific style of composition where repeating structures and their variations are the primary organizational principle.

Again, I'm not trying to critique the styles of music that lend themselves well to these tools.

> And yet it also constrains and prevents the construction of bad programs in a very strict manner via its type system and compiler.

Looking at the examples in their documentation, all I see are examples like:

    d1 $ sound "[[bd [bd bd bd bd]] bd sn:5] [bd sn:3]"
So it definitely isn't leveraging GHC's typechecker for your compositions. Is the TidalCycles runtime doing some kind of runtime typechecking on whatever it parses from these strings?
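For illustration, here is a toy sketch (in Python, not TidalCycles' actual Haskell parser) of how a mini-notation string like the one above could plausibly be turned into timed events at runtime; a malformed string would surface as a runtime error, not something GHC's typechecker catches:

```python
from fractions import Fraction

def parse_pattern(s, start=Fraction(0), span=Fraction(1)):
    """Toy parser for a TidalCycles-style mini-notation subset:
    space-separated events share the span equally; [...] groups subdivide
    their slot recursively; '~' is a rest. Returns (onset, duration, name)."""
    # Split at the top level only, respecting bracket nesting.
    tokens, depth, cur = [], 0, ""
    for ch in s:
        if ch == "[": depth += 1
        if ch == "]": depth -= 1
        if ch == " " and depth == 0:
            if cur: tokens.append(cur)
            cur = ""
        else:
            cur += ch
    if cur: tokens.append(cur)

    events = []
    step = span / len(tokens)
    for i, tok in enumerate(tokens):
        t0 = start + i * step
        if tok.startswith("[") and tok.endswith("]"):
            events += parse_pattern(tok[1:-1], t0, step)  # recurse into group
        elif tok != "~":
            events.append((t0, step, tok))
    return events

print(parse_pattern("bd [sn sn] hh"))
```

Here `"bd [sn sn] hh"` yields `bd` on the first third of the cycle, two `sn` events splitting the second third, and `hh` on the last third.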

> It's also notable for being probably the only Haskell library used almost exclusively by people with no prior knowledge of Haskell, which is an insane feat in itself.

I think Pandoc or Shellcheck would win on this metric.

bigiain 2 hours ago||||
Some of us enjoy highly repetitive music, at least some of the time.

"Computer games don't affect kids. If Pac Man affected us as kids, we'd all be running around in darkened rooms, munching pills and listening to repetitive music." -- Marcus Brigstocke (probably?)

Also, related but not - YouTube's algorithm gave me this the other day - showing how to reconstruct the beat of Blue Monday by New Order:

https://www.youtube.com/watch?v=msZCv0_rBO4

solomonb 2 hours ago||
I'm not saying anything negative about repetitive music. I'm saying that tools like live coding are going to constrain the kind of music you can produce reasonably.
tialaramex 30 minutes ago||
I mean, sure, art has constraints.

My sister likes to work with [checks notes carefully to avoid the wrong words] old textiles. This of course constrains the kind of art she can make. That's the whole point.

I see live coding the same way as the harp, or a loop sampler, an instrument, one of an enormous variety of tools which you might find suits you or not. As performance I actually enjoy live coding far more than most ways to make music, although I thought Amon Tobin's ISAM Live was amazing that's because of the visuals.

cdr6934 6 hours ago|||
Quantization in the DAW is pretty easy to do as well. So I am not sure that would be unique to music / live coding sessions.
tarentel 5 hours ago||
It is unique because everything is quantized. I've never used these tools, but I am assuming you could give them some level of randomness. As someone who has performed and recorded, though, a non-quantized performance is not random. So sure, it's super easy to quantize in your DAW, but there it is a tool to be applied when needed, not something that is on all the time by default.
solomonb 3 hours ago||
Yes, exactly, and when I say "quantization of every dimension of composition" I mean an application of quantization to every aspect of composition, not just pitch and rhythm.
stagas 3 hours ago||
Quantization and repetition are what some genres depend on. It won't be the right instrument for a Rock ballad, but for a Techno track you need this kind of "everything being quantized". That said, in loopmaster you can add swing and noise to the note offsets to humanize a sequence, a lot is left to the imagination and ability of the creator.
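The swing-and-noise humanization mentioned above can be sketched roughly like this (a minimal illustration, not loopmaster's actual implementation; all parameter names are made up):

```python
import random

def humanize(onsets, grid=0.25, swing=0.08, noise=0.01, seed=42):
    """Quantize onsets (in beats) to a grid, then push every other grid
    step late by `swing` beats and add a little per-note timing jitter."""
    rng = random.Random(seed)
    out = []
    for t in onsets:
        step = round(t / grid)           # snap to the nearest grid step
        q = step * grid
        if step % 2 == 1:                # off-beat steps get the swing
            q += swing
        q += rng.uniform(-noise, noise)  # small random offset per note
        out.append(q)
    return out

print(humanize([0.0, 0.26, 0.49, 0.77]))
```

The sloppy input onsets snap to 0, 0.25, 0.5, 0.75 and the off-beats land roughly 0.08 beats late, plus a millisecond-scale wobble.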
solomonb 3 hours ago||
No one in this thread is saying quantization is never appropriate.
filoleg 6 hours ago|||
> I've watched a lot of live coding tools out of interest for the last few years, and as much as I'd like to adopt them in my music making it's not clear to me what they can add to my production repertoire compared to the existing tools (DAWs, hardware instruments, playing by hand, etc).

Aside from the novelty factor (due to very different UI/UX) and the idea that you can use generative code to make music (which became an even more interesting factor in the age of LLMs), I agree.

And even the generative code part I mentioned is a novelty factor as well, and isn't really practical for someone who actually makes music as their end-goal (and not someone who is just experimenting around with tech or how far one can get with music-as-code UIUX).

fasterik 5 hours ago|||
Procedural generation can be useful for finding new musical ideas. It's also essential in specific genres like ambient and experimental music, where the whole point is to break out of the traditional structures of rhythm and melody. Imagine using cellular automata or physics simulations to trigger notes, key changes, etc. Turing completeness means there are no limits on what you can generate. Some DAWs and VSTs give you a Turing complete environment, e.g. Bitwig's grid or Max/MSP. But for someone with a programming background those kinds of visual editors are less intuitive and less productive than writing code.
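As a toy example of the cellular-automata idea (an illustrative sketch, not taken from any particular tool): one step of elementary rule 110 can drive note triggers from a pentatonic scale.

```python
def rule110_step(cells):
    """One step of elementary cellular automaton rule 110 (wrapping edges)."""
    n = len(cells)
    rule = {(1,1,1): 0, (1,1,0): 1, (1,0,1): 1, (1,0,0): 0,
            (0,1,1): 1, (0,1,0): 1, (0,0,1): 1, (0,0,0): 0}
    return [rule[(cells[(i-1) % n], cells[i], cells[(i+1) % n])]
            for i in range(n)]

# Map each live cell in a 16-cell row to a note of a C minor pentatonic run.
scale = [48, 51, 53, 55, 58, 60, 63, 65, 67, 70, 72, 75, 77, 79, 82, 84]
row = [0] * 15 + [1]  # a single seed cell
for step in range(4):
    notes = [scale[i] for i, c in enumerate(row) if c]
    print(f"step {step}: trigger MIDI notes {notes}")
    row = rule110_step(row)
```

Each generation of the automaton becomes one "beat" of triggered notes; the pattern grows in a way that is deterministic but not obviously periodic.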

Of course, often creativity comes from limitations. I would agree that it's usually not desirable to go full procedural generation, especially when you want to wrangle something into the structure of a song. I think the best approach is a hybrid one, where procedural generation is used to generate certain ideas and sounds, and then those are brought into a more traditional DAW-like environment.

Slow_Hand 1 hour ago||
I've actually tried all of the approaches that you've mentioned over the years, and - for my needs - they're not that compelling at the end of the day.

Sure it might be cool to use cellular automata to generate rhythms, or pick notes from a diatonic scale, or modulate signals, but without a rhyme or reason or _very_ tight constraints the music - more often than not - ends up feeling unfocused and meandering.

These methods may be able to generate a bar or two of compelling material, but it's hard to write long musical "sentences" or "paragraphs" that have an arc and intention to them. Or where the individual voices are complementing and supporting one another as they drive towards a common effect.

A great deal of compelling music comes from riding the tightrope between repetition and surprising deviations from that scheme. This quality is (for now) very hard to formalize with rules or algorithms. It's a largely intuitive process and is a big part of being a compelling writer.

I think the most effective music comes from the composer having a clear idea of where they are going musically and then using the tools to supplement that vision. Not allowing them to generate and steer for you.

-----

As an aside, I watch a lot of Youtube tutorials in which electronic music producers create elaborate modulation sources or Max patches that generate rhythms and melodies for them. A recurring theme in many of these videos is an approach of "let's throw everything at the wall, generate a lot of unfocused material, and then winnow it down and edit it into something cool!" This feels fundamentally backwards to me. I understand why it's exciting and cool when you're starting out, but I think the best music still comes from having a strong grasp of the musical fundamentals, a big imagination, and the technical ability to render it with your tools and instruments.

----

To your final point, I think the best example of this hybrid generative approach you're describing are Autechre. They're really out on the cutting edge and carving their own path. Their music is probably quite alienating because it largely forsakes melody and harmony. Instead it's all rhythm and timbre. I think they're a positive example of what generative music could be. They're controlling parameters on the macro level. They're not dictating every note. Instead they appear to be wrangling and modulating probabilities in a very active way. It's exciting stuff.

fasterik 21 minutes ago|||
I don't think any of that is an argument against the use of procedural generation, it's just an argument for the tasteful use of it. Partly it also depends on what works in your own workflow. I find that it's an essential component in the creative process of lot of the artists I admire. Autechre is a great example. I think a lot of the pioneers of early IDM like Autechre and Aphex Twin have found ways to incorporate randomness at the micro level, while maintaining control at the macro level over the shape and direction of the composition. I don't see this as competing with traditional composition methods, it's just leveraging code-based tools to give the artist more control over which elements are random and which ones they control.
stagas 1 hour ago|||
When you learn to use it you can throw a lot of intention into it, knowing the output even before you hit play. Yes, you can go the other way and "subtract" your way out of the chaos, but you can also intentionally piece together the components and produce an output you imagined beforehand. The missing pieces here for this format, my instinct tells me, are layers of abstraction or additional UI elements that will help in composing a final piece: using code for the fundamental components, plus something else that hasn't been invented yet, or that no one has thought of gluing together.
quaverquaver 42 minutes ago|||
Here's a whole opera from a Star Trek episode I coded in SuperCollider; you can indeed code things other than EDM... (it's a screen grab, being synthesized in real time)

https://vimeo.com/944533415?fl=ip&fe=ec

quaverquaver 39 minutes ago|||
The great advantage over DAWs etc. is that you can name things and slowly build your own bespoke tools... For this work all timing was done in reference to the words rather than beats and bars; I can re-flow the whole piece by tapping through the syllables on my space key. Something that would be totally impossible in a traditional platform!
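The reflow-by-tapping idea can be sketched abstractly (a hypothetical illustration, not the commenter's actual SuperCollider code): events are positioned in syllable units and re-mapped onto whatever times the syllables were tapped at.

```python
def reflow(events, taps):
    """Re-time a sequence: each event's onset is expressed in 'syllable
    units' (index + fraction), then mapped onto the actual tapped times
    by linear interpolation between neighboring taps."""
    out = []
    for pos, payload in events:
        i = int(pos)
        frac = pos - i
        t = taps[i] + frac * (taps[i + 1] - taps[i])
        out.append((t, payload))
    return out

# Events positioned relative to syllables 0..3; taps are the seconds at
# which each syllable was tapped on the space key.
events = [(0.0, "note A"), (1.5, "note B"), (3.0, "note C")]
taps = [0.0, 0.40, 0.95, 1.30, 1.80]
print(reflow(events, taps))
```

Tapping the syllables again with different timing re-flows the entire piece without touching the events themselves.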
stagas 33 minutes ago||
In loopmaster you can define functions and abstract slowly and build your tools as well. Not yet with callbacks but it's in the works to do more complex SuperCollider-style stuff.
stagas 36 minutes ago|||
Sign up please and teach us.
H1Supreme 6 hours ago|||
Look into the JUCE framework for building your own tools. I was using Max/MSP for a while, but would always think to myself "This would be so much easier to accomplish in pure code". So I started building some bespoke VSTs.

There's a learning curve for sure, but it's not too bad once you learn the basics of how audio and MIDI are handled + general JUCE application structure.

Two tips:

Don't bother with the Projucer; use the CMake example to get going, especially if you don't use Xcode or Visual Studio.

If you're on a Mac, you might need to self-sign the VST. I don't remember the exact process, but it's something I had to do once I got an M4 Mac.

Libidinalecon 49 minutes ago|||
I haven't really found anything yet that Gemini can't do in python for this.

LLMs have absolutely killed any interest I used to have in the max/pd/reaktor wiring-up-boxes UI.

I have really gone further though and thought why do I even care about VST or a DAW or anything like this? Why not break completely free of everything?

I take inspiration from Trevor Wishart and the Composers Desktop Project for this. Wishart's music could only really be made with his own tools.

It is easy to sound original when using a tool no one else has.

wahnfrieden 6 hours ago|||
AudioKit for iOS/Mac is also interesting and easy to work with.
stagas 7 hours ago|||
Fair point, and that's the challenge in both the software's abilities and the creator's skills.

If you see it as yet another instrument you have to master, then you can go pretty far. I'm finding myself exploring rhythms and sounds in ways I could never do so fast in a DAW, but at the same time I do find a lot of factors limiting, especially sequencing.

So far I haven't gotten beyond a good sounding loop, hence the name "loopmaster", and maybe that's the limit, which is why I made a 2 deck "dual" mode in the editor, so that it can be played as a DJ set where you don't really need that much progression.

That said, it's quite fun to play with it and experiment with sounds, and whenever you make something you enjoy, you can export a certain length and use it as a track in your mix.

My goal is certainly to be able to create full length tracks with nuances and variations as you say, just not entirely sure how to integrate this into the format right now.

Feedback[0] is appreciated!

[0]: https://loopmaster.featurebase.app/

karlshea 6 hours ago||
I've seen a couple of TikToks with someone doing live coding with this same tool and it was really cool to watch because they really knew it well, but like you said it was bog-standard house/techno.
mstngl 8 hours ago||
What's going on with all these code-2-music tools these days? See the other front page discussion about strudel.cc [1]. Did I enter an established bubble or is there a rising trend? It's incredible, though, what people are able to obtain with it, especially when built up during a live session [2].

[1] https://news.ycombinator.com/item?id=46052478 [2] Nice example: https://m.youtube.com/watch?v=GWXCCBsOMSg

c22 8 hours ago||
Often an article posted to hn will cause a mini-trend as users who are engaging with the subject discover and share more related resources.
hecanjog 8 hours ago|||
Computer music is as old as computers, live coding is pretty old too. (I posted this in the strudel discussion too: https://toplap.org/wiki/HistoricalPerformances) Maybe everyone doing live streams during the pandemic helped get visibility for live coding? It's interesting to see it kind of becoming popular now.
vidarh 1 hour ago|||
And simpler generative music significantly predates computers:

https://en.wikipedia.org/wiki/Musikalisches_W%C3%BCrfelspiel
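The dice-game idea is simple to sketch (placeholder measure table, not Mozart's actual one from K. 516f):

```python
import random

def dice_minuet(bars=8, seed=1):
    """Toy Musikalisches Wuerfelspiel: for each bar, roll two dice and use
    the sum (2..12) to pick one of 11 pre-written measures from a lookup
    table. The measure IDs here are placeholders, not real music."""
    rng = random.Random(seed)
    # One lookup column per bar: 11 possible measures per dice sum.
    table = [[f"bar{b}_m{s}" for s in range(2, 13)] for b in range(bars)]
    piece = []
    for b in range(bars):
        roll = rng.randint(1, 6) + rng.randint(1, 6)
        piece.append(table[b][roll - 2])
    return piece

print(dice_minuet())
```

Every roll sequence stitches pre-composed fragments into a new, always-valid minuet, which is why the 18th-century version worked so well as a parlor game.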

baq 7 hours ago|||
Classic overnight success 20 years in the making. Many such cases.

I must say the narrated trance piece by switch angel blew my socks right off; to me it feels like this should be a genre in itself.

cdr6934 5 hours ago|||
My guess is we are either at the top or rising to the top of cyclical curve of the trend.
SmirkingRevenge 6 hours ago||
Csound is the oldest code-2-music framework I know of, and it's been around since the 80s, so the concept is not new.

The tools/frameworks have become more plentiful, approachable, and mature over the past 10-15 years, to the point where you can just go to strudel.cc and start coding music right from your browser.

pierrec 7 hours ago||
The language certainly looks nice! Is it open source? I think it makes sense for this kind of tool, since it's inherently "hackery". I mean people who want to write music with code also probably want the ability to understand and modify any part of the stack, it's the nature of the audience.

I'll shamelessly plug my weirdo version in a Forth variant, also a house loop running in the browser: https://audiomasher.org/patch/WRZXQH

Well, maybe it's closer to trance than house. It's also considerably more esoteric and less commented! Win-win?

stagas 6 hours ago|
Thanks! I tried to make it as familiar as possible, inspired by JS. It's not yet open-source, mainly because the source is a bit of a mess, but it will be once I tidy things up. Follow me on GitHub[0] for updates. Also that sounds to me like Tech-House/Electro-House :D Very nice!

[0]: https://github.com/stagas

ElijahLynn 3 hours ago||
SO fun!!!

Fun experiment to get you tinkerers started: skip to the bottom and play The Complete Loop - https://loopmaster.xyz/tutorials/how-to-synthesize-a-house-l...

Then find line 21, with `pat('[~ e3a3c4]*4',(trig,velocity,pitches)->`.

Change *4 to *2 and back to *4 to change the interval at which the "Chords" play. If you do it real fast with your backspace + 2 or backspace + 4 keys, you can change the chords in realtime and kinda vibe with the beat a little bit.

Definitely recommend wearing headphones to hear the entire audio spectrum (aka bass).

ElijahLynn 2 hours ago||
You can also highlight one or multiple lines and press (cmd + /) to toggle whether they're commented out, to turn a layer off!
ElijahLynn 2 hours ago||
So many tweaks but here is another:

Change line 12 from 8000 to 800.

adzm 2 hours ago||
Is there a way to sidechain the bass to a compressor with the kick as an input? otherwise the low end is very muddy.
stagas 1 hour ago|
Yes, there is a `sidechain` function designed specifically for this. I wanted to keep this tutorial simple, so I skipped a lot of mixing techniques or left them as an exercise to the reader, but I will try to cover those as well in future tutorials. Sign up to get notified when they arrive!

For now you can see how it's done here[0] on line 139. I pretty much use it on every other track I've made as well.

[0]: https://loopmaster.xyz/loop/6221a807-9658-4ea0-bfec-8925ccf8...
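For readers unfamiliar with the technique, sidechain ducking in its simplest form looks roughly like this (an illustrative sketch, not loopmaster's `sidechain` function; signatures and parameters are made up):

```python
import math

def sidechain(bass, kick_trigs, sr=44100, depth=0.8, release=0.1):
    """Duck the bass whenever a kick hits: a gain envelope drops to
    (1 - depth) at each trigger sample and recovers exponentially
    toward 1 with the given release time (in seconds)."""
    out, gain = [], 1.0
    trig = set(kick_trigs)
    rec = math.exp(-1.0 / (release * sr))  # per-sample recovery factor
    for i, x in enumerate(bass):
        if i in trig:
            gain = 1.0 - depth
        gain = 1.0 - (1.0 - gain) * rec    # move gain back toward 1
        out.append(x * gain)
    return out

# A constant "bass" signal ducked by a single kick at sample 0.
bass = [1.0] * 100
ducked = sidechain(bass, kick_trigs=[0], sr=100, release=0.1)
print(round(ducked[0], 3), round(ducked[-1], 3))
```

Because the bass drops out for the duration of each kick, the two never fight for the same low-frequency headroom, which is what un-muddies the low end.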

panic 6 hours ago||
I was surprised at the audible difference it made to reset the RNG seed for the hi-hat noise function every time it triggered. I’m curious what the justification for doing this is—does the randomness arise from the geometry of the hi-hat itself and not the way you hit it? Is the idea to imitate the sound of sample-based percussion?
stagas 5 hours ago|
My understanding is that because it's a very small sample, it's basically a combination of a subset of sine waves, and because we're very sensitive to the nuances of high-pitched sounds, even small changes in that space make a lot of difference. Every RNG seed produces a different sounding hihat, and if you don't reset it, it continues producing different hihats, which is unnatural. Another explanation is also to resemble sample-based audio, but perhaps it's all of these things combined.
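The effect of reseeding is easy to demonstrate (a minimal sketch, not loopmaster's actual hi-hat code):

```python
import random, math

def hihat(seed=7, n=2000, sr=44100, decay=40.0):
    """Short white-noise burst with an exponential decay envelope.
    Reseeding the RNG on every trigger makes each hit byte-identical,
    like playing back a sample; skipping the reseed gives a different
    noise burst (a differently-voiced hihat) on every hit."""
    rng = random.Random(seed)  # reset the seed per trigger
    return [rng.uniform(-1, 1) * math.exp(-decay * i / sr)
            for i in range(n)]

hit1 = hihat()
hit2 = hihat()        # same seed -> exactly the same waveform
hit3 = hihat(seed=8)  # different seed -> a different-sounding hihat
print(hit1 == hit2, hit1 == hit3)
```

With the reseed, consecutive hits are identical; without it, each hit draws fresh noise and the cymbal's "voice" changes on every trigger, which is what sounds unnatural.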
xnx 9 hours ago||
Very cool. https://loopmaster.xyz/generate is super fun also.
stagas 8 hours ago|
Thanks! For anyone trying this: it's being crushed by HN right now and hitting rate limits, so try again in a bit if you see an error.

Also, there is an AI DJ mode[0] where you set the mood/genre and the AI generates and plays music for you infinitely.

[0]: https://loopmaster.xyz/editor?aidj

999900000999 8 hours ago||
I really want something like this as a VST plugin.

I don't imagine making a full song out of this, but it would be a great instrument to have.

stagas 8 hours ago||
I'm considering a VST version but for now there is an Export Audio feature you can use to get a perfect audio loop to use with Ableton (or any other DAW) with oversampling up to 8x for great quality.
999900000999 8 hours ago||
How much do you need in donations to make it happen?

I'll put 50$ down right now.

stagas 8 hours ago||
Click Export Audio next to the title here[0], there is a buymeacoffee button :) tysm

[0]: https://loopmaster.xyz/loop/75a00008-2788-44a5-8f82-ae854e87...

999900000999 7 hours ago||
I meant for the VST plugin.

The janky way to do this would be to run it locally, and setup a watch job to reload the audio file into a vst plugin every time the file changes.

stagas 7 hours ago||
The backend now is in WASM. I have a plan for how to do this in a VST; I had done a version with a Rust+WASM backend in the past. My main concern is getting a Webview working for the editor, which is custom made, but I think that's also solved by now. The goal would be exactly the Web version working as a VST plugin with its real-time audio engine.
tscherno 8 hours ago|||
I route midi generated by strudel.cc in to my DAW.
duped 8 hours ago||
Shoutout to PACE who banned scripting in the JUCE 8 license terms so if you wanted to make this using the leading framework, you can't.
StableAlkyne 4 hours ago|||
Do you have a source for this? I don't see any indication from a quick Google other than this thread as the second result.

The license at: https://github.com/juce-framework/JUCE/blob/master/LICENSE.m...

indicates you can just license any module under the AGPL and avoid the JUCE 8 license (which, to be fair, I'm not bothering to read)

duped 4 hours ago||
https://forum.juce.com/t/archived-juce-8-eula/60947/149

And sure, you can license under AGPL. It should be obvious that's undesirable.

999900000999 7 hours ago|||
Define scripting.

I'm not going to test it, but couldn't you just load a JSON file with all the params?

Various instructions, etc.

I can't believe it's not code!

duped 6 hours ago||
The definition is up to them. They don't want to play around with loopholes since the whole point of the license change was to force more people to buy license seats.
jnsaff2 9 hours ago||
If you like this then check out Oxygene pt4 in JS[0].

[0] https://dittytoy.net/ditty/59b8a8d54d

djmips 1 hour ago|
Nice, that's the optimized version; it actually sounds a little different from the original it's derived from (better, actually, which I didn't expect). Original: https://dittytoy.net/ditty/24373308b4

I like how music recognition flags it as the original Jarre piece.

I first did stuff like this when I was a teen using a 6502 machine and a synth card, using white noise to make tshhh snares etc. All coded in 6502 assembly. The bible was Hal Chamberlin's Musical Applications of Microprocessors.

Then of course we had games abusing the SID etc. to make fantastic tunes, and then came very procedural music in size-coded PC and Amiga demos that under the hood were doing tiny synth work and sequencing, very much like dittytoy etc.

Shadertoy even has procedural audio but it doesn't get used enough.

Fantastic to experience all of this!

joemi 6 hours ago|
It strikes me as kind of weird (or maybe a red flag?) that there's no landing page nor an About page.
input_sh 6 hours ago|
I think it's more of a red flag that they chose a name that's one letter away from a well-known site that sells music samples: https://www.loopmasters.com/

Not a fringe unknown one either, but one with over 20 years of history, now owned by Beatport.

dylan604 5 hours ago||
meh, if they were that worried about their brand, they should have bought up the variants of their domain plus TLDs. otherwise, they can't possibly be that concerned about their trademark.