Music DSL (similar to Regex) for Music Notation

After much research, I believe Swift could use a DSL for developers to build clean music notation editors and rendering engines. There are various existing solutions: custom music notation structures, converters to MusicXML, SMuFL, etc.

ABC notation is a simple yet powerful text-based method of generating and storing music notation, with open source C and JS tools to edit/render from ABC to PDF, MusicXML, SVG, and PS. However, that code is pretty complex and not easily incorporated into Swift programs.

Given the upcoming Regex DSL, I believe a similar DSL/framework for music notation would be well received by the Swift community.

Curious if others feel there is an (unsatisfied) need for a standard music DSL framework within Swift?
This would enable many more developers to create innovative music notation apps.


Though I personally have no use for it, I do find the idea interesting. What would be an example syntax? Is it just a way to easily export a staff to one of these formats?

A somewhat relevant thing was discussed here recently.

I wonder, though: why a DSL / compile time / the ability to write notes directly in Swift? Wouldn't it be enough to represent notes as either a string of characters or some custom binary or text data structure (like the mentioned ABC format, a MIDI file, Lottie, or similar) editable by external means?


Having just dipped my toes into this space recently, I found that my real need was on the score rendering side. It would be cool to have a go-to Swift DSL for describing scores, but the use case is really limited to programmers putting together small examples.

Ultimately you’ll just be generating some underlying representation of the score, and it’s that representation that’s actually useful for manipulating the music, rendering it, and building GUI tools for users to work with it. I felt the same way about the literal music syntax experiments mentioned in the other thread — it might be neat for small things, but in any non-trivial app your music data is coming from elsewhere, not the codebase.


I did spend some time playing with the markup-like systems (e.g. VexFlow), and while initially I was taken with the idea of building a whole library of content using something like that, I realised pretty quickly that there were too many limitations in terms of control over layout etc. Not to mention that any non-trivial score starts to look so complex as text that — even to a programmer — the representation is no more useful than a big XML file.

Eventually I decided on MusicXML, since it’s easily created in many music programs and easily manipulated.


I would really love to have something Swift-native on the rendering side, though. At present your options are to try and get one of the C++ libraries building and call into it to generate an SVG or something. (Not all that hard to do, but these libraries tend to be older and you’re stuck if they don’t support some layout option you want). Or you can use OSMD or something similar from the JS ecosystem in a web view.

I think a Swift MusicXML renderer with a few engines (SVG for server-side, Core Graphics for apps) and a good set of layout configuration options would go a long way to helping us create much higher quality music apps.
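
To make that concrete, here is a rough, purely hypothetical sketch of how such a pluggable-backend renderer could be shaped in Swift. None of these types come from an existing library; it assumes a layout pass has already produced positioned glyphs, and shows only an SVG backend (a Core Graphics backend could conform the same way by drawing each glyph into a CGContext).

struct PositionedGlyph {
    let symbol: String   // e.g. a SMuFL glyph name or the glyph character itself
    let x: Double
    let y: Double
}

struct LaidOutScore {
    let width: Double
    let height: Double
    let glyphs: [PositionedGlyph]
}

// One layout result, many output formats.
protocol RenderBackend {
    associatedtype Output
    func render(_ score: LaidOutScore) -> Output
}

// Emits a very simple SVG document, e.g. for server-side use.
struct SVGBackend: RenderBackend {
    func render(_ score: LaidOutScore) -> String {
        var svg = "<svg xmlns=\"http://www.w3.org/2000/svg\" "
        svg += "width=\"\(score.width)\" height=\"\(score.height)\">\n"
        for glyph in score.glyphs {
            svg += "  <text x=\"\(glyph.x)\" y=\"\(glyph.y)\">\(glyph.symbol)</text>\n"
        }
        svg += "</svg>"
        return svg
    }
}

// Usage: render a single placeholder glyph.
let page = SVGBackend().render(
    LaidOutScore(width: 200, height: 60,
                 glyphs: [PositionedGlyph(symbol: "♩", x: 20, y: 30)])
)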


I’m currently working on a project (not open-source yet): a library that uses a result-builder-style API to construct synthesisers and pieces from simple components. So far it’s working quite nicely and can certainly produce some cool-sounding tracks. It’s especially good for generative music. It just renders tracks and sounds to WAV files at the moment because that’s the easiest option.

I’ll post again here if/when I open-source it.


I should add that unlike most of the conversation here, my library is not aimed towards generating any sort of standard music notation, so there is definitely still a place for that. My library is more on the electronic sound generation side of music.
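
Purely as an illustration of the general shape of such an API (this is not the library described above; every name here is invented), a result-builder-style construction over simple audio components might look roughly like this:

import Foundation

struct Note {
    let frequency: Double   // Hz
    let duration: Double    // seconds
}

@resultBuilder
struct TrackBuilder {
    static func buildExpression(_ note: Note) -> [Note] { [note] }
    static func buildBlock(_ components: [Note]...) -> [Note] { components.flatMap { $0 } }
}

struct Track {
    let notes: [Note]
    init(@TrackBuilder _ content: () -> [Note]) { self.notes = content() }

    // Render the track to raw PCM samples with a naive sine oscillator;
    // the samples could then be written out as a WAV file.
    func samples(sampleRate: Double = 44_100) -> [Float] {
        var out: [Float] = []
        for note in notes {
            for i in 0..<Int(note.duration * sampleRate) {
                let t = Double(i) / sampleRate
                out.append(Float(sin(2 * .pi * note.frequency * t)) * 0.3)
            }
        }
        return out
    }
}

// Usage: a two-note fragment.
let fragment = Track {
    Note(frequency: 440.0, duration: 0.5)    // A4
    Note(frequency: 523.25, duration: 0.5)   // C5
}
let pcm = fragment.samples()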

Thanks for the reference to the other thread re: notation - lots of common themes.
I'm thinking along the lines of SwiftUI: the music is declared in a way that is "readable" by a human, using simple keywords, properties, etc., but without all the "extra" formatting that XML requires to transfer music between various systems. That formatting is still required at some point, but would be abstracted away, much as SwiftUI abstracts away its underlying layout machinery. The same goes for rendering the music, whether visually or audibly.
So where SwiftUI might have VStack, HStack, Text syntax, "MusicUI" would have Score, Part, Tune, Notes (and a variety of other) keywords with various dot properties, e.g. .westernNotation, which would determine how the music is rendered visually or audibly (see the sketch below).
Such a framework would still be able to read/write MusicXML, read/write MIDI, convert it to Swift's internal music type structure (perhaps that ultimately is MusicXML?), send/play it to speakers or transmit it via standard MIDI devices.
Application designers/engineers could create music using small music notation text that is played/rendered on the fly rather than having to create/store audio files for the same music.
Similarly, the music could be rendered visually in whatever notation is local to the user (see Wikipedia's Music Notation topic for the history of music notation and the variety of forms in use throughout history as well as today).
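
As a rough, purely hypothetical sketch of that kind of syntax (Score, Part, Phrase, NotationStyle and the modifier below are invented names, not an existing API; the .westernNotation dot property is modelled here as a SwiftUI-style modifier):

enum NotationStyle {
    case western, nashville, jianpu   // placeholders for the many notation systems
}

struct Phrase {
    let abc: String   // reusing ABC-style pitch letters here purely for brevity
}

struct Part {
    let name: String
    let phrases: [Phrase]
    var notationStyle: NotationStyle = .western

    // Dot-style modifier, in the spirit of SwiftUI view modifiers.
    func notation(_ style: NotationStyle) -> Part {
        var copy = self
        copy.notationStyle = style
        return copy
    }
}

@resultBuilder
struct ScoreBuilder {
    static func buildBlock(_ parts: Part...) -> [Part] { parts }
}

struct Score {
    let title: String
    let parts: [Part]
    init(title: String, @ScoreBuilder _ content: () -> [Part]) {
        self.title = title
        self.parts = content()
    }
}

// A C major scale, analogous to the ABC example below; exporting this
// structure to MusicXML or MIDI would be the framework's job.
let scale = Score(title: "C Major Scale") {
    Part(name: "Melody", phrases: [Phrase(abc: "CDEF GABc")])
        .notation(.western)
}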

Swift (using Xcode) can support multiple languages, date formats, time formats, etc., depending on locale. It seems like music, as a "universal" language, should have similar support?

--- Here is a simple notation for a scale using ABC notation - about 51 bytes ---
--- Its MusicXML equivalent (uncompressed) is about 3 KB ---
X:1
T:abc Notation
M:4/4
L:1/4
K:C
CDEF | GABc |]
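
To illustrate how little structure sits behind that text, here is a minimal, purely hypothetical Swift sketch that reads this particular fragment into a tiny model. It is not a real ABC parser; it only handles the header fields and plain pitch letters used above.

import Foundation

struct ABCTune {
    var title = ""
    var meter = ""
    var unitNoteLength = ""
    var key = ""
    var pitches: [Character] = []
}

func parseSimpleABC(_ text: String) -> ABCTune {
    var tune = ABCTune()
    for rawLine in text.split(separator: "\n") {
        let line = rawLine.trimmingCharacters(in: .whitespaces)
        if line.hasPrefix("T:") { tune.title = String(line.dropFirst(2)) }
        else if line.hasPrefix("M:") { tune.meter = String(line.dropFirst(2)) }
        else if line.hasPrefix("L:") { tune.unitNoteLength = String(line.dropFirst(2)) }
        else if line.hasPrefix("K:") { tune.key = String(line.dropFirst(2)) }
        else if !line.hasPrefix("X:") {
            // Tune body: keep the pitch letters, drop barlines and spaces.
            tune.pitches.append(contentsOf: line.filter { $0.isLetter })
        }
    }
    return tune
}

let tune = parseSimpleABC("""
X:1
T:abc Notation
M:4/4
L:1/4
K:C
CDEF | GABc |]
""")
// tune.key == "C"; tune.pitches == ["C", "D", "E", "F", "G", "A", "B", "c"]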

Other thoughts....
How does a human create and "save" music? Possibilities (not likely exhaustive):

  • Mind to instrument, creating an audible representation of the music (which may or may not be recorded in analog or digital form, e.g. MIDI)
  • Mind to written form, i.e. handwriting, separately or in tandem with playing an instrument
  • Mind to digital form, using a variety of computer software tools (with various forms of user interface to enter/play music, capture it in digital form, and provide audio feedback of the notes/chords being created)

How is Music represented?

  • Written form (by hand or in sheet music form), which can then be scanned/saved in digital form
  • Various notation systems, e.g. Western notation, Nashville, and Japanese and Chinese music formats
  • Digital form - MusicXML, proprietary formats
  • From written form (handwritten) to digital form (scanning, manual entry)

For easy handling of XML you might try this library (authored by me). (Note that the library is not in a final state yet, and the repository URL might move, but this library is already used in production in a large organisation; more news about it soon.)