Re: [Denemo-devel] MIDI convert of recorded midi


From: Richard Shann
Subject: Re: [Denemo-devel] MIDI convert of recorded midi
Date: Thu, 18 Mar 2010 10:20:23 +0000

On Thu, 2010-03-18 at 00:00 -0500, Jeremiah Benham wrote:
> I think there should be a higher-level note insertion function. This
> function would detect whether the measure has enough room for the note
> value. If not, it subtracts the remaining bar time from the note's
> value, and the remainder would be tied onto the next measure. This
> higher-level insertion would also check for note collisions. If colliding
> notes have the same start time and duration, the note can be added as a
> chord tone. If uniquely timed notes collide, the note is passed to the
> next voice. If the voice does not exist it is created. 

I think you would be venturing out onto a wide, wide ocean of complexity
with no hope of arriving: see below!
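
Just to pin the scope down, the bare bones of the insertion routine you
describe might look like this (a Python sketch; the note dictionaries and
insert_note are made up, not existing Denemo code), and even this ignores
tuplets, grace notes and all the rest:

    # Sketch only. A note is a dict: start and dur in beats, pitch as a
    # MIDI number, voice as a small integer, tie marking a note that is
    # tied into the next measure.
    def insert_note(score, note, bar_len):
        start, dur = note["start"], note["dur"]
        room = bar_len - (start % bar_len)       # beats left in this measure
        if dur > room:                           # split at the barline and tie
            insert_note(score, dict(note, dur=room, tie=True), bar_len)
            insert_note(score, dict(note, start=start + room, dur=dur - room), bar_len)
            return
        for other in score:
            if other["voice"] != note["voice"]:
                continue
            if other["start"] == start and other["dur"] == dur:
                other.setdefault("chord", []).append(note["pitch"])  # chord tone
                return
            if other["start"] < start + dur and start < other["start"] + other["dur"]:
                insert_note(score, dict(note, voice=note["voice"] + 1), bar_len)  # next voice
                return
        score.append(dict(note))

    score = []
    insert_note(score, {"start": 3.0, "dur": 2.0, "pitch": 60, "voice": 1}, bar_len=4.0)
    # -> a one-beat note tied to a one-beat note in the next measure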

> 
> > 
> > One suggestion - do not try to detect rests: instead assume every
> > note on continues until another one. The output is far more likely
> > to be near what the user would want. They would need some editing
> > commands to shorten notes (by applying staccato signs, or shortening
> > the duration and making up the duration with a rest), but this is
> > easier than coping with rests already inserted.
> 
> Ok. That would not be hard. 
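
For what it's worth, that conversion really is only a few lines. A Python
sketch, where the (time, pitch) note-on list is an assumption about what
the recording step hands over:

    # Every note lasts until the next note-on; the final note gets an
    # arbitrary length, to be tidied up by hand afterwards.
    def note_ons_to_notes(note_ons, last_dur=1.0):
        note_ons = sorted(note_ons)              # [(time_in_beats, midi_pitch), ...]
        notes = []
        for i, (t, pitch) in enumerate(note_ons):
            nxt = note_ons[i + 1][0] if i + 1 < len(note_ons) else t + last_dur
            notes.append({"start": t, "dur": nxt - t, "pitch": pitch})
        return notes

    print(note_ons_to_notes([(0.0, 60), (1.0, 62), (1.5, 64)]))
    # -> a quarter, an eighth, then the arbitrary last_dur for the final note
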
> 
> > 
> > On the whole, I think we should be looking to something else to
> > provide this midi conversion functionality 
> 
> A script?  
no, see below:)
> 
> > - it is comparable with OCR in difficulty, with the added twist that
> > there will be a range of notations possible for a given input,
> > requiring the user to provide (build up) a library of their own
> > cliches for use by the recognition program. 
> 
> Yes. We don't have support in importmidi for tuplets, lyrics, time
> signature changes, tempo changes, etc...

even those things don't get to the heart of the matter: written music
notation has a very subtle and complex relationship with actual musical
performance, one that music readers often find difficult to see. It is
much like other fields I have worked in (e.g. machine vision), where
what appears to be a reasonably straightforward problem is in fact very
hard.
Deciding whether notes form part of a chord, are being played slightly
staccato, are just being swung, etc. has to be done to a certain level
of competence for it to be worthwhile attempting. That level of
competence, I suspect, will require a substantial program: not, perhaps,
as challenging as speech recognition, but still not just an add-on to
another program. That is not to say that if a MIDI-to-notation system is
available we should not integrate it.
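
A toy illustration of the swing/staccato point: the same pair of played
onsets can sit almost equally close to two or three perfectly reasonable
notations, and nothing in the MIDI data itself says which one the player
meant (the numbers below are invented):

    played = [0.0, 0.71, 1.0]                      # two onsets plus the next downbeat
    candidates = {
        "straight eighths":          [0.0, 0.5, 1.0],
        "swung eighths (2:1)":       [0.0, 2 / 3, 1.0],
        "dotted eighth + sixteenth": [0.0, 0.75, 1.0],
    }
    for name, grid in candidates.items():
        err = sum(abs(a - b) for a, b in zip(played, grid))
        print(f"{name:26}  total error {err:.3f}")
    # the swung and dotted readings come out practically tied
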
And we do have something to offer: in Denemo the user could assemble a
music staff with a selection of rhythmic patterns which reflect the sort
of music notation they want to see. These (along with a MIDI performance
of the patterns by the user) could then be used as input to a MIDI
conversion program, with subsequent editing in Denemo being used to
train the MIDI conversion program (I imagine this would be a neural net
thing).
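
As a degenerate, untrained stand-in for that idea, matching a performed
fragment against the user's own pattern library could start out as nothing
more than a nearest-pattern lookup; everything below is invented, and a
trained model would eventually replace the crude distance measure:

    def closest_pattern(performed, library):
        def dist(a, b):
            if len(a) != len(b):
                return float("inf")
            return sum(abs(x - y) for x, y in zip(a, b))
        return min(library, key=lambda name: dist(performed, library[name]))

    library = {"quarter + two eighths": [0.0, 1.0, 1.5],
               "triplet quarters":      [0.0, 2 / 3, 4 / 3]}
    print(closest_pattern([0.0, 0.95, 1.45], library))   # -> "quarter + two eighths"
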
But I think we would need a substantial amount of outside help -
something would have to be happening beyond our walls for this to get
off the ground.
(Always the pessimist, me:)
Richard




> 
> > (Did you mention MMA in this context? That seems not to be a MIDI to
> > notation convertor).
> 
> MMA is a MIDI accompaniment generator. 
> http://www.mellowood.ca/mma/
> It's written in Python to create a MIDI accompaniment file based on
> some chord definitions. My idea was this: 
> Fakechords can be inserted above a melody line in Denemo. Then a script
> creates an mma file based on the fakechords and user parameters. A
> dropdown menu or directive could be used to set the style or styles used
> in this autogeneration. Then the script executes mma. The midi file
> that was created by mma would be imported into the current Denemo
> project in parallel with the pre-existing tracks/staffs. The
> accompaniment that was generated would probably have notes that
> collide with one another. 
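
(A rough sketch of that fakechord-to-MMA round trip, just to pin the idea
down; the MMA directives, the file names and the command invocation here
are guesses rather than a worked-out design:)

    import subprocess

    def chords_to_midi(chords, groove="Swing", tempo=120, path="accomp.mma"):
        # One MMA line per bar, prefixed by the bar number, plus directives.
        lines = [f"Tempo {tempo}", f"Groove {groove}"]
        lines += [f"{bar}  {chord}" for bar, chord in enumerate(chords, start=1)]
        with open(path, "w") as f:
            f.write("\n".join(lines) + "\n")
        subprocess.run(["mma", path], check=True)    # mma writes accomp.mid alongside
        return path[:-4] + ".mid"                    # this file then gets imported into Denemo

    midi_file = chords_to_midi(["Cmaj7", "Am7", "Dm7", "G7"])
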
> 
> > 
> > I don't know where you want to take this...
> 
> It would not be much of a stretch to record midi input from a
> controller if we had an easy way of dealing with the collisions and
> ties. 




