T.R | Title | User | Personal Name | Date | Lines |
---|---|---|---|---|---|
2227.1 | Nonreal Time Assistance? | DRUMS::FEHSKENS | | Fri Jan 05 1990 12:17 | 10 |
| Studios do the "grab the <specific drum> off its track and use it
as a MIDI trigger" all the time. Pitch tracking is a bit harder,
but possible on an isolated track for a monophonic line. I think
resolving chords on an isolated track is still beyond the state
of the art, and pulling a single instrument (even if it's playing
a monophonic line) out of a mix is also beyond the state of the
art. Amazing analyzers, these ear/brain things.
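
Monophonic pitch tracking on an isolated track, as len describes, is tractable with something as simple as autocorrelation: the lag at which a frame best correlates with itself is the period. A minimal illustrative sketch (hypothetical code, not from any product; real trackers add windowing, lag interpolation, and voiced/unvoiced decisions):

```python
import math

def detect_pitch(frame, sample_rate, fmin=50.0, fmax=1000.0):
    """Estimate the fundamental of a monophonic frame by autocorrelation."""
    n = len(frame)
    mean = sum(frame) / n
    sig = [s - mean for s in frame]          # remove any DC offset
    best_lag, best_corr = None, float("-inf")
    # Search only lags corresponding to frequencies between fmin and fmax.
    for lag in range(int(sample_rate / fmax), int(sample_rate / fmin) + 1):
        corr = sum(sig[i] * sig[i + lag] for i in range(n - lag))
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return sample_rate / best_lag            # period in samples -> Hz

# A 440 Hz sine comes back within a few Hz (limited by integer-lag resolution).
sr = 8000
frame = [math.sin(2 * math.pi * 440.0 * i / sr) for i in range(800)]
freq = detect_pitch(frame, sr)
```

Chords break this immediately, since several periods superimpose and the strongest lag need not match any one note, which is exactly the state-of-the-art wall mentioned above.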
len.
|
2227.2 | 2¢ | CSOA1::SCHAFER | Brad - boycott hell. | Fri Jan 05 1990 12:23 | 10 |
| Depending on what you're trying to snag, I'm inclined to agree with
Len. Take your bass line example, for instance. Now let's suppose
that you've got someone like Abe Laboriel doing his multi-pop thang, or
a plethora of hammer-ons. How you gonna track that?
Or take me and my stumbling, self-taught technique. Even on a simple
line, I doubt that you could capture and quantify (via MIDI) all the
nuances that exist(ed). Not that you'd want to, of course...
-b
|
2227.3 | more $$ for copyright lawyers | KOBAL::DICKSON | You could be an ocarina salesman | Fri Jan 05 1990 12:41 | 15 |
| The legal question is interesting. What would the DAT-police say about
this? There are clearly legal uses for such a technology (making
scores from improvised performances, for example). But there is
nothing to prevent taking a (copyrighted) record album of somebody and
running it thru this process. Something like the uproar about sampling
somebody else's performance.
Dunno about bass lines. It is hard to get the pitch right and also
follow high-speed playing. The problem is similar to what the
photographer faces in dealing with the color temperature of a light
source. Our eyes/brains see sunlight on snow and tungsten light on
paper as both being white, while camera film "knows" that the former is
much bluer than the latter.
How much would you be willing to pay for something that did this? :)
|
2227.4 | Ask the Navy | VOLKS::RYEN | Rick Ryen 240-6501 AET1-1/A6 | Tue Jan 09 1990 16:31 | 21 |
|
An interesting thought. I believe that more of the technology exists
today than we might expect, although it might be excessively complex
and expensive for a musical application.
I'd ask the Department of Defense. Most probably the
Navy. Signature analysis of acoustics has had quite a bit of
development over the years, especially in the area of submarine
warfare. In the '70s, I worked on a simulator for the S-3A anti-submarine
aircraft. It did acoustic analysis to identify submarine types, and
could differentiate many types of subs, whales, schools of shrimp, etc.
Granted, that is a slightly different problem than decoding music.
But, if similar techniques were applied to music analysis, along with
the power of today's computers, I think it could be done. It's just a matter
of applying the right scientist to the job.
"Computer, please listen to this song, write me out the score, and
put the guitar parts in tab. Thank you, carriage return."
8^)
Rick
|
2227.5 | two problems | KOBAL::DICKSON | You could be an ocarina salesman | Wed Jan 10 1990 09:50 | 12 |
| Ah now, writing out the score is an Expert System problem, as it is not
easy to tell the difference between a ritard and sloppy timekeeping.
You would have to tell it so much about the performance that you might
as well do it by hand. Some notation programs, notably "Finale", make
some attempts at this.
If all you want is a MIDI capture of a performance, that is just a
signal processing problem. MIDI is, after all, intended to convey the
nuances of performance, not the formalities of western music notation.
Not that both are not worthwhile endeavors in themselves. But they are
separate.
|
2227.6 | Yes But | DRUMS::FEHSKENS | | Wed Jan 10 1990 10:21 | 7 |
| re .5, specifically MIDI - well, not quite. There are these
troublesome MIDI clocks that are supposed to come 24 per quarter
note, implicitly defining both the local tempo and the bar lines,
were one to be able to intuit the applicable time signature.
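
Len's point can be turned around: because the spec fixes the clocks at 24 per quarter note, a receiver can infer the local tempo from how far apart the clocks arrive. A small illustrative sketch (hypothetical names):

```python
CLOCKS_PER_QUARTER = 24  # fixed by the MIDI spec for real-time Timing Clock

def bpm_from_clock_interval(seconds_per_clock):
    """Infer tempo from the spacing of incoming MIDI clock messages."""
    seconds_per_quarter = CLOCKS_PER_QUARTER * seconds_per_clock
    return 60.0 / seconds_per_quarter

# At 120 BPM a quarter note lasts 0.5 s, so clocks arrive every 0.5/24 s.
bpm = bpm_from_clock_interval(0.5 / 24)  # -> 120.0
```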
len.
|
2227.7 | | KOBAL::DICKSON | You could be an ocarina salesman | Wed Jan 10 1990 11:22 | 28 |
| Those clocks are of no interest to anybody but a sequencer (or a drum
machine). They aren't even encoded in MIDI files. All a MIDI file
says is things like "n clock ticks after the previous event, thus and
so happened". The meaning of a "clock tick" can be anything you want.
For example, many sequencers use 480 ticks to the quarter note. A MIDI
file tempo directive says how many *microseconds* per quarter note.
So if you arbitrarily set a tempo of 500000 (which is 120 quarter notes
per minute), with a resolution of 480, you encode your delta times
with 960 ticks to the second. The actual event coding has no musically
significant time information - just ticks.
You also have the option (in the file header) to say that you are not
using metrical time at all, but SMPTE time. Then you can say things
like "this file is coded for 30 frames per second, with a resolution of
4 ticks per frame". This gives you 120 ticks per second. (4 ticks per
frame happens to be the MTC resolution. Video SMPTE has 80 per frame.)
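
The two resolutions worked out above (960 ticks/second metrical, 120 ticks/second SMPTE) both fall out of simple arithmetic; a quick sketch of the two cases (illustrative function names, not any real API):

```python
def metrical_ticks_per_second(tempo_us_per_quarter, ppq):
    """Metrical time: the tempo meta-event gives microseconds per quarter."""
    return ppq * 1_000_000 / tempo_us_per_quarter

def smpte_ticks_per_second(frames_per_second, ticks_per_frame):
    """SMPTE time: the file header gives frame rate and ticks per frame."""
    return frames_per_second * ticks_per_frame

metrical = metrical_ticks_per_second(500_000, 480)  # 120 BPM, 480 PPQ -> 960.0
smpte = smpte_ticks_per_second(30, 4)               # 30 fps, 4/frame  -> 120
```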
So you can take a random series of events, musical or not, and encode
them into a MIDI file in such a way that someone reading the file will
be able to reproduce those events with the exact timing you wanted.
Karl Moeller, for example, uses his sequencer as a tape machine, with
no regard for what tempo the sequencer thinks it is using. As long as
your sequencer has enough resolution, you don't need to worry about it.
(and all the good computer-based sequencers these days use 480 ticks
per quarter note.)
- Paul Dickson, MIDI file weenie.
|