
Re: [iiwusynth-devel] Multi channel output and number of midi channels


From: Peter Hanappe
Subject: Re: [iiwusynth-devel] Multi channel output and number of midi channels
Date: Mon, 02 Dec 2002 10:33:46 +0100
User-agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.0.0) Gecko/20020623 Debian/1.0.0-0.woody.1

M. Nentwig wrote:

>> I'm trying to make up for my absence lately by writing very long
>> mails. Hang on.

> Don't worry, I'll be rather quiet next month (programming sounds and
> preparing gigs, call it 'field-testing the synth').

> What I don't understand about your approach with several aux buses is
> how the output signal from the Fx is merged back into the multichannel
> output. The two extremes would be one more matrix mixer (every Fx chain
> to every output), or all Fx output only to L/R. Most older hardware
> synths (take the Korg 01R/W or Alesis QS) do the latter: outputs 3 and
> 4 are dry only.
> A hardware mixer would have effect returns, which are routable to the
> master out and/or a subgroup.
> Otherwise, as I said, I think it's a good idea, it's sufficiently
> flexible for most uses, and it will work fine (just not for me).

For multi-channel output, the output signals of the fx are not merged
back into the multichannel output. They are output separately in the
fx_left and fx_right buffers.

The rationale is, first, that an application doing multi-channel audio
will probably have some internal mixer, and, second, that applications
that don't want to use the internal reverb and chorus are not obliged
to. For example, if you want to use an external reverb, either hardware
or software, then you don't want the reverb in the signal. Also, it
offers the possibility to listen to the effects separately
before mixing them into the master.

A matrix mixer would be overkill, I think. I don't want to implement
a full mixer in iiwusynth. I would just like to offer the possibility
to output instruments individually so the final mixing can be done
outside the synthesizer (in a multi-track editor, for example).
With JACK, you can freely connect the instrument signals and fx signals
to the hardware output and JACK will do the mixing for you.
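
To illustrate the idea, here is a minimal sketch of the external mixing
described above. This is illustrative Python, not iiwusynth's actual C
code; the buffer names fx_left/fx_right follow the discussion, and the
function and gain parameter are hypothetical.

```python
# Sketch: the synth renders dry instrument channels and the fx wet
# signal into separate buffers; the host application does the mixing.

def mix_master(dry_channels, fx_left, fx_right, fx_gain=1.0):
    """Mix per-instrument stereo buffers and the separate fx buffers
    into one master stereo pair, as an external mixer would."""
    n = len(fx_left)
    master_l = [0.0] * n
    master_r = [0.0] * n
    for left, right in dry_channels:   # one (L, R) pair per instrument
        for i in range(n):
            master_l[i] += left[i]
            master_r[i] += right[i]
    for i in range(n):                 # fx returns go to the master only
        master_l[i] += fx_gain * fx_left[i]
        master_r[i] += fx_gain * fx_right[i]
    return master_l, master_r

# An application using an external reverb simply ignores
# fx_left/fx_right (or sets fx_gain to 0.0).
```
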


>> If you want to allow loading arbitrary LADSPA plugins, you'll need
>> commands to load and unload them. You'll have to think about mono and
>> stereo inputs/outputs, about setting parameter values, and realtime
>> modulation of parameters.

> All this is already done (the RT modulation is not yet implemented, but
> the road is clear, see below). In the end you'll end up rewriting
> something very close to the existing LADSPA 'subsystem'. I believe that
> the code I wrote is fully up to the job,

You're right about that.


> maybe I'm just selling it badly.
> I wouldn't consider it a new 'language'; it's more like a netlist
> format (as in electrical engineering). It could very easily be extended
> to multichannel inputs, multiple aux ways as you planned, and
> multichannel output, while keeping the flexibility that I need. And
> there are no unneeded gimmicks right now; it's just basic
> functionality.

Okay, I think I need to play with it a little more. You convinced me to
keep the current scheme. We just need to check that the changes I made
for the multi-channel output don't cause problems for the LADSPA Fx
code.


>> Let me know what you think about the ideas above. Can it suit you?

> No, it won't be up to the job:
>
> - Master effect: I use a tanh limiter as the last plugin in every
> sound I have programmed; it can tolerate some amount of clipping
> without trashing the sound.

Can you send me the line you use to create your plugin routing?


> - I would also like to add an equalizer just in front of it for every
> sound, but haven't gotten around to it. It serves two purposes: first,
> balancing the sound during programming; second, on-the-fly sound
> manipulation without having to think much.
> - Arbitrary control information: it is very easy to add a 'fake'
> LADSPA plugin that reads the CC status of a given controller on a
> given channel and puts it on its control output. Add gain, offset,
> lower limit, and upper limit parameters, and 95% of all reasonable
> real-time effect modulations are implemented!
> - The current implementation allows as much freedom as possible (any
> routing that doesn't have feedback loops). Why should we restrict
> ourselves to a concept that is limited right from the start? Take, for
> example, Propellerhead's 'Reason' (I know the demo): arbitrary routing
> capability.
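
Two of the items above are simple enough to sketch: the tanh soft
limiter and the CC-to-control-output mapping with gain, offset, and
limits. This is illustrative Python, not the actual LADSPA C plugin
code; all function names and defaults are made up for the example.

```python
import math

def tanh_limit(sample, drive=1.0):
    """Soft limiter: saturates smoothly instead of hard-clipping,
    so moderate overdrive doesn't trash the sound."""
    return math.tanh(drive * sample)

def cc_to_control(cc_value, gain=1.0, offset=0.0, lo=0.0, hi=1.0):
    """Map a MIDI CC value (0..127) to a plugin control value,
    like the 'fake' control-source plugin described above."""
    x = offset + gain * (cc_value / 127.0)
    return max(lo, min(hi, x))       # clamp to [lo, hi]
```
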

I never worked with Reason, but I've seen it in action. I see
something like Reason for Linux being realized at the level of
JACK + {ALSA sequencer | MidiShare} + some not-yet-developed interface.
And I think the Linux audio community will get there sooner rather
than later. However, I don't see it at the level of iiwusynth.

Have you ever considered implementing your LADSPA plugin routing
as a JACK client? That would make it useful for other applications
as well.



>>> My own RME Multiface has 10 analog output channels, so I
>>> can test the multichannel synthesizer.

>> Hey! You've got an RME Multiface! Happy with it? No problems on
>> Linux?


> I haven't invested the amount of time that would be appropriate, maybe
> one working day, but the only applications that work are iiwusynth and
> aplay. I guess that the driver is OK, but the applications are not.
> JACK doesn't work either, but it doesn't work with any other interface
> on my machine either, so that's not RME's fault.

I saw your email on the JACK mailing list. JACK has been running on my
machine all weekend, with iiwusynth connected using 10 stereo output
channels. No xruns, even when sync'ing the disks. I do use a
low-latency kernel and did a 'hdparm -c1 -d1 /dev/hda'.


[cut]

Cheers,
Peter


> Cheers
>
> Markus



_______________________________________________
iiwusynth-devel mailing list
address@hidden
http://mail.nongnu.org/mailman/listinfo/iiwusynth-devel







