Re: [linux-audio-dev] Re: AudioServer Standard?
- From: David Olofson <audiality swipnet se>
- To: Benno Senoner <sbenno gardena net>, Stefan Westerfeld <stefan space twc de>, Havoc Pennington <hp redhat com>
- Cc: gnome-kde-list gnome org, linux-audio-dev ginette musique umontreal ca
- Subject: Re: [linux-audio-dev] Re: AudioServer Standard?
- Date: Thu, 23 Sep 1999 23:20:36 +0200
On Thu, 23 Sep 1999, Benno Senoner wrote:
[...]
> Later, when Audiality is fully functional, we could provide a compatibility
> layer for esd or aRts-enabled sound apps.
> Right, David ?
The basic "compatibility layer" would be the /dev/dsp emulation, that Audiality
will have as well. Perhaps aRts could run as a client to Audiality as a
quick'n'dirty solution to get it all integrated? That way, applications
expecting aRts would still work, and the whole system could still be integrated
without having to port everything to one of the engines.
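For context, /dev/dsp emulation of this kind is usually done with an LD_PRELOAD shim that intercepts libc's open() and hands OSS device opens to the sound server instead (esddsp works this way). A minimal sketch of the idea, where server_open_stream() is a made-up stand-in for whatever client library the server would provide:

```c
/* Sketch of a /dev/dsp LD_PRELOAD compatibility shim. All names
 * except the libc ones are hypothetical; a real shim would also
 * intercept ioctl(), write(), close(), etc. */
#define _GNU_SOURCE
#include <dlfcn.h>
#include <fcntl.h>
#include <stdarg.h>
#include <string.h>
#include <sys/types.h>

/* Return nonzero if this path is an OSS device we want to emulate. */
static int is_emulated_oss_device(const char *path)
{
    return strcmp(path, "/dev/dsp") == 0 ||
           strcmp(path, "/dev/audio") == 0;
}

/* Stand-in: a real shim would connect to the sound daemon here and
 * return the connection's file descriptor. */
static int server_open_stream(void)
{
    return -1;
}

/* Our open() shadows libc's; other paths fall through to the real one. */
int open(const char *path, int flags, ...)
{
    if (is_emulated_oss_device(path))
        return server_open_stream();

    int (*real_open)(const char *, int, ...) =
        (int (*)(const char *, int, ...))dlsym(RTLD_NEXT, "open");
    if (flags & O_CREAT) {
        va_list ap;
        va_start(ap, flags);
        mode_t mode = va_arg(ap, mode_t);
        va_end(ap);
        return real_open(path, flags, mode);
    }
    return real_open(path, flags);
}
```

Compiled as a shared object and injected with LD_PRELOAD, this lets unmodified OSS applications talk to the server without recompilation.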
[...]
> It shares many concepts with audiality, but I don't know anything about the
> design.
Indeed, and I'll look more closely at the aRts code. Perhaps some time and
work could be saved by reusing parts of aRts in the new engine. (Which might be
Audiality, or something else - my efforts so far have been in other areas than
hacking actual engine code.) It depends on the coding style, and how well aRts
fits with the new plug-in API currently in the design stage.
[...]
> > + network transparency
> >
> > Since aRts uses CORBA for almost everything, it is network transparent.
> > On the other hand, some audio server functionality that has been
> > implemented in aRts use TCP to transfer the signal for instance from your
> > external (non-arts-module-like) mp3 player to aRts. This TCP stuff is
> > also network transparent.
>
> network transparency is ok, but we need to separate the things,
> that means if source and destination are on the same machine,
> use IPC/shmem to exchange data,
> if not then use sockets or so.
> But Corba seems a bit slow to me for a high performance event system.
I wouldn't worry much about the actual implementation. However, what do the
APIs look like, WRT dependencies on other APIs and standards? I'd rather stick
with very low level stuff, especially in the plug-in API. Partly for
performance reasons, partly because it makes it easier to port to exotic
environments like DSP farms, clusters and real time kernels, like RTLinux.
Also, I'm starting to prefer C to C++, at least for this kind of stuff...
Perhaps I've read too much Linux kernel code? ;-) (And, I was an asm die-hard
for a few years, when I hacked on the Amiga...)
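A "very low level" C plug-in API of the kind described here can be as small as a struct of function pointers operating on raw sample buffers, with no C++ and no framework dependencies. This is only a sketch of the general idea under those constraints; none of the names are from Audiality or aRts:

```c
/* Minimal sketch of a C plug-in ABI: a descriptor struct with plain
 * function pointers and in-place float-buffer processing, so it could
 * in principle be compiled for a DSP or an RT kernel too. All names
 * are made up for illustration. */
#include <stdlib.h>

typedef struct plugin plugin_t;

struct plugin {
    void *state;                                     /* per-instance data */
    void (*process)(plugin_t *p, float *buf, int n); /* in-place DSP      */
    void (*destroy)(plugin_t *p);
};

/* Example plug-in: a fixed-gain stage. */
static void gain_process(plugin_t *p, float *buf, int n)
{
    float g = *(float *)p->state;
    for (int i = 0; i < n; i++)
        buf[i] *= g;
}

static void gain_destroy(plugin_t *p)
{
    free(p->state);
    free(p);
}

/* Factory; error checking omitted for brevity. */
plugin_t *gain_create(float gain)
{
    plugin_t *p = malloc(sizeof *p);
    float *g = malloc(sizeof *g);
    *g = gain;
    p->state = g;
    p->process = gain_process;
    p->destroy = gain_destroy;
    return p;
}
```

An engine holding only plugin_t pointers never needs to know what each plug-in does internally, which is exactly what keeps the host portable.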
> > + synthesis implemented GUI independant
> >
> > All the synthesis modules aRts uses (such as Synth_STD_EQUALIZER) don't
> > know anything about the GUI at all. They use no Qt and no whatever. They
> > are only connected to the GUI through the same signal flow mechanisms
> > that are used for everything else
>
> Of course the DSP stuff must be separate from the GUI stuff, and the engine and
> the GUI should communicate through a fast event system.
Indeed. Very important for the exotic environments mentioned above, and for any
truly useful form of network transparency.
Regards,
//David
·A·U·D·I·A·L·I·T·Y·   P r o f e s s i o n a l   L i n u x   A u d i o
- - ------------------------------------------------------------- - -
·Rock Solid      David Olofson:
·Low Latency     www.angelfire.com/or/audiality      ·Audio Hacker
·Plug-Ins        audiality@swipnet.se                ·Linux Advocate
·Open Source                                         ·Singer/Composer