[wdmaudiodev] Re: My first WASAPI implementation.

  • From: Matt Gonzalez <matt@xxxxxxxxxxxxx>
  • To: wdmaudiodev@xxxxxxxxxxxxx
  • Date: Wed, 22 Mar 2006 13:12:38 -0800

We do it in order to support legacy apps that don't support multichannel hardware.

Plus, it lets you address different sets of inputs and outputs with multiple applications simultaneously, which is useful.

Matt

wdmaudiodev-bounce@xxxxxxxxxxxxx wrote:
I never really did understand why vendors would publish a bunch of
stereo pairs rather than one device that had many channels. Was this to
support legacy OSes that didn't have WAVEFORMATEXTENSIBLE?

If I have one 8-channel stream I want to output, the last thing I want
to do is open 4 virtual stereo devices and issue the play command to
each one.

Chris Perry
Bose Corporation

-----Original Message-----
From: wdmaudiodev-bounce@xxxxxxxxxxxxx
[mailto:wdmaudiodev-bounce@xxxxxxxxxxxxx] On Behalf Of Frank Yerrace
Sent: Wednesday, March 22, 2006 3:36 PM
To: wdmaudiodev@xxxxxxxxxxxxx
Subject: [wdmaudiodev] Re: My first WASAPI implementation.


my biggest gripe currently is not knowing if it's possible for a device
manufacturer to define a huge 16-output (or more) endpoint in one
IAudioClient without having to fake it with WAVEFORMATEXTENSIBLE's
nChannels set to 16 and dwChannelMask set to 0x00, or to have that
manufacturer define a bunch of separate stereo pairs to do the job.

Can you clarify what you would *like* to see here? At this time, we feel
that if the hardware is intended to be used as a 16-channel output, then
it should be presented as a single 16-channel output accessed through a
single instance of IAudioRenderClient using a 16-channel
WAVEFORMATEXTENSIBLE.
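A rough sketch of what that single 16-channel open might look like, purely for
illustration - the 48 kHz 32-bit float format, the zero channel mask, the
exclusive share mode, and the 10 ms buffer are assumptions on my part, not
anything specified in this thread:

#include <windows.h>
#include <audioclient.h>
#include <mmreg.h>
#include <ks.h>
#include <ksmedia.h>

// Sketch only: describe a 16-channel stream with one WAVEFORMATEXTENSIBLE,
// initialize one IAudioClient with it, and fetch the matching
// IAudioRenderClient. Error handling is kept minimal on purpose.
HRESULT OpenSixteenChannelRender(IAudioClient *client,
                                 IAudioRenderClient **render)
{
    WAVEFORMATEXTENSIBLE wfx = {};
    wfx.Format.wFormatTag           = WAVE_FORMAT_EXTENSIBLE;
    wfx.Format.nChannels            = 16;
    wfx.Format.nSamplesPerSec       = 48000;   // assumed rate
    wfx.Format.wBitsPerSample       = 32;      // assumed 32-bit float
    wfx.Format.nBlockAlign          = (WORD)(wfx.Format.nChannels *
                                             wfx.Format.wBitsPerSample / 8);
    wfx.Format.nAvgBytesPerSec      = wfx.Format.nSamplesPerSec *
                                      wfx.Format.nBlockAlign;
    wfx.Format.cbSize               = sizeof(WAVEFORMATEXTENSIBLE) -
                                      sizeof(WAVEFORMATEX);
    wfx.Samples.wValidBitsPerSample = 32;
    wfx.dwChannelMask               = 0;       // no speaker positions claimed
    wfx.SubFormat                   = KSDATAFORMAT_SUBTYPE_IEEE_FLOAT;

    const REFERENCE_TIME tenMs = 10 * 10000;   // 10 ms in 100-ns units
    HRESULT hr = client->Initialize(AUDCLNT_SHAREMODE_EXCLUSIVE,
                                    0,         // no stream flags
                                    tenMs,     // buffer duration
                                    tenMs,     // device period (exclusive mode)
                                    &wfx.Format,
                                    NULL);     // default session
    if (FAILED(hr))
        return hr;

    return client->GetService(__uuidof(IAudioRenderClient), (void **)render);
}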

Frank Yerrace
Microsoft Corporation

This posting is provided "AS IS" with no warranties, and confers no rights.


-----Original Message-----
From: wdmaudiodev-bounce@xxxxxxxxxxxxx
[mailto:wdmaudiodev-bounce@xxxxxxxxxxxxx] On Behalf Of David Viens
Sent: Wednesday, March 22, 2006 12:17 PM
To: wdmaudiodev@xxxxxxxxxxxxx
Subject: [wdmaudiodev] Re: My first WASAPI implementation.

Ron Kuper wrote:
David, were the "suprises" pleasant or unpleasant?
Ha, of course I meant "surprisingly good" results, with little to no
clicking (though clicks do sometimes occur even at moderate DSP CPU
loads).

I ran the code with my onboard NVIDIA nForce4 AC'97 (AMD platform), a
Roland UA-1A running as a "USB composite device", and my Terratec
EWS88MT's extra output (fixed at 48 kHz) - not the full 12/10 combo,
which doesn't appear. I can't try my various SB cards for lack of
working drivers.

I'm trying to get latency down, but I got tired of shooting in the dark
trying things; that's why I'm awaiting answers before continuing.
Knowing that Vista is now scheduled to release in 2007, it's not that
urgent, it would seem.
The way it currently works is a bit odd, with the 100-nanosecond
"period" settings, which can't seem to land on whole sample frames in
some circumstances, or... sigh... allow for power-of-two-sized buffers.
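Since those 100-ns units come up a lot, here is a small sketch of the
conversion in question, assuming the period comes from
IAudioClient::GetDevicePeriod and is simply rounded to the nearest frame (the
rounding policy is my assumption, not something from this thread):

#include <windows.h>
#include <audioclient.h>

// Sketch: convert the device period (REFERENCE_TIME, i.e. 100-ns units)
// into sample frames at a given rate. The result is generally not a
// power of two, which is the awkwardness described above.
UINT32 DefaultPeriodInFrames(IAudioClient *client, UINT32 sampleRate)
{
    REFERENCE_TIME defaultPeriod = 0, minimumPeriod = 0;
    if (FAILED(client->GetDevicePeriod(&defaultPeriod, &minimumPeriod)))
        return 0;

    // 1 REFERENCE_TIME unit = 100 ns, so seconds = period / 1e7 and
    // frames = seconds * rate, rounded to the nearest whole frame.
    double seconds = (double)defaultPeriod / 10000000.0;
    return (UINT32)(seconds * (double)sampleRate + 0.5);
}

For example, a 10 ms period at 44.1 kHz works out to exactly 441 frames - a
whole number, but nowhere near a power of two.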

Also, since no pro audio card has Vista drivers yet, my biggest gripe
currently is not knowing if it's possible for a device manufacturer to
define a huge 16-output (or more) endpoint in one IAudioClient without
having to fake it with WAVEFORMATEXTENSIBLE's nChannels set to 16 and
dwChannelMask set to 0x00, or to have that manufacturer define a bunch
of separate stereo pairs to do the job.

(I never implemented WDM-KS's pin-querying stuff, mind you, so it might
be a similar thing.)
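One way that question could at least be probed at run time, if I'm not
mistaken, is IAudioClient::IsFormatSupported. A sketch, assuming the
16-channel WAVEFORMATEXTENSIBLE has already been filled in (for example as in
the earlier sketch):

#include <windows.h>
#include <audioclient.h>

// Sketch: ask the endpoint whether it accepts the unmasked 16-channel
// format before calling Initialize.
HRESULT ProbeUnmaskedFormat(IAudioClient *client,
                            const WAVEFORMATEXTENSIBLE *wfx)
{
    WAVEFORMATEX *closest = NULL;

    // Exclusive mode answers yes/no; shared mode may instead return a
    // closest-match format through the third parameter.
    HRESULT hr = client->IsFormatSupported(AUDCLNT_SHAREMODE_EXCLUSIVE,
                                           &wfx->Format, &closest);
    if (closest)
        CoTaskMemFree(closest);   // only ever non-NULL in shared mode

    return hr;   // S_OK means the format is accepted as-is
}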

I've got another bag of questions...


****************** WDMAUDIODEV addresses:
Post message: mailto:wdmaudiodev@xxxxxxxxxxxxx
Subscribe: mailto:wdmaudiodev-request@xxxxxxxxxxxxx?subject=subscribe
Unsubscribe: mailto:wdmaudiodev-request@xxxxxxxxxxxxx?subject=unsubscribe
Moderator: mailto:wdmaudiodev-moderators@xxxxxxxxxxxxx
URL to WDMAUDIODEV page: http://www.wdmaudiodev.com/
