Advanced Linux Sound Architecture (ALSA) is a framework which defines an API for sound device drivers. The NVIDIA ALSA System-on-Chip (ASoC) driver enables ALSA to work seamlessly with different NVIDIA SoCs. Platform-independent and generic components are maintained by the upstream Linux community.
The ASoC driver may not directly support all audio hardware interfaces or functionality present on Jetson products. Review the following sections to determine whether the driver supports the audio features you need.
For the Jetson Xavier NX Developer Kit and Jetson TX2 Developer Kit, audio playback is supported by the carrier board’s HDMI™ and DP interfaces. Hardware for audio capture is not included in these developer kits; however, audio capture can be supported using interfaces such as I2S or DMIC.
For NVIDIA® Jetson Nano™ Developer Kit, audio playback is supported by the carrier board’s HDMI™ interface. Hardware for audio capture is not included in the developer kit; however, audio capture can be supported using interfaces such as USB, I2S, or DMIC.
The carrier board included with NVIDIA® Jetson AGX Xavier™ Developer Kit supports audio playback over the front panel audio header via an on-board Realtek ALC5658 codec, the HDMI™ interface, and the USB‑C interfaces (via a USB‑C to Display Port Adapter/USB). Audio capture is supported by the on-board Realtek ALC5658 codec. It is also possible to capture audio via other interfaces such as USB (using a USB microphone), I2S, and DMIC.
ALSA
The ALSA framework is a part of the Linux kernel that is supported and maintained by the larger Linux community. This makes it feasible to adapt the framework to the Jetson platform by designing a driver that leverages NVIDIA audio routing support. ALSA includes a collection of sound card drivers, including actual codec drivers, and can support adding new codec drivers.
ALSA includes libraries and utilities that enable more refined audio control in Linux user space. These libraries control audio applications without interacting with kernel space drivers directly. The ALSA utilities include:
• amixer
• aplay
• arecord
The following diagram illustrates the ALSA software hierarchy.
User space ALSA applications interact with ALSA core (kernel space) through APIs provided by user-space libraries that initialize the actual hardware codecs at the backend of the audio pipeline.
More information about the ALSA framework is available at:
ALSA is designed to support various functionalities including, but not limited to, dynamic audio routing to the available PCM devices. The component of the ALSA core that provides this support is called Dynamic Audio Power Management (DAPM). DAPM controls the power flow into and out of various codec blocks in the audio subsystem, minimizing power consumption. DAPM introduces switches, or kernel controls, in the form of widgets to turn a module’s power on and off and to manipulate specific register bits dynamically from user space applications such as aplay, arecord, and amixer. The widgets are classified into various groups.
In terms of software hierarchy, DAPM is part of the ALSA core, which manages the codec module’s power efficiency. See the ALSA software hierarchy diagram under ALSA for details.
Device Tree
The device tree is a data structure that describes devices on the platform. It is passed to the operating system at boot time to avoid hard coding component details in the operating system. This makes it easier to change hardware configurations without rebuilding the kernel.
The device tree is composed of nodes and properties. Each node can have properties or child nodes. Each property consists of a name and one or more values. Device tree structures must be written in the correct format so that the data structure can be parsed by the operating system.
ASoC Driver
The ASoC driver provides better ALSA support for embedded system-on-chip processors (e.g., DSP, AHUB) and portable audio codecs. It comprises the following components:
• Platform driver: Responsible for PCM registration and interfacing with the PCM driver. ADMAIF is the platform driver.
• Codec drivers: Any driver that registers the snd_soc_codec_driver structure with the ASoC core can be viewed as a codec driver.
• A codec driver must have at least one input or one output.
• The structure provides a way to define your own DAPM widgets for power management and kcontrols for register setting from user space.
• Machine driver: Connects one or more codec drivers and a PCM driver together for a given platform.
For details on writing a machine driver and identifying a sound card, consult ASoC Machine Driver.
Audio Hub Hardware Architecture
Audio Hub is a hardware module which is intended for audio acceleration. This topic provides an overview of:
• The audio hub hardware architecture inside the SoC.
• The software architecture of the ASoC driver.
This diagram summarizes the hardware architecture of the audio hub.
The audio hub contains the modules I2S, DSPK, and the Digital MIC Controller (DMIC) that interface with the external world. The audio hub also contains:
• Mixer
• Sampling Frequency Converter (SFC)
• Master Volume Control (MVC)
• Audio Multiplexers (AMX)
• Audio Demultiplexers (ADX)
• An Audio Direct Memory Access (ADMA) component (ADMAIF-DMA) that transfers audio data between the audio hub and memory
The Crossbar (XBAR) facilitates the routing of audio samples through these modules using a proprietary protocol called Audio Client Interface (ACIF).
Module    Jetson Xavier NX and AGX Xavier Instances    Jetson TX2 Instances     Jetson Nano & TX1 Instances
I2S       4x                                           6x                       2x
DMIC      2x                                           3x                       2x
Mixer     10 inputs, 5 outputs                         10 inputs, 5 outputs     10 inputs, 5 outputs
AMX       4x                                           4x                       2x
ADX       4x                                           4x                       2x
SFC       4x                                           4x                       4x
MVC       2x                                           2x                       2x
ADMA      32 channels                                  32 channels              22 channels
ADMAIF    20 TX and RX channels                        20 TX and RX channels    10 TX and RX channels
DSPK      1x                                           1x                       None
Note:
Not all instances of an I/O module (I2S, DMIC or DSPK) are exposed on each platform. See Board Interfaces for more information about supported I/O instances.
The modules in the audio hub support various kinds of audio devices that are expected to interface with the application processor, such as:
• Cellular baseband devices
• Different types of audio CODECs
• Bluetooth modules
• Digital microphones
• Digital speakers
The audio hub supports the different interfaces and signal quality requirements of these devices.
• Each of the AHUB modules has at least one RX port, at least one TX port, or both; some AHUB modules have more than one RX or TX port.
• The RX ports receive data from XBAR, and the TX ports send data to XBAR. Hence, XBAR is a switch where an audio input can be fed to multiple outputs depending on the use case.
• Each ADMAIF has TX and RX FIFOs that support simultaneous recording and playback. ADMA transfers the data to the ADMAIF FIFO for all audio routing scenarios.
The software architecture of the ASoC driver for Jetson leverages the features supported by the hardware and conforms to the ALSA framework.
As mentioned earlier, the ASoC driver comprises the platform, codec and machine drivers. The roles of these drivers are described briefly below, and in more detail in subsequent sections.
The ASoC driver provides NVIDIA Audio Hub (AHUB) hardware acceleration to the platform and codec drivers. AHUB Direct Memory Access Interface (ADMAIF) is implemented as a platform driver with PCM interfaces for playback/record. The rest of the AHUB modules, such as the Audio Cross Bar (XBAR), Audio Multiplexer (AMX), Audio Demultiplexer (ADX), and Inter-IC sound (I2S), are implemented as codec drivers. Each of the drivers is connected to XBAR through a Digital Audio Interface (a DAI), inside a machine driver, forming an audio hub.
The machine driver probe instantiates the sound card device and registers all of the PCM interfaces as exposed by ADMAIF. After booting, but before using these interfaces to play back or record audio, you must set up the audio paths inside XBAR. By default, XBAR has no routing connections at boot, and no complete DAPM paths to power on the corresponding widgets. The XBAR driver introduces MUX widgets for all of the audio components and enables custom routing through kcontrols from user space using the ALSA amixer utility. If the audio path is not complete, the DAPM path is not closed, the hardware settings are not applied, and audio output cannot be heard.
For more details on how to set up the route and how to play or record on the PCM interfaces, see Usage and Examples.
Platform Driver
The platform driver initializes and instantiates the ports for playback and capture inside the AHUB.
Users must connect some or all of these ports to form a full audio routing path. For examples of full audio paths, see the examples in Usage and Examples. Note that there are other elements in a full audio path setup that are discussed in subsequent sections and that the playback/capture ports set up by platform driver are only a subset.
ADMAIF
ADMAIF is the platform driver in the ASoC driver. It interfaces with the PCM driver. The PCM driver helps perform DMA operations by overriding the function pointers exposed by the snd_pcm_ops structure. The PCM driver is platform-agnostic, and interacts with the SoC DMA engine’s upstream APIs. The DMA engine then interacts with the platform-specific DMA driver to get the correct DMA settings. The ADMAIF platform driver defines DAIs and registers them with ASoC core.
A Jetson platform has either 10 or 20 ADMAIFs depending on its generation, facilitating either 10 or 20 streams of playback and capture.
Hardware Devices in the ASoC Driver Registered under Primary Audio Codec Board
The ADMAIF channels are mapped to:
• /dev/snd/pcmC1Dnp for playback
• /dev/snd/pcmC1Dnc for capture
Where n is one less than the ADMAIF instance number. For example:
• ADMAIF1 is mapped to pcmC1D0p for playback, and pcmC1D0c for capture
• ADMAIF2 is mapped to pcmC1D1p for playback, and pcmC1D1c for capture
• …and so on.
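This mapping can be sketched in shell (assuming the AHUB sound card is registered as card 1, as in the device names above; the variable names are illustrative):

```shell
# Derive the ALSA PCM device names for a given ADMAIF instance.
# PCM device numbers start at 0, so device = instance - 1.
admaif=2                      # ADMAIF2
dev=$((admaif - 1))
playback="pcmC1D${dev}p"      # playback node under /dev/snd
capture="pcmC1D${dev}c"       # capture node under /dev/snd
echo "$playback $capture"
```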
Codec Driver
An overview of codec drivers is presented in the section ASoC Driver. In the ASoC driver implementation, the rest of the AHUB modules, except for ADMAIF, are implemented as codec drivers. Their responsibilities include:
• Interfacing to other modules by defining DAIs
• Defining DAPM widgets and establishing DAPM routes for dynamic power switching
• Exposing additional kcontrols as needed for user space utilities to dynamically control module behavior
Codec Driver Instantiation Using Device Tree
Based on architecture, the Makefile in the following directory conditionally compiles the required device tree structure files into DTB files:
$KERNEL_TOP/arch/arm64/boot/dts/
When the kernel is flashed, the flash script chooses the appropriate board-specific DTB file for parsing during boot, and the ASoC codecs listed in device tree are instantiated. To add new devices to the device tree, edit the DTS file identified in the dmesg log (as in the following example) and reflash the target.
To add a new device, add the device name with the base address and status as "okay":
ahub {
    status = "okay";

    i2s@2901000 {
        status = "okay";
    };
};
XBAR
The XBAR codec driver defines RX, TX and MUX widgets for all of the interfacing modules: ADMAIF, AMX, ADX, I2S, DMIC, Mixer, SFC and MVC. MUX widgets are permanently routed to the corresponding TX widgets inside the snd_soc_dapm_route structure.
XBAR interconnections are made by connecting any RX widget block to any MUX widget block as needed using the ALSA amixer utility. The get/put handlers for these widgets are implemented so that audio connections are stored by setting the appropriate bit in the hardware MUX register.
Mixer Controls
Sound card instantiation after boot indicates that the machine driver interconnects all codec drivers and the platform driver. The remaining step before obtaining the audio output on the physical codecs involves the use of MUX widgets to establish the DAPM path in order to route data from a specific input module to a specific output module. Input and output modules are dependent on the applicable use case. This provides flexibility for complex use cases.
For example, on Jetson AGX Xavier, the command amixer -c tegrasndt19xmob cset name='I2S1 Mux' 'ADMAIF1' realizes the internal AHUB path ADMAIF1 RX → XBAR → I2S1 TX. Usage and examples of the various AHUB modules can be found in Usage and Examples.
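The MUX control naming pattern can be sketched as follows; the command is only constructed and printed here, since actually running it requires a Jetson target with the tegrasndt19xmob sound card:

```shell
# Build the amixer command that routes ADMAIF1 to I2S1 through XBAR.
# The MUX control is named "<destination> Mux" and its value is the
# TX port name of the source AHUB client.
card=tegrasndt19xmob
route_cmd="amixer -c $card cset name='I2S1 Mux' 'ADMAIF1'"
echo "$route_cmd"
```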
AMX
The Audio Multiplexer (AMX) module can multiplex up to four streams of up to 16 channels, with a maximum of 32 bits per channel, into a time division multiplexed (TDM) stream of up to 16 channels with a maximum of 32 bits per channel. The AMX has four RX ports for receiving data from XBAR and one TX port for transmitting the multiplexed output to XBAR. Each port is exposed as a DAI, indicated by a solid line in the diagram; routes are established using DAPM widgets, indicated by the dotted lines.
The AMX codec driver supports these features:
• Capable of multiplexing up to four input streams of up to 16 channels each and generating one output stream of up to 16 channels
• A “byte ram” which can assemble an output frame from any combination of bytes from the four input frames
• Two modes for data synchronization for the first output frame:
• Wait for All mode: Wait for all enabled input streams to have data before forming the first output frame.
• Wait for Any mode: Start forming the first output frame as soon as data is available in any enabled input stream.
Note:
AMX handles data synchronization for subsequent output frames differently on different Jetson platforms. On Jetson Nano, TX2, and TX1, it does not generate subsequent output frames until all input frames have been received. On Jetson AGX Xavier, it generates them as soon as one or more input frames have been received.
Byte Map Configuration
Each byte in the output stream is uniquely mapped from a byte in one of the four input streams. The mapping of bytes from input streams to output stream is software configurable via a byte map in the AMX module.
Each byte in the byte map is encoded with these fields:

Field                   Bits    Description
Input stream            7:6     Identifies which of the four input streams the byte is mapped from, where 0 is RxCIF0, etc.
Input stream channel    5:2     Identifies which of the 16 possible channels in the input stream the byte is mapped from, where 0 is channel 0, etc.
Input stream byte       1:0     Identifies which byte in the input stream channel the byte is mapped from, where 0 is byte 0, etc.
Given that the maximum output frame size supported is 16 samples (from 16 channels) with 32 bits per sample, the byte map is organized as 16 words of 32 bits each: 64 bytes in total. Each 32-bit word in the byte map corresponds to one input channel. Therefore, if the output frame has samples from only two channels, then only the bytes in word 0 and word 1 need be programmed. If the output frame has samples from all 16 channels, then bytes in all 16 words must be programmed. Which bytes must be programmed in each word depends on the output frame sample size. If the sample size of each channel in the output frame is 16 bits, then it is only necessary to program byte 0 and byte 1 of each word in the byte map. If the sample size of each channel in the output frame is 32 bits, then it is necessary to program all four bytes of each word in the byte map.
Bear these points in mind:
• Input bytes must be mapped to output bytes in order. For example, if input frame bytes 0 and 1 are both mapped to the output frame, byte 1 must be mapped to a position in the output frame after byte 0.
• Not all bytes from an input frame need be mapped to the output frame.
• Each byte in the output frame has a software-configurable enable flag. If a particular byte’s enable flag is cleared, the corresponding mapping in the byte map is ignored, and that byte is populated with zeros.
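The byte-map field encoding described above can be sketched in shell; the stream, channel, and byte values are arbitrary examples:

```shell
# Encode one AMX byte-map entry: bits 7:6 select the input stream,
# bits 5:2 the input channel, and bits 1:0 the byte within that
# channel's sample.
stream=1; channel=3; byte=2
entry=$(( (stream << 6) | (channel << 2) | byte ))
printf '0x%02x\n' "$entry"   # 0x4e
```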
Mixer Controls
Mixer controls are registered for each instance of AMX by the respective codec driver and are used to configure the path, characteristics, and processing method of audio data. The table below lists instance-specific mixer controls.

Mixer Control *    Description                                                              Possible Values
AMX<i>-<j> Mux     Selects the AHUB client device from which the AMX input receives data.   Any AHUB client TX port name

* <i> refers to the instance ID of the AMX client, and <j> refers to the input port ID.

Usage and examples of the AMX module can be found in Usage and Examples.
ADX
The Audio Demultiplexer (ADX) module can demultiplex a single TDM stream of up to 16 channels and a maximum of 32 bits per channel into four streams of up to 16 channels and 32 bits per channel. The RX port of ADX receives input data from XBAR, and four TX ports transmit demultiplexed output to XBAR. Each port is exposed as a DAI, indicated by a solid line and routes are established using DAPM widgets as indicated by the dotted lines in the following diagram.
ADX has one input RxCIF, which supplies the input stream. The core logic selects bytes from this input stream based on a byte RAM map and forms output streams which are directed to a TxCIF FIFO to be transmitted to a downstream module in AHUB.
The ADX demultiplexer supports these features:
• Demultiplexing one input stream of up to 16 channels to four output streams of up to 16 channels each
• A “byte RAM” which can assemble output frames that contain any combination of bytes from the input frame. The byte RAM design is exactly the same as the byte RAM in AMX, except that the direction of data flow is reversed.
Byte Map Configuration
Each byte in each output stream is mapped from a byte in the input stream. The mapping of the bytes from input stream to output streams is software configurable via a Byte Map in the ADX module through this mixer control.
Field                    Bits    Description
Output stream            7:6     Identifies the output stream that the byte is mapped to, where 0 = TxCIF0, etc.
Output stream channel    5:2     Identifies the output stream channel that the byte is mapped to, where 0 = channel 0, etc.
Output stream byte       1:0     Identifies the byte in the output stream channel that the byte is mapped to, where 0 = byte 0, etc.
Given that the maximum output frame size supported per stream is 16 channels with 32-bits per sample, the Byte Map is organized as 16 32-bit words (64 bytes in total). Each 32-bit word in the Byte Map corresponds to one channel in the input frame. Therefore, if the input frame only has two channels then only the bytes in word 0 and word 1 need to be programmed, while if the input frame has 16 channels then bytes in all 16 words need to be programmed. The bytes that need to be programmed in each word are dependent on the input frame sample size. If the sample size of each channel in the input frame is 16 bits, then it is only necessary to program byte 0 and byte 1 for each word in Byte Map. If the sample size of each channel in the input frame is 32 bits, then it is necessary to program all four bytes for each word in the Byte Map.
Bear these points in mind:
• Input bytes must be mapped to output bytes in order. For example, if input frame bytes 0 and 1 are both mapped to the output frame, byte 1 must be mapped to a position in the output frame after byte 0.
• Not all bytes in an input frame need be mapped to the output frame.
• Each byte in the output frame has a software-configurable enable flag. If a particular byte’s enable flag is cleared, the corresponding mapping in the byte map is ignored, and that byte is populated with zeros.
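Since the ADX byte map mirrors the AMX encoding with the data direction reversed, decoding an entry can be sketched as:

```shell
# Decode one ADX byte-map entry: bits 7:6 give the output stream,
# bits 5:2 the output channel, and bits 1:0 the byte position.
entry=$(( 0x4e ))
stream=$(( (entry >> 6) & 0x3 ))
channel=$(( (entry >> 2) & 0xf ))
byte=$(( entry & 0x3 ))
echo "stream=$stream channel=$channel byte=$byte"
```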
Mixer Controls
Mixer controls are registered for each instance of ADX by the respective codec driver, and are used to configure the path, characteristics, and processing method of audio data. The table below lists the instance-specific mixer controls for each instance of the ADX module.

Mixer Control *    Description                                                              Possible Values
ADX<i> Mux         Selects the AHUB client device from which the ADX input receives data.   Any AHUB client TX port name

* <i> refers to the instance ID of the ADX client.
I2S
The I2S codec driver supports bidirectional data flow and thereby defines CIF and DAP RX/TX widgets as follows. The CIF side of I2S interfaces with XBAR, and the DAP side interfaces with the physical codec on the given platform.
The DAPM routes established using these widgets are shown in the diagram below as dotted lines. I2S modules also expose kernel control to enable internal I2S loopback.
The I2S controller implements full-duplex and half-duplex point-to-point serial interfaces. It can interface with I2S-compatible products, such as digital audio tape devices, digital sound processors, modems, Bluetooth chips, etc.
The I2S codec driver supports the following features:
• Can operate both as master and slave
• Supports the following modes of data transfer:
• LRCK modes: I2S mode, Left Justified Mode (LJM), or Right Justified Mode (RJM)
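A typical I2S device tree entry might look like the following sketch; the node label, register address, and clock names here are illustrative assumptions modeled on the Jetson AGX Xavier device tree, so consult your platform’s device tree files for the actual values:

```dts
tegra_i2s1: i2s@2901000 {
    compatible = "nvidia,tegra210-i2s";          /* illustrative */
    reg = <0x0 0x2901000 0x0 0x100>;             /* illustrative address */
    clocks = <&bpmp_clks TEGRA194_CLK_I2S1>,
             <&bpmp_clks TEGRA194_CLK_PLLA_OUT0>,
             <&bpmp_clks TEGRA194_CLK_SYNC_I2S1>;
    clock-names = "i2s", "pll_a_out0", "sync_i2s";
    fsync-width = <31>;                          /* illustrative value */
    status = "okay";
};
```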
In the case of I2S, the device tree entry specifies the names of the clocks needed by the device, the source of each clock, and the register base address and address range belonging to the device. Note that the register address and a few other properties are platform-specific, and may be referenced by platform-specific device tree files. Other properties such as fsync-width may be adjusted to fit the use case’s requirements.
Mixer Controls
Mixer controls are registered for each instance of I2S by the respective codec driver, and are used to configure the path, characteristics, and processing method of audio data. The table below lists instance-specific mixer controls.
Mixer Control              Description                                                Possible Values
I2S<i> Loopback            Enables internal I2S loopback.                             On or Off
I2S<i> Input Bit Format    Configures the length of input samples in bits.            16 or 32
I2S<i> Codec Bit Format    Configures the length of output samples in bits.           16 or 32
I2S<i> Fsync Width         Configures the frame sync signal’s width in bit clocks.
Mixer
The Mixer mixes audio streams from any of the 10 input ports that receive data from XBAR to any of the 5 output ports that transmit data to XBAR. The DAPM widgets and routes for the Mixer are shown in the figure below. The Mixer driver also exposes RX Gain and Mixer Enable as additional kcontrols, to set the volume of each input stream and to globally enable or disable the Mixer respectively.
Features Supported
• Supports mixing up to 10 input streams
• Supports five outputs, each of which can be a mix of any combination of 10 input streams
Mixer controls are registered for each instance of Mixer by the corresponding codec driver. They are used to configure the path, characteristics, and processing method of audio data. The table below lists instance-specific mixer controls.

Mixer Control *    Description                                                                Possible Values
Mixer1-<i> Mux     Selects the AHUB client device from which the Mixer input receives data.   Any AHUB client TX port name

* <i> refers to the input port ID of the Mixer.
SFC
The Sampling Frequency Converter (SFC) converts the input sampling frequency to the required sampling rate. The SFC has one input port and one output port, which are connected to XBAR.
Features Supported
• Sampling frequency conversion of streams of up to two channels (stereo)
• Very low latency (maximum latency less than 125 microseconds)
• Supports conversions between the following sampling rates (kHz): 8, 11.025, 16, 22.05, 24, 32, 44.1, 48, 88.2, 96, 176.4, and 192. Not every input/output rate combination is supported; when the input and output rates are equal, the stream bypasses frequency conversion.
Mixer Controls
Mixer controls are registered for each instance of SFC by the corresponding codec driver. They are used to configure the path, characteristics, and processing method of audio data. The table below lists instance-specific mixer controls.

Mixer Control *    Description                                                              Possible Values
SFC<i> Mux         Selects the AHUB client device from which the SFC input receives data.   Any AHUB client TX port name

* <i> refers to the instance ID of the SFC client.
DMIC
In the case of DMIC, a device entry specifies the instance ID of the DMIC through ahub-dmic-id. It also specifies the register base address and address range belonging to the device, apart from the clock names and their sources.
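A DMIC device entry of this form might look like the following sketch; the compatible string, register address, instance ID, and clock names are illustrative assumptions modeled on the DSPK entry shown later, so consult your platform’s device tree files for the actual values:

```dts
tegra_dmic1: dmic@2904000 {
    compatible = "nvidia,tegra210-dmic";         /* illustrative */
    reg = <0x0 0x2904000 0x0 0x100>;             /* illustrative address */
    nvidia,ahub-dmic-id = <0>;
    clocks = <&bpmp_clks TEGRA194_CLK_DMIC1>,
             <&bpmp_clks TEGRA194_CLK_PLLA_OUT0>;
    clock-names = "dmic", "pll_a_out0";
    status = "okay";
};
```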
Mixer Controls
Mixer controls are registered for each instance of DMIC by the corresponding codec driver. They are used to configure the path, characteristics, and processing method of audio data. The table below lists instance-specific mixer controls.

Mixer Control *                   Description                                                         Possible Values
DMIC<i> Boost Gain                Configures volume.                                                  0 to 25599, representing 0 to 256 in linear scale (with 100x factor)
DMIC<i> Mono Channel Select       Selects the channel for mono recording.                             Left or Right
DMIC<i> TX Mono to Stereo Conv    Configures the mono-to-stereo conversion method for DMIC output.    None, ZERO, or COPY
DMIC<i> Output Bit Format         Configures the output sample size in bits.                          16 or 32
DMIC<i> Sample Rate               Configures the sample rate of DMIC output.                          8000, 11025, 16000, 22050, 24000, 32000, 44100, or 48000 Hz
DMIC<i> OSR Value                 Configures the oversampling ratio (OSR); one output sample is selected from the several samples received on the input lines of the DMIC processing block.    OSR_64, OSR_128, or OSR_256

* <i> refers to the instance ID of the DMIC client.
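The Boost Gain scale above uses a 100x fixed-point factor; converting a desired linear gain to a control value can be sketched as (variable names are illustrative):

```shell
# Convert a linear boost gain to a DMIC<i> Boost Gain control value:
# the control range 0..25599 represents linear gain 0..256 with a
# 100x scale factor.
gain=4                        # 4x linear boost
ctl=$(( gain * 100 ))
echo "$ctl"
```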
MVC
Mixer controls are registered for each instance of MVC by the corresponding codec driver. They are used to configure the path, characteristics, and processing method of audio data. The table below lists instance-specific mixer controls.

Mixer Control *            Description                                                              Possible Values
MVC<i> Vol                 Configures volume.                                                       0 to 16000, representing -120 to +40 dB (with 100x scale factor)
MVC<i> Mute                Mutes or unmutes the input stream.                                       On or Off
MVC<i> Curve Type          Configures the volume ramp curve type.                                   Poly or Linear
MVC<i> Channels            Configures the number of channels of audio data passing through MVC.     0 to 8
MVC<i> Input Bit Format    Configures the sample size of input audio data through MVC.              16 or 32
MVC<i> Bits                Configures the sample size of output audio data through MVC.             16 or 32
MVC<i> Mux                 Selects the AHUB client device from which the MVC input receives data.   Any AHUB client TX port name

* <i> refers to the instance ID of the MVC client.
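The MVC<i> Vol scale above maps decibels to control values with a 100x fixed-point factor; the conversion can be sketched as (variable names are illustrative):

```shell
# Convert a gain in dB to an MVC<i> Vol control value: the range
# 0..16000 represents -120 dB..+40 dB with a 100x scale factor,
# so value = (dB + 120) * 100.
db=0                          # unity gain
vol=$(( (db + 120) * 100 ))
echo "$vol"
```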
DSPK
The Digital Speaker (DSPK) is a PDM transmit block that converts multibit PCM audio input to oversampled one-bit PDM output. The DSPK controller consists of an interpolator that oversamples the incoming PCM signal and a delta-sigma modulator that converts the PCM signal to PDM. The DSPK supports these features:
• Passband frequency response: ≤0.5 dB peak-to-peak over the 10 Hz to 20 kHz range
• Dynamic range: ≥105 dB
Device Tree Entry
This DSPK node entry enables a given DSPK instance on a given chip.
aconnect@2a41000 {
    compatible = "nvidia,tegra210-aconnect";
    status = "okay";
    ...
    tegra_axbar: ahub {
        compatible = "nvidia,tegra186-axbar";
        status = "okay";
        ...
        tegra_dspk1: dspk@2905000 {
            compatible = "nvidia,tegra186-dspk";
            reg = <0x0 0x2905000 0x0 0x100>;
            nvidia,ahub-dspk-id = <0>;
            clocks = <&bpmp_clks TEGRA194_CLK_DSPK1>,
                     <&bpmp_clks TEGRA194_CLK_PLLA_OUT0>,
                     <&bpmp_clks TEGRA194_CLK_SYNC_DSPK1>;
            clock-names = "dspk", "pll_a_out0", "sync_dspk";
            status = "okay";
        };
        ...
    };
};
This example is from the device tree structure file of Jetson AGX Xavier. In the case of DSPK, the device entry specifies the instance ID of the DSPK through ahub-dspk-id. It also specifies the register base address and address range belonging to the device, the clocks required, and their sources.
Mixer Controls
Mixer controls are registered for each instance of DSPK by the corresponding codec driver. They are used to configure the path, characteristics, and processing method of audio data. The table below lists instance-specific mixer controls.

Mixer Control *    Description                                                               Possible Values
DSPK<i> Mux        Selects the AHUB client device from which the DSPK input receives data.   Any AHUB client TX port name

* <i> refers to the instance ID of the DSPK client.
Below is a list of the names of AHUB clients’ TX ports.

Note:
The suffix <i> refers to the instance ID of a given AHUB client.

AHUB Client    TX Port Names
ADMAIF         ADMAIF<i>
I2S            I2S<i>
DMIC           DMIC<i>
DSPK           DSPK<i>
AMX            AMX<i>
ADX            ADX<i>-1, ADX<i>-2, ADX<i>-3, ADX<i>-4
ASoC Machine Driver
The ASoC machine driver connects the codec drivers to the PCM driver by linking the DAIs exposed by each module, and instantiates the sound card (a software component in the ASoC architecture). Each DAI link is described by a snd_soc_dai_link structure.
In brief, the machine driver’s functions are to:
• Populate the SoC DAI links (i.e. those between XBAR and interfaces such as I2S). These DAI links are internal to the SoC and connect the various hardware modules and interfaces to XBAR.
• Define DAPM widgets for the platform’s interfaces, such as headphone jacks, speakers, and microphones.
• Find platform-specific DAI links in the device tree (i.e. links between the audio codec and the SoC) and route DAPM widgets between the machine driver and the audio codec headphone/speaker outputs and microphone inputs.
• Configure the APE subsystem and codec clocks.
• Propagate the runtime PCM parameters (sample-rate, sample-size, etc.).
The Jetson ASoC machine driver is available in the kernel sources archive in this location:
To support a custom codec board with the Jetson ASoC machine driver
1. Update the DAI link and DAPM widgets defined in the device tree.
You must update these items because the machine driver parses the device tree in order to instantiate the sound card. Update the device tree properties of the sound node for the platform when you customize the platform to support a third-party audio codec.
The following example gives an overview of the DAI links and DAPM widgets for the audio codec found in the device tree’s sound node for Jetson AGX Xavier.
The sound node is added to the device tree file for sound card registration and passing platform-related data to the machine driver. Some of the sound node’s properties are described below. All of the described properties are required except as noted.
• compatible: Specifies the machine driver with which the sound node is compatible. Its value must be:
• nvidia,tegra-audio-t210ref-mobile-rt565x for Jetson Nano or Jetson TX1
• nvidia,tegra-audio-t186ref-mobile-rt565x for Jetson Xavier NX, Jetson AGX Xavier series, or Jetson TX2 series
• nvidia,model: Specifies the sound card’s name, and can be customized as necessary.
• nvidia,audio-routing: Describes the route between the Jetson ASoC machine driver widgets and the codec widgets. The machine driver defines DAPM widgets for the platform’s physical microphone, headphone, and speakers. These must be connected to the corresponding DAPM widgets on the codec, which represent the codec’s microphones, headphones, speakers, etc.
• nvidia,num-codec-link: Defines the number of platform-specific DAI links that connect interfaces, such as I2S, DSPK or DMIC, to other external devices, such as codecs. This must be the same as the number of nvidia,dai-link-<number> child nodes.
• nvidia,dai-link-<number>: Represents the interface between the SoC and an external audio device. The properties for the node describe the configuration of the interface.
You must define one of these properties (with a unique <number>) for each DAI link.
By default, a few interfaces may be connected to dummy codecs (spdif-dit0, spdif-dit1, etc.) for testing when no physical codec is present. This allows the SoC to drive the interface pins even when no external device is attached. These dummy codecs can be replaced by actual codecs (like rt5658 in the example above) when a codec is connected to a given interface.
The Jetson ASoC machine driver uses a DAI link’s link-name property to identify the DAI link and perform any necessary configuration such as configuring the codec clock. For other generic properties of a DAI link, see the Simple-Card specification.
For a complete example of how to customize the device tree for a different audio codec, see 40-pin GPIO Expansion Header (which describes interfacing to a codec on the 40-pin GPIO expansion header).
Definition of a DAI Node
Each DAI link for the I2S interface must be defined by a DAI node, which is a subnode of the sound node. The overall format of a DAI node is described in the preceding section.
For each I2S interface DAI link, the following properties must be configured:
• bitclock-master and frame-master: Optional Booleans; specify whether the codec is a slave or a master. The codec is the I2S bit-clock and frame master if these properties are present, or the I2S slave if they are absent.
• format: Configures CPU/CODEC common audio format. Value may be i2s, right_j, left_j, dsp_a, or dsp_b.
• bclk-ratio: An integer used to configure the I2S bit clock rate. The I2S bit clock rate is the product of this value and the PCM sample-rate. A value of 0 yields the same clock rate as 1.
• tx-mask (applicable when codec I2S is transmitting) and rx-mask (applicable when codec I2S is receiving): Optional; configure the TDM TX and RX slot enable mask, respectively. These bit masks indicate which TDM slots contain data. They default to values determined by the number of channels, and need be set only when the configuration is being customized.
For example, if there are eight channels, a mask of 0xff enables all eight channels in the TDM frame; a mask of 0x0f enables only the first four slots in the frame. If slots are not enabled, the total TDM frame size is still the same size as the total number of channels, but the data in the slots is ignored.
Other DAI link properties are common for I2S, DMIC, and DSPK interface-based DAI links:
• srate: PCM Data stream sample rate
• bit-format: Data stream sample size
• num-channel: Data stream number of channels
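For illustration (the values below are arbitrary, not required settings), these common properties might appear together in a DAI link node as follows:

```dts
nvidia,dai-link-x {
    srate = <48000>;        /* PCM sample rate in Hz */
    bit-format = "s16_le";  /* sample size */
    num-channel = <2>;      /* channel count */
};
```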
Clocking and Power Management
The following debugfs node listing, obtained from /sys/kernel/debug/clk/clk_summary, shows the clock tree of the ASoC driver for Jetson AGX Xavier in the idle state, when no audio record or playback operations are in progress. The clock trees for the other Jetson devices are similar.
The clocks of the individual modules, AMX, ADX, AFC, SFC, MIXER, and others, are internally driven by the APE clock.
The clocks for all codec drivers (I2S, DMIC, DSPK, XBAR, etc.) are switched off in the idle state. They are turned on when audio playback or recording begins.
The idle_bias_off option for the ASoC core (snd_soc_codec_driver) is set to 1 in the individual codec drivers for dynamic audio power management. This option causes the ASoC core to call the suspend() and resume() functions that the codec driver registered using SET_RUNTIME_PM_OPS during platform probe. It calls suspend() when playback or recording stops and resume() when it starts. suspend() saves the device state and disables the clock; resume() enables the clock and restores the device state.
Jetson platforms support one or more High Definition Audio (HDA) interfaces, through on-board HDMI, DP, and USB‑C ports. These interfaces can be used to perform high-quality audio rendering on devices like TVs, A/V receivers, etc. These HDA interfaces are available on various Jetson platforms:
• Jetson Xavier NX, Jetson TX2: one HDMI, one DP
• Jetson Nano, Jetson TX1: one HDMI
• Jetson AGX Xavier: one HDMI, two DP over USB-C
HDMI and DP interfaces can be connected using the respective connectors. DP over USB-C needs a USB-C to DP converter to connect to a DP sink.
Features Supported
Jetson High Definition Audio supports the following features:
• Compliant with High Definition Audio Specification Revision 1.0
• Supports HDMI 1.3a and DP
• Audio Format Support
• Channels: 2 to 8
• Sample size: 16 bits (S16_LE) or 24 bits (S32_LE)
You may experience issues when playing high resolution audio formats (using multichannel output or a high sampling rate), even with an audio format that is supported by your monitor. This is because the available audio bandwidth depends on the HDMI configuration, increasing with higher display resolutions.
If you encounter issues when playing a high resolution audio format, NVIDIA recommends setting your display resolution at least to the level that corresponds to your audio format in the following table. This table is taken from the HDMI 1.3a specification document.
Display resolution  Format timing  Pixel repetition  Vertical frequency (Hz)  Maximum fs, 8 channels (Hz)  Maximum frame rate, 2 channels, comp **  SuperAudio CD channel count
VGA                 640x480p       none              59.94/60                 48000                        192                                      2
480i                1440x480i      2                 59.94/60                 88200                        192                                      2
480i                2880x480i      4                 59.94/60                 192000                       768                                      8
240p                1440x240p      2                 59.94/60                 88200                        192                                      2
240p                2880x240p      4                 59.94/60                 192000                       768                                      8
480p                720x480p       none              59.94/60                 48000                        192                                      2
480p                1440x480p      2                 59.94/60                 176400                       384                                      8
480p                2880x480p      4                 59.94/60                 192000                       768                                      8
720p                1280x720p      none              59.94/60                 192000                       768                                      8
1080i               1920x1080i     none              59.94/60                 192000                       768                                      8
1080p               1920x1080p     none              59.94/60                 192000                       768                                      8
Software Driver Details
HDA interfaces are accessible through standard ALSA interfaces. You can use the aplay utility for rendering audio:
aplay -Dhw:<cardname>,<deviceID> <in.wav>
Where:
• <cardname> is the sound card name; see the table under Board Interfaces.
• <deviceID> is the sound interface’s device ID.
• <in.wav> is the name of the sound file to be played. It should be a .wav file.
Here are some further details about driver usage:
• All HDA interfaces are available under one card.
• You can read card details from /proc/asound/cards.
• You can see available PCM devices (i.e. HDA interfaces) under /proc/asound/card<n>/.
• 16-bit audio is supported in S16_LE format; 20 or 24-bit audio is supported in S32_LE format.
All Jetson platforms provide a USB host interface for connecting various USB devices, including USB audio devices such as speakers, microphones and headsets.
Features Supported
Jetson USB audio supports the following features:
• Channel count: 8 maximum
• Sample size: 16 bits (S16_LE) or 24 bits (S24_3LE)
• <file.wav> is the name of the input file (for aplay) or output file (for arecord), which must be a WAV file
Here are some further details about driver usage:
• The USB audio card is enumerated upon connecting the USB device (e.g. a USB headphone).
• You can read card details from /proc/asound/cards.
• You can see available PCM devices under /proc/asound/card<n>/.
Enabling Bluetooth Audio
Applies to: Original Jetson TX2 and Jetson TX2i
To ensure the Bluetooth® software stack is conformant for the configuration, Bluetooth audio is disabled by default. If additional Bluetooth audio profiles are enabled, product conformance may be impacted.
6. Enter this command to reboot the Jetson device:
$ sudo reboot
7. When the reboot is complete, pair and connect any Bluetooth headset.
Board Interfaces
The tables below list all of the audio interfaces exposed by Jetson developer kits. Note that some interfaces may not be directly available for use in the BSP provided. The pins may need to be configured for the desired function.
The need for pinmux configuration is indicated in the tables by the “Pinmux Setting Required” field.
For information about pinmux configuration, see the part of the Platform Adaptation and Bring-Up topic that applies to your platform.
All of the carrier boards used in Jetson developer kits have a 40-pin GPIO header that exposes audio I/O connections, as shown in the table above. You can use this header to connect various audio cards to your Jetson platform.
When you choose an audio codec to use with a Jetson device, be sure that:
• It is hardware-compatible in terms of functional pins (I2S, DMIC, etc.), GPIO, power, and clocks required to support the codec.
• It is compatible with the Jetson I2S interface (sample rates, sample sizes, frame formats, etc.).
• A Linux kernel driver is available for the codec.
• ALSA examples are available for it to show how to configure its audio routing and general setup. Configuring the audio routing can be the most complex part of integrating an I2S codec.
The 40-pin expansion header’s pinout can be inferred from the schematics for the Jetson platform. Subsequent sections give guidance for the software changes required to interface audio cards with Jetson boards.
Pinmux Configuration
The SoC I/O pins may operate either as GPIOs or as special-function I/O (SFIO) such as I2S or DMIC. Therefore, you must make sure that any audio I/O pins are configured as SFIO.
If a pin is not configured as desired by default, you must perform pinmux configuration on it. See the part of the Configuring the 40-Pin Expansion Header topic that applies to your platform.
See the table in the section for information about certain pins that can have alternate functions, e.g. a pin that could be configured as DSPK, DMIC, I2S, or GPIO.
Device Tree Configuration for a Custom Audio Card
To support a custom audio card or other external audio device, you may need to add or update various device tree nodes such as clocks and power supplies.
Populate Codec Node
To enable the codec, you must add the codec under the device tree node of the device that is used to access the codec. Most codecs use either I2C or SPI for access. In the example below, the codec uses I2C for its control interface, and so is added to the appropriate I2C node.
i2c@<addr> {
    sgtl5000: sgtl5000@0a {
        compatible = "fsl,sgtl5000";
        reg = <0x0a>;
        clocks = <&sgtl5000_mclk>;
        micbias-resistor-k-ohms = <2>;
        micbias-voltage-m-volts = <3000>;
        VDDA-supply = <&vdd_3v3>;
        VDDIO-supply = <&vdd_3v3>;
        status = "okay";
    };
};
See the device tree binding documentation to determine what properties must be populated for the codec and how to configure them.
Make sure that the relevant control interface (I2C, SPI, etc.) is enabled in the platform’s device tree. The 40-pin GPIO expansion header exposes an I2C controller; the table below shows the address of the I2C controller exposed on each Jetson platform.
Platform            40-pin expansion header I2C address
Jetson Xavier NX    0x031e0000
Jetson Nano         0x7000c400
Jetson AGX Xavier   0x031e0000
Jetson TX2          0x0c240000
Jetson TX1          0x7000c000
Certain codecs may use an on-board oscillator as the source for the codec master clock (MCLK). If the codec’s device tree documentation requires that MCLK be defined, you may need to add a device tree node to represent the on-board oscillator. For example, an SGTL5000 codec may be clocked by a 12.288 MHz fixed-rate clock present on the codec board, so a dummy clock is added to the device tree:
clocks {
    sgtl5000_mclk: sgtl5000_mclk {
        compatible = "fixed-clock";
        #clock-cells = <0>;
        clock-frequency = <12288000>;
        clock-output-names = "sgtl5000-mclk";
        status = "okay";
    };
};
Jetson I2S Node
The 40-pin GPIO expansion header exposes an I2S interface. The following table shows the address of the I2S interface exposed on the different Jetson platforms.
Platform            40-pin expansion header I2S address
Jetson Xavier NX    0x2901400
Jetson Nano         0x702d1300
Jetson AGX Xavier   0x02901000
Jetson TX2          0x02901100
Jetson TX1          0x702d1000
Make sure that the appropriate I2S interface is enabled by ensuring that the status property is set to okay:
i2s@<addr> {
    status = "okay";
};
Codec and CPU DAI Link Setup
Under the sound node, update the DAI link that represents the link between the appropriate I2S interface and the codec. The ASoC machine driver parses the sound node, its DAI links, and its properties to instantiate the sound card, configure codec clocking, establish appropriate DAPM routes, and propagate runtime PCM parameters.
Configure I2S and Codec DAI Link
A DAI link must have a unique link-name which the Jetson ASoC machine driver can use to identify the link and perform any necessary codec configuration. A DAI link must also have a unique cpu-dai and codec-dai which respectively point to the SoC audio interface’s device node (e.g., I2S1 in the example below) and the codec board’s device node (SGTL5000 in the example below). cpu-dai-name and codec-dai-name are custom names which represent CPU DAI and CODEC DAI respectively. Details of the other properties are given in the following sections.
nvidia,dai-link-x {
    link-name = "sgtl5000-codec";
    cpu-dai = <&tegra_i2s1>;
    codec-dai = <&sgtl5000>;
    cpu-dai-name = "I2S1";
    codec-dai-name = "sgtl5000";
    format = "i2s";
    bitclock-master;
    frame-master;
    bit-format = "s16_le";
    name-prefix = "z";
};
Note that the DAI link instance associated with the 40-pin GPIO expansion header is platform-specific. Instance names are in the following table.
Platform            40-pin expansion header DAI link ID
Jetson Xavier NX    nvidia,dai-link-5
Jetson Nano         nvidia,dai-link-1
Jetson AGX Xavier   nvidia,dai-link-2
Jetson TX2          nvidia,dai-link-1
Jetson TX1          nvidia,dai-link-1
Codec as I2S Master/Slave
The codec board may support both master and slave modes. If it does, check whether its Linux driver also supports both modes. If the codec and its driver both support both modes, review the driver and decide which mode to use.
Note that the device tree’s DAI link for the I2S codec interface is always configured from the perspective of the codec, so the absence of bitclock-master and frame-master implies that the codec is the slave.
If the codec operates in slave mode, there are two ways to set its clock.
• If AUD_MCLK is available on the header, you can use it to drive the codec’s MCLK.
• You can set AUD_MCLK to use a fixed rate by setting the following property under the device tree’s sound node:
sound {
    nvidia,mclk-rate = <mclk_fixed_rate>;
    mclk_parent = <mclk parent>; /* optional */
};
• Alternatively, you can set AUD_MCLK as function of the sampling rate by setting the following property under the sound node:
sound {
    mclk-fs = <scaling factor for sampling rate>;
    mclk_parent = <mclk parent>; /* optional */
};
The mclk_parent property refers to the parent of AUD_MCLK. You can obtain information about possible parents and rates of AUD_MCLK from sysfs nodes at:
• /sys/kernel/debug/clk/aud_mclk for Jetson Xavier NX, Jetson AGX Xavier, and Jetson TX2
• /sys/kernel/debug/clk/extern1 for Jetson Nano and Jetson TX1
The parent clock’s rate must be an integer multiple of the AUD_MCLK rate. Choose the parent clock rate based on the MCLK rate that the codec requires. By default, without any of the entries above, AUD_MCLK uses the SoC audio PLL (PLLA_OUT0) as clock parent, with a frequency equal to 256 times the sampling rate.
• If AUD_MCLK is not available, the codec can still act as slave if the codec driver supports using the SoC I2S bit clock to drive the codec’s internal PLL, which, in turn, can drive the codec’s MCLK. Note that the SoC I2S derives its bit clock from PLLA_OUT0 (a PLL internal to the SoC).
When the codec operates in master mode, the codec I2S bit clock typically is driven by the codec's internal PLL, which is driven in turn by a fixed rate external clock source. The properties below must be set in the appropriate DAI link to indicate that the codec should operate in master mode.
nvidia,dai-link-x {
    bitclock-master;
    frame-master;
};
I2S Mode Setting
To operate the I2S interface in left-justified, right-justified, or I2S mode, set the DAI link node’s format property to left_j, right_j, or i2s, respectively. For I2S to operate in FSYNC mode (dsp-a, dsp-b), set the format property to dsp_a or dsp_b. Configure the I2S frame-sync width as required for the interface; typically for dsp_a/dsp_b mode the frame-sync width is a single bit clock.
nvidia,dai-link-x {
    format = "dsp_a";
    fsync-width = <0>;
};
Enable Codec Driver
The ASoC machine driver can be enabled or disabled in the Linux kernel by enabling or disabling kernel configuration symbols. On Jetson TX2, for example, the corresponding ASoC machine driver is enabled in the Linux kernel by selecting the kernel configuration symbol SND_SOC_TEGRA_T186REF_MOBILE_ALT.
To enable the SGTL5000 codec driver, update the kernel configuration entry for the SND_SOC_TEGRA_T186REF_MOBILE_ALT symbol to select this driver, so that whenever the machine driver is enabled, the SGTL5000 codec driver is also enabled. The following diff patch shows one way to accomplish this.
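A sketch of such a patch is shown below. The Kconfig file path and the surrounding context lines are illustrative; SND_SOC_SGTL5000 is the upstream kernel configuration symbol for the SGTL5000 codec driver:

```diff
--- a/sound/soc/tegra-alt/Kconfig
+++ b/sound/soc/tegra-alt/Kconfig
@@ config SND_SOC_TEGRA_T186REF_MOBILE_ALT
 config SND_SOC_TEGRA_T186REF_MOBILE_ALT
 	tristate "SoC Audio support for T186Ref Mobile"
+	select SND_SOC_SGTL5000
```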
A similar patch to the Jetson ASoC machine driver kernel configuration is required for enabling the codec driver on other platforms. The following table shows the kernel configuration symbol that enables the Jetson ASoC machine driver for each Jetson platform.
Platform            Jetson ASoC machine driver kernel config
Jetson Xavier NX    SND_SOC_TEGRA_T186REF_MOBILE_ALT
Jetson AGX Xavier   SND_SOC_TEGRA_T186REF_MOBILE_ALT
Jetson TX2          SND_SOC_TEGRA_T186REF_MOBILE_ALT
Jetson Nano         SND_SOC_TEGRA_T210REF_MOBILE_ALT
Jetson TX1          SND_SOC_TEGRA_T210REF_MOBILE_ALT
Update the Machine Driver to Support a Custom Audio Card
The machine driver must be updated to support a custom audio card in order to configure the codec clock and DAI params. The example below shows the machine driver update for an SGTL5000 codec.
Add an Initialization Function for the Codec
You may need to update the machine driver when you integrate a new codec in order to perform any required codec initialization.
The codec SYSCLK or MCLK signal (the clock required for internal codec operation) may be sourced from the SoC I2S bit clock, from AUD_MCLK (available on the 40-pin GPIO expansion header), or from an external oscillator on the codec board. Consequently, the SYSCLK source must be configured in the initialization function. Usually the codec provides set_sysclk() callbacks, which are triggered by calling snd_soc_dai_set_sysclk(). This facilitates configuration, since snd_soc_dai_set_sysclk() expects the SYSCLK source as one of its parameters.
When you use the SGTL5000 with a fixed codec MCLK you must add an initialization function to set the MCLK frequency as explained below.
static int tegra_machine_sgtl5000_init(struct snd_soc_pcm_runtime *rtd)
{
    struct snd_soc_dai *codec_dai = rtd->codec_dai;

    /* Sketch: the 12.288 MHz fixed-rate oscillator on the codec board
     * drives MCLK, so it is configured as an input clock to the codec. */
    return snd_soc_dai_set_sysclk(codec_dai, SGTL5000_SYSCLK,
                                  12288000, SND_SOC_CLOCK_IN);
}
This would set the codec MCLK to receive the clock signal from an external oscillator on the codec board.
Register the Initialization Function for the Codec
The dai_link_setup function must register the initialization function as shown below for it to be executed. The link-name property of the codec’s DAI link identifies the codec, enabling the ASoC machine driver to populate the corresponding init function. (For SGTL5000 the value of link-name is sgtl5000-codec, as the section Configure I2S and Codec DAI Link shows.)
} else if (strstr(tegra_machine_codec_links[i].name,
                  "sgtl5000-codec")) {
    tegra_machine_codec_links[i].init =
        tegra_machine_sgtl5000_init;
}
Add Support for Runtime Configuration of Codec Parameters
PCM parameters are populated with the help of a machine driver patch for the codec, shown below. This patch updates the DAI parameters that are passed to the codec whenever playback or capture starts, so that the codec uses the stream’s current properties (sample rate, channel count, etc.).
As mentioned earlier, the codec’s SYSCLK or MCLK might be sourced from the SoC I2S bit clock. In that case, the PLL may be needed to upscale the BCLK rate to the desired SYSCLK rate (usually 256*fs or 512*fs). The codec driver provides set_pll() callbacks for facilitating PLL configuration, which are triggered on calling snd_soc_dai_set_pll() from tegra_machine_dai_init(). You can infer PLL setup details from the codec data sheet for a given BCLK rate (equal to sample_rate*channels*word_size). The expected SYSCLK rate (scale * sample_rate) and the parameters for snd_soc_dai_set_pll() can be defined as required.
static int tegra_machine_dai_init(struct snd_soc_pcm_runtime *runtime,
Jetson AGX Xavier has an “Audio Panel Header” (J511) on the bottom of the developer kit carrier board, as shown in the figure below.
Header J511 supports Intel’s HD front panel audio connector. Details of Intel’s front panel audio header pinout configuration can be found in the Intel document at:
To set up and configure the audio path to play or capture audio via the header, you must configure various ALSA mixer controls for both the Jetson device and the RT5658 codec. The following examples detail the ALSA mixer controls that you must configure.
Codec Mixer Controls
Codec mixer controls are registered by the codec driver and prefixed with the substring defined by the name-prefix property of the corresponding DAI link in sound device tree node.
To view the codec-specific mixer controls, list the sound card’s mixer controls (for example, with amixer -c <cardname> controls) and look for those with the appropriate name prefix.
Alternatively, look for the codec-specific controls in the codec driver.
Playback
You can connect headphones or speakers to either or both of the playback ports, PORT 2R and PORT 2L, to play back mono or stereo recordings. Use the mixer control settings shown below.
amixer -c tegrasndt19xmob cset name="x HPO R Playback Switch" "on"
amixer -c tegrasndt19xmob cset name="x HPO L Playback Switch" "on"
#Start playback
aplay -D hw:tegrasndt19xmob,1 <in.wav>
Mic Capture
You can connect microphones to either or both of the recording ports, PORT 1R and PORT 1L, to capture mono or stereo sound. Use these mixer control settings:
For Jetson devices other than Jetson AGX Xavier, you must set the DMIC boost gain to 50 or less and add volume gain through the MVC. In essence, the MVC applies compensatory gain while the signal is attenuated in the DMIC to avoid saturation.
Mono Capture (L)
This example shows how to perform mono capture from DMIC3 via ADMAIF<i> (Left MIC).
This example shows how to use the AMX module to multiplex three stereo streams, DMIC2 (connected to RxCIF0), DMIC3 (connected to RxCIF1), and I2S (connected to RxCIF2).
This section describes some issues that are liable to occur when you are working with ASoC drivers, and their probable causes and solutions.
No Sound Cards Found
This has several possible causes. Some typical ones are described below. In most cases the dmesg output can provide clues.
Source Widget not Found
The dmesg output shows that no ‘source widget’ was found:
$ dmesg | grep "ASoC"
[4.874720] tegra-audio-t210ref tegra-audio-t210ref.0: ASoC: no source widget found for x OUT
[4.874724] tegra-audio-t210ref tegra-audio-t210ref.0: ASoC: Failed to add route x OUT-> direct-> Headphone-x
[4.874736] tegra-audio-t210ref tegra-audio-t210ref.0: ASoC: no sink widget found for x IN
[4.874739] tegra-audio-t210ref tegra-audio-t210ref.0: ASoC: Failed to add route LineIn-x-> direct-> x IN
x OUT and x IN are the widgets of the spdif-dit dummy codec. The ASoC may not have instantiated this codec. Enter this command to determine whether spdif-dit is instantiated:
cat /sys/kernel/debug/asoc/codecs
If spdif-dit is not instantiated, it is most likely because the SPDIF codec is not enabled in the Linux kernel configuration. Check that the SPDIF codec (kernel configuration symbol SND_SOC_SPDIF) is enabled in the kernel configuration.
CPU DAI Not Found
The dmesg output shows that no ‘CPU DAI’ was found:
$ dmesg | grep "ASoC"
[4.874720] tegra-audio-t210ref tegra-audio-t210ref.0: ASoC: CPU DAI DAP not registered
In this case, “DAP” is the CPU DAI for the I2S-to-codec DAI link.
The ASoC may not have instantiated the I2S codec. To determine whether the codec is instantiated, enter the command:
$ cat /sys/kernel/debug/asoc/codecs
If the I2S codec is instantiated, it has a name like tegra210-i2s, where the “tegra” number indicates the type of processor in the Jetson device.
Identifying the DAI link at the point of failure can give a clue to the I2S instance number that failed to instantiate. Accordingly, you can instantiate the I2S codec driver by providing a suitable entry point in the device tree structure (DTS) file as described in Codec Driver Instantiation Using Device Tree.
Sound Not Audible or Not Recorded
Follow this procedure to diagnose the issue:
1. Determine whether the DAPM path is complete. To trace the DAPM path, enable the DAPM trace events before you run the playback or record use case:
for i in `find /sys/kernel/debug/tracing/events -name "enable" | grep snd_soc_`; do echo 1 > $i; done
If the DAPM path is not complete, the use case cannot proceed. The DAPM path is populated in the file below as and when it is set up.
cat /sys/kernel/debug/tracing/trace_pipe | grep snd_soc_dapm_path
Below is a sample of a complete DAPM path for recording through the microphone jack on Jetson AGX Xavier via rt5659, ADMAIF1, and I2S1.
snd_soc_dapm_path: *x RECMIX1R -> (direct) -> x ADC1 R
snd_soc_dapm_path: *x RECMIX1L -> (direct) -> x ADC1 L
snd_soc_dapm_path: *x BST1 -> BST1 Switch -> x RECMIX1R
snd_soc_dapm_path: *x BST1 -> BST1 Switch -> x RECMIX1L
snd_soc_dapm_path: *x IN1N -> (direct) -> x BST1
snd_soc_dapm_path: *x IN1P -> (direct) -> x BST1
snd_soc_dapm_path: *x Mic Jack -> (direct) -> x IN2P
snd_soc_dapm_path: *x Mic Jack -> (direct) -> x IN1P
You must ensure that the physical codec pins (input/output, IN1P, and IN2P in this case) are connected directly or indirectly to codec DAI’s AIF_OUT/AIF_IN. AIF_OUT/AIF_IN (AIF1 Capture in this case) interface with the CPU DAI (I2S in this case), which in turn must be connected directly or indirectly to the platform DAI (ADMAIF in this case).
2. Confirm the settings for the audio interface pins. The pins for the audio interface must be configured as special function IOs (SFIOs) and not GPIOs. Then the pinmux settings for the SFIO must select the desired audio function. See Board Interfaces to determine whether the pinmux settings are required. If so, see the appropriate section of the topic Platform Adaptation and Bring-Up for pinmux change instructions.
To verify the default SFIO pinmux configuration, check the pinmux node in the appropriate device tree source file (after applying any override, if the SFIO is configured through an override).
3. Confirm that the audio interface’s status property is set to okay in the appropriate device tree source file.
For example, for Jetson TX2, the device tree file is at:
4. Probe the audio interface signals. For example, if using I2S, probe the frame sync (FS) and bit clock (BCLK) to verify that the timings are correct.
I2S Software Reset Failed
A common problem is that the I2S software reset fails when starting playback or capture via an I2S interface. Error messages like this one appear in the dmesg log:
tegra210-i2s tegra210-i2s.0: Failed at I2S0_TX sw reset
This problem occurs when the clock for the I2S interface is not active and hence the software reset fails. It typically occurs when the I2S interface is the bit clock slave and hence the bit clock is provided by an external device such as a codec. If this problem occurs, check that the bit clock is being enabled when the playback or capture is initiated.
XRUN Observed During Playback/Capture
An XRUN is either an underrun (on playback) or an overrun (on capture) of the audio circular buffer. During playback, the CPU writes to the audio circular buffer; the DMA reads it and sends the data to the appropriate audio interface (such as I2S) via the AHUB. During capture, the DMA writes to the audio circular buffer (with data received from the AHUB) and the CPU reads it.
An XRUN event typically indicates that the CPU is unable to keep up with the DMA. In the case of playback, the DMA reads stale data. In the case of capture, data is lost. Hence, an XRUN event can signify a system performance/latency issue, which can have many different causes.
If an XRUN occurs, try these measures to determine whether there is a performance issue:
• Enable maximum performance by running the jetson_clocks.sh script. This script is in the user HOME directory on the Jetson platform’s root file system.
• Use a RAM file system for reading and writing the audio data. The default root file system format for Jetson platforms is EXT4 with journaling enabled. Latencies have been observed with journaling file systems such as EXT4, and can lead to XRUN events. Enter these commands to create a simple 100 MB RAM file system:
$ sudo mkdir /mnt/ramfs
$ sudo mount -t tmpfs -o size=100m tmpfs /mnt/ramfs
• You can increase the size of the audio circular buffer to reduce the impact of system latencies. The default size of the audio circular buffer is 32 KB. It is defined by the buffer_bytes_max member of the tegra_alt_pcm_hardware structure in the Linux kernel source file: