
audiosync | JavaScript library to sync audio | Runtime Environment library

 by   johndyer Java Version: Current License: No License



kandi X-RAY | audiosync Summary

audiosync is a Java library typically used in Server, Runtime Environment, React, and Nodejs applications. audiosync has no bugs, it has no vulnerabilities, and it has low support. However, the audiosync build file is not available. You can download it from GitHub.
JavaScript library to sync audio with text based on a timing file

Support

  • audiosync has a low active ecosystem.
  • It has 54 stars, 29 forks, and 6 watchers.
  • It had no major release in the last 12 months.
  • There is 1 open issue and 0 closed issues. On average, issues are closed in 2,863 days. There are no pull requests.
  • It has a neutral sentiment in the developer community.
  • The latest version of audiosync is current.

Quality

  • audiosync has no bugs reported.

Security

  • audiosync has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.

License

  • audiosync does not have a standard license declared.
  • Check the repository for any license declaration and review the terms closely.
  • Without a license, all rights are reserved, and you cannot use the library in your applications.

Reuse

  • audiosync releases are not available. You will need to build from source code and install.
  • audiosync has no build file. You will need to create the build yourself to build the component from source.
  • Installation instructions are not available. Examples and code snippets are available.
Top functions reviewed by kandi - BETA

kandi has reviewed audiosync and identified the functions below as its top functions. This is intended to give you an instant insight into the functionality audiosync implements, and to help you decide if it suits your requirements.

  • Collect successor tokens.
  • Create the lattice.
  • Compile the grammar.
  • Command line reader.
  • Start timing information.
  • Process a dictionary entry.
  • Collect the phraseSpotting results.
  • Perform substitution.
  • Load the abbrev model.
  • Initialize the set of unit maps for the given node.

audiosync Key Features

JavaScript library to sync audio with text based on a timing file
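This page does not document the timing-file format itself, but the core idea, mapping playback time to a span of text, can be sketched briefly. The JSON-like format and the `wordAt` helper below are assumptions for illustration, not the library's actual API:

```javascript
// Hypothetical timing data: each entry maps a word to its start/end
// time in seconds. (Assumed for illustration; not the actual
// audiosync timing-file format.)
const timings = [
  { word: "Hello", start: 0.0, end: 0.4 },
  { word: "world", start: 0.4, end: 0.9 },
];

// Return the timing entry active at playback time t, or null when
// t falls between or after the timed words.
function wordAt(timings, t) {
  return timings.find(({ start, end }) => t >= start && t < end) || null;
}

// In a browser, you would drive this from the audio element's
// timeupdate event, e.g.:
//   audio.addEventListener("timeupdate", () =>
//     highlight(wordAt(timings, audio.currentTime)));
```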

default

* Python 2.7
* java
* ant
* [sox](http://sox.sourceforge.net/)
* svn

License
-------
Dual licensed under the MIT or GPL Version 2 licenses.
MIT License: http://creativecommons.org/licenses/MIT/
GPL 2.0 license: http://creativecommons.org/licenses/GPL/2.0/
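Before building, you can check that the prerequisites listed above are installed. A minimal sketch (the tool names come straight from the list; nothing here is specific to audiosync):

```shell
# Report which of the build prerequisites listed above are on the PATH.
for tool in python java ant sox svn; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found: $tool"
  else
    echo "missing: $tool"
  fi
done
```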


Community Discussions


QUESTION

Play audio on smartphone, received via bluetooth, from a SensorTile (STEVAL-STLKT01V1)

Asked 2017-Jun-13 at 16:02

I am making an app that plays audio received via Bluetooth from a board with sensors, including a microphone. In the audio feature's activity there are two buttons that let you start playing audio in stream mode and stop playback. Unfortunately, at the moment it does not work as I would like. The problem is that audioSample is null, so I cannot get into the onUpdate method and extract the audio from the sample.

Changes: listener change, adding a button to disable audio

Below the code relating to activity:

package com.st.BlueSTSDK.Example;

import android.content.Context;
import android.content.Intent;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.os.Bundle;
import android.support.annotation.NonNull;
import android.support.v7.app.AppCompatActivity;
import android.view.View;
import android.widget.Button;
import android.widget.ImageButton;
import android.widget.SeekBar;

import com.st.BlueSTSDK.Feature;
import com.st.BlueSTSDK.Features.FeatureAudioADPCM;
import com.st.BlueSTSDK.Features.FeatureAudioADPCMSync;
import com.st.BlueSTSDK.Manager;
import com.st.BlueSTSDK.Node;
import com.st.BlueSTSDK.Utils.BVAudioSyncManager;

import java.util.List;

/**
 * Created by Cesare on 09/06/2017.
 */

public class FeatureAudioActivity extends AppCompatActivity {

    /**
     *   Node that will show the data
     */
    private Node mNode;

    /** fragment used for keep the connection open */
    private NodeContainerFragment mNodeContainer;

    //  Feature on which to apply the listener
    private FeatureAudioADPCM mAudio;

    // feature where we read the audio sync values
    private FeatureAudioADPCMSync mAudioSync;

    // The sampling rate
    private static final int SAMPLE_RATE = 8000;

    // raw audio
    private short audioSample[];

    // audio manager
    private static final int AUDIO_STREAM = AudioManager.STREAM_MUSIC;

    //  Audio track builder
    private AudioTrack mAudioTrack;

    //object containing the sync data needed in a ADPCM stream decoding
    private BVAudioSyncManager mBVAudioSyncManager = new BVAudioSyncManager();


    private final static String NODE_FRAGMENT =   FeatureAudioActivity.class.getCanonicalName() + "" +
        ".NODE_FRAGMENT";
    private final static String NODE_TAG = FeatureAudioActivity.class.getCanonicalName() + "" +
        ".NODE_TAG";



    /**
     * create an intent to start the activity that will log the information from the node
     *
     * @param c    context used to create the intent
     * @param node node that will be used by the activity
     * @return intent to start this activity
     */
    public static Intent getStartIntent(Context c, @NonNull Node node) {
        Intent i = new Intent(c, FeatureAudioActivity.class);
        i.putExtra(NODE_TAG, node.getTag());
        i.putExtras(NodeContainerFragment.prepareArguments(node));
        return i;
    }

    /**
     * listener for the audio feature; it updates the audio values
     */
    public final Feature.FeatureListener mAudioListener = new Feature.FeatureListener() {

        @Override
        public void onUpdate(final Feature f, final Feature.Sample sample) {
            audioSample = FeatureAudioADPCM.getAudio(sample);
        }

    };

    /**
     * listener for the audioSync feature; it updates the synchronization values
     */
    public final Feature.FeatureListener mAudioSyncListener = new Feature.FeatureListener() {
        @Override
        public void onUpdate(Feature f, final Feature.Sample sample) {
            if(mBVAudioSyncManager!=null){
                mBVAudioSyncManager.setSyncParams(sample);
            }
        }
    };

/* ///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////// */
    private SeekBar mVolumeBar;
    private AudioManager mAudioManager;

    private Button mPlayButton;
    private Button mStopButton;

    private ImageButton mMuteButton;
    private boolean mIsMute = false;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_feature_audio);

        // find the node.
        String nodeTag = getIntent().getStringExtra(NODE_TAG);
        mNode = Manager.getSharedInstance().getNodeWithTag(nodeTag);


        List<Feature> listFeature = mNode.getFeatures();
        for (Feature f : listFeature) {
            if (f.isEnabled() && f.getName().equals("AudioFeature")) {

                mAudio=(FeatureAudioADPCM) f;

            }//if
            if (f.isEnabled() && f.getName().equals("AudioSyncFeature")) {

                mAudioSync=(FeatureAudioADPCMSync) f;

            }//if
        }//for


        //create/recover the NodeContainerFragment
        if (savedInstanceState == null) {
            Intent i = getIntent();
            mNodeContainer = new NodeContainerFragment();
            mNodeContainer.setArguments(i.getExtras());
            getFragmentManager().beginTransaction()
                .add(mNodeContainer, NODE_FRAGMENT).commit();
        } else {
            mNodeContainer = (NodeContainerFragment) getFragmentManager()
                .findFragmentByTag(NODE_FRAGMENT);
        }//if-else



        //builder audio track
        mAudioTrack = new AudioTrack(
                AudioManager.STREAM_MUSIC,
                SAMPLE_RATE,
                AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                FeatureAudioADPCM.AUDIO_PACKAGE_SIZE,
                AudioTrack.MODE_STREAM);





        mPlayButton = (Button) findViewById(R.id.playButton);
        mStopButton = (Button) findViewById(R.id.stopButton);
        mMuteButton = (ImageButton) findViewById(R.id.muteButton);

//        //start speaker phone
//        AudioManager audioManager =  (AudioManager)getSystemService(Context.AUDIO_SERVICE);
//        audioManager.setMode(AudioManager.MODE_IN_CALL);
//        audioManager.setSpeakerphoneOn(true);


        //  When the play button is pressed
        mPlayButton.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {

                mAudioTrack.play();

                /*Write audio data for playback
              @param short : The array that contains the data for playback
              @param int: offset in rawAudio where playback data begins
              @param int: The number of shorts to read in rawAudio after the offset
                */
                mAudioTrack.write(audioSample,0,audioSample.length);
            }
        });

        //When the stop button is pressed
        mStopButton.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {

                mAudioTrack.stop();
            }
        });

        //When the mute button is pressed
        mMuteButton.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                changeState();
            }
            boolean changeState(){
                mIsMute=!mIsMute;
                if(mIsMute)
                    muteAudio();
                else
                    unMuteAudio();
                return mIsMute;
            }
            private void muteAudio(){
                  mMuteButton.setImageResource(R.drawable.ic_volume_off_black_32dp);
                mAudioManager.setStreamVolume(AUDIO_STREAM,0,0);
                mVolumeBar.setEnabled(false);
            }

            private void unMuteAudio(){
                mMuteButton.setImageResource(R.drawable.ic_volume_up_black_32dp);
                mAudioManager.setStreamVolume(AUDIO_STREAM,mVolumeBar.getProgress(),0);
            mVolumeBar.setEnabled(true);
            }
        });


        setVolumeControlStream(AudioManager.STREAM_MUSIC);
        initControls();


        mAudioSync.addFeatureListener(mAudioSyncListener);
        mAudio.setAudioSyncManager(mBVAudioSyncManager);
        mAudio.addFeatureListener(mAudioListener);
        mNode.enableNotification(mAudio);



    }

    //   Volume control from SeekBar
    private void initControls()
    {
        try
        {
            mVolumeBar = (SeekBar)findViewById(R.id.volumeValue);
            mAudioManager = (AudioManager)getSystemService(Context.AUDIO_SERVICE);
            mVolumeBar.setMax(mAudioManager
                .getStreamMaxVolume(AudioManager.STREAM_MUSIC));
            mVolumeBar.setProgress(mAudioManager
                .getStreamVolume(AudioManager.STREAM_MUSIC));


            mVolumeBar.setOnSeekBarChangeListener(new  SeekBar.OnSeekBarChangeListener()
            {
                @Override
                public void onStopTrackingTouch(SeekBar arg0)
                {
                }

                @Override
                public void onStartTrackingTouch(SeekBar arg0)
                {
                }

                @Override
                public void onProgressChanged(SeekBar arg0, int progress, boolean arg2)
                {
                    mAudioManager.setStreamVolume(AudioManager.STREAM_MUSIC,
                        progress, 0);
                }
            });
        }
        catch (Exception e)
        {
            e.printStackTrace();
        }
    }

    /**
     * if we have to leave this activity, we keep the connection open, since we go back
     * to the {@link FeatureListActivity}
     */
     @Override
     public void onBackPressed() {
        mNodeContainer.keepConnectionOpen(true);
        super.onBackPressed();
     }//onBackPressed


}

ANSWER

Answered 2017-Jun-13 at 16:02

Move the write method inside the onUpdate method; this writes new audio each time a new sample arrives. Add onResume and onPause methods, in which we respectively enable and disable the mAudio and mAudioSync notifications. Change the setOnClickListener methods for the play and stop buttons.

package com.st.BlueSTSDK.Example;

import android.content.Context;
import android.content.Intent;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.os.Bundle;
import android.support.annotation.NonNull;
import android.support.v7.app.AppCompatActivity;
import android.view.View;
import android.widget.Button;
import android.widget.ImageButton;
import android.widget.SeekBar;

import com.st.BlueSTSDK.Feature;
import com.st.BlueSTSDK.Features.FeatureAudioADPCM;
import com.st.BlueSTSDK.Features.FeatureAudioADPCMSync;
import com.st.BlueSTSDK.Manager;
import com.st.BlueSTSDK.Node;
import com.st.BlueSTSDK.Utils.BVAudioSyncManager;

import java.util.List;


public class FeatureAudioActivity extends AppCompatActivity {

/**
 *   Node that will show the data
 */
private Node mNode;

/** fragment used for keep the connection open */
private NodeContainerFragment mNodeContainer;

//  Feature on which to apply the listener
private FeatureAudioADPCM mAudio;

// feature where we read the audio sync values
private FeatureAudioADPCMSync mAudioSync;

// The sampling rate
private static final int SAMPLE_RATE = 8000;

// audio manager
private static final int AUDIO_STREAM = AudioManager.STREAM_MUSIC;

//  Audio track builder
private AudioTrack mAudioTrack;

//object containing the sync data needed in a ADPCM stream decoding
private BVAudioSyncManager mBVAudioSyncManager = new BVAudioSyncManager();


private final static String NODE_FRAGMENT =   FeatureAudioActivity.class.getCanonicalName() + "" +
        ".NODE_FRAGMENT";
private final static String NODE_TAG = FeatureAudioActivity.class.getCanonicalName() + "" +
        ".NODE_TAG";



/**
 * create an intent to start the activity that will log the information from the node
 *
 * @param c    context used to create the intent
 * @param node node that will be used by the activity
 * @return intent to start this activity
 */
public static Intent getStartIntent(Context c, @NonNull Node node) {
    Intent i = new Intent(c, FeatureAudioActivity.class);
    i.putExtra(NODE_TAG, node.getTag());
    i.putExtras(NodeContainerFragment.prepareArguments(node));
    return i;
}

/**
 * listener for the audio feature; it updates the audio values
 */
public final Feature.FeatureListener mAudioListener = new Feature.FeatureListener() {

    @Override
    public void onUpdate(final Feature f, final Feature.Sample sample) {
        short audioSample[] = FeatureAudioADPCM.getAudio(sample);

          /*Write audio data for playback
          @param short : The array that contains the data for playback
          @param int: offset in rawAudio where playback data begins
          @param int: The number of shorts to read in rawAudio after the offset
            */
        mAudioTrack.write(audioSample,0,audioSample.length);
    }

};

/**
 * listener for the audioSync feature; it updates the synchronization values
 */
public final Feature.FeatureListener mAudioSyncListener = new Feature.FeatureListener() {
    @Override
    public void onUpdate(Feature f, final Feature.Sample sample) {
        if(mBVAudioSyncManager!=null){
            mBVAudioSyncManager.setSyncParams(sample);
        }
    }
};

/* ///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////// */
private SeekBar mVolumeBar;
private AudioManager mAudioManager;

private Button mPlayButton;
private Button mStopButton;

private ImageButton mMuteButton;
private boolean mIsMute = false;

@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_feature_audio);

    // find the node.
    String nodeTag = getIntent().getStringExtra(NODE_TAG);
    mNode = Manager.getSharedInstance().getNodeWithTag(nodeTag);


    List<Feature> listFeature = mNode.getFeatures();
    for (Feature f : listFeature) {
        if (f.isEnabled() && f.getName().equals("AudioFeature")) {

            mAudio=(FeatureAudioADPCM) f;

        }//if
        if (f.isEnabled() && f.getName().equals("AudioSyncFeature")) {

            mAudioSync=(FeatureAudioADPCMSync) f;

        }//if
    }//for


    //create/recover the NodeContainerFragment
    if (savedInstanceState == null) {
        Intent i = getIntent();
        mNodeContainer = new NodeContainerFragment();
        mNodeContainer.setArguments(i.getExtras());
        getFragmentManager().beginTransaction()
                .add(mNodeContainer, NODE_FRAGMENT).commit();
    } else {
        mNodeContainer = (NodeContainerFragment) getFragmentManager()
                .findFragmentByTag(NODE_FRAGMENT);
    }//if-else



    //builder audio track
    mAudioTrack = new AudioTrack(
            AudioManager.STREAM_MUSIC,
            SAMPLE_RATE,
            AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_16BIT,
            FeatureAudioADPCM.AUDIO_PACKAGE_SIZE,
            AudioTrack.MODE_STREAM);



    mPlayButton = (Button) findViewById(R.id.playButton);
    mStopButton = (Button) findViewById(R.id.stopButton);
    mMuteButton = (ImageButton) findViewById(R.id.muteButton);



    //  When the play button is pressed
    mPlayButton.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {

            if(!mIsMute){
                mAudioTrack.play();
                mAudioManager.setStreamVolume(AUDIO_STREAM,mVolumeBar.getProgress(),0);
            }

        }
    });

    //When the stop button is pressed
    mStopButton.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {

            stopAudioTrack();
            mAudioManager.setStreamVolume(AUDIO_STREAM,0,0);

        }
    });

    //When the mute button is pressed
    mMuteButton.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            changeState();
        }
        boolean changeState(){
            mIsMute=!mIsMute;
            if(mIsMute)
                muteAudio();
            else
                unMuteAudio();
            return mIsMute;
        }
        private void muteAudio(){
            mMuteButton.setImageResource(R.drawable.ic_volume_off_black_32dp);
            mAudioManager.setStreamVolume(AUDIO_STREAM,0,0);
            mVolumeBar.setEnabled(false);
        }

        private void unMuteAudio(){
            mMuteButton.setImageResource(R.drawable.ic_volume_up_black_32dp);
            mAudioManager.setStreamVolume(AUDIO_STREAM,mVolumeBar.getProgress(),0);
            mVolumeBar.setEnabled(true);
        }
    });

    //enable control volume
    setVolumeControlStream(AudioManager.STREAM_MUSIC);
    initControls();


}

@Override
public void onResume(){
    super.onResume();
    // enable needed notification
    if(mAudio!=null && mAudioSync!=null) {
        mAudio.addFeatureListener(mAudioListener);
        mBVAudioSyncManager.reinitResetFlag();
        mAudio.setAudioSyncManager(mBVAudioSyncManager);
        mNode.enableNotification(mAudio);
        mAudioSync.addFeatureListener(mAudioSyncListener);
        mNode.enableNotification(mAudioSync);
    }
}

@Override
public void onPause(){
    super.onPause();
    // disable needed notification
    if(mAudio!=null) {
        mAudio.removeFeatureListener(mAudioListener);
        mNode.disableNotification(mAudio);
    }
    if(mAudioSync!=null) {
        mAudioSync.removeFeatureListener(mAudioSyncListener);
        mNode.disableNotification(mAudioSync);
    }
}



private void stopAudioTrack(){
    synchronized(this) {
        mAudioTrack.pause();
        mAudioTrack.flush();
    }
}



//   Volume control from SeekBar
private void initControls()
{
    try
    {
        mVolumeBar = (SeekBar)findViewById(R.id.volumeValue);
        mAudioManager = (AudioManager)getSystemService(Context.AUDIO_SERVICE);
        mVolumeBar.setMax(mAudioManager
                .getStreamMaxVolume(AudioManager.STREAM_MUSIC));
        mVolumeBar.setProgress(mAudioManager
                .getStreamVolume(AudioManager.STREAM_MUSIC));


        mVolumeBar.setOnSeekBarChangeListener(new  SeekBar.OnSeekBarChangeListener()
        {
            @Override
            public void onStopTrackingTouch(SeekBar arg0)
            {
            }

            @Override
            public void onStartTrackingTouch(SeekBar arg0)
            {
            }

            @Override
            public void onProgressChanged(SeekBar arg0, int progress, boolean arg2)
            {
                mAudioManager.setStreamVolume(AudioManager.STREAM_MUSIC,
                        progress, 0);
            }
        });
    }
    catch (Exception e)
    {
        e.printStackTrace();
    }
}

/**
 * if we have to leave this activity, we keep the connection open, since we go back
 * to the {@link FeatureListActivity}
 */
@Override
public void onBackPressed() {
    mNodeContainer.keepConnectionOpen(true);
    super.onBackPressed();
}//onBackPressed

}

Source https://stackoverflow.com/questions/44027275
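The accepted fix, writing audio from the listener callback instead of once at click time, can be illustrated with a standalone, non-Android sketch. The `FeatureListener` and `FakeAudioTrack` types below only mirror the roles of the BlueSTSDK listener and Android's AudioTrack; they are hypothetical stand-ins, not the real SDK classes:

```java
import java.util.ArrayList;
import java.util.List;

public class WriteOnUpdateSketch {
    // Stand-in for Feature.FeatureListener: fires once per BLE notification.
    interface FeatureListener { void onUpdate(short[] sample); }

    // Stand-in for AudioTrack: records every chunk "written" for playback.
    static class FakeAudioTrack {
        final List<Short> written = new ArrayList<>();
        void write(short[] data, int offset, int length) {
            for (int i = offset; i < offset + length; i++) written.add(data[i]);
        }
    }

    public static void main(String[] args) {
        FakeAudioTrack track = new FakeAudioTrack();

        // The fix: write each sample as it arrives, so playback never
        // reads a field that may still be null when Play is pressed.
        FeatureListener audioListener = sample -> track.write(sample, 0, sample.length);

        // Simulate two notifications delivering decoded audio chunks.
        audioListener.onUpdate(new short[]{1, 2, 3});
        audioListener.onUpdate(new short[]{4, 5});

        System.out.println(track.written.size()); // prints 5
    }
}
```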

Community Discussions, Code Snippets contain sources that include Stack Exchange Network

Vulnerabilities

No vulnerabilities reported

Install audiosync

You can download it from GitHub.
You can use audiosync like any standard Java library. Please include the jar files in your classpath. You can also use any IDE to run and debug the audiosync component as you would any other Java program. Best practice is to use a build tool that supports dependency management, such as Maven or Gradle. For Maven installation, please refer to maven.apache.org. For Gradle installation, please refer to gradle.org.

Support

For any new features, suggestions, and bugs, create an issue on GitHub. If you have any questions, check for and ask questions on the community page on Stack Overflow.


  • © 2022 Open Weaver Inc.