How to draw an audio waveform on Android


Problem description


I have a custom view that I want to use to display the amplitude of audio coming in through the microphone in a line graph. Getting the amplitude and all that I have no problem with, and drawing the lines is not really a problem either.


What I want to do is show the amplitude starting at the far right edge, moving left. So with each new sample I want to translate the bitmap to the left, then draw a line from the last point to the new point. I'm not sure what the easiest way to achieve this is. I originally was able to do it by drawing Paths and just adding a new point to the path with each sample; the problem was that after about a minute the path was too big to be drawn. So I thought about it and wanted to switch to using a cached bitmap, translate that on each iteration, and draw from the last point to the new point. However, this turned out to be tricky (after some experimentation): when I translate the bitmap it doesn't move the far left pixels off the bitmap, it just moves the entire bitmap in the canvas, and I have no way to write pixels to the right side. Below is a description of what I'm trying to do:


Given this:


I want to translate that to the left:


Then draw a line to a new point in the empty space on the right.


Of course, steps 2 and 3 should happen at essentially the same time.


How can I achieve this? I'm open to new ideas altogether, like perhaps saving all the points for up to 1 screen worth and drawing them out on each onDraw call. I'd prefer to just save them in a bitmap and do some kind of translation/clipping etc to achieve the same thing with perhaps less overhead.

private static final int MAX_AMPLITUDE = 32767;

// State used by onSizeChanged()/onDraw() below
private float lx, ly;            // last point drawn
private float mAmplitude;
private float amplitudeDivisor = 1;
private float delta;
private int width, height;

private Paint mPaint;
private Bitmap mBitmap;
private Canvas mCanvas;

private void init() {
    mPaint = new Paint(Paint.ANTI_ALIAS_FLAG);
    mPaint.setStyle(Paint.Style.STROKE);
    mPaint.setStrokeWidth(5);
    mPaint.setColor(Color.BLACK);
}

@Override
public void onSizeChanged(int w, int h, int oldw, int oldh) {
    if (mBitmap != null) {
        mBitmap.recycle();
    }
    mBitmap = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
    mCanvas = new Canvas(mBitmap);
    height = h;
    width = w;
    ly = height;
    lx = width;
    amplitudeDivisor = ((float) MAX_AMPLITUDE / (float) height);
}

@Override
public void onDraw(Canvas canvas) {
    // Random amplitude stands in for the real microphone value for now
    mAmplitude = (float) (MAX_AMPLITUDE * Math.random());
    float dx = width - delta;
    float dy = height - (mAmplitude / amplitudeDivisor);
    mCanvas.drawLine(lx, ly, dx, dy, mPaint);
    // This only shifts the drawing origin of mCanvas; it does not scroll the bitmap's pixels
    mCanvas.translate(-delta, 0);
    canvas.drawBitmap(mBitmap, 0, 0, mPaint);
    lx = dx;
    ly = dy;
    delta += 10;
    postInvalidateDelayed(200);
}


The above is just a sample, I'm just using a random value for the amplitude to simplify for now. I've tried a bunch of things with no luck. Any help would be greatly appreciated.
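
In other words, the per-frame update I'm after would look roughly like the sketch below. This is just a sketch, using a hypothetical second scratch bitmap (mScratch/mScratchCanvas), since Canvas.translate() only moves the drawing origin and does not shift the pixels already in the bitmap:

// Rough sketch of the "scroll left, then append" idea (not working code from my view).
// Assumes the same mBitmap, mCanvas, mPaint, width and ly fields as in the snippet above.
private Bitmap mScratch;        // hypothetical extra buffer, same size as mBitmap
private Canvas mScratchCanvas;  // canvas wrapping mScratch

private void scrollAndAppend(float newY, float delta) {
    // Copy the existing waveform into the scratch buffer, shifted left by delta
    mScratchCanvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.CLEAR);
    mScratchCanvas.drawBitmap(mBitmap, -delta, 0, null);
    // Draw the new segment so it ends at the right edge
    mScratchCanvas.drawLine(width - delta, ly, width, newY, mPaint);
    ly = newY;
    // Swap buffers so mBitmap always holds the latest frame
    Bitmap tmpBitmap = mBitmap;
    Canvas tmpCanvas = mCanvas;
    mBitmap = mScratch;
    mCanvas = mScratchCanvas;
    mScratch = tmpBitmap;
    mScratchCanvas = tmpCanvas;
    invalidate(); // onDraw then just does canvas.drawBitmap(mBitmap, 0, 0, null)
}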

Recommended answer


I ended up getting this working by saving the points to an array. I draw a white line before the recording starts. Note that I use an EvictingQueue from the Guava library as a circular buffer of the points to render as a line. To use this, call start() once a recording starts and call stop() when it ends. From your activity you will need to send MediaRecorder getMaxAmplitude() values to the updateAmplitude() method of this class, at an interval of, say, 50 ms (a sketch of that polling follows the class below). The view also supports rotation.

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Matrix;
import android.graphics.Paint;
import android.os.Bundle;
import android.os.Parcelable;
import android.util.AttributeSet;
import android.view.View;

import com.google.common.collect.EvictingQueue;
import com.google.common.primitives.Floats;

public class AmplitudeWaveFormView extends View {
    private static final String TAG = AmplitudeWaveFormView.class.getSimpleName();

    private static final int MAX_AMPLITUDE = 32767;
    private static final int SAMPLES_PER_SCREEN = 100;
    private float mAmplitude = 0;

    private Paint mRecordingPaint, mNotRecordingPaint;
    private int height = -1;
    private int width = -1;
    private boolean mIsStarted;

    private float[] lastPoints;

    private int oldWidth = -1, oldHeight = -1;
    private int mCurrentSample;
    private float amplitudeDivisor = 1;
    private float lx,ly, deltaX;


    private EvictingQueue<Float> mPointQueue;


    private int recordColor;

    private int notRecordingColor;


    public AmplitudeWaveFormView(Context context) {
        super(context);
        init();
    }

    public AmplitudeWaveFormView(Context context, AttributeSet attrs) {
        super(context, attrs);
        init();
    }

    public AmplitudeWaveFormView(Context context, AttributeSet attrs, int defStyleAttr) {
        super(context, attrs, defStyleAttr);
        init();
    }


    public void start() {
        mIsStarted = true;
    }


    public void stop() {
        mIsStarted = false;
    }
    public void updateAmplitude(float amplitude) {
        mAmplitude = amplitude;
        postInvalidate();
    }

    private void init() {
        recordColor = getResources().getColor(R.color.mint);
        notRecordingColor = getResources().getColor(R.color.alpine);
        mRecordingPaint = new Paint(Paint.ANTI_ALIAS_FLAG);
        mRecordingPaint.setStyle(Paint.Style.STROKE);
        mRecordingPaint.setStrokeWidth(5);
        mRecordingPaint.setColor(recordColor);

        mNotRecordingPaint = new Paint(Paint.ANTI_ALIAS_FLAG);
        mNotRecordingPaint.setStyle(Paint.Style.STROKE);
        mNotRecordingPaint.setStrokeWidth(5);
        mNotRecordingPaint.setColor(notRecordingColor);
    }

    @Override
    public void onSizeChanged(int w, int h, int oldw, int oldh) {
        height = h;
        width = w;
        ly = height;
        lx = width;
        deltaX =  (float)width / (float)SAMPLES_PER_SCREEN;
        amplitudeDivisor = ((float) MAX_AMPLITUDE / (float) height);

        mPointQueue = EvictingQueue.create(SAMPLES_PER_SCREEN * 4);
        if (lastPoints != null && lastPoints.length > 0) {
            float xScale = (float) width/oldWidth;
            float yScale = (float) height/oldHeight;
            Matrix matrix = new Matrix();
            matrix.setScale(xScale, yScale);
            matrix.mapPoints(lastPoints);
            mPointQueue.addAll(Floats.asList(lastPoints));
            ly = lastPoints[lastPoints.length-1];
            lx= lastPoints[lastPoints.length -2];
            lastPoints = null;
        }

    }

    @Override
    public void onRestoreInstanceState(Parcelable state) {
        if (state instanceof Bundle) {
            Bundle bundle = (Bundle) state;
            mCurrentSample = bundle.getInt("sample");
            lastPoints = bundle.getFloatArray("lines");
            oldWidth = bundle.getInt("oldWidth");
            oldHeight = bundle.getInt("oldHeight");
            state = ((Bundle) state).getParcelable("parent");

        }
        super.onRestoreInstanceState(state);
    }

    @Override
    public Parcelable onSaveInstanceState() {
        Bundle bundle = new Bundle();
        bundle.putFloatArray("lines", Floats.toArray(mPointQueue));
        bundle.putInt("sample", mCurrentSample);
        bundle.putParcelable("parent", super.onSaveInstanceState());
        bundle.putInt("oldWidth", width);
        bundle.putInt("oldHeight", height);
        return bundle;
    }


    @Override
    public void onDraw(Canvas canvas) {

        if (mIsStarted) {
            float x = lx + deltaX;
            float y = height - (mAmplitude / amplitudeDivisor);
            mPointQueue.add(lx);
            mPointQueue.add(ly);
            mPointQueue.add(x);
            mPointQueue.add(y);
            lastPoints = Floats.toArray(mPointQueue);
            lx = x;
            ly = y;
        }
        if (lastPoints != null && lastPoints.length > 0) {
            int len = mPointQueue.size() / 4 >= SAMPLES_PER_SCREEN ? SAMPLES_PER_SCREEN * 4 : mPointQueue.size();
            // Shift the whole line left so the newest point sits at the right edge of the view
            float translateX = width - lastPoints[lastPoints.length - 2];
            canvas.translate(translateX, 0);
            canvas.drawLines(lastPoints, 0, len, mRecordingPaint);
        }

        if (mCurrentSample <= SAMPLES_PER_SCREEN) {
            drawNotRecordingLine(canvas);
        }
        mCurrentSample++;
    }

    private void drawNotRecordingLine(Canvas canvas) {
        canvas.drawLine(0,height, width, height, mNotRecordingPaint);
    }
}
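
The activity side isn't shown above; a minimal sketch of the polling it needs might look like this (mRecorder and mWaveForm are assumed fields, with mRecorder already prepared and started):

// Hypothetical activity-side polling, assuming mRecorder is a started MediaRecorder
// and mWaveForm is the AmplitudeWaveFormView from the layout.
private final Handler mHandler = new Handler(Looper.getMainLooper());
private final Runnable mPollAmplitude = new Runnable() {
    @Override
    public void run() {
        // getMaxAmplitude() returns the maximum amplitude sampled since the last call
        mWaveForm.updateAmplitude(mRecorder.getMaxAmplitude());
        mHandler.postDelayed(this, 50); // roughly the 50 ms interval mentioned above
    }
};

private void startWaveForm() {
    mWaveForm.start();
    mHandler.post(mPollAmplitude);
}

private void stopWaveForm() {
    mHandler.removeCallbacks(mPollAmplitude);
    mWaveForm.stop();
}

Because getMaxAmplitude() reports the peak measured since the previous call, polling it on a fixed interval yields one sample per interval, which is what updateAmplitude() expects.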

