Android 6.0 (Marshmallow): How to play MIDI notes?


Problem Description


I'm creating an app that generates live instrument sounds and I'm planning on using the new MIDI API featured in Android Marshmallow (version 6.0). I've read the package overview document here http://developer.android.com/reference/android/media/midi/package-summary.html and I know how to generate MIDI notes, but I'm still unsure: how do I actually play these notes after I've generated their MIDI data?


Do I need a synthesizer program to play MIDI notes? If so, do I have to make my own, or is one provided by Android or a third party?


I am a novice with MIDI, so please be as descriptive as possible with your answer.


What I've tried so far: I've created a MidiManager object and opened an input port:

MidiManager m = (MidiManager)context.getSystemService(Context.MIDI_SERVICE); 
MidiInputPort inputPort = device.openInputPort(index);
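
For context, `device` here is an android.media.midi.MidiDevice. A rough sketch of how one might obtain it (the device and port indices are placeholder assumptions; MidiDeviceInfo and MidiDevice come from android.media.midi, Handler and Looper from android.os):

MidiManager m = (MidiManager) context.getSystemService(Context.MIDI_SERVICE);
MidiDeviceInfo[] infos = m.getDevices();
if (infos.length > 0) {
    m.openDevice(infos[0], new MidiManager.OnDeviceOpenedListener() {
        @Override
        public void onDeviceOpened(MidiDevice device) {
            if (device != null) {
                // Port 0 is just an example; inspect device.getInfo().getPorts() for the real layout.
                MidiInputPort inputPort = device.openInputPort(0);
            }
        }
    }, new Handler(Looper.getMainLooper()));
}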


Then, I've sent a test noteOn MIDI message to the port:

byte[] buffer = new byte[32];
int numBytes = 0;
int channel = 3; // MIDI channels 1-16 are encoded as 0-15.
buffer[numBytes++] = (byte)(0x90 + (channel - 1)); // note on
buffer[numBytes++] = (byte)60; // pitch is middle C
buffer[numBytes++] = (byte)127; // max velocity
int offset = 0;
// post is non-blocking
inputPort.send(buffer, offset, numBytes);


I've also set up a class to receive the MIDI note messages:

class MyReceiver extends MidiReceiver {
    public void onSend(byte[] data, int offset,
            int count, long timestamp) throws IOException {
        // parse MIDI or whatever
    }
}
MidiOutputPort outputPort = device.openOutputPort(index);
outputPort.connect(new MyReceiver());


Now, here's where I'm most confused. The use case of my app is to be an all-in-one composition & playback tool for making music. In other words, my app needs to contain or use a virtual MIDI device (such as an intent of another app's MIDI synthesizer). Unless someone has already made such a synthesizer, I must create one myself within my app's lifecycle. How do I actually convert a received MIDI noteOn() into sound coming out of my speakers? I'm especially confused because there also has to be a way to programmatically decide what type of instrument the note sounds like it's coming from: is this also done in a synthesizer?


MIDI support in Android Marshmallow is fairly new, so I haven't been able to find any tutorials or sample synthesizer apps online. Any insight is appreciated.

Recommended Answer


I haven't found any "official" way to control the internal synthesizer from Java code.

Probably the easiest option is to use the Android MIDI Driver (https://github.com/billthefarmer/mididriver) for the Sonivox synthesizer.


Get it as an AAR package (unzip the *.zip) and store the *.aar file somewhere in your workspace. The path doesn't really matter and it doesn't need to be inside your own app's folder structure but the "libs" folder inside your project could be a logical place.


With your Android project open in Android Studio:


File -> New -> New Module -> Import .JAR/.AAR Package -> Next -> Find and select the "MidiDriver-all-release.aar" and change the subproject name if you want. -> Finish
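
If you prefer editing the Gradle files by hand instead of using the import wizard, the equivalent dependency declaration (assuming you kept the module name suggested by the import step; adjust it to whatever you chose) is roughly:

// app/build.gradle -- the module name below is an assumption, match it to your import
dependencies {
    compile project(':MidiDriver-all-release')  // use "implementation" on newer Gradle versions
}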


Wait for Gradle to do its magic, then go to your "app" module's settings (your own app project's settings), open the "Dependencies" tab and add (with the green "+" sign) the MIDI Driver as a module dependency. Now you have access to the MIDI Driver:

import org.billthefarmer.mididriver.MidiDriver;
   ...
MidiDriver midiDriver = new MidiDriver();


Without having to worry about the NDK and C++ at all, you have these Java methods available:

// Not really necessary. Receives a callback when/if start() has succeeded.
midiDriver.setOnMidiStartListener(listener);
// Starts the driver.
midiDriver.start();
// Receives the driver's config info.
midiDriver.config();
// Stops the driver.
midiDriver.stop();
// Just calls write().
midiDriver.queueEvent(event);
// Sends a MIDI event to the synthesizer.
midiDriver.write(event);


A very basic "proof of concept" for playing and stopping a note could look something like this:

package com.example.miditest;

import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;
import android.widget.Button;

import org.billthefarmer.mididriver.MidiDriver;

public class MainActivity extends AppCompatActivity implements MidiDriver.OnMidiStartListener,
        View.OnTouchListener {

    private MidiDriver midiDriver;
    private byte[] event;
    private int[] config;
    private Button buttonPlayNote;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        buttonPlayNote = (Button)findViewById(R.id.buttonPlayNote);
        buttonPlayNote.setOnTouchListener(this);

        // Instantiate the driver.
        midiDriver = new MidiDriver();
        // Set the listener.
        midiDriver.setOnMidiStartListener(this);
    }

    @Override
    protected void onResume() {
        super.onResume();
        midiDriver.start();

        // Get the configuration.
        config = midiDriver.config();

        // Print out the details.
        Log.d(this.getClass().getName(), "maxVoices: " + config[0]);
        Log.d(this.getClass().getName(), "numChannels: " + config[1]);
        Log.d(this.getClass().getName(), "sampleRate: " + config[2]);
        Log.d(this.getClass().getName(), "mixBufferSize: " + config[3]);
    }

    @Override
    protected void onPause() {
        super.onPause();
        midiDriver.stop();
    }

    @Override
    public void onMidiStart() {
        Log.d(this.getClass().getName(), "onMidiStart()");
    }

    private void playNote() {

        // Construct a note ON message for the middle C at maximum velocity on channel 1:
        event = new byte[3];
        event[0] = (byte) (0x90 | 0x00);  // 0x90 = note On, 0x00 = channel 1
        event[1] = (byte) 0x3C;  // 0x3C = middle C
        event[2] = (byte) 0x7F;  // 0x7F = the maximum velocity (127)

        // Internally this just calls write() and can be considered obsoleted:
        //midiDriver.queueEvent(event);

        // Send the MIDI event to the synthesizer.
        midiDriver.write(event);

    }

    private void stopNote() {

        // Construct a note OFF message for the middle C at minimum velocity on channel 1:
        event = new byte[3];
        event[0] = (byte) (0x80 | 0x00);  // 0x80 = note Off, 0x00 = channel 1
        event[1] = (byte) 0x3C;  // 0x3C = middle C
        event[2] = (byte) 0x00;  // 0x00 = the minimum velocity (0)

        // Send the MIDI event to the synthesizer.
        midiDriver.write(event);

    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {

        Log.d(this.getClass().getName(), "Motion event: " + event);

        if (v.getId() == R.id.buttonPlayNote) {
            if (event.getAction() == MotionEvent.ACTION_DOWN) {
                Log.d(this.getClass().getName(), "MotionEvent.ACTION_DOWN");
                playNote();
            }
            if (event.getAction() == MotionEvent.ACTION_UP) {
                Log.d(this.getClass().getName(), "MotionEvent.ACTION_UP");
                stopNote();
            }
        }

        return false;
    }
}


The layout file just has one button that plays the predefined note when held down and stops it when released:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:paddingBottom="@dimen/activity_vertical_margin"
    android:paddingLeft="@dimen/activity_horizontal_margin"
    android:paddingRight="@dimen/activity_horizontal_margin"
    android:paddingTop="@dimen/activity_vertical_margin"
    tools:context="com.example.miditest.MainActivity"
    android:orientation="vertical">

    <Button
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Play a note"
        android:id="@+id/buttonPlayNote" />
</LinearLayout>


It is actually this simple. The code above could well be a starting point for a touch piano app with 128 selectable instruments, very decent latency and a proper "note off" functionality which many apps lack.


As for choosing the instrument: You'll just need to send a MIDI "program change" message to the channel on which you intend to play to choose one of the 128 sounds in the General MIDI soundset. But that's related to the details of MIDI and not to the usage of the library.
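
To illustrate (this sketch is my addition, not part of the original answer): a program change is a two-byte message, the status byte 0xC0 ORed with the channel number, followed by the program number (0-127). With the same MidiDriver instance it could look like this:

// Sketch: switch channel 1 (index 0) to General MIDI program 25, "Acoustic Guitar (nylon)".
byte[] programChange = new byte[2];
programChange[0] = (byte) (0xC0 | 0x00); // 0xC0 = program change, 0x00 = channel 1
programChange[1] = (byte) 24;            // programs are zero-based: 24 = GM program 25
midiDriver.write(programChange);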


Likewise, you'll probably want to abstract away the low-level details of MIDI so that you can easily play a specific note on a specific channel with a specific instrument at a specific velocity for a specific duration. For that you might find some clues in the open source Java and MIDI related applications and libraries made so far.
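
As one example of such an abstraction (a hypothetical helper, reusing the midiDriver field from the activity above together with android.os.Handler and android.os.Looper), a note with a duration could be scheduled like this:

// Sketch: play 'pitch' on 'channel' at 'velocity', then send note off after 'durationMs'.
private final Handler noteHandler = new Handler(Looper.getMainLooper());

private void playTimedNote(final int channel, final int pitch, int velocity, long durationMs) {
    midiDriver.write(new byte[]{(byte) (0x90 | channel), (byte) pitch, (byte) velocity});
    noteHandler.postDelayed(new Runnable() {
        @Override
        public void run() {
            midiDriver.write(new byte[]{(byte) (0x80 | channel), (byte) pitch, (byte) 0});
        }
    }, durationMs);
}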


This approach doesn't require Android 6.0 by the way. And at the moment only 4.6 % of devices visiting the Play Store run Android 6.x so there wouldn't be much audience for your app.


Of course, if you want to use the android.media.midi package, you could then use the library to implement an android.media.midi.MidiReceiver that receives the MIDI events and plays them on the internal synthesizer. Google already has some demo code that plays notes with square and sawtooth waves. Just replace that with the internal synthesizer.
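
A minimal sketch of that bridge, assuming the MidiDriver instance shown earlier (this is not Google's demo code), could be:

import android.media.midi.MidiReceiver;
import java.io.IOException;
import org.billthefarmer.mididriver.MidiDriver;

// Forwards incoming android.media.midi events to the Sonivox synthesizer.
class SonivoxReceiver extends MidiReceiver {
    private final MidiDriver midiDriver;

    SonivoxReceiver(MidiDriver midiDriver) {
        this.midiDriver = midiDriver;
    }

    @Override
    public void onSend(byte[] data, int offset, int count, long timestamp) throws IOException {
        // MidiDriver.write() expects the raw MIDI bytes only, so copy the relevant slice.
        // The timestamp is ignored in this sketch.
        byte[] event = new byte[count];
        System.arraycopy(data, offset, event, 0, count);
        midiDriver.write(event);
    }
}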


Another option could be to check the status of FluidSynth ports to Android. I guess there might be something available.

EDIT: Other possibly interesting libraries:

  • port of Java's javax.sound.midi package for abstracting the low level MIDI technical details
  • USB MIDI Driver for connecting to a digital piano/keyboard with a USB MIDI connector
  • MIDI over Bluetooth LE driver for connecting wirelessly to a digital piano/keyboard that supports MIDI over Bluetooth LE (like e.g. some recent Roland and Dexibell digital pianos)
  • JFugue Music library port for Android for further abstracting the MIDI details and instead thinking in terms of music theory

