Stream live video from phone to phone using socket fd

Problem Description

I am new to Android programming and have found myself stuck. I have been researching various ways to stream live video from phone to phone and seem to have it mostly functional, except of course for the most important part: playing the stream. One phone appears to be sending the stream, but the second phone is not able to play it.

Here is the code for the playing side

public class VideoPlayback extends Activity implements Callback {
MediaPlayer mp;
private SurfaceView mPreview;
private SurfaceHolder holder;
private TextView mTextview;
public static final int SERVERPORT = 6775;
public static String SERVERIP="192.168.1.126";
Socket clientSocket;
private Handler handler = new Handler();
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);
    mPreview = (SurfaceView) findViewById(R.id.surfaceView1);
    mTextview = (TextView) findViewById(R.id.textView1);
    holder = mPreview.getHolder();
    holder.addCallback(this);
    holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    mTextview.setText("Attempting to connect");
    mp = new MediaPlayer();
    Thread t = new Thread(){
        public void run(){
            try {
                clientSocket = new Socket(SERVERIP,SERVERPORT);
                handler.post(new Runnable() {
                    @Override
                    public void run() {
                        mTextview.setText("Connected to server");
                    }
                });
                handler.post(new Runnable() {
                    @Override
                    public void run() {
                        try {
                            ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(clientSocket);
                            pfd.getFileDescriptor().sync();
                            mp.setDataSource(pfd.getFileDescriptor());
                            pfd.close();
                            mp.setDisplay(holder);
                            mp.prepareAsync();
                            mp.start();
                        } catch (IOException e) {
                            // TODO Auto-generated catch block
                            e.printStackTrace();
                        }

                    }
                });

            } catch (UnknownHostException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            } catch (IOException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
        }
    };
    t.start();
}

And here is the code for the streaming side

public class VideoStreaming extends Activity{
// User Interface Elements
VideoView mView;
TextView connectionStatus;
SurfaceHolder mHolder;
// Video variable
MediaRecorder recorder; 
// Networking variables
public static String SERVERIP="";
public static final int SERVERPORT = 6775;
private Handler handler = new Handler();
private ServerSocket serverSocket;  
/** Called when the activity is first created. */
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);
    // Define UI elements
    mView = (VideoView) findViewById(R.id.video_preview);
    connectionStatus = (TextView) findViewById(R.id.connection_status_textview);
    mHolder = mView.getHolder();
    mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    SERVERIP = "192.168.1.126";
    // Run new thread to handle socket communications
    Thread sendVideo = new Thread(new SendVideoThread());
    sendVideo.start();
}
 public class SendVideoThread implements Runnable{
    public void run(){
        // From Server.java
        try {
            if(SERVERIP!=null){
                handler.post(new Runnable() {
                    @Override
                    public void run() {
                        connectionStatus.setText("Listening on IP: " + SERVERIP);
                    }
                });
                serverSocket = new ServerSocket(SERVERPORT);
                while(true) {
                    //listen for incoming clients
                    Socket client = serverSocket.accept();
                    handler.post(new Runnable(){
                        @Override
                        public void run(){
                            connectionStatus.setText("Connected.");
                        }
                    });
                    try{
                            // Begin video communication
                            final ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(client);
                            handler.post(new Runnable(){
                                @Override
                                public void run(){
                                    recorder = new MediaRecorder();
                                    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
                                    recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);                 
                                    recorder.setOutputFile(pfd.getFileDescriptor());
                                    recorder.setVideoFrameRate(20);
                                    recorder.setVideoSize(176,144);
                                    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
                                    recorder.setPreviewDisplay(mHolder.getSurface());
                                    try {
                                        recorder.prepare();
                                    } catch (IllegalStateException e) {
                                        // TODO Auto-generated catch block
                                        e.printStackTrace();
                                    } catch (IOException e) {
                                        // TODO Auto-generated catch block
                                        e.printStackTrace();
                                    }
                                    recorder.start();
                                }
                            });
                    } catch (Exception e) {
                        handler.post(new Runnable(){
                            @Override
                            public void run(){
                                connectionStatus.setText("Oops.Connection interrupted. Please reconnect your phones.");
                            }
                        });
                        e.printStackTrace();
                    }
                }
            } else {
                handler.post(new Runnable() {
                    @Override
                    public void run(){
                        connectionStatus.setText("Couldn't detect internet connection.");
                    }
                });
            }
        } catch (Exception e){
            handler.post(new Runnable() {
                @Override
                public void run() {
                    connectionStatus.setText("Error");
                }
            });
            e.printStackTrace();
        }
        // End from server.java
    }
}

I receive the following error when trying to create the MediaPlayer:

05-24 16:25:39.360: ERROR/MediaPlayerService(88): offset error
05-24 16:25:39.360: ERROR/MediaPlayer(11895): Unable to to create media player
05-24 16:25:39.360: WARN/System.err(11895): java.io.IOException: setDataSourceFD failed.: status=0x80000000
05-24 16:25:39.360: WARN/System.err(11895):     at android.media.MediaPlayer.setDataSource(Native Method)
05-24 16:25:39.360: WARN/System.err(11895):     at android.media.MediaPlayer.setDataSource(MediaPlayer.java:811)
05-24 16:25:39.360: WARN/System.err(11895):     at com.conti.VideoPlayBack.VideoPlayback$1$2.run(VideoPlayback.java:63)
05-24 16:25:39.360: WARN/System.err(11895):     at android.os.Handler.handleCallback(Handler.java:587)
05-24 16:25:39.360: WARN/System.err(11895):     at android.os.Handler.dispatchMessage(Handler.java:92)
05-24 16:25:39.360: WARN/System.err(11895):     at android.os.Looper.loop(Looper.java:132)
05-24 16:25:39.360: WARN/System.err(11895):     at android.app.ActivityThread.main(ActivityThread.java:4025)
05-24 16:25:39.360: WARN/System.err(11895):     at java.lang.reflect.Method.invokeNative(Native Method)
05-24 16:25:39.360: WARN/System.err(11895):     at java.lang.reflect.Method.invoke(Method.java:491)
05-24 16:25:39.360: WARN/System.err(11895):     at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:841)
05-24 16:25:39.360: WARN/System.err(11895):     at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:599)
05-24 16:25:39.360: WARN/System.err(11895):     at dalvik.system.NativeStart.main(Native Method)

Does anyone have a fix for this? Thanks in advance!

Answer

I found an open source project that implements what I was trying to do. It streams video from the phone as an IP camera. Although it does not send video directly to another phone, it does broadcast the video for various devices to watch. The source code can be found at the following project page: http://code.google.com/p/ipcamera-for-android/.

With Android 4.4 there is another way to play a live MJPEG stream. The stream should be served by the other device over HTTP at a known address and port. Let's say we have a stream available at 192.168.0.101:8080. We can play the stream by adding a WebView to our layout. Then in our activity we do the following:

@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.mjpeg_activity);
    // Grab instance of WebView
    WebView webView = (WebView)findViewById(R.id.webViewStream);
    // Set page content for webview
    webView.loadData("<html><head><meta name='viewport' content='target-densitydpi=device-dpi,initial-scale=1,minimum-scale=1,user-scalable=yes'/></head><body><center><img src=\"http://192.168.0.101:8080/\" alt=\"Stream\" align=\"middle\"></center></body></html>", "text/html", null);
    webView.getSettings().setBuiltInZoomControls(true);
}

In the viewport content we tell the page to use the device's dpi. To let the user pinch-zoom on the page I added initial-scale=1, minimum-scale=1, user-scalable=yes. Initially the image is shown at its original size and cannot get any smaller; the user can zoom in on the image and back out to its original size. Removing the minimum scale gives the user complete control over zoom, but can make the image so small that you can't find it.
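
As a minimal sketch of that trade-off (not part of the original answer), here is the same loadData call with minimum-scale removed, so the user can also shrink the image below its original size; the stream address 192.168.0.101:8080 is the same placeholder used above.

// Same page as above, but without minimum-scale in the viewport, so the user
// can zoom out past the image's original size (placeholder stream address).
webView.loadData("<html><head><meta name='viewport' content='target-densitydpi=device-dpi,initial-scale=1,user-scalable=yes'/></head><body><center><img src=\"http://192.168.0.101:8080/\" alt=\"Stream\" align=\"middle\"></center></body></html>", "text/html", null);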
