How to draw a line between anchors on the plane with ARCore without an arFragment


Problem description

I'm building my app around this Agora ARCore Demo, which is based on Google's hello_ar_java sample app: https://github.com/google-ar/arcore-android-sdk/tree/master/samples/hello_ar_java

This application captures the user's taps and checks whether any planes were found in the scene. If so, it creates an anchor at that point.

I would like to draw a line between the various anchors.

Everything I find on the web uses Sceneform and an arFragment.

At the moment I have managed to use Sceneform without an arFragment, but the line is not showing, probably because of this call, which I don't know how to replace without an arFragment: nodeToAdd.setParent(arFragment.getArSceneView().getScene());
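Without an arFragment, the equivalent parent is the Scene of whichever ArSceneView the layout provides. A minimal sketch, assuming the activity owns its own ArSceneView (the field and view id below are my assumptions, not names from the original code):

```java
// Sketch only: assumes this Activity inflates an ArSceneView directly from
// its layout instead of obtaining one through an ArFragment.
ArSceneView arSceneView = findViewById(R.id.ar_scene_view); // hypothetical view id

AnchorNode nodeToAdd = new AnchorNode(anchor);
// ArSceneView.getScene() returns the same Scene that an ArFragment would
// hand back via arFragment.getArSceneView().getScene().
nodeToAdd.setParent(arSceneView.getScene());
```

Note that this only renders anything if Sceneform's ArSceneView is actually driving the frame; parenting a node this way will not make it show on top of the sample's own GLSurfaceView rendering.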

To bring Sceneform into my project I took a cue from the LineView project. Are there any other methods that don't use Sceneform?

This is how I handle it:

public void onDrawFrame(GL10 gl) {
    // Clear screen to notify driver it should not load any pixels from previous frame.
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

    if (mSession == null) {
        return;
    }
    // Notify ARCore session that the view size changed so that the perspective matrix and
    // the video background can be properly adjusted.
    mDisplayRotationHelper.updateSessionIfNeeded(mSession);

    try {
        // Obtain the current frame from ARSession. When the configuration is set to
        // UpdateMode.BLOCKING (it is by default), this will throttle the rendering to the
        // camera framerate.
        Frame frame = mSession.update();
        Camera camera = frame.getCamera();

        // Handle taps. Handling only one tap per frame, as taps are usually low frequency
        // compared to frame rate.
        MotionEvent tap = queuedSingleTaps.poll();
        if (tap != null && camera.getTrackingState() == TrackingState.TRACKING) {
            for (HitResult hit : frame.hitTest(tap)) {
                // Check if any plane was hit, and if it was hit inside the plane polygon
                Trackable trackable = hit.getTrackable();
                // Creates an anchor if a plane or an oriented point was hit.
                if ((trackable instanceof Plane && ((Plane) trackable).isPoseInPolygon(hit.getHitPose()))
                        || (trackable instanceof Point
                        && ((Point) trackable).getOrientationMode()
                        == Point.OrientationMode.ESTIMATED_SURFACE_NORMAL)) {
                    // Hits are sorted by depth. Consider only closest hit on a plane or oriented point.
                    // Cap the number of objects created. This avoids overloading both the
                    // rendering system and ARCore.
                    if (anchors.size() >= 250) {
                        anchors.get(0).detach();
                        anchors.remove(0);
                    }
                    // Adding an Anchor tells ARCore that it should track this position in
                    // space. This anchor is created on the Plane to place the 3D model
                    // in the correct position relative both to the world and to the plane.
                    anchors.add(hit.createAnchor());
                    break;
                }
            }
        }

        // Draw background.
        mBackgroundRenderer.draw(frame);

        // If not tracking, don't draw 3d objects.
        if (camera.getTrackingState() == TrackingState.PAUSED) {
            return;
        }

        // Get projection matrix.
        float[] projmtx = new float[16];
        camera.getProjectionMatrix(projmtx, 0, 0.1f, 100.0f);

        // Get camera matrix and draw.
        float[] viewmtx = new float[16];
        camera.getViewMatrix(viewmtx, 0);

        // Compute lighting from average intensity of the image.
        final float lightIntensity = frame.getLightEstimate().getPixelIntensity();

        if (isShowPointCloud()) {
            // Visualize tracked points.
            PointCloud pointCloud = frame.acquirePointCloud();
            mPointCloud.update(pointCloud);
            mPointCloud.draw(viewmtx, projmtx);

            // Application is responsible for releasing the point cloud resources after
            // using it.
            pointCloud.release();
        }

        // Check if we detected at least one plane. If so, hide the loading message.
        if (mMessageSnackbar != null) {
            for (Plane plane : mSession.getAllTrackables(Plane.class)) {
                if (plane.getType() == Plane.Type.HORIZONTAL_UPWARD_FACING
                        && plane.getTrackingState() == TrackingState.TRACKING) {
                    hideLoadingMessage();
                    break;
                }
            }
        }

        if (isShowPlane()) {
            // Visualize planes.
            mPlaneRenderer.drawPlanes(
                    mSession.getAllTrackables(Plane.class), camera.getDisplayOrientedPose(), projmtx);
        }

        // Visualize anchors created by touch.
        float scaleFactor = 1.0f;

        for (Anchor anchor : anchors) {
            if (anchor.getTrackingState() != TrackingState.TRACKING) {
                continue;
            }
            // Get the current pose of an Anchor in world space. The Anchor pose is updated
            // during calls to session.update() as ARCore refines its estimate of the world.
            anchor.getPose().toMatrix(mAnchorMatrix, 0);


            // Update and draw the model and its shadow.
            mVirtualObject.updateModelMatrix(mAnchorMatrix, mScaleFactor);
            mVirtualObjectShadow.updateModelMatrix(mAnchorMatrix, scaleFactor);
            mVirtualObject.draw(viewmtx, projmtx, lightIntensity);
            mVirtualObjectShadow.draw(viewmtx, projmtx, lightIntensity);
        }

        sendARViewMessage();
    } catch (Throwable t) {
        // Avoid crashing the application due to unhandled exceptions.
        Log.e(TAG, "Exception on the OpenGL thread", t);
    }
}

Then:

public void drawLineButton(View view) {
    for (Anchor anchor : anchors) {
        AnchorNode anchorNode = new AnchorNode(anchor);
        anchorNodeList.add(anchorNode);
        // This is the problem, imho: without an arFragment I don't know
        // what to parent the node to.
        // anchorNode.setParent(arFragment.getArSceneView().getScene());
        numberOfAnchors++;
    }

    if (numberOfAnchors == 2) {
        drawLine(anchorNodeList.get(0), anchorNodeList.get(1));
    }
}

Here the nodes exist and are real. I don't find any errors, and the lines don't show:

private void drawLine(AnchorNode node1, AnchorNode node2) {
    // Here the nodes exist and are real. I don't find any errors, and the lines don't show.
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            Vector3 point1 = node1.getWorldPosition();
            Vector3 point2 = node2.getWorldPosition();

            // First, find the vector extending between the two points and define
            // a look rotation in terms of this vector.
            final Vector3 difference = Vector3.subtract(point1, point2);
            final Vector3 directionFromTopToBottom = difference.normalized();
            final Quaternion rotationFromAToB =
                    Quaternion.lookRotation(directionFromTopToBottom, Vector3.up());

            MaterialFactory.makeOpaqueWithColor(getApplicationContext(), new Color(0, 255, 244))
                    .thenAccept(material -> {
                        Log.d(TAG, "drawLine inside .thenAccept");
                        // Then, create a rectangular prism with ShapeFactory.makeCube(),
                        // using the difference vector to extend it to the necessary length.
                        ModelRenderable model = ShapeFactory.makeCube(
                                new Vector3(.01f, .01f, difference.length()),
                                Vector3.zero(), material);
                        // Last, set the world rotation of the node to the rotation
                        // calculated earlier, and the world position to the midpoint
                        // between the given points.
                        nodeForLine = new Node();
                        nodeForLine.setParent(node1);
                        nodeForLine.setRenderable(model);
                        nodeForLine.setWorldPosition(Vector3.add(point1, point2).scaled(.5f));
                        nodeForLine.setWorldRotation(rotationFromAToB);
                    });
        }
    });
}

This is an example of my point1, point2, and directionFromTopToBottom in the drawLine() function:

point1: [x=0.060496617, y=-0.39098215, z=-0.21526277]
point2: [x=0.05695567, y=-0.39132282, z=-0.33304527]
directionFromTopToBottom: [x=0.030049745, y=0.0028910497, z=0.9995442]
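Those numbers are consistent with each other: normalizing point1 - point2 reproduces directionFromTopToBottom, so the vector math in drawLine() is fine and the missing piece is only the scene parenting/rendering. A quick stand-alone check (plain Java, no Sceneform; this just redoes the Vector3.subtract(...).normalized() arithmetic by hand):

```java
public class DirectionCheck {
    public static void main(String[] args) {
        float[] p1 = {0.060496617f, -0.39098215f, -0.21526277f};
        float[] p2 = {0.05695567f, -0.39132282f, -0.33304527f};

        // difference = point1 - point2
        float dx = p1[0] - p2[0], dy = p1[1] - p2[1], dz = p1[2] - p2[2];
        float len = (float) Math.sqrt(dx * dx + dy * dy + dz * dz);

        // The normalized direction should reproduce directionFromTopToBottom.
        System.out.printf("[x=%f, y=%f, z=%f]%n", dx / len, dy / len, dz / len);
    }
}
```

Running it prints a direction matching the directionFromTopToBottom values above (to the printed precision).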

Answer

Your code is not calling your drawLineButton() function, is it? Anyway, it looks like you're trying to use some things from Sceneform (MaterialFactory, ModelRenderable, etc.) while doing pure OpenGL rendering as done in hello_ar_java.

Mixing those will result in nothing good, since Sceneform uses Filament as its rendering engine, which may use OpenGL or Vulkan. So either go fully with Sceneform or fully with OpenGL (and understand how OpenGL and Android work).

Now, if you want to continue with the hello_ar_java sample, follow an OpenGL tutorial so that you can generate a vertex for each anchor and draw them with GL_LINES at the line width you like. A good OpenGL tutorial is https://learnopengl.com/. I recommend going through the whole Getting Started section; just keep in mind that it covers desktop OpenGL while Android uses OpenGL ES. There are some differences, but the computer graphics principles are the same.
