Trying to render javafx WebView to offscreen buffer or FBO


Question


The ultimate goal is to be able to record the output of a WebView at 30fps or better, perhaps by setting up an FBO for javafx? I could then pull out frames at whatever framerate I wanted.

I've poked around some and I came across UploadingPainter in ViewScene, which makes me think that this is possible. The struggle is that this is seemingly under the hood and somewhat new to me.

Anyone know of a way to make something like this work?

This is the code that I came across during debugging:

@Override
public void setStage(GlassStage stage) {
    super.setStage(stage);
    if (stage != null) {
        WindowStage wstage  = (WindowStage)stage;
        if (wstage.needsUpdateWindow() || GraphicsPipeline.getPipeline().isUploading()) {
            if (Pixels.getNativeFormat() != Pixels.Format.BYTE_BGRA_PRE ||
                ByteOrder.nativeOrder() != ByteOrder.LITTLE_ENDIAN) {
                throw new UnsupportedOperationException(UNSUPPORTED_FORMAT);
            }
            painter = new UploadingPainter(this);
        } else {
            painter = new PresentingPainter(this);
        }
        painter.setRoot(getRoot());
        paintRenderJob = new PaintRenderJob(this, PaintCollector.getInstance().getRendered(), painter);
    }
}

Solution

Here is an example of capturing an animation in a WebView.

The images captured from the web view are placed in a Pagination control just so that they are easy to review. You could instead use SwingFXUtils and ImageIO to write them out to files if you wish. If you want to get the resultant images into a buffer, you could use each image's PixelReader.
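For the file-writing route, here is a minimal sketch. The `SnapshotWriter` class, `writeFrame` helper, and frame size are made up for illustration; in the real capture loop you would first convert each JavaFX `Image` with `SwingFXUtils.fromFXImage(fxImage, null)` (from the `javafx.embed.swing` package). A blank `BufferedImage` stands in here so the sketch runs without the JavaFX toolkit:

```java
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.File;
import java.nio.file.Files;

public class SnapshotWriter {

    // Writes one frame as a zero-padded PNG, e.g. frame_0003.png.
    // In the capture loop, the BufferedImage would come from
    // SwingFXUtils.fromFXImage(fxImage, null).
    static File writeFrame(BufferedImage frame, File dir, int index) throws Exception {
        File out = new File(dir, String.format("frame_%04d.png", index));
        ImageIO.write(frame, "png", out);
        return out;
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for a converted WebView snapshot (hypothetical 186x124 frame).
        BufferedImage frame = new BufferedImage(186, 124, BufferedImage.TYPE_INT_ARGB);
        File dir = Files.createTempDirectory("captures").toFile();
        File out = writeFrame(frame, dir, 0);
        System.out.println(out.getName()); // prints frame_0000.png
    }
}
```

The zero-padded index keeps the frames in order for tools (ffmpeg, for example) that assemble image sequences into video.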

It doesn't quite work the way I wanted it to. I wanted to snapshot the WebView without placing it in a visible stage. Taking snapshots of nodes that are not in a Stage works fine for every other node type in JavaFX (as far as I know); however, for some odd reason, it does not work for WebView. So the sample actually creates a new Stage behind the display window that shows the image sequence for the animation capture result. I'm aware that's not exactly what you want, but it is what it is...

import javafx.animation.AnimationTimer;
import javafx.application.Application;
import javafx.beans.property.*;
import javafx.collections.*;
import javafx.concurrent.Worker;
import javafx.geometry.Insets;
import javafx.scene.Scene;
import javafx.scene.SnapshotParameters;
import javafx.scene.control.*;
import javafx.scene.image.*;
import javafx.scene.layout.*;
import javafx.scene.web.WebView;
import javafx.stage.Stage;

public class WebViewAnimationCaptor extends Application {

    private static final String CAPTURE_URL =
            "https://upload.wikimedia.org/wikipedia/commons/d/dd/Muybridge_race_horse_animated.gif";

    private static final int N_CAPS_PER_SECOND = 10;
    private static final int MAX_CAPTURES = N_CAPS_PER_SECOND * 5;
    private static final int W = 186, H = 124;

    class CaptureResult {
        ObservableList<Image> images = FXCollections.observableArrayList();
        DoubleProperty progress = new SimpleDoubleProperty();
    }

    @Override public void start(Stage stage) {
        CaptureResult captures = captureAnimation(CAPTURE_URL);
        Pane captureViewer = createCaptureViewer(captures);

        stage.setScene(new Scene(captureViewer, W + 40, H + 80));
        stage.show();
    }

    private StackPane createCaptureViewer(CaptureResult captures) {
        ProgressIndicator progressIndicator = new ProgressIndicator();
        progressIndicator.progressProperty().bind(captures.progress);
        progressIndicator.setPrefSize(W, H);

        StackPane stackPane = new StackPane(progressIndicator);
        stackPane.setPadding(new Insets(10));
        if (captures.progress.get() >= 1.0) {
            stackPane.getChildren().setAll(
                createImagePages(captures.images)
            );
        } else {
            captures.progress.addListener((observable, oldValue, newValue) -> {
                if (newValue.doubleValue() >= 1.0) {
                    stackPane.getChildren().setAll(
                            createImagePages(captures.images)
                    );
                }
            });
        }

        return stackPane;
    }

    private Pagination createImagePages(ObservableList<Image> captures) {
        Pagination pagination = new Pagination();
        pagination.setPageFactory(param -> {
            ImageView currentImage = new ImageView();
            currentImage.setImage(
                    param < captures.size()
                            ? captures.get(param)
                            : null
            );

            StackPane pageContent = new StackPane(currentImage);
            pageContent.setPrefSize(W, H);

            return pageContent;
        });

        pagination.setCurrentPageIndex(0);
        pagination.setPageCount(captures.size());
        pagination.setMaxPageIndicatorCount(captures.size());

        return pagination;
    }

    private CaptureResult captureAnimation(final String url) {
        CaptureResult captureResult = new CaptureResult();

        WebView webView = new WebView();
        webView.getEngine().load(url);
        webView.setPrefSize(W, H);

        Stage captureStage = new Stage();
        captureStage.setScene(new Scene(webView, W, H));
        captureStage.show();

        SnapshotParameters snapshotParameters = new SnapshotParameters();
        captureResult.progress.set(0);

        AnimationTimer timer = new AnimationTimer() {
            long last = 0;

            @Override
            public void handle(long now) {
                if (now > last + 1_000_000_000.0 / N_CAPS_PER_SECOND) {
                    last = now;
                    captureResult.images.add(webView.snapshot(snapshotParameters, null));
                    captureResult.progress.setValue(
                            captureResult.images.size() * 1.0 / MAX_CAPTURES
                    );
                }

                if (captureResult.images.size() > MAX_CAPTURES) {
                    captureStage.hide();
                    this.stop();
                }
            }
        };

        webView.getEngine().getLoadWorker().stateProperty().addListener((observable, oldValue, newValue) -> {
            if (Worker.State.SUCCEEDED.equals(newValue)) {
                timer.start();
            }
        });

        return captureResult;
    }

    public static void main(String[] args) { launch(args); }
}

To fine-tune the animation sequence capture, you could review this info on AnimationTimers in JavaFX.
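The capture cadence in the sample comes down to the interval check inside `handle(long now)`. That gating arithmetic can be pulled out and tested on its own; a plain-Java sketch (the `FrameGate` name is made up, and integer nanoseconds are used instead of the sample's floating-point comparison to avoid drift over long runs):

```java
public class CaptureGateDemo {

    // Allows at most one capture per interval. Illustrative name;
    // in the sample, this logic lives inside AnimationTimer.handle().
    static class FrameGate {
        private final long intervalNanos;
        private long last = 0;

        FrameGate(int capturesPerSecond) {
            this.intervalNanos = 1_000_000_000L / capturesPerSecond;
        }

        boolean shouldCapture(long now) {
            if (now - last >= intervalNanos) {
                last = now;
                return true;
            }
            return false;
        }
    }

    public static void main(String[] args) {
        FrameGate gate = new FrameGate(10); // 10 captures/second, as in the sample
        System.out.println(gate.shouldCapture(100_000_000L)); // true: one interval elapsed
        System.out.println(gate.shouldCapture(150_000_000L)); // false: only 50ms since last
        System.out.println(gate.shouldCapture(200_000_000L)); // true: next interval elapsed
    }
}
```

Note that `handle` runs once per pulse (normally capped at 60fps), so the achievable capture rate is bounded by the pulse rate regardless of `N_CAPS_PER_SECOND`.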

If you need to make the thing "headless", so that a visible stage is not required, you could try this gist by danialfarid which performs "Java Image Capture, HTML Snapshot, HTML to image" (though I did not write the linked gist and have not tried it).


Headless is key, in my case. The (linux) machines in question run in a server farm, totally headless. As for the gist, I see a show() in there, but I'll take a closer look to make sure that I didn't overlook something.

The gist is based upon the Monocle glass rendering toolkit for JavaFX systems. This toolkit supports software-based headless rendering on any system.

From the Monocle Documentation:

The headless port does nothing. It is for when you want to run JavaFX with no graphics, input or platform dependencies. Rendering still happens, it just doesn’t show up on the screen.

The headless port uses the LinuxInputDeviceRegistry implementation of InputDeviceRegistry. However the headless port does not access any actual Linux devices or any native APIs at all; it uses the Linux input registry in device simulation mode. This allows Linux device input to be simulated even on non-Linux platforms. The tests in tests/system/src/test/java/com/sun/glass/ui/monocle/input make extensive use of this feature.
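A sketch of how the Monocle headless stack is typically selected, assuming the commonly used system properties `glass.platform`, `monocle.platform`, and `prism.order` (they must be set before the JavaFX toolkit initializes; the `HeadlessLauncher` class name and the commented launch call are illustrative):

```java
public class HeadlessLauncher {
    public static void main(String[] args) {
        // Select the Monocle Glass implementation and its Headless platform,
        // and force the software (sw) Prism pipeline so no GPU is needed.
        System.setProperty("glass.platform", "Monocle");
        System.setProperty("monocle.platform", "Headless");
        System.setProperty("prism.order", "sw");

        // Then launch the capture application as usual, e.g.:
        // javafx.application.Application.launch(WebViewAnimationCaptor.class, args);
        System.out.println(System.getProperty("glass.platform")); // prints Monocle
    }
}
```

Equivalently, the same switches can be passed on the command line: `java -Dglass.platform=Monocle -Dmonocle.platform=Headless -Dprism.order=sw ...`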

If the JavaFX Monocle based approach ends up not working out for you, you could consider another (not JavaFX related) headless HTML rendering kit, such as PhantomJS.
