How to send an image from a Java WebSocket server for use in an HTML5 canvas?

2022-01-17 00:00:00 websocket image html5-canvas java

I have a WebSocket server implemented in Java. When a client connects, I want to send an image over this connection for the client to use in a canvas element. I have come up with the following server code:

public void onOpen(Connection connection) {
    try {
        BufferedImage image = ImageIO.read(new File("image.jpg"));
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(image, "jpg", baos);
        byte[] byteArray = baos.toByteArray();
        connection.sendMessage(byteArray, 0, byteArray.length);
    } catch (Exception e) {
        System.out.println("Error: "+e.getMessage());
    }
}

The client-side JavaScript looks like this:

onmessage : function(m) {
    if (m.data) {
        if (m.data instanceof Blob) {
            var blob = m.data;

            var bytes = new Uint8Array(blob);
            var image = context.createImageData(canvas.width, canvas.height);
            for (var i=0; i<bytes.length; i++) {
                image.data[i] = bytes[i];
            }
        }
    }
}

The connection works and the data is sent (blob.size has the correct value), but the image is not drawn onto the canvas. Firefox gives me the error message "TypeError: Value could not be converted to any of: HTMLImageElement, HTMLCanvasElement, HTMLVideoElement.".

I am aware that using WebSockets is not the best way to send an image to the client. After sending the image, the WebSocket is only used to send text messages.

What do I need to change for the image to be sent and applied to the canvas?

Resources used:

how to convert image to byte array in java?

Receive Blob in WebSocket and render as image in Canvas

Solution

Try converting the image to base64 before sending, for example:

function drawImage(imgString) {
    var canvas = document.getElementById("canvas");
    var ctx = canvas.getContext("2d");

    var image = new Image();
    // imgString should be a data URL, e.g. "data:image/jpeg;base64,..."
    image.src = imgString;
    image.onload = function() {
        // draw only after the image has finished loading
        ctx.drawImage(image, 0, 0);
    };
}

Here's a link on how to convert the image to base64 in Java
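
On the server side, the same onOpen method can build that base64 string before sending. Below is a minimal sketch, assuming Java 8+ (for java.util.Base64) and that the Connection class from the question also exposes a sendMessage(String) overload for text frames:

public void onOpen(Connection connection) {
    try {
        // Read the image and re-encode it as JPEG bytes, as in the question
        BufferedImage image = ImageIO.read(new File("image.jpg"));
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(image, "jpg", baos);

        // Base64-encode the bytes and prepend a data-URL header so the client
        // can assign the string directly to image.src
        String base64 = Base64.getEncoder().encodeToString(baos.toByteArray());
        // Assumption: sendMessage(String) sends a text frame on this API
        connection.sendMessage("data:image/jpeg;base64," + base64);
    } catch (Exception e) {
        System.out.println("Error: " + e.getMessage());
    }
}

Because the server already prepends the data: header, the message arrives on the client as a string rather than a Blob, and the onmessage handler can pass m.data straight to drawImage.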
