I've already solved the transport problem. At this point I can send an image and see it in another node. For this I'm using two ROS nodes on Android (publisher / subscriber). I implemented it in two ways, with CompressedImage and with Image:

public void updateCompressedImage(byte[] data, int w, int h, ImageView i){
    Preconditions.checkNotNull(data);

    Time currentTime = node.getCurrentTime();
    String frameId = "camera";

    // Build a sensor_msgs/CompressedImage and fill in the header.
    CompressedImage image = node.getTopicMessageFactory().newFromType(CompressedImage._TYPE);
    image.setFormat("jpeg");
    image.getHeader().setStamp(currentTime);
    image.getHeader().setFrameId(frameId);

    // Copy the JPEG bytes from the camera callback into the message buffer.
    try {
        stream.write(data);
    } catch (IOException e) {
        e.printStackTrace();
        throw new RosRuntimeException(e);
    }

    image.setData(stream.buffer().copy());

    // Display the image locally so it can be compared with what the subscriber shows.
    i.setImageBitmap(BitmapFactory.decodeByteArray(stream.buffer().array(),
            0, stream.buffer().array().length));
    stream.buffer().clear();

    publisher.publish(image);
}

public void updateImage(byte[] data, int w, int h, int s, ImageView i){
    Preconditions.checkNotNull(data);

    // Build a sensor_msgs/Image carrying raw (uncompressed) pixel data.
    Image image = node.getTopicMessageFactory().newFromType(Image._TYPE);
    image.setHeight(h);
    image.setWidth(w);
    image.setStep(s);           // row length in bytes
    image.setEncoding("rgb8");

    Time currentTime = node.getCurrentTime();
    String frameId = "android_camera";
    image.getHeader().setStamp(currentTime);
    image.getHeader().setFrameId(frameId);

    // Copy the raw bytes from the camera callback into the message buffer.
    try {
        stream.write(data);
    } catch (IOException e) {
        e.printStackTrace();
        throw new RosRuntimeException(e);
    }

    image.setData(stream.buffer().copy());

    // Display the image locally so it can be compared with what the subscriber shows.
    i.setImageBitmap(BitmapFactory.decodeByteArray(stream.buffer().array(),
            0, stream.buffer().array().length));
    stream.buffer().clear();

    publisher.publish(image);
}

The stream is:

ChannelBufferOutputStream stream = new ChannelBufferOutputStream(MessageBuffers.dynamicBuffer());
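
For completeness, node, publisher and stream come from the usual rosjava node setup. A minimal sketch of how they could be created in onStart() (the class name, node name and topic name are only examples, not my exact code):

import org.jboss.netty.buffer.ChannelBufferOutputStream;
import org.ros.internal.message.MessageBuffers;
import org.ros.namespace.GraphName;
import org.ros.node.AbstractNodeMain;
import org.ros.node.ConnectedNode;
import org.ros.node.topic.Publisher;

public class CameraPublisherNode extends AbstractNodeMain {

    private ConnectedNode node;
    private Publisher<sensor_msgs.Image> publisher;
    private ChannelBufferOutputStream stream;

    @Override
    public GraphName getDefaultNodeName() {
        return GraphName.of("android/camera_publisher");
    }

    @Override
    public void onStart(ConnectedNode connectedNode) {
        node = connectedNode;
        // One publisher per message type; a second publisher would be needed for CompressedImage.
        publisher = connectedNode.newPublisher("android/camera/image", sensor_msgs.Image._TYPE);
        stream = new ChannelBufferOutputStream(MessageBuffers.dynamicBuffer());
    }
}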

The i.setImageBitmap(...) call is there to check whether the images are equal. With sensor_msgs/CompressedImage the images are equal, but with sensor_msgs/Image they are not. I suspect the problem is in the encoding of the image.
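
For reference, a rough sketch of what the subscriber side of that comparison could look like, i.e. turning a received CompressedImage back into a Bitmap for an ImageView (connectedNode, imageView and the topic name are placeholders, assuming rosjava's Subscriber and MessageListener API):

Subscriber<sensor_msgs.CompressedImage> subscriber =
        connectedNode.newSubscriber("android/camera/image/compressed",
                sensor_msgs.CompressedImage._TYPE);
subscriber.addMessageListener(new MessageListener<sensor_msgs.CompressedImage>() {
    @Override
    public void onNewMessage(sensor_msgs.CompressedImage message) {
        // The data field is a ChannelBuffer; copy the readable bytes out.
        ChannelBuffer buffer = message.getData();
        byte[] bytes = new byte[buffer.readableBytes()];
        buffer.getBytes(buffer.readerIndex(), bytes);
        // JPEG bytes decode directly with BitmapFactory.
        final Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
        // The ImageView must be updated on the UI thread.
        imageView.post(new Runnable() {
            @Override
            public void run() {
                imageView.setImageBitmap(bitmap);
            }
        });
    }
});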

If anyone can help, I would appreciate it.

Best regards

Paulo