Andrew.A's profile - activity

2020-12-02 04:20:20 -0500 marked best answer Re-stream video capture

I currently have a robot running Ubuntu with ROS installed and an Asus Xtion Pro Live attached. What I'm trying to do is capture the video, push it to a server, and have the server re-stream it so that other clients connecting to my server can view the stream. The server runs Windows, has a public IP, and does not have ROS installed.

What I have now: while the robot is on the same Wi-Fi network as a client, I run mjpeg_server and have the client connect directly to the robot to view the stream. What I'm trying to overcome is that if I take the robot outside on a 3G/4G dongle, or want to view the video from outside the network, I can't connect to it directly. So I want the robot to push the stream to my server, and have clients connect to the server to view the stream through a web app. The server currently runs a web app on Tomcat, using Java, JSP, JavaScript, jQuery and MSSQL. I want to extend that web app to allow viewing the video stream from the Asus Xtion on my robot, without having to install ROS on the server if possible.

I roughly know that there will be three things I have to do:

  1. Stream the Asus Xtion camera feed to the server,
  2. Receive the stream on my server somehow, and
  3. Have my server re-stream the video.

But I don't really have any idea on how to go about doing this. Can anyone help?

EDIT: I know that with VLC I can pull a video stream and re-stream it. However, this won't work for me because I won't be able to pull from the robot; it doesn't have a public IP, so it wouldn't be accessible.
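One possible approach (a rough sketch, not something I've tested): since the robot can still open outbound connections even without a public IP, it can read its own local mjpeg_server stream and push it out to the server with a tool like ffmpeg. The RTMP ingest URL below is a placeholder and assumes the server runs something that can accept and re-serve the stream (for example an nginx-rtmp module or another media server):

# On the robot: serve the camera topic locally, then push it out to the server.
rosrun mjpeg_server mjpeg_server _port:=8181
ffmpeg -i "http://localhost:8181/stream?topic=/camera/rgb/image_raw" \
       -c:v libx264 -preset veryfast -tune zerolatency \
       -f flv "rtmp://my.server.example/live/robot"   # placeholder ingest URL

The web app would then embed whatever playback URL the media server exposes, so nothing ROS-related needs to be installed on the Windows server.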

2017-04-24 01:58:31 -0500 marked best answer roslaunch openni_launch openni.launch no devices connected

I'm just starting out trying to use ROS with my PrimeSense device (Asus Xtion Pro Live) on Ubuntu 12.10 and ROS Groovy. I'm trying to run

roslaunch openni_launch openni.launch

but I'm faced with

[ INFO] [1375408552.874803204]: No devices connected.... waiting for devices to be connected

This is the output of lsusb:

Bus 001 Device 008: ID 1d27:0601  
Bus 005 Device 002: ID 413c:2003 Dell Computer Corp. Keyboard
Bus 005 Device 003: ID 0461:4d22 Primax Electronics, Ltd 
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 005 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 006 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 007 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 008 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub

It seems like my PrimeSense device isn't recognised properly (first row above, listed with no description). Where can I download the appropriate drivers? I'm a bit confused, because I can run ./SimpleViewer from the OpenNI samples (doesn't that mean the drivers are installed?). I've read a few other posts regarding this, including

http://answers.ros.org/question/60766/kinect-not-recognized-with-opennilaunch/

and this

http://answers.ros.org/question/60562/ubuntu-12042-and-openni_launch-not-detecting-kinect-after-update/

but nothing works. I've also tried looking up the Asus drivers download page here

http://support.asus.com/Download.aspx?SLanguage=en&m=Xtion+PRO+LIVE&p=19&s=11

and noticed that it's quite outdated (OpenNI v1.5 and NiTE v1.5, when v2.2 is out). I already have OpenNI v2.2 installed. Should I be installing NiTE 2.2? I ask because it's categorized as a Middleware Library.

Thanks!
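One thing worth checking, separate from the driver-version question above: on some systems the 1d27:0601 device shows up in lsusb but OpenNI can't open it because of USB permissions. A minimal udev rule sketch (the file name and group below are just examples):

# /etc/udev/rules.d/55-primesense.rules  -- example file name
# Allow non-root access to devices with the PrimeSense/ASUS vendor ID 1d27.
SUBSYSTEM=="usb", ATTR{idVendor}=="1d27", MODE:="0666", OWNER:="root", GROUP:="video"

After adding the rule, run sudo udevadm control --reload-rules and replug the device before launching openni.launch again.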

2016-05-08 15:08:35 -0500 marked best answer Device is in safe mode. Cannot start any stream!

I'm trying to use my Asus Xtion Pro Live on Ubuntu 12.04, running ROS Groovy. When I try to use the command roslaunch openni_launch openni.launch, I'm faced with this error:

Number devices connected: 1
[ INFO] [1401772289.477038700]: 1. device on bus 001:02 is a Xtion Pro (600) from ASUS (1d27) with serial id ''
[ INFO] [1401772289.480929348]: Searching for device with index = 1
[ INFO] [1401772289.535944795]: No matching device found.... waiting for devices. Reason: openni_wrapper::OpenNIDevice::OpenNIDevice(xn::Context&, const xn::NodeInfo&, const xn::NodeInfo&, const xn::NodeInfo&) @ /home/andrew/catkin_ws/src/openni_camera/src/openni_device.cpp @ 99 : creating depth generator failed. Reason: Device is in safe mode. Cannot start any stream!

I'm very puzzled because it was working perfectly fine just yesterday and I did not change anything at all. Running rosrun openni_camera openni_node produces the same error. I have OpenNI v1.5.7.10 installed, and Sensor-Bin v5.1.6.6 installed. I'm using the rollback_usb branch for openni_camera. Would anyone know what happened? Did my Asus Xtion Pro Live hardware itself malfunction?
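For what it's worth, this kind of error sometimes clears once the device is re-enumerated. A sketch of forcing that from the command line instead of physically replugging (the port id "1-2" is a placeholder; the real id can be found under /sys/bus/usb/devices/ or in dmesg):

# Unbind and rebind the Xtion's USB port to force re-enumeration ("1-2" is a placeholder).
echo '1-2' | sudo tee /sys/bus/usb/drivers/usb/unbind
sleep 2
echo '1-2' | sudo tee /sys/bus/usb/drivers/usb/bind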

2015-11-12 06:38:09 -0500 received badge  Notable Question (source)
2015-11-12 06:38:09 -0500 received badge  Famous Question (source)
2015-06-19 01:05:04 -0500 marked best answer High level design involving a network of robots and a server

I'm having some trouble trying to come up with a design for a network of robots and a server. I'm trying to have a setup such that:

  • I have a central command center (like a server), where I can monitor and control these robots
  • I can use the server to give commands to each robot to do different things
  • The robots send responses to those commands back to the server

For example, say I have two robots running Ubuntu, and I want to monitor and control them through a web server. Say the server runs Tomcat and has a public IP, so it can be accessed from anywhere. For security reasons, let's say I'll have to log in with a valid username and password to access my robots. In this case, I would write a .jsp page or some sort of web app to provide a user interface for interacting with the robots. Monitoring could mean that I select robot 1 and view what the camera mounted on it is currently capturing (say it's running mjpeg_server). It could also mean that the robot constantly sends information back to the server about its status, such as whether it is online or offline, its current speed, its battery level, and so on, and the server displays that information in the user interface. Control could mean that I can toggle between exploration mode and remote control mode. In remote control mode, I'd have buttons I can press to move the robot.

I'm looking to build something like this, but don't really have an idea of how to start. Could I just use ROS's publisher/subscriber and service system for this kind of communication? The issue is that my server runs Windows, and as far as I know ROS is only experimental on Windows, so would it be better for the server to be independent of ROS? If so, how would I link the server with the robots? Would I create some kind of TCP connection where the server sends commands, and write a ROS package on the robot to interpret those commands and carry them out?

I'm not looking for one exact answer, but more of a discussion on how to achieve such a set up, and to explore the options I have to do this. Any ideas, opinions, advice, warnings and reading resources would be appreciated!

2015-06-14 21:26:52 -0500 received badge  Famous Question (source)
2015-03-24 22:32:56 -0500 marked best answer Installing an older version of a ros package by source

I used to run version 1.8.9 of openni_camera, and I'd like to revert back to an older version of it. The source is found here.

This is actually a commit between versions 1.8.8 and 1.8.9. Does anyone know how I can go about installing this package from source?

I've previously installed openni_camera as a deb package, and have uninstalled it using sudo apt-get --purge remove ros-groovy-openni-camera. What I've tried so far is putting the source folder into my catkin workspace (~/catkin_ws/src), and using catkin_make in ~/catkin_ws, but I'm faced with an error:

$ catkin_make
Base path: /home/andrew/catkin_ws
Source space: /home/andrew/catkin_ws/src
Build space: /home/andrew/catkin_ws/build
Devel space: /home/andrew/catkin_ws/devel
Install space: /home/andrew/catkin_ws/install
####
#### Running command: "cmake /home/andrew/catkin_ws/src -DCATKIN_DEVEL_PREFIX=/home/andrew/catkin_ws/devel -DCMAKE_INSTALL_PREFIX=/home/andrew/catkin_ws/install" in "/home/andrew/catkin_ws/build"
####
-- Using CATKIN_DEVEL_PREFIX: /home/andrew/catkin_ws/devel
-- Using CMAKE_PREFIX_PATH: /home/andrew/catkin_ws/devel;/opt/ros/groovy
-- This workspace overlays: /home/andrew/catkin_ws/devel;/opt/ros/groovy
-- Using Debian Python package layout
-- Using CATKIN_ENABLE_TESTING: ON
-- Call enable_testing()
-- Using CATKIN_TEST_RESULTS_DIR: /home/andrew/catkin_ws/build/test_results
-- Found gtest sources under '/usr/src/gtest': gtests will be built
-- catkin 0.5.71
-- BUILD_SHARED_LIBS is on
-- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-- ~~  traversing 2 packages in topological order:
-- ~~  - openni_camera
-- ~~  - servo
-- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-- +++ processing catkin package: 'openni_camera'
-- ==> add_subdirectory(openni_camera)
-- Boost version: 1.49.0
-- Found the following Boost libraries:
--   system
--   filesystem
--   thread
CMake Error at /opt/ros/groovy/share/dynamic_reconfigure/cmake/extras.cmake:47 (message):
  Could not run dynamic reconfigure file 'cfg/OpenNI.cfg':
  /home/andrew/catkin_ws/build/catkin_generated/env_cached.sh: 10: exec:
  /home/andrew/catkin_ws/src/openni_camera/cfg/OpenNI.cfg: Permission denied
Call Stack (most recent call first):
  openni_camera/CMakeLists.txt:11 (generate_dynamic_reconfigure_options)


-- Configuring incomplete, errors occurred!
Invoking "cmake" failed

I'm running Ubuntu 12.10 with ROS Groovy, and using an Asus Xtion Pro Live. Please help!
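The last error line is the interesting one: dynamic_reconfigure runs cfg/OpenNI.cfg as a script, so the file needs its executable bit, which is easily lost when sources are copied around. A likely first check (paths taken from the error output above):

# Restore the executable bit on the dynamic_reconfigure cfg file, then rebuild.
chmod +x ~/catkin_ws/src/openni_camera/cfg/OpenNI.cfg
cd ~/catkin_ws && catkin_make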

2015-01-15 03:51:14 -0500 received badge  Famous Question (source)
2015-01-05 21:06:21 -0500 marked best answer ROS indigo installation for Ubuntu ARM (Jetson TK1) Ubuntu 14.04

I'm trying to install ROS Indigo on my Jetson TK1, following the steps as stated here. However, I'm facing a problem when running sudo apt-get install ros-indigo-ros-base. This is the error I receive:

$ sudo apt-get install ros-indigo-ros-base
[sudo] password for ubuntu: 
Reading package lists... Done
Building dependency tree       
Reading state information... Done
Some packages could not be installed. This may mean that you have
requested an impossible situation or if you are using the unstable
distribution that some required packages have not yet been created
or been moved out of Incoming.
The following information may help to resolve the situation:

The following packages have unmet dependencies:
 ros-indigo-ros-base : Depends: ros-indigo-actionlib but it is not going to be installed
                       Depends: ros-indigo-bond-core but it is not going to be installed
                       Depends: ros-indigo-class-loader but it is not going to be installed
                       Depends: ros-indigo-common-tutorials but it is not going to be installed
                       Depends: ros-indigo-dynamic-reconfigure but it is not going to be installed
                       Depends: ros-indigo-nodelet-core but it is not going to be installed
                       Depends: ros-indigo-pluginlib but it is not going to be installed
                       Depends: ros-indigo-ros-core but it is not going to be installed
E: Unable to correct problems, you have held broken packages.

Can anyone help me out please?
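In case it helps others hitting the same wall: on ARM images this kind of "not going to be installed" cascade is often caused by the universe/multiverse components not being enabled in apt's sources, rather than by ROS itself. A sketch of what the entries can look like for Ubuntu 14.04 on ARM (the mirror URL is the standard ports archive; adjust to your image):

# /etc/apt/sources.list -- example entries for Ubuntu 14.04 (trusty) on ARM
deb http://ports.ubuntu.com/ubuntu-ports/ trusty main universe multiverse restricted
deb http://ports.ubuntu.com/ubuntu-ports/ trusty-updates main universe multiverse restricted

# then refresh the package lists and retry
sudo apt-get update
sudo apt-get install ros-indigo-ros-base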

2015-01-03 07:34:20 -0500 commented answer Stream Asus Xtion Pro Live rgb with openni2 and mjpeg_server

Please ask this as a new question

2014-12-01 02:14:21 -0500 received badge  Taxonomist
2014-11-22 01:37:30 -0500 received badge  Famous Question (source)
2014-11-21 03:42:23 -0500 received badge  Famous Question (source)
2014-11-05 07:49:57 -0500 commented answer Streaming depth to web browser

I'll try this out once I can, thanks!

2014-11-05 05:52:39 -0500 commented answer Streaming depth to web browser

I'm actually trying to get the stream from a network device rather than localhost, so the URL looks something like http://my.network.device.ip:8181/stream_viewer?topic=/camera/depth/image.

2014-11-01 05:02:36 -0500 received badge  Notable Question (source)
2014-10-31 14:49:06 -0500 received badge  Self-Learner (source)
2014-10-31 14:49:06 -0500 received badge  Necromancer (source)
2014-10-31 11:26:57 -0500 received badge  Notable Question (source)
2014-10-30 22:38:06 -0500 commented answer Streaming depth to web browser

I'm having difficulty trying to show this on my webpage using the img tag. Do you know what's wrong?

2014-10-30 21:54:22 -0500 answered a question High level design involving a network of robots and a server

I managed to come up with a simple solution using OpenVPN. With OpenVPN, I created a private network so that I could reach the robots directly from my web server. From the web server, I used rosbridge_suite to connect to each robot via a websocket; from there I could publish messages to topics, subscribe to topics, and call services. I also ran mjpeg_server on the robots so their camera streams can be viewed from the web server!
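For reference, the robot-side pieces are just two extra processes per robot; a minimal sketch (default ports, assuming rosbridge_server and mjpeg_server are installed):

# On each robot, inside the VPN:
roslaunch rosbridge_server rosbridge_websocket.launch   # websocket server, port 9090 by default
rosrun mjpeg_server mjpeg_server _port:=8080            # MJPEG streams over HTTP

# The web app then talks to ws://<robot-vpn-ip>:9090 (e.g. via roslibjs) and embeds
# http://<robot-vpn-ip>:8080/stream?topic=/camera/rgb/image_raw in an img tag.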

2014-10-30 21:18:53 -0500 commented answer Streaming depth to web browser

This is excellent! Also, I think you should move your answer from the link you provided to here instead! I think it's more relevant here. I noticed the quality parameter doesn't work. Will it be included soon?

2014-10-30 21:03:56 -0500 received badge  Popular Question (source)
2014-10-30 14:42:18 -0500 received badge  Popular Question (source)
2014-10-30 05:45:43 -0500 received badge  Notable Question (source)
2014-10-30 05:01:03 -0500 asked a question Streaming depth to web browser

I'm running Indigo on Ubuntu 14.04 on an Nvidia Jetson TK1, using an Asus Xtion Pro Live.

Is it possible to stream the depth image from a PrimeSense device to a web browser? I understand that it is possible to stream the RGB image to a web browser using mjpeg_server and openni2.

When I try to use mjpeg_server, and I subscribe to the topic /camera/depth/image_raw, mjpeg_server gives the error Unable to convert 16UC1 image to ipl format.

And when I subscribe to /camera/depth/image, mjpeg_server gives the error Unable to convert 32FC1 image to ipl format.

Can anyone help me?

2014-10-29 23:11:20 -0500 answered a question Stream Asus Xtion Pro Live rgb with openni2 and mjpeg_server

The topic /camera/rgb/image_color/ didn't exist! I checked the available topics with rostopic list, and the topic I was interested in turned out to be /camera/rgb/image_raw. After changing image_color to image_raw, it worked!
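So under openni2 the working combination looks something like this (robot IP and port are placeholders):

roslaunch openni2_launch openni2.launch
rosrun mjpeg_server mjpeg_server _port:=8181
# embed the stream using the topic that actually exists:
# <img src="http://<robot-ip>:8181/stream?topic=/camera/rgb/image_raw">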

2014-10-29 21:58:30 -0500 commented answer ROS indigo installation for Ubuntu ARM (Jetson TK1) Ubuntu 14.04

I was following a tutorial here, and I think he did the same thing and created a dummy package!

2014-10-29 21:55:47 -0500 commented answer How to setup Xtion Pro Live Ubuntu 14.04 openNI(1/2) ?
2014-10-29 21:51:57 -0500 asked a question Stream Asus Xtion Pro Live rgb with openni2 and mjpeg_server

Currently, I'm running Indigo on Ubuntu 14.04 on my Nvidia Jetson TK1.

I'm trying to stream RGB from my Asus Xtion Pro Live to a web browser using openni2 and mjpeg_server. Previously, with OpenNI v1, I did this simply by running roslaunch openni_launch openni.launch and rosrun mjpeg_server mjpeg_server _port:=8181. I put the URL (in my case, http://192.168.254.100:8080/stream?to... ) in an img tag on a web server, and I could see the RGB stream perfectly.

But with openni2, I tried to do the same, running roslaunch openni2_launch openni2.launch and rosrun mjpeg_server mjpeg_server, but I can't view any RGB stream in my web browser. When I connect, though, I can see that mjpeg_server recognises the connection and says Client connected. 1 subscribers for /camera/rgb/image_color.

$ roslaunch openni2_launch openni2.launch
... logging to /home/ubuntu/.ros/log/bc719cc2-5fdc-11e4-8705-00044b25c9e6/roslaunch-tegra-ubuntu-21571.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://tegra-ubuntu:42313/

SUMMARY
========

PARAMETERS
 * /camera/camera_nodelet_manager/num_worker_threads: 4
 * /camera/depth_rectify_depth/interpolation: 0
 * /camera/driver/auto_exposure: True
 * /camera/driver/auto_white_balance: True
 * /camera/driver/color_depth_synchronization: False
 * /camera/driver/depth_camera_info_url: 
 * /camera/driver/depth_frame_id: /camera_depth_opt...
 * /camera/driver/depth_registration: False
 * /camera/driver/device_id: #1
 * /camera/driver/rgb_camera_info_url: 
 * /camera/driver/rgb_frame_id: /camera_rgb_optic...
 * /rosdistro: indigo
 * /rosversion: 1.11.9

NODES
  /camera/
    camera_nodelet_manager (nodelet/nodelet)
    depth_metric (nodelet/nodelet)
    depth_metric_rect (nodelet/nodelet)
    depth_points (nodelet/nodelet)
    depth_rectify_depth (nodelet/nodelet)
    driver (nodelet/nodelet)
    points_xyzrgb_sw_registered (nodelet/nodelet)
    rectify_color (nodelet/nodelet)
    register_depth_rgb (nodelet/nodelet)
  /
    camera_base_link (tf/static_transform_publisher)
    camera_base_link1 (tf/static_transform_publisher)
    camera_base_link2 (tf/static_transform_publisher)
    camera_base_link3 (tf/static_transform_publisher)

auto-starting new master
process[master]: started with pid [21582]
ROS_MASTER_URI=http://localhost:11311

setting /run_id to bc719cc2-5fdc-11e4-8705-00044b25c9e6
process[rosout-1]: started with pid [21595]
started core service [/rosout]
process[camera/camera_nodelet_manager-2]: started with pid [21612]
process[camera/driver-3]: started with pid [21613]
[ INFO] [1414636240.011987344]: Initializing nodelet with 4 worker threads.
process[camera/rectify_color-4]: started with pid [21632]
process[camera/depth_rectify_depth-5]: started with pid [21660]
process[camera/depth_metric_rect-6]: started with pid [21671]
Warning: USB events thread - failed to set priority. This might cause loss of data...
process[camera/depth_metric-7]: started with pid [21688]
process[camera/depth_points-8]: started with pid [21707]
process[camera/register_depth_rgb-9]: started with pid [21717]
process[camera/points_xyzrgb_sw_registered-10]: started with pid [21735]
process[camera_base_link-11]: started with pid [21749]
process[camera_base_link1-12]: started with pid [21763]
process[camera_base_link2-13]: started with pid [21774]
process[camera_base_link3-14]: started with pid [21785]
[ INFO] [1414636240.516221719]: Device "1d27/0601@3/3" with serial number "1306010085" connected

Warning: USB events thread - failed to set priority. This might cause loss of data...

$ rosrun mjpeg_server mjpeg_server _port:=8181
[ INFO] [1414636245.969765456]: Starting mjpeg server
[ INFO] [1414636245.972252946]: Bind(8181) succeeded
[ INFO] [1414636245.972826527]: waiting for clients to connect
[ INFO] [1414636249.914464389]: Client connected
[ INFO] [1414636250.113366284]: 1 subscribers for /camera/rgb/image_color/
[ INFO] [1414636250.492810561]: Subscribing to topic /camera/rgb/image_color/

Can anyone tell me if I did anything wrong?
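In case anyone else hits this: a quick way to check which image topics the openni2 driver is actually publishing (and whether they are producing data) before pointing mjpeg_server at one:

rostopic list | grep image          # see which camera image topics exist
rostopic hz /camera/rgb/image_raw   # confirm the chosen topic is actually publishing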

2014-10-29 21:41:53 -0500 commented answer How to setup Xtion Pro Live Ubuntu 14.04 openNI(1/2) ?

I'm trying to set up an RGB stream to my web browser using mjpeg_server and openni2 with an Asus Xtion Pro Live. I ran roslaunch openni2_launch openni2.launch and rosrun mjpeg_server mjpeg_server, but I can't see anything in my web browser when I type in the URL http://myurl.com:port/image_topic . Help!

2014-10-29 12:14:00 -0500 received badge  Popular Question (source)
2014-10-29 05:23:13 -0500 commented question ROS indigo installation for Ubuntu ARM (Jetson TK1) Ubuntu 14.04

No I didn't! Just enabled them and it's running great now. Thanks!