Here are a couple of pics and a short video of our latest product: the Xaxxon Open Lidar sensor! It’s a USB-powered rotational laser scanner with open software and hardware, intended for use with mobile robots and simultaneous localization and mapping (SLAM) applications.
The sensor has a simple mechanical design, using the proven Garmin LIDAR-Lite v3 laser distance measurement sensor wired through a rotational slip ring, with a stepper motor drive, two 3D-printed frame parts, and an Arduino-compatible PCB. Power and communication are delivered via USB cable.
Initial ROS drivers are up on GitHub, with dynamically configurable settings including:
- RPM (10-250, 180 default)
- Sample rate (up to 750 Hz)
- Range limits (up to 40 meters, even in sunlight!)
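The RPM and sample-rate settings together determine the scan’s angular resolution. A quick back-of-envelope sketch using the defaults listed above (illustrative only; the exact behaviour depends on the driver and firmware):

```python
# Angular resolution of a rotating lidar, derived from RPM and sample rate.
# Illustrative calculation only -- not taken from the actual driver source.

def angular_resolution_deg(rpm, sample_rate_hz):
    """Degrees between consecutive range samples."""
    revs_per_sec = rpm / 60.0
    samples_per_rev = sample_rate_hz / revs_per_sec
    return 360.0 / samples_per_rev

# Default 180 RPM at the 750 Hz maximum sample rate:
res = angular_resolution_deg(180, 750)
print(round(res, 2))  # 1.44 degrees between samples (250 samples/revolution)
```

Spinning faster trades angular resolution for scan update rate: at the 250 RPM maximum, the same 750 Hz sample rate gives 2 degrees between samples.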
We’re working on adding this unit to our online shop now, with all the details, circuit schematics, and printed-part STL downloads. It will be available as a fully assembled and tested sensor, and as a 3D-print-your-own kit.
Of course, there is also an Oculus Prime accessory version in the works!
The first of a few NEW Xaxxon products for 2019 is the POWER v2 PCB!
This battery charging and system power management board adds new features to the 1st-generation Xaxxon Power LiPo3S PCB and fixes several issues. Enhancements include:
No parasitic battery drain (i.e., you no longer need to be careful about unplugging batteries when the system is not in use, which should lead to fewer premature pack deaths!)
Optional daisy-chaining of multiple boards and batteries for increased capacity
Second soft power shutoff mode: you can now kill system power only, keeping the 5V microcontroller alive (drawing very little power), and optionally bring system power back up after a specified delay
Optional isolated battery-only power output (e.g., so 12V motors never experience higher wall-power voltages)
Higher wall power voltages accepted, up to 20V
Protection diode and 5A fuse now on-board, simplifying wiring
All new Xaxxon robots will ship with this board pre-installed, along with an Oculusprime software update that uses the second soft-power shutoff mode to add a ‘drownproofing’ feature:
For lost robots that are unable to dock, with no remote help immediately available: instead of the usual complete power-down when the battery is depleted, the PCB can optionally power down the host system only, leaving its microcontroller alive, then bring the host system back up for 5 minutes every hour on the hour, to check whether any remote help is available.
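The wake-up schedule described above boils down to a simple predicate. A minimal sketch (illustrative only; the actual logic lives in the PCB firmware and the Oculusprime software):

```python
# Drownproofing wake-window check (conceptual sketch, not the real firmware).
# Host power is restored for the first 5 minutes of every hour.

WAKE_WINDOW_MINUTES = 5

def host_should_be_up(minute_of_hour):
    """True during the 5-minute window at the top of each hour."""
    return 0 <= minute_of_hour < WAKE_WINDOW_MINUTES

print(host_should_be_up(2))   # True  -- within the wake window
print(host_should_be_up(30))  # False -- host stays powered down
```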
The board is AVAILABLE NOW for purchase from our web store, for general mobile robotics projects and DIY mobile PC projects. The previous generation board is still available (on sale!) while supplies last.
Bonus for current Xaxxon robot owners: if you want to upgrade to this new board you’re eligible for a discount! If interested, please contact us
We’ve once again updated our multi-function differential-drive robot MALG PCB. It retains the same basic functionality, with slightly improved microcontroller power decoupling/smoothing and a few layout changes.
Gone is the ancient 4-pin Molex jack found on the previous MALG; apparently the original Molex ‘Mate-N-Lok’ style dates back to 1963! The Molex cable was used simply to get 5V power from the motherboard without being limited by the 500mA maximum supplied through the USB cable. Instead, with the J4205-ITX and J5005-ITX motherboards supplied with Oculus Prime SLAM Navigator robots, we now run a single lead from the motherboard’s +5V pin in its chassis-speaker header.
For flexibility we’re sticking with Pololu daughter boards for the gyro. There is also now an alternate gyro header for optional use of the latest-generation Pololu gyros and IMUs.
The photo below is of a heavily modified SLAM Navigator sporting a second MALGv3, equipped with a 9-DOF Pololu AltIMU-10:
The MALG (Motors-Audio-Lights-Gyro) multi function, Arduino compatible, differential drive robot PCB is back in stock!
Our remaining old MALGs had quality issues with their MAX21000 gyro ICs, so these otherwise perfectly good boards have been resurrected with Pololu daughter boards sporting the L3GD20 three-axis angular-rate sensor.
Performance-wise, there seems to be no significant difference between the two gyros: on paper the newer L3GD20 sensor has the edge, but both ultimately provide very accurate rotational odometry when used in mapping and auto-navigation. (The MALG PCB is standard equipment in our Oculus Prime mobile robots.)
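The rotational odometry mentioned above comes down to integrating the gyro’s angular-rate readings over time. A minimal sketch with made-up sample data (the real MALG firmware also has to handle bias calibration and sensor scaling):

```python
# Heading estimate by integrating z-axis angular rate (deg/s) over time.
# Simplified sketch -- real gyro odometry must also subtract the sensor's
# zero-rate bias, which otherwise accumulates as heading drift.

def integrate_heading(rates_dps, dt):
    """Sum rate * dt over a stream of gyro samples; returns degrees turned."""
    heading = 0.0
    for rate in rates_dps:
        heading += rate * dt
    return heading

# A steady 45 deg/s turn sampled at 100 Hz for 2 seconds -> 90 degrees:
samples = [45.0] * 200
print(round(integrate_heading(samples, 0.01), 6))  # 90.0
```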
Expansion header pins that are occupied by the daughter board are still available via pass-thru pads on the underlying interface board, as detailed in the connection diagram:
Development and testing are now complete for the Relay Server software addition, which allows remote operation of Oculus Prime mobile robots connected to the internet via mobile 3G/4G/LTE networks. It also allows operation behind NAT firewalls, or on any network where it isn’t possible to configure and forward the ports necessary for general internet remote access.
All that is required is to run an instance of the Oculusprime Server application on a device connected to an unconstrained network. This will act as the ‘relay’ server, which you then configure the robot to connect to. When you want to remotely connect to the robot, you connect to the relay server instead, which relays commands and video from the robot seamlessly. The server can be running on any Linux system, including a Raspberry Pi, or within a virtual Linux environment on a Windows or OS X PC.
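Conceptually, the relay just pipes bytes in both directions between the robot’s connection and the remote client’s. A toy sketch of that forwarding loop (hypothetical helper for illustration; the actual Oculusprime Server is a Java application and handles much more, such as authentication and video streaming):

```python
# Minimal one-direction relay pump (conceptual sketch only).
import io

def pump(src, dst, chunk_size=4096):
    """Copy bytes from src to dst until src is exhausted."""
    while True:
        data = src.read(chunk_size)
        if not data:
            break
        dst.write(data)

# With real sockets, one pump per direction would run in its own thread;
# here in-memory file-like objects stand in for the two connections:
robot_side = io.BytesIO(b"telemetry: battery 87%")
client_side = io.BytesIO()
pump(robot_side, client_side)
print(client_side.getvalue())  # b'telemetry: battery 87%'
```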
Now you can equip Oculus Prime SLAM Navigator or Pi Explorer with a smartphone or mobile wifi hotspot, and see how far you get driving around outside, free from the limited range of a wifi network (in fair weather, of course!)
This addition to the Oculusprime Server application has been on the to-do list for a long time, but kept being put off because, well, it required a lot of boring network programming. (There always seemed to be something more exciting to work on, like testing out the newly open-sourced Google Cartographer ROS package.)
Summary of enhancements to Oculusprime Server version 0.8:
Expanded network menu
Wifi Access Point Manager upgraded to version 0.914
Red5 streaming media server upgraded to version 1.07
Apache Tomcat web server upgraded to version 8.0.33
Updated power PCB firmware (auto power-off at 30% battery, reduced false-positive errors)
Not too long ago, the availability of low-cost depth sensors suitable for mobile-robot auto-navigation and SLAM mapping became a problem. Apple bought PrimeSense, and with it the intellectual property behind the original Microsoft Kinect and Asus Xtion sensors. The excellent Asus Xtion RGBD camera, which was to be the main SLAM sensor for Oculus Prime, was discontinued. The Kinect 1 was still available in quantity, but it was bigger and heavier, and required separate 12V and 5V power.
And it somehow just looks wrong with a Kinect mounted to Oculus Prime:
So, we decided to explore using stereo vision as a possible option. A prototype robot was conjured, sporting two Lifecam Cinema cameras:
OpenCV’s Semi-Global Block Matching (SGBM) algorithm yielded decent-looking depth data from the combined images. Below left is the left camera view; on the right is the disparity image generated with the cameras separated by a 60mm baseline (pixel brightness is inversely proportional to distance from the camera):
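OpenCV’s SGBM adds semi-global smoothness costs on top of the basic idea, but the core of any stereo matcher is the same: for each pixel in the left image, find the horizontal shift (disparity) at which a patch best matches the right image. A toy 1-D version of that search (illustrative only, not the OpenCV implementation):

```python
# Toy 1-D block matching: find the disparity (horizontal shift) that
# minimizes the sum-of-squared-differences between left and right rows.
# Real SGBM adds smoothness constraints along multiple scanline directions.

def best_disparity(left_row, right_row, x, block=3, max_disp=8):
    """Disparity at left-image position x, by exhaustive SSD search."""
    half = block // 2
    best_d, best_cost = 0, float("inf")
    for d in range(max_disp + 1):
        if x - d - half < 0 or x + half >= len(right_row):
            continue
        cost = sum((left_row[x + i] - right_row[x - d + i]) ** 2
                   for i in range(-half, half + 1))
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d

# A bright feature at x=10 in the left row appears shifted 3 px in the
# right row, so the matcher should report a disparity of 3:
left = [0] * 20
right = [0] * 20
left[9:12] = [50, 90, 50]
right[6:9] = [50, 90, 50]
print(best_disparity(left, right, 10))  # 3
```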
Comparing the depth images from the stereo setup vs the Asus Xtion camera was looking promising (stereo image on top, Xtion image on bottom, left camera 2D image inset):
In practice, however, with this stereo setup as the data source for ROS SLAM mapping, there were issues. The depth data was quite noisy for some surface textures, and depth accuracy wasn’t very good beyond a few meters. The SGBM algorithm also tends to omit data for large textureless surfaces.
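Depth accuracy degrading with distance is inherent to stereo: depth is Z = f·B/d, so a fixed matching error of one disparity pixel produces a depth error that grows roughly as Z²/(f·B). Plugging in plausible numbers for the setup above (the 60 mm baseline is from the prototype; the 700 px focal length is an assumed value, not a measured Lifecam figure):

```python
# Stereo depth from disparity, and the depth error caused by a one-pixel
# matching error. Focal length of 700 px is an assumption for illustration.

F_PX = 700.0       # focal length in pixels (assumed)
BASELINE_M = 0.06  # 60 mm baseline, as in the prototype

def depth_m(disparity_px):
    return F_PX * BASELINE_M / disparity_px

def depth_error_m(z_m, disparity_error_px=1.0):
    """Approximate depth error for a given matching error, in meters."""
    return z_m ** 2 * disparity_error_px / (F_PX * BASELINE_M)

for z in (1.0, 3.0, 6.0):
    print(z, round(depth_error_m(z), 3))  # ~0.024, ~0.214, ~0.857
```

With these assumed numbers, a one-pixel matching error means roughly 2 cm of depth error at 1 m but almost a meter of error at 6 m, which matches the observation that accuracy fell off quickly beyond a few meters.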
The image below shows a comparison of a plan view map of the same area, generated using the stereo setup on the left, and using the Asus Xtion depth camera on the right (using data from the horizontal plane only, and pure odometry to align the scans):
The stereo scan noise occasionally projected small false obstacles, which would wreak havoc with the ROS path planner, and the inaccuracy or omission of distant features caused weak localisation (and a lost robot).
Another problem was the slow speed of the system: the OpenCV Java SGBM processing, with all the other robot functions fighting for CPU time, yielded only 2 frames per second (the prototype stereo-bot had an Atom N2800 CPU), requiring the navigating robot’s speed to be slowed way down to reduce errors.
In retrospect, an integrated stereo solution like the ZED camera, with on-board processing and tightly calibrated cameras, would have been much more effective (if money were no object).
In the end, we stuck with the Asus Xtion, dwindling supply and all, in the hope that the near future would deliver a new low-cost depth sensor with stable supply.
Luckily the Orbbec Astra came along just in time.