Press Releases

Focus Robotics Shipping PCI nDepth™ Vision System

Hudson, NH -- Nov 8 -- Focus Robotics today
began shipping a low-cost, PCI-based vision system that enables real-time
depth perception on a standard PC platform.

This system handles all aspects of depth calculation, including real-time lens
undistortion, camera rectification, correspondence searching, and advanced
post filtering for error removal at 30 frames per second for 752x480 images.
All output from the nDepth™ processor is sent via direct memory access
(DMA) to a host PC. Disparity is calculated using a Sum of Absolute
Differences (SAD) algorithm with 9x9 block matching over a 64-disparity
search range. To reduce errors, left-right consistency checking is also
implemented. Since all processing is done
on the nDepth™ processor, no additional load is placed on the host PC.
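The SAD block-matching step described above can be sketched in a few lines of NumPy. This is only an illustrative software version under toy parameters, not the FPGA implementation; the left-right check mentioned above would rerun the same search with the images' roles swapped and reject pixels where the two results disagree.

```python
import numpy as np

def sad_disparity(left, right, block=9, max_disp=64):
    """Brute-force Sum-of-Absolute-Differences stereo matching.

    left, right: rectified grayscale images as 2D float arrays.
    Returns an integer disparity map (border pixels left at zero).
    """
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            best_d, best_cost = 0, np.inf
            # A feature at column x in the left image appears at column
            # x - d in the right image, so slide the window leftward.
            for d in range(min(max_disp, x - half + 1)):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1]
                cost = np.abs(patch - cand).sum()  # the SAD score
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

The nested loops make plain why a software version is slow and why dedicated hardware helps: every pixel requires `max_disp` window comparisons of `block * block` pixels each.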

The PCI nDepth™ vision system includes a 6cm 752x480 stereo vision camera
with >60dB of dynamic range, progressive scan, a global shutter, and near-IR
enhanced performance for use with non-visible near-IR illumination. The PCI
nDepth vision system also supports M12x5 uVideo lenses in a variety of focal
lengths. The stereo camera connects easily to the PCI card using one standard
CAT5e cable up to 15 feet in length.

Focus Robotics Paves Way for Machines to Perceive Depth

nDepth™ Vision Processor Overcomes Complexity and Cost of Generating
3D Depth Information; Enables Customers to Add Real-Time Depth Perception
to Products.

Cambridge, MA -- May 12 -- Focus Robotics today unveiled
a new type of processor to help customers add real-time depth perception to products.
With integrated depth perception, products can interact with and monitor the environment
in ways more similar to humans.

Enabling machines to perceive the distance to objects in the environment has long
been a need of the robotic and automation industries. Companies have traditionally
relied on bump sensors and IR to enable products to perceive the world. Though useful
for alerting products that they have run into something or are about to go over a ledge,
IR and bump sensors are limited in the detail of information they can provide. Other
techniques have failed to be practical in the marketplace due to their high
cost: seven to ten thousand dollars for a laser range finder, for example.

Though vision is a sensory rich alternative, extracting depth information from
vision has traditionally been difficult and costly. Software based depth perception
systems have been around for a few years, but they typically yield fewer than 3 frames
per second for standard 640x480 images on a dedicated 3GHz Pentium class processor.
The power usage, size, and cost of a PC that provides only 3 depth frames per
second are too high for many products. The slow speed, even on such fast
processors, is indicative
of the complexity involved in processing vision for depth.

In a move to help companies looking to improve their navigation and object interaction
systems, Focus Robotics has developed a vision processor that generates real-time,
full field of view depth information from images taken by low-cost camera sensors. Called
the nDepth™ vision processor, it provides 752x480 pixels of depth information at a
rate of up to 60 frames per second, all in a low cost, low power FPGA. High-speed
serial, PCI, and USB ports give developers a choice of standard interfaces for
integrating with the processor. The design is also available for sale as an IP
core, allowing integration into existing FPGA hardware or volume ASIC production.

Using the nDepth™ processor and a pair of standard camera sensors, developers
get the depth information needed to measure objects, track objects, and even avoid objects
in real-time. Examples of applications include mobile robot navigation, people tracking,
gesture recognition, targeting, 3D surface visualization, and interactive gaming.
Since operations like camera calibration, depth searching, and error removal are
implemented in the nDepth™ processor, the demand for host CPU resources is
considerably reduced.

Focus Robotics nDepth™ vision processor provides depth measurements using
a pair of camera sensors and a technology called computational stereo vision.
The basis of the technology lies in the fact that a single
physical point in 3D space projects to a pair of unique image locations when
observed by two cameras. If it is possible to locate these corresponding points in
the camera images, then the 3D location of the physical point can be computed by
triangulation. Finding the points in the left and right images which correspond to
the same physical point in space is called the correspondence problem.
This is a very difficult problem to solve and is the main function of the nDepth™
vision processor.
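Once a corresponding pair of points is found, the triangulation step for rectified cameras reduces to a single formula: depth is inversely proportional to disparity, Z = f * B / d. A minimal sketch follows; the 6 cm baseline comes from the camera described above, while the 600-pixel focal length is an assumed example value, not a published specification.

```python
def depth_from_disparity(d_pixels, focal_px, baseline_m):
    """Depth by triangulation for a rectified stereo pair: Z = f * B / d.

    d_pixels:   disparity (x_left - x_right), in pixels
    focal_px:   focal length expressed in pixels (assumed value below)
    baseline_m: distance between the two camera centers, in meters
    """
    if d_pixels <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / d_pixels

# 6 cm baseline from the stereo camera above; 600 px focal length is
# a hypothetical example, chosen only to make the arithmetic concrete.
z = depth_from_disparity(36, focal_px=600, baseline_m=0.06)
```

With these example numbers, a 36-pixel disparity corresponds to a point one meter away; halving the disparity doubles the distance, which is why depth resolution degrades for far-away objects.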

The same basic algorithm used by the nDepth™ vision processor is also used by
NASA's twin Mars Exploration Rovers Spirit and Opportunity for navigation on Mars. The
difference is that the rovers use software and only achieve a few frames per second on
256x256 images whereas the Focus Robotics nDepth™ processor provides up to 60
frames per second on 752x480 without taking CPU resources. In total, the nDepth™
processor completes an amazing 2 billion pixel operations per second.

A complete evaluation kit including nDepth™ vision processor, stereo camera, PCI
interface and software is available to assist developers who are looking to
evaluate the technology and incorporate it in their products.

About Focus Robotics -- Located in Southern New Hampshire,
Focus Robotics is a leading developer of vision based sensing technologies for the
robotics and automation industries. The company specializes in the development of high
performance vision processors using advanced stereo vision technology. This technology
enables customers to measure, identify, track and visualize objects in the environment
with incredible speed and accuracy. Vision processors overcome the high computational
costs and complexity of processing camera data and have many advantages over software
and laser-based alternatives. This specialization in vision processors allows the
company to innovate and deliver real value to customers.