APIs for HPCC Systems Data Ingestion for Common Robot Sensors

This project was completed by Aramis Tanelus, a high school student from American Heritage School of Boca/Delray in Florida. Aramis joined our intern program in 2018 to work on this project, which was designed to help his school's autonomous agricultural robot interface with HPCC Systems for big data processing and analytics.

Find out about the HPCC Systems Summer Internship Program.

Project Benefits

This project will make it easy for anyone in robotics around the world to ingest data from common robotic sensors into an HPCC platform for use in data analysis.

Project Description

The goal of this project is to create APIs for ingesting data into the HPCC platform from the following sensors. This list was compiled by Aramis Tanelus with the guidance of David de Hilster. A minimal code sketch for each sensor type follows the list.

  1. ENCODER: Encoders provide information regarding the position of motors and actuators on a robot. They are used for tasks that require motors to move to a specific position or at a certain speed.

    • The raw output of an encoder is an integer representing the angular displacement of the motor or shaft the encoder is connected to. This integer is multiplied by constant scale factors to obtain useful metrics such as distance traveled or angle.

    • At the lowest level, the output of the encoder is a series of electric pulses that are directly read by a microcontroller. This makes it possible to read with any programming language supported by the device reading the pulses.


  2. IMU: The IMU (Inertial Measurement Unit) is used to gather information about a robot’s movement. Most IMUs are capable of reading acceleration and the rate of rotation about three axes, and some can also sense magnetic field strength along three axes.

    • IMUs provide a program with three floats for acceleration, one for each axis (x, y, z), three floats for rate of rotation (pitch, roll, yaw), and three floats for magnetic field strength (x, y, z). The rate of rotation is usually integrated to find the angular displacement of the robot about the three axes.

    • Most IMUs are standalone devices that provide an interface for directly retrieving data through serial communication. This allows them to be used with most languages, including Java, C++, and Python in FRC, and C++ and Python in ROS (Robot Operating System).


  3. CAMERA (image): Cameras provide robots with imagery and the potential for vision processing. They are used in most complex robots and appear in FRC every year.

    • The format of the data returned by the camera differs depending on the type of image being captured. For normal RGB images, each image takes the form of a three dimensional array of integers. For grayscale (black and white) images, each image takes the form of a two dimensional array of integers.

    • Cameras are supported by a broad range of devices and programming languages. The images captured by a camera can be processed using Java, C++, and Python in FRC, and C++ and Python when using ROS.


  4. RANGEFINDERS: Rangefinders are used to provide the robot with the distance between two points. Rangefinders can be separated into two main categories: ultrasonic and laser. Additionally, there are laser scanners, which simultaneously spin and measure distances to produce a view of the surroundings rather than a single point.


    • Rangefinders return a single float for each reading. Laser scanners return two lists of floats of equal length: one holds the angles at which the readings were taken, the other holds the distance readings themselves.

    • Language support for rangefinders and laser scanners is highly dependent on the availability of drivers. Most rangefinders can be integrated easily into FRC programs with Java and C++. Laser scanners always require their own drivers, which makes them difficult to use in FRC but easy to integrate into any ROS program and use with any language supported by ROS.

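The sketch below illustrates the encoder description in item 1: raw tick counts are multiplied by constant scale factors to obtain distance and angle, and the converted readings are written to a timestamped CSV file that could later be sprayed onto the platform. The counts-per-revolution and wheel-diameter values are made-up examples, and the simulated tick list stands in for a real encoder driver.

```python
# Minimal encoder ingestion sketch. COUNTS_PER_REV and WHEEL_DIAMETER_M are
# assumed example values; a real API would read tick counts from the robot's
# encoder driver instead of the simulated list at the bottom.
import csv
import math
import time

COUNTS_PER_REV = 2048       # assumed encoder resolution (ticks per revolution)
WHEEL_DIAMETER_M = 0.1524   # assumed wheel diameter in metres

def ticks_to_distance(ticks: int) -> float:
    """Convert a raw tick count into metres travelled by the wheel."""
    revolutions = ticks / COUNTS_PER_REV
    return revolutions * math.pi * WHEEL_DIAMETER_M

def ticks_to_angle(ticks: int) -> float:
    """Convert a raw tick count into shaft angle in degrees."""
    return (ticks / COUNTS_PER_REV) * 360.0

def log_encoder_readings(raw_ticks, path="encoder_log.csv"):
    """Write timestamped encoder readings to a CSV file ready for spraying."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "ticks", "distance_m", "angle_deg"])
        for ticks in raw_ticks:
            writer.writerow([time.time(), ticks,
                             ticks_to_distance(ticks), ticks_to_angle(ticks)])

if __name__ == "__main__":
    # Simulated tick counts standing in for a real encoder stream.
    log_encoder_readings([0, 512, 1024, 2048, 4096])
```
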
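This sketch, loosely following item 2, defines a record of the nine floats an IMU typically reports (acceleration, rate of rotation, and magnetic field strength) and integrates the yaw rate over time to estimate heading. The field names, units, and sample values are assumptions for illustration; a real API would read these values from the IMU's serial interface or a ROS/FRC driver.

```python
# Minimal IMU ingestion sketch with assumed field names and units.
import csv
import time
from dataclasses import dataclass, asdict, fields

@dataclass
class ImuReading:
    timestamp: float   # seconds
    accel_x: float     # m/s^2
    accel_y: float
    accel_z: float
    gyro_pitch: float  # deg/s
    gyro_roll: float
    gyro_yaw: float
    mag_x: float       # magnetic field strength, if the IMU supports it
    mag_y: float
    mag_z: float

def integrate_yaw(readings):
    """Integrate the yaw rate over time to estimate heading in degrees."""
    heading = 0.0
    for prev, cur in zip(readings, readings[1:]):
        dt = cur.timestamp - prev.timestamp
        heading += cur.gyro_yaw * dt
    return heading

def write_csv(readings, path="imu_log.csv"):
    """Write the nine-float readings to a CSV file ready for spraying."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(ImuReading)])
        writer.writeheader()
        for reading in readings:
            writer.writerow(asdict(reading))

if __name__ == "__main__":
    # Simulated 50 Hz samples standing in for a real IMU stream.
    start = time.time()
    samples = [
        ImuReading(start + i * 0.02, 0.0, 0.0, 9.81, 0.0, 0.0, 15.0, 0.2, 0.0, 0.4)
        for i in range(5)
    ]
    print("estimated heading (deg):", integrate_yaw(samples))
    write_csv(samples)
```
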
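For item 3, the sketch below represents an RGB frame as a three-dimensional array of 8-bit integers, collapses it to a two-dimensional grayscale array, and flattens it into a single row for ingestion. The frame size, the synthetic random frame, and the use of NumPy are assumptions; a real API would grab frames from a camera driver such as OpenCV or a ROS image topic.

```python
# Minimal camera ingestion sketch using NumPy arrays as the image format.
import numpy as np

HEIGHT, WIDTH = 480, 640  # assumed frame size

def grab_frame() -> np.ndarray:
    """Stand-in for a camera capture: returns an HxWx3 array of 8-bit integers."""
    return np.random.randint(0, 256, size=(HEIGHT, WIDTH, 3), dtype=np.uint8)

def to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Collapse an RGB frame to a 2-D grayscale array using a luminance weighting."""
    weights = np.array([0.299, 0.587, 0.114])
    return (rgb @ weights).astype(np.uint8)

def frame_to_record(frame: np.ndarray, frame_id: int) -> dict:
    """Flatten a frame into a single row (id, height, width, channels, pixel bytes)."""
    return {
        "frame_id": frame_id,
        "height": frame.shape[0],
        "width": frame.shape[1],
        "channels": 1 if frame.ndim == 2 else frame.shape[2],
        "pixels": frame.tobytes(),  # raw bytes; could be base64-encoded for CSV spraying
    }

if __name__ == "__main__":
    frame = grab_frame()
    gray = to_grayscale(frame)
    print(frame.shape, gray.shape)  # (480, 640, 3) and (480, 640)
    record = frame_to_record(gray, frame_id=0)
    print(record["height"], record["width"], len(record["pixels"]))
```
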
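For item 4, the sketch below logs single-float rangefinder readings and a laser scan expressed as two equal-length lists of angles and ranges. The file names and synthetic sample values are placeholders for whatever the rangefinder or scanner driver actually reports.

```python
# Minimal rangefinder and laser-scanner ingestion sketch with synthetic data.
import csv
import math
import time

def log_rangefinder(distances_m, path="rangefinder_log.csv"):
    """Each rangefinder reading is a single float (distance in metres)."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "distance_m"])
        for distance in distances_m:
            writer.writerow([time.time(), distance])

def log_laser_scan(angles_rad, ranges_m, scan_id, path="laser_scan_log.csv"):
    """A laser scan is two equal-length lists: angles and the ranges measured at them."""
    if len(angles_rad) != len(ranges_m):
        raise ValueError("angle and range lists must have the same length")
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["scan_id", "angle_rad", "range_m"])
        for angle, rng in zip(angles_rad, ranges_m):
            writer.writerow([scan_id, angle, rng])

if __name__ == "__main__":
    # Simulated readings standing in for real driver output.
    log_rangefinder([1.20, 1.18, 1.21])
    angles = [i * math.pi / 180 for i in range(0, 360, 10)]
    ranges = [2.0 for _ in angles]
    log_laser_scan(angles, ranges, scan_id=0)
```
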

By the midterm review we would expect you to have:

  • Learned and become familiar with the HPCC system and its ingestion process of "spraying" (a rough illustration of spraying follows below).
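
As a rough, unverified illustration of what "spraying" one of the CSV logs above onto an HPCC cluster might look like, the sketch below shells out to the dfuplus tool. The server URL, landing-zone IP, Thor cluster name, and logical file name are all placeholders, and the exact dfuplus options should be checked against the HPCC Systems documentation.

```python
# Rough spraying illustration: invoke dfuplus from Python with placeholder
# addresses and names. Verify the option names against the HPCC documentation.
import subprocess

def spray_csv(local_path: str, logical_name: str) -> None:
    subprocess.run(
        [
            "dfuplus",
            "action=spray",
            "server=http://192.168.1.10:8010",  # placeholder ESP address
            "srcip=192.168.1.10",               # placeholder landing-zone IP
            f"srcfile={local_path}",
            f"dstname={logical_name}",
            "dstcluster=mythor",                # placeholder Thor cluster name
            "format=csv",
            "overwrite=1",
        ],
        check=True,
    )

if __name__ == "__main__":
    spray_csv("/var/lib/HPCCSystems/mydropzone/encoder_log.csv",
              "~robotics::encoder::log")
```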

Mentor

David de Hilster

Backup Mentor: Xiaoming Wang

Skills needed
  • Experience with robotic sensors (access to them is not necessary, but a plus).

  • Ability to write code in the languages that are used by the robotic sensors.

  • Ability to build and test the HPCC system (guidance will be provided).

Deliverables

Midterm

  • Be familiar with the HPCC system and the ingestion process of "spraying".

End of project

  • Four APIs for the listed robotic sensors

Other resources

All pages in this wiki are subject to our site usage guidelines.