EmbeddedRocketComputer

From NPrize

Embedded computer: attitude and mission control, telemetry

The embedded computer is a very important part of a launcher, because of the development and testing time it requires, and because a single unforeseen case can lead the whole operation to failure.

The embedded computing world is subject to many constraints: power consumption, size, weight, operating temperature... These constraints show up as limits on processing power, memory space, connectivity (I/O ports), battery life, and mechanical design.

An embedded control computer has to have low latency to process data from the attitude sensors and command the actuators. Real-time behaviour can be achieved with a hard real-time operating system, or without any operating system at all if there is only one process.
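
On Linux-based systems (a PREEMPT_RT kernel or the Xenomai setup mentioned below), a control process would typically lock its memory and request a real-time scheduling class before entering its loop. A minimal, board-independent sketch using plain POSIX calls (the priority value is an assumption):

  /* Minimal sketch: prepare a Linux control process for low latency.
   * Assumes a real-time capable kernel; error handling kept short. */
  #include <sched.h>
  #include <stdio.h>
  #include <stdlib.h>
  #include <sys/mman.h>

  int main(void)
  {
      struct sched_param sp = { .sched_priority = 80 };  /* illustrative value */

      /* Prevent page faults during the control loop by locking all memory. */
      if (mlockall(MCL_CURRENT | MCL_FUTURE) != 0) {
          perror("mlockall");
          return EXIT_FAILURE;
      }

      /* Ask for a fixed-priority (FIFO) real-time scheduling class. */
      if (sched_setscheduler(0, SCHED_FIFO, &sp) != 0) {
          perror("sched_setscheduler");
          return EXIT_FAILURE;
      }

      /* ... the control loop would run here ... */
      return EXIT_SUCCESS;
  }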

Hardware

It is hard to get low cost, a small form factor, and high processing power at the same time. But do we really need high power? That depends on which sensors are used and on their processing. Beyond that, we only need to handle command control, mission planning, and telemetry, which do not require much power.

Since we are limited by cost, we won't be able to afford high-quality sensors, or highly integrated ones such as an inertial measurement unit, but rather accelerometers and digital gyroscopes. Their data will need to be processed, but that still does not require a lot of power. If we use a video camera, however, to track the Sun and the Earth's curve for positioning, it will require a lot of processing power.

High processing power

An alternative to pure CPU processing power exists: digital signal processors (DSPs). Since they are also quite expensive, we can use FPGAs instead and program them for the task. An FPGA (Field-Programmable Gate Array) is an electronic chip with a matrix of logic gates that can be programmed to specialize it for a specific kind of information processing. It then performs the processing in hardware, from a software definition of that processing, offloading the CPU. Information about FPGAs can be found at fpga4fun.com.

And it happens that there is an embedded microprocessor board that includes an FPGA, multiple I/Os, and quite fair processing power: the Armadeus, based on a Freescale ARM processor. Moreover, it supports the free (GPL) Xenomai Linux-based RTOS. Armadeus board integration has a dedicated page.

Other interesting embedded computer boards are the Eddy-CPU v2.1 and the Portux G20. They do not include an FPGA but are cheaper; the first has a wide operating temperature range, and the second is more powerful and smaller.

Low processing power

If video is not used as a sensor, microcontrollers may be able to handle some sensors and actuators, at least for aircraft control. ArduPilot is a good example of an open project trying to achieve that.

Telemetry

It seems that the 900 MHz version of the ZigBee communication standard is able to transmit at around 100 kbps up to 10 km. Taken from the ArduPilot page:

Two Xbee modules for wireless telemetry: This one with this adapter in the air and this one with this antenna and this adapter board.
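
XBee modules configured in transparent mode behave like a plain serial link, so on the computer side telemetry reduces to writing framed packets to a UART. A minimal sketch, where the device path, baud rate and frame layout are all assumptions for illustration, not the actual ArduPilot setup:

  /* Minimal telemetry sketch: send a framed packet over a UART to an XBee
   * module in transparent mode. Device path, baud rate and frame layout
   * are illustrative assumptions. */
  #include <fcntl.h>
  #include <stddef.h>
  #include <stdint.h>
  #include <string.h>
  #include <termios.h>
  #include <unistd.h>

  struct telemetry_frame {
      uint8_t  sync;        /* constant marker, here 0xA5 */
      uint8_t  id;          /* frame counter */
      int16_t  accel_mg[3]; /* acceleration, milli-g */
      int16_t  temp_c10;    /* temperature, tenths of a degree C */
      uint8_t  checksum;    /* simple sum of the preceding bytes */
  };

  static uint8_t checksum(const uint8_t *p, size_t n)
  {
      uint8_t s = 0;
      while (n--)
          s += *p++;
      return s;
  }

  int telemetry_open(const char *dev)
  {
      int fd = open(dev, O_WRONLY | O_NOCTTY);   /* e.g. "/dev/ttyS1" */
      if (fd < 0)
          return -1;
      struct termios tio;
      memset(&tio, 0, sizeof(tio));
      tio.c_cflag = CS8 | CLOCAL;    /* 8 data bits, ignore modem lines */
      cfsetospeed(&tio, B57600);     /* well within the ~100 kbps radio budget */
      tcsetattr(fd, TCSANOW, &tio);
      return fd;
  }

  int telemetry_send(int fd, struct telemetry_frame *f)
  {
      f->sync = 0xA5;
      f->checksum = checksum((const uint8_t *)f,
                             offsetof(struct telemetry_frame, checksum));
      return write(fd, f, sizeof(*f)) == sizeof(*f) ? 0 : -1;
  }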

Sensors

Until a dedicated Sensors page is created (this list takes too much space here), here is a list of sensors that can or should be used:

  • Accelerometers: quite common nowadays, accelerometers allow attitude sensing, together with gyroscopes and/or magnetometers. For our project, a single-axis accelerometer can be used to detect the free fall created by the separation of plane and rocket (see the sketch after this list), and to sense the roll movement of the rocket, at least for the first part of the flight, since Earth's gravity will be sensed more or less strongly depending on the roll. A second accelerometer, collinear with the length of the rocket, could be used for thrust confirmation. It would also be a nice telemetry feature, and provide feedback on the theoretically computed loads the mechanical structure has to sustain. To choose a sensor, SparkFun wrote an accelerometer tutorial.
  • Gyroscopes: they can obviously be helpful for attitude sensing, for the yaw, roll and pitch of the plane, and thus for the control commands. To choose a sensor, SparkFun wrote a gyroscope tutorial. However, these sensors can be relatively expensive for a decent precision, and might be replaced by a camera sensor for low rotation rates.
  • Magnetometer (3D compass): lots of sensors exist here too, for example the MicroMag and SCP1000, but they are quite expensive. Knowing where Earth's magnetic north is can be very useful, to corroborate information from the camera or other sensors, and to add some precision to the orbital injection parameters.
  • Thermometer: for system health monitoring, such as engine temperature.
  • GPS, if the USAF and the receivers themselves allow its use at flight altitudes.
  • Camera: with an 8-bit data port if possible, like the TCM8230MD sensor. Some ARM processors (i.MX) feature a Camera/CMOS Sensor Interface (CSI) and hardware-accelerated processing or compression from this port. A camera also provides a horizon sensor.
  • Pitot tubes are even available off the shelf (for example at DIYDrones)!
  • Fuel gauge or low-level indicator, to detect the end of the mission and trigger the freezing and reporting of the orbital injection parameters.
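
To illustrate the free-fall detection mentioned in the accelerometer item above, here is a minimal sketch; the threshold, the debounce count and the read_accel_axis_g() driver function are assumptions, not a flight-tested algorithm:

  /* Minimal sketch: detect separation (free fall) from a single-axis
   * accelerometer. read_accel_axis_g() is a placeholder for the real driver;
   * the threshold and debounce count are illustrative values only. */
  #include <math.h>
  #include <stdbool.h>

  #define FREEFALL_THRESHOLD_G   0.15   /* |a| close to 0 g means free fall */
  #define FREEFALL_SAMPLES       20     /* consecutive samples before we trust it */

  extern double read_accel_axis_g(void); /* placeholder: one axis, in g */

  bool freefall_detected(void)
  {
      static int below = 0;

      if (fabs(read_accel_axis_g()) < FREEFALL_THRESHOLD_G)
          below++;
      else
          below = 0;

      /* At e.g. 100 Hz sampling, 20 samples is about 200 ms of sustained free fall. */
      return below >= FREEFALL_SAMPLES;
  }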

Software

The first thing about software is always to think about the model of the application, meaning how it will be designed and organized. Several layers are generally found in software:

  • Real application: mission
    • Keep track of the status in the mission
    • Send orders (commands) to the control layer
  • Control system
    • Sensors and actuator communication and processing
    • Control loop from sensors to actuators with respect to the commands
  • Operating system
  • Hardware
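
One way to make this layering concrete is to give each layer a small C interface, so that mission code never touches sensors or actuators directly. A hypothetical sketch; all names and fields below are illustrative, not a defined interface of the project:

  /* Hypothetical layer interfaces; names and fields are illustrative only. */

  /* Control layer: owns sensors and actuators, exposes state and commands. */
  struct vehicle_state {
      double roll, pitch, yaw;     /* radians */
      double altitude;             /* metres */
      double accel_axial;          /* g, along the rocket axis */
  };

  struct attitude_command {
      double roll, pitch, yaw;     /* desired attitude, radians */
  };

  void control_get_state(struct vehicle_state *out);
  void control_set_command(const struct attitude_command *cmd);

  /* Mission layer: called periodically, reads the state, issues commands. */
  void mission_step(void);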

Mission: the launch program

We need to choose a way to express and manage the mission. It is defined by actions to trigger when some conditions are met, like "when altitude is 60 km, proceed to staging", or "at T+7 s, begin the roll program".
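
One simple way to express such a mission is a table of steps, each with a condition and an action, executed in order. A minimal sketch, reusing the two example events above; all functions are placeholders:

  /* Minimal mission sequencer sketch: each step waits for its condition,
   * then runs its action once and moves on. All functions are placeholders. */
  #include <stdbool.h>

  struct mission_step {
      const char *name;
      bool (*condition)(void);   /* e.g. "altitude above 60 km" */
      void (*action)(void);      /* e.g. "proceed to staging" */
  };

  static bool t_plus_7s(void)          { /* compare the mission clock */ return false; }
  static void begin_roll_program(void) { /* send roll command to the control layer */ }
  static bool above_60km(void)         { /* compare altitude from the control layer */ return false; }
  static void trigger_staging(void)    { /* fire the separation actuator */ }

  static const struct mission_step mission[] = {
      { "roll program", t_plus_7s,  begin_roll_program },
      { "staging",      above_60km, trigger_staging    },
  };

  static unsigned current = 0;

  /* Called from the periodic loop: advance the mission when conditions are met. */
  void mission_step(void)
  {
      if (current < sizeof(mission) / sizeof(mission[0]) &&
          mission[current].condition()) {
          mission[current].action();
          current++;
      }
  }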

Control

The control loop's purpose is to ensure that the vehicle is in a state consistent with the state expected by the mission. It controls the attitude (roll, pitch, yaw) of the vehicle in order to make it fit the expected attitude. In our case, roll is not really a concern for the rocket, since the satellite does not carry an important science payload that has to be pointed in a particular direction. For the aircraft, on the other hand, it is very important.

A control loop is decomposed as follows:

(control loop diagram)

Sensor information is collected and processed. Actuator commands are computed from both the measured sensor data and the expected sensor data (nominal flight pattern).

This loop has to run several times per second, with highly accurate timing. Indeed, sensor data, for example from the accelerometers, has to be integrated to obtain the speed and position of the vehicle. If the timing drifts randomly, the calculated speed will not be correct, leading to false actuation commands. With bad luck, and we have to assume that this will be the case, this creates a real attitude error where there was none before. If the error on pitch, for example, becomes too large, it can lead to catastrophic structural damage at such high speeds.

Hard real-time operating systems (RTOS) guarantee that the time between the expected execution time and the actual execution time (the system's latency) is bounded by a very low maximum value.
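
To give an idea of what "several times per second with accurate timing" looks like in code, here is a minimal periodic loop using the POSIX clock_nanosleep() call with absolute deadlines, so that timing errors do not accumulate between iterations. The 50 Hz rate and the called functions are assumptions for illustration:

  /* Minimal sketch of a fixed-rate control loop (plain POSIX). Sleeping on an
   * absolute deadline keeps the period stable, so integrated sensor data
   * (speed, position) stays meaningful. Rate and functions are illustrative. */
  #define _POSIX_C_SOURCE 200809L
  #include <time.h>

  #define PERIOD_NS (20 * 1000 * 1000L)    /* 20 ms -> 50 Hz */

  extern void read_and_filter_sensors(void);   /* e.g. integrate accelerations */
  extern void mission_step(void);              /* sequencer from the sketch above */
  extern void compute_and_apply_commands(void);/* drive actuators towards nominal attitude */

  static void timespec_add_ns(struct timespec *t, long ns)
  {
      t->tv_nsec += ns;
      while (t->tv_nsec >= 1000000000L) {
          t->tv_nsec -= 1000000000L;
          t->tv_sec++;
      }
  }

  void control_loop(void)
  {
      struct timespec next;
      clock_gettime(CLOCK_MONOTONIC, &next);

      for (;;) {
          read_and_filter_sensors();
          mission_step();
          compute_and_apply_commands();

          timespec_add_ns(&next, PERIOD_NS);
          clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
      }
  }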

Failsafe, mission abort

In case something goes wrong, for example an engine failure or a structural failure, and if it can be detected by the sensors, the system will have to go into a failsafe mode - basically shutting down everything that can explode and trying to return to the ground in as few pieces as possible.

In some cases, the mission will need to be aborted from the ground, because no sensor was available for a specific check, or because of a programming error. An uplink would then be needed, allowing the system to cease its mission and try to come back in one piece.
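
A hedged sketch of how such checks could sit in the periodic loop; the health checks and the ground abort flag are placeholders, not defined parts of the system:

  /* Failsafe sketch: placeholder checks only. Called once per control period. */
  #include <stdbool.h>

  extern bool engine_pressure_ok(void);    /* placeholder health checks */
  extern bool structure_loads_ok(void);
  extern bool ground_abort_received(void); /* placeholder uplink abort flag */

  extern void shutdown_propulsion(void);   /* shut down everything that can explode */
  extern void enter_recovery_mode(void);   /* try to come back in one piece */

  void failsafe_check(void)
  {
      if (!engine_pressure_ok() || !structure_loads_ok() || ground_abort_received()) {
          shutdown_propulsion();
          enter_recovery_mode();
      }
  }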