This post is the first loosely related to my research. The ‘Part 1’ label means that I intend to semi-regularly voice my enjoyment with the hardware I work with. Because my research involves robots, hardware is a critical component of any working system. Every roboticist can relate to my love-hate relationship with hardware that functions perfectly alone in lab but refuses to start if anyone’s around. In the end, I haven’t decided if this says more about robotics or the people working on robotics projects.

This post is not just about hardware, despite the title, but more about my recent experience chasing down a few performance quirks that I couldn’t directly attribute to my own bad code.

Setup

Hardware:

  • Point Grey Grasshopper Express camera – maximum resolution of 1024×1024 and maximum frame rate of 70fps at that resolution
  • 1394b express card
  • Lenovo W520 laptop

Software

In lab, the camera power comes from the firewire cable with an AC adapter plugged into the express card. During test runs, the camera power comes from a LiPo battery connected directly to its GPIO pins.

Symptoms

Camera1394 connects to, configures, and broadcasts images from the camera into the rest of the system. The camera was configured to stream 640×480 mono8 IIDC format 0 images at 60fps. Under certain circumstances, images were published at 14fps despite the configured 60fps. This behavior seemed to be a non-deterministic function of the system configuration.
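Confirming the actual publish rate is straightforward: average the inter-arrival times of the image timestamps. A minimal sketch of that check (the timestamp source would be whatever your image callback provides; the synthetic data below is only there to illustrate the math):

```python
from statistics import mean

def observed_fps(stamps, window=30):
    """Estimate the publish rate from consecutive image timestamps (seconds),
    averaged over the most recent `window` frame gaps."""
    deltas = [b - a for a, b in zip(stamps, stamps[1:])][-window:]
    return 1.0 / mean(deltas)

# Synthetic check: frames spaced 1/14 s apart should read back as ~14 fps.
stamps = [i / 14.0 for i in range(60)]
print(round(observed_fps(stamps)))  # → 14
```

A rolling version of this against the live image stream is how a stuck-at-14fps condition shows up immediately instead of being noticed by eye.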

Isolation

The problem first reared its ugly head during outdoor tests in the middle of the Georgia summer. In my mind, the camera, in a desperate attempt not to melt itself, sacrificed frame rate as a noble gesture to limp along as best it could. As testing continued, the same frame rate inconsistency appeared both in the field and in lab. Regardless of whether we had the camera set to 30fps or 60fps in the configuration file, it would stay at 14fps. I loosely attributed the error to overheating and to powering the camera from GPIO instead of the AC power adapter.

Mainly, I ignored the issue in favor of continuing to develop immediately required software components. The issue fell to the back of my mind after not occurring for a couple of weeks. But during recent hardware and software preparations for an outdoor run, the problem reappeared at the worst time (of course). In lab, the camera was stuck at 14fps during the final system check for a run that required at least 30fps. Great.

This isn’t happening. Pull the latest version of the code. All changes merged successfully. In the correct branch. Complete full recompile. Configuration file does specify 60fps. But still, 14fps. Not good. Time to actually figure out this stupid problem. I worked through all possible configurations to isolate and reproduce the problem consistently. After performing the full range of tests, I could reliably reproduce the camera being stuck at 14fps simply by running the laptop on battery power, regardless of the configured frame rate, camera power source, temperature, or any other system configuration.

Ta Da! Wait, that’s all it is? The fix should be easy.

The Fix

With the problem isolated, I stepped through the power control options with a few tips from people in lab:

  • BIOS settings had nothing except an option to disable the PCI bus clock when no data is being transferred.
  • ASPM was not enabled by default. Force-enabling it with a boot parameter allowed the system to enter ASPM powersave mode on battery power. On AC power, however, the power state returned to default, deferring power management. I couldn’t find a way to put ASPM into performance mode or keep it in an ASPM state on AC power.
  • The ACPI kernel module was running and seemed to control the power state of the express card slot. Disabling it caused the camera program to constantly publish at 18fps, regardless of settings. Some progress, but I still couldn’t disable the power saving measures.
  • Scripts execute when the power state changes. I’m pretty sure I just missed something basic about how they are loaded and executed, because nothing I did changed their behavior when the power state changed.
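For anyone retracing these steps, the active ASPM policy can be checked programmatically instead of by eyeballing boot logs. A minimal sketch, assuming the standard Linux sysfs location for the kernel’s ASPM policy (the kernel brackets the active entry, e.g. `default [powersave] performance`; the path and parsing are assumptions about your kernel, not something from the original debugging session):

```python
def aspm_policy(path="/sys/module/pcie_aspm/parameters/policy"):
    """Return the active PCIe ASPM policy, which the kernel marks
    with square brackets in the sysfs policy file."""
    with open(path) as f:
        tokens = f.read().split()
    for token in tokens:
        if token.startswith("["):
            return token.strip("[]")
    return None
```

Booting with the `pcie_aspm=force` kernel parameter is what exposed the powersave transition in my case; the sysfs file should exist on any reasonably recent kernel, but treat both as assumptions for your setup.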

All we had to show after working through these possibilities was a repeatable test case where we could cycle the camera between the commanded frame rate and 14fps. We could also get the camera to run at a constant 18fps by disabling all power control. Feeling frustrated, I shelved the issue in favor of other work.

The (lack of a) Fix

During the next group meeting, someone mentioned they wanted to experiment with a different camera data format than we had been using. They wanted the full 1024×1024 image instead of the smaller 640×480 and 1024×768 images we were using. Because this resolution is non-standard in the IIDC formats, I switched to format 7 instead of formats 0 and 1. Format 7 also allows 2x and 4x on-camera pixel binning and lets us select a custom region of interest within the full image if only a subset is needed.

Preparing for a run with the new camera settings, I performed the final systems check in lab before moving outdoors by removing all AC power and starting the entire system. With everything up and running, I noticed the camera frame rate was exactly as commanded. A quick test sequence confirmed that using any available format 7 mode for image streaming provided the same results. Problem solved?

Success?

Something about format 7 seems different, but nothing sticks out in the camera’s technical reference, and I can’t find any mention of power-state-dependent behavior for any format. The IIDC specification also doesn’t mention anything about power, unless I missed something. Any of my theories about possible interactions are purely speculation, but I will present my best attempt to explain what I believe was happening.

Based on my findings about ACPI’s interaction with the camera frame rate, I believe the camera interface software or the express card is monitoring the power state. If that conclusion is correct, the limiting is likely done on the laptop side, possibly by modifying the bus clock or something similar. I still have no explanation for how the camera driver could affect the frame rate of the camera unless it internally modifies the desired frame rate based on the power status.

If anybody has any insights into this issue, please let me know in the comments or by email.

Whatever caused the power-state-dependent frame rate in formats 0 and 1, but not in format 7, we are comfortable enough with the current solution to consider the issue closed. Format 7 is the format we will use long term, and it gives plenty of flexibility with its pixel binning and region of interest specification. The only downside is that the frame rate in format 7 is not user adjustable; the camera runs as fast as possible given its own limitations and the bus bandwidth. We can toggle the bus bandwidth between 400Mb/s and 800Mb/s for a little control.
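The bandwidth toggle gives a coarse handle on that maximum. IEEE 1394 moves isochronous data in 8000 bus cycles per second, with per-cycle packet payloads capped at roughly 4096 bytes at S400 and 8192 bytes at S800, so a back-of-the-envelope upper bound for the full 1024×1024 mono8 frame looks like this (ignoring packet headers and camera-side limits, so the real rates come in lower):

```python
CYCLES_PER_SECOND = 8000  # one IEEE 1394 isochronous cycle every 125 us

def max_fps(frame_bytes, packet_bytes):
    """Upper bound on frame rate if one full packet ships per bus cycle."""
    cycles_per_frame = -(-frame_bytes // packet_bytes)  # ceiling division
    return CYCLES_PER_SECOND / cycles_per_frame

frame = 1024 * 1024  # mono8, one byte per pixel
print(max_fps(frame, 4096))  # S400 bound: 31.25 fps
print(max_fps(frame, 8192))  # S800 bound: 62.5 fps
```

Roughly doubling the packet payload doubles the ceiling, which matches the intuition that the 400/800 toggle is the main frame rate control left in format 7.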

Issues like this seem to pop up often in small-scale robotics projects that rely on externally developed components for basic functionality. Our project focuses mainly on the software side, specifically vision, so I can’t say whether robotics programs that focus on the mechanical or electrical side have similar problems. It’s a love-hate relationship that always seems almost solved.
