Saturday, July 30, 2011

Tale of the Fishy Camera Logs

One of the arguments made to show that cameras are accurate is that they are “tested annually” by an “independent calibration lab” and “tested daily” by police or city officials.  But what does that actually mean?  Our investigation into camera logs in Forest Heights and other towns using Optotraffic cameras shows that “independent” testing is not so independent, and that the daily tests for “accuracy” show nothing of the sort.  In addition, officials are either unwilling or unable to show that a human operator was present at the time of the test.

Our search began in September, when we first became aware of reports of errors by Optotraffic speed cameras.  The recipient of some questionable citations (who had also been unable to obtain a timely court hearing) filed a Public Information Act request, at our recommendation, for records pertaining to camera errors, maintenance of the cameras, and the causes of the delays in court hearings.  The town denied her request for all information except the following log.

The driver showed us the log, and we immediately realized that the same “operator” had signed every line of the log, which states “Signature of operator who performed self-test prior to producing recorded images”… for 42 consecutive days, including weekends (when the devices cannot legally issue tickets) and the Fourth of July, with no days off, with signatures in nice neat straight rows.  While we admire such a diligent employee, it did raise suspicion, since almost nobody works 42 consecutive days.  We also noticed that the released “log” excluded the “self test results”, which are normally considered part of the “daily setup log” and are normally admitted as evidence as well.
On November 12, we sent the town a follow-up request on behalf of this driver for the following information:
- The timecards or, if no timecards exist, other records of hours worked, for the speed monitoring system operator who signed the attached log for the speed monitoring system located at Indian Head Highway/Livingston Road NB, for all days from 6/22/2010 through 8/03/2010.
- A copy of the actual self test results (the actual output of the test), the system logs showing speed monitoring system's settings (threshold speed limit, setup-time, photo interval, etc), and system error logs for the speed monitoring system located at Indian Head Highway/Livingston Road NB on 7/4/2010, 7/5/2010, 7/8/2010, 7/20/2010, 7/22/2010, 7/27/2010.
- A copy of the certificate of calibration and/or calibration test record for the speed monitoring system referenced in the attached log.
The timecards, of course, would have proven whether or not the “operator” was in fact “working” on the days he signed that he had “performed” the test.  In other words, they would show whether the signed, sworn statement on a document admitted as evidence in court was true or not.

No response was received from Forest Heights within the 30-day period permitted by law.  The driver in this case eventually got a court hearing in December, had to present her case without the requested evidence, and was forced to pay the citations (this was before the technique for challenging citations based on time-distance calculations was well known).  Finally, in APRIL, we received a response from Forest Heights denying access to the timecards.  However, they did agree to release the camera logs and calibration certificates in exchange for a small fee.  That response was backdated to December 16, 2010 (which was still after the 30 days permitted by law, and no response had been received in December).  In May we sent payment for copies of the logs and calibration certificates which they had agreed to release.  A month later we finally received the daily setup log files, but the annual calibration certificates were still missing from the released documents.
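The time-distance technique mentioned above can be illustrated with a short calculation.  Citations typically include two photos taken a known interval apart, so the distance the vehicle appears to travel between the photos implies a speed that can be compared with the speed printed on the ticket.  The numbers below are purely hypothetical, not from any actual citation:

```python
# Sketch of the time-distance check on a speed camera citation
# (all figures hypothetical).

def implied_speed_mph(distance_feet: float, interval_seconds: float) -> float:
    """Speed implied by how far the vehicle moved between the two photos."""
    feet_per_second = distance_feet / interval_seconds
    return feet_per_second * 3600 / 5280  # convert ft/s to mph

# Suppose the photos are 0.5 s apart and the vehicle moved about one
# car length (15 ft) between them:
print(round(implied_speed_mph(15.0, 0.5), 1))  # 20.5
```

A vehicle implied to be moving about 20 mph by its own citation photos would plainly contradict a cited speed in the 50s, which is the essence of the challenge technique.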

We quickly noticed that on every log file there was in fact a SECOND DATE, days or weeks after the test date.  Most of the logs carried the second date 7/28/2010 regardless of the date of the test.  This seems to support the idea that the tests were all run automatically and the ‘operator’ signed the statements after the fact, which would indicate that the "operator" did not "perform" the test "prior to producing recorded images" as the signed log files stated.  We also made a request for an administrative hearing with respect to the timecards, which as of this writing has not been honored.

So what do these daily setup tests show?  Well, lots of fancy charts and numbers that seem to indicate everything is working right, and a ‘Speed Error’ which appears to be very small.  But what is it actually testing?  Since this is an automatic test, it could not possibly be an actual vehicle driving past the sensor, so what is the claim of accuracy based on?

Well, just above the “speed error” it actually tells you what the figure is based on: “GPS Timing to Sensor Fire Timing Results”…. It is merely comparing the device’s internal timer to the time signal generated by the GPS satellites.  That’s all.  This is supported by Optotraffic’s own technical document, which describes the daily test as follows: “the relative time between sensors is calibrated daily using the 1 pulse-per-second (PPS) signals from the Global Positioning System (GPS) satellites”.  All the other items in the test basically just confirm that the components are turned on and have power.   Optotraffic’s assertion is that the mere fact that the beams are firing at the correct rate and that the device’s internal timer is correct means the device is accurate to much less than 1 mph.  That is even though there is no test that the beams are aligned correctly, no ACTUAL speed measurement test, and certainly no test for the types of errors we have demonstrated are possible if the device’s two sensors strike different portions of the same vehicle.
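To see why a timing-only check falls short, consider how a two-beam device computes speed: spacing between the beams divided by the time gap between triggers.  The sketch below is a simplification, not Optotraffic's actual firmware, and the 3-foot beam spacing is an assumed figure.  The point is that the daily GPS self-test validates only the time measurement; if the effective spacing differs from the assumed spacing (misaligned beams, or beams striking different parts of the same vehicle), the timing test still passes while the reported speed is wrong:

```python
# Simplified two-beam speed calculation (hypothetical spacing).
ASSUMED_SPACING_FT = 3.0  # nominal distance between the two beams

def reported_speed_mph(time_gap_s: float) -> float:
    """Speed the device would report for a given trigger time gap."""
    return (ASSUMED_SPACING_FT / time_gap_s) * 3600 / 5280  # ft/s to mph

# A car at a true 30 mph (44 ft/s) crosses 3 ft of spacing in ~0.068 s:
print(round(reported_speed_mph(3.0 / 44.0), 1))  # 30.0 -- correct

# Same car, but the beams effectively trigger only 2 ft apart
# (e.g. one beam catches the bumper, the other the windshield):
print(round(reported_speed_mph(2.0 / 44.0), 1))  # 45.0 -- a 50% over-read
```

Note that in the second case the time gap was measured perfectly; the error is purely geometric, which is exactly the class of error a clock-versus-GPS comparison can never detect.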

But what about the annual test certificate?  Certainly that test covers everything else, right?  Well, unfortunately, to this date we have yet to receive the annual calibration certificates which Forest Heights stated they would provide.  Police Chief Web stated that they were included in the documents he mailed us; they were not, and he was informed of this, yet to this date we have still not received the certificates.

Fortunately, we have managed to acquire some calibration certificates from OTHER municipalities using Optotraffic cameras.  Optotraffic’s technical document describes the annual test as follows: “Each Lane Sensor is third-party calibrated annually (by Maryland law) to verify the beam distance”.  That’s it.  Optotraffic is basically asserting that if the ‘beam distance’ is correct, and the internal timer is correct, then there are no other sources of error.  Never mind any factors that could come up in the real-world environment (as opposed to the laboratory), such as the fact that the device is mounted atop a 32-foot pole that sways in the wind, or that the two beams could strike different portions of the same vehicle.
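The pole-sway concern can be made concrete with a rough geometric sketch.  All numbers here are hypothetical illustrations, not measurements of an actual Optotraffic unit; the point is simply that a sensor head perched 32 feet up acts as a long lever arm, so even small tilts move the beam footprints on the road:

```python
import math

# Rough geometry of pole sway (hypothetical tilt values).
POLE_HEIGHT_FT = 32.0  # approximate sensor height per the article

def footprint_shift_ft(tilt_degrees: float) -> float:
    """How far a beam's ground footprint moves for a given head tilt."""
    return POLE_HEIGHT_FT * math.tan(math.radians(tilt_degrees))

# Even a half-degree of sway shifts a footprint by over a quarter foot:
print(round(footprint_shift_ft(0.5), 2))  # 0.28
```

If both beams shifted identically there would be no error, but sway during the fraction of a second between the two triggers moves the footprints by different amounts, changing the effective spacing, and neither the daily timing test nor a bench measurement of beam distance would reveal it.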

We had previously managed to acquire a calibration certificate for one of New Carrollton’s cameras, which would have been one of the first such devices deployed anywhere in the state.
Observe the following:
1. On the first page, note that the "Applied Specification" only says "Manufacturer Specification".  They ONLY tested what Optotraffic TOLD THEM TO.  Basically Optotraffic wrote their own test, one they already knew they were certain to pass.  Think of it as if the specification consisted of “the device will be 27 inches long” and they measured it with a ruler, INSTEAD OF testing the device as a whole to see that it measures actual speeds.  Optotraffic paid CustomCal to certify the device, but Optotraffic told them on what basis to test it.
2. The following disclaimer appears at the end of the "Measurement Assurance" section: "due to any number of factors, the recommended due date of the item calibrated does not imply continuing conformance to specifications during the recommended interval".  This disclaims the entire test.  These are mobile cameras, which may have been handled in any number of ways between the date of that test and actual deployment; there is no way to know the beams were still properly aligned on the citation date.   The distance between beams is NOT part of the daily test!  In fact, Optotraffic can, and has, modified the software on the devices in the years since the tests, and possibly the hardware as well.  Arguably this should also void the certification, since it is no longer exactly the same device the test was run on.
3. Now notice that on pages 2 and 3 the user name is "Don Cornwell", and on 001007 the "approver" is "Jose Tillard".  Google "Jose Tillard Sigma Space" and you will see a LinkedIn page for a systems engineer at Sigma Space.  Google "Sigma Space Don Cornwell" and "Optotraffic Don Cornwell" and you will get the corresponding email addresses.  Don Cornwell is the Chief Technical Officer of Optotraffic.  In this case they actually crossed out Mr. Cornwell’s name and wrote in the name of a CustomCal official on the signature line.  And then there’s the big word “Optotraffic” across page 2, rather than the name of the independent lab.
4. Notice also the words ‘Speed SIMULATOR ID’… this was not an actual test of a moving vehicle; it was some form of automatic software system test.  A ‘simulation’ performed by the Optotraffic employee, not a real-world test by an independent lab on an actual moving object.

We also later managed to acquire some logs from Berwyn Heights.  These were similar, showing the same “Manufacturer Specification”, describing the speed test as a “speed simulator”, with the same Optotraffic employee username of dcornwell (or no username) for some of the speed simulator tests (in one case there was another user we could not identify).  The Berwyn Heights tests simply showed no signature on the pages.

A question worth asking is whether this level of testing is adequate to ensure the reliability of the devices in a real-world environment.  It seems clear from the description of the tests that Optotraffic is assuming that only a limited number of issues can affect accuracy, with all other factors being ignored.  And there is reason to believe the "operators" of the devices have in some instances been permitted to "operate" the devices while sitting on the beach.  Optotraffic gets paid based on the number of tickets issued, and local governments profit significantly from them as well.  The cameras in Forest Heights issued over $3 million worth of tickets last year.  Forest Heights’ entire budget prior to getting speed cameras was only $1.7 million.   If it is being left up to those with such an enormous financial incentive to decide what constitutes an adequate test, and how loosely the law can be interpreted, can the public and the courts actually know whether the evidence being used against citizens is in fact reliable?  We'd believe they know nothing of the sort, particularly given the number of speed measurement errors that have already been reported.