Conferences are a great opportunity to attend presentations on the latest research, to see the newest products and to meet with colleagues. At the trade exhibits we are bombarded with sales pitches and a gamut of differing viewpoints. As a matter of course we tend to filter out the sales talk while seeking out the more reliable information offered by the determined battalion of sales reps, each one looking to gain access to your agency's valuable purchasing dollars.
After leaving the sales pavilion and returning to the presentations, we tend to sit in the audience, drop our guard and become less selective about filtering the information coming out of each session. Some of the speakers are independent researchers, but others are sponsored by their employers, which, more often than not, are commercial business enterprises. Don't get me wrong: commercial research has a lot to offer, delivering funding that would not otherwise be forthcoming without the prospect of future sales. On occasion, however, the testing process, the data and the outcomes may not be as straightforward as first thought, and along the way the results of the project may unintentionally become somewhat obscured. So here is a wake-up call about not believing everything that is laid out before you!
A research paper comparing narrow-width reflective trim material with large areas of patterned reflective fabric on PPE was presented by a 3M research team at the Human Factors and Ergonomics Society 49th Annual Meeting in 2005. The 3M company produces some of the leading products in the retroreflective marketplace, but its research paper was taken to task the very next day in a detailed response circulated by another conference participant.
This type of controversy is highly unusual at any conference, but the comments were quickly distributed around the venue because of the safety-related inconsistencies found in the first research paper. The comments pointed to major discrepancies in the field testing and the assumptions about detection distance, as well as other problems with the objectivity and manipulation of the data. Even though some of the issues are technical, the two papers are very short, and it is easy to isolate the individual problems once they are explained.
The lesson here is simple: check the facts closely, or seek technical advice, before making decisions based on information from seemingly knowledgeable sources.