
Title

Thermal imaging is the next "facial recognition"

Description

In the aftermath of 9/11, biometrics companies---bio-imaging software companies, in particular---enjoyed a huge surge in valuation. Most of these products were shoddy and didn't deliver on even a reasonable fraction of their promise. That didn't stop legislators from passing laws requiring their use---and probably collecting giant kickbacks from companies newly flush with cash from valuations inflated, at least in part, by those same laws. Life is quite easy for some companies---especially in the world of security theater. The article <a href="https://www.schneier.com/blog/archives/2020/05/thermal_imaging.html" author="Bruce Schneier">Thermal Imaging as Security Theater</a> lets us know that, in the wake of COVID-19, "thermal imaging software" appears to have emerged as the new darling of the security-theater world. However, the technology has a certain niche in which it makes sense: close-up, individual readings. That is, of course, <i>not</i> how it's going to be used. Schneier points out: <bq>They are not intended for distance from the people being inspected. They are "an imprecise method for scanning crowds" now put into a context where precision is critical.</bq> Using thermal imaging in situations for which it is completely inappropriate will lead to <iq>false positives, leaving people stigmatized, harassed, [and] unfairly quarantined</iq>, and will also produce false negatives, as people exhibiting mild symptoms---or no notable fever at all---are missed entirely. That a security measure is completely inappropriate, ineffective, and possibly actively counterproductive won't prevent it from being enshrined in law the world over, though.