One of the problems we have long had with some cities’ red light camera systems is the role of the private sector, which sometimes assumes inherently governmental functions, such as deciding who gets a traffic ticket—and collects a slice of the revenue along the way. The Birmingham News recently posted a series on the issue of private companies assuming traditionally governmental functions. In Alabama, private companies have been involved not only in traffic enforcement but also in such things as tax collection and auditing, and probation administration. In one town this led to what a judge condemned as a “debtor’s prison” and a “judicially sanctioned extortion racket.” As Jim Williams, executive director of the Public Affairs Research Council of Alabama, put it, “We expect the private sector to be aggressive. The responsibility to set limits and make rules lies with the government.” Unfortunately, all too often when government makes use of the private sector, it does not structure the deals carefully enough to ensure that the profit motive does not trample rights. And (as I am quoted as saying in the piece) private companies are not subject to checks and balances, such as open-records laws, that have evolved over time for government. With privatization a continuing craze—and local hunger for revenue at a historic high—we can unfortunately expect to see more such misguided efforts, especially in technology, where innovation comes from the private sector.
Ten percent of all internet traffic is generated not by humans but by bots, according to a study reported in Adweek and Technology Review. The study shouldn’t be taken at face value, as its sponsor makes tools for CAPTCHAs. But even if only roughly accurate, it is interesting in a couple of ways. First, such online contexts, where nobody knows you’re a bot, are the cutting edge of an ongoing real-world Turing Test by which we can measure the progress of artificial intelligence. There are strong competing economic incentives, on the one hand, to gin up ad revenue by having computers load web pages and click on ad links, and on the other, to ensure that it’s really entities with purchasing power (currently, just human beings) that are viewing your ads. A similar dynamic is playing out in the arena of online poker, where bot players, which have been around for years, are increasingly good enough to win large amounts of money from human players. Second, I have to wonder to what extent the effort to distinguish bot from human players might add to the already feverish amount of online monitoring and profiling that is taking place.
Another industry-sponsored internet study highlights an interesting dilemma for government agencies seeking to control information. As eWeek reports, agencies seeking to keep information from leaking out view it as vital that their employees encrypt their email. At the same time, that very same encryption allows whistleblowers or others to smuggle out secret information. From eWeek:
“Email encryption is an important tool for protecting sensitive information, but agencies must be sure that encryption is not making outbound emails so opaque that sensitive information can pass through without detection,” Michael Dayton, senior vice president of Axway’s security solutions group, said in a statement. “Agencies themselves may be providing the tools by which federal workers are leaking critical information—intentionally or not.”
The report recommends that agencies encrypt their employees’ email, but only in a way that allows the agency to decrypt and examine messages on their way out the door. Such are the complexities of trying to control information.
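That arrangement amounts to key escrow at the email gateway: each message is sealed with a fresh content key, and that key is wrapped both for the recipient and for an agency-held escrow key, so an inspection gateway can read outbound mail without holding every recipient’s key. Below is a minimal Python sketch of the idea; all names are my own, and the SHA-256 keystream is a toy stand-in for real cryptography, for illustration only.

```python
import hashlib
import secrets


def xor_bytes(a, b):
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))


def keystream(key, n):
    """Toy keystream derived from SHA-256 (illustration only, NOT secure)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]


def encrypt(key, data):
    return xor_bytes(data, keystream(key, len(data)))


decrypt = encrypt  # an XOR stream cipher is its own inverse


def seal_message(plaintext, recipient_key, escrow_key):
    """Seal a message under a fresh content key, wrapped twice:
    once for the recipient, once for the agency's escrow key."""
    content_key = secrets.token_bytes(32)
    return {
        "ciphertext": encrypt(content_key, plaintext),
        "wrapped_for_recipient": encrypt(recipient_key, content_key),
        "wrapped_for_escrow": encrypt(escrow_key, content_key),
    }


def open_message(sealed, key, wrap_field):
    """Unwrap the content key with `key`, then decrypt the message."""
    content_key = decrypt(key, sealed[wrap_field])
    return decrypt(content_key, sealed["ciphertext"])
```

A recipient calls `open_message(sealed, recipient_key, "wrapped_for_recipient")`; the agency’s outbound gateway uses its escrow key and `"wrapped_for_escrow"` to inspect the same ciphertext. The escrow wrap is precisely the deliberate weakening the report is asking for.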
The Atlantic ran a nice piece by Alexis C. Madrigal on what the law says “If I Fly A UAV Over My Neighbor’s House.” The piece includes an entertaining look at the legal history of airspace, which once belonged to the owner of the property below it “to the heavens” but evolved toward being open to the public (which was necessary if air travel was to become routine). While that public right-of-way is a good thing, it opens up tricky issues with regard to trespassing and privacy—issues that so far have not been resolved.
On the march-toward-a-surveillance-state front, Government Security News reports on a software product that allows law enforcement to “search, sort and score” the “mountain of personal information about specific individuals [that] frequently resides in a wide range of commercial databases.”
Steve Lowe, senior vice president and general manager for Intrado, told Government Security News during a phone interview last month that this kind of information is already being utilized on a regular basis by commercial businesses, such as insurance companies, law firms, entitlement organizations and credit bureaus, but rarely by police departments. “The sad thing is this hasn’t been available until now to law enforcement,” Lowe told GSN.
There is little detail about how “scores” on individuals are generated for the police. But the long-fading distinction between commercial-sector privacy and government privacy just faded a little bit more. And it became a little more important for people to worry about the contents of the dossiers kept on them in the files of commercial data brokers.
Finally, I came upon an interesting article on automotive security on the Linux site LWN.net, which explicitly takes up the issue I recently mused about: whether open-source code would help automotive security. The piece also paints a vivid picture of just how dire automotive security is.
First, there is a mistaken assumption that computing is not yet a pervasive part of modern automobiles. Likewise mistaken is the assumption that safety-critical systems (such as the aforementioned brakes, airbags, and engine) are properly isolated from low-security components (like the entertainment head unit) and are not vulnerable to attack. It is also incorrectly assumed that the low-security systems themselves do not harbor risks to drivers and passengers. In reality, modern cars have shipped with multiple embedded computers for years (many of which are mandatory by government order), presenting a large attack surface with numerous risks to personal safety, theft, eavesdropping, and other exploits.
Drawing on reports from 2010 and 2011, the piece describes how compromising diagnostic inputs, telematics functions (such as OnStar), and even a car’s CD player gave attackers access to vehicles that could compromise privacy (such as by uploading a car’s audio and GPS data to an internet server) or safety (such as by interfering with speedometers or brakes).
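Part of why a foothold in the CD player can reach the brakes is that the internal network most of these components share, the CAN bus, carries frames with no notion of sender identity or authentication. A small Python sketch of a simplified classic CAN frame layout makes the point; the ID 0x220 and the wire layout are illustrative assumptions (real controllers add arbitration, CRC, and ACK fields the sketch omits).

```python
import struct


def make_can_frame(can_id, data):
    """Pack a simplified classic CAN 2.0A frame: ID, length, payload.

    Note what is ABSENT: no sender field, no signature, no MAC.
    Any node on the bus can transmit any ID, which is why a
    compromised entertainment unit can speak as if it were the
    brake or engine controller.
    """
    if not 0 <= can_id <= 0x7FF:  # standard CAN IDs are 11 bits
        raise ValueError("standard CAN IDs are 11 bits")
    if len(data) > 8:  # classic CAN payloads are at most 8 bytes
        raise ValueError("classic CAN payload is at most 8 bytes")
    # 2-byte ID + 1-byte length + 8-byte zero-padded payload
    # (a simplified wire layout for illustration only)
    return struct.pack(">HB8s", can_id, len(data), data.ljust(8, b"\x00"))


def parse_can_frame(frame):
    """Unpack the simplified frame back into (can_id, payload)."""
    can_id, length, payload = struct.unpack(">HB8s", frame)
    return can_id, payload[:length]
```

Nothing in the frame lets a receiver verify who sent it, so isolation between low-security and safety-critical components depends entirely on network segmentation, which the research cited in the piece showed was often absent.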