University of Toronto WearComp Linux Project
Imagine a clock designed so that when the cover was lifted off, all the gears would fly out in different directions, such that a young child could not open up his or her parents' clock and determine how it works. Devices made in this manner would not be good for society, in particular for the growth and development of young engineers and scientists with a natural curiosity about the world around them.
As the boundary between software and hardware blurs, devices are becoming more and more difficult to understand. This difficulty arises in part as a result of deliberate obfuscation by product manufacturers. More and more devices contain general-purpose microprocessors, so that their function depends on software. Specificity of function is achieved through specificity of software rather than specificity of physical form. By manufacturing everyday devices in which only executable code is provided, manufacturers have provided a first level of obfuscation. Furthermore, additional obfuscation tools are often used in order to make the executable task image more difficult to understand. These tools include strippers that remove things such as object link names and even tools for building encrypted executables which contain a dynamic decryption function that generates a narrow sliding window of unencrypted executable, so that only a small fragment of the executable is decrypted at any given time. In this way, not only is the end user deprived of source code, but the executable code itself is encrypted, making it difficult or impossible to look at the code even at the machine-code level.
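The sliding-window decryption scheme described above can be sketched in a few lines. This is a toy illustration only, assuming a trivial XOR cipher as a stand-in for whatever real cipher such a product would use; the names (`KEY`, `run`, the 8-byte window) are illustrative, not taken from any actual product:

```python
from itertools import cycle

KEY = b"secret"  # illustrative key; a real scheme would hide this in hardware

def xor(data: bytes, key: bytes = KEY) -> bytes:
    """Toy stand-in for the real cipher: XOR with a repeating key."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

# The "executable" is stored fully encrypted...
plaintext = b"instruction stream of the protected program"
encrypted = xor(plaintext)

def run(encrypted: bytes, window: int = 8):
    """Dynamic decryption function: at any instant, only `window` bytes
    of the program exist in decrypted form."""
    for offset in range(0, len(encrypted), window):
        chunk = encrypted[offset:offset + window]
        # The key position must track the absolute offset into the image.
        key_slice = bytes(KEY[(offset + i) % len(KEY)] for i in range(len(chunk)))
        clear = bytes(b ^ k for b, k in zip(chunk, key_slice))
        yield clear  # "execute" the fragment, then discard it
```

A debugger or disassembler pointed at such a process never sees more than one small window of cleartext, which is precisely what makes machine-level examination impractical.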
Moreover, complex programmable logic devices (CPLDs), such as the Altera 7000 series, often have provisions to permanently destroy the data and address lines leading into a device, so that a single-chip device can operate as a finite-state machine yet conceal even its machine-level contents from examination. (See Resources 1 for an excellent tutorial on FPGAs and CPLDs.) Devices such as Clipper chips go a step further by incorporating fluorine atoms, so that if the user attempts to put the device into a milling machine to mill it off layer by layer for examination under an electron microscope, the device will self-destruct in a quite drastic manner. Thus, the Clipper phones could contain a “Trojan horse” or some other kind of back door, and we might never be able to determine whether or not this is the case—yet another example of deliberate obfuscation of the operational principles of everyday things.
We have a growing number of general-purpose devices in which the function or purpose depends on software, downloaded code or microcode. Because this code is intellectually encrypted, so is the purpose and function of the device. In this way, manufacturers may provide us with a stated function or purpose, but the actual function or purpose may differ or include extra features of which we are not aware.
A number of researchers have been proposing new computer user interfaces based on environmental sensors. Buxton, who did much of the early pioneering research into intelligent environments (smart rooms, etc.), was inspired by automatic flush urinals (as described, for example, in U.S. Pat. 4309781, 5170514, etc.) and formulated, designed and built a human-computer interaction system called the “Reactive Room” (see Resources 2 and 3). This system consisted of various sensors, including optical sensors (such as video cameras) and processing, so that the room would respond to the user's movement and activity.
Increasingly, we are witnessing the emergence of intelligent highways, smart rooms, smart floors, smart ceilings, smart toilets, smart elevators, smart light switches, etc. However, a typical attribute of these “smart spaces” is that they were designed by someone other than the occupant. Thus, the end user of the space often does not receive full disclosure of the operational characteristics of the sensory apparatus and the flow of intelligence data from the sensory apparatus.
In addition to the intellectual encryption described in the previous section, whereby manufacturers make it difficult, or perhaps impossible, for the end user to disassemble such sensory units in order to determine their actual function, there is also the growth of hidden intelligence, in which the user may not even be aware of the sensory apparatus. For example, U.S. Pat. 4309781 (for a urinal flushing device) describes:
... sensor... hidden from view and thus discourage tampering with the sensor... when the body moves away from the viewing area... located such that an adult user of average height will not see it... sensing means, will be behind other components... positioned below the solenoid to allow light in and out. But the solenoid acts in the nature of a hood or canopy to shield the sensing means from the normal line of sight of most users.... Thus most users will not be aware of the sensing means. This will aid in discouraging tampering with the sensing means. A possible alternate arrangement would be to place the sensing means below and behind the inlet pipe.
U.S. Pat. 4998673 describes a viewing window concealed inside the nozzle of a shower head, with a fiber-optic system disclosed as a means of making the sensor remote. The concealment is to prevent users from being aware of its presence. U.S. Pat. 5199639 describes a more advanced system in which the beam pattern of the nozzle is adapted to one or more characteristics of the user, while U.S. Pat. 3576277 discloses a similar system based on an array of sensing elements.
A method of creating viewing windows to observe the occupants of a space while at the same time making it difficult for the occupants to know if and when they are observed is proposed in U.S. Pat. 4225881 and U.S. Pat. 5726706.
In addition to concealing the sensory apparatus, a goal of many visual observation systems is to serve the needs of the system architect rather than the occupants. For example, U.S. Pat. 5202666 discloses a system for monitoring employees within a restroom environment, in order to enforce hygiene (washing of hands after using the toilet).
Other forms of intelligence, such as intelligent highways, often have additional unfortunate uses beyond those purported by the installers of the systems. For example, traffic-monitoring cameras were used to round up, detain and execute peaceful protesters in China's Tiananmen Square.
U.S. Pat. 4614968 discloses a system where a video camera is used to detect smoke by virtue of the fact that smoke reduces the contrast of a fixed pattern opposite the video camera. However, the patent notes that the camera can also be used for other functions such as visual surveillance of an area, since only one segment or line of the camera is needed for smoke detection. Again, the camera may thus be justified for one use; additional uses, not disclosed to occupants of the space, may then evolve. U.S. Pat. 5061977 and 4924416 disclose the use of video cameras to monitor crowds and automatically control lighting in response to the absorption of light by the crowds. While this form of environmental intelligence is purportedly for the benefit of the occupants (to provide them with improved lighting), there are obvious other uses.
U.S. Pat. 5387768 discloses the use of visual inspection of users in and around an automated elevator. Again, these provide simple examples of environmental intelligence in which there are other uses, such as security and surveillance. Although even those other uses (security and surveillance) are purportedly for the benefit of the occupants, and it is often even argued that concealing operational aspects of the system from the occupants is also for their benefit, it is an object of this paper to challenge these assumptions and provide an alternate form of intelligence.
When the operational characteristics, function, data flow and even the very existence of sensory apparatus are concealed from the end user, such as behind the grille of a smoke detector, environmental intelligence does not necessarily represent the best form of human-machine relationship for all concerned. Even when the sensors are visible, there remains the constant question of whether the interests of the occupant are identical to those of the people who control the intelligence-gathering infrastructure.
The need for personal space, free from monitoring, has also been recognized (see Resources 4) as essential to a healthy life. As more and more personal space is stolen from us, we may need to be the architects of alternate spaces of our own.