University of Toronto WearComp Linux Project

Dr. Mann describes his WearComp (“Wearable Computer”) invention and how it has come to embody the same philosophical basis for self-determination and mastery over one's own destiny that characterizes the Linux operating system, which currently runs on WearComp.

This paper is part one of a two-part series. In this part I will describe a framework for machine intelligence that arises from the existence of human intelligence in the feedback loop of a computational process.

I will also describe the apparatus of the invention that realizes this form of intelligence, beginning with a historical perspective outlining its visual and photographic origins. The apparatus of this invention, called “WearComp”, emphasizes self-determination and personal empowerment.

I also intend to present the material within a philosophical context I call COSHER (Completely Open Source, Headers, Engineering and Research) that also emphasizes self-determination and mastery over one's own destiny.

This “personal empowerment” aspect of my work is what I believe to be a fundamental issue in operating systems such as Linux. It is this aspect that WearComp and Linux have in common, and it is for this reason that Linux is the selected operating system for WearComp.

An important goal of being COSHER is allowing anyone the option of acquiring, and thus advancing, the world's knowledge base.

I will also introduce a construct called “Humanistic Intelligence” (HI). HI is motivated by the philosophy of science, e.g., open peer review and the ability to construct one's own experimental space. HI provides a new synergy between humans and machines that seeks to involve the human rather than having computers emulate human thought or replace humans. Particular goals of HI are human involvement at the individual level and providing individuals with tools to challenge society's preconceived notions of human-computer relationships. An emphasis in this article is on computational frameworks surrounding “visual intelligence” devices, such as video cameras interfaced to computer systems.

Problem Statement

I begin with a statement of what I believe to be a fundamental problem we face in today's society as it pertains to computers and, in particular, to computer program source code and disclosure. Later, I will suggest what I believe to be solutions to this problem. Linux is one solution, together with an outlook based on science and on self-determination and individual empowerment at the personal level.

A first, fundamental problem is that of software hegemony, seamlessness of thought and the building of computer science upon a foundation of secrecy. Advanced computer systems are an area in which a single individual can make a tremendous contribution to the advancement of human knowledge, yet such an individual is often prevented from doing so by various forms of software fascism. A system that excludes any individual from exploring it fully may prevent that individual from “thinking outside the box” (especially when the box is “welded shut”). Such software hegemonies can prevent some individuals from participating in the culture of computer science and the advancement of the state of the art.

A second fundamental problem pertains to some of the new directions in human-computer interaction (HCI). These new directions are characterized by computers everywhere, constantly monitoring our activities and responding intelligently. This is the ubiquitous surveillance paradigm in which keyboards and mice are replaced by cameras and microphones watching us at all times. Perpetrators of this environmental intelligence claim we are being watched for our benefit and that they are making the world a better place for us.

Computers everywhere constantly monitoring our activities and responding intelligently have the potential to make matters worse from the software hegemony perspective, because of the possibility of excluding the individual user from knowledge not only of certain aspects of the computer upon his or her desk, but also of the principles of operation and the function of everyday things. Moreover, the implications of secrecy within the context of these intelligence-gathering functions pose a serious threat to personal privacy, solitude and freedom.

Figure 1. Evolution of the WearComp Invention

Computer Science or Computer Secrecy

Science provides us with ever-changing schools of thought, opinions, ideas and the like, while building upon a foundation of verifiable (and sometimes evolving) truth. The foundations, laws and theories of science, although accepted as true, may at any time be called into question as new experimental results unfold. Thus, when doing an experiment, we may begin by making certain assumptions, but at any time these assumptions may themselves be questioned and re-verified.

In particular, a scientific experiment is a form of investigation that leads wherever the evidence may take us. In many cases, the evidence takes us back to questioning the very assumptions and foundations we had previously taken as truth. In some cases, instead of making a new discovery along the lines anticipated by previous scientists, we learn that another previous discovery was false or inaccurate. Sometimes these are the biggest and most important discoveries—things that are found out by accident.

Any scientific system that tries to anticipate “what 99% of the users of our result will need” may be constructing a thought prison for the other 1% of users who are the very people most likely to advance human knowledge. In many ways, the entire user base is in this thought prison, but many would never know it since their own explorations do not take them to the outermost walls of this thought prison.

Thus, a situation in which one or more of the foundation elements are held in secret is contrary to the principles of science. Although many results in science are treated as a “black box” for operational simplicity, there is always the possibility that the evidence may lead us inside that box.

Imagine, for example, conducting an experiment on a chemical reaction between a proprietary solution “A”, mixed with a secret powder “B”, brought to a temperature of 212 degrees T (a top-secret temperature scale which you are not allowed to convert to other units). It is hard to imagine where one might publish results of such an experiment, except perhaps in the Journal of Non-Reproducible Results.

Now, it is quite likely that one could make some new discoveries about the chemical reaction between A and B without knowing what A and B are. One might even be able to complete a doctoral dissertation and obtain a Ph.D. for the study of the reaction between A and B (assuming large enough quantities of A and B were available).

Results in computer science that are based, in part, on undisclosed matters inhibit the ability of the scientist to follow the evidence wherever it may lead. Even in a situation where the evidence does not lead inside one of the secret “black boxes”, science conducted in this manner is irresponsible in the sense that another scientist in the future may wish to build upon the result and may, in fact, conduct an experiment that leads backwards as well as forwards. Should the new scientist follow evidence that leads backwards, inside one of these secret black boxes, then the first scientist will have created a foundation contaminated by secrecy. In the interest of academic integrity, better science would result if all the foundations upon which it was built were subject to full examination by any scientist who might, at some time in the future, wish to build upon a given discovery.

Thus, although many computer scientists may work at a high level, there would be great merit in a computational foundation open to examination by others, even if the particular scientist using that foundation does not wish to examine it. For example, the designer of a high-level numerical algorithm who uses a computer with a fully disclosed operating system (such as Linux) does other scientists a great service, even if he works only at the API level and never intends to look at the source code of the libraries he calls or that of the Linux operating system underneath them.
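To make the “API level” remark concrete, here is a minimal sketch (my own illustrative example, not part of the WearComp code base): a small C program that implements a high-level numerical routine purely against the documented libm interface. On a fully disclosed system such as Linux with the GNU C library, the source behind that interface remains open to any scientist whose evidence later leads beneath it. (The build command and the C99 dialect are assumptions.)

/* rms.c -- a minimal sketch of working at the API level (illustrative only).
 * The high-level routine below calls only the documented libm interface
 * (sqrt); on a fully disclosed system such as Linux with glibc, the source
 * implementing sqrt() can itself be examined if the evidence ever leads there.
 * Assumed build: cc -std=c99 rms.c -lm -o rms
 */
#include <math.h>
#include <stdio.h>

/* Root-mean-square of n samples, written against the public API only. */
static double rms(const double *x, size_t n)
{
    double sum = 0.0;
    for (size_t i = 0; i < n; i++)
        sum += x[i] * x[i];
    return sqrt(sum / (double)n);   /* foundation call: libm's sqrt() */
}

int main(void)
{
    const double samples[] = { 1.0, 2.0, 3.0, 4.0 };
    printf("rms = %f\n", rms(samples, sizeof samples / sizeof samples[0]));
    return 0;
}

The point is not the algorithm itself but that nothing beneath it is welded shut: the same scientist, or a later one, can descend from this code into the library and kernel sources without asking anyone's permission.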

Figure 2. ECE1766 Class Picture
