Noctivagous
- Who Is Noctivagous?
- The Problem with Today's Computer Industry
- Noctivagous Computer Corporation
- Industrial Design for Future Noctivagous Hardware
- iPhone Alternative, Desktop Alternative
- Developed Inputs
Last updated on 10-14-22.
Who Is Noctivagous?
Computers Are Powerful Computationally, But Getting Them to Do Advanced Tasks Takes Lots of Specific Steps
You can’t make the computer do whatever you want. Without much effort, you should be able to make it take live video of you and match your movements to a 3D computer figure of your choosing. But that won’t be easy for you (at the moment) unless you are provided an app built for this specific chore or you are trained in this area of computing. Describing what you want in this scenario is not advanced, but even computer programmers would have to look up how to accomplish it.
Something that straightforward to describe ("match my video to a 3D model") cannot be described that simply to the computer, even though the computer is capable of doing it easily. Instead, getting the computer to do it requires following a long series of technical steps.
A user has to conform to whatever today’s software happens to be. It is important to change the conventions the computer software provides so that it isn’t hard to step right into advanced computing scenarios.
We want, for example, to let you assemble an airplane-safety-card style of illustrated instruction using software, feed it to the computer, and have the computer act according to that card. Unlike how the computer is configured right now, an airplane safety card amounts to telling the computer what to do in a way that makes it serve you.
Everyone knows that these computers have a lot of computational power underneath, so much that they can put you in a virtual space that looks quite like the world you live in from day to day. But they are hard to make use of whenever a scenario is at all advanced. In the movies it is easy to do advanced things with technology, but in our times it is hard without accumulating a lot of technical information in advance about each specific chore.
Making the computer less demanding in terms of specificity is important. If you want to make use of the computer, you can only relate to it in the ways it allows, which is to say, in the ways that were built into it. What could be built into it instead is a general-purpose, airplane-safety-card style of instruction set that all users can make use of after becoming familiar with the system, roughly sketched below.
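As a purely hypothetical sketch of the idea (the field names and step vocabulary below are invented for illustration and are not an existing Noctivagous format), such a card could be handed to the machine as a small declarative structure rather than as code the user must write:

# Hypothetical illustration only: an "instruction card" expressed as data the
# computer interprets. The panel structure and action names are invented.
instruction_card = {
    "title": "Match my movements to a 3D figure",
    "panels": [
        {"step": 1, "action": "capture", "source": "webcam", "output": "live_video"},
        {"step": 2, "action": "track_pose", "input": "live_video", "output": "body_pose"},
        {"step": 3, "action": "retarget", "input": "body_pose", "target": "chosen_3d_figure"},
        {"step": 4, "action": "display", "input": "chosen_3d_figure"},
    ],
}

def run_card(card, interpreter):
    # The interpreter object is assumed (hypothetically) to know how to
    # carry out each named action; the card only states what should happen.
    for panel in card["panels"]:
        interpreter.perform(panel)

The point of the sketch is the division of labor: the person assembles the illustrated steps, and the machine is responsible for knowing how to perform them.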
So then what happens when the whole world has this kind of setup, where a person can truly make better use of electronic computers? The whole world changes, because the modern world is permeated with computers that are really hard to use.
The Problem with Today's Computer Industry
When Computer Code Is More Powerful, Computers Will Be More Powerful
Technological Power Is Not to Be Measured in Horsepower (CPU Speed)
If a high-end automobile, a sports car, features extremely high horsepower but is difficult to handle around the curves of a road at low speed, or cannot absorb the minor bumps that come along, then that automobile cannot be considered truly powerful. It’s fast, but it takes too much effort on the part of the driver to manage it at important moments of its functioning.
That’s exactly how today’s computers are with respect to software development. They are extremely fast computationally, but it takes a great deal of work to make a complete package of software for a mundane business need. If you add some handling to the computer code, though-- and if you make the computer capable of providing organization to that code when the quantity of it becomes large-- you will have a much better car. But this isn't a priority for any computer company today. Worst of all, most software engineers will hardly tolerate discussion of it even though it is extremely obvious and bearing down on them.
Many companies and software organizations write similar code for the same purposes, and there is a great deal of duplication of effort across software. The computer is used in the same way at different types of businesses, but because each has slightly different requirements, they often have to commission custom-made code for the same topic (e.g. inventory or invoicing). This has to do with how computer code exists today: it is not very modular when it comes to sharing pieces of programs that have already been made.
There is also the problem of executives at large corporations having upside-down priorities. They are supposed to be the people who override the ignorant software engineers because they can see the big picture, getting advice from computer scientists of the 1970s who know what is happening today, but they don't see how they would stand to gain financially. If progress is made in this area (embedding information organization and "handling" in computer code), it isn’t easy to promote in the news media or in an advertisement, so a reporter won't appreciate it. The gains may take some time to be realized (frustrating a stockholder at a quarterly earnings report), and so it is easiest for executives to push for easy gains and pour enormous resources into projects that make use of the existing situation and incorporate faster or more advanced hardware technologies.
The quickest way to make a fortune at a computer company, for quarterly profit, is to speed everything up superficially with faster chips, put a new display inside a phone, and deliver nothing but gadgety features in flashy presentations. The public will be amazed and have no sense of the growing mountain of code sitting under the hood of the computer, since it is constantly being concealed by the army of software engineers. The competitive pressure indulged by corporate executives can reach absurd levels, to the point of removing the headphone jack that most buyers of the phone still use. Then stockholders are happy because profit is huge anyway, since so many new computers or phones were purchased. To them, evidence of profit is evidence that all is in order, so no one anywhere can be reasoned with. This is actually how it is today.
Just as a car’s overall evaluation cannot rest on the engine alone, computing power cannot be defined in terms of CPU clock cycles, bus speeds, and so on. After that side of the situation has been addressed and the raw speed of computation is available, there remains the question of how a person can adequately make use of those resources.
The chip companies are always trying to make faster processors by packing more transistors at microscopic scales. But there is another facet to computing power, and that is how adept the computer code is. If the code is always left unwieldy, as it is today, with no underlying organizational schemes— which is still the case— then a very fast computer will go to waste. It will run a sloth’s code conventions.
Looked at from the other side, all of the computing power out there remains enormously untapped.
Noctivagous Computer Corporation
An Upset Technological Environment
Before the jet engine arrived, a magazine once illustrated the concept of refueling stations spaced across the Atlantic Ocean, where transatlantic pilots could take a break with their propeller planes during the crossing. The arrival of the jet engine rendered this way of looking at air travel totally obsolete, because the problem was not that an airplane needed refueling breaks in the middle of the ocean but that its range was too short for such a long trip.
That's actually what is going to happen when computer code is upgraded. There are hundreds of startups and technology companies today that are unlikely to be around after computer code changes because the problems they solve-- or the services they offer-- are entirely fused to today's computer software systems. Many politicians do not want to speak to Noctivagous, but the joke is on them in the near future because when the WorldWideWeb changes radically (after computer code is upgraded) all of the public hearings they hold today will look extremely odd. Will Elon Musk want to have purchased Twitter? There is no way.
Just to give one specific example, some companies on the web exist only to help people put together web pages, because doing so is hard for a regular person. But there isn't a technical reason it has to be hard to make a web page. It's not hard to make a word processing document, so why would anyone have trouble making a personal home page today (outside of Facebook, etc.)? It is only hard because the engineering design of the WorldWideWeb is so bare. It could be as easy to build a website as it is to make a spreadsheet, but the W3C isn't willing to change web browsers in that way. No software engineer can dispute this. Similarly, Google exists because it "crawls" the web, but if the engineering design of the web were different, it wouldn't have to do that, because web servers could report what information they contain, in their own organized format, to various central databases upon request.
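To illustrate that last point with a purely hypothetical sketch (the report layout, field names, and ingest call below are invented for this example and do not describe any existing web standard), a server reporting its own contents might publish something like the following, which a central index simply requests and stores rather than crawling:

# Hypothetical self-report a web server could publish about its own contents.
site_report = {
    "site": "https://example.org",
    "updated": "2022-10-14",
    "entries": [
        {"path": "/articles/jet-engines", "kind": "article",
         "title": "Jet Engines", "topics": ["aviation", "history"]},
        {"path": "/catalog/widgets", "kind": "catalog",
         "title": "Widget Catalog", "topics": ["products"]},
    ],
}

def submit_report(report, index_database):
    # The central database (hypothetical interface) ingests the server's own
    # organized description instead of crawling pages and guessing at structure.
    index_database.ingest(report)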
The web, or WorldWideWeb as it used to be called, will shift dramatically, and it might even be replaced completely, because anyone will be able to rewrite these web browsers quickly. It may sound unbelievable, but some multi-billion dollar companies may not exist in the future because of this shift. You really don't need so many software engineers working on an operating system or software when the code configuration has changed at a deep level. The immediate quality of control over the system will have improved enormously.
The motivation for the German version of the jet engine was to develop a quieter engine than the loud propeller engines in use at the time, and this then turned into something of far greater importance and value for society.
We are, today, as if taking trips across the Atlantic and stopping at refueling stations. And our propeller engines are loud. When we upgrade code and provide ourselves the equivalent of a jet engine, we will have a new industry, just like how the consumer airline industry arrived.
Bundled Basics with a Noctivagous Computer
Real-world experience and needs are different from what ships on an operating system today, and one gets the impression that computer executives are disconnected from what people actually do in business and in real life, apart from the time people spend on entertainment. How else could they be so determined to replace your computer with a new one when everyone you know around you wants to keep the same one they've had running, knowing that it works just fine (or could, if the computer company wanted it to)?
People don't need a microwave oven connected to the Internet; they need a microwave oven that cooks the food thoroughly, not too hot, and sometimes toasts it. There isn't a microwave oven yet that keeps an inventory of foods. If A.I. is used for anything, it should be for this kind of domestic task. A.I. ought to be washing clothes, drying them, and folding them, all in the same machine. So that you don't have to get out of your car and face the gas fumes and the noise, the A.I. should remove the gas cap and pump the gas. All of this is feasible and within reach of current technology. But it will take far longer to make these kinds of devices than it should, because it takes too long to make the code to run these appliances. (Still today, A.I. software applications are typed out as if on a typewriter.) But overall, the convenience of a self-checkout stand can be applied elsewhere. In these cases, the technology serves society and people are not scared by it.
Computer companies have their priorities, which are to present the public with a set of loud products, gain praise in the news, and make a profit. You are afraid that if you don't buy the device you won't be measuring up, so perhaps they get your money. Or they suddenly make your old computer obsolete so as to pressure you into buying a new one. As a result, they don't make a better microwave oven; they try to hook it into the Internet. Your house then supposedly falls short unless it is like the one hooked into a Wi-Fi hub, where you make a noise and the lights turn on. Supposedly it would be a great future for all of your home appliances to sit on the Internet, maybe making money for you with cryptographic digital currency. This is truly garbage. It is an upside-down technological society, producing e-waste.
Everyday life as lived by normal people-- not extremely rich computer company executives-- is something altogether different from the discussions taking place inside these Silicon Valley offices. Very often, what the software provides only lightly intersects with actual needs.
Products with No Strings Attached
You have to be willing to give people what they need without any strings attached, and this is less common than ever. Because it is so rare, the technology environment is quite miserable to live in.
For example, if Microsoft bundles a video editor program with Windows, it attaches the condition that the person sign up with an e-mail address. By now, everyone knows how tiresome it is always to be signing up for services... Then it tricks the person into signing up for a subscription. This kind of problem is everywhere, and it won't be solved until computer code is upgraded, which is when it will no longer be a remarkable event to have a completed package of software.
When it isn't a remarkable event to have a completed package of software, software companies won't get away with doing small stuff and overcharging consumers.
Before an AR headset is sold, all computer users should have quality of life improved in the following way:
- Printing labels for all of the cables they own.
We use this one point to explain what computers should actually be doing for regular people. If you buy a computer and it comes with software, that software should help you print out labels for all of your cables and file folders, and you shouldn't have to search for any third-party software that costs anything (or makes you enter your e-mail address yet again).
Noctivagous intends to bundle a wide range of essential software for the person who buys the machine. This, of course, is possible when the programming systems are much more expansive in their approach to making apps, the way this website describes. The computer should do a lot for the basics of everyday life; someday it should be able to clean your room and tidy up your house, but it simply can't do that yet. That's how we look at the computer.
To achieve that goal, the entire mode of writing software in the operating system has to be redone. A large amount of software has to be rewritten quickly.
It isn't just that software will be faster to put together when code is improved; it's that new types of software applications will become available.
The Unfeasibility of Starting A New Computer Corporation without Immense Financial Resources Stems from The Primitive State of Code
- Consumer needs are much broader now than when the foundations of Windows, Mac, and Linux were established. Those operating systems were built up gradually over decades. Therefore, working with redesigned computer code is even more essential for making a new computer corporation.
- A smartphone is now faster than very expensive, high-end computers of the 1990s, so it must be the case that if the same code is being used today as in the 1990s, the computer industry has lagged in an important area.
- Computer processors advance, having become hundreds of times faster than a few decades ago, but they still run operating system schemes laid down in the 1970s and 1980s: the Unix lineage behind Mac and Linux, and a Windows design of similar vintage. In a general way, very fast machines are now implementing operating system theories from decades ago.
- Even when a new operating system arrives, the types of software it offers never reach into new territory like they used to. Software categories (e.g. video editors, 3D modelers) are exactly the same today as in the 1990s, with the exception of machine learning. The familiar apps (Premiere, Photoshop, etc.) just get faster to use or add new features to their well-established conventions.
- The 1990s serve as the general boundary for what people do today, because the last 20 years have never truly expanded beyond that era. The 1990s were a leap above the 1980s, but the 2000s and 2010s were not a leap above the 1990s.
- The feeling of using a desktop computer today is identical to the year 2000, except for certain improvements in trackpad and mouse devices.
- Mobile devices fascinate the public, but they are actually stripped-down, reconfigured, less-capable versions of the desktop computer, down to the code level. Although ubiquitous, neither the iPad nor the iPhone is as impressive as people believe it to be; they are just portable computers that feature touchscreens.
- The "Post-PC era" discussion was a false agenda, and those who promoted it didn't understand what was lost when a mouse and keyboard transmuted into a touchscreen that only allowed interaction using fingers. Much time was spent infatuated with the touchscreen and its input pinch-zoom gestures rather than what the computers were doing for people, which was the focus of the 1990s.
- Pressures to conform to technologies of the current computing environment make new OS efforts reluctant to break away from the bad habits developed in computing. Too many technologies now need to be accommodated, wasting a future computer company's time.
- A computer company is better off creating its own, independent computing approach, allowing others to join later, rather than immediately hooking into the existing industry.
Desktop Computer GUIs Are Unrecognized as a Mental Model for Interacting with a Computer.
- The computer is unlike any other contemporary machine because its functional parts are microscopic. This particular issue is hardly ever addressed in lay discussion. There are entire landscapes of micrometer-level circuits inside the computer and this means that how the computer works is remote from the physical level at which the programmer works with it. There is a very large difference in scale between a keyboard key and the elements of a microscopic chip.
- Though most programmers feel this only at a vague level, computer programming is largely about working within learned concepts pertaining to what the machine does behind its curtain, down at its very microscopic level. It never truly involves interacting with tangible components, of course. But there is always a misleading impression that programming is something materially substantial, and this is because of the conventions of the GUI.
- Programming with the multitude of programming libraries is like making machinery in a fume hood without being able to directly observe the machine parts, because basically everything is out of view. Enormous volumes of information must be learned in advance, such as how many bytes a given typewritten variable will eventually take up in memory. But allocated where? Somewhere in the machine, which is always out of sight, like everything else.
- What is inside the computer that is present in storage, or executing on the CPU, is much larger than the impression a person gets while using a GUI, even when opening task managers that show processes.
- There is very little feeling of materiality or physicality for the entities that reside in the computer. Whereas a car hums and a clock ticks as part of its operation, it is merely a fan that whirs occasionally on a computer. But that is just for cooling.
- In the real world, it is very obvious when someone intrudes on private property or territory, but that is not as much the case for a computer today because its current arrangement conceals too much of its internal activity. Inherently, it does not expose any physical characteristics that would let a person sense when something has gone wrong or has been transgressed by another person. Alerts and explicit indicators are required, always.
- Therefore, for computer security to improve, there may have to be opportunities to physically sense what is happening in the computer, however that might exist. This would be a new sort of setup, a configuration different from our current desktop computers and mobile devices.
- Organizing principles for the filesystem are bare, originating from the MULTICS directory tree structure of the 1960s. A node can extend only from another node, and this is a restrictive scheme for information organization. Folders can be placed inside folders, and that is all, as the sketch following this list illustrates.
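A minimal sketch of that organizing scheme, assuming nothing beyond what is described above (the class and folder names are illustrative, not taken from any real filesystem implementation):

class DirectoryNode:
    def __init__(self, name):
        self.name = name
        self.children = []           # the only relationship available: containment

    def add(self, child):
        self.children.append(child)  # a folder placed inside a folder, and that is all
        return child

root = DirectoryNode("/")
documents = root.add(DirectoryNode("documents"))
documents.add(DirectoryNode("invoices"))
# Within this scheme itself there is no way to relate "invoices" to a client,
# a project, or a date range except by inventing more nested folders.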
A Rejection of Empiricism as a Corporate Epistemology is Essential.
- Largely, modern science has appropriated knowledge acquired in ancient times, such as geometry. When it has added modern theories, it has claimed that all prior discussions actually fell under its way of thinking.
- Many aspects of mathematics are more spiritual than they appear, yet they have been reduced to rote equations memorized during high school and college. Much of the mathematics taught in high school is easy on the surface but can influence a person's deeper view of the universe.
- The notion of investigating world phenomena in an empirical way originates with René Descartes, who embarked on individual research pursuits that would be frowned upon in today's institution-driven, peer-review-based environment.
- Many discoveries, such as infrared (1800), predate the rigid research requirements that have developed over the last several decades, which prevent people from exploring the world on their own without approval from authorities in the empirical sciences.
- Those who fight for modern empirical thinking most fiercely are often the least aware of philosophy and epistemology, but have a knack for modern science.
(Page in Progress.)
Industrial Design for Future Noctivagous Hardware
Industrial Design Is No Longer Moving Forward, Apart from Making Things Thinner
Before talking about what Noctivagous has planned for industrial design, some discussion is needed to explain why Noctivagous will not follow the existing trends of modern design as exemplified by the iPad, iPhone, and Mac Mini, with their minimalist plastic and metal shells. There are people who might read this website and consider themselves design experts on account of having purchased Apple products. This could actually be the majority of people, including software engineers who have no design background at all but express strong, emotional design opinions because they appreciate that Apple has put care into its products that other companies have not. This is so commonplace that it is important to talk about the history of industrial design and architecture in the 20th century. Many people have the wrong idea that Apple's design principles emerged entirely from Apple itself, or perhaps from Jony Ive and Steve Jobs working together. In fact, the products you buy from an Apple store stem from modernist modes of thinking, which have shaped our civilized world in fundamental ways since the start of the 20th century.
In truth, there are few remaining modernist paths that industrial design can go down in the computer industry. When Apple's industrial designers try to change things up, they are generally not doing anything essentially different, and this is becoming more obvious from year to year. Noctivagous says that the theories of modernism are a root cause, that modernism should be abandoned to allow industrial design to move forward, because it carries unrecognized hindrances due to its banishment of decorative principles. For historical and ideological reasons, it offers nothing of long-term value, and it leaves designers running in circles or endlessly tweaking small features of their designs.
The first point of discussion for this claim, that modernist industrial design has nearly reached a dead end, is the enduring presence of the Titanium G4 PowerBook design, released by Apple in 2001. This design marks a major shift in industrial design for personal computers; it changed the course of laptop computing, although it was influenced directly by the Sony VAIO line of computers. The first VAIO laptop, the PCG-505, was sold by Sony in 1997, and its influence on the 2001 Titanium G4 PowerBook is obvious.
Some aspects of Apple's laptop line have changed over the years, and technological advances have allowed all laptop computers to be made thinner, giving the feeling that they are much more advanced. But from a general design perspective, every laptop computer sold by Apple today descends from the Titanium G4 PowerBook, and the industrial designers there have just been making minor improvements to that design. Tech enthusiasts, hungry for any technological stimulation, will obsess over minor changes to these shells, and this precludes public discussion of an important topic: that modernist industrial design holds Apple and the computer industry back.
The Titanium G4 PowerBook established an industrial design format that none have been able to break away from. What Apple has done since then is change the hinges and other exterior features of its laptops. Other companies, who regularly borrow designs from Apple, come up with their own variations. If made thinner, the 2001 Titanium G4 PowerBook design could easily be sold today. Examining the history of modernism will explain why this is the case, that people find it hard to break away from archetypal modernist designs established at certain points in history.
The Emergence of Modernist Design Theories
The history of industrial design for consumer electronics, or product design as it is called in the U.K., is intertwined with the school of architecture called modernism. Modernism, which got a running start with the Bauhaus school in the 1920s, claimed that all adornments, all decorative elements, are superfluous to the needs of man and industry, just as they are superfluous to the internal workings of a machine. The machine is efficient; it doesn't need decoration to do its job, and since man is moving forward towards higher levels of technological progress, breaking from his premodern past, he should take on a similar way of being. Man is moving into the future; he doesn't need decoration in his buildings, cars, and cities. He now has access to something he didn't have before for making buildings: steel and glass.
In modernism, all prior history was proclaimed to be outdated and influences were directly denied.
As the 20th century progressed, the detailed ornament on buildings was gradually reduced, with Art Deco presaging the widespread elimination of decorative elements on the exteriors and interiors of buildings. It was the modernists' objective to reduce the use of adornment on the exterior of anything ("form follows function" is a phrase from Frank Lloyd Wright's mentor, Louis Sullivan), as any decoration originating from the past was perceived as a tether to backwardness. What many people wanted just after the turn of the century was to strive towards the future, emphasize science, and break down all barriers to technological progress. In the modernists' view, every sort of object in everyday life could be whittled down to its essential form. The Bauhaus tried to design goods and buildings without any floral patterns or traces of history, reduced to their underlying shapes.
Simultaneously, as people began to favor design without any ornament, they stopped passing down the skills and knowledge of the craftsmen and architects who designed buildings and their interiors. Soon it was no longer that people were rejecting the past; they simply didn't know how to connect with it at all.
The result, in the design of things, was that by the 1950s flat, undecorated surfaces came to predominate. This is also pronounced in any Apple Store, which is so close to early 20th-century modernist architecture that it approaches outright copying from a textbook. This is also a tricky topic, because so much of modernism looks the same.
Because of Steve Jobs, modernism was applied in a direct way to Apple's computers, its music devices, and its stores. The large glass doors at an Apple store are certainly taken from the glass panels of the Barcelona Pavilion. Steve Jobs' relationship with modernist design actually predated his return to Apple, first with the Apple IIc and next with the NeXT cube. For the 1998 iMac and the G3 and G4 towers, Ive and Jobs tried to use unusual materials, such as translucent plastic. The appearance of Apple's later, lauded designs is actually a result of Steve Jobs borrowing heavily from modernism and Japanese design.
Modernism, Japan, and technological device design are often mixed together. Early twentieth-century modernist architects like Frank Lloyd Wright did, in fact, take some inspiration from Japan after visiting it (a country which, ironically, has a rich history of ornamental design), and at one time Wright designed a few buildings there. As Japan modernized, it was influenced in turn by modernist design. It brought its long cultural history into its own take on modernism, which is often appealing to the outside world. This is why Jobs is said to have pointed to Sony as an ideal consumer product company, and he took trips to Japan. Sony, largely speaking, imbues its industrial designs with the Japanese bearing (or at least it did prior to the 2010s).
Postmodernism and Other Reactions to Modernism
Starting especially in the 1960s and 1970s, designers and architects expressed frustration with the paucity of interesting features in modernism. They recognized that one could only take so many blank, drab slabs.
Several schools of architecture came after modernism and are considered direct reactions to it. Their aesthetic principles are most often extremely poor while also being very theoretical. Usually, these schools demonstrate variations on destroying or mixing up the generally flat or boring appearance that modernist architecture and design produces. These include postmodernism, which added architectural elements to buildings as whimsical allusions to earlier architectural history. Postmodern architects would add, say, a Greek column to a building, with all of its defining features removed except the most recognizable ones. Or they would build it lying on its side deliberately, as if it had fallen over, while also making it look new. 1960s Las Vegas is regarded as a major source of inspiration for postmodernism, in part because of its freewheeling look.
But, as we will discuss later, because all of the reactions to modernism were devoid of genuine, traditional decorative goals, having followed a modernism that rejected decoration altogether, they were forced to reuse old decoration in strange ways or break apart the modernists' planes into random, self-directed arrangements and collages. Modernism and its successor design movements can introduce total nonsense and try to pass it off as serious design work, and this is a consequence of abandoning the true artistic principles that accompanied mankind before the 20th century.
Praise for Apple over Other PC Vendors in the Mid-2000s
What is the difference, then, between Apple and the rest of the computer industry, which has also existed in the modernist age alongside Apple? Before Steve Jobs emphasized industrial design strongly in front of the public, the rest of the personal computer industry did not pay attention to the appearance of its machines, and it inherited modernist thinking without knowing it. So, yes, the off-brand PCs were unadorned just as the Bauhaus once prescribed, not putting any decoration on anything associated with modern life, but there is a big gap between that and modernist architecture. The PCs were made without any goal of making the computer box look appealing, and that has nothing to do with the Bauhaus or modernist architecture.
This is also true for the modern urban environment, in that there are many buildings that inherit modernist architectural thinking, but their builders or architects have scant knowledge of, or no interest in, making their presence appealing to the public. So the Apple Store stands out as a nice place to go, while subscribing to the same design philosophy present throughout the rest of modern U.S. cities, which people would rather not visit because they are dreary, like many federal buildings. There are also many architects who point out that it is modernism itself that has created dreary urban landscapes, and they are quite right to think so.
Modernism, because it is without any tie to the past, engages in lots of arbitrary exercises, like making a gigantic cube out of glass that sits in the middle of Manhattan. It ends up focusing on irrelevant matters and ambitions, like constructing a building entirely out of glass just for the sake of constructing it out of glass, not because it would look better with glass per se or because the glass serves some actual purpose with respect to the use of the building. For example, as soon as employees moved into the famous Apple Park building, they were running into the glass walls, and sticky notes were placed on them to provide warnings. This kind of thing only happens when the project architects are missing basic standards, but it is unavoidable because the professors who taught them were also without them, and it has been like this going back decades.
Modernism can also send industrial designers down roads of design tweaking; they might start to care obsessively about which type of screw will affix the machine together. The least significant details of a design will be tended to for long periods of time, at the expense of the project as a whole. This particular habit can be latched onto by the lay public, who is eager to understand art and graphic design in an easy way, one which doesn't require any effort. At the center of this is the "clean vs. cluttered" notion, wherein anything more than a bare appearance, sitting simplified and empty, is saying more than needs to be said or displaying more than has to be shown. It is this that enables the regular computer user, especially startup or corporate executives, to begin prescribing design guidelines according to their own feelings. They feel that they have reached the pinnacle of design upon placing a Helvetica-like font in the direct center of an otherwise blank page. Modernism feels accessible to the public when it is applied to computer technology, and anyone can feel like an expert right away, without any checks on bad design behavior. Marginal changes to a corporate logo, or simply the abolition of its decorative elements, become justification for the business's marketing to show itself off and solicit applause from conference audiences.
Industrial Design That Restores Ornament
Jony Ive's LoveFrom recently designed an award that grasps the value of restoring ornament and decorative principles, and it demonstrates that Ive has a willingness to pivot in his work and do what is demanded, not just what others have done. In contrast to this seal, all of his prior designs were strictly modernist, without any kind of decoration. The LoveFrom entity is apparently a collective of individuals, but since it is led by Ive this seal must be considered a full endorsement of ornament, at least in the circumstance where LoveFrom implemented it.
What does a computer look like that is more than blank sheets of machined metal, embodying elements from traditional ornament? This is a topic that will require some work by all interested companies to answer, for certain.
(Page in Progress.)
iPhone Alternative, Desktop Alternative
Always the iPhone
Often, it seems that there is no way to make a phone that doesn't carry the general design of the iPhone. If there is ever an attempt to do something completely different, it is as if going backwards to before 2007, with hardware that carries too many physical controls, leaves too little screen space, and produces results that aren't worth the price of a smartphone. Really, if you do anything different, no one can watch movies at full size on their mobile screens, so it is often a non-starter. Still, there is no way that the 2007 iPhone and 2010 iPad are the final state of interactive, mobile computing. At the minimum, they are too nubby and dumbed-down. There is no way that the same tablet computer that works for young children should be what adults use, especially tech-savvy adults, but that is the current state of touchscreen interfaces.
The dilemma is said to be as follows. If you add physical buttons to the front of the mobile screen, especially keyboard keys, you sacrifice screen space for controls that aren't always in use. The touchscreen, by contrast, allows whatever controls are needed to appear whenever they are relevant to the open software application.
But if you stick with the touchscreen as it is today, you end up making an iPhone knockoff, and you keep the same problem everyone experiences, which is that touching a flat screen directly with fingers feels imprecise and fake. The graphically-rendered buttons don't have any tactile quality; they definitely look and feel as though they are under the glass of the touchscreen. Yes, it is true that any number of user interfaces can be generated for the user underneath that glass, but sometimes there is the feeling that physical buttons would be preferable, just that no one quite knows how to do it.
Navigating the UI with a Stylus Is Mostly Gone, But Can Be Found in Industrial and Warehouse Settings
Then, of course, there was the old stylus of the Palm Pilot and other devices, which are always set in contrast to the full-sized, multitouch screen of the iPhone and iPad. The stylus was intended to provide precision, but the experience of using it was as if a bad set of notes was being played. Using a stylus on a mobile screen involved lots of hunting and pecking, putting you in some kind of charade that you were doing things that helped you, but probably you were just wasting your time with a gadget. It didn't feel like navigating a device effectually or writing notes on paper. The metal or plastic stylus slid all over the plastic screen.
Today, manufacturers primarily limit the use of the stylus to drawing or writing, especially on e-Ink devices, whereas on the Palm device it was expected to be the tool for navigating the interface. That shift is understandable, but as preferred as the finger is on a touchscreen today, the stylus truly did allow precise, rapid targeting of the buttons on the screen; this is why the stylus is still used in shipping, receiving, and inventory control. Like it or not, business sometimes reveals what is true about user interface, regardless of whether tech enthusiasts think fax machines and pagers should go away. The imprecise experience of navigating a mobile device with finger touches is still an issue: in industrial settings, it is faster to use the stylus because the user does not need to check whether the finger has hit the right area of the screen. In industrial settings, too, fingers get dirty and so do screens, and the use of a stylus keeps equipment in good shape.
In the end, though, this user interface spectrum, in which the stylus sits on one side and the finger on the other, is something that can be moved beyond.
Always Pretending That It Isn't a Machine While Designing for It
From this point forward, we are going to talk about user interface principles that have never been discussed in the computer industry, or at least they were never built in a deliberate way into consumer electronic devices. This is Noctivagous Computer Corporation, not Apple or Google.
The issue is that designers at computer companies do not recognize that when you design interactions with a machine, you must acknowledge that the machine carries its own bearing, its own nature, and that this shouldn't be overridden for the purpose of giving the end user a more familiar experience. It is OK that the person is using a machine, and it is right to accept it as such instead of forcing it to feel like something different.
Although the iPhone was and still is successful, it is a product of a consumer technology industry that is always trying too hard to make technology feel extremely accessible to the non-techy, because sales is the goal. The problem is that there is no room left for philosophical discussion about what technology should be in people's hands, and this cuts it off from developing in any other direction. Therefore, it looks like the iPhone cannot be surpassed. All that Apple can talk about is how to make products that the lay public can accept and use right away without having to think about anything. Although on the surface this looks like a virtuous stance, it actually limits the course of technology severely. Reducing the computer experience to its most basic levels of interaction is treated as some kind of timeless value, even though it really says just one thing: anything confusing is undesirable, and the solution is to whittle everything down to nubbiness.
Yes, if you remove the stylus and replace it with fingers, this is somewhat better. But you can't really use a stylus on a Palm Pilot screen as if it were a paper interface to a computer, because doing so is an affectation; the person is performing an act, pretending he is writing on paper as normal, when really he is interacting with dynamic, electronic controls that happen to be displayed on a flat surface— not the same thing. In the history of personal computing, people have always tried to make machines be something other than what they are, rather than embracing their nature as machines. You don't see this with old mechanical automata, like the mannequins that would write words on paper; no one is afraid to let everyone know they are machines. The same is true for watches and clock towers. But with consumer electronics, the manufacturers pretend that they are not designing hardware that carries buttons and electronics inside. They want to impose simulations of familiar things onto the machines. Apple does this still today. It is perceived as making things more "friendly."
There is a false dichotomy in people's minds with respect to how electronic machines should be designed for everyday use and how a user should interact with them. They believe that if you embrace the nature of a machine, that it carries that mechanical feeling, you will be bringing geekiness into the situation or the device will feel cold and uninviting. But this is not really what puts people into a geek state of mind or makes machinery feel inhospitable. Acknowledging that a machine carries the nature or bearing of a machine is the right thing to do upfront; pretending that it isn't one, forcing someone to feel like it isn't, is what creates unpleasant experiences, like a stylus on those old tablet devices such as the Newton and Palm Pilot.
If the Palm Pilot back in the day was saying, "look at you, you are writing on mobile, interactive paper you can carry anywhere," the iPhone says, "look at you, you are touching buttons with your fingers." Therefore, they actually share the same problem, that they are denying what the device actually is made of, electronic components, and asking you to enter into a charade. The same is actually also true of the GUI desktop ("look at you, you are moving 'windows' around on your very own office desktop. You've got your own folders, too.").
As it happens, icons, which lead to this roleplay, were introduced over a decade after the mouse was demonstrated by Douglas Engelbart in his famous demo. So, it isn't the mouse that brought this situation into being, nor touchscreens.
The Microfilm Viewer as Case Study
The microfilm viewer is a great example of a mechanical device that does not deny that it is a machine and works well for its purpose. No one has a significant problem with it— that is, it has never been a subject of complaints by library researchers over the decades— and it is as mechanical in user experience as any commercial device could be. Whether it could be improved in one way or another is not something people jump to discuss the way they do with today's technologies, although some models are definitely easier to use than others. It doesn't take on any posturing, such as "look at you, you are reading a real newspaper," and this lack of pretense happened naturally because it is made mostly of mechanical rather than modern computer parts. Of course, its screen image is physically projected, not rendered digitally and displayed through pixels. This is probably a reason no charades are introduced into the situation. It is what it is, from the beginning.
Industrial design being a separate topic, a machine with this mechanical dynamic— one that is without any affectation and gets the job done quite well without causing major complaints— is unlikely ever to be unveiled on stage by Apple. Yet, it is the technological direction that everyone should prefer.
Every time you move a GUI window around with the mouse, so much computation is happening that it is almost indescribable. For a window that is 1000x1000 pixels in size, you are dragging one million pixels that need to be recalculated for each one-pixel step of the mouse, and the nearby windows on the computer also have to be reprocessed. Their internal needs have to be re-examined by the system or by the applications themselves. It isn't as simple as the charade looks.
Put differently, you start dragging, and each pixel step your mouse moves during the drag requires the rearrangement in memory of one million pixels and then a redisplay of them on the monitor— for just that one pixel step. If you move the window a distance of 400 pixels across the screen, at least 400 million pixel updates must occur (more, in fact, because surrounding elements need to be re-rendered). This is a gigantic amount of computation for the task of physically moving static content.
Engineers often talk about efficiency, but their area of focus on this subject is off base; they lack a larger frame of reference. In the microfilm projector there is no computation at all for rotation, zoom, or panning.
It actually took the entirety of the 1990s before computers could quickly process this volume of data, to move a facsimile so smoothly across a computer screen that its physical movement looked close to handling printed media in the real world.
Also worth mentioning is that today, moving around a 1000x1000 pixel window might actually require processing as many as four million pixels on a very high-resolution monitor, such as a 4K display, because at a 2x display scale each logical pixel is backed by four physical pixels.
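As a back-of-the-envelope sketch of the figures above (illustrative only; real window systems use damage regions, compositing, and GPU acceleration, so the work is organized differently, but the scale is the point):

window_width = 1000                               # pixels
window_height = 1000                              # pixels
pixels_per_step = window_width * window_height    # 1,000,000 pixels per one-pixel move

drag_distance = 400                               # pixels moved across the screen
total_updates = pixels_per_step * drag_distance   # 400,000,000 pixel updates

display_scale = 2                                 # e.g. a 4K monitor at 2x scaling
total_updates_hidpi = total_updates * display_scale ** 2   # 1,600,000,000

print(f"{total_updates:,} pixel updates to drag the window 400 px")
print(f"{total_updates_hidpi:,} at {display_scale}x backing scale")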
To be clear, the two technologies are not the same in purpose. A computer carries a much greater breadth of capabilities, with deep versatility provided by those pixels, as they can be updated to produce animations or display any sort of data, whether it is retrieved over a network or on disk.
But the microfilm projector does not show up in landfills often, or at least it did not for a long time, and this is something we should pay attention to; these machines last for decades, and when they were in active use no one complained about using an "ancient" one.
Newer microfilm projector models were released that improved on older conventions, but there wasn't a rush to toss out the old ones as soon as a new one was made. Microfilm isn't a technology that is vulnerable to obsolescence; even GitHub, the free, international source code repository, archived its public repositories on film in a vault in the Arctic.
Although this topic pertains to a specific area, display technology, it points to a broader pattern in computing technology: the path of advancement can be shifted towards longer-term outcomes that require far less overhead. There are not many examples of these that compete with today's computing technologies, but that may change in the future.
(Page in Progress.)
Developed Inputs
Software Development Should Have Its Own Hardware
It is time that programmers consider that custom programming hardware could bring software into a much better state. In the electronic music industry, there are all sorts of electronic instruments and input devices. Musicians never hesitate to fill their studios with equipment for their craft.
As software development incorporates more graphical features, the hardware input devices can grow to accommodate the new IDE functionality.
At first, it might be good to partner with these electronic music companies to produce this hardware, since they are already experienced and their device configurations would only need a little tweaking or changing of control labels to match the needs of programming.
The programming we are talking about, it goes without saying, has nothing to do with what people are doing today, which is silly.
(Page in Progress.)
