rdist

December 1, 2014

Was the past better than now?

Filed under: Security — Nate Lawson @ 4:00 am

Here we go again — another article arguing whether the past was better or not (this one says “better”). These articles are tiresome, rehashing the debate over whether technology is enabling or isolating and dehumanizing. But I’m interested in a different line of technology criticism: which parts of technology are a regression, and what to do about that.

From the first stone tools, technology has both reflected us and changed us. When we became farmers, we became less mobile and more vulnerable to robbers, and for the first time it was possible to measure capital via a plot of land’s quality and location.

When evaluating today’s technology, I think it’s important to keep a flexible point of view and not be limited by a linear view of history. For example, what would digital cash look like today if we had adopted a 10-year land ownership rotation back then? A linear progression from good to bad (or bad to good) ignores a more nuanced view that weighs the good against the bad, leading to an understanding of what we can do about it.

Even though I work with developing new technology every day, I’m reluctant to adopt it until I have the time or motivation to review it thoroughly. There are two main reasons:

  1. Advances in technology often come with critical regressions
  2. What you use changes you, your way of thinking, and what you believe to be possible

The microwave oven was a huge advance in heating speed, but it sacrificed temperature control. It is still difficult to find one that allows you to heat food to a particular temperature. Instead, you have to guess at the combination of watts and time. Software is even more plastic. You can be using code written by a 20-year-old JavaScript newbie for reviewing the intricacies of your personal genome. Calling this entire technology a step forward or back is much too simplistic, and it lets said programmer off the hook for not knowing their own history.

Computer history should be a mandatory part of the curriculum. I don’t mean dry facts like the date the transistor was invented or which CPU first implemented pipelining. I mean criticism of historical choices in software or system design, and an analysis of how they could have been done differently.

Here are some example topics to get you started:

  1. Compare the Mac OS X sandboxing architecture to the Rainbow Series. Which is more usable? Compare and contrast the feature sets of each. Create an alternate history where modern Unix systems had thrown out UIDs and built on a data-centric privilege model.
  2. In terms of installation and removal, how do users expect iOS and Android devices to treat mobile apps? How does this compare to Windows programs or Linux packages? What are the potential side effects (in terms of system or filesystem changes, network activity, etc.) of installing a program? Running it? Removing it?
  3. Some developers have advocated “curl | sh” as an acceptable installation method as a replacement for native packages. They argue that there is no loss of security compared to downloading and installing a native package from an uncertain origin. Compare the functionality and risks of “curl | sh” to both a common package system (e.g., Debian dpkg) and an innovative system (e.g., Solaris IPS), focusing on operations like installing a package for the first time, upgrading it, installing programs with conflicting dependencies, etc. What is truly being lost, if anything?

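To make topic 3 concrete, here is a minimal sketch of the step that “curl | sh” skips and a package manager performs for you: verifying that what you downloaded matches a digest published out of band before executing it. The file name and digest source are illustrative only; a real installer and its checksum would come from the project’s signed release page.

```shell
#!/bin/sh
# Stand-in for a fetched installer (in real use: curl -O https://.../install.sh).
printf 'echo installed\n' > install.sh

# The publisher would ship this digest separately (e.g., on a signed
# release page). We compute it here only to simulate that channel.
expected=$(sha256sum install.sh | awk '{print $1}')

# Verify before executing -- the step that "curl | sh" omits entirely.
actual=$(sha256sum install.sh | awk '{print $1}')
if [ "$actual" = "$expected" ]; then
    sh install.sh   # runs only after the hash matches
else
    echo "checksum mismatch; refusing to run" >&2
    exit 1
fi
```

Even this sketch only covers integrity of a single download; it says nothing about upgrades, conflicting dependencies, or clean removal, which is where the comparison to dpkg or IPS gets interesting.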
Good design and engineering involve knowing what has come before, so we can move forward with as little loss as possible. Engineers should study that history to avoid repeating its mistakes. The past wasn’t better than the present, but ignoring it makes us all worse off than we could have been.
