Open and Closed

Today, Facebook announced some major updates to their messaging system. There was a lot of speculation over the weekend that they were going to be launching an email client, perhaps in an effort to compete with Google and Gmail. And while email is definitely a part of this new messaging system, it isn't strictly email - it's a new way of handling messages altogether.

The service is currently opt-in (which means Facebook is learning about how to roll out features), so until I get a chance to give it a try for myself, I'll hold off commenting on it. If you want to learn more, you can read the blog post I linked to above, or watch a recording of their announcement. You can also catch Gizmodo's wrap-up of it, or read arguments for and against, courtesy of Lifehacker.

This announcement, however, has inspired me to write on something that I've been giving a good amount of thought to lately - open and closed systems.

At the risk of oversimplifying, let's define the two as follows:

An "open system" is one in which the system user, more or less, controls what they can do in it. A key component of an open system is the ability of a user to break from the intended use.

A "closed system", by contrast, is one where the system designers dictate what can be done within it. It is, by design, difficult to break from the intended use or functionality.

These definitions are geared somewhat more toward technical/computer systems, but the general ideas can be applied to many things:

A game of solitaire, played with physical cards, is an open system - you can break from the rules if you so desire. The same game, on a computer, is a closed system - breaking from the rules is, by design, impossible (unless you hack it somehow).
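The solitaire contrast can be sketched in code. This is just an illustrative toy, not anything from the announcement - the class and pile names are made up - but it shows the key difference: the same "move a card" action either goes unvalidated (open) or is checked against the designer's rules (closed).

```python
class PhysicalDeck:
    """Open system: the player performs any move; nothing stops rule-breaking."""

    def __init__(self):
        self.piles = {"stock": list(range(52)), "waste": []}

    def move(self, src, dst):
        # No validation at all - the user is free to ignore the rules
        # of solitaire entirely.
        self.piles[dst].append(self.piles[src].pop())


class SolitaireProgram:
    """Closed system: the designers' rules are enforced on every move."""

    def __init__(self):
        self.piles = {"stock": list(range(52)), "waste": []}

    def move(self, src, dst):
        # Only the intended use is possible; anything else is rejected.
        if (src, dst) != ("stock", "waste"):
            raise ValueError("illegal move")
        self.piles[dst].append(self.piles[src].pop())


physical = PhysicalDeck()
physical.move("stock", "waste")
physical.move("waste", "stock")   # a rule-breaking move, and nothing prevents it

program = SolitaireProgram()
program.move("stock", "waste")    # the intended move works fine
# program.move("waste", "stock")  # but this would raise ValueError
```

The point of the sketch: in the open system the rules live in the player's head, while in the closed system they live in the code, where "breaking from the intended use" is impossible short of modifying the program itself.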

Driving a car is, relatively speaking, an open system - you control what you bring in it, how fast you drive, where you drive, when you leave, where you park, etc. Riding a bus is a more closed system - you lose some of this control. Even less open would be air travel (as new TSA regulations are clearly demonstrating).

In the tech realm, we have high-profile clashes between Microsoft and Apple over operating systems. Windows is more open, which has the effect of making it more vulnerable to viruses. Apple controls their system more strictly, which affords users more protection, at the expense of some flexibility in what they can do with it.

This mentality extends to mobile phones, where Apple's iPhone is fairly well locked down - they tightly regulate what apps you can download. Google's Android system is more open; one of their ads proudly proclaims that "when there's no limit to what Droid gets, there's no limit to what Droid does".

And, speaking of Google, they're battling Facebook over online identities. Facebook very tightly controls your information, and has only just started allowing users to pull their information back out. Facebook is notoriously difficult to leave (there's no simple "cancel account" feature - only a "deactivate" feature). And you can really see the difference in their approaches by looking at Facebook versus Buzz.

Facebook doesn't play nice with other services, though they DO make it fairly easy to connect OTHER sites to Facebook. They pull data one-way, into their closed system.

Google Buzz, on the other hand, is fairly minimal in features of its own. Instead, they allow you to integrate other services of your choice to Buzz, without locking down the data. By comparison, Buzz is pretty open.

The way Facebook and Google, two GIANT aggregators of personal data, treat this data is a wonderful example of differing philosophies. Facebook, by nature of trying to "map your social network", MUST tie your data to you. There's no other way for it to work. Facebook wants to aggregate your personal data, and use it to map how you connect with other people. And, in (presumably) an effort to keep you on the Facebook system, they close this system, making it hard to export the data it has collected.

Google also collects data from its many users, and there's no way for a user to really "delete" the data Google has collected about search habits, program use, etc. The difference, however, is that Google's information is more or less anonymous. They don't need to tie your search data to you in particular - just that SOME user has made that search.

When I was watching Facebook's live stream (that I linked to above) earlier today, they talked at one point about the differences between how Facebook and "other services" (really, Google, and Gmail in particular) serve ads. Google determines the ads to serve based on the content of the messages; when I open an email from Toshiba, I'm served ads about laptop computers. Facebook, by contrast, serves ads based on the information you've given them. In my case, I often get ads for games, regardless of what content I'm viewing. This difference illustrates how the data they collect is used.

Now, none of this is meant to push one style of system over another, in general or in specific circumstances. Both have benefits, depending on the situation. As a general rule, I prefer open systems (so if my tone in describing closed systems above is slightly more negative, you know why). I think power and control should belong to the user, not the designer - good design, in my opinion, should GRANT users freedom, not take it away.

But I know not everybody agrees. There are tons of people who prefer the "just works" mentality of Macs, even if a more appropriate claim is "just works - but only in the way we intend". And that's fine - consumers certainly have the choice to use whatever system they prefer. The most important thing, really, is to understand the limitations and rules of the system you choose to interact with.


Dave said...

"I think power and control should belong to the user, not the designer - good design, in my opinion, should GRANT users freedom, not take it away." -- sounds like this could apply to free-will vs. destiny discussions of religious dogma.
