Jul. 25th, 2010

nancylebov: blue moon (Default)
The Web Means the End of Forgetting is a longish piece about how privacy is playing out these days, but it leaves out an entertaining possibility for enforcement.

All the methods suggested (and, as stated in the article, none of them likely to be very effective) center on protecting information, whether by not giving it out or through automated methods of forgetting.

What about reputational costs for people who overreact to minor and/or outdated pieces of information?

Also, unless I missed something (I admit I skimmed) there was nothing about changes over time in standards for behavior, and a background assumption that every group has the same standards.

Who knows, maybe there could be long-term running scores of the effects of maintaining different sets of standards? Not that such a thing would cause people to agree on which effects are valuable, but it still might be interesting and somewhat useful.

Link thanks to [livejournal.com profile] rm.

Some of the underlying premises were inspired by H. G. Wells's Men Like Gods [1]-- it's a book about a half dozen or so random British people from the 1920s stranded in a utopia. Among other things, records about everyone are publicly available, though, iirc, they're on paper files in central locations. One of the British characters says something like he knows someone who could make utopia into hell in a few weeks just from having access to the records. It seems to me that you can only do that sort of thing if there's a tremendous amount that people want to keep secret.

One thing I'm hoping for is that so much information being available about what people actually do will lead to reasonable standards for how people can be expected to act. This may be excessively optimistic.
nancylebov: blue moon (Default)
Behold!
Basically, we came up with a design for a clicking-game, where each click generated a score. A random score. Except that these random scores were posted on an ordered leaderboard. I think one of my buddies even went so far as to implement it, mostly as coding practice. Unsurprisingly, people were happy to click multiple times in pursuit of a number high enough to warrant a spot on the publicly displayed list.

From a highly intellectual discussion of Cow Clicker, a parody of the Facebook farm games which is so close to them that it's effectively an example of the genre.
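The mechanic in the quote above is simple enough to sketch in a few lines. This is a hypothetical reconstruction under the stated design (random score per click, ordered public leaderboard), not the actual implementation mentioned in the quote:

```python
import random

def click(leaderboard, player, top_n=10):
    """One click: draw a random score, insert it, keep the board
    sorted highest-first, and trim to the top_n public entries."""
    score = random.randint(1, 1000)
    leaderboard.append((score, player))
    leaderboard.sort(reverse=True)
    del leaderboard[top_n:]  # only the top scores are displayed
    return score

# A player clicking repeatedly, chasing a leaderboard spot:
board = []
for i in range(50):
    click(board, f"player{i}")
```

Since each click is an independent random draw, repeated clicking is the only way to chase a higher number, which is exactly why people kept clicking.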

Link found at Marginal Revolution.

Page generated Jun. 10th, 2025 02:09 pm
Powered by Dreamwidth Studios