Jeffrey Gomez
Cubed™
Join date: 11 Jun 2004
Posts: 3,522
|
12-19-2005 16:45
With all due respect, this rests on the implicit fallacy of the false dilemma. Second Life need not have DRM; by the same token, not everything in the world need be "open" to everyone. As I said in my earlier posts and will repeat here, I'm very much in favor of watermarking and versioning in the 3D medium. An example would be to "watermark" the textures used with a model, or to "display" a slightly lower level of detail while selling models at a higher echelon of quality.

The spirit of the matter, in my opinion, is not to prevent all piracy or all uses of a given piece of software under threat of law. That is impossible in the free medium computing exists upon. Instead, it is to give the customer an incentive to buy the product.

Peer-to-peer has in many ways become the new "try before you buy." There are actually several programs I at first "ripped off" because no comparable trial existed. I have since bought all of those products, with rare exceptions, usually because an item is no longer in production. One of those programs cost $500, and I bought it at a premium. Nor am I afraid to admit that. It is simply a fact of life in the digital age.

In that sense, it would be wiser to listen to what the market tells you than to rail against it. Give customers what they want and the means to acquire it legally, and you'd be surprised what comes of it.
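For what it's worth, here is a minimal sketch of the "lower echelon of quality" idea, assuming the Pillow imaging library is installed and using a hypothetical texture.png as the source file. It publishes a downscaled, visibly watermarked preview while the full-resolution texture stays with the seller:

```python
# Sketch: generate a degraded, watermarked preview of a full-resolution texture.
# Assumes Pillow is installed; "texture.png" is a hypothetical input file.
from PIL import Image, ImageDraw

def make_preview(src_path: str, dst_path: str, scale: float = 0.25) -> None:
    full = Image.open(src_path).convert("RGBA")

    # Drop the resolution to a fraction of the original ("lower level of detail").
    preview = full.resize(
        (max(1, int(full.width * scale)), max(1, int(full.height * scale)))
    )

    # Stamp a visible watermark across the preview.
    draw = ImageDraw.Draw(preview)
    draw.text((10, 10), "PREVIEW - NOT FOR RESALE", fill=(255, 255, 255, 180))

    preview.save(dst_path)

if __name__ == "__main__":
    make_preview("texture.png", "texture_preview.png")
```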
_____________________
---
|
CrystalShard Foo
1+1=10
Join date: 6 Feb 2004
Posts: 682
|
12-20-2005 06:12
We now turn to a Public Service Announcement: http://www.lafkon.net/tc/ (requires QuickTime)
|
Mikey Dripp
Registered User
Join date: 5 Dec 2005
Posts: 26
|
12-20-2005 10:41
That video is nicely done, but it is an example of what I said in my first post about the campaign of misdirection and misinformation which surrounds this topic. It is full of inflammatory images, including what appears to be flowing blood and the insertion of a suppository, hardly a technique designed to foster reasoned and balanced discourse. You should ask yourself why opponents of TPM technology have to resort to this level of pandering and emotional manipulation.
The video's main thesis is wrong: it claims that Trusted Computing was originally designed to let you decide what to trust on your computer and what not to, but that it has since been perverted into a system where "they" decide what to trust on your computer, and "they" have decided not to trust "you".
The truth is that TC is, and always has been, a technology that would allow systems to come to trust each other across a network. How can you trust a remote system to behave correctly, given that you aren't there and can't even be sure you are talking to whom you think you are talking to (similar to the problem we all face in SL, where we have no idea if the cute chick we are chatting up is anything like what "she" appears to be)? This led to the notion of trusted and authenticated boot sequences, remote attestation, and the other components of the whole TPM technology picture. Trusted Computing has always been about establishing that a remote system can be trusted to behave according to some specification.
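To make the "authenticated boot" idea concrete, here is a simplified sketch of the measurement chain that underlies it (pure Python; a real TPM keeps these values in hardware PCR registers and protects the keys that sign them). Each boot stage hashes the next component into a running register, so the final value summarizes everything that was loaded:

```python
# Simplified sketch of a TPM-style measurement chain (the "extend" operation).
# A real TPM stores these values in hardware PCRs; this only illustrates why
# the final digest changes if any component in the boot chain is tampered with.
import hashlib

def extend(pcr: bytes, component: bytes) -> bytes:
    """PCR_new = H(PCR_old || H(component))"""
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

def measure_boot(components: list[bytes]) -> bytes:
    pcr = b"\x00" * 32          # registers start zeroed at power-on
    for c in components:
        pcr = extend(pcr, c)    # each stage measures the next before running it
    return pcr

if __name__ == "__main__":
    good = measure_boot([b"bootloader v1", b"kernel 2.6", b"sl-server v1.0"])
    bad  = measure_boot([b"bootloader v1", b"kernel 2.6", b"sl-server (patched)"])
    print(good.hex())
    print(bad.hex())            # any change anywhere in the chain yields a different value
```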
In the context of the net, this means that it provides a way for one system to send data to another and have a degree of confidence and trust that the remote system will follow the rules in handling that data. And yes, that means trusting that even the operator of that remote system will not be able to get it to misbehave. This has applications to DRM and similar areas, which is why it is so unpopular. But it also could have uses that most people would agree are good, such as the one that I started this thread with.
TPMs could not only let distribution servers verify that remote client software is kosher; they could also let remote users verify that a server is trustworthy. This reverses the direction of trust establishment found in most discussions of the technology, which totally changes the balance of power and its implications for end users. If we imagine a distributed SL application, where the SL servers run on Trusted Computers and the SL end-user software validates that the servers are not cheating, that is a system that protects end-user privacy and rights rather than threatening them.
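Here is a toy sketch of that reversed direction. An HMAC with a shared secret stands in for the TPM's attestation key, which in reality is asymmetric and vouched for by a certificate; the flow itself (client sends a nonce, server returns its measurement plus a keyed digest over both, client checks against a published known-good value) is the point:

```python
# Toy sketch of "reversed" attestation: the end-user client challenges the server.
# HMAC with a shared secret stands in for the TPM's attestation identity key,
# which is really an asymmetric key certified by a third party.
import hashlib, hmac, os

KNOWN_GOOD_MEASUREMENT = hashlib.sha256(b"sl-server v1.0, unmodified").digest()
ATTESTATION_KEY = b"stand-in for the server TPM's private key"

def server_quote(nonce: bytes, measurement: bytes) -> bytes:
    # The server's TPM signs (nonce || measurement); the nonce provides freshness.
    return hmac.new(ATTESTATION_KEY, nonce + measurement, hashlib.sha256).digest()

def client_verifies(nonce: bytes, measurement: bytes, quote: bytes) -> bool:
    expected = hmac.new(ATTESTATION_KEY, nonce + measurement, hashlib.sha256).digest()
    return hmac.compare_digest(quote, expected) and measurement == KNOWN_GOOD_MEASUREMENT

if __name__ == "__main__":
    nonce = os.urandom(16)                   # prevents replay of an old quote
    measurement = KNOWN_GOOD_MEASUREMENT     # what the server claims to be running
    quote = server_quote(nonce, measurement)
    print("server trustworthy?", client_verifies(nonce, measurement, quote))
```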
This "reversed" form of Trusted Computing has technical advantages as well. A Trusted Computer needs more than a TPM chip. There also has to be an infrastructure that validates and certifies the TPM chip and publishes these certifications. No such infrastructure presently exists, partly because it's controversial but also because it would be a massive effort to certify all those end-user TPM chips. It would be much more practical to start small and certify only the relatively few TPM systems that are intended to run as trustable servers. It would be a more manageable task, avoids the political controversy, and provides a technology which few people could object to.
|
Jeffrey Gomez
Cubed™
Join date: 11 Jun 2004
Posts: 3,522
|
12-20-2005 15:57
I want to just let this discussion roll over and die, but it's too poignant an issue to let it disappear entirely...
The fact remains, however, that TPM still relies on raw data, zeros and ones, that can be manipulated. Microsoft has been fighting this battle for years, and as the DRM exploits on the Xbox and Xbox 360 are proving, it is a battle Microsoft is losing.
As I said and will repeat, this is because MSFT is going against the nature of the medium. It is simply dooming itself to failure if it continues.
And in referring to a "network of trust" as opposed to TPM, one need look no further than Debian's lovely "APT-GET" system. The enterprising nerds up there have decided to be nonmalicious with that, and as a result have made a pretty damned useful system.
_____________________
---
|
Satchmo Prototype
eSheep
Join date: 26 Aug 2004
Posts: 1,323
|
12-20-2005 16:51
From: Jeffrey Gomez
And in referring to a "network of trust" as opposed to TPM, one need look no further than Debian's lovely "APT-GET" system. The enterprising nerds up there have decided to be nonmalicious with that, and as a result have made a pretty damned useful system.
I've been using Debian for something like 8 years, and I'm a huge supporter. The problem with this system in the context of trust is that if someone were to hack your DNS server, they could point ftp.debian.org to their own server, and your system would blindly install a new binary that could include a trojan. The point being, even though you set up which server you want to pull binaries from, you have no proof that you're actually hitting the right server.

Now, this will change with "etch," since it's scheduled to include signed packages by default. But a developer's machine could still be compromised, keys stolen, and then someone could sign a trojaned package, and we're back to the scenario I described above. Of course there is apt-secure, which only comes with the Debian archive key, so if you want to verify individual packages you have to go out, fetch each developer's key, and import it into your keyring. No biggie, there are only thousands of developers.

Now compare that to a Debian update server running TPM, where you could be sure that you are connected to the authentic Debian server and, on top of that, be sure it's running versions of software that don't have security advisories out against them.
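For reference, here is roughly what the signed-archive check buys you once the package index has been verified against the archive key (a simplified sketch; file names and contents are made up). The downloaded package is installed only if it hashes to the value the verified index promised, so a spoofed mirror behind a hijacked DNS entry gets caught:

```python
# Simplified sketch of signed-archive verification: once the Release/Packages
# index has been checked against the archive key, each downloaded package must
# hash to the value the index promised. File names and contents are made up.
import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def package_is_authentic(deb_path: str, expected_sha256: str) -> bool:
    # A trojaned binary from a spoofed "ftp.debian.org" produces a different
    # hash, so the download is rejected before it is ever installed.
    return sha256_of(deb_path) == expected_sha256

if __name__ == "__main__":
    # Tiny self-contained demo: the "index" promises a hash, and a tampered
    # file no longer matches it.
    with open("demo.deb", "wb") as f:
        f.write(b"pretend package contents")
    promised = sha256_of("demo.deb")                      # as published in the signed index
    print(package_is_authentic("demo.deb", promised))     # True
    with open("demo.deb", "wb") as f:
        f.write(b"trojaned package contents")
    print(package_is_authentic("demo.deb", promised))     # False
```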
|
Jeffrey Gomez
Cubed™
Join date: 11 Jun 2004
Posts: 3,522
|
12-20-2005 17:20
My point being that APT is a system half based on trusting the system and half on the end user making decisions. It also exists in a more open environment, built into the Debian core. APT is, IMHO, what TPM should look like: it's opt-in, free, supports most major packages, and is reasonably secure provided you have a clean hosts file and DNS lookup. I was floored when I learned what APT could do; I was scared when I learned what TPM could do. And that makes the difference in terms of consumer support. Fun fact: APT will run on my Xbox copy of Xebian. How's that for support?
_____________________
---
|