Forums (I/O Tower)
 TRON: LEGACY 
 Time Dilation Discrepancy


insidetronworld
User

Posts: 35
RE: Time Dilation Discrepancy

on Monday, December 27, 2010 9:51 AM
DarthMeow504 Wrote:It doesn't necessarily have to be consistent. Presumably, a "cycle" as a unit of time is based on a CPU's clock cycle. As anyone who works with computers knows, programs don't always run at optimum speed and make the same amount of progress per CPU cycle. A program is advanced as quickly as the CPU is able to but can be bogged down or even frozen depending on the demands on the processor. We've all seen this while playing video games, a system may be humming along cranking out your favorite 3D fragfest at 60 frames per second, then all of a sudden something else puts a demand on the processor and next thing you know it's choppy, 30 frames per second, and then there's a high rendering demand because of fast motion and a lot of objects on the screen, and you get a lag-filled "slideshow" effect as the program bogs down. It might even freeze for a second or two before CPU demand drops and things get back up to speed.

This could happen to Flynn's system countless times in 20 years, so there is really no telling how long precisely passed inside the system during that time. We can know the optimum speed, but you'd need another computer to crunch 20 years worth of system speed logs to figure out how much actual rendering took place.

In other words, it's variable cuz computers bog down a lot.


You have an interesting idea. But you're talking about random demands on the processor. I myself like to think of it as measurable amounts of time. The means of measuring I don't know for sure, but I am basing it on the quotes in the movie. That's all I have to go on. However, you do have a good point. But did you see the size of that computer room? I bet we're talking hundreds if not thousands of terabytes of computing power in that one room.


 
typicaltronname
User

Posts: 1,659
RE: Time Dilation Discrepancy

on Monday, December 27, 2010 1:37 PM
Here is my theory. The events could very well have taken 1 or more days.

Because time goes faster on the Grid, it could very well have taken 1 or more days there.

The unit of time is called a micro-cycle, which is 8 hours in the real world, but on the Grid it translates to a longer amount of time.

Remember, time goes 50 times faster on the Grid. So 8 hours times 50 = 400. (I wonder if I'm using the correct math now. Because if the above is true, then it *can* translate to 400 hours, or about 16 days, and Sam was not in the Grid for 16 days, Grid time.)
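That arithmetic can be sanity-checked in a couple of lines (a quick sketch; the 50× factor and the 8-hour micro-cycle are this thread's assumptions, not figures confirmed on screen):

```python
# Assumptions from this thread (not from the film): Grid time runs ~50x
# faster than real time, and a "micro-cycle" is 8 real-world hours.
real_hours = 8
dilation = 50

grid_hours = real_hours * dilation   # 400 hours of Grid time
grid_days = grid_hours / 24          # ~16.7 days, so "16 days" rounds down
```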

Just a thought, though. I probably caused more confusion.

"Reveal your creation date or I will disassemble your code one operation at a time!"
 
DarthMeow504
User

Posts: 128
RE: Time Dilation Discrepancy

on Monday, December 27, 2010 3:44 PM
Are you talking about the ENCOM server banks? Yeah that array was f'n huge. But I didn't see anything like that in Flynn's basement, unless I missed it. Vastly smaller system.


 
DV8ER
User

Posts: 145
RE: Time Dilation Discrepancy

on Monday, December 12, 2011 10:20 AM
DarthMeow504 Wrote:It doesn't necessarily have to be consistent. Presumably, a "cycle" as a unit of time is based on a CPU's clock cycle. As anyone who works with computers knows, programs don't always run at optimum speed and make the same amount of progress per CPU cycle. A program is advanced as quickly as the CPU is able to but can be bogged down or even frozen depending on the demands on the processor. We've all seen this while playing video games, a system may be humming along cranking out your favorite 3D fragfest at 60 frames per second, then all of a sudden something else puts a demand on the processor and next thing you know it's choppy, 30 frames per second, and then there's a high rendering demand because of fast motion and a lot of objects on the screen, and you get a lag-filled "slideshow" effect as the program bogs down. It might even freeze for a second or two before CPU demand drops and things get back up to speed.

This could happen to Flynn's system countless times in 20 years, so there is really no telling how long precisely passed inside the system during that time. We can know the optimum speed, but you'd need another computer to crunch 20 years worth of system speed logs to figure out how much actual rendering took place.

In other words, it's variable cuz computers bog down a lot.

This really isn't the case. Just because your process bogs down doesn't mean the passage of time changes. The way computers work is that a crystal oscillates at a specific speed, and this doesn't change just because the processor is busy. This means that regardless of how a program bogs down, the time passage of the system is the same and is not affected. If it were otherwise, the system clock would stop running while reading from the hard drive, for example.

-D


 
Kaisergrendel
User

Posts: 298
RE: Time Dilation Discrepancy

on Monday, December 12, 2011 10:47 AM
DV8ER Wrote:This really isn't the case. Just because your process bogs down doesn't mean the passage of time changes. The way computers work is that a crystal oscillates at a specific speed, and this doesn't change just because the processor is busy. This means that regardless of how a program bogs down, the time passage of the system is the same and is not affected. If it were otherwise, the system clock would stop running while reading from the hard drive, for example.
-D

I think you're confusing clock frequency with CPU performance, which was what Darthmeow was referring to. Granted, he misused the term first.

I have thought about this from Darthmeow's POV. If time is indeed measured perceptually rather than absolutely on the Grid, time dilation factor is variable.

I've also had an epiphany. It's rather comical of us to be debating the finer details of T:L when the writers themselves probably put less thought into it than we are. Possibly.


 
Kat
User

Posts: 2,345
RE: Time Dilation Discrepancy

on Monday, December 12, 2011 7:29 PM
Kaisergrendel Wrote:
DV8ER Wrote:This really isn't the case. Just because your process bogs down doesn't mean the passage of time changes. The way computers work is that a crystal oscillates at a specific speed, and this doesn't change just because the processor is busy. This means that regardless of how a program bogs down, the time passage of the system is the same and is not affected. If it were otherwise, the system clock would stop running while reading from the hard drive, for example.
-D

I think you're confusing clock frequency with CPU performance, which was what Darthmeow was referring to. Granted, he misused the term first.

I have thought about this from Darthmeow's POV. If time is indeed measured perceptually rather than absolutely on the Grid, time dilation factor is variable.

I've also had an epiphany. It's rather comical of us to be debating the finer details of T:L when the writers themselves probably put less thought into it than we are. Possibly.

'Kay, so are we talking in relativity-like terms here, then?

I actually did think about this recently when reading a bit about relativity. I'm reading Cosmos, and in one part about relativity Sagan mentions something like "electrical impulses inside a computer do travel at almost the speed of light." So, putting the two thoughts together, it occurred to me... wouldn't the writers have it backwards, then? Because the closer you get to the speed of light, the more time slows down *for you* [but not everyone else]. So you could travel near the speed of light for what to you is a couple years, and you come back and find that for everyone else, thousands of years have passed.

So, if info inside a computer chip is traveling at near the speed of light, then X amount of time for them would actually seem like LESS time than passes in the real world, rather than more as is stated in the film, right? I mean, I realize that that wouldn't be plausible *for the film* because obviously you can't have Flynn like "hey I was in there for ten minutes... what do you mean you're not Sam, you're my great-great-great-great-great grandson???" but...

What do you want? I'm busy.


Program, please!


Chaos.... good news.
 
Kaisergrendel
User

Posts: 298
RE: Time Dilation Discrepancy

on Monday, December 12, 2011 8:53 PM
Kat Wrote:'Kay, so are we talking in relativity-like terms here, then?

I actually did think about this recently when reading a bit about relativity. I'm reading Cosmos, and in one part about relativity Sagan mentions something like "electrical impulses inside a computer do travel at almost the speed of light." So, putting the two thoughts together, it occurred to me... wouldn't the writers have it backwards, then? Because the closer you get to the speed of light, the more time slows down *for you* [but not everyone else]. So you could travel near the speed of light for what to you is a couple years, and you come back and find that for everyone else, thousands of years have passed.

So, if info inside a computer chip is traveling at near the speed of light, then X amount of time for them would actually seem like LESS time than passes in the real world, rather than more as is stated in the film, right? I mean, I realize that that wouldn't be plausible *for the film* because obviously you can't have Flynn like "hey I was in there for ten minutes... what do you mean you're not Sam, you're my great-great-great-great-great grandson???" but...

Actually our theory is pretty straightforward. It's the Nolan-esque idea where the perception of time depends on the granularity of the measurement of time. If the brain for example works faster than it normally does, it is able to sense/process time in smaller increments, therefore perceived time slows down for it. That's why we need clocks that measure time with empirical methods to calibrate our mental clocks.

Conversely for computers, while the clock frequency is stable unless dynamically overclocked, the processing load is influenced by the needs of the system at any given time. A large load takes more clock cycles to process, whereas a small load takes fewer. A busy day on the Grid could take 10 hours of real time to pass through the CPU, and a quiet one could take comparatively less.


 
DV8ER
User

Posts: 145
RE: Time Dilation Discrepancy

on Monday, December 12, 2011 9:15 PM
Kaisergrendel Wrote:
DV8ER Wrote:This really isn't the case. Just because your process bogs down doesn't mean the passage of time changes. The way computers work is that a crystal oscillates at a specific speed, and this doesn't change just because the processor is busy. This means that regardless of how a program bogs down, the time passage of the system is the same and is not affected. If it were otherwise, the system clock would stop running while reading from the hard drive, for example.
-D

I think you're confusing clock frequency with CPU performance, which was what Darthmeow was referring to. Granted, he misused the term first.

I have thought about this from Darthmeow's POV. If time is indeed measured perceptually rather than absolutely on the Grid, time dilation factor is variable.

I've also had an epiphany. It's rather comical of us to be debating the finer details of T:L when the writers themselves probably put less thought into it than we are. Possibly.

Actually, I'm not. The processor speed is the speed at which the CPU can complete a certain number of cycles per second, measured in hertz. One hertz means that one cycle is completed per second. A megahertz means one million cycles per second. A gigahertz, the most common scale of processor speed today, means one billion cycles per second.

Truth be told, when you see your machine bogging down, odds are good one of two factors is the cause:

1) You don't have quite enough memory, and the operating system (Windows, *nix, etc.) begins using disk space to "swap" older memory pages out to a file on disk. This is an order of magnitude slower than using memory directly and should be avoided if possible by sparing no expense on RAM.

2) You have a heavily I/O-intensive application that is pegging the hard drive and keeping it constantly busy. As other programs need disk access, things start to slow down because each process is in turn awaiting the process ahead of it. Eventually, if the condition doesn't improve, things can get bad pretty quickly.

This is by no means a complete list of things that can go wrong with a computer system to cause it to bog down, but 95% of the time the above two things are the culprit. Either way, the clock speed will always be the same: the processor is designed to handle a finite number of instructions per tick, and this is a constant that is not affected by workload.

On the other hand, I see his point of view as well. If time dilation is in fact perception based rather than system based (which I doubt; I believe the cycle is a direct reference to a CPU cycle), then yes, it is in fact variable, as a program is "frozen" awaiting whatever resource is busy and preventing it from functioning correctly.

Oh and about your epiphany, LOL, no kidding. I'm guessing you're right.

Hope that helps set the record straight; looking back, my statement was a bit vague.

-D


 
DV8ER
User

Posts: 145
RE: Time Dilation Discrepancy

on Monday, December 12, 2011 9:21 PM
Kaisergrendel Wrote:
Kat Wrote:'Kay, so are we talking in relativity-like terms here, then?

I actually did think about this recently when reading a bit about relativity. I'm reading Cosmos, and in one part about relativity Sagan mentions something like "electrical impulses inside a computer do travel at almost the speed of light." So, putting the two thoughts together, it occurred to me... wouldn't the writers have it backwards, then? Because the closer you get to the speed of light, the more time slows down *for you* [but not everyone else]. So you could travel near the speed of light for what to you is a couple years, and you come back and find that for everyone else, thousands of years have passed.

So, if info inside a computer chip is traveling at near the speed of light, then X amount of time for them would actually seem like LESS time than passes in the real world, rather than more as is stated in the film, right? I mean, I realize that that wouldn't be plausible *for the film* because obviously you can't have Flynn like "hey I was in there for ten minutes... what do you mean you're not Sam, you're my great-great-great-great-great grandson???" but...

Actually our theory is pretty straightforward. It's the Nolan-esque idea where the perception of time depends on the granularity of the measurement of time. If the brain for example works faster than it normally does, it is able to sense/process time in smaller increments, therefore perceived time slows down for it. That's why we need clocks that measure time with empirical methods to calibrate our mental clocks.

Conversely for computers, while the clock frequency is stable unless dynamically overclocked, the processing load is influenced by the needs of the system at any given time. A large load takes more clock cycles to process, whereas a small load takes fewer. A busy day on the Grid could take 10 hours of real time to pass through the CPU, and a quiet one could take comparatively less.

Interesting question about the brain. Given time is man-made, doesn't introducing the faster-brain concept fundamentally alter the discussion? If time was invented by the brain, and you were to speed the brain up and re-invent time, would the increments be different?

-D


 
Kaisergrendel
User

Posts: 298
RE: Time Dilation Discrepancy

on Tuesday, December 13, 2011 12:11 AM
DV8ER Wrote:Actually, I'm not. The processor speed is the speed at which the CPU can complete a certain number of cycles per second, measured in hertz. One hertz means that one cycle is completed per second. A megahertz means one million cycles per second. A gigahertz, the most common scale of processor speed today, means one billion cycles per second.

No argument there, I'm just saying you're misinterpreting Darthmeow's post because he misused a term. His point:

If clock speed = constant,
If CPU load = variable
Then passage of time on Grid = variable

Example #1: Videogame console emulators on PC

Because today's PCs run faster than obsolete consoles, which were designed to output games at a certain framerate, emulators use a feature called "frame skip" to control game speed. Without it, most games would run too quickly due to the higher CPU performance. In this instance, clock speed is the variable and CPU load is the constant, but the rule still applies: the passage of time is variable in-system.
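The emulator behavior described above can be sketched as a simple pacing loop (an illustrative toy, not any real emulator's code; all names and numbers here are made up):

```python
import time

# Toy pacing loop: the host CPU finishes each emulated frame early, then
# sleeps until that frame's wall-clock slot, so the game runs at its native
# framerate no matter how fast the host is.
TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS

def emulate_frame():
    # Stand-in for emulating one console frame; a fast host finishes early.
    pass

def run(frames):
    start = time.perf_counter()
    for n in range(frames):
        emulate_frame()
        # Sleep off whatever real time is left in this frame's slot.
        remaining = start + (n + 1) * FRAME_TIME - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
    return time.perf_counter() - start

elapsed = run(30)   # 30 frames at 60 FPS take about half a second of real time
```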

Example #2: Pre-rendered video cutscenes

This is pretty straightforward. Video game trailers/cutscenes are now usually rendered sequentially, as opposed to the best-effort method used in-game that drops framerate to preserve time constancy. Because the video is not rendered in realtime, some scenes will take longer to process than others.

DV8ER Wrote:Either way, the clock speed will always be the same: the processor is designed to handle a finite number of instructions per tick, and this is a constant that is not affected by workload.

On the other hand, I see his point of view as well. If time dilation is in fact perception based rather than system based (which I doubt; I believe the cycle is a direct reference to a CPU cycle), then yes, it is in fact variable, as a program is "frozen" awaiting whatever resource is busy and preventing it from functioning correctly.

If a Tron cycle is indeed a CPU cycle, then it is an extremely inaccurate method to measure time on the Grid because of the points above. Food for thought.

I posted some time calculations on IMDB using clock frequencies of comparable computers in the 1980s, and it turns out that if we take the term "cycle" literally and I am correct, it would throw off T:L's time dilation factor by thousands if not millions of years. I won't do the calculations again as I don't have the time, but you're welcome to try and corroborate.
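A back-of-envelope version of that calculation (hypothetical figures: a ~1 MHz early-80s CPU clock, and the literal reading that one Grid cycle is one CPU clock cycle):

```python
# Hypothetical figures, for illustration only: a ~1 MHz early-1980s CPU,
# and the literal reading "one Grid cycle = one CPU clock cycle".
clock_hz = 1_000_000
real_seconds = 20 * 365.25 * 24 * 3600      # ~20 years of real time

cpu_cycles = clock_hz * real_seconds        # ~6.3e14 clock cycles

# If each clock cycle corresponded to even one subjective second inside,
# the elapsed Grid time would be on the order of twenty million years:
subjective_years = cpu_cycles / (365.25 * 24 * 3600)
```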

DV8ER Wrote:Interesting question about the brain. Given time is man-made, doesn't introducing the faster-brain concept fundamentally alter the discussion? If time was invented by the brain, and you were to speed the brain up and re-invent time, would the increments be different?

-D

I'm assuming you're not saying that time itself is man-made, but rather the manner in which we divide time into distinct segments. If so, I doubt it. We would simply move from intuitively observing seconds to observing milliseconds, and so on.

Of course, you probably already know that time itself isn't man-made and can be measured empirically. You mentioned crystal oscillations being constant; quartz clocks are built on this principle.


 
Kat
User

Posts: 2,345
RE: Time Dilation Discrepancy

on Tuesday, December 13, 2011 7:26 AM
DV8ER Wrote:Interesting question about the brain. Given time is man-made, doesn't introducing the faster-brain concept fundamentally alter the discussion? If time was invented by the brain, and you were to speed the brain up and re-invent time, would the increments be different?

-D
Ah. But time is NOT man-made. Perhaps our conception of it and how we choose to measure it, but time itself is an actual thing that exists (hence why we're discussing relativity and all the other theories). That's a long discussion, though.


Kaisergrendel Wrote:If a Tron cycle is indeed a CPU cycle, then it is an extremely inaccurate method to measure time on the Grid because of the points above. Food for thought.
I see what the point is, but does it HAVE to be that way?

I mean, I go to work for 8 hours. Conceivably, I can achieve X amount of work in that time.

However, it seems like every time I plan to spend a whole day attacking my filing, someone needs me to scan something and somebody calls and somebody's looking for a file and somebody needs something typed and somebody needs their printer fixed and the Orkin guy shows up and needs to be taken around, etc. Certainly that loads down MY processor, but it doesn't mean time passes any more slowly... just means I get less done in a day than I could if all that extra stuff wasn't going on, because unfortunately, time itself doesn't change. Even if my perception of it might. Ten minutes at the dentist SEEMS like an hour, while ten minutes having fun SEEMS like two minutes, but they're both still ten minutes regardless of how I perceive them to go. Am I totally missing the point?

What do you want? I'm busy.


Program, please!


Chaos.... good news.
 
Kaisergrendel
User

Posts: 298
RE: Time Dilation Discrepancy

on Tuesday, December 13, 2011 9:47 AM
Kat Wrote:I mean, I go to work for 8 hours. Conceivably, I can achieve X amount of work in that time.

However, it seems like every time I plan to spend a whole day attacking my filing, someone needs me to scan something and somebody calls and somebody's looking for a file and somebody needs something typed and somebody needs their printer fixed and the Orkin guy shows up and needs to be taken around, etc. Certainly that loads down MY processor, but it doesn't mean time passes any more slowly... just means I get less done in a day than I could if all that extra stuff wasn't going on, because unfortunately, time itself doesn't change. Even if my perception of it might. Ten minutes at the dentist SEEMS like an hour, while ten minutes having fun SEEMS like two minutes, but they're both still ten minutes regardless of how I perceive them to go. Am I totally missing the point?

Oh my, I went through a head trip trying to understand your point of view, but I got it and I can safely say you're missing the point. I see how easy it is to commit logical fallacies with this topic though.

In the real world, time is constant, and in order to reconcile the inaccuracies of your mental clock with reality, you need empirical means to measure time as an absolute, not relative unit. Time is time, as you say.

However on the Grid, the "mental clock" is a shared system resource, and perception *is* reality. Consider the following:

- A CPU's performance is finite and constant
- Real time, and subsequently clock cycles, are constant
- Number of programs providing instructions is variable
- Number and complexity of instructions is variable

Therefore the following can be inferred:

- A CPU will process an equal number of instructions each cycle, in exactly the same amount of real time.
- Because amount of instructions is variable, when total system load increases, each program has its instructions processed at a slower rate. The slowdown is systemic, affecting every process, but programs whose resource demands remain unchanged from previous cycles now find that their task takes more cycles to complete for no apparent reason. Schedules can't be kept in realtime.

Conclusion: The relevance of real time clock cycles is diminished.
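That inference can be made concrete with a toy model (hypothetical numbers; a fixed instruction budget per real-time tick, shared fairly among however many programs are running):

```python
# Toy model: the CPU retires a fixed number of instructions per real-time
# tick (constant clock), shared fairly among all running programs.
BUDGET_PER_TICK = 1000   # instructions completed per tick (hypothetical)
TASK_COST = 500          # instructions one program's task needs (hypothetical)

def ticks_to_finish(active_programs):
    """Real-time ticks needed for one program's task at a given system load."""
    share = BUDGET_PER_TICK // active_programs  # this program's per-tick share
    ticks = 0
    remaining = TASK_COST
    while remaining > 0:
        remaining -= share
        ticks += 1
    return ticks

quiet = ticks_to_finish(active_programs=1)   # idle system: task fits in 1 tick
busy = ticks_to_finish(active_programs=10)   # loaded system: same task, 5 ticks
```

The program's own workload never changed, yet its task took five times as many real-time ticks; from the inside, nothing explains the slowdown.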

To put this back into perspective vis-à-vis your personal experience with time: imagine it's an unusually busy day at the office. You come in in the morning and begin performing a simple task that usually takes an hour, but the hands on the clock on your desk are spinning like a pinwheel and it's now 11pm. Funny, that felt like an hour to you. You look out the window and see the sun is still up even though it should be night. Everyone in the building is freaking out because surely they should have accomplished more in those 14 hours that just flew by, right?

Bit of a trip, isn't it?


 
DV8ER
User

Posts: 145
RE: Time Dilation Discrepancy

on Tuesday, December 13, 2011 10:52 AM
Kaisergrendel Wrote:
DV8ER Wrote:Actually, I'm not. The processor speed is the speed at which the CPU can complete a certain number of cycles per second, measured in hertz. One hertz means that one cycle is completed per second. A megahertz means one million cycles per second. A gigahertz, the most common scale of processor speed today, means one billion cycles per second.

No argument there, I'm just saying you're misinterpreting Darthmeow's post because he misused a term. His point:

If clock speed = constant,
If CPU load = variable
Then passage of time on Grid = variable

Example #1: Videogame console emulators on PC

Because today's PCs run faster than obsolete consoles, which were designed to output games at a certain framerate, emulators use a feature called "frame skip" to control game speed. Without it, most games would run too quickly due to the higher CPU performance. In this instance, clock speed is the variable and CPU load is the constant, but the rule still applies: the passage of time is variable in-system.

Example #2: Pre-rendered video cutscenes

This is pretty straightforward. Video game trailers/cutscenes are now usually rendered sequentially, as opposed to the best-effort method used in-game that drops framerate to preserve time constancy. Because the video is not rendered in realtime, some scenes will take longer to process than others.

DV8ER Wrote:Either way, the clock speed will always be the same: the processor is designed to handle a finite number of instructions per tick, and this is a constant that is not affected by workload.

On the other hand, I see his point of view as well. If time dilation is in fact perception based rather than system based (which I doubt; I believe the cycle is a direct reference to a CPU cycle), then yes, it is in fact variable, as a program is "frozen" awaiting whatever resource is busy and preventing it from functioning correctly.

If a Tron cycle is indeed a CPU cycle, then it is an extremely inaccurate method to measure time on the Grid because of the points above. Food for thought.

I posted some time calculations on IMDB using clock frequencies of comparable computers in the 1980's, and it turns out that if we take the term Cycle literally and I am correct, it would throw off T:L's time dilation factor by thousands if not millions of years. I won't do the calculations again as I don't have the time but you're welcome to try and corroborate.

DV8ER Wrote:Interesting question about the brain. Given time is man made, doesn't introducing the faster brain concept fundamentally alter the discussion? If time was invited by the brain, and you were to speed the brain up and re-invent time, would the increments be different?

-D

I'm assuming you're not saying that time itself is man-made, but rather the manner in which we divide time into distinct segments. If so, I doubt it. We would simply move from intuitively observing seconds to observing milliseconds, and so on.

Of course, you probably already know that time itself isn't man-made and can be measured empirically. You mentioned crystal oscillations being constant; quartz clocks are built on this principle.

Whoops, yeah, you and Kat read it the same way, and I certainly didn't intend for it to sound like that, but yeah, I was referring to the way we measure time, not time itself. Thanks for pointing that out...

-D


 
Kat
User

Posts: 2,345
RE: Time Dilation Discrepancy

on Wednesday, December 14, 2011 7:39 PM
Kaisergrendel Wrote:Oh my, I went through a head trip trying to understand your point of view, but I got it and I can safely say you're missing the point. I see how easy it is to commit logical fallacies with this topic though.

In the real world, time is constant, and in order to reconcile the inaccuracies of your mental clock with reality, you need empirical means to measure time as an absolute, not relative unit. Time is time, as you say.

However on the Grid, the "mental clock" is a shared system resource, and perception *is* reality. Consider the following:


I mean, I get what you're saying... however, what bugs me about the theory is that time in the real world will still progress at the same rate. A process that takes ten minutes still takes ten real-world minutes, no matter how long or short it seems to programs inside. How could a user ever go in if they would have no idea what time it might be when they come back out? (Granted, one could no doubt consult the system clock [a program named Chron in one of my fics] to make sure one was aware of the passage of time in the real world ["Ted! Don't forget to wind your watch!"], but you would go in having no idea whether you could expect to accomplish a lot in your allotted time, or very little.) Too variable to work for the film, I think.

What do you want? I'm busy.


Program, please!


Chaos.... good news.
 
Kaisergrendel
User

Posts: 298
RE: Time Dilation Discrepancy

on Saturday, December 31, 2011 2:46 PM
Kat Wrote:
Kaisergrendel Wrote:Oh my, I went through a head trip trying to understand your point of view, but I got it and I can safely say you're missing the point. I see how easy it is to commit logical fallacies with this topic though.

In the real world, time is constant, and in order to reconcile the inaccuracies of your mental clock with reality, you need empirical means to measure time as an absolute, not relative unit. Time is time, as you say.

However on the Grid, the "mental clock" is a shared system resource, and perception *is* reality. Consider the following:


I mean, I get what you're saying... however, what bugs me about the theory is that time in the real world will still progress at the same rate. A process that takes ten minutes still takes ten real-world minutes, no matter how long or short it seems to programs inside. How could a user ever go in if they would have no idea what time it might be when they come back out? (Granted, one could no doubt consult the system clock [a program named Chron in one of my fics] to make sure one was aware of the passage of time in the real world ["Ted! Don't forget to wind your watch!"], but you would go in having no idea whether you could expect to accomplish a lot in your allotted time, or very little.) Too variable to work for the film, I think.

You have your points. Either way, there needs to be a way to measure time in both absolute and relative contexts on the Grid. I highly doubt Tron "cycles" are the former, though.

For example, Flynn says the portal stays open for one millicycle, which *feels* like 8 hours. All CLU has to do to upset this prediction is overload and bottleneck the system, causing a millicycle to feel a lot faster relative to absolute time than 8 hours. Boom, impractical.
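Taking that quote literally (8 hours per millicycle, with milli- meaning one thousandth), the implied length of a full cycle works out like this:

```python
# From the quote: one millicycle of Grid time feels like 8 hours.
millicycle_hours = 8
cycle_hours = millicycle_hours * 1000   # milli- = one thousandth of a cycle

cycle_days = cycle_hours / 24           # ~333 days: roughly a Grid "year"
```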


 