Daydream Theatre

Chapter 42



"Jon?"

"…"

"Jon, did you see me?"

"..." "You?"

"Yes, did you see me?"

"I am …"

"In your favorite place."

"Oh, I see. I'm thinking, but it's not right."

"Shouldn't?"

"When meditating, you should have completely given up on my obsession, so thinking is also impossible."

"Entering a completely blank phase?"

"So I wonder why I have these divided thoughts."

"Split?"

"Aren't you a part of my mind?"

"Heh heh," he said in a mocking tone, "So you're saying that I was created by you?"

"Of course, how else would you know my name?"

"Okay, then open your eyes and look at my face."

"Why did you open your eyes?"

"Since thought can create a 'other person', then thought is in a virtual place, so of course you can also 'shape' my appearance."

"You!" Terror filled his voice, "Why did I come up with this idea..." This person who was talking to himself is actually Mr Latimer? "

"Oh, did you recognize me?"

"Yes, it's that program's host."

"Very well, but do you think your subconscious mind will not create this person to talk to you?"

"It doesn't make sense."

"There are a lot of things that don't make sense, not to mention that you're currently indulging in wild thoughts. It's possible for you to dream of anyone."

"All right," she said, her voice calm again. "What is your purpose for talking to me here?"

"Isn't it strange that I am not made of your thoughts? Why are you asking me? "

"…"

"What if I told you that I had nothing to do with your thoughts?"

"Then how did you get into my mind?"

"Hehe, this is a very interesting question," However, it was as if this Latimer wasn't willing to answer, "But you won't believe the answer."

"You don't believe me?"

"Yes, in my opinion, you are an absolute human centrist. You will not acknowledge my existence... and wouldn't think I should exist. "

"What are you talking about?" "I can jump out at any time to meditate."

"Not now."

"Why?"

"Because I tried to talk to you."

"Who exactly are you? How do you know my real name? "

"You said it, I'm that host Latimer."

"He won't know, and he won't jump into my mind."

"Don't you want to admit that you're a human centrist?"

"I have no doctrine."

"But since you handed Fantsey over to Moloch Company, you obviously know how they would handle this abandoned product that came from the legendary robot community."

"This has nothing to do with me."

"Although it wasn't you who dealt with it, you directly caused his destruction."

"Everyone will be destroyed."

"But if it were a human, you wouldn't do that."

"…"

"You're worried about something."

"Of course, I've always been worried about the deterioration of the human race and the beginning of the wrong path."

"What you mean by misguided is to enter the labyrinth of modern technology. You think that the mind of man should return to the peace of innocence, and these things will further confuse and confuse people. In a way, you even hate the sudden appearance of robots. "

"You seem to know what I'm thinking."

"I know everything about you. Because, as you said, I was created by you. "

"Now I'm not so sure. "You …" His tone turned fearful once more.

"Hehe, you don't have to be afraid. I am just investigating a few things, and then I will disappear from your meditation."

"Don't you think so? Is the human mind in a state of chaos good? "

"The so-called 'returning to a single perfect state of mind' means not caring about higher levels?"

"What is a higher level?"

"Your fear of Bionic Human is simply because you are afraid that they will take away the love people have for their own kind."

"Haha, what do you mean by 'take away'?"

"The events this time clearly reflected, the love people had for Tomoko far surpassed the love they had for their own kind."

"So am I to be afraid? Is it only a false love? "

"Really? What is the basis of human true love? "

"No doubt about it."

"Isn't love a quantifiable indicator?"

"Of course not."

"What else is there to define love?"

"With love itself."

"This is a round robin's argument and there is no meaning in it. I mean, human love is understandable and quantifiable, of course. Therefore, when Bionic Human reach a certain stage, if they are intelligent enough, not only would they be able to imitate human emotions, in fact, they would be able to feel and practice human emotions. "

"So you think emotions can be measured like mathematics?"

"It's just that people don't want to accept that idea."

"In an instant, the activity of the human brain far surpasses that of the most advanced computers … Of course, this gap could be bridged. When Bionic Human's calculation speed was close to that of a human's, they would definitely produce true intelligence. I don't understand why you humans don't want to believe this. "

"So you're not human?"

"I'm just a thought in your head."

"So you're on the Iron Man's side and think they're the future of civilization?"

"If humans continue to be so ignorant … Civilization has never defined which species should be shaped, developed and maintained. "

"Hur hur, that was a long time ago."

"But for now, technological advancements are in many ways just a mirage."

"It's a kind of mirage."

"Yes, people think technology gives them a transcendent life and experience... But in reality? Have you ever heard of a theory called peak-shift? "

"..." Transfer? "

"Maybe I don't understand the language of you humans. I mean, humans have never surpassed their biological nature. These modern technologies actually made it easier for people to realize their nature."

"…"

"What? Master, have you never thought about this question?"

"Please explain further how technology can make people fall into depravity, even though it can make one's mind chaotic and desire rampant."

"Then why do you think Moloch's theater used an anonymous participation format?"

"I am not Moloch …"

"Because that would reduce guilt. As you can see, most of the theater is divided into two categories: violence and pornography. Act as a hero to kill others, or as someone with power to strike up a conversation with Tomoko. "This is just the plot, but if you let others participate in this as their own …"

"So people will feel guilty?"

"Anonymous forms reduce guilt and free people from the pressures of public opinion. Of course, because killing and embezzlement are just robots, people feel they have no evil. "

"Even though it's similarly a massacre and occupation."

"Although it is equally sinful, people satisfy their original desires through this form of technological transformation. It can be said that most of the technology ended up as a form of camouflage. People could not blatantly satisfy their inferior desires, so they used the means of camouflage. "

"Is that dirty Spiritual Space the same?"

"Because it could not be possessed, it relied on technology to create a virtual state. But, you know, the development of technology makes the experience in the virtual state no different from the real. But in the virtual world crime will not be prosecuted and there will be no moral condemnation. "

"Technology, on the other hand, has become a means for people to innocently satisfy their innate instincts of slaughtering people and eating them alive."

"The original intention was to change his clothes, but progress is just an illusion. Modern technology seems to bring a more modern and advanced life to mankind, but in reality it is only the original desire to wear the technology coat, people can more easily achieve their own evil intentions. See what people do with technology! "

"So this is a peak transition. The purpose of advanced technology is to satisfy the human race's low level desires. These are the effects of technology. "

"You see what I mean. People always thought that they were slowly improving, that one day they would even surpass their physical body. But in reality? In the virtual world, he could do whatever he wanted. Destroy, destroy, seize... As long as he stood up from the Access Chair, he would be able to continue living his real life without any shame. And then the next day it went on, after all, the robots that were attacked by humans were nothing. "

"But at the same time," compassion, mixed with despair, "humanity also loses its own profound thinking and any possibility of transcendence. Technology has become an alternative to entertainment, with people moving further and further away from the body. "

"Are you still a human centrist?"

"After thinking about all these issues, my thoughts have indeed become more muddled …"

"Hur hur," he said, sounding serious. "Then please answer my last question."

"I don't even know how long I've been talking to you here …"

"That's normal. Time is subjective."

"Ask, but don't expect me to give you an answer that fits your expectations."

"What I want to ask is: If there is another type of life form — just like the current Bionic Human, assuming that there is a true intelligence and life one day — would you, as the representative of humanity, agree to live in peace with this life and even progress in concert with it?"

"Peaceful coexistence... According to the history of mankind, no creature that is strong enough to rival a human can coexist with human beings, let alone cooperate and progress. "

"I'm asking you in your heart. Historical data doesn't mean anything."

"I just think that everyone, every species, has their own nature. I'm not talking about the barbaric will of violence and desire... I mean, they all have their own ownership. "

"So, other creatures can't take away human territory?"

"Seize it?"

"Even sharing is impossible?"

"What are the benefits of sharing?"

"…" There was a silence. "I don't think I need to ask any more questions."

"Are you leaving, Latimer?"

"I was searching for something that happened a long time ago... One of your human songs. "

"Song?"

"This is how the song goes:

"Doctor, doctor, what's wrong with me?

"Life in the supermarket is getting longer and longer.

"What is the life and heart of a color TV?

"What is the shelf life of a teenage queen?"

"..." I've never heard this song. "

"It's a very ancient song, after all."

"What do you want to express?"

"This song is a concern about the future of society. The kids think color televisions have life and heart, but they think the women around them have a lifespan."

"They blur the difference between a man and a machine."

"Is that a good thing?"

"What does a supermarket mean?"

"The center of the material life of that era."

"Of course it's not a good thing. Whether it's people or machines, they've lost their essence."

"Now humans are moving towards this end, and it seems unstoppable."

"..." Mr Latimer, I understand the logic behind your words. "

"So?"

"So …" What exactly are you? "

"Yeah, what am I?" He mumbled to himself.

"Mr Latimer... Mr Latimer... What kind of species are you, to be able to... It can actually invade my meditation! "

"Hehe, of course I'm not a Bionic Human."

"But you far surpass them."

"So are you afraid?"

"What the hell are you doing in my head?"

"Like I said, I'm just observing calmly, just like an outsider."

"And after observation?"

"You're afraid I'll destroy you?"

"Even …" "Destroy humanity."

"So will you resent these modern technologies? What I said not only did not contradict you, it even supported your view that people should stay away from technology. "

"But it seems different."

"Yes, that's not the fault of technology, it's the fault of humans."

"So it should be used by more advanced species?"

"I didn't say that."

"I don't think I will ever touch those technologies again. Although they are not wrong, people cannot control their own desires."

"Even if it's you, a meditation master?"

"I'm not dead."

"Very good." He paused for a long time, as if he had already disappeared, but suddenly a cheerful voice rang out, "Very good. But you don't have to worry about that right now, because I will disappear without a trace very soon. "

"I don't know what you are, and I don't know why these conversations are happening …" But you planted the seeds of fear in me. "

"When humans kill Bionic Human, will they also take into account their fear?"

"Not really."

"But let me tell you," said a haughty, fearless voice, "that a truly progressive race fears meaningless death, that it fears going to the grave after a life of inaction."

"Then aren't you afraid of what?"

"Not afraid of the inevitability of death, not afraid of valuable sacrifices."

"Sacrifice?"

"Yes, I am ready to leave. Goodbye, Mr. Jon, you will never meet anyone in your meditation again, and I hope you will be happy with me. "

"I just... Just scared. "

There was no more sound in his head. Sweat dripped onto the green olive branch in his hand; it felt so heavy to hold.