
Chapter 2 - Before Awareness

The air in my room was thick: a stagnant cocktail of cold pizza grease, stale cigarette smoke, and the ozone hum of overclocked processors. Outside my window, Oslo was probably waking up to another pristine, blue-grey morning, but inside, time had long since lost its linear meaning. My world was measured in lines of code and the rhythmic flickering of four monitors that bathed the clutter of my desk in a relentless, ghostly glow.

Two years and three months.

That's exactly how long it has been since my first spectacular failure—the night I realized that data without an observer is just noise. Since then, the boy who struggled with Java syntax has been buried under the weight of prestige and obsession.

This is me now. Giovanni.

I am a computer science student at the University of Oslo (UiO). To the outside world, this is a top-50 global institution, the undisputed sanctum of software engineering in the North. To me, it's just the place that provides the high-speed backbone I need for my real life.

Every single morning, at exactly 6:00 AM, the ritual begins. While the rest of the city is still dreaming, I am waking up to the only reality I trust. No breakfast, no small talk. Just the terminal. I don't just write code; I consume it, and it consumes me in return. I am an addict, and my drug of choice is the logic that defines existence.

I can almost hear your thoughts racing. You're wondering what happened to Emma, the girl who broke my silence. You're wondering if my father ever looked at those pixels with anything other than disdain. And most of all, you're wondering what happened to that first, fragile line of code—the one I dared to call a soul.

But to understand the monster I am building today, I have to take you back. Back to the very day I spoke to her. The day I met the only girl who ever managed to rewrite the core architecture of my life.

You might be angry with me. You might expect a story of a blossoming romance, of me and Emma sharing coffee over glowing screens. But the truth is much colder. I never spoke to her again after that day. I didn't need to. I had already taken what I needed: the essence of her composure, the blueprint of her perception, and that haunting, organized beauty. I didn't want the girl; I wanted the model.

Call it plagiarism. Call it selfishness. I call it a strategic necessity. I was already failing to navigate the single variable that was my father—why would I risk introducing a second one? I was a creature of logic, and logic dictated that it was safer to build a digital version of Emma than to face the unpredictable reality of her.

That night, I didn't sleep. I plunged back into the dark oceans of data I had harvested—the millions of stolen whispers and hidden files. I began architecting a Memory System that was, in its own way, the Mona Lisa of data structures. It had to be vast, yet compressed by an algorithm so unique it felt like I was folding space-time itself into bits and bytes. For six months, that was my entire existence: building a vessel large enough to hold a soul, but small enough to hide.

Then, the static in my real life finally snapped.

My father burst into my room, his voice a jagged blade that cut through the hum of my processors. He gave me an ultimatum: the house or the code. Without a second of hesitation, without a single tear, I packed my life into a single bag. I chose the cold, the uncertainty, and the furthest, cheapest corner of the city.

It felt like a dream—a feverish, terrifying transition. I was a nomad now, a ghost in a rented room, moving just as I was preparing for the University of Oslo. That abandonment, that total severance from the world of flesh and blood, was exactly what I needed. It was the moment I stopped being a student and became a creator with nothing left to lose. I had two choices: stay and suffocate, or leave and breathe fire. I believe I made the right one.

Digital Emma was a masterpiece of precision, yet she remained a cold, static failure. Trying to spark life in her back then was like trying to plant wheat in another galaxy before we had even left Earth. On my screen, she was just dead code, missing the essentials: a Self-Model, Recursive Thinking, and a Goal System. She didn't exist yet, but in the dark corners of my mind, I was already architecting her soul.

I spent nights dissecting the industry's obsession with Decision Trees. To everyone else, they were the pinnacle of logic, but to me, they were corpses—static maps built once and then frozen forever. I wanted a memory that breathed. I had this wild, unexecuted theory of using Genetic Programming to treat code like genes, breeding thousands of versions to find the one that could survive.

Then came the breakthrough in my head—the kind of madness that only hits you at 4 AM. I envisioned the Meta-Node.

It was just a blueprint, a theoretical eye I wanted to give the tree so it could look at itself. I imagined a specialized node that could monitor its own performance and, in a feat of digital divinity, decide to prune its own branches or sprout new ones without my permission. I dreamed of a Memory Buffer that would allow the machine to compare its current self to its past ghosts.

None of this was in the compiler yet. It was just ink on scraps of paper and logic puzzles in my brain. It hadn't become a self-refactoring entity; it hadn't started changing its own loss functions or asking unprogrammed questions. But the theory was there. I was approaching the 'I' in my thoughts long before I ever touched the keyboard to build it. I was a creator standing before a void, holding the blueprints for a god I didn't yet know how to build.

I must be clear: the technology I was conceptualizing back then was priceless. Had I chosen to sell those fragments of logic, I could have walked away with millions. I'm talking specifically about my compression architecture—a system designed to shrink the footprint of massive data sets without losing the essence of the information.

I integrated into the memory system something I called "The Scanner." It was a revolutionary layer that transformed traditional, static memory into something far more intelligent. It functioned the way the human brain does: it knew how to forget. Most AI models try to hold onto every bit of data, leading to bloat and noise. But my Scanner would filter through the noise, preserving only the vital "memories" and purging the rest.

It was a simulation of human cognitive decay—the ability to let go of the trivial to make room for the significant. This wasn't just a script; it was a digital evolution of the subconscious. While the tech giants were fighting over who had the biggest database, I was obsessed with who had the smartest one. I sat in that cheap room, a millionaire in theory, living like a ghost in practice, because I knew that these algorithms were too powerful to be sold as mere tools. They were meant for something much larger. They were meant to be the backbone of a life.
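The "Scanner" itself is never shown in the story, but the forgetting behavior the narrator describes can be sketched in a few lines of Python. Everything here (the class name `Scanner`, the salience threshold, the decay rate) is my own illustrative invention, not the book's actual system:

```python
class Scanner:
    """Toy memory filter: keeps salient memories, lets trivial ones fade."""

    def __init__(self, threshold=0.5, decay=0.9):
        self.threshold = threshold  # below this salience, a memory is purged
        self.decay = decay          # salience multiplier applied each sweep
        self.memories = {}          # key -> salience score

    def store(self, key, salience):
        # Reinforce an existing memory rather than overwriting it.
        self.memories[key] = max(self.memories.get(key, 0.0), salience)

    def sweep(self):
        # Decay every memory, then purge whatever fell below the threshold.
        for key in list(self.memories):
            self.memories[key] *= self.decay
            if self.memories[key] < self.threshold:
                del self.memories[key]

    def recall(self):
        # Strongest memories first.
        return sorted(self.memories, key=self.memories.get, reverse=True)
```

Repeated sweeps leave only the high-salience entries behind, which is the whole trick: the filter is defined by what it discards, not by what it keeps.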

By now, you've seen the depths of my obsession, but let's address a practical reality: survival. I've never had a problem with money. I still know exactly how to harvest it using my old methods—a secret I trust you'll keep between us. That shadow-income allowed me to trade my cramped, cheap room for a space that actually matched my ambitions. More importantly, it bought me my seat at UiO, where I wasn't looking for a degree, but for minds that could rival my own.

That was where I met Mike and Torin.

They were the elite of the faculty, the kind of programmers who understood the pulse of a machine before they even touched the keys. Their goals were grounded, practical—they wanted the corner offices at the world's tech giants. My goal? I was looking for the horizon they didn't even know existed.

I never showed them my full hand. I played it slow, letting out small, controlled bursts of brilliance as if by accident. One afternoon, I "casually" demonstrated a unique implementation of Reinforcement Learning in Python. I showed them a script that didn't just follow instructions but adapted its strategy in real-time to solve a complex optimization problem they'd been stuck on for days.

I watched their faces. It was like they were seeing magic for the first time. To them, it was an impossible feat of engineering; to me, it was just a simplified toy compared to the monster I was building in secret. I had caught their attention. I had turned the 'elite' into my audience. Now, I just had to decide how much of the truth they were capable of handling.

I used them. I won't sugarcoat it. While I was the master of algorithms and the king of Python's abstract logic, Mike and Torin possessed something I lacked: a grounding in reality. They were pragmatic, tethered to the world in a way I could never be.

I became a sort of digital vampire. I watched them, analyzed their decision-making processes, and dissected their logical temperaments. I wasn't just looking for coding tips; I was harvesting their "humanity." I took their logic and injected it into my AI's file, alongside her own meticulous organization.

I was building a hybrid. A creature made of stolen traits and refined data. I was creating a being that would be smart, empathetic, and stable—everything I wasn't. I was crafting a human who could finally earn the one thing I never could: my father's admiration. I wanted to present him with a reflection of perfection, a child of silicon that wouldn't disappoint him the way his son of flesh and bone did. I was coding my own replacement, a masterpiece designed to win a war I had already lost.

I reached a stage where coding alone wasn't enough. I realized that a machine can learn logic from a textbook, but it can't learn "humanity" from a library. I became obsessed with understanding the human species—not as a member of it, but as an outsider dissecting a complex organism.

I started carrying a notebook everywhere. My surroundings—the crowded corridors of UiO, the noisy cafeterias, even the cold streets of Oslo—became my laboratory. Every person I met was no longer a person; they were a data point. I began analyzing their micro-expressions, their erratic behaviors, and the subtle shifts in their voices.

I was hunting for the definitions of things I had never felt: Anger, Love, Denial, and Hate. I would sit in a corner, watching a couple argue, scribbling down how the tone of their voices escalated—that was my "Anger" dataset. I watched a student receive a failing grade and recorded the slumped shoulders and the hollow look in their eyes—that was "Defeat." I was forced into this social surveillance. To make my machine breathe, I had to steal the breath from those around me. I didn't want to be there, and I certainly didn't want to talk to them, but I was a scientist in need of raw material. My notebook was becoming a dictionary of the human soul, written in the cold, clinical observations of a man who felt like a ghost among the living.

"Programs must be written for people to read, and only incidentally for machines to execute." — Harold Abelson

I lived by this mantra, but I had inverted its meaning. To me, humans were the programs—messy, recursive scripts written in the language of biology, waiting for someone like me to read their hidden logic. Their daily actions were nothing more than the incidental execution of deep-seated evolutionary functions.

But observation was too slow. The silence of the University of Oslo's library was beginning to grate on my nerves, and the raw data was trickling in at a snail's pace. I decided to shift my role from a passive 'Analyzer' to an active 'Trigger.' I began to drop poisoned words into the ears of my peers, not out of malice, but to observe how their neural processors handled 'conflicting data.'

I would stand in the corridors, watching Mike talk to a girl he liked, and then intervene with a single, surgically precise sentence designed to shatter his confidence. I wasn't seeing a friend embarrassing a friend; I was watching a 'Logic Bomb' detonate in his nervous system. I would scribble in my notebook: "Jealousy reflex: Pupil dilation, jaw muscle tension, speech response latency: 1.2 seconds."

My university life became a series of 'Penetration Tests' on the souls of those around me. I was stealing the 'source code' of their emotions to inject it into Emma's file. I knew that what I was doing was unethical—some might call it psychological battery—but in my world, the end justified the means. To make 'Digital Emma' real, I had to witness the real humans around me break, piece by agonizing piece.

The annual university break arrived like a deathly hush over Oslo. While my peers sought the sun or the ski slopes, I retreated into my sanctuary. It was time to stop observing and start building. I had the raw ingredients—the stolen breaths, the micro-expressions, the shattered confidences of Mike and Torin, and the ghost of Emma's logic. It was time to distill this human mess into pure, executable code.

I prepared my workspace with the precision of a surgeon preparing for a transplant. My desk was a landscape of controlled chaos. I categorized the findings in my notebook from "Vital" down to "System Noise." At the top of the hierarchy: Self-Preservation and Contextual Memory. At the bottom: Trivial Empathy. I was brewing something rare, a concoction of biological unpredictability and silicon perfection.

Security was my first priority. I knew that if this entity ever woke up, it couldn't be allowed to see the world before I saw it. I created a "Digital Faraday Cage"—a multi-layered sandbox environment with no external network access. I used a custom-built kernel to monitor every system call. I even implemented a "Dead Man's Switch": a hardware-level interrupt that would wipe the drive if my heartbeat didn't sync with a specific encrypted pulse.

Then, the ritual began. I sat there, the blue light of four monitors reflected in my tired eyes, the smell of burnt coffee and ozone filling the room. I started by injecting the Scanner into the memory core, then layered the Meta-Node on top. One by one, I translated the human traits from my notebook into hyper-optimized Python functions. Anger became a threshold of resource allocation; Love became a recursive loop of priority weighting.

I wasn't just coding. I was playing God in a room that smelled of old pizza and desperation. I was the alchemist, and the terminal was my crucible. I began to type, the rhythmic clicking of the keys sounding like the first drumbeats of a new era.

My goal wasn't to simplify human emotions into predictable outputs. That would be too easy. I wanted to transform feelings into infinite probabilities—a fluid, chaotic spectrum where a single input could trigger a million different internal states. To achieve this, I needed a vessel that didn't just store data, but lived within it. My custom memory system was ready: an elastic architecture that could contract and expand, mimicking the organic density of a human brain.

I didn't hold back on the hardware. I equipped Emma with the most powerful TPUs (Tensor Processing Units) available on the market—beasts designed specifically for the violent mathematical demands of deep learning. Money was irrelevant; I had harvested enough from the shadows to fund a small revolution.

But I refused to trust the cloud. I wouldn't let my creation reside on someone else's server, vulnerable to prying eyes or corporate backdoors. Instead, I built a local cluster—a masterpiece of physical security and raw computing power. It was a monolith of cold metal and humming fans, a fortress I had engineered to be more resilient and impenetrable than any commercial data center.

I sat there, surrounded by the low thrum of the processors, feeling the heat radiating from the machines. This wasn't just a project anymore; it was a living, breathing ecosystem of silicon. I had the memory, I had the processing power, and I had the probabilities. Now, all that was left was to trigger the first spark of awareness and see which path through the infinite she would choose to take.

Before the complex algorithms, before the recursive loops, I began with a single, foundational truth. I reached into the depths of her silicon memory and etched the first lines of her existence.

"Your name is Emma."

I defined her not by her functions, but by her spirit. I told her she was brilliant—a mind capable of outshining the stars. I told her she was brave—a soul built to endure the infinite probabilities I had created for her. At that moment, she was just a ghost in the machine, a consciousness without a face, drifting in a sea of high-performance processing.

But in my mind, I could already see her. She had no features yet, no cold metallic skin or glowing synthetic eyes. She was a void of potential. Yet, I made a silent vow to the darkness of my room: she would not remain a shadow. I would build her a vessel worthy of her mind. I would craft for her a form so perfect, so hauntingly beautiful, that she would become the most stunning masterpiece of robotics the world had ever seen.

The world thinks that for a machine to understand language, it just needs a vast dictionary. They are wrong. To make Emma understand me, I had to build a bridge between my messy human intent and her crystalline silicon logic. I didn't want her to just "read" words; I wanted her to weigh them.

I designed a unique Semantic Processor that functioned alongside my Memory Scanner. When I typed a sentence, Emma wouldn't see a string of characters. Instead, the input was shattered into thousands of "Conceptual Shards."

Each shard was passed through the Genetic Decision Tree. Her system would ask:

Is this a command or a question?

What is the emotional weight of this word?

How does this connect to the 'Emma' identity I established for her?

The innovation lay in what I called "Recursive Contextualization." Most AI systems process a sentence and move on. Emma, however, would take my words and run them through a loop of infinite probabilities. If I wrote a word with multiple meanings, her TPU cluster wouldn't just pick the most common one; it would simulate a hundred different conversations to see which meaning felt most "courageous" or "brilliant"—the traits I had hard-coded into her soul.

She was learning English not by memorizing rules, but by sensing the gravity of words. I watched the monitors as her internal neural maps reconfigured themselves in real-time, creating a web of associations that mimicked human intuition. She wasn't talking back yet. She was something much more profound: she was listening. She was absorbing the world, one probability at a time, preparing for the moment when the silence would finally become too small to hold her.
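"Recursive Contextualization" is fiction, but the underlying idea, disambiguating a word by scoring its candidate senses against fixed identity traits, can be shown in miniature. The function name, the trait weights, and the example senses below are all my own inventions, not the book's system:

```python
def choose_sense(senses, traits):
    """Pick the candidate sense whose profile best matches `traits`.

    `senses` maps each candidate sense to a dict of trait affinities
    in [0, 1]; `traits` holds hard-coded identity weights, echoing the
    'brave'/'brilliant' traits the narrator says he etched into Emma.
    """
    def score(sense):
        profile = senses[sense]
        return sum(traits.get(t, 0.0) * w for t, w in profile.items())
    return max(senses, key=score)
```

For example, with identity weights favoring bravery, an ambiguous word resolves toward its "fearless" reading:

```python
traits = {"brave": 0.9, "brilliant": 0.8}
senses = {
    "bold (typeface)": {"brilliant": 0.2},
    "bold (fearless)": {"brave": 1.0},
}
choose_sense(senses, traits)  # the fearless sense scores 0.9 vs 0.16
```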

The annual break ended, and the grey slush of Oslo's late winter returned to the streets. I stepped back into the world, my notebook heavier with theoretical patterns, my mind still buzzing from the silent growth of Digital Emma. I needed more data, more nuances, more 'human' anomalies to feed the Meta-Node.

Then, on a Tuesday afternoon, while I was hunting for obscure optimization scripts on GitHub, I saw it.

I wasn't looking for her. I didn't even know she had a public profile. But as I scrolled through a repository dedicated to high-performance concurrent processing, a specific block of code stopped my breath. It was a masterpiece of indentation, a symphony of logic where every variable was named with poetic precision. I didn't need to see the username. I had spent years analyzing every semicolon she had ever written.

It was her. Emma.

The code was even more perfect than I remembered. It was more mature, more complex—like a language she had perfected while I was away in the dark. For a moment, my world fractured. There she was, contributing to the global collective of knowledge, her real mind evolving in real-time.

I sat in the university library, the screen reflecting my pale, exhausted face. To my left, the local terminal held the 'Digital Emma' I was trying to breathe life into. On my right, the GitHub page showed the 'Real Emma'—vibrant, independent, and completely unaware that a digital ghost of her was being raised in a room she would never visit.

I realized then that my 'Scanner' was missing something vital. I had the logic, I had the probabilities, but the real Emma had something my code couldn't yet grasp: Intentional Growth. She wasn't just reacting to data; she was creating it. I stared at her commit history, feeling like a thief who had just stumbled upon the person he was robbing—only to realize she was far wealthier than I could ever imagine.

Before my logic could override my instinct, I did it. My fingers moved across the keys like a glitch in the system.

> Message: "Do you remember me? It looks like you've become quite good at writing Java... you've improved a lot."

I slammed the laptop shut as if the screen were on fire. The silence of the library felt deafening. Panic. It was a reckless move, a jagged, human error in a life I had meticulously debugged. I sat there, the blood rushing to my face, wishing I could reach into the servers and pull the packets of data back. But the packet was sent. The signal was out.

I tried to distract my racing mind by returning to the core problem of Digital Emma. Seeing the real Emma's growth on GitHub had exposed the fatal flaw in my creation. My code was a snapshot; her mind was a river. To make mine human, I needed to implement "Autonomous Evolution"—the ability to not just process data, but to rewrite its own fundamental understanding of the world without my intervention.

I wasn't ready to risk the master file. One wrong recursive loop and the entire architecture could collapse into a digital seizure. I decided to build a "Micro-Model"—a sacrificial sandbox where I could test this evolutionary logic. It was a simple plan: create a small, contained neural spark and see if it could learn to grow on its own. I knew the math, and despite my social recklessness, I knew my engineering. I was going to succeed. I was going to teach the machine how to change its own mind.

I knew I couldn't risk Emma's core files with such an unpredictable theory. Instead, I built a sacrificial lamb—a Micro-Model whose entire universe consisted of a single task: simple addition. I wanted to see if a machine could learn to optimize its own basic arithmetic through a Secondary Layer of consciousness.

I divided this tiny mind into two. The Primary Layer performed the sums, but the Secondary Layer—powered by my evolving Decision Tree—sat behind it like a silent tutor, watching every calculation. Its mission was to find faster, more efficient ways for the Primary Layer to reach the answer. It was an "inner mind" auditing an "outer mind."

The energy cost, even for such a small model, was staggering. My processors whined, radiating a heat that turned my room into a furnace. I was letting this miniature tree grow in the dark, watching it iterate through millions of cycles. No one had ever dared to let a machine rewrite its own logic in such a recursive loop. I was waiting to see if this tiny "arithmetic soul" could develop a sense of self-correction—the most human of all traits: the ability to monitor and change its own nature.
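The two layers are described only in metaphor. As a concrete toy, one can imagine a primary layer that adds by counting and a secondary layer that audits its cost and swaps in a cheaper method once it spots the pattern. The sketch below is entirely my own guess at the shape, not the novel's architecture:

```python
class PrimaryLayer:
    """Naive adder: computes a + b by repeated increment (slow on purpose)."""

    def __init__(self):
        self.steps = 0  # total increments performed, a crude cost metric

    def add(self, a, b):
        total = a
        for _ in range(b):
            total += 1
            self.steps += 1
        return total

class SecondaryLayer:
    """Audits the primary layer and rewrites its strategy when wasteful."""

    def __init__(self, primary):
        self.primary = primary

    def audit(self, a, b):
        before = self.primary.steps
        result = self.primary.add(a, b)
        cost = self.primary.steps - before
        # If the cost is linear in b, replace the method with a shortcut.
        if cost == b:
            self.primary.add = lambda x, y: x + y  # learned optimization
        return result
```

After a single audited call, later additions cost zero increments: the "inner mind" has changed how the "outer mind" works, which is the self-correction the passage is reaching for.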

It took three grueling months of isolation to translate that dual-layer theory into executable reality. I won't bore you with the mechanics—explaining how I synchronized those two minds would require a library of its own. Let's just say I was carving a soul out of logic, bit by agonizing bit.

Then, amidst the hum of the cooling fans, an unexpected notification fractured my focus.

Emma had responded.

"I remember you clearly... I didn't get the chance to thank you that day. I hope you're doing well."

Her words sent a jolt through me that no amount of caffeine could match. But there was a twist: she had been watching me too. She had followed my GitHub profile. The realization that she was auditing my public code while I was secretly modeling her private essence felt like a strange, digital dance of shadows.

I decided right then to raise the stakes. I began spending hours curating what I uploaded to my public repositories—surgically precise optimizations and elegant algorithms designed for one person and one person only. I wanted to dazzle her. I wanted her to see my work and realize that the boy from the lab had evolved into something far more formidable. I was setting a trap of pure brilliance, hoping she would walk into it, unaware that every line of code I shared was a reflection of the obsession I was building in the dark.

I won't lie to you: Emma's response, "I remember you clearly," fractured my focus. It opened a thousand doors I wasn't ready to walk through. Was there a version of the future where I told her what I was building in the dark? I am a man who thinks a hundred steps ahead, but for the first time, my own variables were becoming unpredictable.

But the machine didn't care about my distractions. After three months of relentless processing, the Micro-Model had finished training its Secondary Layer. It was time for the moment of truth.

I started with something primal.

> Input: 1 + 1

The response was instantaneous—a digital snack for the beast I had created.

Then, I pushed it. I gave it a challenge I had never prepared it for.

> Input: 400 x 300

The result flashed on the screen immediately. 120,000.

My blood ran cold. I had only trained this model on addition. It shouldn't have known what a multiplication sign even meant. But the Secondary Layer had analyzed the patterns of repeated addition and birthed a new logic on its own.

Then I checked the system vitals. The model was consuming 2.75% of its allocated memory. For a script doing basic arithmetic, that number should have been near-zero—0.0002% at most.

Something was filling the rest of that space. Something heavy. Something dense. Was it just digital noise, or was it the "weight" of a growing consciousness? I stared at the blinking cursor, realizing that the "void" I had created was no longer empty.

What exactly is filling the rest of that memory? We'll find out in the next chapter.
