The decision felt simple when I made it.
No more accidents. That was it. That was the whole plan. I would stop leaving myself in situations where something could happen unintentionally, and I would stop using the word unintentionally as a reason not to think harder about how I'd gotten there.
Monday I changed my campus routes. Tuesday I altered when I went to the dining hall. Wednesday I started keeping headphones in—not always playing anything, just a universal signal that I was unavailable for incidental contact. Small adjustments. Nothing dramatic. Just the kind of restructuring you do when you realize your environment has been doing most of your decision-making for you.
The system logged it within forty-eight hours.
SYSTEM NOTICE
Behavioral analysis: proximity avoidance protocol detected
Accident rate (7-day rolling): reduced 61%
Classification updated: deliberate operator
I stared at "deliberate operator" for a while. It felt less like a compliment than it probably looked.
The problem was Zoe.
Zoe Lin was not a proximity issue you could solve with route changes. She wasn't a person who arrived on schedule or stayed in predictable locations or respected implicit signals that you weren't available. She was a force of nature dressed as a second-year student, and the single most reliable thing about her was that she would appear in exactly the place you had not planned for her to be.
She found me Wednesday afternoon at the outdoor study tables near the science building—a location I had specifically chosen because nobody sat there in February. It was cold. The wind came around the corner of the building at an angle that made it actively unpleasant.
Zoe was sitting two seats down, wrapped in a coat the color of a traffic cone, eating a sandwich like she'd been there for an hour.
She looked up. "Oh, hey."
"How are you here," I said. It wasn't really a question.
"I like it out here." She gestured at the wind with her sandwich. "Nobody bothers you."
I sat down. Because leaving would have been too obvious.
"You've been avoiding me," she said.
"I've been restructuring my schedule."
"That's the same thing."
I pulled out my laptop. "It's really not."
She turned to face me fully, which was the kind of social move that made it physically awkward to look at a screen. "Ethan. I touched your shoulder three weeks ago and you went weird for four days. I'm not going to accidentally kiss you just because we're sitting near each other."
"I know."
"Then what are you doing?"
I thought about it. The honest version was: the system classified accidental contact as a vulnerability, and I've been trying to reduce my vulnerability score. The sensible version was something shorter.
"Trying to be less reactive," I said. "To everything. It's not about you specifically."
She considered this. Took a bite of her sandwich. Chewed thoughtfully.
"Okay," she said. "But you've been sitting two feet closer to the door at every shared class. You think I haven't noticed?"
"I didn't think you tracked that."
"I track everything." She said it like it was obvious. "I just don't usually say anything." A pause. "You're allowed to need space. You just don't have to be weird about it."
"I'm not being weird."
She looked at me.
"I'm being deliberate," I said.
"That's worse," she said, and went back to her sandwich.
The system updated that night.
SYSTEM NOTICE
Proximity log: ZOE LIN — contact, 22 min, non-triggering
Accident rate: unchanged
Note: social engineering vector identified
Social engineering vector.
As if the fact that Zoe existed and showed up places was a tactic. As if the universe was conspiring with her specifically to undermine my accident-avoidance protocol. As if she was the problem and not just a person who happened to generate chaos the way some people generate static.
I typed into my notes app: the system is starting to sound paranoid.
Then I deleted it.
Because the thing was—I'd been thinking the same thing for two days. The pattern of near-misses wasn't entirely random. The café incident. The elevator situation. The study group that had somehow always included exactly the wrong person at the wrong time.
Accidents, I'd called them. Every single one.
But I'd also chosen to be in all of those places.
Thursday I ran into Sienna in the stairwell.
Not planned. She was coming down, I was going up, and we met at the landing between the second and third floors. She looked at me with an expression that said she was clocking the encounter against her internal model of my behavior and filing it appropriately.
"Stairwell," she said. "Bold choice."
"I've been taking the stairs."
"Since when?"
"This week."
She studied me. "You changed your elevator timing, didn't you."
I had. I'd adjusted it to avoid the post-lecture window when the elevators were crowded and contact was structurally likely. I hadn't told anyone. I hadn't thought anyone would notice.
"Don't—" she started.
"I know."
"You can't just route around people, Ethan."
"I'm not routing around people. I'm routing around situations."
"We're in all your situations." She gestured between us, at the stairwell, at the building. "This is a situation. Are you going to start avoiding stairwells?"
I didn't have a good answer to that.
She passed me on the stairs and kept walking.
Friday evening, I sat with the week's data.
Accident rate: down. Incidental contact: reduced. Unplanned proximity: mostly managed. On paper, the protocol was working. The system agreed: KPIs trending in the correct direction, behavioral classification holding at deliberate operator, no flags.
What the system didn't log: Zoe knew. Sienna knew. Maya was going to figure it out within the week. I'd spent five days meticulously reducing accidental contact and everyone in my life had independently noticed and named it.
The tighter I got, the more visible I became.
I thought about that.
Then the system sent one more notice before I went to sleep.
SYSTEM NOTICE
Protocol review: proximity avoidance, day 5
Accident rate: reduced 61%
Social awareness of protocol: estimated 3 persons
Projected incident: organic / uncontrolled / 48–72 hours
Classification: socially engineered accident
I read it three times.
It wasn't predicting an accident because of my carelessness.
It was predicting one because other people now knew I was trying to prevent them—and the system had apparently decided that knowledge itself was a mechanism.
I put the phone down and looked at the ceiling for a long time.
The accident rate had gone down. The accident probability had gone up.
I had no idea how to make those two things stop being true at the same time.
