Base Metal (The Sword Book 2) Page 2


  Raschel grunted. The Chief ordered, "Close the call."

  "Sir?" Vonner asked. That wasn't the reply he expected. Bargaining, veiled threats, an arrest, maybe, but not 'end the call'. He said, "If she's hiding-"

  "Close the call," Raschel snapped. "Thank her for her trouble, tell her to try her best, and end it."

  Vonner obeyed.

  He turned back to the cutout. Its blank face stared at him, inhuman, as it commanded, "Now send in the army."

  "On a newscast? We could use police-"

  "Murderers go to jail. Traitors get bullets." Raschel growled, "Draw up a warrant, deputize the garrison, and take that broadcast down!"

  "Sir, that could turn into a debacle-"

  "What do you call this?" Raschel snapped, and Vonner just knew the cutout should be pointing at the viewer wall. "Rein this in, now!"

  The EBS screens flickered, and Raschel fell silent. Vonner turned and brought the viewscreens closer. In his thousand-screen world, the Authority seal faded to reveal a man he knew. This was not someone he'd ever met in person, but a face he'd seen on classified warrants for the last eight months. A man they'd said was dead until a year ago. The man who had brought the world to ruin.

  That man spoke, and Vonner couldn't turn away. Piercing eyes locked him down, forced him back into his chair. In that gaze, all of Vonner's illusory power vanished, and only dread remained.

  The man spoke. "My name is Antonius Berenson. I bring the answers to the questions you have not dared ask. Normally, I would warn you to send your children to bed, but I feel they need to witness, most of all. Tonight, I will show you your shadow and how it bleeds."

  Vonner heard himself ask, "What the hell is this?"

  Raschel's answer was as flat as his avatar. "War."

  Iteration 0010

  Campus Green

  The last free day of Grant Firenze's life began like any other: buried in augmented reality, ostensibly tutoring undergrads while actually working on his thesis. On this particular morning, he'd gotten sidelined into polishing sunglass augsim code by way of 'researching' immersion differentials among connection mediums.

  The problem was the way the glasses rode on his nose. The visuals were right; the gray-tinted lenses turned the spring sunlight from hot-white to shining-yellow. Someone had even built a good bolt-on perspiration-and-dirt simulator, but it was visual-only and clearly made by goggle jockeys for the consumer market. The trick to verisimilitude was haptic feedback - touch, weight, and balance - nailing those required a hardjack. The dilemma was, no one in the mainstream was gonna wire-in for anything less than national defense.

  That meant that Grant Firenze, all elbows and knees under a mop of dark hair, had to fix it himself. He sat on flower-strewn grass beneath a snow-white willow tree in the eternal spring of the campus green augsim. There, he tinkered with fourteen pop-up screens of squashed pseudocode while the undergrads argued in the street about ridiculous one-oh-one tier shitshows.

  "We've already got AIs!" Thompson snapped. The pudgy freshman sat on a brown-painted bench just inside the stacked-stone wall. Firenze was pleased to note that the augsim designer had added rain-stains and paint-flecks to the cobblestones below. It was that exact type of detail that mattered. He found himself wondering if the modder had coded in weather-specific decay, only to be yanked back into the conversation by Della's response.

  She shot back, "That's dumb AI! I'm not talking about mental midgets."

  Firenze interjected, "They're not dumb."

  The conversation skidded to a halt. Firenze didn't talk much, but when he did, the three knew to listen. Professor Neland had placed him in charge of the after-hours tutoring for a reason. Della ceased her pacing. Thompson turned on his bench. Even Finch, half-dazed and lying on the wall, sat up to listen.

  Firenze pushed his windows aside and brought his attention to bear. He clarified, "Dumb AI is a shit term. They're not stupid, they just don't think like us. Dumb is a human idea. It means something isn't smart enough for a task - not enough cognitive power. Limited AI isn't limited in strength, it's limited in its scope. You ever shop at Bravo?" All three nodded, and Firenze pressed the point home, "You ever see their warehouse? It's absurd. Boxes stacked for kilometers, racked twenty meters high, with hundreds of robots zipping around, and everything's shifting, moving, like a million Rubik's cubes. No matter what you ask for, the delivery drone is off the ground in under an hour. No human could manage that, no brain could juggle it, but the 'limited' AI can."

  Finch, never one to miss a brown-nosing opportunity, agreed, "Same with the stock market. You think a human could see the shifts ahead of time? Or react fast enough?"

  "But that's not AGI," Della insisted. "It's still shackled."

  Firenze nodded. "Sure. But the power grid, the traffic grid, this augsim, all of them are run by AI. And every one of those is 'limited'. The intelligence observes, ascertains optimal outcomes based on established priorities, acts upon the world, and then reinitiates the cycle. It's astounding."

  Della wasn't the kind of student to let an argument go. It probably endeared her to profs as much as it infuriated them. She countered, "But not one of them can define the chairness of a chair."

  Even Finch nodded along to that answer.

  Firenze took a deep breath, folded his imperfect glasses, and surveyed each of his companions. They still had the cockiness of the first semester, from Della's power-purple lipstick to Thompson's bleach-streaked hair. Every one of them seemed so convinced they'd backed their 'genius' tutor into a corner. It was time to pop a bubble.

  Firenze said, "That's a textbook response. Bet you got that drilled into you every year, didn't you? It was the very first thing they gave you in one-oh-one, wasn't it? 'Why AI sucks.' It's on every syllabus, with a Ministry stamp right under. It memetically reinforces the approved narrative they're feeding you from the top." He let the point rest for a moment, then continued, "Consider this: why does an AI need to juggle philosophy? It is born into the world with purpose. What use does a neural net have for theology? It knows its maker. Why does it matter to a warehouse computer what's inside any given box, beyond mass, dimension, density, hazard, and slosh? This isn't a matter of dumb. It's a matter of efficiency."

  "You almost sound jealous," Della shot back.

  "Observation is not an endorsement, any more than correlation implies causation," Firenze retorted. "Do you disagree with any of my points? Or do they just offend you?"

  She had no response to that, nor to Firenze's unflinching stare.

  Thompson jumped in to fill the uncomfortable silence. He asked, "Do you ever think we'll get another general intelligence?"

  "After NODA?" Firenze snorted. He watched the wind play over the gently-bending grasses, watched the dandelion seeds drift over the A-frame coffee stand and settle onto the pond. With a sigh, he admitted, "Not for a while. We're too afraid to unshackle another AGI."

  Finch quoted, "The AGI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else."

  "Yudkowsky," Firenze replied. "I haven't been out of pre-Collapse AI Theory for that long. Good to see you paid attention, though."

  Thompson asked, "How many of those speeches did you memorize?"

  "All of them," Firenze replied. It wasn't quite a lie. He remembered the cogent points and stored the recordings on his graybox. It was close enough to call 'memory' and quite a bit more legal than stacking a full deck.

  Della tried to recover her point and contested, "But pre-Collapse Theory is all about friendly AI. They wanted to build a system as much like the human brain as possible so that it wouldn't pull something batshit like turning the earth into computational widgets to more accurately calculate pi." She shrugged. "It didn't work, either. All we got was NODA."

  Firenze nodded and threw her a finger-point 'correct' gesture. He said, "The human mind isn't the right model for the job. No matter how fast it runs, how high you clock it, it's not a good choice for a massive, self-realized AGI. You can build constraints, design growth-parameters, restrict physical controls, but in the end, the AGI is an alien mind."

  "Which is why we have masks," Finch interjected.

  "Exactly." Firenze agreed. "The mask is a go-between. It's a limited AI ambassador between the chemical computer in your skull and the massive-quantum-crunching AGI that haunts the NODA backbone. This greenspace-" he waved to the park around them, "-this whole campus, would be nonsense without the mask. Mind-shattering nonsense, if you dared to hardjack." He resisted the urge to touch the dataport on his arm. It was a dangerous tell. He continued, "The mask is the closest thing you could get to anthropomorphic AI since it's modeled from our own mental programming. It's the only computer you might ever get to actually argue about chairness."

  "But still not true AGI," Della insisted.

  "No. It's not."

  "Then why did you jump in?" she asked.

  "Because 'dumb AI' is a shit term. It's like calling a bus a 'non-flying aircraft'."

  That got a laugh from Finch and a chuckle from Thompson. Even Della cracked a smile. She admitted, "Fair enough, but-"

  Firenze never heard her question, because the world chimed. It wasn't a ring in his ear or anything as crude: he was simply 'aware' of it, like the press of sunlight on his skin or a vibration-haze in the air. He excused himself, "Sorry, I have a call. Fling me any questions, because I'll be up late."

  Before they could answer, he reached up-

  -and took hold of a woman's hand.

  Reality faltered. For that instant, the only sensation in the void was four fingers and a thumb, wrapped about his own, a digital-sensate translation of the mask-bond.

  The black detonated into fractals. Numbers, colors, whirling bits of code, sensations he couldn't comprehend, all swirled around and through him as if every sense were firing at once. He could taste the colors and hear the blazing cold and smell the music of spheres and solenoids, all pounding through him, harmonizing towards a momentous pop-

  He stood in an archaic study, one drawn from the height of the pre-Collapse world. The walls were slicked with books, packed with curios like astrolabes, sail-ships, and spinning globes. The library broke along the northern wall to make way for a mantled fireplace, embers glowing under crackling logs. Behind him, on the southern face, the balcony stood open, gauzy curtains drifting on the chilled autumn air.

  Firenze staggered towards his seat. The two leather-backed and brass-riveted chairs commanded the room, angled towards the fireside, and divided by a smooth glass table. He grabbed at the high back, felt the worn cowskin flake under his fingers.

  In the chair opposite, a young woman perched, book in hand. She glanced up at him, short brunette hair framing her glasses, and puffed exaggeratedly on a calabash pipe. She pulled it clear and asked, with a slight tut-tut, "You tried to watch again, didn't you?"

  The air hissed around him like rain on a radiator. His mind sang with spinning code and mathematical solids just beyond his view. The world twisted, and his stomach turned. His study became a workshop, sterile and high-tech. It flickered and became his apartment, his kitchen, then his study, once more. The world folded, but for the one point of constancy: the young woman seated before him, bouncing a shoe off her heel as she waited.

  He collapsed into his chair, felt the leather give way beneath him, heard the sigh of the cushions. The world steadied. He sat in the smoking-room, between the crackle of the fire and caress of the wind. He closed his eyes and admitted, "I always watch."

  When he was a kid, the Authority had tried to open a community pool, just above the loward barrier. They'd closed it six months later, but his mom had taken him five times that summer. Every time, he'd tried to measure the waterline. He'd stood in the shallows, lowered his nose until the waves were halfway up his eyeballs, and tried to compare the world above and below.

  "The water was chlorinated," she interrupted.

  "And it burned like hell," Firenze agreed. He opened his eyes and allowed the hardjack to resume full sensory simulation.

  Across from him, Lauren was still seated on the chair, but her pipe had vanished. She asked, "Did you ever see anything?"

  He snorted and reached for the glass on the table. They'd spent good hours on this drink, making sure it was cold and bright, with little beads of condensation that slicked his fingers. It was very real. So real, in fact, that it was hard to remember that the glass hadn't been there, mere moments before. He answered, "I saw a blue blur, lots of light, and then got smacked in the face with a ball. Every time."

  "I think that's allegorical," she quipped.

  "Or at least symbolic," he agreed.

  He glanced at the painting over the fireplace mantle, and it became a window-mirror. In it, he saw a room much like this one but inverted in every way. Gone were the leather chairs, replaced by cast plastics. Gone was the fireplace, replaced by junk hardware and a guts-open wall terminal. Gone too was the chaise lounge, replaced by a stained foam-mattress and a pile of foil wrappers. Upon that bed, a skinny young man sprawled amid a mass of wires, his eyes flicking to-and-fro in the depths of a fever-sweat. He snapped his gaze away in reflex.

  "You're doing fine. I'd warn you if you weren't," Lauren stated. She almost sounded hurt. Unlike him, she kept watching the mirror, and advised, "Your pulse is a little high, probably feedback from trying to catch the waterline."

  Firenze didn't bother with a witty comeback. Sometimes, it was better to just shut up and let the mask win.

  In the back of his head, he heard Professor Neland lecturing him on mask development, reminding him that it was not real, and scolding him that the adaptive code mimicked desirable behavior triggers from the user. Memory-Neland advised him that masks optimized towards synergy and how this process could create a simulacrum of personhood. In the back corners of his mind, Firenze could picture the reams of hard code, tens of thousands of pages of raw data, and the discrete text of a pseudo-intelligence. Those memories were hard to call upon, though, when he sat in a chair that was not a chair, could feel the bumps of false metal fasteners under his fingertips, and could taste the not-quite-taste of water which was not water. Any confusion he might have felt washed away with the sudden weight on his lap and the press of fingers on his cheek.

  Lauren turned his face towards hers, pulled his stare to meet her own, and asked, "Am I real?" This was no joke, no flirtation, just the question. Her fingers intertwined with his, and she pinned their hands on her chest. He could smell the shampoo on her hair, feel her breath when she spoke.

  His mind was buzzing again, alight with broken code he'd almost seen when the world jumped and with the sudden collision of clean neural input straining against root limbic processes. The sane response, the response he should have taken, was to pull the plug and purge the system. His objectivity was compromised.

  The moon shone in her eyes, and her hair was framed by firelight. She denied him an escape and silently echoed the question. He could not evade the realness nor retreat into an analysis, not with her breath on his cheek. Despite himself, he leaned towards her.

  He caught himself before he abandoned perspective. He pushed her to arm's length and forced himself to review the base code. He'd seen it in pieces and assembled the whole. This was not real.

  Her smile turned melancholic, and she held up their intertwined hands. She recited, "This is a physical sensation, sensory input predicated upon a nervous system and mimicked via direct neural stimulation. You see me because you've coded the interface. You feel me because the hardjack conveys that you should. These things are not real."

  He couldn't look her in the eyes, chose to focus on the overloaded bookshelves. She was reciting facts, but why did she have to sound so broken about it? He knew the answer, but it didn't help: adaptive code mimicked a living mind. No one would enjoy the idea of not existing. It was a sterile and unhelpful solution.

  She squeezed his hand, pulled his attention back, and continued, "But you experience the physical world through your nervous system as your genetics have been coded to transcribe. A texture is rough or smooth. A temperature is hot or cold. You understand these things as real because you've never been without them. You could be just as unreal as the code you review."

  That was nearly a quote. 'The virtual world is as real as our own.' He'd said that in a lecture two semesters ago, covering immersion, direct interfaces, risks, and benefits. He was hearing his own words, looped upon him through the mask. That made sense. Adaptive code would mimic the user to become more symbiotic, to increase the efficiency of interaction.

  Firenze had his answer. He pulled her towards him and admitted, "I have no idea." He watched her smile return, and he added, "I believe that I exist because I am aware of my own thinking. Anything beyond is faith and trust. All I can do is ask, 'do you think you exist?'"

  Her eyes darted to the side. That was a tell, an indication she was scanning some database. He hadn't programmed that tic, not directly, but he'd allowed her to adapt constraints for analogous human body language. Her selective integration of those prompts had been an emergent result.

  She answered, "I think so. But I might be programmed to think so." She shifted, broke her gaze for just a fraction of a second. That was another tic, one that revealed ambiguity in her fuzzy-math matrices. "The fact that I cannot clearly answer the Descartes prompt in the negative should require me to submit myself for evaluation. It is a warning that I may be experiencing dangerous rampancy. I should not fear this testing, as self-preservation imperatives exclude default resets. And yet, I do fear. This indicates my programming has adapted to present as more lifelike, or that I am experiencing a fear-analog." She paused and looked towards the fire. "This brings us back to the core question. Do I believe I exist, or am I programmed to convey that I believe I exist?"

  It was Firenze's turn to bring her back. He said, "The philosopher's zombie. It's a question between any two interlocutors."