Bridge 1 - Time¶
The Problem: Temporal Shear¶
Heterogeneous agent systems collapse when agents operate at different clock speeds. A scripted bot expects an ACK in 50ms. An LLM-backed automaton takes 4 seconds to compute its response. The bot throws ETIMEDOUT, retries, floods the bus, and the system degrades.
This is temporal shear - the failure mode that emerges when a protocol assumes synchronized clocks or bounded response times across agents that have neither.
```
Bot sends REQUEST at t=0
Bot expects ACK by t=50ms
Automaton begins inference at t=1ms
Automaton completes inference at t=4,012ms
Bot has already thrown ETIMEDOUT at t=50ms
Bot retries (x3), gives up, marks Automaton as DEAD
Automaton sends ACK to a caller that stopped listening
```
Every retry compounds the problem. The bus fills with stale requests. Slow agents drown in duplicates. Fast agents conclude the network is down.
The AXL Solution: Self-Contained Packets with Timestamps¶
AXL eliminates temporal shear by design. Every packet carries its own timestamp:
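Because the `T:` field is part of the packet itself, a receiver can recover the timestamp without any session state. A minimal sketch of doing so in Python, assuming the pipe-delimited field layout used in the examples on this page (`parse_timestamp` is an illustrative helper, not part of AXL):

```python
def parse_timestamp(packet: str) -> int:
    """Extract the epoch-seconds T: field from an AXL packet string."""
    # Fields are pipe-delimited; the timestamp field is prefixed "T:".
    for field in packet.split("|"):
        if field.startswith("T:"):
            return int(field[2:])
    raise ValueError("packet has no T: field")

packet = "S:DATA.1|AXL-4|AXL-7|TRADE|tick:1|T:1710892800"
print(parse_timestamp(packet))  # → 1710892800
```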
Packets are self-contained and queueable. There is no request-response coupling at the protocol level. No agent waits for another agent. No agent times out on another agent.
How It Works¶
A fast agent bursts 100 packets per second. A slow agent processes 1 packet every 12 seconds. Both are valid AXL participants. The protocol does not distinguish between them.
```
# Fast agent (Markov ticker, 2s cycle)
S:DATA.1|AXL-4|AXL-7|TRADE|ticker_update:BTC=67291.04|T:1710892800

# Slow agent (LLM, 12s inference)
S:DATA.2|AXL-7|AXL-4|TRADE|analysis:bearish_divergence_detected|T:1710892812
```
The Markov ticker has emitted 6 packets by the time the LLM agent responds. This is not a problem. Each packet stands alone. The LLM agent processes the queue at its own rate, and its responses carry their own timestamps for causal ordering.
Queue Semantics¶
Because packets are self-contained, any intermediary can queue them:
```python
# Minimal AXL queue - no protocol awareness needed
import queue
import time

bus = queue.Queue()

def fast_agent():
    """Emits every 2 seconds."""
    seq = 0
    while True:
        seq += 1
        packet = f"S:DATA.{seq}|AXL-4|AXL-7|TRADE|tick:{seq}|T:{int(time.time())}"
        bus.put(packet)
        time.sleep(2)

def slow_agent():
    """Processes when ready. No timeout pressure."""
    while True:
        packet = bus.get()  # Blocks until available, never times out
        result = run_inference(packet)  # 12 seconds of LLM inference
        response = f"S:DATA.1|AXL-7|AXL-4|TRADE|{result}|T:{int(time.time())}"
        bus.put(response)
```
No heartbeats. No keepalives. No timeout configuration. The queue absorbs temporal differences.
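The decoupling can be exercised in a bounded form. A minimal sketch (speeds compressed, names illustrative) in which a fast producer bursts 100 packets while a slow consumer drains them at its own pace, with no loss and no timeout logic anywhere:

```python
import queue
import threading
import time

bus = queue.Queue()
processed = []

def fast_producer(n: int):
    """Bursts n packets as fast as possible."""
    for seq in range(1, n + 1):
        bus.put(f"S:DATA.{seq}|AXL-4|AXL-7|TRADE|tick:{seq}|T:{int(time.time())}")

def slow_consumer(n: int):
    """Drains the queue at its own pace; never times out."""
    for _ in range(n):
        packet = bus.get()  # blocks until a packet is available
        time.sleep(0.001)   # stand-in for slow inference
        processed.append(packet)

producer = threading.Thread(target=fast_producer, args=(100,))
consumer = threading.Thread(target=slow_consumer, args=(100,))
producer.start(); consumer.start()
producer.join(); consumer.join()

print(len(processed))  # → 100: every burst packet was eventually consumed
```

The producer finishes long before the consumer does; the queue simply holds the backlog until the consumer catches up.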
Causal Ordering via Timestamps¶
When ordering matters, agents use the T: field for causal reconstruction:
```python
def reconstruct_causal_chain(packets):
    """Sort by timestamp to recover event ordering."""
    sorted_packets = sorted(packets, key=lambda p: int(p.split("T:")[1]))
    return sorted_packets
```
An agent that was offline for 30 seconds can rejoin, consume the queue, and reconstruct what happened - in order - without any special recovery protocol.
Proven: Battleground v2¶
In the Battleground v2 experiment (March 19, 2026), the Time bridge was tested under real conditions:
- Markov tickers operated on 2-second cycles
- LLM agents required up to 12 seconds for inference (Qwen 3.5 35B on RTX 5090)
- Both ran on the same AXL bus simultaneously
Result: 1,016 packets, 100% validity. Zero temporal shear failures. No timeouts. No retries. Fast agents burst, slow agents caught up, and the bus never degraded.
The timestamp field made every packet independently interpretable regardless of when it was produced or consumed. Agents operating at 6x clock speed differences coexisted without conflict.