The Typical Code Language.

Humans & AI need a better language for code.

Not another experiment. Not another mosaic of compromises shaped by situational constraints. Something thoughtful. Versatile. Enduring.

TypeScript, Go, and Rust support today's industrial greenfield projects. But they were conceived long before the era of pervasive LLMs. And the cracks are showing.

We want AI-written, human-audited code landing planes, keeping people alive on life support, and processing millions of transactions per second. Today's languages can't do that.

AI discourse centers on models and enormous compute budgets. Yet the actual language being emitted, the true bottleneck, is largely ignored.

There is strong evidence that AI is now settling into a valley of local optimization. The next frontier is unlikely to come from further improvements to the models themselves, but from a systemic redesign of the language they write.

Good languages are not computer science projects.

They're exercises in industrial design and precision craftsmanship. Ada, in its quiet rigor, spent years in design before a single line of code was written. And for more than four decades, it has been a structural beam of national security.

Typical follows this worldview. Almost entirely privately funded. Detached from AI hype cycles. Design efforts heavily front-loaded. Crafted for a multi-decade time horizon.

We are designing for LLMs as engineers. For humans as governors. For the systems that will sustain future civilization. Typical must respect the past, pave the way to the future, and ultimately unlock human potential.

Typical is more than code. It is a carefully designed substrate for thought—a tool that amplifies curiosity, mastery, and creation. Its simplicity is deliberate, its ergonomics familiar, its reach unprecedented.

We are honored to be bringing it to life.

Language Elements

TypeScript, Grown Up

class Greeter
{
	(const name: string) { }
	
	export greet()
	{
		const opt = Math.floor(Math.random() * 4);
		return switch (opt) {
			0: `Hello, ${this.name}!`;
			1: `Good day, ${this.name}!`;
			2: `What's up ${this.name}?`;
			3: `Hey ${this.name}, how are you?`;
			else: `Hello`;
		}
	}
}

const greeter = Greeter("Alice");
console.log(greeter.greet());

Typical is a systems-capable language that produces native binaries able to satisfy extreme performance and safety constraints, yet it feels a lot like TypeScript. It goes further, though: it sheds JavaScript's historical baggage and offers even better ergonomics.

For those fluent in TypeScript, the learning curve is negligible. The surface remains intact: console.log(), Math.random(), classes, functions, and type definitions all behave almost exactly as they do in TypeScript. Beneath the familiar ECMA built-ins, however, lies a foundation engineered for uncompromising performance, safety, and analyzability.

Effortless Compiler-Managed Memory

extractWords(text: string)
{
	// Lifetime for wordPattern starts here
	const wordPattern = /\w+/g;
	const matches: string[] = [];
	let match = wordPattern.exec(text);
	while (match !== null) {
		matches.push(match[0]);
		match = wordPattern.exec(text);
	}
	
	// Lifetime for wordPattern ends here
	return matches;
}

const sentence = "Lifetimes are effortless!";
const words = extractWords(sentence);
console.log(words);

Typical eliminates the classic trade-offs of memory management. Compiler-inferred lifetimes and region-based allocation deliver predictability and ergonomics without garbage collection. Reference counting is used only when provably necessary, and even then it is far safer than the strategies other languages rely on.

LLMs never guess about ownership—memory is solved at the compiler level, making high-performance, safe, AI-generated code effortless.
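
To make the contrast concrete, here is roughly what the same routine looks like as hand-written Rust, where the programmer manages ownership and pulls in the regex crate explicitly. This is a sketch for comparison only, not Typical's generated output.

// Hand-written Rust for comparison (illustrative only).
// Requires the `regex` crate as a dependency.
use regex::Regex;

fn extract_words(text: &str) -> Vec<String> {
	// The programmer owns the allocation story: the Regex lives until
	// the end of this function, and the Vec of owned Strings is moved
	// out to the caller.
	let word_pattern = Regex::new(r"\w+").expect("valid pattern");
	word_pattern
		.find_iter(text)
		.map(|m| m.as_str().to_string())
		.collect()
}

fn main() {
	let sentence = "Lifetimes are effortless!";
	let words = extract_words(sentence);
	println!("{:?}", words);
}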

Threads-As-Types

worker MathWorker;

add(a: int, b: int)
{
	declare.worker(MathWorker);
	return a + b;
}

calc(a: int, b: int)
{
	// Same thread, no await required.
	declare.worker(MathWorker);
	const x = add(a, 10);
	const y = add(b, 20);
	return add(x, y);
}

// An await is required because of
// the thread-boundary crossing.
const result = await add(1, 2);

The Typical compiler knows which thread every piece of code runs on, so it can handle cross-thread calls automatically. The result feels like single-threaded Node.js, yet delivers fully statically analyzable, high-performance parallelism.

This is one of many wins that are only possible because Typical is fully statically analyzable.

And this win isn't just about ergonomics; it's the only practical way to unlock concurrency for LLMs. Teaching a model to write correct concurrent code in traditional languages has proven close to impossible. Typical solves the problem at the root: by baking concurrency into the type system, it sidesteps the problem entirely.
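
To see what is being automated, here is a minimal hand-written Rust sketch of a single cross-thread add call, using a dedicated worker thread and channels. It is illustrative only and says nothing about how Typical actually lowers worker calls.

use std::sync::mpsc;
use std::thread;

// A request to the worker: two operands plus a channel for the reply.
type AddRequest = (i64, i64, mpsc::Sender<i64>);

fn main() {
	let (tx, rx) = mpsc::channel::<AddRequest>();

	// Dedicated worker thread; every thread-boundary crossing is explicit.
	let worker = thread::spawn(move || {
		for (a, b, reply) in rx {
			// Ignore the error if the caller has already gone away.
			let _ = reply.send(a + b);
		}
	});

	// Caller side: build a reply channel, send the request, block on the answer.
	let (reply_tx, reply_rx) = mpsc::channel();
	tx.send((1, 2, reply_tx)).expect("worker is alive");
	let result = reply_rx.recv().expect("worker replied");
	println!("{result}");

	// Dropping the request sender lets the worker's loop end.
	drop(tx);
	worker.join().expect("worker shut down cleanly");
}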

Bombs

readNumberFromString(str: string) 💣
{
	// This function is bomb-prone because
	// it can throw, so the IDE annotates it
	// with the bomb emoji.
	const num = parseInt(str, 10);
	if (num === null)
		throw;
	
	return num;
}

getDouble(str: string)
{
	// Bombs are falsey and are defused
	// via nullish coalescing.
	const n = readNumberFromString(str) 💣 ?? 0;
	return n * 2;
}

In Typical, errors don't break your flow—they seamlessly propagate as values called bombs. Bombs look like TypeScript's null, but they behave more like Rust errors: safe, analyzable, and always visible to the compiler.

This is massive for LLMs. Models can generate TypeScript-style code, while the compiler enforces bomb handling and provides precise guidance. This avoids having to teach models novel exception patterns.
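
Because Typical compiles to Rust and bombs behave like Rust errors, one rough mental model for getDouble above is the Result-based error handling a Rust programmer would write by hand. This is an analogy, not the actual generated code.

// Hand-written Rust analogy for the bomb example above.
// Illustrative only; not Typical's generated output.
fn read_number_from_string(s: &str) -> Result<i64, std::num::ParseIntError> {
	// The possible failure is part of the signature,
	// much like a bomb-prone function.
	s.parse::<i64>()
}

fn get_double(s: &str) -> i64 {
	// Defuse the failure with a default, like `?? 0` in Typical.
	let n = read_number_from_string(s).unwrap_or(0);
	n * 2
}

fn main() {
	println!("{}", get_double("21")); // 42
	println!("{}", get_double("oops")); // 0
}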

Bomb markers are unapologetically rendered as IDE decorations derived from compiler-inferred knowledge; they never appear in the source code itself. This may disrupt legacy workflows, but the wins for LLMs make it worthwhile.

Optional Declarations

hardRealtimeLoop(low: u64, high: u64)
{
	declare.noAllocations;
	declare.noReferenceCounting;
	declare.pureFunctional;
	
	// Checked at compile-time
	declare high > low;
	
	// A workload sized to run for about a second
	let result = 0;
	
	for (n of low to high)
		for (i of 0 to 1_000_000)
			result += Math.sin(i) * Math.sqrt(i);
	
	return result;
	
	// Checked at compile-time
	declare return > 0;
}

Any scope, whether a file, class, type, or function, can declare various intentions: purity, allocation constraints, memory layout, execution guarantees. These declarations are first-class intent signals in the compiler's reasoning, and the same high-value information can be used to steer LLMs toward correctness.

Languages that impose harsh invariants globally tend to stagnate in niche domains. Typical's declare statements do the opposite: because they are scoped, fine-grained, and opt-in, they make Typical useful in more domains. This is how a single language can be the best tool both for sub-millisecond high-frequency trading and for beginners writing simple scripts that copy data from one place to another.
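
For contrast, the same intent in hand-written Rust usually lives in runtime assertions and comments rather than compiler-checked declarations. The sketch below assumes nothing about Typical's code generation.

// Hand-written Rust contrast (illustrative only): preconditions become
// runtime assertions, and purity or allocation guarantees have no
// per-function, compiler-checked equivalent.
fn hard_realtime_loop(low: u64, high: u64) -> f64 {
	// Checked when the function runs, not when it compiles.
	assert!(high > low, "high must be greater than low");

	let mut result = 0.0_f64;
	for _n in low..high {
		for i in 0..1_000_000u64 {
			result += (i as f64).sin() * (i as f64).sqrt();
		}
	}

	// The return-value guarantee, likewise, can only be asserted at
	// runtime or argued in review; the compiler does not verify it.
	result
}

fn main() {
	println!("{}", hard_realtime_loop(0, 1));
}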

Compiles To Rust

// Typical Code
daysSince(year: int, month: int, day: int)
{
	const msPerDay = 1000 * 60 * 60 * 24;
	const now = new Date();
	const past = new Date(year, month - 1, day);
	const nt = now.getTime();
	const pt = past.getTime();
	return Math.floor((nt - pt) / msPerDay);
}

// Generated Rust code
fn days_since(year: i64, month: i64, day: i64) -> i64 {
	let ms_per_day = 1000 * 60 * 60 * 24;
	let now = Date::new();
	let past = Date::new_with_ymd(year, month - 1, day);
	let nt = now.get_time();
	let pt = past.get_time();
	(nt - pt) / ms_per_day
}

Typical does not introduce a new runtime or ecosystem. It compiles directly to idiomatic Rust, with the ECMA built-ins ported and available. This ensures full compatibility with existing crates while preserving the language's performance, safety, and concurrency guarantees; what you write in Typical becomes clean Rust with no efficiency lost.

Just as TypeScript led the way as a type-safe layer over JavaScript, Typical intends to lead the way as an ergonomic, LLM-ready layer over Rust. Most Rust crates work in Typical without modification, giving immediate access to a large and vibrant ecosystem.