Introduction
ktr is a domain-specific language for parametric sewing patterns.
It describes patterns as geometry and constraints.
A ktr program is compiled into a canonical, diffable IR and executed by a runtime.
Goals
- Define sewing patterns as precise geometry with explicit drafting constraints.
- Support fully parametric patterns driven by body measurements.
- Compile to a stable, diffable IR suitable for version control and visual editors.
- Provide TypeScript and Zig runtimes out of the box.
- Expose a C-defined runtime interface, enabling portable runtimes and bindings in any language.
- Ship first-class tooling: compiler, LSP and editor integrations.
ktr is designed for reproducible pattern drafting, not for general-purpose programming.
Example
input head = 100mm {
    assert head > 0mm
    assert head < 800mm
}

input target_neck = 200mm {
    assert target_neck > 0mm
    assert target_neck < 500mm
}
fn neck_quarter(tweak: f64) {
    let right = point(tweak * head / 10, 0mm)
    let bottom = point(0mm, tweak * head / 12)
    let cp1 = right.up(bottom.dy(right) / 2)
    let cp2 = bottom.right(bottom.dx(right) / 2)
    return bezier(right, cp1, cp2, bottom)
}
let tweak = search (t: f64) {
    bounds t [0.6 .. 1.6]
    tolerance 1mm
    require neck_quarter(t).length == target_neck
}
export neck_quarter(tweak) as "Neck Curve"

Language Reference
ktr is a domain-specific language for parametric pattern drafting. Programs
describe measurements, geometric constructions, and constraints that a solver
can resolve at runtime. The compiler (ktrc) translates .ktr source into a
line-oriented intermediate representation (.ktrir) consumed by runtimes.
The formal grammars are also available as standalone EBNF files:
- ktr.ebnf — ktr source language grammar
- ktrir.ebnf — ktr-ir intermediate representation grammar
1. Notation
The grammar uses ISO 14977 Extended Backus-Naur Form (EBNF).
| Notation | Meaning |
|---|---|
| = | definition |
| , | concatenation |
| \| | alternation |
| [ ... ] | optional (0 or 1) |
| { ... } | repetition (0 or more) |
| ( ... ) | grouping |
| " ... " | terminal string |
| (* ... *) | comment |
| - | exception |
2. Lexical Grammar
2.1 Source Encoding
Source files are UTF-8 encoded. The grammar operates over bytes; identifiers and keywords use ASCII only.
2.2 Whitespace and Comments
Whitespace (spaces, tabs, newlines) separates tokens but is otherwise ignored.
Line comments begin with // and extend to the end of the line. They are
treated as whitespace and may appear anywhere a token boundary is valid.
COMMENT = "//" , { ANY - NEWLINE } ;
Example:
// This is a comment
let x = 100mm // inline comment
2.3 Keywords
let input fn return search bounds tolerance require export as assert piece
Keywords are reserved and cannot be used as identifiers.
2.4 Tokens
(* ------------------------------------------------------------------ *)
(* ktr source language -- Lexical grammar *)
(* ------------------------------------------------------------------ *)
LETTER = "A" | "B" | "C" | "D" | "E" | "F" | "G" | "H"
| "I" | "J" | "K" | "L" | "M" | "N" | "O" | "P"
| "Q" | "R" | "S" | "T" | "U" | "V" | "W" | "X"
| "Y" | "Z"
| "a" | "b" | "c" | "d" | "e" | "f" | "g" | "h"
| "i" | "j" | "k" | "l" | "m" | "n" | "o" | "p"
| "q" | "r" | "s" | "t" | "u" | "v" | "w" | "x"
| "y" | "z" ;
DIGIT = "0" | "1" | "2" | "3" | "4" | "5" | "6" | "7"
| "8" | "9" ;
IDENT = ( LETTER | "_" ) , { LETTER | DIGIT | "_" } ;
INTEGER = DIGIT , { DIGIT } ;
NUMBER = [ "-" ] , DIGIT , { DIGIT } , [ "." , DIGIT , { DIGIT } ] ;
UNIT = "mm" | "cm" ;
DIMENSION = NUMBER , UNIT ; (* e.g. 100mm, 25cm *)
PERCENTAGE = NUMBER , "%" ; (* e.g. 50% *)
STRING = '"' , { ANY - '"' } , '"' ;
The lexer greedily attaches a unit suffix to a preceding number. An unknown suffix causes the number and suffix to lex as separate tokens (a bare number followed by an identifier).
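The greedy-suffix rule can be sketched as follows (an illustrative helper, not the actual ktrc lexer):

```python
import re

# Hypothetical sketch of the greedy unit-suffix rule: after lexing a
# number, try to consume a known unit ("mm", "cm") or "%"; an unknown
# suffix is left behind, so it lexes separately (NUMBER then IDENT).
KNOWN_UNITS = ("mm", "cm")

def lex_number(src: str, pos: int) -> tuple[tuple[str, str], int]:
    # Assumes the caller positioned us at the start of a number.
    m = re.match(r"-?\d+(\.\d+)?", src[pos:])
    num = m.group(0)
    pos += len(num)
    for unit in KNOWN_UNITS:
        if src.startswith(unit, pos):
            return ("DIMENSION", num + unit), pos + len(unit)
    if src.startswith("%", pos):
        return ("PERCENTAGE", num + "%"), pos + 1
    return ("NUMBER", num), pos

print(lex_number("100mm", 0))  # (('DIMENSION', '100mm'), 5)
print(lex_number("100em", 0))  # (('NUMBER', '100'), 3) -- "em" lexes as IDENT
```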
3. Syntactic Grammar
(* ------------------------------------------------------------------ *)
(* ktr source language v0.1 -- Syntactic grammar *)
(* ------------------------------------------------------------------ *)
program = { statement } ;
statement = let_statement
| input_decl
| fn_def
| piece_def
| export_stmt ;
(* ------------------------------------------------------------------ *)
(* Let bindings *)
(* ------------------------------------------------------------------ *)
let_statement = "let" , IDENT , "=" , expression ;
(* ------------------------------------------------------------------ *)
(* Input declarations *)
(* ------------------------------------------------------------------ *)
input_decl = "input" , IDENT , "=" , expression ,
[ "{" , { assert_stmt } , "}" ] ;
assert_stmt = "assert" , IDENT , cmp_op , expression ;
cmp_op = "==" | "!=" | ">" | "<" | ">=" | "<=" ;
(* ------------------------------------------------------------------ *)
(* Function definitions *)
(* ------------------------------------------------------------------ *)
fn_def = "fn" , IDENT , "(" , [ param_list ] , ")" ,
"{" , fn_body , "}" ;
param_list = param , { "," , param } ;
param = IDENT , ":" , type_name ;
fn_body = { let_statement } , return_stmt ;
return_stmt = "return" , expression ;
(* ------------------------------------------------------------------ *)
(* Piece definitions *)
(* ------------------------------------------------------------------ *)
piece_def = "piece" , IDENT , "{" , { piece_member } , "}" ;
piece_member = IDENT , "=" , expression ;
(* ------------------------------------------------------------------ *)
(* Search (solver) blocks *)
(* ------------------------------------------------------------------ *)
(* Search appears as the RHS of a let binding: *)
(* let tweak = search (t: f64) { ... } *)
search_expr = "search" , "(" , param , ")" ,
"{" , search_body , "}" ;
search_body = bounds_clause ,
tolerance_clause ,
require_clause ;
bounds_clause = "bounds" , IDENT , "[" , expression , ".." , expression , "]" ;
tolerance_clause = "tolerance" , expression ;
require_clause = "require" , expression , cmp_op , expression ;
(* ------------------------------------------------------------------ *)
(* Export statements *)
(* ------------------------------------------------------------------ *)
export_stmt = "export" , expression , "as" , STRING ;
(* ------------------------------------------------------------------ *)
(* Expressions *)
(* ------------------------------------------------------------------ *)
expression = additive_expr ;
additive_expr = multiplicative_expr ,
{ ( "+" | "-" ) , multiplicative_expr } ;
multiplicative_expr = unary_expr ,
{ ( "*" | "/" ) , unary_expr } ;
unary_expr = [ "-" ] , postfix_expr ;
postfix_expr = primary_expr , { field_access | method_call } ;
field_access = "." , IDENT ;
method_call = "." , IDENT , "(" , [ arg_list ] , ")" ;
primary_expr = DIMENSION (* 100mm *)
| PERCENTAGE (* 50% *)
| NUMBER (* 42, 3.14 *)
| fn_call (* point(x, y) *)
| piece_expr (* piece { ... } *)
| search_expr (* search (t: f64) { ... } *)
| IDENT (* head *)
| "(" , expression , ")" ; (* grouping *)
fn_call = IDENT , "(" , [ arg_list ] , ")" ;
piece_expr = "piece" , "{" , { piece_member } , "}" ;
arg_list = expression , { "," , expression } ;
(* ------------------------------------------------------------------ *)
(* Type names *)
(* ------------------------------------------------------------------ *)
type_name = "f64"
| "length"
| "percentage"
| "point"
| "bezier"
| "line"
| "piece"
| "bool" ;
3.1 Operator Precedence (lowest to highest)
| Precedence | Operators | Associativity |
|---|---|---|
| 1 | + - | Left |
| 2 | * / | Left |
| 3 | unary - | Right (prefix) |
| 4 | .field .method() | Left |
Comparison operators (==, >, <, etc.) appear only in assert and
require clauses, not as general expression operators.
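The precedence table corresponds directly to the layered grammar rules above. As an illustration only (not the ktrc implementation), a recursive-descent evaluator over numeric tokens might look like:

```python
# Sketch of the precedence layers: additive -> multiplicative -> unary
# -> primary. Operates on plain numeric tokens; postfix access omitted.
def parse_expression(tokens: list[str]) -> float:
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def additive():
        nonlocal pos
        value = multiplicative()
        while peek() in ("+", "-"):
            op = tokens[pos]; pos += 1
            rhs = multiplicative()
            value = value + rhs if op == "+" else value - rhs
        return value

    def multiplicative():
        nonlocal pos
        value = unary()
        while peek() in ("*", "/"):
            op = tokens[pos]; pos += 1
            rhs = unary()
            value = value * rhs if op == "*" else value / rhs
        return value

    def unary():
        nonlocal pos
        if peek() == "-":          # [ "-" ] is optional, not repeatable
            pos += 1
            return -primary()
        return primary()

    def primary():
        nonlocal pos
        if peek() == "(":
            pos += 1
            value = additive()
            pos += 1               # consume ")"
            return value
        value = float(tokens[pos]); pos += 1
        return value

    return additive()

print(parse_expression(["2", "+", "3", "*", "4"]))  # 14.0
```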
4. Type System
| Type | Description |
|---|---|
| f64 | Bare scalar (unitless double-precision float) |
| length | Dimensional value carrying a unit (mm, cm) |
| percentage | Percentage value (% suffix in source) |
| point | 2D coordinate (x: length, y: length) |
| bezier | Cubic Bezier curve (4 control points) |
| line | Straight segment between two points |
| piece | Named collection of member bindings |
| bool | Boolean (internal to assertions/requires) |
4.1 Units
Length values carry their original unit through compilation. The compiler preserves units; conversion happens at runtime.
| Unit | Name |
|---|---|
| mm | millimeters |
| cm | centimeters |
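For example, a runtime that normalizes to millimeters (as the cm → mm note in the implementation status suggests) might convert as follows (function name hypothetical):

```python
# Sketch of runtime unit normalization to a canonical unit of millimeters.
MM_PER_UNIT = {"mm": 1.0, "cm": 10.0}

def to_mm(value: float, unit: str) -> float:
    return value * MM_PER_UNIT[unit]

print(to_mm(25, "cm"))   # 250.0
print(to_mm(100, "mm"))  # 100.0
```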
4.2 Type Inference
The type of a let binding is inferred from its right-hand side:
- A DIMENSION literal has type length.
- A PERCENTAGE literal has type percentage.
- A bare NUMBER has type f64.
- An identifier reference inherits the type of the referenced binding.
- Arithmetic expressions follow standard promotion rules (future spec).
- Function calls have the declared return type of the function.
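The literal rules above can be sketched as a small classifier (hypothetical helper, not part of ktrc):

```python
# Classify a literal token into length / percentage / f64 per the
# inference rules for DIMENSION, PERCENTAGE, and bare NUMBER literals.
def infer_literal_type(token: str) -> str:
    if token.endswith(("mm", "cm")):
        return "length"
    if token.endswith("%"):
        return "percentage"
    return "f64"

print(infer_literal_type("100mm"))  # length
print(infer_literal_type("50%"))    # percentage
print(infer_literal_type("42"))     # f64
```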
4.3 Poison Type
poison is a compiler-internal pseudo-type assigned when type resolution
fails (e.g., referencing an undefined identifier). Poison suppresses
cascading diagnostic errors. It never appears in .ktr source or .ktrir
output.
5. Program Structure
A ktr program is a sequence of top-level declarations. Bindings are evaluated in source order; every name must be defined before it is referenced (topological ordering).
5.1 Inputs
input head = 100mm {
    assert head > 0mm
    assert head < 800mm
}
An input declares a parametric measurement with a default value and
optional constraints. At runtime, inputs can be overridden by the user
within the bounds specified by their assertions.
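A runtime might check an override against its assert clauses roughly as follows (a sketch only; the actual runtime interface is not specified here, and comparison values are assumed normalized to millimeters):

```python
import operator

# Each assert clause becomes a (comparison operator, bound) pair.
OPS = {"==": operator.eq, "!=": operator.ne, ">": operator.gt,
       "<": operator.lt, ">=": operator.ge, "<=": operator.le}

def check_asserts(value_mm: float, asserts: list[tuple[str, float]]) -> bool:
    return all(OPS[op](value_mm, bound_mm) for op, bound_mm in asserts)

# input head = 100mm { assert head > 0mm  assert head < 800mm }
print(check_asserts(100.0, [(">", 0.0), ("<", 800.0)]))  # True
print(check_asserts(900.0, [(">", 0.0), ("<", 800.0)]))  # False
```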
5.2 Let Bindings
let right = point(tweak * head / 10, 0mm)
A let binding introduces a named value. The right-hand side is an
arbitrary expression. Bindings are immutable.
5.3 Functions
fn neck_quarter(tweak: f64) {
    let right = point(tweak * head / 10, 0mm)
    let bottom = point(0mm, tweak * head / 12)
    let cp1 = right.up(bottom.dy(right) / 2)
    let cp2 = bottom.right(bottom.dx(right) / 2)
    return bezier(right, cp1, cp2, bottom)
}
Functions take typed parameters and contain a sequence of let bindings
followed by a single return expression. Functions may reference inputs
and other top-level bindings. Recursion is not supported.
Functions can also return a piece by returning an anonymous piece expression:
fn make_sleeve(width: length, height: length) {
    return piece {
        top_left = point(0mm, 0mm)
        top_right = point(width, 0mm)
        bottom_left = point(0mm, height)
        bottom_right = point(width, height)
    }
}
let sleeve = make_sleeve(200mm, 400mm)
let y = sleeve.top_left.y
5.4 Search (Solver)
let tweak = search (t: f64) {
    bounds t [0.6 .. 1.6]
    tolerance 1mm
    require neck_quarter(t).length == target_neck
}
A search expression declares a solver variable. The runtime finds a
value for the parameter within the given bounds such that the require
constraint is satisfied within the given tolerance.
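The specification leaves the solver algorithm to the runtime. Under the assumption that the constraint expression is monotone in the search parameter, a bisection sketch looks like:

```python
# Bisection sketch of a search block. Assumes f is monotone on [lo, hi]
# and the target is reachable; real runtimes may use a different method.
def search(f, target, lo, hi, tolerance, max_iter=100):
    increasing = f(lo) < f(hi)
    for _ in range(max_iter):
        mid = (lo + hi) / 2.0
        err = f(mid) - target
        if abs(err) <= tolerance:
            return mid
        if (err < 0) == increasing:
            lo = mid
        else:
            hi = mid
    raise ValueError("search did not converge")

# require f(t) == target_neck, with a stand-in curve-length function
# (linear in t, for illustration only)
t = search(lambda t: 180.0 * t, target=200.0, lo=0.6, hi=1.6, tolerance=1.0)
print(abs(180.0 * t - 200.0) <= 1.0)  # True
```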
5.5 Pieces
piece neckhole {
    top_left = point(armhole_depth + shoulder_width, 0mm)
    top_right = point(chest_width - armhole_depth - shoulder_width, 0mm)
    curve = bezier(top_left, neck_cp1, neck_cp2, top_right)
}
let x = neckhole.top_left.x
A piece defines a named, scoped collection of member bindings. Inside a piece
body, members are written as name = expression (without let) and are visible
in definition order. Members may reference:
- Earlier members in the same piece by bare name.
- External bindings (inputs, top-level lets, function results, etc.).
Outside the piece body, members are accessed through qualified field access
(piece_name.member_name). Piece member names do not leak into the enclosing
scope.
piece also has an expression form (piece { ... }) that constructs an
anonymous piece value. This is most useful as a function return value.
5.6 Exports
export neck_quarter(tweak) as "Neck Curve"
An export marks a value for output by the runtime (e.g., rendering as
SVG). The string label is the human-readable name.
6. Built-in Functions and Methods
6.1 Constructors
| Function | Signature |
|---|---|
| point(x, y) | (length, length) -> point |
| bezier(p1, p2, p3, p4) | (point, point, point, point) -> bezier |
| line(start, end) | (point, point) -> line |
6.2 Field Accessors
Composite types expose named fields via dot syntax. Field access can be
chained (e.g., some_line.point1.x).
Point Fields
| Field | Type | Description |
|---|---|---|
| .x | length | Horizontal coordinate of the point |
| .y | length | Vertical coordinate of the point |
Line Fields
| Field | Type | Description |
|---|---|---|
| .point1 | point | Start point of the line |
| .point2 | point | End point of the line |
Bezier Fields
| Field | Type | Description |
|---|---|---|
| .point1 | point | First control point (start) |
| .point2 | point | Second control point |
| .point3 | point | Third control point |
| .point4 | point | Fourth control point (end) |
Piece Fields
For a piece, available fields are exactly the members declared in its body.
For example, with:
piece sleeve {
    top_left = point(0mm, 0mm)
}
sleeve.top_left has type point and can be chained (sleeve.top_left.x).
6.3 Point Methods
| Method | Signature | Description |
|---|---|---|
| .up(d) | (point, length) -> point | Shift point up by d |
| .down(d) | (point, length) -> point | Shift point down by d |
| .left(d) | (point, length) -> point | Shift point left by d |
| .right(d) | (point, length) -> point | Shift point right by d |
| .dx(other) | (point, point) -> length | Horizontal distance to other |
| .dy(other) | (point, point) -> length | Vertical distance to other |
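As an illustration, the point helpers might be implemented as below. This sketch assumes a y-down drafting coordinate system and signed dx/dy deltas; the reference does not pin down either choice, so adjust the signs to match the actual runtime:

```python
from dataclasses import dataclass

# Hypothetical point helpers. Assumptions: y grows downward, and
# dx/dy return signed deltas (other minus self) rather than magnitudes.
@dataclass(frozen=True)
class Point:
    x: float  # mm
    y: float  # mm

    def right(self, d): return Point(self.x + d, self.y)
    def left(self, d):  return Point(self.x - d, self.y)
    def up(self, d):    return Point(self.x, self.y - d)
    def down(self, d):  return Point(self.x, self.y + d)
    def dx(self, other): return other.x - self.x
    def dy(self, other): return other.y - self.y

p = Point(100.0, 0.0)
print(p.up(10.0))             # Point(x=100.0, y=-10.0)
print(p.dx(Point(0.0, 0.0)))  # -100.0
```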
6.4 Curve Methods
| Method | Signature | Description |
|---|---|---|
| .length | (bezier) -> length | Arc length of the curve |
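One simple way to compute the arc length numerically is polyline sampling; the runtime's actual method and accuracy are unspecified, so this is purely illustrative:

```python
# Approximate cubic Bezier arc length by summing chord lengths over
# uniformly sampled parameter values.
def bezier_point(p1, p2, p3, p4, t):
    u = 1.0 - t
    x = u**3 * p1[0] + 3 * u**2 * t * p2[0] + 3 * u * t**2 * p3[0] + t**3 * p4[0]
    y = u**3 * p1[1] + 3 * u**2 * t * p2[1] + 3 * u * t**2 * p3[1] + t**3 * p4[1]
    return x, y

def bezier_length(p1, p2, p3, p4, steps=1000):
    total = 0.0
    prev = bezier_point(p1, p2, p3, p4, 0.0)
    for i in range(1, steps + 1):
        cur = bezier_point(p1, p2, p3, p4, i / steps)
        total += ((cur[0] - prev[0]) ** 2 + (cur[1] - prev[1]) ** 2) ** 0.5
        prev = cur
    return total

# Degenerate case: control points on a straight segment from (0,0) to
# (100,0), so the arc length is 100 (mm).
print(round(bezier_length((0, 0), (0, 0), (100, 0), (100, 0))))  # 100
```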
7. Full Example
input head = 100mm {
    assert head > 0mm
    assert head < 800mm
}

input target_neck = 200mm {
    assert target_neck > 0mm
    assert target_neck < 500mm
}

fn neck_quarter(tweak: f64) {
    let right = point(tweak * head / 10, 0mm)
    let bottom = point(0mm, tweak * head / 12)
    let cp1 = right.up(bottom.dy(right) / 2)
    let cp2 = bottom.right(bottom.dx(right) / 2)
    return bezier(right, cp1, cp2, bottom)
}

let tweak = search (t: f64) {
    bounds t [0.6 .. 1.6]
    tolerance 1mm
    require neck_quarter(t).length == target_neck
}
export neck_quarter(tweak) as "Neck Curve"
8. Compiler Pipeline
.ktr source
│
├─ lexer ─── tokenization
├─ parser ─── AST construction
├─ sema ─── type checking, name resolution
├─ lower ─── AST + Sema → IR
└─ ir_emit ─── IR → .ktrir text
.ktrir text
│
├─ ir_parse ─── .ktrir text → IR (for runtimes)
└─ ir_decompile ─── IR → .ktr text (roundtrip verification)
9. Implementation Status
The compiler currently implements the following subset:
- let bindings with literal values (100mm, 50%, 42)
- let bindings with identifier references (let y = x)
- Type inference for length, percentage, f64
- Arithmetic expressions (+, -, *, /) with precedence
- Parenthesized grouping
- Duplicate binding detection
- Undefined reference detection
- Poison type for error suppression
- IR lowering, emission, parsing, and decompilation
- Full roundtrip: .ktr → IR → .ktrir → IR → .ktr
- Runtime evaluator (ktrr) with unit normalization (cm → mm)
- Runtime WASM module with JSON output
- input declarations with literal defaults (input head = 100mm)
- Runtime input overrides (parametric evaluation)
- point, bezier, line constructor types with full pipeline support
- fn definitions with typed parameters
- Function calls (user-defined and built-in constructors)
- Field accessors (.x, .y, .point1, etc.) with chaining
- Top-level piece definitions with typed member access (piece.member)
- Anonymous piece { ... } expressions
- Functions returning piece values

Planned (not yet implemented):
- input assertion blocks (assert head > 0mm)
- Unary negation (-expr)
- Method calls (.up(), .dx(), etc.)
- search solver blocks
- export statements
- bool type