Change the way functions are defined

This commit is contained in:
IgorCielniak
2026-01-08 15:28:10 +01:00
parent d4dc6ceef5
commit b9098d9893
33 changed files with 232 additions and 198 deletions


@@ -23,6 +23,8 @@
 - `read-token`: splits the byte stream; default is whitespace delimited with numeric/string literal recognizers.
 - `on-token`: user code decides whether to interpret, compile, or treat the token as syntax.
 - `lookup`: resolves token → word entry; can be replaced to build new namespaces or module systems.
+- **Definition form**: `word <name> ... end` is the required way to declare high-level words. Legacy `: <name> ... ;` definitions are no longer accepted.
+- **Text macros**: `macro <name> [param_count] ... ;` records tokens until the closing `;` and registers a macro that performs positional substitution (`$1`, `$2`, ...). The old `macro: ... ;macro` form is removed.
 - **Compile vs interpret**: Each word advertises stack effect + immediacy. Immediate words execute during compilation (macro behavior). Others emit code or inline asm.
 - **Syntax morphing**: Provide primitives `set-reader`, `with-reader`, and word-lists so layers (e.g., Lisp-like forms) can be composed.
 - **Inline Python hooks**: `:py name { ... } ;` executes the enclosed Python block immediately, then registers `name` as a word whose behavior is provided by that block. Define a `macro(ctx)` function to intercept compilation (receiving a `MacroContext` with helpers like `next_token`, `emit_literal`, `new_label`, `inject_tokens`, and direct access to the active parser), and/or an `intrinsic(builder)` function to emit custom assembly. This lets end users extend the language—parsing source, manipulating AST nodes, or writing NASM—without touching the bootstrap source. The standard library's `extend-syntax` and `fn` forms are ordinary `:py` blocks built with these APIs, so users can clone or replace them entirely from L2 source files.
@@ -122,7 +124,7 @@ struct: Point
 ## 14. Standard Library Sketch
 - **Core words**: Arithmetic, logic, stack ops, comparison, memory access, control flow combinators.
 - **Return-stack helpers**: `>r`, `r>`, `rdrop`, and `rpick` shuffle values between the data stack and the return stack. They're used by the `fn` sugar but also available to user code for building custom control constructs.
-- **Meta words**: Reader management, dictionary inspection, definition forms (`:`, `:noninline`, `:asm`, `immediate`).
+- **Meta words**: Reader management, dictionary inspection, definition forms (`word ... end`, `:noninline`, `:asm`, `immediate`).
 - **Allocators**: Default bump allocator, arena allocator, and hook to install custom malloc/free pairs.
 - **FFI/syscalls**: Thin wrappers plus convenience words for POSIX-level APIs.
 - **Diagnostics**: Minimal `type`, `emit`, `cr`, `dump`, and tracing hooks for debugging emitted asm.
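The positional substitution that the new `macro` form performs can be sketched in Python. This is an illustrative model only (`expand_macro` is a hypothetical helper, not a function in main.py): each `$n` token in the recorded body is replaced by the n-th argument token supplied at the use site.

```python
def expand_macro(body_tokens, args):
    """Replace positional parameters $1, $2, ... with the supplied argument tokens."""
    out = []
    for tok in body_tokens:
        if tok.startswith("$") and tok[1:].isdigit():
            out.append(args[int(tok[1:]) - 1])  # $1 is args[0], $2 is args[1], ...
        else:
            out.append(tok)
    return out

# Modeling "macro defadder 3  word $1 $2 $3 + end ;" used as "defadder add13 5 8":
print(expand_macro(["word", "$1", "$2", "$3", "+", "end"], ["add13", "5", "8"]))
# → ['word', 'add13', '5', '8', '+', 'end']
```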

a.sl

@@ -1,7 +1,7 @@
 import stdlib/stdlib.sl
 import stdlib/io.sl
-: main
+word main
 "hello world" puts
-;
+end
 compile-time main


@@ -1,10 +1,10 @@
 import stdlib/stdlib.sl
 import stdlib/io.sl
-: main
+word main
 0 argc for
 dup
 argv@ dup strlen puts
 1 +
 end
-;
+end


@@ -4,10 +4,10 @@ import stdlib/io.sl
 extern long labs(long n)
 extern void exit(int status)
-: main
+word main
 # Test C-style extern with implicit ABI handling
 -10 labs puti cr
 # Test extern void
 0 exit
-;
+end

f.sl

@@ -4,7 +4,7 @@ import stdlib/float.sl
 extern double atan2(double y, double x)
-: main
+word main
 # Basic math
 1.5 2.5 f+ fputln # Outputs: 4.000000
@@ -13,5 +13,5 @@ extern double atan2(double y, double x)
 4.0 f* fputln # Outputs: 3.141593 (approx pi)
 0
-;
+end

fn.sl

@@ -1,4 +1,4 @@
-: call-syntax-rewrite # ( fnameToken -- handled )
+word call-syntax-rewrite # ( fnameToken -- handled )
 dup token-lexeme identifier? 0 == if drop 0 exit end
 peek-token dup nil? if drop drop 0 exit end
 dup token-lexeme "(" string= 0 == if drop drop 0 exit end
@@ -26,34 +26,34 @@ begin
 # default: append tok to cur
 list-append
 again
-;
+end
 immediate
 compile-only
-: extend-syntax
+word extend-syntax
 "call-syntax-rewrite" set-token-hook
-;
+end
 immediate
 compile-only
-: fn-op-prec
+word fn-op-prec
 dup "+" string= if drop 1 exit end
 dup "-" string= if drop 1 exit end
 dup "*" string= if drop 2 exit end
 dup "/" string= if drop 2 exit end
 dup "%" string= if drop 2 exit end
 drop 0
-;
+end
 compile-only
-: fn-operator?
+word fn-operator?
 fn-op-prec 0 >
-;
+end
 compile-only
-: fn-check-dup
+word fn-check-dup
 >r # params (r: name)
 0 # params idx
 begin
@@ -66,10 +66,10 @@ begin
 drop # drop comparison flag when no error
 r> 1 + # params idx+1
 again
-;
+end
 compile-only
-: fn-params
+word fn-params
 list-new # lexer params
 swap # params lexer
 >r # params (r: lexer)
@@ -90,17 +90,17 @@ begin
dup ")" string= if drop r> exit end dup ")" string= if drop r> exit end
"expected ',' or ')' in parameter list" parse-error "expected ',' or ')' in parameter list" parse-error
again again
; end
compile-only compile-only
: fn-collect-body word fn-collect-body
"{" lexer-expect drop # consume opening brace, keep lexer "{" lexer-expect drop # consume opening brace, keep lexer
lexer-collect-brace # lexer bodyTokens lexer-collect-brace # lexer bodyTokens
swap drop # bodyTokens swap drop # bodyTokens
; end
compile-only compile-only
: fn-lexemes-from-tokens word fn-lexemes-from-tokens
>r # (r: tokens) >r # (r: tokens)
list-new # acc list-new # acc
begin begin
@@ -114,10 +114,10 @@ begin
 token-lexeme # acc lex
 list-append # acc'
 again
-;
+end
 compile-only
-: fn-validate-body
+word fn-validate-body
 dup list-length 0 == if "empty function body" parse-error end
 dup 0 list-get token-lexeme "return" string= 0 == if "function body must start with 'return'" parse-error end
 dup list-last ";" string= 0 == if "function body must terminate with ';'" parse-error end
@@ -125,11 +125,11 @@ compile-only
 list-pop drop # body expr' (trim trailing ';')
 list-pop-front drop # body expr (trim leading 'return')
 dup list-length 0 == if "missing return expression" parse-error end
-;
+end
 compile-only
-: fn-filter-raw-body # bodyLexemes -- tokens
+word fn-filter-raw-body # bodyLexemes -- tokens
 list-new swap # out body
 begin
 dup list-empty? if
@@ -152,11 +152,11 @@ begin
 r> # out' body'
 continue
 again
-;
+end
 compile-only
-: fn-body->tokens # bodyLexemes -- tokens
+word fn-body->tokens # bodyLexemes -- tokens
 dup list-length 0 == if "empty function body" parse-error end
 dup 0 list-get token-lexeme "return" string= if
 fn-validate-body # expr
@@ -165,10 +165,10 @@ compile-only
 end
 fn-filter-raw-body
 dup list-length 0 == if "empty function body" parse-error end
-;
+end
 compile-only
-: fn-emit-prologue # params out -- params out
+word fn-emit-prologue # params out -- params out
 over list-length # params out n
 begin
 dup 0 > if
@@ -181,10 +181,10 @@ begin
 drop # params out
 exit
 again
-;
+end
 compile-only
-: fn-emit-epilogue # params out -- out
+word fn-emit-epilogue # params out -- out
 over list-length >r # params out (r: n)
 begin
 r> dup 0 > if
@@ -196,30 +196,30 @@ begin
 swap drop # out
 exit
 again
-;
+end
 compile-only
-: fn-translate-prologue-loop # count --
+word fn-translate-prologue-loop # count --
 dup 0 > if
 1 -
 0 rpick ">r" list-append drop
 fn-translate-prologue-loop
 end
 drop
-;
+end
 compile-only
-: fn-translate-epilogue-loop # count --
+word fn-translate-epilogue-loop # count --
 dup 0 > if
 1 -
 0 rpick "rdrop" list-append drop
 fn-translate-epilogue-loop
 end
 drop
-;
+end
 compile-only
-: fn-param-index # params name -- params idx flag
+word fn-param-index # params name -- params idx flag
 >r # params (r: name)
 0 # params idx
@@ -238,11 +238,11 @@ begin
 drop # params idx
 1 + # params idx+1
 again
-;
+end
 compile-only
-: fn-build-param-map # params -- params map
+word fn-build-param-map # params -- params map
 map-new # params map
 0 # params map idx
 begin
@@ -258,11 +258,11 @@ compile-only
 r> 1 + # params map' idx'
 continue
 again
-;
+end
 compile-only
-: fn-translate-token # out map tok -- out map
+word fn-translate-token # out map tok -- out map
 # number?
 dup string>number # out map tok num ok
 if
@@ -299,11 +299,11 @@ compile-only
 swap >r # out tok (r: map)
 list-append # out'
 r> # out' map
-;
+end
 compile-only
-: fn-translate-postfix-loop # map out postfix -- map out
+word fn-translate-postfix-loop # map out postfix -- map out
 begin
 dup list-empty? if
 drop
@@ -317,11 +317,11 @@ compile-only
 r> # map out postfix'
 continue
 again
-;
+end
 compile-only
-: fn-translate-postfix # postfix params -- out
+word fn-translate-postfix # postfix params -- out
 swap # params postfix
 list-new # params postfix out
@@ -341,15 +341,15 @@ compile-only
 # drop map, emit epilogue
 swap drop # params out
 fn-emit-epilogue # out
-;
+end
 compile-only
-: fn-build-body
+word fn-build-body
 fn-translate-postfix # words
-;
+end
 compile-only
-: fn
+word fn
 "(),{};+-*/%," lexer-new # lexer
 dup lexer-pop # lexer nameTok
 dup >r # save nameTok
@@ -368,6 +368,6 @@ compile-only
 r> drop # drop name string
 r> # name token
 swap emit-definition
-;
+end
 immediate
 compile-only
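The `fn` sugar in this file converts infix bodies to postfix using the precedences defined in `fn-op-prec` (`+`/`-` at 1, `*`/`/`/`%` at 2). A minimal Python model of that conversion, assuming only left-associative binary operators (a shunting-yard sketch, not the actual fn.sl implementation):

```python
def prec(tok):
    # Mirrors fn-op-prec: + and - bind at 1; * / % at 2; anything else is an operand.
    return {"+": 1, "-": 1, "*": 2, "/": 2, "%": 2}.get(tok, 0)

def infix_to_postfix(tokens):
    out, ops = [], []
    for tok in tokens:
        if prec(tok):
            # Pop operators of equal or higher precedence (left associativity).
            while ops and prec(ops[-1]) >= prec(tok):
                out.append(ops.pop())
            ops.append(tok)
        else:
            out.append(tok)
    while ops:
        out.append(ops.pop())
    return out

print(" ".join(infix_to_postfix(["a", "+", "b", "*", "c"])))  # → a b c * +
```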

gg.sl

@@ -2,6 +2,6 @@ import stdlib/io.sl
 extern long labs(long n)
-: main
+word main
 -3 labs puti
-;
+end


@@ -1,6 +1,6 @@
 import stdlib/stdlib.sl
 import stdlib/io.sl
-: main
+word main
 "hello world" puts
-;
+end

main.py

@@ -201,6 +201,7 @@ class Definition:
     body: List[Op]
     immediate: bool = False
     compile_only: bool = False
+    terminator: str = "end"
 @dataclass
@@ -410,11 +411,19 @@ class Parser:
                 continue
             lexeme = token.lexeme
             if lexeme == ":":
-                self._begin_definition(token)
-                continue
-            if lexeme == ";":
-                self._end_definition(token)
-                continue
+                raise ParseError(
+                    f"':' definitions are no longer supported; use 'word <name> ... end' at {token.line}:{token.column}"
+                )
+            if lexeme == "word":
+                self._begin_definition(token, terminator="end")
+                continue
+            if lexeme == "end":
+                if self.control_stack:
+                    self._handle_end_control()
+                    continue
+                if self._try_end_definition(token):
+                    continue
+                raise ParseError(f"unexpected 'end' at {token.line}:{token.column}")
             if lexeme == ":asm":
                 self._parse_asm_definition(token)
                 continue
@@ -439,13 +448,13 @@ class Parser:
if lexeme == "do": if lexeme == "do":
self._handle_do_control() self._handle_do_control()
continue continue
if lexeme == "end":
self._handle_end_control()
continue
if self._maybe_expand_macro(token): if self._maybe_expand_macro(token):
continue continue
self._handle_token(token) self._handle_token(token)
if self.macro_recording is not None:
raise ParseError("unterminated macro definition (missing ';')")
if len(self.context_stack) != 1: if len(self.context_stack) != 1:
raise ParseError("unclosed definition at EOF") raise ParseError("unclosed definition at EOF")
if self.control_stack: if self.control_stack:
@@ -593,7 +602,7 @@ class Parser:
     def _handle_macro_recording(self, token: Token) -> bool:
         if self.macro_recording is None:
             return False
-        if token.lexeme == ";macro":
+        if token.lexeme == ";":
             self._finish_macro_recording(token)
         else:
             self.macro_recording.tokens.append(token.lexeme)
@@ -638,7 +647,7 @@ class Parser:
     def _finish_macro_recording(self, token: Token) -> None:
         if self.macro_recording is None:
-            raise ParseError(f"unexpected ';macro' at {token.line}:{token.column}")
+            raise ParseError(f"unexpected ';' closing a macro at {token.line}:{token.column}")
         macro_def = self.macro_recording
         self.macro_recording = None
         word = Word(name=macro_def.name)
@@ -715,11 +724,24 @@ class Parser:
         self._append_op(Op(op="branch_zero", data=entry["end"]))
         self._push_control(entry)
-    def _begin_definition(self, token: Token) -> None:
+    def _try_end_definition(self, token: Token) -> bool:
+        if len(self.context_stack) <= 1:
+            return False
+        ctx = self.context_stack[-1]
+        if not isinstance(ctx, Definition):
+            return False
+        if ctx.terminator != token.lexeme:
+            return False
+        self._end_definition(token)
+        return True
+    def _begin_definition(self, token: Token, terminator: str = "end") -> None:
         if self._eof():
-            raise ParseError(f"definition name missing after ':' at {token.line}:{token.column}")
+            raise ParseError(
+                f"definition name missing after '{token.lexeme}' at {token.line}:{token.column}"
+            )
         name_token = self._consume()
-        definition = Definition(name=name_token.lexeme, body=[])
+        definition = Definition(name=name_token.lexeme, body=[], terminator=terminator)
         self.context_stack.append(definition)
         word = self.dictionary.lookup(definition.name)
         if word is None:
@@ -730,10 +752,14 @@ class Parser:
     def _end_definition(self, token: Token) -> None:
         if len(self.context_stack) <= 1:
-            raise ParseError(f"unexpected ';' at {token.line}:{token.column}")
+            raise ParseError(f"unexpected '{token.lexeme}' at {token.line}:{token.column}")
         ctx = self.context_stack.pop()
         if not isinstance(ctx, Definition):
-            raise ParseError("';' can only close definitions")
+            raise ParseError(f"'{token.lexeme}' can only close definitions")
+        if ctx.terminator != token.lexeme:
+            raise ParseError(
+                f"definition '{ctx.name}' expects terminator '{ctx.terminator}' but got '{token.lexeme}'"
+            )
         word = self.definition_stack.pop()
         ctx.immediate = word.immediate
         ctx.compile_only = word.compile_only
@@ -1836,7 +1862,7 @@ def macro_compile_time(ctx: MacroContext) -> Optional[List[Op]]:
 def macro_begin_text_macro(ctx: MacroContext) -> Optional[List[Op]]:
     parser = ctx.parser
     if parser._eof():
-        raise ParseError("macro name missing after 'macro:'")
+        raise ParseError("macro name missing after 'macro'")
     name_token = parser.next_token()
     param_count = 0
     peek = parser.peek_token()
@@ -1850,14 +1876,6 @@ def macro_begin_text_macro(ctx: MacroContext) -> Optional[List[Op]]:
     return None
-def macro_end_text_macro(ctx: MacroContext) -> Optional[List[Op]]:
-    parser = ctx.parser
-    if parser.macro_recording is None:
-        raise ParseError("';macro' without matching 'macro:'")
-    # Actual closing handled in parser loop when ';macro' token is seen.
-    return None
 def _struct_emit_definition(tokens: List[Token], template: Token, name: str, body: Sequence[str]) -> None:
     def make_token(lexeme: str) -> Token:
         return Token(
@@ -1868,11 +1886,11 @@ def _struct_emit_definition(tokens: List[Token], template: Token, name: str, bod
             end=template.end,
         )
-    tokens.append(make_token(":"))
+    tokens.append(make_token("word"))
     tokens.append(make_token(name))
     for lexeme in body:
         tokens.append(make_token(lexeme))
-    tokens.append(make_token(";"))
+    tokens.append(make_token("end"))
 class SplitLexer:
@@ -2521,8 +2539,7 @@ def bootstrap_dictionary() -> Dictionary:
dictionary.register(Word(name="immediate", immediate=True, macro=macro_immediate)) dictionary.register(Word(name="immediate", immediate=True, macro=macro_immediate))
dictionary.register(Word(name="compile-only", immediate=True, macro=macro_compile_only)) dictionary.register(Word(name="compile-only", immediate=True, macro=macro_compile_only))
dictionary.register(Word(name="compile-time", immediate=True, macro=macro_compile_time)) dictionary.register(Word(name="compile-time", immediate=True, macro=macro_compile_time))
dictionary.register(Word(name="macro:", immediate=True, macro=macro_begin_text_macro)) dictionary.register(Word(name="macro", immediate=True, macro=macro_begin_text_macro))
dictionary.register(Word(name=";macro", immediate=True, macro=macro_end_text_macro))
dictionary.register(Word(name="struct:", immediate=True, macro=macro_struct_begin)) dictionary.register(Word(name="struct:", immediate=True, macro=macro_struct_begin))
dictionary.register(Word(name=";struct", immediate=True, macro=macro_struct_end)) dictionary.register(Word(name=";struct", immediate=True, macro=macro_struct_end))
_register_compile_time_primitives(dictionary) _register_compile_time_primitives(dictionary)
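The terminator check this commit adds to the parser can be modeled in isolation. A simplified sketch (`MiniParser` is hypothetical and omits tokens, ops, and the dictionary; only the `context_stack`/`terminator` interplay from `_try_end_definition` is reproduced):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Definition:
    name: str
    terminator: str = "end"

class MiniParser:
    def __init__(self):
        # Sentinel top-level context, matching the "len(...) <= 1" guard in the diff.
        self.context_stack: List[object] = [object()]

    def begin(self, name, terminator="end"):
        self.context_stack.append(Definition(name, terminator))

    def try_end(self, lexeme):
        # 'end' only closes a definition whose recorded terminator matches.
        if len(self.context_stack) <= 1:
            return False
        ctx = self.context_stack[-1]
        if not isinstance(ctx, Definition) or ctx.terminator != lexeme:
            return False
        self.context_stack.pop()
        return True

p = MiniParser()
p.begin("main")
print(p.try_end(";"))    # False: ';' no longer closes a 'word' definition
print(p.try_end("end"))  # True
```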


@@ -2,14 +2,14 @@ import stdlib/stdlib.sl
 import stdlib/io.sl
 import fn.sl
-: main
+word main
 2 40 +
 puti cr
 extend-syntax
 foo(1, 2)
 puti cr
 0
-;
+end
 fn foo(int a, int b){
 return a + b;


@@ -4,15 +4,15 @@
 # and prints that much consequent elements
 # from the stack while not modifying it
-: dump
+word dump
 1 swap
 for
 dup pick
 puti cr
 1 +
 end
 drop
-;
+end
 # : int3 ( -- )
 :asm int3 {


@@ -86,10 +86,10 @@
 # Output
 extern int printf(char* fmt, double x)
-: fput
+word fput
 "%f" drop swap printf drop
-;
+end
-: fputln
+word fputln
 "%f\n" drop swap printf drop
-;
+end


@@ -343,5 +343,6 @@
 }
 ;
-: cr 10 putc ;
-: puts write_buf cr ;
+word cr 10 putc end
+word puts write_buf cr end


@@ -1,6 +1,6 @@
 import stdlib.sl
-: alloc
+word alloc
 0 # addr hint (NULL)
 swap # size
 3 # prot (PROT_READ | PROT_WRITE)
@@ -8,8 +8,8 @@ import stdlib.sl
 -1 # fd
 0 # offset
 mmap
-;
+end
-: free
+word free
 munmap drop
-;
+end


@@ -1,14 +1,14 @@
 import stdlib/stdlib.sl
 import stdlib/io.sl
-: strcmp
+word strcmp
 3 pick 2 pick @ swap @ ==
-;
+end
-: main
+word main
 "g" "g"
 strcmp
 puti cr
 puts
 puts
-;
+end


@@ -1,7 +1,7 @@
 import stdlib/stdlib.sl
 import stdlib/io.sl
-: strconcat
+word strconcat
 0 pick 3 pick +
 dup
 >r >r >r >r >r >r
@@ -25,9 +25,9 @@ import stdlib/io.sl
 rot
 drop
 rdrop rdrop rdrop
-;
+end
-: alloc
+word alloc
 0 # addr hint (NULL)
 swap # size
 3 # prot (PROT_READ | PROT_WRITE)
@@ -35,13 +35,13 @@ import stdlib/io.sl
 -1 # fd
 0 # offset
 mmap
-;
+end
-: free
+word free
 munmap drop
-;
+end
-: strcpy #(dst_addr src_addr len -- dst_addr len)
+word strcpy #(dst_addr src_addr len -- dst_addr len)
 dup
 >r
 swap
@@ -66,10 +66,10 @@ import stdlib/io.sl
 swap
 nip
 r> dup -rot - swap
-;
+end
-: main
+word main
 "hello world hello world hello " "world hello world hello world"
 strconcat
 puts
-;
+end

t.sl

@@ -8,9 +8,9 @@ fn foo(int a, int b){
 return a b +;
 }
-: main
+word main
 extend-syntax
 foo(3, 2)
 puti cr
 0
-;
+end


@@ -2,22 +2,22 @@ import ../stdlib/stdlib.sl
 import ../stdlib/io.sl
 import ../stdlib/mem.sl
-: test-mem-alloc
+word test-mem-alloc
 4096 alloc dup 1337 swap ! # allocate 4096 bytes, store 1337 at start
 dup @ puti cr # print value at start
 4096 free # free the memory
-;
+end
 struct: Point
 field x 8
 field y 8
 ;struct
-: main
+word main
 32 alloc # allocate 32 bytes (enough for a Point struct)
 dup 111 swap Point.x!
 dup 222 swap Point.y!
 dup Point.x@ puti cr
 Point.y@ puti cr
 32 free # free the memory
-;
+end


@@ -2,14 +2,14 @@ import ../stdlib/stdlib.sl
 import ../stdlib/io.sl
 import ../fn.sl
-: main
+word main
 2 40 +
 puti cr
 extend-syntax
 foo(1, 2)
 puti cr
 0
-;
+end
 fn foo(int a, int b){
 return a + b;


@@ -2,7 +2,7 @@ import ../stdlib/stdlib.sl
 import ../stdlib/io.sl
 import ../stdlib/debug.sl
-: main
+word main
 1 1 2dup 2dup puti cr puti cr
 +
 dup puti cr
@@ -15,12 +15,12 @@ import ../stdlib/debug.sl
 r> 3 + puti
 " numbers printed from the fibonaci sequence" puts
 0
-;
+end
-: main2
+word main2
 1 2 while over 100 < do
 over puti cr
 swap over +
 end
-;
+end


@@ -9,21 +9,21 @@ import ../fn.sl
 }
 ;
-macro: square
+macro square
 dup *
-;macro
+;
-macro: defconst 2
+macro defconst 2
-: $1
+word $1
 $2
-;
+end
-;macro
+;
-macro: defadder 3
+macro defadder 3
-: $1
+word $1
 $2 $3 +
-;
+end
-;macro
+;
 defconst MAGIC 99
 defadder add13 5 8
@@ -39,46 +39,46 @@ fn fancy_add(int a, int b){
 return (a + b) * b;
 }
-: test-add
+word test-add
 5 7 + puti cr
-;
+end
-: test-sub
+word test-sub
 10 3 - puti cr
-;
+end
-: test-mul
+word test-mul
 6 7 * puti cr
-;
+end
-: test-div
+word test-div
 84 7 / puti cr
-;
+end
-: test-mod
+word test-mod
 85 7 % puti cr
-;
+end
-: test-drop
+word test-drop
 10 20 drop puti cr
-;
+end
-: test-dup
+word test-dup
 11 dup + puti cr
-;
+end
-: test-swap
+word test-swap
 2 5 swap - puti cr
-;
+end
-: test-store
+word test-store
 mem-slot dup
 123 swap !
 @ puti cr
-;
+end
-: test-mmap
+word test-mmap
 0 # addr hint (NULL)
 4096 # length (page)
 3 # prot (PROT_READ | PROT_WRITE)
@@ -91,23 +91,23 @@ fn fancy_add(int a, int b){
 dup
 @ puti cr
 4096 munmap drop
-;
+end
-: test-macro
+word test-macro
 9 square puti cr
 MAGIC puti cr
 add13 puti cr
-;
+end
-: test-if
+word test-if
 5 5 == if
 111 puti cr
 else
 222 puti cr
 end
-;
+end
-: test-else-if
+word test-else-if
 2
 dup 1 == if
 50 puti cr
@@ -119,34 +119,34 @@ fn fancy_add(int a, int b){
 end
 end
 drop
-;
+end
-: test-for
+word test-for
 0
 5 for
 1 +
 end
 puti cr
-;
+end
-: test-for-zero
+word test-for-zero
 123
 0 for
 drop
 end
 puti cr
-;
+end
-: test-struct
+word test-struct
 mem-slot
 dup 111 swap Point.x!
 dup 222 swap Point.y!
 dup Point.x@ puti cr
 Point.y@ puti cr
 Point.size puti cr
-;
+end
-: test-cmp
+word test-cmp
 5 5 == puti cr
 5 4 == puti cr
 5 4 != puti cr
@@ -159,16 +159,16 @@ fn fancy_add(int a, int b){
 6 5 <= puti cr
 5 5 >= puti cr
 4 5 >= puti cr
-;
+end
-: test-c-fn
+word test-c-fn
 3
 7
 fancy_add()
 puti cr
-;
+end
-: main
+word main
 test-add
 test-sub
 test-mul
@@ -188,4 +188,4 @@ fn fancy_add(int a, int b){
 test-struct
 test-c-fn
 0
-;
+end


@@ -1,7 +1,7 @@
 import ../stdlib/stdlib.sl
 import ../stdlib/io.sl
-: main
+word main
 "/tmp/l2_read_file_test.txt"
 "read_file works\n"
 write_file drop
@@ -33,4 +33,4 @@ import ../stdlib/io.sl
"unknown read_file failure" puts "unknown read_file failure" puts
dup # file_len file_len file_addr dup # file_len file_len file_addr
exit # Exit with returned file_len as the program exit code (debug) exit # Exit with returned file_len as the program exit code (debug)
; end


@@ -1,7 +1,7 @@
 import ../stdlib/stdlib.sl
 import ../stdlib/io.sl
-: main
+word main
 1024
 read_stdin # returns (addr len)
 dup 0 > if
@@ -10,4 +10,4 @@ import ../stdlib/io.sl
 end
 "read_stdin failed" puts
 exit
-;
+end


@@ -1,8 +1,8 @@
 import ../stdlib/stdlib.sl
 import ../stdlib/io.sl
-: main
+word main
 "hello from write_buf test\n"
 write_buf
 0
-;
+end


@@ -1,7 +1,7 @@
 import ../stdlib/stdlib.sl
 import ../stdlib/io.sl
-: main
+word main
 "/tmp/l2_write_file_test.txt" # path
 "hello from write_file test\n" # buffer
 write_file
@@ -14,4 +14,4 @@ import ../stdlib/io.sl
"write failed errno=" puts "write failed errno=" puts
puti cr puti cr
exit exit
; end


@@ -1,7 +1,7 @@
 import ../stdlib/stdlib.sl
 import ../stdlib/io.sl
-: main
+word main
 10
 while
 dup 0 >
@@ -10,4 +10,4 @@ import ../stdlib/io.sl
 1 -
 end
 drop
-;
+end


@@ -1,7 +1,7 @@
 import ../stdlib/stdlib.sl
 import ../stdlib/io.sl
-: main
+word main
 0
 5 for
 1 +
@@ -10,4 +10,4 @@ import ../stdlib/io.sl
 5 5 == puti cr
 5 4 == puti cr
 0
-;
+end


@@ -1,9 +1,9 @@
 import ../stdlib/stdlib.sl
 import ../stdlib/io.sl
-: main
+word main
 mem 5 swap !
 mem 8 + 6 swap !
 mem @ puti cr
 mem 8 + @ puti cr
-;
+end


@@ -1,12 +1,12 @@
 import ../stdlib/stdlib.sl
 import ../stdlib/io.sl
-: dup
+word dup
 6
-;
+end
 compile-only
-: emit-overridden
+word emit-overridden
 "dup" use-l2-ct
 42
 dup
@@ -17,12 +17,12 @@ compile-only
 swap
 list-append
 inject-tokens
-;
+end
 immediate
 compile-only
-: main
+word main
 emit-overridden
 puti cr
 0
-;
+end


@@ -1,9 +1,9 @@
 import ../stdlib/stdlib.sl
 import ../stdlib/io.sl
-: main
+word main
 "hello world" puts
 "line1\nline2" puts
 "" puts
 0
-;
+end


@@ -0,0 +1 @@
+7

tests/word_syntax.sl (new file)

@@ -0,0 +1,12 @@
+import ../stdlib/stdlib.sl
+import ../stdlib/io.sl
+word add-two
++
+end
+word main
+3 4 add-two
+puti cr
+0
+end

tests/word_syntax.test (new file)

@@ -0,0 +1 @@
+python main.py tests/word_syntax.sl -o /tmp/word_syntax > /dev/null && /tmp/word_syntax