Preparation for sharing
- rustfmt
- clippy
- comments
- README
28 README.md
@@ -1,5 +1,29 @@
All you need to run the project is a nightly rust toolchain. Go to one of the folders within `examples` and run

Orchid is an experimental lazy, pure functional programming language designed to be embeddable in a Rust application for scripting.

# Usage

TODO

I need to write a few articles explaining individual fragments of the language and accurately document everything. Writing tutorials at this stage is not really worth it.

# Design

The execution model is lambda calculus, with call by name and copy tracking to avoid repeating steps; this keeps the number of necessary reduction steps minimal.
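To illustrate (a standard lambda-calculus example, not taken from the Orchid sources): under call by name the argument is substituted unevaluated, and copy tracking shares the copies so it is still only reduced once.

```latex
% (\x. x + x) applied to (1 + 2): the argument is passed as an unevaluated thunk t,
% both occurrences of x refer to the same t, so (1 + 2) is reduced a single time.
(\lambda x.\; x + x)\,(1 + 2)
  \;\to\; t + t \quad\text{where } t = (1 + 2)
  \;\to\; 3 + 3
  \;\to\; 6
```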

To make the syntax more intuitive, completely hygienic macros can be used. They are applied to expressions after all imports are resolved, and all tokens are namespaced both in the macro and in the referencing expression.

Namespaces are inspired by Rust modules and ES6. Every file and directory is implicitly a public module. Files can `export` the names of constants or namespaces, all the names in a substitution rule, or an explicit list of names. Names are implicitly created when they're referenced. `import` syntax is similar to Rust's, except it uses `(` parentheses `)` and no semicolons.

# Try it out

The project uses the nightly rust toolchain. Go to one of the folders within `examples` and run

```sh
cargo run -- -p .
```

You can try modifying the examples, but error reporting is pretty rough for the time being.

# Contribution

All contributions are welcome. For the time being, use the issue tracker to discuss ideas.

27 notes/macros.md (Normal file)
@@ -0,0 +1,27 @@
Substitution rules are represented by the `=prio=>` arrow, where `prio` is a floating point literal. They are tested from highest priority to lowest. When one matches, the substitution is executed and all macros are re-checked from the beginning.

Wildcards either match a single token (`$foo`), at least one token (`...$bar`), or any number of tokens (`..$baz`). The latter two forms can also carry an unsigned integer growth priority (`...$quz:3`), which influences their order when deciding the precedence of matches.

# Match priority

When a macro matches the program more than once, matches in ancestors take precedence. If there is no direct ancestry, the left branch takes precedence. When two matches are found in the same token sequence, their order is determined by the number of tokens allocated to the highest priority variable length wildcard where this number differs.

Variable length placeholders outside parens always have a higher priority than those inside. On the same level, the growth numbers decide the priority. In case of a tie, the placeholder to the left is preferred.

# Writing macros

Macro programs are systems of substitution rules which reinterpret the tree produced by the previous rules. A good example of how this works can be found in ../examples/list-processing/fn.orc

Priority numbers are written in hexadecimal normal form to avoid precision bugs, and they're divided into bands throughout the f64 value range (the numbers below are powers of 16; a worked example follows the list):

- **32-39**: Binary operators, in inverse priority order
- **80-87**: Expression-like structures such as if/then/else
- **128-135**: Anything that creates lambdas.
  Programs triggered by a lower priority pattern than this can assume that all names are correctly bound.
- **200**: Aliases extracted for readability.
  The user-accessible entry points of all macro programs must be lower priority than this, so any arbitrary syntax can be extracted into an alias with no side effects.
- **224-231**: Integration; documented hooks exposed by a macro package to allow third party packages to extend its functionality.
  The `statement` pattern produced by `do{}` blocks and matched by `let` and `cps` is a good example of this. When any of these are triggered, all macro programs are in a documented state.
- **248-255**: Transitional states within macro programs get the highest priority.

The numbers are arbitrary and up for debate. These are just the ones I came up with when writing the examples.
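For concreteness, here is my own reading of the band numbering (an illustration, not part of the original notes): a band value b stands for priorities on the order of 16^b, i.e. 2^(4b), which stays inside the f64 range even for the 248-255 band. Assuming C-style hex float literals, the bands translate like this:

```latex
16^{32}  = 2^{128}  \;\;(\texttt{0x1p128}), \qquad
16^{200} = 2^{800}  \;\;(\texttt{0x1p800}), \qquad
16^{255} = 2^{1020} \;\;(\texttt{0x1p1020})
```
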
@@ -24,7 +24,7 @@
"editor.formatOnType": true,
},
"[rust]": {
"editor.rulers": [74]
"editor.rulers": [80],
},
"rust-analyzer.showUnlinkedFileNotification": false,
"rust-analyzer.checkOnSave": true,

34 rustfmt.toml (Normal file)
@@ -0,0 +1,34 @@
# meta
format_code_in_doc_comments = true
unstable_features = true
version = "Two"

# space
tab_spaces = 2
max_width = 80
error_on_line_overflow = true
format_macro_matchers = true
newline_style = "Unix"
normalize_comments = true
wrap_comments = true
overflow_delimited_expr = true
single_line_if_else_max_width = 50
use_small_heuristics = "Max"

# literals
hex_literal_case = "Lower"
format_strings = true

# delimiters
match_arm_blocks = false
match_block_trailing_comma = true

# structure
condense_wildcard_suffixes = true
use_field_init_shorthand = true
use_try_shorthand = true

# Modules
group_imports = "StdExternalCrate"
imports_granularity = "Module"
reorder_modules = true

13 src/cli.rs
@@ -1,19 +1,22 @@
use std::{fmt::Display, io::{stdin, BufRead, stdout, Write}};
use std::fmt::Display;
use std::io::{stdin, stdout, BufRead, Write};

pub fn prompt<T: Display, E: Display>(
prompt: &str,
default: T,
mut try_cast: impl FnMut(String) -> Result<T, E>
mut try_cast: impl FnMut(String) -> Result<T, E>,
) -> T {
loop {
print!("{prompt} ({default}): ");
stdout().lock().flush().unwrap();
let mut input = String::with_capacity(100);
stdin().lock().read_line(&mut input).unwrap();
if input.is_empty() {return default}
if input.is_empty() {
return default;
}
match try_cast(input) {
Ok(t) => return t,
Err(e) => println!("Error: {e}")
Err(e) => println!("Error: {e}"),
}
}
}
}
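For reference, a hypothetical call site for `prompt` (not part of this commit); the closure only has to return a `Result` whose error type implements `Display`:

```rust
use crate::cli::prompt; // assuming the cli module is reachable from here

fn ask_gas_limit() -> u64 {
  // str::parse::<u64> returns Result<u64, ParseIntError>, and ParseIntError
  // implements Display, so the closure satisfies the FnMut(String) -> Result<T, E> bound.
  prompt("Gas limit", 1000u64, |s| s.trim().parse::<u64>())
}
```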
18
src/external/assertion_error.rs
vendored
18
src/external/assertion_error.rs
vendored
@@ -1,23 +1,27 @@
|
||||
use std::rc::Rc;
|
||||
use std::fmt::Display;
|
||||
use std::rc::Rc;
|
||||
|
||||
use crate::foreign::ExternError;
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
|
||||
|
||||
/// Some expectation (usually about the argument types of a function) did not
|
||||
/// hold.
|
||||
#[derive(Clone)]
|
||||
pub struct AssertionError{
|
||||
pub struct AssertionError {
|
||||
pub value: ExprInst,
|
||||
pub assertion: &'static str,
|
||||
}
|
||||
|
||||
impl AssertionError {
|
||||
pub fn fail(value: ExprInst, assertion: &'static str) -> Result<!, Rc<dyn ExternError>> {
|
||||
return Err(Self { value, assertion }.into_extern())
|
||||
pub fn fail(
|
||||
value: ExprInst,
|
||||
assertion: &'static str,
|
||||
) -> Result<!, Rc<dyn ExternError>> {
|
||||
return Err(Self { value, assertion }.into_extern());
|
||||
}
|
||||
|
||||
pub fn ext(value: ExprInst, assertion: &'static str) -> Rc<dyn ExternError> {
|
||||
return Self { value, assertion }.into_extern()
|
||||
return Self { value, assertion }.into_extern();
|
||||
}
|
||||
}
|
||||
|
||||
@@ -27,4 +31,4 @@ impl Display for AssertionError {
|
||||
}
|
||||
}
|
||||
|
||||
impl ExternError for AssertionError{}
|
||||
impl ExternError for AssertionError {}
|
||||
|
||||
18
src/external/bool/boolean.rs
vendored
18
src/external/bool/boolean.rs
vendored
@@ -1,12 +1,18 @@
|
||||
use crate::foreign::Atom;
|
||||
use crate::representations::{interpreted::{Clause, ExprInst}, Primitive};
|
||||
use crate::atomic_inert;
|
||||
use crate::foreign::Atom;
|
||||
use crate::representations::interpreted::{Clause, ExprInst};
|
||||
use crate::representations::Primitive;
|
||||
|
||||
/// Booleans exposed to Orchid
|
||||
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
|
||||
pub struct Boolean(pub bool);
|
||||
atomic_inert!(Boolean);
|
||||
|
||||
impl From<bool> for Boolean { fn from(value: bool) -> Self { Self(value) } }
|
||||
impl From<bool> for Boolean {
|
||||
fn from(value: bool) -> Self {
|
||||
Self(value)
|
||||
}
|
||||
}
|
||||
|
||||
impl TryFrom<ExprInst> for Boolean {
|
||||
type Error = ();
|
||||
@@ -15,9 +21,9 @@ impl TryFrom<ExprInst> for Boolean {
|
||||
let expr = value.expr();
|
||||
if let Clause::P(Primitive::Atom(Atom(a))) = &expr.clause {
|
||||
if let Some(b) = a.as_any().downcast_ref::<Boolean>() {
|
||||
return Ok(*b)
|
||||
return Ok(*b);
|
||||
}
|
||||
}
|
||||
return Err(())
|
||||
Err(())
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
54
src/external/bool/equals.rs
vendored
54
src/external/bool/equals.rs
vendored
@@ -1,48 +1,54 @@
|
||||
use std::fmt::Debug;
|
||||
|
||||
use crate::external::litconv::with_lit;
|
||||
use crate::representations::{interpreted::ExprInst, Literal};
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
|
||||
use super::super::assertion_error::AssertionError;
|
||||
use super::boolean::Boolean;
|
||||
use crate::external::litconv::with_lit;
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::representations::Literal;
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
|
||||
/// Equals function
|
||||
/// Compares the inner values if
|
||||
///
|
||||
/// - both values are char,
|
||||
/// - both are string,
|
||||
/// - both are either uint or num
|
||||
///
|
||||
/// Next state: [Equals1]
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct Equals2;
|
||||
externfn_impl!(Equals2, |_: &Self, x: ExprInst| Ok(Equals1{x}));
|
||||
externfn_impl!(Equals2, |_: &Self, x: ExprInst| Ok(Equals1 { x }));
|
||||
|
||||
/// Partially applied Equals function
|
||||
///
|
||||
/// Prev state: [Equals2]; Next state: [Equals0]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Equals1{ x: ExprInst }
|
||||
pub struct Equals1 {
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(Equals1, x);
|
||||
atomic_impl!(Equals1);
|
||||
externfn_impl!(Equals1, |this: &Self, x: ExprInst| {
|
||||
with_lit(&this.x, |l| Ok(Equals0{ a: l.clone(), x }))
|
||||
with_lit(&this.x, |l| Ok(Equals0 { a: l.clone(), x }))
|
||||
});
|
||||
|
||||
/// Fully applied Equals function.
|
||||
///
|
||||
/// Prev state: [Equals1]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Equals0 { a: Literal, x: ExprInst }
|
||||
pub struct Equals0 {
|
||||
a: Literal,
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(Equals0, x);
|
||||
atomic_impl!(Equals0, |Self{ a, x }: &Self, _| {
|
||||
let eqls = with_lit(x, |l| Ok(match (a, l) {
|
||||
(Literal::Char(c1), Literal::Char(c2)) => c1 == c2,
|
||||
(Literal::Num(n1), Literal::Num(n2)) => n1 == n2,
|
||||
(Literal::Str(s1), Literal::Str(s2)) => s1 == s2,
|
||||
(Literal::Uint(i1), Literal::Uint(i2)) => i1 == i2,
|
||||
(Literal::Num(n1), Literal::Uint(u1)) => *n1 == (*u1 as f64),
|
||||
(Literal::Uint(u1), Literal::Num(n1)) => *n1 == (*u1 as f64),
|
||||
(_, _) => AssertionError::fail(x.clone(), "the expected type")?,
|
||||
}))?;
|
||||
atomic_impl!(Equals0, |Self { a, x }: &Self, _| {
|
||||
let eqls = with_lit(x, |l| {
|
||||
Ok(match (a, l) {
|
||||
(Literal::Char(c1), Literal::Char(c2)) => c1 == c2,
|
||||
(Literal::Num(n1), Literal::Num(n2)) => n1 == n2,
|
||||
(Literal::Str(s1), Literal::Str(s2)) => s1 == s2,
|
||||
(Literal::Uint(i1), Literal::Uint(i2)) => i1 == i2,
|
||||
(Literal::Num(n1), Literal::Uint(u1)) => *n1 == (*u1 as f64),
|
||||
(Literal::Uint(u1), Literal::Num(n1)) => *n1 == (*u1 as f64),
|
||||
(..) => AssertionError::fail(x.clone(), "the expected type")?,
|
||||
})
|
||||
})?;
|
||||
Ok(Boolean::from(eqls).to_atom_cls())
|
||||
});
|
||||
|
||||
57
src/external/bool/ifthenelse.rs
vendored
57
src/external/bool/ifthenelse.rs
vendored
@@ -1,41 +1,46 @@
|
||||
use std::fmt::Debug;
|
||||
use std::rc::Rc;
|
||||
|
||||
use super::Boolean;
|
||||
use crate::external::assertion_error::AssertionError;
|
||||
use crate::representations::{PathSet, interpreted::{Clause, ExprInst}};
|
||||
use crate::representations::interpreted::{Clause, ExprInst};
|
||||
use crate::representations::PathSet;
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
|
||||
use super::Boolean;
|
||||
|
||||
/// IfThenElse function
|
||||
///
|
||||
/// Takes a boolean and two branches, runs the first if the bool is true, the
|
||||
/// second if it's false.
|
||||
///
|
||||
/// Next state: [IfThenElse0]
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct IfThenElse1;
|
||||
externfn_impl!(IfThenElse1, |_: &Self, x: ExprInst| Ok(IfThenElse0{x}));
|
||||
|
||||
/// Partially applied IfThenElse function
|
||||
///
|
||||
/// Prev state: [IfThenElse1]; Next state: [IfThenElse0]
|
||||
externfn_impl!(IfThenElse1, |_: &Self, x: ExprInst| Ok(IfThenElse0 { x }));
|
||||
|
||||
/// Prev state: [IfThenElse1]
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct IfThenElse0{ x: ExprInst }
|
||||
pub struct IfThenElse0 {
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(IfThenElse0, x);
|
||||
atomic_impl!(IfThenElse0, |this: &Self, _| {
|
||||
let Boolean(b) = this.x.clone().try_into()
|
||||
let Boolean(b) = this
|
||||
.x
|
||||
.clone()
|
||||
.try_into()
|
||||
.map_err(|_| AssertionError::ext(this.x.clone(), "a boolean"))?;
|
||||
Ok(if b { Clause::Lambda {
|
||||
args: Some(PathSet { steps: Rc::new(vec![]), next: None }),
|
||||
body: Clause::Lambda {
|
||||
args: None,
|
||||
body: Clause::LambdaArg.wrap()
|
||||
}.wrap()
|
||||
}} else { Clause::Lambda {
|
||||
args: None,
|
||||
body: Clause::Lambda {
|
||||
Ok(if b {
|
||||
Clause::Lambda {
|
||||
args: Some(PathSet { steps: Rc::new(vec![]), next: None }),
|
||||
body: Clause::LambdaArg.wrap()
|
||||
}.wrap()
|
||||
}})
|
||||
});
|
||||
body: Clause::Lambda { args: None, body: Clause::LambdaArg.wrap() }
|
||||
.wrap(),
|
||||
}
|
||||
} else {
|
||||
Clause::Lambda {
|
||||
args: None,
|
||||
body: Clause::Lambda {
|
||||
args: Some(PathSet { steps: Rc::new(vec![]), next: None }),
|
||||
body: Clause::LambdaArg.wrap(),
|
||||
}
|
||||
.wrap(),
|
||||
}
|
||||
})
|
||||
});
|
||||
|
||||
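As a note on what the two reformatted branches above construct (my summary, not part of the diff): the boolean is turned into the standard Church-boolean selector, so applying it to the two branches yields the chosen one.

```latex
% true selects its first argument, false its second:
\mathrm{true} \mapsto \lambda t.\,\lambda f.\;t \qquad
\mathrm{false} \mapsto \lambda t.\,\lambda f.\;f
```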
10
src/external/bool/mod.rs
vendored
10
src/external/bool/mod.rs
vendored
@@ -1,16 +1,16 @@
|
||||
mod equals;
|
||||
mod boolean;
|
||||
mod equals;
|
||||
mod ifthenelse;
|
||||
pub use boolean::Boolean;
|
||||
|
||||
use crate::{pipeline::ConstTree, interner::Interner};
|
||||
|
||||
use crate::interner::Interner;
|
||||
use crate::pipeline::ConstTree;
|
||||
|
||||
pub fn bool(i: &Interner) -> ConstTree {
|
||||
ConstTree::tree([
|
||||
(i.i("ifthenelse"), ConstTree::xfn(ifthenelse::IfThenElse1)),
|
||||
(i.i("equals"), ConstTree::xfn(equals::Equals2)),
|
||||
(i.i("true"), ConstTree::atom(Boolean(true))),
|
||||
(i.i("false"), ConstTree::atom(Boolean(false)))
|
||||
(i.i("false"), ConstTree::atom(Boolean(false))),
|
||||
])
|
||||
}
|
||||
}
|
||||
|
||||
9
src/external/conv/mod.rs
vendored
9
src/external/conv/mod.rs
vendored
@@ -1,13 +1,14 @@
|
||||
use crate::{interner::Interner, pipeline::ConstTree};
|
||||
use crate::interner::Interner;
|
||||
use crate::pipeline::ConstTree;
|
||||
|
||||
mod to_string;
|
||||
mod parse_float;
|
||||
mod parse_uint;
|
||||
mod to_string;
|
||||
|
||||
pub fn conv(i: &Interner) -> ConstTree {
|
||||
ConstTree::tree([
|
||||
(i.i("parse_float"), ConstTree::xfn(parse_float::ParseFloat1)),
|
||||
(i.i("parse_uint"), ConstTree::xfn(parse_uint::ParseUint1)),
|
||||
(i.i("to_string"), ConstTree::xfn(to_string::ToString1))
|
||||
(i.i("to_string"), ConstTree::xfn(to_string::ToString1)),
|
||||
])
|
||||
}
|
||||
}
|
||||
|
||||
52
src/external/conv/parse_float.rs
vendored
52
src/external/conv/parse_float.rs
vendored
@@ -1,41 +1,43 @@
|
||||
use std::fmt::Debug;
|
||||
|
||||
use chumsky::Parser;
|
||||
|
||||
use std::fmt::Debug;
|
||||
|
||||
use super::super::assertion_error::AssertionError;
|
||||
use crate::external::litconv::with_lit;
|
||||
use crate::parse::float_parser;
|
||||
use crate::representations::{interpreted::ExprInst, Literal};
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::representations::Literal;
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
|
||||
/// ParseFloat a number
|
||||
///
|
||||
/// parse a number. Accepts the same syntax Orchid does
|
||||
///
|
||||
/// Next state: [ParseFloat0]
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct ParseFloat1;
|
||||
externfn_impl!(ParseFloat1, |_: &Self, x: ExprInst| Ok(ParseFloat0{x}));
|
||||
externfn_impl!(ParseFloat1, |_: &Self, x: ExprInst| Ok(ParseFloat0 { x }));
|
||||
|
||||
/// Applied to_string function
|
||||
///
|
||||
/// Prev state: [ParseFloat1]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct ParseFloat0{ x: ExprInst }
|
||||
pub struct ParseFloat0 {
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(ParseFloat0, x);
|
||||
atomic_impl!(ParseFloat0, |Self{ x }: &Self, _| {
|
||||
let number = with_lit(x, |l| Ok(match l {
|
||||
Literal::Str(s) => {
|
||||
let parser = float_parser();
|
||||
parser.parse(s.as_str())
|
||||
.map_err(|_| AssertionError::ext(x.clone(), "cannot be parsed into a float"))?
|
||||
}
|
||||
Literal::Num(n) => *n,
|
||||
Literal::Uint(i) => (*i as u32).into(),
|
||||
Literal::Char(char) => char.to_digit(10)
|
||||
.ok_or(AssertionError::ext(x.clone(), "is not a decimal digit"))?
|
||||
.into()
|
||||
}))?;
|
||||
atomic_impl!(ParseFloat0, |Self { x }: &Self, _| {
|
||||
let number = with_lit(x, |l| {
|
||||
Ok(match l {
|
||||
Literal::Str(s) => {
|
||||
let parser = float_parser();
|
||||
parser.parse(s.as_str()).map_err(|_| {
|
||||
AssertionError::ext(x.clone(), "cannot be parsed into a float")
|
||||
})?
|
||||
},
|
||||
Literal::Num(n) => *n,
|
||||
Literal::Uint(i) => (*i as u32).into(),
|
||||
Literal::Char(char) => char
|
||||
.to_digit(10)
|
||||
.ok_or(AssertionError::ext(x.clone(), "is not a decimal digit"))?
|
||||
.into(),
|
||||
})
|
||||
})?;
|
||||
Ok(number.into())
|
||||
});
|
||||
});
|
||||
|
||||
61
src/external/conv/parse_uint.rs
vendored
61
src/external/conv/parse_uint.rs
vendored
@@ -1,40 +1,47 @@
|
||||
use std::fmt::Debug;
|
||||
|
||||
use chumsky::Parser;
|
||||
|
||||
use std::fmt::Debug;
|
||||
|
||||
use crate::external::{litconv::with_lit, assertion_error::AssertionError};
|
||||
use crate::representations::{interpreted::ExprInst, Literal};
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
use crate::external::assertion_error::AssertionError;
|
||||
use crate::external::litconv::with_lit;
|
||||
use crate::parse::int_parser;
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::representations::Literal;
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
|
||||
/// Parse a number
|
||||
///
|
||||
/// Parse an unsigned integer. Accepts the same formats Orchid does. If the
|
||||
/// input is a number, floors it.
|
||||
///
|
||||
/// Next state: [ParseUint0]
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct ParseUint1;
|
||||
externfn_impl!(ParseUint1, |_: &Self, x: ExprInst| Ok(ParseUint0{x}));
|
||||
externfn_impl!(ParseUint1, |_: &Self, x: ExprInst| Ok(ParseUint0 { x }));
|
||||
|
||||
/// Applied ParseUint function
|
||||
///
|
||||
/// Prev state: [ParseUint1]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct ParseUint0{ x: ExprInst }
|
||||
pub struct ParseUint0 {
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(ParseUint0, x);
|
||||
atomic_impl!(ParseUint0, |Self{ x }: &Self, _| {
|
||||
let uint = with_lit(x, |l| Ok(match l {
|
||||
Literal::Str(s) => {
|
||||
let parser = int_parser();
|
||||
parser.parse(s.as_str())
|
||||
.map_err(|_| AssertionError::ext(x.clone(), "cannot be parsed into an unsigned int"))?
|
||||
}
|
||||
Literal::Num(n) => n.floor() as u64,
|
||||
Literal::Uint(i) => *i,
|
||||
Literal::Char(char) => char.to_digit(10)
|
||||
.ok_or(AssertionError::ext(x.clone(), "is not a decimal digit"))?
|
||||
.into()
|
||||
}))?;
|
||||
atomic_impl!(ParseUint0, |Self { x }: &Self, _| {
|
||||
let uint = with_lit(x, |l| {
|
||||
Ok(match l {
|
||||
Literal::Str(s) => {
|
||||
let parser = int_parser();
|
||||
parser.parse(s.as_str()).map_err(|_| {
|
||||
AssertionError::ext(
|
||||
x.clone(),
|
||||
"cannot be parsed into an unsigned int",
|
||||
)
|
||||
})?
|
||||
},
|
||||
Literal::Num(n) => n.floor() as u64,
|
||||
Literal::Uint(i) => *i,
|
||||
Literal::Char(char) => char
|
||||
.to_digit(10)
|
||||
.ok_or(AssertionError::ext(x.clone(), "is not a decimal digit"))?
|
||||
.into(),
|
||||
})
|
||||
})?;
|
||||
Ok(uint.into())
|
||||
});
|
||||
});
|
||||
|
||||
35
src/external/conv/to_string.rs
vendored
35
src/external/conv/to_string.rs
vendored
@@ -1,31 +1,32 @@
|
||||
|
||||
use std::fmt::Debug;
|
||||
|
||||
use crate::external::litconv::with_lit;
|
||||
use crate::representations::{interpreted::ExprInst, Literal};
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::representations::Literal;
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
|
||||
/// ToString a clause
|
||||
///
|
||||
/// Convert a literal to a string using Rust's conversions for floats, chars and
|
||||
/// uints respectively
|
||||
///
|
||||
/// Next state: [ToString0]
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct ToString1;
|
||||
externfn_impl!(ToString1, |_: &Self, x: ExprInst| Ok(ToString0{x}));
|
||||
externfn_impl!(ToString1, |_: &Self, x: ExprInst| Ok(ToString0 { x }));
|
||||
|
||||
/// Applied ToString function
|
||||
///
|
||||
/// Prev state: [ToString1]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct ToString0{ x: ExprInst }
|
||||
pub struct ToString0 {
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(ToString0, x);
|
||||
atomic_impl!(ToString0, |Self{ x }: &Self, _| {
|
||||
let string = with_lit(x, |l| Ok(match l {
|
||||
Literal::Char(c) => c.to_string(),
|
||||
Literal::Uint(i) => i.to_string(),
|
||||
Literal::Num(n) => n.to_string(),
|
||||
Literal::Str(s) => s.clone()
|
||||
}))?;
|
||||
atomic_impl!(ToString0, |Self { x }: &Self, _| {
|
||||
let string = with_lit(x, |l| {
|
||||
Ok(match l {
|
||||
Literal::Char(c) => c.to_string(),
|
||||
Literal::Uint(i) => i.to_string(),
|
||||
Literal::Num(n) => n.to_string(),
|
||||
Literal::Str(s) => s.clone(),
|
||||
})
|
||||
})?;
|
||||
Ok(string.into())
|
||||
});
|
||||
|
||||
29
src/external/cpsio/debug.rs
vendored
29
src/external/cpsio/debug.rs
vendored
@@ -3,31 +3,30 @@ use std::fmt::Debug;
|
||||
use crate::foreign::{Atomic, AtomicReturn};
|
||||
use crate::interner::InternedDisplay;
|
||||
use crate::interpreter::Context;
|
||||
use crate::{externfn_impl, atomic_defaults};
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::{atomic_defaults, externfn_impl};
|
||||
|
||||
/// Debug function
|
||||
///
|
||||
/// Next state: [Debug0]
|
||||
|
||||
/// Print and return whatever expression is in the argument without normalizing
|
||||
/// it.
|
||||
///
|
||||
/// Next state: [Debug1]
|
||||
#[derive(Clone)]
|
||||
pub struct Debug2;
|
||||
externfn_impl!(Debug2, |_: &Self, x: ExprInst| Ok(Debug1{x}));
|
||||
|
||||
/// Partially applied Print function
|
||||
///
|
||||
/// Prev state: [Debug1]
|
||||
externfn_impl!(Debug2, |_: &Self, x: ExprInst| Ok(Debug1 { x }));
|
||||
|
||||
/// Prev state: [Debug2]
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Debug1{ x: ExprInst }
|
||||
pub struct Debug1 {
|
||||
x: ExprInst,
|
||||
}
|
||||
impl Atomic for Debug1 {
|
||||
atomic_defaults!();
|
||||
fn run(&self, ctx: Context) -> crate::foreign::AtomicResult {
|
||||
println!("{}", self.x.bundle(&ctx.interner));
|
||||
Ok(AtomicReturn{
|
||||
println!("{}", self.x.bundle(ctx.interner));
|
||||
Ok(AtomicReturn {
|
||||
clause: self.x.expr().clause.clone(),
|
||||
gas: ctx.gas.map(|g| g - 1),
|
||||
inert: false
|
||||
inert: false,
|
||||
})
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
45
src/external/cpsio/io.rs
vendored
45
src/external/cpsio/io.rs
vendored
@@ -1,35 +1,40 @@
|
||||
use std::io::{self, Write, stdin};
|
||||
use std::io::{self, Write};
|
||||
|
||||
use crate::{representations::{interpreted::{ExprInst, Clause}, Primitive, Literal}, atomic_inert, interpreter::{HandlerParm, HandlerRes}, unwrap_or, external::runtime_error::RuntimeError};
|
||||
use crate::external::runtime_error::RuntimeError;
|
||||
use crate::interpreter::{HandlerParm, HandlerRes};
|
||||
use crate::representations::interpreted::{Clause, ExprInst};
|
||||
use crate::representations::{Literal, Primitive};
|
||||
use crate::{atomic_inert, unwrap_or};
|
||||
|
||||
/// An IO command to be handled by the host application.
|
||||
#[derive(Clone, Debug)]
|
||||
pub enum IO {
|
||||
Print(String, ExprInst),
|
||||
Readline(ExprInst)
|
||||
Readline(ExprInst),
|
||||
}
|
||||
atomic_inert!(IO);
|
||||
|
||||
/// Default command handler for IO actions
|
||||
pub fn handle(effect: HandlerParm) -> HandlerRes {
|
||||
let io: &IO = unwrap_or!(
|
||||
effect.as_any().downcast_ref();
|
||||
return Err(effect)
|
||||
);
|
||||
match io {
|
||||
// Downcast command
|
||||
let io: &IO = unwrap_or!(effect.as_any().downcast_ref(); Err(effect)?);
|
||||
// Interpret and execute
|
||||
Ok(match io {
|
||||
IO::Print(str, cont) => {
|
||||
print!("{}", str);
|
||||
io::stdout().flush().unwrap();
|
||||
Ok(Ok(cont.clone()))
|
||||
io::stdout()
|
||||
.flush()
|
||||
.map_err(|e| RuntimeError::ext(e.to_string(), "writing to stdout"))?;
|
||||
cont.clone()
|
||||
},
|
||||
IO::Readline(cont) => {
|
||||
let mut buf = String::new();
|
||||
if let Err(e) = stdin().read_line(&mut buf) {
|
||||
return Ok(Err(RuntimeError::ext(e.to_string(), "reading from stdin")));
|
||||
}
|
||||
io::stdin()
|
||||
.read_line(&mut buf)
|
||||
.map_err(|e| RuntimeError::ext(e.to_string(), "reading from stdin"))?;
|
||||
buf.pop();
|
||||
Ok(Ok(Clause::Apply {
|
||||
f: cont.clone(),
|
||||
x: Clause::P(Primitive::Literal(Literal::Str(buf))).wrap()
|
||||
}.wrap()))
|
||||
}
|
||||
}
|
||||
}
|
||||
let x = Clause::P(Primitive::Literal(Literal::Str(buf))).wrap();
|
||||
Clause::Apply { f: cont.clone(), x }.wrap()
|
||||
},
|
||||
})
|
||||
}
|
||||
|
||||
15
src/external/cpsio/mod.rs
vendored
15
src/external/cpsio/mod.rs
vendored
@@ -1,18 +1,19 @@
|
||||
use crate::{interner::Interner, pipeline::ConstTree};
|
||||
use crate::interner::Interner;
|
||||
use crate::pipeline::ConstTree;
|
||||
|
||||
mod debug;
|
||||
mod io;
|
||||
mod panic;
|
||||
mod print;
|
||||
mod readline;
|
||||
mod debug;
|
||||
mod panic;
|
||||
mod io;
|
||||
|
||||
pub use io::{IO, handle};
|
||||
pub use io::{handle, IO};
|
||||
|
||||
pub fn cpsio(i: &Interner) -> ConstTree {
|
||||
ConstTree::tree([
|
||||
(i.i("print"), ConstTree::xfn(print::Print2)),
|
||||
(i.i("readline"), ConstTree::xfn(readline::Readln2)),
|
||||
(i.i("debug"), ConstTree::xfn(debug::Debug2)),
|
||||
(i.i("panic"), ConstTree::xfn(panic::Panic1))
|
||||
(i.i("panic"), ConstTree::xfn(panic::Panic1)),
|
||||
])
|
||||
}
|
||||
}
|
||||
|
||||
24
src/external/cpsio/panic.rs
vendored
24
src/external/cpsio/panic.rs
vendored
@@ -1,23 +1,29 @@
|
||||
use std::fmt::Display;
|
||||
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
use crate::external::litconv::with_str;
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::foreign::ExternError;
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
|
||||
/// Takes a message, returns an [ExternError] unconditionally.
|
||||
///
|
||||
/// Next state: [Panic0]
|
||||
#[derive(Clone)]
|
||||
pub struct Panic1;
|
||||
externfn_impl!(Panic1, |_: &Self, x: ExprInst| Ok(Panic0{ x }));
|
||||
externfn_impl!(Panic1, |_: &Self, x: ExprInst| Ok(Panic0 { x }));
|
||||
|
||||
/// Prev state: [Panic1]
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Panic0{ x: ExprInst }
|
||||
pub struct Panic0 {
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(Panic0, x);
|
||||
atomic_impl!(Panic0, |Self{ x }: &Self, _| {
|
||||
with_str(x, |s| {
|
||||
Err(OrchidPanic(s.clone()).into_extern())
|
||||
})
|
||||
atomic_impl!(Panic0, |Self { x }: &Self, _| {
|
||||
with_str(x, |s| Err(OrchidPanic(s.clone()).into_extern()))
|
||||
});
|
||||
|
||||
/// An unrecoverable error in Orchid land. Of course, because Orchid is lazy, it
|
||||
/// only applies to the expressions that use the one that generated it.
|
||||
pub struct OrchidPanic(String);
|
||||
|
||||
impl Display for OrchidPanic {
|
||||
@@ -26,4 +32,4 @@ impl Display for OrchidPanic {
|
||||
}
|
||||
}
|
||||
|
||||
impl ExternError for OrchidPanic {}
|
||||
impl ExternError for OrchidPanic {}
|
||||
|
||||
37
src/external/cpsio/print.rs
vendored
37
src/external/cpsio/print.rs
vendored
@@ -1,43 +1,40 @@
|
||||
use std::fmt::Debug;
|
||||
|
||||
use super::io::IO;
|
||||
use crate::external::litconv::with_str;
|
||||
use crate::foreign::{Atomic, AtomicResult, AtomicReturn};
|
||||
use crate::interpreter::Context;
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl, atomic_defaults};
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::{atomic_defaults, atomic_impl, atomic_redirect, externfn_impl};
|
||||
|
||||
use super::io::IO;
|
||||
|
||||
/// Print function
|
||||
///
|
||||
/// Wrap a string and the continuation into an [IO] event to be evaluated by the
|
||||
/// embedder.
|
||||
///
|
||||
/// Next state: [Print1]
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct Print2;
|
||||
externfn_impl!(Print2, |_: &Self, x: ExprInst| Ok(Print1{x}));
|
||||
externfn_impl!(Print2, |_: &Self, x: ExprInst| Ok(Print1 { x }));
|
||||
|
||||
/// Partially applied Print function
|
||||
///
|
||||
/// Prev state: [Print2]; Next state: [Print0]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Print1{ x: ExprInst }
|
||||
pub struct Print1 {
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(Print1, x);
|
||||
atomic_impl!(Print1);
|
||||
externfn_impl!(Print1, |this: &Self, x: ExprInst| {
|
||||
with_str(&this.x, |s| {
|
||||
Ok(Print0{ s: s.clone(), x })
|
||||
})
|
||||
with_str(&this.x, |s| Ok(Print0 { s: s.clone(), x }))
|
||||
});
|
||||
|
||||
/// Prev state: [Print1]
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Print0{ s: String, x: ExprInst }
|
||||
pub struct Print0 {
|
||||
s: String,
|
||||
x: ExprInst,
|
||||
}
|
||||
impl Atomic for Print0 {
|
||||
atomic_defaults!();
|
||||
fn run(&self, ctx: Context) -> AtomicResult {
|
||||
Ok(AtomicReturn::from_data(
|
||||
IO::Print(self.s.clone(), self.x.clone()),
|
||||
ctx
|
||||
))
|
||||
Ok(AtomicReturn::from_data(IO::Print(self.s.clone(), self.x.clone()), ctx))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
29
src/external/cpsio/readline.rs
vendored
29
src/external/cpsio/readline.rs
vendored
@@ -1,32 +1,27 @@
|
||||
use std::fmt::Debug;
|
||||
|
||||
use super::io::IO;
|
||||
use crate::foreign::{Atomic, AtomicResult, AtomicReturn};
|
||||
use crate::interpreter::Context;
|
||||
use crate::{externfn_impl, atomic_defaults};
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::{atomic_defaults, externfn_impl};
|
||||
|
||||
use super::io::IO;
|
||||
|
||||
/// Readln function
|
||||
///
|
||||
/// Create an [IO] event that reads a line from standard input and calls the
|
||||
/// continuation with it.
|
||||
///
|
||||
/// Next state: [Readln1]
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct Readln2;
|
||||
externfn_impl!(Readln2, |_: &Self, x: ExprInst| Ok(Readln1{x}));
|
||||
|
||||
/// Partially applied Readln function
|
||||
///
|
||||
/// Prev state: [Readln2]; Next state: [Readln0]
|
||||
externfn_impl!(Readln2, |_: &Self, x: ExprInst| Ok(Readln1 { x }));
|
||||
|
||||
/// Prev state: [Readln2]
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Readln1{ x: ExprInst }
|
||||
pub struct Readln1 {
|
||||
x: ExprInst,
|
||||
}
|
||||
impl Atomic for Readln1 {
|
||||
atomic_defaults!();
|
||||
fn run(&self, ctx: Context) -> AtomicResult {
|
||||
Ok(AtomicReturn::from_data(
|
||||
IO::Readline(self.x.clone()),
|
||||
ctx
|
||||
))
|
||||
Ok(AtomicReturn::from_data(IO::Readline(self.x.clone()), ctx))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
31
src/external/litconv.rs
vendored
31
src/external/litconv.rs
vendored
@@ -1,34 +1,45 @@
|
||||
use std::rc::Rc;
|
||||
|
||||
use crate::foreign::ExternError;
|
||||
use crate::external::assertion_error::AssertionError;
|
||||
use crate::foreign::ExternError;
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::representations::Literal;
|
||||
|
||||
pub fn with_lit<T>(x: &ExprInst,
|
||||
predicate: impl FnOnce(&Literal) -> Result<T, Rc<dyn ExternError>>
|
||||
/// Tries to cast the [ExprInst] as a [Literal], calls the provided function on
|
||||
/// it if successful. Returns a generic [AssertionError] if not.
|
||||
pub fn with_lit<T>(
|
||||
x: &ExprInst,
|
||||
predicate: impl FnOnce(&Literal) -> Result<T, Rc<dyn ExternError>>,
|
||||
) -> Result<T, Rc<dyn ExternError>> {
|
||||
x.with_literal(predicate)
|
||||
.map_err(|()| AssertionError::ext(x.clone(), "a literal value"))
|
||||
.and_then(|r| r)
|
||||
}
|
||||
|
||||
pub fn with_str<T>(x: &ExprInst,
|
||||
predicate: impl FnOnce(&String) -> Result<T, Rc<dyn ExternError>>
|
||||
/// Like [with_lit] but also unwraps [Literal::Str]
|
||||
pub fn with_str<T>(
|
||||
x: &ExprInst,
|
||||
predicate: impl FnOnce(&String) -> Result<T, Rc<dyn ExternError>>,
|
||||
) -> Result<T, Rc<dyn ExternError>> {
|
||||
with_lit(x, |l| {
|
||||
if let Literal::Str(s) = l {predicate(&s)} else {
|
||||
if let Literal::Str(s) = l {
|
||||
predicate(s)
|
||||
} else {
|
||||
AssertionError::fail(x.clone(), "a string")?
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
pub fn with_uint<T>(x: &ExprInst,
|
||||
predicate: impl FnOnce(u64) -> Result<T, Rc<dyn ExternError>>
|
||||
/// Like [with_lit] but also unwraps [Literal::Uint]
|
||||
pub fn with_uint<T>(
|
||||
x: &ExprInst,
|
||||
predicate: impl FnOnce(u64) -> Result<T, Rc<dyn ExternError>>,
|
||||
) -> Result<T, Rc<dyn ExternError>> {
|
||||
with_lit(x, |l| {
|
||||
if let Literal::Uint(u) = l {predicate(*u)} else {
|
||||
if let Literal::Uint(u) = l {
|
||||
predicate(*u)
|
||||
} else {
|
||||
AssertionError::fail(x.clone(), "an uint")?
|
||||
}
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
14
src/external/mod.rs
vendored
14
src/external/mod.rs
vendored
@@ -1,11 +1,11 @@
|
||||
mod num;
|
||||
mod assertion_error;
|
||||
pub mod std;
|
||||
mod conv;
|
||||
mod str;
|
||||
mod cpsio;
|
||||
mod runtime_error;
|
||||
mod bool;
|
||||
mod conv;
|
||||
mod cpsio;
|
||||
mod litconv;
|
||||
mod num;
|
||||
mod runtime_error;
|
||||
pub mod std;
|
||||
mod str;
|
||||
|
||||
pub use cpsio::{IO, handle};
|
||||
pub use cpsio::{handle, IO};
|
||||
|
||||
7
src/external/num/mod.rs
vendored
7
src/external/num/mod.rs
vendored
@@ -2,7 +2,8 @@ mod numeric;
|
||||
pub mod operators;
|
||||
pub use numeric::Numeric;
|
||||
|
||||
use crate::{interner::Interner, pipeline::ConstTree};
|
||||
use crate::interner::Interner;
|
||||
use crate::pipeline::ConstTree;
|
||||
|
||||
pub fn num(i: &Interner) -> ConstTree {
|
||||
ConstTree::tree([
|
||||
@@ -10,6 +11,6 @@ pub fn num(i: &Interner) -> ConstTree {
|
||||
(i.i("subtract"), ConstTree::xfn(operators::subtract::Subtract2)),
|
||||
(i.i("multiply"), ConstTree::xfn(operators::multiply::Multiply2)),
|
||||
(i.i("divide"), ConstTree::xfn(operators::divide::Divide2)),
|
||||
(i.i("remainder"), ConstTree::xfn(operators::remainder::Remainder2))
|
||||
(i.i("remainder"), ConstTree::xfn(operators::remainder::Remainder2)),
|
||||
])
|
||||
}
|
||||
}
|
||||
|
||||
47
src/external/num/numeric.rs
vendored
47
src/external/num/numeric.rs
vendored
@@ -1,4 +1,4 @@
|
||||
use std::ops::{Add, Sub, Mul, Div, Rem};
|
||||
use std::ops::{Add, Div, Mul, Rem, Sub};
|
||||
use std::rc::Rc;
|
||||
|
||||
use ordered_float::NotNan;
|
||||
@@ -6,21 +6,21 @@ use ordered_float::NotNan;
|
||||
use crate::external::assertion_error::AssertionError;
|
||||
use crate::external::litconv::with_lit;
|
||||
use crate::foreign::ExternError;
|
||||
use crate::representations::Literal;
|
||||
use crate::representations::Primitive;
|
||||
use crate::representations::interpreted::{Clause, ExprInst};
|
||||
use crate::representations::{Literal, Primitive};
|
||||
|
||||
/// A number, either floating point or unsigned int, visible to Orchid.
|
||||
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
|
||||
pub enum Numeric {
|
||||
Uint(u64),
|
||||
Num(NotNan<f64>)
|
||||
Num(NotNan<f64>),
|
||||
}
|
||||
|
||||
impl Numeric {
|
||||
/// Wrap a f64 in a Numeric
|
||||
///
|
||||
///
|
||||
/// # Panics
|
||||
///
|
||||
///
|
||||
/// if the value is NaN or Infinity
|
||||
fn num<T: Into<f64>>(value: T) -> Self {
|
||||
let f = value.into();
|
||||
@@ -36,9 +36,9 @@ impl Add for Numeric {
|
||||
match (self, rhs) {
|
||||
(Numeric::Uint(a), Numeric::Uint(b)) => Numeric::Uint(a + b),
|
||||
(Numeric::Num(a), Numeric::Num(b)) => Numeric::num(a + b),
|
||||
(Numeric::Uint(a), Numeric::Num(b)) |
|
||||
(Numeric::Num(b), Numeric::Uint(a))
|
||||
=> Numeric::num::<f64>(a as f64 + *b)
|
||||
(Numeric::Uint(a), Numeric::Num(b))
|
||||
| (Numeric::Num(b), Numeric::Uint(a)) =>
|
||||
Numeric::num::<f64>(a as f64 + *b),
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -49,11 +49,10 @@ impl Sub for Numeric {
|
||||
fn sub(self, rhs: Self) -> Self::Output {
|
||||
match (self, rhs) {
|
||||
(Numeric::Uint(a), Numeric::Uint(b)) if b <= a => Numeric::Uint(a - b),
|
||||
(Numeric::Uint(a), Numeric::Uint(b))
|
||||
=> Numeric::num(a as f64 - b as f64),
|
||||
(Numeric::Uint(a), Numeric::Uint(b)) => Numeric::num(a as f64 - b as f64),
|
||||
(Numeric::Num(a), Numeric::Num(b)) => Numeric::num(a - b),
|
||||
(Numeric::Uint(a), Numeric::Num(b)) => Numeric::num(a as f64 - *b),
|
||||
(Numeric::Num(a), Numeric::Uint(b)) => Numeric::num(*a - b as f64)
|
||||
(Numeric::Num(a), Numeric::Uint(b)) => Numeric::num(*a - b as f64),
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -65,9 +64,9 @@ impl Mul for Numeric {
|
||||
match (self, rhs) {
|
||||
(Numeric::Uint(a), Numeric::Uint(b)) => Numeric::Uint(a * b),
|
||||
(Numeric::Num(a), Numeric::Num(b)) => Numeric::num(a * b),
|
||||
(Numeric::Uint(a), Numeric::Num(b)) |
|
||||
(Numeric::Num(b), Numeric::Uint(a))
|
||||
=> Numeric::Num(NotNan::new(a as f64).unwrap() * b)
|
||||
(Numeric::Uint(a), Numeric::Num(b))
|
||||
| (Numeric::Num(b), Numeric::Uint(a)) =>
|
||||
Numeric::Num(NotNan::new(a as f64).unwrap() * b),
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -90,7 +89,7 @@ impl Rem for Numeric {
|
||||
(Numeric::Uint(a), Numeric::Uint(b)) => Numeric::Uint(a % b),
|
||||
(Numeric::Num(a), Numeric::Num(b)) => Numeric::num(a % b),
|
||||
(Numeric::Uint(a), Numeric::Num(b)) => Numeric::num(a as f64 % *b),
|
||||
(Numeric::Num(a), Numeric::Uint(b)) => Numeric::num(*a % b as f64)
|
||||
(Numeric::Num(a), Numeric::Uint(b)) => Numeric::num(*a % b as f64),
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -101,7 +100,7 @@ impl TryFrom<ExprInst> for Numeric {
|
||||
with_lit(&value.clone(), |l| match l {
|
||||
Literal::Uint(i) => Ok(Numeric::Uint(*i)),
|
||||
Literal::Num(n) => Ok(Numeric::Num(*n)),
|
||||
_ => AssertionError::fail(value, "an integer or number")?
|
||||
_ => AssertionError::fail(value, "an integer or number")?,
|
||||
})
|
||||
}
|
||||
}
|
||||
@@ -110,7 +109,7 @@ impl From<Numeric> for Clause {
|
||||
fn from(value: Numeric) -> Self {
|
||||
Clause::P(Primitive::Literal(match value {
|
||||
Numeric::Uint(i) => Literal::Uint(i),
|
||||
Numeric::Num(n) => Literal::Num(n)
|
||||
Numeric::Num(n) => Literal::Num(n),
|
||||
}))
|
||||
}
|
||||
}
|
||||
@@ -119,16 +118,16 @@ impl From<Numeric> for String {
|
||||
fn from(value: Numeric) -> Self {
|
||||
match value {
|
||||
Numeric::Uint(i) => i.to_string(),
|
||||
Numeric::Num(n) => n.to_string()
|
||||
Numeric::Num(n) => n.to_string(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl Into<f64> for Numeric {
|
||||
fn into(self) -> f64 {
|
||||
match self {
|
||||
impl From<Numeric> for f64 {
|
||||
fn from(val: Numeric) -> Self {
|
||||
match val {
|
||||
Numeric::Num(n) => *n,
|
||||
Numeric::Uint(i) => i as f64
|
||||
Numeric::Uint(i) => i as f64,
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
31
src/external/num/operators/add.rs
vendored
31
src/external/num/operators/add.rs
vendored
@@ -1,39 +1,36 @@
|
||||
|
||||
use super::super::Numeric;
|
||||
|
||||
use std::fmt::Debug;
|
||||
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
use super::super::Numeric;
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
|
||||
/// Add function
|
||||
///
|
||||
/// Adds two numbers
|
||||
///
|
||||
/// Next state: [Add1]
|
||||
#[derive(Clone)]
|
||||
pub struct Add2;
|
||||
externfn_impl!(Add2, |_: &Self, x: ExprInst| Ok(Add1{x}));
|
||||
externfn_impl!(Add2, |_: &Self, x: ExprInst| Ok(Add1 { x }));
|
||||
|
||||
/// Partially applied Add function
|
||||
///
|
||||
/// Prev state: [Add2]; Next state: [Add0]
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Add1{ x: ExprInst }
|
||||
pub struct Add1 {
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(Add1, x);
|
||||
atomic_impl!(Add1);
|
||||
externfn_impl!(Add1, |this: &Self, x: ExprInst| {
|
||||
let a: Numeric = this.x.clone().try_into()?;
|
||||
Ok(Add0{ a, x })
|
||||
Ok(Add0 { a, x })
|
||||
});
|
||||
|
||||
/// Fully applied Add function.
|
||||
///
|
||||
/// Prev state: [Add1]
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Add0 { a: Numeric, x: ExprInst }
|
||||
pub struct Add0 {
|
||||
a: Numeric,
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(Add0, x);
|
||||
atomic_impl!(Add0, |Self{ a, x }: &Self, _| {
|
||||
atomic_impl!(Add0, |Self { a, x }: &Self, _| {
|
||||
let b: Numeric = x.clone().try_into()?;
|
||||
Ok((*a + b).into())
|
||||
});
|
||||
|
||||
|
||||
|
||||
33
src/external/num/operators/divide.rs
vendored
33
src/external/num/operators/divide.rs
vendored
@@ -1,40 +1,37 @@
|
||||
|
||||
use super::super::Numeric;
|
||||
|
||||
use std::fmt::Debug;
|
||||
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
use super::super::Numeric;
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
|
||||
/// Divide function
|
||||
///
|
||||
/// Divides two numbers
|
||||
///
|
||||
/// Next state: [Divide1]
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct Divide2;
|
||||
externfn_impl!(Divide2, |_: &Self, x: ExprInst| Ok(Divide1{x}));
|
||||
externfn_impl!(Divide2, |_: &Self, x: ExprInst| Ok(Divide1 { x }));
|
||||
|
||||
/// Partially applied Divide function
|
||||
///
|
||||
/// Prev state: [Divide2]; Next state: [Divide0]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Divide1{ x: ExprInst }
|
||||
pub struct Divide1 {
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(Divide1, x);
|
||||
atomic_impl!(Divide1);
|
||||
externfn_impl!(Divide1, |this: &Self, x: ExprInst| {
|
||||
let a: Numeric = this.x.clone().try_into()?;
|
||||
Ok(Divide0{ a, x })
|
||||
Ok(Divide0 { a, x })
|
||||
});
|
||||
|
||||
/// Fully applied Divide function.
|
||||
///
|
||||
/// Prev state: [Divide1]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Divide0 { a: Numeric, x: ExprInst }
|
||||
pub struct Divide0 {
|
||||
a: Numeric,
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(Divide0, x);
|
||||
atomic_impl!(Divide0, |Self{ a, x }: &Self, _| {
|
||||
atomic_impl!(Divide0, |Self { a, x }: &Self, _| {
|
||||
let b: Numeric = x.clone().try_into()?;
|
||||
Ok((*a / b).into())
|
||||
});
|
||||
});
|
||||
|
||||
2
src/external/num/operators/mod.rs
vendored
2
src/external/num/operators/mod.rs
vendored
@@ -2,4 +2,4 @@ pub mod add;
|
||||
pub mod divide;
|
||||
pub mod multiply;
|
||||
pub mod remainder;
|
||||
pub mod subtract;
|
||||
pub mod subtract;
|
||||
|
||||
34
src/external/num/operators/multiply.rs
vendored
34
src/external/num/operators/multiply.rs
vendored
@@ -1,40 +1,36 @@
|
||||
|
||||
use super::super::Numeric;
|
||||
|
||||
use std::fmt::Debug;
|
||||
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
use super::super::Numeric;
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
|
||||
/// Multiply function
|
||||
///
|
||||
/// Multiplies two numbers
|
||||
///
|
||||
/// Next state: [Multiply1]
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct Multiply2;
|
||||
externfn_impl!(Multiply2, |_: &Self, x: ExprInst| Ok(Multiply1{x}));
|
||||
externfn_impl!(Multiply2, |_: &Self, x: ExprInst| Ok(Multiply1 { x }));
|
||||
|
||||
/// Partially applied Multiply function
|
||||
///
|
||||
/// Prev state: [Multiply2]; Next state: [Multiply0]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Multiply1{ x: ExprInst }
|
||||
pub struct Multiply1 {
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(Multiply1, x);
|
||||
atomic_impl!(Multiply1);
|
||||
externfn_impl!(Multiply1, |this: &Self, x: ExprInst| {
|
||||
let a: Numeric = this.x.clone().try_into()?;
|
||||
Ok(Multiply0{ a, x })
|
||||
Ok(Multiply0 { a, x })
|
||||
});
|
||||
|
||||
/// Fully applied Multiply function.
|
||||
///
|
||||
/// Prev state: [Multiply1]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Multiply0 { a: Numeric, x: ExprInst }
|
||||
pub struct Multiply0 {
|
||||
a: Numeric,
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(Multiply0, x);
|
||||
atomic_impl!(Multiply0, |Self{ a, x }: &Self, _| {
|
||||
atomic_impl!(Multiply0, |Self { a, x }: &Self, _| {
|
||||
let b: Numeric = x.clone().try_into()?;
|
||||
Ok((*a * b).into())
|
||||
});
|
||||
});
|
||||
|
||||
34
src/external/num/operators/remainder.rs
vendored
34
src/external/num/operators/remainder.rs
vendored
@@ -1,40 +1,36 @@
|
||||
|
||||
use super::super::Numeric;
|
||||
|
||||
use std::fmt::Debug;
|
||||
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
use super::super::Numeric;
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
|
||||
/// Remainder function
|
||||
///
|
||||
/// Takes the modulus of two numbers.
|
||||
///
|
||||
/// Next state: [Remainder1]
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct Remainder2;
|
||||
externfn_impl!(Remainder2, |_: &Self, x: ExprInst| Ok(Remainder1{x}));
|
||||
externfn_impl!(Remainder2, |_: &Self, x: ExprInst| Ok(Remainder1 { x }));
|
||||
|
||||
/// Partially applied Remainder function
|
||||
///
|
||||
/// Prev state: [Remainder2]; Next state: [Remainder0]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Remainder1{ x: ExprInst }
|
||||
pub struct Remainder1 {
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(Remainder1, x);
|
||||
atomic_impl!(Remainder1);
|
||||
externfn_impl!(Remainder1, |this: &Self, x: ExprInst| {
|
||||
let a: Numeric = this.x.clone().try_into()?;
|
||||
Ok(Remainder0{ a, x })
|
||||
Ok(Remainder0 { a, x })
|
||||
});
|
||||
|
||||
/// Fully applied Remainder function.
|
||||
///
|
||||
/// Prev state: [Remainder1]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Remainder0 { a: Numeric, x: ExprInst }
|
||||
pub struct Remainder0 {
|
||||
a: Numeric,
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(Remainder0, x);
|
||||
atomic_impl!(Remainder0, |Self{ a, x }: &Self, _| {
|
||||
atomic_impl!(Remainder0, |Self { a, x }: &Self, _| {
|
||||
let b: Numeric = x.clone().try_into()?;
|
||||
Ok((*a % b).into())
|
||||
});
|
||||
});
|
||||
|
||||
34
src/external/num/operators/subtract.rs
vendored
34
src/external/num/operators/subtract.rs
vendored
@@ -1,40 +1,36 @@
|
||||
|
||||
use super::super::Numeric;
|
||||
|
||||
use std::fmt::Debug;
|
||||
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
use super::super::Numeric;
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
|
||||
/// Subtract function
|
||||
///
|
||||
/// Subtracts two numbers
|
||||
///
|
||||
/// Next state: [Subtract1]
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct Subtract2;
|
||||
externfn_impl!(Subtract2, |_: &Self, x: ExprInst| Ok(Subtract1{x}));
|
||||
externfn_impl!(Subtract2, |_: &Self, x: ExprInst| Ok(Subtract1 { x }));
|
||||
|
||||
/// Partially applied Subtract function
|
||||
///
|
||||
/// Prev state: [Subtract2]; Next state: [Subtract0]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Subtract1{ x: ExprInst }
|
||||
pub struct Subtract1 {
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(Subtract1, x);
|
||||
atomic_impl!(Subtract1);
|
||||
externfn_impl!(Subtract1, |this: &Self, x: ExprInst| {
|
||||
let a: Numeric = this.x.clone().try_into()?;
|
||||
Ok(Subtract0{ a, x })
|
||||
Ok(Subtract0 { a, x })
|
||||
});
|
||||
|
||||
/// Fully applied Subtract function.
|
||||
///
|
||||
/// Prev state: [Subtract1]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Subtract0 { a: Numeric, x: ExprInst }
|
||||
pub struct Subtract0 {
|
||||
a: Numeric,
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(Subtract0, x);
|
||||
atomic_impl!(Subtract0, |Self{ a, x }: &Self, _| {
|
||||
atomic_impl!(Subtract0, |Self { a, x }: &Self, _| {
|
||||
let b: Numeric = x.clone().try_into()?;
|
||||
Ok((*a - b).into())
|
||||
});
|
||||
});
|
||||
|
||||
15
src/external/runtime_error.rs
vendored
15
src/external/runtime_error.rs
vendored
@@ -1,7 +1,9 @@
|
||||
use std::{rc::Rc, fmt::Display};
|
||||
use std::fmt::Display;
|
||||
use std::rc::Rc;
|
||||
|
||||
use crate::foreign::ExternError;
|
||||
|
||||
/// Some external event prevented the operation from succeeding
|
||||
#[derive(Clone)]
|
||||
pub struct RuntimeError {
|
||||
message: String,
|
||||
@@ -9,12 +11,15 @@ pub struct RuntimeError {
|
||||
}
|
||||
|
||||
impl RuntimeError {
|
||||
pub fn fail(message: String, operation: &'static str) -> Result<!, Rc<dyn ExternError>> {
|
||||
return Err(Self { message, operation }.into_extern())
|
||||
pub fn fail(
|
||||
message: String,
|
||||
operation: &'static str,
|
||||
) -> Result<!, Rc<dyn ExternError>> {
|
||||
return Err(Self { message, operation }.into_extern());
|
||||
}
|
||||
|
||||
pub fn ext(message: String, operation: &'static str) -> Rc<dyn ExternError> {
|
||||
return Self { message, operation }.into_extern()
|
||||
return Self { message, operation }.into_extern();
|
||||
}
|
||||
}
|
||||
|
||||
@@ -24,4 +29,4 @@ impl Display for RuntimeError {
|
||||
}
|
||||
}
|
||||
|
||||
impl ExternError for RuntimeError{}
|
||||
impl ExternError for RuntimeError {}
|
||||
|
||||
17
src/external/std.rs
vendored
17
src/external/std.rs
vendored
@@ -1,16 +1,11 @@
|
||||
use crate::pipeline::ConstTree;
|
||||
use crate::interner::Interner;
|
||||
|
||||
use super::bool::bool;
|
||||
use super::cpsio::cpsio;
|
||||
use super::conv::conv;
|
||||
use super::str::str;
|
||||
use super::cpsio::cpsio;
|
||||
use super::num::num;
|
||||
use super::str::str;
|
||||
use crate::interner::Interner;
|
||||
use crate::pipeline::ConstTree;
|
||||
|
||||
pub fn std(i: &Interner) -> ConstTree {
|
||||
cpsio(i)
|
||||
+ conv(i)
|
||||
+ bool(i)
|
||||
+ str(i)
|
||||
+ num(i)
|
||||
}
|
||||
cpsio(i) + conv(i) + bool(i) + str(i) + num(i)
|
||||
}
|
||||
|
||||
41
src/external/str/char_at.rs
vendored
41
src/external/str/char_at.rs
vendored
@@ -2,41 +2,44 @@ use std::fmt::Debug;
|
||||
|
||||
use crate::external::litconv::{with_str, with_uint};
|
||||
use crate::external::runtime_error::RuntimeError;
|
||||
use crate::representations::interpreted::{Clause, ExprInst};
|
||||
use crate::representations::{Literal, Primitive};
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
use crate::representations::interpreted::{Clause, ExprInst};
|
||||
|
||||
/// CharAt function
|
||||
///
|
||||
/// Takes an uint and a string, finds the char in a string at a 0-based index
|
||||
///
|
||||
/// Next state: [CharAt1]
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct CharAt2;
|
||||
externfn_impl!(CharAt2, |_: &Self, x: ExprInst| Ok(CharAt1{x}));
|
||||
externfn_impl!(CharAt2, |_: &Self, x: ExprInst| Ok(CharAt1 { x }));
|
||||
|
||||
/// Partially applied CharAt function
|
||||
///
|
||||
/// Prev state: [CharAt2]; Next state: [CharAt0]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct CharAt1{ x: ExprInst }
|
||||
pub struct CharAt1 {
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(CharAt1, x);
|
||||
atomic_impl!(CharAt1);
|
||||
externfn_impl!(CharAt1, |this: &Self, x: ExprInst| {
|
||||
with_str(&this.x, |s| Ok(CharAt0{ s: s.clone(), x }))
|
||||
with_str(&this.x, |s| Ok(CharAt0 { s: s.clone(), x }))
|
||||
});
|
||||
|
||||
/// Fully applied CharAt function.
|
||||
///
|
||||
/// Prev state: [CharAt1]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct CharAt0 { s: String, x: ExprInst }
|
||||
pub struct CharAt0 {
|
||||
s: String,
|
||||
x: ExprInst,
|
||||
}
|
||||
atomic_redirect!(CharAt0, x);
|
||||
atomic_impl!(CharAt0, |Self{ s, x }: &Self, _| {
|
||||
with_uint(x, |i| if let Some(c) = s.chars().nth(i as usize) {
|
||||
Ok(Clause::P(Primitive::Literal(Literal::Char(c))))
|
||||
} else {
|
||||
RuntimeError::fail("Character index out of bounds".to_string(), "indexing string")?
|
||||
atomic_impl!(CharAt0, |Self { s, x }: &Self, _| {
|
||||
with_uint(x, |i| {
|
||||
if let Some(c) = s.chars().nth(i as usize) {
|
||||
Ok(Clause::P(Primitive::Literal(Literal::Char(c))))
|
||||
} else {
|
||||
RuntimeError::fail(
|
||||
"Character index out of bounds".to_string(),
|
||||
"indexing string",
|
||||
)?
|
||||
}
|
||||
})
|
||||
});
|
||||
|
||||
36
src/external/str/concatenate.rs
vendored
36
src/external/str/concatenate.rs
vendored
@@ -1,39 +1,37 @@
|
||||
use std::fmt::Debug;
|
||||
|
||||
use crate::external::litconv::with_str;
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
use crate::representations::{Primitive, Literal};
|
||||
use crate::representations::interpreted::{Clause, ExprInst};
|
||||
use crate::representations::{Literal, Primitive};
|
||||
use crate::{atomic_impl, atomic_redirect, externfn_impl};
|
||||
|
||||
/// Concatenate function
|
||||
///
|
||||
/// Concatenates two strings
|
||||
///
|
||||
/// Next state: [Concatenate1]
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct Concatenate2;
|
||||
externfn_impl!(Concatenate2, |_: &Self, c: ExprInst| Ok(Concatenate1{c}));
|
||||
externfn_impl!(Concatenate2, |_: &Self, c: ExprInst| Ok(Concatenate1 { c }));
|
||||
|
||||
/// Partially applied Concatenate function
|
||||
///
|
||||
/// Prev state: [Concatenate2]; Next state: [Concatenate0]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Concatenate1{ c: ExprInst }
|
||||
pub struct Concatenate1 {
|
||||
c: ExprInst,
|
||||
}
|
||||
atomic_redirect!(Concatenate1, c);
|
||||
atomic_impl!(Concatenate1);
|
||||
externfn_impl!(Concatenate1, |this: &Self, c: ExprInst| {
|
||||
with_str(&this.c, |a| Ok(Concatenate0{ a: a.clone(), c }))
|
||||
with_str(&this.c, |a| Ok(Concatenate0 { a: a.clone(), c }))
|
||||
});
|
||||
|
||||
/// Fully applied Concatenate function.
|
||||
///
|
||||
/// Prev state: [Concatenate1]
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Concatenate0 { a: String, c: ExprInst }
|
||||
pub struct Concatenate0 {
|
||||
a: String,
|
||||
c: ExprInst,
|
||||
}
|
||||
atomic_redirect!(Concatenate0, c);
|
||||
atomic_impl!(Concatenate0, |Self{ a, c }: &Self, _| {
|
||||
with_str(c, |b| Ok(Clause::P(Primitive::Literal(
|
||||
Literal::Str(a.to_owned() + b)
|
||||
))))
|
||||
atomic_impl!(Concatenate0, |Self { a, c }: &Self, _| {
|
||||
with_str(c, |b| {
|
||||
Ok(Clause::P(Primitive::Literal(Literal::Str(a.to_owned() + b))))
|
||||
})
|
||||
});
|
||||
|
||||
14
src/external/str/mod.rs
vendored
14
src/external/str/mod.rs
vendored
@@ -1,10 +1,12 @@
|
||||
mod concatenate;
|
||||
mod char_at;
|
||||
mod concatenate;
|
||||
|
||||
use crate::{pipeline::ConstTree, interner::Interner};
|
||||
use crate::interner::Interner;
|
||||
use crate::pipeline::ConstTree;
|
||||
|
||||
pub fn str(i: &Interner) -> ConstTree {
|
||||
ConstTree::tree([
|
||||
(i.i("concatenate"), ConstTree::xfn(concatenate::Concatenate2))
|
||||
])
|
||||
}
|
||||
ConstTree::tree([(
|
||||
i.i("concatenate"),
|
||||
ConstTree::xfn(concatenate::Concatenate2),
|
||||
)])
|
||||
}
|
||||
|
||||
@@ -1,30 +1,25 @@
|
||||
use std::any::Any;
|
||||
use std::error::Error;
|
||||
use std::fmt::{Display, Debug};
|
||||
use std::fmt::{Debug, Display};
|
||||
use std::hash::Hash;
|
||||
use std::rc::Rc;
|
||||
|
||||
use dyn_clone::DynClone;
|
||||
|
||||
use crate::interpreter::{RuntimeError, Context};
|
||||
|
||||
use crate::representations::Primitive;
|
||||
use crate::interpreter::{Context, RuntimeError};
|
||||
pub use crate::representations::interpreted::Clause;
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::representations::Primitive;
|
||||
|
||||
pub struct AtomicReturn {
|
||||
pub clause: Clause,
|
||||
pub gas: Option<usize>,
|
||||
pub inert: bool
|
||||
pub inert: bool,
|
||||
}
|
||||
impl AtomicReturn {
|
||||
/// Wrap an inert atomic for delivery to the supervisor
|
||||
pub fn from_data<D: Atomic>(d: D, c: Context) -> Self {
|
||||
AtomicReturn {
|
||||
clause: d.to_atom_cls(),
|
||||
gas: c.gas,
|
||||
inert: false
|
||||
}
|
||||
AtomicReturn { clause: d.to_atom_cls(), gas: c.gas, inert: false }
|
||||
}
|
||||
}
|
||||
|
||||
@@ -36,7 +31,9 @@ pub type RcExpr = ExprInst;
|
||||
|
||||
pub trait ExternError: Display {
|
||||
fn into_extern(self) -> Rc<dyn ExternError>
|
||||
where Self: 'static + Sized {
|
||||
where
|
||||
Self: 'static + Sized,
|
||||
{
|
||||
Rc::new(self)
|
||||
}
|
||||
}
|
||||
@@ -58,14 +55,19 @@ pub trait ExternFn: DynClone {
|
||||
fn hash(&self, state: &mut dyn std::hash::Hasher) {
|
||||
state.write_str(self.name())
|
||||
}
|
||||
fn to_xfn_cls(self) -> Clause where Self: Sized + 'static {
|
||||
fn to_xfn_cls(self) -> Clause
|
||||
where
|
||||
Self: Sized + 'static,
|
||||
{
|
||||
Clause::P(Primitive::ExternFn(Box::new(self)))
|
||||
}
|
||||
}
|
||||
|
||||
impl Eq for dyn ExternFn {}
|
||||
impl PartialEq for dyn ExternFn {
|
||||
fn eq(&self, other: &Self) -> bool { self.name() == other.name() }
|
||||
fn eq(&self, other: &Self) -> bool {
|
||||
self.name() == other.name()
|
||||
}
|
||||
}
|
||||
impl Hash for dyn ExternFn {
|
||||
fn hash<H: std::hash::Hasher>(&self, state: &mut H) {
|
||||
@@ -78,10 +80,16 @@ impl Debug for dyn ExternFn {
|
||||
}
|
||||
}
|
||||
|
||||
pub trait Atomic: Any + Debug + DynClone where Self: 'static {
|
||||
pub trait Atomic: Any + Debug + DynClone
|
||||
where
|
||||
Self: 'static,
|
||||
{
|
||||
fn as_any(&self) -> &dyn Any;
|
||||
fn run(&self, ctx: Context) -> AtomicResult;
|
||||
fn to_atom_cls(self) -> Clause where Self: Sized {
|
||||
fn to_atom_cls(self) -> Clause
|
||||
where
|
||||
Self: Sized,
|
||||
{
|
||||
Clause::P(Primitive::Atom(Atom(Box::new(self))))
|
||||
}
|
||||
}
|
||||
@@ -105,10 +113,11 @@ impl Atom {
|
||||
pub fn try_cast<T: Atomic>(&self) -> Result<&T, ()> {
|
||||
self.data().as_any().downcast_ref().ok_or(())
|
||||
}
|
||||
pub fn is<T: 'static>(&self) -> bool { self.data().as_any().is::<T>() }
|
||||
pub fn is<T: 'static>(&self) -> bool {
|
||||
self.data().as_any().is::<T>()
|
||||
}
|
||||
pub fn cast<T: 'static>(&self) -> &T {
|
||||
self.data().as_any().downcast_ref()
|
||||
.expect("Type mismatch on Atom::cast")
|
||||
self.data().as_any().downcast_ref().expect("Type mismatch on Atom::cast")
|
||||
}
|
||||
pub fn run(&self, ctx: Context) -> AtomicResult {
|
||||
self.0.run(ctx)
|
||||
@@ -125,4 +134,4 @@ impl Debug for Atom {
|
||||
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
|
||||
write!(f, "##ATOM[{:?}]##", self.data())
|
||||
}
|
||||
}
|
||||
}
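A detail worth noting in the trait definitions above: `Eq` and `Hash` for `dyn ExternFn` are both delegated to `name()`, so boxed external functions can be compared and stored in hash maps even though they are trait objects. A standalone sketch of that technique with an illustrative trait:

```rust
use std::hash::{Hash, Hasher};

// Illustrative stand-in for ExternFn: an object-safe trait identified by name.
trait NamedFn {
  fn name(&self) -> &str;
  fn call(&self, x: i64) -> i64;
}

// Equality and hashing for the trait object delegate to the name, so two
// boxed functions compare equal exactly when they report the same name.
impl PartialEq for dyn NamedFn {
  fn eq(&self, other: &Self) -> bool {
    self.name() == other.name()
  }
}
impl Eq for dyn NamedFn {}
impl Hash for dyn NamedFn {
  fn hash<H: Hasher>(&self, state: &mut H) {
    self.name().hash(state)
  }
}

struct Double;
impl NamedFn for Double {
  fn name(&self) -> &str { "Double" }
  fn call(&self, x: i64) -> i64 { x * 2 }
}

fn main() {
  let a: Box<dyn NamedFn> = Box::new(Double);
  let b: Box<dyn NamedFn> = Box::new(Double);
  assert!(*a == *b); // equal because the names match
  println!("{} -> {}", a.name(), a.call(21));
}
```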
|
||||
|
||||
@@ -1,13 +1,16 @@
|
||||
#[allow(unused)] // for the doc comments
|
||||
use crate::foreign::Atomic;
|
||||
|
||||
/// A macro that generates the straightforward, syntactically invariant part of implementing
|
||||
/// [Atomic]. Implemented fns are [Atomic::as_any], [Atomic::definitely_eq] and [Atomic::hash].
|
||||
///
|
||||
/// A macro that generates the straightforward, syntactically invariant part of
|
||||
/// implementing [Atomic]. Implemented fns are [Atomic::as_any],
|
||||
/// [Atomic::definitely_eq] and [Atomic::hash].
|
||||
///
|
||||
/// It depends on [Eq] and [Hash]
|
||||
#[macro_export]
|
||||
macro_rules! atomic_defaults {
|
||||
() => {
|
||||
fn as_any(&self) -> &dyn std::any::Any { self }
|
||||
fn as_any(&self) -> &dyn std::any::Any {
|
||||
self
|
||||
}
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,33 +1,39 @@
|
||||
#[allow(unused)] // for the doc comments
|
||||
use crate::representations::Primitive;
|
||||
#[allow(unused)] // for the doc comments
|
||||
use crate::foreign::{Atomic, ExternFn};
|
||||
#[allow(unused)] // for the doc comments
|
||||
use std::any::Any;
|
||||
#[allow(unused)] // for the doc comments
|
||||
use dyn_clone::DynClone;
|
||||
#[allow(unused)] // for the doc comments
|
||||
use std::fmt::Debug;
|
||||
|
||||
/// A macro that generates implementations of [Atomic] to simplify the development of
|
||||
/// external bindings for Orchid.
|
||||
///
|
||||
/// The macro depends on implementations of [AsRef<Clause>] and [From<(&Self, Clause)>] for
|
||||
/// extracting the clause to be processed and then reconstructing the [Atomic]. Naturally,
|
||||
/// supertraits of [Atomic] are also dependencies. These are [Any], [Debug] and [DynClone].
|
||||
///
|
||||
/// The simplest form just requires the typename to be specified. This additionally depends on an
|
||||
/// implementation of [ExternFn] because after the clause is fully normalized it returns `Self`
|
||||
/// wrapped in a [Primitive::ExternFn]. It is intended for intermediary
|
||||
/// stages of the function where validation and the next state are defined in [ExternFn::apply].
|
||||
///
|
||||
#[allow(unused)] // for the doc comments
|
||||
use dyn_clone::DynClone;
|
||||
|
||||
#[allow(unused)] // for the doc comments
|
||||
use crate::foreign::{Atomic, ExternFn};
|
||||
#[allow(unused)] // for the doc comments
|
||||
use crate::representations::Primitive;
|
||||
|
||||
/// A macro that generates implementations of [Atomic] to simplify the
|
||||
/// development of external bindings for Orchid.
|
||||
///
|
||||
/// The macro depends on implementations of [AsRef<Clause>] and [From<(&Self,
|
||||
/// Clause)>] for extracting the clause to be processed and then reconstructing
|
||||
/// the [Atomic]. Naturally, supertraits of [Atomic] are also dependencies.
|
||||
/// These are [Any], [Debug] and [DynClone].
|
||||
///
|
||||
/// The simplest form just requires the typename to be specified. This
|
||||
/// additionally depends on an implementation of [ExternFn] because after the
|
||||
/// clause is fully normalized it returns `Self` wrapped in a
|
||||
/// [Primitive::ExternFn]. It is intended for intermediary stages of the
|
||||
/// function where validation and the next state are defined in
|
||||
/// [ExternFn::apply].
|
||||
///
|
||||
/// ```
|
||||
/// atomic_impl!(Multiply1)
|
||||
/// ```
|
||||
///
|
||||
/// The last stage of the function should use the extended form of the macro which takes an
|
||||
/// additional closure to explicitly describe what happens when the argument is fully processed.
|
||||
///
|
||||
///
|
||||
/// The last stage of the function should use the extended form of the macro
|
||||
/// which takes an additional closure to explicitly describe what happens when
|
||||
/// the argument is fully processed.
|
||||
///
|
||||
/// ```
|
||||
/// // excerpt from the exact implementation of Multiply
|
||||
/// atomic_impl!(Multiply0, |Self(a, cls): &Self| {
|
||||
@@ -35,45 +41,44 @@ use std::fmt::Debug;
|
||||
/// Ok(*a * b).into())
|
||||
/// })
|
||||
/// ```
|
||||
///
|
||||
#[macro_export]
|
||||
macro_rules! atomic_impl {
|
||||
($typ:ident) => {
|
||||
atomic_impl!{$typ, |this: &Self, _: $crate::interpreter::Context| {
|
||||
atomic_impl! {$typ, |this: &Self, _: $crate::interpreter::Context| {
|
||||
use $crate::foreign::ExternFn;
|
||||
Ok(this.clone().to_xfn_cls())
|
||||
}}
|
||||
};
|
||||
($typ:ident, $next_phase:expr) => {
|
||||
impl $crate::foreign::Atomic for $typ {
|
||||
$crate::atomic_defaults!{}
|
||||
$crate::atomic_defaults! {}
|
||||
|
||||
fn run(&self, ctx: $crate::interpreter::Context)
|
||||
-> $crate::foreign::AtomicResult
|
||||
{
|
||||
fn run(
|
||||
&self,
|
||||
ctx: $crate::interpreter::Context,
|
||||
) -> $crate::foreign::AtomicResult {
|
||||
// extract the expression
|
||||
let expr = <Self as
|
||||
AsRef<$crate::foreign::RcExpr>
|
||||
>::as_ref(self).clone();
|
||||
let expr =
|
||||
<Self as AsRef<$crate::foreign::RcExpr>>::as_ref(self).clone();
|
||||
// run the expression
|
||||
let ret = $crate::interpreter::run(expr, ctx.clone())?;
|
||||
let $crate::interpreter::Return{ gas, state, inert } = ret;
|
||||
let $crate::interpreter::Return { gas, state, inert } = ret;
|
||||
// rebuild the atomic
|
||||
let next_self = <Self as
|
||||
From<(&Self, $crate::foreign::RcExpr)>
|
||||
>::from((self, state));
|
||||
let next_self =
|
||||
<Self as From<(&Self, $crate::foreign::RcExpr)>>::from((self, state));
|
||||
// branch off or wrap up
|
||||
let clause = if inert {
|
||||
match ($next_phase)(&next_self, ctx) {
|
||||
let closure = $next_phase;
|
||||
match closure(&next_self, ctx) {
|
||||
Ok(r) => r,
|
||||
Err(e) => return Err(
|
||||
$crate::interpreter::RuntimeError::Extern(e)
|
||||
)
|
||||
Err(e) => return Err($crate::interpreter::RuntimeError::Extern(e)),
|
||||
}
|
||||
} else { next_self.to_atom_cls() };
|
||||
} else {
|
||||
next_self.to_atom_cls()
|
||||
};
|
||||
// package and return
|
||||
Ok($crate::foreign::AtomicReturn{ clause, gas, inert: false })
|
||||
Ok($crate::foreign::AtomicReturn { clause, gas, inert: false })
|
||||
}
|
||||
}
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,29 +1,33 @@
|
||||
#[allow(unused)] // for the doc comments
|
||||
use crate::foreign::Atomic;
|
||||
#[allow(unused)] // for the doc comments
|
||||
use std::any::Any;
|
||||
#[allow(unused)] // for the doc comments
|
||||
use dyn_clone::DynClone;
|
||||
#[allow(unused)] // for the doc comments
|
||||
use std::fmt::Debug;
|
||||
|
||||
/// Implement [Atomic] for a structure that cannot be transformed any further. This would be optimal
|
||||
/// for atomics encapsulating raw data. [Atomic] depends on [Any], [Debug] and [DynClone].
|
||||
#[allow(unused)] // for the doc comments
|
||||
use dyn_clone::DynClone;
|
||||
|
||||
#[allow(unused)] // for the doc comments
|
||||
use crate::foreign::Atomic;
|
||||
|
||||
/// Implement [Atomic] for a structure that cannot be transformed any further.
|
||||
/// This would be optimal for atomics encapsulating raw data. [Atomic] depends
|
||||
/// on [Any], [Debug] and [DynClone].
|
||||
#[macro_export]
|
||||
macro_rules! atomic_inert {
|
||||
($typ:ident) => {
|
||||
impl $crate::foreign::Atomic for $typ {
|
||||
$crate::atomic_defaults!{}
|
||||
$crate::atomic_defaults! {}
|
||||
|
||||
fn run(&self, ctx: $crate::interpreter::Context)
|
||||
-> $crate::foreign::AtomicResult
|
||||
{
|
||||
Ok($crate::foreign::AtomicReturn{
|
||||
fn run(
|
||||
&self,
|
||||
ctx: $crate::interpreter::Context,
|
||||
) -> $crate::foreign::AtomicResult {
|
||||
Ok($crate::foreign::AtomicReturn {
|
||||
clause: self.clone().to_atom_cls(),
|
||||
gas: ctx.gas,
|
||||
inert: true
|
||||
inert: true,
|
||||
})
|
||||
}
|
||||
}
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,30 +1,33 @@
|
||||
#[allow(unused)]
|
||||
use super::atomic_impl;
|
||||
|
||||
/// Implement the traits required by [atomic_impl] to redirect run_* functions to a field
|
||||
/// with a particular name.
|
||||
/// Implement the traits required by [atomic_impl] to redirect run_* functions
|
||||
/// to a field with a particular name.
|
||||
#[macro_export]
|
||||
macro_rules! atomic_redirect {
|
||||
($typ:ident) => {
|
||||
impl AsRef<$crate::foreign::RcExpr> for $typ {
|
||||
fn as_ref(&self) -> &Clause { &self.0 }
|
||||
fn as_ref(&self) -> &Clause {
|
||||
&self.0
|
||||
}
|
||||
}
|
||||
impl From<(&Self, $crate::foreign::RcExpr)> for $typ {
|
||||
fn from((old, clause): (&Self, Clause)) -> Self {
|
||||
Self{ 0: clause, ..old.clone() }
|
||||
Self { 0: clause, ..old.clone() }
|
||||
}
|
||||
}
|
||||
};
|
||||
($typ:ident, $field:ident) => {
|
||||
impl AsRef<$crate::foreign::RcExpr>
|
||||
for $typ {
|
||||
fn as_ref(&self) -> &$crate::foreign::RcExpr { &self.$field }
|
||||
impl AsRef<$crate::foreign::RcExpr> for $typ {
|
||||
fn as_ref(&self) -> &$crate::foreign::RcExpr {
|
||||
&self.$field
|
||||
}
|
||||
}
|
||||
impl From<(&Self, $crate::foreign::RcExpr)>
|
||||
for $typ {
|
||||
impl From<(&Self, $crate::foreign::RcExpr)> for $typ {
|
||||
#[allow(clippy::needless_update)]
|
||||
fn from((old, $field): (&Self, $crate::foreign::RcExpr)) -> Self {
|
||||
Self{ $field, ..old.clone() }
|
||||
Self { $field, ..old.clone() }
|
||||
}
|
||||
}
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,41 +1,47 @@
|
||||
#[allow(unused)] // for the doc comments
|
||||
use crate::{atomic_impl, atomic_redirect};
|
||||
#[allow(unused)] // for the doc comments
|
||||
use crate::representations::Primitive;
|
||||
#[allow(unused)] // for the doc comments
|
||||
use crate::foreign::{Atomic, ExternFn};
|
||||
#[allow(unused)] // for the doc comments
|
||||
use std::any::Any;
|
||||
#[allow(unused)] // for the doc comments
|
||||
use std::fmt::Debug;
|
||||
#[allow(unused)] // for the doc comments
|
||||
use std::hash::Hash;
|
||||
|
||||
#[allow(unused)] // for the doc comments
|
||||
use dyn_clone::DynClone;
|
||||
#[allow(unused)] // for the doc comments
|
||||
use std::fmt::Debug;
|
||||
|
||||
/// Implement [ExternFn] with a closure that produces an [Atomic] from a reference to self
|
||||
/// and a closure. This can be used in conjunction with [atomic_impl] and [atomic_redirect]
|
||||
/// to normalize the argument automatically before using it.
|
||||
#[allow(unused)] // for the doc comments
|
||||
use crate::foreign::{Atomic, ExternFn};
|
||||
#[allow(unused)] // for the doc comments
|
||||
use crate::representations::Primitive;
|
||||
#[allow(unused)] // for the doc comments
|
||||
use crate::{atomic_impl, atomic_redirect};
|
||||
|
||||
/// Implement [ExternFn] with a closure that produces an [Atomic] from a
|
||||
/// reference to self and a closure. This can be used in conjunction with
|
||||
/// [atomic_impl] and [atomic_redirect] to normalize the argument automatically
|
||||
/// before using it.
|
||||
#[macro_export]
|
||||
macro_rules! externfn_impl {
|
||||
($typ:ident, $next_atomic:expr) => {
|
||||
impl $crate::foreign::ExternFn for $typ {
|
||||
fn name(&self) -> &str {stringify!($typ)}
|
||||
fn apply(&self,
|
||||
fn name(&self) -> &str {
|
||||
stringify!($typ)
|
||||
}
|
||||
fn apply(
|
||||
&self,
|
||||
arg: $crate::foreign::RcExpr,
|
||||
_ctx: $crate::interpreter::Context
|
||||
_ctx: $crate::interpreter::Context,
|
||||
) -> $crate::foreign::XfnResult {
|
||||
match ($next_atomic)(self, arg) { // ? casts the result but we want to strictly forward it
|
||||
Ok(r) => Ok(
|
||||
$crate::representations::interpreted::Clause::P(
|
||||
$crate::representations::Primitive::Atom(
|
||||
$crate::foreign::Atom::new(r)
|
||||
)
|
||||
)
|
||||
),
|
||||
Err(e) => Err(e)
|
||||
let closure = $next_atomic;
|
||||
match closure(self, arg) {
|
||||
// ? casts the result but we want to strictly forward it
|
||||
Ok(r) => Ok($crate::representations::interpreted::Clause::P(
|
||||
$crate::representations::Primitive::Atom(
|
||||
$crate::foreign::Atom::new(r),
|
||||
),
|
||||
)),
|
||||
Err(e) => Err(e),
|
||||
}
|
||||
}
|
||||
}
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
@@ -2,4 +2,4 @@ mod atomic_defaults;
|
||||
mod atomic_impl;
|
||||
mod atomic_inert;
|
||||
mod atomic_redirect;
|
||||
mod externfn_impl;
|
||||
mod externfn_impl;
|
||||
|
||||
@@ -6,13 +6,14 @@ use crate::interner::Interner;
|
||||
/// A variant of [std::fmt::Display] for objects that contain interned
|
||||
/// strings and therefore can only be stringified in the presence of a
|
||||
/// string interner
|
||||
///
|
||||
///
|
||||
/// The functions defined here are suffixed to distinguish them from
|
||||
/// the ones in Display and ToString respectively, because Rust can't
|
||||
/// identify functions based on arity
|
||||
pub trait InternedDisplay {
|
||||
/// formats the value using the given formatter and string interner
|
||||
fn fmt_i(&self,
|
||||
fn fmt_i(
|
||||
&self,
|
||||
f: &mut std::fmt::Formatter<'_>,
|
||||
i: &Interner,
|
||||
) -> std::fmt::Result;
|
||||
@@ -28,26 +29,31 @@ pub trait InternedDisplay {
|
||||
buf
|
||||
}
|
||||
|
||||
fn bundle<'a>(&'a self, interner: &'a Interner)
|
||||
-> DisplayBundle<'a, Self>
|
||||
{
|
||||
fn bundle<'a>(&'a self, interner: &'a Interner) -> DisplayBundle<'a, Self> {
|
||||
DisplayBundle { interner, data: self }
|
||||
}
|
||||
}
|
||||
|
||||
impl<T> InternedDisplay for T where T: Display {
|
||||
fn fmt_i(&self, f: &mut std::fmt::Formatter<'_>, _i: &Interner) -> std::fmt::Result {
|
||||
<Self as Display>::fmt(&self, f)
|
||||
impl<T> InternedDisplay for T
|
||||
where
|
||||
T: Display,
|
||||
{
|
||||
fn fmt_i(
|
||||
&self,
|
||||
f: &mut std::fmt::Formatter<'_>,
|
||||
_i: &Interner,
|
||||
) -> std::fmt::Result {
|
||||
<Self as Display>::fmt(self, f)
|
||||
}
|
||||
}
|
||||
|
||||
pub struct DisplayBundle<'a, T: InternedDisplay + ?Sized> {
|
||||
interner: &'a Interner,
|
||||
data: &'a T
|
||||
data: &'a T,
|
||||
}
|
||||
|
||||
impl<'a, T: InternedDisplay> Display for DisplayBundle<'a, T> {
|
||||
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
|
||||
self.data.fmt_i(f, self.interner)
|
||||
}
|
||||
}
|
||||
}
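`InternedDisplay` and `DisplayBundle` above implement a general pattern: a value that needs extra context to be printed exposes a context-taking formatter, and a small wrapper pairs it with that context so the ordinary `Display` trait works. A standalone sketch with a plain lookup table standing in for the interner:

```rust
use std::fmt::{self, Display, Formatter};

// Standalone sketch of the DisplayBundle idea; names are illustrative.
struct Names(Vec<String>);

// Refers to an entry in Names, much like an interned token.
struct Ref(usize);

impl Ref {
  // The context-taking formatter, analogous to fmt_i.
  fn fmt_with(&self, f: &mut Formatter<'_>, names: &Names) -> fmt::Result {
    write!(f, "{}", names.0[self.0])
  }
  // Pair the value with its context so it can be printed normally.
  fn bundle<'a>(&'a self, names: &'a Names) -> Bundled<'a> {
    Bundled { names, data: self }
  }
}

struct Bundled<'a> {
  names: &'a Names,
  data: &'a Ref,
}

impl Display for Bundled<'_> {
  fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
    self.data.fmt_with(f, self.names)
  }
}

fn main() {
  let names = Names(vec!["foo".into(), "bar".into()]);
  let r = Ref(1);
  println!("{}", r.bundle(&names)); // prints "bar"
}
```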
|
||||
|
||||
@@ -1,9 +1,21 @@
|
||||
mod display;
|
||||
mod monotype;
|
||||
mod multitype;
|
||||
mod token;
|
||||
mod display;
|
||||
|
||||
pub use display::{DisplayBundle, InternedDisplay};
|
||||
pub use monotype::TypedInterner;
|
||||
pub use multitype::Interner;
|
||||
pub use token::Token;
|
||||
pub use display::{DisplayBundle, InternedDisplay};
|
||||
pub use token::Tok;
|
||||
|
||||
/// A symbol, nsname, nname or namespaced name is a sequence of namespaces
|
||||
/// and an identifier. The [Vec] can never be empty.
|
||||
///
|
||||
/// Throughout different stages of processing, these names can be
|
||||
///
|
||||
/// - local names to be prefixed with the current module
|
||||
/// - imported names starting with a segment
|
||||
/// - ending a single import or
|
||||
/// - defined in one of the glob imported modules
|
||||
/// - absolute names
|
||||
pub type Sym = Tok<Vec<Tok<String>>>;
|
||||
|
||||
@@ -1,50 +1,54 @@
|
||||
use std::num::NonZeroU32;
|
||||
use std::cell::RefCell;
|
||||
use std::borrow::Borrow;
|
||||
use std::hash::{Hash, BuildHasher};
|
||||
use std::cell::RefCell;
|
||||
use std::hash::{BuildHasher, Hash};
|
||||
use std::num::NonZeroU32;
|
||||
|
||||
use hashbrown::HashMap;
|
||||
|
||||
use super::token::Token;
|
||||
use super::token::Tok;
|
||||
|
||||
pub struct TypedInterner<T: 'static + Eq + Hash + Clone>{
|
||||
tokens: RefCell<HashMap<&'static T, Token<T>>>,
|
||||
values: RefCell<Vec<(&'static T, bool)>>
|
||||
/// An interner for any type that implements [Borrow]. This is inspired by
|
||||
/// Lasso but much simpler, in part because not much can be known about the type.
|
||||
pub struct TypedInterner<T: 'static + Eq + Hash + Clone> {
|
||||
tokens: RefCell<HashMap<&'static T, Tok<T>>>,
|
||||
values: RefCell<Vec<(&'static T, bool)>>,
|
||||
}
|
||||
impl<T: Eq + Hash + Clone> TypedInterner<T> {
|
||||
/// Create a fresh interner instance
|
||||
pub fn new() -> Self {
|
||||
Self {
|
||||
tokens: RefCell::new(HashMap::new()),
|
||||
values: RefCell::new(Vec::new())
|
||||
values: RefCell::new(Vec::new()),
|
||||
}
|
||||
}
|
||||
|
||||
/// Intern an object, returning a token
|
||||
pub fn i<Q: ?Sized + Eq + Hash + ToOwned<Owned = T>>(&self, q: &Q)
|
||||
-> Token<T> where T: Borrow<Q>
|
||||
pub fn i<Q: ?Sized + Eq + Hash + ToOwned<Owned = T>>(&self, q: &Q) -> Tok<T>
|
||||
where
|
||||
T: Borrow<Q>,
|
||||
{
|
||||
let mut tokens = self.tokens.borrow_mut();
|
||||
let hash = compute_hash(tokens.hasher(), q);
|
||||
let raw_entry = tokens.raw_entry_mut().from_hash(hash, |k| {
|
||||
<T as Borrow<Q>>::borrow(k) == q
|
||||
});
|
||||
let raw_entry = tokens
|
||||
.raw_entry_mut()
|
||||
.from_hash(hash, |k| <T as Borrow<Q>>::borrow(k) == q);
|
||||
let kv = raw_entry.or_insert_with(|| {
|
||||
let mut values = self.values.borrow_mut();
|
||||
let uniq_key: NonZeroU32 = (values.len() as u32 + 1u32)
|
||||
.try_into().expect("can never be zero");
|
||||
let uniq_key: NonZeroU32 =
|
||||
(values.len() as u32 + 1u32).try_into().expect("can never be zero");
|
||||
let keybox = Box::new(q.to_owned());
|
||||
let keyref = Box::leak(keybox);
|
||||
values.push((keyref, true));
|
||||
let token = Token::<T>::from_id(uniq_key);
|
||||
let token = Tok::<T>::from_id(uniq_key);
|
||||
(keyref, token)
|
||||
});
|
||||
*kv.1
|
||||
}
|
||||
|
||||
/// Resolve a token, obtaining an object
|
||||
/// It is illegal to use a token obtained from one interner with another.
|
||||
pub fn r(&self, t: Token<T>) -> &T {
|
||||
/// It is illegal to use a token obtained from one interner with
|
||||
/// another.
|
||||
pub fn r(&self, t: Tok<T>) -> &T {
|
||||
let values = self.values.borrow();
|
||||
let key = t.into_usize() - 1;
|
||||
values[key].0
|
||||
@@ -52,17 +56,20 @@ impl<T: Eq + Hash + Clone> TypedInterner<T> {
|
||||
|
||||
/// Intern a static reference without allocating the data on the heap
|
||||
#[allow(unused)]
|
||||
pub fn intern_static(&self, tref: &'static T) -> Token<T> {
|
||||
pub fn intern_static(&self, tref: &'static T) -> Tok<T> {
|
||||
let mut tokens = self.tokens.borrow_mut();
|
||||
let token = *tokens.raw_entry_mut().from_key(tref)
|
||||
.or_insert_with(|| {
|
||||
let mut values = self.values.borrow_mut();
|
||||
let uniq_key: NonZeroU32 = (values.len() as u32 + 1u32)
|
||||
.try_into().expect("can never be zero");
|
||||
values.push((tref, false));
|
||||
let token = Token::<T>::from_id(uniq_key);
|
||||
(tref, token)
|
||||
}).1;
|
||||
let token = *tokens
|
||||
.raw_entry_mut()
|
||||
.from_key(tref)
|
||||
.or_insert_with(|| {
|
||||
let mut values = self.values.borrow_mut();
|
||||
let uniq_key: NonZeroU32 =
|
||||
(values.len() as u32 + 1u32).try_into().expect("can never be zero");
|
||||
values.push((tref, false));
|
||||
let token = Tok::<T>::from_id(uniq_key);
|
||||
(tref, token)
|
||||
})
|
||||
.1;
|
||||
token
|
||||
}
|
||||
}
|
||||
@@ -74,10 +81,10 @@ impl<T: Eq + Hash + Clone> Drop for TypedInterner<T> {
|
||||
// which negates the need for unsafe here
|
||||
let mut values = self.values.borrow_mut();
|
||||
for (item, owned) in values.drain(..) {
|
||||
if !owned {continue}
|
||||
unsafe {
|
||||
Box::from_raw((item as *const T).cast_mut())
|
||||
};
|
||||
if !owned {
|
||||
continue;
|
||||
}
|
||||
unsafe { Box::from_raw((item as *const T).cast_mut()) };
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -85,10 +92,10 @@ impl<T: Eq + Hash + Clone> Drop for TypedInterner<T> {
|
||||
/// Helper function to compute hashes outside a hashmap
|
||||
fn compute_hash(
|
||||
hash_builder: &impl BuildHasher,
|
||||
key: &(impl Hash + ?Sized)
|
||||
key: &(impl Hash + ?Sized),
|
||||
) -> u64 {
|
||||
use core::hash::Hasher;
|
||||
let mut state = hash_builder.build_hasher();
|
||||
key.hash(&mut state);
|
||||
state.finish()
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,14 +1,17 @@
|
||||
use std::any::{Any, TypeId};
|
||||
use std::borrow::Borrow;
|
||||
use std::cell::{RefCell, RefMut};
|
||||
use std::any::{TypeId, Any};
|
||||
use std::hash::Hash;
|
||||
use std::rc::Rc;
|
||||
|
||||
use hashbrown::HashMap;
|
||||
|
||||
use super::monotype::TypedInterner;
|
||||
use super::token::Token;
|
||||
use super::token::Tok;
|
||||
|
||||
/// A collection of interners based on their type. Allows to intern any object
|
||||
/// that implements [ToOwned]. Objects of the same type are stored together in a
|
||||
/// [TypedInterner].
|
||||
pub struct Interner {
|
||||
interners: RefCell<HashMap<TypeId, Rc<dyn Any>>>,
|
||||
}
|
||||
@@ -17,56 +20,59 @@ impl Interner {
|
||||
Self { interners: RefCell::new(HashMap::new()) }
|
||||
}
|
||||
|
||||
pub fn i<Q: ?Sized + Eq + Hash + ToOwned>(&self, q: &Q)
|
||||
-> Token<Q::Owned>
|
||||
where Q::Owned: 'static + Eq + Hash + Clone + Borrow<Q>
|
||||
pub fn i<Q: ?Sized + Eq + Hash + ToOwned>(&self, q: &Q) -> Tok<Q::Owned>
|
||||
where
|
||||
Q::Owned: 'static + Eq + Hash + Clone + Borrow<Q>,
|
||||
{
|
||||
let mut interners = self.interners.borrow_mut();
|
||||
let interner = get_interner(&mut interners);
|
||||
interner.i(q)
|
||||
}
|
||||
|
||||
pub fn r<T: 'static + Eq + Hash + Clone>(&self, t: Token<T>) -> &T {
|
||||
pub fn r<T: 'static + Eq + Hash + Clone>(&self, t: Tok<T>) -> &T {
|
||||
let mut interners = self.interners.borrow_mut();
|
||||
let interner = get_interner(&mut interners);
|
||||
// TODO: figure this out
|
||||
unsafe{ (interner.r(t) as *const T).as_ref().unwrap() }
|
||||
unsafe { (interner.r(t) as *const T).as_ref().unwrap() }
|
||||
}
|
||||
|
||||
/// Fully resolve
|
||||
/// TODO: make this generic over containers
|
||||
pub fn extern_vec<T: 'static + Eq + Hash + Clone>(&self,
|
||||
t: Token<Vec<Token<T>>>
|
||||
pub fn extern_vec<T: 'static + Eq + Hash + Clone>(
|
||||
&self,
|
||||
t: Tok<Vec<Tok<T>>>,
|
||||
) -> Vec<T> {
|
||||
let mut interners = self.interners.borrow_mut();
|
||||
let v_int = get_interner(&mut interners);
|
||||
let t_int = get_interner(&mut interners);
|
||||
let v = v_int.r(t);
|
||||
v.iter()
|
||||
.map(|t| t_int.r(*t))
|
||||
.cloned()
|
||||
.collect()
|
||||
v.iter().map(|t| t_int.r(*t)).cloned().collect()
|
||||
}
|
||||
|
||||
pub fn extern_all<T: 'static + Eq + Hash + Clone>(&self,
|
||||
s: &[Token<T>]
|
||||
pub fn extern_all<T: 'static + Eq + Hash + Clone>(
|
||||
&self,
|
||||
s: &[Tok<T>],
|
||||
) -> Vec<T> {
|
||||
s.iter()
|
||||
.map(|t| self.r(*t))
|
||||
.cloned()
|
||||
.collect()
|
||||
s.iter().map(|t| self.r(*t)).cloned().collect()
|
||||
}
|
||||
}
|
||||
|
||||
impl Default for Interner {
|
||||
fn default() -> Self {
|
||||
Self::new()
|
||||
}
|
||||
}
|
||||
|
||||
/// Get or create an interner for a given type.
|
||||
fn get_interner<T: 'static + Eq + Hash + Clone>(
|
||||
interners: &mut RefMut<HashMap<TypeId, Rc<dyn Any>>>
|
||||
interners: &mut RefMut<HashMap<TypeId, Rc<dyn Any>>>,
|
||||
) -> Rc<TypedInterner<T>> {
|
||||
let boxed = interners.raw_entry_mut().from_key(&TypeId::of::<T>())
|
||||
.or_insert_with(|| (
|
||||
TypeId::of::<T>(),
|
||||
Rc::new(TypedInterner::<T>::new())
|
||||
)).1.clone();
|
||||
let boxed = interners
|
||||
.raw_entry_mut()
|
||||
.from_key(&TypeId::of::<T>())
|
||||
.or_insert_with(|| (TypeId::of::<T>(), Rc::new(TypedInterner::<T>::new())))
|
||||
.1
|
||||
.clone();
|
||||
boxed.downcast().expect("the typeid is supposed to protect from this")
|
||||
}
|
||||
|
||||
@@ -94,8 +100,9 @@ mod test {
|
||||
#[allow(unused)]
|
||||
pub fn test_str_slice() {
|
||||
let interner = Interner::new();
|
||||
let key1 = interner.i(&vec!["a".to_string(), "b".to_string(), "c".to_string()]);
|
||||
let key1 =
|
||||
interner.i(&vec!["a".to_string(), "b".to_string(), "c".to_string()]);
|
||||
let key2 = interner.i(&["a", "b", "c"][..]);
|
||||
// assert_eq!(key1, key2);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,14 +1,18 @@
|
||||
use std::{num::NonZeroU32, marker::PhantomData};
|
||||
use std::cmp::PartialEq;
|
||||
use std::fmt::Debug;
|
||||
use std::hash::Hash;
|
||||
use std::marker::PhantomData;
|
||||
use std::num::NonZeroU32;
|
||||
|
||||
use std::cmp::PartialEq;
|
||||
|
||||
pub struct Token<T>{
|
||||
/// A number representing an object of type `T` stored in some interner. It is a
|
||||
/// logic error to compare tokens obtained from different interners, or to use a
|
||||
/// token with an interner other than the one that created it, but this is
|
||||
/// currently not enforced.
|
||||
pub struct Tok<T> {
|
||||
id: NonZeroU32,
|
||||
phantom_data: PhantomData<T>
|
||||
phantom_data: PhantomData<T>,
|
||||
}
|
||||
impl<T> Token<T> {
|
||||
impl<T> Tok<T> {
|
||||
pub fn from_id(id: NonZeroU32) -> Self {
|
||||
Self { id, phantom_data: PhantomData }
|
||||
}
|
||||
@@ -21,37 +25,39 @@ impl<T> Token<T> {
|
||||
}
|
||||
}
|
||||
|
||||
impl<T> Debug for Token<T> {
|
||||
impl<T> Debug for Tok<T> {
|
||||
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
|
||||
write!(f, "Token({})", self.id)
|
||||
}
|
||||
}
|
||||
|
||||
impl<T> Copy for Token<T> {}
|
||||
impl<T> Clone for Token<T> {
|
||||
impl<T> Copy for Tok<T> {}
|
||||
impl<T> Clone for Tok<T> {
|
||||
fn clone(&self) -> Self {
|
||||
Self{ id: self.id, phantom_data: PhantomData }
|
||||
Self { id: self.id, phantom_data: PhantomData }
|
||||
}
|
||||
}
|
||||
|
||||
impl<T> Eq for Token<T> {}
|
||||
impl<T> PartialEq for Token<T> {
|
||||
fn eq(&self, other: &Self) -> bool { self.id == other.id }
|
||||
impl<T> Eq for Tok<T> {}
|
||||
impl<T> PartialEq for Tok<T> {
|
||||
fn eq(&self, other: &Self) -> bool {
|
||||
self.id == other.id
|
||||
}
|
||||
}
|
||||
|
||||
impl<T> Ord for Token<T> {
|
||||
impl<T> Ord for Tok<T> {
|
||||
fn cmp(&self, other: &Self) -> std::cmp::Ordering {
|
||||
self.id.cmp(&other.id)
|
||||
}
|
||||
}
|
||||
impl<T> PartialOrd for Token<T> {
|
||||
impl<T> PartialOrd for Tok<T> {
|
||||
fn partial_cmp(&self, other: &Self) -> Option<std::cmp::Ordering> {
|
||||
Some(self.cmp(&other))
|
||||
Some(self.cmp(other))
|
||||
}
|
||||
}
|
||||
|
||||
impl<T> Hash for Token<T> {
|
||||
impl<T> Hash for Tok<T> {
|
||||
fn hash<H: std::hash::Hasher>(&self, state: &mut H) {
|
||||
state.write_u32(self.id.into())
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,103 +1,126 @@
|
||||
use super::context::Context;
|
||||
use super::error::RuntimeError;
|
||||
use super::Return;
|
||||
use crate::foreign::AtomicReturn;
|
||||
use crate::representations::Primitive;
|
||||
use crate::representations::PathSet;
|
||||
use crate::representations::interpreted::{ExprInst, Clause};
|
||||
use crate::representations::interpreted::{Clause, ExprInst};
|
||||
use crate::representations::{PathSet, Primitive};
|
||||
use crate::utils::Side;
|
||||
|
||||
use super::Return;
|
||||
use super::error::RuntimeError;
|
||||
use super::context::Context;
|
||||
|
||||
/// Process the clause at the end of the provided path.
|
||||
/// Note that paths always point to at least one target.
|
||||
/// Note also that this is not cached as a normalization step in the
|
||||
/// intermediate expressions.
|
||||
/// Process the clause at the end of the provided path. Note that paths always
|
||||
/// point to at least one target. Note also that this is not cached as a
|
||||
/// normalization step in the intermediate expressions.
|
||||
fn map_at<E>(
|
||||
path: &[Side], source: ExprInst,
|
||||
mapper: &mut impl FnMut(&Clause) -> Result<Clause, E>
|
||||
path: &[Side],
|
||||
source: ExprInst,
|
||||
mapper: &mut impl FnMut(&Clause) -> Result<Clause, E>,
|
||||
) -> Result<ExprInst, E> {
|
||||
source.try_update(|value| {
|
||||
// Pass right through lambdas
|
||||
if let Clause::Lambda { args, body } = value {
|
||||
return Ok((Clause::Lambda {
|
||||
args: args.clone(),
|
||||
body: map_at(path, body.clone(), mapper)?
|
||||
}, ()))
|
||||
}
|
||||
// If the path ends here, process the next (non-lambda) node
|
||||
let (head, tail) = if let Some(sf) = path.split_first() {sf} else {
|
||||
return Ok((mapper(value)?, ()))
|
||||
};
|
||||
// If it's an Apply, execute the next step in the path
|
||||
if let Clause::Apply { f, x } = value {
|
||||
return Ok((match head {
|
||||
Side::Left => Clause::Apply {
|
||||
f: map_at(tail, f.clone(), mapper)?,
|
||||
x: x.clone(),
|
||||
},
|
||||
Side::Right => Clause::Apply {
|
||||
f: f.clone(),
|
||||
x: map_at(tail, x.clone(), mapper)?,
|
||||
}
|
||||
}, ()))
|
||||
}
|
||||
panic!("Invalid path")
|
||||
}).map(|p| p.0)
|
||||
source
|
||||
.try_update(|value| {
|
||||
// Pass right through lambdas
|
||||
if let Clause::Lambda { args, body } = value {
|
||||
return Ok((
|
||||
Clause::Lambda {
|
||||
args: args.clone(),
|
||||
body: map_at(path, body.clone(), mapper)?,
|
||||
},
|
||||
(),
|
||||
));
|
||||
}
|
||||
// If the path ends here, process the next (non-lambda) node
|
||||
let (head, tail) = if let Some(sf) = path.split_first() {
|
||||
sf
|
||||
} else {
|
||||
return Ok((mapper(value)?, ()));
|
||||
};
|
||||
// If it's an Apply, execute the next step in the path
|
||||
if let Clause::Apply { f, x } = value {
|
||||
return Ok((
|
||||
match head {
|
||||
Side::Left => Clause::Apply {
|
||||
f: map_at(tail, f.clone(), mapper)?,
|
||||
x: x.clone(),
|
||||
},
|
||||
Side::Right => Clause::Apply {
|
||||
f: f.clone(),
|
||||
x: map_at(tail, x.clone(), mapper)?,
|
||||
},
|
||||
},
|
||||
(),
|
||||
));
|
||||
}
|
||||
panic!("Invalid path")
|
||||
})
|
||||
.map(|p| p.0)
|
||||
}
|
||||
|
||||
/// Replace the [Clause::LambdaArg] placeholders at the ends of the [PathSet]
|
||||
/// with the value in the body. Note that a path may point to multiple
|
||||
/// placeholders.
|
||||
fn substitute(paths: &PathSet, value: Clause, body: ExprInst) -> ExprInst {
|
||||
let PathSet{ steps, next } = paths;
|
||||
map_at(&steps, body, &mut |checkpoint| -> Result<Clause, !> {
|
||||
let PathSet { steps, next } = paths;
|
||||
map_at(steps, body, &mut |checkpoint| -> Result<Clause, !> {
|
||||
match (checkpoint, next) {
|
||||
(Clause::Lambda{..}, _) => unreachable!("Handled by map_at"),
|
||||
(Clause::Lambda { .. }, _) => unreachable!("Handled by map_at"),
|
||||
(Clause::Apply { f, x }, Some((left, right))) => Ok(Clause::Apply {
|
||||
f: substitute(&left, value.clone(), f.clone()),
|
||||
x: substitute(&right, value.clone(), x.clone()),
|
||||
f: substitute(left, value.clone(), f.clone()),
|
||||
x: substitute(right, value.clone(), x.clone()),
|
||||
}),
|
||||
(Clause::LambdaArg, None) => Ok(value.clone()),
|
||||
(_, None) => panic!("Substitution path ends in something other than LambdaArg"),
|
||||
(_, Some(_)) => panic!("Substitution path leads into something other than Apply"),
|
||||
(_, None) =>
|
||||
panic!("Substitution path ends in something other than LambdaArg"),
|
||||
(_, Some(_)) =>
|
||||
panic!("Substitution path leads into something other than Apply"),
|
||||
}
|
||||
}).into_ok()
|
||||
})
|
||||
.into_ok()
|
||||
}
|
||||
|
||||
/// Apply a function-like expression to a parameter.
|
||||
/// If any work is being done, gas will be deducted.
|
||||
pub fn apply(
|
||||
f: ExprInst, x: ExprInst, ctx: Context
|
||||
f: ExprInst,
|
||||
x: ExprInst,
|
||||
ctx: Context,
|
||||
) -> Result<Return, RuntimeError> {
|
||||
let (state, (gas, inert)) = f.clone().try_update(|clause| match clause {
|
||||
let (state, (gas, inert)) = f.try_update(|clause| match clause {
|
||||
// apply an ExternFn or an internal function
|
||||
Clause::P(Primitive::ExternFn(f)) => {
|
||||
let clause = f.apply(x, ctx.clone())
|
||||
.map_err(|e| RuntimeError::Extern(e))?;
|
||||
let clause =
|
||||
f.apply(x, ctx.clone()).map_err(|e| RuntimeError::Extern(e))?;
|
||||
Ok((clause, (ctx.gas.map(|g| g - 1), false)))
|
||||
}
|
||||
Clause::Lambda{args, body} => Ok(if let Some(args) = args {
|
||||
},
|
||||
Clause::Lambda { args, body } => Ok(if let Some(args) = args {
|
||||
let x_cls = x.expr().clause.clone();
|
||||
let new_xpr_inst = substitute(args, x_cls, body.clone());
|
||||
let new_xpr = new_xpr_inst.expr();
|
||||
// cost of substitution
|
||||
// XXX: should this be the number of occurrences instead?
|
||||
(new_xpr.clause.clone(), (ctx.gas.map(|x| x - 1), false))
|
||||
} else {(body.expr().clause.clone(), (ctx.gas, false))}),
|
||||
} else {
|
||||
(body.expr().clause.clone(), (ctx.gas, false))
|
||||
}),
|
||||
Clause::Constant(name) => {
|
||||
let symval = if let Some(sym) = ctx.symbols.get(name) {sym.clone()}
|
||||
else { panic!("missing symbol for function {}",
|
||||
ctx.interner.extern_vec(*name).join("::")
|
||||
)};
|
||||
Ok((Clause::Apply { f: symval, x, }, (ctx.gas, false)))
|
||||
}
|
||||
Clause::P(Primitive::Atom(atom)) => { // take a step in expanding atom
|
||||
let symval = if let Some(sym) = ctx.symbols.get(name) {
|
||||
sym.clone()
|
||||
} else {
|
||||
panic!(
|
||||
"missing symbol for function {}",
|
||||
ctx.interner.extern_vec(*name).join("::")
|
||||
)
|
||||
};
|
||||
Ok((Clause::Apply { f: symval, x }, (ctx.gas, false)))
|
||||
},
|
||||
Clause::P(Primitive::Atom(atom)) => {
|
||||
// take a step in expanding atom
|
||||
let AtomicReturn { clause, gas, inert } = atom.run(ctx.clone())?;
|
||||
Ok((Clause::Apply { f: clause.wrap(), x }, (gas, inert)))
|
||||
},
|
||||
Clause::Apply{ f: fun, x: arg } => { // take a step in resolving pre-function
|
||||
Clause::Apply { f: fun, x: arg } => {
|
||||
// take a step in resolving pre-function
|
||||
let ret = apply(fun.clone(), arg.clone(), ctx.clone())?;
|
||||
let Return { state, inert, gas } = ret;
|
||||
Ok((Clause::Apply{ f: state, x }, (gas, inert)))
|
||||
Ok((Clause::Apply { f: state, x }, (gas, inert)))
|
||||
},
|
||||
_ => Err(RuntimeError::NonFunctionApplication(f.clone()))
|
||||
_ => Err(RuntimeError::NonFunctionApplication(f.clone())),
|
||||
})?;
|
||||
Ok(Return { state, gas, inert })
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,29 +1,27 @@
|
||||
use hashbrown::HashMap;
|
||||
|
||||
use crate::interner::{Interner, Sym};
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::interner::{Token, Interner};
|
||||
|
||||
/// All the data associated with an interpreter run
|
||||
#[derive(Clone)]
|
||||
pub struct Context<'a> {
|
||||
pub symbols: &'a HashMap<Token<Vec<Token<String>>>, ExprInst>,
|
||||
/// Table used to resolve constants
|
||||
pub symbols: &'a HashMap<Sym, ExprInst>,
|
||||
/// The interner used for strings internally, so external functions can deduce
|
||||
/// referenced constant names on the fly
|
||||
pub interner: &'a Interner,
|
||||
/// The number of reduction steps the interpreter can take before returning
|
||||
pub gas: Option<usize>,
|
||||
}
|
||||
|
||||
impl Context<'_> {
|
||||
pub fn is_stuck(&self, res: Option<usize>) -> bool {
|
||||
match (res, self.gas) {
|
||||
(Some(a), Some(b)) => a == b,
|
||||
(None, None) => false,
|
||||
(None, Some(_)) => panic!("gas not tracked despite limit"),
|
||||
(Some(_), None) => panic!("gas tracked without request"),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// All the data produced by an interpreter run
|
||||
#[derive(Clone)]
|
||||
pub struct Return {
|
||||
/// The new expression tree
|
||||
pub state: ExprInst,
|
||||
/// Leftover [Context::gas] if counted
|
||||
pub gas: Option<usize>,
|
||||
/// If true, the next run would not modify the expression
|
||||
pub inert: bool,
|
||||
}
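The `gas` field on `Context` and `Return` above implements a simple step budget: `None` means unlimited, `Some(n)` allows at most `n` reduction steps, and gas that comes back unchanged signals a stuck expression. A standalone sketch of that accounting, with an arbitrary step function standing in for the interpreter:

```rust
// Standalone sketch of the gas discipline: each step costs one unit,
// `None` never runs out, and the leftover budget is reported back.
fn run_with_gas(mut value: u64, mut gas: Option<usize>) -> (u64, Option<usize>) {
  // Collatz steps stand in for the interpreter's reduction steps.
  while value != 1 && gas.map(|g| g > 0).unwrap_or(true) {
    value = if value % 2 == 0 { value / 2 } else { 3 * value + 1 };
    gas = gas.map(|g| g - 1); // deduct one unit per step
  }
  (value, gas)
}

fn main() {
  // Unlimited gas: runs to the normal form (1).
  assert_eq!(run_with_gas(6, None).0, 1);
  // Limited gas: stops early and reports how much was left.
  let (v, left) = run_with_gas(6, Some(3));
  println!("stopped at {v} with {left:?} gas left");
}
```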
|
||||
|
||||
@@ -1,8 +1,8 @@
|
||||
use std::fmt::Display;
|
||||
use std::rc::Rc;
|
||||
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
use crate::foreign::ExternError;
|
||||
use crate::representations::interpreted::ExprInst;
|
||||
|
||||
/// Problems in the process of execution
|
||||
#[derive(Clone, Debug)]
|
||||
@@ -21,7 +21,8 @@ impl Display for RuntimeError {
|
||||
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
|
||||
match self {
|
||||
Self::Extern(e) => write!(f, "Error in external function: {e}"),
|
||||
Self::NonFunctionApplication(loc) => write!(f, "Primitive applied as function at {loc:?}")
|
||||
Self::NonFunctionApplication(loc) =>
|
||||
write!(f, "Primitive applied as function at {loc:?}"),
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,8 +1,8 @@
|
||||
mod apply;
|
||||
mod error;
|
||||
mod context;
|
||||
mod error;
|
||||
mod run;
|
||||
|
||||
pub use context::{Context, Return};
|
||||
pub use error::RuntimeError;
|
||||
pub use run::{run, run_handler, Handler, HandlerParm, HandlerRes};
|
||||
pub use run::{run, run_handler, Handler, HandlerParm, HandlerRes};
|
||||
|
||||
@@ -1,106 +1,155 @@
|
||||
use std::mem;
|
||||
use std::rc::Rc;
|
||||
|
||||
use crate::foreign::{AtomicReturn, Atomic, ExternError, Atom};
|
||||
use crate::representations::Primitive;
|
||||
use crate::representations::interpreted::{Clause, ExprInst};
|
||||
|
||||
use super::apply::apply;
|
||||
use super::error::RuntimeError;
|
||||
use super::context::{Context, Return};
|
||||
use super::error::RuntimeError;
|
||||
use crate::foreign::{Atom, Atomic, AtomicReturn, ExternError};
|
||||
use crate::representations::interpreted::{Clause, ExprInst};
|
||||
use crate::representations::Primitive;
|
||||
|
||||
pub fn run(
|
||||
expr: ExprInst,
|
||||
mut ctx: Context
|
||||
) -> Result<Return, RuntimeError> {
|
||||
let (state, (gas, inert)) = expr.try_normalize(|cls| -> Result<(Clause, _), RuntimeError> {
|
||||
let mut i = cls.clone();
|
||||
while ctx.gas.map(|g| g > 0).unwrap_or(true) {
|
||||
match &i {
|
||||
Clause::Apply { f, x } => {
|
||||
let res = apply(f.clone(), x.clone(), ctx.clone())?;
|
||||
if res.inert {return Ok((i, (res.gas, true)))}
|
||||
ctx.gas = res.gas;
|
||||
i = res.state.expr().clause.clone();
|
||||
/// Normalize an expression using beta reduction with memoization
|
||||
pub fn run(expr: ExprInst, mut ctx: Context) -> Result<Return, RuntimeError> {
|
||||
let (state, (gas, inert)) =
|
||||
expr.try_normalize(|cls| -> Result<(Clause, _), RuntimeError> {
|
||||
let mut i = cls.clone();
|
||||
while ctx.gas.map(|g| g > 0).unwrap_or(true) {
|
||||
match &i {
|
||||
Clause::Apply { f, x } => {
|
||||
let res = apply(f.clone(), x.clone(), ctx.clone())?;
|
||||
if res.inert {
|
||||
return Ok((i, (res.gas, true)));
|
||||
}
|
||||
ctx.gas = res.gas;
|
||||
i = res.state.expr().clause.clone();
|
||||
},
|
||||
Clause::P(Primitive::Atom(data)) => {
|
||||
let ret = data.run(ctx.clone())?;
|
||||
let AtomicReturn { clause, gas, inert } = ret;
|
||||
if inert {
|
||||
return Ok((i, (gas, true)));
|
||||
}
|
||||
ctx.gas = gas;
|
||||
i = clause.clone();
|
||||
},
|
||||
Clause::Constant(c) => {
|
||||
let symval = ctx.symbols.get(c).expect("missing symbol for value");
|
||||
ctx.gas = ctx.gas.map(|g| g - 1); // cost of lookup
|
||||
i = symval.expr().clause.clone();
|
||||
},
|
||||
// non-reducible
|
||||
_ => return Ok((i, (ctx.gas, true))),
|
||||
}
|
||||
Clause::P(Primitive::Atom(data)) => {
|
||||
let ret = data.run(ctx.clone())?;
|
||||
let AtomicReturn { clause, gas, inert } = ret;
|
||||
if inert {return Ok((i, (gas, true)))}
|
||||
ctx.gas = gas;
|
||||
i = clause.clone();
|
||||
}
|
||||
Clause::Constant(c) => {
|
||||
let symval = ctx.symbols.get(c).expect("missing symbol for value");
|
||||
ctx.gas = ctx.gas.map(|g| g - 1); // cost of lookup
|
||||
i = symval.expr().clause.clone();
|
||||
}
|
||||
// non-reducible
|
||||
_ => return Ok((i, (ctx.gas, true)))
|
||||
}
|
||||
}
|
||||
// out of gas
|
||||
Ok((i, (ctx.gas, false)))
|
||||
})?;
|
||||
// out of gas
|
||||
Ok((i, (ctx.gas, false)))
|
||||
})?;
|
||||
Ok(Return { state, gas, inert })
|
||||
}
|
||||
|
||||
/// Opaque inert data that may encode a command to a [Handler]
|
||||
pub type HandlerParm = Box<dyn Atomic>;
|
||||
pub type HandlerRes = Result<
|
||||
Result<ExprInst, Rc<dyn ExternError>>,
|
||||
HandlerParm
|
||||
>;
|
||||
|
||||
pub trait Handler {
|
||||
fn resolve(&mut self, data: HandlerParm) -> HandlerRes;
|
||||
|
||||
fn then<T: Handler>(self, t: T) -> impl Handler where Self: Sized {
|
||||
Pair(self, t)
|
||||
/// Reasons why a [Handler] could not interpret a command. Convertible from
|
||||
/// either variant
|
||||
pub enum HandlerErr {
|
||||
/// The command was addressed to us but its execution resulted in an error
|
||||
Extern(Rc<dyn ExternError>),
|
||||
/// This handler is not applicable, either because the [HandlerParm] is not a
|
||||
/// command or because it's meant for some other handler
|
||||
NA(HandlerParm),
|
||||
}
|
||||
impl From<Rc<dyn ExternError>> for HandlerErr {
|
||||
fn from(value: Rc<dyn ExternError>) -> Self {
|
||||
Self::Extern(value)
|
||||
}
|
||||
}
|
||||
impl<T> From<T> for HandlerErr
|
||||
where
|
||||
T: ExternError + 'static,
|
||||
{
|
||||
fn from(value: T) -> Self {
|
||||
Self::Extern(value.into_extern())
|
||||
}
|
||||
}
|
||||
impl From<HandlerParm> for HandlerErr {
|
||||
fn from(value: HandlerParm) -> Self {
|
||||
Self::NA(value)
|
||||
}
|
||||
}
|
||||
|
||||
impl<F> Handler for F where F: FnMut(HandlerParm) -> HandlerRes {
|
||||
/// Various possible outcomes of a [Handler] execution.
|
||||
pub type HandlerRes = Result<ExprInst, HandlerErr>;
|
||||
|
||||
/// A trait for things that may be able to handle commands returned by Orchid
|
||||
/// code. This trait is implemented for [FnMut(HandlerParm) -> HandlerRes] and
|
||||
/// [(Handler, Handler)], users are not supposed to implement it themselves.
|
||||
///
|
||||
/// A handler receives an arbitrary inert [Atomic] and uses [Atomic::as_any]
|
||||
/// then [std::any::Any::downcast_ref] to obtain a known type. If this fails, it
|
||||
/// returns the box in [HandlerErr::NA] which will be passed to the next
|
||||
/// handler.
|
||||
pub trait Handler {
|
||||
/// Attempt to resolve a command with this handler.
|
||||
fn resolve(&mut self, data: HandlerParm) -> HandlerRes;
|
||||
|
||||
/// If this handler isn't applicable, try the other one.
|
||||
fn or<T: Handler>(self, t: T) -> impl Handler
|
||||
where
|
||||
Self: Sized,
|
||||
{
|
||||
(self, t)
|
||||
}
|
||||
}
|
||||
|
||||
impl<F> Handler for F
|
||||
where
|
||||
F: FnMut(HandlerParm) -> HandlerRes,
|
||||
{
|
||||
fn resolve(&mut self, data: HandlerParm) -> HandlerRes {
|
||||
self(data)
|
||||
}
|
||||
}
|
||||
|
||||
pub struct Pair<T, U>(T, U);
|
||||
|
||||
impl<T: Handler, U: Handler> Handler for Pair<T, U> {
|
||||
impl<T: Handler, U: Handler> Handler for (T, U) {
|
||||
fn resolve(&mut self, data: HandlerParm) -> HandlerRes {
|
||||
match self.0.resolve(data) {
|
||||
Ok(out) => Ok(out),
|
||||
Err(data) => self.1.resolve(data)
|
||||
Err(HandlerErr::NA(data)) => self.1.resolve(data),
|
||||
x => x,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// [run] orchid code, executing any commands it returns using the specified
|
||||
/// [Handler]s.
|
||||
pub fn run_handler(
|
||||
mut expr: ExprInst,
|
||||
mut handler: impl Handler,
|
||||
mut ctx: Context
|
||||
mut ctx: Context,
|
||||
) -> Result<Return, RuntimeError> {
|
||||
loop {
|
||||
let ret = run(expr.clone(), ctx.clone())?;
|
||||
if ret.gas == Some(0) {
|
||||
return Ok(ret)
|
||||
return Ok(ret);
|
||||
}
|
||||
let state_ex = ret.state.expr();
|
||||
let a = if let Clause::P(Primitive::Atom(a)) = &state_ex.clause {a}
|
||||
else {
|
||||
let a = if let Clause::P(Primitive::Atom(a)) = &state_ex.clause {
|
||||
a
|
||||
} else {
|
||||
mem::drop(state_ex);
|
||||
return Ok(ret)
|
||||
return Ok(ret);
|
||||
};
|
||||
let boxed = a.clone().0;
|
||||
expr = match handler.resolve(boxed) {
|
||||
Ok(r) => r.map_err(RuntimeError::Extern)?,
|
||||
Err(e) => return Ok(Return{
|
||||
gas: ret.gas,
|
||||
inert: ret.inert,
|
||||
state: Clause::P(Primitive::Atom(Atom(e))).wrap()
|
||||
})
|
||||
Ok(expr) => expr,
|
||||
Err(HandlerErr::Extern(ext)) => Err(ext)?,
|
||||
Err(HandlerErr::NA(atomic)) =>
|
||||
return Ok(Return {
|
||||
gas: ret.gas,
|
||||
inert: ret.inert,
|
||||
state: Clause::P(Primitive::Atom(Atom(atomic))).wrap(),
|
||||
}),
|
||||
};
|
||||
ctx.gas = ret.gas;
|
||||
}
|
||||
}
|
||||
}
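The tuple `Handler` impl above encodes a fallback chain: the first handler either consumes a command or returns it via `HandlerErr::NA`, and only then does the second handler get to see it. A standalone sketch of the same dispatch shape, with a plain enum in place of boxed atomics:

```rust
// Standalone sketch of handler chaining; Cmd and Outcome are illustrative.
#[derive(Debug)]
enum Cmd {
  Print(String),
  Add(i64, i64),
}

enum Outcome {
  Done(i64),
  NotMine(Cmd), // corresponds to HandlerErr::NA: pass to the next handler
}

fn printer(cmd: Cmd) -> Outcome {
  match cmd {
    Cmd::Print(s) => {
      println!("{s}");
      Outcome::Done(0)
    },
    other => Outcome::NotMine(other),
  }
}

fn adder(cmd: Cmd) -> Outcome {
  match cmd {
    Cmd::Add(a, b) => Outcome::Done(a + b),
    other => Outcome::NotMine(other),
  }
}

// Behaves like the (T, U) Handler impl: the second handler only sees
// commands the first one declined.
fn chain(cmd: Cmd) -> Outcome {
  match printer(cmd) {
    Outcome::NotMine(cmd) => adder(cmd),
    done => done,
  }
}

fn main() {
  chain(Cmd::Print("hello".into()));
  if let Outcome::Done(n) = chain(Cmd::Add(2, 3)) {
    assert_eq!(n, 5);
  }
}
```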
|
||||
|
||||
21 src/main.rs
@@ -12,19 +12,20 @@
|
||||
#![feature(trait_alias)]
|
||||
#![feature(return_position_impl_trait_in_trait)]
|
||||
|
||||
mod parse;
|
||||
mod cli;
|
||||
mod external;
|
||||
pub(crate) mod foreign;
|
||||
mod foreign_macros;
|
||||
mod interner;
|
||||
mod interpreter;
|
||||
mod utils;
|
||||
mod parse;
|
||||
mod pipeline;
|
||||
mod representations;
|
||||
mod rule;
|
||||
pub(crate) mod foreign;
|
||||
mod external;
|
||||
mod foreign_macros;
|
||||
mod pipeline;
|
||||
mod run_dir;
|
||||
mod cli;
|
||||
use std::{path::PathBuf, fs::File};
|
||||
mod utils;
|
||||
use std::fs::File;
|
||||
use std::path::PathBuf;
|
||||
|
||||
use clap::Parser;
|
||||
use cli::prompt;
|
||||
@@ -37,7 +38,7 @@ use run_dir::run_dir;
|
||||
struct Args {
|
||||
/// Folder containing main.orc
|
||||
#[arg(short, long)]
|
||||
pub project: Option<String>
|
||||
pub project: Option<String>,
|
||||
}
|
||||
|
||||
fn main() {
|
||||
@@ -48,7 +49,7 @@ fn main() {
|
||||
path.push("main.orc");
|
||||
match File::open(&path) {
|
||||
Ok(_) => Ok(p),
|
||||
Err(e) => Err(format!("{}: {e}", path.display()))
|
||||
Err(e) => Err(format!("{}: {e}", path.display())),
|
||||
}
|
||||
})
|
||||
});
|
||||
|
||||
@@ -1,13 +1,15 @@
|
||||
pub use chumsky::{self, prelude::*, Parser};
|
||||
pub use chumsky::prelude::*;
|
||||
pub use chumsky::{self, Parser};
|
||||
|
||||
use super::decls::SimpleParser;
|
||||
|
||||
/// Parses Lua-style comments
|
||||
pub fn comment_parser() -> impl Parser<char, String, Error = Simple<char>> {
|
||||
pub fn comment_parser() -> impl SimpleParser<char, String> {
|
||||
choice((
|
||||
just("--[").ignore_then(take_until(
|
||||
just("]--").ignored()
|
||||
)),
|
||||
just("--").ignore_then(take_until(
|
||||
just("\n").rewind().ignored().or(end())
|
||||
))
|
||||
)).map(|(vc, ())| vc).collect().labelled("comment")
|
||||
just("--[").ignore_then(take_until(just("]--").ignored())),
|
||||
just("--").ignore_then(take_until(just("\n").rewind().ignored().or(end()))),
|
||||
))
|
||||
.map(|(vc, ())| vc)
|
||||
.collect()
|
||||
.labelled("comment")
|
||||
}
|
||||
|
||||
@@ -3,46 +3,53 @@ use std::rc::Rc;
|
||||
use crate::interner::Interner;
|
||||
|
||||
/// Trait enclosing all context features
|
||||
///
|
||||
///
|
||||
/// Hiding type parameters in associated types allows for simpler
|
||||
/// parser definitions
|
||||
pub trait Context: Clone {
|
||||
type Op: AsRef<str>;
|
||||
|
||||
fn ops<'a>(&'a self) -> &'a [Self::Op];
|
||||
fn ops(&self) -> &[Self::Op];
|
||||
fn file(&self) -> Rc<Vec<String>>;
|
||||
fn interner<'a>(&'a self) -> &'a Interner;
|
||||
fn interner(&self) -> &Interner;
|
||||
}
|
||||
|
||||
/// Struct implementing context
|
||||
///
|
||||
///
|
||||
/// Hiding type parameters in associated types allows for simpler
|
||||
/// parser definitions
|
||||
pub struct ParsingContext<'a, Op> {
|
||||
pub ops: &'a [Op],
|
||||
pub interner: &'a Interner,
|
||||
pub file: Rc<Vec<String>>
|
||||
pub file: Rc<Vec<String>>,
|
||||
}
|
||||
|
||||
impl<'a, Op> ParsingContext<'a, Op> {
|
||||
pub fn new(ops: &'a [Op], interner: &'a Interner, file: Rc<Vec<String>>)
|
||||
-> Self { Self { ops, interner, file } }
|
||||
pub fn new(
|
||||
ops: &'a [Op],
|
||||
interner: &'a Interner,
|
||||
file: Rc<Vec<String>>,
|
||||
) -> Self {
|
||||
Self { ops, interner, file }
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a, Op> Clone for ParsingContext<'a, Op> {
|
||||
fn clone(&self) -> Self {
|
||||
Self {
|
||||
ops: self.ops,
|
||||
interner: self.interner,
|
||||
file: self.file.clone()
|
||||
}
|
||||
Self { ops: self.ops, interner: self.interner, file: self.file.clone() }
|
||||
}
|
||||
}
|
||||
|
||||
impl<Op: AsRef<str>> Context for ParsingContext<'_, Op> {
|
||||
type Op = Op;
|
||||
|
||||
fn interner<'a>(&'a self) -> &'a Interner { self.interner }
|
||||
fn file(&self) -> Rc<Vec<String>> {self.file.clone()}
|
||||
fn ops<'a>(&'a self) -> &'a [Self::Op] { self.ops }
|
||||
}
|
||||
fn interner(&self) -> &Interner {
|
||||
self.interner
|
||||
}
|
||||
fn file(&self) -> Rc<Vec<String>> {
|
||||
self.file.clone()
|
||||
}
|
||||
fn ops(&self) -> &[Self::Op] {
|
||||
self.ops
|
||||
}
|
||||
}
|
||||
|
||||
14 src/parse/decls.rs (new file)
@@ -0,0 +1,14 @@
|
||||
use std::hash::Hash;
|
||||
|
||||
use chumsky::prelude::Simple;
|
||||
use chumsky::recursive::Recursive;
|
||||
use chumsky::{BoxedParser, Parser};
|
||||
|
||||
/// Wrapper around [Parser] with [Simple] error to avoid repeating the input
|
||||
pub trait SimpleParser<I: Eq + Hash + Clone, O> =
|
||||
Parser<I, O, Error = Simple<I>>;
|
||||
/// Boxed version of [SimpleParser]
|
||||
pub type BoxedSimpleParser<'a, I, O> = BoxedParser<'a, I, O, Simple<I>>;
|
||||
/// [Recursive] specialization of [SimpleParser] to parameterize calls to
|
||||
/// [chumsky::recursive::recursive]
|
||||
pub type SimpleRecursive<'a, I, O> = Recursive<'a, I, O, Simple<I>>;
|
||||
@@ -1,8 +1,12 @@
|
||||
/// Produces filter_mapping functions for enum types:
|
||||
/// ```rs
|
||||
/// enum_parser!(Foo::Bar | "Some error!") // Accepts Foo::Bar(T) into T
|
||||
/// enum_parser!(Foo::Bar) // same as above but with the default error "Expected Foo::Bar"
|
||||
/// enum_parser!(Foo >> Quz; Bar, Baz) // Parses Foo::Bar(T) into Quz::Bar(T) and Foo::Baz(U) into Quz::Baz(U)
|
||||
/// enum_parser!(Foo::Bar | "Some error!")
|
||||
/// // Foo::Bar(T) into T
|
||||
/// enum_parser!(Foo::Bar)
|
||||
/// // same as above but with the default error "Expected Foo::Bar"
|
||||
/// enum_parser!(Foo >> Quz; Bar, Baz)
|
||||
/// // Foo::Bar(T) into Quz::Bar(T)
|
||||
/// // Foo::Baz(U) into Quz::Baz(U)
|
||||
/// ```
|
||||
#[macro_export]
|
||||
macro_rules! enum_filter {
|
||||
@@ -43,4 +47,4 @@ macro_rules! enum_filter {
|
||||
($p:path) => {
|
||||
enum_filter!($p | {concat!("Expected ", stringify!($p))})
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,107 +1,110 @@
|
||||
use std::ops::Range;
|
||||
use std::rc::Rc;
|
||||
|
||||
use chumsky::{self, prelude::*, Parser};
|
||||
|
||||
use crate::enum_filter;
|
||||
use crate::representations::Primitive;
|
||||
use crate::representations::ast::{Clause, Expr};
|
||||
use crate::representations::location::Location;
|
||||
use crate::interner::Token;
|
||||
use chumsky::prelude::*;
|
||||
use chumsky::{self, Parser};
|
||||
|
||||
use super::context::Context;
|
||||
use super::lexer::{Lexeme, Entry, filter_map_lex};
|
||||
use super::decls::SimpleParser;
|
||||
use super::lexer::{filter_map_lex, Entry, Lexeme};
|
||||
use crate::enum_filter;
|
||||
use crate::interner::Sym;
|
||||
use crate::representations::ast::{Clause, Expr};
|
||||
use crate::representations::location::Location;
|
||||
use crate::representations::Primitive;
|
||||
|
||||
/// Parses any number of expr wrapped in (), [] or {}
|
||||
fn sexpr_parser(
|
||||
expr: impl Parser<Entry, Expr, Error = Simple<Entry>> + Clone
|
||||
) -> impl Parser<Entry, (Clause, Range<usize>), Error = Simple<Entry>> + Clone {
|
||||
expr: impl SimpleParser<Entry, Expr> + Clone,
|
||||
) -> impl SimpleParser<Entry, (Clause, Range<usize>)> + Clone {
|
||||
let body = expr.repeated();
|
||||
choice((
|
||||
Lexeme::LP('(').parser().then(body.clone())
|
||||
.then(Lexeme::RP('(').parser()),
|
||||
Lexeme::LP('[').parser().then(body.clone())
|
||||
.then(Lexeme::RP('[').parser()),
|
||||
Lexeme::LP('{').parser().then(body.clone())
|
||||
.then(Lexeme::RP('{').parser()),
|
||||
)).map(|((lp, body), rp)| {
|
||||
let Entry{lexeme, range: Range{start, ..}} = lp;
|
||||
Lexeme::LP('(').parser().then(body.clone()).then(Lexeme::RP('(').parser()),
|
||||
Lexeme::LP('[').parser().then(body.clone()).then(Lexeme::RP('[').parser()),
|
||||
Lexeme::LP('{').parser().then(body).then(Lexeme::RP('{').parser()),
|
||||
))
|
||||
.map(|((lp, body), rp)| {
|
||||
let Entry { lexeme, range: Range { start, .. } } = lp;
|
||||
let end = rp.range.end;
|
||||
let char = if let Lexeme::LP(c) = lexeme {c}
|
||||
else {unreachable!("The parser only matches Lexeme::LP")};
|
||||
let char = if let Lexeme::LP(c) = lexeme {
|
||||
c
|
||||
} else {
|
||||
unreachable!("The parser only matches Lexeme::LP")
|
||||
};
|
||||
(Clause::S(char, Rc::new(body)), start..end)
|
||||
}).labelled("S-expression")
|
||||
})
|
||||
.labelled("S-expression")
|
||||
}
|
||||
|
||||
/// Parses `\name.body` or `\name:type.body` where name is any valid name
|
||||
/// and type and body are both expressions. Comments are allowed
|
||||
/// and ignored everywhere in between the tokens
|
||||
fn lambda_parser<'a>(
|
||||
expr: impl Parser<Entry, Expr, Error = Simple<Entry>> + Clone + 'a,
|
||||
ctx: impl Context + 'a
|
||||
) -> impl Parser<Entry, (Clause, Range<usize>), Error = Simple<Entry>> + Clone + 'a {
|
||||
Lexeme::BS.parser()
|
||||
.ignore_then(expr.clone())
|
||||
.then_ignore(Lexeme::Name(ctx.interner().i(".")).parser())
|
||||
.then(expr.repeated().at_least(1))
|
||||
.map_with_span(move |(arg, body), span| {
|
||||
(Clause::Lambda(Rc::new(arg), Rc::new(body)), span)
|
||||
}).labelled("Lambda")
|
||||
expr: impl SimpleParser<Entry, Expr> + Clone + 'a,
|
||||
ctx: impl Context + 'a,
|
||||
) -> impl SimpleParser<Entry, (Clause, Range<usize>)> + Clone + 'a {
|
||||
Lexeme::BS
|
||||
.parser()
|
||||
.ignore_then(expr.clone())
|
||||
.then_ignore(Lexeme::Name(ctx.interner().i(".")).parser())
|
||||
.then(expr.repeated().at_least(1))
|
||||
.map_with_span(move |(arg, body), span| {
|
||||
(Clause::Lambda(Rc::new(arg), Rc::new(body)), span)
|
||||
})
|
||||
.labelled("Lambda")
|
||||
}
|
||||
|
||||
/// Parses a sequence of names separated by :: <br/>
|
||||
/// Comments and line breaks are allowed and ignored in between
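/// e.g. (hypothetical names) `foo::bar::baz` becomes a single interned `Sym`
/// whose range covers all three name tokens.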
|
||||
pub fn ns_name_parser<'a>(ctx: impl Context + 'a)
|
||||
-> impl Parser<Entry, (Token<Vec<Token<String>>>, Range<usize>), Error = Simple<Entry>> + Clone + 'a
|
||||
{
|
||||
pub fn ns_name_parser<'a>(
|
||||
ctx: impl Context + 'a,
|
||||
) -> impl SimpleParser<Entry, (Sym, Range<usize>)> + Clone + 'a {
|
||||
filter_map_lex(enum_filter!(Lexeme::Name))
|
||||
.separated_by(Lexeme::NS.parser()).at_least(1)
|
||||
.separated_by(Lexeme::NS.parser())
|
||||
.at_least(1)
|
||||
.map(move |elements| {
|
||||
let start = elements.first().expect("can never be empty").1.start;
|
||||
let end = elements.last().expect("can never be empty").1.end;
|
||||
let tokens =
|
||||
/*ctx.prefix().iter().copied().chain*/(
|
||||
elements.iter().map(|(t, _)| *t)
|
||||
).collect::<Vec<_>>();
|
||||
let tokens = (elements.iter().map(|(t, _)| *t)).collect::<Vec<_>>();
|
||||
(ctx.interner().i(&tokens), start..end)
|
||||
}).labelled("Namespaced name")
|
||||
})
|
||||
.labelled("Namespaced name")
|
||||
}
|
||||
|
||||
pub fn namelike_parser<'a>(ctx: impl Context + 'a)
|
||||
-> impl Parser<Entry, (Clause, Range<usize>), Error = Simple<Entry>> + Clone + 'a
|
||||
{
|
||||
pub fn namelike_parser<'a>(
|
||||
ctx: impl Context + 'a,
|
||||
) -> impl SimpleParser<Entry, (Clause, Range<usize>)> + Clone + 'a {
|
||||
choice((
|
||||
filter_map_lex(enum_filter!(Lexeme::PH))
|
||||
.map(|(ph, range)| (Clause::Placeh(ph), range)),
|
||||
ns_name_parser(ctx)
|
||||
.map(|(token, range)| (Clause::Name(token), range)),
|
||||
ns_name_parser(ctx).map(|(token, range)| (Clause::Name(token), range)),
|
||||
))
|
||||
}
|
||||
|
||||
pub fn clause_parser<'a>(
|
||||
expr: impl Parser<Entry, Expr, Error = Simple<Entry>> + Clone + 'a,
|
||||
ctx: impl Context + 'a
|
||||
) -> impl Parser<Entry, (Clause, Range<usize>), Error = Simple<Entry>> + Clone + 'a {
|
||||
expr: impl SimpleParser<Entry, Expr> + Clone + 'a,
|
||||
ctx: impl Context + 'a,
|
||||
) -> impl SimpleParser<Entry, (Clause, Range<usize>)> + Clone + 'a {
|
||||
choice((
|
||||
filter_map_lex(enum_filter!(Lexeme >> Primitive; Literal))
|
||||
.map(|(p, s)| (Clause::P(p), s)).labelled("Literal"),
|
||||
.map(|(p, s)| (Clause::P(p), s))
|
||||
.labelled("Literal"),
|
||||
sexpr_parser(expr.clone()),
|
||||
lambda_parser(expr.clone(), ctx.clone()),
|
||||
lambda_parser(expr, ctx.clone()),
|
||||
namelike_parser(ctx),
|
||||
)).labelled("Clause")
|
||||
))
|
||||
.labelled("Clause")
|
||||
}
|
||||
|
||||
/// Parse an expression
|
||||
pub fn xpr_parser<'a>(ctx: impl Context + 'a)
|
||||
-> impl Parser<Entry, Expr, Error = Simple<Entry>> + 'a
|
||||
{
|
||||
pub fn xpr_parser<'a>(
|
||||
ctx: impl Context + 'a,
|
||||
) -> impl SimpleParser<Entry, Expr> + 'a {
|
||||
recursive(move |expr| {
|
||||
clause_parser(expr, ctx.clone())
|
||||
.map(move |(value, range)| {
|
||||
Expr{
|
||||
value: value.clone(),
|
||||
location: Location::Range { file: ctx.file(), range }
|
||||
}
|
||||
clause_parser(expr, ctx.clone()).map(move |(value, range)| Expr {
|
||||
value,
|
||||
location: Location::Range { file: ctx.file(), range },
|
||||
})
|
||||
}).labelled("Expression")
|
||||
}
|
||||
})
|
||||
.labelled("Expression")
|
||||
}
|
||||
|
||||
@@ -1,58 +1,59 @@
|
||||
use std::fmt::Debug;
|
||||
|
||||
use chumsky::{prelude::*, Parser};
|
||||
use chumsky::prelude::*;
|
||||
use chumsky::Parser;
|
||||
use thiserror::Error;
|
||||
|
||||
use crate::representations::sourcefile::{FileEntry};
|
||||
use crate::parse::sourcefile::split_lines;
|
||||
|
||||
use super::context::Context;
|
||||
use super::{lexer, line_parser, Entry};
|
||||
|
||||
use crate::parse::sourcefile::split_lines;
|
||||
use crate::representations::sourcefile::FileEntry;
|
||||
|
||||
#[derive(Error, Debug, Clone)]
|
||||
pub enum ParseError {
|
||||
#[error("Could not tokenize {0:?}")]
|
||||
Lex(Vec<Simple<char>>),
|
||||
#[error("Could not parse {:?} on line {}", .0.first().unwrap().1.span(), .0.first().unwrap().0)]
|
||||
Ast(Vec<(usize, Simple<Entry>)>)
|
||||
#[error(
|
||||
"Could not parse {:?} on line {}",
|
||||
.0.first().unwrap().1.span(),
|
||||
.0.first().unwrap().0
|
||||
)]
|
||||
Ast(Vec<(usize, Simple<Entry>)>),
|
||||
}
|
||||
|
||||
/// All the data required for parsing
|
||||
|
||||
|
||||
/// Parse a string of code into a collection of module elements;
|
||||
/// imports, exports, comments, declarations, etc.
|
||||
///
|
||||
///
|
||||
/// Notice that because the lexer splits operators based on the provided
|
||||
/// list, the output will only be correct if the operator list already
|
||||
/// contains all operators defined or imported by this module.
|
||||
pub fn parse<'a>(data: &str, ctx: impl Context)
|
||||
-> Result<Vec<FileEntry>, ParseError>
|
||||
{
|
||||
pub fn parse(
|
||||
data: &str,
|
||||
ctx: impl Context,
|
||||
) -> Result<Vec<FileEntry>, ParseError> {
|
||||
// TODO: wrap `i`, `ops` and `prefix` in a parsing context
|
||||
let lexie = lexer(ctx.clone());
|
||||
let token_batchv = lexie.parse(data).map_err(ParseError::Lex)?;
|
||||
// println!("Lexed:\n{}", LexedText(token_batchv.clone()).bundle(ctx.interner()));
|
||||
// println!("Lexed:\n{:?}", token_batchv.clone());
|
||||
let parsr = line_parser(ctx).then_ignore(end());
|
||||
let (parsed_lines, errors_per_line) = split_lines(&token_batchv)
|
||||
.enumerate()
|
||||
.map(|(i, entv)| (i,
|
||||
entv.iter()
|
||||
.filter(|e| !e.is_filler())
|
||||
.cloned()
|
||||
.collect::<Vec<_>>()
|
||||
))
|
||||
.filter(|(_, l)| l.len() > 0)
|
||||
.map(|(i, entv)| {
|
||||
(i, entv.iter().filter(|e| !e.is_filler()).cloned().collect::<Vec<_>>())
|
||||
})
|
||||
.filter(|(_, l)| !l.is_empty())
|
||||
.map(|(i, l)| (i, parsr.parse(l)))
|
||||
.map(|(i, res)| match res {
|
||||
Ok(r) => (Some(r), (i, vec![])),
|
||||
Err(e) => (None, (i, e))
|
||||
}).unzip::<_, _, Vec<_>, Vec<_>>();
|
||||
let total_err = errors_per_line.into_iter()
|
||||
Err(e) => (None, (i, e)),
|
||||
})
|
||||
.unzip::<_, _, Vec<_>, Vec<_>>();
|
||||
let total_err = errors_per_line
|
||||
.into_iter()
|
||||
.flat_map(|(i, v)| v.into_iter().map(move |e| (i, e)))
|
||||
.collect::<Vec<_>>();
|
||||
if !total_err.is_empty() { Err(ParseError::Ast(total_err)) }
|
||||
else { Ok(parsed_lines.into_iter().map(Option::unwrap).collect()) }
|
||||
if !total_err.is_empty() {
|
||||
Err(ParseError::Ast(total_err))
|
||||
} else {
|
||||
Ok(parsed_lines.into_iter().map(Option::unwrap).collect())
|
||||
}
|
||||
}
|
||||
@@ -1,16 +1,20 @@
|
||||
use chumsky::{Parser, prelude::*};
|
||||
use chumsky::prelude::*;
|
||||
use chumsky::Parser;
|
||||
use itertools::Itertools;
|
||||
|
||||
use super::context::Context;
|
||||
use super::decls::{SimpleParser, SimpleRecursive};
|
||||
use super::lexer::{filter_map_lex, Lexeme};
|
||||
use super::Entry;
|
||||
use crate::interner::Tok;
|
||||
use crate::representations::sourcefile::Import;
|
||||
use crate::utils::iter::{box_once, box_flatten, into_boxed_iter, BoxedIterIter};
|
||||
use crate::interner::Token;
|
||||
use crate::utils::iter::{
|
||||
box_flatten, box_once, into_boxed_iter, BoxedIterIter,
|
||||
};
|
||||
use crate::{box_chain, enum_filter};
|
||||
|
||||
use super::Entry;
|
||||
use super::context::Context;
|
||||
use super::lexer::{Lexeme, filter_map_lex};
|
||||
|
||||
/// initialize a BoxedIterIter<Tok<String>> with a single element.
|
||||
fn init_table(name: Token<String>) -> BoxedIterIter<'static, Token<String>> {
|
||||
fn init_table(name: Tok<String>) -> BoxedIterIter<'static, Tok<String>> {
|
||||
// I'm not at all confident that this is a good approach.
|
||||
box_once(box_once(name))
|
||||
}
|
||||
@@ -21,56 +25,74 @@ fn init_table(name: Token<String>) -> BoxedIterIter<'static, Token<String>> {
|
||||
/// preferably contain cross-platform filename-legal characters but the
|
||||
/// symbols are explicitly allowed to go wild.
|
||||
/// There's a blacklist in [name]
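/// An illustrative (hypothetical) line: `import std::(list::*, option)`
/// expands to the imports `std::list::*` and `std::option`.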
|
||||
pub fn import_parser<'a>(ctx: impl Context + 'a)
|
||||
-> impl Parser<Entry, Vec<Import>, Error = Simple<Entry>> + 'a
|
||||
{
|
||||
pub fn import_parser<'a>(
|
||||
ctx: impl Context + 'a,
|
||||
) -> impl SimpleParser<Entry, Vec<Import>> + 'a {
|
||||
// TODO: this algorithm isn't cache friendly and copies a lot
|
||||
recursive({
|
||||
let ctx = ctx.clone();
|
||||
move |expr:Recursive<Entry, BoxedIterIter<Token<String>>, Simple<Entry>>| {
|
||||
filter_map_lex(enum_filter!(Lexeme::Name)).map(|(t, _)| t)
|
||||
.separated_by(Lexeme::NS.parser())
|
||||
.then(
|
||||
Lexeme::NS.parser()
|
||||
.ignore_then(
|
||||
choice((
|
||||
expr.clone()
|
||||
.separated_by(Lexeme::Name(ctx.interner().i(",")).parser())
|
||||
.delimited_by(Lexeme::LP('(').parser(), Lexeme::RP('(').parser())
|
||||
.map(|v| box_flatten(v.into_iter()))
|
||||
.labelled("import group"),
|
||||
// Each expr returns a list of imports, flatten into common list
|
||||
Lexeme::Name(ctx.interner().i("*")).parser()
|
||||
.map(move |_| init_table(ctx.interner().i("*")))
|
||||
.labelled("wildcard import"), // Just a *, wrapped
|
||||
filter_map_lex(enum_filter!(Lexeme::Name))
|
||||
.map(|(t, _)| init_table(t))
|
||||
.labelled("import terminal") // Just a name, wrapped
|
||||
))
|
||||
).or_not()
|
||||
)
|
||||
.map(|(name, opt_post): (Vec<Token<String>>, Option<BoxedIterIter<Token<String>>>)|
|
||||
-> BoxedIterIter<Token<String>> {
|
||||
if let Some(post) = opt_post {
|
||||
Box::new(post.map(move |el| {
|
||||
box_chain!(name.clone().into_iter(), el)
|
||||
}))
|
||||
} else {
|
||||
box_once(into_boxed_iter(name))
|
||||
}
|
||||
})
|
||||
move |expr: SimpleRecursive<Entry, BoxedIterIter<Tok<String>>>| {
|
||||
filter_map_lex(enum_filter!(Lexeme::Name))
|
||||
.map(|(t, _)| t)
|
||||
.separated_by(Lexeme::NS.parser())
|
||||
.then(
|
||||
Lexeme::NS
|
||||
.parser()
|
||||
.ignore_then(choice((
|
||||
expr
|
||||
.clone()
|
||||
.separated_by(Lexeme::Name(ctx.interner().i(",")).parser())
|
||||
.delimited_by(
|
||||
Lexeme::LP('(').parser(),
|
||||
Lexeme::RP('(').parser(),
|
||||
)
|
||||
.map(|v| box_flatten(v.into_iter()))
|
||||
.labelled("import group"),
|
||||
// Each expr returns a list of imports, flatten into common list
|
||||
Lexeme::Name(ctx.interner().i("*"))
|
||||
.parser()
|
||||
.map(move |_| init_table(ctx.interner().i("*")))
|
||||
.labelled("wildcard import"), // Just a *, wrapped
|
||||
filter_map_lex(enum_filter!(Lexeme::Name))
|
||||
.map(|(t, _)| init_table(t))
|
||||
.labelled("import terminal"), // Just a name, wrapped
|
||||
)))
|
||||
.or_not(),
|
||||
)
|
||||
.map(
|
||||
|(name, opt_post): (
|
||||
Vec<Tok<String>>,
|
||||
Option<BoxedIterIter<Tok<String>>>,
|
||||
)|
|
||||
-> BoxedIterIter<Tok<String>> {
|
||||
if let Some(post) = opt_post {
|
||||
Box::new(
|
||||
post.map(move |el| box_chain!(name.clone().into_iter(), el)),
|
||||
)
|
||||
} else {
|
||||
box_once(into_boxed_iter(name))
|
||||
}
|
||||
},
|
||||
)
|
||||
}
|
||||
}).map(move |paths| {
|
||||
paths.filter_map(|namespaces| {
|
||||
let mut path = namespaces.collect_vec();
|
||||
let name = path.pop()?;
|
||||
Some(Import {
|
||||
path: ctx.interner().i(&path),
|
||||
name: {
|
||||
if name == ctx.interner().i("*") { None }
|
||||
else { Some(name) }
|
||||
}
|
||||
})
|
||||
.map(move |paths| {
|
||||
paths
|
||||
.filter_map(|namespaces| {
|
||||
let mut path = namespaces.collect_vec();
|
||||
let name = path.pop()?;
|
||||
Some(Import {
|
||||
path: ctx.interner().i(&path),
|
||||
name: {
|
||||
if name == ctx.interner().i("*") {
|
||||
None
|
||||
} else {
|
||||
Some(name)
|
||||
}
|
||||
},
|
||||
})
|
||||
})
|
||||
}).collect()
|
||||
}).labelled("import")
|
||||
.collect()
|
||||
})
|
||||
.labelled("import")
|
||||
}
|
||||
|
||||
@@ -1,31 +1,36 @@
|
||||
use std::fmt;
|
||||
use std::ops::Range;
|
||||
|
||||
use chumsky::prelude::*;
|
||||
use chumsky::text::keyword;
|
||||
use chumsky::{Parser, Span};
|
||||
use ordered_float::NotNan;
|
||||
use chumsky::{Parser, prelude::*, text::keyword, Span};
|
||||
|
||||
use crate::ast::{Placeholder, PHClass};
|
||||
use crate::representations::Literal;
|
||||
use crate::interner::{Token, InternedDisplay, Interner};
|
||||
|
||||
use super::context::Context;
|
||||
use super::placeholder;
|
||||
use super::{number, string, name, comment};
|
||||
use super::decls::SimpleParser;
|
||||
use super::{comment, name, number, placeholder, string};
|
||||
use crate::ast::{PHClass, Placeholder};
|
||||
use crate::interner::{InternedDisplay, Interner, Tok};
|
||||
use crate::representations::Literal;
|
||||
|
||||
#[derive(Clone, Debug, PartialEq, Eq, Hash)]
|
||||
pub struct Entry{
|
||||
pub struct Entry {
|
||||
pub lexeme: Lexeme,
|
||||
pub range: Range<usize>
|
||||
pub range: Range<usize>,
|
||||
}
|
||||
impl Entry {
|
||||
pub fn is_filler(&self) -> bool {
|
||||
matches!(self.lexeme, Lexeme::Comment(_))
|
||||
|| matches!(self.lexeme, Lexeme::BR)
|
||||
|| matches!(self.lexeme, Lexeme::BR)
|
||||
}
|
||||
}
|
||||
|
||||
impl InternedDisplay for Entry {
|
||||
fn fmt_i(&self, f: &mut std::fmt::Formatter<'_>, i: &Interner) -> std::fmt::Result {
|
||||
fn fmt_i(
|
||||
&self,
|
||||
f: &mut std::fmt::Formatter<'_>,
|
||||
i: &Interner,
|
||||
) -> std::fmt::Result {
|
||||
self.lexeme.fmt_i(f, i)
|
||||
}
|
||||
}
|
||||
@@ -40,21 +45,24 @@ impl Span for Entry {
|
||||
type Context = Lexeme;
|
||||
type Offset = usize;
|
||||
|
||||
fn context(&self) -> Self::Context {self.lexeme.clone()}
|
||||
fn start(&self) -> Self::Offset {self.range.start()}
|
||||
fn end(&self) -> Self::Offset {self.range.end()}
|
||||
fn context(&self) -> Self::Context {
|
||||
self.lexeme.clone()
|
||||
}
|
||||
fn start(&self) -> Self::Offset {
|
||||
self.range.start()
|
||||
}
|
||||
fn end(&self) -> Self::Offset {
|
||||
self.range.end()
|
||||
}
|
||||
fn new(context: Self::Context, range: Range<Self::Offset>) -> Self {
|
||||
Self{
|
||||
lexeme: context,
|
||||
range
|
||||
}
|
||||
Self { lexeme: context, range }
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Clone, Debug, PartialEq, Eq, Hash)]
|
||||
pub enum Lexeme {
|
||||
Literal(Literal),
|
||||
Name(Token<String>),
|
||||
Name(Tok<String>),
|
||||
Rule(NotNan<f64>),
|
||||
/// Walrus operator (formerly shorthand macro)
|
||||
Const,
|
||||
@@ -74,11 +82,15 @@ pub enum Lexeme {
|
||||
Export,
|
||||
Import,
|
||||
Namespace,
|
||||
PH(Placeholder)
|
||||
PH(Placeholder),
|
||||
}
|
||||
|
||||
impl InternedDisplay for Lexeme {
|
||||
fn fmt_i(&self, f: &mut std::fmt::Formatter<'_>, i: &Interner) -> std::fmt::Result {
|
||||
fn fmt_i(
|
||||
&self,
|
||||
f: &mut std::fmt::Formatter<'_>,
|
||||
i: &Interner,
|
||||
) -> std::fmt::Result {
|
||||
match self {
|
||||
Self::Literal(l) => write!(f, "{:?}", l),
|
||||
Self::Name(token) => write!(f, "{}", i.r(*token)),
|
||||
@@ -90,9 +102,9 @@ impl InternedDisplay for Lexeme {
|
||||
'(' => write!(f, ")"),
|
||||
'[' => write!(f, "]"),
|
||||
'{' => write!(f, "}}"),
|
||||
_ => f.debug_tuple("RP").field(l).finish()
|
||||
_ => f.debug_tuple("RP").field(l).finish(),
|
||||
},
|
||||
Self::BR => write!(f, "\n"),
|
||||
Self::BR => writeln!(f),
|
||||
Self::BS => write!(f, "\\"),
|
||||
Self::At => write!(f, "@"),
|
||||
Self::Type => write!(f, ":"),
|
||||
@@ -103,27 +115,30 @@ impl InternedDisplay for Lexeme {
|
||||
Self::PH(Placeholder { name, class }) => match *class {
|
||||
PHClass::Scalar => write!(f, "${}", i.r(*name)),
|
||||
PHClass::Vec { nonzero, prio } => {
|
||||
if nonzero {write!(f, "...")}
|
||||
else {write!(f, "..")}?;
|
||||
if nonzero {
|
||||
write!(f, "...")
|
||||
} else {
|
||||
write!(f, "..")
|
||||
}?;
|
||||
write!(f, "${}", i.r(*name))?;
|
||||
if prio != 0 {write!(f, ":{}", prio)?;};
|
||||
if prio != 0 {
|
||||
write!(f, ":{}", prio)?;
|
||||
};
|
||||
Ok(())
|
||||
}
|
||||
}
|
||||
},
|
||||
},
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl Lexeme {
|
||||
pub fn rule(prio: impl Into<f64>) -> Self {
|
||||
Lexeme::Rule(
|
||||
NotNan::new(prio.into())
|
||||
.expect("Rule priority cannot be NaN")
|
||||
)
|
||||
Lexeme::Rule(NotNan::new(prio.into()).expect("Rule priority cannot be NaN"))
|
||||
}
|
||||
|
||||
pub fn parser<E: chumsky::Error<Entry>>(self)
|
||||
-> impl Parser<Entry, Entry, Error = E> + Clone {
|
||||
pub fn parser<E: chumsky::Error<Entry>>(
|
||||
self,
|
||||
) -> impl Parser<Entry, Entry, Error = E> + Clone {
|
||||
filter(move |ent: &Entry| ent.lexeme == self)
|
||||
}
|
||||
}
|
||||
@@ -141,16 +156,14 @@ impl InternedDisplay for LexedText {
|
||||
}
|
||||
}
|
||||
|
||||
fn paren_parser(lp: char, rp: char)
|
||||
-> impl Parser<char, Lexeme, Error=Simple<char>>
|
||||
{
|
||||
just(lp).to(Lexeme::LP(lp))
|
||||
.or(just(rp).to(Lexeme::RP(lp)))
|
||||
fn paren_parser(lp: char, rp: char) -> impl SimpleParser<char, Lexeme> {
|
||||
just(lp).to(Lexeme::LP(lp)).or(just(rp).to(Lexeme::RP(lp)))
|
||||
}
|
||||
|
||||
pub fn literal_parser() -> impl Parser<char, Literal, Error = Simple<char>> {
|
||||
pub fn literal_parser() -> impl SimpleParser<char, Literal> {
|
||||
choice((
|
||||
number::int_parser().map(Literal::Uint), // all ints are valid floats so it takes precedence
|
||||
// all ints are valid floats so it takes precedence
|
||||
number::int_parser().map(Literal::Uint),
|
||||
number::float_parser().map(Literal::Num),
|
||||
string::char_parser().map(Literal::Char),
|
||||
string::str_parser().map(Literal::Str),
|
||||
@@ -159,10 +172,12 @@ pub fn literal_parser() -> impl Parser<char, Literal, Error = Simple<char>> {
|
||||
|
||||
pub static BASE_OPS: &[&str] = &[",", ".", "..", "..."];
|
||||
|
||||
pub fn lexer<'a>(ctx: impl Context + 'a)
|
||||
-> impl Parser<char, Vec<Entry>, Error=Simple<char>> + 'a
|
||||
{
|
||||
let all_ops = ctx.ops().iter()
|
||||
pub fn lexer<'a>(
|
||||
ctx: impl Context + 'a,
|
||||
) -> impl SimpleParser<char, Vec<Entry>> + 'a {
|
||||
let all_ops = ctx
|
||||
.ops()
|
||||
.iter()
|
||||
.map(|op| op.as_ref())
|
||||
.chain(BASE_OPS.iter().cloned())
|
||||
.map(str::to_string)
|
||||
@@ -175,7 +190,10 @@ pub fn lexer<'a>(ctx: impl Context + 'a)
|
||||
paren_parser('[', ']'),
|
||||
paren_parser('{', '}'),
|
||||
just(":=").to(Lexeme::Const),
|
||||
just("=").ignore_then(number::float_parser()).then_ignore(just("=>")).map(Lexeme::rule),
|
||||
just("=")
|
||||
.ignore_then(number::float_parser())
|
||||
.then_ignore(just("=>"))
|
||||
.map(Lexeme::rule),
|
||||
comment::comment_parser().map(Lexeme::Comment),
|
||||
just("::").to(Lexeme::NS),
|
||||
just('\\').to(Lexeme::BS),
|
||||
@@ -184,20 +202,18 @@ pub fn lexer<'a>(ctx: impl Context + 'a)
|
||||
just('\n').to(Lexeme::BR),
|
||||
placeholder::placeholder_parser(ctx.clone()).map(Lexeme::PH),
|
||||
literal_parser().map(Lexeme::Literal),
|
||||
name::name_parser(&all_ops).map(move |n| {
|
||||
Lexeme::Name(ctx.interner().i(&n))
|
||||
})
|
||||
name::name_parser(&all_ops)
|
||||
.map(move |n| Lexeme::Name(ctx.interner().i(&n))),
|
||||
))
|
||||
.map_with_span(|lexeme, range| Entry{ lexeme, range })
|
||||
.padded_by(one_of(" \t").repeated())
|
||||
.repeated()
|
||||
.then_ignore(end())
|
||||
.map_with_span(|lexeme, range| Entry { lexeme, range })
|
||||
.padded_by(one_of(" \t").repeated())
|
||||
.repeated()
|
||||
.then_ignore(end())
|
||||
}
|
||||
|
||||
|
||||
pub fn filter_map_lex<'a, O, M: ToString>(
|
||||
f: impl Fn(Lexeme) -> Result<O, M> + Clone + 'a
|
||||
) -> impl Parser<Entry, (O, Range<usize>), Error = Simple<Entry>> + Clone + 'a {
|
||||
f: impl Fn(Lexeme) -> Result<O, M> + Clone + 'a,
|
||||
) -> impl SimpleParser<Entry, (O, Range<usize>)> + Clone + 'a {
|
||||
filter_map(move |s: Range<usize>, e: Entry| {
|
||||
let out = f(e.lexeme).map_err(|msg| Simple::custom(s.clone(), msg))?;
|
||||
Ok((out, s))
|
||||
|
||||
@@ -1,19 +1,20 @@
|
||||
mod string;
|
||||
mod number;
|
||||
mod name;
|
||||
mod lexer;
|
||||
mod comment;
|
||||
mod expression;
|
||||
mod sourcefile;
|
||||
mod import;
|
||||
mod parse;
|
||||
mod enum_filter;
|
||||
mod placeholder;
|
||||
mod context;
|
||||
mod decls;
|
||||
mod enum_filter;
|
||||
mod expression;
|
||||
mod facade;
|
||||
mod import;
|
||||
mod lexer;
|
||||
mod name;
|
||||
mod number;
|
||||
mod placeholder;
|
||||
mod sourcefile;
|
||||
mod string;
|
||||
|
||||
pub use sourcefile::line_parser;
|
||||
pub use lexer::{lexer, Lexeme, Entry};
|
||||
pub use context::ParsingContext;
|
||||
pub use facade::{parse, ParseError};
|
||||
pub use lexer::{lexer, Entry, Lexeme};
|
||||
pub use name::is_op;
|
||||
pub use parse::{parse, ParseError};
|
||||
pub use number::{float_parser, int_parser};
|
||||
pub use context::ParsingContext;
|
||||
pub use sourcefile::line_parser;
|
||||
|
||||
@@ -1,22 +1,28 @@
|
||||
use chumsky::{self, prelude::*, Parser};
|
||||
use chumsky::prelude::*;
|
||||
use chumsky::{self, Parser};
|
||||
|
||||
use super::decls::{BoxedSimpleParser, SimpleParser};
|
||||
|
||||
/// Matches any one of the passed operators, preferring longer ones
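/// e.g. with the (hypothetical) operator set `["-", "->"]`, the input `->`
/// matches the two-character operator rather than `-` followed by `>`,
/// because the candidates are sorted longest-first before being tried in order.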
|
||||
fn op_parser<'a>(ops: &[impl AsRef<str> + Clone])
|
||||
-> BoxedParser<'a, char, String, Simple<char>>
|
||||
{
|
||||
let mut sorted_ops: Vec<String> = ops.iter()
|
||||
.map(|t| t.as_ref().to_string()).collect();
|
||||
fn op_parser<'a>(
|
||||
ops: &[impl AsRef<str> + Clone],
|
||||
) -> BoxedSimpleParser<'a, char, String> {
|
||||
let mut sorted_ops: Vec<String> =
|
||||
ops.iter().map(|t| t.as_ref().to_string()).collect();
|
||||
sorted_ops.sort_by_key(|op| -(op.len() as i64));
|
||||
sorted_ops.into_iter()
|
||||
sorted_ops
|
||||
.into_iter()
|
||||
.map(|op| just(op).boxed())
|
||||
.reduce(|a, b| a.or(b).boxed())
|
||||
.unwrap_or_else(|| {
|
||||
empty().map(|()| panic!("Empty isn't meant to match")).boxed()
|
||||
}).labelled("operator").boxed()
|
||||
})
|
||||
.labelled("operator")
|
||||
.boxed()
|
||||
}
|
||||
|
||||
/// Characters that cannot be parsed as part of an operator
|
||||
///
|
||||
///
|
||||
/// The initial operator list overrides this.
|
||||
static NOT_NAME_CHAR: &[char] = &[
|
||||
':', // used for namespacing and type annotations
|
||||
@@ -28,35 +34,34 @@ static NOT_NAME_CHAR: &[char] = &[
|
||||
];
|
||||
|
||||
/// Matches anything that's allowed as an operator
|
||||
///
|
||||
///
|
||||
/// FIXME: `@name` without a dot should be parsed correctly for overrides.
|
||||
/// Could be an operator but then parametrics should take precedence,
|
||||
/// which might break stuff. investigate.
|
||||
///
|
||||
///
|
||||
/// TODO: `'` could work as an operator whenever it isn't closed.
|
||||
/// It's common im maths so it's worth a try
|
||||
///
|
||||
///
|
||||
/// TODO: `.` could possibly be parsed as an operator in some contexts.
|
||||
/// This operator is very common in maths so it's worth a try.
|
||||
/// Investigate.
|
||||
pub fn modname_parser<'a>()
|
||||
-> impl Parser<char, String, Error = Simple<char>> + 'a
|
||||
{
|
||||
pub fn modname_parser<'a>() -> impl SimpleParser<char, String> + 'a {
|
||||
filter(move |c| !NOT_NAME_CHAR.contains(c) && !c.is_whitespace())
|
||||
.repeated().at_least(1)
|
||||
.repeated()
|
||||
.at_least(1)
|
||||
.collect()
|
||||
.labelled("modname")
|
||||
}
|
||||
|
||||
/// Parse an operator or name. Failing both, parse everything up to
|
||||
/// the next whitespace or blacklisted character as a new operator.
|
||||
pub fn name_parser<'a>(ops: &[impl AsRef<str> + Clone])
|
||||
-> impl Parser<char, String, Error = Simple<char>> + 'a
|
||||
{
|
||||
pub fn name_parser<'a>(
|
||||
ops: &[impl AsRef<str> + Clone],
|
||||
) -> impl SimpleParser<char, String> + 'a {
|
||||
choice((
|
||||
op_parser(ops), // First try to parse a known operator
|
||||
text::ident().labelled("plain text"), // Failing that, parse plain text
|
||||
modname_parser() // Finally parse everything until the next forbidden char
modname_parser(), // Finally parse everything until the next forbidden char
|
||||
))
|
||||
.labelled("name")
|
||||
}
|
||||
@@ -65,7 +70,7 @@ pub fn name_parser<'a>(ops: &[impl AsRef<str> + Clone])
|
||||
/// and text, just not at the start.
|
||||
pub fn is_op(s: impl AsRef<str>) -> bool {
|
||||
return match s.as_ref().chars().next() {
|
||||
Some(x) => !x.is_alphanumeric(),
|
||||
None => false
|
||||
}
|
||||
Some(x) => !x.is_alphanumeric(),
|
||||
None => false,
|
||||
};
|
||||
}
|
||||
|
||||
@@ -1,6 +1,9 @@
|
||||
use chumsky::{self, prelude::*, Parser};
|
||||
use chumsky::prelude::*;
|
||||
use chumsky::{self, Parser};
|
||||
use ordered_float::NotNan;
|
||||
|
||||
use super::decls::SimpleParser;
|
||||
|
||||
fn assert_not_digit(base: u32, c: char) {
|
||||
if base > (10 + (c as u32 - 'a' as u32)) {
|
||||
panic!("The character '{}' is a digit in base ({})", c, base)
|
||||
@@ -8,9 +11,9 @@ fn assert_not_digit(base: u32, c: char) {
|
||||
}
|
||||
|
||||
/// Parse an arbitrarily grouped sequence of digits starting with an underscore.
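/// e.g. in the (hypothetical) literal `1_000_000`, this parser consumes the
/// trailing `_000_000` and returns the digits `"000000"` without the separators.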
|
||||
///
|
||||
///
|
||||
/// TODO: this should use separated_by and parse the leading group too
|
||||
fn separated_digits_parser(base: u32) -> impl Parser<char, String, Error = Simple<char>> {
|
||||
fn separated_digits_parser(base: u32) -> impl SimpleParser<char, String> {
|
||||
just('_')
|
||||
.ignore_then(text::digits(base))
|
||||
.repeated()
|
||||
@@ -18,57 +21,62 @@ fn separated_digits_parser(base: u32) -> impl Parser<char, String, Error = Simpl
|
||||
}
|
||||
|
||||
/// parse a grouped uint
|
||||
///
|
||||
///
|
||||
/// Not to be confused with [int_parser] which does a lot more
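/// e.g. `12_34` in base 10 parses to `1234u64` (illustrative input).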
|
||||
fn uint_parser(base: u32) -> impl Parser<char, u64, Error = Simple<char>> {
|
||||
text::int(base)
|
||||
.then(separated_digits_parser(base))
|
||||
.map(move |(s1, s2): (String, String)| {
|
||||
fn uint_parser(base: u32) -> impl SimpleParser<char, u64> {
|
||||
text::int(base).then(separated_digits_parser(base)).map(
|
||||
move |(s1, s2): (String, String)| {
|
||||
u64::from_str_radix(&(s1 + &s2), base).unwrap()
|
||||
})
|
||||
},
|
||||
)
|
||||
}
|
||||
|
||||
/// parse exponent notation, or return 0 as the default exponent.
|
||||
/// The exponent is always in decimal.
|
||||
fn pow_parser() -> impl Parser<char, i32, Error = Simple<char>> {
|
||||
/// The exponent is always in decimal.
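/// e.g. `p3` yields 3, `p-2` yields -2, and an absent exponent yields 0
/// (illustrative inputs).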
|
||||
fn pow_parser() -> impl SimpleParser<char, i32> {
|
||||
choice((
|
||||
just('p')
|
||||
.ignore_then(text::int(10))
|
||||
.map(|s: String| s.parse().unwrap()),
|
||||
just('p').ignore_then(text::int(10)).map(|s: String| s.parse().unwrap()),
|
||||
just("p-")
|
||||
.ignore_then(text::int(10))
|
||||
.map(|s: String| -s.parse::<i32>().unwrap()),
|
||||
)).or_else(|_| Ok(0))
|
||||
))
|
||||
.or_else(|_| Ok(0))
|
||||
}
|
||||
|
||||
/// returns a mapper that converts a mantissa and an exponent into a uint
|
||||
///
|
||||
///
|
||||
/// TODO it panics if it finds a negative exponent
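/// e.g. `nat2u(10)((3, 2)) == 300`, since the mantissa is scaled by base^exp
/// (illustrative call, not from the original source).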
|
||||
fn nat2u(base: u64) -> impl Fn((u64, i32),) -> u64 {
|
||||
fn nat2u(base: u64) -> impl Fn((u64, i32)) -> u64 {
|
||||
move |(val, exp)| {
|
||||
if exp == 0 {val}
|
||||
else {val * base.checked_pow(exp.try_into().unwrap()).unwrap()}
|
||||
if exp == 0 {
|
||||
val
|
||||
} else {
|
||||
val * base.checked_pow(exp.try_into().unwrap()).unwrap()
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// returns a mapper that converts a mantissa and an exponent into a float
|
||||
fn nat2f(base: u64) -> impl Fn((NotNan<f64>, i32),) -> NotNan<f64> {
|
||||
fn nat2f(base: u64) -> impl Fn((NotNan<f64>, i32)) -> NotNan<f64> {
|
||||
move |(val, exp)| {
|
||||
if exp == 0 {val}
|
||||
else {val * (base as f64).powf(exp.try_into().unwrap())}
|
||||
if exp == 0 {
|
||||
val
|
||||
} else {
|
||||
val * (base as f64).powf(exp.try_into().unwrap())
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// parse a uint from exponential notation (panics if 'p' is a digit in base)
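/// e.g. `2p3` in base 10 parses to `2000` (illustrative input).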
|
||||
fn pow_uint_parser(base: u32) -> impl Parser<char, u64, Error = Simple<char>> {
|
||||
fn pow_uint_parser(base: u32) -> impl SimpleParser<char, u64> {
|
||||
assert_not_digit(base, 'p');
|
||||
uint_parser(base).then(pow_parser()).map(nat2u(base.into()))
|
||||
}
|
||||
|
||||
/// parse a uint from a base determined by its prefix or lack thereof
|
||||
///
|
||||
///
|
||||
/// Not to be confused with [uint_parser] which is a component of it.
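/// e.g. `0x2f` parses as 47, `0b101` as 5, and a bare `12` as decimal 12
/// (illustrative inputs).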
|
||||
pub fn int_parser() -> impl Parser<char, u64, Error = Simple<char>> {
|
||||
pub fn int_parser() -> impl SimpleParser<char, u64> {
|
||||
choice((
|
||||
just("0b").ignore_then(pow_uint_parser(2)),
|
||||
just("0x").ignore_then(pow_uint_parser(16)),
|
||||
@@ -78,35 +86,40 @@ pub fn int_parser() -> impl Parser<char, u64, Error = Simple<char>> {
|
||||
}
|
||||
|
||||
/// parse a float from dot notation
|
||||
fn dotted_parser(base: u32) -> impl Parser<char, NotNan<f64>, Error = Simple<char>> {
|
||||
fn dotted_parser(base: u32) -> impl SimpleParser<char, NotNan<f64>> {
|
||||
uint_parser(base)
|
||||
.then(
|
||||
just('.').ignore_then(
|
||||
text::digits(base).then(separated_digits_parser(base))
|
||||
).map(move |(frac1, frac2)| {
|
||||
let frac = frac1 + &frac2;
|
||||
let frac_num = u64::from_str_radix(&frac, base).unwrap() as f64;
|
||||
let dexp = base.pow(frac.len().try_into().unwrap());
|
||||
frac_num / dexp as f64
|
||||
}).or_not().map(|o| o.unwrap_or_default())
|
||||
).try_map(|(wh, f), s| {
|
||||
NotNan::new(wh as f64 + f).map_err(|_| Simple::custom(s, "Float literal evaluates to NaN"))
|
||||
})
|
||||
.then(
|
||||
just('.')
|
||||
.ignore_then(text::digits(base).then(separated_digits_parser(base)))
|
||||
.map(move |(frac1, frac2)| {
|
||||
let frac = frac1 + &frac2;
|
||||
let frac_num = u64::from_str_radix(&frac, base).unwrap() as f64;
|
||||
let dexp = base.pow(frac.len().try_into().unwrap());
|
||||
frac_num / dexp as f64
|
||||
})
|
||||
.or_not()
|
||||
.map(|o| o.unwrap_or_default()),
|
||||
)
|
||||
.try_map(|(wh, f), s| {
|
||||
NotNan::new(wh as f64 + f)
|
||||
.map_err(|_| Simple::custom(s, "Float literal evaluates to NaN"))
|
||||
})
|
||||
}
|
||||
|
||||
/// parse a float from dotted and optionally also exponential notation
|
||||
fn pow_float_parser(base: u32) -> impl Parser<char, NotNan<f64>, Error = Simple<char>> {
|
||||
fn pow_float_parser(base: u32) -> impl SimpleParser<char, NotNan<f64>> {
|
||||
assert_not_digit(base, 'p');
|
||||
dotted_parser(base).then(pow_parser()).map(nat2f(base.into()))
|
||||
}
|
||||
|
||||
/// parse a float with dotted and optionally exponential notation from a base determined by its
|
||||
/// prefix
|
||||
pub fn float_parser() -> impl Parser<char, NotNan<f64>, Error = Simple<char>> {
|
||||
/// parse a float with dotted and optionally exponential notation from a base
|
||||
/// determined by its prefix
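/// e.g. `0x1.8` evaluates to 1.5 and `1.5p2` to 150 in decimal
/// (illustrative inputs).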
|
||||
pub fn float_parser() -> impl SimpleParser<char, NotNan<f64>> {
|
||||
choice((
|
||||
just("0b").ignore_then(pow_float_parser(2)),
|
||||
just("0x").ignore_then(pow_float_parser(16)),
|
||||
just('0').ignore_then(pow_float_parser(8)),
|
||||
pow_float_parser(10),
|
||||
)).labelled("float")
|
||||
))
|
||||
.labelled("float")
|
||||
}
|
||||
|
||||
@@ -1,16 +1,18 @@
|
||||
use chumsky::{Parser, prelude::*};
|
||||
use chumsky::prelude::*;
|
||||
use chumsky::Parser;
|
||||
|
||||
use crate::ast::{Placeholder, PHClass};
|
||||
use super::context::Context;
|
||||
use super::decls::SimpleParser;
|
||||
use super::number::int_parser;
|
||||
use crate::ast::{PHClass, Placeholder};
|
||||
|
||||
use super::{number::int_parser, context::Context};
|
||||
|
||||
pub fn placeholder_parser<'a>(ctx: impl Context + 'a)
|
||||
-> impl Parser<char, Placeholder, Error = Simple<char>> + 'a
|
||||
{
|
||||
pub fn placeholder_parser(
|
||||
ctx: impl Context,
|
||||
) -> impl SimpleParser<char, Placeholder> {
|
||||
choice((
|
||||
just("...").to(Some(true)),
|
||||
just("..").to(Some(false)),
|
||||
empty().to(None)
|
||||
empty().to(None),
|
||||
))
|
||||
.then(just("$").ignore_then(text::ident()))
|
||||
.then(just(":").ignore_then(int_parser()).or_not())
|
||||
@@ -19,12 +21,10 @@ pub fn placeholder_parser<'a>(ctx: impl Context + 'a)
|
||||
if let Some(nonzero) = vec_nonzero {
|
||||
let prio = vec_prio.unwrap_or_default();
|
||||
Ok(Placeholder { name, class: PHClass::Vec { nonzero, prio } })
|
||||
} else if vec_prio.is_some() {
|
||||
Err(Simple::custom(span, "Scalar placeholders have no priority"))
|
||||
} else {
|
||||
if vec_prio.is_some() {
|
||||
Err(Simple::custom(span, "Scalar placeholders have no priority"))
|
||||
} else {
|
||||
Ok(Placeholder { name, class: PHClass::Scalar })
|
||||
}
|
||||
Ok(Placeholder { name, class: PHClass::Scalar })
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
@@ -1,55 +1,67 @@
|
||||
use std::iter;
|
||||
use std::rc::Rc;
|
||||
|
||||
use crate::representations::location::Location;
|
||||
use crate::representations::sourcefile::{FileEntry, Member};
|
||||
use crate::enum_filter;
|
||||
use crate::ast::{Rule, Constant, Expr, Clause};
|
||||
use crate::interner::Token;
|
||||
|
||||
use super::Entry;
|
||||
use super::context::Context;
|
||||
use super::expression::xpr_parser;
|
||||
use super::import::import_parser;
|
||||
use super::lexer::{Lexeme, filter_map_lex};
|
||||
|
||||
use chumsky::{Parser, prelude::*};
|
||||
use chumsky::prelude::*;
|
||||
use chumsky::Parser;
|
||||
use itertools::Itertools;
|
||||
|
||||
fn rule_parser<'a>(ctx: impl Context + 'a)
|
||||
-> impl Parser<Entry, Rule, Error = Simple<Entry>> + 'a
|
||||
{
|
||||
xpr_parser(ctx.clone()).repeated().at_least(1)
|
||||
use super::context::Context;
|
||||
use super::decls::{SimpleParser, SimpleRecursive};
|
||||
use super::expression::xpr_parser;
|
||||
use super::import::import_parser;
|
||||
use super::lexer::{filter_map_lex, Lexeme};
|
||||
use super::Entry;
|
||||
use crate::ast::{Clause, Constant, Expr, Rule};
|
||||
use crate::enum_filter;
|
||||
use crate::representations::location::Location;
|
||||
use crate::representations::sourcefile::{FileEntry, Member, Namespace};
|
||||
|
||||
fn rule_parser<'a>(
|
||||
ctx: impl Context + 'a,
|
||||
) -> impl SimpleParser<Entry, Rule> + 'a {
|
||||
xpr_parser(ctx.clone())
|
||||
.repeated()
|
||||
.at_least(1)
|
||||
.then(filter_map_lex(enum_filter!(Lexeme::Rule)))
|
||||
.then(xpr_parser(ctx).repeated().at_least(1))
|
||||
.map(|((s, (prio, _)), t)| Rule{
|
||||
.map(|((s, (prio, _)), t)| Rule {
|
||||
source: Rc::new(s),
|
||||
prio,
|
||||
target: Rc::new(t)
|
||||
}).labelled("Rule")
|
||||
target: Rc::new(t),
|
||||
})
|
||||
.labelled("Rule")
|
||||
}
|
||||
|
||||
fn const_parser<'a>(ctx: impl Context + 'a)
|
||||
-> impl Parser<Entry, Constant, Error = Simple<Entry>> + 'a
|
||||
{
|
||||
fn const_parser<'a>(
|
||||
ctx: impl Context + 'a,
|
||||
) -> impl SimpleParser<Entry, Constant> + 'a {
|
||||
filter_map_lex(enum_filter!(Lexeme::Name))
|
||||
.then_ignore(Lexeme::Const.parser())
|
||||
.then(xpr_parser(ctx.clone()).repeated().at_least(1))
|
||||
.map(move |((name, _), value)| Constant{
|
||||
.map(move |((name, _), value)| Constant {
|
||||
name,
|
||||
value: if let Ok(ex) = value.iter().exactly_one() { ex.clone() }
|
||||
else {
|
||||
let start = value.first().expect("value cannot be empty")
|
||||
.location.range().expect("all locations in parsed source are known")
|
||||
value: if let Ok(ex) = value.iter().exactly_one() {
|
||||
ex.clone()
|
||||
} else {
|
||||
let start = value
|
||||
.first()
|
||||
.expect("value cannot be empty")
|
||||
.location
|
||||
.range()
|
||||
.expect("all locations in parsed source are known")
|
||||
.start;
|
||||
let end = value.last().expect("asserted right above")
|
||||
.location.range().expect("all locations in parsed source are known")
|
||||
let end = value
|
||||
.last()
|
||||
.expect("asserted right above")
|
||||
.location
|
||||
.range()
|
||||
.expect("all locations in parsed source are known")
|
||||
.end;
|
||||
Expr{
|
||||
Expr {
|
||||
location: Location::Range { file: ctx.file(), range: start..end },
|
||||
value: Clause::S('(', Rc::new(value))
|
||||
value: Clause::S('(', Rc::new(value)),
|
||||
}
|
||||
}
|
||||
},
|
||||
})
|
||||
}
|
||||
|
||||
@@ -60,56 +72,61 @@ pub fn collect_errors<T, E: chumsky::Error<T>>(e: Vec<E>) -> E {
|
||||
}
|
||||
|
||||
fn namespace_parser<'a>(
|
||||
line: impl Parser<Entry, FileEntry, Error = Simple<Entry>> + 'a,
|
||||
) -> impl Parser<Entry, (Token<String>, Vec<FileEntry>), Error = Simple<Entry>> + 'a {
|
||||
Lexeme::Namespace.parser()
|
||||
.ignore_then(filter_map_lex(enum_filter!(Lexeme::Name)))
|
||||
.then(
|
||||
any().repeated().delimited_by(
|
||||
Lexeme::LP('(').parser(),
|
||||
Lexeme::RP('(').parser()
|
||||
).try_map(move |body, _| {
|
||||
split_lines(&body)
|
||||
.map(|l| line.parse(l))
|
||||
.collect::<Result<Vec<_>,_>>()
|
||||
.map_err(collect_errors)
|
||||
})
|
||||
).map(move |((name, _), body)| {
|
||||
(name, body)
|
||||
})
|
||||
line: impl SimpleParser<Entry, FileEntry> + 'a,
|
||||
) -> impl SimpleParser<Entry, Namespace> + 'a {
|
||||
Lexeme::Namespace
|
||||
.parser()
|
||||
.ignore_then(filter_map_lex(enum_filter!(Lexeme::Name)))
|
||||
.then(
|
||||
any()
|
||||
.repeated()
|
||||
.delimited_by(Lexeme::LP('(').parser(), Lexeme::RP('(').parser())
|
||||
.try_map(move |body, _| {
|
||||
split_lines(&body)
|
||||
.map(|l| line.parse(l))
|
||||
.collect::<Result<Vec<_>, _>>()
|
||||
.map_err(collect_errors)
|
||||
}),
|
||||
)
|
||||
.map(move |((name, _), body)| Namespace { name, body })
|
||||
}
|
||||
|
||||
fn member_parser<'a>(
|
||||
line: impl Parser<Entry, FileEntry, Error = Simple<Entry>> + 'a,
|
||||
ctx: impl Context + 'a
|
||||
) -> impl Parser<Entry, Member, Error = Simple<Entry>> + 'a {
|
||||
line: impl SimpleParser<Entry, FileEntry> + 'a,
|
||||
ctx: impl Context + 'a,
|
||||
) -> impl SimpleParser<Entry, Member> + 'a {
|
||||
choice((
|
||||
namespace_parser(line)
|
||||
.map(|(name, body)| Member::Namespace(name, body)),
|
||||
namespace_parser(line).map(Member::Namespace),
|
||||
rule_parser(ctx.clone()).map(Member::Rule),
|
||||
const_parser(ctx).map(Member::Constant),
|
||||
))
|
||||
}
|
||||
|
||||
pub fn line_parser<'a>(ctx: impl Context + 'a)
|
||||
-> impl Parser<Entry, FileEntry, Error = Simple<Entry>> + 'a
|
||||
{
|
||||
recursive(|line: Recursive<Entry, FileEntry, Simple<Entry>>| {
|
||||
pub fn line_parser<'a>(
|
||||
ctx: impl Context + 'a,
|
||||
) -> impl SimpleParser<Entry, FileEntry> + 'a {
|
||||
recursive(|line: SimpleRecursive<Entry, FileEntry>| {
|
||||
choice((
|
||||
// In case the user code wants to parse doc comments
|
||||
filter_map_lex(enum_filter!(Lexeme >> FileEntry; Comment)).map(|(ent, _)| ent),
|
||||
filter_map_lex(enum_filter!(Lexeme >> FileEntry; Comment))
|
||||
.map(|(ent, _)| ent),
|
||||
// plain old imports
|
||||
Lexeme::Import.parser()
|
||||
Lexeme::Import
|
||||
.parser()
|
||||
.ignore_then(import_parser(ctx.clone()).map(FileEntry::Import)),
|
||||
Lexeme::Export.parser().ignore_then(choice((
|
||||
// token collection
|
||||
Lexeme::NS.parser().ignore_then(
|
||||
filter_map_lex(enum_filter!(Lexeme::Name)).map(|(e, _)| e)
|
||||
.separated_by(Lexeme::Name(ctx.interner().i(",")).parser())
|
||||
.delimited_by(Lexeme::LP('(').parser(), Lexeme::RP('(').parser())
|
||||
).map(FileEntry::Export),
|
||||
Lexeme::NS
|
||||
.parser()
|
||||
.ignore_then(
|
||||
filter_map_lex(enum_filter!(Lexeme::Name))
|
||||
.map(|(e, _)| e)
|
||||
.separated_by(Lexeme::Name(ctx.interner().i(",")).parser())
|
||||
.delimited_by(Lexeme::LP('(').parser(), Lexeme::RP('(').parser()),
|
||||
)
|
||||
.map(FileEntry::Export),
|
||||
// public declaration
|
||||
member_parser(line.clone(), ctx.clone()).map(FileEntry::Exported)
|
||||
member_parser(line.clone(), ctx.clone()).map(FileEntry::Exported),
|
||||
))),
|
||||
// This could match almost anything so it has to go last
|
||||
member_parser(line, ctx).map(FileEntry::Internal),
|
||||
@@ -123,13 +140,13 @@ pub fn split_lines(data: &[Entry]) -> impl Iterator<Item = &[Entry]> {
|
||||
let mut finished = false;
|
||||
iter::from_fn(move || {
|
||||
let mut paren_count = 0;
|
||||
while let Some((i, Entry{ lexeme, .. })) = source.next() {
|
||||
for (i, Entry { lexeme, .. }) in source.by_ref() {
|
||||
match lexeme {
|
||||
Lexeme::LP(_) => paren_count += 1,
|
||||
Lexeme::RP(_) => paren_count -= 1,
|
||||
Lexeme::BR if paren_count == 0 => {
|
||||
let begin = last_slice;
|
||||
last_slice = i+1;
|
||||
last_slice = i + 1;
|
||||
return Some(&data[begin..i]);
|
||||
},
|
||||
_ => (),
|
||||
@@ -138,8 +155,9 @@ pub fn split_lines(data: &[Entry]) -> impl Iterator<Item = &[Entry]> {
|
||||
// Include last line even without trailing newline
|
||||
if !finished {
|
||||
finished = true;
|
||||
return Some(&data[last_slice..])
|
||||
return Some(&data[last_slice..]);
|
||||
}
|
||||
None
|
||||
}).filter(|s| s.len() > 0)
|
||||
})
|
||||
.filter(|s| !s.is_empty())
|
||||
}
|
||||
|
||||
@@ -1,7 +1,10 @@
|
||||
use chumsky::{self, prelude::*, Parser};
|
||||
use chumsky::prelude::*;
|
||||
use chumsky::{self, Parser};
|
||||
|
||||
use super::decls::SimpleParser;
|
||||
|
||||
/// Parses a text character that is not the specified delimiter
|
||||
fn text_parser(delim: char) -> impl Parser<char, char, Error = Simple<char>> {
|
||||
fn text_parser(delim: char) -> impl SimpleParser<char, char> {
|
||||
// Copied directly from Chumsky's JSON example.
|
||||
let escape = just('\\').ignore_then(
|
||||
just('\\')
|
||||
@@ -12,35 +15,39 @@ fn text_parser(delim: char) -> impl Parser<char, char, Error = Simple<char>> {
|
||||
.or(just('n').to('\n'))
|
||||
.or(just('r').to('\r'))
|
||||
.or(just('t').to('\t'))
|
||||
.or(just('u').ignore_then(
|
||||
filter(|c: &char| c.is_ascii_hexdigit())
|
||||
.repeated()
|
||||
.exactly(4)
|
||||
.collect::<String>()
|
||||
.validate(|digits, span, emit| {
|
||||
char::from_u32(u32::from_str_radix(&digits, 16).unwrap())
|
||||
.unwrap_or_else(|| {
|
||||
emit(Simple::custom(span, "invalid unicode character"));
|
||||
'\u{FFFD}' // unicode replacement character
|
||||
})
|
||||
}),
|
||||
)),
|
||||
.or(
|
||||
just('u').ignore_then(
|
||||
filter(|c: &char| c.is_ascii_hexdigit())
|
||||
.repeated()
|
||||
.exactly(4)
|
||||
.collect::<String>()
|
||||
.validate(|digits, span, emit| {
|
||||
char::from_u32(u32::from_str_radix(&digits, 16).unwrap())
|
||||
.unwrap_or_else(|| {
|
||||
emit(Simple::custom(span, "invalid unicode character"));
|
||||
'\u{FFFD}' // unicode replacement character
|
||||
})
|
||||
}),
|
||||
),
|
||||
),
|
||||
);
|
||||
filter(move |&c| c != '\\' && c != delim).or(escape)
|
||||
}
|
||||
|
||||
/// Parse a character literal between single quotes
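/// e.g. `'a'` yields 'a' and `'\n'` yields a newline via the escape rules above
/// (illustrative inputs).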
|
||||
pub fn char_parser() -> impl Parser<char, char, Error = Simple<char>> {
|
||||
pub fn char_parser() -> impl SimpleParser<char, char> {
|
||||
just('\'').ignore_then(text_parser('\'')).then_ignore(just('\''))
|
||||
}
|
||||
|
||||
/// Parse a string between double quotes
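/// e.g. `"a\tb"` yields the three-character string `a`, tab, `b`, while a
/// backslash-newline pair inside the quotes is skipped (illustrative inputs).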
|
||||
pub fn str_parser() -> impl Parser<char, String, Error = Simple<char>> {
|
||||
pub fn str_parser() -> impl SimpleParser<char, String> {
|
||||
just('"')
|
||||
.ignore_then(
|
||||
text_parser('"').map(Some)
|
||||
.ignore_then(
|
||||
text_parser('"').map(Some)
|
||||
.or(just("\\\n").map(|_| None)) // Newlines preceded by backslashes are ignored.
|
||||
.repeated()
|
||||
).then_ignore(just('"'))
|
||||
.flatten().collect()
|
||||
.repeated(),
|
||||
)
|
||||
.then_ignore(just('"'))
|
||||
.flatten()
|
||||
.collect()
|
||||
}
|
||||
|
||||
@@ -1,15 +1,15 @@
|
||||
mod project_error;
|
||||
mod parse_error_with_path;
|
||||
mod unexpected_directory;
|
||||
mod module_not_found;
|
||||
mod not_exported;
|
||||
mod parse_error_with_path;
|
||||
mod project_error;
|
||||
mod too_many_supers;
|
||||
mod unexpected_directory;
|
||||
mod visibility_mismatch;
|
||||
|
||||
pub use project_error::{ErrorPosition, ProjectError};
|
||||
pub use parse_error_with_path::ParseErrorWithPath;
|
||||
pub use unexpected_directory::UnexpectedDirectory;
|
||||
pub use module_not_found::ModuleNotFound;
|
||||
pub use not_exported::NotExported;
|
||||
pub use parse_error_with_path::ParseErrorWithPath;
|
||||
pub use project_error::{ErrorPosition, ProjectError};
|
||||
pub use too_many_supers::TooManySupers;
|
||||
pub use visibility_mismatch::VisibilityMismatch;
|
||||
pub use unexpected_directory::UnexpectedDirectory;
|
||||
pub use visibility_mismatch::VisibilityMismatch;
|
||||
|
||||
@@ -1,16 +1,16 @@
|
||||
use crate::utils::{BoxedIter, iter::box_once};
|
||||
|
||||
use super::{ProjectError, ErrorPosition};
|
||||
use super::{ErrorPosition, ProjectError};
|
||||
use crate::utils::iter::box_once;
|
||||
use crate::utils::BoxedIter;
|
||||
|
||||
/// Error produced when an import refers to a nonexistent module
|
||||
#[derive(Clone, Debug, PartialEq, Eq, Hash)]
|
||||
pub struct ModuleNotFound {
|
||||
pub file: Vec<String>,
|
||||
pub subpath: Vec<String>
|
||||
pub subpath: Vec<String>,
|
||||
}
|
||||
impl ProjectError for ModuleNotFound {
|
||||
fn description(&self) -> &str {
|
||||
"an import refers to a nonexistent module"
|
||||
"an import refers to a nonexistent module"
|
||||
}
|
||||
fn message(&self) -> String {
|
||||
format!(
|
||||
@@ -22,4 +22,4 @@ impl ProjectError for ModuleNotFound {
|
||||
fn positions(&self) -> BoxedIter<ErrorPosition> {
|
||||
box_once(ErrorPosition::just_file(self.file.clone()))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,8 +1,8 @@
|
||||
use std::rc::Rc;
|
||||
|
||||
use crate::{utils::BoxedIter, representations::location::Location};
|
||||
|
||||
use super::{ProjectError, ErrorPosition};
|
||||
use super::{ErrorPosition, ProjectError};
|
||||
use crate::representations::location::Location;
|
||||
use crate::utils::BoxedIter;
|
||||
|
||||
#[derive(Debug)]
|
||||
pub struct NotExported {
|
||||
@@ -16,21 +16,21 @@ impl ProjectError for NotExported {
|
||||
"An import refers to a symbol that exists but isn't exported"
|
||||
}
|
||||
fn positions(&self) -> BoxedIter<ErrorPosition> {
|
||||
Box::new([
|
||||
ErrorPosition{
|
||||
location: Location::File(Rc::new(self.file.clone())),
|
||||
message: Some(format!(
|
||||
"{} isn't exported",
|
||||
self.subpath.join("::")
|
||||
)),
|
||||
},
|
||||
ErrorPosition{
|
||||
location: Location::File(Rc::new(self.referrer_file.clone())),
|
||||
message: Some(format!(
|
||||
"{} cannot see this symbol",
|
||||
self.referrer_subpath.join("::")
|
||||
)),
|
||||
}
|
||||
].into_iter())
|
||||
Box::new(
|
||||
[
|
||||
ErrorPosition {
|
||||
location: Location::File(Rc::new(self.file.clone())),
|
||||
message: Some(format!("{} isn't exported", self.subpath.join("::"))),
|
||||
},
|
||||
ErrorPosition {
|
||||
location: Location::File(Rc::new(self.referrer_file.clone())),
|
||||
message: Some(format!(
|
||||
"{} cannot see this symbol",
|
||||
self.referrer_subpath.join("::")
|
||||
)),
|
||||
},
|
||||
]
|
||||
.into_iter(),
|
||||
)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,21 +1,21 @@
|
||||
use std::rc::Rc;
|
||||
|
||||
use super::{ErrorPosition, ProjectError};
|
||||
use crate::parse::ParseError;
|
||||
use crate::representations::location::Location;
|
||||
use crate::utils::BoxedIter;
|
||||
use crate::parse::ParseError;
|
||||
|
||||
use super::ErrorPosition;
|
||||
use super::ProjectError;
|
||||
|
||||
/// Produced by stages that parse text when it fails.
|
||||
#[derive(Debug)]
|
||||
pub struct ParseErrorWithPath {
|
||||
pub full_source: String,
|
||||
pub path: Vec<String>,
|
||||
pub error: ParseError
|
||||
pub error: ParseError,
|
||||
}
|
||||
impl ProjectError for ParseErrorWithPath {
|
||||
fn description(&self) -> &str {"Failed to parse code"}
|
||||
fn description(&self) -> &str {
|
||||
"Failed to parse code"
|
||||
}
|
||||
fn positions(&self) -> BoxedIter<ErrorPosition> {
|
||||
match &self.error {
|
||||
ParseError::Lex(lex) => Box::new(lex.iter().map(|s| ErrorPosition {
|
||||
@@ -23,15 +23,20 @@ impl ProjectError for ParseErrorWithPath {
|
||||
file: Rc::new(self.path.clone()),
|
||||
range: s.span(),
|
||||
},
|
||||
message: Some(s.to_string())
|
||||
message: Some(s.to_string()),
|
||||
})),
|
||||
ParseError::Ast(ast) => Box::new(ast.iter().map(|(_i, s)| ErrorPosition {
|
||||
location: s.found().map(|e| Location::Range {
|
||||
file: Rc::new(self.path.clone()),
|
||||
range: e.range.clone()
|
||||
}).unwrap_or_else(|| Location::File(Rc::new(self.path.clone()))),
|
||||
message: Some(s.label().unwrap_or("Parse error").to_string())
|
||||
ParseError::Ast(ast) => Box::new(ast.iter().map(|(_i, s)| {
|
||||
ErrorPosition {
|
||||
location: s
|
||||
.found()
|
||||
.map(|e| Location::Range {
|
||||
file: Rc::new(self.path.clone()),
|
||||
range: e.range.clone(),
|
||||
})
|
||||
.unwrap_or_else(|| Location::File(Rc::new(self.path.clone()))),
|
||||
message: Some(s.label().unwrap_or("Parse error").to_string()),
|
||||
}
|
||||
})),
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -8,7 +8,7 @@ use crate::utils::BoxedIter;
|
||||
/// processing got stuck, a command that is likely to be incorrect
|
||||
pub struct ErrorPosition {
|
||||
pub location: Location,
|
||||
pub message: Option<String>
|
||||
pub message: Option<String>,
|
||||
}
|
||||
|
||||
impl ErrorPosition {
|
||||
@@ -24,12 +24,17 @@ pub trait ProjectError: Debug {
|
||||
/// A general description of this type of error
|
||||
fn description(&self) -> &str;
|
||||
/// A formatted message that includes specific parameters
|
||||
fn message(&self) -> String {String::new()}
|
||||
fn message(&self) -> String {
|
||||
String::new()
|
||||
}
|
||||
/// Code positions relevant to this error
|
||||
fn positions(&self) -> BoxedIter<ErrorPosition>;
|
||||
/// Convert the error into an [Rc<dyn ProjectError>] to be able to
|
||||
/// handle various errors together
|
||||
fn rc(self) -> Rc<dyn ProjectError> where Self: Sized + 'static {
|
||||
fn rc(self) -> Rc<dyn ProjectError>
|
||||
where
|
||||
Self: Sized + 'static,
|
||||
{
|
||||
Rc::new(self)
|
||||
}
|
||||
}
|
||||
@@ -41,10 +46,12 @@ impl Display for dyn ProjectError {
|
||||
let positions = self.positions();
|
||||
write!(f, "Problem with the project: {description}; {message}")?;
|
||||
for ErrorPosition { location, message } in positions {
|
||||
write!(f, "@{location}: {}",
|
||||
write!(
|
||||
f,
|
||||
"@{location}: {}",
|
||||
message.unwrap_or("location of interest".to_string())
|
||||
)?
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,8 +1,9 @@
|
||||
use std::rc::Rc;
|
||||
|
||||
use crate::{utils::{BoxedIter, iter::box_once}, representations::location::Location};
|
||||
|
||||
use super::{ProjectError, ErrorPosition};
|
||||
use super::{ErrorPosition, ProjectError};
|
||||
use crate::representations::location::Location;
|
||||
use crate::utils::iter::box_once;
|
||||
use crate::utils::BoxedIter;
|
||||
|
||||
/// Error produced when an import path starts with more `super` segments
|
||||
/// than the current module's absolute path
|
||||
@@ -10,12 +11,12 @@ use super::{ProjectError, ErrorPosition};
|
||||
pub struct TooManySupers {
|
||||
pub path: Vec<String>,
|
||||
pub offender_file: Vec<String>,
|
||||
pub offender_mod: Vec<String>
|
||||
pub offender_mod: Vec<String>,
|
||||
}
|
||||
impl ProjectError for TooManySupers {
|
||||
fn description(&self) -> &str {
|
||||
"an import path starts with more `super` segments than \
|
||||
the current module's absolute path"
|
||||
"an import path starts with more `super` segments than the current \
|
||||
module's absolute path"
|
||||
}
|
||||
fn message(&self) -> String {
|
||||
format!(
|
||||
@@ -32,7 +33,7 @@ impl ProjectError for TooManySupers {
|
||||
"path {} in {} contains too many `super` steps.",
|
||||
self.path.join("::"),
|
||||
self.offender_mod.join("::")
|
||||
))
|
||||
)),
|
||||
})
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,18 +1,17 @@
|
||||
use crate::utils::{BoxedIter, iter::box_once};
|
||||
|
||||
use super::ErrorPosition;
|
||||
use super::ProjectError;
|
||||
use super::{ErrorPosition, ProjectError};
|
||||
use crate::utils::iter::box_once;
|
||||
use crate::utils::BoxedIter;
|
||||
|
||||
/// Produced when a stage that deals specifically with code encounters
|
||||
/// a path that refers to a directory
|
||||
#[derive(Debug)]
|
||||
pub struct UnexpectedDirectory {
|
||||
pub path: Vec<String>
|
||||
pub path: Vec<String>,
|
||||
}
|
||||
impl ProjectError for UnexpectedDirectory {
|
||||
fn description(&self) -> &str {
|
||||
"A stage that deals specifically with code encountered a path \
|
||||
that refers to a directory"
|
||||
"A stage that deals specifically with code encountered a path that refers \
|
||||
to a directory"
|
||||
}
|
||||
fn positions(&self) -> BoxedIter<ErrorPosition> {
|
||||
box_once(ErrorPosition::just_file(self.path.clone()))
|
||||
@@ -23,4 +22,4 @@ impl ProjectError for UnexpectedDirectory {
|
||||
self.path.join("/")
|
||||
)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,13 +1,14 @@
|
||||
use std::rc::Rc;
|
||||
use crate::representations::location::Location;
|
||||
use crate::utils::{BoxedIter, iter::box_once};
|
||||
|
||||
use super::project_error::{ProjectError, ErrorPosition};
|
||||
use super::project_error::{ErrorPosition, ProjectError};
|
||||
use crate::representations::location::Location;
|
||||
use crate::utils::iter::box_once;
|
||||
use crate::utils::BoxedIter;
|
||||
|
||||
#[derive(Debug)]
|
||||
pub struct VisibilityMismatch{
|
||||
pub struct VisibilityMismatch {
|
||||
pub namespace: Vec<String>,
|
||||
pub file: Rc<Vec<String>>
|
||||
pub file: Rc<Vec<String>>,
|
||||
}
|
||||
impl ProjectError for VisibilityMismatch {
|
||||
fn description(&self) -> &str {
|
||||
@@ -19,7 +20,7 @@ impl ProjectError for VisibilityMismatch {
|
||||
message: Some(format!(
|
||||
"{} is opened multiple times with different visibilities",
|
||||
self.namespace.join("::")
|
||||
))
|
||||
)),
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,25 +1,23 @@
|
||||
use std::path::Path;
|
||||
use std::path::{Path, PathBuf};
|
||||
use std::rc::Rc;
|
||||
use std::path::PathBuf;
|
||||
use std::io;
|
||||
use std::fs;
|
||||
use std::{fs, io};
|
||||
|
||||
use crate::interner::{Interner, Sym};
|
||||
use crate::pipeline::error::{
|
||||
ErrorPosition, ProjectError, UnexpectedDirectory,
|
||||
};
|
||||
use crate::utils::iter::box_once;
|
||||
use crate::utils::{Cache, BoxedIter};
|
||||
use crate::interner::{Interner, Token};
|
||||
use crate::pipeline::error::UnexpectedDirectory;
|
||||
use crate::pipeline::error::{ProjectError, ErrorPosition};
|
||||
use crate::utils::{BoxedIter, Cache};
|
||||
|
||||
#[derive(Debug)]
|
||||
pub struct FileLoadingError{
|
||||
pub struct FileLoadingError {
|
||||
file: io::Error,
|
||||
dir: io::Error,
|
||||
path: Vec<String>
|
||||
path: Vec<String>,
|
||||
}
|
||||
impl ProjectError for FileLoadingError {
|
||||
fn description(&self) -> &str {
|
||||
"Neither a file nor a directory could be read from \
|
||||
the requested path"
|
||||
"Neither a file nor a directory could be read from the requested path"
|
||||
}
|
||||
fn positions(&self) -> BoxedIter<ErrorPosition> {
|
||||
box_once(ErrorPosition::just_file(self.path.clone()))
|
||||
@@ -37,57 +35,55 @@ pub enum Loaded {
|
||||
Collection(Rc<Vec<String>>),
|
||||
}
|
||||
impl Loaded {
|
||||
pub fn is_code(&self) -> bool {matches!(self, Loaded::Code(_))}
|
||||
pub fn is_code(&self) -> bool {
|
||||
matches!(self, Loaded::Code(_))
|
||||
}
|
||||
}
|
||||
|
||||
pub type IOResult = Result<Loaded, Rc<dyn ProjectError>>;
|
||||
|
||||
pub type FileCache<'a> = Cache<'a, Token<Vec<Token<String>>>, IOResult>;
|
||||
pub type FileCache<'a> = Cache<'a, Sym, IOResult>;
|
||||
|
||||
/// Load a file from a path expressed in Rust strings, but relative to
|
||||
/// a root expressed as an OS Path.
|
||||
pub fn load_file(root: &Path, path: &[impl AsRef<str>]) -> IOResult {
|
||||
// let os_path = path.into_iter()
|
||||
// .map_into::<OsString>()
|
||||
// .collect::<Vec<_>>();
|
||||
let full_path = path.iter().fold(
|
||||
root.to_owned(),
|
||||
|p, s| p.join(s.as_ref())
|
||||
);
|
||||
let full_path = path.iter().fold(root.to_owned(), |p, s| p.join(s.as_ref()));
|
||||
let file_path = full_path.with_extension("orc");
|
||||
let file_error = match fs::read_to_string(&file_path) {
|
||||
let file_error = match fs::read_to_string(file_path) {
|
||||
Ok(string) => return Ok(Loaded::Code(Rc::new(string))),
|
||||
Err(err) => err
|
||||
Err(err) => err,
|
||||
};
|
||||
let dir = match fs::read_dir(&full_path) {
|
||||
Ok(dir) => dir,
|
||||
Err(dir_error) => {
|
||||
return Err(FileLoadingError {
|
||||
file: file_error,
|
||||
dir: dir_error,
|
||||
path: path.iter()
|
||||
.map(|s| s.as_ref().to_string())
|
||||
.collect(),
|
||||
}.rc())
|
||||
}
|
||||
Err(dir_error) =>
|
||||
return Err(
|
||||
FileLoadingError {
|
||||
file: file_error,
|
||||
dir: dir_error,
|
||||
path: path.iter().map(|s| s.as_ref().to_string()).collect(),
|
||||
}
|
||||
.rc(),
|
||||
),
|
||||
};
|
||||
let names = dir.filter_map(Result::ok)
|
||||
let names = dir
|
||||
.filter_map(Result::ok)
|
||||
.filter_map(|ent| {
|
||||
let fname = ent.file_name().into_string().ok()?;
|
||||
let ftyp = ent.metadata().ok()?.file_type();
|
||||
Some(if ftyp.is_dir() {fname} else {
|
||||
Some(if ftyp.is_dir() {
|
||||
fname
|
||||
} else {
|
||||
fname.strip_suffix(".or")?.to_string()
|
||||
})
|
||||
}).collect();
|
||||
})
|
||||
.collect();
|
||||
Ok(Loaded::Collection(Rc::new(names)))
|
||||
}
|
||||
|
||||
/// Generates a cached file loader for a directory
|
||||
pub fn mk_cache(root: PathBuf, i: &Interner) -> FileCache {
|
||||
Cache::new(move |token: Token<Vec<Token<String>>>, _this| -> IOResult {
|
||||
let path = i.r(token).iter()
|
||||
.map(|t| i.r(*t).as_str())
|
||||
.collect::<Vec<_>>();
|
||||
Cache::new(move |token: Sym, _this| -> IOResult {
|
||||
let path = i.r(token).iter().map(|t| i.r(*t).as_str()).collect::<Vec<_>>();
|
||||
load_file(&root, &path)
|
||||
})
|
||||
}
|
||||
@@ -95,12 +91,18 @@ pub fn mk_cache(root: PathBuf, i: &Interner) -> FileCache {
|
||||
/// Loads the string contents of a file at the given location.
|
||||
/// If the path points to a directory, raises an error.
|
||||
pub fn load_text(
|
||||
path: Token<Vec<Token<String>>>,
|
||||
load_file: &impl Fn(Token<Vec<Token<String>>>) -> IOResult,
|
||||
i: &Interner
|
||||
path: Sym,
|
||||
load_file: &impl Fn(Sym) -> IOResult,
|
||||
i: &Interner,
|
||||
) -> Result<Rc<String>, Rc<dyn ProjectError>> {
|
||||
if let Loaded::Code(s) = load_file(path)? {Ok(s)}
|
||||
else {Err(UnexpectedDirectory{
|
||||
path: i.r(path).iter().map(|t| i.r(*t)).cloned().collect()
|
||||
}.rc())}
|
||||
}
|
||||
if let Loaded::Code(s) = load_file(path)? {
|
||||
Ok(s)
|
||||
} else {
|
||||
Err(
|
||||
UnexpectedDirectory {
|
||||
path: i.r(path).iter().map(|t| i.r(*t)).cloned().collect(),
|
||||
}
|
||||
.rc(),
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
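For readers skimming the reformatted loader above, here is a minimal usage sketch of `mk_cache` and the `Loaded` enum. The example root directory and the `main` module name are assumptions chosen for illustration; nothing below is part of this commit.

```rust
use std::path::PathBuf;
use std::rc::Rc;

use crate::interner::Interner;
use crate::pipeline::error::ProjectError;
use crate::pipeline::file_loader::{mk_cache, Loaded};

// Hypothetical helper: report whether `main` resolves to a source file or a
// directory under the chosen project root.
fn describe_main(i: &Interner) -> Result<(), Rc<dyn ProjectError>> {
  let cache = mk_cache(PathBuf::from("./examples/list-processing"), i);
  let main_sym = i.i(&[i.i("main")][..]);
  match cache.find(&main_sym)? {
    Loaded::Code(src) => println!("main is a file with {} bytes of source", src.len()),
    Loaded::Collection(names) => println!("main is a directory with {} entries", names.len()),
  }
  Ok(())
}
```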
@@ -1,32 +1,32 @@
|
||||
use std::rc::Rc;
|
||||
|
||||
use crate::representations::tree::Module;
|
||||
use crate::representations::sourcefile::absolute_path;
|
||||
use crate::utils::{Substack};
|
||||
use crate::interner::{Token, Interner};
|
||||
|
||||
use super::error::{ProjectError, TooManySupers};
|
||||
use crate::interner::{Interner, Tok};
|
||||
use crate::representations::sourcefile::absolute_path;
|
||||
use crate::utils::Substack;
|
||||
|
||||
pub fn import_abs_path(
|
||||
src_path: &[Token<String>],
|
||||
mod_stack: Substack<Token<String>>,
|
||||
module: &Module<impl Clone, impl Clone>,
|
||||
import_path: &[Token<String>],
|
||||
src_path: &[Tok<String>],
|
||||
mod_stack: Substack<Tok<String>>,
|
||||
import_path: &[Tok<String>],
|
||||
i: &Interner,
|
||||
) -> Result<Vec<Token<String>>, Rc<dyn ProjectError>> {
|
||||
) -> Result<Vec<Tok<String>>, Rc<dyn ProjectError>> {
|
||||
// path of module within file
|
||||
let mod_pathv = mod_stack.iter().rev_vec_clone();
|
||||
// path of module within compilation
|
||||
let abs_pathv = src_path.iter().copied()
|
||||
let abs_pathv = src_path
|
||||
.iter()
|
||||
.copied()
|
||||
.chain(mod_pathv.iter().copied())
|
||||
.collect::<Vec<_>>();
|
||||
// preload-target path relative to module
|
||||
// preload-target path within compilation
|
||||
absolute_path(&abs_pathv, import_path, i, &|n| {
|
||||
module.items.contains_key(&n)
|
||||
}).map_err(|_| TooManySupers{
|
||||
absolute_path(&abs_pathv, import_path, i).map_err(|_| {
|
||||
TooManySupers {
|
||||
path: import_path.iter().map(|t| i.r(*t)).cloned().collect(),
|
||||
offender_file: src_path.iter().map(|t| i.r(*t)).cloned().collect(),
|
||||
offender_mod: mod_pathv.iter().map(|t| i.r(*t)).cloned().collect(),
|
||||
}.rc())
|
||||
}
|
||||
}
|
||||
.rc()
|
||||
})
|
||||
}
|
||||
|
||||
@@ -1,18 +1,20 @@
|
||||
use hashbrown::{HashMap, HashSet};
|
||||
|
||||
use std::hash::Hash;
|
||||
|
||||
use crate::interner::Token;
|
||||
use hashbrown::{HashMap, HashSet};
|
||||
|
||||
use crate::interner::Sym;
|
||||
|
||||
#[derive(Clone, Debug, Default)]
|
||||
pub struct AliasMap{
|
||||
pub targets: HashMap<Token<Vec<Token<String>>>, Token<Vec<Token<String>>>>,
|
||||
pub aliases: HashMap<Token<Vec<Token<String>>>, HashSet<Token<Vec<Token<String>>>>>,
|
||||
pub struct AliasMap {
|
||||
pub targets: HashMap<Sym, Sym>,
|
||||
pub aliases: HashMap<Sym, HashSet<Sym>>,
|
||||
}
|
||||
impl AliasMap {
|
||||
pub fn new() -> Self {Self::default()}
|
||||
pub fn new() -> Self {
|
||||
Self::default()
|
||||
}
|
||||
|
||||
pub fn link(&mut self, alias: Token<Vec<Token<String>>>, target: Token<Vec<Token<String>>>) {
|
||||
pub fn link(&mut self, alias: Sym, target: Sym) {
|
||||
let prev = self.targets.insert(alias, target);
|
||||
debug_assert!(prev.is_none(), "Alias already has a target");
|
||||
multimap_entry(&mut self.aliases, &target).insert(alias);
|
||||
@@ -21,9 +23,7 @@ impl AliasMap {
|
||||
for alt in alts {
|
||||
// Assert that this step has always been done in the past
|
||||
debug_assert!(
|
||||
self.aliases.get(&alt)
|
||||
.map(HashSet::is_empty)
|
||||
.unwrap_or(true),
|
||||
self.aliases.get(&alt).map(HashSet::is_empty).unwrap_or(true),
|
||||
"Alias set of alias not empty"
|
||||
);
|
||||
debug_assert!(
|
||||
@@ -35,7 +35,7 @@ impl AliasMap {
|
||||
}
|
||||
}
|
||||
|
||||
pub fn resolve(&self, alias: Token<Vec<Token<String>>>) -> Option<Token<Vec<Token<String>>>> {
|
||||
pub fn resolve(&self, alias: Sym) -> Option<Sym> {
|
||||
self.targets.get(&alias).copied()
|
||||
}
|
||||
}
|
||||
@@ -44,10 +44,11 @@ impl AliasMap {
|
||||
/// map-to-set (aka. multimap)
|
||||
fn multimap_entry<'a, K: Eq + Hash + Clone, V>(
|
||||
map: &'a mut HashMap<K, HashSet<V>>,
|
||||
key: &'_ K
|
||||
key: &'_ K,
|
||||
) -> &'a mut HashSet<V> {
|
||||
map.raw_entry_mut()
|
||||
map
|
||||
.raw_entry_mut()
|
||||
.from_key(key)
|
||||
.or_insert_with(|| (key.clone(), HashSet::new()))
|
||||
.1
|
||||
}
|
||||
}
|
||||
|
||||
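A quick sketch of how the `AliasMap` above is meant to be driven. The two `Sym` values are assumed to have been interned elsewhere; the snippet is illustrative rather than code from this commit.

```rust
// `import_sym` is the alias created by an import, `origin_sym` the name it
// refers to; both are hypothetical, pre-interned Sym values.
let mut map = AliasMap::new();
map.link(import_sym, origin_sym);
assert_eq!(map.resolve(import_sym), Some(origin_sym));
assert_eq!(map.resolve(origin_sym), None); // only registered aliases resolve
```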
@@ -2,22 +2,28 @@ use std::rc::Rc;
|
||||
|
||||
use hashbrown::HashMap;
|
||||
|
||||
use crate::{utils::Substack, interner::{Token, Interner}, pipeline::{ProjectModule, ProjectExt}, representations::tree::{ModEntry, ModMember}, ast::{Rule, Expr}};
|
||||
|
||||
use super::{alias_map::AliasMap, decls::InjectedAsFn};
|
||||
use super::alias_map::AliasMap;
|
||||
use super::decls::InjectedAsFn;
|
||||
use crate::ast::{Expr, Rule};
|
||||
use crate::interner::{Interner, Sym, Tok};
|
||||
use crate::pipeline::{ProjectExt, ProjectModule};
|
||||
use crate::representations::tree::{ModEntry, ModMember};
|
||||
use crate::utils::Substack;
|
||||
|
||||
fn resolve(
|
||||
token: Token<Vec<Token<String>>>,
|
||||
token: Sym,
|
||||
alias_map: &AliasMap,
|
||||
i: &Interner,
|
||||
) -> Option<Vec<Token<String>>> {
|
||||
) -> Option<Vec<Tok<String>>> {
|
||||
if let Some(alias) = alias_map.resolve(token) {
|
||||
Some(i.r(alias).clone())
|
||||
} else if let Some((foot, body)) = i.r(token).split_last() {
|
||||
let mut new_beginning = resolve(i.i(body), alias_map, i)?;
|
||||
new_beginning.push(*foot);
|
||||
Some(new_beginning)
|
||||
} else {None}
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
fn process_expr(
|
||||
@@ -26,74 +32,91 @@ fn process_expr(
|
||||
injected_as: &impl InjectedAsFn,
|
||||
i: &Interner,
|
||||
) -> Expr {
|
||||
expr.map_names(&|n| {
|
||||
injected_as(&i.r(n)[..]).or_else(|| {
|
||||
let next_v = resolve(n, alias_map, i)?;
|
||||
// println!("Resolved alias {} to {}",
|
||||
// i.extern_vec(n).join("::"),
|
||||
// i.extern_all(&next_v).join("::")
|
||||
// );
|
||||
Some(
|
||||
injected_as(&next_v)
|
||||
.unwrap_or_else(|| i.i(&next_v))
|
||||
)
|
||||
expr
|
||||
.map_names(&|n| {
|
||||
injected_as(&i.r(n)[..]).or_else(|| {
|
||||
let next_v = resolve(n, alias_map, i)?;
|
||||
// println!("Resolved alias {} to {}",
|
||||
// i.extern_vec(n).join("::"),
|
||||
// i.extern_all(&next_v).join("::")
|
||||
// );
|
||||
Some(injected_as(&next_v).unwrap_or_else(|| i.i(&next_v)))
|
||||
})
|
||||
})
|
||||
}).unwrap_or_else(|| expr.clone())
|
||||
.unwrap_or_else(|| expr.clone())
|
||||
}
|
||||
|
||||
// TODO: replace is_injected with injected_as
|
||||
/// Replace all aliases with the name they're originally defined as
|
||||
fn apply_aliases_rec(
|
||||
path: Substack<Token<String>>,
|
||||
path: Substack<Tok<String>>,
|
||||
module: &ProjectModule,
|
||||
alias_map: &AliasMap,
|
||||
i: &Interner,
|
||||
injected_as: &impl InjectedAsFn,
|
||||
) -> ProjectModule {
|
||||
let items = module.items.iter().map(|(name, ent)| {
|
||||
let ModEntry{ exported, member } = ent;
|
||||
let member = match member {
|
||||
ModMember::Item(expr) => ModMember::Item(
|
||||
process_expr(expr, alias_map, injected_as, i)
|
||||
),
|
||||
ModMember::Sub(module) => {
|
||||
let subpath = path.push(*name);
|
||||
let is_ignored = injected_as(&subpath.iter().rev_vec_clone()).is_some();
|
||||
let new_mod = if is_ignored {module.clone()} else {
|
||||
let module = module.as_ref();
|
||||
Rc::new(apply_aliases_rec(
|
||||
subpath, module,
|
||||
alias_map, i, injected_as
|
||||
))
|
||||
};
|
||||
ModMember::Sub(new_mod)
|
||||
let items = module
|
||||
.items
|
||||
.iter()
|
||||
.map(|(name, ent)| {
|
||||
let ModEntry { exported, member } = ent;
|
||||
let member = match member {
|
||||
ModMember::Item(expr) =>
|
||||
ModMember::Item(process_expr(expr, alias_map, injected_as, i)),
|
||||
ModMember::Sub(module) => {
|
||||
let subpath = path.push(*name);
|
||||
let is_ignored =
|
||||
injected_as(&subpath.iter().rev_vec_clone()).is_some();
|
||||
let new_mod = if is_ignored {
|
||||
module.clone()
|
||||
} else {
|
||||
let module = module.as_ref();
|
||||
Rc::new(apply_aliases_rec(
|
||||
subpath,
|
||||
module,
|
||||
alias_map,
|
||||
i,
|
||||
injected_as,
|
||||
))
|
||||
};
|
||||
ModMember::Sub(new_mod)
|
||||
},
|
||||
};
|
||||
(*name, ModEntry { exported: *exported, member })
|
||||
})
|
||||
.collect::<HashMap<_, _>>();
|
||||
let rules = module
|
||||
.extra
|
||||
.rules
|
||||
.iter()
|
||||
.map(|rule| {
|
||||
let Rule { source, prio, target } = rule;
|
||||
Rule {
|
||||
prio: *prio,
|
||||
source: Rc::new(
|
||||
source
|
||||
.iter()
|
||||
.map(|expr| process_expr(expr, alias_map, injected_as, i))
|
||||
.collect::<Vec<_>>(),
|
||||
),
|
||||
target: Rc::new(
|
||||
target
|
||||
.iter()
|
||||
.map(|expr| process_expr(expr, alias_map, injected_as, i))
|
||||
.collect::<Vec<_>>(),
|
||||
),
|
||||
}
|
||||
};
|
||||
(*name, ModEntry{ exported: *exported, member })
|
||||
}).collect::<HashMap<_, _>>();
|
||||
let rules = module.extra.rules.iter().map(|rule| {
|
||||
let Rule{ source, prio, target } = rule;
|
||||
Rule{
|
||||
prio: *prio,
|
||||
source: Rc::new(source.iter()
|
||||
.map(|expr| process_expr(expr, alias_map, injected_as, i))
|
||||
.collect::<Vec<_>>()
|
||||
),
|
||||
target: Rc::new(target.iter()
|
||||
.map(|expr| process_expr(expr, alias_map, injected_as, i))
|
||||
.collect::<Vec<_>>()
|
||||
),
|
||||
}
|
||||
}).collect::<Vec<_>>();
|
||||
ProjectModule{
|
||||
})
|
||||
.collect::<Vec<_>>();
|
||||
ProjectModule {
|
||||
items,
|
||||
imports: module.imports.clone(),
|
||||
extra: ProjectExt{
|
||||
extra: ProjectExt {
|
||||
rules,
|
||||
exports: module.extra.exports.clone(),
|
||||
file: module.extra.file.clone(),
|
||||
imports_from: module.extra.imports_from.clone(),
|
||||
}
|
||||
},
|
||||
}
|
||||
}
|
||||
|
||||
@@ -104,4 +127,4 @@ pub fn apply_aliases(
|
||||
injected_as: &impl InjectedAsFn,
|
||||
) -> ProjectModule {
|
||||
apply_aliases_rec(Substack::Bottom, module, alias_map, i, injected_as)
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,62 +1,70 @@
|
||||
use std::rc::Rc;
|
||||
|
||||
use crate::representations::tree::{WalkErrorKind, ModMember};
|
||||
use crate::pipeline::error::{ProjectError, NotExported};
|
||||
use crate::pipeline::project_tree::{ProjectTree, split_path, ProjectModule};
|
||||
use crate::interner::{Token, Interner};
|
||||
use crate::utils::{Substack, pushed};
|
||||
|
||||
use super::alias_map::AliasMap;
|
||||
use super::decls::InjectedAsFn;
|
||||
use crate::interner::{Interner, Tok};
|
||||
use crate::pipeline::error::{NotExported, ProjectError};
|
||||
use crate::pipeline::project_tree::{split_path, ProjectModule, ProjectTree};
|
||||
use crate::representations::tree::{ModMember, WalkErrorKind};
|
||||
use crate::utils::{pushed, Substack};
|
||||
|
||||
/// Assert that a module identified by a path can see a given symbol
|
||||
fn assert_visible(
|
||||
source: &[Token<String>], // must point to a file or submodule
|
||||
target: &[Token<String>], // may point to a symbol or module of any kind
|
||||
source: &[Tok<String>], // must point to a file or submodule
|
||||
target: &[Tok<String>], // may point to a symbol or module of any kind
|
||||
project: &ProjectTree,
|
||||
i: &Interner
|
||||
i: &Interner,
|
||||
) -> Result<(), Rc<dyn ProjectError>> {
|
||||
let (tgt_item, tgt_path) = if let Some(s) = target.split_last() {s}
|
||||
else {return Ok(())};
|
||||
let shared_len = source.iter()
|
||||
.zip(tgt_path.iter())
|
||||
.take_while(|(a, b)| a == b)
|
||||
.count();
|
||||
let shared_root = project.0.walk(&tgt_path[..shared_len], false)
|
||||
.expect("checked in parsing");
|
||||
let direct_parent = shared_root.walk(&tgt_path[shared_len..], true)
|
||||
.map_err(|e| match e.kind {
|
||||
WalkErrorKind::Missing => panic!("checked in parsing"),
|
||||
WalkErrorKind::Private => {
|
||||
let full_path = &tgt_path[..shared_len + e.pos];
|
||||
let (file, sub) = split_path(full_path, &project);
|
||||
let (ref_file, ref_sub) = split_path(source, &project);
|
||||
NotExported{
|
||||
file: i.extern_all(file),
|
||||
subpath: i.extern_all(sub),
|
||||
referrer_file: i.extern_all(ref_file),
|
||||
referrer_subpath: i.extern_all(ref_sub),
|
||||
}.rc()
|
||||
let (tgt_item, tgt_path) = if let Some(s) = target.split_last() {
|
||||
s
|
||||
} else {
|
||||
return Ok(());
|
||||
};
|
||||
let shared_len =
|
||||
source.iter().zip(tgt_path.iter()).take_while(|(a, b)| a == b).count();
|
||||
let shared_root =
|
||||
project.0.walk(&tgt_path[..shared_len], false).expect("checked in parsing");
|
||||
let direct_parent =
|
||||
shared_root.walk(&tgt_path[shared_len..], true).map_err(|e| {
|
||||
match e.kind {
|
||||
WalkErrorKind::Missing => panic!("checked in parsing"),
|
||||
WalkErrorKind::Private => {
|
||||
let full_path = &tgt_path[..shared_len + e.pos];
|
||||
let (file, sub) = split_path(full_path, project);
|
||||
let (ref_file, ref_sub) = split_path(source, project);
|
||||
NotExported {
|
||||
file: i.extern_all(file),
|
||||
subpath: i.extern_all(sub),
|
||||
referrer_file: i.extern_all(ref_file),
|
||||
referrer_subpath: i.extern_all(ref_sub),
|
||||
}
|
||||
.rc()
|
||||
},
|
||||
}
|
||||
})?;
|
||||
let tgt_item_exported = direct_parent.extra.exports.contains_key(tgt_item);
|
||||
let target_prefixes_source = shared_len == tgt_path.len()
|
||||
&& source.get(shared_len) == Some(tgt_item);
|
||||
let target_prefixes_source =
|
||||
shared_len == tgt_path.len() && source.get(shared_len) == Some(tgt_item);
|
||||
if !tgt_item_exported && !target_prefixes_source {
|
||||
let (file, sub) = split_path(target, &project);
|
||||
let (ref_file, ref_sub) = split_path(source, &project);
|
||||
Err(NotExported{
|
||||
file: i.extern_all(file),
|
||||
subpath: i.extern_all(sub),
|
||||
referrer_file: i.extern_all(ref_file),
|
||||
referrer_subpath: i.extern_all(ref_sub),
|
||||
}.rc())
|
||||
} else {Ok(())}
|
||||
let (file, sub) = split_path(target, project);
|
||||
let (ref_file, ref_sub) = split_path(source, project);
|
||||
Err(
|
||||
NotExported {
|
||||
file: i.extern_all(file),
|
||||
subpath: i.extern_all(sub),
|
||||
referrer_file: i.extern_all(ref_file),
|
||||
referrer_subpath: i.extern_all(ref_sub),
|
||||
}
|
||||
.rc(),
|
||||
)
|
||||
} else {
|
||||
Ok(())
|
||||
}
|
||||
}
|
||||
|
||||
/// Populate target and alias maps from the module tree recursively
|
||||
fn collect_aliases_rec(
|
||||
path: Substack<Token<String>>,
|
||||
path: Substack<Tok<String>>,
|
||||
module: &ProjectModule,
|
||||
project: &ProjectTree,
|
||||
alias_map: &mut AliasMap,
|
||||
@@ -65,7 +73,9 @@ fn collect_aliases_rec(
|
||||
) -> Result<(), Rc<dyn ProjectError>> {
|
||||
// Assume injected module has been alias-resolved
|
||||
let mod_path_v = path.iter().rev_vec_clone();
|
||||
if injected_as(&mod_path_v).is_some() {return Ok(())};
|
||||
if injected_as(&mod_path_v).is_some() {
|
||||
return Ok(());
|
||||
};
|
||||
for (&name, &target_mod) in module.extra.imports_from.iter() {
|
||||
let target_mod_v = i.r(target_mod);
|
||||
let target_sym_v = pushed(target_mod_v, name);
|
||||
@@ -78,11 +88,16 @@ fn collect_aliases_rec(
|
||||
for (&name, entry) in module.items.iter() {
|
||||
let submodule = if let ModMember::Sub(s) = &entry.member {
|
||||
s.as_ref()
|
||||
} else {continue};
|
||||
} else {
|
||||
continue;
|
||||
};
|
||||
collect_aliases_rec(
|
||||
path.push(name),
|
||||
submodule, project, alias_map,
|
||||
i, injected_as,
|
||||
submodule,
|
||||
project,
|
||||
alias_map,
|
||||
i,
|
||||
injected_as,
|
||||
)?
|
||||
}
|
||||
Ok(())
|
||||
@@ -97,7 +112,11 @@ pub fn collect_aliases(
|
||||
injected_as: &impl InjectedAsFn,
|
||||
) -> Result<(), Rc<dyn ProjectError>> {
|
||||
collect_aliases_rec(
|
||||
Substack::Bottom, module, project, alias_map,
|
||||
i, injected_as
|
||||
Substack::Bottom,
|
||||
module,
|
||||
project,
|
||||
alias_map,
|
||||
i,
|
||||
injected_as,
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,5 +1,3 @@
use crate::interner::Token;
use crate::interner::{Sym, Tok};

pub trait InjectedAsFn = Fn(
&[Token<String>]
) -> Option<Token<Vec<Token<String>>>>;
pub trait InjectedAsFn = Fn(&[Tok<String>]) -> Option<Sym>;

@@ -1,7 +1,7 @@
mod alias_map;
mod collect_aliases;
mod apply_aliases;
mod resolve_imports;
mod collect_aliases;
mod decls;
mod resolve_imports;

pub use resolve_imports::resolve_imports;

@@ -2,16 +2,14 @@ use std::rc::Rc;
|
||||
|
||||
use itertools::Itertools;
|
||||
|
||||
use super::alias_map::AliasMap;
|
||||
use super::apply_aliases::apply_aliases;
|
||||
use super::collect_aliases::collect_aliases;
|
||||
use super::decls::InjectedAsFn;
|
||||
use crate::interner::Interner;
|
||||
use crate::pipeline::error::ProjectError;
|
||||
use crate::pipeline::project_tree::ProjectTree;
|
||||
|
||||
|
||||
use super::alias_map::AliasMap;
|
||||
use super::collect_aliases::collect_aliases;
|
||||
use super::apply_aliases::apply_aliases;
|
||||
use super::decls::InjectedAsFn;
|
||||
|
||||
/// Follow import chains to locate the original name of all tokens, then
|
||||
/// replace these aliases with the original names throughout the tree
|
||||
pub fn resolve_imports(
|
||||
@@ -20,14 +18,14 @@ pub fn resolve_imports(
|
||||
injected_as: &impl InjectedAsFn,
|
||||
) -> Result<ProjectTree, Rc<dyn ProjectError>> {
|
||||
let mut map = AliasMap::new();
|
||||
collect_aliases(
|
||||
project.0.as_ref(),
|
||||
&project, &mut map,
|
||||
i, injected_as
|
||||
)?;
|
||||
println!("Aliases: {{{:?}}}",
|
||||
map.targets.iter()
|
||||
.map(|(kt, vt)| format!("{} => {}",
|
||||
collect_aliases(project.0.as_ref(), &project, &mut map, i, injected_as)?;
|
||||
println!(
|
||||
"Aliases: {{{:?}}}",
|
||||
map
|
||||
.targets
|
||||
.iter()
|
||||
.map(|(kt, vt)| format!(
|
||||
"{} => {}",
|
||||
i.extern_vec(*kt).join("::"),
|
||||
i.extern_vec(*vt).join("::")
|
||||
))
|
||||
@@ -35,4 +33,4 @@ pub fn resolve_imports(
|
||||
);
|
||||
let new_mod = apply_aliases(project.0.as_ref(), &map, i, injected_as);
|
||||
Ok(ProjectTree(Rc::new(new_mod)))
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,19 +1,14 @@
pub mod error;
pub mod file_loader;
mod import_abs_path;
mod import_resolution;
mod parse_layer;
mod project_tree;
mod source_loader;
mod import_abs_path;
mod split_name;
mod import_resolution;
pub mod file_loader;
mod parse_layer;

pub use parse_layer::parse_layer;
pub use project_tree::{
ConstTree, ProjectExt, ProjectModule, ProjectTree, from_const_tree,
collect_consts, collect_rules,
collect_consts, collect_rules, from_const_tree, ConstTree, ProjectExt,
ProjectModule, ProjectTree,
};
// pub use file_loader::{Loaded, FileLoadingError, IOResult};
// pub use error::{
//   ErrorPosition, ModuleNotFound, NotExported, ParseErrorWithPath,
//   ProjectError, TooManySupers, UnexpectedDirectory
// };
@@ -1,52 +1,46 @@
use std::rc::Rc;

use crate::representations::sourcefile::FileEntry;
use crate::interner::{Token, Interner};

use super::{project_tree, import_resolution};
use super::source_loader;
use super::file_loader::IOResult;
use super::error::ProjectError;
use super::ProjectTree;
use super::file_loader::IOResult;
use super::{import_resolution, project_tree, source_loader, ProjectTree};
use crate::interner::{Interner, Sym, Tok};
use crate::representations::sourcefile::FileEntry;

/// Using an IO callback, produce a project tree that includes the given
/// target symbols or files if they're defined.
///
///
/// The environment accessible to the loaded source can be specified with
/// a pre-existing tree which will be merged with the loaded data, and a
/// prelude which will be prepended to each individual file. Since the
/// prelude gets compiled with each file, normally it should be a glob
/// import pointing to a module in the environment.
pub fn parse_layer<'a>(
targets: &[Token<Vec<Token<String>>>],
loader: &impl Fn(Token<Vec<Token<String>>>) -> IOResult,
environment: &'a ProjectTree,
pub fn parse_layer(
targets: &[Sym],
loader: &impl Fn(Sym) -> IOResult,
environment: &ProjectTree,
prelude: &[FileEntry],
i: &Interner,
) -> Result<ProjectTree, Rc<dyn ProjectError>> {
// A path is injected if it is walkable in the injected tree
let injected_as = |path: &[Token<String>]| {
let injected_as = |path: &[Tok<String>]| {
let (item, modpath) = path.split_last()?;
let module = environment.0.walk(modpath, false).ok()?;
let inj = module.extra.exports.get(item).copied()?;
Some(inj)
};
let injected_names = |path: Token<Vec<Token<String>>>| {
let pathv = &i.r(path)[..];
let module = environment.0.walk(&pathv, false).ok()?;
Some(Rc::new(
module.extra.exports.keys().copied().collect()
))
let injected_names = |path: Tok<Vec<Tok<String>>>| {
let module = environment.0.walk(&i.r(path)[..], false).ok()?;
Some(Rc::new(module.extra.exports.keys().copied().collect()))
};
let source = source_loader::load_source(
targets, prelude, i, loader, &|path| injected_as(path).is_some()
)?;
let source =
source_loader::load_source(targets, prelude, i, loader, &|path| {
injected_as(path).is_some()
})?;
let tree = project_tree::build_tree(source, i, prelude, &injected_names)?;
let sum = ProjectTree(Rc::new(
environment.0.as_ref().clone()
+ tree.0.as_ref().clone()
environment.0.as_ref().clone() + tree.0.as_ref().clone(),
));
let resolvd = import_resolution::resolve_imports(sum, i, &injected_as)?;
// Addition among modules favours the left hand side.
Ok(resolvd)
}
}

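To see how an embedder might drive the reworked `parse_layer`, here is a hedged sketch. The file loader root, the empty constant environment and the `main` target are placeholders for illustration, not anything defined by this commit.

```rust
use std::path::PathBuf;
use std::rc::Rc;

use hashbrown::HashMap;

use crate::interner::Interner;
use crate::pipeline::error::ProjectError;
use crate::pipeline::file_loader::mk_cache;
use crate::pipeline::{from_const_tree, parse_layer, ProjectTree};

// Hypothetical embedding entry point: parse everything reachable from `main`
// against an (empty) constant environment, with no prelude.
fn load_project(i: &Interner) -> Result<ProjectTree, Rc<dyn ProjectError>> {
  let cache = mk_cache(PathBuf::from("./examples/list-processing"), i);
  let env = from_const_tree(HashMap::new(), &[i.i("std")], i);
  let targets = [i.i(&[i.i("main")][..])];
  parse_layer(&targets, &|path| cache.find(&path), &env, &[], i)
}
```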
@@ -1,23 +1,19 @@
|
||||
use crate::representations::sourcefile::{Member, FileEntry};
|
||||
use crate::interner::Token;
|
||||
use crate::interner::Tok;
|
||||
use crate::representations::sourcefile::{FileEntry, Member, Namespace};
|
||||
|
||||
fn member_rec(
|
||||
// object
|
||||
member: Member,
|
||||
// context
|
||||
path: &[Token<String>],
|
||||
path: &[Tok<String>],
|
||||
prelude: &[FileEntry],
|
||||
) -> Member {
|
||||
match member {
|
||||
Member::Namespace(name, body) => {
|
||||
let new_body = entv_rec(
|
||||
body,
|
||||
path,
|
||||
prelude
|
||||
);
|
||||
Member::Namespace(name, new_body)
|
||||
Member::Namespace(Namespace { name, body }) => {
|
||||
let new_body = entv_rec(body, path, prelude);
|
||||
Member::Namespace(Namespace { name, body: new_body })
|
||||
},
|
||||
any => any
|
||||
any => any,
|
||||
}
|
||||
}
|
||||
|
||||
@@ -25,28 +21,26 @@ fn entv_rec(
|
||||
// object
|
||||
data: Vec<FileEntry>,
|
||||
// context
|
||||
mod_path: &[Token<String>],
|
||||
mod_path: &[Tok<String>],
|
||||
prelude: &[FileEntry],
|
||||
) -> Vec<FileEntry> {
|
||||
prelude.iter().cloned()
|
||||
.chain(data.into_iter()
|
||||
.map(|ent| match ent {
|
||||
FileEntry::Exported(mem) => FileEntry::Exported(member_rec(
|
||||
mem, mod_path, prelude
|
||||
)),
|
||||
FileEntry::Internal(mem) => FileEntry::Internal(member_rec(
|
||||
mem, mod_path, prelude
|
||||
)),
|
||||
any => any
|
||||
})
|
||||
)
|
||||
prelude
|
||||
.iter()
|
||||
.cloned()
|
||||
.chain(data.into_iter().map(|ent| match ent {
|
||||
FileEntry::Exported(mem) =>
|
||||
FileEntry::Exported(member_rec(mem, mod_path, prelude)),
|
||||
FileEntry::Internal(mem) =>
|
||||
FileEntry::Internal(member_rec(mem, mod_path, prelude)),
|
||||
any => any,
|
||||
}))
|
||||
.collect()
|
||||
}
|
||||
|
||||
pub fn add_prelude(
|
||||
data: Vec<FileEntry>,
|
||||
path: &[Token<String>],
|
||||
path: &[Tok<String>],
|
||||
prelude: &[FileEntry],
|
||||
) -> Vec<FileEntry> {
|
||||
entv_rec(data, path, prelude)
|
||||
}
|
||||
}
|
||||
|
||||
@@ -2,38 +2,41 @@ use std::rc::Rc;
|
||||
|
||||
use hashbrown::HashMap;
|
||||
|
||||
use crate::pipeline::error::ProjectError;
|
||||
use crate::interner::{Token, Interner};
|
||||
use crate::utils::iter::{box_once, box_empty};
|
||||
use crate::utils::{Substack, pushed};
|
||||
use crate::ast::{Expr, Constant};
|
||||
use crate::pipeline::source_loader::{LoadedSourceTable, LoadedSource};
|
||||
use crate::representations::tree::{Module, ModMember, ModEntry};
|
||||
use crate::representations::sourcefile::{FileEntry, Member, absolute_path};
|
||||
|
||||
use super::collect_ops::InjectedOperatorsFn;
|
||||
use super::{collect_ops, ProjectTree, ProjectExt};
|
||||
use super::parse_file::parse_file;
|
||||
use super::{collect_ops, ProjectExt, ProjectTree};
|
||||
use crate::ast::{Constant, Expr};
|
||||
use crate::interner::{Interner, Tok};
|
||||
use crate::pipeline::error::ProjectError;
|
||||
use crate::pipeline::source_loader::{LoadedSource, LoadedSourceTable};
|
||||
use crate::representations::sourcefile::{absolute_path, FileEntry, Member};
|
||||
use crate::representations::tree::{ModEntry, ModMember, Module};
|
||||
use crate::utils::iter::{box_empty, box_once};
|
||||
use crate::utils::{pushed, Substack};
|
||||
|
||||
#[derive(Debug)]
|
||||
struct ParsedSource<'a> {
|
||||
path: Vec<Token<String>>,
|
||||
path: Vec<Tok<String>>,
|
||||
loaded: &'a LoadedSource,
|
||||
parsed: Vec<FileEntry>
|
||||
parsed: Vec<FileEntry>,
|
||||
}
|
||||
|
||||
pub fn split_path<'a>(path: &'a [Token<String>], proj: &'a ProjectTree)
|
||||
-> (&'a [Token<String>], &'a [Token<String>])
|
||||
{
|
||||
let (end, body) = if let Some(s) = path.split_last() {s}
|
||||
else {return (&[], &[])};
|
||||
let mut module = proj.0.walk(body, false).expect("invalid path cannot be split");
|
||||
pub fn split_path<'a>(
|
||||
path: &'a [Tok<String>],
|
||||
proj: &'a ProjectTree,
|
||||
) -> (&'a [Tok<String>], &'a [Tok<String>]) {
|
||||
let (end, body) = if let Some(s) = path.split_last() {
|
||||
s
|
||||
} else {
|
||||
return (&[], &[]);
|
||||
};
|
||||
let mut module =
|
||||
proj.0.walk(body, false).expect("invalid path cannot be split");
|
||||
if let ModMember::Sub(m) = &module.items[end].member {
|
||||
module = m.clone();
|
||||
}
|
||||
let file = module.extra.file.as_ref()
|
||||
.map(|s| &path[..s.len()])
|
||||
.unwrap_or(&path[..]);
|
||||
let file =
|
||||
module.extra.file.as_ref().map(|s| &path[..s.len()]).unwrap_or(path);
|
||||
let subpath = &path[file.len()..];
|
||||
(file, subpath)
|
||||
}
|
||||
@@ -41,7 +44,7 @@ pub fn split_path<'a>(path: &'a [Token<String>], proj: &'a ProjectTree)
|
||||
/// Convert normalized, prefixed source into a module
|
||||
fn source_to_module(
|
||||
// level
|
||||
path: Substack<Token<String>>,
|
||||
path: Substack<Tok<String>>,
|
||||
preparsed: &Module<impl Clone, impl Clone>,
|
||||
// data
|
||||
data: Vec<FileEntry>,
|
||||
@@ -50,35 +53,38 @@ fn source_to_module(
|
||||
filepath_len: usize,
|
||||
) -> Rc<Module<Expr, ProjectExt>> {
|
||||
let path_v = path.iter().rev_vec_clone();
|
||||
let imports = data.iter()
|
||||
.filter_map(|ent| if let FileEntry::Import(impv) = ent {
|
||||
Some(impv.iter())
|
||||
} else {None})
|
||||
let imports = data
|
||||
.iter()
|
||||
.filter_map(|ent| {
|
||||
if let FileEntry::Import(impv) = ent {
|
||||
Some(impv.iter())
|
||||
} else {
|
||||
None
|
||||
}
|
||||
})
|
||||
.flatten()
|
||||
.cloned()
|
||||
.collect::<Vec<_>>();
|
||||
let imports_from = imports.iter()
|
||||
let imports_from = imports
|
||||
.iter()
|
||||
.map(|imp| {
|
||||
let mut imp_path_v = i.r(imp.path).clone();
|
||||
imp_path_v.push(imp.name.expect("imports normalized"));
|
||||
let mut abs_path = absolute_path(
|
||||
&path_v,
|
||||
&imp_path_v,
|
||||
i, &|n| preparsed.items.contains_key(&n)
|
||||
).expect("tested in preparsing");
|
||||
let mut abs_path =
|
||||
absolute_path(&path_v, &imp_path_v, i).expect("tested in preparsing");
|
||||
let name = abs_path.pop().expect("importing the global context");
|
||||
(name, i.i(&abs_path))
|
||||
})
|
||||
.collect::<HashMap<_, _>>();
|
||||
let exports = data.iter()
|
||||
let exports = data
|
||||
.iter()
|
||||
.flat_map(|ent| {
|
||||
let mk_ent = |name| (name, i.i(&pushed(&path_v, name)));
|
||||
match ent {
|
||||
FileEntry::Export(names)
|
||||
=> Box::new(names.iter().copied().map(mk_ent)),
|
||||
FileEntry::Export(names) => Box::new(names.iter().copied().map(mk_ent)),
|
||||
FileEntry::Exported(mem) => match mem {
|
||||
Member::Constant(constant) => box_once(mk_ent(constant.name)),
|
||||
Member::Namespace(name, _) => box_once(mk_ent(*name)),
|
||||
Member::Namespace(ns) => box_once(mk_ent(ns.name)),
|
||||
Member::Rule(rule) => {
|
||||
let mut names = Vec::new();
|
||||
for e in rule.source.iter() {
|
||||
@@ -89,13 +95,14 @@ fn source_to_module(
|
||||
})
|
||||
}
|
||||
Box::new(names.into_iter())
|
||||
}
|
||||
}
|
||||
_ => box_empty()
|
||||
},
|
||||
},
|
||||
_ => box_empty(),
|
||||
}
|
||||
})
|
||||
.collect::<HashMap<_, _>>();
|
||||
let rules = data.iter()
|
||||
let rules = data
|
||||
.iter()
|
||||
.filter_map(|ent| match ent {
|
||||
FileEntry::Exported(Member::Rule(rule)) => Some(rule),
|
||||
FileEntry::Internal(Member::Rule(rule)) => Some(rule),
|
||||
@@ -103,38 +110,51 @@ fn source_to_module(
|
||||
})
|
||||
.cloned()
|
||||
.collect::<Vec<_>>();
|
||||
let items = data.into_iter()
|
||||
let items = data
|
||||
.into_iter()
|
||||
.filter_map(|ent| match ent {
|
||||
FileEntry::Exported(Member::Namespace(name, body)) => {
|
||||
let prep_member = &preparsed.items[&name].member;
|
||||
let new_prep = if let ModMember::Sub(s) = prep_member {s.as_ref()}
|
||||
else { panic!("preparsed missing a submodule") };
|
||||
FileEntry::Exported(Member::Namespace(ns)) => {
|
||||
let prep_member = &preparsed.items[&ns.name].member;
|
||||
let new_prep = if let ModMember::Sub(s) = prep_member {
|
||||
s.as_ref()
|
||||
} else {
|
||||
panic!("preparsed missing a submodule")
|
||||
};
|
||||
let module = source_to_module(
|
||||
path.push(name),
|
||||
new_prep, body, i, filepath_len
|
||||
path.push(ns.name),
|
||||
new_prep,
|
||||
ns.body,
|
||||
i,
|
||||
filepath_len,
|
||||
);
|
||||
let member = ModMember::Sub(module);
|
||||
Some((name, ModEntry{ exported: true, member }))
|
||||
}
|
||||
FileEntry::Internal(Member::Namespace(name, body)) => {
|
||||
let prep_member = &preparsed.items[&name].member;
|
||||
let new_prep = if let ModMember::Sub(s) = prep_member {s.as_ref()}
|
||||
else { panic!("preparsed missing a submodule") };
|
||||
Some((ns.name, ModEntry { exported: true, member }))
|
||||
},
|
||||
FileEntry::Internal(Member::Namespace(ns)) => {
|
||||
let prep_member = &preparsed.items[&ns.name].member;
|
||||
let new_prep = if let ModMember::Sub(s) = prep_member {
|
||||
s.as_ref()
|
||||
} else {
|
||||
panic!("preparsed missing a submodule")
|
||||
};
|
||||
let module = source_to_module(
|
||||
path.push(name),
|
||||
new_prep, body, i, filepath_len
|
||||
path.push(ns.name),
|
||||
new_prep,
|
||||
ns.body,
|
||||
i,
|
||||
filepath_len,
|
||||
);
|
||||
let member = ModMember::Sub(module);
|
||||
Some((name, ModEntry{ exported: false, member }))
|
||||
}
|
||||
FileEntry::Exported(Member::Constant(Constant{ name, value })) => {
|
||||
Some((ns.name, ModEntry { exported: false, member }))
|
||||
},
|
||||
FileEntry::Exported(Member::Constant(Constant { name, value })) => {
|
||||
let member = ModMember::Item(value);
|
||||
Some((name, ModEntry{ exported: true, member }))
|
||||
}
|
||||
FileEntry::Internal(Member::Constant(Constant{ name, value })) => {
|
||||
Some((name, ModEntry { exported: true, member }))
|
||||
},
|
||||
FileEntry::Internal(Member::Constant(Constant { name, value })) => {
|
||||
let member = ModMember::Item(value);
|
||||
Some((name, ModEntry{ exported: false, member }))
|
||||
}
|
||||
Some((name, ModEntry { exported: false, member }))
|
||||
},
|
||||
_ => None,
|
||||
})
|
||||
.collect::<HashMap<_, _>>();
|
||||
@@ -150,15 +170,15 @@ fn source_to_module(
|
||||
imports_from,
|
||||
exports,
|
||||
rules,
|
||||
file: Some(path_v[..filepath_len].to_vec())
|
||||
}
|
||||
file: Some(path_v[..filepath_len].to_vec()),
|
||||
},
|
||||
})
|
||||
}
|
||||
|
||||
fn files_to_module(
|
||||
path: Substack<Token<String>>,
|
||||
path: Substack<Tok<String>>,
|
||||
files: &[ParsedSource],
|
||||
i: &Interner
|
||||
i: &Interner,
|
||||
) -> Rc<Module<Expr, ProjectExt>> {
|
||||
let lvl = path.len();
|
||||
let path_v = path.iter().rev_vec_clone();
|
||||
@@ -167,19 +187,22 @@ fn files_to_module(
|
||||
path,
|
||||
files[0].loaded.preparsed.0.as_ref(),
|
||||
files[0].parsed.clone(),
|
||||
i, path.len()
|
||||
)
|
||||
i,
|
||||
path.len(),
|
||||
);
|
||||
}
|
||||
let items = files.group_by(|a, b| a.path[lvl] == b.path[lvl]).into_iter()
|
||||
let items = files
|
||||
.group_by(|a, b| a.path[lvl] == b.path[lvl])
|
||||
.map(|files| {
|
||||
let namespace = files[0].path[lvl];
|
||||
let subpath = path.push(namespace);
|
||||
let module = files_to_module(subpath, files, i);
|
||||
let member = ModMember::Sub(module);
|
||||
(namespace, ModEntry{ exported: true, member })
|
||||
(namespace, ModEntry { exported: true, member })
|
||||
})
|
||||
.collect::<HashMap<_, _>>();
|
||||
let exports: HashMap<_, _> = items.keys()
|
||||
let exports: HashMap<_, _> = items
|
||||
.keys()
|
||||
.copied()
|
||||
.map(|name| (name, i.i(&pushed(&path_v, name))))
|
||||
.collect();
|
||||
@@ -188,38 +211,44 @@ fn files_to_module(
|
||||
// i.extern_all(&path_v[..]).join("::"),
|
||||
// exports.keys().map(|t| i.r(*t)).join(", ")
|
||||
// );
|
||||
Rc::new(Module{
|
||||
Rc::new(Module {
|
||||
items,
|
||||
imports: vec![],
|
||||
extra: ProjectExt {
|
||||
exports,
|
||||
imports_from: HashMap::new(),
|
||||
rules: vec![], file: None,
|
||||
}
|
||||
rules: vec![],
|
||||
file: None,
|
||||
},
|
||||
})
|
||||
}
|
||||
|
||||
pub fn build_tree<'a>(
|
||||
pub fn build_tree(
|
||||
files: LoadedSourceTable,
|
||||
i: &Interner,
|
||||
prelude: &[FileEntry],
|
||||
injected: &impl InjectedOperatorsFn,
|
||||
) -> Result<ProjectTree, Rc<dyn ProjectError>> {
|
||||
let ops_cache = collect_ops::mk_cache(&files, i, injected);
|
||||
let mut entries = files.iter()
|
||||
.map(|(path, loaded)| Ok((
|
||||
i.r(*path),
|
||||
loaded,
|
||||
parse_file(*path, &files, &ops_cache, i, prelude)?
|
||||
)))
|
||||
let mut entries = files
|
||||
.iter()
|
||||
.map(|(path, loaded)| {
|
||||
Ok((
|
||||
i.r(*path),
|
||||
loaded,
|
||||
parse_file(*path, &files, &ops_cache, i, prelude)?,
|
||||
))
|
||||
})
|
||||
.collect::<Result<Vec<_>, Rc<dyn ProjectError>>>()?;
|
||||
// sort by similarity, then longest-first
|
||||
entries.sort_unstable_by(|a, b| a.0.cmp(&b.0).reverse());
|
||||
let files = entries.into_iter()
|
||||
.map(|(path, loaded, parsed)| ParsedSource{
|
||||
loaded, parsed,
|
||||
path: path.clone()
|
||||
entries.sort_unstable_by(|a, b| a.0.cmp(b.0).reverse());
|
||||
let files = entries
|
||||
.into_iter()
|
||||
.map(|(path, loaded, parsed)| ParsedSource {
|
||||
loaded,
|
||||
parsed,
|
||||
path: path.clone(),
|
||||
})
|
||||
.collect::<Vec<_>>();
|
||||
Ok(ProjectTree(files_to_module(Substack::Bottom, &files, i)))
|
||||
}
|
||||
}
|
||||
|
||||
@@ -4,73 +4,80 @@ use std::rc::Rc;
|
||||
use hashbrown::HashSet;
|
||||
use itertools::Itertools;
|
||||
|
||||
use crate::representations::tree::WalkErrorKind;
|
||||
use crate::interner::{Interner, Sym, Tok};
|
||||
use crate::pipeline::error::{ModuleNotFound, ProjectError};
|
||||
use crate::pipeline::source_loader::LoadedSourceTable;
|
||||
use crate::pipeline::error::{ProjectError, ModuleNotFound};
|
||||
use crate::interner::{Token, Interner};
|
||||
use crate::utils::Cache;
|
||||
use crate::pipeline::split_name::split_name;
|
||||
use crate::representations::tree::WalkErrorKind;
|
||||
use crate::utils::Cache;
|
||||
|
||||
pub type OpsResult = Result<Rc<HashSet<Token<String>>>, Rc<dyn ProjectError>>;
|
||||
pub type ExportedOpsCache<'a> = Cache<'a, Token<Vec<Token<String>>>, OpsResult>;
|
||||
pub type OpsResult = Result<Rc<HashSet<Tok<String>>>, Rc<dyn ProjectError>>;
|
||||
pub type ExportedOpsCache<'a> = Cache<'a, Sym, OpsResult>;
|
||||
|
||||
pub trait InjectedOperatorsFn = Fn(
|
||||
Token<Vec<Token<String>>>
|
||||
) -> Option<Rc<HashSet<Token<String>>>>;
|
||||
pub trait InjectedOperatorsFn = Fn(Sym) -> Option<Rc<HashSet<Tok<String>>>>;
|
||||
|
||||
fn coprefix<T: Eq>(
|
||||
l: impl Iterator<Item = T>,
|
||||
r: impl Iterator<Item = T>
|
||||
r: impl Iterator<Item = T>,
|
||||
) -> usize {
|
||||
l.zip(r).take_while(|(a, b)| a == b).count()
|
||||
}
|
||||
|
||||
/// Collect all names exported by the module at the specified path
|
||||
pub fn collect_exported_ops(
|
||||
path: Token<Vec<Token<String>>>,
|
||||
path: Sym,
|
||||
loaded: &LoadedSourceTable,
|
||||
i: &Interner,
|
||||
injected: &impl InjectedOperatorsFn
|
||||
injected: &impl InjectedOperatorsFn,
|
||||
) -> OpsResult {
|
||||
if let Some(ops) = injected(path) {
|
||||
if path == i.i(&[i.i("prelude")][..]) {
|
||||
println!("%%% Prelude exported ops %%%");
|
||||
println!("{}", ops.iter().map(|t| i.r(*t)).join(", "));
|
||||
}
|
||||
return Ok(ops)
|
||||
return Ok(ops);
|
||||
}
|
||||
let is_file = |n: &[Token<String>]| loaded.contains_key(&i.i(n));
|
||||
let is_file = |n: &[Tok<String>]| loaded.contains_key(&i.i(n));
|
||||
let path_s = &i.r(path)[..];
|
||||
let name_split = split_name(path_s, &is_file);
|
||||
let (fpath_v, subpath_v) = if let Some(f) = name_split {f} else {
|
||||
return Ok(Rc::new(loaded.keys().copied()
|
||||
.filter_map(|modname| {
|
||||
let modname_s = i.r(modname);
|
||||
if path_s.len() == coprefix(path_s.iter(), modname_s.iter()) {
|
||||
Some(modname_s[path_s.len()])
|
||||
} else {None}
|
||||
})
|
||||
.collect::<HashSet<_>>()
|
||||
))
|
||||
let (fpath_v, subpath_v) = if let Some(f) = name_split {
|
||||
f
|
||||
} else {
|
||||
return Ok(Rc::new(
|
||||
loaded
|
||||
.keys()
|
||||
.copied()
|
||||
.filter_map(|modname| {
|
||||
let modname_s = i.r(modname);
|
||||
if path_s.len() == coprefix(path_s.iter(), modname_s.iter()) {
|
||||
Some(modname_s[path_s.len()])
|
||||
} else {
|
||||
None
|
||||
}
|
||||
})
|
||||
.collect::<HashSet<_>>(),
|
||||
));
|
||||
};
|
||||
let fpath = i.i(fpath_v);
|
||||
let preparsed = &loaded[&fpath].preparsed;
|
||||
let module = preparsed.0.walk(&subpath_v, false)
|
||||
.map_err(|walk_err| match walk_err.kind {
|
||||
WalkErrorKind::Private => unreachable!("visibility is not being checked here"),
|
||||
WalkErrorKind::Missing => ModuleNotFound{
|
||||
let module = preparsed.0.walk(subpath_v, false).map_err(|walk_err| {
|
||||
match walk_err.kind {
|
||||
WalkErrorKind::Private =>
|
||||
unreachable!("visibility is not being checked here"),
|
||||
WalkErrorKind::Missing => ModuleNotFound {
|
||||
file: i.extern_vec(fpath),
|
||||
subpath: subpath_v.into_iter()
|
||||
subpath: subpath_v
|
||||
.iter()
|
||||
.take(walk_err.pos)
|
||||
.map(|t| i.r(*t))
|
||||
.cloned()
|
||||
.collect()
|
||||
}.rc(),
|
||||
})?;
|
||||
let out: HashSet<_> = module.items.iter()
|
||||
.filter(|(_, v)| v.exported)
|
||||
.map(|(k, _)| *k)
|
||||
.collect();
|
||||
.collect(),
|
||||
}
|
||||
.rc(),
|
||||
}
|
||||
})?;
|
||||
let out: HashSet<_> =
|
||||
module.items.iter().filter(|(_, v)| v.exported).map(|(k, _)| *k).collect();
|
||||
if path == i.i(&[i.i("prelude")][..]) {
|
||||
println!("%%% Prelude exported ops %%%");
|
||||
println!("{}", out.iter().map(|t| i.r(*t)).join(", "));
|
||||
@@ -83,7 +90,5 @@ pub fn mk_cache<'a>(
|
||||
i: &'a Interner,
|
||||
injected: &'a impl InjectedOperatorsFn,
|
||||
) -> ExportedOpsCache<'a> {
|
||||
Cache::new(|path, _this| {
|
||||
collect_exported_ops(path, loaded, i, injected)
|
||||
})
|
||||
}
|
||||
Cache::new(|path, _this| collect_exported_ops(path, loaded, i, injected))
|
||||
}
|
||||
|
||||
@@ -2,7 +2,7 @@ mod exported_ops;
|
||||
mod ops_for;
|
||||
|
||||
pub use exported_ops::{
|
||||
ExportedOpsCache, OpsResult, InjectedOperatorsFn,
|
||||
collect_exported_ops, mk_cache
|
||||
collect_exported_ops, mk_cache, ExportedOpsCache, InjectedOperatorsFn,
|
||||
OpsResult,
|
||||
};
|
||||
pub use ops_for::collect_ops_for;
|
||||
pub use ops_for::collect_ops_for;
|
||||
|
||||
@@ -3,20 +3,19 @@ use std::rc::Rc;
|
||||
use hashbrown::HashSet;
|
||||
use itertools::Itertools;
|
||||
|
||||
use super::exported_ops::{ExportedOpsCache, OpsResult};
|
||||
use crate::interner::{Interner, Tok};
|
||||
use crate::parse::is_op;
|
||||
use crate::pipeline::error::ProjectError;
|
||||
use crate::pipeline::source_loader::LoadedSourceTable;
|
||||
use crate::interner::{Token, Interner};
|
||||
use crate::representations::tree::{Module, ModMember};
|
||||
use crate::pipeline::import_abs_path::import_abs_path;
|
||||
|
||||
use super::exported_ops::{ExportedOpsCache, OpsResult};
|
||||
use crate::pipeline::source_loader::LoadedSourceTable;
|
||||
use crate::representations::tree::{ModMember, Module};
|
||||
|
||||
/// Collect all operators and names, exported or local, defined in this
|
||||
/// tree.
|
||||
fn tree_all_ops(
|
||||
module: &Module<impl Clone, impl Clone>,
|
||||
ops: &mut HashSet<Token<String>>
|
||||
ops: &mut HashSet<Tok<String>>,
|
||||
) {
|
||||
ops.extend(module.items.keys().copied());
|
||||
for ent in module.items.values() {
|
||||
@@ -28,21 +27,22 @@ fn tree_all_ops(
|
||||
|
||||
/// Collect all names imported in this file
|
||||
pub fn collect_ops_for(
|
||||
file: &[Token<String>],
|
||||
file: &[Tok<String>],
|
||||
loaded: &LoadedSourceTable,
|
||||
ops_cache: &ExportedOpsCache,
|
||||
i: &Interner
|
||||
i: &Interner,
|
||||
) -> OpsResult {
|
||||
let tree = &loaded[&i.i(file)].preparsed.0;
|
||||
let mut ret = HashSet::new();
|
||||
println!("collecting ops for {}", i.extern_all(file).join("::"));
|
||||
tree_all_ops(tree.as_ref(), &mut ret);
|
||||
tree.visit_all_imports(&mut |modpath, module, import| {
|
||||
if let Some(n) = import.name { ret.insert(n); } else {
|
||||
tree.visit_all_imports(&mut |modpath, _module, import| {
|
||||
if let Some(n) = import.name {
|
||||
ret.insert(n);
|
||||
} else {
|
||||
println!("\tglob import from {}", i.extern_vec(import.path).join("::"));
|
||||
let path = import_abs_path(
|
||||
&file, modpath, module, &i.r(import.path)[..], i
|
||||
).expect("This error should have been caught during loading");
|
||||
let path = import_abs_path(file, modpath, &i.r(import.path)[..], i)
|
||||
.expect("This error should have been caught during loading");
|
||||
ret.extend(ops_cache.find(&i.i(&path))?.iter().copied());
|
||||
}
|
||||
Ok::<_, Rc<dyn ProjectError>>(())
|
||||
@@ -53,4 +53,4 @@ pub fn collect_ops_for(
|
||||
println!("{}", ret.iter().map(|t| i.r(*t)).join(", "))
|
||||
}
|
||||
Ok(Rc::new(ret))
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,26 +1,26 @@
|
||||
use std::{ops::Add, rc::Rc};
|
||||
use std::ops::Add;
|
||||
use std::rc::Rc;
|
||||
|
||||
use hashbrown::HashMap;
|
||||
|
||||
use super::{ProjectExt, ProjectModule, ProjectTree};
|
||||
use crate::ast::{Clause, Expr};
|
||||
use crate::foreign::{Atom, Atomic, ExternFn};
|
||||
use crate::interner::{Interner, Tok};
|
||||
use crate::representations::location::Location;
|
||||
use crate::representations::tree::{ModEntry, ModMember, Module};
|
||||
use crate::representations::Primitive;
|
||||
use crate::representations::location::Location;
|
||||
use crate::foreign::{ExternFn, Atomic, Atom};
|
||||
use crate::interner::{Token, Interner};
|
||||
use crate::ast::{Expr, Clause};
|
||||
use crate::utils::{Substack, pushed};
|
||||
|
||||
use super::{ProjectModule, ProjectExt, ProjectTree};
|
||||
use crate::utils::{pushed, Substack};
|
||||
|
||||
pub enum ConstTree {
|
||||
Const(Expr),
|
||||
Tree(HashMap<Token<String>, ConstTree>)
|
||||
Tree(HashMap<Tok<String>, ConstTree>),
|
||||
}
|
||||
impl ConstTree {
|
||||
pub fn primitive(primitive: Primitive) -> Self {
|
||||
Self::Const(Expr{
|
||||
Self::Const(Expr {
|
||||
location: Location::Unknown,
|
||||
value: Clause::P(primitive)
|
||||
value: Clause::P(primitive),
|
||||
})
|
||||
}
|
||||
pub fn xfn(xfn: impl ExternFn + 'static) -> Self {
|
||||
@@ -29,9 +29,7 @@ impl ConstTree {
|
||||
pub fn atom(atom: impl Atomic + 'static) -> Self {
|
||||
Self::primitive(Primitive::Atom(Atom(Box::new(atom))))
|
||||
}
|
||||
pub fn tree(
|
||||
arr: impl IntoIterator<Item = (Token<String>, Self)>
|
||||
) -> Self {
|
||||
pub fn tree(arr: impl IntoIterator<Item = (Tok<String>, Self)>) -> Self {
|
||||
Self::Tree(arr.into_iter().collect())
|
||||
}
|
||||
}
|
||||
@@ -57,27 +55,29 @@ impl Add for ConstTree {
|
||||
}
|
||||
|
||||
fn from_const_tree_rec(
|
||||
path: Substack<Token<String>>,
|
||||
consts: HashMap<Token<String>, ConstTree>,
|
||||
file: &[Token<String>],
|
||||
path: Substack<Tok<String>>,
|
||||
consts: HashMap<Tok<String>, ConstTree>,
|
||||
file: &[Tok<String>],
|
||||
i: &Interner,
|
||||
) -> ProjectModule {
|
||||
let mut items = HashMap::new();
|
||||
let path_v = path.iter().rev_vec_clone();
|
||||
for (name, item) in consts {
|
||||
items.insert(name, ModEntry{
|
||||
items.insert(name, ModEntry {
|
||||
exported: true,
|
||||
member: match item {
|
||||
ConstTree::Const(c) => ModMember::Item(c),
|
||||
ConstTree::Tree(t) => ModMember::Sub(Rc::new(
|
||||
from_const_tree_rec(path.push(name), t, file, i)
|
||||
)),
|
||||
}
|
||||
ConstTree::Tree(t) => ModMember::Sub(Rc::new(from_const_tree_rec(
|
||||
path.push(name),
|
||||
t,
|
||||
file,
|
||||
i,
|
||||
))),
|
||||
},
|
||||
});
|
||||
}
|
||||
let exports = items.keys()
|
||||
.map(|name| (*name, i.i(&pushed(&path_v, *name))))
|
||||
.collect();
|
||||
let exports =
|
||||
items.keys().map(|name| (*name, i.i(&pushed(&path_v, *name)))).collect();
|
||||
Module {
|
||||
items,
|
||||
imports: vec![],
|
||||
@@ -85,15 +85,15 @@ fn from_const_tree_rec(
|
||||
exports,
|
||||
file: Some(file.to_vec()),
|
||||
..Default::default()
|
||||
}
|
||||
},
|
||||
}
|
||||
}
|
||||
|
||||
pub fn from_const_tree(
|
||||
consts: HashMap<Token<String>, ConstTree>,
|
||||
file: &[Token<String>],
|
||||
consts: HashMap<Tok<String>, ConstTree>,
|
||||
file: &[Tok<String>],
|
||||
i: &Interner,
|
||||
) -> ProjectTree {
|
||||
let module = from_const_tree_rec(Substack::Bottom, consts, file, i);
|
||||
ProjectTree(Rc::new(module))
|
||||
}
|
||||
}
|
||||
|
||||
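The `ConstTree` builders above are what an embedder uses to expose native values to loaded code. A small hedged sketch follows; `AddFn` stands in for any type implementing `ExternFn` and is not defined in this commit.

```rust
use crate::interner::Interner;
use crate::pipeline::{from_const_tree, ConstTree, ProjectTree};

// Build a tiny `std` namespace with one native function and turn it into a
// ProjectTree that could serve as the environment for parse_layer.
fn std_env(i: &Interner) -> ProjectTree {
  // `AddFn` is a hypothetical unit struct implementing ExternFn.
  let std_module = ConstTree::tree([(i.i("add"), ConstTree::xfn(AddFn))]);
  let consts = [(i.i("std"), std_module)].into_iter().collect();
  from_const_tree(consts, &[i.i("std")][..], i)
}
```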
@@ -1,38 +1,30 @@
/* FILE SEPARATION BOUNDARY
// FILE SEPARATION BOUNDARY
//
// Collect all operators accessible in each file, parse the files with
// correct tokenization, resolve glob imports, convert expressions to
// refer to tokens with (local) absolute path, and connect them into a
// single tree.
//
// The module checks for imports from missing modules (including
// submodules). All other errors must be checked later.
//
// Injection strategy:
// Return all items of the given module in the injected tree for
// `injected` The output of this stage is a tree, which can simply be
// overlaid with the injected tree

Collect all operators accessible in each file, parse the files with
correct tokenization, resolve glob imports, convert expressions to
refer to tokens with (local) absolute path, and connect them into a
single tree.

The module checks for imports from missing modules (including submodules).
All other errors must be checked later.

Injection strategy:
Return all items of the given module in the injected tree for `injected`
The output of this stage is a tree, which can simply be overlaid with
the injected tree
*/

mod collect_ops;
mod parse_file;
mod add_prelude;
mod build_tree;
mod collect_ops;
mod const_tree;
mod normalize_imports;
mod parse_file;
mod prefix;
mod tree;
mod const_tree;
mod add_prelude;

pub use build_tree::{build_tree, split_path};
pub use collect_ops::InjectedOperatorsFn;

pub use const_tree::{
ConstTree, from_const_tree,
};

pub use const_tree::{from_const_tree, ConstTree};
pub use tree::{
ProjectExt, ProjectModule, ProjectTree, collect_consts, collect_rules
collect_consts, collect_rules, ProjectExt, ProjectModule, ProjectTree,
};

pub use build_tree::{
build_tree, split_path
};
@@ -1,74 +1,88 @@
|
||||
use crate::representations::tree::{Module, ModMember};
|
||||
use crate::representations::sourcefile::{Member, FileEntry, Import};
|
||||
use crate::utils::BoxedIter;
|
||||
use crate::utils::{Substack, iter::box_once};
|
||||
use crate::interner::{Interner, Token};
|
||||
use crate::pipeline::import_abs_path::import_abs_path;
|
||||
|
||||
use super::collect_ops::ExportedOpsCache;
|
||||
use crate::interner::{Interner, Tok};
|
||||
use crate::pipeline::import_abs_path::import_abs_path;
|
||||
use crate::representations::sourcefile::{
|
||||
FileEntry, Import, Member, Namespace,
|
||||
};
|
||||
use crate::representations::tree::{ModMember, Module};
|
||||
use crate::utils::iter::box_once;
|
||||
use crate::utils::{BoxedIter, Substack};
|
||||
|
||||
fn member_rec(
|
||||
// level
|
||||
mod_stack: Substack<Token<String>>,
|
||||
mod_stack: Substack<Tok<String>>,
|
||||
preparsed: &Module<impl Clone, impl Clone>,
|
||||
// object
|
||||
member: Member,
|
||||
// context
|
||||
path: &[Token<String>],
|
||||
path: &[Tok<String>],
|
||||
ops_cache: &ExportedOpsCache,
|
||||
i: &Interner
|
||||
i: &Interner,
|
||||
) -> Member {
|
||||
match member {
|
||||
Member::Namespace(name, body) => {
|
||||
Member::Namespace(Namespace { name, body }) => {
|
||||
let prepmember = &preparsed.items[&name].member;
|
||||
let subprep = if let ModMember::Sub(m) = prepmember {m.clone()}
|
||||
else {unreachable!("This name must point to a namespace")};
|
||||
let subprep = if let ModMember::Sub(m) = prepmember {
|
||||
m.clone()
|
||||
} else {
|
||||
unreachable!("This name must point to a namespace")
|
||||
};
|
||||
let new_body = entv_rec(
|
||||
mod_stack.push(name),
|
||||
subprep.as_ref(),
|
||||
body,
|
||||
path, ops_cache, i
|
||||
path,
|
||||
ops_cache,
|
||||
i,
|
||||
);
|
||||
Member::Namespace(name, new_body)
|
||||
Member::Namespace(Namespace { name, body: new_body })
|
||||
},
|
||||
any => any
|
||||
any => any,
|
||||
}
|
||||
}
|
||||
|
||||
fn entv_rec(
|
||||
// level
|
||||
mod_stack: Substack<Token<String>>,
|
||||
mod_stack: Substack<Tok<String>>,
|
||||
preparsed: &Module<impl Clone, impl Clone>,
|
||||
// object
|
||||
data: Vec<FileEntry>,
|
||||
// context
|
||||
mod_path: &[Token<String>],
|
||||
mod_path: &[Tok<String>],
|
||||
ops_cache: &ExportedOpsCache,
|
||||
i: &Interner
|
||||
i: &Interner,
|
||||
) -> Vec<FileEntry> {
|
||||
data.into_iter()
|
||||
data
|
||||
.into_iter()
|
||||
.map(|ent| match ent {
|
||||
FileEntry::Import(imps) => FileEntry::Import(imps.into_iter()
|
||||
.flat_map(|import| if let Import{ name: None, path } = import {
|
||||
let p = import_abs_path(
|
||||
mod_path, mod_stack, preparsed, &i.r(path)[..], i
|
||||
).expect("Should have emerged in preparsing");
|
||||
let names = ops_cache.find(&i.i(&p))
|
||||
.expect("Should have emerged in second parsing");
|
||||
let imports = names.iter()
|
||||
.map(move |&n| Import{ name: Some(n), path })
|
||||
.collect::<Vec<_>>();
|
||||
Box::new(imports.into_iter()) as BoxedIter<Import>
|
||||
} else {box_once(import)})
|
||||
.collect()
|
||||
FileEntry::Import(imps) => FileEntry::Import(
|
||||
imps
|
||||
.into_iter()
|
||||
.flat_map(|import| {
|
||||
if let Import { name: None, path } = import {
|
||||
let p = import_abs_path(mod_path, mod_stack, &i.r(path)[..], i)
|
||||
.expect("Should have emerged in preparsing");
|
||||
let names = ops_cache
|
||||
.find(&i.i(&p))
|
||||
.expect("Should have emerged in second parsing");
|
||||
let imports = names
|
||||
.iter()
|
||||
.map(move |&n| Import { name: Some(n), path })
|
||||
.collect::<Vec<_>>();
|
||||
Box::new(imports.into_iter()) as BoxedIter<Import>
|
||||
} else {
|
||||
box_once(import)
|
||||
}
|
||||
})
|
||||
.collect(),
|
||||
),
|
||||
FileEntry::Exported(mem) => FileEntry::Exported(member_rec(
|
||||
mod_stack, preparsed, mem, mod_path, ops_cache, i
|
||||
mod_stack, preparsed, mem, mod_path, ops_cache, i,
|
||||
)),
|
||||
FileEntry::Internal(mem) => FileEntry::Internal(member_rec(
|
||||
mod_stack, preparsed, mem, mod_path, ops_cache, i
|
||||
mod_stack, preparsed, mem, mod_path, ops_cache, i,
|
||||
)),
|
||||
any => any
|
||||
any => any,
|
||||
})
|
||||
.collect()
|
||||
}
|
||||
@@ -76,9 +90,9 @@ fn entv_rec(
|
||||
pub fn normalize_imports(
|
||||
preparsed: &Module<impl Clone, impl Clone>,
|
||||
data: Vec<FileEntry>,
|
||||
path: &[Token<String>],
|
||||
path: &[Tok<String>],
|
||||
ops_cache: &ExportedOpsCache,
|
||||
i: &Interner
|
||||
i: &Interner,
|
||||
) -> Vec<FileEntry> {
|
||||
entv_rec(Substack::Bottom, preparsed, data, path, ops_cache, i)
|
||||
}
|
||||
}
|
||||
|
@@ -1,18 +1,17 @@
use std::rc::Rc;

use crate::parse;
use crate::pipeline::error::ProjectError;
use crate::representations::sourcefile::{FileEntry, normalize_namespaces};
use crate::pipeline::source_loader::LoadedSourceTable;
use crate::interner::{Token, Interner};

use super::add_prelude::add_prelude;
use super::collect_ops::{ExportedOpsCache, collect_ops_for};
use super::collect_ops::{collect_ops_for, ExportedOpsCache};
use super::normalize_imports::normalize_imports;
use super::prefix::prefix;
use crate::interner::{Interner, Sym};
use crate::parse;
use crate::pipeline::error::ProjectError;
use crate::pipeline::source_loader::LoadedSourceTable;
use crate::representations::sourcefile::{normalize_namespaces, FileEntry};

pub fn parse_file(
  path: Token<Vec<Token<String>>>,
  path: Sym,
  loaded: &LoadedSourceTable,
  ops_cache: &ExportedOpsCache,
  i: &Interner,
@@ -21,24 +20,24 @@ pub fn parse_file(
  let ld = &loaded[&path];
  // let ops_cache = collect_ops::mk_cache(loaded, i);
  let ops = collect_ops_for(&i.r(path)[..], loaded, ops_cache, i)?;
  let ops_vec = ops.iter()
    .map(|t| i.r(*t))
    .cloned()
    .collect::<Vec<_>>();
  let ctx = parse::ParsingContext{
  let ops_vec = ops.iter().map(|t| i.r(*t)).cloned().collect::<Vec<_>>();
  let ctx = parse::ParsingContext {
    interner: i,
    ops: &ops_vec,
    file: Rc::new(i.extern_vec(path))
    file: Rc::new(i.extern_vec(path)),
  };
  let entries = parse::parse(ld.text.as_str(), ctx)
    .expect("This error should have been caught during loading");
  let with_prelude = add_prelude(entries, &i.r(path)[..], prelude);
  let impnormalized = normalize_imports(
    &ld.preparsed.0, with_prelude, &i.r(path)[..], ops_cache, i
    &ld.preparsed.0,
    with_prelude,
    &i.r(path)[..],
    ops_cache,
    i,
  );
  let nsnormalized = normalize_namespaces(
    Box::new(impnormalized.into_iter()), i
  ).expect("This error should have been caught during preparsing");
  let nsnormalized = normalize_namespaces(Box::new(impnormalized.into_iter()))
    .expect("This error should have been caught during preparsing");
  let prefixed = prefix(nsnormalized, &i.r(path)[..], ops_cache, i);
  Ok(prefixed)
}
}

@@ -1,82 +1,78 @@
use std::rc::Rc;

use crate::ast::{Constant, Rule};
use crate::interner::{Token, Interner};
use crate::utils::Substack;
use crate::representations::sourcefile::{Member, FileEntry};

use super::collect_ops::ExportedOpsCache;
use crate::ast::{Constant, Rule};
use crate::interner::{Interner, Tok};
use crate::representations::sourcefile::{FileEntry, Member, Namespace};
use crate::utils::Substack;

fn member_rec(
  // level
  mod_stack: Substack<Token<String>>,
  mod_stack: Substack<Tok<String>>,
  // object
  data: Member,
  // context
  path: &[Token<String>],
  path: &[Tok<String>],
  ops_cache: &ExportedOpsCache,
  i: &Interner
  i: &Interner,
) -> Member {
  // let except = |op| imported.contains(&op);
  let except = |_| false;
  let prefix_v = path.iter().copied()
  let prefix_v = path
    .iter()
    .copied()
    .chain(mod_stack.iter().rev_vec_clone().into_iter())
    .collect::<Vec<_>>();
  let prefix = i.i(&prefix_v);
  match data {
    Member::Namespace(name, body) => {
      let new_body = entv_rec(
        mod_stack.push(name),
        body,
        path, ops_cache, i
      );
      Member::Namespace(name, new_body)
    }
    Member::Constant(constant) => Member::Constant(Constant{
    Member::Namespace(Namespace { name, body }) => {
      let new_body = entv_rec(mod_stack.push(name), body, path, ops_cache, i);
      Member::Namespace(Namespace { name, body: new_body })
    },
    Member::Constant(constant) => Member::Constant(Constant {
      name: constant.name,
      value: constant.value.prefix(prefix, i, &except)
      value: constant.value.prefix(prefix, i, &except),
    }),
    Member::Rule(rule) => Member::Rule(Rule{
    Member::Rule(rule) => Member::Rule(Rule {
      prio: rule.prio,
      source: Rc::new(rule.source.iter()
        .map(|e| e.prefix(prefix, i, &except))
        .collect()
      source: Rc::new(
        rule.source.iter().map(|e| e.prefix(prefix, i, &except)).collect(),
      ),
      target: Rc::new(rule.target.iter()
        .map(|e| e.prefix(prefix, i, &except))
        .collect()
      target: Rc::new(
        rule.target.iter().map(|e| e.prefix(prefix, i, &except)).collect(),
      ),
    })
    }),
  }
}

fn entv_rec(
  // level
  mod_stack: Substack<Token<String>>,
  mod_stack: Substack<Tok<String>>,
  // object
  data: Vec<FileEntry>,
  // context
  path: &[Token<String>],
  path: &[Tok<String>],
  ops_cache: &ExportedOpsCache,
  i: &Interner
  i: &Interner,
) -> Vec<FileEntry> {
  data.into_iter().map(|fe| match fe {
    FileEntry::Exported(mem) => FileEntry::Exported(member_rec(
      mod_stack, mem, path, ops_cache, i
    )),
    FileEntry::Internal(mem) => FileEntry::Internal(member_rec(
      mod_stack, mem, path, ops_cache, i
    )),
    // XXX should [FileEntry::Export] be prefixed?
    any => any
  }).collect()
  data
    .into_iter()
    .map(|fe| match fe {
      FileEntry::Exported(mem) =>
        FileEntry::Exported(member_rec(mod_stack, mem, path, ops_cache, i)),
      FileEntry::Internal(mem) =>
        FileEntry::Internal(member_rec(mod_stack, mem, path, ops_cache, i)),
      // XXX should [FileEntry::Export] be prefixed?
      any => any,
    })
    .collect()
}

pub fn prefix(
  data: Vec<FileEntry>,
  path: &[Token<String>],
  path: &[Tok<String>],
  ops_cache: &ExportedOpsCache,
  i: &Interner
  i: &Interner,
) -> Vec<FileEntry> {
  entv_rec(Substack::Bottom, data, path, ops_cache, i)
}
}

@@ -1,33 +1,36 @@
use std::{ops::Add, rc::Rc};
use std::ops::Add;
use std::rc::Rc;

use hashbrown::HashMap;

use crate::representations::tree::{Module, ModMember};
use crate::ast::{Rule, Expr};
use crate::interner::{Token, Interner};
use crate::ast::{Expr, Rule};
use crate::interner::{Interner, Sym, Tok};
use crate::representations::tree::{ModMember, Module};
use crate::utils::Substack;

#[derive(Clone, Debug, Default)]
pub struct ProjectExt{
pub struct ProjectExt {
  /// Pairs each foreign token to the module it was imported from
  pub imports_from: HashMap<Token<String>, Token<Vec<Token<String>>>>,
  pub imports_from: HashMap<Tok<String>, Sym>,
  /// Pairs each exported token to its original full name.
  pub exports: HashMap<Token<String>, Token<Vec<Token<String>>>>,
  pub exports: HashMap<Tok<String>, Sym>,
  /// All rules defined in this module, exported or not
  pub rules: Vec<Rule>,
  /// Filename, if known, for error reporting
  pub file: Option<Vec<Token<String>>>
  pub file: Option<Vec<Tok<String>>>,
}

impl Add for ProjectExt {
  type Output = Self;

  fn add(mut self, rhs: Self) -> Self::Output {
    let ProjectExt{ imports_from, exports, rules, file } = rhs;
    let ProjectExt { imports_from, exports, rules, file } = rhs;
    self.imports_from.extend(imports_from.into_iter());
    self.exports.extend(exports.into_iter());
    self.rules.extend(rules.into_iter());
    if file.is_some() { self.file = file }
    if file.is_some() {
      self.file = file
    }
    self
  }
}
@@ -51,10 +54,10 @@ pub fn collect_rules(project: &ProjectTree) -> Vec<Rule> {
}

fn collect_consts_rec(
  path: Substack<Token<String>>,
  bag: &mut HashMap<Token<Vec<Token<String>>>, Expr>,
  path: Substack<Tok<String>>,
  bag: &mut HashMap<Sym, Expr>,
  module: &ProjectModule,
  i: &Interner
  i: &Interner,
) {
  for (key, entry) in module.items.iter() {
    match &entry.member {
@@ -62,26 +65,18 @@ fn collect_consts_rec(
        let mut name = path.iter().rev_vec_clone();
        name.push(*key);
        bag.insert(i.i(&name), expr.clone());
      }
      ModMember::Sub(module) => {
        collect_consts_rec(
          path.push(*key),
          bag, module, i
        )
      }
      },
      ModMember::Sub(module) =>
        collect_consts_rec(path.push(*key), bag, module, i),
    }
  }
}

pub fn collect_consts(project: &ProjectTree, i: &Interner)
-> HashMap<Token<Vec<Token<String>>>, Expr>
{
pub fn collect_consts(
  project: &ProjectTree,
  i: &Interner,
) -> HashMap<Sym, Expr> {
  let mut consts = HashMap::new();
  collect_consts_rec(
    Substack::Bottom,
    &mut consts,
    project.0.as_ref(),
    i
  );
  collect_consts_rec(Substack::Bottom, &mut consts, project.0.as_ref(), i);
  consts
}
}

@@ -1,47 +1,58 @@
use std::iter;
use std::rc::Rc;

use super::loaded_source::{LoadedSource, LoadedSourceTable};
use super::preparse::preparse;
use crate::interner::{Interner, Sym, Tok};
use crate::pipeline::error::ProjectError;
use crate::pipeline::file_loader::{load_text, IOResult, Loaded};
use crate::pipeline::import_abs_path::import_abs_path;
use crate::pipeline::split_name::split_name;
use crate::interner::{Token, Interner};

use crate::pipeline::file_loader::{Loaded, load_text, IOResult};
use crate::representations::sourcefile::FileEntry;
use super::loaded_source::{LoadedSourceTable, LoadedSource};
use super::preparse::preparse;

/// Load the source at the given path or all within if it's a collection,
/// and all sources imported from these.
fn load_abs_path_rec(
  abs_path: Token<Vec<Token<String>>>,
  abs_path: Sym,
  table: &mut LoadedSourceTable,
  prelude: &[FileEntry],
  i: &Interner,
  get_source: &impl Fn(Token<Vec<Token<String>>>) -> IOResult,
  is_injected: &impl Fn(&[Token<String>]) -> bool
  get_source: &impl Fn(Sym) -> IOResult,
  is_injected: &impl Fn(&[Tok<String>]) -> bool,
) -> Result<(), Rc<dyn ProjectError>> {
  let abs_pathv = i.r(abs_path);
  // short-circuit if this import is defined externally or already known
  if is_injected(&abs_pathv) | table.contains_key(&abs_path) {
    return Ok(())
  if is_injected(abs_pathv) | table.contains_key(&abs_path) {
    return Ok(());
  }
  // try splitting the path to file, swallowing any IO errors
  let is_file = |p| (get_source)(p).map(|l| l.is_code()).unwrap_or(false);
  let name_split = split_name(&abs_pathv, &|p| is_file(i.i(p)));
  let filename = if let Some((f, _)) = name_split {f} else {
  let name_split = split_name(abs_pathv, &|p| is_file(i.i(p)));
  let filename = if let Some((f, _)) = name_split {
    f
  } else {
    // If the path could not be split to file, load it as directory
    let coll = if let Loaded::Collection(c) = (get_source)(abs_path)? {c}
    // ^^ raise any IO error that was previously swallowed
    else {panic!("split_name returned None but the path is a file")};
    let coll = if let Loaded::Collection(c) = (get_source)(abs_path)? {
      c
    }
    // ^^ raise any IO error that was previously swallowed
    else {
      panic!("split_name returned None but the path is a file")
    };
    // recurse on all files and folders within
    for item in coll.iter() {
      let abs_subpath = abs_pathv.iter()
      let abs_subpath = abs_pathv
        .iter()
        .copied()
        .chain(iter::once(i.i(item)))
        .collect::<Vec<_>>();
      load_abs_path_rec(
        i.i(&abs_subpath), table, prelude, i, get_source, is_injected
        i.i(&abs_subpath),
        table,
        prelude,
        i,
        get_source,
        is_injected,
      )?
    }
    return Ok(());
@@ -50,18 +61,23 @@ fn load_abs_path_rec(
  let text = load_text(i.i(filename), &get_source, i)?;
  let preparsed = preparse(
    filename.iter().map(|t| i.r(*t)).cloned().collect(),
    text.as_str(), prelude, i
    text.as_str(),
    prelude,
    i,
  )?;
  table.insert(abs_path, LoadedSource{ text, preparsed: preparsed.clone() });
  table.insert(abs_path, LoadedSource { text, preparsed: preparsed.clone() });
  // recurse on all imported modules
  preparsed.0.visit_all_imports(&mut |modpath, module, import| {
    let abs_pathv = import_abs_path(
      &filename, modpath,
      module, &import.nonglob_path(i), i
    )?;
  preparsed.0.visit_all_imports(&mut |modpath, _module, import| {
    let abs_pathv =
      import_abs_path(filename, modpath, &import.nonglob_path(i), i)?;
    // recurse on imported module
    load_abs_path_rec(
      i.i(&abs_pathv), table, prelude, i, get_source, is_injected
      i.i(&abs_pathv),
      table,
      prelude,
      i,
      get_source,
      is_injected,
    )
  })
}
@@ -69,20 +85,15 @@ fn load_abs_path_rec(
/// Load and preparse all files reachable from the load targets via
/// imports that aren't injected.
pub fn load_source(
  targets: &[Token<Vec<Token<String>>>],
  targets: &[Sym],
  prelude: &[FileEntry],
  i: &Interner,
  get_source: &impl Fn(Token<Vec<Token<String>>>) -> IOResult,
  is_injected: &impl Fn(&[Token<String>]) -> bool,
  get_source: &impl Fn(Sym) -> IOResult,
  is_injected: &impl Fn(&[Tok<String>]) -> bool,
) -> Result<LoadedSourceTable, Rc<dyn ProjectError>> {
  let mut table = LoadedSourceTable::new();
  for target in targets {
    load_abs_path_rec(
      *target,
      &mut table,
      prelude,
      i, get_source, is_injected
    )?
    load_abs_path_rec(*target, &mut table, prelude, i, get_source, is_injected)?
  }
  Ok(table)
}
}

@@ -1,8 +1,8 @@
use std::{rc::Rc, collections::HashMap};

use crate::interner::Token;
use std::collections::HashMap;
use std::rc::Rc;

use super::preparse::Preparsed;
use crate::interner::Sym;

#[derive(Debug)]
pub struct LoadedSource {
@@ -10,4 +10,4 @@ pub struct LoadedSource {
  pub preparsed: Preparsed,
}

pub type LoadedSourceTable = HashMap<Token<Vec<Token<String>>>, LoadedSource>;
pub type LoadedSourceTable = HashMap<Sym, LoadedSource>;

@@ -1,25 +1,24 @@
/* PULL LOGISTICS BOUNDARY

Specifying exactly what this module should be doing was an unexpectedly
hard challenge. It is intended to encapsulate all pull logistics, but
this definition is apparently prone to scope creep.

Load files, preparse them to obtain a list of imports, follow these.
Preparsing also returns the module tree and list of exported symbols
for free, which is needed later so the output of preparsing is also
attached to the module output.

The module checks for IO errors, syntax errors, malformed imports and
imports from missing files. All other errors must be checked later.

Injection strategy:
see whether names are valid in the injected tree for is_injected
*/
// PULL LOGISTICS BOUNDARY
//
// Specifying exactly what this module should be doing was an unexpectedly
// hard challenge. It is intended to encapsulate all pull logistics, but
// this definition is apparently prone to scope creep.
//
// Load files, preparse them to obtain a list of imports, follow these.
// Preparsing also returns the module tree and list of exported symbols
// for free, which is needed later so the output of preparsing is also
// attached to the module output.
//
// The module checks for IO errors, syntax errors, malformed imports and
// imports from missing files. All other errors must be checked later.
//
// Injection strategy:
// see whether names are valid in the injected tree for is_injected

mod load_source;
mod loaded_source;
mod preparse;

pub use loaded_source::{LoadedSource, LoadedSourceTable};
pub use load_source::load_source;
pub use preparse::Preparsed;
pub use loaded_source::{LoadedSource, LoadedSourceTable};
pub use preparse::Preparsed;

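Taken together, the new signatures fix the source loader's public entry point around `Sym` and `Tok<String>`. A minimal sketch of how an embedder might drive it, written as if inside the crate; the `read_source` helper, the `build_table` wrapper, and the always-false injection check are assumptions for illustration, not part of this commit:

```rust
use std::rc::Rc;

use crate::interner::{Interner, Sym, Tok};
use crate::pipeline::error::ProjectError;
use crate::pipeline::file_loader::IOResult;
use crate::pipeline::source_loader::{load_source, LoadedSourceTable};
use crate::representations::sourcefile::FileEntry;

// Hypothetical IO callback supplied by the embedding application:
// resolve an interned path to its text or directory listing.
fn read_source(path: Sym, i: &Interner) -> IOResult {
  unimplemented!("resolve {:?} to a Loaded value", i.extern_vec(path))
}

// Hypothetical driver: load one entry point and everything it imports.
fn build_table(
  i: &Interner,
  entry: Sym,
  prelude: &[FileEntry],
) -> Result<LoadedSourceTable, Rc<dyn ProjectError>> {
  load_source(
    &[entry], // load targets; imports are followed recursively
    prelude,  // entries prepended to every file
    i,
    &|path| read_source(path, i),
    &|_path: &[Tok<String>]| false, // nothing is injected in this sketch
  )
}
```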