Preparation for sharing

- rustfmt
- clippy
- comments
- README

README.md | 26
@@ -1,5 +1,29 @@
-All you need to run the project is a nighly rust toolchain. Go to one of the folders within `examples` and run
+Orchid is an experimental lazy, pure functional programming language designed to be embeddable in a Rust application for scripting.
+
+# Usage
+
+TODO
+
+I need to write a few articles explaining individual fragments of the language, and accurately document everything. Writing tutorials at this stage is not really worth it.
+
+# Design
+
+The execution model is lambda calculus, with call-by-name and copy tracking to avoid repeating steps. This leads to the minimal number of necessary reduction steps.
+
+To make the syntax more intuitive, completely hygienic macros can be used, which are applied to expressions after all imports are resolved and all tokens are namespaced, both in the macro and in the referencing expression.
+
+Namespaces are inspired by Rust modules and ES6. Every file and directory is implicitly a public module. Files can `export` names of constants or namespaces, all names in a substitution rule, or explicitly export some names. Names are implicitly created when they're referenced. `import` syntax is similar to Rust's, except with `(` parentheses `)` and no semicolons.
+
+# Try it out
+
+The project uses the nightly Rust toolchain. Go to one of the folders within `examples` and run
+
 ```sh
 cargo run -- -p .
 ```
+
+You can try modifying the examples, but error reporting for the time being is pretty terrible.
+
+# Contribution
+
+All contributions are welcome. For the time being, use the issue tracker to discuss ideas.

notes/macros.md (new file) | 27
@@ -0,0 +1,27 @@
+Substitution rules are represented by the `=prio=>` arrow, where `prio` is a floating point literal. They are tested from highest priority to lowest. When one matches, the substitution is executed and all macros are re-checked from the beginning.
+
+Wildcards either match a single token (`$foo`), at least one token (`...$bar`), or any number of tokens (`..$baz`). The latter two forms can also carry an unsigned integer growth priority (`...$quz:3`) which influences their order in deciding the precedence of matches.
+
+# Match priority
+
+When a macro matches the program more than once, matches in ancestors take precedence. If there's no direct ancestry, the left branch takes precedence. When two matches are found in the same token sequence, the order is determined by the number of tokens allocated to the highest-priority variable-length wildcard where this number differs.
+
+Variable-length placeholders outside parens always have a higher priority than those inside. On the same level, the numbers decide the priority. In case of a tie, the placeholder to the left is preferred.
+
+# Writing macros
+
+Macro programs are systems of substitution rules which reinterpret the tree produced by the previous rules. A good example of how this works can be found in ../examples/list-processing/fn.orc
+
+Priority numbers are written in hexadecimal normal form to avoid precision bugs, and they're divided into bands throughout the f64 value range (the numbers below represent powers of 16):
+
+- **32-39**: Binary operators, in inverse priority order
+- **80-87**: Expression-like structures such as if/then/else
+- **128-135**: Anything that creates lambdas
+  Programs triggered by a lower-priority pattern than this can assume that all names are correctly bound.
+- **200**: Aliases extracted for readability
+  The user-accessible entry points of all macro programs must be lower priority than this, so that any arbitrary syntax can be extracted into an alias with no side effects.
+- **224-231**: Integration; documented hooks exposed by a macro package to allow third-party packages to extend its functionality
+  The `statement` pattern produced by `do{}` blocks and matched by `let` and `cps` is a good example of this. When any of these are triggered, all macro programs are in a documented state.
+- **248-255**: Transitional states within macro programs get the highest priority
+
+The numbers are arbitrary and up for debate. These are just the ones I came up with when writing the examples.

@@ -24,7 +24,7 @@
     "editor.formatOnType": true,
   },
   "[rust]": {
-    "editor.rulers": [74]
+    "editor.rulers": [80],
   },
   "rust-analyzer.showUnlinkedFileNotification": false,
   "rust-analyzer.checkOnSave": true,

rustfmt.toml (new file) | 34
@@ -0,0 +1,34 @@
+# meta
+format_code_in_doc_comments = true
+unstable_features = true
+version = "Two"
+
+# space
+tab_spaces = 2
+max_width = 80
+error_on_line_overflow = true
+format_macro_matchers = true
+newline_style = "Unix"
+normalize_comments = true
+wrap_comments = true
+overflow_delimited_expr = true
+single_line_if_else_max_width = 50
+use_small_heuristics = "Max"
+
+# literals
+hex_literal_case = "Lower"
+format_strings = true
+
+# delimiters
+match_arm_blocks = false
+match_block_trailing_comma = true
+
+# structure
+condense_wildcard_suffixes = true
+use_field_init_shorthand = true
+use_try_shorthand = true
+
+# Modules
+group_imports = "StdExternalCrate"
+imports_granularity = "Module"
+reorder_modules = true
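
Note: `unstable_features = true` and `version = "Two"` are nightly-only rustfmt options, so this configuration is applied with the nightly toolchain (e.g. `cargo +nightly fmt`), consistent with the nightly requirement stated in the README.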

src/cli.rs | 11
@@ -1,19 +1,22 @@
-use std::{fmt::Display, io::{stdin, BufRead, stdout, Write}};
+use std::fmt::Display;
+use std::io::{stdin, stdout, BufRead, Write};
 
 pub fn prompt<T: Display, E: Display>(
   prompt: &str,
   default: T,
-  mut try_cast: impl FnMut(String) -> Result<T, E>
+  mut try_cast: impl FnMut(String) -> Result<T, E>,
 ) -> T {
   loop {
     print!("{prompt} ({default}): ");
     stdout().lock().flush().unwrap();
     let mut input = String::with_capacity(100);
     stdin().lock().read_line(&mut input).unwrap();
-    if input.is_empty() {return default}
+    if input.is_empty() {
+      return default;
+    }
     match try_cast(input) {
       Ok(t) => return t,
-      Err(e) => println!("Error: {e}")
+      Err(e) => println!("Error: {e}"),
     }
   }
 }
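
As a usage sketch for the `prompt` helper above (a hypothetical call site, not part of this commit; it assumes the file is mounted as `crate::cli`, and the prompt text, default and function name are illustrative):

```rust
use crate::cli::prompt;

// Ask for an iteration budget on stdin; keep 50 if the user just hits enter.
// read_line keeps the trailing newline, so trim before parsing.
fn ask_gas() -> u64 {
  prompt("Gas limit", 50u64, |s| s.trim().parse::<u64>())
}
```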

src/external/assertion_error.rs (vendored) | 18
@@ -1,23 +1,27 @@
-use std::rc::Rc;
 use std::fmt::Display;
+use std::rc::Rc;
 
 use crate::foreign::ExternError;
 use crate::representations::interpreted::ExprInst;
 
+/// Some expectation (usually about the argument types of a function) did not
+/// hold.
 #[derive(Clone)]
-pub struct AssertionError{
+pub struct AssertionError {
   pub value: ExprInst,
   pub assertion: &'static str,
 }
 
 impl AssertionError {
-  pub fn fail(value: ExprInst, assertion: &'static str) -> Result<!, Rc<dyn ExternError>> {
-    return Err(Self { value, assertion }.into_extern())
+  pub fn fail(
+    value: ExprInst,
+    assertion: &'static str,
+  ) -> Result<!, Rc<dyn ExternError>> {
+    return Err(Self { value, assertion }.into_extern());
   }
 
   pub fn ext(value: ExprInst, assertion: &'static str) -> Rc<dyn ExternError> {
-    return Self { value, assertion }.into_extern()
+    return Self { value, assertion }.into_extern();
   }
 }
@@ -27,4 +31,4 @@ impl Display for AssertionError {
   }
 }
 
-impl ExternError for AssertionError{}
+impl ExternError for AssertionError {}

src/external/bool/boolean.rs (vendored) | 16
@@ -1,12 +1,18 @@
-use crate::foreign::Atom;
-use crate::representations::{interpreted::{Clause, ExprInst}, Primitive};
 use crate::atomic_inert;
+use crate::foreign::Atom;
+use crate::representations::interpreted::{Clause, ExprInst};
+use crate::representations::Primitive;
 
+/// Booleans exposed to Orchid
 #[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
 pub struct Boolean(pub bool);
 atomic_inert!(Boolean);
 
-impl From<bool> for Boolean { fn from(value: bool) -> Self { Self(value) } }
+impl From<bool> for Boolean {
+  fn from(value: bool) -> Self {
+    Self(value)
+  }
+}
 
 impl TryFrom<ExprInst> for Boolean {
   type Error = ();
@@ -15,9 +21,9 @@ impl TryFrom<ExprInst> for Boolean {
     let expr = value.expr();
     if let Clause::P(Primitive::Atom(Atom(a))) = &expr.clause {
       if let Some(b) = a.as_any().downcast_ref::<Boolean>() {
-        return Ok(*b)
+        return Ok(*b);
       }
     }
-    return Err(())
+    Err(())
   }
 }

src/external/bool/equals.rs (vendored) | 42
@@ -1,48 +1,54 @@
 use std::fmt::Debug;
 
-use crate::external::litconv::with_lit;
-use crate::representations::{interpreted::ExprInst, Literal};
-use crate::{atomic_impl, atomic_redirect, externfn_impl};
-
 use super::super::assertion_error::AssertionError;
 use super::boolean::Boolean;
+use crate::external::litconv::with_lit;
+use crate::representations::interpreted::ExprInst;
+use crate::representations::Literal;
+use crate::{atomic_impl, atomic_redirect, externfn_impl};
 
-/// Equals function
+/// Compares the inner values if
+///
+/// - both values are char,
+/// - both are string,
+/// - both are either uint or num
 ///
 /// Next state: [Equals1]
 #[derive(Clone)]
 pub struct Equals2;
-externfn_impl!(Equals2, |_: &Self, x: ExprInst| Ok(Equals1{x}));
+externfn_impl!(Equals2, |_: &Self, x: ExprInst| Ok(Equals1 { x }));
 
-/// Partially applied Equals function
-///
 /// Prev state: [Equals2]; Next state: [Equals0]
 #[derive(Debug, Clone)]
-pub struct Equals1{ x: ExprInst }
+pub struct Equals1 {
+  x: ExprInst,
+}
 atomic_redirect!(Equals1, x);
 atomic_impl!(Equals1);
 externfn_impl!(Equals1, |this: &Self, x: ExprInst| {
-  with_lit(&this.x, |l| Ok(Equals0{ a: l.clone(), x }))
+  with_lit(&this.x, |l| Ok(Equals0 { a: l.clone(), x }))
 });
 
-/// Fully applied Equals function.
-///
 /// Prev state: [Equals1]
 #[derive(Debug, Clone)]
-pub struct Equals0 { a: Literal, x: ExprInst }
+pub struct Equals0 {
+  a: Literal,
+  x: ExprInst,
+}
 atomic_redirect!(Equals0, x);
-atomic_impl!(Equals0, |Self{ a, x }: &Self, _| {
-  let eqls = with_lit(x, |l| Ok(match (a, l) {
+atomic_impl!(Equals0, |Self { a, x }: &Self, _| {
+  let eqls = with_lit(x, |l| {
+    Ok(match (a, l) {
      (Literal::Char(c1), Literal::Char(c2)) => c1 == c2,
      (Literal::Num(n1), Literal::Num(n2)) => n1 == n2,
      (Literal::Str(s1), Literal::Str(s2)) => s1 == s2,
      (Literal::Uint(i1), Literal::Uint(i2)) => i1 == i2,
      (Literal::Num(n1), Literal::Uint(u1)) => *n1 == (*u1 as f64),
      (Literal::Uint(u1), Literal::Num(n1)) => *n1 == (*u1 as f64),
-     (_, _) => AssertionError::fail(x.clone(), "the expected type")?,
-  }))?;
+     (..) => AssertionError::fail(x.clone(), "the expected type")?,
+    })
+  })?;
   Ok(Boolean::from(eqls).to_atom_cls())
 });

src/external/bool/ifthenelse.rs (vendored) | 47
@@ -1,41 +1,46 @@
 use std::fmt::Debug;
 use std::rc::Rc;
 
+use super::Boolean;
 use crate::external::assertion_error::AssertionError;
-use crate::representations::{PathSet, interpreted::{Clause, ExprInst}};
+use crate::representations::interpreted::{Clause, ExprInst};
+use crate::representations::PathSet;
 use crate::{atomic_impl, atomic_redirect, externfn_impl};
 
-use super::Boolean;
-/// IfThenElse function
+/// Takes a boolean and two branches, runs the first if the bool is true, the
+/// second if it's false.
 ///
 /// Next state: [IfThenElse0]
 #[derive(Clone)]
 pub struct IfThenElse1;
-externfn_impl!(IfThenElse1, |_: &Self, x: ExprInst| Ok(IfThenElse0{x}));
+externfn_impl!(IfThenElse1, |_: &Self, x: ExprInst| Ok(IfThenElse0 { x }));
 
-/// Partially applied IfThenElse function
-///
-/// Prev state: [IfThenElse1]; Next state: [IfThenElse0]
+/// Prev state: [IfThenElse1]
 #[derive(Debug, Clone)]
-pub struct IfThenElse0{ x: ExprInst }
+pub struct IfThenElse0 {
+  x: ExprInst,
+}
 atomic_redirect!(IfThenElse0, x);
 atomic_impl!(IfThenElse0, |this: &Self, _| {
-  let Boolean(b) = this.x.clone().try_into()
+  let Boolean(b) = this
+    .x
+    .clone()
+    .try_into()
     .map_err(|_| AssertionError::ext(this.x.clone(), "a boolean"))?;
-  Ok(if b { Clause::Lambda {
-    args: Some(PathSet { steps: Rc::new(vec![]), next: None }),
-    body: Clause::Lambda {
-      args: None,
-      body: Clause::LambdaArg.wrap()
-    }.wrap()
-  }} else { Clause::Lambda {
-    args: None,
-    body: Clause::Lambda {
-      args: Some(PathSet { steps: Rc::new(vec![]), next: None }),
-      body: Clause::LambdaArg.wrap()
-    }.wrap()
-  }})
+  Ok(if b {
+    Clause::Lambda {
+      args: Some(PathSet { steps: Rc::new(vec![]), next: None }),
+      body: Clause::Lambda { args: None, body: Clause::LambdaArg.wrap() }
+        .wrap(),
+    }
+  } else {
+    Clause::Lambda {
+      args: None,
+      body: Clause::Lambda {
+        args: Some(PathSet { steps: Rc::new(vec![]), next: None }),
+        body: Clause::LambdaArg.wrap(),
+      }
+      .wrap(),
+    }
+  })
 });

src/external/bool/mod.rs (vendored) | 8
@@ -1,16 +1,16 @@
-mod equals;
 mod boolean;
+mod equals;
 mod ifthenelse;
 pub use boolean::Boolean;
 
-use crate::{pipeline::ConstTree, interner::Interner};
+use crate::interner::Interner;
+use crate::pipeline::ConstTree;
 
 pub fn bool(i: &Interner) -> ConstTree {
   ConstTree::tree([
     (i.i("ifthenelse"), ConstTree::xfn(ifthenelse::IfThenElse1)),
     (i.i("equals"), ConstTree::xfn(equals::Equals2)),
     (i.i("true"), ConstTree::atom(Boolean(true))),
-    (i.i("false"), ConstTree::atom(Boolean(false)))
+    (i.i("false"), ConstTree::atom(Boolean(false))),
   ])
 }

src/external/conv/mod.rs (vendored) | 7
@@ -1,13 +1,14 @@
-use crate::{interner::Interner, pipeline::ConstTree};
+use crate::interner::Interner;
+use crate::pipeline::ConstTree;
 
-mod to_string;
 mod parse_float;
 mod parse_uint;
+mod to_string;
 
 pub fn conv(i: &Interner) -> ConstTree {
   ConstTree::tree([
     (i.i("parse_float"), ConstTree::xfn(parse_float::ParseFloat1)),
     (i.i("parse_uint"), ConstTree::xfn(parse_uint::ParseUint1)),
-    (i.i("to_string"), ConstTree::xfn(to_string::ToString1))
+    (i.i("to_string"), ConstTree::xfn(to_string::ToString1)),
   ])
 }

src/external/conv/parse_float.rs (vendored) | 38
@@ -1,41 +1,43 @@
+use std::fmt::Debug;
+
 use chumsky::Parser;
 
-use std::fmt::Debug;
-
 use super::super::assertion_error::AssertionError;
 use crate::external::litconv::with_lit;
 use crate::parse::float_parser;
-use crate::representations::{interpreted::ExprInst, Literal};
+use crate::representations::interpreted::ExprInst;
+use crate::representations::Literal;
 use crate::{atomic_impl, atomic_redirect, externfn_impl};
 
-/// ParseFloat a number
+/// parse a number. Accepts the same syntax Orchid does
 ///
 /// Next state: [ParseFloat0]
 #[derive(Clone)]
 pub struct ParseFloat1;
-externfn_impl!(ParseFloat1, |_: &Self, x: ExprInst| Ok(ParseFloat0{x}));
+externfn_impl!(ParseFloat1, |_: &Self, x: ExprInst| Ok(ParseFloat0 { x }));
 
-/// Applied to_string function
-///
 /// Prev state: [ParseFloat1]
 #[derive(Debug, Clone)]
-pub struct ParseFloat0{ x: ExprInst }
+pub struct ParseFloat0 {
+  x: ExprInst,
+}
 atomic_redirect!(ParseFloat0, x);
-atomic_impl!(ParseFloat0, |Self{ x }: &Self, _| {
-  let number = with_lit(x, |l| Ok(match l {
+atomic_impl!(ParseFloat0, |Self { x }: &Self, _| {
+  let number = with_lit(x, |l| {
+    Ok(match l {
      Literal::Str(s) => {
        let parser = float_parser();
-       parser.parse(s.as_str())
-         .map_err(|_| AssertionError::ext(x.clone(), "cannot be parsed into a float"))?
-     }
+       parser.parse(s.as_str()).map_err(|_| {
+         AssertionError::ext(x.clone(), "cannot be parsed into a float")
+       })?
+     },
      Literal::Num(n) => *n,
      Literal::Uint(i) => (*i as u32).into(),
-     Literal::Char(char) => char.to_digit(10)
+     Literal::Char(char) => char
+       .to_digit(10)
        .ok_or(AssertionError::ext(x.clone(), "is not a decimal digit"))?
-       .into()
-  }))?;
+       .into(),
+    })
+  })?;
   Ok(number.into())
 });

src/external/conv/parse_uint.rs (vendored) | 47
@@ -1,40 +1,47 @@
+use std::fmt::Debug;
+
 use chumsky::Parser;
 
-use std::fmt::Debug;
-use crate::external::{litconv::with_lit, assertion_error::AssertionError};
-use crate::representations::{interpreted::ExprInst, Literal};
-use crate::{atomic_impl, atomic_redirect, externfn_impl};
+use crate::external::assertion_error::AssertionError;
+use crate::external::litconv::with_lit;
 use crate::parse::int_parser;
+use crate::representations::interpreted::ExprInst;
+use crate::representations::Literal;
+use crate::{atomic_impl, atomic_redirect, externfn_impl};
 
-/// Parse a number
+/// Parse an unsigned integer. Accepts the same formats Orchid does. If the
+/// input is a number, floors it.
 ///
 /// Next state: [ParseUint0]
 #[derive(Clone)]
 pub struct ParseUint1;
-externfn_impl!(ParseUint1, |_: &Self, x: ExprInst| Ok(ParseUint0{x}));
+externfn_impl!(ParseUint1, |_: &Self, x: ExprInst| Ok(ParseUint0 { x }));
 
-/// Applied ParseUint function
-///
 /// Prev state: [ParseUint1]
 #[derive(Debug, Clone)]
-pub struct ParseUint0{ x: ExprInst }
+pub struct ParseUint0 {
+  x: ExprInst,
+}
 atomic_redirect!(ParseUint0, x);
-atomic_impl!(ParseUint0, |Self{ x }: &Self, _| {
-  let uint = with_lit(x, |l| Ok(match l {
+atomic_impl!(ParseUint0, |Self { x }: &Self, _| {
+  let uint = with_lit(x, |l| {
+    Ok(match l {
      Literal::Str(s) => {
        let parser = int_parser();
-       parser.parse(s.as_str())
-         .map_err(|_| AssertionError::ext(x.clone(), "cannot be parsed into an unsigned int"))?
-     }
+       parser.parse(s.as_str()).map_err(|_| {
+         AssertionError::ext(
+           x.clone(),
+           "cannot be parsed into an unsigned int",
+         )
+       })?
+     },
      Literal::Num(n) => n.floor() as u64,
      Literal::Uint(i) => *i,
-     Literal::Char(char) => char.to_digit(10)
+     Literal::Char(char) => char
+       .to_digit(10)
        .ok_or(AssertionError::ext(x.clone(), "is not a decimal digit"))?
-       .into()
-  }))?;
+       .into(),
+    })
+  })?;
   Ok(uint.into())
 });

src/external/conv/to_string.rs (vendored) | 27
@@ -1,31 +1,32 @@
 use std::fmt::Debug;
 
 use crate::external::litconv::with_lit;
-use crate::representations::{interpreted::ExprInst, Literal};
+use crate::representations::interpreted::ExprInst;
+use crate::representations::Literal;
 use crate::{atomic_impl, atomic_redirect, externfn_impl};
 
-/// ToString a clause
+/// Convert a literal to a string using Rust's conversions for floats, chars and
+/// uints respectively
 ///
 /// Next state: [ToString0]
 #[derive(Clone)]
 pub struct ToString1;
-externfn_impl!(ToString1, |_: &Self, x: ExprInst| Ok(ToString0{x}));
+externfn_impl!(ToString1, |_: &Self, x: ExprInst| Ok(ToString0 { x }));
 
-/// Applied ToString function
-///
 /// Prev state: [ToString1]
 #[derive(Debug, Clone)]
-pub struct ToString0{ x: ExprInst }
+pub struct ToString0 {
+  x: ExprInst,
+}
 atomic_redirect!(ToString0, x);
-atomic_impl!(ToString0, |Self{ x }: &Self, _| {
-  let string = with_lit(x, |l| Ok(match l {
+atomic_impl!(ToString0, |Self { x }: &Self, _| {
+  let string = with_lit(x, |l| {
+    Ok(match l {
      Literal::Char(c) => c.to_string(),
      Literal::Uint(i) => i.to_string(),
      Literal::Num(n) => n.to_string(),
-     Literal::Str(s) => s.clone()
-  }))?;
+     Literal::Str(s) => s.clone(),
+    })
+  })?;
   Ok(string.into())
 });

src/external/cpsio/debug.rs (vendored) | 25
@@ -3,31 +3,30 @@ use std::fmt::Debug;
 use crate::foreign::{Atomic, AtomicReturn};
 use crate::interner::InternedDisplay;
 use crate::interpreter::Context;
-use crate::{externfn_impl, atomic_defaults};
 use crate::representations::interpreted::ExprInst;
+use crate::{atomic_defaults, externfn_impl};
 
-/// Debug function
+/// Print and return whatever expression is in the argument without normalizing
+/// it.
 ///
-/// Next state: [Debug0]
+/// Next state: [Debug1]
 #[derive(Clone)]
 pub struct Debug2;
-externfn_impl!(Debug2, |_: &Self, x: ExprInst| Ok(Debug1{x}));
+externfn_impl!(Debug2, |_: &Self, x: ExprInst| Ok(Debug1 { x }));
 
-/// Partially applied Print function
-///
-/// Prev state: [Debug1]
+/// Prev state: [Debug2]
 #[derive(Debug, Clone)]
-pub struct Debug1{ x: ExprInst }
+pub struct Debug1 {
+  x: ExprInst,
+}
 impl Atomic for Debug1 {
   atomic_defaults!();
   fn run(&self, ctx: Context) -> crate::foreign::AtomicResult {
-    println!("{}", self.x.bundle(&ctx.interner));
-    Ok(AtomicReturn{
+    println!("{}", self.x.bundle(ctx.interner));
+    Ok(AtomicReturn {
       clause: self.x.expr().clause.clone(),
       gas: ctx.gas.map(|g| g - 1),
-      inert: false
+      inert: false,
     })
   }
 }

src/external/cpsio/io.rs (vendored) | 43
@@ -1,35 +1,40 @@
-use std::io::{self, Write, stdin};
+use std::io::{self, Write};
 
-use crate::{representations::{interpreted::{ExprInst, Clause}, Primitive, Literal}, atomic_inert, interpreter::{HandlerParm, HandlerRes}, unwrap_or, external::runtime_error::RuntimeError};
+use crate::external::runtime_error::RuntimeError;
+use crate::interpreter::{HandlerParm, HandlerRes};
+use crate::representations::interpreted::{Clause, ExprInst};
+use crate::representations::{Literal, Primitive};
+use crate::{atomic_inert, unwrap_or};
 
+/// An IO command to be handled by the host application.
 #[derive(Clone, Debug)]
 pub enum IO {
   Print(String, ExprInst),
-  Readline(ExprInst)
+  Readline(ExprInst),
 }
 atomic_inert!(IO);
 
+/// Default command handler for IO actions
 pub fn handle(effect: HandlerParm) -> HandlerRes {
-  let io: &IO = unwrap_or!(
-    effect.as_any().downcast_ref();
-    return Err(effect)
-  );
-  match io {
-    IO::Print(str, cont) => {
-      print!("{}", str);
-      io::stdout().flush().unwrap();
-      Ok(Ok(cont.clone()))
-    },
-    IO::Readline(cont) => {
-      let mut buf = String::new();
-      if let Err(e) = stdin().read_line(&mut buf) {
-        return Ok(Err(RuntimeError::ext(e.to_string(), "reading from stdin")));
-      }
-      buf.pop();
-      Ok(Ok(Clause::Apply {
-        f: cont.clone(),
-        x: Clause::P(Primitive::Literal(Literal::Str(buf))).wrap()
-      }.wrap()))
-    }
-  }
+  // Downcast command
+  let io: &IO = unwrap_or!(effect.as_any().downcast_ref(); Err(effect)?);
+  // Interpret and execute
+  Ok(match io {
+    IO::Print(str, cont) => {
+      print!("{}", str);
+      io::stdout()
+        .flush()
+        .map_err(|e| RuntimeError::ext(e.to_string(), "writing to stdout"))?;
+      cont.clone()
+    },
+    IO::Readline(cont) => {
+      let mut buf = String::new();
+      io::stdin()
+        .read_line(&mut buf)
+        .map_err(|e| RuntimeError::ext(e.to_string(), "reading from stdin"))?;
+      buf.pop();
+      let x = Clause::P(Primitive::Literal(Literal::Str(buf))).wrap();
+      Clause::Apply { f: cont.clone(), x }.wrap()
+    },
+  })
 }

src/external/cpsio/mod.rs (vendored) | 13
@@ -1,18 +1,19 @@
-use crate::{interner::Interner, pipeline::ConstTree};
+use crate::interner::Interner;
+use crate::pipeline::ConstTree;
 
+mod debug;
+mod io;
+mod panic;
 mod print;
 mod readline;
-mod debug;
-mod panic;
-mod io;
 
-pub use io::{IO, handle};
+pub use io::{handle, IO};
 
 pub fn cpsio(i: &Interner) -> ConstTree {
   ConstTree::tree([
     (i.i("print"), ConstTree::xfn(print::Print2)),
     (i.i("readline"), ConstTree::xfn(readline::Readln2)),
     (i.i("debug"), ConstTree::xfn(debug::Debug2)),
-    (i.i("panic"), ConstTree::xfn(panic::Panic1))
+    (i.i("panic"), ConstTree::xfn(panic::Panic1)),
   ])
 }

src/external/cpsio/panic.rs (vendored) | 22
@@ -1,23 +1,29 @@
 use std::fmt::Display;
 
-use crate::{atomic_impl, atomic_redirect, externfn_impl};
 use crate::external::litconv::with_str;
-use crate::representations::interpreted::ExprInst;
 use crate::foreign::ExternError;
+use crate::representations::interpreted::ExprInst;
+use crate::{atomic_impl, atomic_redirect, externfn_impl};
 
+/// Takes a message, returns an [ExternError] unconditionally.
+///
+/// Next state: [Panic0]
 #[derive(Clone)]
 pub struct Panic1;
-externfn_impl!(Panic1, |_: &Self, x: ExprInst| Ok(Panic0{ x }));
+externfn_impl!(Panic1, |_: &Self, x: ExprInst| Ok(Panic0 { x }));
 
+/// Prev state: [Panic1]
 #[derive(Debug, Clone)]
-pub struct Panic0{ x: ExprInst }
+pub struct Panic0 {
+  x: ExprInst,
+}
 atomic_redirect!(Panic0, x);
-atomic_impl!(Panic0, |Self{ x }: &Self, _| {
-  with_str(x, |s| {
-    Err(OrchidPanic(s.clone()).into_extern())
-  })
+atomic_impl!(Panic0, |Self { x }: &Self, _| {
+  with_str(x, |s| Err(OrchidPanic(s.clone()).into_extern()))
 });
 
+/// An unrecoverable error in Orchid land. Of course, because Orchid is lazy, it
+/// only applies to the expressions that use the one that generated it.
 pub struct OrchidPanic(String);
 
 impl Display for OrchidPanic {

src/external/cpsio/print.rs (vendored) | 33
@@ -1,43 +1,40 @@
 use std::fmt::Debug;
 
+use super::io::IO;
 use crate::external::litconv::with_str;
 use crate::foreign::{Atomic, AtomicResult, AtomicReturn};
 use crate::interpreter::Context;
-use crate::{atomic_impl, atomic_redirect, externfn_impl, atomic_defaults};
 use crate::representations::interpreted::ExprInst;
+use crate::{atomic_defaults, atomic_impl, atomic_redirect, externfn_impl};
 
-use super::io::IO;
-/// Print function
+/// Wrap a string and the continuation into an [IO] event to be evaluated by the
+/// embedder.
 ///
 /// Next state: [Print1]
 #[derive(Clone)]
 pub struct Print2;
-externfn_impl!(Print2, |_: &Self, x: ExprInst| Ok(Print1{x}));
+externfn_impl!(Print2, |_: &Self, x: ExprInst| Ok(Print1 { x }));
 
-/// Partially applied Print function
-///
 /// Prev state: [Print2]; Next state: [Print0]
 #[derive(Debug, Clone)]
-pub struct Print1{ x: ExprInst }
+pub struct Print1 {
+  x: ExprInst,
+}
 atomic_redirect!(Print1, x);
 atomic_impl!(Print1);
 externfn_impl!(Print1, |this: &Self, x: ExprInst| {
-  with_str(&this.x, |s| {
-    Ok(Print0{ s: s.clone(), x })
-  })
+  with_str(&this.x, |s| Ok(Print0 { s: s.clone(), x }))
 });
 
+/// Prev state: [Print1]
 #[derive(Debug, Clone)]
-pub struct Print0{ s: String, x: ExprInst }
+pub struct Print0 {
+  s: String,
+  x: ExprInst,
+}
 impl Atomic for Print0 {
   atomic_defaults!();
   fn run(&self, ctx: Context) -> AtomicResult {
-    Ok(AtomicReturn::from_data(
-      IO::Print(self.s.clone(), self.x.clone()),
-      ctx
-    ))
+    Ok(AtomicReturn::from_data(IO::Print(self.s.clone(), self.x.clone()), ctx))
   }
 }

src/external/cpsio/readline.rs (vendored) | 25
@@ -1,32 +1,27 @@
 use std::fmt::Debug;
 
+use super::io::IO;
 use crate::foreign::{Atomic, AtomicResult, AtomicReturn};
 use crate::interpreter::Context;
-use crate::{externfn_impl, atomic_defaults};
 use crate::representations::interpreted::ExprInst;
+use crate::{atomic_defaults, externfn_impl};
 
-use super::io::IO;
-/// Readln function
+/// Create an [IO] event that reads a line from standard input and calls the
+/// continuation with it.
 ///
 /// Next state: [Readln1]
 #[derive(Clone)]
 pub struct Readln2;
-externfn_impl!(Readln2, |_: &Self, x: ExprInst| Ok(Readln1{x}));
+externfn_impl!(Readln2, |_: &Self, x: ExprInst| Ok(Readln1 { x }));
 
-/// Partially applied Readln function
-///
-/// Prev state: [Readln2]; Next state: [Readln0]
+/// Prev state: [Readln2]
 #[derive(Debug, Clone)]
-pub struct Readln1{ x: ExprInst }
+pub struct Readln1 {
+  x: ExprInst,
+}
 impl Atomic for Readln1 {
   atomic_defaults!();
   fn run(&self, ctx: Context) -> AtomicResult {
-    Ok(AtomicReturn::from_data(
-      IO::Readline(self.x.clone()),
-      ctx
-    ))
+    Ok(AtomicReturn::from_data(IO::Readline(self.x.clone()), ctx))
   }
 }

src/external/litconv.rs (vendored) | 29
@@ -1,33 +1,44 @@
 use std::rc::Rc;
 
-use crate::foreign::ExternError;
 use crate::external::assertion_error::AssertionError;
+use crate::foreign::ExternError;
 use crate::representations::interpreted::ExprInst;
 use crate::representations::Literal;
 
-pub fn with_lit<T>(x: &ExprInst,
-  predicate: impl FnOnce(&Literal) -> Result<T, Rc<dyn ExternError>>
+/// Tries to cast the [ExprInst] as a [Literal], calls the provided function on
+/// it if successful. Returns a generic [AssertionError] if not.
+pub fn with_lit<T>(
+  x: &ExprInst,
+  predicate: impl FnOnce(&Literal) -> Result<T, Rc<dyn ExternError>>,
 ) -> Result<T, Rc<dyn ExternError>> {
   x.with_literal(predicate)
     .map_err(|()| AssertionError::ext(x.clone(), "a literal value"))
     .and_then(|r| r)
 }
 
-pub fn with_str<T>(x: &ExprInst,
-  predicate: impl FnOnce(&String) -> Result<T, Rc<dyn ExternError>>
+/// Like [with_lit] but also unwraps [Literal::Str]
+pub fn with_str<T>(
+  x: &ExprInst,
+  predicate: impl FnOnce(&String) -> Result<T, Rc<dyn ExternError>>,
 ) -> Result<T, Rc<dyn ExternError>> {
   with_lit(x, |l| {
-    if let Literal::Str(s) = l {predicate(&s)} else {
+    if let Literal::Str(s) = l {
+      predicate(s)
+    } else {
       AssertionError::fail(x.clone(), "a string")?
     }
   })
 }
 
-pub fn with_uint<T>(x: &ExprInst,
-  predicate: impl FnOnce(u64) -> Result<T, Rc<dyn ExternError>>
+/// Like [with_lit] but also unwraps [Literal::Uint]
+pub fn with_uint<T>(
+  x: &ExprInst,
+  predicate: impl FnOnce(u64) -> Result<T, Rc<dyn ExternError>>,
 ) -> Result<T, Rc<dyn ExternError>> {
   with_lit(x, |l| {
-    if let Literal::Uint(u) = l {predicate(*u)} else {
+    if let Literal::Uint(u) = l {
+      predicate(*u)
+    } else {
       AssertionError::fail(x.clone(), "an uint")?
     }
   })
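
A minimal sketch of how these helpers are meant to be used from an extern function elsewhere in the `external` module (illustrative only; `double_uint` is not part of this commit):

```rust
use std::rc::Rc;

use crate::external::litconv::with_uint;
use crate::foreign::ExternError;
use crate::representations::interpreted::ExprInst;

// Doubles an Orchid uint argument; any other clause yields an AssertionError
// as defined above, via with_uint and with_lit.
fn double_uint(x: &ExprInst) -> Result<u64, Rc<dyn ExternError>> {
  with_uint(x, |u| Ok(u * 2))
}
```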

src/external/mod.rs (vendored) | 14
@@ -1,11 +1,11 @@
-mod num;
 mod assertion_error;
-pub mod std;
-mod conv;
-mod str;
-mod cpsio;
-mod runtime_error;
 mod bool;
+mod conv;
+mod cpsio;
 mod litconv;
+mod num;
+mod runtime_error;
+pub mod std;
+mod str;
 
-pub use cpsio::{IO, handle};
+pub use cpsio::{handle, IO};

src/external/num/mod.rs (vendored) | 5
@@ -2,7 +2,8 @@ mod numeric;
 pub mod operators;
 pub use numeric::Numeric;
 
-use crate::{interner::Interner, pipeline::ConstTree};
+use crate::interner::Interner;
+use crate::pipeline::ConstTree;
 
 pub fn num(i: &Interner) -> ConstTree {
   ConstTree::tree([
@@ -10,6 +11,6 @@ pub fn num(i: &Interner) -> ConstTree {
     (i.i("subtract"), ConstTree::xfn(operators::subtract::Subtract2)),
     (i.i("multiply"), ConstTree::xfn(operators::multiply::Multiply2)),
     (i.i("divide"), ConstTree::xfn(operators::divide::Divide2)),
-    (i.i("remainder"), ConstTree::xfn(operators::remainder::Remainder2))
+    (i.i("remainder"), ConstTree::xfn(operators::remainder::Remainder2)),
   ])
 }

src/external/num/numeric.rs (vendored) | 41
@@ -1,4 +1,4 @@
-use std::ops::{Add, Sub, Mul, Div, Rem};
+use std::ops::{Add, Div, Mul, Rem, Sub};
 use std::rc::Rc;
 
 use ordered_float::NotNan;
@@ -6,14 +6,14 @@ use ordered_float::NotNan;
 use crate::external::assertion_error::AssertionError;
 use crate::external::litconv::with_lit;
 use crate::foreign::ExternError;
-use crate::representations::Literal;
-use crate::representations::Primitive;
 use crate::representations::interpreted::{Clause, ExprInst};
+use crate::representations::{Literal, Primitive};
 
+/// A number, either floating point or unsigned int, visible to Orchid.
 #[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
 pub enum Numeric {
   Uint(u64),
-  Num(NotNan<f64>)
+  Num(NotNan<f64>),
 }
 
 impl Numeric {
@@ -36,9 +36,9 @@ impl Add for Numeric {
     match (self, rhs) {
       (Numeric::Uint(a), Numeric::Uint(b)) => Numeric::Uint(a + b),
      (Numeric::Num(a), Numeric::Num(b)) => Numeric::num(a + b),
-      (Numeric::Uint(a), Numeric::Num(b)) |
-      (Numeric::Num(b), Numeric::Uint(a))
-        => Numeric::num::<f64>(a as f64 + *b)
+      (Numeric::Uint(a), Numeric::Num(b))
+      | (Numeric::Num(b), Numeric::Uint(a)) =>
+        Numeric::num::<f64>(a as f64 + *b),
     }
   }
 }
@@ -49,11 +49,10 @@ impl Sub for Numeric {
   fn sub(self, rhs: Self) -> Self::Output {
     match (self, rhs) {
       (Numeric::Uint(a), Numeric::Uint(b)) if b <= a => Numeric::Uint(a - b),
-      (Numeric::Uint(a), Numeric::Uint(b))
-        => Numeric::num(a as f64 - b as f64),
+      (Numeric::Uint(a), Numeric::Uint(b)) => Numeric::num(a as f64 - b as f64),
       (Numeric::Num(a), Numeric::Num(b)) => Numeric::num(a - b),
       (Numeric::Uint(a), Numeric::Num(b)) => Numeric::num(a as f64 - *b),
-      (Numeric::Num(a), Numeric::Uint(b)) => Numeric::num(*a - b as f64)
+      (Numeric::Num(a), Numeric::Uint(b)) => Numeric::num(*a - b as f64),
     }
   }
 }
@@ -65,9 +64,9 @@ impl Mul for Numeric {
     match (self, rhs) {
       (Numeric::Uint(a), Numeric::Uint(b)) => Numeric::Uint(a * b),
       (Numeric::Num(a), Numeric::Num(b)) => Numeric::num(a * b),
-      (Numeric::Uint(a), Numeric::Num(b)) |
-      (Numeric::Num(b), Numeric::Uint(a))
-        => Numeric::Num(NotNan::new(a as f64).unwrap() * b)
+      (Numeric::Uint(a), Numeric::Num(b))
+      | (Numeric::Num(b), Numeric::Uint(a)) =>
+        Numeric::Num(NotNan::new(a as f64).unwrap() * b),
     }
   }
 }
@@ -90,7 +89,7 @@ impl Rem for Numeric {
       (Numeric::Uint(a), Numeric::Uint(b)) => Numeric::Uint(a % b),
       (Numeric::Num(a), Numeric::Num(b)) => Numeric::num(a % b),
       (Numeric::Uint(a), Numeric::Num(b)) => Numeric::num(a as f64 % *b),
-      (Numeric::Num(a), Numeric::Uint(b)) => Numeric::num(*a % b as f64)
+      (Numeric::Num(a), Numeric::Uint(b)) => Numeric::num(*a % b as f64),
     }
   }
 }
@@ -101,7 +100,7 @@ impl TryFrom<ExprInst> for Numeric {
     with_lit(&value.clone(), |l| match l {
       Literal::Uint(i) => Ok(Numeric::Uint(*i)),
       Literal::Num(n) => Ok(Numeric::Num(*n)),
-      _ => AssertionError::fail(value, "an integer or number")?
+      _ => AssertionError::fail(value, "an integer or number")?,
     })
   }
 }
@@ -110,7 +109,7 @@ impl From<Numeric> for Clause {
   fn from(value: Numeric) -> Self {
     Clause::P(Primitive::Literal(match value {
       Numeric::Uint(i) => Literal::Uint(i),
-      Numeric::Num(n) => Literal::Num(n)
+      Numeric::Num(n) => Literal::Num(n),
     }))
   }
 }
@@ -119,16 +118,16 @@ impl From<Numeric> for String {
   fn from(value: Numeric) -> Self {
     match value {
       Numeric::Uint(i) => i.to_string(),
-      Numeric::Num(n) => n.to_string()
+      Numeric::Num(n) => n.to_string(),
     }
   }
 }
 
-impl Into<f64> for Numeric {
-  fn into(self) -> f64 {
-    match self {
+impl From<Numeric> for f64 {
+  fn from(val: Numeric) -> Self {
+    match val {
       Numeric::Num(n) => *n,
-      Numeric::Uint(i) => i as f64
+      Numeric::Uint(i) => i as f64,
     }
   }
 }

src/external/num/operators/add.rs (vendored) | 29
@@ -1,39 +1,36 @@
-
-use super::super::Numeric;
-
 use std::fmt::Debug;
 
-use crate::{atomic_impl, atomic_redirect, externfn_impl};
+use super::super::Numeric;
 use crate::representations::interpreted::ExprInst;
+use crate::{atomic_impl, atomic_redirect, externfn_impl};
 
-/// Add function
+/// Adds two numbers
 ///
 /// Next state: [Add1]
 #[derive(Clone)]
 pub struct Add2;
-externfn_impl!(Add2, |_: &Self, x: ExprInst| Ok(Add1{x}));
+externfn_impl!(Add2, |_: &Self, x: ExprInst| Ok(Add1 { x }));
 
-/// Partially applied Add function
-///
 /// Prev state: [Add2]; Next state: [Add0]
 #[derive(Debug, Clone)]
-pub struct Add1{ x: ExprInst }
+pub struct Add1 {
+  x: ExprInst,
+}
 atomic_redirect!(Add1, x);
 atomic_impl!(Add1);
 externfn_impl!(Add1, |this: &Self, x: ExprInst| {
   let a: Numeric = this.x.clone().try_into()?;
-  Ok(Add0{ a, x })
+  Ok(Add0 { a, x })
 });
 
-/// Fully applied Add function.
-///
 /// Prev state: [Add1]
 #[derive(Debug, Clone)]
-pub struct Add0 { a: Numeric, x: ExprInst }
+pub struct Add0 {
+  a: Numeric,
+  x: ExprInst,
+}
 atomic_redirect!(Add0, x);
-atomic_impl!(Add0, |Self{ a, x }: &Self, _| {
+atomic_impl!(Add0, |Self { a, x }: &Self, _| {
  let b: Numeric = x.clone().try_into()?;
  Ok((*a + b).into())
 });

src/external/num/operators/divide.rs (vendored) | 29
@@ -1,40 +1,37 @@
-
-use super::super::Numeric;
-
 use std::fmt::Debug;
 
-use crate::{atomic_impl, atomic_redirect, externfn_impl};
+use super::super::Numeric;
 use crate::representations::interpreted::ExprInst;
+use crate::{atomic_impl, atomic_redirect, externfn_impl};
 
-/// Divide function
+/// Divides two numbers
 ///
 /// Next state: [Divide1]
 #[derive(Clone)]
 pub struct Divide2;
-externfn_impl!(Divide2, |_: &Self, x: ExprInst| Ok(Divide1{x}));
+externfn_impl!(Divide2, |_: &Self, x: ExprInst| Ok(Divide1 { x }));
 
-/// Partially applied Divide function
-///
 /// Prev state: [Divide2]; Next state: [Divide0]
 #[derive(Debug, Clone)]
-pub struct Divide1{ x: ExprInst }
+pub struct Divide1 {
+  x: ExprInst,
+}
 atomic_redirect!(Divide1, x);
 atomic_impl!(Divide1);
 externfn_impl!(Divide1, |this: &Self, x: ExprInst| {
   let a: Numeric = this.x.clone().try_into()?;
-  Ok(Divide0{ a, x })
+  Ok(Divide0 { a, x })
 });
 
-/// Fully applied Divide function.
-///
 /// Prev state: [Divide1]
 #[derive(Debug, Clone)]
-pub struct Divide0 { a: Numeric, x: ExprInst }
+pub struct Divide0 {
+  a: Numeric,
+  x: ExprInst,
+}
 atomic_redirect!(Divide0, x);
-atomic_impl!(Divide0, |Self{ a, x }: &Self, _| {
+atomic_impl!(Divide0, |Self { a, x }: &Self, _| {
  let b: Numeric = x.clone().try_into()?;
  Ok((*a / b).into())
 });

src/external/num/operators/multiply.rs (vendored) | 30
@@ -1,40 +1,36 @@
-
-use super::super::Numeric;
-
 use std::fmt::Debug;
 
-use crate::{atomic_impl, atomic_redirect, externfn_impl};
+use super::super::Numeric;
 use crate::representations::interpreted::ExprInst;
+use crate::{atomic_impl, atomic_redirect, externfn_impl};
 
-/// Multiply function
+/// Multiplies two numbers
 ///
 /// Next state: [Multiply1]
 #[derive(Clone)]
 pub struct Multiply2;
-externfn_impl!(Multiply2, |_: &Self, x: ExprInst| Ok(Multiply1{x}));
+externfn_impl!(Multiply2, |_: &Self, x: ExprInst| Ok(Multiply1 { x }));
 
-/// Partially applied Multiply function
-///
 /// Prev state: [Multiply2]; Next state: [Multiply0]
 #[derive(Debug, Clone)]
-pub struct Multiply1{ x: ExprInst }
+pub struct Multiply1 {
+  x: ExprInst,
+}
 atomic_redirect!(Multiply1, x);
 atomic_impl!(Multiply1);
 externfn_impl!(Multiply1, |this: &Self, x: ExprInst| {
   let a: Numeric = this.x.clone().try_into()?;
-  Ok(Multiply0{ a, x })
+  Ok(Multiply0 { a, x })
 });
 
-/// Fully applied Multiply function.
-///
 /// Prev state: [Multiply1]
 #[derive(Debug, Clone)]
-pub struct Multiply0 { a: Numeric, x: ExprInst }
+pub struct Multiply0 {
+  a: Numeric,
+  x: ExprInst,
+}
 atomic_redirect!(Multiply0, x);
-atomic_impl!(Multiply0, |Self{ a, x }: &Self, _| {
+atomic_impl!(Multiply0, |Self { a, x }: &Self, _| {
  let b: Numeric = x.clone().try_into()?;
  Ok((*a * b).into())
 });
30  src/external/num/operators/remainder.rs  vendored
@@ -1,40 +1,36 @@
-use super::super::Numeric;
-
 use std::fmt::Debug;

-use crate::{atomic_impl, atomic_redirect, externfn_impl};
+use super::super::Numeric;
 use crate::representations::interpreted::ExprInst;
+use crate::{atomic_impl, atomic_redirect, externfn_impl};

-/// Remainder function
+/// Takes the modulus of two numbers.
 ///
 /// Next state: [Remainder1]

 #[derive(Clone)]
 pub struct Remainder2;
-externfn_impl!(Remainder2, |_: &Self, x: ExprInst| Ok(Remainder1{x}));
+externfn_impl!(Remainder2, |_: &Self, x: ExprInst| Ok(Remainder1 { x }));

 /// Partially applied Remainder function
 ///
 /// Prev state: [Remainder2]; Next state: [Remainder0]

 #[derive(Debug, Clone)]
-pub struct Remainder1{ x: ExprInst }
+pub struct Remainder1 {
+  x: ExprInst,
+}
 atomic_redirect!(Remainder1, x);
 atomic_impl!(Remainder1);
 externfn_impl!(Remainder1, |this: &Self, x: ExprInst| {
   let a: Numeric = this.x.clone().try_into()?;
-  Ok(Remainder0{ a, x })
+  Ok(Remainder0 { a, x })
 });

 /// Fully applied Remainder function.
 ///
 /// Prev state: [Remainder1]

 #[derive(Debug, Clone)]
-pub struct Remainder0 { a: Numeric, x: ExprInst }
+pub struct Remainder0 {
+  a: Numeric,
+  x: ExprInst,
+}
 atomic_redirect!(Remainder0, x);
-atomic_impl!(Remainder0, |Self{ a, x }: &Self, _| {
+atomic_impl!(Remainder0, |Self { a, x }: &Self, _| {
   let b: Numeric = x.clone().try_into()?;
   Ok((*a % b).into())
 });
30  src/external/num/operators/subtract.rs  vendored
@@ -1,40 +1,36 @@
-use super::super::Numeric;
-
 use std::fmt::Debug;

-use crate::{atomic_impl, atomic_redirect, externfn_impl};
+use super::super::Numeric;
 use crate::representations::interpreted::ExprInst;
+use crate::{atomic_impl, atomic_redirect, externfn_impl};

-/// Subtract function
+/// Subtracts two numbers
 ///
 /// Next state: [Subtract1]

 #[derive(Clone)]
 pub struct Subtract2;
-externfn_impl!(Subtract2, |_: &Self, x: ExprInst| Ok(Subtract1{x}));
+externfn_impl!(Subtract2, |_: &Self, x: ExprInst| Ok(Subtract1 { x }));

 /// Partially applied Subtract function
 ///
 /// Prev state: [Subtract2]; Next state: [Subtract0]

 #[derive(Debug, Clone)]
-pub struct Subtract1{ x: ExprInst }
+pub struct Subtract1 {
+  x: ExprInst,
+}
 atomic_redirect!(Subtract1, x);
 atomic_impl!(Subtract1);
 externfn_impl!(Subtract1, |this: &Self, x: ExprInst| {
   let a: Numeric = this.x.clone().try_into()?;
-  Ok(Subtract0{ a, x })
+  Ok(Subtract0 { a, x })
 });

 /// Fully applied Subtract function.
 ///
 /// Prev state: [Subtract1]

 #[derive(Debug, Clone)]
-pub struct Subtract0 { a: Numeric, x: ExprInst }
+pub struct Subtract0 {
+  a: Numeric,
+  x: ExprInst,
+}
 atomic_redirect!(Subtract0, x);
-atomic_impl!(Subtract0, |Self{ a, x }: &Self, _| {
+atomic_impl!(Subtract0, |Self { a, x }: &Self, _| {
   let b: Numeric = x.clone().try_into()?;
   Ok((*a - b).into())
 });
15  src/external/runtime_error.rs  vendored
@@ -1,7 +1,9 @@
-use std::{rc::Rc, fmt::Display};
+use std::fmt::Display;
+use std::rc::Rc;

 use crate::foreign::ExternError;

+/// Some external event prevented the operation from succeeding
 #[derive(Clone)]
 pub struct RuntimeError {
   message: String,
@@ -9,12 +11,15 @@ pub struct RuntimeError {
 }

 impl RuntimeError {
-  pub fn fail(message: String, operation: &'static str) -> Result<!, Rc<dyn ExternError>> {
-    return Err(Self { message, operation }.into_extern())
+  pub fn fail(
+    message: String,
+    operation: &'static str,
+  ) -> Result<!, Rc<dyn ExternError>> {
+    return Err(Self { message, operation }.into_extern());
   }

   pub fn ext(message: String, operation: &'static str) -> Rc<dyn ExternError> {
-    return Self { message, operation }.into_extern()
+    return Self { message, operation }.into_extern();
   }
 }

@@ -24,4 +29,4 @@ impl Display for RuntimeError {
   }
 }

-impl ExternError for RuntimeError{}
+impl ExternError for RuntimeError {}
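`fail` returning `Result<!, Rc<dyn ExternError>>` is what lets callers such as char_at.rs write `RuntimeError::fail(...)?` inside a branch that must yield some other success type: the never type `!` coerces into anything, so only the error path survives. A standalone sketch of the same trick with a made-up error type (requires the nightly `never_type` feature, consistent with the project's nightly toolchain requirement):

```rust
#![feature(never_type)]

#[derive(Debug)]
struct MyError(String);

// Returning Result<!, MyError> means the Ok branch is uninhabited,
// so the `?` operator can be used wherever any success type is expected.
fn fail(msg: &str) -> Result<!, MyError> {
  Err(MyError(msg.to_string()))
}

fn char_at(s: &str, i: usize) -> Result<char, MyError> {
  if let Some(c) = s.chars().nth(i) {
    Ok(c)
  } else {
    // `!` coerces to `char`, so this branch typechecks.
    fail("character index out of bounds")?
  }
}

fn main() {
  assert_eq!(char_at("orchid", 2).unwrap(), 'c');
  assert!(char_at("orchid", 99).is_err());
}
```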
15  src/external/std.rs  vendored
@@ -1,16 +1,11 @@
-use crate::pipeline::ConstTree;
-use crate::interner::Interner;

 use super::bool::bool;
-use super::cpsio::cpsio;
 use super::conv::conv;
-use super::str::str;
+use super::cpsio::cpsio;
 use super::num::num;
+use super::str::str;
+use crate::interner::Interner;
+use crate::pipeline::ConstTree;

 pub fn std(i: &Interner) -> ConstTree {
-  cpsio(i)
-    + conv(i)
-    + bool(i)
-    + str(i)
-    + num(i)
+  cpsio(i) + conv(i) + bool(i) + str(i) + num(i)
 }
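The new `std` body folds the sub-libraries together with `+`, which reads as `ConstTree` implementing `Add` by merging namespace trees. A simplified standalone sketch of that merge-via-Add design; the real `ConstTree` in the crate is richer, this toy version only illustrates why `cpsio(i) + conv(i) + ...` composes libraries:

```rust
use std::collections::HashMap;
use std::ops::Add;

// Toy constant tree: either a leaf value or a named subtree.
#[derive(Debug)]
enum ConstTree {
  Leaf(i64),
  Tree(HashMap<String, ConstTree>),
}

impl Add for ConstTree {
  type Output = ConstTree;
  fn add(self, other: ConstTree) -> ConstTree {
    match (self, other) {
      (ConstTree::Tree(mut a), ConstTree::Tree(b)) => {
        for (k, v) in b {
          // merge subtrees recursively on collision, otherwise insert
          let merged = match a.remove(&k) {
            Some(existing) => existing + v,
            None => v,
          };
          a.insert(k, merged);
        }
        ConstTree::Tree(a)
      },
      // two leaves would be a conflict; this toy keeps the right-hand one
      (_, rhs) => rhs,
    }
  }
}

fn tree(pairs: Vec<(&str, ConstTree)>) -> ConstTree {
  ConstTree::Tree(pairs.into_iter().map(|(k, v)| (k.to_string(), v)).collect())
}

fn main() {
  let num = tree(vec![("add", ConstTree::Leaf(1)), ("multiply", ConstTree::Leaf(2))]);
  let strs = tree(vec![("concatenate", ConstTree::Leaf(3))]);
  if let ConstTree::Tree(m) = num + strs {
    assert_eq!(m.len(), 3); // both namespaces end up in one tree
  }
}
```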
35  src/external/str/char_at.rs  vendored
@@ -2,41 +2,44 @@ use std::fmt::Debug;

 use crate::external::litconv::{with_str, with_uint};
 use crate::external::runtime_error::RuntimeError;
+use crate::representations::interpreted::{Clause, ExprInst};
 use crate::representations::{Literal, Primitive};
 use crate::{atomic_impl, atomic_redirect, externfn_impl};
-use crate::representations::interpreted::{Clause, ExprInst};

-/// CharAt function
+/// Takes an uint and a string, finds the char in a string at a 0-based index
 ///
 /// Next state: [CharAt1]

 #[derive(Clone)]
 pub struct CharAt2;
-externfn_impl!(CharAt2, |_: &Self, x: ExprInst| Ok(CharAt1{x}));
+externfn_impl!(CharAt2, |_: &Self, x: ExprInst| Ok(CharAt1 { x }));

 /// Partially applied CharAt function
 ///
 /// Prev state: [CharAt2]; Next state: [CharAt0]

 #[derive(Debug, Clone)]
-pub struct CharAt1{ x: ExprInst }
+pub struct CharAt1 {
+  x: ExprInst,
+}
 atomic_redirect!(CharAt1, x);
 atomic_impl!(CharAt1);
 externfn_impl!(CharAt1, |this: &Self, x: ExprInst| {
-  with_str(&this.x, |s| Ok(CharAt0{ s: s.clone(), x }))
+  with_str(&this.x, |s| Ok(CharAt0 { s: s.clone(), x }))
 });

 /// Fully applied CharAt function.
 ///
 /// Prev state: [CharAt1]

 #[derive(Debug, Clone)]
-pub struct CharAt0 { s: String, x: ExprInst }
+pub struct CharAt0 {
+  s: String,
+  x: ExprInst,
+}
 atomic_redirect!(CharAt0, x);
-atomic_impl!(CharAt0, |Self{ s, x }: &Self, _| {
-  with_uint(x, |i| if let Some(c) = s.chars().nth(i as usize) {
+atomic_impl!(CharAt0, |Self { s, x }: &Self, _| {
+  with_uint(x, |i| {
+    if let Some(c) = s.chars().nth(i as usize) {
       Ok(Clause::P(Primitive::Literal(Literal::Char(c))))
     } else {
-      RuntimeError::fail("Character index out of bounds".to_string(), "indexing string")?
+      RuntimeError::fail(
+        "Character index out of bounds".to_string(),
+        "indexing string",
+      )?
+    }
   })
 });
34  src/external/str/concatenate.rs  vendored
@@ -1,39 +1,37 @@
 use std::fmt::Debug;

 use crate::external::litconv::with_str;
-use crate::{atomic_impl, atomic_redirect, externfn_impl};
-use crate::representations::{Primitive, Literal};
 use crate::representations::interpreted::{Clause, ExprInst};
+use crate::representations::{Literal, Primitive};
+use crate::{atomic_impl, atomic_redirect, externfn_impl};

-/// Concatenate function
+/// Concatenates two strings
 ///
 /// Next state: [Concatenate1]

 #[derive(Clone)]
 pub struct Concatenate2;
-externfn_impl!(Concatenate2, |_: &Self, c: ExprInst| Ok(Concatenate1{c}));
+externfn_impl!(Concatenate2, |_: &Self, c: ExprInst| Ok(Concatenate1 { c }));

 /// Partially applied Concatenate function
 ///
 /// Prev state: [Concatenate2]; Next state: [Concatenate0]

 #[derive(Debug, Clone)]
-pub struct Concatenate1{ c: ExprInst }
+pub struct Concatenate1 {
+  c: ExprInst,
+}
 atomic_redirect!(Concatenate1, c);
 atomic_impl!(Concatenate1);
 externfn_impl!(Concatenate1, |this: &Self, c: ExprInst| {
-  with_str(&this.c, |a| Ok(Concatenate0{ a: a.clone(), c }))
+  with_str(&this.c, |a| Ok(Concatenate0 { a: a.clone(), c }))
 });

 /// Fully applied Concatenate function.
 ///
 /// Prev state: [Concatenate1]

 #[derive(Debug, Clone)]
-pub struct Concatenate0 { a: String, c: ExprInst }
+pub struct Concatenate0 {
+  a: String,
+  c: ExprInst,
+}
 atomic_redirect!(Concatenate0, c);
-atomic_impl!(Concatenate0, |Self{ a, c }: &Self, _| {
-  with_str(c, |b| Ok(Clause::P(Primitive::Literal(
-    Literal::Str(a.to_owned() + b)
-  ))))
+atomic_impl!(Concatenate0, |Self { a, c }: &Self, _| {
+  with_str(c, |b| {
+    Ok(Clause::P(Primitive::Literal(Literal::Str(a.to_owned() + b))))
+  })
 });
12  src/external/str/mod.rs  vendored
@@ -1,10 +1,12 @@
-mod concatenate;
 mod char_at;
+mod concatenate;

-use crate::{pipeline::ConstTree, interner::Interner};
+use crate::interner::Interner;
+use crate::pipeline::ConstTree;

 pub fn str(i: &Interner) -> ConstTree {
-  ConstTree::tree([
-    (i.i("concatenate"), ConstTree::xfn(concatenate::Concatenate2))
-  ])
+  ConstTree::tree([(
+    i.i("concatenate"),
+    ConstTree::xfn(concatenate::Concatenate2),
+  )])
 }
@@ -1,30 +1,25 @@
 use std::any::Any;
 use std::error::Error;
-use std::fmt::{Display, Debug};
+use std::fmt::{Debug, Display};
 use std::hash::Hash;
 use std::rc::Rc;

 use dyn_clone::DynClone;

-use crate::interpreter::{RuntimeError, Context};
+use crate::interpreter::{Context, RuntimeError};

-use crate::representations::Primitive;
 pub use crate::representations::interpreted::Clause;
 use crate::representations::interpreted::ExprInst;
+use crate::representations::Primitive;

 pub struct AtomicReturn {
   pub clause: Clause,
   pub gas: Option<usize>,
-  pub inert: bool
+  pub inert: bool,
 }
 impl AtomicReturn {
   /// Wrap an inert atomic for delivery to the supervisor
   pub fn from_data<D: Atomic>(d: D, c: Context) -> Self {
-    AtomicReturn {
-      clause: d.to_atom_cls(),
-      gas: c.gas,
-      inert: false
-    }
+    AtomicReturn { clause: d.to_atom_cls(), gas: c.gas, inert: false }
   }
 }

@@ -36,7 +31,9 @@ pub type RcExpr = ExprInst;

 pub trait ExternError: Display {
   fn into_extern(self) -> Rc<dyn ExternError>
-  where Self: 'static + Sized {
+  where
+    Self: 'static + Sized,
+  {
     Rc::new(self)
   }
 }

@@ -58,14 +55,19 @@ pub trait ExternFn: DynClone {
   fn hash(&self, state: &mut dyn std::hash::Hasher) {
     state.write_str(self.name())
   }
-  fn to_xfn_cls(self) -> Clause where Self: Sized + 'static {
+  fn to_xfn_cls(self) -> Clause
+  where
+    Self: Sized + 'static,
+  {
     Clause::P(Primitive::ExternFn(Box::new(self)))
   }
 }

 impl Eq for dyn ExternFn {}
 impl PartialEq for dyn ExternFn {
-  fn eq(&self, other: &Self) -> bool { self.name() == other.name() }
+  fn eq(&self, other: &Self) -> bool {
+    self.name() == other.name()
+  }
 }
 impl Hash for dyn ExternFn {
   fn hash<H: std::hash::Hasher>(&self, state: &mut H) {

@@ -78,10 +80,16 @@ impl Debug for dyn ExternFn {
   }
 }

-pub trait Atomic: Any + Debug + DynClone where Self: 'static {
+pub trait Atomic: Any + Debug + DynClone
+where
+  Self: 'static,
+{
   fn as_any(&self) -> &dyn Any;
   fn run(&self, ctx: Context) -> AtomicResult;
-  fn to_atom_cls(self) -> Clause where Self: Sized {
+  fn to_atom_cls(self) -> Clause
+  where
+    Self: Sized,
+  {
     Clause::P(Primitive::Atom(Atom(Box::new(self))))
   }
 }

@@ -105,10 +113,11 @@ impl Atom {
   pub fn try_cast<T: Atomic>(&self) -> Result<&T, ()> {
     self.data().as_any().downcast_ref().ok_or(())
   }
-  pub fn is<T: 'static>(&self) -> bool { self.data().as_any().is::<T>() }
+  pub fn is<T: 'static>(&self) -> bool {
+    self.data().as_any().is::<T>()
+  }
   pub fn cast<T: 'static>(&self) -> &T {
-    self.data().as_any().downcast_ref()
-      .expect("Type mismatch on Atom::cast")
+    self.data().as_any().downcast_ref().expect("Type mismatch on Atom::cast")
   }
   pub fn run(&self, ctx: Context) -> AtomicResult {
     self.0.run(ctx)
@@ -1,13 +1,16 @@
 #[allow(unused)] // for the doc comments
 use crate::foreign::Atomic;

-/// A macro that generates the straightforward, syntactically invariant part of implementing
-/// [Atomic]. Implemented fns are [Atomic::as_any], [Atomic::definitely_eq] and [Atomic::hash].
+/// A macro that generates the straightforward, syntactically invariant part of
+/// implementing [Atomic]. Implemented fns are [Atomic::as_any],
+/// [Atomic::definitely_eq] and [Atomic::hash].
 ///
 /// It depends on [Eq] and [Hash]
 #[macro_export]
 macro_rules! atomic_defaults {
   () => {
-    fn as_any(&self) -> &dyn std::any::Any { self }
+    fn as_any(&self) -> &dyn std::any::Any {
+      self
+    }
   };
 }
@@ -1,32 +1,38 @@
 #[allow(unused)] // for the doc comments
-use crate::representations::Primitive;
-#[allow(unused)] // for the doc comments
-use crate::foreign::{Atomic, ExternFn};
-#[allow(unused)] // for the doc comments
 use std::any::Any;
 #[allow(unused)] // for the doc comments
-use dyn_clone::DynClone;
-#[allow(unused)] // for the doc comments
 use std::fmt::Debug;

-/// A macro that generates implementations of [Atomic] to simplify the development of
-/// external bindings for Orchid.
+#[allow(unused)] // for the doc comments
+use dyn_clone::DynClone;
+
+#[allow(unused)] // for the doc comments
+use crate::foreign::{Atomic, ExternFn};
+#[allow(unused)] // for the doc comments
+use crate::representations::Primitive;
+
+/// A macro that generates implementations of [Atomic] to simplify the
+/// development of external bindings for Orchid.
 ///
-/// The macro depends on implementations of [AsRef<Clause>] and [From<(&Self, Clause)>] for
-/// extracting the clause to be processed and then reconstructing the [Atomic]. Naturally,
-/// supertraits of [Atomic] are also dependencies. These are [Any], [Debug] and [DynClone].
+/// The macro depends on implementations of [AsRef<Clause>] and [From<(&Self,
+/// Clause)>] for extracting the clause to be processed and then reconstructing
+/// the [Atomic]. Naturally, supertraits of [Atomic] are also dependencies.
+/// These are [Any], [Debug] and [DynClone].
 ///
-/// The simplest form just requires the typename to be specified. This additionally depends on an
-/// implementation of [ExternFn] because after the clause is fully normalized it returns `Self`
-/// wrapped in a [Primitive::ExternFn]. It is intended for intermediary
-/// stages of the function where validation and the next state are defined in [ExternFn::apply].
+/// The simplest form just requires the typename to be specified. This
+/// additionally depends on an implementation of [ExternFn] because after the
+/// clause is fully normalized it returns `Self` wrapped in a
+/// [Primitive::ExternFn]. It is intended for intermediary stages of the
+/// function where validation and the next state are defined in
+/// [ExternFn::apply].
 ///
 /// ```
 /// atomic_impl!(Multiply1)
 /// ```
 ///
-/// The last stage of the function should use the extended form of the macro which takes an
-/// additional closure to explicitly describe what happens when the argument is fully processed.
+/// The last stage of the function should use the extended form of the macro
+/// which takes an additional closure to explicitly describe what happens when
+/// the argument is fully processed.
 ///
 /// ```
 /// // excerpt from the exact implementation of Multiply
@@ -35,44 +41,43 @@ use std::fmt::Debug;
 /// Ok(*a * b).into())
 /// })
 /// ```
-///
 #[macro_export]
 macro_rules! atomic_impl {
   ($typ:ident) => {
-    atomic_impl!{$typ, |this: &Self, _: $crate::interpreter::Context| {
+    atomic_impl! {$typ, |this: &Self, _: $crate::interpreter::Context| {
       use $crate::foreign::ExternFn;
       Ok(this.clone().to_xfn_cls())
     }}
   };
   ($typ:ident, $next_phase:expr) => {
     impl $crate::foreign::Atomic for $typ {
-      $crate::atomic_defaults!{}
+      $crate::atomic_defaults! {}

-      fn run(&self, ctx: $crate::interpreter::Context)
-      -> $crate::foreign::AtomicResult
-      {
+      fn run(
+        &self,
+        ctx: $crate::interpreter::Context,
+      ) -> $crate::foreign::AtomicResult {
         // extract the expression
-        let expr = <Self as
-          AsRef<$crate::foreign::RcExpr>
-        >::as_ref(self).clone();
+        let expr =
+          <Self as AsRef<$crate::foreign::RcExpr>>::as_ref(self).clone();
         // run the expression
         let ret = $crate::interpreter::run(expr, ctx.clone())?;
-        let $crate::interpreter::Return{ gas, state, inert } = ret;
+        let $crate::interpreter::Return { gas, state, inert } = ret;
         // rebuild the atomic
-        let next_self = <Self as
-          From<(&Self, $crate::foreign::RcExpr)>
-        >::from((self, state));
+        let next_self =
+          <Self as From<(&Self, $crate::foreign::RcExpr)>>::from((self, state));
         // branch off or wrap up
         let clause = if inert {
-          match ($next_phase)(&next_self, ctx) {
+          let closure = $next_phase;
+          match closure(&next_self, ctx) {
             Ok(r) => r,
-            Err(e) => return Err(
-              $crate::interpreter::RuntimeError::Extern(e)
-            )
+            Err(e) => return Err($crate::interpreter::RuntimeError::Extern(e)),
           }
-        } else { next_self.to_atom_cls() };
+        } else {
+          next_self.to_atom_cls()
+        };
         // package and return
-        Ok($crate::foreign::AtomicReturn{ clause, gas, inert: false })
+        Ok($crate::foreign::AtomicReturn { clause, gas, inert: false })
       }
     }
   };
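Taken together, `externfn_impl!`, `atomic_redirect!` and `atomic_impl!` are the whole recipe for exposing a Rust function to the interpreter. A hedged sketch of what a new single-argument builtin could look like, following the operator modules above; the `Square` name is invented, the file is assumed to sit beside the other operator modules, and `Numeric` is assumed to be `Copy` and to implement `Mul` as the `*a * b` expression in multiply.rs implies:

```rust
use std::fmt::Debug;

use super::super::Numeric; // same relative path the operator modules use
use crate::representations::interpreted::ExprInst;
use crate::{atomic_impl, atomic_redirect, externfn_impl};

/// Squares a number (hypothetical example, not part of this commit)
///
/// Next state: [Square0]
#[derive(Clone)]
pub struct Square1;
externfn_impl!(Square1, |_: &Self, x: ExprInst| Ok(Square0 { x }));

/// Fully applied Square function.
///
/// Prev state: [Square1]
#[derive(Debug, Clone)]
pub struct Square0 {
  x: ExprInst,
}
atomic_redirect!(Square0, x);
atomic_impl!(Square0, |Self { x }: &Self, _| {
  // normalize and convert the single argument, then compute
  let n: Numeric = x.clone().try_into()?;
  Ok((n * n).into())
});
```

It would then presumably be registered in a `ConstTree` the same way str/mod.rs registers `Concatenate2`, e.g. under a key like `i.i("square")`.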
@@ -1,27 +1,31 @@
 #[allow(unused)] // for the doc comments
-use crate::foreign::Atomic;
-#[allow(unused)] // for the doc comments
 use std::any::Any;
 #[allow(unused)] // for the doc comments
-use dyn_clone::DynClone;
-#[allow(unused)] // for the doc comments
 use std::fmt::Debug;

-/// Implement [Atomic] for a structure that cannot be transformed any further. This would be optimal
-/// for atomics encapsulating raw data. [Atomic] depends on [Any], [Debug] and [DynClone].
+#[allow(unused)] // for the doc comments
+use dyn_clone::DynClone;
+
+#[allow(unused)] // for the doc comments
+use crate::foreign::Atomic;
+
+/// Implement [Atomic] for a structure that cannot be transformed any further.
+/// This would be optimal for atomics encapsulating raw data. [Atomic] depends
+/// on [Any], [Debug] and [DynClone].
 #[macro_export]
 macro_rules! atomic_inert {
   ($typ:ident) => {
     impl $crate::foreign::Atomic for $typ {
-      $crate::atomic_defaults!{}
+      $crate::atomic_defaults! {}

-      fn run(&self, ctx: $crate::interpreter::Context)
-      -> $crate::foreign::AtomicResult
-      {
-        Ok($crate::foreign::AtomicReturn{
+      fn run(
+        &self,
+        ctx: $crate::interpreter::Context,
+      ) -> $crate::foreign::AtomicReturn {
+        Ok($crate::foreign::AtomicReturn {
           clause: self.clone().to_atom_cls(),
           gas: ctx.gas,
-          inert: true
+          inert: true,
         })
       }
     }
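`atomic_inert!` is the counterpart for plain data: the atom never progresses, it just reports itself back as inert on every `run`, so the interpreter stops spending gas on it. A sketch of wrapping a hypothetical payload, with the derives implied by the documented dependencies (`Debug` and `Clone` for the `Atomic` supertraits, `Eq` and `Hash` for `atomic_defaults!`):

```rust
/// Hypothetical inert payload carried through the interpreter unchanged.
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
pub struct Flag(pub bool);

// Expands to an Atomic impl whose run() always returns inert: true.
crate::atomic_inert!(Flag);
```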
@@ -1,29 +1,32 @@
 #[allow(unused)]
 use super::atomic_impl;

-/// Implement the traits required by [atomic_impl] to redirect run_* functions to a field
-/// with a particular name.
+/// Implement the traits required by [atomic_impl] to redirect run_* functions
+/// to a field with a particular name.
 #[macro_export]
 macro_rules! atomic_redirect {
   ($typ:ident) => {
     impl AsRef<$crate::foreign::RcExpr> for $typ {
-      fn as_ref(&self) -> &Clause { &self.0 }
+      fn as_ref(&self) -> &Clause {
+        &self.0
+      }
     }
     impl From<(&Self, $crate::foreign::RcExpr)> for $typ {
       fn from((old, clause): (&Self, Clause)) -> Self {
-        Self{ 0: clause, ..old.clone() }
+        Self { 0: clause, ..old.clone() }
       }
     }
   };
   ($typ:ident, $field:ident) => {
-    impl AsRef<$crate::foreign::RcExpr>
-    for $typ {
-      fn as_ref(&self) -> &$crate::foreign::RcExpr { &self.$field }
+    impl AsRef<$crate::foreign::RcExpr> for $typ {
+      fn as_ref(&self) -> &$crate::foreign::RcExpr {
+        &self.$field
+      }
     }
-    impl From<(&Self, $crate::foreign::RcExpr)>
-    for $typ {
+    impl From<(&Self, $crate::foreign::RcExpr)> for $typ {
+      #[allow(clippy::needless_update)]
       fn from((old, $field): (&Self, $crate::foreign::RcExpr)) -> Self {
-        Self{ $field, ..old.clone() }
+        Self { $field, ..old.clone() }
       }
     }
   };
@@ -1,39 +1,45 @@
 #[allow(unused)] // for the doc comments
-use crate::{atomic_impl, atomic_redirect};
-#[allow(unused)] // for the doc comments
-use crate::representations::Primitive;
-#[allow(unused)] // for the doc comments
-use crate::foreign::{Atomic, ExternFn};
-#[allow(unused)] // for the doc comments
 use std::any::Any;
 #[allow(unused)] // for the doc comments
+use std::fmt::Debug;
+#[allow(unused)] // for the doc comments
 use std::hash::Hash;

 #[allow(unused)] // for the doc comments
 use dyn_clone::DynClone;
-#[allow(unused)] // for the doc comments
-use std::fmt::Debug;

-/// Implement [ExternFn] with a closure that produces an [Atomic] from a reference to self
-/// and a closure. This can be used in conjunction with [atomic_impl] and [atomic_redirect]
-/// to normalize the argument automatically before using it.
+#[allow(unused)] // for the doc comments
+use crate::foreign::{Atomic, ExternFn};
+#[allow(unused)] // for the doc comments
+use crate::representations::Primitive;
+#[allow(unused)] // for the doc comments
+use crate::{atomic_impl, atomic_redirect};
+
+/// Implement [ExternFn] with a closure that produces an [Atomic] from a
+/// reference to self and a closure. This can be used in conjunction with
+/// [atomic_impl] and [atomic_redirect] to normalize the argument automatically
+/// before using it.
 #[macro_export]
 macro_rules! externfn_impl {
   ($typ:ident, $next_atomic:expr) => {
     impl $crate::foreign::ExternFn for $typ {
-      fn name(&self) -> &str {stringify!($typ)}
-      fn apply(&self,
+      fn name(&self) -> &str {
+        stringify!($typ)
+      }
+      fn apply(
+        &self,
         arg: $crate::foreign::RcExpr,
-        _ctx: $crate::interpreter::Context
+        _ctx: $crate::interpreter::Context,
       ) -> $crate::foreign::XfnResult {
-        match ($next_atomic)(self, arg) { // ? casts the result but we want to strictly forward it
-          Ok(r) => Ok(
-            $crate::representations::interpreted::Clause::P(
-              $crate::representations::Primitive::Atom(
-                $crate::foreign::Atom::new(r)
-              )
-            )
-          ),
-          Err(e) => Err(e)
+        let closure = $next_atomic;
+        match closure(self, arg) {
+          // ? casts the result but we want to strictly forward it
+          Ok(r) => Ok($crate::representations::interpreted::Clause::P(
+            $crate::representations::Primitive::Atom(
+              $crate::foreign::Atom::new(r),
+            ),
+          )),
+          Err(e) => Err(e),
         }
       }
     }
@@ -12,7 +12,8 @@ use crate::interner::Interner;
 /// identify functions based on arity
 pub trait InternedDisplay {
   /// formats the value using the given formatter and string interner
-  fn fmt_i(&self,
+  fn fmt_i(
+    &self,
     f: &mut std::fmt::Formatter<'_>,
     i: &Interner,
   ) -> std::fmt::Result;
@@ -28,22 +29,27 @@ pub trait InternedDisplay {
     buf
   }

-  fn bundle<'a>(&'a self, interner: &'a Interner)
-  -> DisplayBundle<'a, Self>
-  {
+  fn bundle<'a>(&'a self, interner: &'a Interner) -> DisplayBundle<'a, Self> {
     DisplayBundle { interner, data: self }
   }
 }

-impl<T> InternedDisplay for T where T: Display {
-  fn fmt_i(&self, f: &mut std::fmt::Formatter<'_>, _i: &Interner) -> std::fmt::Result {
-    <Self as Display>::fmt(&self, f)
+impl<T> InternedDisplay for T
+where
+  T: Display,
+{
+  fn fmt_i(
+    &self,
+    f: &mut std::fmt::Formatter<'_>,
+    _i: &Interner,
+  ) -> std::fmt::Result {
+    <Self as Display>::fmt(self, f)
   }
 }

 pub struct DisplayBundle<'a, T: InternedDisplay + ?Sized> {
   interner: &'a Interner,
-  data: &'a T
+  data: &'a T,
 }

 impl<'a, T: InternedDisplay> Display for DisplayBundle<'a, T> {
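`InternedDisplay` mirrors `Display` but threads the interner through, so interned names can be resolved while formatting. A hedged sketch of implementing it for a type that stores a `Tok<String>`; the `NameRef` type is made up, while the trait signature and interner methods are the ones shown in the diff above:

```rust
use std::fmt;

use crate::interner::{InternedDisplay, Interner, Tok};

/// Hypothetical AST node that refers to an interned identifier.
pub struct NameRef {
  pub name: Tok<String>,
}

impl InternedDisplay for NameRef {
  fn fmt_i(
    &self,
    f: &mut fmt::Formatter<'_>,
    i: &Interner,
  ) -> fmt::Result {
    // resolve the token back to its string through the interner
    write!(f, "{}", i.r(self.name))
  }
}

// At a call site, `node.bundle(&interner)` yields a DisplayBundle which
// implements Display, so it can be handed to println! or format!.
```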
@@ -1,9 +1,21 @@
+mod display;
 mod monotype;
 mod multitype;
 mod token;
-mod display;

+pub use display::{DisplayBundle, InternedDisplay};
 pub use monotype::TypedInterner;
 pub use multitype::Interner;
-pub use token::Token;
-pub use display::{DisplayBundle, InternedDisplay};
+pub use token::Tok;
+
+/// A symbol, nsname, nname or namespaced name is a sequence of namespaces
+/// and an identifier. The [Vec] can never be empty.
+///
+/// Throughout different stages of processing, these names can be
+///
+/// - local names to be prefixed with the current module
+/// - imported names starting with a segment
+///   - ending a single import or
+///   - defined in one of the glob imported modules
+/// - absolute names
+pub type Sym = Tok<Vec<Tok<String>>>;
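`Sym` being `Tok<Vec<Tok<String>>>` means a namespaced name is interned in two layers: each segment becomes a `Tok<String>`, and the segment vector is interned again to a single copyable token. A sketch of building and resolving one, using only the interner methods visible in the files below:

```rust
use crate::interner::{Interner, Sym};

fn demo() {
  let i = Interner::new();
  // intern each segment, then intern the segment list itself
  let sym: Sym = i.i(&vec![i.i("std"), i.i("num"), i.i("multiply")]);
  // extern_vec walks both layers back out to owned strings
  assert_eq!(i.extern_vec(sym).join("::"), "std::num::multiply");
}
```

This is the same shape the interpreter relies on when it prints "missing symbol for function" via `ctx.interner.extern_vec(*name).join("::")` in apply.rs further down.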
@@ -1,50 +1,54 @@
-use std::num::NonZeroU32;
-use std::cell::RefCell;
 use std::borrow::Borrow;
-use std::hash::{Hash, BuildHasher};
+use std::cell::RefCell;
+use std::hash::{BuildHasher, Hash};
+use std::num::NonZeroU32;

 use hashbrown::HashMap;

-use super::token::Token;
+use super::token::Tok;

-pub struct TypedInterner<T: 'static + Eq + Hash + Clone>{
-  tokens: RefCell<HashMap<&'static T, Token<T>>>,
-  values: RefCell<Vec<(&'static T, bool)>>
+/// An interner for any type that implements [Borrow]. This is inspired by
+/// Lasso but much simpler, in part because not much can be known about the type.
+pub struct TypedInterner<T: 'static + Eq + Hash + Clone> {
+  tokens: RefCell<HashMap<&'static T, Tok<T>>>,
+  values: RefCell<Vec<(&'static T, bool)>>,
 }
 impl<T: Eq + Hash + Clone> TypedInterner<T> {
   /// Create a fresh interner instance
   pub fn new() -> Self {
     Self {
       tokens: RefCell::new(HashMap::new()),
-      values: RefCell::new(Vec::new())
+      values: RefCell::new(Vec::new()),
     }
   }

   /// Intern an object, returning a token
-  pub fn i<Q: ?Sized + Eq + Hash + ToOwned<Owned = T>>(&self, q: &Q)
-  -> Token<T> where T: Borrow<Q>
+  pub fn i<Q: ?Sized + Eq + Hash + ToOwned<Owned = T>>(&self, q: &Q) -> Tok<T>
+  where
+    T: Borrow<Q>,
   {
     let mut tokens = self.tokens.borrow_mut();
     let hash = compute_hash(tokens.hasher(), q);
-    let raw_entry = tokens.raw_entry_mut().from_hash(hash, |k| {
-      <T as Borrow<Q>>::borrow(k) == q
-    });
+    let raw_entry = tokens
+      .raw_entry_mut()
+      .from_hash(hash, |k| <T as Borrow<Q>>::borrow(k) == q);
     let kv = raw_entry.or_insert_with(|| {
       let mut values = self.values.borrow_mut();
-      let uniq_key: NonZeroU32 = (values.len() as u32 + 1u32)
-        .try_into().expect("can never be zero");
+      let uniq_key: NonZeroU32 =
+        (values.len() as u32 + 1u32).try_into().expect("can never be zero");
       let keybox = Box::new(q.to_owned());
       let keyref = Box::leak(keybox);
       values.push((keyref, true));
-      let token = Token::<T>::from_id(uniq_key);
+      let token = Tok::<T>::from_id(uniq_key);
       (keyref, token)
     });
     *kv.1
   }

   /// Resolve a token, obtaining an object
-  /// It is illegal to use a token obtained from one interner with another.
-  pub fn r(&self, t: Token<T>) -> &T {
+  /// It is illegal to use a token obtained from one interner with
+  /// another.
+  pub fn r(&self, t: Tok<T>) -> &T {
     let values = self.values.borrow();
     let key = t.into_usize() - 1;
     values[key].0
@@ -52,17 +56,20 @@ impl<T: Eq + Hash + Clone> TypedInterner<T> {

   /// Intern a static reference without allocating the data on the heap
   #[allow(unused)]
-  pub fn intern_static(&self, tref: &'static T) -> Token<T> {
+  pub fn intern_static(&self, tref: &'static T) -> Tok<T> {
     let mut tokens = self.tokens.borrow_mut();
-    let token = *tokens.raw_entry_mut().from_key(tref)
+    let token = *tokens
+      .raw_entry_mut()
+      .from_key(tref)
       .or_insert_with(|| {
         let mut values = self.values.borrow_mut();
-        let uniq_key: NonZeroU32 = (values.len() as u32 + 1u32)
-          .try_into().expect("can never be zero");
+        let uniq_key: NonZeroU32 =
+          (values.len() as u32 + 1u32).try_into().expect("can never be zero");
         values.push((tref, false));
-        let token = Token::<T>::from_id(uniq_key);
+        let token = Tok::<T>::from_id(uniq_key);
         (tref, token)
-      }).1;
+      })
+      .1;
     token
   }
 }
@@ -74,10 +81,10 @@ impl<T: Eq + Hash + Clone> Drop for TypedInterner<T> {
     // which negates the need for unsafe here
     let mut values = self.values.borrow_mut();
     for (item, owned) in values.drain(..) {
-      if !owned {continue}
-      unsafe {
-        Box::from_raw((item as *const T).cast_mut())
-      };
+      if !owned {
+        continue;
+      }
+      unsafe { Box::from_raw((item as *const T).cast_mut()) };
     }
   }
 }
@@ -85,7 +92,7 @@ impl<T: Eq + Hash + Clone> Drop for TypedInterner<T> {
 /// Helper function to compute hashes outside a hashmap
 fn compute_hash(
   hash_builder: &impl BuildHasher,
-  key: &(impl Hash + ?Sized)
+  key: &(impl Hash + ?Sized),
 ) -> u64 {
   use core::hash::Hasher;
   let mut state = hash_builder.build_hasher();
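The interner leaks each owned key with `Box::leak` so it can store a `'static` reference in both the map and the values vector, then reclaims exactly the owned ones in `Drop` with `Box::from_raw`. A standalone sketch of that leak-then-reclaim pattern, reduced to a plain list of strings; like the real interner, it is only sound as long as no handed-out reference outlives the container:

```rust
/// Toy arena that hands out 'static references and frees them on drop.
struct LeakyArena {
  // (leaked reference, whether we own the allocation)
  values: Vec<(&'static str, bool)>,
}

impl LeakyArena {
  fn new() -> Self {
    LeakyArena { values: Vec::new() }
  }

  /// Store an owned string and return a 'static reference to it.
  fn store(&mut self, s: String) -> &'static str {
    let leaked: &'static str = Box::leak(s.into_boxed_str());
    self.values.push((leaked, true));
    leaked
  }

  /// Store a genuinely static string without allocating.
  fn store_static(&mut self, s: &'static str) -> &'static str {
    self.values.push((s, false));
    s
  }
}

impl Drop for LeakyArena {
  fn drop(&mut self) {
    for (item, owned) in self.values.drain(..) {
      if !owned {
        continue;
      }
      // Reconstruct the box leaked in `store` so the allocation is freed.
      unsafe { drop(Box::from_raw(item as *const str as *mut str)) };
    }
  }
}

fn main() {
  let mut arena = LeakyArena::new();
  let a = arena.store("hello".to_string());
  let b = arena.store_static("world");
  println!("{a} {b}");
}
```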
@@ -1,14 +1,17 @@
+use std::any::{Any, TypeId};
 use std::borrow::Borrow;
 use std::cell::{RefCell, RefMut};
-use std::any::{TypeId, Any};
 use std::hash::Hash;
 use std::rc::Rc;

 use hashbrown::HashMap;

 use super::monotype::TypedInterner;
-use super::token::Token;
+use super::token::Tok;

+/// A collection of interners based on their type. Allows to intern any object
+/// that implements [ToOwned]. Objects of the same type are stored together in a
+/// [TypedInterner].
 pub struct Interner {
   interners: RefCell<HashMap<TypeId, Rc<dyn Any>>>,
 }
@@ -17,56 +20,59 @@ impl Interner {
     Self { interners: RefCell::new(HashMap::new()) }
   }

-  pub fn i<Q: ?Sized + Eq + Hash + ToOwned>(&self, q: &Q)
-  -> Token<Q::Owned>
-  where Q::Owned: 'static + Eq + Hash + Clone + Borrow<Q>
+  pub fn i<Q: ?Sized + Eq + Hash + ToOwned>(&self, q: &Q) -> Tok<Q::Owned>
+  where
+    Q::Owned: 'static + Eq + Hash + Clone + Borrow<Q>,
   {
     let mut interners = self.interners.borrow_mut();
     let interner = get_interner(&mut interners);
     interner.i(q)
   }

-  pub fn r<T: 'static + Eq + Hash + Clone>(&self, t: Token<T>) -> &T {
+  pub fn r<T: 'static + Eq + Hash + Clone>(&self, t: Tok<T>) -> &T {
     let mut interners = self.interners.borrow_mut();
     let interner = get_interner(&mut interners);
     // TODO: figure this out
-    unsafe{ (interner.r(t) as *const T).as_ref().unwrap() }
+    unsafe { (interner.r(t) as *const T).as_ref().unwrap() }
   }

   /// Fully resolve
   /// TODO: make this generic over containers
-  pub fn extern_vec<T: 'static + Eq + Hash + Clone>(&self,
-    t: Token<Vec<Token<T>>>
+  pub fn extern_vec<T: 'static + Eq + Hash + Clone>(
+    &self,
+    t: Tok<Vec<Tok<T>>>,
   ) -> Vec<T> {
     let mut interners = self.interners.borrow_mut();
     let v_int = get_interner(&mut interners);
     let t_int = get_interner(&mut interners);
     let v = v_int.r(t);
-    v.iter()
-      .map(|t| t_int.r(*t))
-      .cloned()
-      .collect()
+    v.iter().map(|t| t_int.r(*t)).cloned().collect()
   }

-  pub fn extern_all<T: 'static + Eq + Hash + Clone>(&self,
-    s: &[Token<T>]
+  pub fn extern_all<T: 'static + Eq + Hash + Clone>(
+    &self,
+    s: &[Tok<T>],
   ) -> Vec<T> {
-    s.iter()
-      .map(|t| self.r(*t))
-      .cloned()
-      .collect()
+    s.iter().map(|t| self.r(*t)).cloned().collect()
+  }
+}
+
+impl Default for Interner {
+  fn default() -> Self {
+    Self::new()
   }
 }

 /// Get or create an interner for a given type.
 fn get_interner<T: 'static + Eq + Hash + Clone>(
-  interners: &mut RefMut<HashMap<TypeId, Rc<dyn Any>>>
+  interners: &mut RefMut<HashMap<TypeId, Rc<dyn Any>>>,
 ) -> Rc<TypedInterner<T>> {
-  let boxed = interners.raw_entry_mut().from_key(&TypeId::of::<T>())
-    .or_insert_with(|| (
-      TypeId::of::<T>(),
-      Rc::new(TypedInterner::<T>::new())
-    )).1.clone();
+  let boxed = interners
+    .raw_entry_mut()
+    .from_key(&TypeId::of::<T>())
+    .or_insert_with(|| (TypeId::of::<T>(), Rc::new(TypedInterner::<T>::new())))
+    .1
+    .clone();
   boxed.downcast().expect("the typeid is supposed to protect from this")
 }

@@ -94,7 +100,8 @@ mod test {
   #[allow(unused)]
   pub fn test_str_slice() {
     let interner = Interner::new();
-    let key1 = interner.i(&vec!["a".to_string(), "b".to_string(), "c".to_string()]);
+    let key1 =
+      interner.i(&vec!["a".to_string(), "b".to_string(), "c".to_string()]);
     let key2 = interner.i(&["a", "b", "c"][..]);
     // assert_eq!(key1, key2);
   }
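`get_interner` keys a `HashMap<TypeId, Rc<dyn Any>>` by the element type and downcasts on retrieval, which is how one `Interner` can host a `TypedInterner<T>` per distinct `T`. A standalone sketch of that per-type registry pattern, with a `Vec<T>` standing in for the typed interner:

```rust
use std::any::{Any, TypeId};
use std::collections::HashMap;
use std::rc::Rc;

/// Toy per-type registry: one Vec<T> per element type, stored behind dyn Any.
#[derive(Default)]
struct TypeMap {
  entries: HashMap<TypeId, Rc<dyn Any>>,
}

impl TypeMap {
  /// Fetch (or lazily create) the store for T, mirroring get_interner above.
  fn store_for<T: 'static>(&mut self) -> Rc<Vec<T>> {
    let boxed = self
      .entries
      .entry(TypeId::of::<T>())
      .or_insert_with(|| Rc::new(Vec::<T>::new()))
      .clone();
    // The TypeId key guarantees that this downcast succeeds.
    boxed.downcast::<Vec<T>>().expect("keyed by TypeId")
  }
}

fn main() {
  let mut map = TypeMap::default();
  let strings: Rc<Vec<String>> = map.store_for::<String>();
  let numbers: Rc<Vec<u32>> = map.store_for::<u32>();
  assert!(strings.is_empty() && numbers.is_empty());
}
```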
@@ -1,14 +1,18 @@
-use std::{num::NonZeroU32, marker::PhantomData};
+use std::cmp::PartialEq;
 use std::fmt::Debug;
 use std::hash::Hash;
+use std::marker::PhantomData;
+use std::num::NonZeroU32;

-use std::cmp::PartialEq;
-
-pub struct Token<T>{
+/// A number representing an object of type `T` stored in some interner. It is a
+/// logic error to compare tokens obtained from different interners, or to use a
+/// token with an interner other than the one that created it, but this is
+/// currently not enforced.
+pub struct Tok<T> {
   id: NonZeroU32,
-  phantom_data: PhantomData<T>
+  phantom_data: PhantomData<T>,
 }
-impl<T> Token<T> {
+impl<T> Tok<T> {
   pub fn from_id(id: NonZeroU32) -> Self {
     Self { id, phantom_data: PhantomData }
   }
@@ -21,36 +25,38 @@ impl<T> Token<T> {
   }
 }

-impl<T> Debug for Token<T> {
+impl<T> Debug for Tok<T> {
   fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
     write!(f, "Token({})", self.id)
   }
 }

-impl<T> Copy for Token<T> {}
-impl<T> Clone for Token<T> {
+impl<T> Copy for Tok<T> {}
+impl<T> Clone for Tok<T> {
   fn clone(&self) -> Self {
-    Self{ id: self.id, phantom_data: PhantomData }
+    Self { id: self.id, phantom_data: PhantomData }
   }
 }

-impl<T> Eq for Token<T> {}
-impl<T> PartialEq for Token<T> {
-  fn eq(&self, other: &Self) -> bool { self.id == other.id }
+impl<T> Eq for Tok<T> {}
+impl<T> PartialEq for Tok<T> {
+  fn eq(&self, other: &Self) -> bool {
+    self.id == other.id
+  }
 }

-impl<T> Ord for Token<T> {
+impl<T> Ord for Tok<T> {
   fn cmp(&self, other: &Self) -> std::cmp::Ordering {
     self.id.cmp(&other.id)
   }
 }
-impl<T> PartialOrd for Token<T> {
+impl<T> PartialOrd for Tok<T> {
   fn partial_cmp(&self, other: &Self) -> Option<std::cmp::Ordering> {
-    Some(self.cmp(&other))
+    Some(self.cmp(other))
   }
 }

-impl<T> Hash for Token<T> {
+impl<T> Hash for Tok<T> {
   fn hash<H: std::hash::Hasher>(&self, state: &mut H) {
     state.write_u32(self.id.into())
   }
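`Tok<T>` (renamed from `Token<T>` in this commit) is just a `NonZeroU32` plus a `PhantomData` tag, so it is `Copy`, four bytes, and `Option<Tok<T>>` costs nothing extra thanks to the non-zero niche. A standalone sketch of the same newtype pattern:

```rust
use std::marker::PhantomData;
use std::mem::size_of;
use std::num::NonZeroU32;

// A typed index: PhantomData keeps indices for different T from mixing,
// NonZeroU32 gives Option the all-zeroes niche.
struct Id<T> {
  id: NonZeroU32,
  _marker: PhantomData<T>,
}

impl<T> Id<T> {
  fn new(id: NonZeroU32) -> Self {
    Id { id, _marker: PhantomData }
  }
}

fn main() {
  let one = NonZeroU32::new(1).expect("nonzero");
  let name_id: Id<String> = Id::new(one);
  assert_eq!(name_id.id.get(), 1);
  // Option<Id<T>> is still 4 bytes: None reuses the zero bit pattern.
  assert_eq!(size_of::<Option<Id<String>>>(), size_of::<u32>());
}
```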
@@ -1,36 +1,41 @@
+use super::context::Context;
+use super::error::RuntimeError;
+use super::Return;
 use crate::foreign::AtomicReturn;
-use crate::representations::Primitive;
-use crate::representations::PathSet;
-use crate::representations::interpreted::{ExprInst, Clause};
+use crate::representations::interpreted::{Clause, ExprInst};
+use crate::representations::{PathSet, Primitive};
 use crate::utils::Side;

-use super::Return;
-use super::error::RuntimeError;
-use super::context::Context;
-
-/// Process the clause at the end of the provided path.
-/// Note that paths always point to at least one target.
-/// Note also that this is not cached as a normalization step in the
-/// intermediate expressions.
+/// Process the clause at the end of the provided path. Note that paths always
+/// point to at least one target. Note also that this is not cached as a
+/// normalization step in the intermediate expressions.
 fn map_at<E>(
-  path: &[Side], source: ExprInst,
-  mapper: &mut impl FnMut(&Clause) -> Result<Clause, E>
+  path: &[Side],
+  source: ExprInst,
+  mapper: &mut impl FnMut(&Clause) -> Result<Clause, E>,
 ) -> Result<ExprInst, E> {
-  source.try_update(|value| {
-    // Pass right through lambdas
-    if let Clause::Lambda { args, body } = value {
-      return Ok((Clause::Lambda {
-        args: args.clone(),
-        body: map_at(path, body.clone(), mapper)?
-      }, ()))
-    }
-    // If the path ends here, process the next (non-lambda) node
-    let (head, tail) = if let Some(sf) = path.split_first() {sf} else {
-      return Ok((mapper(value)?, ()))
-    };
-    // If it's an Apply, execute the next step in the path
-    if let Clause::Apply { f, x } = value {
-      return Ok((match head {
-        Side::Left => Clause::Apply {
-          f: map_at(tail, f.clone(), mapper)?,
-          x: x.clone(),
+  source
+    .try_update(|value| {
+      // Pass right through lambdas
+      if let Clause::Lambda { args, body } = value {
+        return Ok((
+          Clause::Lambda {
+            args: args.clone(),
+            body: map_at(path, body.clone(), mapper)?,
+          },
+          (),
+        ));
+      }
+      // If the path ends here, process the next (non-lambda) node
+      let (head, tail) = if let Some(sf) = path.split_first() {
+        sf
+      } else {
+        return Ok((mapper(value)?, ()));
+      };
+      // If it's an Apply, execute the next step in the path
+      if let Clause::Apply { f, x } = value {
+        return Ok((
+          match head {
+            Side::Left => Clause::Apply {
+              f: map_at(tail, f.clone(), mapper)?,
+              x: x.clone(),
@@ -38,66 +43,84 @@ fn map_at<E>(
            Side::Right => Clause::Apply {
              f: f.clone(),
              x: map_at(tail, x.clone(), mapper)?,
-        }
-      }, ()))
-    }
-    panic!("Invalid path")
-  }).map(|p| p.0)
+            },
+          },
+          (),
+        ));
+      }
+      panic!("Invalid path")
+    })
+    .map(|p| p.0)
 }

+/// Replace the [Clause::LambdaArg] placeholders at the ends of the [PathSet]
+/// with the value in the body. Note that a path may point to multiple
+/// placeholders.
 fn substitute(paths: &PathSet, value: Clause, body: ExprInst) -> ExprInst {
-  let PathSet{ steps, next } = paths;
-  map_at(&steps, body, &mut |checkpoint| -> Result<Clause, !> {
+  let PathSet { steps, next } = paths;
+  map_at(steps, body, &mut |checkpoint| -> Result<Clause, !> {
     match (checkpoint, next) {
-      (Clause::Lambda{..}, _) => unreachable!("Handled by map_at"),
+      (Clause::Lambda { .. }, _) => unreachable!("Handled by map_at"),
       (Clause::Apply { f, x }, Some((left, right))) => Ok(Clause::Apply {
-        f: substitute(&left, value.clone(), f.clone()),
-        x: substitute(&right, value.clone(), x.clone()),
+        f: substitute(left, value.clone(), f.clone()),
+        x: substitute(right, value.clone(), x.clone()),
       }),
       (Clause::LambdaArg, None) => Ok(value.clone()),
-      (_, None) => panic!("Substitution path ends in something other than LambdaArg"),
-      (_, Some(_)) => panic!("Substitution path leads into something other than Apply"),
+      (_, None) =>
+        panic!("Substitution path ends in something other than LambdaArg"),
+      (_, Some(_)) =>
+        panic!("Substitution path leads into something other than Apply"),
     }
-  }).into_ok()
+  })
+  .into_ok()
 }

 /// Apply a function-like expression to a parameter.
-/// If any work is being done, gas will be deducted.
 pub fn apply(
-  f: ExprInst, x: ExprInst, ctx: Context
+  f: ExprInst,
+  x: ExprInst,
+  ctx: Context,
 ) -> Result<Return, RuntimeError> {
-  let (state, (gas, inert)) = f.clone().try_update(|clause| match clause {
+  let (state, (gas, inert)) = f.try_update(|clause| match clause {
     // apply an ExternFn or an internal function
     Clause::P(Primitive::ExternFn(f)) => {
-      let clause = f.apply(x, ctx.clone())
-        .map_err(|e| RuntimeError::Extern(e))?;
+      let clause =
+        f.apply(x, ctx.clone()).map_err(|e| RuntimeError::Extern(e))?;
       Ok((clause, (ctx.gas.map(|g| g - 1), false)))
-    }
-    Clause::Lambda{args, body} => Ok(if let Some(args) = args {
+    },
+    Clause::Lambda { args, body } => Ok(if let Some(args) = args {
       let x_cls = x.expr().clause.clone();
       let new_xpr_inst = substitute(args, x_cls, body.clone());
       let new_xpr = new_xpr_inst.expr();
       // cost of substitution
       // XXX: should this be the number of occurrences instead?
       (new_xpr.clause.clone(), (ctx.gas.map(|x| x - 1), false))
-    } else {(body.expr().clause.clone(), (ctx.gas, false))}),
+    } else {
+      (body.expr().clause.clone(), (ctx.gas, false))
+    }),
     Clause::Constant(name) => {
-      let symval = if let Some(sym) = ctx.symbols.get(name) {sym.clone()}
-      else { panic!("missing symbol for function {}",
-        ctx.interner.extern_vec(*name).join("::")
-      )};
-      Ok((Clause::Apply { f: symval, x, }, (ctx.gas, false)))
-    }
-    Clause::P(Primitive::Atom(atom)) => { // take a step in expanding atom
+      let symval = if let Some(sym) = ctx.symbols.get(name) {
+        sym.clone()
+      } else {
+        panic!(
+          "missing symbol for function {}",
+          ctx.interner.extern_vec(*name).join("::")
+        )
+      };
+      Ok((Clause::Apply { f: symval, x }, (ctx.gas, false)))
+    },
+    Clause::P(Primitive::Atom(atom)) => {
+      // take a step in expanding atom
       let AtomicReturn { clause, gas, inert } = atom.run(ctx.clone())?;
       Ok((Clause::Apply { f: clause.wrap(), x }, (gas, inert)))
     },
-    Clause::Apply{ f: fun, x: arg } => { // take a step in resolving pre-function
+    Clause::Apply { f: fun, x: arg } => {
+      // take a step in resolving pre-function
       let ret = apply(fun.clone(), arg.clone(), ctx.clone())?;
       let Return { state, inert, gas } = ret;
-      Ok((Clause::Apply{ f: state, x }, (gas, inert)))
+      Ok((Clause::Apply { f: state, x }, (gas, inert)))
     },
-    _ => Err(RuntimeError::NonFunctionApplication(f.clone()))
+    _ => Err(RuntimeError::NonFunctionApplication(f.clone())),
   })?;
  Ok(Return { state, gas, inert })
 }
`interpreter::context`:

```rust
use hashbrown::HashMap;

use crate::interner::{Interner, Sym};
use crate::representations::interpreted::ExprInst;

/// All the data associated with an interpreter run
#[derive(Clone)]
pub struct Context<'a> {
  /// Table used to resolve constants
  pub symbols: &'a HashMap<Sym, ExprInst>,
  /// The interner used for strings internally, so external functions can deduce
  /// referenced constant names on the fly
  pub interner: &'a Interner,
  /// The number of reduction steps the interpreter can take before returning
  pub gas: Option<usize>,
}

/// All the data produced by an interpreter run
#[derive(Clone)]
pub struct Return {
  /// The new expression tree
  pub state: ExprInst,
  /// Leftover [Context::gas] if counted
  pub gas: Option<usize>,
  /// If true, the next run would not modify the expression
  pub inert: bool,
}
```
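As a rough usage sketch (everything named here — the symbol table, the interner — is assumed to come from the pipeline; nothing below is defined in this module), a bounded run is configured through `gas`:

```rust
// Minimal sketch, written as if it sat next to the definitions above.
fn bounded_ctx<'a>(
  symbols: &'a HashMap<Sym, ExprInst>,
  interner: &'a Interner,
) -> Context<'a> {
  // Some(n) caps the run at roughly n reduction steps; None runs until the
  // expression is inert (which may never happen for a divergent program).
  Context { symbols, interner, gas: Some(10_000) }
}
```

`Return::gas` then reports what is left of that budget, and `inert` signals that another run would make no further progress.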
`interpreter::error`:

```rust
use std::fmt::Display;
use std::rc::Rc;

use crate::foreign::ExternError;
use crate::representations::interpreted::ExprInst;

/// Problems in the process of execution
#[derive(Clone, Debug)]
// ... (the enum definition is unchanged and omitted in the diff)

impl Display for RuntimeError {
  fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
    match self {
      Self::Extern(e) => write!(f, "Error in external function: {e}"),
      Self::NonFunctionApplication(loc) =>
        write!(f, "Primitive applied as function at {loc:?}"),
    }
  }
}
```
`interpreter` module root:

```rust
mod apply;
mod context;
mod error;
mod run;

pub use context::{Context, Return};
```
|||||||
@@ -1,42 +1,44 @@
|
|||||||
use std::mem;
|
use std::mem;
|
||||||
use std::rc::Rc;
|
use std::rc::Rc;
|
||||||
|
|
||||||
use crate::foreign::{AtomicReturn, Atomic, ExternError, Atom};
|
|
||||||
use crate::representations::Primitive;
|
|
||||||
use crate::representations::interpreted::{Clause, ExprInst};
|
|
||||||
|
|
||||||
use super::apply::apply;
|
use super::apply::apply;
|
||||||
use super::error::RuntimeError;
|
|
||||||
use super::context::{Context, Return};
|
use super::context::{Context, Return};
|
||||||
|
use super::error::RuntimeError;
|
||||||
|
use crate::foreign::{Atom, Atomic, AtomicReturn, ExternError};
|
||||||
|
use crate::representations::interpreted::{Clause, ExprInst};
|
||||||
|
use crate::representations::Primitive;
|
||||||
|
|
||||||
pub fn run(
|
/// Normalize an expression using beta reduction with memoization
|
||||||
expr: ExprInst,
|
pub fn run(expr: ExprInst, mut ctx: Context) -> Result<Return, RuntimeError> {
|
||||||
mut ctx: Context
|
let (state, (gas, inert)) =
|
||||||
) -> Result<Return, RuntimeError> {
|
expr.try_normalize(|cls| -> Result<(Clause, _), RuntimeError> {
|
||||||
let (state, (gas, inert)) = expr.try_normalize(|cls| -> Result<(Clause, _), RuntimeError> {
|
|
||||||
let mut i = cls.clone();
|
let mut i = cls.clone();
|
||||||
while ctx.gas.map(|g| g > 0).unwrap_or(true) {
|
while ctx.gas.map(|g| g > 0).unwrap_or(true) {
|
||||||
match &i {
|
match &i {
|
||||||
Clause::Apply { f, x } => {
|
Clause::Apply { f, x } => {
|
||||||
let res = apply(f.clone(), x.clone(), ctx.clone())?;
|
let res = apply(f.clone(), x.clone(), ctx.clone())?;
|
||||||
if res.inert {return Ok((i, (res.gas, true)))}
|
if res.inert {
|
||||||
|
return Ok((i, (res.gas, true)));
|
||||||
|
}
|
||||||
ctx.gas = res.gas;
|
ctx.gas = res.gas;
|
||||||
i = res.state.expr().clause.clone();
|
i = res.state.expr().clause.clone();
|
||||||
}
|
},
|
||||||
Clause::P(Primitive::Atom(data)) => {
|
Clause::P(Primitive::Atom(data)) => {
|
||||||
let ret = data.run(ctx.clone())?;
|
let ret = data.run(ctx.clone())?;
|
||||||
let AtomicReturn { clause, gas, inert } = ret;
|
let AtomicReturn { clause, gas, inert } = ret;
|
||||||
if inert {return Ok((i, (gas, true)))}
|
if inert {
|
||||||
|
return Ok((i, (gas, true)));
|
||||||
|
}
|
||||||
ctx.gas = gas;
|
ctx.gas = gas;
|
||||||
i = clause.clone();
|
i = clause.clone();
|
||||||
}
|
},
|
||||||
Clause::Constant(c) => {
|
Clause::Constant(c) => {
|
||||||
let symval = ctx.symbols.get(c).expect("missing symbol for value");
|
let symval = ctx.symbols.get(c).expect("missing symbol for value");
|
||||||
ctx.gas = ctx.gas.map(|g| g - 1); // cost of lookup
|
ctx.gas = ctx.gas.map(|g| g - 1); // cost of lookup
|
||||||
i = symval.expr().clause.clone();
|
i = symval.expr().clause.clone();
|
||||||
}
|
},
|
||||||
// non-reducible
|
// non-reducible
|
||||||
_ => return Ok((i, (ctx.gas, true)))
|
_ => return Ok((i, (ctx.gas, true))),
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
// out of gas
|
// out of gas
|
||||||
@@ -45,61 +47,108 @@ pub fn run(
|
|||||||
Ok(Return { state, gas, inert })
|
Ok(Return { state, gas, inert })
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/// Opaque inert data that may encode a command to a [Handler]
|
||||||
pub type HandlerParm = Box<dyn Atomic>;
|
pub type HandlerParm = Box<dyn Atomic>;
|
||||||
pub type HandlerRes = Result<
|
|
||||||
Result<ExprInst, Rc<dyn ExternError>>,
|
|
||||||
HandlerParm
|
|
||||||
>;
|
|
||||||
|
|
||||||
pub trait Handler {
|
/// Reasons why a [Handler] could not interpret a command. Convertible from
|
||||||
fn resolve(&mut self, data: HandlerParm) -> HandlerRes;
|
/// either variant
|
||||||
|
pub enum HandlerErr {
|
||||||
fn then<T: Handler>(self, t: T) -> impl Handler where Self: Sized {
|
/// The command was addressed to us but its execution resulted in an error
|
||||||
Pair(self, t)
|
Extern(Rc<dyn ExternError>),
|
||||||
|
/// This handler is not applicable, either because the [HandlerParm] is not a
|
||||||
|
/// command or because it's meant for some other handler
|
||||||
|
NA(HandlerParm),
|
||||||
|
}
|
||||||
|
impl From<Rc<dyn ExternError>> for HandlerErr {
|
||||||
|
fn from(value: Rc<dyn ExternError>) -> Self {
|
||||||
|
Self::Extern(value)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
impl<T> From<T> for HandlerErr
|
||||||
|
where
|
||||||
|
T: ExternError + 'static,
|
||||||
|
{
|
||||||
|
fn from(value: T) -> Self {
|
||||||
|
Self::Extern(value.into_extern())
|
||||||
|
}
|
||||||
|
}
|
||||||
|
impl From<HandlerParm> for HandlerErr {
|
||||||
|
fn from(value: HandlerParm) -> Self {
|
||||||
|
Self::NA(value)
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
impl<F> Handler for F where F: FnMut(HandlerParm) -> HandlerRes {
|
/// Various possible outcomes of a [Handler] execution.
|
||||||
|
pub type HandlerRes = Result<ExprInst, HandlerErr>;
|
||||||
|
|
||||||
|
/// A trait for things that may be able to handle commands returned by Orchid
|
||||||
|
/// code. This trait is implemented for [FnMut(HandlerParm) -> HandlerRes] and
|
||||||
|
/// [(Handler, Handler)], users are not supposed to implement it themselves.
|
||||||
|
///
|
||||||
|
/// A handler receives an arbitrary inert [Atomic] and uses [Atomic::as_any]
|
||||||
|
/// then [std::any::Any::downcast_ref] to obtain a known type. If this fails, it
|
||||||
|
/// returns the box in [HandlerErr::NA] which will be passed to the next
|
||||||
|
/// handler.
|
||||||
|
pub trait Handler {
|
||||||
|
/// Attempt to resolve a command with this handler.
|
||||||
|
fn resolve(&mut self, data: HandlerParm) -> HandlerRes;
|
||||||
|
|
||||||
|
/// If this handler isn't applicable, try the other one.
|
||||||
|
fn or<T: Handler>(self, t: T) -> impl Handler
|
||||||
|
where
|
||||||
|
Self: Sized,
|
||||||
|
{
|
||||||
|
(self, t)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
impl<F> Handler for F
|
||||||
|
where
|
||||||
|
F: FnMut(HandlerParm) -> HandlerRes,
|
||||||
|
{
|
||||||
fn resolve(&mut self, data: HandlerParm) -> HandlerRes {
|
fn resolve(&mut self, data: HandlerParm) -> HandlerRes {
|
||||||
self(data)
|
self(data)
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
pub struct Pair<T, U>(T, U);
|
impl<T: Handler, U: Handler> Handler for (T, U) {
|
||||||
|
|
||||||
impl<T: Handler, U: Handler> Handler for Pair<T, U> {
|
|
||||||
fn resolve(&mut self, data: HandlerParm) -> HandlerRes {
|
fn resolve(&mut self, data: HandlerParm) -> HandlerRes {
|
||||||
match self.0.resolve(data) {
|
match self.0.resolve(data) {
|
||||||
Ok(out) => Ok(out),
|
Err(HandlerErr::NA(data)) => self.1.resolve(data),
|
||||||
Err(data) => self.1.resolve(data)
|
x => x,
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/// [run] orchid code, executing any commands it returns using the specified
|
||||||
|
/// [Handler]s.
|
||||||
pub fn run_handler(
|
pub fn run_handler(
|
||||||
mut expr: ExprInst,
|
mut expr: ExprInst,
|
||||||
mut handler: impl Handler,
|
mut handler: impl Handler,
|
||||||
mut ctx: Context
|
mut ctx: Context,
|
||||||
) -> Result<Return, RuntimeError> {
|
) -> Result<Return, RuntimeError> {
|
||||||
loop {
|
loop {
|
||||||
let ret = run(expr.clone(), ctx.clone())?;
|
let ret = run(expr.clone(), ctx.clone())?;
|
||||||
if ret.gas == Some(0) {
|
if ret.gas == Some(0) {
|
||||||
return Ok(ret)
|
return Ok(ret);
|
||||||
}
|
}
|
||||||
let state_ex = ret.state.expr();
|
let state_ex = ret.state.expr();
|
||||||
let a = if let Clause::P(Primitive::Atom(a)) = &state_ex.clause {a}
|
let a = if let Clause::P(Primitive::Atom(a)) = &state_ex.clause {
|
||||||
else {
|
a
|
||||||
|
} else {
|
||||||
mem::drop(state_ex);
|
mem::drop(state_ex);
|
||||||
return Ok(ret)
|
return Ok(ret);
|
||||||
};
|
};
|
||||||
let boxed = a.clone().0;
|
let boxed = a.clone().0;
|
||||||
expr = match handler.resolve(boxed) {
|
expr = match handler.resolve(boxed) {
|
||||||
Ok(r) => r.map_err(RuntimeError::Extern)?,
|
Ok(expr) => expr,
|
||||||
Err(e) => return Ok(Return{
|
Err(HandlerErr::Extern(ext)) => Err(ext)?,
|
||||||
|
Err(HandlerErr::NA(atomic)) =>
|
||||||
|
return Ok(Return {
|
||||||
gas: ret.gas,
|
gas: ret.gas,
|
||||||
inert: ret.inert,
|
inert: ret.inert,
|
||||||
state: Clause::P(Primitive::Atom(Atom(e))).wrap()
|
state: Clause::P(Primitive::Atom(Atom(atomic))).wrap(),
|
||||||
})
|
}),
|
||||||
};
|
};
|
||||||
ctx.gas = ret.gas;
|
ctx.gas = ret.gas;
|
||||||
}
|
}
|
||||||
|
|||||||
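Since closures get the blanket `Handler` impl and pairs forward `HandlerErr::NA` to their second element, a host can compose handlers with `or` and hand the chain to `run_handler`. A minimal sketch, written as if it sat next to the definitions above so all names are in scope; the actual command types are up to the embedder:

```rust
// Sketch only: a handler that declines every command, so run_handler stops
// at the first inert atom. A real embedder would downcast `cmd` with
// Atomic::as_any and act on the commands it recognizes.
fn drive(root: ExprInst, ctx: Context) -> Result<Return, RuntimeError> {
  let decline = |cmd: HandlerParm| -> HandlerRes { Err(HandlerErr::NA(cmd)) };
  // Further handlers could be chained in front, e.g. `real_handler.or(decline)`.
  run_handler(root, decline, ctx)
}
```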
src/main.rs:

```rust
#![feature(trait_alias)]
#![feature(return_position_impl_trait_in_trait)]

mod cli;
mod external;
pub(crate) mod foreign;
mod foreign_macros;
mod interner;
mod interpreter;
mod parse;
mod pipeline;
mod representations;
mod rule;
mod run_dir;
mod utils;

use std::fs::File;
use std::path::PathBuf;

use clap::Parser;
use cli::prompt;
// ... (unchanged lines omitted in the diff)

struct Args {
  /// Folder containing main.orc
  #[arg(short, long)]
  pub project: Option<String>,
}

fn main() {
  // ... (unchanged lines omitted in the diff)
    path.push("main.orc");
    match File::open(&path) {
      Ok(_) => Ok(p),
      Err(e) => Err(format!("{}: {e}", path.display())),
    }
  })
});
```
`parse::comment`:

```rust
pub use chumsky::prelude::*;
pub use chumsky::{self, Parser};

use super::decls::SimpleParser;

/// Parses Lua-style comments
pub fn comment_parser() -> impl SimpleParser<char, String> {
  choice((
    just("--[").ignore_then(take_until(just("]--").ignored())),
    just("--").ignore_then(take_until(just("\n").rewind().ignored().or(end()))),
  ))
  .map(|(vc, ())| vc)
  .collect()
  .labelled("comment")
}
```
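A quick sketch of what this grammar accepts, assuming chumsky's `Parser::parse` is called directly on a source string; the delimiters are consumed and only the comment text is kept:

```rust
// Hypothetical unit test next to comment_parser.
#[test]
fn lua_style_comments() {
  // `--` runs to the end of the line (or of the input).
  assert_eq!(comment_parser().parse("-- note").unwrap(), " note");
  // `--[ ... ]--` may span several lines.
  assert_eq!(comment_parser().parse("--[ a\nb ]--").unwrap(), " a\nb ");
}
```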
`parse::context`:

```rust
pub trait Context: Clone {
  type Op: AsRef<str>;

  fn ops(&self) -> &[Self::Op];
  fn file(&self) -> Rc<Vec<String>>;
  fn interner(&self) -> &Interner;
}

/// Struct implementing context
// ... (unchanged lines omitted in the diff)

pub struct ParsingContext<'a, Op> {
  pub ops: &'a [Op],
  pub interner: &'a Interner,
  pub file: Rc<Vec<String>>,
}

impl<'a, Op> ParsingContext<'a, Op> {
  pub fn new(
    ops: &'a [Op],
    interner: &'a Interner,
    file: Rc<Vec<String>>,
  ) -> Self {
    Self { ops, interner, file }
  }
}

impl<'a, Op> Clone for ParsingContext<'a, Op> {
  fn clone(&self) -> Self {
    Self { ops: self.ops, interner: self.interner, file: self.file.clone() }
  }
}

impl<Op: AsRef<str>> Context for ParsingContext<'_, Op> {
  type Op = Op;

  fn interner(&self) -> &Interner {
    self.interner
  }
  fn file(&self) -> Rc<Vec<String>> {
    self.file.clone()
  }
  fn ops(&self) -> &[Self::Op] {
    self.ops
  }
}
```
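For illustration (the interner and the operator list are placeholders, not part of this file), a call site would look roughly like this; cloning the result is cheap because only the `Rc` holding the file path is reference-counted:

```rust
// Hypothetical setup; `Interner` is the crate's interner type as imported above.
fn demo_ctx(interner: &Interner) -> ParsingContext<'_, &'static str> {
  const OPS: &[&str] = &["+", "=="];
  let file = Rc::new(vec!["mymod".to_string()]);
  ParsingContext::new(OPS, interner, file)
}
```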
src/parse/decls.rs (new file):

```rust
use std::hash::Hash;

use chumsky::prelude::Simple;
use chumsky::recursive::Recursive;
use chumsky::{BoxedParser, Parser};

/// Wrapper around [Parser] with [Simple] error to avoid repeating the input
pub trait SimpleParser<I: Eq + Hash + Clone, O> =
  Parser<I, O, Error = Simple<I>>;
/// Boxed version of [SimpleParser]
pub type BoxedSimpleParser<'a, I, O> = BoxedParser<'a, I, O, Simple<I>>;
/// [Recursive] specialization of [SimpleParser] to parameterize calls to
/// [chumsky::recursive::recursive]
pub type SimpleRecursive<'a, I, O> = Recursive<'a, I, O, Simple<I>>;
```
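The alias only abbreviates the ubiquitous `Simple` error parameter; the two bounds below are interchangeable, which is what lets the parser signatures elsewhere in this commit shrink:

```rust
// Illustration only, written as if inside this module: equivalent spellings
// of the same bound.
fn takes_alias(p: impl SimpleParser<char, String>) { let _ = p; }
fn takes_spelled_out(p: impl Parser<char, String, Error = Simple<char>>) { let _ = p; }
```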
`enum_filter` macro (`parse::enum_filter`):

```rust
/// Produces filter_mapping functions for enum types:
/// ```rs
/// enum_parser!(Foo::Bar | "Some error!")
/// // Foo::Bar(T) into T
/// enum_parser!(Foo::Bar)
/// // same as above but with the default error "Expected Foo::Bar"
/// enum_parser!(Foo >> Quz; Bar, Baz)
/// // Foo::Bar(T) into Quz::Bar(T)
/// // Foo::Baz(U) into Quz::Baz(U)
/// ```
#[macro_export]
macro_rules! enum_filter {
  // ... (macro body unchanged, omitted in the diff)
```
`parse::expression`:

```rust
use std::ops::Range;
use std::rc::Rc;

use chumsky::prelude::*;
use chumsky::{self, Parser};

use super::context::Context;
use super::decls::SimpleParser;
use super::lexer::{filter_map_lex, Entry, Lexeme};
use crate::enum_filter;
use crate::interner::Sym;
use crate::representations::ast::{Clause, Expr};
use crate::representations::location::Location;
use crate::representations::Primitive;

/// Parses any number of expr wrapped in (), [] or {}
fn sexpr_parser(
  expr: impl SimpleParser<Entry, Expr> + Clone,
) -> impl SimpleParser<Entry, (Clause, Range<usize>)> + Clone {
  let body = expr.repeated();
  choice((
    Lexeme::LP('(').parser().then(body.clone()).then(Lexeme::RP('(').parser()),
    Lexeme::LP('[').parser().then(body.clone()).then(Lexeme::RP('[').parser()),
    Lexeme::LP('{').parser().then(body).then(Lexeme::RP('{').parser()),
  ))
  .map(|((lp, body), rp)| {
    let Entry { lexeme, range: Range { start, .. } } = lp;
    let end = rp.range.end;
    let char = if let Lexeme::LP(c) = lexeme {
      c
    } else {
      unreachable!("The parser only matches Lexeme::LP")
    };
    (Clause::S(char, Rc::new(body)), start..end)
  })
  .labelled("S-expression")
}

/// Parses `\name.body` or `\name:type.body` where name is any valid name
/// and type and body are both expressions. Comments are allowed
/// and ignored everywhere in between the tokens
fn lambda_parser<'a>(
  expr: impl SimpleParser<Entry, Expr> + Clone + 'a,
  ctx: impl Context + 'a,
) -> impl SimpleParser<Entry, (Clause, Range<usize>)> + Clone + 'a {
  Lexeme::BS
    .parser()
    .ignore_then(expr.clone())
    .then_ignore(Lexeme::Name(ctx.interner().i(".")).parser())
    .then(expr.repeated().at_least(1))
    .map_with_span(move |(arg, body), span| {
      (Clause::Lambda(Rc::new(arg), Rc::new(body)), span)
    })
    .labelled("Lambda")
}

/// Parses a sequence of names separated by :: <br/>
/// Comments and line breaks are allowed and ignored in between
pub fn ns_name_parser<'a>(
  ctx: impl Context + 'a,
) -> impl SimpleParser<Entry, (Sym, Range<usize>)> + Clone + 'a {
  filter_map_lex(enum_filter!(Lexeme::Name))
    .separated_by(Lexeme::NS.parser())
    .at_least(1)
    .map(move |elements| {
      let start = elements.first().expect("can never be empty").1.start;
      let end = elements.last().expect("can never be empty").1.end;
      let tokens = (elements.iter().map(|(t, _)| *t)).collect::<Vec<_>>();
      (ctx.interner().i(&tokens), start..end)
    })
    .labelled("Namespaced name")
}

pub fn namelike_parser<'a>(
  ctx: impl Context + 'a,
) -> impl SimpleParser<Entry, (Clause, Range<usize>)> + Clone + 'a {
  choice((
    filter_map_lex(enum_filter!(Lexeme::PH))
      .map(|(ph, range)| (Clause::Placeh(ph), range)),
    ns_name_parser(ctx).map(|(token, range)| (Clause::Name(token), range)),
  ))
}

pub fn clause_parser<'a>(
  expr: impl SimpleParser<Entry, Expr> + Clone + 'a,
  ctx: impl Context + 'a,
) -> impl SimpleParser<Entry, (Clause, Range<usize>)> + Clone + 'a {
  choice((
    filter_map_lex(enum_filter!(Lexeme >> Primitive; Literal))
      .map(|(p, s)| (Clause::P(p), s))
      .labelled("Literal"),
    sexpr_parser(expr.clone()),
    lambda_parser(expr, ctx.clone()),
    namelike_parser(ctx),
  ))
  .labelled("Clause")
}

/// Parse an expression
pub fn xpr_parser<'a>(
  ctx: impl Context + 'a,
) -> impl SimpleParser<Entry, Expr> + 'a {
  recursive(move |expr| {
    clause_parser(expr, ctx.clone()).map(move |(value, range)| Expr {
      value,
      location: Location::Range { file: ctx.file(), range },
    })
  })
  .labelled("Expression")
}
```
`parse::facade`:

```rust
use std::fmt::Debug;

use chumsky::prelude::*;
use chumsky::Parser;
use thiserror::Error;

use super::context::Context;
use super::{lexer, line_parser, Entry};
use crate::parse::sourcefile::split_lines;
use crate::representations::sourcefile::FileEntry;

#[derive(Error, Debug, Clone)]
pub enum ParseError {
  #[error("Could not tokenize {0:?}")]
  Lex(Vec<Simple<char>>),
  #[error(
    "Could not parse {:?} on line {}",
    .0.first().unwrap().1.span(),
    .0.first().unwrap().0
  )]
  Ast(Vec<(usize, Simple<Entry>)>),
}

/// Parse a string of code into a collection of module elements;
/// imports, exports, comments, declarations, etc.
///
/// Notice that because the lexer splits operators based on the provided
/// list, the output will only be correct if operator list already
/// contains all operators defined or imported by this module.
pub fn parse(
  data: &str,
  ctx: impl Context,
) -> Result<Vec<FileEntry>, ParseError> {
  // TODO: wrap `i`, `ops` and `prefix` in a parsing context
  let lexie = lexer(ctx.clone());
  let token_batchv = lexie.parse(data).map_err(ParseError::Lex)?;
  let parsr = line_parser(ctx).then_ignore(end());
  let (parsed_lines, errors_per_line) = split_lines(&token_batchv)
    .enumerate()
    .map(|(i, entv)| {
      (i, entv.iter().filter(|e| !e.is_filler()).cloned().collect::<Vec<_>>())
    })
    .filter(|(_, l)| !l.is_empty())
    .map(|(i, l)| (i, parsr.parse(l)))
    .map(|(i, res)| match res {
      Ok(r) => (Some(r), (i, vec![])),
      Err(e) => (None, (i, e)),
    })
    .unzip::<_, _, Vec<_>, Vec<_>>();
  let total_err = errors_per_line
    .into_iter()
    .flat_map(|(i, v)| v.into_iter().map(move |e| (i, e)))
    .collect::<Vec<_>>();
  if !total_err.is_empty() {
    Err(ParseError::Ast(total_err))
  } else {
    Ok(parsed_lines.into_iter().map(Option::unwrap).collect())
  }
}
```
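Putting the pieces together, host code calls `parse` with a concrete `ParsingContext`; the sketch below only shows the shape of the API (the source string and operator list are invented, and whether they actually parse is beside the point):

```rust
// Hypothetical call site; `interner` is assumed to exist.
fn parse_demo(interner: &Interner) {
  let ops = ["+"];
  let file = Rc::new(vec!["mymod".to_string()]);
  let ctx = ParsingContext::new(&ops, interner, file);
  match parse("export main := 42", ctx) {
    Ok(lines) => println!("{} top-level entries", lines.len()),
    Err(ParseError::Lex(errs)) => eprintln!("lexer errors: {errs:?}"),
    Err(ParseError::Ast(errs)) => eprintln!("parser errors: {errs:?}"),
  }
}
```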
`parse::import`:

```rust
use chumsky::prelude::*;
use chumsky::Parser;
use itertools::Itertools;

use super::context::Context;
use super::decls::{SimpleParser, SimpleRecursive};
use super::lexer::{filter_map_lex, Lexeme};
use super::Entry;
use crate::interner::Tok;
use crate::representations::sourcefile::Import;
use crate::utils::iter::{
  box_flatten, box_once, into_boxed_iter, BoxedIterIter,
};
use crate::{box_chain, enum_filter};

/// initialize a BoxedIter<BoxedIter<String>> with a single element.
fn init_table(name: Tok<String>) -> BoxedIterIter<'static, Tok<String>> {
  // I'm not at all confident that this is a good approach.
  box_once(box_once(name))
}

// ... (unchanged lines omitted in the diff)
/// preferably contain crossplatform filename-legal characters but the
/// symbols are explicitly allowed to go wild.
/// There's a blacklist in [name]
pub fn import_parser<'a>(
  ctx: impl Context + 'a,
) -> impl SimpleParser<Entry, Vec<Import>> + 'a {
  // TODO: this algorithm isn't cache friendly and copies a lot
  recursive({
    let ctx = ctx.clone();
    move |expr: SimpleRecursive<Entry, BoxedIterIter<Tok<String>>>| {
      filter_map_lex(enum_filter!(Lexeme::Name))
        .map(|(t, _)| t)
        .separated_by(Lexeme::NS.parser())
        .then(
          Lexeme::NS
            .parser()
            .ignore_then(choice((
              expr
                .clone()
                .separated_by(Lexeme::Name(ctx.interner().i(",")).parser())
                .delimited_by(
                  Lexeme::LP('(').parser(),
                  Lexeme::RP('(').parser(),
                )
                .map(|v| box_flatten(v.into_iter()))
                .labelled("import group"),
              // Each expr returns a list of imports, flatten into common list
              Lexeme::Name(ctx.interner().i("*"))
                .parser()
                .map(move |_| init_table(ctx.interner().i("*")))
                .labelled("wildcard import"), // Just a *, wrapped
              filter_map_lex(enum_filter!(Lexeme::Name))
                .map(|(t, _)| init_table(t))
                .labelled("import terminal"), // Just a name, wrapped
            )))
            .or_not(),
        )
        .map(
          |(name, opt_post): (
            Vec<Tok<String>>,
            Option<BoxedIterIter<Tok<String>>>,
          )|
           -> BoxedIterIter<Tok<String>> {
            if let Some(post) = opt_post {
              Box::new(
                post.map(move |el| box_chain!(name.clone().into_iter(), el)),
              )
            } else {
              box_once(into_boxed_iter(name))
            }
          },
        )
    }
  })
  .map(move |paths| {
    paths
      .filter_map(|namespaces| {
        let mut path = namespaces.collect_vec();
        let name = path.pop()?;
        Some(Import {
          path: ctx.interner().i(&path),
          name: {
            if name == ctx.interner().i("*") {
              None
            } else {
              Some(name)
            }
          },
        })
      })
      .collect()
  })
  .labelled("import")
}
```
`parse::lexer`:

```rust
use std::fmt;
use std::ops::Range;

use chumsky::prelude::*;
use chumsky::text::keyword;
use chumsky::{Parser, Span};
use ordered_float::NotNan;

use super::context::Context;
use super::decls::SimpleParser;
use super::{comment, name, number, placeholder, string};
use crate::ast::{PHClass, Placeholder};
use crate::interner::{InternedDisplay, Interner, Tok};
use crate::representations::Literal;

#[derive(Clone, Debug, PartialEq, Eq, Hash)]
pub struct Entry {
  pub lexeme: Lexeme,
  pub range: Range<usize>,
}
impl Entry {
  pub fn is_filler(&self) -> bool {
    // ... (body unchanged, omitted in the diff)
  }
}

impl InternedDisplay for Entry {
  fn fmt_i(
    &self,
    f: &mut std::fmt::Formatter<'_>,
    i: &Interner,
  ) -> std::fmt::Result {
    self.lexeme.fmt_i(f, i)
  }
}

// ... (unchanged lines omitted in the diff)
impl Span for Entry {
  type Context = Lexeme;
  type Offset = usize;

  fn context(&self) -> Self::Context {
    self.lexeme.clone()
  }
  fn start(&self) -> Self::Offset {
    self.range.start()
  }
  fn end(&self) -> Self::Offset {
    self.range.end()
  }
  fn new(context: Self::Context, range: Range<Self::Offset>) -> Self {
    Self { lexeme: context, range }
  }
}

#[derive(Clone, Debug, PartialEq, Eq, Hash)]
pub enum Lexeme {
  Literal(Literal),
  Name(Tok<String>),
  Rule(NotNan<f64>),
  /// Walrus operator (formerly shorthand macro)
  Const,
  // ... (further variants unchanged, omitted in the diff)
  Export,
  Import,
  Namespace,
  PH(Placeholder),
}

impl InternedDisplay for Lexeme {
  fn fmt_i(
    &self,
    f: &mut std::fmt::Formatter<'_>,
    i: &Interner,
  ) -> std::fmt::Result {
    match self {
      Self::Literal(l) => write!(f, "{:?}", l),
      Self::Name(token) => write!(f, "{}", i.r(*token)),
      // ... (unchanged arms omitted in the diff; the next lines sit inside
      // the RP arm's inner match)
        '(' => write!(f, ")"),
        '[' => write!(f, "]"),
        '{' => write!(f, "}}"),
        _ => f.debug_tuple("RP").field(l).finish(),
      },
      Self::BR => writeln!(f),
      Self::BS => write!(f, "\\"),
      Self::At => write!(f, "@"),
      Self::Type => write!(f, ":"),
      // ... (unchanged arms omitted in the diff)
      Self::PH(Placeholder { name, class }) => match *class {
        PHClass::Scalar => write!(f, "${}", i.r(*name)),
        PHClass::Vec { nonzero, prio } => {
          if nonzero {
            write!(f, "...")
          } else {
            write!(f, "..")
          }?;
          write!(f, "${}", i.r(*name))?;
          if prio != 0 {
            write!(f, ":{}", prio)?;
          };
          Ok(())
        },
      },
    }
  }
}

impl Lexeme {
  pub fn rule(prio: impl Into<f64>) -> Self {
    Lexeme::Rule(NotNan::new(prio.into()).expect("Rule priority cannot be NaN"))
  }

  pub fn parser<E: chumsky::Error<Entry>>(
    self,
  ) -> impl Parser<Entry, Entry, Error = E> + Clone {
    filter(move |ent: &Entry| ent.lexeme == self)
  }
}

// ... (LexedText and the start of its InternedDisplay impl are unchanged and
// omitted in the diff)
  }
}

fn paren_parser(lp: char, rp: char) -> impl SimpleParser<char, Lexeme> {
  just(lp).to(Lexeme::LP(lp)).or(just(rp).to(Lexeme::RP(lp)))
}

pub fn literal_parser() -> impl SimpleParser<char, Literal> {
  choice((
    // all ints are valid floats so it takes precedence
    number::int_parser().map(Literal::Uint),
    number::float_parser().map(Literal::Num),
    string::char_parser().map(Literal::Char),
    string::str_parser().map(Literal::Str),
    // ... (remaining lines unchanged, omitted in the diff)

pub static BASE_OPS: &[&str] = &[",", ".", "..", "..."];

pub fn lexer<'a>(
  ctx: impl Context + 'a,
) -> impl SimpleParser<char, Vec<Entry>> + 'a {
  let all_ops = ctx
    .ops()
    .iter()
    .map(|op| op.as_ref())
    .chain(BASE_OPS.iter().cloned())
    .map(str::to_string)
    // ... (unchanged lines omitted in the diff)
    paren_parser('[', ']'),
    paren_parser('{', '}'),
    just(":=").to(Lexeme::Const),
    just("=")
      .ignore_then(number::float_parser())
      .then_ignore(just("=>"))
      .map(Lexeme::rule),
    comment::comment_parser().map(Lexeme::Comment),
    just("::").to(Lexeme::NS),
    just('\\').to(Lexeme::BS),
    // ... (unchanged lines omitted in the diff)
    just('\n').to(Lexeme::BR),
    placeholder::placeholder_parser(ctx.clone()).map(Lexeme::PH),
    literal_parser().map(Lexeme::Literal),
    name::name_parser(&all_ops)
      .map(move |n| Lexeme::Name(ctx.interner().i(&n))),
  ))
  .map_with_span(|lexeme, range| Entry { lexeme, range })
  .padded_by(one_of(" \t").repeated())
  .repeated()
  .then_ignore(end())
}

pub fn filter_map_lex<'a, O, M: ToString>(
  f: impl Fn(Lexeme) -> Result<O, M> + Clone + 'a,
) -> impl SimpleParser<Entry, (O, Range<usize>)> + Clone + 'a {
  filter_map(move |s: Range<usize>, e: Entry| {
    let out = f(e.lexeme).map_err(|msg| Simple::custom(s.clone(), msg))?;
    Ok((out, s))
    // ... (remaining lines unchanged, omitted in the diff)
```
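One detail worth pinning down: the rule arrow `=N=>` carries its priority as a `NotNan<f64>`, and `Lexeme::rule` is the helper the lexer maps onto it; a tiny hypothetical check:

```rust
#[test]
fn rule_lexeme_wraps_priority() {
  // `Lexeme::rule` only validates the number; a NaN priority would panic.
  assert_eq!(Lexeme::rule(0.5), Lexeme::Rule(NotNan::new(0.5).unwrap()));
}
```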
`parse` module root:

```rust
mod comment;
mod context;
mod decls;
mod enum_filter;
mod expression;
mod facade;
mod import;
mod lexer;
mod name;
mod number;
mod placeholder;
mod sourcefile;
mod string;

pub use context::ParsingContext;
pub use facade::{parse, ParseError};
pub use lexer::{lexer, Entry, Lexeme};
pub use name::is_op;
pub use number::{float_parser, int_parser};
pub use sourcefile::line_parser;
```
`parse::name`:

```rust
use chumsky::prelude::*;
use chumsky::{self, Parser};

use super::decls::{BoxedSimpleParser, SimpleParser};

/// Matches any one of the passed operators, preferring longer ones
fn op_parser<'a>(
  ops: &[impl AsRef<str> + Clone],
) -> BoxedSimpleParser<'a, char, String> {
  let mut sorted_ops: Vec<String> =
    ops.iter().map(|t| t.as_ref().to_string()).collect();
  sorted_ops.sort_by_key(|op| -(op.len() as i64));
  sorted_ops
    .into_iter()
    .map(|op| just(op).boxed())
    .reduce(|a, b| a.or(b).boxed())
    .unwrap_or_else(|| {
      empty().map(|()| panic!("Empty isn't meant to match")).boxed()
    })
    .labelled("operator")
    .boxed()
}

/// Characters that cannot be parsed as part of an operator
// ... (the NOT_NAME_CHAR blacklist is unchanged and omitted in the diff)

/// TODO: `.` could possibly be parsed as an operator in some contexts.
/// This operator is very common in maths so it's worth a try.
/// Investigate.
pub fn modname_parser<'a>() -> impl SimpleParser<char, String> + 'a {
  filter(move |c| !NOT_NAME_CHAR.contains(c) && !c.is_whitespace())
    .repeated()
    .at_least(1)
    .collect()
    .labelled("modname")
}

/// Parse an operator or name. Failing both, parse everything up to
/// the next whitespace or blacklisted character as a new operator.
pub fn name_parser<'a>(
  ops: &[impl AsRef<str> + Clone],
) -> impl SimpleParser<char, String> + 'a {
  choice((
    op_parser(ops), // First try to parse a known operator
    text::ident().labelled("plain text"), // Failing that, parse plain text
    modname_parser(), // Finally parse everything until the next forbidden char
  ))
  .labelled("name")
}

// ... (unchanged lines omitted in the diff)
pub fn is_op(s: impl AsRef<str>) -> bool {
  return match s.as_ref().chars().next() {
    Some(x) => !x.is_alphanumeric(),
    None => false,
  };
}
```
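Because the operator list is sorted longest-first before being handed to `choice`, the longest known operator wins; a hypothetical check of that behaviour:

```rust
#[test]
fn longest_operator_wins() {
  let ops = ["=", "=>"];
  // "=>" must not be split into "=" followed by a stray ">".
  assert_eq!(name_parser(&ops).parse("=>").unwrap(), "=>");
}
```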
@@ -1,6 +1,9 @@
use chumsky::prelude::*;
use chumsky::{self, Parser};
use ordered_float::NotNan;

use super::decls::SimpleParser;

fn assert_not_digit(base: u32, c: char) {
  if base > (10 + (c as u32 - 'a' as u32)) {
    panic!("The character '{}' is a digit in base ({})", c, base)
@@ -10,7 +13,7 @@ fn assert_not_digit(base: u32, c: char) {
/// Parse an arbitrarily grouped sequence of digits starting with an underscore.
///
/// TODO: this should use separated_by and parse the leading group too
fn separated_digits_parser(base: u32) -> impl SimpleParser<char, String> {
  just('_')
    .ignore_then(text::digits(base))
    .repeated()
@@ -20,47 +23,52 @@ fn separated_digits_parser(base: u32) -> impl Parser<char, String, Error = Simpl
/// parse a grouped uint
///
/// Not to be confused with [int_parser] which does a lot more
fn uint_parser(base: u32) -> impl SimpleParser<char, u64> {
  text::int(base).then(separated_digits_parser(base)).map(
    move |(s1, s2): (String, String)| {
      u64::from_str_radix(&(s1 + &s2), base).unwrap()
    },
  )
}

/// parse exponent notation, or return 0 as the default exponent.
/// The exponent is always in decimal.
fn pow_parser() -> impl SimpleParser<char, i32> {
  choice((
    just('p').ignore_then(text::int(10)).map(|s: String| s.parse().unwrap()),
    just("p-")
      .ignore_then(text::int(10))
      .map(|s: String| -s.parse::<i32>().unwrap()),
  ))
  .or_else(|_| Ok(0))
}

/// returns a mapper that converts a mantissa and an exponent into an uint
///
/// TODO it panics if it finds a negative exponent
fn nat2u(base: u64) -> impl Fn((u64, i32)) -> u64 {
  move |(val, exp)| {
    if exp == 0 {
      val
    } else {
      val * base.checked_pow(exp.try_into().unwrap()).unwrap()
    }
  }
}

/// returns a mapper that converts a mantissa and an exponent into a float
fn nat2f(base: u64) -> impl Fn((NotNan<f64>, i32)) -> NotNan<f64> {
  move |(val, exp)| {
    if exp == 0 {
      val
    } else {
      val * (base as f64).powf(exp.try_into().unwrap())
    }
  }
}

/// parse an uint from exponential notation (panics if 'p' is a digit in base)
fn pow_uint_parser(base: u32) -> impl SimpleParser<char, u64> {
  assert_not_digit(base, 'p');
  uint_parser(base).then(pow_parser()).map(nat2u(base.into()))
}
@@ -68,7 +76,7 @@ fn pow_uint_parser(base: u32) -> impl Parser<char, u64, Error = Simple<char>> {
/// parse an uint from a base determined by its prefix or lack thereof
///
/// Not to be confused with [uint_parser] which is a component of it.
pub fn int_parser() -> impl SimpleParser<char, u64> {
  choice((
    just("0b").ignore_then(pow_uint_parser(2)),
    just("0x").ignore_then(pow_uint_parser(16)),
@@ -78,35 +86,40 @@ pub fn int_parser() -> impl Parser<char, u64, Error = Simple<char>> {
}

/// parse a float from dot notation
fn dotted_parser(base: u32) -> impl SimpleParser<char, NotNan<f64>> {
  uint_parser(base)
    .then(
      just('.')
        .ignore_then(text::digits(base).then(separated_digits_parser(base)))
        .map(move |(frac1, frac2)| {
          let frac = frac1 + &frac2;
          let frac_num = u64::from_str_radix(&frac, base).unwrap() as f64;
          let dexp = base.pow(frac.len().try_into().unwrap());
          frac_num / dexp as f64
        })
        .or_not()
        .map(|o| o.unwrap_or_default()),
    )
    .try_map(|(wh, f), s| {
      NotNan::new(wh as f64 + f)
        .map_err(|_| Simple::custom(s, "Float literal evaluates to NaN"))
    })
}

/// parse a float from dotted and optionally also exponential notation
fn pow_float_parser(base: u32) -> impl SimpleParser<char, NotNan<f64>> {
  assert_not_digit(base, 'p');
  dotted_parser(base).then(pow_parser()).map(nat2f(base.into()))
}

/// parse a float with dotted and optionally exponential notation from a base
/// determined by its prefix
pub fn float_parser() -> impl SimpleParser<char, NotNan<f64>> {
  choice((
    just("0b").ignore_then(pow_float_parser(2)),
    just("0x").ignore_then(pow_float_parser(16)),
    just('0').ignore_then(pow_float_parser(8)),
    pow_float_parser(10),
  ))
  .labelled("float")
}
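The numeric literal parsers above compose into `int_parser` and `float_parser`. A hypothetical usage sketch (not part of this commit), assuming a chumsky 0.9-style `Parser::parse` and that the two functions were reachable from a test or binary in this crate:

```rust
use chumsky::prelude::*;

fn demo() {
  // "0x" selects base 16; digit groups may be separated by underscores.
  let n = int_parser().then_ignore(end()).parse("0xff_ff").unwrap();
  assert_eq!(n, 0xffff);
  // Dotted notation: 0x1.8 is 1 + 8/16 = 1.5; a trailing `p<int>` would add
  // a decimal exponent on top of that.
  let f = float_parser().then_ignore(end()).parse("0x1.8").unwrap();
  assert_eq!(f.into_inner(), 1.5);
}
```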
@@ -1,16 +1,18 @@
use chumsky::prelude::*;
use chumsky::Parser;

use super::context::Context;
use super::decls::SimpleParser;
use super::number::int_parser;
use crate::ast::{PHClass, Placeholder};

pub fn placeholder_parser(
  ctx: impl Context,
) -> impl SimpleParser<char, Placeholder> {
  choice((
    just("...").to(Some(true)),
    just("..").to(Some(false)),
    empty().to(None),
  ))
  .then(just("$").ignore_then(text::ident()))
  .then(just(":").ignore_then(int_parser()).or_not())
@@ -19,12 +21,10 @@ pub fn placeholder_parser<'a>(ctx: impl Context + 'a)
    if let Some(nonzero) = vec_nonzero {
      let prio = vec_prio.unwrap_or_default();
      Ok(Placeholder { name, class: PHClass::Vec { nonzero, prio } })
    } else if vec_prio.is_some() {
      Err(Simple::custom(span, "Scalar placeholders have no priority"))
    } else {
      Ok(Placeholder { name, class: PHClass::Scalar })
    }
  })
}
@@ -1,55 +1,67 @@
use std::iter;
use std::rc::Rc;

use chumsky::prelude::*;
use chumsky::Parser;
use itertools::Itertools;

use super::context::Context;
use super::decls::{SimpleParser, SimpleRecursive};
use super::expression::xpr_parser;
use super::import::import_parser;
use super::lexer::{filter_map_lex, Lexeme};
use super::Entry;
use crate::ast::{Clause, Constant, Expr, Rule};
use crate::enum_filter;
use crate::representations::location::Location;
use crate::representations::sourcefile::{FileEntry, Member, Namespace};

fn rule_parser<'a>(
  ctx: impl Context + 'a,
) -> impl SimpleParser<Entry, Rule> + 'a {
  xpr_parser(ctx.clone())
    .repeated()
    .at_least(1)
    .then(filter_map_lex(enum_filter!(Lexeme::Rule)))
    .then(xpr_parser(ctx).repeated().at_least(1))
    .map(|((s, (prio, _)), t)| Rule {
      source: Rc::new(s),
      prio,
      target: Rc::new(t),
    })
    .labelled("Rule")
}

fn const_parser<'a>(
  ctx: impl Context + 'a,
) -> impl SimpleParser<Entry, Constant> + 'a {
  filter_map_lex(enum_filter!(Lexeme::Name))
    .then_ignore(Lexeme::Const.parser())
    .then(xpr_parser(ctx.clone()).repeated().at_least(1))
    .map(move |((name, _), value)| Constant {
      name,
      value: if let Ok(ex) = value.iter().exactly_one() {
        ex.clone()
      } else {
        let start = value
          .first()
          .expect("value cannot be empty")
          .location
          .range()
          .expect("all locations in parsed source are known")
          .start;
        let end = value
          .last()
          .expect("asserted right above")
          .location
          .range()
          .expect("all locations in parsed source are known")
          .end;
        Expr {
          location: Location::Range { file: ctx.file(), range: start..end },
          value: Clause::S('(', Rc::new(value)),
        }
      },
    })
}

@@ -60,56 +72,61 @@ pub fn collect_errors<T, E: chumsky::Error<T>>(e: Vec<E>) -> E {
}

fn namespace_parser<'a>(
  line: impl SimpleParser<Entry, FileEntry> + 'a,
) -> impl SimpleParser<Entry, Namespace> + 'a {
  Lexeme::Namespace
    .parser()
    .ignore_then(filter_map_lex(enum_filter!(Lexeme::Name)))
    .then(
      any()
        .repeated()
        .delimited_by(Lexeme::LP('(').parser(), Lexeme::RP('(').parser())
        .try_map(move |body, _| {
          split_lines(&body)
            .map(|l| line.parse(l))
            .collect::<Result<Vec<_>, _>>()
            .map_err(collect_errors)
        }),
    )
    .map(move |((name, _), body)| Namespace { name, body })
}

fn member_parser<'a>(
  line: impl SimpleParser<Entry, FileEntry> + 'a,
  ctx: impl Context + 'a,
) -> impl SimpleParser<Entry, Member> + 'a {
  choice((
    namespace_parser(line).map(Member::Namespace),
    rule_parser(ctx.clone()).map(Member::Rule),
    const_parser(ctx).map(Member::Constant),
  ))
}

pub fn line_parser<'a>(
  ctx: impl Context + 'a,
) -> impl SimpleParser<Entry, FileEntry> + 'a {
  recursive(|line: SimpleRecursive<Entry, FileEntry>| {
    choice((
      // In case the usercode wants to parse doc
      filter_map_lex(enum_filter!(Lexeme >> FileEntry; Comment))
        .map(|(ent, _)| ent),
      // plain old imports
      Lexeme::Import
        .parser()
        .ignore_then(import_parser(ctx.clone()).map(FileEntry::Import)),
      Lexeme::Export.parser().ignore_then(choice((
        // token collection
        Lexeme::NS
          .parser()
          .ignore_then(
            filter_map_lex(enum_filter!(Lexeme::Name))
              .map(|(e, _)| e)
              .separated_by(Lexeme::Name(ctx.interner().i(",")).parser())
              .delimited_by(Lexeme::LP('(').parser(), Lexeme::RP('(').parser()),
          )
          .map(FileEntry::Export),
        // public declaration
        member_parser(line.clone(), ctx.clone()).map(FileEntry::Exported),
      ))),
      // This could match almost anything so it has to go last
      member_parser(line, ctx).map(FileEntry::Internal),
@@ -123,13 +140,13 @@ pub fn split_lines(data: &[Entry]) -> impl Iterator<Item = &[Entry]> {
  let mut finished = false;
  iter::from_fn(move || {
    let mut paren_count = 0;
    for (i, Entry { lexeme, .. }) in source.by_ref() {
      match lexeme {
        Lexeme::LP(_) => paren_count += 1,
        Lexeme::RP(_) => paren_count -= 1,
        Lexeme::BR if paren_count == 0 => {
          let begin = last_slice;
          last_slice = i + 1;
          return Some(&data[begin..i]);
        },
        _ => (),
@@ -138,8 +155,9 @@ pub fn split_lines(data: &[Entry]) -> impl Iterator<Item = &[Entry]> {
    // Include last line even without trailing newline
    if !finished {
      finished = true;
      return Some(&data[last_slice..]);
    }
    None
  })
  .filter(|s| !s.is_empty())
}
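The core idea of `split_lines` above is to split the token stream at line breaks only when no parenthesis is open. A simplified, hypothetical stand-in follows; it uses a bare token enum instead of the crate's `Entry`/`Lexeme` types, purely to illustrate the depth tracking:

```rust
enum Tok { LP, RP, Br, Atom }

fn split_lines(data: &[Tok]) -> Vec<&[Tok]> {
  let mut out = Vec::new();
  let mut depth = 0usize;
  let mut start = 0usize;
  for (i, tok) in data.iter().enumerate() {
    match tok {
      Tok::LP => depth += 1,
      Tok::RP => depth = depth.saturating_sub(1),
      // Only a break at depth zero starts a new line
      Tok::Br if depth == 0 => {
        out.push(&data[start..i]);
        start = i + 1;
      },
      _ => (),
    }
  }
  out.push(&data[start..]); // keep the last line even without a trailing break
  out.into_iter().filter(|line| !line.is_empty()).collect()
}

fn main() {
  use Tok::*;
  let toks = [Atom, LP, Br, Atom, RP, Br, Atom];
  // The break inside the parentheses does not start a new line.
  assert_eq!(split_lines(&toks).len(), 2);
}
```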
@@ -1,7 +1,10 @@
use chumsky::prelude::*;
use chumsky::{self, Parser};

use super::decls::SimpleParser;

/// Parses a text character that is not the specified delimiter
fn text_parser(delim: char) -> impl SimpleParser<char, char> {
  // Copied directly from Chumsky's JSON example.
  let escape = just('\\').ignore_then(
    just('\\')
@@ -12,7 +15,8 @@ fn text_parser(delim: char) -> impl Parser<char, char, Error = Simple<char>> {
      .or(just('n').to('\n'))
      .or(just('r').to('\r'))
      .or(just('t').to('\t'))
      .or(
        just('u').ignore_then(
          filter(|c: &char| c.is_ascii_hexdigit())
            .repeated()
            .exactly(4)
@@ -24,23 +28,26 @@ fn text_parser(delim: char) -> impl Parser<char, char, Error = Simple<char>> {
              '\u{FFFD}' // unicode replacement character
            })
          }),
        ),
      ),
  );
  filter(move |&c| c != '\\' && c != delim).or(escape)
}

/// Parse a character literal between single quotes
pub fn char_parser() -> impl SimpleParser<char, char> {
  just('\'').ignore_then(text_parser('\'')).then_ignore(just('\''))
}

/// Parse a string between double quotes
pub fn str_parser() -> impl SimpleParser<char, String> {
  just('"')
    .ignore_then(
      text_parser('"').map(Some)
        .or(just("\\\n").map(|_| None)) // Newlines preceded by backslashes are ignored.
        .repeated(),
    )
    .then_ignore(just('"'))
    .flatten()
    .collect()
}
@@ -1,15 +1,15 @@
mod module_not_found;
mod not_exported;
mod parse_error_with_path;
mod project_error;
mod too_many_supers;
mod unexpected_directory;
mod visibility_mismatch;

pub use module_not_found::ModuleNotFound;
pub use not_exported::NotExported;
pub use parse_error_with_path::ParseErrorWithPath;
pub use project_error::{ErrorPosition, ProjectError};
pub use too_many_supers::TooManySupers;
pub use unexpected_directory::UnexpectedDirectory;
pub use visibility_mismatch::VisibilityMismatch;
@@ -1,12 +1,12 @@
use super::{ErrorPosition, ProjectError};
use crate::utils::iter::box_once;
use crate::utils::BoxedIter;

/// Error produced when an import refers to a nonexistent module
#[derive(Clone, Debug, PartialEq, Eq, Hash)]
pub struct ModuleNotFound {
  pub file: Vec<String>,
  pub subpath: Vec<String>,
}
impl ProjectError for ModuleNotFound {
  fn description(&self) -> &str {
@@ -1,8 +1,8 @@
use std::rc::Rc;

use super::{ErrorPosition, ProjectError};
use crate::representations::location::Location;
use crate::utils::BoxedIter;

#[derive(Debug)]
pub struct NotExported {
@@ -16,21 +16,21 @@ impl ProjectError for NotExported {
    "An import refers to a symbol that exists but isn't exported"
  }
  fn positions(&self) -> BoxedIter<ErrorPosition> {
    Box::new(
      [
        ErrorPosition {
          location: Location::File(Rc::new(self.file.clone())),
          message: Some(format!("{} isn't exported", self.subpath.join("::"))),
        },
        ErrorPosition {
          location: Location::File(Rc::new(self.referrer_file.clone())),
          message: Some(format!(
            "{} cannot see this symbol",
            self.referrer_subpath.join("::")
          )),
        },
      ]
      .into_iter(),
    )
  }
}
@@ -1,21 +1,21 @@
use std::rc::Rc;

use super::{ErrorPosition, ProjectError};
use crate::parse::ParseError;
use crate::representations::location::Location;
use crate::utils::BoxedIter;

/// Produced by stages that parse text when it fails.
#[derive(Debug)]
pub struct ParseErrorWithPath {
  pub full_source: String,
  pub path: Vec<String>,
  pub error: ParseError,
}
impl ProjectError for ParseErrorWithPath {
  fn description(&self) -> &str {
    "Failed to parse code"
  }
  fn positions(&self) -> BoxedIter<ErrorPosition> {
    match &self.error {
      ParseError::Lex(lex) => Box::new(lex.iter().map(|s| ErrorPosition {
@@ -23,14 +23,19 @@ impl ProjectError for ParseErrorWithPath {
          file: Rc::new(self.path.clone()),
          range: s.span(),
        },
        message: Some(s.to_string()),
      })),
      ParseError::Ast(ast) => Box::new(ast.iter().map(|(_i, s)| {
        ErrorPosition {
          location: s
            .found()
            .map(|e| Location::Range {
              file: Rc::new(self.path.clone()),
              range: e.range.clone(),
            })
            .unwrap_or_else(|| Location::File(Rc::new(self.path.clone()))),
          message: Some(s.label().unwrap_or("Parse error").to_string()),
        }
      })),
    }
  }
@@ -8,7 +8,7 @@ use crate::utils::BoxedIter;
/// processing got stuck, a command that is likely to be incorrect
pub struct ErrorPosition {
  pub location: Location,
  pub message: Option<String>,
}

impl ErrorPosition {
@@ -24,12 +24,17 @@ pub trait ProjectError: Debug {
  /// A general description of this type of error
  fn description(&self) -> &str;
  /// A formatted message that includes specific parameters
  fn message(&self) -> String {
    String::new()
  }
  /// Code positions relevant to this error
  fn positions(&self) -> BoxedIter<ErrorPosition>;
  /// Convert the error into an [Rc<dyn ProjectError>] to be able to
  /// handle various errors together
  fn rc(self) -> Rc<dyn ProjectError>
  where
    Self: Sized + 'static,
  {
    Rc::new(self)
  }
}
@@ -41,7 +46,9 @@ impl Display for dyn ProjectError {
    let positions = self.positions();
    write!(f, "Problem with the project: {description}; {message}")?;
    for ErrorPosition { location, message } in positions {
      write!(
        f,
        "@{location}: {}",
        message.unwrap_or("location of interest".to_string())
      )?
    }
@@ -1,8 +1,9 @@
use std::rc::Rc;

use super::{ErrorPosition, ProjectError};
use crate::representations::location::Location;
use crate::utils::iter::box_once;
use crate::utils::BoxedIter;

/// Error produced when an import path starts with more `super` segments
/// than the current module's absolute path
@@ -10,12 +11,12 @@ use super::{ProjectError, ErrorPosition};
pub struct TooManySupers {
  pub path: Vec<String>,
  pub offender_file: Vec<String>,
  pub offender_mod: Vec<String>,
}
impl ProjectError for TooManySupers {
  fn description(&self) -> &str {
    "an import path starts with more `super` segments than the current \
    module's absolute path"
  }
  fn message(&self) -> String {
    format!(
@@ -32,7 +33,7 @@ impl ProjectError for TooManySupers {
        "path {} in {} contains too many `super` steps.",
        self.path.join("::"),
        self.offender_mod.join("::")
      )),
    })
  }
}
@@ -1,18 +1,17 @@
use super::{ErrorPosition, ProjectError};
use crate::utils::iter::box_once;
use crate::utils::BoxedIter;

/// Produced when a stage that deals specifically with code encounters
/// a path that refers to a directory
#[derive(Debug)]
pub struct UnexpectedDirectory {
  pub path: Vec<String>,
}
impl ProjectError for UnexpectedDirectory {
  fn description(&self) -> &str {
    "A stage that deals specifically with code encountered a path that refers \
    to a directory"
  }
  fn positions(&self) -> BoxedIter<ErrorPosition> {
    box_once(ErrorPosition::just_file(self.path.clone()))
@@ -1,13 +1,14 @@
use std::rc::Rc;

use super::project_error::{ErrorPosition, ProjectError};
use crate::representations::location::Location;
use crate::utils::iter::box_once;
use crate::utils::BoxedIter;

#[derive(Debug)]
pub struct VisibilityMismatch {
  pub namespace: Vec<String>,
  pub file: Rc<Vec<String>>,
}
impl ProjectError for VisibilityMismatch {
  fn description(&self) -> &str {
@@ -19,7 +20,7 @@ impl ProjectError for VisibilityMismatch {
      message: Some(format!(
        "{} is opened multiple times with different visibilities",
        self.namespace.join("::")
      )),
    })
  }
}
@@ -1,25 +1,23 @@
use std::path::{Path, PathBuf};
use std::rc::Rc;
use std::{fs, io};

use crate::interner::{Interner, Sym};
use crate::pipeline::error::{
  ErrorPosition, ProjectError, UnexpectedDirectory,
};
use crate::utils::iter::box_once;
use crate::utils::{BoxedIter, Cache};

#[derive(Debug)]
pub struct FileLoadingError {
  file: io::Error,
  dir: io::Error,
  path: Vec<String>,
}
impl ProjectError for FileLoadingError {
  fn description(&self) -> &str {
    "Neither a file nor a directory could be read from the requested path"
  }
  fn positions(&self) -> BoxedIter<ErrorPosition> {
    box_once(ErrorPosition::just_file(self.path.clone()))
@@ -37,57 +35,55 @@ pub enum Loaded {
  Collection(Rc<Vec<String>>),
}
impl Loaded {
  pub fn is_code(&self) -> bool {
    matches!(self, Loaded::Code(_))
  }
}

pub type IOResult = Result<Loaded, Rc<dyn ProjectError>>;

pub type FileCache<'a> = Cache<'a, Sym, IOResult>;

/// Load a file from a path expressed in Rust strings, but relative to
/// a root expressed as an OS Path.
pub fn load_file(root: &Path, path: &[impl AsRef<str>]) -> IOResult {
  let full_path = path.iter().fold(root.to_owned(), |p, s| p.join(s.as_ref()));
  let file_path = full_path.with_extension("orc");
  let file_error = match fs::read_to_string(file_path) {
    Ok(string) => return Ok(Loaded::Code(Rc::new(string))),
    Err(err) => err,
  };
  let dir = match fs::read_dir(&full_path) {
    Ok(dir) => dir,
    Err(dir_error) =>
      return Err(
        FileLoadingError {
          file: file_error,
          dir: dir_error,
          path: path.iter().map(|s| s.as_ref().to_string()).collect(),
        }
        .rc(),
      ),
  };
  let names = dir
    .filter_map(Result::ok)
    .filter_map(|ent| {
      let fname = ent.file_name().into_string().ok()?;
      let ftyp = ent.metadata().ok()?.file_type();
      Some(if ftyp.is_dir() {
        fname
      } else {
        fname.strip_suffix(".or")?.to_string()
      })
    })
    .collect();
  Ok(Loaded::Collection(Rc::new(names)))
}

/// Generates a cached file loader for a directory
pub fn mk_cache(root: PathBuf, i: &Interner) -> FileCache {
  Cache::new(move |token: Sym, _this| -> IOResult {
    let path = i.r(token).iter().map(|t| i.r(*t).as_str()).collect::<Vec<_>>();
    load_file(&root, &path)
  })
}
@@ -95,12 +91,18 @@ pub fn mk_cache(root: PathBuf, i: &Interner) -> FileCache {
/// Loads the string contents of a file at the given location.
/// If the path points to a directory, raises an error.
pub fn load_text(
  path: Sym,
  load_file: &impl Fn(Sym) -> IOResult,
  i: &Interner,
) -> Result<Rc<String>, Rc<dyn ProjectError>> {
  if let Loaded::Code(s) = load_file(path)? {
    Ok(s)
  } else {
    Err(
      UnexpectedDirectory {
        path: i.r(path).iter().map(|t| i.r(*t)).cloned().collect(),
      }
      .rc(),
    )
  }
}
@@ -1,32 +1,32 @@
use std::rc::Rc;

use super::error::{ProjectError, TooManySupers};
use crate::interner::{Interner, Tok};
use crate::representations::sourcefile::absolute_path;
use crate::utils::Substack;

pub fn import_abs_path(
  src_path: &[Tok<String>],
  mod_stack: Substack<Tok<String>>,
  import_path: &[Tok<String>],
  i: &Interner,
) -> Result<Vec<Tok<String>>, Rc<dyn ProjectError>> {
  // path of module within file
  let mod_pathv = mod_stack.iter().rev_vec_clone();
  // path of module within compilation
  let abs_pathv = src_path
    .iter()
    .copied()
    .chain(mod_pathv.iter().copied())
    .collect::<Vec<_>>();
  // preload-target path relative to module
  // preload-target path within compilation
  absolute_path(&abs_pathv, import_path, i).map_err(|_| {
    TooManySupers {
      path: import_path.iter().map(|t| i.r(*t)).cloned().collect(),
      offender_file: src_path.iter().map(|t| i.r(*t)).cloned().collect(),
      offender_mod: mod_pathv.iter().map(|t| i.r(*t)).cloned().collect(),
    }
    .rc()
  })
}
@@ -1,18 +1,20 @@
use std::hash::Hash;

use hashbrown::{HashMap, HashSet};

use crate::interner::Sym;

#[derive(Clone, Debug, Default)]
pub struct AliasMap {
  pub targets: HashMap<Sym, Sym>,
  pub aliases: HashMap<Sym, HashSet<Sym>>,
}
impl AliasMap {
  pub fn new() -> Self {
    Self::default()
  }

  pub fn link(&mut self, alias: Sym, target: Sym) {
    let prev = self.targets.insert(alias, target);
    debug_assert!(prev.is_none(), "Alias already has a target");
    multimap_entry(&mut self.aliases, &target).insert(alias);
@@ -21,9 +23,7 @@ impl AliasMap {
    for alt in alts {
      // Assert that this step has always been done in the past
      debug_assert!(
        self.aliases.get(&alt).map(HashSet::is_empty).unwrap_or(true),
        "Alias set of alias not empty"
      );
      debug_assert!(
@@ -35,7 +35,7 @@ impl AliasMap {
    }
  }

  pub fn resolve(&self, alias: Sym) -> Option<Sym> {
    self.targets.get(&alias).copied()
  }
}
@@ -44,9 +44,10 @@ impl AliasMap {
/// map-to-set (aka. multimap)
fn multimap_entry<'a, K: Eq + Hash + Clone, V>(
  map: &'a mut HashMap<K, HashSet<V>>,
  key: &'_ K,
) -> &'a mut HashSet<V> {
  map
    .raw_entry_mut()
    .from_key(key)
    .or_insert_with(|| (key.clone(), HashSet::new()))
    .1
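`multimap_entry` uses hashbrown's `raw_entry_mut().from_key(key)` so the key is only cloned when a brand-new set actually has to be inserted. A hypothetical sketch of the same map-to-set pattern with the plainer `entry` API (which clones the key on every call) looks like this; the `multimap_insert` helper is invented for illustration and assumes the `hashbrown` crate:

```rust
use std::hash::Hash;

use hashbrown::{HashMap, HashSet};

// Insert `value` into the set stored under `key`, creating the set on first use.
// Returns true when the value was not already present.
fn multimap_insert<K: Eq + Hash + Clone, V: Eq + Hash>(
  map: &mut HashMap<K, HashSet<V>>,
  key: &K,
  value: V,
) -> bool {
  map.entry(key.clone()).or_default().insert(value)
}

fn main() {
  let mut aliases: HashMap<String, HashSet<&str>> = HashMap::new();
  let target = "target".to_string();
  assert!(multimap_insert(&mut aliases, &target, "alias_a"));
  assert!(multimap_insert(&mut aliases, &target, "alias_b"));
  assert!(!multimap_insert(&mut aliases, &target, "alias_a")); // already present
  assert_eq!(aliases[&target].len(), 2);
}
```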
@@ -2,22 +2,28 @@ use std::rc::Rc;

use hashbrown::HashMap;

use super::alias_map::AliasMap;
use super::decls::InjectedAsFn;
use crate::ast::{Expr, Rule};
use crate::interner::{Interner, Sym, Tok};
use crate::pipeline::{ProjectExt, ProjectModule};
use crate::representations::tree::{ModEntry, ModMember};
use crate::utils::Substack;

fn resolve(
  token: Sym,
  alias_map: &AliasMap,
  i: &Interner,
) -> Option<Vec<Tok<String>>> {
  if let Some(alias) = alias_map.resolve(token) {
    Some(i.r(alias).clone())
  } else if let Some((foot, body)) = i.r(token).split_last() {
    let mut new_beginning = resolve(i.i(body), alias_map, i)?;
    new_beginning.push(*foot);
    Some(new_beginning)
  } else {
    None
  }
}

fn process_expr(
@@ -26,74 +32,91 @@ fn process_expr(
  injected_as: &impl InjectedAsFn,
  i: &Interner,
) -> Expr {
  expr
    .map_names(&|n| {
      injected_as(&i.r(n)[..]).or_else(|| {
        let next_v = resolve(n, alias_map, i)?;
        // println!("Resolved alias {} to {}",
        //   i.extern_vec(n).join("::"),
        //   i.extern_all(&next_v).join("::")
        // );
        Some(injected_as(&next_v).unwrap_or_else(|| i.i(&next_v)))
      })
    })
    .unwrap_or_else(|| expr.clone())
}

// TODO: replace is_injected with injected_as
/// Replace all aliases with the name they're originally defined as
fn apply_aliases_rec(
  path: Substack<Tok<String>>,
  module: &ProjectModule,
  alias_map: &AliasMap,
  i: &Interner,
  injected_as: &impl InjectedAsFn,
) -> ProjectModule {
  let items = module
    .items
    .iter()
    .map(|(name, ent)| {
      let ModEntry { exported, member } = ent;
      let member = match member {
        ModMember::Item(expr) =>
          ModMember::Item(process_expr(expr, alias_map, injected_as, i)),
        ModMember::Sub(module) => {
          let subpath = path.push(*name);
          let is_ignored =
            injected_as(&subpath.iter().rev_vec_clone()).is_some();
          let new_mod = if is_ignored {
            module.clone()
          } else {
            let module = module.as_ref();
            Rc::new(apply_aliases_rec(
              subpath,
              module,
              alias_map,
              i,
              injected_as,
            ))
          };
          ModMember::Sub(new_mod)
        },
      };
      (*name, ModEntry { exported: *exported, member })
    })
    .collect::<HashMap<_, _>>();
  let rules = module
    .extra
    .rules
    .iter()
    .map(|rule| {
      let Rule { source, prio, target } = rule;
      Rule {
        prio: *prio,
        source: Rc::new(
          source
            .iter()
            .map(|expr| process_expr(expr, alias_map, injected_as, i))
            .collect::<Vec<_>>(),
        ),
        target: Rc::new(
          target
            .iter()
            .map(|expr| process_expr(expr, alias_map, injected_as, i))
            .collect::<Vec<_>>(),
        ),
      }
    })
    .collect::<Vec<_>>();
  ProjectModule {
    items,
    imports: module.imports.clone(),
    extra: ProjectExt {
      rules,
      exports: module.extra.exports.clone(),
      file: module.extra.file.clone(),
      imports_from: module.extra.imports_from.clone(),
    },
  }
}
@@ -1,62 +1,70 @@
use std::rc::Rc;

use super::alias_map::AliasMap;
use super::decls::InjectedAsFn;
use crate::interner::{Interner, Tok};
use crate::pipeline::error::{NotExported, ProjectError};
use crate::pipeline::project_tree::{split_path, ProjectModule, ProjectTree};
use crate::representations::tree::{ModMember, WalkErrorKind};
use crate::utils::{pushed, Substack};

/// Assert that a module identified by a path can see a given symbol
fn assert_visible(
  source: &[Tok<String>], // must point to a file or submodule
  target: &[Tok<String>], // may point to a symbol or module of any kind
  project: &ProjectTree,
  i: &Interner,
) -> Result<(), Rc<dyn ProjectError>> {
  let (tgt_item, tgt_path) = if let Some(s) = target.split_last() {
    s
  } else {
    return Ok(());
  };
  let shared_len =
    source.iter().zip(tgt_path.iter()).take_while(|(a, b)| a == b).count();
  let shared_root =
    project.0.walk(&tgt_path[..shared_len], false).expect("checked in parsing");
  let direct_parent =
    shared_root.walk(&tgt_path[shared_len..], true).map_err(|e| {
      match e.kind {
        WalkErrorKind::Missing => panic!("checked in parsing"),
        WalkErrorKind::Private => {
          let full_path = &tgt_path[..shared_len + e.pos];
          let (file, sub) = split_path(full_path, project);
          let (ref_file, ref_sub) = split_path(source, project);
          NotExported {
            file: i.extern_all(file),
            subpath: i.extern_all(sub),
            referrer_file: i.extern_all(ref_file),
            referrer_subpath: i.extern_all(ref_sub),
          }
          .rc()
        },
      }
    })?;
  let tgt_item_exported = direct_parent.extra.exports.contains_key(tgt_item);
  let target_prefixes_source =
    shared_len == tgt_path.len() && source.get(shared_len) == Some(tgt_item);
  if !tgt_item_exported && !target_prefixes_source {
    let (file, sub) = split_path(target, project);
    let (ref_file, ref_sub) = split_path(source, project);
    Err(
      NotExported {
        file: i.extern_all(file),
        subpath: i.extern_all(sub),
        referrer_file: i.extern_all(ref_file),
        referrer_subpath: i.extern_all(ref_sub),
      }
      .rc(),
    )
  } else {
    Ok(())
  }
}

/// Populate target and alias maps from the module tree recursively
fn collect_aliases_rec(
  path: Substack<Tok<String>>,
  module: &ProjectModule,
  project: &ProjectTree,
  alias_map: &mut AliasMap,
@@ -65,7 +73,9 @@ fn collect_aliases_rec(
) -> Result<(), Rc<dyn ProjectError>> {
  // Assume injected module has been alias-resolved
  let mod_path_v = path.iter().rev_vec_clone();
  if injected_as(&mod_path_v).is_some() {
    return Ok(());
  };
  for (&name, &target_mod) in module.extra.imports_from.iter() {
    let target_mod_v = i.r(target_mod);
    let target_sym_v = pushed(target_mod_v, name);
@@ -78,11 +88,16 @@ fn collect_aliases_rec(
  for (&name, entry) in module.items.iter() {
    let submodule = if let ModMember::Sub(s) = &entry.member {
      s.as_ref()
    } else {
      continue;
    };
    collect_aliases_rec(
      path.push(name),
      submodule,
      project,
      alias_map,
      i,
      injected_as,
    )?
  }
  Ok(())
@@ -97,7 +112,11 @@ pub fn collect_aliases(
  injected_as: &impl InjectedAsFn,
) -> Result<(), Rc<dyn ProjectError>> {
  collect_aliases_rec(
    Substack::Bottom,
    module,
    project,
    alias_map,
    i,
    injected_as,
  )
}
@@ -1,5 +1,3 @@
use crate::interner::{Sym, Tok};

pub trait InjectedAsFn = Fn(&[Tok<String>]) -> Option<Sym>;
@@ -1,7 +1,7 @@
mod alias_map;
mod apply_aliases;
mod collect_aliases;
mod decls;
mod resolve_imports;

pub use resolve_imports::resolve_imports;
@@ -2,16 +2,14 @@ use std::rc::Rc;

use itertools::Itertools;

use super::alias_map::AliasMap;
use super::apply_aliases::apply_aliases;
use super::collect_aliases::collect_aliases;
use super::decls::InjectedAsFn;
use crate::interner::Interner;
use crate::pipeline::error::ProjectError;
use crate::pipeline::project_tree::ProjectTree;

/// Follow import chains to locate the original name of all tokens, then
/// replace these aliases with the original names throughout the tree
pub fn resolve_imports(
@@ -20,14 +18,14 @@ pub fn resolve_imports(
  injected_as: &impl InjectedAsFn,
) -> Result<ProjectTree, Rc<dyn ProjectError>> {
  let mut map = AliasMap::new();
  collect_aliases(project.0.as_ref(), &project, &mut map, i, injected_as)?;
  println!(
    "Aliases: {{{:?}}}",
    map
      .targets
      .iter()
      .map(|(kt, vt)| format!(
        "{} => {}",
        i.extern_vec(*kt).join("::"),
        i.extern_vec(*vt).join("::")
      ))
@@ -1,19 +1,14 @@
pub mod error;
pub mod file_loader;
mod import_abs_path;
mod import_resolution;
mod parse_layer;
mod project_tree;
mod source_loader;
mod split_name;

pub use parse_layer::parse_layer;
pub use project_tree::{
  collect_consts, collect_rules, from_const_tree, ConstTree, ProjectExt,
  ProjectModule, ProjectTree,
};
@@ -1,13 +1,10 @@
use std::rc::Rc;

use super::error::ProjectError;
use super::file_loader::IOResult;
use super::{import_resolution, project_tree, source_loader, ProjectTree};
use crate::interner::{Interner, Sym, Tok};
use crate::representations::sourcefile::FileEntry;

/// Using an IO callback, produce a project tree that includes the given
/// target symbols or files if they're defined.
@@ -17,34 +14,31 @@ use super::ProjectTree;
/// prelude which will be prepended to each individual file. Since the
/// prelude gets compiled with each file, normally it should be a glob
/// import pointing to a module in the environment.
pub fn parse_layer(
  targets: &[Sym],
  loader: &impl Fn(Sym) -> IOResult,
  environment: &ProjectTree,
  prelude: &[FileEntry],
  i: &Interner,
) -> Result<ProjectTree, Rc<dyn ProjectError>> {
  // A path is injected if it is walkable in the injected tree
  let injected_as = |path: &[Tok<String>]| {
    let (item, modpath) = path.split_last()?;
    let module = environment.0.walk(modpath, false).ok()?;
    let inj = module.extra.exports.get(item).copied()?;
    Some(inj)
  };
  let injected_names = |path: Tok<Vec<Tok<String>>>| {
    let module = environment.0.walk(&i.r(path)[..], false).ok()?;
    Some(Rc::new(module.extra.exports.keys().copied().collect()))
  };
  let source =
    source_loader::load_source(targets, prelude, i, loader, &|path| {
      injected_as(path).is_some()
    })?;
  let tree = project_tree::build_tree(source, i, prelude, &injected_names)?;
  let sum = ProjectTree(Rc::new(
    environment.0.as_ref().clone() + tree.0.as_ref().clone(),
  ));
  let resolvd = import_resolution::resolve_imports(sum, i, &injected_as)?;
  // Addition among modules favours the left hand side.
@@ -1,23 +1,19 @@
use crate::interner::Tok;
use crate::representations::sourcefile::{FileEntry, Member, Namespace};

fn member_rec(
  // object
  member: Member,
  // context
  path: &[Tok<String>],
  prelude: &[FileEntry],
) -> Member {
  match member {
    Member::Namespace(Namespace { name, body }) => {
      let new_body = entv_rec(body, path, prelude);
      Member::Namespace(Namespace { name, body: new_body })
    },
    any => any,
  }
}

@@ -25,27 +21,25 @@ fn entv_rec(
  // object
  data: Vec<FileEntry>,
  // context
  mod_path: &[Tok<String>],
  prelude: &[FileEntry],
) -> Vec<FileEntry> {
  prelude
    .iter()
    .cloned()
    .chain(data.into_iter().map(|ent| match ent {
      FileEntry::Exported(mem) =>
        FileEntry::Exported(member_rec(mem, mod_path, prelude)),
      FileEntry::Internal(mem) =>
        FileEntry::Internal(member_rec(mem, mod_path, prelude)),
      any => any,
    }))
    .collect()
}

pub fn add_prelude(
  data: Vec<FileEntry>,
  path: &[Tok<String>],
  prelude: &[FileEntry],
) -> Vec<FileEntry> {
  entv_rec(data, path, prelude)
@@ -2,38 +2,41 @@ use std::rc::Rc;

use hashbrown::HashMap;

use super::collect_ops::InjectedOperatorsFn;
use super::parse_file::parse_file;
use super::{collect_ops, ProjectExt, ProjectTree};
use crate::ast::{Constant, Expr};
use crate::interner::{Interner, Tok};
use crate::pipeline::error::ProjectError;
use crate::pipeline::source_loader::{LoadedSource, LoadedSourceTable};
use crate::representations::sourcefile::{absolute_path, FileEntry, Member};
use crate::representations::tree::{ModEntry, ModMember, Module};
use crate::utils::iter::{box_empty, box_once};
use crate::utils::{pushed, Substack};

#[derive(Debug)]
struct ParsedSource<'a> {
  path: Vec<Tok<String>>,
  loaded: &'a LoadedSource,
  parsed: Vec<FileEntry>,
}

pub fn split_path<'a>(
  path: &'a [Tok<String>],
  proj: &'a ProjectTree,
) -> (&'a [Tok<String>], &'a [Tok<String>]) {
  let (end, body) = if let Some(s) = path.split_last() {
    s
  } else {
    return (&[], &[]);
  };
  let mut module =
    proj.0.walk(body, false).expect("invalid path cannot be split");
  if let ModMember::Sub(m) = &module.items[end].member {
    module = m.clone();
  }
  let file =
    module.extra.file.as_ref().map(|s| &path[..s.len()]).unwrap_or(path);
  let subpath = &path[file.len()..];
  (file, subpath)
}
@@ -41,7 +44,7 @@ pub fn split_path<'a>(path: &'a [Token<String>], proj: &'a ProjectTree)
/// Convert normalized, prefixed source into a module
fn source_to_module(
  // level
  path: Substack<Tok<String>>,
  preparsed: &Module<impl Clone, impl Clone>,
  // data
  data: Vec<FileEntry>,
@@ -50,35 +53,38 @@ fn source_to_module(
  filepath_len: usize,
) -> Rc<Module<Expr, ProjectExt>> {
  let path_v = path.iter().rev_vec_clone();
  let imports = data
    .iter()
    .filter_map(|ent| {
      if let FileEntry::Import(impv) = ent {
        Some(impv.iter())
      } else {
        None
      }
    })
    .flatten()
    .cloned()
    .collect::<Vec<_>>();
  let imports_from = imports
    .iter()
    .map(|imp| {
      let mut imp_path_v = i.r(imp.path).clone();
      imp_path_v.push(imp.name.expect("imports normalized"));
      let mut abs_path =
        absolute_path(&path_v, &imp_path_v, i).expect("tested in preparsing");
      let name = abs_path.pop().expect("importing the global context");
      (name, i.i(&abs_path))
    })
    .collect::<HashMap<_, _>>();
  let exports = data
    .iter()
    .flat_map(|ent| {
      let mk_ent = |name| (name, i.i(&pushed(&path_v, name)));
      match ent {
        FileEntry::Export(names) => Box::new(names.iter().copied().map(mk_ent)),
        FileEntry::Exported(mem) => match mem {
          Member::Constant(constant) => box_once(mk_ent(constant.name)),
          Member::Namespace(ns) => box_once(mk_ent(ns.name)),
          Member::Rule(rule) => {
            let mut names = Vec::new();
            for e in rule.source.iter() {
@@ -89,13 +95,14 @@ fn source_to_module(
              })
            }
            Box::new(names.into_iter())
          },
        },
        _ => box_empty(),
      }
    })
    .collect::<HashMap<_, _>>();
  let rules = data
    .iter()
    .filter_map(|ent| match ent {
      FileEntry::Exported(Member::Rule(rule)) => Some(rule),
      FileEntry::Internal(Member::Rule(rule)) => Some(rule),
@@ -103,38 +110,51 @@ fn source_to_module(
    })
    .cloned()
    .collect::<Vec<_>>();
  let items = data
    .into_iter()
    .filter_map(|ent| match ent {
      FileEntry::Exported(Member::Namespace(ns)) => {
        let prep_member = &preparsed.items[&ns.name].member;
        let new_prep = if let ModMember::Sub(s) = prep_member {
          s.as_ref()
        } else {
          panic!("preparsed missing a submodule")
        };
        let module = source_to_module(
          path.push(ns.name),
          new_prep,
          ns.body,
          i,
          filepath_len,
        );
        let member = ModMember::Sub(module);
        Some((ns.name, ModEntry { exported: true, member }))
      },
      FileEntry::Internal(Member::Namespace(ns)) => {
        let prep_member = &preparsed.items[&ns.name].member;
        let new_prep = if let ModMember::Sub(s) = prep_member {
          s.as_ref()
        } else {
          panic!("preparsed missing a submodule")
        };
        let module = source_to_module(
          path.push(ns.name),
          new_prep,
          ns.body,
          i,
          filepath_len,
        );
        let member = ModMember::Sub(module);
        Some((ns.name, ModEntry { exported: false, member }))
      },
      FileEntry::Exported(Member::Constant(Constant { name, value })) => {
        let member = ModMember::Item(value);
        Some((name, ModEntry { exported: true, member }))
      },
      FileEntry::Internal(Member::Constant(Constant { name, value })) => {
        let member = ModMember::Item(value);
        Some((name, ModEntry { exported: false, member }))
      },
      _ => None,
    })
    .collect::<HashMap<_, _>>();
@@ -150,15 +170,15 @@ fn source_to_module(
      imports_from,
      exports,
      rules,
      file: Some(path_v[..filepath_len].to_vec()),
    },
  })
}

fn files_to_module(
  path: Substack<Tok<String>>,
  files: &[ParsedSource],
  i: &Interner,
) -> Rc<Module<Expr, ProjectExt>> {
  let lvl = path.len();
  let path_v = path.iter().rev_vec_clone();
@@ -167,19 +187,22 @@ fn files_to_module(
      path,
      files[0].loaded.preparsed.0.as_ref(),
      files[0].parsed.clone(),
      i,
      path.len(),
    );
  }
  let items = files
    .group_by(|a, b| a.path[lvl] == b.path[lvl])
    .map(|files| {
      let namespace = files[0].path[lvl];
      let subpath = path.push(namespace);
      let module = files_to_module(subpath, files, i);
      let member = ModMember::Sub(module);
      (namespace, ModEntry { exported: true, member })
    })
    .collect::<HashMap<_, _>>();
  let exports: HashMap<_, _> = items
    .keys()
    .copied()
    .map(|name| (name, i.i(&pushed(&path_v, name))))
    .collect();
@@ -188,37 +211,43 @@ fn files_to_module(
  // i.extern_all(&path_v[..]).join("::"),
  // exports.keys().map(|t| i.r(*t)).join(", ")
  // );
  Rc::new(Module {
    items,
    imports: vec![],
    extra: ProjectExt {
      exports,
      imports_from: HashMap::new(),
      rules: vec![],
      file: None,
    },
  })
}

pub fn build_tree(
  files: LoadedSourceTable,
  i: &Interner,
  prelude: &[FileEntry],
  injected: &impl InjectedOperatorsFn,
) -> Result<ProjectTree, Rc<dyn ProjectError>> {
  let ops_cache = collect_ops::mk_cache(&files, i, injected);
  let mut entries = files
    .iter()
    .map(|(path, loaded)| {
      Ok((
        i.r(*path),
        loaded,
        parse_file(*path, &files, &ops_cache, i, prelude)?,
      ))
    })
    .collect::<Result<Vec<_>, Rc<dyn ProjectError>>>()?;
  // sort by similarity, then longest-first
  entries.sort_unstable_by(|a, b| a.0.cmp(b.0).reverse());
  let files = entries
    .into_iter()
    .map(|(path, loaded, parsed)| ParsedSource {
      loaded,
      parsed,
      path: path.clone(),
    })
    .collect::<Vec<_>>();
  Ok(ProjectTree(files_to_module(Substack::Bottom, &files, i)))
@@ -4,73 +4,80 @@ use std::rc::Rc;
use hashbrown::HashSet;
use itertools::Itertools;

use crate::interner::{Interner, Sym, Tok};
use crate::pipeline::error::{ModuleNotFound, ProjectError};
use crate::pipeline::source_loader::LoadedSourceTable;
use crate::pipeline::split_name::split_name;
use crate::representations::tree::WalkErrorKind;
use crate::utils::Cache;

pub type OpsResult = Result<Rc<HashSet<Tok<String>>>, Rc<dyn ProjectError>>;
pub type ExportedOpsCache<'a> = Cache<'a, Sym, OpsResult>;

pub trait InjectedOperatorsFn = Fn(Sym) -> Option<Rc<HashSet<Tok<String>>>>;

fn coprefix<T: Eq>(
  l: impl Iterator<Item = T>,
  r: impl Iterator<Item = T>,
) -> usize {
  l.zip(r).take_while(|(a, b)| a == b).count()
}

/// Collect all names exported by the module at the specified path
pub fn collect_exported_ops(
  path: Sym,
  loaded: &LoadedSourceTable,
  i: &Interner,
  injected: &impl InjectedOperatorsFn,
) -> OpsResult {
  if let Some(ops) = injected(path) {
    if path == i.i(&[i.i("prelude")][..]) {
      println!("%%% Prelude exported ops %%%");
      println!("{}", ops.iter().map(|t| i.r(*t)).join(", "));
    }
    return Ok(ops);
  }
  let is_file = |n: &[Tok<String>]| loaded.contains_key(&i.i(n));
  let path_s = &i.r(path)[..];
  let name_split = split_name(path_s, &is_file);
  let (fpath_v, subpath_v) = if let Some(f) = name_split {
    f
  } else {
    return Ok(Rc::new(
      loaded
        .keys()
        .copied()
        .filter_map(|modname| {
          let modname_s = i.r(modname);
          if path_s.len() == coprefix(path_s.iter(), modname_s.iter()) {
            Some(modname_s[path_s.len()])
          } else {
            None
          }
        })
        .collect::<HashSet<_>>(),
    ));
  };
  let fpath = i.i(fpath_v);
  let preparsed = &loaded[&fpath].preparsed;
  let module = preparsed.0.walk(subpath_v, false).map_err(|walk_err| {
    match walk_err.kind {
      WalkErrorKind::Private =>
        unreachable!("visibility is not being checked here"),
      WalkErrorKind::Missing => ModuleNotFound {
        file: i.extern_vec(fpath),
        subpath: subpath_v
          .iter()
          .take(walk_err.pos)
          .map(|t| i.r(*t))
          .cloned()
          .collect(),
      }
      .rc(),
    }
  })?;
  let out: HashSet<_> =
    module.items.iter().filter(|(_, v)| v.exported).map(|(k, _)| *k).collect();
  if path == i.i(&[i.i("prelude")][..]) {
    println!("%%% Prelude exported ops %%%");
    println!("{}", out.iter().map(|t| i.r(*t)).join(", "));
@@ -83,7 +90,5 @@ pub fn mk_cache<'a>(
  i: &'a Interner,
  injected: &'a impl InjectedOperatorsFn,
) -> ExportedOpsCache<'a> {
  Cache::new(|path, _this| collect_exported_ops(path, loaded, i, injected))
}
@@ -2,7 +2,7 @@ mod exported_ops;
mod ops_for;

pub use exported_ops::{
  collect_exported_ops, mk_cache, ExportedOpsCache, InjectedOperatorsFn,
  OpsResult,
};
pub use ops_for::collect_ops_for;
@@ -3,20 +3,19 @@ use std::rc::Rc;
use hashbrown::HashSet;
use itertools::Itertools;

use super::exported_ops::{ExportedOpsCache, OpsResult};
use crate::interner::{Interner, Tok};
use crate::parse::is_op;
use crate::pipeline::error::ProjectError;
use crate::pipeline::import_abs_path::import_abs_path;
use crate::pipeline::source_loader::LoadedSourceTable;
use crate::representations::tree::{ModMember, Module};

/// Collect all operators and names, exported or local, defined in this
/// tree.
fn tree_all_ops(
  module: &Module<impl Clone, impl Clone>,
  ops: &mut HashSet<Tok<String>>,
) {
  ops.extend(module.items.keys().copied());
  for ent in module.items.values() {
@@ -28,21 +27,22 @@ fn tree_all_ops(

/// Collect all names imported in this file
pub fn collect_ops_for(
  file: &[Tok<String>],
  loaded: &LoadedSourceTable,
  ops_cache: &ExportedOpsCache,
  i: &Interner,
) -> OpsResult {
  let tree = &loaded[&i.i(file)].preparsed.0;
  let mut ret = HashSet::new();
  println!("collecting ops for {}", i.extern_all(file).join("::"));
  tree_all_ops(tree.as_ref(), &mut ret);
  tree.visit_all_imports(&mut |modpath, _module, import| {
    if let Some(n) = import.name {
      ret.insert(n);
    } else {
      println!("\tglob import from {}", i.extern_vec(import.path).join("::"));
      let path = import_abs_path(file, modpath, &i.r(import.path)[..], i)
        .expect("This error should have been caught during loading");
      ret.extend(ops_cache.find(&i.i(&path))?.iter().copied());
    }
    Ok::<_, Rc<dyn ProjectError>>(())
@@ -1,26 +1,26 @@
use std::ops::Add;
use std::rc::Rc;

use hashbrown::HashMap;

use super::{ProjectExt, ProjectModule, ProjectTree};
use crate::ast::{Clause, Expr};
use crate::foreign::{Atom, Atomic, ExternFn};
use crate::interner::{Interner, Tok};
use crate::representations::location::Location;
use crate::representations::tree::{ModEntry, ModMember, Module};
use crate::representations::Primitive;
use crate::utils::{pushed, Substack};

pub enum ConstTree {
  Const(Expr),
  Tree(HashMap<Tok<String>, ConstTree>),
}
impl ConstTree {
  pub fn primitive(primitive: Primitive) -> Self {
    Self::Const(Expr {
      location: Location::Unknown,
      value: Clause::P(primitive),
    })
  }
  pub fn xfn(xfn: impl ExternFn + 'static) -> Self {
@@ -29,9 +29,7 @@ impl ConstTree {
  pub fn atom(atom: impl Atomic + 'static) -> Self {
    Self::primitive(Primitive::Atom(Atom(Box::new(atom))))
  }
  pub fn tree(arr: impl IntoIterator<Item = (Tok<String>, Self)>) -> Self {
    Self::Tree(arr.into_iter().collect())
  }
}
@@ -57,27 +55,29 @@ impl Add for ConstTree {
}

fn from_const_tree_rec(
  path: Substack<Tok<String>>,
  consts: HashMap<Tok<String>, ConstTree>,
  file: &[Tok<String>],
  i: &Interner,
) -> ProjectModule {
  let mut items = HashMap::new();
  let path_v = path.iter().rev_vec_clone();
  for (name, item) in consts {
    items.insert(name, ModEntry {
      exported: true,
      member: match item {
        ConstTree::Const(c) => ModMember::Item(c),
        ConstTree::Tree(t) => ModMember::Sub(Rc::new(from_const_tree_rec(
          path.push(name),
          t,
          file,
          i,
        ))),
      },
    });
  }
  let exports =
    items.keys().map(|name| (*name, i.i(&pushed(&path_v, *name)))).collect();
  Module {
    items,
    imports: vec![],
@@ -85,13 +85,13 @@ fn from_const_tree_rec(
      exports,
      file: Some(file.to_vec()),
      ..Default::default()
    },
  }
}

pub fn from_const_tree(
  consts: HashMap<Tok<String>, ConstTree>,
  file: &[Tok<String>],
  i: &Interner,
) -> ProjectTree {
  let module = from_const_tree_rec(Substack::Bottom, consts, file, i);
@@ -1,38 +1,30 @@
// FILE SEPARATION BOUNDARY
//
// Collect all operators accessible in each file, parse the files with
// correct tokenization, resolve glob imports, convert expressions to
// refer to tokens with (local) absolute path, and connect them into a
// single tree.
//
// The module checks for imports from missing modules (including
// submodules). All other errors must be checked later.
//
// Injection strategy:
// Return all items of the given module in the injected tree for
// `injected` The output of this stage is a tree, which can simply be
// overlaid with the injected tree

mod add_prelude;
mod build_tree;
mod collect_ops;
mod const_tree;
mod normalize_imports;
mod parse_file;
mod prefix;
mod tree;

pub use build_tree::{build_tree, split_path};
pub use collect_ops::InjectedOperatorsFn;
pub use const_tree::{from_const_tree, ConstTree};
pub use tree::{
  collect_consts, collect_rules, ProjectExt, ProjectModule, ProjectTree,
};
@@ -1,74 +1,88 @@
use super::collect_ops::ExportedOpsCache;
use crate::interner::{Interner, Tok};
use crate::pipeline::import_abs_path::import_abs_path;
use crate::representations::sourcefile::{
  FileEntry, Import, Member, Namespace,
};
use crate::representations::tree::{ModMember, Module};
use crate::utils::iter::box_once;
use crate::utils::{BoxedIter, Substack};

fn member_rec(
  // level
  mod_stack: Substack<Tok<String>>,
  preparsed: &Module<impl Clone, impl Clone>,
  // object
  member: Member,
  // context
  path: &[Tok<String>],
  ops_cache: &ExportedOpsCache,
  i: &Interner,
) -> Member {
  match member {
    Member::Namespace(Namespace { name, body }) => {
      let prepmember = &preparsed.items[&name].member;
      let subprep = if let ModMember::Sub(m) = prepmember {
        m.clone()
      } else {
        unreachable!("This name must point to a namespace")
      };
      let new_body = entv_rec(
        mod_stack.push(name),
        subprep.as_ref(),
        body,
        path,
        ops_cache,
        i,
      );
      Member::Namespace(Namespace { name, body: new_body })
    },
    any => any,
  }
}

fn entv_rec(
  // level
  mod_stack: Substack<Tok<String>>,
  preparsed: &Module<impl Clone, impl Clone>,
  // object
  data: Vec<FileEntry>,
  // context
  mod_path: &[Tok<String>],
  ops_cache: &ExportedOpsCache,
  i: &Interner,
) -> Vec<FileEntry> {
  data
    .into_iter()
    .map(|ent| match ent {
      FileEntry::Import(imps) => FileEntry::Import(
        imps
          .into_iter()
          .flat_map(|import| {
            if let Import { name: None, path } = import {
              let p = import_abs_path(mod_path, mod_stack, &i.r(path)[..], i)
                .expect("Should have emerged in preparsing");
              let names = ops_cache
                .find(&i.i(&p))
                .expect("Should have emerged in second parsing");
              let imports = names
                .iter()
                .map(move |&n| Import { name: Some(n), path })
                .collect::<Vec<_>>();
              Box::new(imports.into_iter()) as BoxedIter<Import>
            } else {
              box_once(import)
            }
          })
          .collect(),
      ),
      FileEntry::Exported(mem) => FileEntry::Exported(member_rec(
        mod_stack, preparsed, mem, mod_path, ops_cache, i,
      )),
      FileEntry::Internal(mem) => FileEntry::Internal(member_rec(
        mod_stack, preparsed, mem, mod_path, ops_cache, i,
      )),
      any => any,
    })
    .collect()
}
@@ -76,9 +90,9 @@ fn entv_rec(
pub fn normalize_imports(
  preparsed: &Module<impl Clone, impl Clone>,
  data: Vec<FileEntry>,
  path: &[Tok<String>],
  ops_cache: &ExportedOpsCache,
  i: &Interner,
) -> Vec<FileEntry> {
  entv_rec(Substack::Bottom, preparsed, data, path, ops_cache, i)
}
@@ -1,18 +1,17 @@
use std::rc::Rc;

use super::add_prelude::add_prelude;
use super::collect_ops::{collect_ops_for, ExportedOpsCache};
use super::normalize_imports::normalize_imports;
use super::prefix::prefix;
use crate::interner::{Interner, Sym};
use crate::parse;
use crate::pipeline::error::ProjectError;
use crate::pipeline::source_loader::LoadedSourceTable;
use crate::representations::sourcefile::{normalize_namespaces, FileEntry};

pub fn parse_file(
  path: Sym,
  loaded: &LoadedSourceTable,
  ops_cache: &ExportedOpsCache,
  i: &Interner,
@@ -21,24 +20,24 @@ pub fn parse_file(
  let ld = &loaded[&path];
  // let ops_cache = collect_ops::mk_cache(loaded, i);
  let ops = collect_ops_for(&i.r(path)[..], loaded, ops_cache, i)?;
  let ops_vec = ops.iter().map(|t| i.r(*t)).cloned().collect::<Vec<_>>();
  let ctx = parse::ParsingContext {
    interner: i,
    ops: &ops_vec,
    file: Rc::new(i.extern_vec(path)),
  };
  let entries = parse::parse(ld.text.as_str(), ctx)
    .expect("This error should have been caught during loading");
  let with_prelude = add_prelude(entries, &i.r(path)[..], prelude);
  let impnormalized = normalize_imports(
    &ld.preparsed.0,
    with_prelude,
    &i.r(path)[..],
    ops_cache,
    i,
  );
  let nsnormalized = normalize_namespaces(Box::new(impnormalized.into_iter()))
    .expect("This error should have been caught during preparsing");
  let prefixed = prefix(nsnormalized, &i.r(path)[..], ops_cache, i);
  Ok(prefixed)
}
@@ -1,82 +1,78 @@
use std::rc::Rc;

use super::collect_ops::ExportedOpsCache;
use crate::ast::{Constant, Rule};
use crate::interner::{Interner, Tok};
use crate::representations::sourcefile::{FileEntry, Member, Namespace};
use crate::utils::Substack;

fn member_rec(
  // level
  mod_stack: Substack<Tok<String>>,
  // object
  data: Member,
  // context
  path: &[Tok<String>],
  ops_cache: &ExportedOpsCache,
  i: &Interner,
) -> Member {
  // let except = |op| imported.contains(&op);
  let except = |_| false;
  let prefix_v = path
    .iter()
    .copied()
    .chain(mod_stack.iter().rev_vec_clone().into_iter())
    .collect::<Vec<_>>();
  let prefix = i.i(&prefix_v);
  match data {
    Member::Namespace(Namespace { name, body }) => {
      let new_body = entv_rec(mod_stack.push(name), body, path, ops_cache, i);
      Member::Namespace(Namespace { name, body: new_body })
    },
    Member::Constant(constant) => Member::Constant(Constant {
      name: constant.name,
      value: constant.value.prefix(prefix, i, &except),
    }),
    Member::Rule(rule) => Member::Rule(Rule {
      prio: rule.prio,
      source: Rc::new(
        rule.source.iter().map(|e| e.prefix(prefix, i, &except)).collect(),
      ),
      target: Rc::new(
        rule.target.iter().map(|e| e.prefix(prefix, i, &except)).collect(),
      ),
    }),
  }
}

fn entv_rec(
  // level
  mod_stack: Substack<Tok<String>>,
  // object
  data: Vec<FileEntry>,
  // context
  path: &[Tok<String>],
  ops_cache: &ExportedOpsCache,
  i: &Interner,
) -> Vec<FileEntry> {
  data
    .into_iter()
    .map(|fe| match fe {
      FileEntry::Exported(mem) =>
        FileEntry::Exported(member_rec(mod_stack, mem, path, ops_cache, i)),
      FileEntry::Internal(mem) =>
        FileEntry::Internal(member_rec(mod_stack, mem, path, ops_cache, i)),
      // XXX should [FileEntry::Export] be prefixed?
      any => any,
    })
    .collect()
}

pub fn prefix(
  data: Vec<FileEntry>,
  path: &[Tok<String>],
  ops_cache: &ExportedOpsCache,
  i: &Interner,
) -> Vec<FileEntry> {
  entv_rec(Substack::Bottom, data, path, ops_cache, i)
}
```diff
@@ -1,33 +1,36 @@
-use std::{ops::Add, rc::Rc};
+use std::ops::Add;
+use std::rc::Rc;
 
 use hashbrown::HashMap;
 
-use crate::representations::tree::{Module, ModMember};
-use crate::ast::{Rule, Expr};
-use crate::interner::{Token, Interner};
+use crate::ast::{Expr, Rule};
+use crate::interner::{Interner, Sym, Tok};
+use crate::representations::tree::{ModMember, Module};
 use crate::utils::Substack;
 
 #[derive(Clone, Debug, Default)]
-pub struct ProjectExt{
+pub struct ProjectExt {
   /// Pairs each foreign token to the module it was imported from
-  pub imports_from: HashMap<Token<String>, Token<Vec<Token<String>>>>,
+  pub imports_from: HashMap<Tok<String>, Sym>,
   /// Pairs each exported token to its original full name.
-  pub exports: HashMap<Token<String>, Token<Vec<Token<String>>>>,
+  pub exports: HashMap<Tok<String>, Sym>,
   /// All rules defined in this module, exported or not
   pub rules: Vec<Rule>,
   /// Filename, if known, for error reporting
-  pub file: Option<Vec<Token<String>>>
+  pub file: Option<Vec<Tok<String>>>,
 }
 
 impl Add for ProjectExt {
   type Output = Self;
 
   fn add(mut self, rhs: Self) -> Self::Output {
-    let ProjectExt{ imports_from, exports, rules, file } = rhs;
+    let ProjectExt { imports_from, exports, rules, file } = rhs;
     self.imports_from.extend(imports_from.into_iter());
     self.exports.extend(exports.into_iter());
     self.rules.extend(rules.into_iter());
-    if file.is_some() { self.file = file }
+    if file.is_some() {
+      self.file = file
+    }
     self
   }
 }
@@ -51,10 +54,10 @@ pub fn collect_rules(project: &ProjectTree) -> Vec<Rule> {
 }
 
 fn collect_consts_rec(
-  path: Substack<Token<String>>,
-  bag: &mut HashMap<Token<Vec<Token<String>>>, Expr>,
+  path: Substack<Tok<String>>,
+  bag: &mut HashMap<Sym, Expr>,
   module: &ProjectModule,
-  i: &Interner
+  i: &Interner,
 ) {
   for (key, entry) in module.items.iter() {
     match &entry.member {
@@ -62,26 +65,18 @@ fn collect_consts_rec(
         let mut name = path.iter().rev_vec_clone();
         name.push(*key);
         bag.insert(i.i(&name), expr.clone());
-      }
-      ModMember::Sub(module) => {
-        collect_consts_rec(
-          path.push(*key),
-          bag, module, i
-        )
-      }
+      },
+      ModMember::Sub(module) =>
+        collect_consts_rec(path.push(*key), bag, module, i),
     }
   }
 }
 
-pub fn collect_consts(project: &ProjectTree, i: &Interner)
--> HashMap<Token<Vec<Token<String>>>, Expr>
-{
+pub fn collect_consts(
+  project: &ProjectTree,
+  i: &Interner,
+) -> HashMap<Sym, Expr> {
   let mut consts = HashMap::new();
-  collect_consts_rec(
-    Substack::Bottom,
-    &mut consts,
-    project.0.as_ref(),
-    i
-  );
+  collect_consts_rec(Substack::Bottom, &mut consts, project.0.as_ref(), i);
   consts
 }
```
```diff
@@ -1,47 +1,58 @@
 use std::iter;
 use std::rc::Rc;
 
+use super::loaded_source::{LoadedSource, LoadedSourceTable};
+use super::preparse::preparse;
+use crate::interner::{Interner, Sym, Tok};
 use crate::pipeline::error::ProjectError;
+use crate::pipeline::file_loader::{load_text, IOResult, Loaded};
 use crate::pipeline::import_abs_path::import_abs_path;
 use crate::pipeline::split_name::split_name;
-use crate::interner::{Token, Interner};
-
-use crate::pipeline::file_loader::{Loaded, load_text, IOResult};
 use crate::representations::sourcefile::FileEntry;
-use super::loaded_source::{LoadedSourceTable, LoadedSource};
-use super::preparse::preparse;
 
 /// Load the source at the given path or all within if it's a collection,
 /// and all sources imported from these.
 fn load_abs_path_rec(
-  abs_path: Token<Vec<Token<String>>>,
+  abs_path: Sym,
   table: &mut LoadedSourceTable,
   prelude: &[FileEntry],
   i: &Interner,
-  get_source: &impl Fn(Token<Vec<Token<String>>>) -> IOResult,
-  is_injected: &impl Fn(&[Token<String>]) -> bool
+  get_source: &impl Fn(Sym) -> IOResult,
+  is_injected: &impl Fn(&[Tok<String>]) -> bool,
 ) -> Result<(), Rc<dyn ProjectError>> {
   let abs_pathv = i.r(abs_path);
   // short-circuit if this import is defined externally or already known
-  if is_injected(&abs_pathv) | table.contains_key(&abs_path) {
-    return Ok(())
+  if is_injected(abs_pathv) | table.contains_key(&abs_path) {
+    return Ok(());
   }
   // try splitting the path to file, swallowing any IO errors
   let is_file = |p| (get_source)(p).map(|l| l.is_code()).unwrap_or(false);
-  let name_split = split_name(&abs_pathv, &|p| is_file(i.i(p)));
-  let filename = if let Some((f, _)) = name_split {f} else {
+  let name_split = split_name(abs_pathv, &|p| is_file(i.i(p)));
+  let filename = if let Some((f, _)) = name_split {
+    f
+  } else {
     // If the path could not be split to file, load it as directory
-    let coll = if let Loaded::Collection(c) = (get_source)(abs_path)? {c}
+    let coll = if let Loaded::Collection(c) = (get_source)(abs_path)? {
+      c
+    }
     // ^^ raise any IO error that was previously swallowed
-    else {panic!("split_name returned None but the path is a file")};
+    else {
+      panic!("split_name returned None but the path is a file")
+    };
     // recurse on all files and folders within
     for item in coll.iter() {
-      let abs_subpath = abs_pathv.iter()
+      let abs_subpath = abs_pathv
+        .iter()
        .copied()
        .chain(iter::once(i.i(item)))
        .collect::<Vec<_>>();
       load_abs_path_rec(
-        i.i(&abs_subpath), table, prelude, i, get_source, is_injected
+        i.i(&abs_subpath),
+        table,
+        prelude,
+        i,
+        get_source,
+        is_injected,
       )?
     }
     return Ok(());
@@ -50,18 +61,23 @@ fn load_abs_path_rec(
   let text = load_text(i.i(filename), &get_source, i)?;
   let preparsed = preparse(
     filename.iter().map(|t| i.r(*t)).cloned().collect(),
-    text.as_str(), prelude, i
+    text.as_str(),
+    prelude,
+    i,
   )?;
-  table.insert(abs_path, LoadedSource{ text, preparsed: preparsed.clone() });
+  table.insert(abs_path, LoadedSource { text, preparsed: preparsed.clone() });
   // recurse on all imported modules
-  preparsed.0.visit_all_imports(&mut |modpath, module, import| {
-    let abs_pathv = import_abs_path(
-      &filename, modpath,
-      module, &import.nonglob_path(i), i
-    )?;
+  preparsed.0.visit_all_imports(&mut |modpath, _module, import| {
+    let abs_pathv =
+      import_abs_path(filename, modpath, &import.nonglob_path(i), i)?;
     // recurse on imported module
     load_abs_path_rec(
-      i.i(&abs_pathv), table, prelude, i, get_source, is_injected
+      i.i(&abs_pathv),
+      table,
+      prelude,
+      i,
+      get_source,
+      is_injected,
     )
   })
 }
@@ -69,20 +85,15 @@ fn load_abs_path_rec(
 /// Load and preparse all files reachable from the load targets via
 /// imports that aren't injected.
 pub fn load_source(
-  targets: &[Token<Vec<Token<String>>>],
+  targets: &[Sym],
   prelude: &[FileEntry],
   i: &Interner,
-  get_source: &impl Fn(Token<Vec<Token<String>>>) -> IOResult,
-  is_injected: &impl Fn(&[Token<String>]) -> bool,
+  get_source: &impl Fn(Sym) -> IOResult,
+  is_injected: &impl Fn(&[Tok<String>]) -> bool,
 ) -> Result<LoadedSourceTable, Rc<dyn ProjectError>> {
   let mut table = LoadedSourceTable::new();
   for target in targets {
-    load_abs_path_rec(
-      *target,
-      &mut table,
-      prelude,
-      i, get_source, is_injected
-    )?
+    load_abs_path_rec(*target, &mut table, prelude, i, get_source, is_injected)?
   }
   Ok(table)
 }
```
```diff
@@ -1,8 +1,8 @@
-use std::{rc::Rc, collections::HashMap};
-
-use crate::interner::Token;
+use std::collections::HashMap;
+use std::rc::Rc;
 
 use super::preparse::Preparsed;
+use crate::interner::Sym;
 
 #[derive(Debug)]
 pub struct LoadedSource {
@@ -10,4 +10,4 @@ pub struct LoadedSource {
   pub preparsed: Preparsed,
 }
 
-pub type LoadedSourceTable = HashMap<Token<Vec<Token<String>>>, LoadedSource>;
+pub type LoadedSourceTable = HashMap<Sym, LoadedSource>;
```
```diff
@@ -1,25 +1,24 @@
-/* PULL LOGISTICS BOUNDARY
-
-Specifying exactly what this module should be doing was an unexpectedly
-hard challenge. It is intended to encapsulate all pull logistics, but
-this definition is apparently prone to scope creep.
-
-Load files, preparse them to obtain a list of imports, follow these.
-Preparsing also returns the module tree and list of exported synbols
-for free, which is needed later so the output of preparsing is also
-attached to the module output.
-
-The module checks for IO errors, syntax errors, malformed imports and
-imports from missing files. All other errors must be checked later.
-
-Injection strategy:
-see whether names are valid in the injected tree for is_injected
-*/
+// PULL LOGISTICS BOUNDARY
+//
+// Specifying exactly what this module should be doing was an unexpectedly
+// hard challenge. It is intended to encapsulate all pull logistics, but
+// this definition is apparently prone to scope creep.
+//
+// Load files, preparse them to obtain a list of imports, follow these.
+// Preparsing also returns the module tree and list of exported synbols
+// for free, which is needed later so the output of preparsing is also
+// attached to the module output.
+//
+// The module checks for IO errors, syntax errors, malformed imports and
+// imports from missing files. All other errors must be checked later.
+//
+// Injection strategy:
+// see whether names are valid in the injected tree for is_injected
 
 mod load_source;
 mod loaded_source;
 mod preparse;
 
-pub use loaded_source::{LoadedSource, LoadedSourceTable};
 pub use load_source::load_source;
+pub use loaded_source::{LoadedSource, LoadedSourceTable};
 pub use preparse::Preparsed;
```
```diff
@@ -1,39 +1,34 @@
-use hashbrown::HashMap;
 use std::hash::Hash;
 use std::rc::Rc;
 
+use hashbrown::HashMap;
+
 use crate::ast::Constant;
-use crate::pipeline::error::{ProjectError, ParseErrorWithPath, VisibilityMismatch};
-use crate::representations::sourcefile::{normalize_namespaces, Member};
-use crate::representations::tree::{ModEntry, ModMember};
 use crate::interner::Interner;
 use crate::parse::{self, ParsingContext};
-use crate::representations::{sourcefile::{FileEntry, imports}, tree::Module};
+use crate::pipeline::error::{
+  ParseErrorWithPath, ProjectError, VisibilityMismatch,
+};
+use crate::representations::sourcefile::{
+  imports, normalize_namespaces, FileEntry, Member,
+};
+use crate::representations::tree::{ModEntry, ModMember, Module};
 
 #[derive(Debug, Clone, PartialEq, Eq)]
 pub struct Preparsed(pub Rc<Module<(), ()>>);
 
 /// Add an internal flat name if it does not exist yet
-fn add_intern<K: Eq + Hash>(
-  map: &mut HashMap<K, ModEntry<(), ()>>, k: K
-) {
-  let _ = map.try_insert(k, ModEntry {
-    exported: false,
-    member: ModMember::Item(()),
-  });
+fn add_intern<K: Eq + Hash>(map: &mut HashMap<K, ModEntry<(), ()>>, k: K) {
+  let _ = map
+    .try_insert(k, ModEntry { exported: false, member: ModMember::Item(()) });
 }
 
 /// Add an exported flat name or export any existing entry
-fn add_export<K: Eq + Hash>(
-  map: &mut HashMap<K, ModEntry<(), ()>>, k: K
-) {
+fn add_export<K: Eq + Hash>(map: &mut HashMap<K, ModEntry<(), ()>>, k: K) {
   if let Some(entry) = map.get_mut(&k) {
     entry.exported = true
   } else {
-    map.insert(k, ModEntry {
-      exported: true,
-      member: ModMember::Item(()),
-    });
+    map.insert(k, ModEntry { exported: true, member: ModMember::Item(()) });
   }
 }
 
@@ -41,47 +36,53 @@ fn add_export<K: Eq + Hash>(
 fn to_module(
   src: &[FileEntry],
   prelude: &[FileEntry],
-  i: &Interner
+  i: &Interner,
 ) -> Rc<Module<(), ()>> {
   let all_src = || src.iter().chain(prelude.iter());
   let imports = imports(all_src()).cloned().collect::<Vec<_>>();
-  let mut items = all_src().filter_map(|ent| match ent {
-    FileEntry::Internal(Member::Namespace(name, data)) => {
-      let member = ModMember::Sub(to_module(data, prelude, i));
-      let entry = ModEntry{ exported: false, member };
-      Some((*name, entry))
-    }
-    FileEntry::Exported(Member::Namespace(name, data)) => {
-      let member = ModMember::Sub(to_module(data, prelude, i));
-      let entry = ModEntry{ exported: true, member };
-      Some((*name, entry))
-    }
-    _ => None
-  }).collect::<HashMap<_, _>>();
-  for file_entry in all_src() { match file_entry {
-    FileEntry::Comment(_) | FileEntry::Import(_)
-    | FileEntry::Internal(Member::Namespace(..))
-    | FileEntry::Exported(Member::Namespace(..)) => (),
-    FileEntry::Export(tokv) => for tok in tokv {
-      add_export(&mut items, *tok)
-    }
-    FileEntry::Internal(Member::Constant(Constant{ name, .. }))
-    => add_intern(&mut items, *name),
-    FileEntry::Exported(Member::Constant(Constant{ name, .. }))
-    => add_export(&mut items, *name),
-    FileEntry::Internal(Member::Rule(rule)) => {
-      let names = rule.collect_single_names(i);
-      for name in names {
-        add_intern(&mut items, name)
-      }
-    }
-    FileEntry::Exported(Member::Rule(rule)) => {
-      let names = rule.collect_single_names(i);
-      for name in names {
-        add_export(&mut items, name)
-      }
-    }
-  }}
+  let mut items = all_src()
+    .filter_map(|ent| match ent {
+      FileEntry::Internal(Member::Namespace(ns)) => {
+        let member = ModMember::Sub(to_module(&ns.body, prelude, i));
+        let entry = ModEntry { exported: false, member };
+        Some((ns.name, entry))
+      },
+      FileEntry::Exported(Member::Namespace(ns)) => {
+        let member = ModMember::Sub(to_module(&ns.body, prelude, i));
+        let entry = ModEntry { exported: true, member };
+        Some((ns.name, entry))
+      },
+      _ => None,
+    })
+    .collect::<HashMap<_, _>>();
+  for file_entry in all_src() {
+    match file_entry {
+      FileEntry::Comment(_)
+      | FileEntry::Import(_)
+      | FileEntry::Internal(Member::Namespace(_))
+      | FileEntry::Exported(Member::Namespace(_)) => (),
+      FileEntry::Export(tokv) =>
+        for tok in tokv {
+          add_export(&mut items, *tok)
+        },
+      FileEntry::Internal(Member::Constant(Constant { name, .. })) =>
+        add_intern(&mut items, *name),
+      FileEntry::Exported(Member::Constant(Constant { name, .. })) =>
+        add_export(&mut items, *name),
+      FileEntry::Internal(Member::Rule(rule)) => {
+        let names = rule.collect_single_names(i);
+        for name in names {
+          add_intern(&mut items, name)
+        }
+      },
+      FileEntry::Exported(Member::Rule(rule)) => {
+        let names = rule.collect_single_names(i);
+        for name in names {
+          add_export(&mut items, name)
+        }
+      },
+    }
+  }
   Rc::new(Module { imports, items, extra: () })
 }
 
@@ -95,16 +96,21 @@ pub fn preparse(
 ) -> Result<Preparsed, Rc<dyn ProjectError>> {
   // Parse with no operators
   let ctx = ParsingContext::<&str>::new(&[], i, Rc::new(file.clone()));
-  let entries = parse::parse(source, ctx)
-    .map_err(|error| ParseErrorWithPath{
-      full_source: source.to_string(),
-      error,
-      path: file.clone()
-    }.rc())?;
-  let normalized = normalize_namespaces(Box::new(entries.into_iter()), i)
-    .map_err(|ns| VisibilityMismatch{
-      namespace: ns.into_iter().map(|t| i.r(t)).cloned().collect(),
-      file: Rc::new(file.clone())
-    }.rc())?;
+  let entries = parse::parse(source, ctx).map_err(|error| {
+    ParseErrorWithPath {
+      full_source: source.to_string(),
+      error,
+      path: file.clone(),
+    }
+    .rc()
+  })?;
+  let normalized = normalize_namespaces(Box::new(entries.into_iter()))
+    .map_err(|ns| {
+      VisibilityMismatch {
+        namespace: ns.into_iter().map(|t| i.r(t)).cloned().collect(),
+        file: Rc::new(file.clone()),
+      }
+      .rc()
+    })?;
   Ok(Preparsed(to_module(&normalized, prelude, i)))
 }
```
```diff
@@ -1,13 +1,15 @@
-use crate::interner::Token;
+use crate::interner::Tok;
 
+#[allow(clippy::type_complexity)]
+// FIXME couldn't find a good factoring
 pub fn split_name<'a>(
-  path: &'a [Token<String>],
-  is_valid: &impl Fn(&[Token<String>]) -> bool
-) -> Option<(&'a [Token<String>], &'a [Token<String>])> {
+  path: &'a [Tok<String>],
+  is_valid: &impl Fn(&[Tok<String>]) -> bool,
+) -> Option<(&'a [Tok<String>], &'a [Tok<String>])> {
   for split in (0..=path.len()).rev() {
     let (filename, subpath) = path.split_at(split);
     if is_valid(filename) {
-      return Some((filename, subpath))
+      return Some((filename, subpath));
     }
   }
   None
```
Some files were not shown because too many files have changed in this diff.