Fixes to CLI (#155)
* Improved contributor and sponsor list under `info`
* Add glob entry points (#81) + MaxDiagnostics
* Update JS docs (Fixes #162)
kaleidawave authored Jun 13, 2024
1 parent 9dd05b7 commit 5191329
Showing 18 changed files with 330 additions and 130 deletions.
6 changes: 3 additions & 3 deletions .github/workflows/github-release.yml
@@ -61,18 +61,18 @@ jobs:
edges {
node {
sponsor {
name
name, login
}
}
}
}
}
}' -q '.data.user.sponsorshipsAsMaintainer.edges | map(.node.sponsor.name) | join(",")')
}' -q '.data.user.sponsorshipsAsMaintainer.edges | map(.node.sponsor.name // .node.sponsor.login) | join(",")')
echo "SPONSORS=$SPONSORS" >> $GITHUB_OUTPUT
CONTRIBUTORS=$(
gh pr list --search "-author:@me" --state merged --json author | jq 'map(.author.name) | unique | join(",")' --raw-output
gh pr list --state merged --json author | jq 'map(.author.name // .author.login) | unique | join(",")' --raw-output
)
echo "CONTRIBUTORS=$CONTRIBUTORS" >> $GITHUB_OUTPUT
20 changes: 6 additions & 14 deletions .github/workflows/performance-and-size.yml
@@ -55,6 +55,11 @@ jobs:
LINES_OF_CODE=$(scc -c --no-cocomo -f json demo.ts | jq ".[0].Code")
echo "### Checking
\`\`\`shell
$(hyperfine -i './target/release/ezno check demo.ts')
\`\`\`" >> $GITHUB_STEP_SUMMARY
echo "<details>
<summary>Input</summary>
@@ -72,23 +77,10 @@
<summary>Diagnostics</summary>
\`\`\`
$(./target/release/ezno check demo.ts --timings 2>&1 || true)
$(./target/release/ezno check demo.ts --timings --max-diagnostics all 2>&1 || true)
\`\`\`
</details>
" >> $GITHUB_STEP_SUMMARY
echo "### Checking
\`\`\`shell
$(hyperfine -i './target/release/ezno check demo.ts')
\`\`\`" >> $GITHUB_STEP_SUMMARY
echo "::group::Comparing printing of diagnostics"
hyperfine -i './target/release/ezno check demo.ts' './target/release/ezno check demo.ts --compact-diagnostics' './target/release/ezno check demo.ts --count-diagnostics'
echo "::endgroup::"
echo "::group::cargo tree"
cargo tree
echo "::endgroup::"
- name: Run parser, minifier/stringer performance
shell: bash
6 changes: 3 additions & 3 deletions .github/workflows/publish.yml
@@ -66,18 +66,18 @@ jobs:
edges {
node {
sponsor {
name
name, login
}
}
}
}
}
}' -q '.data.user.sponsorshipsAsMaintainer.edges | map(.node.sponsor.name) | join(",")')
}' -q '.data.user.sponsorshipsAsMaintainer.edges | map(.node.sponsor.name // .node.sponsor.login) | join(",")')
echo "SPONSORS=$SPONSORS" >> $GITHUB_OUTPUT
CONTRIBUTORS=$(
gh pr list --search "-author:@me" --state merged --json author | jq 'map(.author.name) | unique | join(",")' --raw-output
gh pr list --state merged --json author | jq 'map(.author.name // .author.login) | unique | join(",")' --raw-output
)
echo "CONTRIBUTORS=$CONTRIBUTORS" >> $GITHUB_OUTPUT
7 changes: 7 additions & 0 deletions Cargo.lock

Some generated files are not rendered by default.

1 change: 1 change: 1 addition & 0 deletions Cargo.toml
@@ -46,6 +46,7 @@ enum-variants-strings = "0.3"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
simple-json-parser = "0.0.2"
glob = "0.3"

[target.'cfg(not(target_family = "wasm"))'.dependencies]
# For updating binary
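The new `glob` dependency backs the glob entry points mentioned in the commit message (#81). A hedged sketch of what this enables at the CLI; the exact pattern syntax that `check` accepts is an assumption here, not something this diff confirms:

```shell
# Hypothetical: pass a quoted glob pattern instead of a single path
npx ezno check "demo/**/*.ts"
```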
4 changes: 2 additions & 2 deletions README.md
@@ -23,11 +23,11 @@ What Ezno is not
- Smarter as a means to allow more *dynamic patterns*. Keep things simple!
- A binary executable compiler. It takes in JavaScript (or a TypeScript or Ezno superset) and does similar processes to traditional compilers, but at the end emits JavaScript. However, in the future, it _could_ generate a lower level format using its event (side-effect) representation.

Read more about Ezno:
Read more about Ezno (in chronological order):
- [Introducing Ezno](https://kaleidawave.github.io/posts/introducing-ezno/)
- [Ezno in '23](https://kaleidawave.github.io/posts/ezno-23/)
- [A preview of the checker](https://kaleidawave.github.io/posts/a-preview-of-the-checker/)
- [The Quest Continues](https://kaleidawave.github.io/posts/the-quest-continues/)
- [The quest continues](https://kaleidawave.github.io/posts/the-quest-continues/)

---

2 changes: 1 addition & 1 deletion checker/binary-serialize-derive/Cargo.toml
@@ -15,4 +15,4 @@ syn-helpers = "0.5"

[lib]
path = "macro.rs"
proc_macro = true
proc-macro = true
8 changes: 5 additions & 3 deletions checker/documentation/getting-started.md
@@ -2,13 +2,15 @@ Ezno is work in progress. It doesn't currently support all the features of JavaS

While it **is not worth it trying it on existing codebases at this time** (as it likely will blow up 💥), **you can try out the snippets in the [specification](../specification/specification.md)** and other small pieces of code today.

You can try the `check` command of ezno using `npx`
The best way to try the type checker is on the [web playground](https://kaleidawave.github.io/ezno/playground).

Alternatively, you can try the checker locally using the `check` command of the ezno binary. The simplest way is using `npx`:

```shell
npx ezno check file.ts
```

Or download the binary with `npm install ezno`, `cargo install ezno` or on [GitHub releases](https://github.com/kaleidawave/ezno/releases).
You can also download the binary with `npm install ezno`. For the native (non-WASM) version, you can get it with `cargo install ezno` or from [GitHub releases](https://github.com/kaleidawave/ezno/releases).
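For example, once the binary is on your `PATH` (e.g. via `cargo install ezno`), the same command runs without `npx`:

```shell
ezno check file.ts
```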

---

@@ -19,4 +21,4 @@ const x = 6;
print_type(x + 8)
```

If you find any unexpected exceptions, please leave an issue 😁
If you find any unexpected exceptions, please [leave an issue 😁](https://github.com/kaleidawave/ezno/issues/new)
2 changes: 1 addition & 1 deletion parser/README.md
@@ -35,7 +35,7 @@ fn main() {
- See expression identifiers can be used to bind information to
- Retain source positions for use in analysis diagnostics and generating source maps
- All AST should be visitable. Immutably to collect facts or mutably to transform/remove
- Optionally via configuration extend the ECMAscript language definition
- Optionally via configuration extend the *ECMAScript language definition*
- TypeScript type annotations
- Interfaces, enums and type alias statements
- Parameter, return type and variable annotations
62 changes: 50 additions & 12 deletions parser/examples/code_blocks_to_script.rs
@@ -1,4 +1,4 @@
use std::{collections::HashSet, io::Write};
use std::{collections::HashSet, io::Write, path::PathBuf};

use ezno_parser::{
ast::{InterfaceDeclaration, TypeAlias},
@@ -8,18 +8,27 @@ use ezno_parser::{
};

fn main() -> Result<(), Box<dyn std::error::Error>> {
let mut args = std::env::args().skip(1);
let path = args.next().ok_or("expected path to markdown file")?;
let out = args.next();
let args = std::env::args().skip(1).collect::<Vec<_>>();
let path = args.first().ok_or("expected path to markdown file")?;

let replace_satisfies_with_as = args.iter().any(|item| item == "--satisfies-with-as");

let into_files_directory_and_extension = args.windows(3).find_map(|item| {
matches!(item[0].as_str(), "--into-files").then_some((item[1].clone(), item[2].clone()))
});
let out_file = args
.windows(2)
.find_map(|item| matches!(item[0].as_str(), "--out").then_some(item[1].clone()));

let content = std::fs::read_to_string(&path)?;

let filters: Vec<&str> = vec!["import", "export", "declare"];

let blocks = if path.ends_with(".md") {
let mut blocks = Vec::new();

let mut lines = content.lines();
let mut current = String::default();

while let Some(line) = lines.next() {
if line.starts_with("```ts") {
let mut indented_code = lines
@@ -34,17 +43,37 @@ fn main() -> Result<(), Box<dyn std::error::Error>> {
debug_assert_eq!(indented_code.pop(), Some('\t'));

if !filters.iter().any(|filter| indented_code.contains(filter)) {
blocks.push(indented_code);
blocks.push((std::mem::take(&mut current), indented_code));
}
} else if let Some(header) = line.strip_prefix("#### ") {
current = header.to_owned();
}
}
blocks
} else {
todo!("parse module, split by statement braced")
};

if let Some((under, extension)) = into_files_directory_and_extension {
let under = PathBuf::from(under);
for (header, code) in blocks {
let mut name = heading_to_rust_identifier(&header);
name.push_str(".");
name.push_str(&extension);
let mut file = std::fs::File::create(under.join(name))?;
// Fix for Flow
let code =
if replace_satisfies_with_as { code.replace(" satisfies ", " as ") } else { code };
for line in code.lines() {
writeln!(file, "{}", line.strip_prefix('\t').unwrap_or(line))?;
}
}
return Ok(());
}

// Else bundle into one file, wrapped in arrow functions to prevent namespace collisions
let mut final_blocks: Vec<(HashSet<String>, String)> = Vec::new();
for code in blocks {
for (header, code) in blocks {
let module = Module::from_string(code.clone(), Default::default()).map_err(Box::new)?;

let mut names = HashSet::new();
@@ -93,7 +122,9 @@ fn main() -> Result<(), Box<dyn std::error::Error>> {
final_blocks.iter_mut().find(|(uses, _)| uses.is_disjoint(&names))
{
items.extend(names.into_iter());
block.push_str("\n");
block.push_str("\n// ");
block.push_str(&header);
block.push('\n');
block.push_str(&code);
} else {
final_blocks.push((names, code));
Expand All @@ -102,16 +133,16 @@ fn main() -> Result<(), Box<dyn std::error::Error>> {

// eprintln!("Generated {:?} blocks", final_blocks.len());

if let Some(out) = out {
let mut out = std::fs::File::create(out).expect("Cannot open file");
if let Some(out) = out_file {
let mut out = std::fs::File::create(out)?;
for (_items, block) in final_blocks {
writeln!(out, "() => {{\n{block}}};\n").unwrap();
writeln!(out, "() => {{\n{block}}};\n")?;
}
} else {
let mut out = std::io::stdout();
for (_items, block) in final_blocks {
// eprintln!("block includes: {items:?}\n{block}\n---");
writeln!(out, "() => {{\n{block}}};\n").unwrap();
writeln!(out, "() => {{\n{block}}};\n")?;
}
}

@@ -137,3 +168,10 @@ impl<'a>
}
}
}

fn heading_to_rust_identifier(heading: &str) -> String {
heading
.replace([' ', '-', '/', '&', '.', '+'], "_")
.replace(['*', '\'', '`', '"', '!', '(', ')', ','], "")
.to_lowercase()
}
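A hedged sketch of invoking this example from the repository root; the `--out`, `--into-files <directory> <extension>` and `--satisfies-with-as` arguments come from the argument handling above, while the package name and markdown path are assumptions:

```shell
# Bundle every ts code block from a markdown file into one script
cargo run -p ezno-parser --example code_blocks_to_script -- ./checker/specification/specification.md --out ./out.tsx

# Or write one file per "#### " heading, rewriting `satisfies` to `as` (the Flow fix above)
# (the ./out directory must already exist; File::create does not create it)
cargo run -p ezno-parser --example code_blocks_to_script -- ./checker/specification/specification.md --into-files ./out tsx --satisfies-with-as
```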
33 changes: 21 additions & 12 deletions src/ast_explorer.rs
@@ -136,10 +136,13 @@ impl ExplorerSubCommand {
}
}
// TODO temp
Err(err) => {
emit_diagnostics(std::iter::once((err, source_id).into()), &fs, false)
.unwrap()
}
Err(err) => emit_diagnostics(
std::iter::once((err, source_id).into()),
&fs,
false,
crate::utilities::MaxDiagnostics::All,
)
.unwrap(),
}
}
ExplorerSubCommand::FullAST(cfg) => {
@@ -159,10 +162,13 @@ ExplorerSubCommand::FullAST(cfg) => {
}
}
// TODO temp
Err(err) => {
emit_diagnostics(std::iter::once((err, source_id).into()), &fs, false)
.unwrap()
}
Err(err) => emit_diagnostics(
std::iter::once((err, source_id).into()),
&fs,
false,
crate::utilities::MaxDiagnostics::All,
)
.unwrap(),
}
}
ExplorerSubCommand::Prettifier(_) | ExplorerSubCommand::Uglifier(_) => {
@@ -179,10 +185,13 @@ ExplorerSubCommand::Prettifier(_) | ExplorerSubCommand::Uglifier(_) => {
};
print_to_cli(format_args!("{}", module.to_string(&options)));
}
Err(err) => {
emit_diagnostics(std::iter::once((err, source_id).into()), &fs, false)
.unwrap()
}
Err(err) => emit_diagnostics(
std::iter::once((err, source_id).into()),
&fs,
false,
crate::utilities::MaxDiagnostics::All,
)
.unwrap(),
}
}
ExplorerSubCommand::Lexer(_) => {