# PEG.js Benchmark Suite
This is the PEG.js benchmark suite. It measures the speed of parsers generated by PEG.js on various inputs. Its main goal is to provide data for optimizing the code generator.
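Conceptually, the measurement amounts to timing repeated parses of a fixed input and reporting throughput. The sketch below illustrates that idea; the `benchmark` helper and the KB/s metric are illustrative assumptions, not the actual `runner.js` code:

```javascript
// Illustrative sketch only -- NOT the suite's actual runner.js.
// Times `runs` parses of `input` and returns throughput in KB/s.
function benchmark(parse, input, runs) {
  const start = Date.now();
  for (let i = 0; i < runs; i++) {
    parse(input);
  }
  const ms = Math.max(Date.now() - start, 1); // guard against a 0 ms reading
  return (input.length * runs) / 1024 / (ms / 1000);
}

// Example with a stand-in parser; a PEG.js-generated parser exposes the
// same `parse(input)` shape, so one could be dropped in for JSON.parse.
const kbPerSecond = benchmark(JSON.parse, "[1, 2, 3]", 1000);
console.log(kbPerSecond);
```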
## Running in Node.js
All commands in the following steps need to be executed in the PEG.js root directory (one level up from this one).
1. Install all PEG.js dependencies, including development ones:

   ```console
   $ npm install
   ```

2. Execute the benchmark suite:

   ```console
   $ make benchmark
   ```

3. Wait for the results.
## Running in the Browser
All commands in the following steps need to be executed in the PEG.js root directory (one level up from this one).
1. Make sure you have Node.js and Python installed.

2. Install all PEG.js dependencies, including development ones:

   ```console
   $ npm install
   ```

3. Build the browser version of PEG.js:

   ```console
   $ make browser
   ```

4. Serve the PEG.js root directory using a web server:

   ```console
   $ python -m SimpleHTTPServer
   ```

   (`SimpleHTTPServer` is Python 2 only; on Python 3, use `python -m http.server` instead.)

5. Point your browser to the benchmark suite.

6. Click the **Run** button and wait for the results.