
PEG.js Benchmark Suite

This is the PEG.js benchmark suite. It measures the speed of parsers generated by PEG.js on various inputs. Its main goal is to provide data for code generator optimizations.
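The core measurement can be sketched as follows. Note that `measureSpeed` and its kilobytes-per-second unit are illustrative, not the suite's actual runner.js API; `parse` stands in for a parser produced by PEG.js:

```javascript
// Simplified sketch of what the suite measures: parse a fixed input
// repeatedly and report speed in kilobytes of input consumed per second.
// `parse` is a stand-in for the parse function of a generated parser.
function measureSpeed(parse, input, runs) {
  var start = Date.now();
  for (var i = 0; i < runs; i++) {
    parse(input);
  }
  var elapsedMs = Date.now() - start;
  if (elapsedMs === 0) {
    elapsedMs = 1; // avoid division by zero on very fast runs
  }
  return (input.length * runs) / 1024 / (elapsedMs / 1000);
}
```

A higher number means the generated parser consumes input faster, so code generator optimizations can be compared by running the same inputs before and after a change.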

Running in Node.js

All commands in the following steps need to be executed in the PEG.js root directory (one level up from this one).

  1. Install all PEG.js dependencies, including development ones:

    $ npm install

  2. Execute the benchmark suite:

    $ make benchmark

  3. Wait for results.

Running in the Browser

All commands in the following steps need to be executed in the PEG.js root directory (one level up from this one).

  1. Make sure you have Node.js and Python installed.

  2. Install all PEG.js dependencies, including development ones:

    $ npm install

  3. Build browser version of PEG.js:

    $ make browser

  4. Serve the PEG.js root directory using a web server:

    $ python -m SimpleHTTPServer

     (With Python 3, use `python3 -m http.server` instead; both serve on port 8000 by default.)

  5. Point your browser to the benchmark suite (with the default port, http://localhost:8000/benchmark/).

  6. Click the "Run" button and wait for results.
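The suite reports a speed for each input; a single overall figure can be derived by weighting each input by its size. This aggregation scheme is an assumption for illustration, not necessarily what benchmark/runner.js does:

```javascript
// Combine per-input speeds (KB/s) into one overall speed, weighting each
// input by its size so that large inputs count proportionally more.
// NOTE: hypothetical aggregation, shown only to illustrate the idea.
function overallSpeed(results) {
  var totalKB = 0;
  var totalSeconds = 0;
  results.forEach(function(r) {
    totalKB += r.sizeKB;
    totalSeconds += r.sizeKB / r.speedKBps; // time spent parsing this input
  });
  return totalKB / totalSeconds;
}
```

Size-weighted averaging keeps one tiny, fast-parsing input from dominating the overall number.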