Why are Python, Ruby, and Node.js so much slower than Bash, AWK, and Perl?


While making a polyglot makefile (which launches many thousands of processes), I noticed that scripting languages vary enormously in start-up performance.
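For a rough sense of why this matters (my own illustration, not part of the original measurements): when a makefile spawns thousands of short-lived interpreter processes, the start-up cost is paid on every invocation, so even a ~20 ms difference per process adds up quickly. A loop like the following, assuming perl and python are on the PATH, makes the amplification visible:

$ TIMEFORMAT='%3R'
$ time for i in $(seq 1000); do perl -e '' > /dev/null; done      # roughly 1000 x perl start-up
$ time for i in $(seq 1000); do python -c '' > /dev/null; done    # roughly 1000 x python start-up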


bash

$ TIMEFORMAT='%3R'; time bash -c "echo 'hello world'" > /dev/null
0.002

awk

$ TIMEFORMAT='%3R'; time awk "BEGIN { print \"hello world\" }" > /dev/null
0.002

perl

$ TIMEFORMAT='%3R'; time perl -e "print \"hello world\n\"" > /dev/null
0.003

All about the same. Each of the following scripting languages, however, is an order of magnitude(!) slower.

python

$ TIMEFORMAT='%3R'; time python -c "print 'hello world'" > /dev/null
0.023

ruby

$ TIMEFORMAT='%3R'; time ruby -e "puts 'hello world'" > /dev/null
0.024

node.js

$ TIMEFORMAT='%3R'; time node -e "console.log('hello world')" > /dev/null
0.082

What sorts of things are Python, Ruby, and Node.js doing that make them so much slower than the equivalent Bash, AWK, and Perl programs? Is it just the way things turned out, or is there something more fundamental about their design that gives them more overhead?

In these examples you are practically measuring how long the interpreters take to start. So the slowest three are the slowest because their interpreters do more work at the beginning, which does not mean they would also be slower in a long-running program.
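A minimal sketch to confirm that, assuming the same interpreters are installed: time an empty program in each language, so virtually all of the measured time is interpreter start-up rather than the work of printing a string:

$ TIMEFORMAT='%3R'
$ time bash   -c '' > /dev/null    # start-up only, no work performed
$ time perl   -e '' > /dev/null
$ time python -c '' > /dev/null
$ time ruby   -e '' > /dev/null
$ time node   -e '' > /dev/null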

You can write long-running examples (where thousands of calculations are performed) and see which one is the slowest in the long run, as in the sketch below.
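A minimal sketch of such a comparison (my own, not from the original answer): each one-liner sums the integers below ten million, so start-up becomes a small fraction of the total run time. Note the Python line assumes Python 3, unlike the Python 2 print statement used above.

$ TIMEFORMAT='%3R'
$ time awk    'BEGIN { for (i = 0; i < 10000000; i++) s += i; print s }' > /dev/null
$ time perl   -e '$s += $_ for 1 .. 9999999; print $s' > /dev/null
$ time python -c 'print(sum(range(10000000)))' > /dev/null
$ time ruby   -e 'puts((0...10000000).reduce(0, :+))' > /dev/null
$ time node   -e 'let s = 0; for (let i = 0; i < 10000000; i++) s += i; console.log(s)' > /dev/null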

Bear in mind that each of these languages is meant for different things: one of them may be fast at processing files (AWK) while another is fast at doing tens of things simultaneously (Node.js).

