Well, by now, you've done all your arguing on Twitter. Is Turbolinks a good idea, or the Worst Thing Ever?
See these guys? One of them said this:
"We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil"
Wisdom.
But what makes optimization premature? When you don't know if you should do it or not. How do you know?
Measuring. It's good for you. You can do it. If you measure things, you can be sure what's up.
But like eating your veggies, nobody measures. Ever.
Computer SCIENCE is called science for a reason, yo. Be a scientist. Don't just argue about stuff on blogs. Measure things. Then report back.
This probably isn't even a good test. I don't care. Tell me how it sucks. Let's figure it out. But having actual measurements beats complaining about shit on Twitter any day.
Well, it fires up Selenium and clicks some links: 1000 of them, by default. Then it prints out how long that took.
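If you're curious what a spec like that roughly looks like, here's a sketch. It's not the actual file: it assumes Capybara driving Selenium, pages that keep linking onward, and that the click count comes from the TIMES environment variable (which is how you'll run it below).

```ruby
# A minimal sketch, not the real spec: assumes Capybara + Selenium and a
# page at "/" whose links lead to more pages with links.
require "benchmark"
require "capybara/rspec"

Capybara.default_driver = :selenium

RSpec.describe "clicking links", type: :feature do
  it "reports how long a pile of clicks takes" do
    times = (ENV["TIMES"] || 1000).to_i

    Benchmark.bm(15) do |bm|
      bm.report("clicking") do
        visit "/"
        times.times { first("a").click }
      end
    end
  end
end
```

The repo's branches set up the different scenarios: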
- master: a plain Rails app.
- Just CSS: This has Basecamp Next's CSS file in it.
- Just JS: This has Basecamp Next's JS file in it.
- All the Assets: This has both.
- With Little Sleep: With a `sleep 0.1` in the controller, to simulate database access (roughly what the sketch below shows).
- With Lots of Sleep: Ditto, but `sleep 0.5`.
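To be concrete about "simulate database access": the sleep branches boil down to something like this. The controller name here is made up; the sleep is the point.

```ruby
# Hypothetical controller name -- the branch just drops a sleep into an
# action so every request takes a fixed extra chunk of time.
class PagesController < ApplicationController
  def show
    sleep 0.1 # the "Lots of Sleep" branch uses sleep 0.5 instead
  end
end
```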
I've added some commits to master since I originally forked the branches. If you try this, you may want to add some of them in.
```
$ bundle
$ rake assets:precompile
$ TIMES=100 rspec
```
Done.
What I get:
With 1000 pages:
```
$ rspec
                     user     system      total        real
no turbolinks   11.170000   0.980000  12.460000 (138.656728)
yes turbolinks  10.800000   0.870000  11.670000 ( 80.436286)
```
With 100 pages:
```
$ rspec
                     user     system      total        real
no turbolinks    1.640000   0.190000   2.140000 ( 15.652763)
yes turbolinks   1.120000   0.090000   1.210000 ( 7.776116)
```
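If you don't feel like doing the math: that's about 80 seconds of wall-clock time vs. 139 for 1000 pages, and about 7.8 vs. 15.7 for 100. Roughly twice as fast on the small run, roughly 1.7x on the big one, on my machine.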