#!/usr/bin/env perl
use Dancer2;
use DBI;

my $dbh = DBI->connect("dbi:SQLite:dbname=:memory:", undef, undef, { RaiseError => 1 });
$dbh->do("CREATE TABLE counter (cnt INTEGER)");
$dbh->do("INSERT INTO counter (cnt) VALUES (?)", undef, 0);

get '/' => sub {
    my $sth = $dbh->prepare("SELECT cnt FROM counter");
    $sth->execute;
    my ($counter) = $sth->fetchrow_array();
    $counter++;
    $dbh->do("UPDATE counter SET cnt=?", undef, $counter);

    return $counter;
};

__PACKAGE__->to_app;

In order to run this you'll need to have Dancer2, DBI, and DBD::SQLite installed:

cpanm Dancer2 DBI DBD::SQLite

Then you can launch the development server:

plackup dancer_counter_in_memory_sqlite.psgi

and access the application via http://localhost:5000/ — each request will return the incremented value of the counter.

This was an interesting example to write, but I am not sure what the value of an in-memory database would be for something like this, as it means the data won't survive a restart of the application server.

The data and the database are persistent among requests served by the same process, but each process will have its own separate database (and in any real-world situation you'll want to have more than one process). Moreover, if you need to restart the application server (e.g. Starman), you will lose all the data.
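The per-process isolation is easy to demonstrate. Here is a short Python sketch (used only for illustration; the same holds for DBD::SQLite in Perl): every connection to ":memory:" gets its own private SQLite database, just as every server process would.

```python
import sqlite3

# Each ":memory:" connection is a completely separate database,
# analogous to each application-server process having its own.
a = sqlite3.connect(":memory:")
b = sqlite3.connect(":memory:")

a.execute("CREATE TABLE counter (cnt INTEGER)")
a.execute("INSERT INTO counter (cnt) VALUES (0)")

# Connection "a" sees the table and the row...
print(a.execute("SELECT cnt FROM counter").fetchone())  # (0,)

# ...but connection "b" has its own empty database without the table.
try:
    b.execute("SELECT cnt FROM counter")
except sqlite3.OperationalError as e:
    print(e)  # no such table: counter
```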

An external cache used by the application might make more sense.

Using Redis, for example, would provide persistence as well as fast, atomic updates of the counter via its INCR command.