
datasette-auth-tokens

Datasette plugin for authenticating access using API tokens

Installation

Install this plugin in the same environment as Datasette.

$ pip install datasette-auth-tokens

Hard-coded tokens

Read about Datasette's authentication and permissions system.

This plugin lets you configure secret API tokens which can be used to make authenticated requests to Datasette.

First, create a random API token. A useful recipe for doing that is the following:

$ python -c 'import secrets; print(secrets.token_hex(32))'
5f9a486dd807de632200b17508c75002bb66ca6fde1993db1de6cbd446362589

Decide on the actor that this token should represent, for example:

{
    "bot_id": "my-bot"
}

You can then use "allow" blocks to provide that token with permission to access specific actions. To enable access to a configured writable SQL query you could use this in your metadata.json:

{
    "plugins": {
        "datasette-auth-tokens": {
            "tokens": [
                {
                    "token": {
                        "$env": "BOT_TOKEN"
                    },
                    "actor": {
                        "bot_id": "my-bot"
                    }
                }
            ]
        }
    },
    "databases": {
        ":memory:": {
            "queries": {
                "show_version": {
                    "sql": "select sqlite_version()",
                    "allow": {
                        "bot_id": "my-bot"
                    }
                }
            }
        }
    }
}

This uses Datasette's secret configuration values mechanism to allow the secret token to be passed as an environment variable.

Run Datasette like this:

BOT_TOKEN="this-is-the-secret-token" \
    datasette -m metadata.json

You can now run authenticated API queries like this:

$ curl -H 'Authorization: Bearer this-is-the-secret-token' \
  'http://127.0.0.1:8001/:memory:/show_version.json?_shape=array'
[{"sqlite_version()": "3.31.1"}]
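The matching logic this configuration describes can be sketched in a few lines. This is an illustrative sketch, not the plugin's actual source: the incoming bearer token is compared against each configured token and, on a match, that token's actor dictionary is used.

```python
import secrets

# Hard-coded tokens as they might be resolved from the configuration above
TOKENS = [
    {"token": "this-is-the-secret-token", "actor": {"bot_id": "my-bot"}},
]

def actor_from_header(authorization):
    # Expects an "Authorization: Bearer <token>" header value
    if not authorization or not authorization.startswith("Bearer "):
        return None
    incoming = authorization[len("Bearer "):]
    for entry in TOKENS:
        # compare_digest avoids leaking the secret through timing differences
        if secrets.compare_digest(entry["token"], incoming):
            return entry["actor"]
    return None
```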

Additionally you can allow passing the token as a query string parameter. This is disabled by default because of the security implications of including secret tokens in URLs, but it can be useful for embedding data from one service in another.

Simply enable it using the param config value:

{
    "plugins": {
        "datasette-auth-tokens": {
            "tokens": [
                {
                    "token": {
                        "$env": "BOT_TOKEN"
                    },
                    "actor": {
                        "bot_id": "my-bot"
                    }
                }
            ],
            "param": "_auth_token"
        }
    },
    "databases": {
        ":memory:": {
            "queries": {
                "show_version": {
                    "sql": "select sqlite_version()",
                    "allow": {
                        "bot_id": "my-bot"
                    }
                }
            }
        }
    }
}

You can now run authenticated API queries like this:

$ curl 'http://127.0.0.1:8001/:memory:/show_version.json?_shape=array&_auth_token=this-is-the-secret-token'
[{"sqlite_version()": "3.31.1"}]
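With "param" configured, the token arrives in the query string rather than the Authorization header. A minimal sketch of extracting it from a request URL (illustrative only, not the plugin's code):

```python
from urllib.parse import urlparse, parse_qs

def token_from_url(url, param="_auth_token"):
    # parse_qs returns a list per parameter; take the first value if present
    values = parse_qs(urlparse(url).query).get(param)
    return values[0] if values else None
```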

Tokens from your database

As an alternative (or in addition) to the hard-coded list of tokens you can store tokens in a database table and configure the plugin to access them using a SQL query.

Your query needs to take a :token_id parameter and return at least two columns: one called token_secret and one called actor_* - usually actor_id. Further actor_ prefixed columns can be returned to provide more details for the authenticated actor.

Here's a simple example of a configuration query:

select actor_id, actor_name, token_secret from tokens where token_id = :token_id

This can run against a table like this one:

token_id | token_secret | actor_id | actor_name
-------- | ------------ | -------- | ----------
1        | bd3c94f51fcd | 78       | Cleopaws
2        | 86681b4d6f66 | 32       | Pancakes

The tokens are formed as the token ID, then a hyphen, then the token secret. For example:

1-bd3c94f51fcd
2-86681b4d6f66

The SQL query will be executed with the portion before the hyphen as the :token_id parameter.

The token_secret value returned by the query will be compared to the portion of the token after the hyphen to check if the token is valid.

Columns with a prefix of actor_ will be used to populate the actor dictionary. In the above example, a token of 2-86681b4d6f66 will become an actor dictionary of {"id": 32, "name": "Pancakes"}.
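Putting those three steps together, the lookup can be sketched as follows. This is a sketch of the described behaviour, using an in-memory dict in place of the configured SQL query; the row shape matches the example table above.

```python
import secrets

# Stand-in for the tokens table from the example
ROWS = {
    "1": {"token_secret": "bd3c94f51fcd", "actor_id": 78, "actor_name": "Cleopaws"},
    "2": {"token_secret": "86681b4d6f66", "actor_id": 32, "actor_name": "Pancakes"},
}

def actor_for_token(token):
    # "2-86681b4d6f66" -> token_id "2", secret portion "86681b4d6f66"
    token_id, _, token_secret = token.partition("-")
    row = ROWS.get(token_id)
    if row is None or not secrets.compare_digest(row["token_secret"], token_secret):
        return None
    # actor_-prefixed columns become keys in the actor dictionary
    return {
        key[len("actor_"):]: value
        for key, value in row.items()
        if key.startswith("actor_")
    }
```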

To configure this, use a "query" block in your plugin configuration like this:

{
    "plugins": {
        "datasette-auth-tokens": {
            "query": {
                "sql": "select actor_id, actor_name, token_secret from tokens where token_id = :token_id",
                "database": "tokens"
            }
        }
    },
    "databases": {
        "tokens": {
            "allow": {}
        }
    }
}

The "sql" key here contains the SQL query. The "database" key has the name of the attached database file that the query should be executed against - in this case it would execute against tokens.db.

Securing your tokens

Anyone with access to your Datasette instance can use it to read the token_secret column in your tokens table. This probably isn't what you want!

To avoid this, you should lock down access to that table. The configuration example above shows how to do this using an "allow": {} block. Consult Datasette's Permissions documentation for more information about how to lock down this kind of access.


datasette-tiles

Datasette plugin for serving MBTiles map tiles

Installation

Install this plugin in the same environment as Datasette.

$ datasette install datasette-tiles

Demo

You can try this plugin out at https://datasette-tiles-demo.datasette.io/-/tiles

Usage

This plugin scans all database files connected to Datasette to see if any of them are valid MBTiles databases.
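Per the MBTiles specification, a "valid MBTiles database" is essentially a SQLite file exposing a metadata table and a tiles table (or equivalently-shaped views). A rough sketch of such a check - not the plugin's actual detection code - could look like this:

```python
import sqlite3

def looks_like_mbtiles(path):
    # An MBTiles file must provide "metadata" and "tiles" (tables or views)
    conn = sqlite3.connect(path)
    try:
        names = {
            row[0]
            for row in conn.execute(
                "select name from sqlite_master where type in ('table', 'view')"
            )
        }
        return {"metadata", "tiles"} <= names
    finally:
        conn.close()
```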

It can then serve tiles from those databases at the following URL:

/-/tiles/db-name/zoom/x/y.png

An example map for each database demonstrating the configured minimum and maximum zoom for that database can be found at /-/tiles/db-name - this can also be accessed via the table and database action menus for that database.

Visit /-/tiles for an index page of attached valid databases.

You can install the datasette-basemap plugin to get a default set of tiles in a database named basemap, handling zoom levels 0 to 6 using OpenStreetMap.

Tile coordinate systems

There are two tile coordinate systems in common use for online maps. The first is used by OpenStreetMap and Google Maps, the second is from a specification called Tile Map Service, or TMS.

Both systems use three components: z/x/y - where z is the zoom level, x is the column and y is the row.

The difference is in the way the y value is counted. OpenStreetMap has y=0 at the top. TMS has y=0 at the bottom.

An illustrative example: at zoom level 2 the map is divided into 16 total tiles. The OpenStreetMap scheme numbers them like so:

0/0  1/0  2/0  3/0
0/1  1/1  2/1  3/1
0/2  1/2  2/2  3/2
0/3  1/3  2/3  3/3

The TMS scheme looks like this:

0/3  1/3  2/3  3/3
0/2  1/2  2/2  3/2
0/1  1/1  2/1  3/1
0/0  1/0  2/0  3/0
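The two schemes differ only in how rows are counted: at zoom level z there are 2**z rows, so converting a y value between them is a single flip, and the same function works in both directions.

```python
def flip_y(z, y):
    # Convert y between OpenStreetMap/XYZ and TMS numbering (symmetric)
    return (2 ** z) - 1 - y
```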

datasette-tiles can serve tiles using either of these standards. For the OpenStreetMap / Google Maps 0-at-the-top system, use the following URL:

/-/tiles/database-name/{z}/{x}/{y}.png

For the TMS 0-at-the-bottom system, use this:

/-/tiles-tms/database-name/{z}/{x}/{y}.png

Configuring a Leaflet tile layer

The following JavaScript will configure a Leaflet TileLayer for use with this plugin:

var tiles = leaflet.tileLayer("/-/tiles/basemap/{z}/{x}/{y}.png", {
  minZoom: 0,
  maxZoom: 6,
  attribution: "\u00a9 OpenStreetMap contributors"
});

Tile stacks

datasette-tiles can be configured to serve tiles from multiple attached MBTiles files, searching each database in order for a tile and falling back to the next in line if that tile is not found.

For a demo of this in action, visit https://datasette-tiles-demo.datasette.io/-/tiles-stack and zoom in on Japan. It should start showing Stamen's Toner map of Japan once you get to zoom level 6 and 7.

The /-/tiles-stack/{z}/{x}/{y}.png endpoint provides this feature.

If you start Datasette like this:

datasette world.mbtiles country.mbtiles city1.mbtiles city2.mbtiles

Any requests for a tile from the /-/tiles-stack path will first check the city2 database, then city1, then country, then world.

If you have the datasette-basemap plugin installed it will be given special treatment: the basemap database will always be the last database checked for a tile.
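The fallback behaviour described above can be sketched like this - an illustrative sketch, not the plugin's code: databases are tried in reverse attach order, with basemap (if present) always checked last.

```python
def stack_order(attached):
    # Reverse attach order, excluding basemap, which always goes last
    order = [name for name in reversed(attached) if name != "basemap"]
    if "basemap" in attached:
        order.append("basemap")
    return order

def lookup_tile(tiles_by_db, order, z, x, y):
    # tiles_by_db maps database name -> {(z, x, y): tile_bytes}
    for name in order:
        tile = tiles_by_db.get(name, {}).get((z, x, y))
        if tile is not None:
            return tile
    return None
```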

Rather than rely on the order in which databases were attached, you can instead configure an explicit order using the tiles-stack-order plugin setting. Add the following to your metadata.json file:

{
    ""plugins"": {
        ""datasette-tiles"": {
            ""tiles-stack-order"": [""world"", ""country""]
        }
    }
}

You can then run Datasette like this:

datasette -m metadata.json country.mbtiles world.mbtiles

This endpoint serves tiles using the OpenStreetMap / Google Maps coordinate system. To load tiles using the TMS coordinate system use this endpoint instead:

/-/tiles-stack-tms/{z}/{x}/{y}.png

Retina tiles

Retina (double resolution) tiles are supported by datasette-tiles if the MBTiles database file contains 512x512 tile images instead of the default 256x256. JavaScript libraries such as Leaflet will display these tiles at a fixed 256x256 size, so high-DPI (retina) displays can render them at double resolution.
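To check which resolution a tile set uses, the pixel size can be read straight from a PNG tile's IHDR chunk (width and height live at byte offsets 16-24). This helper is not part of the plugin, just a sketch:

```python
import struct

def png_size(tile_data):
    # The 8-byte PNG signature is followed by the IHDR chunk:
    # 4-byte length, 4-byte type, then big-endian width and height
    if tile_data[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError("not a PNG")
    return struct.unpack(">II", tile_data[16:24])
```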

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd datasette-tiles
python3 -m venv venv
source venv/bin/activate

Or if you are using pipenv:

pipenv shell

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest