id,node_id,name,full_name,private,owner,html_url,description,fork,created_at,updated_at,pushed_at,homepage,size,stargazers_count,watchers_count,language,has_issues,has_projects,has_downloads,has_wiki,has_pages,forks_count,archived,disabled,open_issues_count,license,topics,forks,open_issues,watchers,default_branch,permissions,temp_clone_token,organization,network_count,subscribers_count,readme,readme_html,allow_forking,visibility,is_template,template_repository,web_commit_signoff_required,has_discussions 163790822,MDEwOlJlcG9zaXRvcnkxNjM3OTA4MjI=,datasette-sqlite-fts4,simonw/datasette-sqlite-fts4,0,9599,https://github.com/simonw/datasette-sqlite-fts4,Datasette plugin that adds custom SQL functions for working with SQLite FTS4,0,2019-01-02T03:40:41Z,2022-07-31T16:33:25Z,2022-07-31T14:46:26Z,https://datasette.io/plugins/datasette-sqlite-fts4,14,3,3,Python,1,1,1,1,0,1,0,0,0,apache-2.0,"[""datasette"", ""datasette-io"", ""datasette-plugin"", ""plugin""]",1,0,3,main,"{""admin"": false, ""maintain"": false, ""push"": false, ""triage"": false, ""pull"": false}",,,1,2,"# datasette-sqlite-fts4 [![PyPI](https://img.shields.io/pypi/v/datasette-sqlite-fts4.svg)](https://pypi.org/project/datasette-sqlite-fts4/) [![Changelog](https://img.shields.io/github/v/release/simonw/datasette-sqlite-fts4?include_prereleases&label=changelog)](https://github.com/simonw/datasette-sqlite-fts4/releases) [![Tests](https://github.com/simonw/datasette-sqlite-fts4/workflows/Test/badge.svg)](https://github.com/simonw/datasette-sqlite-fts4/actions?query=workflow%3ATest) [![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/datasette-sqlite-fts4/blob/main/LICENSE) Datasette plugin that exposes the custom SQL functions from [sqlite-fts4](https://github.com/simonw/sqlite-fts4). 
[Interactive demo](https://datasette-sqlite-fts4.datasette.io/24ways-fts4?sql=select%0D%0A++++json_object%28%0D%0A++++++++""label""%2C+articles.title%2C+""href""%2C+articles.url%0D%0A++++%29+as+article%2C%0D%0A++++articles.author%2C%0D%0A++++rank_score%28matchinfo%28articles_fts%2C+""pcx""%29%29+as+score%2C%0D%0A++++rank_bm25%28matchinfo%28articles_fts%2C+""pcnalx""%29%29+as+bm25%2C%0D%0A++++json_object%28%0D%0A++++++++""pre""%2C+annotate_matchinfo%28matchinfo%28articles_fts%2C+""pcxnalyb""%29%2C+""pcxnalyb""%29%0D%0A++++%29+as+annotated_matchinfo%2C%0D%0A++++matchinfo%28articles_fts%2C+""pcxnalyb""%29+as+matchinfo%2C%0D%0A++++decode_matchinfo%28matchinfo%28articles_fts%2C+""pcxnalyb""%29%29+as+decoded_matchinfo%0D%0Afrom%0D%0A++++articles_fts+join+articles+on+articles.rowid+%3D+articles_fts.rowid%0D%0Awhere%0D%0A++++articles_fts+match+%3Asearch%0D%0Aorder+by+bm25&search=jquery+maps). Read [Exploring search relevance algorithms with SQLite](https://simonwillison.net/2019/Jan/7/exploring-search-relevance-algorithms-sqlite/) for further details on this project. ## Installation pip install datasette-sqlite-fts4 If you are deploying a database using `datasette publish` you can include this plugin using the `--install` option: datasette publish now mydb.db --install=datasette-sqlite-fts4 ","

datasette-sqlite-fts4

Datasette plugin that exposes the custom SQL functions from sqlite-fts4.

Interactive demo. Read Exploring search relevance algorithms with SQLite for further details on this project.

Installation

pip install datasette-sqlite-fts4

If you are deploying a database using datasette publish, you can include this plugin using the --install option:

datasette publish now mydb.db --install=datasette-sqlite-fts4
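Among the functions the plugin exposes is decode_matchinfo(), which turns the binary blob returned by SQLite's matchinfo() into a list of integers. As a rough sketch of what that decoding involves (not the library's actual implementation), the blob is a sequence of unsigned 32-bit integers:

```python
import struct

def decode_matchinfo(blob):
    # matchinfo() returns its statistics as a blob of unsigned
    # 32-bit integers; unpack them into a Python list.
    # Assumes little-endian layout, as on most common platforms.
    return list(struct.unpack('<{}I'.format(len(blob) // 4), blob))

# Example with a hand-built three-integer blob:
print(decode_matchinfo(struct.pack('<3I', 1, 2, 2)))  # [1, 2, 2]
```

The rank_score() and rank_bm25() functions shown in the interactive demo build their relevance scores from these decoded statistics.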
",1,public,0,,0, 236110759,MDEwOlJlcG9zaXRvcnkyMzYxMTA3NTk=,datasette-auth-existing-cookies,simonw/datasette-auth-existing-cookies,0,9599,https://github.com/simonw/datasette-auth-existing-cookies,Datasette plugin that authenticates users based on existing domain cookies,0,2020-01-25T01:20:31Z,2022-05-28T01:50:15Z,2022-05-30T17:10:11Z,,54,3,3,Python,1,1,1,1,0,1,0,0,0,apache-2.0,"[""datasette"", ""datasette-io"", ""datasette-plugin""]",1,0,3,main,"{""admin"": false, ""maintain"": false, ""push"": false, ""triage"": false, ""pull"": false}",,,1,3,"# datasette-auth-existing-cookies [![PyPI](https://img.shields.io/pypi/v/datasette-auth-existing-cookies.svg)](https://pypi.org/project/datasette-auth-existing-cookies/) [![Changelog](https://img.shields.io/github/v/release/simonw/datasette-auth-existing-cookies?include_prereleases&label=changelog)](https://github.com/simonw/datasette-auth-existing-cookies/releases) [![Tests](https://github.com/simonw/datasette-auth-existing-cookies/workflows/Test/badge.svg)](https://github.com/simonw/datasette-auth-existing-cookies/actions?query=workflow%3ATest) [![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/datasette-auth-existing-cookies/blob/master/LICENSE) Datasette plugin that authenticates users based on existing domain cookies. ## When to use this This plugin allows you to build custom authentication for Datasette when you are hosting a Datasette instance on the same domain as another, authenticated website. Consider a website on `www.example.com` which supports user authentication. You could run Datasette on `data.example.com` in a way that lets it see cookies that were set for the `.example.com` domain. Using this plugin, you could build an API endpoint at `www.example.com/user-for-cookies` which returns a JSON object representing the currently signed-in user, based on their cookies. 
The plugin running on `data.example.com` will then make the `actor` available to the rest of Datasette based on the response from that API. Read about [Datasette's authentication and permissions system](https://docs.datasette.io/en/stable/authentication.html) for more on how actors and permissions work. ## Configuration This plugin requires some configuration in the Datasette [metadata.json file](https://datasette.readthedocs.io/en/stable/plugins.html#plugin-configuration). The following configuration options are supported: - `api_url`: this is the API endpoint that Datasette should call with the user's cookies in order to identify the logged in user. - `cookies`: optional. A list of cookie names that should be passed through to the API endpoint - if left blank, the default is to send all cookies. - `ttl`: optional. By default Datasette will make a request to the API endpoint for every HTTP request received by Datasette itself. A `ttl` value of 5 will cause Datasette to cache the actor associated with the user's cookies for 5 seconds, reducing that API traffic. - `headers`: an optional list of other headers to forward to the API endpoint as query string parameters. Here is an example that uses all four of these settings: ```json { ""plugins"": { ""datasette-auth-existing-cookies"": { ""api_url"": ""http://www.example.com/user-from-cookies"", ""cookies"": [""sessionid""], ""headers"": [""host""], ""ttl"": 10 } } } ``` With this configuration any hit to a Datasette hosted at `data.example.com` will result in the following request being made to the `http://www.example.com/user-from-cookies` API endpoint: ``` GET http://www.example.com/user-from-cookies?host=data.example.com Cookie: sessionid=abc123 ``` That API is expected to return a JSON object representing the current user: ```json { ""id"": 1, ""name"": ""Barry"" } ``` Since `ttl` is set to 10 that actor will be cached for ten seconds against that exact combination of cookies and headers. 
When that cache expires another hit will be made to the API. When deciding on a TTL value, take into account that users who lose access to the core site - maybe because their session expires, or their account is disabled - will still be able to access the Datasette instance until that cache expires. ","

datasette-auth-existing-cookies

Datasette plugin that authenticates users based on existing domain cookies.

When to use this

This plugin allows you to build custom authentication for Datasette when you are hosting a Datasette instance on the same domain as another, authenticated website.

Consider a website on www.example.com which supports user authentication.

You could run Datasette on data.example.com in a way that lets it see cookies that were set for the .example.com domain.

Using this plugin, you could build an API endpoint at www.example.com/user-for-cookies which returns a JSON object representing the currently signed-in user, based on their cookies.

The plugin running on data.example.com will then make the actor available to the rest of Datasette based on the response from that API.

Read about Datasette's authentication and permissions system for more on how actors and permissions work.

Configuration

This plugin requires some configuration in the Datasette metadata.json file.

The following configuration options are supported:

api_url: this is the API endpoint that Datasette should call with the user's cookies in order to identify the logged in user.

cookies: optional. A list of cookie names that should be passed through to the API endpoint - if left blank, the default is to send all cookies.

ttl: optional. By default Datasette will make a request to the API endpoint for every HTTP request received by Datasette itself. A ttl value of 5 will cause Datasette to cache the actor associated with the user's cookies for 5 seconds, reducing that API traffic.

headers: an optional list of other headers to forward to the API endpoint as query string parameters.

Here is an example that uses all four of these settings:

{
    ""plugins"": {
        ""datasette-auth-existing-cookies"": {
            ""api_url"": ""http://www.example.com/user-from-cookies"",
            ""cookies"": [""sessionid""],
            ""headers"": [""host""],
            ""ttl"": 10
        }
    }
}

With this configuration, any hit to a Datasette hosted at data.example.com will result in the following request being made to the http://www.example.com/user-from-cookies API endpoint:

GET http://www.example.com/user-from-cookies?host=data.example.com
Cookie: sessionid=abc123

That API is expected to return a JSON object representing the current user:

{
    ""id"": 1,
    ""name"": ""Barry""
}
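What that endpoint does internally is up to you. A minimal sketch of the lookup it might perform, assuming a hypothetical in-memory SESSIONS store in place of however www.example.com resolves its own session cookies:

```python
# Hypothetical session store: maps session cookie values to users.
SESSIONS = {'abc123': {'id': 1, 'name': 'Barry'}}

def actor_for_cookies(cookies):
    # Return the user for a valid session cookie, or None when the
    # cookies do not identify a signed-in user.
    return SESSIONS.get(cookies.get('sessionid'))

print(actor_for_cookies({'sessionid': 'abc123'}))  # {'id': 1, 'name': 'Barry'}
```

A real endpoint would serialize that return value as its JSON response body.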

Since ttl is set to 10, that actor will be cached for ten seconds against that exact combination of cookies and headers. When that cache expires, another hit will be made to the API.

When deciding on a TTL value, take into account that users who lose access to the core site - maybe because their session expires, or their account is disabled - will still be able to access the Datasette instance until that cache expires.
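The ttl caching behaviour described above can be sketched roughly like this (a simplified illustration, not the plugin's actual code):

```python
import time

class ActorCache:
    # Cache actors for ttl seconds, keyed on the exact combination
    # of cookies and headers sent to the API endpoint.
    def __init__(self, ttl, lookup):
        self.ttl = ttl
        self.lookup = lookup  # callable that hits the API endpoint
        self.cache = {}

    def get(self, key):
        now = time.monotonic()
        entry = self.cache.get(key)
        if entry is not None and now - entry[0] < self.ttl:
            return entry[1]  # cached and still fresh: skip the API call
        actor = self.lookup(key)
        self.cache[key] = (now, actor)
        return actor
```

With ttl set to 10, repeated requests carrying the same cookies and headers within ten seconds reuse the cached actor instead of hitting the API.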

",1,public,0,,, 275711254,MDEwOlJlcG9zaXRvcnkyNzU3MTEyNTQ=,datasette-write,simonw/datasette-write,0,9599,https://github.com/simonw/datasette-write,Datasette plugin providing a UI for executing SQL writes against the database,0,2020-06-29T02:27:31Z,2021-09-11T06:00:31Z,2021-09-11T06:03:07Z,https://datasette.io/plugins/datasette-write,15,3,3,Python,1,1,1,1,0,2,0,0,2,,"[""datasette-io"", ""datasette-plugin""]",2,2,3,main,"{""admin"": false, ""maintain"": false, ""push"": false, ""triage"": false, ""pull"": false}",,,2,2,"# datasette-write [![PyPI](https://img.shields.io/pypi/v/datasette-write.svg)](https://pypi.org/project/datasette-write/) [![Changelog](https://img.shields.io/github/v/release/simonw/datasette-write?label=changelog)](https://github.com/simonw/datasette-write/releases) [![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/datasette-write/blob/master/LICENSE) Datasette plugin providing a UI for writing to a database ## Installation Install this plugin in the same environment as Datasette. $ pip install datasette-write ## Usage Having installed the plugin, visit `/-/write` on your Datasette instance to submit SQL queries that will be executed against a write connection to the specified database. By default only the `root` user can access the page - so you'll need to run Datasette with the `--root` option and click on the link shown in the terminal to sign in and access the page. The `datasette-write` permission governs access. You can use permission plugins such as [datasette-permissions-sql](https://github.com/simonw/datasette-permissions-sql) to grant additional access to the write interface. ## Development To set up this plugin locally, first checkout the code. Then create a new virtual environment: cd datasette-write python3 -mvenv venv source venv/bin/activate Or if you are using `pipenv`: pipenv shell Now install the dependencies and tests: pip install -e '.[test]' To run the tests: pytest ","

datasette-write

Datasette plugin providing a UI for writing to a database

Installation

Install this plugin in the same environment as Datasette.

$ pip install datasette-write

Usage

Having installed the plugin, visit /-/write on your Datasette instance to submit SQL queries that will be executed against a write connection to the specified database.

By default, only the root user can access the page, so you'll need to run Datasette with the --root option and click on the link shown in the terminal to sign in and access the page.

The datasette-write permission governs access. You can use permission plugins such as datasette-permissions-sql to grant additional access to the write interface.
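The default behaviour described above amounts to a simple check (an illustration only, not the plugin's actual source):

```python
def can_write(actor):
    # Default rule: only the root actor may access /-/write, unless
    # a permission plugin grants the datasette-write permission more
    # broadly.
    return actor is not None and actor.get('id') == 'root'

print(can_write({'id': 'root'}))  # True
print(can_write(None))  # False
```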

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd datasette-write
python3 -mvenv venv
source venv/bin/activate

Or if you are using pipenv:

pipenv shell

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest
",,,,,, 374846311,MDEwOlJlcG9zaXRvcnkzNzQ4NDYzMTE=,datasette-geojson,eyeseast/datasette-geojson,0,25778,https://github.com/eyeseast/datasette-geojson,Add GeoJSON output to Datasette queries,0,2021-06-08T01:33:19Z,2022-02-16T19:59:42Z,2022-02-16T20:02:49Z,,1102,3,3,Python,1,1,1,1,0,1,0,0,3,,"[""datasette-io"", ""datasette-plugin"", ""geojson"", ""gis"", ""sqlite""]",1,3,3,main,"{""admin"": false, ""maintain"": false, ""push"": false, ""triage"": false, ""pull"": false}",,,1,1,"# datasette-geojson [![PyPI](https://img.shields.io/pypi/v/datasette-geojson.svg)](https://pypi.org/project/datasette-geojson/) [![Changelog](https://img.shields.io/github/v/release/eyeseast/datasette-geojson?include_prereleases&label=changelog)](https://github.com/eyeseast/datasette-geojson/releases) [![Tests](https://github.com/eyeseast/datasette-geojson/workflows/Test/badge.svg)](https://github.com/eyeseast/datasette-geojson/actions?query=workflow%3ATest) [![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/eyeseast/datasette-geojson/blob/main/LICENSE) Add GeoJSON as an output option for datasette queries. ## Installation Install this plugin in the same environment as Datasette. datasette install datasette-geojson ## Usage To render GeoJSON, add a `.geojson` extension to any query URL that includes a `geometry` column. That column should be a valid [GeoJSON geometry](https://datatracker.ietf.org/doc/html/rfc7946#section-3.1). For example, you might use [geojson-to-sqlite](https://pypi.org/project/geojson-to-sqlite/) or [shapefile-to-sqlite](https://pypi.org/project/shapefile-to-sqlite/) to load [neighborhood boundaries](https://bostonopendata-boston.opendata.arcgis.com/datasets/3525b0ee6e6b427f9aab5d0a1d0a1a28_0/explore) into a SQLite database. 
```sh wget -O neighborhoods.geojson https://opendata.arcgis.com/datasets/3525b0ee6e6b427f9aab5d0a1d0a1a28_0.geojson geojson-to-sqlite boston.db neighborhoods neighborhoods.geojson --spatial-index # create a spatial index datasette serve boston.db --load-extension spatialite ``` If you're using Spatialite, the geometry column will be in a binary format. If not, make sure the `geometry` column is a well-formed [GeoJSON geometry](https://datatracker.ietf.org/doc/html/rfc7946#section-3.1). If you used `geojson-to-sqlite` or `shapefile-to-sqlite`, you should be all set. Run this query in Datasette and you'll see a link to download GeoJSON: ```sql select rowid, OBJECTID, Name, Acres, Neighborhood_ID, SqMiles, ShapeSTArea, ShapeSTLength, geometry from neighborhoods order by rowid limit 101 ``` Note that the geometry column needs to be explicitly _named_ `geometry` or you won't get the option to export GeoJSON. If you want to use a different column, rename it with `AS`: `SELECT other AS geometry FROM my_table`. ![export geojson](img/export-options.png) ## Development To set up this plugin locally, first checkout the code. Then create a new virtual environment: cd datasette-geojson python3 -mvenv venv source venv/bin/activate Or if you are using `pipenv`: pipenv shell Now install the dependencies and tests: pip install -e '.[test]' To run the tests: pytest ","

datasette-geojson

Add GeoJSON as an output option for Datasette queries.

Installation

Install this plugin in the same environment as Datasette.

datasette install datasette-geojson

Usage

To render GeoJSON, add a .geojson extension to any query URL that includes a geometry column. That column should be a valid GeoJSON geometry.

For example, you might use geojson-to-sqlite or shapefile-to-sqlite to load neighborhood boundaries into a SQLite database.

wget -O neighborhoods.geojson https://opendata.arcgis.com/datasets/3525b0ee6e6b427f9aab5d0a1d0a1a28_0.geojson
geojson-to-sqlite boston.db neighborhoods neighborhoods.geojson --spatial-index # create a spatial index
datasette serve boston.db --load-extension spatialite

If you're using SpatiaLite, the geometry column will be in a binary format. If not, make sure the geometry column is a well-formed GeoJSON geometry. If you used geojson-to-sqlite or shapefile-to-sqlite, you should be all set.
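One rough way to sanity-check that a column value is a well-formed GeoJSON geometry, using only the standard library (RFC 7946 imposes stricter rules on coordinates than this sketch enforces):

```python
import json

GEOMETRY_TYPES = {
    'Point', 'MultiPoint', 'LineString', 'MultiLineString',
    'Polygon', 'MultiPolygon', 'GeometryCollection',
}

def looks_like_geojson_geometry(value):
    # True when the value parses as JSON and carries a GeoJSON
    # geometry type plus the member that type requires.
    try:
        obj = json.loads(value)
    except (TypeError, ValueError):
        return False
    if not isinstance(obj, dict) or obj.get('type') not in GEOMETRY_TYPES:
        return False
    key = 'geometries' if obj['type'] == 'GeometryCollection' else 'coordinates'
    return key in obj

point = json.dumps({'type': 'Point', 'coordinates': [0, 0]})
print(looks_like_geojson_geometry(point))  # True
```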

Run this query in Datasette and you'll see a link to download GeoJSON:

select
  rowid,
  OBJECTID,
  Name,
  Acres,
  Neighborhood_ID,
  SqMiles,
  ShapeSTArea,
  ShapeSTLength,
  geometry
from
  neighborhoods
order by
  rowid
limit
  101

Note that the geometry column needs to be explicitly named geometry or you won't get the option to export GeoJSON. If you want to use a different column, rename it with AS: SELECT other AS geometry FROM my_table.

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd datasette-geojson
python3 -mvenv venv
source venv/bin/activate

Or if you are using pipenv:

pipenv shell

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest
",1,public,0,,,