id,node_id,name,full_name,private,owner,html_url,description,fork,created_at,updated_at,pushed_at,homepage,size,stargazers_count,watchers_count,language,has_issues,has_projects,has_downloads,has_wiki,has_pages,forks_count,archived,disabled,open_issues_count,license,topics,forks,open_issues,watchers,default_branch,permissions,temp_clone_token,organization,network_count,subscribers_count,readme,readme_html,allow_forking,visibility,is_template,template_repository,web_commit_signoff_required,has_discussions
236867027,MDEwOlJlcG9zaXRvcnkyMzY4NjcwMjc=,datasette-sentry,simonw/datasette-sentry,0,9599,https://github.com/simonw/datasette-sentry,Datasette plugin for configuring Sentry,0,2020-01-28T23:41:27Z,2022-07-18T20:28:25Z,2022-10-06T22:31:29Z,,26,6,6,Python,1,1,1,1,0,0,0,0,0,apache-2.0,"[""datasette"", ""datasette-io"", ""datasette-plugin"", ""sentry""]",0,0,6,main,"{""admin"": false, ""maintain"": false, ""push"": false, ""triage"": false, ""pull"": false}",,,0,2,"# datasette-sentry
[PyPI](https://pypi.org/project/datasette-sentry/)
[Changelog](https://github.com/simonw/datasette-sentry/releases)
[Tests](https://github.com/simonw/datasette-sentry/actions?query=workflow%3ATest)
[License](https://github.com/simonw/datasette-sentry/blob/main/LICENSE)
Datasette plugin for configuring Sentry for error reporting
## Installation
pip install datasette-sentry
## Usage
This plugin only takes effect if your `metadata.json` file contains relevant top-level plugin configuration in a `""datasette-sentry""` configuration key.
You will need a Sentry DSN - see their [Getting Started instructions](https://docs.sentry.io/error-reporting/quickstart/?platform=python).
Add it to `metadata.json` like this:
```json
{
  ""plugins"": {
    ""datasette-sentry"": {
      ""dsn"": ""https://KEY@sentry.io/PROJECTID""
    }
  }
}
```
Settings in `metadata.json` are visible to anyone who visits the `/-/metadata` URL, so this is a good place to take advantage of Datasette's [secret configuration values](https://datasette.readthedocs.io/en/stable/plugins.html#secret-configuration-values), in which case your configuration will look more like this:
```json
{
  ""plugins"": {
    ""datasette-sentry"": {
      ""dsn"": {
        ""$env"": ""SENTRY_DSN""
      }
    }
  }
}
```
Then make a `SENTRY_DSN` environment variable available to Datasette.
## Configuration
In addition to the `dsn` setting, you can also configure the Sentry [sample rate](https://docs.sentry.io/platforms/python/configuration/sampling/) by setting `sample_rate` to a floating point number between 0 and 1.
For example, to capture 25% of errors you would do this:
```json
{
  ""plugins"": {
    ""datasette-sentry"": {
      ""dsn"": {
        ""$env"": ""SENTRY_DSN""
      },
      ""sample_rate"": 0.25
    }
  }
}
```
","
datasette-sentry
Datasette plugin for configuring Sentry for error reporting
Installation
pip install datasette-sentry
Usage
This plugin only takes effect if your metadata.json file contains relevant top-level plugin configuration in a ""datasette-sentry"" configuration key.
Settings in metadata.json are visible to anyone who visits the /-/metadata URL so this is a good place to take advantage of Datasette's secret configuration values, in which case your configuration will look more like this:
",1,public,0,,0,
245856731,MDEwOlJlcG9zaXRvcnkyNDU4NTY3MzE=,datasette-search-all,simonw/datasette-search-all,0,9599,https://github.com/simonw/datasette-search-all,Datasette plugin for searching all searchable tables at once,0,2020-03-08T17:21:54Z,2021-12-19T04:06:49Z,2022-10-05T01:53:33Z,,186,6,6,Python,1,1,1,1,0,2,0,0,0,apache-2.0,"[""datasette"", ""datasette-io"", ""datasette-plugin"", ""search""]",2,0,6,main,"{""admin"": false, ""maintain"": false, ""push"": false, ""triage"": false, ""pull"": false}",,,2,2,"# datasette-search-all
[PyPI](https://pypi.org/project/datasette-search-all/)
[Changelog](https://github.com/simonw/datasette-search-all/releases)
[Tests](https://github.com/simonw/datasette-search-all/actions?query=workflow%3ATest)
[License](https://github.com/simonw/datasette-search-all/blob/main/LICENSE)
Datasette plugin for searching all searchable tables at once.
## Installation
Install the plugin in the same Python environment as Datasette:
pip install datasette-search-all
## Background
See [datasette-search-all: a new plugin for searching multiple Datasette tables at once](https://simonwillison.net/2020/Mar/9/datasette-search-all/) for background on this project. You can try the plugin out at https://fara.datasettes.com/
## Usage
This plugin only works if at least one of the tables connected to your Datasette instance has been configured for SQLite's full-text search.
The [Datasette search documentation](https://docs.datasette.io/en/stable/full_text_search.html) includes details on how to enable full-text search for a table.
You can also use the following tools:
* [sqlite-utils](https://sqlite-utils.datasette.io/en/stable/cli.html#configuring-full-text-search) includes a command-line tool for enabling full-text search.
* [datasette-enable-fts](https://github.com/simonw/datasette-enable-fts) is a Datasette plugin that adds a web interface for enabling search for specific columns.
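For orientation, here is a hedged sketch of what enabling full-text search looks like in raw SQL, assuming a hypothetical `documents` table with `title` and `body` columns - the tools listed above automate this step for you:
```sql
-- Hypothetical example: create an FTS5 index over an existing 'documents'
-- table so Datasette can detect it as searchable.
create virtual table documents_fts using fts5(
  title, body,
  content='documents', content_rowid='rowid'
);
-- Populate the index from the existing rows.
insert into documents_fts (rowid, title, body)
  select rowid, title, body from documents;
```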
If the plugin detects at least one searchable table it will add a search form to the homepage.
You can also navigate to `/-/search` on your Datasette instance to use the search interface directly.
## Screenshot

","
datasette-search-all
Datasette plugin for searching all searchable tables at once.
Installation
Install the plugin in the same Python environment as Datasette:
sqlite-utils includes a command-line tool for enabling full-text search.
datasette-enable-fts is a Datasette plugin that adds a web interface for enabling search for specific columns.
If the plugin detects at least one searchable table it will add a search form to the homepage.
You can also navigate to /-/search on your Datasette instance to use the search interface directly.
Screenshot
",1,public,0,,0,
247527438,MDEwOlJlcG9zaXRvcnkyNDc1Mjc0Mzg=,datasette-edit-schema,simonw/datasette-edit-schema,0,9599,https://github.com/simonw/datasette-edit-schema,Datasette plugin for modifying table schemas,0,2020-03-15T18:34:06Z,2022-07-01T22:20:25Z,2022-08-22T22:45:58Z,,133,6,6,JavaScript,1,1,1,1,0,0,0,0,10,apache-2.0,"[""datasette"", ""datasette-io"", ""datasette-plugin""]",0,10,6,main,"{""admin"": false, ""maintain"": false, ""push"": false, ""triage"": false, ""pull"": false}",,,0,1,"# datasette-edit-schema
[PyPI](https://pypi.org/project/datasette-edit-schema/)
[Changelog](https://github.com/simonw/datasette-edit-schema/releases)
[Tests](https://github.com/simonw/datasette-edit-schema/actions?query=workflow%3ATest)
[License](https://github.com/simonw/datasette-edit-schema/blob/master/LICENSE)
Datasette plugin for modifying table schemas
## Features
* Add new columns to a table
* Rename columns in a table
* Modify the type of columns in a table
* Re-order the columns in a table
* Rename a table
* Delete a table
## Installation
Install this plugin in the same environment as Datasette.
$ pip install datasette-edit-schema
## Usage
Navigate to `/-/edit-schema/dbname/tablename` on your Datasette instance to edit a specific table.
Use `/-/edit-schema/dbname` to create a new table in a specific database.
By default only [the root actor](https://datasette.readthedocs.io/en/stable/authentication.html#using-the-root-actor) can access the page - so you'll need to run Datasette with the `--root` option and click on the link shown in the terminal to sign in and access the page.
## Permissions
The `edit-schema` permission governs access. You can use permission plugins such as [datasette-permissions-sql](https://github.com/simonw/datasette-permissions-sql) to grant additional access to the write interface.
These permission checks will call the `permission_allowed()` plugin hook with three arguments:
- `action` will be the string `""edit-schema""`
- `actor` will be the currently authenticated actor - usually a dictionary
- `resource` will be the string name of the database
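To make this concrete, here is a hedged sketch of the kind of allow-list query a SQL-based permissions plugin could evaluate for this check - a query that returns at least one row grants access. The `staff_users` table and the parameter names are illustrative only; check the documentation of the permissions plugin you use for the exact parameters it supplies:
```sql
-- Illustrative allow-list query (table and parameter names are hypothetical):
-- grant the edit-schema action on the 'content' database to listed staff.
select 1
from staff_users
where staff_users.actor_id = :actor_id
  and :action = 'edit-schema'
  and :resource = 'content';
```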
## Screenshot

## Development
To set up this plugin locally, first check out the code. Then create a new virtual environment:
cd datasette-edit-schema
python3 -mvenv venv
source venv/bin/activate
Or if you are using `pipenv`:
pipenv shell
Now install the dependencies and test dependencies:
pip install -e '.[test]'
To run the tests:
pytest
","
datasette-edit-schema
Datasette plugin for modifying table schemas
Features
Add new columns to a table
Rename columns in a table
Modify the type of columns in a table
Re-order the columns in a table
Rename a table
Delete a table
Installation
Install this plugin in the same environment as Datasette.
$ pip install datasette-edit-schema
Usage
Navigate to /-/edit-schema/dbname/tablename on your Datasette instance to edit a specific table.
Use /-/edit-schema/dbname to create a new table in a specific database.
By default only the root actor can access the page - so you'll need to run Datasette with the --root option and click on the link shown in the terminal to sign in and access the page.
Permissions
The edit-schema permission governs access. You can use permission plugins such as datasette-permissions-sql to grant additional access to the write interface.
These permission checks will call the permission_allowed() plugin hook with three arguments:
action will be the string ""edit-schema""
actor will be the currently authenticated actor - usually a dictionary
resource will be the string name of the database
Screenshot
Development
To set up this plugin locally, first checkout the code. Then create a new virtual environment:
cd datasette-edit-schema
python3 -mvenv venv
source venv/bin/activate
Or if you are using pipenv:
pipenv shell
Now install the dependencies and tests:
pip install -e '.[test]'
To run the tests:
pytest
",1,public,0,,0,
299143849,MDEwOlJlcG9zaXRvcnkyOTkxNDM4NDk=,datasette-dateutil,simonw/datasette-dateutil,0,9599,https://github.com/simonw/datasette-dateutil,dateutil functions for Datasette,0,2020-09-28T00:14:20Z,2022-03-01T00:09:57Z,2022-03-01T01:40:21Z,,18,6,6,Python,1,1,1,1,0,0,0,0,2,,"[""datasette"", ""datasette-io"", ""datasette-plugin"", ""dateutil""]",0,2,6,main,"{""admin"": false, ""maintain"": false, ""push"": false, ""triage"": false, ""pull"": false}",,,0,2,"# datasette-dateutil
[PyPI](https://pypi.org/project/datasette-dateutil/)
[Changelog](https://github.com/simonw/datasette-dateutil/releases)
[Tests](https://github.com/simonw/datasette-dateutil/actions?query=workflow%3ATest)
[License](https://github.com/simonw/datasette-dateutil/blob/main/LICENSE)
dateutil functions for Datasette
## Installation
Install this plugin in the same environment as Datasette.
$ datasette install datasette-dateutil
## Usage
This plugin adds custom SQL functions that expose functionality from the [dateutil](https://dateutil.readthedocs.io/) Python library.
Once installed, the following SQL functions become available:
### Parsing date strings
- `dateutil_parse(text)` - returns an ISO8601 date string parsed from the text, or `null` if the input could not be parsed. `dateutil_parse(""10 october 2020 3pm"")` returns `2020-10-10T15:00:00`.
- `dateutil_parse_fuzzy(text)` - same as `dateutil_parse()` but also works against strings that contain a date somewhere within them - that date will be returned, or `null` if no dates could be found. `dateutil_parse_fuzzy(""This is due 10 september"")` returns `2020-09-10T00:00:00` (the missing year defaults to the current year, so run in 2021 this would return the 2021 date instead).
The `dateutil_parse()` and `dateutil_parse_fuzzy()` functions both follow the American convention of assuming that `1/2/2020` lists the month first, evaluating this example to the 2nd of January.
If you want to assume that the day comes first, use these two functions instead:
- `dateutil_parse_dayfirst(text)`
- `dateutil_parse_fuzzy_dayfirst(text)`
Here's a query demonstrating these functions:
```sql
select
dateutil_parse(""10 october 2020 3pm""),
dateutil_parse_fuzzy(""This is due 10 september""),
dateutil_parse(""1/2/2020""),
dateutil_parse(""2020-03-04""),
dateutil_parse_dayfirst(""2020-03-04"");
```
[Try that query](https://latest-with-plugins.datasette.io/fixtures?sql=select%0D%0A++dateutil_parse%28%2210+october+2020+3pm%22%29%2C%0D%0A++dateutil_parse_fuzzy%28%22This+is+due+10+september%22%29%2C%0D%0A++dateutil_parse%28%221%2F2%2F2020%22%29%2C%0D%0A++dateutil_parse%28%222020-03-04%22%29%2C%0D%0A++dateutil_parse_dayfirst%28%222020-03-04%22%29%3B)
### Optional default dates
The `dateutil_parse()`, `dateutil_parse_fuzzy()`, `dateutil_parse_dayfirst()` and `dateutil_parse_fuzzy_dayfirst()` functions all accept an optional second argument specifying a ""default"" datetime to consider if some of the details are missing. For example, the following:
```sql
select dateutil_parse('1st october', '1985-01-01')
```
Will return `1985-10-01T00:00:00` - the missing year is replaced with the year from the default date.
[Example query demonstrating the default date argument](https://latest-with-plugins.datasette.io/fixtures?sql=with+times+as+%28%0D%0A++select%0D%0A++++datetime%28%27now%27%29+as+t%0D%0A++union%0D%0A++select%0D%0A++++datetime%28%27now%27%2C+%27-1+year%27%29%0D%0A++union%0D%0A++select%0D%0A++++datetime%28%27now%27%2C+%27-3+years%27%29%0D%0A%29%0D%0Aselect+t%2C+dateutil_parse_fuzzy%28%22This+is+due+10+september%22%2C+t%29+from+times)
### Calculating Easter
- `dateutil_easter(year)` - returns the date for Easter in that year, for example `dateutil_easter(""2020"")` returns `2020-04-12`.
[Example Easter query](https://latest-with-plugins.datasette.io/fixtures?sql=select%0D%0A++dateutil_easter%282019%29%2C%0D%0A++dateutil_easter%282020%29%2C%0D%0A++dateutil_easter%282021%29)
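A minimal query, mirroring the linked example, that compares Easter across consecutive years:
```sql
-- dateutil_easter() takes a year and returns the date of Easter for it
select
  dateutil_easter(2019),
  dateutil_easter(2020),  -- returns 2020-04-12, as noted above
  dateutil_easter(2021);
```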
### JSON arrays of dates
Several functions return JSON arrays of date strings. These can be used with SQLite's `json_each()` function to perform joins against dates from a specific date range or recurrence rule.
These functions can return up to 10,000 results. They will return an error if more than 10,000 dates would be returned - this is to protect against denial of service attacks.
- `dateutil_dates_between('1 january 2020', '5 jan 2020')` - given two dates (in any format that can be handled by `dateutil_parse()`) this function returns a JSON string containing the dates between those two days, inclusive. This example returns `[""2020-01-01"", ""2020-01-02"", ""2020-01-03"", ""2020-01-04"", ""2020-01-05""]`.
- `dateutil_dates_between('1 january 2020', '5 jan 2020', 0)` - set the optional third argument to `0` to specify that you would like this to be exclusive of the last day. This example returns `[""2020-01-01"", ""2020-01-02"", ""2020-01-03"", ""2020-01-04""]`.
[Try these queries](https://latest-with-plugins.datasette.io/fixtures?sql=select%0D%0A++dateutil_dates_between%28%271+january+2020%27%2C+%275+jan+2020%27%29%2C%0D%0A++dateutil_dates_between%28%271+january+2020%27%2C+%275+jan+2020%27%2C+0%29)
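Those two examples can be run together as a single query - this is essentially what the linked query above does:
```sql
-- Inclusive range, then the same range with the exclusive third argument
select
  dateutil_dates_between('1 january 2020', '5 jan 2020'),
  dateutil_dates_between('1 january 2020', '5 jan 2020', 0);
```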
The `dateutil_rrule()` and `dateutil_rrule_date()` functions accept the iCalendar standard `rrule` format - see [the dateutil documentation](https://dateutil.readthedocs.io/en/stable/rrule.html#rrulestr-examples) for more examples.
This format lets you specify recurrence rules such as ""the next four last mondays of the month"".
- `dateutil_rrule(rrule, optional_dtstart)` - given an rrule returns a JSON array of ISO datetimes. The second argument is optional and will be treated as the start date for the rule.
- `dateutil_rrule_date(rrule, optional_dtstart)` - same as `dateutil_rrule()` but returns ISO dates.
Example query:
```sql
select
  dateutil_rrule('FREQ=HOURLY;COUNT=5'),
  dateutil_rrule_date(
    'FREQ=DAILY;COUNT=3',
    '1st jan 2020'
  );
```
[Try the rrule example query](https://latest-with-plugins.datasette.io/fixtures?sql=select%0D%0A++dateutil_rrule('FREQ%3DHOURLY%3BCOUNT%3D5')%2C%0D%0A++dateutil_rrule_date(%0D%0A++++'FREQ%3DDAILY%3BCOUNT%3D3'%2C%0D%0A++++'1st+jan+2020'%0D%0A++)%3B)
### Joining data using json_each()
SQLite's [json_each() function](https://www.sqlite.org/json1.html#jeach) can be used to turn a JSON array of dates into a table that can be joined against other data. Here's a query that returns a table showing every day in January 2019:
```sql
select
  value as date
from
  json_each(
    dateutil_dates_between('1 Jan 2019', '31 Jan 2019')
  )
```
[Try that query](https://latest-with-plugins.datasette.io/fixtures?sql=select%0D%0A++value+as+date%0D%0Afrom%0D%0A++json_each%28%0D%0A++++dateutil_dates_between%28%271+Jan+2019%27%2C+%2731+Jan+2019%27%29%0D%0A++%29)
You can run joins against this table by assigning it a name using SQLite's [support for Common Table Expressions (CTEs)](https://sqlite.org/lang_with.html).
This example query uses `substr(created, 0, 11)` to retrieve the date portion of the `created` column in the [facetable demo table](https://latest-with-plugins.datasette.io/fixtures/facetable), then joins that against the table of days in January to calculate the count of rows created on each day. The `LEFT JOIN` against `days_in_january` ensures that days which had no created records are still returned in the results, with a count of 0.
```sql
with created_dates as (
  select
    substr(created, 0, 11) as date
  from
    facetable
),
days_in_january as (
  select
    value as date
  from
    json_each(
      dateutil_dates_between('1 Jan 2019', '31 Jan 2019')
    )
)
select
  days_in_january.date,
  count(created_dates.date) as total
from
  days_in_january
  left join created_dates on days_in_january.date = created_dates.date
group by
  days_in_january.date;
```
[Try that query](https://latest-with-plugins.datasette.io/fixtures?sql=with+created_dates+as+%28%0D%0A++select%0D%0A++++substr%28created%2C+0%2C+11%29+as+date%0D%0A++from%0D%0A++++facetable%0D%0A%29%2C%0D%0Adays_in_january+as+%28%0D%0A++select%0D%0A++++value+as+date%0D%0A++from%0D%0A++++json_each%28%0D%0A++++++dateutil_dates_between%28%271+Jan+2019%27%2C+%2731+Jan+2019%27%29%0D%0A++++%29%0D%0A%29%0D%0Aselect%0D%0A++days_in_january.date%2C%0D%0A++count%28created_dates.date%29+as+total%0D%0Afrom%0D%0A++days_in_january%0D%0A++left+join+created_dates+on+days_in_january.date+%3D+created_dates.date%0D%0Agroup+by%0D%0A++days_in_january.date%3B#g.mark=bar&g.x_column=date&g.x_type=ordinal&g.y_column=total&g.y_type=quantitative) with a bar chart rendered using the [datasette-vega](https://github.com/simonw/datasette-vega) plugin.
## Development
To set up this plugin locally, first check out the code. Then create a new virtual environment:
cd datasette-dateutil
python3 -mvenv venv
source venv/bin/activate
Or if you are using `pipenv`:
pipenv shell
Now install the dependencies and test dependencies:
pip install -e '.[test]'
To run the tests:
pytest
","
datasette-dateutil
dateutil functions for Datasette
Installation
Install this plugin in the same environment as Datasette.
$ datasette install datasette-dateutil
Usage
This function adds custom SQL functions that expose functionality from the dateutil Python library.
Once installed, the following SQL functions become available:
Parsing date strings
dateutil_parse(text) - returns an ISO8601 date string parsed from the text, or null if the input could not be parsed. dateutil_parse(""10 october 2020 3pm"") returns 2020-10-10T15:00:00.
dateutil_parse_fuzzy(text) - same as dateutil_parse() but this also works against strings that contain a date somewhere within them - that date will be returned, or null if no dates could be found. dateutil_parse_fuzzy(""This is due 10 september"") returns 2020-09-10T00:00:00 (but will start returning the 2021 version of that if the year is 2021).
The dateutil_parse() and dateutil_parse_fuzzy() functions both follow the American convention of assuming that 1/2/2020 lists the month first, evaluating this example to the 2nd of January.
If you want to assume that the day comes first, use these two functions instead:
dateutil_parse_dayfirst(text)
dateutil_parse_fuzzy_dayfirst(text)
Here's a query demonstrating these functions:
select
dateutil_parse(""10 october 2020 3pm""),
dateutil_parse_fuzzy(""This is due 10 september""),
dateutil_parse(""1/2/2020""),
dateutil_parse(""2020-03-04""),
dateutil_parse_dayfirst(""2020-03-04"");
The dateutil_parse(), dateutil_parse_fuzzy(), dateutil_parse_dayfirst() and dateutil_parse_fuzzy_dayfirst() functions all accept an optional second argument specifying a ""default"" datetime to consider if some of the details are missing. For example, the following:
Several functions return JSON arrays of date strings. These can be used with SQLite's json_each() function to perform joins against dates from a specific date range or recurrence rule.
These functions can return up to 10,000 results. They will return an error if more than 10,000 dates would be returned - this is to protect against denial of service attacks.
dateutil_dates_between('1 january 2020', '5 jan 2020') - given two dates (in any format that can be handled by dateutil_parse()) this function returns a JSON string containing the dates between those two days, inclusive. This example returns [""2020-01-01"", ""2020-01-02"", ""2020-01-03"", ""2020-01-04"", ""2020-01-05""].
dateutil_dates_between('1 january 2020', '5 jan 2020', 0) - set the optional third argument to 0 to specify that you would like this to be exclusive of the last day. This example returns [""2020-01-01"", ""2020-01-02"", ""2020-01-03"", ""2020-01-04""].
The dateutil_rrule() and dateutil_rrule_date() functions accept the iCalendar standard ``rrule` format - see the dateutil documentation for more examples.
This format lets you specify recurrence rules such as ""the next four last mondays of the month"".
dateutil_rrule(rrule, optional_dtsart) - given an rrule returns a JSON array of ISO datetimes. The second argument is optional and will be treated as the start date for the rule.
dateutil_rrule_date(rrule, optional_dtsart) - same as dateutil_rrule() but returns ISO dates.
Example query:
select
dateutil_rrule('FREQ=HOURLY;COUNT=5'),
dateutil_rrule_date(
'FREQ=DAILY;COUNT=3',
'1st jan 2020'
);
SQLite's json_each() function can be used to turn a JSON array of dates into a table that can be joined against other data. Here's a query that returns a table showing every day in January 2019:
select
value as date
from
json_each(
dateutil_dates_between('1 Jan 2019', '31 Jan 2019')
)
This example query uses substr(created, 0, 11) to retrieve the date portion of the created column in the facetable demo table, then joins that against the table of days in January to calculate the count of rows created on each day. The LEFT JOIN against days_in_january ensures that days which had no created records are still returned in the results, with a count of 0.
with created_dates as (
select
substr(created, 0, 11) as date
from
facetable
),
days_in_january as (
select
value as date
from
json_each(
dateutil_dates_between('1 Jan 2019', '31 Jan 2019')
)
)
select
days_in_january.date,
count(created_dates.date) as total
from
days_in_january
left join created_dates on days_in_january.date = created_dates.date
group by
days_in_january.date;