id,node_id,name,full_name,private,owner,html_url,description,fork,created_at,updated_at,pushed_at,homepage,size,stargazers_count,watchers_count,language,has_issues,has_projects,has_downloads,has_wiki,has_pages,forks_count,archived,disabled,open_issues_count,license,topics,forks,open_issues,watchers,default_branch,permissions,temp_clone_token,organization,network_count,subscribers_count,readme,readme_html,allow_forking,visibility,is_template,template_repository,web_commit_signoff_required,has_discussions
175550127,MDEwOlJlcG9zaXRvcnkxNzU1NTAxMjc=,yaml-to-sqlite,simonw/yaml-to-sqlite,0,9599,https://github.com/simonw/yaml-to-sqlite,Utility for converting YAML files to SQLite,0,2019-03-14T04:49:08Z,2021-06-13T09:04:40Z,2021-06-13T04:45:52Z,,19,36,36,Python,1,1,1,1,0,2,0,0,0,apache-2.0,"[""yaml"", ""sqlite"", ""datasette-io"", ""datasette-tool""]",2,0,36,main,"{""admin"": false, ""push"": false, ""pull"": false}",,,2,1,"# yaml-to-sqlite
Load the contents of a YAML file into a SQLite database table.
```
$ yaml-to-sqlite --help
Usage: yaml-to-sqlite [OPTIONS] DB_PATH TABLE YAML_FILE

  Convert YAML files to SQLite

Options:
  --version             Show the version and exit.
  --pk TEXT             Column to use as a primary key
  --single-column TEXT  If YAML file is a list of values, populate this column
  --help                Show this message and exit.
```
## Usage
Given a `news.yml` file containing the following:
```yaml
- date: 2021-06-05
  body: |-
    [Datasette 0.57](https://docs.datasette.io/en/stable/changelog.html#v0-57) is out with an important security patch.
- date: 2021-05-10
  body: |-
    [Django SQL Dashboard](https://simonwillison.net/2021/May/10/django-sql-dashboard/) is a new tool that brings a useful authenticated subset of Datasette to Django projects that are built on top of PostgreSQL.
```
Running this command:
```bash
$ yaml-to-sqlite news.db stories news.yml
```
Will create a database file with this schema:
```bash
$ sqlite-utils schema news.db
CREATE TABLE [stories] (
   [date] TEXT,
   [body] TEXT
);
```
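You can verify the imported rows with `sqlite-utils` (a quick check, assuming you have it installed; `-t` formats the output as a table):
```bash
$ sqlite-utils rows news.db stories -t
```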
The `--pk` option can be used to set a column as the primary key for the table:
```bash
$ yaml-to-sqlite news.db stories news.yml --pk date
$ sqlite-utils schema news.db
CREATE TABLE [stories] (
   [date] TEXT PRIMARY KEY,
   [body] TEXT
);
```
## Single column YAML lists
The `--single-column` option can be used when the YAML file is a list of values, for example a file called `dogs.yml` containing the following:
```yaml
- Cleo
- Pancakes
- Nixie
```
Running this command:
```bash
$ yaml-to-sqlite dogs.db dogs dogs.yml --single-column=name
```
Will create a single `dogs` table with a single `name` column that is the primary key:
```bash
$ sqlite-utils schema dogs.db
CREATE TABLE [dogs] (
   [name] TEXT PRIMARY KEY
);
$ sqlite-utils dogs.db 'select * from dogs' -t
name
--------
Cleo
Pancakes
Nixie
```
","
",,,,,,
291339086,MDEwOlJlcG9zaXRvcnkyOTEzMzkwODY=,airtable-export,simonw/airtable-export,0,9599,https://github.com/simonw/airtable-export,"Export Airtable data to YAML, JSON or SQLite files on disk",0,2020-08-29T19:51:37Z,2021-06-08T17:30:30Z,2021-04-09T23:41:52Z,https://datasette.io/tools/airtable-export,41,33,33,Python,1,1,1,1,0,5,0,0,6,apache-2.0,"[""yaml"", ""airtable"", ""airtable-api"", ""datasette-io"", ""datasette-tool""]",5,6,33,main,"{""admin"": false, ""push"": false, ""pull"": false}",,,5,3,"# airtable-export
Export Airtable data to files on disk
## Installation
Install this tool using `pip`:

    $ pip install airtable-export
## Usage
You will need to know the following information:
- Your Airtable base ID - this is a string starting with `app...`
- Your Airtable API key - this is a string starting with `key...`
- The names of each of the tables that you wish to export
You can export all of your data to a folder called `export/` by running the following:

    airtable-export export base_id table1 table2 --key=key
This example would create two files: `export/table1.yml` and `export/table2.yml`.
Rather than passing the API key using the `--key` option you can set it as an environment variable called `AIRTABLE_KEY`.
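For example, in a bash-style shell (the `key...` value here is a placeholder for your real API key):

    export AIRTABLE_KEY=key...
    airtable-export export base_id table1 table2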
## Export options
By default the tool exports your data as YAML.
You can also export as JSON or as [newline delimited JSON](http://ndjson.org/) using the `--json` or `--ndjson` options:

    airtable-export export base_id table1 table2 --key=key --ndjson
You can pass multiple format options at once. This command will create a `.json`, `.yml` and `.ndjson` file for each exported table:

    airtable-export export base_id table1 table2 \
      --key=key --ndjson --yaml --json
### SQLite database export
You can export tables to a SQLite database file using the `--sqlite database.db` option:

    airtable-export export base_id table1 table2 \
      --key=key --sqlite database.db

This can be combined with other format options. If you only specify `--sqlite`, the export directory argument will be ignored.
The SQLite database will have a table created for each table you export. Those tables will have a primary key column called `airtable_id`.
If you run this command against an existing SQLite database, records with matching primary keys will be overwritten by new records from the export.
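One way to see what was created is to inspect the database with [sqlite-utils](https://sqlite-utils.datasette.io/) (a sketch; the table names will match the Airtable tables you exported):

    sqlite-utils tables database.db
    sqlite-utils schema database.db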
## Request options
By default the tool uses [python-httpx](https://www.python-httpx.org)'s default configurations.
You can override the `user-agent` using the `--user-agent` option:

    airtable-export export base_id table1 table2 --key=key --user-agent ""Airtable Export Robot""
You can override the [timeout during a network read operation](https://www.python-httpx.org/advanced/#fine-tuning-the-configuration) using the `--http-read-timeout` option. If not set, this defaults to 5s.

    airtable-export export base_id table1 table2 --key=key --http-read-timeout 60
## Running this using GitHub Actions
[GitHub Actions](https://github.com/features/actions) is GitHub's workflow automation product. You can use it to run `airtable-export` in order to back up your Airtable data to a GitHub repository. Doing this gives you a visible commit history of changes you make to your Airtable data - like [this one](https://github.com/natbat/rockybeaches/commits/main/airtable).
To run this for your own Airtable database you'll first need to add the following secrets to your GitHub repository:
- `AIRTABLE_BASE_ID`: The base ID, a string beginning `app...`
- `AIRTABLE_KEY`: Your Airtable API key
- `AIRTABLE_TABLES`: A space separated list of the Airtable tables that you want to back up. If any of these contain spaces you will need to enclose them in single quotes, e.g. 'My table with spaces in the name' OtherTableWithNoSpaces
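If you use the [GitHub CLI](https://cli.github.com/), one way to add these is with `gh secret set`, which prompts for each value (a sketch, assuming `gh` is installed and authenticated against your repository):

    gh secret set AIRTABLE_BASE_ID
    gh secret set AIRTABLE_KEY
    gh secret set AIRTABLE_TABLES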
Once you have set those secrets, add the following as a file called `.github/workflows/backup-airtable.yml`:
```yaml
name: Backup Airtable

on:
  workflow_dispatch:
  schedule:
  - cron: '32 0 * * *'

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - name: Check out repo
      uses: actions/checkout@v2
    - name: Set up Python
      uses: actions/setup-python@v2
      with:
        python-version: 3.8
    - uses: actions/cache@v2
      name: Configure pip caching
      with:
        path: ~/.cache/pip
        key: ${{ runner.os }}-pip-
        restore-keys: |
          ${{ runner.os }}-pip-
    - name: Install airtable-export
      run: |
        pip install airtable-export
    - name: Backup Airtable to backups/
      env:
        AIRTABLE_BASE_ID: ${{ secrets.AIRTABLE_BASE_ID }}
        AIRTABLE_KEY: ${{ secrets.AIRTABLE_KEY }}
        AIRTABLE_TABLES: ${{ secrets.AIRTABLE_TABLES }}
      run: |-
        airtable-export backups $AIRTABLE_BASE_ID $AIRTABLE_TABLES -v
    - name: Commit and push if it changed
      run: |-
        git config user.name ""Automated""
        git config user.email ""actions@users.noreply.github.com""
        git add -A
        timestamp=$(date -u)
        git commit -m ""Latest data: ${timestamp}"" || exit 0
        git push
```
This will run once a day (at 32 minutes past midnight UTC) and will also run if you manually click the ""Run workflow"" button; see [GitHub Actions: Manual triggers with workflow_dispatch](https://github.blog/changelog/2020-07-06-github-actions-manual-triggers-with-workflow_dispatch/).
## Development
To contribute to this tool, first check out the code. Then create a new virtual environment:

    cd airtable-export
    python -mvenv venv
    source venv/bin/activate

Or if you are using `pipenv`:

    pipenv shell

Now install the dependencies and test dependencies:

    pip install -e '.[test]'

To run the tests:

    pytest
","
",,,,,,
291359358,MDEwOlJlcG9zaXRvcnkyOTEzNTkzNTg=,datasette-yaml,simonw/datasette-yaml,0,9599,https://github.com/simonw/datasette-yaml,Export Datasette records as YAML,0,2020-08-29T22:32:15Z,2020-12-28T03:20:36Z,2021-05-13T08:59:53Z,,7,2,2,Python,1,1,1,1,0,1,0,0,1,,"[""yaml"", ""datasette"", ""datasette-plugin"", ""datasette-io""]",1,1,2,main,"{""admin"": false, ""push"": false, ""pull"": false}",,,1,1,"# datasette-yaml
Export Datasette records as YAML
## Installation
Install this plugin in the same environment as Datasette.

    $ datasette install datasette-yaml
## Usage
Having installed this plugin, every table and query will gain a new `.yaml` export link.
You can also construct these URLs directly: `/dbname/tablename.yaml`
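For example, fetching a table as YAML with `curl` (a sketch; the hostname, database and table names here are hypothetical):

    curl 'https://your-instance.example.com/dbname/tablename.yaml'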
## Demo
The plugin is running on [covid-19.datasettes.com](https://covid-19.datasettes.com/) - for example [/covid/latest_ny_times_counties_with_populations.yaml](https://covid-19.datasettes.com/covid/latest_ny_times_counties_with_populations.yaml)
## Development
To set up this plugin locally, first check out the code. Then create a new virtual environment:

    cd datasette-yaml
    python3 -mvenv venv
    source venv/bin/activate

Or if you are using `pipenv`:

    pipenv shell

Now install the dependencies and test dependencies:

    pip install -e '.[test]'

To run the tests:

    pytest
","