id,node_id,name,full_name,private,owner,html_url,description,fork,created_at,updated_at,pushed_at,homepage,size,stargazers_count,watchers_count,language,has_issues,has_projects,has_downloads,has_wiki,has_pages,forks_count,archived,disabled,open_issues_count,license,topics,forks,open_issues,watchers,default_branch,permissions,temp_clone_token,organization,network_count,subscribers_count,readme,readme_html,allow_forking,visibility,is_template,template_repository,web_commit_signoff_required,has_discussions 272098486,MDEwOlJlcG9zaXRvcnkyNzIwOTg0ODY=,datasette-psutil,simonw/datasette-psutil,0,9599,https://github.com/simonw/datasette-psutil,Datasette plugin adding a /-/psutil debugging endpoint,0,2020-06-13T22:57:07Z,2022-03-07T15:36:30Z,2022-03-07T15:35:57Z,https://datasette.io/plugins/datasette-psutil,12,2,2,Python,1,1,1,1,0,0,0,0,1,apache-2.0,"[""datasette"", ""datasette-io"", ""datasette-plugin"", ""psutil""]",0,1,2,main,"{""admin"": false, ""maintain"": false, ""push"": false, ""triage"": false, ""pull"": false}",,,0,2,"# datasette-psutil [![PyPI](https://img.shields.io/pypi/v/datasette-psutil.svg)](https://pypi.org/project/datasette-psutil/) [![Changelog](https://img.shields.io/github/v/release/simonw/datasette-psutil?include_prereleases&label=changelog)](https://github.com/simonw/datasette-psutil/releases) [![Tests](https://github.com/simonw/datasette-psutil/workflows/Test/badge.svg)](https://github.com/simonw/datasette-psutil/actions?query=workflow%3ATest) [![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/datasette-psutil/blob/main/LICENSE) Datasette plugin adding a `/-/psutil` debugging endpoint ## Installation Install this plugin in the same environment as Datasette. $ pip install datasette-psutil ## Usage Visit `/-/psutil` on your Datasette instance to see various information provided by [psutil](https://psutil.readthedocs.io/). ## Demo https://latest-with-plugins.datasette.io/-/psutil is a live demo of this plugin, hosted on Google Cloud Run. ","

datasette-psutil

Datasette plugin adding a /-/psutil debugging endpoint

Installation

Install this plugin in the same environment as Datasette.

$ pip install datasette-psutil

Usage

Visit /-/psutil on your Datasette instance to see a range of system and process information provided by psutil.
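
The endpoint reports data gathered with the psutil library. The following sketch shows the kind of psutil calls involved; it illustrates the underlying library, not the plugin's exact implementation:

import psutil

# System-wide statistics
print(psutil.cpu_percent(interval=1))  # CPU utilization as a percentage
print(psutil.virtual_memory())         # memory usage totals
print(psutil.disk_usage('/'))          # disk usage for the root partition

# Statistics for the current process
process = psutil.Process()
print(process.memory_info())
print(process.open_files())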

Demo

https://latest-with-plugins.datasette.io/-/psutil is a live demo of this plugin, hosted on Google Cloud Run.

",1,public,0,,, 274293597,MDEwOlJlcG9zaXRvcnkyNzQyOTM1OTc=,datasette-block-robots,simonw/datasette-block-robots,0,9599,https://github.com/simonw/datasette-block-robots,Datasette plugin that blocks robots and crawlers using robots.txt,0,2020-06-23T02:52:23Z,2022-08-30T16:13:40Z,2022-08-30T16:25:38Z,https://datasette.io/plugins/datasette-block-robots,21,2,2,Python,1,1,1,1,0,0,0,0,0,,"[""datasette"", ""datasette-io"", ""datasette-plugin"", ""robots-txt""]",0,0,2,main,"{""admin"": false, ""maintain"": false, ""push"": false, ""triage"": false, ""pull"": false}",,,0,2,"# datasette-block-robots [![PyPI](https://img.shields.io/pypi/v/datasette-block-robots.svg)](https://pypi.org/project/datasette-block-robots/) [![Changelog](https://img.shields.io/github/v/release/simonw/datasette-block-robots?label=changelog)](https://github.com/simonw/datasette-block-robots/releases) [![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/datasette-block-robots/blob/master/LICENSE) Datasette plugin that blocks robots and crawlers using robots.txt ## Installation Install this plugin in the same environment as Datasette. $ pip install datasette-block-robots ## Usage Having installed the plugin, `/robots.txt` on your Datasette instance will return the following: User-agent: * Disallow: / This will request all robots and crawlers not to visit any of the pages on your site. Here's a demo of the plugin in action: https://sqlite-generate-demo.datasette.io/robots.txt ## Configuration By default the plugin will block all access to the site, using `Disallow: /`. If you want the index page to be indexed by search engines without crawling the database, table or row pages themselves, you can use the following: ```json { ""plugins"": { ""datasette-block-robots"": { ""allow_only_index"": true } } } ``` This will return a `/robots.txt` like so: User-agent: * Disallow: /db1 Disallow: /db2 With a `Disallow` line for every attached database. To block access to specific areas of the site using custom paths, add this to your `metadata.json` configuration file: ```json { ""plugins"": { ""datasette-block-robots"": { ""disallow"": [""/mydatabase/mytable""] } } } ``` This will result in a `/robots.txt` that looks like this: User-agent: * Disallow: /mydatabase/mytable Alternatively you can set the full contents of the `robots.txt` file using the `literal` configuration option. Here's how to do that if you are using YAML rather than JSON and have a `metadata.yml` file: ```yaml plugins: datasette-block-robots: literal: |- User-agent: * Disallow: / User-agent: Bingbot User-agent: Googlebot Disallow: ``` This example would block all crawlers with the exception of Googlebot and Bingbot, which are allowed to crawl the entire site. ## Extending this with other plugins This plugin adds a new [plugin hook](https://docs.datasette.io/en/stable/plugin_hooks.html) to Datasete called `block_robots_extra_lines()` which can be used by other plugins to add their own additional lines to the `robots.txt` file. The hook can optionally accept these parameters: - `datasette`: The current [Datasette instance](https://docs.datasette.io/en/stable/internals.html#datasette-class). You can use this to execute SQL queries or read plugin configuration settings. - `request`: The [Request object](https://docs.datasette.io/en/stable/internals.html#request-object) representing the incoming request to `/robots.txt`. The hook should return a list of strings, each representing a line to be added to the `robots.txt` file. 
It can also return an `async def` function, which will be awaited and used to generate a list of lines. Use this option if you need to make `await` calls inside you hook implementation. This example uses the hook to add a `Sitemap: http://example.com/sitemap.xml` line to the `robots.txt` file: ```python from datasette import hookimpl @hookimpl def block_robots_extra_lines(datasette, request): return [ ""Sitemap: {}"".format(datasette.absolute_url(request, ""/sitemap.xml"")), ] ``` This example blocks access to paths based on a database query: ```python @hookimpl def block_robots_extra_lines(datasette): async def inner(): db = datasette.get_database() result = await db.execute(""select path from mytable"") return [ ""Disallow: /{}"".format(row[""path""]) for row in result ] return inner ``` [datasette-sitemap](https://datasette.io/plugins/datasette-sitemap) is an example of a plugin that uses this hook. ## Development To set up this plugin locally, first checkout the code. Then create a new virtual environment: cd datasette-block-robots python3 -mvenv venv source venv/bin/activate Or if you are using `pipenv`: pipenv shell Now install the dependencies and tests: pip install -e '.[test]' To run the tests: pytest ","

datasette-block-robots

Datasette plugin that blocks robots and crawlers using robots.txt

Installation

Install this plugin in the same environment as Datasette.

$ pip install datasette-block-robots

Usage

Once the plugin is installed, /robots.txt on your Datasette instance will return the following:

User-agent: *
Disallow: /

This will request all robots and crawlers not to visit any of the pages on your site.

Here's a demo of the plugin in action: https://sqlite-generate-demo.datasette.io/robots.txt
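
To check how a well-behaved crawler will interpret the file, you can parse it with Python's standard-library urllib.robotparser module. This is a verification sketch using the demo instance above, not part of the plugin:

import urllib.robotparser

parser = urllib.robotparser.RobotFileParser()
parser.set_url('https://sqlite-generate-demo.datasette.io/robots.txt')
parser.read()

# With the default Disallow: / rule this prints False for any path
print(parser.can_fetch('*', 'https://sqlite-generate-demo.datasette.io/some/page'))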

Configuration

By default the plugin will block all access to the site, using Disallow: /.

If you want search engines to index the index page but not crawl the database, table or row pages themselves, you can use the following configuration:

{
    ""plugins"": {
        ""datasette-block-robots"": {
            ""allow_only_index"": true
        }
    }
}

This will return a /robots.txt like so:

User-agent: *
Disallow: /db1
Disallow: /db2

With a Disallow line for every attached database.

To block access to specific areas of the site using custom paths, add this to your metadata.json configuration file:

{
    ""plugins"": {
        ""datasette-block-robots"": {
            ""disallow"": [""/mydatabase/mytable""]
        }
    }
}

This will result in a /robots.txt that looks like this:

User-agent: *
Disallow: /mydatabase/mytable

Alternatively you can set the full contents of the robots.txt file using the literal configuration option. Here's how to do that if you are using YAML rather than JSON and have a metadata.yml file:

plugins:
    datasette-block-robots:
        literal: |-
            User-agent: *
            Disallow: /
            User-agent: Bingbot
            User-agent: Googlebot
            Disallow:

This example would block all crawlers with the exception of Googlebot and Bingbot, which are allowed to crawl the entire site.
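
The same configuration can also be written in metadata.json by escaping the newlines; this should be equivalent to the YAML above:

{
    ""plugins"": {
        ""datasette-block-robots"": {
            ""literal"": ""User-agent: *\nDisallow: /\nUser-agent: Bingbot\nUser-agent: Googlebot\nDisallow:""
        }
    }
}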

Extending this with other plugins

This plugin adds a new plugin hook to Datasette called block_robots_extra_lines(), which other plugins can use to add their own lines to the robots.txt file.

The hook can optionally accept these parameters:

- datasette: The current Datasette instance. You can use this to execute SQL queries or read plugin configuration settings.
- request: The Request object representing the incoming request to /robots.txt.

The hook should return a list of strings, each representing a line to be added to the robots.txt file.

It can also return an async def function, which will be awaited and used to generate a list of lines. Use this option if you need to make await calls inside your hook implementation.

This example uses the hook to add a Sitemap: http://example.com/sitemap.xml line to the robots.txt file:

from datasette import hookimpl

@hookimpl
def block_robots_extra_lines(datasette, request):
    return [
        ""Sitemap: {}"".format(datasette.absolute_url(request, ""/sitemap.xml"")),
    ]

This example blocks access to paths based on a database query:

@hookimpl
def block_robots_extra_lines(datasette):
    async def inner():
        db = datasette.get_database()
        result = await db.execute(""select path from mytable"")
        return [
            ""Disallow: /{}"".format(row[""path""]) for row in result
        ]
    return inner

datasette-sitemap is an example of a plugin that uses this hook.

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd datasette-block-robots
python3 -mvenv venv
source venv/bin/activate

Or if you are using pipenv:

pipenv shell

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest
",1,public,0,,0, 438003374,R_kgDOGhtmrg,datasette-pretty-traces,simonw/datasette-pretty-traces,0,9599,https://github.com/simonw/datasette-pretty-traces,Prettier formatting for ?_trace=1 traces,0,2021-12-13T19:43:28Z,2021-12-19T20:40:10Z,2022-01-14T02:08:51Z,,22,2,2,JavaScript,1,1,1,1,0,0,0,0,0,apache-2.0,"[""datasette"", ""datasette-io"", ""datasette-plugin""]",0,0,2,main,"{""admin"": false, ""maintain"": false, ""push"": false, ""triage"": false, ""pull"": false}",,,0,1,"# datasette-pretty-traces [![PyPI](https://img.shields.io/pypi/v/datasette-pretty-traces.svg)](https://pypi.org/project/datasette-pretty-traces/) [![Changelog](https://img.shields.io/github/v/release/simonw/datasette-pretty-traces?include_prereleases&label=changelog)](https://github.com/simonw/datasette-pretty-traces/releases) [![Tests](https://github.com/simonw/datasette-pretty-traces/workflows/Test/badge.svg)](https://github.com/simonw/datasette-pretty-traces/actions?query=workflow%3ATest) [![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/datasette-pretty-traces/blob/main/LICENSE) Prettier formatting for `?_trace=1` traces ## Installation Install this plugin in the same environment as Datasette. $ datasette install datasette-pretty-traces ## Usage Once installed, run Datasette using `--setting trace_debug 1`: datasette fixtures.db --setting trace_debug 1 Then navigate to any page and add `?_trace=` to the URL: http://localhost:8001/?_trace=1 The plugin will scroll you down the page to the visualized trace information. ## Demo You can try out the demo here: - [/?_trace=1](https://latest-with-plugins.datasette.io/?_trace=1) tracing the homepage - [/github/commits?_trace=1](https://latest-with-plugins.datasette.io/github/commits?_trace=1) tracing a table page ## Screenshot ![Screenshot showing the visualization produced by the plugin](https://user-images.githubusercontent.com/9599/145883732-a53accdd-5feb-4629-94cd-f73407c7943d.png) ## Development To set up this plugin locally, first checkout the code. Then create a new virtual environment: cd datasette-pretty-traces python3 -mvenv venv source venv/bin/activate Or if you are using `pipenv`: pipenv shell Now install the dependencies and test dependencies: pip install -e '.[test]' To run the tests: pytest ","

datasette-pretty-traces

Prettier formatting for ?_trace=1 traces

Installation

Install this plugin in the same environment as Datasette.

$ datasette install datasette-pretty-traces

Usage

Once installed, run Datasette using --setting trace_debug 1:

datasette fixtures.db --setting trace_debug 1

Then navigate to any page and add ?_trace=1 to the URL:

http://localhost:8001/?_trace=1

The plugin will scroll the page down to the visualized trace information.

Demo

You can try out the demo here:

- /?_trace=1 tracing the homepage: https://latest-with-plugins.datasette.io/?_trace=1
- /github/commits?_trace=1 tracing a table page: https://latest-with-plugins.datasette.io/github/commits?_trace=1

Screenshot

Screenshot showing the visualization produced by the plugin: https://user-images.githubusercontent.com/9599/145883732-a53accdd-5feb-4629-94cd-f73407c7943d.png

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd datasette-pretty-traces
python3 -mvenv venv
source venv/bin/activate

Or if you are using pipenv:

pipenv shell

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest
",1,public,0,,,