Pylowdb


Simple to use local JSON database.

# This is plain Python, not specific to pylowdb ;)
db.data['posts'] = [{ 'id': 1, 'title': 'pylowdb is awesome' }]

# Save to file
db.write()

# db.json
{
  "posts": [
    { "id": 1, "title": "pylowdb is awesome" }
  ]
}

Support me

"Buy Me A Coffee"

Features

  • Lightweight
  • Minimalist and easy to learn API
  • Query and modify data using plain Python
  • Atomic write
  • Hackable:
    • Change storage, file format (JSON, YAML, …) or add encryption via adapters

Install

pip install pylowdb

Usage

import os
from os import path
from pylowdb import (
    Low,
    JSONFile,
)

# Use JSON file for storage
file = path.join(os.getcwd(), 'db.json')
adapter = JSONFile(file)
db = Low(adapter)

# Read data from JSON file, this will set db.data content
db.read()

# If db.json doesn't exist, db.data will be None
# Set default data
db.data = db.data or { 'posts': [] }

# Create and query items using plain Python

db.data['posts'].append('hello world')
db.data['posts'][0]

# You can also use this syntax if you prefer
posts = db.data['posts']
posts.append('hello world')

# Write db.data content to db.json
db.write()

# db.json
{
  "posts": [ "hello world" ]
}

More examples

For more examples, see the examples/ directory.

API

Classes

Pylowdb provides the Low class, which works with synchronous adapters.

Low(adapter)

from pylowdb import (
    Low,
    JSONFile,
)
db = Low(JSONFile('file.json'))
db.read()
db.write()

Methods

db.read()

Calls adapter.read() and sets db.data.

Note: the JSONFile adapter will set db.data to None if the file doesn’t exist.

db.data  # is None
db.read()
db.data # is not None

db.write()

Calls adapter.write(db.data).

db.data = { 'posts': [] }
db.write() # file.json will be { "posts": [] }
db.data = {}
db.write() # file.json will be {}

Properties

db.data

Holds your db content. If you’re using the adapters bundled with pylowdb, it can be any type supported by json.dumps.

For example:

db.data = 'string'
db.data = [1, 2, 3]
db.data = { 'key': 'value' }

Adapters

Pylowdb adapters

JSONFile

Adapter for reading and writing JSON files.

Low(JSONFile(filename))

Memory

In-memory adapter. Useful for speeding up unit tests.

Low(Memory())
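
For example, a unit test can run entirely in memory (a minimal sketch; it assumes Memory starts out empty, so the first read() leaves db.data as None):

from pylowdb import (
    Low,
    Memory,
)

def test_add_post():
    db = Low(Memory())
    db.read()  # nothing stored yet, so db.data is None (assumption)
    db.data = db.data or { 'posts': [] }
    db.data['posts'].append('hello world')
    db.write()  # stays in memory, nothing touches disk
    assert db.data['posts'] == ['hello world']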

YAMLFile

Adapter for reading and writing YAML files.

Low(YAMLFile(filename))

TextFile

Adapter for reading and writing plain text. Useful for creating custom adapters (see Custom serialization below).

Third-party adapters

If you’ve published an adapter for pylowdb, feel free to create a PR to add it here.

Writing your own adapter

You may want to create an adapter to write db.data to YAML or XML, to encrypt data, to use remote storage, …

An adapter is a simple class that just needs to expose two methods:

class CustomAdapter:
    def read(self):
        # should return deserialized data
        pass

    def write(self, data):
        # should return nothing
        pass

For example, let’s say you have some API-based storage and want to create an adapter for it:

api = YourAPI()

class CustomAdapter:
    # Optional: your adapter can take arguments

    def __init__(self, *args, **kwargs):
        pass

    def read(self):
        data = api.read()
        return data

    def write(self, data):
        api.write(data)

adapter = CustomAdapter()
db = Low(adapter)

See pylowdb/adapters for more examples.

Custom serialization

To create an adapter for a format other than JSON, you can use TextFile.

For example:

from pylowdb import (
    Adapter,
    Low,
    TextFile,
)
import yaml

class YAMLFile(Adapter):
    def __init__(self, filename: str):
        self.adapter = TextFile(filename)

    def read(self):
        data = self.adapter.read()
        if data is None:
            return None
        else:
            return yaml.safe_load(data)  # PyYAML

    def write(self, obj):
        return self.adapter.write(yaml.dump(obj))

adapter = YAMLFile('file.yaml')
db = Low(adapter)
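
The same pattern works for encryption. A hedged sketch, assuming the third-party cryptography package (pip install cryptography; not part of pylowdb) and composing with the TextFile adapter:

import json
from cryptography.fernet import Fernet  # assumption: cryptography is installed
from pylowdb import (
    Low,
    TextFile,
)

class EncryptedJSONFile:
    def __init__(self, filename: str, key: bytes):
        self.adapter = TextFile(filename)
        self.fernet = Fernet(key)

    def read(self):
        data = self.adapter.read()
        if data is None:
            return None
        # Fernet tokens are URL-safe base64, so they round-trip as text
        return json.loads(self.fernet.decrypt(data.encode()))

    def write(self, data):
        token = self.fernet.encrypt(json.dumps(data).encode())
        self.adapter.write(token.decode())

key = Fernet.generate_key()  # store this somewhere safe
db = Low(EncryptedJSONFile('db.json.enc', key))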

Limits

If you store large Python objects (~10-100 MB), you may hit performance issues, because every call to db.write() serializes the whole of db.data and writes it to storage.

Depending on your use case, this may or may not be acceptable. It can be mitigated by batching operations and calling db.write() only when you need to persist, as in the sketch below.
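
For example, group many changes and write once at the end:

# One write after a batch of changes, instead of one write per change
for i in range(1000):
    db.data['posts'].append({ 'id': i, 'title': f'post {i}' })

db.write()  # db.data is serialized to disk a single time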

If you plan to scale, it’s highly recommended to use databases like PostgreSQL, MySQL, Oracle, …
