Yo boys, let's make a perfect TriKV


let myStore = new TriKV()

myStore.get(key, options, cb) // returns a promise; cb is called on each find
myStore.put(key, value, options, cb) // guaranteed insert; resolves when the value is safe to get
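For concreteness, here's a minimal in-memory stand-in for that API showing the calling convention. The class name and signatures follow the notes above, but nothing here touches real Cloudflare KV; it's a sketch only:

```javascript
// In-memory stand-in for the TriKV API above (not the real KV-backed store).
class TriKV {
    constructor() { this.store = new Map() }
    async get(key, options = {}, cb) {
        const value = this.store.get(key)
        if (value !== undefined && cb) cb(value)  // cb fires on each find
        return value
    }
    async put(key, value, options = {}, cb) {
        this.store.set(key, value)                // resolves once it's safe to get
        if (cb) cb(key)
        return key
    }
}

(async () => {
    const myStore = new TriKV()
    await myStore.put('doc:1', 'hello')
    await myStore.get('doc:1', {}, (hit) => console.log('found:', hit))
})()
```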

// How does this work?

let m = {
    cache: new Map(),
    async get(key) {
        let found = this.cache.get(key)
        if (found !== undefined) return found
        let value = await kv.get(key)   // miss: fall through to KV
        this.cache.set(key, value)      // memoize for next time
        return value
    },
    async put(key, val) {
        if (this.cache.get(key) === val) return // already cached
        this.cache.set(key, val)
        await kv.put(key, val)
    }
}

Anyway, the point is: it's fast.

Limitations: a KV value can only hold 24 MB of data.

Files larger than that must be chunked or distributed via R2.
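A sketch of the chunking path; the `hash:i` key scheme and the 24 MB cap are taken from these notes, not a settled format:

```javascript
// Split an oversized payload into fixed-size pieces keyed by hash + index.
const CHUNK = 24 * 1024 * 1024; // the per-value cap from the notes

function chunkFile(bytes, hash, chunkSize = CHUNK) {
    const parts = [];
    for (let i = 0; i < bytes.length; i += chunkSize) {
        parts.push({
            key: `${hash}:${parts.length}`,          // e.g. "abc123:0", "abc123:1"
            data: bytes.subarray(i, i + chunkSize),  // a view, not a copy
        });
    }
    return parts;
}
```

Each part can then be `put` individually, and a read walks `hash:0`, `hash:1`, ... until a key is missing.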

Files are saved as Uint8Arrays.

The key is the murmur hash of the data.
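The notes don't say which murmur variant, or how the hash becomes a key string. Here is a self-contained MurmurHash3 (x86, 32-bit) sketch over a Uint8Array, with a hypothetical `keyFor` helper that hex-encodes the result:

```javascript
// MurmurHash3 x86 32-bit over a Uint8Array; returns an unsigned 32-bit int.
function murmur3_32(bytes, seed = 0) {
    const c1 = 0xcc9e2d51, c2 = 0x1b873593;
    let h = seed >>> 0;
    const n = bytes.length & ~3;          // number of whole 4-byte blocks
    for (let i = 0; i < n; i += 4) {
        let k = (bytes[i] | (bytes[i + 1] << 8) | (bytes[i + 2] << 16) | (bytes[i + 3] << 24)) >>> 0;
        k = Math.imul(k, c1); k = (k << 15) | (k >>> 17); k = Math.imul(k, c2);
        h ^= k; h = (h << 13) | (h >>> 19); h = (Math.imul(h, 5) + 0xe6546b64) >>> 0;
    }
    let k = 0;                            // mix in the 1-3 trailing bytes
    switch (bytes.length & 3) {
        case 3: k ^= bytes[n + 2] << 16;  // falls through
        case 2: k ^= bytes[n + 1] << 8;   // falls through
        case 1: k ^= bytes[n];
            k = Math.imul(k, c1); k = (k << 15) | (k >>> 17); k = Math.imul(k, c2);
            h ^= k;
    }
    h ^= bytes.length;                    // finalization (avalanche)
    h ^= h >>> 16; h = Math.imul(h, 0x85ebca6b);
    h ^= h >>> 13; h = Math.imul(h, 0xc2b2ae35);
    h ^= h >>> 16;
    return h >>> 0;
}

// Hypothetical helper: render the hash as a hex string to use as the KV key.
const keyFor = (bytes) => murmur3_32(bytes).toString(16);
```

Note a 32-bit hash will collide at scale; a 128-bit variant (or a cryptographic hash) would be a safer content key.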

The user, group, client, ref, time, and tags are saved in the metadata.

On data input -> raw text is extracted -> leaving us with the following object: {data, text, vec, meta}
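That pipeline can be sketched as one function; `extractText` and `embed` are hypothetical placeholders here, not real extractors:

```javascript
// Sketch of the ingest step: raw bytes in, {data, text, vec, meta} out.
// extractText and embed are placeholders for real text-extraction / embedding.
function ingest(data, meta = {}, extractText = (d) => '', embed = (t) => []) {
    const text = extractText(data);    // raw text pulled out of the file
    const vec = embed(text);           // embedding vector for search, if any
    return { data, text, vec, meta };  // the object described above
}
```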

What is the problem? What are we trying to fix?

Arbitrary-scale key-value storage for users' secure data.

Economies of scale via public data.

Why not IPFS? Too slow.

Why not D1? Too small.

Why not R2? Too expensive.

Answer: KV (get, put, list) // perfect

    List by prefix
    It's fast
    May fail randomly, so we need retry logic or maybe failover
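The retry part might look like a generic backoff wrapper; the attempt count and delays below are arbitrary placeholders:

```javascript
// Retry a flaky async operation with exponential backoff before giving up.
async function withRetry(fn, attempts = 3, baseMs = 50) {
    let lastErr;
    for (let i = 0; i < attempts; i++) {
        try {
            return await fn();
        } catch (err) {
            lastErr = err;
            // back off: baseMs, 2*baseMs, 4*baseMs, ...
            await new Promise((r) => setTimeout(r, baseMs * 2 ** i));
        }
    }
    throw lastErr; // all attempts failed: caller can fail over
}
```

Usage would be e.g. `withRetry(() => kv.get(key))`, with a failover store tried in the catch.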

OK, let's build it.

Input: any file.


var fs = require('fs');

var convertString = {
    bytesToString: function(bytes) {
        return bytes.map(function(x) { return String.fromCharCode(x) }).join('')
    },
    stringToBytes: function(str) {
        return str.split('').map(function(x) { return x.charCodeAt(0) })
    }
};

convertString.UTF8 = {
    bytesToString: function(bytes) {
        return decodeURIComponent(escape(convertString.bytesToString(bytes)))
    },
    stringToBytes: function(str) {
        return convertString.stringToBytes(unescape(encodeURIComponent(str)))
    }
};

function getByteArray(filePath) {
    let fileData = fs.readFileSync(filePath).toString('hex');
    let result = []
    for (var i = 0; i < fileData.length; i += 2)
        result.push(parseInt(fileData.substr(i, 2), 16))
    return result;
}

result = getByteArray('/path/to/file')