# UniStore

## The Universal Database

### UniStore is Open

UniStore comes in two flavors: the fully open-source Community Edition and the Enterprise Edition. Both support the same functionality, but the Enterprise Edition offers better vertical and horizontal scalability.

Both versions conform to the same UKV interface, natively supporting numerous Internet Engineering Task Force (IETF) standards and connecting to the Apache Arrow ecosystem. This makes UniStore compatible with many data-processing tools, like PyTorch, Ray, Apache Kafka, and ClickHouse.

### UniStore is Exceptionally Fast

Incorporating parallel computing technologies into Transactional Databases is a complex task. It took us 7 years to master those techniques before the initial launch, and now we have numerous concurrent lock-free data structures, explicit SIMD code for x86 and Arm, and GPU acceleration. The results speak for themselves.

### UniStore is Multi-Modal

You can put Documents and Tabular Data, Media and Binary Data, Networks or Graphs, and even high-dimensional Vectors into one database. Today, most companies use several databases for different workloads, for lack of a good generic solution.

```python
mongo['movies'].insert_one({
    '_id': avatar_id,
    'title': 'Avatar',
    'price': 10,
})
s3.upload_file('avatar.png', 'posters')
neo4j.run(  # Cypher
    'MERGE (a:V {id: %d}) '
    'MERGE (b:V {id: %d}) '
    'MERGE (a)-[:E]->(b)' % (avatar_id, james_cameron_id)
)
postgres.execute(  # SQL
    'INSERT INTO purchases (product, customer) VALUES (%s, %s)',
    (avatar_id, alice_id),
)
```

This is not just messy but also error-prone. With UniStore, you don’t need to learn 10 different products with 10 interfaces. You need just one, and it is consistent across modalities.

```python
db['movies'].docs[avatar_id] = {'title': 'Avatar', 'price': 10}
db['posters'].blobs[avatar_id] = open('avatar.png', 'rb').read()
db['relations'].graph.add_edge(avatar_id, james_cameron_id)
db['purchases'].docs.append({'product': avatar_id, 'customer': alice_id})
```

### UniStore is ACID Transactional

Most NoSQL stores are Eventually Consistent, so the classical notion of transactions loses most of its expected guarantees. Using multiple stores degrades those guarantees even further, forcing you to synchronize the different APIs by hand.

```python
if mongo_passed:
    if neo4j_passed:
        if s3_passed:
            pass  # All three writes succeeded.
        else:
            neo4j.rollback()
            mongo.rollback()
    else:
        mongo.rollback()
```

UniStore is Strictly Serializable, which is the strongest Consistency guarantee. No boilerplate synchronization is needed, even for transactions spanning multiple Collections or Modalities.

```python
with db.transaction() as txn:
    txn['movies'].docs[avatar_id] = ...
    txn['posters'].blobs[avatar_id] = ...
    txn['relations'].graph.add_edge(...)
```

### UniStore is Ready for Machine Learning

Machine Learning and Artificial Intelligence workloads differ from classical Business Intelligence. You may not need absolute consistency, but you do need high throughput and kinds of operations that most databases simply don’t provide. Well, except for UniStore.

```python
while loss > threshold:
    images = db['pictures'].sample(1000)
    neural_network.train(images)
```

We natively support randomized batch sampling. This changes how we can approach AI training! It accelerates our path towards Active Learning, where training Neural Networks is not a one-off operation performed on a static, already-outdated export of the data, but part of a continuous refinement process.
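To illustrate the semantics with plain-Python stand-ins (the `pictures` dict and the `sample_batch` helper below are ours for illustration, not UniStore's API): uniform sampling fetches only the chosen keys, so a training step never has to scan or export the full collection.

```python
import random

# Stand-in for a 'pictures' collection: key -> value.
pictures = {i: f'image-{i}.png' for i in range(10_000)}

def sample_batch(collection, batch_size):
    # Pick a uniformly random set of keys, then fetch only those
    # values; the rest of the collection is never touched.
    keys = random.sample(sorted(collection), batch_size)
    return [collection[k] for k in keys]

batch = sample_batch(pictures, 1000)
```

Because each batch is drawn fresh from the live collection, newly inserted records become eligible for training immediately, which is exactly what a continuous refinement loop needs.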

The classical approach to search is to build deterministic indexes offline, one for every form of data. However, the modern method, employed by all major search engines, shifts towards Approximate Nearest Neighbor Search in high-dimensional, often non-metric spaces.

```python
product_images = db['pictures'].blobs[new_product_ids]
product_representations = neural_network(product_images)
db['vectors'].kann.add(new_product_ids, product_representations)
```

Not only can you train Neural Networks by feeding them straight from UniStore, but you can also write the resulting Semantic Representations back into our Vector Collections. This brings the power of Google from B2C experiences into Enterprise Data Warehouses.
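For intuition, here is a brute-force stand-in for the lookup such a vector collection performs (the `vectors` dict and `search` helper are ours for illustration, not UniStore's API; a real ANN index approximates this linear scan with sub-linear data structures, trading a little recall for far better scaling):

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two dense vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy 'vectors' collection: id -> embedding.
vectors = {
    1: [1.0, 0.0, 0.0],
    2: [0.9, 0.1, 0.0],
    3: [0.0, 1.0, 0.0],
}

def search(query, k=2):
    # Exact top-k: rank every stored id by similarity to the query.
    ranked = sorted(vectors,
                    key=lambda i: cosine_similarity(query, vectors[i]),
                    reverse=True)
    return ranked[:k]

search([1.0, 0.05, 0.0])  # -> [1, 2]
```

A query embedding produced by the same network that indexed the products lands near semantically similar items, which is what makes "search by meaning" work.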