Opening the ArangoDB ArangoGraph API & Terraform Provider
ArangoDB ArangoGraph, the cloud service of ArangoDB, has been available for a few months now and is growing quickly. The ArangoGraph team has received many requests to provide more ways to manage deployments, access policies, and other aspects of ArangoGraph.
After adding support for Azure earlier this year, we’re now opening up the ArangoGraph API for all supported cloud providers, including Google Cloud and AWS. Read more
Efficient Massive Inserts into ArangoDB with Node.js
Nothing performs faster than arangoimport and arangorestore for bulk loading or massive inserts into ArangoDB. However, if you need to do additional processing on each row before it is inserted, this post shows how to implement that.
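As an illustration, here is a minimal sketch of batched inserts with per-row processing, using the arangojs driver; the collection name and the transformRow step are placeholder assumptions:
const { Database } = require("arangojs");

const db = new Database({ url: "http://127.0.0.1:8529" });
const collection = db.collection("myCollection"); // assumed collection name

// hypothetical per-row processing applied before insert
function transformRow(row) {
  return { ...row, importedAt: new Date().toISOString() };
}

// insert rows in batches so each batch is a single HTTP request
async function bulkInsert(rows, batchSize = 1000) {
  for (let i = 0; i < rows.length; i += batchSize) {
    const batch = rows.slice(i, i + batchSize).map(transformRow);
    await collection.import(batch); // bulk import API, one call per batch
  }
}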
If the data source is a streaming solution (such as Kafka, Spark, or Flink) and the data needs to be transformed before it is inserted into ArangoDB, the same approach applies to that scenario as well. Read more
Using the ArangoDB Swagger.io Interactive API Documentation
ArangoDB also bundles its regular API documentation in the Swagger.io API description format. You can browse and explore it interactively via the ArangoDB web interface.
Read more
ArangoDB 2.6 API Changes: Updates & Enhancements
ArangoDB 2.6 comes with new and changed APIs as well as changed behavior regarding document keys and several graph functions.
If you use Travis-CI for your tests, you can download the Travis-CI ArangoDB build here: Travis-CI/ArangoDB-2.6.0-alpha2.tar.gz
The changes so far:
APIs added
- added batch document removal and lookup APIs:
These APIs can be used to perform multi-document lookup and removal operations efficiently. The arguments to these APIs are the name of the collection plus the array of document keys to fetch or remove.
The endpoints for these APIs are as follows:
PUT /_api/simple/lookup-by-keys
PUT /_api/simple/remove-by-keys
Example call to fetch documents:
curl -X PUT \
  http://127.0.0.1:8529/_db/_system/_api/simple/lookup-by-keys \
  --data '{"collection":"myCollection","keys":["test1","test3"]}'
The found documents are returned in the documents attribute of the HTTP response. documents is an array containing all documents found. Only documents that were actually found are returned; documents that were searched for but do not exist are not returned and do not trigger any errors.
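A removal call looks similar; as a sketch, reusing the collection and keys from above (the response reports how many documents were removed and how many keys were ignored):
curl -X PUT \
  http://127.0.0.1:8529/_db/_system/_api/simple/remove-by-keys \
  --data '{"collection":"myCollection","keys":["test1","test3"]}'
(more…)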
LoopBack Connector for ArangoDB: Seamless Integration
ArangoDB can be used as a backend data source for APIs that you compose with the popular open-source LoopBack Node.js framework.
In a recent blog article on StrongLoop, Nicholas Duffy explains how to use his new loopback-connector-arango connector to access ArangoDB:
Getting Started with the Node.js LoopBack Connector for ArangoDB
The tutorial uses the loopback-connector-arango, which is available via npm, and a demo application, which is available on GitHub. (more…)
Bulk Document Lookups: Efficient Data Retrieval with ArangoDB
ArangoDB 2.6 comes with a specialized API for bulk document lookups. The new API allows fetching multiple documents from the server with a single request, making bulk document retrieval more efficient than issuing one request per document.
Provided the document keys are known, all the client application needs to do is call the collection’s lookupByKeys method:
// list of document keys
var keys = [ "foo", "bar", "baz", ...];
var results = db.test.lookupByKeys(keys);
// now all documents are contained in variable 'results'
Additionally, the server-side REST API method for bulk document lookups can be invoked directly via HTTP as follows:
curl \
-X PUT \
http://127.0.0.1:8529/_api/simple/lookup-by-keys \
--data '{"collection":"test","keys":["foo","bar","baz"]}'
Jan compared the functionality with single document requests in his latest blog post.
Exporting Data for Offline Processing in PHP: ArangoDB Guide
A few weeks ago I wrote about ArangoDB’s specialized export API.
The export API is useful when the goal is to extract all documents from a given collection and to process them outside of ArangoDB.
The export API can provide quick and memory-efficient snapshots of the data in the underlying collection, making it well suited for extracting all documents of a collection. It can also deliver the data much faster than an AQL query that extracts all documents.
In this post I’ll show how to use the export API to extract data and process it with PHP.
Please read the full blog post Exporting Data for Offline Processing.
More Efficient Data Exports with new Export API
ArangoDB 2.6 provides a specialized export API for exporting all documents from a collection and shipping them to a client application. It is rather limited but faster than the general-purpose AQL cursor API and can store its snapshots using less memory.
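For illustration, a minimal sketch of an export API call, assuming a collection named myCollection; the first batch of documents is returned directly, and further batches can be fetched via the returned cursor:
curl -X POST \
  "http://127.0.0.1:8529/_api/export?collection=myCollection" \
  --data '{"flush":true,"batchSize":1000}'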
A side effect of the speedup is that the first results will arrive much earlier in the client application. This helps reduce client-side connection timeouts in case clients enforce them on temporarily non-responding connections. (more…)
Improved Cursor API: ArangoDB Query Efficiency Boost
This week we pushed some modifications for ArangoDB’s cursor API into the devel branch. The change will result in less copying of AQL query results between the AQL and the HTTP layers. As a positive side effect, this will reduce the amount of garbage collection the built-in V8 engine has to do.
These modifications should improve the cursor API performance significantly for many cases, while at the same time keeping its REST API stable. Client programs do not need to be adjusted to reap the benefits. In a blog post, Jan shows some first unscientific performance tests comparing the old cursor API with its new, improved implementation.
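For reference, a typical cursor API call, which remains unchanged by this work, looks like this sketch (assuming a collection named myCollection):
curl -X POST \
  http://127.0.0.1:8529/_api/cursor \
  --data '{"query":"FOR doc IN myCollection RETURN doc","batchSize":100}'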
(more…)
Securing your Foxx with API Keys
ArangoDB’s Foxx allows you to easily build an API to access your data sources. Out of the box, however, this API is either public or restricted to users with an account, and even those users get unlimited access.
In many use cases you do not want to expose your data in this fashion; instead, you want a more controllable access pattern that restricts the number of requests a user can issue in a certain time period. Popular examples of such API restrictions are Twitter and Facebook. This allows you to offer all of your data, but only in limited chunks, and then possibly charge your customers to increase the chunk limit they can request.
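Conceptually, such rate limiting boils down to counting requests per API key within a time window. The following is a minimal sketch of that idea only, not Foxx’s actual API; the window size and limit are assumed values:
// conceptual sketch: fixed-window rate limiting per API key
const WINDOW_MS = 60 * 1000; // assumed: one-minute window
const LIMIT = 100;           // assumed: maximum requests per window
const usage = new Map();     // apiKey -> { windowStart, count }

function allowRequest(apiKey) {
  const now = Date.now();
  let entry = usage.get(apiKey);
  if (!entry || now - entry.windowStart >= WINDOW_MS) {
    entry = { windowStart: now, count: 0 }; // start a fresh window
    usage.set(apiKey, entry);
  }
  entry.count += 1;
  return entry.count <= LIMIT; // false once the key exceeds its chunk limit
}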
All this is done via API keys, which are bound to a user; this has become a common pattern for monetizing the data you have collected. (more…)