Been doing some deep dives and testing on various databases. I could write for days on this subject but only have a few minutes, so briefly…
Representative picture of the basic structure of data tested.
The challenge is to accommodate desktop/web/mobile platforms seamlessly with massive data. Questions remain about whether this all stays in the cloud or becomes a hybrid of cloud and in-house.
First testing path was to try typical relational databases and create an optimized design for max throughput. Benchmarked them all and set them aside.
In-memory types are pretty quick; benchmarked and set aside.
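For a flavor of the kind of throughput comparison involved, here's a minimal sketch, not the actual benchmark used: it times the same batch-insert workload against an on-disk and an in-memory SQLite database. The row count and schema are made up for illustration; real benchmarks would use the production data shape and a far larger workload.

```python
import os
import sqlite3
import tempfile
import time

N = 10_000  # small illustrative workload, not a realistic size

def bench(conn):
    """Time N single-row inserts done in one transaction."""
    conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, payload TEXT)")
    start = time.perf_counter()
    with conn:  # one transaction for the whole batch
        conn.executemany(
            "INSERT INTO t (payload) VALUES (?)",
            (("row-%d" % i,) for i in range(N)),
        )
    elapsed = time.perf_counter() - start
    rows = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
    return elapsed, rows

# On-disk database
path = os.path.join(tempfile.mkdtemp(), "bench.db")
disk = sqlite3.connect(path)
disk_time, disk_rows = bench(disk)
disk.close()

# In-memory database: same schema, same workload
mem = sqlite3.connect(":memory:")
mem_time, mem_rows = bench(mem)
mem.close()

print(f"disk:   {N / disk_time:,.0f} inserts/sec")
print(f"memory: {N / mem_time:,.0f} inserts/sec")
```

The in-memory run usually wins on raw speed, which matches the "pretty quick" result above, but durability and dataset size are what push it back off the table.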
Next up were the typical NoSQL data structures. This was very interesting. Big data development all tends toward distributed toolsets like Hadoop, etc…
Lastly, been evaluating some other NoSQL types like graph databases, and some hybrid optimized types (e.g., GIS databases).
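What makes graph databases interesting for relationship-heavy data is cheap multi-hop traversal. A minimal sketch with a made-up follower graph, using a breadth-first search in place of any particular graph database's query language:

```python
from collections import deque

# Hypothetical social-graph edges: who follows whom.
edges = {
    "ann": ["bob", "cara"],
    "bob": ["dave"],
    "cara": ["dave", "erin"],
    "dave": [],
    "erin": ["ann"],
}

def reachable_within(graph, start, hops):
    """Breadth-first search: everyone reachable from `start`
    in at most `hops` edge traversals."""
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == hops:
            continue  # don't expand past the hop limit
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    seen.discard(start)
    return seen

print(sorted(reachable_within(edges, "ann", 2)))
# → ['bob', 'cara', 'dave', 'erin']
```

In a relational schema this "friends of friends" query is a self-join per hop; a graph store makes each hop a pointer walk, which is the main draw for this data shape.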
Will have more to say later, but this is a big decision. It's one thing to develop and assume something, but it's another when you start looking at commercialization and what's required. Phew.