
Ruggero Tonelli - DataOps Barcelona 2019


ArangoDB, ClickHouse, PostgreSQL, Aerospike, MySQL, MongoDB, TiDB


ClickHouse is faster than MySQL in OLAP


PostgreSQL is faster!!!

Our workload is OLTP!!!

MySQL handles everything

MongoDB can do SQL!

Aurora is better!


RUN A BENCHMARK!!!


please?..


Shouldn't we get all the requirements, constraints and restrictions before we even start?

Do we know what the expected load and performance are?


So… are you telling me that choosing a DB is not about faith, dogma or bullying others?

...we should run Vitess!


Humans are so boring… the correct answer is ORA**E, always!

Of requirements, constraints and restrictions



# Budget


# Time to Market, MVP or PoC

# Internal know-how

# Coding languages (support maturity)



# Paid support


# Adoption level (maturity)

# Software licensing

# Workload types



# Resiliency


# Scalability

# Performance

# Encryption at rest and in transit



# Vendor lock-in


# Mind the Cloud

# SW/HW “limitations”

# Eventual migration path



# Ease of management


# Documentation

# Known users and specific cases

# Maturity… did we say maturity enough?

Benchmarking definition, criteria and tools



# Essential requirements for “experiments”


# Product’s best practices

# “Coding” your own benchmark (see the sketch after this list)

# Open Source benchmarks
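A hand-rolled benchmark does not need to be elaborate to be useful. The sketch below, purely as an illustration, times bulk inserts and random point lookups against SQLite (chosen only because it ships with Python); the table layout, row counts, and queries are arbitrary assumptions, and the point is to swap in the driver, schema, and workload of the engine you are actually evaluating.

```python
import random
import sqlite3
import time

# Illustrative micro-benchmark: measure bulk-insert and point-lookup throughput.
# SQLite is used only because it ships with Python; replace the connection
# and SQL with the engine and workload you actually care about.
ROWS = 100_000      # arbitrary dataset size for the sketch
LOOKUPS = 10_000    # arbitrary number of point reads

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

start = time.perf_counter()
conn.executemany(
    "INSERT INTO users (id, name) VALUES (?, ?)",
    ((i, f"user-{i}") for i in range(ROWS)),
)
conn.commit()
insert_s = time.perf_counter() - start

start = time.perf_counter()
for _ in range(LOOKUPS):
    key = random.randrange(ROWS)
    conn.execute("SELECT name FROM users WHERE id = ?", (key,)).fetchone()
lookup_s = time.perf_counter() - start

print(f"insert: {ROWS / insert_s:,.0f} rows/s")
print(f"lookup: {LOOKUPS / lookup_s:,.0f} queries/s")
```

Run it several times on the hardware you actually plan to deploy on, and record the exact parameters alongside the numbers so results stay comparable across engines.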



# SysBench (example invocation after this list)


# YCSB

# Your own workload

# Your peers’ connections
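For the off-the-shelf tools, a typical SysBench OLTP run follows a prepare/run/cleanup cycle. The snippet below is a minimal sketch assuming sysbench 1.0+ with its bundled oltp_read_write workload and a disposable MySQL test schema; the host, credentials, table counts, thread count, and duration are placeholder assumptions to adapt to your own environment.

```python
import subprocess

# Illustrative sysbench OLTP cycle against a throwaway MySQL schema.
# All connection details and sizes below are placeholders.
common = [
    "sysbench", "oltp_read_write",
    "--db-driver=mysql",
    "--mysql-host=127.0.0.1",
    "--mysql-user=sbtest",
    "--mysql-password=sbtest",
    "--mysql-db=sbtest",
    "--tables=10",
    "--table-size=1000000",
    "--threads=16",
    "--time=300",
    "--report-interval=10",
]

# prepare: create and load the test tables; run: execute the workload;
# cleanup: drop the test tables afterwards.
for phase in ("prepare", "run", "cleanup"):
    subprocess.run(common + [phase], check=True)
```

YCSB follows a similar two-step pattern (a load phase followed by a run phase against a chosen workload file), and both tools report throughput and latency percentiles you can compare across engines.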



# Benchmarks you find on the Internet


# Researching “matching” issues

# Drawing your own conclusions

# Document processes and trade-offs

Data Engineering and Experience



# Know your enemies or RTFM


# Capacity planning & forecasting (back-of-the-envelope sketch after this list)

# Think BIG

# Monitoring and Observability

# Plan for the worst
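Capacity planning usually starts as back-of-the-envelope arithmetic. Every input below is a made-up assumption, included only to show the shape of the calculation; plug in your own write rates, row sizes, retention, and replication factor.

```python
# Back-of-the-envelope capacity estimate; every input is an assumption.
writes_per_day = 50_000_000   # expected inserts per day
avg_row_bytes = 500           # average row size, indexes included
retention_days = 365          # how long data is kept
replicas = 3                  # copies kept for resiliency
headroom = 1.5                # growth and storage-engine overhead

raw_gb = writes_per_day * avg_row_bytes * retention_days / 1e9
total_gb = raw_gb * replicas * headroom
peak_writes_per_s = writes_per_day / 86_400 * 5  # assume peak is ~5x the daily average

print(f"storage to provision: {total_gb:,.0f} GB")
print(f"peak write rate to plan for: {peak_writes_per_s:,.0f} writes/s")
```

Even a rough estimate like this tells you whether a single node can ever hold the data set, which feeds straight back into the resiliency and scalability requirements above.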



# Polyglot Persistence (sketched after this list)


# Multi-model DBs

# Data integration

# Multiverse databases!
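To make polyglot persistence concrete, here is a minimal sketch of one service using two stores for what each does best: a relational system of record plus a key-value store for hot reads. SQLite and a plain dict stand in for real engines (say, PostgreSQL and Redis) only to keep the example runnable; the function names and schema are invented for illustration.

```python
import sqlite3

# Polyglot-persistence sketch: the system of record is relational, while a
# key-value store serves hot reads. SQLite and a dict are stand-ins for the
# real engines, purely to keep the example self-contained.
relational = sqlite3.connect(":memory:")
relational.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
kv_cache = {}  # stand-in for a real key-value store


def place_order(order_id: int, total: float) -> None:
    # Durable write goes to the system of record...
    relational.execute("INSERT INTO orders VALUES (?, ?)", (order_id, total))
    relational.commit()
    # ...and the read model is updated in the fast store.
    kv_cache[order_id] = total


def order_total(order_id: int) -> float:
    # Serve from the cache when possible, fall back to the system of record.
    if order_id in kv_cache:
        return kv_cache[order_id]
    row = relational.execute(
        "SELECT total FROM orders WHERE id = ?", (order_id,)
    ).fetchone()
    return row[0]


place_order(1, 99.90)
print(order_total(1))
```

A multi-model database collapses the two engines into a single product, and the data-integration point above is exactly the glue code in place_order that keeps both representations in sync.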


Recap

Requirements and restrictions are not that hard.

Benchmarking is difficult; you'd better have objective and consistent results.

Reaching consensus on a DB engine is easier when you have numbers.

Thank You! @ruggerotonelli

Q&A

Images


Slides 6-11: Dilbert by Scott Adams