A gradient-descent approach to minimizing software risk, seen in the wild.
- initially, they used DataF
- moved to microservices because they wanted to get away from a single point of failure
- but with few, limited standards across the services
- moved away from shared libraries because of version hell (incompatible dependency versions across services)
- you lose the compiler as a safety net: with everyone using different languages, nothing checks compatibility across service boundaries
- no consistent model of software correctness, because the software is always evolving
- what risks does this manage? Why did they focus on these risks?
- production risk
- therefore, release (very) often
  - is DevOps maturity measured in releases per day?
- all the new tools replace other, existing tools, but are worse: fewer people use them, so the feedback loop is weaker.
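The "release (very) often" point can be made concrete with a toy model (my assumption, not from the notes): if the risk a release carries grows superlinearly with the number of changes batched into it, then many small releases accumulate far less total risk than one big-bang release — the gradient-descent framing in the title, taken as small steps downhill.

```python
# Toy model (assumed, for illustration): incident risk of a single release
# grows with the square of the number of changes it batches together.
def release_risk(changes: float, k: float = 0.01) -> float:
    """Risk contribution of one release containing `changes` changes."""
    return k * changes ** 2

def total_risk(total_changes: int, releases: int) -> float:
    """Total risk of shipping `total_changes` spread evenly over `releases`."""
    per_release = total_changes / releases
    return releases * release_risk(per_release)

# Shipping 100 changes as one big-bang release vs. 100 small releases:
big_bang = total_risk(100, releases=1)    # 0.01 * 100^2        = 100.0
frequent = total_risk(100, releases=100)  # 100 * (0.01 * 1^2)  = 1.0
```

The exponent and constant are made up; the qualitative point survives for any superlinear risk curve.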