The top posts on the journey from 0 to 100k unique monthly visitors on the OverOps blog
It has been just over 2 years since the launch of the OverOps blog, and we recently celebrated a nice milestone – crossing 100k readers a month.
— Takipi (@takipid) July 9, 2015
When publishing new posts, we always try to guess whether they'll get traction, become popular, or spark a lively discussion between industry experts and the people behind the tools and technologies we write about. In some cases we were able to anticipate the traction a post would receive; in others, the success was quite surprising. We thought this would be a good chance to celebrate the milestone, reflect, and plan the road ahead by revealing the list of our most popular posts so far:
Based on some crunching of GitHub's data, we were able to extract the most popular libraries that Java, JS and Ruby projects use. Since then we've re-run the Java benchmark and also reported on more recent results.
One of the hottest recent additions to Java is lambda expressions, which have long been available in functional programming languages like Scala. In this post we took a look at the two from the JVM perspective, back when we added Scala support to OverOps – and survived to share the findings.
Lambda expressions were definitely the most hyped feature in Java 8, but not the only one, of course. In this post we ran through some of the top features that have most likely already changed the way Java developers code: streams and parallel operations, the new date and time API, and more.
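To give a taste of the features mentioned above, here's a minimal sketch (the class name and sample data are ours, not from the original posts) combining lambdas, streams and the new java.time API:

```java
import java.time.LocalDate;
import java.time.Month;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class Java8Features {
    public static void main(String[] args) {
        // Streams + lambdas: filter and transform a list declaratively
        List<String> names = Arrays.asList("stream", "lambda", "optional", "jjs");
        List<String> upper = names.stream()
                .filter(n -> n.length() > 3)   // lambda as a predicate
                .map(String::toUpperCase)      // method reference
                .collect(Collectors.toList());
        System.out.println(upper);             // [STREAM, LAMBDA, OPTIONAL]

        // The new java.time API replaces the clunky java.util.Date/Calendar
        LocalDate release = LocalDate.of(2014, Month.MARCH, 18); // Java 8 GA
        System.out.println(release.getDayOfWeek()); // TUESDAY
    }
}
```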
Whoa, this is a thorny one. With GitHub's Atom recently reaching version 1.0, this dilemma is back in the spotlight (by the way, if you haven't already, you just have to watch this video). And please don't get us started on vim vs. emacs.
The production-quality tooling landscape is going through some major changes. Not too long ago you'd pretty much be left in the dark after deploying your code to production; today the situation is completely different. In this post we took a look at the tools modern teams use to keep an eye on what's going on in their production environment.
The Application Performance Monitoring (APM) toolkit is also going through major changes since the entrance of the SaaS model, and many teams struggle to make the right decisions about the tools they use. Here we discuss the differences between AppDynamics and New Relic, hoping to help you reach the right decision for your environment.
Back to our beloved lambdas – not everything is shiny and bright. In this post we examined some of the less talked about downsides of using lambdas in Java 8. When you look at the stack traces that errors inside lambdas produce, things get a bit darker.
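A quick illustration of the problem (our own toy example, not taken from the original post): an exception thrown inside a lambda surfaces through synthetic frames and stream internals rather than a plain line in your own method.

```java
import java.util.Arrays;
import java.util.List;

public class LambdaStackTrace {
    public static void main(String[] args) {
        List<String> words = Arrays.asList("hello", null, "world");
        // The NullPointerException below is thrown inside the lambda; on
        // HotSpot the resulting stack trace shows synthetic frames such as
        // LambdaStackTrace.lambda$main$0(...) plus java.util.stream internals,
        // which are much harder to read than a plain frame in main()
        words.stream()
             .map(w -> w.toUpperCase()) // blows up on the null element
             .forEach(System.out::println);
    }
}
```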
High-scale systems produce tons of log data, and it's getting super hard to manage without a log management tool in place. In this post we showcased some of the more popular choices available today for keeping your log files in check.
Garbage collection is definitely one of the more interesting topics of the JVM. With the recent debate around making G1 the default garbage collector in Java 9, it's a great opportunity to catch up on the available garbage collectors and their features (yes, contrary to popular belief, there's more than one).
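If you're curious which collectors your own JVM is running with right now, the standard management API can tell you – a small sketch (class name is ours):

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

public class WhichCollector {
    public static void main(String[] args) {
        // Prints the collectors the current JVM is actually using, e.g.
        // "PS Scavenge" / "PS MarkSweep" for the Parallel collector, or
        // "G1 Young Generation" / "G1 Old Generation" when started
        // with -XX:+UseG1GC
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            System.out.println(gc.getName()
                    + " (collections so far: " + gc.getCollectionCount() + ")");
        }
    }
}
```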
This post takes on the question of how to get a production-ready Java application off the ground in the shortest time possible. Favoring convention over configuration, it walks through the decisions made by the two most popular lightweight Java frameworks today: Dropwizard and Spring Boot.
We'd like to give a huge shout-out to the community – all the developers, authors, and industry experts who help us along the way. We're happy to help you make sense of the software development landscape from a hands-on developer perspective. Be it core Java, Scala, JVM internals, news, benchmarks, or devops practices and tools, we're always hungry for insights and look forward to continuing to share our research with you.