From the ’80s to 2024 - how CI tests were invented and optimized
graphite.dev

While Google started automating its build tests in 2003, the engineering industry took longer to do the same. But automation was sorely needed:

“Software systems are growing larger and ever more complex… To make matters worse, new versions are pushed to users frequently, sometimes multiple times each day. This is a far cry from the world of shrink-wrapped software that saw updates only once or twice a year. The ability for humans to manually validate every behavior in a system has been unable to keep pace with the explosion of features and platforms in most software.” - Software Engineering at Google, https://abseil.io/resources/swe-book/html/ch11.html#testing_overview

Sun Microsystems engineer Kohsuke Kawaguchi was key to ushering in the next era of testing. In 2004, he created “Hudson” (https://community.jenkins.io/t/lets-thank-kohsuke-the-creator-of-jenkins/168), later renamed to Jenkins amid some fun Oracle drama. At his day job, Kohsuke “got tired of incurring the wrath of his team every time his code broke the build.” He could have manually triggered tests before each code contribution, but instead Kohsuke chose the classic engineering solution and created an automated program. The Hudson tool acted as a long-lived test server that could automatically verify each code change as it was integrated into the codebase.
You know what’s a hard pill to swallow for Jenkins haters? It’s likely older than your career, and is going to outlive you too. Like bash, and C, and gnu-utils.
Want to appear godlike in any org? Learn a tiny amount of Groovy and read the Pipeline docs - https://www.jenkins.io/doc/book/pipeline/
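If the docs feel abstract, a minimal declarative Jenkinsfile is genuinely this small (a rough sketch; the stage names and make targets are placeholders for whatever your project actually runs):

```groovy
// Jenkinsfile - minimal declarative pipeline, the shape the docs build on
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'make build'   // placeholder: call whatever your build tool is
            }
        }
        stage('Test') {
            steps {
                sh 'make test'    // placeholder test target
            }
        }
    }
}
```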
Jenkins is battle-tested, Jenkins is likely already in your org, and replacing it with anything else is almost never worth the time from a strategic perspective. But it isn’t perfect; testing it, in particular, is a pain in the ass.
So here’s the best tip: skinny Jenkinsfiles. When you use an sh step, have it run a Makefile target or your build tool’s command. Keep them short, single-line things. Don’t rely on massive ENVs. Use Dockerfiles for most stuff. Dynamic container agents in the cloud are actually good. Learn to use archiveArtifacts and integrate with test report plugins. Learn about parallel pipelines.
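Roughly what I mean by skinny, as a sketch - the container image, Makefile targets, and report/artifact paths here are stand-ins, not the one true layout:

```groovy
pipeline {
    // Throwaway container agent: the build environment lives in an image, not on the node
    agent {
        docker { image 'golang:1.22' }   // stand-in image; use your project's toolchain
    }
    stages {
        stage('Build') {
            steps {
                sh 'make build'   // single-line sh step that delegates to the Makefile
            }
        }
        stage('Checks') {
            parallel {
                stage('Unit tests') {
                    steps { sh 'make test' }
                }
                stage('Lint') {
                    steps { sh 'make lint' }
                }
            }
        }
    }
    post {
        always {
            junit 'reports/**/*.xml'   // assumes the test target writes JUnit-style XML here
            archiveArtifacts artifacts: 'dist/**', allowEmptyArchive: true
        }
    }
}
```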
Why is that a hard pill to swallow? Longevity doesn’t imply goodness, especially in software. Same for Bash, C and gnu utils.
It also doesn’t mean it’s a good idea to use it. I would strongly recommend… basically anything else over Jenkins.
Maybe I missed your point there.
Why do you recommend other tools over things which are tested and will last way longer than whatever the current fad is? The best part of Jenkins is its ubiquity - writing code that will run forever is not to be sniffed at.
That’s one of the reasons I like Java. It definitely has problems, but it’s been around so long that there are an insane number of libraries to work with. And you can practically guarantee that your project will run on a given computer with minimal fuss.
Can you give some examples of other tools you’d rather use for the job, which can be self-hosted?
Yeah, this is something I stressed at my place. Your Jenkinsfile should set up environment variables and authentication-related stuff, then call out to some build tool to build the project. The Jenkinsfile should also be configured to run the build inside a Docker container. In projects at my place, that’s a Dockerfile checked in alongside the Jenkinsfile that installs all the tools and dependencies required for a valid build environment.
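For reference, the shape of that ends up being something like this sketch - the environment variable and credentials ID are made up, and `agent { dockerfile true }` just picks up the Dockerfile sitting next to the Jenkinsfile:

```groovy
pipeline {
    // Build inside the image defined by the Dockerfile checked in next to this Jenkinsfile
    agent { dockerfile true }
    environment {
        BUILD_ENV = 'ci'                               // made-up example variable
        REGISTRY_CREDS = credentials('registry-creds') // made-up ID from the Jenkins credentials store
    }
    stages {
        stage('Build') {
            steps {
                sh 'make build'   // the Jenkinsfile wires things up; the build tool does the real work
            }
        }
    }
}
```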