I recently relaunched my website with Hugo, Gitlab.com and AWS services. But now I want my source code encrypted all the way from my laptop to deployment, with no trust required in cloud services for privacy! Is it possible? Let's over-engineer some shit!
I host my source code in a Gitlab.com private repo. Commits to the master branch trigger a Gitlab CI job, which runs Hugo to build the static content. The content is then pushed to an S3 bucket, served by Amazon CloudFront. At the end of the job, I create a CloudFront invalidation to refresh the CloudFront cache. Hey, well, this setup was Über cool 10 years ago!
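For concreteness, that deploy stage boils down to three commands. Here is a minimal sketch wrapped in Python; "example-bucket" and "EDFDVBD6EXAMPLE" are placeholders for my real S3 bucket and CloudFront distribution ID:

```python
import subprocess

def deploy_steps(bucket, distribution_id):
    """The three commands the CI job runs: build, sync, invalidate."""
    return [
        ["hugo", "--minify"],
        ["aws", "s3", "sync", "public/", f"s3://{bucket}", "--delete"],
        ["aws", "cloudfront", "create-invalidation",
         "--distribution-id", distribution_id, "--paths", "/*"],
    ]

def deploy(bucket, distribution_id):
    # Runs each step, failing the job on the first non-zero exit code.
    for cmd in deploy_steps(bucket, distribution_id):
        subprocess.run(cmd, check=True)
```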
I also have some sensitive values configured in Gitlab CI: the AWS credentials of an IAM role with permissions for AWS S3 and CloudFront. I host the project in a private Gitlab repo, but as far as I know it is possible for a Gitlab actor to read the source code. It could also be possible for that actor to access the AWS credentials. A good or a bad actor. Could, could not. So many questions!
So, given that Keybase.io supports encrypted Git repositories, I've been thinking... Is it possible to design a CI/CD pipeline leveraging Keybase? To have source code, plus secrets, decrypted only by me: at rest, in transit, and on the job server during execution? For this particular example the resulting static site artefacts don't matter, since they're public. BUT, I will also take that use case into consideration.
I only started utilising Keybase a couple of years ago, as part of a software team that used it for sharing secrets. Keybase has a series of interesting end-to-end encrypted features, both individual and team based. Amongst them: chat, filesystems and Git repositories. It also provides both a GUI application and a CLI tool.
I'm not super knowledgeable on either Keybase or SecOps, which is also a disclaimer for this blog post! So, I took a look at the documentation to see which features I needed for this design. Keybase supports end-to-end encrypted Git repos. Repositories can be individual or team based. But only team based repositories have post-push chat notifications! You will soon understand why that matters.
Assuming the user already belongs to a team named "my-team", creating a repository with the CLI is as easy as:
$ keybase git create --team=my-team secret-sauce
Repo created!
You can clone it with:
  git clone keybase://team/my-team/secret-sauce
Or add it as a remote to an existing repo with:
  git remote add origin keybase://team/my-team/secret-sauce
I (or anyone on my Keybase team) can now interact with this repo like any regular Git repo, over the keybase:// protocol:
$ git clone keybase://team/my-team/example-repo
Cloning into 'example-repo'...
Initializing Keybase... done.
Syncing with Keybase... done.
Syncing encrypted data to Keybase: (100.00%) 3.72/3.72 KB... done.
warning: You appear to have cloned an empty repository.
$ cd example-repo
$ echo "secret sauce" > README.md
$ git add README.md
$ git commit -am "Added some secret sauce"
[master (root-commit) 228befc] Added some secret sauce
 1 file changed, 1 insertion(+)
 create mode 100644 README.md
$ git push
Initializing Keybase... done.
Syncing with Keybase... done.
Counting objects: 229 bytes... done.
Preparing and encrypting objects: (100.00%) 229/229 bytes... done.
Counting refs: 41 bytes... done.
Preparing and encrypting refs: (100.00%) 41/41 bytes... done.
To keybase://team/my-team/example-repo
 * [new branch]      master -> master
git pushes to this repository will trigger a chat message on the Keybase team chat! Furthermore, the Keybase CLI client supports listening to chat notifications:
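As a sketch of how a job server could consume those notifications: the snippet below reads the JSON lines that `keybase chat api-listen` prints and kicks off a build when something that looks like a push notification arrives on the team channel. The exact JSON shape I match on here is my assumption from poking at the output, not gospel, and `build-and-deploy.sh` is a hypothetical job script:

```python
import json
import subprocess

TEAM = "my-team"  # the Keybase team whose chat we watch

def is_push_notification(raw_line, team=TEAM):
    """Return True if an api-listen JSON line looks like a git push notification.

    Assumes (hypothetically) the shape {"type": "chat", "msg": {"channel":
    {"name": ...}, "content": {"type": "text", "text": {"body": ...}}}}.
    """
    try:
        event = json.loads(raw_line)
    except json.JSONDecodeError:
        return False
    if event.get("type") != "chat":
        return False
    msg = event.get("msg", {})
    if msg.get("channel", {}).get("name") != team:
        return False
    body = msg.get("content", {}).get("text", {}).get("body", "")
    return "pushed" in body  # e.g. "alice pushed 1 commit to secret-sauce"

def main():
    # Stream events from the Keybase CLI; trigger a build on each push.
    proc = subprocess.Popen(["keybase", "chat", "api-listen"],
                            stdout=subprocess.PIPE, text=True)
    for line in proc.stdout:
        if is_push_notification(line):
            subprocess.run(["./build-and-deploy.sh"], check=True)

if __name__ == "__main__":
    main()
```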
$ keybase chat api-listen --help
NAME:
   keybase chat api-listen - Listen and print incoming chat actions in JSON format
USAGE:
   keybase chat api-listen [command options]
To log in to Keybase and listen to those messages, let's say on a job server, we can utilise the keybase oneshot command:
$ keybase oneshot --help
NAME:
   keybase oneshot - Establish a oneshot device, as in logging into keybase from a disposable docker
USAGE:
   keybase oneshot [command options]
keybase oneshot requires a paper key as its auth credential. So now the question is: where do we store the paper key, and how do we provide it to the job server? One answer is Hashicorp Vault.
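A sketch of how the job server could pull the paper key from Vault and log in. The secret path `secret/ci/keybase`, the `paper_key` field and the `ci-bot` username are hypothetical names of my own; the `hvac` Vault client and the `keybase oneshot` flags are real:

```python
import os
import subprocess

def oneshot_command(username, paper_key):
    """Build the `keybase oneshot` invocation for a disposable CI login."""
    return ["keybase", "oneshot", "--username", username, "--paperkey", paper_key]

def login_from_vault():
    # Hypothetical layout: the paper key lives at secret/ci/keybase
    # in Vault's KV v2 store.
    import hvac  # deferred import so the helper above stays dependency-free
    client = hvac.Client(url=os.environ["VAULT_ADDR"],
                         token=os.environ["VAULT_TOKEN"])
    secret = client.secrets.kv.v2.read_secret_version(path="ci/keybase")
    paper_key = secret["data"]["data"]["paper_key"]
    subprocess.run(oneshot_command("ci-bot", paper_key), check=True)
```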
What does this all mean ?! Now we have:
- end-to-end encrypted Git repos
- end-to-end encrypted messaging system (pun intended!) to trigger CI jobs on new commits from the CLI
- a job server that can read both authentication credentials and source code from end-to-end encrypted data sources
- as a bonus, we can also put in place Keybase powered ChatOps! The job server can listen to custom commands and start jobs! I love exclamation marks!
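On that ChatOps bonus: a minimal sketch of a command parser the job server could run on incoming chat messages. The "!" prefix is my own convention, not anything Keybase defines:

```python
def parse_command(body, prefix="!"):
    """Parse a chat message like "!deploy prod" into ("deploy", ["prod"]).

    Returns None for ordinary chat messages, so the listener can
    ignore normal team banter and only act on explicit commands.
    """
    body = body.strip()
    if not body.startswith(prefix):
        return None
    parts = body[len(prefix):].split()
    if not parts:
        return None
    return parts[0], parts[1:]
```

The listener from earlier could call this on each message body and dispatch, say, "deploy" to the build script.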
So, putting it all together, I have the following architecture in mind:
The final question is: what about data on the Job Server itself? Data is end-to-end encrypted all the way to the Job Server. The Job Server would run on a cloud instance, let's say Jenkins running on AWS EC2. The Job Server clones the repo inside the instance, to the filesystem or to memory. Instance data can be encrypted at rest. An OS process then executes to build the static site. Can any actor other than me, for example an AWS employee, access a running instance's memory? As soon as unencrypted data hits AWS server hardware, it is game over! Right?
(╯°□°)╯︵ ┻━┻ !
So I guess the only fully end-to-end solution is to run a Job Server on my Raspberry Pi in my kitchen.
As for a reference implementation of all the above, I will "kick the can down the road" until another post. I am still deciding what to utilise as the job executor itself, as a replacement for Gitlab CI. Currently leaning towards Jenkins X. Jenkins is a bit overkill, but it can poll a local Git repository and has webhooks.
Feel free to get in touch if you have any thoughts on this. Or want to kick my ass for any dumb statements.
Since publishing this 30 minutes ago, it has hit me that I don't need Hashicorp Vault. Secrets and job configuration can live on the Keybase filesystem 🤠.