Mentalbrew

Web Development Craft

Nickel and Diming Myself

It wasn’t that long ago that spending money on tech usually meant a big-ticket purchase. While there are still plenty of expensive things to buy (tablets, smartphones, etc.), I’m finding that a decent chunk of my outgoing money is tied up in a bunch of little one-off purchases and subscriptions.

While I have found value in those things, I’ve decided this month to cancel them all and find out which ones are actually necessary and which aren’t. I’m also scaling back for a month on one-off purchases like ebooks (which I buy often).

Why?

The biggest reason is that I want to curb the impulse buys that add up over the course of a month. Many of these things are completely worth the price paid for them, but I end up not using the item or reading the ebook until a much later date, if at all. I simply get it because it’s something I’m interested in and it’s cheap. What’s $5 here or there, right?

Another reason for the change in habits is simply this: without constant new items to keep my attention or distract me, I’m hoping to see a more focused approach to what I’m working on now. Many times I feel I’m running a mile wide and an inch deep. Gaining perspective on what I should be doing, and making time for the things I want to do, is a goal hindered by lack of focus.

From Here To There

I’m going to run this as a trial for a month and see how things go. I’ll see what improves and test the idea that I’m somehow missing out on something if I don’t have these subscriptions or items. Hopefully, reducing the peripheral clutter will illuminate the things around me that are really important.

The Inflammatory Need Not Apply

Yesterday I posted an article on something that I think could be improved and/or make a good addition to the Linux distribution world. I, however, went about it the wrong way and worded it in a manner that was nothing more than inflammatory.

This, as you would expect, wasn’t well received by some and rightly so. Sometimes it is easy to forget that real people work on open source projects and they have feelings. It’s also easy to forget that open discussion on any topic should be voiced in a manner that invites discussion rather than shutting it down.

Years ago I decided on some simple principles for how I would conduct myself online and I thought I would share them here:

  • Don’t be anonymous - While I can respect a person’s right to be anonymous, I personally feel that having my real identity associated with my words and actions is necessary for my improvement as a person and for openness with others. It’s easy to attack or deride others behind a veil of anonymity, but it becomes much harder when I can be just as easily scrutinized.
  • Don’t Delete - I’ve made it a point to never delete or edit a post I’ve made publicly. If I ever decide to edit something, it would be to add an addendum to my post, never to remove or conceal something I said. The few times I have removed something, it was at the request of another person involved, which to my recollection has happened only twice, and neither case concerned a personal opinion of mine or denigrated anyone. What this means is that there will be stupid things I have said that I regret or would word differently, but I’m treating them as spoken words, which cannot be unsaid.

I have a new one after yesterday:

  • Be nice - Basically, treat others how I would want to be treated and encourage rather than discourage.

Hopefully, more thought will go into my future discussions and comments and, more importantly, the transparency will remain, as always.

Where Linux Needs to Change

I got started with Linux around 1996/1997-ish. At the time I was completely fascinated by the fact that there was something out there that wasn’t Microsoft or Apple. I had been a Windows user my entire life and the idea that people would come together and provide a free operating system and related software greatly interested me.

So, over the years I investigated Linux, and not long after switched to dual-booting Linux and Windows. Considering where Linux was and what it provided when I started, I’m still amazed I was ever able to get things accomplished with it.

It wasn’t uncommon for me to switch distributions weekly to try out all there was in Linux. RedHat, SuSE, Slackware, Mandrake (Mandriva), and on and on. I became a master at backing up my home directory and re-installing an entire system within an hour.

I remember distinctly doing web development freelance work on Slackware, using Gimp for my graphics needs on the newly minted Gnome 2.0 desktop environment that I compiled from source by hand.

Eventually, I moved away from dual booting anything and simply used Linux as my one and only operating system.

So what I’m about to say is not coming from somebody who briefly used Linux and decided it’s not ready for the desktop. I believe experience-wise, Linux distributions can provide a great user experience for the newcomer and superuser alike. My relationship with Linux goes back over 15 years. I have used some distribution of Linux for my main and only desktop for over 12 years and every server under my personal management is running Linux. I really like Linux.

Where Things Broke Down

Having discussed my history with Linux, some may be surprised that my most recent laptop purchase was nothing other than a MacBook Pro. Why? Whhhhhhyyyyyy?! (Think of George from Seinfeld shaking his fists in the air.)

There are two primary reasons I switched to Mac for my development needs:

  • It’s Unix-like. I get most of the benefits of being on Linux and/or Unix that I’ve come to love. Having a solid command line with all of the Unix tools I’m used to is a huge win for me.
  • I can manage my development requirements to my liking. My OS version doesn’t dictate which specific version of a tool I have to run.

Of these two points, I would like to address the second one and how I believe Linux would be better off if things changed.

Choices Are Essential

I’m a polyglot developer. I work in PHP, Ruby, and some Java. I also work with software that is considered legacy, and therefore need to manage my requirements accordingly. So I often run multiple versions of PHP and Ruby.

Ruby has largely solved this issue with tools like RVM and rbenv. But the PHP issue remains, and I don’t see how it’s going to get any better.

Basically, my whole point is that a version of a distribution, say Ubuntu 12.04, dictates a version freeze on every piece of software included in that distribution.

This applies to everything, not just PHP, which I am using in this specific instance. Because of how EVERY major distribution is designed, they all share one common flaw: whatever version of software I receive by using that distribution will stay the same FOR THE LIFE of that distribution, and trying to get newer versions of that software will only create headaches and heartache for the user.

You would literally need to wait for a newer version of YOUR ENTIRE OS to get an updated version of just one piece of software running on it.

This is stupid. Just plain dumb. The work that distributions go through to backport bug fixes and security updates for every piece of software on their release is a waste of resources.

There is no good reason why the OS needs to dictate what version of git, f-spot, LibreOffice, Tuxracer, or whatever I would like to run.

The Solution

I would love to see a distribution that did three things:

  • Provide a stable platform w/ bug fixes and security updates for the critical pieces of the operating system.
  • Provide a consistent framework for integration with third party apps and let the App owners take care of the rest.
  • Universal binaries that work across multiple versions of the OS, within reason. The FatElf project looks like a great step in this direction, but GPL issues may interfere with that.

Conclusion

It is ironic that the Unix mantra of “Do one thing and do it well” is completely ignored: the standard practice for current distributions is a kitchen-sink, take-it-or-leave-it approach for the entire OS stack. This is asinine and unnecessary.

Project Launch and Lessons Learned

A little while ago I finished up a project developing a calling card website and store for my friend Evan Calkins at Hoban Cards. The business creates hand-made calling cards on one of several vintage presses he has in his shop.

This project turned out to be a great learning experience in several ways, and since Evan is a designer himself, it was really nice that I only had to work on development rather than design the site.

Deploy, Deploy, Deploy

The first thing I tackled when working on the application was getting it deployable right off the bat, even if it was nothing but a shell.

We were going to host this on a Rackspace virtual server, which I manage myself. Getting everything set up in a quick and easily deployable form took more time upfront but paid off quite a bit in the long run.

I was surprised at the benefits gained simply by making things easy to push out and make live. The biggest noticeable difference was that Evan and I could literally work side by side and quickly have our changes live as we iterated over the site. No messing with copying files, checking permissions, database migrations, etc. Either of us could deploy at any given moment and roll back if things broke. That freed us to work on what was important at the time instead of waiting on each other to finish.

We found quite a few bugs early that would otherwise only have cropped up in production. Building our deployment first meant we caught these issues right away and could deal with them in the context of the release, rather than as one big ball of ugly to fight at the end of the project when things are really tight.

Use Pre-Existing Tools, Replace Later

One thing I found very helpful was using Twitter Bootstrap for both the frontend and backend UI. This let me focus on the code and how things worked rather than dwell on where to put things and why they didn’t do what I wanted.

Once we were far enough along, Evan was able to slice up his design and completely replace the frontend with ease. Since Bootstrap worked so well and there was no driving reason to replace it, we decided to keep it for the backend.

Git And Friends

The project would have definitely died without Git (http://git-scm.com/). It allowed us to work separately on the application without stepping on each other’s toes. Should something come up that we had both worked on, merging took care of the differences and life was good.

Since this was not an open source project, we hosted it privately using Gitolite (https://github.com/sitaramc/gitolite). If you don’t want to store all your private code on Github, hosting your own is relatively easy, and the fine-grained permissions make Gitolite a great asset in your toolbox.

Stripe and Paypal

The previous incarnation of Hoban Cards was a simple site with Paypal. While Paypal has served Evan well, he despised its backend, and with many international customers there was some concern about his options if Paypal decided to contest something.

So we decided to integrate both Stripe payment processing and Paypal, giving the customers the choice.

Since launching, we’ve been pleasantly surprised that most customers chose the on-site card processing, and those who chose Paypal replied that they only did so because it was easier than getting their credit card out. Most expressed that they did not care one way or the other.

Moving forward, Evan may choose to remove the Paypal option altogether.

Conclusion

This was not a big site or application by any means, which actually turned out great for developing a process of constant iteration. At the end of the day, the project was fun for both of us to work on, customers got a great purchasing experience, and Evan got a backend to manage orders and keep things organized.

Tools

  • Rails 3.2
  • Postgresql
  • Twitter Bootstrap
  • Stripe
  • Paypal
  • RapidSSL
  • Ubuntu
  • Rackspace Cloud Server
  • Capistrano

Web Development Is Not for Beginners

This article is mainly a response to a blog post I came across a while ago, Rails is not for Beginners, but it’s related to a larger topic I’ve wanted to write about.

The crux of the blog post was that the overhead and complexity of Rails are so great that they make it harder for beginners to get started in web development.

I’m not going to disagree that Rails is hard for a beginner to learn, but I think the underlying issue is not the framework but the entire industry.

When I started as a web developer (called a webmaster at the time) things were pretty simple. If you had HTML 3.2, a basic understanding of images and linking, and could figure out how to upload your files, you were doing pretty well.

Over the years the entire industry has matured and built more and more on those technologies to deliver more complex sites. My main point is that we are not developing the same websites we did 10 years ago, and therefore what we now call web development is not for beginners.

Requirements

When I look at the staggering number of technologies a person needs to know just to start out in web development, it’s incredibly tiring:

  • HTML5
  • CSS
  • Javascript or at least jQuery
  • HTTP Protocol (get, post, redirects, status codes, etc)
  • Web servers (apache, nginx, or whatever for development)
  • Git/Mercurial (if you’re doing it right)
  • Browser quirks
  • Code editor (can’t get much done without one)
  • SSH/FTP
  • Server side language (php, perl, ruby, python, java, etc)
  • Frameworks (rails, zend, codeigniter, catalyst, etc)
  • CMS/Blogging (drupal, wordpress, refinery, typo3, etc)
  • Databases (MySQL, Postgres, sqlite, etc)
  • DB Theory (normalization, queries, etc)

This is just to get started if you want to be able to provide services to clients. Once you’ve picked your stack, you get the joy of maintaining knowledge of a moving target.

I haven’t even touched on what the ever-growing mobile market is doing to web design and development, or the things that need to be considered to make sites accessible to everyone.

Tools

The article also touched on the difference in codebase size between Rails and Sinatra. Ignoring the apples-vs-oranges issue, the main thing that jumped out at me was that while Sinatra might be a lighter, easier framework to learn, when you need it to do something beyond simple functions you’re going to have to write it yourself. That’s not so much a problem, unless of course you’re a beginner and wouldn’t know where to start.

At least with Rails there’s a very good chance a solution already exists that you can use without really having to get your hands dirty.

I’m not attempting to put down Sinatra, as I think it is a great framework and has a strong place in a web developer’s arsenal, but its usefulness needs to be weighed against what a person with a beginner’s understanding can actually build with it.

At the end of the day, all of these things are tools and skill can only really be forged from experience and practice.

Keep on

Learning things is hard. I’m not writing this to discourage anyone from becoming a web developer. But like developing any skill, you start small and build from there. Where I disagree with the author is that the difficulty of web development doesn’t lie in the design of a specific framework, but lower down, in the technologies that framework is built upon.

Fear of Failure

Like any good developer, I’m constantly working on my craft to become better and more efficient. I study what others are doing; I subscribe to newsletters, screencasts, and any other information on web development that will inspire me to up my skills and stay motivated.

As such, I’ve found that even code I wrote just a few short months ago now seems stupid and embarrassing to me. This wouldn’t be such a big deal if the same issue didn’t affect another aspect of my development: code sharing and publishing.

Basically, since I know that I’m probably not doing something the best way it can be done, or in a manner that is deemed “proper”, I fear putting my code out there for all to see, risking that somebody who knows far more than me will call me out on my amateurish attempt at programming. In short, I fear looking stupid.

This thinking serves no good purpose, since the times I have grown the most in any discipline I’ve practiced were when I was able to learn from others and accept critique of my most recent work. Sharing my work and learning from others is a critical step in developing any skill.

What Can Be Done?

I was inspired to write this by a recent article from a developer who is well respected in the Ruby community. He described his development process, including the mistakes he makes and the bad decisions he has made in the past. He even detailed how he doesn’t follow the exact procedures considered “blessed” by the established leaders.

Fear is a powerful force. The worst type of fear is not that which is external but that which is internal. Your own fear is hard to defeat and can leave you paralyzed. Here are a few of the fears I face specifically in development:

  • Unknown - Sometimes I procrastinate on doing something because I’ve never done it before. It’s something totally new to me and perhaps I don’t even know where to start. There are times when I have several solutions to a single problem and fear selecting the wrong one, even after a great deal of thought.
  • Monumental - There have been times I’ve literally been frozen in front of my screen trying to get started but I have no idea where to start due to the complexity and scope of the project. Breaking tasks down into small pieces and timeboxing myself usually gets me going in the right direction.
  • Criticism - I fear posting something I’ve worked on due to the possibility that I may be denigrated for doing it wrong or poorly. I will never know how well I’m doing, however, until others can review my work and give me feedback.

These are just a few of the things that hold me back. I’m sure you have plenty of your own.

The Solution

There are several keys I’ve found to overcoming these obstacles and getting over yourself to get things done.

  • Surround yourself with good people. A good person wants to see you succeed and will do what they can to encourage you. Be that person for somebody else as well. The learning goes both ways.
  • Surround yourself with difficult people. Putting yourself in a situation where you need to keep your game up is also very motivating if you have the right attitude for it.
  • Metrics, metrics, metrics. Set goals for yourself that you can measure and constantly check to make sure you’re meeting those goals. Keep records so you can be reminded of how far you have come. Re-visit old code and see what mistakes you made then that are obvious to you now.
  • Observe others. See how they work. What tools do they use? What philosophy do they follow?
  • Get it out the door. If you have a project that you are waiting to be “just right” before releasing it to the world, stop, and push it out the door. Chances are nobody is going to care about your project at all, let alone run through it with a fine-toothed comb.

Conclusion

Fear of failure or rejection needs to be conquered. It’s dead weight that will hold you back from pursuing your dreams. Anything that gets in the way of progress needs to be dealt with. I’m actively working on dealing with those fears. If you recognize any of these traits in you then I would encourage you to sit down and make a plan to overcome them. Find what motivates you and move forward.

Using Rbenv to Manage Your Ruby Installs

I’ve used RVM for quite a while to manage my Ruby installations, both on my development machines and on my servers. Early on, using RVM on production machines was pretty difficult, but support for everything I needed soon followed and life was good.

Recently, RVM has been challenged by a newcomer in the Ruby community called rbenv, created by Sam Stephenson of 37signals.

The Conflict

The introduction of rbenv was not without controversy, as the new project appeared to be derogatory toward the work Wayne Seguin had put into RVM. This naturally rubbed people the wrong way and led to some spirited debate over which tool was better.

I really didn’t care at the time which was technically better, as RVM simply worked for me. I never ran into the issues opponents of RVM cited regarding the overwriting of the cd or gem commands. My work never exposed me to any of these problems, and my sense is that very few people actually hit them; those who simply did not like RVM were quick to use them as another excuse to bash it.

One thing led me to try rbenv over RVM: overhead. RVM introduces more complexity throughout your environment that has to be planned for, both on your development machine and in production. rbenv is simply a little closer to the metal while providing a workflow similar to RVM’s.

I would encourage a person to try both and see which they like most and why. What follows is the installation and workflow I use with my new rbenv setup.

The Players

In typical Unix/Linux tradition rbenv is a single function utility that is best used when combined with other utilities. In my setup I will use the following:

  • rbenv - Provides the ability to switch between installed ruby versions by using “shims” in your environment path.
  • ruby-build - Downloads and installs the different ruby versions as “definitions”. You can download and compile the ruby installs by hand for rbenv to use, but this simply makes it easier.

Setup

The first step is to clone the rbenv project into your home directory under ~/.rbenv like so:

```bash
git clone https://github.com/sstephenson/rbenv ~/.rbenv
```

All of your ruby installs will then be stored in ~/.rbenv/versions.

The next step is to add the rbenv path to your shell environment. In bash this would be either .bashrc, .bash_profile, or even .profile. If you are using Zsh you can place it in .zshrc or .zshenv.

```bash
export PATH="$HOME/.rbenv/bin:$PATH"
eval "$(rbenv init -)"
```

You can now start a new shell or run exec $SHELL to use the new configuration.

This alone gives you the ability to switch between multiple ruby installs, but ruby-build provides a nice, easy way to install them so you don’t have to do it by hand. To install ruby-build, first clone the git repo:

```bash
git clone https://github.com/sstephenson/ruby-build
```

Now change into the ruby-build directory and run:

```bash
cd ruby-build/
./install.sh
```

If ./install.sh doesn’t work you may need to run it with superuser privileges, like so: sudo ./install.sh.

Now you should have everything you need to manage multiple ruby installations.

Workflow

The first step is to install a ruby version of your choice. This is very easy with ruby-build:

```bash
ruby-build 1.9.2-p290 ~/.rbenv/versions/1.9.2-p290
```

That is more work than I like to do, and ruby-build makes it easier by adding an install command to rbenv for easy local installation:

```bash
rbenv install 1.9.2-p290
```

The previous command takes care of where to install your ruby versions which is one less thing you need to remember.

Once I have a ruby version installed, I like to set one as the global ruby interpreter for my shell sessions, like so:

```bash
rbenv global 1.9.2-p290
```

Now you can test to make sure you have the correct ruby version by typing:

```bash
ruby -v
```

The Rehash Annoyance

One very annoying issue with rbenv is that it needs to be told to rehash its shims after you install a new ruby version or any gem that provides a binary. There is a solution in the works, but until it’s ready, if you find yourself wondering why a command isn’t working when you just “gem installed” it, try running:

```bash
rbenv rehash
```

Per Project Ruby

If you have a project that needs to be configured to use a specific ruby version you can change to the project directory and type:

```bash
rbenv local 1.9.2-p290
```

This will drop a .rbenv-version file in that folder specifying the ruby you want. Any time rbenv comes across a .rbenv-version file, it will default to using that specified ruby version.
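
Under the hood the pin is nothing magical: .rbenv-version is just a plain text file holding the version string, so you can inspect (or even create) it with nothing but the shell. A quick sketch, no rbenv required:

```shell
# .rbenv-version is a one-line text file naming the ruby to use.
# (1.9.2-p290 here matches the examples above.)
echo "1.9.2-p290" > .rbenv-version
cat .rbenv-version
```

Because it is an ordinary file, you can commit it to your project’s repo so everyone on the team picks up the same ruby.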

System Ruby

If you need access to the default system ruby, for example to compile the Command-T extension for Vim, you can simply type:

```bash
rbenv shell system
```

This command tells your current shell that you want the system ruby version available. It will not affect other shells you have open or any new shells you create.

Conclusion

rbenv has very low overhead for everyday use and follows the Unix/Linux tradition of doing one thing and doing it well. I purposely didn’t touch on gemsets, as most of my projects use bundler and I had no issues without them. If you need something comparable to RVM gemsets, you can use the rbenv plugin created by Jamis Buck called rbenv-gemsets. Be aware that it currently has issues under Zsh if that’s the shell you use.

All in all, rbenv is a solid replacement for RVM. Aside from having to rehash the shims, which should hopefully be resolved soon, I find using rbenv a good experience.

Update - I forgot to mention, and was reminded in the comments, that RVM and rbenv are NOT compatible; you will need to uninstall RVM before moving forward. It’s a really easy process. Simply type:

```bash
rvm implode
```

This will remove RVM from your system. You will still need to go through your .bashrc/.bash_profile configs and remove any RVM information there.

A Simple Git Workflow

In my previous post on git I covered the basics of hosting your own repositories for projects you don’t want on Github, or for which you simply don’t want to pay for private hosting.

Today I will go over a very basic and simple Git workflow that easily works for small teams and is simple to keep track of in your head.

Basics

I like to keep my master branch production ready at any given moment. There are several reasons for this:

  • Many third-party tools don’t need to be configured to work with the master branch, as it’s almost always the default.
  • When somebody clones your repo, master is usually the branch they start on. Having master be the stable production branch is easier on them.

So, once we decide we want our master branch to be production ready, we can move on to how the rest of the workflow develops.

Development

I normally start with a development branch early on, when I don’t yet have anybody else working on the project.

When working in development I don’t concern myself with whether or not things are in a working state. The goal is to make sure any changes we are making live on the server, so that changes don’t sit for long stretches only on the local development machine.

Since the product is very young and nobody else is currently working on it, I initially use development to flesh out the workings of the application. Once things are good, I get everything passing and merge into master.

Getting The Ball Rolling

Now that we have a stable master branch and a development branch for ourselves this is where you can start implementing a team approach to working on the app.

Team members are encouraged to use a topic branch, both locally on their machine and tracked on the server, to make sure work is being backed up regularly. When their topic branch is done, they merge it into development. If everything in development looks good, you can then merge into master and deploy immediately.

Deploying immediately is very important, as you don’t want code sitting in master that isn’t being exercised in production. Only by running in production will you know that the code works.

Assumptions

You may wonder why you don’t just make sure your topic branch is good and merge directly into master. This is totally fine if that’s how you want to manage things. With Git there is no ‘one way’ to do things.

I like having a development branch so things can be reviewed before being deployed. If you have people on your team who are not comfortable with deployment it’s good to have a process where you can oversee what’s going on in the system.

Walkthrough

So, assuming we have a local repo tracking a server repo with just a master branch, we can create our development branch like so:

```bash
git checkout -b development
```

Then push that branch to the server and track it:

```bash
git push -u origin development
```

You can also create your topic branches the same way:

```bash
git checkout -b topic_branch
git push -u origin topic_branch
```
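
To see the whole promote flow end to end (topic branch into development, development into master), here is a self-contained sketch you can paste into a scratch shell. The repo, branch names, and identity are placeholders, and the temporary directory keeps it away from any real project:

```shell
set -e
cd "$(mktemp -d)"                 # scratch repo so nothing real is touched
git init -q
git config user.email "dev@example.com"
git config user.name "Dev"
git commit -q --allow-empty -m "initial commit"
git branch -M master              # make sure the stable branch is named master
git checkout -q -b development    # the shared integration branch
git checkout -q -b topic_branch   # a team member's topic branch
echo "new feature" > feature.txt
git add feature.txt
git commit -q -m "add feature"
git checkout -q development       # topic work is done: merge into development
git merge -q topic_branch
git checkout -q master            # development looks good: promote to master
git merge -q development
git log --oneline master          # master now contains the feature commit
```

In a real project you would follow the final merge with git push origin master and deploy immediately, as described above.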

Eventually, when you’re all done with a topic branch, you can simply remove it from the server by typing:

```bash
git push origin :topic_branch
```
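
One footnote on that command: the leading colon in the refspec only deletes the branch on the server. Your local copy sticks around until you remove it with git branch -d, which safely refuses if the branch still has unmerged work. A self-contained sketch (scratch repo, placeholder names):

```shell
set -e
cd "$(mktemp -d)"                # scratch repo so nothing real is touched
git init -q
git config user.email "dev@example.com"
git config user.name "Dev"
git commit -q --allow-empty -m "initial commit"
git checkout -q -b topic_branch  # create and switch to the topic branch
git checkout -q -                # jump back to the previous branch
git branch -d topic_branch       # -d is the safe delete; use -D to force
git branch                       # topic_branch is gone from the list
```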

Conclusion

This isn’t the only right way to manage your development workflow. This isn’t a wrong way either. Pick and choose what will work best for your team and discuss it with them. You may find some things work great for your team and others fail miserably. The beauty of Git is it lets you choose. It also gives you enough rope to hang yourself.

Using the Pomodoro Technique for Development

The deadline is looming, the project you’re working on is very complex, things are broken, and you have more features to implement. If you’re like me this kind of stress paralyzes your brain. You just stare at your screen willing some magical force to push your programming into action but the more you try to focus the more scattered things become.

It’s like trying to go to sleep because it’s late and you have to get up in the morning but you spend your time counting how many hours of sleep you’ll get if you could just fall asleep “right now”.

If this scenario or something like it has played out for you before I would like to introduce you to a system that has helped me get things back under control and structured.

The Pomodoro Technique

The Pomodoro Technique is a timeboxing practice where you list out all of the things you need to get done and prioritize them, not unlike the “Getting Things Done” system. The main difference is that once you’re done making your list, you pick out only the items you think you can accomplish today.

Once you have your “Today” list done you then choose the most important of those items and prepare for a pomodoro.

A pomodoro is a unit of time, usually 25 minutes, during which you focus completely and solely on the task at hand. No interruptions: no email, no phone, not even co-workers bothering you.

The goal is to get through the entire 25 minutes focused completely at what you need to do.

When the pomodoro is up, you take a 5 minute break and spend the time doing something unrelated to your main task. Get some water, go to the bathroom, whatever. Once your 5 minutes are up, you start the timer again for another 25 minute block.

How Does This Help?

Time is the one thing in this life we cannot control. Everybody is allotted the same amount. By timeboxing your current activity, you allow your mind to train itself for sustained focus on a particular task.

Once you’ve done a few pomodori (the plural), you’ll see the difference in how quickly your mind can get back on task.

Knowing that you only have to concern yourself with the next 25 minutes also relieves your subconscious of those nagging tasks that would normally occupy your mind and keep you distracted.

How To Get Started

All you really need to get started is paper for making a list and a timer of some sort. Kitchen timers work well, but there are also timer applications for Windows, Mac, and Linux that can help you keep track of finished pomodori and even set your chat applications to “Away”.
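
If you’d rather script it, a kitchen-timer equivalent is only a few lines of shell. This is just a sketch with the classic 25/5 split as defaults; the function name and messages are my own, and you could swap the echo lines for any notifier you like:

```shell
# pomodoro: run one work/break cycle. Durations are in minutes.
pomodoro() {
  work_min=${1:-25}     # classic default: 25 minutes of focus...
  break_min=${2:-5}     # ...followed by a 5 minute break
  echo "Focus for $work_min minutes. No email, phone, or co-workers."
  sleep $((work_min * 60))
  echo "Break time: $break_min minutes. Get some water, step away."
  sleep $((break_min * 60))
  echo "Cycle complete. Start your next pomodoro."
}
```

Run pomodoro for a standard cycle, or something like pomodoro 50 10 if you prefer longer blocks.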

Conclusion

Do I use the pomodoro technique every day? Nope. Although I really like it and find it very valuable, I only turn to it when I’m having a hard time focusing, or am very stressed about the mountain of work ahead of me and need to get things back on track.

At the end of the day it’s a tool to help you do better.

Hosting Your Own Git Repositories

Coming from a background as a Linux administrator, I manage my own development server. It’s a VPS that I pretty much store my development life on. Setting up a remote git repository over SSH is not as straightforward as it is with Mercurial or even Subversion. This is my technique for managing my own private repositories.

Requirements

  • Linux/Unix - Whatever flavor you choose.
  • Git - Installed on both the server and locally
  • SSH - You will need to have this installed and configured

On The Server

It’s best to decide where you want to store your git repos. I like using /srv/git for everything, but you can place them pretty much anywhere you want, even your home directory if they’re just for you.

In this directory you’ll want to create a bare git repo with this command:

```bash
mkdir reponame.git
cd reponame.git
git init --bare
```

That’s pretty much all there is to it on the server end of things. Using git over ssh is super simple.

On Your Local Machine

On your local machine you will need to create a git repo and make at least one commit in it.
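
If you’re starting from scratch, getting that local repo to the one-commit mark takes only a few commands. A minimal sketch, with placeholder names throughout:

```shell
mkdir reponame && cd reponame
git init
# identify yourself to git if you haven't already (local to this repo)
git config user.email "you@example.com"
git config user.name "Your Name"
echo "My project" > README
git add README
git commit -m "initial commit"
```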

If you’re wondering why you can’t just clone your git repo from the server and start there: git won’t allow it, as technically there is nothing in your server’s git folder yet.

If you have a repo on your local machine with at least one changeset, you can add your server as the remote origin like so:

```bash
git remote add origin user@yourserver.com:/srv/git/reponame.git
```

Once that is done you can then push your local master to the server and you’re good to go:

```bash
git push -u origin master
```

The -u turns on tracking for that branch, so git will notify you if you need to push changes back to the server.

Conclusion

It’s pretty simple once you know the steps, but if you’ve come from a Subversion or Mercurial background it doesn’t make much sense at first.

If you are running Redmine on the same local machine you can easily add your server git path to a project and browse your changesets through Redmine.