Slide Design for Developers

This is a great article by a guy over at GitHub about slide design. He makes some really good points, and his slides really are beautiful. I want to see more presentations with these design pointers in mind.

More on SOPA and Protect-IP (Don't break the internet!)

As I said in my previous post, SOPA (the Stop Online Piracy Act) and Protect-IP are both very bad pieces of legislation. They are not only technically damaging for the internet, but they are also unconstitutional. When I started doing my due diligence the other day with regards to these bills, I didn't know much about them. After some research, though, I have discovered just how bad these bills are. Here's some more information, in a post by Adam Savage (from MythBusters).

That article is not that technical and doesn't really quote the bills much. For the full story on the bills and just how technically bad they are, as well as how unconstitutional they are, read the Stanford Law Review's take on the bills.

These bills are being pushed hard by the entertainment industry, and not necessarily for bad reasons -- piracy is an issue. But these bills are technically stupid, won't actually solve anything (the second link addresses this -- they would just force pirates to use alternatives to DNS, which is bad for the internet as a whole and won't stop their activities), and put the US on a slippery slope towards censorship like Iran or China. The bills are being fought by every single major tech company. (And probably all the non-major tech companies as well -- they're just not as visible.)

Leave your thoughts in the comments. And speak up to stop SOPA and Protect-IP!

Stop SOPA (Stop Online Piracy Act)

I've heard a lot of chatter about SOPA (the Stop Online Piracy Act), which is currently going through Congress. However, today I finally decided to look into it. And I definitely didn't like what I found. Here's a video summing up the problems (the video is for the PROTECT IP Act, which was introduced earlier this year, but most of the same problems apply to both bills). (Found via Matt Cutts's blog)

PROTECT IP / SOPA Breaks The Internet from Fight for the Future on Vimeo.

Watch the video. Then Google "SOPA" for more information. You can also sign a petition against the act here. Visit Matt Cutts's blog post for some more links to information and ways you can get involved. Join with me and so many others in speaking out against SOPA!

Android vs. iOS

My iPhone 4S came this last week. Previously, I was a proud owner of the HTC Evo 4G, a great phone. But still I switched. Since I've had a few people ask me why, I decided I'd better just blog it. Sorry for my rambling.

Both platforms have their advantages. I suppose I'll go platform by platform and examine some pros and cons:


With iOS 5, an iPhone was finally a valid alternative to my Android phone. Previously, I never would have switched, because of the lack of a good notifications system. If a push notification came in, it would interrupt whatever you were doing, sometimes to the effect of restarting your progress in a game or something. This in and of itself was an annoyance, but could be overlooked. The main problem was that once a notification popped, if you didn't immediately go to that app and take care of it, it was gone. Poof. You couldn't decide to leave this or that notification for later, because they weren't being stored anyplace. That all changed with Notification Center in iOS 5.

This was basically just a modified copy of Android's version of notifications, which they got right from the beginning. Drag from the top of the screen and you have a list of all the notifications which you have not yet acted upon. In my opinion, Apple did it even better than Android -- you can define the order in which notifications are shown manually, or you can have it based on the time of the notification. You can also act on the notification straight from the lock screen by sliding-to-unlock using the notification icon. (Hard to explain, ask someone with an iPhone to show you if you don't know what I mean) It's a pretty schweet feature.

Other things iOS has going for it:

  • Very stable (rarely crashes, almost never have to reboot, etc)
  • Very clean (everything is hardware accelerated, so every action is smooth out of the box)
  • Higher resolution screen than most Android phones, even the ones with bigger screens. People complain about the screen size on iPhones, but you don't even notice because of how clear the retina display is.
  • More secure (all apps are sandboxed, making malware pretty much nonexistent)
  • More polished apps (I don't know why this is the way it is -- perhaps because of Apple's app store policies, but apps are so much more polished on average. It's true of both OSX and iOS) This is a huge one. The general level of quality, both of games and of daily-use apps, is much higher on iOS than on Android.
  • Better hardware, with very little fragmentation. You have great battery life, on every iPhone. It's very compact for its power. You almost never have to worry about your phone not meeting "minimum requirements." This is probably a large part of what makes the previous statement (polished apps) true -- developers can focus on the quality of the app, rather than focusing on making the app work for hundreds of different phones. You're also guaranteed to get all the iOS updates for at least a couple of years, something which 99% of Android users do NOT get.
  • Great camera. Yes, some Android phones also have great cameras, but many of them are pretty much junk.

The iPhone is a pretty great phone, but it does have some cons:

  • Limited customization. Can't replace the stock keyboard, very few homescreen customization options, etc.
  • Sandboxed apps. This is both a pro and a con. Apps can't really interact because they're sandboxed. You also don't have a file system in the classic sense, which makes it more limited for a computer replacement. Most people don't care about this con, but it's still there.

Those are the only cons I can think of right now, leave a comment if you find one I missed.


Android is also a great platform. I love Google, and love their products. Things Android has going for it:

  • The same type of notification system, tried and true.
  • Tons of customization options. There are many great options for homescreen apps which can change the look and functionality of your homescreen and app drawer drastically
  • Separation between homescreen and general apps. This means you can use your homescreen more like the desktop on your computer, with only apps you use often showing there, and the rest hidden in the "app drawer," which you can open at will from your homescreen.
  • True multitasking. This means that apps can run in the background and perform tasks without being open, something iOS largely avoids in favor of push notifications. However, it can have performance and battery ramifications, which we will explore in the cons section.
  • Apps have more freedom. This means that you can have Google Voice, for example, seamlessly integrate with the phone app, as opposed to being separate. It also opens the door to problems like malware.
  • Dedicated menu and back buttons. This is both a pro and a con, but I find myself missing a dedicated back button on iOS, and making them hardware buttons frees up screen space they would otherwise occupy.
  • Much better integration with the Google-sphere. Since everything I use is Google, (Gmail, Calendar, Contacts, Voice, Talk), having this integration is really nice. You can get pretty decent integration in iOS, but it takes more setup and is not as seamless.

Here are some cons:

  • Instability. I had to restart my phone every few days because it would become unresponsive or strange bugs would rear their heads. A pain when your phone takes a few minutes to restart.
  • Reliance on phone manufacturer for operating system updates. This is a big one -- unless you've rooted your phone and are using custom ROMs, most OS updates either never come, or are 6 months to a year late, and by then a new version has been released. This also causes problems with app requirements.
  • Inconsistent hardware quality. Another big one. Most Android touch screens are not as precise or as quick to respond as the iPhone screens. In addition, some of the Android phones are cheaply made, and/or have really bad battery life.
  • True multitasking. Like I said, this is a pro and a con. The con is that a frozen or buggy app can rampage in the background, sucking battery power and processing power. Even the ones that don't act up can continually suck battery life in the background. In addition, apps are much harder to kill on Android, since you have to go into system settings and wade through the list of apps to do it.
  • Malware. Without Apple-esque restrictions on apps, there is much more danger of malware. This is becoming an increasing issue as Android becomes more popular.
  • Lower app quality standard. We've visited this already.

Summary and Other Resources

I decided to switch from one of the better Android phones to an iPhone 4S. And unless something big changes in the future, I'll never go back. The stability and polish are important to me, as is battery life. I also will get the newest versions of iOS right as they are released for at least a few years, whereas Android users are often left out in the cold when new versions of Android come out. What you get will depend on what's most important to you. But even as a power user, I chose iPhone.

Here's a recent article on the subject: Link

Sorry for my rambling, be sure to leave a comment below with your opinions.

More on Python

My good friend Chad recently started learning about Python. After getting a fair bit into a book on the subject, he posted on Google+:

Language seems cool, though I haven't found a compelling reason to need it. I think Django is the main reason I want to use it. Calling upon the powers of +Julio Carlos Menendez and +Colton Myers to give me some examples where using Python is WAY better than another language.

I decided to try to collect some of the research that I found while selecting Python as my latest language-of-choice, and post it here. Hopefully it will be helpful to anyone who's trying to decide if Python is worth picking up. Be aware, though, this is going to be more a random collection of thoughts and links rather than a linear blog post on the awesomeness of Python.


I suppose one of the first things I should point out, before we get into any language specifics, is the community. I love the Python community. Whether it's the mailing list(s), or the #python channel on Freenode, the community is very active and very helpful. In fact, the #python channel is one of the most active on my IRC client. With all that help available, solving problems becomes much less daunting than in other languages.

Indentation as Syntax

One of the most immediately obvious things people notice about Python is the significance of whitespace. This is in contrast to almost every other modern programming language, and often throws people when they first discover it. I think of this part of Python as a very positive feature of the language: having indentation as part of the actual syntax of the language makes for very consistent code -- across almost every Python program, control structures look the same, because the indentation delimits code blocks, rather than braces. If this feature seems weird or unnatural to you, I suggest you try writing in Python for your next project -- the initial "weirdness" of this feature rapidly fades away, and if you're like me, you'll find yourself appreciating the innate readability that it gives Python code.
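To make this concrete, here's a tiny sketch of my own (the function name and data are made up) showing how indentation alone delimits blocks:

```python
def classify(numbers):
    """Split a list into evens and odds -- no braces, just indentation."""
    evens, odds = [], []
    for n in numbers:
        if n % 2 == 0:       # everything indented under the 'if' is its body
            evens.append(n)
        else:
            odds.append(n)
    return evens, odds       # dedenting back out ends the 'for' block

print(classify([1, 2, 3, 4, 5]))  # ([2, 4], [1, 3, 5])
```

Because the indentation *is* the block structure, there's no way for the code's visual layout to lie about what it actually does.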

Intuitive Language Design

For this next section, I'll quote an article by Eric Raymond, where he details his early experience with Python:

My second [surprise] came a couple of hours into the project, when I noticed (allowing for pauses needed to look up new features in Programming Python) I was generating working code nearly as fast as I could type. When I realized this, I was quite startled. An important measure of effort in coding is the frequency with which you write something that doesn't actually match your mental representation of the problem, and have to backtrack on realizing that what you just typed won't actually tell the language to do what you're thinking. An important measure of good language design is how rapidly the percentage of missteps of this kind falls as you gain experience with the language.

When you're writing working code nearly as fast as you can type and your misstep rate is near zero, it generally means you've achieved mastery of the language. But that didn't make sense, because it was still day one and I was regularly pausing to look up new language and library features!

This was my first clue that, in Python, I was actually dealing with an exceptionally good design. Most languages have so much friction and awkwardness built into their design that you learn most of their feature set long before your misstep rate drops anywhere near zero. Python was the first general-purpose language I'd ever used that reversed this process.

I've experienced this myself -- Python's design is such that it works much more fluidly with the solutions as they live in my head. I can just start coding and the solution flows easily from my brainwaves to working Python code. Try it, I think you'll be surprised.

While on the topic of Python's intuitiveness, I think we should talk about IDEs. C# is one of my favorite languages. Using Visual Studio, you can create very advanced and full-featured GUI applications on Windows with relative ease. However, the key part of that last statement is "Using Visual Studio". I find when I'm writing in Java, C#, Objective-C, etc., I end up relying heavily on IntelliSense to help me recall syntax and method names. In contrast, Python is designed so intuitively that I find I am able to write full-featured programs with only Vim and a few trips to the Python documentation to refresh the name of a certain function within a certain module. I think that's another testament to the great design of Python, and just how intuitive it is.

In addition, though C# can be used to create great Windows GUIs, Visual Studio is again used to abstract away the thousands of lines of GUI code it generates as you build the interface. Have you ever tried to write a C#, Java, or Objective-C GUI in a non-IDE text editor such as Vim? It's nigh unto impossible, because the syntax is so verbose and not intuitive enough to easily remember. Contrast this with Python's Tkinter toolkit, which allows one to create GUIs with relative ease, and with no reliance on an expensive IDE. And the Python GUIs look native on each platform.
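As a rough, hypothetical sketch (not from any real project of mine), this is about all the hand-written code a complete Tkinter window takes -- note there's no generated designer file anywhere:

```python
import tkinter as tk  # the module is named 'Tkinter' on Python 2

try:
    root = tk.Tk()  # needs a display; raises TclError on a headless box
    root.title("Hello, Tkinter")

    # A label and a button, laid out with the simple pack() geometry manager
    tk.Label(root, text="GUIs without an IDE!").pack(padx=20, pady=10)
    tk.Button(root, text="Quit", command=root.destroy).pack(pady=10)

    root.mainloop()  # hand control over to the event loop
except tk.TclError:
    print("No display available -- run this on a desktop machine.")
```

A dozen readable lines, all typed by hand in Vim, versus the wall of machine-generated layout code an IDE hides from you.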

Java vs. Python

Might as well throw in a side-by-side comparison of Java vs. Python. It shows some of the differences in verbosity and complexity between the two languages, even if only in small examples.

Java vs. Python


Obviously Python is not perfect. There are definitely downsides to having a completely dynamically-typed language like Python -- if you're not careful, you can have difficult-to-find bugs crop up. And the task at hand can sometimes require the speed of C, for example, or features from other languages -- Python is not a cure-all, and I don't pretend that it is. But for me, it's pretty close.
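As a tiny, contrived sketch of the kind of bug I mean: nothing in the language stops the wrong type from flowing into a function, so the failure only shows up at runtime, possibly far from its real source.

```python
def total(prices):
    # No type declarations -- anything list-like gets accepted here
    return sum(prices)

print(total([1.50, 2.25]))  # 3.75 -- works fine

# A string sneaks into the data with no complaint from the language...
order = [1.50, "2.25"]

try:
    total(order)  # ...and only blows up here, at runtime
except TypeError as err:
    print("Caught at runtime:", err)
```

A statically-typed language would have rejected that string at compile time; in Python you find out when (and only when) the bad value is actually used.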

What it comes down to is that I've come to really enjoy programming in Python, finding it intuitive, straightforward, and full of features that make my job easier as a programmer. And I hope that it treats you as well. =)

Vim -- why???

I use vim.

Vim is a 20-year-old text editor, based on the 35-year-old "vi" editor (pronounced vee-eye, not vie). vi and emacs are both part of an editor holy war that has been going on for decades. I discovered this holy war years ago, mentioned on some website or another, and started to research. I tried both editors briefly, and then decided to take vim's side of the holy war. At the time, this pick was kind of arbitrary -- I really didn't know enough about the two editors to make a truly educated decision. I think I saw vim as the underdog in the war, and as the more interesting of the two editors, with its strange idea of separate modes for text entry and for commands.

I've never looked back. When I first picked up vim, it was mainly so I'd have bragging rights that I could program in a crazy-old terminal editor. However, a few years later and I do everything I can to work in vim as exclusively as possible, whether for school, work, or personal projects.

People often comment on my choice. Why am I not using the feature-rich IDEs available for the language in which I'm coding? Why am I using such an old, outdated piece of software? (This question just shows ignorance -- the latest version of vim, 7.3, was released in Aug 2010) What's so great about vim?

That's what this article is about. Just as Jon Beltran de Heredia did in his article, I'm going to try to break some of the misconceptions surrounding vi/vim, and show you why vim is king. For those who are already convinced and are looking for some vim resources, jump to the resources section at the bottom.

Normal vs. Insert Modes

The first time you try vi/vim without any real introduction to it, the result is almost always the same. First, there's the disgust you feel when you find out that to even enter any text, you have to hit 'i' to enter insert mode. The normal and insert modes of vim are probably its most misunderstood feature, and they are what makes vim so powerful. But when they're misunderstood, the usual result is that you get into insert mode, use the arrow keys to navigate around, and do everything you can to stay in insert mode. That's how we've been trained -- if we enter a letter, that letter should appear on the screen. So you stay in insert mode, dink around for a few minutes, then throw your arms in the air, yell "What's the point??? This is so stupid.", and never come back.

It turns out that this is not the way to use vim at all. The key thing to remember with vim is that you stay in normal mode almost all the time, entering insert mode for short bursts of typing text, only to return immediately to normal mode. Jon Beltran hits the concept on the head in the article I linked to earlier:

Thus, the remembering-the-mode problem just doesn't exist: you don't answer the phone in insert mode to get back to vi and not remember where you were. If you are typing text and the phone rings, you exit insert mode and then answer the phone. Or you press 'Esc' when you come back. But you never think about insert mode as a mode where you stay.

He continues:

Let me explain the philosophy behind this.

Commands in vi/vim are meant to be combined - 'd' means delete, 'e' means 'move to end of word', then 'de' is a complete command that deletes to the end of the current word (something like Ctrl-Shift-Right, Left, Del in most regular editors).

When you compare 'de' to the 'Ctrl-Shift-Right, Left, Del' in most regular editors, you start to see the beauty of the system.

Interestingly enough, inserts are considered commands as well. If you type 'i' to begin inserting text before the current character, type a word or two, and then hit 'Esc', that entire operation is a command. This is important to remember because of another key piece of functionality: the '.' key. When in normal mode, the '.' key will repeat the last complete, combined editing command you executed. This could be the 'de' command we mentioned earlier, or it could involve inserts. For example, if you typed 'iHello' and then hit 'Esc', it would insert the word "Hello" before the starting location. Then, if you typed a '.', it would repeat that operation.

The interesting thing is that you can also add a number argument before almost any command (whether movement command or editing command), and that command will be repeated that many times. All these concepts can be combined to result in incredibly flexible editing power. Jon Beltran does a good job of a more in-depth exploration of the power of vim, so I'll link you over to his article again if you want to learn more.
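A few concrete examples of this composability (all standard vi/vim normal-mode commands):

```
dw        delete to the start of the next word
de        delete to the end of the current word
3de       do that three times
dd        delete the current line
5dd       delete five lines
ihi<Esc>  insert the text "hi", then return to normal mode
.         repeat the last complete editing command
```

Once you know a handful of operators and a handful of motions, every combination of them comes for free.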

Go Explore!

Vim is generally known for its very steep learning curve. I won't deny it, the learning curve is definitely there. However, I will say that if you can stick with it, you'll never regret it. Vim's key bindings allow you to basically ditch your mouse, as well as the carpal-tunnel-inducing key combinations other editors require for basic operations. Everything is at your fingertips, and it's so powerful! You can also find vim emulation plugins for many modern IDEs, such as Eclipse, Visual Studio, etc. Another interesting fact is that the default shortcut keys in Gmail are vim-inspired!

If you really want to try to learn vim, head over to Jon Beltran's site and get his vi/vim graphical cheat sheet. I found this invaluable as I learned vim. You'll also find many good books on vim. Just follow the reviews on Amazon or a similar site, and you'll find them.

Once you have the basics down, you can start exploring ways to extend and customize vim. You'd be amazed to find out how truly customizable it is. In fact, if you're interested, check out my dotfiles! This is a collection of my various configuration files, including my vim configuration files. I've tried to comment everything thoroughly enough that you'll be able to follow what purpose each command serves, but feel free to use the issue tracker to ask questions! You can even fork the repository (it's on GitHub) and modify it to suit your needs! It's designed to be cross-platform (it requires one small change in the .vimrc to define the platform), so it should be pretty easy to incorporate.


Resources

Why vim is the "Killerest"

vi/vim Graphical Cheat Sheet

My dotfiles

I'd love to add to this list of resources, so if you have a good one, leave a comment!

Python: My Language of Choice

I recently found a "poem" about the philosophies of Python. It pretty well embodies the reasons behind my recently acquired love of Python:

The Zen Of Python

Beautiful is better than ugly.
Explicit is better than implicit.
Simple is better than complex.
Complex is better than complicated.
Flat is better than nested.
Sparse is better than dense.
Readability counts.
Special cases aren't special enough to break the rules.
Although practicality beats purity.
Errors should never pass silently.
Unless explicitly silenced.
In the face of ambiguity, refuse the temptation to guess.
There should be one—and preferably only one—obvious way to do it.
Although that way may not be obvious at first unless you're Dutch.
Now is better than never.
Although never is often better than right now.
If the implementation is hard to explain, it's a bad idea.
If the implementation is easy to explain, it may be a good idea.
Namespaces are one honking great idea -- let's do more of those!

This poem was actually immortalized in Python's PEP 20, where the abstract reads:

Long time Pythoneer Tim Peters succinctly channels the BDFL's guiding principles for Python's design into 20 aphorisms, only 19 of which have been written down.

For those who might not be familiar, BDFL stands for Benevolent Dictator For Life, a title which belongs to Guido van Rossum, the creator of Python. PEPs are Python Enhancement Proposals.

Now that we have the lingo out of the way, we can talk about the language itself, and how I was introduced to it.

I had spent a fair number of hours fighting to write shell scripts to handle a few tasks on my web server. After tearing a few handfuls of hair out in frustration, I shot an e-mail to my good friend Andrew, who works as a programmer and sysadmin for a local company. He told me that he avoids shell scripts where possible, writing scripts instead in Perl. I had heard of these so-called scripting languages, Perl, Python, and Ruby being the most prominent in my mind, so I decided to look into them. I did a lot of research into these languages, and eventually settled on Python, primarily because of the principles in the poem above.

These principles jibe completely with my thoughts on programming. Python is designed such that most people who haven't ever written any Python code can read it and understand what's going on. Some have referred to Python as executable pseudo-code, and it's almost true! I found a quote which supports the idea of human-readability in code perfectly:

"Programs must be written for people to read, and only incidentally for machines to execute."

--Abelson & Sussman, Structure and Interpretation of Computer Programs
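In that spirit, here's a trivial toy example of my own -- even someone who has never touched Python can usually read it aloud and be mostly right about what it does:

```python
grades = [93, 71, 48, 85]

# Keep only the passing grades
passing = [grade for grade in grades if grade >= 60]

def average(values):
    return sum(values) / len(values)

print(passing)           # [93, 71, 85]
print(average(passing))  # 83.0
```

It reads almost like the English description of the problem, which is exactly the "executable pseudo-code" quality people talk about.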

This is just one of the many reasons for which I chose Python as the language I would attempt to master. Another was the community -- the #python channel on Freenode is almost never quiet -- there are always people asking for help with some issue or another, and always people there to answer. I'm sure there are similar communities with Ruby and Perl, but so far I really like the community that comes with Python.

The thing that surprised me as I continued to learn about Python is that it is much more than a scripting language -- it is powerfully object-oriented, and even has very powerful GUI toolkits such as Tkinter.

I keep discovering more and more exciting things as I continue to learn about Python, and look forward to leveraging its power more in my own personal projects. Now I just need to slowly work at getting it introduced at work...

What's your language of choice, and why?

...and again.

Here we go again.

I mean, seriously. You've gotta be wondering, "Why does he even keep trying?" This has to be my 6th or 7th attempt at a blog. Shouldn't I just give up already?

Well this time I'm trying something a little bit different. I'm going to blog about technology, rather than my personal life. Obviously I'll try to throw some personal life stuff in there, but the problem is, I don't feel like my life is interesting to write about, and thus I have a hard time staying interested in the writing, and I assume others have a hard time staying interested in the reading. Those of you who aren't "techies" out there, I hope you'll still come read a little. I think you'll find some interesting stuff. And who knows, if I can keep up interest in a tech blog, maybe I'll be able to make and keep a personal blog too.

More to come. I'm going to post a little on the reasoning behind this post a little later.

Always and Forever...

Like I mentioned in the last post, I'm taking a little bit of a different approach with this blog. Rather than try to blog about my life, which, in my opinion, leads to rapid loss of interest both on my part and the part of my readers, I've decided to focus on something that I'm actually interested in (and I hope interests a few of you): technology. Specifically, computers.

I got pulled in at a pretty young age. First it was the computer games -- Lode Runner on our black-and-white Macintosh; later, Warcraft 2 on our 486. It just escalated from there. I still play a lot of video games (my parents will tell you that I play way too much, and they're probably right), but the less... flashy end of computing pulled me in, slowly but surely, as time went on.

My Dad taught me some basic DOS commands as we installed Doom from about nine floppies. I learned how to install and uninstall programs, tweak the operating system to gain more performance, mess with the registry enough to need a Windows reinstallation... (not very hard, that last one -- but a rant on the Windows registry should be a topic for another day)

I downloaded tool after tool, tinkering with different things. I discovered the editor wars (Vim vs. Emacs), and decided I wanted to take a side in that war. (Vim, by the way -- it's where I'm editing this article right now!) I eventually took a large course in C# and Java my senior year in high school, and that's when the fun really began.

Along this journey I discovered a few things.

  • The ability to build a computer from components means nothing. Anyone who claims they're computer-savvy just because they built their own computer generally isn't computer-savvy at all. Truly computer-savvy people know that assembling a computer proves very little.

  • Nobody knows everything about computers. It's true. This ties in nicely with the previous point in that if someone claims they know everything about computers, they probably know next to nothing.

  • "Computer genius" is not defined by the knowledge you have -- it's defined by the ability to successfully find, interpret, and apply the information needed to solve a problem. 99% of the time, any problem you're having has been solved by someone else. Finding that information is an art, however. Anyone can type things into Google. But how many people do you know who always find the answer they're looking for from Google?

The point is, I don't claim to know everything. In fact, I generally know just enough to (1) be dangerous, and (2) have an idea of how much I do not know. But I do know a few things, and I hope to be able to share some of them. I also hope to learn a few things in turn from you, the reader. So please, comment! I love questions, and I'll do my best to find answers for you. I love it when people share new ideas and tricks, or yes, even when they correct errors in my knowledge.

So let's get to it! =)

PS: Now that you know the topic of this post, props to whoever can guess the reference in the title. Party on.