View http://www.hanselman.com/blog/AmIReallyADeveloperOrJustAGoodGoogler.aspx
I find this post really intriguing for a number of reasons. Primarily, I instantly asked myself the title question when it first caught my attention; it resonated with me like crazy. I’ll often talk about programming or development with people who have never done it, and they’ll comment that it’s something they could never do, or that code looks like an extremely complex foreign language. For some reason programming is viewed as more complex than whatever they do, which is (very) likely not the case. The more you do something, the easier it becomes, and your knowledge, speed, and quality of output all increase over time. Programming is like that in every sense. I can nearly guarantee that when you started to program, your code looked nothing like the code you wrote yesterday. Whether you were teaching yourself or part of a formal instruction process, everything was broken down into smaller pieces that you could actually digest.
Many of us are self-taught in what we do; in fact, I’d go so far as to say that we’re all self-taught. While only a minority is completely and wholly self-taught, all of us are constantly learning new things about what we do, either by choice or by force. We’ll read an article that piques our interest and learn something new. We’ll get an assignment (or openly commit) to a task we’ve never done before or even thought about. It’s virtually impossible not to learn something new on a consistent basis.
Is that a bad thing? I have absolutely no idea whether it’s as prevalent in other professions, but I think there are parallels in other industries. Are traditional artists learning new techniques all the time? Maybe not entirely, but they’re probably refining their skill with each piece. Do lawyers get tripped up when a new case requires additional research? I would think so. Do doctors see patients whose symptoms send them off to consult with colleagues? All the time, I’d imagine.
I guess where I’m going with all this is the notion that it’s bad if developers Google a lot. Blanket statement aside, I think there’s some truth to that, but it lies less in the act of searching for help all the time and more in what happens while you’re searching and, more importantly, what happens after you’ve found your answer. I think you can evaluate what you’re searching for as well. I vividly recall getting so frustrated while trying to build something in WordPress because I was searching for the most ridiculous, specific things, simply because I didn’t know where else to look. As time went on and my knowledge of WordPress grew, I learned how to better search for answers to my questions. Most often that means searching at the highest level possible, trying to find a generic answer or article that will help me with my specific issue. I still Google stuff all the time, with every project, not only to find answers to problems but also to discover new approaches to things I’ve become comfortable with. Software in particular matures over time. Chances are, if you’ve taken the same exact approach to something for an extended period, there might be a better way to accomplish it by now.
All this to say: I can sympathize with the fear that constant Googling for answers makes you feel like an inadequate developer, but I don’t think it’s binary. It’s more important to look at the bigger picture and the outcome of your searching. Engaging with your work on that level isn’t automatically a bad thing, or something that classifies you in your profession; it’s just a workflow. Let it rip.