12.28.2009

Avatar, A Look At The Confused State Of The Technologist




Avatar is a wonder of modern technology. We've all heard about it, some have seen it, and we all agree that James Cameron worked bloody hard to make this movie the most advanced visual experience to grace the silver screen since the first frame of superficial animation. In ten years, Cameron has reinvented the idea of 3D cinema, the art of Computer Graphics, and the role of the real in the unreal. He spearheaded the development of audio/visual technology in ways nobody had imagined previously, and he had the cojones to spend more money on a movie than anybody else in history. Though it is entertainment, it is also a ballsy display of modern innovation and technological dominance that sets the bar higher for everyone else.

This is horribly ironic, because this is the most anti-technology, environmentalist, tree-hugging, machines-are-evil, meditative-nature-religion, save-the-rainforest movie anyone could possibly dream of. It's funny that the movie that most exhibits the limitless power of human innovation is its own worst enemy. I mean, look at the numbers:

- Rendering the movie: 8 gigabytes/sec, 24 hours a day, over a period of months.
- Storing the movie: Each minute of film comes to around 18 GB.
- Processing Power: A feat of parallel processing, 34 racks of 32 quad-core processors each (34 x 32 x 4 = 4352 cores!). All running at 8 GB/sec, 24 hours a day.

That's pretty intense for a movie with an environmentalist message, don't you think?
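If you want to see how those numbers stack up, here's the back-of-the-envelope math in Python (the run time is my own ballpark assumption, not a figure from the production):

```python
# Back-of-the-envelope math on the stats above. The ~160-minute run time
# is my own assumption; everything else comes from the reported numbers.
racks, procs_per_rack, cores_per_proc = 34, 32, 4
print(racks * procs_per_rack * cores_per_proc)   # 4352 cores crunching frames

gb_per_minute, run_time_minutes = 18, 160
print(gb_per_minute * run_time_minutes / 1024)   # ~2.8 TB just to store the finished film

gb_per_second = 8
print(gb_per_second * 60 * 60 * 24 / 1024)       # ~675 TB pushed through the farm every day
```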

Then I read a New York Times Op-Ed by Neal Stephenson. Here's the link. It talks about Star Wars, and the subtle war between geeking out and vegging out. It's a very interesting article, but it poses a very bleak scenario for the technologists of the world (He even uses that word! Yes!). He writes that the Jedi Knights were the geeks of the Star Wars universe. They were the ones that everyone depended on to make things work, to know all the arcane knowledge there is to know. They were the ones called in to solve the hard problems, and they were well-versed in many aspects of technology, from flying warships to building lightsabers. If you read a lot of the back stories of Star Wars, the various novels and off-shoots of the main cinema canon, you realize that the Jedi were hated by most simply because they were very good at what they did, they were masters of their world, and they were necessary for the world to function properly. This resentment is perfectly modeled through the writing of the prequels (Episodes I, II, and III). The Jedi way is attributed more to meditation and "trusting your feelings" than to the study of warfare and calculus. Stephenson explains this as the writers writing what the viewers want to see. They want to see the abilities and powers of the Jedi as a product of meditation, something that anybody can do, as opposed to a product of years of training and study.

It's really nice to see on the screen that just by "trusting your feelings" you can deftly pilot a dog-fighting warship into the laser-riddled hellstorm at the beginning of Episode III and come out victorious, or that winning a pod race is as simple as flipping some switches and "trusting your instincts".

In reality, technology is hard. Too many "geeks" of the world believe that playing World of Warcraft in weeklong marathons makes you a master of technology. They are falling into the cultural trap that says you don't have to try to be who you want to be. You just have to look and talk like that person and you can become that. You can now be part of the high-tech community simply by using certain Internet memes a lot and by knowing certain movie scripts by heart. There's a reason why the science nerds are all becoming Asian. It's because Americans have stopped caring about the effort it takes to run our world and control its minutiae. Students are flocking to liberal arts majors instead of taking the more challenging route and excelling in something important. Biology is giving way to psychology, physics to journalism, and Computer Science to MIS (I'm guilty of that one). We want all the perks of intellectual dominance, but we are lulled into the dream that simply acting intelligent is a valid substitute for being intelligent.

So, James Cameron bothers me even more now. Seen in this light, the Na'vi in the movie are naturally gifted masters of their environment, able to interface with many complicated aspects of animal and floral life via a "wire" of sorts that can attach to all kinds of things in the wild. For example, banshee riding is made simple through a natural neural connection between banshee and rider that allows the rider to control the animal with thought alone. No training required. They are masters but they never have to try. They just need to "be one with nature" and all of a sudden they are the masters of their world. The humans, on the other hand, have obviously made great leaps and bounds in technology to do what they do on the planet of Pandora. We see snippets of it in the form of the Avatar technology and the military prowess, but we can only imagine what Earth looks like at this point in history. They are the villain in this story. Avatar shows us that progress only leads to destruction and genocide. Only by communing with nature, and by "being one with the wild" can we ever hope to achieve any kind of moral civilization.

Audiences will eat this up. The techies, the ones responsible for running the stuff that keeps us alive, they're the evil ones. The guys that train for years, the ones that are steeped in knowledge and experience, the ones that apply what they have spent years learning to everyday life, the people who try to make progress, they're the evil ones. If only we didn't have to depend on those people to survive; if we could be more like the Na'vi, not having to work for mastery, then we would do no evil.

Ironic, coming from the guy in Hollywood who is arguably the most steeped in cutting-edge technology and money. I'm not sure what the message is supposed to be regarding all this, but I found the irony....ironic.

12.23.2009

Nerdiest Holiday Humor Ever

Seriously. If you find anything else more scientifically anal, please let me know.

12.21.2009

Google's Endgame




It's almost funny.

Google has released 38 products in the past 70 days.

Most of these won't be noticed by anyone but the most dedicated Googler, but some will change the face of the SEO (Search Engine Optimization) industry forever. And the venerated company doesn't seem to be losing any steam. Anyone paying even slight attention to Google for the past ten years understands that their meteoric rise to world dominance wasn't a fluke. It has followed a pretty steep upward curve and nothing short of full-on catastrophic Google-cide will remove them from their throne upon high.

One of the 38 releases I refer to is a nifty little tool called Real Time Search. It wasn't hyped much, and much like other Google releases, it flew under the radar waiting for people to stumble upon it. It's a simple idea, really. When you search for something like Healthcare Reform, a little box appears within your search results that gives real-time updates of news on that subject. "News" includes Twitter updates on the subject, RSS feeds from blogs, as well as Facebook and MySpace feeds about it. This all flows down in the box at a steady pace, updating every few seconds to show the latest results.

At first glance, this seems pretty cool, but borderline useless.

But then I got a glimpse of the Singularity, otherwise known as Web 3.0.

Web 2.0 is still a buzzword these days. It represents the idea that users should be able to put publicly accessible data into web containers hosted on websites. The most visceral and raw version of Web 2.0 is YouTube. The content is all user-generated, and the website is only as good as the content users dump into it (which means YouTube is pretty atrocious as far as web sites go). This idea took off and became MySpace, followed by Facebook. It became Picasa and Flickr, Blogger and Digg. Web 2.0 changed the way we see the Internet, and is the sole reason the Internet recovered from the .com bust a few years back.

Twitter came and sounded the death knell of Web 2.0. I don't think they meant to do this, but they quickly grew from something resembling a glorified status update to a quantum leap in the speed of collaborative communication. Twitter provided a near-instantaneous, walls-free link to the 140-character thoughts of anonymous users all over the world. The use of the simple hashtag (#) to group subjects and keywords turned tweets into streams that reflect what people around the world are thinking about, say, #google, right now. This has no boundaries, no friends to accept, no privacy settings to drill through, and no networks to join. Just pure, raw, text-based flow of collective consciousness. This wasn't Web 3.0, but it was definitely 2.5 at least. It abstracted the ownership and control of data on the Internet even more than social networking did, and became a media-driving giant faster than anyone could believe.
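If the hashtag mechanism sounds abstract, here's a toy sketch of the idea (the tweets are made up; this has nothing to do with Twitter's actual plumbing):

```python
import re
from collections import defaultdict

# A handful of invented tweets; no real Twitter API involved.
tweets = [
    "just tried real time search #google",
    "waiting in line for coffee #bored",
    "wave invite finally showed up #google #wave",
]

# Group every tweet under every hashtag it carries.
streams = defaultdict(list)
for tweet in tweets:
    for tag in re.findall(r"#(\w+)", tweet):
        streams[tag.lower()].append(tweet)

# Everything anyone tagged #google, no friend requests or privacy settings required.
print(streams["google"])
```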

There's another pattern behind the versioning of the Web, aside from the issues concerning control over content. As the technology behind data communications grows by leaps and bounds, as the connections between nodes become exponentially faster, continuing the ever-popular and ever-insistent trend of Moore's Law, we're getting closer and closer to an ideal form of communication that is instantaneous, and is passive instead of active. We are coming into an era where information is not something you seek out, but something you absorb as it simply appears on your screen the moment it is created.

Let me cut to the chase here, and explain to you how Google's real-time search works. This is an innocuous little feature that works like this: When you search for anything, you can go into the search options to display "updates only". What this will do is provide a continuously updating feed of comments and thoughts from sites like Twitter, Facebook, and MySpace. This all happens in real time. I tried it out. I had a search open for "google" in Google. In another window, I had my Twitter account open (@Interwebsruler, follow me!). I "tweeted" (I hate that word, but I guess I should get used to it) "google is awesome!" Almost immediately, my tweet showed up in the Google search, with a little "posted 1 sec ago" blurb at the bottom. This is an unbelievable feat of search engine power. And it means that Google has achieved the beginning of a revolution in the way we interface with data.
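I obviously have no idea what Google's back end looks like, but here's a toy simulation of the effect from the user's side: updates "appear" over time, and a loop keeps checking for anything new that matches the query (all the data and timing here are invented):

```python
import time

# Invented updates that "appear" at the given second; in real life these would
# stream in from Twitter, Facebook, blogs, and so on.
incoming = [
    (1, "twitter", "google is awesome!"),
    (3, "blog", "new post about google real time search"),
]

query = "google"
start = time.time()
shown = set()
while len(shown) < len(incoming):
    elapsed = time.time() - start
    for i, (posted_at, source, text) in enumerate(incoming):
        if i not in shown and posted_at <= elapsed and query in text.lower():
            print(f"[{source}] {text} -- posted {int(elapsed - posted_at)} sec ago")
            shown.add(i)
    time.sleep(1)  # refresh every second, roughly how the results box behaves
```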

Remember, the entire paradigm of data in the Internet is changing. Instead of the search for knowledge being an active process, an activity that has the user searching for something that already exists on the Internet somewhere, the tables are turning, and the whole idea of search is switching sides. As soon as information comes into existence, it's already on your screen. It's now a passive experience. The idea of "jacking in" to cyberspace and living in a world of consistently updating data flows is now a reality.

This is what Google is aiming for. Google wants to create an Internet that is ubiquitous and instant, not a repository of information, but a living, breathing, dynamic world of information that is always alive and always changing. As users, our job is not to find the data, but simply to allow the data to find us. When you put content into the Internet, you don't give it a destination. It doesn't travel, really. It just becomes instantly available for anyone that wants access to it. Instead of the Internet being a framework in which data repositories like Facebook and Wikipedia can communicate with each other, the Internet will eventually be the container in which all the data is stored. The website will no longer be a place to control and input data. It will simply be a filtering system to direct the persistent flow of data to those who want it.

And guess who is going to control that data container? You guessed it. The Google.

I know that a lot of this probably makes no sense to many people. It doesn't make a whole lot of sense to me either. But I think I've given you a clearer picture of what the Internet will look like once the age of Web 2.0 is left behind for a newer, even more daring, frontier. And I think we all know that Google will be at the forefront of this war over the control of data flow.

The final frontier of Google dominance is where the war with Microsoft begins. You see, an integral part of the control Google wants over the flow of information is the portals through which the data flows. As of now, Microsoft owns that. The browser and the OS that is used to access the Internet are still not under Google's control. This is Google's biggest challenge. Data flow is only as good as the method used to get the data. Google plans on fixing this. They released the Chrome web browser (which now supports extensions, much like Firefox) last year. They released Android, their mobile OS, this year. Next year is the release date for Chrome OS, their PC OS. Seems like they've got their bases covered. They even have a little package called "Google Chrome Frame" which lets you run Chrome's rendering engine inside IE.

It's going to be an interesting few years, and I'm just excited to be a part of it.

12.11.2009

Net Neutrality: Real Consequences


The Blogosphere has been throwing around the term "Net Neutrality" for a while now, discussing the theoretical morality and practicality of such an edict, and generally discussing it on a very high level that only the Technologists of the world really care to argue about.

Things are about to get ugly.

Ralph de la Vega, CEO of AT&T Mobility and Consumer Markets, recently spoke at a UBS (big investment bank) conference in NY about the possibility of charging heavy data users more than the average data user. He specifically pointed to the fact that 3% of AT&T smartphone users are responsible for 40% of total data usage. He went on to describe how AT&T is devising ways for users to be able to track their data usage in real time, hopefully curbing the data usage behaviors of the super-heavy users.

(As a total aside, that's ridiculous. Do they really expect users to stop using data because they feel bad? They bought unlimited plans; unlimited is what they'll get. To expect them to be socially conscious of their data usage is just bad market research)

Market analysts are seeing this announcement as a threat to users: Rein in data usage, or we'll charge you more for it. There's nothing wrong with doing that, only that you'd lose all of your customers. Your status as an "unlimited" user doesn't alleviate the scarcity of data that ISPs have to deal with. There is a finite number that represents the size of total bandwidth, and simple rules of supply and demand require ISPs to charge for demand when supply is scarce. So, why now? Why is AT&T talking about this now when this has been a problem since phones became smart years ago?

Jared Newman, at PC World, has a great theory. He claims that AT&T is bluffing to try and stave off the Net Neutrality movement. Proponents of NN want to impose regulations that make ISP bandwidth allocation equal across all legal uses of data communication. One advantage of this is that Comcast would have to stop throttling the bandwidth of heavy peer-to-peer file sharing users. This is the kind of behavior that the people behind NN are pointing to as a reason to regulate bandwidth. However, it also means that bandwidth will become even scarcer than it already is.

Right now, AT&T users on video chat in Skype are getting a faster connection speed than users checking their email. This makes obvious sense. Why give users connection speeds they don't need? This is a dynamic and fluid allocation, because it is dependent on what you are currently using. The email checker gets a higher speed when he switches to using video chat and the video chatter gets his bandwidth throttled when he checks his mail. This is how ISPs conserve bandwidth and ultimately save money for the consumer.
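Here's a rough sketch of that kind of activity-based allocation (the weights and the 10 Mbps cell capacity are numbers I invented for illustration, not anything AT&T has published):

```python
# Illustrative only: split a cell tower's capacity among users based on what
# they're currently doing. Weights and the 10 Mbps figure are made up.
ACTIVITY_WEIGHT = {"video_chat": 8, "streaming": 5, "web": 2, "email": 1}

def allocate(capacity_mbps, users):
    total = sum(ACTIVITY_WEIGHT[activity] for activity in users.values())
    return {name: round(capacity_mbps * ACTIVITY_WEIGHT[activity] / total, 2)
            for name, activity in users.items()}

# The email checker fires up Skype and his share jumps; the video chatter who
# switches to email gets throttled back down.
print(allocate(10, {"alice": "video_chat", "bob": "email"}))   # alice ~8.89, bob ~1.11
print(allocate(10, {"alice": "email", "bob": "video_chat"}))   # roles reversed
```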

Newman thinks AT&T is trying to scare off proponents of NN by threatening to raise rates. And they would have no choice but to raise rates if everyone was forced to have equal bandwidth all the time. It would mean that the amount of bandwidth required by the heavy users would have to be equal to the bandwidth given to the light users. AT&T wants to make something very clear: Do this, and the customer will pay. We'll gladly give out more bandwidth, but bandwidth is much more scarce in the wireless industry than in the wired industry, and someone has to foot the bill. Newman thinks we need to call their bluff, and force them to stop throttling bandwidth. He doesn't think they'll follow through and charge more. I disagree.

People take "unlimited plans" for granted. ISPs market it as a golden ticket to infinite data when in reality your speeds are measured and change constantly so that the illusion of unlimited can be maintained. When you sign up for wired internet, you don't ever get an unlimited plan. It's already assumed that you can use as much data as you like. However, there are very clear brackets of download speeds that cost more or less. The speed you get is the speed you pay for, and that's how things have been working for quite some time. There's no reason why AT&T can't start pursuing a similar model. Putting a data rate cap on low-paying users would be no different than Comcast putting a cap on users in their budget cable Internet plan. People are seeing these threats as "against the consumer" and "typical evil corporation" when in reality they are just planning on doing what the rest of the wired Internet world has been doing for years.

So, while Net Neutrality may cause this to happen, it will end up happening regardless of Net Neutrality laws being passed. Data will always be a scarce resource, because no matter how fast you manage to get the data to users, someone will build software that requires the full potential of whatever data rate you can dream up. Net Neutrality will just force wireless providers to make an already scarce resource even scarcer, making users who shouldn't have to pay more for speed spend more money just to subsidize bandwidth for the small-time users who don't need it anyway. This just seems like a big waste to me.

12.08.2009

The Technologist Manifesto


There’s something fundamentally wrong with a group of people that define themselves proudly by a term that is considered derogatory by 95% of cultured civilization. No, I’m not talking about Democrats (Joke. Please don’t unleash the Flames). I’m referring to Geeks.

It makes me cringe to even say it now. Why have we been putting up with this for so long? The fact that we’re really good at what we do has been completely subsumed by the idea that we don’t take showers (we don’t. But that’s not the point). The fact that the world is becoming more dependent on our services every day is completely ignored next to the fact that we can’t talk to members of the opposite gender (we actually can. As long as that person knows their way around a command prompt and can give us the decimal representation of a 6-digit binary number, we could SO have a coherent conversation). The fact that several of the world’s wealthiest people subscribe to our dogmas, and that the President of the Frikin’ USA demanded that his BlackBerry be tweaked, by us, for use as the Official Smartphone of the Commander in Chief does nothing to ameliorate our allegedly more important lack of fashion sense.

You see, we have an image problem. When Best Buy unveiled its newest nefarious plan to coerce money from the wallets of innocent consumers, The Geek Squad, they made it very clear that these people couldn’t dress themselves beyond black and white, with a skinny black tie. They made it very clear that these people had no life whatsoever and that they were created simply to hook up all your crap, and drive Beetles while doing it. They wanted people to come in expecting an anti-social college grad that hadn’t showered in at least 2 days, who acted superior and condescending, who spoke incoherently about all things technical, and in the name of all that is unholy, they succeeded.

The point I’m trying to get at here is that we need to find a different name for ourselves. Yes, I know that we like to be non-conformist and identify ourselves with a label that only we think is ‘cool’. But seriously, we aren't in high school anymore (We are in college). We need to grow up and realize that as much as we like being ‘different’ and ‘counter-culture’, we need to get jobs. The world has come a long way in the past 40 or so years making the computer world a necessary part of life, but those who master the science of the geek must come out from under the silicon curtain. We need to free ourselves from our self-imposed showerless incarceration and start wearing clothes that match. We need to break free from the chains of parental dependence and start proving to the world that we can apply the same skills we use to kill users kill ninjas troubleshoot PCs to living a life as productive citizens.

Now, I know that this is a lot to ask; identifying color schemes between all the various necessary clothing objects on my person is a skill set that will haunt me for the rest of my days. However, I propose a small but important step. We need to change our name. I don’t mean translating our current names into Klingon (Google already offers this important service to humanity). We have to do away with the term “Geek”. Maybe if we stopped calling ourselves geeks, we wouldn’t follow the negative behaviors associated with the term. If we were called Computerologists, maybe we would take one more shower a week. If we were called Computing Professionals, we would dress a little better. Maybe if we weren’t called “IT guys” but “IT Men” we would have some more self-confidence in casual conversation. Maybe if Desktop Support was replaced with Desktop Pwnage Operation, we would get paid a little more.

There are many possible names, some better than others (IT Men sounds like it would be a great name for a gay Internet Café). In all seriousness, I’m leaning towards Technologist (Cham-vaD, loosely translated into Klingon [“for technology”]). I go around calling myself a professional geek, but I would feel a whole lot better about myself if I would go around calling myself a technologist. A company wouldn’t have an IT guy (or Man), but instead have a Resident Technologist. There are probably better names out there on the Interwebs, but that’s the best I can think of right now.

From here on out, I will try to refrain from using the word “geek” to describe my Technologist brethren. The image of the “geek” is one we made for ourselves and we are the only ones that can fix it. We have an image to save, and right now it’s being saved as a low-quality JPEG. Get that lossy **** of a format off my hard drive, take your rightful place in society, and don’t ever wear a brown belt with black shoes.

Note: This is a work in progress. I plan on making revisions and improvements to this as they come to me. I'll keep you posted. Maybe one day somebody will take it seriously.

12.01.2009

Reinstalling the Healthcare OS




The American Healthcare system is broken.

Or, as the geek would say, the OS needs to be reinstalled. If I may coin an analogy here, the American Healthcare system is to Socialized Medicine as Microsoft Windows is to Linux. Now, before you jump down my throat and rip out my innards with your vastly superior Conservative claws, let me elaborate. Linux works. Nobody doubts that. On a small scale, in an ideal environment, where everyone involved knows the system and help is just a Google search away, Linux is the best OS out there. And it’s free. Unfortunately, we don’t live in that world; users break the unbreakable, they know nothing about computers, and they can’t take care of themselves when things go south. Windows isn’t the most stable or cheap OS out there. But it is comfortable, packed with user-friendly features, and if you have a minor problem, it can be solved by most people with basic computer literacy. Unfortunately, there’s a price tag. And it’s not only the money. If you ever have a serious problem in Windows, the back-end infrastructure is so vast and bottomless that even the most grizzled IT pro out there needs help from time to time. There are entire libraries of back-end architecture support tools that less than 1% of the population will ever even look at. Look up WMI on Wikipedia for a great example. Here’s an excerpt from the overview:

In order to unify the management techniques for the sake of simplicity, the DMTF defined CIM to represent real-world manageable entities in a unified way. The CIM object model is an object database model using terms and semantics that are unique to all constructors and software developers. This object model is implemented in a database called the CIM repository.

Based on the CIM model, WMI includes real-world manageable components, available from the DMTF standards with some specific extensions that represent the various Windows components. Moreover, WMI exposes a collection of COM-scriptable objects that allow various applications to take advantage of the management information.

As part of the installation process, most of the Microsoft applications available today (e.g. SQL Server, Exchange Server, Microsoft Office, Internet Explorer, Host Integration Server, Automated Deployment Services) extend the standard CIM object model to add the representation of their manageable entities in the CIM repository. This representation is called a WMI class, and it exposes information through properties and allows the execution of some actions via methods. The access to the manageable entities is made via a software component, called a “provider” which is simply a DLL implementing a COM object written in C/C++. Because a provider is designed to access some specific management information, the CIM repository is also logically divided into several areas called namespaces. Each namespace contains a set of providers with their related classes specific to a management area (i.e. Root\Directory\DAP for Active Directory, Root\SNMP for SNMP information or Root\MicrosoftIISv2 for Internet Information Services information).

To locate the huge amount of management information available from the CIM repository, WMI comes with a SQL-like language called the WMI Query Language (WQL).

See what I mean?
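If you're morbidly curious what actually touching that machinery looks like, here's a tiny sketch. I'm assuming Tim Golden's third-party wmi package for Python here (the native routes are COM or PowerShell); the point is just how much plumbing sits behind one trivial question:

```python
# Requires Windows and the third-party "wmi" package (pip install wmi).
import wmi

conn = wmi.WMI()  # connects to the local CIM repository (root\cimv2 by default)

# WQL, the "SQL-like language" the excerpt mentions: list services set to start automatically.
for service in conn.query("SELECT Name, State FROM Win32_Service WHERE StartMode = 'Auto'"):
    print(service.Name, service.State)
```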

Windows is shiny and sleek, with an absolutely huge set of tools mostly inaccessible to the average user, and expensive to boot. It puts on a good show; it makes people think they’re being treated to a 5-star hotel when they’re really just staying in a souped up Econo Lodge with free HBO, Continental Breakfast, and a concierge that pretends to care (The Taskbar!!!! It Will Change The World!!!!). No matter how much free food Econo Lodge throws at you, the building is still on the verge of collapsing on you while you sleep. Free HBO doesn’t make up for the cigarette burns in the carpet or the slight stink in your sheets. However, the price of admission isn’t high, you know it when you see it, and you don’t get confused figuring out what to do once you pull into the parking lot.

I believe that’s what our Healthcare system is like right now. The concept is great. The whole triangle of patient-insurance-caregiver is a good model, and it does what it needs to do. However, when it breaks, it breaks hard. The seemingly simple interface of the triangle opens up to be the most complicated geometric object in Euclidian reality, and when something isn’t working, when insurance decides not to follow the triangle and doesn’t reimburse the caregiver, you are left to traverse the insurmountable labyrinth of private insurance to get answers. This leads to inefficient care, inefficient spending, and consumer apathy. The consumer becomes a simple pawn in the complex chess game of coverage, and is forced into premiums and charges that will break the system. This sounds a lot like Vista.

Now, let’s look at Linux. Back to the hotel example, Linux is like a hostel. Yeah, it’s free. But go there alone for the first time without a friend, and you might get raped.

Just sayin’.

There are distributions like Ubuntu that attempt to make the experience user-friendly, but in the end, the terminal will still be the only way to do things properly. If you don’t learn yourself some command-line, you won’t really be using Linux, and your experience will be second-rate at best. There’s an amazing amount of free tools and flexibility in Linux, but unless you know how to get around, or know someone who can teach you, you’re doomed to a bare-bones, barely functional OS, with maybe half of your hardware having the right drivers. The success of Linux is completely dependent on the ability of the user to learn how to get around. Help is out there, but it can be hard to understand and usually depends on your knowledge of other obscure command-line tools. Most users will simply give up and say that they have a working system when in fact they don’t know what they’re missing.

I believe that this is the state of Socialized medicine. Everyone seems to be happy with it, but they don’t know what they’re missing because they don’t bother to try harder. For them, a month’s wait for a surgery may be awesome, but nobody in America would even consider waiting that long. Sure, there are plenty of people who have learned how to navigate the Socialized medicine system and come out with a great experience, but that won’t be the case for the majority of people. So, does the potential to be a great OS make it a great OS? Does the fact that a healthcare system should work make it a good system?

I know that this analogy is somewhat controversial, as I am impinging on two separate religious flame wars at the same time, and I don’t expect everyone to agree with me. This is just a framework with which to understand the questions and the problems. I think I may have some theoretical answers, but I’ll save that for another post. I need to flesh it out a bit more.


P.S. Mac OSX is like the healthcare system of Sweden. They think they’re so special and awesome, but they’re still Sweden. Nobody cares.

Edit: Thanks to Benjamin at The Daily Harangue for fixing up my graphic. Sorry, I was using MS Paint. I'm old school like that.

11.20.2009

The Pathology of a Viral Video

YouTube is a force of nature. It is a law of Physics all to itself, and it is unstoppable. It is an open medium with which people share their thoughts and ideas in the form of badly-produced, low-quality videos. It is free, and it is easy to use. It has become the preferred method of comedic outlet in the digital era, and it created this thing called the Viral Video. The viral video has its roots in the Internet Meme. A meme is something, usually a word or phrase, that has no inherent value or comedy other than the fact that it has been arbitrarily applied all over the Internet to a certain situation. A good example is "fail". Nobody knows the exact source of this meme, but when someone does something stupid, a label of "FAIL" gets applied. If it is utterly stupid, that person gets an "EPIC FAIL". Now, these things only mean what they do because of the number of people using them, and that's a testament to the power of the Internet, and the vast numbers of people using it.

The Viral Video is the Web 2.0 version of a meme. Take the Rick Roll. A musician by the name of Rick Astley made a music video once. It was horrendous. At some point in time, that video was used in a prank where a seemingly innocuous link on the Interwebs brought you not to your intended destination, but to a Rick Astley music video. Shortly thereafter, the RickRoll was canonized in the Interwebs lexicon of Funny Things That Only People Who Are On YouTube For At Least 2 Hours A Day Find Funny. Rickrolling became an Internet sensation, and only a select few people actually know who Rick Astley even is. Look it up on YouTube. If you dare.

The example I want to focus on today is viral on a much smaller scale, but it happened so quickly and so decisively, and disappeared just as fast, that it's a great example of how Viral Videos do their thing.

So, there's this show that used to be really popular with the angsty teens. It was called 'The OC', and those who remember the fad know how indescribably terrifying the specter of watching that abomination was, and how all your guy friends who started watching were obviously either doing it for their girlfriends, or subtly coming out of the closet.

Well, here's the last scene of the 2nd season.



Got that?

Good.

Now here's Andy Samberg's take on it from SNL.



That's comedy gold, friends.

Now, in a pre-youtube world, that would be the end. We would all have a laugh, and walk away. But nooooooo, this little juicy nibblet of comedy had an appointment with destiny, and it was about to smash in the office door with impunity.

People started doing parodies of the parody. It started innocently enough, with people adding the idea to video game scenes, like this one.



It then got applied to other famous youtube videos. Like this one.



And then, dear readers, things only went downhill. Or, I should say, things threw themselves off a sheer cliff into the neverending abyss of the pop culture comedic hell that is the YouTube population en masse. In rapid fire, here's some of the main offenders.







There's so much more, and a simple Youtube search will uncover startling amounts of these videos, all starting from one finale of a season of the OC. They definitely didn't have this in mind when filming that.

So, here's my favorite. I laughed out loud. At work. Very awkward.




Oh, and I forgot about this one. Very similar to the original SNL version.



pwned by the Astley. If you didn't see that coming from at least a few miles out, you haven't been spending enough time on YouTube.

Some would say you should keep it that way.

11.18.2009

Identity Theft Madness


Identity Theft in The Matrix

There is nothing I hate more than being lied to.

Ok, that's not really true, but using 'hate' in the first sentence really makes the blood boil, doesn't it?

Good. Now that your blood is at a slow simmer, I can explain to you from whence this hatred comes. I was watching the television last night, enjoying the feeling of my brains rotting from the inside out, when a commercial came on that grabbed my attention. If I could find a video of it, I would post it. For now, you'll have to take my word for it, because I don't remember the company's name.

A lonely man is buying something over the Interwebs and he whips out the ol' credit card to pay. He types the numbers into the appropriate field and hits Enter. Cut to a squalid slum in a desert in some country in Africa (Nigeria, I think), where a dirty, poor-looking man is staring intently at an LED-backlit, super-awesome computer connected to the Internet, waiting for something to happen. All of a sudden, credit card numbers appear in bold font, one number at a time, and he smiles with joy and pleasure as he quickly prints out the numbers and hands them off to an emaciated little boy, also smiling like he was just given double rice rations by the Nigerian government. The boy runs along the dirty city market streets, where chickens squawk and burly men rip off the destitute horde with impunity, until he reaches a small shack where the purchased items are stored. The man in charge takes the slip of paper with the card number, smiling like his 4th wife just gave birth to a boy, and packs up the item for the lonely man. Cut to an ominous warning about identity theft, and how it will happen to you unless you buy our product.

Now, I know that commercials lie. I mean, marketing is just a word for that blurry area between lie and truth that businesses pour money into and cultivate to capitalize on the almost-lie that compels people to purchase their products. But sometimes the line of falsehood is crossed, and when it happens in a subject that I am somewhat educated in, it drives me nuts.

Let's go over some factual knowledge. This is how eCommerce functions these days.

The basic rule of ID security on the Interwebs is that the last person to see credit card numbers in unencrypted clear text is you. Once you finish entering data and submit it, assuming you're using a legit eCommerce service (more on that later), the relevant data is encrypted using a cryptography algorithm. The common algorithms used today are all variants of RSA (Rivest, Shamir, Adleman, the people who wrote it) and are typically encased in a protocol called TLS (Transport Layer Security, the successor to the ever-popular SSL [Secure Sockets Layer]) that uses RSA encryption with 1024 or 2048-bit keys.

Now, for some cryptography 101 (5 in binary). Plaintext is encrypted through the use of keys. Keys both encrypt and decrypt data by using whatever algorithm you choose to apply the key to the data. There are two ways to increase security in a data transfer. You can make the algorithm stronger and less vulnerable to computational flaws and vulnerabilities, or you can simply make the key longer and thus harder to guess. This is where the terminology can get confusing.

Saying an algorithm has 12 bits of security is very different than saying the algorithm has a 12-bit key, and people tend to get confused. When I say that an algorithm has 12 bits of security, I'm saying that the best known method of breaking it takes about 2^12 operations, however long the actual key is. So the effective security is 12 bits. When I say the key is 12 bits, I'm saying that the key itself is 12 bits long. Just for perspective, 12 bits (111111111111 at the largest) means that there are 4096 possibilities for the key.

If you want to know how complex the RSA algorithm is (the original one written in the 70's) just take a gander here and hope your brain doesn't explode. I'm not going to discuss the details of that. The relevant point is that the algorithm uses keys that are 1024 or 2048 bits in length. That means that there are 2^1024 or 2^2048 possibilities that the key could be. Just for some perspective once again, IP addresses are 32 bits. There are 4,294,967,296 possibilities there. The 2048-bit number is bigger than anyone can imagine, and it takes 2048 binary digits just to write it down (any Internet calculator in which you type 2^2048 will return 'Infinity'). So, unless someone comes up with an efficient way to break RSA as it stands today (Hint: Nobody has), if you transfer data over the Interwebs with TLS, nobody will know your credit card numbers.
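If you want to play with the numbers yourself, Python handles big integers natively, so the arithmetic is a one-liner each:

```python
print(2 ** 12)              # 4096 possible 12-bit keys
print(2 ** 32)              # 4294967296 possible 32-bit IP addresses
print(len(str(2 ** 2048)))  # a 2048-bit key space written out in decimal runs 617 digits
```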

So, how do companies use this? Enter Public Key Cryptography. Also developed in the 70's, PKC revolutionized the way messages were ciphered and deciphered, and is the standard operating procedure of all eCommerce and other data-sensitive services on the Interwebs. An analogy widely used is that of a mail slot. I own a locked mailbox with a slot. People know where the box is and where the slot is to deliver mail, but only I have the key that unlocks the box to get access to the mail. The mail slot is the public key, and my unique access to the mailbox is the private key. The keys are related in that they both are relevant to the mailbox, and this is reflected mathematically in practice. There has to be a trust system that guarantees the keys are related, just as the post office is responsible for relating mailboxes to keys. When I drop my mail in the mailbox, I want to know that the private key to that mailbox is only owned by the one person that belongs to the mailbox. This is accomplished using the PKI (Public Key Infrastructure). It is a system that sets up companies to distribute certificates that guarantee relationships between private and public keys on the Internet. The biggest company these days is VeriSign.
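Here's the mailbox idea boiled down to the textbook toy example of RSA. The primes are laughably small (real keys use primes hundreds of digits long), but it's the same kind of math TLS leans on:

```python
# Toy RSA with textbook-sized numbers; real keys use enormous primes.
p, q = 61, 53
n = p * q                  # 3233 -- part of the public key, the "mail slot" anyone can use
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, also published
d = pow(e, -1, phi)        # 2753 -- the private key; pow(e, -1, m) needs Python 3.8+

m = 65                     # the "credit card number"
c = pow(m, e, n)           # anyone can encrypt with the public key -> 2790
print(c, pow(c, d, n))     # only the holder of d gets the 65 back out
```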

Oh, and all this, all this complex infrastructure and cryptography, is symbolized in your browser by one letter. Normally, in your URL, you'll see http://blabla.bla. If you're using TLS, and therefore RSA and PKI, you'll see https://blabla.bla. The s stands for secure, and you know you're dealing with a legit and secure website.

So, how does this all go down in real life?

When you buy a product from amazon.com, this is what happens to your information. If you'll notice the address bar on the page that asks you for your credit card information,

it has changed to https. This means that any data entered into this page will be encrypted with a public key given to Amazon by a certification company (VeriSign in this case) that tells my computer to trust that encrypted communication between me and Amazon will be readable only by Amazon. Since you purchased a product from Amazon, you are authorizing them to draft payment from your account. If you don't trust Amazon to keep your information safe, then you shouldn't ever pay with anything other than cash from now on. And then get some professional help, because you are a hopeless paranoid. Anytime you swipe your card, you are giving some business your numbers. They don't steal those numbers because it is in their best interests not to. If you ever find unauthorized transactions on your statement, banks are very lenient about cancelling those charges, and re-issuing a card if needed. And if Amazon ever steals your credit card information, the entire company would face huge lawsuits and litigation and would not come out profiting from the venture.

The other, potentially more vulnerable, method of purchase is through third-party portals like eBay. Things are a little different there. They use PayPal. What PayPal does is manage both sides of the transaction through their own TLS certificates, and this way the seller never sees any payment information. PayPal drafts the account and pays the seller, so the only security vulnerability is PayPal's trustworthiness. Being that their entire business model hinges on that one attribute, I wouldn't worry about them too much.

So, basically, it's just as safe, if not safer, to shop online than to shop in a store. Nobody in Nigeria is reading your numbers and smiling like a moron.

Oh, and the whole pitch of Identity Theft? Ya, you can't get that from a credit card number. Unless you can access bank records that match credit accounts to SS numbers, and then match SS numbers to whatever else you need, your "Identity" is safe. Nobody is going to be walking around with your Passport and ID pretending to be you. The only possible theft here is from your bank account.

I guess that says a lot about what people perceive as their identity, and that's sad.

11.16.2009

Google Brings the Chutzpah

I write a lot about Google. They're an interesting phenomenon whose true agenda and nature is constantly fluctuating between good, bad, evil, righteous, and various other complex moral states. Sometimes, one word sums it all up, and fits so well that you wonder why you hadn't thought of this earlier. That word is Chutzpah. From Wikipedia (or "The Ultimate and Omniscient Source of Universal Knowledge and Stuff"):

In Hebrew, chutzpah is used indignantly, to describe someone who has over-stepped the boundaries of accepted behavior with no shame. But in Yiddish and English, chutzpah has developed ambivalent and even positive connotations. Chutzpah can be used to express admiration for non-conformist but gutsy audacity. Leo Rosten in The Joys of Yiddish defines chutzpah as "gall, brazen nerve, effrontery, incredible 'guts,' presumption plus arrogance such as no other word and no other language can do justice to." In this sense, chutzpah expresses both strong disapproval and a grudging admiration.

Or, for the reading impaired:




On that note, I would like to announce that Google has deemed me worthy of giving me a developer preview of their new collaborative content platform called Google Wave. It's really hard to put the idea into words and give it proper identification, but in their own words: Google Wave is what email would look like if it was invented today. Meaning, email was modeled on the snail mail system, in which you wrote a letter, and the recipient received that letter and now owned it. Wave does away with that silly notion and reinvents the way we look at cloud collaboration. Put simply, when you write a message to somebody, he can see you typing the message as you type it. Both sender and recipient own the content, and both can edit it along the relationships formed by normal mail contact information.

Like I said, it's hard to explain. A more robust definition can be found in the 2-hour long presentation given by Google Product managers a while back. Or there's an abridged version for fakers. Regardless, this post is not about Wave. Well, sort of. Let me explain.

After I got my invite (and danced the dance of joy and happiness), I opened up IE (no choice; it was a work laptop) and I opened up Google Wave. I was then hit with a splash screen that blew my face off (very messy). It was the most chutzpah I've seen on the Interwebs in a long time, and it may signal the start of a new era in the browser wars.

IE is not supported by Google Wave.

Ok, read that again.

Are you still with me?

I'm not.

You do realize how unbelievably mind-shattering this is, right? You don't just release a product that isn't supported under IE (Just to be clear - it does run in IE if you install the Google Chrome Frame ActiveX control that may or may not break IE. But the point still holds. IE out of the box does not work with Wave). It's like releasing a car that doesn't drive on interstates. Or a music player that doesn't play mp3 files. It doesn't make sense, it's audacious, it's arrogant, it's brazen, it's non-conforming, and it's straight up Chutzpah.

It's also the stuff that revolutions are made of.

The age of IE (Internet Exploder, Internet Exploiter, take your pick) is coming to a close. Years of chipping away by browsers like Opera, Chrome, and most importantly and successfully, Mozilla Firefox, are finally bringing the industry to a point where things like this can happen. Google Wave is going to change the way we collaborate on the Interwebs (many are calling it the harbinger of Web 3.0), and IE has been left out of the party. It's a blatant act of war, and one that's been a long time coming.

This may all be sensationalist thinking, as the restriction may be a quirk of the developer preview, but hey, the sensationalist way is typically the fun way, and I will indulge.

Don't worry, as soon as I get some more time with Wave, I'll let you know not if it rocks my world, but how much it rocks my world.

11.10.2009

Google Is Up To Something


At first glance, this is awesome.

Wi-fi in airports has always been a diabolical scam that frustrates people to no end. Well, there is an end. It's murder. I haven't gotten to that end yet, but I've gotten close. Now Google, the Great Provider Of Free Stuff and also The Ones Who Punched Me In The Face, are going to give it out for free this holiday season. You sit down at your gate, turn on your computer, and you have Internet connectivity. They even offer a "donate" option for those in the holiday spirit. Yippee.

Now, for the paranoia.

What does Google stand to gain by this? All they are doing is paying companies like Boingo for their service so that they can give it to the user for free. Let's say you pay 10 bucks for 20 minutes of Internet. Google will not make that much off of you in 20 minutes. So they will likely lose cash on this.

Why is it only for the holiday season? If they wanted entry into the airport wifi market, they would set something up a little more permanent. I guarantee you that they are not doing this out of pure holiday jubilation.

Let's start with this: Google is not in the business of service delivery. They are in the business of data. And they have the best business model ever. Everything they give you for free, you're really paying for with your data. This data is more valuable than you'll ever know, because it isn't your data that's valuable, rather the same data coming from millions of people. You may get access to all that data, but in the end, it belongs to them. And it makes them very powerful.

So this is what I think: Google doesn't want to make you happy. It wants data on travellers. It wants to know where you browse, what you search for, how long you stay on sites, if you're working or goofing off, and if it is worth it for them to make permanent deals with wi-fi providers to grant them that data. If this little data experiment shows that data mining air travellers is profitable, you'll likely see Google search as the default search option in the Boingo browser home page. You'll get an option to download Google Toolbar when you install the Boingo client, and your data will belong to them.

Another option would be that Google is considering acquiring Boingo (or others) and actually rolling out a free wifi service based on ads, but that doesn't seem likely given the insane profit margins Boingo likely has.

This is something to keep track of, and it should not be dismissed as philanthropy.

Edit: I'm probably wrong about my theory. Here's an article that gives a much better motive to the madness.
The Article


11.09.2009

How The Internet Has Changed The Way We Think About Politics



I found myself at lunch with a co-worker engaging in a heated debate (read: reality flame war) about the current state of politics and government. As usual, nobody won (except for me) and we all went about our geeky ways, thankfully without any bloodshed. A point was raised that made me think (even more so than the thinking I do all day). The other guy was saying that the corporations run the government, that we, as voters, don't really have power, and various other silly things like the CIA killed JFK and that the government knew about 9/11 and let it happen. He was right about one thing. As voters, we have close to no influence on national politics. At first, this bothered me. Then it didn't. And now I'll tell you why.

The Internet changed the way we see ourselves relative to the world. It brought the world closer to us. Anything we ever wanted to know, Wikipedia knew it. Any news story was on the Interwebs within minutes, and had 100 comments in just a few minutes more. Since the important national and global issues became so easily accessible and immediate, we only cared about those stories because the local stuff seemed banal compared to presidential scandals and health care reform.

The problem started when we drew a logical connection between our ability to voice our opinion and our ability to control the vast quantities of information we were fed. If we could know all this information so easily, it must also be easy to control all that. Knowing this, the news media highlighted story after story of individual Interwebs users changing the face of government. Tools like Facebook and Twitter became key buzzwords in the news for their ability to empower the anonymous individual.

Once the power of the individual became an entitlement, we started treating our system like a true democracy, instead of the representative republic that it really is. We assumed that since we could post our status on Facebook and people would listen, that our congressmen would do the same. We assumed that since we now had a voice that our government really wanted to listen.

So, it becomes a shock when we find out that the big businesses and bigger pocketbooks actually run the country, not the people. All of a sudden, the corporation is seen as a usurper of our right to change and our right to influence the powers that be. That makes them evil.

People don't realize that it's always been this way, and it always will be this way. Greed and power will always be the main element of politics, and philanthropy will almost always be a power tool. When faced with listening to the people and protecting a revenue stream/job security, the voter almost always loses. Politics never rewards altruism and dedication to country, because that doesn't pay the bills. What you see in the media is not what really goes on behind the scenes, and the big players are always more important than the small fries.

But it isn't all cynicism and gloom. There's a whole level of politics and policy that the average Joe has power over and shouldn't in any way ignore.
The media has portrayed the national political scene as a somehow accessible environment that a little hope and a little spunk can change. And that's when we started ignoring our local governments.

Local government is where the real battles are fought. Local governments are made up of community leaders who understand that for a city council vote, every vote does count. Every person that they do good by will tell other people about it, and get votes. City officials live in the city and therefore care about your community as much as you do. Yes, money and greed are still the impetus, but here it's on a level that the layperson can get into and affect. In the hullabaloo over health care, my personal state government passed two laws directly affecting taxes. Why was this ignored? If anything, I should have wanted to know about that more than healthcare news because I could have tried to affect the state legislature. I could have talked to my district's representative and he may have actually listened. I could have gotten involved on a personal level and I would have been much more satisfied than engaging in flame wars about the government's knowledge of 9/11.

If you really care about politics outside of the desire to argue with people, then you absolutely can get involved. Join some societies, lead some community projects, meet people, talk to people, and make yourself known in the local scene. Write some editorials, attend city council meetings, and before you know it, you'll be "in politics".

Unfortunately, the Internet has eroded our sense of local community. We all live in a country, but we also live in our community, and the local community has issues that directly affect the way you live that you actually have control over.

So I humbly ask the denizens of the Interwebs: If you all really do know everything (which you allegedly do), then stop wasting your breath on the forum trolls. No matter how vehemently you oppose gay marriage, and no matter how big you can make your words defending that stance, you will not change anything. If you think your opinions should matter to somebody, and to legislative policy, get off my Interwebs. We don't do stuff like that here.

Finishing with my original point, people have to stop thinking that our federal government cares what you think. Let’s get this straight: you are one in a few hundred million. They don’t care about you. Getting all upset at government putting businesses over individuals is futile, because it’s a fact you’re going to have to deal with. On behalf of the Interwebs, I apologize for making you think you matter. You don’t, and your Twitter followers won’t change that. In order to regain our individual importance, we need to get back in to local politics, and make a scene. If we do that, I think we can change the face of national politics, slowly but surely. And to bring this to a technology standpoint, local politics needs some serious Interwebs help. If you can run a decent viral marketing campaign, you can win a local election. I don’t think local campaigns have leveraged the power of digital communities yet, and now would be a great time to take advantage of that.

I know this wasn’t a tech post really, but thank you for sticking with me. I needed to get it off my chest. And I have no friends.

11.02.2009

The Horrid State of IT on TV Shows

I've been watching Jericho, a TV show cancelled in 2007 after one short season, but critically acclaimed as "good TV". I agree. It is good TV.

However, it suffers from the same malady that plagues every TV show released in the past decade: a complete disregard for reality when it comes to anything computer related.

Now, don't get me wrong. I understand that this is TV, and that every body of knowledge has been desecrated in some way or form by the imaginations of writers, especially the laws of physics (and the laws of going to the bathroom. Nobody ever needs to use the toilet on TV. It drives me nuts). But some things are so easy to get right, yet writers always get them wrong. Almost like they're doing it on purpose, just to make me angry (a perfectly plausible reason).

Some background: Jericho is about a nuclear attack on the US in which 23 major cities are vaporized in one short moment. The show chronicles the story of the inhabitants of the small town of Jericho, KS, and their quest to survive and adjust to post-apocalyptic America.

So, the scene I'm about to show you transpires after electricity is temporarily restored to the town, and the population realizes that the Internet doesn't work. As a girl named Skylar attempts to check her email, Allison (whose father may or may not be a Super Duper Secret Agent Man) tries to offer some tech tips.

Failure Ensues.


Ok, let's break this down.

Skylar is fake-typing furiously at a screen that says "You aren't connected to stuff". Protip: If you're going to fake-type, at least make it so that it looks like you're doing something. When in the history of Internet Explorer have you been able to magically connect to the Internet by typing in magical hacker codes that don't even show up on the screen that you're typing on?! At least pull up a command prompt! This is just a symptom of another pet IT peeve of mine in TV shows:

Nobody ever uses a mouse.

Now, I know that in the geek realms that I reside in and call home, keyboard shortcuts are key to productivity, and if you ever have to use a mouse to accomplish anything, you're doing something wrong. What you're doing with your mouse could be done in a fraction of the time with shortcuts, and you are therefore inferior. However, that's not how most of the world works. Most teenage mall rats trying to check their email after the apocalypse will not know the various wonderful shortcuts the "Windows" key makes possible, let alone know magical hacker codes that connect you to the post-apocalyptic Interwebs.

So, along comes Allison, angsty teenager turned networking guru, with some helpful tips. After witnessing Skylar utterly fail at fake-typing, Allison recommends that she type in the straight IP address.

As Allison would say, "Oh, no you di'n't".

Here's some basic knowledge easily looked up on Wikipedia. When you type in a web address, this nifty little thing called a DNS server translates your web address into an IP address. This was implemented so that we don't have to memorize 12 digits every time we want to go to a website. Very simple. So, Allison thinks that maybe only the DNS servers were wiped out in the nuclear apocalypse, but the rest of the country's network infrastructure is still intact. And I know exactly where this misconception came from. The writer probably saw his company IT guy connect to a company server using its IP address (sometimes it's just simpler to troubleshoot that way, especially when your local DNS is wonky), and thought, "Wow, a magical hacker code to connect to the Internet! Even when all the wires in major cities were wiped out in a nuclear apocalypse! This must be something all the cool trendy teens know how to do. I'll put it in my show and impress the techies." And just to continue the theme here, let's assume that the Internet magically survived. It's a huge testament to the people in charge of the Internet that viewers assume that after the nuclear annihilation of 23 cities, you'll be able to log in to Facebook the day after at your local Starbucks Wi-Fi hotspot without a hitch in service. News flash: The Internet isn't some transcendent being that resides in our physical realm through the magic and wonder of computer geek fake-typing. It's a bunch of physical computers and cables, distributed across major universities, research centers, and data centers, that communicate with each other and that the powers that be have dubbed "The Internet".
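If you're curious what that name-to-address translation actually looks like, here's a minimal Python sketch using only the standard library (the hostname is just an example I picked, not anything from the show):

    import socket

    # Ask the operating system's resolver (which in turn asks DNS) to translate
    # a human-readable hostname into its numeric IP address.
    hostname = "www.example.com"  # any reachable hostname will do
    try:
        ip_address = socket.gethostbyname(hostname)
        print(f"{hostname} resolves to {ip_address}")
    except socket.gaierror:
        # This is the Jericho scenario: if no DNS server answers, the name
        # can't be translated and the lookup simply fails.
        print(f"Could not resolve {hostname} -- no DNS server answered.")

Typing a raw IP address skips that translation step entirely, which is why it can sometimes work when DNS is broken, provided the rest of the network between you and the server still exists.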

And just when you thought it couldn't get worse, Allison goes ahead and types in an IP address. The address she types is....

827.750.304.001.

Now, before I go and explain how this isn't even remotely close to anything even resembling a valid IP address, let me point out the ridiculosity (that word rocks, shut up) of the whole scenario once again. Let's say Skylar was trying to check Hotmail. Quick, what's the IP address for Hotmail? Yeah, didn't think so. Neither does anybody else. And one other slight criticism: you need to press the Enter key when finished with your super hacker fake-typing. "Enter" is the universal keystroke of "Go do all that stuff." Internet Explorer requires you to press the "Enter" key when you finish typing in the address bar. Go ahead, see what happens when you type in an address up there and don't press the "Enter" key. Hint: Nothing happens.

Now, let me tell you what an IP address is. The Creators Of The Internet (Capitalized for Effect) decided that the world would need a standard binary addressing system that would give every site on the Internet a unique address. So they took 32 bits (4 bytes, if you're counting), which gives you 4,294,967,296 different unique addresses, give or take none. Rather than referring to yahoo.com as site #3,518,979,381 (the value of 11010001101111110101110100110101, the IP address of yahoo.com), the convention is to write the address out one byte at a time. So that big binary number up there becomes 11010001.10111111.01011101.00110101, which in decimal is 209.191.93.53. (Carving the address space into networks is a separate trick called subnetting, which I won't explain now.) If you made all 32 of those bits "1", you'd get 255.255.255.255. That's the highest possible IP address in the current IPv4 way of doing things.

That may have been a little complicated, but still. Five seconds of Google love is all it takes to figure out that 827.750.304.001 is an incredibly dumb number to use. And it's not just that three of the four numbers are way out of range. That last number, 001, is just stupid. The whole point of writing an address out the way we do is that it's a plain decimal representation of the underlying binary: no octet can be bigger than 255, and you don't pad with leading zeros. It's hard to believe that nobody on the set at the time pointed this out to someone. It's a pointless lack of simple research.
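To make the "five seconds of checking" point concrete, here's a small Python sketch of my own that does the byte arithmetic described above and shows why the show's address fails the most basic sanity check (the helper function is just my illustration, not any standard API):

    def dotted_to_int(address: str) -> int:
        """Convert a dotted-decimal IPv4 address to its 32-bit integer value."""
        octets = [int(part) for part in address.split(".")]
        if len(octets) != 4 or any(not 0 <= o <= 255 for o in octets):
            raise ValueError(f"{address} is not a valid IPv4 address")
        # Each octet is 8 bits wide, so shift each one into place and OR them together.
        return (octets[0] << 24) | (octets[1] << 16) | (octets[2] << 8) | octets[3]

    print(dotted_to_int("209.191.93.53"))    # 3518979381 -- the yahoo.com value quoted above
    print(dotted_to_int("255.255.255.255"))  # 4294967295 -- the biggest possible IPv4 address

    try:
        dotted_to_int("827.750.304.001")
    except ValueError as err:
        print(err)  # 827, 750, and 304 are all bigger than 255, so this fails immediately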

And it only gets better. Allison says, "But the Internet was designed by the military (SPOILER ALERT: It wasn't really, they just started it) to withstand a nuclear blast (except for the DNS servers)!" I won't even bother to explain how asinine that tidbit of wisdom is, being that "The Internet" isn't a "thing" that can be "destroyed" by bombs. This, followed by Skylar's shining moment of ingenious intellect and insight, "So, why can't I check my email?? (while continuing to quickly fake-type)" should be enough to make any self-respecting nerd start foaming at the mouth.

Alas, things only go downhill from there. All of a sudden, everyone's IE browser is covered with a Public Service Warning that tells everyone that everything is going to be OK. Now, this makes perfect sense, seeing as nobody had access to the Internet a moment ago. This is the only plausible way something like this could maybe happen:

- Some super-powerful Internet server is still functioning somewhere, has access to all the DNS servers, and can force every browser to route every request to a single server showing that message, without the user refreshing the page or clicking on anything (SPOILER ALERT: This can't happen).
- It apparently is able to grant Internet Access to everyone that didn't have access to it a moment ago (SPOILER ALERT: This is impossible as well).
- All the physical wiring needs to be intact, along with the routing functionality of all the ISPs providing access to the DNS servers. (SPOILER ALERT: What part of Apocalyptic Nuclear Annihilation didn't the writers understand?)
- It would have to bypass spyware, malware and virus checkers (SPOILER ALERT: It won't. You'll get a warning that you have a trojan or something and that would be the end of it).

Of course, the proper response to this is more furious fake-typing, until Skylar finally gives up and secretly admits her failure to implement her obviously l33t haxzor skilz and dies a little inside.

This is just one of many tech stupidities that pervade TV shows and movies these days, and I don't mean to say everything needs to be exactly accurate. That would make things unbelievably boring (if Jack Bauer forgot the password to his Windows Active Directory account, he would have to contact the helpdesk to reset it. I don't care how badass he is. IT owns him). But please, please do some basic research. I know you can't always follow the laws of physics, and I'm cool with that, but some things are just too easy to get right.

10.30.2009

The Seventh Coming




It was with trepidation and anxiety that I pressed the Install button on the Windows 7 setup screen. Its futuristic blue hues and user-friendly language did nothing to assuage the mounting tension I felt as I remembered the last time I installed a new OS. I Vista-ed my computer last time (sort of like "bricked my computer," but worse). That was when I lost my faith in computing humanity. After I suffered through the despotic rule of Windows Vista, the Seventh Coming of Windows has arrived, and I have received it, along with its legal license key.

To be specific, I have a Win7 Professional 64-bit OS running on my system, and now I want to talk to you about it.

I was impressed from the get-go. When I first logged in, I expected to put aside 3 hours of my valuable time to clear the OS of bloatware like Norton's ubiquitous deal-with-Satan 30-day trial and eBay web links. Instead, all I got was the Recycle Bin on the desktop. I did a double take. Did that really just happen? Did Microsoft really just provide me with a clean OS? Something horrible must have happened. It must be a virus. Or something. This just felt awkward. Like if your parents actually did buy you a car for your birthday. What are you supposed to say? Thank you? Words can't contain the gratitude I felt at that moment.

After that, everything seemed sorta the same. Yes, yes, yes, the taskbar is different. But nothing like "OMG,OMG!!! THE TASKBAR IS THE BEST THING SINCE SLICED BREAD!!! IT WILL CHANGE THE WAY WE PUT TASKS IN OUR BARS!!!! YAAAAAAYYYYYYY!!" It's definitely a slick improvement, but it's nothing revolutionary. All the changes I've seen thus far from a user experience standpoint have been small things that make stuff easier to find and do. Which is great. But it hasn't wowed me in any big way yet.

A lot of people are saying that Windows 7 is like a big service pack to Vista. I would use a different analogy. I would say it is what Windows 98 was to Windows 95. From a user experience standpoint, nothing was all that different. The foundations of the desktop environment are still there. It's the stuff under the hood that made 98 a much more stable, and therefore longer lived, operating system. I haven't had enough time to get into the nitty gritty yet, but I'll let you know when I do. Maybe I'll host a launch party!




Smile moments: PowerShell 2.0 included, speedy wake-up from standby, trippy backgrounds, small improvements to the Explorer interface, the Libraries system.

Frown moments: PowerShell was surprisingly slow, I actually liked the Vista look better, still not-so-speedy startup, not sure if HomeGroups will really take off, nobody came to my launch party.

10.26.2009

Net Neutrality and the End of the World As We Know It





Net Neutrality is a buzzword these days. For those who live in huts (or use dial-up), Net Neutrality is the most recent left-wing attempt at overhauling an established private infrastructure toward a more government-centric paradigm. That infrastructure would be the Interwebs.

The FCC is trying to push through rules that would regulate ISPs and how they allocate bandwidth. The FCC believes everybody has a right to the information superhighway (I haven't used that name since the 90's and it makes me feel old) and that ISPs have no right to restrict access to it. The timing is obviously tactical. Recently, many ISPs, and specifically Comcast, have been heavily criticized by the right wing for throttling bandwidth for customers engaging in frequent peer-to-peer transfers (like BitTorrent). The thought is that the people using the most P2P bandwidth are most likely using it for illegal purposes and can therefore be slowed down. This comes after an already controversial move by Comcast and others to "plant" illegal copies of movies just to track who was downloading and (almost always inadvertently) uploading them. Now that the right wing is all up in arms against the ISPs, the left has decided this is the perfect time to stage their coup on the Interwebs.

I can go on and on about how wrong it is to socialism-ize (that is now a word) the Interwebs, but I hope I don't need to tell you that. I have a different point to make. All I have to say is: Be careful who you mess with.

The Interwebs is owned by the geeks. It is their realm, their kingdom, and their Dungeons & Dragons (4th edition) dungeon. They live in the Internet, and the Internet lives through them. With the World Wide Web spread out before their superhumanly speedy fingers, the Geek will always find a way to get what he wants in his own homeland. Be it a movie, a program, a game, or top-secret classified government documents, the Geek will find it, and he will get what he wants without anybody knowing he was there. Like a ninja. Bespectacled and zit-ridden in RL (Real Life), the Geek becomes Superman at the keyboard. If you tell a Geek that your system is impossible to break into, he will reply, "Give me five minutes." If you tell him that what he's doing is illegal, he'll tell you, "I already won that flame war. It is illegal no longer." If you tell him his lvl85 paladin looks like a pansy... he will cry.

If the government really tried to take away the Interwebs from the Geeks, the Geeks would revolt. The revolution and cyber-uprising that would ensue when policies of artificial scarcity are enforced on the Interwebs would put the geeks in charge of the new digital economy. This is because no matter how much regulation the government tries to put on bandwidth, Geeks will find a way to circumvent it. They always have, and they always will. This will just leave the layperson with an overpriced and slow broadband connection for no good reason.

This is all based on historical precedent. Every time the government (or its usual proxy, the RIAA) tried to regulate data flow, the geeks came out on top. Attempts to lock down the mp3 format led to the P2P movement, proliferating mp3 saturation by many orders of magnitude. When Napster was shut down, the BitTorrent method of file sharing emerged, effectively protecting everyone involved and further growing the market for music and software pirating. Also, understand that the copyright breaking is not usually done for profit. It can be quite spiteful. When Electronic Arts released its blockbuster PC gaming masterpiece, Spore, it used an archaic form of DRM (Digital Rights Management) that infuriated the gaming community. In a startling testament to the spitefulness of the geek community, Spore quickly broke records upon release as the most pirated game in PC gaming history. And it wasn't even that good.

You see, Geeks can be pushed around in RL, but if you try to invade our digital kingdom, we quickly become vindictive and violent (and alliterative). If the government really tries to restrict bandwidth, there will be war. And we both know who will come out victorious.

10.22.2009

Borderlands Review: The Border of Perfection


Just in case you didn't believe me, this is a "Midget Shotgunner". They do exist.
As I unwrapped the shimmering plastic shrink wrap and opened the radioactive-green, new-smelling Xbox 360 copy of Borderlands, I was beyond excited. I have already written about why Borderlands was to be my dream game design come to fruition. It was time to put the game through my critical gauntlet.

As the opening credits roll, it's obvious that great care was given to the presentation and production value of this game. You're smiling the whole way through, and it makes you want to get into the game as soon as possible. The cel-shaded graphics were a huge risk on the part of Gearbox, and they implemented that style flawlessly. As mentioned by many other reviewers, it gives the game a distinct look and feel that provides a perfectly fitting stylistic context for its bizarre humor and wit. It's hilarious when it needs to be. When a Pandoran hillbilly asks you to "please murder the crap out of that guy," you can't help but chuckle. It may not break the boundaries of graphics technology, but that was never the point. It looks great, it sounds great, and the setting is just right, but that's just icing on the cake when it comes to the gameplay itself.

Anybody coming from the RPG crowd (especially MMORPG players) will feel immediately at home with the progression of the first few hours of the game. It's a slow grind through some rudimentary quests built to get you used to the world, the controls, the characters, and the weapons. Many would call this boring. I call it build-up. Because the moment you break out of the main hub, head out on some side-quests, and discover the true extent of the destructive possibilities in this world, your mind is blown. Some would say that these possibilities should be obvious from the get-go, but I feel like it would simply be too much if the complexities hit a new player all at once.

The complexity comes primarily from the weapon system. The greatly hyped weapon generator is true to the hype, and as you progress through the game, you will indeed collect an eclectic and infinitely varied weapon set that never ceases to amaze. The weapon system basically breaks down like this: there are 8 weapon types: pistols, revolvers, SMGs, assault rifles, shotguns, sniper rifles, launchers, and grenades. A variety of manufacturers produce these weapons, and each manufacturer tends to focus on a certain quality. For example, if you find a Jakobs revolver, it will most likely be more powerful than, say, a Tediore revolver. Tediore makes its revolvers easier to use, so they'll have a faster reload time. Some manufacturers focus on rate of fire, some on accuracy, etc. So, already you have hundreds of variations. But that's only the tip of the iceberg. Pretty much every aspect of your gun is a randomly generated variable, and they can all be mixed and matched by the generator to produce some truly amazing and unique weaponry. For example, my level 15 hunter currently sports a quick-reloading sniper rifle that shoots incendiary rounds, a revolver that fires 7 bullets at once like a shotgun, and a generic but very powerful assault rifle with a bigger clip. Each one of the many variations manifests itself graphically in the gun, so the gun with the improved clip actually, physically has a bigger clip, and the revolver-shotgun has a really big barrel. Each weapon has a unique name generated by the aggregation of its parts and its manufacturer. It's easy to understand how Borderlands' claim of having more than 17 million guns isn't far-fetched. Even when other aspects of the game get dull, the weapon generator makes it all worth it.
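To give a feel for why the combinations explode so quickly, here's a toy Python sketch of how a generator like this might mix and match parts. The manufacturers, stats, and bonuses are my own stand-ins for illustration, not Gearbox's actual data or algorithm:

    import random

    # Toy stand-ins: a few weapon types and manufacturers, each manufacturer
    # nudging the one stat it is known for.
    WEAPON_TYPES = ["Pistol", "Revolver", "SMG", "Assault Rifle",
                    "Shotgun", "Sniper Rifle", "Launcher"]
    MANUFACTURERS = {
        "Jakobs": {"damage": 1.3},         # hits harder
        "Tediore": {"reload_speed": 1.4},  # reloads faster
        "Hyperion": {"accuracy": 1.25},    # shoots straighter
    }
    ELEMENTS = [None, "Incendiary", "Corrosive", "Shock"]

    def generate_weapon(level: int) -> dict:
        """Roll a random weapon by mixing a type, a manufacturer, and an element."""
        weapon_type = random.choice(WEAPON_TYPES)
        maker = random.choice(list(MANUFACTURERS))
        element = random.choice(ELEMENTS)
        stats = {"damage": level * 10, "accuracy": 1.0, "reload_speed": 1.0}
        # Apply the manufacturer's signature bonus to whichever stat it favors.
        for stat, bonus in MANUFACTURERS[maker].items():
            stats[stat] = round(stats[stat] * bonus, 2)
        name = " ".join(filter(None, [maker, element, weapon_type]))
        return {"name": name, "level": level, **stats}

    print(generate_weapon(15))  # e.g. {'name': 'Jakobs Incendiary Revolver', 'level': 15, ...}

Even this toy version produces 7 x 3 x 4 = 84 distinct names before any numeric variation; layer on randomized barrels, clips, scopes, and rarity tiers, and a number like 17 million stops sounding far-fetched.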

Like I said, parts of the game can get dull. I was playing this with my brother to get a taste of the co-op action, and he mentioned that the quests are like boring WoW quests, where you are tasked to collect objects and kill a number of enemies ad nauseam, with little variety. This is true in that the quest objectives are pretty generic, but that doesn't mean the gameplay is. The objectives become a moot point when you're faced with a den of high-level spitter skags and a camp full of insane burning midget psychos. When a badass corrosive alpha skag shows up, your focus will not be on the objective, but on how you are going to kill this thing, and what shiny loot it is going to drop. Completing the objective simply acts as an excuse to return to a hub and sell your swag.

The story follows the same formula. It obediently takes a backseat to the action and character progression that the game wants you to focus on. The whole story can basically be whittled down to "Find the best loot in the universe, conveniently hidden on the planet you are on right now." The writers understand that the player wants to loot, loot, and loot some more, and they crafted a story accordingly, one that eschews the normal RPG tones of epic space/medieval soap opera and fits the gameplay perfectly. The gameplay is about mindless looting, and so is the story. So, in other words, the story sucks, but it makes sense from a game design perspective. Personally, I was hoping for a complex spaghetti-western action epic where the guns are the stars (see the link above), but I will admit that the shallow story design fits the game well.

The character growth system walks a fine line between being too simple and too complex, and therefore has alienated a lot of players. The RPG fans bemoan the lack of armor options and relatively weak skill tree, while the action fans don't want to spend time tactically crafting skills and attributes, and want to start shotgunning badass skags, right this very second. In a game that takes many risks and succeeds in almost all of them, it's a little off-putting that they played it so safe with the skill tree.

The vehicle segments of the game feel like they were meant to be much deeper and more complicated at some point in the development cycle. Maybe they tried to make the vehicles as random and varied as the guns, but somewhere down the line, they simplified things to two types of vehicles with two types of weapons. The vehicles exist in-game purely as a means to travel faster. But the detailed models and exquisite targeting systems on the vehicles lead me to believe that the designers originally meant for that aspect to be deeper. For now, it's a disappointment.

So, is it my dream game? The short answer is almost. It comes so close to my personal gaming nirvana, and then leaves out one or two dealbreakers that stall the game at the finish line. I love the weapon system to death, but it doesn't let you scavenge for parts (a la Guild Wars), and it doesn't have a gun customization tool to let you build your own dream weapon from salvaged parts. That's always been the core of my own design. Don't get me wrong, I love the random weapons, but I really think that a customization tool could be implemented without breaking the balance, contrary to many critics' view. Even a way to add attributes, like gems in Diablo, would be welcome. But the designers left it out in favor of pure looting. In their words, "You're a gunslinger, not a gunsmith." And while I understand the designers' choice to forgo any semblance of a cohesive narrative plot, I miss it nonetheless. It would give the character a context for why he or she is grinding for loot. It isn't needed, because looters loot because the loot is there, not because they need a context. But I would like to have it.

Borderlands is the best console game I have played in a long time. It's probably the first hack 'n' slash lootfest successfully implemented on a console, and it happens to be a great FPS at the same time. The presentation is slick and witty, and the unique attitude shines through and through. The hybrid of FPS and RPG works beautifully, and the hype of the weapon generator is completely realized. There are a few personal dislikes, design flaws, and technical hiccups, but they are minor. No other game has come closer to being my dream game, and unless a huge expansion pack addresses those flaws, it will most likely hold that title for some time.

In the meantime, I will gladly smite down the shotgunner midget psychos until they finally drop the gun I've been hankering for. And I will thoroughly enjoy it.

10.20.2009

A Tale Of Gaming Addiction

Mike Fahey, a popular editor at the gaming news site kotaku.com, has written a soul-baring and heart-wrenching account of his battle with addiction to video games, and how Everquest almost consumed his life. Even if you have never played a video game in your life (you should try it; it's fun), it's worth a read. It's one of the most honest and heartfelt cries for help among the many such accounts surfacing in the blogorhombus.

I Kept Playing - The Cost of My Gaming Addiction

As a lifelong, avid video game hobbyist, I find that this story strikes fear into my innards. The issue pops into my mind all the time. As more and more stories like this one come out, I find myself pondering the state of my hobby. I ask myself, "Am I addicted to video games?" The answer is always no. But it doesn't always come right away. Whenever I read these stories, I see so many behaviors that I have engaged in at some point in time. It scares me to think that the one hobby that has been a constant in my life since early childhood could one day take over my life and consume it.

If there's no clear line, and the enjoyment is the same impetus as the addiction, then when do you say, "I have a problem"? At what point do you differentiate between the need to have good food and the need to play a good video game? If you tell me that addiction is when you forgo other responsibilities to play video games, I would tell you that people do this all the time, and it's called recreation. Wanting to have a good time in spite of responsibility is not addiction.

Addiction in a video game needs to be measured by a different yardstick. I think the author is very clear about this. When the virtual world became more important than real life, the game had officially taken control. Everything else is just symptomatic. This point will come at different times for different people, and as Fahey points out, this is a very personal and individual battle. When the line is so thin and unpredictable, gamers have to be vigilant and responsible in walking it.

So, I am not addicted. I have a life, a wife, a kid, a job, etc. I take care of things, and when I don't, it's because of laziness, and not because some virtual world needs my attention. However, I do catch myself making excuses to keep playing. I find myself sucked into long playing sessions when I have time for them, and I enjoy it. This is because I enjoy video games, and the ability to escape for a little bit has always been an awesome distraction.

It comes down to discipline, responsibility, and honesty. I hope people are learning valuable lessons from the stories coming out, and I applaud Mr. Fahey for sharing his.