Tuesday, March 22, 2011

One Week With The iPad 2

I just wanted to offer a few thoughts on Apple’s iPad 2 after spending roughly a week with my new purchase.

One of my first articles covered my impressions of Apple's first-generation tablet, a machine that I felt was a brilliant piece of hardware.


Design -
There is really nothing that I can say that hasn't been said about Apple's commitment to design. The iPad 2 is a beautiful device. The brushed aluminum back with its tasteful black Apple logo is simply iconic. All of the device's buttons are also clad in black and offer a nice accent to the metallic theme.

The device is very easy to hold and handle. Its extreme thinness imparts a far better feel than the first-generation iPad, which was no slouch in the design department.

The white bezel is a nice change of pace from the black bezel of the first-generation device. The tapered edges and the flat, shallow back are a welcome change from the elegant curves of the iPad prime and make it far more stable when lying flat.

Hardware -
Where to begin?

The iPad 2 sports a dual-core processor that doubles its theoretical performance. Honestly, I can't discern any noticeable difference between the iPad 2 and the iPad prime in general tasks. Both seem up to the challenge of simple computing, and both handle applications with gracious ease.

The screen is unchanged from the first-generation iPad, and to be honest, there is nothing of significance to report here. The screen is still a responsive, bright (although not quite as bright as the original iPad's screen) and beautiful IPS panel. Apple's engineers (and I thank them for this) wisely decided to mount the microphone just above the front-facing camera, a welcome change from the previous generation's side-mounted microphone.

The iPad 2's GPU has been vastly improved, which makes me wonder if the device was originally destined to bear a Retina display. As with the dual-core A5 processor, the real-world effect of the iPad 2's next-generation GPU is minimal when it comes to basic tasks; gaming, however, sees a noticeable performance bump. Jenga is just one example of an awesome game that utilizes the iPad 2's improved graphics processing power.

Two cameras (one front-facing, one rear-facing) were included in this iteration of the iPad, but they are of the fairly low-quality sort found on the fourth-generation iPod Touch. All I have to say is that you shouldn't purchase the iPad 2 solely for its cameras; they just aren't that good...

The iPad 2 also includes an optional 3G radio (which I opted for), GPS (on the 3G version only) and an accelerometer. These options are nice and I appreciate their inclusion, but you definitely pay a premium for the 3G radio and GPS functionality.

Summation -
The iPad 2 is a great device, but the price, performance and battery life may not justify the leap from the iPad prime to the second generation. If you absolutely need or want the cameras, 3G functionality, or passage into the iTunes ecosystem, then I would highly recommend the iPad 2. However, if you currently own the iPad prime, I would have a hard time recommending this device.

In addition, Samsung is readying a new crop of Android-powered tablets that are thinner and lighter than the iPad 2, but, as always, it remains to be seen whether they can match Apple in terms of pure performance, battery life and "app" variety.

Apple's calling cards remain price, performance, variety of "apps", battery life and the maturity of the iOS platform, and I am happy to report that none of that has changed with the introduction of the iPad 2.

Friday, February 25, 2011

A Time To Build And A Time To Tear Down.


I think shoddy build quality is completely unacceptable in the age of microelectronics. There! I said it. I am not ashamed to wave Quality's flag on my blog.

At any given time, the U.S. consumer electronics market seems flooded with low-quality, low-cost consumer electronics produced with poor components and the bare minimum of workmanship.

I hate sounding like one of those people; you know them, the kind of people who complain about good things. Believe me when I say that never before in the history of mankind has the consumer had it so good when it comes to affordable technology. Prices on things like CPUs, hard disk storage, RAM, video cards and MP3 players have never been lower. However, there is a downside to all of this abundance: poor quality in a great many of the products we, the nerds of the universe, tend to buy…

I came across this article at eWeek this morning: http://www.eweek.com/c/a/Desktops-and-Notebooks/Apple-MacBook-Pro-Teardown-Finds-Improved-Wireless-Quality-Complaints-542596/

Just thought I would point out some choice highlights from the iFixit teardown of the 2011 MacBook Pro:

Later, though, the (iFixit) team uncovered what looks like toothpaste blobs. "Holy thermal paste! Time will tell if the gobs of thermal paste applied to the CPU and GPU will cause overheating issues down the road," wrote iFixit, coming across the first issue to make it wonder whether Apple is having some quality control issues.

Come on Apple, really? Blobs of thermal paste haphazardly splattered on the CPU and GPU? I know Foxconn has a reputation for driving its employees to suicide, but this is ridiculous! Okay, okay, I made that joke in poor taste. I’m sorry.

As if the amateurish application of thermal compound to the most critical components of the new MBP (critical because the logic board contained in the MBP is non-serviceable and costs nearly as much as a new laptop itself) wasn't enough, iFixit also detailed another disturbing problem.

The team also discovered a stripped screw near the subwoofer enclosure and an "unlocked ZIF socket for the IR sensor," two things, said iFixit, that should not be found inside a completely unmolested computer with an $1,800 base price.

I couldn't agree more with iFixit's assessment. There is absolutely no excuse for a company like Apple, which prides itself on its reputation for aesthetics and design, to ship products with stripped screws.


I honestly question what the suits in such companies are thinking. Perhaps they've grown savvy to the apathy of the masses. Or perhaps slapping a one-year warranty on an $1,800.00USD piece of equipment creates a false sense of security, because they know that you and I (aka The Consumer) might feel a strong compulsion to purchase an AppleCare service plan at the ridiculous price of $250.00 for three years of extortion, errpp, I mean "protection". Regardless, I think it's high time that we, as consumers, hold these companies responsible for what they ship.


In the end it is up to each one of us to vote with our wallets. We must entertain the idea of paying a tad more to a company that values quality and craftsmanship as diligently as it pursues innovation in the unceasing quest to increase the bottom line.



Friday, February 11, 2011

I'm In Love With A Roku...

There is a lot to love about the little black box named Roku.

The last few days have been a revelation of sorts; I have rediscovered the fact that I love television. Never, in my life, have I watched as much television as I have since purchasing my Roku XDS.

Pros:

+ Size: It’s small, very, very small; even smaller than a CD case.

+ Apps: Roku has tons of free apps including Amazon On Demand, Netflix, Vimeo and Hulu Plus.

+ Build Quality: Seems like a fairly well-built device to me. The remote control is excellent, with nice rubberized buttons.

+ Image Quality: High-definition content really pops on my Vizio 37” LCD television. My television is a 1080i device (I’ve set my XDS to output a 720p signal), but that doesn’t stop the little black box from serving up beautiful images. Although Roku claims 1080p compatibility, I have my doubts about this device's ability to keep up. Oh yeah, the XDS definitely won't replace your Blu-ray player either. Nevertheless, the XDS is good: very, very good.

+ Wireless: Dual-band wireless adapter. 2.4/5GHz gives you some flexibility when connecting to your wireless router. Unfortunately, my mixed-mode network is running at “G” speeds most of the time. Not the XDS’ fault…

+ USB Port: Allows you to connect and play various media through the built-in USB port. Yay!


Cons:

+ Heat: The XDS supposedly consumes 5-6 watts in standby mode, and the little guy gets fairly warm when idle. I suspect the RAM chips are the culprit. No active cooling, or even passive cooling; just the bare board and a honeycombed top cover…

+ Mystery Hardware: The decoder chip has been identified, as has the RAM, but the wireless chip is still a mystery. Roku doesn’t seem forthcoming with “hard specs”.

+ Wireless: I had trouble connecting the XDS to my network initially, but that was my fault. More significantly, upon resuming from standby mode, the XDS’ wireless chip seems to need a bit of time to “catch up”, and video quality suffers in the meantime…

+ On/Off Switch: None, as in, the XDS lacks an On/Off switch. I am a big believer in powering down devices when they are not in use, so the lack of a physical hardware switch is somewhat disturbing to me. However, I can easily remedy this problem by powering down my surge protector…

+ Pay To Play: Gotta pay for Netflix, Hulu Plus and Amazon. While a good deal of content can be had for free, the premium stuff costs money. This ain't Kansas anymore, Dorothy!


So, in summary, the Roku XDS is a great little device and well worth the $99.99USD I paid for it. Take into account the affordable nature of both Netflix and Hulu Plus, and the fact that any content you purchase through Amazon is yours to keep, and you simply can’t go wrong with this amazing little device!

Back are the days when I come home actually looking forward to watching some television!




Wednesday, February 02, 2011

Yay! More Android Stupidity...

http://www.pcmag.com/article2/0,2817,2379264,00.asp

Just read the above interesting article from the guys at PC Mag about the further fragmentation of the Android platform and the claim that the current generation of iPad hardware can't keep up with the Tegra II platform in the soon-to-be-released Xoom tablet.

Really? The iPad's hardware is technologically inferior to a device that hasn't even been released yet, running an operating system that, likewise, has yet to be released?

Yeah, I think anybody who keeps up with technology is well aware of the current iPad's specs. The other quality device on the market (and I stress the word quality), Samsung's Galaxy Tab, currently runs a single-core processor clocked at 1GHz. The differences between the two devices amount to a relatively small 256MB of system memory, the operating system and approximately 2.7" of screen size (9.7" versus 7").

This quote tickled me:

Executives at War Drum Studios, which works next door to Trendy in Gainesville, Florida, also praised the new Android tablet. "Great Battles: Medieval" looks quite a bit like the "Total War" series of games, with individual soldiers battling on a 3D battlefield. In the demonstration War Drum showed off, over 500 individual soldiers were modeled on the battlefield, with no redundant animations, said Thomas Williamson, the company's chief executive. In total, there are probably about 300,000 to 400,000 polygons, he said.

And would his game run on the iPad? "No way, no way," Williamson said. "If this ran on the current generation iPad, it would be about 2, 3 frames per second".

Williamson said that Google's Android facilitates game updates, as it allows incremental updates that can be written to an SD card. New iOS builds must essentially be zipped up into a new build, he said.


So, to summarize, this guy doesn't think that his company's game, which is currently in development for an as-yet-unreleased build of Android, would run on a piece of hardware that is approximately one year old, while it runs fine on a soon-to-be-released tablet that reportedly has twice the "power" (whatever that means; I guess they're referring to core count, RAM and GPU horsepower) of that older hardware.

Thank you for stating the obvious, Captain Williamson...

Please note my response to the sheer stupidity of this article:

So let me get this straight. The future of Android is in games? Wait, I thought Android was an operating system that powered small cellular communications devices?

Does this mean that "Honeycomb" is yet another iteration of the Android platform? I'm confused, and that's the biggest knock against Android: it's schizophrenic and it doesn't know what it is. It has plenty of other issues too (like generally poor performance when stacked up against its "iOS" rival), not to mention the fact that Android developers (God bless 'em) are going to have to implement a "minimum spec" requirement.

This is where I saw Android going when it first began fragmenting and honestly, this is where Android is going to bury itself if Google doesn't get a handle on the situation quick, fast and in a hurry.

Imagine buying a brand-spanking-new Tegra II based tablet to play the latest game, only to find out that your device just barely meets the minimum spec. Then, with high hopes, you download the game only to find that it performs poorly. What do you do then? Buy a new $800.00 Xoom tablet to run the latest Android game? These devices aren't PCs; you can't just pop them open and plug in more memory, a new processor or a video card.
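
To make the "minimum spec" headache concrete, here's a purely hypothetical sketch, in Python, of the kind of gating logic Android game developers may end up writing. Every threshold and device profile below is made up for illustration; a real game would query the operating system for these numbers.

# A hypothetical minimum-spec gate. All thresholds and device profiles
# are invented for illustration; a real Android game would query the OS
# for core count, RAM and graphics capabilities.

MIN_SPEC = {"cores": 2, "ram_mb": 512, "gles_version": 2.0}

def meets_minimum_spec(device):
    """Return True only if the device clears every minimum requirement."""
    return all(device[key] >= MIN_SPEC[key] for key in MIN_SPEC)

# A year-old, single-core tablet versus a hypothetical Tegra II device.
devices = {
    "year-old tablet": {"cores": 1, "ram_mb": 512, "gles_version": 2.0},
    "Tegra II tablet": {"cores": 2, "ram_mb": 1024, "gles_version": 2.0},
}

for name, specs in devices.items():
    verdict = "runs" if meets_minimum_spec(specs) else "locked out"
    print(f"{name}: {verdict}")

The ugly part is that "locked out" here means a paying customer staring at an error message, with no video card slot to upgrade.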

Potential problems like these (which leave the consumer out in the cold) do not bode well for the development of the Android platform, nor do they speak well of the responsibility Google claims to have to its customers. Remember fellas, don't be evil!


Thanks for listening to my rant, check out PC Mag's article and offer your own opinion about the subject!

Saturday, January 22, 2011

To Media Or Not To Media...

Just got finished installing Windows 7 on my "media center" PC, which has previously run Ubuntu and Windows XP, and I must say that Windows 7, despite all of the hype, is quite a revelation...




Forgive the sub-par picture quality, but frankly, my digital camera sucks. The above picture is a shot of my media PC's internal components. This project has come together over the course of a few years, during which time I've managed to assemble a pretty respectable machine complete with a Blu-ray drive, a 500GB Seagate HDD and a wireless network card. Unfortunately, the most expensive component in my media center PC was the Blu-ray drive, the primary reason this project was born in the first place. The most frustrating aspect of integrating a Blu-ray drive into your PC is the lack of a decent, yet affordable, software-based Blu-ray decoder (hardware decoders died in the days of the DVD). The software bundled with an OEM Blu-ray drive may or may not work with Windows 7. I learned this the hard way after attempting to install PowerDVD 8 HD and receiving error messages warning me of its incompatibility with Windows 7. PowerDVD 10 Ultra is compatible with my build, but the software costs $99.00USD, which is what I paid for my OEM copy of Windows 7...


This build also uses an Antec Minuet case ($89.99USD), which I've lovingly managed to scratch up over the better part of two years. I can't recommend this case enough to people looking for an HTPC case with a decent power supply (a 350-watt Antec model) and enough room for a micro-ATX motherboard. Not only is the Antec Minuet 350 a good-looking case for your media center, it is highly functional and roomy, and, as you can see above, it can accommodate a large Zalman 92mm flower fan/heatsink. One of the nice features of this case is the front-mounted USB, eSATA and audio (microphone/speaker/headphone) ports.



My media PC is powered by the humble, yet very affordable (I got it at a major electronics retailer for $35.00USD) Athlon II X2 dual-core processor clocked at 3.0GHz. While it is certainly the case that this processor won't be winning any benchmarks, this little beast is a heck of a chip in its own right, running very, very cool with the addition of the aforementioned Zalman cooling solution. The motherboard is an unremarkable, yet very budget-oriented Biostar model utilizing AMD's 785GE chipset and 4GB of DDR2 memory. This combination of motherboard (chipset), processor and memory provides fairly decent performance, but be warned: Windows 7 is a memory hog, and any amount of RAM under 4GB shouldn't be considered "ideal"...


Unsurprisingly, the install of Windows 7 Home Premium went pretty well. I picked up a copy for $99.00USD. I can't say enough about the relative painlessness of the install process. Windows 7 is definitely slick and recognized every device on my motherboard. In fact, it was so painless that I didn't even need to break out the driver disc supplied with my motherboard. I did, however, install the latest Radeon Catalyst drivers, and I recommend that anybody using an integrated or stand-alone graphics adapter install the graphics drivers provided by the respective hardware manufacturer.



The next couple of shots are of my media center in "action". After installing Windows, I immediately set about the task of installing several programs, including iTunes, Google Chrome and Adobe Flash. I also made sure I set up a separate user account for everyday use. Speaking of security, establishing a standard user account that is separate from the normal "administrator" account and installing antivirus/anti-malware software can go a long way towards securing your machine. I agree with those of you out there who consider any version of Windows to be a highly insecure platform. However, one could argue that all operating systems have their respective vulnerabilities, and that taking steps to improve security on your system is better than taking no steps at all...
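
For the curious, here's a rough sketch of scripting such an everyday account on Windows 7 with the built-in `net` command, wrapped in Python purely for illustration. The account name and password are placeholders of mine, and the script assumes it is run from an administrator prompt.

# A rough sketch of creating a limited "standard" account on Windows 7
# via the built-in `net` command. The name and password are placeholders;
# run this from an administrator prompt.
import subprocess

USERNAME = "everyday"      # hypothetical account name
PASSWORD = "changeme123"   # placeholder; choose a real password

# Create the account; new accounts land in the limited "Users" group.
subprocess.run(["net", "user", USERNAME, PASSWORD, "/add"], check=True)

# Make sure it is NOT in "Administrators", so software (and malware)
# running under it cannot make system-wide changes. The command errors
# harmlessly if the account was never in the group, hence check=False.
subprocess.run(["net", "localgroup", "Administrators", USERNAME, "/delete"],
               check=False)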




Another dark picture of Windows 7's media center software in action (this time streaming a video podcast). Thankfully, many of the frustrations and issues I experienced with Ubuntu and, to a lesser extent, Windows XP disappeared with Windows 7. It's still early in this adventure, but the short time I've spent with Windows 7 as an HTPC platform seems promising.

That is not to say that Ubuntu is bad. No, I love open-source computing and I am a hugely enthusiastic supporter of Ubuntu and Linux. Ubuntu is one of those forces in the computing world that keeps the competition (MS, Apple and anybody else who charges for decent software) honest, but I must nonetheless concede that Windows 7 makes a great O/S for those with a little know-how and some relatively "modern", budget-friendly hardware...

Tuesday, January 18, 2011

Where Have All The Android Tablets Gone?

I have a problem. Well, actually, it's more like an addiction...

My addiction is technology, more specifically gadgets. I love gadgets!

Now that CES is a couple of weeks old and I've had a good, long look at the plethora of next-generation Android tablets, I have to ask myself:

"What happened to the FIRST GENERATION tablets?!"

I'm not talking about the wave of low-quality (i.e. "cheap") Chinese products currently flooding the market under brands like Coby, Augen, iCan, etc. These devices are generally described as fairly underpowered, with resistive touch displays and limited or no Android Market access.

No, I'm talking about quality products like the Samsung Galaxy Tab.

The Galaxy Tab strikes me as the only real first-generation attempt at a high-quality, fully functional product with a beautiful (forgive me) "Apple-like" capacitive display. I guess the Galaxy Tab's only drawback is that it is not running the most current iteration of Android. It also doesn't appear that the Galaxy Tab will have the horsepower to run Android 3.0 when it is released, if, as rumored, Android 3.0 needs a dual-core processor for the "optimal experience". Again, this last point is a sore spot for many Android fans, but that is for another blog post...

During Apple's most recent quarterly report Tim Cook, the acting CEO, offered his thoughts on Android:

"Then you have Android tablets, and the varieties that are out shipping today, their operating system wasn't designed for tablets. Google has said this, this isn't just Apple saying this."

I think Mr. Cook has a point. There is a conspicuous lack of Android-powered tablets on the market that provide the kind of compelling, computer-like experiences that Apple's iPad delivers. It's as simple as that... Sure, Android has thousands and thousands of apps, and so does Apple, but the iPad provides an experience that the current iterations of Android simply cannot match.

That being said, I would love, love, LOVE to purchase an Android tablet, and I've looked long and hard at several devices currently on the market. As I've mentioned before, the Galaxy Tab is a very compelling platform, but at $599.00USD unsubsidized (that is $100.00USD MORE expensive than the wi-fi enabled iPad), one could bemoan the seeming existence of an "Android Luxury Tax"...

[It gets worse!]

On the software side of things, the fragmentation of the Android platform is disturbing to those of us who have come to expect operating systems that deliver the latest and greatest updates and security fixes on fixed hardware. Can you imagine the rancor and furor if Microsoft demanded a hardware upgrade for every patch, security update or hotfix? Hell, Microsoft plans on supporting Windows XP until 2014! A policy like that would be the French Revolution times a thousand, and possibly the end of that software's market adoption, but it is the current state of the Android platform.

Mr. Cook made the following comments regarding the fragmentation of the Android platform:

"We firmly believe that our integrated approach is better than the fragmented approach. You can see this in a number of ways -- from the number of fragmented app stores with a variety of ways to pay, people will pull their hair out. Who's on the latest OS -- Android always lags ... In net we think our integrated approach is better, rather than making the end user a systems integrator. I don't know a lot of people who want to be systems integrators. And I think the same thing about iPad. It's the same set of issues, at the end of the day."

I agree with his assessment 100%! I also feel that it is not environmentally conscious or responsible to demand arbitrary hardware "upgrades" in order to run the latest version of Android. Think of the millions of tons of e-waste generated by that kind of policy.

Not good...

Another worrying issue, or rather obstacle, in my quest for the perfect Android tablet (or at least a tablet that provides an experience equivalent to my iPad) is Google barring wi-fi-only tablets from the Android Market. I've noticed this strange phenomenon affecting the various devices that run Android 2.1. Ironically, the latest and greatest from the guys at Archos (the Archos 32, 43, 70 and 101), which ship with Android 2.2, still lack the Android Market and instead use a proprietary piece of software called "Appslib". This scaled-down version of the Android Market, by all accounts, (pardon my French... get it!? Haha) sucks.

This is, in my humble opinion, completely unacceptable and counter-productive to Android's growth as a platform. How does Google expect the platform to grow and innovate if it starves a large part of its user base?

All hope is not lost, though, because this year's CES demonstrated the looming tablet revolution. Devices of all shapes and sizes are currently in the late stages of development or pre-production. RIM's PlayBook looked very interesting. Some of the Tegra II devices looked interesting as well, but I've lost a lot of faith in Nvidia over the last decade (again, another blog post).

For five days, a mind-boggling host of Android-powered devices were paraded across the pages of Engadget, and that gave me hope. Yes, there is hope for the Android platform; a platform with a lot of potential!

And yet there are still these nagging questions. Also, it is my belief that Apple will not wait for Android to catch up...

Only time will tell, but I hope to review my future Android powered tablet sometime soon...

Saturday, January 15, 2011

The Ubuntu Project. Part II.

The last seven hours have been an interesting and informative experience.

A person can learn much about himself or herself, about other people, and about our fundamental interactions with technology.

For example, I learned today that social engineering isn't all that difficult. In fact, social engineering is quite easy. Hackers, crackers, coders or an especially attentive, adroit home user can gain a staggering amount of information with a cleverly worded email, conversation or phone call.

It isn't that the victims of computer crimes are stupid "newbies", although I would agree that naivety plays a large part in the problem. No, I think the problems with computer security and privacy stem from a fundamental assumption in the minds of more inexperienced, less savvy computer users.

As I write this blog post, I find myself wondering whether people from generations past had the same problems with their interactions vis-a-vis technology. Or whether, as I postulate, their interactions with technology were more basic, more immediate, more visceral and grounded in survival. They were members of generations that depended on face-to-face interactions and the trustworthiness of those interactions. Alas, I digress and wax winsome about an age long since over...

The point is the very real difference between my mother's expectations of technology and the nature of those interactions. I think the majority of older users out there in cyberspace not only expect, but take for granted, that every single exchange of data on the internet, whether it be on Facebook, Hotmail or some other site, will be an open, honest affair. I had to work really, really hard to convince my mother that clicking every link that mentions her being in a video, or promises to make her financial dreams come true, is not a wise decision.

"If it sounds too good to be true Mom, then it probably is..."

That line elicited the most miserable, confused look I have seen on a human face in a long, long time. I couldn't help but feel a twinge of sadness at the thought of two worlds colliding and virtually ensuring that one or both would be irreparably damaged in the process. My mom, ever the guileless user, seemed resigned to this new aspect of her digital lifestyle.

"Mom" I continued "would you open a letter from just anybody?"

Perhaps this last analogy wasn't quite accurate. Many of the junk emails that filled her inbox (427 separate emails, to be exact) included her name in the subject line. I went on to explain that her recently exploited computer and Facebook account, in all probability, gave the spammers all the data they needed to tailor a personalized message of destruction.

Finally I resorted to threats...

"Mom, I'm not going to do this again! I'm not going to fix this damned computer again!" I insisted, but I knew it was one of those rhetorical threats people make when emphasizing a point. Of course I would fix her machine again. However, I know this time I will have to take a more drastic approach.

Enter Ubuntu!

My mom was watching me work while I ran a live instance of Ubuntu 10.10. "WOW!?" she enthusiastically exclaimed after briefly glancing at the GNOME desktop. I am happy to report that my earlier fears of rejection were allayed when she cooed at Ubuntu's clean desktop and generally sleek-looking interface.

However, by the end of my visit I had restored her computer to its factory condition, Windows 7 and all. I instituted all of the changes I had planned and set her up under a somewhat restrictive "standard" user account. I hope my measures will prevent her system from being compromised in the future, but again, I feel a gnawing trepidation. I just don't believe that Windows is the answer for our parents and grandparents. Needless to say, at this point I believe her machine's chances of reinfection remain exceedingly high.

I think the next three months will be a very interesting saga indeed, and I believe that there is far more to come.

I see visions of Ubuntu in my mother's future. It will be a bumpy ride, but she and I are game if you're game...

Wednesday, January 12, 2011

The Ubuntu Project. Part I.

From a security standpoint, the better part of the decade has proven disastrous for Microsoft. A multitude of viruses, exploits, back-doors, trojans, worms and malware have not only marred Microsoft's image as a creator and innovator; Microsoft's apparent lack of regard and innovation vis-a-vis security and privacy has also introduced a whole host of headaches for the home user.

"IT Guys" around the world have bemoaned Microsoft's woeful attitude towards security. Thousands upon thousands of websites exist for the sole purpose of introducing, infecting and propagating malware under the Windows software environment. Being a geek in the world of Windows is not an easy job, especially when you're not getting paid for it...

Don't get me wrong, I am not saying that I eschew the use of Windows. Just the opposite: I use Windows on a daily basis. Besides, no operating system, no software company, is beyond reproach. No, my short introduction illustrates, in very broad strokes, a problem of mine as of late. I'd like to offer you, the non-existent reader (my blog hasn't been doing too well lately), my thoughts on going rogue on Windows. This sad tale has a banal enough beginning.

A few weeks ago, during a visit, I decided to boot my mother's Windows 7 based machine. I typically boot her machine, check out the software environment and see what's lurking around in her memory and hard drive. Usually, without fail, her machine is infected with malware, and I've accepted the fact that she will probably never implement a security-conscious computing lifestyle. During my most recent visit, though, I was shocked at the sheer number of malicious programs, posing as legitimate software, plaguing her computer. So much so, in fact, that her machine refused to operate for more than five minutes before BSOD'ing. It was bad; very, very bad...

I decided right then and there that I was going to introduce my mother to the virtues of open source. I was going to convert my mother to Ubuntu!

I've fallen in love with Ubuntu over the years, and I have watched, with great interest, Ubuntu's maturation into an honest-to-goodness alternative to the ubiquitous Windows operating system. I will freely admit that, in the beginning, Ubuntu was not a very good operating system. Bugs and a lack of support really plagued Ubuntu's early distros. To be perfectly honest, they were fun to experiment with (i.e. load on one of your crappy computers and listen to it groan back into un-life like a zombie), but certainly not ready for "prime time" in any conceivable sense of the word.

Fast forward roughly six years to Ubuntu's 10.10 release. I can't say enough about this distro. It's a joy to use: fast, smooth, polished and efficient. It is, most of all, exactly what you want in an operating system: secure! Ubuntu takes advantage of the clearly defined administrator/user dynamics of a Unix-based operating system. It also capitalizes on a large community of highly knowledgeable users and developers. The Ubuntu community might be the most hyper-vigilant bunch on earth, actively debugging source code, culling Ubuntu's apps for problems and distributing patches via the built-in GUI-based update utility.
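
For the terminal-inclined, that same patch pipeline boils down to two commands. Here is a minimal sketch, wrapped in Python purely for illustration and assuming a stock Ubuntu install; the GUI update utility does the same job.

# A minimal sketch of Ubuntu's update flow from the terminal, assuming
# a stock Ubuntu install.
import subprocess

# Refresh the package index, then apply pending patches. Both steps
# modify the system, so they require administrator rights via sudo;
# the same administrator/user split that keeps everyday accounts safe.
subprocess.run(["sudo", "apt-get", "update"], check=True)
subprocess.run(["sudo", "apt-get", "upgrade", "-y"], check=True)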

The biggest knock on Ubuntu is its steep learning curve. Many users accustomed to a Windows environment might find Ubuntu's stark GNOME visual interface intimidating. Ubuntu is still Linux, and Linux is still, in a very vague sense, Unix. It is what it is, and the whole idea of a GUI (Graphical User Interface) is an afterthought in the Unix world of yore. I tried to visualize the system shock that would, perhaps, come to define my mom's first interaction with an Ubuntu/Linux based system.

Could she adjust? Would she adjust? Was the learning curve just too damned steep? Were the potential hours spent helping her around her new O/S going to be worth the investment? Could she interact with Ubuntu in a meaningful way; a way that wouldn't frustrate or discourage her from using her machine? All of these questions came up in the course of my contemplation.

Since then, I've decided on a course of action. My plan still involves Windows, plus a small, controlled step into the world of Ubuntu. The first phase involves using a live instance of Ubuntu to scan her current FUBAR'd install of Windows for viruses. Once that scan is complete and, hopefully, all of the offending malware has been removed, I plan on a basic introduction to the Ubuntu platform. In other words, I plan on giving her a very short, very focused tour of her future Ubuntu operating system.
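
For anyone curious what phase one might look like in practice, here's a rough sketch using ClamAV from the live session, wrapped in Python. The clamav package, the /media/windows mount point and the quarantine directory are all assumptions on my part; any reputable live-CD scanner would do.

# A rough sketch of scanning a Windows install from an Ubuntu live CD
# with ClamAV. Assumes the clamav package is installed in the live
# session and the Windows partition is mounted at /media/windows.
import os
import subprocess

# Update the virus signature database before scanning.
subprocess.run(["sudo", "freshclam"], check=True)

# Quarantine directory for anything the scan turns up.
os.makedirs("/tmp/quarantine", exist_ok=True)

# Recursively scan the Windows partition, printing only infected files
# and moving them into quarantine. clamscan exits with code 1 when it
# finds infections, so inspect the return code instead of raising.
result = subprocess.run(["clamscan", "-r", "--infected",
                         "--move=/tmp/quarantine", "/media/windows"])
if result.returncode == 1:
    print("Infections found and quarantined.")
elif result.returncode == 0:
    print("Scan came back clean.")
else:
    print("The scanner reported errors.")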

I'll give her three months to change her attitude towards Windows. I will make this gesture knowing full well that she will not change, will not blossom into the type of user who is cognizant of something as ephemeral as "computer security". I will arm her with the knowledge to Alt+F4 her way out of trouble. I will arm her with Microsoft Security Essentials. I will attempt to lock down her system, and I will fail.

However, my approach isn't as defeatist as it sounds. No, my stratagem is two-fold in that I will severely restrict her interactions with her machine by limiting her user rights and removing her ability to install/uninstall devices and software on her Windows system. It's my belief that Windows, despite its best intentions, does not adequately address the idea of user rights and thus puts its users at unnecessary risk. Its approach seems very casual when compared to the strict enforcement of user rights under a Linux/Unix based system.

Hopefully my approach proves successful, but something tells me that my mission to convert a hardcore Windows junkie has just begun...

Wednesday, January 05, 2011

The Death Of Christmas Trees.

Every year, around this time, I notice dozens of discarded Christmas trees tagged, bagged and haphazardly heaped on curbsides all over my neighborhood.

Now you may be thinking, “What does this have to do with technology?” to which I would reply “A great deal actually!”

Please bear with me as I explain:

Christmas, it is theorized, had its beginnings in the ceremonies and rituals of the Celts and Romans. These hardy men and women symbolically supplicated the fierce gods of Winter by bringing the branches of evergreens into their homes and decorating them. "Christmas", Saturnalia or Jul (Yule), as it was called in those yesterdays, represented Man's attempt to control and influence the forces of the natural world. Now fast forward two thousand years or so to our modern age.

Modern man no longer needs the intercession of the gods to survive the long, cold winters of yore; hence, Christmas has been transformed into a purely commercial holiday. Man has developed technology. Man has built a world that is not entirely dependent on the disposition of Nature. Yes, men are still subject to the forces of the natural world and to disasters of a natural sort, but surviving any given winter isn't the life-and-death struggle it once was. Again, you can thank technology for all of that glorious heat it brings us in the wintertime.

Getting back on topic, though: would our ancestors, no, COULD our ancestors ever conceive of a time when the bright, green centerpieces of their winter "celebrations" would be created from completely artificial materials that have absolutely nothing to do with the natural world? Could they possibly fathom the irony of a Christmas/Saturnalia/Jul celebration that did not need to placate the gods of heat, winter, life and growth? Furthermore, do the artificial monstrosities commonly referred to as a family's "fake" Christmas tree somehow reflect our attitude towards ritual, and not only ritual, but technology as well?

Far in the past is the time when men thought of technology as "magical", and yet technology, for all of its pragmatism, is still often described as "magical". Many devices have come along and "revolutionized" the way we, as humans, interact with each other and our world. It is easy to overlook these devices, these daily exchanges with the technology we manipulate. It is easy to discount the profound effect these devices have on our lives. The point of my rant today, if it is anything, is our unconscious, take-it-for-granted relationship with technology. My point might be our lack of historical perspective vis-a-vis technology.

The point of my rant is the shrinking number of browning Christmas trees I notice on the curbside.

Yeah, it's a good thing (environmentally speaking) we have going with our "fake" Christmas trees.

And perhaps it's a horrible thing. Or perhaps the truly awful thing is our tendency to take a wondrous technological innovation like the fake Christmas tree for granted...


Monday, January 03, 2011

Dreams Of A Digital Future.

The man in front of me has his head tilted downward. His neck looks stiff and awkward while his whole attention is firmly fixed on the object he holds in his hand. As I glance up and down the line, I notice several people holding similar devices.

A woman standing behind me attracts my interest. She has her head similarly positioned in a downward orientation. Her eyes are fixed on a familiar, rectangular object. She uses her delicate fingers to poke, prod, and swipe at the device she cradles in her hand.

She smiles while interacting with the object and, at other times, she furrows her brow as if wrapped in thought. She casually interacts with the device and seems heedless of the goings-on around her. Amazingly, as the line shuffles forward, the individuals standing in it hardly notice their own zombie-like movement.

It's like they are not even here...

I perceive a sense of recognition in the woman's eyes as she glances up for a moment, suddenly aware of the line's movement and her surroundings.

“Oh shit, I’m here, at the coffee shop, getting coffee!” her quizzical, dazed look suggests, and then she retreats back into the world of her mobile electronic device. I watch as her eyes glaze over, fascinated by the light of her phone's tiny, three-inch screen, and I watch as the reality of our collective human experience recedes into a soft, white glow...


I just wanted to make an observation about technology and offer a few bold predictions about our future. I think “tech” has fundamentally changed human interaction, sometimes for the better and sometimes for the worse. I think people, myself included, interact with technology in ways that we don’t really comprehend on a conscious level, and I think that it's easy to lose track of the role technology plays in our lives.

You might be thinking: “Pat, are you saying this is a bad thing?”

No, I am not saying it is a bad thing, nor am I saying that it’s a good thing. What I am saying is that our current interaction with technology foreshadows a more profound future. I’m saying that the future is now, has always been now and will always be now! Put that in your hard drive my friends!

Remember the show “Buck Rogers”? Remember Buck’s robot “friend” Twiki?

That’s what our portable electronic devices have become to us; they are our buddies. Imagine a future where we will be able to interact with a sentient electronic device. Will you be lonely? Will you be bored? Will you ever need the companionship of others in that not-so-distant future?

It is my firm belief, as an amateur “futurist” and “techie”, that the digital frontier isn’t “technology” or “devices”. The digital frontier, the digital future, is human intelligence in the kinds of devices that we will, that we currently, carry around on a daily basis.

The future is an iPhone that can and will talk back to its user: a mobile device that can carry on a sparkling conversation or plumb the depths of your thoughts/Facebook status for “problems” in your personal life. Imagine this device being part friend, part confidant, part personal assistant, part doctor and part your very own HAL 9000 inspired Sigmund Freud!

I predict a future in which personal computers will commiserate and laugh with their users; systems that will monitor, in real time, your home’s security, structural integrity and environment, along with your and your family's vital statistics; devices that will be able to intelligently interface with human beings in cases of emergency or crisis...

I predict a “smart” phone that will listen, empathize, console, aid, assist and ultimately save. Our phones will one day be our guardian angels; no longer confined to the realms of the spiritual and invisible. Angels that will no longer be the subjects of conjecture. No, our "angels" will be devices that are with us in a very literal sense, on stand-by mode, in our pockets…

One day, perhaps in a future that my little buddy Evan Luce inhabits (he's twwwwoooooooooooo), people will blithely carry around the sum total of all human knowledge in their pockets. These devices will have instantaneous access to this data and, most importantly, the intelligence (or processing power) to make the kinds of experiential connections and decisions that only a truly conscious, truly alive human being can currently make.

These devices will have access to massive, planet-wide networks filled with the knowledge and accumulated experience of many billions of human beings (imagine "life-blogging" on steroids). These experiences and our collective wisdom will not be locked inside the cold, mindless dynamic of a "dumb" system. No, I predict these future devices will profoundly influence our daily interactions. They will be our quiet, fully sentient, conscious, kinetically powered, electronic guardian angels, ready to protect, guide and serve us when we most need them... or until their batteries run out!

So today, while you are out and about, think about the little gadget you hold in your hands when you are bored. Think about how readily that device fulfills its purpose and how it is always waiting to serve, entertain and (one day, in the not-so-distant future, literally) save you from your doldrums.

Most of all, dream, yes my friends, dream. Dream of our glorious digital future! Dream of our soon-to-be created digital overlords and reboot your brain!