
Tuesday, February 28, 2012

iPad 3 March 7th Event!

Apple has finally announced its iPad (2S or 3?) event! I have been waiting for this announcement since the release of the iPad 2. The primary reason I waited for the next-generation iPad is the (very probable) introduction of the Retina Display currently used on the iPhone 4 and 4S. Apple's Retina Display is rumored to have four times the number of pixels of the current model. Apple will also probably upgrade the rear-facing camera to the 8MP model currently used in the iPhone 4S (which would give us a boost from 720p recording to 1080p recording!).
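
To put that "four times the pixels" rumor in perspective, doubling each dimension of the iPad 2's 1024x768 screen quadruples the total pixel count. Here's the back-of-the-envelope math (the 2048x1536 figure is just the rumored resolution - nothing is confirmed until March 7):

    # Rumored Retina math: doubling each screen dimension quadruples the pixels.
    ipad2_w, ipad2_h = 1024, 768                     # iPad 2 resolution
    retina_w, retina_h = ipad2_w * 2, ipad2_h * 2    # rumored 2048x1536 panel

    pixels_now = ipad2_w * ipad2_h    # 786,432 pixels
    pixels_new = retina_w * retina_h  # 3,145,728 pixels

    print(f"{retina_w}x{retina_h} = {pixels_new / pixels_now:.0f}x the pixels")
    # -> 2048x1536 = 4x the pixels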



What else may we expect from Apple?

I would really like to see a next-generation processor worthy of the A6 title. However, rumor has it (with photo evidence you can view below) that Apple will release an A5X chip, which will remain dual-core. Earlier rumors claimed that Apple would introduce a quad-core A6 chip for the next-generation iPad.


Personally, I don't care if Apple releases a version with a smaller screen. However, I would love to see a larger 10" screen rather than the 9.7" screen the current models have. Of course, it should have more memory as well. The iPad 2 only has 512MB of RAM; I would really like to see an upgrade to 1GB! With the new display, I would also like to see an upgrade to the current PowerVR SGX543MP2 graphics chip. Of course it will also have Siri. It may also have Gorilla Glass, which would be awesome! Finally, and least probable (not a shot in Hell), I would love to see Flash brought to all Apple devices with the iOS 5 upgrade. That is truly wishful thinking, however :(

What might I not want in the new iPad?

One rumor is that Apple will release the iPad with a smaller dock connector, which in my opinion would be terrible, as older accessories would be obsolete for your new device. That is pretty much the only thing I wouldn't want to see in the new iPad. There was also a rumor that the new model will not have a home button - but that rumor was based on a picture of the new device in portrait mode, in which the home button simply would not be visible (see below).


Overall, the Apple community is really excited and so am I! I have waited a long time to buy an iPad and I hope this model will be worth the wait. I will be sitting near my computer on March 7 waiting for the event to hit the Apple website (and reading a live feed). For the iPhone 4S event, many websites "claimed" to have a live feed, when in reality it was just a few kids talking and blogging speculative information as the event proceeded - not even cool. If you can't wait to watch it (like myself), I suggest reading about the event live on Gizmodo's website. For the iPhone 4S event they ran a wonderful live text feed from inside the presentation. Until then, there is plenty of time for lots of wishful thinking within the rumor mill.
"Well, everybody's heard iPad stories around the campfire. Heck, my grandma used to spin yarns about the iPad 3 that would rocket past the farm where she grew up!"
Do you know where I adapted this quote from? 

Monday, February 20, 2012

Inactivity Messages on Internet Radio

Hello Everyone!

Every day, when I am working on various projects (mostly my dissertation), I listen to Sirius Satellite Radio or Pandora Internet Radio. After about an hour of listening, the music (or talk radio) just stops... silence.

Pandora will stop because the company wants to cut excess bandwidth usage and the royalty costs for played music. I am not sure why Sirius does this, but I assume it's for the same reasons. An interesting thing about Pandora is that this same limitation does not occur with their smartphone application. Nevertheless, the user has to interrupt what they are doing and click "I'm Still Listening" to continue. The apparent reason for this - I assume - is to prevent the user from leaving the room while the content keeps streaming.

Sirius is even worse: its inactivity message actually boots the user off the service, so you have to log back in. I hate inactivity messages! Slacker Radio is pretty awesome, but I have yet to fix up my stations to be as good as my Pandora stations (which took forever to customize using the thumbs up / thumbs down buttons - and Pandora still plays strange selections; for example, my station will be entitled "Black Sabbath" and Pandora will play Journey every so often).

How often do you change the station when listening to "regular radio"? Personally, I have a favorite station and simply listen to it for hours on end. My regular radio doesn't just stop... I guess I may be over-annoyed by a minor problem, but this constant annoyance drives me crazy.

To prevent the message on Pandora, you have to like or dislike songs (or change the station). I am not quite sure what you have to do for Sirius (other than change the station). This may not seem like much of an annoyance, but imagine you are streaming Sirius or Pandora from your computer throughout the house, you hop in the shower without pressing a button to keep it active, and the music just stops! I can't tell you how much that annoys me! I believe upgrading to Pandora One stops this from happening, along with eliminating the commercials.

On a side note, Sirius is the WORST, not just for inactivity, but for random server errors and network drops! I hate my Sirius Radio with a passion! Sometime in the near future I will dedicate a post to why Sirius is terrible (post-merger with XM). Needless to say, when a certain radio personality retires, I will be cancelling my service. I hope Sirius is listening to customers like myself, because they will not survive in the long run given the current landscape of portable audio (at least not without content)!

Anyway, this was just a short venting post about a small but really annoying personal pet peeve.  

Thursday, February 16, 2012

Get Ready for OLED!

In early January Samsung announced their "Super OLED" 55" television. It wasn't the production model, and we will have to wait until later this year before these sets hit stores, but this is history in the making - Plasma, LCD, and LED will be history when these sets become affordable for everyone. LG has also announced a 55" OLED for later this year.

 

OLED stands for "Organic Light Emitting Diode," and the technology essentially combines the best of what plasma has to offer (speed and contrast) with the best of what LED sets have to offer (thin bezels with flexible mounting options, pure whites, bright screens). OLED sets utilize electroluminescence: organic compounds that emit light when given an electric current. OLED sets can use the same active-matrix technology (utilizing a thin film transistor (TFT) back-plate) to switch individual pixels on or off. Because the pixels themselves produce light, eliminating the need for a back-light, OLED sets can achieve extraordinary blacks. Moreover, without the need for a back-light, these televisions can be ultra-thin and light! The response time on these sets is mega fast as well, so there will be no worries about lag or ugly speed artifacts. OLED sets have a response time of 0.01 ms, which would enable up to a 100,000 Hz refresh rate! If you don't know what that means, see this older blog post. OLED sets will also have better contrast ratios than Plasma, LCD, or LED televisions. Finally, the color reproduction of OLED cells is quite good (at least before degradation; see below for more information on the disadvantages of the technology).
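
If you're wondering where that 100,000 Hz figure comes from, it's just the reciprocal of the response time - a theoretical ceiling, since real refresh rates are limited by the panel electronics, not just pixel response. A quick sketch (the helper function is mine, and the 8 ms figure is just a typical early-2000s LCD thrown in for comparison):

    # Theoretical max refresh rate from pixel response time:
    # a pixel that settles in t milliseconds can change at most 1000/t times per second.
    def max_refresh_hz(response_time_ms):
        return 1000.0 / response_time_ms

    print(max_refresh_hz(0.01))  # OLED at 0.01 ms -> 100,000.0 Hz
    print(max_refresh_hz(8))     # early-2000s LCD at 8 ms -> 125.0 Hz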

These sets will most likely arrive with a price tag between $8,000 and $10,000, but prices will decrease steadily as manufacturing costs come down and competing companies enter the fray. Sony has offered an 11" OLED for some time now, and this small set costs a pretty penny - it's still around $800.00. Sony's 25" professional OLED costs about $6,100! So an $8,000 price tag would be pretty good for a 55" OLED TV.

So it may sound like I have described the "holy grail" of TV technology. However, as with all technologies, there are a few disadvantages. As far as I know (and companies may have made significant progress here; we won't know until the sets have been released), the organic compounds used for OLED technologies have a lifespan of about 15,000 hours - that's about 5 years at 8 hours a day. Your typical LCD or Plasma has a lifespan of around 60,000 to 100,000 hours. Moreover, the compounds used for different colors have different degradation rates, which can lead to color imbalances over time (with blue having the shortest lifespan). Even worse, the process of degradation can actually lead to "burn-in," as some pixels fade faster when they are utilized more often by certain content. While power consumption is generally lower than LCD/LED technologies, it does take more power for an OLED to display bright images that are heavy with lighter colors, especially white. Finally, OLED cells are damaged by UV light, but this has been corrected by utilizing UV light blockers. Without the blockers, OLED sets can become useless after only a few months of exposure to normal lighting!

Nevertheless, all of our current technologies had severe downfalls at the outset - Plasma sets didn't used to last 100,000 hours to half-life; in fact, they used to dim significantly after only a year's use. The first plasma TVs didn't even come with speakers or TV tuners; those "accessories" were extra, and the TVs were only "monitors." Early LCD sets had terrible lag and back-light bleed-through, and they were much heavier and thicker than current displays. I am sure, just like all technology, that OLED will get better over time - but from the looks of the first-generation sets, there won't be much to complain about besides the price.

Yes, you can say I am excited! 

Tuesday, February 14, 2012

The Worst Mouse Pad Ever...

Hello again everyone! I hope your Valentine's Day has been well (mine was fantastic). This post is about a personal vexation I have with my laptop and a warning to unsuspecting buyers about what type of mouse pad NOT to purchase.

The Computer

Model: 2011 HP Pavilion dv7-4100 CTO Select Edition Entertainment Notebook PC.
Processor: Intel Core i5 at 2.52 GHz
RAM: 8GB DDR Memory
Storage: 750GB 7200 RPM Hard Drive
Video Card: ATI Radeon HD 3450 with 512MB Dedicated Memory and Switchable Graphics
Monitor: 17" 1440x900 Pixel HD Display
Portable Media Storage: LightScribe Blu-Ray ROM + SuperMulti DVD+/-R/RW With Double Layer Support
Battery: 8-Cell Lithium Ion at 6 hrs max
Personalization Devices: Web Cam and Fingerprint Reader
OS: Windows 7 Ultimate 64-bit Edition
Keyboard: Full Keyboard with Numerical Pad



So, a brief glance at the machine should indicate that it's pretty decent. Moreover, I got it for a great price - $829.99 with the educator's discount and the Christmas special. I had researched laptops for more than a year prior to this purchase, and I thought I had found the machine that would carry me through another few years. What I did not do was play with this machine in person - all research was done online. This post was composed on the above machine. If I had to rate the machine on a scale from 1 to 10, it would receive a 6! Why?

The Worst Mouse Pad Ever!

The number one reason I hate my new laptop enough to give it a 6 out of 10 rating is the mouse pad design. The mouse pad is large and looks great. However, upon further inspection you will find that the left and right click buttons are not separated. That's right: the physical button does not separate in the center, and the left and right click are differentiated only by a painted line. What a crappy design. I constantly hit the right click button when I really want the left click button, and the left click button when I want the right click. In fact, at this point in the post it has already happened 4 times (and I am sure it will happen another 5 by the end of the post).

What's worse is that the large single left/right click button is also an active motion sensor! With my old laptop I could rest my index finger (some of you may use your thumb) on the left click button while I used my middle finger to navigate the arrow around the screen (whoops, I just hit the right click button again, prompting the pop-up menu). With the dv7 mouse pad this is impossible, because touching the left/right click buttons while trying to navigate makes the cursor jump all around the screen sporadically! Seriously... WTF... this mouse pad is terrible! Don't believe me? See this HP post on a complaint / support forum.

If you thought the problems ended there, you would be wrong. The left/right click button is also very difficult to press and makes an obnoxious sound when pressed... the same noise my automatic transmission lever makes when switching from park to drive - and my car is pretty old, if you get my drift.

I have contacted HP, looked at the instructions for the touch pad, and even tried to order a different mouse pad for the machine. All to no avail... the best option (changing the touch pad) is impossible according to HP. The mouse pad is almost unusable, and this is no exaggeration. What's worse is that the newest models have a completely different mouse pad which has corrected each and every problem with mine! I will NEVER buy a laptop again without using it first, and I suggest the same to all of you. If I had used this mouse pad for all of 5 minutes, I would have rethought my purchase.

On a quick side note, the keyboard that came with the computer squealed when I pressed the "delete", "space bar", "A", "R", and "Tab" keys. I had to remove the keyboard and use a piece of sandpaper to remove leftover burrs which were rubbing against the metal springs in the keyboard design. To fix this issue HP wanted me to send away my computer for 1-2 weeks, as if I have the time to be without my machine...? I share an office with a colleague; can you imagine what he thought when I was typing a paper? All he could hear was the squeal of my keyboard and the obnoxious click of my mouse pad - not to mention that the fan on this machine is really loud (but that's a different story). What a nightmare!

The lesson I learned from this fiasco is to always use, play with, and physically inspect a laptop (or any other consumer electronics device) before you purchase it. It felt good to write this post and get this complaint off my chest! I have been holding it in for some time now :)

P.S. I will probably never purchase an HP product again... and to think I used to be a proud supporter. I had my last HP business laptop for over 4 years before I decided to buy my new one - and I am not as pleased with the new model as I was with the old one. As a result of this experience, I will be purchasing a new machine in the next two years.

Monday, February 13, 2012

The Mac vs. PC Dilemma

The Dilemma...

Everyone knows about the Mac vs. PC tech war. I am not sure it's a war exactly, but there are two sides, both with enmity for each other, and a series of stock arguments and rebuttals. In fact, there is a "community" of Mac lovers who will jump at the chance to tell you Mac is the only game in town.

See this rather hilarious link for an amusing take from the PC side of the debate.

I was once a PC enthusiast who went out of his way to bash Macs and occasionally participated in Mac vs. PC debates. In fact, when the original iPod was competing with other mp3 players (like the Zune), I took the side of generic portable media players, as any good PC enthusiast at the time would have. Of course, I was mistaken. Recently, I have become much more open-minded to the world of Apple.

My open-mindedness began with my first iPod Nano purchase. Eventually, I broke down and bought one. I had a few mp3 players prior to my Apple and I hated them! I had an old Philips Nike PSA Play 64 and it didn't last a year. I purchased a series of Samsung "wanna-be" iPod Shuffles and none of them were very good. In the end I bought a first-generation iPod Nano, and from then on I was sold. My love for the iPod led to the purchase of an iPhone, and I will probably never buy another type of smartphone again. Moreover, I WILL be purchasing an iPad 3 (hopefully in March 2012). Nevertheless, my love for Apple portable media and tablets has not manifested in a Mac computer purchase. Why? I definitely want one!

Top 3 Reasons a Mac is Still Out of the Question

(1) Hardware for the Price

The price is way too high given the internal hardware specs! This is really the biggest reason for my hesitancy. Let's go through a quick demonstration with the base 15" MacBook Pro:
Cost for Base Model: $1799.99
Processor: Intel 2.2GHz quad-core Core i7
RAM: 4GB DDR3
Storage: 500 GB 5400 RPM Hard Drive
Display: 1440x900 HD Glossy Display
Graphics: AMD Radeon HD 6750M with 512MB dedicated memory
Battery Life: 7 hours

After only about 2 minutes of using the laptop filter feature on Best Buy's website (meaning there are probably better and cheaper examples), I found this Samsung:

Samsung - 15.6" Series 7 Laptop:

Cost for Base Model: $999.99
Processor: Intel 2.2GHz quad-core Core i7 (second generation)
RAM: 8 GB DDR3
Storage: 1 TB (RPM not specified, but it is at least 5400 RPM, if not 7200) Hard Drive
Display: 1600x900 HD Glossy Display
Graphics: AMD Radeon HD 6490M with 512MB dedicated memory
Battery Life: 7 hours, 48 minutes

Now let's configure the MacBook to have similar specs to the Samsung. An upgrade to 8GB of RAM costs $200. Apple won't let you upgrade to 1TB, but they will let you upgrade to 750GB for another $100. That brings us to $2,100 - or $1,100 more than the Samsung with (almost) the same hardware configuration (the Mac is still lacking 250GB of hard drive space). I can now buy two top-of-the-line Samsung PCs for less than the price of one similarly configured MacBook Pro. I don't know about you, but I can use that extra $1,100 for other things!
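
Here's that arithmetic as a quick sketch, so you can re-run it with whatever prices Apple and Best Buy are showing when you read this (the numbers are the configuration prices quoted above):

    # Configuring the MacBook Pro up to (roughly) the Samsung's specs.
    macbook_base = 1799.99
    ram_upgrade = 200        # 4GB -> 8GB
    hdd_upgrade = 100        # 500GB -> 750GB (1TB isn't offered)
    macbook_total = macbook_base + ram_upgrade + hdd_upgrade

    samsung_total = 999.99   # 8GB RAM and a 1TB drive come standard

    print(f"Configured MacBook Pro: ${macbook_total:,.2f}")                  # $2,099.99
    print(f"Samsung Series 7:       ${samsung_total:,.2f}")                  # $999.99
    print(f"Apple premium:          ${macbook_total - samsung_total:,.2f}")  # $1,100.00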

I think - at least on the hardware side - I have demonstrated my point. "Pound for pound," the PC destroys the MacBook: it costs $800 less in the standard configuration and $1,100 less with similar hardware configurations (i.e., after bringing the MacBook up to par with the Samsung)! Moreover (and maybe a future post), have you seen these Samsung PCs? They look and feel phenomenal. From a hardware standpoint (design omitted), why would I spend more on a Mac? I wouldn't. The cute little graphic (from here) illustrates my thoughts on the subject :)



Essentially, Apple must charge more because the design (especially the uni-body aluminum MacBook) costs more to produce. They are also charging for their "superior" operating system. On a side note, I think Windows 7 has been the best Microsoft OS release since XP; in fact, I have had relatively few functionality issues with 7, and I am running it on several computers (one of them an older model). Are the Mac design and OS worth the huge additional cost?... NO WAY! See this awesome video from Apple, which shows you why you spend extra money on the design of a MacBook.


(2) The Virus Myth

Mac people are quick to tell you that their computers do not get viruses. The simple fact is the virus market is like any other market: there will be a greater supply of viruses on the platform that has greater demand (in this case, demand is unsuspecting users who can be targeted). See here for a more detailed explanation from a series of experts essentially making the same point. Thus, the increase in Mac market share will lead to an increase in viruses targeting Macs; Mac purchasers are contributing to their own future security problems. Moreover, I am not computer illiterate; I have not been the victim of a virus since Windows Millennium Edition was released. But of course I am an exception... simply put, I kill viruses. That's beside the point, though: the fact that Macs are less susceptible to viruses is a supply and demand issue and will most likely change. Moreover, people who own Macs do in fact get viruses in the present day. Here is an entire website devoted to Mac security...

(3) I Already Have So Much Software for PC!

This point is a little less "anti-Mac" and more of a practicality problem. If you are like me, you have acquired a range of PC software which you currently use. For some of us this was a substantial investment. For a smaller subset of us, there is still software we need that does not run on a Mac. Yes, you can buy an emulator... but I don't want that headache. Moreover, the availability of open-source programs for Windows is still greater than for Mac (another consequence of the large PC market). Thus, there is a massive initial investment in purchasing a Mac.

There Are Things I Really Like (Ambivalence...)

There is obviously more that goes into this decision. I like the Mac OS (although I don't like the reduced administrative privileges - I don't want to make continuous trips to the "Genius Bar," I just want to fix it myself). I love the interplay between iPod, iPhone, iPad, and Mac on iCloud. I even like the Mac design (despite how much it adds to the price). I love the look and feel of a Mac... YES, I said it, I love the "look and feel" of the MacBook Pro - it has style, I am not blind! But the benefit is nowhere near the cost of these additional so-called "features." So what would make me buy a Mac?

Simply put, I would buy a Mac if the price were in line with other reasonably priced PCs. I don't care about the Mac community or all the wonderful things Mac users say make the price "worth it." And I don't want a MacBook Air, which I consider to be a beefed-up netbook (yup, a beefed-up netbook). I want a reasonably priced MacBook... period. I am not sure Apple will ever provide this (and the iPad is not a substitute for a computer), because there are people willing to buy Mac machines regardless of the price. And if I had a following (which I don't... yet), I am sure I would get a bunch of comments from Mac users about how I am wrong about the differences between the two computers.

Nevertheless, I bet there are a lot of PC users like myself who would be willing to try a Mac if the price were right. If Apple wanted to increase market share, they would design a reasonably priced machine for everyone else... i.e., those people who don't necessarily want to pay more money to be a part of some community.

P.S. Please don't tell me the Mac Mini is a good deal. It's $600 and doesn't come with a monitor. It only has 2GB of RAM, a 500GB hard drive, and on-board Intel graphics. Essentially, aside from the Core i5 processor, the specs on the Mini are comparable to a top-of-the-line desktop PC from 3 or 4 years ago (yes, I know it is only 1.4 inches tall, 7.7 inches wide, 7.7 inches deep, and 2.7 pounds - I don't care).

Sunday, February 12, 2012

The Amazing Spider-Man Trailer Looks Good

I am not usually a fan of reboots, but this looks like it's going to be a great movie! I am officially excited...

Friday, February 10, 2012

Plasma vs. LCD vs. LED


I know there is an enormous amount of information on the Internet regarding this subject, with thousands of blog posts taking up space on servers everywhere. Even so, the first thing a customer asks me when they are looking to purchase a new television is:

"What's better, Plasma, LCD, or LED?"
or the infamous:

"What's the difference between LCD and Plasma?" 
So, even though there is a plethora of information on the topic, people still ask this (rather annoying) question. This post will offer an answer without a lot of technical detail. In other words, I'll spare you the geek talk and give you some practical information for purchasing a new TV.

General Criteria

First, let's set out some general criteria for purchasing any of the current technologies: get a 1080p television. You may encounter the classic part-time salesperson who says,
"You don't need 1080p, you can't really tell the difference between 720p and 1080p unless you are watching a Blu-Ray disc and how often do watch those? Do You even have a Blu-Ray player?"
Ignore this person and walk away. There is a significant mathematical difference between 1080p and 720p. Specifically, 1080p is 1920x1080 pixels, while 720p is 1280x720 pixels. Most over-the-air broadcasts are in 1080i - the "i" stands for interlaced and the "p" stands for progressive. Progressive scan literally draws the entire picture line by line every sixtieth of a second, while interlaced scanning divides the horizontal lines of the picture into odd and even lines and then alternately refreshes each set at 30 frames per second. The second number in the resolution indicates the number of horizontal lines. Thus, a 1080i scan splits into 540 odd and 540 even lines. On a 1080p television, all 1080 horizontal lines are there. On a 720p television, however, there are only 720 lines - a 360 odd and 360 even split. Whoops, we just lost about 360 lines of resolution! If you're following the logic, only a 1080p television can produce full-resolution pictures on both types of scans - interlaced and progressive. So stick with a 1080p set (no matter what the technology), especially if you are purchasing a TV 40" or larger (your eye cannot really tell the difference below 40").
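
If you want to check the pixel math yourself, here's a quick sketch of the numbers from the paragraph above:

    # Pixel and interlaced-line math for the common HD formats.
    formats = {"720p": (1280, 720), "1080p": (1920, 1080)}

    for name, (w, h) in formats.items():
        print(f"{name}: {w}x{h} = {w * h:,} pixels; "
              f"interlaced split = {h // 2} odd + {h // 2} even lines")
    # 720p: 1280x720 = 921,600 pixels; interlaced split = 360 odd + 360 even lines
    # 1080p: 1920x1080 = 2,073,600 pixels; interlaced split = 540 odd + 540 even lines

    print(f"1080p has {(1920 * 1080) / (1280 * 720):.2f}x the pixels of 720p")  # 2.25x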

Now, from resolution to speed. Early LCD televisions had severe problems with what is known as response time. Response time is the amount of time a pixel in an LCD monitor takes to go from one value to another and back again, measured in milliseconds (ms). The higher the response time, the slower each pixel is to change, and thus the greater the lag on your television. I have an old 26" Sharp Aquos from 2003 (or maybe 2004) with a response time of 8ms. When I watch a football game on this set, all I see are squares and lag lines (artifacts which are a result of its slow response time). The set lags terribly! Response time and refresh rate are different things, and most LCD/LED screens have ultra-fast response times in today's market. Refresh rate is not response time, but it is still important.

I suggest that you buy a television with at least a 120Hz refresh rate. Film is recorded at 24 frames per second (fps) and converted to 30fps; this is known as 3:2 pulldown. The process is used to transfer film, which runs at 24fps, to 30fps to match your television's interlaced scan (progressive scan makes a huge improvement on any video). The frames are then combined and interchanged to get 60Hz. So why do we need 120Hz if the image is actually being presented at 60Hz? The answer is that companies have developed advanced algorithms which can smooth the picture out better than the standard 60Hz by producing frames in between the frames that already exist. They are not adding any more detail, but they are making the conversion easier on your eyes. TVs at 120Hz or above (although anything over 120Hz does not significantly add to the picture quality and hikes the price up unnecessarily) produce a picture that is substantially smoother than TVs at 60Hz, and most people can tell the difference.
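
To make the cadence concrete, here's a small sketch of how 3:2 pulldown spreads 4 film frames across 10 interlaced fields - which is exactly how 24fps becomes 60 fields per second (the function is my own illustration, not how your TV's chip actually implements it):

    # 3:2 pulldown: alternating film frames are held for 3 fields, then 2,
    # so every 4 film frames (24fps) become 10 video fields (60 fields/s).
    def pulldown_3_2(frames):
        fields = []
        for i, frame in enumerate(frames):
            fields.extend([frame] * (3 if i % 2 == 0 else 2))
        return fields

    print(pulldown_3_2(["A", "B", "C", "D"]))
    # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D'] -> 10 fields per 4 frames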

These are the two criteria that I think every TV purchase should include. Now let's get to the different technologies.

Plasma: Pros and Cons

Don't let the myths about Plasma TVs prevent you from considering this technology! Plasma TVs have no issues with speed - most run at 600Hz per sub-pixel, and response time is irrelevant. Football, basketball, hockey, the explosion in space - it will all be smooth on a Plasma. Black levels will be amazing - no strange back-light bleed-through to worry about. Color reproduction will be accurate, and the colors will be rich and deep. Most plasma sets have a 100,000-hour half-life, meaning that you will get about 40 years of viewing at 6 hours a day! These sets do not need to be recharged or refilled (I don't even know where that rumor started, nor do I care). In fact, these are awesome TVs, especially from Panasonic. Most importantly, Plasma has a huge advantage in PRICE! The technology has been around longer and the manufacturing process is cheaper, allowing you to get more "bang for your buck." For two sets with the same specs, this can mean a difference of just under $1,000.00 in some cases - plus you can get larger Plasma sets for the price of some smaller LCD models (i.e., a 46" LCD will be the same price as - sometimes higher than - a 50" Plasma).

There are a few cons. First, these TVs have a thick glass screen which can produce glare (however, most LCD TVs now have a super-glossy screen anyway, so a glare problem can be hard to get away from in some cases). Second, the color palette for whites is not as pure as on an LCD or LED TV. In fact, shades of white can look rather grey, milky, or even purplish on Plasma screens - it can be annoying. Next, plasma screens are not as bright as most LCD / LED televisions; but really, this doesn't matter when the TV is in your house and not side-by-side with 15 other models all set to the "dynamic" picture mode. I am always amazed by the customer who says, "but that TV is brighter and I like bright," without bothering to look at the picture quality. Finally, and most importantly, Plasma screens still experience burn-in. Burn-in occurs when a static image is left on the screen for hours at a time - I recently witnessed it happen with a static image left on a brand new Plasma for only 4 hours! Thus, if you watch Fox News, ESPN, or MSNBC religiously, the stock / news ticker at the bottom of the screen will become a part of your screen (it actually looks like a dark shadow which is most noticeable when lighter images are displayed). Moreover, the scroll-bar "fix" (a black and white interchanging bar that rolls across the screen for hours in hopes of removing the burn-in) does not work, so don't bother using that feature at all. Burn-in will eventually fade out, but it takes a really long time.

The conclusion here is that Plasma sets actually have a fantastic picture with rich colors, dark blacks, and super smooth images. Plasma is still a viable technology for anyone interested in buying an HDTV. But BEWARE: Plasma screens are still sold in 720p resolution - LCD and LED TVs are no longer sold at this resolution in larger sizes. Buy 1080p unless you really couldn't care less about resolution and are just looking for the so-called "basement TV."

LCD and LED: Pros and Cons


Liquid Crystal Display (LCD) and Light Emitting Diode (LED) based displays are essentially the same thing, except that LCD displays use fluorescent lamps for illumination, while LED sets use light emitting diodes to illuminate the screen. Liquid crystals do not produce light on their own (unlike Plasma TVs, which excite phosphors), so they must be back-lit. LCDs have come a long way since they were introduced. They are very bright televisions, with most fluorescent bulbs putting out around 80 lumens (most plasma sets produce around 3 lumens). But as I already said, this may not be a significant advantage. LCDs and LEDs also have very good color reproduction in today's models - not so much in the past. LCD TVs have also corrected the issue with response time, although they are still slower than Plasma models. As long as the LCD or LED has 120Hz motion technology, it will have a smooth picture. LCD TVs are also lighter and have more diverse mounting options; Plasma screens are heavier and require a two-stud mount, while most LCD TVs are light enough to use one-stud full-swivel mounts. LCD televisions tend to have more depth than Plasma sets, meaning that the pictures are much more vibrant under normal conditions, especially on static images. LCDs with a "matted" screen also eliminate nasty glare; this only applies to LCD sets without the glossy screen - some LCD / LED sets have MORE glare than any Plasma, and some Plasma TVs have an anti-glare coating which makes them as good as any matted LCD. Another advantage is power consumption: LCD sets utilize less power than plasma sets because the technology needs less power to operate. Moreover, LCD and LED TVs generally have a higher native resolution than Plasma, which is still offered in a range of 720p sets. As mentioned before, still images look amazing on LCD and LED TVs; thus they are much better for use as a dual TV / computer monitor.

There are a few cons. First, black levels on most LCD TVs could be improved. The contrast ratio tells us the ratio of the luminance (a measure of the intensity of light) of the brightest white to the darkest black. Just because a TV is brighter than the rest does not mean it has a better contrast ratio: for example, a set producing 500 cd/m² whites over 0.05 cd/m² blacks has a 10,000:1 contrast ratio, while a brighter set producing 1,000 cd/m² whites over 0.5 cd/m² blacks manages only 2,000:1. Plasma sets produce very dark blacks and thus can reach very high contrast ratios without having to be super bright. LCD sets (we will get to LED shortly) are back-lit with fluorescent tube lighting and tend to let light bleed through, preventing perfect blacks (which affects all the dark colors on your set). Thus, LCD sets have trouble presenting the true blacks which are important for overall picture quality. The viewing angle on an LCD (because it is back-lit) is only 178 degrees, rather than the full 180 degrees of most plasma sets; at extreme viewing angles the picture on an LCD can be distorted. LCDs and Plasma sets have similar life spans, so there is no real difference in that category. Plasma sets have burn-in (which is not permanent), but LCDs have their own idiosyncratic issue called "defective pixels" or the "dead pixel effect." A defective pixel is literally a pixel on an LCD / LED screen which either never shows light or always shows light - essentially it is static and never presents an image, leaving a small point on the screen which never changes. I hate these little monsters! Imagine you are watching a Star Trek movie where the Enterprise is about to kick into warp drive. The image on the screen shows the Enterprise in deep space... wait, what is that thing in the middle of the screen?... Is it a star?... No, it's a neon green defective pixel that you are just noticing for the first time because a dark image is on the screen. Now whenever you look at your set you will see that neon green bastard. Worse, you call the manufacturer and they tell you that they will not cover this defect under the one-year limited warranty unless there are 20 or more defective pixels in a cluster on the screen - otherwise they don't consider it a "defect." That is the truth of the matter, and I know several customers who went through this nasty scenario. By the way, most "extended warranties" only extend the manufacturer's warranty and will not cover this issue.

Speaking of "extended warranties" do not buy one on an LCD unless it covers 1 or more defective pixels! LCD sets require very little maintenance and outside of the defective pixel issue, these TVs are generally stable. The only major issue is the defective pixel so if the extended warranty covers this issue, you might want to go for it. You have to replace the entire LCD panel to fix the problem and that costs about 80% of the value of the TV. The same goes for Plasma, if they cover burn-in, it may be a decent deal. However, some Plasma sets do have substantial cooling systems which should be occasionally cleaned. If the extended warranty offers preventative maintenance (PM) checks or cleanings it may be worth it. I am not advocating for the "extended warranty" by any means, but in some cases - if the cost is low enough - it could be worth it. In most cases it's not worth it at all. Think about it in terms of expected value and probability theory.

Let's say, for the sake of argument, that 10% (a high estimate) of TVs like the one you are going to buy will require a repair 2, 3, or 4 years (not per year) after the initial one-year warranty. Let's say that the extended warranty costs $150.00 (most cost more) for 3 years of additional coverage beyond the manufacturer's warranty (usually the extended warranty subsumes the manufacturer's, but I'll give them the benefit of the doubt in this example). Let us further assume that the average repair cost is about $300. Thus, 10% of the time we would have a $300 repair after the manufacturer's warranty expires and the extended warranty costs us $150, while 90% of the time we would not need a repair in four years and the warranty still costs us $150. Let's do the math to find the expected value of the extended warranty: .10(300 - 150) + .90(-150) = 15 - 135 = -120. Thus, on average a customer should expect to lose $120 on the purchase of the extended warranty - and keep in mind that I chose a failure rate that is rather high. The point of this exercise is to demonstrate that generally risk-averse people (who are scared or pressured by the salesman) purchase these warranties. Now, of course, if the warranty offers additional services beyond repair (like PM checks and cleanings), it might be worth it, because it is possible for the warranty to "pay for itself." For example, if a PM check costs $100 a year, the warranty costs $300, and you use all your PM checks, then it has paid for itself. Or if the warranty covers high-probability repairs which cost a lot to fix (like defective pixels), it might be worth it. It's your call!
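
Here's that expected-value calculation as a sketch you can re-run with the actual quote you get at the register (the 10% failure rate, $150 warranty, and $300 repair are the illustrative numbers from above):

    # Expected value of an extended warranty to the customer.
    def warranty_ev(p_repair, repair_cost, warranty_cost):
        # With probability p_repair the warranty saves you the repair cost;
        # either way, you paid for the warranty up front.
        return p_repair * (repair_cost - warranty_cost) + (1 - p_repair) * (-warranty_cost)

    print(warranty_ev(p_repair=0.10, repair_cost=300, warranty_cost=150))  # -120.0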

All LED Technology Is Not Created Equal

LED sets are LCD TVs with a different back-light - they use Light Emitting Diodes rather than Cold Cathode Fluorescent Lighting. They are thinner, lighter, and have more versatile, slim-fit mounting systems. There are generally three types of LED lighting patterns: (1) Edge-Lit; (2) Full-Array; and (3) Dynamic Clusters. The most common type is Edge-Lit, which means the TV has LEDs around the edge and that light is diffused throughout the rest of the screen; it is also the most useless advertising mechanism for inflating price - these LED sets do not provide any advantage over LCD sets, except maybe slightly purer whites. The advantage of LED rests in the "local dimming" feature of Full-Array and Dynamic sets. In a Full-Array LED set, each diode is independently controlled, so certain areas of the screen can be dimmed to achieve more balanced blacks and ultimately higher contrast ratios. Edge-Lit sets do not have local dimming. Dynamic Cluster sets, while not as good as Full-Array, do have the ability to dim clusters of diodes and thus improve picture performance. Thus, Full-Array and Dynamic Cluster LED sets are worth the extra cost, while Edge-Lit LED sets do not actually improve the picture quality in any substantial way, other than making colors slightly more pure. Moreover, in some cases Edge-Lit LED TVs can have more light distribution problems, because they tend to be brighter at the edges! Don't be fooled: not all LED sets are created equal.

A Brief Word on 3D

There are generally two types of 3D televisions: (1) Active and (2) Passive. 3D works by tricking your eyes into seeing two slightly different images. Active 3D uses glasses equipped with little LCD shutters (which generally cost $149.00!!) to dim (not block) right and left images in succession to trick your eyes. These glasses have to communicate with the TV via an infrared sensor and need batteries to operate. Why anyone would buy this technology is beyond me! For a family of 4 to watch a single 3D movie, they have to make an initial investment of $450 (the first pair is free). The price of your new TV just went up by almost a third of the cost of most 50" screens. Do the movie theaters spend $150 on their 3D glasses? NO, because they use Passive 3D technology. Passive 3D works by using a special pair of polarized glasses designed to block specific types of light emitted by the TV. Thus, the TV does all the work, and the glasses don't need batteries and cost about $10.00. Technically, since the glasses are blocking (not dimming) light, you are not getting the full 1080p resolution with the passive technology. Nevertheless, I have seen both operate, and Active 3D is not that much better (if at all) than Passive 3D - it is definitely not worth the extra $450 (especially as most Passive sets come with 4 pairs of glasses upon purchase). 3D is cool and it will get better over time, especially as actual broadcast television starts to utilize the technology, but right now it should not make or break your decision on which TV to buy (unless of course you really care about this feature - it's subjective); but I would never buy an Active 3D TV, since the future of the technology will most likely be Passive.
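
For the skeptics, here's the family-of-4 math (using the per-pair prices quoted above, and assuming Active sets include 1 pair of glasses while Passive sets include 4):

    # Cost to outfit a family with 3D glasses beyond what's in the box.
    def glasses_cost(family_size, price_per_pair, pairs_included):
        return max(0, family_size - pairs_included) * price_per_pair

    print(glasses_cost(4, price_per_pair=150, pairs_included=1))  # Active:  $450
    print(glasses_cost(4, price_per_pair=10, pairs_included=4))   # Passive: $0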

Conclusion

All three types of TV sets are viable technologies, and you should consider all three when purchasing a TV. They all have pros and cons to look out for. In fact, I own all three types (plus a rear-projection LCD, which I am not going to bother talking about because no one sells them anymore except in the largest of the large sizes, and even then front projection is better; there was also DLP technology, which suffered the same fate as rear-projection LCD). If you have a budget constraint, I think Plasma sets give you the most for the money. If you have a huge budget, Full-Array LED TVs are great, but so are THX-certified Plasma sets. I am not going to crown a winner, but I should have given you enough information to make up your own mind. Happy TV hunting!

Thursday, February 9, 2012

The Future of Optical Drives?

So, today's thought (which actually inspired the idea for the whole blog) is: What is the Future of Optical Drives for Consumer Electronics Writ Large?

This is an interesting question. We have seen the evolution of optical media: CD-ROM, CD-R / CD-RW, DVD-ROM, DVD-R / DVD-RW, BD, and now BD-R / BD-RE. We can actually think about the evolution in terms of how much information we can fit on a disc. CD-R optical lasers began with a wavelength of 780 nanometers (nm) (infrared range), DVD-R lasers were reduced to 650 nm, and BD-R lasers are around 405 nm. A CD-R can generally fit about 700 MB of data, a DVD-R about 4.7 GB (single layer), and a BD-R about 25 GB (single layer); the amount of data depends on whether the disc is single or double layered, as double-layered discs can fit twice the information. Notice the trend - storage is getting larger, and this is a direct result of what is placed on the disc. We have gone from recording music to 480p movies to 1080p movies.
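
The trend is easier to see with the numbers side by side: the shorter the laser's wavelength, the smaller the pits it can resolve, and the more data fits on the same disc. Here are the figures from above in one place:

    # Single-layer capacity vs. laser wavelength across optical generations.
    media = [
        ("CD-R", 780, 0.7),     # (format, wavelength in nm, capacity in GB)
        ("DVD-R", 650, 4.7),
        ("BD-R", 405, 25.0),
    ]

    for name, wavelength_nm, capacity_gb in media:
        print(f"{name}: {wavelength_nm} nm laser -> {capacity_gb} GB per layer")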

Some of the really, really nerdy people out there might remember the BD vs. HD-DVD format war that occurred in the mid-2000s. HD-DVD could store about 15GB on a single-layer disc, was mainly produced by Toshiba, and was used for movies on the Paramount and Warner Bros. labels. In 2008, (Toshiba's) HD-DVD was defeated by (Sony's) Blu-Ray as the last of the movie production companies switched to BD discs.

Nevertheless, as the era of High Definition (HD) solidified, we needed media capable of delivering the content we wanted. Now it looks as though broadband streaming of online HD content could potentially kill the optical drive - not to mention the fact that we can download most software for our computers. Also, reasonably priced SD (and other) memory cards (and thumb / flash drives) can hold 32 GB of data, and capacities are getting larger all the time! Moreover, solid state drives are becoming more popular and demonstrate the ability to store large amounts of information in small frames (i.e., the MacBook Air). Many of you have probably heard the rumor that new MacBook Pro laptops may drop the optical drive in order to make them slimmer and lighter, and consumers can already buy plenty of Windows-based laptops without optical drives. The only time I ever use the optical burner in my computer is to burn CDs for my car - and that is rare, because I have an iPhone connector in my sweet Pioneer Premier deck. In fact, when I sell Blu-Ray players, customers are more interested in whether they have a built-in internet receiver so they can stream Netflix than in the quality of the optical player itself! Let's not forget about the "Cloud" (generally defined here as any off-site data storage which can be accessed via the internet). Will services like Netflix, websites like www.download.com, and the "Cloud" kill the optical drive?

Not likely in the near future. Whether you know it or not, when you stream content from Netflix and other services (including cable and satellite set-top box (on-demand) services), your video is compressed, and compression reduces the quality of your video content - in fact, compression artifacts are pretty common on Netflix, although this has gotten much better. Even so, there is a difference. Try streaming Iron Man 2 in HD on Netflix and playing Iron Man 2 on Blu-Ray, side-by-side on identical 1080p TVs. You WILL see a difference - I am picky, so I say the difference is major!

However, this is NOT the reason I say optical technology will survive for at least another decade. 1920x1080 (1080p) displays are not the be-all and end-all of HDTV. "Ultra HD TV" (also known as Super Hi-Resolution) is coming, with a resolution of 4096x2160 pixels - and NHK in Japan has confirmed a 7680x4320 pixel display. For a 2-hour movie, that kind of resolution would require about 100GB of data! Try streaming that with any kind of buffer. This technology is about a decade away; researchers at the University of California have demonstrated a laser wavelength of 385 nm, but we will need a wavelength of 200 nm for the new high-resolution technology, so it's not here yet. You can find a more thorough discussion of this technology and more on the future of optical drives here, in this Forbes article. Unless consumer broadband streaming technology keeps up with HD content (as of now, a T1 line has a potential bandwidth of around 1.5 Mbps and a T3 around 45 Mbps - and your local cable or DSL connection is capped at around 3 Mbps at the high end), optical disc media will still have a use - hell, YouTube content still has trouble keeping a decent buffer at times.
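
To see why streaming that content is a problem, just divide the file size by the running time - a rough sketch that ignores codec and protocol overhead, using the 100GB / 2-hour figure from above:

    # Average bitrate needed to stream a 100GB, 2-hour movie in real time.
    size_gb = 100
    hours = 2

    bits = size_gb * 8 * 10**9           # gigabytes -> bits
    seconds = hours * 3600
    required_mbps = bits / seconds / 10**6

    print(f"required: {required_mbps:.0f} Mbps")  # ~111 Mbps
    # Compare: T1 = 1.5 Mbps, T3 = 45 Mbps, typical cable/DSL cap = 3 Mbps.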

Plus, at least some of us still want to own the case and physical media that come with albums and movies. Yes, I like looking at the CD booklet, and I always hope the artist includes the lyrics. In the end, optical drives may be removed from PCs and laptops, but they are going nowhere when it comes to HD movie content.

In all reality, tech moves so fast that all of this could change in a split second... but I think we can bank on at least a decade of consumer-electronics optical technology, with at least one more evolution.


The Need For A Side Job...

Welcome to my "Side Job". During the week I am Political Scientist - more specifically a Ph.D. student at the University at Buffalo (SUNY). I have finished my course work and comprehensive exams and I am currently writing my dissertation. I am also an adjunct professor at Buffalo State College. For more information about me and my research see my website. In fact, I have a blog dedicated to my work. I love political science, my research, and teaching undergraduate classes; it is literally my dream come true! When you are doing something you love life is grand.

Now we come to the weekend. Since I don't have a "real" job yet (in the next year or so I should have an Assistant Professor of Political Science position somewhere in the U.S.), I work a side job: I sell, install, and repair consumer electronics in the Rochester / Buffalo, New York area. When I am not thinking about Political Science, I am usually thinking about the gadgets of the world. This blog is a direct result of that thought process. Hopefully, I can provide you with some useful information in the process of elucidating my biased opinions. Thanks for visiting my blog. Be sure to give me your useful yet biased comments...